large-records: Efficient compilation for large records, linear in the size of the record
For a number of reasons, the internal code GHC generates for modules containing records is quadratic in the number of record fields. For large records (more than 30 fields, say), this becomes problematic, leading to long compilation times and high memory requirements for GHC. The large-records library provides a way to define records that is guaranteed to result in GHC Core that is linear in the number of record fields.
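For illustration, here is a minimal sketch of how a record is defined through the library's source-plugin interface in recent versions. The module `Data.Record.Plugin`, the `largeRecord` annotation, and the `Employee` record are assumptions for the sake of the example, not details taken from this page; consult the package documentation for the exact interface of each version.

```haskell
{-# OPTIONS_GHC -fplugin=Data.Record.Plugin #-}
-- Extensions typically needed by the plugin-generated code
{-# LANGUAGE ConstraintKinds, DataKinds, FlexibleInstances,
             MultiParamTypeClasses, TypeFamilies, UndecidableInstances #-}

module Example where

-- Annotating the declaration tells the plugin to rewrite it so that
-- the generated Core stays linear in the number of fields.
{-# ANN type Employee largeRecord #-}
data Employee = Employee
  { name   :: String
  , age    :: Int
  , salary :: Double
  }
```

Fields of such a record are then accessed through the `HasField` machinery (hence the record-hasfield and record-dot-preprocessor dependencies listed below) rather than through ordinary GHC-generated selectors.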
- large-records-0.4.tar.gz (Cabal source package)
- Package description (revised from the package)

Note: This package has metadata revisions in the cabal description newer than those included in the tarball. To unpack the package including the revisions, use `cabal get`.
| Field | Value |
|---|---|
| Versions | 0.1.0.0, 0.2.0.0, 0.2.1.0, 0.3, 0.4 |
| Dependencies | base (>=4.13 && <4.18), containers (>=0.6.2 && <0.7), ghc, large-generics (>=0.2 && <0.3), mtl (>=2.2.1 && <2.3), primitive (>=0.7 && <0.9), record-dot-preprocessor (>=0.2.16 && <0.3), record-hasfield (>=1.0 && <1.1), syb (>=0.7 && <0.8), template-haskell, transformers (>=0.5.6 && <0.7) |
| Author | Edsko de Vries |
| Revised | Revision 1 made by EdskoDeVries at 2023-10-04T11:58:15Z |
| Source repo | head: git clone https://github.com/well-typed/large-records |
| Uploaded | by EdskoDeVries at 2023-03-06T13:56:30Z |
| Reverse dependencies | 1 direct, 1 indirect |
| Downloads | 381 total (13 in the last 30 days) |
| Rating | 2.25 (votes: 2) [estimated by Bayesian average] |
| Status | Docs available; last build success reported on 2023-03-06 |