json-autotype: Automatic type declaration for JSON input data

[ bsd3, data, library, program, tools, type-provider ]

Generates datatype declarations with Aeson's FromJSON instances from a set of example ".json" files.

To get started, install the package and run the "json-autotype" binary on an input ".json" file. That will generate a new Aeson-based JSON parser.

"$ json-autotype input.json -o JSONTypes.hs"

Feel free to tweak the generated code by changing the types of the fields - any field type that is an instance of FromJSON should work.

You may immediately test the parser by calling it as a script:

"$ runghc JSONTypes.hs input.json"

You can also pass multiple input files to generate a more precise type description.

Now with Elm code generation support! (If you want your favourite programming language supported too - name your price and mail the author.)

See the introduction at https://github.com/mgajda/json-autotype for details.

Versions 0.3, 0.4, 0.5, 1.0, 1.0.1, 1.0.2, 1.0.3, 1.0.4, 1.0.5, 1.0.6, 1.0.7, 1.0.8, 1.0.9, 1.0.10, 1.0.11, 1.0.12, 1.0.13, 1.0.14, 1.0.15, 1.0.16, 1.0.17, 1.0.18, 1.1.0, 1.1.1, 1.1.2, 2.0.0, 3.0.0, 3.0.1
Change log changelog.md
Dependencies aeson (>=0.7 && <1.4), base (>=4.3 && <4.12), bytestring (>=0.9 && <0.11), containers (>=0.3 && <0.6), filepath (>=1.3 && <1.5), GenericPretty (==1.2.*), hashable (==1.2.*), lens (>=4.1 && <4.17), mtl (>=2.1 && <2.3), optparse-applicative (>=0.12 && <1.0), pretty (>=1.1 && <1.3), process (>=1.1 && <1.7), scientific (>=0.3 && <0.5), text (>=1.1 && <1.4), uniplate (==1.6.*), unordered-containers (==0.2.*), vector (>=0.9 && <0.13), yaml (==0.8.*) [details]
License BSD-3-Clause
Copyright Copyright by Michal J. Gajda '2014-'2017
Author Michal J. Gajda
Maintainer mjgajda@gmail.com
Category Data, Tools
Home page https://github.com/mgajda/json-autotype
Bug tracker https://github.com/mgajda/json-autotype/issues
Source repo head: git clone https://github.com/mgajda/json-autotype.git
Uploaded by MichalGajda at Mon Mar 26 13:37:47 UTC 2018
Distributions LTSHaskell:3.0.1, NixOS:3.0.1, Stackage:3.0.1
Executables json-autotype
Downloads 18113 total (583 in the last 30 days)
Rating 2.5 (votes: 3) [estimated by rule of succession]
Status Hackage Matrix CI
Docs available
Last success reported on 2018-03-26





Readme for json-autotype-1.1.1



Takes a JSON format input, and generates automatic Haskell type declarations.

Parser and printer instances are derived using Aeson.

The program uses union type unification to trim output declarations. Types with the same attribute tag and a similar attribute set are automatically unified, recognized by attribute-set matching. (This can optionally be turned off, or a set of types to unify may be given explicitly.) :|: alternatives (similar to Either) are used to ensure that all JSON inputs seen in the example input files are handled correctly.
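To illustrate, :|: can be thought of as an Either-like sum type whose parser tries each branch in turn. A minimal sketch, assuming aeson is installed (the constructor names `AltLeft`/`AltRight` and the instance below are illustrative, not necessarily the library's exact definition):

```haskell
{-# LANGUAGE TypeOperators #-}
import Control.Applicative ((<|>))
import Data.Aeson (FromJSON (..))

-- Sketch only: an Either-like alternative whose FromJSON instance
-- tries to parse the left type first, then falls back to the right.
data a :|: b = AltLeft a | AltRight b
  deriving (Show, Eq)
infixr 5 :|:

instance (FromJSON a, FromJSON b) => FromJSON (a :|: b) where
  parseJSON v = (AltLeft <$> parseJSON v) <|> (AltRight <$> parseJSON v)
```

With such a type, a field seen as both a number and a string in the examples can be given the type `Int :|: Text` and still parse every input.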

I should probably write a short paper to explain the methodology.


Details on official releases are on Hackage.


After installing with cabal install json-autotype, you might generate stub code for the parser:

    json-autotype input1.json ... inputN.json -o MyFormat.hs

Then you might test the parser by running it on an input file:

    runghc MyFormat.hs input.json

At this point you can inspect the data structures generated automatically for you. The more input files you give to the inference engine json-autotype, the more precise the type description will be.

The algorithm will also suggest which types look similar, based on their sets of attribute names, and will unify them unless specifically instructed otherwise.

The goal of this program is to make it easy for users of big JSON APIs to generate type declarations from example data.

Occasionally you might find a valid JSON input for which json-autotype doesn't generate a correct parser. You may either edit the resulting file by hand, or send the input to the author as a test case for a future release.

Patches and suggestions are welcome.


The simplest example:
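The example input file is not reproduced here; a minimal JSON document that would lead to the declarations below might look like this (the field values are invented for illustration):

```json
{
    "colorsArray": {
        "hexValue":  "#ff0000",
        "colorName": "red"
    }
}
```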


It will produce a module with the following datatypes and TH calls for JSON parser derivation:

    data ColorsArray = ColorsArray {
        colorsArrayHexValue  :: Text,
        colorsArrayColorName :: Text
      } deriving (Show,Eq)

    data TopLevel = TopLevel {
        topLevelColorsArray :: ColorsArray
      } deriving (Show,Eq)

Note that attribute names match the names of JSON dictionary keys.
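As a sketch of how a generated module is typically used from other code, assuming the module and type names from the example above (`JSONTypes`, `TopLevel`) and using Aeson's standard `eitherDecode` entry point:

```haskell
import qualified Data.ByteString.Lazy as BSL
import Data.Aeson (eitherDecode)
import JSONTypes (TopLevel) -- the generated module (name assumed)

main :: IO ()
main = do
  raw <- BSL.readFile "input.json"
  case eitherDecode raw :: Either String TopLevel of
    Left err -> putStrLn ("Parse failed: " ++ err)
    Right v  -> print v
```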

Another example, with ambiguous types (only a fragment of the original input is shown):

                "parameterValue":"site API"
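An input of this shape, where "parameterValue" holds a Bool, an Int, and a Text across records, might look like the following (all values except "site API" are invented for illustration):

```json
{
    "parameter": [
        {"parameterName": "apiVersion", "parameterValue": 1},
        {"parameterName": "debugMode",  "parameterValue": false},
        {"parameterName": "label",      "parameterValue": "site API"}
    ]
}
```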

It will produce quite an intuitive result (plus extra parentheses and class derivations):

    data Parameter = Parameter {
        parameterParameterValue :: Bool :|: Int :|: Text,
        parameterParameterName  :: Text
      }

    data TopLevel = TopLevel {
        topLevelParameter :: Parameter
      }

Real-world use case examples are provided in the package source repository.

Other approaches:

  • There is a TypeScript type provider, and a PLDI 2016 paper on solving this problem using *preferred type shapes* instead of union types. One can think of it as an alternative theory that gives very similar results, with a more complicated exposition. It does not tackle the problem of tagged records, nor does it attempt to *guess* unification candidates in order to reduce type complexity.

  • There was a json-sampler that allows building simpler data structures from JSON examples, but it doesn't seem to perform unification, nor is it suitable for big APIs.

  • The PADS project is another attempt to automatically infer types for *arbitrary* data formats (not just JSON). It mixes type declarations with parsing/printing information in order to have a consistent view of both. It does not handle automatic type inference, though.

  • The JSON Schema generator uses .NET types to generate a JSON Schema instead (that is, it works in the opposite direction). A similar schema-generation step is used here.

  • Microsoft Developer Network advocates the use of Data Contracts instead, to constrain possible input data.