json-to-elm

The "How I implemented json-to-elm" post describes the author's experience building a tool that generates valid Elm code from several different forms of input. The tool aims to reduce the time developers spend writing decoders in Elm.

The author discusses the history of JSON decoders in Elm and the limitations of the existing approach, then describes the five implementations of json-to-elm, starting with Python, moving to JavaScript, and finally arriving at Elm.

In the first implementation, the author wrote a Python script that takes a JSON blob as input and generates the type alias, decoder, and encoder for that blob. They explain how they translated the logic of Elm's JSON decoders into Python.

The second implementation involved rewriting the Python code in JavaScript to make it more accessible to Elm users, many of whom come from JavaScript backgrounds.

The third implementation was done in Elm and focused on creating a visual text input that takes JSON as input and generates Elm code as output. The author explains the challenges faced during this implementation and the need to refactor the code to improve its readability and maintainability.

[…] Instead of doing a simple first pass, we would try to keep the context for each field we collected. We would first collect all the fields and their types. We'd then filter for type aliases, running the whole process on each child type alias. Finally, we'd put them all together into the type alias that would be returned. The code looked something like this:

```elm
createTypeAlias : Json.Value -> String -> String -> TypeAlias
createTypeAlias stuff aliasName base =
    let
        fields =
            generateFields stuff base
                |> List.map (\( _, key, name, _ ) -> ( key, name ))
                |> Dict.fromList
    in
        { name = aliasName
        , fields = fields
        , base = base
        }
```

The fourth implementation involved further refactoring and improvements, including the use of an Abstract Syntax Tree (AST) to represent the code and the introduction of union types to represent all the possible JSON values. This allowed for better separation of logic and made new features easier to implement.

The fifth implementation added support for decoder/encoder input and introduced the ability to convert old-style decoders to the new pipeline decoders (noredink/elm-decode-pipeline). The author also added user-facing options and the ability to generate English descriptions of type aliases and decoders.

AST

During the fourth implementation, Noah realised that an AST might make more sense.

One of the first changes made in the fifth implementation was to use a union type to represent all the possible JSON values, instead of just getting them back as strings. This allowed Noah to reason about the values he was representing at the type level rather than as strings.

> I’d also be able to represent some of the recursive values in a more logical setting, using two new constructors — ResolvedType, to represent the aliases already parsed, and ComplexType, to represent a type that hadn’t been parsed yet.
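A union type along these lines might look like the following. This is a hypothetical sketch: ResolvedType and ComplexType are the constructors named in the post, while the remaining constructor names are illustrative, not taken from the actual source.

```elm
-- A union type covering the JSON values the tool needs to represent.
type JsonType
    = StringType
    | IntType
    | FloatType
    | BoolType
    | NullType
      -- Recursive: a list's element type is itself a JsonType
    | ListType JsonType
      -- An alias that has already been parsed, referenced by name
    | ResolvedType String
      -- A type that hasn't been parsed yet
    | ComplexType
```

Because ListType carries a JsonType, nested structures such as lists of lists fall out of the recursion naturally, which is exactly the kind of "more logical setting" the quote describes.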

[…]

The next thing Noah thought about was generating the JavaScript required to parse the JSON at runtime. This would effectively have been the first case of an Elm compiler written in Elm, as it allowed you to take in an Elm decoder and verify it against some JSON input. You can check out the commit here, but he dropped support for it when the Native module syntax changed. Still a neat little side note!

During this time, they also developed elm-decode-pipeline as a more idiomatic alternative to the infix operators of Json.Decode.Extra. This gave them the idea of using the AST Noah had written for json-to-elm to convert the old decoders to the new pipeline decoders.

[…]

Once a decoder was discovered, it would be parsed and converted into a type alias, which would then be used to build up the AST. Once we had the AST, we could generate the decoders just as before. This meant json-to-elm could read in old-style decoders, but generate new-style decoders.
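To make the two styles concrete, here is a sketch of what such a conversion produces, using a hypothetical User type. The old style builds the decoder from field accessors, while the new style uses noredink/elm-decode-pipeline's decode and required functions (Elm 0.18-era API):

```elm
import Json.Decode exposing (Decoder, field, int, map2, string)
import Json.Decode.Pipeline exposing (decode, required)


type alias User =
    { name : String
    , age : Int
    }


-- Old style: decoders combined with mapN and field accessors
oldUserDecoder : Decoder User
oldUserDecoder =
    map2 User
        (field "name" string)
        (field "age" int)


-- New style: the equivalent pipeline decoder json-to-elm would generate
newUserDecoder : Decoder User
newUserDecoder =
    decode User
        |> required "name" string
        |> required "age" int
```

Both decoders accept the same JSON; the pipeline form simply scales better as fields are added, since each field is one more `|> required …` step rather than a switch to a higher-arity mapN.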