Reading the article, I came to the same conclusion as every time I approach sum types: they are ONLY useful for addressing malformed JSON structs or hacking around BAD data structure/logic design, at least for most business applications (for system-level programs my reasoning is different).
The example JSON in the article, even if it may be common, is broken and I would not accept such design, because an action on an object must require the action AND the object.
For many years, I have advised companies developing business applications to avoid programming constructs (like sum types) which are very far from what a business person would understand (think of a paper business form for the first example in the article). And the results are good: the business logic in the program stays as close as possible to the business logic as business people describe it.
This is pretty much the exact opposite of how I see the world with regards to data modeling. I suppose I'm a sum type radicalist. There are few non-trivial things in the world that I would model without heavy use of sum types.
Yes exactly. The real world is full of examples of a fixed set of exclusive options.
A programming language without sum types and exhaustive pattern matching is unable to model this real-world concept in its type system.
I have the exact opposite opinion: sum types naturally and logically model actual data-transformation problems, while objects with methods obfuscate them. That is seldom a worthwhile tradeoff, even if it gives hypothetical business people a sense of understanding something (do they really?).
However, if you find yourself needing to use one of these regrettably ubiquitous languages that do not support them properly, it is gonna be painful either way.
I just checked, my last tax return form contains at least four inputs equivalent to a sum type.
Sum types are not purely programming constructs; they are such an everyday concept that you didn't even notice them. Not only do business people understand the concept, without such a basic understanding they wouldn't have been able to get a job in the first place.
You don't need to know category theory to understand sum types.
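To make that concrete, here's a sketch of one such form field as a discriminated union in TypeScript. The field and variant names are invented for illustration, not taken from any actual tax form:

```typescript
// A hypothetical tax-form field: exactly one filing status applies.
// Modeled as a TypeScript discriminated union (a sum type).
type FilingStatus =
  | { kind: "single" }
  | { kind: "marriedJoint" }
  | { kind: "marriedSeparate"; spouseTaxId: string } // extra data only where it applies
  | { kind: "headOfHousehold" };

function describeStatus(status: FilingStatus): string {
  // The compiler checks that every variant is handled.
  switch (status.kind) {
    case "single":
      return "filing as single";
    case "marriedJoint":
      return "married, filing jointly";
    case "marriedSeparate":
      return `married, filing separately from spouse ${status.spouseTaxId}`;
    case "headOfHousehold":
      return "head of household";
  }
}
```

The point is exactly the one above: the form already forces you to tick exactly one box, and the type just writes that rule down.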
> malformed JSON structs of hacking BAD data structure/logic design
The real world is not particularly well structured.
> a business form in paper
There are several types of paper forms. They're typically differentiated by a type identifier (a.k.a. the title) in their headers. Scaling one layer out, paper forms themselves are actually a "sum type." They have a common form factor which makes them useful, but require an initial blind examination to understand the context of the rest of the document.
I really don't understand this perspective. Sum types are crazy useful for data modelling. A couple examples:
- It's quite common to need to model user input events. Eg, MouseClickEvent { mouseid, button }, KeyboardEvent { key, modifierFlags }, and so on. The most natural way to express this is using a sum type, with an event loop in your application matching out the different event types.
- Actually, most events follow this pattern. Eg, window management events, event sourcing (eg in kafka), actor mailbox events, etc.
- I've been working in the field of collaborative editing for the last ~decade. There, we talk a lot about editing operations. For text editing, they're either Insert { position, content } or Delete { position, length }. Some systems have a Replace type too. Sum types are a much more natural way to express operations.
- Results are sum types. Eg, if I submit a request to an RPC server, I could get back Ok(result) or Err(error). That's a sum type. Likewise, this is what I want when I call into the POSIX API. I don't want to check the return value. I want a function that gives me either a result or an error. (And not a result-or-null and error-or-null like you get in Go.)
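A minimal sketch of the first and last bullets in TypeScript (type and field names are illustrative, not from any particular library):

```typescript
// Input events as a discriminated union; the "type" tag is what the
// event loop matches on.
type InputEvent =
  | { type: "mouseClick"; mouseId: number; button: number }
  | { type: "keyboard"; key: string; modifierFlags: number };

function handle(ev: InputEvent): string {
  switch (ev.type) {
    case "mouseClick":
      return `click: button ${ev.button}`;
    case "keyboard":
      return `key: ${ev.key}`;
    // No default needed: adding a new event variant is a compile error
    // until every switch handles it.
  }
}

// A Result sum type, as in the RPC example: exactly one of Ok or Err,
// never both-nullable as in Go's (value, err) convention.
type Result<T, E> = { ok: true; value: T } | { ok: false; error: E };

function parsePort(s: string): Result<number, string> {
  const n = Number(s);
  return Number.isInteger(n) && n > 0
    ? { ok: true, value: n }
    : { ok: false, error: `bad port: ${s}` };
}
```

Because the two variants share the `ok` discriminant, checking `r.ok` narrows the type, so you can't accidentally read `value` on the error branch.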
How would you elegantly express this stuff without sum types? I guess for input events you could use a class hierarchy - but that makes it infinitely harder to implement data oriented design. Things like serialization / deserialization become a nightmare.
Frankly, all the alternatives to sum types seem worse. Now I've spent enough time in languages which have sum types (typescript, rust, swift, etc), I can't go back. I just feel like I'm doing the compiler's job by hand, badly.
Huh? Forms do this all the time: fill out all of section A for identifying information and section B-1 for a new application or B-2 for a renewal. Append schedule F if you will be frobnicating in the Commonwealth of Massachusetts. I’d model that with sum types any day of the week.
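That form maps onto a sum type almost line for line. A sketch in TypeScript, with all field names invented for illustration:

```typescript
// Section A is common to every filing.
type SectionA = { name: string; taxId: string };

// B-1 (new application) and B-2 (renewal) are mutually exclusive:
// a sum type makes it impossible to fill out both.
type Application =
  | { kind: "new"; a: SectionA; b1: { requestedStartDate: string } }
  | { kind: "renewal"; a: SectionA; b2: { priorLicenseNumber: string } };

// Schedule F is attached only when it applies.
type Filing = {
  application: Application;
  scheduleF?: { frobnicationSites: string[] };
};

function sectionLabel(app: Application): string {
  switch (app.kind) {
    case "new":
      return "B-1";
    case "renewal":
      return "B-2";
  }
}
```

The paper form enforces the exclusivity by instruction ("fill out B-1 *or* B-2"); the sum type enforces it by construction.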