I find that certain problems attract the creation of many solutions: we end up overwhelmed with slightly similar yet incompatible, and often incomplete, solutions.
Why is that so? Which domains show this pattern?
My hypothesis is that these problems seem easy to approach from one idiosyncratic perspective while at the same time being hard to complete. As a result, no one is satisfied with (or able to judge) existing solutions, and everyone ends up creating yet another one.
In certain domains, incumbent solutions may also appear bloated, so it is easy to think one can do better, because a 50% solution looks leaner. The problem is that the remaining 50% is where the necessary risk mitigation and adaptation live. An example is the wave of poorly made (but lean) databases in the age of NoSQL, which claimed to be leaner merely because they had not yet discovered all the things their historical competitors had learned they had to do.
Build systems: Premake, CMake, build2, Make, autoconf/automake, FASTBuild, Tundra, muon, Bazel, magescript, GENie, GYP…
Continuous integration: TeamCity, Jenkins…
Init systems: systemd, SysVinit, launchd, rc…
Interface definition and serialization: Djinni, Cap’n Proto, Thrift, Protocol Buffers, CORBA, ASN.1, Insomniac’s DDL
Reasons for incompleteness (per Mike Acton, Natalya Tatarchuk): Too much? Too little? How to filter? UI? No UI? Process on client? Protocol(s)? Bandwidth? Analysis? Historical tracking? Diffs? Multithreaded, without synchronization stalls. And deterministically.
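The last two requirements, multithreaded recording without synchronization stalls and deterministic output, are commonly reconciled by giving each thread its own buffer and merging afterwards by a stable key (thread id plus a per-thread sequence number) rather than by wall-clock arrival order. A minimal sketch of that idea, with all names my own invention:

```python
import threading

class EventLog:
    """Per-thread event buffers: the hot path (record) takes no shared lock."""
    def __init__(self):
        self._local = threading.local()
        self._buffers = []                    # one list per thread
        self._register_lock = threading.Lock()

    def record(self, name):
        buf = getattr(self._local, "buf", None)
        if buf is None:
            buf = self._local.buf = []
            with self._register_lock:         # taken once per thread, not per event
                self._buffers.append(buf)
        # (thread id, per-thread sequence number) is a stable sort key
        buf.append((threading.get_ident(), len(buf), name))

    def merged(self):
        # Deterministic order within a run: sorted by (thread id, sequence),
        # independent of how the OS happened to interleave the threads.
        return sorted(e for buf in self._buffers for e in buf)

log = EventLog()

def worker(tag):
    for i in range(3):
        log.record(f"{tag}:{i}")

threads = [threading.Thread(target=worker, args=(t,)) for t in ("a", "b")]
for t in threads: t.start()
for t in threads: t.join()
print([name for _, _, name in log.merged()])
```

This only sketches the shape of the answer; a real tool would use fixed-size ring buffers and care about cache behavior, which Python cannot express.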
Dataflow and incremental computation models: Property Models, Systolic Arrays, Self-Adjusting Computation, Reactive Programming, Sussman’s Propagators…
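These models share the idea of pushing information through a network of cells that react to change. A toy version of a Sussman-style propagator network, with names mine rather than from any actual library:

```python
class Cell:
    """Holds a value (or None) and notifies propagators when it gains one."""
    def __init__(self):
        self.value = None
        self.watchers = []

    def set(self, value):
        if self.value is None:
            self.value = value
            for run in self.watchers:
                run()
        elif self.value != value:
            raise ValueError("contradiction")

def propagator(inputs, output, fn):
    """Re-run fn whenever an input gains a value; multidirectional networks
    are assembled from several one-directional propagators."""
    def run():
        if all(c.value is not None for c in inputs):
            output.set(fn(*(c.value for c in inputs)))
    for c in inputs:
        c.watchers.append(run)
    run()

# A bidirectional adder: c = a + b, and either operand can be inferred.
a, b, c = Cell(), Cell(), Cell()
propagator([a, b], c, lambda x, y: x + y)
propagator([c, a], b, lambda z, x: z - x)
propagator([c, b], a, lambda z, y: z - y)

c.set(10)
a.set(3)
print(b.value)  # → 7
```

The point of the exercise: information flows in whichever direction the network allows, which is exactly the part each of the paradigms above formalizes differently.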
https://twitter.com/mike_acton/status/893220854321922049