Predictably, as a result of focusing on real work, my ability to usefully concentrate on Acid has stalled. This low-energy period was accentuated by trying to write a quick script that used the library and hitting not one or two, but three fundamental errors in the space of about half an hour. Added to that, progress on some of the bigger-ticket items requires decisions that demand more energy than I have right now, so the project is mostly stalled for the time being.
I've instead diverted my spare energy to producing higher-level concept documentation, particularly in the form of a complete example use case. This has already yielded some incomplete conceptual docs and the beginnings of a new chapter that isn't checked in yet.
With my third eye wrenched wide open by a pot of strong coffee, in the process of writing that chapter I had a rather profound and troubling idea, more to do with the philosophy of modern software than with any technical aspect. It relates to that old friend, the query planner, and how in modern NoSQL systems it is practically a vestigial entity. The marketing folk would have us believe this is due to the difficulty of supporting "big joins! for big data! at web scale!", but there is another side to it.
NoSQL is fun and quick exactly because it relieves the developer from the need to think in terms of some preordained schema and complicated relational algebra, which was apparently cultural baggage and unnecessary. The query styles available, which satisfy the majority of modern database users, are so primitive that 40 years ago the folk researching the future of database systems might even have found them offensive.
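To make the contrast concrete, here's a small sketch using a hypothetical users/orders schema (invented purely for illustration, nothing to do with any real system): the relational version states what is wanted and leaves the how to the planner, while the key-value version reduces querying to a get-by-key and pushes the join and aggregation into application code.

```python
import sqlite3

# Relational style: declare a schema, then ask an ad-hoc question and let
# the query planner decide how to satisfy it.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL);
    INSERT INTO users  VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO orders VALUES (10, 1, 9.99), (11, 1, 4.50), (12, 2, 20.00);
""")
rows = db.execute("""
    SELECT u.name, SUM(o.total)
    FROM users u JOIN orders o ON o.user_id = u.id
    GROUP BY u.name
""").fetchall()

# Key-value style: the "query language" is a get by primary key, so any
# join or aggregation logic moves into hand-written application code.
orders = {10: (1, 9.99), 11: (1, 4.50), 12: (2, 20.00)}  # order_id -> record
names = {1: "alice", 2: "bob"}                           # user_id -> name
totals = {}
for user_id, total in orders.values():
    name = names[user_id]
    totals[name] = totals.get(name, 0.0) + total
```

Both produce per-user totals, but only the first leaves room for a planner: change the question and the SQL changes one clause, while the hand-rolled loop must be rewritten.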
And it leads me to wonder what changed, and why that relational algebra, complicated data model, and absurdly old-fashioned query planner from a bygone era eventually found themselves victims of progress: it is because in the majority of applications, their use was more trouble than it was worth.
And therein lies the bother: as an industry, we have evolved from designing highly generalized knowledge systems that were expected to satisfy any informed user's hypothesis in a heartbeat, to building systems that, on the whole, are intended to satisfy only the repeated oscillations of lab rats in a cage pumping a lever to receive more food. As an industry we don't build generalized tools any more; we trap data in non-abstract, tool-specific data models wrapped behind interfaces that allow only a handful of intended behaviors. There will never be some grand unifying schema; we will never expose the user to the full power and generality of their data (apparently because it's "too hard to understand"). Eventually the user will be reduced to the sole fundamental data operation of the mouse click or touch, performed to make more entertaining lights blink.
The future of our industry has already arrived, and it is not some powerful higher-order utility provided to the user; instead it is tidbit-sized sensual excitations satisfied by minimal interaction with (or understanding of) the machine. The future of computing is UIs of a complexity similar to hotornot.com's, where complex appreciation of information relationships is discouraged and reserved for an increasingly privileged few, considered proprietary and mined en masse, entire worlds at a time, in "data warehouses", for the sole purpose of strategizing on what might succeed as the next generation of lab-rat UI.
Perhaps most dispiriting of all, the software I've been passionately researching and implementing for most of this year serves little purpose but to advance this state of affairs, because commercially I've never really needed a tool of the generality that SQL provides, and increasingly probably never will. I'm ashamed to admit that even the subjects I am passionate about disgust me; such is the self-perpetuatingly regressed state of our profession.
The usual argument is that generalized applications are too difficult for the masses to understand, and yet I remain suspicious in light of all those business users of the 90s who, despite the unrefined nature of the tools, produced magna opera in their own right using nothing more than Microsoft Access. It is also awfully convenient that task-specific applications usually have a far greater ability to cause lock-in than a comparable tool built with data as a first-class concept.
To claim information technology has penetrated the masses, or that we teach computing in schools, would be incorrect. Information is more rigorously defined than the atom, and computation could be seen as the method by which information is summarized into a meaningful answer describing the global state of the universe. The trends even within the "information" industry are clear symptoms of failure here: dumbing information down to the point where it can no longer be manipulated should not be considered a "successful application", and nor should teaching millions of people each year how to use software like Facebook or Google.
We're told that behind closed doors serious work is being done on the future of technology, and that we're only years away from meaningful AI that will come along and satisfy all woes, dig all the signal from all the noise, and answer questions put to it in plain English. In the meantime it's "trust us, and keep feeding us that stream of clicks!", and all we have to show for it is little more than a new device for running an ever more simplified UI with prettier transitions. Even if that big AI someday arrives, access for the average human will be via a pipe to a water-cooled datacentre, governed by a gently caring corporation whose interests will inevitably be served.
I suspect that is where modern software development really fails: it is not that the technology isn't there (by some accounts, it has barely changed in 40 years), but that UI designers are preoccupied with notions such as responsive design or grid systems or whatever else, as if the need to figure out new ways to communicate data to regular people had somehow gone away. I don't suppose UI researchers on the whole are stupid, just that either the problem has yet to be solved, or perhaps their employers have no interest in such general-purpose designs.
The future of humanity's technology most probably looks nothing like RSS or microformats, or Freebase.com, or the kind of UIs dreamt about in V (2009) or Star Trek. The future is a universe of cheap, disposable, task-specific blinking widgets capable only of scratching a particular itch until the next version of Android arrives, because in almost every other possible future the need for the majority of our industry ceases to exist, and it will never allow that.
It seems I'm not very good at finishing posts with anything other than "and I hate it", so instead this one will end with a video. I'm strongly reminded of The Coming War on General Computation, which should be required listening for anyone who thinks these ideas are even remotely accurate.
A lively discussion formed over on Reddit.
Comments on a postcard to @edeadlk.