You have made your bedrock, now lie in it
As a child, I was quite fond of old-fashioned Lego bricks. One very endearing but rarely discussed property of such bricks is their durability, bordering on the indestructible. Almost any abuse inflicted on a Lego structure will, at worst, leave you with a pile of bricks entirely like the one you started with. Even the most baroque Lego edifice will always break into Legos, and not jagged shards of plastic. In my childhood, this felt like something approaching an elementary physical law. As with many real physical laws, it applies in a restricted (though usefully broad) domain: a Lego castle dropped from the Empire State Building or crushed in a garbage compactor will leave behind many pieces still recognizable as Legos, and almost certainly a few which are not. However, back then I did not have ready access to garbage compactors or the roofs of tall buildings, and so the Lego brick seemed like an impregnable elementary particle: a bedrock abstraction.
A bedrock abstraction level is found in every man-made system. No recoverable failure, no matter how catastrophic, will ever demand intelligent intervention below it. Repair at those depths consists purely of physical replacement. No car crash, however brutal, will ever produce piles of loose protons and neutrons. When a Unix binary crashes, it might leave behind a core dump but never a “logic gate dump” and certainly not a “transistor dump.” Logic gates and transistors lie well below the bedrock abstraction level of any ordinary computer.
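To make the point concrete, here is a minimal sketch of my own devising (any Unix with core dumps enabled, via ulimit -c unlimited, will do): a C program that dies by its own hand. The core file it leaves behind describes the corpse in the machine’s bedrock vocabulary of bytes, words, and registers, and in nothing deeper:

    /* crash.c -- build with: cc -O0 -o crash crash.c
       Enable core dumps in the shell first: ulimit -c unlimited */
    #include <stdio.h>

    int main(void) {
        int *p = NULL;    /* an address no process is permitted to touch */
        printf("about to dereference NULL...\n");
        return *p;        /* SIGSEGV: the kernel records the process state
                             in a core dump; nothing below the instruction
                             set is ever exposed */
    }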
The computers we now use are descended from 1980s children’s toys. Their level of bedrock abstraction is an exceedingly low one. This would be acceptable in a micro with 64K of RAM, but when scaled up to present proportions it is a nightmare of multi-gigabyte bloat and decay. Witness, for instance, the fabled un-debuggability of multi-threaded programs on today’s architectures. It stems purely from the fact that truly atomic operations can only exist at the bedrock level, and to fully comprehend what is going on in the entire machine requires wading through a vast sea of binary soup, boiled and stirred continuously by an asynchronous world. The futility of this task is why programmers aren’t usually given even a sporting chance — observe the lack of a hardware debugger in any modern computer.
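For the skeptical reader, a minimal sketch of the binary soup at work, in C with POSIX threads (the file and variable names here are my own illustration). The single statement counter++ compiles into a load, an increment, and a store; atomicity exists only at the bedrock level of individual machine instructions, so two threads may interleave those steps and silently lose updates:

    /* race.c -- build with: cc -O0 -pthread -o race race.c
       (unoptimized, so that each counter++ remains a visible
       load/increment/store sequence) */
    #include <pthread.h>
    #include <stdio.h>

    static long counter = 0;            /* shared, deliberately unguarded */

    static void *bump(void *arg) {
        (void)arg;                      /* unused */
        for (int i = 0; i < 1000000; i++)
            counter++;                  /* read-modify-write: not atomic */
        return NULL;
    }

    int main(void) {
        pthread_t a, b;
        pthread_create(&a, NULL, bump, NULL);
        pthread_create(&b, NULL, bump, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        /* Expected: 2000000. Typically prints less, and a different
           figure on every run. */
        printf("counter = %ld\n", counter);
        return 0;
    }

Nothing in this source betrays the failure; it lives entirely beneath the abstractions the language deigns to offer.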
Once in a while, barring a whiff of the proverbial magic blue smoke, a hardware-level malfunction makes itself known through a fit of non-deterministic behavior in that rare bird, a piece of known-good software. Yet, to a first approximation, solid-state hardware will spend years doing exactly what it says on the box, until that hour comes. And when it does, you can swap out the dead parts and once again be guaranteed correct operation.
Software, on the other hand, routinely ships broken. And software cannot be meaningfully repaired or replaced, only re-designed. Yet witness the cries of righteous outrage among software vendors whenever hardware deviates ever so slightly from its advertised function. What software company could honestly lay claim to even the lowliest silicon pusher’s levels of bug-free operation and architectural soundness? I cannot help but picture a stereotypically slick and unrepentant con artist, frothing with rage after having been duped into purchasing a costly full-page ad in what turned out to be a skeptics’ society’s periodical. It appears that Microsoft (and now Apple) is entitled to comfortable and stable bedrock abstractions, while you and I are not.
I sometimes find myself wondering if the invention of the high-level compiler was a fundamental and grave (if perhaps inevitable) mistake, not unlike, say, leaded gasoline. No one seems to be talking about the downside of the compiler as a technology — and there certainly is one. The development of clever compilers has allowed machine architectures to remain braindead. In fact, every generation of improvement in compiler technology has resulted in increasingly braindead architectures, with bedrock abstraction levels ever less suited to human habitation.
Nevertheless, high-level architectures were still developed (the Lisp machines come to mind), though most were strangled at birth by political forces. Think of where we might be now, if the complexity of programming a computer had been ruthlessly pruned at the source, rather than papered over with clever hacks. Dare to imagine a proper computer — one having an instruction set isomorphic to a modern high-level programming language. Such a machine would never dump the programmer (or user) by surprise into a sea of writhing guts the way today’s broken technologies do. Dare to imagine a computer where your ideas do not have to turn into incomprehensible soup before they can be set in motion; where there is, in fact, no soup of any kind present in the system at all. It would be a joy to behold.
I posit that a truly comprehensible programming environment — one forever and by design devoid of dark corners and mysterious, voodoo-encouraging subtle malfunctions — must obey this rule: the programmer is expected to inhabit the bedrock abstraction level. And thus, the latter must be habitable.