Collapsing Layers

Doing Less to Do More

This idea has since been expanded on in The Third Age of JavaScript

The future of technology is fewer layers, not more.

The Mess We’re In

As hardware has gotten faster and cheaper over time, we have chosen to stack layer after layer of software on top of each other in the name of ease of use - both abstracting away lower-level concerns for those who would otherwise have to handcode them, and enabling an ever-wider group of people to write software. This has made a lot of people very angry and been widely regarded as a bad move.

But it was the right move - enabling more people to write and use more software accelerated the demand for all technology, hardware included.

Of course, we have always had frequent calls for a return to simplicity and more efficient software. Niklaus Wirth pleaded for Lean Software back in 1995. There is a delicious irony in the contrast between Moore’s Law and Wirth’s Law - a special case of the Jevons paradox. My favorite modern framing of it pits the former CEOs of Intel and Microsoft against each other: What Andy giveth, Bill taketh away. Joe Armstrong gave a great Strange Loop talk on this, fittingly titled The Mess We’re In. At the OS level, Casey Muratori notes that the operating system itself (Linux + FreeBSD) is irreducible bloat, calling it the Thirty-Million Line Problem (thanks Richard Feldman for the pointer!).

In the 2010s we carried on as our predecessors did. As the modern iteration of the joke goes:

1969:

  • what’re you doing with that 2KB of RAM?
  • sending people to the moon

2018:

  • what’re you doing with that 1.5GB of RAM?
  • running Slack

Another favorite stat comes from Jake and Surma, on the size of Minesweeper across versions of Windows:

  • Windows 95: 9.6KB
  • Windows Vista: 4MB
  • Windows 10: 105MB

That is roughly a 10,900x increase (105MB / 9.6KB) - over 1,000,000% bloat - in 20 years.

What’s changing?

There are three factors at play: the slowing of Moore’s Law, the rise of mobile and wearable devices, and the demand for more secure and reliable software.

Moore’s Law

Rumors of Moore’s Law’s death have been greatly exaggerated before, and we still hold it to be broadly true. In 2016 the industry collectively agreed it would die by 2021, yet in 2019 Intel declared it was back on. Of course, technically, the law as stated has died a few “deaths” already: we just kept shifting the bar, from transistor density to power consumption to marginal cost. So what makes this time any different?

Moore’s Law won’t suffer a dramatic death; it will just slow and become irrelevant compared to the progress seen by adopting other chip architectures. We will move from 10nm to 7nm to 5nm, and there is a quantum-effect-imposed hard limit down near 1nm. But each process node upgrade is getting slower and more expensive. Meanwhile, we are already shunting plenty of calculations to GPUs, even on smartphones. This is a “collapsing of layers” between workload and chipset, and similar movements are happening for all large-scale workloads, like cryptocurrency mining and machine learning. For general consumer computing hardware, Moore’s Law is already effectively dead - laptops have not materially gained clock speed in a decade; we have simply added more cores and GPUs.

There is still hope - anything from quantum computers to photonic chips to 3D chips - but all of these are at the “science fair” stage, and at even longer odds than process node improvements on traditional silicon.

What’s easier - solving quantum computing or taking another look at 50 years of software bloat?

Mobile and Wearable Devices

No matter what happens to high-end, “full-powered” computers, the range of computers has broadened tremendously in our lifetimes and will continue to do so. Just as PCs lagged behind mainframes, smartphones lag behind PCs, and wearable devices like watches and headphones will lag even further. All of these will always need faster processing at low power and low cost. We cannot conceivably write software for these devices the way we do for PCs.

More Secure and Reliable Software

Software used to be written for technical people, for occasional uses where lives and livelihoods did not depend on it. As software eats the world, these assumptions must change.

Of course, most software is not life-or-death, but it is hard to deny the tremendous demand for more reliable software: from users, in the form of fewer bugs, crashes, and bad states; and from developers, in the form of a more stable substrate to write on and more reliable code shipped per developer-hour (since that is the ultimate limited resource).

I choose to focus on reliability rather than speed, so as not to overlap with the points above - we always want faster software. But of course speed leaks into reliability, through timeouts and hidden race conditions and the like. It is a bolder assertion that we can outright achieve more reliable software by collapsing layers. I recognize that this is not a necessary result, and that it is somewhat unsubstantiated. But each layer is an abstraction, and per Joel Spolsky’s Law of Leaky Abstractions, all non-trivial abstractions are, to some degree, leaky. Fewer layers, fewer leaks, fewer bugs - at the cost of having to do more per layer.
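To make “speed leaks into reliability” concrete, here is a minimal TypeScript sketch - every name and number in it is invented for illustration. Each layer is a faithful pass-through abstraction that merely adds a little latency, yet stacking enough of them silently breaks a timeout contract at the top:

```ts
// Hypothetical sketch: layers as latency, latency as a reliability bug.
type Fetcher = (key: string) => Promise<string>;

const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

// The bottom layer: stands in for the actual data store.
const store: Fetcher = async (key) => {
  await sleep(40); // baseline I/O cost
  return `value-of-${key}`;
};

// Each wrapper is a well-intentioned layer (caching, auth, logging...)
// that hides the layer below it and adds a little overhead of its own.
const withOverhead =
  (inner: Fetcher, ms: number): Fetcher =>
  async (key) => {
    await sleep(ms); // serialization, bookkeeping, extra hops...
    return inner(key);
  };

// Ten layers of 20ms each: individually negligible, collectively fatal.
let fetcher = store;
for (let i = 0; i < 10; i++) fetcher = withOverhead(fetcher, 20);

// The top layer promises a 200ms budget - a reliability contract that
// silently depends on the speed of every layer underneath it.
async function fetchWithTimeout(key: string, ms: number): Promise<string> {
  const timeout = new Promise<never>((_, reject) =>
    setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms)
  );
  return Promise.race([fetcher(key), timeout]);
}

fetchWithTimeout("user:1", 200)
  .then(console.log)
  .catch((e) => console.error(e.message)); // timed out after 200ms
```

No single layer here is wrong; the failure only emerges from the stack as a whole, which is exactly what makes this class of bug so hard to pin on any one layer.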

I really don’t know anything about security, but this is a major dimension of concern on every device at every level. By all accounts we are severely lacking here. And a re-examination and collapsing of layers with the benefit of today’s knowledge is likely to yield benefits.

Dealing with the Cost

Calling for a reversal of a 50-year trend in software bloat is, frankly, ridiculous. I understand that humans are really bad at responding to slow-moving train wrecks. Using nearly-free, tried-and-tested tech with known bugs is preferable to sinking a bunch of time into new, unproven tech with unknown bugs.

Although the cost of ignoring these issues is rising for the reasons listed above, we may need a distinct flashpoint event, movement, or community to galvanize and coordinate action at every layer of the stack. I confess I have no idea how to do this. I’d love more conversation about it.

However, I do like one concept that has taken root in my mind in the past year. The simple phrasing: layers that belong together should live together. In backend API design, it is well known that sorting, filtering, and paging of database results belong at the same layer of the stack, since trying to place one of them at a different layer from the others requires so much transmission of information that you basically end up merging the layers, poorly. So in terms of cost, collapsing layers pays for itself, because we were paying the cost anyway by punching holes through layers that ought to have been designed together but, through accident of history, never were.
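As a concrete sketch of the paging example - the Db interface, table, and column names below are all hypothetical, not any particular library’s API - compare doing the paging in the app layer against pushing sorting, filtering, and paging down into the SQL layer where they live together:

```ts
interface User {
  id: number;
  name: string;
  createdAt: Date;
}

// A made-up minimal database handle, standing in for any SQL driver.
interface Db {
  all(sql: string, params?: unknown[]): Promise<User[]>;
}

// Anti-pattern: sorting lives in the database, paging lives in the app.
// The database must ship *every* row across the wire so the app can
// throw most of them away.
async function pageInApp(db: Db, page: number, size: number): Promise<User[]> {
  const rows = await db.all("SELECT * FROM users ORDER BY created_at");
  return rows.slice(page * size, (page + 1) * size);
}

// Collapsed: sorting, filtering ("active" is a made-up column), and
// paging live together in one layer, so only a single page of rows
// ever leaves the database.
async function pageInDb(db: Db, page: number, size: number): Promise<User[]> {
  return db.all(
    "SELECT * FROM users WHERE active = 1 ORDER BY created_at LIMIT ? OFFSET ?",
    [size, page * size]
  );
}
```

Splitting these concerns across layers forces exactly the kind of hole-punching described above: the app layer ends up needing to know about sort order, filters, and cursor state anyway.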

I wonder what other layers “belong together”.

We might also add layers to reduce layers. In 2016, Dan Abramov kicked off a movement to bundle layers in the JavaScript ecosystem with create-react-app. The idea here is that, yes, we are adding a layer, but we can reduce a bunch of layers beneath it to implementation detail. From the outside it looks like just one layer, while on the inside there are of course a bunch of layers, collectively managed by a generous open source community (socializing the cost).

This is a “cheat”, of course - a bunch of layers stuck together with duct tape and elbow grease. We aren’t really collapsing layers at all, and it is leaky as hell - but it works to prove out the demand for simplicity, and paves the way for other tools designed from the ground up to handle this.
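A toy illustration of “layers as implementation detail” - every function below is an invented stand-in, not any real tool’s API. From the outside there is one layer, build(); the pipeline of layers inside is a detail the maintainers manage collectively:

```ts
type Step = (source: string) => string;

// The hidden inner layers. Real tools would be a transpiler, a bundler,
// a minifier... here they are one-line stand-ins.
const transpile: Step = (src) => src.replace(/\bconst\b/g, "var");
const bundle: Step = (src) => `(function(){\n${src}\n})();`;
const minify: Step = (src) => src.replace(/\s+/g, " ").trim();

// The single visible layer: zero config, no choices exposed.
function build(source: string): string {
  return [transpile, bundle, minify].reduce((src, step) => step(src), source);
}

console.log(build(`const greet = () => console.log("hi");\ngreet();`));
// (function(){ var greet = () => console.log("hi"); greet(); })();
```

The design choice is the one create-react-app popularized: hide the layer boundaries behind one stable signature, so churn inside the stack never becomes the user’s problem.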

What’s collapsing?

I intend this to be a living list of current movements in collapsing layers, to inspire the reader as to the wealth of opportunities and the degree of impact that can be had.

Challenges

To be honest, I myself struggle to reconcile this idea with the Unix philosophy. I feel a little better knowing that Linux itself is 15m lines of code - in other words, maybe it’s more about the sheer number of layers than the actual depth or thickness of each layer. Rich Harris puts it better than I can: small modules may be better for developers, at the cost of users.

I do think that Collapsing Layers is only suitable for the more mature subset of technologies. If something is nascent, growing like a weed, and prone to change, you probably still want to tack on more layers. AWS is growing at 40% a year and is unusable? Fine, add a second-layer cloud. Nondeveloper prosumers want to make software en masse? Fine, slap a GUI on everything.

Jim Barksdale is famous for noting that there are two ways of making money in business: bundling and unbundling. This is often applied to the business of software, but one can argue the same for technology architecture. It’s time to bundle the basics.
