
If We Were Designing Computers Today, What Would We Do Differently?


Introduction

Most of the computers we use today are descendants of machines designed decades ago. They are faster, smaller, and more powerful—but at their core, they still follow assumptions made when:

- Computers were rare
- Users were trusted
- Programs were small
- Parallelism was optional

If we were starting fresh today, with everything we’ve learned, it’s worth asking a simple but uncomfortable question:

Would we design computers the same way?

The answer is almost certainly no.

1. Security Would Be the Default, Not an Add-On


Early computers assumed a friendly world. Programs trusted each other. Users trusted programs. That assumption no longer holds. If we redesigned computers today:

- Memory access would be restricted by default
- Programs would get only the permissions they explicitly need
- Isolation would be fundamental, not optional

Instead of patching vulnerabilities endlessly, systems would be secure by construction. Security wouldn’t be a product feature. It would be part of the foundation.
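One way to picture "permissions only when explicitly granted" is the capability style: code receives a handle that embodies exactly the rights it was given, rather than ambient access to everything. The sketch below is a hypothetical illustration in Python (the class and function names are invented for this example), not a real OS mechanism:

```python
# Hypothetical capability-style sketch: a component receives only the
# handle it was explicitly granted, instead of ambient authority.
class ReadCapability:
    """Grants read-only access to a backing store."""

    def __init__(self, data: dict):
        self._data = data

    def read(self, key):
        return self._data[key]
    # Note: no write method exists — that permission was never granted.


def untrusted_report(cap: ReadCapability) -> str:
    # This code can only do what its capability allows: read.
    return f"value = {cap.read('temperature')}"


store = {"temperature": 21}
cap = ReadCapability(store)
report = untrusted_report(cap)
```

The point is structural: the untrusted code cannot ask for more authority, because the interface it was handed simply does not contain it.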

2. Parallelism Would Be the Normal Case


Modern CPUs are no longer getting much faster per core. They’re getting more cores. Yet most programming models still assume:

- Sequential execution
- Shared mutable state
- Locks as a necessary evil

A modern-first design would:

- Treat parallel execution as the default
- Encourage message passing over shared memory
- Make race conditions hard to express

Concurrency wouldn’t feel “advanced.” It would feel natural.
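"Message passing over shared memory" can be sketched even in today's languages. Below is a minimal Python example using the standard `queue` and `threading` modules as a stand-in for a message-passing runtime: workers never touch shared mutable state directly, so no locks appear in user code.

```python
import queue
import threading

def worker(inbox: queue.Queue, outbox: queue.Queue) -> None:
    """Receive messages, process them, send results back."""
    while True:
        item = inbox.get()
        if item is None:          # sentinel message: shut down cleanly
            break
        outbox.put(item * item)   # result travels by message, not shared state

inbox: queue.Queue = queue.Queue()
outbox: queue.Queue = queue.Queue()
threads = [threading.Thread(target=worker, args=(inbox, outbox)) for _ in range(4)]
for t in threads:
    t.start()

for n in range(10):
    inbox.put(n)                  # send work as messages
for _ in threads:
    inbox.put(None)               # one shutdown sentinel per worker
for t in threads:
    t.join()

results = sorted(outbox.get() for _ in range(10))
```

Because all communication flows through queues, the usual hazards of shared mutable state (races, forgotten locks) have nowhere to arise in the worker code itself.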

3. Failure Would Be Expected, Not Exceptional


Most systems today treat failure as something to avoid. But in reality:

- Networks fail
- Disks fail
- Processes crash
- Dependencies disappear

A modern design would assume failure as normal:

- Processes would be isolated
- Restarts would be cheap
- Recovery would be automatic

Instead of asking “How do we prevent failure?” we would ask “How do we recover gracefully?”
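The "restarts are cheap, recovery is automatic" idea is often expressed as a supervisor: a small loop that expects the supervised task to crash and simply restarts it. A toy sketch in Python (the `supervise` function and the flaky task are invented for illustration):

```python
def supervise(task, max_restarts: int = 3):
    """Run task; on a crash, restart it — failure is expected, not exceptional."""
    restarts = 0
    while True:
        try:
            return task()
        except Exception:
            restarts += 1
            if restarts > max_restarts:
                raise  # give up only after repeated failures
            # otherwise: cheap restart instead of trying to prevent every failure

# A task that fails transiently twice, then succeeds.
attempts = {"n": 0}

def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = supervise(flaky)
```

This is the shape of the question flipped from "how do we prevent failure?" to "how do we recover gracefully?": the failure path is the normal path, handled by structure rather than by heroics.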

4. Humans Would Be the Primary User


Computers are incredibly fast—but humans are not. Yet we still design systems that:

- Require memorizing syntax
- Punish small mistakes
- Hide intent behind boilerplate

If we designed computers today:

- Systems would emphasize clarity over cleverness
- Code would express intent, not just instructions
- Tooling would optimize for understanding, not just execution

The goal wouldn’t be to impress the machine. It would be to help humans think.

5. Abstractions Would Be Honest

Many abstractions today leak badly:

- Memory looks infinite until it isn’t
- Storage looks reliable until it fails
- Networks look fast until latency matters

A modern system would:

- Expose costs clearly
- Make tradeoffs explicit
- Help developers reason about consequences

Abstractions wouldn’t pretend complexity doesn’t exist. They would manage it transparently.
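What would "exposing costs clearly" look like in an API? One hypothetical sketch: instead of returning a bare value and pretending the operation was free, return the value together with its measured cost, so the caller can reason about it. The `Costed` type and `timed_fetch` helper below are invented for this illustration:

```python
import time
from dataclasses import dataclass

@dataclass
class Costed:
    """A result bundled with the cost of producing it — the abstraction is honest."""
    value: object
    elapsed_s: float

def timed_fetch(fetch):
    """Run fetch() and report its observed wall-clock cost alongside the result."""
    start = time.perf_counter()
    value = fetch()
    return Costed(value, time.perf_counter() - start)

# The caller sees both the answer and what it cost to get it.
result = timed_fetch(lambda: sum(range(1000)))
```

A real system might report memory pressure, retries, or network round trips the same way; the design choice is that costs are part of the return type, not hidden behind it.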

6. Infrastructure Would Be Incremental

Today, we often jump straight to complex infrastructure:

- Distributed systems
- Orchestration layers
- Heavy abstractions

A fresh design would favor:

- Simple systems first
- Clear scaling paths
- Complexity only when earned

Infrastructure would evolve with need, not anticipation.

Conclusion


If we were designing computers today, we wouldn’t just make them faster. We would make them:

- Safer by default
- Parallel by design
- Resilient to failure
- Clearer for humans
- Honest about tradeoffs

The hardware has already changed. The world has already changed. What hasn’t fully caught up yet is how we think about computing itself. That’s where the next real shift will come from.