I was at the Computer History Museum in Mountain View last week, and one exhibit stopped me cold.
That's ENIAC: a room-sized computer unveiled in 1946. It was a landmark, the moment humanity moved from manual and mechanical calculation to high-speed electronic processing. It needed an entire building, a fleet of engineers, and constant maintenance just to run.
That's ENIAC too: reimagined on a single chip by UPenn students in 1995. Same computational logic, now sitting in the palm of your hand. 49 years. A room became a chip.
Here's the thought that hasn't left me since.
Humans are remarkable at reasoning, but even our most deliberate long-term thinking tends to be anchored in what we can see right now. We extrapolate from the present rather than imagining a genuinely different future. (If you want a framework for this, Stephen Covey's The 7 Habits of Highly Effective People articulates the gap between reactive and proactive thinking better than almost anything else I've read.)
When I look at the data center arms race happening right now, the billions being poured into physical infrastructure by every major tech company, I see ENIAC. I see a pivotal, necessary, impressive piece of technology that is also, in hindsight, going to look enormous and inefficient.
The data center is the ENIAC of the 21st century.
Someone is going to figure out how to put the whole thing on a chip. I don't know exactly how or when, but the trajectory of computing says it's a matter of when, not if. And when it happens, today's infrastructure bets will look a lot like a room full of vacuum tubes.
A lot of what's being built right now feels like FOMO-driven momentum: the right instinct, but at the wrong scale. A small team working on the right compression of intelligence and energy could make today's hyperscale data centers look like ENIAC: awe-inspiring for their era, obsolete by the next.
Curious what you think. Do you see it the same way? Think it's impossible? Or are you already working on something in this space?
If any of this sparked a thought, let's talk.