Nivaan Gupta

AI Is Only As Good As the Robots in the World

Apr 2026

Humans Were Never Meant to Sit at Screens

There is something fundamentally mismatched about a human being staring at a screen, slowly completing a task that an algorithm purpose-built for the same job can finish in milliseconds. We are not optimised for this. We never were.

The parallel that comes to mind: humans were never meant to copy text by hand for mass distribution either, until the printing press made that irrelevant. In both cases, the technology did not replace human value — it just stopped wasting it on the wrong things.

Right now, AI is doing something similar. The "intelligence" in any workflow — deciding what to do, how to do it, and why — has always been the expensive part. The execution around it was just overhead. Over the last three years, AI has started absorbing that overhead at a pace that is hard to overstate.


A Pattern Worth Paying Attention To

I think we are moving through a specific arc. It looks like this:

Analog → Digital → Analog

It sounds counterintuitive. Let me explain.

The First Analog Age: Building the Physical World

The first era of human progress was entirely tangible. Architecture, infrastructure, art, craft. Humanity pushed itself forward through things you could touch and stand inside. These physical achievements were extraordinary, and they required extraordinary coordination.

Consider what it took to design a building before AutoCAD existed. Teams of architects working in complete sync, drawing by hand, hoping every line matched across every page, with no way to instantly verify that someone else had not made an error three drawings back.

The Digital Age: Automating the Process Layer

Then came digital tools, and they made most of that execution overhead obsolete.

AutoCAD replaced the drawing table. CRMs replaced the Rolodex. Payment gateways replaced manual reconciliation. Each of these tools took a complex, error-prone human process and turned it into something faster, cheaper, and more reliable. The Digital Age was, at its core, a wave of process automation. It was remarkable.

By the end of it, we had a SaaS product for nearly every problem worth solving that way. The gaps that remain are either too niche to justify a platform, or too complex to be solved by software alone.

That second category is where things get interesting.

The Second Analog Age: Robots in the Physical World

We are entering the next analog age, and it is being built on top of AI.

AI is already prompting companies to rethink where humans belong in their operations. Workflow-style tasks — the kind that follow predictable steps — are increasingly handled by tools like Claude Code. The next generation of skilled workers is not learning to speak to machines through code. They are learning to think, and leaving the execution to the stack.

But here is the part most people are not saying clearly enough:

AI is only as smart as the robot it inhabits.

A model that can reason brilliantly but only receives input through a chat window is still just a very fast screen-worker. The real leap happens when the stimulus is not a typed prompt — it is the physical environment itself. Temperature, movement, proximity, resistance, weight. When models begin reasoning in response to the world around them, rather than a message sent to them, that is when the second analog age truly begins.
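To make the contrast concrete, here is a minimal sketch of stimulus-driven reasoning, where the input is a stream of environmental readings rather than a typed prompt. The sensor names, thresholds, and the trivial `reason` stub are all hypothetical placeholders for an onboard model:

```python
# A minimal sketch of stimulus-driven (rather than prompt-driven) reasoning.
# Sensor names, threshold values, and the `reason` stub are hypothetical.

from dataclasses import dataclass

@dataclass
class Stimulus:
    sensor: str   # e.g. "temperature", "proximity", "weight"
    value: float

def reason(stimulus: Stimulus) -> str:
    # Stand-in for an onboard model: here, just a trivial rule.
    if stimulus.sensor == "proximity" and stimulus.value < 0.5:
        return "stop"
    return "continue"

# The "input" is the environment itself, sampled continuously,
# not a message someone typed into a chat window.
readings = [
    Stimulus("temperature", 21.3),
    Stimulus("proximity", 2.0),
    Stimulus("proximity", 0.3),   # something suddenly gets close
]

for s in readings:
    print(f"{s.sensor}={s.value} -> {reason(s)}")
```

The point of the sketch is the shape of the loop, not the rule inside it: the model is invoked by the world changing, not by a person prompting it.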

Prompting an AI agent is not the future. It is a transitional behaviour — the same way typing commands into a terminal was transitional before the graphical interface arrived.


The Bottleneck Nobody Is Talking About Enough

The problem is compute architecture.

A robot that reasons in real time from its physical environment cannot phone home to a data center every time it needs to think. The latency alone makes it impractical. What it needs is a brain onboard — something capable of the kind of inference we currently need entire server halls to run.
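A back-of-envelope latency budget shows why. All the numbers below are illustrative assumptions, not measurements: a 100 Hz control loop, an assumed 40 ms network round trip, and assumed inference times for a remote versus an embedded model.

```python
# Back-of-envelope latency budget for a reactive robot control loop.
# Every number here is an illustrative assumption, not a measurement.

CONTROL_LOOP_HZ = 100                  # assumed reactive control rate
deadline_ms = 1000 / CONTROL_LOOP_HZ   # 10 ms budget per decision

# Cloud path: network round trip plus server-side inference.
cloud_rtt_ms = 40        # assumed round trip to a regional data center
cloud_infer_ms = 30      # assumed server-side inference time
cloud_total_ms = cloud_rtt_ms + cloud_infer_ms

# Onboard path: no network hop, only local inference.
onboard_infer_ms = 8     # assumed inference on an embedded accelerator

def verdict(total_ms: float) -> str:
    return "meets" if total_ms <= deadline_ms else "misses"

print(f"deadline per decision: {deadline_ms:.0f} ms")
print(f"cloud path:   {cloud_total_ms} ms -> {verdict(cloud_total_ms)} deadline")
print(f"onboard path: {onboard_infer_ms} ms -> {verdict(onboard_infer_ms)} deadline")
```

Under these assumptions the cloud path blows the budget before the model even starts thinking, while the onboard path fits with room to spare — which is the whole argument for compute living on the robot.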

This connects directly to something I wrote about recently: A Data Center Will Fit in the Palm of Your Hand in 50 Years. The capability of a data center needs to eventually live inside a chip. A robot's brain needs to be a fusion of local compute and sensory input, evaluating external stimuli instantly, without waiting on infrastructure thousands of miles away.

That is the convergence point. AI intelligence, robotic embodiment, and personal compute dense enough to make it real-time. When those three things meet, the second analog age will be fully underway.


Thoughts on this? Think the arc is right, or wrong somewhere? Want to guess the digital arc after the robotic analog arc? Let's talk.