On fearing AI

Jan 22, 2026

We've all seen the headlines. AI is here, it can code, and it's good at it. Good enough to spark heated debates about whether silicon will soon replace software engineers. Tech companies are pushing their products hard, claiming AI can handle anything from debugging legacy code to architecting entire systems. Meanwhile, most of us sit here with 5+ years of professional experience (and decades more of tinkering since childhood), watching machines casually solve problems that once took us hours.

Software engineering isn't just our job, it's woven into our identity. We're the builders, the problem solvers, the ones who turn coffee into code. So when a machine shows up that can write functions 60% or even 80% as well as we can, it doesn't just feel like professional competition. It feels like an attack on who we are. And honestly? That sucks.

But here's the thing: we've been here before.

The Compiler Crisis That Never Happened

Picture this: it's the mid-1950s. FORTRAN has just been released by IBM, marking the birth of the first widely-available high-level programming language and compiler. Before this moment, engineers were writing everything in assembly, painstakingly translating every logical operation into machine-specific instructions. They knew their processors intimately, could optimize down to individual clock cycles, and took pride in crafting elegant assembly code.

Then along came John Backus and his team, claiming their "FORmula TRANslation" system could let engineers write mathematical expressions in something resembling English and automatically convert them to machine code. The audacity! Surely this would spell doom for the assembly programmers who had spent years mastering their craft.

Except... it didn't.

What actually happened was magical. Engineers didn't disappear, they multiplied. They stopped wrestling with register allocation and started wrestling with algorithms. They traded segmentation faults for logic errors and found they could build bigger, more complex systems than ever before. The same pattern repeated with every subsequent abstraction layer.

When C came along, engineers didn't mourn the loss of assembly programming, they celebrated the ability to focus on structure instead of stack management. When Python arrived, they didn't cry over manual memory management, they rejoiced in being able to prototype ideas in minutes instead of hours. The personal computer revolution of the 1980s democratized programming further, and the rise of frameworks, cloud platforms, and containerization continued this trend.

Each time, the same question echoed: "What will happen to all the engineers?" And each time, the answer was the same: they adapted, they learned the new tools, and they built bigger things.

The Pattern Is the Point

Here's what's fascinating about every major advancement in programming: productivity gains never led to fewer engineers, they led to more ambitious projects and more engineers working on them. When we could write C instead of assembly, we didn't work less; we built operating systems. When we could write Python instead of C, we didn't take longer breaks; we built machine learning pipelines. When Kubernetes abstracted infrastructure, we didn't scale back; we built microservice architectures.

The pattern is crystal clear: abstractions don't eliminate engineers, they elevate them.

Think about your own career. How much of your time do you spend on the "engineering" part (problem-solving, system design, understanding requirements, debugging complex interactions) versus the "coding" part (typing syntax, looking up API documentation, formatting JSON)? If you're honest, most of your value comes from the thinking, not the keystrokes.

AI as Your Personal Compiler

This is where AI becomes less scary and more exciting. Instead of seeing ChatGPT or Copilot as competition, try thinking of them as this generation's compiler. They're tools that let you work at a higher level of abstraction, where you can describe what you want in natural language and let the machine handle the syntax.

You wouldn't feel threatened by gcc, would you? You probably don't even think about it, it's just a tool that converts your C code to machine code. You trust it to handle optimizations you don't want to think about, and you focus on the logic instead.

AI coding assistants are the same concept, just one layer higher. They convert your intent (expressed in comments, function names, or plain English) into code. Like any compiler, they sometimes produce suboptimal output that you need to refine. Like any tool, they require skill to use effectively. And like every previous abstraction layer, they free you up to work on bigger problems.

It’s not a perfect analogy, and the differences matter. Unlike classical compilers, LLMs are:

1. Non-deterministic

Run the same prompt twice and you rarely get identical output. That makes testing harder, reproducibility trickier, and QA more statistical than binary. But far from eliminating engineers, this increases the need for them.
We’re now solving problems like:
“How do we build reliable systems on top of components that aren’t guaranteed to behave identically?”
That’s new. That’s challenging. And that’s engineering.
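
To make "more statistical than binary" concrete, here is a minimal sketch in Python. The `generate(prompt)` callable and the `passes` check are hypothetical stand-ins for whatever model and domain validation you actually use; the point is the shape of the test, not a specific API.

```python
def passes(output: str) -> bool:
    # Placeholder domain check: real code might validate structure, run the
    # generated snippet, or compare against a reference answer.
    return "TOTAL" in output

def pass_rate(generate, prompt: str, n: int = 20) -> float:
    # Run the same prompt n times and measure how often the output clears the bar.
    hits = sum(passes(generate(prompt)) for _ in range(n))
    return hits / n

def test_invoice_summary(generate):
    # Exact-output assertions don't work for a non-deterministic component,
    # so assert a pass-rate threshold instead of string equality.
    rate = pass_rate(generate, "Summarize this invoice and include a TOTAL line.")
    assert rate >= 0.95, f"pass rate {rate:.2f} is below the 95% threshold"
```

The sample size, the threshold, and the check itself all become engineering decisions, which is exactly the point.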

2. Hard to interpret

Compilers, languages, operating systems: every previous abstraction could be traced and understood all the way down to NAND gates. LLMs can be explained structurally, but not intuitively. A million-dimensional weight matrix doesn’t map to human-readable logic.
This “black-box abstraction” is fundamentally different, and it shifts the skill set required:

  • We need better debugging strategies for opaque systems.
  • We need monitoring and guardrails that assume uncertainty (a sketch follows this list).
  • We need humans who understand how to work with unexplainable tools responsibly.
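
To make the guardrails point less abstract, here is one minimal sketch, again assuming a hypothetical `generate(prompt)` callable: instead of trusting a single response, it validates the model's output and retries before anything downstream ever sees it.

```python
import json

class GuardrailError(Exception):
    """Raised when the model never produces output we're willing to trust."""

def extract_fields(generate, prompt: str, required: set, max_attempts: int = 3) -> dict:
    # Ask for JSON, validate it, and retry on failure rather than assuming the
    # opaque component behaved; only validated output reaches the rest of the system.
    for _ in range(max_attempts):
        raw = generate(prompt)
        try:
            data = json.loads(raw)
        except json.JSONDecodeError:
            continue  # malformed output: try again instead of crashing downstream
        if isinstance(data, dict) and required.issubset(data):
            return data
    raise GuardrailError(f"no valid response after {max_attempts} attempts")
```

None of this is the AI replacing engineering; it's ordinary engineering wrapped around an unpredictable component.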

And yet, conceptually, the analogy still holds:
AI allows us to work at a higher level, even if the mechanism underneath is different.

The Real Engineering is Problem Solving

Here's the crucial distinction that's getting lost in all the AI hype: Engineers solve problems. Programmers write code. The coding is just one way we implement solutions, not the solution itself.

When someone comes to you with a business problem, your value isn't in the ability to type for loops or remember React syntax. Your value is in:

  • Understanding the real problem behind the stated requirements
  • Designing systems that can scale and evolve
  • Thinking through edge cases and failure modes
  • Making trade-offs between performance, maintainability, and time-to-market
  • Debugging issues across complex distributed systems
  • Mentoring junior developers and reviewing their approaches

None of these core engineering skills are threatened by AI. In fact, as AI handles more of the routine coding tasks, these higher-level skills become more valuable, not less.
Don't use AI for the thinking; that's your job, and AI sucks at it.

Embrace the Abstraction

The engineers who thrive in the AI era will be those who lean into the abstraction rather than fighting it. Learn to prompt effectively. Understand AI's strengths and limitations. Use it to prototype faster, explore more approaches, and handle the boilerplate while you focus on architecture and business logic.

Yes, some roles will change. Junior developers might need to level up faster. Code review might shift from syntax checking to logic verification. But the fundamental need for humans who can think systemically, understand user needs, and make complex technical decisions isn't going anywhere.

We're not being replaced, we're being upgraded. Just like those assembly programmers in the 1950s who learned FORTRAN and went on to build the computing revolution, we get to learn a new tool and build the AI revolution.

The future belongs to engineers who see AI as their copilot, not their competitor. So don't fear the robots, teach them to compile your ideas into reality, and then dream bigger than ever before.

Remember: every time you've learned a new framework, language, or tool, you've essentially upgraded your personal compiler. AI is just the next iteration. The question isn't whether you can adapt, you've already proven you can. The question is: what will you build next?
