Lucas Baker (@lucasbaker):
Apologies in advance for the length of this, but I want to give your question more than a cursory response.
The equivalence you draw between manual coding and historically obsolete skills like manual farming,
elevator operation, or hand sewing glosses over some key distinctions that make coding fundamentally
different in today's context. Let me try to explain my thinking with some grounded reasoning.
First, coding isn't just a "means to an end" in the same way operating a manual elevator or sewing by
hand became outdated. Those skills were replaced by automation that fully removed the need for human
intervention - elevators became automatic, and sewing machines scaled production. Coding, however,
underpins the very systems that enable AI and LLMs to function. When you mention that 25% of Y Combinator's
W25 startups have 95% AI-generated codebases, that does demonstrate AI’s power, but it also highlights a
critical dependency: those AI systems were built and are maintained by engineers who understand code deeply.
If a critical bug arises in an AI-generated codebase - for example, a security flaw, which studies
have shown can be introduced by AI - you need skilled people who can debug and fix it manually. This isn't
hypothetical; AI-generated code has been known to cause outages or require heavy debugging. The skill of
manual coding isn't obsolete - it's quite literally the foundation on which the technology you argue
will replace it is built.
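To make that concrete, here's a hypothetical sketch (the function names, table, and query are invented for illustration) of the kind of subtle flaw studies have flagged in AI-generated code - SQL built by string interpolation, which invites injection - alongside the manual fix that only someone who actually reads the code would know to apply:

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Hypothetical AI-generated pattern: interpolating user input
    # directly into SQL lets a crafted username rewrite the query.
    query = f"SELECT id FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Manual fix: a parameterized query treats the input strictly
    # as data, never as SQL, closing the injection hole.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")

payload = "x' OR '1'='1"  # classic injection payload
print(len(find_user_unsafe(conn, payload)))  # leaks every row: 2
print(len(find_user_safe(conn, payload)))    # matches nothing: 0
```

Both versions pass a casual test with a normal username, which is exactly why an engineer who can't read the query would never catch the difference.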
Second, the stakes and systemic role of coding differ vastly from the examples you provide. Skills
like farming or sewing became less critical for most individuals because industrial systems (like
factory farming, mass production) took over, and those systems have fault tolerance built in -
think of supply chains with redundancies. But as I mentioned in my previous post, there's no
equivalent "backup" for an individual engineer's lost skills when an AI tool fails. If an AI generates
flawed code and no one on the team can fix it because they've fully outsourced their coding ability, the
system grinds to a halt. This isn't like a farmer relying on grocery stores - coding is more akin
to the power grid you referenced earlier, but with a key difference: power grids have dedicated
engineers who maintain them, not just end-users who press a button. If we all become end-users of
AI coding tools without understanding the "grid" of software, we risk a systemic failure with no
one left to intervene.
Finally, I don't think the historical analogy fully holds because coding has a unique role in enabling
adaptability and innovation. Skills like using a slide rule became obsolete because calculators
provided a superior, self-contained solution. But AI isn't a complete replacement for coding -
it is a tool that augments it. Over-relying on AI without understanding the code can stall
skill development, which is exactly my concern. Coding isn't just producing instructions
for computers; it exercises problem-solving, debugging, and the creation of new paradigms - skills
that remain essential even as AI advances. The "better means" you describe (e.g., plain language
coding) still require a foundation of technical knowledge to evaluate and refine the output.
I'm not arguing for a rejection of AI - I agree it's transformative. But the balance I advocate
for isn't a mere preference; it's a pragmatic necessity to ensure we retain the agency to fix,
innovate, and adapt when these tools inevitably fall short. Coding isn't a "dead skill" like
Morse code. It's the scaffolding of our digital world, and, as I see it, we dismantle it at
our peril.
Again, I apologize for the length of this, but I didn't want to give your question a glossed-over
response. Too often on this platform, discussions end up getting lost in answers that prioritize
brevity over deeper thinking.