Training your replacement | Go Make Things

I’ve had a lot of people recently tell me AI is “inevitable.” That this is “the future” and “we all better get used to it.”

For the last decade, I’ve had a lot of people tell me the same thing about React.

And over that decade of React being “the future” and “inevitable,” I worked on many, many projects without it. I’ve built a thriving career.

AI feels like that in many ways. It also feels different in that non-technical people won’t shut the fuck up about it, either.

A considered approach to generative AI in front-end… | Clearleft

A thoughtful approach from Sam:

  1. Use AI only for tasks you already know how to do, on occasions when the time that would be spent completing the task can be better spent on other problems.
  2. When using AI, provide the chosen tool with something you’ve made as an input along with a specific prompt.
  3. Always comprehensively review the output from an AI tool for quality.

A programmer’s loss of identity - ratfactor

We value learning. We value the merits of language design, type systems, software maintenance, levels of abstraction, and yeah, if I’m honest, minute syntactical differences, the color of the bike shed, and the best way to get that perfectly smooth shave on a yak. I’m not sure what we’re called now, “heirloom programmers”?

Do I sound like a machine code programmer in the 1950s refusing to learn structured programming and compiled languages? I reject that comparison. I love a beautiful abstraction just as much as I love a good low-level trick.

If the problem is that we’ve painted our development environments into a corner that requires tons of boilerplate, then that is the problem. We should have been chopping the cruft away and replacing it with deterministic abstractions like we’ve always done. That’s what that Larry Wall quote about good programmers being lazy was about. It did not mean that we would be okay with pulling a damn slot machine lever a couple times to generate the boilerplate.

Deep Blue

My social networks are currently awash with Deep Blue:

…the sense of psychological ennui leading into existential dread that many software developers are feeling thanks to the encroachment of generative AI into their field of work.

How Generative and Agentic AI Shift Concern from Technical Debt to Cognitive Debt

I recently wrote:

The issue isn’t with the code itself, but with the understanding of the code.

That’s the difference between technical debt and cognitive debt.

John has written lots more on this.

10 Thoughts On “AI,” February 2026 Edition | Whatever

  1. I don’t and won’t use “AI” in the text of any of my published work.
  2. I’m not worried about “AI” replacing me as a novelist.
  3. People in general are burning out on “AI.”
  4. I’m supporting human artists, including as they relate to my own work.
  5. “AI” is Probably Sticking Around In Some Form.
  6. “AI” is a marketing term, not a technical one, and encompasses different technologies.
  7. There were and are ethical ways to have trained generative “AI” but because they weren’t done, the entire field is suspect.
  8. The various processes lumped into “AI” are likely to be integrated into programs and applications that are in business and creative workflows.
  9. It’s all right to be informed about the state of the art when it comes to “AI.”
  10. Some people are being made to use “AI” as a condition of their jobs. Maybe don’t give them too much shit for it.

I miss thinking hard.

There are two wolves inside you…

My Builder side won’t let me just sit and think about unsolved problems, and my Thinker side is starving while I vibe-code. I am not sure if there will ever be a time again when both needs can be met at once.

Progress Without Disruption - Christopher Butler

We’ve been taught that technological change must be chaotic, uncontrolled, and socially destructive — that anything less isn’t real innovation.

The conflation of progress with disruption serves specific interests. It benefits those who profit from rapid, uncontrolled deployment. “You can’t stop progress” is a very convenient argument when you’re the one profiting from the chaos, when your business model depends on moving fast and breaking things before anyone can evaluate whether those things should be broken.

We’ve internalized technological determinism so completely that choosing not to adopt something — or choosing to adopt it slowly, carefully, with conditions — feels like naive resistance to inevitable progress. But “inevitable” is doing a lot of work in that sentence. Inevitable for whom? Inevitable according to whom?

Stop generating, start thinking - localghost

Generated code is rather a lot like fast fashion: it looks all right at first glance but it doesn’t hold up over time, and when you look closer it’s full of holes. Just like fast fashion, it’s often ripped off other people’s designs. And it’s a scourge on the environment.

The Future of Software Development is Software Developers – Codemanship’s Blog

The hard part of computer programming isn’t expressing what we want the machine to do in code. The hard part is turning human thinking – with all its wooliness and ambiguity and contradictions – into computational thinking that is logically precise and unambiguous, and that can then be expressed formally in the syntax of a programming language.

That was the hard part when programmers were punching holes in cards. It was the hard part when they were typing COBOL code. It was the hard part when they were bringing Visual Basic GUIs to life (presumably to track the killer’s IP address). And it’s the hard part when they’re prompting language models to predict plausible-looking Python.

The hard part has always been – and likely will continue to be for many years to come – knowing exactly what to ask for.

The Colonization of Confidence | Sightless Scribbles

I love the small web, the clean web. I hate tech bloat.

And LLMs are the ultimate bloat.

So much truth in one story:

They built a machine to gentrify the English language.

They have built a machine that weaponizes mediocrity and sells it as perfection.

They are strip-mining your confidence to sell you back a synthetic version of it.

Dissent | blarg

I suppose it’s not clear to me what a ‘good’ window into unreliable, systemically toxic systems accomplishes, or how it changes anything that matters for the better, or what that idea even means at all. I don’t understand how “ethical AI” isn’t just “clean coal” or “natural gas.” The power of normalization as four generations are raised breathing low doses of aerosolized neurotoxins; the alternative was called “unleaded”, but the poison was called “regular gas”.

There’s a real technology here, somewhere. Stochastic pattern recognition seems like a powerful tool for solving some problems. But solving a problem starts at the problem, not working backwards from the tools.

AI CEO – Replace Your Boss Before They Replace You

Delivering total nonsense, with complete confidence.

Pluralistic: The Reverse-Centaur’s Guide to Criticizing AI (05 Dec 2025) – Pluralistic: Daily links from Cory Doctorow

The promise of AI – the promise AI companies make to investors – is that there will be AIs that can do your job, and when your boss fires you and replaces you with AI, he will keep half of your salary for himself, and give the other half to the AI company.

That’s it.

That’s the $13T growth story that Morgan Stanley is telling. It’s why big investors and institutions are giving AI companies hundreds of billions of dollars. And because they are piling in, normies are also getting sucked in, risking their retirement savings and their family’s financial security.

Now, if AI could do your job, this would still be a problem. We’d have to figure out what to do with all these technologically unemployed people.

But AI can’t do your job. It can help you do your job, but that doesn’t mean it’s going to save anyone money.

The Jeopardy Phenomenon – Chris Coyier

AI has the Jeopardy Phenomenon too.

If you use it to generate code that is outside your expertise, you are likely to think it’s all well and good, especially if it seems to work at first pop. But if you’re intimately familiar with the technology or the code around the code it’s generating, there is a good chance you’ll be like hey! that’s not quite right!

Not just code. I’m astounded by the cognitive dissonance displayed by people who say “I asked an LLM about {topic I’m familiar with}, and here’s all the things it got wrong” who then proceed to say “It was really useful when I asked an LLM for advice on {topic I’m not familiar with, hence why I’m asking an LLM for advice}.”

Like, if you know that the results are super dodgy for your own area of expertise, why would you think they’d be any better for, I don’t know, restaurant recommendations in a city you’ve never been to?

The only winning move is not to play

My mind boggles at the thought of using a generative tool based on a large language model to do any kind of qualitative user research, so every single thing that Gregg says here makes complete sense to me.

On not choosing nice versions of AI – This day’s portion

Whenever anyone states that “AI is the future, so…” or “many people are using AI anyway, so…” they are not only expressing an opinion — they’re shaping that future.

Web development tip: disable pointer events on link images

Here’s a little snippet of CSS that solves a problem I’ve never considered:

The problem is that Live Text, “Select text in images to copy or take action,” is enabled by default on iOS devices (Settings → General → Language & Region), which can interfere with the contextual menu in Safari. Pressing down on the above link may select the text inside the image instead of selecting the link URL.
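
The snippet itself isn’t reproduced above, but going by the tip’s title it presumably disables pointer events on images that sit inside links. A sketch along these lines (the selector and exact rule are my assumption, not the author’s verbatim code):

  a img {
    /* Let taps and long-presses fall through to the parent link,
       so Live Text can't hijack the press and select text inside
       the image instead of triggering the link's contextual menu. */
    pointer-events: none;
  }

The trade-off is that the image itself stops responding to pointer interactions entirely, which is fine when the image exists purely as a link target.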

The line and the stream. — Ethan Marcotte

I’ve come to realize that statements about the future aren’t predictions: they’re more like spells. When someone describes something to you as the future, they’re sharing a heartfelt belief that this something will be part of whatever comes next. “Artificial intelligence isn’t going anywhere” quite literally involves casting a technology forward into time. How could that be anything else but a kind of magic?