My biggest adjustment with using Copilot was that instead of writing code, my posture shifted to reviewing code. Rather than a free form solo-coding session I was now in a pair-programming session with myself (ironically) in the copilot seat reviewing.

Dave Rupert, “Thoughts on Copilot”

There’s one use case that I find very well suited for Copilot, which spares me tons of tedious work - unit testing.

Ianis Triandafilov, “Using GitHub Copilot for unit testing”

A client informed me that he will no longer pay me to write content for his website because A.I. can write it for free, but he wants to pay me a fraction of my usual rate to “rewrite it” in different words so it can pass Google’s A.I. detection screening.

Jason Colavito, on Twitter

The justification for the [generative artificial intelligence] tool given to staff, multiple people say, was that it was a way to generate content that would take human writers longer — handling the “dull SEO-friendly topics” or making sure that legal requirements for writing about finance are met. It was sold as a way to free up staff time so they could do more thoughtful work. Instead, several staffers have departed since November, and morale is low at the outlet after several rounds of layoffs, according to former employees.

The Verge, “Inside CNET’s AI-powered SEO money machine”

Now, Stripe uses GPT-4 to scan these sites and deliver a summary, which outperforms those written by people.

“When we started hand-checking the results, we realized, ‘Wait a minute, the humans were wrong and the model was right.’” Mann says. “GPT-4 was basically better than human reviewers.”

Stripe’s customer testimonial on OpenAI

There’s a lot my anxious, anxious brain could say about the past year’s explosive investment in “generative artificial intelligence” (AI) tools.1 But I’ll just note that labor economics has an old, old term for [gestures around] all this: de-skilling. De-skilling is the process by which the demand for skilled labor is reduced by technology — if not eliminated altogether. As a new technology is introduced to an industry, it may be able to complete tasks that had traditionally been performed by skilled laborers. And as the technology matures, it can be overseen by fewer workers, which gradually lowers wages, and may eliminate jobs.

You can imagine why this has been on my mind. In the last year we’ve seen an explosion of utilities that can produce realistic-sounding copy, photorealistic images, and workable code — all derived from nothing more than text prompts written by a human. We’re only a few years on from watching someone build a working interface by sketching rudimentary shapes on a piece of paper, and now a few quick words in a chat interface can produce a passable-looking prototype. What does it mean to work as a designer — or an engineer, or a writer, or, or, or — when a machine produces something reasonably similar to what you or I might make?

Video source: A tweet by Rowan Cheung

From a purely technical standpoint, these tools are remarkable — I don’t want to deny that. And under certain circumstances, they might genuinely lower barriers for people.2 But with all that: these utilities are being created in a country that has minimal regulatory oversight, few privacy safeguards, and even fewer labor protections. And the makers of this software promise it can do tasks many digital workers always believed couldn’t be automated. Concerns about these tools “taking creative jobs” aren’t academic. People are losing work right now to generative AI.3

Of course, the last few months have seen hundreds of thousands of tech workers lose their jobs. Frankly, I don’t think it’s a coincidence that the tech industry conducted sweeping, wide-scale layoffs just as it began investing heavily in AI-driven automation. I’m still haunted by the fact that Microsoft announced a $10 billion investment in OpenAI, the research organization behind ChatGPT and DALL-E — and it made that announcement not five days after laying off 10,000 workers. It’s hard not to read that as a signal about the future of work in our industry. About whose work — and whose skill — gets to matter.

For me, the question becomes: what are we going to do about it?

Attention to the language of the discourse is important. Much clarification can be gained by focusing on language as an expression of values and priorities. Whenever someone talks to you about the benefits and costs of a particular project, don’t ask “What benefits?” ask “Whose benefits and whose costs?”

Ursula Franklin, The Real World of Technology

My thanks to Jeff Eaton, who kindly reviewed an earlier draft of this post.


  1. To put it mildly, “artificial intelligence” is a fraught term. It’s also not especially great at describing much of the software I mention here. For more on this, I’d recommend reading this profile of Emily Bender. ↩︎

  2. I’ll be honest — I’m a little skeptical of this argument, as it feels too much like tech boosterism to me. (Especially given where I think this is all very likely to go.) ↩︎

  3. Of course, certain classes of tech worker have been dealing with these effects for years. ↩︎