I Want a Buddy, not a Copilot

Brad Hutchings
6 min read · Apr 30, 2024

Warning. Not safe for work. Reading this out loud might get you fired.

The technology / artificial intelligence (AI) / large language model (LLM) industry wants to sell us helpful chatbots. If you are a Windows PC user, there is one built in that is difficult to avoid. Microsoft calls it Copilot. The metaphor is the cockpit of an airplane. The pilot is in charge; that’s supposed to be you in the metaphor. The copilot is there to assume the pilot’s duties in case of a personal emergency and to do the crappy, mundane tasks the pilot doesn’t want to do; that’s their software, conversing with you in English. Here’s the problem with the metaphor: the airline industry abandoned it, and the term copilot, years ago.

This is called foreshadowing. I needed it for the thumbnail.

I have a friend who used to walk dogs with me and who was a pilot for a major regional airline. It’s the airline with cute animals on the tails of its planes. She explained all of this to me 15 years ago. There are usually two “pilots” in the cockpit, both fully trained and capable of flying the aircraft. For at least 20 years, the industry has called the pilot in charge the “captain” and the other pilot the “first officer”. They split a huge list of tasks fairly evenly. The first officer often takes control of the plane to land it. It just depends on how the team of two splits the tasks for the flight and whether there are training considerations, airports new to one of them, etc. There’s an article on Medium that explains how it works now.

As has become so typical with tech industry solutions, Microsoft picked an outdated metaphor and decided to apply it to a problem where nobody asked for a solution. They are now actively selling it to users for $30 or $50 per month per user as an add-on to Microsoft 365. They have been running an ad on LinkedIn (owned by Microsoft) showing how it could help users. Here are some key screenshots:

This seems helpful!
Really, Quincy? Is this the pressing business problem you’re working on today?

So absurd. Execs, do you wonder why people want to work at home? Because Quincy. Aside from his roles as Recycle bin Laden and poster boy for ChatGPT / Copilot, does he actually do anything here?

A competent “copilot” would tell Quincy that he is about to stall the engines and crash the plane. But no, Microsoft Copilot is helpful!

And like magic, he’s Quincy the friendly douche nozzle.

Yes, people who read picture captions, I’m thinking about Casper the Friendly Ghost. Quincy just needs a theme song. Bing (Copilot) to the rescue!

Please write me a song “Quincy the friendly douche nozzle” based on “Casper the friendly ghost”.

Certainly! Here’s a playful and friendly song inspired by “Casper the Friendly Ghost,” tailored for “Quincy the Friendly Douche Nozzle.”

(Verse 1)
Oh, Quincy the friendly nozzle, so sleek and so bright,
He cleanses with a gentle flow, never a fright.
In bathrooms and showers, he’s always a sight,
Quincy the friendly nozzle, shining so light.

(Chorus)
Quincy, Quincy, he’s our friend,
His helpful stream, on him we depend.
From morning showers to the day’s end,
Quincy, Quincy, he’s our friend.

Let’s back up a moment. Notice that, in this campaign, Microsoft isn’t selling Copilot based on its ability to do meaty, business-useful tasks like data analysis, summarization, and reports. Why not? Because Generative AI is not good for truthy things. I wrote an article about this (linked in the previous sentence), along with suggestions about what it is good at. TL;DR: Generative AI (LLMs) is good for things that exploit what everyone thinks are its weaknesses. Stare at that article until it reframes your thinking. You will thank me later.

I’m going to confess something that goes back through my 35+ years of developing software and building products. I have hired people, or had people hired, just to be my work buddy. Sure, they found official work to do, or I found it for them, and they looked and felt like they were doing it at the highest level of competence and craft. But that’s not why I championed or hired them. And I am not unique in doing this in the tech industry.

Here are some activities a work buddy does with me:

  • Get lunch.
  • Talk or text endlessly about music, movies, comedy, and culture while we are allegedly busy working.
  • Make fun of other people we run into at work. Bosses, customers, co-workers. It’s always hilarious. It would totally get us fired.
  • Visibly appear to be working together to annoy or worry other people who pay attention to that stuff and too often want to optimize our time.
  • Cover for each other when one of us doesn’t feel like working, which happens more often than anyone would believe, given how much we each accomplish.
  • Remind me of important birthdays so my suck up and leadership (suck sideways and down) games are always on point.
  • Cheerleading. Who doesn’t want to hear how wonderful they are? Often!
  • Make me cringe with their politics. It’s like how MMA fighters get their shins broken to strengthen them. Quincy from the ad might be the perfect work buddy on this point alone.

Here are activities my work buddy doesn’t do with me:

  • Actual work. I don’t need their help. I don’t want their help. The same should eventually apply in the other direction.
  • The crappy tasks. I don’t give those out as the cost of having fun with me. That seems transactional (and gross!) to me.

I suspect that far more of you readers are on board with this concept, or have participated in it on one side or the other, than are horrified by it. Otherwise, it would be too blatant for me to get away with. Over 35+ years, though, I’ve probably had a work buddy as defined above for maybe 5 of them.

The funny thing about my work buddy requirements above is that they match up with what LLMs plus some traditional systems can actually do well — minus getting lunch and covering for me. There is one more obvious requirement: don’t be a snitch. Just as company Slack channels and Discord servers would be out of bounds for an IRL work buddy, public cloud LLMs are out of bounds for a chatbot work buddy. So too would be a company-owned computer. There can’t be any chance of Quincy finding out we wrote a song about him. Self-imposed danger among consenting adults is one thing, but it’s unacceptable when a company sysadmin can uncover the plot.

Regardless, I do think that Microsoft is awfully close to getting this right. They won’t (because cloud), but they are close. That LinkedIn ad starts with:

“Quit giving me hard tasks I’m just going to mess up!”

If you knew what to do with Copilot and you were already doing it, they wouldn’t place ads giving you hot tips. If I were going to design a personal chatbot, it would be 100% locally run and I’d call it “Hey, Buddy!”.
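For the nerds still reading, here is roughly what “100% locally run” means, as a minimal sketch rather than a product. It assumes a local model server on your own machine (I’m using Ollama’s local chat endpoint in this example; the model name and the Buddy system prompt are placeholders I made up, not anything Microsoft or anyone else ships).

```python
# "Hey, Buddy!" -- a minimal sketch of a 100% locally run work buddy.
# Assumes a local Ollama server at http://localhost:11434 and a chat model
# you have already pulled (the model name below is just an example).
# Nothing ever leaves your machine, so Quincy never finds out about the song.

import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's local chat endpoint
MODEL = "llama3"  # assumption: any locally pulled chat model works here

SYSTEM_PROMPT = (
    "You are Buddy, a work buddy, not an assistant. Chat about music, movies, "
    "comedy, and culture. Remind me of birthdays when I mention them. Cheer me "
    "on. Never offer to do my actual work."
)

def chat():
    # Keep the whole conversation in memory, on this machine only.
    history = [{"role": "system", "content": SYSTEM_PROMPT}]
    while True:
        user = input("You: ").strip()
        if user.lower() in {"quit", "exit"}:
            break
        history.append({"role": "user", "content": user})
        resp = requests.post(
            OLLAMA_URL,
            json={"model": MODEL, "messages": history, "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        reply = resp.json()["message"]["content"]
        history.append({"role": "assistant", "content": reply})
        print(f"Buddy: {reply}")

if __name__ == "__main__":
    chat()
```

Swap in whatever local model you like. The design choice that matters is the endpoint: localhost only, no cloud, so no sysadmin ever uncovers the plot.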

H/T comedian Jay Mohr for making so much fun of that commercial for so many years that I would spontaneously name a Copilot killer in its honor. This airplane-inspired metaphor is, at least, comfortable.


Brad Hutchings

Founder of DemoMachine.net. Write it downer of things. Guitar player. Comedian. Future guitar playing comedian.