The Unaccountability Machines

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

My regular readers will know that Dan Davies’ The Unaccountability Machine was a big influence on me. If you didn’t know this, well, you’ll probably keep hearing about it every now and then. Anyway, the short summary of this must-read book is that a lot of our systems (government, business, etc.) go off the rails because they focus on a few metrics, insulate themselves and their leaders from impact, and become destructive.

There, I summarized an enormously complex book that distills decades of thought into a paragraph. Go me. Anyway, on to the subject.

I was reading a recent article in 404 Media on how people are “staffing” companies with AI, or even discussing entire companies that are just bots/agents. It won’t surprise you that people flush with cash or wild ideas imagine a world where they just automate everything and rake in money.

Now there are many things wrong with this idea, from data center water burn to legal complications to AI being surprisingly crappy at many jobs. But I want to address what it’d be like to run a company with a bunch of stochastic systems doing work for you, because this sounds like the fears of The Unaccountability Machine taken to their logical conclusion. Or illogical conclusion.

Anyway, let’s imagine these automated AI companies in light of what we know about AI. You have a lot of automated processes running things, running them with no moral agency because they’re not people. We know how dangerous sycophantic AI can be because it tells you what you want to hear. All of this is abstract and distant from real human experience, even more so because of the hype cycle.

What you’ve got here is, well, an Unaccountability Machine. A nearly completely automated company of AI agents spinning around one person is not going to produce good, safe decisions. You may get something you can use to juice a stock and sell off, but it won’t be safe.

What you have are devices that ape human awareness, using old data, telling people what they want to hear, and when things go wrong the AI takes the blame. You have people insulated from real information, focused on limited measures, and using technology that sounds like it’s kissing up to them. All this does is amplify what happens to leaders anyway in our decaying government and business systems.

So, really, it’s just business as usual but faster. You can spin up bad ideas and unaccountability quicker.

Now I suspect a lot of this is just juicing stocks, posturing, and trying to ignore how AI costs are going to go up and legal issues will proliferate. So I’m more concerned about what happens in the meantime – I doubt it’ll be good – and then these “auto-companies” will need their work walked back.

Honestly, I hope most of them are scams. Maybe that’d be good.

I suspect Dan Davies is going to have to write yet another book.

Steven Savage

The Responsibility Machine

I’ve been preaching the virtues of the book The Unaccountability Machine to the point that I bought copies to give people for Christmas. You, my regular readers, shall be spared anything but a reminder – it’s about how organizations go insane following simplistic ideas and shield leadership from accountability. I’m still going to talk about it, but I’ll be taking a new tack and pushing it less.

The thing is, someone has to take responsibility in governments, companies, etc. If everyone goes hands-off and dodges responsibility, the organization will continue to fall into insanity. Backside-covering does a lot to keep an organization going, but the insanity will predominate. The organization might fall apart, get bought out, get sold, become completely financialized, etc.

I’d wager we’re going to see a lot of that in the next decade or two. We’ve already seen it from the Tories in the UK to Sears.

But anyway, someone has to take responsibility. That means, ironically, the more Unaccountability there is in a system, the more there has to be some responsibility. The Unaccountability Machine is also a Responsibility Machine. People need to step in to do the right thing, even as others don’t.

You’ve probably been there. You might be the person who is the Responsible one – come to think of it, if you’re one of my subscribers, you probably are.

This Taking of Responsibility can happen for a number of reasons. Some people just can’t stand to see things done wrong. Others like a challenge. Others really care about the system. People have a certain responsible streak in them, if only out of sheer irritation at seeing something done badly.

This urge to take Responsibility isn’t necessarily benevolent, either. A chance to take Responsibility can be a chance to advance one’s career – to where one can finally enjoy the benefits of Unaccountability. Responsibility can be a way to angle for a raise or a bonus. It can be a way to show off or put someone in their place. Don’t assume everyone rushing to prop up the various bad decisions in an organization is motivated by principle.

But the key thing is there are only so many heroes and opportunists at any organization. It also means the payoff they want – from seeing something work to a fat raise – needs to come. If it doesn’t, there will be fewer and fewer people taking Responsibility and more giving up or even seeking areas of Unaccountability.

And no one can cause more damage or grift the system better than someone who actually knows how stuff works – and gave up. They’re also the ones who warn others not to fall into the Responsibility trap, or not to get hired or join up at all. Even the more evil of the once-Responsible types don’t want any competition.

However, the people enjoying Unaccountability can coast on those taking Responsibility long enough to get a payout and leave.

So if we wonder how organizations persist when they’ve gone insane with Unaccountability (beyond money and influence), look for the people being Responsible. If you can’t find any, you may want to stop looking and get away.

Take a look at the world now and think that over.

Steven Savage

Take Some Responsibility

You probably heard the news: Air Canada had to pay up for something an “AI” chatbot said. This story saddens me, as I love flying on Air Canada. Honestly, in my trips up there the flight is often part of the fun.

Basically, a guy asked an Air Canada chatbot for advice on canceling a flight due to bereavement, and it gave him advice on refunds that was wrong. He followed the advice, and of course when he had to cancel he didn’t get his refund, so he made a small claims complaint to the appropriate body. Air Canada argued – seriously – that the chatbot is a legally distinct entity and that the guy shouldn’t have trusted the advice from the chatbot, which had gotten things wrong, but should have followed a link it provided instead.

Obviously, that didn’t fly, excuse the really stupid pun.

As an IT professional whose career is “older than One Piece,” let me weigh in.

I work in medical technology (indeed, it’s my plan to do this for the rest of my career). We vet everything we install or set up. We regularly review everything we set up. We have support systems to make sure everything is working. This is, of course, because you screw up anything medical and bad things happen.

Also, it’s because someone who goes into medical anything is usually pretty responsible. We IT folks are in the mix every day and know the impact of our job. We also work with – and sometimes are or were – doctors, nurses, and other medical professionals who get it.

I love working in this environment. If this appeals to you, I can honestly say check out working in medicine, medical research, and education. It’s awesome.

Know what? Other people using technology can and should take the same level of responsibility.

Technology is a choice. What you use, how you implement it, how you expose people to it – all of that is a choice. If you built it or paid for it or whatever, you take responsibility when it goes wrong, whether the stakes are a life or someone deserving a refund.

If the product isn’t what you thought? Then those who made it owe you an apology, a wad of cash, corporate dissolution, whatever. But either way someone takes responsibility, because technology is a choice.

We’ve certainly had enough of moving fast and breaking things, which really seems to just result in enshittification and more and more ways to be irresponsible.

Besides, reputation is involved, and if nothing else saying “we don’t care if our technology on a website goes wrong” is going to make people question everything else you do. I mean, if you were on an Air Canada plane after hearing about this “sorry, not our fault” approach, how safe would you feel?

Let’s try to be responsible here.

Steven Savage