The Unaccountability Machines

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

My regular readers will know that Dan Davies’ The Unaccountability Machine was a big influence on me. If you didn’t know this, well, you’ll probably keep hearing about it every now and then. Anyway, the short summary of this must-read book is that a lot of our systems (government, business, etc.) go off the rails because they focus on a few metrics, insulate themselves and their leaders from impact, and become destructive.

There, I summarized in a paragraph an enormously complex book that distills decades of thinking. Go me. Anyway, on to the subject.

I was reading a recent article in 404 Media on how people are “staffing” companies with AI, or even discussing having entire companies that are just bots/agents. It won’t surprise you that people flush with cash or wild ideas imagine a world where they just automate everything and rake in money. Yeah, you’re not surprised.

Now, there are many things wrong with this idea, from data center water burn to legal complications to AI being surprisingly crappy at many jobs. But I want to address what it’d be like to run a company with a bunch of stochastic systems doing the work for you, because this sounds like the fears of The Unaccountability Machine taken to their logical conclusion. Or illogical conclusion.

Anyway, let’s imagine these AI companies, these automated companies, and what we know about AI. You have a lot of automated processes running things, running them with no moral agency because they’re not people. We know how dangerous sycophantic AI can be because it tells you what you want to hear, not what you need to know. All of this is abstract and distant from real human experience, more so because of the hype cycle.

What you’ve got here is, well, an Unaccountability Machine. A nearly completely automated company of AI agents spinning around one person is not going to get good, safe decisions. You may get something you can use to juice stock and sell off, but it won’t be safe.

What you have are devices that ape human awareness, using old data, telling people what they want to hear, and when things go wrong the AI takes the blame. You have people insulated from real information, focused on limited measures, and using technology that sounds like it’s kissing up to them. All this does is amplify what happens to various leaders anyway in our decaying government and business systems.

So, really, it’s just business as usual but faster. You can spin up bad ideas and unaccountability quicker.

Now, I suspect a lot of this is just juicing stocks, posturing, and trying to ignore how AI costs are going to go up and legal issues will proliferate. So I’m more concerned with what happens in the meantime – and I doubt it’ll be good – and then these “auto-companies” will need their work walked back.

Honestly, I hope most of them are scams. Maybe that’d be good.

I suspect Dan Davies is going to have to write yet another book.

Steven Savage

The Unaccountability Man


So lately, for many reasons, I’ve been thinking about how supposedly Great Men fail and let us down. We’ve all been disappointed, and as a person working in technology, I’ve had so many supposed luminaries disappoint me. I’ve been contemplating this for a while, and I found something that helps me understand it and makes clear how really bad it is to hero-worship someone into deciding your life for you.

(And notice how we always talk about Great Men? More on that later . . .)

Now as any regular reader knows, The Unaccountability Machine was a book that changed how I see the world, and I haven’t yet shut up about it. As I continue to not shut up about it, let me sum it up quickly: the book’s thesis is that organizations go insane because they pursue limited measures as goals (like stock market value). These organizations may persist – they may be quite good at persisting even when they go mad – but their decisions will cause problems.

Those problems, by the way, are sort of the last twenty to thirty years.

Now, the idea of some Great Awesome Business Leader is a form of madness no different than deciding stock value is the only thing to pursue. You have decided to focus on only one thing, and that thing is “whatever this dude says.” That is insane; it’s just an insanity we allow because some people believe in the Great Super Savior who will save us.

(Also, ever notice how this one Dude also is good for stock prices? Hmmm . . . )

Anyway, this problem has a few facets.

First, as cynical as I am about some Great Dude saving us, let’s say you find an actual Great Dude. Fine, maybe they’re worth following – but for how long? They may navigate issues today but not tomorrow, after the world changes. They may age out of understanding things, or just age. They might drop a bunch of very expensive hallucinogens on some New Age trip and fry their brains. Someone truly awesome isn’t forever and is still only human.

And that’s assuming the hero-worship and the money don’t go to their head. How many people who actually had at least some good ideas got so insulated from reality that they lost any actual skill they had? How would we even know, when we’re so busy telling them how awesome they are?

Second, there’s what Ed Zitron calls the Business Idiot: people who know how to play the various stock market and business games but don’t really know anything else. They’re good, perhaps very good, at fundraising and upping the stock price and getting venture capital – but that’s all they’re good at. They’ve learned how to work the system, and in doing so give an illusion of a deliverable.

Follow those people – who are great at selling themselves – and you have the madness of following a so-called Great Man, but also of following a shyster.

Third, there are people who fit the Great Man mold in a similar way, fitting what I call The Narrative. Some guy shows up who says the right things and does the right things that fit people’s – and the press’s – narratives, and wham, they’re rich and famous. You can make a lot of money and get power just by checking off the right boxes at the right time. This, I think, explains a lot of people.

This is where the term Great Man reveals the sexism in the discussions. Which tells you how much The Narrative controls our thoughts.

Fourth, of course, the Great Man idea just leads to grifters coming in, lying, and ripping people off. And we keep falling for it.

Looking for some hero to save the day, for someone to be the next Fill In The Blank, is a fool’s game. That person probably isn’t out there, is possibly conning you, and even if they are out there, they won’t last – they will get out of touch, or want to retire, or just pass away. It’s madness to rely on one person, no different than running a company just to get the stock price to go up.

Even if you benefit, what you leave in your wake will be harmful.

Steven Savage

The Morals of Madness


I’m fascinated by cult dynamics, because they tell us about people, inform us of dangers, and tell us about ourselves. Trust me, if you think you can’t fall into a cult you can, and are probably in more danger if you think you can’t. Understanding cults is self-defense in many ways.

On the subject of the internet age, I was listening to the famous Behind the Bastards podcast go over the Zizian “rationalist” cult. One of the fascinating things about various “rationalist” movements is how absolutely, confidently irrational they are, and how they touch on things that are very mainstream. In this case the Zizians intersected with some of the extreme Effective Altruists, a movement that seemed to start by asking “how do I help people effectively?” but in the minds of some prominent people became “it’s rational for me to become a billionaire so I can make an AI to save humanity.”

If you think I’m joking, I invite you to poke around a bit or just listen to Behind the Bastards. But quite seriously, you will find arguments that it’s fine to make a ton of money in an exploitative system backed by greedy VC because you’ll become rich and save the world with AI. Some Effective Altruists go all out in arguing that this is good because you save more future people than you hurt present people. Think about that – if you’ll do more good in the future, you can just screw over people now, become rich, and it’s perfectly moral.

If this sounds like extreme anti-choice arguments, yep, it’s the same thing – imagined or potential people matter more than people who are very assuredly people right now.

But as I listened to the Behind the Bastards hosts slowly try not to lose their minds while discussing those who had, something seemed familiar. People whose moral analysis had sent them around the bend into rampant amorality and immorality? An utter madness created by a simplistic measure? Yep, I heard echoes of The Unaccountability Machine, which, if you’ve paid attention, you know influenced me enough that you are fully justified in questioning me about it.

But let’s assume I’m NOT going to end up as the subject of a Behind the Bastards episode about a guy obsessed with a book on Business Cybernetics, and repeat one point from that book: obsessive organizations kill off their ability to course correct.

The Unaccountability Machine’s author, Dan Davies, notes some organizations are like lab animals studied after certain brain areas were removed: the animals could function but couldn’t adapt to change at all. Organizations that go mad, focusing on one or two metrics (like stock price), deliberately destroy their own ability to adapt, and thus can only barrel forward and/or die. They cannot adjust without major intervention, and some have enough money to at least temporarily avoid that.

The outlandish “future people matter, current people do not, so make me rich” crowd have performed a kind of moral severance on themselves. They have found a philosophy that lets them completely ignore actual people and situations in favor of something going on in their heads (and their bank accounts). Having found a measure they like (money!), they then find a way to cut themselves off from actual social and ethical repercussions.

If you live in the imaginary future and have money, you can avoid the real, gritty present. A lot of very angry people may not agree, but at that point you’re so morally severed you can’t understand why. Or you think they’re enemies, or not human, or something.

Seeing this cultish behavior in the context of The Unaccountability Machine helped me understand a lot of outrageous leadership issues we see from supposed “tech geniuses” – well, people who can get VC funding, which is what passes for such genius. Anyway, too many of these people and their hangers-on go in circles until they hone the right knife to cut away their morality. Worse, they then lose the instinct to really know what they’ve done to themselves.

Immorality plus a form of madness that can’t course-correct is not a recipe for long-term success, or for present-day morality. Looking at this through both cult dynamics and The Unaccountability Machine helps me understand how far gone some of our culture is. But at least that gives some hope to bring it back – or at least not fall into it ourselves.

And man, I do gotta stop referencing that book or I’m gonna seem like I’m in a cult . . .

Steven Savage