It’s Bad It’s So Bad It’s Good

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

All right, it’s time to talk AI again. This also means I have to use my usual disclaimer of “what we call AI has been around for a while, it’s been very useful and is useful, but we’re currently in an age of hype that’s creating a lot of crap.” Anyway, there, packed that disclaimer into one sentence, go me.

I’ve seen “AI-ish stuff” for 30 years, and the hype for it is way different this time.

Watching the latest hype for “AI” (that pile of math and language that people are cramming into everything, needed or not), I noticed the hype often doubled as a threat. We have to build this. We have to build this before the bad guys build it. We have to build a good AI before a bad AI. This may all be dangerous anyway!

Part of current AI marketing seems to be deliberately threatening. In a lot of cases it’s the threat of AI itself, which, you know, may not be a selling point. I mean, I don’t want a tool that might blow up in my face. Also, Colossus: The Forbin Project freaked me out as a kid, and that was about competing AIs teaming up – so you’re not selling me with the threat that we have to make AI to stop AI.

But this marketing-as-threat gnawed at me. It sounded familiar, in that “man, that awful smell is familiar” type way. It also wasn’t the same as what I was used to in tech hype, and again, I’ve worked in tech for most of my life. Something was different.

Then it struck me. A lot of the “hype of the dangerous-yet-we-must-use-it” aspects of AI sounded like the lowest form of marketing aimed at men.

You know the stuff. THIS energy drink is SO dangerous YET you’re a wimp if you don’t try it. Take this course to become a super-competitive business god – if you’re not chicken, oh, and your competitors are taking it anyway. Plus about every Influencer on the planet with irrelevant tats promising to make you “more of a man” with their online course. The kind of stuff that I find insulting as hell.

Male or female, I’m sure you’re used to seeing these kinds of “insecure dude” marketing techniques. If you’re a guy, you’re probably as insulted as I am. Also, you’d probably like them to stop showing up in your ads thanks to the algorithms.

(Really, look, online ads, my prostate is fine and I’m not interested in your weird job commercials).

Seeing the worst of AI hype as no different from faux-macho advertisements aimed at selling useless stuff to insecure guys really makes it sit differently. That whiff of pandering and manipulation, of playing to insecurity mixed with power fantasies, is all there. The difference between the latest AI product and untested herbal potency drugs is nil.

And that tells me our current round of AI hype is way more about marketing than actual product, and way more pandering than most past hype. And after 30+ years in IT, I’ve been insulted by a lot of marketing, and this is pretty bad.

With that realization I think I can detect and diagnose hype more easily. Out of that I can navigate the current waters better – because if your product marketing seems to be a mix of scaring and insulting me, no thanks.

Steven Savage

Save Me From Peak Performance

I am not interested in Peak Performance.

Yes, YouTube Bros, Seminar Spewers, Vitamin Vendors, and people promising me 5G-proof underwear: I do NOT want to operate at my peak. I’m good, thanks. I’m happy not to be at Peak Performance.

See, the problem with Peak Performance is that it’s about shaving down your life to optimize one area. Know what? My life is good. It’s very diverse. I do art, I write, I play video games, I manage projects for medical research, and a lot more. I’m not willing to give up that stuff just to have shredded abs or be the world’s greatest project manager (I’ll stay in the top ten, thanks). Peak Performance is all about dedicating yourself to one thing to the point you’re just not you.

For that matter, is it even worth the effort? Do I want to take your six-week seminar for the price of a new car? Do I want to spend twenty weeks training for, I dunno, my own fragile ego? Look, I got things to do, donuts to eat, and stupid anime to watch, thanks. I have a life.

Does my Peak Performance, being Top Alpha of Bullshit Mountain, even matter to people? Will it make me a better friend, boyfriend, co-worker, cat-petter, or for that matter person? Like, is it going to help anyone? Or am I just going to become even more annoying?

But also, do I even want Peak Performance? I mean, by whose standards – some tatted-up grifter on his third business selling me supplements? Some shrieking news personality with a side gig? Maybe my idea of Peak Performance isn’t what these people are trying to sell me – and for that matter, most of them seem to be selling me ways to compensate for insecurities I don’t have.

Really, let’s be honest, Peak Performance is a kind of madness, telling you there’s this one thing you have to do to be complete, and that’s all you focus on. It’s marketed personal insanity, and to judge by the wildly stupid stuff I see, it’s also an attempt to manipulate vulnerable people. Let’s face it, we’re all vulnerable at some point.

Of course it’s peak Late Stage Capitalism, promising you an optimized but dehumanized life that someone will sell to you. It’s selling you back all the stuff that made you miserable in the first place.

So nope, I’m fine being me, thanks.

Steven Savage

The Morals of Madness

I’m fascinated by cult dynamics, because they tell us about people, inform us of dangers, and tell us about ourselves. Trust me, if you think you can’t fall into a cult, you can – and you’re probably in more danger precisely because you think you can’t. Understanding cults is self-defense in many ways.

On the subject of the internet age, I was listening to the famous Behind the Bastards podcast go over the Zizian “rationalist” cult. One of the fascinating things about various “rationalist” movements is how absolutely confidently irrational they are, and how they touch on things that are very mainstream. In this case the Zizians intersected with some of the extreme Effective Altruists, which seemed to start by asking “how do I help people effectively” but in the minds of some prominent people became “it’s rational for me to become a billionaire so I can make an AI to save humanity.”

If you think I’m joking, I invite you to poke around a bit or just listen to Behind the Bastards. But quite seriously, you will find arguments that it’s fine to make a ton of money in an exploitative system backed by greedy VCs because you’ll become rich and save the world with AI. Some Effective Altruism goes all out in arguing that this is good because you save more future people than you hurt present people. Think about that – if you’ll do more good in the future, you can just screw over people now and become rich, and it’s perfectly moral.

If this sounds like extreme anti-choice arguments, yep, it’s the same – imagined or potential people matter more than people who are very assuredly people now.

But as I listened to the Behind the Bastards hosts slowly try not to lose their minds while discussing those who had, something seemed familiar. People whose moral analysis had sent them around the bend into rampant amorality and immorality? An utter madness created by a simplistic measure? Yep, I heard echoes of The Unaccountability Machine, which, if you’ve paid attention, you know influenced me enough that you are fully justified in questioning me about that.

But let’s assume I’m NOT going to end up on a Behind the Bastards episode about a guy obsessed with a book on Business Cybernetics, and repeat one point from that book – obsessive organizations kill off the ability to course-correct.

The Unaccountability Machine author Dan Davies notes some organizations are like lab animals studied after certain brain areas were removed: the animals could function but couldn’t adapt to change at all. Organizations that go mad, focusing on a single metric or two (like stock price), deliberately destroy their own ability to adapt, and thus can only barrel forward and/or die. They cannot adjust without major intervention, and some have enough money to at least temporarily avoid that.

The outlandish “future people matter, current people do not, so make me rich” crowd have performed a kind of moral severance on themselves. They have found a philosophy that lets them completely ignore actual people and situations for something going on in their heads (and their bank accounts). Having found a measure they like (money!), they then find a way to cut themselves off from actual social and ethical repercussions.

If you live in the imaginary future and have money, you can avoid the real, gritty present. A lot of very angry people may not agree, but at that point you’re so morally severed you can’t understand why. Or you decide they’re enemies or not human or something.

Seeing this cultish behavior in the context of The Unaccountability Machine helped me understand a lot of the outrageous leadership issues we see from supposed “tech geniuses.” Well, people who can get VC funding, which is what passes for such genius. Anyway, too many of these people and their hangers-on go in circles until they hone the right knife to cut away their morality. Worse, they then lose the instinct to really know what they did to themselves.

Immorality plus a form of madness that can’t course-correct is not a recipe for long-term success or current morality. Looking at this from both cult dynamics and The Unaccountability Machine helps me understand how far gone some of our culture is. But at least that gives some hope to bring it back – or at least not fall into it.

And man I do gotta stop referencing that book or I’m gonna seem like I’m in a cult . . .

Steven Savage