It’s The Ones We Noticed

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

Reports of people developing psychosis while using ChatGPT have been in the news a lot. Well, the latest story is about an OpenAI investor who seemed to lose it in real time, leading to, shall we say, concerns. The gentleman in question seemed to spiral into thinking the world was like the famous SCP Foundation collaborative fiction project.

Of course people were a little concerned. A big AI investor losing his mind isn’t exactly building confidence in the product or the company. Or, for that matter, in investing.

But let me gently suggest that the real concern is that this is the one we noticed.

This is not to say all sorts of AI bigwigs and investors are losing their minds – I think some of them have other problems or lost their minds for different reasons. This isn’t to say the majority of people using AI are going to go off into some extreme mental tangent. The problem is that AI, having been introduced recently, is going to have impacts on mental health that will be hard to recognize because this is all happening so fast.

Look, AI came on quick. In some ways I consider that quite insidious, as it’s clear everyone jumped on board looking for the next big thing. In some ways it’s understandable because, all critiques aside (including my own), some of it is cool and interesting. But like a lot of things, we didn’t ask what the repercussions might be, which has been a bit of a problem since about the time of the internal combustion engine.

So now that we have examples of people losing their minds – and developing delusions of grandeur – due to AI, what are we missing?

It might not be as bad as the cases that make the news – no founding a religion or creating some metafiction roleplay that’s too real to you. But a bit of an extra-weird belief, that strange thing you’re convinced of, something that’s not as noticeable but still goes too far. Remember all the people who got into weird conspiracies online? Yeah, well, we’ve automated that.

We’re also not looking for it, and maybe it’s time we did – what kinds of mental health challenges are people developing due to AI that we’re simply not catching?

There might not even be anything – these cases may just be unfortunate ones that stand out. But I’d really kind of like to know, especially as the technology spreads, and as you know, I think it’s spreading unwisely.

Steven Savage

We Stopped Caring. We Can Again.

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

Am I going to talk about AI? Sort of, yes! But mostly I’m going to talk about caring and people, inspired by Dan Sinker’s blog post “The Who Cares Era” – just go read it.

We live in an era where it doesn’t seem like people care. Market-tested, perfectly calibrated media is everywhere, designed to hold our interest no matter what its content is. Extended cinematic universes are calculated so carefully that enthusiasm has become drudgery. Politics are publicity stunts, religion is a lifestyle brand, nothing is what it is.

Finally we’ve got AI, and you can use AI to generate content for someone else – who might then use AI to get a summary of what you didn’t create.

So many people don’t seem to care. We’ve got all sorts of content, writing, media, but so much of it isn’t made by people who care or even consumed by people who care. AI is a pain in my backside, but it’s the result of where we were slowly heading.

(And as I’ve noted, I don’t consider AI new or always bad, but by now you know that, so let me keep ranting).

What I think we’re missing is that good art, or writing, or media is by someone and for someone. There’s an intimately involved creator who gives a damn and someone who wants that media. Real, meaningful creativity and communication is about people on both ends.

Every now and then people rediscover Ed Wood, a man with many flaws, erratic talents, and an unstoppable desire to make things. There is a reason he had and has fans: he was doing something – if not well – and that connects with people. I say this as a person who has watched his films without the RiffTrax treatment.

I get zines, and the sheer personal feel of them is amazing. I love getting to know people, and the personal nature of zines lets me connect with people who want connection. I have a shelf that includes strange collage art, speculations on podcasts, street photography, and more. There are people on both ends.

On a very personal level, I had a friend who got inspired to change jobs in part due to my book Fan To Pro. That’s what I’m talking about: a person really wants to say something, another wants to listen, and in some cases a dialogue ensues.

AI, the way it’s often deployed, messes this up, both creating generated content without a person behind it and summing up data so people don’t have to connect. But that’s been an issue with mass media for a while, with calculated marketing, mass efforts, formulaic creations, and plenty of imitators. We’ve industrialized our arts too much, and AI just slots in there really well.

If you feel that dissatisfaction, that sense something is missing in culture and arts and just talking to people, you get it. We’ve industrialized, and misuse of AI takes it even further.

It’s time to care, to care enough to create – and care enough to pay attention.

Steven Savage

It’s Bad, It’s So Bad It’s Good

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

All right, it’s time to talk AI again. This also means I have to use my usual disclaimer of “what we call AI has been around for a while, it’s been very useful and still is, but we’re currently in an age of hype that’s creating a lot of crap.” Anyway, there, packed that disclaimer into one sentence, go me.

I’ve seen “AI-ish stuff” for 30 years, and the hype for it is way different this time.

Watching the latest hype for “AI” (that pile of math and language people are cramming into everything, needed or not), I started noticing how much of the hype also seemed to be a threat. We have to build this. We have to build this before the bad guys build it. We have to build a good AI before a bad AI. This may all be dangerous anyway!

Part of current AI marketing seems to be deliberately threatening. In a lot of cases it’s the threat of AI itself, which, you know, may not be a selling point. I mean, I don’t want a tool that might blow up in my face. Also, Colossus: The Forbin Project freaked me out as a kid, and that was about competing AIs teaming up, so you’re not selling me with the threat that we have to make AI to stop AI.

But this marketing-as-threat gnawed at me. It sounded familiar, in that “man, that awful smell is familiar” type way. It also wasn’t the same as what I was used to in tech hype, and again, I’ve worked in tech for most of my life. Something was different.

Then it struck me. A lot of the “dangerous-yet-we-must-use-it” hype around AI sounded like the lowest form of marketing aimed at men.

You know the stuff. THIS energy drink is SO dangerous YET you’re a wimp if you don’t try it. Take this course to become a super-competitive business god – if you’re not chicken, and oh, your competitors are taking it anyway. Plus about every influencer on the planet with irrelevant tats promising to make you “more of a man” with their online course. The kind of stuff I find insulting as hell.

Male or female, I’m sure you’re used to seeing this kind of “insecure dude” marketing. If you’re a guy, you’re probably as insulted as I am. You’d probably also like the algorithms to stop shoving it into your ads.

(Really, online ads, my prostate is fine and I’m not interested in your weird job commercials.)

Seeing the worst of AI hype as no different from faux-macho advertisements meant to sell useless stuff to insecure guys really makes it sit differently. That whiff of pandering and manipulation, of playing to insecurity mixed with power fantasies, is all there. The difference between the latest AI product and untested herbal potency drugs is nil.

And that tells me our current round of AI hype is way more about hype than actual product, and way more pandering than a lot of past hype. After 30+ years in IT I’ve been insulted by a lot of marketing, and this is pretty bad.

With that realization I think I can detect and diagnose hype more easily. Out of that I can navigate the current waters better – because if your product marketing seems to be a mix of scaring and insulting me, no thanks.

Steven Savage