But What If It’s Not Worth Doing?

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

OK, this isn’t another post on AI exactly. I get it, there’s a lot of talk of AI – hell, I talk about it a lot, usually whenever Ed Zitron goes on a tear or my friends in tech (i.e., all my friends) discuss it. If I were friends with Ed Zitron, who knows what I’d write.

The funny thing about AI is that it’s about automation. Yes it’s complex. Yes it’s controversial. Yes, it lets you generate pictures of Jesus as a Canadian Mountie (Dudley Do-Unto-Others?). But it’s automation at the end of the day. It’s no different than a clock or a pneumatic delivery system.

And, going back to a conversation I had with friends: when you automate something on the job or at home, it’s worth asking a question – should you have been doing it in the first place?

First, if you get something you have to automate, should it have been assigned to you at all? If something really isn’t part of your portfolio of work, maybe someone else should do it. Yes, this includes home tasks – like those shelves you have not put up and almost certainly never will.

A painful reality I’ve come to realize is that many people take on tasks someone else could do, and often do better. For whatever reason the task drifts to them, and of course they stick with it. Worse, the people who are really good at it would often handle it better, might even have more time for it, and would hurt themselves less doing it.

A need to automate something often says “I don’t need to do this and I may be bad at it,” and the task should move up, down, or somewhere else. I’m not saying automate it, I’m saying reassign it – possibly to someone who will automate it anyway, but still.

Secondly, and more importantly, if you have a task that can be automated, it’s time to ask whether anyone should be doing it at all.

Anything really important needs a person – a moral authority – to make a decision. You have both the decision-making skills and the ethical capacity to make the right call. Automation certainly doesn’t have the ethical element, and if the task doesn’t need your decision-making skills . . . why are you, or anyone, doing it?

The task might be unnecessary. It could – and trust me, I see this a lot – be the result of other automatic generation or other bad choices. It may be a signoff no one needs to sign off on, an automatic update you don’t need to be updated on, or who knows what else. I honestly think a lot of work is generated by other automated processes and choices that could just bypass people anyway.

But there’s also the chance the task is unneeded, shouldn’t exist, or is really a bad idea. Look, if a task is assigned to you – a competent individual with good morals – and you want to automate it, maybe it just should never have existed. Just as good Agile methods are about making sure you don’t do unneeded work, good process is the same.

Whenever something has to be automated, it’s a good time to ask “why did this come to me in the first place?” The answer may save you the time spent automating, letting you instead hand it off, change how things work, or just ignore it.

And that’s not just AI. That’s anything.

Steven Savage

It’s The Ones We Noticed


People developing psychosis while using ChatGPT have been in the news a lot. Well, the latest story is about an OpenAI investor who seemed to lose it in real time, leading to, shall we say, concerns. The gentleman in question seemed to spiral into thinking the world was like the famous SCP Foundation collective work.

Of course people were a little concerned. A big AI investor losing his mind isn’t exactly building confidence in the product or the company. Or, for that matter, in investing.

But let me gently suggest that the real concern is that this is the one we noticed.

This is not to say all sorts of AI bigwigs and investors are losing their minds – I think some of them have other problems or lost their minds for different reasons. This isn’t to say the majority of people using AI are going to go off into some extreme mental tangent. The problem is that AI, having been introduced recently, is going to have impacts on mental health that will be hard to recognize because this is all happening so fast.

Look, AI came on quick. In some ways I consider that quite insidious, as it’s clear everyone jumped on board looking for the next big thing. In some ways it’s understandable because, all critiques aside (including my own), some of it is cool and interesting. But as with a lot of things, we didn’t ask what the repercussions might be – which has been a bit of a problem since around about the internal combustion engine.

So now that we have examples of people losing their minds – and developing delusions of grandeur – due to AI, what are we missing?

It might not be as bad as the cases that make the news – no founding a religion or creating some metafiction roleplay that gets too real. But a bit of an extra weird belief, that strange thing you’re convinced of, something that’s not as noticeable but still goes too far. Remember all the people who got into weird conspiracies online? Yeah, well, we’ve automated that.

We’re also not looking for it, and maybe it’s time we did – what kinds of mental health challenges are people developing due to AI that we simply aren’t watching for?

There might not even be anything there – these cases may just be unfortunate ones that stand out. But I’d really like to know, especially as the technology spreads – and as you know, I think it’s spreading unwisely.

Steven Savage

We Stopped Caring. We Can Again.


Am I going to talk about AI? Sort of, yes! But mostly I’m going to talk about caring and people, inspired by Dan Sinker’s blog post “The Who Cares Era” – just go read it.

We live in an era where it doesn’t seem like people care. Market-tested, perfectly crafted media is everywhere, designed to interest us no matter what its content is. Extended cinematic universes are calculated carefully enough that enthusiasm has become drudgery. Politics are publicity stunts, religion is a lifestyle brand, nothing is what it is.

Finally we’ve got AI, and you can use AI to generate content for someone else – who might then use AI to get a summary of what you didn’t create.

So many people don’t seem to care. We’ve got all sorts of content – writing, media – but so much of it isn’t made by people who care, or even consumed by people who care. AI is a pain in my backside, but it’s the result of where we were slowly heading.

(And as I’ve noted, I don’t consider AI new or always bad, but by now you know that, so let me keep ranting.)

What I think we’re missing is that good art, or writing, or media is by someone and for someone. There’s an intimately involved creator who gives a damn and someone who wants that media. Real, meaningful creativity and communication is about people on both ends.

Every now and then people rediscover Ed Wood, a man with many flaws, erratic talents, and an unstoppable desire to make things. There is a reason he had and has fans, because he was doing something – if not well – and that connects with people. I say this as a person who has watched his films without the Rifftrax treatment.

I get zines, and the sheer personal feel of them is amazing. I love getting to know people, and the personal nature of zines lets me connect with people who want connection. I have a shelf that includes strange collage art, speculations on podcasts, street photography, and more. There are people on both ends.

On a very personal level, I had a friend who was inspired to change jobs in part by my book Fan To Pro. That’s what I’m talking about: a person really wants to say something, another wants to listen, and in some cases a dialogue ensues.

AI, in the way it’s often deployed, messes this up – creating generated content without a person behind it and summarizing data so people don’t have to connect. But that’s been an issue with mass media for a while: calculated marketing, mass efforts, formulaic creations, and plenty of imitators. We’ve industrialized our arts too much, and AI just slots in there really well.

If you feel that dissatisfaction, that there is something missing in culture and arts and just talking to people, you get it. We’ve industrialized, and misuse of AI takes it farther.

It’s time to care, to care enough to create – and care enough to pay attention.

Steven Savage