Take Some Responsibility

You probably heard the news: Air Canada had to pay up for something an “AI” chatbot said. This story saddens me, as I love flying on Air Canada. Honestly, on my trips up there the flight is often part of the fun.

Basically, a guy asked an Air Canada chatbot for advice on canceling a flight due to a bereavement, and it gave him advice on refunds that was wrong. He followed the advice, and of course when he had to cancel, he didn’t get his refund, so he made a small claims complaint to the appropriate body. Air Canada argued – seriously – that the chatbot was a legally distinct entity and that the guy shouldn’t have trusted its advice, but should instead have followed a link it provided – a link from the same chatbot that had gotten things wrong.

Obviously, that didn’t fly, excuse the really stupid pun.

As an IT professional whose career is “older than One Piece,” let me weigh in.

I work in medical technology (indeed, it’s my plan to do this for the rest of my career). We vet everything we install or set up. We regularly review everything we set up. We have support systems to make sure everything is working. This is, of course, because you screw up anything medical and bad things happen.

Also, it’s because someone who goes into medical anything is usually pretty responsible. We IT folks are in the mix every day and know the impact of our job. We also work with – and sometimes are or were – doctors, nurses, and other medical professionals who get it.

I love working in this environment. If this appeals to you, I can honestly say check out working in medicine, medical research, and education. It’s awesome.

Know what? Other people using technology can and should take the same level of responsibility.

Technology is a choice. What you use, how you implement it, how you expose people to it – all of that is a choice. If you built it or paid for it or whatever, you take responsibility when it goes wrong, whether the stakes are a life or someone’s deserved refund.

If the product isn’t what you thought? Then those who made it owe you an apology, a wad of cash, corporate dissolution, whatever. Either way, someone takes responsibility, because technology is a choice.

We’ve certainly had enough of moving fast and breaking things, which really seems to just result in enshittification and more and more ways to be irresponsible.

Besides, reputation is involved, and if nothing else, saying “we don’t care if our technology on a website goes wrong” is going to make people question everything else you do. I mean, if you were on an Air Canada plane after hearing about this “sorry, not our fault” approach, how safe would you feel?

Let’s try to be responsible here.

Steven Savage

How Deep Does the B.S. Go?

Lately I’ve been speculating on the role of B.S. in our economy, politics, and technology. I’d spell it out (and probably swear more), but I do have some discretion!

We’ve normalized the idea that some people are, honestly, lying to us. We expect that we’re being lied to by marketing forces, by the latest trends, and by politicians. It’s so normalized, it seems we can’t imagine a less deceptive world.

(It also makes me realize how people can get blasé about COVID.)

In turn, we’ve also normalized that people we like are lying. Yeah, that famous tech guy is hyping stuff, but we like his product. Sure, the politician we voted for is spouting demented nonsense, but they’re our politician. We go see a movie we know we’ve been sold on in the negative sense, or a restaurant whose food is just “OK,” but you know – advertising and familiarity.

What’s struck me lately is that we are probably too used to lying as well.

When I’ve seen people rallying to the defense of people, media, and so on that they like, I hear them repeat talking points. You can tell with just a bit of empathy that many of them don’t really or exactly believe what they say – but they say it anyway to defend what they like, for whatever reason.

I even found myself tempted to do it (which tells me I do do it, I just didn’t catch myself).

I’m wondering how deep the B.S. in our media-saturated, pundit-heavy, social media culture goes by now. I mean, yes, humans have always lied to others and themselves, but it feels pretty amplified – and adverse to our survival – in my experience. How much of our lives, as individuals, is just lying about stuff?

I think some of it is definitely internet and media culture. Say the right things, do the right things, and you get money, attention, and might even become some kind of Influencer or Pundit. You can lie for a living if you play your cards right! Whatever B.S. problems we had in the past, now you can spread it faster, giving less time for experience and other people to provide restraining feedback.

In a time of chaos and climate change, this is even more disturbing. We’ve got a lot of problems to solve, or at least survive, and if we’ve all internalized outright deception to an extent, it’s going to be much harder. When everyone is busy not telling the truth, it gets harder to tell the truth, and even when a bunch of people do, too many others still won’t, for reasons of their own.

I know at least I’ll be watching myself closer. But this is going to haunt me.

Steven Savage

Optimized Failure

I saw an online conversation about the book How Infrastructure Works by Deb Chachra. In this book, which I apparently have to buy, the author notes that resilient systems are not optimized systems. To make a network, an organization, or a team resilient, you need redundancy, slack time, backups, vacation time, whatever. Doing something perfectly doesn’t mean you’ll be able to keep doing it, because being able to keep doing it isn’t part of actually doing it.

This is a very obvious statement that very obviously gets missed everywhere. If you’ve ever had to explain to someone that their network needs backups, or that the fastest transport route isn’t necessarily the reliable one, you get the idea. I know I’ve been there.

I also think it explains a lot about the brittleness in today’s world. We see collapsing ecosystems, housing prices going out of reach, and stagnant wages. We’re supposedly in this high-tech age with fast deliveries and electronic banking, an optimized age, but it’s fraying, isn’t it?

A big part of this is that we figured businesses – hell, even governments – should all be optimized to one goal: profit. Make money really well, and that’s it! Of course, at that point all you get good at is making money – probably very fast, at that. You have quarterly reports to make, after all.

I think this means a lot of companies and other organizations are brittle because they’re optimized to just make money. They’re not resilient, as they rely entirely on getting as much cash as possible. Sure, they can spend that cash when things turn painfully non-resilient, to get through bad markets and so on, but that’s not the same as actually enduring. Ask anyone who’s been through a third round of layoffs.

It also means that there’s damage to the resilience of society. Regulatory capture means there’s less resilience brought by laws and policies. Layoffs to protect the bottom line destroy lives. The environment takes a hit from our pollution and dumping and the like. Bought-off politicians avoid doing anything to help people, anything resilient.

Profit-focus is just another form of optimization. And like any narrow optimization, it backfires.

Today I hear talk about the kind of larger crisis our world is in, and I think a lot of it comes down to optimizing for profit-seeking. We got so good at turning things into money that we ignored resilience. Now we’re going to have to switch back or face some pretty severe consequences.

I know what order I expect, sadly.

Steven Savage