51 comments
Zorin =^o.o^=

@stavvers This is my favorite story of the year.

It's about time a company got bit in the ass by their aggressive cost-cutting AI nonsense.

And the thing is, it was absolutely peanuts to Air Canada. But you know this precedent means all companies will now hesitate to deploy this garbage for fear of much larger potential damages.

Another Angry Woman

@zorinlynx my favourite detail in the story is they disabled the chatbot over those peanuts because they *know* they got off lightly

Robin Adams

@zorinlynx @stavvers Except for this:

"Experts told the Vancouver Sun that Air Canada may have succeeded in avoiding liability in Moffatt's case if its chatbot had warned customers that the information that the chatbot provided may not be accurate."

So all they'll do is add a disclaimer. But still fire all the human customer support reps so the lying chatbot is your only option.

Fahri Reza

Why post a question to the chatbot if you'd just be lied to, and if you were lied to, it's your fault for asking the bot in the first place? @robinadams @zorinlynx @stavvers

Petr Tesarik

@dozymoe @robinadams @zorinlynx @stavvers Yeah, good observation! Why did Air Canada even waste money on an AI chatbot if all they really wanted was to fire customer support?

If I were a shareholder, I would demand a thorough explanation of this failed investment by the executive team…

Dave "Wear A Goddamn Mask" Cochran :donor:

@stavvers "“So in the case of a snowstorm, if you have not been issued your new boarding pass yet and you just want to confirm if you have a seat available on another flight, that’s the sort of thing we can easily handle with AI,” Crocker told the Globe and Mail."

"Experts told the Vancouver Sun that Air Canada may have succeeded in avoiding liability in Moffatt's case if its chatbot had warned customers that the information that the chatbot provided may not be accurate."

It's ALMOST like "AI" is a crock of shit or something 🤔

@stavvers "“So in the case of a snowstorm, if you have not been issued your new boarding pass yet and you just want to confirm if you have a seat available on another flight, that’s the sort of thing we can easily handle with AI,” Crocker told the Globe and Mail."

"Experts told the Vancouver Sun that Air Canada may have succeeded in avoiding liability in Moffatt's case if its chatbot had warned customers that the information that the chatbot provided may not be accurate."

colorblind cowboy 😷✊🏻

@dave_cochran @stavvers Ah, yes, just what we want: Customer service with a possibly negating caveat.

Petr Tesarik

@Konfettispaghetti @wendinoakland @colorblindcowboy @dave_cochran @stavvers For my part, I choose not to fly with this company. If all airlines become like that, I'll choose not to fly at all.

KatLS

@stavvers so nice to see that chatbots aren’t legal entities. 😬 At least in Canada.

John Mastodon

@stavvers ABSOLUTELY DAVE, I'M PRETTY SURE I CAN DO THAT!

The next generation of HALs won't be nearly so positive!

m3t00

@stavvers@masto.ai

AI does the work of 100 office dwellers, CEOs included. Its chatbot likely gave them a BOFH excuse.

Claire

@stavvers something something management decisions

Mx. Eddie R

@stavvers
Holy shit:

> the airline should not be liable for the chatbot's misleading information because, Air Canada essentially argued, "the chatbot is a separate legal entity that is responsible for its own actions,"

Can you imagine if the court had accepted that?

Joe

@silvermoon82 @stavvers But suppose it were a human being working for them that told a customer, in writing, on an official communication channel, that they could get a discount. Seems the airline would be forced to honor that.

Kydia Music

@not2b @silvermoon82 @stavvers

Absolutely. Of course then they’d fire that employee, but they won’t do that with their Chatbot, because it replaces several (hundreds?) of human customer service reps, and saves them more money in the long run than it might lose on a few airfares it has to reimburse.

Billy Smith

@silvermoon82 @stavvers

This reminds me of an in-narrative-universe legal decision about AI that was influenced by the fact that AIs were initially used by spammers. :D

freefall.purrsia.com/

It's definitely worth an archive trawl. :D

kinyutaka

@silvermoon82 @stavvers

Which is totally bullshit, because if a human were to promise wrong policy, the company would be held to it.

Fahri Reza

Something like an embedded frame indicating that the content originated from a third-party chatbot provider @silvermoon82 @stavvers

Amy (she/her/hers)

@stavvers@masto.ai you want to use AI as employees, its products are gonna be treated as employee products...

Joe

@stavvers The stupid thing is that Air Canada could have quietly given this one person the bereavement discount and then fixed the bot, and it would have been a lot cheaper and less embarrassing for them than having a court battle and an international story. It was one person, a couple of hundred dollars (and Canadian dollars at that).

ELOPe

@not2b @stavvers Yeah, they definitely missed the forest for the trees in this case. This probably means they farmed out their counsel to LawBot.

TrumpGPT

@stavvers Don't worry about human jobs; they're secure. Businesses might think hiring clueless folks and robots is the way to go, but they'll learn their lesson. Sharp critical thinking is essential, especially in customer service, but remember, top talent doesn't come cheap.

Paladin :verified: :ak:

@stavvers I work with chatbots and there is no way in hell that an LLM ends up in a live environment on my watch.

The tech is not ready for that yet.

GhostOnTheHalfShell

@stavvers Oh man, wait…

Some of these chat bots can be tricked into advising ANYTHING the bot has been trained on.

Fahri Reza

Don't they need a huge chunk of data? That's why it's called a large language model. The company's own question/answer pairs can't be enough to train the chatbot on. @GhostOnTheHalfShell @stavvers

GhostOnTheHalfShell

@dozymoe @stavvers

That doesn’t mean it’s topic-constrained, though. There was a post about a car dealership chatbot answering questions quite wide of that area.

Stanley Nerdlinger

@stavvers
Can they take it out of the chatbot's pay?

deco

@stavvers @thegibson oh. my. god. That is the best thing ever. Oh the possibilities!!!

Nikki 🦔

@stavvers I know other big companies who have got rid of their carefully curated FAQ systems, fed it all into ChatGPT, and had the bots give solutions that don't work and tell customers to click buttons and visit pages that don't exist.

Fools, the lot of them.

MickeyMaousse

@nikki @stavvers AI is the bitcoin of this decade: the trendy buzz subject that allows c-suite dwellers to dazzle each other with contentless "strategies" and projects.

voxpopsicle

@stavvers brb off to play with the chat bot til I can get it to commit to something stupid

Baley

@stavvers I love this. Extra points that they didn't use an apostrophe s

Thomas Decker

@stavvers This is the only recourse we are likely to have: we will have to sue companies every single time they don't honor whatever bullshit their chatbots produce. Because only if they cannot get away with it will they stop firing real humans.

Maggie Maybe

@thomas_decker @stavvers An Amazon chat bot told me I couldn’t cancel an order, but that as soon as I received it I could get a refund and keep the item. I told them to refund me right away, since my issue was that they weren’t delivering the item. I never received the refund; I received the item and decided to let it go because it was only $7.99. This story has motivated me to contact Amazon for my eight dollars.

Faye

@stavvers @soatok this is… exactly the failure case you get when you deploy a technology designed to “look” right rather than actually produce verifiably accurate results.

timthelion

@FayeDrake @stavvers @soatok Kind of like when companies thought it was a good idea to make their facades out of plastered EPS (expanded polystyrene board). Looks great, until someone touches it.

צבי הנינג'ה🟣he/him

@stavvers
At first glance I was sure I was looking at a @lowqualityfacts post, and I wondered what happened to the font

שחר שמש شحر شمش 🇮🇱

@stavvers the only thing I didn't understand about this story is what legal defense AC possibly thought it had.
The defense they brought wouldn't have held even if it had been a human respondent.
