Mark Pesce

"We used to answer the questions ourselves and we felt like we had control over it. We felt like we had agency. But one day, it was taken away from us. Instead, they bought this AI-powered program without notifying the unions, nurses, or representatives."

windowscopilot.news/2024/11/15

Possibly a Dog

@mpesce In fact, several ways to do AI wrong!

Three horrible decisions by management to replace thoughtful human labor with automated, inscrutable slop
 the #ai #enshittification of #healthcare has begun đŸ˜±

1. Replacing an effective tracking system that nurses developed to coordinate their patients' needs with their own workloads: "The upshot was, it took away our ability to advocate for patients. We couldn’t point to a score and say, ‘This patient is too sick, I need to focus on them alone,’ because the numbers didn’t help us make that case anymore. They didn’t tell us if a patient was low, medium, or high need. They just gave patients a seemingly random score that nobody understood, on a scale of one to infinity."

2. The *whole point* of writing case notes is to *verify* them as you do, to make sure the record matches reality, and to give it a once-over with an expert eye. A scribe must be a *human*, not an "Ambient Documentation" AI -- if the doctors are too burned out to do it, then *give them less work* and *hire expert humans* to assist them, and let them skim the resulting docs together.

3. Annoying AI alerts, a dystopian parody of the checklist system that real doctors invented to catch errors, improve outcomes, and train staff.

AI-everywhere management is just tightening the ratchet on everyone, taking away human agency, creativity, education, and expertise. In addition to sucking the joy out of jobs and violating patient privacy, these moves are *definitely* going to kill people.

codastory.com/stayonthestory/n

Mark Pesce

I cannot say this clearly enough: BE VERY CAREFUL WHAT YOU SHARE WITH AI CHATBOTS

[Image: screenshots of three news stories, captioned "BE CAREFUL WHAT YOU SHARE!"]

"Department of Defence staff used ChatGPT thousands of times without authorisation": Greens Senator David Shoebridge said the "horse has already bolted" when it comes to preventing AI services from being exposed to government data. (Cam Wilson, Aug 21, 2023)

"AI chatbot blamed for psychosocial workplace training gaffe at Bunbury prison": Leadership trainer Charlotte Ingham said she used Microsoft's Copilot chatbot to generate examples of psychosocial hazards employees might face at Bunbury prison, where she was delivering the course. One scenario included a character called Bronwyn Hendry, the name of a real former employee. "I walked in there thinking I had a fictional scenario," Ms Ingham said. "I had no idea [the chatbot] would use real people's names. I mean, should I have known?"

"Audit ordered after child protection worker used ChatGPT in Victorian court case": An investigation found a staffer's report referred to a doll allegedly used by a father for "sexual purposes" as an "age-appropriate toy". Victoria's child protection agency has been ordered to ban staff from using generative AI services after a worker was found to have entered significant amounts of personal information, including the name of an at-risk child, into ChatGPT.
Wulfy

@mpesce

What happened to "If you have nothing to hide, you have nothing to fear?"

We're not doing that anymore?

Mark Pesce

I must say I am utterly shocked _beyond words_ to learn this.

Just. Utterly. Shocked.

I will not be taking questions.

cbsnews.com/news/tax-cuts-rich

Michael Potter

@mpesce So, they were actually pissing on our heads and calling it rain?

tizan

@mpesce Yeah, just like Alan Greenspan was surprised that banks would not self-regulate and would slide into Ponzi schemes like the subprime mortgage debacle. Libertarian (no tax, no govt) economics never worked
 otherwise Somalia would be the greatest place on earth to live in right now.

thefathippy

@mpesce

Yes. It's *very* hard to believe, isn't it? 🙄

Mark Pesce

For those who did not receive the last memo:

THE VOID is for screaming.

THE ABYSS is for staring.

Please do not scream into THE ABYSS. And under no circumstances should you ever stare into THE VOID.

That is all.

HowToPhil

@mpesce I have done such and trust, good friend, I am changed for the better. Come with me to the precipice of each and know the echo of the abyss, the sweet vistas of the void, and the fading of your soul. Come and know the glory of nothing.

Mark Pesce

Voice assistants and AI chatbots still can’t say who won the 2020 election

With months to go before the U.S. presidential election, some popular AI chatbots and voice assistants offer potentially harmful misinformation.

*sigh*

washingtonpost.com/technology/

Mark Pesce

Um. Someone just ported Audacity.

TO THE F**KING BROWSER.

wavacity.com/

viq

@mpesce
Browser, the new operating system... Useful, but I have mixed feelings about this.

And then there's bellard.org/jslinux/ đŸ€Ż

@socketwench

Mark Pesce

From @axios

If you use the words "trans" or "transgender" in a tweet, for example, the message won't preview if you share it via direct message on Twitter.

Other terms said to be on the list are "bisexual," "gay," "lesbian" and "queer."

axios.com/2023/04/03/elon-musk

This kind of censorship, embedded in the code of the system - it's ugly.

Mark Pesce

It's strange... you're just rolling along through your morning and then you read something...

It touches you so profoundly, you just start writing about it.

Because you have no choice.

Which is how I just wrote my next column for @theregister

Mark Pesce

“I was a paid-up member of the male–female brain brigade. It took me several years and a long struggle to realize that I was just not finding the kinds of differences I expected.”

Cognitive neuroscientist Gina Rippon explains that there was never a scientific basis to the myth that male and female brains are biologically different — instead, a major reason why even young children behave differently depending on their gender is because they are “tiny social sponges”.

nature.com/articles/d41586-023

DELETED

@mpesce If this was true, you’d find radically different gender roles across different cultures.

ko kāihe ahau đŸ« :verified:

@mpesce @chartier Gotta be careful with this line of thinking. John Money tried this back in the 60s (swapping gender in a brother and sister via social means) and both kids ended up killing themselves.

csh

@mpesce They buried the important findings. That women tend to "prefer" certain professions because of cultural exclusion.

The title and blurb hide what her research is actually about.
