Carl T. Bergstrom

Allowing police officers to submit LLM-written reports reveals a remarkable misunderstanding of what LLMs do, a profound indifference to the notion of integrity in the communications of law enforcement with the justice system, or both.

Given how readily subject to suggestion human witnesses—including police officers—are known to be, this is a disaster.

Yes, police reports aren't always the most accurate, but introducing an additional layer of non-accountability is bad.

apnews.com/article/ai-writes-p

Carl T. Bergstrom

It's a terrifying development.

LLMs are literally designed to generate *plausible-sounding* *bullshit*.

They have no accountability and even less allegiance to truth than crooked cops—but they will be much, much better at writing the kinds of falsehoods that will bring a conviction.

Alexey Skobkin

@ct_bergstrom
I'd trust a language model more than an officer who cares so little about their work that they're fine with writing fiction in their reports.

LLMs aren't the problem here. Incompetent, unmotivated, and lazy people are. Or do you think their reports would get better without LLMs?
