Johannes Ernst

So possession of child pornography is illegal in the US even if your goal is to write code that detects it.

How does anybody solve this problem?

(Reading through the notes of today's SWICG call on the subject: github.com/swicg/meetings/tree)

Johannes Ernst

@williampietri and how are you supposed to test your hash algorithm?

William Pietri

@J12t I've never tried it, but typically there are dummy/safe values for testing. That said, I think a common approach for CSAM is just to feed all images to an API provider who gives you back a verdict, in which case you test differently.
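
(For illustration, the "dummy/safe values" idea might look like the sketch below: Python, using the open-source imagehash library as a stand-in for a proprietary PhotoDNA-style hash. The blocklist and fixtures are hypothetical, built only from harmless synthetic images.)

```python
# Sketch: exercise the hash-matching pipeline against safe synthetic fixtures.
# imagehash/PIL are stand-ins; production systems use vendor hashes and
# vendor-supplied test values, never real material.
import random

import imagehash
from PIL import Image

def make_fixture(seed: int) -> Image.Image:
    """Deterministic noise image standing in for a 'known' picture."""
    rng = random.Random(seed)
    img = Image.new("L", (32, 32))
    img.putdata([rng.randrange(256) for _ in range(32 * 32)])
    return img

# Hypothetical blocklist built entirely from safe fixtures.
blocklist = {imagehash.phash(make_fixture(seed)) for seed in (1, 2, 3)}

def is_flagged(image: Image.Image, max_distance: int = 4) -> bool:
    """True if the image's perceptual hash is near any blocklisted hash."""
    h = imagehash.phash(image)
    return any(h - blocked <= max_distance for blocked in blocklist)

# The matching logic gets tested with no sensitive content involved:
assert is_flagged(make_fixture(1))       # a blocklisted fixture matches
assert not is_flagged(make_fixture(99))  # an unrelated image does not
```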

Johannes Ernst

@williampietri But what does the API provider do? Relocate outside the US?

William Pietri

@J12t Are you planning to go into the business? Or is this more general curiosity?

Johannes Ernst

@williampietri See the link in the original post for context. The Fediverse has work to do. I'm not a server operator, but we do write software for it, so it does affect us at some point.

William Pietri

@J12t Sure. But are you planning to actually build a custom CSAM detector from scratch? That strikes me as a wildly bad idea. It's such a fraught problem that I'd suggest any server operator just use an existing service.
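
(For concreteness, the "existing service" route usually reduces to something like the sketch below. The endpoint, auth scheme, and response field are entirely hypothetical; real providers such as Microsoft's PhotoDNA service or Thorn's Safer have their own contracts and vetting requirements. Testing then means mocking this HTTP call, which is the "you test differently" point above.)

```python
# Hypothetical client for a CSAM-scanning service. Nothing here matches any
# real provider's API; the URL and field names are invented for illustration.
import requests

SCAN_URL = "https://scanner.example.com/v1/scan"  # hypothetical endpoint

def scan_image(image_bytes: bytes, api_key: str) -> bool:
    """Upload one image and return the provider's flagged/not-flagged verdict."""
    resp = requests.post(
        SCAN_URL,
        headers={"Authorization": f"Bearer {api_key}"},  # hypothetical auth
        files={"image": ("upload.jpg", image_bytes, "image/jpeg")},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["flagged"]  # hypothetical response field
```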

toxtethogrady

@J12t It gets even weirder than that. Child pornography is illegal, but states are moving to make child exploitation legal by lowering the minimum age at which children can work in dangerous jobs. If we called getting burned by a McDonald's deep fat fryer child sexual abuse, maybe states would feel differently about it...
