@J12t I think the short answer is "hashing". See, e.g.:
https://www.missingkids.org/ourwork/ncmecdata
https://www.thorn.org/reporting-child-sexual-abuse-content-shared-hash/
https://www.thorn.org/blog/hashing-detect-child-sex-abuse-imagery/
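[Editor's note: the links above describe hash-based matching against lists of known material. As a rough sketch of the exact-match flavor of that idea, assuming a hypothetical local hash set — real deployments use perceptual hashes such as PhotoDNA or PDQ, obtained through vetted services, so that re-encoded near-duplicates still match:]

```python
import hashlib

# Hypothetical hash set for illustration only; in practice the hash
# lists come from services like NCMEC under strict agreements.
KNOWN_HASHES = {
    # SHA-256 of the stand-in payload b"foo"
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_hex(data: bytes) -> str:
    """Exact cryptographic hash of the raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """True if this exact byte sequence appears in the known-hash set."""
    return sha256_hex(data) in KNOWN_HASHES
```

Note the limitation this sketch makes visible: an exact cryptographic hash changes if a file is recompressed or resized even slightly, which is precisely why production systems rely on perceptual hashing and why operators are better served by existing services than a from-scratch detector.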
@J12t @williampietri See the link in the original post for context. The Fediverse has work to do. I'm not a server operator, but we do write software for it, so it does affect us at some point.

@J12t Sure. But are you planning to actually build a custom CSAM detector from scratch? That strikes me as a wildly bad idea. It's such a fraught problem that I'd suggest any server operator just use an existing service.
@williampietri And how are you supposed to test your hash algorithm?