Recital 13 -
> The term ‘online child sexual abuse’ should cover not only the dissemination of material
previously detected and confirmed as constituting child sexual abuse material (‘known’
material), but also of material not previously detected that is likely to constitute child sexual
abuse material but that has not yet been confirmed as such (‘new’ material), as well as
activities constituting the solicitation of children (‘grooming’). That is needed in order to
address not only past abuse, the re-victimisation and violation of the victims’ rights it
entails, such as those to privacy and protection of personal data, but to also address recent,
ongoing and imminent abuse, so as to prevent it as much as possible, to effectively protect
children and to increase the likelihood of rescuing victims and stopping perpetrators.
Media that has been encountered before can be detected by matching file hashes against databases of previously confirmed material, but this provision makes clear that providers will be required to go far beyond that.
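For concreteness, a minimal sketch of what "known material" detection amounts to; the digest in `KNOWN_HASHES` is a placeholder, and real deployments match against curated, externally supplied hash databases rather than a hard-coded set:

```python
import hashlib

# Placeholder database of hex digests of previously confirmed material.
# In practice providers match against curated hash lists they receive,
# not a literal set in source code.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known(path: str) -> bool:
    """Exact matching: a single changed byte yields a different digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash the file in 1 MiB chunks to avoid loading it whole.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() in KNOWN_HASHES
```

Exact matching like this is cheap and has a negligible false-positive rate, which is precisely why it cannot cover "new" material: any re-encode, crop, or recompression changes the digest.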
The kind of detection the recital expects for "new" material and grooming requires fuzzy classification of images and text. Machines simply can't do this accurately, and it takes magical thinking to believe they ever will.
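To make the difference tangible, here is a toy perceptual hash (an "average hash", the simplest relative of PhotoDNA-style robust hashes); the function names and threshold are illustrative, not taken from any proposed system:

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size, convert to greyscale, then set one bit
    per pixel depending on whether it is above the mean brightness.
    Visually similar images produce nearby bit strings."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A "match" is a Hamming distance below some chosen threshold, e.g.:
#   hamming_distance(average_hash("a.jpg"), average_hash("b.jpg")) <= 10
# The threshold is exactly where the trouble lives: no setting catches
# every re-encode of a known image without also flagging unrelated ones.
```

Every fuzzy matcher, however sophisticated, is ultimately a threshold on a similarity score, and tuning that threshold trades missed detections against false accusations of ordinary users.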
Recital 16 -
> In order to prevent and combat online child sexual abuse effectively, providers of hosting
services and providers of publicly available interpersonal communications services should
take reasonable measures to mitigate the risk of their services being misused for such abuse,
as identified through the risk assessment. Providers subject to an obligation to adopt
mitigation measures pursuant to Regulation (EU) 2022/2065 may consider to which extent mitigation
measures adopted to comply with that obligation, which may include targeted measures to
protect the rights of the child, including age verification and parental control tools, may also
serve to address the risk identified in the specific risk assessment pursuant to this
Regulation, and to which extent further targeted mitigation measures may be required to
comply with this Regulation.
"reasonable measures" assumes a lot, especially coming from agencies that believe client-side-scanning doesn't undermine e2ee.
This is the first mention of "age verification" in the document, but it comes up quite a lot afterward. Usually that line of thinking converges on requiring people to provide some government id in order to use the internet, which is terrible for a multitude of reasons...