ansuz / ऐरन

"Recital 12a" is interesting. It seems mostly geared towards excluding the "national security" apparatus from any measures that might be introduced by the legislation, but it also uses some pretty broad language that could include some other groups.

> Accordingly,
this Regulation should not apply to interpersonal communications services that are not
available to the general public and the use of which is instead restricted to persons
involved in the activities of a particular company, organisation, body or authority

I'm guessing this was included thanks to industry lobbying, but I can see it being useful for others if the legislation passes in its proposed form. Maybe a self-hosted group chat could be framed as some kind of organisation and get an exemption?

That probably won't work for anything that federates, though, as it's murky whether a federated service could be considered "restricted to persons..."

ansuz / ऐरन

Recital 4 is weird -

> Therefore, this Regulation should contribute to the proper functioning of the internal market
by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in
a manner that is effective and that respects the fundamental rights of all parties concerned. In
view of the fast-changing nature of the services concerned and the technologies used to
provide them, those rules should be laid down in technology-neutral and future-proof
manner, so as not to hamper innovation.

When it comes to surveillance measures on e2ee platforms/services/whatever, there just aren't many possible options that are compatible with what they want. Client-side scanning is the most prominent "solution" that doesn't involve directly tampering with encryption, but there's no conceivable way to do it that won't produce a massive number of false positives if widely deployed.
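
Concretely, the "doesn't tamper with encryption" claim amounts to scanning on the sender's device before the message is encrypted. A minimal sketch of that flow, assuming a provider-supplied list of exact SHA-256 digests (real proposals use perceptual hashes and classifiers, which is exactly where the false positives come from; all names here are hypothetical):

```python
import hashlib

# Toy blocklist of SHA-256 digests of "known" material. The digest below is
# just the hash of the empty string, standing in for a known item.
BLOCKLIST = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def client_side_scan(plaintext: bytes) -> bool:
    """Runs on the sender's device, *before* end-to-end encryption."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST


def send_message(plaintext: bytes) -> str:
    if client_side_scan(plaintext):
        return "reported"  # flagged content leaves the e2ee envelope entirely
    # ...normal end-to-end encryption and delivery would happen here...
    return "sent encrypted"


if __name__ == "__main__":
    print(send_message(b"hello"))  # sent encrypted
    print(send_message(b""))       # reported: empty input matches the sample digest
```

The encryption itself is untouched, which is the talking point; the catch is that every message now passes through a scanner the user doesn't control before it is protected.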

ansuz / ऐरन

Recital 5 shows just how broadly applicable the legislation would be

> As they
are increasingly misused for that purpose, those services should include publicly available
interpersonal communications services, such as messaging services and web-based e-mail
services, in so far as those services are publicly available. As services which enable direct
interpersonal and interactive exchange of information merely as a minor ancillary feature
that is intrinsically linked to another service, such as chat and similar functions as part of
gaming, image-sharing and video-hosting are equally at risk of misuse, they should also be
covered by this Regulation.

ansuz / ऐरन

Recital 13 -

> The term ‘online child sexual abuse’ should cover not only the dissemination of material
previously detected and confirmed as constituting child sexual abuse material (‘known’
material), but also of material not previously detected that is likely to constitute child sexual
abuse material but that has not yet been confirmed as such (‘new’ material), as well as
activities constituting the solicitation of children (‘grooming’). That is needed in order to
address not only past abuse, the re-victimisation and violation of the victims’ rights it
entails, such as those to privacy and protection of personal data, but to also address recent,
ongoing and imminent abuse, so as to prevent it as much as possible, to effectively protect
children and to increase the likelihood of rescuing victims and stopping perpetrators.

Media that has been encountered before can be detected by matching it against databases of file hashes, but this provision makes clear that they will require measures far beyond that.

The kind of detection they expect will require fuzzy approaches to text and images. Machines simply can't do this accurately, and it takes magical thinking to believe that they ever will.
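
To make the contrast concrete, here's a toy sketch (hypothetical function names and fingerprints) of the two detection styles the recital implies: exact hash lookup for "known" material, and threshold-based fuzzy matching for "new" material, where the threshold is precisely where the false positives come from.

```python
import hashlib


def exact_match(data: bytes, known_digests: set) -> bool:
    """Detection of 'known' material: byte-identical files only.

    Re-encoding, cropping or recompressing a file changes every bit of the
    digest, so this only catches copies already in the database.
    """
    return hashlib.sha256(data).hexdigest() in known_digests


def hamming(a: int, b: int) -> int:
    """Bit distance between two 64-bit perceptual fingerprints."""
    return bin(a ^ b).count("1")


def fuzzy_match(fingerprint: int, known_fps: list, threshold: int = 10) -> bool:
    """Detection of 'new' material: flag anything close to a known item.

    Raise the threshold and re-encoded copies slip through; lower it and
    unrelated images that merely look similar get flagged.
    """
    return any(hamming(fingerprint, fp) <= threshold for fp in known_fps)


if __name__ == "__main__":
    original = b"original image bytes"
    reencoded = b"original image bytes."  # one byte appended
    database = {hashlib.sha256(original).hexdigest()}
    print(exact_match(original, database))   # True
    print(exact_match(reencoded, database))  # False: exact hashing is brittle
    # Hypothetical 64-bit fingerprints of the same image before and after
    # re-encoding; perceptual hashes keep near-duplicates close together,
    # which is also why similar-looking but unrelated images can collide.
    print(fuzzy_match(0b10110010, [0b10110110], threshold=4))  # True
```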

ansuz / ऐरन

(16)

> In order to prevent and combat online child sexual abuse effectively, providers of hosting
services and providers of publicly available interpersonal communications services should
take reasonable measures to mitigate the risk of their services being misused for such abuse,
as identified through the risk assessment. Providers subject to an obligation to adopt
mitigation measures pursuant to Regulation (EU) 2022/2065 may consider to which extent mitigation
measures adopted to comply with that obligation, which may include targeted measures to
protect the rights of the child, including age verification and parental control tools, may also
serve to address the risk identified in the specific risk assessment pursuant to this
Regulation, and to which extent further targeted mitigation measures may be required to
comply with this Regulation.

"reasonable measures" assumes a lot, especially coming from agencies that believe client-side-scanning doesn't undermine e2ee.

This is the first mention of "age verification" in the document, but it comes up quite a lot afterward. Usually that line of thinking converges on requiring people to provide some government id in order to use the internet, which is terrible for a multitude of reasons...

ansuz / ऐरन

Thomas Lohninger of epicenter.works gave a talk at the recent chaos communication camp about what the EU is doing in this area:

media.ccc.de/v/camp2023-57548-

He notes that the EU is working on some methods to do age verification without disclosing government id explicitly - some involving zero-knowledge proofs - but he makes a few critical points:

1. the "digital wallet" system they are proposing is going to be a very high-value target with a fairly large attack surface

2. government-issued hardware wallets are not accessible to undocumented migrants

3. not everyone has a phone they can use for this purpose

(16a) says the age verification should be "non-discriminatory and accessible", but I don't see how that's possible given the points above without falling back to scans of government id.
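
For context on the "without disclosing government id explicitly" part: the wallet designs under discussion lean on selective disclosure, where an issuer signs salted commitments to each attribute and the holder reveals only the single claim a service asks for. A stdlib-only toy of that idea (roughly the SD-JWT approach, not a real zero-knowledge proof; the issuer's signature is elided and all names are made up):

```python
import hashlib
import json
import secrets


def commit(claim_name: str, value, salt: bytes) -> str:
    """Salted hash commitment to one claim."""
    payload = json.dumps([salt.hex(), claim_name, value]).encode()
    return hashlib.sha256(payload).hexdigest()


# Issuer: attest several claims but publish only their commitments
# (a real issuer would sign the `credential` list).
claims = {"name": "Alice Example", "birth_date": "1990-01-01", "age_over_18": True}
salts = {k: secrets.token_bytes(16) for k in claims}
credential = sorted(commit(k, v, salts[k]) for k, v in claims.items())

# Holder: reveal only the age claim (plus its salt) to an online service.
disclosure = {"claim": "age_over_18", "value": True, "salt": salts["age_over_18"].hex()}

# Verifier: recompute the commitment and check it was attested by the issuer.
recomputed = commit(disclosure["claim"], disclosure["value"], bytes.fromhex(disclosure["salt"]))
assert recomputed in credential
print("verified: age_over_18, and nothing else was disclosed")
```

Even in this best case the service learns only the single claim, but the scheme still presumes a signed credential, an issuing authority and a device to hold it, which is exactly where the three points above bite.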

ansuz / ऐरन

I'll keep reading so that I have an idea of what's to come if this passes, but I'm only on page 12 of 199 and there's no sign that things will get any better, so I'm not going to livetoot the whole read-through.

I'll just reiterate that the #chatControl legislation is an absolute shitshow.

The people pushing for this are either ignorant or malicious. They claim to want to protect the public, but their manner of doing so will very obviously harm large groups that are already marginalised.

ansuz / ऐरन

My last point for now is that while this particular legislation concerns the EU, it is part of an international effort to expand surveillance.

There are similar efforts in a number of other countries. Even if you aren't a citizen or resident of one of those nations, you are most likely a user of at least one service that will be affected.

This legislation absolutely needs to get shut down now, because every such measure that passes makes it easier to enact similar ones in other nations.

drathir

@ansuz Finally, someone thinking logically about putting a stop to tying everything to mobile phones (whether that's a phone number or apps demanding absurdly sensitive permissions).
