There’s a lot of talk about the new Chat Control regulation here, with many scary statements but few details, so I decided to check what’s actually in it. The best way to learn about it is to just read it:

https://data.consilium.europa.eu/doc/document/ST-15318-2025-INIT/en/pdf

It’s long but not hard to understand. If you don’t have the time, here’s a short summary for you.

First, two points:

  1. There’s no TLDR. It’s a complex piece of legislation and it can’t be summarized in two sentences.
  2. I will only say what’s in the law. I don’t care about how you think corporations will break it, or about conspiracy theories about what’s really behind it. Feel free to post them, but I will just ignore them.

So, to the point. The goal of the law is to prevent and combat child sexual abuse, and it applies to hosting providers and publicly available communication services. At the core of the regulation is risk assessment. Each service will have to assess the risk that it will be used to distribute CSAM or to groom children. The risk is based on things like (a toy scoring sketch follows the list):

  • are there known cases of the service being used for such things
  • is there a mechanism to report CSAM by the users
  • does it have parental control features
  • do kids use it
  • can you search for users and identify underage users
  • can you send photos and videos using private chats
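
The regulation doesn’t prescribe any scoring formula, but purely as an illustration, here is a minimal sketch of how a provider might turn that checklist into an internal risk score. Every factor name and weight below is my own invention, not something from the text:

```python
# Toy checklist, not the regulation's method: factor names and weights
# are invented for illustration only.
RISK_FACTORS: dict[str, int] = {
    "known_abuse_cases": 3,          # service already used for CSAM/grooming
    "no_user_reporting": 2,          # users cannot flag CSAM
    "no_parental_controls": 1,
    "used_by_minors": 2,
    "searchable_minor_profiles": 2,  # strangers can find/identify children
    "private_media_sharing": 1,      # photos/videos in private chats
}

def assess_risk(service: dict[str, bool]) -> int:
    """Sum the weights of every factor that applies to the service."""
    return sum(w for name, w in RISK_FACTORS.items() if service.get(name))

# A locked-down messenger scores low despite allowing media sharing.
messenger = {"private_media_sharing": True, "known_abuse_cases": True}
print(assess_risk(messenger))  # -> 4
```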

Once the risk assessment is done providers will have to address the identified risks. They can do this by implementing things like:

  • moderation (having proper tools and staffing)
  • letting users report child abuse
  • letting users control what personal information is visible to other users and how other users can contact them
  • voluntarily doing things covered by Chat Control 1.0 (i.e. client-side message scanning)

If the provider identifies that the service can be used for grooming, it should implement age verification. The regulation says that age verification “shall be privacy preserving, respecting the principles relating to the processing of personal data, notably the principles of lawfulness, purpose limitation and data minimisation”.
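
As a hypothetical sketch of what “data minimisation” could mean in practice (the regulation doesn’t mandate any particular design): the service inspects an age attestation from some external verifier, derives the single fact it needs, and persists nothing else:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AgeAttestation:
    # Hypothetical token from an external verifier; seen only transiently.
    birth_date: date

def is_adult(att: AgeAttestation, today: date | None = None) -> bool:
    """Derive the only fact the service needs: 18 or over."""
    today = today or date.today()
    years = today.year - att.birth_date.year - (
        (today.month, today.day) < (att.birth_date.month, att.birth_date.day)
    )
    return years >= 18

# Persist only the boolean; the attestation itself is discarded.
user_record = {"id": "u123", "adult": is_adult(AgeAttestation(date(2001, 5, 4)))}
```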

Regarding E2EE, it clearly states that “the regulation shall not prohibit, make impossible, weaken, circumvent or otherwise undermine cybersecurity measures, in particular encryption, including end-to-end encryption”.

So, does it allow all services to require an ID and store all your personal data? No. Can messengers break E2EE and scan all your messages? Also no.

What can happen is that some services will verify your age and store your date of birth. Anything beyond that will still be illegal and protected by GDPR. Providers can keep doing whatever they have been doing under Chat Control 1.0 (which has applied since 2021), but E2EE is still protected.

Knowing all that, let’s think about how it will apply to some example services. This is just my best guess, but I think these are reasonable assumptions:

Signal: it does not let you search for other users, you don’t share any personal information with anyone, and strangers can’t message you. There’s very low risk that it will be used for grooming, so it doesn’t have to do any age verification. It allows you to share videos, has E2EE, and I believe there were known cases of it being used to share CSAM. Based on that, it COULD scan media client-side as allowed by Chat Control 1.0, but that’s voluntary. It will have to implement tools to report it.

Roblox: users of different ages interacting with each other, known cases of grooming on the platform: it should implement age verification.

Pornhub: low risk of grooming: no age verification. High risk of distributing CSAM: moderation and reporting.

Lemmy: my guess would be that it’s not used by many kids, instances don’t have communities targeting children, and it doesn’t have tools that let you easily find kids on the platform: low risk of grooming, so no age verification necessary. It can be used to publish CSAM and that has happened in the past: it should have proper moderation and reporting functionality, and it can scan media and compare it against known CSAM hashes (sketched below).
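
For what that hash comparison might look like, here is a toy sketch. Real deployments use perceptual hashes (e.g. PhotoDNA) that survive resizing and re-encoding; plain SHA-256 is used here only to keep the example self-contained, and the hash list itself would come from a trusted clearinghouse rather than the instance admin:

```python
import hashlib

# Populated from an external hash list in a real system; empty here.
KNOWN_HASHES: set[str] = set()

def matches_known_material(media: bytes) -> bool:
    """True only if the upload is byte-identical to a listed item."""
    return hashlib.sha256(media).hexdigest() in KNOWN_HASHES

def handle_upload(media: bytes) -> str:
    # Block matches; accept everything else unmodified.
    if matches_known_material(media):
        return "blocked: matches known-material hash list"
    return "accepted"

print(handle_upload(b"example image bytes"))  # -> accepted
```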

That’s pretty much it. This is what Chat Control 2.0 is all about.

  • daed@sh.itjust.works · 3 days ago

    Thanks for going through the legislation and confirming the points. I’m skeptical about new chat control legislation going forward, though. It’s the start of the boiling frog. The first few are more anti-hype than anything, but once everyone understands chat control ain’t bad, that’s when they can put out laws which really discomfort chat app users, because people won’t check up on new laws as often since they’ve been good before. Like doing a Trump for conservatives.

    • tomiant@piefed.social · 3 days ago

      Everyone called everyone a conspiracy theorist because there was no hard evidence that the US was spying on its own citizens.

  • emotional_soup_88@programming.dev · 3 days ago

    While I do - genuinely - appreciate the effort and the post, I disagree with its general implications. I may be too much of a sad libertarian, but I believe that education is the way to go, directed at both kids at risk and at the general population who are all potential perpetrators. Control is not the way to go. Thanks again for taking your time!

  • doortodeath@lemmy.world · 3 days ago

    Now reread the risk assessment with good old postal letters in mind and ask yourself how dystopian this law actually is. In my opinion this is not about safety, it is about control.

  • tomiant@piefed.social · 3 days ago

    This is so fucking bleak.

    If you know anything about anything, you know how this will affect rule of democracy.

  • sp3ctre@feddit.org · 3 days ago

    I fear that politicians will read it like “We only need one (or maybe 10) reported case(s), where someone groomed/shared CSAM over Threema and we’re good to go”. Am I wrong?