
Chatcontrol is getting worse

Introduction

Thanks to a report by EDRi, I've recently become aware that various changes have been made to the chatcontrol proposal. As of writing, the latest version of the proposal can be read here.

The original proposal is already pretty bad, but the changes by the Council of the European Union make it much, much worse. I'll assume you have some familiarity with the original proposal. If you don't, my introduction post about chatcontrol might help a bit, though that post focuses on detection/scanning, while the changes discussed here go beyond that.

Errata

Update one day after original post: In the original version of this blogpost, I noted that no court order was required for any type of order. This was inaccurate: for detection orders, the requirement of obtaining approval from a judicial or independent administrative authority is still in place in the current draft (second-to-last sentence of Article 7, page 12). For the other types of orders, the original post remains accurate. For transparency, the incorrect information has been left in, marked with strikethrough and linked to this section.

The changes

Removal of safeguards

For one thing, there is basically no judicial oversight anymore, as almost all safeguards have been removed.

The different orders (detection/removal/blocking/delisting) can be issued by any competent authority, which means any authority designated by a member state. In the original proposal by the Commission, competent authorities were required to be independent and impartial. In the updated proposal, this is no longer the case (see Article 26, page 42). This means that the police could be designated as a competent authority and demand detection/blocking from providers in the EU, without any court review.

Cross-border orders

I don't think I mentioned in previous blog posts that orders can be submitted across borders, so I'll do it here even though it's not new. Cross-border orders were already planned in the original proposal, though of course without judicial oversight they're a lot worse. Just imagine police in some less LGBT-friendly EU country forcing the provider of an LGBT dating app to scan messages, no court order required. Surely this will go well...

Orders may be challenged in the courts of the country issuing the detection order (Article 9 paragraph 1, page 17), but of course going to court in a foreign country is not exactly simple.

A small change is that only "the most important elements necessary for the execution of the order" need to be in "any of the official languages declared by the provider". So if you're a provider who doesn't wish to follow such orders, you'll most definitely need a translator. This applies to all orders: detection orders (page 12), removal orders (page 24), blocking orders (page 31), delisting orders (page 35).

An odd change: there is one minor safeguard for cross-border removal orders. Service providers can ask the Coordinating Authority of the member state to review the removal order, and if the authority finds that the order infringes the requirements, it ceases to have legal effect (Article 14a, paragraph 4, page 26). However

  1. the service provider may only request review within 48 hours (weekends? what's that?)
  2. the Coordinating Authority must check the order within 72 hours (in fact, the Coordinating Authority should always review cross-border orders), but...
  3. the Coordinating Authority is basically just a better competent authority responsible for coordination between countries - which means that the safeguards removed for competent authorities are also removed for the Coordinating Authority, so it is not required to be independent/impartial anymore either
  4. for some reason this safeguard only exists for removal orders (why?!?)

Blocking Orders

Blocking orders can be issued to ISPs (Article 16, page 28). All versions of the proposal (including updated ones) speak of URLs that ISPs should block, which is technically impossible for ISPs to do because internet traffic is encrypted: with HTTPS, an ISP can at best see which domain is being contacted, never the full URL. How ISPs are supposed to comply with these blocking orders thus remains unclear.

ISPs can be forced by a blocking order to take "reasonable measures" against "known child sexual abuse material" (strikethrough as in source). This means that in the updated version they must take "reasonable measures" against not only known material, but somehow also against unknown material. Given the encryption of traffic data and the state of detection technology, I'm not sure anything about this is feasible, let alone reasonable. But what is "reasonable" is not defined, and therefore conveniently left open to interpretation by the competent authorities.
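
To illustrate the technical problem, here is a minimal Python sketch (hostname and path are placeholders, not from the proposal) of what an ISP can actually observe when a user fetches an HTTPS URL: the destination IP address and port, and typically the hostname from the TLS SNI field, but never the URL path.

```python
import socket
import ssl

# Placeholder values for illustration only.
HOST = "example.com"          # visible to the ISP: DNS lookup + TLS SNI field
PATH = "/some/specific/page"  # encrypted: never appears on the wire

# The TCP connection itself reveals only the server's IP address and port.
raw = socket.create_connection((HOST, 443))

# During the TLS handshake the hostname is (usually) sent in cleartext as
# SNI, so an ISP can block whole domains -- but nothing finer-grained.
ctx = ssl.create_default_context()
tls = ctx.wrap_socket(raw, server_hostname=HOST)

# From here on everything is encrypted, including the request line that
# contains the URL path. A blocking order naming one URL on this host is
# indistinguishable (to the ISP) from any other request to the same host.
request = f"GET {PATH} HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
tls.sendall(request.encode())
print(tls.recv(200).decode(errors="replace"))
tls.close()
```

The practical consequence: the most granular option left to an ISP is blocking the entire domain (or IP address), which hits every page hosted there.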

Not only that, but basic checks to determine whether the URLs are correct (Article 16, paragraph 2) or whether the order ensures a fair balance of fundamental rights (paragraph 4 (d)) have been removed. Another removed safeguard is the maximum duration of 5 years for blocking orders, which can now be unlimited (page 30).

To be fair, the removal of these checks somewhat makes sense: Given that the authorities issuing these orders are no longer required to be independent/impartial, these checks probably wouldn't have been performed very well anyway.

EDRi has written in detail about blocking orders here and sent a briefing note to the council, concerned by how terrible these changes are.

Provider obligations

There have been two minor changes to the obligations of services/providers:

When submitting a report, hosting services & providers of interpersonal communication must include IP addresses & port numbers (Article 13, paragraph 1 (f), page 22). This means that not only will software have to record & permanently store this data (port numbers are not usually logged), the proposal also conflicts with the GDPR, which treats IP addresses as personal data because they are online identifiers. This will once again force services & providers to track their users, even if they don't want to.
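
As a rough sketch of what this obligation would mean in practice, here is the kind of record a provider would have to retain per report. The field names and structure are hypothetical; the proposal only mandates the data points, not a format.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical record shape for the connection metadata required by
# Article 13, paragraph 1 (f). The proposal mandates the data points,
# not this structure.
@dataclass
class ReportConnectionData:
    ip_address: str    # personal data under the GDPR (an "online identifier")
    source_port: int   # rarely logged today; needs new logging/retention code
    timestamp: datetime

record = ReportConnectionData(
    ip_address="203.0.113.7",  # documentation-only address (RFC 5737)
    source_port=51423,
    timestamp=datetime.now(timezone.utc),
)
print(record)
```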

The second change moves the requirement for human intervention in the case of unreliable detection from Article 10, paragraph 4 (c) (page 19), to a recital. This means that member states don't have to implement this part of the proposal if they don't want to.

However, part of the paragraph still remains ("ensure regular human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner"). Given that "sufficiently reliable" is not defined here, I suppose that you could still require providers to implement human review, but some clarification would really be nice here. Moving the requirement for human intervention to a recital definitely sends the wrong signal.

Introducing delisting orders

The council added a new type of order to the proposal: delisting orders. Any competent authority has the power to send these to online search engine providers under the jurisdiction of the member state, forcing the search engines to delist a website.

There are only two conditions (Article 18a, paragraph 3, page 34) that the competent authority must consider met before issuing a delisting order:

  1. the delisting is necessary to prevent the dissemination of the child sexual abuse material to users in the Union [...]
  2. the website indicates, in a sufficiently reliable manner, child sexual abuse material

The first condition doesn't seem to provide sufficient safeguards (could an authority delist the Tor project website for enabling access to the Tor darknet?), though the second helps somewhat. The order should also be reviewed by the Coordinating Authority of the member state.

Still, consider the scenario where a user posts child abuse material to a website (e.g. Facebook) and this goes undetected by the website's automatic filters (this will happen; there's no way to catch everything). Any competent authority could then order search engines to delist the entire website, just for that one post.

It is important to note that there is no redress mechanism for the owners of delisted websites, so in case your website has been delisted by mistake... good luck.

The only good thing about this is that orders may only be issued to search engines within the jurisdiction of the member state (Article 18a, paragraph 1, page 33).

Other changes

Likely future changes

It is rather obvious upon reading that large parts of the proposal have been copy-pasted and slightly modified, which makes it hard to keep track of the small differences between the different orders. I expect the proposal to get worse in the future, with the few safeguards that are still in place for some orders being removed in the same way as they were for the other types of orders.

Update one day after original post: As an example, it seems likely to me that the requirement to ask a judicial/independent authority for approval for a detection order will be removed.

Conclusion

The proposal was terrible to start with: it would force providers to scan private messages (detection orders), take down content (removal orders), and have ISPs block websites (blocking orders).

The council managed to keep the bad parts from the commission and add new bad parts by

  1. removing the requirement that competent authorities be independent and impartial
  2. extending blocking orders to unknown material and removing their maximum duration
  3. requiring IP addresses and port numbers in reports
  4. weakening the requirement for human intervention by moving it to a recital
  5. introducing delisting orders

The chatcontrol proposal is consistently getting worse. It needs to be stopped. NOW.

You can keep following the development of the proposal by staying up to date with the latest search results on consilium.europa.eu and eur-lex.europa.eu (check both; they each have different documents available, with some overlap).

In case you want more information, I've written quite a few articles about chatcontrol.

Written on 2023-02-04
Last updated on 2023-02-06
Tags: politics, chatcontrol