Thoughts on the Durov arrest

Today we learned that Pavel Durov, founder of the popular messaging app Telegram, was arrested as his private jet landed in France.

Early indications are that the arrest stems from Telegram’s alleged noncompliance with French requests for content moderation and data disclosure.

What does this mean for you, my readers (predominantly American tech people)?

A bit of legal background is called for. Most social media companies of global significance that are not Chinese are headquartered in the United States. This is no accident; the United States (wisely) undertook policy moves in the late 1990s to minimize liability for the operators of online services, most notably the enactment of Section 230 of the Communications Decency Act, which (essentially) says that operators of social media websites are not liable for the torts or crimes of their users.

There are, of course, some very narrow exceptions to this rule. For example, illegal pornography is subject to a mandatory takedown-and-reporting regime under 18 U.S. Code § 2258A. I hasten to add that complying with this law is table stakes for a user-generated content business; compliance tools like PhotoDNA are widely available and free to use, and I would be shocked if Telegram didn’t comply. There’s also FOSTA-SESTA, which prohibits operators of online platforms from running services which knowingly support sex trafficking or prostitution with the intent to facilitate the same (see: United States v. Lacey et al. (Backpage); 47 U.S. Code § 230(e)(5); 18 U.S. Code § 2421A). This law is a major compliance concern for any dating app, and is a reason why sites like Craigslist, which once had “personals” listings that were non-core to the rest of their offering, got rid of those dating-specific features. One can still post a prostitution ad in the used boat listings, but Craigslist can’t be said to have intended to facilitate that behavior.
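Mechanically, that compliance regime boils down to hashing every uploaded image and checking the hash against a list of hashes of known illegal material (PhotoDNA uses perceptual hashes, so re-encoded copies still match; the exact-match variant shown below is the crudest form of the idea). A minimal sketch, with a placeholder hash list — the hex value and helper name are illustrative assumptions, not any real distribution format:

```python
import hashlib

# Hypothetical hash list of known illegal images. In practice such lists
# are distributed to platforms by clearinghouses; this value is a placeholder.
KNOWN_BAD_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches the hash list and must be
    taken down and reported (per 18 U.S. Code § 2258A in the U.S.)."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# An upload that is not on the list passes screening.
screen_upload(b"holiday photo")
```

The point of the sketch is how cheap the check is: one hash computation and one set lookup per upload, which is why courts and regulators treat this kind of screening as table stakes.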

Other than that, though, social media website operators are generally not liable for the torts or crimes of their users. Nor are they liable under aider/abettor theories if they just passively host the content. (See: Twitter v. Taamneh, 598 U.S. _ (2023) – civil liability for aiding and abetting, at least on this side of the pond, requires “knowing and substantial assistance,” and federal criminal liability – as state criminal law is disapplied by Section 230 – requires specific intent to assist in the commission of a crime).

This means that if, for example, I use Facebook to organize drug deals, Facebook (a) is under no obligation to scan its services for unlawful use and (b) is under no obligation to restrain that use. Facebook will generally be immune civilly from my misuse unless it “materially contributes” to, i.e. specifically encourages, that unlawful use (see e.g. Force v. Facebook, 934 F.3d 53 (2d Cir. 2019), where Facebook was held not civilly liable under JASTA to victims of Hamas, which used Facebook to disseminate propaganda online; see also Taamneh, supra), and will not be liable criminally (a) under state criminal law, by operation of Section 230, or (b) under federal criminal law, to the extent that Facebook does not willfully and knowingly aid, abet, counsel, or procure the commission of the offense with specific intent, per 18 U.S. Code § 2.

Most countries do not have such a permissive regime, and France is among them. In 2020, for example, France enacted the Loi lutte contre la haine sur Internet (law against hate speech on the Internet), under which global Internet companies can be fined $1.4 million per instance, and up to 4% of their total worldwide revenue, for failing to remove “hate speech” (which in the United States constitutes protected speech) from their websites. Similarly, Germany has the Netzwerkdurchsetzungsgesetz or “Network Enforcement Act” (sometimes referred to as the “Facebook-Gesetz” but more commonly by its acronym, the NetzDG), under which politically inflammatory content must come down or the government has the power to impose fines north of EUR 50 million.

Not being a French lawyer, I find it difficult to figure out exactly which legislative provisions are being invoked here; the charging documents or the warrant will tell us more when they’re published. I’m pretty sure we’re not looking at fine proceedings against Telegram Messenger, Inc. under, e.g., the hate speech law or the EU DSA, because if we were, Durov would not have been dragged off a plane in handcuffs. TF1 Info, the French media outlet which broke the story, suggests that the charges might be something along the lines of an aiding and abetting offense, or possibly conspiracy:

[The Ministry of] Justice considers that the lack of moderation, cooperation with law enforcement and the tools offered by Telegram (disposable number, cryptocurrencies, etc.) make it an accomplice in drug trafficking… and fraud. 

More will be revealed when the arrest warrant is made public. If, for example, it turns out that Durov actively assisted criminal users of the platform (say, a drug dealer wrote to the support channel stating, “I would like to sell drugs on your platform. How do I do this?” and Durov replied with assistance), then his goose would be just as cooked in America as it would be in France.

If, however, the French are simply saying that Durov’s failure to police his users or respond promptly to French document requests is the crime (which I suspect is the case), then this represents a dramatic escalation in the online censorship wars. What it means is that European states are going to try to extraterritorially dictate to foreign companies what content those companies can and cannot host on foreign-based webservers.

If correct, this would represent a major departure from the U.S.-law-centric approach that has generally governed the global compliance strategies of most non-Chinese social media companies, including any which offer greater or lesser degrees of encryption on their services (Telegram’s “Secret Chats” feature, WhatsApp, and Signal among them). In brief, platforms assumed that if they didn’t specifically intend their services to be put to criminal use, they were unlikely to find themselves on the receiving end of criminal charges. Apparently, that’s no longer true.

Telegram is not the only company in the world which operates a social media platform used for unlawful purposes. Facebook’s popular encrypted messaging app WhatsApp has famously been used for years by the Taliban, the erstwhile non-state terror organization in, and now rulers of, Afghanistan. This fact was widely known to NATO generals and reported in the press during the Afghan war, and was reported again in the New York Times as recently as last year:

About a month after Mr. Inqayad, the security officer, was unable to reach his commanders during the night operation, he begrudgingly bought a new SIM card, opened a new WhatsApp account and began the process of recovering lost phone numbers and rejoining WhatsApp groups.

Sitting at his police post, a refurbished shipping container with a hand-held radio, Mr. Inqayad pulled out his phone and began scrolling through his new account. He pointed out all of the groups he is a part of: one for all of the police in his district, another for the former fighters loyal to a single commander, a third he uses to communicate with his superiors at headquarters. In all, he says, he is a part of around 80 WhatsApp groups — more than a dozen of which are used for official government purposes.

Of course, the Taliban is now Afghanistan’s entire government – at all levels – and Afghanistan is an enemy of the United States, Facebook’s home country. If Facebook were serious about keeping guys like this off their services, the most effective way to do so wouldn’t be by playing whack-a-mole with individual government employees, as Facebook does, but rather by banning the entirety of Afghanistan’s IP range and all Afghan phone numbers, and disabling app downloads in-country, which Facebook does not. Facebook chooses the ineffective measures rather than the effective ones.
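The “effective” measures described above are technically trivial to implement, which is the point. A hedged sketch, assuming a phone-prefix check (+93 is Afghanistan’s country calling code) and a stubbed geolocation lookup — a real deployment would query a GeoIP database, and the IP prefix below is a placeholder, not a real Afghan allocation:

```python
AFGHAN_DIAL_PREFIX = "+93"  # Afghanistan's country calling code

# Stub: a real deployment would use a GeoIP database rather than a
# hardcoded prefix list; this value is an illustrative placeholder.
AFGHAN_IP_PREFIXES = ("103.132.98.",)

def is_blocked(phone_number: str, ip_address: str) -> bool:
    """Country-level block: reject signups from Afghan phone numbers
    or Afghan-geolocated IP addresses."""
    if phone_number.startswith(AFGHAN_DIAL_PREFIX):
        return True
    return ip_address.startswith(AFGHAN_IP_PREFIXES)

is_blocked("+93700000000", "8.8.8.8")   # Afghan phone number: blocked
is_blocked("+15551234567", "8.8.8.8")   # U.S. number, non-Afghan IP: allowed
```

Two string-prefix checks at signup would do more than any amount of after-the-fact account whack-a-mole, which is why the choice not to do it is a choice.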

Yet, Facebook CEO Mark Zuckerberg lives comfortably in an estate in Hawaii, rather than in exile, and presumably doesn’t have a warrant out for his arrest in any country, whereas Durov obviously did. I grant it is possible (even probable, given that Telegram runs on a skeleton crew of 15 engineers and approximately 100 staff worldwide) that Facebook is more responsive to French judicial requests than Telegram. However, when you’re running a globally accessible encrypted platform, it is inevitable – repeat, inevitable, as in an absolute certainty – that criminal activity will take place that is beyond your view or your ability to moderate.

If Telegram stands accused of violating French law because of its failure to moderate, as media reports indicate, then an app like Signal – which demonstrably cannot respond to law enforcement requests seeking content data and has similar features to Telegram – is guilty, too, and no U.S. social company that offers end-to-end encryption (or its senior leadership) is safe. Do we really think Meredith Whittaker should wind up in prison if she decides to go to France?


Many questions remain. For now, this is not looking good for the future of interactive web services based in Europe. American tech entrepreneurs who run their services in accordance with American values – free speech and privacy through strong encryption, in particular – should not visit Europe, should not hire in Europe, and should not host infrastructure in Europe until this situation is resolved.

UPDATE, 26 AUGUST 2024

Basically my hunch was correct:

There’s a laundry list of crimes there. Most of them relate to the French crime of complicité, which roughly equates to American aider/abettor liability: knowingly facilitating, helping, or assisting the preparation of a criminal offense; procuring an offense (through a gift, promise, threat, order, or abuse of authority or power); or giving instructions to commit an offense (see Articles 121-6 and 121-7 of the French Code pénal).

What’s important here is that in the U.S., aider/abettor liability requires a specific intent to bring about the criminal result – that is, the commission of the crime is the defendant’s object. U.S. social media companies simply failing to police their users doesn’t rise to this level, which is why U.S. social media company CEOs don’t, as a general rule, get arrested for the crimes of their users by the American government. The CSAM allegations, in particular, would only rise to the level of a crime in the USA if Durov failed to comply with the notice-and-reporting regime the U.S. has for such content. The simple existence of the criminal content absent any notice doesn’t give rise to criminal liability.

Here, the French government is accusing Durov of being complicit with – i.e. aiding and abetting – criminal activity, and also of unlicensed provision of “cryptological” software, encryption products being subject to prior government authorization before they may be used in France. The list of crimes he’s accused of facilitating includes what appear to be rough approximations of criminal RICO, CSAM offenses, money laundering, narcotics trafficking, hacking, and providing unlicensed cryptography.

It would make zero sense for Durov to do any of these things with specific intent. For example, intentionally engaging in narcotics trafficking is illegal virtually everywhere on Earth; the crime is punishable by death in the United Arab Emirates, where Durov is a citizen and ordinarily resides, and can attract up to a life term in the United States, which is historically very good at extraditing people.

We’ll need to wait for the evidence to come out before reaching any firm conclusions on this point. If I had to guess, in a world where every platform hosts unlawful activity to some extent, this looks like selective enforcement. I would also guess that Durov was not “aiding and abetting” as the U.S. would understand it, and that this French enforcement action is an overbroad application of French law to punish a perceived political enemy: the French security state is trying to use local doctrines in a novel way to police a foreign company whose moderation policies it (and likely each of its security cooperation partners in the EU and across the Channel in the UK) regards as too lax.

In the absence of a lot of evidence showing that Durov and Telegram specifically intended to commit these crimes or bring them about, there is no reason why similar charges could not be laid against any other provider of social media services in France whose moderation practices are anything less than perfect, in particular social media services which provide end-to-end encryption.

Summing up: for the time being, if you run a social media company or provide encrypted messaging services accessible in France, and you’re based in the United States, get out of Europe.


Responses

  1. Richard

    This is a great post, but it’s not really true that Telegram is encrypted (certainly not end-to-end encrypted, in the way that Signal is E2EE).

    For more info:
    https://x.com/moxie/status/1497001286444617746
    https://www.platformer.news/telegram-durov-arrest-france-explainer/

  2. prestonbyrne

    I concur (I am aware that most messages are stored by TG on its servers, presumably encrypted at rest but not E2EE, meaning TG can see the data with its keys). If my hunch on this is correct, though, I think the French government’s allegations are sufficiently thin gruel that they could easily be repeated against a truly E2EE service like Signal, if there were the political will to do so.

    I presume the cryptography charges relate to the not-widely-used but nonetheless encrypted features TG does offer.

  3. Anonymous

    You’re not acknowledging the elephant in the room. This is about the war between Israel and Palestine.

  4. Anonymous

    If you host CSAM, don’t set a filter on the server to detect it (very simple with PhotoDNA), distribute it, and the authorities tell you that you host and distribute it and that you should stop, but you don’t, you are complicit in such activities. It’s common sense.

  5. prestonbyrne

    The IWF was quoted in the BBC today saying that Telegram is in fact responsive to takedown requests of that type; as I say in the piece, I would be stunned if TG flat-out refused to remove it. That would be an act of monumental managerial stupidity (and if that turns out to be the case, it would amply justify law enforcement action in any Western country). The bundling of this with the RICO-like and narcotics charges gives reason for skepticism. We’ll have to wait and see what French law enforcement’s case is here before drawing firm conclusions.

  6. […] Durov’s arrest is in any event a significant development. But how big the case is – and whether it is bad news for other social media networks, encrypted communications platforms, and the idea of global free speech – depends on the details of the case, which we still do not know (lawyer Preston Byrne has a good overview of the stakes on his blog). […]

  7. Zoran

    I wonder if this isn’t part of France’s “war” on encryption (with the UK, USA, and Germany as interested parties), and Durov happened to be a suitable and very easy-to-apprehend target!?

  8. prestonbyrne

    I also wonder this

  9. […] lawyer weighs in on the implications for firms […]

  10. […] (2024-09-09): Preston Byrne (via Hacker […]
