Will the UK-US data sharing agreement *really* not result in forced decryption of American communications?

Adapted from this tweet thread.

I will start this blog post by stating that, to be perfectly clear, there is nothing in the CLOUD Act that mandates forced decryption or could be construed to allow it. 

However, people who say that forced decryption of U.S. communications disclosed under the CLOUD Act is impossible may be wrong. If you bear with me, I’ll explain why.

Tim Cushing, writing for the inimitable online paper of record Techdirt, reports in this article, titled “No, The New Agreement To Share Data Between US And UK Law Enforcement Does Not Require Encryption Backdoors,” that

The reporting here is borderline atrocious. The article insinuates that this agreement will force Facebook and WhatsApp to turn over decrypted communications or install a backdoor. It won’t. The platforms may be compelled to turn over encrypted messages but all UK law enforcement will get is encrypted messages. The reporting here makes it appear as though social media platforms are being compelled to provide plaintext. They aren’t.

Correct. Not in the CLOUD Act, they’re not. Techdirt concludes:

What the UK government has in the works now won’t mandate backdoors, but it appears to be a way to get its foot in the (back)door with the assistance of the US government.

If we’re only looking at the CLOUD Act and any data sharing agreement entered into pursuant to the CLOUD Act, this view is absolutely, 100% correct. The CLOUD Act indeed prohibits the inclusion of provisions regarding forced decryption in any data sharing agreement. 

The issue is that America is not the only country in the world, the CLOUD Act is not the only law in the world, and any US-UK data sharing agreement entered into pursuant to the CLOUD Act, currently in draft, will not be the only law that applies to data disclosures in the UK.

The UK already has plenty of its own laws – passed in 2000, 2016, and 2018 – that will fall outside of the four corners of any data sharing agreement and which currently allow the UK to either secretly force companies to backdoor their encryption, or force individuals or companies to disclose their private keys (so-called rubber-hose cryptanalysis).

These laws are, primarily, the Regulation of Investigatory Powers Act 2000, Section 49; the Investigatory Powers Act 2016, Section 253; and Schedule 1 to the Investigatory Powers (Technical Capability) Regulations 2018.

This pre-existing legislation, plus the CLOUD Act, working in tandem, could result in US companies being compelled to provide US-based data in readable form to the UK without the UK obtaining a US warrant, even if the CLOUD Act itself is silent on decryption.

Allow me to explain.

If the US-UK data sharing agreement becomes law, UK police can ask US firms to provide the content of communications data, and the US firms will be able to provide it without worrying that they’re violating 18 U.S.C. § 2702(a)(1).

The consequence of this will be twofold. If the current reporting is wrong (as I suspect), US companies with no UK presence will be able to more or less tell the UK to pound sand when the UK asks for the content of communications on their servers, as I expand on in considerable detail here.

US companies with a UK nexus, however (practically all major web companies and SaaS providers), will have a choice: either leave the UK or obey the UK court orders they will get served with under the CLOUD Act data agreement.

Again, these are disclosure rather than forced decryption orders.
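To see why a bare disclosure order is not the same thing as forced decryption, here is a toy sketch in Python (illustration only: a one-time-pad XOR as a stand-in for real end-to-end encryption, with hypothetical variable names). A provider can comply fully with a disclosure order and still hand over nothing but ciphertext; recovering plaintext requires a separate key disclosure of the kind RIPA 2000 Section 49 contemplates.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """One-time-pad XOR: a toy stand-in for real end-to-end encryption."""
    return bytes(a ^ b for a, b in zip(data, key))

plaintext = b"meet at dawn"
device_key = secrets.token_bytes(len(plaintext))  # held only on users' devices
ciphertext = xor_bytes(plaintext, device_key)     # all the provider ever stores

# A disclosure order reaches only what the provider holds: the ciphertext.
disclosed = ciphertext

# Without the key, the disclosed bytes are unreadable; only a separate
# key disclosure (e.g. under RIPA 2000 s. 49) turns them back into plaintext.
assert xor_bytes(disclosed, device_key) == plaintext
```

The point of the sketch is just the separation of powers between the two orders: the disclosure order operates on `ciphertext`, and a wholly distinct legal instrument is needed to reach `device_key`.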

A problem arises, however, when we consider how the CLOUD Act disclosure rules might interact with pre-existing forced decryption laws in the United Kingdom. Namely, once your telecommunications service is under the UK’s jurisdiction, the UK government has a domestic power under Section 253 of the Investigatory Powers Act 2016 to promulgate regulations that would allow the UK government to, among other things, order firms to remove “electronic protection” from communications and maintain the ability to do so.


Oh, and once they serve a company with one of these notices, the company is subject to a nondisclosure obligation, so nobody will know the notice has been given. See Section 255(8) of the Investigatory Powers Act.


And sure enough, in 2018, the UK eventually adopted a statutory instrument that gave the UK government the power to impose these conditions on telecommunications providers operating or controlling all or part of their operations from within the United Kingdom:



In all probability, what the CLOUD Act data sharing agreement will do is make it nearly impossible for global tech firms that store data in the United States to refuse UK warrants on Stored Communications Act grounds if they wish to continue doing business in the UK. 

The data sharing agreement to be entered into with the UK under the CLOUD Act

  • will not, in all probability, allow UK police to forcibly pry open encrypted communications in the US; and
  • will not, in all probability, tell us anything about how the rumored data sharing agreement will interface, if at all, with existing UK forcible decryption laws or key disclosure laws which pre-date both the CLOUD Act and the data sharing agreement.

All that the meat of the CLOUD Act, in 18 U.S.C. § 2523, says on the subject of forced decryption is

the terms of the agreement shall not create any obligation that providers be capable of decrypting data or limitation that prevents providers from decrypting data[.]

This doesn’t disqualify the UK’s existing forced decryption or key disclosure regimes or prevent the UK from enacting new ones. All it says is that forced decryption can’t be part of the terms of the data sharing agreement. There is nothing in the CLOUD Act that prevents the UK from serving a technical capability notice (i.e., forced decryption) on a US firm that provides encrypted data to the UK under a CLOUD Act order. I am guessing there will be nothing in the data sharing agreement either. Which means that the UK will probably remain free to serve technical capability notices on companies upon which it also serves CLOUD Act orders. 

Much, of course, will depend on the final agreement. What it will say is anybody’s guess, but I am not hopeful that it will be a particularly libertarian document, and my hunch is that the US won’t be keen to draw attention to the UK’s forced decryption laws by mentioning them in the data sharing agreement.

Which, of course, is the point. To the extent the data sharing agreement is silent on forced decryption, that, my friends, is your back door. Even if the CLOUD Act agreement doesn’t mandate the forced decryption of data, there are plenty of existing UK statutes that do. This means that the CLOUD Act could still result in forced decryption of data obtained from (and possibly about) US citizens and US companies on US servers by UK police, in secret, without anyone in America knowing about it or having any constitutional recourse.

Forced decryption that could probably not happen – or if it did, it would happen far less frequently – if the U.S. declined to enter into this executive agreement and (ideally) repealed the CLOUD Act.

Thoughts welcome on Twitter or in the comments.

A marmot.

Not Legal Advice, 9/30/19 – Libra a Security? And Trump Administration reportedly planning to eviscerate Fourth Amendment

Welcome back to Not Legal Advice!

Two news items worth checking out this week. I wrote a 3,000-word write-up for Item 2 yesterday, so you can just navigate to that post rather than have me repeat it here (link below).

1) Libra is a security?

Regulatory headwinds for Facebook’s new offering from the House Committee on Financial Services:

The Libra Investment Token could amount to a security since it is intended to be sold to investors to fund startup costs and would provide them with dividends. The Libra [stablecoin] token itself may also be a security, but Facebook does not intend to pay dividends and it is unclear if investors would have a “reasonable expectation of profits.” However, the offer of Libra [stablecoins] could be integrated into the offering of the Libra Investment Token, thereby deeming both securities. Like ETFs, Libra would be redeemable by certain authorized resellers and bought and sold in the open market.

2) CLOUD Act Shenanigans in England

The Times and Bloomberg are reporting that the UK and the US are planning to enter into an Executive Agreement on data sharing between the two jurisdictions.

The consequence of entry into this agreement, as reported, would be to eviscerate the Fourth Amendment by forcing certain US citizens to obey UK court orders, despite the fact that UK due process protections are weaker than America’s, and UK courts are not obliged to obey the Constitution.

The reporting may be wrong, or it might not be. The US press needs to dig into this issue more deeply. Read my full write-up on this development here.

Mel Gibson

Trump Administration reportedly planning to eviscerate Fourth Amendment, force Americans to obey British court orders

Note: on 1 October I wrote a follow-up piece concerning the possibility of forced, clandestine, judicially-unauthorized decryption of Americans’ communications taking place as a result of the proposed data sharing accord.

Americans fought two wars – fought, bled, and died – to throw off the yoke of British rule (1775) and protect American liberty, including the Fourth Amendment, from British invaders (1812).

Despite this history, the United States and the United Kingdom are apparently about to enter into a new data sharing treaty or executive agreement (current reporting from the Times and Bloomberg says it’s a “treaty,” but US statutes in this area indicate that what we’re dealing with is an executive agreement – which is different from a treaty in that it does not require ratification by the Senate) which will

  • effectively nullify the Fourth Amendment and the data privacy shield of the related federal Stored Communications Act when a British cop wants access to data stored in the US by American citizens, and
  • according to reporting in the Times and Bloomberg, force American citizens to obey British court orders, despite the fact that British courts are not bound to obey and apply the Constitution.

Donald Trump has no business giving these rights away without a fight. Nobody has any business telling Americans that we must obey foreign courts.

According to Bloomberg:

Social media platforms based in the U.S. including Facebook and WhatsApp will be forced to share users’ encrypted messages with British police under a new treaty between the two countries, according to a person familiar with the matter.

From the Times of London, which broke the story (my comments in [italics in brackets]):

WhatsApp, Facebook and other social media platforms will be forced to disclose encrypted messages from… serious criminals under a new treaty between the UK and the US.

At present [British] security services are only able to obtain data [from an American] if there is a need for an “emergency disclosure” due to an imminent threat to life. [Note: this is not true – the consensus view is that Americans have more or less absolute discretion to refuse data requests of any kind originating from the UK, whether emergency requests or not, if they are not approved by a US court under the Mutual Legal Assistance Treaty, or MLAT, procedure which is already in force between the two countries. In my experience, bona fide emergency requests are seldom refused.]

The police and prosecutors can also request data under the “mutual legal assistance” [or “MLAT”] treaty[, where after reviewing the foreign data request for compliance with all US due process and constitutional requirements such as the First, Fourth, and Fifth Amendments, a US judge serves a mandatory court order requiring a US citizen to provide the data] but the process is highly bureaucratic and can take up to two years. 

Under the new treaty, the police, prosecutors and the security services [seeking American data from US citizens] can submit requests for information [that are binding on US citizens, under penalty of law] to a [British] judge, magistrate or “other independent authority” [in Britain, which is under zero obligation to follow or apply the provisions of the U.S. Constitution].

The process will be overseen by the investigatory powers commissioner [, an appointed, un-elected British political apparatchik who also is not required to follow US law or implement US constitutional due process protections].

The UK has agreed it will not target people in the US and the US has agreed not to target people in the UK [despite the fact that this is a promise the UK cannot keep, since if the British police knew who the targets of these investigations were, there would be no need to obtain US-based user data from US citizens in the first place, and the British could use their extensive domestic surveillance and interception capabilities to obtain the information they need]. 

…Richard Walton, a former head of counterterrorism at the Metropolitan Police, said: “US tech giants have been inadvertently putting a veil over serious criminality and terrorism. It has tilted the balance in favour of criminals and terrorists. This is very welcome, it will make a big difference [to be able to circumvent that pesky U.S. Constitution and the Stored Communications Act].” 

[Only in a police state is a policeman’s job easy.]

I will preface the rest of this post by saying there is absolutely no way to guarantee that US citizens won’t be picked up in U.K. law enforcement data sweeps. U.K. law enforcement cannot know ahead of time where an Internet user is based. That’s one of the primary reasons why the police ask Internet companies for basic subscriber data and communications data – to identify the user. Even if there were such a guarantee, per the EFF, there is no statutory restriction that would prevent British police from sharing any data they collect on US citizens inadvertently with US law enforcement anyway.

Now: let me explain how all this works.

What the Fourth Amendment says

The Fourth Amendment to the US Constitution says that

  • the right of the people to be free from unreasonable searches and seizures shall not be violated; and
  • no warrants shall issue, except upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

The federal Stored Communications Act applies equivalent, Fourth Amendment-style protections to user data held by online services, requiring the federal government to obtain a subpoena or a search warrant before obtaining certain types of data from online service providers. This requirement exists even if the online service provider would have been willing to hand the data over voluntarily, subject to certain limited exceptions such as an emergency threatening life and limb.

Without the statutory protections of the Stored Communications Act, online service providers would be free to either hand over the data straight to the government or assert their own Fourth Amendment rights, within their discretion.

The proposed treaty/executive agreement reportedly disapplies the Fourth Amendment and the Stored Communications Act in relation to, and may force Americans to obey, the rulings of foreign courts

US companies and citizens are currently absolutely free to assert their Fourth Amendment rights, refuse a foreign or domestic government agency request for data that is not judicially authorized by a US judge, and require such foreign or US government agency to obtain a subpoena or search warrant signed by a U.S. judge before they will turn over so much as one bit of data to that agency.

The U.S. has, however, enacted a statute, known as the “Clarifying Lawful Overseas Use of Data Act” or, more commonly, the CLOUD Act, which

  • permits the U.S. to enter into executive agreements with third countries for data sharing and
  • eliminates any conflicting obligations between the existing federal Stored Communications Act and compliance with data disclosure under any of the aforementioned executive agreements (18 U.S.C. § 2511(2)(j)). To wit, the U.S. federal law that a U.S. company could invoke to preserve its users’ rights in the face of an unreasonable foreign request is disapplied with respect to any lawful order handed down by a court in a country with which the U.S. has an executive agreement. So when the U.K. asks a U.S. company for information, the US company can no longer answer that “America says you can’t see these communications without a U.S. warrant,”
  • while also permitting some sharing of data inadvertently obtained by the U.K. on U.S. persons with the U.S., despite the fact that the U.K., under the new arrangements, will unquestionably have come into possession of that data under the authority of a British judge, and will not have first obtained a valid search warrant issued by an American judge.

The CLOUD Act also makes certain provisions for what requirements foreign orders must comply with if they are to qualify for preferential treatment under the executive agreement.

Based on U.K. and U.S. media reports, what I believe has happened is that the U.S. and the U.K. have quietly concluded the terms of an executive agreement under the CLOUD Act which will allow British courts to serve legally binding data requests on US companies directly, without requiring a U.S. judge to sign off. If such an executive agreement is signed by the President, certain US companies may be, for all intents and purposes, stripped of their hitherto-untouched 4th Amendment right to refuse search and seizure orders from foreign courts.

If the U.K. media reports are correct, this executive agreement will “force” certain U.S. citizens to follow the orders of British courts that are not themselves answerable to the Constitution or constitutional government. This may be either, as reported in the Times, through express terms in the executive agreement of which we are not yet aware, or by removing conflicting terms in existing statutory protections such as the Stored Communications Act which U.S. persons can currently invoke to withhold disclosure when faced with an overbroad foreign order.

U.S. company, no U.K. presence

DOJ guidance states that “[t]here is no requirement under U.S. law that a provider comply with a foreign order, and the CLOUD Act creates no such requirement[.]”  If we’re working from the CLOUD Act alone, US companies with no U.K. presence will face significantly increased pressure to acquiesce to U.K. law enforcement demands, but no compulsion.

However, the reporting from the U.K. last week directly contradicts the DOJ guidance. It is presently unclear whether the executive agreement will introduce a standalone provision that purports to require US-based companies to obey foreign orders, as the U.K. reporting claims, or whether the U.K. is overstating the impact of the provisions that have been agreed. The language from both Bloomberg and the Times is pretty unambiguous: “Social media platforms based in the U.S. … will be forced.” In my opinion, any such requirement would be unconstitutional.

I am hoping the picture will become clearer in the coming days as the pending agreement comes under greater public scrutiny in the U.S.

U.S. company, U.K. presence

As put by the EFF:

…foreign law enforcement officials could grab data stored in the United States, directly from U.S. companies, without following U.S. privacy rules like the Fourth Amendment, so long as the foreign police are not targeting a U.S. person or a person in the United States.

Taking the above analysis re: “will be forced” language as read, U.S. companies with significant operations in the U.K. but which keep their data in the U.S. (virtually all major SaaS companies and all major consumer web companies, including social media companies and e-mail providers) effectively lose any statutory or Fourth Amendment protection they had to withstand U.K. police requests which conflict with US constitutional norms and statute.

Unless these companies close up shop and leave the U.K., their global standard for data disclosure will immediately deteriorate to match that of the U.K.

U.S.-based users, past, present and future, even if they are not allowed to be intentionally targeted, will all suffer from the loss of privacy to which this gives rise. I consider it extremely unlikely that most U.S. companies with U.K. presences will routinely challenge the U.K. authorities’ requests on grounds that, e.g., First Amendment concerns are implicated.


I note that proponents of the executive agreement claim that “the U.K. and U.S. will not be able to target each other’s citizens” in order to sell the deal. This is a little misleading.

First, the CLOUD Act only requires that the countries do not intentionally target the other’s citizens; it does not prevent the foreign country from passing that data back to the U.S. where it is unintentionally obtained and pertains to serious crime. As I mentioned above, it is very difficult if not impossible to determine where an Internet user is from prior to serving a warrant, as a search warrant or subpoena under the Stored Communications Act or equivalent foreign instrument is often, if not usually, issued in part in order to ascertain or confirm a user’s identity and location. 

Second, the “target” of a court order is not the same thing as the person upon whom the warrant is served. The “target” is the person or account being investigated; the warrant is not served on the target, but on a US person who holds data about the target. While the CLOUD Act provides that US persons may not be intentionally targeted by U.K. court orders, it is beyond doubt that US persons will be on the receiving end of these orders, in relation to which the British courts will consider them bound to obey.

What terms, exactly, the draft executive agreement contains are not presently known. The U.K. reporting has described it as a treaty, but the US CLOUD Act makes reference specifically to executive agreements (i.e. an agreement which does not require ratification by the Senate) – but whatever the final form, it seems clear from existing reporting that the U.K. believes that companies will be forced to disclose this information under whatever arrangements have been agreed.

The U.S. has made no mention of it – yet. 

As described, these transatlantic data sharing arrangements constitute an end-run around U.S. citizens’ Fourth Amendment right to refuse to comply with foreign government-initiated, unconstitutional searches and seizures

In my experience, US tech companies are not as obstinate as the U.K. politicians who favor this “treaty” portray them. They usually voluntarily provide data to foreign law enforcement agencies where an emergency – i.e., an immediate danger to life or threat of serious bodily injury – clearly exists, such as where someone is posting a threat.

Otherwise, in non-emergency scenarios where the online activity poses no immediate danger to anyone, US tech companies generally require overseas law enforcement to get a warrant from a US federal judge that ensures the overseas request, called a Mutual Legal Assistance Treaty or MLAT request, comports with all U.S. due process, free speech, or other constitutional requirements.

So if U.K. police aren’t getting data quickly from US companies, it’s because – generally speaking –

  • for metadata or basic subscriber information, the situation isn’t an emergency; or
  • for metadata or basic subscriber information, where an emergency request is made, the police are unable to convince the U.S. company that the situation is an emergency; or
  • for metadata or basic subscriber information, in either an emergency or non-emergency setting, the U.S. company is simply exercising its constitutional rights, enshrined in the Bill of Rights appended to the Constitution of the United States of America, which the United Kingdom of Great Britain and Northern Ireland has no business whatsoever interfering with, because America won the war, and when you win the war you get to make the rules.

And if they’re not getting requested data at all from U.S. companies in non-emergency situations, it’s because

  • for metadata or basic subscriber information, the U.S. company is simply exercising its constitutional rights, and the British decide filing an MLAT is not worth their time or the MLAT they do submit does not pass constitutional muster; or
  • for the content of communications, disclosure of which will generally require a U.S. warrant, the British decide filing an MLAT is not worth their time or the MLAT they do submit does not pass constitutional muster.

Long story short: the British can already get data on a non-emergency basis if they want to, but this requires extensive bureaucratic vetting to ensure the requests comport with US constitutional requirements. What the new executive agreement is likely to aim to do is make it considerably easier for British police to pry open American servers on short notice and with limited, if any, American judicial supervision.

What this will look like in practice is anybody’s guess. Lawfare blog wrote at the time of the CLOUD Act’s passage that 

The U.S. has perhaps the strongest free-expression rules in the world. In the context of mutual legal assistance treaties, the U.S. turns down many data demands from foreign governments because they seek information in connection with speech that would not be criminal in the U.S. because of the First Amendment. The Justice Department must decide whether to ensure similar protections under Cloud Act arrangements, and it will have to determine how to prevent foreign orders from infringing on freedom of speech no matter whose version of free speech is being protected.

It seems observers are not clear as to the extent to which foreign orders can bind and what the extent of protections will be under CLOUD Act data sharing agreements. (This makes sense, as no CLOUD Act agreements have ever been made before.) Much will therefore depend on the exact terms of the U.K.-U.S. arrangements.

As a final note, if the data which a foreign government obtains happens to implicate a U.S. citizen in a crime, well, too bad: per our friends at the EFF, even where foreign police haven’t complied with U.S. constitutional requirements

the CLOUD Act fails to provide any limits on foreign police sharing Americans’ metadata with U.S. police.

and the content of communications may be shared with US authorities where it pertains to serious crime, per 18 U.S.C. § 2523(b)(4)(H).

The U.K. does not have substantially equivalent procedural protections to the United States’

The CLOUD Act which authorizes the entry into this executive agreement with the U.K. specifies that

an executive agreement governing access by a foreign government to data subject to this chapter… shall be considered to satisfy the requirements of this section if the Attorney General, with the concurrence of the Secretary of State, determines, and submits a written certification of such determination to Congress, that… (1) the domestic law of the foreign government, including the implementation of that law, affords robust substantive and procedural protections for privacy and civil liberties in light of the data collection and activities of the foreign government that will be subject to the agreement[.]

The United Kingdom utterly lacks such protections.

As I have written before, the state of civil liberties in the U.K. is abysmal. Before any mandatory co-operation between the two countries is enacted, the U.K. should improve its standards substantially (by several orders of magnitude).

US internet companies are currently able, if they choose, to assert their Fourth Amendment rights for the benefit of all their users wherever those users may be based. This includes the basic requirement that before a government agency may compel a private company to hand over records, that government agency must have in hand a search warrant or subpoena. If evidence is obtained unlawfully, e.g. without first obtaining a search warrant, it cannot be used against a defendant. This is known as the exclusionary rule. 

England lacks what in the U.S. would be considered the bare minimum due process requirements to effect a search. No oath or affirmation is required, for example, before a warrant may issue. There is no requirement for probable cause before a warrant may issue; English search warrants utilize the lower common law standard of “reasonable suspicion,” and in many circumstances a warrant is not needed at all. For example, if someone is arrested, per Section 18 of the Police and Criminal Evidence Act 1984 (PACE), their house may be searched on the orders of a police officer without any judicial authorization (in a manner that would not be permitted in the US under our search incident to lawful arrest doctrines).

Worse, where the US expressly bans so-called “general warrants,” England permits them in the form of “all premises warrants” (see PACE s. 8(1A)) which authorize the search of all premises controlled by a person named in a warrant, whether there is probable cause for those premises to be searched or not. The US,  by contrast, requires suspicion to be particularized (“particularly describing the place to be searched”) and based upon probable cause; the mere fact that someone owns property and has been arrested on an indictable offense does not, in America, permit the police to then rummage through literally everything the arrestee owns.

Nor do England and Wales have an exclusionary rule; section 78 of PACE 1984 says that a court may exclude illegally obtained evidence, not that it must. Inevitably this means that more illegally obtained evidence is introduced against defendants in British trials – including against any Americans who might get inadvertently caught in one of the British dragnets envisioned by this new executive agreement – than would be the case in the United States.

In England, the “right to remain silent” does not exist for criminal defendants. Your silence can and will be used against you, as Sections 34 to 39 of the Criminal Justice and Public Order Act 1994 allow the government to draw adverse inferences if you do not answer their questions. This practice is unconstitutional in the United States, where defendants have the right to remain silent; and Carter v. Kentucky, 450 U.S. 288 (1981), says your silence cannot be used against you.

Furthermore, many “serious crimes” in England are explicitly constitutionally protected in the United States. This is especially the case where speech and expression of extreme ideas are concerned: much of what English law terms “terrorism” or “inciting hatred” the U.S. would call “free speech.”

Finally, with regard to data specifically, the English do not have anything approaching “robust substantive and procedural protections for privacy.” The entirety of English legislation surrounding extrajudicial authorization for surveillance and mandatory RIPA data requests without notice, mandatory retention of internet connection records, and more under the Investigatory Powers Act 2016 would be, without a doubt, illegal if done by the US government to US citizens.

Who can view internet connection records in Britain without a warrant. From Wikipedia.

England does not know what it means to have a provision like the Fourth Amendment or the Stored Communications Act. Virtually every aspect of English rules of evidence and criminal procedure relating to search and seizure of data, as ordinarily practiced by UK police forces, would be struck down as unconstitutional if enacted in the United States. I struggle to understand how the Attorney General could certify to Congress that the United Kingdom is capable of satisfying the due process requirements of an executive data sharing agreement.

If this becomes law, it will be challenged in court

Britain has no experience with US-style civil liberties and cannot be trusted to issue orders to U.S. persons that comport with and respect our freedoms. 

The United States should not enter into this reported executive agreement and the CLOUD Act – which was not debated in Congress and was passed in an omnibus bill – should be repealed. 

British police have a tough job to do, as do all police forces. But the British police should be on an equal footing with American police and every other police force on Earth.

What that looks like is a requirement to obtain an American warrant, signed by an American judge, in accordance with American standards, and accountable to an American constitutional challenge, in America, before the British police can say they have a right of any kind to search American citizens’ servers, which are also in America, and seize their content. The current MLAT procedure provides for this. The new U.S.-UK executive agreement/treaty procedure, reportedly, will not.

If the U.K.-U.S. data sharing agreement is about to enter into force, this means the CLOUD Act is about to start causing injury, meaning it is likely to start facing legal challenges. Hopefully enough people will notice what is going on, there will be some backlash, and the U.K.-U.S. data sharing agreement will never become law.

In any case, men fought and died to protect the Fourth Amendment. The U.K. and the President should keep their hands off of it.


Not Legal Advice, 9/22/19 – self-proclaimed architect of the “Zug Defence” arrested, ICOBox sued, Section 230 limited by the 9th Circuit

Welcome back to this week’s edition of Not Legal Advice! Because legal advice costs money, and this blog is free.

Between delivering the keynote at blockchain day of Stamford Innovation Week and getting ready for a speaking gig at Crypto Springs, I’ve been pretty busy, so this week’s newsletter is going to be on the short side (a mere 1,800 words). This week:

  1. Self-proclaimed architect of the “Zug Defence” (or “Defense” for Americans) arrested
  2. ICOBox sued for selling unregistered securities, fraud, and operating as an unregistered broker-dealer; Paragoncoin resurfaces
  3. Enigma v. Malwarebytes: 9th Circuit says Section 230 doesn’t apply to deliberately anticompetitive conduct

1. Self-proclaimed architect of the “Zug Defence” arrested

Last week brought us the news that Steven Nerayoff – early Ethereum advisor, sometimes Ethereum co-founder, and current one-of-those-guys-who-is-on-twelve-different-token-boards – was arrested and charged in the Eastern District of New York with extortion.

Although of course Nerayoff and his alleged co-conspirator, a fellow named Michael Hlady who previously was convicted of defrauding a group of nuns in Worcester, Mass (no, really), are innocent until proven guilty, it suffices to say that the allegations contained in the indictment do not portray either defendant in an especially flattering light.

Of wider significance here from the observer’s viewpoint is the fact that Nerayoff claims to have been the architect of – and is therefore someone with intimate knowledge of – the Ethereum Foundation’s early legal strategy. In particular, Nerayoff is likely to be aware of the contents of a legal opinion which, according to CoinDesk, is said to have cost $200,000, payment of which Nerayoff reportedly guaranteed with his own money. This person is now in federal custody.

The issuance of this legal opinion is worth re-examination, if only for historical purposes. Apart from the obvious fact that $200,000 is rather a lot of money to pay for a legal opinion, the issuance of that opinion – which I presume authorized the sale, otherwise why pay $200k for it – arguably set off the ICO boom as we know it. The fact that Ethereum proceeded with legal air cover and was such a wild, runaway success encouraged other law firms, large and small, to take a view on subsequent offerings in order to gain market share and marquee clients.

Ethereum was the first of many coin issuers to set up shop in Zug, Switzerland – now known as “crypto valley” – presumably under the theory that Swiss residence and legal structures would immunize them from U.S. law. This tactic, referred to in jest by cryptolawyer OGs as the “Zug Defence,” is rumored to involve establishing a Swiss Stiftung, or foundation; obtaining tax opinions from a Swiss law firm that the token-product is to be treated as a software product for tax purposes; and, in Ethereum’s case, obtaining a second, supplemental opinion which presumably set out the U.S. legal position (if the rumors are true). Although I have not read it, to the extent that opinion authorized the Ethereum pre-sale to occur in the U.S. without requiring the Ethereum Foundation to register the tokens or avail itself of an exemption, it would have been, in my professional opinion, legally incorrect. This conclusion is based on the SEC’s 2018 Paragon and AirFox settlements, which we may presume form the template for the enforcement actions that will follow, and from which the Ethereum pre-sale, in hindsight, does not appear to have been materially different.

Generally speaking, a practitioner who possesses even one whit of conservatism in their bones will tell you that the so-called “Zug Defence” is not much of a defence at all, to the extent that the transaction or scheme touches the U.S. or captures the U.S.’ attention. Although the statute of limitations for the Ethereum Foundation qua token issuer under the Securities Act of 1933 has run, their operations continue. When a supposed non-profit in Switzerland magically creates $20+ billion out of thin air, you can be sure this does not go unnoticed.

This is accordingly a story to watch.

This marmot is on Mt. Rainier, not in Switzerland. This marmot follows U.S. securities laws.

2. SEC sues ICOBox for selling unregistered securities, fraud, and operating as an unregistered broker-dealer; Paragoncoin resurfaces

In other federal-agencies-on-the-warpath news, the U.S. Securities and Exchange Commission sued ICOBox and its founder last week for allegedly conducting an unregistered coin offering, engaging in fraud in relation to that coin offering, and operating as an unregistered broker-dealer in relation to other coin offerings launched using its platform.

Attorneys can spot plausibly deniable sarcasm from 1,000 yards, and the complaint does not disappoint:

ICOBox proclaims to be a “Blockchain Growth Promoter and Business Facilitator for companies seeking to sell their products via ICO crowdsales” —in other words, an incubator for digital asset startups. A self-described blockchain expert, Evdokimov, has acted as the company’s co-founder, CEO, and “vision director,” among other titles.

The facts of the coin offering and the alleged fraud do not bear repeating here. More interesting from my perspective is how the SEC has built up its claim that ICOBox was acting as an unregistered broker-dealer:

The token sale conducted by at least one of these clients, Paragon Coin, Inc. (“Paragon”), constituted a securities offering under Howey… By actively soliciting and attracting investors to ICOBox’s clients’ securities offerings in exchange for transaction-based compensation without registering as or associating with a registered broker-dealer, Defendants engaged in unregistered broker activities that violated the federal securities laws.

SEC v. Paragon Coin, we may remember, was the first major settlement announced between the SEC and an ICO issuer, back in November 2018. Around the same time, the SEC announced settlements with AirFox (unregistered securities offering) and the founder of EtherDelta (for operating an unregistered securities exchange). About 30 days prior to that, the SEC announced its settlement with ICO Superstore, a similar business to ICOBox, for operating as an unregistered broker-dealer.

So we should not be surprised that the SEC is going after ICOBox, nor should we be surprised if the SEC decides to go after other token mills in the future. Interestingly, the SEC appears to have used the cooperation and disclosure obtained in the Paragon exercise to build the case against ICOBox:

ICOBox’s team members highlighted on social media during the offering that ICOBOX had started to work with certain clients including Paragon (referring to it as ICOBox’s “child”), but did not disclose that no ICOBox clients had yet completed any ICOs using its services.

Tl;dr? The SEC is good at a lot of things, but they’re particularly good at playing follow-the-money, and their inquiries will not end with token issuers. They will use what they learn at issuer level to move up the chain to promoters and service providers. It will be interesting to learn what is revealed as they work through that process.

3. Enigma v. Malwarebytes: 9th Circuit says Section 230 doesn’t apply to deliberately anticompetitive conduct

If you don’t know what Section 230 of the Communications Decency Act is, start here. If you do, recall that Section 230 has two main operative provisions:

  • Section 230(c)(1), which says that publishing platforms and users of publishing platforms are not liable for content created by someone else; and
  • Section 230(c)(2), which basically says that companies can’t be sued for good-faith moderation calls, so if e.g. you’re Milo Yiannopoulos and one of your posts is moderated off of Facebook, if you sue Facebook for it, you will lose.

With regard to each of those provisions, however, these shorthand definitions are just that: shorthand. What they gain in accessibility for the layman, they lose by stripping away the statute’s actual, technical language. Section 230(c)(2) reads as follows:

No provider or user of an interactive computer service shall be held liable on account of (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in [sub-]paragraph ([A]).

The facts of Enigma v Malwarebytes are as follows.

Enigma Software Group USA, LLC, and Malwarebytes, Inc., were providers of software that helped internet users to filter unwanted content from their computers. Enigma alleged that Malwarebytes configured its software to block users from accessing Enigma’s software in order to divert Enigma’s customers.

Malwarebytes and Enigma have been direct competitors since 2008, the year of Malwarebytes’s inception. In their first eight years as competitors, neither Enigma nor Malwarebytes flagged the other’s software as threatening or unwanted. In late 2016, however, Malwarebytes revised its PUP-detection criteria to include any program that, according to Malwarebytes, users did not seem to like.

After the revision, Malwarebytes’s software immediately began flagging Enigma’s most popular programs— RegHunter and SpyHunter— as PUPs. Thereafter, anytime a user with Malwarebytes’s software tried to download those Enigma programs, the user was alerted of a security risk and, according to Enigma’s complaint, the download was prohibited[.]

As a former startup guy, I know all too well that competition in the software industry is a fight to the death.

Fortunately, commerce is not a free-for-all; there are rules and certain standards of fair dealing that companies are expected to follow as they compete. Enigma brought a number of claims under state and federal law, ranging from unfair and deceptive trade practices to a Lanham Act claim for making a “false or misleading representation of fact” regarding another person’s goods. Malwarebytes argued it was immune from the action under Section 230(c)(2).

Malwarebytes won at first instance. The 9th Circuit reversed:

The legal question before us is whether § 230(c)(2) immunizes blocking and filtering decisions that are driven by anticompetitive animus.

In relation to which the court found:

Enigma points to Judge Fisher’s concurrence in Zango warning against an overly expansive interpretation of the provision that could lead to anticompetitive results. We heed that warning and reverse the district court’s decision that read Zango to require such an interpretation. We hold that the phrase “otherwise objectionable” does not include software that the provider finds objectionable for anticompetitive reasons…
…if a provider’s basis for objecting to and seeking to block materials is because those materials benefit a competitor, the objection would not fall within any category listed in the statute and the immunity would not apply.

Pretty clear-cut ratio there.

Eric Goldman’s treatment of the subject is much more detailed than my own, and I recommend it to anyone looking to read further into this case. Suffice it to say that I agree with the 9th Circuit and disagree with Goldman. Anticompetitive conduct by large tech companies is a growing problem, and it cannot have been the intention of Congress to enable unlawful anticompetitive conduct with Section 230. At least as far as I am concerned, the natural meaning of “otherwise objectionable,” while extremely broad, does have limits; much as one would have a difficult time finding a motorcycle or a plant objectionable, it is conceivable that anti-malware software which is not itself malware might fall outside those limits.

The opening created here is narrow and appears to be strictly limited to anticompetitive conduct, although there is a risk this ruling could be distinguished by new categories of litigants whose user-generated content is excluded without apparent justification from online platforms. I struggle to think whence these claims might arise, given that users of online platforms customarily contract away most of their rights and acquiesce to the platform’s discretion to filter content as it pleases in accordance with its policies (as opposed to the situation in Enigma, where Enigma’s rights vis-à-vis Malwarebytes originated in a statute which Enigma did not waive). This naturally invites the question of whether states themselves will try to create new statutory protections for constitutionally protected opinions – which, of course, is exactly the thing that Section 230 of the Communications Decency Act was enacted to prevent. Between Enigma and the EFF’s First Amendment challenge to FOSTA/SESTA, Section 230 jurisprudence over the next few years looks to be anything but boring.

See you next week!

Not Legal Advice, 9/16/19: Crypto taxes, systemic risk in DeFi, and Section 230

Welcome back to the second installment of Not Legal Advice, my new newsletter-thing I publish every week where I discuss three (3) items of interest from the prior week in crypto or crypto-adjacent technology law.

Because it’s happened twice, now, it’s a tradition. Traditions are warm and fuzzy and wholesome. So gather ’round the fireside, little marmot friends, and let’s have a conversation about what happened last week, and why it’s relevant going forward:

  1. France won’t tax shitcoin trades (also they are going to ban Libra from Europe)
  2. A company called “Staked” creates the “Robo Advisor for Yield,” or as I like to call it, the “Risk Mega Enhancer”
  3. The Second Circuit Court of Appeals finds that Section 230 of the Communications Decency Act is, indeed, as broad as its detractors claim

1) France won’t tax shitcoin trades (also they’re going to ban Libra)

According to official pronouncements from the French economic ministry:

  • Cryptocurrency transactions aren’t going to be subject to VAT.
  • Cryptocurrency trading activity won’t give rise to a tax charge until the crypto is traded out into fiat.

Three things limit what I can say about the French rules.

First, my French is not very good.

Second, I’m not a French avocat, which in French means both “male lawyer” and “avocado.”

Third, even in the two countries (America and England) where I am an avocado, I am not a tax avocado.

It suffices to say that France’s treatment of cryptocurrency trading income and gains differs from the tax treatment in England and the U.S. The English guidance makes it clear that trading gains are either income or capital gains, depending on whether the so-called badges of trade are present. In practice, HMRC guidance tells us that it is likely that capital gains tax would apply:

Only in exceptional circumstances would HMRC expect individuals to buy and sell cryptoassets with such frequency, level of organisation and sophistication that the activity amounts to a financial trade in itself. If it is considered to be trading then Income Tax will take priority over Capital Gains Tax and will apply to profits (or losses) as it would be considered as a business.

and what constitutes a chargeable asset for Capital Gains Tax purposes?

Cryptoassets are digital and therefore intangible, but count as a ‘chargeable asset’ for Capital Gains Tax if they’re both… capable of being owned… [and] have a value that can be realised.

And what events give rise to the charge?

Individuals need to calculate their gain or loss when they dispose of their cryptoassets to find out whether they need to pay Capital Gains Tax. A ‘disposal’ is a broad concept and includes… selling cryptoassets for money[;] exchanging cryptoassets for a different type of cryptoasset[;] using cryptoassets to pay for goods or services [; and] giving away cryptoassets to another person[.]”

What about in the U.S.?

The sale or other exchange of virtual currencies, or the use of virtual currencies to pay for goods or services, or holding virtual currencies as an investment, generally has tax consequences that could result in tax liability…

…For federal tax purposes, virtual currency is treated as property. General tax principles applicable to property transactions apply to transactions using virtual currency.

It’s obviously more complicated than that, but you get the general gist. Speak to a tax avocado if you have further questions.
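To make the disposal mechanics concrete, here is a minimal sketch in Python of how a gain or loss on a disposal is computed. The numbers and the simplified single-lot cost basis are my own assumptions for illustration; real UK share-pooling rules and U.S. lot-identification rules are more involved, and none of this is tax advice.

```python
# Illustrative only: a simplified capital-gains calculation for a crypto
# disposal. Real-world rules (UK share pooling, same-day/30-day rules,
# US specific-lot identification, fees, allowances) are more complex.

def capital_gain(units_sold: float, sale_price: float,
                 cost_basis_per_unit: float) -> float:
    """Gain (or loss, if negative) = proceeds minus allowable cost."""
    proceeds = units_sold * sale_price
    cost = units_sold * cost_basis_per_unit
    return proceeds - cost

# Hypothetical: bought 10 ETH at $100 each, later swapped them for BTC
# when ETH was worth $180. Under both the HMRC and IRS guidance quoted
# above, a crypto-to-crypto exchange is itself a taxable disposal.
gain = capital_gain(units_sold=10, sale_price=180, cost_basis_per_unit=100)
print(gain)  # 800.0
```

The point the guidance makes, and the sketch illustrates, is that the charge arises at the moment of disposal, not only when the proceeds land in fiat.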

Oh, and Americans, one of the great things about being American is that the warm, loving embrace of the United States is always with you wherever you may be on this or any other world – as all the dual-nationals I know joke, “we keep the U.S. passport as it means we have a seat on the last helicopter out.” Yes, that’s how much Americans with EU passports trust European voters.

The price you pay for that privilege is that America always taxes you on your worldwide income wheresoever you may be. So don’t think that you can move to France, offload all that premined Ether you’ve been sitting on for years into BTC and avoid the tax hit. Speak to a skeptical and conservative American tax avocado first.

Also, good on France for saying it won’t permit Libra to operate in Europe, as currently proposed.

2) Robo Adviser For Risk

The only thing I like less than DeFi is a DeFi bro.

Lately a number of services have sprung up offering staggering, double-digit rates of interest to cryptocurrency holders who are willing to commit their savings to crypto-first lending institutions, which claim to have profitable lending businesses on the other end of the transaction. In a low- to negative-interest-rate environment, everyone everywhere is trying to figure out where they can find yield and, accordingly, where they can make money.

The hope, the dream, is that crypto has magically solved this, has found its killer app, in the form of high yield interest-bearing accounts. From Balaji:

No offense, but I don’t buy it.

First, a number of these businesses – and there are more than one – will turn out to be Ponzi schemes. We don’t know which ones, but they’re out there.

Second, there is no such thing as a free lunch:

Third, there are those who argue that risk is not the driver of high rates and that some other black magic is at work. Let us examine this argument from the perspective of a borrower, who for present purposes we shall call Bob.

The ballad of Bob the Borrower

Sally Saver deposits 100 ETH with Lily Lender, who promises Sally Saver a 5% rate of interest on her deposit. Lily Lender now needs to get that 5%, plus enough to cover her expenses, from somewhere.

Ordinarily, that means that Lily Lender needs to make a loan to Borrower Bob at a rate of interest greater than 5%. You can’t run a lending business at a loss forever to gain market share and adoption unless you’re burning through venture funds to do so, as some of these businesses appear to be doing. BlockFi, e.g., has made it abundantly clear that it is taking a tech-company approach to developing a lending business:

We are OK with losing money for a while. If it was purely formulaic we probably wouldn’t have enough control to make sure it’s attractive enough to a large amount of people to hit our customer acquisition targets.

I don’t have an issue with that strategy, as long as the ledger balances out and BlockFi has enough spare VC firepower to satisfy its obligations. But we should not mistake this development, which is likely being mirrored by BlockFi’s competitors, for a fundamental change in the nature of risk. The risk hasn’t disappeared, it has simply been transferred onto the companies themselves, and they are paying for it in the form of a subsidy.

Subsidies, of course, have this pesky little problem that they eventually run out. When that happens, the risk they have buried rears its head and begins to manifest itself in pricing. The likely result: rates offered to savers will go down, the cost of funds will go up, and there will be a liquidity crunch among those who relied on these facilities.

Speaking of which, who does rely on these liquidity facilities? Nobody really knows; it is this writer’s observation that crypto lending companies are extremely opaque about their lending operations, no doubt to gain an edge.

With interest rates at historic lows, however, we can probably guess that the people who are willing to pay north of 10% to borrow DAI are doing so because they (a) have an interest in seeing Dai or related financial products succeed and are willing to absorb enormous losses to create the appearance of a thriving market, (b) cannot obtain financing from literally any other source, or (c) in the case of over-collateralized loan protocol products like Dai, are seeking to obscure the source of their cryptocurrency wealth and are willing to absorb enormous losses to do so (by defaulting on the loan).

However, the fact remains: for every crypto loan product in existence, Bob the Borrower’s payments must be equal to or greater than Sally Saver’s returns in order for the product to be viable in the long term. 
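That constraint is simple arithmetic, and it can be sketched in a few lines of Python. All the numbers below are invented for illustration, and `annual_shortfall` is my own toy helper, not anyone’s actual model.

```python
# A lending desk is sustainable only if what borrowers pay covers what
# savers are promised plus operating costs; any gap is a subsidy the
# lender must fund (e.g. out of venture capital).

def annual_shortfall(deposits: float, saver_rate: float,
                     loans_outstanding: float, borrower_rate: float,
                     operating_costs: float) -> float:
    """Positive result = money the lender must burn per year."""
    interest_owed = deposits * saver_rate
    interest_earned = loans_outstanding * borrower_rate
    return interest_owed + operating_costs - interest_earned

# Sally deposits 100 ETH at a promised 5%; Lily manages to lend only
# 80 ETH of it to Bob at 6%, and spends 1 ETH/year running the shop.
shortfall = annual_shortfall(100, 0.05, 80, 0.06, 1.0)
print(round(shortfall, 2))  # 1.2 -> the business loses 1.2 ETH/year
```

Even with Bob paying a higher rate than Sally receives, undeployed deposits and overheads can leave the business underwater; that gap is the subsidy discussed above.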

Back to our regularly scheduled programming…

“Staked Automates the Best DeFi Returns With Launch of Robo Advisor,” trumpets CoinDesk. Staked has built a product…

Staked’s new Robo Advisor for Yield (RAY) service, which launches today, automates the process of finding high-yielding opportunities. Normally, investors have had to watch constantly and reallocate quickly to catch an enhanced DeFi return. Now they can set a smart contract to do the monitoring and allocating for them.

“This product is targeted to people who hold eth or dai and want to earn yield on it,” CEO Tim Ogilvie told CoinDesk in an interview. “If you hold ETH, you can earn more ETH. If you hold DAI, you can earn more DAI.”…

With RAY, investors can put their assets (ETH, USDC or DAI) into an asset-specific pool and the smart contract will automatically invest all or part of that pool into contracts with the best yield at any given time. For now, it will invest only on the money market Compound and with the derivatives protocols DYDX and BZX. But Staked is vetting additional smart contracts for safety and reliability.

“We’re not necessarily saying we are going to beat the market. We’re just saying you’ll get the best of what a savvy watcher would get in the market,” Ogilvie said.

“The vision we are building toward is the same level of sophistication the fixed income markets have in traditional finance,” he added.
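Mechanically, what CoinDesk describes is a greedy rule: at each rebalance, move the pooled funds to whichever approved venue currently quotes the best yield. Here is a toy Python sketch of that logic; the venue names and rates are invented, and the real product is an on-chain smart contract that must also reckon with gas costs, slippage, and smart-contract risk.

```python
# Toy version of a "robo advisor for yield": allocate the pool to the
# highest-yielding venue on an approved list. All data here is invented.

APPROVED_VENUES = {"Compound": 0.031, "dYdX": 0.045, "bZx": 0.038}

def best_venue(yields: dict[str, float]) -> str:
    """Pick the venue currently quoting the highest rate."""
    return max(yields, key=yields.get)

def rebalance(pool_balance: float, yields: dict[str, float]) -> tuple[str, float]:
    """Move the whole pool to the best venue (all-or-nothing, for simplicity)."""
    venue = best_venue(yields)
    return venue, pool_balance

venue, allocated = rebalance(1_000.0, APPROVED_VENUES)
print(venue, allocated)  # dYdX 1000.0
```

Note what the rule does not do: it chases the quoted rate, not the risk-adjusted rate, which is precisely the objection raised below.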

I’m not going to lie, this is pretty cool, and offerings like it remind me of, e.g., crowdsourced or robo-offerings from companies like Betterment or eToro that have done very well.

But the reason I don’t like DeFi bros claiming they reinvented structured finance, as alluded to in my tweet above, is that I am a child of the Global Financial Crisis… and indeed I spent the first half of my career, in the throes of that crisis, working in structured finance. The great lesson of the crisis was that you cannot engineer risk out of transactions; you can only obscure it. This is the first law of conservation of risk or, as my friend Palley put it years ago, “The First Law of Lawmodynamics.”

The first law of thermodynamics says energy “cannot be created or destroyed. It can, however, be transferred from one location to another and converted to and from other forms of energy.” Maybe the same is so of liability and damages. You can’t destroy or avoid either by building a better mousetrap. You can only move it, or (arguably) move the consequences of that liability elsewhere.

There’s risk somewhere in crypto, waiting to get out; regulatory intervention and Ethereum going the way of MySpace are both possible triggers. As the subprime crisis showed, even the most professional, Ivy League-educated, well-dressed “Savvy Watchers” won’t see it until it’s too late and everyone is running for the exits.

Don’t make the mistake of thinking that Staked – or any other DeFi company – is providing you with guaranteed risk-adjusted returns. They’re not, and anyone who thinks they are had better steel themselves for a very unpleasant surprise.

A marmot picture: to break up the monotony of a wall of text and get a marmot thumbnail on shared links. Distributed under the Pixabay Licence.

3) The Second Circuit Court of Appeals finds that Section 230 of the Communications Decency Act is, indeed, as broad as its detractors claim

This is more of a law nerd thing, so if you’re just here for the crypto, switch off.

Politicians hate Section 230 of the Communications Decency Act, 47 U.S. Code § 230 – particularly Missouri Senator Josh Hawley, who writes:

“With Section 230, tech companies get a sweetheart deal that no other industry enjoys: complete exemption from traditional publisher liability in exchange for providing a forum free of political censorship,” said Senator Hawley. “Unfortunately, and unsurprisingly, big tech has failed to hold up its end of the bargain.

That statement is half right. Section 230 grants a broad immunity from publisher liability for online platforms that engage in traditional publisher-like activities with regard to user-generated content. They do not have any obligation to provide that forum free from censorship; indeed, Section 230 expressly permits tech companies to engage in censorship more or less free from consequences.

Section 230 gives us two rules that are largely responsible for America’s success in building a thriving Internet economy. I explore Section 230 in detail here, but for present purposes it suffices to note that it essentially promulgates two legal rules: 

  • Platforms and users are not liable for content on their platforms that has been created by someone else (Section 230(c)(1)).
  • If a web app moderates any content off of its platform, i.e. it deletes it, and anyone sues the web app for doing so, the person suing is going to lose (Section 230(c)(2)).

The case is Force v. Facebook. Force brought

an action for damages against Facebook pursuant to the Antiterrorism Act (“ATA”) and related claims for having knowingly provided material support and resources to HAMAS, a notorious terrorist organization that has engaged in and continues to commit terror attacks, including the terrorist attacks that killed 29-year-old Taylor Force, 16-year-old Yaakov Naftali Fraenkel, three-month-old Chaya Zissel Braun, and 76-year-old Richard Lakin, and injured Menachem Mendel Rivkin, and the families of these terror victims.

The plaintiffs alleged:

HAMAS has recognized the tremendous utility and value of Facebook as a tool to facilitate this terrorist group’s ability to communicate, recruit members, plan and carry out attacks, and strike fear in its enemies. For years, HAMAS, its leaders, spokesmen, and members have openly maintained and used official Facebook accounts with little or no interference. Despite receiving numerous complaints and widespread media and other attention for providing its online social media platform and communications services to HAMAS, Facebook has continued to provide these resources and services to HAMAS and its affiliates.

Facebook has knowingly provided material support and resources to HAMAS in the form of Facebook’s online social network platform and communication services.

The plaintiffs provided numerous examples of anti-semitic Hamas propaganda on Facebook over a period of years in their extensive, 61-page complaint, which I will not republish here. Each plaintiff sought not less than $1 billion in damages plus attorneys’ fees.

The problem the plaintiffs faced in bringing this action is that Section 230 of the Communications Decency Act was standing in their way. Recalling the literal text of Section 230:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

the burden was on the plaintiffs to demonstrate why Facebook should be treated as the publisher or speaker of Hamas’ content. The parties stipulated to the fact that Facebook was a “provider… of an interactive computer service.” The plaintiffs thus disputed:

  1. whether Facebook was “acting as the protected publisher of information” under Section 230(c)(1), i.e., whether it was instead providing some other infrastructure function that was not intended to be captured by Section 230(c)(1); and/or
  2. “whether the challenged information is provided by Hamas, or by Facebook itself,” because if Facebook is an information content provider, even in some small part (see Section 230(f)(3)), the Section 230 immunity falls away.

Facebook as protected publisher of information

At minimum, the Section 230(c)(1) immunity is thought to apply to standard categories of speech torts such as harassment or defamation. It provides American Internet companies with a near-total defense against those claims, in contradistinction to European jurisdictions such as England and Section 5(3) of that country’s Defamation Act 2013.

How does this work in practice? Well, let’s go back to our friend Bob Borrower from Item 2 and introduce a new character, Dan Defamer, Speech Villain Extraordinaire. Let’s also assume, arguendo, that everything that comes out of the mouth of Dan Defamer, Speech Villain Extraordinaire, isn’t protected speech according to the First Amendment.

If Dan Defamer says of Bob the Borrower, in a newspaper article, “Bob Borrower is a no-good scallywag who does not pay his debts,” Bob the Borrower may sue Dan Defamer and the newspaper for publishing the lie (whether he will win is another matter). If, however, Dan Defamer logs on to Twitter and repeats the lie there, in the plain and ordinary meaning of the term “publisher” Twitter is a publisher as much as the newspaper is. However, it is generally understood that Twitter is not, under U.S. law, treated as the publisher of the statement and therefore is not liable for its content. Twitter is not even under a legal obligation to remove it. Dan Defamer is the speaker and it is he who is liable for the consequences of the speech.

The question presented in Force is slightly different, though. Rather than suing over speech which is itself tortious, the plaintiffs attacked Section 230 collaterally: they alleged that by providing an online platform, which Hamas the terrorist group then accessed, Facebook itself played a role in facilitating terrorism and was accordingly liable to pay damages to the plaintiffs under a specific federal law which provides that victims of terrorism can seek compensation from companies that commit, or aid, abet, or conspire to commit, international terrorism:

By providing its online social network platform and communications services to HAMAS, Facebook violated federal prohibitions on providing material support or resources for acts of international terrorism (18 U.S.C. § 2339A), providing material support or resources for designated foreign terrorist organizations (18 U.S.C. § 2339B), and financing acts of international terrorism (18 U.S.C. § 2339C), and committed acts of international terrorism as defined by 18 U.S.C. § 2331. Accordingly, Facebook is liable pursuant to 18 U.S.C. § 2333 and other claims to the Plaintiffs, who were injured by reason of an act of international terrorism. (Emphasis mine.)

…By participating in the commission of violations of 18 U.S.C. § 2339A that have caused the Plaintiffs to be injured in his or her person, business or property, Facebook is liable pursuant to 18 U.S.C. § 2333 for any and all damages that Plaintiffs have sustained as a result of such injuries.

The relevant statute, 18 U.S.C. § 2333, reads:

Any national of the United States injured in his or her person, property, or business by reason of an act of international terrorism, or his or her estate, survivors, or heirs, may sue therefor in any appropriate district court of the United States and shall recover threefold the damages he or she sustains and the cost of the suit, including attorney’s fees.

In an action under subsection (a)… for an injury arising from an act of international terrorism committed, planned, or authorized by a [designated foreign terrorist organization, i.e. Hamas]… liability may be asserted as to any person who aids and abets, by knowingly providing substantial assistance, or who conspires with the person who committed such an act of international terrorism.

But the court rejected the plaintiffs’ reasoning, stating:

…it is well established that Section 230(c)(1) applies not only to defamation claims, where publication is an explicit element, but also to claims where “the duty that the plaintiff alleges the defendant violated derives from the defendant’s status or conduct as a publisher or speaker.”  LeadClick, 838 F.3d at 175 (quoting Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1102 (9th Cir. 2009))

Put another way, even though there is a statute which prohibits providing material support to terrorism, Facebook’s status as a publisher of user-generated content means that it benefits from Section 230’s immunity from being treated as the publisher or speaker of content provided by another information content provider. Facebook wasn’t helping Hamas produce content. Accordingly Facebook could not be found liable for hosting it.

Facebook as information content provider

If Section 230’s immunity applies to liability purportedly arising under 18 U.S.C. § 2333, the next logical step for the plaintiffs – and the argument they raised – was to try to disapply the immunity by arguing that Facebook actually helped to produce Hamas’ content.

This is sort of like a situation we’ve often seen in the Star Trek movies (in particular Star Trek II: The Wrath of Khan and Star Trek: Generations). You don’t have to blast your way through the shields if you can trick the enemy into dropping them.

In Section 230 terms, “dropping the shields” means Facebook would be doing the talking or making other affirmative and material acts that “develop” the content in issue. The plaintiffs contended:

Facebook provided “to HAMAS use of Facebook’s data centers, computer servers, storage and communication equipment, as well as a highly-developed and sophisticated algorithm that facilitates HAMAS’s ability to reach and engage an audience it could not otherwise reach as effectively,”

The court understood this to mean:

Plaintiffs contend that Facebook’s algorithms “develop” Hamas’s content by directing such content to users who are most interested in Hamas and its terrorist activities… we have recognized that a defendant will not be considered to have developed third‐party content unless the defendant directly and “materially” contributed to what made the content itself “unlawful.”

But the court was not convinced that this was the case.

Citing the Ninth Circuit’s decision in Kimzey v. Yelp! Inc., the court noted that the “material contribution test” “draws the line at the crucial distinction between, on the one hand, taking actions… to… display… actionable content and, on the other hand, responsibility for what makes the displayed content [itself] illegal or actionable.”

Accordingly, the court held that Facebook, in this instance, was not materially contributing to Hamas’ content: “arranging and distributing third-party information,” the majority opined, “inherently forms ‘connections’ and ‘matches’ among speakers, content, and viewers of content, whether in interactive internet forums or in more traditional media. That is an essential result of publishing.” In other words, Facebook was staying in its lane as an interactive computer service provider.

The court concluded:

Accepting plaintiffs’ argument would eviscerate Section 230(c)(1); a defendant interactive computer service would be ineligible for Section 230(c)(1) immunity by virtue of simply organizing and displaying content exclusively provided by third parties,

and further that

Plaintiffs’ “matchmaking” argument would also deny immunity for the editorial decisions regarding third‐party content that interactive computer services have made since the early days of the Internet [under Section 230(c)(2)].


Litigants have made many attempts, through many different means, to hold interactive computer service providers liable as publishers of third-party content that they host but did not create.

Section 230 means what it says. The court pointed out that courts have properly “invoked the prophylaxis of section 230(c)(1) in connection with a wide variety of causes of action, including housing discrimination, negligence, securities fraud, and cyberstalking.” And now, the Second Circuit has affirmed that this includes liability under terrorism statutes as well.

Moment of zen

Glad to know ConsenSys reads this blog. Will have to feature them more often!