Elon Musk’s (apparently successful) bid to acquire Twitter has resurrected longstanding discussions in the cryptoverse regarding, at least to date, a largely theoretical product category: “decentralized social media.”
Just as Bitcoin is censorship-resistant money, the theory goes, so too can we use Bitcoin-like infrastructure to run censorship-resistant social media applications! Technically, a proof of concept at least is certainly possible. I should know; back in 2014, Casey Kuhlman, Tyler Jackson and I proposed a DAO called “Eris” that was basically a distributed version of Reddit that could run on a blockchain back-end (Ethereum POC 3, to be precise).
We built this in May of 2014 – 8 years ago. Notice the “my DAO” button in the upper right-hand corner? At the time people thought we were completely insane.
That prototype went nowhere, but this all happened in 2014, a time when the market couldn’t tell the difference between a smart contract and a Pop-Tart and “DAO” was mainly something discussed among adherents of Confucianism. Today, a number of new entrants are having a crack at the same problem. Given my longevity in the Bitcoin/blockchain arena, I confess it is tempting to slap together a pitch deck and raise $20 million pre-seed, pre-product to build the damn thing, given how much venture money is currently sloshing around. Fortunately for everyone, after my last startup I swore an oath never to attempt to develop or sell software again, so I will remain in my law office where I belong.
Designing a prototype, as we did, is admittedly a lot easier than designing something people actually want to use. Even on easier “web 2” tech, there are thousands of social media apps, yet only a handful are consequential. Creating a social media app is trivially easy, but running a successful social media business is extraordinarily hard.
Prior attempts at “decentralization” have fared poorly. The most successful attempt so far, Mastodon, is a federated service, albeit an imperfect one where individual instances do not scale well (as Donald Trump’s company, Truth Social, discovered when they forked Mastodon to try to shortcut their way to social media stardom, only to find Mastodon’s back-end couldn’t handle their traffic).
By the same token (pun intended), dumping every communication onto a blockchain and storing everything in the clear, as Bitclout does, is easy, but completely non-scalable. Facebook does not require agreement on global state and allows people to delete their data; it also generates over 4 petabytes of data per day, a volume no globally replicated ledger could hope to absorb. Any system that tried to ape Bitcoin (like Bitclout) would quickly be relegated to a handful of nodes running in data centers, much as Ethereum has been.
There are legal problems as well. Social media companies, as it turns out, are subject to a bevy of regulations. In the United States these rules govern, among other things, the destruction and reporting of illegal content, copyright, data protection, and the mandatory disclosure of subscriber records; with the exception of data privacy they are generally uniform across the country, and they vary country-by-country elsewhere. All of these factors need to be accounted for in any “decentralized” social media application’s design.
The problem of unlawful material has long been identified by lawyers looking at decentralized storage solutions as a major obstacle to adoption of these services.
In the United States and across the world, the most uniformly illegal content in existence is child sexual abuse material, or CSAM, as it is referred to by law enforcement. Despite the fact that the penalties for knowingly hosting this material are extreme, ranging from heavy fines to lengthy terms of imprisonment, the crypto industry’s response to this very longstanding Internet problem has more or less been to completely ignore it.
Web2 applications which host user-generated content, such as Reddit, Twitter or Facebook, take a very proactive approach to this type of illegal material. Federal law requires “providers” – a term which means “an electronic communication service provider,” which likely would be understood by a court to describe both blockchain node operators as well as traditional, centralized service providers – to remove CSAM on discovery, securely preserve it for 90 days pending receipt of legal process, and then securely destroy it. Facebook and others use a wide range of software, including Microsoft’s PhotoDNA, to detect, remove, and report CSAM automatically.
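For the curious, the basic shape of automated hash-matching is simple, even though production systems like PhotoDNA use perceptual hashes that survive resizing and re-encoding rather than the plain SHA-256 stand-in used in this toy Python sketch. The hash list here is a placeholder of my own invention; real lists are distributed by clearinghouses such as NCMEC:

```python
import hashlib

# Placeholder hash list; in practice this would be loaded from a
# clearinghouse such as NCMEC, and the hashes would be perceptual
# (PhotoDNA-style), not plain SHA-256.
KNOWN_ILLEGAL_HASHES = {
    hashlib.sha256(b"known bad file").hexdigest(),  # illustrative entry
}

def should_block(content: bytes) -> bool:
    """Return True if uploaded content matches an entry on the hash list."""
    digest = hashlib.sha256(content).hexdigest()
    return digest in KNOWN_ILLEGAL_HASHES
```

A node operator in a decentralized network could, in principle, run the same check before agreeing to host or relay a file.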
Overseas, where there is no such thing as the First Amendment, even broader categories of “unlawful content” exist. See e.g. the German Netzwerkdurchsetzungsgesetz, or “NetzDG”, which requires operators of social media services to register with the government and, after reaching a certain scale, to abide by takedown requests; the French Law no. 2020-766 against hate-content on the Internet, which imposes fines for failing to remove unlawful content, including “terrorist” content, within one hour of posting; or Section 5 of the Defamation Act 2013 in the United Kingdom, which has a notice-and-takedown procedure for alleged defamation similar to the U.S. DMCA.
Where services like Reddit and Facebook are very responsive to all of the above requests and requirements, many blockchain-based services, like StorJ or Sia, to my knowledge, have no such controls (or only very limited controls).** They permit the storage of encrypted data without the creation of a subscriber record and without the service provider – in this case, the node operator – having any means to ascertain what data is being stored or to assess the legality of storing it.
It is probable, and I would suggest even likely, that decentralized data storage services are currently being used to host unlawful content, likely without the knowledge of the node operator hosting it. This level of willful blindness would be a complete non-starter for a “decentralized” social media app, which must be designed in such a way that an otherwise law abiding user can participate in the network while being secure in the knowledge he or she is not violating local law. So far, no blockchain solution with a storage component even attempts to address this issue. It must be addressed in any design that hopes to be successful. Nobody will run a node for a decentralized service if doing so risks imprisonment.
Similarly, our intellectual property regime is not well suited to decentralized use.
Social media node operators – being entities “offering the transmission, routing or providing of connections for digital online communications… of material of the user’s choosing, without modification to the content of the material as sent or received” – are “service providers” for the purposes of the Digital Millennium Copyright Act and publishers within the meaning of the Copyright Act. They will therefore need to consider both (a) defensively, the need to register with the Copyright Office to avail themselves of the DMCA’s safe harbor protections, and (b) their own exposure for hosting material which might give rise to a copyright infringement claim.
At minimum, addressing this issue might require a decentralized implementation of the DMCA’s notice-and-takedown procedure for any third party content hosted on a node (which will involve node operators needing to dox themselves with the Copyright Office if they want to benefit from this protection). Worse, we could see copyright trolls, newly emboldened by the enormous increase in possible unsophisticated defendants, ravaging node operators in repeated bad-faith attempts to extort small dollar settlements. In the alternative, the application could be designed so that users don’t host images or video – being the types of copyrightable subject matter which is most often used by vexatious copyright enforcement law firms – at all.
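A minimal sketch of what a node-side notice-and-takedown ledger might look like, in Python. The statutory timing rules (the counter-notice waiting period before restoration, the notice formalities of § 512(c)(3)) are deliberately omitted, and every name here is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class HostedItem:
    content_id: str
    status: str = "live"  # live -> taken_down -> restored

class TakedownLedger:
    """Toy per-node record of DMCA notice/counter-notice state."""

    def __init__(self):
        self.items: dict[str, HostedItem] = {}

    def publish(self, content_id: str) -> None:
        self.items[content_id] = HostedItem(content_id)

    def receive_notice(self, content_id: str) -> None:
        # On a valid takedown notice, the node removes the material promptly.
        item = self.items.get(content_id)
        if item and item.status == "live":
            item.status = "taken_down"

    def receive_counter_notice(self, content_id: str) -> None:
        # After a counter-notice (and the statutory waiting period,
        # not modeled here), the material may be restored.
        item = self.items.get(content_id)
        if item and item.status == "taken_down":
            item.status = "restored"
```

The hard part, of course, is not the code but the fact that each node operator running it must identify themselves to benefit from the safe harbor.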
It is difficult to speculate what kind of infringements and enforcement one might encounter in a communications medium which does not yet exist. Judging from what we see in Web 2, however, the presence of copyright trolls in Web3 is a virtual certainty as soon as it becomes profitable for them to be there.
Data protection and disclosure
A further issue arises when we consider that a person participating in a decentralized network may, in the course of operating his or her node, acquire large quantities of subscriber data.
Let us suppose, for the sake of argument, that a decentralized social media system is built in which the network allows a user to download the user profiles and posts of everyone who is two degrees remote from them. If I follow @A16Z and @marmotrecovery follows me, @A16Z would then be permitted to download and store my information and posts, as well as those of everyone who follows me, including @marmotrecovery. Judging from the sheer number of users @A16Z follows (half a million), it is safe to say that if A16Z ran a node on this hypothetical network it could be a “service provider” under the California Consumer Privacy Act or other local law and would likely be required to implement a compliance program.
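To make the hypothetical concrete: the “two degrees remote” rule is just a bounded breadth-first search over the follow graph. This Python toy (the handles and the followers-map representation are assumptions of mine) computes the set of users whose data a given node would be permitted to hold:

```python
from collections import deque

def visible_users(followers: dict[str, set[str]],
                  me: str, max_hops: int = 2) -> set[str]:
    """Users within max_hops follow-hops of `me`, excluding `me` itself.

    `followers` maps each user to the set of accounts that follow them.
    """
    seen, frontier = {me}, deque([(me, 0)])
    while frontier:
        user, hops = frontier.popleft()
        if hops == max_hops:
            continue  # don't expand past the visibility horizon
        for other in followers.get(user, ()):
            if other not in seen:
                seen.add(other)
                frontier.append((other, hops + 1))
    return seen - {me}
```

Run against a graph where @me follows @A16Z and @marmotrecovery follows @me, @A16Z’s visible set includes both – and for an account with half a million followers, the two-hop set balloons into the millions, which is exactly the compliance problem.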
By the same token, node operators may also become “providers of an electronic communication service” for the purposes of the Stored Communications Act (18 U.S.C. § 2701 et seq.) and therefore may be required to hand over records on their computers to the government without the government needing to obtain a warrant first – at least, to the extent that those records pertain to third parties which are within a node operator’s possession and control. Users are unlikely to want to run a network that invites this degree of intrusion into their personal lives. Applications will need to be designed so that they hold as little third-party data as possible on their nodes.
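One way to sketch that design constraint, assuming nothing about any particular protocol: cache third-party records with a short time-to-live and prune aggressively, so the node never accumulates a durable subscriber database. A minimal Python version (the TTL value is arbitrary):

```python
import time

class EphemeralCache:
    """Toy cache that forgets third-party records after a TTL."""

    def __init__(self, ttl_seconds: float = 3600.0):
        self.ttl = ttl_seconds
        self._store: dict[str, tuple[float, object]] = {}

    def put(self, key, value, now=None):
        # `now` is injectable for testing; defaults to wall-clock time.
        self._store[key] = (now if now is not None else time.time(), value)

    def get(self, key, now=None):
        now = now if now is not None else time.time()
        entry = self._store.get(key)
        if entry is None or now - entry[0] > self.ttl:
            self._store.pop(key, None)  # prune expired records on access
            return None
        return entry[1]
```

The point is legal rather than technical: a node that provably holds third-party data only transiently is a far less attractive target for compelled disclosure than one sitting on a permanent archive.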
Some rough conclusions on the design of a future decentralized social media network
All of the issues identified above share one factor in common: social media does not require agreement on permanent and immutable global state. To the contrary, social media requires a degree of censorship and deletion. Decentralized tech like Bitcoin is designed in such a way as to render deletion impossible or prohibitively expensive. A decentralized Twitter will not, therefore, look anything like Bitcoin.
The need for content removal and moderation – whether due to criminal liability, civil liability, or simple usability – will be the single most important factor in the design of any decentralized social media system. The irony of the fact that perceived unfairness in content moderation in Web 2.0 is what is driving calls for decentralized social media for Web3 does not escape me. At minimum, the centrality of content moderation to the social media user experience means that simply dumping everything on the blockchain, as Bitclout does, and then replicating it across every single node of the network, as Sam Bankman-Fried appeared to suggest, with onchain pointers to IPFS for everything else, is simply not going to work.
My hunch is that the first truly successful “decentralized” social media system will not try to be an all-singing all-dancing world computer but rather will have the participants replicate the absolute bare minimum viable information required for the network to function. In my mind, when using a social network, the only opinion I ask the social network to render is whether particular content was published by a particular person. I have no interest in practically any other opinion the social network has about the world. The “blockchain” piece, if any, should be relegated to providing a register of usernames and associated public keys, and very little else.
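To illustrate just how little the chain needs to do, here is a toy Python sketch of that single opinion: a username-to-key register plus a signature check. The textbook RSA keypair with tiny primes is for illustration only – a real design would use something like Ed25519 – and every name here is made up:

```python
import hashlib

# Toy textbook RSA keypair (p=61, q=53); utterly insecure, purely
# illustrative of the asymmetric sign/verify relationship.
N, E, D = 3233, 17, 2753

# The only on-chain state: a register of usernames and public keys.
REGISTRY = {"@marmotrecovery": (N, E)}

def _digest(post: bytes, n: int) -> int:
    return int.from_bytes(hashlib.sha256(post).digest(), "big") % n

def sign(post: bytes) -> int:
    # The key holder signs off-chain with the private exponent D.
    return pow(_digest(post, N), D, N)

def published_by(username: str, post: bytes, sig: int) -> bool:
    """The one opinion the network renders: did this user publish this?"""
    n, e = REGISTRY[username]
    return pow(sig, e, n) == _digest(post, n)
```

Everything else – the posts themselves, who sees them, how long they live – can stay off-chain, on hardware the author controls and can wipe.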
The first successful decentralized social media service is also likely to limit the kind of data users host to plaintext, for the most part.
First, hosting only text that you and perhaps a select group of followers wrote is a low-liability proposition from the perspective of criminal, copyright, and data protection law. It is also much lighter on bandwidth and will be easier to transmit peer to peer.
Second, video and image hosting, simply due to the sheer quantity of data involved if for no other reason, will likely be outsourced anyway, much as it is now. There are plenty of third party platforms (Bitchute, Cozy, Odysee, Gab TV) which have lax, but not non-existent, content moderation policies for video content. These could address the gap in the market currently served by establishment outfits like YouTube, as well as removing responsibility from node operators to police that content – something which will be especially useful if copyright trolls are to be kept off of users’ backs. All the decentralized system would need to do to serve that content is not block links to those services (link blocking being a practice that both Facebook and Twitter engage in), or allow users to control what content they see by operating their own whitelist/blacklist of third party content providers (libs could block all the right-leaning sites, and the cons could block all of the lib media, for example). The decentralized system would then become just another source of referral traffic to these websites.
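The whitelist/blacklist piece is client-side and nearly trivial – something like this Python sketch, where the example domains and the suffix-matching rule are my own assumptions:

```python
from urllib.parse import urlparse

def filter_links(links: list[str], blocklist: set[str]) -> list[str]:
    """Drop links whose host, or any parent domain, is on the user's blocklist."""
    kept = []
    for link in links:
        host = urlparse(link).hostname or ""
        parts = host.split(".")
        # "video.example.com" should match a blocklist entry "example.com"
        suffixes = {".".join(parts[i:]) for i in range(len(parts))}
        if not suffixes & blocklist:
            kept.append(link)
    return kept
```

Because the filter runs on the user’s own machine, it is moderation without censorship: my blocklist affects what I see, not what you publish.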
I could be wrong, of course. Some wunderkind somewhere could, as we speak, be writing a 6,000-word-long blog post on a “Zk-Dork proof of shark sharding” social media proposal to be built on some all-singing, all-dancing, Ethereum-like Rube Goldberg machine which promises to solve all scaling problems by ConsenSys simply running the entire thing on AWS magic. My hunch, however, is that for this problem, simpler answers are more likely to be the right ones. “Decentralized social media” is likely to be more like RSS than Ethereum.
Whilst this sketch describes an imperfect solution to the censorship debate, an imperfect solution might nonetheless be a sufficient one. Most of the politically motivated “censorship” which occurs on Twitter and Facebook is not of images and videos, but of links to third party websites, the plain-text expression of wrongthink, and of digital identities themselves (see e.g. the unpersoning of Alex Jones).
An effective “decentralized” solution to the social media censorship problem likely needs to ensure only that text, links, and identity are uncensorable – the text and links by being self-hosted, and the identity by being ineradicable. If we frame the problem to address that limited set of issues I think a usable version of decentralized Twitter with a half-decent UX is achievable in the very near future.
* A lawyer friend asks: “Wouldn’t someone who wants something like deTwitter have the design goal of undermining censorship laws by making the network keep running despite the fact that it stores illegal content?”
It depends on what you’re trying to design for. A network that allows all lawful speech will have the exact same design characteristics, in terms of censorship-resistance against third parties, as one which allows all unlawful speech. A user should not be able to shut down any other user.
However, censorship resistance against third parties does not require censorship resistance against yourself. This is where a decentralized social media solution will differ most sharply from systems like Eth and Bitcoin, where censorship-resistance against the world includes censorship-resistance against yourself (you cannot erase your own transactions). Users will need to be more or less absolute dictators over their own hardware and their own speech, consistent with the First Amendment and the legal obligations of anyone who hooks a server up to the public Internet. If a user chooses to host illegal content, law enforcement should be able to take down that user without taking down the network as a whole. This will allow high-value, constitutionally protected speech to flourish network-wide by being hosted from places like the United States, while allowing, for example, threats of violence and other zero-value speech to be responded to by law enforcement.
While governments can hold people accountable for their speech in such a system, they will not be able to “unperson” someone from it, either through the use of legal process or by applying unofficial pressure on private businesses – the type of pressure, I suspect, which was behind the blanket bans of right-wing figures like Alex Jones or Milo Yiannopoulos from practically every mainstream tech offering, bans which, for those of us who remember, were implemented practically internet-wide in the space of 24-48 hours across dozens of firms. This is why the only truly ineradicable component of the system will be decentralized identity – as far as I can tell, there is nothing illegal about having a copy of an address book, even if some of the addresses belong to bad actors.
** After publishing this post a reader pointed out that decentralized blockchain service Sia has, in fact, begun introducing such controls, although it appears to be in a limited fashion. The controls do not attempt to tame the entire decentralized protocol but rather split the protocol into two parts – a paid service (SiaPro) and an unregulated, free service (SiaSky) utilizing separate domains, with the paid service playing by the rules and the unregulated service remaining, well, unregulated. See this post from David Vorick on Sia’s approach.