‘Decentralized Twitter’ Bluesky releases code, outlines content moderation

Bluesky outlined that it is “not possible to have a usable social network without moderation,” and intends to build a moderation model for the network dubbed “speech and reach.”

Twitter’s decentralized social media project Bluesky has released its first batch of code, and also committed to content moderation on the network.

Former Twitter CEO Jack Dorsey first announced Bluesky in late 2019. The project became independent in February after receiving $13 million in funding from Twitter, though Dorsey remains a member of its board.

Bluesky released its “Authenticated Data eXperiment” (ADX) protocol code via a May 5 blog post titled “Working in Public.” The team noted that it will be publicly sharing the process of developing the platform by “releasing work before it’s complete, but also giving ourselves time to workshop new directions at early stages.”

Developers are now free to experiment with the network architecture, but Bluesky notes that it is still very basic, and that “things are missing, and things are going to change.”

At this stage, the ADX protocol uses “self-authenticating data,” a model that allows operations on the network to be certified independently, without a centralized host or authority. The general idea is that users control their own data and can move it from platform to platform without permission, an idea central to the Web3 movement.

“Self-authenticating data moves authority to the user and therefore preserves the liveness of data across every hosting service.”
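To make the idea concrete, here is a minimal sketch of a self-authenticating record. The record shape, function names, and use of Ed25519 are illustrative assumptions, not ADX's actual data model: the record is content-addressed by its hash and signed by the author's key, so any host can serve it and any reader can verify it without trusting that host.

```python
# Illustrative sketch only; not ADX's actual record format.
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey, Ed25519PublicKey


def make_record(author_key: Ed25519PrivateKey, content: dict) -> dict:
    """Create a content-addressed, author-signed record."""
    body = json.dumps(content, sort_keys=True).encode()
    return {
        "cid": hashlib.sha256(body).hexdigest(),  # content address, derived from the data itself
        "content": content,
        "sig": author_key.sign(body).hex(),       # authorship proof, a signature over the same bytes
    }


def verify_record(author_pub: Ed25519PublicKey, record: dict) -> bool:
    """Verify a record using only the author's public key; no trusted host required."""
    body = json.dumps(record["content"], sort_keys=True).encode()
    if hashlib.sha256(body).hexdigest() != record["cid"]:
        return False
    try:
        author_pub.verify(bytes.fromhex(record["sig"]), body)
        return True
    except InvalidSignature:
        return False


key = Ed25519PrivateKey.generate()
record = make_record(key, {"text": "a portable post"})
assert verify_record(key.public_key(), record)  # valid no matter which service hosts the record
```

Because verification depends only on the author's key and the content hash, the record stays valid wherever it is rehosted, which is what preserving "the liveness of data across every hosting service" gets at.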

Content moderation

While the platform will be decentralized, Bluesky outlined that it is “not possible to have a usable social network without moderation,” and intends to build a moderation model for the network dubbed “speech and reach.”

It continued: “Moderation occurs in multiple layers through the system, including in aggregation algorithms, thresholds based on reputation, and end-user choice. There's no one company that can decide what gets published; instead there is a marketplace of companies deciding what to carry to their audiences.”
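As a purely hypothetical illustration of that layered model (none of the names, labels, or thresholds below come from Bluesky), the sketch treats publication as a given, the "speech" side, while each layer independently decides whether a post is surfaced, the "reach" side.

```python
# Hypothetical sketch of layered "speech and reach" moderation; not Bluesky's design.
from dataclasses import dataclass, field


@dataclass
class Post:
    author_reputation: float                    # 0.0-1.0, supplied by some reputation layer
    labels: set = field(default_factory=set)    # e.g. {"spam"}, attached by indexers/aggregators


def aggregator_allows(post: Post, min_reputation: float = 0.2) -> bool:
    # Layer 1: an aggregation/indexing service applies its own thresholds.
    return post.author_reputation >= min_reputation and "spam" not in post.labels


def user_allows(post: Post, muted_labels: set) -> bool:
    # Layer 2: the end user applies personal preferences on top.
    return not (post.labels & muted_labels)


def has_reach(post: Post, muted_labels: set) -> bool:
    # The post is surfaced only if every layer lets it through; nothing is deleted here.
    return aggregator_allows(post) and user_allows(post, muted_labels)


print(has_reach(Post(0.9, {"nsfw"}), muted_labels={"nsfw"}))  # False: this user limits its reach
print(has_reach(Post(0.9, {"nsfw"}), muted_labels=set()))     # True: another audience may still see it
```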

Bluesky also outlined that hosting providers will still be legally required to “remove illegal content according to their local laws.”


Following reports that Elon Musk was set to take over Twitter for $44 billion, Bluesky said on Twitter on April 26 that the deal would have no bearing on the project.

While Twitter did back Bluesky with $13 million, the project noted that the only condition of the deal is that the team must “research and develop technologies that enable open and decentralized public conversation.”


Coinbase’s Philosophy on Account Removal and Content Moderation

By Brian Armstrong

In the last few years, it’s become increasingly common for tech companies to censor customers or close their accounts for a range of reasons (e.g., misinformation). Luckily, as a crypto business we don’t face this issue as frequently as a social network does, but we still need to set clear policies around acceptable use of our products. As our product suite grows, it will even include products that host user-generated content, like NFTs.

Our high level philosophy is that, in a democratic society, the people and their elected officials should decide what behavior is allowed and not allowed by setting laws. We think it sets a dangerous precedent when tech companies, such as Coinbase, or their executives start making judgment calls on difficult societal issues, acting as judge and jury. This approach sounds simple in theory, but in practice it is anything but.

First, it can be very complex to determine whether an activity is legal or illegal. Laws vary greatly across different countries, states, and regions. Some activities are legal only if you have a license. Some activity is in a gray area. Some unjust laws go unenforced. Like most companies, we refer suspected illegal activity to the relevant authorities, but we can’t expect to receive a timely response or opinion back from them given the many demands on their resources. This puts us, along with most companies, in the unfortunate position of having to make our own determinations about what activity is legal or illegal.

Second, even if some activity is legal, it may be something that is deeply troubling to have on the platform. The world is littered with polarizing, uncomfortable, or obscene content that may still be legal. This is where companies start to exercise even more judgment on what they allow. But there is great danger of falling down a slippery slope, having to render decisions on every difficult societal issue, where you are sure to upset someone no matter where you land. Without a strong, principles-based approach, these decisions become arbitrary and capricious, opening the company to attack.

Finally, every company works with other companies that have their own set of moderation and deplatforming policies. For instance, for any app to be listed in the Apple and Google App Stores, it needs to play by the rules of those two companies. In the financial services world, we also work with banks and payment processors who have their own acceptable use policies. Very few companies are completely vertically integrated, with the luxury of making their own decisions in a vacuum.

So how should a company implement a reasonable approach based on the above constraints? We’ve come up with our own answer, and I want to share it here so our customers can understand it, and in case it helps other companies.

Our approach

First, it’s important to differentiate our approach based on the type of product. Coinbase has a broad product suite, but for moderation purposes we group our products as either infrastructure products or public-facing products. Infrastructure products enable access to basic financial services and are typically used privately by a single customer, while public-facing products often host user-generated content and have social features visible to large numbers of users. Ben Thompson’s article on moderation in infrastructure illustrates how companies typically take a different approach for each of these products.

For our infrastructure products, we use rule of law as the foundation of our approach, because we believe that governments, not companies, should be deciding what is allowed in society. We also believe that everyone deserves access to financial services, and a test of legality should be sufficient for these products.

For our public-facing products, we again start with rule of law as the foundation. But assuming something is legal in a certain jurisdiction, we also go beyond this and moderate content that is not protected speech under the First Amendment. We’re not legally held to the First Amendment as a company, and the First Amendment is a U.S.-specific concept, but we’ve chosen to use it as the guiding principle of our content moderation approach because it is in line with our values and helps ensure we don’t fall down a slippery slope over time. The First Amendment has hundreds of years of case law built up, and provides a reasonable framework to moderate content such as incitement, fighting words, libel, fraud, defamation, etc. David Sacks does a great job describing this approach in this blog post.

Finally, there are cases where we want to work with external partners, such as the App Stores, and need to follow their moderation policies to do so. Sometimes third party payment providers have their own policies. For payment providers, we can simply disable functionality related to that partner if there is a problem with a specific user, while continuing to offer Coinbase services. But getting kicked out of the app stores wouldn’t help anyone. So when working with partners, our approach is to be free speech supporters, but not free speech martyrs, and to make accommodations if it is essential for us to function as a business.

This is obviously a complex issue, and hopefully the above approach starts to show a path through it that doesn’t devolve into arbitrary and capricious decision making. To boil down the above approach, we ask the following questions for our public-facing products:

1. Is the content illegal in a jurisdiction in which we operate?

A. If yes, then remove in that specific jurisdiction

2. Is the content a free speech exception under the First Amendment?

A. If yes, then remove globally

3. Has a critical partner required us to remove the content?

A. If yes, then remove the content or disable the functionality of that partner for the affected user

If the answer to any of these three questions is “yes,” we will take some moderation action, such as taking down content or, in severe cases, terminating the account.
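A minimal sketch of that three-question test follows; the function names, inputs, and action labels are hypothetical illustrations, since the post does not describe Coinbase's actual systems.

```python
# Illustrative sketch of the three-question moderation test; not Coinbase's real implementation.
from enum import Enum


class Action(Enum):
    NONE = "no action"
    REMOVE_LOCALLY = "remove in that jurisdiction"
    REMOVE_GLOBALLY = "remove globally"
    DISABLE_PARTNER = "remove content or disable the partner's functionality for the user"


def moderate(illegal_jurisdictions: set,
             first_amendment_exception: bool,
             partner_requires_removal: bool) -> list:
    actions = []
    # 1. Illegal in a jurisdiction in which we operate? Remove there only.
    if illegal_jurisdictions:
        actions.append(Action.REMOVE_LOCALLY)
    # 2. A free speech exception under the First Amendment (incitement, fraud, etc.)? Remove globally.
    if first_amendment_exception:
        actions.append(Action.REMOVE_GLOBALLY)
    # 3. A critical partner requires removal? Remove it or disable that partner's functionality.
    if partner_requires_removal:
        actions.append(Action.DISABLE_PARTNER)
    return actions or [Action.NONE]


print(moderate(illegal_jurisdictions=set(),
               first_amendment_exception=True,
               partner_requires_removal=False))
# [<Action.REMOVE_GLOBALLY: 'remove globally'>]
```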

Decentralization is the ultimate customer protection

Most of this post has been about how we can create a reasonable moderation policy that doesn’t get co-opted over time, succumb to pressure, or descend into us playing judge and jury. This is important so that Coinbase is able to stand up to pressure. Of course, the decentralized nature of cryptocurrency offers its own important protections here, and those protections get stronger the more our products decentralize.

If our policy above fails, and Coinbase starts making bad judgment calls or turns evil, customers can withdraw their crypto to any other competing exchange, wallet, or custodian. Compare this to social networks today, where you can’t take your followers with you. Your data is owned by one company, in a proprietary format. The open nature of crypto protocols provides lower switching costs, which is an important customer protection, even for relatively centralized crypto products. But decentralized, or self-custodial, crypto products have an even greater protection because the company is simply providing access to something running on-chain. For instance, no one can deplatform your ENS name without taking every ENS name offline. Decentralization moves you from the slippery slope to the crypto cliff, where the would-be censor must compromise an entire blockchain to censor just one person.

Decentralization is a spectrum, and Coinbase is moving farther down this path over time: embracing self-custody with Coinbase Wallet, stepping up user education around private keys, and investing in Bitcoin core development and web3 protocols. The more decentralization we can support, the better protection customers will have.

Conclusion

We believe everyone deserves access to financial services, and that companies should put appropriate controls in place to prevent censorship or unjust account closures from taking place. For centralized financial infrastructure products, we believe rule of law is a sufficient standard for moderation, while for decentralized products even greater protections can be provided by the blockchain. We also acknowledge that public-facing products deserve some additional consideration, and that the First Amendment can be used as a reasonable test or boundary. We believe this approach is consistent with our mission of creating more economic freedom in the world and with the ethos of crypto.

Companies are in a difficult position when they choose to censor or terminate a customer account. What often seems like an easy decision, especially under public pressure, turns out to have larger unintended consequences and sets a dangerous precedent for the role of private companies in society. I’m sure we won’t get it perfect with our policy above, but my hope is that we’ve laid out some principles we can fall back on when difficult decisions arise, and that investors, customers, and employees can have a better understanding of our process.
