Content Moderation

Centralized Moderation

Content moderation is a critical topic when building a decentralized social network, and it is probably the topic we have spent the most time on outside of engineering design.

First, because all of the data on DeSo is open, an ecosystem around moderation can develop that is more robust than what can be achieved with a traditional company.

For example, because the data is open, the best machine learning researchers at the world's top academic institutions can build APIs that label all of the content on the blockchain, something they cannot do with today's closed platforms, and those labels can then be consumed by any node operator that wants to remain compliant.

This would create an economy of scale around moderation that we believe can be more robust than what's possible within the confines of a single corporate entity.
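
To make this concrete, here is a minimal Go sketch of how a node operator could consume such a labeling API before serving a post. The endpoint, response shape, and score threshold are all hypothetical assumptions for illustration; they are not part of the DeSo protocol or any existing service.

```go
// Hypothetical sketch: a node operator consulting a third-party content-labeling
// service before serving a post. The endpoint, response shape, and threshold
// are assumptions for illustration -- not part of the DeSo protocol.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// ContentLabel is an assumed response shape from a hypothetical labeling service.
type ContentLabel struct {
	PostHash string  `json:"post_hash"`
	Category string  `json:"category"` // e.g. "spam", "hate", "ok"
	Score    float64 `json:"score"`    // confidence in [0, 1]
}

// shouldServe asks the labeling service whether a post is safe to display.
// Errors are handled conservatively by hiding the post.
func shouldServe(labelAPI, postHash string) bool {
	resp, err := http.Get(fmt.Sprintf("%s/labels/%s", labelAPI, postHash))
	if err != nil {
		return false // fail closed: hide the post if the service is unreachable
	}
	defer resp.Body.Close()

	var label ContentLabel
	if err := json.NewDecoder(resp.Body).Decode(&label); err != nil {
		return false
	}
	// Hide anything flagged with high confidence in a harmful category.
	return label.Category == "ok" || label.Score < 0.9
}

func main() {
	// Example: check a single post hash before including it in a feed.
	if shouldServe("https://labels.example.com", "abc123") {
		fmt.Println("post can be shown")
	} else {
		fmt.Println("post is filtered out")
	}
}
```

The sketch fails closed, hiding a post whenever the labeling service cannot be reached, which is one conservative policy a compliance-minded operator might choose.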

All of the data being open also allows the federal government to better analyze the spread of misinformation, and to be more involved in preventing it, than it can be when all of the content people see is locked up in a corporate walled garden.

At a high level, we start by considering a spectrum of how decentralized the internet can be.

Right now we are on the very “centralized” side of the spectrum, where small moderation teams at a few companies control the vast majority of public discourse.

We think this is too far on one end of the spectrum, but we also think that the opposite end, where there is total anarchy with regard to content, is even worse.


Decentralized Moderation

DeSo sits in the middle of the above spectrum. It leverages the same moderation scheme that governed the pre-Facebook internet, which we think deters harmful content without stifling innovation and competition.

Any website that displays harmful content is subject to both federal and civil litigation, whether its content comes from a blockchain or from a USB drive.

This is what prevents harmful content from seeing the light of day on the internet today, even though there are many people who could theoretically serve it.

It's also largely how the pre-Facebook internet was kept in check, and it's the same mechanism that prevents nodes on the DeSo network from serving harmful content.

For example, Diamond, one of the main applications built on DeSo, exposes only a subset of the posts on the blockchain.

Diamond filters the blockchain's content so that it does not show posts that are harmful or illegal. Every node that runs on top of the DeSo blockchain, including apps like Diamond, Flick, or CloutFeed, can expose whatever subset of posts it wants.
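
As a rough illustration, the Go sketch below shows what this kind of node-level filtering could look like, assuming a node keeps a simple local denylist of post hashes. The Post type and the denylist policy are hypothetical; each node decides its own storage and moderation rules.

```go
// Minimal sketch of node-level filtering, assuming a node keeps a local
// denylist of post hashes it refuses to serve. The Post type and denylist
// are illustrative; real nodes choose their own storage and policy.
package main

import "fmt"

type Post struct {
	Hash string
	Body string
}

// filterPosts returns only the posts a node chooses to expose, dropping
// anything on its local denylist. The blockchain itself is unchanged.
func filterPosts(all []Post, denylist map[string]bool) []Post {
	visible := make([]Post, 0, len(all))
	for _, p := range all {
		if !denylist[p.Hash] {
			visible = append(visible, p)
		}
	}
	return visible
}

func main() {
	posts := []Post{{Hash: "a1", Body: "hello"}, {Hash: "b2", Body: "spam"}}
	denied := map[string]bool{"b2": true}
	for _, p := range filterPosts(posts, denied) {
		fmt.Println(p.Hash, p.Body)
	}
}
```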

That said, a node that shows illegal or harmful content would not only expose itself to copious amounts of litigation, it would also likely drive away its users.

That content will still technically be on the blockchain, but it won't be practically accessible.
