Transaction: 33oaHpraFRBso1zIuDv_pTruFzrGTqEq4MFRV7vK6E0

Hash: 33oaHpraFRBso1zIuDv_pTruFzrGTqEq4MFRV7vK6E0
Block: zaTen6AV-Wcq65WiIbFXn8zSA6f8INQOp5ykCp5N_mC3QXJYLQASKAxMS8hWEEoE
User: CH_52MZm60ewLdc-HGGM1DEk7hljT37Gf45JT5CoHUQ
Fee: 0.000020 AR
Data:

I'm working on a decentralized social media project, [Clear Rain](https://clearrain.xyz), and I'm looking to integrate Arweave soon. I wanted to test out posting on Feedweave, so here's something I wrote with my thoughts on content moderation:

# On Content Moderation

YouTube recently caused outrage when CEO Susan Wojcicki [said](https://www.cnn.com/videos/business/2020/04/19/inside-youtubes-numerous-policy-changes-during-the-pandemic.cnn) that circulating views contradicting traditional authorities like the WHO is against YouTube policy. As Austen Allred [points out](https://twitter.com/Austen/status/1245689629178650624), though, those same authorities have spread plenty of misinformation about the disease, including that there was no evidence of human-to-human transmission, no evidence of airborne transmission, and that masks are ineffective. At the very least, traditional authorities have shown that they are as fallible as everyone else. Unfortunately, YouTube has taken the easy content moderation route, blindly deferring to these officials.

YouTube's content moderation policy has made traditional authorities a central source of truth. There are two fundamental problems with seeing only a single perspective on any issue:

1. There is no perfect perspective.

2. Information can conflict with the self-interest of people and institutions.

No person or institution can be exposed to all relevant information on a subject. Every perspective has inconsistencies and is missing some information. Additionally, it is human nature to ignore information that runs against our self-interest; it's a cognitive blind spot everyone has. If only one perspective is allowed, that perspective effectively holds a monopoly on information in its field, and it can use its preexisting influence to further entrench its power.

Some may argue that the solution is to remove all forms of content moderation, but we're wary of that argument. Consider an academic journal that let anyone publish in it. Some research would be great, but most would not. Everyone would have to spend far more time fact-checking, and those who accept information at face value would end up believing a lot of misinformation.[^1] It's simply not practical: nobody can be expected to verify every piece of information they come across.

Some content moderation is good because it helps us filter information we don't have time to verify ourselves. But these filters become a problem when they have too much power. If Alice, a prestigious polymath, has a monopoly on academic journals, her journals will have much better research, but they will only contain her perspective. If Alice doesn't like a research paper, it won't get published. When she is wrong, she may prevent important information from circulating. Alice is a single point of failure.

YouTube's fundamental mistake is not content moderation; it is making traditional authorities a central source of truth. In doing so, it turned these organizations into a single point of failure.

Many of our information sources today sit at one of two extremes: information either comes from inside traditional institutions or from sites that are open for anyone to publish. We think a better alternative is a middle ground between these extremes: many quality-controlled perspectives. Every perspective carries biases, so ideally everyone should be exposed to many of them instead of being forced to see only one. Content moderation should be decentralized: there should be many independent decisions about what counts as good content, made from many different perspectives.
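As a toy illustration of what "many independent decisions" could look like, here is a sketch of a client that aggregates verdicts from several communities instead of accepting one platform-wide ruling. The data shapes and community names are purely our own assumptions:

```typescript
// Hypothetical sketch: instead of one platform-wide verdict, a client
// gathers independent verdicts from many communities and lets the
// reader weigh them. All names here are illustrative assumptions.
interface Verdict {
  community: string; // who made the call
  approved: boolean; // their independent decision
}

// Summarize which independent perspectives endorsed a piece of content.
function summarize(verdicts: Verdict[]): { endorsedBy: string[]; rejectedBy: string[] } {
  return {
    endorsedBy: verdicts.filter((v) => v.approved).map((v) => v.community),
    rejectedBy: verdicts.filter((v) => !v.approved).map((v) => v.community),
  };
}

// Example: the same post, judged from three perspectives.
const summary = summarize([
  { community: "open-science", approved: true },
  { community: "med-consensus", approved: false },
  { community: "indie-journalists", approved: true },
]);
console.log(summary.endorsedBy); // ["open-science", "indie-journalists"]
```

The point is not the trivial aggregation itself, but that no single verdict is authoritative: the reader sees who approved and who rejected, and weighs the perspectives themselves.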

The internet has allowed anyone to publish content, but it hasn't made it easy for people to work together to do so. There is no general online mechanism that lets people organize in new ways. Publishing platforms are usually either open to all, like Twitter, YouTube, and Medium, or custom solutions with access control, like Hackernoon or the New York Times' website. By organizing together, people can exchange feedback and benefit from each other's reach.[^2]

People should be able to create new publishing communities with access control at the click of a button. Communities will be entirely [community owned and operated](https://a16z.com/2019/03/02/cooperatives-cryptonetworks/). Within one, you might have editors and administrators who control what content is published, or simply who is allowed to publish. Each community will have guidelines specifying what content is allowed. Creators who violate the guidelines risk being kicked out. If administrators violate the guidelines, either they'll get voted out or everyone will eventually switch to a fairer community.
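To make the idea concrete, here is a rough sketch of how a community's roles and access control might be modeled. The role names, fields, and functions are our own illustration, not an existing protocol:

```typescript
// Illustrative sketch only: role names and fields are assumptions,
// not part of any existing protocol.
type Role = "administrator" | "editor" | "creator";

interface Community {
  name: string;
  guidelines: string;         // what content the community allows
  members: Map<string, Role>; // member address -> role
}

// Any member may submit content to the community.
function canSubmit(community: Community, member: string): boolean {
  return community.members.has(member);
}

// Only editors and administrators decide what actually gets published.
function canApprove(community: Community, member: string): boolean {
  const role = community.members.get(member);
  return role === "editor" || role === "administrator";
}

// Creators who violate the guidelines risk being removed.
function removeMember(community: Community, admin: string, target: string): void {
  if (community.members.get(admin) !== "administrator") {
    throw new Error("only administrators can remove members");
  }
  community.members.delete(target);
}
```

Because the membership list and guidelines are plain data, communities stay portable: if administrators abuse their role, members can fork the state into a fairer community.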

If no existing community fits someone's content, or someone believes they can build a better community than the existing ones, they should be able to easily create one. It's not realistic for most people to set up a traditional research journal, but it is realistic to click a few buttons to create a new publishing community and add others to it. In other words, the cost to create, enter, and exit communities should be made as low as possible.

Easily letting people create communities is similar to Reddit, but our communities will be owned by their users, not by a big tech company. We also want to better support longer-form content and video. With the cost of creating a publishing community lowered, the best creators, researchers, and journalists will likely gravitate towards the communities that produce the best content and reward their contributors fairly.

One way we like to think about the multiple-perspective approach to content moderation is to compare information sources to markets. Right now, many insiders have the equivalent of a monopoly within their field: they decide what information is valuable, and they may use this power in their self-interest. Similarly, monopolies control what features their goods have, and they may use their influence to rent-seek. Monopolies are fine while they are productive, but they inhibit innovation if and when they begin to abuse their power. In a competitive market, people naturally gravitate towards the best products at the best prices, and companies that fail to adapt to new technologies or circumstances are eventually left behind. We think a protocol can likewise be designed to make it harder for insiders to abuse their influence, letting people gravitate towards the best information over time. Information sources that fail to adapt to new research should eventually be left behind, and the best creators, researchers, and journalists will move to the best communities.

Such a system may solve the single point of failure typical in content moderation today. With many more organized perspectives available, sites like YouTube could incorporate them into their moderation policies. Or better yet, such a protocol could help people move off big tech platforms altogether.

Ultimately, these are just our predictions as to how content can be better moderated online. We admit that just because this may make sense in theory does not mean it will work in practice. It's time to build.

[^1]: According to the skeptical school of philosophy, it's also [impossible](https://en.wikipedia.org/wiki/Münchhausen_trilemma) to verify everything.

[^2]: [Bundling](https://cdixon.org/2012/07/08/how-bundling-benefits-sellers-and-buyers) creators together can also be profitable!

Tags:
App-Name:FEEDweave
App-Version:0.0.1
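For reference, a post like this one could be written to Arweave with [arweave-js](https://github.com/ArweaveTeam/arweave-js), tagging the transaction so clients such as Feedweave can discover it. This is a minimal sketch assuming a funded wallet keyfile; the tag values mirror the ones above:

```typescript
import Arweave from "arweave";
import { JWKInterface } from "arweave/node/lib/wallet";

const arweave = Arweave.init({
  host: "arweave.net",
  port: 443,
  protocol: "https",
});

// Sketch: `wallet` is a JWK keyfile object loaded by the caller.
async function publishPost(wallet: JWKInterface, body: string): Promise<string> {
  // Create a data transaction holding the markdown body of the post.
  const tx = await arweave.createTransaction({ data: body }, wallet);

  // Tag the transaction so Feedweave-style clients can query for it.
  tx.addTag("App-Name", "FEEDweave");
  tx.addTag("App-Version", "0.0.1");

  await arweave.transactions.sign(tx, wallet);
  await arweave.transactions.post(tx);
  return tx.id; // the transaction hash shown at the top of this page
}
```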