Does Facebook have a more decentralized solution to the issue of content censorship sparked by Trump?

On May 29, U.S. President Trump signed an executive order that seeks to hold social media platforms, including Twitter, Facebook, Instagram, and YouTube, accountable for their content moderation decisions. How will the social media giants, which have long acted as "arbiters of speech," respond? Facebook's Oversight Board looks like one feasible answer.

Social Media Controls Your Soul

“Banks control your money; social media controls your soul.”

Content moderation on social media platforms has become a pressing concern in an age dominated by the internet and social media. According to reports, on May 29 U.S. President Trump signed an executive order intended to hold platforms including Twitter, Facebook, Instagram, and YouTube accountable for their content moderation actions. The executive order states:

“Twitter, Facebook, Instagram, and YouTube wield immense, if not unprecedented, power to shape the interpretation of public events; to censor, delete, or disappear information; and to control what people see or do not see.”

Indeed, social media platforms wield significant power as "arbiters of speech," deciding what counts as acceptable. But is stripping these giants of their moderation power really a good thing? Take Facebook as an example: the company employs thousands of content moderators around the world to screen posts. These moderators must meticulously review graphic content involving pornography, violence, racism, and prejudice, and many suffer severe psychological trauma as a result. Most people would not want such harmful content to be pervasive in their daily lives.

Facebook Launches Independent Oversight Board

If there is a way to weaken social media's centralized moderation power while still letting the platform ecosystem filter out harmful content, Facebook's Oversight Board is one promising approach. The Oversight Board is an independent third-party body, dubbed Facebook's "Supreme Court." It operates independently of Facebook and consists of a minimum of 11 and up to 40 members from around the world, each serving a three-year term and having no direct affiliation with Facebook.

The first 20 Oversight Board members, announced on May 7 this year, include Professor Chen Yining of the Department of Advertising at National Chengchi University.

The board members not only provide expertise in areas such as freedom of speech, safety, privacy, and digital content governance but also review appeals that challenge content decisions on Facebook and Instagram. The Oversight Board's rulings can even overturn Facebook's original decisions.

In fact, the concept behind the Oversight Board resembles that of a Decentralized Autonomous Organization (DAO) in the blockchain field. Many blockchain and decentralized-application projects use a DAO to set project direction and keep their ecosystems running; examples include the stablecoin project Maker, the decentralized court Aragon, the recently launched lending project Compound, and many public chain systems. This approach mitigates the problem of unilateral authority: all decisions are made collectively by community members, achieving a "bottom-up" governance model.

Taiwan's Digital Minister Audrey Tang, in an earlier interview with the overseas blockchain media outlet Decrypt, discussed the prospect of using decentralized governance to drive innovation in Taiwan's public administration technology. Tang stated:

“Ledgers (Tang used the term 'ledgers' instead of 'blockchain' in the interview) have great potential. They are a low-cost solution to establish 'accountability' and 'legitimacy' between departments.”

Decentralized Autonomous Organization (DAO)

Facebook's Oversight Board can be seen as a "permissioned" decentralized autonomous organization: one must be granted permission to become a member. While this approach may not achieve absolute fairness, it is far easier to implement than a "permissionless" DAO, in which anyone in the world can join and malicious actors are therefore hard to screen out. A permissionless DAO must instead design an additional consensus mechanism that aligns participants' interests with the healthy development of the ecosystem, so that the goals of the members and of the organization coincide.

For this reason, most decentralized autonomous organizations issue governance tokens to economically incentivize community members to make sound decisions (much like a shareholders' meeting). Take the stablecoin project Maker as an example: holders of its governance token, MKR, have the power to modify protocol parameters such as interest rates and collateral types, and bear the responsibility of setting those parameters correctly so that the Maker protocol keeps operating properly.
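
To make the idea of token-weighted governance concrete, here is a minimal sketch in Python. It is not Maker's actual on-chain implementation; the class name, option labels, and balances below are illustrative assumptions, and real governance runs in smart contracts with locked tokens rather than a simple tally.

```python
# Minimal sketch of token-weighted governance voting, in the spirit of
# MKR governance. NOT Maker's real implementation; all names, options,
# and balances here are hypothetical.

from collections import defaultdict

class GovernancePoll:
    def __init__(self, options, balances):
        self.options = set(options)     # candidate parameter values
        self.balances = balances        # token holdings per address
        self.votes = {}                 # address -> chosen option

    def vote(self, voter, option):
        if option not in self.options:
            raise ValueError(f"unknown option: {option}")
        self.votes[voter] = option      # voting power follows token balance

    def tally(self):
        weight = defaultdict(float)
        for voter, option in self.votes.items():
            weight[option] += self.balances.get(voter, 0.0)
        # the option backed by the most tokens wins
        return max(weight, key=weight.get) if weight else None

# Example: three hypothetical MKR holders vote on a stability-fee value.
balances = {"alice": 1200.0, "bob": 300.0, "carol": 500.0}
poll = GovernancePoll(options=["fee=4%", "fee=8%"], balances=balances)
poll.vote("alice", "fee=8%")
poll.vote("bob", "fee=4%")
poll.vote("carol", "fee=4%")
print(poll.tally())  # "fee=8%" — alice's larger holding outweighs bob + carol
```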

As long as the Maker protocol itself stays healthy, the circulating supply of MKR gradually decreases (fees collected by the protocol are used to buy back and burn MKR), which tends to push up the token's market price. Conversely, if the protocol runs into trouble, MKR holders bear the cost. For example, in mid-March a sudden cryptocurrency market crash pushed the value of the collateral in the Maker protocol below the system's total debt, leaving bad debt that could not be repaid to creditors. The system responded by automatically minting additional MKR and auctioning it off to cover the shortfall, so MKR holders (the decision-makers) were penalized economically through the token's decline.
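
A back-of-the-envelope sketch of that dilution effect is shown below; the figures are hypothetical, chosen only to illustrate the arithmetic, not the actual numbers from the March debt auctions.

```python
# Illustrative arithmetic for a debt auction that dilutes governance-token
# holders. All numbers are hypothetical, not the actual March 2020 figures.

total_supply = 1_000_000.0   # MKR in circulation before the shortfall
bad_debt_dai = 4_000_000.0   # uncovered system debt, denominated in DAI
auction_price = 250.0        # average DAI fetched per newly minted MKR

minted = bad_debt_dai / auction_price   # MKR created to cover the debt
new_supply = total_supply + minted
dilution = minted / new_supply          # share of value shifted away from
                                        # existing holders

print(f"MKR minted:        {minted:,.0f}")      # 16,000
print(f"New total supply:  {new_supply:,.0f}")  # 1,016,000
print(f"Holder dilution:   {dilution:.2%}")     # 1.57%
```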

Bottom-Up Governance

While social media platforms often face criticism over their content moderation power, it must be acknowledged that most of their moderation is not malicious and does filter out a great deal of harmful content. Perhaps the real issue has never been content moderation itself but its "top-down" governance model. If platforms could decentralize and delegate some of that power, along the lines of the "bottom-up" approach of Facebook's Oversight Board, public discourse would be overseen by a more impartial group of people. Criticism that social media "infringes on freedom of speech" or "favors certain parties" could then diminish significantly, a win-win for users and platforms alike.

The issue of content moderation rights has been brought to the forefront following the executive order signed by U.S. President Trump, and in the days ahead, major social media platforms are likely to adjust their content moderation strategies under the scrutiny of regulatory bodies. As for whether these platforms will adopt a decentralized autonomous organization model, let us wait and see.