Ethereum founder Vitalik comments on Trump's pressure on Twitter: in the crazy world of 2020, politicians call the shots

Trump's tweets were flagged by Twitter as containing misinformation, prompting him to claim that his freedom of speech was being violated. In response, he issued an executive order calling for a review of the legal protections enjoyed by social media platforms. If the order takes effect, it could significantly affect how these platforms moderate content.

Vitalik's Comments on Trump's Executive Order

Ethereum co-founder Vitalik Buterin also weighed in on Twitter:

"It's a strange thing when major tech companies in the US are subject to fairly strict regulation, even if not up-to-date with the laws of the times, at least their current actions or inactions do not need to be worried about, and then used as an excuse to make new laws."

Vitalik was agreeing with US technology lawyer Preston Byrne, who argued that Trump's order to review Section 230 of the Communications Decency Act, the provision that shields internet platforms from liability for user content, would amount to rolling back that protection if platforms like Twitter, Facebook, Instagram, and YouTube were made legally responsible for the content they moderate.

Byrne also believes that Twitter's "fact check" labels, which add commentary to Trump's statements rather than erasing them, are themselves protected by freedom of speech.

Trump's tweet labeled as misleading by Twitter

Vitalik went further:

"Section 230 doesn't impose any obligation on platforms to be neutral. But in the crazy world of 2020, that doesn't matter. If politicians 'believe' there's a neutrality obligation, then, regardless of what the law says, tech companies will have an incentive to be neutral."

In a centralized world, politics and law remain far more complex. The notion of "code is law" may well handle business rules and financial operations that can be defined mathematically, but when it comes to deciding and balancing "content moderation", more nuanced mechanisms and human coordination may be needed to reach social consensus, don't you think?