Gallup Poll Reveals Most Americans Distrust Social Media Companies on Content Decisions – and the Government

Photo Illustration by Jakub Porzycki/NurPhoto via Getty Images

The takeaways

  • Gallup and the Knight Foundation have released new polling of Americans on content moderation by tech companies and the government.

  • The findings indicate that while most Americans don’t trust tech companies to fairly judge the content on their platforms, they trust the government less and desire independent oversight.

  • The release comes as the Trump administration mounts legal challenges to tech companies, which have been criticized for their handling of the president’s recent controversial posts.

What happened?

A poll released on Tuesday by Gallup and the Knight Foundation reveals that eight in ten Americans do not trust leading tech companies to decide fairly what content should be allowed on their platforms. The poll also found that respondents considered the U.S. government even less trustworthy with such decisions. Almost two-thirds of respondents supported, in principle, Section 230 of the Communications Decency Act, which protects major internet companies from liability for users’ content.

Though most respondents favored content moderation by social media companies over the U.S. government, the majority was more critical of companies perceived as doing too little, rather than too much, to control harmful content. Seventy-one percent of Democrats and 54 percent of independents said the companies were not tough enough, while Republicans were more divided.

Eighty-one percent favored the idea of independent oversight boards supervising content policies. Facebook is currently appointing such an oversight board to supply content policy recommendations based on a small number of cases.

Why this matters

Twitter and Facebook’s stances on political bias, fact-checking – including on the pandemic – and inflammatory social media posts by President Donald Trump have all come under heavy scrutiny. In May, Twitter applied warning labels to some of Trump’s controversial tweets, prompting an executive order from the president against social media companies that carries direct implications for Section 230. Despite a virtual walkout by employees at the start of June, Facebook declined their request to take down Trump’s posts and fired an employee who directly criticized CEO Mark Zuckerberg over the company’s inaction.

The data that Gallup and the Knight Foundation have collated points to increasing unease against the backdrop of a potentially more regulated social media landscape. Notably, the report states that the majority of its research was collected in late March 2020, as COVID-19 approached pandemic status, with the remaining data gathered in December 2019 – in both cases preceding the events above.

Online content moderation has been a critical issue since 2016, and it has intensified amid the coronavirus pandemic and the 2020 U.S. election. Much of this can be traced back to the Cambridge Analytica scandal and accusations of Russian bot interference in the 2016 election via Facebook, Twitter, and other platforms – the latter leading to congressional hearings for Facebook, Google, and Twitter in 2017.
