Americans have little faith that social media companies will appropriately moderate content on their platforms, according to a new survey by the Knight Foundation and Gallup.
The survey, released today, found that 84% of Americans say they do not trust social media companies to make the right decisions about what people can post on their platforms.
Most respondents said social media companies were not doing enough to remove harmful content from their sites, though opinions split sharply along party lines: 71% of Democrats and 54% of independents said the companies were not tough enough, while only 32% of Republicans agreed.
Facebook and Twitter have both recently faced repeated, high-profile decisions over whether to moderate misleading and incendiary posts by President Donald Trump.
John Sands, the Knight Foundation's director of learning and impact, told Adweek that the results point to a "significant gap in trust" between Americans and social media companies.
The survey also asked about Section 230 of the Communications Decency Act, a little-known law that has provided liability protection for online platforms for the past quarter century.
Of those surveyed, 66% said they supported keeping the law, which shields platforms from most lawsuits over user-generated content. At the same time, 54% said the law has done "more harm than good" because it does not hold companies responsible for illegal content on their sites and apps.
With content moderation in the spotlight amid the Covid-19 pandemic, and with presidential posts repeatedly testing the boundaries of acceptable speech online, major social media players are grappling with how best to police their own platforms.
Sands described these efforts as an "ongoing experiment" in restoring user trust.