This is a news story, published by Brookings, relating primarily to Peter Bergen.
Governments around the world are imposing transparency requirements in the hope that they will improve content moderation practices on social media platforms.
Peter Bergen argues transparency is a necessary first step in creating a regulatory structure for social media companies.
Bergen: Regulators are absolutely key in implementing transparency reporting duties.
Such a regulatory structure would require social media companies to give qualified researchers access to the internal company data they need to conduct independent evaluations.
A digital regulator, in conjunction with research agencies such as the National Science Foundation or the National Institutes of Health, would have to vet the researchers and the research projects.
At a minimum, the regulator must ensure that companies do not frustrate the goals of access transparency by withholding timely or accurate data.
There are good policy reasons to be careful about mandating speech controls, the authors say. Still, they suggest there may be additional regulatory measures, short of speech controls, that would be helpful.
Authors: We know what the problems of social media content moderation are: hate speech, disinformation, amplification of terrorist material, and material that harms minors.
Article metrics:
VR Score: 84
Informative language: 89
Neutral language: 47
Article tone: informal
Language: English
Language complexity: 75
Offensive language: not offensive
Hate speech: not hateful
Attention-grabbing headline: not detected
Known propaganda techniques: not detected
Time-value: long-living
External references: no external sources
Source diversity: no sources
Affiliate links: no affiliate links