The social media behemoth is also making it easier to curb the spread of misinformation within Groups. Admins can automatically move posts with known false claims (that is, claims verified as false by fact checkers) to pending posts so they can be reviewed before they're deleted. While leaders could already auto-decline posts and even auto-block posters, this could help them spot trends in bogus content and inform decisions on bans.
There are efforts to promote conversations, too. Facebook is testing an extension (shown at top) that lets admins allow content that might otherwise be flagged for bullying and harassment, such as describing a fish as "fatty." This will only be available to actively involved admins who haven't helmed a removed group or committed a serious policy violation. In another test, admins can reward contributions by giving points to community members, who might earn badges for welcoming newcomers or providing useful tips, for example.
The changes are both an effort to spur positive engagement and an acknowledgment that Groups have sometimes been the source of Facebook's largest misinformation problems. The company put some communities on probation for spreading false 2020 election claims, and banned hundreds of QAnon groups. The ability to allow certain flagged content is unusual — effectively, Facebook is willing to let Groups override its moderation system if admins feel there's been a mistake.
Original story at https://www.engadget.com/facebook-groups-reels-anti-misinformation-tools-162033187.html?src=rss