
Facebook Groups now include Reels and more anti-misinformation tools

Group admins can also allow language Facebook would normally flag.


Facebook is giving Groups a tune-up with new features, including tools meant to keep discussions healthy. Like it or not, Reels are coming to Groups, letting members share how-to guides, vacation recaps and other videos in the seemingly omnipresent format. You can also update your Group profile to highlight things you might have in common with other members and signal that you're open to messages. And if you want to broadcast an event, you can share public Facebook events as Instagram Stories.

The social media behemoth is also making it easier to curb the spread of misinformation within Groups. Admins can automatically move posts containing claims that fact checkers have identified as false to pending posts, where they can be reviewed before they're deleted. While leaders could already auto-decline posts and even auto-block posters, this approach could help them spot trends in bogus content and make more informed decisions about bans.

Facebook Groups Admin Assist misinformation controls (Image: Meta)

There are efforts to promote conversations, too. Facebook is testing an extension (shown at top) that lets admins allow content that might otherwise be flagged for bullying and harassment, such as describing a fish as "fatty." The option will only be available to actively involved admins who haven't led a group that was removed or committed a serious policy violation. In another test, admins can reward contributions by giving points to community members, who may earn badges for welcoming newcomers or providing useful tips, for example.

The changes are both an effort to spur positive engagement and an acknowledgment that Groups have sometimes been the source of Facebook's largest misinformation problems. The company put some communities on probation for spreading false claims about the 2020 election, and banned hundreds of QAnon groups. The ability to allow certain flagged content is unusual: effectively, Facebook is willing to let Groups override its moderation system when admins believe there's been a mistake.