How Facebook and Instagram Plan To Limit Teens’ Exposure to Harmful Content

The social media giant announced new safety measures in the aftermath of whistleblower Frances Haugen’s claims that it values “profits over safety,” including plans to prioritize posts from friends over political content in young users’ feeds.

Credit: Getty Images

Facebook Vice President of Global Affairs Nick Clegg announced plans to steer teen users away from harmful material on the platform, responding to claims that the company deliberately hid evidence of its detrimental effects on young users.

Frances Haugen, a former Facebook employee turned whistleblower, claimed last Tuesday in her testimony before Congress that Facebook valued profits over people, after releasing thousands of pages of internal documents to The Wall Street Journal.

In response to the claims, Clegg revealed specific actions meant to “nudge” teens away from harmful content in their feeds. “We're going to introduce something which I think will make a considerable difference, which is where our systems see that the teenager is looking at the same content over and over again and it's content which may not be conducive to their well-being, we will nudge them to look at other content," Clegg said on CNN’s “State of the Union.”

Though he did not provide extensive details about the new safety measures, Clegg said that Instagram would implement a new “take a break” feature meant to encourage teens to step away from the platform. He also noted that Facebook would work to show young users more content from their friends and less political content.

Clegg also confirmed that previous plans for Instagram Kids, a platform for children younger than 13, have been paused to ensure the safety of preteens. According to a September blog post by Instagram head Adam Mosseri, the company will postpone the project “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

“We have no commercial incentive to do anything other than try and make sure that the experience is positive… We can’t change human nature. We always see bad things online. We can do everything we can to try to reduce and mitigate them,” Clegg said.

In her congressional testimony last week, Haugen, who previously served as a product manager on Facebook’s civic misinformation team, said she believes “Facebook’s products harm children, stoke division, and weaken our democracy,” and went on to ask Congress to treat Facebook like tobacco or opioid companies.

“When we realized tobacco companies were hiding the harms it caused, the government took action. When we figured out cars were safer with seat belts, the government took action. And today, the government is taking action against companies that hid evidence on opioids. I implore you to do the same here,” Haugen said on Tuesday.

Other social media platforms, such as TikTok, have taken measures in recent months to protect the safety and privacy of younger users. In August, TikTok announced new safety features for young users, such as disabling push notifications during specific nighttime hours. The company also changed its default privacy settings to require 16- and 17-year-old users to actively enable direct messaging, a feature the company turned off completely for users under 16.