Facebook knew Instagram was pushing girls toward dangerous content: internal document

Families are suing social media companies



A previously unreleased internal document reveals that Facebook, now known as Meta, knew Instagram was pushing girls towards dangerous content.

In 2021, according to the document, an Instagram employee conducted an internal investigation into eating disorders by opening a fake account as a 13-year-old girl looking for nutrition tips. She was guided to graphic content and recommendations to follow accounts titled “Skinny Binge” and “Apple Core Anorexic.”

Other internal memos show Facebook employees raising concerns about the company's own research, which found that Instagram made 1 in 3 teenage girls feel worse about their bodies, and that teenagers who used the app experienced anxiety and depression more often.

Attorney Matt Bergman founded the Social Media Victims Law Center after reading the so-called "Facebook Papers" published by whistleblower Frances Haugen last year. He now works with more than 1,200 families pursuing lawsuits against social media companies. Over the next year, Bergman and his team will begin the discovery process in the consolidated federal lawsuits against Meta and other companies, multimillion-dollar cases that he says are more about changing policy than about monetary compensation.

“Time and time again, when given the choice between the safety of our children and profit, they always choose profit,” Bergman told 60 Minutes correspondent Sharyn Alfonsi.

Matt Bergman

Bergman practiced product liability for 25 years, specializing in asbestos and mesothelioma cases. He argues that the design of social media platforms ultimately harms children.

“They designed a product that is purposely addictive,” Bergman said. “They understand that kids staying online makes them more money. It doesn’t matter how harmful the material is.”

“So, the fact that these kids ended up seeing the things they saw that were so disturbing,” Alfonsi asked, “was no coincidence; it was intentional?”

“Absolutely,” Bergman said. “It’s no coincidence.”

Bergman argues that the apps were explicitly designed to evade parental authority and calls for better age and identity verification protocols.

“This technology exists,” Bergman said. “When people try to log on to Tinder, there’s technology that makes sure people are who they say they are.”

Bergman also wants to do away with the algorithms that push content to users.

“There’s no reason Alexis Spence, who was interested in exercise, should have been directed to anorexia content,” Bergman said. “Number three would be warnings, so parents know what’s going on. Let’s be realistic: no social media platform will ever be 100% safe. But these changes would make them safer.”

Meta, the parent company of Facebook and Instagram, declined 60 Minutes’ request for an interview, but its global head of safety, Antigone Davis, said “we want teens to be safe online” and that Instagram does not allow “content that promotes self-harm or eating disorders.” Davis also said Meta has improved Instagram’s “age verification technology.”

But when 60 Minutes ran a test two months ago, a producer was able to lie about her age and sign up for Instagram as a 13-year-old without any verification. 60 Minutes was also able to search for content promoting thinness and self-harm. And while a prompt came up asking if the user wanted help, we clicked “View Posts” instead and easily found content promoting anorexia and self-harm.
