A previously unreleased internal document reveals that Facebook, now known as Meta, knew Instagram was exposing girls to dangerous content.
An Instagram employee conducted an internal investigation into eating disorders in 2021 by creating a fake account posing as a 13-year-old girl seeking nutritional advice, the document said. She was directed to graphic content and accounts titled “Skinny Binge” and “Apple Core Anorexic.”
Other internal memos show that Facebook employees raised concerns about company research that found Instagram made a third of teenage girls feel worse about their bodies, and that teens who used the app were more likely to experience anxiety and depression.
Attorney Matt Bergman founded the Social Media Victims Law Center after reading the so-called “Facebook Papers” published by whistleblower Frances Haugen last year. He now represents over 1,200 families in court cases against social media companies.
Next year, Bergman and his team will begin discovery in the consolidated federal lawsuits against Meta and other companies, cases he says are driven more by the goal of policy change than by monetary compensation. "Time and time again, when given the choice between the safety of our children and profit, they always choose profit," Bergman told 60 Minutes correspondent Sharyn Alfonsi.
Bergman practiced product liability law for 25 years, specializing in asbestos and mesothelioma cases. He claims the design of social media platforms ultimately harms children. "They have purposefully designed a product that is addictive," Bergman said. "They understand that the longer kids stay online, the more money they make. It doesn't matter how harmful the material is."
"So, the fact that these kids ended up seeing the things they saw that were so disturbing," Alfonsi asked, "was no coincidence, it was on purpose?" "Absolutely," Bergman said. "It's no coincidence."
He argues that the apps were designed to evade parental authority, and he calls for stronger age and identity verification protocols. "This technology exists," Bergman said. "When people try to sign on to Tinder, there's technology that makes sure people are who they say they are."
Bergman also wants to do away with the algorithms that push content to users.
"There's no reason Alexis Spence, who was interested in exercise, should have been directed to anorexic content," Bergman said. "Number three would be warnings, so parents know what's going on. Let's be realistic: no social media platform will ever be 100% safe. But these changes would make them safer."
Meta, the parent company of Facebook and Instagram, declined 60 Minutes' request for an interview, but its global head of safety, Antigone Davis, said, "We want teens to be safe online" and that Instagram "doesn't allow content that promotes self-harm or eating disorders." Davis also said Meta had improved Instagram's "age verification technology."
However, in a test conducted by 60 Minutes two months ago, a producer was able to lie about her age and sign up for Instagram as a 13-year-old without any verification. The account was also able to search for skinny and harmful content. And while a prompt came up asking if the user wanted help, we clicked "View Posts" instead and easily found content promoting anorexia and self-harm.