The leak that changed everything
In September 2021, the Wall Street Journal published the Facebook Files — a series of investigative reports based on internal documents leaked to the press by whistleblower Frances Haugen, who copied thousands of internal Meta documents and handed them to the US Congress and journalists. The documents showed that Facebook/Meta had been conducting its own studies into Instagram's effects on young users since 2019 — with devastating results.
The research was unambiguous: Instagram harms a substantial proportion of young users — particularly girls. Yet Meta never made these findings public. When the company was questioned by the US Congress in March 2021 about whether it had conducted research into harms to children, CEO Mark Zuckerberg replied: "I believe the answer is yes." He withheld the specific findings.
The figures from Meta's own research
Internal presentations seen by the Wall Street Journal documented:
- 32% of teenage girls who already felt bad about their bodies said Instagram made them feel worse.
- 13.5% of teenage girls in the UK said Instagram amplified suicidal thoughts.
- 17% of teenage girls in the UK said Instagram worsened eating disorders.
- 13% of British and 6% of US teenagers with suicidal thoughts traced the urge to harm themselves back to Instagram.
- 66% of teenage girls and 40% of teenage boys experienced "negative social comparisons" on Instagram.
- 52% of teenage girls said these negative comparisons were triggered by beauty-related images.
A further study from March 2020 found: "Comparisons on Instagram can change how young women see and describe themselves." The internal document continued: "Teens blame Instagram for increases in anxiety and depression. This reaction was unprompted and consistent across all groups."
A Meta survey of 50,590 people across 10 countries (February 2021) showed that 48% of teenage girls "always or often" compare their appearance with others on Instagram, and that 37% of teenage girls "always or often" see posts that make them feel worse about their body.
Meta's public response: downplaying
When the Facebook Files were published, Meta released its own excerpts of the research on 29 September 2021 — accompanied by annotations designed to downplay the findings. Meta claimed the Wall Street Journal had "misrepresented" the research. The annotations described phrases such as "We make body image issues worse" as "sensationalist" and criticised the language for "suggesting causal links" when the statements only reflected the perceptions of those surveyed.
Mark Zuckerberg wrote in a blog post that the research showed "many teens feel that Instagram helps them in difficult moments". He emphasised that across 11 of the 12 areas studied — including loneliness, anxiety and eating disorders — more teenage girls said Instagram made things better rather than worse. Body image was the only area where more said it got worse.
Critics pointed out that even if the majority says "it helps" or "no effect", there remains a significant proportion for whom Instagram causes harm — and Meta knew this.
Independent research backs the findings
Meta's internal conclusions are not an isolated case. Numerous independent studies reach similar conclusions:
Systematic reviews
A systematic review (the highest-quality form of scientific summary, in which all available studies on a topic are evaluated against strict criteria; regarded as the gold standard of evidence) of 93 studies (Ahorsu et al., 2021) examined the link between Instagram use and mental health. Its conclusion: higher Instagram use correlates with depression, anxiety, stress, low self-esteem, body dissatisfaction and eating disorders. The central mechanisms are social comparison and self-presentation pressure.
A study by Ahadzadeh et al. (2024) focusing on young people aged 17–22 found that 95.2% use Instagram daily, almost half of them spending over four hours a day on the platform. 70% of respondents reported feelings of anxiety, depression and inadequacy that they attributed to their Instagram use.
A meta-analysis on Instagram and depression (Owusu et al., 2021) found that time spent on Instagram correlates positively with depressive symptoms, trait anxiety, social comparison, appearance anxiety, body image disorders and low self-esteem. The more Instagram followers, the higher the levels of depression and anxiety and the lower the self-esteem.
Most at risk: teenage girls
Studies consistently show that Instagram is image-driven and focused on bodies and lifestyle. While TikTok centres on performance and YouTube on video content, Instagram revolves primarily around visual self-presentation — which maximises pressure around appearance.
The Child Mind Institute summarised in 2025: Instagram tops the list of platforms that cause young people to report anxiety, depression and body image concerns. "Curating a perfect image doesn't just make others feel inadequate — it is also unhealthy for those who appear to be succeeding at it."
GDPR fine: €405 million
In September 2022, the Irish Data Protection Commission (DPC), the data protection authority responsible for Meta, Google and Apple in the EU because these companies have their European headquarters in Ireland, imposed a fine of €405 million on Instagram for violations of the GDPR (the EU's General Data Protection Regulation, in force since 2018, under which fines can reach 4% of global annual turnover) in the processing of children's data (ages 13–17). It was the first EU-wide ruling on children's data protection rights and, at the time, the second-largest GDPR fine ever imposed.
The investigation began in 2020 after a US data scientist and the DPC itself found that children who switched to Instagram business accounts had their email addresses and phone numbers publicly visible. In addition, all accounts — including those belonging to minors — were set to "public" by default.
The DPC found violations of Articles 5(1)(a), 5(1)(c), 6(1), 12(1), 24, 25(1), 25(2) and 35(1) of the GDPR. Meta had not conducted a Data Protection Impact Assessment (DPIA, the risk analysis the GDPR requires before a service processes personal data) for children, had not established a legal basis for the processing, and had failed to implement Privacy by Design and Default (the GDPR principle that data protection must be built into the technology from the outset and that the most privacy-friendly setting must be the default).
Andrea Jelinek, Chair of the European Data Protection Board (EDPB, the EU body comprising all national data protection authorities, which ensures consistent application of the GDPR and can issue binding decisions), stated: "With this decision, the EDPB makes it extra clear: companies that target children must be particularly careful. Children deserve specific protection with regard to their personal data."
Meta vs. the FTC: the $5 billion battle
In the United States, the Federal Trade Commission (FTC), the US consumer protection authority responsible for data protection violations and unfair business practices, pursued a years-long legal battle against Meta. In 2019 the FTC imposed a record fine of $5 billion — the largest privacy penalty in US history — for data protection violations in the Cambridge Analytica scandal and ordered extensive data protection measures. (Cambridge Analytica was a British data analytics firm that, as came to light in 2018, had unlawfully harvested data from 87 million users via a Facebook app and used the data for political campaigns, including Trump's 2016 campaign.)
In May 2023, the FTC accused Meta of violating the terms of that settlement (approved by the court in 2020): Meta had misled parents about the Messenger Kids app (parents could not control who their children spoke to) and had given app developers more access to user data than disclosed. The FTC proposed an outright ban on Meta monetising the data of under-18s.
Meta then sued the FTC in November 2023, calling the measures an "unconstitutional abuse of power". A federal judge nonetheless allowed the FTC to proceed. The advocacy organisation Fairplay stated: "According to recently unsealed documents, Meta could face a COPPA liability of $200 billion."
42 US states file suit (October 2023)
On 24 October 2023, 42 attorneys general (33 in a joint federal action, 9 in individual state lawsuits) filed suit against Meta. Allegations:
- Meta deliberately designed addictive features (infinite scroll, which loads new content endlessly so there is no natural stopping point; autoplay; like counters; the recommendation algorithm) to keep children and teens on the platform.
- Meta misled the public about the safety of Instagram and Facebook.
- Meta knowingly collected data from children under 13 without parental consent — a violation of COPPA, the US Children's Online Privacy Protection Act, which prohibits exactly that.
- Meta had known about the harms for years and took no action.
New York Attorney General Letitia James: "Meta has profited from the suffering of children by deliberately designing platforms with manipulative features that addict children and lower their self-esteem." Colorado Attorney General Phil Weiser, who led the federal suit: "Meta's deceptive and unfair practices have profoundly harmed our youth."
The lawsuits compare Meta to Big Tobacco in the 1990s. Psychologist Jean Twenge (San Diego State University): "Maybe in the future we will think: what are they doing, putting 12- and 14-year-olds on social media? The way we now think about smokers."
Landmark trial, 2026
In February 2026, the first bellwether trial began in Los Angeles: a test case whose outcome could shape thousands of further lawsuits against Meta, YouTube, Snapchat and TikTok, whether through precedent or by encouraging settlements. The case concerns a 19-year-old (initials "KGM") who claims she became addicted to social media as a minor, worsening her depression and suicidal thoughts.
Crucially, the lawsuit argues that Meta consciously made design decisions to addict children — it does not allege failures of content moderation. This sidesteps Section 230 (the US law that shields platforms from liability for user-generated content) and the First Amendment (the free-speech guarantee that tech companies invoke against regulation, claiming their algorithms are "protected speech"). The lawsuit states: "Meta drew heavily from the techniques of slot machines and the tobacco industry to make their products addictive to young people and maximise advertising revenue."
Meta CEO Mark Zuckerberg is expected to testify. The trial is scheduled to run for six to eight weeks.
💡 Conclusion
Instagram is not a neutral tool. Meta's own research since 2019 made it clear: the platform worsens body image problems, depression and suicidal thoughts in a substantial proportion of young users — particularly girls. Instead of making the findings public and taking action, Meta concealed the studies, downplayed them before legislators and continued developing addictive features. The legal consequences — a €405 million EU fine, lawsuits from 42 US attorneys general, ongoing landmark trials — demonstrate that the business model of "attention at any cost" is fundamentally incompatible with child welfare.