
‘They Don’t Care. They Make Money Off Us Dying’: Meet the Parents and Young People Suing Social Media

TW: mentions of eating disorders, suicide and self-harm.

Credit: Shutterstock/ First Glimpse Photography

Two thousand families and young people are suing social media companies such as Meta, Snap and ByteDance for the damage that apps such as Instagram, Snapchat and TikTok have wrought on their children. They are going up against some of the most powerful companies in the world. Here’s why they’re doing it:

What age were you when you first got access to social media? Were you thirteen, as the age restrictions suggest? Or did you lie, easily cheating a system based on honesty so that you could join in with your friends?

Either way, it’s likely that most of you encountered social media before the age of eighteen, which means you’re part of the generation most affected, and most harmed, by it.

TikToks warning parents about the dangers of social media for children have abounded online.

@allieprib on TikTok: “Even if kids aren’t given ‘unlimited’ access to social media, I’m not sure people realize the negative impact it still has on them.”

As mentioned, 2,000 families are suing social media companies over this harm, and 350 of those cases are scheduled for 2024. Below are some of the claims and stories from the plaintiffs, along with some shocking statistics that support them.

Harmful Content on Social Media

One of the central claims of the lawsuits is that the platforms lack adequate protections for the children using them.

Although the age limit is 13, far younger children are using social networking apps.
Credit: Shutterstock/ Jelena Stanojkovic

A survey of teens in the United Kingdom from March 2022 found that an average of 15% of respondents had seen sexualized images on TikTok, and 12% had seen violent and gory images. 8% had seen pornography, and 7% had seen content relating to dietary restriction. On Instagram, 12% had seen sexualized images, and 12% saw violent and gory images. 7% had seen pornography, and 10% saw images of dietary restriction.

All of this content is against Instagram’s and TikTok’s codes of conduct for users. It can also contribute to mental health problems, including eating disorders, pornography addiction and self-harming tendencies, especially when encountered by young and developing brains.

However, until now, a section of a 1996 US law has shielded such companies from liability for third-party material uploaded to their sites. Section 230 of the Communications Decency Act largely absolves platforms of responsibility for content posted to their sites by their users.

The Social Media Victims Law Centre is a legal practice dedicated to cases brought by parents who wish to sue social media companies. Mathew P. Bergman, its founding attorney, has had to pivot to product liability law when suing these companies.

Lawyers are currently building up their cases
Credit: Shutterstock/ARMMY PICCA

That is, instead of blaming the sites for the content hosted on them, the lawyers need to prove that the products (i.e., the apps themselves) are not fit for purpose.

The Dangers of Social Media Algorithms

A key way in which the lawyers are building a case against these companies is by attacking social media’s recommendation algorithms, which ensure that you continually see content you’re interested in and are likely to enjoy.

However, such algorithms put social media’s younger users at risk. If a child likes or comments on a video containing harmful content, they will continue to be shown similar videos. The algorithm can also reshape a child’s, or an adult’s, worldview: “crunchytok” videos, for example, can make alternative medicine appear to be the only acceptable cure for illness.

One of the plaintiffs is 20-year-old Alexis Spence. She alleges that she joined Instagram at age eleven, easily bypassing the age restrictions, and quickly encountered content promoting anorexia. This led to her developing an eating disorder at the age of twelve.

Alexis Spence developed her eating disorder at age twelve.
Credit: Shutterstock/ Ground Picture

She says, “it started as, like, fitness stuff. And then I guess that would spark the algorithm to show me diets. It then started to shift into eating disorders.”

She also said that because actions such as taking diet pills were prevalent on her social media feed, they became normal to her.

Young people who encounter self-harm content describe similar experiences, as in the case of Taylor Little. Little is 21, uses they/them pronouns, and, like Spence, is suing social media companies. They allege that they were first shown a graphic self-harm image online at the age of eleven, and say that they can “still see it”, even ten years later.

They say that they are determined to win against social media companies, as “they know we’re dying. They don’t care. They make money off us dying.”

Addictiveness

Social media is highly addictive
Credit: Shutterstock/ InFocus.ee

A key issue for Little is also the addictive nature of social media, which both algorithms and the rise of short-form video content have fed into. They say that they were “trapped by my addiction at age 12. And I did not get my life back for all of my teenage years.”

Studies have suggested that social media can be as addictive as cocaine. In 2023, a survey by UCL found that 48% of British teenagers think they are addicted to social media. With extreme usage, such addiction can impact children’s neurological development, and it can harm adults too.

See it for yourself. An online challenge last December was to see who could watch a one-and-a-half-minute TikTok video without distractions. If you’re a regular TikTok user, the results will shock you.

@attemptedsoc on TikTok: “Defy the norm by finishing the video. Also, don’t get me wrong, I’m also an addict, my face is buried in my phone at all times!! Just thought these stats were interesting.”

Addiction and exposure to harmful content can also intersect in concerning ways. In February 2020, another UK study found that 87% of children who were on social media for over ten hours a day had seen harmful content in the past year, compared with 76% of those who used it for two hours or less, a figure which is still extraordinarily high.

Disproportionately Affecting Vulnerable Groups

Worryingly, according to recent studies, teens from vulnerable groups are more likely to come across harmful content than their peers.

A U.S. study of 11-to-15-year-old girls found that 75% of respondents who experienced depressive symptoms had come across material related to suicide or self-harm on Instagram, compared with 26% of their peers.

Some children are more likely to encounter risky content than others
Credit: Shutterstock/ Antonio Guillem

The same study also found that 43% of all respondents had seen harmful eating disorder content on TikTok, and 37% on Instagram. These are also two of the apps most used by British and American teenagers.

A U.K. study from 2022 found that socio-economic status may affect the likelihood of children encountering harmful material online. Children on Free School Meals, for which eligibility depends on their parents receiving certain government benefits, were more likely to see harmful content.

They were twice as likely as their peers to encounter self-harm material. These children also had almost double the chance of seeing images of dietary restriction and sexualized images, and they were almost three times more likely to have seen pornography online.

Who’s Responsible For Harm To Children On Social Media?

In 2018, 89% of respondents to a survey on who was responsible for children’s smartphone use said parents and caregivers. Only 1% said the companies that make the apps.

In 2023, a similar study set out to investigate who is responsible for preventing social media from harming children: 51% of those surveyed said parents, whilst 19% said social media companies and 17% said the federal government.

Social media has frequently been a point of tension for parents and children.
Credit: Shutterstock/DaisyDaisy

Over the past five years, blame for children’s harmful social media usage has shifted away from parents. As companies’ internal investigations have come to light, we’ve become more skeptical of the networks’ assertions that they are safe and secure.

In 2021, Instagram ran an internal experiment in which an employee pretended to be a thirteen-year-old girl looking for weight-loss tips on the platform. She was quickly led to content on anorexia and binge eating. Other former employees have alleged that the board was aware of data suggesting that one in three girls felt worse about their body after using the app. Despite this, the company refused to do anything about it.

Kathleen Spence, the mother of Alexis Spence, has said that for years, parents have been “gaslighted by the big tech companies that it’s our fault.” However, with the onslaught of lawsuits against these companies, she hopes to send a strong message to social media sites: “You need to do better. I’m doing everything I can. You need to do better.”

Will These Lawsuits Succeed?

It might be time to delete social media apps from your phone
Credit: Shutterstock/ Kicking Studio

It’s uncertain how these lawsuits will play out. But one thing is for sure: social media is harmful to our children, and probably to us as well. If these suits succeed, it will mark the end of the era of sweeping internet-age problems under the carpet. Hopefully, we will finally get proper regulation of these companies to protect us.

Until then, think about deleting Instagram and TikTok from your phone. Even if you keep using them in your computer’s browser, it’s one small step in the right direction.

Hi, I'm Georgie and I'm currently studying at St Andrews University, Scotland. I love writing about social media trends and unpicking the ways in which platforms like Instagram and TikTok give rise to certain cultural phenomena. I'm writing for the Life section of Trill at the moment, so look out for my articles!
