Hundreds of families are taking legal action against some of the largest technology companies, claiming the firms knew their products could harm children. One plaintiff explains why they are taking on Silicon Valley.
"I became addicted at the age of 12, and it stayed that way for the rest of my teenage years."
Taylor Little's obsession with social media became a compulsion that led to a suicide attempt and years of depression.
Taylor, who is now 21 and uses the pronoun "they", describes the tech companies as "big, bad monsters".
Taylor believes the companies knew the products they were putting in front of children were harmful.
Taylor, along with hundreds of other American families, is suing four of the biggest tech companies in the world.
The lawsuit has been brought against Meta (owner of Facebook and Instagram), as well as TikTok, Google and Snap Inc (owner of Snapchat), making it one of the biggest legal actions Silicon Valley has ever faced.
Families and school districts from all over America are among the plaintiffs.
The plaintiffs claim the platforms are deliberately designed in ways that cause harm.
The families' lawyers see the case of 14-year-old British schoolgirl Molly Russell as a key illustration of the dangers teenagers can face.
They watched the inquest into her death remotely from Washington last year, looking for evidence that might be relevant to the American lawsuit.
Molly's name is referenced a dozen times in the complaint lodged with the court in California.
Last week, the families won a significant ruling when a federal judge decided the companies could not use the First Amendment of the US constitution, which guarantees freedom of expression, to block the case from going ahead.
Judge Gonzalez Rogers also ruled that Section 230 of the Communications Decency Act, which says platforms are not publishers, did not give the companies blanket immunity.
The judge found that some of the families' claims, such as the lack of "solid" age verification and inadequate parental controls, are not matters of free speech.
The families' lawyers called the ruling a "significant victory".
The companies deny the claims and say they will defend themselves vigorously.
Taylor, who lives in Colorado, tells us that before getting their first smartphone they were active and sociable, involved in dance and theatre.
"If my phone was taken away, it was like going through withdrawal. It was unbearable. And I mean my body and mind physically needed it, not just that I was used to having it."
Taylor remembers the first social media notification they opened.
It was a page showing graphic images of self-harm.
"I was 11 years old, and I was shown a page I had never asked for or expected. Even now, at 21, I can still remember it clearly."
Taylor struggled with body image and eating disorders.
"It was - is - almost like a cult. You feel like you're being flooded with images of an unattainable body that it would cost you your life to achieve.
"It's not possible to avoid it."
Taylor's lawyers and the other plaintiffs have taken a novel approach in the lawsuit, focusing on how the platforms are designed rather than on individual posts, comments or pictures.
They claim the apps contain design features that encourage addiction and cause harm.
Meta issued a statement expressing sympathy for the families involved in the complaints.
It said it wants to assure all parents that it is committed to providing teenagers with safe, supportive experiences online.
TikTok declined to comment.
Google told us the allegations in the complaints are untrue, and that safeguarding young people on its services has always been of paramount importance.
Snap said its platform was designed to remove the pressure to be perfect, and that it reviews content before it can reach a large audience in order to stop the spread of anything potentially harmful.
Taylor knows the story of Molly Russell, from north-west London, who took her own life after viewing a stream of dark and depressing material on Instagram.
An inquest concluded that she died while suffering from depression and the negative effects of online content.
Taylor says their stories were strikingly similar.
"I feel lucky to have survived, and when it comes to people like Molly, my feelings run too deep for words.
"I'm happy. There is a lot in my life to appreciate. I never expected to be where I am now."
Taylor is resolute in their pursuit of the legal action.
"They can see that we are dying, yet they don't care. They are profiting from our deaths.
"My whole hope for better social media depends on us winning and pushing them to change, because they will never act voluntarily."