Meta is facing a lawsuit over its algorithm

Meta has received eight separate complaints in multiple states alleging that its algorithms contribute to mental health problems such as eating disorders, insomnia, and suicidal thoughts or tendencies among young users.

According to the complaints, excessive time spent on Instagram and Facebook poses serious mental health risks. The plaintiffs claim that Meta has misrepresented the safety, usefulness, and non-addictive nature of its platforms.

These complaints target not the content itself but the algorithm. That framing could sidestep Section 230 of the Communications Decency Act, which would otherwise give Meta a strong defense by shielding it from legal liability for third-party content posted on its platform.

Last year, a federal appeals court ruled that Snap could be held liable over its Speed Filter, which allegedly encouraged reckless driving and contributed to a fatal accident in 2017.

"This decision opens the door to lawsuits like the ones against Meta," said Eric Goldman, co-director of the High-Tech Law Institute at Santa Clara University.

The plaintiffs in the Snap case argued that the Speed Filter was not third-party content but a design decision made by Snap itself.

Since the court ruled that Snap was not protected by Section 230 in that case, other plaintiffs are now trying to get around the law in the same way.

But Goldman argued that the Snap verdict and the cases against Meta are qualitatively different, because with Meta the algorithm cannot be separated from the content it surfaces.

"The idea that we can distinguish between dangerous algorithms and dangerous content is, in my opinion, an illusion," he said. "The algorithm only determines what content people see. The content is the problem."

Goldman also noted that the lawsuits were filed against Meta in the hope that at least one of the eight judges overseeing the cases will rule against the company. The risk, however, is that if one judge rules in Meta's favor, the other judges may follow suit.

Can Meta Be Sued Over Its Algorithm?

Regardless of Section 230, social media companies cannot escape the issue of platform addiction. A bill passed by California lawmakers in late May would give parents the right to hold social media platforms accountable when their children become addicted.

Prosecutors across the country are also investigating multiple platforms. In November, a coalition led by prosecutors from several states began investigating Instagram to understand how it attracts young users. The group extended its investigation to TikTok in March.

Frances Haugen testified before Congress last year, saying that Meta has not been open about Instagram's impact on young people.

Meta's internal research showed that the app exacerbates mental health issues for teenage girls in particular.

"These platforms combine addictive design and features with the algorithmic amplification of harmful content," said Jim Steyer, CEO of Common Sense Media. "These are some of the tactics social media platforms like Meta use."

Under public pressure, social media companies appear to be taking steps to curb user addiction.

TikTok recently announced additional screen-time controls designed to help users limit the time they spend scrolling.

Instagram has a similar daily time limit, but it does not allow mobile users to set daily reminders shorter than 30 minutes.

These moves seem helpful, but the time limits are easy to circumvent. They may be little more than face-saving gestures, since these companies still want users to stay on their platforms as long as possible.

If states continue to pass legislation holding social media companies accountable for content posted on their platforms, Congress or the Supreme Court may step in to amend or clarify Section 230.
