Facebook's news feed could lose its legal protection

Democratic lawmakers want to strip legal protections from Facebook's news feed: if social networks recommend harmful content to users, they would be held legally responsible for it.

Members of Congress hope to remove Section 230 protections in certain circumstances, amid concerns that platforms knowingly amplify harmful content and should be held liable for the resulting damage.

A group of lawmakers has proposed a bill targeting harmful algorithms that would amend Section 230's safeguards to exclude personalized recommendations of content that causes severe physical or emotional harm.

The bill follows whistleblower Frances Haugen's testimony to Congress last week.

Haugen, a former Facebook employee who leaked extensive internal research, urged lawmakers to take action against algorithms that promote or rank content based on user engagement.

The bill applies to online services with more than 5 million monthly visitors. It excludes certain categories of services, including infrastructure providers such as web hosts and systems that display search results.

For covered platforms, the bill targets Section 230 of the Communications Decency Act, which prevents people from suing online services over third-party content posted by users.

Under the new exception, services could be sued if they knowingly or recklessly use a personalized algorithm to recommend third-party content. That can include posts, groups, accounts, and other user-supplied material.

The bill wouldn't necessarily let people sue over every kind of material Haugen criticized, including hate speech and anorexia-related content.

Much of that content is legal in the United States, so platforms don't rely on additional liability protections to host it.

Facebook could face legal trouble

The bill covers only personalized recommendations, defined as the use of algorithms that draw on specific personal information to rank content. Companies could apparently still use aggregate analytics to recommend the most popular public content.

In her testimony, Haugen stressed that her goal was to raise platforms' legal risk so that Facebook and similar companies would stop using personalized recommendations.

"If we reformed 230 to make Facebook responsible for the consequences of their intentional ranking decisions, I think they would get rid of engagement-based ranking," she said.
