Facebook encourages hate speech for profit
Frances Haugen, who is responsible for leaking a large number of internal Facebook files to the Wall Street Journal, has appeared on 60 Minutes and continues to reveal the inner workings of the world's most popular social media platform.
Haugen revealed her identity on national television, explaining that the company is so focused on optimizing its product for engagement that it relies on algorithms that amplify hate speech.
"He pays his income through our security," Hogan told host Scott Bailey for 60 Minutes.
According to a LinkedIn profile that has since been deleted, Haugen was a product manager on Facebook's Civic Integrity team. After the group was dissolved, she decided to leave the company in 2021.
She said she did not believe the company was willing to invest what needed to be invested to keep the platform safe. As a result, she disclosed a large amount of internal research to the US Securities and Exchange Commission in the hope of bringing better oversight to the company.
She noted that she has worked for several companies, including Google, but said the situation at Facebook is far worse because the company puts profits above users' interests.
“There is a conflict between what is good for the public and what is good for Facebook,” she said, adding that the company repeatedly chooses to optimize for its own interests, such as making more money.
Although the company claims that its systems help curb hate speech, an internal document leaked by Haugen estimates that Facebook takes action on only 3 to 5 percent of hate speech and about 0.6 percent of violence and incitement content.
Facebook does not care about users' interests
Another document states that the company has evidence from a variety of sources that hate speech, divisive political rhetoric, and misinformation on its platform and its family of apps are affecting societies around the world.
The root of the problem, Haugen says, lies in the algorithms introduced in 2018 that control what you see on the platform. Their goal is to increase engagement, and the company found that the interactions that perform best are those that instill fear and disgust in users. Haugen said it is easier to provoke people to anger than to other emotions.
At the time, Mark Zuckerberg presented the algorithm changes as positive, saying the company felt a responsibility to make sure its services were not only fun to use but also good for people's well-being.
But according to a Wall Street Journal report on Haugen's disclosures, the result was a sharp turn toward anger and hatred. An internal memo cited by the newspaper described the effects of the ranking changes: misinformation and violent content were inordinately prevalent among reshared posts.
The Wall Street Journal began publishing these documents as the "Facebook Files" in September. One report, which said the company's own research showed that Instagram is harmful to teenage girls, has since led to a congressional hearing.