A few weeks ago, an internal Facebook investigation leaked to The Wall Street Journal showed that the company knows perfectly well that its social networks are toxic for teenage girls, something it has denied in public.
Now it is known who leaked that information, and she has come forward with more statements and very interesting facts. She is someone who knows Facebook’s internal dynamics very well: Frances Haugen, a Product Manager (now former Product Manager) at the company. She has also revealed several internal details and secrets unknown until now.
Frances Haugen is 37 years old, a data scientist with a degree in computer engineering and an MBA from Harvard. She has worked for companies like Google and Pinterest. With that experience in the sector, the former employee says she has worked on other social networks and that what she saw at Facebook “is substantially worse than anything she had seen before on other platforms.”
Haugen confirms what Facebook denies
Although there is much talk about Facebook negatively influencing different aspects of society through the information it constantly sends us (among other claims, that its social networks are ideal platforms for spreading disinformation), the documents leaked by Haugen confirm that Mark Zuckerberg and his staff know this too, even if they deny it. This has put the social network in the eye of the hurricane.
Now the Facebook whistleblower says the company encourages “content that is angry, polarizing and divisive.” Interestingly, Tim Cook, CEO of Apple, had already talked about this. According to Cook, these networks prioritize “conspiracy theories and incitement to violence” because those are the issues that draw more people into the conversation. In this way, the companies that run these social platforms get to collect more personal information from citizens, and thus have more data to sell.
A conflict of interest: what’s good for Facebook and what’s good for the public
Frances Haugen has said in an interview that what she saw at Facebook, over and over again, were conflicts between what was good for the public and what was good for Facebook. And Facebook always chose to optimize for its own interests, namely earning more money.
She also says internal studies show that the company is lying to the public about making significant progress against hate, violence and misinformation. One Facebook study carried out this year, whose contents Haugen has now revealed, estimates that the company has managed to act on only 3 to 5% of the hate content present on its platforms.
According to the former executive, when faced with an information space full of content that is angry, hateful or polarizing, “this erodes our civic trust, erodes our faith in others, erodes our ability to want to care for others.”
As a consequence, “the version of Facebook that exists today can fragment our societies and cause ethnic violence around the world,” she said, in reference to hate crimes committed against people because of their skin color or religion. And she recalled the ethnic cleansing that took place in Myanmar in 2018, when the military used Facebook to foment hate speech towards the Rohingya ethnic minority as part of its genocide.
The algorithm that chooses some content over others
One of the serious problems Haugen sees is that, as users, we can hold the phone in our hands and access information constantly, through what Facebook shows us in our ‘Feed’. And that is where the famous algorithms come into play, deciding what we see and what we don’t.
“You have your phone. You could see 100 pieces of content if you sat down and scrolled for just five minutes. Facebook has thousands of options it could show you. The algorithm chooses between those options based on the type of content you’ve interacted with the most in the past.” At the same time, internal Facebook studies show that content that is hateful, divisive and polarizing elicits stronger reactions.
According to the former executive, as long as only the content with the most reactions is prioritized (which, in turn, is usually content that provokes readers’ anger), users will keep receiving more and more information of this kind. And for Facebook it does not matter whether the information is true or false; the algorithm does not take this into account. Haugen says this effect has been especially pronounced since 2018, and she recalled that European leaders have openly questioned it.
The engineer sums it up: “Facebook makes more money when more content is consumed. People get more involved with things that elicit an emotional reaction. And the more anger they are exposed to, the more they interact and the more they consume.”
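The dynamic Haugen describes can be illustrated with a minimal sketch of engagement-based ranking. To be clear, the function names, weights and post fields below are hypothetical illustrations, not Facebook’s actual system; the point is simply that a score built only on reactions rewards divisive content and never consults truthfulness.

```python
# Hypothetical sketch of engagement-based feed ranking (not Facebook's real code).

def engagement_score(post):
    """Score a post purely by the volume of reactions it provokes."""
    # Weights are illustrative assumptions; note that no term measures accuracy.
    return post["reactions"] + 2 * post["comments"] + 3 * post["shares"]

def rank_feed(candidate_posts, limit=100):
    """Pick the top posts by engagement; truthfulness is never checked."""
    return sorted(candidate_posts, key=engagement_score, reverse=True)[:limit]

feed = rank_feed([
    {"id": "calm-news",    "reactions": 50,  "comments": 5,  "shares": 1},
    {"id": "angry-rumour", "reactions": 300, "comments": 80, "shares": 40},
])
# The divisive post ranks first, even though nothing verified whether it is true.
print([p["id"] for p in feed])  # ['angry-rumour', 'calm-news']
```

Under this kind of objective, the outcome Haugen describes follows mechanically: whatever angers people most is what the ranking surfaces next.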
Why Haugen decided to leak the data and how she did it
The former executive has said that she did not want to keep working for a company that hides such important secrets from its users. “Imagine that you know what is happening inside Facebook and you know that no one outside knows it,” she said, so she wanted to reveal the information. Frances Haugen was recruited by Facebook in 2019 and says she took the job on the condition that she would work against disinformation on the platform.
So that her decision would carry weight and not remain empty words, she compiled as many internal documents as she could before revealing the information and resigning.
She worked in the Civic Integrity area, which sought to tackle election-related risks, including misinformation. But after the last elections in the United States there was a turning point inside the company: it was decided to dissolve this area and, in fact, two months later came the assault on the Capitol, according to Haugen.
She believes that the moment the firm got rid of this area aimed at curbing misinformation was when she stopped trusting that Facebook really wanted to invest in keeping Facebook from being dangerous.
Cover image: CBS News