Frances Haugen, Facebook whistleblower, speaks during a hearing of the Senate Committee on Commerce, Science and Transportation in Washington, DC, on Tuesday, October 5, 2021.
Stefani Reynolds | Bloomberg | Getty Images
Facebook whistleblower Frances Haugen told UK lawmakers Monday that the company's refusal to take responsibility for its services, or to encourage employees to speak up about problematic behavior, created the toxic situation that exists today.
"Facebook is unwilling to acknowledge that they are responsible to anyone," Haugen said Monday, testifying at a hearing in the UK Parliament on new laws to combat harmful online content.
It was Haugen's second public appearance since revealing herself as the source of the numerous internal documents that sparked the Wall Street Journal's "The Facebook Files" series. Haugen testified before the US Congress earlier this month and has since begun sharing her trove of documents with numerous news outlets.
Facebook's leadership is focused on growth and has created a culture that emphasizes the positive aspects of the company's services at the expense of addressing the problems they cause, Haugen said Monday.
"Facebook is overwhelmingly full of conscientious, kind, and empathetic people," she said. "Good people who are embedded in systems with bad incentives are led to bad actions. There is a real pattern where people who are willing to look the other way are promoted more than people who sound the alarm."
Haugen said Facebook has not created avenues for employees to flag issues that management should consider or that researchers could investigate.
“Facebook has shown time and time again that not only do they not want to publish this data, but even when they do publish that data, they often mislead people,” she said.
It’s an attitude that’s anchored in Facebook’s startup culture and won’t change until regulations force the company to change its incentives, Haugen said.
“Whenever you see a conflict of interest between profit and people, you keep choosing profits,” said Haugen.
A Facebook spokesperson said in an emailed statement that the company agreed on the need for regulation "so that companies like ours don't make these decisions on their own." The representative also reiterated Facebook's disputes with recent reports, saying the company had "spent $13 billion and hired 40,000 people to do one job: keep people safe on our apps."
Here are the highlights of the hearing on Monday:
Facebook CEO Mark Zuckerberg.
Erin Scott | Reuters
Is Facebook evil?
John Nicolson, a member of Parliament, asked Haugen whether Facebook was just evil.
"Your evidence has shown us that Facebook fails to prevent harm to children, fails to prevent the spread of disinformation, fails to prevent hate speech," Nicolson said. "It has the power to deal with these issues; it just decides not to, which makes me wonder whether Facebook is just fundamentally evil. Is Facebook malevolent?"
Haugen said the word she would choose was "negligence."
"I believe there is a pattern of inadequacy, that Facebook is unwilling to acknowledge its own power," she said. "They believe in flatness, and they won't accept the consequences of their actions. So I think that is negligence and it is ignorance, but I can't see into their hearts."
Adam Mosseri, Facebook
Beck Diefenbach | Reuters
Concerns about Instagram and kids
The Journal highlighted in its series that Facebook was aware its Instagram service was detrimental to teenagers' mental health.
The public outcry following this report led Facebook to announce last month that it would stop developing a version of Instagram for children under the age of 13.
This issue came up again at the hearing on Monday.
Haugen said addiction to the company's products is labeled "problematic use" within Facebook. Facebook found that problematic use was much worse among young people than among older users, Haugen said.
To meet the bar for problematic use, someone has to be self-aware and honest enough to admit they lack control over their usage. By age 14, between 5.8% and 8% of kids report problematic use of Facebook's products, Haugen said.
"That's a huge problem," she said. "If that many 14-year-olds are that self-aware and honest, the real number is probably 15%, 20%. I am deeply worried about Facebook's role in hurting the most vulnerable among us."
Haugen said Facebook's own research shows the problem isn't just that Instagram is dangerous for teenagers, but that it is more harmful than other forms of social media.
"When children describe their use of Instagram, Facebook's own research describes it as an addict's narrative. The kids say it makes them miserable, but leaving would mean being ostracized," said Haugen. "I am deeply concerned that it may not be possible to make Instagram safe for a 14-year-old, and I sincerely doubt it is possible to make it safe for a 10-year-old."
“A novel that will be terrible to read”
At the hearing, Haugen referred to an article in the Journal indicating that armed groups were using Facebook to incite violence in Ethiopia. According to the report, the company does not employ enough people who speak the relevant languages to monitor the situation on Facebook's services.
Haugen said similar situations are looming in other vulnerable countries in the Global South, which is one of the main reasons she came forward.
"I think situations like Ethiopia are just part of the opening chapters of a novel that is going to be horrific to read," said Haugen.
Regulation could be good
Haugen praised the UK for considering regulating social media services, noting that regulation could help Facebook.
"I think regulation could actually be good for Facebook's long-term success, because it would push Facebook back toward a place where it is more pleasant to be on Facebook," she said.
The Verge published a report Monday based on Haugen's documents showing that the number of teenage Facebook app users in the US has declined 13% since 2019, with a projected decline of 45% over the next two years. According to the internal documents, the number of users between the ages of 20 and 30 is expected to decrease by 4% over the same period.
Haugen suggested the company could reverse that decline if regulation forced Facebook to change its incentives and make its apps more enjoyable for users.
"I think if you make Facebook safer and more pleasant, it will be a more profitable company 10 years from now, because the toxic version of Facebook is slowly losing users," she said.
WATCH: How can Facebook fix its trust problem?