Facebook whistleblower Frances Haugen testifies in UK Parliament

Facebook whistleblower Frances Haugen giving evidence to the joint committee for the Draft Online Safety Bill, as part of government plans for social media regulation.

House of Commons – PA Images | Getty Images

LONDON — Regulators have a small window of opportunity to act on the spread of hate speech and other harmful content on Facebook, whistleblower Frances Haugen told U.K. lawmakers Monday.

“When an oil spill happens, it doesn’t make it harder for us to regulate oil companies,” Haugen said at a hearing in U.K. Parliament on new legislation aimed at tackling harmful content online.

“Right now, Facebook is closing the door on us being able to act. We have a slight window of time to regain people control over AI.”

Haugen hit the headlines earlier this month when she was revealed to be the whistleblower behind the leak of a cache of internal Facebook documents that, most notably, showed the company was aware of the harm caused by its Instagram app to teens’ mental health.

The ex-Facebook employee testified before the U.S. Congress, accusing company management of prioritizing “profits before people,” a claim CEO Mark Zuckerberg described as “just not true.”

It marks one of the biggest crises in Facebook’s recent history, and comes as regulators around the world look to curb the sheer power and influence of America’s tech giants.

Over the weekend, a flood of new reports emerged based on additional leaked information from Haugen.

One of the reports said Facebook was unprepared to deal with the Jan. 6 insurrection at the U.S. Capitol building, citing internal documents. Another detailed the spread of hate speech and content inciting violence in India on Facebook’s services.

‘Irresponsible’

“People don’t want to see [harmful content] when they use our apps and advertisers don’t want their ads next to it,” a Facebook spokesperson told CNBC via email. “That’s why we’ve invested $13 billion and hired 40,000 people to do one job: keep people safe on our apps.”

“While we have rules against harmful content and publish regular transparency reports, we agree we need regulation for the whole industry so that businesses like ours aren’t making these decisions on our own.”

Haugen also questioned the role played by the Oversight Board, a Facebook-funded body that’s meant to hold the company to account over its moderation decisions. She said Facebook “actively misled” the board about key aspects of how it makes content rulings.

“This is a defining moment for the Oversight Board,” Haugen said. “If Facebook can come in there and just actively mislead the Oversight Board, I don’t know what the purpose of the Oversight Board is.”

Damian Collins, chair of the U.K.’s joint committee on the Draft Online Safety Bill, called the organization “more of a hindsight board than an Oversight Board.”

Haugen recently accepted an invitation to meet with the Oversight Board.
