On Tuesday, Facebook whistleblower Frances Haugen testified before the Senate, warning that the social media company has long known about misinformation and hate speech on its platform, as well as their negative impact on young users. At the hearing, Haugen explained to a Senate Commerce Committee panel how she believes Facebook’s Instagram platform harms children, saying during her opening remarks: “I am here today because I believe that Facebook’s products harm children, stoke division and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary change because they have put their astronomical profits before people. Congressional action is needed. They won’t solve this crisis without your help.” Haugen called on lawmakers to demand more transparency into the company’s algorithms and internal metrics to guide regulation, and took aim at Section 230 of the Communications Decency Act, which protects platforms from legal liability for content posted by their users.
During her testimony, Haugen said the company’s systems for catching offending content such as hate speech catch only “a very tiny minority of offending content.” She also said the Facebook platform is being used by “authoritarian or terrorist based leaders” around the world, but that despite the national security threat, she did not believe Facebook was adequately prepared to monitor and combat this behavior.
Monika Bickert, vice president of content policy at Facebook, said it was “not true” that the platform’s algorithms are designed to push inflammatory content, adding, “We do the opposite, in fact, and if you look in our transparency center, you can actually see that we demote, meaning we reduce the visibility of engagement bait, click bait, and why would we do that? One big reason is for the long-term health of our services, we want people to have a good experience.”