Facebook whistleblower Frances Haugen testifies before the Senate


After revealing her identity on Sunday night, Frances Haugen — the whistleblower who leaked internal Facebook documents to The Wall Street Journal — testified before the Senate Committee on Commerce, Science, and Transportation on Tuesday.

Haugen’s testimony came after a hearing last week at which Facebook Global Head of Safety Antigone Davis was questioned about the company’s negative impact on children and teens. Davis stuck to Facebook’s script, frustrating senators by failing to answer questions directly. But Haugen, a former product manager on civic misinformation at Facebook, was predictably more forthcoming.

Haugen is an algorithm specialist, having served as a product manager at companies including Google, Pinterest and Yelp. At Facebook, she worked on issues related to democracy, misinformation and counter-espionage.

“Having worked on four different types of social networks, I understand how complex and nuanced these problems are,” Haugen said in her opening statement. “However, the choices being made inside Facebook are disastrous — for our children, for our public safety, for our privacy and for our democracy — and that is why we must demand Facebook make changes.”

The algorithm

Throughout the hearing, Haugen made clear that she considers Facebook’s current algorithm, which rewards posts that generate meaningful social interactions (MSIs), dangerous. Rolled out in 2018, this News Feed ranking algorithm prioritizes interactions (such as comments and likes) from the people Facebook thinks you’re closest to, like friends and family.

But as the documents leaked by Haugen show, data scientists raised concerns that this system yielded “unhealthy side effects on important slices of public content, such as politics and news.”

Facebook also uses engagement-based ranking, in which an AI model surfaces the content it predicts will be most interesting to each user. In practice, content that elicits stronger reactions is prioritized, which amplifies misinformation, toxicity and violent content. Haugen said she thinks chronological ranking would help mitigate these harms.
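To make the distinction concrete, here is a minimal sketch of the two ranking approaches. This is not Facebook’s code; the Post fields and scoring weights are invented for illustration, since the real MSI weights are not public.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    likes: int
    comments: int
    reshares: int

def engagement_score(post: Post) -> int:
    # Hypothetical weights: interactions that take more effort (comments,
    # reshares) count for more than a passive like.
    return post.likes + 4 * post.comments + 8 * post.reshares

def rank_by_engagement(feed: list[Post]) -> list[Post]:
    # Engagement-based ranking: the most reacted-to posts rise to the top,
    # regardless of when they were posted.
    return sorted(feed, key=engagement_score, reverse=True)

def rank_chronologically(feed: list[Post]) -> list[Post]:
    # Chronological ranking: newest first, ignoring engagement entirely.
    return sorted(feed, key=lambda p: p.created_at, reverse=True)
```

Under the first function, an enraging post that draws hundreds of comments keeps outranking everything newer; under the second, it simply ages out of the feed. That difference is the core of Haugen’s argument for chronological ranking.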

“I’ve spent most of my career working on systems like engagement-based ranking. When I come to you and say these things, I’m basically damning 10 years of my own work,” Haugen said in the hearing.

Committee senators listen as former Facebook employee and whistleblower Frances Haugen testifies before a Senate Committee on Commerce, Science, and Transportation hearing on Capitol Hill, October 5, 2021, in Washington, DC. (Photo by Drew Angerer/Pool/AFP via Getty Images)

As Haugen told “60 Minutes” on Sunday night, she was part of the civic integrity team that Facebook dissolved after the 2020 election. Facebook had implemented safeguards to reduce misinformation ahead of the 2020 U.S. presidential election, then turned them off once the election was over. It only switched them back on after the January 6 attack on the U.S. Capitol.

“Facebook changed those safety defaults in the run-up to the election because they knew they were dangerous. Because they wanted that growth back after the election, they returned to their original defaults,” Haugen said. “I think that’s deeply problematic.”

Haugen said that Facebook presents a false choice: that it can either keep its volatile algorithms and continue its rapid growth, or prioritize user safety and decline. But she thinks that adopting more safety measures, like oversight from academics, researchers and government agencies, could actually help Facebook’s bottom line.

“The thing I’m asking for is a move [away] from short-term-ism, which is what Facebook is run under today. It’s being led by metrics and not people,” Haugen said. “With appropriate oversight and some of these constraints, it’s possible that Facebook could actually be a much more profitable company five or ten years down the road, because it wasn’t as toxic, and not as many people quit it.”

Establishing government oversight

When asked, as a “thought experiment,” what she would do if she were in CEO Mark Zuckerberg’s shoes, Haugen said she would establish policies for sharing information with oversight bodies, including Congress; work with academics to make sure they have the information they need to conduct research about the platform; and immediately implement the “soft interventions” that were identified to protect the integrity of the 2020 election. She suggested requiring users to click on a link before they share it, since other companies, like Twitter, have found that this kind of friction reduces misinformation.

Haugen also said that she thinks Facebook, as it’s currently structured, can’t prevent the spread of vaccine misinformation, since the company is overly reliant on AI systems that Facebook itself says will likely never catch more than 10% to 20% of content.

Later on, Haugen told the committee that she “strongly encourages” reforming Section 230, the part of the United States Communications Decency Act that shields social media platforms from liability for what their users post. Haugen argued that reform should exempt decisions about algorithms from Section 230’s protections, making it possible for companies to face legal consequences if their algorithms are found to cause harm.

“User-generated content is something companies have less control over. But they have 100% control over their algorithms,” Haugen said. “Facebook should not get a free pass on choices it makes to prioritize growth, virality and reactiveness over public safety.”

Sen. John Hickenlooper (D-CO) asked how Facebook’s bottom line would be affected if the algorithm promoted safety instead. Haugen said there would be an impact, because when users see more engaging content (even if it’s more enraging than engaging), they spend more time on the platform, yielding more ad dollars for Facebook. But she thinks the platform would still be profitable if it followed the steps she outlined for improving user safety.

International security

As The Wall Street Journal reported in its Facebook Files series, the documents Haugen leaked show that employees flagged instances of the platform being used for violent crime overseas, and that the company’s response was inadequate.

Employees raised concerns, for example, about armed groups in Ethiopia using the platform to coordinate violent attacks against ethnic minorities. Because Facebook’s moderation practices depend so heavily on artificial intelligence, its AI needs to be able to function in every language and dialect spoken by its 2.9 billion monthly active users. According to the WSJ, Facebook’s AI systems don’t cover the majority of the languages spoken on the site. Haugen said that though only 9% of Facebook users speak English, 87% of the platform’s misinformation spending is devoted to English speakers.

“It seems that Facebook invests more in users who make the most money, even though the danger may not be evenly distributed based on profitability,” Haugen said. She added that she thinks Facebook’s consistent understaffing of its counter-espionage, information operations and counterterrorism teams is a national security threat, a concern she is raising with other parts of Congress.

The future of Facebook

The members of the Senate committee indicated that they’re motivated to take action against Facebook, which is also in the midst of an antitrust lawsuit.

“I’m actually against the breaking up of Facebook,” Haugen said. “If you split Facebook and Instagram apart, it’s likely that most advertising dollars will go to Instagram, and Facebook will continue to be this Frankenstein that is endangering lives around the world, only now there won’t be money to fund it.”

But critics argue that yesterday’s six-hour Facebook outage — unrelated to today’s hearing — showed the downside of one company having so much control, especially when platforms like WhatsApp are so integral to communication abroad.

In the meantime, lawmakers are drawing up legislation to make social media platforms safer for minors. Last week, Sen. Ed Markey (D-MA) announced that he would reintroduce the KIDS (Kids Internet Design and Safety) Act with Sen. Richard Blumenthal (D-CT), legislation that seeks to create new protections for online users under 16. Today, Sen. John Thune (R-SD) brought up the Filter Bubble Transparency Act, a bipartisan bill he introduced with three other committee members in 2019. That legislation would give users the option to view content that isn’t curated by an opaque algorithm.

Sen. Blumenthal even suggested that Haugen return for another hearing about her concerns that Facebook is a threat to national security. Though Facebook executives spoke out against Haugen during the hearing, policymakers seemed moved by her testimony.


