Representative Veasey Leads 23 on Letter Pressing Meta CEO for Answers on Layoffs of Staff Dedicated to Addressing Election Disinformation
WASHINGTON - Representative Veasey (TX-33), co-founder of the Congressional Voting Rights Caucus, led 23 of his colleagues on a letter pressing Meta CEO Mark Zuckerberg for answers after reporting highlighted a significant reduction of staff dedicated to countering misinformation on its social media platforms. The layoffs could have a dire impact on Meta's ability to effectively respond to election-related disinformation and malicious campaigns seeking to undermine confidence in the integrity of our elections ahead of the 2024 election cycle.
"We saw an unprecedented rise of mis- and disinformation on social media during the 2020 election - and yet, even as we head into the 2024 cycle and at a time when artificial intelligence is beginning to take off, Meta is continuing to reduce the workforce of the teams overseeing content that is simply false and misleading on its platforms. This is dangerous, and we must do everything we can to hold Meta accountable to ensure social media companies can protect the integrity of our elections," said Representative Marc Veasey (TX-33).
"Meta's failure to respond to independent inquiries is nothing new. Meta has exhibited reckless disregard for years to civil society, lawmakers, journalists, and users who have inquired about their policy enforcement, staffing and use of machine learning tools. These failures point to a systematic unwillingness to protect users despite ample evidence that social media platforms have real offline consequences. Strong content moderation, like the 'break-glass' measures Meta employed ahead of 2020 elections, can mitigate the virality of harmful, violative content. When Meta turned those features off after the November 2020 elections, harmful content surged; in turn, that content created fertile ground for the 'Big Lie,' which helped incite violence during the January 6th insurrection. We know — and Meta knows — what to expect in 2024 when it comes to the extremism and lies that target users and sow discord. Social media platforms have the resources needed to disrupt the spread of hate and anti-democratic disinformation — they should implement and share their plans to do so immediately," said Nora Benavidez, Senior Counsel and Director of Digital Justice and Civil Rights, Free Press.
The full text of the letter can be found here and below.
Dear Mr. Zuckerberg,
We write to express deep concern about the potential impact of Meta's decision to downsize its election teams ahead of the 2024 U.S. presidential election year. The proliferation of misinformation and disinformation on your social media platforms has consistently been a concern for Congress, particularly during election years as users of all backgrounds are more vulnerable to malicious campaigns seeking to undermine confidence in the integrity of our elections.
This is particularly troubling given that in previous years Meta has tended to ramp up, not curtail, its election integrity efforts in the months leading up to elections. Ahead of the 2022 U.S. midterm elections, Meta's approach to fighting election falsehoods was "consistent with the policies and safeguards [Meta] had in place during the 2020 U.S. presidential election." [1] Now, however, your actions indicate the contrary, as Meta continues to eliminate tens of thousands of jobs, including layoffs within its trust and safety division, which is dedicated to overseeing content moderation and combating foreign influence campaigns.[2]
The accelerated innovation in and deployment of artificial intelligence open the door for a deluge of misinformation and disinformation, as a malicious agent can flood your platforms with hate speech, fake images, altered videos and false information about our elections. With over 50 countries preparing for elections in 2024, the risk that generative artificial intelligence will sow confusion and create a volatile political environment for voters and candidates around the globe is greater than ever.
Given the serious risks of misinformation and disinformation running rampant on Meta's platforms ahead of the 2024 elections, we respectfully ask that you answer the following questions separately for each of your platforms, including Facebook, Instagram and Threads, by October 26, 2023:
1. What input and/or recommendations informed your company's decision to reduce the number of employees in the trust and safety division? Does the company intend to continue laying off additional employees of this division?
2. What steps has your company taken to ensure that current employees in the trust and safety division have the additional tools and resources to sustain, or improve, their efforts to prevent the amplification of online misinformation and disinformation?
3. How many employees at your company are currently dedicated to addressing election-related dis/misinformation on its platform, and what are their roles? How does this compare to the number of employees who were dedicated to these efforts for the 2016, 2018, 2020, and 2022 election cycles?
4. What role have your new artificial intelligence algorithms played in identifying and removing election-related dis/misinformation? Please describe the effectiveness of these policies and procedures and the metrics used to measure their effectiveness.
5. Do current employees at your company dedicated to addressing election-related dis/misinformation on its platform have any oversight over your new artificial intelligence algorithms?