Summary: Mounting Allegations That Facebook and Zuckerberg Show Political Bias and Favoritism Toward Trump and Conservatives in Content Moderation

In the past week, more allegations surfaced that Facebook executives have intervened in questionable ways in the company's content moderation process, showing favoritism toward Donald Trump, Breitbart, and other conservatives. These news reports cut against the narrative that Facebook has an "anti-conservative bias." For example, according to some allegations, Facebook executives did not want to enforce the existing community standards, or change those standards, in any way that would flag conservatives for violations, even when content moderators found violations by conservatives. Below is a summary of the main allegations that Facebook has been politically biased in favor of Trump and conservatives. This page will be updated if more allegations are reported.

Ben Smith, How Pro-Trump Forces Work the Refs in Silicon Valley, N.Y. Times (Aug. 9, 2020): "Since then, Facebook has sought to ingratiate itself to the Trump administration, while taking a harder line on Covid-19 misinformation. As the president’s backers post wild claims on the social network, the company offers the equivalent of wrist slaps — a complex fact-checking system that avoids drawing the company directly into the political fray. It hasn’t worked: The fact-checking subcontractors are harried umpires, an easy target for Trump supporters’ ire....In fact, two people close to the Facebook fact-checking process told me, the vast bulk of the posts getting tagged for being fully or partly false come from the right. That’s not bias. It’s because sites like The Gateway Pundit are full of falsehoods, and because the president says false things a lot."

Olivia Solon, Sensitive to claims of bias, Facebook relaxed misinformation rules for conservative pages, NBC News (Aug. 7, 2020, 2:31 PM): "The list and descriptions of the escalations, leaked to NBC News, showed that Facebook employees in the misinformation escalations team, with direct oversight from company leadership, deleted strikes during the review process that were issued to some conservative partners for posting misinformation over the last six months. The discussions of the reviews showed that Facebook employees were worried that complaints about Facebook's fact-checking could go public and fuel allegations that the social network was biased against conservatives. The removal of the strikes has furthered concerns from some current and former employees that the company routinely relaxes its rules for conservative pages over fears about accusations of bias."

Craig Silverman, Facebook Fired an Employee Who Collected Evidence of Right-Wing Page Getting Preferential Treatment, Buzzfeed (Aug. 6, 2020, 4:13 PM): "[S]ome of Facebook’s own employees gathered evidence they say shows Breitbart — along with other right-wing outlets and figures including Turning Point USA founder Charlie Kirk, Trump supporters Diamond and Silk, and conservative video production nonprofit Prager University — has received special treatment that helped it avoid running afoul of company policy. They see it as part of a pattern of preferential treatment for right-wing publishers and pages, many of which have alleged that the social network is biased against conservatives." Further: "Individuals that spoke out about the apparent special treatment of right-wing pages have also faced consequences. In one case, a senior Facebook engineer collected multiple instances of conservative figures receiving unique help from Facebook employees, including those on the policy team, to remove fact-checks on their content. His July post was removed because it violated the company’s 'respectful communication policy.'”

Ryan Mac, Instagram Displayed Negative Related Hashtags for Biden, but Hid them for Trump, Buzzfeed (Aug. 5, 2020, 12:17 PM): "For at least the last two months, a key Instagram feature, which algorithmically pushes users toward supposedly related content, has been treating hashtags associated with President Donald Trump and presumptive Democratic presidential nominee Joe Biden in very different ways. Searches for Biden also return a variety of pro-Trump messages, while searches for Trump-related topics only returned the specific hashtags, like #MAGA or #Trump — which means searches for Biden-related hashtags also return counter-messaging, while those for Trump do not."

Ryan Mac & Craig Silverman, "Hurting People at Scale": Facebook's Employees Reckon with the Social Network They've Built, Buzzfeed (July 23, 2020, 12:59 PM): Yaël Eisenstat, Facebook's former election ads integrity lead "said the company’s policy team in Washington, DC, led by Joel Kaplan, sought to unduly influence decisions made by her team, and the company’s recent failure to take appropriate action on posts from President Trump shows employees are right to be upset and concerned."

Elizabeth Dwoskin, Craig Timberg, & Tony Romm, Zuckerberg once wanted to sanction Trump. Then Facebook wrote rules that accommodated him., Wash. Post (June 28, 2020, 6:25 PM): "But that started to change in 2015, as Trump’s candidacy picked up speed. In December of that year, he posted a video in which he said he wanted to ban all Muslims from entering the United States. The video went viral on Facebook and was an early indication of the tone of his candidacy....Ultimately, Zuckerberg was talked out of his desire to remove the post in part by Kaplan, according to the people. Instead, the executives created an allowance that newsworthy political discourse would be taken into account when making decisions about whether posts violated community guidelines....In spring of 2016, Zuckerberg was also talked out of his desire to write a post specifically condemning Trump for his calls to build a wall between the United States and Mexico, after advisers in Washington warned it could look like choosing sides, according to Dex Torricke-Barton, one of Zuckerberg’s former speechwriters."  

Regarding election interference: "Facebook’s security engineers in December 2016 presented findings from a broad internal investigation, known as Project P, to senior leadership on how false and misleading news reports spread so virally during the election. When Facebook’s security team highlighted dozens of pages that had peddled false news reports, senior leaders in Washington, including Kaplan, opposed shutting them down immediately, arguing that doing so would disproportionately impact conservatives, according to people familiar with the company’s thinking. Ultimately, the company shut down far fewer pages than were originally proposed while it began developing a policy to handle these issues."

Craig Timberg, How conservatives learned to wield power inside Facebook, Wash. Post (Feb. 20, 2020, 1:20 PM): "In a world of perfect neutrality, which Facebook espouses as its goal, the political tilt of the pages shouldn’t have mattered. But in a videoconference between Facebook’s Washington office and its Silicon Valley headquarters in December 2016, the company’s most senior Republican, Joel Kaplan, voiced concerns that would become familiar to those within the company. 'We can’t remove all of it because it will disproportionately affect conservatives,' said Kaplan, a former George W. Bush White House official and now the head of Facebook’s Washington office, according to people familiar with the meeting who spoke on the condition of anonymity to protect professional relationships."

Related articles about Facebook

Ben Smith, What's Facebook's Deal with Donald Trump?, N.Y. Times (June 21, 2020): "Mr. Trump’s son-in-law, Jared Kushner, pulled together the dinner on Oct. 22 on short notice after he learned that Mr. Zuckerberg, the Facebook founder, and his wife, Priscilla Chan, would be in Washington for a cryptocurrency hearing on Capitol Hill, a person familiar with the planning said. The dinner, the person said, took place in the Blue Room on the first floor of the White House. The guest list included Mr. Thiel, a Trump supporter, and his husband, Matt Danzeisen; Melania Trump; Mr. Kushner; and Ivanka Trump. The president, a person who has spoken to Mr. Zuckerberg said, did most of the talking. The atmosphere was convivial, another person who got an account of the dinner said. Mr. Trump likes billionaires and likes people who are useful to him, and Mr. Zuckerberg right now is both."

Deepa Seetharaman, How a Facebook Employee Helped Trump Win--But Switched Sides for 2020, Wall St. J. (Nov. 24, 2019, 3:18 PM): "One of the first things Mr. Barnes and his team advised campaign officials to do was to start running fundraising ads targeting Facebook users who liked or commented on Mr. Trump’s posts over the past month, using a product now called 'engagement custom audiences.' The product, which Mr. Barnes hand-coded, was available to a small group, including Republican and Democratic political clients. (The ad tool was rolled out widely around Election Day.) Within the first few days, every dollar that the Trump campaign spent on these ads yielded $2 to $3 in contributions, said Mr. Barnes, who added that the campaign raised millions of dollars in those first few days. Mr. Barnes frequently flew to Texas, sometimes staying for four days at a time and logging 12-hour days. By July, he says, he was solely focused on the Trump campaign. When on-site in the building that served as the Trump campaign’s digital headquarters in San Antonio, he sometimes sat a few feet from Mr. Parscale. The intense pace reflected Trump officials’ full embrace of Facebook’s platform, in the absence of a more traditional campaign structure including donor files and massive email databases."

What is Parler? An "unbiased social media"? Or a platform for conservative Republicans?

Parler (French for "to talk") is a social media platform started in 2018. Its mission is to be "an unbiased social media focused on real user experiences and engagement." It is touted as an alternative to Twitter that allows users to post content and comment as on Twitter, but without political bias. Many Republican politicians who believe Twitter is biased against conservatives have migrated to Parler and are promoting it. After Twitter and Snapchat recently moderated some of Donald Trump’s posts that violated their community standards, some conservative Republicans switched to Parler. Ted Cruz joined Parler, as did Republican politicians Jim Jordan, Elise Stefanik, and Nikki Haley, as CNBC reported. Parler may become Republican lawmakers' and Trump's favorite social media site. Trump’s campaign manager Brad Parscale accused Twitter and Facebook of biased censorship and stated that the campaign team may select an alternative platform, such as Parler, as reported by the Wall Street Journal. Parler ranked as the top news app in Apple’s App Store and had 1.5 million users in 2020. By comparison, Twitter has over 145 million active users.

Content moderation by Internet platforms has become a hot-button issue. In the past, platforms took permissive approaches in the name of free speech, but they soon realized the need to moderate some objectionable content posted by their users. Most people would agree that, despite the importance of free expression and the free flow of information, allowing everyone to post anything online can lead to false, illegal, and harmful content being shared. So Internet companies must exercise some moderation of user content, but the unsolved puzzle is what the standards should be and who should decide them.

Touted by Republicans, Parler attracted many new users in the past few days. However, some users realized that the newly hyped platform was not free of content moderation. Besides restricting the commonly prohibited content outlined in Parler’s Community Guidelines, such as spam, fighting words, pornography, and criminal solicitation, Parler also makes clear in its User Agreement: “Parler may remove any content and stop your access to the Services at any time and for any reason or no reason, although Parler endeavors to allow all free speech that is lawful and does not infringe the legal rights of others … Although the Parler Guidelines provide guidance to you regarding content that is not proper, Parler is free to remove content and terminate your access to the Services even where the Guidelines have been followed.”

Some liberal users were reportedly banned from Parler. Techdirt compiled a list of some of the users who were banned. Parler's banning of liberal users does not appear to be consistent with its motto as an "unbiased social media." Even some conservative commentators criticized Parler for not abiding by its privacy policy when it asked users for a driver's license. The goal of a politically unbiased Internet platform may be a worthy one, but it remains to be seen whether Parler provides such a space.

--written by Candice Wang

 
