In 2019, two users joined Facebook. Both had similar interests: young children and parenting, Christianity, civics and community.
"Carol," 41, was a conservative from North Carolina. She was interested in news, politics, then-President Donald Trump and the nation's first family. She followed the official accounts for Trump, first lady Melania Trump and Fox News.
"Karen" was the same age and lived in the same state. But she was a liberal who liked politics, news, and Sens. Bernie Sanders and Elizabeth Warren. She disliked Trump. She followed a local news site, pages about North Carolina and the liberal advocacy group MoveOn.
Facebook's algorithms got to work, suggesting what they'd be interested in.
Accepting recommendations for sites supportive of Trump led Carol to suggestions for a site called “Donald Trump is Jesus,” and another for QAnon, a wide-ranging extremist ideology that alleges celebrities and top Democrats are engaged in a pedophile ring. Karen was presented with anti-Trump pages, including one that posted an image showing an anus instead of Trump's mouth.
The two women were not real. They were created by a Facebook researcher to explore how the social media platform deepened political divides in the U.S. by recommending content rife with misinformation and extremism.
The experiment shows that Facebook, which had 2.9 billion monthly active users as of June 30, knew before the 2020 presidential election that its automated recommendations amplified misinformation and polarization in the U.S., yet the company largely failed to curtail its role in deepening the political divide.
Reports describing the experiments are among hundreds of documents disclosed to the Securities and Exchange Commission and provided to Congress in redacted form by attorneys for Frances Haugen, a former Facebook employee. The redacted versions were obtained by a consortium of 17 news organizations, including USA TODAY.
Jose Rocha said he's experienced the divisiveness firsthand.
A military veteran who grew up in a Democratic, pro-union family in Selah, Washington, Rocha said Facebook normalized racist views and led him down a rabbit hole to far-right ideologies.
For a time, Rocha said, he became a Nazi sympathizer and a backer of other extremist views – behavior he now blames on Facebook's recommendations system.
"I wouldn't have even known they existed if it wasn't for Facebook. So I wouldn't have went out seeking them," said Rocha, 27.
Bill Navari, 57, a conservative sports commentator from Pittsburgh, Pennsylvania, said a cousin blocked him on Facebook after he suggested she get her TDS ("Trump derangement syndrome") checked.
“I’ve seen people on Facebook saying, ‘If you are voting for Trump, unfriend me.' But I didn’t see anyone saying, ‘If you are voting for Biden, unfriend me,'” he said. “Facebook has become like oil and water, and never the two shall meet.”
These days, he steers clear of political debates on Facebook.
“I’ll post pics of my family, of my dog, where we went on vacation, and I stay in touch with the friends and family. But posting a meme or putting something on Facebook, it’s not going to change anyone’s mind,” he said. “I just think the conversation has become so coarse.”
Is Facebook to blame? “I don’t like pointing fingers without direct knowledge,” he said. “But I do think that Facebook is a party to this.”
The internal Facebook documents show how swiftly its recommendation algorithms can amplify polarization by sending users to content that's rife with misinformation and extremism.
The company's experiment with the hypothetical conservative user was called "Carol's Journey to QAnon." Within five days of going live on June 2, 2019, the user was barraged by "extreme, conspiratorial and graphic content," the researcher wrote.
One of the recommendations included an image labeling former President Barack Obama a "traitor" with a caption that read, "When we're done he'll claim Kenyan citizenship as a way to escape." (Despite racist claims to the contrary, Obama is a U.S. citizen.)
The report on the fictitious liberal user was called "Karen and the Echo Chamber of Reshares." That account went live on July 20, 2019. Within a week, Facebook's recommendations pivoted to "all anti-Trump content." Some recommendations came from a small Facebook group that had been flagged for "promoting illegal activity," the Facebook researcher wrote.
One image served to Karen showed then-first lady Melania Trump's face superimposed on the body of a bikini-clad woman kneeling on a bed. The caption read, "Melania Trump: Giving evangelicals something they can get behind."
Facebook whistleblower Frances Haugen appears before the Senate Commerce, Science, and Transportation Subcommittee at the Russell Senate Office Building on Oct. 5, 2021, in Washington, D.C. Haugen left Facebook in May and provided internal company documents about Facebook to journalists and others, alleging that Facebook consistently chooses profits over safety.
Haugen, the former Facebook employee who has blown the whistle on the company, is a former product manager who worked on Facebook’s Civic Integrity team, focusing on elections. She had a front-row seat to the most divisive political events in recent memory, including the Jan. 6 insurrection in which Trump supporters tried to block Congress from certifying Joe Biden's win in the presidential election.
Concerned that Facebook was prioritizing profits over the well-being of its users, Haugen reviewed thousands of documents over several weeks before leaving the company in May.
The documents, some of which have been the subject of extensive reporting by The Wall Street Journal and CBS News' "60 Minutes," detail company research showing that toxic and divisive content is prevalent in posts boosted by Facebook and shared widely by users.
"I saw Facebook repeatedly encounter conflicts between its own profits and our safety. Facebook consistently resolves these conflicts in favor of its own profits," Haugen alleged during a Senate hearing this month. "The result has been more division, more harm, more lies, more threats, and more combat."
Haugen has called on Facebook to limit its practice of prioritizing content that has drawn shares and comments from many users.
She has sought federal whistleblower protection from the SEC, alleging that Facebook, a publicly traded company, misled investors. She could get a financial award if the SEC were to penalize the company.
Jessica Guynn and Kevin McCoy, USA TODAY