Thousands of pages of internal Facebook documents have been released showing employees’ anger over the incitement of violence on the social network and how Apple threatened to remove the app due to concerns about abuse.
The documents were obtained by former Facebook data scientist turned whistleblower Frances Haugen, who is facing MPs today as they scrutinise the UK government’s plans to crack down on harmful online content.
She gave evidence to the US Senate earlier this month about the danger she says Facebook poses, from harming children to inciting political violence and fuelling misinformation.
Here, Sky News looks at the key revelations from the Facebook documents which were provided to Congress in redacted form by Ms Haugen’s legal team and were obtained by a consortium of news organisations.
• Facebook and Instagram were nearly pulled from Apple’s app store
Apple threatened to pull Facebook and Instagram from its app store over concerns that the platform was being used as a tool to trade and sell maids.
The technology giant issued the threat two years ago, citing examples of pictures of maids and their biographic details showing up online, according to internal Facebook documents seen by the Associated Press.
Even today, a quick search for “khadima,” or “maids” in Arabic, will bring up accounts featuring posed photographs of Africans and South Asians with ages and prices listed next to their images, AP reported.
After Facebook took action and disabled more than 1,000 accounts, Apple apparently dropped its threat a week later.
In a statement, Facebook said it took the problem of maid abuse seriously, despite the continued spread of adverts exploiting foreign workers in the Middle East.
“We prohibit human exploitation in no uncertain terms,” the social media giant said.
“We’ve been combating human trafficking on our platform for many years and our goal remains to prevent anyone who seeks to exploit others from having a home on our platform.”
• Zuckerberg ‘personally decided’ to censor Vietnam’s anti-government dissidents
Mark Zuckerberg faced a choice last year to either comply with demands from Vietnam’s ruling Communist Party to censor anti-government dissidents or risk getting knocked offline in one of Facebook’s most lucrative Asian markets, according to the Washington Post.
Facebook’s chief executive personally decided that the company would comply with Hanoi’s demands, three people familiar with the decision, speaking on condition of anonymity, told the newspaper.
Ahead of Vietnam’s party congress in January, Facebook significantly increased censorship of “anti-state” posts, giving the government near-total control over the platform, according to local activists and free speech advocates.
The social network earns more than $1bn in annual revenue in Vietnam, according to a 2018 estimate by Amnesty International.
Facebook told the Washington Post that the choice to censor is justified “to ensure our services remain available for millions of people who rely on them every day”.
A spokeswoman denied that decisions made by Zuckerberg “cause harm”, saying the claim was based on “selected documents that are mischaracterised and devoid of any context”.
“We have no commercial or moral incentive to do anything other than give the maximum number of people as much of a positive experience as possible,” the spokeswoman said.
• Facebook considered ditching ‘likes’ over impact on young people
Facebook researchers examined what would happen if the company removed “likes” from posts, amid concerns about their effect on young people, according to documents seen by the New York Times.
The 2019 study examined what people would do if Facebook removed the distinct thumbs-up icon and other emoji reactions from posts on its photo-sharing app Instagram, the newspaper reported.
The buttons had caused Instagram’s youngest users “stress and anxiety”, the researchers found, especially if posts failed to get enough likes from friends.
But the researchers discovered that when the like button was hidden, users interacted less with posts and ads.
At the same time, young users did not share more photos, as the company thought they might.
Zuckerberg and other managers discussed hiding the like button for more Instagram users, according to the documents.
But in the end, a wider test was rolled out to only a limited number of users in order to “build a positive press narrative” around Instagram, the New York Times said.
Facebook declined to comment when contacted by Sky News in 2019 about reports it was considering a trial to hide the number of likes that posts receive.
• Outrage of Facebook staff at company’s response to US Capitol attack
Hours after rioters stormed the US Capitol building in January, Facebook’s chief technology officer posted on the company’s internal message board, according to documents seen by NBC News.
“Hang in there everyone,” Mike Schroepfer wrote.
Facebook should allow for peaceful discussion of the riot but not calls for violence, he added.
His post was reportedly met with scathing replies from employees who blamed the company for what was happening.
“I’m struggling to match my values to my employment here,” one staff member wrote in a comment.
“I came here hoping to effect change and improve society, but all I’ve seen is atrophy and abdication of responsibility.”
Another employee asked: “How are we expected to ignore when leadership overrides research-based policy decisions to better serve people like the groups inciting violence today?”
Facebook told NBC News that 83% of its employees say they would recommend it as a great place to work and that it has hired more staff this year than in any previous year.
• Facebook set up ‘war rooms’ to monitor election posts – and placed countries into ‘tiers’ of priority
After Facebook announced where it would invest resources to improve protections around global elections in 2019, the company reportedly sorted the world’s countries into “tiers” of priority.
Brazil, India and the United States were placed in “tier zero,” the highest priority, according to documents seen by The Verge.
Facebook set up “war rooms” to monitor the network continuously and created dashboards to analyse network activity and alert local officials to any problems, the technology website reported.
Germany, Indonesia, Iran, Israel, and Italy were placed in tier one, meaning they would not be given resources for enforcement of Facebook’s rules and for alerts outside the period directly around the elections.
Tier two comprised 22 countries, which would have to go without the war rooms, which Facebook also calls “enhanced operations centres”.
The rest of the world was placed into tier three. This meant Facebook would review election-related material if it was escalated by content moderators but otherwise it would not intervene.
In 2019, Facebook took several journalists to the Dublin “war room” at the heart of its efforts to protect European elections.
• Facebook alarmed by drop in teenage users
Facebook researchers compiled a report in March highlighting how the social network was losing popularity among teenagers and young adults, according to internal documents seen by Bloomberg.
One graphic showed that “time spent” by American teens on Facebook was down 16% year-on-year, the news outlet reported.
Young adults in the US were also spending 5% less time on the social network, it said.
The number of teenagers signing up to the site was declining and young people were also taking much longer to join Facebook than they had in the past, the research reportedly found.
Most people born before 2000 had created a Facebook account by age 19 or 20, the research showed.
But the company was not expecting people born later to join the social network until they were much older, perhaps 24 or 25 years old – if ever.
In relation to the leaked documents, Facebook has said that “a curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us”.
“Internally, we share work in progress and debate options,” a statement said.
“Not every suggestion stands up to the scrutiny we must apply to decisions affecting so many people.”