Asad Mirza
Recent allegations against Facebook force us to ponder the real benefits of social media, and how to keep such platforms from becoming toxic.
The details published by The Washington Post about the spread of Covid vaccine misinformation on Facebook also outline other irregularities committed by the tech giant. A consortium of media organisations, including The Washington Post, reviewed the information, which was disclosed by the lawyers of Ms Frances Haugen, a former Facebook employee, to the US Securities and Exchange Commission and Congress.
These documents included the company’s internal communications on various aspects of running the app and monitoring content. The Wall Street Journal had previously reported on some of the pandemic-related revelations in the papers, including how Facebook struggled to police anti-vaccine comments.
The controversy
According to the information available, Facebook researchers allegedly had deep knowledge of how coronavirus and vaccine misinformation moved through the company’s apps. The company itself allegedly conducted multiple studies and produced large internal reports on which kinds of users were most likely to share falsehoods about the deadly virus.
However, this information was allegedly not shared promptly or properly with academics and lawmakers. Though the White House urged Facebook for months to be more transparent about the misinformation and its effects on users’ behaviour, the company refused to share much of this information publicly, resulting in a showdown with the Biden administration.
Taken together, the documents underline just how extensively Facebook was allegedly studying coronavirus and vaccine misinformation on its platform as the virus tore across the world, unearthing findings that concerned its own employees.
Yet, in public blog posts and in congressional testimony, executives focused on more positive aspects of the social network’s pandemic response, such as how many pieces of misinformation it had taken down and its work to help its users find vaccine clinics near them.
The disconnect between what was known internally and what was shared publicly bolsters demands from lawmakers, who are increasingly amenable to proposals to force greater transparency from tech giants, with some directly supporting Haugen’s assertion that a separate regulatory body is needed to study algorithms and internal research and keep an eye on the social media platforms.
Facebook’s influence
How Facebook, which is particularly popular with older Americans, has affected perceptions around vaccines has been a key part of the discussion about the pandemic. In July, as the disease’s delta variant triggered a massive new surge and the rate of new vaccinations levelled off, the White House began placing some of the blame on social media channels. President Biden told reporters that Facebook was “killing people”, though he later backed off the comment.
Haugen left Facebook in May, before the public fight between the company and the White House, so it’s unclear what kind of information the company had during that time. But employees were looking into the issue well before the public spat.
Employees noted that algorithmic dynamics created “self-reinforcing” feedback loops, where vaccine-sceptical posts were overwhelmingly supported in the comments and reactions. Anti-vaccine comments got boosted with many “like” and “love” reactions, while people who posted pro-vaccine content were met with derision in the form of “angry,” “sad” or “haha” reactions, the researchers wrote.
Among other things, the reports have claimed that Facebook allegedly sat on research that showed Instagram harmed teenage mental health, and struggled to remove hate speech from its platforms outside the US.
In addition, Frances Haugen, who is currently in the UK, has told British MPs that Facebook is “unquestionably making hate worse”, as the MPs consider what new rules to impose on big social networks.
Speaking to the Online Safety Bill Committee in London, Ms Haugen said Facebook’s safety teams were under-resourced, and also warned that Instagram was “more dangerous than other forms of social media”.
She said that while other social networks were about performance, play, or an exchange of ideas, “Instagram is about social comparison and about bodies… about people’s lifestyles, and that’s what ends up being worse for kids.” She added that Facebook’s own research allegedly described one problem as “an addict’s narrative” – where children are unhappy and cannot control their use of the app, but feel they cannot stop using it.
The Online Safety Bill Committee is fine-tuning a proposed law that will place new duties on large social networks and subject them to checks by the media regulator Ofcom in the UK.
Ms Haugen also urged the committee to include paid-for advertising in its new rules, saying the current system was “literally subsidising hate on these platforms” because of their algorithmic ranking. She further urged MPs to require a breakdown of who is harmed by content, rather than an average figure, suggesting Facebook is “very good at dancing with data” but pushes people towards “extreme content”.
Ms Haugen also warned that Facebook was unable to police content in multiple languages around the world – something which should worry UK officials, she said. Dangerous misinformation in other languages, she added, also affects people in Britain and the US. “Those people are also living in the UK, and being fed misinformation that is dangerous, that radicalises people,” she warned.
“When I worked on counter-espionage, I saw things where I was concerned about national security, and I had no idea how to escalate those because I didn’t have faith in my chain of command at that point,” she told the committee.
Similar problems plague Facebook’s Oversight Board, which can overturn the company’s decisions on content, she said. She repeated her claim that Facebook has repeatedly lied to its own watchdog, and said this is a “defining moment” for the Oversight Board to “step up”.
This is not the first time that Facebook has faced criticism and allegations of interfering in other countries’ internal politics. In the past it has been accused of interference in two US presidential elections and in parliamentary elections in the UK and India, and questioned over the role it played during last year’s Delhi riots.
Facebook’s future
Meanwhile, amid reports that Facebook has earned $9bn this year despite the whistleblower scandal, came the news that the company is changing its name to Meta. Zuckerberg says he has chosen Meta because of its meaning in Greek – “beyond”. It also alludes to the “Metaverse”, an online virtual oasis that he wants to build.
Critics believe Facebook has done this because the brand has become toxic. Further, Zuckerberg seems more interested in creating virtual worlds, which he thinks will transform the human experience. However, we will have to wait and see whether people go along with it.
Meanwhile, questions have also begun to be aired about why Facebook alone, amongst all the tech and social media giants, was targeted. Various pro and anti lobbies are being blamed, but, much like the question of whether the Corona pandemic is really a pandemic and not a “plandemic”, these questions might only be answered in the future, by which time we might already have experienced the Metaverse.
(The author is a political commentator based in New Delhi.)