Marc Owen Jones on disinformation, misogyny and Twitter’s future under Musk
In December 2016 Edgar Maddison Welch walked into a popular pizza restaurant in Washington DC and fired three shots from an AR-15-style rifle. Welch later said he wanted to save children he believed were locked in the basement and being abused by powerful Democrats, including Hillary Clinton.
The shooting, driven by the conspiracy theory known as Pizzagate, became a warning about how online disinformation can spread and lead to violence. In 2020 an Ipsos poll found that 17 percent of Americans believed that a group of Satan-worshiping elites running a child sex-trafficking ring was trying to control our politics and media.
From reports that Palestinian journalist Shireen Abu Akleh was killed by a Palestinian, to tweets suggesting former Labour leader Jeremy Corbyn was a terrorist sympathizer, many false claims are spread online and widely believed, says Marc Owen Jones, Associate Professor of Middle Eastern Studies and Digital Humanities.
States including Saudi Arabia, a digital superpower that has used social media to deflect criticism of its foreign policy, including the war on Yemen and the killing of Jamal Khashoggi, have weaponized the spread of fake news and disinformation.
“These Saudi disinformation campaigns, often spread by bots and trolls, are targeted domestically, regionally and internationally,” Jones tells MEMO. “At home they are designed to stifle critical conversation and silence potential critics.”
“Regionally they are focused on targeting foreign institutions they consider unfriendly or critical – journalists or news channels. Internationally they are focused on dissidents or people considered critical of the government – such as Jeff Bezos.”
Among the targets of the Saudi government are women who criticize Crown Prince Mohammed Bin Salman, known as MBS, Jones wrote in his book, Digital Authoritarianism in the Middle East. One of the most prominent cases was the 2020 attack on Al Jazeera anchor Ghada Oueiss, in which photos were stolen from her phone and doctored to make it appear that she was nude in a hot tub.
The images were shared in thousands of tweets from accounts bearing images of MBS and the Saudi flag. Eight months later, Oueiss filed a lawsuit against Bin Salman and several other officials, saying she had been targeted for her critical coverage of human rights abuses in the Gulf state.
The attack came against a backdrop of long-standing online misogyny, with social media influencers such as Andrew Tate telling millions of followers that women belong in the home and are men’s property.
“Leaders who endorse misogynistic ideas only normalize and legitimize the behavior of other influential people. They feed each other in a mutually reinforcing strategy that creates space for violent and misogynistic speech,” Jones tells MEMO.
“Many feel that because they have hidden feelings that are shared by other influential people, those feelings are acceptable or to be encouraged, when in reality they are a major problem. Misogyny has always been there; it seems it’s becoming more of a movement.”
Over the past twenty years, social media companies have grown and developed, and now operate on a global scale. They have expanded, says Jones, under the guise of spreading freedom. But at the same time, hate speech has been growing on Twitter.
“Now they have created a huge system that cannot be moderated effectively without reducing its reach, which would be a problem for their revenue. They rely on AI systems, instead of human moderators, to regulate content – a very imperfect system.”
“Their investment in different languages is trivial, and above all, they sell engagement. Their incentives to control hate speech are political and depend on who is speaking. The will may be there, but it is not a priority, and the ‘easy access’ model of social media works against a carefully moderated platform that pays attention to the many global forms of hate speech.”
In 2022, tech billionaire Elon Musk bought Twitter for $44 billion, leaving critics wondering what would happen to the trolls, misogynistic profiles and state-coordinated disinformation.
“It will get worse,” says Jones. “Musk is already monetizing access to Twitter’s API, making it harder for researchers to hold Twitter accountable. The paid model means that bad actors can have their content algorithmically promoted by subscribing to Twitter Blue.”
“The sheer scale of the cuts to the teams responsible for addressing safety and disinformation means less ability to fight bots and trolls. In short, Twitter will be a more powerful hate speech and disinformation delivery system, with more Musk content.”