Opinion | OMG, Will Disinformation Drive the Elections in 2024?
WhatsApp, Facebook, X (formerly Twitter) and Instagram feed disinformation from one platform into another, targeting different age groups. They have failed to self-regulate, and their lobbying has run circles around Indian efforts to rein them in

The 2024 Lok Sabha elections will not only be the biggest, they will also be the toughest, because for the Opposition parties success means survival. Moreover, our neighbour China has taken a very aggressive stance against the current government and its policies. The biggest fear in next year’s elections is therefore interference by China and other foreign powers using AI, social media, and WhatsApp to build disinformation narratives, disrupt the selection of candidates, create social unrest, and derail the entire election process.

Desperate times lead to desperate measures. Besides the Opposition parties, China and Pakistan are equally desperate. Moreover, AI has made disinformation so easy and cheap to produce and circulate that even a small PR agency with limited manpower can do it.

A chance to manage or disrupt political stability in an enemy state is the stuff of statecraft that intelligence officers dream about. Earlier, disrupting an election involved cash, lots of feet on the ground, and the ability to engineer a riot or a rout for the right candidate. Its impact was limited to a constituency, a region, or at most a state. Now, technology, social media, and artificial intelligence offer tools that allow such disruptions to be engineered across the country from the comfort of an office in Beijing or Toronto.

This Dussehra, an AI-generated anchor, Zania, was greeting people on WhatsApp groups. Now imagine a video of a political leader exhorting a particular community to riot, as happened recently in Nuh, but this time created with Chinese funds by an Indian agency using AI.

In my conversations with intelligence officers, both serving and retired, about the weaponisation of disinformation during elections, they were unanimous that it is the biggest known risk. While Indian intelligence agencies will do what they can, there is much they cannot do.

Disinformation spreads faster than truth, and there is ample research on why we tend to share it more readily and how the credibility lent by each share adds to its virality and impact. Cass Sunstein highlighted the impact of social media on democracy in his seminal work #Republic, building on arguments he first made in Republic.com as far back as 2001. The blurb for the book read, “See only what you want to see, hear only what you want to hear, read only what you want to read….Our power to filter decreases exponentially if the content matches our beliefs.”

With WhatsApp groups becoming the sole arbiters of information, everything gets shared as fact. Whether it is a Facebook post, a tweet, a random post on some medium, or an op-ed written on a website, nobody seems to exercise the basic judgement of whether it is factual or disinformation before sharing it.

People then exercise their immense capacity to comment and opine, because everyone has fingers and a keypad but very little judgement over their reactions. Most WhatsApp groups are, by their very design and membership, deep echo chambers, so sharing is based entirely on what members already believe. This is where disinformation thrives and grows into a narrative that shapes decisions such as choosing a candidate in an election.

In a new book, ‘Falsehoods Fly’, to be launched in February 2024, Paul Thagard, Distinguished Professor at the University of Waterloo, argues that motivated inference drives the acceptance of disinformation as truth. Everyone is susceptible to motivated inference. Thagard cites examples from sports: even when a team has been losing, its fans still believe it will win today, only to be surprised and disappointed yet again. They ignore the evidence, the trend of poor performance, and the genuine problems with the team.

“Motivated inference is not just simpleminded wishful thinking, in that motivations do not lead directly to beliefs. Rather, the goals lead to acquiring and considering only selective information, so that the evidence that makes it reasonable to maintain an emotion-based belief is questionable.”

Motivated inference also happens with stock market investors: ‘This time is different; this is not a bubble, it is the New Economy, and I get it, but all these naysayers do not really understand it.’

This motivated inference can play havoc in an election season if it is fed disinformation by foreign powers looking to disrupt an election. According to the US Intelligence Community’s assessment of foreign threats to the 2020 US elections, Russian President Putin authorised a multi-pronged influence campaign to denigrate the Democratic Party and Joe Biden’s candidacy specifically. The Russian efforts attempted to influence voters through disinformation and unsubstantiated allegations against Biden, as captured in the report of the US Office of the Director of National Intelligence (ODNI) on interference in the 2020 elections. The allegations against Biden were repeated by both traditional and social media until, in the end, as the line attributed to Goebbels goes, a falsehood repeated long enough becomes the truth.

The same 2021 assessment notes that Iran specifically used social media propaganda during civil unrest in the US, such as the Black Lives Matter protests and other demonstrations. On Iran, the ODNI report said, “We assess that Iran carried out an influence campaign during the 2020 US election season intended to undercut the re-election prospects of former President Trump and to further its longstanding objectives of exacerbating divisions in the US, creating confusion, and undermining the legitimacy of US elections and institutions. Tehran’s efforts were aimed at denigrating former President Trump, not actively promoting his rivals. Iran’s efforts in 2020—especially its emails to individual US voters and efforts to spread allegations of voter fraud—were more aggressive than in past election cycles.”

The reason the US intelligence community is able to detect and describe these attempts by foreign intelligence agencies to interfere in US elections is that it has access to every email, post, and video posted on these social media platforms. Every internet company based in the US has to comply with requests for data, including the origin of every post. The US has always trawled and tracked every bit of data on the internet for its own benefit.

These social media companies do not give Indian intelligence agencies the same access to that data, citing all kinds of legalese, including freedom of expression. India banned TikTok and most Chinese social media apps, but they still have a presence in India through their investments in Indian companies that control content. Some of these platforms have dropped ‘news’ from their names, but they continue to play a role in perpetuating disinformation.

Meta-owned WhatsApp, Facebook and Instagram, and the Elon Musk-owned X (formerly Twitter), form a quartet that feeds disinformation from one platform into another, targeting different age groups. These platforms routinely cite the number of times the Indian government has asked them to take down content, but they are rarely transparent about how often disinformation has circulated across their platforms. Nor are they transparent about how they use or share information about users from one platform to another.

For instance, have you seen an advertisement on Facebook for something you had just searched for in your browser? That is cookies gleaning your data from the browser and feeding it back to Meta. Is the same being done with WhatsApp conversations? We do not know, and nobody knows, but these companies will certainly not proactively share information about disinformation campaigns by foreign governments with Indian intelligence agencies.

This lacuna has to be addressed through policy if we are to keep our institutions safe. Social media may have decimated traditional media, but what if it destroys the election process itself? The Digital India Bill addresses this issue, but only partly, by placing responsibility for content back on the platforms; that is not enough. There has to be debate, discussion, and decision on how to curb disinformation on these platforms. They have failed to self-regulate, and their lobbying has run circles around Indian efforts to rein them in.

Hiding behind the garb of freedom of expression and posing as media entities while avoiding any responsibility of a media organisation is the worst abuse these lobbyists have committed. It is now up to the Digital India Bill to reintroduce exceptional-circumstances access, under which Indian intelligence agencies, acting under the direction of the Election Commission of India, are given access to the origin of disinformation on these platforms.

K Yatish Rajawat is a public policy researcher and works at the Gurgaon-based think and do tank Centre for Innovation in Public Policy (CIPP). Views expressed in the above piece are personal and solely that of the author. They do not necessarily reflect News18’s views.
