Deepfake narratives
THE disturbing trend of malicious deepfake narratives in the run-up to elections is not peculiar to India. The use of deep learning, a form of artificial intelligence, to fabricate images of events that never took place is a worldwide concern. Doctored content, typically a realistic fake video, is presented as fact to alter public perception. Fictitious information, including controversial or hateful statements intended to play upon political divisions, gets circulated widely. Even mainstream news sources, in their zeal to score over rivals, often fall for such propagandist misinformation and skip fact-checking. By the time retractions or apologies appear, large numbers of people may already believe the falsehood because it came from a trustworthy source. Chief Election Commissioner (CEC) Rajiv Kumar, by flagging the potential of fake social media narratives to undermine free and fair elections, has highlighted an issue that needs serious attention.
The Election Commission, with its track record of conducting polls successfully under the most trying conditions and managing mind-boggling logistics, is uniquely placed to come up with solutions. The primary task is to make all stakeholders recognise the damage that the misuse of technology by disruptive elements can cause and its capacity to erode trust in the fairness of the electoral process. Social media intermediaries, as the CEC pointed out, have the capability to detect deepfakes proactively, especially when electoral cycles are definite and announced well in advance. Intervention is also needed to deny fake content any prominence in platform search results. Another imperative could be to lay down ground rules that give the poll panel in-house mechanisms to put the brakes on fake content.
Strategies to prevent the spread of false information must be backed by mass-scale public awareness. A global debate is under way on regulating artificial intelligence; the only question is how to go about it. As efforts continue to develop effective detection and blocking systems, it would be prudent to consider the proposal to make it a crime to create and distribute a deepfake without a digital marker disclosing the modification.