Nations should strive for cognitive security

Insidious: Technology allows malicious actors to perpetrate the manipulation of human behaviour on a scale that is difficult to fathom.

Pukhraj Singh
Cyber-intelligence specialist

ONE of the things social media has taught us is how insidiously it can alter our behaviour and even our perception of reality. Richard Burr, chairman of the US Senate Intelligence Committee, remarked that the barrage of Russian disinformation which influenced the 2016 US elections sowed such divisions that it quite literally nudged Americans into fighting on the streets.

Closer home, we have seen how political polarisation amplified by the internet has led to mob lynchings and communal riots. A perpetual tension has seeped into our living rooms, with ideological disagreements souring relationships.

While the playbook for the manipulation of human behaviour can be traced to the information warfare doctrines of the 19th century, technology now allows malicious actors to perpetrate it on a scale that is difficult to fathom.

By hoovering up personal data from Facebook, Cambridge Analytica psychologically profiled the British populace so accurately that it has been credited with swaying the outcome of the Brexit referendum.

By weaponising psychometrics, a branch of psychology that leverages data science to deduce personality traits, it sent tailored messages to individuals over social media, a strategy called micro-targeting, thereby exacerbating their biases. Like a parasite, it burrowed deep into people's minds to trigger cognitive dissonance: the mental conflict that occurs when our beliefs are challenged by new information, one that suppresses rational behaviour with an emotional response.
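
To see what micro-targeting can look like in the simplest terms, consider the Python sketch below. It is purely illustrative: the Big Five trait scores, the message variants and their weights are all invented, and real systems are far more elaborate.

```python
# Hypothetical sketch of micro-targeting: score a user's inferred
# personality traits, then pick the message variant predicted to
# resonate most. All numbers here are invented for illustration.

# Inferred Big Five profile (0.0-1.0 per trait), of the kind a
# psychometric model might derive from social-media activity.
profile = {"openness": 0.2, "conscientiousness": 0.7,
           "extraversion": 0.4, "agreeableness": 0.3, "neuroticism": 0.8}

# Each message variant carries weights for the traits it targets.
variants = {
    "fear_framing":   {"neuroticism": 1.0, "agreeableness": -0.5},
    "status_framing": {"extraversion": 0.8, "openness": 0.4},
    "duty_framing":   {"conscientiousness": 0.9},
}

def resonance(weights, profile):
    # Dot product of message weights and trait scores.
    return sum(w * profile.get(trait, 0.0) for trait, w in weights.items())

best = max(variants, key=lambda v: resonance(variants[v], profile))
print(best)  # -> 'fear_framing' for this anxious, low-agreeableness profile
```

Scale this up to millions of profiles and thousands of message variants, and the mechanics of exacerbating biases become clear.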

SOFWERX, a think tank of the US Special Operations Command, recently held an event to debate radical ideas on ‘Countering Weaponised Information’. One of the takeaways was that we are merely scratching the surface when it comes to mass behavioural modelling. The Russian influence operations and Cambridge Analytica’s machinations were just trailers.

Dr David Perlman, who studied applied physics at Caltech and electrical engineering at the University of Washington, now pursues his interest in cognitive science through a doctorate at the University of Wisconsin-Madison.

As one of the speakers, he asserted that the psycho-social biases underpinning information warfare are so consistent that they affect everyone, even those who strive hardest to be objective or rational.

He advised us to cultivate a 'bias hygiene' by beginning with the assumption that we are already biased. Perlman recommended that we 'red-team' our own brains, borrowing the hacking methodology in which cyber defences are tested through simulated offence.

As Perlman explained, there is indeed some method to the madness of the trolls and other fringe elements that poison social media. Using vicious and divisive propaganda, they aim to stretch the Overton Window, which defines the informal boundaries of acceptable public discourse.

With a 'door in the face' instead of a 'foot in the door', that is, by jumping straight to extreme positions rather than escalating gradually, trolls trick people and the mainstream media into legitimising issues that were earlier deemed inappropriate. A perfect example is how terms like 'anti-national', ghar wapsi and 'love jihad' gained wide visibility and amplification as the mainstream media tried to scrutinise them. It walked into a well-laid trap.

Unfortunately, any government effort to debunk misinformation may fall flat, as counter-propaganda is itself a form of propaganda. Nor can fact-checking come to our rescue: a study by MIT's Laboratory for Social Machines shows that, on Twitter, lies travel much faster than the truth.

Perlman summed it up: “This multidimensional space [of data-driven behavioural modelling] is the battlefield… this abstract space of ideas. Adversaries are now able to visualise at that level.”

Sara-Jayne Terp, a 'data nerd' of savant-like brilliance who earlier headed the United Nations' big data division, is now working on the Global Disinformation Index. Speaking at the event, she pointed out that of the 20 or so 'attack surfaces' of the brain, malicious actors have so far weaponised only a few. Terp also remarked on our obsession with fake news, an imprecise parameter for validating online content: 'fake-ness' is contextual, topical and time-dependent.

She concluded on a positive note that when it comes to cyber-enabled information warfare, we have just begun to give structure to the attackers' Tactics, Techniques and Procedures (TTPs), and hinted that the time-tested frameworks of cyber defence could hold promise for this domain as well.
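
What 'giving structure to TTPs' means in practice can be sketched in a few lines of Python, loosely modelled on cyber-defence frameworks such as MITRE ATT&CK. The tactic and technique labels below are illustrative assumptions, not an official taxonomy.

```python
# Illustrative sketch: describing a disinformation incident in the
# structured tactic/technique vocabulary used for cyber intrusions.
from dataclasses import dataclass, field

@dataclass
class Technique:
    tactic: str                 # attacker objective, e.g. 'amplify'
    technique: str              # concrete method observed
    indicators: list = field(default_factory=list)

@dataclass
class Incident:
    name: str
    techniques: list = field(default_factory=list)

incident = Incident(
    name="hypothetical-election-narrative",
    techniques=[
        Technique("seed", "create sock-puppet accounts",
                  ["burst of new accounts", "recycled stock photos"]),
        Technique("amplify", "coordinated hashtag flooding",
                  ["identical posting times", "retweet rings"]),
    ],
)

# A shared vocabulary lets defenders compare campaigns and reuse
# countermeasures, much as they do for conventional intrusions.
for t in incident.techniques:
    print(f"{t.tactic}: {t.technique}")
```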

The concluding speaker was Wing Commander Keith Dear, a DPhil candidate at Oxford University's Department of Experimental Psychology and a research fellow at Oxford's Changing Character of War Programme. Dear stressed that warfare is fundamentally an act of persuasion; it has always had a cognitive premise. Psychology, therefore, must play a much greater role in the military's operational, tactical and strategic planning.

Dear divulged that Cambridge Analytica-style profiling is now possible not just through social media, but also through passive indicators: how you type, how you speak, even how your eyes move.
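
As a purely hypothetical illustration of one such passive indicator, the Python sketch below derives simple timing features from the timestamps of a typing session. The sample data and feature choices are assumptions, not a description of any real profiling system.

```python
# Keystroke dynamics, minimally: turn raw key-press timestamps
# (in seconds) into timing features a behavioural model could consume.
from statistics import mean, stdev

key_times = [0.00, 0.14, 0.31, 0.42, 0.90, 1.02, 1.19]  # one sample burst

# Gaps between consecutive key presses.
intervals = [b - a for a, b in zip(key_times, key_times[1:])]

features = {
    "mean_interval": mean(intervals),                # typing speed
    "interval_stdev": stdev(intervals),              # rhythm consistency
    "long_pauses": sum(i > 0.4 for i in intervals),  # hesitations
}
print(features)
```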

Citing the latest research, he explained that human minds make probabilistic, predictable and replicable errors that are fully exploitable. By 2020, commercial data brokers alone would hold 5,200 gigabytes of data on every person on earth, the equivalent of roughly 18.5 million books. Using Artificial Intelligence, adversaries could induce mass psychosis, breaking societies and nations.
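
The book equivalence is easy to verify. Assuming a plain-text book of roughly 280 kilobytes (the book size is my assumption, chosen only to test the stated figure), the arithmetic works out:

```python
# Sanity check: 5,200 GB per person versus "18.5 million books".
data_per_person_gb = 5_200
book_size_kb = 280                     # assumed size of a plain-text book

books = data_per_person_gb * 1_000_000 / book_size_kb  # GB -> kB -> books
print(f"{books / 1e6:.1f} million books")              # ~18.6 million
```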

Dear strongly recommended that nations should strive for cognitive security, which may become more important than guarding borders or modernising armies.

For India, a lesson from Chris Inglis, former deputy director of the US National Security Agency, holds true: “Diversity beats audacity.” Like the Himalayas, our pluralism offers a natural defence against foreign interference, and it should be fostered rather than tampered with.
