Deepfake videos are 'targeting your wallets', RBI sounds alarm on growing fraud trend

RBI cautions public against engaging with or falling prey to fake videos 
Photo for representation only.

The Reserve Bank of India (RBI) on Tuesday warned investors about deepfake videos of its governor, Shaktikanta Das, circulating on social media, which falsely claim the launch of, or support for, certain investment schemes by the central bank.

The regulator cautioned the public against engaging with or falling prey to such videos.

What the RBI said


"It has come to the notice of Reserve Bank of India that fake videos of the Governor are being circulated on social media that claim launch of or support to some investment schemes by the RBI.

The videos attempt to advise people to invest their money in such schemes through use of technological tools," an official statement by RBI said.


The RBI said its officials are not involved in, and do not support, any such activities, and that these videos are fake. The central bank does not give any such financial investment advice, it added.

What are deepfake videos?

Deepfakes are fake videos, images, or audio recordings generated by artificial intelligence (AI) to appear real. They are created using a type of machine learning called deep learning. Deepfakes are dangerous because they can spread misinformation, influence people, and threaten privacy and security.

They can be used to swap faces and even manipulate facial expressions. With the rapid advance of AI, it has become difficult to tell a deepfake from a real video, as generative models have grown more efficient and accurate at producing realistic imagery.
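Most face-swap deepfakes rest on a simple architecture commonly described as a shared encoder paired with two person-specific decoders. The sketch below is a toy PyTorch example with hypothetical names and layer sizes, not any real deepfake tool; it is meant only to illustrate the idea: encode a frame of person B, then decode it with person A's decoder so that A's likeness carries B's pose and expression.

```python
# Illustrative sketch only: the "shared encoder, two decoders" face-swap idea.
# All class and variable names here are hypothetical, chosen for explanation.
import torch
import torch.nn as nn

class FaceSwapSketch(nn.Module):
    def __init__(self, latent_dim: int = 128):
        super().__init__()
        # One shared encoder learns a common "face representation".
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 64 * 3, 512), nn.ReLU(),
            nn.Linear(512, latent_dim),
        )
        # Two decoders: each learns to reconstruct one specific person.
        self.decoder_a = nn.Sequential(
            nn.Linear(latent_dim, 512), nn.ReLU(),
            nn.Linear(512, 64 * 64 * 3), nn.Sigmoid(),
        )
        self.decoder_b = nn.Sequential(
            nn.Linear(latent_dim, 512), nn.ReLU(),
            nn.Linear(512, 64 * 64 * 3), nn.Sigmoid(),
        )

    def swap(self, frame_of_b: torch.Tensor) -> torch.Tensor:
        # Encoding person B's frame and decoding with person A's decoder
        # yields person A's likeness driven by B's pose and expression.
        latent = self.encoder(frame_of_b)
        return self.decoder_a(latent).view(-1, 3, 64, 64)

# Example: a random 64x64 image stands in for a video frame of person B.
model = FaceSwapSketch()
frame_of_b = torch.rand(1, 3, 64, 64)
fake_frame_of_a = model.swap(frame_of_b)
print(fake_frame_of_a.shape)  # torch.Size([1, 3, 64, 64])
```

In practice, these models are deep convolutional networks trained on thousands of frames of the target person, which is why the results can be convincing enough to pass for genuine footage.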

Why is this a cause for concern?

The deepfake videos in the RBI's crosshairs claim the launch of, or support for, supposedly RBI-backed investment schemes. They offer investment advice and urge viewers to put their money into these schemes. Such frauds have grown in recent times, with scammers increasingly using deepfakes of well-known people to exploit unsuspecting victims.

Not the first instance

In April this year, the National Stock Exchange (NSE) issued a similar warning after observing that the face and voice of its CEO, Ashishkumar Chauhan, were being falsely used in some investment and stock advisory videos. "Such videos seem to have been created using sophisticated technologies to imitate the voice and facial expressions of Ashishkumar Chauhan," the NSE said. The exchange said its officials were not authorised to recommend or deal in any stocks, and that it was requesting social media platforms to take down such videos wherever possible.

The Department of Telecommunications (DoT) has also cautioned investors against deepfake videos and images, amid the growing threat of deepfakes being used to manipulate stock prices and commit fraud. It issued a warning to Indian citizens about fraudulent advertisements for stock market trading and free tips on social media apps.

“Beware of fraudulent advertisements of stock market, trading, free tips on social media apps!! They may use deepfake videos and images. Never fall prey to greed,” read a message sent by the DoT to a Vodafone subscriber.

On April 18, the Bombay Stock Exchange (BSE) also issued a warning about fake videos featuring its MD and CEO, Sundararaman Ramamurthy, giving false investment advice. Similarly, in August 2023, the Securities and Exchange Board of India (Sebi) advised stock brokers to stay away from financial influencers. ICICI Prudential Asset Management Company also cautioned investors in January 2024 about deepfake videos of its senior executives.
