Deepfakes, Not So Fake

With deepfakes becoming more sophisticated and dangerous, experts sound a word of caution and ask netizens to be tech-aware!

Update: 2024-08-21 18:30 GMT

The viral video of Donald Trump and Elon Musk grooving to the Bee Gees’ disco hit Stayin’ Alive gathered millions of ‘Likes’. So did the desi Main Khiladi Tu Anari deepfake video featuring prominent Indian politicians gyrating to this peppy number. While these viral videos tickled everybody’s funny bone, not many realise that AI-powered deepfakes are becoming more sophisticated and dangerous by the day. After her deepfake video surfaced online, actor Rashmika Mandanna took to X, stating: “So I want to tell all girls out there that this is not normal. When something is affecting you, you don’t have to keep quiet.” When the culprits were nabbed, Rashmika thanked the Delhi police and told her fans: “Girls and boys - if your image is used or morphed anywhere without your consent, it is wrong! And I hope this is a reminder that you are surrounded by people who will support you and action will be taken!”

Modus Operandi

There are hundreds of deepfake videos, audio clips and images floating around in virtual space, including on pornography sites. Some can tarnish a person’s image and reputation for life. Recently, cricket legend Sachin Tendulkar called out a deepfake video of him promoting an online game. The video shows Tendulkar citing the example of his daughter Sara earning Rs 1.8 lakh per day by merely making predictions in the game.

Himanshu Yadav, cybersecurity expert, content creator and founder of Hackindtech, says that when someone creates a deepfake of a celebrity, it is bound to go viral. AI has made it difficult for the average person to tell what is real and what is fake. The recent elections were an eye-opener in terms of the number of inflammatory deepfakes that caused tension within political parties and communities. These deepfake videos blur the line between reality and fiction and mislead the public. “Creating a deepfake usually starts with gathering some basic ‘raw’ material like photos, videos or even audio clips,” adds Himanshu.

AI-powered tools are then used to analyse and replicate a person’s voice, facial expressions, movements and so on. Himanshu cautions, “What’s scary is that these AI tools are becoming easier to use, so more people can create deepfakes without any technical knowledge.”

Wild Range

Abhishek Parashar, cybercrime investigator and founder & CEO of Indian Cyber Club Technologies, says that deepfake videos and cloned audio are being used for deception and to spread misinformation. He explains how deepfakes are generated, from simple ideation to creation and circulation.

The first step is ‘info gathering’: collecting photos, videos and audio bytes from the victim’s social media handles.

He says, “For video-based deepfakes, creators seek high-resolution images and videos that clearly show the target’s (victim’s) face from a multitude of angles.” The more diverse the angles and expressions, the more realistic the curated deepfake appears.

Audio deepfakes require a certain degree of audio clarity. “These can easily be extracted from podcasts, online interviews, speeches or any source where the target’s (victim’s) voice is present,” explains Abhishek.

The next step is the preprocessing of data -- wherein images or video frames are aligned and augmented via techniques like flipping, rotating, and adjusting colour tone, brightness and so on.
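The augmentation step described above can be sketched in a few lines. This is a minimal illustration using NumPy arrays as stand-ins for video frames; the specific transformations and parameters here are illustrative assumptions, not any particular tool’s pipeline:

```python
import numpy as np

def augment(frame: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Apply simple augmentations: random flip, rotation and brightness jitter."""
    out = frame.copy()
    if rng.random() < 0.5:
        out = np.fliplr(out)                       # horizontal flip
    out = np.rot90(out, k=int(rng.integers(0, 4)))  # rotate by 0/90/180/270 degrees
    out = np.clip(out * rng.uniform(0.8, 1.2), 0, 255)  # brightness jitter
    return out

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(64, 64, 3)).astype(float)  # stand-in for one frame
batch = [augment(frame, rng) for _ in range(8)]  # small augmented training set
```

Each pass produces a slightly different version of the same face, which is exactly why augmentation makes the eventual model more robust to pose and lighting changes.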

The last nail in the coffin is added by an AI-enabled tool. “The core of creating a deepfake involves training an AI model -- Generative Adversarial Network (GAN) or an Autoencoder,” says Abhishek. He further explains that for video deepfakes, the model is trained to swap faces, while for audio deepfakes, it is trained to mimic the voice of the target.

Complete Crackdown

Deepfakes can take a toll on an individual’s mental and emotional well-being. “Deepfakes can be more than just misleading videos. They can be used to scam people, resulting in monetary losses,” says Himanshu.

There have been cases in India where scammers impersonate famous CEOs or bank officials, tricking employees into transferring large sums of money using a deepfake video or voice message that seems totally ‘real’.

However, several cues can give a deepfake away. These vary from a person’s expression looking a little too exaggerated or off, to inconsistent lighting or shadows, and the list goes on. There is a lot of concern and confusion around deepfakes. Perhaps it’s time for people to be not just tech-savvy, but tech-aware!

Deepfake Cues

• A person’s face or expression looking a little off could be a sign

• Check if the lighting or shadows seem inconsistent

• Listen carefully; sometimes the audio doesn’t match the video

• Use reverse image searches to check if a video or picture has been altered
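Tools for checking altered images often rely on perceptual hashing: a compact fingerprint of an image that survives minor edits like brightness changes but differs sharply for unrelated pictures. The sketch below implements a simple ‘average hash’ on synthetic arrays purely as an illustration of the idea, not the method of any particular reverse image search service:

```python
import numpy as np

def average_hash(img: np.ndarray, hash_size: int = 8) -> np.ndarray:
    """Downscale to hash_size x hash_size block means, then record
    which blocks are brighter than the overall mean (a 64-bit fingerprint)."""
    h, w = img.shape
    bh, bw = h // hash_size, w // hash_size
    blocks = (img[:bh * hash_size, :bw * hash_size]
              .reshape(hash_size, bh, hash_size, bw).mean(axis=(1, 3)))
    return (blocks > blocks.mean()).flatten()

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    """Number of fingerprint bits that differ between two images."""
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(7)
original = rng.integers(0, 256, size=(64, 64)).astype(float)
brightened = np.clip(original * 1.1, 0, 255)                  # mild edit, same content
unrelated = rng.integers(0, 256, size=(64, 64)).astype(float)  # different picture

near = hamming(average_hash(original), average_hash(brightened))
far = hamming(average_hash(original), average_hash(unrelated))
```

A small Hamming distance suggests the two images share the same underlying content despite edits, while an unrelated image lands far away, which is how near-duplicate lookups can surface the unaltered source of a doctored picture.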

