
DEEPFAKES: REVOLUTIONIZING MEDIA OR POSING A THREAT?

'Seeing is no longer believing' in the age of deepfakes. But they were not always designed to deceive, says Aishwarya Tiwari.

Deepfakes are typically manipulations of facial appearance produced by deep generative methods. In other words, they are synthetic media that have been digitally altered to replace one person's likeness convincingly with another's. In this article, cyber security awareness expert Miss Aishwarya Tiwari addresses the central question: are deepfakes revolutionizing the media or posing a threat?

Have you been noticing the buzz around AI, including deepfakes, AI-powered fraud calls, and deceptive videos flooding social media? But what exactly is deepfake technology, and does it impact you positively or negatively? Deepfake, in essence, is a manipulated audio or video clip that uses advanced artificial intelligence to convincingly replace or superimpose someone’s likeness onto another person’s, often creating deceptive or misleading content.

Deepfakes

With deepfake usage at its peak, the notion that ‘seeing is no longer believing’ has become pervasive. However, it’s important to recognize that deepfakes were not created solely for malicious purposes or to harm people. Absolutely not. It is individuals with mischievous mindsets who exploit this technology for their own selfish reasons. While deepfakes are indeed used to inflict harm, there also exist numerous applications that contribute to the betterment of society and individuals in various ways.

Deepfakes have the potential to open up a wide range of possibilities and opportunities for everyone, regardless of their background or identity, and regardless of how they listen, speak, or otherwise communicate with the world.

Here are some potential positive applications of deepfake technology for the betterment of society and individuals:

Entertainment and Creativity
Education and History Reenactment
Language Localization
Preserving Cultural Heritage
Disability Support
Medical Training
Voice Assistance and Translation
Digital Storytelling
Virtual Performances
Environmental Awareness
Scientific Visualization
Enhancing Public Safety & Digital Reconstruction: Recreating a crime scene involves both forensic expertise and creativity, utilizing logical deduction and evidence. AI-generated synthetic media can assist in reconstructing the scene by considering the connection between spatial and temporal elements. Back in 2018, a group of civil investigators employed videos from cell phones, autopsy findings, and surveillance clips to build a virtual representation of a crime scene.


Deepfakes present a wonderful opportunity to make a positive difference in our lives. They can grant individuals a voice, purpose, and the capacity to influence on a larger scale and with greater speed. Diverse realms such as art, self-expression, public safety, inclusivity, and business have witnessed newfound concepts and capabilities for empowerment through deepfakes.
However, alongside the rising accessibility of synthetic media technology, the risk of misuse also grows. Deepfakes can be exploited to tarnish reputations, forge evidence, deceive the public, and erode trust in democratic institutions.

Deepfake technology is being harnessed to carry out a multitude of fraudulent scams, thanks to its ability to craft exceptionally convincing and lifelike fabricated audio, video, or images. In its early stages, the technology was used predominantly for entertainment, giving filmmakers and content producers the means to seamlessly insert actors into various scenarios or convincingly recreate historical figures. As the technology progressed, however, wrongdoers began exploiting its potent powers of deception, according to law enforcement officials.

We’ve observed videos where famous people’s faces have been manipulated using deepfake technology, and these videos have gone viral on social media. In certain instances, they’re used as meme videos, while in others, they’re shared to spread false information.


Examples of such cases include:
A video featuring a deepfake version of a well-known politician giving a fictional speech went viral as a meme, generating widespread humor and discussion.
A deepfake video of a popular celebrity endorsing a fake product gained attention on social media, causing confusion among fans.
A deepfake news report featuring a renowned news anchor presenting fabricated information was widely shared online, highlighting the potential misuse of such technology for spreading misinformation.

Now, have you noticed a big increase in the number of video calls (fraudulent calls using deepfake technology) coming through WhatsApp? Allow me to explain the process and provide you with details about an incident involving a man from Kerala who fell victim to a deepfake fraud call, resulting in a significant financial loss.

PS Radhakrishnan, who lives in Kozhikode, Kerala, experienced a loss of Rs 40,000. This occurred when cybercriminals exploited deepfake AI technology, pretending to be a former colleague in a WhatsApp video call. They deceitfully requested funds for his sister’s medical procedure.
At first, Radhakrishnan received an incoming call from an unfamiliar number, which he chose not to answer. Subsequently, he discovered multiple messages on his WhatsApp from the same number. The individual on the other end claimed to be Radhakrishnan’s ex-colleague from Coal India Ltd.

“We had collaborated for nearly four decades, and I was familiar with him. His profile picture matched his photo. He inquired about my daughter and her workplace. During our text exchange, he even shared family photos and asked about colleagues we both knew,” stated Radhakrishnan, as per the HT report.

After a while, the victim received a voice call from the same number. The caller claimed to be at the Dubai airport, preparing to board a flight to India. He requested a financial favour: his sister-in-law urgently needed surgery at a Mumbai hospital, and he needed Rs 40,000 as an advance payment. He instructed Radhakrishnan to transfer the money via UPI to the phone number of a person who was with his sister-in-law at the Mumbai hospital.

Wanting to be absolutely certain he wasn’t being deceived, the retired official insisted on an immediate video call.

“Within moments, he made a video call, and the caller bore an uncanny resemblance to my ex-colleague. Despite seeing only his face, it was unmistakable. His lips and eyes moved naturally as we conversed in English. The call ended after a mere 25 seconds due to a disconnection. He then reconnected via a voice call, stressing the urgency of the financial assistance. Without further inquiry, I proceeded to transfer the money,” as detailed in the report.

A little while later, the same man called again, this time asking for Rs 35,000 to cover hospital expenses. The Kerala man began to feel suspicious because the caller sounded rushed. To handle the situation, he fibbed about not having enough money in his account.

Later on, Radhakrishnan called the number he had saved for his former colleague. Surprisingly, the person on the other end denied making any calls. That’s when he realized he had been tricked by a scammer. The police suspect that the scammer used deepfake technology to impersonate someone known to Radhakrishnan.

You might be pondering over several questions: How did the hackers obtain the person’s personal information? How did they know about his daughter and other details? Moreover, how did the victim get persuaded during the video chat?

Deepfakes are akin to modern-day photoshopping, using artificial intelligence known as deep learning to craft fabricated images and videos, as mentioned in The Guardian. This technology can even produce entirely fictional content, including audio, often creating convincing ‘voice skins’ or ‘voice clones’ of individuals.
It involves a sequence of carefully planned steps, blending advanced technology with psychological manipulation. The scammer creates a fake profile, often using stolen or publicly available images of trusted individuals such as friends or family members. They then use AI-powered deepfake technology to craft highly authentic-looking video calls on social media or other online platforms, simulating someone known to the victim, be it a friend, family member, or colleague, and leading them to believe the interaction is genuine. Finally, they create a sense of urgency and ask the victim to transfer money to their bank accounts.
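For readers curious about the mechanics, the classic face-swap approach behind many deepfakes trains an autoencoder with one shared encoder and a separate decoder per identity: the swap happens when a face of person A is encoded and then decoded with person B's decoder, rendering B's face in A's pose and expression. The sketch below is only a toy illustration of that pipeline in Python with NumPy, using random untrained weights and stand-in 8×8 grayscale "faces"; it is not a working deepfake generator.

```python
import numpy as np

rng = np.random.default_rng(0)

IMG = 8 * 8      # toy 8x8 grayscale "face", flattened to 64 values
LATENT = 16      # size of the shared latent representation

# Shared encoder: in a real system, trained on faces of BOTH people,
# so it learns identity-agnostic features (pose, expression, lighting).
W_enc = rng.normal(size=(LATENT, IMG)) * 0.1

# One decoder per identity: each learns to render only that person's face.
W_dec_a = rng.normal(size=(IMG, LATENT)) * 0.1
W_dec_b = rng.normal(size=(IMG, LATENT)) * 0.1

def encode(face):
    # Compress a face into the shared latent space.
    return np.tanh(W_enc @ face)

def decode(latent, W_dec):
    # Render a latent code back into an image with one identity's decoder.
    return W_dec @ latent

face_a = rng.random(IMG)  # stand-in for a photo of person A

# Normal reconstruction: encode A, decode with A's own decoder.
recon_a = decode(encode(face_a), W_dec_a)

# The "deepfake" step: encode A's pose/expression, but render it
# with B's decoder, producing B's face in A's pose.
swapped = decode(encode(face_a), W_dec_b)

print(recon_a.shape, swapped.shape)  # both outputs are 64-value images
```

The key design point is that the encoder is shared while the decoders are not; that asymmetry is what lets the latent code carry pose and expression across identities.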

If you get a video call from someone claiming to be a friend, family member, or someone you know, it’s a good idea to call their personal mobile number to confirm it’s really them before sending any money.

Citizens should be cautious about sharing personal data online and should fine-tune their privacy settings on social media to control who can access their information. Using multi-factor authentication and additional identity-verification methods can help secure accounts against unauthorized access. It is also important to be careful about the information shared with individuals who visit in person to conduct surveys or product inquiries. Finally, avoid saying your mobile number aloud while making payments at malls, stores, restaurants, or ticket counters.

(Images: pexels.com)

Link to the Cyber Security Division, Ministry of Electronics & Information Technology, Government of India:

https://www.meity.gov.in/cyber-security-division

Indiainput.com is keen to contribute to spreading awareness on cyber security and the safety of end users. You are welcome to share your experience or feedback at contactindiainput@gmail.com


