Deepfakes Are Out of Control!

What Are Deepfakes?

Deepfakes are fake videos or images created using artificial intelligence (AI) to make it look like someone is saying or doing something they never did. These can be very realistic and hard to tell apart from real videos. The term “deepfake” combines “deep learning” (a type of AI) and “fake.”

Deepfakes started to become widely known around 2017. Since then, the technology has improved a lot, making it easier to create convincing fake videos and images. This has caused many problems, from spreading false information to scams and blackmail.

On June 27, Senior Minister Lee Hsien Loong warned people about deepfake videos of him. In these fake videos, he appears to be talking about international relations and foreign leaders. Mr. Lee said on Facebook that the people behind the videos have bad intentions: they want viewers to believe that these fabricated views are his own, or that they represent the Singapore government's stance. This can be very dangerous and harmful to the country's interests.

Mr. Lee advised that if people see such videos, they should check if they are real and report them to the platform where they found them. He also said not to share these videos, even to say they are fake, because others might see the video and believe it is real without reading the comments.

Deepfakes are often lower quality than genuine footage and may look blurry where the image has been manipulated. Viewers can spot them by looking for unnatural facial movements, odd blinking patterns, or visible edits around the edges of the face.
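For readers curious how the blurriness clue above could be checked automatically, here is a minimal sketch (not from the article, and not how any specific detector works) of one common sharpness measure: the variance of a Laplacian filter. A manipulated, blurry patch of an image tends to score lower than sharp, genuine footage. The function name and the toy 8×8 patches are illustrative assumptions.

```python
def laplacian_variance(gray):
    """Sharpness score for a 2D grayscale image (list of lists of ints).

    Applies a 4-neighbour discrete Laplacian at each interior pixel and
    returns the variance of the responses. Low variance suggests a
    blurry (possibly manipulated) region; high variance suggests sharp
    detail such as edges and texture.
    """
    h, w = len(gray), len(gray[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (gray[y - 1][x] + gray[y + 1][x]
                   + gray[y][x - 1] + gray[y][x + 1]
                   - 4 * gray[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

# Toy example: a sharp checkerboard patch vs. a flat (featureless) patch.
sharp = [[255 if (x + y) % 2 else 0 for x in range(8)] for y in range(8)]
flat = [[128] * 8 for _ in range(8)]
print(laplacian_variance(sharp) > laplacian_variance(flat))  # True
```

Real detection tools combine many such signals (lighting, blinking, face boundaries) rather than relying on blur alone, so a low or high score here is a hint, not proof.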

In another case, scammers used deepfake nude images to blackmail people in Singapore. John (not his real name), 21, connected with a woman on the dating app OkCupid. A few days later, a friend told him that someone had sent the friend a nude photo of John.

John realized the sender was the woman he had met on the dating app in June 2024. She had insisted on chatting via video call on Telegram. Later, she sent him a photo of his face on the naked body of another man and threatened to share it with his friends unless he paid her $1,000.

John said he never sent her any picture, especially not a nude one. They only had one video chat. The photo was a deepfake, with scammers using AI to put John’s face on someone else’s body.

The Straits Times reported that at least four men had been targeted by similar scams, known as “deepfake sexploitation.”
