FAKING POINT! 

Deepfakes more ‘sophisticated’ and dangerous than ever as AI expert warns of six upgrades that let them trick your eyes

There are still some tricks to help you stay safe

DEEPFAKES are becoming increasingly dangerous after a series of artificial intelligence upgrades.

Experts have told The U.S. Sun that it’s now increasingly difficult to tell AI-generated video fakes apart from the real thing.

Deepfake technology can make you appear to say and do things you haven’t. Credit: Getty

Deepfakes use AI to create convincing videos of real people doing things they normally wouldn’t.

And through voice-mapping technology, they can even create false audio – making someone appear to say (or even sing…) something they never did.

AI tech can even “clone” your voice so it sounds just like you, and only needs a few seconds of audio to base it on.

We spoke to Adam Pilton, Cyber Security Consultant at CyberSmart and a former Detective Sergeant who investigated cybercrime, who revealed how the technology is rapidly improving.

“AI is becoming increasingly sophisticated,” Adam told us.

“The responses are increasingly accurate, the number of tools and resources we have access to has dramatically increased, from simple text responses, to pictures, audio, video and more.

“We are seeing deepfakes that have more realistic facial expressions, lip movements, and voice synthesis.

“This will make them even harder to distinguish from real videos and audio.”

There are now many different apps and services that help users to create deepfakes.

And the time it takes to generate a deepfake is also shrinking.

“Deepfake creation tools are becoming more user-friendly and accessible, lowering the technical barrier for attackers,” Adam explained.

“Cloud-based solutions and AI-powered platforms are making this more accessible.”

Deepfakes – what are they, and how do they work?

Here’s what you need to know…

  • Deepfakes are phoney videos of people that look perfectly real
  • They’re made using computers to generate convincing representations of events that never happened
  • Often, this involves swapping the face of one person onto another, or making them say whatever you want
  • The process begins by feeding an AI hundreds or even thousands of photos of the victim
  • A machine learning algorithm swaps out certain parts frame-by-frame until it spits out a realistic, but fake, photo or video
  • In one famous deepfake clip, comedian Jordan Peele created a realistic video of Barack Obama in which the former President called Donald Trump a “dipsh*t”
  • In another, the face of Will Smith is pasted onto the character of Neo in the action flick The Matrix. Smith famously turned down the role to star in flop movie Wild Wild West, while the Matrix role went to Keanu Reeves

SAVE YOURSELF

It’s not all doom and gloom, however.

Adam told The U.S. Sun that although deepfakes are becoming more advanced, so too are the ways we can catch them.

While it’s harder to rely on spotting visual or auditory mistakes yourself, technology is getting better at exposing deepfaked videos.

Using your instinct gives you the best chance of staying safe online.

Simon Newman, International Cyber Expo Advisory Council Member

“The risk is not necessarily increasing at the same rate as the technology itself is,” Adam said.

“Technology companies are developing tools to detect AI-generated material, making it harder for it to go unnoticed.”

Spotting mistakes in a video yourself can now be extremely difficult.

And even technology will sometimes miss the signs that a video is faked.

LIST OF DEEPFAKE UPGRADES REVEALED

Adam Pilton, Cyber Security Consultant at CyberSmart, revealed six big upgrades helping people to create deepfakes more effectively than ever…

  • 1. AI tools are getting faster at generating results
  • 2. The number of deepfake tools is growing
  • 3. Deepfakes now have more realistic facial expressions
  • 4. Mapping lip movements to audio is much better
  • 5. Creating faked audio is much easier
  • 6. Deepfake-creating tools are more user-friendly

That’s why Simon Newman, CEO of the Cyber Resilience Centre for London and a member of the International Cyber Expo Advisory Council, warned that you’ll need to rely on your own instincts.

Questioning the context of a video – and asking whether it seems to make sense – is important.

This is especially true if a video is making a bold claim or demanding urgent action.

“As the use of artificial intelligence by cyber criminals increases, it will become much harder to tell the difference between a fake and the real thing,” Simon told The U.S. Sun.

DEFENCE AGAINST THE DEEPFAKES

Here’s what Sean Keach, Head of Technology and Science at The Sun and The U.S. Sun, has to say…

The rise of deepfakes is one of the most worrying trends in online security.

Deepfake technology can create videos of you even from a single photo – so almost no one is safe.

But although it seems a bit hopeless, the rapid rise of deepfakes has some upsides.

For a start, there’s much greater awareness about deepfakes now.

So people will be looking for the signs that a video might be faked.

Similarly, tech companies are investing time and money in software that can detect faked AI content.

This means social media will be able to flag faked content to you with increased confidence – and more often.

As the quality of deepfakes grows, you’ll likely struggle to spot visual mistakes – especially in a few years.

So your best defence is your own common sense: apply scrutiny to everything you watch online.

Ask if the video is something that would make sense for someone to have faked – and who benefits from you seeing this clip?

If you’re being told something alarming, a person is saying something that seems out of character, or you’re being rushed into an action, there’s a chance you’re watching a fraudulent clip.

“Fortunately, the cyber security industry has made great strides in the last few years developing technology that can spot deepfakes which will hopefully reduce the number of people falling for them.

“However, we can’t rely on technology alone – taking a cautious approach and using your instinct gives you the best chance of staying safe online.”
