Lancashire Times
A Voice of the Free Press
Jamie Durham
IT Correspondent
7:27 AM 27th October 2019
scitech

Fake-Face Tech

None of these is the face of a real person...
Hopefully you have been watching the brilliant BBC series The Capture. For those who haven’t, it is a fictional thriller in which the CIA and MI5 use new technology to change the faces of people on CCTV footage – a technique known as ‘correction’ – and then use the doctored footage as evidence against people they are sure are guilty but don’t have enough proof to convict. Sounds far-fetched, right?

But, what if this technology already exists and is being used to destroy people's lives and reputations right now?

The software is known as the Deepfake app, and its implications are utterly terrifying, for more than one reason.

Just like in the BBC thriller, the program stitches together as many pictures of the victim as possible to build a map of their face, and then superimposes that face onto another human being, in any situation. The worst part is that it is nearly impossible to tell you are looking at a fake.
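To give a rough sense of how face-swapping works in principle, here is a minimal sketch in Python using the open-source OpenCV library. It only detects a face in one photo and blends it into another; real deepfake tools go much further, training neural networks on hundreds of images to rebuild the face frame by frame. The file names used here are purely hypothetical.

import cv2
import numpy as np

# Standard face detector bundled with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def first_face(image):
    # Return the bounding box (x, y, w, h) of the first face found.
    grey = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(grey, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        raise ValueError("no face found")
    return faces[0]

source = cv2.imread("victim.jpg")  # photo of the face to copy (hypothetical file)
target = cv2.imread("scene.jpg")   # photo to paste it into (hypothetical file)

sx, sy, sw, sh = first_face(source)
tx, ty, tw, th = first_face(target)

# Resize the source face to fit the target's face region.
face_patch = cv2.resize(source[sy:sy + sh, sx:sx + sw], (tw, th))

# Blend the patch in so edges and lighting roughly match the target image.
mask = 255 * np.ones(face_patch.shape, face_patch.dtype)
centre = (tx + tw // 2, ty + th // 2)
result = cv2.seamlessClone(face_patch, target, mask, centre, cv2.NORMAL_CLONE)

cv2.imwrite("swapped.jpg", result)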

If you’re finding this hard to believe, search for the Deepfake of Barack Obama calling Donald Trump a name that I can’t use in print.

The worrying reality is that deepfake software is free and available to download practically anywhere. It does require some skill to operate, but nothing a teenager with a love of technology couldn’t figure out.

There are also hundreds of completely forged, yet utterly believable, videos of famous actresses in adult movies. Perhaps more worryingly, this has happened to teenagers too. Noelle Martin was just a normal 18-year-old law student when someone decided to make her the star of dozens of adult movies. She has since fought a six-year privacy battle that has turned her and her family’s lives upside down, and she has gone on to give a TED talk on the subject.

In the wrong hands this technology could be devastating. We already have an example of an ex-president being ‘deepfaked’, but what about the current one disparaging another world leader? A school kid with a grudge against a teacher? A spouse wanting to put a husband in prison? It has the potential to become a widely used, weaponised video tool.

There are also unseen, and as yet unreported, security risks. Imagine, for example, your boss sending you a video message asking for the company credit card details because he or she has left theirs at home. The visual footage alone would seem proof enough that you wouldn’t even think to check whether the phone number was correct. In the wrong hands, the scenarios are endless.

In truth, I can foresee this technology changing social media as we know it. Public profiles on Instagram, for instance, may become a thing of the past as people attempt to rid the internet of their personal pictures.