Deepfake apps: Society’s new enemy

By Riana McArthur

Is cyber-misogyny the new frontier for the embarrassment and humiliation of the disenfranchised?

Have you ever thought about the possibility of appearing in pornographic films? Or of unknowingly posing as an online model for Internet users to ‘undress’? What about receiving a video clip or photo on your social media profile and realising that the person in the clip is you? Perhaps you should. It’s more than possible.

The reality of ‘deepfakes’ infiltrating every facet of your life has arrived. The term is a blend of “deep learning” and “fake”. Based on artificial intelligence, a deepfake is a technique for synthesising human imagery: existing images and video clips are combined with, and overlaid onto, source images or videos using a machine learning technique known as a generative adversarial network (GAN).
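For the technically curious, the sketch below illustrates, in broad strokes, the adversarial contest a GAN sets up: one network generates fake images while a second network learns to spot them, and each improves by competing against the other. This is a minimal, illustrative example only; the network sizes, the 64x64 image assumption and the random tensors standing in for real face data are placeholders, and real face-swap tools such as DeepFaceLab rely on much larger models trained on aligned face footage.

```python
# Minimal GAN sketch (PyTorch). Illustrative only: shapes, sizes and the
# random "real_faces" batch are assumptions, not any app's actual pipeline.
import torch
import torch.nn as nn

latent_dim, image_dim = 64, 64 * 64  # assumed: tiny 64x64 greyscale images

generator = nn.Sequential(           # tries to produce convincing fake images
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)
discriminator = nn.Sequential(       # tries to tell real images from fakes
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

# Stand-in for a batch of real face crops, scaled to the generator's [-1, 1] range.
real_faces = torch.rand(32, image_dim) * 2 - 1

for step in range(100):
    # 1. Train the discriminator: label real images 1 and generated images 0.
    fake_faces = generator(torch.randn(32, latent_dim)).detach()
    d_loss = loss_fn(discriminator(real_faces), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake_faces), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2. Train the generator: produce images the discriminator labels as real.
    fake_faces = generator(torch.randn(32, latent_dim))
    g_loss = loss_fn(discriminator(fake_faces), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

It is this cheap, automated back-and-forth between forger and detector that makes convincing face swaps achievable on ordinary consumer hardware.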

Deepfake apps are a relatively new phenomenon of the Internet age, making their first official appearance in 2017. Initially, deepfake technology was used in research and academic settings and by amateur developers. It soon spread to the political sphere, where it was used to alter people’s perceptions of political leaders, and then to entertainment. Face-replacement techniques of this kind were seen by millions of people around the globe in 2016, when Rogue One recreated Princess Leia, and in 2018, when Harrison Ford’s face was superimposed onto Alden Ehrenreich’s portrayal of Han Solo in Solo: A Star Wars Story.

The list of deepfake apps is seemingly endless, including DeepFaceLab, Face Swap Live, Deep Art and AvengeThem, and they are available to anyone with access to the Internet. These apps allow anyone to replace the original face in a video with someone else’s face within seconds. Deeptrace Labs, a company that researches and detects deepfakes, found that the number of deepfake videos has increased by an alarming 84% since 2018.

The capability to reinvent a character allows for the creation of endless hoaxes, fake news, nude and pornographic scenes and revenge porn, and for the targeting of vulnerable individuals, particularly girls and women. Mutale Nkonde, a fellow at the Data and Society Research Institute in New York, states that “The DeepNude App [a deepfake app] proves our worst fears about the unique way audio-visual tools can be weaponised against women”, ultimately altering our perceptions of people and controlling women’s bodies. This has happened to many celebrities, such as Scarlett Johansson, Amber Heard and Katy Perry. And it also affects non-celebrities: alarmingly, deepfake software can be purchased online for as little as R45.

At first, deepfakes appear innocent and exciting, but users can be lured into the destructive side of these apps. The result can be individuals who are ill-informed about the harmful effects of their actions, not only on themselves but also on those they target through deepfake apps.

The manipulation of images and videos using artificial intelligence has the potential to become a destructive mass phenomenon. Arwa Mahdawi of The Guardian believes that the key factor behind the creation of deepfake pornography is the desire to humiliate and control women. Artificial intelligence researcher Alex Champandard argues that, because we can no longer reliably differentiate between real and fake media, humanity has entered an age in which it is impossible to know whether content represents the truth.

It is obvious that anyone could become a target of deepfake apps. The question to ask is: what is being done to prevent this from happening? Globally, governments in countries such as the United States and the United Kingdom are shutting down deepfake apps and making the creation and distribution of deepfakes punishable by law.

Locally, the Film and Publication Board (FPB), through the implementation of the 2017 Cybercrimes and Cybersecurity Bill, aims to rationalise South Africa’s laws dealing with cybercrime and cybersecurity, to criminalise the manufacture and distribution of malicious communications, and to provide interim protection measures. The FPB acknowledges that, as the demand for online content grows and technology advances, it must also increase its monitoring of digital platforms and social media.

Arguably, deepfake apps will affect the way we perceive life and place further pressure on our societal norms and values. But the key concern should be the effect they have on women, vulnerable minority groups and children.



Contributors

Riana McArthur

Riana McArthur is an HPCSA-registered research psychologist with more than 10 years’ experience in market research across various industries, specialising in quantitative as well as qualitative research methodologies. Riana is fortunate to have lived and worked in many countries. Although the majority of her clients are in the fast-moving consumer goods (FMCG) industry, she has […]

Links

Deepfakes are a real political threat. For now, though, they’re mainly used to degrade women.

A new report on deepfakes finds 96 percent involve simulating porn of female celebrities (without their consent).


Can you believe your eyes?

While they may be increasingly cheap to pull off, their repercussions could be far-reaching. Fraudulent clips of business leaders could tank companies. False audio of central bankers could swing markets. Small businesses and individuals could face crippling reputational or financial risk. (Financial Times: https://www.ft.com/content/4bf4277c-f527-11e9-a79c-bc9acae3b654)

