Introduction
The law has always struggled to evolve at the same pace as technology. As people find more inventive ways of hurting and degrading each other, the law is locked in a perennial battle against humanity’s worst instincts. In recent years, artificial intelligence (AI) and machine learning (ML) have thrown down the ultimate gauntlet for lawyers. The potential of AI and ML to improve the quality of life for all is huge. But more insidious applications of this technology raise challenging moral and legal questions. One iteration of the AI Green Knight is the computer-generated ‘deep fake’.
For the uninitiated, I shall explain. Deep fakes are synthetic media that superimpose a person’s likeness onto video using AI and ML. Deep fake technology (DFT) can accurately recreate the intonation of a person’s voice and overlay the resulting audio track onto a counterfeit video. This nascent technology has come a long way in recent years, spawning amateur online communities who share their work in ordinary – and dark – corners of the web.
The sleazy reputation of DFT derives from its overwhelming use in creating fake pornographic content. This is achieved by transposing the faces of celebrities or ex-partners onto the bodies of porn stars. Its other infamous uses include political hoaxing and elaborate financial fraud. DFT also has numerous commercial applications, and these help nudge the technology along. Fashion companies, for instance, see the potential of DFT to allow shoppers to ‘try on’ clothes before they buy them.
Driven by corporate and sordid uses alike, DFT has steadily improved, becoming widely available through easy-to-access apps like Zao.
Let us picture the following scenario involving DFT, and enquire after how the law might address different aspects of it:
- A pornographic video claiming to contain footage of a female actor goes viral on social media sites and pornography hosting sites, including Twitter and PornHub. The video was originally posted on a forum for deep fake enthusiasts, but was then reposted on popular platforms by anonymous internet users. The video is streamed a total of 10 million times before it is removed from mainstream sites. It survives on mirror sites and the dark web.
What crimes, if any, have been committed and by whom? What civil action could be taken and against whom?
- In later analysis, it is revealed that a database containing thousands of photographs was used to engineer the deep fake video. Some of these were taken when the actor was under the age of 18. Similarly, it is discovered that some of the audio clips used to recreate the sound of her voice were taken from earlier in her career, when she was 15.
Does this change anything?
- The actor hires a private investigator who uncovers information to suggest that the deep fake video was commissioned by one of her ex-boyfriends. She believes he sought to take revenge on her after she ended their relationship of three years.
Does this mean that the ex-boyfriend can be convicted of the criminal offence of distributing ‘revenge’ porn?
Impact
We will answer these questions shortly. However, setting aside compassion for the plight of this young actor, we might worry about the impact that unchecked and unregulated DFT could have on our society.
- DFT could be used to hurt individuals. Anyone could find themselves depicted in a porn video or represented in some other manner (e.g., endorsing racist or homophobic statements, or perhaps carrying out violent acts).
- Given the democratisation of DFT through free-to-download apps and open-source software, DFT could soon be abused by anyone: governments, businesses, organisations and individuals alike will be able to carry out deeply personal attacks easily and cheaply.
- DFT attacks could cause embarrassment, humiliation and feelings of degradation for an individual and her family and friends.
- DFT attacks could leave an individual feeling as though her right to privacy has been violated because viewers of DFT videos will see what they want to see (i.e., a celebrity’s naked body) rather than the reality (i.e., an image of a celebrity’s head superimposed onto the body of a porn star).
- DFT attacks could damage an individual’s reputation. Even if a video is later revealed to be fake, the damage may already be done, and individuals may find that their work, friendships and relationships are damaged by the content.
- As DFT improves, it may take longer to distinguish between reality and fiction. Relationships between spouses, parents, children, and friends may be irretrievable. Job opportunities may not return.
- DFT attacks could result in individuals being subjected to relentless public harassment, workplace-shaming and domestic abuse.
- DFT attacks could leave individuals feeling powerless and unable to control the manner in which they are perceived.
- DFT attacks could damage an individual’s ability to maintain or forge sexual and intimate relationships.
- DFT attacks could damage an individual’s sense of sexual, social, religious, cultural or familial identity.
- Victims of DFT attacks could experience serious, long-term mental health problems.
- Victims of DFT attacks may find it impossible to reintegrate into society after undergoing this image-based sexual abuse.
- Pornography produced with DFT disproportionately and detrimentally affects women in twenty-first century society.
- DFT pornography objectifies women without their consent. This further entrenches sexist attitudes in our society by sending a message to the online community, including an increasingly juvenile demographic of pornography consumers, that it is acceptable to view women as sexual objects, whether those women like it or not.
- DFT pornography may undermine the feminist project of equality, ushering in a new age of sexist values in the workplace, home and school environment. The misogynistic values inculcated by DFT pornography may be manifest in egregious ‘locker room talk’, promotion and opportunity discrimination, and sexual harassment.
- DFT pornography may result in the hounding out of women from certain workspaces, where DFT engineered explicit images circulate the office.
- For all its advantages in commercial use, DFT could spread misinformation and fake news.
- DFT could be exploited for propaganda purposes, harming elections and manipulating important political decisions.
- DFT could be used to damage the reputations of public figures, organisations, charities and governments.
- Widespread DFT usage could cause such a mushrooming of misinformation that the public becomes paralysed, unable to separate fact from fiction.
- DFT could be used to frame individuals and organisations, but it could also be offered as an excuse for actual wrong-doing.
- Widespread DFT usage could produce an additional expense for the police when investigating crime because defendants may allege that incriminating video evidence has been falsified.
- Those who cause harm by criminal and non-criminal methods may claim that incriminating video, audio or photographic evidence is fake. Cheating partners, for example, might mislead spouses into believing that implicating evidence has been fabricated.
What can the law do?
Deep fake pornography is still legal in the UK. You can consume, download, make, share and distribute all the fake images and videos you want without fear of policemen knocking at your door (for the most part). If an individual can prove that some other wrong has befallen them, such as defamation or breach of the right to privacy, then they might win a civil case. But this will be an extremely expensive course of action.
Davide Buccheri was convicted of harassment in 2018 after he made fake explicit images of an intern at his workplace and directed her manager’s attention to them in an effort to discredit her. He was ordered to pay £5,000 in compensation and jailed for 16 weeks. He was also fired from his job.
District Judge Richard Blake attempted to balance the damaging impact of a sentence on the young man’s promising career with the long-lasting impact of the crime on the victim, saying, “She will live forevermore with the fear that someone will Google her name and some ghastly website will come up and she will be reminded of the offences that you invoked.”
These solutions seem generally unsatisfactory, however. They require a strange contorting of a patchwork of laws that simply were not designed with deep fake pornography in mind.
Relevant criminal law provisions are found in the UK’s revenge porn legislation: sections 33–35 of the Criminal Justice and Courts Act 2015. Section 33 provides that:
It is an offence for a person to disclose a private sexual photograph or film if the disclosure is made—
(a) without the consent of an individual who appears in the photograph or film, and
(b) with the intention of causing that individual distress.
However, the Act narrows the offence considerably, providing that “A person charged with an offence under this section is not to be taken to have disclosed a photograph or film with the intention of causing distress merely because that was a natural and probable consequence of the disclosure.”
Thus, proving “revenge” in revenge pornography cases is an uphill battle for prosecutors. Lowering the required level of intent to cause harm (mens rea) might be one option: the distribution of deep fake pornographic videos could be made a strict liability offence. This would send a strong message about the severity of the harm done to victims of this form of “digital rape”.
One problem with criminal prosecutions and civil lawsuits of this kind is that they are only effective if the victim knows whom they are trying to prosecute or sue. In many (perhaps most) deep fake cases, the perpetrator is anonymous. Nor are these routes especially effective for having videos removed from the internet in an expeditious manner.
Rather perversely, revenge pornography is classified as a communications offence rather than a sexual offence. This means that victims are not given anonymity when they bring their cases to the police! Would you want to pursue your case, knowing your name would be made public as soon as you did, and that you would probably have to wait months or years for a hearing, by which time the damage may well be done?
Strangely, copyright law provides one of the best avenues of remedy for victims of image-based sexual abuse because many revenge pornography pictures are selfies. (This means that the subject of the photo is also the copyright holder. Copyright holders can request images or videos are removed from websites.) But, again, this is pretty toothless (and unsatisfactory) when it comes to deep fake pornography.
I feel that new legislation is needed to outline specific offences pertaining to deep fake pornography. Such legislation could helpfully clarify specific acts that constitute this new type of twenty-first century harm. It would be especially helpful for elucidating the nature of non-sexual crimes involving DFT.
What should the law do?
New legislation is needed to specify crimes arising from the use of DFT. The legislation should have separate provisions pertaining to sexual uses of DFT and other abuses of this technology. It will have to be careful not to curb free speech to our detriment, leaving plenty of space for creative and commercial applications of DFT. However, if X can reasonably foresee that they will bring Y to a state of deep distress and embarrassment by producing deep fake images, then the law should seek to protect Y and bring X to justice – civil and, where there is mens rea, criminal justice.
The problem with DFT is dissemination, rather than production. This is because the harm is of a public, rather than private, nature. We would likely be disturbed by an individual who chose to pass her time by pencilling hyperrealistic pictures of violent sex acts. We would not, however, want the law to invade this aspect of our private lives. Engineering DFT images is, for some people, an art form, a science experiment. The law should not prevent individuals from making deep fake videos in their own home.
However, the law should prohibit the dissemination and sharing of pornographic DFT engineered videos. Cynics might argue that this is too difficult a margin to patrol: What kind of sharing is allowed? Can an individual send the videos to a friend, or upload them to his own private cloud sharing facilities? What if a friend comes over to view these images on the computer, but the individual doesn’t actually send them anywhere? What if I run private screenings of these films in my backyard?
My preferred solution would be the gradual introduction of an image right grounded in Article 8 of the ECHR: the right to respect for private and family life. Our image (or likeness) is a key part of our identity. Victims should be able to sue websites that do not expeditiously remove images or videos showing their likeness, if they feel that their right to private and family life has been violated. In cases where someone’s likeness appears in a pornographic video, I think this violation would easily be proved.