The right to ownership of one's own likeness

Proposal to the United Nations General Assembly (UNGA)

Pre-empting the problem of deepfake videos


Background


We presently stand on the edge of an abyss in which social media threatens to uproot our world order and cast us into chaos, as seen in the recent attack (6 January 2021) on the United States Capitol by conspiracy theorists incited on social media.


Deepfake videos are motion pictures created to look as though they depict the actions and words of an existing person, usually a celebrity, but which contain content, words or actions that were never performed by that person and which may impugn their dignity. To demonstrate the seriousness of the problem, the creators of the television satire series “South Park” produced a video called “Sassy Justice”, presented by someone who looks like Donald Trump. The video also features the likenesses of Al Gore, Ivanka Trump, and Mark Zuckerberg of Facebook, all making uncharacteristically exaggerated remarks; in it, for example, Mark Zuckerberg is selling kidney dialysis services rather than social media products. You can see the video here:


  • https://www.youtube.com/watch?v=9WfZuNceFDM

A similar video has been created of HM Queen Elizabeth dancing on a table:


  • https://www.youtube.com/watch?v=IvY-Abd2FfM

And here is a video of HE President Putin telling Americans that there is no need for Russia to interfere with the USA, since Americans are quite capable of destroying their own democracy:


  • https://www.youtube.com/watch?v=sbFHhpYU15w

As the General Assembly can see from these examples, the seriousness of this problem cannot be overstated.


Other risky technologies include photoshopping, that is, using the well-known graphics software Adobe Photoshop (www.adobe.com/photoshop/) to modify a photo of a person so as to change its context. For example, it is possible to photoshop a picture of a head of state to place them amongst undesirable persons, making it appear as if they are on friendly terms with those persons. To give a specific example, one could place a president at a table and make it look as if he is making plans with, say, terrorists, thereby implicating him in terrorism against his own people.


An example of a fake image which Mr Trump shared on social media is here:


  • https://www.cbsnews.com/news/president-trump-retweets-fake-image-nancy-pelosi-muslim-head-coverings-in-front-of-iranian-flag/


Legal and military implications


The legal and military implications of this technology are extremely serious. Potentially, a video could be made in which a head of state apparently utters xenophobic, racist or other hate speech, or in which they declare war. Before the actual head of state could issue a rebuttal, it is quite possible that the opposing nation would already have begun manoeuvres or launched military strikes.


Personal dignity threats and cyberbullying


This technology is not especially new; what is new is applying it to remodel real people. We are all familiar with the dinosaurs seen in Jurassic Park (1993). Applying the same 3D computer modelling to a person is also relatively easy, with the result that we can now plausibly make any person appear to say or do anything, including performing in pornography. For an example of this, see www.mrdeepfakes.com, which contains videos of celebrities apparently performing in pornography, including celebrities who certainly would not do so, e.g. celebrities with well-known feminist stances.


The implication, therefore, is that there is not only a military threat but also a threat to the personal dignity of all persons. If deepfake technology becomes as ubiquitous as Photoshop, there is a serious risk of cyberbullying by bad actors. For example, a schoolchild who becomes competent at deepfake manipulation could potentially make a pornographic deepfake of a teacher or another pupil.


Fake news threats


Videos of persons could also be created which merely depict fake news, even where the content is not, for example, a declaration of war or pornography. For instance, someone could create a deepfake of Bill Gates (former chairman of Microsoft and philanthropist) advocating genocide via vaccines. This sort of fake news has already been circulated and repudiated online; however, making it illegal would reduce people’s willingness to share such material, since Mr Gates could appeal on the grounds that it violates his rights.


Proposed solution: The right to ownership of one’s own likeness


It is proposed that the UNGA vote on, and hopefully ratify, the following draft treaty.


  1. Definition: celebrity or public person. Any person whose name and likeness are known beyond their circle of friends and family and who has appeared in any mass media, particularly television, traditional printed newspapers, or large-scale print publications such as novels. This includes actors, artists, popular authors, radio personalities, politicians, and high-ranking government or military officials.
  2. Definition: satire. The depiction, whether verbally, via audio, or via visual media such as images or videos, of a person or group in a manner which exaggerates, stylises or stereotypes their typical behaviour, appearance or actions. The purpose of satire is to draw humorous attention to the unacceptable nature of the behaviour so depicted. Some people, however, interpret satire literally and do not understand that it is humour. For that reason, we propose that satirical images, videos or audio files henceforth carry a disclaimer stating that they are satire (a “digital health warning”); an illustrative sketch of one possible labelling mechanism follows this list.
  3. Definition: to photoshop. To edit a photograph so as to change it from its original appearance in such a way that the viewer cannot tell that it was edited, with the intention of deceiving or misleading the viewer. For example, during lockdown, an image was circulated which appeared to show a rail tanker carrying Covid-19 as cargo (https://www.rtands.com/rail-news/viral-photo-of-covid-19-rail-freight-tanker-is-ruled-fake/).
  4. Definition: deepfake. A video (motion picture) in which a person is depicted as acting, performing, moving or speaking, where they were not involved in the creation of that video and did not perform the actions or speak the words so depicted.
  5. Definition: impugnment of dignity. The degradation of a person’s dignity by means of misrepresentation, e.g. through slander, libel, or, in the visual arts, the creation of images or video footage which aims to show a person in a bad light, in a way which is untruthful, non-satirical, and clearly not consistent with the normal behaviour of the person concerned. If, for example, we know that a person normally behaves a certain way and would never publicly perform some action generally considered undignified or reprehensible (call it action R), then the creation and sharing (distribution or broadcasting) of an image or video which depicts that person performing action R impugns their dignity.


  6. Definition: censorship. The removal or redaction of information which goes against the views, values or laws of those in power in a particular territory or domain, specifically because the material goes against those views (political or religious), values, or laws. Importantly, disagreement is not censorship: in any state deemed democratic, a person may respectfully disagree with views (political or religious), values or laws, and state their disagreement in public, without censorship.


  7. Definition: free speech. The limited right to express one’s own views, values, opinions, artworks or other creative works, without restraint other than adherence to more fundamental human rights such as the rights to life, dignity and personal freedom. Free speech is therefore curtailed to the extent to which it impugns the dignity of others on any grounds, whether their appearance, race, gender, religion, or sexual orientation. Note that disagreeing with a person about their religion or political affiliation does not constitute an impugnment of their dignity, merely a disagreement.
  8. The right to ownership of one’s own likeness. A new treaty is therefore proposed in which a new human right is defined, derived from the right to property (including intellectual property) and the right to bodily integrity: the right to ownership of one’s own likeness. In effect, it proposes that, by default, all persons immediately henceforth have full, sole and exclusive intellectual property rights over their face specifically, but not over their general bodily appearance or clothing style.
  9. The aim of this treaty would be to prevent the future creation of deepfakes of any person, and to immediately make all existing deepfakes illegal and subject to unopposable take-down requests, even against free speech clauses. The general principle is that if a video or image impugns a person’s dignity, is not intended as satire, and was not voluntarily authorised by the person depicted therein, it is immediately a copyright violation.
  10. It is proposed that the right to ownership of one’s own likeness supersedes the right to free speech, such that a person cannot protect a deepfake by appealing to their right to free speech.
  11. It is proposed that this right be incorporated into the constitutions of all nations which are signatories to the treaty, as constitutional amendments, with immediate and retrospective effect for all living persons.
  12. Supersedes copyright of other materials. The right to ownership of one’s own likeness supersedes all other copyrights; by including someone as a deepfake in a product or creation, the creator automatically grants that person copyright ownership of that material, including any and all royalties derived therefrom. So, for example, if a celebrity is superimposed into a video, earnings derived from that video must be ceded to the celebrity, along with the original files and materials used in its manufacture, and the person who created the materials ceases to have any rights over them. This is subject to the conditions and exclusions defined below.
  13. National penalties for non-signature. It is further proposed that all nations which decline to sign the treaty be subject to sanctions in the form of copyright invalidation. That is, nations which decline to accept this proposal forfeit the copyrights of any artistic works, texts, motion pictures, music, audio productions, artworks, books or other creative materials that they have produced, for the duration of their refusal. The intention of this clause is to prevent any nation from becoming a “refuge” for deepfake creators.
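
To illustrate the “digital health warning” proposed in definition 2 above, the following minimal sketch shows one possible way such a disclaimer could be stamped onto a satirical still image. The treaty does not prescribe any particular implementation; the use of the Python Pillow library, the banner layout and the wording of the label are purely illustrative assumptions.

    # Illustrative sketch only: stamp a visible satire disclaimer
    # (a "digital health warning") onto an image using the Pillow library.
    from PIL import Image, ImageDraw

    def add_satire_disclaimer(input_path, output_path,
                              label="SATIRE: this is not a real depiction"):
        img = Image.open(input_path).convert("RGB")
        draw = ImageDraw.Draw(img)
        # Reserve a solid banner along the bottom edge so the label
        # remains legible regardless of the underlying picture.
        banner_height = max(24, img.height // 20)
        draw.rectangle([(0, img.height - banner_height), (img.width, img.height)],
                       fill="black")
        draw.text((10, img.height - banner_height + 5), label, fill="white")
        img.save(output_path)

    # Example usage (file names are placeholders):
    # add_satire_disclaimer("satire_frame.png", "satire_frame_labelled.png")

A video disclaimer would need to be rendered onto every frame or carried in the broadcast itself; the sketch above covers only still images.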

Problems, exclusions and subclauses


  1. Permission. In order to take a photograph of someone and model them on a computer to produce a deepfake, or to photoshop them into a photograph, one would have to obtain written consent from the owner of the likeness, that is, the person themselves, and the resultant image must state that the person’s likeness has been used with their explicit permission. Without this permission, the default assumption shall be that permission is not granted and that the copyright in the person’s likeness has been violated. The exclusion here would be for satirical purposes, in which case the modified image would have to include a disclaimer stating that it is satirical.
  2. Children in most countries are not legally able to sign away rights. Hence, deepfakes and photoshopping of children into artificial or non-real scenarios shall be prohibited entirely, with the exception noted below (advertising). This has the added benefit of preventing the creation of deepfake child pornography.
  3. Advertising. It is recognised that this proposed treaty could prevent the use of likenesses of persons in advertising, or the editing of images in advertising; hence an exclusion clause is proposed. For example, one might see a picture of a baby on a background of flowers on a toiletry product. Since this treaty proposes banning photoshopping (placing the baby on a background of flowers), and since it proposes age limits (a baby cannot sign a contract), it follows that a toiletry with a baby on a background of flowers would be illegal. Clearly that is unreasonable and excessive; hence, the proposed amendment to the right is that parents may sign for the use of a child’s image, whether deepfaked or photoshopped, but only if the child’s dignity is preserved (i.e. no nudity or inappropriate or sexual poses, actions etc.), and the signing-off is not indefinite, that is, it falls away when the child reaches age 25. At that point, copyright in the materials would pass to the child depicted therein, in perpetuity, and would no longer inhere in the corporation that paid for the advert or the artist who created it. This would therefore prevent, for example, the creation of paedophilic images or other such materials, whether deepfake or not, but allow the use of children in advertising subject to the restrictions defined above (preservation of dignity).
  4. Monozygotic twins look the same as each other, which creates a situation in which, as adults, monozygotic twins would have to jointly sign a contract authorising the use of their likeness in deepfake videos or photoshopped images. No individual monozygotic twin may therefore sign away their likeness without their twin’s written consent. (Non-monozygotic twins do not exactly resemble each other, and hence this clause would not apply to them.)
  5. Existing actors who have appeared in videos which include 3D-rendered likenesses of them would have to sign retrospective clauses with the relevant motion picture (movie) studios to grant those rights retrospectively. A particularly challenging case is that of Carrie Fisher in the film Rogue One: A Star Wars Story:

https://www.cinemablend.com/news/2559935/rogue-one-deepfake-makes-star-wars-leia-and-grand-moff-tarkin-look-even-more-lifelike

At the end of the film, Carrie Fisher appears as a deepfake in which she is her 19-year-old self again. Since Carrie Fisher has regrettably passed away, she cannot retroactively sign away the rights for that scene. Hence, it is proposed, her estate and surviving relatives must be contacted to sign away those rights. Failing that, the scene would need to be removed or altered so as not to depict her face.

  6. Royalties for the deceased. It is therefore proposed that any films or motion pictures which depict deceased actors automatically owe royalties to the heirs of those actors or to their estates, until such time as copyright expires on that film in its territory of origin. That is, by acting in a film, an actor does not grant rights in perpetuity for the use of their likeness unless an explicit clause to that effect was included in the contract they signed when agreeing to appear. In the case of deceased actors, it will be assumed that such rights were not granted, and therefore that either future royalties are due or the copyright of the film expires. The same would apply to any “extras” (background persons) in films, who may wish to be removed, remunerated, replaced or modified, or to waive their rights.


  7. Limitation on a film’s age. Retrospective rights shall apply only to films from 1992 onwards, the point from which realistic deepfakes became possible. Films made before 1992 are not bound by this treaty.
  8. Pornography, adult filmography, and ceding of rights. It is proposed that deepfake pornography or adult filmography be banned in its entirety unless the producer of the video or still image can demonstrate a contract with the person depicted in which that person signed permission for its creation. This would allow, for example, an actor to create a scene in a motion picture in which they seem to feature but in which they did not in fact perform at all. In such a case the actor could request that the scene be made as a deepfake, thus preserving their physical dignity and right to bodily integrity, even if not preserving their visual dignity or reputation.
  9. Minimum age. It is proposed that no one under the age of 25 be permitted to sell their likeness, owing to a lack of maturity and appreciation of the future reputational consequences of doing so. It is further proposed that this treaty cover edited photographs as well as videos and deepfakes; that is, that persons under 25 may not sell their likeness without parental consent, even for still imagery.
  10. Right to be forgotten. It is further proposed that this treaty formalises the Right to Be Forgotten: that is, if it is accepted in principle that all persons hold copyright over their own imagery, then any person requesting a permanent take-down of materials relating to them, whether visual or written, has a right to be forgotten (that is, removed from the records), specifically online social media records, internet search, and social reviews or ratings online. The aim of such a right would be to preserve the dignity of a person and allow them to undo reputational damage from past mistakes, so that those mistakes are not permanent and do not affect them for the remainder of their lives. However, the right to be forgotten should not be granted universally, and specifically not to politicians, whose acts are sometimes egregious and ought not to be expunged from the record (think of, say, Adolf Hitler); nor to persons implicated in serious high-profile charges, for example the Epstein case; nor to serial killers, etc.
  11. Limits on photoshopping generally. Photoshopping was defined above as editing a photograph in order to mislead or deceive. This treaty recognises that unrealistic photoshopping of fashion and glamour models, as well as actors, poses a real risk of harm to impressionable youth who feel pressured to meet standards which do not exist outside the digital realm. It is therefore proposed that, in order to preserve the dignity of persons viewing the material (and to protect them from harmful consequences such as the development of anorexia), unrealistic photoshopping of persons also be limited, except for satirical purposes (see, for example, the Zombie Angelina Jolie case, https://globalnews.ca/news/7520567/iran-sentences-zombie-angelina-jolie-to-10-years-in-jail-for-photos). In the latter case, since the modifications depicted are clearly satirical and digital, the jail sentence of 10 years is clearly egregious.


  12. Fake news. It is further proposed that photoshopping or creating deepfakes so as to stir up fear or civil unrest, by misleadingly editing photographs or inserting persons into videos in which they did not originally appear, for political or social purposes which are non-satirical, shall be deemed “fake news” and banned.
  13. News reports. It is proposed that public figures or celebrities need not give authorisation for the use of their images for the purposes of news reports only, subject to the criterion that their images are shared without modification, deepfaking or photoshopping. The aim of this clause is to allow genuine news reports (video or otherwise) to cover public figures’ activities without authorisation, while not permitting the imagery to be adjusted so as to impugn the dignity of the person represented, except for satirical purposes, in which case it must be clearly marked as satirical. This would prevent, for example, the creation of a fake image of a president conspiring with terrorists, but would allow a generic news report on television to show the president as they normally appear (e.g. suggesting investigation into bleach as a cure for Covid-19).
  14. Creation and distribution: penalties. It is proposed that the distribution of deepfakes, fake news, or edited videos or images which impugn a person’s dignity and which were not authorised by that person (except for public personages in a legitimate news reporting or satire context) be henceforth illegal and subject to a fine as well as an unopposable take-down order.
  15. No censorship threat. The above treaty does not justify or authorise the censorship of free speech. Free speech in the form of clearly marked satire, clearly authorised advertising, unmanipulated advertising images, expressions of political and/or religious views, artistic expressions, and unmanipulated socially informative images or videos is still permitted and not subject to take-down requests under these laws, provided that it does not violate the limits of free speech defined above.

