Celebrating a 21st birthday can bring with it equal portions of excitement and dread. Excitement at celebrating your birthday. Dread at the thought of what embarrassing videos may be dug up and shared among your friends and family.

But what if the dread of such embarrassing content lingered beyond milestone celebrations? What if embarrassing videos of you were shared without your knowledge? What if the person in those videos looked like you, sounded like you, but wasn’t you?

Artificial intelligence (‘AI’) is the latest tool in the arsenal of those up to no good on the internet. AI can now be used to generate disturbingly realistic videos of anyone from politicians to celebrities to your neighbours, and even those without advanced computer skills can potentially create them. Just how convincing are such videos? The ABC looked into this and you can decide for yourself here. The availability of this technology reignites concerns about how effective the law is when it comes to protecting your identity and privacy.

Personality Rights

Observers of the celebrity media may be familiar with the concept of personality rights. From time to time, public figures have taken issue with the unauthorised use of their likeness. In the United States, a person’s likeness is protected by their “right of publicity”. This right can be exercised to prohibit the misappropriation of identity for commercial value. However, no such right is recognised by the law in Australia. Instead, a person who finds their identity used in a fake video created using AI must seek the protection of other areas of the law.

Defamation

The tort of defamation serves to protect a person’s reputation. A fake video will be defamatory where:

  1. It carries some defamatory imputation;
  2. It is capable of identifying the person; and
  3. The video has been published.

This is an especially pertinent possibility in the age of social media, an inherently public means of communication. Of course, the question of whether a video is fake or real may have a bearing on whether it carries a defamatory imputation. A key issue created by the use of AI is how to prove that the person in a compromising video of you is not actually you at all.

Australian Consumer Law

The use of fake videos may also fall foul of the Australian Consumer Law. The Consumer Law prohibits the use of a fake video in trade or commerce where that use would be likely to mislead or deceive. This was the case in Talmax Pty Ltd v Telstra Corporation Ltd [1997] 2 Qd R 444, in which Telstra’s unauthorised use of an image of Kieran Perkins was found to be misleading and deceptive. Of course, the protection of the Consumer Law in such situations would likely be limited to celebrities and public endorsers. The use of a fake video containing an ordinary person is not likely to mislead or deceive consumers.

Privacy

Australia is yet to recognise a right to privacy. In ABC v Lenah Game Meats (2001) 208 CLR 199 the High Court expressed a willingness to one day recognise a tort for breach of privacy. But more than 17 years later, Lenah Game Meats remains the most comprehensive judicial consideration of the matter. Time will tell whether the misuse of a person’s image in a fake video gives a court the opportunity to revisit the issue and decide whether a fake video amounts to an invasion of privacy.

Stalking

A final consideration is the potential criminal implications of making such videos. In Victoria, section 21A of the Crimes Act 1958 (Vic) makes it an offence to publish any material on the internet with the intention of causing physical or mental harm to the victim, or of arousing apprehension or fear for their safety. Such acts are punishable by up to 10 years’ imprisonment.

Prevention is Better than a Cure

Fortunately, there are some ways of addressing this. D-ID is a Tel Aviv-based start-up whose technology can protect your photos and videos from being hijacked, whether for facial recognition used to access your personal data and assets (such as bank accounts) or for manipulation.

Some Comfort

Boaz Fischer, CEO and Founder of CommsNet Group and author of the recent book “Protecting Your Business from Cyber Attacks 10 Minutes a Day”, is confident that, for the moment at least, the technology required to create this sort of manipulated content is so labour-intensive that it is really only worth creating “fake content” where the stakes are high, for example in the case of mimicking Donald Trump. For now, it seems that we are relatively safe from this new type of predatory attack.

Conclusion

The increasing availability of AI to create fake videos using a person’s likeness is yet another challenge faced by the law in the digital age. It remains to be seen how well the law is equipped to deal with such matters as the technology becomes more readily available and easier to use. What is clear is that the law will need to develop to protect unsuspecting members of the public who find themselves the subject of a video in which they never actually appeared.


Disclaimer
The material contained in this publication is meant to be informational only and is not to be construed as legal advice. Tisher Liner FC Law will not be held liable or responsible for any claim, which is made as a result of any person relying upon the information contained in this publication.
