Deepfakes and style mimicking – Should New Zealand adopt a right of publicity?
A style that has taken an artist a lifetime to curate can now be learned in a matter of months by generative AI.
Canadian visual artist Sam Yang creates distinctive anime-style, almost photo-realistic, artworks. He has millions of followers on socials and sells (and displays) his works online. In 2022, he discovered that a generative AI algorithm had been trained on hundreds of his works and was now capable of producing works “in the style of” Sam Yang. He discovered this when he was asked to judge a competition for the best impersonation of his work. Perhaps understandably, he was not impressed. He had spent years developing his style, and a generative AI model had learned to copy it in a matter of months. Users could now quickly create works in his style, in direct competition with him.
In 2023, YouTube channel “Curious Refuge” uploaded a trailer for “The Galactic Menagerie”, which appeared to be Star Wars in the style of director Wes Anderson. However, Mr Anderson had nothing to do with it. Using generative AI, Curious Refuge had arguably copied his very distinctive cinematic style.
Generative AI (in particular Generative Adversarial Networks, or GANs) is also being used to create deepfakes: realistic images, videos or audio recordings of real people that appear authentic but are not. Some generative AI requires a recording of only three seconds of a person’s speech in order to generate that same voice saying anything at all, complete with the speaker’s own particular emotional tone.
Voice actor Bev Standing learned this the hard way in 2021 when she discovered that her voice was being used for TikTok’s original text-to-speech feature. Users could type anything they liked and then make Standing’s voice say those words (no matter how offensive). Standing had never given permission for her vocal recordings to be used to generate entirely new speech in this way, and sued TikTok’s parent company, ByteDance.
Musical artist David Guetta played with this concept in February 2023, when he posted a track that he titled “Emin-AI-em” to Twitter (as it was then known). It sounded as if it had been written and performed by Marshall Mathers III. Instead, Guetta created the work himself by asking ChatGPT to “write lyrics in the style of Eminem about future rave”, and then using generative AI (Uberduck) to recreate Eminem’s voice performing the ChatGPT-composed lyrics. The entire process took him one hour. Perhaps sensibly (as Eminem can be fairly litigious), Guetta confirmed “obviously I won’t release this commercially”.
Similarly, in April 2023, a musical track by “Ghostwriter” titled “Heart on my Sleeve” was uploaded to streaming platforms such as Spotify. It instantly gained attention because it sounded like a new collaboration between Drake and The Weeknd, but it was not. It was created using a vocal synthesiser AI, without their permission.
The legal position in New Zealand
While generative AI certainly has its fair share of copyright issues, arguably none of the outputs in the examples above amount to copyright infringement, particularly in common law jurisdictions (like New Zealand) that have no concept of derivative works. Copyright doesn’t protect an idea, a personal likeness, or a style, per se. So, if someone in New Zealand has used recordings or images of a real person to train an AI to create entirely new recordings or digital replicas of that person, which don’t substantially reproduce the original inputs, there is no copyright issue with those outputs.
Facial images and voices are biometric information, which is regarded as sensitive personal information under the Privacy Act 2020. If your image or voice has been collected in breach of the Privacy Act, the collecting agency may be liable for that breach, but what if they have already trained an AI using your photos and voice, and a different entity is now commercially exploiting your digital replica? It is unclear whether the Privacy Act would apply in this scenario.
It is also far from clear that the Harmful Digital Communications Act 2015 would apply, including because that Act requires the relevant conduct to result in “serious emotional distress” and it is arguably designed to address the unauthorised use of real images of individuals, rather than images generated by algorithm.
If the resulting material does not amount to defamation, the tort of invasion of privacy (which carries a “highly offensive to the reasonable person” threshold) or a breach of confidence, and the subject is not “sufficiently famous” to rely on the tort of passing off or consumer protection legislation (such as the Fair Trading Act 1986), it is possible that no remedy would arise here for appropriating a voice or a likeness for commercial purposes.
A right of publicity
But perhaps (in light of what AI can now do) it very much should. The USA and other jurisdictions have adopted a right of publicity (sometimes known as personality rights): the exclusive right to control the commercialisation of a person’s likeness, including identifiable features such as their appearance, name and voice.
This right developed in the USA in the 1970s as part of a set of four “privacy” torts advocated for by Professor William Prosser:
1. Intrusion upon seclusion;
2. Public disclosure of private facts (invasion of privacy);
3. Publicity placing a person in a false light (a little like defamation); and
4. Appropriation of a person’s name or likeness (aka breach of publicity rights).
New Zealand has adopted the first two of these torts, and the tort of defamation, but not the fourth. And it is the fourth which arguably most appropriately addresses unauthorised commercial deepfake usage, by granting people an actionable monopoly over the commercialisation of their own likeness.
Due to the existence of this fourth tort in the USA, recent generative AI lawsuits concerning algorithms creating outputs “in the style of” certain authors’ works also claim breach of publicity rights. These lawsuits argue that an author’s particular writing or artistic style is a key component of their personality and likeness and cannot legally be appropriated commercially without their consent. And you can perhaps understand their concern. A style that has taken an artist a lifetime to curate, and on which they rely for earning a living, can now be learned in a matter of months by generative AI. Deepfakes and vocal synthesiser outputs can be created in minutes, and could potentially pose an existential threat to performers, models and actors. Concerns about this threat featured heavily in the recent SAG-AFTRA American actors’ union strike.
Perhaps it shouldn’t be necessary for a subject to be famous in order to prevent their likeness and personal characteristics from being appropriated commercially without their consent. And therefore, arguably, the time is now right for New Zealand to adopt Prosser’s fourth tort.