Facing the future – A biometrics code of practice for New Zealand?

New Zealand's Privacy Commissioner wants to introduce stronger protections to mitigate the risks associated with the automated processing of biometric information.

Insight | Published 22 March 2024

In a press release published late last year, Privacy Commissioner Michael Webster announced that the Office of the Privacy Commissioner (OPC) will release an exposure draft of a biometrics code of practice early this year (Proposed Code).

While the use and collection of biometric information is currently regulated under the Privacy Act 2020 (Privacy Act), the OPC wants to introduce stronger protections to mitigate the specific risks associated with the automated processing of biometric information.

What is biometric information?

The OPC defines “biometric information” as information about a person’s physical or behavioural characteristics (such as a person’s face, fingerprints, voice, or gait).

Because it relates directly to a person’s image and characteristics, which are tied to a person’s identity and are difficult (sometimes impossible) to change, the OPC considers biometric information to be particularly sensitive personal information.

What is biometric technology?

Biometric technology is automated technology that identifies or verifies individuals from human body characteristics such as fingerprints, facial patterns, retinas and voice patterns. These systems use algorithms to compare raw samples of biometric information (such as a photo or fingerprint) against biometric templates, which are digital mathematical representations of previously collected biometric information. They include facial recognition, finger scanning and voice recognition technologies.

Biometric technology has transformed identity verification and security. Many of us use it daily to unlock our phone, access our digital wallets or avoid the queues at airport border control. Biometric technology can play an important role in improving policing and law enforcement outcomes (including by identifying suspects). And it has real potential benefit in commercial contexts, such as age verification to protect minors, identifying problem gamblers, and preventing retail crime to better protect customers and staff.

So, is there a problem?

The Privacy Act already generally applies to the collection and use of biometric information as personal information.

But the OPC has identified “key privacy risks” associated with biometric information:

• Unnecessary or high-risk collection and use (such as the risk of mass surveillance and profiling of individuals);

• Function and scope creep (where biometric information collected for one purpose is used for another); and

• A lack of control or knowledge about when and how biometric information is collected and used (including collecting information from people without their knowledge or involvement).

The OPC has also stated that the automated processing of biometric information poses new or increased privacy risks – specifically, that biometric technology can:

• Be used to track, monitor, or profile people in ways that are “intrusive, discriminatory or creepy”, often without their knowledge; and

• Misidentify or misclassify people, who can suffer disadvantage because of these decisions or mistakes.

The OPC has concluded that the level of risk and intrusiveness is not the same for all biometric recognition, and that there is higher risk to individuals when decision-making based on the use of biometric information is automated, removing human oversight.

A new code of practice

The Privacy Act gives the OPC the power to issue codes of practice that modify the operation of the Act and set specific rules for types of personal information. Based on public consultation, the OPC has concluded that the privacy risks associated with collecting and using biometric information in automated processing warrant a new code of practice that will vary the general obligations under the Privacy Act. This is a pivot from the OPC’s previously published position, in October 2021, that a new code of practice dealing with biometric technology was not needed.

The Proposed Code has not yet been released for consultation, but the OPC has published some information about what it will cover.

Scope of the Proposed Code

The Proposed Code will apply to all agencies regulated by the Privacy Act when they collect and use biometric information to verify, identify or classify individuals in or via automated processing.

The OPC has indicated that “automated processing” is likely to mean “using a technological system that employs an algorithm to compare or analyse biometric information”. This would include technology that automates the verification, identification, or categorisation of an individual, such as facial recognition, finger scanning or voice recognition technologies.

The Proposed Code will not apply to:

• The use of biometric information in manual processes (for example, photographs in archival collections);

• Health information (if this information is already covered by the Health Information Privacy Code 2020), genetic information or neurodata; and

• Any information that is not personal information (that is, not about an identifiable individual).

The OPC has decided that the Proposed Code will focus on three requirements:

• Proportionality assessments – agencies must carefully consider whether their collection and use of biometric information in or via automated processes is proportionate to the potential privacy risks and intrusions that may occur. The OPC has previously stated its view that agencies need to be able to articulate a “strong business case” for using biometric technology in a targeted and proportionate way to meet the agency’s needs.

• Additional transparency and notification requirements – mandatory obligations on agencies to be open and transparent about their collection and use of biometric information in or via automated processes (e.g., through clear signage notifying the public of its use).

• Purpose limitations – restrictions on the collection and use of biometric information in or via automated processes in certain circumstances. The OPC is proposing to rule out some use cases for the automated processing of biometric information, such as targeted marketing, classifying people using prohibited grounds of discrimination, inferring someone’s emotional state, and detecting health information.

International regulation

Stricter regulation around when and how biometric information can be collected and used in or via automated processing – such as the Proposed Code – would bring New Zealand more in line with comparable jurisdictions like the EU and Australia.

For example, article 22 of the European Union’s General Data Protection Regulation (GDPR) gives individuals the right not to be subject to a decision based solely on automated processing, including profiling, where that decision produces legal effects concerning the individual or similarly significantly affects them.

Under section 6 of the Australian Privacy Act 1988, both biometric information that is to be used for the purpose of automated biometric verification or biometric identification, and biometric templates, are included in the definition of “sensitive information”. In Australia, sensitive information may only be collected with consent, except in specified circumstances, whereas consent is generally not required to collect personal information that is not sensitive information.

Regulating biometric technology in New Zealand

Exactly how well a code dealing with biometric technology will work in practice in New Zealand will depend on the detail published in the exposure draft of the Proposed Code.

As the OPC has itself previously stated:

If designed well and used appropriately, biometric systems have significant benefits. These include convenience for individuals wanting to have their identity verified, efficiency for agencies seeking to identify people quickly and in large numbers, and security (because they use characteristics that cannot easily be faked, lost, or stolen). Biometric systems can also play a role in protecting privacy, by helping to guard against identity theft and fraud.

Biometric technology can also serve the legitimate purpose of detecting and preventing crime. In 2023, the Information Commissioner’s Office in the United Kingdom concluded that the live facial recognition technology provided to the retail sector by security company Facewatch served a legitimate purpose for using people’s biometric information and could be used in compliance with data protection legislation.

The ICO’s Deputy Commissioner for Regulatory Supervision stated:

Innovative solutions helping businesses prevent crime is in the public interest and a benefit to society. Data protection law recognises this, allowing personal information – in this case facial images – to be used if there is a legitimate interest, such as for the detection and prevention of crime. However, these benefits must always be balanced against the privacy rights of the individual.

The OPC has stated that it is keen to ensure that the Proposed Code is effective and workable, so it will need to balance these same concerns.

In addition to the three requirements already identified by the OPC, agencies looking to implement and use biometric technology will first need to understand, and clearly articulate in a Privacy Impact Assessment, the privacy impacts of the use case for the biometric technology and how they will be addressed and mitigated. For example, agencies will need to address and mitigate the risk that the results of automated processing of biometric information are inaccurate or biased, e.g., false results, or inaccuracies because of race or gender. Human oversight and the manual checking of results before significant decisions are made will be a critical safeguard.  

Agencies may also need to take steps to limit the personal data they collect and use in or via automated processing, and may need to ensure that vulnerable people, such as children and young people, are excluded from the automated processing of biometric information. Given the sensitive nature of biometric information, agencies can also expect specific guidance from the OPC about security and accuracy measures that should be adopted.

Next steps

The Proposed Code will be released this year for public comment. After this initial consultation, a more formal code consultation process will follow before any code is issued. If the decision is made to issue a code, we would not expect to see it finalised before late 2024.

If you have any questions in relation to the OPC’s Proposed Code, or the collection and use of biometric information within your organisation, please do not hesitate to contact us.
