Tech
May 30, 2024

Our comments on the Biometric Processing Privacy Code

The Office of the Privacy Commissioner (OPC) has now released the exposure draft of the Biometric Processing Privacy Code (Draft Code), and Hudson Gavin Martin has made a submission.

Once implemented, the new code of practice will introduce stricter regulation of when and how biometric information can be collected and used in or via automated processing. How well this works in practice will depend on the detail. As currently drafted, we are concerned that:

• The scope of the Draft Code is broader than the purpose for which a standalone code has been proposed by the OPC (i.e., to address the privacy risks posed by automated processing of biometric information);

• The Draft Code could effectively operate as a complete ban on agencies adopting and using any new biometric processing technology or biometric processing for new use cases;

• The inclusion of the proportionality test in IPP1 is not appropriate; and

• The lack of clarity in some parts of the Draft Code would make it difficult for agencies to apply the Draft Code to their biometric processing activities.

We have raised these concerns with the OPC in our submission and discuss them further below.

Scope of code

The “Exposure draft of a biometric processing code of practice: consultation paper”, published by the OPC in April 2024, states that the Draft Code should relate only to the automated processing of biometric data.

This makes sense. It is the actual automated processing of biometric information that increases privacy risks for individuals, and that is the OPC’s published reason for the need for a standalone code. We understand that biometric information that could potentially be used for automated processing, but is neither collected nor used for that purpose, is not intended to be caught by the Draft Code.

However, we consider that the Draft Code does not currently make that distinction adequately and is not sufficiently limited to its purpose.

Section 4 (Application of code) states that the Draft Code applies to “the activity of biometric processing” and “biometric information as a class of information for purposes of that activity”. We agree that a purposive approach is important, as the Draft Code should be designed to apply to agencies when they actually collect and use biometric data in or via automated processing. While the first limb of this scope is consistent with the OPC’s stated objective, the second limb is ambiguous: it could be interpreted to cover biometric information simply because that information could be used for biometric processing.

Further, the relevant definitions in the Draft Code do not give effect to the stated objective as they are not clearly linked to the activity the OPC has said it is concerned about. For example, the definition of “biometric information” in section 3 refers to specific types of personal information “in connection with any type of biometric processing”. This could be interpreted to include biometric data that is not collected for or used for the purpose of biometric processing.

Reading this with section 4, there appears to be a clear risk that an agency’s obligation to comply with the Draft Code could commence before any intention is formed to use biometric information in a biometric process, or that agencies may in practice be unable to comply with certain rules in the Draft Code because collection of the biometric information has already occurred.

New use cases

The Draft Code could effectively operate as a complete ban on agencies adopting any new biometric processing technology, or using biometric processing for new use cases, because agencies will not be able to show effectiveness in advance and there is no mechanism permitting them to test or trial such technology or use cases. This has the potential to stifle innovation, which would ultimately be to the detriment of individuals and agencies.

As currently drafted, Rule 1(1)(d) requires an agency to believe, on reasonable grounds, that the biometric processing is “not disproportionate in the particular circumstances” before it can collect biometric information for biometric processing. The circumstances that an agency must take into account in assessing whether a type of biometric processing is disproportionate are then set out in Rule 1(2). This list is drafted cumulatively (“and”), so it appears that an agency must assess and satisfy all of the circumstances before proceeding.

The first factor (Rule 1(2)(a)) is “whether or not the biometric processing is effective in achieving the agency’s lawful purpose”. We take this to mean that an agency will not be permitted to collect biometric information for biometric processing unless it can provide evidence of the effectiveness of that processing before the collection.

However, new biometric technology, and new use cases for the biometric processing of biometric information, are by definition unproven in terms of effectiveness. Rule 1(2)(a) therefore effectively operates as an outright ban on such new technology and new use cases, as agencies will never be able to show effectiveness before collection.

There should also be more clarity around the evidence that will be sufficient to show the effectiveness of a type of biometric processing for its proposed purpose. In particular:

• Are agencies entitled to rely on evidence from overseas jurisdictions where evidence is not available in New Zealand?

• Are agencies entitled to rely on evidence provided by the vendors of the technology solutions?

• What standard of evidence (e.g., anecdotal vs. research-based) is likely to satisfy the effectiveness requirement?

We also consider that the Draft Code should be amended to include mechanisms that expressly permit agencies to test or trial new biometric processing technologies or biometric processing for new use cases. As we have written about previously, biometric systems have significant potential benefits (including protecting privacy, and detecting and preventing crime). Innovative solutions helping individuals and businesses are in the public interest, and it is important that the Draft Code balances this benefit to society with the privacy rights of individuals.

Proportionality

We generally support the introduction of a proportionality assessment in the Draft Code. However, we query whether it is being applied to the correct Information Privacy Principle (IPP).

Proportionality as a test goes to fairness, not necessity, yet the Draft Code applies it to IPP1. We suggest that this is not the appropriate IPP. IPP1 relates to the collection of personal information and requires that collection occur only for a lawful purpose and where it is necessary for that purpose; there is no fairness assessment at that point. In contrast, IPP4 is directed at ensuring that personal information is not collected in an unfair or unreasonably intrusive manner. We suggest that IPP4 is the better place for a proportionality test to apply.

While the Privacy Act permits the OPC to modify the application of an IPP to prescribe a more stringent standard, applying a proportionality assessment at the point of collection would be a fundamental change to the current operation of the IPPs. It risks importing into IPP1 a test that the drafters of the Privacy Act considered more appropriate to apply as part of the manner of collection.

Clarity

The number of definitions (most of which are interrelated) adds complexity to the practical application of the Draft Code. We are concerned that the breadth of some of the definitions and references within them to other definitions (which in turn often refer back to the initial definition) could lead to unintended consequences in practice.

For example, the term “biometric information” is used in many ways in the Draft Code, and sometimes in a way that is difficult to understand and may create inconsistency. “Biometric information” in the Draft Code includes both what is typically understood to be biometric information (biometric samples) and the information that results from the technical processing of those biometric samples (biometric templates and biometric results). However, “biometric results” are also included within the definition of “biometric processing” as something that is produced from biometric information, which suggests that a biometric result is different from biometric information.

While we agree with the principle that biometric templates and biometric results are forms of sensitive personal information and should sometimes be included in the application of the Draft Code, the Draft Code must be consistent, clear, and specific in its use of such terms to allow agencies to understand and implement the code in practice. We consider that there is more work to be done to provide agencies with greater certainty as to how the code applies and to make the code easier to implement.

Next steps

It was valuable to have the opportunity to give our feedback on the Draft Code, and we watch with interest to see what changes will be implemented to address the feedback received by the OPC through this consultation round. The OPC has stated that there will be an announcement from the Privacy Commissioner “in the middle of the year” on the biometrics code.

If you have any questions about the Draft Code or the collection and use of biometric information within your organisation, please do not hesitate to contact us.
