Privacy Reform – Update

Initiatives to reform privacy laws in Canada have progressed significantly.  However, they remain, at the national level at least, an unfulfilled promise.  The most significant (and potentially impactful) reform now in place is found within Quebec’s revised private sector law, Law 25, which, as of this past September, is fully in force.  On the other hand, the major overhaul of the current federal private sector law, PIPEDA,[1] is stalled in Parliamentary Committee and is not likely to be in force, even if passed, before the next election.

Bill C-27 – amended status

The federal government first introduced its proposed successor to the current law – the Consumer Privacy Protection Act (CPPA) – in November 2020.  Following an election, and responses critical of the proposed legislation in a number of key respects, the then Bill C-11 failed to progress.  The CPPA was reintroduced within a new Bill C-27 in June 2022, but only received second reading in June 2023, with detailed review at the Parliamentary INDU Committee commencing in the fall of last year.[2]  Committee hearings have progressed slowly, with expert witnesses and then clause-by-clause review (and amendments) having reached only the preliminary portions of the Bill as of this past May.

With the minority Parliament now risking an election at any time, it is not clear whether the Bill C-27 instance of federal privacy reform will come to fruition.  However, it is likely that the major outlines of the proposed CPPA, including the amendments that have been adopted by the Committee to date – for the most part with all-party support – will find their way into an amended federal law.  While such legislative action may be some time in the future,[3] it is instructive for stakeholders to take note – and to align their privacy compliance status with the themes likely to be reflected in the ultimate reformed federal law.  More importantly, most of the key precepts found in the CPPA are contained in Law 25, which, of course, is fully in force, and which will dictate privacy rules for organizations operating in Quebec, as well as elsewhere if they collect any personal information of Quebec residents.  In many respects, Law 25 is now the “bright line” standard for Canadian privacy compliance.

Some key amendments have been made to the CPPA since second reading, within the INDU Committee.  All or most of these, it may be surmised, will be reflected in the final version if and when it reaches third reading in this Parliament or, equally importantly, in a new bill introduced following the next election.

Early in the Committee hearings, Minister Champagne proposed certain amendments to the Bill, including addressing privacy as a fundamental right and recognizing the importance of children’s privacy, as well as giving the Commissioner greater flexibility to reach compliance agreements.[4]  The Committee’s clause-by-clause review has reached only the first two of these items, but significant further changes have also been adopted.

Privacy as a fundamental right

The most impactful amendment made to the Bill in Committee to date is the recognition of privacy as a fundamental right – sometimes stated as a “fundamental human right”.  The Bill as originally tabled did not include this precept.  The Minister’s proposed amendment would have been to include a reference to this effect within the preamble to the Bill.  However, expert and other stakeholder witnesses advocated a clearer statement directly addressing the CPPA, with the result that all of the recitals to the Bill (minus a recital specific to the proposed Artificial Intelligence and Data Act) are now included as the preamble to the CPPA, as opposed to the Bill.  The relevant recital reads as follows:

Whereas the protection of the fundamental right to privacy of individuals with respect to their personal information is essential to individual autonomy and dignity and to the full enjoyment of fundamental rights and freedoms in Canada; (italics added indicating addition)


Witnesses argued that a clear statement of the fundamental right also should be included within the substantive provisions of the CPPA – specifically the Purpose clause (section 5) – which the Committee has not yet addressed.  In their view, specific inclusion is needed in the body of the CPPA to give unambiguous legal effect to Parliament’s intention that privacy be recognized as a fundamental right.

The primary significance of adopting the precept of privacy as a fundamental right is that, where any ambiguity exists regarding the respective rights of individuals and organizations, the interpretation should favour the former.  Advocates argue that, as a fundamental right, the privacy of individuals should prevail over commercial interests and should not be “balanced” against them.

Best interests of the child

Consistent with the recognition of the fundamental right to privacy, arguably the second most significant amendment to Bill C-27 made to date by the INDU Committee is a recital in the preamble to the CPPA recognizing that the processing of children’s information should respect their privacy and best interests.

As originally tabled, the CPPA included limited provisions addressing special protections for children and other minors – primarily a stipulation that minors’ information should be considered sensitive.  Clearly, such a stipulation has implications for consent (which must be express), the elements of privacy management programs, and RROSH determinations for breaches,[5] among other provisions of the CPPA.  However, stipulating that all processing of children’s data must recognize their best interests has much broader implications – it amounts to a guiding precept that all collection, use or disclosure of children’s information must be in the best interest of the child, not the best interest of the data collector.

The best interests of the child is the guiding principle of what is recognized globally as the “gold standard” for age-appropriate rules in the design of online communications with minors – the UK’s Children’s Code.[6]  The UK Code has formed the basis of evolving laws and guidance addressing communicating with children online in both Canada and the US.[7]

Witnesses appearing before the Committee argued that the CPPA should go beyond the simple declaration of best interests and include specific protections for children and youth, such as defining rules for age-appropriate consent, establishing privacy-respectful processes and mechanisms for age verification, and providing for a comprehensive code of practice for organizations collecting, using or disclosing children’s personal information along the lines of the UK Code.

Definition of a minor  

Related to its consideration of children’s privacy, the Committee adopted an amendment defining the term “minor” – to mean an individual under the age of 18. This age definition will impact the provisions of the CPPA that stipulate special protections for children and other minors, such as the requirement to recognize their best interests in any processing of their personal information, as noted above, and the stipulation that the personal information of minors be considered sensitive information.

Such an age stipulation may conflict with the understanding of the term “minor” under provincial and territorial age of majority legislation, which in some instances provides for majority at age 19.[8]  Additionally, by in effect prescribing special consideration for all persons under age 18, the CPPA rule may not align with other regulatory and non-regulatory rules respecting the age at which minors can consent to the collection of their personal information independently of their parents’ approval.[9]  For example, Quebec’s Law 25 stipulates that collection of personal information from persons under 14 years of age requires their parents’ consent, but for ages 14 and older, consent may be given by either the individual or their parent.

Definition of anonymized information

A further, not insignificant, amendment to the Bill made at Committee was an adjustment to the wording of the defined term “anonymize”, to provide that de-identification processes must meet the criterion of “no reasonably foreseeable risk in the circumstances” of re-identification, in place of the ostensibly less precise criterion of “in accordance with generally accepted best practices” that appeared in the Bill as originally tabled.

The government, as part of the Committee hearings, had proposed an alternative clarification that would have aligned the CPPA provision with the parallel Quebec Law 25 provision.  The Law 25 provision in effect stipulates what the generally accepted best practices are – specifically, those set out in regulations to the law.  It is not clear what the rationale was for the Committee’s amendment, other than the federal Privacy Commissioner’s view[10] that the statute should provide a more rigorous criterion against re-identification.  The Commissioner had argued for the omission of qualifying language, such as “generally accepted best practices”, from the criterion of non-re-identifiability.

The resulting provision no longer aligns with the Law 25 provision, which provides clear guidance as to the procedures required for anonymization.[11]  The amended provision adopts the “no reasonable risk of re-identification in the circumstances” criterion which, it should be noted, is included as a qualifier in the Law 25 rule as well as in other privacy laws.[12]  While not providing the clear guidance of a regulation, the “no reasonable risk” criterion has been interpreted in cases as establishing a minimal-risk threshold, understood to mean that, under the defined circumstances, there is no reasonable risk of an individual being identified.[13]

Definition of sensitive information

An amendment by the Committee adds a definition of sensitive information, setting out a general principle – any information for which the individual generally has a high expectation of privacy – plus a non-exhaustive list of specific categories.  The categories of information considered “sensitive” include ethnicity, sexual orientation, biometrics, health, political opinion and financial matters.

It is understood that, pursuant to s. 15(5) of the CPPA, processing of personal information will require express consent unless, in the circumstances, it is appropriate to rely on implied consent – those circumstances including whether the information is considered sensitive.  As a general rule, processing of sensitive information requires express consent.

The approach adopted by the Committee – a general principle together with an open-ended list of examples – had been recommended by the OPC[14] and is consistent with the approaches under Law 25 and the GDPR.[15]  However, the CPPA’s itemized categories are significantly more detailed and extensive than the definitions under those laws.

Inferred information

An amendment at Committee extended the definition of “personal information” to expressly include inferred information about an individual, as recommended by the OPC.[16]  In the OPC’s view, there is some debate as to how inferences are regarded under privacy law, notwithstanding that both the OPC and the courts have treated them as personal information.  Some view inferences as an output derived from personal information, such as a decision or an opinion, and argue that these are outside the purview of privacy legislation.

Profiling

Again responding to a recommendation by the OPC, the Committee adopted a definition of “profiling”,[17] largely tracking the definition of the same term in Law 25.  The Committee’s objective is to adopt further amendments to the transparency provisions in the CPPA regarding automated decision systems that would require organizations to explain, in plain language, how their automated decision‑making systems profile individuals, and to allow individuals to appeal automated decisions made about them when they have been profiled.[18]  It is the view of the OPC that the Bill’s current language – specifically referencing decisions, recommendations and predictions – is not sufficient to include profiling, and that express reference, together with a definition of the term, is required.[19]

It may be noted that while the proposed defined term tracks the definition contained in Law 25, the purpose of that definition is different – profiling is mentioned expressly in Law 25 only with respect to the new “opt-in” consent requirements for tracking by electronic means.[20]  It is not mentioned expressly in the Law 25 provision addressing transparency for automated decision‑making systems, although the tracking provision arguably may be encompassed within the scope of automated decision‑making systems.

Summary

The most significant (and potentially impactful) reform now in place in Canada is found within Quebec’s revised private sector privacy law, Law 25 – arguably the new “bright line” for privacy.  While likely several years away from coming into force elsewhere in Canada, the currently stalled initiative to amend the federal private sector law – the proposed Consumer Privacy Protection Act, part of Bill C-27 – contains many of the precepts found in the Quebec law and should be used as the relevant point of reference for organizations’ privacy compliance programs going forward.

For more information please contact:      David Young       416-968-6286     david@davidyounglaw.ca

Note:     The foregoing does not constitute legal advice. © David Young

Read the PDF: Privacy Reform – Update



[1] Personal Information Protection and Electronic Documents Act, S.C. 2000 c. 5.

[2] Standing Committee on Industry and Technology

[3] It will be recalled that the only other substantive amendments to PIPEDA – the Digital Privacy Act, which included the current breach reporting regime and an enhancement of the consent requirements – took ten years to enact from the time initial proposals were circulated for consultation, and another three years to come into force.

[4] Letter to the Committee, October 20, 2023.

[5] Real risk of significant harm; CPPA, s. 42 – Breach of security safeguards.

[6] ICO

[7] California; BC OIPC report

[8] The age of majority is 18 in six provinces: Alberta, Manitoba, Ontario, Prince Edward Island, Quebec, and Saskatchewan. The age of majority is 19 in four provinces and the three territories: British Columbia, New Brunswick, Newfoundland, Northwest Territories, Nova Scotia, Nunavut, and Yukon.

[9] Canadian Marketing Association Code of Ethics and Standards, Section K

[10] Submission of the Office of the Privacy Commissioner of Canada on Bill C-27, the Digital Charter Implementation Act, 2022, April 2023.

[11] Regulation for the anonymization of personal information.

[12] PHIPA

[13] See, for example, PHIPA Decision 175, March 25, 2022.

[14] Submission of the Office of the Privacy Commissioner of Canada on Bill C-11, the Digital Charter Implementation Act, 2020, May 2021.

[15] See: Law 25, s. 12.

[16] OPC Submission, May 2021.

[17] Profiling means any form of automated processing of personal information consisting of the use of personal information to evaluate certain personal aspects relating to an individual, in particular to analyze or predict aspects concerning that individual’s performance at work, economic situation, health, reliability, behaviour, location or movements.

[18] CPPA, ss. 62(2)(c) and 63(3).

[19] OPC Submission May 2023:

There is also a key element missing from the CPPA in relation to [automated decision‑making systems]. Unlike the EU’s General Data Protection Regulation (GDPR), and other modern privacy laws in California and Québec, the obligations do not explicitly apply to profiling. As drafted, the obligations would only apply to ADM systems that make decisions, recommendations, or predictions.

[20] Law 25, s. 8.1.