New Privacy Bill: CPPA 2.0 – plus oversight of artificial intelligence
On June 16, the federal government introduced Bill C-27, its revised proposal for new private sector privacy legislation – the Digital Charter Implementation Act, 2022 – comprising an updated Consumer Privacy Protection Act (CPPA), an Act to establish the Personal Information and Data Protection Tribunal (Tribunal Act), and a new proposed law, the Artificial Intelligence and Data Act (AIDA).
The Bill constitutes a revised version of former Bill C-11, the Digital Charter Implementation Act, 2020, introduced in November 2020 as the government’s initial foray into the realm of “second generation” privacy laws, of which the EU’s General Data Protection Regulation (GDPR) is the most prominent example. Former Bill C-11 died on the order paper with last fall’s election but, on account of extensive stakeholder input received over the 19-month hiatus since its tabling, was due for amendment in any event before being adopted.
Bill C-27 incorporates a number of significant adjustments relative to Bill C-11, responding to stakeholder input including from the former Privacy Commissioner, Daniel Therrien.[1] However, it does not include other substantive changes proposed by the former Commissioner and others, including civil society groups – most notably, recognition of privacy as a fundamental human right. Nor does it extend express application to political parties, as the former Commissioner and others also recommended.
Most significantly, the Bill includes: a “legitimate interest” exception to consent, along the lines of the GDPR; protections for children and youth; clear exclusion of “anonymized” information from application of the law; confirmation of the existing, established regime for application of privacy law to charities and other non-profit organizations; greater flexibility for the Commissioner in conducting investigations; and broadening of the expertise base and authority of the proposed Tribunal. Also significantly, the Bill would enact the AIDA – representing the first North American regulatory oversight of artificial intelligence (AI) systems.
What is in Bill C-27
Exceptions to consent – new “legitimate interests” rule
The former Bill C-11 was severely criticized in part for proposing exceptions to an individual’s consent to collection or use of their information that would have, in the words of Commissioner Therrien, represented “a step back overall for privacy protection”. These included an exception for business activities where obtaining consent would be impracticable because the organization did not have a direct relationship with the individual.
The goals of this exception were to recognize the increasing impracticality of obtaining meaningful consent in the digital world and to enhance the scope for innovation. However, as the former Commissioner stated, the exception should not be based simply on the impracticality of obtaining consent – a criterion that could leave an individual with less rather than more control over their personal information. Instead, a more meaningful criterion along the lines of the GDPR’s “legitimate interests” rule would be appropriate.
The government has responded to this critique by adopting the former Commissioner’s recommendation to replace the indirect relationship exception with a legitimate interests exception. The exception is couched in terms of ensuring that the legitimate interests of an organization proposing to rely on it outweigh any adverse effect on the individual’s privacy, with the caveats that the individual would reasonably have expected such collection or use and that the information is not used to influence their behaviour.
Protections for children and youth
Bill C-27 includes, for the first time in federal legislation, express rules providing special protections for the collection, use and disclosure of the personal information of children and youth – also a major subject of stakeholder responses to Bill C-11. Notably, the operative term, “minors”, is not defined – leaving the criteria to be determined by relevant provincial legislation. Bill C-27 provides for parental intervention in exercising a child’s rights under the law; clearly qualifies the personal information of a child as “sensitive”, thus requiring express consent and a higher standard of security safeguards to protect it; and stipulates more straightforward rules for deletion upon request, as well as limitations on retention.
Clarifying treatment of de-identified and anonymized information
A further significant revision to the former Bill C-11 version of the CPPA is its more comprehensive framework for application (or non-application) of the law to de-identified and anonymized information.
Bill C-27 aligns its treatment of such information with the GDPR and Quebec’s Bill 64. It defines de-identified information as information that has been modified so that an individual cannot be directly identified from it, recognizing that a risk of re-identification remains. It defines anonymized information as information that has been irreversibly and permanently modified, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means.
De-identified information is considered personal information subject to all the CPPA’s provisions, with the exception of specific provisions that permit its use without consent, such as for an organization’s internal research, and provisions regarding access and deletion. Conversely, anonymized information is entirely outside application of the law.
Application to charities and non-profit organizations
Bill C-27 reverts its treatment of charities and other not-for-profit organizations to the established framework under the current Personal Information Protection and Electronic Documents Act (PIPEDA), as opposed to the potentially narrower scope that would have been applicable under the Bill C-11 provisions. PIPEDA applies, and the CPPA under Bill C-27 will apply, to the commercial activities of such organizations, including sales of merchandise or event tickets, sales or leasing of donor lists, and any business operations carried on by them.
However, Bill C-11 stipulated a more restricted definition that in effect excluded any activities that could be characterized as carried out in furtherance of an organization’s overall charitable or non-profit goals – an exclusion that would have encompassed many activities of a strictly “commercial” nature, such as those noted above.
While clearly some if not most charitable/non-profit activities of organizations do not qualify as traditionally “commercial” and therefore are outside the general scope of PIPEDA, its application to the commercial activities of such organizations means in effect that the organizations’ collection and use of personal information in all of their operations is governed by PIPEDA’s rules.
Narrowing the application of the law as likely would have resulted from the former Bill C-11 definition could have led to such organizations concluding that they did not need to comply with the law. This possibility was noted by the OPC and in many stakeholder comments regarding Bill C-11.[2]
Greater flexibility for the Commissioner in providing guidance and conducting investigations
The OPC had argued that it should have the discretion to manage its caseload in order to respond to requests from organizations and complaints from consumers more effectively, and that it should have greater flexibility in conducting investigations.
Bill C-27 responds by providing that the Commissioner may prioritize requests by organizations that he considers to be in greatest need of guidance and is not required to act on a request that the Commissioner considers unreasonable. The Bill further provides the Commissioner with greater discretion in deciding whether to undertake an investigation or to discontinue one.
Oversight of high-risk artificial intelligence systems
A potentially significant provision of Bill C-27 is the proposed enactment of the AIDA, intended to establish an oversight regime for “high impact” artificial intelligence systems with a view to preventing harmful effects such as bias, physical and psychological health consequences, and economic loss resulting from their use.
The AIDA is intended to apply to AI systems “related to human activities” that qualify as “high impact” – which will be defined by criteria set out in regulations. Organizations operating such systems will be required to conduct risk assessments and undertake measures to mitigate risks identified by such assessments. The focus of the legislation will be in effect to regulate AI systems that have potentially significant negative impacts.
The objectives of the AIDA are in line with similar initiatives internationally, most prominently in the EU, which has proposed the Artificial Intelligence Act (AI Act). The AI Act would regulate “high risk” systems and prohibit stipulated AI systems having unacceptable consequences, but leave unregulated AI systems having either “limited risk” or “minimal risk”. Similar initiatives are being taken in the US and elsewhere, although none of these have progressed to the stage of proposed legislation.[3]
In the words of Innovation, Science and Economic Development Canada (ISED), the goal of the AIDA regime is to ensure that high-impact AI systems are developed and deployed in a way that identifies, assesses, and mitigates the risks of harm and bias.
The scope of application is intended to be narrower than the CPPA’s application to automated decision systems (ADS), focussing on autonomous or semi-autonomous systems that qualify as high impact. Under the AIDA, an AI system is defined as:
a technological system that, autonomously or partly autonomously, processes data related to human activities through the use of a genetic algorithm, a neural network, machine learning or another technique in order to generate content or make decisions, recommendations or predictions.
By contrast, the CPPA’s rules apply to transparency for all ADS systems, which are defined as:
technology that assists or replaces the judgment of human decision-makers through the use of a rules-based system, regression analysis, predictive analytics, machine learning, deep learning, a neural network or other technique.
In its briefings on June 16 and 17, ISED indicated that the CPPA definition is intentionally broad since it is focussing on protecting individual rights in a wide range of automated systems whereas the AIDA regime focusses on assessing and mitigating risks specifically in autonomous or quasi-autonomous processing systems.
The AIDA will establish an Artificial Intelligence and Data Commissioner to support the Minister of Innovation, Science and Industry in fulfilling ministerial responsibilities under the Act, including by monitoring compliance, ordering third-party audits, and sharing information with other regulators as appropriate.
Again, the “high impact systems” intended to be regulated will be defined by regulation. However, some advance guidance can be gleaned from the proposed scope of the EU’s AI Act, although in its stakeholder briefings last week ISED suggested that the AIDA criteria may stipulate a different, higher threshold.
Under the EU’s proposed AI Act, the classification of an AI system as high-risk is based on the intended purpose of the system, in line with existing product safety legislation. The classification as high-risk does not only depend on the function performed by the AI system, but also on the specific purpose and modalities for which that system is used.
In addition, the AI Act sets out certain categories of systems which are considered pre-emptively high risk, including those performing functions for: biometric identification and categorisation; safety in the operation of critical infrastructure; access by and assessment of students for education and vocational training programs; recruitment, assessment, promotion and termination of job candidates and employees; access to and enjoyment of essential private services and public services and benefits, including creditworthiness assessments; and law enforcement.
What is not in Bill C-27
Bill C-27 does not include a number of substantive changes to its predecessor Bill C-11 proposed by the former Commissioner and others, including the following.
Privacy as a human right
The most significant item that is not included in the Bill is the recognition of privacy as a fundamental human right. Inclusion of this precept was urged by the former Commissioner and has been stressed as important by the incoming Commissioner, Philippe Dufresne.[4]
The new preamble to Bill C-27 refers to privacy as a right that is essential to individuals’ full enjoyment of their “fundamental rights and freedoms”, and recognizes that artificial intelligence systems and other emerging technologies should uphold Canadian norms and values in line with the principles of international human rights law. However, it does not include a clear statement that privacy is a fundamental right, as is contained in Quebec law and as has been proposed for inclusion in a new Ontario private sector privacy law.[5]
Extending application of the CPPA expressly to political parties
The former Commissioner, civil society groups and privacy experts have all urged the government to extend express application of the CPPA to federal political parties and their related entities. The three main parties (Liberals, Conservatives, NDP) are currently challenging in court a ruling by the BC Information and Privacy Commissioner that the BC PIPA, which is analogous in scope to PIPEDA, applies to them.
There have been many reasons advanced as to why, in the context of the modern digital economy, the parties should be subject to sophisticated privacy rules under a law such as PIPEDA, or the proposed CPPA, as opposed to the more rudimentary privacy rules under the Canada Elections Act. It remains to be seen whether this exclusion from application will receive further scrutiny when the Bill is discussed in parliamentary committee.
Cross-border data flows
The OPC in its Submission regarding Bill C-11 argued that the significance and complexity of cross-border data flows, and the growing involvement of small and medium-sized enterprises in such flows, are such that they should be addressed in a specific, dedicated section of the CPPA.
Specifically, the OPC recommended that the ‘transparency’ requirements in the CPPA be amended to require organizations to provide sufficient details about any international transfer or disclosure of personal information to enable individuals to understand the implications of such transfers and to assess whether an offshore service provider will maintain substantially the same protection as afforded by the CPPA. Such a requirement – in effect, a risk assessment carried out by the organization – would be consistent with the new provisions of the Quebec Private Sector Act as amended by Bill 64.
Right to contest automated decisions
The OPC had urged that individuals be given a right to contest automated decisions made about them, along the lines provided under the GDPR and in Quebec’s amended law, and as is proposed to be included in a new Ontario private sector privacy law.
Extending the right to be forgotten and mobility rights to indirectly acquired data
Under Bill C-27, all personal information under an organization’s control is subject to the right to be forgotten, but mobility rights will be limited to information collected directly from an individual. The right to be forgotten will not apply to information that has been indexed in search engines.
The OPC had recommended inclusion of a clear right with respect to the de-indexing and/or removal of personal information from search results and other online sources and that the mobility right be expanded to include all personal information about an individual, including derived or inferred information.
Personal Information and Data Protection Tribunal
The OPC had strongly recommended elimination of the Tribunal, on the ground that it would insert additional complications and procedural hurdles into the investigation and enforcement process under the CPPA – but, if it is to remain, that a majority of its members be sitting or retired judges and that the balance of its members have privacy expertise.
The government believes that the Tribunal will provide an efficient avenue for small and medium-sized businesses to seek remedies and therefore has retained it, but has amended the membership requirements so that three members (as opposed to one, as originally proposed) must have privacy expertise, and has provided that the Tribunal have authority to make orders equivalent to those of a court.
Summary and Conclusions
Minister Champagne has indicated that he wants a “fast track” for adoption of Bill C-27. However, the Bill’s progress will depend on other government priorities and the extent of further public (and political) responses to the Bill as it progresses through Parliament. It is unlikely that consideration at committee will begin before September.
The Bill includes many revisions to its predecessor, former Bill C-11, responding to stakeholder input. On the other hand, some significant proposed revisions have not been included. It remains to be seen whether in the current minority government context Bill C-27 will be amended to reflect any of the items not included.
For more information please contact: David Young 416-968-6286 david@davidyounglaw.ca
Note: The foregoing does not constitute legal advice. © David Young Law 2022
[1] Submission of the Office of the Privacy Commissioner of Canada on Bill C-11, the Digital Charter Implementation Act, 2020, to the House of Commons Standing Committee on Access to Information, Privacy and Ethics, May 2021.
[2] See, for example, the November 26, 2020 commentary on Bill C-11 by Esther Shainblum and Luis Chacin of the charity and not-for-profit law firm, Carters. Arguing that Bill C-11’s proposed revised definition of commercial activity would be the main reason for the CPPA’s potential lack of application to non-profits, the authors indicate that Bill C-11 would have had the effect of repealing Schedule 1 [of PIPEDA] and, if they are not caught by the legislation, leaving charities and non-profits with no guidance around appropriate privacy practices.
[3] See, for example, “Americans Need a Bill of Rights for an AI-Powered World”, Blog, the White House Office of Science & Technology Policy, October 22, 2021.
[4] “Watchdog nominee Philippe Dufresne Stresses Privacy as a ‘Fundamental Human Right’”, National Post, June 13, 2021.
[5] See: White Paper, Modernizing Privacy in Ontario – Empowering Ontarians and Enabling the Digital Economy, Ministry of Government and Consumer Services, June 17, 2021.