Protecting Health Information. Ensuring Data Integrity.

Could Privacy Law Reforms Accelerate Medical AI Innovations?

While Canada has invested in building significant expertise and capacity in AI and Machine Learning (AIML), there is a recognized deficit in converting that expertise into successful companies that can commercialize this work. In the context of health innovation, AIML often means using health data to build models, and this requires access to health data – lots of it. Typically this takes the form of de-identified health data disclosed by healthcare providers, or by commercial entities that process health information (such as pharmacies and EHR vendors), to commercialization partners ranging from larger life sciences companies to health technology spinoffs and startups. The Commercialization Working Group from the Government of Canada noted that "High quality training data is foundational to the development of AI and machine learning-based products and services" and in its recommendations highlighted that "Democratizing access to the data needed to develop AI systems, while protecting Canadians' privacy, would foster innovation and help firms to develop products/services that are comprehensive and responsive enough to market."

In my role as a Canada Research Chair in Medical AI, I think it is important to highlight that the direction some privacy laws are taking could make this access to data quite challenging. Examining this issue now is timely because of ongoing privacy law reform across the country, with the former Bill C-11 as a concrete example.

Bill C-11 – An Act to enact the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act and to make related and consequential amendments to other Acts (the “CPPA”) – died on the order paper when Parliament was dissolved on August 15, 2021. It nevertheless provides a useful point of reference for the federal government’s thinking (at the time) on how to amend PIPEDA to advance Canada’s innovation agenda while maintaining the country’s adequacy standing under European data protection law, now the GDPR.

The CPPA would authorize an organization to disclose de-identified data to the four categories of entities identified in paragraph 39(1)(b) if the disclosure is made for a socially beneficial purpose as set out in paragraph 39(1)(c). The four categories of recipients are:

  (i) a government institution or part of a government institution in Canada;
  (ii) a health care institution, post-secondary educational institution or public library in Canada;
  (iii) any organization that is mandated, under a federal or provincial law or by contract with a government institution or part of a government institution in Canada, to carry out a socially beneficial purpose; or
  (iv) any other prescribed entity.

“Socially beneficial purpose” is defined in subsection 39(2) as meaning:

A purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.

The CPPA does not define the recipients set out in categories (i)-(iii). Nor does it set out any common criteria that would limit which entities could be prescribed pursuant to subparagraph (iv).

The definition of socially beneficial purposes does allow the disclosure of datasets for purposes related to health, which we can assume encompasses the development of AIML models by startups and other commercial actors. However, while startups and other commercial actors often make significant socially beneficial contributions in terms of new treatments and devices, they do not clearly fall within the first three categories of recipients described above, and it is difficult to see how startups, for example, could be read into those categories. It is plausible that startups and other commercial actors could be prescribed by regulation pursuant to subparagraph 39(1)(b)(iv); however, how such prescribed entities would come about is unclear.

Should such narrow provisions make it into the next iteration of the federal private-sector privacy law, they would add one more barrier to converting medical innovations into successful products and services in Canada. Building AIML models, growing companies, and competing globally are already challenging, and additional friction in getting access to data would have a non-trivial impact on the competitiveness of Canadian companies wishing to use Canadian data.

This is one of the barriers I highlighted in my recent op-ed in the Hill Times which focuses on the benefits of commercialization of medical AI research in Canada.

If Canada wishes to derive the many benefits of medical AI innovations, we will need updated privacy laws and regulations that are clear, that do not establish questionable barriers, and that recognize and effectively balance economic as well as other societal interests.
