On August 16, 2023, Innovation, Science and Economic Development Canada (ISED) launched a public consultation seeking stakeholder feedback on a new code of practice (the Code) for generative artificial intelligence (AI) systems. The Code is meant to offer interim guidance for companies currently developing and using AI systems until the Artificial Intelligence and Data Act (AIDA) comes into force.

The consultation document acknowledges the recent surge in popularity of generative AI systems, which are trained on large datasets to generate content in a variety of forms and contexts, and notes that this rise is accompanied by an urgent need for regulation to mitigate the risks associated with malicious or inappropriate use of AI systems.  Although the Government of Canada has taken significant steps towards regulation by tabling the AIDA as part of Bill C-27 (as previously reported by the E-TIPS® Newsletter here), the bill has not yet received royal assent.  Therefore, the Code is intended to provide interim guardrails that Canadian firms can implement on a voluntary basis before AIDA comes into force.

According to the consultation document, comment is being sought on the inclusion of the following elements of practice in the Code:

  1. Safety. The Code provides that developers and deployers would identify the ways an AI system may attract malicious use, and that developers, deployers, and operators would identify the ways the system may attract harmful inappropriate use.  Once these safety risks are identified, the respective parties would take steps to prevent such misuse.
  2. Fairness and Equity. A key consideration for training AI systems is to ensure that the datasets used in training will allow for accurate and unbiased output.  Therefore, the Code would direct developers to assess and curate datasets to avoid low-quality or biased data, and direct developers, deployers, and operators to implement measures that mitigate biased output.
  3. Transparency. Given the transparency challenges associated with explaining generative AI and the fact that some training data may not be publicly available, it is important to ensure individuals know when they are interacting with AI systems and AI-generated content. This includes having developers and deployers provide a free method for others to detect AI-generated content, and having operators clearly identify a system as an AI system so that it is not mistaken for a human.
  4. Human Oversight and Monitoring. Deployers and operators would ensure that there is adequate human oversight in the deployment and operation of AI systems. Furthermore, the Code directs developers, deployers, and operators to implement reporting processes for adverse impacts associated with released AI systems and to commit to continual fine-tuning of AI models based on those findings.
  5. Validity and Robustness. ISED recognizes the importance of ensuring that AI systems work as intended across the various situations in which they are employed.  This includes developers using a variety of testing methods to measure performance and identify vulnerabilities in the system, and developers, deployers, and operators implementing cybersecurity measures for their AI systems.
  6. Accountability. Organizations developing, deploying, or operating AI systems should have appropriate internal governance mechanisms and a multifaceted risk management process.  As part of this element, developers, deployers, and operators would ensure that multiple lines of defence are implemented for an AI system and that staff have policies and training that give them clarity on their roles and responsibilities.

ISED welcomes stakeholder comments on the Code and is hosting roundtable sessions to seek feedback.  For further information, please see here.

Summary By: Imtiaz Karamat

E-TIPS® ISSUE

23 09 06

Disclaimer: This Newsletter is intended to provide readers with general information on legal developments in the areas of e-commerce, information technology and intellectual property. It is not intended to be a complete statement of the law, nor is it intended to provide legal advice. No person should act or rely upon the information contained in this newsletter without seeking legal advice.

E-TIPS is a registered trade-mark of Deeth Williams Wall LLP.