While lawmakers in Canada[1] and elsewhere[2] are endeavouring to regulate the development and use of technologies based on artificial intelligence (AI), it is important to bear in mind that these technologies are also classified within the broader family of information technology (IT).
In 2001, Quebec adopted a legal framework aimed at regulating IT. All too often forgotten, this legislation applies directly to the use of certain AI-based technologies.
The very broad notion of “technology-based documents”
The technology-based documents referred to in this legislation include any type of information that is “delimited, structured and intelligible”.[3] The Act lists several examples of technology-based documents, including online forms, reports, photos and diagrams—even electrocardiograms! It is therefore easy to see how this notion applies to the user interface forms found on various technological platforms.[4]
Moreover, technology-based documents are not limited to personal information. They may also pertain to company or organization-related information stored on technological platforms. For instance, Quebec’s Superior Court recently cited the Act in recognizing the probative value of medical imaging practice guidelines and technical standards accessible on a website.[5] A less recent decision also recognized that the contents of electronic agendas were admissible as evidence.[6]
Because their algorithms are computationally demanding, various AI technologies are offered as software as a service (SaaS) or platform as a service (PaaS). In most cases, the information entered by user companies is transmitted to servers controlled by the supplier, where it is processed by AI algorithms. This is often the case for advanced customer relationship management (CRM) systems and electronic file analysis. It is also the case for a whole host of applications involving voice recognition, document translation and decision-making assistance for users’ employees.
In the context of AI, technology-based documents in all likelihood encompass all documents that are transmitted, hosted and processed on remote servers.
The Act sets out specific obligations when information is placed in the custody of service providers, in particular IT platform providers. Section 26 of the Act reads as follows:
Anyone who places a technology-based document in the custody of a service provider is required to inform the service provider beforehand as to the privacy protection required by the document according to the confidentiality of the information it contains, and as to the persons who are authorized to access the document.
During the period the document is in the custody of the service provider, the service provider is required to see to it that the agreed technological means are in place to ensure its security and maintain its integrity and, if applicable, protect its confidentiality and prevent accessing by unauthorized persons. Similarly, the service provider must ensure compliance with any other obligation provided for by law as regards the retention of the document. (Our emphasis)
This section of the Act therefore requires the company wishing to use a technological platform and the supplier of that platform to enter into a dialogue. On the one hand, the company using the platform must inform the supplier of the privacy protection required for the information stored on it. On the other hand, the supplier is required to put in place “technological means” to ensure security, integrity and confidentiality, in line with the privacy protection the user has requested.
The Act does not specify what technological means must be put in place. However, they must be reasonable, in line with the sensitivity of the technology-based documents involved, as seen from the perspective of someone with expertise in the field.
Would a supplier offering a technological platform with outmoded modules or known security flaws be in compliance with its obligations under the Act? This question must be addressed by considering the information transmitted by the user of the platform concerning the privacy protection required for its technology-based documents. In any event, the supplier must not conceal the security risks of its IT platform from the user, as doing so would breach the parties’ duties of disclosure and good faith.
Are any individuals involved?
These obligations must also be viewed in light of Quebec’s Charter of Human Rights and Freedoms, which also applies to private companies. Companies that process information on behalf of third parties must do so in accordance with the principles set out in the Charter whenever individuals are involved.
For example, if a CRM platform supplier offers features that can be used to classify clients or to help companies respond to requests, the information processing must be free from bias based on race, colour, sex, gender identity or expression, pregnancy, sexual orientation, civil status, age except as provided by law, religion, political convictions, language, ethnic or national origin, social condition, a handicap or the use of any means to palliate a handicap.[7] Under no circumstances should an AI algorithm suggest that a merchant should not enter into a contract with any individual on any such discriminatory basis.[8]
In addition, anyone who gathers personal information by technological means making it possible to profile certain individuals must notify them beforehand.[9]
To recap, although the emerging world of AI is a far cry from the Wild West decried by some observers, AI must be used in accordance with existing legal frameworks. No doubt additional laws specifically pertaining to AI will be enacted in the future.
If you have any questions on how these laws apply to your AI systems, please feel free to contact our professionals.
Eric Lavallée is a lawyer and trademark agent in the Business Law Group and he runs the Lavery Legal Lab on Artificial Intelligence (L3AI).
As a result of his extensive experience in intellectual property (patents, trade-marks and software protection), Mr. Lavallée has taken a special interest in developments related to artificial intelligence in recent years.
Mr. Lavallée is regularly called upon to assist businesses of all sizes, from start-ups to large corporations, in drafting licensing agreements and business contracts in high technology, as well as in implementing protection and due diligence strategies for their intellectual property needs.
He has developed leading-edge expertise in analyzing the legal impact of the application and implementation of artificial intelligence in sectors related to his practice of law, namely privacy protection, corporate governance and business law.
Expertise in nanotechnology
Eric Lavallée has a Master’s degree in Physics as well as a Doctorate in Electrical Engineering. Prior to joining Lavery in 2014, he was Vice-President, R&D, for a nanotechnology research and development firm. He has four inventions to his name relating to electron beam lithography for applications in microelectronics:
- Method of producing an etch-resistant polymer structure using electron beam lithography
- Plasma polymerized electron beam resist
- Fabrication of sub-micron silicide structures on silicon using resistless electron beam lithography
- Fabrication of sub-micron etch-resistant metal/semiconductor structures using resistless electron beam lithography
As a researcher, he also authored 15 scientific papers and presented his work at international nanotechnology conferences in the United States, Europe and Japan.
1. Bill C-27, Digital Charter Implementation Act, 2022.
2. In particular, the U.S. Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, October 30, 2023.
3. Act to establish a legal framework for information technology, CQLR c C-1.1, sec. 3.
4. Ibid, sec. 71.
5. Tessier v. Charland, 2023 QCCS 3355.
6. Lefebvre Frères ltée v. Giraldeau, 2009 QCCS 404.
7. Charter of Human Rights and Freedoms, sec. 10.
8. Ibid, sec. 12.
9. Act respecting the protection of personal information in the private sector, CQLR c P-39.1, sec. 8.1.