Earlier this year (May 2019), the federal government unveiled its so-called "Digital Charter", a set of ten principles that the current government intends to pursue, presumably if it is re-elected in the fall. It's a far-reaching document, and virtually all Canadians (as individuals), Canadian businesses and other organizations will be affected by the specific provisions, when they are released, likely within 12 to 18 months after the election. That timeline assumes the Liberals win re-election; but digital issues are garnering so much attention nowadays that one presumes, if another party wins the upcoming election, it will also make the thrust of the Digital Charter a priority.
It is of course too early to tell with any specificity what exactly would be in the new legislation when (and if) it is ultimately introduced in the House of Commons. On the other hand, the Digital Charter does give a good sense of a number of principles that the federal government will be pursuing.
Privacy Law Overhaul
One area that the government seems to be focusing on is data privacy. This is not a big surprise. Canada's federal privacy law is some 20 years old. And while two decades might not be that long a period for some legislation, in the digital realm that same period of time has been an eon. Moreover, the current privacy law has its origins in a voluntary privacy code that was morphed into law hurriedly because the government of the day needed something in place quickly. It was not – and is not – the most elegant statute on the books. Bottom line: it is showing its age.
So, what can we expect to see in the new legislation? Most likely, some new provisions related to enforcement. There is a strong consensus in Ottawa, and in many other centres in the country where internet privacy issues are debated, that the current statute gives the Office of the Privacy Commissioner too few enforcement powers. By contrast, for example, the fairly new European privacy law that came into effect last year provides for a maximum fine equal to 4% of the worldwide sales of the company that breached the statute. That provision alone has caught the attention of the global technology community, and it is certainly driving behavioural change.
Another issue that likely will be the subject of a new provision or two relates to consent. In many ways, the principle of consent is the cornerstone of our current law. The core requirement in the statute is that a company or other organization cannot collect, use or share a person's personal information without their consent. And there is indeed much to commend – still to this day – this bedrock rule. On the other hand, there is widespread recognition that in many digital environments, the scope for seeking, and receiving, meaningful consent is very low, if not zero. For example, you walk into a public building, and there is a sign telling you there are surveillance cameras capturing your image. But there is no real effective method of gathering your express consent to being recorded.
The OPC has been saying for some time now that the very concept of consent has to be re-thought, particularly in certain digital and online circumstances. Therefore, likely what will be in the statute’s overhaul are a few concepts that address this important matter.
First, there may well be direction on how data handlers have to behave when consent is not practical. And second, there likely will be provisions allowing for collection of certain information without consent, provided that certain protections are implemented and properly exercised during the collection of the information.
For example, some emphasis may be put on data de-identification. That is, there would be circumstances where certain specific elements of personal information can be collected without consent, provided (and this is a very important proviso) the information is immediately stripped of those elements that could link it to a certain individual. And ideally this process of de-identification will be performed by the machine that captures the information in the first place, so that the identifying elements are never uploaded to the collecting entity's main computers (rather, in those computers one will only have anonymized data).
Moreover, the collecting organization will also have to take steps to avoid any possible “re-identification” of the now anonymized data. And the good news is that there is an entire technology and process discipline around these safeguards that is increasingly mature, so that the effective risk of re-identification is getting smaller and smaller. The helpful irony here is that very often in the computer law space, while it is technology that kicks up the challenge, it is another technology that offers the solution (or a big part of it).
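The capture-time stripping described above can be sketched in a few lines of Python. This is purely illustrative: the field names are invented, and replacing an identifier with a salted one-way hash is strictly speaking pseudonymization, so in practice it would be only one layer of a fuller de-identification and re-identification-risk process.

```python
import hashlib

# Illustrative list of fields that could link a record to an individual.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "address"}

def deidentify(record: dict, salt: str) -> dict:
    """Strip direct identifiers from a record at the point of capture,
    before anything is uploaded to the collecting entity's servers."""
    clean = {}
    for key, value in record.items():
        if key in DIRECT_IDENTIFIERS:
            # Replace the raw value with an irreversible pseudonym so that
            # related records can still be grouped without naming anyone.
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            clean[key + "_token"] = digest[:16]
        else:
            clean[key] = value
    return clean

record = {"name": "Jane Doe", "email": "jane@example.com",
          "visit_time": "2019-08-01T10:22"}
anonymized = deidentify(record, salt="device-secret")
```

Here only the pseudonymized tokens and the non-identifying data (the visit time) would ever leave the capturing device.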
To Be Or Not To Be
Speaking of de-identification, expect the new privacy law to address the so-called “right to be forgotten”. Let’s say some time ago you were declared bankrupt – and it was so long ago that the official government record no longer shows you to be a discharged bankrupt. But guess what – if you do a search (or more to the point, if a prospective employer does a search) online, various references to your bankruptcy can be found. What the “right to be forgotten” affords you is the ability to ask the operator of the search engine to change its records such that your bankruptcy does not come up in the online search.
It's an interesting right, and not one without vocal spokespeople on both sides of the issue. Some argue it is dangerous to try to "eradicate the past", and that full transparency requires the truth at all times, however embarrassing that might be. On the other hand, some argue that the public policy rationale for "wiping clean" certain public records should simply be carried forward into the internet search realm – otherwise, the public policy objective of "redemption" will be thwarted.
How Did You Do That?
Speaking of transparency, another very topical issue that may well be addressed in the upcoming legislation involves artificial intelligence (AI). Let's say you apply for a loan with an online fintech. They actually boast that their AI-adjudication engine works so fast that loan applications can be vetted, and a decision made, in a matter of minutes. Now, let's say you are refused the credit. You reach out to the fintech and ask: "Why was I turned down for the loan?"
In pre-AI days, this sort of question had a fairly straightforward answer. The financial institution had relatively clear lending criteria, based on your income, and other standard factors, and if you met the criteria, you were given the loan, and if you did not, you were refused a loan. Simple.
Things can get more complicated when an AI credit approval engine is added to the mix. The engine compares your data to millions of other customers of the fintech or of a credit bureau service, or perhaps tens of millions of customers across the entire industry, and based on actual data patterns involving repayment and default (together with a range of other interesting data points), the AI system makes a decision. But here's the thing. The actual, specific factors relevant to your credit application may not actually be known, or at least not known in the ordinary course of the fintech's business model.
To counter such a situation, the new legislation may provide for "algorithmic transparency"; that is, when you ask why you were turned down for the loan, the fintech actually has to figure out how the AI engine came to that decision. This would be a very interesting provision, and would, in effect, make users of AI applications accountable for their computers. One concern it would address is the risk that AI software will perpetuate various biases if the data sets used to teach the AI engine are themselves susceptible to bias. Solving for this sort of concern will have some very interesting knock-on effects. For example, if you are with a financial institution and you are acquiring such an AI credit approval engine, you may want to address the issue of algorithmic transparency in the contract under which you acquire the system. In other words, you will try to shift responsibility for compliance with the law to the third party that built the system (and the supplier may well push back, arguing that the output results of the system are best understood by the entity that is feeding it the big data sets – namely you, the user). In short, get ready for some very interesting contract negotiations in this regard, at least until an industry consensus forms as to how this risk ought to be shared between supplier and user.
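For a simple scoring model, the kind of explanation algorithmic transparency contemplates can be sketched directly: report each factor's contribution to the decision. The feature names, weights and threshold below are invented for illustration; a real lender's deep-learning engine would need dedicated attribution techniques rather than this linear decomposition.

```python
# Hypothetical linear credit-scoring model: weights are invented, not
# drawn from any real lender's criteria.
WEIGHTS = {"income": 0.5, "debt_ratio": -0.8, "years_employed": 0.3}
BIAS = -0.2
THRESHOLD = 0.0

def explain_decision(applicant: dict) -> dict:
    """Score an applicant and report how each factor pushed the result."""
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    score = BIAS + sum(contributions.values())
    return {
        "approved": score >= THRESHOLD,
        # Factors sorted by how strongly they influenced the decision,
        # positively or negatively.
        "factors": sorted(contributions.items(),
                          key=lambda kv: abs(kv[1]), reverse=True),
    }

result = explain_decision(
    {"income": 0.4, "debt_ratio": 0.9, "years_employed": 1.0})
```

In this sketch, the applicant's high debt ratio is the dominant negative factor, which is exactly the sort of answer a statutory transparency obligation would require the lender to be able to give.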
It appears that governments around the world are buying into the concept that data is the new driver of the economy. Whatever the merits of this argument (while data is important, of course, in many circumstances its strategic importance can be overblown), the result is that regulators who oversee competition policy are taking an ever greater interest in data, and the so-called platform companies that aggregate, commercialize and distribute it. Thus, as part of the Digital Charter exercise, the federal government has written to the Commissioner of Competition and asked for input on concerns raised by the new market dynamics surrounding data. In particular, the government is concerned that consumers not be harmed by market abuses triggered by technology, and in particular data-related business models.
Also on the topic of competition, there will likely be an interesting debate about data portability – namely, should a consumer be able to require a company that holds transaction data pertaining to the consumer to share that data with third parties, such as another company the consumer wants to do business with? This is a highly multi-faceted issue, and legislation in this space should tread very carefully.
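Mechanically, portability amounts to handing over a consumer's records in a machine-readable format a third party can ingest. The sketch below invents a minimal JSON schema for illustration; any real portability regime would prescribe (or industry would standardize) the actual format.

```python
import json

def export_portable(consumer_id: str, transactions: list) -> str:
    """Package a consumer's transaction history for transfer to a
    third party, using an invented illustrative schema."""
    package = {
        "schema_version": "1.0",
        "consumer_id": consumer_id,
        "transactions": [
            # Keep only the fields a receiving company would need.
            {"date": t["date"], "amount": t["amount"],
             "merchant": t["merchant"]}
            for t in transactions
        ],
    }
    return json.dumps(package, indent=2)

payload = export_portable(
    "c-123",
    [{"date": "2019-05-01", "amount": 42.50, "merchant": "Acme"}])
```

Even this toy example hints at why the issue is multi-faceted: someone must decide which fields travel, who authenticates the request, and who bears liability if the data is wrong.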
These are just some of the highlights of what you can expect, likely next year, in the area of tech-driven law reform flowing from the federal government’s Digital Charter. May you live in interesting times!
George Takach is a senior partner in the Technology practice at McCarthy Tetrault LLP