Trains, Voice and Video Recorders, and PIPEDA

In a late move, the Office of the Privacy Commissioner of Canada (OPC) has raised concerns about the privacy exceptions in Bill C-49 regarding the use of locomotive voice and video recorders (LVVRs). According to Commissioner Therrien, the exceptions would diminish the protections that railway engineers enjoy under the Personal Information Protection and Electronic Documents Act. The Bill has already passed third reading in the House of Commons. When asked by the Senate Committee studying the Bill whether the OPC had raised these concerns while the Bill was before the House of Commons, Commissioner Therrien frankly admitted that the OPC had missed the significance of the amendments until it saw the debates in Parliament.

The LVVR Initiative

In 2015, the Transportation Safety Board of Canada (TSB) conducted a study on the potential use of LVVRs. The study came in the wake of several high-profile railway accidents in Canada. The TSB ultimately concluded that LVVR technology would enhance rail safety if implemented.


The Government of Canada included the mandatory use of LVVRs in Bill C-49, which promises to modernize aspects of Canada’s legislation governing rail, air and marine transportation. Unions have raised concerns about the privacy implications of LVVR technology. Apart from the general objection to the constant surveillance that employees would be under in the locomotive, unions have objected to employers having access to LVVR recordings. They fear the data could be used against employees if railway companies could routinely review it, and argue that the data should be available only to the TSB during an incident investigation.

The OPC’s Concerns

For privacy advocates, there is another aspect of Bill C-49 that is of interest, and it was the subject of concerns raised by the Privacy Commissioner of Canada, Daniel Therrien, when he appeared before the Senate Committee on Transport and Communications to discuss the Bill. Given the way the Bill has been drafted, the role of the OPC in overseeing the privacy practices of the railway companies in connection with the LVVRs will be diminished.

It appears that the intention was to shield railway companies’ use of LVVR data from OPC scrutiny. To accomplish this, Bill C-49 provides explicit carve-outs from the application of the Personal Information Protection and Electronic Documents Act (PIPEDA). These carve-outs trouble the Commissioner. In particular:

  • Railway companies do not have to comply with section 7 of PIPEDA, which restricts the ability to collect, use or disclose personal information without consent
  • Railway companies do not have to comply with the principles in Schedule 1 of PIPEDA regarding the collection, use, disclosure and retention of information

The Commissioner is concerned that the OPC’s jurisdiction to investigate complaints under PIPEDA may be in doubt. Naturally, if a railway company may collect, use and disclose the personal information in the LVVRs without regard to section 7 and Schedule 1 of PIPEDA, it will argue that the OPC has no jurisdiction to hear complaints on these issues.

Further, the OPC is concerned that an individual may not have a right of access to the personal information in the LVVRs as would otherwise be required by PIPEDA in light of section 28 of the Canadian Transportation Accident Investigation and Safety Board Act, which restricts to whom the LVVR data could be disclosed.

Find Bill C-49 on LegisInfo here.

Read the TSB Railway Safety Issues Investigation Report R16H0002 here.

Read the transcript of Commissioner Therrien’s remarks before the Senate Committee here.

ETHI Report on PIPEDA is Coming Soon

The Standing Committee on Access to Information, Privacy and Ethics will be tabling its report sometime soon after Parliament resumes on Monday, February 26th. The report will be titled “Towards Privacy by Design: A Review of the Personal Information Protection and Electronic Documents Act.” The title strongly hints that the report will advocate an express obligation in PIPEDA requiring organizations to adopt privacy by design and by default. If adopted, this would bring Canada’s laws one step closer to Europe’s General Data Protection Regulation (GDPR), which will come into force on May 25, 2018. Privacy by Design is a made-in-Canada concept, so it would be fitting for it to “come home”.

Read my article for the International Association of Privacy Professionals (IAPP) titled “Legislating privacy by design in Canada” here.

Learn about Privacy by Design here.

Using the Criminal Code to Require News Media to “Un-Publish” Fails

In Canada, s. 486.4(2.1) of the Criminal Code allows a court to make an order protecting a victim of a crime by prohibiting any information that could identify the victim from being “published in any document or broadcast or transmitted in any way.” The victim must be under 18 years of age.

A recent decision of the Supreme Court considered a case in which the Crown wanted information published online prior to a publication ban to be “un-published” by the news organization.

R. v. Canadian Broadcasting Corp. dealt with an application for an interim injunction to require the CBC to “un-publish” a story about a murder. Essentially, the facts were these: an individual was charged with the murder of a person under the age of 18. The CBC reported on the case and published the victim’s name before the Crown obtained a publication ban on the name. The Crown wanted the CBC to remove the information from the CBC’s website. The CBC refused. The Crown then sought an interlocutory order requiring the CBC to remove the name of the victim until the hearing to decide whether the CBC was in criminal contempt for failing to do so.

The Crown was obliged to establish that it had a “strong prima facie case” in order to obtain the interlocutory order requiring the CBC to remove the identifying information of the victim. This is because the order would compel the CBC to “do” something. If the order simply required the CBC to refrain from doing something, the test would be lower.

The key issue was whether the Crown had a strong prima facie case that the CBC was intentionally disobeying a court order. The Crown attempted to argue that publishing on the CBC website was a continuous activity. On this theory, the CBC was directly and intentionally violating the publication ban, even if the original publication occurred prior to the publication ban. The Supreme Court did not rule out the possibility that the Crown would be successful. However, the court concluded that the Crown did not have a strong prima facie case that the publication was a continuous activity. The result was that the CBC did not have to remove the identifying information.

We should be cautious in suggesting that the court’s decision is relevant to the current debate in Canada regarding the type of “right to be forgotten” that the Office of the Privacy Commissioner of Canada (OPC) has suggested exists under Canada’s private sector privacy law. I discussed this “right to be forgotten” in a recent post on the OPC’s draft Position on Online Reputation. In that draft Position paper, the OPC suggested that individuals have the right, in certain circumstances, to have inaccurate online information about them removed, and could even require search engines to remove or suppress search results for an individual’s name on the basis that the information about the individual is not accurate.

There is a fairly wide gulf between the CBC case and the type of “right to be forgotten” discussed by the OPC. Nevertheless, there is one intriguing point of relevance. The court seemed quite skeptical that the mere fact that information remained available online meant that it was being “continuously” published. Provided that the “story” was not “updated” (i.e. “republished”), the court might be reluctant to require the editing of historical documents. It is too soon to tell, but not too soon to think about.

Read R. v. Canadian Broadcasting Corp, 2018 SCC 5.

Chasing the Autonomous Vehicle – International Trade Matters

What influence will the United States have on the public policy choices available to federal and provincial governments in Canada regarding autonomous and connected vehicles? That issue was not explored in any depth in the Canadian Senate’s important report on automated and connected vehicles (released January 29, 2018). True, one of the Senate’s 16 recommendations focused on international cooperation with the United States. However, that recommendation was focused on making sure that vehicles “worked” in both countries from a technical perspective, which is simply table stakes. International trade with the United States may be a critical factor in demarcating what practical options are available to Canadian regulators in important areas such as privacy and cybersecurity.

There were an estimated 263 million registered passenger vehicles in the United States in 2015. By comparison, Statistics Canada tells us that there were 24 million registered road motor vehicles in Canada in 2016. The total number of vehicles in Canada follows the general rule when comparing Canada and the United States: we have roughly one-tenth the population, so it is not surprising that we also have very roughly one-tenth the number of passenger vehicles on the road.
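As a rough back-of-the-envelope check on that one-tenth comparison, the arithmetic can be sketched as follows. This is a minimal illustration only: the vehicle figures come from the paragraph above, while the population figures of roughly 320 million and 36 million are approximations added here solely for the purpose of the example.

# Back-of-the-envelope comparison of vehicle registrations.
# Vehicle figures are from the paragraph above; population figures are
# approximate and included only for illustration.
us_vehicles = 263_000_000    # registered passenger vehicles, U.S. (2015)
ca_vehicles = 24_000_000     # registered road motor vehicles, Canada (2016)
us_population = 320_000_000  # approximate U.S. population (assumption)
ca_population = 36_000_000   # approximate Canadian population (assumption)

print(f"Vehicle ratio (Canada/U.S.): {ca_vehicles / us_vehicles:.2f}")          # ~0.09
print(f"Population ratio (Canada/U.S.): {ca_population / us_population:.2f}")  # ~0.11

Both ratios land close to one-tenth, which is the point of the comparison.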

The size of the Canadian market compared to that of the United States is important context for determining design priorities for auto manufacturers. Another related factor is the speed with which the U.S. has moved in developing a regulatory environment. The U.S. Department of Transportation has already developed a voluntary code of safety design elements. It has also issued cybersecurity best practices. The Senate noted that 21 U.S. states and Washington, D.C. have enacted automated vehicle legislation. Federal U.S. legislation is likely inevitable. Although the U.S. Department of Transportation has not taken a prescriptive approach to safety design elements, it is likely only a matter of time before it does so. Once the technology matures, the U.S. regulatory approach is likely to be much more prescriptive than its Canadian counterpart. As between designing for a prescriptive standard and designing for a “principles-based” standard, the prescriptive standard wins.

The Senators clearly recognized the importance of cooperation with the United States. Recommendation 3 was for Transport Canada to strengthen its work on automated and connected vehicles with the United States through the Regulatory Cooperation Council “to ensure that these vehicles will work seamlessly in both countries.” However, there are many other areas in which cooperation might be required in order to achieve public policy goals. For example, five of the Senate’s 16 recommendations related to privacy and cybersecurity:

Recommendation 6: Transport Canada to work with the Communications Security Establishment and Public Safety Canada to develop cybersecurity guidance.

Recommendation 7: Transport Canada to work with Public Safety Canada, the Communications Security Establishment and industry stakeholders to address cybersecurity issues and establish a real-time crisis connect network.

Recommendation 8: Strengthen the powers of the Office of the Privacy Commissioner of Canada to proactively investigate and enforce industry compliance with the Personal Information Protection and Electronic Documents Act.

Recommendation 9: The Government of Canada to continue to assess the need for privacy regulations specific to the connected car.

Recommendation 10: Transport Canada to bring together stakeholders to develop a connected car framework, with privacy protection as one of its key drivers.

Apart from Recommendation 8, the question is whether deep “privacy-by-design” and “security-by-design” features can be embedded in automated and connected vehicles without close cooperation between Canada and the United States. This spans more than transportation regulatory authorities. It requires cooperation among the many regulators with responsibilities for privacy: the U.S. Federal Trade Commission, U.S. State Attorneys General, Canadian federal and provincial Privacy Commissioners, and many others.

Read the Senate Report: Driving Change: Technology and the future of the automated vehicle.

Canada and the Right to be Forgotten

It may be surprising that, until this past Friday, there was considerable doubt about whether Canada’s federal private sector privacy law applies to online search engines. The Office of the Privacy Commissioner of Canada (OPC) had previously skirted deciding this issue.

This changed with the release of the draft OPC Position on Online Reputation. The OPC decided not only that online search engines (such as Google search, Bing and others) are subject to the Personal Information Protection and Electronic Documents Act (PIPEDA), but also that search engines may be required to “de-index” search results about an individual in some cases. In other words, the OPC has introduced a limited type of “right to be forgotten” in Canada.

It will be interesting to watch whether search engines accede to the OPC’s interpretation of PIPEDA. Certainly, we should expect to see some vigorous debate before the OPC finalizes its Guidelines. After that, perhaps the OPC will need to find a case and take it to court. The OPC’s “position” does not have the force of law. And, until the OPC completes an investigation, the OPC cannot take a search engine to court for a ruling.

Are search engines engaged in a “commercial activity”?

Search engines form an indispensable tool through which most individuals navigate the world wide web. However, in order for a search engine to be subject to PIPEDA, it must be performing a commercial activity and not be engaged in journalism or a literary activity.

The OPC concluded that a search engine is not performing a “journalistic” or “literary” function. A search engine operator indexes content, applies proprietary algorithms to that content, and then displays results based on relevance, as predicted by those algorithms. The relevance of the search results is to some extent customized depending on the user and the user’s location. The OPC noted that in many cases search engines display advertising along with the search results and that this sale of advertising is “inextricably linked” to the search function. In this way, the OPC concluded that search engines are engaged in a commercial activity.
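To make that characterization concrete, here is a toy sketch of how the same query can drive both the organic results and the advertising displayed beside them. It is entirely hypothetical: the mini-index, the naive relevance heuristic and the advertiser keywords are invented for illustration and do not reflect any actual search engine’s implementation.

# Toy illustration (hypothetical, not any real search engine's implementation)
# of how ad display can be bundled with the organic search function: the same
# query drives both the ranked results and the ads shown beside them.
from collections import Counter

# Hypothetical mini-index: URL -> page text.
INDEX = {
    "example.com/a": "privacy law canada pipeda search engines",
    "example.com/b": "recipes for dinner privacy not mentioned",
    "example.com/c": "canada privacy commissioner position on online reputation",
}

# Hypothetical advertiser keyword purchases: keyword -> ad text.
ADS = {
    "privacy": "Ad: Talk to a privacy lawyer today",
    "recipes": "Ad: Meal kits delivered to your door",
}

def search(query: str, location: str = "CA"):
    """Return ranked organic results plus ads matched to the same query."""
    terms = query.lower().split()
    # Naive relevance score: count how often each query term appears on the page.
    scores = {
        url: sum(Counter(text.split())[t] for t in terms)
        for url, text in INDEX.items()
    }
    organic = [url for url, s in sorted(scores.items(), key=lambda kv: -kv[1]) if s > 0]
    ads = [ad for keyword, ad in ADS.items() if keyword in terms]
    # The location parameter is where result customization would occur (omitted here).
    return {"results": organic, "ads": ads}

print(search("privacy canada"))

The point of the sketch is simply that the advertising and the organic results flow from the same query; this is the sense in which the OPC views the sale of advertising as “inextricably linked” to the search function.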

Where is the “right to be forgotten” in PIPEDA?

The OPC constructed a right to challenge search engine results by combining three principles in Schedule 1 to PIPEDA: the “accuracy”, “individual access” and “challenging compliance” principles. Essentially, the OPC concluded that if a search result is inaccurate, incomplete or not up to date, then the search engine must balance the interests of the individual against the public interest in the web page continuing to be indexed and displayed in the search results.

Stay tuned for further debate

The OPC’s guidance is in draft form. We should expect considerable debate in the coming weeks. The OPC has itself called for Parliament to consider the issues, and whether the OPC has “struck the right balance”.

There will be supporters and critics of the OPC’s activism. Supporters of the OPC’s approach may cite the role of search engines in driving traffic to content. They will argue that the algorithms deployed by search engines are neither transparent nor neutral. If the information being returned by the algorithm does not accurately reflect information about the individual, they may rightly ask what the overriding interest is in making inaccurate information prominent in the search results, or in displaying it at all where the harm to the individual exceeds other interests.

Nevertheless, there will be many critics of the OPC’s approach. It places the burden on search engines to arbitrate which results should be de-indexed or shown. Moreover, the OPC’s suggestion that search engines will need to geo-block results is a remarkable interference both with individuals’ ability to access information and with the freedom of expression of the authors of the underlying information. If the underlying information is the problem, there are a variety of tools for dealing with it, such as the law of defamation, torts of invasion of privacy, and other legal remedies.

Read the full OPC Position on Online Reputation.

Professor Michael Geist penned a response to the OPC decision in the Globe and Mail, which can be found here.

Biometrics – Who’s aggrieved in Illinois?

The Illinois Biometric Information Privacy Act set off a wave of private litigation in the United States. The Act establishes a private right of action for any person “aggrieved” by a violation of the Act, which regulates how private entities collect, retain, disclose and destroy biometric identifiers or biometric information. Successful plaintiffs are entitled to their actual damages or liquidated damages of US$1,000 for negligent violations or US$5,000 for intentional violations.

Recently, however, the Appellate Court of Illinois tapped the brakes by requiring the plaintiff to have suffered some pecuniary or non-pecuniary damage in order to be “aggrieved”. [Don’t know what pecuniary/non-pecuniary refer to? See the note at the end of this post.]

Why should Canadians care?

Canadian readers will note that, with the exception of the Province of Quebec, Canada does not have federal or provincial privacy laws that specifically target biometric information. However, general private sector privacy legislation can be used to seek monetary awards (usually after a complaints procedure has been exhausted). At least at present, the general trend seems to be for Canadian courts to provide damages for non-pecuniary damage without putting the plaintiff to much of a burden of proof (if any). However, courts may begin to turn to the debates in the U.S. over standing if there is a rush to the courts to claim damages under existing laws or if Parliament or provincial legislatures begin experimenting with statutory damages.

What is the Biometric Information Privacy Act?

The Biometric Information Privacy Act covers “biometric identifiers” and “biometric information”. Biometric identifiers are retina or iris scans, fingerprints, handprints, voiceprints, or facial geometry scans. Biometric information is any information that is based on a biometric identifier used to identify an individual.

The Act imposes certain requirements on private entities (not public authorities) when collecting, using, disclosing or even possessing biometrics. Those obligations include:

  • notifying the individual of the collection and storage of the biometrics;
  • providing an explanation of the purpose for the collection, use and storage;
  • identifying the retention period for the use and storage; and
  • obtaining a written release for the collection, use and storage.

There are other obligations as well, relating to the manner of storage, maximum retention periods and other matters.

No Strict Liability – Plaintiff must suffer damage

In Rosenbach v. Six Flags Entertainment Corp., the plaintiff (who was suing on behalf of her minor son) alleged that the defendant had taken a thumbprint of her son without complying with the Biometric Information Privacy Act. The plaintiff’s son had purchased a season pass to the defendant’s theme park. The thumbprint was to be used in connection with the season pass in order to enter the park. The plaintiff alleged that the defendant had not made the appropriate disclosures and had not obtained her consent to the thumbprint of her minor son. However, the plaintiff did not allege any damage other than that she would not have consented to the collection of the thumbprint had she known of the defendant’s practices.

The Second District Appellate Court of Illinois concluded that a plaintiff seeking a remedy under the Biometric Information Privacy Act must have suffered some pecuniary or non-pecuniary damage in order to be entitled to a remedy under the private right of action. The court concluded that the statute was not meant to be one of strict liability. In order to be “aggrieved”, the person must have suffered some harm. A technical violation of the Act would not necessarily result in any harm.

You can read the court’s decision in Rosenbach v. Six Flags Entertainment Corp. here.

The Biometric Information Privacy Act can be found here.

The law firm Ropes & Gray has an interesting client alert that you can find here.

Note: Curious as to what pecuniary and non-pecuniary mean? Without getting into the details, pecuniary damages are essentially those that can be quantified in monetary terms – for example, out of pocket expenses. Non-pecuniary damages are for injuries such as distress and pain and suffering.

Privacy and Text Messages

In December, the Supreme Court of Canada issued two important decisions on the reasonable expectation of privacy in text messages.

The decisions relate to two issues. First, does the sender of a text message continue to have a privacy interest in the content of the message after it has been sent? The court said that the sender could have a continuing privacy interest, depending on the circumstances. Second, what are law enforcement’s obligations to obtain judicial authorization when seeking copies of past text messages through production orders to telecommunications service providers? The court concluded that there is a difference between seeking past messages and future messages. This meant that law enforcement could obtain past messages from the telecommunications service provider under a lower standard.

You can find my analysis of these cases in my article for the International Association of Privacy Professionals, “Defining privacy in text messages – a step forward and maybe a step back”.