Privacy Commissioner posts new case summaries

Privacy breaches and complaints can often be resolved cooperatively.  We usually hear about the large, dramatic, far-reaching breaches more than about the smaller ones that get resolved quietly.

The privacy commissioner just released some examples.

In one example, a fraudster socially engineered information out of customer service representatives, which enabled the fraudster to contact customers and try to obtain more information that could be used for fraud.  The business investigated, contacted the individuals whose information may have been compromised, and took steps to reduce the chances of it happening again.

In another situation, a rogue employee took customer information, which was then used to impersonate the company and collect money from a customer.  The business was not very responsive to the customer's complaint until the privacy commissioner got involved.  In the end the employee was dismissed, the customer was made whole, and steps were taken to reduce the chances of it happening again.

From a business perspective, these cases show the need to take privacy complaints seriously, and to deal with them quickly and effectively.

From a consumer perspective, they show the need to be cautious when you are asked for your information – especially when someone contacts you – and to be patient when your service providers take steps to make sure you are who you say you are.

Cross-posted to Slaw.

Trump’s executive order on foreigners strips privacy protection for Canadians

Included in Trump’s reprehensible executive order “Enhancing Public Safety in the Interior of the United States” was this:

Sec. 14.  Privacy Act.  Agencies shall, to the extent consistent with applicable law, ensure that their privacy policies exclude persons who are not United States citizens or lawful permanent residents from the protections of the Privacy Act regarding personally identifiable information.

The Privacy Act covers personal information held by US Federal agencies.  This would apply, for example, to information collected about Canadians entering the United States.

This should be attracting the wrath of the Canadian privacy commissioner and the Canadian government.

More detail is in this post by Michael Geist and this post on Open Media.

Given this attitude, we should be redoubling efforts to make sure our communications are encrypted.

Conventional wisdom has been that our data is just as safe in the US as in Canada, given that both countries have limits on privacy when it comes to law enforcement and government ability to dip into our information.  But this cavalier attitude puts that into question, and it may be prudent for Canadian entities to keep their data in Canada to the extent possible.  Where that isn't practical, steps should be taken (and assurances obtained from vendors) to encrypt the data in a way that the provider doesn't have access to it.
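
To make that last point concrete, here is a minimal Python sketch of client-side encryption, assuming the third-party cryptography package and its Fernet recipe (my choice for illustration, not a recommendation of any particular product).  The idea is simply that the business encrypts the data with a key it keeps to itself, so the cloud provider only ever stores ciphertext.

```python
# Minimal sketch: encrypt data with a key the business controls before it is
# sent to a cloud provider, so the provider only ever stores ciphertext.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Generate the key once and keep it under the business's own control;
# it is never shared with the cloud vendor.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"client file: sensitive personal information"
ciphertext = cipher.encrypt(record)   # this is what gets uploaded

# Only someone holding the key can recover the plaintext.
assert cipher.decrypt(ciphertext) == record
```

A real deployment would also need careful key management and backups of the key itself, since losing the key means losing access to the data.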

Cross-posted to Slaw

Data Privacy Day event in London

January 28 is Data Privacy Day – “an international effort held annually on January 28 to create awareness about the importance of privacy and protecting personal information.”

The IAPP (International Association of Privacy Professionals) is honouring the day with local “Privacy After Hours” events on Thursday, January 26th.

Privacy professionals in London are welcome to attend the event being held at McGinnis Landing restaurant.  Harrison Pensa is pleased to provide the appetizers for the event.

You can sign up for the event on the IAPP website.  You have to create an IAPP logon ID to register – which is quick and painless to do.

SCC renders practical privacy decision on mortgage information

The Supreme Court of Canada, in Royal Bank v Trang, made a privacy decision that will bring a sigh of relief to lenders and creditors.

A judgment creditor asked the sheriff to seize and sell a house to satisfy the judgment.  To do that, the sheriff needed to know how much was owed on the mortgage on the house.  The mortgage lender didn’t have express consent to provide the information, and said PIPEDA prevented it from disclosing it.  The lower courts agreed.

But the SCC took a more practical approach.  The issue was whether there was implied consent to release that personal information.  The SCC said there was.

The Court interpreted implied consent from a broader perspective, looking at the entire situation, including the legitimate business interests of other creditors.  Financial information is considered sensitive personal information, and thus generally faces a higher threshold for implied consent.  But in this context, the Court held that it is a reasonable expectation of a debtor that a mortgage lender will provide a discharge statement to another creditor wanting to enforce its rights against the property.

Cross-posted to Slaw

Big data privacy challenges

Big data and privacy was one of the topics discussed at the Canadian IT Law Association conference this week.  Some of the issues worth pondering include:

  • Privacy principles say to collect only what you need, and keep it only as long as needed.  Big data says to collect and retain as much as possible in case it turns out to be useful.
  • Accuracy is a basic privacy principle – but with big data accuracy is being replaced by probability.
  • A fundamental privacy notion is informed consent for the use of one’s personal information.  How do you have informed consent and control for big data uses when you don’t know what it might be used for or combined with?
  • Probability means that the inferences drawn may not always be accurate.  How do we deal with that if we as individuals are faced with erroneous inferences about us?
  • If the analysis is based on information that may itself be questionable, the results may be questionable (the old garbage in, garbage out concept).  It has been proposed that for big data and AI, we might want to add to Asimov’s three laws of robotics that a system won’t discriminate, and that it will disclose its algorithm.
  • If AI reaches conclusions that lead to discriminatory results, is that going to be dealt with by privacy regulators, or human rights regulators, or some combination?
  • Should some of this be dealt with by ethical layers on top of privacy principles? Perhaps no-go zones for things felt to be improper, such as capturing audio and video without notice, charging to remove or amend information, or re-identifying anonymized information.

Cross-posted to Slaw

Cloud computing: It’s All Good – or Mostly Good

A ZDNet article entitled Cloud computing: Four reasons why companies are choosing public over private or hybrid clouds makes a case for the value of the public cloud.

The reasons:

  • Innovation comes as standard with the public cloud
  • Flexibility provides a business advantage
  • External providers are the experts in secure provision
  • CIOs can direct more attention to business change

This is all good – or mostly good.

The caveat is that a move to the cloud can fail if a business adopts it without thinking it through from the perspectives of mission criticality, security, privacy, and continuity.  If a business runs a mission-critical system in the cloud and that system fails, the business could be out of business.

The IT Manager no longer has to deal with day-to-day issues around keeping software and security up to date.  But they still have to consider higher-level issues.

It is important to understand what the needs are for the situation at hand.  A system that is not mission-critical, or does not contain sensitive information, for example, would not require as much scrutiny as a system that runs an e-commerce site.

Issues to consider include:

  • how mission-critical the system is
  • what the consequences of a short-term or long-term outage would be
  • how confidential or personal the information in the system is
  • whether the information can be encrypted in transit and at rest
  • how robust the vendor’s continuity plan is
  • whether the business needs its own continuity plan – such as a local copy of the data (see the sketch after this list)
  • how robust the vendor’s security is
  • whether the vendor has third-party security validation to accepted standards
  • whether the vendor’s agreement backs these issues up with contractual terms and service levels with meaningful remedies
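
As a rough illustration of the “local copy of the data” point above, here is a hypothetical Python sketch of a scheduled export job.  The vendor endpoint, token, and export format are assumptions made purely for illustration; a real provider’s export or backup API will look different.

```python
# Hypothetical sketch of keeping a local copy of cloud-hosted data as part of
# the business's own continuity plan.  The endpoint and token are placeholders.
import datetime
import pathlib

import requests  # third-party HTTP client (pip install requests)

EXPORT_URL = "https://vendor.example.com/api/v1/export"  # hypothetical endpoint
API_TOKEN = "replace-with-a-real-token"


def take_local_backup(backup_dir: str = "backups") -> pathlib.Path:
    """Download a full export from the (hypothetical) vendor and store it locally."""
    response = requests.get(
        EXPORT_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=60,
    )
    response.raise_for_status()

    # Date-stamped file so the business builds up a history of local copies.
    target = pathlib.Path(backup_dir)
    target.mkdir(parents=True, exist_ok=True)
    out_file = target / f"export-{datetime.date.today().isoformat()}.json"
    out_file.write_bytes(response.content)
    return out_file


if __name__ == "__main__":
    print(f"Local copy saved to {take_local_backup()}")
```

How often to run such a job, and how long to keep each copy, are business decisions that belong in the continuity plan itself.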

Cross-posted to Slaw

Privacy by Design is Crucial to Avoid IoT Disasters

If anyone doubts that Privacy by Design is a fundamentally important principle, consider these two recent articles.

This Wired article describes a hack, to be detailed at the upcoming Defcon conference, that can easily read keystrokes from – and inject keystrokes into – wireless keyboards that don’t use Bluetooth.  So you might want to consider replacing any non-Bluetooth wireless keyboards you have.

Security expert Bruce Schneier wrote this article entitled The Internet of Things Will Turn Large-Scale Hacks into Real World Disasters that explains the IoT risks.  The fundamental problem is that not enough attention is being paid to security for IoT devices.  This leaves the door open to situations where a hacker can, for example, easily get into your thermostat and then use it as a connection point to your network.  Cory Doctorow of Boing Boing refers to this as a coming IoT security dumpster-fire.

Bruce describes it this way:

The Internet of Things is a result of everything turning into a computer. This gives us enormous power and flexibility, but it brings insecurities with it as well. As more things come under software control, they become vulnerable to all the attacks we’ve seen against computers. But because many of these things are both inexpensive and long-lasting, many of the patch and update systems that work with computers and smartphones won’t work. Right now, the only way to patch most home routers is to throw them away and buy new ones. And the security that comes from replacing your computer and phone every few years won’t work with your refrigerator and thermostat: on the average, you replace the former every 15 years, and the latter approximately never. A recent Princeton survey found 500,000 insecure devices on the internet. That number is about to explode.

Cross-posted to Slaw

Emerging tech – potentially awesome and a privacy quagmire

I attended an event last night where Duncan Stewart of Deloitte talked about their TMT predictions for 2016.

It reinforced for me that the future of tech, and what it will do for us, is potentially awesome.  But at the same time, the amount of information being collected and stored about each of us is staggering.  That creates real privacy challenges, and real possibilities for abuse.  And because the information is there, there is a tendency for government and business alike to want to use it.

One scary aspect is that the more we get used to more information being collected about us, the more complacent we get.  Our personal freaky line – the line at which we stop using services because we are concerned about privacy issues – moves a little farther away.  That is in spite of the fact that the more information there is about us, the more ripe for abuse it is, and the more that we temper or alter our behaviour because we know we are being watched.

Think for a moment about all the information that is increasingly being collected about us.

  • Smartphones that know our every move and the most intimate and personal aspects of our lives.
  • Intelligent cars that know where we go and how we drive.
  • The internet of things where the stuff we own collects information about us.
  • Wearable tech that collects information about our fitness, and increasingly our health.
  • The trend for services to be performed in the cloud rather than locally, with our information stored in various motherships.
  • Big data that functions by saving as much information as possible.
  • Artificial intelligence and cognitive learning tools that can turn data into useful information and make inferences based on seemingly unconnected information.
  • Blockchain technology that has the potential to record surprising things about us.

On top of all this, it is becoming harder and harder to understand when our info stays on our device, when it goes somewhere else, how long it stays there, who has access to it, when it is encrypted, and who has access to the encryption keys.

It is in this context – and given that we just don’t have the time to understand and make all the privacy choices we need to make – that the Privacy Commissioner of Canada last week released a discussion paper titled Consent and privacy: A discussion paper exploring potential enhancements to consent under the Personal Information Protection and Electronic Documents Act.

The introduction states in part:

PIPEDA is based on a technologically neutral framework of ten principles, including consent, that were conceived to be flexible enough to work in a variety of environments. However, there is concern that technology and business models have changed so significantly since PIPEDA was drafted as to affect personal information protections and to call into question the feasibility of obtaining meaningful consent.

Indeed, during the Office of the Privacy Commissioner’s (OPC’s) Privacy Priority Setting discussions in 2015, some stakeholders questioned the continued viability of the consent model in an ecosystem of vast, complex information flows and ubiquitous computing. PIPEDA predates technologies such as smart phones and cloud computing, as well as business models predicated on unlimited access to personal information and automated processes. Stakeholders echoed a larger global debate about the role of consent in privacy protection regimes that has gained momentum as advances in big data analytics and the increasing prominence of data collection through the Internet of Things start to pervade our everyday activities.

Cross-posted to Slaw

Enemy of the State – still topical

I recently watched the 1998 movie Enemy of the State.  It is a spy thriller about a lawyer who is smeared by politicians because they believe he has information that can implicate them in a criminal matter – the murder of a politician who opposed a privacy bill that is really a bill empowering mass surveillance.  They use sophisticated, unsavoury, unethical, and illegal methods to watch him, discredit him, and retrieve the evidence.  No one is watching the watchers, who are out of control.

While, like any thriller, the plot is a bit over the top, it was fascinating to watch the movie again through a 2016 lens.  I challenge anyone to watch it and still say “I have nothing to hide” to dismiss privacy and surveillance concerns.

In a related sentiment, a recent study confirms that the knowledge that we may be watched has a chilling effect on what we do.  This Techdirt article is a good summary of that study.

Cross-posted to Slaw.