SCC renders practical privacy decision on mortgage information

The Supreme Court of Canada, in Royal Bank v Trang, made a privacy decision that will bring a sigh of relief to lenders and creditors.

A judgment creditor asked the sheriff to seize and sell a house to satisfy the judgment.  To do that, the sheriff needed to know how much was owed on the mortgage on the house.  The mortgage lender didn’t have express consent to provide the information, and said PIPEDA prevented it from giving it.  Lower courts agreed.

But the SCC took a more practical approach.  The issue was whether there was implied consent to release that personal information.  The SCC said there was.

The Court interpreted implied consent from a broader perspective, looking at the entire situation, including the legitimate business interests of other creditors.  Financial information is considered sensitive personal information, and thus in general faces a higher threshold for implied consent.  But in this context, the Court held that a debtor reasonably expects a mortgage lender to provide a discharge statement to another creditor seeking to enforce its rights against that property.

Cross-posted to Slaw

Big data privacy challenges

Big data and privacy was one of the topics discussed at the Canadian IT Law Association conference this week.  Some of the issues worth pondering include:

  • Privacy principles say to collect only what you need, and keep it only as long as needed.  Big data says to collect and retain as much as possible in case it is useful.
  • Accuracy is a basic privacy principle – but with big data accuracy is being replaced by probability.
  • A fundamental privacy notion is informed consent for the use of one’s personal information.  How do you have informed consent and control for big data uses when you don’t know what it might be used for or combined with?
  • Probability means that the inferences drawn may not always be accurate.  How do we deal with that if we as individuals are faced with erroneous inferences about us?
  • If the underlying information is questionable, the results may be questionable too.  (The old garbage in, garbage out concept.)  It has been proposed that for big data and AI, we might want to add to Asimov’s three laws of robotics that a system won’t discriminate, and that it will disclose its algorithm.
  • If AI reaches conclusions that lead to discriminatory results, is that going to be dealt with by privacy regulators, or human rights regulators, or some combination?
  • Should some of this be dealt with by ethical layers on top of privacy principles? Perhaps no go zones for things felt to be improper, such as capturing audio and video without notice, charging to remove or amend information, or re-identifying anonymized information.
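The re-identification risk in the last point can be illustrated with a toy example.  This is a hypothetical sketch (all names, fields, and records are invented): a classic linkage attack, where an “anonymized” dataset is joined with public auxiliary data on quasi-identifiers such as postal code and birth year.

```python
# Toy linkage attack: matching an "anonymized" dataset against public
# auxiliary data on quasi-identifiers re-identifies the individuals.
# All records below are invented for illustration.

anonymized_health = [
    {"postal": "M5V", "birth_year": 1975, "diagnosis": "diabetes"},
    {"postal": "K1A", "birth_year": 1982, "diagnosis": "asthma"},
]

public_directory = [
    {"name": "Alice Smith", "postal": "M5V", "birth_year": 1975},
    {"name": "Bob Jones", "postal": "K1A", "birth_year": 1982},
]

def reidentify(anon_rows, aux_rows):
    """Join rows on the quasi-identifiers (postal, birth_year)."""
    matches = []
    for anon in anon_rows:
        for aux in aux_rows:
            if (anon["postal"], anon["birth_year"]) == (aux["postal"], aux["birth_year"]):
                matches.append({"name": aux["name"], "diagnosis": anon["diagnosis"]})
    return matches

for match in reidentify(anonymized_health, public_directory):
    print(match)  # each "anonymous" record is now tied to a name
```

With only two quasi-identifiers and a small auxiliary dataset, every record is re-identified – which is why removing direct identifiers alone is not real anonymization.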

Cross-posted to Slaw

Cloud computing: It’s all good – or mostly good

A ZDNet article entitled Cloud computing: Four reasons why companies are choosing public over private or hybrid clouds makes a case for the value of the public cloud.

The reasons:

  • Innovation comes as standard with the public cloud
  • Flexibility provides a business advantage
  • External providers are the experts in secure provision
  • CIOs can direct more attention to business change

This is all good – or mostly good.

The caveat is that cloud adoption can fail if a business doesn’t think it through from the perspectives of mission criticality, security, privacy, and continuity.  If a business runs mission critical systems in the cloud and those systems fail, the business could be out of business.

The IT manager no longer has to deal with the day to day work of keeping software and security up to date.  But they still have to consider higher level issues.

It is important to understand the needs of the situation at hand.  A system that is not mission critical, or that does not contain sensitive information, would not require as much scrutiny as, for example, a system that runs an e-commerce site.

Issues to consider include:

  • how mission critical the system is
  • what the consequences are of a short term and a long term outage
  • how confidential or personal the information in the system is
  • whether the information can be encrypted in transit and at rest
  • how robust the vendor’s continuity plan is
  • the need for the business to have its own continuity plan – such as a local copy of the data
  • how robust the vendor’s security is
  • whether the vendor has third party security validation to accepted standards
  • whether the vendor’s agreement backs these issues up with contractual terms and service levels with meaningful remedies
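The encryption point in the list above can be sketched in a few lines.  This is a minimal illustration of encryption at rest, assuming the third-party Python cryptography package (the sample record is invented): the business encrypts data client-side before it goes to the vendor, so the vendor holds only ciphertext and the key stays with the business.

```python
# Minimal sketch of client-side "encryption at rest", assuming the
# third-party `cryptography` package.  The sample record is invented.
from cryptography.fernet import Fernet

# In practice the key would live in a key-management service,
# not alongside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"customer SIN: 123-456-789"   # invented sample data
stored = cipher.encrypt(record)          # what the cloud vendor actually holds

assert record not in stored              # ciphertext reveals nothing directly
assert cipher.decrypt(stored) == record  # only the key-holder can recover it
```

The design point is that if the vendor never sees the key, a breach on the vendor’s side exposes only ciphertext – though encryption in transit (TLS) and the vendor’s own controls are still needed for the data the application must process.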

Cross-posted to Slaw

Privacy by Design is Crucial to avoid IoT Disasters


If anyone doubts that Privacy by Design is a fundamentally important principle, consider these two recent articles.

This Wired article describes a hack, to be detailed at the upcoming Defcon conference, that can easily read and inject keystrokes on wireless keyboards that are not Bluetooth.  So you might want to consider replacing any non-Bluetooth wireless keyboards you have.

Security expert Bruce Schneier wrote this article, entitled The Internet of Things Will Turn Large-Scale Hacks into Real World Disasters, that explains the IoT risks.  The fundamental problem is that not enough attention is being paid to security for IoT devices.  That leaves the door open for a hacker to, for example, easily get into your thermostat and then use it as a connection point into your network.  Cory Doctorow of Boing Boing refers to this as a coming IoT security dumpster-fire.

Bruce describes it this way:

The Internet of Things is a result of everything turning into a computer. This gives us enormous power and flexibility, but it brings insecurities with it as well. As more things come under software control, they become vulnerable to all the attacks we’ve seen against computers. But because many of these things are both inexpensive and long-lasting, many of the patch and update systems that work with computers and smartphones won’t work. Right now, the only way to patch most home routers is to throw them away and buy new ones. And the security that comes from replacing your computer and phone every few years won’t work with your refrigerator and thermostat: on the average, you replace the former every 15 years, and the latter approximately never. A recent Princeton survey found 500,000 insecure devices on the internet. That number is about to explode.

Cross-posted to Slaw

Emerging tech – potentially awesome and a privacy quagmire

I attended an event last night where Duncan Stewart of Deloitte talked about their TMT predictions for 2016.

It reinforced for me that the future of tech and what it will do for us is potentially awesome.  But also at the same time the amount of information that is being collected and stored about each of us is staggering.  That creates real privacy challenges, and real possibilities for abuse.  And because the information is there, there is a tendency for government and business alike to want to use it.

One scary aspect is that the more we get used to more information being collected about us, the more complacent we get.  Our personal freaky line – the line at which we stop using services because we are concerned about privacy issues – moves a little farther away.  That is in spite of the fact that the more information there is about us, the more ripe for abuse it is, and the more that we temper or alter our behaviour because we know we are being watched.

Think for a moment about all the information that is increasingly being collected about us.

  • Smartphones that know our every move and the most intimate and personal aspects of our lives.
  • Intelligent cars that know where we go and how we drive.
  • The internet of things where the stuff we own collects information about us.
  • Wearable tech that collects information about our fitness, and increasingly our health.
  • The trend for services to be performed in the cloud rather than locally, with our information stored in various motherships.
  • Big data that functions by saving as much information as possible.
  • Artificial intelligence and cognitive learning tools that can turn data into useful information and make inferences based on seemingly unconnected information.
  • Blockchain technology that has the potential to record surprising things about us.

On top of all this, it is becoming increasingly hard to understand when our info stays on our device, when it goes somewhere else, how long it stays there, who has access to it, when it is encrypted, and who has access to the encryption keys.

It is in this context – and given that we just don’t have the time to understand and make all the privacy choices we need to make – that the Privacy Commissioner of Canada last week released a discussion paper titled Consent and privacy: A discussion paper exploring potential enhancements to consent under the Personal Information Protection and Electronic Documents Act.

The introduction states in part:

PIPEDA is based on a technologically neutral framework of ten principles, including consent, that were conceived to be flexible enough to work in a variety of environments. However, there is concern that technology and business models have changed so significantly since PIPEDA was drafted as to affect personal information protections and to call into question the feasibility of obtaining meaningful consent.

Indeed, during the Office of the Privacy Commissioner’s (OPC’s) Privacy Priority Setting discussions in 2015, some stakeholders questioned the continued viability of the consent model in an ecosystem of vast, complex information flows and ubiquitous computing. PIPEDA predates technologies such as smart phones and cloud computing, as well as business models predicated on unlimited access to personal information and automated processes. Stakeholders echoed a larger global debate about the role of consent in privacy protection regimes that has gained momentum as advances in big data analytics and the increasing prominence of data collection through the Internet of Things start to pervade our everyday activities.

Cross-posted to Slaw

Enemy of the State – still topical

I recently watched the 1998 movie Enemy of the State.  It is a spy thriller about a lawyer smeared by politicians who believe he has evidence implicating them in the murder of a politician who opposed a privacy bill – a bill that really empowers mass surveillance.  They use sophisticated, unsavoury, unethical, and illegal methods to watch him, discredit him, and retrieve the evidence.  No one is watching the watchers, who are out of control.

While, like any thriller, the plot is a bit over the top, it was fascinating to watch the movie again through a 2016 lens.  I challenge anyone to watch it and still say “I have nothing to hide” to dismiss privacy and surveillance concerns.

In a related sentiment, a recent study confirms that the knowledge that we may be watched has a chilling effect on what we do.  This Techdirt article is a good summary of that study.


Cross-posted to Slaw

Panama Papers – Points to Ponder

The Panama Papers revelations are worth pondering on many levels.  (This Wired article is a good summary.)

My first reaction to the high level tax evasion and corruption allegations was to blanch at the thought that someone had basically given the entire contents of a law firm’s document management system to a third party.

As a lawyer, the fact that law firm files were leaked causes me to wince. After all, solicitor-client privilege is a fundamental tenet of democratic society. Law firms take the security of their files very seriously, and getting access to this information would not be an easy task.

This has parallels to the Snowden leaks. I’ve said before that Snowden should be congratulated, not prosecuted.

But this is not the same.

Snowden leaked information about one government entity.  This is a leak of personal, sensitive, and confidential information about thousands of individuals and corporations.  Some of the activities exposed by the press are no doubt illegal or unethical; some may raise a debate over where the line should be between tax avoidance and tax evasion, and over tax havens in general.

But that does not justify this kind of breach to the press.

Unfortunately this has created a smell test where anyone who has an offshore company, or any business (such as a law firm) involved in their creation, gets unfairly tarred with suspicion.

According to press reports, the journalists won’t release the actual documents, to respect the privacy of the innocent.  That’s good – but that shouldn’t be a decision a journalist has to, or gets to, make.

Apple fought the FBI to keep phones secure.  In that case the end the FBI was seeking did not justify the means, largely because it would put the information of everyone using an iPhone at risk.  So how is this leak, which exposes the legal files of thousands of people, any different?  One minute we are applauding security and privacy – the next we seem to be applauding a massive breach of both.

It is too easy to dismiss this as a risk that is peculiar to law firms in tax havens that are perceived to facilitate unsavoury activities. Has this perhaps put a bigger target on law firms for both inside and outside hackers?

An IT security firm told me this morning that they have been contacted by a number of law firms wondering what shape their security measures are in, in light of the Panama Papers.

Perhaps law firms everywhere should take another look at their security measures to reduce the chances this could happen to them.

Cross-posted to Slaw

Apple fights court imposed FBI backdoor order

Apple CEO Tim Cook has taken a very public stand against an FBI request and court order to create a backdoor into the Apple operating system.  This arose from the investigation into the San Bernardino mass shooting last December.

See this article on ZDNet for more details.  And read Tim Cook’s customer letter posted on the Apple website for a more complete explanation of Apple’s position.

Kudos to Tim Cook and Apple for this.

Security and privacy experts continue to point out that backdoors are a bad idea that cause far more harm than good.

See, for example, this ZDNet article from yesterday about a new report saying “European cybersecurity agency ENISA has come down firmly against backdoors and encryption restrictions, arguing they only help criminals and terrorists while harming industry and society.”

Cross-posted to Slaw

Encryption = good: Backdoor = bad

Every time there is a tragic attack on people or property, there is a cry from various authorities or politicians for law enforcement to get unfettered access to all kinds of communication tools.

But that would cause far more harm than good, and is a really bad idea.

The argument goes something like this:

These bad actors hide behind encrypted communications to plan their evil deeds.  Therefore to stop them law enforcement needs to have access to all this.  Therefore we need to have backdoors built into all encryption that law enforcement can use.

This is flawed in many ways.

There is no evidence that unfettered access to communications helps.  Sometimes the information was actually available, but no one managed to put it together ahead of time to stop the evil deed.

There is no way that backdoors can be limited to use by law enforcement.  They will inevitably be discovered by others and used for evil, thus rendering encryption and all the protection it provides useless.

Bad actors will stay a step ahead.  If mainstream communications and encryption tools have backdoors, they will just create their own secure communications channels.

But don’t just take my word for this.  Read, for example, this article by security expert Bruce Schneier entitled Why we Encrypt.

And this article by Cory Doctorow, entitled What David Cameron just proposed would endanger every Briton and destroy the IT industry, on how ridiculous British Prime Minister David Cameron’s comments on the need to backdoor encryption are.

And this article by Mike Masnick of Techdirt entitled The Paris Attacks Were An Intelligence Community Failure, Not An ‘Encryption’ Problem.

Cross-posted to Slaw

11 things you should know about privacy


Privacy laws apply to every business that holds any information about individuals.

Here are 11 things you should know about privacy.

  1. There are many privacy statutes that may apply depending on the nature of the information, the nature of your business, and what province your customers are in. Health information, for example, is usually subject to different statutes than other personal information.
  2. In general, if you want to use someone’s personal information for something they would not think is necessary to provide your services, you need their permission.
  3. Mandatory breach notification is becoming more common. Some provincial statutes require it, and PIPEDA now includes breach notification provisions that will come into effect soon.  The notice requirements include some rather subjective tests, and must be reviewed carefully if you have a privacy breach.
  4. The definition of personal information is fairly broad. It includes things like an IP address, and depending on the jurisdiction, may include car license plates.
  5. You need to have a privacy policy that clearly describes what you collect and what you do with personal information. The nature and complexity of that policy will vary depending on the nature of your business, the nature of the information, and what you want to do with the personal information.
  6. You must have a privacy officer who is accountable and available to your customers.
  7. A privacy policy should cover your organization as a whole, not just your web site or one product.
  8. A privacy audit may be in order. Make sure you understand what information you actually do collect, use and disclose.  A disconnect between reality and what your policy says is a recipe for disaster.
  9. Privacy, anti-spam legislation (CASL), and Do Not Call legislation complement each other, work together, and shouldn’t be viewed in isolation.
  10. Some privacy laws (in particular some provincial laws dealing with public sector or health information) say that data can’t reside outside of Canada.
  11. Having processes and protections in place to keep personal information out of the wrong hands is crucial. It is equally crucial to deal with a privacy breach appropriately to reduce legal, customer, and headline risk.