Privacy by Design is Crucial to Avoid IoT Disasters


If anyone doubts that Privacy by Design is a fundamentally important principle, consider these two recent articles.

This Wired article describes a hack, to be detailed at the upcoming Defcon conference, that can easily read and inject keystrokes on wireless keyboards that are not Bluetooth.  So you might want to consider replacing any non-Bluetooth wireless keyboards you have.

Security expert Bruce Schneier wrote this article entitled The Internet of Things Will Turn Large-Scale Hacks into Real World Disasters that explains the IoT risks. The fundamental problem is that not enough attention is being paid to security for IoT devices.  That leaves the door open to situations where a hacker can, for example, easily get into your thermostat and then use it as a connection point to your network.  Cory Doctorow of Boing Boing refers to this as a coming IoT security dumpster-fire.

Bruce describes it this way:

The Internet of Things is a result of everything turning into a computer. This gives us enormous power and flexibility, but it brings insecurities with it as well. As more things come under software control, they become vulnerable to all the attacks we’ve seen against computers. But because many of these things are both inexpensive and long-lasting, many of the patch and update systems that work with computers and smartphones won’t work. Right now, the only way to patch most home routers is to throw them away and buy new ones. And the security that comes from replacing your computer and phone every few years won’t work with your refrigerator and thermostat: on the average, you replace the former every 15 years, and the latter approximately never. A recent Princeton survey found 500,000 insecure devices on the internet. That number is about to explode.


Cross-posted to Slaw

Emerging tech – potentially awesome and a privacy quagmire

I attended an event last night where Duncan Stewart of Deloitte talked about their TMT predictions for 2016.

It reinforced for me that the future of tech, and what it will do for us, is potentially awesome.  But at the same time, the amount of information being collected and stored about each of us is staggering.  That creates real privacy challenges, and real possibilities for abuse.  And because the information is there, government and business alike are tempted to use it.

One scary aspect is that the more we get used to more information being collected about us, the more complacent we get.  Our personal freaky line – the line at which we stop using services because we are concerned about privacy issues – moves a little farther away.  That is in spite of the fact that the more information there is about us, the more ripe for abuse it is, and the more that we temper or alter our behaviour because we know we are being watched.

Think for a moment about all the information that is increasingly being collected about us.

  • Smartphones that know our every move and the most intimate and personal aspects of our lives.
  • Intelligent cars that know where we go and how we drive.
  • The internet of things where the stuff we own collects information about us.
  • Wearable tech that collects information about our fitness, and increasingly our health.
  • The trend for services to be performed in the cloud rather than locally, with our information stored in various motherships.
  • Big data that functions by saving as much information as possible.
  • Artificial intelligence and cognitive learning tools that can turn data into useful information and make inferences based on seemingly unconnected information.
  • Blockchain technology that has the potential to record surprising things about us.

On top of all this, it is becoming ever harder to understand when our info stays on our device, when it goes somewhere else, how long it stays there, who has access to it, when it is encrypted, and who has access to the encryption keys.

It is in this context – and given that we simply don’t have the time to understand and make all the privacy choices we need to make – that the Privacy Commissioner of Canada last week released a discussion paper titled Consent and privacy: A discussion paper exploring potential enhancements to consent under the Personal Information Protection and Electronic Documents Act.

The introduction states in part:

PIPEDA is based on a technologically neutral framework of ten principles, including consent, that were conceived to be flexible enough to work in a variety of environments. However, there is concern that technology and business models have changed so significantly since PIPEDA was drafted as to affect personal information protections and to call into question the feasibility of obtaining meaningful consent.

Indeed, during the Office of the Privacy Commissioner’s (OPC’s) Privacy Priority Setting discussions in 2015, some stakeholders questioned the continued viability of the consent model in an ecosystem of vast, complex information flows and ubiquitous computing. PIPEDA predates technologies such as smart phones and cloud computing, as well as business models predicated on unlimited access to personal information and automated processes. Stakeholders echoed a larger global debate about the role of consent in privacy protection regimes that has gained momentum as advances in big data analytics and the increasing prominence of data collection through the Internet of Things start to pervade our everyday activities.

Cross-posted to Slaw

Enemy of the State – still topical

I recently watched the 1998 movie Enemy of the State.  It is a spy thriller about a lawyer being smeared by politicians who believe he has information that can implicate them in a crime – the murder of a politician who was opposing a privacy bill that is really a bill empowering mass surveillance.  They use sophisticated, unsavoury, unethical, and illegal methods to watch him, discredit him, and retrieve the evidence.  No one is watching the watchers, who are out of control.

While, like any disaster movie, the plot is a bit over the top, it was fascinating to watch the movie again through a 2016 lens.  I challenge anyone to watch it and still dismiss privacy and surveillance concerns with “I have nothing to hide.”

In a related vein, a recent study confirms that the knowledge that we may be watched has a chilling effect on what we do.  This Techdirt article is a good summary of that study.


Cross-posted to Slaw.

Panama Papers – Points to Ponder

The Panama papers revelations are worth pondering on many levels. (This Wired article is a good summary.)

My first reaction to the high level tax evasion and corruption allegations was to blanch at the thought that someone had basically given the entire contents of a law firm’s document management system to a third party.

As a lawyer, the fact that law firm files were leaked causes me to wince. After all, solicitor-client privilege is a fundamental tenet of democratic society. Law firms take the security of their files very seriously, and getting access to this information should not have been an easy task.

This has parallels to the Snowden leaks. I’ve said before that Snowden should be congratulated, not prosecuted.

But this is not the same.

Snowden leaked information about one government entity. This is a leak of personal, sensitive, and confidential information about thousands of individuals and corporations. Some of the activities exposed by the press are no doubt illegal or unethical; some may spark debate over where the line should be between tax avoidance and tax evasion, and over tax havens in general.

But that does not justify this kind of breach to the press.

Unfortunately this has created a smell test under which anyone who has an offshore company, or any business such as a law firm that is involved in creating them, gets unfairly tarred with suspicion.

According to press reports, the journalists won’t release the actual documents, to respect the privacy of the innocent. That’s good – but it shouldn’t be a decision that a journalist has to, or gets to, make.

Apple fought the FBI to keep phones secure.  In that case the end the FBI was seeking did not justify the means, largely because it would put the information of everyone using an iPhone at risk. So how is this leak, which exposes the legal files of thousands of people, any different? One minute we are applauding security and privacy – and the next we seem to be applauding a massive breach of both.

It is too easy to dismiss this as a risk that is peculiar to law firms in tax havens that are perceived to facilitate unsavoury activities. Has this perhaps put a bigger target on law firms for both inside and outside hackers?

An IT security firm told me this morning that they have been contacted by a number of law firms wondering what shape their security measures are in, in light of the Panama Papers.

Perhaps law firms everywhere should take another look at their security measures to reduce the chances this could happen to them.

Cross-posted to Slaw

Apple fights court imposed FBI backdoor order

Apple CEO Tim Cook has taken a very public stand against an FBI request and court order to create a backdoor into the Apple operating system.  This arose from the investigation into the San Bernardino mass shooting last December.

See this article on ZDNet for more details.  And read Tim Cook’s customer letter posted on the Apple website for a more complete explanation of Apple’s position.

Kudos to Tim Cook and Apple for this.

Security and privacy experts continue to point out that backdoors are a bad idea that cause far more harm than good.

See, for example, this ZDNet article from yesterday about a new report saying “European cybersecurity agency ENISA has come down firmly against backdoors and encryption restrictions, arguing they only help criminals and terrorists while harming industry and society.”

Cross-posted to Slaw

Encryption = good : Backdoor = bad

Every time there is a tragic attack on people or property, there is a cry from various authorities or politicians for law enforcement to get unfettered access to all kinds of communication tools.

But that would cause far more harm than good, and is a really bad idea.

The argument goes something like this:

These bad actors hide behind encrypted communications to plan their evil deeds.  To stop them, law enforcement needs access to those communications.  Therefore backdoors must be built into all encryption for law enforcement to use.

This is flawed in many ways.

There is no evidence that unfettered access to communications helps.  Sometimes the information was actually available, but no one managed to put it together ahead of time to stop the evil deed.

There is no way that backdoors can be limited to use by law enforcement.  They will inevitably be discovered by others and used for evil, thus rendering encryption and all the protection it provides useless.

Bad actors will stay a step ahead.  If mainstream communications and encryption tools have backdoors, they will just create their own secure communications channels.
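To make the second point concrete, here is a toy Python sketch of why a backdoor cannot be limited to law enforcement. It uses a throwaway XOR cipher purely for illustration – this is not real cryptography, and the “key escrow” design shown is a hypothetical example – but the structural point holds: a backdoor is just another secret key, and whoever steals it reads everything it protects.

```python
import os

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy XOR "cipher" for illustration only -- NOT secure.
    # Applying it twice with the same key recovers the original data.
    return bytes(d ^ k for d, k in zip(data, key))

# In a mandated "key escrow" scheme, every message is encrypted
# a second time under a master key held for law enforcement.
escrow_key = os.urandom(32)                     # the backdoor key

message = b"meet at noon"
session_key = os.urandom(len(message))          # per-message key
ciphertext_for_recipient = xor_cipher(session_key, message)
ciphertext_for_escrow = xor_cipher(escrow_key[:len(message)], message)

# Whoever obtains escrow_key -- an investigator, a rogue insider,
# or a hacker who breaches the escrow server -- can decrypt every
# escrowed message ever sent, not just the one under warrant:
stolen_key = escrow_key
recovered = xor_cipher(stolen_key[:len(message)], ciphertext_for_escrow)
assert recovered == message
```

The sketch shows why the risk is structural rather than a matter of policy: the escrow key is a single point of failure, and one compromise of it exposes all traffic retroactively.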

But don’t just take my word for this.  Read, for example, this article by security expert Bruce Schneier entitled Why we Encrypt.

And this article by Cory Doctorow, entitled What David Cameron just proposed would endanger every Briton and destroy the IT industry, on how ridiculous the British Prime Minister’s comments on the need to backdoor encryption are.

And this article by Mike Masnick of Techdirt entitled The Paris Attacks Were An Intelligence Community Failure, Not An ‘Encryption’ Problem.

Cross-posted to Slaw

11 things you should know about privacy


Privacy laws apply to every business that knows any information about individuals.

Here are 11 things you should know about privacy.

  1. There are many privacy statutes that may apply depending on the nature of the information, the nature of your business, and what province your customers are in. Health information, for example, is usually subject to different statutes than other personal information.
  2. In general, if you want to use someone’s personal information for something they would not think is necessary to provide your services, you need their permission.
  3. Mandatory breach notification is becoming more common. Some provincial statutes require it, and PIPEDA now includes breach notification provisions that will come into effect soon.  The notice requirements include some rather subjective tests, and must be reviewed carefully if you have a privacy breach.
  4. The definition of personal information is fairly broad. It includes things like an IP address, and depending on the jurisdiction, may include car license plates.
  5. You need to have a privacy policy that clearly describes what you collect and what you do with personal information. The nature and complexity of that policy will vary depending on the nature of your business, the nature of the information, and what you want to do with the personal information.
  6. You must have a privacy officer who is accountable and available to your customers.
  7. A privacy policy should cover your organization as a whole, not just your web site or one product.
  8. A privacy audit may be in order. Make sure you understand what information you actually do collect, use and disclose.  A disconnect between reality and what your policy says is a recipe for disaster.
  9. Privacy, anti-spam legislation (CASL), and Do Not Call legislation complement each other, work together, and shouldn’t be viewed in isolation.
  10. Some privacy laws (in particular some provincial laws dealing with public sector or health information) say that data can’t reside outside of Canada.
  11. Having processes and protections in place to keep personal information out of the wrong hands is crucial. It is equally crucial to deal with a privacy breach appropriately to reduce legal, customer, and headline risk.

Laws that politicians are NOT bound by

I’ve seen complaints suggesting emails from those running in the federal election are spam. But CASL specifically exempts political emails from the definition of spam. A recent review of political emails by a mail service provider showed that the parties are not even trying to comply with the spirit of CASL – for example, by including unsubscribe mechanisms and contact information.

It’s never been clear to me why those making laws think they deserve to be exempted from many of the laws they think businesses need to follow. Perhaps if they applied more laws to themselves, some laws would be a lot more user-friendly (I challenge any politician or political party to fully comply with CASL and see what a pain it is), and we would be less perturbed by their communications and campaigns.

Here are a few laws that don’t apply to politicians but perhaps should:

  • CASL
  • Privacy
  • Do Not Call
  • Signage bylaws
  • Misleading advertising

Cross-posted to Slaw

Privacy Panic Cycle

The Information Technology and Innovation Foundation has released its analysis of how privacy advocates trigger waves of public fear about new technologies in a recurring “privacy panic cycle.”

The report is an interesting read and makes some valid points.  In general, people fear new things more than things they are familiar with – like the infrequent flyer who is nervous about the flight when, statistically, the most dangerous part of the journey is the drive to the airport.

While a privacy panic over emerging tech is indeed common, we can’t summarily dismiss that panic as having no basis.  The key is to look at it on a principled basis, and to compare the risks to those of existing technology.

New tech may very well have privacy issues that need to be looked at objectively, and privacy protections should be built into the tech from the start (called privacy by design).

Even if the privacy fears are overblown, purveyors of the technology need to understand the panic and find a way to deflate the concerns.

Cross-posted to Slaw

Internet of Things Security Standard Proposal

The Internet of Things (IoT) is surrounded by a lot of hype.  There is great promise in being able to do and know all sorts of things when all our stuff can communicate.  That stuff could be almost anything: thermostats, cars, garage door openers, baby monitors, appliances, fitness trackers, and the list goes on.  Cheap sensors and easy connectivity mean that it is becoming trivial to measure everything and connect almost anything.

But with great promise comes great risk.  Our things will generate information about us – both direct and inferred.  There are security issues if these devices can be controlled by third parties or used as backdoors to gain entry to other systems.  It may not be a big deal if someone finds out the temperature of your house – but it is a big deal if they can go through your thermostat and get into your home network.

These privacy and security issues must be dealt with up front and built into the devices and ecosystem.

The Online Trust Alliance (members include ADT, AVG Technologies, Microsoft, Symantec, TRUSTe, Verisign) just released a draft IoT Trust Framework to address this issue.  The draft is open for comments until September 14.

Cross-posted to Slaw