Did Transport Canada just ground the Canadian hobbyist drone market?

Transport Canada has just put into force an order regarding the recreational use of model aircraft, enforceable by a $3,000 fine. Details are in the graphic below and on the Transport Canada website.

Operation of a drone over 35 kg, or for commercial use, has not changed, and still requires a Special Flight Operations Certificate.

Restrictions on flying near airports and aircraft are understandable.

But you can’t operate a model aircraft “at a lateral distance of less than 250 feet (75m) from buildings, structures, vehicles, vessels, animals and the public including spectators, bystanders or any person not associated with the operation of the aircraft”.

If we think about that, it leaves almost nowhere to fly. You can’t fly with a friend within 250 feet – unless somehow the friend is “associated with the operation of the aircraft”. And what is meant by not operating within 250 feet of animals? If you are in a remote area away from buildings and vehicles, there is likely to be some kind of animal nearby.

Given how restrictive these rules are, not many people will want to own one, and those who already own one may have trouble finding a place to fly it.

The Drone Manufacturers Alliance “believes new drone regulations announced today by Transport Canada will provide only a negligible increase in safety while sharply curtailing the ability of Canadians to explore, photograph their country, and teach their children about science and technology.”

They also said  “The Drone Manufacturers Alliance expects all our members’ customers to fly safely and responsibly, and our years of experience show that technology and education provide a better solution than a hastily-written ban.

Aviation authorities around the world have never recorded a single confirmed collision between a civilian drone and a traditional aircraft. Indeed, many initial drone sightings reported by aircraft pilots have turned out to be birds, balloons or even a plastic bag.”

The only realistic drones to purchase now in Canada are those that weigh 250 grams (0.55 pounds) or less, which are exempt from the rules. Drones that small may not be as capable as larger ones, but they do exist.

Cross-posted to Slaw

Researchers play along with “Tech Support” scam calls

Have you ever been tempted to play along with scammers who phone, just to see where it goes and to give them some grief? Researchers at the State University of New York at Stony Brook did that and more.

They sought out scammers who claim to be from Microsoft or some sort of official tech support, and followed the calls through to see what happened. They set up virtual machines that looked like normal PCs to the scammers who remoted in, and let the scam play out.

This Wired article has more detail, including the paper that the researchers wrote, and recordings of the conversations.  It is worth a read if you are curious about how they do it.

Basically, the scammer tells the victim that their computer is infected with viruses and spyware, then offers to clean it up for about $300.

Only about 2% of the people they talk to fall for the scam – but the revenue generated is in the tens of millions of dollars.

The US FTC has already used information provided by the researchers to obtain a $10 million penalty against a Florida-based call centre. About 10% of the call centres are in the US; 85% are in India.

Cross-posted to Slaw

Privacy Commissioner posts new case summaries

Privacy breaches and complaints can often be resolved cooperatively. We usually hear about the large, dramatic, far-reaching breaches more than the smaller ones that get resolved.

The privacy commissioner just released some examples.

In one example, a malfeasant socially engineered some information out of customer service representatives, which enabled the malfeasant to contact customers and try to obtain more information that could be used for fraud. The business investigated, contacted the individuals who may have been compromised, and took steps to reduce the chances of it happening again.

In another situation, a rogue employee took customer information, which was then used to impersonate the company and collect money from a customer. The business was not very responsive to the customer’s complaint until the privacy commissioner got involved. In the end the employee was dismissed, the customer was made whole, and steps were taken to reduce the chances of it happening again.

From a business perspective, it shows the need to take privacy complaints seriously, and deal with them quickly and effectively.

From a consumer perspective, it shows the need to be cautious when you are asked for your information – especially when someone contacts you.  And be patient when your service providers take steps to make sure you are who you say you are.

Cross-posted to Slaw.

Trump administration to roll back net neutrality

In 2015 the US FCC took steps to prevent ISPs from discriminating against internet traffic.  This is called Net Neutrality, which Wikipedia describes as “…the principle that Internet service providers and governments regulating the Internet should treat all data on the Internet the same, not discriminating or charging differentially by user, content, website, platform, application, type of attached equipment, or mode of communication.”

The gist of the concept is that the owner of the pipes shouldn’t be able to favour the delivery of its own content over content provided by others.

At the risk of oversimplifying this, net neutrality is generally favoured by consumers and content providers, but not so much by ISPs.

In what is seen as a backward step for US consumers, the new chair of the FCC has made it clear that he is not a fan of the principle.

For more detail, read this New York Times article titled Trump’s F.C.C. Pick Quickly Targets Net Neutrality Rules and this CNET article titled Meet the man who’ll dismantle net neutrality ‘with a smile’.

Cross-posted to Slaw

Trump’s executive order on foreigners strips privacy protection for Canadians

Included in Trump’s reprehensible executive order “Enhancing Public Safety in the Interior of the United States” was this:

Sec. 14.  Privacy Act.  Agencies shall, to the extent consistent with applicable law, ensure that their privacy policies exclude persons who are not United States citizens or lawful permanent residents from the protections of the Privacy Act regarding personally identifiable information.

The Privacy Act covers personal information held by US Federal agencies.  This would apply, for example, to information collected about Canadians entering the United States.

This should be attracting the wrath of the Canadian privacy commissioner and the Canadian government.

More detail is in this post by Michael Geist and this post on Open Media.

Given this attitude, we should be redoubling efforts to make sure our communications are encrypted.

Conventional wisdom has been that our data is just as safe in the US as in Canada, given that both countries place limits on privacy when it comes to law enforcement and government ability to dip into our information. But this cavalier attitude puts that into question, and it may be prudent for Canadian entities to keep their data in Canada to the extent possible. Where that isn’t practical, steps should be taken (and assurances obtained from vendors) to encrypt the data in a way that the provider doesn’t have access to it.
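As a sketch of what provider-blind encryption looks like in practice, here is a minimal Go example using AES-256-GCM from the standard library. The key is generated and held on the client side, so a cloud provider storing only the encrypted blob cannot read it. (The function names and the envelope format, with the nonce prefixed to the ciphertext, are illustrative choices, not any particular vendor’s API.)

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
)

// encrypt seals plaintext with AES-256-GCM under a key that only the
// client holds; the cloud provider only ever sees the returned blob.
func encrypt(key, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := rand.Read(nonce); err != nil {
		return nil, err
	}
	// Prefix the nonce so decrypt can recover it later.
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

// decrypt reverses encrypt using the same locally held key.
func decrypt(key, blob []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	n := gcm.NonceSize()
	if len(blob) < n {
		return nil, fmt.Errorf("blob too short")
	}
	return gcm.Open(nil, blob[:n], blob[n:], nil)
}

func main() {
	// 256-bit key generated and kept on the client side.
	key := make([]byte, 32)
	if _, err := rand.Read(key); err != nil {
		panic(err)
	}
	blob, err := encrypt(key, []byte("client record"))
	if err != nil {
		panic(err)
	}
	recovered, err := decrypt(key, blob) // happens back on the client
	if err != nil {
		panic(err)
	}
	fmt.Println(string(recovered))
}
```

The point of the design is simply that the provider stores `blob` while the key never leaves the client; lose the key and the data is gone, which is the trade-off for keeping the provider blind.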

Cross-posted to Slaw

The end of cloud computing

That’s the title of a 25 minute video that is worth watching if you have an interest in where computing is going.

Don’t panic if you have just decided to do more of your business computing in the cloud. That isn’t going away any time soon.

It means that we will see more edge or fog computing. Some of the computation that now happens in the cloud will increasingly happen at the edge of the network. That might be in IOT devices, our phones, cars, or Alexa-type devices. Think of it as a return to distributed computing. Peer-to-peer networks will become more common as well – such as cars that talk directly to each other so they can drive more safely near each other.

In part this is because devices are becoming more capable. For example, artificial intelligence currently must use the cloud to figure out some queries – think of Siri or Alexa sending your queries to the cloud. Hardware and software advances will make it possible to do more of this at the endpoint, such as directly on your phone. (That might have a side benefit of helping on the privacy front.)

Edge computing is in part being driven by necessity.  The sheer number of devices generating data, and the volumes of data they will generate, will be overwhelming.  For some applications, the cloud is simply not fast enough or reliable enough.  It is one thing if it takes a couple of seconds to get your answer back on the weather forecast.  But a self-driving car needs to react instantly to stop when someone steps off a curb in front of it.

The cloud will be where learning occurs, and where much of the data resides, but data curation and decision making will be done at the edge.

Cross-posted to Slaw

10 things to watch for at the intersection of Tech and Law in 2017

  1. CASL, Canada’s anti-spam legislation, has been with us since July 2014. It’s a terrible piece of legislation for many reasons. In July 2017 a private right of action becomes effective that will allow anyone who receives spam as defined by CASL to sue the sender. CASL sets out statutory damages, so the complainant does not have to prove any damages. Class actions will no doubt be launched. The sad part is that breaches of CASL are to a large extent breaches of the very technical requirements of the statute, rather than the sending of what most people would call spam. At some point in 2017 we may see a court decision that ponders CASL’s legality.
  2. PIPEDA, Canada’s general privacy law, has been amended to require mandatory notice to the privacy commissioner and/or possible victims when there is a serious privacy breach. This is on hold pending finalization of the regulations – and may be in effect before the end of 2017.
  3. Privacy in general will continue to be put under pressure by politicians and law enforcement officials who want to advance the surveillance state. The good news is that privacy advocates continue to push back. A UK court, for example, decided that some recent UK surveillance legislation went too far. The Snowden revelations have spurred most IT businesses to use more effective encryption. Unfortunately, I don’t think it is safe to predict that President Obama will pardon Snowden.
  4. Canada’s trademark registration process will undergo substantive change in 2018 – some good, some not so good. In 2017 the regulations and processes should be finalized, giving us more detail about how it will work in practice.
  5. We will hear a lot about security issues around the internet of things, or IOT. IOT devices can be a gateway to mayhem. IOT things include such disparate devices as thermostats, light switches, home appliances, door locks, and baby monitors. The problem is that far too often designers of IOT devices don’t design security into them. That makes it easy for malfeasants to use these devices to break into whatever networks they are connected to.
  6. Artificial Intelligence, or AI, will continue to creep in everywhere. AI is now employed in many things we use – ranging from Google Translate to semi-autonomous cars. Voice-controlled screen and non-screen interactions – which use AI – are on the rise.
  7. AI is starting to be used in tools that lawyers use, and for tools that will replace lawyers in some areas. In 2017, we will start to see some major upheavals in the practice of law, and how people get their legal needs met. At some point every lawyer (and knowledge workers in general) will have a holy cow moment when they realize the impact of AI on their profession. AI will make inroads in things like legal research, and contract generation. It will also foster the provision of legal services online by non-lawyers to a vast underserved market that won’t pay lawyers on the current business model. These services may not be quite as good as those provided by lawyers, but consumers will be happy to pay less for what they perceive as good enough. And the quality, breadth, and sophistication of these services will continue to improve as AI improves.
  8. Another AI issue we will hear about in 2017 is embedded bias and discrimination. AI makes decisions not on hard coded algorithms, but rather learns from real world data and how things react to it. That includes how humans make decisions and respond and react to things. It thus tends to pick up whatever human bias and discrimination exists. That is a useful thing if the purpose is to predict human reactions or outcomes, like an election. But it is a bad thing if the AI makes decisions that directly affect people such as who to hire or promote, who might be criminal suspects, and who belongs on a no-fly list.
  9. The cloud has finally matured and will be adopted by more businesses in 2017. Most major international players now have data centres in Canada, which helps raise the comfort level for Canadian businesses. Many CIOs now realize that putting everything in the cloud makes life easier, as it can simplify business continuity, scalability, mobility, upgrades, and security. Care must be taken to make sure that the right solutions are chosen and implemented properly – but there are compelling reasons why it can be better than doing it yourself.
  10. The youngest generation in the workforce is always online, connected, and communicating, and expects their workplace to fit their lifestyle and not the other way around. Firms that embrace that will get the best and the brightest of the rising stars. It used to be that business tech was ahead of consumer tech, but that trend has been reversing for some time. More workers will get frustrated when they can do more with their own devices and apps than their corporate ones. That can lead to business challenges in areas such as security – but these challenges around rogue tech in the workplace have been around for decades.

Cross-posted to Slaw

Even Santa needs to get it in writing


The 2016 Fashion Santa. Photo source: yorkdale.com

A Toronto mall and its former “Fashion Santa” are having a snowball fight over the character.  The mall hired a new Fashion Santa this year instead of the person who played the role before. The dispute is over who owns the character and name. They even have duelling trademark applications for Fashion Santa. 

In the end it comes down to the facts (including whether the individual is an employee or an independent contractor, and who developed the character) and the nature of any agreement that might exist.

While disputes over public characters make for good press, this kind of dispute is actually not that rare. Disputes often occur between individuals and the entities that hire them over who has rights to intellectual property.

Typically the individual claims they created something before they were hired, or that they were really an independent contractor providing a service. The business claims either that it created the property on its own, that the employee merely had an idea that the business then developed, or that the employee developed it as part of the employee’s duties. These issues can be difficult to sort out, as the facts are often fluid and subject to different points of view.

The best and easiest time to sort out ownership issues is at the beginning – and to put the agreement in writing. But it may not be on the parties’ minds then. Ownership and rights issues often become controversial only when something becomes successful and money gets involved – such as with the publicity and success of Fashion Santa.

Cross-posted to Slaw

SCC renders practical privacy decision on mortgage information

The Supreme Court of Canada, in Royal Bank v Trang, made a privacy decision that will bring a sigh of relief to lenders and creditors.

A judgment creditor asked the sheriff to seize and sell a house to satisfy the judgment. To do that, the sheriff needed to know how much was owed on the mortgage on the house. The mortgage lender didn’t have express consent to provide the information, and said PIPEDA prevented it from doing so. Lower courts agreed.

But the SCC took a more practical approach.  The issue was whether there was implied consent to release that personal information.  The SCC said there was.

They interpreted implied consent in a broader perspective, looking at the entire situation, including the legitimate business interests of other creditors.  Financial information is considered to be sensitive personal information, and thus in general faces a higher threshold for implied consent.  But in this context, they held that it is a reasonable expectation of a debtor for a mortgage lender to provide a discharge statement to another creditor wanting to enforce its rights against that property.

Cross-posted to Slaw

Big data privacy challenges

Big data and privacy was one of the topics discussed at the Canadian IT Law Association conference this week.  Some of the issues worth pondering include:

  • Privacy principles say one should collect only what one needs, and keep it only as long as needed. Big data says collect and retain as much as possible in case it is useful.
  • Accuracy is a basic privacy principle – but with big data accuracy is being replaced by probability.
  • A fundamental privacy notion is informed consent for the use of one’s personal information.  How do you have informed consent and control for big data uses when you don’t know what it might be used for or combined with?
  • Probability means that the inferences drawn may not always be accurate.  How do we deal with that if we as individuals are faced with erroneous inferences about us?
  • If the results are based on information that may itself be questionable, they may be questionable too (the old garbage in, garbage out concept). It has been proposed that for big data and AI, we might want to add to Asimov’s three laws of robotics that AI won’t discriminate, and that it will disclose its algorithm.
  • If AI reaches conclusions that lead to discriminatory results, is that going to be dealt with by privacy regulators, or human rights regulators, or some combination?
  • Should some of this be dealt with by ethical layers on top of privacy principles? Perhaps no-go zones for things felt to be improper, such as capturing audio and video without notice, charging to remove or amend information, or re-identifying anonymized information.
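The re-identification risk in the last bullet is easy to demonstrate. The sketch below (with made-up record types and data) joins an “anonymized” dataset against a public one on shared quasi-identifiers, the classic linkage attack:

```go
package main

import "fmt"

// A record from a dataset released "anonymized": names stripped,
// but quasi-identifiers (birth date, postal code prefix) retained.
type HealthRecord struct {
	BirthDate string
	Postal    string
	Diagnosis string
}

// A public record pairing the same quasi-identifiers with a name,
// e.g. from a voter list or a social-media profile.
type PublicRecord struct {
	Name      string
	BirthDate string
	Postal    string
}

// reidentify joins the two datasets on the shared quasi-identifiers,
// returning a map of name -> diagnosis for every match.
func reidentify(health []HealthRecord, public []PublicRecord) map[string]string {
	index := make(map[string]string) // quasi-identifier key -> name
	for _, p := range public {
		index[p.BirthDate+"|"+p.Postal] = p.Name
	}
	matches := make(map[string]string)
	for _, h := range health {
		if name, ok := index[h.BirthDate+"|"+h.Postal]; ok {
			matches[name] = h.Diagnosis
		}
	}
	return matches
}

func main() {
	health := []HealthRecord{{"1970-03-14", "M5V", "condition X"}}
	public := []PublicRecord{{"Jane Doe", "1970-03-14", "M5V"}}
	// The "anonymized" record is now attached to a name.
	fmt.Println(reidentify(health, public))
}
```

The join is nothing more than a hash-map lookup, which is the point: stripping names while leaving unique combinations of attributes in place is often no anonymization at all.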

Cross-posted to Slaw