January 28 is Data Privacy Day – “an international effort held annually on January 28 to create awareness about the importance of privacy and protecting personal information.”
The IAPP (International Association of Privacy Professionals) is honouring the day with local “Privacy After Hours” events on Thursday, January 26.
Privacy professionals in London are welcome to attend the event being held at McGinnis Landing restaurant. Harrison Pensa is pleased to provide the appetizers for the event.
You can sign up for the event on the IAPP website. You have to create an IAPP logon ID to register – which is quick and painless to do.
The 2016 Fashion Santa. Photo source: yorkdale.com
A Toronto mall and its former “Fashion Santa” are having a snowball fight over the character. The mall hired a new Fashion Santa this year instead of the person who played the role before. The dispute is over who owns the character and name. They even have duelling trademark applications for Fashion Santa.
In the end it comes down to the facts (including whether the individual is an employee or an independent contractor, and who developed the character) and the nature of any agreement that might exist.
While disputes over public characters make for good press, this kind of dispute is actually not that rare. Disputes often occur between individuals and the entities that hire them over who has rights to intellectual property.
Typically the individual claims they created something before they were hired, or that they are really an independent contractor providing a service. The business claims that it created the work on its own, that the employee merely had an idea the business then developed, or that the employee developed it as part of their duties. These issues can be difficult to sort out, as the facts are often fluid and subject to different points of view.
The best and easiest time to sort out ownership issues is at the beginning – and to put the agreement in writing. But it may not be on the parties’ minds then. Ownership and rights issues often become controversial only when something succeeds and money gets involved – such as the publicity and success of Fashion Santa.
Cross-posted to Slaw
The Supreme Court of Canada, in Royal Bank v Trang, made a privacy decision that will bring a sigh of relief to lenders and creditors.
A judgment creditor asked the sheriff to seize and sell a house to satisfy the judgment. To do that, the sheriff needed to know how much was owed on the mortgage on the house. The mortgage lender didn’t have express consent to provide the information, and said PIPEDA prevented it from giving it. Lower courts agreed.
But the SCC took a more practical approach. The issue was whether there was implied consent to release that personal information. The SCC said there was.
The Court interpreted implied consent from a broader perspective, looking at the entire situation, including the legitimate business interests of other creditors. Financial information is considered sensitive personal information, and thus generally faces a higher threshold for implied consent. But in this context, the Court held that a debtor reasonably expects a mortgage lender to provide a discharge statement to another creditor seeking to enforce its rights against that property.
Cross-posted to Slaw
Big data and privacy was one of the topics discussed at the Canadian IT Law Association conference this week. Some of the issues worth pondering include:
- Privacy principles say to collect only what you need, and to keep it only as long as needed. Big data says collect and retain as much as possible in case it proves useful.
- Accuracy is a basic privacy principle – but with big data accuracy is being replaced by probability.
- A fundamental privacy notion is informed consent for the use of one’s personal information. How do you have informed consent and control for big data uses when you don’t know what it might be used for or combined with?
- Probability means that the inferences drawn may not always be accurate. How do we deal with that if we as individuals are faced with erroneous inferences about us?
- If inferences are based on information that may itself be questionable, the results may be questionable. (The old garbage in, garbage out concept.) It has been proposed that for big data and AI, we might want to add to Asimov’s three laws of robotics that a system won’t discriminate, and that it will disclose its algorithm.
- If AI reaches conclusions that lead to discriminatory results, is that going to be dealt with by privacy regulators, or human rights regulators, or some combination?
- Should some of this be dealt with by ethical layers on top of privacy principles? Perhaps no-go zones for things felt to be improper, such as capturing audio and video without notice, charging to remove or amend information, or re-identifying anonymized information.
Cross-posted to Slaw
There have been many articles suggesting that lawyers should learn how to code software. This Wolfram Alpha article is a good one, although many of the others are far more adamant that every lawyer needs to learn how to code. The rationale is that software will have an increasing effect on how lawyers practice, and on who will be competing with us to provide our services, so we should learn to code.
So should we learn how to code? For most lawyers, probably not.
I knew how to code before law school, and for me it has been very useful. Since my practice is largely around IT issues, it has helped me understand those issues and discuss them with clients. It has also influenced my drafting style for both contract drafting and the way I communicate with clients.
But the thought that learning how to code will give us a leg up against competitors who are developing or adopting intelligent solutions to replace our services, or will help us develop our own systems to compete or become more efficient, is flawed. The systems that will have the biggest impact are based on artificial intelligence. That is very sophisticated, cutting-edge work, and learning how to code is not going to help with it. It is something we need to leave to the experts, or hire experts to do.
Lawyers interested in this can find resources discussing artificial intelligence and where it is headed (such as the Artificial Lawyer site and Twitter feed that posted the Wolfram Alpha article). Looking at where this is headed, and how it might affect the practice of law, would be more productive than learning how to code.
Cross-posted to Slaw
Google debuted new hardware on Tuesday – including new Pixel phones, and an Amazon Echo competitor called Google Home. A key thread to all this is their new Google Assistant replacement for Google Now. (Similar to Apple’s Siri and Microsoft’s Cortana.)
But the most noteworthy part is their comment that we are switching from “mobile first” to “AI first”. Over the past few years websites and online services have increasingly needed to be mobile friendly, so people can do what they want from whatever screen happens to be in front of them. Advances in artificial intelligence are going to put AI at the forefront.
Advances in voice recognition, natural language processing, artificial intelligence, and machine learning are leading to more accurate responses to voice commands. This makes it quicker and easier to find what we need without having to type a search query. The response can be presented as audio, on the screen we are using, or both.
And the more intelligent the tech becomes, the more context it understands. It knows what you are looking at on your screen, and it knows you. Indeed, Google says their goal is to build a personal Google for everyone.
Perhaps HAL or Westworld isn’t as far off as we think.
Cross-posted to Slaw
A ZDNet article entitled Cloud computing: Four reasons why companies are choosing public over private or hybrid clouds makes a case for the value of the public cloud.
- Innovation comes as standard with the public cloud
- Flexibility provides a business advantage
- External providers are the experts in secure provision
- CIOs can direct more attention to business change
This is all good – or mostly good.
The caveat is that cloud adoption can fail if a business moves to the cloud without thinking it through from the perspectives of mission criticality, security, privacy, and continuity. If a business runs a mission-critical system in the cloud and that system fails, the business could be out of business.
The IT manager no longer has to deal with day-to-day issues around keeping software and security up to date, but still has to consider higher-level issues.
It is important to understand what the needs are for the situation at hand. A system that is not mission critical, or does not contain sensitive information, for example, would not require as much scrutiny as a system that runs an e-commerce site.
Issues to consider include:
- how mission critical the system is
- the consequences of a short-term and a long-term outage
- how confidential or personal the information in the system is
- whether the information can be encrypted in transit and at rest
- how robust the vendor’s continuity plan is
- whether the business needs its own continuity plan – such as a local copy of the data
- how robust the vendor’s security is
- whether the vendor has third party security validation to accepted standards
- whether the vendor’s agreement backs these issues up with contractual terms and service levels with meaningful remedies
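The “encrypted in transit” item on the checklist above usually means TLS. As a minimal sketch (not anything the article prescribes), Python’s standard library shows what a sensible default looks like: a default SSL context requires certificate verification and hostname checking before any data moves, which is the baseline a business should expect from a cloud vendor’s connections.

```python
import ssl

# A default SSLContext enforces the two checks that make "in transit"
# encryption meaningful: the peer's certificate must verify against
# trusted CAs, and the hostname must match the certificate.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: peer certs must verify
print(ctx.check_hostname)                    # True: hostnames are checked
```

A context with these checks disabled still “encrypts” traffic, but offers no protection against an impersonating server – a useful distinction when reviewing a vendor’s security claims.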
Cross-posted to Slaw
CASL, the Canadian anti-spam legislation, came into force on July 1, 2014. July 1, 2017 will be an important date for CASL, as a private right of action will become available. Anyone (class actions are likely) will be able to sue CASL violators. Statutory damages mean that it won’t be necessary to prove actual damages.
CASL is a complex, illogical statute. Many businesses don’t comply because they don’t think emails they send could possibly be considered spam. After all, spam is about illicit drug, diet, and deal scams, right? Not according to CASL.
Nor do they understand they must keep detailed records to prove they have implied or express consent for each person they send an email to. Or they may be rolling the dice that they will be a low priority for CRTC enforcement. (That approach risks personal liability for directors and officers.)
Once the private right of action kicks in, the enforcement landscape changes. If a business has not yet come to grips with CASL, the spectre of private suits for violations may offer an incentive to comply.
In the long term, the private right of action could provide a couple of silver linings.
Getting CASL in front of the courts may provide some badly needed guidance on how to interpret and apply it in practice. So far, the handful of cases the CRTC has made public have not provided enough detail to help with that.
There is some thought that CASL could be struck down on constitutional grounds. Any business sued under the private right of action should include that in its defence.
The possibility of CASL being struck down should not, however, be a reason not to comply with CASL. It could take years before an action gets far enough to see that result. And that result is by no means assured.
Cross-posted to Slaw
The CRTC recently issued a media advisory entitled Enforcement Advisory – Notice for businesses and individuals on how to keep records of consent. It doesn’t add anything new – but reinforces what the CRTC is looking for. This is important because CASL requires a business to prove that they have consent to send a CEM (Commercial Electronic Message). CASL has a complex regime of express and implied consent possibilities.
The advisory states: “Commission staff has observed that some businesses and individuals are unable to prove they have obtained consent before sending CEMs. The purpose of this Enforcement Advisory is to remind those involved, including those who send CEMs, of the requirements under CASL pertaining to record keeping.”
The problem in practice is that keeping those records can be a herculean task. I’m concerned that the difficulty of getting this right will make many businesses fodder for CASL breach class action lawsuits when that right becomes available in 2017.
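To make the record-keeping burden concrete, here is a hypothetical sketch of what a minimal consent record might capture. CASL prescribes no particular format, and the CRTC advisory simply expects proof of how, when, and from whom consent was obtained; the field names below are purely illustrative assumptions.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Hypothetical minimal consent record. Neither CASL nor the CRTC
# advisory specifies a schema; these fields illustrate the kind of
# detail a business would need to prove consent for each recipient.
@dataclass
class ConsentRecord:
    address: str        # the electronic address consent applies to
    consent_type: str   # "express" or "implied"
    obtained_via: str   # e.g. "website signup form"
    obtained_at: str    # ISO-8601 timestamp, in UTC

rec = ConsentRecord(
    address="subscriber@example.com",
    consent_type="express",
    obtained_via="website signup form",
    obtained_at=datetime(2016, 10, 1, tzinfo=timezone.utc).isoformat(),
)
print(asdict(rec)["consent_type"])  # express
```

The hard part is not the structure but the discipline: a record like this has to exist, and stay current, for every address a business sends CEMs to, including implied-consent relationships that expire over time.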
My personal view continues to be that the prime effect of CASL is to add a huge compliance burden to legitimate businesses. It may give some tools to attack actual spam, but its approach is fundamentally flawed, and the cost/benefit is way out of whack.
Cross-posted to Slaw