Every time there is a tragic attack on people or property, there is a cry from various authorities or politicians for law enforcement to get unfettered access to all kinds of communication tools.
But that would cause far more harm than good, and is a really bad idea.
The argument goes something like this:
These bad actors hide behind encrypted communications to plan their evil deeds. Therefore, to stop them, law enforcement needs access to those communications, which means backdoors must be built into all encryption for law enforcement to use.
This is flawed in many ways.
There is no evidence that unfettered access to communications helps. Sometimes the information was actually available, but no one managed to put it together ahead of time to stop the evil deed.
There is no way that backdoors can be limited to use by law enforcement. They will inevitably be discovered by others and used for evil, thus rendering encryption and all the protection it provides useless.
Bad actors will stay a step ahead. If mainstream communications and encryption tools have backdoors, they will just create their own secure communications channels.
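The second flaw – that a backdoor can never be limited to law enforcement – can be illustrated with a toy sketch. This is not real cryptography (a simple XOR stands in for a proper cipher, and the escrow key is an invented placeholder); the point is only that a mandated "law enforcement" key is a single secret that unlocks every message in the system for whoever obtains it.

```python
import os

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy XOR stream cipher -- for illustration only, not secure.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

# Each user has their own random key...
user_key = os.urandom(32)
# ...but a backdoored system also holds one mandated escrow key
# (a hypothetical placeholder value, shared across the whole system).
escrow_key = b"GOVT-MASTER-KEY-0000000000000000"

# Every message is encrypted twice: once for the recipient,
# and once under the escrow key "for law enforcement".
msg = b"meet at noon"
for_recipient = encrypt(user_key, msg)
for_escrow = encrypt(escrow_key, msg)

# Law enforcement can read it -- but so can ANYONE who steals or
# leaks the escrow key, instantly exposing every user's messages.
assert decrypt(escrow_key, for_escrow) == msg
```

Unlike a user's individual key, the escrow key cannot be rotated or kept offline if it must be routinely available for interception, which is why such a key is inevitably a target and, eventually, a breach.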
But don’t just take my word for this. Read, for example, this article by security expert Bruce Schneier entitled Why We Encrypt.
And this article by Cory Doctorow on how ridiculous British Prime Minister David Cameron’s call to backdoor encryption is, entitled What David Cameron just proposed would endanger every Briton and destroy the IT industry.
And this article by Mike Masnick of Techdirt entitled The Paris Attacks Were An Intelligence Community Failure, Not An ‘Encryption’ Problem.
Cross-posted to Slaw
Privacy laws apply to every business that knows any information about individuals.
Here are several things you should know about privacy.
- There are many privacy statutes that may apply depending on the nature of the information, the nature of your business, and what province your customers are in. Health information, for example, is usually subject to different statutes than other personal information.
- In general, if you want to use someone’s personal information for something they would not think is necessary to provide your services, you need their permission.
- Mandatory breach notification is becoming more common. Some provincial statutes require it, and PIPEDA now includes breach notification provisions that will come into effect soon. The notice requirements include some rather subjective tests, and must be reviewed carefully if you have a privacy breach.
- The definition of personal information is fairly broad. It includes things like an IP address, and depending on the jurisdiction, may include car license plates.
- You must have a privacy officer who is accountable and available to your customers.
- A privacy audit may be in order. Make sure you understand what information you actually do collect, use and disclose. A disconnect between reality and what your policy says is a recipe for disaster.
- Privacy, anti-spam legislation (CASL), and Do Not Call legislation complement each other, work together, and shouldn’t be viewed in isolation.
- Some privacy laws (in particular some provincial laws dealing with public sector or health information) say that data can’t reside outside of Canada.
- Having processes and protections in place to keep personal information out of the wrong hands is crucial. It is equally crucial to deal with a privacy breach appropriately to reduce legal, customer, and headline risk.
I’ve seen complaints suggesting emails from those running in the federal election are spam. But CASL specifically exempts political emails from the definition of spam. A recent review of political emails by a mail service provider showed that the parties are not even trying to comply with the spirit of CASL – such as by having unsubscribe mechanisms and contact information.
It’s never been clear to me why those making laws think they deserve to be exempted from many laws they think businesses need to follow. Perhaps if they applied more laws to themselves, some laws might be a lot more user friendly (I challenge any politician or political party to fully comply with CASL and see what a pain it is), and we would be less perturbed with their communications and campaigns.
Here are a few laws that don’t apply to politicians that perhaps should:
- Do Not Call
- Signage bylaws
- Misleading advertising
Cross-posted to Slaw
The Information Technology and Innovation Foundation has released their analysis of how privacy advocates trigger waves of public fear about new technologies in a recurring “privacy panic cycle.”
The report is an interesting read and makes some valid points. In general, people fear new things more than things they are familiar with. Like a person who rarely flies being nervous about the flight, when statistically the most dangerous part of the journey is the drive to the airport.
While a privacy panic over emerging tech is indeed common, we can’t summarily dismiss that panic as having no basis. The key is to look at it on a principled basis, and compare the risks to those of existing technology.
New tech may very well have privacy issues that need to be looked at objectively, and privacy protections should be built into the tech from the start (an approach called privacy by design).
Even if the privacy fears are overblown, purveyors of the technology need to understand the panic and find a way to deflate the concerns.
Cross-posted to Slaw
The Internet of Things (IoT) is surrounded by a lot of hype. There is great promise to be able to do and know all sorts of things when all our stuff can communicate. That could be almost anything, including thermostats, cars, garage door openers, baby monitors, appliances, fitness trackers, and the list goes on. Cheap sensors and easy connectivity means that it is becoming trivial to measure everything and connect almost anything.
But with great promise comes great risk. Our things will generate information about us – both direct and inferred. There are security issues if these devices can be controlled by third parties or used as back doors to gain entry to other systems. It may not be a big deal if someone finds out the temperature of your house – but it is a big deal if they can go through your thermostat and get into your home network.
These privacy and security issues must be dealt with up front and built into the devices and ecosystem.
The Online Trust Alliance (members include ADT, AVG Technologies, Microsoft, Symantec, TRUSTe, Verisign) just released a draft IoT Trust Framework to address this issue. The draft is open for comments until September 14.
Cross-posted to Slaw
The Intercept has an article entitled Chatting in Secret While We’re All Being Watched that’s a good read for anyone interested in how to keep communications private. It was written by Micah Lee, who works with Glenn Greenwald to ensure their communications with Edward Snowden are private.
Even if you don’t want to read the detailed technical instructions on how to go about it, at least read the first part of the article that explains at a high level how communications can be intercepted, and the steps needed to stop that risk.
Communicating in secret is not easy. It takes effort to set it up, and it’s easy to slip up along the way. As is usually the case in any kind of security – physical or electronic – it’s about raising the difficulty level for someone to breach the security. The more effort someone might put into intercepting your communications, the more work it takes to keep them secret. For example, you raise the sophistication level of the thief who might burglarize your house as you increase security – from locking your doors, to deadbolts, to break resistant glass, to alarms, etc. It doesn’t take much extra security to make the thief go to another house, but it may take a lot more if a thief wants something specific in your house.
Edward Snowden’s communications, for example, require very diligent efforts, given the resources that various authorities might use to intercept those communications.
For the record, I think Snowden should be given a medal and a ticker tape parade, not jail time. I recommend watching Citizenfour, the documentary about Snowden that won the Academy Award for Best Documentary Feature at the 2015 Oscars. Also worth reading is security expert Bruce Schneier’s book Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World. Another book that puts this into context in Canada (based on my read of the introduction – I haven’t made it farther than that yet) is Law, Privacy and Surveillance in Canada in the Post-Snowden Era, edited by Michael Geist.
I challenge anyone to watch/read those and not be creeped out.
Cross-posted to Slaw
Several amendments were made last week to PIPEDA, the federal private sector privacy legislation. This has been sitting around in draft for a long time. Except for sections creating a new mandatory breach notification scheme, the amendments are now in force. The breach notification scheme requires some regulations before it comes into effect. More on that at the end of this post.
Several of these changes were long overdue, and bring PIPEDA more in line with some of the Provincial Acts that were drafted after PIPEDA.
Here are some of the highlights that are in force now:
- The business contact exception from the definition of personal information has been broadened.
- Provisions have been added to allow the transfer of personal information to an acquiring business for both diligence and closing purposes. Most parties have been approaching this in a similar way, but vendors, purchasers, and their counsel should make sure they comply with the exact requirements.
- A new section says consent is only valid if the individual would understand what they are consenting to. This speaks to the clarity of the explanation, and is particularly important when dealing with children.
- Several new exceptions allowing the collection, use and disclosure of personal information without consent have been added, such as witness statements, communications to next of kin of ill or deceased persons, and fraud prevention.
- The Commissioner now has a compliance agreement remedy.
The breach notification sections that come into effect at a later date include:
- Mandatory reporting to the Commissioner of a breach where “…it is reasonable in the circumstances to believe that the breach creates a real risk of significant harm to an individual.” That test is somewhat subjective, and will no doubt cause some consternation in practice. Guidance is included on relevant factors to consider and what constitutes “significant harm”.
- The report must contain certain information and be on a form that will be in the regulations yet to be released.
- Affected individuals must be similarly notified.
- Businesses will be required to maintain records of “… every breach of security safeguards involving personal information under its control”, and provide a copy to the Commissioner on request. Note that this is “every” breach without regard to a harm threshold. This could pose a challenging compliance issue for large organizations.
- The whistleblowing provision has been amended to allow a complainant to “request” that their identity be kept confidential.
- The section with the $100,000 fine for interfering with an investigation has been amended to make it an offence to contravene the reporting requirements. That will make the decision of whether a breach passes the reporting threshold a serious matter to ponder.
Cross-posted to Slaw
Canadians often look at intrusive, anti-privacy surveillance in other countries, and at things like the NSA and Patriot Act in the United States and think we are above that. But it is becoming apparent that Canada is just as bad. We need to do better than this and move the pendulum back towards individual rights and freedoms, and away from a surveillance society that does very little if anything to actually protect us.
For example, it recently came to light that the Communications Security Establishment, or CSE, Canada’s equivalent of the NSA, monitors and stores emails sent to Canadian government agencies.
This kind of surveillance is usually justified as being necessary to deal with terrorism and threats to national security, and its effects are downplayed by comments like “it’s just metadata” or “Canadians aren’t targeted.” But there does not seem to be any evidence that all this surveillance and collection actually prevents anything bad from happening. Metadata is every bit as personal, private, and informative as the data itself. And who is targeted does not change the fact that personal information on citizens is being collected and retained, and that this information has the potential to be abused and used for undesirable purposes.
Mathew Ingram puts it well in an article in the Globe entitled We can’t accept Internet surveillance as the new normal.
The only good news is that the ongoing revelations about the nature and type of spying – largely because of Edward Snowden – are creating a growing public backlash, and tech companies are working to make it harder to intercept communications. Bill C-51, the anti-terrorism bill currently at the hearing stage, is a case in point: it has attracted a huge amount of criticism, both over a lack of oversight and over the intrusiveness and potential abuse of authority that could result.
See, for example, this Huff Post article entitled Edward Snowden Warns Canadians To Be ‘Extraordinarily Cautious’ Over Anti-Terror Bill, and Michael Geist’s article entitled Why The Anti-Terrorism Bill is Really an Anti-Privacy Bill: Bill C-51’s Evisceration of Privacy Protection.
There is even a website dedicated to stopping the bill.
Cross-posted to Slaw.
The federal Privacy Commissioner has just released a report giving guidance on the privacy implications of police wearing body-worn cameras, and what police need to do to comply with privacy laws.
It points out that the issues around body-worn cameras are more complex than those for fixed cameras.
As is usually the case with privacy issues, it is about balance – in this case balancing the advantages of the cameras with privacy concerns.
The report has this to say about balance:
There are various reasons why a LEA might contemplate adopting BWCs. LEAs could view the use of BWCs as bringing about certain benefits to policing or other enforcement activities. For example, in addition to being used to collect evidence, BWCs have been associated with a decrease in the number of public complaints against police officers as well as a decrease in the use of force by police officers. At the same time, BWCs have significant privacy implications that need to be weighed against the anticipated benefits. As the Supreme Court of Canada has noted, an individual does not automatically forfeit his or her privacy interests when in public, especially given technological developments that make it possible for personal information “to be recorded with ease, distributed to an almost infinite audience, and stored indefinitely”. And as the Supreme Court added more recently, the right to informational privacy includes anonymity which “permits individuals to act in public places but to preserve freedom from identification and surveillance.”
It goes on to talk about the tests to determine if the intrusion is justified, and what uses and safeguards are appropriate.
It’s worth a read even if just for its general discussion of cameras and privacy.
Cross-posted to Slaw
Samsung has since clarified this language to explain that some voice commands may be transmitted to third parties to convert the command to text and make the command work. It also pointed out that you can choose to simply turn that feature off. That is similar to how Siri, Google Now, Cortana, and other voice command platforms work. Some voice commands are processed locally, and some may require processing in the cloud. How much is done locally, and how much in the cloud, varies depending on the platform and the nature of the command.
While one should never reach conclusions based on press reports, the probability is that this issue was way overblown. But it does show how challenging privacy issues can get when it comes to technology and the Internet of Things (IoT).
Issues to ponder include:
- The importance of designing privacy into tech – often called “Privacy by Design” – rather than trying to bolt it on later.
- How complex privacy is in the context of modern and future technology, where massive amounts of data are being collected on us from almost everything: fitness trackers, web browsers, smartphones, cars, thermostats, and appliances. Not to mention government surveillance such as the NSA and the Canadian CSE.
- The mothership issue – meaning where does all that information about us go, how much is anonymised, what happens to it when it gets there, and who gets to see or use it?
- How difficult it is to draft privacy language so it gives the business protection from doing something allegedly outside its policy – while at the same time not suggesting that it does unwanted things with information – while at the same time being clear and concise.
- How difficult it is for the average person to understand what is really happening with their information, and how much comfort comes – or doesn’t come – from a trust factor rather than a technical explanation.
- How easy it is for a business that may not be doing anything technically wrong – or may be doing the same as everyone else – to become vilified for perceived privacy issues.
- Have we lost the privacy war? Are we headed to a big brother world where governments and business amass huge amounts of information about us with creeping (and creepy) uses for it?
- Are we in a world of tech haves and have nots where those making the most use of tech will be the ones willing to cross the “freaky line” where the good from the use outweighs the bad from a privacy perspective?
- Are we headed to more situations where we don’t have control over our personal freaky line?
- Where is your personal freaky line?
Cross-posted to Slaw