The Surveillance Society is already here

Canadians often look at intrusive, anti-privacy surveillance in other countries, and at things like the NSA and the Patriot Act in the United States, and think we are above that. But it is becoming apparent that Canada is just as bad. We need to do better than this and move the pendulum back towards individual rights and freedoms, and away from a surveillance society that does very little, if anything, to actually protect us.

For example, it recently came to light that the Communications Security Establishment, or CSE, Canada’s equivalent of the NSA, monitors and stores emails sent to Canadian government agencies.

This kind of surveillance is usually justified as being necessary to deal with terrorism and threats to national security, and its effects are downplayed by comments like “it’s just metadata” or “Canadians aren’t targeted.” But there does not seem to be any evidence that all this surveillance and collection actually prevents anything bad from happening. Metadata is every bit as personal, private, and informative as the data itself. And who is targeted does not change the fact that personal information on citizens is being collected and retained, and that this information has the potential to be abused and used for undesirable purposes.

Mathew Ingram puts it well in an article in the Globe entitled We can’t accept Internet surveillance as the new normal.

The only good news is that the ongoing revelations about the nature and extent of the spying – largely because of Edward Snowden – are creating a growing public backlash, and tech companies are working to make it harder to intercept communications. Bill C-51, the anti-terrorism bill currently at the hearing stage, is a case in point: it has attracted a huge amount of criticism, both over a lack of oversight and over the intrusiveness and potential abuse of authority that could result.

See, for example, this Huff Post article entitled Edward Snowden Warns Canadians To Be ‘Extraordinarily Cautious’ Over Anti-Terror Bill, and Michael Geist’s article entitled Why The Anti-Terrorism Bill is Really an Anti-Privacy Bill: Bill C-51’s Evisceration of Privacy Protection.

There is even a website dedicated to stopping the bill.

Cross-posted to Slaw.

Privacy Commissioner issues guidance on police body cameras

The federal Privacy Commissioner has just released a report giving guidance on the privacy implications of police use of body-worn cameras, and on what police need to do to comply with privacy laws.

It points out that the issues around body-worn cameras are more complex than those raised by fixed cameras.

As is usually the case with privacy issues, it is about balance – in this case balancing the advantages of the cameras with privacy concerns.

The report has this to say about balance:

There are various reasons why a LEA might contemplate adopting BWCs. LEAs could view the use of BWCs as bringing about certain benefits to policing or other enforcement activities.  For example, in addition to being used to collect evidence, BWCs have been associated with a decrease in the number of public complaints against police officers as well as a decrease in the use of force by police officers.  At the same time, BWCs have significant privacy implications that need to be weighed against the anticipated benefits.  As the Supreme Court of Canada has noted, an individual does not automatically forfeit his or her privacy interests when in public, especially given technological developments that make it possible for personal information “to be recorded with ease, distributed to an almost infinite audience, and stored indefinitely”. And as the Supreme Court added more recently, the right to informational privacy includes anonymity which “permits individuals to act in public places but to preserve freedom from identification and surveillance.”

It goes on to talk about the tests to determine if the intrusion is justified, and what uses and safeguards are appropriate.

It’s worth a read, even if just for its general discussion of cameras and privacy.

Cross-posted to Slaw

http://harrisonpensa.com/lawyers/david-canton

Big Brother in your TV? 10 “freaky line” things to think about

There has been a big kerfuffle in the last few days over the thought that Samsung smart TVs are listening to and recording TV watchers’ conversations via their voice command feature.  That arose from a clause in Samsung’s privacy policy that said in part “…if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition.”

Samsung has since clarified this language to explain that some voice commands may be transmitted to a third party to convert the command to text and make it work, and to point out that you can simply turn that feature off.  That is similar to how Siri, Google Now, Cortana, and other voice command platforms work: some voice commands are processed locally, and some may require processing in the cloud.  How much is done locally and how much in the cloud varies depending on the platform and the nature of the command.
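
To make that local-versus-cloud split concrete, here is a minimal sketch in Python. It is purely illustrative, with hypothetical command lists and function names; it is not Samsung’s, Apple’s, Google’s, or anyone else’s actual pipeline. The point is simply that short, fixed commands can be recognized on the device, while anything open-ended tends to be shipped to a third-party speech service, and that is the step where your spoken words leave the room.

# Illustrative sketch only: hypothetical names, not any vendor's real implementation.

LOCAL_COMMANDS = {"volume up", "volume down", "mute", "channel up", "channel down"}

def recognize_locally(audio: bytes) -> str | None:
    """Stand-in for a small on-device recognizer that only knows fixed phrases."""
    phrase = audio.decode(errors="ignore").strip().lower()
    return phrase if phrase in LOCAL_COMMANDS else None

def recognize_in_cloud(audio: bytes) -> str:
    """Stand-in for a third-party speech-to-text service.
    This is the step where the recording is transmitted off the device."""
    return audio.decode(errors="ignore").strip().lower()

def handle_voice_command(audio: bytes) -> str:
    text = recognize_locally(audio)
    if text is not None:
        return "handled on-device: " + text
    return "sent to cloud, then handled: " + recognize_in_cloud(audio)

print(handle_voice_command(b"mute"))                            # stays local
print(handle_voice_command(b"find me a comedy from the 1980s")) # goes to the cloud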

While one should never reach conclusions based on press reports alone, this issue was probably way overblown.  But it does show how challenging privacy issues can get when it comes to technology and the internet of things (IoT).

Issues to ponder include:

  1. The importance of designing privacy into tech – often called “Privacy by Design” – rather than trying to bolt it on later.
  2. How complex privacy is in the context of modern and future technology, where massive amounts of data are being collected on us from almost everything around us, including fitness trackers, web browsers, smartphones, cars, thermostats, and appliances.  Not to mention government surveillance such as the NSA and the Canadian CSE.
  3. The mothership issue – meaning where does all that information about us go, how much is anonymised, what happens to it when it gets there, and who gets to see or use it?
  4. How difficult it is to draft privacy language so that it protects the business if it does something allegedly outside its policy, while not suggesting that it does unwanted things with information, and while remaining clear and concise.
  5. How difficult it is for the average person to understand what is really happening with their information, and how much comfort comes – or doesn’t come – from a trust factor rather than a technical explanation.
  6. How easy it is for a business that may not be doing anything technically wrong, or may be doing the same as everyone else, to become vilified for perceived privacy issues.
  7. Have we lost the privacy war? Are we headed to a big brother world where governments and business amass huge amounts of information about us with creeping (and creepy) uses for it?
  8. Are we in a world of tech haves and have-nots, where those making the most use of tech will be the ones willing to cross the “freaky line”, the point where the good from the use outweighs the bad from a privacy perspective?
  9. Are we headed to more situations where we don’t have control over our personal freaky line?
  10. Where is your personal freaky line?

Cross posted to Slaw

Happy Data Privacy Day

From the Privacy Commissioner of Canada: “On January 28, Canada, along with many countries around the world, will celebrate Data Privacy Day. Recognized by privacy professionals, corporations, government officials, academics and students around the world, Data Privacy Day highlights the impact that technology is having on our privacy rights and underlines the importance of valuing and protecting personal information.”

Privacy becomes increasingly challenging with new tech such as big data, the internet of things, wearable computers, drones, and government agencies recording massive amounts of data in the name of security.  Sober thought needs to go into balancing the advantages of such things with privacy rights, creating them in a privacy-sensitive way, and giving people informed choices.


Cross-posted to Slaw 

harrisonpensa.com/lawyers/david-canton


Internet of Things and Big Data raise big legal issues

The internet of things and big data are separate but related hot topics. As is often the case with new technology, the definitions are fluid, the potential is unclear, and they raise challenging legal issues.  All of these will develop over time.

Take privacy, for example.  The basic concept of big data is that huge amounts of data are collected and mined for useful information.  That flies in the face of the privacy principles that no more personal information should be collected than the task at hand requires, and that it should not be kept longer than that task requires.  Big data and the internet of things can also lead to personal information being created, while privacy laws generally focus on personal information being collected.

Another legal issue is ownership of information, and who gets to control and use it.  If no one owns a selfie taken by a monkey, then who owns information created by your car?

If anyone is interested in taking a deeper dive into these legal issues, I’ve written a bit about it here and here, and here are some recent articles others have written:

The ‘Internet of Things’ – 10 Data Protection and Privacy Challenges

Big Data, Big Privacy Issues

The Internet of Things Comes with the Legal Things

Wipe your car before you sell it

I’m in the process of buying a new car, and realized that when we get rid of a car we should think about more than just cleaning out the glove box and taking the snowbrush out of the trunk. A list of data to clear is at the end of this post.

At one time, cars stored no personal information other than the odometer reading and radio presets.

Today, cars are laden with computers that control and monitor things like the engine, brakes, climate control, entertainment, tire pressure, and safety features. With this comes more data, and with more data comes the temptation to save it and use it for other things. That is even more the case for hybrid and electric cars.

Examples are the OBD (on-board diagnostics) and EDR (event data recorder) systems. The OBD system contains useful information for diagnosing problems, and the EDR retains information for a short period (measured in seconds or minutes) for accident investigation, such as speed, seat belt use, steering angle, number of passengers, engine speed, and throttle position.

It is possible to plug devices into the OBD port to use and retain that information for displaying a dashboard on your phone, spying on your kids’ driving habits, or sending it to your insurer for rate calculations.
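
As a rough illustration of how simple that plug-in step is, here is a short sketch using the open-source python-OBD library. It assumes a consumer OBD-II adapter (USB or Bluetooth) is plugged into the port; the specific readings shown are just a sample of what is available.

import obd  # python-OBD library (pip install obd)

# Assumes a consumer OBD-II adapter is plugged into the car's OBD port.
connection = obd.OBD()  # auto-detects the adapter and connects

speed = connection.query(obd.commands.SPEED)            # vehicle speed
rpm = connection.query(obd.commands.RPM)                # engine speed
throttle = connection.query(obd.commands.THROTTLE_POS)  # throttle position

print(speed.value, rpm.value, throttle.value)

# Poll readings like these every few seconds and you have a detailed record of
# someone's driving, which is essentially what dashboard apps and insurance
# dongles do.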

Since the EDR system contains limited memory and overwrites itself quickly, there is little risk of that personal information being used after you give up your car – but if you are concerned, make your last drive a leisurely one.

Keeping in mind that it is easy to get a used car report showing the owner’s name and address, and thus to link data on your old car back to you, here are some things you might want to do before you part with it:

  • Delete Bluetooth pairings.
  • Delete stored phone numbers and call history.
  • Remove any CDs, DVDs, and USB keys. (It’s easy to forget a USB key, for example, plugged into a port hidden in the glove box or another compartment, and it might have more on it than just music.)
  • Delete built-in garage door opener codes.
  • Clear the GPS of pre-programmed destinations and route history.
  • Clear Wi-Fi hotspot settings and passwords.
  • Remove any OBD/EDR recorders you have added.
  • Cancel OnStar subscription and reporting. (I know someone who forgot to cancel reporting, and continued to get monthly reports on his old car, now with its new owner.)
  • Cancel or transfer satellite radio.

Cross posted to Slaw

http://harrisonpensa.com/lawyers/david-canton


SCC “gets” tech – government not so much

Far too often – at least in my opinion – courts and legislators don’t seem to understand technology related issues or how the law should fit with them.  The Supreme Court of Canada, however, got it right with Spencer, which basically says that internet users have a reasonable expectation of anonymity in their online activities.  Last Fall the SCC sent a similar message in the Vu case saying that a general search warrant for a home was not sufficient to search a computer found there.  And that trend will hopefully continue with its upcoming Fearon decision on the ability to search cell phones incident to arrest.

While the SCC seems to now “get it” when it comes to privacy and technology, the federal legislature doesn’t seem to.  It has continually tried to erode privacy with a series of “lawful access” attempts, the latest of which may be unconstitutional given the Spencer decision.  Another example of the federal legislature not “getting it” is the CASL anti-spam legislation, which imposes huge burdens on normal businesses and software providers.

Cross posted to Slaw

http://harrisonpensa.com/lawyers/david-canton  

Where’s your “Freaky Line”?

For the London Free Press – December 23, 2013 – Read this at lfpress.com

How much privacy and control of your personal information are you willing to give up to get useful information in return?

Technology is becoming increasingly integrated with us — both psychologically and physically. And its use increasingly requires us to trust others with sensitive information about us.

This new age of information gathering has been described by author Robert Scoble as the “age of context.” Scoble said, “It’s scary. Over the freaky line, if you will.”

He predicts that this “freaky line” will create a new kind of digital divide between those who will and will not cross it.

Take, for example, Google Glass, a wearable computer with a head-mounted display. Essentially it is a pair of glasses that displays Internet information hands-free, all using voice commands. This may sound relatively harmless, but it can give away detailed information on your daily routine, habits and haunts.

For some, that is a bit too much. More and more of our personal and intimate information will be collected from devices and converted into usable data — both for the user and others.

Users will get valuable information such as traffic, nearby attractions and the latest deals through behavioural advertising, which is a method of tracking consumers’ activities on websites to personalize advertisements.

There is a price to getting this information. That price is that our information has to go to a mothership somewhere, and that mothership now has information about us.

As users move along the continuum from personal computers to mobile devices to wearable computers, the ability to track users’ activities moves with them. Instead of tracking websites visited, wearable devices will provide advertisers with information about the user’s physical location, movements, interactions, health, heart rate, temperature, purchases, photographs taken, and whatever other data is being collected.

Some issues that arise include what that information will be used for, the extent to which it is kept anonymous, how long it will be kept, and who will have access to it. Privacy laws don’t map neatly onto these questions. It isn’t clear what is and what is not a reasonable expectation of privacy.

Manufacturers and service providers may argue that simply purchasing and using these types of devices amounts to implied consent for the use and dissemination of your personal information. But it’s not that simple.

For now, the choice is yours. You can choose to use these devices, or not. If you choose to use them, you do have some control over how your information is used and where it goes. But it’s not always easy to figure out or understand, and it may mean giving up using certain features.

If you choose to not use them, you keep a firmer grasp on your personal information, but you tend to miss out on new technologies and the benefits they bring.

This is the “freaky line” that you must choose to cross . . . or not.

http://harrisonpensa.com/lawyers/david-canton

Top court to rule on cell privacy

The Supreme Court of Canada will soon decide when police can look at one’s cell phone without a warrant.

Given the large amount of personal information that exists on our smartphones, this will be an important decision.  A smartphone can be a window to personal information such as email, banking details, our location over time, personal photos, and even health information.  Looking at a cell phone can be as invasive as searching our home.

In the case going to the Supreme Court, the accused, Mr. Fearon, was arrested on suspicion of his role in a robbery.  At the time of his arrest, he had a cell phone, which did not have a password restricting access.  Incident to the arrest, the police looked through the contents of his phone, finding messages and pictures incriminating him in the robbery.  At trial, Mr. Fearon attempted to have the evidence deemed inadmissible, arguing that the police should have obtained a warrant to search his phone.  The Court held that the evidence was admissible.

The Court reasoned that since the phone did not have a password, a cursory examination of its contents incidental to arrest was acceptable.  The court stated that had access to the phone required a password, the police would need a warrant to go through the phone.

The court recognized “the highly personal and sensitive nature of the contents of a cell phone” and the likelihood that such information would attract an expectation of privacy, but allowed the search nonetheless.

The Supreme Court will determine whether that principle should be upheld or replaced with something different.  

Some may argue that the court did a good job balancing the privacy interests of individuals with the enforcement interests of the police.  However, others may think it is akin to saying that whether police can search one’s house depends on whether the door is locked.

The concept of a phone being locked may not be as straightforward as it seems, given ever changing technology. 

For example, the iPhone 5S has a fingerprint reader that can be used instead of a password to gain access to the phone.

A password is something contained in the mind of an individual, whereas a fingerprint is physical.  Access to the phone can be obtained with a minor amount of force, simply by pressing the phone to the user’s finger.

Other authentication processes are on the horizon, including some that unlock the phone when it is in close proximity to its owner.  For example, the upcoming Nymi is a bracelet that senses the unique heart rhythm of its owner, which effectively serves as a password to provide access to everything from phones to computers to doors.  Other technologies that would accomplish the same thing include electronic tattoos and authentication pills.

In a situation where an individual uses one of those authentication methods, their phone would appear to be unlocked, even though they intended to restrict access to it. Let’s hope the Supreme Court of Canada considers the technology in this light, rather than thinking of protection only in terms of a simple password.

A condensed version of this article appeared in my last Tech Watch column.

http://harrisonpensa.com/lawyers/david-canton