Internet of Things and Big Data raise big legal issues

The internet of things and big data are separate but related hot topics. As is often the case with new technology, the definitions are fluid, the potential is unclear, and both raise novel legal issues. All of these will develop over time.

Take privacy, for example. The basic concept of big data is that huge amounts of data are collected and mined for useful information. That flies in the face of the privacy principles that no more personal information should be collected than the task at hand requires, and that it shouldn’t be kept for longer than the task at hand requires. Big data analysis can also lead to personal information being created, while privacy laws generally focus on the concept of personal information being collected.

Another legal issue is ownership of information, and who gets to control and use it.  If no one owns a selfie taken by a monkey, then who owns information created by your car?

If anyone is interested in taking a deeper dive into these legal issues, I’ve written a bit about it here and here, and here are some recent articles others have written:

The ‘Internet of Things’ – 10 Data Protection and Privacy Challenges

Big Data, Big Privacy Issues

The Internet of Things Comes with the Legal Things

Wipe your car before you sell it

I’m in the process of buying a new car, and realized that when we get rid of a car we should think about more than just cleaning out the glove box and taking the snowbrush out of the trunk. A list of data to clear is at the end of this post.

At one time, cars stored no personal information other than the odometer reading and radio presets.

Cars are laden with computers that control and monitor things like the engine, brakes, climate control, entertainment, tire pressure, and safety features. With this comes more data, and with more data comes the temptation to save it and use it for other things. That is even more true of hybrid and electric cars.

Examples are the OBD (on-board diagnostics) and EDR (event data recorder) systems. The OBD system contains useful information for diagnosing problems, and the EDR retains information for a short period (measured in seconds or minutes) for accident investigation, such as speed, seat belt use, steering angle, number of passengers, engine speed, and throttle position.

It is possible to plug devices into the OBD port that use and retain that information, whether for displaying a dashboard on your phone, monitoring your kids’ driving habits, or sending driving data to your insurer for rate calculations.
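The data those plug-in devices read is exposed through standardized OBD-II parameter IDs (PIDs) defined in SAE J1979. As a rough illustration of how raw sensor bytes become readable driving data (a sketch only, not any particular vendor’s implementation), decoding two common PIDs looks something like this:

```python
# Sketch: decoding two common OBD-II PIDs (per SAE J1979).
# PID 0x0C (engine RPM):     ((256 * A) + B) / 4
# PID 0x0D (vehicle speed):  A, in km/h

def decode_engine_rpm(a: int, b: int) -> float:
    """Engine speed in revolutions per minute from two data bytes."""
    return ((256 * a) + b) / 4

def decode_vehicle_speed(a: int) -> int:
    """Vehicle speed in km/h from a single data byte."""
    return a

# Example: response bytes 0x1A 0xF8 decode to 1726 RPM; 0x3C to 60 km/h.
print(decode_engine_rpm(0x1A, 0xF8))  # 1726.0
print(decode_vehicle_speed(0x3C))     # 60
```

A dongle polling these PIDs a few times per second can build a detailed, timestamped record of how a car was driven, which is exactly the data that dashboards, parental monitors, and insurer programs rely on.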

Since the EDR system contains limited memory and overwrites itself quickly, there is little risk of that personal information being used after you give up your car – but if you are concerned, make your last drive a leisurely one.
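The reason the EDR poses little long-term risk is that it behaves like a circular buffer: new samples continuously overwrite the oldest ones, so only the most recent window of driving survives. A minimal sketch of that idea (illustrative only, not an actual EDR implementation):

```python
from collections import deque

# Sketch of EDR-style retention: a fixed-size window where each new
# sample silently pushes the oldest one out.
RETENTION_SAMPLES = 5  # a real EDR holds seconds-to-minutes of sensor data

buffer = deque(maxlen=RETENTION_SAMPLES)
for speed_kmh in [40, 45, 50, 55, 60, 65, 70]:
    buffer.append(speed_kmh)  # earliest readings are overwritten

print(list(buffer))  # [50, 55, 60, 65, 70] -- only the last window remains
```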

Keeping in mind that it is easy to get a used car report showing owner name and address to link data on your old car back to you, here are some things you might want to do before you part with your car:

  • Delete Bluetooth pairings.
  • Delete stored phone numbers and call history.
  • Remove any CDs, DVDs, and USB keys. (It’s easy to forget a USB key plugged into a port hidden in the glove box or another compartment, and it might have more on it than just music.)
  • Delete built-in garage door opener codes.
  • Clear the GPS of pre-programmed destinations and route history.
  • Clear Wi-Fi hotspot settings and passwords.
  • Remove any OBD/EDR recorders you have added.
  • Cancel your OnStar subscription and reporting. (I know someone who forgot to cancel reporting, and continued to get monthly reports on his old car after the new owner took it over.)
  • Cancel or transfer satellite radio.

Cross posted to Slaw

http://harrisonpensa.com/lawyers/david-canton


SCC “gets” tech – government not so much

Far too often, at least in my opinion, courts and legislators don’t seem to understand technology-related issues or how the law should fit with them. The Supreme Court of Canada, however, got it right in Spencer, which basically says that internet users have a reasonable expectation of anonymity in their online activities. Last fall the SCC sent a similar message in Vu, holding that a general search warrant for a home was not sufficient to search a computer found there. That trend will hopefully continue with its upcoming Fearon decision on the ability to search cell phones incident to arrest.

While the SCC seems to now “get it” when it comes to privacy and technology, the federal legislature doesn’t seem to.  It has continually tried to erode privacy with a series of “lawful access” attempts, the latest of which may be unconstitutional given the Spencer decision.  Another example of the federal legislature not “getting it” is the CASL anti-spam legislation, which imposes huge burdens on normal businesses and software providers.

Cross posted to Slaw

http://harrisonpensa.com/lawyers/david-canton  

Where’s your “Freaky Line”?

For the London Free Press – December 23, 2013 – Read this at lfpress.com

How much privacy and control of your personal information are you willing to give up to get useful information in return?

Technology is becoming increasingly integrated with us — both psychologically and physically. And its use increasingly requires us to trust others with sensitive information about us.

This new age of information gathering has been described by author Robert Scoble as the “age of context.” Scoble said, “It’s scary. Over the freaky line, if you will.”

He predicts that this “freaky line” will create a new kind of digital divide between those who will and will not cross it.

Take, for example, Google Glass, a wearable computer with a head-mounted display. Essentially it is a pair of glasses that displays internet information hands-free, all using voice commands. This may sound relatively harmless, but it can give away detailed information on your daily routine, habits and haunts.

For some, that is a bit too much. More and more of our personal and intimate information will be collected from devices and converted into usable data — both for the user and others.

Users will get valuable information such as traffic, nearby attractions and the latest deals through behavioural advertising, which is a method of tracking consumers’ activities on websites to personalize advertisements.

There is a price to getting this information. That price is that our information has to go to a mothership somewhere, and that mothership now has information about us.

As users move along the continuum from personal computers to mobile devices to wearable computers, the ability to track users’ activities moves with them. Instead of tracking websites visited, wearable devices will provide advertisers with information about the user’s physical location, movements, interactions, health, heart rate, temperature, purchases, photographs taken, and whatever other data is being collected.

Some issues that arise include what that information will be used for, the extent to which it is kept anonymous, how long it will be kept, and who will have access to it. Privacy laws don’t fit nicely into this subject. It isn’t clear what is and what is not a reasonable expectation of privacy.

Manufacturers and service providers may argue that simply purchasing and using these types of devices amounts to implied consent for the use and dissemination of your personal information. But it’s not that simple.

For now, the choice is yours. You can choose to use these devices, or not. If you choose to use them, you do have some control over how your information is used and where it goes. But it’s not always easy to figure out or understand, and it may mean giving up using certain features.

If you choose to not use them, you keep a firmer grasp on your personal information, but you tend to miss out on new technologies and the benefits they bring.

This is the “freaky line” that you must choose to cross . . . or not.

http://harrisonpensa.com/lawyers/david-canton

Top court to rule on cell privacy

The Supreme Court of Canada will soon decide when police can look at one’s cell phone without a warrant.

Given the large amount of personal information that exists on our smartphones, this will be an important decision.  A smartphone can be a window to personal information such as email, banking details, our location over time, personal photos, and even health information.  Looking at a cell phone can be as invasive as searching our home.

In the case going to the Supreme Court, the accused, Mr. Fearon, was arrested on suspicion of his role in a robbery. At the time of his arrest, he had a cell phone, which did not have a password restricting access. Incident to the arrest, the police looked through the contents of his phone and found messages and pictures implicating him in the robbery. At trial, Mr. Fearon attempted to have the evidence deemed inadmissible, arguing that the police should have obtained a warrant to search his phone. The Court held that the evidence was admissible.

The Court reasoned that since the phone did not have a password, a cursory examination of its contents incidental to arrest was acceptable.  The court stated that had access to the phone required a password, the police would need a warrant to go through the phone.

The court recognized “the highly personal and sensitive nature of the contents of a cell phone” and the fairly high expectation that such information would attract an expectation of privacy, but allowed the search nonetheless.

The Supreme Court will determine whether that principle should be upheld or replaced with something different.  

Some may argue that the court did a good job balancing the privacy interests of individuals with the enforcement interests of the police.  However others may think it is akin to saying that whether police can search one’s house depends on whether the door is locked. 

The concept of a phone being locked may not be as straightforward as it seems, given ever changing technology. 

For example, the iPhone 5S has a fingerprint reader that can be used instead of a password to gain access to the phone.

A password is something contained in the mind of an individual, whereas a fingerprint is physical. It can be obtained with a minor amount of force, simply by pressing the phone to the user’s finger.

Other authentication processes are on the horizon, including some that unlock the phone when it is in close proximity to its owner. For example, the upcoming Nymi is a bracelet that senses the unique heart rhythm of its owner, which effectively serves as a password providing access to everything from phones to computers to doors. Other technologies that would accomplish the same thing are electronic tattoos and authentication pills.

In a situation where an individual uses one of those authentication methods, their phone would appear to be unlocked, even though they intended to restrict access to it. Let’s hope the Supreme Court of Canada considers the technology in this light, rather than just the simple password.

A condensed version of this article appeared in my last Tech Watch column.

http://harrisonpensa.com/lawyers/david-canton

Behavioural Advertising – not an exact science

Today’s Slaw post:

We have a love/hate view of behavioural advertising (tracking and targeting of individuals’ web activities, across sites and over time, in order to serve advertisements that are tailored to those individuals’ inferred interests).

On the one hand, if we are going to be served up ads on the web, it is better (for both viewers and advertisers) to be served ads that are relevant to the viewer’s interests.  On the other hand, it can be rather creepy to think we are being tracked, especially if there are profiles of us being stored somewhere, and especially if those profiles contain information that is inaccurate, or show interests that for whatever reason we don’t want to broadcast.  There are of course privacy implications to behavioural advertising, which are discussed in detail in this Policy Position on Online Behavioural Advertising guideline published by the Canadian Privacy Commissioner.

I’ve noticed lately that my work surfing habits have resulted in some ads that initially puzzled me, and that don’t always reflect my buying interests, but that make sense in terms of the websites I’ve visited. You may have encountered this too. The algorithms presume that we are personally interested in the sites we go to, what we look at on those sites, and what we buy on those sites. That is at best an educated guess. Consider that our work web surfing may include some personal surfing, but for the most part we look at sites that are work tools (such as research sites or databases), client sites, or sites of those adverse in interest to clients.

Recently I have been getting a lot of ads for a particular web development business.  I’m not in the market to buy their services, but I’ve visited their site because they are a client.

I have also been getting a particular ad for prepaid funeral services, which shows up too often to be random.  After a brief conspiracy theory moment (do they know something about my health that I don’t?) I realized that it was a result of looking at some obituary / funeral home sites for a client.
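The naive inference behind those two anecdotes can be caricatured in a few lines: the tracker simply maps visited sites to interest categories, with no sense of why you visited. A toy sketch, using hypothetical site names and categories rather than any real ad network’s algorithm:

```python
# Toy sketch of behavioural-ad interest inference: every visit is
# presumed to signal personal buying interest, context-free.
SITE_CATEGORIES = {  # hypothetical mapping an ad network might keep
    "acme-webdev.example": "web development services",
    "smith-funeral-home.example": "prepaid funeral services",
    "legal-research.example": "legal research tools",
}

def infer_interests(visit_history):
    """Return interest categories inferred from raw site visits."""
    return {SITE_CATEGORIES[site] for site in visit_history
            if site in SITE_CATEGORIES}

# A lawyer's work browsing, misread as personal buying intent:
visits = ["acme-webdev.example", "smith-funeral-home.example"]
print(infer_interests(visits))
```

The missing variable is intent: a client visit, a research errand, and a genuine shopping trip all look identical in the visit log, which is exactly why the resulting ads can be so far off.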

What unusual, amusing, or puzzling ads have readers encountered based on their web surfing history?

http://harrisonpensa.com/lawyers/david-canton

Idealism, fear battle over wearable devices


For the London Free Press – July 22, 2013 – Read this at lfpress.com

Imagine being able to take pictures, check e-mail, follow a map and monitor your vital signs — all without having to take out your phone or use your hands. Wearable computers now make this possible.

Wearable computers come in various forms. Head-mounted displays are gaining exposure due to the highly publicized Google Glass.

Smart clothing and wristwatch-type gadgets that monitor and store a user’s vital signs are also getting attention, as their role in the health field expands.

Some computers are being implanted beneath the skin.

As inventors promise a more seamless, immediate interaction with the devices, brain-machine interfaces (BMIs), or brain-computer interfaces (BCIs), are being developed, integrating humans and machines.

Electrodes implanted in the body to interpret neural signals are starting to enable people with paralysis to control devices with their thoughts or enable amputees to control artificial limbs.

With the line between human and computer becoming increasingly blurred, curiosity is rising — but so are anxieties about the effects of such technology.

Questions about reliability, safety, security, control and loss of independence are often posed.

It’s logical to question where the data from these devices is stored and how it will be used. Similar concerns are expressed about smartphones; however, as computers become wearable and integrated with the body, it is more difficult to know when or control how data is collected.

And the type and volume of data being collected is increasing.

Because technological innovation moves faster than the legal system, many legal questions have yet to be answered.

One fundamental question, when computers are implanted in our bodies, is whether the legal focus should shift to the person rather than the device.

What about our expectation of privacy? Is it reasonable for people in today’s technological era to expect privacy or has that effectively vanished? Do we have a right not to be photographed by the person wearing Google Glass in the grocery store?

The privacy debate is a particularly heated one, with some arguing that these devices help create a surveillance society where our every move is recorded and stored whether we like it or not.

Described as the “father of wearable computing,” Steve Mann turns such privacy concerns on their head and takes the position that these devices actually empower users, serving as a counteracting force against the “watchers.”

From this perspective, the ability to make our own recordings and collect our own data is a way to take back the power that government, corporations, institutions and media have over us.

This may seem like an idealistic view, but recent news stories about citizens recording police excesses and arrests suggest Mann’s theory of empowerment may have merit.

It’s easy to get caught up in the hype of the wearable computing trend — and it is indeed an exciting one. Whether your outlook on wearable computing is positive or ridden with fear, it is important to consider the legal and ethical issues that come with them.

Stores tracking our cell phones

Today’s Slaw post:

Some retailers are following customer movement in stores by tracking cell phone movement. From a legal perspective this raises issues around privacy and perhaps wiretapping laws. To a great extent, whether or not such activities comply depends on the subtleties of how the tracking is done, and how anonymously.
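One commonly reported approach (details vary by vendor, so treat this as an assumption) is to log the Wi-Fi MAC address a phone broadcasts, hashed so the store holds a pseudonymous shopper ID rather than the raw address. A purely illustrative sketch of why that "anonymization" still permits tracking:

```python
import hashlib

def pseudonymize_mac(mac: str, salt: str = "store-secret") -> str:
    """Hash a phone's broadcast MAC address into a pseudonymous ID.

    Hashing hides the raw address, but the same phone still maps to
    the same ID on every visit -- which is why hashed tracking can be
    'anonymous' in name yet still follow a repeat customer around.
    """
    return hashlib.sha256((salt + mac.lower()).encode()).hexdigest()[:16]

id1 = pseudonymize_mac("AA:BB:CC:DD:EE:FF")
id2 = pseudonymize_mac("aa:bb:cc:dd:ee:ff")
print(id1 == id2)  # True: repeat visits remain linkable despite hashing
```

Whether this counts as anonymous for legal purposes is precisely the kind of subtlety the compliance question turns on: the identifier is pseudonymous, not anonymous, because it is stable per device.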

The other issue – as is often the case when dealing with privacy related issues – is the customer acceptance or “creepiness” factor.  Some people would welcome getting a coupon on their phone while wandering through a store.  But for others it feels like surveillance and tracking that is just plain creepy.

The New York Times has a good article exploring some of these issues entitled Attention, Shoppers: Store Is Tracking Your Cell.

From the article:

Nordstrom’s experiment is part of a movement by retailers to gather data about in-store shoppers’ behavior and moods, using video surveillance and signals from their cellphones and apps to learn information as varied as their sex, how many minutes they spend in the candy aisle and how long they look at merchandise before buying it.

All sorts of retailers — including national chains, like Family Dollar, Cabela’s and Mothercare, a British company, and specialty stores like Benetton and Warby Parker — are testing these technologies and using them to decide on matters like changing store layouts and offering customized coupons.

But while consumers seem to have no problem with cookies, profiles and other online tools that let e-commerce sites know who they are and how they shop, some bristle at the physical version, at a time when government surveillance — of telephone calls, Internet activity and Postal Service deliveries — is front and center because of the leaks by Edward J. Snowden.

Here’s how changes to PIPEDA would work

For the London Free Press – July 8, 2013 – Read this at lfpress.com

The Privacy Commissioner of Canada (OPC) recently released a report recommending reforms to the Personal Information Protection and Electronic Documents Act (PIPEDA). PIPEDA is the privacy legislation that governs private-sector privacy generally in Ontario and many other provinces.

The report noted that, “Ninety per cent of the data that exists in the world today has been created in the last two years,” and PIPEDA needs to evolve.

The report highlighted four recommendations.

1: Strengthen enforcement and encourage greater compliance

Statutory damages (meaning set damages without any requirement of proof) for certain contraventions of PIPEDA. The report cites the Copyright Act as a successful example of a statutory-damages regime.

Order-making powers to give the Commissioner the ability to issue a binding order to either enforce an action or prevent one from being committed. At present, the Commissioner can only recommend this type of action.

Administrative monetary penalties (AMPs) are suggested as a means of bringing organizations into compliance with PIPEDA. AMPs are similar to fines, but would be assessed directly by the Commissioner.

Why the OPC wants this: “It is legitimate to question how a small entity with limited resources, such as the OPC, can attract the attention of these companies and proactively encourage them to comply with PIPEDA when the reality is that there are very limited consequences for contravening Canadian privacy law.”

2: Shine a light on privacy breaches

Require organizations to report breaches of personal information to the Commissioner and to affected individuals.

Why the OPC wants this: Some organizations voluntarily report and inform individuals of privacy breaches. Some organizations do not. Those that do voluntarily report may face negative financial and reputational consequences while those that do not report may escape any form of recourse. This “creates an uneven playing field for organizations.”

3: Lift the veil on authorized disclosures

PIPEDA allows disclosure of personal information to a government institution, upon request, without the knowledge or consent of the affected individual. Organizations may, but don’t always, challenge or refuse these requests. The OPC would require organizations to maintain a record of disclosures to government and make it publicly available.

Why the OPC wants this: Canadians seeking access to their personal information would be able to find out if their information had been disclosed. There is no transparency or clear rules about what information can and should be provided to government institutions without a court order.

4: Walk the talk

Enforceable agreements would force an organization, at the end of a privacy investigation, to agree with the Commissioner’s recommendations and to comply within a set time period.

Make accountability provisions subject to review by the Federal Court.

Why the OPC wants this: Monitoring and analyzing a company’s actions are just as time-consuming as the Commissioner’s investigations.

http://harrisonpensa.com/lawyers/david-canton