Privacy Issues

Terms and Conditions

“You know the old expression, ‘you ain’t seen nothing yet’? Well, we’re just starting on the world of wearables,” says Ann Cavoukian, former Information and Privacy Commissioner of Ontario.

There are numerous privacy concerns associated with the collection, transmission and communication of data from wearable sensors. When an individual uses a wearable device, an enormous amount of data is collected and transmitted from the sensor to a computer server or a mobile application, which then stores and communicates that data.

In January 2014, the Office of the Privacy Commissioner of Canada released a report, Wearable Computing: Challenges and opportunities for privacy protection. The report was created in response to rapid technological innovation and increased consumer demand for wearable devices. The report states that collected data can be “combined, analyzed and acted upon without adequate transparency, accountability or meaningful consent.” This means that an individual’s data can be taken and used for purposes to which the data subject has not consented. In addition to challenging the existing model of consent, these devices provide new avenues for surveillance by employers, healthcare professionals and insurance companies.

The following elaborates on some of the concerns presented in the OPC Wearable Computing report and, with the assistance of Ann Cavoukian, assesses each of the associated risks.

“The purpose limitation principle, intended to limit the collection of personal information, subject to consent being given for those specific purposes, is becoming increasingly difficult to apply in a world of ubiquitous computing and mobile devices.” – Ann Cavoukian, Executive Director of the Privacy and Big Data Institute, Ryerson University.

Consent

Consent can mean different things in a discussion about ethics and in a discussion about privacy. In ethics, consent refers to the individual consenting to use a wearable device: do they have the cognitive ability to consent to its use? In a discussion about privacy, consent refers to the individual’s agreement to allow their data to be collected and transmitted.

Informed consent is difficult to obtain. Individuals do not often read the Terms and Conditions, and the parameters of use written in a manual are typically not composed in a manner that the individual can understand. Individuals using wearable technology simply do not know what is being done with their data.

Purpose limitation principle and purpose specification

“Privacy is all about using information for the intended purposes that have been consented to by the data subject, and in this case, the elderly individual,” Cavoukian says. This is called the “purpose limitation principle”: using the data only for the intended purpose that the individual consented to.

“The consumer doesn’t give you data to do whatever the heck you want to do with it,” says Cavoukian. “They usually give you data for a particular purpose.” This is the notion of purpose specification: the data should be used only for its intended purpose. If information is going to be used for a second purpose, then the developer should return to the individual and get their consent. “Obviously that doesn’t happen in the world of wearables at all, and that’s part of the problem.” Data from wearable devices is transmitted from a sensor to a computer server or mobile application. An individual will consent to their data being collected and communicated because they want to be able to see their progress or read their own statistics. However, this data is often used for a second purpose, for example to inform business statistics, or sold to research and insurance companies.
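The purpose limitation principle described above can be made concrete in software. The sketch below is a hypothetical illustration only, assuming invented names (`DataStore`, `ConsentError`, the purpose strings); it is not drawn from any real wearable platform. The idea is simply that every use of a person’s data is checked against the purposes they actually consented to, so a secondary use fails unless the developer goes back for consent.

```python
# Hypothetical sketch of purpose limitation: all names are illustrative.

class ConsentError(Exception):
    """Raised when data would be used for a purpose the user never consented to."""
    pass

class DataStore:
    def __init__(self):
        # Maps each user to the set of purposes they have consented to.
        self.consented_purposes = {}

    def record_consent(self, user, purpose):
        self.consented_purposes.setdefault(user, set()).add(purpose)

    def use_data(self, user, purpose):
        # Every use is checked against recorded consent; a secondary
        # purpose (e.g. resale) requires going back to the individual.
        if purpose not in self.consented_purposes.get(user, set()):
            raise ConsentError(f"No consent from {user} for purpose: {purpose}")
        return f"using {user}'s data for {purpose}"

store = DataStore()
store.record_consent("alice", "show_own_statistics")
print(store.use_data("alice", "show_own_statistics"))  # the consented purpose
try:
    store.use_data("alice", "sell_to_insurer")  # a second, unconsented purpose
except ConsentError as e:
    print(e)
```

The design choice worth noticing is that the check sits inside `use_data` itself, so it cannot be bypassed by later code, which mirrors Cavoukian’s point that limits must be built in “by design from the outset.”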

There are many unintended consequences that arise from the free-floating use of personally identifiable data. “People don’t think of the consequences,” says Cavoukian. “When information that is intended for one purpose is used for a myriad of other purposes, no one is thinking of the potential horrors that could arise. That’s one of the biggest problems with not limiting the use of information by design from the outset.” One such unintended consequence is wearable surveillance.

Wearable surveillance

The purpose of wearable technology is to track and profile different variables, such as heart rate, number of steps walked, blood pressure, etc. Many individuals use this information for their own records, or the information is provided to a healthcare professional. But there are other stakeholders who are becoming increasingly interested in this information: health insurance providers and employers for example.

Manulife Financial, an insurance company, offers discounts to U.S. consumers who use fitness trackers, and plans to launch the same program in Canada. While an individual can be incentivized to use wearable technology to lower insurance premiums, there are also negative consequences of this data being available to insurance companies. As included in a MaRS research report, Forrester Research concluded, “As fitness wearables track the activities of more and more employees, for-profit health care systems will both reward adherents to an active and healthy lifestyle … and punish non-adherents.”

“Quite frankly there’s very little in this world that people can’t find out about you.” – Hugh Judges, Fitbit user.

Employers are also incentivizing workers to use wearable devices through corporate wellness companies. Sprout is a Canadian corporate wellness company that helps companies keep their employees healthy and active. It has an employee tracking program that tracks everything from activity levels to mental health.

While there are many benefits to quantifying one’s health and finding motivation to be active, these new means of surveillance could be detrimental to the individuals who cannot meet the health or activity levels set out by insurance companies or employers.

“This is very very real,” says Cavoukian. “In the States, you see it all the time. There are all these horror stories.” She says insurance rates can increase and individuals can be cut off from getting certain services after their wearable data is accessed.

The developers

The OPC report highlights the role of transparency: there must be transparency in our relationship with the private sector and between the individual and the government. While we can speculate on some of the privacy risks these technologies present, the report states that they are difficult to predict in the long term.

There are many risks associated with privacy, but wearable technology also offers a great opportunity to enhance privacy protection and user autonomy, states the report. When Cavoukian was the Privacy Commissioner, she went on a world tour, talking almost exclusively to engineers about “privacy by design.” “It is something I developed many years ago, which is all about embedding the necessary protections that we need into the design of information technologies and network infrastructure,” says Cavoukian.

Cavoukian explains that she asked engineers to proactively embed protections directly into the technology so that privacy would become an inherent part of the data architecture. “Every single engineer, and I spoke to thousands of engineers in California and Europe, everybody said, ‘of course we can do that.’ But, they said that the biggest problem is that you have to tell us you want us to do that.”

The development of technologies happens in silos. Engineers receive instructions and they write the code. The privacy issues go to another department, that of the Chief Information Officer or the Chief Privacy Officer. There is a disconnect in the information flow. At some point, someone would tell the engineers to implement some sort of privacy solution. “They would just shake their heads because it was way too late,” says Cavoukian. “You can’t bolt on a solution after the fact; you have to do it to begin with.”

“We have to tell the engineers why it is important to embed privacy protections into technology,” – Ann Cavoukian, Executive Director of the Privacy and Big Data Institute, Ryerson University.

Cavoukian co-chairs a technology committee with engineering professionals called Privacy by Design for Software Engineers. The committee is attempting to develop the “playbook” on how to influence and educate engineers and developers about privacy. “The reality is that the majority of the role to be played in terms of protecting privacy and data is on the part of the company or organization or government department – whoever is doing the data collection,” Cavoukian says.

In addition to embedding privacy in the physical piece of technology, it is incumbent upon the organization to have strong data protection policies, says Cavoukian. “To have policies where not only are they transparent to the consumer or the data subject about what they intend to do with the person’s information, but that they themselves impose some restrictions on what they do with the information.” It is not just their data; it is the consumer’s data.

Pierre-Alexandre Fournier, CEO of Hexoskin
“We have always been very serious about privacy and security, which are two different things,” says Fournier. “What happens with the data is that you’re wearing the shirts and the electronic device that’s hidden in the shirt records all the data. When you have a smartphone, you can use it to look at your health information and you can use your smartphone to transmit the information in real time to your account, which is on one of our servers. Then from there, if you want, you can share it with somebody, but by default, everything is private. So you always have a copy of your information in your shirt, and there’s another copy on the servers.”
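The “private by default” arrangement Fournier describes, where nothing is shared unless the owner explicitly opts in, can be sketched in code. This is a hypothetical illustration under assumed names (`HealthRecord`, `share_with`), not Hexoskin’s actual implementation.

```python
# Hypothetical sketch of the "private by default" pattern: the set of
# people a record is shared with starts empty, and sharing is an
# explicit, per-person opt-in by the owner. All names are illustrative.

class HealthRecord:
    def __init__(self, owner, data):
        self.owner = owner
        self.data = data
        self.shared_with = set()  # empty by default: fully private

    def share_with(self, person):
        # Sharing requires an explicit action by the data subject.
        self.shared_with.add(person)

    def read(self, requester):
        if requester == self.owner or requester in self.shared_with:
            return self.data
        raise PermissionError(f"{requester} has no access to {self.owner}'s record")

record = HealthRecord("alice", {"heart_rate": 72})
print(record.read("alice"))    # the owner can always read her own data
record.share_with("dr_smith")  # explicit opt-in to share with a clinician
print(record.read("dr_smith"))
```

The point of the sketch is the default: anyone not explicitly added is refused, rather than access being open until someone remembers to restrict it.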

Paul Shore, Vice President of Health Care at Tractivity
“Privacy is a big issue, both in the development of the software from day one and on an ongoing basis, depending on what requests we get from healthcare providers. I will give you a few examples of things that we do to ensure privacy. There are some rudimentary things: we don’t store passwords of our customers. This is becoming typical for a lot of web-based software products. We do all sorts of encryption: we encrypt the data when it is moving across the internet, we encrypt the data when we store it on our server, and we pick server suppliers (we do not use our own servers) and choose the levels of service they provide that have encryption within. We have all sorts of policies and procedures for how we behave in our office, because we have access to the data. For sure it is a big deal. With the step counts of the patients, you could argue that it is not particularly damaging if the world saw how much walking you were doing. But one day we may have your blood glucose levels, and on and on we go; these could become more sensitive issues. It is good that we have spent a lot of effort setting it up for that. There are other things that we store in our database, like name and email address, because the email address is your account name. Or in the case of some of our patients in the U.S., we use a medical record number, which is a number issued by the provider, and that, associated with a patient name, is an important thing to protect.”
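One practice Shore mentions, not storing customers’ passwords, is commonly implemented by keeping only a salted, slow hash of the password. The sketch below uses only the Python standard library; the iteration count and salt size are illustrative choices, and nothing here is taken from Tractivity’s actual code.

```python
# Hypothetical sketch of password hashing: the server keeps only
# (salt, digest), never the password itself, so a database leak does
# not directly expose customers' passwords.
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=200_000):
    if salt is None:
        salt = os.urandom(16)  # random per-user salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, stored_digest, iterations=200_000):
    _, digest = hash_password(password, salt, iterations)
    # Constant-time comparison avoids leaking information through timing.
    return hmac.compare_digest(digest, stored_digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

The salt makes identical passwords hash differently for different users, and the high iteration count deliberately slows down brute-force guessing. Encryption in transit and at rest, which Shore also describes, is a separate layer handled by TLS and the storage provider.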

Adrian Chan is a professor at Carleton University in the Department of Systems and Computer Engineering. Here, he describes the role of the developer in protecting and establishing privacy.

The individual data subject

“I never absolve the individual of some role,” says Cavoukian. Individuals should have an awareness and take responsibility, to the best of their knowledge, when they use a new wearable technology. “Regardless of what the user does, it’s going to be limited in terms of their ability to influence the technology. The biggest thing an individual can decide to do is not use a wearable technology device because it is bleeding their data.”

Cavoukian contends that individuals should not give up on technologies that they want to pursue because they are worried about privacy. They should want their privacy protected, and they should ask the developer how that is going to happen. The minute that question is posed to the developer, they will come back to you with an answer. “The problem most of the time is people don’t ask the question,” explains Cavoukian. “So it’s just taken as a given that people don’t care about privacy, so they just do whatever they want with it. That’s what we have to reject.”

Individuals who use wearable technology and the companies that develop it should be restricting the use of the information the devices obtain and limiting it to the purposes for which the data was collected. “There has got to be far greater transparency on the uses of the information obtained, usually the personal information obtained from the individual, and some consent mechanism if the individual wants the information used widely,” says Cavoukian.

Here, Adrian Chan interprets the role of the individual in protecting their own privacy.

 

A win-win model

Cavoukian says the greatest misconception about wearables and privacy is that individuals think they live in a world of zero sums. “Zero sum means you can either have privacy or efficiency, or privacy or security. It’s always one versus the other,” says Cavoukian.

The zero-sum model is deeply ingrained in society, explains Cavoukian. But she wants people to abandon it and substitute what is called a positive-sum model, “which is just win-win,” says Cavoukian. “You can in fact have privacy and data utility, privacy and efficiency.” To achieve the positive-sum model, individuals have to think proactively, and developers must design systems in a way that achieves privacy, data utility, and efficiency together.

You do not have to give up privacy in order to have the benefits of technologies and wearable devices. An individual can avail themselves of technology and protect their privacy. “How do you do both? The minute you say that it gets in the minds of the engineers and the designers,” says Cavoukian.

“This is the objective we have to strive for. And I know here, I’m like the David versus Goliath, but we can do this.” – Ann Cavoukian

Legislation

In Canada, there is a federal privacy commissioner as well as provincial commissioners. The role of a commissioner is to ensure regulatory compliance with the privacy laws of the respective jurisdiction. Privacy commissioners are independent of government, which allows them to oversee the activities of government.

“The government should theoretically be leading the charge. I think we do a decent job.” – Ann Cavoukian

How are you protected by the government? 


The Privacy Act
Wearable devices used for medical and healthcare purposes are governed by the Privacy Act. Wearable devices used for healthcare purposes sold in Canada must have a medical device licence. These devices fall under the Medical Devices Regulations, part of the Food and Drugs Act administered by Health Canada. Under current law, the federal government can only collect information if it relates directly to an operating program or activity. The Privacy Act states that government institutions can only use personal information for a use consistent with the purpose for which it was collected. The individual must consent to any other use of the information collected by a wearable device. This is an example of the purpose specification and limitation principles. “Wherever federal departments intend to make use of wearable computing devices to collect personal information, they will need to ensure that their program activities are carried out in accordance with the Privacy Act, undertake Privacy Impact Assessments (PIAs) and establish privacy protocols for conducting research, audits and evaluations, in accordance with Treasury Board directives and policies,” states the OPC report.

Personal Information Protection and Electronic Documents Act
The report by the OPC highlights that the Personal Information Protection and Electronic Documents Act, PIPEDA, does not apply to “any individual in respect of personal information that the individual collects, uses or discloses for personal or domestic purposes and does not collect, use or disclose for any other purpose.” However, PIPEDA can be engaged where personal information from a device is sent to an organization that collects the information.
