Who is Regulating Data-Driven Research in Canada?
Large datasets are rapidly becoming common currency in industry[1] and government.[2] Increasingly, research is being conducted by data scientists and practitioners using Big Data and Artificial Intelligence (AI); however, there is very little regulation in place to monitor and prevent potential harms to the individuals implicated in these studies.[3]
Big Data research is not new,[4] and the boundaries of personal privacy with respect to data are arguably dissolving.[5] Several cases document harm with respect to information privacy breaches and informed consent,[6] data discrimination,[7] networked effects,[8] and correlative association,[9] yet ethical frameworks and legislation in the field of data-driven research remain highly contested and in flux.[10] The Government of Canada’s approach has been largely piecemeal and reactive,[11] and policy is needed to address accountability.
Despite Big Data and AI having become leading drivers of wealth creation,[12] many companies have no precautionary or preventative measures in place. Innovation is prioritized, and the current procedure appears to be: act first, apologize, and alter course only after the offence.[13] Much of the current debate focuses on what constitutes public and private data, with researchers held accountable for mitigating harm only when data is deemed private.[14] Advances in personal privacy protections, including the right to be forgotten, the right to be properly informed of one’s data collection, and privacy-by-design initiatives, have been enacted in Europe,[15] but Canada has fallen behind.[16]
Canada’s main private sector privacy legislation is the federal Personal Information Protection and Electronic Documents Act (PIPEDA, 2000),[17] and the Office of the Privacy Commissioner of Canada (OPC)[18] currently oversees compliance. The Freedom of Information and Protection of Privacy Act (FIPPA, 1988) and the Municipal Freedom of Information and Protection of Privacy Act (MFIPPA, R.S.O. 1990) are enforceable at the provincial and municipal levels respectively, yet are not equipped to regulate the information practices of Big Data in 2018.[19] The OPC published Privacy Priorities 2015-2020[20] in June 2015, suggesting alternative enforcement possibilities such as industry codes of practice and new compliance agreements made possible through Bill S-4, but little progress has been made in putting its recommendations into practice.[21] On November 1, 2018, new provisions for mandatory reporting of privacy breaches under PIPEDA, as part of the Digital Privacy Act 2015, came into force.[22] It is important to note that political parties are currently exempt from this act, which, according to the OPC, represents a significant gap in oversight and standards.[23]
Such amendments, along with new guidance documents published by the OPC[24] (Inappropriate Data Practices, July 2018, and Obtaining Meaningful Consent, to be implemented January 2019) and the House of Commons’ 2018 review of PIPEDA with respect to privacy-by-design,[25] have created more awareness and oversight. However, they still fail to hold data-driven researchers to task with respect to subject harm. Loosely defined measures of consent with respect to data privacy and protection remain the norm, and there are limited means to hold data science researchers and their corporate employers accountable. Research ethics boards and institutional review boards operate as third-party regulatory bodies within academic environments, but no unbiased equivalent exists in industry.[26] The power to enforce against data-driven ethical breaches still lies with the Canadian government,[27] and its inaction has drawn industry pushback.[28] Many companies choose to ignore the OPC’s recommendations, and the lack of enforceable standards has yet to be addressed fully by the federal government.[29]
Who are the stakeholders?
| Stakeholders | Position related to policy | Can they influence the policy outcome? |
|---|---|---|
| The public, the data subject. | At the most risk; their data is what is being acted on. | Potentially. Reputational effects on companies that have breached public trust can lead to changes in corporate policy (e.g., greater transparency measures, third-party API activity monitoring). |
| Industry and corporations. | The actors conducting data-driven research activities, making them critical stakeholders in promoting and perpetuating unethical research practices. | Yes. Industry and corporations are heavily considered in policy-making consultations. |
| Policy makers and regulatory bodies in Canada. | Advisors with some capacity to propose amendments to current law and monitor related guidelines. | Yes, but they have no real teeth. The OPC in particular has a loud voice, but no real legislative or judicial power. |
| International trade partners with Canada. | Will be affected by privacy laws with respect to data transfers and trade. The EU could back out of deals with Canada if its current ‘adequate’ designation is lowered. | Yes. If Canada’s trade with the EU is threatened enough, it could trigger a change in policy. |
| The legislation makers and enforcers. | They regulate all lawmaking in Canada and are responsible for PIPEDA, the Digital Privacy Act, the Canadian Charter of Rights and Freedoms, and related legislation. They include the House of Commons Standing Committee on Access to Information, Privacy and Ethics (ETHI), tasked with reviewing Canada’s privacy laws and effecting legislative reforms; the Tri-Council Policy Statement (TCPS), Ethical Conduct for Research Involving Humans, which sets the standard for research ethics boards in Canada;[32] and the Canadian Human Rights Commission, which is involved in policy development with respect to human rights violations.[33] | Yes. As the central lawmakers and enforcement agencies, they are directly responsible for policy change. |
| Policy consultants. | Provide consultation, policy recommendations, and critical studies of the current data-driven landscape to government officials and the public. | Somewhat. They produce many recommendations and lengthy reports, but none of them have legislative power. |
| Academic research ethics boards. | REBs and IRBs regulate research ethics violations in academia and could be valuable models on which to base industry review boards and policy. | Somewhat. In an academic context these bodies have regulatory power, but in an industry context they have little to no influence over research practices. |
| Data scientists and researchers. | They implement the research the policy addresses. | Not much. As practitioners, they can enact their own ethics in industry research activities and may influence employers. |
The value of Big Data extends beyond its present use to its potential future use.[35] As a citizen, it is nearly impossible to know when, where, or how your data will be acted on by an interested party, and this has implications for future discriminatory sentencing, police profiling, social control (such as China’s social credit system, expected in 2020), manipulative advertising, and geo-tracking.[36] Companies retain data indefinitely, selling it through data brokers and other third parties. The distribution of personal information is no longer the sole concern: data footprints put public and private reputations at risk and create potential for future discrimination,[37] with minorities, racialized, and low-income groups at the highest risk[38] due to the often biased and skewed nature of big data sets.[39] The wider public interest lies in protecting people from discrimination, persecution, and profiling, and in targeting industries that don’t have the public’s best interests at heart. Canadians need to be assured that private actors are handling their data appropriately, and that public sector actors (namely security and law enforcement) are likewise bound by new and enhanced standards.[40]
One of the main conflicts around the issue of research ethics is the definition of meaningful consent. Implied consent is argued by industry stakeholders to be the standard, and has been backed by court findings in some cases.[41] Years of weaker opt-out models under Canada’s anti-spam laws have made it difficult to implement stronger opt-in consent models,[42] and companies are increasingly manipulative in how consent is obtained and for what purpose. Notions of what constitutes a ‘risk of harm’ are also at issue and heavily contested. With respect to the general objection to disclosing harms, some members of industry have questioned whether the disclosure of all potential harms would lead to overly long communications, and one organization suggested that such disclosures would create an unacceptable level of civil liability.[43]
Policy Recommendation
The central issue with any policy to date has been its lack of enforceability.[44] For this reason, we suggest that, before moving forward, a federal third-party regulatory body be established with the power to enact and enforce legislation and to conduct assessments and audits.
- This governing body should be formed with both private and public interests in mind, and would maintain best practices with respect to data-driven research. This is further supported by the IAF’s 2015 recommendations.[45]
- Such a third-party regulatory body could be modelled on current REBs and IRBs, which provide ethical controls in an academic context.[46]
- It would consider existing acts and regulations centred on data privacy, including the right to be forgotten, de-indexing and source takedowns,[47] and privacy-by-design.[48][49]
- It may also be used to enforce stronger anti-competition laws involving deceptive marketing practices and truth in advertising, which could deter deceptive research activities using Big Data and AI.[50]
Endnotes and Further Reference
[1] See: Information and Privacy Commissioner of Ontario. (2017). Big Data Guidelines (p. 23). Retrieved from https://www.ipc.on.ca/wp-content/uploads/2017/05/bigdata-guidelines.pdf; Manyika, J. (2011). Big data: The next frontier for innovation, competition, and productivity. McKinsey & Company. Retrieved from https://ci.nii.ac.jp/naid/20001705886/; and Einstein, M. (2016). Black Ops Advertising: Native Ads, Content Marketing and the Covert World of the Digital Sell. New York: OR Books. Retrieved from http://ebookcentral.proquest.com/lib/utoronto/detail.action?docID=4673446.
[2] The Canadian Government harvests, generates and employs data in many forms — including but not limited to tax data, social programs, health data, weather information, crop inventories, business program activities, and this may increase with prospective smart city plans. See: Government of Canada. (2018, February 19). Big data and innovation: key themes for competition policy in Canada. Retrieved October 6, 2018, from http://www.competitionbureau.gc.ca/eic/site/cb-bc.nsf/eng/04342.html.
[3] In academia, regulations include human subject protocols as well as research ethics boards (REBs) and institutional review boards (IRBs); however, there are concerns about the efficacy of these review boards. See Zook, M., Barocas, S., Boyd, D., Crawford, K., Keller, E., Gangadharan, S. P., … Pasquale, F. (2017). Ten simple rules for responsible big data research. PLOS Computational Biology, 13(3), e1005399. https://doi.org/10.1371/journal.pcbi.1005399; and Lewis, S. P., & Seko, Y. (2017). We tend to err on the side of caution: Ethical challenges facing Canadian research ethics boards when overseeing Internet research. In M. Zimmer & K. Kinder-Kurlanda (Eds.), Internet Research Ethics for the Social Age: New Challenges, Cases, and Contexts. New York: Peter Lang.
In industry there are very few equivalent regulatory bodies — some companies, such as Facebook and Google, have set up their own internal review boards although their leaders still take very little responsibility for their actions. See Frenkel, S., Confessore, N., Kang, C., Rosenberg, M., & Nicas, J. (2018, November 16). Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis. The New York Times. Retrieved from https://www.nytimes.com/2018/11/14/technology/facebook-data-russia-election-racism.html; and Scott Cleland in Streitfeld, D. (2013, March 12). Google Concedes That Drive-By Prying Violated Privacy. The New York Times. Retrieved from https://www.nytimes.com/2013/03/13/technology/google-pays-fine-over-street-view-privacy-breach.html.
[4] Big data collection and analysis has been around for decades, particularly in travel rewards programs and the insurance industry. However, data at this scale, combined with new computational power for combining data and producing patterns, has created a new landscape of digital information.
[5] The Snowden revelations in 2013 concerning metadata collection in the United States provided a glimpse into what is surely a much deeper picture of dataveillance in today’s world. See Dijck, J. van. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208; Hardy, Q. (2012, June 4). Rethinking Privacy in an Era of Big Data. The New York Times. Retrieved November 11, 2018, from https://bits.blogs.nytimes.com/2012/06/04/rethinking-privacy-in-an-era-of-big-data/; and Information Accountability Foundation (IAF). (2015). Unified Ethical Frame for Big Data Analysis IAF Big Data Ethics Initiative, Part A. Retrieved from http://informationaccountability.org/wp-content/uploads/IAF-Unified-Ethical-Frame.pdf.
[6] For example, Facebook’s 2014 emotional contagion study, which resulted in the emotional manipulation of 700,000 users; Google’s 2013 Street View drive-by, which revealed large-scale scraping of personal information, including passwords, from home computers; Cambridge Analytica’s manipulation of Facebook users in the 2016 US election and 2017 Brexit referendum; and the Banksy geographical study from 2016. See also: Scott Cleland in Streitfeld, D. (2013, March 12). Google Concedes That Drive-By Prying Violated Privacy. The New York Times. Retrieved from https://www.nytimes.com/2013/03/13/technology/google-pays-fine-over-street-view-privacy-breach.html; Consumer Watchdog, & Privacy Rights Clearing House. (2016). Complaint, Request for Investigation, Injunction, and Other Relief in the Matter of Google Inc.’s Change in Data Use Policies (Complaint No. 20580) (p. 27). Washington, DC: Federal Trade Commission. Retrieved from https://www.consumerwatchdog.org/resources/ftc_google_complaint_12-5-2016docx.pdf.
[7] For example, loans based on social networks (Facebook, 2015), and police profiling in the United States, see: Noble, S. U. (2018). Algorithms of oppression: how search engines reinforce racism. New York: New York University Press, 248; O’Neil, C. (2016). Weapons of math destruction: how big data increases inequality and threatens democracy. Great Britain: Allen Lane, 256; Gangadharan, Seeta Pena (ed.) (2014). Data and Discrimination: Collected Essays. Washington, D.C.: Open Technology Institute, New America Foundation; and ACLU. (2018). First Amendment Lawsuit brought on Behalf of Academic Researchers Who Fear Prosecution Under the Computer Fraud and Abuse Act. Washington, D.C.: ACLU, November 17. https://www.aclu.org/news/judge-allows-aclu-case-challenging-law-preventing-studies-big-data-discrimination-proceed.
[8] Your data is connected to everyone you have had contact with online — networked connections implicate others in your behaviour and activities, and you in theirs. See: Dencik, L., Hintz, A., & Cable, J. (2016). Towards data justice? The ambiguity of anti-surveillance resistance in political activism. Big Data & Society, 3(2), 2053951716679678. https://doi.org/10.1177/2053951716679678; and Dijck, J. van. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208.
[9] Altman et al. (2018) investigated the risk of the collection and use of personal data over time, concluding that there are significant risks to individuals as well as to correlated groups and society at large. See: Altman, M., Wood, A., O’Brien, D. R., & Gasser, U. (2018). Practical approaches to big data privacy over time. International Data Privacy Law, 8(1), 29–51. https://doi.org/10.1093/idpl/ipx027.
[10] Metcalf, J., & Crawford, K. (2016). Where are human subjects in Big Data research? The emerging ethics divide. Big Data & Society, 3(1), 2053951716650211. https://doi.org/10.1177/2053951716650211, pp.2. See also: Zook, M., Barocas, S., Boyd, D., Crawford, K., Keller, E., Gangadharan, S. P., … Pasquale, F. (2017). Ten simple rules for responsible big data research. PLOS Computational Biology, 13(3), e1005399. https://doi.org/10.1371/journal.pcbi.1005399; and Hasselbalch, G., Tranberg, P., ApS, P., & Lind, P.-O. (2016). Data Ethics: The New Competitive Advantage. The authors (self-published), supported by the Internet Society.
[11] CIGI discusses the federal government’s approach to legislation in its 2018 papers. See: Centre for International Governance Innovation (CIGI). (2018). A National Data Strategy for Canada: Key Elements and Policy Considerations (CIGI Papers No. 160) (p. 5). CIGI. Retrieved from https://www.cigionline.org/sites/default/files/documents/Paper%20no.160_3.pdf.
[12] As the Centre for International Governance Innovation (CIGI) states, “all companies will soon become data and IP companies,” including government organizations. The combined valuation of current data-driven companies in the United States is estimated to be in the trillions; Google (US$727 billion), Facebook (US$517 billion), and Uber (US$50 billion) are some of the biggest players. From Centre for International Governance Innovation (CIGI). (2018). A National Data Strategy for Canada: Key Elements and Policy Considerations (CIGI Papers No. 160) (p. 2). CIGI. Retrieved from https://www.cigionline.org/sites/default/files/documents/Paper%20no.160_3.pdf.
[13] Global research and advisory company Gartner Inc. predicted that “by 2018, 50% of business ethics violations will occur due to improper use of big data.” Gartner: Fueling the Future of Business. (n.d.). Retrieved November 18, 2018, from https://www.gartner.com/en, in Hasselbalch, G., Tranberg, P., ApS, P., & Lind, P.-O. (2016). Data Ethics: The New Competitive Advantage. The authors (self-published), supported by the Internet Society, p. 11.
[14] There is a need to define what constitutes a ‘human subject’ in Big Data research, as the current definition of harm falls short of protecting subjects in an internet research context, where their information is deemed publicly accessible in many cases. “The Common Rule [the primary regulation governing human-subjects research in the USA] assumes that data which is already publicly available cannot cause any further harm to the individual.” boyd, danah, & Crawford, K. (2012). Critical questions for Big Data. Information, Communication & Society, 15(5), 662–679. https://doi.org/10.1080/1369118X.2012.678878, p. 3.
[15] The General Data Protection Regulation (GDPR) became legally applicable in May 2018. It applies to EU citizens and extends to any foreign company or organization that interacts with EU citizens. The GDPR claims to support antitrust and consumer protection laws; however, one of the downsides to the GDPR and subsequent legislation is that it places the greatest burden on small businesses, which don’t necessarily have the resources to adapt to new privacy measures quickly, leading to greater monopolies for corporations that can afford to update their procedures. The GDPR is described here: EU GDPR Education Partners. (2018). EU GDPR Information Portal. Retrieved October 6, 2018, from https://eugdpr.org.
[16] Geist, M. (2018, March 5). No longer fit for purpose: Why Canadian privacy law needs an update. The Globe and Mail. Retrieved from https://www.theglobeandmail.com/report-on-business/rob-commentary/no-longer-fit-for-purpose-why-canadian-privacy-law-needs-an-update/article38214804/.
[17] PIPEDA is mainly concerned with privacy and consent with respect to personal information. The Act is intended to prevent organizations from collecting information by misleading or deceiving individuals about the purpose for which information is being collected. See Personal Information Protection and Electronic Documents Act, Part 1, Principle 4.3.5. In Minister of Justice Canada. (2000). Personal Information Protection and Electronic Documents Act. Retrieved October 6, 2018, from http://laws-lois.justice.gc.ca/eng/acts/P-8.6/FullText.html.
[18] The Privacy Commissioner of Canada is an Agent of Parliament whose mission is to protect and promote privacy rights in Canada.
[19] Information and Privacy Commissioner of Ontario. (2017). Big Data Guidelines. Retrieved from https://www.ipc.on.ca/wp-content/uploads/2017/05/bigdata-guidelines.pdf, pp.23.
[20] The OPC’s ‘Privacy Priorities 2015-2020’ (“Priorities Paper”) promised to build a normative framework around new business models and to identify enhancements to the consent model so that concerns raised by individuals and organizations are addressed. See: Office of the Privacy Commissioner of Canada. (June 2015). The OPC privacy priorities 2015-2020: mapping a course for greater protection (Monograph No. IP54-62/2015E) (p. 17). Government of Canada. Retrieved from http://publications.gc.ca/site/eng/9.801466/publication.html, p. 11.
[21] Justice Minister Jody Wilson-Raybould announced in 2016 that she had instructed her officials to begin “concentrated work” toward modernizing the law, yet no concrete proposal has yet been made public, says Daniel Therrien, Privacy Commissioner of Canada, in his 2018 report. The Office of the Privacy Commissioner of Canada. (2018, September 27). News release: Privacy Commissioner denounces slow progress on fixing outdated privacy laws [Government]. Retrieved November 11, 2018, from https://www.priv.gc.ca/en/opc-news/news-and-announcements/2018/nr-c_180927/.
[22] A new breach reporting regulation was issued in early November 2018 which requires companies to reveal when customer data has been compromised. Non-compliance with this act could result in fines up to $100,000, civil lawsuits, investigation by the OPC, and reputational damage. This recommendation was made by the OPC in the 2015 Digital Privacy Act and took three years to implement, prompted finally by public outcry over recent large-scale data breaches at companies such as Uber, Facebook, Nissan, and Equifax. See: Government of Canada. (1985, last amended on April 1, 2018). Consolidated federal laws of Canada, Digital Privacy Act (p. 21). Retrieved from http://laws-lois.justice.gc.ca/eng/annualstatutes/2015_32/page-1.html; Fraser, D., & Dykema, S. A. (2018, August 3). Digital Privacy Act: New Data Breach Response Obligations Effective November 1. Retrieved November 11, 2018, from https://www.mcinnescooper.com/publications/the-digital-privacy-act-5-faqs-about-the-new-mandatory-breach-response-obligations-effective-november-1-2018/.
[23] “We also drew attention to the lack of standards and oversight over the personal information handling practices of political parties. The government introduced legislation intended to respond to this important gap. Bill C-76, however, adds nothing of substance in terms of privacy protection. Rather than impose internationally recognized standards, the bill leaves it to parties to define the rules they want to apply.” Daniel Therrien in Office of the Privacy Commissioner of Canada. (2018). 2017-18 Annual Report to Parliament on the Personal Information Protection and Electronic Documents Act and the Privacy Act (Annual Report to Parliament No. IP51-1E-PDF). Gatineau, QC: Office of the Privacy Commissioner of Canada. Retrieved from https://www.priv.gc.ca/en/opc-actions-and-decisions/ar_index/201718/ar_201718/.
[24] The OPC’s ‘Guidelines for Consent’ outlined seven principles of how to determine an appropriate form of consent and were published one day before the EU’s GDPR was enforced on May 25th, 2018.
[25] Canada, House of Commons. (2018). Towards Privacy by Design: Review of the Personal Information Protection and Electronic Documents Act. Report on the Standing Committee on Access to Information, Privacy and Ethics. Ottawa: 42nd Parliament, First Session. http://www.ourcommons.ca/Content/Committee/421/ETHI/Reports/RP9690701/ethirp12/ethirp12-e.pdf.
[26] Some corporations, such as Facebook, now have their own internal review boards but these come with their own obvious biases in terms of fair regulation practices.
[27] “Generally speaking, the documents set out a combination of legal requirements and best practices that detail our expectations regarding what compliance entails,” says Daniel Therrien, but the guidelines cannot be used to establish legal standards; that is the responsibility of the Canadian Government. Office of the Privacy Commissioner of Canada. (2018). 2017-18 Annual Report to Parliament on the Personal Information Protection and Electronic Documents Act and the Privacy Act (Annual Report to Parliament No. IP51-1E-PDF). Gatineau, QC: Office of the Privacy Commissioner of Canada. Retrieved from https://www.priv.gc.ca/en/opc-actions-and-decisions/ar_index/201718/ar_201718/.
[28] Office of the Privacy Commissioner of Canada. (2018, September 27). News release: Privacy Commissioner denounces slow progress on fixing outdated privacy laws [Government]. Retrieved November 11, 2018, from https://www.priv.gc.ca/en/opc-news/news-and-announcements/2018/nr-c_180927/.
[29] Office of the Privacy Commissioner of Canada. (2018, September 27). News release: Privacy Commissioner denounces slow progress on fixing outdated privacy laws [Government]. Retrieved November 11, 2018, from https://www.priv.gc.ca/en/opc-news/news-and-announcements/2018/nr-c_180927/.
[30] The Competition Bureau condemns representations made to the public that are false or misleading in a material respect. See section 74.01(1)(a) of the Competition Act.
[31] The Minister of Innovation, Science and Economic Development proposed recommendations to amend PIPEDA, agreeing that “changes are required to our privacy regime.” From The Office of the Privacy Commissioner of Canada. (2018). 2017-18 Annual Report to Parliament on the Personal Information Protection and Electronic Documents Act and the Privacy Act (Annual Report to Parliament No. IP51-1E-PDF). Gatineau, QC: Office of the Privacy Commissioner of Canada. Retrieved from https://www.priv.gc.ca/en/opc-actions-and-decisions/ar_index/201718/ar_201718/.
[32] The TCPS exempts most data-based subjects from their ethics considerations, however. Government of Canada, I. A. P. on R. E. (2016, February 5). Interagency Advisory Panel on Research Ethics. Retrieved November 19, 2018, from http://www.pre.ethics.gc.ca/eng/policy-politique/initiatives/tcps2-eptc2/Default/. See also: Metcalf, J., & Crawford, K. (2016). Where are human subjects in Big Data research? The emerging ethics divide. Big Data & Society, 3(1), 2053951716650211. https://doi.org/10.1177/2053951716650211.
[33] Canadian Human Rights Commission. (2018). Canadian Human Rights Commission. Retrieved November 19, 2018, from https://www.chrc-ccdp.gc.ca/eng/content/about-us.
[34] Centre for International Governance Innovation (CIGI). (2018). A National Data Strategy for Canada: Key Elements and Policy Considerations (CIGI Papers No. 160) (p. 24). CIGI. Retrieved from https://www.cigionline.org/sites/default/files/documents/Paper%20no.160_3.pdf.
[35] Hasselbalch and Tranberg describe the competitive advantage of data in their 2016 book. Hasselbalch, G., Tranberg, P., ApS, P., & Lind, P.-O. (2016). Data Ethics: The New Competitive Advantage. The authors (self-published), supported by the Internet Society, p. 20.
[36] See: Einstein, M. (2016). Black Ops Advertising: Native Ads, Content Marketing and the Covert World of the Digital Sell. New York: OR Books. Retrieved from http://ebookcentral.proquest.com/lib/utoronto/detail.action?docID=4673446.
[37] Altman et al. (2018) discuss the documented implications of harm to human subjects in longitudinal studies and argue that similar assumptions of harm could be drawn and applied to big data research and collection. See: Altman, M., Wood, A., O’Brien, D. R., & Gasser, U. (2018). Practical approaches to big data privacy over time. International Data Privacy Law, 8(1), 29–51. https://doi.org/10.1093/idpl/ipx027.
[38] Eubanks acknowledges the effect of data-driven processes, such as data mining, policy algorithms, and predictive risk models on low-income and working-class people in the United States. Eubanks, V. (2018). Automating Inequality. St. Martin’s Press. Retrieved from https://us.macmillan.com/automatinginequality/virginiaeubanks/9781250074317.
[39] The discriminatory tendencies of Big Data sets can be seen as a mirror of society in some ways: the data itself is not inherently value-laden, but reflects the manner in which society is currently structured and the operational biases of those who create and study the data. See also: Noble, S. U. (2018). Algorithms of oppression: how search engines reinforce racism. New York: New York University Press.
[40] CIGI has been outspoken about the rights of the public and the risks of data-driven research. See: Centre for International Governance Innovation (CIGI). (2018). A National Data Strategy for Canada: Key Elements and Policy Considerations (CIGI Papers No. 160). CIGI. Retrieved from https://www.cigionline.org/sites/default/files/documents/Paper%20no.160_3.pdf, p. 8.
[41] Office of the Privacy Commissioner of Canada. (2018, May 24). Commentary of the Office of the Privacy Commissioner on feedback received through the 2017 consent guidance consultation. Retrieved November 12, 2018, from https://www.priv.gc.ca/en/about-the-opc/what-we-do/consultations/consultation-on-consent-under-pipeda/consent_com_201805/.
[42] Geist, M. (2018, March 5). No longer fit for purpose: Why Canadian privacy law needs an update. The Globe and Mail. Retrieved from https://www.theglobeandmail.com/report-on-business/rob-commentary/no-longer-fit-for-purpose-why-canadian-privacy-law-needs-an-update/article38214804/.
[43] Office of the Privacy Commissioner of Canada. (2018, May 24). Commentary of the Office of the Privacy Commissioner on feedback received through the 2017 consent guidance consultation. Retrieved November 12, 2018, from https://www.priv.gc.ca/en/about-the-opc/what-we-do/consultations/consultation-on-consent-under-pipeda/consent_com_201805/.
[44] In Canada, “there are still very few controls established for best practice or protection of the public privacy aside from confusing and vague consent forms, lack of transparency as to what the data is being used for, and by whom, how long it is kept and what exactly is being collected. It is not enough to ask companies to enforce more transparent privacy policies, there needs to be more action from the government regulatory bodies,” states Privacy Commissioner Daniel Therrien in the OPC’s 2017-2018 annual report. Office of the Privacy Commissioner of Canada. (2018). 2017-18 Annual Report to Parliament on the Personal Information Protection and Electronic Documents Act and the Privacy Act (Annual Report to Parliament No. IP51-1E-PDF). Gatineau, QC: Office of the Privacy Commissioner of Canada. Retrieved from https://www.priv.gc.ca/en/opc-actions-and-decisions/ar_index/201718/ar_201718/.
[45] “Organizations must conduct big data assessments that look at all stakeholders, define the full range of risks and benefits from analytics for all stakeholders, focus on whether risk mitigation is effective, and make a determination that, on balance, the big data undertaking is legal, fair, and just and demonstrate how that determination was reached. Such an assessment framework would be part of a comprehensive privacy management program overseen by the OPC.” Information Accountability Foundation (IAF). (2015). IAF Proposal for OPC’s “Contributions Program: Privacy Research and Related Knowledge Translation Initiatives” (p. 15). Information Accountability Foundation. Retrieved from http://informationaccountability.org/wp-content/uploads/IAF-Proposal-OPC-Contributions-Program.pdf, p. 6.
[46] While REB and IRBs have come under scrutiny for not providing adequate oversight over data-driven studies, they do provide more accountability with respect to research ethics than anything that exists in an industry context.
[47] Office of the Privacy Commissioner of Canada. (2018). 2017-18 Annual Report to Parliament on the Personal Information Protection and Electronic Documents Act and the Privacy Act (Annual Report to Parliament No. IP51- 1E-PDF). Gatineau, QC: Office of the Privacy Commissioner of Canada. Retrieved from https://www.priv.gc.ca/en/opc-actions-and-decisions/ar_index/201718/ar_201718/.
[48] For example, Cavoukian, A., & Weiss, J. B. (2012). Privacy by Design and User Interfaces: Emerging Design Criteria - Keep it User-Centric (p. 15). Information & Privacy Commissioner Ontario, Canada. Retrieved from https://www.ipc.on.ca/wp-content/uploads/Resources/pbd-user-interfaces_Yahoo.pdf.
[49] Canada, House of Commons. (2018). Towards Privacy by Design: Review of the Personal Information Protection and Electronic Documents Act. Report on the Standing Committee on Access to Information, Privacy and Ethics. Ottawa: 42nd Parliament, First Session. http://www.ourcommons.ca/Content/Committee/421/ETHI/Reports/RP9690701/ethirp12/ethirp12-e.pdf.
[50] Privacy might be something that distinguishes certain companies from others and leads to market preferences. “For example, DuckDuckGo search engine, which attempts to distinguish itself by promising not to track users and otherwise highlighting privacy.” Government of Canada. (2018, February 19). Big data and innovation: key themes for competition policy in Canada. Retrieved October 6, 2018, from http://www.competitionbureau.gc.ca/eic/site/cb-bc.nsf/eng/04342.html, p. 9.