Leila Mohaghegh

Introduction

Artificial intelligence represents a major revolution in the world of technology. “AI refers to the ability of a machine to perform cognitive functions we associate with human minds, such as perceiving, reasoning, learning, interacting with the environment, problem-solving, and even exercising creativity.”[1]

Artificial intelligence reflects a growing tendency to turn to algorithms for advice, or to turn decisions over to them altogether. “Intelligence” means the ability to make predictions and solve complex tasks.[2] Algorithms can do anything that can be coded, as long as they have access to the data they need, at the required speed, and are put into a design framework that allows for the execution of the tasks thus determined. The effectiveness of algorithms is increasingly enhanced through “Big Data”: the availability of an enormous amount of data on all human activity and other processes in the world, which allows a particular type of AI known as “machine learning” to draw inferences about what happens next by detecting patterns.[3]

AI needs to process data to expand its capabilities and to perform certain tasks. Understanding this close linkage between AI and data is crucial in practice, since data collection may conflict with individuals’ right to privacy and with their autonomy over their data.

Private companies are extraordinarily powerful gatekeepers of information and communication: they control our expressive activity, our associations with others, and our access to information. In many areas, these private companies are increasingly being deployed as agents of governance and regulation. Today, technology more often serves to reinforce and consolidate state power.[4]

There is concern that artificial intelligence will disrupt the balance of power in the world. A related concern is that private-sector companies will monopolize these technologies, leaving humans as mere observers, captive to multinational corporations.

Privacy

Under the Oxford Dictionary of Law Enforcement, privacy is the right to be left alone and to keep certain matters secluded from public view. The right includes the privacy of communications (telephone calls, correspondence, etc.); privacy of the home and office; environmental protection (including freedom from excessive noise); the protection of physical integrity; protection from unjustified prosecution and conviction of those engaged in consensual nonviolent sexual activities; and protection from being photographed and described if the individual has a reasonable expectation of privacy. This right is a qualified right. Public authorities have a limited but positive duty to protect privacy from interference by third parties.[5]

Several national constitutions promise protection of “privacy.” The United States Bill of Rights never uses the word, but proclaims “the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures…” Article 12 of the Universal Declaration of Human Rights says: “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks on his honour and reputation.” This is repeated almost verbatim in Article 17 of the International Covenant on Civil and Political Rights. The European Convention on Human Rights, Article 8, says: “Everyone has the right to respect for his private and family life, his home and his correspondence.”[6]

The European Union (EU) is considered to have strong privacy laws, with transparency as a core tenet. The GDPR is an attempt “to give individuals more control over how their data are collected, used, and protected online.” It also creates more oversight and restrictions on how companies use this data.[7]

From a human rights perspective, the right to privacy could be regarded as an extension of human dignity, which has been confirmed by court rulings and legal texts such as the Universal Declaration of Human Rights and the European Convention on Human Rights (Article 8). Unlike human dignity, the ownership of data can usually be transferred to third parties by consent, which also applies to how the data are used. The General Data Protection Regulation (GDPR)[8] reads, ‘where processing is based on the data subject’s consent, the controller should be able to demonstrate that the data subject has given consent to the processing operation’.[9]

The Council of Europe Commissioner for Human Rights states in its recommendation that:

“The development, training, testing, and use of AI systems that rely on the processing of personal data must fully secure a person’s right to respect for private and family life under Article 8 of the European Convention on Human Rights, including the “right to a form of informational self-determination” in relation to their data.”[10]

“In December 2013, the United Nations General Assembly adopted resolution 68/167, which expressed deep concern at the negative impact that surveillance and interception of communications may have on human rights. The General Assembly affirmed that the rights held by people offline must also be protected online, and it called upon all States to respect and protect the right to privacy in digital communication. The General Assembly called on all States to review their procedures, practices, and legislation related to communications surveillance, interception, and collection of personal data and emphasized the need for States to ensure the full and effective implementation of their obligations under international human rights law.”[11]

Governments frequently collect information from citizens, and sometimes government action benefits the people and prevents widespread harm. For instance, governments have collected large amounts of data in the interests of society to fight the coronavirus. Digital check-in systems, wristband trackers, and mobile applications are just some examples of the surveillance technology implemented by governments to monitor and track the movement of people as they seek to stem the spread of the virus. Massive amounts of data have been collected to provide policymakers with accurate and efficient information to help manage resources, and ultimately shape health and social policies. The European Data Protection Board and the Organisation for Economic Co-operation and Development have called on governments to cease and reverse the exceptional use of data when the pandemic is over.[12]

Personal and Non-Personal Data

The term used in the European Union is “personal data,” while U.S. statutes use a variety of terms to identify personal data, the most common being personally identifiable information, or PII.[13] When dealing with data protection, identifying the boundaries of the definition of personal data is the first milestone: if data are considered personal data, then privacy law applies; if data do not fall under this category, then privacy law does not apply.[14]

Under the Oxford Dictionary of the Internet, personal data is defined as “any data that relate to a living person who can be identified from those data either directly or indirectly (i.e. by using additional information that is available to the data controller).”[15]

In the European Union, personal data flows are governed by the General Data Protection Regulation (GDPR) (2016/679), which replaced the EU’s Data Protection Directive (95/46/EC) in 2018.[16] The term “personal data” is the entryway to the application of the GDPR: the Regulation applies only if the processing of data concerns personal data. The term is defined in Art. 4(1): personal data are any information relating to an identified or identifiable natural person.[17]

The data subjects are identifiable if they can be directly or indirectly identified, especially by reference to an identifier such as a name, an identification number, location data, an online identifier, or one of several special characteristics, which expresses the physical, physiological, genetic, mental, commercial, cultural or social identity of these natural persons. In practice, these also include all data which are or can be assigned to a person in any kind of way. For example, the telephone, credit card, or personnel number of a person, account data, number plate, appearance, customer number, or address are all personal data.[18]
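As a rough illustration only (not a legal test), the identifier-based notion described above can be sketched in a few lines of code. The field names below are hypothetical, and in practice context can make almost any data identifying:

```python
# Hypothetical sketch of the identifier-based notion of personal data.
# A record is flagged here if it carries any field commonly treated as a
# direct or indirect identifier under Art. 4(1) GDPR. This is an
# illustration, not a legal test.

IDENTIFIER_FIELDS = {
    "name", "id_number", "location", "online_id",
    "phone", "credit_card", "account", "number_plate", "address",
}

def looks_personal(record: dict) -> bool:
    """Return True if the record carries a common identifier field."""
    return any(field in IDENTIFIER_FIELDS for field in record)

print(looks_personal({"name": "B. Example", "plan": "basic"}))  # True
print(looks_personal({"wind_speed_ms": 9.8, "rotor_rpm": 14}))  # False
```

The real legal analysis is far broader: identifiability depends on all means reasonably likely to be used, not merely on which fields a record happens to contain.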

In November 2018, the European Parliament and Council passed the Regulation (EU) 2018/1807 on a framework for the free flow of non-personal data in the European Union.[19]

In Europe, non-personal data can be categorized as:

  • data which originally did not relate to an identified or identifiable natural person, such as data on weather conditions generated by sensors installed on wind turbines, or data on maintenance needs for industrial machines; or
  • data which was initially personal data, but was later made anonymous.[20]

Therefore, there is data that is always non-personal (because it never related to an identified or identifiable natural person) and there is also data that once was personal but no longer is (as linkage to a natural person has been removed).[21]
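The second category, removing the linkage to a natural person, can be sketched as follows. This is a simplified illustration with hypothetical field names; note that merely dropping direct identifiers does not by itself achieve anonymization in the GDPR sense, since indirect identification may remain possible:

```python
# Hypothetical sketch: removing the link between a record and a natural
# person. Field names are illustrative. Real anonymization must also
# address indirect re-identification, which this sketch only gestures at
# by coarsening the postcode.

DIRECT_IDENTIFIERS = {"name", "email", "customer_number"}

def strip_identifiers(record: dict) -> dict:
    """Drop direct identifiers and coarsen a quasi-identifier."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "postcode" in out:
        # An exact postcode can single a person out; keep only a coarse area.
        out["postcode"] = out["postcode"][:2] + "***"
    return out

# Data that never related to a person stays untouched by this question:
turbine_reading = {"turbine_id": "WT-17", "wind_speed_ms": 11.3}

customer_record = {"name": "A. Example", "email": "a@example.com",
                   "postcode": "10115", "purchase_total": 42.0}
print(strip_identifiers(customer_record))
# {'postcode': '10***', 'purchase_total': 42.0}
```

Whether such a transformation truly takes the data outside the GDPR depends on whether re-identification remains reasonably likely, not merely on which fields were deleted.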

Conclusion

Artificial intelligence affects human rights in two ways: it can both promote human rights and obstruct them, depending on the government that deploys the technology. In other words, just as artificial intelligence can serve as a tool for advancing human rights, it can also be used to suppress them by violating privacy and intruding into people’s lives.

The current legal frameworks governing artificial intelligence do not guarantee accountability. Because of the complexity and rapid growth of these systems, the lack of accountability and oversight is a growing concern. Artificial intelligence systems and their creators require direct oversight by international organizations and human rights monitors. Given the vast amount of data involved, an international legal framework establishing uniform rules to protect individual privacy is necessary. This would spare international companies from having to comply with different, and sometimes contradictory, laws and regulations in different countries.

Uniform international standards on the right to privacy should be established through a comprehensive international agreement that benefits both technology companies and individuals. As long as divergent rules and regulations persist, technology companies must maintain separate policies in each country, and this harms individuals.

Bibliography

  • Anwar, Nessa, “Governments have collected large amounts of data to fight the coronavirus. That’s raising privacy concerns”, (17 August 2020), online: CNBC <https://www.cnbc.com/2020/08/17/governments-collected-data-to-fight-coronavirus-raising-privacy-concerns.html>.
  • Basu, Arindrajit, Elonnai Hickok & Aditya Singh Chawla, The Localisation Gambit – Unpacking Policy Measures for Sovereign Control of Data in India (The Centre for Internet and Society, India, 2019).
  • Butterfield, Andrew, Gerard Ekembe Ngondi & Anne Kerr, eds, personal data (Oxford University Press, 2016).
  • “Commission publishes guidance on free flow of non-personal data – Questions and Answers”, (29 May 2019), online: European Commission <https://ec.europa.eu/commission/presscorner/detail/en/MEMO_19_2750>.
  • Finck, Michele & Frank Pallas, “They who must not be identified – distinguishing personal from non-personal data under the GDPR” (2020) 10:1 International Data Privacy Law 26.
  • “GDPR Personal Data”, online: General Data Protection Regulation (GDPR) <https://gdpr-info.eu/issues/personal-data/>.
  • Giuffrida, Iria, “Liability for AI Decision-Making: Some Legal and Ethical Considerations” (2019) 88:2 Fordham Law Review 439.
  • Griffin, James, “The Human Right to Privacy” (2007) 44:4 San Diego Law Review 27.
  • Gooch, Graham & Michael Williams, privacy (Oxford University Press, 2015).
  • Kriebitz, Alexander & Christoph Lütge, “Artificial Intelligence and Human Rights: A Business Ethical Assessment” (2020) 5:1 Business and Human Rights Journal 84–104.
  • Land, Molly K & Jay D Aronson, “Human Rights and Technology: New Challenges for Justice and Accountability” (2020) 16:1 Annual Review of Law and Social Science 223–240.
  • Pappalardo, Massimiliano, “Personal data or non-personal data, that is the question! The different interpretations of ECJ and Italian Supreme Court”, (25 October 2016), online: Lexology <https://www.lexology.com/library/detail.aspx?g=804ce9b8-dfa5-4c67-bbf7-4cc3e087c2f8>.
  • Radcliff, Douglas, “Artificial Intelligence, the Rise of Technology and the Future of Human Rights”, (15 July 2020), online: Czech Centre for Human Rights and Democracy <https://www.humanrightscentre.org/blog/artificial-intelligence-rise-technology-and-future-human-rights>.
  • Risse, Mathias, “Human Rights and Artificial Intelligence: An Urgently Needed Agenda” (2018) Carr Center for Human Rights Policy – Harvard Kennedy School 22.
  • “The Right to Privacy in the Digital Age”, online: United Nations Human Rights – Office of the High Commissioner <https://www.ohchr.org/en/issues/digitalage/pages/digitalageindex.aspx>.
  • “Unboxing Artificial Intelligence: 10 Steps to Protect Human Rights – Recommendation” (2019) Council of Europe Commissioner for Human Rights, online: <https://rm.coe.int/unboxing-artificial-intelligence-10-steps-to-protect-human-rights-reco/1680946e64>.
  • Voss, W Gregory & Kimberly A Houser, “Personal Data and the GDPR: Providing a Competitive Advantage for U.S. Companies” (19 June 2020).

[1] Iria Giuffrida, “Liability for AI Decision-Making: Some Legal and Ethical Considerations” (2019) 88:2 Fordham Law Review 439 at 441.

[2] Mathias Risse, “Human Rights and Artificial Intelligence: An Urgently Needed Agenda” (2018) Carr Center for Human Rights Policy – Harvard Kennedy School 22 at 2.

[3] Ibid at 2.

[4] Molly K Land & Jay D Aronson, “Human Rights and Technology: New Challenges for Justice and Accountability” (2020) 16:1 Annual Review of Law and Social Science 223–240 at 225–226.

[5] Graham Gooch & Michael Williams, privacy (Oxford University Press, 2015).

[6] James Griffin, “The Human Right to Privacy” (2007) 44:4 San Diego Law Review 27 at 703–704.

[7] Douglas Radcliff, “Artificial Intelligence, the Rise of Technology and the Future of Human Rights”, (15 July 2020), online: Czech Centre for Human Rights and Democracy <https://www.humanrightscentre.org/blog/artificial-intelligence-rise-technology-and-future-human-rights>.

[8] Recital 42 EU GDPR

[9] Alexander Kriebitz & Christoph Lütge, “Artificial Intelligence and Human Rights: A Business Ethical Assessment” (2020) 5:1 Business and Human Rights Journal 84–104 at 95–96.

[10] “Unboxing Artificial Intelligence: 10 Steps to Protect Human Rights – Recommendation” (2019) Council of Europe Commissioner for Human Rights, online: <https://rm.coe.int/unboxing-artificial-intelligence-10-steps-to-protect-human-rights-reco/1680946e64> at 11.

[11] “The Right to Privacy in the Digital Age”, online: United Nations Human Rights – Office of the High Commissioner <https://www.ohchr.org/en/issues/digitalage/pages/digitalageindex.aspx>.

[12] Nessa Anwar, “Governments have collected large amounts of data to fight the coronavirus. That’s raising privacy concerns”, (17 August 2020), online: CNBC <https://www.cnbc.com/2020/08/17/governments-collected-data-to-fight-coronavirus-raising-privacy-concerns.html>.

[13] W Gregory Voss & Kimberly A Houser, “Personal Data and the GDPR: Providing a Competitive Advantage for U.S. Companies” (19 June 2020) at 4.

[14] Massimiliano Pappalardo, “Personal data or non-personal data, that is the question! The different interpretations of ECJ and Italian Supreme Court”, (25 October 2016), online: Lexology <https://www.lexology.com/library/detail.aspx?g=804ce9b8-dfa5-4c67-bbf7-4cc3e087c2f8>.

[15] Andrew Butterfield, Gerard Ekembe Ngondi & Anne Kerr, eds, personal data (Oxford University Press, 2016).

[16] The Localisation Gambit – Unpacking Policy Measures for Sovereign Control of Data in India, by Arindrajit Basu, Elonnai Hickok & Aditya Singh Chawla (The Centre for Internet and Society, India, 2019) at 74.

[17] “GDPR Personal Data”, online: General Data Protection Regulation (GDPR) <https://gdpr-info.eu/issues/personal-data/>.

[18] Ibid.

[19] Basu, Hickok & Chawla, supra note 16 at 75.

[20] “Commission publishes guidance on free flow of non-personal data – Questions and Answers”, (29 May 2019), online: European Commission <https://ec.europa.eu/commission/presscorner/detail/en/MEMO_19_2750>.

[21] Michele Finck & Frank Pallas, “They who must not be identified – distinguishing personal from non-personal data under the GDPR” (2020) 10:1 International Data Privacy Law 26 at 13.