By Maria Lydakaki, researcher at the «Sciences & Technology» unit
“A viable future for all” is the true meaning of sustainability and the ultimate goal of the United Nations. Sustainability is directly reflected in the attainment of a balance between two seemingly contrasting concepts: economic growth and the protection of fundamental human rights. It could be ensured through the symmetric conjunction of these opposing forces. But are those forces really opposite?
History indicates that, in practice, economic growth has often been interlinked with the suppression and blatant violation of fundamental human rights. Discrimination was both the outcome of such practices and the means of their perpetuation. From the slavery of ancient Greece and ancient Rome up until Apartheid, the sense of inferiority provoked by socially constructed inequalities served a non-holistic and non-sustainable enrichment. From Apartheid up until our modern wage gaps and unemployment, those practices have been modified, but they have certainly not vanished. Violations of human rights still remain profitable for the violators, and that catastrophic profit is nothing to be proud of. In the era of information, violations have been updated to meet the demands of the market: ancient slavery has turned into an economic, information-based and personalized “exclusion and control”. This modern kind of indirect discrimination and formation is supported by data-selling deals (2, 3), the exposure of personal information without users' consent, fake news, the illegal collection of sensitive personal data and so on. These new violations are justified by “users' convenience”, “personalized advertising” and “saving of time” in a world where time and information are valuable; and the more sensitive the information, the more valuable it is. Thus the protection of personal data, and especially of biometric data, is an integral part of sustainable economic growth and of technological development that respects human rights and prevents inequalities. Having explained why it is important to protect sensitive data in our day and age, we will now focus on the current protection of personal data in the European Union.
First and foremost, the subject of the protection ought to be defined. What is the content of the term “sensitive personal data”? According to the European Commission (1), and in particular Articles 4(13), (14) and (15), Article 9 and Recitals (51) to (56) of the GDPR, the term covers:
- personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs;
- trade-union membership;
- genetic data, biometric data processed solely to identify a human being;
- health-related data;
- data concerning a person’s sex life or sexual orientation. (4)
Biometric data are defined by the GDPR as “personal data resulting from specific technical processing relating to the physical, physiological or behavioral characteristics of a natural person, which allow or confirm the unique identification of that natural person”. Biometric data are often used to authenticate a person's identity: fingerprint scanning to unlock devices, facial-recognition software that improves security systems, and smart watches and apps that collect health data (heart rate, daily consumption of calories or water, daily exercise, eating habits) all use biometric data with the consent of the user. The benefits of using biometric data are mainly security-related. In the field of advertising, biometrics are used for creative research and analytics because they capture consumer behavior instantly. Biometrics and health data are also extremely valuable to insurance companies. Common violations include data-selling deals between “data holders” (a clinic, or a software company) and advertising or insurance companies (5); there are people who would pay a significant amount of money to see one's blood pressure! Another type of indirect discrimination based on health and biometric data can be found in the work environment. Sensitive personal data can affect one's chances of being hired or promoted, and biometric systems can be used as a means of “supervising” the personnel. “Leaks” of health data, or of other sensitive personal data (such as information on sexual orientation), can result in the exclusion of whole groups of people from “responsible positions” for reasons that have nothing to do with their qualifications and abilities. This exclusion (from work positions or insurance programs), combined with personalized advertising, can lead to hetero-determination and unequal access to services.
The current protection of sensitive personal data in Europe is based on the General Data Protection Regulation (especially Article 9), as well as on the EU Charter of Fundamental Rights (Article 8) and the European Convention on Human Rights (Article 8). According to Article 8 of the EU Charter, everyone has the right to the protection of his or her personal data, which must be processed fairly, for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law, while everyone must have access to the data collected concerning them. Last but not least, the third paragraph of the Article provides that compliance with these rules shall be subject to control by an independent authority. The Charter's protection is broad and general compared to Article 9 of the GDPR. The first paragraph of Article 9 lays down a general prohibition on the processing of sensitive data, and the following paragraphs introduce exceptions. Briefly, the processing of sensitive data can be tolerated in cases of:
- explicit consent provided by the data subject;
- necessary processing for the purposes of carrying out the obligations and exercising specific rights of the controller or of the data subject in the fields of employment and social security and social protection law;
- processing that relates to personal data which are manifestly made public by the data subject;
- necessary processing to protect the vital interests of the data subject, when the subject is physically or legally incapable of giving consent;
- necessary processing for the establishment, exercise or defence of legal claims;
- necessary processing for reasons of substantial public interest (e.g. in the area of public health);
- necessary processing for the purposes of preventive or occupational medicine, for the assessment of the working capacity of the employee, medical diagnosis, the provision of health or social care or treatment, or the management of health or social care systems, and so on. (4)
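Purely as an illustration (and in no sense legal advice), the structure of Article 9 — a default prohibition with a closed list of exceptions — can be sketched as a simple rule check. The exception labels below are paraphrases invented for this sketch, not legal definitions:

```python
# Illustrative sketch of the GDPR Article 9 structure: processing of
# special-category ("sensitive") data is prohibited by default, unless
# at least one enumerated exception applies. Labels are paraphrases.

ARTICLE_9_EXCEPTIONS = {
    "explicit_consent",            # Art. 9(2)(a)
    "employment_social_security",  # Art. 9(2)(b)
    "vital_interests",             # Art. 9(2)(c)
    "manifestly_made_public",      # Art. 9(2)(e)
    "legal_claims",                # Art. 9(2)(f)
    "substantial_public_interest", # Art. 9(2)(g)
    "health_or_social_care",       # Art. 9(2)(h)
}

def processing_permitted(grounds: set) -> bool:
    """Return True only if some recognised exception applies.
    The default answer is False: prohibition is the rule."""
    return bool(grounds & ARTICLE_9_EXCEPTIONS)

# A data-selling deal that invokes no recognised ground falls under
# the general prohibition:
print(processing_permitted(set()))                 # False
print(processing_permitted({"explicit_consent"}))  # True
```

The point of the sketch is the default: absent a positive ground, the answer is “prohibited” — the burden lies on the controller to establish an exception, not on the data subject to object.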
In all cases the processing must be necessary and conducted under the principles of lawfulness, fairness and transparency laid down in Article 5, and the purpose of the processing must be specified. According to Article 52 of the EU Charter, “Any limitation on the exercise of the rights and freedoms recognised by this Charter must be provided for by law and respect the essence of those rights and freedoms. Subject to the principle of proportionality, limitations may be made only if they are necessary and genuinely meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others.” Any limitation of these rights must therefore respect their “essence” and be subject to the principle of proportionality. In our case, when it comes to data-selling deals, the principle of proportionality is often violated and the processing cannot be characterized as “necessary”. It is far from necessary to breach the general prohibition on processing sensitive data and disregard the essence of the right when there are other ways to supervise and assess the working capacity of employees, or to capture consumer behavior. Data-selling deals should not be covered by the exceptions of Article 9 of the GDPR: in most cases there is no “necessity” that would justify them, the purposes of processing are not specified, and neither the principle of lawfulness nor the principle of proportionality is respected. (6)
This tendency is reflected in the judgments of the European Court of Human Rights.
For instance, the case “Z. v. Finland” (no. 22009/93, 25 February 1997) concerned the disclosure of the applicant’s condition as HIV-positive in criminal proceedings against her husband. The Court held that there had been a violation of Article 8 of the Convention, finding that the disclosure of the applicant’s identity and HIV infection in the text of the Court of Appeal’s judgment made available to the press was not supported by any cogent reasons. The Court noted in particular that respecting the confidentiality of health data is a vital principle in the legal systems of all the Contracting Parties to the Convention, crucial not only to respect the sense of privacy of a patient but also to preserve his or her confidence in the medical profession and in the health services in general. (7)
However, there are cases where disclosure can be considered necessary. For instance, the case “M.S. v. Sweden” (no. 20837/92, 27 August 1997) concerned the communication by a clinic to a social-security body of medical records containing information about an abortion performed on the applicant. The Court held that there had been no violation of Article 8 of the Convention, finding that there had been relevant and sufficient reasons for the communication of the applicant’s medical records by the healthcare provider to the social-security body and that the measure had not been disproportionate to the legitimate aim pursued, namely, by enabling the social-security body to determine whether the conditions for granting the applicant compensation for industrial injury had been met, to protect the economic well-being of the country. Moreover, the contested measure was subject to important limitations and was accompanied by effective and adequate safeguards against abuse. (7)
Not all cases of disclosure constitute violations, especially when they are justified by the public interest and subject to the principle of proportionality. Biometric and health data must be processed with respect for human rights, so as not to promote inequalities, exclusion, disrespect for the private life of data subjects, or discrimination in the workplace or in access to services. Data protection is a prerequisite of freedom in the era of information sharing. The EU principles support the alignment of technological advances with ethics. The progress that has been made in Europe is significant, and it could be a step towards equality and, thus, viability!
1. European Commission (2019). What personal data is considered sensitive? [online] Available here [Accessed 12 Jul. 2019].
2. Thielman, S. (2017). Your private medical data is for sale – and it’s driving a business worth billions. [online] The Guardian. Available here [Accessed 13 Jun. 2019].
3. Jennings, K. (2014). How Your Doctor And Insurer Will Know Your Secrets — Even If You Never Tell Them. [online] Business Insider. Available here [Accessed 13 Jun. 2019].
4. General Data Protection Regulation, Article 9. Available here.
5. TCS Big Data Study (2012). Selling Big Data: Which industries sell their digital data? [online] Available here [Accessed 13 Jul. 2019].
6. The EU Charter of Fundamental Rights, Article 8. Available here.
7. Judgments of the European Court of Human Rights: coe.int (2019). [online] Available here [Accessed 13 Jul. 2019].