
By Stavrina Chousou, member of the «International Relations and Foreign Policy» Research Team


Introduction

The data landscape, legal and political alike, has transformed drastically over the past decade. From a chaotic and opportunistic environment for any and every online user, it has gradually shifted to a more clearly defined space. That shift has both merit and reasonable cause: if oil replaced gold some decades ago, data has now replaced oil, fueling today’s economy and enabling major advances in technology, science and the understanding of human behavior. The political underpinnings of data have become extremely intriguing, as in recent years different poles of the international system have started realizing the potential of data exploitation. Intelligence catalysts, propaganda facilitators, props for social transformation, and powerful statistical incentives: data is the new fuel for political strategizing (Fukuyama, 2020).

Constant, up-to-date examination of data regulation from a political standpoint is a necessity today, since the state most involved in controlling the data supply will shape the governance norms underpinning future data markets, as well as the technological and economic determinants that depend heavily on data. A further complexity is that state actors approach data in different ways, stemming from their different goals, cultures and value systems (Brown, 2019). Since significant research has already emerged on the geopolitics of data, especially in the framework of the US-China tech war, this paper focuses on “personal data” on both sides of the Atlantic, namely in the EU and the USA. Taking the European General Data Protection Regulation (GDPR) as an axis and focusing on these two regions provides a holistic overview of transatlantic trends, legal and political, and identifies their contribution to the current normative frameworks of data regulation and compliance.

A European approach to data regulation as a catalyst for change: The GDPR

Firstly, it is vital to clarify what exactly personal data is. According to Article 4 of the General Data Protection Regulation’s General Provisions, commonly known as the GDPR, “‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person”. A second clarification concerns the legal definition of “privacy”. The GDPR frames data privacy as a person’s ability to choose when, how, and to what extent their personal data is shared with or conveyed to others. The inclusion of many different privacy settings is worth noting, as it enables users to customize their privacy preferences in virtually every source of digital and online data, such as tracking services or cookies [1], at their own will and discretion, much as they would in the traditional societal space.

Although the GDPR is not the first legal framework attempting to safeguard digital privacy, it is the first holistic framework for data privacy with an impressive enforceability range. Its three main goals are: protecting users’ rights concerning their data, ensuring that data protection laws keep pace with the ever-changing technology landscape, and creating uniform and consistent legislation across the EU. Since the GDPR’s enactment, companies must obtain unambiguous consent before collecting, storing, or sharing users’ data, and they are obliged to keep detailed documentation of the data they store and to provide all such information within a 30-day window once asked to do so. Users, in turn, can now request clear and detailed documentation of the data held in a company’s database and, of course, request that companies remove their data and provide evidence of the removal. Furthermore, users are able to request that inaccurate stored information be corrected. They can also object to the use of their data concerning race, ethnicity, sexual orientation, gender, political views and religious beliefs, and to other types of profiling (Sobers, 2020).
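To make these obligations more concrete from a compliance engineering standpoint, the short Python sketch below illustrates, under stated assumptions, one way the consent, access, erasure and rectification duties described above could be modeled in a service’s data-handling layer. All names (PersonalDataStore, access_request, and so on) are hypothetical, and the sketch is a simplified illustration, not a prescribed GDPR implementation.

# A minimal, hypothetical sketch of the duties described above:
# unambiguous consent before collection, a 30-day window for access
# requests, and support for erasure ("right to be forgotten") and
# rectification of inaccurate data.

from dataclasses import dataclass, field
from datetime import date, timedelta


@dataclass
class DataSubjectRecord:
    email: str
    consented_purposes: set = field(default_factory=set)   # e.g. {"analytics"}
    stored_data: dict = field(default_factory=dict)


class PersonalDataStore:
    def __init__(self):
        self._records = {}

    def give_consent(self, email, purpose):
        self._records.setdefault(email, DataSubjectRecord(email)).consented_purposes.add(purpose)

    def collect(self, email, purpose, key, value):
        # Store data only if the subject has given unambiguous consent for this purpose.
        record = self._records.setdefault(email, DataSubjectRecord(email))
        if purpose not in record.consented_purposes:
            raise PermissionError(f"no consent recorded for purpose: {purpose}")
        record.stored_data[key] = value

    def access_request(self, email, received_on):
        # Return a copy of everything stored about the subject plus the response deadline.
        deadline = received_on + timedelta(days=30)   # respond within the 30-day window
        record = self._records.get(email, DataSubjectRecord(email))
        return dict(record.stored_data), deadline

    def erasure_request(self, email):
        # "Right to be forgotten": delete the record and confirm the removal as evidence.
        return self._records.pop(email, None) is not None

    def rectification_request(self, email, key, corrected_value):
        # Correct inaccurate stored information at the subject's request.
        self._records[email].stored_data[key] = corrected_value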

The GDPR, from a European perspective, reflects rising domestic concerns about a cyberspace that is not free, but uncontrollable and thus dangerous for any potential victim. It has boosted Europeans’ trust in the Union’s digital policies on privacy protection, as well as the EU’s position as an influential market (Geradin et al., 2020). Empirical evidence suggests that the GDPR was a tipping point, after which discussion and policy-making on personal data and its protection rose globally (Machuletz & Böhme, 2020; Geradin et al., 2020). Since the regulation is part of the European legal corpus, its provisions apply by default to all businesses that collect European citizens’ data, affecting US, Chinese and other non-EU companies by extension.

“American” privacy and a confused regulatory effort

On the other side of the Atlantic, attitudes about privacy are more fragmented than in the EU. Public opinion is generally skeptical: majorities believe it is not possible to go about everyday activities without companies (62%) or the government (63%) gathering their personal data. Moreover, 81% of Americans consider that the risks of companies collecting their data outweigh the benefits, while 66% feel the same about government data collection (Auxier et al., 2019). Commenting on the argument “Data security and privacy are not really a problem because I have nothing to hide”, 89% of the US subjects opposed the statement. Characteristically, one US subject answered: “I disagree, what I have to hide is my financial life” (Kumaraguru et al., 2005). In the same study, Belgian and Indian participants linked the concept of privacy to a familial safe space, peace of mind and mental tranquility, whilst their American counterparts associated privacy with the right to private financial information, confidential work information and the protection of what it means to be free, mostly economically free.

From a legal standpoint, the US does not have a coherent legal framework on privacy, and there is no constitutional reference to the individual’s right to data protection [2]. Related laws focus on specific types of data and have limited scope. For example, FERPA (the Family Educational Rights and Privacy Act) protects student education records, and HIPAA (the Health Insurance Portability and Accountability Act) protects medical records. Although they still carry merit, they are outdated and ineffective when it comes to newer technologies (Klosowski, 2021). At the corporate level, the right to alter or delete one’s stored data exists only in a limited number of states, in selected industries and in limited capacity, leaving the companies that run the world’s largest social media platforms largely unaccountable for how they handle customers’ data.

However, the GDPR has created a ripple effect by establishing a precedent of institutional mobilization for citizens’ right to privacy, even in the digital space. California, chronically the most liberal state in the US, followed a trajectory of privacy liability in its own jurisdiction soon after the EU’s GDPR campaign at the beginning of 2016. The American ethos of “individuality” and the concept of financial freedom may have strongly influenced the California Consumer Privacy Act (CCPA), signed into law in June 2018. Nevertheless, it still ensures data privacy for California consumers in much the same spirit as the GDPR.

Important differences can be noted, both with regard to the CCPA’s specific provisions and to its enforceability range, most importantly the fact that the CCPA concerns consumers, whereas the GDPR is addressed to citizens at large (Voss, 2021). That being said, the CCPA has significant effects on domestic markets, since “the CCPA […] is broadly applicable to American companies and […] more than half a million U.S. companies are likely impacted by the law. In addition, the law may apply to those operating outside the U.S. too”. This is because a business need only engage in some commercial capacity in California to be obliged to comply with California law (Barrett, 2019).

The significant similarities between the CCPA and the GDPR carry their own importance: they may indicate how a data privacy compliance framework might be established across the West. Both are intended to encourage transparency in data collection and transmission. The individual’s right to “be forgotten”, to access stored personal information, and to request that said data be edited or corrected, are becoming legitimate consumer/citizen requests (Chin, 2019). More and more US states are expected to follow in the CCPA’s footsteps with similar laws, yet the US federal government’s response is quite different. Former US Commerce Secretary Wilbur Ross notably stated that “the GDPR is a threat to trade between the US and EU”. But why the stark contrast between the authorities of the two major Western blocs? Is it cultural and conceptual differences that separate their views on digital data privacy, or something more practical?

Big Tech: complicating data regulation and governmental initiative

Big Tech companies are major players in the data industry, arguably more important than governments themselves, as their services rely heavily on access to consumer data and the amount of data they collect is massive. The Big Tech sector is dominated by a select few influential, US-based firms, namely Google, Facebook (Meta), Apple, and Amazon. It is almost ironic that Internet services are in fact worsening power centralization in the tech industry. Technology is above all a cultural and political phenomenon, which results in such large, powerful, norm-setting organizations creating mutually beneficial relations with the governments of large and powerful countries (Srivastava, 2021). The case of Big Tech companies and the US government is indicative.

Big Tech provides governments with unique opportunities for access to immense databases, paving the way for important political incentives to be drawn, as previously stated (Birch et al., 2021). Nowhere is this more evident than on the US political scene. What is more, the amount of technology lobbying taking place in the US has expanded over the past 10 years, “rivaling those of more ‘traditional’ industries such as energy and finance” (Arogyaswamy, 2020). Cooperation on national security issues might be the most prominent example of this relationship. The Snowden revelations of 2013 showed that Google and Facebook had been leaking user data to various US intelligence agencies. Amazon maintains more than 2,000 partnerships with US law enforcement bodies. The trend of “broad immunity” towards Big Tech in the US is problematic for privacy protection (Srivastava, 2021), and even more alarming is that “American big tech firms often cite the national interest in arguing that they should not be tightly regulated” (Arogyaswamy, 2020). There is evidence to suggest the rise of a “concealed immunity”: despite superficial gestures and public or policy debates (e.g., US House of Representatives, 2020), nothing major has challenged Big Tech’s domination of private regulatory mechanisms (Geradin et al., 2020).

Political actors in the EU are not as dependent on Big Tech, and public opinion there is even harsher. The chronic lack of a strong telecommunications industry in the Union does create geopolitical perils, but it also provides an opportunity for actual privacy liability. The EU’s political and social stance on Big Tech is one of “conditional liability”, meaning it aims to impose obligations and responsibility on companies through binding legislation, under conditions that will not uproot the mutually beneficial relations between them. France, which has been the GDPR’s most devoted enforcer, has long opposed giving Big Tech a free pass. Trade associations have filed complaints against Apple with the French Autorité de la concurrence over announced changes to iPhone user tracking. Civil demonstrations in the country have also been intense, concerning, for instance, Amazon’s unethical competition practices (Srivastava, 2021).

However, although it was initially assumed that the GDPR would harm Big Tech activity in the EU, the exact opposite seems to be the case. The GDPR has enhanced market concentration: data suggest that Google’s and Facebook’s market positions have only benefited since its enactment, because the extra compliance costs in practice exclude smaller brands from entering the tech sector or even force them to exit it. Furthermore, larger platforms benefit from advertisers’ increased confidence in them, as their dominant position in everyday life turns their products into “must-haves”, rendering customer consent to their services easier to obtain (Sobers, 2020; Machuletz & Böhme, 2020). In addition, since 2018 self-perpetuating practices seem to have taken hold in ad-tech services [3], as the tech “giants” can unify their privacy policies, culminating in a seemingly easy-to-use privacy framework while simultaneously internalizing data-sharing (Geradin et al., 2020). For instance, in 2019, Google removed AppNexus from YouTube, opting for its own ad tech and pushing AppNexus to the brink of bankruptcy. Google also announced plans to phase out third-party cookies in Chrome by 2022. Apple has followed similar tactics in its cookie and advertising policies.

Opting for superficial compliance with legal provisions while maintaining a facade of embodying current sociopolitical values and tendencies is a valid strategy for Big Tech. It might also be in the interest of governments, provided certain checks and balances are struck, although the underlying issue of power dynamics might play a decisive role in this respect. In the most optimistic scenario, Big Tech could, as Microsoft appears to have done, adhere to societal standards by cooperating with governments around the world while anticipating and guarding against future problems (Gorwa & Peez, 2018). A role centered on improving safety standards and transparency would benefit both firms and institutions. With different governments following different tactics, Big Tech might simply adjust to demand.

Mapping rising transatlantic trends

While those are serious problems, it is important to bear in mind that the GDPR is the first coherent attempt at online data regulation, and criticism is only to be expected. Criticism is even desirable, as it drives innovative policies forward, minimizes their drawbacks, maximizes their potential and thus produces models from which other states can draw inspiration. Having said that, the positive impact of the GDPR is undeniable. American trends still have a long way to go to even reach the flawed and experimental, yet existent and functioning, European privacy protection framework. Considering that Chinese firms, with the backing of their government, are engaged in an unrestrained effort to stake out a leading position in the technologies of the future, prolonging divergent political positions on data regulation might come at a cost (Arogyaswamy, 2020).

Trends, policy discussions and industry practice from a US standpoint indicate that the issue of privacy and data management is not moving towards a common Western space rooted in the EU vision of an “ethical”, accountable space. American digital privacy policies are moving closer to the Pacific-Asian attitude towards data, which seeks only to minimize the negative practices surrounding loose data laws. Although practical reasons, such as strategic alignment, might come into play, the most noticeable reason for this attitude might be the simplest: the EU perceives privacy as a fundamental right, whereas the US treats it as a means to protect consumers and financial interests (Dipshan, 2021). Therefore, a pragmatic approach, in tune with liberal American rhetoric, might actually be the point where civil demand, culture and policy actors meet in the US.

Conclusion

To conclude, certain observations are in order. The GDPR, whether praised or criticized, has become the catalyst for data privacy discussions and law-making, both in the EU and the US. Public opinion is increasingly united in demanding more privacy over personal data. The more legally progressive states are starting to pave the way to a more coherent approach to data protection in the West. Transatlantic trends, though, are not following the same paths for now, due to fundamentally different relations with Big Tech, as well as different legal and cultural attitudes towards privacy and specifically “data privacy”. Regulatory efforts thus far create the sense that the US and the EU aim for a more “Western”, “democratic” approach, but with their own distinct cultures embedded in it. Therefore, the two poles might be more inclined to implement similar regulations aiming at pragmatic and financially oriented results, mostly by maintaining a balance with Big Tech, namely by improving transparency in data governance norms. As the EU moves towards a sui generis safe space for online privacy and the US battles to safeguard its relations with Big Tech while adhering to social demands, there is little evidence that a holistic, privacy-focused, transatlantic policy effort will occur.


Footnotes

[1] For more information on cookie policies, please visit: https://arxiv.org/abs/1808.05096

[2] For the provisions on the Bill of Rights concerning privacy and the recent amendments visit: https://constitution.laws.com/right-to-privacy 

[3] Advertisement technologies, “meaning all software and technology used by marketers to advertise digitally”. For more information visit: https://www.techfunnel.com/martech/what-is-ad-tech/ 


References

Arogyaswamy, B. (2020). Big tech and societal sustainability: an ethical framework. AI & Society, 35, 829-840. DOI: 10.1007/s00146-020-00956-6.

Art. 4 GDPR – Definitions – General Data Protection Regulation. (2018). Intersoft Consulting.  Retrieved from here.

Auxier, B., Rainie, L., Anderson, M., Kumar, M., & Turner, E. (2019). Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information. Pew Research Center. Retrieved from here.

Barrett, C. (2019). Are the EU GDPR and the California CCPA Becoming the De Facto Global Standards for Data Privacy and Protection? The SciTech Lawyer, 15(3), 24-29. Retrieved from here.

Birch, K., Cochrane, D., & Ward, C. (2021, May 16). Data as asset? The measurement, governance, and valuation of digital personal data by Big Tech. Big Data & Society, 8(1). DOI: 10.1177/20539517211017308.

Big Tech Risks Personal Data Privacy and Corporate Security (2021). IDX. Retrieved from here [Accessed 3 January 2022].

Brown, E. (2019). The Geopolitics of Technology –  Big Data, Artificial Intelligence and 5G  in a Multipolar World. Global Risk Institute. Retrieved from here.

Chin, C. (2019). Highlights: The GDPR and CCPA as benchmarks for federal privacy legislation. Brookings Institution. Retrieved from here.

Comparing privacy laws: GDPR v. CCPA. (2019). OneTrust Data Guidance. Retrieved from here.

Dipshan, R. (2021). GDPR’s Global Impact May Be More Limited Than You Think. Law.com. Retrieved from here

Fukuyama, F. (2020). 30 Years of World Politics: What Has Changed? Journal of Democracy, 31(1), 11-21. Johns Hopkins University Press. DOI: 10.1353/jod.2020.0001.

Geradin, D., Karanikioti, T., & Katsifis, D. (2020). GDPR Myopia: how a well-intended regulation ended up favouring large online platforms – the case of ad tech. European Competition Journal, 17(1), 47-92. DOI: 10.1080/17441056.2020.1848059.

Gorwa, R., & Peez, A. (2018). Big Tech Hits the Diplomatic Circuit: Norm Entrepreneurship, Policy Advocacy, and Microsoft’s Cybersecurity Tech Accord. Retrieved from here

Klosowski, T. (2021). The State of Consumer Data Privacy Laws in the US (And Why It Matters). The New York Times. Retrieved from here.

Kumaraguru, P., Cranor, L. F., & Newton, E. (2005). Privacy Perceptions in India and the United States: A Mental Model Study. Retrieved from here.

Machuletz, D., & Böhme, R. (2020). Multiple Purposes, Multiple Problems: A User Study of Consent Dialogs after GDPR. Proceedings on Privacy Enhancing Technologies, 2020(2), 481-498. DOI: 10.2478/popets-2020-0037

Sobers, R. (2020). A Year in the Life of the GDPR: Must-Know Stats and Takeaways. Varonis. Retrieved from here.

Srivastava, S. (2021). Algorithmic Governance and the International Politics of Big Tech. Perspectives on Politics, 1-12. Cambridge University Press. DOI: 10.1017/S1537592721003145.

Voss, G. (2021). The CCPA and the GDPR Are Not the Same: Why You Should Understand Both. Competition Policy International. Retrieved from here.



SAFIA (Student Association For International Affairs) does not adopt political positions as an organization. The views published on The SAFIA Blog are attributed solely to their authors and do not necessarily represent the views of the Association, its Board of Directors, or any organizations cooperating with it in any capacity.