General Data Protection Regulation and California Consumer Privacy Act: Background
CURRENT EVENTS
I. Introduction
The Internet is certainly among the most significant technological advancements in modern times. Data and information can now be transferred almost instantly to anyone across vast distances. The convenience and broad accessibility of the Internet have led to extensive economic development, new markets, and market entrants; its importance cannot be overstated. While the benefits of the Internet are readily apparent, beneath the surface lie legitimate concerns about individuals' online data privacy.
About a decade after the advent of the Internet, the Council of the European Union (EU) took an interest in the state of consumer protection in this sphere, adopting the European Data Protection Directive (Directive 95/46/EC) in 1995. Directive 95/46/EC’s stated purpose is to provide consumers with a sense of confidence that online data will be protected and kept private. On May 25, 2018, nearly 25 years later, the EU adopted the General Data Protection Regulation (GDPR) to further address such concerns.
Part II provides a brief inquiry into the relevant legislative history, explaining the various proposals and compromises made among the EU institutions, as well as the policy behind the GDPR and the rights it grants. Part III examines the elements of the GDPR. Parts IV and V, respectively, address impacts within the EU and beyond it. Part VI breaks down California's legislative response to the GDPR.

This article was a collaborative effort of the following editorial members: Greta Carlson, Jonathan McKinney, Elizabeth Slezak, Esther-Sarah Wilmot.

II. Legislative History

Directive 95/46/EC laid the groundwork for the GDPR, allowing the European Commission to "review the existing list of countries which offer an adequate level of protection of personal data." At this time, Directive 95/46/EC was not a binding order; rather, it was simply a review of Member States. Data Protection Authorities granted "seals and marks to services–to reinforce consumer confidence."

On March 12, 2014, the European Parliament voted in favor of the GDPR by an overwhelming margin. On June 15, 2015, the European Council decided to replace the previous Article 29 Working Party with the European Data Protection Board (EDPB). From then on, the EDPB's role was "to ensure the consistency of the application of the GDPR throughout the Union, through guidelines, opinions, and decisions."

One of the European Council's final sweeping changes came on April 27, 2016, when it officially repealed Directive 95/46/EC–the first step in binding EU member countries to the GDPR. The purpose of repealing the early Directive 95/46/EC was to update data privacy protections. The GDPR notes how "rapid technological developments and globalization have brought new challenges for the protection of personal data," and that these new advancements "require [a] strong and more coherent data protection framework in the Union, backed by strong enforcement, given the importance of creating the trust that will allow the digital economy to develop across the internal market." Finally, on May 25, 2018, the GDPR went into full force and effect, binding all EU member countries. See https://edps.europa.eu/data-protection/data-protection/legislation/history-general-data-protection-regulation_en.

A. Intent and Purpose of the GDPR

The purpose of the GDPR is threefold, consisting of three separate but interrelated goals set forth throughout its legislative history. First, the GDPR sought to provide a uniform series of regulations across all EU member countries; one of the main concerns with Directive 95/46/EC was that it appeared too disjointed and limited in its reach to achieve the goals of data protection and enforcement. Second, the GDPR seeks to instill confidence in citizens of EU member countries by ensuring their private data will be protected by the organizations that seek to use it. The GDPR explicitly grants new rights to EU citizens regarding their personal data. Third, the EU sought to boost Europe's digital economy through these new protections: if people felt that their personal data was handled securely, they would be more willing to use digital services.
The EU found these themes to be of particular importance in recognition of the state of the modern world–that is, one pervaded by Internet integration in all economic aspects. Indeed, in Recital 2 of the GDPR, the EU Parliament and Council make these goals–to contribute to an area of freedom, security, justice, economic union, social progress, stronger economies within the internal market, and the well-being of natural persons–abundantly clear.
B. Rights and Protectionary Principles
Within the text of the GDPR, the EU Parliament and Council stated their views on data privacy in no uncertain terms. Recital 1 states, "[t]he protection of natural persons in relation to the processing of personal data is a fundamental right." In the age of digital data processing, such language regarding the rights of individuals is undoubtedly necessary. Recital 4 of the GDPR further puts forth that the "processing of personal data should be designed to serve mankind." Such strong language evidences the EU's newfound hardline stance on data privacy.
In addition to explicitly declaring citizens' rights, the GDPR compels organizations subject to its provisions to abide by certain principles. Whereas citizens are explicitly granted rights, principles are parameters set for organizations to follow that indirectly bolster those rights.
i. Right of Data Portability
The right to data portability (RtDP) is easily defined: it is the right to receive one's own personal data from an organization in a commonly used and easily shareable form. Surprisingly, this was one of the more contentious rights to be introduced by the GDPR. Indeed, there was much debate over whether the RtDP truly belonged in the GDPR or whether it was a concept better suited for other areas of law.
While some Member States favored the RtDP's inclusion in the GDPR, others were concerned with "risks of data portability for the competitive positions of companies and . . . the relationship between commercial confidentiality and the IP of data controllers." Some Member States argued the RtDP had nothing to do with the protection of data transfer and instead had more to do with consumer and competition law. Ultimately, after critical review by the Council, the RtDP became part of the GDPR, included as Article 20. The Council ultimately felt that while the concerns respecting the RtDP are arguably better addressed by other areas of law, the policy objectives of the RtDP are very much in line with the GDPR as a whole. The RtDP was meant to "ensure that individuals are in control of their personal data and trust the digital environment . . . the RtDP could foster competition between controllers as a side-effect and thereby encourage the development of new data-related services." The RtDP thus achieves the second and third policy objectives, boosting consumer confidence in EU member countries and growing the digital economy by allowing for competition and the invention of new digital services. See https://edps.europa.eu/data-protection/data-protection/legislation/history-general-data-protection-regulation_en.
ii. The Accountability Principle
The accountability principle is integral to fulfilling the GDPR's goals and expands protections and rights for citizens of EU Member States, placing the burden on organizations collecting and transferring data to prove they are acting in accordance with the GDPR. Recital 85 gives a specific example of how the accountability principle works. If a data breach occurs, the controller of the data has seventy-two hours to notify the supervisory authority of the breach. However, if the controller is able to show, in accordance with the accountability principle, that the breach is unlikely to put the rights and freedoms of the citizens affected at risk, it is not required by the GDPR to make that notification. The burden of proof ultimately lies on the organization or entity that controls the data. The GDPR does not assume organizations have citizens' best interests in mind; the accountability principle is about forcing these organizations to prove they are responsible enough to operate in the EU and in accordance with GDPR standards.
III. Mechanics of the GDPR
The GDPR is now recognized law across the EU, allowing EU citizens more control over their own personal data and improving their security both online and offline.
The regulation protects natural persons with regard to the processing of “personal data” and creates rules relating to the free movement of that data.
i. Consent
Individual consent "is one of the few circumstances under which an organization may lawfully process data." It is ostensibly the responsibility and burden of any organization processing data to prove it is acting within GDPR guidelines, given the broad and far-reaching implications of the accountability principle. Specifically, Article 4 requires service providers to obtain consent from consumers before processing personal data.
Consent has several elements: it must be freely given, specific, informed, and unambiguous. Consent must be given to each processing activity by a statement or by a clear affirmative action signifying agreement to the processing of personal data relating to the data subject. When processing has multiple purposes, consent should be obtained for each purpose. Once consent is obtained, data controllers and processors can process the consumer's personal data, but the consumer has the right to withdraw that consent at any time. The withdrawal of consent does not affect the lawfulness of processing based on consent before its withdrawal.
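For readers implementing compliance tooling, the two rules above–consent tracked per purpose, and withdrawal that does not reach back in time–can be sketched in a few lines. This is a minimal illustration under assumptions of our own (the class names, purposes, and dates are hypothetical, not drawn from the Regulation):

```python
# Sketch of per-purpose consent bookkeeping under the GDPR's consent rules:
# consent is recorded separately for each processing purpose, and withdrawal
# does not retroactively invalidate processing done while consent was in force.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class Consent:
    purpose: str                       # one record per processing purpose
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None


class ConsentLedger:
    def __init__(self) -> None:
        self._records: dict[str, Consent] = {}

    def grant(self, purpose: str, when: datetime) -> None:
        self._records[purpose] = Consent(purpose, when)

    def withdraw(self, purpose: str, when: datetime) -> None:
        rec = self._records.get(purpose)
        if rec is not None and rec.withdrawn_at is None:
            rec.withdrawn_at = when

    def lawful_at(self, purpose: str, when: datetime) -> bool:
        """Was consent-based processing for this purpose lawful at `when`?"""
        rec = self._records.get(purpose)
        if rec is None or when < rec.granted_at:
            return False
        return rec.withdrawn_at is None or when < rec.withdrawn_at


# Hypothetical timeline: consent granted May 1, withdrawn May 10.
t = lambda day: datetime(2018, 5, day, tzinfo=timezone.utc)
ledger = ConsentLedger()
ledger.grant("marketing", t(1))
ledger.withdraw("marketing", t(10))

assert ledger.lawful_at("marketing", t(5))       # before withdrawal: lawful
assert not ledger.lawful_at("marketing", t(15))  # after withdrawal: not lawful
assert not ledger.lawful_at("analytics", t(5))   # separate purpose, no consent
```

The design choice to key records by purpose mirrors the text's point that, when processing has multiple purposes, consent must be obtained for each one rather than bundled.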
Additionally, Article 4 defines personal data as “any information relating to an identified or identifiable natural person, directly or indirectly,” including data such as an “address, license plate number, Social Security number, blood type, bank account information, and so on.” The GDPR not only gives data subjects more control over their own personal information, but it grants data subjects eight key rights. These rights are: “the right to be informed, the right of access, the right to rectification, the right to erasure, the right to restrict processing, the right to data portability, the right to object, and the right of automated decision making and profiling.” Within these rights, individuals have the power and control over what data marketers can collect, store, and use. Further, individuals can rely upon rights granted–such as the right to deletion, which allows for deletion of all data collected about the consumer. However, this right is not absolute and only applies in certain circumstances for the individual. For example, individuals have the right to have their personal data erased if: “the personal data is no longer necessary for the purpose which you originally collected or processed it for.”
Compliance with the GDPR extends to both data controllers and data processors, contrary to the former legal regime, which conditioned liability on the identity of the party that controlled the data. Data controllers determine the purpose and means of processing data, while data processors process data on behalf of the controllers, who in turn retain such data. Controllers are "the main decision-makers, who exercise overall control over the purposes and means of the processing of personal data." If two or more controllers jointly "determine the purposes and means of the processing of the same personal data," they are joint controllers. However, they are not termed joint controllers if they process the same data for different purposes. Processors act "on behalf of, and only on the instructions of, the relevant controller."
Article 5 deals with the principles relating to the processing of personal data. The regulation provides that personal data must be “processed lawfully, fairly, and in a transparent manner in relation to the data subject; collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes.” Further, processing must be “adequate” to properly fulfill the stated purpose for collection, having a relational link or relevance to that purpose, and must be limited, or “not excessive in relation to the purposes for [processing].”
Article 6 provides six lawful bases for processing. Processing shall be lawful only if, and to the extent that, at least one of the following applies: the data subject provides consent to the processing of his or her personal data for one or more specific purposes; performance of a contract to which the data subject is a party necessitates processing; compliance with a legal obligation to which the controller is subject necessitates processing; protection of the vital interests of the data subject or of another natural person necessitates processing; performance of a task carried out in the public interest or in the exercise of official authority vested in the controller necessitates processing; or where processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data.
Lawful processing is one piece of a global trend encouraging greater corporate accountability with regard to protecting consumer data. This trend has influenced other countries to implement changes and take a protective posture toward consumer data privacy rights. Data controllers and processors face new obligations with respect to personal data and shall be responsible for, and able to demonstrate, accountability with such data.
IV. Impact Within the EU
Data privacy regulations are set by the EU, supplemented by individual Member Countries which then set their own rules and policies at the national level. Existing data privacy regulations and intellectual property laws at both levels of governance must be reviewed and, as necessary, revised to align with the GDPR goals. To that end, the European Commission proposed revisions to the ePrivacy directive, which regulates the confidentiality of communications and the use of cookies, and Regulation 45/2001, which applies to EU institutions when processing personal data.
In fact, the GDPR specifically requires individual Member Countries to establish their own rules, policies, and enforcement authority schemes in accordance with particular parameters. However, Member States are permitted some discretion in the construction of these rules: states may provide for exceptions or limitations to GDPR rules where freedom of expression, employment law, or scientific or historical research is concerned. The national regulations established by each country reflect its individual priorities, culture, and legal structures. The details of a country's implementation plan may impact the direction of future legislation and enforcement actions.
France, for instance, is the most restrictive digital trade country within the EU according to the Digital Trade Restrictiveness Index (Index), having elected to adopt privacy regulations more restrictive than those the GDPR requires. The Loi pour une République Numérique creates a data retrieval obligation for providers of online public communications services in the French market. Recently, French data protection authorities issued a €50 million (approximately $57 million) fine against Google for violating the GDPR. The fine, issued in January 2019, is the largest penalty imposed for a data privacy breach under the GDPR, and experts contend that it set a benchmark for future data privacy penalties. France's hardline approach may influence the direction of data privacy law in Europe.
GDPR’s impact is not limited to data privacy regulations. Rules regarding intellectual property rights must be reviewed in light of the right to data portability created in the GDPR to avoid a conflict between the two regimes. The GDPR grants data subjects the right to transmit their personal data from one data controller to another. This right can conflict with the right of data controllers to protect trade secrets, sui generis databases, and copyrighted content. Facebook, headquartered in Ireland, previously fought a request for disclosure of personal information by invoking the privilege of trade secret protection under the Irish Data Protection Acts. Vague language in the GDPR and inconsistent application of EU intellectual property rules cause uncertainty in the future of intellectual property rights in personal data in the EU.
A sui generis database right exists when a database was created as a result of substantial investment. However, the investment amount necessary to establish protection varies among Member States. In Germany, an investment as low as €4,000 may suffice to establish a sui generis right.
V. Beyond the EU
National privacy policies and regulations vary across the world. The approach of each regime reflects the country’s culture and the government’s priorities. The EU, for example, is primarily driven by privacy concerns. The protection of personal data and the privacy of communications are statutory human rights under EU law. However, security is the driving concern for other countries. Security-focused countries typically adopt restrictive data policies that limit cross-border data flows and provide limited personal privacy protections. China, Russia, and Turkey rank the highest for restrictive data policies. All three countries include some form of localization, retention, and transfer requirements.
China is the overall most restrictive digital trade country in the Index, followed by Russia, India, Indonesia, and Vietnam. China’s trade and Internet restrictions inhibit the free flow of information. For example, all data must be stored domestically and any incoming cross-border Internet traffic must pass through a national firewall. China’s robust security policies limit personal rights by permitting the government to demand proprietary information from organizations and personal data from telecommunications operators and Internet service providers. Further, China’s proposed social credit system, which would monitor citizens’ behavior and collect personal data, does not
require an individual's consent, nor does it provide an avenue for an individual to access or correct their own personal data. Following in China's footsteps, Vietnam has adopted its own restrictive cybersecurity policies.
The United States’ approach to data regulation attempts to find the middle ground. Its policies aim to find a balance between trade interests, privacy protection, and national security. The growth of digital trade has increased the focus on personal privacy and efficient data flow. However, the distinctions between opposing fundamental approaches to data regulation prevent interoperability between different data privacy regimes. The EU, US, and China all have unique approaches to data policy. As other countries mirror the policies and rules of their closest trading partners, three distinct economic spheres emerge based on each approach. Some experts warn that a system for global interoperability must be adopted to avoid fragmentation of the Internet between the European, Chinese, and American spheres. However, there are no globally accepted standards nor binding multilateral rules specifically concerning the cross-border flow of personal data.
Several international organizations have developed non-binding principles and guidelines to assist governments in developing or reshaping national data privacy policies. The Organization for Economic Co-operation and Development (OECD) established the pivotal 1980 Privacy Guidelines, the first set of international data privacy principles. The Guidelines, last updated in 2013, are based on a risk management approach and emphasize the importance of data protection in the context of cross-border flows of personal data.
The principles regarding collection processes, data protection, and individual rights established in the Guidelines have become a foundation upon which other international economic forums and organizations have built their own privacy standards and guidelines. The 2018 G-20 and the Asia-Pacific Economic Cooperation (APEC) have both published principles and best practice guides related to data privacy and cross-border data flows. The 2018 G-20 Digital Economy Ministerial Declaration identified principles for data sharing standards for privacy and cross-border data flows. The 2005 APEC Privacy Framework established principles and implementation guidelines to assist countries with establishing national data privacy policies. This year will see the revision of many of these international privacy frameworks. OECD and APEC are scheduled to update and publish their guidelines in 2019, and the 2019 G20 host has indicated the summit would focus on data governance.
Despite the necessity of cross-border data flows in international trade and commerce and growing public concern over personal data protection, there are currently no binding multilateral rules regulating privacy or cross-border data flows. Enforceable trade rules are, however, gradually being established by means of trade agreements. The GDPR allows non-EU organizations to import and export personal data in several circumstances. Under the US-EU Privacy Shield, transatlantic transfers of personal data by certified, US-based organizations are permitted. Certification requires an organization to enroll in a voluntary program and comply with its commitments and obligations. Alternatively, organizations based in countries with a mutual adequacy decision from the EU may freely transfer data in and out of the EU without any further safeguards necessary. To receive a mutual adequacy decision, the EU must determine that the country offers an adequate level of data protection. Most recently, the EU and Japan agreed to recognize each other’s laws as equivalent. Japan further committed to implement additional security measures for the handling of personal data from the EU.
The APEC Cross-Border Privacy Rules (CBPR) present a similar methodology respecting cross-border data flows of personal data. The CBPR balances data privacy rights with commercial interests and identifies best practices for protecting personal data during cross-border transfers. CBPR members agree to recognize each other’s data privacy systems as equivalent. Current members include the United States, Japan, Mexico, Canada, South Korea, Singapore, Taiwan, and Australia.
The recently enacted Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP) created the strongest commitments for binding digital trade globally. The partnership is comprised of eleven Asia-Pacific countries, including Australia, Canada, Japan, Mexico, and Vietnam. CPTPP allows for the cross-border transfer of information between signatories, subject to restrictions for legitimate public policy purposes. Discriminatory trade barriers and localization requirements are prohibited. The agreement further requires all signatories to develop their own legal framework for the protection of personal information and to adopt consumer protection laws for online commercial activities, but does not provide any specific guidance in this regard.
The proposed United States-Mexico-Canada Agreement (USMCA) similarly establishes trade rules and policies on privacy, cross-border data flows, and security. The USMCA, like the CPTPP, promotes interoperability, prohibits arbitrary restriction of cross-border data flows, and requires member countries to establish national rules for personal privacy and online transactions. However, the USMCA goes further than the CPTPP, referring countries to the APEC Privacy Framework and the OECD Guidelines for guidance in developing their own privacy frameworks and consumer protection laws.
Expanding upon these agreements is an important step toward establishing consistent international standards for data privacy and data flows. The CBPR will continue to grow in strength and influence as more countries and organizations join. Furthermore, the best practices delineated by the CBPR can serve as the basis for further binding trade agreements. Because the CBPR includes economies at different stages of development, its initiatives can be scaled up to larger global efforts. Similarly, the language of the USMCA is sufficiently broad, enabling its adaptation for use in future agreements. Recent developments in data policy, privacy principles, and digital trade agreements serve as templates, or at least a starting place, on which to base future domestic regulations and binding multilateral agreements.
VI. CCPA v. GDPR
Close on the heels of the GDPR, progressive legislators in California similarly sought to enact data privacy protections for their constituents through the California Consumer Privacy Act (CCPA). Although the CCPA is widely perceived to be America's reply to the GDPR, major discrepancies between the two regimes exist and are sure to cause headaches in the compliance departments of global firms with a presence in each market.
A. Definitional Distinctions
CCPA protects "consumers": natural persons residing in the state, as well as those domiciled there but living elsewhere with the intention of returning. As defined, "consumer" is a term broad enough to encompass even employee and business-to-business transactions, in addition to consumers of household goods and services. Take for comparison the broader "data subject" contemplated by the GDPR, defined simply as an identifiable person to whom personal data relates.
Largely, the CCPA takes aim at Big Tech by targeting for-profit California entities that either: have gross revenue exceeding twenty-five million dollars; derive more than 50% of annual revenue from selling consumer information; or buy, sell, or share more than fifty thousand consumers' personal data each year. Service providers, third parties, entities that share common branding, and controllers of covered businesses are all required to abide by the regulation–likely a legislative tactic to close anticipated loopholes in advance of their exploitation. Note the GDPR's broader scope in this area: even entities established outside the EU may be subject to its regulations, specifically where the behavior of EU data subjects is monitored or in connection with some offer of goods or services. Both schemes approach the covered data–personal information, as respectively defined–similarly, each aiming to protect personal information related or associated with an identifiable natural person (consumers and data subjects, respectively).
One workaround to such compliance obligations contemplates the disassociation of collected data from identifiable individuals, such that, without other information, the data is not traceable back to any individual. The degree of disassociation creates the distinction between "pseudonymous" and "anonymous" data. Oftentimes, this sort of data, largely indiscernible with respect to the specific persons providing it, is clumped together, or "aggregated." In both regimes, anonymizing, pseudonymizing, and aggregating functions must be performed with stringent technical controls to meet the respective regulatory definitional bar. Companies that follow these guidelines are able to qualify the consumer information collected as "deidentified" or "aggregated," neither of which is restricted by the CCPA. Similarly, the EU's framework does not regulate data qualifying as "anonymous." However, "pseudonymous" data is defined to include personal data and is thus fair game under the general regulatory scheme. California has yet to make any clear rule or distinction as to pseudonymous data.
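For the technically inclined, the pseudonymous/aggregated distinction can be made concrete with a minimal sketch (the key, identifiers, and function names here are illustrative assumptions, not requirements of either statute). A keyed hash pseudonymizes an identifier: the keyholder can still re-link records, so the output remains personal data under the GDPR. Aggregation, by contrast, discards individual records entirely:

```python
# Illustrative sketch: pseudonymization (re-linkable by the keyholder)
# versus aggregation (no individual record survives).
import hashlib
import hmac

SECRET_KEY = b"held-separately-by-the-controller"  # hypothetical key


def pseudonymize(email: str) -> str:
    """Replace an identifier with a keyed hash. Because the controller,
    holding the key, can regenerate tokens and re-link them to the
    individual, the result is still 'personal data' under the GDPR."""
    return hmac.new(SECRET_KEY, email.encode(), hashlib.sha256).hexdigest()


def aggregate(ages: list[int]) -> dict:
    """Summary statistics over a group. No individual record survives,
    so the output can qualify as aggregated/anonymous data."""
    return {"count": len(ages), "mean": sum(ages) / len(ages)}


# Same input and key yield the same token, so pseudonymized records
# from different datasets can still be joined on the token.
token = pseudonymize("alice@example.com")
assert token == pseudonymize("alice@example.com")

stats = aggregate([34, 29, 41])
assert stats["count"] == 3
```

Note that the re-linkability demonstrated by the join property is exactly what keeps pseudonymous data inside the GDPR's scope, and why the stringency of the technical controls (here, how the key is stored and who can use it) determines which side of the definitional bar the data falls on.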
B. Foundational Regime Differences
Differences in each regime's respective foundational principles manifest in a variety of forms, most significantly in the distinction between their approaches: opt-in versus opt-out. Where European law finds an implicit right of personal dignity and a positive right to protection of personal data, an affirmative opt-in approach logically follows. Conversely, American law, and by extension Californian law, traditionally balances competing principles and positive and negative rights; predictably, then, the California legislature took an opt-out approach in enacting the CCPA. In terms of practical effect, this distinction has not presently resulted in any severe disparities, but instead simply lends context to developmental differences. Positive rights protected by the GDPR include the right to a privacy notice, to information disclosure or access, to data portability, to deletion, to rectification, to object in limited ways or even absolutely to data processing, and to object to automated decision-making. Differences in the regimes largely relate to these positive rights, many of which are not recognized in the California bill.
Both regimes contemplate a right of disclosure and access to the information collected and shared. CCPA consumers can opt for a written disclosure of this information. The GDPR's disclosure right goes further, extending access, to some extent, to additional portable formats. Data portability rights in both schemes largely mirror one another, each requiring the provision of data in a readily usable format. The GDPR's grant is, again, broader, requiring data controllers to actually facilitate a requested transfer to a third party.
California consumers enjoy a fairly comprehensive right to require deletion of their personal information by a particular business, who must in turn instruct its service providers of the same, subject to certain refusal rights. In Europe the “right to be forgotten” only applies if one of six conditions is present, however where available this right is coupled with a far more stringent informing obligation to downstream recipients than is available to California consumers.
Further, GDPR data subjects enjoy the right to rectify inaccurate and incomplete personal data, some right to restrict and object to the processing of their data, as well as the right to object to automated decision-making. The implications of the aforementioned, specifically with regard to the ability to object to algorithm and nonhuman decision making, are sure to make a significant impact on companies like Google, no stranger to accusations respecting the lack of corporate transparency. CCPA, on the other hand, provides consumers with an affirmative opt-out right. Essentially, businesses subject to the law are required to include a clear and conspicuously located “Do Not Sell My Information” link on their homepage. Businesses must also comply with consumer requests to opt-out of the sale of their personal information and cannot request reauthorization for the same, for at least 12 months post-opt-out. While GDPR has some opt-out features, the law mainly relies on its opt-in approach to obtain substantially similar results.
C. Similarities
Significant distinctions between the CCPA and GDPR include potential disparities in the scope of the parties regulated and the gravity of effect upon them, in the severity of penalties imposed, and in the application of their diverging approaches. However, both regimes find agreement in several areas. Both afford a similar privacy notice and information disclosure as to the data collected and its intended use, with some procedural differences in terms of which information, how much, and the method of its delivery.
The disturbing trend of increasingly common data breaches can, in the vast majority of cases, be appropriately mitigated with proper preemptive security measures. GDPR affirmatively requires particular security measures commensurate with the respective risk levels. CCPA does not dictate particulars in this respect. However, it does establish a cause of action for certain breaches where the plaintiff proves a failure to maintain reasonable security measures, with risk and reasonableness judged in accordance with existing case law.
GDPR implicitly bars discrimination against data subjects for exercising the rights it grants. CCPA likewise does not allow discrimination against a consumer who exercises their rights, but such a consumer may be charged differently to the extent the difference reasonably relates to the value their data provides. Businesses may offer financial incentives to consumers, but to do so they must disclose as much in their terms and policies and obtain the consumer’s opt-in consent.
Both regimes require compliance with verifiable rights requests within substantially similar windows of time, generally no more than three months. European data controllers have the option to charge a fee for compliance in certain cases. In California, responses to non-excessive requests must be free to consumers. Thus, the two regimes are effectively similar here, except to the extent that some consumers making excessive requests may be charged a fee to obtain the compliance sought.
D. Consequences
As cumbersome as compliance may be for companies subject to these rules, violations correlate with even costlier fines and potential remedial measures, to say nothing of the harm to the individual whose data rights have been violated. GDPR grants data subjects a private right of action to remedy rights violations causing either material or non-material damage. CCPA, conversely, grants consumers a narrow private right of action for breaches that implicate a subset of personal data. Consumers can seek the greater of actual or statutory damages, which range from $100 to $750 per consumer, per incident, as well as injunctive or declaratory relief, subject to a 30-day cure period.
CCPA subjects violators to a penalty schedule beginning at $2,500 per violation, or $7,500 per intentional violation, again subject to a 30-day cure period. GDPR employs a higher-of rule: fines of up to twenty million euros or four percent of annual global revenue, whichever is greater, in addition to any penalties imposed by individual member states.
E. Developing Trends
Trunomi founder Stuart Lacey encourages consumers to recognize the reality: your data is being sold. Why not willingly enter into the equation, profit from the sale of your own data, and take some control back? As awareness grows, this approach may appeal to some. Others may push for a more stringent system that goes beyond CCPA, or even GDPR. At the very least, this issue is top of mind for those informed about the practices corporate actors are currently undertaking (or failing to undertake), particularly with respect to transparency. Still others accuse and suspect Big Tech and their algorithms of worse, ranging from rumors of shadow banning to more outlandish conspiracy theories. With artificial intelligence around the bend, these issues are sure to become more and more prominent in the public consciousness. Compliance with both GDPR and CCPA may impose a heavy burden on smaller firms that cannot afford to maintain two different systems. They may be forced to adopt whichever regime is stricter or attempt duplicative compliance. The pressure of dual systems may in turn generate pressure for legislative action at the federal level.
VII. Conclusion
How online privacy will be defined remains to be seen. The Internet’s pioneering history has, thus far, largely gone unpunctuated by regulation, a trend befitting its nature as a modern technological frontier. The GDPR represents a frontline of sorts in combating such uncertainty, providing regulatory uniformity throughout the EU so as to instill consumer confidence, encourage the use of the Internet, and boost Europe’s digital economy. New rights protect individuals, while new principles direct organizations to act with the utmost care and responsibility in handling the personal digital data of citizens. Recent developments, such as cross-border trade agreements and the enactment of comparable legal protections, should serve as a signal to nations and organizations alike that the time is now to take stock of and implement modern data privacy protections.