Jusletter IT

Can Data Protection be Improved through Privacy Impact Assessments?

  • Author: Rolf H. Weber
  • Category: Scientific Articles
  • Region: Switzerland
  • Field of law: Data Protection, Data Security
  • Citation: Rolf H. Weber, Can Data Protection be Improved through Privacy Impact Assessments?, in: Jusletter IT 12 September 2012
Due to the increasing sensitivity of civil society about privacy on the one hand, and the lack of globally harmonized data protection principles on the other, self-regulatory measures are gaining importance in privacy compliance. The most relevant «mechanism» is the Privacy Impact Assessment (PIA) as developed by the EU and other institutions. The contribution discusses the elements of PIA processes in view of their risk assessment functions and pleads for a more comprehensive application in data protection frameworks.

Table of Contents

  • 1. Relevance of Compliance in Privacy Matters
  • 2. Legal Foundation of Privacy Compliance Elements
  • 3. Privacy Impact Assessment as New Control Measure
  • 3.1. Notion and Benefits
  • 3.2. RFID as Example and Predecessor of PIA
  • 3.3. Envisaged New Legal Basis for PIA in the EU
  • 3.4. Swiss Prospects
  • 4. Important Elements of Privacy Impact Assessment
  • 4.1. Risk Assessment Factors
  • 4.2. Risk Assessment Process
  • 4.3. Consideration of the Privacy-by-Design Concept
  • 5. Outlook

1. Relevance of Compliance in Privacy Matters

[1]

Recently, aspects of privacy have gained greater attention in civil society. Many users of electronic network infrastructures such as the Internet have become more sensitive about data collection and data use by large enterprises. In view of fast-moving technologies, laws cannot easily keep pace with protective measures in this context. Therefore, privacy compliance is more and more an issue for enterprises mindful of their reputation for conscientious behavior in their fields.

[2]

Indeed, the protection of data quality and data integrity has become such an important topic in the life of organizations (80% of corporate assets are digital nowadays) that the management cannot avoid the task of establishing a data protection policy. The details of such a policy depend upon the range of functions in the given (market) environment. The more sensitive the data held and processed by an organization, the higher the standards that must be established.
[3]

The term «privacy» has successfully defied attempts at giving it a precise meaning.1 On the one hand, privacy inevitably varies from society to society;2 therefore, the existing rules show a patchwork of different provisions. On the other hand, privacy conveys a large number of concepts and ideas.3 Generally speaking, three basic features of privacy can be distinguished, namely (i) secrecy, i.e. information known about an individual, (ii) solitude, i.e. access to an individual, and (iii) anonymity, i.e. attention paid to an individual.4 Concepts of privacy can be rooted in human dignity, breach of confidence relating to proprietary rights, and protection of individual autonomy from state interference.5

[4]

In view of these basic features of privacy and the applicable data protection laws, enterprises must develop their strategy for complying with the relevant legal requirements:
  • Organizational rules have to describe the functions of responsible persons and segregate the duties among them;
  • The policy has to describe the security levels and the measures applied to achieve such levels;
  • Project management needs to be implemented and conditions for user participation should be outlined;
  • A data classification scheme is to be established in order to control access rights (a minimal sketch follows this list);
  • The responsibilities and external requirements for review processes must be introduced.
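To illustrate the data classification point, the following is a minimal sketch in Python, assuming a hypothetical three-level scheme and role names (nothing here is prescribed by data protection law; real schemes depend on the organization and the sensitivity of the data held):

```python
from enum import IntEnum

# Hypothetical three-level data classification; real schemes are
# organization-specific and may use more levels.
class Classification(IntEnum):
    PUBLIC = 1
    INTERNAL = 2
    SENSITIVE = 3

# Illustrative mapping of roles to the highest classification they may read.
CLEARANCE = {
    "employee": Classification.INTERNAL,
    "data_protection_officer": Classification.SENSITIVE,
    "external_auditor": Classification.PUBLIC,
}

def may_access(role: str, data_class: Classification) -> bool:
    """Grant access only if the role's clearance covers the data class."""
    return CLEARANCE.get(role, Classification.PUBLIC) >= data_class

assert may_access("data_protection_officer", Classification.SENSITIVE)
assert not may_access("employee", Classification.SENSITIVE)
```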
[5]

A new instrument which could be used in the context of compliance with data protection regulations is the privacy impact assessment (PIA), generally regarded as a systematic risk assessment tool that can usefully be integrated into decision-making processes, evaluating a proposal in terms of its impact on personal data privacy with the objective of avoiding or minimizing adverse effects.6

2. Legal Foundation of Privacy Compliance Elements

[6]

(i) As to the legal basis of privacy compliance, self-regulation needs to be given closer attention in the first instance. Self-regulation follows the principle of subsidiarity, meaning that government intervention should only take place if the participants of a specific community are not able to find suitable solutions (structures, behaviors) themselves; since, however, public law defines the contours of private law, it frames the role of self-regulatory mechanisms.7 The legitimacy of self-regulation lies in the fact that private incentives lead to a need-driven rule-setting process.

[7]

A major advantage of self-regulation can be seen in the possibility to develop and establish rules independently of the principle of territoriality; other strengths are the efficiency in responding to real needs and mirroring the technology, the openness for a permanent consultation process and for a timely adaptation of rules in case of changing technologies, as well as the existence of incentives for compliance with a «self-given» legal framework.8 Disadvantages of self-regulation, however, should not be overlooked: The quality of the «legislative» process can hardly be judged under the angle of democratic participation (problem of outsiders), «private norms» are not generally binding in legal terms, self-regulatory mechanisms are not always stable, and – in particular – this kind of legal framework hardly provides for enforcement procedures leading to sanctions in case of non-compliance.9

[8]

(ii) Since technical elements play an important role, the importance of technology-oriented theory approaches should not be underestimated. In the light of a perceived dissatisfaction with the available regulatory models, Lawrence Lessig developed a new, more technically oriented approach for the online world some 10 years ago, the so-called «code-based regulation».10 According to Lessig, human behavior is regulated by a complex interrelation between four forces, namely law, markets, social norms and architecture.11 Code solutions, similar to legal rules, principally reflect «information» that allocates and enforces entitlements. The design of the code materially influences human behavior since architecture is one of the four regulators; depending on the architecture, certain activities will be easy or difficult to carry out.12 Therefore, Lessig arrives at a world in which code can do much of «the work that the law used to do far more effectively than the law did.»13 Consequently, effective regulatory power shifts from law to code based on an effective architectural framework.14 Lessig’s approach, which relates code/architecture to the control paradigm, has not remained uncontested: For example, the aspect of established control structures has a political impact; furthermore, courts should impose checks on the powers of private regulators where such regulation threatens important collective values.15

[9]

(iii) A general surveillance function could be assumed by international legal instruments. Traditional national legislation is confronted with the major disadvantage of limited scope due to the territoriality principle:16 Domestic rules «only» apply within the boundaries of the concerned country (at least as long as an extraterritorial effect of the law is not envisaged), with the consequence that a harmonized level of privacy protection cannot be achieved. Consequently, as is the case today, national provisions in the privacy field require data-exporting persons and enterprises to comply with the principle of a comparable level of protection abroad.17

[10]

From a theoretical point of view, a transnational approach is inevitable in the online world.18 Nevertheless, the development and implementation of internationally harmonized rules is hardly possible if societal perceptions vary as widely as in the case of privacy; since customary law cannot be built on the basis of diverging understandings of privacy either, a successful harmonization of rules is not likely to happen. Therefore, internationally binding agreements related to privacy exist merely at a regional level (for example within the European Union). Other legal instruments only have non-binding character (as is the case for the UN and OECD guidelines related to data protection). Fundamental freedoms in human rights conventions often encompass the right to privacy, but the principles are usually worded in a relatively vague way, which makes their practical enforceability doubtful. Nevertheless, even if not all desirable principles work out as envisaged, experience with transnational law proves that global problems can be tackled by the international community,19 which obviously means that the community should strengthen the efforts to negotiate and conclude additional treaties.20

[11]

(iv) Compliance, for example in the form of the privacy impact assessment discussed below, must rely on private initiatives and efforts; therefore, self-regulation plays an important role. In addition, technical designs are the basis for implementing privacy standards, thereby enabling the successful implementation of privacy policies. Cross-border acceptance of privacy measures can increase their reputation and acknowledgement. Consequently, all legal instruments have the potential to exercise a relevant function.

3. Privacy Impact Assessment as New Control Measure

3.1. Notion and Benefits

[12]

A Privacy Impact Assessment (PIA) offers data users an «early warning» system, since privacy problems might be identified and detected prior to the implementation of certain systems.21

[13]

The benefits of conducting a PIA encompass the possibility (i) to establish and maintain compliance with privacy and data protection laws and regulations, (ii) to manage privacy risks within an organization and in relation to third persons, and (iii) to provide public benefits through the success of privacy-by-design efforts.22 In addition, a PIA is useful for adequately considering the impact of an action on personal data privacy, directly addressing the privacy problems in the process, providing solutions or safeguards at the design stage, benchmarking for future privacy compliance audits and controls, offering a cost-effective way of reducing privacy risks, and providing a credible source of information to allay privacy concerns of the public and the stakeholders.23

[14]

Apart from the PIA related to RFID applications within the European Union, which is the most detailed and is thoroughly discussed below, other authorities have also issued general leaflets and guidelines24 or have dealt with the PIA in connection with specific issues such as public security25 or e-government.26

3.2. RFID as Example and Predecessor of PIA

[15]

An extensive and still ongoing PIA project concerns Radio Frequency Identification (RFID) applications. On May 12, 2009, the European Commission issued a Recommendation on the implementation of privacy and data protection principles in applications supported by RFID.27 This Recommendation was to become the basis for a framework related to privacy impact assessments, developed by the industry in collaboration with all relevant stakeholders and supervised by the Article 29 Data Protection Working Party.28 On March 31, 2010, industry representatives delivered a PIA framework proposal, which was reviewed and partly sent back for improvement by the Article 29 Data Protection Working Party. The second proposal was then approved on February 11, 2011.29 In April 2011, the «inauguration» of the PIA framework on RFID applications was formally enacted, and in November 2011 a first review of the actual implementation took place.

[16]

In the understanding of the EU authorities the PIA should be designed on the basis of the following taxonomy:30

  • The PIA is to be introduced as a process making the assessment of privacy impacts of certain activities a conscious and systematic effort.
  • The framework should identify the objectives of the (RFID) applications as well as the common structure and content of such applications.
  • A PIA report must be made available documenting the PIA process and addressing the review steps of its implementation.
  • PIA templates may be developed based on the framework to provide industry-based, application-based, or other specific formats for PIA processes.
[17]

According to the relevant documentation, the PIA process related to RFID applications is constructed in two phases (a minimal sketch follows this list):31

  • A pre-assessment phase classifying the RFID applications according to a four-level scale, based on a decision tree.
  • A risk assessment phase, broken down into four main steps, namely (i) characterization of the application (data types, data flows, RFID technology, data storage and transfers, etc.), (ii) identification of the risks to personal data and evaluation of the threats (likelihood and potential impact), (iii) identification and recommendation of controls, in response to previously identified risks, and (iv) documentation of the results of the PIA process (incl. conditions for review and information concerning residual risks).
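The following is a minimal, illustrative sketch of this two-phase construction; the decision-tree questions, level semantics and data structures are simplified assumptions for illustration, not the framework's actual definitions:

```python
from dataclasses import dataclass, field
from typing import Dict, List

def pre_assessment(processes_personal_data: bool,
                   tags_carried_by_individuals: bool) -> int:
    """Phase 1: classify the RFID application on a four-level scale (0-3)
    via a decision tree. The questions below are simplified stand-ins for
    the framework's actual decision tree."""
    if not processes_personal_data and not tags_carried_by_individuals:
        return 0  # level 0: no PIA required
    if not processes_personal_data:
        return 1  # level 1: small-scale PIA
    if not tags_carried_by_individuals:
        return 2  # level 2: full-scale PIA
    return 3      # level 3: full-scale PIA, highest exposure

@dataclass
class PIAReport:
    """Phase 2, step (iv): documentation of the PIA results."""
    characterization: Dict[str, str]                         # step (i)
    risks: List[str] = field(default_factory=list)           # step (ii)
    controls: Dict[str, str] = field(default_factory=dict)   # step (iii)
    residual_risks: List[str] = field(default_factory=list)

def risk_assessment(characterization: Dict[str, str],
                    risks: List[str],
                    controls: Dict[str, str]) -> PIAReport:
    """Phase 2: characterize the application, take over the identified
    risks and recommended controls, and document what remains uncovered."""
    residual = [r for r in risks if r not in controls]
    return PIAReport(characterization, risks, controls, residual)

if __name__ == "__main__":
    level = pre_assessment(processes_personal_data=True,
                           tags_carried_by_individuals=True)
    if level > 0:  # levels 1-3 require a (small- or full-scale) PIA
        report = risk_assessment(
            {"technology": "RFID", "storage": "central database"},
            ["secret data collection", "missing erasure policy"],
            {"missing erasure policy": "automatic tag deactivation"})
        print(level, report.residual_risks)
```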
[18]

Furthermore, the relevant documentation requests the PIA application operators to establish their own internal procedures in view of supporting the execution of the PIA processes, encompassing for example the following steps (a small sketch of the last two points follows this list):32

  • Scheduling of the PIA process in order to plan the necessary executions and adjustments in time and in compliance with industry requests and the supervision by the competent authorities;
  • Internal review of the PIA process (incl. the initial analysis) and of the PIA reports in view of the compliance with the applicable documentation and the factual implementation of the relevant measures;
  • Compilation of supporting artifacts (for example results of security reviews, control designs) as evidence that the processes are executed in a proper way;
  • Determination of the persons and/or functions within the organization who have the authority for relevant actions during the PIA process;
  • Provision of criteria of how to evaluate and document whether the actual applications are ready for deployment consistent with the PIA framework and any relevant PIA template;
  • Consideration/identification of factors that would require a new or revised PIA process, for example due to significant changes in the applications, failures in the processes, weaknesses in implementation measures, etc.;
  • Stakeholder consultation in order to receive appropriate feedback from all directly or indirectly involved persons/organizations/authorities and, thereupon, to be in a position to improve the PIA process.
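As a small sketch of the last two points (scheduling and revision triggers), under the assumption of illustrative trigger names and a yearly review interval, neither of which is prescribed by the framework:

```python
from datetime import date, timedelta

# Illustrative trigger events requiring a new or revised PIA; the names
# are assumptions, not terms defined by the PIA Framework.
REVISION_TRIGGERS = {
    "significant application change",
    "process failure",
    "implementation weakness",
}

def needs_revised_pia(observed_events: set) -> bool:
    """Flag a new/revised PIA whenever any revision trigger was observed."""
    return bool(observed_events & REVISION_TRIGGERS)

def next_review(last_review: date, interval_days: int = 365) -> date:
    """Scheduling: compute when the next internal PIA review is due
    (the yearly interval is an illustrative assumption)."""
    return last_review + timedelta(days=interval_days)

print(needs_revised_pia({"process failure"}))  # True
print(next_review(date(2012, 9, 12)))          # 2013-09-12
```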
[19]

If issues or even problems occur, a specific corrective action plan needs to be developed; relevant risks identified should be appropriately mitigated in order to ensure that no significant residual risks remain, and a new PIA might have to be executed.33

3.3. Envisaged New Legal Basis for PIA in the EU

[20]

In the course of the fundamental revision of the EU Directive on Data Protection 95/46,34 launched on January 25, 2012,35 the EU Commission proposes to include a specific provision related to a data protection impact assessment into the future legal framework, namely in the newly drafted Data Protection Regulation.36 Art. 33 para. 1 of the proposal reads as follows:

[21]
«Where processing operations present specific risks to the rights and freedoms of data subjects by virtue of their nature, their scope or their purposes, the controller or the processor acting on the controller’s behalf shall carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.»
[22]

Thereafter, the proposed provision lists specific (data protection) risks in case of data processing (para. 2), followed by the conditions to be fulfilled by the impact assessment (para. 3), namely the description of the envisaged processing operations and of the risks to the rights and freedoms of data subjects, the measures envisaged to address the risks, safeguards, security measures and mechanisms to ensure the protection of personal data. Furthermore, additional provisions regulate the exchange of views on a PIA and the supervision by the competent authorities (paras. 4–7).37

[23]

The outcome of the legislative process is obviously not yet known. However, the opposition to a pre-final draft of the legal framework, which unintentionally surfaced to the public in November 2011, mainly concerned the form (Regulation instead of Directive), the introduction of new fundamental rights (such as the right to be forgotten) and the extended supervisory regime with massive powers of data protection surveillance authorities; it was not directed against the principle of a privacy impact assessment. Therefore, the fair assumption might be expressed that the PIA is going to survive the legislative process and will come into force, expectedly in 2014.

3.4. Swiss Prospects

[24]

So far, concrete proposals regarding the introduction of the PIA concept do not exist in Switzerland. Nevertheless, Art. 11 para. 1 of the Swiss Data Protection Act (Swiss DPA)38 contains a provision stating that, in order to improve data protection and data security, manufacturers of data processing systems and programs as well as private individuals and federal bodies processing personal data may have their systems, processes and organization evaluated by independent certification authorities. However, as a basis for the legal recognition of data protection certifications, Art. 11 Swiss DPA only has a limited effect; for example, a certification gives no clear statement as to what extent the rules of the Data Protection Act are observed by the respective enterprise.39 Furthermore, the certification according to the wording of Art. 11 para. 2 Swiss DPA is of a voluntary nature and therewith includes no enforceable obligation. But Switzerland might consider implementing some features of the ongoing EU revision of data protection legislation into the next, already envisaged amendment to the DPA.

4. Important Elements of Privacy Impact Assessment

[25]

If compliance with data protection standards through a privacy impact assessment is to be successful, some elements are of particular importance.

4.1. Risk Assessment Factors

[26]
The objective of a risk assessment is to identify the privacy risks caused by specific processes or applications; the discussed RFID applications are a new technological tool, but may only serve as an example since privacy risks occur in virtually every business and in government. Therefore, the development of an adequate and appropriate risk design is crucial.
[27]

In the context of a PIA, privacy risks are to be concretely identified. Notwithstanding the fact that each industry or business is confronted with some specific privacy risks, it seems possible to list «critical factors» which might be applicable in virtually all market segments. On that note, the following privacy risks should be taken into consideration (they are encoded as a screening checklist in the sketch after this list):40

  • No data collection for unspecified and unlimited purposes;
  • No data collections exceeding their purpose;
  • Avoidance of incomplete information and lack of transparency;
  • No combinations of data collections exceeding the given purpose of collection;
  • Lack of erasure policies or mechanisms;
  • Avoidance of invalid explicit consent to data collection (for example, consent obtained under threat);
  • No secret data collection;
  • Avoidance of situations in which access cannot be granted;
  • No technical/operational measures preventing objections by data subjects;
  • Lack of transparency of automated individual decisions;
  • Insufficient access right management;
  • Insufficient authentication mechanisms;
  • Illegitimate data processing;
  • Insufficient logging mechanisms;
  • Uncontrollable data gathering.
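A minimal sketch showing how such a list could be encoded as a reusable screening checklist; the abbreviated item wording and the findings format are illustrative assumptions:

```python
# Abbreviated checklist of the privacy risks listed above; wording
# shortened for illustration.
PRIVACY_RISK_CHECKLIST = (
    "collection for unspecified/unlimited purposes",
    "collection exceeding purpose",
    "incomplete information / lack of transparency",
    "combination of collections exceeding purpose",
    "missing erasure policies or mechanisms",
    "invalid explicit consent",
    "secret data collection",
    "inability to grant access",
    "measures preventing objections by data subjects",
    "opaque automated individual decisions",
    "insufficient access right management",
    "insufficient authentication mechanisms",
    "illegitimate data processing",
    "insufficient logging mechanisms",
    "uncontrollable data gathering",
)

def flagged_risks(findings: dict) -> list:
    """Return the checklist items marked as present for an application."""
    return [item for item in PRIVACY_RISK_CHECKLIST if findings.get(item)]

print(flagged_risks({"secret data collection": True,
                     "insufficient logging mechanisms": True}))
```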
[28]

Seen from the angle of the data subject, the relevant factors of privacy risks which are to be taken into account encompass (i) the functions and activities of the data users, (ii) the nature of the personal data involved, (iii) the number of individuals affected, (iv) the gravity of harm caused in case of improper handling of data protection rules and (v) the compliance with the privacy standards contained in applicable codes, policies, practices and regulations.41

[29]

Ideally, efforts should be directed, already at an early stage of a system’s development, towards a pro-active mitigation of risks through technical and organizational controls.42 The better the precautionary measures realize the requested privacy objectives, the less likely it is that problems will occur during the subsequent PIA process. Sometimes, PIA leaflets of data protection authorities contain valuable lists of possible measures for avoiding and mitigating privacy risks.43

4.2. Risk Assessment Process

[30]

Risk assessment processes are known from many segments of industry, mainly the high-technology and health fields. Lessons learned are that risk assessments should be run at an early stage, prior to the decision definitely to implement certain systems, that measures susceptible to malicious attacks should be avoided, and that applications already configured in a privacy-friendly way should be given special preference.
[31]

Generally speaking, despite the difficulties, it is unavoidable that, in addition to the identification of the risks, a PIA process includes a relative quantification of the risks, i.e. the PIA operator must consider, guided by the principle of proportionality and under reasonable conditions, whether privacy risks are likely or not likely to materialize in the processes executed in the business.44 Such a risk assessment requires evaluating the applicable risks from a privacy perspective, encompassing the significance of the risk, the likelihood of its occurrence and the magnitude of the impact (low, medium, high) in case of its occurrence.
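A simple scoring sketch of this relative quantification; the three-level scales follow the text above, while the multiplicative scoring rule and the example risks are illustrative assumptions, not prescribed by the PIA Framework:

```python
# Three-level likelihood and impact scales, mapped to numeric weights.
LEVELS = {"low": 1, "medium": 2, "high": 3}

def risk_score(likelihood: str, impact: str) -> int:
    """Relative quantification of a single risk on a 1-9 scale."""
    return LEVELS[likelihood] * LEVELS[impact]

def prioritize(risks: dict) -> list:
    """Rank risks so that mitigation effort can remain proportionate."""
    scored = [(name, risk_score(lik, imp))
              for name, (lik, imp) in risks.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

print(prioritize({"secret data collection": ("medium", "high"),
                  "insufficient logging": ("high", "low")}))
# [('secret data collection', 6), ('insufficient logging', 3)]
```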

[32]

A generally important problem in the risk assessment context concerns the uncertainties and complexities inherent in risk analyses and the use of science (technology) policies. Usually, three aspects of scientific evaluation are taken into account:45 (i) balancing categories of evidential reasoning, (ii) judging data and theories, and (iii) considering desiderata of rationality. The balancing categories concern how to weigh different risk estimates; judging data encompasses the determination of the quality of the data and theories used, depending on statistical properties, methodology, reliability, relevance and the level of scrutiny by the scientific community. Rationality is described as including conceptual clarity for all terms used in the discourse, logical deduction, methodological rigor, practicality, ontological realism, epistemological reflection, and valuation.

[33]

So far, experience with the application of PIA is still limited and knowledge must be gained in practice. The first review of the efforts related to the RFID PIA in November 2011, however, gives at least some confidence that this instrument could also be used in other contexts.46

4.3. Consideration of the Privacy-by-Design Concept

[34]

Since an international harmonization of data protection laws is not likely to happen during this decade, the merits of a well-drafted privacy policy of the concerned entity (business, government) should not be underestimated, particularly since guidelines and models are available and experience has already been gained.47 This assessment goes along with the fact that, according to studies related to regulations governing the transborder flow of data, an organizationally based approach making data exporters accountable for ensuring the continued protection of personal data is the most viable tool for an appropriate privacy framework.48 Obviously, adequate governing practices within the organization must be put in place and, even more, accountability measures are to be implemented.49

[35]

In order to improve the privacy legal framework, legal scholars have proposed to turn to an «IT-security legislation» approach.50 This concept encompasses initiatives that demand the establishment of reliable IT-security standards which should protect from unauthorized disclosure of data; for example, new industry-based technological standards could introduce stringent safeguards. Thereby, technological «norms» would delineate a legal concept within which a broad range of privacy problems can be addressed.51 Such an approach has the advantage of leading to a concept of relational privacy that takes into account the specific demands of the manifold data subjects.52 Furthermore, the implementation of an adequate PIA could eventually mitigate civil liability.53

[36]

From a technological angle, a pro-active integration of privacy principles into a system’s design should be achieved in the form of the privacy-by-design concept.54 This term can be defined as a pro-active engineering and management approach enabling a selective and sustainable minimization of information systems' privacy risks through technical and governance controls.55 The seven fundamental principles of privacy-by-design are:56 (i) proactive not reactive; preventive not remedial, (ii) privacy as the default setting, (iii) privacy embedded into design, (iv) full functionality – positive-sum, not zero-sum, (v) end-to-end security – full lifecycle protection, (vi) visibility and transparency – keep it open, (vii) respect for user privacy – keep it user-centric.
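To make principle (ii), privacy as the default setting, concrete, the following is a minimal sketch with hypothetical setting names; the defaults are chosen so that the most privacy-protective option applies unless the user actively opts in:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Defaults embody principle (ii): the most privacy-protective option
    applies out of the box, and the user must actively opt in."""
    collect_usage_analytics: bool = False   # no collection by default
    share_with_third_parties: bool = False  # no disclosure by default
    profile_publicly_visible: bool = False  # no exposure by default
    retention_days: int = 30                # minimal retention by default

settings = PrivacySettings()  # privacy-protective without any user action
assert not settings.share_with_third_parties
```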

[37]

Summarizing, the PIA enables the concerned entities to design the conditions of the data protection framework according to their individual needs. Since the risks to privacy and to the non-disclosure of personal data differ in non-identical circumstances, the protection measures should also differ; i.e., technology as a precautionary means should assist in trying to achieve the (at least) second-best solution for implementing the data protection regime through a PIA. Insofar, privacy rules can be individualized and matched with the concrete needs of the given environment.

5. Outlook

[38]

Non-compliance by the responsible persons of an organization with the above-described precautionary principles related to a legal risk management concept for privacy can lead to personal liability. Even without a specific harmonization of legal systems in this field, corporation laws usually provide for liability if the governing bodies of an organization do not comply with the appropriate managerial duties. In the meantime, legal doctrine widely accepts that the implementation of a risk management system for information infrastructures (including data protection) is indeed a task, and thereby a legal obligation, of the management.

[39]

Apart from the legal responsibility as such, compliance with data protection rules is also a matter of perception and reputation. Therefore, many organizations have meanwhile recognized that the establishment of an adequate data protection policy can constitute an inherent business value, for example as a «selling argument» in electronic transactions with customers.57 Consequently, the notion of a systemic data protection, calling for an overall regime to protect data quality and data integrity, gains importance.58

[40]

In particular, private institutions and associations have introduced procedures which allow the data protection policy of specific organizations to be reviewed on a self-regulatory basis. If the targets of the respective audit are met, the concerned organization receives a certificate related to its compliance with data protection rules, which can be used as a label.59 Another possibility consists in the introduction of ratings judging the level of data protection of the enterprises in a specific market.

 


Rolf H. Weber is Chair Professor for Business Law at the University of Zurich, Visiting Professor at the Hong Kong University and attorney-at-law in Zurich.

 

This article is based on an earlier contribution of the author: Privacy Impact Assessment – A Privacy Protection Improvement Model?, in: 25th IVR World Congress, Law, Science and Technology, Paper Series No. 039/2012, Series B, pp. 1-13, http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/24897.

 


  1. Cheung, A., Rethinking Public Privacy in the Internet: A Study of Virtual Persecution by the Internet Crowd, The Journal of Media Law Vol. 1:2 (2009), p. 191.
  2. The best indicator for this assessment is a comparison of the national rules on transborder data flows; thereto see the extensive list of Kuner, C., Regulation of Transborder Data Flows under Data Protection and Privacy Law: Past, Present and Future, TILT Law & Technology Working Paper No. 016/2010 (October 2010), Version: 1.0, Annex, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1689483.
  3. Warren, S./Brandeis, L., The Right to Privacy, Harvard Law Review 4(5) (1890), p. 205 refer to the right «to be let alone»; see also Hosein, G., Privacy as Freedom, in: Jørgensen (ed.), Human Rights in the Global Information Society, Cambridge (2006), pp. 122-125 and 131-135.
  4. Wacks, R., Law, Morality, and the Private Domain, Hong Kong (2000), p. 238; Weber, R.H., Shaping Internet Governance: Regulatory Challenges, Zurich (2009), p. 239.
  5. Cheung, supra note 1, p. 193; see in general Weber, R.H., How Does Privacy Change in the Age of the Internet, in: Fuchs et al. (eds), Internet and Surveillance: The Challenges of Web 2.0 and Social Media, New York (2012), p. 280.
  6. See Office of the Privacy Commissioner for Personal Data (PCPD), Hong Kong, Privacy Impact Assessment (PIA) (July 2010), p. 1, http://www.pcpd.org.hk/english/publications/files/PIAleaflet_e.pdf; see also Australian Government, Office of the Privacy Commissioner (OPC), Privacy Impact Assessment Guide (August 2006), p. 4, http://www.privacy.gov.au/publications/PIA06.pdf.
  7. Weber, R.H., Regulatory Models for the Online World, Zurich (2002), p. 79.
  8. Weber, supra note 7, pp. 80, 83-84; Weber, R.H., Overcoming the Hard Law/Soft Law Dichotomy in Times of (Financial) Crises, Journal of Governance and Regulation Vol. 1:1 (2012), pp. 8-14.
  9. Weber, supra note 7, pp. 84-85; Weber, supra note 8, p. 13.
  10. Lessig, L., Code and Other Laws of Cyberspace, New York (1999), pp. 3 et seq.
  11. Lessig, supra note 10, p. 88.
  12. Lessig, supra note 10, p. 129.
  13. Lessig, supra note 10, p. 130.
  14. Lessig, supra note 10, p. 296.
  15. See Mayer-Schönberger, V., Delete: The Virtue of Forgetting in the Digital Age, Princeton (2009), pp. 145/46; see also Weber, supra note 7, p. 99.
  16. For a general overview see Weber, supra note 7, pp. 57 et seq.
  17. Kuner, C., Regulation of Transborder Data Flows under Data Protection and Privacy Law, OECD Digital Economy Papers, No. 187 (2011), OECD Publishing, p. 21, http://dx.doi.org/10.1787/5kg0s2fk315f-en.
  18. Weber, supra note 7, p. 77.
  19. Biegel, S., Beyond our Control? Confronting the Limits of our Legal System in the Age of Cyberspace, Cambridge (2001), p. 184.
  20. Weber, supra note 7, p. 77.
  21. The following subchapters 3.1-3.3 and 4.1-4.2 closely follow the earlier contribution of the author (cited in the author description).
  22. Privacy and Data Protection Impact Assessment Framework for RFID Applications (12 January 2011), p. 3, http://cordis.europa.eu/fp7/ict/enet/documents/rfid-pia-framework-final.pdf.
  23. PCPD Hong Kong, supra note 6, p. 1.
  24. For example PCPD Hong Kong, supra note 6, and Australian OPC, supra note 6.
  25. See US Department of Homeland Security, State, Local, and Regional Fusion Center Initiative, Privacy Impact Assessment (December 11, 2008), http://www.dhs.gov/xlibrary/assets/privacy/privacy_pia_ia_slrfci.pdf.
  26. See for example the United States E-Government Act of 2002, Pub. L. No. 107-347 (Dec. 17, 2002).
  27. http://ec.europa.eu/information_society/policy/rfid/documents/recommendationonrfid2009.pdf.
  28. See also Weber, R.H./Weber, R., Internet of Things: Legal Perspectives, Zurich (2010), pp. 45/46.
  29. Opinion 9/2011 on the revised Industry Proposal for a Privacy and Data Protection Impact Assessment Framework for RFID Applications, 00327/11/EN, WP 180.
  30. PIA Framework, supra note 22, p. 4.
  31. Opinion 9/2011, supra note 29, pp. 4/5.
  32. PIA Framework, supra note 22, pp. 4/5.
  33. PIA Framework, supra note 22, p. 10.
  34. European Parliament, Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on Free Movement of such Data, 1995, http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:1995:281:0031:0050:EN:PDF.
  35. European Commission, Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), COM(2012) 11/4 draft, 2012, http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf.
  36. A Regulation of the EU would be directly applicable in the Member States, in contrast to a Directive which has to be implemented in national law; nevertheless, the competence of the EU to release a Regulation in the data protection field is hotly debated and highly contested.
  37. See also the short comments of Kuner, C., The European Commission’s Proposed Data Protection Regulation: A Copernican Revolution in European Data Protection Law, Privacy and Security Law Report (Feb. 06, 2012), pp. 1, 8, http://www.huntonprivacyblog.com/wp-content/uploads/2012/02/Kuner-EU-regulation-article.pdf; Traung, P., The Proposed New EU General Data Protection Regulation, Computer Law Review International (CRi) Vol. 2 (2012), pp. 33, 46/47.
  38. Swiss Data Protection Act, http://www.admin.ch/ch/d/sr/2/235.1.de.pdf.
  39. Rosenthal, D., Art. 11 Data Protection Act, margin number 1, in: Rosenthal/Jöhri (eds), Commentary on the Swiss Data Protection Act, Zurich (2008).
  40. The list of privacy risks follows the PIA Framework, supra note 22, Annex III.
  41. PCPD Hong Kong, supra note 6, p. 2.
  42. PIA Framework, supra note 22, p. 7.
  43. See for example PCPD Hong Kong, supra note 6, p. 2.
  44. PIA Framework, supra note 22, p. 9.
  45. See for example Crawford-Brown, D./Pauwelyn, J./Smith, K., Environmental Risk, Precaution, and Scientific Rationality in the Context of WTO/NAFTA Trade Rules, Risk Analysis 24 (2004), pp. 461, 465.
  46. Spiekermann, S., The RFID PIA – developed by industry, agreed by regulators, in: Wright et al. (eds.), Privacy Impact Assessment: Engaging Stakeholders in Protecting Privacy (2012), p. 9, pre-publishing version, http://ec.europa.eu/information_society/policy/rfid/documents/pia_spiekermann.pdf.
  47. For further details see Weber, supra note 5, p. 280.
  48. Kuner, supra note 17, pp. 20 and 26.
  49. PIA Framework, supra note 22, pp. 17 and 19.
  50. See Weber, supra note 5, pp. 283 and 286.
  51. See Solove, D.J., A Taxonomy of Privacy, University of Pennsylvania Law Review 154(3) (2006), pp. 477 et seq.; Weber, supra note 5, p. 286.
  52. See Kleve, P./De Mulder, R., Privacy protection and the right to information: a search of a new symbiosis in the information age, in: Kierkegaard (ed.), Cyberlaw & security privacy, 2nd ed., Ankara (2007), p. 340.
  53. See Gellert, R./Kloza, D., Can Privacy Impact Assessment Mitigate Civil Liability? A «Precautionous» Approach, in: Jusletter IT 24 February 2012.
  54. This term has been coined by Cavoukian, A., Information & Privacy Commissioner Ontario, Canada, Privacy by Design: The 7 Foundational Principles, http://www.privacybydesign.ca/content/uploads/2009/08/7foundationalprinciples.pdf.
  55. See Spiekermann, S., The Challenges of Privacy-by-Design, Communications of the ACM, Viewpoint (July 2012), Vol. 55/No. 7, p. 1.
  56. Cavoukian, supra note 54, p. 2.
  57. See, as an example, the initiatives of the organization e-comtrust improving the data quality standards of online shops (http://www.e-comtrust.ch/).
  58. See Weber, R.H., Datenschutzrecht vor neuen Herausforderungen, Zurich (2000), pp. 73 et seq.
  59. The certification process done through the intermediary services of e-comtrust (supra note 57) is based on the European e-commerce standards SNR CWA 14842:2003.