Charles D Raab. International Review of Law, Computers & Technology. Volume 11, Issue 1. March 1997.
There is much debate amongst policy-makers, practitioners and commentators on the protection of the privacy of personal information as to the continued viability of historical solutions after more than twenty-five years of regulatory activity. Alongside this reassessment of the past, it is felt to be important to take stock of the sources and implications of new strategies which will characterize the foreseeable future. However, whereas most discussions of privacy or data protection focus upon particular legal, technological and other means of protecting privacy, they mainly leave under-explored the question of their interaction in a given system, and therefore cast little light upon the extent to which the individual approaches may act as partial pre-conditions for each other.
Building upon that groundwork, this article seeks to clarify the combinatorial possibilities of privacy-protecting strategies by gaining a perspective on the actors or participants who use them and the nature of the emergent networks of interdependent protective activity. It explores the relationships amongst these ‘stakeholders’, seeking new insights into the nature of regulatory policy as a collaborative activity of the state, civil society and the market, albeit one which takes place in some considerable tension with the adversarial premises upon which regulation is based.
Taking Stock of the Present
It is first necessary to give a sketch of the challenging conditions which privacy or data protection laws and systems address. These conditions are a mixture of technological, social, economic and governmental factors. They constitute the world which generates the participants’ relationships and shapes those relationships as adversarial or co-operative.
The state of information and communication technology (ICT) with which data protection laws and policies now contend is far different from that of the ‘computer age’ of the 1970s, when the first laws were drafted. Rapid developments in ICT and in the purposes to which personal data are put have raised serious doubts about the continued adequacy of solutions based upon a culture of mainframes, ‘databanks’, single-country considerations, and a conceptual division of sectors into public and private. Governments attempted to regulate personal-data activities within that culture, whilst also preserving the values of free interchange of information in commerce and administration. These aims are still current, as are the regulatory and legitimising aspects of data protection that stand in a delicate relationship to each other, whether of tension or of balance.
But new developments of the ‘information society’ challenge the established frameworks. These innovations combine new technological possibilities with new wants or needs in the public and private sectors. They include electronic cash and payments systems, genetic screening, the forensic use of DNA profiles, ‘smart’ cards, the electronic monitoring of workers’ performance, telephone ‘caller ID’, video surveillance for crime control, and road-pricing and traffic-monitoring schemes. The relevance of existing regulations to these applications is often unclear, and they strain the capacity of well-founded data protection principles to give guidance as to what is fair, excessive, purpose-specific, and so on in these innovations.
Likewise, the shift from mainframe computing environments towards de-centralised desktop and portable facilities, networking, and the growth of the Internet point up the difficulty of regulating internal as well as transborder data flows with existing national laws. It remains to be seen how robust the current revision of laws and systems in response to the 1995 European Data Protection Directive will prove in the face of further change. If ‘information superhighways’ hold out the promise of freely-flowing information in the service of democracy, they also pose a threat to the possibility of protecting against misuse of the personal data that flow across the information networks.
Pessimists say that it is already too late to expect much privacy in the information society, or perhaps the `surveillance society’, of today and tomorrow. Current departures bring new and potentially dangerous uses of personal data along with the many benefits of business and administrative applications. On the one hand, there is nothing new in government and industry’s desire for information about citizens or customers in order to fulfil a host of obligations, rights and economic functions in capitalist and non-capitalist societies alike, as well as for social control purposes whether beneficial or harmful. On the other hand, ICTs greatly expand the possibilities of such practices, and make new policies feasible through the increased processing speed, transferability, and volume of data involved. In addition, the decentralised dissemination of personal data, the ease with which they can be assembled, matched or combined to create new information, and then translated into different media for storage and transmission, make accountability, consent and transparency much more difficult to achieve or to enforce.
Pessimism is further nourished by the perception that projects involving the use of information systems attract greater resources of public support, official backing, and private investment than does a concern for privacy. Public sector projects aim at crime control, administrative rationalisation, and improvements in the delivery of services. Understandably, they appeal to electorates that are fearful of the safety of their persons and property, to governments intent on cutting costs, eliminating fraud, and deregulating industry, and to public-sector managers and the public alike who want more efficient and better-tailored services. In the private sector, the ease and convenience of transferring money electronically, making travel bookings, and shopping from home are among the objectives that are seen as good for business and as satisfying customers’ wants.
These applications of ICT are burgeoning as government and industry seek votes or profits from new developments in a highly competitive environment. There is a great deal of money to be made by innovators and suppliers of ICT hardware and software, for much of the promise of the ‘information society’, or ‘economy’ or ‘polity’, is predicated upon the sophisticated exploitation of ‘smart cards’, touch-screen kiosks, computer programmes, the Internet and other devices. Such innovations are accompanied by powerful salesmanship and publicity aimed at gaining public and political acceptance. Where public demand for these innovations is weak, providers aim to inculcate it and to prepare societies for what is portrayed as an inevitable revolution.
One hazard is that a number of ethical issues, including privacy, will be overlooked in this process or argued away. However, there are indications of a more enlightened approach which recognises that safeguards might be ‘good for business’ (or government), even if the conjunction of privacy and other interests may only be marginal. There are signs that awareness of ethical issues is gaining a hold upon the consciousness of computer professionals and others, and on their representative associations, for whom these values have typically been submerged or regarded as lying outside the proper concern of technical specialists.
Other factors besides the increasing reliance of the polity, society and economy upon personal data exert an influence over the outlook for the future. One is the unclarity of the basic concept, ‘privacy’, which is defined in a wide variety of ways, with little agreement amongst philosophers, sociologists or lawyers. This makes it too easy to ask how it is possible to draft laws to protect what cannot be defined, even though a great deal of literature testifies to the psychological, social and political importance of privacy, however defined. Some would argue that the ‘right to privacy’ acts more as an ideal that is enshrined in various prestigious documents than as an enforceable mechanism of control over data practices, although it is important that controls be seen as rights rather than founded upon, say, utilitarian premises.
The sporadic and unconnected emergence of information privacy issues into (and out of) public awareness makes their relatively low political salience a further challenging factor. Increasing the prominence of privacy in public consciousness and on political agendas may yield dividends for the reinforcement of protection, as with ‘green’ or environmental issues. Yet, as the course of green politics indicates, single issues can easily slide down the agenda again. Data protection agencies and their officials may play a useful role here in bringing privacy concerns into greater public and political prominence.
One interesting development along these lines is the increasingly organised concentration of views, and thus the elevation of a collective profile amongst official data protectors in countries of the EU as they seek influence over national and international policies that pose privacy issues, such as the Schengen and Europol Conventions and the implementation of the 1995 Directive. But whether regulators are willing and able to risk clashes with their governments, or with international organisations, and what degree of political leverage they wield, remain important questions. Regulators must remain political animals, but they cannot easily choose the terrain on which they roam.
A final, but highly significant, factor complicating the prospects for data protection is that the interests of individual members of the public, commercial organisations, government agencies, and others form a complex, cross-cutting pattern. These persons and groups cannot simply be construed as either the individual victims or the corporate beneficiaries of the use of personal data; therefore, they cannot each be identified simply as the gainer or the loser from regulation. For example, individuals, while desiring privacy, also have an interest as citizens and consumers in governmental and commercial efficiency, and in the improved services that an intensive use of personal data might bring. Commerce and government, while wanting to make the maximal use of personal data, also have an interest in legitimacy and in maintaining public trust through the protection of individuals’ privacy, and thus may even welcome some regulative restrictions that might facilitate a more lucrative, or a more politically acceptable, exploitation of customers’ or citizens’ data. The matching of reality to rhetoric, however, cannot be assumed. These complexities are likely to persist into the future as firms and states seek to provide more goods and better services through electronic transactions.
The patterning of interests, both between individuals and the state or industry, and within each ‘side’, presents difficult conflicts for resolution, yet it might yield important conjunctions that create possibilities for mutual accommodation. Although this sketch has concentrated mainly on the problems thrown up for data protection by new ICT and other developments, it should also be said that the problems are perhaps more a challenge to the laws covering information privacy than they are to the fuller dimensions of the data protection systems that have developed around laws but which are not wholly determined by them. Data protection is a matter of public policy and societal activity beyond the law, and there may be cause for anti-pessimistic (albeit not optimistic) argument about the future shape of data protection through the exploitation of extra-legal solutions along with the application of law.
Realisation of these possibilities will depend on actions taken by the various participants in the process of privacy protection, and particularly upon how they define the situation for themselves and for others. Here the arts of persuasion, image-making and propaganda enter the field alongside laws and regulatory machinery, thus expanding our conception of the politics and governance of privacy protection beyond legal and technological analysis. Nevertheless, it is necessary to review and to comment upon the variety of formal arrangements that has grown up in recent times in order to understand the range of available instruments.
Although there are a number of tools for privacy protection, at the heart of the law, policy and practice lies a set of ‘fair information’ principles. As Bennett has shown, there has been international convergence on these, although divergence in their implementation. The principles gained impetus through the work of international organisations in the 1970s, in particular, of the Organisation for Economic Co-operation and Development (1981) and the Council of Europe (1981). As enunciated by the latter, the principles require that data should be collected fairly and lawfully, kept for specified and legitimate purposes and not used incompatibly with them. They should be adequate, relevant and not excessive in relation to their purposes; accurate and not outdated; held only as long as necessary, and held securely. Especially sensitive information should be specially safeguarded. Data subjects should be able to have access to their data and to get them corrected if necessary, and to obtain remedies.
Variations in the embodiment of these principles in law are very important in terms of the protection afforded to individuals in different countries, especially in the light of current efforts at international harmonisation through the EU Directive. Particularly significant are the national differences in the chosen mix of policy instruments within legal frameworks. Bennett enumerates five such instruments, or models, pointing out that a country’s approach to data protection may reflect one of these whilst actually combining different instruments. Subsequent writing, e.g. that of Industry Canada and of Briscoe, also points to a variety of privacy-protecting mechanisms and solutions, but devotes relatively little attention to their mutual operations. For the purpose of this article, it is the combinations that are of particular interest, but the typology will first be outlined.
Three related mechanisms (licensing data users, registration of their holdings, and the establishment of a data commissioner) can be taken to constitute a state-centred approach to data protection. Licensing involves an authority established by the state to grant licences permitting the use of personal data on condition that applicants fulfil designated requirements. This may be an excessively bureaucratic method of regulation, and may break down in practice, especially in the emergent proliferated informatics environment.
The second model of state regulation, that of registration as such, requires data users to register their personal information holdings with an agency according to a standard set of characteristics. This procedure has also often proved to be very onerous; the EU Directive retains registration as an instrument, and there are current attempts, for instance in the UK, to simplify the existing registration system further whilst meeting the Directive’s requirements.
Bennett’s data-commissioner model does not depend upon these two devices. It involves a public authority which not only reacts but may take the initiative to advise, supervise and exert pressure upon data users where strong powers may be absent. This role may be combined with the registration model, as in the UK. Commissioners and agencies may develop a host of relationships throughout the political and administrative systems, and in society at large. Flaherty has pointed up the importance of data protection agencies. As will be seen, their position is crucial in any analysis of the networks of data protection that link the state, civil society and the market.
A fourth policy instrument, voluntary control, involves self-regulation by organisations that use personal data. Self-regulation is mainly an adjunct of statutory control in countries that have data protection laws and may be encouraged by regulatory authorities; the Dutch, British and New Zealand systems provide different examples. Private-sector data users may group themselves into sectoral or industry-wide organisations through which they come into contact with both the state and data subjects. Interests may be intermediated and codes of practice negotiated and adopted through such ‘peak’ organisations; the direct-marketing and financial industries are among the most prominent in doing this. Individual companies often also institute codes of practice and rules for their own employees.
Public or state organisations also use this instrument. As data users, government departments and the like must comply with laws and come into the purview of the state’s chosen regulatory machinery, even though they may be accorded greater exemption rights than commercial firms. For example, public sector bodies must register where there are registration systems, although exemptions and less restrictive third-party non-disclosure provisions may be available to them. But they may also adopt a measure of voluntary self-control by formulating codes of practice such as that which, for example, applies to police forces in Britain. Public bodies might in fact be more likely, in practice or in law, to be given wider scope to regulate themselves.
The fifth instrument delineated by Bennett is subject control, or self-help. Individuals pursue their rights in the system of data protection, take the initiative to gain access to their data, complain and exert pressure upon data users, or otherwise actively participate in their own protection. Self-help bears a relationship to one of the common definitions of privacy, in which individuals control the terms on which information about them is communicated to others. Individual-based activities are akin to a ‘market’ solution, in which the level and quality of data protection is achieved through the outcomes of direct interactions between data users and data subjects, or between data subjects and suppliers of technology products. The outcome may often be determined in the courts, but also through (and reflecting) the relative power of users, subjects and suppliers. The granting or withholding of consent to the collection and processing of one’s data plays an important part in data protection systems and laws. The improvement of the consent process in favour of the data subject will remain a crucial part of the privacy agenda in the future, not least in the context of the implementation of the European Directive and of its influence over the practices of data users in third countries.
Data Protection as Co-production
The outline of instruments above highlighted certain defining properties, rather than describing reality in detail. It is worth reinforcing the point that most, although not all, of these instruments are used to one extent or another in every system of data protection, and that they therefore can be seen more as variables than as the criteria defining different types of systems. Putting it the other way round, every system will exhibit its own combination of models or instruments; moreover, these may change over time. Already implied in these mechanisms were relationships between the persons, roles or institutions that take part in data protection, although these of course also vary across systems and even across sectors within a system.
Moreover, the circle of those with a stake in the law and policy of data protection goes beyond state, societal or market actors who directly wield the law and the instruments of implementation. It includes technology designers and providers, civil liberties and privacy pressure groups, media and academic commentators, consumer pressure groups, private consultants, and government policy-makers in all fields concerning programmes in which personal data are used. Some of these often act in a countervailing fashion against intrusive information practices in the economy and in public life. Like the technical professionals and data users, these groups aim to influence policy, and are often involved in the consultation networks of state decision-makers as well as of the independent regulatory agencies.
It is perhaps best, however, to begin by looking at state provision for privacy protection. As we have seen, at the level of state action are laws and authoritative instruments for their enforcement. The latter are manifested in the role of data commissioner, armed with certain powers and duties. Commissioners may only be able to enforce the law rarely and as a matter of last resort. To enforce the law, they must know what is going on amongst those they are meant to control. This is becoming more difficult as information systems become ubiquitous, opaque, diverse and dispersed, and the ‘intelligence capability’ of regulators may be severely tested. If a register of data users is mandated by law, it may tell commissioners something about these systems, but it must first be created. The register itself may be a negotiated product, devised through the interactions of regulators with data users and other interested policy actors; its current modification in the United Kingdom, for example, is the result of an explicit consultation exercise.
Even before they register, data users must realise that the law applies to them; it might not be sensible for commissioners to leave this up to the users themselves, and they may have to take the first steps. In some cases this will involve discussions and negotiations, for the registration requirements, definitions and interpretations may be vague, and the commissioner may have considerable discretion in the way different classes of data user are dealt with. Both at this stage and at later ones, commissioners may find themselves as much encouraging, enabling and exhorting data users to comply as enforcing. Moreover, the implementation network may well include private-sector management consultants and accountancy firms, which teach their clients about data protection so that they may comply with legal requirements.
Much of the work of commissioners is comprehensible as an activity of steering in which a combination of penalties and rewards is used to move data users towards compliance. In particular, however, their interaction with data users may be characterised by a search for common understandings of some of the key elements of data protection, such as the meaning of ‘fair and lawful’ data processing, the way to provide meaningful consent for data subjects, and the permissible extent of data-matching.
Alternatively, these issues may be put to the test in the courts. However, much of the success of regulatory officials and their organisations may be deemed to depend as much on their ability to foster a climate of compliance amongst data users as on their formal sanctioning activities, even if upheld in litigation. This may be particularly important in the public sector, where the temptation to ignore privacy protection may be great if there are over-riding public purposes that can be invoked, albeit sometimes unconvincingly, as justification, and where formal sanctions, such as litigation or proscription of activities, may be inapplicable. But it is also important in the private sector, where the official regulator’s knowledge about the data-processing activities of firms and organisations may be very deficient owing to a lack of inspection and monitoring capability or formal powers.
In both sectors, therefore, there are severe limits to the extent to which implementation of data protection can be a matter of the ‘top-down’ exercise of authority; rather, the success of the regulator, and thus the extent of data protection, may depend upon good behaviour, based on understanding, by the regulated within the legal framework. It can be argued that the increasing fluidity of information practices will continue to put a premium upon this knowledge-based understanding in the future, requiring continuous revision in the light of rapidly changing circumstances. This mode of relationship between the regulatory body and data users can be seen in another way as an emphasis upon education, which extends to individual data subjects as well. Commissioners, such as Britain’s Data Protection Registrar or the German Federal Bundesbeauftragte, have been involved in educative activities not only with those they regulate, but with those whose rights they protect. Thus data protection can be viewed in terms of mutual learning, a critically important process in view of the inherent complexities and dynamism of information systems.
One particular aspect is the regulator’s role in advising on or negotiating codes of practice with data users. This can be seen in one sense as a recognition of the limits of legalistic state regulation as such, even though the state may play a decisive part in promoting societal self-regulation. Also, whilst much might be left up to the individual data subject in the ‘market’ mode of privacy protection, individuals’ ability to engage in subject-access request transactions, with an awareness of their rights and of data users’ obligations, and with the likelihood of obtaining redress of grievances, may rely heavily upon state action. This takes the form of a relationship between commissioners and data subjects in general as well as in specific cases, in which commissioners enter the market on behalf of a particular individual in a specific case, beyond their general educative and promotional activities. In contrast, in systems that lack data protection agencies, individuals may be left to their own devices in seeking legal remedies, although they may be able to seek the assistance of citizens’ or consumers’ action groups, where they exist, in pursuing their claims. A further involvement of the law and of regulators in the privacy ‘market’ concerns the use of ‘privacy-enhancing technologies’ (PETs), which are discussed below.
Thus, the outline so far has sketched some of the involvements of regulatory institutions with other interests located elsewhere in the system of data protection. Regulators also interact with other state actors, and especially with political and governmental officials who set the overall policies for data protection and for those administrative, economic and technological activities which have implications for information privacy. Data protectors have often seen that their influence within these policy spheres is an essential determinant of their ability to exercise their more specific regulatory functions, and have accordingly sought to insert themselves more effectively into the policy process. There are many policy domains to which regulators must pay attention, but the most significant are likely to be in areas where technological means are being applied with increasing impetus to solving social, economic and administrative problems using personal data, or to promoting industrial and commercial activities. This includes the fields of law and order, the welfare state, telecommunications and high-tech industries. Policy influence may depend not only on the formation of more favourable routines in the policy-making process to enable such participation (e.g. by getting into the decision-stream early enough), but also on the personality of, and approach taken by, commissioners themselves.
Another factor affecting the degree of influence wielded by regulators is the availability of public issues on which to become active, providing opportunities to exercise creativity in pressing regulatory and privacy concerns upon political and public arenas that might otherwise neglect them. A case in point is the entry on to the British political agenda of identity cards in 1995, giving a chance for the Registrar to conduct her own public consultation alongside that of the Home Office, the government department responsible for this policy area, and thus raising the profile of privacy in public discourse. Media attention, as well as the airing of public debate between apprehensive civil libertarians and proponents of identity cards, is important in such issues, and illustrates the participation of wider groups in the formulation of policy.
Turning to some of the most weighty institutions of civil society, here one must consider the prominent part played by data users, their organisations and ‘peak’ associations (representative trade bodies and the like) in the implementation process. Of course, these interests may be adversarial to those of commissioners and individual data subjects. They will seek to influence policy in the direction of minimising the effects of the implementation of data protection on their personal-information activities, and perhaps to turn data protection to their advantage where they can, for compliance and a reputation for good practice may confer useful legitimacy on a data user, as mentioned earlier. This adherence may be particularly reinforced where competition exists (for example, among banks) and if the secure holding of customers’ personal data becomes an important element in advertising and in the presentation of a respectable corporate image.
However, political and economic circumstances may provide leverage for data users, or for technology suppliers, to shape public policy and laws in ways that have disturbing implications for the prospects of privacy protection. Competitive advantage may be gained through the exploitation of new applications that use personal data in banking, retailing, and other industries, even while the ‘security’ image is maintained publicly. Governments recognise the economic potential of, for example, smart cards, thus facilitating alliances between themselves and industrial interests that may be strengthened by the perception that the same technology opens up new possibilities for the public sector itself as a major user of such devices for the delivery of services and for the control of fraud. The place allotted to privacy protection within the policy framework developed for these applications will reflect the relative power positions of privacy-oriented and other interests. On the rhetorical plane, the issue is often constructed in the misleading terms of hard-headed business sense and genuine public service versus the paranoia of the privacy lobby.
Here the wider participation of pressure groups, associations, official regulators and the mass or specialist media plays a part in the positioning of data protection within the technological specifications, or further ‘downstream’ at the application end. Amongst the other societal organisations that take part in the formation of data protection are those that are more clearly privacy-oriented than are the data users and promoters of new technology. The influence that activists have with states and governments may be limited by mutual distrust, and by perceptions that governments may be ‘soft’ on privacy in the context of the powerful arguments for the ‘information revolution’ that were mentioned earlier, although privacy advocates may gain greater respectability if they are seen to be knowledgeable as well as useful to regulators.
Consumer groups may find it difficult to adopt a clear position, advocating both consumer convenience and consumer protection, but they often enjoy a somewhat privileged position in governmental circles where ‘consumer-orientation’ has become a watch-word in the public sector, and where political advantage can be found in heeding their voice. Associations of equipment producers, as well as bodies of information technology professionals such as the British Computer Society and the American-based Association for Computing Machinery and Computer Professionals for Social Responsibility, are involved in the design and use of technologies that have implications for information privacy, and have taken ethical issues on board in their conception of professionalism by promulgating codes of ethics and of practice, with an eye to self-regulation. Privacy commissioners have to tap into this technical expertise where they do not possess it within their own organisations.
In arguing that data protection involves mutual dependencies, one should not construe data users or technologists blandly as consensual helpers who share common goals with privacy regulators. If data protection has much to do with the resolution of information conflicts, then neither ideological alignment, warm feelings, nor sweet reason should necessarily be expected in the exchanges between mutually-dependent actors such as official regulators and data users. Nevertheless, practical working relationships develop in networks that include data users, governments and privacy regulators. Singly and collectively, data users interact frequently with regulatory officials and take part in frequent consultations over specific or general issues. In particular, industry-wide associations are instrumental in developing codes of practice and conduct, which have been important adjuncts of data protection in a number of countries, perhaps especially in the Netherlands. In that country, codes have a major and quasi-official role in the system of data protection, and the commissioner’s office (Registratiekamer) plays a central part in their adoption and promulgation.
Codes of practice are attracting increasing attention. They imply adherence to data protection principles and standards and give a public affirmation that these will guide the practices of data users. They are favourably regarded in the deregulatory climate of the 1990s as a way of ensuring privacy protection, especially insofar as there are, as we have seen, limits to the efficacy of law itself or the activities of official regulatory bodies. Moreover, the ‘information superhighway’ environment, which is difficult to regulate by law, is likely to have to rely upon self-regulation and the adoption of common standards. In future years, we may expect to see a greater emphasis on self-regulation, involving codes and standards, perhaps especially where the legal basis for general data protection is somewhat insecure.
For different reasons, codes are of value both to privacy regulators and to industry or to those parts of the public sector that adopt them, for it should again be borne in mind that codes of practice may exist within the state amongst its data users, as for example the police, local government and health services. For regulators, codes relieve some of the pressure of enforcement from themselves and transfer it to firms and the like. For data users, codes keep the state and its laws at arm’s length, and may have additional public-relations or reputation-building value that paves the way for more intensive uses of personal information. Codes may also assist in the educative function, because data users need to learn more about data protection in relation to their processing activities if they are to adhere in practice to their industry’s code, especially where auditing is used. It may well be that the relationship between state regulators and data users develops along neo-corporatist lines. For this to be so, the representative bodies that produce codes should be able to enjoin compliance upon their member-organisations, in return for the privileged influence accorded to these ‘peak’ associations in regulatory policy-making and as the price of the arm’s-length relationship, while commissioners should have some way of ensuring intra-sectoral adherence as well. Such compliance work is therefore also a matter for action within and amongst social and economic institutions, as well as involving state-level official regulators.
The nature of these processes within self-regulating sectors, in which the prospects for privacy protection are determined, is therefore likely to be of increasing concern to regulators, to the extent that self-regulation is seen as more effective or easier to implement than legal sanctions in the future described earlier. It is also of concern because the determination of ‘adequacy’ of data protection in third countries, as required by the European Directive, may in part depend upon such intelligence concerning the degree to which professional codes are “complied with” (Article 25.2). Doubts surround whether this determination can be put into practice, although this problem is under investigation within the EU. There is a potential danger to standards of privacy protection in over-reliance upon self-regulation, especially if the less-regulated environment of the future fails to include a sufficient infrastructure of checks, audits, and avenues for complaints and redress. The Canadian development of a model code incorporating a recognised privacy standard is a promising development along these lines, especially if it is widely adopted and also referenced in government legislation, thus combining state action with self-regulation.
In addition, considering the ‘market’ zone of privacy protection, codes send signals to data subjects undertaking transactions, for codes imply a guarantee of good performance given to those who, whether voluntarily or under legal obligation, furnish information about themselves, or whose details are gathered without their knowledge or consent. The relationship between data users and data subjects in the commercial market or in the metaphorical one that involves exchanges between individuals and government may thus crucially depend upon the climate of trust that can be fostered through codes. As has long been known through public-opinion surveys, trust is a key element in the public perception of privacy risks.
Trust affects the readiness of data subjects to give consent to the collection and use of their data. The question of consent has always been one of the thorniest problems in data protection, and the extent and manner in which it is made available to individuals was at issue in the formation of the European Directive. The fluidity of personal data in the far-flung environment of electronic commerce on the Internet, as well as in more prosaic settings such as modern health-care systems that require the passage of data across organisations and functions outside the clinical setting, will keep the problem of gaining consent high among the concerns of official privacy protectors. For the same reason, the transparency of data processing will also remain a goal that is difficult to achieve. As with consent, the development of commercial and administrative processes involving personal data complicates the determination of who the data user is, even though laws seek to clarify this. Codes may come to play an essential part in these dimensions of data users’ accountability, and in influencing the level of public confidence.
The consent principle and the problem of transparency point up the role of individuals and markets in shaping information systems, but also in influencing or achieving safeguards to privacy through the normative assumptions underlying exchanges of data between themselves and data users. From the standpoint of individuals, data protection depends in part upon the degree to which they can control ‘their’ information, as implied in Westin’s definition or in the German concept of ‘informational self-determination’. This suggests that it is individuals who instigate market relations over information, although there are strict bounds to this insofar as welfare benefits, driving licences, health services, taxation, and virtually everything else that comprises modern citizenship make such control dubious. The issue here is how far information-gathering or surveillance (dataveillance) should be permitted, and with what safeguards to privacy. Individuals’ power position in relation to the state and to societal institutions, and therefore their market position, is at stake in the outcome of policy debates and practices surrounding this issue.
The position of individuals may be improved through the activities of state or civil society actors, as the above discussion of commissioners indicated. For their part, individuals contribute to data protection beyond the satisfaction of their own specific requests or complaints, for in pursuing these they enable commissioners to monitor practices amongst data users, and to monitor the success of the commissioners’ own activities. Commissioners may only become alerted to possible malfeasance or bad information practices through learning about these from subjects’ unhappy market experiences, analogously to the way in which police often maintain that the general public is the most useful source of information about crime. This enables commissioners to investigate and understand organisations’ data practices for more general purposes of policy and implementation, sometimes regarding whole sectors. Monitoring the effect of their own activities upon the level of public complaints also helps regulators to develop and modify their strategies and the deployment of resources for carrying them out.
The efficacy of self-help by itself may be questioned. Where the quality of data protection most likely plays a relatively small part in individuals’ decisions to purchase goods or services, as for example in shopping or banking, the ‘market’ for data protection operates less in favour of privacy than where the passage of information is more obvious. The latter cases would include mail-order purchases or electronic commerce on the Internet, in which customers may be much more concerned about what happens to their personal details and where firms may think their business interests are better served by giving assurances of confidentiality and security.
We are far from a situation where customers boycott firms because of lax data protection, or where trade unions take industrial action against employers who engage in excessive workplace surveillance of employees, and these scenarios are unlikely to occur. Nevertheless, consumer groups and trade unions can, and do, support the individual vis-à-vis privacy invasions from these sources, whether in particular cases or in terms of pressing more generally for improved practices by particular data users. These organisations may be expected to play an increasing part in raising public and political awareness of the implications of new technologies and practices for privacy in the mundane but crucial spheres of consumption, services and employment. Higher-profile roles will be available for the media and the privacy pressure groups, who pay close attention to developments in the private and public sectors which may threaten privacy, and who take part in policy-related consultations as champions of the individual.
Thus self-help is assisted by these societal organs, as it may be by the state’s own instruments, including the law and the regulatory activity of commissioners in helping people to learn their rights and to make use of them. Intensified use of ‘dataveillance’ in the future will require further strengthening of these supportive mechanisms for self-help. Whether this will take place depends not only upon more subject-friendly legislation (for example, in regard to the principle of consent and the means by which individuals can exercise it) but upon the growth in the salience of privacy as a public issue and upon the governmental and societal resources that are made available for its protection. In some countries, such as the United States and Canada, this will require primary legislation to cover the private sector; in others, it will require legislation securing rights in both sectors. Everywhere, it will require a better understanding of the way in which personal information flows across the boundaries of these sectors, and across national borders. However, nowhere will legislation by itself suffice.
Redress of grievances or self-provision places the onus upon individuals as they carry on exchange relations with others. Keen interest is being shown in privacy-enhancing technologies (PETs) as new tools for data protection that function in the direct relationship between individuals and data users; for example, in electronic cash systems. Sophisticated forms of encryption, greater control by individuals over the contents and use of smart cards, and ‘trusted third parties’ are among the ideas and innovations that place data protection nearer to the point of use in ‘market’ transactions. These technological solutions may go a long way not only towards safeguarding anonymity and other dimensions of privacy, but also in underpinning what many see as an imminent revolution in commercial and governmental transactions using personal data.
On the other hand, it may be too facile to pass the solution to privacy issues over to the domain of technology without also considering the framework of institutions as well as the dynamics of economic and social relationships in which PETs may play a role. This is not only because the efficacy of particular PETs needs substantiation, but because they might too easily become simply legitimising tokens of ‘security’, or even of anonymity, if the way they are used is not incorporated into knowledge and action within the worlds of enforcement and self-regulation. As shown by 6 and Briscoe, legal and regulatory intervention could become a necessary strategy for privacy protection in the future proliferation of PETs. They argue that governments should take powers to require, or provide incentives for, a greater use of PETs in the design of ‘smart card’ systems and the like. This would seem to betoken a closer relationship between state policy-makers and implementers on the one hand, and technology users, suppliers and designers on the other hand, in one area of the ‘policy space’.
This may be as far as the investigation needs to be taken for the present in order to illustrate the mutual implication of various strategies and the interrelationship of different actors in the field of data protection and its implementation. These interactions make the analytical construction of the field more complicated, but more interesting in that they lend support to a view of data protection as an area of regulatory policy that cannot be seen in terms of only one vantage point, that of the ‘top’, which is often privileged in law- or administration-based studies, or of one time-point, which is again a consequence of certain formal approaches.
Data protection is statutory public policy, but the route to its achievement runs through inter-organisational relations and implementation networks, ranging between consensus and conflict, and located in a complex ‘policy space’ in which the ultimate provision of data protection is arbitrated. Moreover, the policy space is not confined to nation-states but extends globally through explicit structures (including, for example, the European Commission, multi-national firms such as American Express, and pressure groups such as Privacy International) as well as more ephemerally through the policy-related privacy discourse that is found in plenitude on the Internet.
If the field of data protection can be conceived as a repertory of means towards the end of safeguarding privacy, it will be increasingly important to comprehend, and perhaps shape, the connections among the various mechanisms or strategies and among those who deploy them. Two broad positions can be taken. One is that the various market, civil society and state forces make separate, complementary and sometimes alternative contributions towards protecting privacy; rather like some versions of ‘pluralism’ in the study of politics, the outcome (data protection) is the aggregated result of these separate inputs. The other position, however, is that these different inputs are mutually dependent, with each a potential facilitator or inhibitor of the other.
It is the second, more complex construct that has been adopted in this article, but it requires elaboration beyond the present discussion because it points up the extent to which the efficacy of data protection may increasingly require a coherent strategy to manage the mutual dependencies so that they move in the same direction, rather than the separate solutions being left to coexist as best they may. This elaboration has been given only in brief terms in this article, in which the approach taken was to consider schematically how participants at each ‘level’ of solutions related to the others in co-producing data protection. It ideally requires systematic empirical and comparative investigation across systems, contributing to the further emergence of privacy and data protection as a coherent field of public policy in which the law plays a central part, but does not wholly determine the scope of action.