
Beyond the OECD Guidelines: Privacy Protection for the 21st Century

Roger Clarke

Principal, Xamax Consultancy Pty Ltd, Canberra

Visiting Fellow, Department of Computer Science, Australian National University

First Draft of 4 January 2000

© Xamax Consultancy Pty Ltd, 1997, 1998, 1999, 2000

This document is at http://www.rogerclarke.com/DV/PP21C.html


Abstract

The framework within which privacy protection is discussed has been stalled since the early 1970s. During the intervening three decades, information technology developments and convergence have produced many quite dramatic incursions into the private space of people, with serious impacts on the individual, communities, society, and the body politic.

As the U.S. becomes increasingly isolated, and the Administration's resistance to comprehensive protections begins to crumble, the scene may be set for a very substantial, and desperately urgent, set of enhancements to the framework. This paper catalogues the deficiencies that were inherent in the 'fair information practices' tradition that the OECD's 1980 Guidelines codified, together with the additional problems that have arisen since their formulation. The result is a comprehensive agenda for 21st century privacy protection.



1. Introduction

Information privacy has never been more keenly discussed than it is at the turn of the millennium. That it has been under assault from privacy-invasive technologies is unquestioned; but what to do about that assault is a topic of much debate. A few have actively argued for the abandonment of privacy as a human value (e.g. Sun's CEO, Scott McNealy, reported in Sprenger 1999 and Whitaker 1999). With that comes the abandonment of humanity, and the application to our selves of every technology that we have contrived for the purposes of manipulating our artefacts. Brin (1998) argued for the abandonment of secrecy as the means of achieving privacy, and the use instead of ubiquitous openness.

For the vast majority of people, however, privacy is a critical human value that must be defended. There are several approaches that can be taken, including protection entirely through technology, protection entirely through law, or protection through a combination of law and technology. This author adopts the last of these approaches (see, for example, Clarke 1988, 1997, 1999a, 1999b).

This paper, however, focuses entirely on legal protections. This is a mainstream topic in most countries in the world. In the U.S.A., on the other hand, proposals for regulatory intervention draw the ire of libertarians. They overlook the fact that their country has far more privacy-relevant statutes than the many countries in Europe and elsewhere that have a single comprehensive Act (see Smith 1997, Rotenberg 1999). The reason is that the U.S. has a tendency to kneejerk reactions to issues, which results in inconsistent, overlapping, situation-specific legislation.

Almost all discussions about privacy laws take as a given that the reference point for regulatory arrangements is the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (OECD 1980). This paper re-visits those Guidelines. It identifies their origins and philosophical basis. It then considers the weaknesses that they contained when they were promulgated, and the damage that has been wrought by technological developments since that time. It concludes that they fall so far short of the needs of advanced societies as to be as harmful as they are helpful. The paper embodies proposals for a substantial set of enhancements.

Early and decisive action is needed in relation to the deficiencies catalogued in this paper. The assumption is made throughout, however, that much of it could, at least in principle, be achieved through wholesale revision and extension of the OECD Guidelines. Whether it could be achieved in practice, and achieved quickly enough, depends on an assessment of the dominance within that organisation of economic over cultural priorities, and on the slowness of multilateral negotiations in that forum. Given the attempt made in such documents as OECD (1998) to claim that the Guidelines need no change, there has to be some pessimism about the matter. In the interests of space, however, the politics of the matter are not addressed in this paper.

The scope of the paper is broad, and the treatment of each point must of necessity be very brief and compressed. Similarly, the introductory material is constrained to that necessary to provide an appropriate foundation on which the various arguments can be built. The references include multiple, prior papers by the author, many of which provide further references to the wider literature.


2. Background

This section provides background to the assessment that follows, commencing with brief reviews of the origins of the privacy notion in general, and the information privacy concept in particular. This is followed by outlines of the dominant framework for information privacy protection, and the tension between privacy as a human right and as an economic right.


2.1 Privacy

Privacy was not a significant feature of tribal or village life. It became a more significant human value during the Renaissance and the industrial revolution. By the end of the nineteenth century, learned justices in the United States discussed it, arguing that "the right to be let alone ... secures the exercise of extensive civil privileges" (Warren & Brandeis 1890). Historical and philosophical discussions of privacy are in Flaherty (1972), Seipp (1981), Schoeman (1984), Smith (1997), Burkert (1999) and Bennett & Grant (1999).

Privacy is often discussed as though it were a moral or constitutional right. Sub-section 2.5 considers that question. The body of the paper, however, adopts the approach originally espoused by Morison (1973) that it is much more convenient for the purposes of analysis to approach it as an interest, rather than as a right:

Privacy is the interest that individuals have in sustaining a 'personal space', free from interference by other people and organisations.

It is an interest that has several dimensions:

* privacy of the person, sometimes referred to as 'bodily privacy';
* privacy of personal behaviour;
* privacy of personal communications; and
* privacy of personal data.

With the close coupling that has occurred between computing and communications, particularly since the 1980s, the last two aspects have become closely linked, and are commonly referred to as 'information privacy':

Information Privacy is the interest an individual has in controlling, or at least significantly influencing, the handling of data about themselves.

An important implication of the definition of privacy as an interest is that it has to be balanced against many other, often competing, interests. At the level of an individual, it may be necessary to sacrifice some privacy, in order to satisfy another interest. The privacy interest of one person may conflict with that of another person, or of a group of people, or of an organisation, or of society as a whole. Hence:

Privacy Protection is a process of finding appropriate balances between privacy and multiple competing interests.

Privacy must on occasions be compromised in order to sustain other important interests such as law and order, and reasonably fair distribution of social benefits and costs. For psychological, social and political reasons, however, it is essential that privacy be highly valued, and not subjugated to other social considerations, or to the demands of economic efficiency.


2.2 Information Privacy

The interest in controlling information about oneself reflects concerns about the exercise of power by others. It is a concern that has been heightened during the twentieth century, as the scale of social institutions grew, distance increased between individuals and the organisations that they dealt with, rational management took hold, organisational decision-making ceased to be based on personal judgement and trust and came to be based almost entirely on data, and technologies were developed and applied to achieve those ends.

As ever, artists sensed the forthcoming change long before it arrived, most notably Zamyatin (1922), but also Orwell (1948) and Russell (1949). These initial warnings were stimulated by the spectre of authoritarian governments (variously fascist and communist) harnessing technology to their anti-democratic aims. From about 1950 onwards, a gradual shift is discernible towards technology as itself a determinant of the directions of change. Early expressions of concern in non-fiction literatures included Packard (1957, 1964), Long (1967), Stone (1968), Miller (1969), Rosenberg (1969), Thompson (1970), Warner & Stone (1970), Miller (1972), Rule et al. (1974), Wessell (1974) and Weizenbaum (1976). Subsequent examinations of concerns arising from computer technology are to be found in Burnham (1983) and Laudon (1986). A more generalised expression of deep concern about the nature of the surveillance society is Foucault (1975).

The most influential work of a policy nature was undertaken in the U.S.A. (Westin 1967, 1971, Westin & Baker 1974). See also Hondius (1975). Flaherty (1984) provides a bibliography of privacy works. Burkert (1999) reports that a 1970 edition of Westin's 1967 publication was available in German translation in the same year, courtesy of IBM, and significantly influenced developments in Germany. This reflects the close association that has always existed between Westin's work and the needs of business and government.


2.3 Information Privacy Laws

The first laws that expressly protected information privacy were passed in Europe in the early 1970s. The West German Land of Hesse passed its Datenschutzgesetz (Data Protection Act) in 1970, and that term quickly came to be used in virtually all discussions. Sweden's Data Act of 1973 was the first such legislation at national level. A succession of Continental countries followed, including Germany in 1977 and France in 1978. The United Kingdom saw private members' Bills as early as 1961 and 1967, and Government-appointed Committees produced thoughtful reports (Younger 1972, Lindop 1978); but no government showed any interest in sponsoring legislation.

In a short time, concern arose that inconsistencies among the regulatory regimes in European countries might become a restraint on trade. To address this risk, common requirements were codified by the Council of Europe (CE 1981). Although not binding, this instrument progressively encouraged laggard European nations to act. The U.K. legislation of 1984 was expressly a response to the Council of Europe instrument, designed to prevent British corporations whose business involved passing personal data across national boundaries from being disadvantaged. Portugal, Spain and Belgium were very late movers, in the early 1990s. In 1990, the United Nations adopted Guidelines Concerning Computerized Personal Data Files (UN 1990).

The European Union spent the first half of the 1990s debating the form of a mechanism for ensuring uniform application of 'fair information practices'. It eventually established requirements (EU 1995) that came into effect in October 1998. This led to legislative amendments in many countries, and the first legislation in Italy.

In the U.S.A., early discussions, and in particular HEW (1973), resulted in the enactment of a regulatory regime for federal government agencies in the form of the U.S. Privacy Act of 1974. The Ford Administration threatened to veto that statute, and forced the proposed Privacy Commission to be reduced to a short-term Study Commission (Laudon 1986, p.6). That Commission's Report (PPSC 1977) implicitly accepted the need to make surveillance as publicly acceptable as possible, consistent with its expansion and efficiency (Rule 1980, pp.75, 110). Because of the loose phrasing of the statute, agencies had little difficulty subverting the Privacy Act. Some States of the United States of America also enacted laws around this time.

The EU Directive (passed in 1995, with effect from October 1998) created a real threat of disadvantage for U.S. businesses should the Congress fail to pass substantial privacy protection laws regulating the private sector. The U.S. Administration's attempts to rely on so-called 'self-regulation', code-named 'safe harbor', appear unlikely to satisfy the E.U.'s requirements that "Member States shall provide for the right of every person to a judicial remedy for any breach of the rights ..." (EU 1995, Article 22), and that "the transfer to a third country of personal data... may take place only if ... the third country in question ensures an adequate level of protection, [including] the rules of law both general and sectoral, in force in the third country in question" (EU 1995, Article 25.1).

The association of the economically advanced nations, the Organisation for Economic Cooperation and Development (OECD), was similarly concerned, and also issued a codification in 1980. That document is the central pillar on which discussion has subsequently proceeded. An outline is provided in section 3, below.

Outside Europe and North America, developments were slower. In Australia, for example, the call to arms was issued in the Boyer Lecture Series of 1969, by a subsequent Governor-General, Zelman Cowen (Cowen 1969). A report was called for by the Commonwealth and State Attorneys-General (Morison 1973), but only one of the nation's nine jurisdictions acted upon it. A law reform commission study dragged on over a 6-year period, and was published so late that the momentum was lost (ALRC 1983). Australia acceded to the OECD Guidelines in 1984, but legislation was passed only in 1988, and even then only as a byproduct of a failed attempt to instigate a national identification scheme.

For details of the history of privacy legislation, see Flaherty (1979, 1984, 1989), Bennett (1992, esp. pp.45-94) and Burkert (1999). Sources of privacy laws are catalogued in Flaherty (1989), Bennett (1992), Madsen (1992), Smith (1997), EPIC (1999) and Rotenberg (1999), and a catalogue of online sources is at Clarke (2000).


2.4 The 'Fair Information Practices' Paradigm

Although there are variations among the many pieces of national legislation, there is a great deal of commonality. This derives from their common origins, and it has been sustained by the coordinative pressures brought by the Council of Europe, the OECD and the European Union. This section describes this dominant paradigm.

Conventional information privacy protections are best described as 'fair information practices' (FIP). The origins of FIP lie in the foundation work of the Columbia University political scientist Alan Westin (Westin 1967, 1971; Westin & Baker 1974). In those early years of personal data systems, few serious studies were undertaken, and the Westin school of thought quickly dominated. For reviews of the origins of FIP laws and guidelines, see Smith (1974-), Flaherty (1986), Bennett (1992, pp.95-152), and Madsen (1992).

The Westin thesis was essentially that the invisible economic hand of business and government activity would ensure that IT did not result in excessive privacy invasion. Hence privacy regulation was unnecessary. The second implicit tenet was that, to the extent that regulation was imposed, it was essential that the detrimental effects on business and government be minimised. This is because, in Westin's view, the concerns of individuals and society are secondary to the need for the efficient operations of business and government not to be hindered. Even though this claim is highly contentious, it was very appealing to legislators, to their sponsors in business and to their advisers. This has been the driving force behind virtually all of the Bills that have been passed into law.

Organisations have perceived their interests to dictate the collection, maintenance and dissemination of ever more data, ever more 'finely grained'. This 'information-intensity' phenomenon has arisen from the increasing scale of human organisations, making them more remote from their clients, and more dependent on abstract, stored data rather than personal knowledge. Other factors have been an increasing level of education among organisations' employees, the concomitant trend toward 'scientific management' and 'rational decision-models', based on detailed criteria and a significant amount of data, and, particularly since the middle of the century, the brisk development in IT.

FIP legislation has facilitated these trends, in return for quite limited constraints on organisational behaviour in relation to personal data. FIP-based privacy regimes have been described as an 'official response' which legitimated dataveillance measures in return for some limited procedural protections (Rule 1974, Rule et al. 1980).

Rather than supporting individual freedoms, however, administrative efficiency has been shown generally to conflict with them. Modern practices have little regard for what have been called 'atypical, idiosyncratic or extenuating circumstances' (Marx & Reichman 1984, p.436). Achieving a full understanding of the circumstances generally requires not only additional data which would have seemed too trivial and/or too expensive to collect, but also the application of common sense, and contemporary received wisdom, public opinion and morality.

Another feature of the FIP approach is its focus on 'data protection': the principles are oriented towards the protection of data about people, rather than the protection of people themselves. This is justified on the pragmatic grounds that it is an operational concept more easily coped with by business and government agencies than the abstract notion of privacy, and it is therefore easier to produce results.

This paper argues that the intervening quarter-century has demonstrated quite comprehensively that FIP-based privacy protection laws have not delivered what humans actually need. Rather than subordinating information privacy concerns to matters of administrative efficiency, privacy-protective regimes need to assert the primacy of social over economic interests.


2.5 The Form of Information Privacy Rights

This sub-section addresses the question of the nature of the privacy interest. Two distinctly different approaches are identified. The first reflects recognition of privacy as a human right. The other adopts the rationalist economist's perspective, and assumes that the social sphere is subordinate to mankind's epic quest for the efficient allocation of scarce resources.

(1) Privacy as a Human Right

The dominant approach to privacy is to perceive it as a fundamental human right. It is expressly recognised in the key international instruments, the Universal Declaration of Human Rights (UDHR 1948) at Article 12, and the International Covenant on Civil and Political Rights (ICCPR 1976), at Article 17, which use the same form of words:

No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence ... Everyone has the right to the protection of the law against such interference or attacks.

The EU Directive places itself squarely in this tradition in declaring at the outset that its objective is to "protect the fundamental rights and freedoms of natural persons, and in particular their right to privacy with respect to the processing of personal data" (EU 1995, Article 1.1).

In 1983, the German Constitutional Court read a right of 'informational self-determination' into that country's Constitution (Burkert 1999). The U.S. Supreme Court also recognised the existence of a constitutional right to privacy in 1965, but in 1976 decided that it was limited to "matters relating to marriage, procreation, contraception, family relationships, and child rearing and education" and perhaps personal appearance (Smith 1997, p.v).

(2) Privacy as an Economic Right

The debates in the U.S. have, however, followed a different course from those in Europe. Burkert (1999) describes the U.S. approach as being "economic-technological", whereas that of Europe is "social values-oriented". This is borne out by the focal point of information infrastructure policy on the two sides of the Atlantic: whereas European countries publish works on the 'information society', the U.S. literature is dominated by the term 'information economy'. Australian Government policy since 1993 has also reflected this dominance of corporate over community interests.

There has therefore been some degree of drift towards a perception of privacy as an economic right. This is consistent with business interests, because it enables administrative efficiency to be valued very highly when balanced against privacy, and hence to force more substantial compromise than would be feasible if privacy were regarded as a human right.

An attempt to reconcile the two approaches was made in Cavoukian (1999b), but this effectively just adopts the U.S. position, by downplaying the human rights aspects.

There are several variants of the economic rights school of thought, which grant more or less freedom to the individual, and correspondingly less or more power to the organisations that handle data about them.

* A Right of Individuals to Preclude Use of Personal Data

The most advantageous approach from the perspective of business and government would be the granting to individuals of a limited right to deny a specific organisation the right to use specific data about them. This is sometimes referred to as 'opt-out'. Such a right would of course be circumscribed by express legal authority for specific organisations to use specific personal data.

In effect, all uses of personal data would be treated as being legal unless and until something made them illegal. Business interests expect that public apathy would result in relatively few consumers opting out, and that these would in any case be trouble-makers with whom they would be quite pleased to lose contact.

* A Right of Organisations to Use Personal Data

A stronger form of protection, and hence a greater constraint on organisations, would be the preclusion of the use of personal data in the absence of an express right to do so. Such a right could arise under law, or under consent (express or implied). It would of course be circumscribed by express legal authority for specific organisations to use specific personal data. This approach is sometimes referred to as 'opt-in'.

* A Right of Persons to Charge for the Use of Personal Data

The use of personal data might be subject to payment by the user to the data subject. This would enable the use of data to be determined in a marketplace, through contractual arrangements in respect of each relationship between a person and an organisation. Organisations would only be likely to enter into negotiations for such a contractual right where they foresaw financial or other advantages to themselves in doing so; and this would represent a considerable protection for information privacy.

There are precedents for such approaches. Some U.S. supermarket chains offer a discount in return for access to personal data; and the 'Fly-Buys' scheme in Australia is widely recognised by consumers as involving a sacrifice of privacy in return for the possibility of a reward.

Such a right would be likely to be circumscribed by express legal authority for specific organisations to use specific personal data.

* Intellectual Property Rights in Personal Data

The most substantial form of the 'economic rights' approach to privacy is an intellectual property right in personal data. Proposals surface from time to time, particularly in North America, for the creation of such a right (e.g. Laudon 1993). In most discussions, it is implicitly assumed that property would be vested in the person to whom the data relates, and that that person could grant organisations licences to copy, store, use and/or disclose the data.

Some qualifications would be needed, such as an implied licence for each organisation with whom an individual conducts a transaction, enabling it to retain personal data on its records, and to use it in ways that are consistent with the relationship and/or privacy laws. It is not clear whether such a right could be exercised against the State.

There are, however, serious problems with an economic right in personal data based on intellectual property law. Not least among them is the difficulty that, in copyright law, the dictum is that rights relate to expressions, not ideas; and hence data itself is not subject to copyright. As a result, an extension to copyright law could create rights in representations of personal data, but not rights in the personal data itself.

The tendency towards viewing privacy as an economic rather than a human right is in effect a diversionary tactic consistent with the 'fair information practices' school of thought, and aligned with business interests. This author asserts the dominance of social over economic interests. The remainder of this paper assesses the primary co-ordinative instrument, the OECD Guidelines, against the expectations of the public.


3. The OECD's 1980 Guidelines

The primary reference point for international discussions about privacy protection is the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (1980). This section provides a brief review, sufficient to support the critique that follows.

The Organisation for Economic Cooperation and Development (OECD), formed in 1961, is "a club of like-minded countries that ... provides governments a setting in which to discuss, develop and perfect economic and social policy" (About OECD, 2000). In practice, its focus is much more on economic than on social matters, with just one of its 15 Committees and associated Directorates addressing all of Education, Employment, Labour and Social Affairs.

In 1999, the OECD had 29 member-countries covering most of Europe, the U.S.A., Canada and Mexico, plus Japan and Korea, and Australia and New Zealand. Those countries produced two thirds of the world's goods and services.

During the late 1970s, the OECD perceived "a danger that disparities in national legislations could hamper the free flow of personal data across frontiers. ... Restrictions on these flows could cause serious disruption in important sectors of the economy, such as banking and insurance" (OECD 1980, p.1). The concern about "unjustified obstacles" is expressed many times in the document. The expression makes clear that privacy, a merely social interest, is a constraint on the implicitly higher-order economic interest in "free flows of personal data", and that its constraining effects must be minimised.

To address that risk, it sought "a consensus on basic principles which can be built into existing national legislation, or serve as a basis for legislation in those countries which do not yet have it" (p.1), and thereby codified the already fairly mature FIP-based regime.

The Guidelines themselves are a mere 2-1/2 pages and 22 numbered paragraphs. In this paper they are referred to using the prefix 'G'. The Principles are at G7-G14, but in this paper are referred to as P1-P8. They are reproduced in Exhibit 1, below.

The Guidelines are preceded by a Preface and Recitals totalling 1-1/2 pages, and followed by an Explanatory Memorandum of 14 pages comprising an Introduction and 77 numbered paragraphs (referred to in this paper using the prefix 'EM').

The instrument is a set of Guidelines, not a Convention; and it merely "recommends" that member-countries "take [the Principles] into account" in their domestic legislation (p.2). This means that it is in no sense obligatory for a country that accedes to them to comply with all, or indeed any, part of the document, to ever establish legislation to give them effect, or to ensure that laws that are passed are compliant with the Guidelines in all respects.

Further, although the document states that the Principles are "minimum standards" (G6/EM49), other sections of the text make abundantly clear that this expression is mere rhetoric. They can be compromised in many ways without the shortfalls being in any sense in breach of the Guidelines (e.g. G3/EM45, G4/EM46, EM47 and P1/EM52).

In summary, the nature of the Guidelines is as follows:

Clarke (1987) provides a structured interpretation of the OECD Guidelines, including a modest re-structuring to overcome some drafting clumsiness. Its purpose is to assist researchers in assessing the extent to which laws and proposals for law comply with the Guidelines. It is applied to the Australian Privacy Act 1988 in Clarke (1989).


Exhibit 1: The OECD Principles

Collection Limitation Principle

P1. There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.

Data Quality Principle

P2. Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.

Purpose Specification Principle

P3. The purposes for which personal data are collected should be specified not later than at the time of collection and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.

Use Limitation Principle

P4. Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with [Principle 3] except:

(a) with the consent of the data subject; or

(b) by the authority of law.

Security Safeguards Principle

P5. Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorised access, destruction, use, modification or disclosure of data.

Openness Principle

P6. There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.

Individual Participation Principle

P7. An individual should have the right:-

(a) to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to him;

(b) to have communicated to him, data relating to him

(i) within a reasonable time;

(ii) at a charge, if any, that is not excessive;

(iii) in a reasonable manner; and

(iv) in a form that is readily intelligible to him;

(c) to be given reasons if a request made under sub-paragraphs (a) and (b) is denied, and to be able to challenge such denial; and

(d) to challenge data relating to him and, if the challenge is successful, to have the data erased, rectified, completed or amended.

Accountability Principle

P8. A data controller should be accountable for complying with measures which give effect to the principles stated above.


4. Fundamental Deficiencies in the OECD Guidelines

In sub-sections 2.3 and 2.4, the process leading to the expression of the OECD Guidelines was shown to have been motivated by the protection of business activities, rather than of people's privacy. This section identifies aspects of the OECD Guidelines that were inadequate even within the context of the times. Section 6 addresses further inadequacies that have arisen over time.


4.1 The Scope of Data Protection

Privacy-intrusive technologies can be applied by any organisation that has power over individuals. Hence the expectation of privacy protections is that they encompass all aspects, across a variety of dimensions. This sub-section identifies ways in which the OECD Guidelines fall short of that need.

(1) The Public and Private Sectors

The public expectation is that organisations in both the public and private sectors are subject to privacy regulation. The OECD considered restricting the scope of the Guidelines to only the public or only the private sector, but decided to cover both (G2, G5, EM44).

Because the Guidelines are in no way binding, some countries that have passed laws have applied them only to one of the two. Most commonly, the laws are expressly applied only to government agencies, on the grounds that government databases created the greatest dangers, that natural processes would limit the intrusiveness of companies, or that governments should gather experience with the public sector first prior to imposing regulation on the private sector.

The EU Directive explicitly requires member-countries to impose a statutory privacy protection regime on the private sector, which is in several respects more stringent than the OECD Guidelines. This has impacts on other countries, because of the requirement in Article 25 that "the transfer to a third country of personal data ... may take place only if ... the third country in question ensures an adequate level of protection, ... assessed in the light of all the circumstances ... [including] the rules of law, both general and sectoral, in force in the third country in question ..." (EU 1995).

Although negotiations continue between the E.U. and the U.S.A., it is difficult to see how self-regulatory arrangements could satisfy the 'rules of law' test. Hence the EU Directive has generated pressure on countries that anticipate the transfer of personal data from European countries to enact privacy protection laws regulating the private sector to at least the level described in the OECD Guidelines.

(2) Organisations

The public expectation is that all organisations will be subject to the privacy protection regime. There may of course be differences in the particular manner in which particular Principles are applied (e.g. national security agencies might be supervised by a separate Inspector-General; and subject access to medical records held by health care organisations might involve the presence of an appropriately qualified intermediary); but no organisation should be granted exemption.

The OECD Guidelines provide ample scope for exemptions and exceptions, in particular at G2/EM43, G3(b)/EM45, G4/EM46-47, EM19(g), EM52, EM58, and EM59. For example, EM45 expressly "permits Member countries to exercise their discretion with respect to the degree of stringency with which the Guidelines are to be implemented".

In various statutes that are claimed to be OECD-compliant, all manner of government agencies have been exempted, especially in such areas as national security and law enforcement (which are of course among the most feared agencies, and whose prior records show them to be most in need of control), and even in "public policy ('ordre public')" (G4), a concept whose vagueness is useful for justifying the exemption even of revenue-collection agencies.

(3) Categories of Data Collection and Processing

The public expectation is that all data collections and all kinds of data processing will be subject to the privacy protection regime. There may of course be differences in the particular manner in which particular Principles are applied (e.g. data security, and access by the data subject); but no exemptions should be granted for particular data collections or particular kinds of processing.

The OECD Guidelines' scope for exemptions and exceptions, on the other hand, can be readily interpreted as applying to data collections as well as organisations. Despite EM37 stating that "the principles ... are valid for the processing of data in general, irrespective of the particular technology employed", EM45 confirms the interpretation of G3(b) as allowing data protection laws to be applied only to data that is subject to "automatic processing".

Some OECD-compliant laws apply not to 'personal data', but only to 'records containing personal data'. This deflects attention away from personal information and towards particular I.T. mechanisms, and creates opportunities for schooldays-debaters-turned-barristers to ply their trade against the public interest. Genuine privacy-protective law applies to all data, and its applicability should not be constrained through the use of qualifiers.

Many OECD-compliant schemes are qualified in relation to what are sometimes referred to as 'public registers', such as electoral rolls and land titles registers. Some schemes even exempt such data collections entirely from all privacy protections. For example, the Australian Privacy Act fully exempts "generally available publications" (through the s.6 definition of 'record').

As is argued in Clarke (1997d), qualifications to privacy laws for such registers are unjustifiable. It is especially critical that the uses be controlled by reference to the purposes of the collection. The Australian Privacy Charter (APC 1994) includes at Principle 17 (Public registers) the statement that: "Where personal information is collected under legislation and public access is allowed, these Principles still apply except to the extent required for the purpose for which public access is allowed".

(4) Individuals

The public expectation is that everyone's personal data will be subject to the privacy protection regime. There may of course be differences in the particular manner in which particular Principles are applied (e.g. in the case of minors, and other persons not fully capable of exercising their rights); but no person should be excluded.

The OECD Guidelines define 'data subject' as "an identified or identifiable individual" (implicitly only, see G1b), and other references (e.g. at EM33 and 41) are to 'individual' and 'physical persons' in an unqualified manner.

Despite this, the Guidelines' general looseness has permitted a wide range of exceptions to arise. These include:

All such exceptions undermine public confidence in the reasonableness of privacy protection regimes.

Some laws apply only where the person's identity is apparent. In the case of the Australian law, the scope is further constrained by the requirement that the identity be apparent, or reasonably able to be ascertained, "from the information or opinion". This leaves unregulated those circumstances in which the identity of the person is apparent from context, or in which it can be ascertained only by the addition of further information.

The Australian Privacy Act 1988 defines 'individual' very openly as "a natural person" (s.6(1)). An exception is made in s.41(4), however. The Privacy Commissioner is precluded from investigating a breach of the Alteration Principle unless the person concerned is either an Australian citizen or has rights of permanent residence.

Genuine privacy-protective law applies to all data that is capable of being associated, through any means, with an identified person, and is subject to no qualifications based on who the person is.


4.2 Exemptions and Exceptions

The degree of impact on the privacy interest depends on the circumstances. So too does the significance of countervailing interests. It was argued in the preceding sub-section, and in greater depth in Clarke (1997c), that exemptions and exceptions to the Principles are not the appropriate response. The OECD Guidelines' permissive approach to exemptions and exceptions is inimical to effective privacy protection.

Any form of exemption is a very blunt weapon, because it creates a void within which uncontrolled abuses can occur. Instead what must be striven for is balanced implementation of universal principles, reflecting the context. This applies particularly to operations whose effectiveness would be nullified in the event of full or even partial public disclosure of data, or of modus operandi. In such instances, an appropriate proxy needs to be authorised in law to have access to the sensitive operational information, but precluded in law from disclosing details to the public. Not only government agencies but also government business enterprises and private sector organisations have tenable claims for treatment along these lines.

This further implies that:

It is also vital to privacy protection that the favoured status traditionally granted to defence, national security, criminal intelligence and law enforcement agencies be rolled back. Parliaments must make these agencies understand that they are subject to democratic processes and that their distinctive and challenging functions and operational environments dictate the need for careful formulation and implementation of privacy protections, not for exemption.


4.3 Monolithism of State and Corporation

When it suits their interests, agencies adopt the attitude that the agencies of government form a monolith, and hence all data transfers are internal to government. This is inimical to the information privacy interest, and it is necessary for Parliaments to make clear that agencies are independent organisations for the purposes of data transfer, and that all data transfers are therefore subject to the rules regarding collection and dissemination.

In addition, there is a danger that privacy protections may be subverted by the concentration of business functions and their associated data into large, multi-function agencies. Hence boundaries must be drawn not only between agencies but also between business functions and systems, and data collections.

'Virtual centralisation' of data by way of network connection, and data linkage via common identifiers, embody extremely high information privacy risks. Many European countries already have a multi-purpose identifier, and a multi-purpose central database. A few, most notably Denmark, hold a great deal of personal data, creating enormous power for the next invader or authoritarian government. In other countries, particularly the United States, Canada, the United Kingdom and Australia, the 'national databank' agenda of the mid-1960s is being continually resuscitated by government agencies, together with pressure for a general-purpose identification scheme. These must be strenuously resisted if the existing balance of power between individuals and the State is not to be dramatically altered.
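To illustrate the mechanism (and nothing more), the following minimal sketch in Python shows why a common identifier is so potent. The records, field names and identifier are entirely hypothetical; the point is simply that, once two separately collected holdings share an identifier, consolidating them into a single dossier is a trivial join operation.

    # Minimal sketch: a shared identifier collapses separately collected
    # data holdings into a single dossier. All records are invented.

    tax_records = {
        "ID-1234": {"name": "J. Citizen", "income": 52_000},
    }
    health_records = {
        "ID-1234": {"conditions": ["asthma"]},
    }

    # One dictionary merge per shared key is all that 'linkage' requires.
    dossiers = {
        pid: {**tax_records.get(pid, {}), **health_records.get(pid, {})}
        for pid in tax_records.keys() | health_records.keys()
    }

    print(dossiers["ID-1234"])
    # {'name': 'J. Citizen', 'income': 52000, 'conditions': ['asthma']}

Without a shared identifier, the same consolidation requires error-prone matching on names and addresses; that is why general-purpose identification schemes so dramatically alter the balance of power between individuals and the State.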

The emergence of electronic services delivery (Clarke 1999b) has seen additional developments that require control. These include the tendency towards cross-notification between agencies of data that may be of value to other parts of the monolith (e.g. change of address); and cross-system enforcement, whereby one agency withholds money or services from its client, in order to enforce a debt the client is claimed to owe to another agency.

Much the same propensities for monolithism exist within corporations. Privacy protection law must establish bulkheads between business units, divisions and subsidiaries, to preclude compromise of privacy protections. Moreover, some takeovers and mergers are targeted specifically at the acquisition of personal data, or the consolidation of separate holdings. These also need to be regulated, to ensure that data is used only for the purposes to which the individuals concerned reasonably expected the data to be put.

The OECD Guidelines are entirely silent on this matter. The sole test is relevance to purpose (P3, EM54).

Monolithism of government and corporations must be expressly denied, and Parliaments must make clear that government agencies are independent organisations for the purposes of data transfer, and so are corporate business units.


4.4 The Scope of the Protective Regime

There are several ways in which regulation can be achieved. These are usefully segmented into:

The features of an effective co-regulatory scheme are described in Clarke (1999) as comprising all of the following:

Provision for sector-specific codes first appeared in the 1988 legislation in The Netherlands (Burkert 1999), and is a particular feature of the New Zealand Act of 1993.

The OECD Guidelines "encourage and support self-regulation, whether in the form of codes of conduct or otherwise" (G19(b)). A careful reading establishes, however, that the intention was not to permit uncontrolled self-regulation. At G19(d), there is an explicit requirement to "provide for adequate sanctions and remedies in case of failures to comply ..."; and "[in] common law countries ... non-legislative implementation of the Guidelines would complement legislative action" (EM70, my emphasis). The EU Directive states that "The Member States and the Commission shall encourage the drawing up of codes of conduct" (EU 1995, Article 27); and the context set by Articles 22-25 makes clear that these must be subject to legal sanctions and enforcement.

The Guidelines need to be explicit about the inadequacy of uncontrolled self-regulation, and about the elements required to implement a co-regulatory approach.


4.5 The Privacy Protection Agency

Privacy protection regimes based on cases being brought by private citizens against the might of large, wealthy, information-rich, worldly-wise agencies and corporations have not worked, and are highly unlikely to do so in the future. To achieve appropriate balance between information privacy and administrative efficiency, it is necessary for an organisation to exist which has sufficient powers and resources to effectively represent the information privacy interest (Australian Law Reform Commission 1983; OTA 1986, pp.57-59, 113-122; Flaherty D.H. 1989, esp. pp.359-407).

It has been argued that, in the United States, "a small agency of perhaps sixty professionals drawn from legal, social science and information systems backgrounds" would provide sufficient resources to address the problem (Laudon 1986, p.383). From the perspective of Australia and Canada, this would seem parsimonious for such a large country, but there can be little doubt that, given an appropriate set of powers, and sufficient independence from the Executive, it could make a significant difference to the balance of power.

It would be valuable to complement such a body with an effective option for individuals to prosecute and sue organisations which fail to comply with legal requirements. This can only come about if the requirements of organisations are made explicit, and this in turn is only likely to come about if detailed codes, prepared by a privacy protection agency on the basis of research and experience, are given statutory authority. In addition to valuable discussions in Privacy Protection Study Commission (1977), Australian Law Reform Commission (1983), Laudon (1986, pp.382-4), and Flaherty (1989), elements of all of these can be found in Canadian, Australian and New Zealand legislation and practice.

There are two competing models. The conventional one involves the agency being required to balance information privacy against other interests (such as administrative efficiency), and is based on negotiation and conciliation rather than adversary relationships. This risks capture of the privacy protection agency by the other much larger, more powerful, information-rich and diverse agencies. Laudon argues strongly for the alternative: an agency that is explicitly an advocate for information privacy, and that can represent that interest before the courts and Congress (Laudon 1986, p.384).

The EU Directive requires that "Each Member State shall provide that one or more public authorities are responsible for monitoring the application within its territory of the provisions adopted by the Member States pursuant to this Directive" (EU 1995, Article 28.1).

The OECD Guidelines fail to require the creation of any such body. The requirement in G19(d) for "adequate sanctions and remedies" is interpreted in EM70 to permit "either the setting-up of special supervisory bodies, or reliance on already existing control facilities, whether in the form of courts, existing public authorities or otherwise".

The public expectation is that a specialist body will exist to supervise government agencies and corporations, and the OECD Guidelines fail to fulfil that expectation.


4.6 The Privacy Protection Agency's Capabilities

The public expectation is that the privacy protection agency will be capable of performing its function of exercising control over government agencies and corporations, and will not be unduly constrained by law, by coercion by governments, by 'capture' by the organisations it regulates, or by resources.

There are over 20 national Data Protection and Privacy Commissioners, and some 20 in subsidiary jurisdictions in such countries as Canada, Germany, Switzerland and Australia. Registers of web-pages of these organisations are available from the Global Internet Liberty Campaign (GILC) and the Australasian Legal Information Institute (AustLII).

Many of these agencies are seriously constrained in their work, in such ways as the following:

The Privacy Commissioner must have sufficient powers and resources to effectively represent the information privacy interest, supervise organisations' behaviour, and act on material breaches. This requires involvement in the development of policy by governments and agencies, not just in its implementation. Labour-intensive activities designed to sap his office's energies, such as registration and digest publication, must either be dropped entirely, or devolved to the agencies themselves.

The EU Directive requires that the privacy protection agency have adequate powers to perform its function. The provisions include that the agency "shall act with complete independence in exercising the functions entrusted to them", that "each Member State shall provide that the supervisory authorities are consulted when drawing up administrative measures or regulations", and "each authority shall ... be endowed with investigative powers, ... effective powers of intervention, ... and the power to engage in legal proceedings where the national provisions ... have been violated or to bring these violations to the attention of the judicial authorities" (EU 1995, Article 28).

The OECD Guidelines fail to specify the measures needed to ensure that an effective privacy protection regime is achieved.


4.7 Justification

The public expectation is that there be means whereby organisations can be called to account, and their justification for existing and intended new practices subjected to examination. The need encompasses several elements, which are addressed in the following sub-sections.

(1) Justification of Systems

The public expectation is that organisations be required to justify the existence of systems that handle personal data, and the creation of new systems: "At some point ... the repressive potential of even the most humane systems must make them unacceptable" (Rule 1980, p.120, emphasis in original); and "We need to recognize that the potential for harm from certain surveillance systems may be so great that the risks outweigh their benefits" (Marx 1986, p.48). Examples of systems whose justification is contentious include blacklists in such contexts as credit, insurance and tenancy; databases of biometrics, especially of persons other than convicted criminals; data-matching schemes (Clarke 1995a); and registers of persons accused of behaviour currently regarded as heinous, such as sexual abuse of minors, or even sexual harassment.

The OECD Guidelines include a requirement that "there should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller" (P6, EM57). This establishes openness; but it does not imply any need for justification.

There have been almost no personal data systems, or even uses of systems, which have been banned outright. In the case of data-matching programs, Shattuck (1984, p.540) reported that, during the first five years, the unsympathetic interpretation of the U.S. Privacy Act by the supervising agency, the Office of Management and Budget, resulted in not a single matching program being disapproved. Worse still, as Laudon noted in relation to the U.S. public sector, "a pattern has emerged ... in which the identification of a social problem [such as tax evasion, welfare fraud, illegal immigration, or child maintenance default] provides a blanket rationale for a new system without serious consideration of the need for a system" (1986, p.385). Such 'blanket rationale' has no place in a genuinely privacy-protective environment.

Few sets of Information Privacy Principles appear to even contemplate such an extreme action as disallowing some applications of IT because of their excessively privacy-invasive nature. Exceptions include those of the New South Wales Privacy Committee (NSWPC 1977), which are not legally enforceable, and, with qualifications, Sweden. The Australian Privacy Charter states that "The onus is on those who wish to interfere with privacy to justify doing so" (APC 1994).

A regime that meets the populace's needs would therefore:

Claims that a system is justified need to be assessed using an appropriate method. In the case of especially privacy-intrusive applications of information technology, a formal Privacy Impact Analysis is needed (Clarke 1998b). Governments have accepted the principle that projects that have significant potential impact on the physical environment require careful and independent assessment prior to a commitment being made to them. In the same way, Governments must not commit to IT projects that have significant privacy impacts, until after those impacts have been submitted to a public assessment process.

In some cases, it may be unreasonable to ask an organisation to make its full strategy publicly available, because of the harm this might do to the system's effectiveness, or the organisation's competitive position, or possibly because of the costs involved. As a proxy for full public availability, it may be appropriate for the control to exist in the form of availability of the information to an organisation that is independent of the proponent, and that enjoys the public's confidence in relation to such matters, e.g. a privacy protection agency, together with appropriate public reporting by that organisation.

To reach a decision as to whether or not a system should be permitted, or permitted to continue, an evaluation is needed. Given that the tension is between a social interest and an economic one, mere financial analysis is inadequate; instead, the appropriate technique to apply is cost/benefit analysis. A description of CBA is provided in Clarke (1995a). Application of the technique to the evaluation of computer matching programs is examined in Kusserow (1984, pp.33-40), PCC (1989), PCA (1990), and Clarke (1995b).
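By way of illustration only, the skeleton of such an analysis for a hypothetical matching program might look like the following Python sketch. All figures, and the cost and benefit categories, are invented for the example, not drawn from the sources cited above.

    # Illustrative cost/benefit skeleton for a hypothetical data-matching
    # program. All figures and categories are invented for the example.

    costs = {
        "system_development": 250_000,     # one-off establishment cost
        "annual_operation": 120_000,       # staff and processing
        "investigation_of_hits": 300_000,  # following up raw matches
    }
    benefits = {
        "overpayments_recovered": 400_000,
        "deterrence_effect": 150_000,      # inherently speculative
    }

    # Social costs resist monetisation, but a genuine CBA must at least
    # enumerate them, even where no dollar value can be assigned.
    social_costs = ["erroneous accusations", "chilling effects", "loss of trust"]

    net_financial = sum(benefits.values()) - sum(costs.values())
    print(f"Net financial result: {net_financial:+,}")   # -120,000 here
    print("Unquantified social costs:", "; ".join(social_costs))

A program that is in deficit even before the unquantifiable social costs are weighed would, on this approach, fail the justification test.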

(2) Justification of System Purposes

The public expectation is that organisations be required to justify the purposes of systems that handle personal data. The NSW Privacy Committee's Guidelines of 1977 stated that "A personal data system should exist only if its general purpose and specific uses are socially acceptable" (NSWPC 1977).

The OECD Guidelines require only that "the purposes for which personal data are collected should be specified ..." (P3, my emphasis). EM54 enlarges on this as follows: "Such specification of purposes can be made in a number of alternative or complementary ways, e.g. by public declarations, information to data subjects, legislation, administrative decrees, and licences provided by supervisory bodies". Of these, the only ones which imply any kind of legal authority are 'legislation' and 'licences'. In short, the OECD Guidelines generally treat organisations as though they were sovereign, and create no momentum of any kind towards an environment in which they need to justify their actions.

The limitation of data-handling only to the purposes of collection is central to the entire body of OECD Principles, especially P1 - Collection Limitation, P2 - Data Quality, P4 - Use Limitation, and P6 - Openness; yet OECD-compliant schemes impose almost no controls on the purposes of collection. Under some regimes, for example, statements of purpose are used that are constructively abstract or vague, and hence all-inclusive. Examples include the 'routine uses' clause much-loved among agencies subject to the U.S. Privacy Act, e.g. 'for the purpose of government (or taxation, or welfare) administration', and 'to enable the organisation to perform its statutory functions'.

A genuinely privacy-protective regime must require more than that the purpose or purposes of personal data be stated, and perhaps published. It must:

All organisations need to be responsible in law to respond to requests for justification of usage. The power to challenge justification may be:

The arbitration of the acceptability of system purposes might take several alternative forms, including:

(3) Justification of System Features

The public expectation is that organisations be required to justify specific features of systems that handle personal data. Examples include the use of particular identifiers, or data from particular sources, or data of especial sensitivity.

The OECD Guidelines recognise two categories of system features. One is personal data of especial sensitivity (such as "race, religious beliefs, criminal records"). This is discussed in EM6, EM19(a), EM45, EM50-51, but no substantive conclusion was reached, and hence the Guidelines are in effect silent on the matter. The EU Directive is, on the other hand, much more forthright, stating that (subject to some qualifications) "Member States shall prohibit the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life" (EU 1995, Article 8.1).

The other aspect that is mentioned in the OECD Guidelines, at EM19(e), EM24, EM26(b), EM51 and EM70, is unfair discrimination "on such bases as nationality and domicile, sex, race, creed, or trade union affiliation" (EM70). Once again, however, no substantive conclusion is reached.

A genuinely privacy-protective regime must:

The EU Directive directly addresses the issue, by requiring that "Member States shall determine the processing operations likely to present specific risks to the rights and freedoms of data subjects and shall check that these processing operations are examined prior to the start thereof" (EU 1995, Article 20.1).

(4) Justification of the Relevance of Data to a Decision

The public expectation is that organisations be required to justify the relevance of personal data to particular uses. Examples include the use in employment decisions of marital status and citizenship.

The OECD Guidelines state that "Personal data should be relevant to the purposes for which they are to be used" (P2). EM53 draws attention to the particular risk that "data concerning opinions may easily be misleading if they are used for purposes to which they bear no relation, and the same is true of evaluative data". Yet the Guidelines provide no explicit requirement that an organisation be called to account for its usage; and hence privacy protection agencies may be unable to enforce the requirement.

An effective privacy-protective regime would:

(5) Justification of Adverse Decisions

The public expectation is that organisations be required to explain the reasons for decisions adverse to the interests of an individual. Examples include refusals to hire, lend, grant a benefit, or award a licence. The reasons include counselling of the individual, the discovery of errors in information provided and in entries in databases, and the enablement of appeals and complaints processes. The issue is of especial concern in the case of government agencies, which exercise the authority of law, or are in effect monopoly providers of services.

The right to be given reasons for adverse decisions was a matter of difficulty for the OECD (P7, EM60). Principle 7(c) makes clear that "an individual should have the right to be given reasons if a request [for access or correction] is denied". In a particularly bold move, "broadening of this right to include reasons for adverse decisions in general, based on the use of personal data, met with sympathy by the Expert Group. However, on final consideration a right of this kind was thought to be too broad for insertion in the privacy framework constituted by the Guidelines" (EM60).

An effective privacy-protective regime would:


4.8 No Disadvantage

The public expectation is that the exercise of privacy rights must not prejudice access to other rights or services. The Australian Privacy Charter expressed the requirement in the following manner: "People should not have to pay in order to exercise their rights of privacy described in this Charter (subject to any justifiable exceptions), nor be denied goods or services or offered them on a less preferential basis. The provision of reasonable facilities for the exercise of privacy rights should be a normal operating cost" (APC 1994, at 18).

The OECD Guidelines appear to be silent on this important matter.


4.9 Handling of Data Sensitivity

Data varies greatly in its sensitivity, depending on the data-item, the person concerned, and the circumstances. It is inadequate to assume that particular data-items (e.g. date-of-birth, marital status, number of dependants) have a particular, fixed degree of sensitivity. The public expectation is that all personal data be subject to protections, and that additional protections apply in particular circumstances.

There are already several ways in which data sensitivity may have to be considered in the Australian legal context, including:

The OECD Guidelines, as noted in section 4.7(3) above, discussed the matter at EM6, EM19(a), EM45, and EM50-51, but reached no substantive conclusions; whereas the EU Directive imposes requirements (EU 1995, Article 8.1). The Australian Privacy Commissioner's Principles state that "We will limit the collection of highly sensitive information about you. [In particular,] an organisation should not collect personal information revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, or details of health or sex life" (NPFHI 1998).

An effective information privacy protective regime would:


4.10 Adaptiveness

The public expectation is that the law is responsive to changes in technology and practice, and that the privacy protection agency is capable of detecting and understanding changes, and communicating them to legislatures and the public.

If a proper balance between information privacy and other interests is to be maintained, processes must be instituted whereby technological change is monitored, and appropriate modifications to law, policy and practice brought about. This needs to be specified as a function of the privacy protection agency, and the agency funded accordingly.

The OECD Guidelines appear to be silent on these important matters.


5. Post-1960s Information Technology

In the 1960s, computers were used primarily for computation and progressively also for record-keeping, to which the term 'data processing' was applied. The early data protection laws, the 'fair information practices' school of thought, and the OECD Guidelines, were all shaped in this context.

The context has changed. Technologies have increased greatly in sophistication, and in capacity. Progress has included quite dramatic increases in processor computational speed, and in storage capacity and access speed; and reduction and then miniaturisation of processor size, energy requirements, and heat production. As a result, computers have migrated from large rooms to small chips. In addition, many new technologies have emerged, in such areas as printing, display, storage, and data analysis, and have been grafted onto the base technology of computing.

Progressively, convergence has taken place between computing and telecommunications, and also with robotics, to deliver what is currently referred to as 'information technology'. Transponders have been added to computers, and inherently intrusive applications such as biometrics, locator mechanisms and dataveillance have been developed and deployed. A catalogue of recent technological developments is in Clarke (1998h). The additional threats arising in the context of the Internet (Clarke 1998f) are merely portents of the surveillance capabilities of the future, which will use more sophisticated and more pervasive information infrastructure, especially that inherent in mobile computing (Clarke 1999c).

Only 15 years after the first data protection law was enacted, it could be shown that the absence of a common personal identifier was the only remaining barrier to widespread dataveillance of the kind implied by dystopian novels such as Orwell's '1984' (Clarke 1988). In a similar vein, a review of privacy protections in the U.S. public sector concluded that "new applications of personal information have undermined the goal of the Privacy Act that individuals be able to control information about themselves" (OTA 1986). The second 15 years that have elapsed since the era for whose technologies the OECD Guidelines were formulated have produced far more substantial increases still in privacy-invasiveness.


6. 21st Century Deficiencies in the OECD Guidelines

Section 4 addressed deficiencies that existed in the OECD Guidelines from the outset. This section identifies additional inadequacies, evident at the turn of the 21st century, that have emerged only during the last two decades, primarily as a result of advances in information technology, and their application in privacy-abusive ways.


6.1 The Privacy Protection Agency and the Public

The public expectation is that a strong relationship exist between privacy protection agencies on the one hand, and public interest advocates and representatives on the other.

The experience in Australia has fallen far short of that expectation. The Commonwealth Privacy Commissioner routinely consults with the organisations subject to regulation, but seldom with privacy advocates and representatives of the public. The Commissioner has even declined to make the results of surveys of agency practices publicly available. Moreover, communications with organisations are routinely treated as being commercial-in-confidence, whereas advocates and representatives seldom place such constraints on the information that they provide to the Commissioner.

At the very minimum, it would be expected that a privacy protection agency would deal even-handedly with the various interests. Many people would expect much more than that, because of the perception that the agency's role is first and foremost to represent the interests of the public, and that administrative efficiency is a constraint, not an objective.

The OECD Guidelines appear to be silent on this matter. They clearly need to be enhanced to require the privacy protection agency to make as much information as possible available to the public, and to establish working relationships with privacy advocates and representatives of the public.


6.2 Consultation and Participation

The OECD Guidelines require "a general policy of openness about developments, practices and policies with respect to personal data" (P6). In the late 1970s, this seemed like a bold move, expecting organisations to provide information to the public. It now seems pitifully inadequate.

In the early 1970s, personal data systems operated almost entirely within the boundaries of a particular corporation or government agency (what would now be referred to as 'intra-organisational' systems). Through the 1980s, with the convergence of communications with computing, systems matured into 'inter-organisational' forms (linking pairs of organisations), and then 'multi-organisational' arrangements. The risks of data-sharing expanded enormously; but the most that OECD-compliant privacy protection regimes offered continued to be information about "the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller" (P6).

During the 1990s, the proliferation of access points to networks enabled further maturation of systems into 'extra-organisational' form (Clarke 1992c). Schemes such as ATMs, EFT/POS and, more recently, web-based shopping involve micro-enterprises (small, single-site, and in many cases single-person enterprises, such as retail outlets and service agents) and members of the public. Organisations are now dealing electronically with 'business partners' who do not have professional IT managers with an understanding of such arcane arts and technologies as systems analysis and communications protocols. Organisations also have a greatly enhanced capability to gather personal data, and to apply it to their dealings with the individuals concerned.

The mere provision of information to the micro-enterprise, consumer and citizen participants in these systems is entirely inadequate. The public needs to be involved in the requirements analysis, system design, testing, implementation and operation of such schemes.

This is not merely a social argument. The uptake of EFT/POS was delayed a decade by design blunders that could have been avoided had consumer interests been properly represented (Clarke 1992b, 1992c). More recently, growth rates in consumer Internet commerce have been far slower than other Internet metrics, because of marketer ignorance of consumer needs and expectations, itself a result of the almost total exclusion of consumers from the analysis and design process (Clarke 1999c; see also Clarke 1998a).

Extra-organisational systems of all kinds demand at least consultation between the scheme's sponsors and the people affected by it, and preferably active participation by them. This is especially the case with schemes that handle personal data. Small business, consumers and citizens need to be involved in the key life-cycle phases of project scoping, requirements analysis, design, testing, implementation and operation. In addition, there is a need for privacy impact analyses to be undertaken as a pre-condition for the consideration of proposals for especially privacy-intrusive applications of information technology.

The OECD Guidelines, if they are to provide a basis for 21st century privacy protection, must be enhanced to reflect these needs.


6.3 Decision-Making by Artefacts

When the early data protection instruments were being formulated, decision-making and action by artefacts unaided by humans was still in the realm of science fiction. In the 1980s, it became a gleam in the eyes of social engineers and cost accountants, and in the 1990s it became a reality.

An effective information privacy protection regime would impose responsibility on the operator of a personal data system to ensure that all decisions about human beings (or at least those that might reasonably be expected to have negative consequences for the people concerned) are subject to review by a human being before being communicated or implemented.
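
By way of illustration only, the following sketch shows one way in which such a control might be embedded within an application, by withholding adverse automated decisions until a human reviewer has confirmed them. It is a minimal sketch in Python; the class and field names are merely illustrative assumptions, not any standard mechanism:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Decision:
        subject: str                    # the person the decision concerns
        outcome: str
        adverse: bool                   # might it harm the person's interests?
        reviewer: Optional[str] = None  # set only once a human has reviewed it

    class DecisionGate:
        """Withholds adverse automated decisions until a human confirms them."""
        def __init__(self):
            self.pending = []

        def submit(self, d: Decision) -> Optional[str]:
            if d.adverse and d.reviewer is None:
                self.pending.append(d)  # held back: neither communicated nor implemented
                return None
            return self._communicate(d)

        def review(self, d: Decision, reviewer: str, approve: bool) -> Optional[str]:
            d.reviewer = reviewer
            self.pending.remove(d)
            return self._communicate(d) if approve else None

        def _communicate(self, d: Decision) -> str:
            return f"{d.subject}: {d.outcome}"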

The EU Directive states that (subject to some qualifications) "Member States shall grant the right to every person not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc." (EU 1995, Article 15.1).

The OECD Guidelines need updating to reflect this need.


6.4 Multi-Purpose Identifiers

Many identification schemes are used by a single organisation, for a single purpose; but there are obvious attractions in sharing the costs across multiple organisations and business functions. A special case of a multi-purpose id scheme is what is usefully described as an 'inhabitant registration scheme' (Clarke 1994c). This provides everyone in a country with a unique code, and a token (generally a card) containing the code. Such schemes operate in many European countries, primarily for the administration of taxation, national superannuation and health insurance. In some countries, the scheme is used for additional purposes, such as the administration of social welfare and banking, and to ensure that particular rights, such as voting, residence, work, and movement across or within the country's borders, are exercised only by people entitled to them.

Inhabitant registration schemes are endured, and perhaps even welcomed, by the inhabitants of some countries; but are disliked, actively opposed, and undermined in many others. For any government seeking to repress behaviour, they provide the means to link database entries and hence to make visible each person's life-story, behaviour patterns, and even current location. The public policy aspects of schemes of this nature are discussed in Clarke (1988, 1992, 1994c, and 1997e).

The public expectation is that multiple usage of identifiers be subject to significant limitations and that, where an 'inhabitant registration scheme' exists, so too do significant legal, organisational and technical protections.

The OECD Guidelines mention identifiers only in passing: "different traditions and different attitudes by the general public have to be taken into account. Thus, in one country universal personal identifiers may be considered both harmless and useful whereas in another country they may be regarded as highly sensitive and their use restricted or even forbidden" (EM45).

There is plenty of evidence of the nervousness of the populace about such schemes. In Australia, the 1985-87 government proposal for an 'Australia Card' saw the greatest-ever volume of letters to newspaper editors, and the second-largest-ever street marches (second only to the Vietnam War protests). Not only was the proposal withdrawn, but a Privacy Act was at last passed, and the national Tax File Number was subjected to considerable constraints (Clarke 1987a).

Some years later, when the Australian Privacy Commissioner promulgated principles for privacy protection in the private sector, Principle 7 read as follows: "We will limit our use of identifiers that government agencies have assigned to you. [In particular, ] an organisation should not adopt as its own identifier an identifier that has been assigned by a government agency ... [and, subject to qualifications] an organisation should not use or disclose an identifier assigned to an individual by a government agency" (NPFHI 1998). The Guidance Notes state that "This aims to prevent the gradual adoption of government identity numbers as de facto universal identity numbers".

Several other countries have had similar experiences. In France in the mid-1970s, there was a public outcry about a leaked government plan for a national databank, under the code-name 'Safari'. In Canada, concern about the widespread use of the Social Insurance Number (SIN) resulted in some winding back of its use in government agencies (Flaherty 1989, pp.283-4). The Privacy Commissioner provides guidance regarding its usage and dangers (PCC 1999a, 1999b). "In 1991, the Hungarian Constitutional Court ruled that a law creating a multi-use personal identification number violated the constitutional right of privacy. In 1998, the Philippine Supreme Court ruled that a national ID system violated the constitutional right to privacy. ... [During 1998-99], cards projects in South Korea and Taiwan were stopped after protests" (EPIC 1999).

In the United States, consideration is periodically given to official (as distinct from the existing, de facto) widespread application of the Social Security Number (SSN) as a general-purpose identifier. The decision has always been negative, for the many different reasons explained in FACFI (1976). For additional sources, see EPIC (1995-), PI (1997) and Clarke (1997f).

Confronted with the difficulty that many European countries already have substantial and substantially different schemes in place, the EU was forced to limit its requirements to the anodyne "Member States shall determine the conditions under which a national identification number or any other identifier of general application may be processed" (EU 1995, Article 8.7).

The OECD Guidelines need to be expanded to place tight limitations on the multiple use of identification schemes, and to require considerable protections where an inhabitant registration scheme exists. Despite the apparent social engineering efficiencies available from such mechanisms, the risks of privacy abuse are simply too high.


6.5 Multiple Identifiers for Each Individual

People perform many different roles. In some circumstances, people wish to use multiple identifiers to reflect those different roles, in order to quarantine the information arising in relation to them. Examples of personal roles include parent, child, spouse, scoutmaster, sporting team-coach, participant in professional and community committees, writer of letters-to-the-newspaper-editor, chess-player, and participant in newsgroups, e-lists and chat-channels. Examples of roles played by a person in dealing with organisations include contractor, beneficiary, customer, lobbyist, debtor, creditor, licensee and citizen.

The reasons for seeking to segregate the data-trails arising from one's multiple roles vary from entirely innocent to criminal, and the activity is variously supported, tolerated and opposed by the State (Clarke 1994a, 1997e, 1999e). The laws of countries such as Britain and Australia in no way preclude such multiple identities (or aliases, or aka's - for 'also known as'). An act of fraud that depends on the use of multiple or situation-specific identities is, on the other hand, in most jurisdictions a criminal offence.

The public expectation is that individuals are free to use different identifiers with different organisations, and when conducting distinct relationships with the same organisation. The OECD Guidelines are silent on this matter, and need to be enhanced to address it.


6.6 Identification Tokens

Tokens are used by many organisations to assist in the authentication of an individual's identity, or of the bearer's attributes, credentials or eligibility (Clarke 1994c, 1999e). Passports and so-called 'id cards' are common examples.

Inert printing and embossing on cards have been supplemented by photographs and magnetic strips bearing data that is not visible to the person who carries it. The advent of chip-cards during the last quarter of the 20th century has created the scope for an id card to act as a witness against its bearer, as a spy in the person's own wallet or purse.

In the electronic world, a new form of digital token has emerged. So-called 'digital signatures' can be included within messages. They are generated using an encryption key that need never leave the possession of the person generating the signature. The encryption key is too long to memorise, and hence needs to be carried in some kind of physical token, quite possibly one that identifies the individual in some manner. The digital signature process, and the technological framework needed to support it (public key infrastructure, or PKI), create a substantial set of new privacy threats, which naive designs fail to address (Greenleaf & Clarke 1997, Clarke 1998e).
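
By way of illustration, the following is a minimal sketch of the signing process just described, expressed in Python using the third-party 'cryptography' package; the modern Ed25519 scheme stands in here, purely as an assumption, for whatever algorithm a particular PKI might actually adopt:

    from cryptography.hazmat.primitives.asymmetric import ed25519

    # The signing key is generated by, and need never leave, the signer's
    # own token or device; only the public key is ever disclosed.
    signing_key = ed25519.Ed25519PrivateKey.generate()
    public_key = signing_key.public_key()

    message = b"I authorise this transaction"
    signature = signing_key.sign(message)  # produced without revealing the key

    # Any holder of the public key can check origin and integrity;
    # verify() raises InvalidSignature if either has been tampered with.
    public_key.verify(signature, message)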

The public expectation is that the motivations for tight social control, and an efficient identification scheme to support it, be balanced against the interests of individuals in the various aspects of civil liberty. A substantial set of protections is essential in relation to tokens generally, and in relation to intrusive tokens in particular, including digital tokens. Clarke (1998e) argues that requirements in relation to digital signature technology include choice in relation to tokens, keys, identifiers, certificates and to their use, and personal control over tokens.

The OECD Guidelines are entirely silent on these vital issues, because they belong to an era that pre-dates the emergence of the underlying technologies. The Guidelines urgently need enhancement.


6.7 Biometrics

There is a tendency for organisations to apply measures of people's physical selves as a means of reducing the levels of error and fraud. Biometrics is a generic term encompassing a wide range of measures of human physiography and behaviour (Clarke 1994c, Tomko 1998, Cavoukian 1999). Measures of relatively stable aspects of the body include fingerprints, thumb geometry, and aspects of the iris and ear-lobes. Dynamic measures of behaviour include the process (as distinct from the product) of creating a hand-written signature, and the process of keying a password. Despite the inexactitude of DNA measures, it appears likely that attempts will be made by a variety of organisations in the near future to use genetic measures as a biometric (OTA 1990).

People generally feel demeaned when their physical selves are treated as merely an element of an organisation-imposed process. More threateningly still, the higher reliability of biometrics, and the capability to gather them surreptitiously, can significantly enhance the capacity of corporations and governments to monitor, manipulate and repress individual behaviour.

It is very common for proposals for biometric schemes to involve central storage of the biometrics, as police fingerprint records do now. The existence of the measure under the control of another party creates the possibility of the biometric being used in order to masquerade as that individual. Possible uses would be to gain access to buildings, software or data; to digitally sign messages or transactions; to 'capture' the person's identity; to compromise the person's identity and/or harm their reputation; to suppress the person's behaviour; or to 'frame' the person through the creation of false data trails, and to thereby gain wrongful convictions.

Beyond the use of natural physiographic and physiodynamic identifiers lies the emergent use of imposed identifiers. This quickly migrated from science fiction to delivered product as, during the last decade of the 20th century, it became routine to embed chips in pets and breeding stock. The first invasions of the human body have occurred consensually (in a university professor, as a research project), and with parental consent (in children, to enable identification in the event of abduction). Proposals are 'in the air' for insertion with consent-under-duress (prisoners) and entirely non-consensual embedment (in senile dementia patients).

The public expectation is that very tight controls will be imposed on all uses of biometrics. In particular, it is quite critical to the society and polity that schemes be devised such that biometric measures are only ever known to a chip held by the individual, and to a secure device that is currently measuring the person concerned (Clarke 1997e). This is analogous to the mechanism that has been in use for many years to protect secure PINs that are input on ATM and EFT/POS keyboards.
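
The following toy sketch, in Python, illustrates the mechanism; the fixed-length templates and the simple Hamming-distance comparison are stand-in assumptions for a real biometric matching algorithm:

    class BiometricChip:
        """Simulates a chip held by the individual: the enrolled template
        is stored only here, and only a yes/no answer ever leaves it."""

        def __init__(self, enrolled_template: bytes):
            self._template = enrolled_template  # never exported

        def verify(self, fresh_measurement: bytes, tolerance: int = 8) -> bool:
            # Biometric matching is inherently fuzzy; a Hamming-distance
            # threshold stands in for a proper matching algorithm.
            if len(fresh_measurement) != len(self._template):
                return False
            distance = sum(bin(a ^ b).count("1")
                           for a, b in zip(self._template, fresh_measurement))
            return distance <= tolerance

    # The measuring device submits the fresh measurement to the chip and
    # learns only the boolean result, never the enrolled template.
    chip = BiometricChip(enrolled_template=bytes(range(32)))
    assert chip.verify(bytes(range(32)))  # the same measurement matches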

The OECD Guidelines are silent on all of these critical issues, as are other instruments of the 'fair information practices' family. It is vital that they be updated, and updated urgently, to cater for these developments in technology.


6.8 Anonymity

Reversing the trends towards intrusive identification technologies and towards multiple usage of identifiers is not sufficient to satisfy the need. It is also vital to sustain the longstanding freedoms to conduct transactions anonymously (Clarke 1995c, 1996f, 1999d, Smith & Clarke 1999).

The relentless quest by corporations and government agencies for ever more, and ever more finely-grained, personal data has extended in the later years of the 20th century to the conversion of anonymous into identified transaction streams. Examples include:

Government agencies are frequently in a position to legally impose on individuals the condition that they identify themselves when performing particular kinds of transactions. Corporations may use a combination of inducements and market power to achieve the same end. It is vital to the privacy interest that the increase in information-intensity sought by organisations be resisted and reversed, or subverted.

Given the ravages already caused by privacy-invasive technologies, there is even a strong case for increasing the availability of anonymity in dealings. This has been the motivation underlying the explosion in anonymisation tools, otherwise known as privacy-enhancing technologies (PETs), that occurred during the last five years of the old century (IPCR 1995, EPIC 1997-, Burkert 1997, Clarke 1999b).

The OECD Guidelines are entirely silent on this question. So too is the EU Directive. There are, however, at least two instruments that have explicitly recognised the need for such a measure. The Australian Privacy Charter (APC 1994) includes, at Principle 10 - Anonymous Transactions, the statement that "People should have the option of not identifying themselves when entering transactions".

In 1997-98, the Australian Privacy Commissioner established a set of Principles intended for application in the private sector. This includes, as Principle 8 - Anonymity, the statement that "If we can (and you want to) we will deal with you anonymously. [In particular,] wherever it is lawful and practicable, individuals should have the option of not identifying themselves when entering transactions" (NPFHI 1998).

The OECD Guidelines need enhancement to require that consumer transactions and government programs permit anonymity. Schemes denying that option and instead involving obligatory identification require very careful justification, which needs to be published in order to enable public scrutiny.


6.9 Pseudonymity

One of the means whereby accountability is achieved is by ensuring that society can inflict punishment on miscreants. This is difficult in the case of anonymous transactions, and hence the deterrent effect that would otherwise arise from the possibility of retribution is lost. Retribution is only one form of (dis)incentive encouraging reasonable social behaviour, and its importance is frequently over-emphasised; but its loss is clearly of consequence.

Given that technologies have been devised that invade privacy, and that others have been devised that enhance privacy through anonymity, it is likely to be feasible to devise some that balance the apparently conflicting aims of identification and anonymity. What I have dubbed 'privacy-sympathetic tools' (PSTs) depend on the notion of protected indirect identification, or pseudonymity. They of necessity include legal, organisational and technical protections for the means of relating the pseudonym or persona to the person (or persons) behind it (Clarke 1994a, 1996f, 1999b, 1999e).
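
The following minimal sketch, in Python, illustrates the notion of protected indirect identification; the keyed-hash construction and the 'warrant' flag are illustrative assumptions only, and the accompanying legal and organisational protections cannot, of course, themselves be reduced to code:

    import hashlib
    import hmac

    class PseudonymAuthority:
        """A trusted party that issues pseudonyms and can map them back
        to the person behind them, but only under defined legal authority."""

        def __init__(self, secret_key: bytes):
            self._key = secret_key
            self._register = {}  # pseudonym -> real identity

        def issue(self, identity: str) -> str:
            pseudonym = hmac.new(self._key, identity.encode(),
                                 hashlib.sha256).hexdigest()[:16]
            self._register[pseudonym] = identity
            return pseudonym  # safe to use in dealings with organisations

        def reveal(self, pseudonym: str, warrant: bool) -> str:
            if not warrant:
                raise PermissionError("identity may be revealed only under "
                                      "the protections described above")
            return self._register[pseudonym]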

The OECD Guidelines are entirely silent on the question. So too is the EU Directive. The Guidelines need enhancement to encourage the development of such technologies, and the adoption of pseudonymity in dealings between organisations and individuals.


6.10 The Scope of Privacy Protections

In section 2.1, it was observed that privacy is not a single interest, but has several dimensions, including privacy of the person, of personal behaviour, of personal communications, and of personal data. The preceding analysis has shown how technologies that abuse information privacy have been converging with other forms of privacy invasion, such as video surveillance interfering with privacy of personal behaviour, and biometric collection intruding into the privacy of the person.

It has long been anomalous that information privacy enjoys at least some minimalist protections, but other forms of privacy do not. It is now critical to extend the scope of privacy protections. The public expectation is that privacy protections encompass all dimensions of privacy.

The OECD Guidelines expressly declare that they only "apply to personal data" (G2), and "do not constitute a set of general privacy protection principles" (EM38). The EU Directive is also concerned only with the "right to privacy [of natural persons] with respect to the processing of personal data" (EU 1995, Article 1.1).

Some privacy-protection regimes do, on the other hand, provide the privacy watchdog with responsibilities and commensurate powers to research into, and advise government in relation to, such matters. In a few instances, such as the N.S.W. Privacy Committee (1975-1998) and Commission (1998-), the agency also has complaints-investigation powers in respect of privacy matters of all kinds.

The Australian Privacy Charter includes several principles that address this need:

It is critical to the public interest that the regulatory regime extend beyond mere data protection to encompass all dimensions of privacy. Because of the considerable and increasing interactions between data protection and other dimensions, it is appropriate that the OECD Guidelines be the instrument whereby that extension be achieved.


7. Conclusions

The opening sentence of Warren & Brandeis (1890) conveys the timeless need for adaptation:

"That the individual shall have full protection in person and in property is a principle as old as the common law; but it has been found necessary from time to time to define anew the exact nature and extent of such protection".

"Recent inventions and business methods call attention to the next step which must be taken for the protection of the person".

The OECD's 1980 Guidelines reflected the appreciation of the needs for privacy protection that was widespread during the early-to-mid-1970s. It has been argued in this paper that the appreciation was seriously deficient then, and that the puny protections it called for have been overwhelmed by subsequent developments. We need to "define anew the exact nature and extent of" privacy protections.

Laudon argued that "a second generation of privacy legislation is required" (Laudon 1986, p.400). This second generation began in the late 1980s, with the United States somewhat improving control over computer matching with its 1988 Act; with Canada rolling back the uses of the Social Insurance Number and regulating data matching in 1989; and with Australia issuing draft Guidelines and passing its first (admittedly limited) second generation legislation in 1990 (after finally catching up with the first generation only at the beginning of 1989), and formally recognising the need for additional principles relating to identification and to anonymity. The EU Directive, although the first action by that organisation, is also clearly a second-generation instrument.

A previous paper has argued that privacy has become strategically important for organisations of all kinds (Clarke 1996b). This paper argues that public policy initiatives are urgently required, in order to force the maturation of the framework within which privacy protections are conceived, in order to address flaws in existing regimes, and in order to adapt them to cope with the dramatic advances in information technology of the last quarter-century.

The winding-back of privacy protections is untenable. Merely sustaining the fair information practices model is also untenable, because it results in only two scenarios, both of which are unacceptable:

Mankind's needs at the beginning of the new millennium are for a new paradigm of privacy protection. This paper has argued that this new paradigm must provide very substantial additional features; but that it does not necessarily have to be revolutionary to the extent of forcing the rescission of existing laws.

The Fair Information Practices paradigm, as codified in the OECD Guidelines of 1980, has provided very limited protections for people, and has successfully limited the harm to the interests of government agencies and corporations. The revised version of this American-inspired movement, the 'privacy as an economic right' approach, stands in direct contrast to the assertion of privacy as a human right.

This paper has identified an array of deficiencies in privacy protections, which arise from the fair information practices model. The 21st century will bring with it yet more dramatic incursions into the privacy of individuals, which will undermine community, society and the body politic. Unless privacy is asserted as a human right, and the substantial additional protections argued for in this paper are created, the increasing distance between individuals and institutions will result in the breakdown of social and economic processes.


Acknowledgements

This paper builds on prior work by several other longstanding researchers in the area, particularly (in alphabetical order) Colin Bennett, Herbert Burkert, David Flaherty, Graham Greenleaf, Michael Kirby, Ken Laudon, Gary Marx, James Rule, Robert Ellis Smith and Nigel Waters. It also owes a debt to many Australian advocates, most notably (in alphabetical order) Julie Cameron, Chris Connolly, Tim Dixon, Graham Greenleaf and Nigel Waters; to key advocates overseas, especially David Banisar, Simon Davies, Marc Rotenberg and Robert Ellis Smith; and to those Data Protection and Privacy Commissioners and their staff who have made intellectual contributions to the body of available information. Responsibility for the evaluative and judgmental elements of the paper rests with myself.


References

References to the author's own works are listed separately, below.

APC (1994) 'Australian Privacy Charter', Australian Privacy Charter Council, December 1994, at http://www.rogerclarke.com/DV/PrivacyCharter.html

Bennett C. (1992) 'Regulating Privacy: Data Protection and Public Policy in Europe and the United States' Cornell University Press, New York, 1992

Bennett C. & Grant R. Eds. (1999) 'Visions of Privacy : Policy Choices for the Digital Age' Univ of Toronto Press, 1999

Bentham J. (1791) 'Panopticon; or, the Inspection House', London, 1791

Brin D. (1998) 'The Transparent Society' Addison-Wesley, 1998

Burkert H. (1997) 'Privacy-Enhancing Technologies: Typology, Critique, Vision' in Agre P.E. & Rotenberg M. (Eds.) (1997) 'Technology and Privacy: The New Landscape' MIT Press, 1997

Burkert H. (1999) 'Privacy / Data Protection: A German/European Perspective' Proc. 2nd Symposium of the Max Planck Project Group on the Law of Common Goods and the Computer Science and Telecommunications Board of the National Research Council, Woods Hole, Mass., June 1999

Burnham D. (1983) 'The Rise of the Computer State' Random House, New York, 1983

Cavoukian A. (1999a) 'Privacy and Biometrics', Proc. 21st International Conference on Privacy and Personal Data Protection, Hong Kong, 13-15 September 1999, pp. 10-18

Cavoukian A. (1999b) 'Privacy as a Fundamental Human Right vs. an Economic Right: an Attempt at Conciliation', Information and Privacy Commissioner/Ontario, September 1999, at http://www.ipc.on.ca/web_site.eng/matters/sum_pap/papers/pr-right.htm

CE (1981) 'Council of Europe Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data' Strasbourg, 1981, at http://www.privacy.org/pi/intl_orgs/coe/dp_convention_108.txt

Cowen Z. (1969) 'The Private Man' The Boyer Lectures, Australian Broadcasting Commission, Sydney, 1969

CSA (1995) 'Model Code for the Protection of Personal Information' Canadian Standards Association, CAN/CSA-Q830-1995 (September 1995)

Davies S. (1992) 'Big Brother: Australia's Growing Web of Surveillance' Simon & Schuster, Sydney, 1992

Davies S. (1996) 'Monitor: Extinguishing Privacy on the Information Superhighway', Pan Macmillan Australia, 1996

EPIC (1995-) 'National ID Cards', Electronic Privacy Information Center, Washington DC, at http://www.epic.org/privacy/id_cards/default.html

EPIC (1997-) 'EPIC Online Guide to Practical Privacy Tools', at http://www.epic.org/privacy/tools.html

EPIC (1999) 'Privacy and Human Rights 1999: An International Survey of Privacy Laws and Developments', Electronic Privacy Information Center / Privacy International, 1999, at http://www.privacyinternational.org/survey/

EU (1995) 'The Directive on the protection of individuals with regard to the processing of personal data and on the free movement of such data', European Commission, Brussels, 25 July 1995, at http://europa.eu.int/eur-lex/en/lif/dat/1995/en_395L0046.html

FACFI (1976) 'The Criminal Use of False Identification', U.S. Federal Advisory Committee on False Identification, Washington, D.C., 1976

Flaherty D.H. (1972) 'Privacy in Colonial New England' Charlottesville VA, 1972

Flaherty D.H. (1979) 'Privacy and Government Data Banks: An International Perspective' Mansell, 1979

Flaherty D.H. (1984) 'Privacy and Data Protection: An International Bibliography' Mansell, 1984

Flaherty D.H. (1989) 'Protecting Privacy in Surveillance Societies' Uni. of North Carolina Press, 1989

Foucault M. (1977) 'Discipline and Punish: The Birth of the Prison' Peregrine, London, 1975, trans. 1977

Gandy O.H. (1993) 'The Panoptic Sort: Critical Studies in Communication and in the Cultural Industries' Westview, Boulder CO, 1993

GILC (1998) 'Privacy And Human Rights: An International Survey of Privacy Laws and Practice' Global Internet Liberty Campaign, Washington DC, September 1998, at http://www.gilc.org/privacy/survey/

HEW (1973) 'Records, Computers and the Rights of Citizens' U.S. Dept. of Health, Education and Welfare, Secretary's Advisory Committee on Automated Personal Data Systems, MIT Press, Cambridge, Mass., 1973

Hondius F. (1975) 'Emerging Data Protection in Europe' North Holland, 1975

HREOC (1995a) 'Community Attitudes to Privacy', Information Paper No. 3, Human Rights Australia - Privacy Commissioner, Sydney (August 1995)

Hughes G. (1991) 'Data Protection Law in Australia', Law Book Company, 1991

ICCPR (1976) 'International Covenant on Civil and Political Rights' United Nations, 1976, at http://www.privacy.org/pi/intl_orgs/un/international_covenant_civil_political_rights.txt

IPCR (1995) 'Privacy-Enhancing Technologies: The Path to Anonymity' Information and Privacy Commissioner (Ontario, Canada) and Registratiekamer (The Netherlands), 2 vols., August 1995, at http://www.ipc.on.ca/web%5Fsite.eng/matters/sum%5Fpap/papers/anon%2De.htm

Kim J. (1997) 'Digitized Personal Information and the Crisis of Privacy: The Problems of Electronic National Identification Card Project and Land Registry Project in South Korea', at http://kpd.sing-kr.org/idcard/joohoan2.html

Larsen E. (1992) 'The Naked Consumer: How Our Private Lives Become Public Commodities' Henry Holt, New York, 1992

Laudon K.C. (1986) 'Dossier Society: Value Choices in the Design of National Information Systems' Columbia U.P., 1986

Laudon K.C. (1993) 'Markets and Privacy' Proc. Int'l Conf. Inf. Sys., Orlando FL, Ass. for Computing Machinery, New York, 1993, pp.65-75

Lindop (1978) 'Report of the Committee on Data Protection' Cmnd 7341, HMSO, London (December 1978)

Long E.V. (1967) 'The Intruders' Praeger, New York, 1967

Madsen W. (1992) 'Handbook of Personal Data Protection' Macmillan, London, 1992

Miller A.R. (1969) 'Computers and Privacy' Michigan L. Rev. 67 (1969) 1162-1246

Miller A.R. (1972) 'The Assault on Privacy' Mentor, 1972 (orig. published University of Michigan Press, 1971)

Morison W.L. (1973) 'Report on the Law of Privacy' Govt. Printer, Sydney 1973

NPFHI (1998) 'National Principles for the Fair Handling of Personal Information', Office of the Australian Privacy Commissioner, February 1998, January 1999, at http://www.privacy.gov.au/private/index.html#4.1

NSWPC (1977) 'Guidelines for the Operation of Personal Data Systems' New South Wales Privacy Committee, Sydney, 1977

NSWPC (1995) 'Smart Cards: Big Brother's Little Helpers', The Privacy Committee of New South Wales, No.66, August 1995, at http://www.austlii.edu.au/au/other/privacy/smart/

NZ (1993) Privacy Act 1993 (NZ) at http://www.knowledge-basket.co.nz/privacy/legislation/legislation.html

NZPC (1998a) 'Necessary and Desirable: Privacy Act 1993 Review', available for $NZ130 plus p.&p. from http://www.mcgovern.co.nz/orderrev.html

NZPC (1998b) 'Necessary and Desirable: Privacy Act 1993 Review: Highlights ...' Office of the Privacy Commissioner, Auckland N.Z., at http://privacy.org.nz/news4.html

OECD (1980) 'Guidelines on the Protection of Privacy and Transborder Flows of Personal Data', Organisation for Economic Cooperation and Development, Paris, 1980, at http://www.oecd.org/dsti/sti/it/secur/prod/PRIV-en.HTM, accessed 3 April 1998

OECD (1998) 'Implementing the OECD 'Privacy Guidelines' in the Electronic Environment: Focus on the Internet', Committee for Information, Computer and Communications Policy, Organisation for Economic Cooperation and Development, Paris, May 1998, at http://www.oecd.org/dsti/sti/it/secur/news/

OTA (1981) 'Computer-Based National Information Systems: Technology and Public Policy Issues' Office of Technology Assessment, Congress of the United States, Washington DC (September 1981)

OTA (1985) 'Electronic Surveillance and Civil Liberties' OTA-CIT-293, U.S. Govt Printing Office, Washington DC, October 1985

OTA (1986) 'Federal Government Information Technology: Electronic Record Systems and Individual Privacy' OTA-CIT-296, U.S. Govt Printing Office, Washington DC, June 1986

OTA (1990) 'Genetic Witness: Forensic Uses of DNA Tests' Office of Technology Assessment, Congress of the United States, OTA-BA-438, Washington DC (July 1990)

Packard V. (1957) 'The Hidden Persuaders' Penguin, London, 1957

Packard V. (1964) 'The Naked Society' McKay, New York, 1964

PCA (1998) 'Minding our own business: Privacy protocol for Commonwealth agencies in the Northern Territory handling personal information of Aboriginal and Torres Strait Islander people' Privacy Commissioner of Australia, March 1998

PCC (1999a) 'Social Insurance Numbers (SIN)', Privacy Commissioner of Canada, at http://www.privcom.gc.ca/02_05_d_02_e.htm

PCC (1999b) 'Audit and Privacy Issues - Policy Regarding SINs in Canada', Privacy Commissioner of Canada, at http://www.privcom.gc.ca/02_05_a_990512_2_e.htm

PI (1997) 'National ID Cards', Privacy International, at http://www.privacy.org/pi/issues/idcard/

PPSC (1977) 'Personal Privacy in an Information Society' Privacy Protection Study Commission, U.S. Govt. Printing Office, July 1977

Privacy International (1996) 'Privacy International's FAQ on Identity Cards', at http://www.privacy.org/pi/activities/idcard/

Rosenberg J.M. (1969) 'The Death of Privacy' Random House, 1969

Roszak T. (1986) 'The Cult of Information' Pantheon 1986

Rotenberg M. (1999) 'The Privacy Law Sourcebook 1999' Electronic Privacy Information Center, 1998, 1999, from http://www.epic.org/pls/

Rule J.B. (1974) 'Private Lives and Public Surveillance: Social Control in the Computer Age' Schocken Books, 1974

Rule J.B., McAdam D., Stearns L. & Uglow D. (1980) 'The Politics of Privacy' New American Library 1980

Rule J.B., McAdam D., Stearns L. & Uglow D. (1983) 'Documentary Identification and Mass Surveillance in the U.S.' Social Problems 31:222-234 December, 1983

Russell, B. (1949) 'Authority and the Individual' George Allen and Unwin, 1949

Schoeman F. D. Ed. (1984) 'Philosophical Dimensions of Privacy: An Anthology' Cambridge University Press, 1984

Seipp D.J. (1981) 'Note: The Right to Privacy in Nineteenth Century America' 94 Harvard L. Rev. (1981) 1892

Sieghart P. (1976) 'Privacy and Computers' Latimer, 1976

Smith R.E. (ed.) (1974-) Privacy Journal, Providence RI, monthly since November 1974

Smith R.E. (1997) 'Compilation of State and Federal Privacy Laws', Privacy Journal, Providence RI, editions in 1975, 1976, 1978, 1981, 1984, 1988, 1992, 1997

Sprenger P. (1999) 'Sun on Privacy: 'Get Over It' ', Wired News, 26 January 1999, at http://www.wired.com/news/politics/0,1283,17538,00.html

Stigler G.J. (1980) 'An introduction to privacy in economics and politics' Journal of Legal Studies 9(4), 1980, 623-644

Stone, M.G. (1968) 'Computer Privacy' Anbar, 1968

Thompson, A.A. (1970) 'A Big Brother in Britain Today' Michael Joseph, 1970

Tomko G. (1998) 'Biometrics as a Privacy-Enhancing Technology: Friend or Foe of Privacy?' Proc. Conf. Privacy Laws & Business, 9th Privacy Commissioners' / Data Protection Authorities Workshop, Santiago de Compostela, Spain, September 1998, at http://www.dss.state.ct.us/digital/tomko.htm

Tucker G. (1992) 'Information Privacy Law in Australia' Longman Cheshire, Melbourne, 1992

UDHR (1948) 'Universal Declaration of Human Rights' United Nations, 10 December 1948, at http://www3.itu.int/udhr/

UN (1990) 'Guidelines Concerning Computerized Personal Data Files', United Nations, at http://www.datenschutz-berlin.de/gesetze/internat/aen.htm

Warner, M., and Stone, M. (1970) 'The Data Bank Society: Organisations, Computers and Social Freedom' George Allen and Unwin, 1970

Warren S. & Brandeis L.D. (1890) 'The Right to Privacy' 4 Harvard Law Review (1890) 193-220, at http://athena.louisville.edu/library/law/brandeis/privacy.html

Weizenbaum J. (1976) 'Computer Power and Human Reason' W.H. Freeman, San Francisco, 1976

Wessell, M.R. (1974) 'Freedom's Edge: The Computer Threat to Society' Addison-Wesley, Reading, Mass., 1974

Westin A.F. (1967) 'Privacy and Freedom' Atheneum 1967

Westin, A.F., Ed. (1971) 'Information Technology in a Democracy', Harvard University Press, Cambridge, Mass., 1971

Westin A.F. & Baker M.A. (1974) 'Databanks in a Free Society: Computers, Record-Keeping and Privacy' Quadrangle 1974

Whitaker R. (1999) 'The End of Privacy: How Total Surveillance Is Becoming a Reality', New Press, New York, 1999

The Winds (1997) 'The future has arrived' (June 1997), at http://www.thewinds.org/archive/government/idcard6-97.html

Younger K. (1972) 'Report, Committee on Privacy' U.K. Cmnd 5012, London, 1972

Zamyatin E. (1922) 'We' Penguin, 1922, 1980


References to the Author's Own Works

This paper is a further development on close to 30 years of research undertaken by the author in this area. Many of the publications listed below contain additional references to the wider literature.

Clarke R. (1987a) 'Just Another Piece of Plastic for Your Wallet: The Australia Card' Prometheus 5,1 June 1987 Republished in Computers & Society 18,1 (January 1988), with an Addendum in Computers & Society 18,3 (July 1988). At http://www.rogerclarke.com/DV/OzCard.html

Clarke R. (1987b) 'The OECD Data Protection Guidelines: A Template for Evaluating Information Privacy Law and Proposals for Information Privacy Law' Working Paper (25pp.) (October 1987). At http://www.rogerclarke.com/DV/PaperOECD.html

Clarke R. (1988) 'Information Technology and Dataveillance', Commun. ACM 31,5 (May 1988). Republished in C. Dunlop and R. Kling (Eds.), 'Controversies in Computing', Academic Press, 1991, at http://www.rogerclarke.com/DV/CACM88.html

Clarke R. (1989) 'The Privacy Act 1988 as an Implementation of the OECD Data Protection Guidelines', Working Paper, June 1989, at http://www.rogerclarke.com/DV/PActOECD.html

Clarke R. (1992a) 'The Resistible Rise of the National Personal Data System' Software Law Journal 5,1 (January 1992), at http://www.rogerclarke.com/DV/SLJ.html

Clarke R. (1992b) 'Case Study Cardomat/Migros: An Open EFT/POS System' Austral. Comp. J. 24,1 (February 1992), at http://www.rogerclarke.com/EC/Migros.html

Clarke R. (1992c) 'Extra-Organisational Systems: A Challenge to the Software Engineering Paradigm' Proc. IFIP World Congress, Madrid (September 1992), at http://www.rogerclarke.com/SOS/PaperExtraOrgSys.html

Clarke R. (1993a) 'Why the Public Is Scared of the Public Sector', IIR Conference paper, February 1993, at http://www.rogerclarke.com/DV/PaperScared.html

Clarke R. (1993b) 'Profiling: A Hidden Challenge to the Regulation of Data Surveillance', Journal of Law and Information Science 4,2 (December 1993), at http://www.rogerclarke.com/DV/PaperProfiling.html. A shorter version was published as 'Profiling and Its Privacy Implications' Australasian Privacy Law & Policy Reporter 1,6 (November 1994), at http://www.rogerclarke.com/DV/AbstractProfiling.html

Clarke R.A. (1994a) 'The Digital Persona and Its Application to Data Surveillance' The Information Society 10,2 (June 1994), at http://www.rogerclarke.com/DV/DigPersona.html

Clarke R. (1994b) 'Information Technology: Weapon of Authoritarianism or Tool of Democracy?' Proc. World Congress, Int'l Fed. of Info. Processing, Hamburg, September 1994. At http://www.rogerclarke.com/DV/PaperAuthism.html

Clarke R. (1994c) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Information Technology & People 7,4 (December 1994) 6-37, at http://www.rogerclarke.com/DV/HumanID.html

Clarke R. (1994d) 'Dataveillance by Governments: The Technique of Computer Matching' Information Technology & People 7,2 (December 1994). Abstract at http://www.rogerclarke.com/DV/AbstractMatchIntro.html

Clarke R. (1995a) 'Computer Matching by Government Agencies: The Failure of Cost/Benefit Analysis as a Control Mechanism' Informatization and the Public Sector (March 1995). At http://www.rogerclarke.com/DV/MatchCBA.html

Clarke R. (1995b) 'A Normative Regulatory Framework for Computer Matching' Journal of Computer and Information Law XIII,4 (Summer 1995) 585-633, at http://www.rogerclarke.com/DV/MatchFrame.html

Clarke R. (1995c) 'When Do They Need to Know 'Whodunnit?': The Justification for Transaction Identification; The Scope for Transaction Anonymity and Pseudonymity' Proc. Conf. Computers, Freedom & Privacy, San Francisco, 31 March 1995. At http://www.rogerclarke.com/DV/PaperCFP95.html. Revised version published as 'Transaction Anonymity and Pseudonymity' Privacy Law & Policy Reporter 2, 5 (June/July 1995) 88-90

Clarke R. (1995d) 'Clients First or Clients Last? The Commonwealth Government's IT Review' Privacy Law & Policy Reporter 2, 4 (April / May 1995). At http://www.rogerclarke.com/DV/CFCL.html

Clarke R. (1995e) 'Trails in the Sand', at http://www.rogerclarke.com/DV/Trails.html

Clarke R. (1996a) 'Smart move by the smart card industry: The Smart Card Industry's Code of Conduct' Privacy Law & Policy Reporter 2, 10 (January 1996) 189-191, 195. At http://www.rogerclarke.com/DV/SMSC.html

Clarke R. (1996b) 'Privacy and Dataveillance, and Organisational Strategy', EDPAC Conference Paper (May 1996), at http://www.rogerclarke.com/DV/PStrat.html

Clarke R. (1996c) 'Data Transmission Security, or Cryptography in Plain Text' Privacy Law & Policy Reporter 3, 2 (May 1996), pp. 24-27, at http://www.rogerclarke.com/II/CryptoSecy.html

Clarke R. (1996d) 'Privacy Issues in Smart Card Applications in the Retail Financial Sector', in 'Smart Cards and the Future of Your Money', Australian Commission for the Future, June 1996, pp.157-184. At http://www.rogerclarke.com/DV/ACFF.html

Clarke R. (1996e) 'The Information Infrastructure is a Super Eye-Way: Book Review of Simon Davies' 'Monitor'' Privacy Law & Policy Reporter 3, 5 (August 1996), at http://www.rogerclarke.com/DV/Monitor.html

Clarke R. (1996f) 'Identification, Anonymity and Pseudonymity in Consumer Transactions: A Vital Systems Design and Public Policy Issue', Conference on 'Smart Cards: The Issues', Sydney, 18 October 1996, at http://www.rogerclarke.com/DV/AnonPsPol.html

Clarke R. (1997a) 'What Do People Really Think? MasterCard's Survey of the Australian Public's Attitudes to Privacy', Privacy Law & Policy Report 3,9 (January 1997), at http://www.rogerclarke.com/DV/MCardSurvey.html

Clarke R. (1997b) 'Flaws in the Glass; Gashes in the Fabric: Deficiencies in the Australian Privacy-Protective Regime', Invited Address to Symposium on 'The New Privacy Laws', Sydney, 19 February 1997, at http://www.rogerclarke.com/DV/Flaws.html

Clarke R. (1997c) 'Exemptions from General Principles Versus Balanced Implementation of Universal Principles' Working Paper, February 1997, at http://www.rogerclarke.com/DV/Except.html

Clarke R. (1997c) 'Smart Cards in Banking and Finance' The Australian Banker 111,2 (April 1997), at http://www.rogerclarke.com/EC/SCBF.html

Clarke R. (1997d) 'Privacy and 'Public Registers'', Proc. IIR Conference on Data Protection and Privacy, Sydney, 12-13 May 1997, at http://www.rogerclarke.com/DV/PublicRegisters.html

Clarke R. (1997e) 'Public Interests on the Electronic Frontier', Invited Address to IT Security '97, 14 & 15 August 1997, Rydges Canberra (August 1997). Republished in Computers & Law No. 35 (April 1998) pp.15-20, at http://www.rogerclarke.com/II/IIRSecy97.html

Clarke R. (1997f) 'Chip-Based ID: Promise and Peril', Proc. International Conference on Privacy, Montreal, 23-26 September 1997, at http://www.rogerclarke.com/DV/IDCards97.html

Clarke R. (1998a) 'Direct Marketing and Privacy', Proc. AIC Conf. on the Direct Distribution of Financial Services, Sydney, 24 February 1998, at http://www.rogerclarke.com/DV/DirectMkting.html

Clarke R. (1998b) 'Privacy Impact Assessments', Working Paper, February 1998, at http://www.rogerclarke.com/DV/PIA.html

Clarke R. (1998b) 'Serious Flaws in the National Privacy Principles', Privacy Law & Policy Reporter 4, 9 (March 1998), at http://www.rogerclarke.com/DV/NPPFlaws.html

Clarke R. (1998c) 'Platform for Privacy Preferences: An Overview' (April 1998), Privacy Law & Policy Reporter 5, 2 (July 1998) 35-39, at http://www.rogerclarke.com/DV/P3POview.html

Clarke R. (1998d) 'Platform for Privacy Preferences: A Critique' (April 1998), Privacy Law & Policy Reporter 5, 3 (August 1998) 46-48, at http://www.rogerclarke.com/DV/P3PCrit.html

Clarke R. (1998e) 'Public Key Infrastructure: Position Statement', May 1998, at http://www.rogerclarke.com/DV/PKIPosn.html

Clarke R. (1998f) 'Information Privacy On the Internet: Cyberspace Invades Personal Space' Telecommunication Journal of Australia 48, 2 (May/June 1998), at http://www.rogerclarke.com/DV/IPrivacy.html

Clarke R. (1998g) 'A History of Privacy in Australia', December 1998, at http://www.rogerclarke.com/DV/OzHistory.html

Clarke R. (1998h) 'A History of Privacy in Australia: Context', December 1998, at http://www.rogerclarke.com/DV/OzHC.html

Clarke R. (1999a) 'Internet Privacy Concerns Confirm the Case for Intervention', Communications of the ACM, 42, 2 (February 1999) 60-67, at http://www.rogerclarke.com/DV/CACM99.html

Clarke R. (1999b) 'The Legal Context of Privacy-Enhancing and Privacy-Sympathetic Technologies', presentation at AT&T Research Labs, Florham Park NJ, 5 April 1999, at http://www.rogerclarke.com/DV/Florham.html

Clarke R. (1999c) 'Electronic Services Delivery: From Brochure-Ware to Entry Points'. Proc. 12th International Bled EC Conf., Slovenia, June 1999, at http://www.rogerclarke.com/EC/ESD.html

Clarke R. (1999d) 'The Willingness of Net-Consumers to Pay: A Lack-of-Progress Report', Proc. 12th International Bled EC Conf., Slovenia, June 1999, at http://www.rogerclarke.com/EC/WillPay.html

Clarke R. (1999e) 'Anonymous, Pseudonymous and Identified Transactions: The Spectrum of Choice', Proc. IFIP User Identification & Privacy Protection Conference, Stockholm, June 1999, at http://www.rogerclarke.com/DV/UIPP99.html

Clarke R. (1999f) 'Person-Location and Person-Tracking: Technologies, Risks and Policy Implications' Proc. 21st International Conf. Privacy and Personal Data Protection, Hong Kong, September 1999, at http://www.rogerclarke.com/DV/PLT.html

Clarke R. (2000) 'Privacy Laws: Resources', at http://www.rogerclarke.com/DV/PrivacyLaws.html

Greenleaf G.W. & Clarke R. (1997) 'Privacy Implications of Digital Signatures', IBC Conference on Digital Signatures, Sydney, 12 March 1997, at http://www.rogerclarke.com/DV/DigSig.html

Smith A. & Clarke R. (1999) 'Identification, Authentication and Anonymity in a Legal Context', Proc. IFIP User Identification & Privacy Protection Conference, Stockholm, June 1999, at http://www.rogerclarke.com/DV/AnonLegal.html


