'Profiling' and Its Privacy Implications

Roger Clarke
Department of Commerce

Version of 25 August 1994

Published in the Australasian Privacy Law and Policy Reporter

© Australian National University, 1994

Data surveillance, usefully abbreviated to dataveillance, is the systematic use of personal data systems in the investigation or monitoring of the actions or communications of people. It is supplanting more traditional forms of surveillance because it is cheap and effective.

Personal dataveillance involves monitoring an identified individual. Techniques include transaction-triggered screening, front-end verification, front-end audit and cross-system enforcement. Mass dataveillance, on the other hand, monitors groups of people, in order to generate suspicion about particular members of the population. Mass dataveillance techniques include general use of the above techniques, without any transaction to trigger them, plus additional tools.

One such tool is profiling. Profiling is little-understood and ill-documented, but increasingly used. During recent years, it has become increasingly possible to undertake research into some forms of dataveillance, such as data matching. Profiling, on the other hand, continues to be undertaken largely undercover. The technique is highly privacy-invasive, and its practice accordingly needs to be subjected to the harsh light of day.

Profiling is a technique comprising two steps: first, a profile of the characteristics of a particular class of person is constructed; and second, data holdings are searched for individuals whose characteristics fit that profile.

A profile may be constructed on the basis of some model of the class of people, from experience of them, or from empirical evidence about them. It enables an organisation to identify 'suspects' or 'prospects' within a large population.
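The two steps can be sketched in code. The following is a purely illustrative Python example: the characteristics, thresholds and records are all invented, and real profiling systems are of course far more elaborate.

```python
# Illustrative sketch of profiling: (1) construct a profile from known
# cases, (2) search a population for individuals who fit it.
# All fields and figures are hypothetical.

# Step 1: construct a profile -- here, from empirical evidence (averages
# over known cases of the behaviour being targeted).
known_cases = [
    {"deductions_ratio": 0.42, "cash_income": True},
    {"deductions_ratio": 0.38, "cash_income": True},
    {"deductions_ratio": 0.45, "cash_income": False},
]
profile = {
    "min_deductions_ratio": sum(c["deductions_ratio"] for c in known_cases)
                            / len(known_cases),
    "cash_income_common": sum(c["cash_income"] for c in known_cases)
                          / len(known_cases) > 0.5,
}

# Step 2: search a large population for 'suspects' whose characteristics
# fit the profile.
population = [
    {"id": 1, "deductions_ratio": 0.50, "cash_income": True},
    {"id": 2, "deductions_ratio": 0.10, "cash_income": False},
    {"id": 3, "deductions_ratio": 0.44, "cash_income": True},
]

def matches(person, profile):
    """A person 'fits' if they meet each profiled characteristic."""
    if person["deductions_ratio"] < profile["min_deductions_ratio"]:
        return False
    if profile["cash_income_common"] and not person["cash_income"]:
        return False
    return True

suspects = [p["id"] for p in population if matches(p, profile)]
print(suspects)  # IDs flagged for further attention
```

Note that the individuals flagged have done nothing to trigger attention; they merely resemble, on recorded data, people who have.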

Government agencies apply profiling to a variety of purposes. Taxation agencies are understood to use it to select people and their returns for audit. It is used by various agencies in their searches for drug dealers, likely violent offenders, arsonists, rapists, child molesters, and sexually exploited children. It is reasonably presumed that it is used by customs and immigration authorities to assess people and cargo arriving at entry points to Australia and other countries.

Like any technique, it can be used for purposes which are 'good' or 'evil', and there is considerable scope for argument as to where the balance-point lies between its inherent privacy-invasiveness and other social and economic interests. Consider, for example, the following (not entirely hypothetical) target groups:

In the private sector, companies are using profiling for purposes related to their staff, but especially to their customers. There has been a shift in marketing budgets away from advertising in the mass media towards direct marketing, 'individualised mass marketing', and 'micro-marketing'. This has brought with it a desire to gather sufficiently detailed information on people's buying habits and personal preferences that messages can be projected to individuals which are precisely attuned to their personal interests. The use of these techniques varies from the crass (e.g. the despatch of differently worded letters to electors depending on their known views on an issue), to the fair, effective and efficient.

The benefits of marketing applications can be summarised as:

The 'downside' of profiling, however, is that it is put to some uses which society would consider inappropriate and oppressive if they were publicly known and debated; and to others which are in principle acceptable to society, but which are undertaken in ways that are unfair, insensitive or discriminatory. At some point, selectivity in advertising crosses a boundary to become consumer manipulation. In addition, profiling leads companies to ignore certain types of people, and thereby limit their access to information about goods and services.

Government use of profiling involves a wide range of dangers to individuals and to society as a whole. It may be based entirely on data which the organisation already holds, but more commonly it draws on the data-holdings of multiple organisations using the facilitative techniques of data concentration and/or data matching. It therefore applies existing data to secondary purposes, and stimulates the collection of additional data for speculative purposes. Privacy law has arisen in large part to place controls over such proliferation and multiple usage.
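The facilitative role of data matching can be shown in miniature. The sketch below, with entirely hypothetical agencies, identifiers and records, joins two organisations' holdings on a shared identifier, so that data collected for one purpose is applied to another:

```python
# Illustrative data matching: two organisations' holdings, keyed on a
# shared identifier, are combined for a secondary purpose.
# All records are hypothetical.

tax_records = {
    "A123": {"declared_income": 30_000},
    "B456": {"declared_income": 85_000},
}
benefits_records = {
    "A123": {"benefit_claimed": True},
    "C789": {"benefit_claimed": True},
}

# Identifiers appearing in both systems can now be assessed against a
# combined picture that neither organisation held on its own.
matched = set(tax_records) & set(benefits_records)
flagged = [
    ident for ident in sorted(matched)
    if benefits_records[ident]["benefit_claimed"]
    and tax_records[ident]["declared_income"] > 25_000
]
print(flagged)  # identifiers selected for investigation
```

Neither file alone supports the inference drawn; it is the combination, unanticipated when either file was created, that generates the suspicion.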

Another issue is that profiling is used proactively, as a predictive tool, generating suspicion where there previously was none. There is a real fear that the onus of proof is in the process of being reversed. It has traditionally been incumbent on the accuser to prove an individual's guilt, but there are now several key areas in which that tradition has already been eroded, including taxation administration and the processing of people at border-crossings. Precisely these agencies are understood to be among the leaders in the application of profiling.

At its most extreme, profiling, and the attitudes associated with it, may result in the classical, Kafkaesque situation whereby the catalyst for an interview, search or even trial, is not apparent to the individual. The 'accuser' is disembodied: an abstract algorithm established by persons remote in time and space, and applied by a computer. The authors of the European Commission's Draft Directive were sufficiently concerned about this aspect of dataveillance to propose a right for people "not to be subjected to an administrative or private decision adversely affecting him which is based solely on automatic processing defining a personality profile".

Also of serious concern is that profiling is generally undertaken surreptitiously, outside the purview, and largely without the knowledge, of the public, its Parliamentary representatives or any statutory watchdog. Organisations remain free to develop and apply the technique as they see fit, without mechanisms to ensure that the interests of all stakeholders are appropriately reflected. This breeds a climate of suspicion among those members of the public who are aware of, or suspect, the activities. The organisations involved tend not to disclose the methods they are using, at least in part for fear of being subjected to controls, or even of having their practices banned.

This factor compounds the concern about due process: in order to maintain the smoke-haze surrounding their practices, organisations using profiling tend not to bring the real evidence forward into court, but rather to seek out and use other information which they are able to gather during the course of the investigation. Defendants are therefore placed in the position of having to defend against evidence, and even charges, which seem beside the point, rather than addressing the real issue.

Associated with these concerns is the extent to which judgmental valuations are embedded in applications of profiling. Cultural, racial and gender biases, for example, are inevitable, because of the facts of the matter (e.g. arrest rates for Aboriginal people in Australia, and for Black and Latino people in the United States, are higher than those for white people); the way in which data is collected, organised and presented (e.g. more data is collected about people of lower socio-economic origins, because they are more commonly applicants for benefits); and the way in which characteristics are inferred (i.e. the people who prepare the profiles bring with them their own theories about which kinds of people are prone to behave in the manner being targeted).
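How skew in data collection feeds through into a profile can be demonstrated with toy numbers (all of them invented for illustration, not drawn from any real dataset):

```python
# Toy illustration of embedded bias: two groups exhibit the targeted
# behaviour at the SAME underlying rate, but one group's behaviour is
# recorded more often (e.g. because more data is collected about it).
# A profile built from the recorded data then treats that group as more
# 'prone' to the behaviour. All numbers are invented.

true_rate = {"group_x": 0.05, "group_y": 0.05}       # identical behaviour
recording_rate = {"group_x": 0.8, "group_y": 0.4}    # unequal data capture

# The 'observed' rate visible to the profile-builder:
observed = {g: true_rate[g] * recording_rate[g] for g in true_rate}
print(observed)  # group_x appears twice as 'prone', despite equal behaviour
```

The bias originates in the data-gathering process, yet surfaces in the profile as an apparent fact about the group.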

A variety of factors might act to prevent unreasonable uses of profiling, and constrain unreasonable practices in relation to such profiling as is done. These include self-restraint by the organisation or by its employed professionals, policy statements or codes of conduct, the exercise of countervailing power by other organisations (e.g. competitors) or by the public, inadequate technology or data, incompetent staff, inter-organisational jealousies, or insufficient cost-justification for the use of the technique.

Many of these constraints, however, are entirely dependent upon the practices and their implications becoming common knowledge: they are non-operative if the organisation is successful in suppressing the fact of its use of the technique. Moreover, government agencies are generally much less responsive to pressure from public interest groups through the media. Such intrinsic or natural controls have been unable to bring about public visibility of profiling practices, and are hardly likely to ensure the appropriateness of its use.

This author's research has to date unearthed no regulatory measures dealing explicitly with profiling. Moreover, long-standing common law protections such as the laws of confidence and defamation, and privacy and data protection measures, were conceived without any understanding of the technique. Such external control measures as do exist are therefore generic, or accidental and incidental.

One provision which exists in some form in most statutes is that referred to by the OECD Guidelines (1980) as the Openness Principle, i.e. "there should be a general policy of openness about developments, practices and policies with respect to personal data". Unfortunately the implementation of the Principle in most countries falls far short of that aspiration, particularly because of the wording chosen by Parliamentary Draftsmen to implement it, and the manifold exemptions and exceptions provided. The Australian Privacy Act 1988, for example, requires only that "a record-keeper ... shall ... take such steps as are, in the circumstances, reasonable to enable any person to ascertain ... the nature of [personal information held] ... [and] the main purposes for which that information is used" (Principle 5). Unsurprisingly, disclosure of profiling activities is rare, even by organisations subject to that statute, and even in response to direct requests for information.

Profiling is an important application of information technology, but also one which embodies considerable dangers to individuals and society. On the basis of the limited evidence publicly available, the conclusion is inescapable that the law in most countries provides only very limited means of constraining, or even ensuring public knowledge about, the use of profiling. Profiling is largely being conducted without public knowledge, without justification, and without appropriate safeguards.

Government agencies and corporations are taking advantage of the lack of regulation to apply profiling as they see fit. Intrinsic controls over dataveillance techniques have been repeatedly shown to be utterly inadequate. Profiling therefore demands far more attention than has to date been given to it by researchers, by executives in both the public and private sector, by regulatory agencies, and by legislators.

Further detail on the topic of profiling may be found in Clarke R. 'Profiling: A Hidden Challenge to the Regulation of Dataveillance', J. L. & Inf. Sc. 4, 2 (December 1993). For a general review, see Clarke R. 'Information Technology and Dataveillance', Commun. ACM 31, 5 (May 1988), republished in C. Dunlop and R. Kling (Eds.), 'Controversies in Computing', Academic Press, 1991.


Last Amended: 13 October 1995
