
Biometrics' Inadequacies and Threats, and the Need for Regulation

Roger Clarke

Principal, Xamax Consultancy Pty Ltd, Canberra

Visiting Fellow, Department of Computer Science, Australian National University

Revised Draft of 15 April 2002

© Xamax Consultancy Pty Ltd, 2002

This document is at http://www.rogerclarke.com/DV/BiomThreats.html


Abstract

Biometric technologies have strengths and weaknesses, and their effectiveness is greatly dependent on the environment in which they are applied. Marketers are guilty of a vast amount of over-selling of their products, and buyers need to be very sceptical about the claims that are being made.

Biometric technologies are also extraordinarily threatening to the freedoms of individuals, variously as employees, customers, citizens, welfare recipients, and persons-in-the-street. Yet the design of schemes that are substantially invasive of the privacy of the person, and of behavioural privacy, as well as of data privacy, is being undertaken by organisations in blithe ignorance of the concerns of the people they are intended to be inflicted upon.

Biometrics technologies are being implemented so badly, and in such a threatening manner, that they need to be banned, until and unless an appropriate and legally enforced regulatory regime is established.

Paradoxically, this might be the only means of saving an industry that has promised much for years and delivered very little. If the present practices continue, public revulsion will build up and explode, the mood will swing suddenly and substantially, and biometrics will be set back decades. By calling a halt, involving public interest advocates and representatives, and getting genuine controls into place before any further mis-fires are perpetrated, the industry might yet survive and prosper.


Contents

1. Introduction
2. Inadequacies of Biometrics
2.1 Diversity in the Context of Use
2.2 Quality Issues
2.3 Quality and Relevance in Particular Settings
2.4 Biometrics as Fashion Accessory
3. Impacts and Implications
3.1 Imposition and Inconvenience
3.2 Privacy Invasiveness
3.3 Masquerade
3.4 Permanent Identity-Theft
3.5 Access Denial and Identity Denial
3.6 DNA Profiling
3.7 Broader Social Impacts
4. Achieving Protections
4.1 Public Acceptability
4.2 Absence of Consultation, Participation and Evaluation
4.3 The Protections Required
4.4 The Feasibility of Compliant Schemes
4.5 A Moratorium on the Application of Biometrics
5. Conclusions
Resources

1. Introduction

Suppliers and interested consultancies are guilty of over-hyping biometrics to an even greater extent than is normal for technology. Extreme scepticism is needed when listening to or reading anything published about biometrics. This paper documents some of the dimensions of the hogwash being disseminated.

This paper assumes that the reader is aware of the kinds of biometric technologies. Introductory material can be found in Bowman (2000), in the early sections of Clarke (2001), and in other references provided in the Resources section of this paper.

In general, biometric schemes involve:

This paper refers primarily to physiographic features that are utilised by currently available biometric technologies. Many aspects are also applicable to DNA profiling, and to imposed physical identifiers such as embedded microchips.


2. Inadequacies of Biometrics

This first section of the paper consolidates information about the serious inadequacies that are being glossed over by the vast majority of the literature.


2.1 Diversity in the Context of Use

The settings in which biometrics technologies may be applied vary considerably. This section identifies some of the most important factors that affect schemes' quality, effectiveness, intrusiveness and justification. The paper will argue that the conception of biometrics technologies and products, and the design of applications, generally fail to reflect the need for diversity.

The applications of biometrics are of two distinct kinds:

- identification, in which a test-measure is compared against many reference measures, in order to establish who the person is; and
- identity authentication, in which a test-measure is compared against the reference measure associated with the identity that the person asserts.

Industry literature commonly uses the term 'verification' to refer to the latter application. That term is seriously misleading as to the reliability of the result, and should be avoided. The industry also uses the term 'authentication' without the qualifier 'identity'. This is also evidence of shoddy thinking, because there are many assertions other than of identity that are subject to authentication. See Clarke (2001).
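
To make the distinction concrete, the following sketch contrasts the two kinds of application. It is a minimal illustration in Python, assuming a scalar 'distance' metric and a simple dictionary of reference measures; the names and the metric are hypothetical, and are not drawn from any actual biometric product.

    def distance(test_measure, reference_measure):
        """Dissimilarity between two measures; 0.0 means identical."""
        return abs(test_measure - reference_measure)

    def authenticate_identity(test_measure, asserted_id, references, tolerance):
        """One-to-one: compare against the reference measure for the asserted identity."""
        return distance(test_measure, references[asserted_id]) <= tolerance

    def identify(test_measure, references, tolerance):
        """One-to-many: search all reference measures for close-enough matches."""
        return [person_id for person_id, ref in references.items()
                if distance(test_measure, ref) <= tolerance]

Note that the one-to-many search can return zero, one or several candidate identities, and that the one-to-one test establishes only that two measures are 'close enough', not that the assertion is true. In neither case is anything actually 'verified'.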

There is variability in the subject's knowledge of and consent for the acquisition of the reference measure(s). The circumstances include:

The concept of consent has been subjected to far too little examination. See Clarke (2002).

Another manner in which schemes differ is the information with which the reference measure is associated, and the degree of confidence in that association. Possibilities include:

The same differences arise in relation to the subject's knowledge of and consent for the acquisition of a test-measure. The circumstances include:

The practicability of capturing a reasonable-quality measure varies considerably. Examples of circumstances in which acquisition of the test-measure may be attempted include:

The foreground purposes of identification and identity authentication include:

There may be background or ancillary purposes of identification and identity authentication, which may include:

The degree of willingness of the subject is highly variable, and includes:

Any identification and identity authentication scheme, whether it includes biometrics or not, needs to be designed to reflect these factors. Because of the complexities involved, it is very easy to conceive of applications that appear on the surface to be justified, but which make substantially no contribution to their objectives.


2.2 Quality Issues

Biometrics technologies are subject to a range of errors. The first cluster of challenges arises in the context of the acquisition of reference measures. Reference measures may be inaccurate, or may be wrongly associated with an identifier or other data. Reasons for this include:

A second cluster of challenges results in the possibility that test-measures may be inaccurate, or may be wrongly associated with an identifier or other data. Reasons for this include:

Even where a test-measure is appropriately acquired, and compared against a reference measure that was appropriately acquired, and which had information associated with it that was appropriate, the result may be inaccurate, as a result of:

Various means are feasible for addressing each of these risks. For example, several reference measures might be taken, and either stored or used to compute some kind of average measure; and reference measures might be progressively adapted based on subsequent test-measures. Each such approach creates greater complexity, and new risks.
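
The following sketch illustrates the two approaches just mentioned, representing measures as numeric vectors purely for illustration; the function names are hypothetical. The comment on the adaptive variant notes one of the new risks it creates.

    from statistics import mean

    def enrol(samples):
        """Average several enrolment samples into a single reference measure."""
        return [mean(axis) for axis in zip(*samples)]

    def adapt(reference, accepted_test, weight=0.1):
        """Drift the stored reference toward each accepted test-measure.
        New risk: an imposter who is falsely accepted repeatedly can
        gradually 'train' the reference toward their own features."""
        return [(1 - weight) * r + weight * t
                for r, t in zip(reference, accepted_test)]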

Permanent and temporary changes to the measured features, and the acquisition processes for reference measures and especially for test-measures, are subject to at least some degree of manipulation by the subjects from whom the measures are being acquired. In some cases, the intent is to cause a test-measure not to match the reference measure, whereas in other cases the purpose is to cause a match to be achieved when it should be rejected.

Further features can be designed into the technology, the individual product, or the particular application, in order to counter that risk. Biometrics products commonly include countermeasures against at least the more obvious or easily perpetrated of these tricks, such as 'liveness testing', which is intended to preclude the use of a prosthetic, such as a latex overlay on a thumb, or a picture of an eye. Each such countermeasure adds complexity, and creates new scope for error, and for manipulation by subjects.

Most of the approaches adopted to address this vast array of error-sources are proprietary and unpublished; and if they are audited at all, the audit reports are seldom publicly available.

For all of the above reasons, there are inevitably differences between test-measures and reference measures. Hence biometrics technologies cannot seek precise equality, and must instead have a tolerance range within which the two are deemed to be sufficiently close. The tolerance range may be fixed, or may be able to be re-set by the operator, or by the system manager.

Sometimes 'false positives' or 'false acceptances' will occur, i.e. a match will be declared, or the assertion will be deemed to be authenticated, even though the person who presented was not who the system deemed them to be. At the opposite end of the spectrum, sometimes 'false negatives' or 'false rejections' occur, in that the right person is rejected. There is a wide variety of causes of these errors, including reference-measure inaccuracy, capture error and measuring-instrument error.

In general, the tighter the tolerances are set (to avoid false positives), the more false negatives will arise; and the looser the tolerances are set (to avoid false negatives), the more false positives occur. The tolerance is commonly set to reflect the interests of the scheme's primary sponsor, with little attention to the concerns of other stakeholders. A further indication of the sloppy thinking that biometrics proponents have allowed themselves is the use of a spurious statistic called the 'net error rate', formed by relating the false acceptance and false rejection rates.
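
The trade-off can be demonstrated in a few lines of Python. The scores below are invented for illustration; they are not measurements of any real product.

    def error_rates(genuine_scores, imposter_scores, tolerance):
        """False rejections: genuine attempts whose distance exceeds the tolerance.
        False acceptances: imposter attempts whose distance falls within it."""
        frr = sum(s > tolerance for s in genuine_scores) / len(genuine_scores)
        far = sum(s <= tolerance for s in imposter_scores) / len(imposter_scores)
        return far, frr

    genuine = [0.1, 0.2, 0.35, 0.4]    # distances scored by the right person
    imposter = [0.3, 0.5, 0.7, 0.9]    # distances scored by other people

    for tolerance in (0.25, 0.45):
        far, frr = error_rates(genuine, imposter, tolerance)
        print(tolerance, far, frr)     # tight: FAR 0.0, FRR 0.5; loose: FAR 0.25, FRR 0.0

Moving the tolerance in either direction merely exchanges one kind of error for the other; no setting eliminates both.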


2.3 Quality and Relevance in Particular Settings

Catching random acts by one-time perpetrators is a forlorn hope, because they simply will not register against a database of suspects. Think suicide bombers, whether wearing explosives in Palestine, or boarding a commercial flight in Seattle.

Catching 'persons suspected of being involved in serious crime' will also work only occasionally, because they have the commitment and the capacity to invest in subversion methods. Forget the 'most wanted list', especially of international terrorists.

In attempting to do so, such schemes will create suspicion about, impose embarrassment on, and at the very least inconvenience, significant numbers of people who are not among the targets that were used to justify the scheme in the first place.

In short, biometric schemes in public places are destined to catch some 'small fry'. In addition to persons charged with lesser crimes and misdemeanours, that probably means nearly everyone. That's because 'function creep' would ensure that airport checkpoints detected people with an expired visa, an outstanding invitation to a police interview, an airport parking infringement notice, and progressively parking infringement notices generally, toll avoidance notices and outstanding traffic fines.

The applications of facial recognition technology at the 2001 Super Bowl in Tampa and in nearby Ybor City FL have been unqualified failures. For example, "The [Ybor City] system has never correctly identified a single face in its database of suspects, let alone resulted in any arrests ... [and] the system made many false positives" (ACLU 2002c).

Nor should this have been a surprise to anyone. "A study by the government's National Institute of Standards and Technology (NIST) ... found false-negative rates for face-recognition verification of 43 percent using photos of subjects taken just 18 months earlier. And those photos were taken in perfect conditions, significant because facial recognition software is terrible at handling changes in lighting or camera angle or images with busy backgrounds. The NIST study also found that a change of 45 degrees in the camera angle rendered the software useless. The technology works best under tightly controlled conditions, when the subject is staring directly into the camera under bright lights - although another study by the Department of Defense found high error rates even in those ideal conditions. Grainy, dated video surveillance photographs of the type likely to be on file for suspected terrorists would be of very little use" (ACLU 2002a).

But excessive claims are not limited to just non-credible facial recognition technology. Even the products that are widely perceived to be the least unreliable, fingerprints and iris scans, are hyped out of all proportion. <Evidence needed here>

What is quite remarkable is the paucity of published, independent assessments of technologies, products and applications. The few commercial assessors demand very high fees for their publications, and governmental testing laboratories and government-funded university laboratories appear to publish very little. Perhaps they are embarrassed at the low quality of the things they test; or fearful that, should the poor quality of the technologies in use become publicly evident, they would undermine the credibility of the industry on which their very existence depends; or even concerned about being accused of being unpatriotic if they are perceived to doubt the wisdom of installing technology in order to fight the 'war on terrorism'.


2.4 Biometrics as Fashion Accessory

But hype works. Kneejerk politicians and regulators demand kneejerk reactions from corporations and government agencies in the public eye. Busy businesspeople need to be seen to be doing something, and will cheerfully install without evaluation if subsidies are available, or if investment is perceived to be a necessary cost in order to ensure survival (e.g. through approval of a licence renewal, or favourable press coverage to counter negative publicity).

Despite the considerable evidence of abject inadequacy, facial recognition technology has been reported as being installed at Boston MA, Providence RI, San Francisco CA and Fresno CA airports, and planned at Oakland CA, Pearson (Toronto) and Keflavik (Iceland). And this despite the fact that, had such systems been installed prior to 9/11, they would not have prevented the attacks, partly because of the inappropriate design, and partly because the individuals who perpetrated the acts had not been identified as part of the target-group.


3. Impacts and Implications

The second section of the paper outlines the vast array of ways in which biometric technologies impinge upon people.


3.1 Imposition and Inconvenience

Some biometric schemes involve considerable interruptions for the people upon whom they are imposed. There are circumstances in which the imposition and inconvenience might be accepted by the public as being justified by the risks involved. During a period when, and in places where, some reasonable degree of suspicion exists that terrorist acts might be perpetrated, commensurate levels of inconvenience may be tolerable. But this does not justify general application of biometrics, nor sustained application of biometrics in locations where the threat-level is perceived to have fallen considerably.

The impositions and inconveniences are of various kinds, some affecting everyone who is subject to the scheme, and others affecting only some people.

(1) The General Public

Schemes that depend upon control-points inevitably involve queuing, and hence delays and lost time. The time lost can be far more than the queuing time alone, because it may result in missed transport connections and a lengthy wait for the next available service.

(2) Outliers

Every form of biometrics makes assumptions about the physiography of the people upon whom it is to be imposed. Registration (i.e. the capture of one or more reference measures that are to be treated as authoritative, and against which future measures are to be compared) is generally challenging, but it is especially problematical for some people. The imposition and inconvenience are repeated on every occasion that a measure is sought.

Causes include genetic abnormality, illness and injury. Examples include, in the fingerprinting arena, people who lack a right thumb, or both thumbs, or all fingers; whose ridges are too faint to provide a reliable image; or whose prints have been rendered unreadable through excessive wear, chemical harm, or injury.

(3) Culture-Specific Sensitivities

For many people, special importance is associated with particular parts of the body, the appearance of the face, or the integrity of the body as a whole. A wide variety of particular concerns exist among people of particular ethnic, cultural, religious and philosophical backgrounds.

(4) Selectivity

In the U.K., facial recognition systems are subject to bias in relation to subject-selection: "camera operators have been found to focus disproportionately on people of colour; and the mostly male (and probably bored) operators frequently focus voyeuristically on women" (ACLU 2002c).

Even when some form of automated subject-selection is implemented, there may also be inherent biases towards some subjects.

(5) False Positives

Errors in the technology, the product design, the process design, and the operation of the scheme, all rebound on the people who are wrongly selected as being suspicious or incapable of complying with the dictates of the equipment and operatives. The person is subjected to the embarrassment of being treated as a suspect, and of being led away for interrogation.

The person is faced with the difficulties of convincing interrogators that the technology has failed again, and, in effect, of prosecuting their own innocence, in the absence of information and of informed representation.

For people who are perceived by the control-point staff as being 'different from us', biases and bigotries are likely to compound the suspicion. Problems arise from ethnic, linguistic and cultural differences, such as the inability to achieve clear voice-communications and to read body-signals, and the increased probability of mis-reading body-signals.

The person is likely to lose substantial additional time, partly because it may involve queueing and waiting for clearance from some specialist resource or supervisor, or feedback from some remote resource, but also because it is very likely to result in consequential time lost, e.g. as a result of a missed flight.


3.2 Privacy Invasiveness

Privacy is the interest that people have in sustaining a 'personal space', free from interference by other people and organisations (Clarke 1997). It has multiple dimensions, several of which are negatively affected by biometrics.

(1) Privacy of the Person

Privacy of the person is concerned with the integrity of the individual's body. Issues include compulsory immunisation, blood transfusion without consent, compulsory provision of samples of body fluids and body tissue, and compulsory sterilisation.

Biometric technologies don't just involve collection of information about the person, but rather information of the person, intrinsic to them. That alone makes the very idea of these technologies distasteful to people in many cultures, and of many religious persuasions.

In addition, each person has to submit to examination, in some cases in a manner that many people regard as demeaning. For example, the provision of a quality thumbprint involves one's forearm and hand being grasped by a specialist and rolled firmly and without hesitation across a piece of paper or a platen; and an iris-print or a retinal print requires the eye to be presented in a manner compliant with the engineering specifications of the supplier's machine.

Some biometric technologies involve physically invasive processes, such as the projection of electromagnetic radiation at and hence inevitably into the person's eyes. Some, such as those based on DNA, go so far as to require the person to provide a sample of body-fluids or body-tissue.

(2) Privacy of Personal Behaviour

Privacy of personal behaviour is concerned with the freedom of the individual to behave as they wish, subject to not harming or unduly infringing upon the interests of others.

The monitoring of people's movements and actions through the use of biometrics increases the transparency of individuals' behaviour to organisations. Those organisations are in a better position to anticipate actions that they would prefer to prevent. Moreover, an organisation that performs biometrics-aided monitoring is in a position to share personal data with other organisations, such as contracted suppliers and customers, 'business partners', and corporations and government agencies with which it 'enjoys a strategic relationship'.

The actual extent to which organisations gain yet more power over individuals is only part of the problem. Individuals self-censor when they perceive themselves to be being observed. This has the useful result that it acts as a deterrent to anti-social behaviour. But it also has a 'chilling effect' on social, political and economic behaviour, especially of an unconventional, non-conformist, potentially controversial, and innovative nature.

(3) Denial of Anonymity and Pseudonymity

An especial concern is that the application of biometric technologies generally undermines anonymity and pseudonymity. Until very recent times, the vast majority of actions and transactions undertaken by people were anonymous, or were identified only to the extent that an observer saw them and might remember them, but no records of the event were kept.

Corporations and government agencies have been working very hard to deny people the ability to keep their transactions anonymous. As a result of new forms of information technology, the cost of data capture has plummeted, and huge numbers of transactions are now recorded which would have been uneconomic to record in the past. These records carry enough information to identify the person who conducted them, and systems are designed so as to readily associate the data with that person.

Biometric technologies create new capabilities for the association of identity with transactions that have never been recorded before, such as passing through a door within a building, across an intersection, or into a public place or an entertainment facility. They provide a powerful weapon to corporations and governments, whereby yet more of the remnant anonymity of human action can be stripped away. See Clarke (1999).

It is feasible for biometrics to be applied in a manner that protects identity. See, for example, Bleumer (1998). To date, however, such approaches have attracted very little attention from researchers, technologists, manufacturers, designers, scheme sponsors or government policy agencies.

(4) Privacy of Personal Data

Privacy of personal data is concerned with the claim that individuals have that data about themselves should not be automatically available to other individuals and organisations, and that, even where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use.

Biometrics gives rise to three groups of issues. The first relates to the biometric itself, which may be collected, stored, used, and disclosed. It is highly sensitive data, and needs to be subject to substantial privacy protections. Contrary to the assumptions made by most designers, privacy involves much more than mere security protections, and includes constraints and in many cases outright prohibitions on all aspects of handling, including collection, storage, use and disclosure. See Clarke (1989).

The second aspect relates to data that is associated with the biometric. For example, the location at which a person was recognised might be recorded, together with the date and time. This is also highly sensitive data, and also demands authority for use, and substantial privacy protections. For an analysis of person-location and person-tracking matters, see Clarke (2001).

The third aspect is linkage between, on the one hand, the biometric and the data associated with it, and on the other hand additional data acquired from some other source. Some biometrics technology providers offer integration with third-party databases, suggesting the ability to gain access to personal data from such sources as driver licensing records and telephone directories.

(5) Cross-System Enforcement

Individuals come to attention when they are subjected to a biometric surveillance tool. This may be because they are identified as a result of their biometric matching a reference measure, or because they have declared their identity in order to have it authenticated. The question arises as to what use the scheme operator or others may make of the fact that the individual is known to be at a particular location.

Biometrics of individuals would generally be collected, in effect, as a condition of being in a particular location, for a particular purpose: for example, at the turnstiles of a sporting event, in order to identify 'known trouble-makers'; or at a security checkpoint in an airport terminal, in the hope of detecting 'suspected terrorists'.

The temptation exists, however, to intercept identified individuals for additional purposes. These may range from outstanding arrest warrants, through requests to assist law enforcement agencies, to outstanding traffic fines, outstanding child support payments, and outstanding parking fines.

(6) Function-Creep

A typical proposal for a biometric scheme in the wake of the 9/11 terrorist actions is for an existing control-point to be used, or a new one created, with people required to submit to a biometric measure, and with that measure compared against a database or databases drawn from some existing source(s). The sources (to date) contain reference measures for relatively small numbers of people. In addition, the quality of reference measures is in many cases very low, and many of the individuals are interested in evading detection. As a result, there are very, very few 'positives' or 'hits'.

It is less embarrassing for vendors, and for the agencies and corporations that have invested in the scheme, if the positives can be increased. There is also a sense of enhanced security if the assertion is not merely 'This person, who we haven't seen before, is not known to be a member of the class of nasty people we're looking for', but rather 'This person is Joe Bloggs, who is not known to be a member of the class of nasty people we're looking for'. (It is not entirely clear that the increase in security arising from that change would actually be particularly significant).

There is accordingly a temptation on the part of scheme operators to seek more data to add to the reference database. In some jurisdictions, government agencies are permitted to collect biometrics without a conviction, or even a charge; and there are increasing attempts to compulsorily or pseudo-voluntarily acquire biometrics from many categories of people only remotely associated with crime (such as visitors to prisons, and people in geographical and/or temporal proximity to the scene of a crime).

An approach attractive to scheme sponsors would be, of course, to use each control-point as a collection-point for new entries to the database. This might result in not only reference measures associated with some form of identifier (such as a passport number), but also a new data-trail of the occasions on which the person had passed that control-point, and perhaps also affiliated control-points.

As was explained many years ago, there is no natural barrier against function creep of this kind: each step can be more or less convincingly justified, and the race to a state of ubiquitous surveillance can run unimpeded. See Rule (1974), Rule et al. (1980).


3.3 Masquerade

Biometrics suffer from a fundamental security weakness. A person's physiographic features and behaviour are visible to the world, and hence a biometric is not a protectable secret. It can be acquired by any party that can force, cajole or trick the person into making the relevant part of their body, or the relevant behaviour, available to a measuring device.

Using this measure, it is feasible for an artefact to be constructed that can make it appear that a particular person is presenting to the measuring device. Such an artefact may be a synthesised physical thing, or in some cases a synthesised set of signals. Most biometrics technologies are such that an artefact does not have to be a close copy of the original in order to be convincing, but merely a close enough equivalent of the aspects that the technology relies upon. Hill (2001, pp.36-40) presents a generic masquerade method.

Moreover, a reference biometric is capable of being acquired from its storage location. This is an even greater security weakness, because it makes it much easier for a masquerade to be performed, by presenting a convincing-looking biometric that is very likely to pass the test.

The feasibility of the exploit varies depending on such factors as the kind of biometric, and the kind of storage. The technology to fabricate a convincing iris, based on the data captured and stored by an iris-reading device, would seem to be challenging, and may well not currently exist. On the other hand, if a biometric comprises measurements of some part of a person's body, such as the first knuckle of the right thumb, then technology is probably already available that can produce a synthetic equivalent of that body-part.

Moreover, some biometric techniques select only a small sub-set of the captured data, such as the number and orientation of ridges on a fingerprint, or the location and size of features in an iris. This reduces, but does not eliminate, the scope for masquerade. The risk is all the greater if the biometric is used in its raw form, or the compression is insufficiently 'lossy' and hence the compressed form can be used to generate an adequate masquerade, or the hashing algorithm is not one-way.
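
One partial safeguard is to store not the measure itself but a salted one-way digest of the selected feature sub-set, as in the sketch below. This relies on the strong assumption that the features can be quantised coarsely enough that repeat measures of the same person quantise identically; real schemes require more elaborate 'fuzzy' constructions, so this illustrates the one-way principle only, and all names are hypothetical.

    import hashlib
    import secrets

    def quantise(features, step=0.5):
        """Coarsen the features so that repeat measures quantise identically."""
        return tuple(round(f / step) for f in features)

    def enrol(features):
        """Store only a salted one-way digest; discard the measure itself."""
        salt = secrets.token_bytes(16)
        digest = hashlib.sha256(salt + repr(quantise(features)).encode()).digest()
        return salt, digest

    def verify(features, salt, digest):
        recomputed = hashlib.sha256(salt + repr(quantise(features)).encode()).digest()
        return recomputed == digest

A database holding only (salt, digest) pairs cannot readily be turned into a convincing artefact, whereas raw or merely-compressed measures can.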

A significant risk exists that an imposter could produce the means to trick devices into identifying or authenticating a person even when they are not present. Possible uses would be to gain access to buildings, software or data, to digitally sign messages and transactions, to capture the person's identity, to harm the person's reputation, or to 'frame' the person.

Any identification or identity authentication process that involves storage of a biometric is fraught with enormous risks. These will very likely rebound on the person, whether or not they harm the organisation that sponsors the scheme.


3.4 Permanent Identity-Theft

An act of masquerading as another person is a single event. If the imposter conducts a succession of masquerades, especially if they generate a data-trail that is associated with that identity, their behaviour amounts to the appropriation of the person's identity.

Note, however, that recent hysterical literature, not only from American media and technology suppliers, but even from U.S. government agencies such as the Federal Trade Commission (FTC) and the National Fraud Center (NFC), has used the term 'identity theft' very loosely, e.g. "About fifty percent of the identity theft victims that called our Hotline reported that a credit card was opened in their name or that unauthorized charges were placed on their existing credit card" (FTC 2002). See also the hilarious claim by the NFC that "nearly 200 of America's most rich and famous people ... have had their identities stolen" (Wilcox & Regan 2001). It is appropriate to limit the scope of the term to substantial and permanent appropriation, and to exclude individual masquerade events.

Cases of identity theft can have very serious consequences for the victims. Organisations cannot distinguish the acts and transactions of the two individuals using the one identity, and hence these are merged together. A typical outcome is that the person faces demands for payment from organisations they have never purchased anything from, and shortly afterwards can no longer gain access to loans.

Under these circumstances, the identity can become so tainted that the person has to abandon that identity and adopt a new one. That is challenging, because such an act is readily interpreted as an admission of guilt, and an attempt to avoid the consequences of actions that are presumed to be actions of that person, rather than of the imposter.

Biometrics adds a frightening new dimension to identity theft. The purveyors of the technology convey the message that it is foolproof, in order to keep making sales. The organisations that sponsor schemes want to believe that it is foolproof, in order to avoid liabilities for problems. The resulting aura of accuracy and reliability will make it extraordinarily difficult for an individual who has been subjected to identity theft to prosecute their innocence.

A person whose biometric is seriously compromised is precluded from the ultimate means of addressing the problem. People can change their passwords, their PINs, their login-ids, and even their commonly-used names (if only with considerable inconvenience). But they generally cannot change the physiological feature or behaviour that a biometrics scheme uses to identify them or authenticate their identity. Once it is captured, a person is forever subject to masquerade by each person or organisation that gains access to it.


3.5 Access Denial and Identity Denial

Biometrics is capable of being applied to the denial of access to some location, function or data in the event that the attempted authentication of an assertion of identity fails. Alternatively, biometrics might be used to identify individuals who should be denied access.

Examples include ex-employees and company premises, convicted shop-lifters and shops, problem-gamblers and casinos, and convicted 'rowdies' and football grounds. But the technique could of course be extended to the denial of access by customers suspected of shop-lifting, complainants about a company's practices, known agitators against the company's practices, and gamblers suspected of being good at gambling. Government agencies could find scores of applications, such as preventing targeted people from using transport facilities.

Where the biometric scheme is isolated, a person may be denied access to a very specific location (such as the pharmacy in a hospital), or a very specific function (such as the invocation of a program that enables amendment of a particular master-file), or very specific data (such as access to a particular person's records). Where a mistake has been made (such as incorrect authorisations, or a false rejection by the biometric device), inconvenience will arise, which may be of a serious nature, but in most cases would be a hindrance or annoyance rather than a major problem.

But if an explosion in biometric applications were to occur (as suppliers are promising themselves as a result of increased security awareness), there would be a tendency towards standardisation and multiple usage of schemes. This could have much more serious consequences, because file-errors and false-rejections could become tantamount to the denial of a person's identity, or at least of the rights that a 'normal' person enjoys. This was imagined by Orwell in '1984', in the form of an 'unperson', and investigated further in Brunner's 'The Shockwave Rider' (1975).


3.6 DNA Profiling

DNA is in one sense just another biometric. But its dangers are even more extreme than those of other biometrics, for at least the following reasons:


3.7 Broader Social Impacts

Biometric technologies, building as they do on a substantial set of other surveillance mechanisms, create an environment in which organisations have enormous power over individuals. Faced with the prospect of being alienated by employers, by providers of consumer goods and services, and by government agencies, individuals are less ready to voice dissent, or even to complain.

That is completely contrary to the patterns that have been associated with the rise of personal freedoms and free, open societies. It represents the kind of closed-minded society that the Soviet bloc created, and which the free world decried. The once-free world is submitting to a 'technological imperative', and permitting surveillance technologies to change society for the worse. Biometrics tools are among the most threatening of all surveillance technologies, and herald the severe curtailment of freedoms, and the repression of 'different-thinkers', public interest advocates and 'troublemakers'.

Clearly, this undermines democracy, because candidates, dependent on parties, sponsors and the media, are less willing to risk being marginalised; supporters are less prepared to be seen to be supporters; and voters become fearful of the consequences if their voting patterns become visible. This has been referred to by various authors during the last 50 years as a 'chilling effect'.

Less clearly, the suppression of different-thinkers strangles the economy. It does this because the adaptability of supply is dependent on experimentation, choice, and the scope for consumers to change their demand patterns.

Beyond the fairly practical considerations of freedom of thought and action, democracy and economic behaviour, there is the question of the ethics of the matter. If we're happy to treat humans in the same manner as manufactured goods, shipping cartons, and pets, then biometrics technologies are unobjectionable. If, on the other hand, humans continue to be accorded special respect, then biometrics technologies are repugnant to contemporary free societies.

Authoritarian governments ride rough-shod over personal freedoms and human rights. They will establish legal authority for and enforcement of the capture of biometrics for every transaction, and at every doorway. Such governments see consent and even awareness by the person as being irrelevant, because they consider that the interests of society or 'the State' (i.e. of the currently powerful cliques) dominate the interests of individuals.

In the free world as well, substantial momentum exists within governments and corporations to apply those same technologies, and in the process destroy civil rights in those countries.


4. Achieving Protections

The third section of the paper examines some key aspects of the process whereby the biometrics technological imperative will be resisted, contemporary designs rejected, and protections achieved.


4.1 Public Acceptability

Imposed authority only works up to a point. If it isn't accepted by the populace, then it lasts only as long as the repressive regime retains its power.

Public fears about biometrics, and the identification and person-tracking that it facilitates, are easily stirred up. In August 2001, Borders quickly learnt the error of its ways, and withdrew its proposal to implement a facial recognition scheme in a London UK store (Perera 2001). 9/11 might (but might not) have changed public attitudes for a while; but there's no reason to believe that the public would now, or in the future, perceive the risk of terrorist action to justify biometrics apparatus any-time, any-place, just-in-case.

Examples of public reactions? e.g. UK shopping-centres? Huntington Beach in Orange County LA?

In early 2002, Dollar Rent-a-Car abandoned fingerprinting its customers because its benefits were "not enough of a reduction to irritate even a small number of customers who were irritated" (Alexander 2002). Remarkably, however, "Dollar took nearly 170,000 customer thumb prints during the three-month test" and "although the test is being halted, the prints will remain on file at Dollar's corporate headquarters in Tulsa for seven years before being destroyed".


4.2 Absence of Consultation, Participation and Evaluation

To date, technologies have been invented, products developed, and schemes designed and implemented, in almost total ignorance of the needs and concerns of the people affected by them.

This is serious enough with any form of technology, but with one so fundamentally intrusive into individuals' rights, and threatening to society and democracy, it is quite untenable.

Guidelines exist on how to undertake a privacy impact assessment (e.g. Clarke 1998), and how to achieve representation of the affected public in the design process. Guidelines also exist as to how a corporation can develop a privacy strategy (e.g. Clarke 1996).

4.3 The Protections Required

This paper has identified an enormous array of serious risks inherent in biometrics technologies. This section proposes a set of safeguards that need to be implemented in order to protect against those risks.

(1) The Context

(2) The Technologies and Products

(3) Application Designs Generally

(4) Application Design Processes

(5) Regulatory Measures


4.4 The Feasibility of Compliant Schemes

The claim may be made that the requirements identified above constitute a prohibition against biometrics, because they preclude any system ever being deployed. Clearly there are many designs that would fail against the criteria; but there remain a variety of applications that would satisfy them. This section offers a couple of examples that the public would be likely to find acceptable, because the intrusiveness is both limited and balanced against the risks involved.

(1) A Consumer Payments Mechanism

See Clarke (1996).

(2) An Access Control Mechanism

A scheme could be readily designed that protects anonymity and pseudonymity, while denying access to a location, function or data to unauthorised persons.

Such a scheme would involve a token in the possession of the individual, which contains a reference-measure of a biometric, and some form of credential which records the authorisations available to that person, in terms of access to locations, functions or data.

The person would present themselves at a device that incorporated both a card-reader and a biometric-measuring capability. Both the token and the device would need to be compliant with a set of security standards, and each would need to perform processes to authenticate the other.

The test-measure would be authenticated against the reference-measure. If it passed, the device would neither disclose nor store any information relating to the person's identity. The token would pass to the device the credential it contains.

In order to cope with failed tests, a manual fallback procedure would need to be available.
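
A minimal sketch of this flow, with all names hypothetical and the device-to-token mutual authentication reduced to a comment, might look like the following:

    class Token:
        """Carried by the individual; the reference-measure never leaves it."""
        def __init__(self, reference_measure, credential):
            self._reference = reference_measure
            self.credential = credential      # e.g. 'may-enter-pharmacy'

        def matches(self, test_measure, tolerance=0.3):
            return abs(test_measure - self._reference) <= tolerance

    def control_point(token, test_measure):
        # the device and the token would first authenticate one another,
        # under the applicable security standards (omitted in this sketch)
        if not token.matches(test_measure):
            return None                       # invoke the manual fallback procedure
        # only the credential is disclosed; no identity is disclosed or stored
        return token.credential

The design choice that matters is that the comparison occurs against a reference-measure held on the token, so no central database of biometrics need exist, and the device learns only the credential.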


4.5 A Moratorium on the Application of Biometrics

Current applications of biometric technology do not resemble the above scenario, and virtually none of the safeguards identified earlier are in place. One Privacy Commissioner has noted this, and declared a policy position that demands a raft of "minimal standards" with which biometrics applications need to comply (IPCO 2001).

Given the extraordinarily serious implications of biometric technologies, and the absence of any effective protections, a ban is needed on all applications. The ban needs to remain in force until after a comprehensive set of design requirements and protections has been devised, implemented, and is actually in force.

Only in this manner can biometric technology providers and scheme sponsors be forced to balance the interests of all parties, rather than serving the interests of only the powerful and repressing the individuals whose biometrics are to be captured.


5. Conclusions

The biometrics industry would be laughable if the impact and implications of the technologies weren't so dire. The industry is incapable of regulating its own public relations distortions, or of establishing a sensible dialogue with organisations that are considering the adoption of the goods and services it peddles. It is inviting a most severe public backlash.

Governments and Parliaments in various countries, worst of all the U.S., have preferred to act to protect corporations' interests, while justifying inaction when it comes to protecting the interests of people, on the grounds that such protection is usually bad for business.

Biometrics technologies harbour threats to personal freedoms that are far more severe than mere terrorism. The constraints that national security devotees have in mind would squelch freedom and initiative as completely as the Communist regimes that the free world so deplored.

Governments need to focus not just on protections against acts of violence, but also on protections against inhuman security technologies. Application of biometrics technologies needs to be subjected to a statutory ban until a regulatory framework is in place that:


Resources

These have been re-located into a separate document. The sections are:


