THE PERSONAL PROSPECTUS &
THE THREAT OF A FULL DISCLOSURE FUTURE
Forthcoming Northwestern University Law Review (2011).
*By Scott R. Peppet
As for privacy in general, it is difficult to see how a
pooling equilibrium is avoided in which privacy is
'voluntarily' surrendered, making the legal protection of
privacy futile.1 -- Richard Posner
Every day that Tom Goodwin drives his Chevy Tahoe, his insurance company uses a small electronic monitor in his car to track his total driving time, speed, and driving habits. If he drives less than ten thousand miles a year, doesn't drive much after midnight, and avoids frequently slamming on
the brakes, at the end of the year he receives up to twenty-five percent off his premiums. “There's this Big Brother thing, but it's good,” Goodwin says. “Since I know I'm being watched, I'm on my best behavior.”2 To date, Progressive Insurance's MyRate program is available in twenty states and
has enrolled roughly ten thousand customers.3 Other insurance companies are following suit.4 Some carriers are going further, offering discounts for the use of more sophisticated devices that record geographical location, minute-by-minute speeding violations, and whether seat belts are in use.5 Rental car
* Associate Professor of Law, University of Colorado School of Law. I thank my colleagues at the University of Colorado Law School for their interest in and feedback on this project, particularly Paul Ohm, Vic Fleischer and Phil Weiser. I thank Mark Gibson and Matt Burns for their excellent research assistance. 1 Richard Posner, Privacy, in 3 THE NEW PALGRAVE DICTIONARY OF ECON. & THE LAW 103
(1998) [hereinafter Posner, Privacy]. 2 Bengt Halvorson, Car Insurance Savings Come With 'Big Brother,'
http://www.cnn.com/2009/LIVING/wayoflife/05/22/aa.pay.as.drive.insurance (last visited July 9, 2010). 3 See http://www.progressive.com/myrate (last visited July 9, 2010). The program was recently renamed Snapshot and updated slightly. See id. (last visited August 3, 2010). 4 Many insurance providers offer similar discounts, sometimes of up to sixty percent off regular premiums. See Jilian Mincer, To Your Benefit, WALL ST. J. (Dec. 7, 2009) (discussing
various plans). GMAC Insurance, for example, uses OnStar data to track total miles driven. See http://www.gmac123.com/auto-insurance/smart-discounts/low-mileage-discount.asp (last visited July 1, 2010). 5 See www.anpac.com/drivesmart (last visited July 12, 2010). Intel is working on more sophisticated monitoring systems for cars akin to the “black boxes” in aircraft, capable of
recording and transmitting basic vehicle telemetry, whether seat belts are in use, geographical location, mechanical malfunctions, and video of auto accidents, all of which would be of great interest to an insurance carrier. See John R. Quain, Intel Working on Black Box for Your
UNRAVELING PRIVACY 2
companies have also experimented with using such monitors to incentivize safe driving.
Similarly, every day the Mayo Clinic in Rochester, Minnesota uses remote monitoring devices to check up on the health of residents at the nearby Charter House senior living center. The devices transmit data about irregular heart rhythm, breathing rate, and the wearer's position and motion. “The goal,” says Dr. Charles Bruce, the lead investigator on the project, “is to have full remote monitoring of people, not patients, just like you measure the pressure of your tires today.”6 Medical device companies are racing to
enter the remote monitoring space. Proteus Biomedical, for example, is testing a wearable electronic device that can sense when patients have taken their pills and transmit that information to the patients' doctors,7 and
GlySens is working on an implantable subcutaneous blood sugar sensor for diabetics that uses the cellular network to constantly send real-time results to one's doctor.8 Although today these devices do not report data to users'
health insurers, it would be a simple step for a patient to provide such access in return for a discount. Indeed, such “pervasive lifestyle incentive management” is already being discussed by those in the healthcare field.9
Finally, every day tenants, job applicants, and students voluntarily disclose verified personal information to their prospective landlords, employers, and safety-conscious universities using online services such as MyBackgroundCheck.com.10 Rather than forcing these entities to run a
background check, an applicant can digitally divulge pre-verified information such as criminal record, sex offender status, eviction history, and previous rental addresses. Moreover, these services allow an applicant to augment her resume by having verified drug testing done at a local collection site and added to her digital record. MyBackgroundCheck.com calls this “resume enhancement.”11
Car, NEW YORK TIMES (July 7, 2010). Event recorders may become mandatory in new vehicles. See Motor Vehicle Safety Act of 2010, S. 3302, 111th Cong. §107 (2010). For
discussion of the privacy implications of such technologies, see Patrick R. Mueller, Every
Time You Brake, Every Turn You Make—I'll Be Watching You: Protecting Driver Privacy in
Event Data Recorder Information, 2006 WIS. L. REV. 135 (2006). 6 http://www.medicaldevice-network.com/features/feature81227 (last visited July 1, 2010). 7 See Don Clark, Take Two Digital Pills and Call Me in the Morning, WALL ST. J. (Aug. 4,
2009). 8 See http://www.signonsandiego.com/news/2010/jul/28/sd-company-hopes-monitor-will-revolutionize/ (last visited August 1, 2010). Regular blood sugar monitors (which require pricking the finger) already exist to transmit such data electronically after each reading. See
http://www.ideallifeonline.com/products/glocomanager (last visited July 20, 2010). 9 See e.g., Upkar Varshney, Pervasive Healthcare and Wireless Health Monitoring, 12
MOBILE NETW. APPL. 113, 115 (2007) (“Pervasive lifestyle incentive management could involve giving a small mobile micro-payment to a user device every time the user exercises or eats healthy food.”). 10 See http://www.mybackgroundcheck.com (last visited July 11, 2010). 11 See http://www.mybackgroundcheck.com/DrugTesting.aspx (last visited July 11, 2010).
This Article makes three claims. First, these examples—Tom
Goodwin's car insurance, pervasive health monitoring, and the
incorporation of verified drug testing into one's “enhanced resume”—illustrate that rapidly changing information technologies are making possible the low-cost sharing of verified personal information for economic reward, or, put differently, the incentivized extraction of previously unavailable personal information from individuals by firms. In this new world, economic actors do not always need to “sort” or screen each other based on publicly available information, but can instead incentivize each other to “signal” their characteristics. For example, an insurance company
does not need to do extensive data mining to determine whether a person is a risky driver or an unusual health risk—it can extract that information from
the insured directly. Second, this change towards a “signaling economy” (as
opposed to the “sorting economy” in which we have lived since the late 1800s) poses a very different threat to privacy than the threat of data mining, aggregation and sorting that has preoccupied the burgeoning informational privacy field for the last decade. In a world of verifiable information and low-cost signaling, the game-theoretic “unraveling effect” kicks in, leading self-interested actors to disclose fully their personal information for economic gain. Although at first consumers may receive a discount for using a driving or health monitor, privacy may unravel as those who refuse to do so are assumed to be withholding negative information and therefore stigmatized and penalized. Third, privacy law and scholarship must reorient
towards this unraveling threat to privacy. Privacy scholarship is unprepared for the possibility that when a few have the ability and incentive to disclose, all may ultimately be forced to do so. The field has had the luxury of ignoring unraveling because technologies did not exist to make a signaling economy possible. Those days are over. As the signaling economy evolves, privacy advocates must either concede defeat or focus on preventing unraveling. The latter will require both a theoretical shift in our conception of privacy harms and practical changes in privacy reform strategies.
The Article‟s three Parts track these claims. Part I explores the
emerging signaling economy.
* * *
Part II takes up the Article‟s second claim: that even the first steps
we are now taking towards a signaling economy—steps like those in the
three examples above—pose a new set of privacy challenges previously unaddressed by the field.
Richard Posner first articulated these challenges decades ago,12 although at the time they were more theoretical than practical. Even with
control over her personal information, he argued, an individual will often find it in her self-interest to disclose such information to others for economic
12 Posner's description of this problem is in Posner, Privacy, supra note __ at 105-107. He
began to develop such themes in RICHARD A. POSNER, THE ECONOMICS OF JUSTICE 234 (1981).
gain. If she can credibly signal to a health insurer that she does not smoke, she will pay lower premiums. If she can convince her employer that she is diligent, she will receive greater pay. As those with positive information about themselves choose to disclose, the economic “unraveling effect” will
occur: in equilibrium, all will disclose their information, whether positive or negative, as disclosure by those with the best private information leads to disclosure even by those with the worst.
The classic example of unraveling imagines a buyer inspecting a crate of oranges.13 The quantity of oranges in the crate is unknown and
opening the crate before purchase is unwise because the oranges will rot before transport. There are stiff penalties for lying, but no duty on the part of the seller to disclose the number of oranges in the crate. The number of oranges will be easy to verify once the crate is delivered and opened. The buyer believes that there can't be more than one hundred oranges.
The unraveling effect posits that all sellers will fully disclose the number of oranges in the crate, regardless of how many their crate contains. Begin with the choice faced by a seller with one hundred oranges in his crate. If the seller stays silent, the buyer will assume there are fewer than one hundred oranges and will be unwilling to pay for the full amount. The seller with one hundred oranges will therefore disclose and charge full price. Now consider the choice of a seller with ninety-nine oranges. If this seller stays quiet, the buyer will assume that there are fewer than ninety-nine oranges and will discount accordingly. The silent seller gets pooled with all the lower-value sellers, to his disadvantage. He will therefore disclose.
And so it goes, until one reaches the seller with only one orange and the unraveling is complete. As Douglas Baird, Robert Gertner and Randal Picker put it, “[s]ilence cannot be sustained because high-value sellers will
distinguish themselves from low-value sellers through voluntary disclosure.”14 The economist Robert Frank coined the term the “full
disclosure principle” to describe this phenomenon in his classic text Passions Within Reason. The principle is simple: “if some individuals stand to benefit by revealing a favorable value of some trait, others will be forced to disclose their less favorable values.”15
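The induction in the orange-crate example can be mechanized in a short simulation. The sketch below is illustrative rather than drawn from the Article: a silent seller is assumed to hold the average quantity among the types the buyer still believes might be silent, and any seller whose true quantity beats that pooled inference discloses.

```python
# Sketch of the unraveling effect: sellers hold crates with 1..100 oranges.
# A silent seller is paid as if he held the average of the quantities the
# buyer still considers plausibly silent; sellers who beat that inference
# disclose. The payoff structure is a simplification for illustration.

def unravel(types):
    """Return the set of types that remain silent in equilibrium."""
    silent = set(types)
    while True:
        # The buyer's inference about any silent seller: the pooled average.
        pooled_value = sum(silent) / len(silent)
        # Every seller whose true quantity exceeds the pooled inference
        # gains by disclosing, so he exits the silent pool.
        disclosers = {t for t in silent if t > pooled_value}
        if not disclosers:
            return silent
        silent -= disclosers

remaining = unravel(range(1, 101))
print(remaining)  # {1}
```

Starting from one hundred possible quantities, disclosure cascades downward until only the one-orange seller, who has nothing to gain from disclosing, remains silent: Frank's full disclosure principle in miniature.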
In the decades since Posner's challenge, however, privacy law has almost entirely overlooked the threat of unraveling. Instead, recent
13 This example is drawn from S.J. Grossman & O.D. Hart, Disclosure Laws and Takeover
Bids, 35 J. FIN. 323, 324 (1980). It has been repeated since. See DOUGLAS G. BAIRD ET AL.,
GAME THEORY AND THE LAW 90 (1994) (using this example) [hereinafter BAIRD ET AL., GAME
THEORY]; Robert H. Gertner, Disclosure and Unraveling, 1 THE NEW PALGRAVE DICTIONARY
OF ECON. & THE LAW 605 (1998) (same). 14 BAIRD ET AL., GAME THEORY, supra note __ at 90. 15 ROBERT H. FRANK, PASSIONS WITHIN REASON 104 (1988).
informational privacy scholarship16 has focused on the privacy threats of
firms sorting individuals by mining aggregated public data such as credit histories. Informational privacy law has reacted to sorting becoming more commonplace and sophisticated.17 The field is dominated by Daniel
Solove's concept of the “digital dossier,” which is a metaphor for the aggregate of information available online about a given person.18 Privacy
scholars fear that we are moving towards a world in which everything becomes public—where all of our personal information becomes easily available to others as part of our digital dossier.19 In reaction to this fear, the literature is replete with calls20 to give individuals greater control over their
personal information through the common law of property and tort and through stronger statutory privacy rights.21
The personal prospectus poses a different threat than Solove's digital dossier, however, and it demands different solutions than increased control over one's information. In a signaling economy, even if individuals
have control over their personal information, that control is itself the undoing of their privacy. Because they hold the keys, they can be asked—or
forced—to unlock the door to their personal information. Those who refuse to share their private information will face new forms of economic discrimination. How long before one's unwillingness to put a monitor in one's car amounts to an admission of bad driving habits, and one's unwillingness to wear a medical monitor leads to insurance penalties for assumed risky behavior? In a signaling economy, forced disclosure will be a problem as difficult as, or more difficult than, data mining and the digital dossier.
* * *
Part III thus begins to reorient informational privacy law towards the threats of signaling and unraveling.
16 See Neil M. Richards, The Information Privacy Law Project, 94 GEO. L.J. 1087 (2006)
(discussing the field of informational privacy law). 17 See Part II(B) for discussion. 18 See DANIEL J. SOLOVE, THE DIGITAL PERSON 2 (2004) (defining the digital dossier). 19 See John Palfrey, The Public and the Private at the United States Border with Cyberspace,
78 MISS. L.J. 241, 244 (2008) (discussing growth of the digital dossier); Corey Ciocchetti, E-
Commerce and Information Privacy: Privacy Policies as Personal Information Protectors,
44 AM. BUS. L.J. 55, 55-56 (2007) (demonstrating the ease of obtaining a digital dossier on a person); Lee Tien, Privacy, Technology and Data Mining, 30 OHIO N.U. L. REV. 389, 398-99
(2004) (explaining the risks digital dossiers pose to privacy and associational freedom). 20 Control has been the dominant American definition of privacy, see e.g. ALAN F. WESTIN,
PRIVACY AND FREEDOM 7 (1967); Charles Fried, Privacy, 77 YALE L.J. 475, 482 (1968)
(privacy is the “control we have over information about ourselves”), and the dominant prescribed remedy for privacy violation. See e.g., Sonja R. West, The Story of Us: Resolving
the Face-Off Between Autobiographical Speech and Information Privacy, 67 WASH. & LEE L.
REV. 589, 606 (2010) (“[I]t is the control over the disclosure of information … that lies at the heart of legal protection for information privacy.”); Paul M. Schwartz, Internet Privacy and
the State, 32 CONN. L. REV. 815, 820 (2000) (“The weight of the consensus about the
centrality of privacy-control is staggering.”). 21 See Part II(B).
* * *
I. THE PERSONAL PROSPECTUS &
THE EVOLUTION OF A SIGNALING ECONOMY
A. SORTING AND SIGNALING
It is often difficult to distinguish the trustworthy from the untrustworthy, the good from the bad, the high quality from the low. If you are choosing a business partner, you might value honesty and diligence—but
how to determine whether your potential partner has such traits and isn't just putting on a good show to lure you into the deal? If you are purchasing a car, how do you determine whether it is dependable or a lemon?22
These asymmetric information problems—how to distinguish one
desirable “type” of person, good, or asset from another less desirable type—have occupied economists and legal scholars for decades. Consider the simple decision of whether to lend to Person A or Person B. If you could easily determine that A is more credit-worthy, you would choose to do business with A and not B (or, at least, to charge a greater interest rate to B than A). If you cannot so distinguish, however, you will either lend to neither or charge both the higher interest rate because you must cover for the possibility that both are of the undesirable type that is likely to default.23
This creates extra costs for A and B, inefficiencies for you, and a burden on the economy generally.24 If the market really falls apart, credit-worthy A types may be priced out of the market completely.25
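The interest-rate logic of this lending example can be sketched numerically. The repayment probabilities below are hypothetical, chosen only to show how pooling penalizes the credit-worthy type:

```python
# Hypothetical repayment probabilities: type A repays with probability 0.99,
# type B with probability 0.80. A risk-neutral lender prices each loan to
# break even in expectation.

def breakeven_rate(p_repay: float) -> float:
    """Interest rate r such that (1 + r) * p_repay = 1."""
    return 1 / p_repay - 1

rate_a = breakeven_rate(0.99)   # ~0.010: A's fair rate if identifiable
rate_b = breakeven_rate(0.80)   # 0.250: B's fair rate if identifiable

# If the lender cannot tell A from B, both face the pooled rate.
rate_pooled = breakeven_rate((0.99 + 0.80) / 2)   # ~0.117

print(round(rate_a, 3), round(rate_b, 3), round(rate_pooled, 3))
```

At the pooled rate, A pays far more than its own risk warrants while B pays less; if A types respond by exiting the market, only B types remain, which is the Akerlof lemons dynamic the text describes.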
Sorting and signaling are the two primary economic devices to overcome such information asymmetries.26 Sorting or “screening” theory
assumes that an uninformed party will filter counterparties based on what observable characteristics or information are available, if the desired characteristic is unobservable.27 For example, a lender might use job
turnover, prior bankruptcies, or a poor credit score as proxies for future default risk.
22 See George A. Akerlof, The Market for 'Lemons': Quality Uncertainty and the Market
Mechanism, 83 Q. J. ECON. 488 (1970). 23 See generally Dwight M. Jaffee & Thomas Russell, Imperfect Information, Uncertainty,
and Credit Rationing, 90 Q. J. ECON. 651, 651-52 (1976) (describing these dynamics). 24 Joseph E. Stiglitz & Andrew Weiss, Credit Rationing in Markets with Imperfect
Information, 72 AM. ECON. REV. 393 (1981). 25 See Akerlof, supra note __, at 490-92. 26 For an overview of sorting and signaling, see John G. Riley, Silver Signals: Twenty-Five
Years of Screening and Signaling, 39 J. ECON. LIT. 432 (2001). 27 See e.g., Roger Klein, Richard Spady & Andrew Weiss, Factors Affecting the Output and
Quit Properties of Production Workers, 58 REV. ECON. STUDIES 929 (1991) (exploring
example of employers sorting job applicants based on high school graduation as a proxy for perseverance).
Signaling is the counterpart to sorting.28 Economic actors use
signals to qualitatively distinguish themselves from other economic actors. Signaling “refers to actions taken by an informed party for the sole purpose of credibly revealing his private information.”29 Return to our credit
example. If there are two types of borrowers—A & B—seeking funds and A
is likely to pay back while B is not, A has incentive to reveal its type to the lender in order to receive a lower interest rate.
A may try to signal its type by simply saying “I am a good credit risk—I will repay my loans,” but talk is cheap.30 The lender will doubt A
because A has every reason to lie. Moreover, because it is easy to say such words, both A and B will say them and the lender will be no better off than it was before in trying to distinguish A from B.
A may therefore disclose information that can be used as a proxy of future creditworthiness, such as income level or employment history. For such disclosure to be an effective signal, however, the disclosed information must be verifiable. Such verification has been costly in an economy based on analog information.31 An economic actor seeking to rely on a piece of
information must expend time and resources to verify it—by calling
references, checking employment or tax records, or calling to verify educational achievements. Although such steps are effective in some instances, they impose costs. When signaling is cost-prohibitive, economic actors will instead rely on sorting.
B. SORTING AND THE DIGITAL DOSSIER
28 See Michael Spence, Informational Aspects of Market Structure: An Introduction, 90 Q. J.
ECON. 591, 592 (1976) (“[Signaling and sorting] are opposite sides of the same coin.”). 29 N. GREGORY MANKIW, PRINCIPLES OF ECONOMICS 482 (2004). Put differently, “adverse
selection may give rise to signaling, which is the attempt by the informed side of the market to communicate information that the other side would find valuable.” WILLIAM A.
MCEACHERN, ECONOMICS 313 (2003). 30 See Joseph Farrell & Matthew Rabin, Cheap Talk, 10 J. ECON PERSP. 103 (1996)
(discussing cheap talk generally). 31 As a result, economists generally focus on signaling devices that are self-verifying by being costly to fake—whereby an action taken by A serves in and of itself as a signal of A‟s type. See N. GREGORY MANKIW, PRINCIPLES OF ECONOMICS 482 (2004) (defining signaling).
There are many examples. See e.g., DIANE COYLE, THE SOULFUL SCIENCE: WHAT
ECONOMISTS REALLY DO AND WHY IT MATTERS 153 (2007) (Indian villagers borrow huge
sums to pay for expensive weddings to signal their caste and social status); Paul Herbig & John Milewicz, Market Signaling Behavior in the Service Industry, 1 ACAD. MARKETING
STUDIES J. 35, 39 (1997) (banks and law firms spend vast sums on elaborate office buildings to signal their quality and solvency to potential clients); Robert Puelz & Arthur Snow,
Evidence on Adverse Selection: Equilibrium Signaling and Cross-Subsidization in the Insurance Market, 102 J. POL. ECON. 236, 238 (1994) (an insured chooses a high-deductible
health insurance plan, thereby signaling their belief in their health and their low risk to the insurance company). Spence began modern signaling theory with Michael Spence, Job
Market Signaling, 87 Q. J. ECON. 355 (1973). See also A. Michael Spence, Competition in
Salaries, Credentials, and Signaling Prerequisites for Jobs, 90 Q. J. ECON. 51 (1976)
(discussing his classic example of signaling through educational achievement).
This has been the situation in the “sorting economy” that has developed over the last one hundred and fifty years. Before turning to the evolving signaling economy in Section C, one must first understand the sorting economy and its culmination in today's digital dossier.
* * *
By the 1970s and 1980s, computer technology made it far easier for credit agencies to collaborate across geographic distances by sharing information, giving rise to the small number of large credit agencies that now dominate the American market.32 In turn, the Internet revolution of the
last twenty years allowed information aggregation to explode far beyond the credit markets. It is difficult to overstate the pervasive nature of the data mining and aggregation that feed today's digital dossier.33 “Data collection
is the dominant activity of commercial websites. Some 92 percent of them collect personal data from web users, which they then aggregate, sort, and use.”34 One scholar has estimated that corporate data mining links at least seven thousand transactions to each individual in the United States per year—approximately half a million transactions over a lifetime.35
Supermarkets, airlines, hotels, and merchants all track and share information about consumers to better market their products.36 All of this comprises our digital dossiers.
The dominant purpose of this data mining and aggregation is predictive profiling—creating models that can extrapolate from existing data to predict future behavior.37 In other words, sorting. As Douglas Baird has
argued about lenders, for example,
32 Prior to the 1970s, the credit bureau industry had largely been fragmented into many local agencies; the onset of the computer revolution eliminated the efficiencies of having a local bureau as opposed to a larger, more regional or national agency, leading to consolidation of the credit agency industry and the arrival of a few large, national credit agencies. See Pagano
& Jappelli, supra at 1712 (“From a network of local monopolies, credit bureaus began to evolve into a nationwide oligopoly.”). 33 Although the focus here is on data mining by private entities for economic purposes, it is worth noting that governmental data mining and aggregation obviously pose serious risks to privacy. For discussion of governmental use of such data, see e.g., Ira S. Rubinstein, Ronald D. Lee & Paul M. Schwartz, Data Mining and Internet Profiling: Emerging Regulatory and
Technological Approaches, 75 U. CHI. L. REV. 261 (2008). 34 LAWRENCE LESSIG, CODE: VERSION 2.0 219 (2006). 35 Jason Millar, Core Privacy: A Problem for Predictive Data-Mining, in LESSONS FROM THE
IDENTITY TRAIL: ANONYMITY, PRIVACY AND IDENTITY IN A NETWORKED SOCIETY 103, 105
(Kerr, Steeves & Lucock, eds., 2009). See also James X. Dempsey & Lara Flynn,
Commercial Data and National Security, 72 GEO. WASH. L. REV. 1459, 1464-65 (2004)
(surveying the increase in data collection and data mining). 36 See Chris Jay Hoofnagle, Big Brother's Little Helpers: How ChoicePoint and Other
Commercial Data Brokers Collect and Package Your Data for Law Enforcement, 29 N.C. J.
INT'L. L. & COM. REG. 595, 596 (2004) (discussing how opportunities for rent-seeking have led corporations to data-collection efforts). 37 See Millar, supra note ___ at 106 (discussing descriptive versus predictive data mining).
[a]dvances in data processing allow information about debtors to be
collected on a massive scale. It is now possible to look at a
particular debtor, identify characteristics such as age, marital status,
education, and length of stay at current employer, compare that
debtor with others for whom there is a credit history, and make a
confident prediction about the likelihood that the debtor will repay a 38loan.
Beyond credit markets, corporations might explore correlations between past consumer behavior (e.g., has this person bought both Brand X and Brand Y) and future purchases (e.g., will that predict that they will also purchase Brand Z).39 An insurance company might use health records to predict life expectancy.40 An employer might try to extrapolate the
likelihood of future success as an employee from the tea leaves of a candidate's past.41 A merchant might try to predict whether a given
customer's check will bounce based on rudimentary information about that check-writer.42
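The merchant's prediction problem can be caricatured in a few lines of code. The features, weights, and acceptance threshold below are invented for illustration; real check-verification services fit models of this general shape to large proprietary databases.

```python
# Toy version of sorting by observable proxies: score a check from a few
# observable features of the check-writer. All weights are hypothetical.

def bounce_risk(prior_bounces: int, account_age_years: float, amount: float) -> float:
    """Crude linear risk score, clamped to [0, 1]."""
    score = 0.15 * prior_bounces - 0.02 * account_age_years + 0.0005 * amount
    return max(0.0, min(1.0, score))

def accept_check(prior_bounces, account_age_years, amount, threshold=0.5):
    """Accept the check only if the predicted bounce risk is below threshold."""
    return bounce_risk(prior_bounces, account_age_years, amount) < threshold

print(accept_check(0, 10.0, 200.0))  # True: long clean history, small check
print(accept_check(4, 0.5, 900.0))   # False: several prior bounces, large check
```

The point of the sketch is the structure, not the numbers: the merchant never observes the trait it cares about (will this check clear?) and instead sorts on correlated observables, exactly the screening strategy the text describes.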
The point is that the digital dossier is the technological culmination of one hundred and fifty years of increasingly sophisticated sorting. The upside is that massive data aggregation and computer data analysis create market efficiencies because they allow parties to overcome information asymmetries with greater accuracy and lower cost. The downside is the risk to privacy.
38 Douglas G. Baird, Technology, Information, and Bankruptcy, 2007 U. ILL. L. REV. 305, 312. 39 See Tal Z. Zarsky, Desperately Seeking Solutions: Using Implementation Based Solutions for the Troubles of Information Privacy in the Age of Data Mining and the Internet Society,
56 ME. L. REV. 13, 36-37 (2004) (discussing use of data mining to reveal correlations in consumer behavior); JOSEPH P. BIGUS, DATA MINING WITH NEURAL NETWORKS 17-18 (1996)
(discussing correlation of product purchases). 40 See Anita Ramasastry, Lost in Translation? Data Mining, National Security and the
“Adverse Inference” Problem, 22 SANTA CLARA COMPUTER & HIGH TECH. L.J. 757, 768
(2006) (“Factors such as our credit score are meant to be predictors of how likely we are to repay our loans; likewise, our health, age and other physical characteristics are meant to be predictors of what our life expectancy may be.”). 41 The U.S. market for pre-employment background screening is roughly $2 billion per year. See J. Howard Beales, III & Timothy J. Muris, Choice or Consequences: Protecting Privacy
in Commercial Information, 76 U. CHI. L. REV. 109, 110 (2008). This is a common use of the
digital dossier. See Robert Sprague, Orwell Was an Optimist: The Evolution of Privacy in the
United States and Its De-Evolution for American Employees, 42 J. MARSHALL L. REV. 83, 87
(2008) (discussing databases for pre-employment screening). 42 A merchant can electronically submit a shopper‟s drivers license or bank information, which can be gleaned from the check itself, and various services compare that information to their databases to provide the merchant with a rating of the check-writer‟s reliability. See
Ronald J. Mann, Information Technology and Non-Legal Sanctions in Financing
Transactions, 54 VAND. L. REV. 1627, 1632-1633 (2001) (discussing check verification services).
Informational privacy scholars have trumpeted the dangers of the sorting made possible by the digital dossier:
We're heading toward a world where an extensive trail of
information fragments about us will be forever preserved on the
Internet, displayed instantly in a Google search. We will be forced
to live with a detailed record beginning with childhood that will stay
with us for life wherever we go, searchable and accessible from
anywhere in the world. This data can often be of dubious reliability;
it can be false and defamatory; or it can be true but deeply
humiliating or discrediting. We may find it increasingly difficult to
have a fresh start, a second chance, or a clean slate. … This record
will affect our ability to define our identities, to obtain jobs, to participate in public life, and more.43
This has been the dominant concern of the privacy field for the last decade.44
C. SIGNALING AND THE PERSONAL PROSPECTUS
Despite being the center of attention in privacy law, however, sorting is not the only means available to overcome information asymmetries. The three examples in the Introduction—Tom Goodwin's car
insurance, the innovation of health monitoring systems, and the incorporation of verified drug testing into one's “enhanced resume”—illustrate that we are now living in a world in which firms can increasingly rely on information transmitted directly from a consumer to the firm rather
43 DANIEL J. SOLOVE, THE FUTURE OF REPUTATION: GOSSIP, RUMOR, AND PRIVACY ON THE
INTERNET 17 (2007). 44ST See e.g. SIMSON GARFINKEL, DATABASE NATION: THE DEATH OF PRIVACY IN THE 21
CENTURY (2001) (discussing the threat of linked databases and the digitization of records); Jonathan Zittrain, Privacy 2.0, 2008 U. CHI LEGAL F. 65, 77-86 (discussing various types of
personal information now available digitally, including images and video); Seth Safier, Between Big Brother and the Bottom Line: Privacy in Cyberspace, 5 VA. J.L. & TECH. 6, 10
(2000) (discussing how these technologies allow for collection of “vast amounts of in-depth,
and potentially sensitive, personal information”); H.T. Tavani, Informational Privacy, Data
Mining, and the Internet, 1 ETHICS & INFORMATION TECHNOLOGY 37 (1999); Fred H. Cate,
Government Data Mining: The Need for a Legal Framework, 43 HARV. C.R.-C.L. L. REV.
435 (2008) (discussing the end of “practical obscurity” brought about by data mining); Christopher Slobogin, Government Data Mining and the Fourth Amendment, 75 U. CHI. L.
REV. 317 (2008) (discussing widespread use of data mining by government agencies and government‟s reliance on commercial data gathering companies); Tal Z. Zarsky, “Mine Your
Own Business!”: Making the Case for the Implications of Data Mining of Personal Information in the Forum of Public Opinion, 5 YALE J.L. & TECH. 4 (2002-2003) (discussing
privacy concerns related to data mining); William Thomas DeVries, Protecting Privacy in the
Digital Age, 18 BERKELEY TECH. L.J. 283, 291 (2003) (noting “three major digital
developments that deeply affect privacy: (1) the increase in data creation and the resulting collection of vast amounts of personal data--caused by the recording of almost every modern interaction; (2) the globalization of the data market and the ability of anyone to collate and examine this data; and (3) lack of the types of control mechanisms for digital data that existed to protect analog data”).