Business Insider reports that Ford executive Jim Farley stated, in a panel discussion:
We know everyone who breaks the law, we know when you’re doing it. We have GPS in your car, so we know what you’re doing. By the way, we don’t supply that data to anyone.
He later retracted that, saying Ford doesn’t routinely collect GPS data about its drivers, but that he was just “imagin[ing] a day when the data might be used anonymously and in aggregate to help other marketers with traffic related problems.” I’m happy to accept that clarification.
Yet the point remains that Ford could technically gather this information, and could use it to prevent injuries. For instance, if GPS data shows that someone is speeding — or the car’s internal data shows that the driver is speeding, or driving in a way suggestive of drunk driving or extreme sleepiness, and the data can then be communicated to some central location — then Ford could notify the police, so the dangerous driver can be stopped. And the possibility of such reports could deter the dangerous driving in the first place.
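Purely to illustrate how little such a feature would demand technically, here is a minimal sketch in Python. Everything in it — the data fields, the thresholds, and the idea of a reporting step — is a hypothetical assumption for illustration, not anything Ford has said it does or plans to do.

```python
# Hypothetical sketch: a telematics unit that flags dangerous driving and
# forwards a report to some central location. All field names, thresholds,
# and the reporting behavior are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class TelematicsSample:
    speed_mph: float                 # from the car's speed sensor or GPS
    posted_limit_mph: float          # from an onboard map database (assumed available)
    lane_departures_per_min: float   # crude proxy for drunk or drowsy driving

SPEEDING_MARGIN_MPH = 20.0   # hypothetical threshold
LANE_DEPARTURE_LIMIT = 4.0   # hypothetical threshold

def flag_dangerous_driving(sample: TelematicsSample) -> list[str]:
    """Return the reasons (possibly none) that this sample looks dangerous."""
    reasons = []
    if sample.speed_mph > sample.posted_limit_mph + SPEEDING_MARGIN_MPH:
        reasons.append("excessive speeding")
    if sample.lane_departures_per_min > LANE_DEPARTURE_LIMIT:
        reasons.append("erratic lane keeping")
    return reasons

def maybe_report(sample: TelematicsSample) -> None:
    reasons = flag_dangerous_driving(sample)
    if reasons:
        # In a real system this might notify a monitoring center or the police;
        # here it just prints, since the reporting channel is the whole question.
        print("Would report: " + ", ".join(reasons))

maybe_report(TelematicsSample(speed_mph=103.0, posted_limit_mph=65.0,
                              lane_departures_per_min=1.0))
```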
Ford, then, is putting extremely dangerous devices on the road. It’s clearly foreseeable that those devices will be misused (since they often are misused). Car accidents cause tens of thousands of deaths and many more injuries each year. And Ford has a means of making the dangerous devices it distributes less dangerous; yet it isn’t using that means.
Sounds like a lawsuit, no? Manufacturer liability for designs that unreasonably facilitate foreseeable misuse is well-established. And the fact that the misuse may stem from negligence (or even intentional wrongdoing) on the user’s part doesn’t necessarily block liability, so long as the user misconduct is foreseeable. [UPDATE: I should note that I’m not wild about these aspects of our tort law system, and think they should likely be trimmed back in various ways; but there is certainly ample legal doctrine out there — whether one likes it or not — potentially supporting liability in such a situation.]
Similar lawsuits were, after all, brought against gun manufacturers. They were mostly rejected by courts, largely because gun manufacturers couldn’t reasonably tell when their products were being misused — but here, Ford could determine this (either now or in the very near future), and deter or stop actual misuse without stopping proper use of the car by law-abiding drivers. And while these sorts of gun manufacturer lawsuits were also preempted by the Protection of Lawful Commerce in Arms Act, that’s a gun-specific statute, which doesn’t prevent lawsuits against manufacturers of other products.
Of course, the problem is that such car manufacturer liability would interfere with what many people see as their privacy (or, perhaps more precisely, freedom from surveillance). We don’t expect that our own devices will constantly report our actions — even likely illegal actions — to the police. And while one can imagine regulations or statutes that require such reporting, we’d expect a political debate about this, with potential political accountability for the government officials who vote for such mandates. We don’t expect, I think, that courts will impose such obligations on device manufacturers as a matter of the common law of torts, in the absence of this sort of political debate.
Yet tort law precedents say surprisingly little about such tensions between product liability law (or negligence law more broadly) and privacy. Courts could say that some proposed precaution — such as some proposed product feature — isn’t required by the “reasonable care” standard, because that precaution would excessively intrude on privacy; and sometimes they do say so. But sometimes they ignore privacy issues altogether, and they generally don’t discuss them in any methodical way.
Attentive readers might realize that this is all a pitch for my forthcoming Tort Law vs. Privacy article, which will be published this year in the Columbia Law Review. Here is the Introduction to the article:
Through the privacy torts, tort law aims to protect privacy. But tort law, and especially negligence law, can also reduce privacy.
Tort law can pressure property owners, employers, and consumer product manufacturers into engaging in more surveillance. Tort law can pressure colleges, employers, and others into more investigation of students’, employees’, or customers’ lives. Tort law can pressure landlords, employers, and others into more dissemination of potentially embarrassing information about people. Tort law can require people to reveal potentially embarrassing information about themselves. And technological change is likely to magnify this pressure still further. Yet this tendency has gone largely undiscussed.
Modern negligence law (including the law of product design defects) obligates all of us to take reasonable precautions to prevent harm caused even in part by our actions, by our products, by our employees, or by others who are using our property. We also have duties to affirmatively protect some people — customers, tenants, other business visitors, and likely social guests — even against threats that we didn’t help create. All these duties may require us to take reasonable precautions against criminal acts by others. [“The conduct of a defendant can lack reasonable care insofar as it foreseeably … permits the improper conduct of … a third party.” “The improper action or misconduct in question can take a variety of forms. It can be negligent, reckless, or intentional in its harm-causing quality. It can be either tortious or criminal, or both.”] And some of those required precautions may involve disclosing information about ourselves, or gathering and disclosing information about others.
Under the Learned Hand formula for determining negligence, the requirement of “reasonable precautions” is often understood as requiring cost-effective precautions. Liability for failure to take a precaution is proper if B < PL, that is, if the burden (B) of taking the precaution is less than the probability (P) that the harm will occur absent the precaution, multiplied by the magnitude of the loss (L) if the harm does occur.
Gathering or disclosing information about people’s backgrounds, tendencies, and actions is increasingly inexpensive, and increasingly effective at helping avoid, interrupt, or deter harm. The B (burden) of such precautions thus gets lower, and the expected harm (P × L) that they can avert gets higher.
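To make the comparison concrete, here is a small worked example with deliberately invented numbers; the dollar figures and the probability are illustrative assumptions, not data.

```latex
% Hand formula with hypothetical numbers, purely for illustration:
% B = cost of the precaution; P = probability of the harm absent it;
% L = magnitude of the loss if the harm occurs.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\[
  B < P \times L
  \qquad \text{e.g.} \qquad
  \underbrace{\$1{,}000}_{B\ (\text{camera})}
  < \underbrace{0.01}_{P} \times \underbrace{\$500{,}000}_{L}
  = \$5{,}000,
\]
so on these assumed numbers, failing to install the camera would count as negligent under the formula.
\end{document}
```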
Failure to take those precautions thus becomes negligent. When comprehensive nationwide background checks were expensive and ineffective, they weren’t required by the duty to exercise reasonable care. Now they are cheap, quick, and more comprehensive, so failing to do a background check is often seen as negligent. And employers do indeed report the desire to avoid legal liability as a major reason for investigating the backgrounds of job applicants. [Footnote: See Society for Human Resource Mgmt., Background Checking: Conducting Criminal Background Checks, at 3–4 (Jan. 22, 2010) (reporting that 73% of the 347 respondents, though in a self-selected sample, conducted criminal background checks on all job candidates, with 19% more conducting them only on candidates for particularly sensitive classes of positions); id. at 7 (reporting that 55% of 310 respondents gave “[t]o reduce liability for negligent hiring” as one of “the primary reasons that [their] organization conducts criminal background checks on job candidates”).]
Likewise, as video surveillance cameras became cheap enough to be cost-effective, courts began to hold that defendants may be negligent for failing to install surveillance cameras. [Footnote: See, e.g., Rodriguez-Quinones v. Jimenez & Ruiz, S.E., 402 F.3d 251, 256 (1st Cir. 2005) (noting low cost of a camera as part of the reason that a property owner might have a duty to install it).] Failure to provide camera surveillance is now a common claim in negligence cases. “Take reasonable care” translates into a steady and growing pressure: investigate, surveil, disclose.
Still more comprehensive surveillance is likely to become technically feasible soon. Image recognition software will likely make it easier for one guard to monitor many more video cameras, by alerting the guard to which screen is showing a potentially dangerous confrontation. Facial recognition software will make it easier to keep track of who is present where and when, and to instantly look up visitors in criminal records databases. Again, under modern negligence law, as these precautions against crime become feasible, they may become legally mandated (on pain of liability should a crime take place in the absence of such precautions).
Likewise, product manufacturers can increasingly monitor misuse of their products by customers. Car manufacturers can design cars that e-mail the police or call 911 whenever the car goes over 80 miles per hour. They can likely design cars that monitor the driver for signs strongly associated with drunk driving, and call 911 when those signs are present. They can design cars with breathalyzer ignition interlocks that check their drivers’ breath alcohol level and report to the police attempts to drive drunk. As such technologies get cheap enough — cellular communication already has, and breathalyzer ignition interlocks likely will, too — it becomes much more plausible to claim that a manufacturer is negligent for designing a deadly machine that fails to inexpensively monitor its operator for signs of dangerous driving.
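As a rough illustration of how simple the decision logic for such an interlock could be, here is a hypothetical sketch; the threshold and the reporting behavior are assumptions rather than a description of any actual system, but they show how small the step is from merely blocking the ignition to also reporting the attempt.

```python
# Hypothetical ignition-interlock logic, purely illustrative.
# The BAC limit and the reporting behavior are assumptions, not a
# description of any actual manufacturer's system.

LEGAL_BAC_LIMIT = 0.08  # typical U.S. per se limit

def attempt_start(measured_bac: float, report_failures: bool = False) -> bool:
    """Allow the engine to start only if the breath test passes."""
    if measured_bac < LEGAL_BAC_LIMIT:
        return True  # engine starts normally
    if report_failures:
        # The contested design choice: whether a failed attempt is merely
        # blocked, or also reported to some outside party.
        print(f"Would report attempted drunk driving (BAC {measured_bac:.3f})")
    return False

attempt_start(0.02)                        # starts
attempt_start(0.11, report_failures=True)  # blocked and (hypothetically) reported
```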
These tendencies also bear on the likely future scope of government surveillance, and not just private surveillance, as Part III.F will discuss. First, duties imposed on private property owners and employers are generally applicable to the government as property owner and employer. Surveillance data collected by the government in those capacities can easily be shared with law enforcement agencies.
Second, as the NSA PRISM story vividly illustrates, surveillance data collected by private entities can easily be subpoenaed or otherwise obtained by law enforcement agencies, without a warrant or probable cause. What the private sector gathers, the government can easily demand.
Third, the increasing prevalence of private surveillance may subtly make people more willing to accept government surveillance. If private entities are, for instance, required to maintain surveillance cameras with face recognition software on private property, it will be much harder to argue that police departments should be prohibited from doing the same on government-owned streets.
Negligence law, then, can pressure potential defendants into taking what I call “privacy-implicating precautions”: disclosing information about employees, customers, tenants, students, and the like, gathering information about them, and surveilling them. This pressure can sometimes have immediate and striking effects. An employer who must, for instance, warn customers about the threat posed by an employee — either because the employee has committed crimes, or because the employee is being stalked by a criminal who might injure bystanders in a future attack — will likely dismiss the employee, or not hire him in the first place. The same may be so for a landlord who must disclose this information about a tenant. And the pressure can also have long-term effects that are even more pervasive, as people’s understanding of the privacy they should demand is molded by the limits on the privacy that they have grown used to.
When, then, should the tort system demand privacy-implicating disclosure, information gathering, or surveillance? This is a question that people who care about privacy, whether academics, advocates, citizens, judges, or legislators, should confront. If I am right, then tort law could affect privacy in largely unseen but substantial ways. Those who are interested in privacy should consider how they can participate in controlling and perhaps limiting these effects, whether through legislation, amicus briefs, or scholarly analysis.
This Article will not try to offer a general answer to the question. Perhaps there is no single answer, but rather different answers for different contexts. When it comes to affirmative protection for privacy, the legal system has developed many different privacy rules to deal with different kinds of intrusions. Maybe there should likewise be several different privacy doctrines constraining the scope of negligence law. Moreover, people who value privacy differently, and for different reasons, will likely come to different answers. I don’t want to commit myself to substantive proposals that rely on a theory of privacy that many readers and many judges may not share.
Instead, this article will try to explore not what the answer ought to be, but which actor in the tort system should provide it. Should these privacy-vs.-safety decisions generally be made by jurors, applying the “reasonable care” standard? Should judges decide as a matter of law that certain precautions need not be taken because of the burden they impose on privacy? Or should the decisions be left to legislators or administrative agencies, with judges generally rejecting demands for privacy-implicating precautions unless a legislative or administrative body has mandated such precautions?
Part I of the article will briefly define what I mean by privacy here — essentially “control over the processing — i.e., the acquisition, disclosure, and use — of personal information,” which includes limitations on surveillance. Part II will then catalog some of the specific ways that negligence law and product design defect law may require behavior that undermines privacy, or mandates surveillance.
Parts III, IV, and V will discuss which institutions could take the lead in evaluating such privacy-implicating proposed precautions; the parts will outline the arguments for jury decisionmaking (Part III), judicial decisionmaking via “no duty” rules (Part IV), and judges’ leaving the matter to legislative and administrative agency decisionmaking (Part V). In the process, the discussion will point to the relatively few court cases that have discussed these questions, almost all of which do so only briefly. My tentative view is that this is an area where courts should avoid allowing liability in the absence of legislative or administrative agency guidance; but I hope that the analysis offered throughout the article will be useful even to those who come to a different bottom line.
[Footnotes: I use the term “privacy” for convenience here to include freedom from surveillance, even in public places.
I focus here on how substantive liability rules may require gathering and revealing information. I do not discuss the important but already well-discussed debate about how discovery in civil cases may diminish the privacy of litigants, litigants’ employees, and others.
Product design defect law in practice largely applies negligence principles, since it imposes liability for “unreasonable” product designs — designs that could have, reasonably and cost-effectively, been made safer. See, e.g., Restatement (Third) of Torts (Prod. Liab.) § 2 cmt. d, reporter’s note cmt. a. (In some respects, product design defect law departs from negligence principles, for instance in holding distributors liable for manufacturers’ negligent design choices, even if the distributor was not itself negligent, id. § 2(b); but those differences are largely irrelevant for our purposes.) For purposes of this article, I will use “negligence law” to refer both to standard negligence law and product design defect law.
I deliberately don’t label [various privacy-implicating precautions] “privacy-violating precautions,” because it may well be that some of these precautions should be required (whether by juries, judges, or legislatures), despite their privacy costs. In such a situation, the precautions may not be seen as violating the legitimate scope of the right to privacy, just as the law may restrict speech without violating free speech. But all the precautions do implicate privacy, in the sense that they impose privacy costs that ought to be considered in analyzing whether the precautions ought to be required.]