By David Navetta, Boris Segalis, and Kris Kleiner, Norton Rose Fulbright US LLP
As the development of self-driving car technology progresses, the prospect of privately owned autonomous vehicles operating on public roads is nearing. Industry experts predict that autonomous vehicles will be commercially available within the next five to ten years. However, the use of this technology presents significant privacy and security issues that should be explored and addressed before these vehicles are fully commercialized.
I. Privacy issues
Because autonomous vehicles are largely experimental at this time, it remains unclear what type of personal information may be collected by these vehicles. Nonetheless, at a minimum, location data associated with a particular vehicle will be tracked and logged. Location tracking has already proven to be a lightning rod with respect to mobile phones. Some of the privacy considerations related to the use of autonomous cars are discussed further below.
a. Owner and Passenger Information
Perhaps the most important information that could be collected, particularly when combined with other information discussed below, is identifying information about the owner or passenger of the autonomous vehicle. It is likely that the autonomous vehicle would need to maintain information about the owner and passengers for a variety of different purposes. For example, the vehicle would likely need to maintain information about passengers to authenticate authorized use. Furthermore, information about the passengers would also lend itself to a variety of conveniences that are common in many cars available today, including customizable comfort, safety, and entertainment settings. It is likely that cars, based on setting preferences and other information collected while in use, will be able to identify drivers, passengers and their activities with a high degree of certainty.
The Drivers’ Privacy Protection Act and other federal statutes, including the Electronic Communications Privacy Act and the Federal Communications Act, could apply to certain aspects of autonomous vehicle data and communications. Additionally, 47 states and the District of Columbia have enacted laws applicable to personal information. While these laws generally focus on data breaches, many also include requirements for safeguarding personal information. Although these laws provide some protection for various kinds of personal information, some or all of these protections may not apply to autonomous vehicles because of the type of data involved, the manner of collection, or the entity collecting the data.
b. Location tracking
Location data is necessarily implicated in the use of autonomous vehicles. Indeed, vehicles have been collecting location data for some time, but additional location information would enable additional features and benefits for the user. For example, navigation features available in many modern cars include options to save specific locations in memory; to use the current location and planned route to identify additional information relevant to the trip, including real-time traffic data and points of interest on or near the planned route; and to set routing preferences, such as avoiding highways or toll roads.
Correlating location, destination, speed, and route data with additional information about the passenger and the date and time of the trip would allow someone to build a picture of when, where, and how an individual travels, particularly if this information is stored or logged over a period of time. This information may prove beneficial for traffic planning, reducing congestion, and improving safety, but it could also be used for secondary marketing purposes. Moreover, viewing travel data and patterns over time may enable one to deduce other information about the owner or passengers, such as where they live and work, as well as the stores, restaurants, and other establishments that they frequently visit.
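The ease of this kind of inference is worth emphasizing: a short script run over a log of timestamped trip endpoints can surface likely home and work locations. The following is a minimal sketch under stated assumptions; the trip log, the grid resolution, and the hour thresholds are all hypothetical values chosen for illustration.

```python
from collections import Counter

# Hypothetical trip log: (end_latitude, end_longitude, hour_of_arrival, weekday)
# weekday: 0 = Monday ... 6 = Sunday
trips = [
    (40.7128, -74.0060, 18, 0),  # repeated evening arrivals -> likely home
    (40.7128, -74.0060, 19, 1),
    (40.7128, -74.0060, 22, 4),
    (40.7580, -73.9855, 9, 0),   # weekday morning arrivals -> likely work
    (40.7580, -73.9855, 9, 2),
    (40.7306, -73.9352, 20, 5),  # occasional weekend destination
]

def infer_locations(trips):
    """Guess 'home' and 'work' cells from arrival patterns in a trip log."""
    home_counts, work_counts = Counter(), Counter()
    for lat, lon, hour, weekday in trips:
        cell = (round(lat, 3), round(lon, 3))  # snap to a ~100 m grid cell
        if hour >= 18 or hour <= 6:            # overnight arrivals
            home_counts[cell] += 1
        elif weekday < 5 and 7 <= hour <= 10:  # weekday morning arrivals
            work_counts[cell] += 1
    home = home_counts.most_common(1)[0][0] if home_counts else None
    work = work_counts.most_common(1)[0][0] if work_counts else None
    return home, work

home, work = infer_locations(trips)
print("inferred home cell:", home)
print("inferred work cell:", work)
```

Even this crude heuristic illustrates why logged trip data is sensitive: plausible "home" and "work" guesses fall out of nothing more than counting arrivals by time of day.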
The privacy risks associated with the collection of and access to location information raise both individual personal concerns and larger policy and societal concerns. On an individual basis, the availability of location information “generates a precise, comprehensive record of a person’s public movements that reflects a wealth of detail about her familial, political, professional, religious, and sexual associations.” For example, accessing an individual’s historical location and destination information would permit visibility into “trips the indisputably private nature of which takes little imagination to conjure: trips to the psychiatrist, the plastic surgeon, the abortion clinic, the AIDS treatment center, the strip club, the criminal defense attorney, the by-the-hour motel, the union meeting, the mosque, synagogue or church, the gay bar and on and on.” Even where location information does not reveal this type of private information, the ability to identify the present location or historical travel patterns of a particular person makes them more susceptible to physical harm or stalking if the information is accessible to the wrong person.
From a commercial perspective, location and destination information could provide valuable marketing information to advertisers. Knowing where a person lives, works, and shops would allow a business to infer information about income level and spending habits. One could envision the tracking and storage of autonomous vehicle data leading to lawsuits similar to those filed against various internet advertising companies engaged in behavioral tracking using cookies. There are various other implications of this data, ranging from providing customized advertising through Internet-connected interfaces in a vehicle (on a dedicated screen, through the car’s speakers, or to mobile devices in the car) to specifically routing a vehicle to expose a captive audience of passengers to certain businesses or destinations based on personalized interests inferred from their individual data.
Perhaps the most fundamental question that needs to be considered is how the collection and sharing of location data impact an individual’s “reasonable expectation of privacy,” a concept that affects both the protections afforded by the Fourth Amendment and the applicability of privacy interests in tort law. Another factor that complicates the reasonable-expectation-of-privacy issue is the potential that location data from autonomous cars would be shared with third parties, including the manufacturer or other service providers.
c. Sensor data
Autonomous cars in use today (and, indeed, many existing human-driven vehicles) contain various sensors that collect data relating to the operation of the vehicle as well as its surroundings. By constantly collecting data about its surroundings, however, the vehicle is continuously capturing information about the people and things it encounters, creating a potential privacy concern in the same way that a different Google project, Google Street View, drew the interest of the Federal Communications Commission (FCC).
There, the FCC imposed sanctions on Google for its conduct in gathering Wi-Fi network and “payload” data during the Street View project. One can imagine that autonomous vehicles could collect driving habits, destinations, and other revealing information about other drivers without their knowledge or consent. Additional concerns could arise from the use of imagery captured by the vehicle, including ownership disputes and potential invasion of privacy claims, depending upon the circumstances in which the images are captured.
One specific type of “sensor” that deserves particular attention is the voice-recognition and control system of the autonomous car. Many consumer devices currently on the market integrate voice control functionality, including smartphones and televisions. The addition of these features to consumer products has led to public concern and complaints about the collection and transmission of private communications.
In October 2015, California enacted legislation regulating voice-recognition technology in smart televisions. The law requires manufacturers of smart televisions to inform customers about voice-recognition features during initial setup or installation; bars the sale or use of any speech captured by voice-recognition technology for advertising purposes; and prohibits the manufacturer or entity providing these features from being compelled to build in capabilities that would allow an investigative or law enforcement officer to monitor communications through those features.
II. Security issues
In addition to individual privacy concerns, autonomous vehicles also present issues relating to personal safety and security. The potential security risks come from a variety of sources, both internal and external to the automated vehicle itself, which are discussed in further detail below.
In a 2014 Harris Interactive survey about the use of autonomous cars, more than 50 percent of respondents raised concerns about the prospect of a hacker gaining control of the vehicle. In 2015, researchers Charlie Miller and Chris Valasek exploited a vulnerability in certain Chrysler vehicles to gain control of a vehicle’s internal computer network. Miller and Valasek discovered the vulnerability in the entertainment system, which allowed remote access through an open port. With access to the entertainment system and the CAN bus, they were able to remotely manipulate various systems, including the air conditioning, stereo, windshield wipers, transmission, and steering, and were able to kill the engine and engage or disable the brakes. Although these vulnerabilities occurred in conventional vehicles, they illustrate some of the potential risks that could arise with autonomous vehicles, as one would expect many of the features and systems available in conventional cars to be included in autonomous vehicles. Furthermore, to the extent that autonomous cars lack the ability for a passenger to take control of the vehicle in response, the safety threat posed by these vulnerabilities could be even more acute.
The next source of risk to personal safety with autonomous cars comes from the technology itself. Karl Iagnemma, director of a start-up focused on the development of software for self-driving cars, explained the risk posed by software bugs, stating: “[e]veryone knows security is an issue and will at some point become an important issue. But the biggest threat to an occupant of a self-driving car today isn’t any hack, it’s the bug in someone’s software because we don’t have systems that we’re 100-percent sure are safe.”
Steven Shladover, a researcher at the University of California, Berkeley, stated that having “safety-critical, fail-safe software for completely driverless cars would require reimagining how software is designed.” Although software bugs in other devices, such as computers and smartphones, are relatively common, software failure in an autonomous car could have far more serious implications. This risk is widely recognized by American consumers: 79 percent have cited fears that “equipment needed by driverless vehicles—such as sensors or braking software—would fail at some point.”
The algorithms used in the autonomous vehicle’s decision-making process also present potential risks to the safety of passengers and those in the vicinity of the vehicle:
- How should the car be programmed to act in the event of an unavoidable accident?
- Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs?
- Should it choose between these extremes at random?
Unlike human drivers who make real-time decisions while driving, an automated vehicle’s decision, although based on various inputs available from sensor data, is a result of logic developed and coded by a programmer ahead of time.
The difficulty in making and coding the decision process is illustrated by the following example: An automated vehicle is traveling on a two-lane bridge when a bus traveling in the opposite direction suddenly veers into its lane. The automated vehicle must decide how to react using whatever logic has been programmed in advance. The three alternatives are as follows:
A. Veer left and off the bridge, which guarantees a severe, one-vehicle crash;
B. Crash head-on into the bus, which will result in a moderate, two-vehicle crash; and
C. Attempt to squeeze past the bus on the right. If the bus suddenly corrects back toward its own lane (a low-probability event given how far the bus has drifted), a crash is avoided. If the bus does not correct itself (a high-probability event), then a severe, two-vehicle crash results; this would be a small, offset crash, which carries a greater risk of injury than the full, frontal collision in Alternative B.
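The dilemma above can be restated as an expected-harm calculation, which makes explicit the value judgments a programmer must encode in advance. The harm scores and probabilities below are illustrative assumptions, not figures drawn from the example itself.

```python
# Expected-harm comparison for the three alternatives in the bridge scenario.
# All severity scores (0 = no harm, 10 = worst) and probabilities are
# illustrative assumptions chosen for this sketch.

alternatives = {
    "A: veer off the bridge": [
        (1.0, 8.0),   # certain severe one-vehicle crash
    ],
    "B: head-on into the bus": [
        (1.0, 5.0),   # certain moderate two-vehicle crash
    ],
    "C: squeeze past on the right": [
        (0.1, 0.0),   # low-probability outcome: bus corrects, no crash
        (0.9, 9.0),   # high-probability outcome: severe offset crash
    ],
}

def expected_harm(outcomes):
    """Probability-weighted harm over an alternative's possible outcomes."""
    return sum(p * harm for p, harm in outcomes)

for name, outcomes in alternatives.items():
    print(f"{name}: expected harm {expected_harm(outcomes):.1f}")

best = min(alternatives, key=lambda k: expected_harm(alternatives[k]))
print("lowest expected harm:", best)
```

Under these particular numbers, the head-on collision (Alternative B) minimizes expected harm, but adjusting any single score or probability can change the result; that sensitivity is precisely what makes coding the decision in advance, rather than in the moment, so difficult.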
The technological advancement in autonomous vehicles will be staggering and has already generated significant excitement. However, the legal issues and risks associated with obtaining and using personal data, as well as the various cybersecurity threats, need to be thoroughly considered before commercialization.
About The Authors
David Navetta is a US co-chair of Norton Rose Fulbright’s Data Protection, Privacy and Cybersecurity practice group. David focuses on technology, privacy, information security, and intellectual property law. His work ranges from compliance and transactional work to breach notification, regulatory response, and litigation. David has helped hundreds of companies across multiple industries prepare for and respond to data security breaches.
Boris Segalis is a US co-chair of Norton Rose Fulbright’s Data Protection, Privacy, and Cybersecurity practice group and has practiced exclusively in this area since 2007. Boris advises clients on data protection, privacy, and cybersecurity issues arising in the context of compliance and business strategy, technology transactions, breach preparedness and response, disputes and regulatory investigations, and legislative and regulatory strategy. He represents clients across industries, from Fortune 100 global organizations to emerging technology and new media companies.

Kris Kleiner is an associate in Norton Rose Fulbright’s Data Protection, Privacy, and Cybersecurity practice group. Kris regularly advises clients on best practices as well as compliance with state and federal privacy and cybersecurity regulations, and has experience assisting clients operating in multiple industries in identifying, remediating, and responding to data privacy incidents.

David, Boris, and Kris can be reached at David.Navetta@nortonrosefulbright.com,
Boris.Segalis@nortonrosefulbright.com, and Kris.Kleiner@nortonrosefulbright.com, or at our company website http://www.nortonrosefulbright.com.