1. Introduction: Setting the Scene
A nightmare or science fiction? With the advent of automated vehicles, this has now become a possible scenario. Indeed, the first such cases have already been reported1.
For the UK, the recent proposal by the Department for Transport (DoT) touches upon the issue of driver liability in a world of hackable autonomous vehicles, but only in passing2.
2. The Legal Framework
Autonomous vehicles are on the march3, and there seems to be little stopping the encroachment of robotic agents into our daily lives. Both government and the private sector conservatively estimate that levels 4 and 5 of autonomy (covering complex driving situations and full, end-to-end, hands-off journey autonomy, respectively) will be achieved somewhere between 2020 and the 2030s4.
With the spread of this automation into our daily lives, and into an environment that already carries an accepted risk profile5, a large number of legal and societal questions come into focus.
3. Background
Technological advancement in this space is not new. Research in this area has been actively pursued since the 1920s6, but legal and regulatory bodies were slow to address the issue outright, no doubt owing to the public's unwillingness to entrust its safety to a machine and to practical hurdles surrounding battery, sensor and computational technologies.
In 2011, things started to change, with the United States (US) drafting legal instruments to cope with the onset of autonomous vehicles, initially in the state of Nevada7 and then in subsequent states, until today, when 16 states8 have either enacted, or are looking to enact, legal policy and regulatory measures that allow autonomous vehicles to exist alongside conventional, human-operated motor vehicles. This change is being reflected globally, with prominent countries following suit, such as the UK, Japan and Singapore9.
From the periphery, it would appear that everything is developing well: new technology is being adopted, supported by legislative evolution, under the premise that everyone will be driven by autonomous agents come 202110, increasing safety, reducing human error and allowing productive use of time in transit.
4. State of the Art
Car hacking, and the issues this activity raises, is becoming a seriously discussed topic. Since 2015, with the advent of the infamous Jeep Cherokee case involving security researchers Charlie Miller and Chris Valasek12, it has attracted serious attention: the security exploit in question resulted in the recall of 1.4 million cars by Fiat13 and prompted the US National Highway Traffic Safety Administration to open an investigation into the recall.
Since then, Miller and Valasek have continued their research, uncovering further issues with the underlying systems14 of the same vehicle, while researchers in China have identified issues with other car manufacturers' systems, e.g. Tesla's15. While commentators such as David Pogue, of Scientific American, attribute these developments to scaremongering tactics16, the fundamental job of regulators and lawmakers is to attempt to pre-empt technological advances and ensure the legal landscape is prepared to deal with these new issues, even if, at present, the threats and risks may be dismissed as «hypothetical». The inauguration of a «Car Hacking Village» (2015) at the world's leading hacker event, Defcon, would suggest these «hypothetical» issues will become non-hypothetical very soon.
5. Legal Treatment
The UK DoT report proposes legal mechanisms for dealing with a number of tricky legal areas, suggesting that answers can be found by way of «a proportionate response» that passes the «duty of care» onto car manufacturers and insurance companies, and stating the following as a key strategic driver:
«Our proposal is to extend compulsory motor insurance to cover product liability to give motorists cover when they have handed full control over to the vehicle (i.e. they are out-of-the-loop). And, that motorists (or their insurers) rely on courts to apply the existing rules of product liability – under the Consumer Protection Act, and negligence – under the common law, to determine who should be responsible.»17
6. Issues with Liability
The DoT paper addresses this problem and suggests treating car hacking as equivalent to car hijacking, i.e. as a normative equivalent to theft of the vehicle. How the car is brought under the control of a third party does not matter – by breaking the door or by breaking code – as long as the result is the same. This, however, plays down the special nature of cyberspace, where accountability and traceability are not a given.
«If an accident occurred as a result of an automated vehicle being hacked then we think it should be treated, for insurance purposes, in the same way as an accident caused by a stolen vehicle. This would mean that the insurer of the vehicle would have to compensate a collision victim, which could include the «not at fault driver» for damage caused by hacking but, where the hacker could be traced, the insurer could recover the damages from the hacker.»19
«Hacking», however, covers a range of rather different scenarios. Consider the following situations, in which a hacker:
1. Takes full control of the vehicle. It now responds to the hacker in exactly the same way as it would to the «driver».
2. Obtains marginal control over some functions, but not direct control over the entire car: for instance, the hacker can increase or decrease speed, but not influence the direction.
3. Does not control these functions, but interferes with and degrades the performance of the software, eventually resulting in an accident, e.g. the brakes are less responsive, or the reaction to external obstacles is slowed down. The will of the driver is thwarted (the driver wants to slow down, but cannot), but neither is the hacker now in control.
4. Uses ransomware to «hold the car hostage». In this case, the malicious intruder cannot manipulate the vehicle's functions, but then neither can the owner. The hacker exercises a degree of control to the detriment of the lawful owner, and it is in their power alone to restore full access. It is at least debatable whether this is best understood as theft, or as criminal damage combined with extortion. In this scenario, accidents are unlikely (because the vehicle is now incapable of movement), but not impossible – e.g. if the car is disabled in an unsafe space and cannot be physically moved before a third party runs into it.
5. Disrupts the information the vehicle's control system is receiving, e.g. by manipulating the speedometer, causing the driver to behave incorrectly and incur civil or criminal penalties. This differs from the previous scenario in that a human driver is again involved, so it is arguably an issue only for level 4 automation and below.
6. Manipulates the core decision-making algorithms, e.g. machine learning models or rule-based decision trees, by feeding them false information, for instance through sensor manipulation, thereby directly affecting the vehicle's behaviour (see the sketch after this list). In this scenario the attacker does not directly exercise control, but neither does the owner, and the behaviour of the car is less predictable than in scenarios 2 or 3 – for anyone, including the hacker.
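To make scenario 6 more concrete, the following minimal sketch (in Python) shows how falsified sensor input alone can change the behaviour of an otherwise untouched rule-based decision component. All names, data structures and thresholds here are purely illustrative assumptions and are not drawn from any real vehicle system.

```python
# A minimal, hypothetical sketch of scenario 6: the attacker never touches the
# decision logic itself, only the sensor data it receives. All names, thresholds
# and values are illustrative assumptions, not taken from any real vehicle system.

from dataclasses import dataclass


@dataclass
class SensorReading:
    obstacle_distance_m: float  # reported distance to the nearest obstacle ahead
    own_speed_kmh: float        # reported speed of the vehicle


def decide_braking(reading: SensorReading) -> str:
    """A toy rule-based controller: brake hard if an obstacle is close at speed."""
    if reading.obstacle_distance_m < 10 and reading.own_speed_kmh > 30:
        return "BRAKE_HARD"
    if reading.obstacle_distance_m < 30:
        return "SLOW_DOWN"
    return "MAINTAIN_SPEED"


def spoofed(reading: SensorReading) -> SensorReading:
    """The attacker inflates the reported obstacle distance; the unchanged rules
    then reach the wrong conclusion."""
    return SensorReading(reading.obstacle_distance_m + 50.0, reading.own_speed_kmh)


genuine = SensorReading(obstacle_distance_m=8.0, own_speed_kmh=60.0)
print(decide_braking(genuine))           # -> BRAKE_HARD
print(decide_braking(spoofed(genuine)))  # -> MAINTAIN_SPEED, despite the real obstacle
```

The decision logic is never modified; only the data it receives is. This is precisely what makes it debatable whether such interference amounts to «taking control» of the vehicle in any traditional sense.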
For some forms of car hacking, the analogy with traditional theft will remain apt. In particular in scenario 1, where the attacker takes control of the car, the total number of cars affected will necessarily remain limited. The situation, however, looks very different if we consider scenarios 4, 5 and 6. As, for example, the recent ransomware attack against the UK hospital system showed, an almost unlimited number of computers can be affected at the same time, using the same weakness in the attack against all of them21. While a traditional theft exploits the weakness of a specific car, in a specific location (e.g. parked out of sight of observers), an attack against the software system of autonomous vehicles could simultaneously target all the cars vulnerable to that exploit.
7. The Unknown
The DoT paper argues for treating the hacking of a car analogously to criminal manipulation undertaken in the physical world. In many ways, this is a sound analogy. However, it ignores the special nature of cyberspace, where a single attack can have an almost unlimited number of targets, and where accountability and traceability are not a given. The report incorrectly identifies something it regards as a «known» when, in most cases, it should be treated as an «unknown»:
«If an accident occurred as a result of an automated vehicle being hacked then we think it should be treated, for insurance purposes, in the same way as an accident caused by a stolen vehicle. This would mean that the insurer of the vehicle would have to compensate a collision victim, which could include the ‹not at fault driver› for damage caused by hacking but, where the hacker could be traced, the insurer could recover the damages from the hacker.»22
8. Traceability
Two distinct traceability problems need to be distinguished:
- Traceability of the hacker
- Traceability of the manipulation
Tracing the hacker is not always easy23, and existing tools make it very difficult, even for law enforcement officials.
Tracing the manipulation is also an issue, not only in connection with the scenarios presented previously, but also when the underlying algorithms are targeted. An attack that aims to control a car will have to follow a predictable pattern and, as a result, leave identifiable digital traces as evidence. It seems less clear whether this will also be the case for attacks that target the machine learning (ML) algorithms (an integral aspect of driverless cars24) not to achieve a specific goal, but simply to disrupt, distort and degrade the car's functions. Detecting such attacks will require separating malicious modification from «normal» behaviour, which may include intrinsic «mistakes»25.
This is further complicated by the opaque nature of the algorithms in play, imposed upon us by the secretive nature of software development and of manufacturers' testing procedures26.
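To illustrate the underlying difficulty, the following minimal sketch (with assumed, purely illustrative values) applies a crude statistical check to hypothetical distance readings. Real detection pipelines are far more sophisticated, but they face the same problem: a blatant spoof stands out and leaves usable evidence, whereas a small, persistent distortion stays within the range of ordinary sensor noise and may never be identified as manipulation at all.

```python
# An illustrative sketch (assumed values only) of why malicious modification can be
# hard to separate from a sensor's «normal» noise and intrinsic «mistakes»:
# a crude statistical check flags a blatant spoof but misses a subtle, persistent bias.

import random
import statistics

random.seed(42)

# Hypothetical baseline: 1,000 genuine distance readings around 50 m with sensor noise.
baseline = [random.gauss(50.0, 2.0) for _ in range(1000)]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)


def looks_anomalous(reading: float, threshold: float = 3.0) -> bool:
    """Flag readings more than `threshold` standard deviations from the baseline mean."""
    return abs(reading - mean) / stdev > threshold


crude_spoof = 95.0                 # an implausible jump: easy to flag as evidence
subtle_bias = mean + 1.5 * stdev   # a small distortion within ordinary variation

print(looks_anomalous(crude_spoof))  # -> True: leaves an identifiable digital trace
print(looks_anomalous(subtle_bias))  # -> False: indistinguishable here from normal noise
```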
9. Conclusion
The UK’s proposal to support the emergent technology of driverless cars27, inform industry sectors on how to proceed and establish a code of practice28 is, by and large, a pragmatic approach which, following in the footsteps of the US, aims to ensure that the UK environment supports innovation and that evolving regulatory and legal mechanisms do not hinder progress.
As stated in the DoT paper:
«We want to take a pragmatic and proportionate approach, with a rolling programme of regulatory reform.»29
As always when confronted with technological change, two contradictory tendencies come into play. On the one hand, the apparent novelty of the technology often leads to demands for law reform: the problematic scenarios look different from those of the past, and that alone is seen as reason enough to update the legal provisions. This tendency is counterbalanced by our ability to recognise familiar patterns even in new surroundings. This, combined with the inherent slowness of the legislative process, pushes us to draw analogies between new problems and those of the past: it always seems easier to keep a tried and tested system in place than to experiment with an entirely new approach. In the DoT's proposal, this latter tendency by and large wins out. In UK law, it has always been the duty of the car holder to ensure adequate insurance cover, and part of this legally prescribed minimal cover also extends to accidents that occur when the car has been stolen. For certain types of hacking attack, this seems to provide an adequate solution, where past experience indicates that an overall small additional risk for the insurance companies provides substantial social benefits. These benefits are particularly pertinent when an as yet untried technology reaches the market and the public needs reassurance. However, as we have seen, this approach lumps together a range of rather disparate scenarios, and for some at least, the analogy to a traditional vehicle theft seems pernicious. Some new forms of attack against automated cars are conceivable that have no equivalent under the old technology, and these may require us to think about entirely new ways of allocating and collectivising the risk if we want the technology to succeed.
Jonathan Sinclair holds a BEng (Hons) in Software Engineering and an MSc in Informatics, and is halfway through completing his LLM in Information Technology Law. In addition to these academic qualifications, he is a highly qualified IT security professional holding certifications from established bodies such as (ISC)2, EC-Council and PECB. He is also a seasoned speaker, having been invited to speak at universities and at IT security and AI conferences.
Burkhard Schafer, Professor of Computational Legal Theory, The University of Edinburgh, SCRIPT Centre for IT and IP law Old College, Edinburgh, EH8 9YL, UK. B.schafer@ed.ac.uk; http://www.law.ed.ac.uk/people/burkhardschafer.
- 1 Andrew J. Hawkins, Uber says it’s reviewing incident of self-driving car running a red light, The Verge, 14 December 2016, available at: http://www.theverge.com/2016/12/14/13960836/uber-self-driving-car-san-francisco-red-light-safety (all websites last accessed on 16 November 2017); Evan Ackerman, Fatal Tesla Self-Driving Car Crash Reminds Us That Robots Aren’t Perfect, Spectrum, 1 July 2016, available at: http://spectrum.ieee.org/cars-that-think/transportation/self-driving/fatal-tesla-autopilot-crash-reminds-us-that-robots-arent-perfect.
- 2 Department for Transport, Pathway to Driverless Cars: Proposals to support advanced driver assistance systems and automated vehicle technologies, 2016, available at: http://webarchive.nationalarchives.gov.uk/20170123080341/https:/www.gov.uk/government/uploads/system/uploads/attachment_data/file/536365/driverless-cars-proposals-for-adas-and_avts.pdf.
- 3 With some figures estimating that 10 million cars that possess some form of driving automation will be on the road by 2020; see: BI Intelligence, 10 million self-driving cars will be on the road by 2020, Business Insider Deutschland, 15 June 2016, available at: http://uk.businessinsider.com/report-10-million-self-driving-cars-will-be-on-the-road-by-2020-2015-5-6.
- 4 KPMG, Connected and Autonomous Vehicles – The UK Economic Opportunity (2015), available at: https://www.smmt.co.uk/wp-content/uploads/sites/2/CRT036586F-Connected-and-Autonomous-Vehicles-%E2%80%93-The-UK-Economic-Opportu...1.pdf.
- 5 Department for Transport, UK Annual road fatalities (2014), available at: https://www.gov.uk/government/publications/annual-road-fatalities#history.
- 6 Adrienne Lafrance, Your Grandmother’s Driverless Car, The Atlantic, 29 June 2016, available at: http://www.theatlantic.com/technology/archive/2016/06/beep-beep/489029/.
- 7 Nevada state AB 511 legislature approval, 28 March 2011, available at: http://www.leg.state.nv.us/Session/76th2011/Reports/history.cfm?ID=1011; Assembly Bill No. 511–Committee on Transportation, 2011, available at: http://www.leg.state.nv.us/Session/76th2011/Bills/AB/AB511_EN.pdf.
- 8 National Conference of State Legislatures, Autonomous self-driving vehicles legislation, 2016, available at: http://www.ncsl.org/research/transportation/autonomous-vehicles-legislation.aspx.
- 9 Charlotte Jee / Christina Mercer, Driverless car news: The great driverless car race: Where will the UK place?, Techworld, 22 June 2017, available at: http://www.techworld.com/personal-tech/great-driverless-car-race-where-will-uk-place-3598209/.
- 10 Reuters, BMW says self-driving car to be level 5 capable by 2021, Reuters UK, 16 March 2017, available at: http://uk.reuters.com/article/uk-bmw-autonomous-self-driving-idUKKBN16N1Y8?il=0.
- 11 DoT, Pathway to Driverless Cars 2016 (note 2).
- 12 Andy Greenberg, Hackers Remotely Kill a Jeep on the Highway – With Me in It, Wired, 21 July 2015, available at: https://www.wired.com/2015/07/hackers-remotely-kill-jeep-highway/.
- 13 David Shepardson, Fiat Chrysler will recall vehicles over hacking worries, The Detroit News, 24 July 2015, available at: http://www.detroitnews.com/story/business/autos/2015/07/24/us-pushing-guard-vehicle-cyberhacking/30613567/.
- 14 Andy Greenberg, The Jeep Hackers Are Back to Prove Car Hacking Can Get Much Worse, Wired, 1 August 2016, available at: https://www.wired.com/2016/08/jeep-hackers-return-high-speed-steering-acceleration-hacks/.
- 15 Andy Greenberg, Tesla Responds to Chinese Hack With a Major Security Upgrade, Wired, 27 September 2016, available at: https://www.wired.com/2016/09/tesla-responds-chinese-hack-major-security-upgrade/.
- 16 «Yes, new technology is always a little scary. But let’s not exploit that fear. Let’s assess the hackable-car threat with clarity, with nuance—and with all the facts»; see David Pogue, Why Car Hacking Is Nearly Impossible, Scientific American, 28 October 2016, available at: https://www.scientificamerican.com/article/why-car-hacking-is-nearly-impossible/.
- 17 DoT, Pathway to Driverless Cars 2016 (note 2), p. 12.
- 18 As in the case of the financial sector e.g. credit card forgery and embezzlement etc.
- 19 DoT, Pathway to Driverless Cars 2016 (note 2), p. 21.
- 20 DoT, Pathway to Driverless Cars 2016 (note 2), p. 23.
- 21 National Audit Office, Investigation: WannaCry cyber attack and the NHS, 2017, available at: https://www.nao.org.uk/report/investigation-wannacry-cyber-attack-and-the-nhs/.
- 22 DoT, Pathway to Driverless Cars 2016 (note 2), p. 21.
- 23 Alan Woodward, Viewpoint: How hackers are caught out by law enforcers, BBC News, 12 March 2012, available at: http://www.bbc.co.uk/news/technology-17302656.
- 24 Harry Armstrong, Machines that learn in the wild – Machine learning capabilities, limitations and implications, Nesta, July 2015, available at: https://www.nesta.org.uk/sites/default/files/machines_that_learn_in_the_wild.pdf.
- 25 Lisa Shay / Woodrow Hartzog / John Nelson / Gregory Conti, Do Robots Dream of Electric Laws? An Experiment in the Law as Algorithm, in: Ryan Calo / A. Michael Froomkin / Ian Kerr (Eds.), Robot Law, Edward Elgar Publishing, Gloucester 2013, pp. 274–305.
- 26 Andrew J. Hawkins, Uber dismissed warnings about its illegal self-driving test for months, emails show, The Verge, 27 February 2017, available at: https://www.theverge.com/2017/2/27/14698902/uber-self-driving-san-francisco-dmv-email-levandowski.
- 27 DoT, Pathway to Driverless Cars 2016 (note 2).
- 28 DoT, The Pathway to Driverless Cars: A code of practice for testing, 2015, available at: https://www.vicroads.vic.gov.au/~/media/files/documents/safety-and-road-rules/pathwaydriverlesscars.pdf?la=en.
- 29 DoT, Pathway to Driverless Cars 2016 (note 2), p. 8.
- 30 Ben Herzberg / Dima Bekerman / Igal Zeifman, Breaking Down Mirai: An IoT DDoS Botnet Analysis, Imperva Incapsula, 26 October 2016, available at: https://www.incapsula.com/blog/malware-analysis-mirai-ddos-botnet.html.
- 31 Susan W. Brenner, Cybercrime and the Law: Challenges, Issues and Outcomes, Northeastern University Press, USA 2012.