“Smarter” cars, unintended acceleration – and unintended consequences
In this article, I consider some recent press reports relating to embedded software in the automotive sector.
In The Times newspaper (London, 2015-10-16) the imminent arrival of Tesla cars that “use autopilot technology to park themselves and change lane without intervention from the driver” was noted.
By most definitions, the Tesla design incorporates what is sometimes called “Artificial Intelligence” (AI). Others might label it a “Smart” (or at least “Smarter”) Vehicle.
Tesla’s application of AI is – in my view – a significant development, and it highlights the extent to which electronic control systems in vehicles have advanced in recent years. This design may also suggest that the transition towards fully autonomous vehicles on our roads is beginning more quickly than previously anticipated.
As I’ve thought about the Tesla developments, I have been wondering whether we are already seeing another impact of AI in vehicle electronics, this time evidenced by what is now generally referred to as the “VW Emissions Scandal”.
In the case of VW, the vehicles were apparently made “smart enough” to work out that they were being tested and to use this information to lower their emissions (temporarily), allowing them to pass the test.
Some have compared the VW situation to problems with software development that were previously reported at Toyota. These problems are alleged to have resulted in “unintended vehicle acceleration”, with fatal consequences in some cases.
One observation I have heard (several times) when discussing these matters with colleagues can be paraphrased as follows: “At least nobody died as a result of the VW problems”.
If we lay aside the longer-term impact of many millions of cars generating higher levels of pollution than advertised and focus purely on the short-term impact, then this comparison may be felt to have some validity. However, if we dismiss the VW issues on this basis, I think we run the risk of overlooking the fact that there are other ways in which software and AI could be employed in vehicle electronics to beat testing regimes – and some of these may be (even) less benign.
To explain, let me go back to The Times, this time the issue dated 2015-10-17. Under the headline “Porsche tried to stop safety report on throttle delay”, the paper outlines a case brought by John Cieslik against the car company, alleging that his car has a safety defect that almost caused him to crash while overtaking a lorry.
It appears (from the newspaper article) that the behaviour of the car has – as with the VW diesels – been adapted to deal “intelligently” with a test scenario. In this case, the test is a restriction on the noise levels generated by cars travelling at 30 mph. What is being alleged is that – to beat this test – the cars apply a delayed throttle response (in the region of 2 seconds) when running at this speed in conditions that match the test scenario.
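To make the allegation concrete, the mechanism described would amount to a controller that checks whether the current vehicle state resembles the drive-by noise test and, if so, applies a delayed throttle response. The sketch below illustrates the general shape of such logic; every name, threshold, and the load check are invented for illustration – nothing is publicly known about any actual implementation.

```c
#include <stdbool.h>

/* Hypothetical sketch of the alleged behaviour: if the vehicle state
 * resembles the 30 mph drive-by noise test, delay the throttle
 * response by roughly 2 seconds. All names and thresholds here are
 * invented for illustration only. */

#define TEST_SPEED_MPH       30
#define SPEED_TOLERANCE_MPH   2
#define THROTTLE_DELAY_MS  2000

typedef struct {
    int speed_mph;        /* current road speed */
    int engine_load_pct;  /* light, steady load, as in a pass-by test */
} vehicle_state_t;

/* Guess whether the current state matches the test scenario. */
bool looks_like_noise_test(const vehicle_state_t *s)
{
    int diff = s->speed_mph - TEST_SPEED_MPH;
    if (diff < 0)
        diff = -diff;
    return diff <= SPEED_TOLERANCE_MPH && s->engine_load_pct < 30;
}

/* Throttle-response delay (in ms) to apply in the current state. */
int throttle_delay_ms(const vehicle_state_t *s)
{
    return looks_like_noise_test(s) ? THROTTLE_DELAY_MS : 0;
}
```

The disturbing point is how little code this takes: the driver in the Cieslik case would simply have been unlucky enough to be driving in conditions that the condition-matching branch mistook for the test.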
If these allegations are correct, then it would – at least – be highly disconcerting to the driver to experience such a delay in normal driving conditions, and it seems surprising that this matter has received little press exposure.
In the newspaper report that I mentioned at the start of this article (The Times, 2015-10-16), Nick Reed from the UK’s Transport Research Laboratory is quoted as saying: “It would be legal for a driver to use Tesla’s autopilot mode in the UK, as it’s an advanced version of existing driver assistance systems”.
In many ways, this seems to me to be a remarkable statement. While there have been no suggestions that the alleged software-development problems noted at Toyota, VW or Porsche also apply at Tesla, I am beginning to wonder if current development standards (such as ISO 26262) and related automotive regulations are capable of keeping pace with the rate of change in the automotive sector at this time. Would the same attitudes apply if we were discussing significant changes to software designs in the aerospace sector?
First category of software scandal: malfunctions of electronic control units due, for example, to corrupted data in unprotected EEPROMs (I observed many such cases during my work in practice) or to software errors not prevented by the manufacturer (as at Toyota – due, I think, to a goto instruction in the software). As you well know, electronic control units are becoming more and more complicated, and there is no absolutely safe way to catch all of these kinds of malfunctions, even if you examine the code!
Second category of software scandal: it is very easy to hide an encrypted piece of code inside a microcontroller to do a dirty job, as VW did.
The only way to catch such a scandal is to examine the (decrypted) code itself, not just to test the car!
So, for the moment, current development standards (such as ISO 26262) and related automotive regulations are not capable of keeping pace with the rate of change in the automotive sector, nor of catching all of these scandals.