
QF72 story - wow

Started by Clifford Heath May 14, 2017
This story is a doozy.
<http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>
On 14-May-17 8:03 PM, Clifford Heath wrote:
> This story is a doozy.
> <http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>
Yep, it was an interesting read for sure.
On 14.5.2017 г. 15:03, Clifford Heath wrote:
> This story is a doozy.
> <http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>
Interesting reading, I read almost the whole article. If I start
commenting on the quality of the software they must have on these planes
I'll go into a rant predictable to all seasoned group members so
I won't.

But I am amazed that - as the article states - the board computer
on these planes has priority over the pilots. This is not a bug, it
is an error at the product concept level... If it is true of course,
I still have difficulties believing that. The fact they did manage
to gain some manual control hints towards "not quite true".

But I certainly seem to know why I don't fly unless I really have to.
Europe is small enough to travel by car, train or bus (I have done all of
these - rarely, of course, so it has been practical). The last time I flew
was during the Soviet era; it was my only way to "defect" to West Germany
back then. It was a Tu-134 IIRC, which must have had mechanical controls
and all, but I remember how every new sound (there was a ventilator or
something which came on periodically) almost turned me into a religious
person... well, at least while the flight lasted :).

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI             http://www.tgi-sci.com
------------------------------------------------------
http://www.flickr.com/photos/didi_tgi/
AT Monday 15 May 2017 21:05, Dimiter_Popoff wrote:

> On 14.5.2017 г. 15:03, Clifford Heath wrote:
>> This story is a doozy.
>> <http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>
>>
>
> Interesting reading, I read almost the whole article. If I start
> commenting on the quality of the software they must have on these planes
> I'll go into a rant predictable to all seasoned group members so
> I won't.
>
> But I am amazed that - as the article states - the board computer
> on these planes has priority over the pilots. This is not a bug, it
> is an error at the product concept level... If it is true of course,
> I still have difficulties believing that. The fact they did manage
> to gain some manual control hints towards "not quite true".
I am doing airborne software myself.
You are correct. It is not a bug, it is a serious error in the requirements.
Software has to be written according to requirements and this is verified
(DO-178). If the sys-reqs are erroneous, the software is also incorrect.

Similar to the crash of an Airbus in Warsaw. In that case it was an
interlock that prevented thrust reversal unless both main gears see
weight-on-wheels. The landing happened with an extreme side wind keeping
one side slightly lifted. Thus thrust reversal was blocked and the plane
ran past the runway and crashed into an embankment, killing two people.
This interlock had been implemented after a Lauda Air plane crashed in
south-east Asia because the thrust reversal was activated in mid-air.

This is why you normally have reviews of system requirements, and for every
change you have to perform an impact analysis to prevent such oversights.

--
Reinhardt
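To make that concrete, here is a minimal sketch of the kind of interlock
described above. The names, the data structure and the one-line condition
are purely illustrative assumptions - not the actual Airbus logic - but they
show how a requirement that reads sensibly in a review ("reverse thrust only
when both main gears report weight-on-wheels") blocks the reversers in a
one-wheel crosswind touchdown:

#include <stdbool.h>
#include <stdio.h>

struct gear_status {
    bool left_wow;   /* weight-on-wheels switch, left main gear  */
    bool right_wow;  /* weight-on-wheels switch, right main gear */
};

/* As specified: guards against in-flight deployment (the Lauda Air case),
 * but also keeps the reversers locked while one strut is still unloaded. */
bool reverser_permitted(const struct gear_status *g)
{
    return g->left_wow && g->right_wow;
}

int main(void)
{
    /* Crosswind touchdown: one wing held slightly high, one strut unloaded. */
    struct gear_status crosswind = { .left_wow = true, .right_wow = false };
    printf("reverser permitted: %d\n", reverser_permitted(&crosswind)); /* prints 0 */
    return 0;
}

A relaxed condition (for example, one loaded gear plus wheel spin-up above a
threshold) would have allowed braking in the Warsaw scenario, but weighing
that against the risk of in-flight deployment is exactly the kind of decision
that belongs in the system-requirements review, not in the code.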
On 15/05/17 15:58, Reinhardt Behm wrote:
> AT Monday 15 May 2017 21:05, Dimiter_Popoff wrote:
>
>> On 14.5.2017 г. 15:03, Clifford Heath wrote:
>>> This story is a doozy.
>>> <http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>
>>>
>>
>> Interesting reading, I read almost the whole article. If I start
>> commenting on the quality of the software they must have on these planes
>> I'll go into a rant predictable to all seasoned group members so
>> I won't.
>>
>> But I am amazed that - as the article states - the board computer
>> on these planes has priority over the pilots. This is not a bug, it
>> is an error at the product concept level... If it is true of course,
>> I still have difficulties believing that. The fact they did manage
>> to gain some manual control hints towards "not quite true".
>
> I am doing airborne software myself.
> You are correct. It is not a bug, it is a serious error in the requirements.
> Software has to be written according to requirements and this is verified
> (DO-178). If the sys-reqs are erroneous, the software is also incorrect.
>
What about the case where the pilot intentionally crashed the plane (as far
as they could figure out afterwards, anyway)? He was mentally unstable and
flew the plane into a mountain. If the onboard computers had had priority
over the pilot in that case, there would have been no crash.

It is far from easy to say who or what is most reliable in cases like
this, and who or what should have priority. Automated systems can (if
designed properly) be safer and more stable than humans - but on the
other hand, humans are better at handling unexpected situations.

However, I thought that critical flight systems were made in triplicate,
and if one goes bananas it gets outvoted and the other two run the plane?
Did that not happen here?
> Similar to the crash of an Airbus in Warsaw. In that case it was an
> interlock that prevented thrust reversal unless both main gears see
> weight-on-wheels. The landing happened with an extreme side wind keeping
> one side slightly lifted. Thus thrust reversal was blocked and the plane
> ran past the runway and crashed into an embankment, killing two people.
> This interlock had been implemented after a Lauda Air plane crashed in
> south-east Asia because the thrust reversal was activated in mid-air.
> This is why you normally have reviews of system requirements, and for every
> change you have to perform an impact analysis to prevent such oversights.
>
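For readers unfamiliar with the triplex arrangement being asked about here,
one common shape is mid-value (median) selection plus a disagreement monitor.
The sketch below is an illustration under that assumption only - the
parameter, names and threshold are invented, not taken from the QF72 report:

#include <math.h>
#include <stdbool.h>
#include <stdio.h>

/* Median of three values: a single wild channel is outvoted. */
double median3(double a, double b, double c)
{
    if ((a >= b && a <= c) || (a <= b && a >= c)) return a;
    if ((b >= a && b <= c) || (b <= a && b >= c)) return b;
    return c;
}

/* Vote a parameter from three redundant channels and flag any channel
 * that strays too far from the voted value (the limit here is made up). */
double vote3(const double ch[3], bool fault[3])
{
    const double limit = 5.0;
    double voted = median3(ch[0], ch[1], ch[2]);
    for (int i = 0; i < 3; i++)
        fault[i] = fabs(ch[i] - voted) > limit;
    return voted;
}

int main(void)
{
    double aoa[3] = { 2.1, 2.3, 50.0 };   /* one channel spiking */
    bool fault[3];
    printf("voted = %.1f, faults = %d %d %d\n",
           vote3(aoa, fault), fault[0], fault[1], fault[2]);
    return 0;
}

Such a voter handles one channel going wild, but - as the next post points
out - it cannot help when all channels faithfully implement the same flawed
specification, or when the handling of flagged-invalid data was never
specified at all.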
AT Monday 15 May 2017 22:48, David Brown wrote:

> On 15/05/17 15:58, Reinhardt Behm wrote:
>> AT Monday 15 May 2017 21:05, Dimiter_Popoff wrote:
>>
>>> On 14.5.2017 г. 15:03, Clifford Heath wrote:
>>>> This story is a doozy.
>>>> <http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>
>>>>
>>>
>>> Interesting reading, I read almost the whole article. If I start
>>> commenting on the quality of the software they must have on these planes
>>> I'll go into a rant predictable to all seasoned group members so
>>> I won't.
>>>
>>> But I am amazed that - as the article states - the board computer
>>> on these planes has priority over the pilots. This is not a bug, it
>>> is an error at the product concept level... If it is true of course,
>>> I still have difficulties believing that. The fact they did manage
>>> to gain some manual control hints towards "not quite true".
>>
>> I am doing airborne software myself.
>> You are correct. It is not a bug, it is a serious error in the requirements.
>> Software has to be written according to requirements and this is verified
>> (DO-178). If the sys-reqs are erroneous, the software is also incorrect.
>>
>
> What about the case where the pilot intentionally crashed the plane (as far
> as they could figure out afterwards, anyway)? He was mentally unstable and
> flew the plane into a mountain. If the onboard computers had had priority
> over the pilot in that case, there would have been no crash.
>
> It is far from easy to say who or what is most reliable in cases like
> this, and who or what should have priority. Automated systems can (if
> designed properly) be safer and more stable than humans - but on the
> other hand, humans are better at handling unexpected situations.
That was shown with the landing on the Potomac. That is a decision nobody
can really make. But some systems designers think their computers can be
better.

That makes me afraid of self-driving cars. I am not sure there are really
good software development procedures in place. All that "self-learning"
stuff, and the attitude of "we have big data, so we do not have to
understand it", is irresponsible to me.
> However, I thought that critical flight systems were made in triplicate,
> and if one goes bananas it gets outvoted and the other two run the plane?
> Did that not happen here?
Well, even if there were multiple systems, they behaved according to their
common specification. If that is incorrect, all systems implement the same
incorrect behaviour - correctly. Most problems are really in the sys-reqs.
The software development processes in DO-178 are good and try to guarantee
software that complies with the sys-reqs.

Also the most famous crash (Ariane-5) was not due to buggy software but
happened because the requirements for the module that led to the crash were
not correctly reviewed - the original reqs were not fit for the new
environment.

From the report about the Qantas flight it seems to me that nobody thought
about what should happen if the sensors report invalid data or signal
errors. The Air France flight from Brazil to Europe that crashed over the
Atlantic comes to mind; there the pitot tubes gave bad data because of
icing. It seems they have not learned from it.

--
Reinhardt
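A sketch of the gap described above, with entirely hypothetical names and
numbers: the "good data" path is specified and reviewed, while the behaviour
when channels flag their own data invalid falls out of the code rather than
out of a requirements decision:

#include <stdbool.h>
#include <stdio.h>

struct air_data {
    double aoa_deg;   /* angle of attack, degrees */
    bool   valid;     /* cleared by the sensor's own built-in test */
};

/* Specified and reviewed: average the channels that report valid data.
 * Unspecified: what to command when no channel is valid - here the code
 * silently falls back to the last good value, a behaviour that no
 * system-requirements review ever looked at. */
double select_aoa(const struct air_data ch[3], double last_good)
{
    double sum = 0.0;
    int n = 0;
    for (int i = 0; i < 3; i++) {
        if (ch[i].valid) {
            sum += ch[i].aoa_deg;
            n++;
        }
    }
    return (n > 0) ? sum / n : last_good;  /* <- the unreviewed branch */
}

int main(void)
{
    struct air_data ch[3] = {
        { 2.2, false }, { 2.4, false }, { 2.3, false }  /* all flagged invalid */
    };
    printf("selected AoA: %.1f\n", select_aoa(ch, 2.3));
    return 0;
}

Whether to hold the last value, drop to an alternate control law, or hand
the aircraft back to the crew is exactly the kind of question that belongs
in the sys-reqs and their impact analysis, not in an ad-hoc fallback.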
On 15/05/17 16:34, Reinhardt Behm wrote:
> Most problems are really in the sys-reqs.
It will be interesting to see whether the sys-reqs for self-driving cars
will require, forbid, ignore, or punt on whether the control system should
kill the driver to save N pedestrians. I suspect that, as they say, is an
area of active research.
On Sun, 14 May 2017 22:03:19 +1000, Clifford Heath wrote:

> This story is a doozy.
> <http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>

Airbus has always had a policy of believing that engineers can program a
blind computer to fly a plane better than a pilot with eyeballs. In
addition to this event, it has resulted in planes flying into mountainsides
because the envelope protection would not let the wings be stressed by the
extra G-forces the requested pull-up demanded.

I have friends who are Boeing engineers. Boeing's policy is that if you're
going to have a human pilot in the hot seat, you should trust him when the
chips are down. This makes for more impressive crashes when less than
well-trained pilots are flying the planes, but fewer crashes when you've
got good piloting.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com

I'm looking for work -- see my website!
On Sun, 14 May 2017 22:03:19 +1000, Clifford Heath
<clifford.heath@gmail.com> wrote:

>This story is a doozy.
><http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>
Instead of a newspaper, why not read the accident report? https://www.atsb.gov.au/publications/investigation_reports/2008/aair/ao-2008-070.aspx
On Mon, 15 May 2017 12:34:14 -0500, Tim Wescott
<seemywebsite@myfooter.really> wrote:

>On Sun, 14 May 2017 22:03:19 +1000, Clifford Heath wrote:
>
>> This story is a doozy.
>> <http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>
>
>Airbus has always had a policy of believing that engineers can program a
>blind computer to fly a plane better than a pilot with eyeballs. In
>addition to this event, it has resulted in planes flying into mountainsides
>because the envelope protection would not let the wings be stressed by the
>extra G-forces the requested pull-up demanded.
That's the hypothetical that's always trotted out; AFAIK, it's not actually happened. And flying into a mountainside in a modern, EGPWS equipped, airliner, would require immense levels of concentration to tune out the blaring alarm. Of course the guys who ran an early Sukhoi SSJ 100 into Mt. Salak a few years ago demonstrated humans can do just that ("We're in clouds and the computer says there are mountains? Must be a database problem!").
>I have friends who are Boeing engineers. Boeing's policy is that if you're
>going to have a human pilot in the hot seat, you should trust him when the
>chips are down. This makes for more impressive crashes when less than
>well-trained pilots are flying the planes, but fewer crashes when you've
>got good piloting.
OTOH, studies (in sims) have shown that Airbus pilots pull higher G's in
extreme avoidance maneuvers than pilots of aircraft without hard envelope
protections, for the simple reason that they don't worry about the "edge"
and can just haul back on the stick and get very near the maximum possible
performance.

In any event, it's a non-issue. The difference in available maneuver load
is actually fairly small, and most of the time, if you're so far gone that
you need to pull enough G's to bend the airframe, it's probably because you
already screwed up severely multiple times. And there are darn few crashes
of modern airliners, and almost all of the ones that do occur are fairly
unique.
