QF72 story - wow

Started by Clifford Heath May 14, 2017
On Mon, 15 May 2017 23:34:21 +0800, Reinhardt Behm
<rbehm@hushmail.com> wrote:

>AT Monday 15 May 2017 22:48, David Brown wrote:
>
>> On 15/05/17 15:58, Reinhardt Behm wrote:
>>> AT Monday 15 May 2017 21:05, Dimiter_Popoff wrote:
>>>
>>>> On 14.5.2017 г. 15:03, Clifford Heath wrote:
>>>>> This story is a doozy.
>>>>> <http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>
>>>>>
>>>>
>>>> Interesting reading, I read almost the whole article. If I start
>>>> commenting on the quality of the software they must have on these planes
>>>> I'll go into a rant predictable to all seasoned group members so
>>>> I won't.
>>>>
>>>> But I am amazed that - as the article states - the board computer
>>>> on these planes has priority over the pilots. This is not a bug, it
>>>> is an error at the product concept level... If it is true of course,
>>>> I still have difficulties believing that. The fact they did manage
>>>> to gain some manual control hints towards "not quite true".
>>>
>>> I am doing airborne software myself.
>>> You are correct. It is not a bug, it is a serious error in the
>>> requirements. Software has to be written according to requirements and
>>> this is verified (DO-178). If the sys-reqs are erroneous the software is
>>> also incorrect.
>>>
>>
>> What about that case of a plane crash because the pilot intentionally
>> crashed it (as far as they could figure out afterwards, anyway)? He was
>> mentally unstable and drove the plane into a mountain. If the onboard
>> computers had had priority over the pilot in that case, there would have
>> been no crash.
>>
>> It is far from easy to say who or what is most reliable in cases like
>> this, and who or what should have priority. Automated systems can (if
>> designed properly) be safer and more stable than humans - but on the
>> other hand, humans are better at handling unexpected situations.
>
>That was shown with the landing on the Potomac.
>
>That is a decision nobody can really make.
>
>But some systems designers think their computers can be better. That makes
>me afraid of self-driving cars. I am not sure if there are really good
>software development procedures in place. All that "self-learning" stuff and
>"we have big data so we do not have to understand it" is irresponsible to
>me.
>
>> However, I thought that critical flight systems were made in triplicate,
>> and if one goes bananas it gets outvoted, and the other two run the
>> plane? Did that not happen here?
>
>Well, even if there were multiple systems they behaved according to their
>common specifications. If these are incorrect, all systems behave correctly
>incorrectly.
>
>Most problems are really in the sys-reqs. The software development
>processes in DO-178 are good and try to guarantee software that complies with
>the sys-reqs.
>Also the most famous crash (Ariane-5) was not due to buggy software but
>happened because the requirements for the module that led to the crash were
>not correctly reviewed - the original reqs were not fit for the new
>environment.
As far as I understand, they used Ariane-4 flight control software, which caused an integer overflow on Ariane-5 due to more dynamic movement. I was happy that our satellite was not on the first launch of Ariane-5 as originally planned :-).
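For readers who have not seen the inquiry report: the failing module was inertial reference software reused from Ariane-4, and the trigger was a narrowing conversion whose input range had only been justified against Ariane-4 trajectories. The real code was Ada and trapped on the overflow; the C below is only a rough illustration of that failure class, with invented names and values, next to the range check the new environment would have needed.

/* Illustrative sketch only (the real SRI code was Ada, not C): the failure
 * class was an unprotected narrowing conversion whose input range was only
 * validated against Ariane-4 trajectories.  Values here are made up. */
#include <stdio.h>
#include <stdint.h>

/* Narrow a 64-bit float to a signed 16-bit value, as the reused module did. */
static int16_t narrow_unchecked(double horizontal_bias)
{
    return (int16_t)horizontal_bias;   /* undefined/garbage if out of range */
}

/* What a range check for the new environment would have looked like. */
static int narrow_checked(double horizontal_bias, int16_t *out)
{
    if (horizontal_bias > INT16_MAX || horizontal_bias < INT16_MIN)
        return -1;                     /* out of range for this vehicle */
    *out = (int16_t)horizontal_bias;
    return 0;
}

int main(void)
{
    double ariane4_like = 20000.0;     /* fits in int16_t: fine */
    double ariane5_like = 64000.0;     /* faster trajectory: overflows */
    int16_t v;

    printf("unchecked: %d\n", narrow_unchecked(ariane4_like));
    if (narrow_checked(ariane5_like, &v) != 0)
        printf("checked: value out of range, flagged instead of trapping\n");
    return 0;
}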
On Tue, 16 May 2017 09:06:40 +0200, David Brown
<david.brown@hesbynett.no> wrote:

>On 15/05/17 17:34, Reinhardt Behm wrote:
>> AT Monday 15 May 2017 22:48, David Brown wrote:
[...]
>> But some systems designers think their computers can be better. That makes
>> me afraid of self-driving cars. I am not sure if there are really good
>> software development procedures in place. All that "self-learning" stuff and
>> "we have big data so we do not have to understand it" is irresponsible to
>> me.
>
>At least with self-driving cars, if the systems detect a problem they
>can stop the car at the side of the road and call the AA. That is not
>so easy in a plane!
>
>>> However, I thought that critical flight systems were made in triplicate,
>>> and if one goes bananas it gets outvoted, and the other two run the
>>> plane? Did that not happen here?
>>
>> Well, even if there were multiple systems they behaved according to their
>> common specifications. If these are incorrect, all systems behave correctly
>> incorrectly.
>
>So you need multiple sensors too.
Not only that: beyond the usual 2-3 redundant sensors of the same type, you also need sensors with different working principles.
>And multiple people writing specifications independently, before they
>are combined (and then checked by multiple people).
That helps.
>I fully agree with you - it is often the specifications and system
>requirements that are the issue.
That is one issue as is getting _reliable_ sensor data.
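To make the voting point concrete: a 2-out-of-3 voter is only as good as the validity of its inputs and the common spec behind all three channels. Below is a rough C sketch (invented names, not taken from any real flight system) of median-style voting with per-channel validity flags; if all three channels implement the same wrong assumption, the vote still passes.

/* Rough sketch, not from any real avionics code: "2-of-3" voting over three
 * redundant channels with per-channel validity flags.  If every channel
 * implements the same flawed spec, the vote is unanimous and still wrong. */
#include <stdio.h>
#include <stdbool.h>

struct channel {
    double value;   /* e.g. angle of attack in degrees */
    bool   valid;   /* set false by the source on a self-detected failure */
};

/* Returns true and writes the voted value if at least two valid channels
 * agree within 'tolerance'; returns false so the consumer can fall back
 * (e.g. drop to a degraded control law) instead of using bad data. */
bool vote_2oo3(const struct channel ch[3], double tolerance, double *out)
{
    for (int i = 0; i < 3; i++) {
        for (int j = i + 1; j < 3; j++) {
            if (ch[i].valid && ch[j].valid &&
                ch[i].value - ch[j].value <= tolerance &&
                ch[j].value - ch[i].value <= tolerance) {
                *out = 0.5 * (ch[i].value + ch[j].value);
                return true;
            }
        }
    }
    return false;   /* no agreeing pair: do not pretend we have good data */
}

int main(void)
{
    struct channel aoa[3] = {
        { 2.1, true }, { 2.3, true }, { 50.8, true }   /* one channel spiking */
    };
    double v;
    if (vote_2oo3(aoa, 1.0, &v))
        printf("voted AoA: %.1f deg\n", v);
    else
        printf("no agreement, degrade\n");
    return 0;
}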
On 16.5.2017 г. 19:10, upsidedown@downunder.com wrote:
> On Mon, 15 May 2017 16:05:11 +0300, Dimiter_Popoff <dp@tgi-sci.com>
> wrote:
[...]
>> But I am amazed that - as the article states - the board computer
>> on these planes has priority over the pilots. This is not a bug, it
>> is an error at the product concept level... If it is true of course,
>> I still have difficulties believing that. The fact they did manage
>> to gain some manual control hints towards "not quite true".
>
> Look at the AF296 crash at Mulhouse, when after a low fly-by the plane
> dropped into the trees on the hill at the end of the runway. The
> automation prevented the pilot from pulling too high a nose-up attitude,
> which would have caused a stall into the ground and a lot of fatalities.
> In this case, the plane dropped below the tree line softly with a small
> number of fatalities.
>
Don't get me wrong, I am anything but against automation.

In the situation you describe, limiting the impact of the pilot's commands can be compared with, say, current limiting the output of a power supply; nothing wrong with limiting it such that it won't get fried because someone turned a knob too far.

Then all the large airplane controls nowadays are, I imagine, driven by servos etc.; nobody in his right mind would allow pilot access to the firmware of these things which, say, controls the PWM of the servos. But above a certain level the pilot should be able to disable some autopilot thing which causes nosedives because it went bananas.

BTW there is a nice short story from about 50 years ago by Lem, "Ananke", built on that kind of stuff (just a nice read, so I am mentioning it for those who are lucky enough not to have read it yet and can enjoy it now; one of his best IMO).

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI             http://www.tgi-sci.com
------------------------------------------------------
http://www.flickr.com/photos/didi_tgi/
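The current-limiter analogy is easy to sketch. Assuming a toy model (invented names, nothing like the actual Airbus control laws), a protection that clamps a commanded value to an envelope is harmless as long as the envelope and its inputs are trustworthy; the design question raised above is whether the crew can switch the clamp off when they are not.

/* Toy sketch of the "current limiter" analogy - not the actual Airbus logic.
 * Protections that clamp a command are fine while their inputs are sane;
 * the design question is who may turn the clamp off when they are not. */
#include <stdio.h>

struct envelope {
    double lo;
    double hi;
    int    enabled;   /* crew-selectable in this toy model */
};

static double clamp(double x, double lo, double hi)
{
    if (x < lo) return lo;
    if (x > hi) return hi;
    return x;
}

/* Pilot command passes through the protection only while it is enabled. */
static double apply_protection(double pilot_cmd, const struct envelope *env)
{
    if (!env->enabled)
        return pilot_cmd;                         /* direct law: pilot has authority */
    return clamp(pilot_cmd, env->lo, env->hi);
}

int main(void)
{
    struct envelope aoa = { -5.0, 10.0, 1 };      /* made-up angle-of-attack limits */
    printf("clamped: %.1f\n", apply_protection(15.0, &aoa));   /* -> 10.0 */
    aoa.enabled = 0;
    printf("direct:  %.1f\n", apply_protection(15.0, &aoa));   /* -> 15.0 */
    return 0;
}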
On Tue, 16 May 2017 19:30:38 +0300, upsidedown@downunder.com wrote:

>On Mon, 15 May 2017 23:34:21 +0800, Reinhardt Behm
><rbehm@hushmail.com> wrote:
[...]
>>Also the most famous crash (Ariane-5) was not due to buggy software but
>>happened because the requirements for the module that led to the crash were
>>not correctly reviewed - the original reqs were not fit for the new
>>environment.
>
>
>As far as I understand, they used Ariane-4 flight control software,
>which caused an integer overflow on Ariane-5 due to more dynamic
>movement.
Well, they adapted the Ariane-4 software (working code, and all that). What makes this failure especially annoying is that the -5 no longer actually needed the result of the calculation that triggered the overflow.
On Tue, 16 May 2017 18:54:15 +0300, upsidedown@downunder.com wrote:

>On Sun, 14 May 2017 22:03:19 +1000, Clifford Heath
><clifford.heath@gmail.com> wrote:
>
>>This story is a doozy.
>><http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>
>
>A much better factual description can be found in the Aviation Herald:
>http://www.avherald.com/h?article=40de5374/0006&opt=0
>
>The referenced Air France 447 crash into the Atlantic was not caused by
>automation but by inexperienced co-pilots (the captain was sleeping) who
>manually stalled the plane all the way from 10 km down to the sea after
>frozen pitot tubes left them with no reliable speed info.
>
>As a general observation, one should not rely on a single method of
>redundancy. Using multiple sensors of the same type doesn't always
>help. This was clearly shown at Fukushima, where all the diesel emergency
>power generators got wet due to the tsunami.
Unfortunately no one has invented anything better than a pitot* as an air speed probe yet. There are a few alternatives**, but all have major issues (they have a lot more modes where they don't return valid data), and at least some of them are likely to be worse in the presence of icing.

I've thought that something like a doppler radar reading the true relative motion of air molecules some distance ahead of the aircraft (enough to be thoroughly out of the flow field distortions caused by the aircraft) would be a really nice backup to the conventional air flow sensors, but unless you can assume the presence of considerable moisture (and you can't), such things are very delicate and cantankerous beasts - hardly what you'd want as a primary input to your flight control systems.

*They are fundamentally very simple, and work very well within their design range. They are susceptible to icing (hence most of the ones intended for aircraft that might fly in such conditions have big heaters attached to them), and to errors due to the air flow being too far off center. They also have issues with transonic and supersonic flows, but those can be compensated for.

**The stealth F-117 had a rather interesting pressure probe (conventional pitot tubes may not quite be corner reflectors, but they're pretty close) with a faceted front, and used pressure differences over some very small orifices to produce something akin to airspeed and alpha data. And in modes where reliable data really mattered (like takeoff and landing), the F-117 cranked a "real" pitot tube out from a covered compartment.
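For reference, the arithmetic fed by the pitot/static pair is simple; the sketch below uses the standard incompressible approximation with rounded constants (real air data computers also correct for compressibility, altitude and probe position error). It also shows why icing is so nasty: a blocked probe still yields a perfectly plausible-looking number, just the wrong one.

/* Standard incompressible-flow approximation: v = sqrt(2 * q / rho), where
 * q is dynamic pressure (total minus static) and rho is air density.
 * Rounded constants, illustration only. */
#include <math.h>
#include <stdio.h>

static double airspeed_mps(double p_total_pa, double p_static_pa, double rho)
{
    double q = p_total_pa - p_static_pa;   /* dynamic pressure, Pa */
    if (q < 0.0)
        q = 0.0;                           /* a blocked probe can even go "negative" */
    return sqrt(2.0 * q / rho);
}

int main(void)
{
    double rho = 1.225;                    /* sea-level air density, kg/m^3 */
    /* Healthy probe: ~3 kPa of dynamic pressure is roughly 70 m/s. */
    printf("healthy: %.1f m/s\n", airspeed_mps(104325.0, 101325.0, rho));
    /* Iced probe trapping pressure: the result looks plausible but is wrong. */
    printf("iced:    %.1f m/s\n", airspeed_mps(102325.0, 101325.0, rho));
    return 0;
}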
On Tue, 16 May 2017 10:47:14 -0400
rickman <gnuarm@gmail.com> wrote:

> Which is true! Pilot error is the single biggest cause of
> fatalities in plane accidents.
If the actual causes were 50/50, but some pilots died, I suspect that
lack of evidence would bias the results.

Jan Coombs
>
> How do you decide when the chips are down? I'd like to see
> the statistics you are quoting.
>
> --
>
> Rick C
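That evidence-bias argument can be illustrated with a toy simulation (the numbers are invented and have no connection to real accident statistics): if some fraction of system-caused fatal crashes get filed as pilot error simply because the pilot is not around to testify, the published split drifts well away from a true 50/50.

/* Toy model of the evidence-bias argument - invented numbers, not real
 * accident statistics.  True causes are 50/50 pilot/system, but a fraction
 * of fatal system-caused crashes get attributed to the (dead) pilot. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const int    crashes        = 100000;
    const double misattribution = 0.3;   /* assumed fraction, purely illustrative */
    int blamed_pilot = 0;

    srand(1);
    for (int i = 0; i < crashes; i++) {
        int pilot_caused = rand() % 2;                 /* true cause: 50/50 */
        if (pilot_caused)
            blamed_pilot++;
        else if ((double)rand() / RAND_MAX < misattribution)
            blamed_pilot++;                            /* no testimony, blame the pilot */
    }
    printf("true pilot-caused share: 50%%, reported: %.0f%%\n",
           100.0 * blamed_pilot / crashes);
    return 0;
}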
AT Wednesday 17 May 2017 17:31, Jan Coombs wrote:

> On Tue, 16 May 2017 10:47:14 -0400
> rickman <gnuarm@gmail.com> wrote:
>
>> Which is true! Pilot error is the single biggest cause of
>> fatalities in plane accidents.
>
> If the actual causes were 50/50, but some pilots died,
That helps natural selection.

--
Reinhardt
On 5/17/2017 10:20 AM, Reinhardt Behm wrote:
> AT Wednesday 17 May 2017 17:31, Jan Coombs wrote:
>
>> On Tue, 16 May 2017 10:47:14 -0400
>> rickman <gnuarm@gmail.com> wrote:
>>
>>> Which is true! Pilot error is the single biggest cause of
>>> fatalities in plane accidents.
>>
>> If the actual causes were 50/50, but some pilots died,
>
> That helps natural selection.
Yes, it helps to eliminate the homo sapiens who think man can fly!

--

Rick C
AT Wednesday 17 May 2017 00:30, upsidedown@downunder.com wrote:

> On Mon, 15 May 2017 23:34:21 +0800, Reinhardt Behm
> <rbehm@hushmail.com> wrote:
[...]
>>Also the most famous crash (Ariane-5) was not due to buggy software but
>>happened because the requirements for the module that led to the crash
>>were not correctly reviewed - the original reqs were not fit for the new
>>environment.
>
>
> As far as I understand, they used Ariane-4 flight control software,
> which caused an integer overflow on Ariane-5 due to more dynamic
> movement.
>
> I was happy that our satellite was not on the first launch of Ariane-5
> as originally planned :-).
Correct. But somebody should have checked the specs of this old module to
see whether they fit the new environment.

--
Reinhardt
AT Tuesday 16 May 2017 22:41, rickman wrote:

> On 5/15/2017 11:34 AM, Reinhardt Behm wrote:
>> AT Monday 15 May 2017 22:48, David Brown wrote:
[...]
>>> It is far from easy to say who or what is most reliable in cases like
>>> this, and who or what should have priority. Automated systems can (if
>>> designed properly) be safer and more stable than humans - but on the
>>> other hand, humans are better at handling unexpected situations.
>>
>> That was shown with the landing on the Potomac.
>
> I'm not sure if this is a joke. I believe you are thinking of the
> "landing on the Hudson" in NY where everyone lived. The landing on the
> Potomac was Air Florida flight 90 where only four or five lived after
> the plane struck the 14th Street bridge (also killing some drivers) in
> January, plunging everyone into icy waters.
You are right. I meant the landing on the Hudson. I should not post just before midnight from memory alone.
>
> There were multiple causes of this accident and most of them were pilot
> errors, showing how poor humans are at keeping us safe.
>
>
>> That is a decision nobody can really make.
>
> ???
>
>
>> But some systems designers think their computers can be better. That
>> makes me afraid of self-driving cars. I am not sure if there are really
>> good software development procedures in place. All that "self-learning"
>> stuff and "we have big data so we do not have to understand it" is
>> irresponsible to me.
>
> Don't worry about the details, consider the statistics. Computers only
> need to be better than people, and so far they have been for driving cars.
>
> If you have a 1 in 1E6 chance of dying with a human pilot and a 1 in 1E7
> chance of dying with a computer pilot, which will you choose?
When sitting in that plane over NYC my preferences would be clear. Against all statistics.
>
[...]
>> From the report about the Qantas flight it seems to me that nobody
>> thought about what should happen if the sensors report invalid data or
>> signal errors.
>> The Air France flight from Brazil to Europe that crashed over the Atlantic
>> comes to mind. There the pitot tubes gave bad data because of icing. Seems
>> they have not learned from it.
>
> These issues are not really "computer" issues, they are systems design
> issues. I find it funny that people get so excited about systems flying
> planes and driving cars, but no one thinks twice about systems running
> nuclear power plants.
>
> In the east coast earthquake a few years back the North Anna power plant
> was about 10 miles from the epicenter. The plant received twice the
> seismic impact it was designed for. The plant was taken offline and the
> generators were fired up. After a few minutes one generator failed. A
> subsequent analysis found the cause was faulty installation of a head
> gasket, which in turn was caused by an incorrect installation procedure -
> an installation procedure that was common to *all* the generators! This
> was a single point of failure that could have resulted in a core
> meltdown as a result of a bad system design.
>
> Why are people so unconcerned with nuclear power? Because we have not
> had enough accidents in the US... yet!
I am concerned about it. Having done some projects in nuclear plants, I know
how thoughtlessly people act there, not even caring about defined procedures.
The plants were not in Russia or some such, but in Germany. Some of the
incidents never got reported. I am glad Germany is shutting down that stuff.

--
Reinhardt
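The North Anna generator story above is the same common-mode lesson as Fukushima's drowned diesels and the common sys-reqs point earlier in the thread: redundancy arithmetic only holds while failures are independent. A back-of-the-envelope sketch with made-up probabilities:

/* Back-of-the-envelope, made-up probabilities: N redundant generators look
 * great while failures are independent, but a shared installation procedure
 * (or a shared spec) collapses them into a single point of failure. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double p_single = 1e-2;   /* assumed per-generator failure on demand */
    const int    n        = 3;
    const double p_common = 1e-3;   /* assumed chance the shared procedure is wrong */

    double p_all_independent = pow(p_single, n);                 /* 1e-6 */
    double p_all_with_common = p_common +
                               (1.0 - p_common) * p_all_independent;

    printf("all %d fail, independent:       %.1e\n", n, p_all_independent);
    printf("all %d fail, with common cause: %.1e\n", n, p_all_with_common);
    return 0;
}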