
QF72 story - wow

Started by Clifford Heath May 14, 2017
On 5/17/2017 11:13 AM, Reinhardt Behm wrote:
> AT Tuesday 16 May 2017 22:41, rickman wrote:
>
>> On 5/15/2017 11:34 AM, Reinhardt Behm wrote:
>>> AT Monday 15 May 2017 22:48, David Brown wrote:
>>>
>>>> On 15/05/17 15:58, Reinhardt Behm wrote:
>>>>> AT Monday 15 May 2017 21:05, Dimiter_Popoff wrote:
>>>>>
>>>>>> On 14.5.2017 г. 15:03, Clifford Heath wrote:
>>>>>>> This story is a doozy.
>>>>>>> <http://www.smh.com.au/good-weekend/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170510-gw26ae.html>
>>>>>>
>>>>>> Interesting reading, I read almost the whole article. If I start
>>>>>> commenting on the quality of the software they must have on these
>>>>>> planes I'll go into a rant predictable to all seasoned group members,
>>>>>> so I won't.
>>>>>>
>>>>>> But I am amazed that - as the article states - the board computer
>>>>>> on these planes has priority over the pilots. This is not a bug, it
>>>>>> is an error at the product concept level... If it is true, of course;
>>>>>> I still have difficulties believing that. The fact they did manage
>>>>>> to gain some manual control hints towards "not quite true".
>>>>>
>>>>> I am doing airborne software myself.
>>>>> You are correct. It is not a bug, it is a serious error in the
>>>>> requirements. Software has to be written according to requirements, and
>>>>> this is verified (DO-178). If the sys-reqs are erroneous, the software
>>>>> is also incorrect.
>>>>
>>>> What about that case of a plane crash because the pilot intentionally
>>>> crashed it (as far as they could figure out afterwards, anyway)? He was
>>>> mentally unstable and drove the plane into a mountain. If the onboard
>>>> computers had had priority over the pilot in that case, there would have
>>>> been no crash.
>>>>
>>>> It is far from easy to say who or what is most reliable in cases like
>>>> this, and who or what should have priority. Automated systems can (if
>>>> designed properly) be safer and more stable than humans - but on the
>>>> other hand, humans are better at handling unexpected situations.
>>>
>>> That was shown with the landing on the Potomac.
>>
>> I'm not sure if this is a joke. I believe you are thinking of the
>> "landing on the Hudson" in NY where everyone lived. The landing on the
>> Potomac was Air Florida Flight 90, where only four or five lived after
>> the plane struck the 14th Street bridge (also killing some drivers) in
>> January, plunging everyone into icy waters.
>
> You are right. I meant the landing on the Hudson. I should not post just
> before midnight from memory alone.
>
>> There were multiple causes of that accident, and most of them were pilot
>> errors, showing how poor humans are at keeping us safe.
>>
>>> That is a decision nobody can really make.
>>
>> ???
>>
>>> But some systems designers think their computers can be better. That
>>> makes me afraid of self-driving cars. I am not sure if there are really
>>> good software development procedures in place. All that "self-learning"
>>> stuff and "we have big data, so we do not have to understand it" is
>>> irresponsible to me.
>>
>> Don't worry about the details, consider the statistics. Computers only
>> need to be better than people, and so far they have been for driving cars.
>>
>> If you have a 1 in 1E6 chance of dying with a human pilot and a 1 in 1E7
>> chance of dying with a computer pilot, which will you choose?
>
> When sitting in that plane over NYC my preferences would be clear. Against
> all statistics.
This is such a perfect example of emotion ruling over logic. Thank you.
>>>> However, I thought that critical flight systems were made in triplicate,
>>>> and if one goes bananas it gets outvoted, and the other two run the
>>>> plane? Did that not happen here?
>>>
>>> Well, even if there were multiple systems, they behaved according to their
>>> common specification. If that is incorrect, all systems correctly behave
>>> incorrectly.
>>>
>>> Most problems are really in the sys-reqs. The software development
>>> processes in DO-178 are good and try to guarantee software that complies
>>> with the sys-reqs.
>>> Also the most famous crash (Ariane 5) was not due to buggy software but
>>> happened because the requirements for the module that led to the crash
>>> were not correctly reviewed - the original reqs were not fit for the new
>>> environment.
>>>
>>> From the report about the Qantas flight it seems to me that nobody
>>> thought about what should happen if the sensors report invalid data or
>>> signal errors.
>>> The Air France flight from Brazil to Europe that crashed over the Atlantic
>>> comes to mind. There the pitot tubes gave bad data because of icing. It
>>> seems they have not learned from it.
>>
>> These issues are not really "computer" issues, they are systems design
>> issues. I find it funny that people get so excited about systems flying
>> planes and driving cars, but no one thinks twice about systems running
>> nuclear power plants.
>>
>> In the east coast earthquake a few years back, the North Anna power plant
>> was about 10 miles from the epicenter. The plant received twice the
>> seismic impact it was designed for. The plant was taken offline and the
>> generators were fired up. After a few minutes one generator failed. A
>> subsequent analysis found the cause was faulty installation of a head
>> gasket, which in turn was caused by an incorrect installation procedure,
>> an installation procedure that was common to *all* the generators! This
>> was a single point of failure that could have resulted in a core
>> meltdown as a result of a bad system design.
>>
>> Why are people so unconcerned with nuclear power? Because we have not
>> had enough accidents in the US... yet!
>
> I am concerned about it. Having done some projects in nuclear plants, I know
> how thoughtlessly people act there, not even caring about defined procedures.
> The plants were not in Russia or some such, but in Germany. Some of the
> incidents never got reported. I am glad Germany is shutting down that stuff.
The problems with the nuclear industry that were revealed by the incident at the North Anna plant run deep. Did you know Dominion Power *lied* on their application for the North Anna reactors, not reporting the known existence of a nearby fault? They were caught and fined a mere $30,000. -- Rick C
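As an aside on the "made in triplicate ... gets outvoted" point quoted above: here is a minimal, hypothetical sketch in plain C (not code from any real ADIRU or Airbus system) of a 2-out-of-3 median voter. It shows why one wild channel gets masked, and also why redundancy cannot help when all three channels implement the same flawed requirement.

/* Hypothetical 2-out-of-3 sensor voter - illustration only, not any
 * real flight-control code. Three independent channels report the same
 * quantity (e.g. angle of attack); the median masks one wild value. */
#include <stdio.h>

static double median3(double a, double b, double c)
{
    if (a > b) { double t = a; a = b; b = t; }   /* ensure a <= b      */
    if (b > c) { double t = b; b = c; c = t; }   /* largest now in c   */
    if (a > b) { double t = a; a = b; b = t; }   /* median now in b    */
    return b;
}

int main(void)
{
    /* Channel 2 spikes to a nonsense value; the voter ignores it.     */
    double aoa[3] = { 2.1, 50.0, 2.3 };          /* degrees, made up   */
    printf("voted AoA = %.1f\n", median3(aoa[0], aoa[1], aoa[2]));
    /* Prints 2.3 - the spike is outvoted. But if all three channels
     * share the same wrong requirement, they agree with each other and
     * the vote passes the same wrong answer straight through. */
    return 0;
}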
On 17.5.2017 г. 18:54, rickman wrote:
> On 5/17/2017 11:13 AM, Reinhardt Behm wrote:
> ......
>>> Don't worry about the details, consider the statistics. Computers only
>>> need to be better than people and so far they have been for driving
>>> cars.
>>>
>>> If you have 1 in 1E6 chance of dying with a human pilot and a 1 in 1E7
>>> chance of dying with a computer pilot, which will you choose?
>>
>> When sitting in that plane over NYC my preferences would be clear.
>> Against
>> all statistics.
>
> This is such a perfect example of emotion ruling over logic. Thank you.
No, it is not. The statistics are simply based on incomplete or invalid data.

Here is an example why. Fewer people die in plane crashes per person*distance than in car crashes, this is known. However, the percentage of idiots among pilots is a lot lower than among drivers; I am sure the result would be different if the same comparison were made, say, for planes vs. buses on reputable bus lines.

Then there are nowhere near enough data to make the stats on human pilots vs. autopilots on similar enough planes; it will all be below the noise floor.

Completely automated planes are a good idea, a target which must be pursued, but humanity is not there yet, not by a long stretch. And given how *messy* today's software is, I don't see it done any time soon.

I know I would want to have access to all the controls - debugging included... - if I flew an autopiloted plane *I* have programmed - and my software is anything but messy.

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI             http://www.tgi-sci.com
------------------------------------------------------
http://www.flickr.com/photos/didi_tgi/
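To make the normalisation being argued about concrete, here is a tiny sketch with purely invented numbers (not real accident statistics): fatality counts only become comparable across transport modes once divided by exposure (person-kilometres), and with only a handful of events the statistical uncertainty swamps any difference - the "below the noise floor" point.

/* Illustration only - the numbers are invented, not real accident data. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double deaths_a = 4.0,    pkm_a = 5.0e11;   /* mode A: deaths, person-km */
    double deaths_b = 3000.0, pkm_b = 4.0e12;   /* mode B: deaths, person-km */

    double rate_a = deaths_a / pkm_a;           /* deaths per person-km      */
    double rate_b = deaths_b / pkm_b;

    /* With only 4 events the relative statistical uncertainty is roughly
     * 1/sqrt(n), i.e. about 50% (Poisson counting), so a small sample says
     * very little about which mode is actually safer. */
    printf("mode A: %.2e deaths/person-km  (+/- %.0f%%)\n",
           rate_a, 100.0 / sqrt(deaths_a));
    printf("mode B: %.2e deaths/person-km  (+/- %.0f%%)\n",
           rate_b, 100.0 / sqrt(deaths_b));
    return 0;
}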
On 5/17/2017 2:43 PM, Dimiter_Popoff wrote:
> On 17.5.2017 г. 18:54, rickman wrote:
>> On 5/17/2017 11:13 AM, Reinhardt Behm wrote:
>> ......
>>>> Don't worry about the details, consider the statistics. Computers only
>>>> need to be better than people and so far they have been for driving
>>>> cars.
>>>>
>>>> If you have 1 in 1E6 chance of dying with a human pilot and a 1 in 1E7
>>>> chance of dying with a computer pilot, which will you choose?
>>>
>>> When sitting in that plane over NYC my preferences would be clear.
>>> Against
>>> all statistics.
>>
>> This is such a perfect example of emotion ruling over logic. Thank you.
>
> No it is not. Just the statistics are based on incomplete or invalid
> data.
>
> Here is an example why. Fewer people die in plane crashes per
> person*distance than in car crashes, this is known.
> However the percentage of idiots among pilots is a lot lower than
> among drivers; I am sure the results will be different if the same
> comparison is made say for planes vs. buses on reputable bus lines.
You are mixing issues. If you compare cars vs. airplanes, that is cars vs. airplanes. What do buses have to do with anything?
> Then there are nowhere near enough data to make the stats on human
> pilots vs. autopilots on similar enough planes, it will all be below
> the noise floor.
And yet the data is there and it is clear.
> Completely automated planes are a good idea, a target which must be
> pursued, but humanity is not there yet, not by a long stretch.
> Then given how *messy* today's software is I don't see it done any
> time soon.
We don't know the future. But again, you are discussing your emotions, not the facts.
> I know I would want to have access to all the controls - debugging
> included... - if I flew an autopiloted plane *I* have programmed - and
> my software is anything but messy.
If you have no confidence in your code, I certainly don't. I'm staying off of Popoff airlines. -- Rick C
On 17.5.2017 г. 23:03, rickman wrote:
> On 5/17/2017 2:43 PM, Dimiter_Popoff wrote:
>> On 17.5.2017 г. 18:54, rickman wrote:
>>> On 5/17/2017 11:13 AM, Reinhardt Behm wrote:
>>> ......
>>>>> Don't worry about the details, consider the statistics. Computers
>>>>> only
>>>>> need to be better than people and so far they have been for driving
>>>>> cars.
>>>>>
>>>>> If you have 1 in 1E6 chance of dying with a human pilot and a 1 in 1E7
>>>>> chance of dying with a computer pilot, which will you choose?
>>>>
>>>> When sitting in that plane over NYC my preferences would be clear.
>>>> Against
>>>> all statistics.
>>>
>>> This is such a perfect example of emotion ruling over logic. Thank you.
>>
>> No it is not. Just the statistics are based on incomplete or invalid
>> data.
>>
>> Here is an example why. Fewer people die in plane crashes per
>> person*distance than in car crashes, this is known.
>> However the percentage of idiots among pilots is a lot lower than
>> among drivers; I am sure the results will be different if the same
>> comparison is made say for planes vs. buses on reputable bus lines.
>
> You are mixing issues. If you compare cars vs. airplanes, that is cars
> vs. airplanes. What do buses have to do with anything?
I am sure you see the point, you just choose denial.
>
>> Then there are nowhere near enough data to make the stats on human
>> pilots vs. autopilots on similar enough planes, it will all be below
>> the noise floor.
>
> And yet the data is there and it is clear.

No it is not.

>
>> Completely automated planes are a good idea, a target which must be
>> pursued, but humanity is not there yet, not by a long stretch.
>> Then given how *messy* today's software is I don't see it done any
>> time soon.
>
> We don't know the future. But again, you are discussing your emotions,
> not the facts.

Not at all; where did you see any emotion in that?

>
>> I know I would want to have access to all the controls - debugging
>> included... - if I flew an autopiloted plane *I* have programmed - and
>> my software is anything but messy.
>
> If you have no confidence in your code, I certainly don't. I'm staying
> off of Popoff airlines.

You are trying to be smart because you now understand you are wrong, OK.
I am sure you understand my point well enough - which is that I would
choose to have control even over my own software, which I deem safer than
anyone else's.

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI             http://www.tgi-sci.com
------------------------------------------------------
http://www.flickr.com/photos/didi_tgi/
On 5/17/2017 5:27 PM, Dimiter_Popoff wrote:
> On 17.5.2017 г. 23:03, rickman wrote:
>> On 5/17/2017 2:43 PM, Dimiter_Popoff wrote:
>>> On 17.5.2017 г. 18:54, rickman wrote:
>>>> On 5/17/2017 11:13 AM, Reinhardt Behm wrote:
>>>> ......
>>>>>> Don't worry about the details, consider the statistics. Computers
>>>>>> only
>>>>>> need to be better than people and so far they have been for driving
>>>>>> cars.
>>>>>>
>>>>>> If you have 1 in 1E6 chance of dying with a human pilot and a 1 in
>>>>>> 1E7
>>>>>> chance of dying with a computer pilot, which will you choose?
>>>>>
>>>>> When sitting in that plane over NYC my preferences would be clear.
>>>>> Against
>>>>> all statistics.
>>>>
>>>> This is such a perfect example of emotion ruling over logic. Thank
>>>> you.
>>>
>>> No it is not. Just the statistics are based on incomplete or invalid
>>> data.
>>>
>>> Here is an example why. Fewer people die in plane crashes per
>>> person*distance than in car crashes, this is known.
>>> However the percentage of idiots among pilots is a lot lower than
>>> among drivers; I am sure the results will be different if the same
>>> comparison is made say for planes vs. buses on reputable bus lines.
>>
>> You are mixing issues. If you compare cars vs. airplanes, that is cars
>> vs. airplanes. What do buses have to do with anything?
>
> I am sure you see the point, you just choose denial.
>
>>
>>> Then there are nowhere near enough data to make the stats on human
>>> pilots vs. autopilots on similar enough planes, it will all be below
>>> the noise floor.
>>
>> And yet the data is there and it is clear.
>
> No it is not.
>
>>
>>> Completely automated planes are a good idea, a target which must be
>>> pursued, but humanity is not there yet, not by a long stretch.
>>> Then given how *messy* today's software is I don't see it done any
>>> time soon.
>>
>> We don't know the future. But again, you are discussing your emotions,
>> not the facts.
>
> Not at all; where did you see any emotion in that?
>
>>
>>> I know I would want to have access to all the controls - debugging
>>> included... - if I flew an autopiloted plane *I* have programmed - and
>>> my software is anything but messy.
>>
>> If you have no confidence in your code, I certainly don't. I'm staying
>> off of Popoff airlines.
>
> You are trying to be smart because you now understand you are wrong, OK.
> I am sure you understand my point well enough - which is that I would
> choose to have control even over my own software, which I deem safer than
> anyone else's.
If you don't want to discuss the issue, why did you reply? -- Rick C
On Mon, 15 May 2017 21:58:19 +0800, Reinhardt Behm wrote:

> Similar to the crash of an Airbus in Warsaw. In that case it was an
> interlock that prevented the thrust reversal when not both wheels see
> weight-on-wheels. The landing happened with extreme side wind keeping one
> side slightly lifted. Thus thrust reversal was blocked and the plane ran
> past the runway and crashed into the ground, killing most passengers.
Thankfully, only one passenger and one crew member died; most survived. https://en.wikipedia.org/wiki/Lufthansa_Flight_2904
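To make the Warsaw interlock concrete, here is a hypothetical sketch in plain C (not the actual Airbus logic, which also involves wheel spin-up and other conditions): if reverse thrust is permitted only when both main-gear squat switches report weight-on-wheels, a crosswind landing that keeps one gear leg lightly loaded keeps the interlock closed even though the aircraft is on the runway. The code meets its specification; the specification did not anticipate that landing.

/* Hypothetical illustration of the interlock discussed above - not real
 * Airbus logic. */
#include <stdbool.h>
#include <stdio.h>

static bool reverse_thrust_allowed(bool left_wow, bool right_wow)
{
    return left_wow && right_wow;    /* the requirement as specified */
}

int main(void)
{
    /* One main gear touches down firmly, the other stays unloaded
     * because of the crosswind technique: reverse thrust stays blocked. */
    printf("reverse allowed: %d\n",
           (int)reverse_thrust_allowed(true, false));   /* prints 0 */
    return 0;
}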