Friday, April 5, 2019
Boeing's Problem Is Not Software
Authored by Raul Ilargi Meijer via The Automatic Earth blog,
We had already been told that in the crash of Ethiopian Airlines flight ET302, which killed all 157 people on board, the four-month-old 737 MAX 8’s anti-stall software re-engaged itself four times in six minutes as the pilots struggled to control the plane after takeoff. In the end, the anti-stall software won and pushed the plane nose-down toward the earth. Now, Ethiopia (finally?!) has released its report on the March 10 crash:
Minister of Transport Dagmawit Moges said that the crew of the
Ethiopian Airlines flight from Addis Ababa to Nairobi on 10 March
“performed all the procedures repeatedly provided by the manufacturer
but were not able to control the aircraft.” As a result, investigations
have concluded that Boeing should be required to review the so-called
manoeuvring characteristics augmentation system on its 737 Max aircraft
before the jets are permitted to fly again, she said.

The results of the preliminary investigation led by Ethiopia’s
Accident Investigation Bureau and supported by European investigators
were presented by Ms Moges at a press conference in Addis Ababa on
Thursday morning.
Ethiopia is being kind to Boeing. But while the anti-stall software played a big role in what happened, Boeing’s assertion (hope?!) that a software fix is all that is needed to get the 737 MAXs back in the air around the globe rests on very shaky ground (no pun intended whatsoever).
737 MAX 8. The angle-of-attack (AOA) sensor is the lower device below the cockpit windshield on both sides of the fuselage. (Mike Siegel / The Seattle Times)
The Seattle Times ran an article on March 26 that explains a lot more
than all other articles on the topic combined. The paper of course
resides in Boeing’s backyard, but can that be the reason we haven’t seen
the article quoted all over?
If the assertions in the article are correct, it would appear that a
software fix is the least of Boeing’s problems. For one thing, it needs
to address serious hardware, not software, issues with its planes. For
another, the company had better hire a thousand of the world’s best lawyers
for all the lawsuits that will be filed against it.
Its cost-cutting endeavors may well be responsible for killing a
combined 346 people in the October 29 Lion Air crash and the Ethiopian
Airlines one. Get a class-action suit filed in the US and Boeing could
be fighting for survival.
Here’s what the Seattle Times wrote 9 days ago:

Lack Of Redundancies On Boeing 737 MAX System Baffles Some Involved In Developing The Jet
Boeing has long embraced the power of redundancy to protect its
jets and their passengers from a range of potential disruptions, from
electrical faults to lightning strikes. The company typically uses two
or even three separate components as fail-safes for crucial tasks to
reduce the possibility of a disastrous failure. Its most advanced
planes, for instance, have three flight computers that function
independently, with each computer containing three different processors
manufactured by different companies. So even some of the people
who have worked on Boeing’s new 737 MAX airplane were baffled to learn
that the company had designed an automated safety system that
abandoned the principles of component redundancy, ultimately entrusting
the automated decision-making to just one sensor — a type of sensor that
was known to fail.
That one paragraph alone is so potentially damaging it’s hard to fathom why everyone’s still discussing a software glitch.
Boeing’s rival, Airbus, has typically depended on three such sensors. “A single point of failure is an absolute no-no,” said
one former Boeing engineer who worked on the MAX, who requested
anonymity to speak frankly about the program in an interview with The
Seattle Times. “That is just a huge system engineering oversight. To
just have missed it, I can’t imagine how.” Boeing’s design made the
flight crew the fail-safe backup to the safety system known as the
Maneuvering Characteristics Augmentation System, or MCAS. The Times has
interviewed eight people in recent days who were involved in developing
the MAX, which remains grounded around the globe in the wake of two
crashes that killed a total of 346 people.
The Maneuvering Characteristics Augmentation System (MCAS) was a late addition that Boeing had not planned for initially. They wanted a plane so similar to older ones that no new training would be needed, but they fitted it with a much heavier engine, which is why MCAS was needed. As I wrote earlier today, they cut corners until there was no corner left. On hardware, on software, on pilot training (simulator), everything was done to be cheaper than Airbus.

The angle-of-attack (AOA) sensor of the 737 MAX is the bottom piece of equipment, just below the cockpit windshield. (Mike Siegel / The Seattle Times)
A faulty reading from an angle-of-attack sensor (AOA) — used to
assess whether the plane is angled up so much that it is at risk of
stalling — is now suspected in the October crash of a 737 MAX in
Indonesia, with data suggesting that MCAS pushed the aircraft’s nose
toward Earth to avoid a stall that wasn’t happening. Investigators have
said another crash in Ethiopia this month has parallels to the first.

Boeing has been working to rejigger its MAX software in recent
months, and that includes a plan to have MCAS consider input from both
of the plane’s angle-of-attack sensors, according to officials familiar
with the new design. “Our proposed software update incorporates
additional limits and safeguards to the system and reduces crew
workload,” Boeing said in a statement.

But one problem with
two-point redundancies is that if one sensor goes haywire, the plane may
not be able to automatically determine which of the two readings is
correct, so Boeing has indicated that the MCAS safety system will not function when the sensors record substantial disagreement.
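To illustrate the limitation the Times describes, here is a minimal sketch in Python. All names, thresholds and return values are my own assumptions for the sake of the example, not Boeing’s actual MCAS implementation. The point is simply this: a two-sensor system can detect a disagreement, but it has no way to tell which reading is wrong, so all it can do is shut itself off and hand the problem back to the crew.

    # Illustrative sketch only: sensor names, threshold and return values are
    # assumptions for the example, not Boeing's actual MCAS logic.

    DISAGREE_THRESHOLD_DEG = 5.5  # hypothetical allowable difference between the two AOA vanes

    def mcas_command_two_sensors(aoa_left, aoa_right, stall_aoa=14.0):
        """Return a trim command, or None if the system has to disable itself."""
        if abs(aoa_left - aoa_right) > DISAGREE_THRESHOLD_DEG:
            # The readings disagree. With only two sources there is no way to tell
            # which one failed, so the only safe option is to stand down and leave
            # the problem to the crew.
            return None
        aoa = (aoa_left + aoa_right) / 2.0
        return "trim_nose_down" if aoa > stall_aoa else "no_action"

    # A stuck left vane (74.5 degrees vs 2.1 degrees) is detected, and the system disables itself:
    print(mcas_command_two_sensors(74.5, 2.1))   # -> None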
The underlying idea is so basic and simple it hurts: safety comes in groups of three. Three
flight computers that function independently, with each computer
containing three different processors manufactured by different
companies, and three sensors. The logic behind this is so
overwhelming it’s hard to see how anyone but a sociopathic accountant
can even ponder ditching it.
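For contrast, here is an equally minimal sketch (again in Python, again purely illustrative and not any manufacturer’s real code) of the triple-redundant alternative: with three sensors, mid-value selection simply votes out a single bad reading and the system keeps working.

    # Illustrative sketch only: mid-value selection as used conceptually in
    # triple-redundant designs, not any manufacturer's actual code.

    def mid_value_select(a, b, c):
        """Return the median of three readings, so one wildly wrong sensor is voted out."""
        return sorted([a, b, c])[1]

    def mcas_command_three_sensors(aoa_readings, stall_aoa=14.0):
        """With three sensors a single failure changes nothing: the median stays valid."""
        aoa = mid_value_select(*aoa_readings)
        return "trim_nose_down" if aoa > stall_aoa else "no_action"

    # The same stuck vane (74.5 degrees) is simply out-voted by the two healthy ones:
    print(mcas_command_three_sensors([74.5, 2.1, 2.3]))   # -> no_action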
And then here come the clinchers:
Some observers, including the former Boeing engineer, think the safest option would be for Boeing to have a
third sensor to help ferret out an erroneous reading, much like the
three-sensor systems on the airplanes at rival Airbus. Adding that
option, however, could require a physical retrofit of the MAX.
See? It’s not a software issue. It’s hardware, and in all likelihood not just computer hardware either.
Clincher no. 2:
Andrew Kornecki, a former professor at Embry-Riddle Aeronautical
University who has studied redundancy systems in Airbus and Boeing
planes, said operating the automated system with one or two
sensors would be fine if all the pilots were sufficiently trained in how
to assess and handle the plane in the event of a problem. But,
he said, if he were designing the system from scratch, he would
emphasize the training while also building the plane with three sensors.
The professor is not 100% honest, I would think. There is zero reason
to opt for a two-sensor system, and 1001 reasons not to. It’s all just
about cost being more important than people. That last bit explains why
Boeing went there against better judgment:
[..] Boeing had been exploring the construction of an all-new airplane earlier this decade. But after American Airlines began discussing orders for a new plane from Airbus in 2011, Boeing abruptly changed course,
settling on the faster alternative of modifying its popular 737 into a
new MAX model.

Rick Ludtke, a former Boeing engineer who worked on
designing the interfaces on the MAX’s flight deck, said managers
mandated that any differences from the previous 737 had to be small
enough that they wouldn’t trigger the need for pilots to undergo new
simulator training.

That left the team working on an old architecture and layers of
different design philosophies that had piled on over the years, all to
serve an international pilot community that was increasingly expecting
automation. “It’s become such a kludge, that we started to speculate and
wonder whether it was safe to do the MAX,” Ludtke said.

Ludtke didn’t
work directly on the MCAS, but he worked with those who did. He said
that if the group had built the MCAS in a way that would depend
on two sensors, and would shut the system off if one fails, he thinks
the company would have needed to install an alert in the cockpit to make
the pilots aware that the safety system was off.
There you go: a two-sensor system is fundamentally unsound, and it’s therefore bonkers to even contemplate, let alone implement.
And if that happens, Ludtke said, the pilots would potentially
need training on the new alert and the underlying system. That could
mean simulator time, which was off the table. “The decision path they
made with MCAS is probably the wrong one,” Ludtke said. “It shows how
the airplane is a bridge too far.”