
LiberalArkie

(15,707 posts)
Thu Apr 9, 2020, 12:53 PM Apr 2020

THE ANCIENT COMPUTERS IN THE BOEING 737 MAX ARE HOLDING UP A FIX

A brand-new Boeing 737 Max gets built in just nine days. In that time, a team of 12,000 people turns a loose assemblage of parts into a finished $120 million airplane with some truly cutting-edge technology: winglets based on ones designed by NASA, engines that feature the world’s first one-piece carbon-fiber fan blades, and computers with the same processing power as, uh, the Super Nintendo.

The Max has been grounded since March 2019, after some badly written software caused two crashes that killed 346 people. And while Boeing has received plenty of scrutiny for its bad code, it’s the Max’s computing power — or lack thereof — that has kept it on the ground since then.

Every 737 Max has two flight control computers. These take some of the workload off of pilots, whether that’s through full automation (such as autopilot) or through fine control adjustments during manual flight. These computers can literally fly the airplane — they have authority over major control surfaces and throttles — which means that any malfunction could turn catastrophic in a hurry. So it’s more important for manufacturers to choose hardware that’s proven to be safe, rather than run a fleet of airplanes on some cutting-edge tech with bugs that have yet to be worked out.

Boeing took that ethos to heart for the Max, sticking with the Collins Aerospace FCC-730 series, first built in 1996. Each computer features a pair of single-core, 16-bit processors that run independently of each other, which reduces computing power but also keeps a faulty processor from taking down the entire system.

Snip

https://www.theverge.com/2020/4/9/21197162/boeing-737-max-software-hardware-computer-fcc-crash


hunter

(38,309 posts)
3. There's nothing wrong with "ancient computers."
Thu Apr 9, 2020, 01:41 PM
Apr 2020

The KISS principle is sound.

The 737 Max went horribly wrong when they decided they could make that Frankenstein's bird fly on the cheap with a few software patches.

Boeing disconnected management from their engineers. That's where they failed.



LAS14

(13,777 posts)
4. Is this the computer that's holding up the fix?
Thu Apr 9, 2020, 01:45 PM
Apr 2020

If so, how/why?

Boeing took that ethos to heart for the Max, sticking with the Collins Aerospace FCC-730 series, first built in 1996. Each computer features a pair of single-core, 16-bit processors that run independently of each other, which reduces computing power but also keeps a faulty processor from taking down the entire system.

zonemaster

(232 posts)
5. I say - prove that that decision is a problem
Thu Apr 9, 2020, 01:47 PM
Apr 2020

Other than possibly locking you into a supplier that has to keep an old design around, from a computing perspective you don't need a processor that is cutting edge. In fact, for the reasons cited, in a critical control system you DON'T WANT something cutting edge. You want something that's sufficient for the job, is robust as hell, has completely predictable execution, has been proven over a long time, and is and will remain obtainable for purchase over time. You don't need giant multi-level caches, multi-threading, multi-core, speculative execution, etc.

A human flying an airplane is making control inputs no faster than about 10 times a second, and the flight surfaces of a commercial airliner certainly don't need, and can't respond to, inputs much faster than that anyway. The visual system is the most sophisticated thing a human has to offer, and airplane control systems are not taking visual inputs to make flight control decisions, to my knowledge. Other than turbine control, an input update rate of 100 times per second is enough, so a 16-bit processor running at a few MHz is plenty.
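
For a rough sense of that margin, here is a back-of-envelope sketch in C; the clock rate, loop rate, and cycles-per-instruction figures are illustrative assumptions, not FCC-730 specs.

/* Cycle-budget sketch for a fixed-rate control loop.
 * All numbers are illustrative assumptions, not FCC-730 specifications. */
#include <stdio.h>

int main(void)
{
    const double clock_hz   = 4e6;   /* assumed "a few MHz" CPU clock         */
    const double frame_hz   = 100.0; /* assumed 100 Hz control-loop rate      */
    const double cyc_per_op = 4.0;   /* assumed cycles per simple instruction */

    double cycles_per_frame = clock_hz / frame_hz;            /* 40,000 cycles */
    double ops_per_frame    = cycles_per_frame / cyc_per_op;  /* ~10,000 ops   */

    printf("cycles per control frame: %.0f\n", cycles_per_frame);
    printf("instructions per control frame: ~%.0f\n", ops_per_frame);
    return 0;
}

Even with those deliberately modest numbers, that's on the order of ten thousand instructions available every frame to read sensors and update the control laws.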

rickford66

(5,522 posts)
6. The software wasn't "badly written"
Thu Apr 9, 2020, 02:19 PM
Apr 2020

It did what it was "told" to do. The new mods were poorly thought out and not tested thoroughly. The computers are perfectly adequate, as stated by others here. Some hardware failed (the AOA sensor), which exposed the poorly designed mod. Offline testing may not have caught this problem, but a simulator with the Red Label boxes (final S/W but not certified for flight) may have. Multiple malfunctions would have to be inserted simultaneously (AOA sensor fail, stab trim creep, etc.).

LiberalArkie

(15,707 posts)
7. The computers might have been too slow to handle all requests from the subsystems
Thu Apr 9, 2020, 03:06 PM
Apr 2020

The software may have been tested on a 64-bit processor running an emulator for that CPU, which was probably faster than the factory-built unit.

rickford66

(5,522 posts)
8. From my experience on simulators
Thu Apr 9, 2020, 04:39 PM
Apr 2020

The most sophisticated avionics I stimulated didn't read sensor data super fast. ARINC 429 signals, for instance, aren't high speed: maybe 1 to 60 updates per second depending on the particular data sent. Our sims ran at 60 per second and the avionics were happy. I know some avionics run as slow as 20. The few avionics in development I worked on ran at 60 or less. I've been retired for a few years and am willing to be updated about these speeds.
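
For scale, here is a quick sketch of how little traffic those rates imply on a low-speed ARINC 429 bus. The bus bit rate and 32-bit word size are standard 429 figures; the label names and refresh rates are made-up examples.

/* Rough ARINC 429 bandwidth sketch: words per second available on a
 * low-speed bus versus a handful of labels refreshed at typical sim rates.
 * The label list and refresh rates below are hypothetical examples. */
#include <stdio.h>

int main(void)
{
    const double bus_bit_rate  = 12500.0; /* low-speed ARINC 429, bits/s */
    const double bits_per_word = 32.0;    /* one 429 data word           */

    double words_per_sec = bus_bit_rate / bits_per_word;  /* ~390 words/s */

    /* hypothetical labels and refresh rates in Hz */
    const char  *label[4] = { "AOA", "airspeed", "altitude", "heading" };
    const double rate[4]  = { 20.0, 16.0, 16.0, 10.0 };

    double used = 0.0;
    for (int i = 0; i < 4; i++) {
        printf("%-9s %5.0f words/s\n", label[i], rate[i]);
        used += rate[i];
    }

    printf("bus capacity: ~%.0f words/s, these labels use %.1f%% of it\n",
           words_per_sec, 100.0 * used / words_per_sec);
    return 0;
}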

LiberalArkie

(15,707 posts)
9. Remembering that the CPUs are single-core, probably single-threaded, one thing at a time
Thu Apr 9, 2020, 04:53 PM
Apr 2020

probably interrupt-driven. I am thinking how easy it was to lose bits on serial ports back then.

rickford66

(5,522 posts)
10. The s/w within the boxes I worked on
Thu Apr 9, 2020, 05:33 PM
Apr 2020

was sequential, maybe 30 or 60 cycles per second, and after each cycle some low-priority work was done in the background (whatever time was left at the end of the frame). I don't remember any serial interfaces such as PC serial on the actual boxes. A lot of 429 and similar ones, if that's what you mean, but 429 has lots of error checking. I'm not an expert on the receivers, but I imagine they're buffered, so that a valid word, all the bits, would be read at once. Anyway, the 737 problem had nothing to do with the computer hardware. The s/w depended upon one sensor that failed, processed the bad data, and the results fell out. There should have been two sensor inputs, or better, three, so a reasonable voting algorithm could correct the trim. I'll guarantee the engineers were overruled on this design.
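
Something like a median-of-three vote is all that voting would take. Here is a minimal sketch; the miscompare threshold and the readings are invented for illustration, and this is not how the actual MCAS fix is implemented.

/* Median-of-three sensor voting sketch. The disagreement threshold and
 * the example AOA readings below are illustrative assumptions only. */
#include <stdio.h>
#include <math.h>

#define DISAGREE_LIMIT_DEG 5.0   /* assumed miscompare threshold */

/* return the median of three values */
static double median3(double a, double b, double c)
{
    if ((a >= b && a <= c) || (a <= b && a >= c)) return a;
    if ((b >= a && b <= c) || (b <= a && b >= c)) return b;
    return c;
}

int main(void)
{
    /* example: one vane reading implausibly high */
    double aoa[3] = { 4.8, 5.1, 22.0 };

    double voted = median3(aoa[0], aoa[1], aoa[2]);
    printf("voted AOA: %.1f deg\n", voted);

    for (int i = 0; i < 3; i++) {
        if (fabs(aoa[i] - voted) > DISAGREE_LIMIT_DEG)
            printf("sensor %d miscompares (%.1f deg), exclude it from control\n",
                   i, aoa[i]);
    }
    return 0;
}

With three inputs, a single failed vane gets outvoted instead of feeding bad data straight into the trim.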

LiberalArkie

(15,707 posts)
11. Oh yeah, for sure. The article stated that management wanted nothing used that wasn't already
Thu Apr 9, 2020, 05:45 PM
Apr 2020

approved as that would increase the lead time.
