Prison. NOW

by Karl Denninger, Market Ticker:

Every one of the officers and directors.

Or just destroy the company.  I don’t much care which.

I’m talking about Boeing here.

Christchurch saw 50 people killed by a maniac.  The 737MAX has killed six times as many people and destroyed two hulls.

STFU about terrorism and how horrible it is until there are 300+ criminal charges of manslaughter laid for the souls aboard those two aircraft, unless this entire article is crap, which it probably isn’t.

Specifically, the article states:

  • The failure analysis, including the “what bad thing(s) happen if this goes wrong” were predicated on a design that had the authority to move the trim by 0.6 degrees.  By that maximum movement the system could not crash the plane or kill anyone on board (it might produce minor injuries or discomfort to passengers however.)
  • The limits were later updated to 2.5 degrees but the documents were not updated and thus the analysis was not re-run. That’s four times the original limit, and would have prompted a much more serious rating in the event of a malfunction.
  • Worse, it was not documented that the system would reset whenever the pilot entered a trim command, and thus there was no effective limit at all on the amount of trim change the system could input.  That would have likely led to a “will lose the aircraft” (e.g. “CATASTROPHIC”, or “must be prevented“) rating for a failure.
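The cumulative effect of that reset behavior can be sketched in a few lines. This is a hypothetical simulation, not Boeing’s code; the controller class, its method names, and the cycle count are invented for illustration, with only the 2.5 degree per-activation limit and 5 degree range from neutral taken from the article’s figures:

```python
# Hypothetical sketch (NOT actual MCAS code): a trim controller whose
# per-activation authority budget resets whenever the pilot counter-trims.
# Degrees and limits are the article's figures; everything else is invented.

PHYSICAL_STOP_DEG = 5.0       # total travel from neutral (per the article)
PER_ACTIVATION_LIMIT = 2.5    # revised per-activation authority

class TrimController:
    def __init__(self):
        self.trim = 0.0            # current trim, degrees from neutral
        self.used_authority = 0.0  # budget consumed this activation

    def pilot_counter_trim(self, amount):
        # Pilot trims back toward neutral...
        self.trim = max(0.0, self.trim - amount)
        # ...and the per-activation budget resets: the flaw described above.
        self.used_authority = 0.0

    def auto_trim(self, amount):
        # System adds trim until its budget or the physical stop is hit.
        allowed = min(amount, PER_ACTIVATION_LIMIT - self.used_authority)
        allowed = min(allowed, PHYSICAL_STOP_DEG - self.trim)
        self.trim += allowed
        self.used_authority += allowed

ctl = TrimController()
for _ in range(10):              # repeated cycles: system trims, pilot fights it
    ctl.auto_trim(2.5)           # full 2.5 degrees available every time
    ctl.pilot_counter_trim(1.0)  # partial counter-trim resets the budget

print(round(ctl.trim, 1))        # 4.0: pinned near the stop despite the pilot
```

Even though the pilot opposes it on every cycle, the trim ratchets toward the physical stop, because each counter-command hands the system a fresh 2.5 degree budget.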

To have a system that, during testing, winds up requiring four times its designed and expected range of authority is outrageous on its face.  To be off by 5%, 10% — that’s pretty normal.  You can only model so much, and models are never exact.

But when you’re off by four hundred percent your original design was crap.

Further, to have the system reset whenever the pilot gave a contrary command meant that it had unlimited authority.  The total range from neutral to the limit is 5 degrees, so 2.5 degrees is half the total design range from neutral with one action.  That’s not a “minor” adjustment!

As I said in my previous article, written prior to reading this one, I had serious questions about whether Boeing pushed the envelope too far with this design in the first place, shaving margins.

Now, if this article is correct it’s clear that Boeing knew during flight testing that the expected behavior of the aircraft with the new engines did not match the actual, in-flight performance; what they actually got in terms of aerodynamic stability under certain conditions was much worse than they expected.

But Boeing didn’t update the documents, and thus the re-analysis was not done.  Doing it right would have meant changing the documents to reflect the true amount of correction required, documenting that the actual authority of the system was unlimited due to the reset behavior (or fitting hard limit switches to prevent that), and then re-running the analysis.  That re-analysis might have resulted in, at minimum, a different type certificate (read: more cost for customers, as pilots must be re-trained) or, worse, a denied certification (potentially catastrophic costs: re-engineering the engine mounts and the aerodynamic effects of same, redesign and re-fabrication of the wings and control surfaces, or even a determination that the problems could not feasibly be corrected).

The Seattle Times calls this flawed analysis.  That, of course, assumes Boeing did not know that the system’s authority had been changed to four times that originally specified, and did not know that if the pilot commanded opposite trim the system treated it as a reset and restarted with its full authority anew, effectively meaning its authority was limited only by the physical limits of the mechanism.

That is an unreasonable assumption since someone changed the limits of authority between the time the system was designed and when testing was completed.  That someone most-certainly did know; the change did not happen on its own.

In addition Boeing knew that pilot commands in the opposite direction reset the system because Boeing engineers coded it that way.  Someone wrote that spec and someone else signed off on it when the programming was complete; in addition during testing it was tested against that spec.

But the FAA wasn’t notified of any of this; the documents were, if the article is correct, not updated, and the failure analysis was not re-examined in light of these facts.

Look, folks, I’ve written code like this.  Yeah, it was a long time ago, but so what?  It wasn’t for a plane, but it was operating heavy machinery where excursions beyond authorized and reasonable limits had the potential to do severe property damage and in some cases could kill someone, or a lot of someones.

You don’t change limits from the original design without going back through the failure analysis.

You don’t put a system together like this without defining what the maximum limits of its authority are, and what happens if they are entirely consumed — along with what can happen if they’re exceeded.

If the “what can happen if they’re exceeded” is very bad (people get badly hurt, die, or serious property damage happens), then you put a physical, hard backup on said system that independently prevents it, whether that’s a limit switch that cuts the power to the contactor’s coil or something similar.  You also treat any trip of that limit switch as an alarm event indicating a critical malfunction that must be corrected before the thing in question is returned to service, since that “last ditch” safety device exists for the specific purpose of preventing a disaster and it just triggered.
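A minimal sketch of that last-ditch pattern, with hypothetical names (nothing here is from an actual avionics or machinery codebase): an independent guard that refuses the command at the limit and latches a fault only maintenance can clear.

```python
# Hypothetical sketch of the "hard backup" pattern described above:
# an independent limit switch that cuts actuator power and latches a
# fault that must be cleared in maintenance, not by the controller.

class HardLimitGuard:
    def __init__(self, limit):
        self.limit = limit
        self.latched_fault = False      # once tripped, stays tripped

    def permit(self, commanded_position):
        if self.latched_fault:
            return False                # out of service until inspected
        if abs(commanded_position) >= self.limit:
            self.latched_fault = True   # cut power and raise the alarm
            return False
        return True

guard = HardLimitGuard(limit=2.5)
print(guard.permit(2.0))   # True: within authority
print(guard.permit(3.0))   # False: trips the switch, latches the fault
print(guard.permit(2.0))   # False: still locked out until serviced
```

The design point is the latch: a tripped guard is evidence of a critical malfunction, so it must not silently re-arm itself.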

Further, and perhaps most-critically: to not look VERY closely at ALL of the original design assumptions and their safety margins when your 0.6 degree maximum automated trim correction, about 12% of the range from neutral, turns out during testing to require 50% of the range from neutral to meet requirements, four times the designed and expected limit, means you ****ed up when you designed that thing, in a way that might not be safely operable in the present “as-built” configuration no matter what you do.
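The arithmetic behind those figures, using the article’s numbers (0.6 and 2.5 degrees against a 5 degree total range from neutral):

```python
# Margin arithmetic from the article's figures; nothing else is assumed.
design_limit_deg = 0.6   # original maximum automated trim correction
tested_limit_deg = 2.5   # limit required during flight testing
total_range_deg = 5.0    # full range from neutral (per the article)

print(design_limit_deg / total_range_deg)   # 0.12 -> 12% of the range
print(tested_limit_deg / total_range_deg)   # 0.5  -> 50% of the range
print(tested_limit_deg / design_limit_deg)  # ~4.17x the designed authority
```

Strictly, 2.5/0.6 is about 4.17; “four times” is the article’s rounding.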

You have no damned business letting that thing, whatever it is, anywhere near people it can maim or kill until and unless you can prove that being off by 400% on a critical safety item’s range of authority does not reduce the margin of safety for the entire thing below reasonable limits.  In addition, if you’re off by that much, then everything in said device needs to be re-examined down to the last piece of wire, rivet, bolt and torque spec; if you screwed the pooch that badly in one place, why would I believe that’s the only place your rocket scientists blew it?

Read More @ Market-Ticker.org