Wednesday 13 May 2015

Safety-Critical Systems



(Danish plane at rest in a Danish minefield)


A genuine insight into the operation of the world. Really an intro to engineering course, or an extremely applied law course, but thoughtful to boot. (“Why does regulation fail?”, “What is error?”)

Security is about preventing humans trolling you; safety is the more universal attempt to hold off Sod’s Law. Talebian. The field is highly legalistic and militaristic, and basically all of the literature is unreadably constipated.

This makes this instructor’s humour and irreverence a godsend. An all-rounder, cynical, funny, and hyper-competent. He spent a week away unexpectedly, saving European aviation from a cybersecurity crisis.

There’s not much maths, and surprisingly little software engineering. It is nevertheless a hard course: you’re in with 4th-year undergrads, the slides are intentionally terse and lacklustre, and both coursework and exam require a bunch of outside reading and synthesising thought (guesswork). There’s a lot of material, mostly indigestible.



So long, Ariane.

Instructor: Omni-competent, sarcastic.
Inherent interest: High (subject), very low (safety discourse)
Slides: Terrible.
Assessments: Very challenging ex nihilo safety-tool development with minimal guidance. Lots of work for 20%.
Marking: Fair (by which I mean strict)


BIG LESSONS
  • No-one knows how to write absolutely safe code.
  • And even if they did, they wouldn’t be given the budget to pull it off; lives are only worth a million quid.
  • Even with perfectly safe code, the environment would still produce unsafe behaviour.
  • Testing is inductive and cannot assure (see the sketch after this list).
  • It takes 10 years to approve new aviation software. Planes stay in the air because of all the money thrown at them.
  • Updating software generally increases the risk of failure.
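
A minimal sketch of that induction point, in C (my own illustration, not the course’s): the function below passes every unit test an engineer would plausibly write, yet still harbours a fatal defect. No finite pile of passing tests proves its absence.

    /* midpoint() looks correct and survives all its tests... */
    #include <assert.h>

    int midpoint(int lo, int hi) {
        return (lo + hi) / 2;   /* overflows (undefined behaviour) when lo + hi > INT_MAX */
    }

    int main(void) {
        assert(midpoint(0, 10) == 5);       /* passes */
        assert(midpoint(-4, 4) == 0);       /* passes */
        assert(midpoint(100, 200) == 150);  /* passes */

        /* ...but midpoint(INT_MAX - 1, INT_MAX) silently invokes
         * undefined behaviour. The safe form is lo + (hi - lo) / 2. */
        return 0;
    }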


QUOTES

    The world is a scary place. Every modern car has two separate computer networks, FlexRay and CAN, which link up an average of 40 devices, made by perhaps 20 different manufacturers. To brake, my foot’s pressure has to trigger a sensor, be translated into binary, pass through the CAN network, be translated back, and only then drive the actuator at the brake pads. Note that police reporting forms have no box for “crash due to network failure”. I don’t drive anymore.
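
    For flavour, here is roughly what one hop of that journey looks like in software: a Linux SocketCAN sketch, entirely my own illustration (the interface name, message ID, and payload are invented, not any real car’s), that puts a single “pedal pressure” frame on a CAN bus.

        #include <stdio.h>
        #include <string.h>
        #include <unistd.h>
        #include <net/if.h>
        #include <sys/ioctl.h>
        #include <sys/socket.h>
        #include <linux/can.h>
        #include <linux/can/raw.h>

        int main(void) {
            /* Open a raw CAN socket and bind it to a (virtual) interface. */
            int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
            if (s < 0) { perror("socket"); return 1; }

            struct ifreq ifr;
            strcpy(ifr.ifr_name, "vcan0");   /* hypothetical test interface */
            ioctl(s, SIOCGIFINDEX, &ifr);

            struct sockaddr_can addr = {0};
            addr.can_family  = AF_CAN;
            addr.can_ifindex = ifr.ifr_ifindex;
            bind(s, (struct sockaddr *)&addr, sizeof(addr));

            /* One 2-byte "brake pedal pressure" frame; ID and bytes are made up. */
            struct can_frame frame = {0};
            frame.can_id  = 0x123;
            frame.can_dlc = 2;
            frame.data[0] = 0x3E;
            frame.data[1] = 0x80;
            if (write(s, &frame, sizeof(frame)) != sizeof(frame)) { perror("write"); return 1; }

            close(s);
            return 0;
        }

    Every device on the bus sees every frame; the brake controller simply trusts that ID 0x123 means what it should. That trust is the scary part.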

    If you’re ever at a job interview and the manager tells you that they can measure software risk in their business, leave. Leave, or you will end up in jail.

    What happens to the regulator if their stupid decision kills people? Nothing at all. They are the Crown.

    Remember that NASA – the only people qualified to evaluate SpaceX’s safety cases – are in competition with SpaceX. So SpaceX refuse to disclose their docking code.

    Exception-handling code is an admission of guilt. Build it better!

    Safety standards are written by old washed-up people who have given up on everything else.

    As an engineer, one is quite happy with a total system failure; it’s easier to do forensics on.

    Consider this: when a British judge puts on a wig, this is to signify that they are carrying out their judgment on behalf of the Crown – and, given the divine mandate, ultimately literally on behalf of God. This is the mindset that engineers must deal with.

    *indicating his own slides, which are quoting IEC 61508* Ach, this is just language, though.

    Regulators get paid about 40% less than their corporate counterparts; this means that the general ability of regulators is sub-par, so you get a permanent state of regulatory lag.

    The industry that lies the most – by far – is the railways.

