Darius Kazemi

"16. Safety is a characteristic of systems and not of their components

Safety is an emergent property of systems; it does not reside in a person, device or department of an organization or system. Safety cannot be purchased or manufactured; it is not a feature that is separate from the other components of the system. This means that safety cannot be manipulated like a feedstock or raw material."

- How Complex Systems Fail, by Richard I. Cook

how.complexsystems.fail/#16

8 comments
Alan Gamrican

@darius In all the credential exam prep courses I teach, "safety" is the overriding factor in business decision making. It doesn't matter if options A, B, and C turn a profit; if option D is focused on safety, even at a financial loss...it is the best answer.

Putting a cost/benefit price on human life is the sure path to negligence and liability.

Thomas

@darius In technical safety there is (at least the illusion) that safety can be manufactured, turned into an artifact. The illusion arises "inside" a socio-technical system where technical safety is a design goal in a controlled environment. The belief in that material property of safety is so strong that it gets projected onto any kind of problem that engineers try to solve, even systems that are not subject to technological control. The result is invariably failure or technocracy.

Matthew Gallant

@darius A whole book on this topic I would recommend: “Drift Into Failure” by Sidney Dekker. goodreads.com/book/show/102587

Paul Cantrell

@darius Will always boost this article when I see it. Every point in it deserves study.

Dominic White

@darius @ian such a great observation. I’ve been using that to describe security for some time too. Like safety, it feels like an emergent property of controls and processes functioning correctly during an attack.

Thomas Avedik

@darius: Organizations and their #leaders judge 'the relationship between production targets, costs of operations and acceptable risks' ambiguously.

And this #ambiguity must then be resolved by actions of employees at the 'sharp end of the system'.

But if something goes wrong, the same leaders point fingers at these employees because they 'violated' processes while ignoring the other driving forces, especially production pressure...

#CyberSecurity

Dan Yeaw

@darius Thanks for sharing this, it is definitely a great #SystemsSafety summary! Since I'm in the automotive space, I'm interested in whether he had thoughts on safety when the operators have limited training and often aren't experts, unlike medical areas with doctors, nuclear reactor operators, and airline pilots.
