Ian Turton

@yacc143 @vincent @calamari I always recall a discussion I had with an aerospace engineer about liability for compiler bugs. He said: what would be the point? Why would they want to end up owning Cray Systems if their planes started falling out of the air? Of course, that was back in the 90s, when we didn't expect planes to fall out of the sky.

Andreas K

@ianturton @vincent @calamari
The point is that there needs to be some adjusting of the weighting functions.

And in our "capitalist" world, this happens by assigning costs.

As long as there is no direct cost for bugs, even catastrophic ones, companies will ship products with catastrophic bugs just to meet the schedule some marketing egghead invented for artificial reasons.

Ian Turton

@yacc143 @vincent @calamari But there's every chance that it would end with United owning the wreckage of Crowdstrike, which doesn't really help. And as a software engineer, I don't see criminalising bugs as a good thing either.

Andreas K

@ianturton @vincent @calamari
Ah, but we've reached the point where all kinds of security "regulations" (be they from authorities, ISO standards, or insurance companies) lead to situations where senior software engineers roll their eyes when management explains "necessary measures" that will obviously lower actual technical security, just to tick obscure items on a checklist that make little to absolutely NO sense in the local context.

Andreas K

@ianturton @vincent @calamari
So having legal accountability for these "necessary measures" and their (often) 3rd-party providers sounds like a great idea to stop these idiotic practices, and to make the managers and bean counters really do their work: assess the risks, probabilities, and costs of all the options.

And not blindly assume that "product X" provides capability Y for so many $Z, with no need to consider the risks of what could go wrong.
