@tess As a minor quibble, I do want to suggest an alternate scenario (to an LLM getting the nuclear codes) which may be more likely: What if a contractor uses an LLM to fill in code they're writing on some random-ass military contract, and that code gets incorporated into the UI the humans-with-the-nuclear-codes use to launch nukes, or into the radar system those humans use to decide whether to launch, and the LLM introduces catastrophic bugs because it's a random number generator with a human accent?
@tess Like, I do think the probability*cost for those OTHER things you mentioned is significantly *greater* than for the "doomsday via incompetence" scenario I outline, but