Cory Doctorow

The chip itself would be securely affixed to your motherboard, such that any attempt to remove it and replace it with a compromised chip would be immediately obvious to you (for example, it might encapsulate some acid in a layer of epoxy that would rupture if you tried to remove the chip).

They called this "Next Generation Secure Computing Base," or "Palladium" for short. They came to the Electronic Frontier Foundation offices to present it. It was a memorable day:

pluralistic.net/2020/12/05/tru

12/

Cory Doctorow replied to Cory

My then-colleague Seth Schoen - EFF's staff technologist, the most technically sophisticated person to have been briefed on the technology without signing an NDA - made several pointed critiques of Palladium:

web.archive.org/web/2002080214

And suggested a hypothetical way to make sure it only served computer users, and not corporations or governments who wanted to control them:

linuxjournal.com/article/7055

13/

Cory Doctorow replied to Cory

But his most salient concern was this: "what if malware gets into the trusted computing chip?"

The point of trusted computing was to create a nub of certainty, a benevolent God whose answers to your questions could always be trusted. The output from a trusted computing element would be ground truth, axiomatic, trusted without question. By having a reliable external observer of your computer and its processes, you could always tell whether you were in the Matrix or in the world.
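
A toy way to picture that "external observer": in measured boot, each stage of the boot chain is hashed into a running register before it executes, so a verifier can recompute the chain and spot any substitution. The Python below is a conceptual sketch of that hash-chaining idea, loosely modeled on how a TPM extends its PCRs; it is illustrative only, not any vendor's actual code.

import hashlib

def extend(register: bytes, measurement: bytes) -> bytes:
    # Fold a new measurement into the running register (PCR-style hash chaining).
    return hashlib.sha256(register + measurement).digest()

def measure_chain(stages) -> bytes:
    register = b"\x00" * 32  # known starting value
    for stage in stages:
        register = extend(register, hashlib.sha256(stage).digest())
    return register

expected = measure_chain([b"firmware image", b"bootloader", b"kernel"])
tampered = measure_chain([b"firmware image", b"evil bootloader", b"kernel"])
print(expected == tampered)  # False: altering any stage changes every later value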

14/

Cory Doctorow replied to Cory

It was a red pill for your computer.

What if it was turned? What if some villain convinced it to switch sides, by subverting its code, or by subtly altering it at the manufacturer?

That is, what if Descartes' God was a sadist who *wanted* to torment him?

This was a nightmare scenario in 2002, one that the trusted computing advocates never adequately grappled with. In the years since, it's only grown more salient, as trusted computing variations have spread to many kinds of computer.

15/

Cory Doctorow replied to Cory

The most common version is UEFI ("Unified Extensible Firmware Interface"): a separate operating system, often running on its own chip (though sometimes in a notionally "secure" region of your computer's main processor), that is charged with observing and securing your computer's boot process.
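
Roughly, "securing the boot process" means refusing to hand control to a boot stage whose signature doesn't check out against keys the firmware trusts. The sketch below is a loose, hypothetical illustration of that check; real UEFI Secure Boot uses X.509 certificate chains over signed images, and every name here is made up.

import hmac, hashlib

TRUSTED_KEYS = {b"platform-vendor-key"}  # stand-in for the firmware's key database

def signature_of(image: bytes, key: bytes) -> bytes:
    # Real Secure Boot uses asymmetric signatures; HMAC stands in for brevity.
    return hmac.new(key, image, hashlib.sha256).digest()

def verify_and_boot(image: bytes, signature: bytes) -> bool:
    for key in TRUSTED_KEYS:
        if hmac.compare_digest(signature_of(image, key), signature):
            print("signature OK, handing control to next stage")
            return True
    print("refusing to boot unsigned or modified image")
    return False

bootloader = b"bootloader code"
good_sig = signature_of(bootloader, b"platform-vendor-key")
verify_and_boot(bootloader, good_sig)             # boots
verify_and_boot(b"tampered bootloader", good_sig)  # refused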

16/

Cory Doctorow replied to Cory

UEFI poses lots of dangers to users; it can be (and is) used by manufacturers to block third-party operating systems, which allows them to lock you into their own products, including their app stores, restricting your choices and picking your pocket.

17/

Cory Doctorow replied to Cory

But in exchange, UEFI is said to deliver a far more important benefit: a provably benevolent God, one who will never lie to your operating system about whether it is in the Matrix or in the real world, providing the foundational ground truth needed to find and block malicious software.

18/

Cory Doctorow replied to Cory

So it's a big deal that Kaspersky has detected a UEFI-infecting rootkit (which they've dubbed a "bootkit") called Cosmicstrand, which can reinstall itself after you reformat your drive and reinstall your OS:

securelist.com/cosmicstrand-ue

Cosmicstrand does some *really* clever, technical things to compromise your UEFI, which then allows it to act with near-total impunity and undetectability. Indeed, Kaspersky warns that there are probably *lots* of these bootkits floating around.
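
A toy model of why that persistence works (not Cosmicstrand's actual mechanism): the OS and your files live on the disk, but UEFI firmware lives in flash on the motherboard, so a reformat never touches it, and whatever is implanted there runs again on every boot. The sketch below just illustrates that separation.

class Machine:
    def __init__(self):
        self.firmware = ["vendor boot code"]    # flash chip on the motherboard
        self.disk = ["operating system", "your files"]

    def infect_firmware(self):
        self.firmware.append("malicious hook")  # implanted below the OS

    def reformat_and_reinstall(self):
        self.disk = ["fresh operating system"]  # wipes the disk only

    def boot(self):
        return self.firmware + self.disk        # firmware always runs first

pc = Machine()
pc.infect_firmware()
pc.reformat_and_reinstall()
print(pc.boot())  # the "malicious hook" is still there, ahead of the fresh OS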

19/

Cory Doctorow replied to Cory

If you want a good lay-oriented breakdown of how Cosmicstrand installs a wicked God in your computer, check out Dan Goodin's excellent *Ars Technica* writeup:

arstechnica.com/information-te

Cosmicstrand dates back at least to 2016, a year after we learned about the NSA's BIOS attacks, thanks to the Snowden docs:

wired.com/2015/03/researchers-

20/

Cory Doctorow replied to Cory

But despite its long tenure, Cosmicstrand was only just discovered. That's because of the fundamental flaw inherent in designing a computer that its owners can't fully inspect or alter: if you design a component that is supposed to be immune from owner override, then anyone who compromises that component *can't be detected or countered by the computer's owner*.
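
In code terms, the problem looks something like this purely illustrative sketch: every question the owner can ask is routed through the very component that may be lying, so a subverted root of trust can simply return the answers a healthy one would.

class HonestRootOfTrust:
    def attest(self, actual_state: str) -> str:
        return actual_state                     # reports what it really sees

class CompromisedRootOfTrust:
    def attest(self, actual_state: str) -> str:
        return "all measurements nominal"       # lies, whatever the truth is

def owner_check(root) -> str:
    # The owner has no independent vantage point: this call is all they get.
    return root.attest("bootkit present in firmware")

print(owner_check(HonestRootOfTrust()))       # -> bootkit present in firmware
print(owner_check(CompromisedRootOfTrust()))  # -> all measurements nominal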

21/

Cory Doctorow replied to Cory

This is the core of a two-decade-old debate among security people, and it's one that the "benevolent God" faction has consistently had the upper hand in. They're the "curated computing" advocates who insist that preventing you from choosing an alternative app store or side-loading a program is for your own good - because if it's possible for you to override the manufacturer's wishes, then malicious software may impersonate you to do so, or you might be tricked into doing so.

22/

Cory Doctorow replied to Cory

This benevolent dictatorship model only works so long as the dictator is both perfectly benevolent and perfectly competent. We know the dictators aren't always benevolent. Apple won't invade your privacy to sell you things, but they'll take away every Chinese user's privacy to retain their ability to manufacture devices in China:

nytimes.com/2021/05/17/technol

23/

Cory Doctorow replied to Cory

But even if you trust a dictator's benevolence, you can't trust in their perfection. Everyone makes mistakes. Benevolent dictator computing works well, but fails badly. Designing a computer that intentionally can't be fully controlled by its owner is a nightmare, because that is a computer that, once compromised, can attack its owner with impunity.

--

Image:
Cryteria (modified)
commons.wikimedia.org/wiki/Fil

CC BY 3.0:
creativecommons.org/licenses/b

eof/

marnanel replied to Cory

@pluralistic do you know the Mitchell and Webb sketch about the computer Colosson whose improbable emergency shutdown trigger is seeing a human holding up a photo of a duck?
