
12/14/2003 Archived Entry: "Say no to Palladium!"

I finally found the article about Palladium that I was seeking. It's a good overview, and well balanced, although in my opinion a bit naive where Microsoft is concerned.

Briefly, Palladium -- now called NGSCB, the Next-Generation Secure Computing Base -- will let Microsoft

* control the applications you can run on your computer
* control the data you can view, save, or use on your computer
* prevent you from moving applications or data to another computer
* force you to buy upgrades

In response to these accusations, Microsoft basically says "Trust us." Who in their right mind would do so?

Anyone familiar with the history of Microsoft can easily come up with examples to refute every statement from the Microsoft representative quoted in the Salon article.

According to Salon,

If Palladium stops viruses, doesn't constrain your machine, and doesn't invade privacy -- above all, if people are allowed to control Palladium, rather than vice versa -- would the system be so bad?

In the first place, that's four very big ifs, and all of the evidence to date is to the contrary. And even in a looking-glass land where all those ifs were satisfied, it would still be a bad idea.

The most complete and current analysis I've found of Palladium comes from the superb Electronic Frontier Foundation. Their report "Trusted Computing: Promise and Risk" is a bit technical, but well balanced, and well worth the read. A few selected quotes:

"Broadly speaking, the trusted computing architecture is a misguided implementation of a valuable idea, and would offer both advantages and disadvantages to computer owners."

"the leading trusted computing proposals have a high cost: they provide security to users while giving third parties the power to enforce policies on users' computers against the users' wishes"

"Our most fundamental concern is that trusted computing systems are being deliberately designed to support threat models in which the owner of a "trusted" computer is considered a threat."

"Because third parties currently have no reliable way to tell what software you are using, they have no reliable way to compel you to use the software of their choice. This aspect of the status quo is almost always a benefit for computer owners -- in terms of improved competition, software choice, software interoperability, and owners' ability to control their computers -- and therefore no attestation scheme that changes this situation is likely to be beneficial to consumers."

"An insecure system can't magically become "secure" with the addition of a single piece of technology."

"In general, many of the security benefits of trusted computing could be achieved in some form simply by rewriting software, but this appears impractical to some."

"How can computer owners know that their trusted computing hardware has been implemented according to its published specifications?"

This last question is particularly relevant considering Microsoft's (and their partners') lax security record. Can you trust Microsoft to implement the security features correctly and honestly?

* In 2001, someone posing as a Microsoft employee obtained fraudulent certificates that let them sign code as "trusted" Microsoft software.

* The Processor Serial Number in the Pentium III, which supposedly could be disabled by the owner, could in fact be reenabled remotely.

* Without informing anyone, Microsoft put a "backup" crypto key -- a backdoor -- in their Windows systems, which could circumvent any key you installed. (Claims that this was done at the behest of the NSA seem to be unfounded.) See the sketch after this list.
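
That last item deserves a closer look. Here's a toy sketch of my own showing why a second, unremovable verification key amounts to a backdoor (HMAC stands in for real public-key signature checking, and all the names are hypothetical):

    import hashlib, hmac

    # Two verification keys baked into the OS. The owner is told about
    # PRIMARY_KEY; BACKUP_KEY is the undocumented "backup."
    PRIMARY_KEY = b"vendor-primary-key"
    BACKUP_KEY = b"vendor-backup-key"

    def sign(key: bytes, module: bytes) -> str:
        return hmac.new(key, module, hashlib.sha256).hexdigest()

    def loader_accepts(module: bytes, signature: str) -> bool:
        # Accept a module if EITHER embedded key verifies it. The owner
        # cannot remove BACKUP_KEY, so whoever holds it can authorize
        # any module -- circumventing whatever key the owner installed.
        return any(hmac.compare_digest(sign(k, module), signature)
                   for k in (PRIMARY_KEY, BACKUP_KEY))

    rogue = b"replacement crypto module"
    print(loader_accepts(rogue, sign(BACKUP_KEY, rogue)))  # True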

The EFF report goes on to offer several examples of how this technology can be abused, and observes,

"These are examples of a more general problem of "lock-in", often practiced as a deliberate business strategy in the software industry, to the detriment of business and home computer users alike. Unfortunately, the TCG [Trusted Computing Group] design provides powerful new tools to enable lock-in."

A somewhat more cynical view was taken by the respected computer security expert Bruce Schneier, in his Crypto-Gram Newsletter of August 15, 2002. Among other points,

"1. A "trusted" computer does not mean a computer that is trustworthy. The DoD's definition of a trusted system is one that can break your security policy; i.e., a system that you are forced to trust because you have no choice. Pd [Palladium] will have trusted features; the jury is still out as to whether or not they are trustworthy."

"2. When you think about a secure computer, the first question you should ask is: "Secure for whom?" ... Microsoft really doesn't care about what you think; they care about what the RIAA and the MPAA think."

"4. Pay attention to the antitrust angle. I guarantee you that Microsoft believes Pd is a way to extend its market share, not to increase competition."

"My fear is that Pd will lead us down a road where our computers are no longer our computers, but are instead owned by a variety of factions and companies all looking for a piece of our wallet. To the extent that Pd facilitates that reality, it's bad for society. I don't mind companies selling, renting, or licensing things to me, but the loss of the power, reach, and flexibility of the computer is too great a price to pay."

I couldn't have said it better. And Schneier is not the only one who sees the anticompetitive potential: others have pointed out that this could be Microsoft's strategy to attack open-source software.

Quoting again from Salon,

[The FCC's David] Farber concedes, though, that whether or not one thinks of Palladium's architecture as a boon to security "depends on what you believe Microsoft's long-term aims are. If you believe it's to stimulate commerce and stimulate security, it's a step in the right direction." But if you're more "neurotic" than that, Farber says, and if you're perhaps given to suspicions that Microsoft always makes decisions with the aim of frustrating competitors of the Windows empire rather than for the good of consumers, you might have a different view of the same architecture.

Call me "neurotic," then. I, and anyone familiar with the history of Microsoft, can easily come up with many examples.

* In Windows XP, when you search for a file on your local hard disk, XP quietly makes contact with a Microsoft server.

* When you view a DVD in Windows Media Player, your computer's identity and the DVD you are watching are sent to Microsoft. Worse still, "the relevant Microsoft privacy policy did not actually mention this until the matter was drawn to Microsoft's attention. So not only was Microsoft snooping on you, but it was also being sneaky about it."

* Or just read the infamous "Halloween Documents," internal memos from Microsoft in which they describe their intent and their strategies to discredit open-source software.

I tell you three times: Microsoft does not care about you. They care only about their bottom line. They've given up the path of trying to make better products, and now are pursuing the twin strategies of locking in their customers and destroying their competition. They are reacting like a wounded beast, without compunction, without scruples, and without remorse.

From what I've seen so far, the open-source community takes privacy and security seriously. Microsoft treats "privacy" and "security" as marketing ploys, to be trotted out to help sell products, and to be quietly ignored when no one is looking.

I take my privacy and security seriously. So I won't be using Palladium, or NGSCB, or whatever they end up calling it. And in all seriousness, I foresee a future when there are two classes of computing users: the "free" users, who said no to Palladium when we had a chance, and the "serfs" who gave in, took the easy road, and no longer own their computers, their software, or their data.

Brad
