This archived article was written by: Erik Falor
A consortium of big-name technology companies has banded together in an effort to improve information security. The Trusted Computing Group (TCG) has stated that its goals are preventing software attacks, ending identity theft and protecting computer owners from physical theft.
While this new technology promises big things to consumers and companies alike, it also lends itself to abuse by the very institutions meant to protect users. Depending on who you are and how you use your computer, Trusted Computing (TC) may be more of a burden than a boon.
TC is intended to increase security by incorporating new innovations in hardware and software that ensure that sensitive data are only available to the programs or devices that need access to it. It also means to end software and media piracy. While this goal does not raise much criticism, the way the TCG intends to accomplish it frightens many.
Trusted Computing will implement a plethora of new security methods. It will require that new computers be built using new hardware and run new software. From start-up to shutdown, there will be secure processes working behind the scenes ensuring the integrity of the system.
As soon as you press the power button, the Core Root of Trust for Measurement module will establish a chain of trust built up from the software and hardware components of the computer. It will already know what kinds of programs and hardware make up your computer system before you boot up. After your computer starts, it will verify that your computer is in a secure and trusted state, and that there are no viruses in memory. If that is the case, it will then allow the next piece of hardware, the Trusted Platform Module, to do its job.
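The chain of trust described above can be sketched in a few lines. The following is an illustrative model, not real firmware: it mimics how a TPM "extends" a Platform Configuration Register by folding the hash of each boot component into a running hash, so the final value depends on every component and the order in which they loaded. The component names are hypothetical.

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    """Fold a new measurement into the running register, TPM-PCR style:
    new_value = SHA-256(old_value || SHA-256(component))."""
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

# Registers start zeroed at power-on; each boot stage is measured in order.
pcr = bytes(32)
for component in [b"BIOS", b"bootloader", b"kernel"]:
    pcr = extend(pcr, component)

print(pcr.hex())  # a fingerprint of the exact boot sequence
```

If any component changes, or the boot order changes, the final register value changes, which is what lets the system detect tampering before handing control to the TPM.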
The Trusted Platform Module, or TPM for short, is responsible for creating and issuing the keys that verify that the computer is running in a secure state, and that the user is, in fact, you. These keys are generated with the help of an on-board hardware random-number generator and a hardware public-key encryption/decryption system. In theory, this hardware random-number generator will be capable of producing higher-quality random numbers than current software-based generators, making the resulting keys harder to crack.
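The role of the hardware random-number generator can be illustrated with a short sketch. This is an assumption-laden stand-in: Python's `secrets` module draws from the operating system's entropy pool, playing the part of the hardware RNG, and a hash fingerprint stands in for the key material a real TPM would generate in hardware (typically an RSA key pair).

```python
import secrets
import hashlib

# `secrets` plays the role of the hardware RNG here (OS entropy pool).
seed = secrets.token_bytes(32)

# Illustrative only: derive a per-machine key identifier from the seed.
# A real TPM would generate and hold an asymmetric key pair internally.
key_id = hashlib.sha256(seed).hexdigest()
print(key_id)
```

The point of the hardware generator is unpredictability: keys derived from high-entropy sources cannot be guessed the way keys from a poorly seeded software generator sometimes can.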
Any program that requests access to different pieces of hardware must get keys from the TPM before the request is granted. If the TPM does not trust that program, its request is denied and the transaction is logged. This means that a user will not be able to play a pirated video game or a pirated music file.
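The grant-or-deny-and-log behavior described above can be modeled in a few lines. All names here (`TRUSTED`, `request_access`, `audit_log`) are hypothetical; this is a toy policy check, not the TPM's actual interface.

```python
TRUSTED = {"media_player_signed"}   # programs whose keys the TPM accepts
audit_log = []                      # denied transactions are recorded

def request_access(program: str, device: str) -> bool:
    """Grant the hardware key only if the program is trusted;
    otherwise deny the request and log the transaction."""
    if program in TRUSTED:
        return True
    audit_log.append((program, device, "denied"))
    return False

request_access("media_player_signed", "sound_card")   # granted
request_access("pirated_game", "video_card")          # denied and logged
```

The logging step is the part privacy advocates object to: every refused request leaves a record of what the user tried to run.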
Companies involved with this initiative include such giants as IBM, AMD, Microsoft, Sony, Sun Microsystems, Hewlett-Packard and Intel. Conspicuously absent are Apple (which has set the trend for PCs for 20 years) and any of the Linux distributions. Perhaps most troubling is that some of the companies involved have poor track records when it comes to keeping users' best interests in mind.
For example, Intel shipped Pentium III processors that could be identified remotely. Only after much backlash from end users was this feature disabled.
More recently, Microsoft has released an update to its Media Player that insists users agree to install as-yet unimplemented and undisclosed anti-piracy measures. According to www.newscientist.com/, in the end-user agreement that appears prior to installation of the update, users are asked to agree to “future security updates related to ‘digital rights management,’ i.e. preventing copyright infringement.”
These initiatives, while attractive to media companies, are outrageous to computer enthusiasts everywhere. If adopted on a wide scale, it means that users could be subject to unannounced updates that would automatically install on their computers. The user would not be informed as to what function the update performs, nor have the choice whether or not to reject it. Some fear that the purpose of these updates would be to prevent the copying of certain media files, or to automatically delete files deemed unauthorized by Microsoft. Moreover, giving Microsoft remote control over your PC creates a new vulnerability that crackers can exploit.
Richard Stallman, an open-source software advocate writes in his book Free Software, Free Society, that large software firms have a history of putting users at a disadvantage. “One version of Windows was designed to report to Microsoft all the software on your hard disk … the KaZaa music-sharing software is designed so that KaZaa’s business partner can rent out the use of your computer to their clients. These malicious features are often secret, but even once you know about them it is hard to remove them, since you don’t have the source code.”
John Manferdelli, the general business manager for the unit that was developing Palladium, said this in a Microsoft press release: “‘Palladium’ [Microsoft’s deprecated code name for the project] will greatly reduce the risk of many viruses and spyware – software that captures and reports information from inside your PC – and other attacks. Memory in ‘Palladium’ PCs and other devices will run only ‘trusted’ code that is physically isolated, protected, and inaccessible to the rest of the system. Files within the ‘Palladium’ architecture will be encrypted with secret coding specific to each PC, making them useless if stolen or surreptitiously copied.”
The only guarantee Trusted Computing can make is that Microsoft and its collaborators will have access to your computer’s “vault memory.” Crackers certainly will find a way around the firewalls. Most experts agree that trusted computing will not be able to prevent virus infection.
As for some of the other features, most feel that it is not a software company’s place to force users to use their computers in legal ways, especially when computers compliant with U.S. copyright laws are distributed internationally. Keep in mind that U.S. copyright laws carry no weight in Asia. There is already much resentment in the international community toward the U.S. implementing de facto standards without consulting anybody.
And on macslash.org a commentator said “I recently had a chance to hear Steve Jobs [an original founder of Apple] himself talk about DRM [Digital Rights Management] and his attitude is ‘There is no way to prevent people from pirating music. You can throw more technology at it, but all it takes is one person to crack it and the system fails’.”
Another controversial feature will be automatic file destruction. In the wake of incriminating e-mail messages used against Microsoft in the government’s anti-trust case, Microsoft implemented a policy mandating that all records of e-mail correspondence be deleted after six months. Obviously this feature would be supremely attractive to other companies involved in embarrassing lawsuits, such as Enron, Qwest, and Philip Morris. TC could also be used to render these e-mails unreadable outside of the company’s network. An evaporating evidence trail sounds like much more trouble than it is worth. And how many home PC users would benefit from such a feature?
Another feature of TC that is already in use today is mandatory hardware registration with Microsoft. In order to verify that your copy of Windows XP is only running on one machine, and that other copies of it are not floating around the internet, your computer notifies Microsoft headquarters in Redmond of what new hardware components you have installed. Currently this straddles the fine line separating corporate rights from privacy invasion. But it could mean that someday you will only be allowed to install peripherals approved by Microsoft. By using a blanket excuse such as “this particular graphics card is not ‘trusted’,” Microsoft could stomp out competition.
There doesn’t seem to be much incentive for the home user to use TC systems. There are, however, some important features that could be used for good.
An end-user can benefit from a software vault – a place on the hard drive that personal information such as Social Security numbers, credit card and bank account information can be stored securely. TC will only allow authorized programs to request the data. Ideally, a cracker or a Trojan horse program could not steal sensitive information at any point along the way from hard disk to transmission through the internet by web browser.
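The "vault" idea above amounts to binding encrypted data to the state of a particular machine, a technique often called sealed storage. The sketch below is purely illustrative, with a toy XOR keystream standing in for real sealed-storage encryption: the decryption key is derived from a platform measurement, so a copied vault file is useless on different hardware. The data and state values are hypothetical.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Expand a key into a byte stream (toy construction, not a
    production cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def seal(data: bytes, platform_state: bytes) -> bytes:
    """Bind data to this machine: the key comes from the platform
    measurement, so other hardware cannot derive it."""
    key = hashlib.sha256(platform_state).digest()
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

state = b"trusted-boot-measurement"          # hypothetical PCR value
blob = seal(b"bank account: 0000", state)    # sealed vault entry
print(seal(blob, state))                     # same state unseals it
```

Because XOR is its own inverse, sealing the blob again with the same platform state recovers the original bytes; with any other state, it yields garbage.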
As far as software goes, there are two competing schools of thought. Microsoft has capitalized on proprietary software, that is, software that ultimately belongs to the corporation. That means that you do not own the copy of Word on your computer in the same way that you own your Ford Pinto. You can make any modifications you wish to your Pinto, and Ford doesn’t care a whit. But you are not allowed to modify the Word.EXE file in any way. Legally you are not even allowed to look under the hood.
The other paradigm, known as open source software, asserts that the world is not benefited if the inner workings of software always remain secret. Instead of selling you a car with the hood welded shut, open source software arrives as a box of car parts that the user then assembles themselves. This way, there are no secrets, no hidden back doors, and the user can fix any bugs themselves.
The open source community addresses the weakness inherent in software that only a handful of people are allowed to look at and maintain. Because its community buys into the axiom that “two heads are better than one,” open source software is likely to have bugs fixed more quickly than proprietary software.
Despite all of the promises that TC makes, there is simply too much room for abuse to accept it as a plausible solution to computer security. The open source model removes the possibility that the system is built with hidden backdoors or holes. The very nature of open source software is democratic, which will appeal to computer users around the world who do not want to be compelled to live under U.S. laws. I submit that the industry turn to the developers in the open source community for a more practical solution.