The main article reviewed here is "Trusted Computing. Rechtliche Probleme einer entstehenden Technologie" ("Trusted Computing: Legal problems of an emerging technology"; Bechtold 2005b). The article is based on a presentation given at Stanford University, U.S., in March 2005; the slides and talk are available in English (2005a). Bechtold has published a slightly updated version of the article, again in German (2005c). A further article on TC (2004b), referring to a somewhat earlier stage of TC development, is also taken into account for this review.

The reason for this review is that Bechtold provides a dense and comprehensive assessment of the potential legal problems associated with Trusted Computing, in particular in the area of DRM. He identifies areas in which legal problems might emerge, and gives recommendations to policy makers and to those building "Trusted Computing" systems.

Background of Trusted Computing
"Trusted Computing" is a notion used by the "Trusted Computing Group" (TCG), which emerged from the former TCPA, the Trusted Computing Platform Alliance, founded in 1999. At that time, there was discussion about whether computers should have identifiers (cf. the debate about the Processor Serial Number in the Intel Pentium III processor; STOA 1999). As the TCPA proposed including a unique identifier in each "Trusted Platform Module", observers worried that the aim might be to trace PC users in general, rather than to use the identifier only for purposes such as identifying parties in electronic commerce. When Microsoft considered using the Trusted Computing approach as the basis for a DRM system, "TC" acquired a somewhat negative image in many popular media, blogs, etc. Today, the TCG is led by AMD, Hewlett-Packard, IBM, Infineon, Intel, Microsoft and Sun.

Key security concepts in the TCPA specifications were based on work by Arbaugh et al. (1997). The process the authors designed is "constructing a chain of integrity checks, beginning at power-on and continuing until the final transfer of control from the bootstrap components to the operating system itself. The integrity checks compare a computed cryptographic hash value with a stored digital signature associated with each component" (Arbaugh et al. 1997). In TCPA/TCG implementations, the chain of trust starts accordingly with the "Trusted Platform Module" (TPM), basically a smartcard chip. Today, TPMs in PCs are mainly used for secure log-in, protection of cryptographic keys, and file encryption support. Checking the whole chain of trust, e.g., operating system, drivers and applications, has not yet been implemented.
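The chain of integrity checks described by Arbaugh et al. can be illustrated with a minimal sketch: each stage measures (hashes) the next component and compares the digest against a stored reference value before handing over control. All component names and reference values below are illustrative assumptions, not taken from any real TPM or specification.

```python
import hashlib

# Hypothetical reference values, playing the role of the "stored digital
# signatures" each boot component is checked against.
REFERENCE_HASHES = {
    "bootloader": hashlib.sha256(b"bootloader-code-v1").hexdigest(),
    "os_kernel": hashlib.sha256(b"kernel-code-v1").hexdigest(),
}

def measure(component: str, code: bytes) -> bool:
    """Compute the component's hash and compare it with the stored reference."""
    return hashlib.sha256(code).hexdigest() == REFERENCE_HASHES[component]

def boot(components: dict[str, bytes]) -> bool:
    """Check each stage in order; abort if any measurement fails."""
    for name, code in components.items():
        if not measure(name, code):
            return False  # chain of trust broken
    return True

# An unmodified boot chain verifies; a tampered kernel breaks the chain.
ok = boot({"bootloader": b"bootloader-code-v1", "os_kernel": b"kernel-code-v1"})
bad = boot({"bootloader": b"bootloader-code-v1", "os_kernel": b"patched-kernel"})
```

In a real TPM the measurements are accumulated in hardware registers rather than compared in software, but the principle of comparing a computed hash against an expected value is the same.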

The subject of Bechtold's analysis
Bechtold reviews the actual specifications written by the Trusted Computing Group, as well as operating system developments, such as Microsoft’s "Next Generation Secure Computing Base" (NGSCB; variants of new Microsoft operating systems will increasingly support applications based on TC concepts). In addition, he takes into account recent hardware developments, in particular the new processor architectures from Intel and AMD that offer support for "curtained memory" and "virtualization". Curtained memory allows for strong isolation between different execution environments, while virtualization allows several different, even unmodified operating systems to run in parallel. Next to a legacy OS, another one could run, e.g. a custom-made OS for a content application. With the help of the TPM, it can be determined what is actually running.

The following potential characteristics of Trusted Computing are highlighted:

  1. Remote attestation: Comparison of the actual state of a platform with its expected state (validation).
  2. System compartmentalisation: With the new processor architecture, e.g., a Trojan horse would no longer be able to read data from a banking application, as the two would run in different compartments.
  3. Sealed storage: Data are encrypted and can only be read if the system is in a certain state (for making sure that, e.g., no software is running which is designed to "rip" content).
  4. Secure input/output: Keyboard, mouse and display are protected against manipulation.
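The "sealed storage" characteristic can be made concrete with a toy sketch: data is encrypted with a key derived from the platform's measured software state, so it only decrypts if that state is unchanged. This is purely illustrative (a real TPM binds keys to hardware PCR values, and a simple XOR cipher is of course not a real encryption scheme); all names and values are assumptions.

```python
import hashlib

def state_key(software_stack: list[str]) -> bytes:
    """Derive a key from the measured platform state (list of loaded software)."""
    return hashlib.sha256("|".join(software_stack).encode()).digest()

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a repeating key (encrypt == decrypt)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

trusted_stack = ["firmware-v1", "os-v1", "player-v1"]
sealed = xor_crypt(b"premium content", state_key(trusted_stack))

# Same platform state: the content unseals correctly.
unsealed = xor_crypt(sealed, state_key(trusted_stack))

# Changed state (e.g. a "ripping" tool loaded): the derived key differs,
# so the ciphertext does not decrypt to the original content.
tampered = xor_crypt(sealed, state_key(trusted_stack + ["ripper"]))
```

The same mechanism that keeps content away from "ripping" software is what enables the lock-in scenarios Bechtold discusses below: the seal is agnostic about *why* the platform state differs.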

From this list, we see that in many of the cases envisaged by the proponents of TC, "trusted" means that a third party can be enabled to check whether a remote computer can be trusted. Whether a "trusted computer" is trusted by its user, or deserves to be called "trustworthy", is a different matter. As Pearson, editor of an early book on TC, put it when describing the TPM: "This security hardware contains those security functions that must be trusted" (2003, p. 5; emphasis in the original). Whether it is in fact trusted in social and economic terms is another question.

Risk analysis
Generally speaking, Bechtold argues that many risks may arise, but that they could be dealt with by skilful design of TC architectures and of the institutional arrangements around them. We pick up the most important points here in slightly more detail:

  • Remote attestation could be used to hinder interoperability: it could be ensured that only a certain piece of software, e.g. a Microsoft browser, can be used to obtain certain services. Bechtold discusses technical remedies, such as communicating only properties of a program or attesting only a small part of the computer, e.g. a compartment, as well as legal remedies to prevent abuse of market power.

  • The role of third parties providing basic keys and metrics for using TC is an issue. For instance, the integrity of software might be checked by comparing its hash value against the one it is supposed to have. Currently, the TCG specifications do not define who these entities will be. It could be, for instance, a large corporation doing it in its own interest. Central authorities with significant market power could emerge, however. It is therefore potentially important that there be several competing companies or organisations certifying such data.

  • Given the market power of dominant players such as Microsoft, the article argues, users might be forced to use TC. For instance, banks might require the use of TC. The author demands that such dominance, or such market failures, be taken into account.

  • "Sealed storage" might be used to ensure that certain data formats need to be used. Trusted Computing "can be used to ‘seal’ data to a particular software state on a platform. In a DRM system, this feature could be used by content providers to make sure that their content may only be accessed by consumers if their devices are in a secure state. However, it could also be used to seal data to a particular operating system, platform configuration, or software application. Software companies could develop proprietary file formats [so that only] their applications can read this file format and thereby interoperate. As the costs of converting files would be significantly increased, this could deter customers from switching to competing applications, operating systems and even hardware platforms in the first place. Content providers could make sure that their content is only accessible with a particular proprietary player. In general, sealed storage could hamper competition in the hardware, operating system and the software applications markets. Trusted computing could prove a powerful tool to create customer lock-in and artificially increase switching costs." (2004b, p. 88f). Competition law would be a way to deal with the issue.

  • TC could be used to design a highly secure DRM system which would be difficult to circumvent. For instance, TC could prevent the computer user from copying content from one system to another, as is more easily possible with other DRM systems. Bechtold concludes that "DRM systems which are based on trusted computing architectures may come into conflict with copyright law... If copyright limitations allow a consumer to copy content to another device without the rights holder’s permission, the trusted platform could nevertheless prevent such copying as the sealed content could not be decrypted on the other device." (2004b, p. 95)

  • The use of keys could lead to a loss of privacy. Not only could a company verify whether one of its PCs is accessing its network; other companies could also identify platforms and link keys to user identities. Bechtold reviews the merits of "Privacy Certification Authorities" providing pseudonyms, and of so-called "Direct Anonymous Attestation", which could be used to provide a higher level of anonymity (cf. TCG 2003).
The article also addresses other issues, such as using related patents to limit competition.

The reader gets the impression that Bechtold intends to warn of potential negative effects. In contrast to earlier such warnings, e.g. Ross Anderson’s (2005), he separates issues of TC (according to the TCG specifications), Microsoft’s plans, and DRM very clearly (cf. Safford 2002). In this sense, his work is a very useful early warning.

Summarising one can say that there are three major risks:

  1. Dominance of players. This could result in high prices, and in particular the use of open source software could be hindered if certificates were made available only with a delay or at excessive cost.
  2. Loss of capabilities to exploit copyright limitations.
  3. Loss of privacy.

These could be addressed by the following remedies:

  1. Remote attestation could be requested from only a small part of a computer, e.g. a compartment.
  2. Competing operating systems and competing institutions providing keys and hash values would be necessary for consumers to have a choice. Thus, a possible abuse of market power would be hindered. With enough competition, applications not using TC would also remain available.
  3. Control of abuse of market power through the policy maker.
  4. Privacy Certification Authorities and Direct Anonymous Attestation could be deployed to provide more privacy.

With respect to the design of DRM systems, Bechtold believes in "value centered design", enabling DRM implementations that preserve copyright limitations, such as private copies (cf. 2004a).

The reviewers would like to bring up a few issues for discussion:

First, Bechtold presents a fairly short list of positive effects, essentially stating that digital signatures could be implemented more securely. Other potential effects of TC, such as increased security against theft of data, e.g. from stolen laptop computers, are underemphasised, as is the potential of secure computers to reduce the burden of fighting malicious code. But elaborating on such benefits was apparently not within the scope of his article.

Second, Bechtold seems to assume that all the hardware and software envisaged to be built on the TCG principles will work properly. This may not be the case, however: it is by no means guaranteed that all the functions can be implemented in an error-free way. He writes, e.g., that existing PC architectures need only be "marginally modified" (2005b, p. 394), that "Trusted Computing will offer a much higher level of security" (p. 404), or that "it is impossible for insecure software, viruses and other dangerous programs to hide their existence on a Trusted Computing-platform" (p. 399). This will only be the case if TC is implemented perfectly. In particular, it seems doubtful whether permanent attestation is feasible. If attestation is not permanent, but takes place, e.g., only during the system’s boot process, then malicious code, cracking software, etc., might run even in a verified compartment. Regarding DRM, there is also the challenge of building PCs which make it difficult to intercept data at some point. Applying the BORA ("break once, run anywhere") principle, cracked content could run undetected in a separate compartment. Protections such as watermarking might remain, though, and the process might be illegal, which would reduce such abuse.

Third, there is the interesting question of whether Microsoft will aim at blocking the virtualisation of non-Microsoft operating systems and compartments. New Microsoft operating systems could ensure, with the help of the TPM, that they only run if no compartments with other operating systems are running. This would hinder competition.

Bottom line
One could regard Bechtold’s worries as an example of German thoroughness and of scepticism towards new technologies. It seems, however, that his work is timely, as there is a good chance that within the next few years hundreds of millions of TPMs will be in PCs. It is therefore important to monitor whether Trusted Computing leads to secure systems, or to lock-in. Regarding DRM, Bechtold warns that TC might prevent users from exercising rights provided by copyright law, so this issue will also warrant continued monitoring.


About the authors: Arnd Weber, researcher at ITAS, is currently participating in the OpenTC project. Recent research addressed success factors of the Japanese mobile data (and music) market. He has also done work on "secure wallets" in the framework of the EU projects CAFE and SEMPER. Dirk A. Weber has experience with managing corporate networks as a Microsoft Certified Systems Engineer and as a Certified Novell Engineer.

Status: first posted 02/03/06; licensed under Creative Commons; included in the INDICARE Monitor of February 2006