HK1168224A - Validation and/or authentication of a device for communication with a network - Google Patents

Info

Publication number
HK1168224A
HK1168224A (application HK12108823.2A)
Authority
HK
Hong Kong
Prior art keywords
trusted
integrity
trusted component
trust
root
Prior art date
Application number
HK12108823.2A
Other languages
Chinese (zh)
Inventor
Y.C.沙阿
I.查
A.施米特
A.莱切尔
S.J.考尔
J.格雷多纳
Original Assignee
InterDigital Patent Holdings, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by InterDigital Patent Holdings, Inc.
Publication of HK1168224A

Description

Validation and/or authentication of devices in communication with a network
Cross Reference to Related Applications
The present application claims the benefit of U.S. provisional patent application No. 61/253,687, filed on October 21, 2009, and U.S. provisional patent application No. 61/169,630, filed on April 15, 2009, the disclosures of which are incorporated herein by reference.
Background
Currently, devices such as mobile phones, femtocells, home nodes, cable modems, network access points, and the like can connect to a communication network. Via the connection, the devices may receive and/or initiate telephone calls, access the internet, and so on using the communication network. However, these devices typically do not include a system or method for validating the integrity of the components included in the device (e.g., prior to connecting to a network).
Disclosure of Invention
Systems and methods for performing trusted computing are provided. For example, devices such as computing devices, mobile devices, femtocells, access point base stations, home nodes (e.g., an enhanced home node B (H(e)NB)), and so forth, may include a trusted component. The trusted component may be verified by a trusted third party and may have a verification certificate stored therein based on the verification by the trusted third party.
According to an example embodiment, the trusted component may include a root of trust, such as an immutable root of trust, which may provide secure code and data storage and secure application execution. The root of trust may also be configured to verify the integrity of the trusted component, for example via a secure boot (e.g., a staged secure launch).
According to an example embodiment, the device may operate according to a first scheme when the integrity of the trusted component fails the verification of the root of trust and according to a second scheme when the integrity of the trusted component passes the verification. Thus, in an example embodiment, the trusted component may invoke secure boot and runtime operations, including real-time integrity verification of the device, external entities, and communication links.
Drawings
FIG. 1 illustrates an example embodiment of a device that may be used in wireless communications;
FIG. 2 illustrates an example embodiment of a device that may include a trusted component;
FIG. 3 illustrates an example embodiment of a method of establishing a trusted component that may be included in a device;
FIG. 4 illustrates an example embodiment of trusted components that may be included in a trusted environment of a device;
FIG. 5 illustrates an example embodiment of a trusted component in communication with one or more components in a device;
FIG. 6 illustrates an example embodiment of a security access monitor and security access table that may be included in a trusted component;
FIG. 7 depicts an example embodiment of a method of validating a component in a device through a secure launch;
FIG. 8 illustrates an example embodiment of autonomous validation by a device;
FIG. 9 depicts a flow diagram of an example method for autonomous validation of a device;
FIGS. 10-11 illustrate example embodiments of remote validation of a device;
FIG. 12 depicts a flow diagram of an example method for remote validation of a device;
FIG. 13 illustrates an example embodiment of semi-autonomous validation.
Detailed Description
Fig. 1 depicts an example implementation of a device 100 that may be used in wireless communications. According to an example embodiment, device 100 may be a computing device, a sensor node, a mobile device, a femtocell, an access point base station, a home node (e.g., an enhanced home node B (H(e)NB)), a base station, or any other suitable device that may access a network and/or extend service coverage (e.g., cellular coverage, where access may be limited or unavailable). As shown in fig. 1, the device 100 may also communicate with one or more user devices 102, such as computing devices, cellular phones, personal digital assistants (PDAs), sensor nodes, and the like.
The device 100 may also communicate with external communication entities, such as a network 104. According to one embodiment, the network 104 may be a broadband network, such as a DSL network, a cable network, or the like. According to an example embodiment, an external communication entity (e.g., network 104) may include a number of components including a Platform Validation Entity (PVE)105, a security gateway (SeGW)106, a home node management system (HMS)107, and/or an Operations and Administration (OAM) component 109. As shown in fig. 1, device 100 may communicate with network 104 via a security gateway (SeGW)106, whereby device 100 may use network 104 to initiate and/or establish wireless communications, such as telephone calls, text messages, email messages, data sessions (e.g., communications via the internet), and so forth. For example, a user may interact with the user device 102 to initiate a wireless communication (e.g., a telephone call with a recipient). The user device 102 may initiate wireless communication with a recipient using the device 100 when the user device 102 may be within range of the device 100. For example, user device 102 may send or provide a request or information to initiate wireless communication to device 100. The device 100 may then send the request or information to, for example, the network 104, whereby a communication session (e.g., a telephone call) may be established between the user and the recipient.
According to an example embodiment, the integrity of the device 100 and the components contained therein may be verified before the device 100 authenticates with the network 104, the user device 102, and/or another external communication entity (e.g., over a Universal Serial Bus (USB) connection, a Bluetooth connection, a FireWire connection, etc.). For example, the device 100 may face various security threats, such as compromised credentials, physical attacks, configuration attacks, protocol attacks, network attacks, user data attacks, identity privacy attacks, radio resource management attacks, and so forth. To prevent these security threats from affecting, for example, the network 104, the user device 102, and/or another external communication entity, the integrity of the device 100 and the components therein may be verified to ensure that the device 100 and its components have not been compromised or altered from a trusted state.
FIG. 2 depicts an example implementation of a device 100 that may include trusted components. As shown in fig. 2, device 100 may include a processor 110, a memory 112, a transceiver 114, a power supply 116, and an antenna 118.
Processor 110 may include a standard processor, a special-purpose processor, a microprocessor, or the like, which may execute instructions for performing trusted computations, such as instructions for initiating a secure boot through a root of trust (which may include loading and executing trusted components); instructions to verify the integrity of the trusted component; and instructions to operate according to a particular scheme depending on whether the integrity of the trusted component is verified. If the integrity of the trusted component 120 is not verified, the scheme under which the processor 110 operates may include preventing access to information, such as credentials or certificates, that is required to authenticate the device 100 with an external communication entity (e.g., the network 104). For example, the device 100 may authenticate with an external communication entity (e.g., the network 104) using credentials by means of any suitable authentication technique, including but not limited to a device authentication technique, a certificate-based authentication technique, or any EAP-AKA-based authentication technique.
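As a concrete illustration of this two-scheme behavior, the sketch below gates the operating scheme on whether the trusted component's measured hash matches its reference value. All names, the image contents, and the scheme labels are hypothetical placeholders, not values from the patent:

```python
import hashlib

# Hypothetical reference metric for the trusted component image.
TRUSTED_REFERENCE_VALUE = hashlib.sha256(b"trusted-component-image-v1").hexdigest()

def measure_integrity(component_image: bytes) -> str:
    """Measure the trusted component by hashing its image."""
    return hashlib.sha256(component_image).hexdigest()

def select_scheme(component_image: bytes) -> str:
    """Operate under a full scheme when integrity verifies; otherwise fall
    back to a restricted scheme that blocks access to the authentication
    credentials needed to attach to an external communication entity."""
    if measure_integrity(component_image) == TRUSTED_REFERENCE_VALUE:
        return "full-operation"
    return "credentials-blocked"

print(select_scheme(b"trusted-component-image-v1"))  # full-operation
print(select_scheme(b"tampered-image"))              # credentials-blocked
```

In practice the reference value would live in protected storage inside the trusted component rather than in ordinary program memory.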
As described above, the device 100 may also include a memory 112. In one embodiment, the memory 112 may store instructions, code, data, or any other suitable information that may be executed by the processor 110. According to an example embodiment, the memory 112 may include random access memory (RAM), read-only memory (ROM), cache memory, flash memory, a hard disk, or any other suitable storage device. As depicted in FIG. 2, in one embodiment, the memory 112 may be integrated into the processor 110. According to another embodiment, the memory 112 may be a separate component in communication with the processor 110.
The device 100 may also include a transceiver 114, and the transceiver 114 may be in communication with the processor 110 and the antenna 118. According to an example embodiment, the transceiver 114 and the antenna 118 may facilitate the transmission and/or reception of wireless communications and/or wired communications, such as telephone calls, text messages, e-mail messages, data sessions (e.g., communications via the internet), and so forth.
As shown in fig. 2, in one embodiment, the device may also include a power supply 116. The power supply 116 may be a battery power source, an AC/DC power source, an energy-harvesting power source, or the like that provides power to the device 100 and the components of the device 100. For example, the power supply 116 may provide power to the processor 110, the memory 112, the transceiver 114, the antenna 118, or any other component such that the components included in the device 100 function as described above.
As described above, device 100 may also include trusted component 120. According to an example embodiment, the trusted component 120 may be based on a chain of trust that may be anchored at a root of trust and provide a secure execution environment for low and high level applications.
According to one embodiment, trusted component 120 may load data and applications after the authenticity and integrity of those components have been checked. The trusted component 120 may also provide an execution environment in which loaded applications are secure and not compromised. In an example embodiment, the trusted component 120 may be certified by a trusted third party, similar to an operator's certification of a Universal Integrated Circuit Card (UICC), as will be described in detail below. Further, the trusted component 120 may indicate to the user that the device 100 is trusted, and the network operator or network may verifiably determine that the device 100 contains a trusted component in order to establish a trust level.
Each hardware and/or software component that makes up the trusted component 120 may be certified for security and trustworthiness. For example, the trusted component 120 may undergo a physical attestation process, with security certificates delivered alongside the platform design, whereby the authenticity of the trusted component 120 may be verified. In one embodiment, incremental inclusion of trusted hardware and/or software may be used to create a chain of trust for the trusted component 120, as will be described in detail below.
Thus, according to an example embodiment, trusted component 120 may provide users and operators with trust measurements that may be used to provide direct control and access control over information (e.g., identity) as well as privacy control. For example, trusted component 120 may provide secure and reliable measurement, reporting, and verification of device trustworthiness; secure trusted operation of user applications; secure trusted protection of the reliability, confidentiality, integrity, availability, and privacy of data (e.g., a user's identity or virtual identity); granular control of access to and propagation of user information; and so forth.
In one embodiment, the trusted component 120 may comprise a logically independent entity and a set of functions and resources in the device 100, whereby the trusted component 120 may also provide integrity or trust state protection and secure storage for, for example, sensitive data, cryptographic techniques, timestamps, software secure execution, and the like.
According to an example embodiment, integrity or trust status protections that may be provided by the trusted component 120 may include trust status measurements, verifications, and protections. For example, the trusted component 120 may provide execution of a security scheme, protection of availability and integrity of hardware functions (which may form the basis of security critical functions of the device 100), authentication of the device 100, verification of the trusted component 120 and/or the device 100, and so forth.
As described above, the trusted component 120 may provide secure storage of various information. For example, trusted component 120 may include a secure store for storing authentication credentials, reference security metrics (e.g., trusted reference values), sensitive data, or any other suitable sensitive information. According to one embodiment, the sensitive data may include security sensitive functions (including keys, cryptographic algorithms) or any other suitable sensitive function or data.
The trusted component 120 may also provide cryptographic techniques including, for example, encryption, decryption, signature creation and validation, and hash computations. For example, trusted component 120 may perform cryptographic functions, such as device authentication or other security-sensitive functions, including symmetric key-based encryption and decryption, asymmetric key-based encryption and decryption, hash value generation and verification, random number generation, and generation and verification of digital signatures. Further, the trusted component 120 may provide random number generation (which may include pseudo-random number generation (PRNG)), whereby the trusted component 120 may provide protection and generation of PRNG values, e.g., seed, periodicity, and so forth. As described above, the trusted component 120 may also provide secure storage, which may hold security-sensitive functions and data used in cryptographic techniques (e.g., keys or cryptographic algorithms).
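A minimal sketch of a few of these cryptographic services, using only symmetric primitives from Python's standard library; the key and messages are illustrative placeholders, and a real trusted component would implement these functions inside protected hardware rather than ordinary software:

```python
import hashlib
import hmac
import secrets

def generate_key() -> bytes:
    """Random key generation (stands in for the component's secure RNG)."""
    return secrets.token_bytes(32)

def make_tag(key: bytes, message: bytes) -> bytes:
    """Generate a message-authentication tag (a symmetric, signature-like
    primitive; asymmetric digital signatures would need a crypto library)."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify_tag(key: bytes, message: bytes, tag: bytes) -> bool:
    """Verify a tag in constant time to resist timing attacks."""
    return hmac.compare_digest(make_tag(key, message), tag)

key = generate_key()
tag = make_tag(key, b"device measurement report")
print(verify_tag(key, b"device measurement report", tag))  # True
print(verify_tag(key, b"altered report", tag))             # False
```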
In one embodiment, the trusted component 120 may provide timestamps including, for example, secure and reliable timestamps for messages and data, cryptographic signatures, and the like. Trusted component 120 may also integrity protect components of device 100 that may provide real-time measurements, such as a real-time clock.
Trusted component 120 may protect functions (including instructions and data), such as software-executable functions, by separating those functions and data from other components of device 100 and protecting them from unauthorized access and compromise. Furthermore, execution of functions within trusted component 120, and data produced by those functions, may not be accessible to external entities (e.g., other components that are not trusted). Data (e.g., security-critical or sensitive data) may be stored in secure storage in an isolated environment, such as that provided by the cryptographic boundary of trusted component 120, and may be protected from external spying via user-accessible buses and interfaces. The trusted component 120 may also enable extraction of security parameters through a controlled access port using a predefined extraction scheme and data.
Trusted component 120 may also include a trusted unique identification (ID) that is bound to the identification of device 100 and may be used interchangeably with that identification. The trusted unique ID may be public or associated with a secret (e.g., a secret key) that is known only to the trusted component 120 and is not disclosed outside of the trusted component 120. The trusted unique ID may be used to sign a message, for example, as the public key of a key pair. According to an example embodiment, the trusted unique ID may be provided by the creator of the key pair, which need not be the same entity as the creator of the identity of device 100. Thus, in one embodiment, a mapping between these identifications may be provided based on, for example, a trusted unique ID that is physically and logically bound to the identification of device 100. For example, the trusted unique ID and associated secret key may be pre-provisioned by the manufacturer as part of the root of trust and may be associated with a certificate, as described below with respect to FIG. 3.
In one embodiment, the trusted component 120 may securely store a hosting party module (HPM) ID. The HPM ID may be transmitted to trusted component 120 for binding and authenticating the device 100 and a hosting party module (HPM). The HPM ID storage may be configured based on a scheme or rules, e.g., an operator scheme. The trusted component 120 may provide additional security functions and algorithms for associating the trusted component 120 with the HPM, or for associating the trusted component 120 with HPM data that may be configured by an operator or a user. Thus, according to an example embodiment, trusted component 120 may enable the device 100 to authenticate a hosting party and may provide evidence for a binding between the entities and credentials that are involved in the authentication of the device 100 and of the hosting party.
Trusted component 120 may also be provisioned with security-sensitive functions, cryptographic keys, and other credentials related to the identification of the device 100. According to an example embodiment, the trusted component 120 may be provided with security-sensitive functions, cryptographic keys, and other credentials, such as a device identification and a secret key associated with that identification that may be used for cryptographic operations, using a secure, out-of-band process, whereby the trusted component 120 may be configured to securely authenticate the identification of one or more components and authorize external entities or components using standard protocols. Thus, in one embodiment, an external entity can confirm that the trusted unique ID or the identity of the device 100 belongs to a valid and authorized trusted component 120.
According to an example embodiment, the trusted component 120 may provide operator configurable functional isolation in which software executable data and hardware functions may be separated from each other. Further, the secondary identification for these functions may be embedded into the trusted component 120 based on authentication with a network (such as the network 104 that is capable of verifying the trusted component 120 via a standardized security protocol). In one embodiment, trusted component 120 may download additional operator-configurable functionality after device 100 is deployed.
Trusted component 120 may also include one or more interfaces, such as an interface that may be initialized during a secure boot process (e.g., secure boot), as will be described in detail below. According to an example embodiment, the one or more interfaces may include unprotected interfaces. An unprotected interface may facilitate communication between the trusted component 120 and general resources or components of the device 100. The unprotected interface may also provide access to data that may be cryptographically protected by the trusted component 120 and may not be stored in secure storage.
The one or more interfaces may also include a protected interface. The protected interface may provide protection for the integrity and confidentiality of data carried between various components or modules in the trusted component 120. For example, in one embodiment, the protected interface may use a security protocol that may provide encrypted communications between the various components using the protected interface. The security protocol may include security measures such as authentication of components in communication with trusted component 120, as well as message authentication and confidentiality.
FIG. 3 illustrates an example embodiment of a method of establishing a trusted component that may be included in a device. As indicated above, a trusted component (e.g., trusted component 120) may be included in device 100 of fig. 2. According to an example embodiment, trusted component 120 may be used to verify or prove the trustworthiness of device 100 to an external entity (e.g., network 104). The verification may include validating a trust chain (e.g., a supply chain) as well as operational functions and/or applications of the device 100.
In an example embodiment, the trusted component 120 may provide a hardware-based root of trust and a trusted environment for the device 100, and the security and functionality of the trusted component 120 may be tested by an independent trusted third party 202. The trusted third party 208 may then certify the trusted component 120 based on the test. According to an example embodiment, the certification may be delivered as a digital certificate, which may be transmitted to any external communication entity (e.g., the network 104) to which the device 100 attaches, in order to prove the certification of the device 100.
Further, the development tool 204 may be used to develop code and data images that may incorporate trusted reference values, which may be, for example, digests or hashes of the code and data components of the executable code image. According to an example embodiment, the trusted reference values may be used to verify the integrity of code contained in the device 100 and to detect code or data that has been compromised.
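The idea of a trusted reference value can be sketched as follows. The image sections and their contents are hypothetical; a real development tool would hash the actual code and data segments of the executable image:

```python
import hashlib

def trusted_reference_value(sections: dict) -> str:
    """Compute a trusted reference value (TRV) as a digest over the code
    and data sections of an image, hashed in a fixed order so the TRV is
    reproducible at verification time."""
    h = hashlib.sha256()
    for name in sorted(sections):
        h.update(name.encode())     # bind each digest to its section name
        h.update(sections[name])
    return h.hexdigest()

image = {"code": b"\x90\x90\x90", "data": b"\x00\x01"}
trv = trusted_reference_value(image)
# The same sections always yield the same TRV; any change is detectable.
print(trv == trusted_reference_value({"code": b"\x90\x90\x90", "data": b"\x00\x01"}))  # True
print(trv == trusted_reference_value({"code": b"\xcc\x90\x90", "data": b"\x00\x01"}))  # False
```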
The code image may also be validated by the trusted third party 208 and may be delivered with a digital certificate that may be transmitted to any external communication entity, such as the network 104 to which the device 100 may be attached to prove validation of the device 100.
As shown in fig. 3, the stand-alone tester 206 may test the trusted component 120 and its coded security features and functionality, and may provide input to a Certificate Authority (CA) 208 to generate digital certificates for the trusted component 120 and the code image.
Device manufacturer 210, e.g., a wireless device manufacturer, may then incorporate the trusted component in the design and may load a certified code image. For example, device manufacturer 210 may receive trusted component 120 along with certified code and trusted reference values. Device manufacturer 210 may then establish a device, such as device 100, which may include trusted component 120 as well as certified code and trusted reference values.
When the device 100 attaches to, for example, the network 104, the device 100 may report or provide credentials for the trusted component and code images, as well as various integrity measurements, to the network 104 to validate the device 100 with the network. For example, the network 104 may verify that the device 100 is trustworthy, and thus the network 104 may enable the device 100 to establish a communication link to the network 104.
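The network-side check can be sketched as follows. Component names and images are hypothetical; in a deployment the comparison would be performed by a validation entity (such as the PVE 105 of FIG. 1) against the certified reference values delivered with the digital certificates:

```python
import hashlib

# Hypothetical certified reference values held by the network side.
CERTIFIED_REFERENCE_VALUES = {
    "trusted_component": hashlib.sha256(b"certified-tc-image").hexdigest(),
    "code_image":        hashlib.sha256(b"certified-code-image").hexdigest(),
}

def validate_device(reported_measurements: dict) -> bool:
    """Allow attachment only if every certified component was reported
    and its measurement matches the certified reference value."""
    return all(
        reported_measurements.get(name) == reference
        for name, reference in CERTIFIED_REFERENCE_VALUES.items()
    )

# A device reports measurements taken over its actual images at boot.
report = {name: hashlib.sha256(img).hexdigest()
          for name, img in [("trusted_component", b"certified-tc-image"),
                            ("code_image", b"certified-code-image")]}
print(validate_device(report))                                 # True: attach allowed
print(validate_device({**report, "code_image": "deadbeef"}))   # False: attach refused
```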
FIG. 4 illustrates an example embodiment of a trusted component 120 that may be included in a trusted environment of a device, such as the device 100. According to one embodiment, device 100 may include trusted component 120 as well as other components that are not part of the trusted environment. For example, as described above, the trusted component 120 may comprise a logically independent entity and a set of functions and resources in the device 100, whereby the trusted component 120 may provide a trusted environment for integrity or trust-state protection, e.g., secure storage of sensitive data, cryptographic techniques, timestamps, secure software execution, and so forth. In particular, as shown in fig. 4, trusted component 120 may include a High Security Core (HSC) 122, a Modular Security Environment (MSE) 124, a trusted interface 126, a core interface 128, and a core interface manager (core IFM) 130. Although the embodiment of trusted component 120 shown in fig. 4 is representative of one implementation in a home node B device, it should be understood that the implementation is not so limited, and trusted component 120 may be implemented in any of the computing devices described above having wired or wireless communication capabilities.
According to an example embodiment, HSC 122 may include a root of trust 132, a trusted core 134, and a trusted interface manager (TrE IFM) 136. The root of trust 132 may have access to the device 100, the trusted component 120, and the HSC 122. According to one embodiment, the root of trust 132 may include a set of immutable, non-removable hardware resources that may be physically bound to the device 100, whereby the root of trust 132 may ensure the integrity of the trusted core 134 and/or the trusted interface manager 136 during a secure boot process (e.g., secure booting of the device). For example, the root of trust 132 may be a write-protected read-only memory (ROM) unit that may include functionality similar to a basic input/output system (BIOS). The root of trust 132 may also securely store information for validation or verification of, for example, the trusted component 120. For example, the root of trust 132 may securely store a reference metric, such as a trusted reference value associated with the trusted component 120. According to an example embodiment, the root of trust 132 code may be encrypted and/or decrypted using cryptographic techniques that may be included in the trusted component 120.
As described above, HSC 122 may include trusted core 134. According to an example embodiment, the trusted core 134 may provide one or more functions for trusted components, such as integrity measurement, verification, reporting and execution, autonomous or semi-autonomous validation; cryptographic functions such as encryption and decryption, signature creation and validation, and hash value calculation; a function of adding a secure time stamp to the confirmation data; and others. Trusted core 134 may also provide secure storage for: a secret, a key, a reference metric (e.g., a trusted reference value associated with a component that may be used for validation or verification), an authentication credential (e.g., a device identification and a secret key associated with the device identification that may be used for cryptographic operations), or any other information or data. In one embodiment, an extended secure boot process (e.g., secure boot) may be performed by trusted core 134, as will be described in detail below.
Trusted interface manager 136 may manage, for example, trusted interface 126, which trusted interface 126 may provide communications between trusted component 120 and other components of device 100. According to an example embodiment, trusted interface manager 136 may manage trusted interface 126 based on one or more schemes.
Trusted component 120 may also include a core interface manager 130. The core interface manager 130 may manage a core interface 128, which may provide communication between the HSC 122 and the MSE 124 and may also provide communication between the trusted interface manager 136 and the trusted core 134. For example, the core interface manager 130 may control access to the trusted core 134 and associated resources, and may load executable modules, such as software, and associated data into the MSE 124 described above. According to an example embodiment, the core interface manager 130 may be included in the HSC 122. Further, the integrity of the core interface manager 130 may be protected and/or verified by the extended secure boot process that may be executed by the trusted core 134. The core interface manager 130 may also start the HSC 122 and/or the MSE 124 upon verification via the extended secure boot process.
HSC 122 may also include physical components that may be bound to device 100, such as cryptographic units, the root of trust 132, physical secure storage, and so forth. According to one embodiment, the physical components and the physical secure storage may comprise separate hardened hardware units. Physical components may also be protected from physical attacks such as simple and differential power analysis, probing, and the like. According to an example embodiment, such protection may be provided to the extent required by a particular application. The HSC 122 may also include an interface that may protect HSC 122 data from unauthorized access or compromise and may control access to the trusted core 134. Thus, in an example embodiment, the physical components, physical secure storage, and interfaces may ensure the security of the HSC 122.
MSE 124 may provide a trusted environment for executing applications, which may be, for example, an Operating System (OS) verification module, a time synchronization module, a validation module, and/or the like. For example, core interface manager 130 may load an application module included in device 100 into MSE 124 based on one or more schemes or rules. In one embodiment, each application module that may be loaded may run in a protected environment of MSE 124 that is logically separate and isolated from other such environments. Trusted core 134 may also verify the integrity of the module via core interface manager 130 prior to loading the module into MSE 124.
According to an example embodiment, MSE 124 may enable extension of trusted core 134 for applications, such as security critical applications, based on one or more schemes or rules. Based on the security scheme, the security of the MSE 124 may be ensured by verifying the integrity of the loaded application via the trusted core 134 and the trusted interface manager 136 (which may initiate access control of resources of the trusted component 120 by entities external to the trusted component based on the security scheme).
As described above, the trusted component 120 may be launched via a secure launch process (e.g., secure boot) to ensure that the device 100 may be launched in a predefined trusted state. In an example embodiment, a secure boot process (e.g., secure boot) may include booting the HSC 122, the MSE 124, the trusted interface 126, the core interface 128, and the core interface manager 130. In particular, in one embodiment, the root of trust 132 may securely launch a trusted element of an Operating System (OS), such as a boot loader for an OS core. According to one embodiment, the boot loader may include an indication of the code and/or components loaded for execution and an indication of whether the integrity of the loaded code and/or components has been verified. For example, the boot loader may include a list of codes and/or components that have been loaded into memory, including, for example, whether the integrity of the codes and/or components has been verified, whereby the boot loader may be used to determine which codes and/or components need to be loaded and whose integrity has been verified.
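The boot loader's record described above might be modeled as follows; the component names and the helper function are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class BootEntry:
    """One boot-loader record: a component, whether it was loaded into
    memory, and whether its integrity was verified."""
    component: str
    loaded: bool
    integrity_verified: bool

# Hypothetical boot log after a staged launch.
boot_log = [
    BootEntry("trusted_core", loaded=True, integrity_verified=True),
    BootEntry("os_kernel", loaded=True, integrity_verified=True),
    BootEntry("optional_module", loaded=True, integrity_verified=False),
]

def unverified_components(log):
    """Components that were loaded but whose integrity was not verified."""
    return [e.component for e in log if e.loaded and not e.integrity_verified]

print(unverified_components(boot_log))  # ['optional_module']
```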
Root of trust 132 may also securely launch trusted core 134 via, for example, secure boot, whereby trusted core 134 may launch other components of trusted component 120, including HSC 122 or MSE 124.
A secure boot process (e.g., secure boot) may include measuring the integrity of or verifying the trust status of each component or element before the component or element is booted. For example, the measured integrity value may be compared to a predetermined reference metric, such as a trusted reference value, to determine whether the measured integrity value matches the predetermined reference metric. In one embodiment, the predetermined reference metric for the component may be derived by computing a hash over the component, for example, using a particular hashing algorithm. Thereafter, to ensure the integrity of the component during the secure boot process, the device may again compute the hash on the component using the same hashing algorithm. The new hash defines the measured integrity value. According to an example embodiment, when the measured integrity value matches a predetermined reference metric, the integrity of the component may be verified and the component may then be started. Alternatively, when the measured integrity value fails to match the predetermined reference metric, the integrity of the component cannot be verified, and therefore, the component cannot be started. The secure boot process may also include using the trusted component 120 to securely boot other components in the device 100, including, for example, an operating system.
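The measure-and-compare step described above can be sketched in a few lines. This is a minimal illustrative sketch (function names and the choice of SHA-256 are assumptions, not part of the claimed subject matter):

```python
# Sketch of a secure-boot integrity check: a hash is computed over the
# component image and compared to a predetermined trusted reference value
# that was derived earlier with the same hashing algorithm.
import hashlib
import hmac

def measure(component_image: bytes) -> str:
    # Measured integrity value: hash computed over the component.
    return hashlib.sha256(component_image).hexdigest()

def may_start(component_image: bytes, trusted_reference_value: str) -> bool:
    # Start the component only if the measurement matches the reference
    # metric; compare_digest performs a constant-time comparison.
    return hmac.compare_digest(measure(component_image), trusted_reference_value)

firmware = b"example component image"
reference = hashlib.sha256(firmware).hexdigest()  # provisioned ahead of time
```

A component whose image has been altered produces a different hash, so `may_start` returns `False` and the component would not be started.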
In one embodiment, the root of trust 132 may remain unchanged and non-removable after the trusted component 120 and multiple components therein have all been launched via a secure launch process (e.g., secure boot). However, if trusted core 134 can detect damage to device 100, trusted core 134 may render itself and/or other components of trusted component 120 inoperable.
FIG. 5 illustrates an example embodiment of a trusted component in communication with one or more components in a device. As shown in fig. 5, according to other example embodiments, trusted component 120 may include a security access monitor 140. Security access monitor 140 may be a gateway to hardware and/or software components that may be included in trusted component 120, or may be a gateway to hardware and/or software components external to trusted component 120.
According to an example embodiment, security access monitor 140 may be similar to a Memory Management Unit (MMU) for providing chain-based and/or real-time integrity verification. Security access monitor 140 may also allow or deny access to memory, may allow or deny access to Direct Memory Access (DMA), may allow or deny access to peripherals, may define security protection features for hardware and software, may identify trusted memory contents, may provide dynamic real-time address remapping, and/or may provide state-based access control. In one embodiment, security access monitor 140 may include a security access table that may be used to control access to memory, peripherals, etc. and may be used during chain-based and/or real-time integrity verification, as will be described in detail below.
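A state-based security access table of the kind described above can be sketched as a lookup with a deny-by-default policy. The table layout, resource names, and state names below are assumptions for illustration only:

```python
# Hypothetical sketch of a security access table, in the spirit of security
# access monitor 140: access to memory regions, DMA, and peripherals is
# decided per (resource, device state) pair, denying anything not listed.
ACCESS_TABLE = {
    # (resource, device_state) -> allowed?
    ("secure_memory", "trusted"): True,
    ("secure_memory", "untrusted"): False,
    ("dma", "trusted"): True,
    ("dma", "untrusted"): False,
    ("public_rom", "trusted"): True,
    ("public_rom", "untrusted"): True,
}

def access_allowed(resource: str, device_state: str) -> bool:
    # Deny by default: any resource/state pair not in the table is refused.
    return ACCESS_TABLE.get((resource, device_state), False)
```

In this sketch, an untrusted state can still read public boot ROM but is refused access to secure memory and DMA.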
The trusted component 120 may also include a hash function 142. For example, before code or instructions, components, data, etc. are accessed or executed (as described above), the trusted component 120 may apply the hash function 142 to such code or instructions, components, data, etc. in order to verify their integrity. In an example embodiment, the hash function 142 may support a combination of hash algorithms, including, for example, the MD5 algorithm and a Secure Hash Algorithm (SHA) (e.g., SHA-1, SHA-256, SHA-512, or other SHA-based algorithms).
The hash function 142 may also process data provided by the security access monitor 140 and may generate a signature or hash of the data. According to one embodiment, the generated signature or hash may be compared to a desired trusted reference metric or value (i.e., a previously computed hash) for verification, which may be stored, for example, in a component of trusted component 120, such as security access monitor 140, as described in detail below. For example, the integrity of software code or instructions, components, data, etc. may be verified by comparing a generated signature or generated hash value provided by, for example, the hash function 142 to, for example, a reference hash value or an expected trusted reference value (e.g., a predetermined reference metric). If the signatures or hash values do not match, the software code or instructions, components, data, etc. may have been compromised.
As shown in fig. 5, the trusted component 120 may also include a decryption engine 144 and an encryption engine 146. According to an example embodiment, the decryption engine 144 may decrypt code or instructions for verifying the integrity of one or more components of the device 100, for example. The decryption engine 144 may also decrypt data from, for example, a component of the device (e.g., a component that may be external to the trusted component 120, which trusted component 120 may be used by the processor 110 or may be stored, for example, in the secure memory 148). In an example embodiment, encryption engine 146 may provide confidentiality and integrity protection (e.g., encryption) using one or more encryption algorithms (e.g., Advanced Encryption Standard (AES) and Data Encryption Standard (DES)) for code or instructions and data that may be stored in secure memory 148 and/or provided to one or more components external to trusted component 120.
The trusted component may also include a secure timer 150 and a tamper detection component 152. The secure timer 150 may provide a real-time clock for timing functions such as a secure time-based protocol or timed access control. The secure timer 150 may also be used to verify secure timing, to detect abnormal functionality or a possible security compromise, or to protect a processor from freezing or hanging.
According to an example embodiment, the tamper detection component 152 may detect and report unsecure or unauthorized access to, or tampering with, components of the device 100. For example, the tamper detection component 152 may include a dedicated unit. The dedicated unit may include a series of modules, included in the trusted component 120, that may detect and report possible unsecure access to, or compromise of, hardware, software, and data. According to an example embodiment, the tamper detection component 152 may perform temperature measurements, clock integrity measurements, voltage measurements, key protection, and the like.
As shown in fig. 5, the trusted component 120 may include a key generator 154 and a random number generator 156. According to an example embodiment, the key generator 154 may generate and/or provide security keys that may be used by, for example, the decryption engine 144 and/or the encryption engine 146 to decrypt and/or encrypt code or instructions and data. Similarly, the random number generator 156 may be used to generate and/or provide a random number or value that may be used during, for example, authentication of one or more components of the device 100 and/or during generation of a key by, for example, the key generator 154.
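The roles of the key generator 154 and the random number generator 156 can be sketched with a cryptographically secure random source. This sketch is illustrative only; the function names and key sizes are assumptions:

```python
# Illustrative sketch: generating a symmetric security key and a fresh
# random challenge value, mirroring key generator 154 and random number
# generator 156. secrets draws from a CSPRNG suitable for security use.
import secrets

def generate_key(length_bytes: int = 32) -> bytes:
    # e.g., a 256-bit key usable by a symmetric cipher such as AES-256.
    return secrets.token_bytes(length_bytes)

def generate_nonce(length_bytes: int = 16) -> bytes:
    # Fresh random value, e.g., for use during an authentication exchange.
    return secrets.token_bytes(length_bytes)

key = generate_key()
nonce = generate_nonce()
```

The nonce would typically be sent as a challenge during authentication, while the key would feed an encryption or decryption engine.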
According to an example embodiment, the trusted component 120 may also be used to isolate secure code and data (including boot code, trusted tag (ticket) centric code, encrypted user programs and/or data, etc.) from non-secure components (e.g., non-secure hardware or software). For example, security access monitor 140 may be used to isolate or control access to secure code and data. Security access monitor 140 may also be used to control access to secure peripherals and Direct Memory Access (DMA) blocks.
FIG. 6 illustrates an example embodiment of a security access monitor and security access table that may be included in a trusted component. For example, as described above, security access monitor 140 may include security access table 160, which may be used to determine the integrity of one or more components of device 100. For example, in one embodiment, the security access table 160 may include a desired trusted reference value or a predetermined reference metric, such as a predetermined or stored hash value, which may be calculated over one or more components of the device. As described above, in one embodiment, trusted component 120 may compare a generated signature or measurement for the component to an expected trusted reference value or a predetermined reference metric to determine whether the signature or measurement matches the expected value or predetermined metric. If the signature or measurement matches the expected value or predetermined metric, the integrity of the component is verified.
According to an example embodiment, security access monitor 140 may verify addressable content and may verify the integrity of internal components and/or content of trusted component 120 when device 100 may be started or restarted. Once integrity is verified, the processor 110 may begin executing boot Read Only Memory (ROM) code, which may include hardened ASIC hardware and/or software that cannot be changed. In an example embodiment, the hardened ASIC hardware and software may provide a root of trust 132 for the trusted component 120.
FIG. 7 depicts an example embodiment of a method of validating components in a device (e.g., device 100) through secure launch. As shown in fig. 7, secure booting of a device may proceed from a root of trust, such as root of trust 132, through multiple phases to a fully functional state by establishing a chain of trust. In phase 1, trusted component 120 may be established from root of trust 132 in a secure launch (e.g., secure boot). For example, the root of trust 132 may be configured to verify the integrity of the trusted component 120 via secure boot. If the integrity of trusted component 120 is not verified in phase 1, root of trust 132 may operate according to a first scheme. For example, the root of trust 132 may prevent or restrict access to credentials (e.g., a device identification and a secret key associated with the device identification that may be used for cryptographic operations (including device authentication)), or may restrict or prevent access to other information stored in the trusted component 120 and/or restrict or prevent access to external components by the device 100. Furthermore, if the integrity of trusted component 120 is not verified in phase 1, the secure launch may be halted and other components in device 100 may not be verified in a subsequent phase.
Alternatively, if the integrity of trusted component 120 is verified in phase 1, root of trust 132 may operate according to a second scheme. For example, the root of trust may pass control to the trusted component 120. Trusted component 120 may then perform phase 2 of the secure boot. According to an example embodiment, in phase 2, trusted component 120 may verify, load, and launch further components essential to the operation of device 100. For example, in phase 2, trusted component 120 may verify the integrity of the communication stack, protocol stack, and/or network communication module. Trusted component 120 may then load and launch each component, such as a communication stack, protocol stack, and/or network communication module, whose integrity has been verified. According to an example embodiment, if the integrity of the communication stack, protocol stack, and/or network communication module is not verified in phase 2, the device 100 may operate according to the first scheme and/or any other suitable scheme that may be defined.
If the integrity of the necessary components is verified in phase 2, the trusted component 120 may perform phase 3 of the secure boot. According to an example embodiment, in phase 3, trusted component 120 may verify, load, and launch further components. For example, in phase 3, trusted component 120 may verify the integrity of applications, operating system components, other hardware components, and the like. Trusted component 120 may then load and launch each component (e.g., application, operating system component, other hardware component, etc.) with verified integrity. According to an example embodiment, if in phase 3 the integrity of one or more other components is not verified, device 100 may operate according to the first scheme and/or any other suitable scheme that may be defined.
As shown in fig. 7, according to an example embodiment, trusted component 120 may verify a component by obtaining each component measurement or value 145 (e.g., a hash) and comparing, via verification engine 149, the measurement or value to an expected or predetermined trusted reference value or measurement 147 that may be stored in device 100. According to an example embodiment, the expected or predetermined trusted reference value or measurement 147 may be securely provided or provisioned in a certificate, from which it may be extracted and stored in device 100. The integrity of a component may be verified if the measurement or value of the component matches the expected or predetermined trusted reference value, measurement, or certificate associated with the component. If, however, the measurement or value of the component does not match the expected or predetermined measurement or certificate associated with the component, the integrity of the component cannot be verified.
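The staged verification of FIG. 7 can be sketched as a chain that halts at the first failed comparison. This is an illustrative sketch only; the function names, stage names, and image contents are assumptions:

```python
# Sketch of a staged secure boot: each component's measured hash is checked
# against a provisioned trusted reference value before the component is
# started; a mismatch stops the chain so later stages never run.
import hashlib

TRUSTED_REFERENCE_VALUES = {}  # provisioned, e.g., extracted from certificates

def provision(name, image):
    TRUSTED_REFERENCE_VALUES[name] = hashlib.sha256(image).hexdigest()

def staged_boot(stages):
    started = []
    for name, image in stages:
        measured = hashlib.sha256(image).hexdigest()
        if measured != TRUSTED_REFERENCE_VALUES.get(name):
            break              # integrity not verified: halt the secure launch
        started.append(name)   # verified: load/start, then extend the chain
    return started

provision("trusted_component", b"tc image")
provision("comm_stack", b"stack image")
provision("application", b"app image")
```

With all images intact, all three stages start; if, say, the communication stack image were altered, only the trusted component would start and the boot would halt before phase 3.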
Fig. 8 illustrates an example embodiment of autonomous validation by the device 100. According to one embodiment, the autonomous validation of the device 100 may be performed or completed during startup of the device 100. For example, the device 100 may directly evaluate the measurements to verify the integrity of one or more components of the device 100, whereby components that fail verification cannot be activated, as described above. According to one embodiment, access to secure data, secure functions, etc. may also be prevented when the integrity of one or more components in the device 100 is not verified as described above. Further, the device 100 cannot be authenticated by the network 104 when the integrity of one or more components of the device 100 has not been verified, whereby the device 100 is prevented from connecting to the network 104, or the trusted component does not release the credentials that may be used to authenticate the device with the network.
Fig. 9 depicts a flowchart of an example method 300 for autonomous validation by the device 100. As shown in FIG. 9, at 305, the integrity of the trusted component 120 may be verified by the root of trust 132, as described above. According to an example embodiment, the integrity of the trusted component 120 may be verified as part of a staged secure boot that may be initiated by the root of trust 132.
Subsequently, at 310, it may be determined whether the integrity of the trusted component 120 is verified. For example, as described above, root of trust 132 may evaluate the measurements to verify the integrity of trusted component 120 by comparing the measurements of trusted component 120 to trusted reference values (which are associated with trusted component 120 and may be stored, for example, in root of trust 132). According to an example embodiment, this determination may be made as part of a staged secure boot initiated by the root of trust 132.
At 315, device 100 may operate according to a first scheme when the integrity of trusted component 120 is not verified. For example, the first scheme may restrict and/or prevent access to information included in the trusted component 120. Thus, in one embodiment, access to information that may be used to authenticate (e.g., authenticate device 100 to network 104) may be prevented when a trusted component fails verification.
At 320, the device 100 may operate according to the second scheme when the integrity of the trusted component 120 is verified. For example, as described above, when the integrity of trusted component 120 is verified, root of trust 132 may pass control to trusted component 120 to verify other components in device 100 as defined by the second scheme. Thus, for example, the device may be approved to operate according to the second scheme, such as by authenticating itself to an external communication entity (e.g., a network) to enable the device to communicate with that entity.
Figs. 10-11 illustrate example embodiments of remote validation of the device 100. For example, the device 100 may establish an initial connection to the security gateway 106 of, for example, the network 104. According to one embodiment, the device 100 may provide measurements associated with one or more components included in the device 100 to the network 104 via the connection to the security gateway 106.
The network 104 using, for example, the PVE 105 may then evaluate the received measurements against a predetermined reference metric (e.g., a trusted reference value) by comparing the received measurements to the predetermined reference metric as described above to determine whether one or more anomalies are encountered based on the comparison, including whether the integrity of one or more components in the device 100 is not verified. In one embodiment, network 104 may deny access to device 100 if one or more anomalies are encountered. According to another embodiment, network 104 may grant device 100 limited network access or isolated access if one or more anomalies have been encountered. The network 104 may also require the device 100 to perform one or more remedial actions if the one or more anomalies are errors with respect to non-core components (i.e., components that are not critical to the basic functionality of the device). For example, device 100 may return to a predetermined state in response to a remediation request.
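The network-side decision described above can be sketched as follows. This is a hypothetical sketch of a platform validation entity's policy; the function name, outcome labels, and core/non-core partition are assumptions for illustration:

```python
# Sketch of a PVE-style decision: compare reported measurements against
# predetermined reference metrics, then grant access, request remediation
# (non-core anomalies only), or deny/quarantine.
def pve_decision(reported, references, core_components):
    anomalies = {name for name, value in reported.items()
                 if references.get(name) != value}
    if not anomalies:
        return "grant_access"
    if anomalies <= (set(reported) - core_components):
        # Only non-core components (not critical to basic functionality)
        # failed: ask the device to remediate, e.g., return to a
        # predetermined state.
        return "request_remediation"
    return "deny_or_quarantine"
```

A device reporting a single anomalous non-core application would be asked to remediate, whereas an anomalous core component (e.g., the communication stack) would lead to denied or isolated access.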
Fig. 12 depicts a flow diagram of an example method 400 for remote validation of device 100. As shown in FIG. 12, at 405, the integrity of the trusted component 120 may be verified by a root of trust as described above.
At 410, the trusted component 120 may generate integrity measurements, such as hash calculations, for other components of the device 100.
At 415, for example, the trusted component 120 may provide the integrity measurement to the network 104 for use in validating the device 100 to the network 104. As described above, the network 104 using, for example, the PVE 105 may then evaluate the received measurements against predetermined reference metrics as described above by comparing the received measurements to the predetermined reference metrics to determine whether one or more anomalies are encountered based on the comparison, including whether the integrity of one or more components in the device 100 is not verified.
Fig. 13 illustrates an example embodiment of semi-autonomous validation. For example, the device 100 may evaluate the trust status measurements as described above and may store the results of the evaluation. The device 100 may then establish an initial connection to the security gateway 106 of, for example, the network 104. According to one embodiment, the device 100 may provide the evaluation results to the network 104 via the connection to the security gateway 106. The device 100 may also provide a subset of the measurements to the network 104 via the connection to the security gateway 106. Further, according to an example embodiment, the device 100 may evaluate and provide the measurements in response to a request from the network 104.
The network 104 may then make fine-grained (fine-grained) access control decisions based on integrity measurements of one or more components in the apparatus 100. For example, the network 104 may then determine one or more anomalies (e.g., whether the integrity of one or more components in the device 100 has not been verified) during the evaluation using, for example, the PVE 105. In one embodiment, network 104 may deny access to device 100 if one or more anomalies have been encountered. According to another embodiment, the network 104 may grant the device 100 limited network access or isolated access if one or more anomalies have been encountered. The network 104 may also provide a request to the device 100 to perform one or more remedial actions if the one or more anomalies are non-core component validation errors. For example, device 100 may return to a predetermined state in response to a remediation request.
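The division of labor in semi-autonomous validation can be sketched as follows. This is an illustrative sketch only; the report format and the network's thresholds are assumptions, not taken from the disclosure:

```python
# Sketch of semi-autonomous validation: the device evaluates its own
# measurements locally, reports the per-component results plus a subset of
# raw measurements, and the network makes a fine-grained access decision.
def device_report(measurements, references, subset):
    results = {name: (references.get(name) == value)
               for name, value in measurements.items()}
    return {
        "evaluation_results": results,                 # pass/fail per component
        "raw_measurements": {n: measurements[n] for n in subset},
    }

def network_policy(report):
    failures = [n for n, ok in report["evaluation_results"].items() if not ok]
    if not failures:
        return "full_access"
    # Fine-grained outcome: tolerate a single failure with limited access,
    # otherwise deny (remediation could be requested instead).
    return "limited_access" if len(failures) == 1 else "deny"
```

The raw-measurement subset lets the network double-check selected components while the device bears most of the evaluation cost.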
While various embodiments have been described in connection with the preferred embodiments of the various figures, it is to be understood that other similar embodiments may be used and modifications and additions may be made to the described embodiments for performing the same function of the various embodiments without deviating therefrom. Thus, the embodiments should not be limited to any one embodiment, but rather should be construed in breadth and scope in accordance with the appended claims.
Further, it should be understood that the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both. Thus, the methods and apparatus of the subject matter described herein, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the subject matter described herein. Where program code is stored on media, the program code in question may be stored on one or more media that collectively perform the action in question; that is, the media employed together contain the code for performing the action, but where more than one separate medium is present, no particular portion of the code need be stored on any particular medium. In the case of program code execution on programmable computing devices, where the program code may be pre-stored in the device or securely transferred to the device via a remote device management protocol (e.g., OMA DM or TR069), the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs may implement or utilize the processes associated with the subject matter, e.g., through the use of an API, reusable controls, or the like. Such programs are preferably implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired.
In any case, the language may be a compiled or interpreted language, and it may be combined with hardware implementations.

Claims (26)

1. An apparatus capable of authentication with an external communication entity, the apparatus comprising:
a trusted component configured to verify the integrity of other components of the device; and
a root of trust comprising a set of immutable hardware resources, wherein the root of trust is configured to verify an integrity of the trusted component, wherein the device operates according to a first scheme when the integrity of the trusted component fails verification by the root of trust, and the device operates according to a second scheme when the integrity of the trusted component passes verification by the root of trust.
2. The device of claim 1, wherein the root of trust is configured to verify the integrity of the trusted component via a secure staged boot.
3. The apparatus of claim 1, wherein the trusted component further comprises a secure store containing credentials for authenticating the apparatus with the external communication entity, wherein the first scheme comprises preventing access to the credentials if the integrity of the trusted component fails verification by the root of trust.
4. The apparatus of claim 3, wherein the root of trust securely stores a trusted reference value associated with the trusted component.
5. The device of claim 4, wherein the root of trust is configured to compare the measurement of the trusted component to a trusted reference value associated with the trusted component, wherein the integrity of the trusted component is verified when the measurement of the trusted component matches the trusted reference value associated with the trusted component.
6. The device of claim 4, further comprising other components, wherein the trusted component is further configured to verify the integrity of the other components if the integrity of the trusted component passes the verification of the root of trust.
7. The device of claim 6, wherein the device uses the other components during operation in the trusted mode.
8. The apparatus of claim 6, the secure storage further storing a trusted reference value associated with the other component.
9. The apparatus of claim 8, wherein the trusted component is configured to compare the measurements of the other components to trusted reference values stored in the secure storage associated with the other components, wherein the integrity of the other components is verified when the measurements of the other components match the trusted reference values associated with the other components.
10. The apparatus of claim 1, wherein the root of trust comprises a certificate associated with the root of trust, wherein the certificate reflects a verification of an integrity of the root of trust by a third party.
11. A method for validating one or more components in a device capable of authenticating with an external communication entity, wherein the device includes a root of trust having a set of immutable hardware resources, the method comprising:
verifying, by the root of trust, the integrity of a trusted component of the device;
determining whether the integrity of the trusted component is verified;
operating the device according to a first scheme when the integrity of the trusted component is not verified; and
operating the device according to a second scheme when the integrity of the trusted component is verified.
12. The method of claim 11, wherein operating according to a first scheme when the integrity of the trusted component is not verified comprises: preventing access to information required to authenticate the device with the external communication entity if the integrity of the trusted component is not verified.
13. The method of claim 11, wherein the root of trust verifying the integrity of the trusted component comprises: the root of trust initiates a secure staged boot upon which the trusted component is loaded and executed, wherein the integrity of the trusted component is verified as part of the secure staged boot.
14. The method of claim 11, wherein determining whether the integrity of the trusted component is verified comprises: comparing the measurement of the trusted component to a reference value associated with the trusted component, wherein the integrity of the trusted component is verified when the measurement of the trusted component matches the trusted reference value associated with the trusted component.
15. The method of claim 11, further comprising determining, as part of the trusted component, whether integrity of other components is verified.
16. The method of claim 15, wherein determining, as part of the trusted component, whether the integrity of the other component is verified comprises: comparing the measurements of the other components to reference values associated with the other components, wherein the integrity of the other components is verified when the measurements of the other components match trusted reference values associated with the other components.
17. An apparatus capable of authentication with an external communication entity, the apparatus comprising:
a trusted component configured to generate and collect integrity measurements of other components of the device; and
a root of trust comprising a set of immutable hardware resources including at least one processor, wherein the root of trust is configured to verify an integrity of the trusted component, wherein the trusted component verified by the root of trust is further configured to provide an integrity measurement to the external communication entity for validating the device with the external communication entity.
18. The apparatus of claim 17, wherein the root of trust comprises a certificate associated with the root of trust, wherein the certificate reflects a verification of an integrity of the root of trust by a third party.
19. The apparatus of claim 17, wherein the external communication entity is configured to restrict at least some access to the apparatus if the integrity measurement does not match a measurement expected by the external communication entity.
20. A method for validating one or more components in a device capable of authenticating with an external communication entity, the method comprising:
verifying, by a root of trust of the device, the integrity of a trusted component of the device;
generating, by the trusted component, an integrity measurement of other components included in the device; and
providing the integrity measurement to the external communication entity for validating the device with the external communication entity.
21. The method of claim 20, wherein the external communication entity is configured to restrict at least some access to the device if the integrity measurement does not match a measurement expected by the external communication entity.
22. A method of establishing an authenticatable device, the method comprising:
receiving a trusted component;
receiving a certified code and a trusted reference value; and
creating a device, wherein the device comprises the trusted component and the certified code and the trusted reference value.
23. The method of claim 22, wherein the trusted component comprises a digital certificate associated with the trusted component.
24. The method of claim 22, wherein the certified code comprises a digital certificate associated with the code.
25. The method of claim 22, wherein the trusted component and the certified code are certified by a third party.
26. The method of claim 22, wherein the trusted component comprises a root of trust having a set of immutable hardware resources.
HK12108823.2A 2009-04-15 2010-04-15 Validation and/or authentication of a device for communication with a network HK1168224A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US61/169,630 2009-04-15
US61/253,687 2009-10-21

Publications (1)

Publication Number Publication Date
HK1168224A true HK1168224A (en) 2012-12-21
