
CN120035826A - Data processing device and method for runtime certification - Google Patents

Data processing device and method for runtime certification

Info

Publication number
CN120035826A
CN120035826A
Authority
CN
China
Prior art keywords
task
data processing
memory
attestation
tasks
Prior art date
Legal status
Pending
Application number
CN202280101057.3A
Other languages
Chinese (zh)
Inventor
托马斯·奥利维尔·莫里斯·谢瓦利埃
伊万·西尔维乌·弗勒斯恰努
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN120035826A publication Critical patent/CN120035826A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/57Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/52Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity ; Preventing unwanted data erasure; Buffer overflow
    • G06F21/53Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity ; Preventing unwanted data erasure; Buffer overflow by executing in a restricted environment, e.g. sandbox or secure virtual machine

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

A data processing apparatus (110) for performing a plurality of tasks is disclosed. The apparatus (110) comprises a processing unit (111) for running an RTOS according to a security capability architecture. The RTOS implements a kernel, an attestation core, and a plurality of tasks, each task using one or more of a plurality of capabilities defined by the security capability architecture. Furthermore, the apparatus (110) comprises a memory (115) comprising a plurality of isolated memory compartments, including the memory compartment of the attestation core and a respective isolated memory compartment of each task. The memory compartment of each task is defined by the capabilities of the task and the data those capabilities operate on. The processing unit (111) is configured to monitor the integrity of the memory compartments of the tasks to generate task integrity metric data (125) indicative of the integrity of the memory compartments of the tasks.

Description

Data processing device and method for runtime certification
Technical Field
The present invention relates to security technology. More particularly, the present invention relates to a data processing apparatus and method for runtime certification. Furthermore, the invention relates to a remote attestation system comprising such a data processing device.
Background
One key challenge in IoT security is that resource-constrained microcontrollers are vulnerable to malicious tampering with their firmware. This may be caused by an attacker performing a reprogramming attack through physical access to the device, or by a remote attack exploiting a vulnerability in a software implementation. One popular method of mitigating such attacks is called attestation, which can verify that a device is running known firmware, i.e., that the device is in a trusted state. Attestation is generally defined as a process between two parties, a prover and a verifier. The prover may be a resource-constrained IoT device or the like, while the verifier is typically a more computationally powerful device, e.g., a server backend. From the prover's perspective, attestation comprises two phases: (1) generating evidence about the prover's trustworthiness, and (2) passing the evidence to the verifier through a secure protocol. These two phases, and in particular the first phase (generating the attestation evidence), are preferably performed at runtime in order to detect attacks that may affect the integrity of the software during its execution.
Some runtime attestation methods are referred to as "dynamic integrity metrics" and "control flow attestation". In the "dynamic integrity metric" approach, a hash value (fingerprint) is periodically extracted from the predictable/static memory (i.e., the code segment) of a process running on a "prover" device and compared, on a "remote verifier" device, to a reference fingerprint generated at build time. The "control flow attestation" approach attempts to monitor the execution flow of a process and to record the sequence of edges (branches) that the process passes through, in order to check whether the execution flow meets expectations based on a known-good control flow graph generated at build time. This approach mainly covers return-oriented and jump-oriented programming attacks, which are the dominant type of memory-based attacks today.
Disclosure of Invention
It is an object of the invention to provide an improved data processing apparatus and method for runtime certification.
The above and other objects are achieved by the subject matter of the independent claims. Other implementations are apparent in the dependent claims, the description and the drawings.
According to a first aspect, a data processing apparatus for performing a plurality of tasks is provided. The data processing apparatus may be an IoT device, a smartphone, a network device, an electronic control unit, and the like. The plurality of tasks may include a sensor task for controlling one or more sensors in the data processing apparatus, an actuator task for driving one or more actuators in the data processing apparatus, a network task for providing wireless communication between the data processing apparatus and other network devices, and so on.
The data processing apparatus comprises a processing unit for running a real-time operating system (RTOS) in accordance with a security capability architecture. The processing unit may include one or more central processing units (CPUs) and/or one or more microcontrollers, etc. The RTOS implements a kernel, an attestation core, and the plurality of tasks, each of the plurality of tasks, when executed, using one or more of a plurality of capabilities defined by the security capability architecture. As used herein, a "security capability architecture" may include an instruction set and data structures in the form of capabilities, as well as hardware and/or software supporting such an architecture. The security capability architecture of the data processing apparatus may include an instruction set such as Capability Hardware Enhanced RISC Instructions (CHERI).
Furthermore, the data processing apparatus includes a memory having a plurality of isolated memory bays (sometimes also referred to as "protection domains") including the isolated memory bay of the attestation core and a corresponding isolated memory bay of each task. The isolated memory compartment of each task is defined by data that is operated by the one or more of the plurality of capabilities of the respective task and the one or more of the plurality of capabilities of the respective task. Each isolated memory compartment of each task may be considered to fully encapsulate the corresponding task.
The processing unit in the data processing apparatus is further configured to monitor the integrity of the isolated memory compartments of the plurality of tasks to generate task integrity metric data indicative of the integrity of these memory compartments, thereby providing a data processing apparatus whose integrity can be attested at runtime. An "attestation core" as used herein is a dedicated security task running in its own memory compartment that is completely isolated from the other tasks and from the RTOS kernel. More specifically, the attestation core is a dedicated task, isolated from the rest of the system, which can be invoked by a trampoline module whenever capabilities are exchanged between other tasks, in order to record the exchanged capabilities and report them securely to a remote verifier.
In another possible implementation, the data processing apparatus further comprises a communication interface for sending the task integrity metric data (either as originally generated by the processing unit or in further-processed form) to an attestation server. This enables the attestation server to attest, at runtime, the integrity of the isolated memory compartments of the plurality of tasks of the data processing apparatus based on the task integrity metric data.
In another possible implementation, the attestation core is configured to record the task integrity metric data of the memory compartment of each task and to send the task integrity metric data of the memory compartment of each task to the attestation server periodically and/or in an event-driven manner through the communication interface. Thus, the task integrity metric data can be reported efficiently.
In another possible implementation, the attestation core is configured to cryptographically protect the sent task integrity metric data according to one or more cryptographic keys, and the communication interface is configured to send the cryptographically protected task integrity metric data to the attestation server. This enables the task integrity metric data to be protected from any attack in an encrypted manner.
In another possible implementation, the communication interface is configured to receive a random number from the attestation server, the attestation core is further configured to cryptographically protect the task integrity metric data and the random number with the one or more cryptographic keys, and the communication interface is configured to send the cryptographically protected task integrity metric data and random number to the attestation server. This enables detection of replay attacks.
In another possible implementation, the one or more encryption keys are stored in the isolated memory bay of the attestation core. By storing the encryption keys in a highly secure attestation core, the encryption keys may be well protected from any attacks that attempt to extract these keys from the data processing apparatus.
In another possible implementation, the processing unit is configured to define the plurality of isolated memory compartments. For example, the processing unit may securely manage the address ranges of the plurality of isolated memory bays. Thus, the memory in the data processing device may be a low cost memory without a dedicated memory management unit.
In another possible implementation, the processing unit is configured to monitor the integrity of the isolated memory compartments of the plurality of tasks by means of a trampoline module implemented by trampoline code. The trampoline module is invoked at each transition between the isolated memory compartments of the plurality of tasks and is used to report to the attestation core one or more capabilities exchanged between the isolated memory compartments of the plurality of tasks. This allows the integrity of the isolated memory compartments of the plurality of tasks to be monitored efficiently.
In another possible implementation, the processing unit is further configured to initially scan the memory to determine the plurality of isolated memory bays of the memory. This enables efficient determination of the memory compartment, i.e. protection domain, for a plurality of tasks.
In another possible implementation, the processing unit is further configured to store the task integrity metric data in the isolated memory compartment of the attestation core. By storing the task integrity metric data in a high security memory compartment of the attestation core, the task integrity metric data may be protected from any attacks, e.g., from compromised tasks in devices under the control of an attacker attempting to modify the task integrity metric data.
In another possible implementation, the task integrity metric data includes the one or more capabilities of each respective task of the plurality of tasks. This enables efficient generation of task integrity metric data from the security capability architecture of the data processing apparatus.
In another possible implementation, the one or more of the plurality of capabilities of each task includes a pointer and pointer metadata (also referred to as a "wide pointer"). This enables efficient generation of task integrity metric data from the security capability architecture of the data processing apparatus.
In another possible implementation, the RTOS of the data processing apparatus is a single address space RTOS. Thus, the data processing apparatus can implement an RTOS without requiring complex and expensive processing resources.
In another possible implementation, the security capability architecture is based on hardware and/or software. As described above, the security capability architecture may include an instruction set and data structures in the form of capabilities, as well as hardware and/or software supporting such an architecture. The security capability architecture of the data processing apparatus may include an instruction set such as Capability Hardware Enhanced RISC Instructions (CHERI).
According to a second aspect, a remote attestation system is provided. The remote attestation system according to the second aspect comprises at least one data processing device according to the first aspect and an attestation server for receiving the task integrity metric data from the at least one data processing device and attesting the integrity of the at least one data processing device according to the task integrity metric data.
In another possible implementation, the one or more reference capabilities of each task of the at least one data processing device are defined by a task policy, and the attestation server is configured to attest to the integrity of the at least one data processing device based on the task integrity metric data and the task policy. This enables efficient attestation of the integrity of at least one data processing apparatus based on the task integrity metric data and the task policy.
According to a third aspect, a method for proving the integrity of a data processing apparatus is provided. The data processing apparatus is configured to execute a plurality of tasks and includes a processing unit and a memory, the processing unit being configured to run a real-time operating system (RTOS) according to a security capability architecture, wherein the RTOS implements a kernel, an attestation core, and a plurality of tasks, each of which, when executed, uses one or more of a plurality of capabilities defined by the security capability architecture. The method comprises the following steps:
providing a plurality of isolated memory compartments of the memory, wherein the plurality of isolated memory compartments includes an isolated memory compartment of the attestation core and a respective isolated memory compartment of each task, the isolated memory compartment of each task being defined by the one or more of the plurality of capabilities of the task and the data operated on by those capabilities;
monitoring the integrity of the isolated memory compartments of the plurality of tasks to generate task integrity metric data indicative of the integrity of the memory compartments of the plurality of tasks.
The method provided by the third aspect of the present invention may be performed by the data processing apparatus provided by the first aspect of the present invention. Thus, the further features of the method provided by the third aspect of the invention are directly achieved by the functions of the data processing apparatus provided by the first aspect of the invention and the different implementations described above and below.
According to a fourth aspect, a computer program product comprising a computer readable storage medium is provided. The computer readable storage medium is for storing program code which, when executed by a computer or a processor, causes the computer or the processor to perform the method provided by the third aspect.
The details of one or more embodiments are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.
Drawings
Embodiments of the invention are described in detail below with reference to the attached drawing figures, wherein:
FIG. 1 shows a schematic diagram of an attestation system including a data processing apparatus and an attestation server provided by one example of an embodiment of the present invention;
FIG. 2 depicts a schematic diagram of isolated memory compartments of a memory in a data processing apparatus according to an example of an embodiment of the invention;
FIG. 3 shows a schematic diagram of a plurality of tasks implemented by a data processing device provided by an example in an embodiment of the invention;
FIG. 4 illustrates a schematic diagram of a trampoline module and an attestation core implemented by a data processing device for monitoring the integrity of the memory compartments of a plurality of tasks, provided by one example of an embodiment of the invention;
FIG. 5 illustrates a schematic diagram of a security architecture implemented by a data processing apparatus for cryptographically protecting task integrity metric data provided by one example of an embodiment of the present invention;
FIG. 6 illustrates a flow chart of a method for proving the integrity of a data processing apparatus provided by one example of an embodiment of the present invention.
In the following, like reference numerals refer to like or at least functionally equivalent features.
Detailed Description
In the following description, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific aspects in which embodiments of the invention may be practiced. It is to be understood that embodiments of the invention may be used in other respects and including structural or logical changes not depicted in the drawings. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims.
For example, it should be understood that the disclosure related to describing a method may be equally applicable to a corresponding device or system for performing the method, and vice versa. For example, if one or more specific method steps are described, the corresponding apparatus may comprise one or more units (e.g., functional units) to perform the described one or more method steps (e.g., one unit performing one or more steps, or a plurality of units performing one or more of the steps, respectively), even if such one or more units are not explicitly described or illustrated in the figures. On the other hand, for example, if a specific apparatus is described based on one or more units (e.g., functional units), a corresponding method may include one step to perform the function of one or more units (e.g., one step to perform the function of one or more units, or a plurality of steps to perform the function of one or more units, respectively), even if such one or more steps are not explicitly described or shown in the drawings. Furthermore, it is to be understood that features of the various exemplary embodiments and/or aspects described herein may be combined with each other, unless explicitly stated otherwise.
Fig. 1 shows a schematic diagram of an embodiment of a certification system 100, the certification system 100 comprising an embodiment of a data processing apparatus 110 (referred to as device 110 in fig. 1) and a certification server 120 (referred to as verifier 120 in fig. 1). The attestation system 100 may also include a configuration server 130 of a vendor or manufacturer of the data processing apparatus 110, the configuration server 130 being configured to configure software 135, e.g., firmware or software images, for the data processing apparatus 110. The data processing apparatus 110 may be an IoT device, a smartphone, a network device, an electronic control unit, and the like.
As shown in fig. 1 and described in detail below, the data processing apparatus 110 includes a processing unit 111, and the processing unit 111 may include one or more central processing units (CPUs) and/or one or more microcontrollers, etc. The processing unit 111 may be implemented in hardware and/or software and may include digital circuitry, or both analog and digital circuitry. The digital circuitry may include components such as an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a digital signal processor (DSP), or a general purpose processor. In addition, the data processing device 110 includes an electronic memory 115, such as a flash memory 115, for storing data. The memory 115 may store executable program code that, when executed by the processing unit 111, causes the data processing apparatus 110 to perform the functions and methods described herein. The data processing apparatus 110 may further comprise a communication interface 113, in particular a wireless communication interface and/or a wired communication interface enabling the data processing apparatus 110 to communicate with the attestation server 120, the configuration server 130 and/or other network devices.
With further reference to fig. 2 and 3, the processing unit 111 is configured to execute a plurality of software tasks, which may include, for example, a sensor task 305a for controlling one or more sensors in the data processing apparatus 110, an actuator task 305b for driving one or more actuators in the data processing apparatus 110, and a network task 305c for providing wireless communication (together with the communication interface 113) between the data processing apparatus 110 and other network devices. As shown in FIG. 3, the processing unit 111 is configured to run a real-time operating system (RTOS) 300 according to a security capability architecture so as to implement a software environment comprising a kernel 301, an attestation core 303, and a plurality of tasks 305 a to 305 c. In one embodiment, the RTOS 300 of the data processing apparatus 110 is a single address space RTOS 300.
Each of the plurality of tasks 305 a to 305 c, when executed, uses one or more of the plurality of capabilities defined by the security capability architecture. As used herein, a "security capability architecture" may include an instruction set and data structures in the form of capabilities, as well as hardware and/or software supporting such an architecture. The security capability architecture of the data processing apparatus may include an instruction set such as Capability Hardware Enhanced RISC Instructions (CHERI). For more details on CHERI, see "Capability Hardware Enhanced RISC Instructions: CHERI Instruction-Set Architecture (Version 8)", Technical Report No. 951, UCAM-CL-TR-951, ISSN 1476-2986, University of Cambridge, the entire contents of which are incorporated herein by reference.
As shown in fig. 2 and 3, owing to the security capability architecture, and in particular the CHERI-based architecture, of the data processing apparatus 110, the memory 115 in the data processing apparatus 110 includes a plurality of isolated memory compartments (sometimes also referred to as "protection domains"), including the isolated memory compartment of the attestation core 303 and the corresponding isolated memory compartments 115 a to 115 c of the tasks 305 a to 305 c. The isolated memory compartment of each task is defined by the one or more of the plurality of capabilities of the corresponding task and the data operated on by those capabilities. In one embodiment, the one or more of the plurality of capabilities of each task 305 a to 305 c may include pointers and pointer metadata (also referred to as "wide pointers").
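For illustration only, the following C sketch models such a capability as a "wide pointer": an address plus metadata (bounds, permissions and a validity tag). The field and constant names are assumptions made for this description and do not reflect the compressed hardware encoding used by an actual CHERI implementation.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Illustrative model of a capability ("wide pointer"): an address plus
 * metadata (bounds, permissions, validity tag). Field names are assumed
 * for this sketch and do not reflect the hardware encoding. */
typedef enum {
    CAP_PERM_LOAD      = 1u << 0,  /* data may be loaded through the capability   */
    CAP_PERM_STORE     = 1u << 1,  /* data may be stored through the capability   */
    CAP_PERM_EXECUTE   = 1u << 2,  /* the capability may be used as a code pointer */
    CAP_PERM_LOAD_CAP  = 1u << 3,  /* further capabilities may be loaded           */
    CAP_PERM_STORE_CAP = 1u << 4   /* further capabilities may be stored           */
} cap_perm_t;

typedef struct {
    uintptr_t base;    /* start of the addressable region              */
    size_t    length;  /* size of the region in bytes                  */
    uintptr_t cursor;  /* current address (the "pointer" part)         */
    uint32_t  perms;   /* bitmask of cap_perm_t permissions            */
    bool      tag;     /* true while the capability is valid/unforged  */
} capability_t;

/* A memory access is allowed only if the capability is valid, the access
 * lies within its bounds, and the required permission bits are present. */
static bool cap_allows(const capability_t *c, uintptr_t addr, size_t n,
                       uint32_t required_perms) {
    if (!c->tag) return false;
    if (addr < c->base || addr + n > c->base + c->length) return false;
    return (c->perms & required_perms) == required_perms;
}
```

A task's protection domain, i.e., its isolated memory compartment 115 a to 115 c, can then be described as the set of such capabilities reachable from the task's register file.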
As schematically shown in fig. 2, the isolated memory compartment 115 a to 115 c of each task 305 a to 305 c may be considered to fully encapsulate the corresponding task 305 a to 305 c. In one embodiment, the processing unit 111 in the data processing apparatus 110 is configured to define the isolated memory compartment of the attestation core 303 and the isolated memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c. For example, the processing unit 111 may be configured to securely manage the address ranges of the plurality of isolated memory compartments 115 a to 115 c.
As described in detail below, the processing unit 111 in the data processing apparatus 110 is further configured to monitor the integrity of the isolated memory bays 115 a-115 c of the plurality of tasks 305 a-305 c to generate task integrity metric data 125 indicative of the integrity of the isolated memory bays 115 a-115 c of the plurality of tasks 305 a-305 c. In one embodiment, the processing unit 111 in the data processing apparatus 110 may also be configured to store the task integrity metric data 125 in an isolated memory compartment of the attestation core 303. As described in detail below, in one embodiment, the task integrity metric data 125 may include one or more capabilities, e.g., wide pointers, of each respective task 305 a-305 c of the plurality of tasks 305 a-305 c.
As shown in the embodiment of fig. 1, the communication interface 113 in the data processing apparatus 110 may be used to send the task integrity metric data 125 (either as initially collected and/or generated by the processing unit 111 or in further-processed form) to the attestation server 120. This allows the attestation server 120 to attest, at runtime, the integrity of the isolated memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c of the data processing apparatus 110 based on the task integrity metric data 125 and, in one embodiment, on reference values defined by a task policy 145 provided by the configuration server 130 or the like.
Further embodiments of the data processing apparatus 110 and the attestation system 100 are described in detail below with further reference to fig. 4 and 5.
As described above, the embodiments disclosed herein are capable of "runtime attestation" of the data processing apparatus 110, i.e., they provide an attestation system 100 capable of attesting the integrity of the plurality of tasks 305 a to 305 c and of the operating system kernel 301 of the data processing apparatus 110 while the data processing apparatus 110 is executing. For example, if the tasks 305 a to 305 c run as intended after startup, the attestation system 100 can provide trusted and verifiable evidence of their integrity, and an external verifier instance can verify this evidence and decide whether to trust the data processing device 110 and/or the tasks 305 a to 305 c running on it. If the tasks 305 a to 305 c of the data processing apparatus 110 are not running as intended, the data processing apparatus 110 (possibly under the control of a malicious attacker) will not be able to forge such evidence. Furthermore, according to one embodiment, the data processing device 110 may even provide context information, i.e., the location at which unknown (potentially malicious) task behavior deviates from the intended behavior.
As described above, according to embodiments disclosed herein, the data processing apparatus 110 may be a microcontroller (MCU)-based low-end device 110 that supports a security capability architecture (e.g., CHERI) and runs a microcontroller-class real-time OS 300, e.g., FreeRTOS or LiteOS. According to other embodiments, the data processing apparatus 110 may be a CPU-based high-end device 110 that supports a similar security capability architecture and runs a more complex operating system 300, such as the Linux operating system 300.
The embodiments disclosed herein take a new approach to evaluating the runtime integrity of a program (i.e., of the plurality of tasks 305 a to 305 c). Conventional methods look inside the task/program memory and attempt to interpret it, which is difficult, complex and computationally intensive. In contrast, embodiments of the data processing apparatus 110 disclosed herein monitor a given task/program 305 a to 305 c from the perspective of the other processes running within the data processing apparatus 110. This distinct approach is shown in fig. 2. According to embodiments disclosed herein, if no other task 305 a to 305 c or process on the data processing apparatus 110 attempts to access portions of the memory 115 other than those it is allowed to access (i.e., its respective isolated memory compartment 115 a to 115 c), it may be inferred that a given task 305 a to 305 c is operating as intended (i.e., has not been tampered with by a malicious attacker). In other words, according to embodiments disclosed herein, the data processing apparatus 110 is configured to detect at runtime whether the isolation between the different memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c has been compromised. Thus, by providing evidence, in the form of the task integrity metric data 125, of the isolation between the plurality of tasks 305 a to 305 c and of whether that isolation is maintained, efficient runtime attestation can be achieved.
As described above, the processing unit 111 in the data processing apparatus 110 is configured to implement a security capability architecture, for example a CHERI-based architecture, which provides spatial memory safety and memory isolation through "capabilities". The security capability architecture (e.g., a CHERI-based architecture) extends a traditional instruction set architecture (ISA) to achieve fine-grained memory protection and highly scalable software compartmentalization. For example, in the case of a single address space, the CHERI-based architecture implemented by the data processing apparatus 110 provided by one embodiment is able to provide memory isolation without the need for a memory management unit (MMU). In one embodiment, the RTOS 300 of the data processing apparatus 110 may be CHERI FreeRTOS, a variant of FreeRTOS that provides isolation between the different memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c in a single address space system. CHERI FreeRTOS 300 uses the features of the CHERI-based architecture to limit the set of memory regions that each of the memory compartments 115 a to 115 c can access. In one embodiment, each memory compartment 115 a to 115 c may be as small as a single function or as large as code spanning several source files.
As described above, each task 305 a to 305 c is confined to its "protection domain", i.e., its isolated memory compartment 115 a to 115 c. In a security capability architecture, for example a CHERI-based security capability architecture, the protection domain (i.e., the memory compartment 115 a to 115 c) is the set of capabilities that the task 305 a to 305 c can access. For example, if a task 305 a to 305 c holds, in one of its registers, a capability pointing to its stack, the data processing apparatus 110 provided by one embodiment may examine the pointed-to memory region to determine any further capabilities pointing to other regions of the memory 115. In one embodiment, the data processing apparatus 110 may recursively follow these further pointed-to memory regions until the complete set of all capabilities that the respective task 305 a to 305 c can access has been found. This set is the protection domain, i.e., the isolated memory compartment 115 a to 115 c of the corresponding task 305 a to 305 c. All protection domains (i.e., memory compartments 115 a to 115 c) are distinct, i.e., isolated from each other, meaning that a task 305 a to 305 c cannot access the memory compartments 115 a to 115 c of other tasks 305 a to 305 c.
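A minimal sketch of this recursive discovery, using the illustrative capability model above, is given below. The helper cap_scan_region() is a hypothetical platform hook that enumerates the valid capabilities stored within a memory region; on CHERI hardware it would rely on the capability tags maintained by the hardware.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

typedef struct {
    uintptr_t base;
    size_t    length;
    uint32_t  perms;
    bool      tag;
} capability_t;

/* Hypothetical platform hook: writes up to 'max' valid capabilities found
 * inside [base, base+length) into 'out' and returns how many were found. */
extern size_t cap_scan_region(uintptr_t base, size_t length,
                              capability_t *out, size_t max);

/* Compute a task's protection domain: start from the capabilities held in
 * its register file and transitively collect every capability reachable
 * from the memory regions those capabilities point to. */
static size_t protection_domain(const capability_t *regfile, size_t nregs,
                                capability_t *domain, size_t max) {
    size_t count = 0, next = 0;

    for (size_t i = 0; i < nregs && count < max; i++)
        if (regfile[i].tag)
            domain[count++] = regfile[i];

    while (next < count) {                       /* worklist traversal */
        capability_t c = domain[next++];
        capability_t found[32];
        size_t n = cap_scan_region(c.base, c.length, found,
                                   sizeof(found) / sizeof(found[0]));
        for (size_t i = 0; i < n && count < max; i++) {
            bool seen = false;                   /* de-duplicate regions */
            for (size_t j = 0; j < count; j++)
                if (domain[j].base == found[i].base &&
                    domain[j].length == found[i].length) { seen = true; break; }
            if (!seen)
                domain[count++] = found[i];
        }
    }
    return count;   /* number of capabilities in the isolated compartment */
}
```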
In one embodiment, a task 305 a to 305 c may control an exclusive region of the memory 115. Furthermore, a task 305 a to 305 c may hold pointers allowing it to jump to specific functions of other tasks 305 a to 305 c. Two tasks 305 a to 305 c may share some portion of the memory 115, provided that the capability to that memory region prohibits loading or storing further capabilities. In one embodiment, the kernel 301 may have full access to the memory 115 (except for the memory compartment associated with the attestation core 303).
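The following sketch illustrates, under the same illustrative model, how a capability for a shared buffer can be derived with reduced permissions before being handed to another task, so that no further capabilities can be loaded or stored through the shared region. The function names are assumptions; on real CHERI hardware the narrowing would be performed by the architecture's permission- and bounds-restricting instructions, which by construction can only remove rights.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

typedef struct {
    uintptr_t base;
    size_t    length;
    uint32_t  perms;   /* e.g. LOAD, STORE, LOAD_CAP, STORE_CAP bits */
    bool      tag;
} capability_t;

enum { PERM_LOAD = 1u, PERM_STORE = 2u, PERM_LOAD_CAP = 4u, PERM_STORE_CAP = 8u };

/* Derive a capability for a sub-range of 'parent' with a subset of its
 * permissions. Monotonicity: the result never exceeds the parent's bounds
 * or rights; an attempt to do so yields an invalid (untagged) capability. */
static capability_t cap_restrict(capability_t parent, uintptr_t base,
                                 size_t length, uint32_t perms) {
    capability_t c = { base, length, perms & parent.perms, false };
    if (parent.tag &&
        base >= parent.base &&
        base + length <= parent.base + parent.length)
        c.tag = true;
    return c;
}

/* Example: share a buffer read/write, but strip the capability load/store
 * rights so no capabilities can leak through the shared region. */
static capability_t share_buffer(capability_t owner_cap,
                                 uintptr_t buf, size_t buf_len) {
    return cap_restrict(owner_cap, buf, buf_len, PERM_LOAD | PERM_STORE);
}
```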
Typically, the protection domain (i.e., the isolated memory compartment 115 a-115 c of each task 305 a-305 c) is known a priori. For example, the network task 305c may share buffers with other tasks 305a, 305b, but not share its internal state. In this case, there may be a policy on how to set the protection domain (i.e., the memory bays 115a to 115 c).
In one embodiment, all memory bays 115 a-115 c may be initially measured and determined by scanning the entire memory 115 and inferring the different bays from the register file for each task 305 a-305 c.
In a security capability architecture (e.g., a CHERI-based architecture), it is not possible for a task 305 a to 305 c to extend its own isolated memory compartment 115 a to 115 c, i.e., its protection domain, without transferring control to another isolated memory compartment. This inherent property of the security capability architecture implemented by the data processing apparatus 110 removes the need for continuous monitoring of the isolated memory compartments; monitoring is only required when a task 305 a to 305 c passes control to another isolated memory compartment (i.e., in an event-driven manner).
In one embodiment, if the tasks 305 a to 305 c communicate only with the kernel 301, the kernel 301 may be used to record the capabilities passed to other tasks 305 a to 305 c. As shown in FIG. 4, for example, if task 305a calls a malloc function, the kernel 301 of the data processing apparatus 110 allocates a certain portion of the memory 115 and thus constructs a corresponding capability. In addition, the kernel 301 of the data processing apparatus 110 may update the isolated memory compartment 115a of task 305a to take into account the new capability that task 305a has acquired. This capability may then be passed to task 305a. It will be appreciated that this approach may result in an overestimate of the isolated memory compartment 115a of task 305a. However, detecting that a task 305 a to 305 c has deleted a certain capability is more complex, as it may require scanning the entire memory 115. Thus, depending on the specific use case, the data processing apparatus 110 provided in one embodiment may implement either of the two approaches.
More specifically, a task 305 a to 305 c may hold capabilities pointing to its code and its stack, as indicated by the circle with the number 1 in FIG. 4. As shown by the circle with the number 2 in fig. 4, the task 305 a to 305 c can call the malloc function and switch to the kernel compartment via the trampoline module 401. As shown by the circle with the number 3 in fig. 4, the malloc function generates a new capability bounded by the size of the requested area and returns that capability. As shown by the circle with the number 4 in fig. 4, the trampoline module 401 provides the capability to the attestation core 303, and the attestation core 303 updates the protection domain metrics, i.e., the task integrity metric data 125. The trampoline module 401 can then return. The blocks referred to in the figures as "pcb", "csp" and "cao" are exemplary CPU registers implemented by the CHERI-based architecture according to an embodiment of the data processing apparatus 110.
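A simplified sketch of this compartment switch is shown below. The names trampoline_switch(), attestation_core_record() and compartment_call() are assumptions introduced for illustration; in an actual CHERI-based embodiment the transition would use the sealed-capability CInvoke mechanism rather than a plain function call.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

typedef struct { uintptr_t base; size_t length; uint32_t perms; bool tag; } capability_t;
typedef int compartment_id_t;

/* Assumed interface of the attestation core: record that 'cap' has become
 * reachable from compartment 'target' (updates the protection domain table
 * held in the attestation core's own compartment). */
extern void attestation_core_record(compartment_id_t target,
                                    const capability_t *cap);

/* Assumed low-level domain transition; stands in for CInvoke. */
extern capability_t compartment_call(compartment_id_t target, uintptr_t entry,
                                     const capability_t *args, size_t nargs);

/* Trampoline: the only path between compartments. Capabilities passed in
 * either direction are reported to the attestation core. */
capability_t trampoline_switch(compartment_id_t caller, compartment_id_t target,
                               uintptr_t entry,
                               const capability_t *args, size_t nargs) {
    for (size_t i = 0; i < nargs; i++)
        if (args[i].tag)
            attestation_core_record(target, &args[i]);   /* caller -> target */

    capability_t ret = compartment_call(target, entry, args, nargs);

    if (ret.tag)
        attestation_core_record(caller, &ret);           /* step 4 in FIG. 4,  */
    return ret;                                          /* e.g. malloc result */
}
```

In the malloc example of FIG. 4, the capability returned by the kernel's allocator would thus be recorded for the calling compartment before task 305a is able to use it, so the protection domain metrics stay up to date.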
As described above, the attestation core 303 of the data processing apparatus 110 is associated with a secure area of the memory 115, i.e., its own isolated memory compartment (similar to a flexible engine). In one embodiment, the attestation core 303 has full control of the memory 115 in the data processing apparatus 110, but the kernel 301 and the plurality of tasks 305 a to 305 c cannot tamper with the isolated memory compartment of the attestation core 303. In one embodiment, the memory compartment of the attestation core 303 is used to securely store the task integrity metric data 125 (even if the kernel 301 is not trusted). In one embodiment, to transfer into the memory compartment of the attestation core 303, the respective task 305 a to 305 c or the kernel 301 may use the "CInvoke" mechanism provided by the CHERI-based architecture, which performs secure transfers between different memory compartments.
As described above, in one embodiment, the task policy 145 may be defined by the vendor by listing all of the intended memory compartments 115 a to 115 c (i.e., the protection domains of the data processing device 110). In one embodiment, the task policy 145 may define which memory regions each of the memory compartments 115 a to 115 c should be able to access. The task policy 145 may be created and provided by the vendor through the configuration server 130 or the like. As described above, the attestation server 120 may compare the task policy 145 with the current task integrity metric data 125 (provided by the data processing device 110) to detect whether any integrity violation exists for the isolated memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c.
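On the attestation server 120, this comparison can be reduced to a subset check: every memory region and permission reported for a compartment must be covered by a region/permission pair that the policy allows for that compartment. The following sketch uses assumed data structures for both the reported metrics and the policy.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

typedef struct {
    uintptr_t base;
    size_t    length;
    uint32_t  perms;      /* permission bitmask reported/allowed */
} region_t;

typedef struct {
    int             compartment;  /* task / compartment identifier */
    const region_t *regions;
    size_t          nregions;
} domain_entry_t;

/* True if 'r' lies inside 'allowed' and requests no extra permissions. */
static bool region_covered(const region_t *r, const region_t *allowed) {
    return r->base >= allowed->base &&
           r->base + r->length <= allowed->base + allowed->length &&
           (r->perms & ~allowed->perms) == 0;
}

/* Compare reported task integrity metric data (125) against the task
 * policy (145): each reported region must be covered by some allowed
 * region of the same compartment, otherwise isolation was violated. */
bool attest_against_policy(const domain_entry_t *reported, size_t nreported,
                           const domain_entry_t *policy, size_t npolicy) {
    for (size_t i = 0; i < nreported; i++) {
        const domain_entry_t *pol = NULL;
        for (size_t j = 0; j < npolicy; j++)
            if (policy[j].compartment == reported[i].compartment) { pol = &policy[j]; break; }
        if (!pol) return false;                       /* unknown compartment */

        for (size_t r = 0; r < reported[i].nregions; r++) {
            bool ok = false;
            for (size_t a = 0; a < pol->nregions && !ok; a++)
                ok = region_covered(&reported[i].regions[r], &pol->regions[a]);
            if (!ok) return false;                    /* integrity violation */
        }
    }
    return true;                                      /* device trustworthy */
}
```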
In a CHERI-based architecture, which may be implemented by the data processing apparatus 110 provided by one embodiment, the "reachability monotonicity" property means that, during execution of any task 305 a to 305 c, the memory compartment 115 a to 115 c of the respective task 305 a to 305 c cannot grow unless execution is transferred to the memory compartment of another task 305 a to 305 c. It will be appreciated that this is an intrinsic property of the security capability architecture (in particular of a CHERI-based security capability architecture) and is therefore always in effect. Thus, as described above, to monitor each isolated memory compartment 115 a to 115 c, embodiments disclosed herein may begin from a reference point and then monitor, i.e., measure, at runtime every capability entering the respective memory compartment 115 a to 115 c. It will be appreciated that although these measurements are taken at discrete points in time, they provide a continuous view of the respective compartment, since, due to the inherent characteristics of the CHERI-based security capability architecture described above, the respective isolated compartment cannot grow by itself.
As described above, one embodiment provides that the data processing device 110 monitoring the respective memory bay 115 a-115 c may overestimate the actual respective memory bay 115 a-115 c. However, as noted above, this is generally not a problem, as embedded tasks 305 a-305 c are likely to rarely use dynamic allocation, resulting in memory idleness.
It will be appreciated that, in order to monitor all of the capabilities of a respective memory compartment 115 a to 115 c, the data processing device 110 provided by one embodiment should intercept every capability transfer into the respective memory compartment 115 a to 115 c. In one embodiment, this is achieved by the trampoline module 401, which may be implemented by the data processing device 110 provided by one embodiment and is described in detail below in the context of fig. 4. In the embodiment shown in fig. 4, the processing unit 111 in the data processing apparatus 110 is configured to monitor the integrity of the respective isolated memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c by means of a trampoline module 401 implemented by trampoline code. As shown in fig. 4, the trampoline module 401 is invoked at each transition between the isolated memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c and is used to report to the attestation core 303 one or more capabilities exchanged between the isolated memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c.
In one embodiment, the communication interface 113 in the data processing apparatus 110 is configured to send the current task integrity metric data 125 to the attestation server 120 whenever one of the memory bays 115 a-115 c of the plurality of tasks 305 a-305 c decreases, such that the attestation server 120 checks the current task integrity metric data 125 according to a reference value defined by the task policy 145.
As described above, compared to a conventional RTOS, the RTOS 300 implemented by the data processing apparatus 110 provided by one embodiment adds the attestation core 303 and the trampoline module 401 in order to generate the task integrity metric data 125. As described above, the attestation core 303 is associated with its own memory compartment, i.e., a secure area in the memory 115 that is isolated from the other parts of the system, including the RTOS kernel 301. In one embodiment, the memory compartment of the attestation core 303 is used to store the task integrity metric data 125 and one or more encryption keys 503 (shown in fig. 5) used to digitally sign the task integrity metric data 125. Thus, the attestation core 303 may be viewed as providing the interface through which capabilities are added to the respective memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c.
Since the isolation between the different memory compartments 115 a to 115 c, or between the memory compartments 115 a to 115 c and the kernel 301, may be broken by a malicious attacker or the like, the purpose of the attestation core 303 of the data processing apparatus 110 provided by one embodiment is to ensure the integrity of the task integrity metric data 125 even in such a scenario (which would not be possible if the task integrity metric data 125 were stored in the kernel 301 or the like).
As described above, in one embodiment, the trampoline module 401 implemented by the processing unit 111 in the data processing device 110 is invoked at each transition between the isolated memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c, and is used to report to the attestation core 303 one or more capabilities exchanged between the isolated memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c. In other words, the trampoline module 401 is a special function for safely switching to another memory compartment, i.e., protection domain, at runtime. During such a switch, the trampoline module 401 is further configured to record each capability transferred to the new compartment by passing the corresponding capability to the attestation core 303 in order to generate the task integrity metric data 125.
In one embodiment, the processing unit 111 in the data processing device 110 is configured to transfer control from the compartment 115 a of a respective task 305 a to another compartment 115 b, 115 c by jumping to the trampoline module 401. As described above, in one embodiment, the capabilities passed to the new memory compartment in this process may be forwarded to the attestation core 303. The processing unit 111 in the data processing device 110 adds these capabilities to the portion of the task integrity metric data 125 associated with the new memory compartment. In the embodiment shown in fig. 4, the task integrity metric data 125 may be provided in the form of a protection domain table 125 describing which memory regions each memory compartment 115 a to 115 c is able to access and with which permissions. This data structure in the form of the protection domain table 125 may be updated with new capabilities and stored in the memory compartment of the attestation core 303. An efficient way of storing the protection domain metrics may be a list of memory regions and their permissions, defined by the capabilities recorded by the attestation core 303. When new capabilities are added to the list, the list is updated in place, so that an up-to-date representation is maintained throughout.
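A minimal sketch of such a protection domain table, with fixed-size per-compartment lists and in-place merging of newly reported capabilities, is shown below. The data layout and limits are assumptions chosen for a small RTOS; a real embodiment may organize the table differently.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

#define MAX_COMPARTMENTS 8
#define MAX_REGIONS      32

typedef struct {
    uintptr_t base;
    size_t    length;
    uint32_t  perms;
} region_t;

/* One row of the protection domain table: the memory regions (and
 * permissions) currently reachable by one compartment. */
typedef struct {
    region_t regions[MAX_REGIONS];
    size_t   count;
} domain_metric_t;

/* The table itself lives in the attestation core's isolated compartment. */
static domain_metric_t g_domain_table[MAX_COMPARTMENTS];

/* Record a capability reported by the trampoline: merge it into the
 * target compartment's entry in place, widening an existing region's
 * permissions if the same region is reported again. */
bool domain_table_record(int compartment, uintptr_t base, size_t length,
                         uint32_t perms) {
    if (compartment < 0 || compartment >= MAX_COMPARTMENTS)
        return false;
    domain_metric_t *d = &g_domain_table[compartment];

    for (size_t i = 0; i < d->count; i++) {
        if (d->regions[i].base == base && d->regions[i].length == length) {
            d->regions[i].perms |= perms;   /* update in place */
            return true;
        }
    }
    if (d->count == MAX_REGIONS)
        return false;                       /* table full: signal an error */
    d->regions[d->count++] = (region_t){ base, length, perms };
    return true;
}
```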
As described above, remote attestation is the process of communicating trusted evidence to a remote third party verifier to attest to the integrity of a device (e.g., data processing apparatus 110). In one embodiment, the communication interface 113 in the data processing apparatus 110 is configured to send task integrity metric data 125 indicative of the integrity of the memory bays 115 a-115 c of the plurality of tasks such that the attestation server 120 checks the authenticity of the task integrity metric data 125.
As shown in FIG. 5, in one embodiment, the code of the device may be measured at startup and an encryption key bound to the hardware root of trust may be constructed from the measurement. The vendor 130 may authenticate the device 110 by issuing a certificate. The key is stored in the attestation core 303 and cannot be obtained from anywhere else in the system. It will be appreciated that, in order to trust the information provided by the attestation core 303, the remote verifier 120 needs to attest the integrity of the attestation core 303 itself. To this end, embodiments disclosed herein may utilize a known Device Identifier Composition Engine (DICE) implementation 501 that derives the signing key of the attestation core 303 from a unique device secret and a hash value (measurement) of the code that each boot component sequentially loads on the way to the attestation core 303 (including the attestation core 303 itself). Considering that the attestation core 303 is isolated from any other task 305 a to 305 c and from the kernel 301, its integrity is assumed to remain unchanged once it has been loaded. Thus, the loaded attestation core can authentically sign the measured/recorded capabilities using the key derived in this way. Based on the signature, the remote verifier 120 may verify these capabilities once the remote verifier 120 has verified the combined device and attestation core identity from the signed device certificate provided by the manufacturer 130.
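The DICE-style derivation can be sketched as a chained key derivation in which each boot stage derives the secret of the next stage from its own secret and a measurement (hash) of the code it is about to load; the signing key of the attestation core 303 therefore depends on both the unique device secret and the exact attestation core code. The primitives sha256() and hkdf_sha256() below stand for crypto-library functions whose names and signatures are assumed for this sketch.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define HASH_LEN 32
#define KEY_LEN  32

/* Assumed crypto-library primitives (names/signatures are illustrative). */
extern void sha256(const uint8_t *data, size_t len, uint8_t out[HASH_LEN]);
extern void hkdf_sha256(const uint8_t *ikm, size_t ikm_len,
                        const uint8_t *info, size_t info_len,
                        uint8_t *okm, size_t okm_len);

/* One DICE-style step: the secret handed to the next boot component
 * depends on the current secret and a measurement of that component. */
static void dice_next_secret(const uint8_t cur_secret[KEY_LEN],
                             const uint8_t *next_code, size_t next_code_len,
                             uint8_t next_secret[KEY_LEN]) {
    uint8_t measurement[HASH_LEN];
    sha256(next_code, next_code_len, measurement);
    hkdf_sha256(cur_secret, KEY_LEN, measurement, HASH_LEN,
                next_secret, KEY_LEN);
}

/* Chain the derivation from the unique device secret (UDS) through each
 * boot component up to and including the attestation core: any change to
 * the loaded code yields a different signing key. */
void derive_attestation_key(const uint8_t uds[KEY_LEN],
                            const uint8_t *const code[], const size_t code_len[],
                            size_t stages, uint8_t signing_key[KEY_LEN]) {
    uint8_t secret[KEY_LEN], next[KEY_LEN];
    memcpy(secret, uds, KEY_LEN);
    for (size_t s = 0; s < stages; s++) {
        dice_next_secret(secret, code[s], code_len[s], next);
        memcpy(secret, next, KEY_LEN);
    }
    memcpy(signing_key, secret, KEY_LEN);
}
```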
At runtime, the attestation server 120 may initiate a challenge-response mechanism, in which the data processing apparatus 110 digitally signs the task integrity metric data 125 together with the random number (nonce) received from the attestation server 120 and required by the challenge-response mechanism. The attestation server 120 receives the data 125 and checks, based on the certificate received from the vendor (e.g., the configuration server 130), the key 503 used by the data processing device 110 to digitally sign the data 125 and the random number. In the final stage of the attestation process, the attestation server compares the task integrity metric data 125 with the reference values defined by the task policy 145 provided by the vendor (e.g., the configuration server 130).
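On the prover side, this exchange can be sketched as follows: the attestation core serializes the current protection domain metrics, binds them to the nonce received from the attestation server 120, and authenticates the result with its derived key 503. An HMAC is used here purely for illustration and hmac_sha256() is an assumed library primitive; the embodiment may equally use an asymmetric signature, as described above.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define KEY_LEN   32
#define MAC_LEN   32
#define NONCE_LEN 16

/* Assumed crypto-library primitive (name/signature are illustrative). */
extern void hmac_sha256(const uint8_t *key, size_t key_len,
                        const uint8_t *msg, size_t msg_len,
                        uint8_t mac[MAC_LEN]);

/* Evidence sent to the attestation server (120): the serialized task
 * integrity metric data (125), the server's nonce, and a MAC over both. */
typedef struct {
    const uint8_t *metrics;      /* serialized protection domain table */
    size_t         metrics_len;
    uint8_t        nonce[NONCE_LEN];
    uint8_t        mac[MAC_LEN];
} attestation_evidence_t;

/* Prover side: bind the metrics to the fresh nonce so the server can
 * detect replayed evidence, then authenticate with the attestation
 * core's key (503), which never leaves its isolated compartment.
 * For this sketch the serialized metrics are assumed to fit in 1024 bytes. */
void build_evidence(const uint8_t key[KEY_LEN],
                    const uint8_t *metrics, size_t metrics_len,
                    const uint8_t nonce[NONCE_LEN],
                    attestation_evidence_t *out) {
    uint8_t msg[1024 + NONCE_LEN];
    size_t n = metrics_len < 1024 ? metrics_len : 1024;

    memcpy(msg, metrics, n);                 /* message = metrics || nonce */
    memcpy(msg + n, nonce, NONCE_LEN);

    out->metrics     = metrics;
    out->metrics_len = metrics_len;
    memcpy(out->nonce, nonce, NONCE_LEN);
    hmac_sha256(key, KEY_LEN, msg, n + NONCE_LEN, out->mac);
}
```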
In one embodiment, the attestation scheme implemented by the data processing apparatus 110 and the attestation server 120 is based on a Root of Trust for Measurement (RTM) that is used to anchor the attestation metrics (i.e., the task integrity metric data 125) in immutable hardware and to report the task integrity metric data 125 in a trusted manner. To this end, in one embodiment, the Device Identifier Composition Engine (DICE) RTM standard may be employed, as described above and shown in FIG. 5. Alternatively, a Trusted Platform Module (TPM) may be used as the RTM. As shown in fig. 5, according to the DICE RTM standard, a unique and random key (also referred to as a unique device key) is stored on the data processing apparatus 110, and a mechanism exists that prevents the key from being read after the DICE engine has executed.
FIG. 6 shows a flow chart of a method 600 for proving the integrity of the data processing apparatus 110, according to one embodiment. As described above, the data processing apparatus 110 is configured to execute a plurality of tasks 305 a to 305 c and includes a processing unit 111 configured to run the RTOS 300 according to a security capability architecture, wherein the RTOS 300 implements a kernel 301, an attestation core 303, and a plurality of tasks 305 a to 305 c, each task 305 a to 305 c, when executed, using one or more of a plurality of capabilities defined by the security capability architecture. In addition, the data processing device 110 includes a memory 115. The method 600 includes a step 601 of providing a plurality of isolated memory compartments of the memory 115, wherein the plurality of isolated memory compartments includes the isolated memory compartment of the attestation core 303 and the respective isolated memory compartment 115 a to 115 c of each task 305 a to 305 c, the isolated memory compartment 115 a to 115 c of each task 305 a to 305 c being defined by the one or more of the plurality of capabilities of the task 305 a to 305 c and the data operated on by those capabilities. In addition, the method 600 includes a step 603 of monitoring the integrity of the isolated memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c to generate task integrity metric data 125 indicative of the integrity of the memory compartments 115 a to 115 c of the plurality of tasks 305 a to 305 c.
Since the method 600 may be implemented by the data processing device 110, other features of the method 600 are directly implemented by the data processing device 110 and the functions of the above-described and below-described different embodiments thereof.
Those skilled in the art will appreciate that the "blocks" ("units") in the various figures (methods and apparatus) represent or describe the functions of the embodiments of the invention (rather than necessarily separate "units" in hardware or software) so as to describe equally well the functions or features of the apparatus embodiments as well as the method embodiments (units = steps).
In several embodiments provided in the present disclosure, it should be understood that the disclosed systems, devices, and methods may be implemented in other ways. The described apparatus embodiments are merely exemplary. For example, the unit division is just one logic function division, and other division manners may be actually implemented. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling or direct coupling or communication connection shown or described may be implemented by some interfaces. The indirect coupling or communication connection between devices or units may be accomplished through electronic, mechanical, or other forms.
The elements described as separate elements may or may not be physically separate, and elements shown as elements may or may not be physical elements, may be located in a single location, or may be distributed over a plurality of network elements. Some or all of the elements may be selected according to actual needs to achieve the objectives of the embodiment.
In addition, functional units in embodiments disclosed herein may be integrated into one processing unit, or each unit may physically reside separately, or two or more units may be integrated into one unit.

Claims (18)

1. A data processing apparatus (110) for performing a plurality of tasks (305 a to 305 c), the data processing apparatus (110) comprising:
A processing unit (111) for running a real-time operating system (RTOS) (300) according to a secure capability architecture, wherein the RTOS (300) implements a kernel (301), an attestation core (303) and a plurality of tasks (305 a to 305 c), each of the plurality of tasks (305 a to 305 c) using one or more of a plurality of capabilities defined by the secure capability architecture when executed;
A memory (115) comprising a plurality of isolated memory bays, wherein the plurality of isolated memory bays comprises an isolated memory bay of the attestation core (303) and a respective isolated memory bay (115 a-115 c) of each task (305 a-305 c), the isolated memory bay (115 a-115 c) of each task (305 a-305 c) being defined by the one or more of the plurality of capabilities of the task (305 a-305 c) and the data operated on by the one or more of the plurality of capabilities of the task (305 a-305 c),
wherein the processing unit (111) is further configured to monitor an integrity of the isolated memory bays (115 a-115 c) of the plurality of tasks (305 a-305 c) to generate task integrity metric data (125) indicative of the integrity of the isolated memory bays (115 a-115 c) of the plurality of tasks (305 a-305 c).
2. The data processing device (110) according to claim 1, wherein the data processing device (110) further comprises a communication interface (113) for sending the task integrity metric data (125) to a certification server (120).
3. The data processing apparatus (110) according to claim 2, wherein the attestation core (303) is configured to record the task integrity metric data (125) of the memory bay (115 a-115 c) of each task (305 a-305 c) and to send the task integrity metric data (125) of the memory bay (115 a-115 c) of each task (305 a-305 c) to the attestation server (120) periodically and/or in an event-driven manner via the communication interface (113).
4. A data processing apparatus (110) according to claim 2 or 3, wherein the attestation core (303) is configured to cryptographically protect the task integrity metric data (125) in accordance with one or more cryptographic keys (503), the communication interface (113) being configured to send the cryptographically protected task integrity metric data (125) to the attestation server (120).
5. The data processing apparatus (110) according to claim 4, wherein the communication interface (113) is configured to receive a random number from the attestation server (120), wherein the attestation core (303) is further configured to cryptographically protect the task integrity metric data (125) and the random number by the one or more cryptographic keys (503), and wherein the communication interface (113) is configured to send the cryptographically protected task integrity metric data (125) and random number to the attestation server (120).
6. The data processing apparatus (110) according to claim 4 or 5, wherein the one or more encryption keys (503) are stored in the isolated memory bay of the attestation core (303).
7. The data processing device (110) according to any of the preceding claims, wherein the processing unit (111) is configured to define the plurality of isolated memory compartments.
8. The data processing apparatus (110) according to any one of the preceding claims, wherein the processing unit (111) is configured to monitor the integrity of the isolated memory compartments (115 a-115 c) of the plurality of tasks (305 a-305 c) according to a trampoline module (401) implemented by trampoline codes, the trampoline module (401) being invoked at each transition between the isolated memory compartments (115 a-115 c) of the plurality of tasks (305 a-305 c) and configured to report to the proving core (303) one or more capabilities exchanged between the isolated memory compartments (115 a-115 c) of the plurality of tasks (305 a-305 c).
9. The data processing apparatus (110) according to any of the preceding claims, wherein the processing unit (111) is further configured to initially scan the memory (115) to determine the plurality of isolated memory compartments of the memory (115).
10. The data processing apparatus (110) according to any of the preceding claims, wherein the processing unit (111) is further configured to store the task integrity metric data (125) in the isolated memory compartment of the attestation core (303).
11. The data processing apparatus (110) according to claim 10, wherein the task integrity metric data (125) includes the one or more capabilities of each respective task (305a to 305c) of the plurality of tasks (305a to 305c).
12. The data processing apparatus (110) according to any of the preceding claims, wherein the one or more of the plurality of capabilities of each task (305a to 305c) comprise pointers and pointer metadata.
13. The data processing apparatus (110) according to any of the preceding claims, wherein the RTOS (300) is a single address space RTOS (300).
14. The data processing apparatus (110) according to any of the preceding claims, wherein the security capability architecture is based on hardware and/or software.
15. A remote attestation system (100), comprising:
at least one data processing apparatus (110) according to any one of the preceding claims; and
an attestation server (120) for receiving the task integrity metric data (125) from the at least one data processing apparatus (110) and attesting the integrity of the at least one data processing apparatus (110) based on the task integrity metric data (125).
16. The remote attestation system (100) of claim 15, wherein one or more reference capabilities of each task (305a to 305c) of the at least one data processing apparatus (110) are defined by a task policy (145), and wherein the attestation server (120) is configured to attest the integrity of the at least one data processing apparatus (110) based on the task integrity metric data (125) and the task policy (145).
17. A method (600) for attesting the integrity of a data processing apparatus (110), the data processing apparatus (110) being adapted to perform a plurality of tasks (305a to 305c) and comprising a memory (115) and a processing unit (111) adapted to run a real-time operating system (RTOS) (300) according to a security capability architecture, wherein the RTOS (300) implements a kernel (301), an attestation core (303) and the plurality of tasks (305a to 305c), each task (305a to 305c) using, when executed, one or more of a plurality of capabilities defined by the security capability architecture, the method (600) comprising:
providing (601) a plurality of isolated memory compartments of the memory (115), wherein the plurality of isolated memory compartments comprises an isolated memory compartment of the attestation core (303) and a respective isolated memory compartment (115a to 115c) of each task (305a to 305c), the isolated memory compartment (115a to 115c) of each task (305a to 305c) being defined by the one or more of the plurality of capabilities of the task (305a to 305c) and the data operated on by the one or more of the plurality of capabilities of the task (305a to 305c); and
monitoring (603) the integrity of the isolated memory compartments (115a to 115c) of the plurality of tasks (305a to 305c) to generate task integrity metric data (125) indicative of the integrity of the isolated memory compartments (115a to 115c) of the plurality of tasks (305a to 305c).
18. A computer program product comprising a computer readable storage medium for storing program code which, when executed by a computer or a processor, causes the computer or the processor to perform the method (600) according to claim 17.
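To make the claimed mechanisms easier to follow, a few illustrative C sketches are added here. They are editorial reading aids, not part of the claims or of the original disclosure, and every identifier, field layout and helper in them is an assumption.

The first sketch shows one possible representation of a capability as a pointer plus pointer metadata (claim 12), of a task's isolated memory compartment as its capabilities plus the data they operate on (claim 1), and of the task integrity metric data recorded per task (claims 10 and 11), together with a measurement routine in the spirit of the monitoring step. The FNV-1a digest only keeps the sketch self-contained; a real attestation core would use a cryptographic hash, and a real capability format is fixed by the underlying security capability architecture.

#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* A capability: a pointer plus pointer metadata (illustrative layout). */
typedef struct {
    uintptr_t base;    /* lower bound of the range the pointer may address */
    size_t    length;  /* size of that range                               */
    uint32_t  perms;   /* read / write / execute permission bits           */
    void     *ptr;     /* the pointer value itself                         */
} capability_t;

/* An isolated memory compartment of a task: the capabilities the task
 * holds and the data operated on by those capabilities.                  */
typedef struct {
    const char         *task_name;
    const capability_t *caps;      /* capability table of the task */
    size_t              cap_count;
    const uint8_t      *data;      /* task-private data region     */
    size_t              data_len;
} task_compartment_t;

/* Task integrity metric data for one task: a digest of its compartment
 * plus a copy of the capabilities it currently holds.                    */
#define MAX_RECORDED_CAPS 8
typedef struct {
    const char   *task_name;
    uint64_t      digest;
    capability_t  caps[MAX_RECORDED_CAPS];
    size_t        cap_count;
} task_integrity_metric_t;

/* Non-cryptographic FNV-1a stand-in for the hash a real attestation core
 * would apply when measuring a compartment.                              */
static uint64_t fnv1a(uint64_t h, const void *buf, size_t len)
{
    const uint8_t *p = (const uint8_t *)buf;
    for (size_t i = 0; i < len; i++) {
        h ^= p[i];
        h *= 0x100000001b3ULL;
    }
    return h;
}

/* Measure one task compartment into its integrity metric record. */
static void measure_task(const task_compartment_t *c, task_integrity_metric_t *m)
{
    uint64_t h = 0xcbf29ce484222325ULL;               /* FNV offset basis */
    h = fnv1a(h, c->caps, c->cap_count * sizeof(capability_t));
    h = fnv1a(h, c->data, c->data_len);

    m->task_name = c->task_name;
    m->digest    = h;
    m->cap_count = c->cap_count < MAX_RECORDED_CAPS ? c->cap_count
                                                    : MAX_RECORDED_CAPS;
    memcpy(m->caps, c->caps, m->cap_count * sizeof(capability_t));
}

Measuring the capability table together with the data region is what lets the metric reflect both what a task may access and what it currently holds.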
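Claim 8 introduces a trampoline module invoked at every transition between task compartments. The sketch below is one hedged interpretation, reusing the types from the previous sketch; attestation_core_report() is a hypothetical hook into the attestation core, and its name and signature are assumptions.

#include <stdio.h>
#include <stddef.h>

/* Hypothetical reporting hook of the attestation core. A real implementation
 * would append the exchanged capabilities to the task integrity metric data
 * held in the attestation core's isolated memory compartment.               */
static void attestation_core_report(const char *from_task, const char *to_task,
                                    const capability_t *caps, size_t cap_count)
{
    printf("transition %s -> %s: %zu capabilities exchanged\n",
           from_task, to_task, cap_count);
    (void)caps;
}

typedef void (*task_entry_fn)(const capability_t *caps, size_t cap_count);

/* Trampoline invoked at each compartment transition: report the capabilities
 * handed from the caller's compartment to the callee's, then transfer control. */
static void trampoline(const task_compartment_t *from,
                       const task_compartment_t *to,
                       task_entry_fn             entry,
                       const capability_t       *passed_caps,
                       size_t                    passed_count)
{
    attestation_core_report(from->task_name, to->task_name,
                            passed_caps, passed_count);

    /* A real trampoline would also switch the active compartment here,
     * e.g. install the callee's capability table, before making the call. */
    entry(passed_caps, passed_count);
}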
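Claims 4 to 6 protect the reported metric data with one or more cryptographic keys held in the attestation core's compartment, and claim 5 binds the report to a random number received from the attestation server. A minimal sketch of that challenge-response step follows; the keyed digest reuses the illustrative fnv1a() routine from the first sketch purely to stay self-contained and is not a real MAC, and the report layout is an assumption.

#include <stdint.h>
#include <stddef.h>

/* Illustrative attestation report sent back over the communication interface. */
typedef struct {
    uint64_t nonce;       /* random number received from the attestation server */
    uint64_t digest;      /* keyed digest over the nonce and the metric data     */
    size_t   task_count;  /* number of task metric records covered               */
} attestation_report_t;

static attestation_report_t build_report(const uint8_t *key, size_t key_len,
                                         uint64_t nonce,
                                         const task_integrity_metric_t *metrics,
                                         size_t task_count)
{
    attestation_report_t r;
    uint64_t h = 0xcbf29ce484222325ULL;

    h = fnv1a(h, key, key_len);           /* key stored in the attestation core's compartment */
    h = fnv1a(h, &nonce, sizeof(nonce));  /* binds the report to the server's challenge       */
    for (size_t i = 0; i < task_count; i++)
        h = fnv1a(h, &metrics[i].digest, sizeof(metrics[i].digest));

    r.nonce      = nonce;
    r.digest     = h;
    r.task_count = task_count;
    return r;
}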
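Claim 3 lets the attestation core push the recorded metrics either periodically or in an event-driven manner. A rough sketch of such a reporting loop is shown below; collect_current_report(), send_to_server() and compartment_event_pending() are hypothetical hooks (only declared here), and POSIX sleep() stands in for an RTOS timer service.

#include <stdbool.h>
#include <unistd.h>   /* sleep(); an RTOS would use its own timer service */

bool compartment_event_pending(void);                     /* event-driven trigger             */
void send_to_server(const attestation_report_t *report);  /* via the communication interface  */
attestation_report_t collect_current_report(void);        /* measure tasks and build a report */

void attestation_reporting_loop(unsigned period_seconds)
{
    for (;;) {
        attestation_report_t r = collect_current_report();
        send_to_server(&r);

        /* Wait for the next period, but report early if a compartment event
         * (e.g. an unexpected capability exchange) is signalled.            */
        unsigned waited = 0;
        while (waited < period_seconds && !compartment_event_pending()) {
            sleep(1);
            waited++;
        }
    }
}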
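On the server side, claim 16 checks the reported capabilities of each task against reference capabilities defined by a task policy. The following sketch is one way such a check could look, reusing capability_t and task_integrity_metric_t from the first sketch; the policy format and the containment rule (reported bounds and permissions must fall within a reference capability) are assumptions, since the claims do not fix them.

#include <stdbool.h>
#include <stddef.h>
#include <string.h>

/* Reference capabilities of one task, as defined by the task policy (145). */
typedef struct {
    const char         *task_name;
    const capability_t *ref_caps;
    size_t              ref_count;
} task_policy_entry_t;

/* Attest one task: every reported capability must fit inside some reference
 * capability of the policy entry (no wider bounds, no extra permissions).   */
static bool attest_task(const task_integrity_metric_t *m,
                        const task_policy_entry_t     *p)
{
    if (strcmp(m->task_name, p->task_name) != 0)
        return false;

    for (size_t i = 0; i < m->cap_count; i++) {
        bool allowed = false;
        for (size_t j = 0; j < p->ref_count; j++) {
            const capability_t *ref = &p->ref_caps[j];
            if (m->caps[i].base >= ref->base &&
                m->caps[i].base + m->caps[i].length <= ref->base + ref->length &&
                (m->caps[i].perms & ~ref->perms) == 0) {
                allowed = true;
                break;
            }
        }
        if (!allowed)
            return false;   /* the task holds a capability the policy does not allow */
    }
    return true;
}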
CN202280101057.3A 2022-10-21 2022-10-21 Data processing device and method for runtime certification Pending CN120035826A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2022/079477 WO2024083346A1 (en) 2022-10-21 2022-10-21 Data processing apparatus and method for runtime attestation

Publications (1)

Publication Number Publication Date
CN120035826A (en) 2025-05-23

Family

ID=84360477

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280101057.3A Pending CN120035826A (en) 2022-10-21 2022-10-21 Data processing device and method for runtime certification

Country Status (2)

Country Link
CN (1) CN120035826A (en)
WO (1) WO2024083346A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160350534A1 (en) * 2015-05-29 2016-12-01 Intel Corporation System, apparatus and method for controlling multiple trusted execution environments in a system
WO2017082966A1 (en) * 2015-11-09 2017-05-18 Intel IP Corporation Integrated universal integrated circuit card on mobile computing environments
US10338957B2 (en) * 2016-12-27 2019-07-02 Intel Corporation Provisioning keys for virtual machine secure enclaves
LU100798B1 (en) * 2018-05-22 2019-11-22 Univ Luxembourg Improved computing apparatus

Also Published As

Publication number Publication date
WO2024083346A1 (en) 2024-04-25

Similar Documents

Publication Publication Date Title
US12197566B2 (en) Method and system for preventing and detecting security threats
JP5646631B2 (en) Device audit
KR101458780B1 (en) Providing a multi-phase lockstep integrity reporting mechanism
US7953980B2 (en) Signed manifest for run-time verification of software program identity and integrity
CN102279760B (en) Device booting with an initial protection component
EP2207121B1 (en) Protecting content on virtualized client platforms
US8364973B2 (en) Dynamic generation of integrity manifest for run-time verification of software program
JP4855679B2 (en) Encapsulation of reliable platform module functions by TCPA inside server management coprocessor subsystem
JP5346608B2 (en) Information processing apparatus and file verification system
CN110612517B (en) Memory protection based on system state
CN110321713B (en) Dynamic measurement method and device of trusted computing platform based on dual-system architecture
CN110334512B (en) Static measurement method and device of trusted computing platform based on dual-system architecture
JP2014513348A (en) System and method for processing a request to change a system security database and firmware storage in an integrated extended firmware interface compliant computing device
CN102656592A (en) Information processing device, information processing system, software routine execution method, and remote authentication method
US20080178257A1 (en) Method for integrity metrics management
CN112511306A (en) Safe operation environment construction method based on mixed trust model
CN118503956B (en) Software protection system, method, storage medium, device and program product
KR20200041639A (en) In-vehicle software update system and method for controlling the same
CN105308610A (en) Method and system for platform and user application security on a device
CN120035826A (en) Data processing device and method for runtime certification
Kalamkar et al. Low-level memory attacks on automotive embedded systems
Kursawe et al. Flexible μTPMs through disembedding
Kornaros et al. Securing Dynamic Firmware Updates of Mixed-Critical Applications
Ma et al. Booting IoT Terminal Device Securely with eMMC
Bravi et al. Implementation of the TCG DICE Specification into the Keystone TEE Framework

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination