CN120670334A - Prefetcher design system based on path history signature and optimization method thereof - Google Patents
- Publication number
- CN120670334A (application CN202510865383.7A)
- Authority
- CN
- China
- Prior art keywords
- signature
- page
- confidence
- history
- prefetcher
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Memory System Of A Hierarchy Structure (AREA)
Abstract
The invention discloses a prefetcher design system based on path history signatures, comprising an in-page signature table, an in-page offset prediction table, a global history register, and a prefetch filter. The corresponding prefetcher optimization method based on path history signatures comprises the following steps: S1, determine the corresponding physical page number from the physical address; S2, access the in-page signature table by the physical page number and index the in-page offset prediction table with the obtained history signature; S3, calculate the overall confidence of the predicted path through the prefetch filter and judge whether it reaches a set threshold; S4, when a prefetch is detected to cross the current page, access the in-page signature table again with the new physical page number and access the in-page offset prediction table with the obtained history signature. The invention supports multi-stride patterns, reduces interference between the history information of different pages, predicts cross-page access patterns, and can evaluate prediction accuracy using the path confidence.
Description
Technical Field
The invention relates to the technical field of processor data prefetching, and in particular to a prefetcher design system based on path history signatures and an optimization method thereof.
Background
As computer system performance has improved, the gap between processor execution speed and memory access latency has kept widening, a phenomenon known as the memory wall. To address this problem, prefetching is widely adopted in modern processor design. Prefetching predicts a program's future data accesses and loads the data into the cache in advance, reducing memory access latency and improving system performance.
History-based prefetchers are a common strategy that predicts future accesses by analyzing past memory access patterns. While effective in some scenarios, traditional history-based prefetchers face serious challenges at physical page transitions. Many prior designs predict from very short histories, or rely only on the first offset within a page, so prediction accuracy suffers severely from the lack of history information. Other designs use long global histories without separating per-page history, which causes interference between the history information of different pages. Traditional prefetching methods also do not use the history of the previous page to predict across a physical page boundary, so they cannot reach maximum performance when a program accesses many pages. Finally, traditional designs use a fixed prefetch depth where the value should change dynamically, which sometimes results in insufficient coverage.
Disclosure of Invention
In view of these problems, the invention aims to provide a prefetcher design system based on path history signatures and an optimization method thereof that support multi-stride patterns, reduce interference between the history information of different pages, predict cross-page access patterns, and use path confidence to evaluate prediction accuracy, thereby improving prefetch accuracy.
The method is realized by the following technical scheme:
According to a first aspect, a prefetcher design system based on path history signatures is provided, comprising an in-page signature table, an in-page offset prediction table, a global history register, and a prefetch filter. The in-page signature table records the last prefetched in-page offset address and the history signature of a page. The in-page offset prediction table records the step increment, step confidence, and signature confidence of prefetch addresses within the page, and is indexed by the history signature output by the in-page signature table. The global history register records global history step information and folds the current long-bit-width history step information into a short-bit-width history signature. The prefetch filter maintains a confidence value on each predicted path and dynamically increases or decreases the prefetch depth and scope according to the accuracy of historical prefetch addresses. The prefetcher design system supports multi-stride patterns, reduces interference between the history information of different pages, and predicts cross-page access patterns, solving the low prediction accuracy of traditional methods that cannot predict cross-page accesses.
Preferably, the field composition of the data structure of the in-page signature table includes at least the Tag, the last prefetched in-page offset address, and the signature. The access pattern in each page can be recorded by the in-page signature table.
Preferably, each page corresponds to a separate in-page signature table. By respectively storing an independent intra-page signature table for each page, the inter-page history information interference can be reduced, and the prefetching accuracy can be improved.
Preferably, the field composition of the data structure of the intra-page offset prediction table includes at least Tag, step increment, step confidence and signature confidence. The prefetch address can be accurately predicted through the intra-page offset prediction table, the cache hit rate is improved, and the memory access delay is reduced.
Preferably, each intra-page offset prediction table includes a plurality of step increments, and each step increment corresponds to a corresponding step confidence. By storing a plurality of step increments in the intra-page offset prediction table, each corresponding to a respective step confidence, the accuracy of the prefetch can be improved.
Preferably, when the history signature hits the in-page offset prediction table, the step increment with the greatest step confidence is selected. If a prefetch's step increment is used and verified, the corresponding step confidence is incremented; whenever a prefetch succeeds the signature confidence is incremented, and when a prefetch fails the signature confidence is decremented. Together, the step confidence and signature confidence express the overall reliability of a prefetch, making the prefetch result more accurate.
Preferably, the global history register is implemented as a 128-bit shift register that stores each predicted step increment after its prediction completes. Through the global history register, the corresponding history signature can still be obtained when the in-page signature table misses, improving prefetch efficiency.
According to a second aspect, a prefetcher optimization method based on path history signatures is provided, comprising the following steps: S1, in the cache hierarchy, after the L2 cache receives an access request for a physical address, the corresponding physical page number is determined from the physical address; S2, the in-page signature table is accessed by the physical page number; on a hit the history signature of the current page is obtained, and on a miss it is obtained from the global history register; the obtained history signature then indexes the in-page offset prediction table to obtain the greatest step confidence in the current page and its corresponding step increment; S3, the overall confidence of the predicted path is calculated through the prefetch filter and compared with a set threshold; if the threshold is not reached the prefetch flow stops, otherwise a prefetch request is initiated, the corresponding step increment is sent to the next-level cache, and the step increment is updated into the current in-page offset prediction table and the global history register; S4, when a prefetch is detected to cross the current page, the in-page signature table is accessed again with the new physical page number; on a hit the history signature corresponding to the new physical page number is obtained, on a miss it is obtained from the global history register, and the obtained history signature indexes the in-page offset prediction table to continue the prefetch flow. The optimization method accesses the in-page signature table by physical page number and runs the prefetch flow after obtaining the page's history signature, making the prefetch result more accurate.
Preferably, in step S3, the overall confidence of each predicted path is calculated as the product of all prediction confidence values along the path. The overall confidence determines whether prefetching continues and provides an evaluation of prefetch accuracy.
Preferably, in step S4, after the history signature corresponding to the new physical page number is acquired according to the global history register, the history signature is updated into the corresponding in-page signature table. By updating the history signature into the corresponding in-page signature table, the accuracy of subsequent prefetching can be improved.
Compared with the prior art, the invention has the following beneficial effects:
according to the technical scheme, the in-page offset prediction table stores a plurality of step increments, each with its own step confidence, and an independent in-page signature table is stored for each page. The design therefore supports multi-stride patterns, reduces interference between the history information of different pages, and can predict cross-page access patterns, solving the low prediction accuracy of traditional methods that cannot predict cross-page accesses. Meanwhile, the global history register allows the corresponding history signature to be obtained when the in-page signature table misses, improving prefetch efficiency, and the prefetch depth and scope are dynamically increased or decreased according to the accuracy of historical prefetch addresses, improving prefetch accuracy.
Drawings
FIG. 1 is a schematic diagram of a path history signature-based prefetcher design system;
FIG. 2 is a flow chart of a path history signature based prefetcher optimization method.
Detailed Description
The following describes the technical solution in the embodiment of the present invention in detail with reference to fig. 1-2 in the embodiment of the present invention.
As shown in FIG. 1, the path history signature-based prefetcher design system comprises an in-page signature table, an in-page offset prediction table, a global history register, and a prefetch filter:
The in-page signature table records the last prefetched in-page offset address and the history signature of a page, and its output indexes the in-page offset prediction table. The fields of its data structure at least comprise: the Tag, which is the number of the current page and uniquely identifies it; the in-page offset address of the last prefetch; and the signature, obtained by compressing the step increments of the last n prefetches, where n is a positive integer. Through the in-page signature table, the prefetcher design system can record the access pattern within each page.
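The entry layout just described can be sketched in Python. The 12-bit signature width and the shift-and-XOR compression of stride deltas are illustrative assumptions, not details fixed by the text.

```python
from dataclasses import dataclass

SIG_BITS = 12                    # assumed signature width
SIG_MASK = (1 << SIG_BITS) - 1

@dataclass
class PageSignatureEntry:
    tag: int          # physical page number, uniquely identifies the page
    last_offset: int  # in-page offset of the most recent prefetch
    signature: int    # compressed history of the last n stride deltas

def update_signature(old_sig: int, delta: int) -> int:
    """Fold a new stride delta into the signature (shift-and-XOR compression)."""
    return ((old_sig << 3) ^ (delta & 0x3F)) & SIG_MASK

# Each access computes its stride from last_offset, folds it into the
# signature, and the updated signature indexes the offset prediction table.
entry = PageSignatureEntry(tag=0x1A2B, last_offset=4, signature=0)
entry.signature = update_signature(entry.signature, delta=2)
```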
In this embodiment, each page corresponds to an independent in-page signature table; that is, per-page history information is kept separate, which reduces interference between the history information of different pages and improves prefetch accuracy.
The in-page offset prediction table records the step increment, step confidence, and signature confidence of prefetch addresses within a page, and is indexed by the history signature output by the in-page signature table. The fields of its data structure at least comprise: the Tag, an N-bit signature (for example 12-bit), where a Tag match during the parallel access driven by the in-page signature table's output is regarded as a hit; the step increment, which is the prediction step provided; the step confidence, implemented as a 4-bit saturating counter that is incremented on a correct prediction and decremented otherwise; and the signature confidence, likewise implemented as a 4-bit saturating counter incremented on a correct prediction and decremented otherwise. Through the in-page offset prediction table, the prefetcher design system can accurately predict prefetch addresses, improve the cache hit rate, and reduce memory access latency.
In this embodiment, each in-page offset prediction table entry includes a plurality of step increments, each with its own step confidence. When a history signature hits the table, the step increment with the greatest step confidence is selected. If a prefetch's step increment is used and verified, the corresponding step confidence is incremented, otherwise it is decremented; whenever any prefetch succeeds the signature confidence is incremented, and when a prefetch fails the signature confidence is decremented.
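The selection and update rules above can be sketched as follows. The 4-bit saturating counters follow the text; the per-entry dictionary mapping step increments to their confidences is an illustrative structure.

```python
CTR_MAX = 15  # 4-bit saturating counter ceiling

def saturating_inc(c: int) -> int:
    return min(c + 1, CTR_MAX)

def saturating_dec(c: int) -> int:
    return max(c - 1, 0)

class OffsetPredictionEntry:
    def __init__(self, tag: int):
        self.tag = tag
        self.steps = {}    # step increment -> step confidence
        self.sig_conf = 0  # Csig: signature confidence

    def predict(self):
        """On a hit, select the step increment with the greatest step confidence."""
        if not self.steps:
            return None
        return max(self.steps, key=self.steps.get)

    def feedback(self, step: int, success: bool):
        """Verified prefetch: bump the step's counter and Csig; failure: decay both."""
        if success:
            self.steps[step] = saturating_inc(self.steps.get(step, 0))
            self.sig_conf = saturating_inc(self.sig_conf)
        else:
            self.steps[step] = saturating_dec(self.steps.get(step, 0))
            self.sig_conf = saturating_dec(self.sig_conf)

e = OffsetPredictionEntry(tag=0x5)
e.feedback(step=2, success=True)
e.feedback(step=4, success=True)
e.feedback(step=2, success=True)
assert e.predict() == 2   # stride 2 has the greater step confidence
```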
In this embodiment, the ratio "step confidence / signature confidence" represents the overall reliability of a given prefetch. The signature confidence is denoted Csig (short for Confidence of Signature), which measures the reliability of the prefetch path associated with a given history signature. For the same step confidence, a larger Csig means a lower reliability for the prefetch path: the history signature occurs frequently but with several different stride patterns, which increases prefetch uncertainty.
The global history register records global history step information and folds the current long-bit-width history step information into a short-bit-width history signature. Folding compresses the long-bit-width history into a short signature so that it can index the in-page offset prediction table.
In this embodiment, the global history register is implemented as a 128-bit shift register into which each predicted step increment is stored after its prediction completes. Through the global history register, the prefetcher design system can obtain the corresponding history signature when the in-page signature table misses, improving prefetch efficiency.
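A minimal sketch of the shift-and-fold behavior: the 128-bit register width follows the text, while the XOR-fold of successive 12-bit chunks and the 8-bit step field are assumptions for illustration.

```python
GHR_BITS = 128   # shift register width, per the text
SIG_BITS = 12    # assumed short signature width

def push_step(ghr: int, step: int, step_bits: int = 8) -> int:
    """Shift a newly predicted step increment into the history register."""
    mask = (1 << GHR_BITS) - 1
    return ((ghr << step_bits) | (step & ((1 << step_bits) - 1))) & mask

def fold_to_signature(ghr: int) -> int:
    """XOR-fold the long history into a short signature usable as a table index."""
    sig = 0
    while ghr:
        sig ^= ghr & ((1 << SIG_BITS) - 1)
        ghr >>= SIG_BITS
    return sig

# Record two predicted strides, then derive the index signature from them.
ghr = push_step(push_step(0, 2), 3)
sig = fold_to_signature(ghr)
```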
The prefetch filter maintains a confidence value on each predicted path and dynamically increases or decreases the prefetch depth and scope according to the accuracy of historical prefetch addresses. Its workflow is as follows:
First, initialize the confidence value: in the initial state, the confidence value is determined by the result of the in-page offset prediction table.
Second, update the confidence value: when a new prefetch is initiated, the confidence value is adjusted according to whether the prediction is later verified, i.e., confirmed by a subsequent access.
Third, calculate the path confidence: the overall confidence of each predicted path is the product of all prediction confidence values along it; when this value falls below the set threshold, the prefetch flow stops.
The prefetch depth is the number of future memory access steps the prefetcher can predict and load in advance, and the prefetch scope is the size of the memory address space the prefetcher can cover. By calculating the overall confidence of each predicted path, the prefetcher design system dynamically evaluates the reliability of the whole predicted path, improving prefetch accuracy.
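The multiplicative path-confidence test can be sketched as follows, assuming 4-bit counters normalized to [0, 1] and an illustrative threshold of 0.25 (the text does not fix these values):

```python
def path_confidence(step_confidences) -> float:
    """Multiply normalized per-step confidences along the predicted path."""
    conf = 1.0
    for c in step_confidences:
        conf *= c / 15.0   # normalize a 4-bit saturating counter to [0, 1]
    return conf

def should_prefetch(step_confidences, threshold: float = 0.25) -> bool:
    return path_confidence(step_confidences) >= threshold

# A deep path of strong steps still passes the threshold...
assert should_prefetch([14, 15, 13])
# ...while one weak step compounds multiplicatively and stops the prefetch.
assert not should_prefetch([14, 3, 13])
```

Because confidences multiply, the effective prefetch depth shrinks automatically on unreliable paths and grows on reliable ones, which is the dynamic depth adjustment described above.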
As shown in FIG. 2, the prefetcher optimization method based on path history signatures proceeds as follows. First, after the L2 cache receives an access request for a physical address, the corresponding physical page number is determined from the physical address. The in-page signature table is then accessed by the physical page number, and the obtained history signature indexes the in-page offset prediction table to obtain the greatest step confidence in the current page and its corresponding step increment. The prefetch filter calculates the overall confidence of the predicted path and judges whether a set threshold is reached. Finally, when a prefetch is detected to cross the current page, the in-page signature table is accessed again with the new physical page number (or the history signature corresponding to the new physical page number is obtained from the global history register), and the obtained history signature is used to access the in-page offset prediction table.
The method specifically comprises the following steps:
s1, in the cache hierarchical structure, after the L2 cache receives an access request of a physical address, the corresponding physical page number is confirmed according to the physical address.
S2, access the in-page signature table by the physical page number; on a hit, obtain the history signature of the current page, and on a miss, obtain it from the global history register; then index the in-page offset prediction table with the obtained history signature to obtain the greatest step confidence in the current page and its corresponding step increment.
S3, calculate the overall confidence of the predicted path through the prefetch filter and judge whether it reaches the set threshold. If the threshold is not reached, the prefetch flow stops; otherwise a prefetch request is initiated, the corresponding step increment is sent to the next-level cache, and the step increment is updated into the current in-page offset prediction table and the global history register.
The overall confidence of each predicted path is calculated as the product of all prediction confidence values along the path. By calculating it, the method can judge whether to continue prefetching and evaluate prefetch accuracy.
S4, when a prefetch is detected to cross the current page, access the in-page signature table again with the new physical page number. On a hit, the history signature corresponding to the new physical page number is obtained; on a miss, it is obtained from the global history register. The obtained history signature is then used to access the in-page offset prediction table and continue the prefetch flow.
After the history signature corresponding to the new physical page number is obtained from the global history register, it is updated into the corresponding in-page signature table, which improves the accuracy of subsequent prefetching.
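Steps S1 to S4 can be combined into a minimal control-flow sketch. The 4 KiB page, cache-line granularity, dictionary-based tables, and degree cap are illustrative assumptions meant to show the hit/miss fallback and the cross-page re-lookup, not a cycle-accurate model.

```python
PAGE_SHIFT = 12
LINE_SHIFT = 6
OFFSETS_PER_PAGE = 1 << (PAGE_SHIFT - LINE_SHIFT)   # 64 cache lines per 4 KiB page

def prefetch_flow(paddr, sig_table, pred_table, ghr_signature, threshold=0.25):
    # S1: derive the physical page number and in-page line offset
    page = paddr >> PAGE_SHIFT
    offset = (paddr >> LINE_SHIFT) & (OFFSETS_PER_PAGE - 1)

    # S2: page's history signature, falling back to the global history register
    sig = sig_table.get(page, ghr_signature)
    step, conf = pred_table.get(sig, (None, 0.0))
    if step is None:
        return []

    # S3: keep issuing prefetches while path confidence stays above threshold
    issued, path_conf = [], 1.0
    while len(issued) < 32:          # cap the prefetch degree (assumed limit)
        path_conf *= conf
        if path_conf < threshold:
            break
        offset += step
        # S4: on a page crossing, redo the lookup with the new page number
        if offset >= OFFSETS_PER_PAGE:
            page += 1
            offset -= OFFSETS_PER_PAGE
            sig = sig_table.get(page, ghr_signature)
            step, conf = pred_table.get(sig, (None, 0.0))
            if step is None:
                break
        issued.append((page << PAGE_SHIFT) | (offset << LINE_SHIFT))
    return issued

# A stride-2 pattern on page 1 with per-step confidence 0.9 issues prefetches
# until the multiplied path confidence drops below the threshold.
addrs = prefetch_flow(1 << 12, {1: 0xAB}, {0xAB: (2, 0.9)}, ghr_signature=0)
```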
In summary, by storing a plurality of step increments with their respective step confidences in the in-page offset prediction table and an independent in-page signature table for each page, the invention supports multi-stride patterns, reduces interference between the history information of different pages, and predicts cross-page access patterns, solving the low prediction accuracy of traditional methods that cannot predict cross-page accesses. Meanwhile, the global history register allows the corresponding history signature to be obtained when the in-page signature table misses, improving prefetch efficiency, and the prefetch depth and scope are dynamically adjusted according to the accuracy of historical prefetch addresses, improving prefetch accuracy and constituting a significant advance.
The above embodiments are only for illustrating the technical idea of the present invention, and the protection scope of the present invention is not limited thereto, and any modification made on the basis of the technical scheme according to the technical idea of the present invention falls within the protection scope of the present invention.
Claims (10)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202510865383.7A CN120670334A (en) | 2025-06-26 | 2025-06-26 | Prefetcher design system based on path history signature and optimization method thereof |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN120670334A true CN120670334A (en) | 2025-09-19 |
Family
ID=97047215
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202510865383.7A Pending CN120670334A (en) | 2025-06-26 | 2025-06-26 | Prefetcher design system based on path history signature and optimization method thereof |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN120670334A (en) |
- 2025-06-26: application CN202510865383.7A filed; publication CN120670334A pending
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |