
CN120670334A - Prefetcher design system based on path history signature and optimization method thereof - Google Patents

Prefetcher design system based on path history signature and optimization method thereof

Info

Publication number
CN120670334A
Authority
CN
China
Prior art keywords
signature
page
confidence
history
prefetcher
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202510865383.7A
Other languages
Chinese (zh)
Inventor
张然
赵晏伯
周海斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Huachuang Micro System Co ltd
Original Assignee
Jiangsu Huachuang Micro System Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Huachuang Micro System Co ltd filed Critical Jiangsu Huachuang Micro System Co ltd
Priority to CN202510865383.7A priority Critical patent/CN120670334A/en
Publication of CN120670334A publication Critical patent/CN120670334A/en
Pending legal-status Critical Current

Landscapes

  • Memory System Of A Hierarchy Structure (AREA)

Abstract

The invention discloses a prefetcher design system based on path history signatures, comprising an in-page signature table, an in-page offset prediction table, a global history register, and a prefetch filter. The prefetcher optimization method based on path history signatures comprises the following steps: S1, the corresponding physical page number is determined from a physical address; S2, the in-page signature table is accessed according to the physical page number, and the obtained history signature is used to index the in-page offset prediction table; S3, the overall confidence of the predicted path is calculated by the prefetch filter and checked against a set threshold; S4, when a prefetch is detected to cross the current page boundary, the in-page signature table is accessed again with the new physical page number, and the in-page offset prediction table is accessed with the obtained history signature. The invention supports a multi-stride mode, can reduce inter-page history information interference, can predict cross-page access patterns, and can evaluate prediction accuracy using the path confidence.

Description

Prefetcher design system based on path history signature and optimization method thereof
Technical Field
The invention relates to the technical field of data prefetching of processors, in particular to a prefetcher design system based on path history signatures and an optimization method thereof.
Background
With the continuous improvement of computer system performance, the gap between processor execution speed and memory access latency keeps widening, a phenomenon known as the memory wall. To address this problem, prefetching techniques are widely adopted in modern processor designs. Prefetching predicts the future data accesses of a program and loads the data into the cache in advance, thereby reducing memory access latency and improving system performance.
History-based prefetchers are a common prefetching strategy that predicts future accesses by analyzing historical memory access patterns. Although effective in some scenarios, traditional history-based prefetchers face serious challenges at physical page transitions. Many prior designs predict from very short histories, or rely only on the first offset within a page, so prediction accuracy suffers from the lack of sufficient history information. Other designs use long global histories, which causes inter-page history information interference because the history within each page is not separated. Traditional prefetching methods also typically do not use the history of the previous page to predict across pages when a physical page boundary is encountered, so they cannot reach their full performance for programs that access many pages. In addition, traditional techniques use a fixed prefetch depth where the depth should change dynamically, which sometimes results in insufficient coverage or poor use of the available access bandwidth.
Disclosure of Invention
In view of the above problems, the invention aims to provide a prefetcher design system based on path history signatures and an optimization method thereof, which support a multi-stride mode, can reduce inter-page history information interference, can predict cross-page access patterns, can evaluate prediction accuracy with the path confidence, and improve prefetch accuracy.
The method is realized by the following technical scheme:
According to a first aspect, a prefetcher design system based on path history signatures is provided, comprising an in-page signature table, an in-page offset prediction table, a global history register, and a prefetch filter. The in-page signature table is used for recording the last prefetched in-page offset address and the history signature of a page. The in-page offset prediction table is used for recording the step increment, step confidence, and signature confidence of the prefetch address within the page, and is indexed by the history signature output by the in-page signature table. The global history register is used for recording global history stride information and folding the current long-bit-width stride history into a short-bit-width history signature. The prefetch filter maintains a confidence value on each predicted path and dynamically increases or decreases the prefetch depth and scope according to the accuracy of historical prefetch addresses. The prefetcher design system supports a multi-stride mode, can reduce inter-page history information interference, and can predict cross-page access patterns, solving the problem that traditional methods cannot predict cross-page access patterns and therefore have low prediction accuracy.
Preferably, the field composition of the data structure of the in-page signature table includes at least the Tag, the last prefetched in-page offset address, and the signature. The access pattern in each page can be recorded by the in-page signature table.
Preferably, each page corresponds to a separate in-page signature table. By respectively storing an independent intra-page signature table for each page, the inter-page history information interference can be reduced, and the prefetching accuracy can be improved.
Preferably, the field composition of the data structure of the intra-page offset prediction table includes at least Tag, step increment, step confidence and signature confidence. The prefetch address can be accurately predicted through the intra-page offset prediction table, the cache hit rate is improved, and the memory access delay is reduced.
Preferably, each intra-page offset prediction table includes a plurality of step increments, and each step increment corresponds to a corresponding step confidence. By storing a plurality of step increments in the intra-page offset prediction table, each corresponding to a respective step confidence, the accuracy of the prefetch can be improved.
Preferably, when the history signature hits the in-page offset prediction table, the step increment with the largest step confidence is selected; if the step increment of a certain prefetch is used, the corresponding step confidence is incremented, otherwise it is decremented; and if any prefetch succeeds, the signature confidence is incremented, while if the prefetch fails, the signature confidence is decremented. The overall reliability of a prefetch can be expressed using the step confidence and the signature confidence, making the prefetch result more accurate.
Preferably, the global history register is implemented using a 128-bit shift register that stores the predicted step increment after the step increment prediction is completed. Through the global history register, the corresponding history signature can be obtained when the in-page signature table misses, improving prefetch efficiency.
According to a second aspect, a prefetcher optimization method based on path history signatures is provided, comprising the following steps. S1, in the cache hierarchy, after the L2 cache receives an access request for a physical address, the corresponding physical page number is determined from the physical address. S2, the in-page signature table is accessed according to the physical page number; on a hit, the history signature of the current page is obtained from it, and on a miss, the history signature is obtained from the global history register; the obtained history signature then indexes the in-page offset prediction table to obtain the largest step confidence within the current page and its corresponding step increment. S3, the overall confidence of the predicted path is calculated by the prefetch filter and compared with the set threshold; if the threshold is not reached, the prefetch flow stops; otherwise a prefetch request is issued, the corresponding step increment is sent to the next-level cache, and the step increment is updated into the current in-page offset prediction table and the global history register. S4, when a prefetch is detected to cross the current page boundary, the in-page signature table is accessed again with the new physical page number; on a hit, the history signature corresponding to the new page is obtained, and on a miss, it is obtained from the global history register; the obtained history signature is then used to access the in-page offset prediction table and continue the prefetch flow. The prefetcher optimization method accesses the in-page signature table according to the physical page number and performs the prefetch flow after obtaining the history signature of the page, making the prefetch result more accurate.
Preferably, in step S3, the overall confidence of each predicted path is calculated by multiplying all the predicted confidence values on the predicted path. By calculating the overall confidence coefficient of each predicted path, whether to continue prefetching can be judged according to the overall confidence coefficient of the predicted path, and the accuracy of prefetching can be evaluated.
Preferably, in step S4, after the history signature corresponding to the new physical page number is acquired according to the global history register, the history signature is updated into the corresponding in-page signature table. By updating the history signature into the corresponding in-page signature table, the accuracy of subsequent prefetching can be improved.
Compared with the prior art, the invention has the following beneficial effects:
according to the technical scheme, a plurality of step increments, each with its corresponding step confidence, are stored in the in-page offset prediction table, and an independent in-page signature table is kept for each page. The design can therefore support a multi-stride mode, reduce inter-page history information interference, and predict cross-page access patterns, solving the problem that traditional methods cannot predict cross-page access patterns and therefore have low prediction accuracy. Meanwhile, through the global history register, the corresponding history signature can be obtained when the in-page signature table misses, which improves prefetch efficiency; and the prefetch depth and scope are dynamically increased or decreased according to the accuracy of historical prefetch addresses, which improves prefetch accuracy.
Drawings
FIG. 1 is a schematic diagram of a path history signature-based prefetcher design system;
FIG. 2 is a flow chart of a path history signature based prefetcher optimization method.
Detailed Description
The technical solutions in the embodiments of the present invention are described in detail below with reference to FIGS. 1-2.
As shown in FIG. 1, which is a schematic diagram of the prefetcher design system based on path history signatures, the system comprises an in-page signature table, an in-page offset prediction table, a global history register, and a prefetch filter, described as follows:
The in-page signature table is used for recording the last prefetched in-page offset address and the history signature of each page. The fields of its data structure at least include a Tag, the last prefetched in-page offset address, and a signature, where the Tag is the number of the current page and uniquely identifies it, the in-page offset address is the one prefetched last time, and the signature is obtained by compressing the step increments corresponding to the last n prefetches, n being a positive integer. The signature is used to index the in-page offset prediction table. Through the in-page signature table, the prefetcher design system can record the access pattern within each page.
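For concreteness, a minimal C++ sketch of one in-page signature table entry follows. The field widths (a page-number tag, a 6-bit in-page offset for 64-byte lines in a 4 KB page, a 12-bit signature) and the shift-and-XOR compression of the last n step increments are illustrative assumptions, not values fixed by this disclosure.

```cpp
#include <cstdint>

// Sketch of one in-page signature table entry (field widths are assumptions).
struct PageSignatureEntry {
    uint64_t tag = 0;         // physical page number; uniquely identifies the page
    uint8_t  last_offset = 0; // last prefetched in-page offset (cache-line index, 0..63 for 4 KB pages)
    uint16_t signature = 0;   // compressed history of the last n step increments (assumed 12 bits)
    bool     valid = false;
};

// One plausible way to fold a new step increment into the per-page signature
// (shift-and-XOR compression; the exact scheme is not specified by the disclosure).
inline uint16_t update_signature(uint16_t sig, int8_t delta) {
    constexpr unsigned kSigBits = 12;
    constexpr unsigned kShift   = 3;
    const uint16_t mask = (1u << kSigBits) - 1;
    return static_cast<uint16_t>(((sig << kShift) ^ (static_cast<uint16_t>(delta) & 0x3F)) & mask);
}
```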
In this embodiment, each page corresponds to an independent in-page signature table, that is, the in-page history information is kept separate per page, so that interference from inter-page history information is reduced and prefetch accuracy is improved.
The in-page offset prediction table is used for recording the step increment, step confidence, and signature confidence of the prefetch address within the page, and is indexed by the history signature output by the in-page signature table. The fields of its data structure at least include: a Tag, which is an N-bit signature (for example, 12-bit); when the in-page offset prediction table is accessed in parallel using the history signature output by the in-page signature table, a Tag match is regarded as a hit; the step increment, which is the prediction step provided; the step confidence, implemented with a 4-bit saturating counter that is incremented when the prediction is correct and decremented otherwise; and the signature confidence, also implemented with a 4-bit saturating counter that is incremented when the prediction is correct and decremented otherwise. Through the in-page offset prediction table, the prefetcher design system can accurately predict the prefetch address, improve the cache hit rate, and reduce memory access latency.
In this embodiment, each in-page offset prediction table entry includes a plurality of step increments, and each step increment corresponds to its own step confidence. When a history signature hits the in-page offset prediction table, the step increment with the largest step confidence is selected. If the step increment of a certain prefetch is used, the corresponding step confidence is incremented, otherwise it is decremented; if any prefetch succeeds, the signature confidence is incremented, and if the prefetch fails, the signature confidence is decremented.
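The multi-stride entry and the increment/decrement rules just described can be sketched as follows; the number of stride slots per entry (four here) is an assumption, while the 4-bit saturating counters follow the description above.

```cpp
#include <algorithm>
#include <array>
#include <cstdint>

constexpr uint8_t kCounterMax = 15; // ceiling of a 4-bit saturating counter

struct StrideSlot {
    int8_t  delta      = 0; // candidate step increment (in cache lines)
    uint8_t confidence = 0; // step confidence, 4-bit saturating counter
};

struct OffsetPredictionEntry {
    uint16_t tag = 0;                    // signature used for the tag match (e.g. 12 bits)
    std::array<StrideSlot, 4> strides{}; // several candidate strides per signature (count assumed)
    uint8_t  sig_confidence = 0;         // signature confidence, 4-bit saturating counter
};

inline void saturating_inc(uint8_t& c) { if (c < kCounterMax) ++c; }
inline void saturating_dec(uint8_t& c) { if (c > 0) --c; }

// On a signature hit, the stride with the largest step confidence is chosen.
inline const StrideSlot& best_stride(const OffsetPredictionEntry& e) {
    return *std::max_element(e.strides.begin(), e.strides.end(),
        [](const StrideSlot& a, const StrideSlot& b) { return a.confidence < b.confidence; });
}

// Update rules from the description: the used stride's confidence rises when the
// prediction is verified and falls otherwise; the signature confidence rises on a
// successful prefetch and falls on a failed one.
inline void update_entry(OffsetPredictionEntry& e, int slot,
                         bool stride_correct, bool prefetch_hit) {
    if (stride_correct) saturating_inc(e.strides[slot].confidence);
    else                saturating_dec(e.strides[slot].confidence);
    if (prefetch_hit)   saturating_inc(e.sig_confidence);
    else                saturating_dec(e.sig_confidence);
}
```

Choosing the highest-confidence stride while still tracking the other slots is what lets a single signature cover several interleaved stride patterns.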
In this embodiment, the ratio of step confidence to signature confidence is used to represent the overall reliability of a given prefetch, with the signature confidence denoted Csig. The larger Csig is, the lower the reliability of a given prefetch path: the corresponding history signature occurs frequently, but multiple stride patterns exist under it, which increases prefetch uncertainty. Csig is short for Confidence of Signature and measures the reliability of the prefetch path associated with a given history signature.
The global history register is used for recording global history stride information and folding the current long-bit-width stride history into a short-bit-width history signature; the folding compresses the long-bit-width stride history into a short-bit-width signature so as to obtain a history signature that can index the in-page offset prediction table.
In this embodiment, the global history register is implemented with a 128-bit shift register; after a step increment prediction is completed, the predicted step increment is stored into it. Through the global history register, the prefetcher design system can obtain the corresponding history signature when the in-page signature table misses, improving prefetch efficiency.
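A minimal sketch of the 128-bit global history register follows. The disclosure only states that the long-bit-width stride history is folded into a short-bit-width signature, so the 8 bits per stored increment, the 12-bit signature width, and the XOR-fold below are assumptions for illustration.

```cpp
#include <cstdint>
#include <initializer_list>

// 128-bit global history of step increments, kept as two 64-bit halves
// (8 bits per stored increment and a 12-bit output signature are assumptions).
struct GlobalHistoryRegister {
    uint64_t hi = 0, lo = 0;

    // Shift in a newly predicted step increment after each prediction completes.
    void push(int8_t delta) {
        hi = (hi << 8) | (lo >> 56);
        lo = (lo << 8) | static_cast<uint8_t>(delta);
    }

    // Fold the long history into a short signature by XOR-ing fixed-width chunks,
    // so it can index the in-page offset prediction table on a signature-table miss.
    uint16_t fold() const {
        constexpr unsigned kSigBits = 12;
        const uint16_t mask = (1u << kSigBits) - 1;
        uint16_t sig = 0;
        for (uint64_t w : {lo, hi})
            for (unsigned i = 0; i < 64; i += kSigBits)
                sig ^= static_cast<uint16_t>((w >> i) & mask);
        return sig;
    }
};
```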
The prefetch filter maintains a confidence value on each predicted path and dynamically increases or decreases the prefetch depth and scope according to the accuracy of historical prefetch addresses. Its workflow is as follows (a sketch of the path-confidence computation is given after these steps):
First, initialize the confidence value: in the initial state, the confidence value is determined by the result of the in-page offset prediction table.
Second, update the confidence value: when a new prefetch is initiated, the confidence value is adjusted according to whether the prediction is successfully verified, that is, whether it is confirmed by subsequent accesses.
Third, calculate the path confidence: the overall confidence of each predicted path is represented by the product of all prediction confidence values on that path; when this value falls below the set threshold, the prefetch flow is stopped.
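The sketch below illustrates the path-confidence computation described in the third step: each prediction along a speculative path contributes a confidence value, the path confidence is their product, and prefetching along that path stops once the product falls below the threshold. The normalization of the 4-bit counters to the range [0, 1] and the threshold value are assumptions.

```cpp
#include <cstdint>
#include <vector>

// Confidence of a single prediction, normalized from its 4-bit counters
// (one plausible normalization; the disclosure only requires a per-prediction
// confidence whose running product gives the path confidence).
inline double prediction_confidence(uint8_t step_conf, uint8_t sig_conf) {
    return (step_conf / 15.0) * (sig_conf / 15.0);
}

// Walk a predicted path, multiplying per-prediction confidences, and stop once the
// product drops below the threshold; the return value is how deep to prefetch,
// which is how the filter shrinks or grows the prefetch depth.
inline int allowed_prefetch_depth(const std::vector<double>& confidences,
                                  double threshold = 0.25 /* assumed value */) {
    double path_conf = 1.0;
    int depth = 0;
    for (double c : confidences) {
        path_conf *= c;
        if (path_conf < threshold) break;
        ++depth;
    }
    return depth;
}
```

With this shape, deeper speculation requires consistently high per-step confidence, which is how the filter trades coverage against accuracy.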
The prefetch depth refers to the number of future memory access addresses the prefetcher can predict and load in advance, and the prefetch scope refers to the size of the memory address space the prefetcher can cover. By calculating the overall confidence of each predicted path, the prefetcher design system dynamically evaluates the reliability of the predicted path, thereby improving prefetch accuracy.
As shown in FIG. 2, which is a flow chart of the prefetcher optimization method based on path history signatures: first, after the L2 cache receives an access request for a physical address, the corresponding physical page number is determined from the physical address; then the in-page signature table is accessed according to the physical page number, the obtained history signature indexes the in-page offset prediction table, and the largest step confidence in the current page and its corresponding step increment are obtained; the overall confidence of the predicted path is calculated by the prefetch filter and checked against the set threshold; finally, when a prefetch is detected to cross the current page boundary, the in-page signature table is accessed again with the new physical page number, or the history signature corresponding to the new physical page number is obtained from the global history register, and the in-page offset prediction table is accessed with the obtained history signature.
The method specifically comprises the following steps:
s1, in the cache hierarchical structure, after the L2 cache receives an access request of a physical address, the corresponding physical page number is confirmed according to the physical address.
S2, the in-page signature table is accessed according to the physical page number; on a hit, the history signature of the current page is obtained from it, and on a miss, the history signature of the current page is obtained from the global history register; the obtained history signature is then used to index the in-page offset prediction table and obtain the largest step confidence within the current page and its corresponding step increment.
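A sketch of the S2 lookup, reusing the entry types from the earlier sketches; the hash-map containers and the Prediction struct are hypothetical glue, not structures named by the disclosure.

```cpp
#include <cstdint>
#include <optional>
#include <unordered_map>

// Hypothetical containers tying the earlier entry sketches together:
// the signature table is keyed by physical page number, the offset
// prediction table by the (folded) history signature.
using SignatureTable  = std::unordered_map<uint64_t, PageSignatureEntry>;
using PredictionTable = std::unordered_map<uint16_t, OffsetPredictionEntry>;

struct Prediction {
    int8_t  delta;     // step increment to prefetch with
    uint8_t step_conf; // its step confidence
    uint8_t sig_conf;  // signature confidence of the matching entry
};

// S2: get the page's history signature (falling back to the folded global history
// on a signature-table miss), then index the offset prediction table with it and
// return the stride with the largest step confidence.
inline std::optional<Prediction> lookup(uint64_t phys_page,
                                        const SignatureTable& sig_table,
                                        const PredictionTable& pred_table,
                                        const GlobalHistoryRegister& ghr) {
    auto s = sig_table.find(phys_page);
    const uint16_t signature =
        (s != sig_table.end()) ? s->second.signature : ghr.fold();

    auto p = pred_table.find(signature);
    if (p == pred_table.end()) return std::nullopt; // no prediction for this signature
    const StrideSlot& best = best_stride(p->second);
    return Prediction{best.delta, best.confidence, p->second.sig_confidence};
}
```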
S3, the overall confidence of the predicted path is calculated by the prefetch filter and compared with the set threshold; if the threshold is not reached, the prefetch flow stops; if the confidence is above the threshold, a prefetch request is issued, the corresponding step increment is sent to the next-level cache, and the step increment is updated into the current in-page offset prediction table and the global history register.
The overall confidence of each predicted path is calculated as the product of all prediction confidence values on that path. By calculating the overall confidence of each predicted path, whether to continue prefetching can be decided from it, and the prediction accuracy can be evaluated.
S4, when a prefetch is detected to cross the current page boundary, the in-page signature table is accessed again with the new physical page number; on a hit, the history signature corresponding to the new physical page number is obtained, and on a miss, it is obtained from the global history register; the obtained history signature is then used to access the in-page offset prediction table and continue the prefetch flow.
After the history signature corresponding to the new physical page number is obtained according to the global history register, the history signature is updated to the corresponding in-page signature table, so that the accuracy of subsequent prefetching can be improved.
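A sketch of the S4 cross-page handling under the same assumptions: when a prefetch address leaves the current 4 KB page, the new page number drives the same lookup again, and on a signature-table miss the new page's entry is seeded from the folded global history and written back, matching the update described above.

```cpp
#include <cstdint>

// S4: if a prefetch address crosses the page boundary, switch to the new physical
// page number and repeat the lookup; on a signature-table miss the new page's entry
// is seeded from the folded global history (4 KB pages and the seeding details are assumptions).
inline void handle_possible_page_cross(uint64_t prefetch_addr, uint64_t current_page,
                                       SignatureTable& sig_table,
                                       const PredictionTable& pred_table,
                                       const GlobalHistoryRegister& ghr) {
    constexpr unsigned kPageShift = 12; // 4 KB pages
    const uint64_t new_page = prefetch_addr >> kPageShift;
    if (new_page == current_page) return; // still inside the current page

    if (sig_table.find(new_page) == sig_table.end()) {
        PageSignatureEntry e;
        e.tag = new_page;
        e.signature = ghr.fold(); // history signature taken from the global history register
        e.last_offset = static_cast<uint8_t>((prefetch_addr >> 6) & 0x3F); // cache-line offset
        e.valid = true;
        sig_table[new_page] = e;
    }
    // Continue the prefetch flow on the new page with the obtained signature;
    // the returned prediction would drive the next prefetch (omitted here).
    lookup(new_page, sig_table, pred_table, ghr);
}
```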
In summary, by storing a plurality of step increments, each with its corresponding step confidence, in the in-page offset prediction table and keeping an independent in-page signature table for each page, the invention can support a multi-stride mode, reduce inter-page history information interference, and predict cross-page access patterns, solving the problem of low prediction accuracy caused by traditional methods being unable to predict cross-page access patterns. Meanwhile, through the global history register, the corresponding history signature can be obtained when the in-page signature table misses, which improves prefetch efficiency; and the prefetch depth and scope are dynamically increased or decreased according to the accuracy of historical prefetch addresses, which improves prefetch accuracy and constitutes a significant advance.
The above embodiments are only for illustrating the technical idea of the present invention, and the protection scope of the present invention is not limited thereto, and any modification made on the basis of the technical scheme according to the technical idea of the present invention falls within the protection scope of the present invention.

Claims (10)

1. A prefetcher design system based on path history signatures, characterized by comprising:
an in-page signature table, used for recording the last prefetched in-page offset address and the history signature of a page;
an in-page offset prediction table, used for recording the step increment, step confidence, and signature confidence of the prefetch address within the page, and indexed according to the history signature output by the in-page signature table;
a global history register, used for recording global history stride information and folding the current long-bit-width stride history into a short-bit-width history signature;
a prefetch filter, which maintains a confidence value on each predicted path and dynamically increases or decreases the prefetch depth and scope according to the accuracy of historical prefetch addresses.
2. The prefetcher design system based on path history signatures according to claim 1, characterized in that the fields of the data structure of the in-page signature table at least include: a Tag, the last prefetched in-page offset address, and a signature.
3. The prefetcher design system based on path history signatures according to claim 1, characterized in that each page corresponds to an independent in-page signature table.
4. The prefetcher design system based on path history signatures according to claim 1, characterized in that the fields of the data structure of the in-page offset prediction table at least include: a Tag, a step increment, a step confidence, and a signature confidence.
5. The prefetcher design system based on path history signatures according to claim 1, characterized in that each in-page offset prediction table contains a plurality of step increments, and each step increment corresponds to a respective step confidence.
6. The prefetcher design system based on path history signatures according to claim 1, characterized in that when the history signature hits the in-page offset prediction table, the step increment with the largest corresponding step confidence is selected; if the step increment of a certain prefetch is used, the corresponding step confidence is incremented, otherwise it is decremented; and if any prefetch succeeds, the signature confidence is incremented, while if the prefetch fails, the signature confidence is decremented.
7. The prefetcher design system based on path history signatures according to claim 1, characterized in that the global history register is implemented with a 128-bit shift register, which stores the predicted step increment after the step increment prediction is completed.
8. A prefetcher optimization method based on path history signatures, using the prefetcher design system based on path history signatures according to any one of claims 1 to 7, characterized in that the method comprises the following steps:
S1, in the cache hierarchy, when the L2 cache receives an access request for a physical address, the corresponding physical page number is determined from the physical address;
S2, the in-page signature table is accessed according to the physical page number; on a hit, the history signature of the current page is obtained, and on a miss, it is obtained from the global history register; the obtained history signature is then used to index the in-page offset prediction table and obtain the largest step confidence within the current page and its corresponding step increment;
S3, the overall confidence of the predicted path is calculated by the prefetch filter and compared with the set threshold; if the threshold is not reached, the prefetch flow stops; if it is above the threshold, a prefetch request is issued, the corresponding step increment is sent to the next-level cache, and the step increment is updated into the current in-page offset prediction table and the global history register;
S4, when a prefetch is detected to cross a page boundary, the in-page signature table is accessed again with the new physical page number; on a hit, the history signature corresponding to the new physical page number is obtained, and on a miss, it is obtained from the global history register; the obtained history signature is then used to access the in-page offset prediction table and perform the prefetch flow.
9. The prefetcher optimization method based on path history signatures according to claim 8, characterized in that in step S3, the overall confidence of each predicted path is calculated as the product of all prediction confidence values on that predicted path.
10. The prefetcher optimization method based on path history signatures according to claim 8, characterized in that in step S4, after the history signature corresponding to the new physical page number is obtained from the global history register, the history signature is updated into the corresponding in-page signature table.
CN202510865383.7A 2025-06-26 2025-06-26 Prefetcher design system based on path history signature and optimization method thereof Pending CN120670334A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202510865383.7A CN120670334A (en) 2025-06-26 2025-06-26 Prefetcher design system based on path history signature and optimization method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202510865383.7A CN120670334A (en) 2025-06-26 2025-06-26 Prefetcher design system based on path history signature and optimization method thereof

Publications (1)

Publication Number Publication Date
CN120670334A true CN120670334A (en) 2025-09-19

Family

ID=97047215

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202510865383.7A Pending CN120670334A (en) 2025-06-26 2025-06-26 Prefetcher design system based on path history signature and optimization method thereof

Country Status (1)

Country Link
CN (1) CN120670334A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination