WO2014011181A1 - Providing data to be retrieved - Google Patents
- Publication number: WO2014011181A1 (PCT/US2012/046514)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- module
- prospective
- conditional probability
- neural network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- All entries fall under G—PHYSICS; G06—COMPUTING OR CALCULATING; COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING:
- G06F12/0862—Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches, with prefetch
- G06F12/08—Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
- G06F12/0866—Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches, for peripheral storage systems, e.g. disk cache
- G06F12/0871—Allocation or management of cache space
- G06F2212/1016—Performance improvement
- G06F2212/6024—History based prefetching
- G06F2212/6026—Prefetching based on access pattern detection, e.g. stride based prefetch
Definitions
- a computing system may contain a non-volatile memory device and several volatile memory devices.
- the nonvolatile memory device has a larger storage capacity than the volatile memory devices.
- the access time of data stored in the non-volatile memory device can be slower than the access time of data stored in the volatile memory devices.
- some computing systems store copies of data from a non-volatile memory source in a volatile memory source.
- the processor can then attempt to request data from the volatile memory source before requesting data from the slower non-volatile memory source.
- predicting which data the processor might request can be difficult.
- storing data from a non-volatile memory device in a volatile memory can be inefficient if the stored data is not requested by the processor.
- FIG. 1 is a block diagram of an example of a computing system that can provide data to be retrieved
- FIG. 2 is a process flow diagram illustrating an example of a method for providing data to be retrieved
- FIG. 3 is a process flow diagram illustrating an example of a method for initializing a system that provides data to be retrieved;
- FIG. 4 is an example which illustrates the data flow in a system that can provide data to be retrieved.
- Fig. 5 is an example of a tangible, non-transitory computer-readable medium that can provide data to be retrieved.
- Various methods have been developed to identify and retrieve copies of data stored in non-volatile memory devices. For example, some methods identify and retrieve copies of data from non-volatile memory devices based on data requested by the processor. These methods may store copies of data that have sequential memory addresses that follow the memory address of the data last requested by the processor. However, many applications may not store data in a sequential configuration.
- a relational database can store data in tables. The data for each table of the relational database may be stored in a non-sequential configuration in non-volatile memory because a row from a first database table may be stored following a row from a second database table. In this example, the processor may be unlikely to request data in a sequential pattern. Accordingly, a sequential method of retrieving and storing copies of non-volatile memory data in a volatile memory source can be inefficient.
- the techniques disclosed herein describe a method for providing data to be retrieved.
- the data includes any data blocks, pages of data, tables, or any other information that may be requested by the processor.
- the data to be retrieved is identified using a sequential memory module, a neural network module, and a conditional probability module.
- Each module can identify prospective data to be retrieved based on calculations that attempt to determine which data the processor might request.
- a combination of the prospective data identified in each module is then determined to provide accurate prospective data that the processor is likely to request.
- the combination of these modules can increase the efficiency of a computing system by providing copies of data that are most likely to be requested by the processor to fast memory devices.
- Fig. 1 is a block diagram of an example of a computing system 100 that may be used for providing data to be retrieved.
- the computing system 100 may include, for example, a mobile phone, laptop computer, desktop computer, or tablet computer, among others.
- the computing system 100 may include a processor 102 that is adapted to execute stored instructions.
- the processor 102 can be a single core processor, a multi-core processor, a computing cluster, or any number of other appropriate configurations.
- the processor 102 may be connected through a system bus 104 (e.g., PCI, PCI Express, HyperTransport®, Serial ATA, among others) to an input/output (I/O) device interface 106 adapted to connect the computing system 100 to one or more I/O devices 108.
- the I/O devices 108 may include, for example, a keyboard and a pointing device, wherein the pointing device may include a touchpad or a
- the I/O devices 108 may be built-in components of the computing system 100, or may be devices that are externally connected to the computing system 100.
- the processor 102 may also be linked through the system bus 104 to a display interface 110 adapted to connect the computing system 100 to a display device 112.
- the display device 112 may include a display screen that is a built-in component of the computing system 100.
- the display device 112 may also include a computer monitor, television, or projector, among others, that is externally connected to the computing system 100.
- the processor 102 may also be linked through the system bus 104 to a network interface card (NIC) 114.
- the NIC 114 may be adapted to connect the computing system 100 through the system bus 104 to a network (not depicted).
- the network (not depicted) may be a wide area network (WAN), local area network (LAN), or the Internet, among others.
- the processor first searches for requested data in memory 116.
- the memory 116 can include random access memory (e.g., SRAM, DRAM, SONOS, eDRAM, EDO RAM, DDR RAM, RRAM, PRAM, among others), read only memory (e.g., Mask ROM, PROM, EPROM, EEPROM, among others), flash memory, nonvolatile memory, or any other suitable memory systems.
- the processor 102 can search for the requested instructions or data in a storage device 118.
- the storage device 118 can include a hard drive, an optical drive, a USB flash drive, an array of drives, or any combination thereof.
- the storage device 118 can contain all of the stored instructions and data for the computing system 100.
- the storage device 118 can also include a memory manager 120 that includes a neural network module 122, a conditional probability module 124, and a sequential memory module 126.
- the memory manager 120 can provide data to be retrieved from the storage device 118 and store the data in memory 116 based on the neural network module 122, sequential memory module 126, and conditional probability module 124.
- the neural network module 122, sequential memory module 126, and the conditional probability module 124 can identify data that is most likely to be requested by the processor 102.
- The block diagram of Fig. 1 is not intended to indicate that the computing system 100 is to include all of the components shown in Fig. 1. Rather, the computing system 100 can include fewer or additional components not illustrated in Fig. 1 (e.g., additional memory devices, video cards, additional network interfaces, etc.).
- any of the functionalities of the memory manager 120 may be partially, or entirely, implemented in hardware or in the processor 102.
- the functionality may be implemented with an application specific integrated circuit, in logic implemented in the processor 102, or in a co-processor on a peripheral device, among others.
- Fig. 2 is a process flow diagram illustrating an example of a method for providing data to be retrieved.
- the method 200 may be used to provide data to be retrieved by using a computing system, such as the computing system 100 described in Fig. 1.
- the method 200 may be implemented by the memory manager 120, which can provide data to be retrieved based in part on a neural network module and conditional probability module.
- a data retrieval request can include a request to retrieve data such as data blocks, pages of data, tables of data, or information related to data.
- a data retrieval request may include retrieving an instruction that is likely to be requested by the processor.
- the data retrieval request can identify data in storage that is to be copied and stored in a memory device.
- the memory devices may be faster than the storage device. By storing copies of data in a memory device, the processor may access requested data in a shorter period of time.
- the sequential memory module identifies a first set of prospective data.
- the first set of prospective data can be identified by locating the memory address of the data block last accessed by the processor and retrieving the next sequential data blocks following that address.
- the sequential memory module may detect the processor last accessed data stored at memory address N of storage. The sequential memory module can then retrieve the data that resides at memory address N+1 of storage and store a copy of the retrieved data in memory.
- the sequential memory module may retrieve a range of data blocks based on sequential memory addresses that are located after the memory address of the data block last accessed by the processor. By storing copies of sequential data in memory, the sequential memory module can increase the speed of execution of instructions because the processor may request data from a memory device rather than a slower storage device.
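The N+1 behavior described above can be sketched as follows; the function name and the fixed prefetch window are illustrative assumptions, not the patent's implementation:

```python
def sequential_prefetch(last_address, window=4):
    """Return the first set of prospective data: the addresses of the
    data blocks immediately following the block last accessed."""
    return [last_address + i for i in range(1, window + 1)]

# If the processor last accessed address N = 100, prefetch 101..104.
print(sequential_prefetch(100))  # [101, 102, 103, 104]
```

A window larger than one block corresponds to the range of sequential data blocks mentioned above.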
- a neural network module includes an interconnected group of neurons.
- a neuron includes a device with one or more inputs and one or more outputs.
- a neuron can include a mathematical function or any other appropriate mathematical computation.
- the neuron can apply a mathematical function or computation to a set of inputs and return an output.
- the neuron may include a polynomial function that includes several variables that represent different input values.
- the input values may be memory addresses for data blocks that are likely to be requested by the processor.
- a polynomial function may then calculate an output that represents the data block that is most likely to be requested by the processor.
- a neuron can return multiple outputs that represent a set of data blocks that are likely to be requested by the processor.
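A minimal sketch of such a neuron, with a plain weighted sum standing in for the polynomial function described above (the input values and weights are illustrative):

```python
def neuron(inputs, weights, bias=0.0):
    """A single neuron: one or more inputs, one output, produced by a
    mathematical function of the inputs (here a weighted sum)."""
    return sum(w * x for w, x in zip(weights, inputs)) + bias

# Inputs might be (normalized) memory addresses of candidate blocks.
out = neuron([0.2, 0.5, 0.1], [1.0, -0.5, 2.0])
print(out)  # ≈ 0.15
```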
- a conditional probability module can include a matrix or any other appropriate representation of conditional probabilities.
- a conditional probability includes the probability of one event based on the occurrence of a second event.
- the conditional probability of A given B is the probability of A if B is known to have occurred.
- the conditional probability of A given B is the probability of a data point A being requested by a processor if another data point B is known to have been requested by the processor.
- a conditional probability module can provide the conditional probabilities of a data block being requested by a processor based on other data blocks that have been previously requested. If the neural network module and the conditional probability module have been initialized, the process continues at block 208. If the neural network module and the conditional probability module have not been initialized, the process continues at block 210.
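A toy sketch of a conditional-probability module as a nested mapping, where `cond[b][a]` holds P(block a is requested | block b was just requested); the block names and probabilities are invented for illustration:

```python
# cond[b][a] = probability that block a is requested next,
# given that block b was just requested by the processor.
cond = {
    "b1": {"b2": 0.6, "b3": 0.3},
    "b2": {"b4": 0.8},
}

def prospective(last_block, k=1):
    """Return the k blocks most likely to follow last_block."""
    probs = cond.get(last_block, {})
    return sorted(probs, key=probs.get, reverse=True)[:k]

print(prospective("b1", k=2))  # ['b2', 'b3']
```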
- the first set of prospective data is returned as a result set of data.
- the neural network module and conditional probability module may not have prospective data included in the result set of data because the neural network module and conditional probability module may not be initialized.
- a subset of the first set may be returned as the result set.
- the memory manager 120 may be configured to retrieve five prospective data blocks, while the sequential memory module returns a first set of ten prospective data blocks. The memory manager 120 can then select a subset of five data blocks from the first set. The memory manager 120 can select the subset of the first set based on any number of suitable techniques, such as random selection, selecting the first members of the first set, or selecting the last members of the first set, among others.
- the process continues at block 208.
- the data retrieval request is sent to a neural network module and conditional probability module.
- the neural network module and conditional probability module can accept any suitable number of data retrieval requests and each module can identify a set of prospective data.
- the second set of prospective requested data is identified from the neural network module.
- the neural network module accepts as input any suitable number of prospective data blocks that are likely to be requested by the processor.
- the data retrieval request indicates the prospective data blocks to be used as input for the neural network module.
- the neural network module can then apply a set of weights to the prospective data blocks that are likely to be requested by the processor.
- the combination of each weight and prospective data block is then sent to a neuron.
- Each neuron can include a transfer function, such as a polynomial function.
- the transfer function calculates output based on the combination of each weight and prospective data block. The output can vary depending on the type of transfer function used in the neural network module.
- a linear transfer function may result in outputs from the neurons that are proportional to the total weighted output.
- threshold or sigmoid transfer functions may be used.
- the output of a threshold transfer function can be set at one of two levels depending on whether the output is greater than or less than a threshold value.
- the output of a sigmoid function can be continuous and non-linear.
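The three transfer-function families mentioned above can be sketched as follows; the parameter names are assumptions:

```python
import math

def linear(x, slope=1.0):
    return slope * x                       # proportional to weighted input

def threshold(x, theta=0.0, low=0.0, high=1.0):
    return high if x > theta else low      # one of two levels

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))      # continuous and non-linear

print(threshold(0.4, theta=0.5), threshold(0.6, theta=0.5))  # 0.0 1.0
print(round(sigmoid(0.0), 3))  # 0.5
```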
- the conditional probability module may include the probability of any suitable data blocks being requested by the processor based on whether other data blocks have already been requested by the processor.
- a conditional probability module may be represented as a matrix. Each cell of the matrix represents the probability that a data block will be requested given that another data block has previously been requested by the processor.
- a zero can be stored in a cell if the probability of the cell is below a threshold.
- a conditional probability module may have a threshold of 25%. If the probability stored in cell A of the conditional probability module is 10%, then a zero may be stored as the probability of the corresponding data block being requested by the processor.
- the probabilities can be stored as a sparse matrix.
- the cells of the conditional probability matrix with non-zero values can be stored, while the cells with zero values are not stored.
- the data in the sparse matrix can then be stored using less memory space.
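A sketch of the threshold-plus-sparse-storage idea, assuming a dictionary keyed by (row, column) stands in for the sparse matrix; only cells at or above the threshold are kept:

```python
def to_sparse(matrix, threshold=0.25):
    """Keep only cells whose probability meets the threshold; everything
    else is treated as zero and takes no storage space.  The 25%
    threshold mirrors the example above."""
    return {(r, c): p
            for r, row in enumerate(matrix)
            for c, p in enumerate(row)
            if p >= threshold}

dense = [[0.10, 0.60],
         [0.00, 0.30]]
print(to_sparse(dense))  # {(0, 1): 0.6, (1, 1): 0.3}
```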
- the second set of prospective data is combined with the third set of prospective data (identified by the conditional probability module) to produce a predictive set.
- the combination of the two sets of prospective data can be based on the accuracy of the neural network module and the conditional probability module.
- the memory manager 120 can determine the accuracy of the neural network module and the conditional probability module by monitoring the outputs of the neural network module and the conditional probability module. The outputs of the neural network module and the conditional probability module can then be compared to the actual requested data. The memory manager 120 can then determine each module's accuracy.
- Accuracy, as referred to herein, is the rate at which the prospective requested data is actually requested by the processor.
- the memory manager 120 may determine that a larger number of prospective data is to be returned from the neural network module than the conditional probability module because the neural network module is more accurate. As a result, the memory manager 120 can return a predictive set with a high rate of accuracy.
- the memory manager 120 may determine that the accuracy of either the neural network module or the conditional probability module has fallen below a threshold. The memory manager 120 can then stop retrieving prospective data from the neural network module or conditional probability module once the accuracy of the outputs falls below a threshold.
- a predictive set is generated by combining some number of prospective data from the neural network module and some number of prospective data from the conditional probability module. For example, in an implementation wherein the predictive set is configured to return a set of eight prospective requested data blocks, the memory manager 120 may determine, based on the accuracy of the neural network module and conditional probability module, that two of the prospective data blocks are to be identified by the neural network module and six of the prospective data blocks are to be identified by the conditional probability module.
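A sketch of that combination step, assuming each module returns a ranked list of candidate blocks and the split (two from the neural network, six from the conditional-probability module, as in the example) has already been chosen from the modules' measured accuracies:

```python
def predictive_set(nn_candidates, cp_candidates, n_from_nn, n_from_cp):
    """Combine prospective data from the two modules into one
    predictive set, taking the top candidates from each."""
    return nn_candidates[:n_from_nn] + cp_candidates[:n_from_cp]

nn = ["n1", "n2", "n3"]
cp = ["c1", "c2", "c3", "c4", "c5", "c6", "c7"]
print(predictive_set(nn, cp, 2, 6))
# ['n1', 'n2', 'c1', 'c2', 'c3', 'c4', 'c5', 'c6']
```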
- the predictive set and the first set identified by the sequential memory module are combined to form a result set of prospective data.
- the result set can be any appropriate combination of the first set and predictive set.
- the result set may be configured to return 10 prospective data blocks. If the memory manager 120 determines that the predictive set has an accuracy of 90%, the memory manager 120 may select nine prospective data blocks from the predictive set and one data block from the first set produced by the sequential memory module. In other examples, the predictive set may have an accuracy that falls below a threshold, in which case the memory manager 120 may select the entire result set from the first set of prospective data.
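The 90%/10% split and the below-threshold fallback described above can be sketched as follows; the set size and the accuracy floor are illustrative assumptions:

```python
def result_set(predictive, first_set, size=10, accuracy=0.9,
               accuracy_floor=0.3):
    """Fill the result set in proportion to the predictive set's
    accuracy; fall back entirely to the sequential first set when
    accuracy drops below the floor."""
    if accuracy < accuracy_floor:
        return first_set[:size]
    n_pred = round(size * accuracy)        # e.g. 9 of 10 at 90% accuracy
    return predictive[:n_pred] + first_set[:size - n_pred]

pred = [f"p{i}" for i in range(10)]
seq = [f"s{i}" for i in range(10)]
print(result_set(pred, seq))  # 9 predictive blocks, then 1 sequential block
```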
- the neural network module and conditional probability module may then be reinitialized to increase the accuracy of the predictive set. Re-initialization is discussed in greater detail below in relation to Fig. 3.
- the process flow diagram of Fig. 2 is not intended to indicate that the steps of the method 200 are to be executed in any particular order, or that all of the steps of the method 200 are to be included in every case.
- the method 200 may identify the second set of prospective data and the third set of prospective data in parallel. Further, any number of additional steps may be included within the method 200, depending on the specific application.
- Fig. 3 is a process flow diagram illustrating an example of a method for initializing the neural network module and conditional probability module.
- the method 300 may be used to initialize the neural network module and conditional probability module by using a computing system, such as the computing system 100 described in Fig. 1.
- the method 300 may be implemented by the memory manager 120, which can determine if the neural network module or conditional probability module is to be initialized.
- the neural network module or the conditional probability module is to be initialized based on initialization criteria.
- the initialization criteria indicate if the neural network module or conditional probability module has not been initialized or if the neural network module or conditional probability module is to be reinitialized.
- the initialization criteria can be based on the accuracy of the neural network module or the conditional probability module.
- the memory manager 120 may be configured to include an accuracy threshold. If the accuracy of the outputs from the neural network module or the conditional probability module decline below the accuracy threshold, the neural network module or conditional probability module may be reinitialized.
- the neural network module and conditional probability module are continuously updated during initialization and re-initialization by monitoring the data requested by the processor and re-calculating the neural network module and conditional probability module based on the requested data. If the neural network module or the conditional probability module is not to be initialized or reinitialized, the process ends at block 304. If the neural network module or conditional probability module is to be initialized or reinitialized, the process continues at block 306.
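The initialization criteria above can be sketched as a simple predicate; the state flag and the single accuracy threshold are illustrative assumptions:

```python
def needs_reinit(module_state, accuracy, accuracy_threshold=0.3):
    """A module is (re)initialized if it has never been initialized,
    or if its measured accuracy has fallen below the threshold."""
    return (not module_state.get("initialized", False)
            or accuracy < accuracy_threshold)

print(needs_reinit({}, 0.9))                     # True  (never initialized)
print(needs_reinit({"initialized": True}, 0.2))  # True  (accuracy too low)
print(needs_reinit({"initialized": True}, 0.6))  # False
```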
- identifying prospective data from the neural network module and conditional probability module is halted.
- the prospective data is then identified by the sequential memory module.
- identifying the prospective data by the sequential memory module can increase the number of data blocks stored in memory that are requested by the processor.
- the accuracy of the memory manager 120 may increase because the prospective data identified by the sequential memory module may include a greater number of data blocks that are accessed by the processor than the prospective data identified by the neural network module or the conditional probability module.
- requested data from the processor is sent to the neural network module and the conditional probability module.
- Sending requested data to the neural network module and the conditional probability module allows the neurons of the neural network module and the conditional probabilities of the conditional probability module to be configured to increase accuracy. For example, if the processor starts execution of a new application, the processor may request data blocks that have not previously been requested. In this example, the accuracy of the neural network module may decrease because the criteria used to identify prospective data no longer produce accurate results; the new application may request different data than the previous application. For example, the accuracy of the neural network module or the conditional probability module may decrease from 60% to 20%. To increase the accuracy of the neural network module and the conditional probability module, the requested data blocks can be sent to both modules.
- Each requested data block may increase the likelihood that the neural network module or the conditional probability module identifies prospective data that will be requested by the processor.
- the neural network module may be 2% more likely to identify prospective data blocks that are requested by the processor after the neural network module is configured based on the requested data.
- Configuring the neural network module and the conditional probability module during initialization and re-initialization is discussed in greater detail below in regard to Fig. 4.
- prospective data is generated from the neural network module and the conditional probability module.
- the prospective data is generated in response to the data last requested from the processor. For example, a conditional probability in the conditional probability module may be recalculated after each data block requested by the processor is sent to the conditional probability module.
- the sequential memory module results may represent a threshold for accuracy.
- the memory manager 120 may identify a certain number of prospective data blocks from the neural network module and conditional probability module based on the accuracy of each module. After initialization, the accuracy of the neural network module and the conditional probability module may be improved. For example, after initialization, the neural network module may have an accuracy of 60% while the accuracy of the sequential memory module is 40%. In this example, the neural network module may begin providing prospective data and the process ends. If the accuracy of the generated prospective data is below a threshold, the process returns to block 308. Each iteration of the process can increase the accuracy of the neural network module and conditional probability module so that the generated prospective data is more likely to be requested by the processor.
- The process flow diagram of Fig. 3 is not intended to indicate that the steps of the method 300 are to be executed in any particular order, or that all of the steps of the method 300 are to be included in every case.
- the neural network module and conditional probability module can be initialized separately or in parallel. Further, any number of additional steps may be included within the method 300, depending on the specific application.
- Fig. 4 is an example which illustrates the data flow in a computing system that can provide data to be retrieved.
- the computing system 100 may provide data to be retrieved from the storage device 118 through a memory manager 120 that resides in the storage device 118.
- the memory manager 400 first detects a data retrieval request 402.
- the data retrieval request indicates that a certain amount of data is to be retrieved from storage and stored in memory.
- the data to be retrieved is the data most likely to be requested by the processor.
- a data retrieval request may indicate that 10 data blocks are to be retrieved from storage and each of the 10 data blocks is to have a high probability of being requested by the processor.
- the data retrieval request is sent to the sequential memory module 404, the neural network module 406, and the conditional probability module 408.
- the results for the data retrieval request 402 are a combination of the prospective data identified from the sequential memory module 404, the neural network module 406, and the conditional probability module 408.
- the sequential memory module 404 detects the memory address of the data block last requested by the processor. The sequential memory module 404 then retrieves the next sequential data block based on the memory address.
- the neural network module 406 identifies prospective data based on neurons that have been previously configured.
- the neural network module 406 includes an input layer, two intermediate layers of neurons, and an output layer.
- the intermediate layers of neurons can accept multiple input values and output a single output value.
- the intermediate layers can be connected in a simple feed-forward formation, in which the input is sent from the input layer to the first intermediate layer. The output of the first intermediate layer is then sent to the second intermediate layer, and the output of the second intermediate layer is identified as prospective data by the neural network module 406.
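The flow described above, where values pass from the input layer through two intermediate layers to the output, can be sketched with plain weighted sums standing in for the neurons; the layer sizes and weights are illustrative:

```python
def layer(inputs, weights):
    """One intermediate layer: each neuron accepts all input values
    and outputs a single value (a weighted sum, for brevity)."""
    return [sum(w * x for w, x in zip(ws, inputs)) for ws in weights]

def feed_forward(inputs, w1, w2):
    """Input layer -> first intermediate layer -> second intermediate
    layer -> output (the prospective data)."""
    return layer(layer(inputs, w1), w2)

# Two inputs, two neurons per intermediate layer, one output neuron.
out = feed_forward([1.0, 2.0],
                   w1=[[0.5, 0.5], [1.0, 0.0]],
                   w2=[[1.0, 1.0]])
print(out)  # [2.5]
```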
- the neurons may be connected in complex formations in which the output of an intermediate layer can be used recursively as input.
- the prospective data identified by the neural network module 406 can include a single data block. In other examples, the neural network module 406 can generate multiple prospective data blocks.
- the conditional probability module 408 can include conditional probabilities for prospective data blocks.
- the conditional probability for each prospective data block stored in the conditional probability module 408 may be determined according to Equation 1.
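- A standard form of the conditional-probability definition described above is P(A | B) = P(A ∩ B) / P(B), where A is the event that one data block is requested by the processor and B is the event that another data block has already been requested; whether Equation 1 takes exactly this form is an assumption, since the equation itself is not reproduced here.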
- the conditional probability module may be stored as a matrix. As discussed above, a probability may be stored as zero if it is below a threshold. As a result, some conditional probability modules may be primarily populated with zeros. In these examples, the conditional probabilities may be stored as a sparse matrix in which only the cells of the matrix with a non-zero value are stored.
- the prospective data identified by the neural network module and conditional probability module are then sent to the predictive result selector 410.
- the predictive result selector 410 can combine the prospective data from the neural network module 406 and the conditional probability module 408 in any suitable number of configurations.
- the memory manager 400 may be configured to return a set of prospective data blocks from the predictive result selector 410.
- the predictive result selector 410 may also be configured to identify a certain number of prospective data blocks from the neural network module 406 and a certain number of prospective data blocks from the conditional probability module 408.
- the predictive result selector 410 may determine that five prospective data blocks are to be identified from the neural network module 406 and seven prospective data blocks are to be identified from the conditional probability module 408.
- the number of prospective data blocks to identify from the neural network module 406 and the conditional probability module 408 is based on the accuracy of the neural network module 406 and conditional probability module 408.
- the memory manager 400 may track the accuracy of the neural network module 406 and the conditional probability module 408 by comparing the prospective data identified by each module with the actual requested data.
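A sketch of that accuracy tracking, assuming accuracy is the fraction of a module's prospective blocks that the processor actually requested:

```python
def measure_accuracy(predictions, actual_requests):
    """Accuracy = fraction of prospective blocks actually requested."""
    if not predictions:
        return 0.0
    hits = sum(1 for p in predictions if p in actual_requests)
    return hits / len(predictions)

# A module that predicted five blocks, three of which were requested,
# has 60% accuracy; block names are illustrative.
nn_acc = measure_accuracy(["b1", "b2", "b3", "b4", "b5"],
                          {"b1", "b2", "b3"})
print(nn_acc)  # 0.6
```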
- the predictive result selector 41 0 may then use the accuracy of the neural network module 406 and the conditional probability module 408 to identify a set of predictive prospective data.
- The predictive prospective data from the predictive result selector 410 and the prospective data identified in the sequential memory module 404 are then sent to the final result selector 412.
- The final result selector 412 can combine the predictive prospective data from the predictive result selector 410 and the prospective data from the sequential memory module 404 in any suitable number of combinations.
- The memory manager 400 may store information regarding the accuracy of the sequential memory module 404 by comparing the previous prospective data identified by the sequential memory module 404 to the actual data requested by the processor.
- The final result selector 412 can then identify a set of prospective data based on a ratio of the number of prospective data blocks from the sequential memory module 404 and the predictive result selector 410.
- The memory manager 400 can then retrieve the prospective data identified by the final result selector 412 from a storage device and store the prospective data in a memory device.
- The neural network module 406 and the conditional probability module 408 may be reinitialized (indicated by the dotted line) when the accuracy of either module declines below a threshold.
- The sequential memory module 404 may then determine the prospective requested data while the neural network module 406 and the conditional probability module 408 are reinitialized.
- Previously requested data 414 can be sent to the neural network module 406 and the conditional probability module 408.
- The prospective data identified by each module can then be sent to the predictive result selector 410, which can analyze the accuracy of the neural network module 406 and the conditional probability module 408.
- The predictive result selector 410 can begin to select prospective data from the module with an accuracy above a threshold.
- The predictive result selector 410 may continue sending previously requested data to a module until the accuracy of the module improves above a threshold.
- The threshold may indicate that a certain percentage of prospective data blocks identified by the neural network module 406 or the conditional probability module 408 are accessed by the processor.
- The block diagram of Fig. 4 is for illustrative purposes only; the system can provide prospective requested data in any number of configurations.
- The sequential memory module 404, the neural network module 406, and the conditional probability module 408 can operate in parallel by independently identifying prospective data during the same period of time.
- The memory manager 400 may include any other appropriate number of additional storage components, such as registers, for storage of accuracy information for the neural network module 406 and the conditional probability module 408.
- Fig. 5 is a block diagram showing a tangible, non-transitory, computer-readable medium 500 that provides data to be retrieved.
- The tangible, non-transitory, computer-readable medium 500 may be accessed by a processor 502 over a computer bus 504.
- The tangible, non-transitory, computer-readable medium 500 may include code to direct the processor 502 to perform the steps of the current method.
- A memory manager 506 may be adapted to direct the processor 502 to provide data to be retrieved based on prospective data identified by a neural network module 508, a conditional probability module 510, and a sequential memory module 512.
- The neural network module 508 and the conditional probability module 510 can identify prospective data to be retrieved based on different calculations, such as conditional probabilities, polynomial functions, and other mathematical operations.
- The sequential memory module 512 can identify prospective data based on data previously requested by the processor. It is to be understood that any number of additional software components not shown in Fig. 5 may be included within the tangible, non-transitory, computer-readable medium 500, depending on the specific application.
Description
PROVIDING DATA TO BE RETRIEVED BACKGROUND
[0001] Modern computing systems have evolved to include a variety of memory devices. For example, a computing system may contain a non-volatile memory device and several volatile memory devices. In many computing systems, the non-volatile memory device has a larger storage capacity than the volatile memory devices. However, the access time of data stored in the non-volatile memory device can be slower than the access time of data stored in the volatile memory devices. Accordingly, some computing systems store copies of data from a non-volatile memory source in a volatile memory source. The processor can then attempt to request data from the volatile memory source before requesting data from the slower non-volatile memory source. However, predicting which data the processor might request can be difficult. Furthermore, storing data from a non-volatile memory device in a volatile memory can be inefficient if the stored data is not requested by the processor.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Certain examples are described in the following detailed description and in reference to the drawings, in which:
[0003] Fig. 1 is a block diagram of an example of a computing system that can provide data to be retrieved;
[0004] Fig. 2 is a process flow diagram illustrating an example of a method for providing data to be retrieved;
[0005] Fig. 3 is a process flow diagram illustrating an example of a method for initializing a system that provides data to be retrieved;
[0006] Fig. 4 is an example which illustrates the data flow in a system that can provide data to be retrieved; and
[0007] Fig. 5 is an example of a tangible, non-transitory computer-readable medium that can provide data to be retrieved.
DETAILED DESCRIPTION OF SPECIFIC EXAMPLES
[0008] Various methods have been developed to identify and retrieve copies of data stored in non-volatile memory devices. For example, some methods identify and retrieve copies of data from non-volatile memory devices based on data requested by the processor. These methods may store copies of data that have sequential memory addresses that follow the memory address of the data last requested by the processor. However, many applications may not store data in a sequential configuration. For example, a relational database can store data in tables. The data for each table of the relational database may be stored in a non-sequential configuration in non-volatile memory because a row from a first database table may be stored following a row from a second database table. In this example, the processor may be unlikely to request data in a sequential pattern. Accordingly, a sequential method of retrieving and storing copies of non-volatile memory data in a volatile memory source can be inefficient.
[0009] The techniques disclosed herein describe a method for providing data to be retrieved. The data, as referred to herein, includes any data blocks, pages of data, tables, or any other information that may be requested by the processor. The data to be retrieved is identified using a sequential memory module, a neural network module, and a conditional probability module. Each module can identify prospective data to be retrieved based on calculations that attempt to determine which data the processor might request. A combination of the prospective data identified in each module is then determined to provide accurate prospective data that the processor is likely to request. The combination of these modules can increase the efficiency of a computing system by providing copies of data that are most likely to be requested by the processor to fast memory devices.
[0010] Fig. 1 is a block diagram of an example of a computing system 100 that may be used for providing data to be retrieved. The computing system 100 may include, for example, a mobile phone, laptop computer, desktop computer, or tablet computer, among others. The computing system 100 may include a processor 102 that is adapted to execute stored instructions. The processor 102 can be a single core processor, a multi-core processor, a computing cluster, or any number of other appropriate configurations.
[0011] The processor 102 may be connected through a system bus 104 (e.g., PCI, PCI Express, HyperTransport®, Serial ATA, among others) to an input/output (I/O) device interface 106 adapted to connect the computing system 100 to one or more I/O devices 108. The I/O devices 108 may include, for example, a keyboard and a pointing device, wherein the pointing device may include a touchpad or a
touchscreen, among others. The I/O devices 108 may be built-in components of the computing system 100, or may be devices that are externally connected to the computing system 100.
[0012] The processor 102 may also be linked through the system bus 104 to a display interface 110 adapted to connect the computing system 100 to a display device 112. The display device 112 may include a display screen that is a built-in component of the computing system 100. The display device 112 may also include a computer monitor, television, or projector, among others, that is externally connected to the computing system 100. Additionally, the processor 102 may also be linked through the system bus 104 to a network interface card (NIC) 114. The NIC 114 may be adapted to connect the computing system 100 through the system bus 104 to a network (not depicted). The network may be a wide area network (WAN), local area network (LAN), or the Internet, among others.
[0013] The processor first searches for requested data in memory 116. The memory 116 can include random access memory (e.g., SRAM, DRAM, SONOS, eDRAM, EDO RAM, DDR RAM, RRAM, PRAM, among others), read only memory (e.g., Mask ROM, PROM, EPROM, EEPROM, among others), flash memory, non-volatile memory, or any other suitable memory systems. If the requested instructions or data are not located in memory 116, the processor 102 can search for the requested instructions or data in a storage device 118. The storage device 118 can include a hard drive, an optical drive, a USB flash drive, an array of drives, or any appropriate combination thereof. In some examples, the storage device 118 can contain all of the stored instructions and data for the computing system 100. The storage device 118 can also include a memory manager 120 that includes a neural network module 122, a conditional probability module 124, and a sequential memory module 126. The memory manager 120 can provide data to be retrieved from the storage device 118 and store the data in memory 116 based on the neural network module 122, the sequential memory module 126, and the conditional probability module 124. The neural network module 122, the sequential memory module 126, and the conditional probability module 124 can identify data that is most likely to be requested by the processor 102.
[0014] It is to be understood that the block diagram of Fig. 1 is not intended to indicate that the computing system 100 is to include all of the components shown in Fig. 1. Rather, the computing system 100 can include fewer or additional components not illustrated in Fig. 1 (e.g., additional memory devices, video cards, additional network interfaces, etc.). Furthermore, any of the functionalities of the memory manager 120 may be partially, or entirely, implemented in hardware or in the processor 102. For example, the functionality may be implemented with an application specific integrated circuit, in logic implemented in the processor 102, or in a co-processor on a peripheral device, among others.
[0015] Fig. 2 is a process flow diagram illustrating an example of a method for providing data to be retrieved. The method 200 may be used to provide data to be retrieved by using a computing system, such as the computing system 100 described in Fig. 1. The method 200 may be implemented by the memory manager 120, which can provide data to be retrieved based in part on a neural network module and a conditional probability module.
[0016] At block 202, a data retrieval request is detected. A data retrieval request, as referred to herein, can include a request to retrieve data such as data blocks, pages of data, tables of data, or information related to data. For example, a data retrieval request may include retrieving an instruction that is likely to be requested by the processor. In some examples, the data retrieval request can identify data in storage that is to be copied and stored in a memory device. In these examples, the memory devices may be faster than the storage device. By storing copies of data in a memory device, the processor may access requested data in a shorter period of time.
[0017] At block 204, the sequential memory module identifies a first set of prospective data. The first set of prospective data can be identified by identifying the memory address of the data block last accessed by the processor and retrieving the next sequential data blocks based on the memory address of the data block last
accessed by the processor. For example, the sequential memory module may detect the processor last accessed data stored at memory address N of storage. The sequential memory module can then retrieve the data that resides at memory address N+1 of storage and store a copy of the retrieved data in memory. In some examples, the sequential memory module may retrieve a range of data blocks based on sequential memory addresses that are located after the memory address of the data block last accessed by the processor. By storing copies of sequential data in memory, the sequential memory module can increase the speed of execution of instructions because the processor may request data from a memory device rather than a slower storage device.
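The sequential identification described above (retrieve address N+1 after address N) can be sketched as follows. The function name, the list-backed `storage`, and the dictionary `cache` are illustrative assumptions, not the described implementation.

```python
def sequential_prefetch(last_address, storage, cache, count=4):
    """Copy the `count` blocks that follow the last-accessed
    address from slower storage into the faster cache."""
    for offset in range(1, count + 1):
        address = last_address + offset
        if address < len(storage):
            cache[address] = storage[address]
    return cache

storage = [f"block-{i}" for i in range(16)]
cache = {}
# Processor last accessed address 10; prefetch the next three blocks.
sequential_prefetch(10, storage, cache, count=3)
print(sorted(cache))  # [11, 12, 13]
```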
[0018] At block 206, it is determined if a neural network module and conditional probability module are initialized. A neural network module, as referred to herein, includes an interconnected group of neurons. A neuron, as referred to herein, includes a device with one or more inputs and one or more outputs. In some examples, a neuron can include a mathematical function or any other appropriate mathematical computation. The neuron can apply a mathematical function or computation to a set of inputs and return an output. For example, the neuron may include a polynomial function that includes several variables that represent different input values. In some examples, the input values may be memory addresses for data blocks that are likely to be requested by the processor. A polynomial function may then calculate an output that represents the data block that is most likely to be requested by the processor. In other examples, a neuron can return multiple outputs that represent a set of data blocks that are likely to be requested by the processor.
[0019] A conditional probability module, as referred to herein, can include a matrix or any other appropriate representation of conditional probabilities. A conditional probability, as referred to herein, includes the probability of one event based on the occurrence of a second event. For example, the conditional probability of A given B is the probability of A if B is known to have occurred. In some examples, the conditional probability of A given B is the probability of a data point A being requested by a processor if another data point B is known to have been requested by the processor. A conditional probability module can provide the conditional probabilities of a data block being requested by a processor based on
other data blocks that have been previously requested. If the neural network module and the conditional probability module have been initialized, the process continues at block 208. If the neural network module and the conditional probability module have not been initialized, the process continues at block 210.
[0020] At block 210, the first set of prospective data is returned as a result set of data. The neural network module and conditional probability module may not have prospective data included in the result set of data because the neural network module and conditional probability module may not be initialized. In some examples, a subset of the first set may be returned as the result set. For example, the memory manager 120 may be configured to retrieve five prospective data blocks, while the sequential memory module returns a first set of ten prospective data blocks. The memory manager 120 can then select a subset of five data blocks from the first set. The memory manager 120 can select the subset of the first set based on any number of suitable techniques, such as random selection, selecting the first members of the first set, or selecting the last members of the first set, among others. After the first set is returned as the prospective data, the process ends at block 220.
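The subset selection described above might be sketched as follows, using the first-members technique; selecting at random or taking the last members would work equally well. The names are hypothetical.

```python
def select_subset(first_set, count=5):
    """Select a subset of the first set when only `count` prospective
    data blocks are to be retrieved; here, the first members."""
    return first_set[:count]

first_set = [300, 301, 302, 303, 304, 305, 306, 307, 308, 309]
print(select_subset(first_set))  # [300, 301, 302, 303, 304]
```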
[0021] If the neural network module and the conditional probability module have been initialized, the process continues at block 208. At block 208, the data retrieval request is sent to a neural network module and conditional probability module. The neural network module and conditional probability module can accept any suitable number of data retrieval requests and each module can identify a set of prospective data.
[0022] At block 212, the second set of prospective requested data is identified from the neural network module. In some examples, the neural network module accepts as input any suitable number of prospective data blocks that are likely to be requested by the processor. In some examples, the data retrieval request indicates the prospective data blocks to be used as input for the neural network module. The neural network module can then apply a set of weights to the prospective data blocks that are likely to be requested by the processor. The combination of each weight and prospective data block is then sent to a neuron. Each neuron can include a transfer function, such as a polynomial function. The transfer function calculates output based on the combination of each weight and prospective data
block. The output can vary depending on the type of transfer function used in the neural network module. For example, a linear transfer function may result in outputs from the neurons that are proportional to the total weighted output. In other examples, threshold or sigmoid transfer functions may be used. The output of a threshold transfer function can be set at one of two levels depending on whether the output is greater than or less than a threshold value. The output of a sigmoid function can be continuous and non-linear.
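A minimal sketch of a neuron with the transfer functions named above (linear, threshold, and sigmoid). The weights, signatures, and example inputs are assumptions for illustration, not the module's actual configuration.

```python
import math

def linear(x, slope=1.0):
    # Output proportional to the total weighted input.
    return slope * x

def threshold(x, level=0.0, low=0.0, high=1.0):
    # One of two levels, depending on the comparison with `level`.
    return high if x > level else low

def sigmoid(x):
    # Continuous, non-linear output in the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, transfer=sigmoid):
    # Weighted combination of inputs passed through the transfer function.
    total = sum(w * i for w, i in zip(weights, inputs))
    return transfer(total)

# Weighted sum is 0.4*1.0 + (-0.2)*0.5 = 0.3, which exceeds the level:
print(neuron([1.0, 0.5], [0.4, -0.2], transfer=threshold))  # 1.0
```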
[0023] At block 214, the third set of prospective data is identified from the conditional probability module. As discussed above, the conditional probability module may include the probability of any suitable data blocks being requested by the processor based on whether other data blocks have already been requested by the processor. For example, a conditional probability module may be represented as a matrix. Each cell of the matrix represents the probability that a data block will be requested by the processor, given that other data blocks have been previously requested. In some examples, a zero can be stored in a cell if the probability of the cell is below a threshold. For example, a conditional probability module may have a threshold of 25 %. If the probability that the data block of cell A is to be requested is 10 %, then a zero may be stored as the probability of cell A being requested by the processor. By replacing probabilities below a threshold with zeros, the probabilities can be stored as a sparse matrix. In these examples, the cells of the conditional probability matrix with non-zero values can be stored, while the cells with zero values are not stored. The data in the sparse matrix can then be stored using less memory space.
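The thresholding-to-sparse idea can be sketched as follows, assuming a dictionary keyed by matrix cell; this representation is an assumption for the example, not the patent's data structure.

```python
def to_sparse(probabilities, threshold=0.25):
    """Keep only cells at or above the threshold; cells below it are
    treated as zero and omitted entirely, saving memory space."""
    return {cell: p for cell, p in probabilities.items() if p >= threshold}

# Hypothetical cells keyed by (requested block, previously requested block):
dense = {("A", "B"): 0.10, ("A", "C"): 0.40, ("B", "C"): 0.05}
sparse = to_sparse(dense)
print(sparse)  # {('A', 'C'): 0.4}
# Looking up an omitted cell falls back to zero:
print(sparse.get(("A", "B"), 0.0))  # 0.0
```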
[0024] At block 216, the second set of prospective data is combined with the third set of prospective data to produce a predictive set. The combination of the two sets of prospective data can be based on the accuracy of the neural network module and the conditional probability module. In some examples, the memory manager 120 can determine the accuracy of the neural network module and the conditional probability module by monitoring the outputs of the neural network module and the conditional probability module. The outputs of the neural network module and the conditional probability module can then be compared to the actual requested data. The memory manager 120 can then determine each module's accuracy. The accuracy, as referred to herein, is a rate at which the prospective requested data is
actually requested by the processor. For example, in one scenario, 30 % of the prospective data returned by the conditional probability module may be later requested by the processor and 80 % of the prospective data returned by the neural network module may be later requested by the processor. In this scenario, the memory manager 120 may determine that a larger number of prospective data is to be returned from the neural network module than the conditional probability module because the neural network module is more accurate. As a result, the memory manager 120 can return a predictive set with a high rate of accuracy.
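An accuracy-proportional combination such as the one described might be sketched like this. The proportional-split rule, names, and example block numbers are assumptions; the actual selection policy could differ.

```python
def combine_by_accuracy(set_a, acc_a, set_b, acc_b, total=8):
    """Split `total` slots between two modules in proportion to their
    measured accuracy (assumes at least one accuracy is non-zero)."""
    share_a = round(total * acc_a / (acc_a + acc_b))
    return set_a[:share_a] + set_b[:total - share_a]

nn_blocks = [101, 102, 103, 104, 105, 106, 107, 108]
cp_blocks = [201, 202, 203, 204, 205, 206, 207, 208]
# An 80 % accurate neural network module contributes more blocks than
# a 30 % accurate conditional probability module:
print(combine_by_accuracy(nn_blocks, 0.80, cp_blocks, 0.30, total=8))
# [101, 102, 103, 104, 105, 106, 201, 202]
```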
[0025] In other examples, the memory manager 120 may determine that the accuracy of either the neural network module or the conditional probability module has fallen below a threshold. The memory manager 120 can then stop retrieving prospective data from the neural network module or the conditional probability module once the accuracy of its outputs falls below the threshold. In some examples, a predictive set is generated by combining some number of prospective data from the neural network module and some number of prospective data from the conditional probability module. For example, in an implementation wherein the predictive set is configured to return a set of eight prospective requested data blocks, the memory manager 120 may determine, based on the accuracy of the neural network module and the conditional probability module, that two of the prospective data blocks are to be identified by the neural network module and six of the prospective data blocks are to be identified by the conditional probability module.
[0026] At block 218, the predictive set and the first set identified by the sequential memory module are combined to form a result set of prospective data. The result set can be any appropriate combination of the first set and the predictive set. For example, the result set may be configured to return 10 prospective data blocks. If the memory manager 120 determines that the predictive set has an accuracy of 90 %, the memory manager 120 may select nine prospective data blocks from the predictive set and one data block from the first set produced by the sequential memory module. In other examples, the predictive set may have an accuracy that falls below a threshold, in which case the memory manager 120 may select the entire result set from the first set of prospective data. The neural network module and conditional probability module may then be reinitialized to increase the accuracy
of the predictive set. Re-initialization is discussed in greater detail below in relation to Fig. 3.
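The 90 % example above can be sketched as follows; the rounding rule and names are assumptions for illustration.

```python
def final_result(predictive_set, first_set, predictive_accuracy, total=10):
    """Select blocks in proportion to the predictive set's measured
    accuracy; the remainder comes from the sequential first set."""
    from_predictive = round(total * predictive_accuracy)
    return (predictive_set[:from_predictive]
            + first_set[:total - from_predictive])

predictive = list(range(500, 510))
sequential = list(range(900, 910))
# A 90 % accurate predictive set contributes nine of the ten blocks:
result = final_result(predictive, sequential, 0.90)
print(result)  # [500, 501, 502, 503, 504, 505, 506, 507, 508, 900]
```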
[0027] The process flow diagram of Fig. 2 is not intended to indicate that the steps of the method 200 are to be executed in any particular order, or that all of the steps of the method 200 are to be included in every case. For example, the method 200 may identify the second set of prospective data and the third set of prospective data in parallel. Further, any number of additional steps may be included within the method 200, depending on the specific application.
[0028] Fig. 3 is a process flow diagram illustrating an example of a method for initializing the neural network module and conditional probability module. The method 300 may be used to initialize the neural network module and conditional probability module by using a computing system, such as the computing system 100 described in Fig. 1. The method 300 may be implemented by the memory manager 120, which can determine if the neural network module or conditional probability module is to be initialized.
[0029] At block 302, it is determined if the neural network module or the conditional probability module is to be initialized based on initialization criteria. The initialization criteria indicate if the neural network module or conditional probability module has not been initialized or if the neural network module or conditional probability module is to be reinitialized. In some examples, the initialization criteria can be based on the accuracy of the neural network module or the conditional probability module. In these examples, the memory manager 120 may be configured to include an accuracy threshold. If the accuracy of the outputs from the neural network module or the conditional probability module decline below the accuracy threshold, the neural network module or conditional probability module may be reinitialized. In some examples, the neural network module and conditional probability module are continuously updated during initialization and re-initialization by monitoring the data requested by the processor and re-calculating the neural network module and conditional probability module based on the requested data. If the neural network module or the conditional probability module is not to be initialized or reinitialized, the process ends at block 304. If the neural network
module or conditional probability module is to be initialized or reinitialized, the process continues at block 306.
[0030] At block 306, identifying prospective data from the neural network module and conditional probability module is halted. The prospective data is then identified by the sequential memory module. In some examples, identifying the prospective data by the sequential memory module can increase the number of data blocks stored in memory that are requested by the processor. In these examples, the accuracy of the memory manager 120 may increase because the prospective data identified by the sequential memory module may include a greater number of data blocks that are accessed by the processor than the prospective data identified by the neural network module or the conditional probability module.
[0031] At block 308, requested data from the processor is sent to the neural network module and the conditional probability module. Sending requested data to the neural network module and the conditional probability module allows for the neurons of the neural network module and the conditional probabilities of the conditional probability module to be configured to increase accuracy. For example, if the processor starts execution of a new application, the processor may request data blocks that have not previously been requested. In this example, the accuracy of the neural network module may decrease because the criteria used to identify prospective data no longer produce accurate results because the new application may request different data than the previous application. For example, the accuracy of the neural network module or the conditional probability module may decrease from 60 % to 20 %. To increase the accuracy of the neural network module and the conditional probability module, the requested data blocks can be sent to both modules. Each requested data block may increase the likelihood that the neural network module or the conditional probability module identifies prospective data that will be requested by the processor. For example, the neural network module may be 2 % more likely to identify prospective data blocks that are requested by the processor after the neural network module is configured based on the requested data. Configuring the neural network module and the conditional probability module during initialization and re-initialization is discussed in greater detail below in regard to Fig. 4.
[0032] At block 310, prospective data is generated from the neural network module and the conditional probability module. The prospective data is generated in response to the data last requested from the processor. For example, a conditional probability in the conditional probability module may be recalculated after each data block requested by the processor is sent to the conditional probability module.
[0033] At block 312, it is determined if the accuracy of the generated prospective data is above a threshold. In some examples, the sequential memory module results may represent a threshold for accuracy. In other examples, the memory manager 120 may identify a certain number of prospective data blocks from the neural network module and conditional probability module based on the accuracy of each module. After initialization, the accuracy of the neural network module and the conditional probability module may be improved. For example, after initialization, the neural network module may have an accuracy of 60 % while the accuracy of the sequential memory module is 40 %. In this example, the neural network module may begin providing prospective data and the process ends. If the accuracy of the generated prospective data is below a threshold, the process returns to block 308. Each iteration of the process can increase the accuracy of the neural network module and conditional probability module so that the generated prospective data is more likely to be requested by the processor.
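The re-initialization loop of blocks 308 through 312 might be sketched as follows. `FrequencyModule` is a toy stand-in for the neural network or conditional probability module, and the windowed accuracy check is an assumption; the real modules would retrain their weights or probabilities instead.

```python
class FrequencyModule:
    """Toy stand-in: predicts the most frequently requested blocks."""
    def __init__(self):
        self.counts = {}
    def predict(self, k=2):
        ranked = sorted(self.counts, key=self.counts.get, reverse=True)
        return ranked[:k]
    def train(self, requested):
        self.counts[requested] = self.counts.get(requested, 0) + 1

def reinitialize(module, request_stream, window=10, threshold=0.5):
    """Feed requested data to the module until its accuracy over a
    recent window of requests rises above the threshold."""
    hits = []
    for requested in request_stream:
        hits.append(requested in module.predict())
        module.train(requested)  # each request improves the module
        recent = hits[-window:]
        if len(recent) == window and sum(recent) / window > threshold:
            return True   # accuracy recovered; resume predictions
    return False          # stream exhausted before recovery

stream = [7, 7, 9, 7, 9, 7, 9, 7, 9, 7, 9, 7, 9, 7, 9]
print(reinitialize(FrequencyModule(), stream))  # True
```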
[0034] The process flow diagram of Fig. 3 is not intended to indicate that the steps of the method 300 are to be executed in any particular order, or that all of the steps of the method 300 are to be included in every case. For example, the neural network module and conditional probability module can be initialized separately or in parallel. Further, any number of additional steps may be included within the method 300, depending on the specific application.
[0035] Fig. 4 is an example which illustrates the data flow in a computing system that can provide data to be retrieved. In some implementations, the computing system 100 may provide data to be retrieved from the storage device 118 through a memory manager 120 that resides in the storage device 118.
[0036] In some examples, the memory manager 400 first detects a data retrieval request 402. As discussed above, the data retrieval request indicates that a certain amount of data is to be retrieved from storage and stored in memory. The data to be
retrieved is the data most likely to be requested by the processor. For example, a data retrieval request may indicate that 10 data blocks are to be retrieved from storage and each of the 10 data blocks is to have a high probability of being requested by the processor.
[0037] The data retrieval request is sent to the sequential memory module 404, the neural network module 406, and the conditional probability module 408. The results for the data retrieval request 402 are a combination of the prospective data identified from the sequential memory module 404, the neural network module 406, and the conditional probability module 408. As discussed above, the sequential memory module 404 detects the memory address of the data block last requested by the processor. The sequential memory module 404 then retrieves the next sequential data block based on the memory address.
[0038] The neural network module 406 identifies prospective data based on neurons that have been previously configured. In some examples, the neural network module 406 includes an input layer, two intermediate layers of neurons, and an output layer. The intermediate layers of neurons can accept multiple input values and output a single output value. In some examples, the intermediate layers can be connected in a simple feedforward formation, in which the input is sent from the input layer to the first intermediate layer. The output of the first intermediate layer is then sent to the second intermediate layer and the output of the second intermediate layer is identified as prospective data by the neural network module 406. In other examples, the neurons may be connected in complex formations in which the output of an intermediate layer can be used recursively as input. In these examples, the prospective data identified by the neural network module 406 can include a single data block. In other examples, the neural network module 406 can generate multiple prospective data blocks.
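The two-intermediate-layer feedforward arrangement described above can be sketched as follows. The weight matrices are hypothetical; a real module would configure them from previously requested addresses, and the single output here stands in for a prospective-data score.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights):
    """Each row of `weights` feeds one neuron: a weighted sum of all
    inputs passed through the sigmoid transfer function."""
    return [sigmoid(sum(w * i for w, i in zip(row, inputs)))
            for row in weights]

def feedforward(inputs, hidden1, hidden2, output):
    a = layer(inputs, hidden1)   # first intermediate layer
    b = layer(a, hidden2)        # second intermediate layer
    return layer(b, output)      # output layer: prospective-data score

# Hypothetical weights, one row per neuron:
h1 = [[0.5, -0.2], [0.3, 0.8]]
h2 = [[1.0, -1.0], [0.25, 0.75]]
out = [[0.6, 0.4]]
print(feedforward([1.0, 0.0], h1, h2, out))
```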
[0039] The conditional probability module 408 can include conditional
probabilities for prospective data based on previously requested data. For example, the conditional probability for each prospective data block stored in the conditional probability module 408 may be determined according to Equation 1.
P(A|B) = P(A ∩ B) / P(B)    (1)
In Equation 1, A represents the prospective data to be retrieved and B represents the previously requested data. In some examples, the conditional probabilities may be stored as a matrix. As discussed above, a probability may be stored as zero if it is below a threshold. As a result, some conditional probability matrices may be primarily populated with zeros. In these examples, the conditional probabilities may be stored as a sparse matrix in which only the cells of the matrix with a non-zero value are stored.
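Equation 1 and the sparse storage can be sketched as follows. P(A|B) is estimated from counts over the observed request stream — the fraction of requests for B that were followed by A — and only cells at or above a threshold are materialized. The class name, method names, and `THRESHOLD` value are assumptions for the example.

```python
from collections import defaultdict

# Probabilities below this value are treated as zero and not stored,
# yielding a sparse representation (an assumed illustrative value).
THRESHOLD = 0.05

class ConditionalProbabilityModule:
    def __init__(self):
        self.pair_counts = defaultdict(int)  # times block A followed block B
        self.b_counts = defaultdict(int)     # times block B was requested

    def observe(self, prev_block, next_block):
        # Update counts from the stream of previously requested data.
        self.b_counts[prev_block] += 1
        self.pair_counts[(prev_block, next_block)] += 1

    def sparse_matrix(self):
        # Equation 1: P(A|B) = P(A and B) / P(B), estimated as
        # count(B then A) / count(B). Only non-zero (above-threshold)
        # cells are kept, as in a sparse matrix.
        matrix = {}
        for (b, a), n in self.pair_counts.items():
            p = n / self.b_counts[b]
            if p >= THRESHOLD:
                matrix[(b, a)] = p
        return matrix
```

A dictionary keyed by (B, A) is one simple sparse-matrix encoding; a production system might use a compressed sparse row layout instead.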
[0040] The prospective data identified by the neural network module and conditional probability module are then sent to the predictive result selector 410. The predictive result selector 410 can combine the prospective data from the neural network module 406 and the conditional probability module 408 in any suitable number of configurations. For example, the memory manager 400 may be configured to return a set of prospective data blocks from the predictive result selector 410. The predictive result selector 410 may also be configured to identify a certain number of prospective data blocks from the neural network module 406 and a certain number of prospective data blocks from the conditional probability module 408. For example, the predictive result selector 410 may determine that five prospective data blocks are to be identified from the neural network module 406 and seven prospective data blocks are to be identified from the conditional probability module 408. In some examples, the number of prospective data blocks to identify from the neural network module 406 and the conditional probability module 408 is based on the accuracy of the neural network module 406 and conditional probability module 408. For example, the memory manager 400 may track the accuracy of the neural network module 406 and the conditional probability module 408 by comparing the prospective data identified by each module with the actual requested data. The predictive result selector 410 may then use the accuracy of the neural network module 406 and the conditional probability module 408 to identify a set of predictive prospective data.
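One way to sketch the predictive result selector is to apportion a fixed budget of prospective blocks between the two modules in proportion to their tracked accuracy. The proportional rule, function name, and parameters are assumptions; the patent leaves the combination policy open.

```python
def select_predicted(nn_candidates, cp_candidates,
                     nn_accuracy, cp_accuracy, budget):
    """Apportion `budget` prospective blocks between the neural
    network and conditional probability modules by accuracy."""
    total = nn_accuracy + cp_accuracy
    # With accuracies 0.5 and 0.7 and a budget of 12, this yields
    # five blocks from the neural network module and seven from the
    # conditional probability module, matching the example above.
    if total:
        nn_share = round(budget * nn_accuracy / total)
    else:
        nn_share = budget // 2  # no accuracy data yet: split evenly
    picked = nn_candidates[:nn_share] + cp_candidates[:budget - nn_share]
    return picked[:budget]
```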
[0041] The predictive prospective data from the predictive result selector 410 and the prospective data identified in the sequential memory module 404 are then sent to the final result selector 412. The final result selector 412 can combine the predictive prospective data from the predictive result selector 410 and the prospective data from the sequential memory module 404 in any suitable number of combinations. For example, the memory manager 400 may store information regarding the accuracy of the sequential memory module 404 by comparing the previous prospective data identified by the sequential memory module 404 to the actual data requested by the processor. The final result selector 412 can then identify a set of prospective data based on a ratio of the number of prospective data blocks from the sequential memory module 404 and the predictive result selector 410. The memory manager 400 can then retrieve the prospective data identified by the final result selector 412 from a storage device and store the prospective data in a memory device.
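The ratio-based final selection described above can be sketched as follows. The specific rule — giving the sequential module a share of the budget proportional to its tracked accuracy — is an illustrative assumption, as are the names and parameters.

```python
def final_results(sequential_candidates, predicted, seq_accuracy, budget):
    """Merge the sequential module's candidates with the predicted set
    according to a ratio derived from the sequential module's accuracy."""
    # A more accurate sequential module earns a larger share of the
    # final set; the remainder comes from the predictive result selector.
    seq_share = min(len(sequential_candidates),
                    round(budget * seq_accuracy))
    return (sequential_candidates[:seq_share]
            + predicted[:budget - seq_share])
```

The returned set of results would then be retrieved from the storage device and stored in the memory device.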
[0042] In some examples, the neural network module 406 and the conditional probability module 408 may be reinitialized (indicated by the dotted line) when the accuracy of either module declines below a threshold. As discussed above with respect to Fig. 3, the sequential memory module 404 may then determine the prospective requested data while the neural network module 406 and the conditional probability module 408 are reinitialized. During initialization and reinitialization, previously requested data 414 can be sent to the neural network module 406 and conditional probability module 408. The prospective data identified by each module can then be sent to the predictive result selector 410, which can analyze the accuracy of the neural network module 406 and the conditional probability module 408. If the accuracy of either the neural network module 406 or the conditional probability module 408 has increased above the threshold, the predictive result selector 410 can begin to select prospective data from the module with an accuracy above the threshold. In some examples, the predictive result selector 410 may continue sending previously requested data to a module until the accuracy of the module improves above the threshold. For example, the threshold may indicate that a certain percentage of prospective data blocks identified by the neural network module 406 or the conditional probability module 408 are accessed by the processor.
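The accuracy tracking and reinitialization trigger can be sketched as below: accuracy is the fraction of a module's predictions that were later accessed by the processor, and the module is reinitialized when that fraction falls below a threshold. The class name and the 0.6 threshold are assumptions for illustration.

```python
# Assumed threshold: the fraction of identified prospective blocks
# that must actually be accessed by the processor.
ACCURACY_THRESHOLD = 0.6

class AccuracyTracker:
    """Tracks one module's hit rate against actual requested data."""

    def __init__(self):
        self.hits = 0
        self.total = 0

    def record(self, predicted_blocks, actual_block):
        # Compare the module's prospective data with the block the
        # processor actually requested.
        self.total += 1
        if actual_block in predicted_blocks:
            self.hits += 1

    @property
    def accuracy(self):
        return self.hits / self.total if self.total else 0.0

    def needs_reinit(self):
        # Trigger reinitialization once accuracy declines below the
        # threshold; an untested module is not flagged.
        return self.total > 0 and self.accuracy < ACCURACY_THRESHOLD
```

While `needs_reinit()` is true for both predictive modules, only the sequential memory module's candidates would be used, matching the fallback behavior described above.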
[0043] The block diagram of Figure 4 is for illustrative purposes only; the memory manager 400 can provide prospective requested data in any number of configurations. For example, the sequential memory module 404, the neural network module 406, and the conditional probability module 408 can operate in parallel by independently identifying
prospective data during the same period of time. Furthermore, the memory manager 400 may include any other appropriate number of additional storage components, such as registers, for storage of accuracy information for the neural network module 406 and the conditional probability module 408.
[0044] Figure 5 is a block diagram showing a tangible, non-transitory, computer-readable medium 500 that provides data to be retrieved. The tangible, non-transitory, computer-readable medium 500 may be accessed by a processor 502 over a computer bus 504. Furthermore, the tangible, non-transitory, computer-readable medium 500 may include code to direct the processor 502 to perform the steps of the current method.
[0045] The various software components discussed herein may be stored on the tangible, non-transitory, computer-readable medium 500, as indicated in Fig. 5. For example, a memory manager 506 may be adapted to direct the processor 502 to provide data to be retrieved based on prospective data identified by a neural network module 508, a conditional probability module 510, and a sequential memory module 512. The neural network module 508 and the conditional probability module 510 can identify prospective data to be retrieved based on different calculations such as conditional probabilities, polynomial functions and other mathematical operations. The sequential memory module 512 can identify prospective data based on data previously requested by the processor. It is to be understood that any number of additional software components not shown in Fig. 5 may be included within the tangible, non-transitory, computer-readable medium 500, depending on the specific application.
[0046] The present examples may be susceptible to various modifications and alternative forms and have been shown only for illustrative purposes. Furthermore, it is to be understood that the present techniques are not intended to be limited to the particular examples disclosed herein. Indeed, the scope of the appended claims is deemed to include all alternatives, modifications, and equivalents that are apparent to persons skilled in the art to which the disclosed subject matter pertains.
Claims
1. A method for providing data to be retrieved comprising:
detecting a data retrieval request;
identifying a first set of prospective data using a sequential memory module that identifies the first set of prospective data based in part on a data block last accessed by a processor;
identifying a second set of prospective data using a neural network module that identifies the second set of prospective data based in part on a neural network and the data retrieval request;
identifying a third set of prospective data using a conditional probability
module that identifies the third set of prospective data based in part on a conditional probability and the data retrieval request;
combining the second set of prospective data and the third set of prospective data to produce a set of predicted data;
combining the first set of prospective data with the set of predicted data to produce a set of results;
retrieving the set of results from a storage device; and
storing the set of results in a memory device.
2. The method of claim 1, wherein the neural network module comprises an input layer, two intermediate layers, and an output layer.
3. The method of claim 1, wherein the conditional probability module comprises a matrix that comprises an entry for each prospective data in relation to previously requested data.
4. The method of claim 3, wherein the matrix is a sparse matrix.
5. The method of claim 1, wherein identifying a third set of prospective data based on the conditional probability module comprises calculating a conditional
probability of a data block being requested by a processor based on whether another data block has been requested by the processor.
6. The method of claim 1, wherein identifying a first set of prospective data from a sequential memory module based on the data retrieval request comprises identifying the first set of prospective data based on sequential memory addresses.
7. The method of claim 1 comprising:
detecting a criterion that indicates to reinitialize the neural network module and the conditional probability module;
returning prospective data from the sequential memory module;
reinitializing the neural network module and conditional probability module based on actual requested data; and
returning prospective data from the sequential memory module, the neural network module and the conditional probability module.
8. The method of claim 1 comprising removing prospective data from the second set of prospective data or removing prospective data from the third set of prospective data based on an accuracy comparison to actual requested data.
9. A system for providing data to be retrieved comprising:
a storage device comprising computer-readable instructions, the computer readable instructions comprising a neural network module and a conditional probability module; and
a processor to execute the computer-readable instructions to:
detect a data retrieval request;
identify a first set of prospective data using a sequential memory
module that identifies the first set of prospective data based in part on a data block last accessed by the processor;
identify a second set of prospective data using a neural network module that identifies the second set of prospective data based in part on a neural network and the data retrieval request;
identify a third set of prospective data using a conditional probability module that identifies the third set of prospective data based in part on a conditional probability and the data retrieval request;
combine the second set of prospective data and the third set of prospective data to produce a set of predicted data;
combine the first set of prospective data with the set of predicted data to produce a set of results; and
retrieve the set of results from a storage device.
10. The system of claim 9, wherein the processor is to calculate a conditional probability of a data block being requested by the processor based on whether another data block has been requested by the processor.
11. The system of claim 9, wherein the processor is to identify the first set of prospective data based on sequential memory addresses.
12. The system of claim 9, wherein the processor is to:
detect a criterion that indicates to reinitialize the neural network module and the conditional probability module;
return prospective data from the sequential memory module;
reinitialize the neural network module and conditional probability module based on actual requested data; and
return prospective data from the sequential memory module, the neural network module and the conditional probability module.
13. A tangible, non-transitory computer-readable medium comprising code to direct a processor to:
detect a data retrieval request;
identify a first set of prospective data using a sequential memory module that identifies the first set of prospective data based in part on a data block last accessed by the processor;
identify a second set of prospective data using a neural network module that identifies the second set of prospective data based in part on a neural network and the data retrieval request;
identify a third set of prospective data using a conditional probability module that identifies the third set of prospective data based in part on a conditional probability and the data retrieval request;
combine the second set of prospective data and the third set of prospective data to produce a set of predicted data;
combine the first set of prospective data with the set of predicted data to produce a set of results; and
retrieve the set of results from a storage device.
14. The tangible, non-transitory computer-readable medium of claim 13 comprising code to also direct the processor to:
detect a criterion that indicates to reinitialize the neural network module and the conditional probability module;
return prospective data from the sequential memory module;
reinitialize the neural network module and conditional probability module based on actual requested data; and
return prospective data from the sequential memory module, the neural network module and the conditional probability module.
15. The tangible, non-transitory computer-readable medium of claim 13, wherein the processor is to calculate a conditional probability of a data block being requested by the processor based on whether another data block has been requested by the processor.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2012/046514 WO2014011181A1 (en) | 2012-07-12 | 2012-07-12 | Providing data to be retrieved |
| CN201280075261.9A CN104520808A (en) | 2012-07-12 | 2012-07-12 | Providing data to be retrieved |
| EP12880789.8A EP2872986A4 (en) | 2012-07-12 | 2012-07-12 | Providing data to be retrieved |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2012/046514 WO2014011181A1 (en) | 2012-07-12 | 2012-07-12 | Providing data to be retrieved |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014011181A1 true WO2014011181A1 (en) | 2014-01-16 |
Family
ID=49916443
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2012/046514 Ceased WO2014011181A1 (en) | 2012-07-12 | 2012-07-12 | Providing data to be retrieved |
Country Status (3)
| Country | Link |
|---|---|
| EP (1) | EP2872986A4 (en) |
| CN (1) | CN104520808A (en) |
| WO (1) | WO2014011181A1 (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5530941A (en) * | 1990-08-06 | 1996-06-25 | Ncr Corporation | System and method for prefetching data from a main computer memory into a cache memory |
| US5623608A (en) * | 1994-11-14 | 1997-04-22 | International Business Machines Corporation | Method and apparatus for adaptive circular predictive buffer management |
| US20030204675A1 (en) * | 2002-04-29 | 2003-10-30 | Dover Lance W. | Method and system to retrieve information from a storage device |
| US7555609B2 (en) * | 2006-10-27 | 2009-06-30 | Via Technologies, Inc. | Systems and method for improved data retrieval from memory on behalf of bus masters |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5381539A (en) * | 1992-06-04 | 1995-01-10 | Emc Corporation | System and method for dynamically controlling cache management |
| US7353339B2 (en) * | 2003-12-24 | 2008-04-01 | Intel Corporation | Adaptive caching |
| CN101046784A (en) * | 2006-07-18 | 2007-10-03 | 威盛电子股份有限公司 | Memory data access system and method and memory controller |
| US8965819B2 (en) * | 2010-08-16 | 2015-02-24 | Oracle International Corporation | System and method for effective caching using neural networks |
- 2012-07-12 CN CN201280075261.9A patent/CN104520808A/en active Pending
- 2012-07-12 EP EP12880789.8A patent/EP2872986A4/en not_active Withdrawn
- 2012-07-12 WO PCT/US2012/046514 patent/WO2014011181A1/en not_active Ceased
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP2872986A4 * |
Also Published As
| Publication number | Publication date |
|---|---|
| EP2872986A4 (en) | 2016-03-23 |
| EP2872986A1 (en) | 2015-05-20 |
| CN104520808A (en) | 2015-04-15 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 12880789; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | REEP | Request for entry into the European phase | Ref document number: 2012880789; Country of ref document: EP |
| | WWE | WIPO information: entry into national phase | Ref document number: 2012880789; Country of ref document: EP |