
CN114817089B - Cache management method and device for multiple download tasks - Google Patents


Info

Publication number
CN114817089B
Authority
CN
China
Prior art keywords: buffer space, space, task, downloading, download
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210446692.7A
Other languages
Chinese (zh)
Other versions
CN114817089A (en)
Inventor
吴津
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing QIYI Century Science and Technology Co Ltd
Original Assignee
Beijing QIYI Century Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing QIYI Century Science and Technology Co Ltd filed Critical Beijing QIYI Century Science and Technology Co Ltd
Priority to CN202210446692.7A priority Critical patent/CN114817089B/en
Publication of CN114817089A publication Critical patent/CN114817089A/en
Application granted granted Critical
Publication of CN114817089B publication Critical patent/CN114817089B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 12/00 Accessing, addressing or allocating within memory systems or architectures
    • G06F 12/02 Addressing or allocation; Relocation
    • G06F 12/08 Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
    • G06F 12/0802 Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches
    • G06F 12/0866 Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches for peripheral storage systems, e.g. disk cache
    • G06F 12/0871 Allocation or management of cache space
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/10 File systems; File servers
    • G06F 16/16 File or folder operations, e.g. details of user interfaces specifically adapted to file systems
    • G06F 16/162 Delete operations
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/10 File systems; File servers
    • G06F 16/17 Details of further file system functions
    • G06F 16/172 Caching, prefetching or hoarding of files
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 9/5005 Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F 9/5011 Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resources being hardware resources other than CPUs, Servers and Terminals
    • G06F 9/5016 Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resources being hardware resources other than CPUs, Servers and Terminals the resource being the memory

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Information Transfer Between Computers (AREA)

Abstract


The present invention relates to a cache management method and device for multiple download tasks. The method comprises: determining the total cache space of the current playback device; and allocating to each download task, from the total cache space, an independent cache space matched with the task's attribute information, so that each download task saves its downloaded file in its own independent cache space; the attribute information comprises bit rate and/or file size. This scheme matches the memory occupied by each download task to its own bit rate or file size: a high-bit-rate task occupies more cache and a low-bit-rate task occupies less. This avoids uneven cache occupancy, improves the cache hit rate when the user drags the progress bar, and reduces playback stalls.

Description

Cache management method and device for multiple download tasks
Technical Field
The invention relates to the technical field of cache management, in particular to a cache management method and device for multiple download tasks.
Background
To optimize the viewing experience, most videos today provide an independent audio track. Playing such a video creates two download tasks: an audio task and a picture task. If an advertisement is inserted into the video, an additional advertisement download task is created. In addition, some video platforms offer multi-view viewing, and in that mode at least two download tasks are created.
During playback, video is generally cached to avoid stuttering. In the prior art, when cache space is managed for several simultaneous download tasks, memory occupancy easily becomes unbalanced: a high-bit-rate download task may occupy less cache while a low-bit-rate task occupies more, so when the user drags the progress bar the cache is likely to be missed and playback stalls.
Therefore, how to avoid playback stalls caused by unbalanced memory occupancy when managing the cache for several simultaneous download tasks is a technical problem to be solved by those skilled in the art.
Disclosure of Invention
Accordingly, the present invention is directed to a cache management method for multiple download tasks, and to related devices, so as to solve the problem of playback stalls caused by unbalanced memory occupancy when cache management is performed for several simultaneous download tasks.
In order to achieve the above purpose, the invention adopts the following technical scheme:
in a first aspect, the present invention provides a cache management method for multiple download tasks, applicable to a multimedia playback device, the method comprising:
Determining the total cache space of the current playing device;
allocating to each download task, from the total cache space, an independent cache space matched with the task's attribute information, so that each download task stores its downloaded file in its own independent cache space;
the attribute information includes: bit rate and/or file size.
Further, determining the total cache space of the current playback device includes:
determining a basic cache space according to the memory space of the playback device on which the current download task runs;
and determining the total cache space of the current playback device according to the number of download tasks and the basic cache space.
Further, determining the total cache space according to the number of download tasks and the basic cache space includes:
if the number of download tasks is 1, determining the basic cache space as the total cache space;
if the number of download tasks is greater than 1, extending the basic cache space by a preset amount and determining the extended cache space as the total cache space.
Further, the method further comprises:
and if the total cache space is larger than the preset maximum available cache space, determining the preset maximum available cache space as the total cache space.
Further, allocating to each download task an independent cache space matched with its attribute information includes:
acquiring the bit rates of all download tasks, and allocating an initial cache space to each task in proportion to the bit rates;
determining the initial cache space as the independent cache space; or, when the initial cache space is smaller than a preset value, determining the preset value as the independent cache space; or, when the initial cache space is smaller than the file size of the download task, determining the file size as the independent cache space.
Further, allocating an initial cache space to each download task in proportion to the bit rates includes:
for any download task, acquiring its bit rate and calculating the proportion of that bit rate in the total bit rate of all download tasks;
taking that bit-rate proportion as the task's share of the total cache space;
and allocating an initial cache space to the task from the total cache space according to that share.
Further, determining the preset value as the independent cache space includes:
for any download task, if its initial cache space is smaller than the preset value, updating the independent cache space allocated to that task to equal the preset value;
and after reserving the independent cache space for that task within the total cache space, reallocating the remaining cache space among the other download tasks according to their bit-rate proportions.
Further, determining the file size of the download task as the independent cache space includes:
for any download task, if its initial cache space is smaller than its file size, updating the independent cache space allocated to that task to equal the file size;
and after reserving the independent cache space for that task within the total cache space, reallocating the remaining cache space among the other download tasks according to their bit-rate proportions.
Further, the method further comprises:
if a download task is added or removed, re-determining the total cache space and, based on the remaining download tasks, reallocating an independent cache space to each download task.
Further, the method further comprises:
for any download task, detecting the cache space it actually occupies;
if the actually occupied space is larger than the cache space allocated to the task, deleting downloaded data according to a preset deletion rule and suspending the task.
Further, deleting downloaded data according to a preset deletion rule includes:
deleting data that has already been played; and/or
deleting unplayed data whose distance from the current playback position exceeds a preset time threshold.
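As an illustration only, the two deletion rules above can be sketched in Python. The function `prune_cache`, the segment representation, and all names below are assumptions for the sketch, not part of the patent:

```python
def prune_cache(segments, play_position, time_threshold):
    """Apply the two deletion rules: drop segments that have already been
    played, and drop unplayed segments farther than time_threshold ahead
    of the current play position. Segments are (start, end) time pairs
    in seconds; this representation is an assumption of the sketch."""
    kept = []
    for start, end in segments:
        if end <= play_position:
            continue  # rule 1: data already played
        if start - play_position > time_threshold:
            continue  # rule 2: unplayed data too far from the playhead
        kept.append((start, end))
    return kept
```

For example, with the playhead at 15 s and a 30 s threshold, a segment ending at 10 s is dropped as played and one starting at 100 s is dropped as too far ahead.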
In a second aspect, the present application provides a cache management apparatus for multiple download tasks, including:
a determining module, configured to determine the total cache space of the current playback device;
an allocation module, configured to allocate to each download task, from the total cache space, an independent cache space matched with the task's attribute information, so that each download task stores its downloaded file in its own independent cache space;
wherein the attribute information includes: bit rate and/or file size.
In a third aspect, the application provides an electronic device comprising a processor and a memory, the processor being coupled to the memory:
the processor is used for calling and executing the program stored in the memory;
The memory is configured to store the program, and the program is at least configured to perform the method of any one of the above.
In a fourth aspect, the present application provides a storage medium storing a computer program which, when executed by a processor, performs the steps of any of the methods above.
By adopting the above technical solution, the present application has at least the following beneficial effects:
The scheme first determines the total cache space of the current playback device and then allocates an independent cache space to each download task from that total according to the task's bit rate or file size, so that each task stores its downloaded file in its own cache space. In this way, the memory occupied by each download task matches its own bit rate or file size: a high-bit-rate task occupies more cache and a low-bit-rate task occupies less. This avoids unbalanced cache occupancy, improves the cache hit rate when the progress bar is dragged, and reduces playback stalls.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
In order to more clearly illustrate the embodiments of the invention and the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the invention, and that other drawings may be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of one embodiment of a method for managing multiple download tasks according to the present invention;
FIG. 2 is a flow chart of another embodiment of a method for managing multiple download tasks according to the present invention;
FIG. 3 is a flow chart illustrating another embodiment of a method for managing multiple download tasks according to the present invention;
FIG. 4 is a flow chart of another embodiment of a method for managing multiple download tasks according to the present invention;
FIG. 5 is a flow chart of another embodiment of a method for managing multiple download tasks according to the present invention;
FIG. 6 is a flow chart of another embodiment of a method for managing multiple download tasks according to the present invention;
FIG. 7 is a flow chart illustrating another embodiment of a method for managing multiple download tasks according to the present invention;
FIG. 8 is a schematic diagram illustrating a configuration provided by an embodiment of a multi-download task cache management apparatus according to the present invention;
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be described in detail below. It will be apparent that the described embodiments are only some, but not all, embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort fall within the scope of the present invention as defined by the claims.
Referring to fig. 1, fig. 1 is a flowchart illustrating an embodiment of a cache management method for multiple download tasks according to the present invention. The method can be applied to any playback device for multimedia content, where the playback device may include, but is not limited to, at least one of: a smartphone, a computer, a server, and the like. The cache management method for multiple download tasks comprises the following steps:
step S11, determining the total cache space of the current playing device.
When a user opens a player on the playback device to play content, one or more download tasks are created synchronously, so that the content being played is cached on the basis of these tasks and stuttering is avoided. The player may carry a multitask caching component, such as the HCDN component, through which one or more download tasks are cached simultaneously. The download tasks may include, but are not limited to, at least one of: a video download task, an independent audio download task, an independent picture download task, and an advertisement download task; the embodiments of the present application are not limited in this respect.
In one embodiment of the application, the cache management method for multiple download tasks is executed based on the HCDN component architecture. If the user opens a video player on a playback device, the HCDN component starts. When the user selects video content to play, one or more download tasks are created synchronously based on the HCDN component. It should be noted that most videos today support independent audio; playing such a video creates two download tasks, an independent audio download task and an independent picture download task. If an advertisement is inserted into the video, an advertisement download task is also created. In addition, some video platforms provide multi-view viewing, and in that mode at least two video download tasks are created.
In the present application, the total cache space is determined based on all current download tasks. Referring to fig. 2, in one embodiment of the present application, the specific processing of step S11 may include the following steps:
Step S111, determining a basic cache space according to the memory space of the playing device where the current downloading task is located.
In some embodiments of the present application, if the player on a playback device carries the HCDN component, the component starts when the user opens the player and determines a basic cache space according to the memory space of the playback device. In this embodiment, the size of the basic cache space depends on the device's memory: the larger the memory space of the playback device, the larger the basic cache space.
As shown in Table 1, in one embodiment of the present application, different device memory sizes correspond to different memory levels; the number of basic cache blocks is determined from the device memory and its level, and the basic cache size is then derived from the number of blocks.
Device memory     Memory level   Basic cache blocks   Basic cache size
256MB or less     1              9                    18MB
256MB~512MB       2              11                   22MB
512MB~1GB         3              13                   26MB
1GB~2GB           4              15                   30MB
2GB~3GB           5              17                   34MB
3GB~4GB           6              20                   40MB
4GB~5GB           7              22                   44MB
5GB~6GB           8              24                   48MB
6GB~7GB           9              25                   50MB
7GB~8GB           10             25                   50MB
8GB~9GB           11             30                   60MB
9GB~10GB          12             30                   60MB
10GB or more      13             40                   80MB

TABLE 1
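A minimal sketch of the Table 1 lookup, assuming each basic cache block is 2 MB (consistent with the table, e.g. 9 blocks yielding 18 MB). The function name, tier boundaries, and boundary handling are illustrative assumptions:

```python
# (upper bound of device memory in GB, number of basic cache blocks);
# float('inf') covers the "10GB or more" tier.
_MEMORY_TIERS = [
    (0.25, 9), (0.5, 11), (1, 13), (2, 15), (3, 17), (4, 20),
    (5, 22), (6, 24), (7, 25), (8, 25), (9, 30), (10, 30),
    (float('inf'), 40),
]

def basic_cache_mb(device_memory_gb):
    """Return the basic cache size in MB for a device with the given
    memory, assuming one basic cache block = 2 MB."""
    for upper_bound_gb, blocks in _MEMORY_TIERS:
        if device_memory_gb <= upper_bound_gb:
            return blocks * 2
```

For example, a device with 1.5 GB of memory falls in the 1GB~2GB tier (15 blocks), giving a 30 MB basic cache.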
And step S112, determining the total cache space of the current playing device according to the number of the downloading tasks and the basic cache space.
After the basic cache space is determined, it is extended by a preset amount according to the number of download tasks, and the extended cache space is determined as the total cache space for all download tasks.
In some embodiments, step S112 may include the following steps: if the number of download tasks is 1, determining the basic cache space as the total cache space; if the number is greater than 1, extending the basic cache space by a preset amount and determining the extended space as the total cache space.
In one embodiment of the present application, if there is only one download task, the basic cache space may be used directly as the total cache space. If there are more than one, a preset amount of cache space is added for each download task beyond the first. The preset amount may be a fixed capacity, for example 10 MB, or a proportion of the basic cache space, for example one half, one third, or some other fraction of it.
In one embodiment of the application, the extended cache space is one half of the base cache space. Based on this, the calculation formula of the total buffer space is:
TotalCapacity=BaseCapacity+BaseCapacity÷2×(n-1);
in the above formula, totalCapacity is the total buffer space, baseCapacity is the basic buffer space, n is the number of download tasks, and n is a positive integer.
Through the above steps S111 and S112, the total buffer space required for the download tasks can be determined, so that space allocation is performed for each download task from the total buffer space.
In one embodiment of the present application, if the total cache space is greater than a preset maximum available cache space, the maximum available cache space is determined as the total cache space. The maximum may be set according to the player's parameters; if the player carries the HCDN component, it may also be controlled by the component's cloud-delivered parameters, which is not limited here. For example, the maximum available cache space may be set to 100 MB. By limiting the upper bound of the total cache space, this embodiment prevents an excessive number of download tasks from occupying too much of the playback device's memory, ensures the device runs normally, and avoids affecting any other software installed on it.
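Putting the TotalCapacity formula together with the maximum-available cap gives the following sketch; the function name and the 100 MB default cap are assumptions for illustration:

```python
def total_cache_capacity(base_capacity, n_tasks, max_available=100.0):
    """TotalCapacity = BaseCapacity + BaseCapacity / 2 * (n - 1),
    capped at the preset maximum available cache space. All sizes in MB."""
    if n_tasks < 1:
        raise ValueError("at least one download task is required")
    total = base_capacity + base_capacity / 2 * (n_tasks - 1)
    return min(total, max_available)
```

For a 40 MB basic cache, one task yields 40 MB, three tasks yield 80 MB, and five tasks would yield 120 MB but are capped at 100 MB.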
And step S12, allocating an independent buffer space matched with the attribute information of the downloading task for each downloading task from the total buffer space so that each downloading task stores the downloaded file in the independent buffer space.
The attribute information of a download task includes its bit rate and/or file size. The bit rate is related to the file size and duration; in general, it equals the file size divided by the file duration. According to the bit rate and file size of each download task, an independent cache space can be allocated to it from the total cache space.
For step S12, allocating to each download task an independent cache space matched with its attribute information may include the following steps:
Step S121, acquiring the bit rates of all download tasks, and allocating an initial cache space to each task in proportion to the bit rates.
Referring to fig. 3, in the present embodiment, step S121 may include the following steps: Step S1211, for any download task, acquiring its bit rate and calculating the proportion of that bit rate in the total bit rate of all download tasks. Specifically, if the bit rate of the nth download task is bitrateN, the total bit rate of all download tasks is:
TotalBitrate=bitrate1+bitrate2+…+bitrateN;
where TotalBitrate is the total bit rate of all download tasks and n is a positive integer.
Step S1212, after calculating the bit-rate proportion, taking it as the task's share of the total cache space.
Step S1213, allocating an initial cache space to the task from the total cache space according to the share determined above. Specifically, the initial cache space computed for the nth download task from its bit rate bitrateN is:
calculatedTaskCapacityN=TotalCapacity×bitrateN/TotalBitrate;
where bitrateN is the bit rate of the nth download task, TotalBitrate is the total bit rate of all download tasks, and TotalCapacity is the total cache space.
In some embodiments, step S122 is performed after step S121: determining the initial cache space as the independent cache space.
Based on steps S121 and S122, cache space is allocated to each download task in proportion to its bit rate, so that all tasks perform their caching within their own allocations. A high-bit-rate task can no longer end up with less cache than a low-bit-rate one, which raises the probability of a cache hit when the user drags the progress bar.
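Steps S1211 to S1213 amount to a proportional split of the total cache by bit rate; a minimal sketch with assumed names:

```python
def allocate_by_bitrate(total_capacity, bitrates):
    """Split total_capacity among tasks in proportion to their bit rates:
    calculatedTaskCapacityN = TotalCapacity * bitrateN / TotalBitrate."""
    total_bitrate = sum(bitrates)
    return [total_capacity * b / total_bitrate for b in bitrates]
```

For example, with a 60 MB total cache and two tasks at a 2:1 bit-rate ratio, the tasks receive 40 MB and 20 MB respectively, and the shares always sum to the total.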
In some embodiments of the present application, step S123 may instead be performed after step S121: when the initial cache space is smaller than a preset value, determining the preset value as the independent cache space.
Referring to fig. 4, in some embodiments, step S123 may include the steps of:
step S1231, for any download task, if its initial cache space is smaller than the preset value, updating the independent cache space allocated to that task to equal the preset value.
That is, if the cache space allocated to a task from the total cache space by bit-rate proportion is smaller than the preset value, that task's allocation is raised to the preset value. The preset value may be set according to the actual situation, which is not limited in this embodiment.
In a specific embodiment of the present application, the player carries the HCDN component, whose framework manages data in blocks of 2 MB each. To ensure that a download task executes normally, each task may be given at least two blocks of available space; that is, the preset value may be set to 4 MB.
Step S1232, after reserving the independent cache space for that task within the total cache space, reallocating the remaining cache space among the other download tasks according to their bit-rate proportions.
After the preset-value cache space is reserved for the task, the other download tasks must be allocated cache space again by bit-rate proportion. Specifically, the proportion of each remaining task's bit rate in the total bit rate of the remaining tasks is recalculated and used as that task's share of the remaining cache space, from which its cache space is then allocated.
Step S123 avoids the case where, because the tasks' bit rates differ greatly, the computed number of cache blocks for a task rounds down to 0, leaving it no block in which to cache data. This ensures that every download task can execute normally and improves the user experience.
In some embodiments of the present application, step S124 may instead be performed after step S121: when the initial cache space is smaller than the file size of the download task, determining the file size as the independent cache space.
Referring to fig. 5, in some embodiments, step S124 may include the steps of:
step S1241, for any download task, if its initial cache space is smaller than the file size of the task, updating the independent cache space allocated to that task to equal the file size.
That is, if the cache space allocated to a task from the total cache space by bit-rate proportion is smaller than the task's file, the task's allocation is raised to equal the file size.
This can happen when a file has a high bit rate but is small, so that the space allocated by bit-rate proportion does not match what the file actually needs. For example, a short high-bit-rate advertisement may be inserted into a long low-bit-rate video, and the advertisement's share computed from its bit rate may differ greatly from the share computed for the video file. Whenever the space allocated by bit-rate proportion is smaller than the task's file size, the allocation is updated to equal the file size.
Step S1242, after reserving the independent cache space for that task within the total cache space, reallocating the remaining cache space among the other download tasks according to their bit-rate proportions.
After this cache space is reserved for the task, the other download tasks must be allocated cache space again by bit-rate proportion: the proportion of each remaining task's bit rate in the total bit rate of the remaining tasks is recalculated and used as that task's share of the remaining cache space, from which its cache space is then allocated.
Step S124 avoids the situation where the allocated cache space is smaller than the file size, which would prevent the download from completing.
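The reserve-and-reallocate pattern of steps S123 and S124 can be sketched for the preset-minimum case (the file-size floor works analogously); the function name and the 4 MB default minimum are illustrative assumptions:

```python
def allocate_with_minimum(total_capacity, bitrates, preset_min=4.0):
    """Proportional split by bit rate, but any task whose share would fall
    below preset_min is clamped to preset_min; its space is reserved and
    the remainder is re-split among the other tasks by bit-rate proportion.
    This sketch does not handle the case where the reserved minimums alone
    exceed total_capacity."""
    alloc = {}
    pending = list(range(len(bitrates)))
    remaining = total_capacity
    while pending:
        total_rate = sum(bitrates[i] for i in pending)
        low = [i for i in pending
               if remaining * bitrates[i] / total_rate < preset_min]
        if not low:
            for i in pending:  # everyone left gets a proportional share
                alloc[i] = remaining * bitrates[i] / total_rate
            break
        for i in low:          # reserve the minimum and re-split the rest
            alloc[i] = preset_min
            remaining -= preset_min
            pending.remove(i)
    return [alloc[i] for i in range(len(bitrates))]
```

For example, with a 60 MB total and bit rates 10:10:1, the third task's proportional share (about 2.9 MB) falls below the 4 MB minimum, so it is clamped to 4 MB and the remaining 56 MB is split evenly between the first two tasks.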
According to the cache management method for multiple download tasks described above, the total cache space currently required by all download tasks is determined, and a separate cache space is allocated to each download task from the total cache space according to the task's attribute information, which includes at least the code rate and file size of the download task. Each download task then stores its downloaded data in its own buffer space. By allocating storage space to each download task individually, the method improves the cache hit rate when the user drags the progress bar and thereby optimizes the user's playback experience.
As shown in fig. 6, in one embodiment of the present application, the following steps may be further included:
S21, if a download task is added or removed, the total buffer space is re-determined, and an independent buffer space is reallocated for each download task.
When tasks are created or destroyed, the buffer spaces of the download tasks must be redistributed, so that the buffer space corresponding to each download task is updated dynamically as the task set changes. This avoids the situation in which high-code-rate tasks occupy too little cache while low-code-rate tasks occupy too much. The reallocation may be performed as described in steps S11 and S12 of the above embodiment.
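Steps S11, S12, and S21 can be combined into a sketch like the one below. The per-task extension formula `base + (n - 1) * extension` and all names are assumptions made for illustration; the description only states that the base space is expanded by a preset size when there is more than one task and capped at a preset maximum.

```python
def total_cache_space(base_space, n_tasks, per_task_extension, max_space):
    # Step S11 (sketch): one task uses the base cache; additional tasks
    # each add a preset extension; the result is capped at the maximum
    # available cache space.
    if n_tasks <= 1:
        total = base_space
    else:
        total = base_space + (n_tasks - 1) * per_task_extension
    return min(total, max_space)

def on_task_set_changed(tasks, base_space, per_task_extension, max_space):
    # Step S21: when a task is added or removed, re-determine the total
    # cache space and redistribute it by code-rate ratio (step S12).
    total = total_cache_space(base_space, len(tasks),
                              per_task_extension, max_space)
    total_rate = sum(t["rate"] for t in tasks.values())
    return {name: total * t["rate"] / total_rate
            for name, t in tasks.items()}
```

Calling `on_task_set_changed` on every task creation or destruction keeps the per-task buffers proportional to the current code rates, which is the dynamic-update behavior described above.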
It should be noted that the method further includes step S22: deleting all download tasks when the playback device where the current download tasks are located exits playback, so as to release the occupied cache space and prevent excessive memory usage from slowing down the playback device.
As shown in fig. 7, in one embodiment of the present application, the following steps may be further included:
Step S31, for any download task, detecting the size of the buffer space actually occupied by the download task.
In the present application, the space actually occupied by a download task can be detected while the task is executing.
Step S32, if the actually occupied space is larger than the buffer space allocated to the download task, deleting the downloaded data according to a preset deletion rule and suspending the download task.
If a download task detects that its actual occupied space exceeds the size of the buffer space allocated to the download task during execution, then the redundant download data may be deleted and the download task may be paused.
In one embodiment of the present application, step S32 specifically includes: deleting the data that has already been played in the downloaded data; and/or deleting the data that has not yet been played but whose distance from the current playing time is greater than a preset time threshold.
Specifically, data that has already been played may be deleted as redundant downloaded data. Data that has not yet been played but is far from the current playing position may also be deleted as redundant downloaded data. Both kinds of data may likewise be deleted together as redundant data.
In one embodiment of the present application, the size of the redundant download data is the difference between the size of the actual space occupied by the download task and the size of the allocated buffer space.
In one embodiment of the present application, the data that has already been played is deleted first; when the played data alone is insufficient, the data that has not yet been played but is farther from the current playing position is deleted as well.
Step S33, if the actually occupied space is smaller than or equal to the buffer space allocated to the download task, continuing the download task until the allocated buffer space is full.
After the redundant data is deleted, the download task can resume once its actually occupied space is smaller than or equal to the allocated buffer space. If the allocated buffer space fills up again, the redundant data is determined and deleted again and the task is suspended; the download continues once the occupied space drops back to or below the allocated size.
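Steps S31-S33, together with the deletion priority of step S32, can be sketched as below. The segment-based bookkeeping (`allocated`, `segments`, `end_time`) is a hypothetical data layout chosen for this example, not a structure defined by the application.

```python
def trim_and_throttle(task, playhead, time_threshold):
    """Sketch of steps S31-S33 plus the deletion rule of step S32.

    task: {"allocated": bytes,
           "segments": [{"size": bytes, "end_time": seconds}, ...]}
    Returns True if downloading may continue, False if it is paused.
    """
    used = sum(s["size"] for s in task["segments"])         # S31: detect usage
    if used <= task["allocated"]:
        return True          # S33: keep downloading until the buffer is full
    excess = used - task["allocated"]
    # S32: delete already-played data first, then unplayed data lying
    # further than the threshold ahead of the current playing position.
    played = [s for s in task["segments"] if s["end_time"] <= playhead]
    far = [s for s in task["segments"]
           if s["end_time"] - playhead > time_threshold]
    for seg in played + far:
        if excess <= 0:
            break
        task["segments"].remove(seg)
        excess -= seg["size"]
    return False             # S32: pause until usage is back under the limit
```

Calling this on each detection cycle yields the "download while deleting" behavior: the task pauses whenever it overruns its allocated buffer and resumes once trimming brings it back within bounds.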
Based on the above steps S31-S33, by downloading and deleting at the same time, a download task can be executed without occupying excessive buffer space and without affecting playback, and the situation in which a download cannot proceed due to insufficient memory is avoided, thereby improving the user's viewing experience.
Based on a general inventive concept, the application also provides a cache management device for multiple download tasks, which is used for realizing the method embodiment. As shown in fig. 8, the multi-download task cache management apparatus of the present embodiment includes:
A determining module 41, configured to determine a total buffer space of the current playing device;
an allocation module 42, configured to allocate, from the total buffer space, an independent buffer space matching with attribute information of the download task for each download task, so that each download task stores the downloaded file in the respective independent buffer space;
wherein the attribute information includes: code rate and/or file size.
In some embodiments, the determining module 41 includes:
The first determining unit is used for determining a basic cache space according to the memory space of the playing device where the current downloading task is located;
and the expansion unit is used for determining the total cache space of the current playing device according to the number of the downloading tasks and the basic cache space.
In some embodiments, the expansion unit is specifically configured to: if the number of download tasks is 1, determine the basic cache space as the total cache space; if the number of download tasks is greater than 1, expand the basic cache space by a preset size and determine the expanded cache space as the total cache space.
Further, the determining module 41 further includes:
and the second determining unit is used for determining the preset maximum available buffer space as the total buffer space when the total buffer space is larger than the preset maximum available buffer space.
Further, the distribution module 42 includes:
The computing unit is used for acquiring the code rate of each downloading task and distributing an initial buffer space for each downloading task according to the proportion between the code rates;
An allocation unit for determining the initial buffer space as an independent buffer space; or when the initial buffer space is smaller than the preset value, determining the preset value as an independent buffer space; or when the initial buffer space is smaller than the file size of the download task, determining the file size of the download task as an independent buffer space.
In some embodiments, the computing unit is specifically configured to: for any download task, acquiring the code rate of the download task, and calculating the code rate ratio of the code rate of the download task in the total code rate of all download tasks; taking the code rate duty ratio as the space duty ratio of the downloading task in the total cache space; and according to the space ratio, an initial buffer space is allocated for the downloading task from the total buffer space.
Further, the distribution unit is specifically configured to: for any downloading task, if the initial buffer space is smaller than the preset value, the size of the independent buffer space allocated to the downloading task is updated to be equal to the preset value.
The computing unit is further for: and after reserving an independent buffer space for the downloading task in the total buffer space, reallocating the rest buffer space according to the code rate ratio of other downloading tasks.
Further, the distribution unit is specifically configured to: for any download task, if the initial buffer space is smaller than the file size of the download task, the size of the independent buffer space allocated to the download task is updated to be equal to the file size of the download task.
The computing unit is further for: and after reserving an independent buffer space for the downloading task from the total buffer space, reallocating the rest buffer space according to the code rate ratio of other downloading tasks.
Further, the multi-download task cache management device further includes:
The redetermining module is used for redefining the total buffer space if one download task is newly added or reduced, and redeploying the independent buffer space for each download task;
And the deleting module is used for deleting all the downloading tasks if the playing equipment where the current downloading task is located exits from playing so as to release the occupied cache space.
Further, the multi-download task cache management device further includes:
the detection module is used for detecting the size of the buffer memory space actually occupied by any download task;
the pause module is used for deleting the downloaded data according to a preset deletion rule and pausing the downloading task if the actually occupied space is larger than the size of the buffer space allocated to the downloading task;
and the downloading module is used for continuing the downloading task until the buffer space allocated to the downloading task is occupied if the actually occupied space is smaller than or equal to the buffer space allocated to the downloading task.
Further, the pause module includes:
A deleting unit for deleting the data which has been played in the downloaded data; and/or deleting the data which are not played in the downloaded data but have the distance from the current playing time greater than the preset time threshold.
The specific manner in which the respective modules perform operations in the cache management apparatus for multiple download tasks in the above embodiment has been described in detail in the embodiments related to the method, and will not be described in detail herein.
Based on one general inventive concept, the present embodiment also provides a playback apparatus. The playing device of the present embodiment includes the buffer management device for multiple download tasks of the above embodiment.
Based on a general inventive concept, the present embodiment also provides an electronic device, configured to implement the above method embodiment. As shown in fig. 9, the electronic apparatus of the present embodiment includes a processor 51 and a memory 52, and the processor 51 is connected to the memory 52. Wherein the processor 51 is used to call and execute the program stored in the memory 52; the memory 52 is used to store a program for performing at least the cache management method of the multi-download task in the above embodiment.
The specific manner in which the processor 51 executes the program in the memory 52 of the electronic device in the above embodiment has been described in detail in the embodiments concerning the method, and will not be described in detail here.
Embodiments of the present application also provide a computer readable storage medium having a computer program stored thereon, which when executed by a processor, performs the steps of any of the methods described above.
It is to be understood that the same or similar parts in the above embodiments may be referred to each other, and that in some embodiments, the same or similar parts in other embodiments may be referred to.
It should be noted that in the description of the present invention, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Furthermore, in the description of the present invention, unless otherwise indicated, the meaning of "plurality" means at least two.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and further implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
It is to be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the invention, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the invention.

Claims (13)

1. The cache management method for the multi-download tasks is characterized by being suitable for playing equipment of multimedia contents and comprising the following steps of:
Determining the total cache space of the current playing device;
From the total cache space, an independent cache space matched with the attribute information of the downloading task is allocated for each downloading task, so that each downloading task stores the downloaded file in the independent cache space; the allocating an independent buffer space matched with the attribute information of the downloading task for each downloading task comprises the following steps: acquiring code rates of all downloading tasks, and distributing an initial buffer space for each downloading task according to the proportion among the code rates; determining the initial buffer space as an independent buffer space; or when the initial buffer space is smaller than the preset value, determining the preset value as an independent buffer space; or when the initial buffer space is smaller than the file size of the downloading task, determining the file size of the downloading task as an independent buffer space;
the attribute information includes: code rate and/or file size.
2. The method of claim 1, wherein determining the total buffer space of the current playback device comprises:
determining a basic cache space according to the memory space of the playing device where the current downloading task is located;
And determining the total cache space of the current playing device according to the number of the downloading tasks and the basic cache space.
3. The method of claim 2, wherein determining the total buffer space of the current playback device according to the number of download tasks and the base buffer space comprises:
if the number of the downloading tasks is 1, determining the basic cache space as a total cache space;
if the number of the downloading tasks is greater than 1, expanding the buffer space with a preset size on the basis of the basic buffer space, and determining the expanded buffer space as the total buffer space.
4. The method as recited in claim 2, further comprising:
and if the total cache space is larger than the preset maximum available cache space, determining the preset maximum available cache space as the total cache space.
5. The method according to any one of claims 1-4, wherein said allocating initial buffer space for each download task in proportion to the ratio between code rates comprises:
for any download task, acquiring the code rate of the download task, and calculating the code rate ratio of the code rate of the download task in the total code rate of all download tasks;
taking the code rate duty ratio as the space duty ratio of the downloading task in the total cache space;
and according to the space ratio, an initial buffer space is allocated for the downloading task from the total buffer space.
6. The method according to any one of claims 1-4, wherein determining the preset value as an independent buffer space comprises:
for any downloading task, if the initial buffer space is smaller than a preset value, updating the size of the independent buffer space allocated to the downloading task to be equal to the preset value;
And after reserving an independent buffer space for the downloading task in the total buffer space, reallocating the rest buffer space according to the code rate ratio of other downloading tasks.
7. The method according to any one of claims 1-4, wherein determining the file size of the download task as an independent buffer space comprises:
For any downloading task, if the initial buffer space is smaller than the file size of the downloading task, updating the size of the independent buffer space allocated to the downloading task to be equal to the file size of the downloading task;
And after reserving an independent buffer space for the downloading task from the total buffer space, reallocating the rest buffer space according to the code rate ratio of other downloading tasks.
8. The method of any one of claims 1-4, further comprising:
And if one download task is newly added or reduced, the total buffer space is redetermined, and based on the rest download tasks, each download task is allocated with a respective independent buffer space.
9. The method of any one of claims 1-4, further comprising:
for any downloading task, detecting the size of a buffer space actually occupied by the downloading task;
If the actually occupied space is larger than the size of the buffer memory space allocated to the downloading task, deleting the downloaded data according to a preset deleting rule, and suspending the downloading task.
10. The method of claim 9, wherein deleting the downloaded data according to the preset deletion rule comprises:
deleting the data which has been played in the downloaded data; and/or the number of the groups of groups,
Deleting the data which are not played in the downloaded data and have the distance from the current playing time larger than the preset time threshold.
11. A cache management apparatus for multiple download tasks, comprising:
the determining module is used for determining the total cache space of the current playing device;
The distribution module is used for distributing an independent cache space matched with the attribute information of the downloading task for each downloading task from the total cache space so that each downloading task stores the downloaded file in the independent cache space; the allocating an independent buffer space matched with the attribute information of the downloading task for each downloading task comprises the following steps: acquiring code rates of all downloading tasks, and distributing an initial buffer space for each downloading task according to the proportion among the code rates; determining the initial buffer space as an independent buffer space; or when the initial buffer space is smaller than the preset value, determining the preset value as an independent buffer space; or when the initial buffer space is smaller than the file size of the downloading task, determining the file size of the downloading task as an independent buffer space;
the attribute information includes: code rate and/or file size.
12. An electronic device comprising a processor and a memory, the processor being coupled to the memory:
the processor is used for calling and executing the program stored in the memory;
the memory for storing the program at least for performing the method of any one of claims 1-10.
13. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the method according to any of claims 1-10.
CN202210446692.7A 2022-04-26 2022-04-26 Cache management method and device for multiple download tasks Active CN114817089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210446692.7A CN114817089B (en) 2022-04-26 2022-04-26 Cache management method and device for multiple download tasks

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210446692.7A CN114817089B (en) 2022-04-26 2022-04-26 Cache management method and device for multiple download tasks

Publications (2)

Publication Number Publication Date
CN114817089A CN114817089A (en) 2022-07-29
CN114817089B true CN114817089B (en) 2024-11-12

Family

ID=82506670

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210446692.7A Active CN114817089B (en) 2022-04-26 2022-04-26 Cache management method and device for multiple download tasks

Country Status (1)

Country Link
CN (1) CN114817089B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104133782A (en) * 2014-07-04 2014-11-05 深圳英飞拓科技股份有限公司 Adaptive management method and device of digital monitoring platform memory

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100312828A1 (en) * 2009-06-03 2010-12-09 Mobixell Networks Ltd. Server-controlled download of streaming media files
CN106131657A (en) * 2016-06-29 2016-11-16 乐视控股(北京)有限公司 Video playing control method and device
CN109462650A (en) * 2018-11-14 2019-03-12 深圳市小牛普惠投资管理有限公司 Data file downloading method, device, computer equipment and storage medium
CN114173372B (en) * 2020-09-10 2025-08-29 华为技术有限公司 Data caching method and electronic device
CN113377724A (en) * 2021-07-02 2021-09-10 厦门雅基软件有限公司 Cache space management method, device and storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104133782A (en) * 2014-07-04 2014-11-05 深圳英飞拓科技股份有限公司 Adaptive management method and device of digital monitoring platform memory

Also Published As

Publication number Publication date
CN114817089A (en) 2022-07-29

Similar Documents

Publication Publication Date Title
US20220221998A1 (en) Memory management method, electronic device and non-transitory computer-readable medium
US10540296B2 (en) Thresholding task control blocks for staging and destaging
EP3220255A1 (en) Method for storage device storing data and storage device
CN109213696B (en) Method and apparatus for cache management
CN105637470B (en) Method and computing device for dirty data management
JP2015506041A (en) Working set swap using sequential swap file
CN104317742A (en) A Thin Provisioning Method for Optimizing Space Management
US9792050B2 (en) Distributed caching systems and methods
JP7631362B2 (en) Tracking file system read operations for instant play of video games and for discarding and pre-fetching game data on the client side
JP2017117179A (en) Information processing device, cache control program and cache control method
CN105302830A (en) Map tile caching method and apparatus
CN104461735A (en) Method and device for distributing CPU resources in virtual scene
CN112015343A (en) Cache space management method and device of storage volume and electronic equipment
CN104133642A (en) SSD Cache filling method and device
EP2506135A2 (en) Method and apparatus to allocate area to virtual volume
US11481140B1 (en) Dynamic base disk mirroring for linked clones
CN114817089B (en) Cache management method and device for multiple download tasks
JP2013196108A (en) Storage control device, storage control method, and storage control program
JP2017027301A (en) Storage control device, hierarchical storage control program, and hierarchical storage control method
CN109086008A (en) Data processing method of solid-state hard disk and solid-state hard disk
CN106201921A (en) The method of adjustment of a kind of cache partitions capacity and device
CN106980471B (en) Method and device for improving hard disk writing performance of intelligent equipment
US20120079188A1 (en) Method and apparatus to allocate area to virtual volume based on object access type
CN105183375B (en) A kind of control method and device of the service quality of hot spot data
CN104899158A (en) Memory access optimization method and memory access optimization device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant