
CN121012818A - Method and apparatus for loading media resources - Google Patents

Method and apparatus for loading media resources

Info

Publication number
CN121012818A
CN121012818A (application CN202511057024.5A)
Authority
CN
China
Prior art keywords
file
resource
media
track
resources
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202511057024.5A
Other languages
Chinese (zh)
Inventor
匡乐
胡奕涵
贺子悦
姚智健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan MgtvCom Interactive Entertainment Media Co Ltd
Original Assignee
Hunan MgtvCom Interactive Entertainment Media Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan MgtvCom Interactive Entertainment Media Co Ltd filed Critical Hunan MgtvCom Interactive Entertainment Media Co Ltd
Priority to CN202511057024.5A
Publication of CN121012818A
Legal status: Pending

Links

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60: Network streaming of media packets
    • H04L65/70: Media network packetisation
    • H04L65/80: Responding to QoS
    • H04L67/00: Network arrangements or protocols for supporting network services or applications
    • H04L67/01: Protocols
    • H04L67/02: Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H04L67/06: Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H04L2212/00: Encapsulation of packets

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract


This application provides a method and apparatus for loading media resources. The method includes: generating a first track file corresponding to rich media resources and a second track file for multimedia resources; encapsulating the first track file and the second track file into a target container to generate a media file, wherein the rich media resources are resources overlaid on the interface corresponding to the multimedia resources; generating a resource manifest file according to the control logic of the first track file; and sending the media file and the resource manifest file to a client, so that the client loads the multimedia resources according to the media file and loads the rich media resources according to the media file and the resource manifest file.

Description

Media resource loading method and device
Technical Field
The present application relates to the field of communications, and in particular, to a method and apparatus for loading media resources.
Background
Conventional online video asset distribution and playback systems rely primarily on processing and transmitting audio/video streams separately from additional assets, which presents a number of challenges. First, audio/video content is transmitted over mature, standard streaming protocols such as HTTP Live Streaming (HLS) or Dynamic Adaptive Streaming over HTTP (DASH). These protocols are designed to transport media streams efficiently so that clients can download, cache, and decode the main video and audio tracks as needed to achieve a smooth playback experience.
However, rich media resources such as bullet-screen comments (danmaku), subtitles, stickers, filters, and interactive special effects are managed and loaded in a more decentralized, asynchronous manner. Because special-effect resources are diverse, ranging from simple image overlays to complex real-time rendering, keeping them synchronized with video playback becomes complicated, and additional mechanisms are required to ensure the two work in concert.
No effective solution has yet been proposed for the problem that, in the prior art, rich media resources are usually managed and loaded independently of the main stream and therefore fall out of sync with multimedia playback. Accordingly, improvements to the related art are needed to overcome these drawbacks.
Disclosure of Invention
The embodiments of the application provide a method and an apparatus for loading media resources, which at least solve the prior-art problem that rich media resources are usually managed and loaded independently of the main stream and are therefore often out of sync with multimedia playback.
According to one embodiment of the application, a method for loading media resources is provided, applied to a service device. The method comprises: generating a first track file corresponding to a rich media resource and a second track file of a multimedia resource, and encapsulating the first track file and the second track file into a target container to generate a media file, wherein the rich media resource is a resource overlaid on the interface corresponding to the multimedia resource; generating a resource manifest file according to the control logic of the first track file; and sending the media file and the resource manifest file to a client, so that the client loads the multimedia resource according to the media file and loads the rich media resource according to the media file and the resource manifest file.
According to another embodiment of the application, a method for loading media resources is provided, applied to a client. The method comprises: receiving a media file and a resource manifest file sent by a service device, wherein a cloud platform generates a first track file corresponding to a rich media resource and a second track file of a multimedia resource, encapsulates the first track file and the second track file into a target container to generate the media file, and generates the resource manifest file according to the control logic of the first track file, the rich media resource being a resource overlaid on the interface corresponding to the multimedia resource, and the multimedia resource comprising at least one of a video resource and an audio resource; loading the multimedia resource according to the media file; and loading the rich media resource according to the media file and the resource manifest file.
According to another embodiment of the application, an apparatus for loading media resources is provided, applied to a service device, comprising a first generating module, a second generating module, and a sending module. The first generating module is configured to generate a first track file corresponding to a rich media resource and a second track file of a multimedia resource, and to encapsulate the first track file and the second track file into a target container to generate a media file, the rich media resource being a resource overlaid on the interface corresponding to the multimedia resource. The second generating module is configured to generate a resource manifest file according to the control logic of the first track file. The sending module is configured to send the media file and the resource manifest file to a client, so that the client loads the multimedia resource according to the media file and loads the rich media resource according to the media file and the resource manifest file.
According to another embodiment of the application, an apparatus for loading media resources is provided, applied to a client, comprising a receiving module and a loading module. The receiving module is configured to receive a media file and a resource manifest file sent by a service device, wherein the cloud platform generates a first track file corresponding to a rich media resource and a second track file corresponding to a multimedia resource, encapsulates the first and second track files into a target container to generate the media file, and generates the resource manifest file according to the control logic of the first track file; the rich media resource is a resource overlaid on the interface corresponding to the multimedia resource, and the multimedia resource comprises at least one of a video resource and an audio resource. The loading module is configured to load the multimedia resource according to the media file and to load the rich media resource according to the media file and the resource manifest file.
According to a further embodiment of the application, there is also provided a computer readable storage medium having stored therein a computer program, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
According to a further embodiment of the application there is also provided an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
According to a further embodiment of the application, there is also provided a computer program product comprising a computer program which, when executed by a processor, implements the steps of any of the method embodiments described above.
According to the method and apparatus above, a first track file corresponding to the rich media resource and a second track file of the multimedia resource are generated and encapsulated into a target container to generate the media file, the rich media resource being a resource overlaid on the interface corresponding to the multimedia resource; a resource manifest file is generated according to the control logic of the first track file; and the media file and the resource manifest file are sent to a client, so that the client loads the multimedia resource according to the media file and loads the rich media resource according to the media file and the resource manifest file.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
To more clearly illustrate the embodiments of the application or the technical solutions of the prior art, the drawings used in describing the embodiments or the prior art are briefly introduced below; it will be obvious to a person skilled in the art that other drawings can be derived from these drawings without inventive effort.
FIG. 1 is a hardware block diagram of a service device of a media resource loading method according to an embodiment of the present application;
FIG. 2 is a flow chart (I) of a method of loading media assets according to an embodiment of the application;
FIG. 3 is a flow chart (II) of a method of loading media assets according to an embodiment of the application;
FIG. 4 is a block diagram (I) of a media resource loading apparatus according to an embodiment of the present application;
FIG. 5 is a block diagram (II) of a media resource loading apparatus according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in detail below with reference to the accompanying drawings in conjunction with the embodiments.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order.
The method embodiments provided in this application may be executed on a service device or a similar computing device. Taking execution on a service device as an example, FIG. 1 is a hardware block diagram of a service device for a media resource loading method according to an embodiment of the present application. As shown in FIG. 1, the service device may include one or more processors 102 (only one is shown in FIG. 1; the processor 102 may include, but is not limited to, a processing means such as a microprocessor (MPU) or a programmable logic device (FPGA)) and a memory 104 for storing data. The service device may further include a transmission device 106 for communication functions and an input/output device 108. Those skilled in the art will appreciate that the structure shown in FIG. 1 is merely illustrative and does not limit the structure of the service device; for example, the service device may include more or fewer components than shown in FIG. 1, or have a different configuration.
The memory 104 may be used to store a computer program, for example, application software programs and modules, such as the computer program corresponding to the media resource loading method in an embodiment of the present application. The processor 102 executes the computer program stored in the memory 104, thereby performing various functional applications and data processing, i.e., implementing the above-mentioned method. The memory 104 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, connected to the service device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the above network may include a wireless network provided by the communication provider of the service device. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, NIC) that can connect to other network devices through a base station to communicate with the internet. In another example, the transmission device 106 may be a Radio Frequency (RF) module configured to communicate with the internet wirelessly.
FIG. 2 is a flowchart (I) of a method for loading media resources according to an embodiment of the present application, where the media resources include rich media resources and multimedia resources. As shown in FIG. 2, the flow includes the following steps:
Step S202, generating a first track file corresponding to a rich media resource and a second track file of a multimedia resource, and encapsulating the first track file and the second track file into a target container to generate a media file, wherein the rich media resource is a resource overlaid on the interface corresponding to the multimedia resource;
It should be noted that the multimedia resources include but are not limited to video and audio, and the rich media resources include but are not limited to bullet screens, stickers, subtitles, and special effects.
Step S202 ensures separation and modularization of resources by abstracting multimedia resources and rich media resources into independent track files, which facilitates subsequent unified encapsulation and flexible control. The track files are then combined into the media file by a specific packaging technique (e.g., using a container format such as MP4 or WebM). This unified packaging allows different types of resources to be transmitted as a whole, reduces the number of client requests, simplifies resource management and synchronization, and improves transmission efficiency.
Step S204, generating a resource list file according to the control logic of the first track file;
The resource manifest file in step S204 (typically in JSON, XML, or YAML format) describes the metadata of all tracks enclosed in the media file, together with their control rules. It comprises key information such as each track's ID, type, encoding format, priority, display conditions (e.g., device type, user identity, playback scene), rendering order, and dependencies. The manifest file is generated from the control logic of the rich media resource track file and provides the client with a detailed resource management guide, so the client can dynamically decide which resources to load, and when and how to render them, thereby achieving intelligent scheduling and display of resources.
It should be noted that the resource manifest file may further include control logic for the multimedia resource, which includes, but is not limited to, one-tap skipping, starting audio playback when the video reaches a certain time point, and masking a target object or a target voiceprint.
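For illustration, a minimal sketch of such a manifest can be assembled as follows. The application does not fix a normative schema, so all field names and values here are hypothetical:

```python
import json

# Hypothetical resource manifest as described above; field names and values
# are illustrative assumptions, not a normative format from the application.
manifest = {
    "tracks": [
        {
            "track_id": 3,
            "type": "image-overlay",
            "codec": "png",
            "priority": 1,
            "display_condition": {"device": ["mobile", "tv"], "scene": "fullscreen"},
            "render_order": 2,
            "depends_on": [1],  # rendered on top of the video track
        }
    ],
    # Optional control logic for the multimedia resource itself
    "media_controls": [{"action": "skip-intro", "to": 90.0}],
}

manifest_json = json.dumps(manifest, indent=2)
print(manifest_json)
```

A client receiving this file could read `display_condition` to decide whether to load the overlay track at all, and `render_order` to stack multiple overlays deterministically.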
Step S206, sending the media file and the resource manifest file to the client, so that the client loads the multimedia resource according to the media file and loads the rich media resource according to the media file and the resource manifest file.
The media file and the resource manifest file can be distributed efficiently through a standard CDN (content delivery network) or delivered through a self-developed distribution channel. This unified distribution mechanism ensures the consistency of the resources and control information received by the client.
On the client side, the client first parses the resource manifest file to determine which tracks (multimedia resources and rich media resources) to load, based on the descriptions and control logic therein. The client then de-encapsulates and decodes the corresponding resources from the media file according to the requirements of those tracks. Through a unified rendering interface, different types of track resources can be overlaid on the video playback picture, enriching and diversifying the visual content. In addition, the resource manifest file enables the client to flexibly adjust its resource loading and rendering policies according to device characteristics, user preferences, and network conditions, thereby optimizing the playback experience.
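The client-side track selection described above can be sketched as a simple filter over the parsed manifest. The field names (`display_condition`, `render_order`) are assumptions carried over from the manifest illustration, not a defined schema:

```python
# Hypothetical client-side selection: given a parsed manifest dict, decide
# which tracks to load for the current device and in which render order.
def select_tracks(manifest: dict, device: str) -> list[dict]:
    selected = []
    for track in manifest.get("tracks", []):
        cond = track.get("display_condition", {})
        allowed = cond.get("device")
        # A track with no device condition is always eligible
        if allowed is None or device in allowed:
            selected.append(track)
    # Lower render_order values are drawn first (bottom of the overlay stack)
    return sorted(selected, key=lambda t: t.get("render_order", 0))

manifest = {
    "tracks": [
        {"track_id": 2, "render_order": 1},
        {"track_id": 3, "render_order": 0,
         "display_condition": {"device": ["tv"]}},
    ]
}
print([t["track_id"] for t in select_tracks(manifest, "mobile")])  # → [2]
```

Track 3 is restricted to TV devices, so a mobile client skips it entirely and never fetches its payload, which is the kind of dynamic decision the manifest is meant to enable.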
Through the above steps, a first track file corresponding to the rich media resource and a second track file of the multimedia resource are generated and encapsulated into a target container to generate the media file, the rich media resource being a resource overlaid on the interface corresponding to the multimedia resource; a resource manifest file is generated according to the control logic of the first track file; and the media file and the resource manifest file are sent to the client, so that the client loads the multimedia resource according to the media file and loads the rich media resource according to the media file and the resource manifest file.
Optionally, "generating a first track file corresponding to the rich media resource" in step S202 may be implemented by: determining attribute information of each sub-resource in the rich media resource; generating a track event for each sub-resource according to its attribute information; and generating the first track file of the rich media resource according to the display time and track event of each sub-resource.
In an embodiment of the present application, detailed attribute information of each sub-resource (e.g., a single bullet-screen comment, a single sticker, a single line of subtitles) is extracted or defined. The attribute information includes, but is not limited to, the resource type (e.g., image, text, interactive instruction), size, position, presentation time, duration, and special-effect parameters (e.g., zoom, rotation, transparency). Determining the attribute information builds a clear descriptive framework for each sub-resource.
The attribute information of each sub-resource is then translated into one track event or a series of track events. A track event is a data structure describing the specific behavior of the sub-resource on the timeline, containing key information such as the sub-resource's presentation time point, duration, and position parameters. For example, for a bullet-screen resource, the track event may include the bullet-screen text, display time, duration, speed, and display location. This event data is used to construct a track structure for the entire rich media resource, ensuring that the resource is presented correctly during playback.
Finally, the track events of all sub-resources are aggregated into a structured first track file in a specific format (e.g., JSON, WebVTT, or a custom binary format) so that it can be encapsulated into a multimedia container. The structure of the track file needs to contain the display time, type identifier, metadata, and other control information for each track event so that the client knows when to load and how to decode and render each sub-resource, thereby ensuring synchronization of the rich media resource with the main video stream.
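The attribute-to-event translation and aggregation steps above can be sketched as follows. The event fields (`start_time`, `payload`, etc.) are illustrative assumptions, since the application leaves the concrete track-event format open:

```python
import json

# Illustrative sketch: turn sub-resource attribute dicts into track events,
# then aggregate them into a first track file. Field names are assumptions.
def make_event(sub: dict) -> dict:
    return {
        "start_time": sub["display_time"],
        "duration": sub.get("duration", 5.0),  # assumed default duration
        "type": sub["type"],
        # Everything else (text, position, speed, ...) travels as payload
        "payload": {k: v for k, v in sub.items()
                    if k not in ("display_time", "duration", "type")},
    }

subs = [
    {"type": "danmaku", "text": "hello", "display_time": 12.5, "speed": 1.0},
    {"type": "image-overlay", "src": "sticker.png", "display_time": 3.0,
     "duration": 2.0, "position": [100, 50]},
]

# Events are kept in timeline order so the client can stream through them
track_file = {"track_id": "overlay-1",
              "events": sorted(map(make_event, subs),
                               key=lambda e: e["start_time"])}
print(json.dumps(track_file, indent=2))
```

Sorting by `start_time` is what lets the client walk the event list in lockstep with the playback clock instead of scanning all events every frame.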
Optionally, processing the rich media resources into a standardized track file can be divided into the following steps:
Step 1: prepare the input resources.
1. Main video file: ensure the video file is available, e.g., stored in .mp4 or .webm format.
2. Audio file: confirm the audio file exists, encoded as AAC or Opus.
3. Sticker images: collect all sticker (expression map) images to be processed; the image resources may be in .png, .webp, or similar formats.
4. Bullet-screen script: prepare a JSON file containing the bullet-screen content and its display timestamps.
Step 2: process with a tool script.
1. Configure the ffmpeg and Python environments: ensure ffmpeg is installed and a Python environment is available on the system.
2. Write a Python script: create a Python script that reads the stickers and the bullet-screen script, then processes the resources with ffmpeg so they conform to the standard track-file format. The script is responsible for matching each sticker to its display time in the video and generating the corresponding track description.
Step 3: create the track description.
1. Extract sticker information: use the Python script to extract the necessary information, such as image file names, position data, and display times, from the sticker resources.
2. Associate timestamps: read the timestamp information in the bullet-screen script, and have the Python script associate the timestamps with the sticker information.
3. Generate track description data: build the track description from the processed information in a standardized format (e.g., JSON), including:
track_id: the unique identifier of the track.
type: the resource type, e.g., image-overlay.
start_time: the time at which the resource starts to display.
duration: how long the resource is displayed.
image_src: the file name or path of the image resource.
position: the display position of the image on the video frame.
scale: the scaling factor of the image.
For example:
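A track description using the fields listed above might be built and serialized like this; all concrete values are illustrative:

```python
import json

# Hypothetical track description with the fields listed above;
# every concrete value here is illustrative, not normative.
track_description = {
    "track_id": "overlay-001",
    "type": "image-overlay",
    "start_time": 12.5,               # seconds into playback
    "duration": 3.0,                  # seconds on screen
    "image_src": "smile.png",
    "position": {"x": 0.8, "y": 0.1}, # assumed normalized coordinates
    "scale": 0.5,
}
print(json.dumps(track_description, indent=2))
```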
Step 4: integrate the track data.
1. Serialize the track events: concatenate all sticker and bullet-screen events in chronological order into a JSON sequence, so that multiple track events can be described in one file.
2. Output the track file: write the generated track-description sequence into a standard track file, e.g., overlay_track.json. This file contains a standardized description of all stickers and bullet screens so that the subsequent packaging process can identify and process it.
Through the above steps, originally scattered rich media resources, such as stickers and bullet screens, are converted into a structured track-file format that can be managed uniformly, achieving standardized management and efficient distribution of resources.
Optionally, "encapsulating the first track file and the second track file into a target container to generate a media file" in step S202 may be implemented by: converting the first track file into a first data file in a target format and the second track file into a second data file in the target format; writing the first data file and the second data file into respective target areas of the target container; and executing a target encapsulation command to encapsulate the target container, with the first and second data files written in, into the media file.
The first track file is the track description file of the rich media resources, such as bullet screens, subtitles, or special-effect tracks. The rich media resource is converted into a target format (e.g., a custom box structure conforming to the ISO Base Media File Format standard) so that it can be correctly identified and decoded by existing media containers (e.g., MP4, WebM). Format conversion includes, but is not limited to, restructuring the JSON, XML, or binary first track file into the format of specific regions of the container file (e.g., the moov, trak, or mdia boxes).
The second track file contains the basic video and audio tracks; further processing is required during encapsulation to ensure compatibility with the track data of the rich media resources.
After format conversion, the first and second data files are written into designated areas, or boxes, of the target container (e.g., MP4). The designated area depends on the specification of the particular container format; for example, the first data file may be written into a moof (Movie Fragment) box and the second data file into an MP4 user-extension box (udta or uuid box).
The encapsulation command is issued to a media packaging tool (such as ffmpeg or GPAC MP4Box), which packages the processed video, audio, and additional rich media track data into a single media file. This process includes, but is not limited to, writing all track data and metadata into the corresponding boxes of the media container, ensuring consistency of the timeline and integrity of the decoding information. The final output is the generated media file (e.g., unified_asset.mp4).
The format conversion in this embodiment ensures that rich media resources can be supported by various devices and clients, avoiding resource loading failures or abnormal playback caused by incompatible formats. By writing the first and second data files into different areas of the target container and then executing a single encapsulation command, all resources can be efficiently integrated into one media file, reducing the number of client requests and the complexity of data processing.
Optionally, the process of uniformly packaging the main video, main audio, and multiple track files into one parsable multi-track container comprises the following steps:
Step 1: prepare the input data.
1. Video/audio resources: the video and audio files, e.g., video in .mp4 or .webm format and AAC- or Opus-encoded audio.
It should be noted that the main audio and main video are already in track form; they are the first and most basic components of a multi-track container.
2. JSON-format track file: a JSON file describing the additional resources (such as stickers and bullet screens), e.g., overlay_track.json; ensure that all necessary attribute information is contained in the file.
3. Timeline information (time anchors): collect or generate the time-point information synchronized with the video content, used to locate when the track data is displayed or applied.
Step 2: convert the track data.
1. Convert the track to binary data: use an appropriate tool (e.g., ffmpeg together with a Python script) to convert the JSON-format track file into binary data suitable for the multimedia container's storage format.
Step 3: encapsulate the multi-track container.
1. Select a tool chain: decide whether to use GPAC MP4Box or FFmpeg + bmffmux for encapsulation. GPAC MP4Box is suitable for fast, simple packaging, while FFmpeg + bmffmux provides a higher degree of customization and flexibility.
2. Package: package the main video and audio files as part of the multi-track container using the selected tool.
For example, when using ffmpeg, the -map parameter maps the video and audio streams into the output container.
Step 4: write the extension box.
Write the user-extension box: write the converted track data into an extension box of the MP4 container for storage, e.g., using a udta or uuid box.
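Conceptually, an ISO-BMFF box is a 4-byte big-endian size, a 4-byte type code, and a payload. The sketch below illustrates only that byte layout as a stand-in for the extension-box step; real MP4 box writing is performed by the tool chain (MP4Box/ffmpeg), and the box name here is an assumption:

```python
import struct

# Minimal, illustrative ISO-BMFF style box builder:
# 4-byte big-endian total size + 4-byte type + payload.
def make_box(box_type: bytes, payload: bytes) -> bytes:
    if len(box_type) != 4:
        raise ValueError("box type must be exactly 4 bytes")
    return struct.pack(">I", 8 + len(payload)) + box_type + payload

# Hypothetical: wrap track JSON in a udta box for storage in the container
track_json = b'{"track_id": "overlay-001"}'
udta_box = make_box(b"udta", track_json)
print(udta_box[:8])  # the size + type header
```

The size field counts the 8-byte header plus the payload, which is why a parser can skip unknown boxes wholesale, exactly the property that lets players ignore a custom extension box they do not understand.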
Step 5: map the track types.
Define track types: define the type and purpose of each track in the container, for example:
Video: Track 1 (type: vide);
Audio: Track 2 (type: soun);
Overlay: Track 3 (type: meta/json).
Step 6: execute the packaging command.
The packaging command is as follows:

ffmpeg -i video.mp4 -i overlay_track.json \
  -map 0:v -map 0:a -map 1 \
  -c copy -metadata:s:2 handler="overlay-json" \
  -movflags use_metadata_tags \
  output_with_tracks.mp4
1. Specify the input files: the -i parameter specifies the main video, main audio, and track file (e.g., overlay_track.json) as inputs.
2. Map the streams: the -map parameter specifies which streams to extract from which inputs; e.g., -map 0:v -map 0:a -map 1 maps the video, audio, and the second input (input index 1, the track file) into the output container.
3. Copy the encoding: -c copy avoids re-encoding the video and audio, preserving the original quality.
4. Add metadata: -metadata:s:2 handler="overlay-json" attaches metadata describing how to handle the track to the corresponding stream, where handler lets the client identify the track type and s:2 denotes output stream index 2 (the third stream, counting from 0).
5. Use movflags: -movflags use_metadata_tags ensures that the metadata tags are written correctly.
6. Output the encapsulated file: after the command executes, a single encapsulated media file containing all tracks (video, audio, stickers, bullet screens, etc.) is produced, e.g., named output_with_tracks.mp4.
Through the above steps, various types of resource data can be integrated into a unified multi-track container, facilitating client-side parsing and efficient playback while ensuring resource synchronization and cross-platform consistency.
Optionally, the embodiment of the application provides a technical scheme for dynamically updating rich media resources (such as barrages, pictures, special effects and the like), which comprises the steps of determining the updating state of the rich media resources, generating a third track file according to the updated rich media resources when the updating state of the rich media resources indicates that the rich media resources are updated, and packaging the third track file into a target container to generate the updated media file.
Monitor and detect whether a rich media resource in the multimedia content needs to be updated; the update status may be determined in a variety of ways, including but not limited to version control, resource hash checking, and update-timestamp comparison. When the rich media resource is detected to be updated, the new resource is processed to generate or update the corresponding track file (the third track file). The generated third track file contains the updated attribute information, track events, display times and other data of the sub-resources, ensuring that the resource is up to date and accurate.
And finally, uniformly packaging the updated third track file and other tracks (such as video and audio tracks) in the original media file to form the updated media file. The updated media file not only contains video and audio, but also contains the latest rich media resource track, thereby realizing seamless upgrading of the whole resource.
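The update-state determination by resource hash checking mentioned above can be sketched as follows (a minimal illustration; the digest store and resource names are hypothetical):

```python
import hashlib

def resource_digest(data: bytes) -> str:
    # Fingerprint the resource bytes; any content change yields a new digest.
    return hashlib.sha256(data).hexdigest()

def needs_update(known_digests: dict, name: str, data: bytes) -> bool:
    """Return True when the resource is new or its stored digest differs,
    i.e. when a third track file should be generated for it."""
    return known_digests.get(name) != resource_digest(data)
```

Version control or timestamp comparison would slot into the same decision point; hashing is simply content-exact and independent of clocks.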
Optionally, taking a newly added bullet screen track file as an example, the new track expansion steps are as follows:
step1, constructing a new bullet screen track and converting the format;
1. Model the timeline events of the new bullet-screen style.
The newly designed bullet-screen styles are converted into event sequences on the timeline using, for example, ffmpeg combined with Python processing scripts or dedicated media editing software. Each event contains information such as the display time, text content and presentation style of the bullet screen, and can be stored in WebVTT (a text format for subtitles and metadata) or a custom JSON-Event format.
JSON-Event is a flexible data exchange format that can describe various attributes of each bullet Event. For example, the following is a simple example of a JSON-Event:
{"time": 32.5, "text": "this point turns too fast"},
{"time": 45.0, "text": "who understands", "style": "fade-in"}.
Each JSON object represents a bullet-screen event and includes the point in time (in seconds) at which the event occurs, the text content of the bullet screen, and optionally a specific display style (such as an emphasis or fade-in effect).
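The timeline modeling of bullet-screen events can be sketched as follows; the normalization rules (required fields, default style) are illustrative assumptions:

```python
def normalize_events(raw_events: list) -> list:
    """Validate bullet-screen JSON-Events and sort them on the timeline so
    the packaged track plays back strictly in display order."""
    events = []
    for ev in raw_events:
        if "time" not in ev or "text" not in ev:
            raise ValueError(f"malformed bullet-screen event: {ev!r}")
        events.append({
            "time": float(ev["time"]),            # display time in seconds
            "text": ev["text"],                   # bullet-screen text content
            "style": ev.get("style", "default"),  # presentation style
        })
    return sorted(events, key=lambda e: e["time"])
```

Sorting before packaging keeps the client's renderer free of random-access lookups during playback.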
Step2, track encapsulation and media file updating;
1. Package the track file into the existing MP4 container.
The track file is encapsulated together with the existing media file (main video + audio track) using an ffmpeg encapsulation command to form an updated media file.
An example encapsulation command of ffmpeg:
ffmpeg -i main.mp4 -i new_danmaku_track.json \
-map 0 -map 1 \
-c copy -metadata:s:1 handler_name="danmaku_beta" \
-movflags use_metadata_tags \
output_with_danmaku.mp4
-i main.mp4 and -i new_danmaku_track.json specify the main video file and the JSON description file of the new bullet-screen track as inputs, respectively.
The -map 0 and -map 1 parameters tell ffmpeg which streams to select from the first and second inputs for output; here they mean that all streams of main.mp4 are kept and new_danmaku_track is added as an extra track.
-c copy means keeping the video and audio encoding unchanged and copying the streams directly into the output file, avoiding quality and performance losses caused by re-encoding.
-metadata:s:1 handler_name="danmaku_beta" adds a metadata tag to the newly added bullet-screen track, indicating that it is a bullet-screen track with a specific handler name (handler_name) for identification and control when the client parses the file.
-movflags use_metadata_tags ensures that ffmpeg uses metadata tags to correctly create the metadata portion of the output file; this is particularly important for multi-track encapsulation because it helps the client know how to process each track.
output_with_danmaku.mp4 is the name of the final packaged media file, which contains the original video and audio information as well as the newly added bullet-screen track data.
Through the above steps, rich media resources can be updated dynamically without a full App update, greatly improving the flexibility and efficiency of content updates and ensuring continuity and improvement of the user experience.
Optionally, after the third track file is packaged into the target container to generate the updated media file, generating an updated resource list file according to meta information and control logic of the third track file, sending the updated media file and the updated resource list file to a client so that the client loads the multimedia resource according to the updated media file and loads the updated rich media resource according to the updated media file and the updated resource list file.
The third track file contains newly added or updated rich media resource track data, including meta information such as the type, display time, duration, position, special effect parameters, priority, dependency relationship and the like of the resource. The resource manifest file is responsible for describing the meta information and control logic of all tracks, so that after updating the third track file, a resource manifest file of the latest resource status is generated, i.e. the updated resource manifest file.
The updated media file contains all multimedia resources and the newly added third track file, and the updated resource manifest file is a comprehensive description and control instruction for all available tracks. These two files need to be sent to the client together to ensure that the client can play according to the latest resource data and control logic.
After receiving the updated media file and resource manifest file, the client first parses the resource manifest file to see which resource tracks are available, and their control logic. Then, the client loads corresponding track data according to the indication of the manifest file. And for the updated rich media resource, the client loads and renders according to the display time and the control logic calculated in real time, so as to ensure accurate synchronization with the video content.
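The client-side selection logic described above can be sketched as follows; the condition field names (user_tag, feature_flag) and the shape of the client context are illustrative assumptions, not a fixed schema:

```python
def select_tracks(manifest: dict, client: dict) -> list:
    """Evaluate each track's enable conditions against the client context
    and return the track ids to load, ordered by priority (a smaller value
    loads first)."""
    chosen = []
    for track in manifest.get("tracks", []):
        cond = track.get("conditions", {})
        # Skip tracks gated on a user tag the client does not carry.
        if cond.get("user_tag") and cond["user_tag"] not in client.get("tags", []):
            continue
        # Skip tracks gated on a feature flag that is not switched on.
        if cond.get("feature_flag") and cond["feature_flag"] not in client.get("flags", []):
            continue
        chosen.append(track)
    return [t["id"] for t in sorted(chosen, key=lambda t: t.get("priority", 99))]
```

A beta user with the flag enabled receives the experimental track; every other client falls back gracefully to the baseline tracks.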
According to the embodiment of the application, the third track file and the resource list file in the media container are synchronously updated, so that the instant update and dynamic control of the resources are realized, the strong and flexible content management capability is provided for an online video platform, and the user experience and the resource distribution efficiency are obviously improved.
Optionally, a new track is added to the resource control list by:
Step1, defining meta information and control information of a new track;
Basic properties of the new track and control logic thereof are determined. Specifically, the method comprises the following key fields:
{"id": "track_danmaku_beta",
"type": "danmaku",
"codec": "JSON-Event",
"priority": 3,
"conditions": {…},
"render_layer": …,
"dependencies": ["track_main_video"]}.
where id: "track_danmaku_beta" is the unique identifier of the new track, used to distinguish and reference different resource tracks.
type: "danmaku" identifies the type of resource carried by the track, here a bullet screen.
codec: "JSON-Event" describes the encoding format or data structure of the resource; here the bullet-screen data is stored in JSON-Event form.
priority: 3 defines the priority of the track; the smaller the value, the higher the priority, which affects the loading order and display level of the resource.
conditions sets the conditions for enabling the track, e.g. open only to "beta" test users, with the "enable_danmaku_v2" feature flag turned on in the system.
render_layer determines the display layer of the resource in the playback picture, ensuring correct superposition of resources.
dependencies: ["track_main_video"] lists other tracks that must be loaded before this track renders, ensuring that the main video track is loaded and prepared before the bullet-screen track.
Step 2, integrating new track information into a resource control list;
The manifest is typically a structured file, such as JSON format, that describes the meta-information and control logic of all tracks in the media container. The process of adding new track information in the manifest includes:
1. opening a current resource control manifest file:
The currently stored resource control list is read, which may be a JSON-format file containing the description information of all known tracks.
2. Adding a new track description to the manifest:
The newly defined track information is added at the appropriate place in the manifest (such as the tracks array), in the full form described above for "track_danmaku_beta". Ensure the structural integrity of the manifest and correct field formats so that the client can parse it successfully.
3. And storing the updated resource control list:
and storing the modified list file to ensure that the newly added track information is durable and can be used for subsequent distribution and client reading.
Through the above steps, the resource control manifest is updated to contain the newly added bullet-screen track information, providing a basis for further distribution and for dynamic selection and loading of resources by clients. According to the updated manifest file, the client can intelligently decide whether to load the track_danmaku_beta track and how to display it according to its control logic, thereby adding a playback function or content without an App upgrade.
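Steps 1 to 3 above can be sketched as follows (a minimal illustration; the manifest layout with a top-level tracks array follows the description above, and the duplicate-id check is an added safeguard):

```python
import json

def add_track_to_manifest(manifest_path: str, track_entry: dict) -> dict:
    """Open the current resource-control manifest (step 1), append the new
    track description to its tracks array (step 2), and persist the updated
    manifest for distribution and client reading (step 3)."""
    with open(manifest_path, "r", encoding="utf-8") as f:
        manifest = json.load(f)
    tracks = manifest.setdefault("tracks", [])
    if any(t.get("id") == track_entry.get("id") for t in tracks):
        raise ValueError(f"track {track_entry.get('id')} already exists")
    tracks.append(track_entry)
    with open(manifest_path, "w", encoding="utf-8") as f:
        json.dump(manifest, f, ensure_ascii=False, indent=2)
    return manifest
```

Writing the file back atomically (e.g. via a temp file and rename) would be a further hardening step in production.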
Optionally, generating a resource manifest file according to the control logic of the first track file comprises obtaining meta information of the first track file and writing the meta information into a first field in the resource manifest file, and determining the control logic according to the resource type of the rich media resource and writing the control logic into a second field in the resource manifest file.
Meta information includes, but is not limited to, track ID, resource type, codec type, priority, display time range, etc. This portion of information is critical to the client's identification and processing of particular resources. Meta-information is extracted from the first track file (e.g., packaged bullet screen, special effects track file), which can be parsed out and prepared to be written to the first field of the resource manifest file using a specialized media file parsing library (e.g., FFmpeg, gst-Parse, etc.) or custom parsing script.
The first field generally refers to a portion that directly describes some track resource base information, such as each item in the tracks array in the manifest. Different types of rich media resources (e.g., barrages, special effects, subtitles, etc.) may have different control logic, e.g., the barrages may need to take into account user identity, play speed adjustment, display density control, and special effects may involve user device capability and version compatibility decisions. And determining a control logic process, namely determining corresponding enabling rules, display priorities, rendering orders and dependency relationships according to the characteristics of the resource types and the service policies of the platform. The second field may contain conditional expressions, prioritization, dependency declarations, etc. By writing the control logic information into the second field, the client can acquire the resource processing guide while receiving the media container file, thereby realizing intelligent resource scheduling and rendering control.
Alternatively, the following is a main structure and field description of the manifest file, which may be packaged in JSON, YAML or binary metadata formats:
{"tracks": [{
"id": …,
"type": …,
"codec": …,
"priority": …,
"conditions": …,
"render_layer": …,
"dependencies": …
}]}
id: the unique identifier of the track, used for internal scheduling and reference so that the client can accurately identify and control each track.
type: used to classify and manage different types of resources such as audio/video, subtitles, bullet screens, maps and special effects, enabling differentiated processing and rendering.
codec: the decoding format or resource packaging sub-format, defining how the resource is decoded and rendered and ensuring a match with the client's decoding capability.
priority: the rendering priority; the smaller the value, the higher the priority, used to control the loading order of resources and resolve resource conflicts.
conditions: control rules, including device type, client version, user tags, feature switches, etc., used to decide under what conditions a particular track is loaded and shown.
render_layer: the rendering layer of the resource on the playback interface, ensuring correct superposition of resources, e.g. on top of the main picture, the subtitle layer, the special-effect layer, etc.
dependencies: indicates which tracks need to be loaded in advance before this resource renders, enabling coordinated display among resources.
The resource manifest file not only describes static information of the resource, but also supports the behavior of dynamically controlling the resource, and the control logic range includes:
1. load behavior control:
The appropriate track is intelligently selected for loading based on the client device type (mobile, tv, pc).
The rollback policy is automatically executed according to the minimum version requirements of the client, and the low version client may skip incompatible tracks to maintain the base playback function.
Through a special function switch (feature_flag), an experimental or special function track is enabled, and the gray release and A/B test of the function are realized.
2. Rendering priority control:
The priority of the map class resource is set lower than the subtitle and main video to avoid overlaying important information.
The priority of the bullet screen track is adjusted according to the subscription state of the user, for example, the VIP user can enjoy higher definition or more timely bullet screen display.
3. Time range and trigger mechanism:
And limiting the playing time range of the track, and avoiding the resource from interfering with the viewing experience at an unexpected time point.
And setting a condition triggering mechanism, such as playing to a certain time point to automatically load a specific track, so as to realize accurate resource scheduling.
4. Multi-track mutual exclusion and combining strategy:
In some cases, mutually exclusive relationships between tracks are defined, preventing resource conflicts, such as advanced filters and certain interactive instructions, from being available at the same time.
And loading and rendering the intelligent scheduling track through the dependency tree or the combined scene, so as to realize complex resource combination and hierarchical display.
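The dependency-tree scheduling can be sketched as a topological sort over the declared dependencies (an illustrative minimal implementation):

```python
def load_order(tracks: list) -> list:
    """Resolve declared dependencies into a loading order via topological
    sort, so that e.g. the main video track is prepared before any overlay
    track that depends on it. Cycles are reported as errors."""
    by_id = {t["id"]: t for t in tracks}
    order, done = [], set()

    def visit(tid, chain=()):
        if tid in done:
            return
        if tid in chain:
            raise ValueError(f"dependency cycle at {tid}")
        for dep in by_id[tid].get("dependencies", []):
            visit(dep, chain + (tid,))
        done.add(tid)
        order.append(tid)

    for t in tracks:
        visit(t["id"])
    return order
```

Mutual-exclusion rules would be checked after ordering, by rejecting any pair of selected tracks declared incompatible.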
Through the integrated resource list file structure and the control logic, unified packaging, dynamic scheduling and cross-platform consistent playing of video streaming media resources are realized, and the efficiency and user experience of resource management are greatly improved.
In this embodiment, a method for loading media resources is further provided, and the method is applied to a client, and fig. 3 is a flowchart (second) of a method for loading media resources according to an embodiment of the present application, as shown in fig. 3, where the flowchart includes the following steps:
Step S302, receiving a media file and a resource list file sent by a service device, wherein a cloud platform generates a first track file corresponding to a rich media resource and a second track file of the multimedia resource, packages the first track file and the second track file into a target container to generate the media file, and generates the resource list file according to control logic of the first track file, wherein the rich media resource is a resource which is overlapped on an interface corresponding to the multimedia resource;
Step S304, loading the multimedia resource according to the media file, and loading the rich media resource according to the media file and the resource list file.
In one exemplary embodiment, loading the multimedia asset according to the media file and the rich media asset according to the media file and the asset manifest file includes parsing the asset manifest file to determine control logic for the first track file and parsing the media file to obtain the first track file and the second track file, decoding the first track file and the second track file to obtain the multimedia asset and the rich media asset, loading the multimedia asset to the client through a target interface and loading the rich media asset to the client according to the control logic.
In an exemplary embodiment, after the multimedia resource is loaded according to the media file and the rich media resource is loaded according to the media file and the resource manifest file, the method further comprises: when an updated media file and an updated resource manifest file are received, parsing the updated media file to obtain the first track file, the second track file and a third track file, where the third track file is the track file corresponding to the updated rich media resource; judging whether the client meets a preset condition for rendering the third track file; when the client meets the preset condition, loading the multimedia resource according to the updated media file and loading the updated rich media resource according to the updated media file and the updated resource manifest file; and when the client does not meet the preset condition, loading the multimedia resource according to the updated media file and loading the original rich media resource according to the updated media file and the resource manifest file.
In the embodiment of the application, the client reads the resource manifest at startup and parses it to judge whether track_danmaku_beta should be loaded. If the user meets the conditions, the track is loaded and rendered; otherwise the old bullet-screen track is loaded by default or the track is skipped.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, but of course also by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present application.
The embodiment also provides a device for loading media resources, which is used for implementing the foregoing embodiments and preferred embodiments, and is not described in detail. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 4 is a block diagram (a) of a media resource loading apparatus according to an embodiment of the present application, which is applied to a service device, as shown in fig. 4, and includes:
A first generating module 42, configured to generate a first track file corresponding to a rich media resource and a second track file of a multimedia resource, and encapsulate the first track file and the second track file into a target container to generate a media file, where the rich media resource is a resource that is superimposed on an interface corresponding to the multimedia resource;
a second generating module 44, configured to generate a resource manifest file according to the control logic of the first track file;
a sending module 46, configured to send the media file and the resource manifest file to a client, so that the client loads the multimedia resource according to the media file, and loads the rich media resource according to the media file and the resource manifest file.
According to the device, a first track file corresponding to the rich media resource and a second track file of the multimedia resource are generated, the first track file and the second track file are packaged into a target container to generate the media file, wherein the rich media resource is a resource which is overlapped on an interface corresponding to the multimedia resource, a resource list file is generated according to control logic of the first track file, the media file and the resource list file are sent to a client side, so that the client side loads the multimedia resource according to the media file, and loads the rich media resource according to the media file and the resource list file.
In an exemplary embodiment, the first generating module 42 is configured to determine attribute information of each sub-resource in the rich media resource, generate a track event of each sub-resource according to the attribute information of each sub-resource, and generate a first track file of the rich media resource according to the display time of each sub-resource and the track event of each sub-resource.
In an exemplary embodiment, the first generating module 42 is configured to convert the first track file into a first data file in a target format, convert the second track file into a second data file in a target format, and write the first data file and the second data file into target areas of the target container, respectively, and execute a target packaging command to package the target containers written into the first data file and the second data file into the media file.
In an exemplary embodiment, a first generating module 42 is configured to determine an update status of the rich media resource, generate a third track file according to the updated rich media resource if the update status of the rich media resource indicates that the rich media resource is updated, and package the third track file into a target container to generate an updated media file.
In an exemplary embodiment, the second generating module 44 is configured to generate an updated resource list file according to meta information and control logic of the third track file, send the updated media file and the updated resource list file to a client, so that the client loads the multimedia resource according to the updated media file, and load the updated rich media resource according to the updated media file and the updated resource list file.
In an exemplary embodiment, the second generating module 44 is configured to obtain meta information of the first track file and write the meta information into a first field in the resource manifest file, and determine the control logic according to the resource type of the rich media resource, and write the control logic into a second field in the resource manifest file.
Fig. 5 is a block diagram (two) of a media resource loading device according to an embodiment of the present application, applied to a client, as shown in fig. 5, the device includes:
The receiving module 52 is configured to receive a media file and a resource list file sent by a service device, where the cloud platform generates a first track file corresponding to a rich media resource and a second track file of a multimedia resource, encapsulates the first track file and the second track file into a target container, so as to generate a media file, and generates a resource list file according to control logic of the first track file, where the rich media resource is a resource that is superimposed on an interface corresponding to the multimedia resource, and the multimedia resource at least includes one of a video resource and an audio resource;
a loading module 54, configured to load the multimedia resource according to the media file, and load the rich media resource according to the media file and the resource manifest file.
In an exemplary embodiment, the loading module 54 is configured to: when an updated media file and an updated resource manifest file are received, parse the updated media file to obtain the first track file, the second track file and a third track file, where the third track file is the track file corresponding to the updated rich media resource; judge whether the client meets a preset condition for rendering the third track file; when the client meets the preset condition, load the multimedia resource according to the updated media file and load the updated rich media resource according to the updated media file and the updated resource manifest file; and when the client does not meet the preset condition, load the multimedia resource according to the updated media file and load the original rich media resource according to the updated media file and the resource manifest file.
In an exemplary embodiment, the loading module 54 is configured to parse the resource manifest file to determine control logic of the first track file, parse the media file to obtain the first track file and the second track file, decode the first track file and the second track file to obtain the multimedia resource and the rich media resource, load the multimedia resource to the client through a target interface, and load the rich media resource to the client according to the control logic.
It should be noted that each of the above modules may be implemented by software or hardware, and the latter may be implemented by, but not limited to, the above modules all being located in the same processor, or each of the above modules being located in different processors in any combination.
Embodiments of the present application also provide a computer readable storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
In an exemplary embodiment, the computer-readable storage medium may include, but is not limited to, a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or any other medium capable of storing a computer program.
An embodiment of the application also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
In an exemplary embodiment, the electronic device may further include a transmission device connected to the processor, and an input/output device connected to the processor.
Embodiments of the application also provide a computer program product comprising a computer program which, when executed by a processor, implements the steps of any of the method embodiments described above.
Embodiments of the present application also provide another computer program product comprising a non-volatile computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of any of the method embodiments described above.
Embodiments of the present application also provide a computer program comprising computer instructions stored on a computer readable storage medium, a processor of a computer device reading the computer instructions from the computer readable storage medium, the processor executing the computer instructions to cause the computer device to perform the steps of any of the method embodiments described above.
Specific examples in this embodiment may refer to the examples described in the foregoing embodiments and the exemplary implementation, and this embodiment is not described herein.
It will be appreciated by those skilled in the art that the modules or steps of the application described above may be implemented in a general purpose computing device, they may be concentrated on a single computing device, or distributed across a network of computing devices, they may be implemented in program code executable by computing devices, so that they may be stored in a storage device for execution by computing devices, and in some cases, the steps shown or described may be performed in a different order than that shown or described herein, or they may be separately fabricated into individual integrated circuit modules, or multiple modules or steps of them may be fabricated into a single integrated circuit module. Thus, the present application is not limited to any specific combination of hardware and software.
The above description is only of the preferred embodiments of the present application and is not intended to limit the present application, but various modifications and variations can be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method for loading media resources, characterized in that it is applied to a service device and comprises:
generating a first track file corresponding to a rich media resource and a second track file for a multimedia resource, and encapsulating the first track file and the second track file into a target container to generate a media file, wherein the rich media resource is a resource superimposed on an interface corresponding to the multimedia resource;
generating a resource manifest file according to control logic of the first track file;
sending the media file and the resource manifest file to a client, so that the client loads the multimedia resource according to the media file and loads the rich media resource according to the media file and the resource manifest file.

2. The method according to claim 1, characterized in that generating the first track file corresponding to the rich media resource comprises:
determining attribute information of each sub-resource in the rich media resource;
generating a track event for each sub-resource according to the attribute information of that sub-resource; and generating the first track file of the rich media resource according to the display time and the track event of each sub-resource.

3. The method according to claim 1, characterized in that encapsulating the first track file and the second track file into the target container to generate the media file comprises:
converting the first track file into a first data file in a target format, converting the second track file into a second data file in the target format, and writing the first data file and the second data file into respective target areas of the target container;
executing a target encapsulation command to encapsulate the target container containing the first data file and the second data file into the media file.

4. The method according to claim 1, characterized in that, after sending the media file and the resource manifest file to the client, the method further comprises:
determining an update status of the rich media resource;
in a case where the update status indicates that the rich media resource has been updated, generating a third track file according to the updated rich media resource;
encapsulating the third track file into the target container to generate an updated media file.

5. The method according to claim 4, characterized in that, after encapsulating the third track file into the target container to generate the updated media file, the method further comprises:
generating an updated resource manifest file according to meta information and control logic of the third track file;
sending the updated media file and the updated resource manifest file to the client, so that the client loads the multimedia resource according to the updated media file and loads the updated rich media resource according to the updated media file and the updated resource manifest file.

6. The method according to claim 1, characterized in that generating the resource manifest file according to the control logic of the first track file comprises:
obtaining meta information of the first track file, and writing the meta information into a first field of the resource manifest file; and
determining the control logic according to a resource type of the rich media resource, and writing the control logic into a second field of the resource manifest file.

7. A method for loading media resources, characterized in that it is applied to a client and comprises:
receiving a media file and a resource manifest file sent by a service device, wherein a cloud platform generates a first track file corresponding to a rich media resource and a second track file for a multimedia resource, encapsulates the first track file and the second track file into a target container to generate the media file, and generates the resource manifest file according to control logic of the first track file; the rich media resource is a resource superimposed on an interface corresponding to the multimedia resource, and the multimedia resource comprises at least one of the following: a video resource, an audio resource;
loading the multimedia resource according to the media file, and loading the rich media resource according to the media file and the resource manifest file.

8. The method according to claim 7, characterized in that, after loading the multimedia resource according to the media file and loading the rich media resource according to the media file and the resource manifest file, the method further comprises:
in a case where an updated media file and an updated resource manifest file are received, parsing the updated media file to obtain the first track file, the second track file, and a third track file, wherein the third track file is a track file corresponding to an updated rich media resource;
determining whether the client meets a preset condition for rendering the third track file;
in a case where the client meets the preset condition for rendering the third track file, loading the multimedia resource according to the updated media file and loading the updated rich media resource according to the updated media file and the updated resource manifest file; in a case where the client does not meet the preset condition for rendering the third track file, loading the multimedia resource according to the updated media file and loading the rich media resource according to the updated media file and the updated resource manifest file.

9. A device for loading media resources, characterized in that it is applied to a service device and comprises:
a first generation module, configured to generate a first track file corresponding to a rich media resource and a second track file for a multimedia resource, and to encapsulate the first track file and the second track file into a target container to generate a media file, wherein the rich media resource is a resource superimposed on an interface corresponding to the multimedia resource;
a second generation module, configured to generate a resource manifest file according to control logic of the first track file;
a sending module, configured to send the media file and the resource manifest file to a client, so that the client loads the multimedia resource according to the media file and loads the rich media resource according to the media file and the resource manifest file.

10. A device for loading media resources, characterized in that it is applied to a client and comprises:
a receiving module, configured to receive a media file and a resource manifest file sent by a service device, wherein a cloud platform generates a first track file corresponding to a rich media resource and a second track file for a multimedia resource, encapsulates the first track file and the second track file into a target container to generate the media file, and generates the resource manifest file according to control logic of the first track file; the rich media resource is a resource superimposed on an interface corresponding to the multimedia resource, and the multimedia resource comprises at least one of the following: a video resource, an audio resource;
a loading module, configured to load the multimedia resource according to the media file, and to load the rich media resource according to the media file and the resource manifest file.
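The client-side behavior sketched in claims 7 and 8 — parsing the container back into its tracks, then falling back to the original rich media track when the client does not meet the preset condition for rendering the updated (third) track — might look like the following Python sketch. The length-prefixed `MBOX` container layout and the manifest fields are hypothetical illustrations, not the application's actual formats, and the preset rendering condition is reduced to a boolean flag for brevity.

```python
import json
import struct

def parse_media_file(data: bytes) -> list[bytes]:
    """Split a hypothetical length-prefixed container (4-byte magic, then
    4-byte big-endian length + payload per track) into track payloads."""
    assert data[:4] == b"MBOX", "unknown container"
    tracks, offset = [], 4
    while offset < len(data):
        (length,) = struct.unpack(">I", data[offset:offset + 4])
        tracks.append(data[offset + 4:offset + 4 + length])
        offset += 4 + length
    return tracks

def load_resources(data: bytes, manifest_json: str,
                   can_render_update: bool) -> dict:
    """Claim-8-style fallback: use the updated (third) rich media track only
    if the client can render it; otherwise keep the original first track.
    Track order assumed here: [rich_media, multimedia, updated_rich_media]."""
    tracks = parse_media_file(data)
    manifest = json.loads(manifest_json)
    use_update = can_render_update and len(tracks) > 2
    rich_track = tracks[2] if use_update else tracks[0]
    return {"multimedia": tracks[1],
            "rich_media": rich_track,
            "control": manifest["control"]}
```

The design point worth noting is that the fallback never blocks playback: the multimedia track is loaded from the (updated) media file either way, and only the overlay track degrades to its previous version.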
CN202511057024.5A 2025-07-29 2025-07-29 Method and apparatus for loading media resources Pending CN121012818A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202511057024.5A CN121012818A (en) 2025-07-29 2025-07-29 Method and apparatus for loading media resources

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202511057024.5A CN121012818A (en) 2025-07-29 2025-07-29 Method and apparatus for loading media resources

Publications (1)

Publication Number Publication Date
CN121012818A true CN121012818A (en) 2025-11-25

Family

ID=97730994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202511057024.5A Pending CN121012818A (en) 2025-07-29 2025-07-29 Method and apparatus for loading media resources

Country Status (1)

Country Link
CN (1) CN121012818A (en)

Similar Documents

Publication Publication Date Title
US11647071B2 (en) Method and apparatus for transmitting and receiving content
US11070893B2 (en) Method and apparatus for encoding media data comprising generated content
US20160182593A1 (en) Methods, devices, and computer programs for improving coding of media presentation description data
US11356749B2 (en) Track format for carriage of event messages
US9106935B2 (en) Method and apparatus for transmitting and receiving a content file including multiple streams
JP2021510047A (en) Synchronous playback method of media files, equipment and storage media
CN116962756A (en) Method, device, equipment and storage medium for processing immersion medium
CN114598895B (en) Audio and video processing method, device, equipment and computer readable storage medium
CN121012818A (en) Method and apparatus for loading media resources
US12250359B2 (en) Multi-track based immersive media playout
US12401865B2 (en) Method and apparatus for encapsulation of media data in a media file
JP2021508977A (en) Metadata container analysis method, equipment and storage medium
CN114554256B (en) Media stream playback, media stream processing method, device, equipment and storage medium
US20250317612A1 (en) Method, device, and computer program for optimizing dynamic encapsulation and parsing of content data
Chernyshev Library for Remote Copying of Video File Fragments
KR101310894B1 (en) Method and apparatus of referencing stream in other SAF session for LASeR service and apparatus for the LASeR service
CN113784150A (en) Video data distribution method and device, electronic equipment and storage medium
HK40070809B (en) Processing method of immersive media, device, equipment and storage medium
CN121037587A (en) A method and corresponding equipment for dynamic conversion of HLS resources based on MP4 metadata parsing
HK40070809A (en) Processing method of immersive media, device, equipment and storage medium
CN119109913A (en) Multimedia stream processing method, device, equipment, storage medium and program product
GB2634576A (en) Method, device, and computer program for signaling hidden toplevel boxes in an encapsulated media data file
GB2620651A (en) Method, device, and computer program for optimizing dynamic encapsulation and parsing of content data
CN117156087A (en) Video recording method, device, equipment and storage medium based on webpage player
CN118803300A (en) Video processing method, system, device, storage medium and product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination