CN114339289B - Video playing processing method - Google Patents
- Publication number
- CN114339289B (application CN202111660904.3A)
- Authority
- CN
- China
- Prior art keywords
- data
- video
- audio
- playing
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Television Signal Processing For Recording (AREA)
Abstract
The application discloses a video playing processing method, which comprises the following steps: a playing terminal accesses a server that provides a video source and pulls the audio and video data to the local device; the audio and video data are de-encapsulated and decoded according to the file encapsulation format agreed in the protocol; the decoded raw audio and video data are played synchronously; the de-encapsulated audio and video encoded data frames are backed up; the decoded image data of the video key frames and the associated timestamp information are retained; the mapping relation between each key frame and the audio and video encoded data frames is recorded; while the main picture is played in real time, the key frame pictures that have already been played are displayed beside the main picture; the user browses and selects a key frame picture of interest, the corresponding backup audio and video data frames are located, and the audio and video frame sequence starting from the selected key frame is taken as the data to be played back; a new video playing window is opened beside the live main picture, and the playback data are played with the backup audio and video data to be played back as the data source.
Description
Technical Field
The application belongs to the technical field of video playing, and particularly relates to a video playing processing method.
Background
With the development of the internet, video production, distribution and playing technologies have advanced greatly; live video streaming has become popular, and its content and forms are increasingly diverse. However, if a viewer wants to review part of a live broadcast while it is still running, the viewer can only trace back over the already-played portion through the time-shift function of the playing platform. Many platforms do not provide time-shifting at all, and even when it is provided, the user either skips the live data broadcast during the review period in order to catch up with the live progress, or watches the reviewed content first and the remaining content afterwards; in other words, the real-time nature of the live broadcast is sacrificed.
By contrast, professionally produced live content, such as that of CCTV (China Central Television), can offer real-time playback of selected segments through the powerful software and hardware of its platform, but this has two drawbacks: first, the content is limited to whatever the platform itself broadcasts; second, the segments offered for playback are not necessarily the ones a particular viewer actually finds interesting.
Disclosure of Invention
In view of the above problems, the embodiments of the present application provide a video playing processing method for real-time navigation and backtracking of live video content, so that a user can select video clips of interest for playback and sharing at any time while watching a live broadcast, without affecting the progress or content of the live broadcast itself.
In order to solve the technical problems, the application adopts the following technical scheme:
in a first aspect, an embodiment of the present application provides a method for video playing processing, including:
the playing terminal accesses a server providing a video source and pulls the audio and video data to the local device; the audio and video data are de-encapsulated and decoded according to the file encapsulation format agreed in the protocol, wherein the de-encapsulated audio and video data are audio and video encoded data frames, and decoding the encoded data yields the corresponding raw audio and video data that can be rendered; the decoded raw audio and video data are played synchronously;
the de-encapsulated audio and video encoded data frames are backed up, including the audio and video data content and the timestamp information associated with each frame; the decoded image data of the video key frames and the associated timestamp information are retained; the mapping relation between each decoded key frame image and the backed-up audio and video encoded frame data is recorded; and, while the main picture is played in real time, the key frame information that has already been played is presented beside the main picture;
the user browses and selects a key frame picture of interest; according to the association between the selected key frame and the backed-up audio and video encoded frame sequence, the corresponding backup audio and video data frames are located, and the audio and video frame sequence starting from the selected key frame is taken as the data to be played back; a new video playing window is opened beside the live main picture, and the playback data are played with the backup audio and video data to be played back as the data source.
In one possible design of the first aspect, the method further comprises: when the user plays back, synchronous editing of the playback content is started, and the user's playback control actions are recorded and handled accordingly.
In one possible design of the first aspect, recording the user's playback control actions and handling them accordingly includes: for content played at a variable speed, rewriting the timestamps of the associated video frame data by increasing or decreasing the time interval between data frames according to the actual playback speed.
In one possible design of the first aspect, recording the user's playback control actions and handling them accordingly includes: for content played with jumps, discarding the skipped data, continuing the timestamps from the frame jumped to, and ensuring that the timestamps remain continuous.
In one possible design of the first aspect, the video frames with rewritten timestamps are data-encapsulated.
In one possible design of the first aspect, the method further comprises adding audio data from the backup data source, or data from a user-selected sound file, as the audio content.
In one possible design of the first aspect, when the decoded image data of the video key frames and the associated timestamp information are retained, the key frame image sequence is reduced to a sequence of small images for storage.
In a second aspect, an embodiment of the present application provides a computer device comprising at least a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the video playing processing method described in any one of the above.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the video playing processing method described in any one of the above.
The application has the following beneficial effects:
(1) Live broadcast data are cached and managed in real time, and split-screen playback keeps the original playing progress of the live main window, so the real-time nature of playback is not sacrificed while the user can still replay the pictures they are interested in;
(2) The playback window provides volume control and muting, making it convenient for the user to choose which audio source to hear and to control playback in both the live main window and the playback window;
(3) The playback window provides variable-speed playback (fast forward/fast rewind), playback jumping and similar functions, making it convenient for the user to control the playback rhythm and progress as needed;
(4) A quick and convenient way of editing and sharing live video is provided: the playback process itself is the video editing process, so the user needs no dedicated or complicated editing operations; what the user sees can be shared, the content of interest is recorded and shared in real time, and neither playback nor sharing is delayed.
Drawings
FIG. 1 is a flowchart illustrating a video playing processing method according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating a video playing processing method according to another embodiment of the present application;
FIG. 3 is a flowchart illustrating a video playing processing method according to another embodiment of the present application;
FIG. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is evident that the described embodiments are only some, not all, of the embodiments of the application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort fall within the scope of the application.
Referring to fig. 1, a video playing processing method according to an embodiment of the present application includes:
s10, a playing terminal accesses a server for providing a video source and pulls audio and video data to a local place; decapsulating and decoding the audio and video data according to a file encapsulation format agreed by a protocol, wherein the decapsulated audio and video encoded data frames are decoded into corresponding audio and video bare data for rendering; synchronously playing the decoded audio and video bare data;
s20, reserving data of the unpacked audio and video coding data frame, wherein the data comprises audio and video data content and time stamp information associated with the data; image data after video key frame decoding and associated timestamp information are reserved; recording the mapping relation of the decoded key frame image and the data of the audio and video coding frame; the method comprises the steps that when a main picture is played in real time, key frame information which is already played is presented beside the main picture; the backup key frame image data is displayed as a navigation bar for a user to select as a playback starting point, and the backup audio and video data coding frame is found according to the timestamp associated with the selected key frame picture, namely the video coding frame at the playback starting point is the coding frame where the key frame picture is located, and the playback can be quickly decoded by utilizing the characteristic that the key frame can be independently decoded. The purpose of the backup of the audio and video coding frames is to reduce the storage space, and if the storage is not limited, the backup of bare data is also possible.
A key frame, i.e. an I frame, is a complete picture in video coding and can be encoded and decoded independently of other pictures; the other related frame types are P frames (forward-predicted frames) and B frames (bi-directionally predicted frames). P frames and B frames only record changes relative to the I frame, so without the I frame they cannot be decoded.
S30, the user browses and selects a key frame picture of interest; according to the association between the key frame image and the backed-up audio and video encoded frame sequence, the corresponding backup audio and video data frames are located, and the audio and video frame sequence starting from the selected key frame is taken as the data to be played back; a new video playing window is opened beside the live main picture, and the playback data are played with the backup audio and video data to be played back as the data source.
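Building on the hypothetical BackupStore above, a minimal sketch of the S30 lookup from the selected key frame to the playback data could look as follows.

```python
def frames_from_keyframe(store: BackupStore, selected_pts: int) -> list:
    # map the selected key frame picture (identified by its timestamp)
    # back to its position among the backed-up encoded frames
    start = store.keyframe_index[selected_pts]
    # the sequence starting at the key frame is handed to the second playing
    # window; decoding can begin immediately because an I frame needs no
    # earlier reference frames
    return store.frames[start:]
```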
In S20, the decoded image data of the key frames is retained together with the associated timestamp information, and the key frame image sequence is optionally reduced to a sequence of small images before being stored. This is done to account for limited storage space: the key frames are converted into low-resolution, low-quality thumbnails (for example, reducing a frame from 1080×720 pixels to 320×240 pixels), which greatly reduces the storage footprint.
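A sketch of this thumbnail reduction, assuming Pillow is available and that the key frames were decoded to PIL images (for example via frame.to_image() in the earlier PyAV sketch); the 320×240 target size is the example from the text.

```python
from PIL import Image

def to_thumbnail(keyframe_image: Image.Image, size=(320, 240)) -> Image.Image:
    # down-scale the decoded key frame to a small, low-cost navigation picture
    return keyframe_image.resize(size)
```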
On the basis of the foregoing embodiment, a video playing processing method according to still another embodiment of the present application further includes: when the user plays back, synchronous editing of the playback content is started, the user's playback control actions are recorded, and corresponding processing is applied, which may include the following: for content played at constant speed, the video data from the backup data source and the associated timestamps remain unchanged; for content played at a variable speed, the timestamps of the associated video frames are rewritten by increasing or decreasing the time interval between data frames according to the actual playback speed; for content played with jumps, the skipped data are discarded, the timestamps continue from the frame jumped to, and timestamp continuity is ensured; the video frames with rewritten timestamps are then data-encapsulated. Audio data from the backup data source, or data from a user-selected sound file, may also be added as the audio content.
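The timestamp handling above can be sketched as follows; the frame objects, the set of skipped timestamps and the single-pass rewrite are illustrative assumptions rather than the patent's exact procedure.

```python
def rewrite_timestamps(frames, speed=1.0, skipped=frozenset()):
    """Rewrite pts so that variable-speed playback stretches or shrinks the
    frame intervals and skipped frames are dropped without leaving a gap."""
    out, new_pts, prev_orig = [], 0, None
    for f in frames:
        delta = 0 if prev_orig is None else f.pts - prev_orig
        prev_orig = f.pts
        if f.pts in skipped:
            continue                          # jumped-over content is discarded
        new_pts += int(delta / speed)         # speed > 1 shortens, speed < 1 lengthens
        out.append((new_pts, f))              # timeline stays continuous after skips
    return out
```

At a speed of 1.0 with no skipped frames, the relative spacing of the original timestamps is preserved, which matches the constant-speed case described above.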
The implementation of the embodiments of the present application will be further described below by way of several specific application examples.
Referring to fig. 2, a video playing processing method according to an embodiment of the present application includes:
prepare a playing terminal with internet access, such as a computer, mobile phone, tablet or set-top box; select a live video source to be played, pull the audio and video data in real time from the server hosting the video source according to ordinary playback logic, then de-encapsulate, decode and play them synchronously.
For the de-encapsulated data, a backup containing the associated timestamp information is retained; the decoded and played video key frame images can optionally be converted into low-resolution, low-quality thumbnails (for example, reduced from 1080×720 pixels to 320×240 pixels) and presented alongside the main playing window in a suitable manner so that the user can click and select them.
When the user selects a picture from the key frame guide, a second playing window is opened and, according to the mapping relation between the selected picture and the backup data, the audio and video data starting from the selected picture are played; the playback window provides variable-speed playback (fast forward/fast rewind), playback jumping and similar functions so that the user can control the playback rhythm and progress as needed; the playback window also provides volume control and muting so that the user can choose between, and separately control, the audio of the live main window and that of the playback window.
Referring to fig. 3, a video playing processing method according to an embodiment of the present application includes:
prepare a playing terminal with internet access, such as a computer, mobile phone, tablet or smart TV; select a live video source to be played, pull the audio and video data in real time from the server hosting the video source according to ordinary playback logic, then de-encapsulate, decode and play them synchronously.
For the de-encapsulated data, a backup of the audio and video encoded frame data containing the corresponding timestamp information is retained; the decoded and played video key frame images can optionally be converted into low-resolution, low-quality thumbnails (for example, reduced from 1080×720 pixels to 320×240 pixels) and presented alongside the main playing window in a suitable manner, such as a key frame guide, so that the user can click and select them.
When the user selects a picture of interest from the key frame navigation chart, the corresponding position of that picture in the backup data is found according to the mapping relation between the selected picture and the backup data; the frames from that position onward form the selected video frame data sequence, and a new window is opened to play it. According to the order in which the video pictures are played and the specific speed at which each picture is played, the timestamps are rewritten by increasing or decreasing the timestamp interval between consecutive video frames.
The user's jump operations are recorded, the skipped picture data are removed, and the timestamps of the video frame data are rewritten so that the timestamps before and after the jump remain continuous.
The video frame data with rewritten timestamps are encapsulated into a specific file format, such as MP4/FLV/TS, or made into a GIF file.
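For the MP4/FLV/TS options, the re-timestamped encoded frames would normally be handed to a container muxer such as FFmpeg; the sketch below covers only the simpler GIF option, assuming Pillow is available and that the selected pictures were decoded to PIL images (the file name and frame duration are illustrative).

```python
from PIL import Image  # the frames passed in are expected to be PIL images

def save_as_gif(images: "list[Image.Image]", path: str = "playback.gif",
                frame_ms: int = 100) -> None:
    # encapsulate the selected picture sequence as an animated GIF
    images[0].save(path, save_all=True, append_images=images[1:],
                   duration=frame_ms, loop=0)
```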
FIG. 4 illustrates a more specific hardware structure of a computer device provided by the embodiments of this specification, which may include: a processor 101, a memory 102, an input/output interface 103, a communication interface 104, and a bus 105. The processor 101, the memory 102, the input/output interface 103 and the communication interface 104 are communicatively coupled to one another within the device via the bus 105.
The processor 101 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and is configured to execute relevant programs to implement the technical solutions provided in the embodiments of this specification.
The memory 102 may be implemented in the form of ROM (Read-Only Memory), RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 102 may store an operating system and other application programs; when the solutions provided in the embodiments of this specification are implemented by software or firmware, the relevant program code is stored in the memory 102 and invoked for execution by the processor 101.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application provides a computer-readable storage medium storing a plurality of computer programs that can be loaded by a processor to perform the steps of any of the video playing processing methods provided in the embodiments of the present application. For example, the computer program may perform the following steps:
the playing terminal accesses a server providing a video source and pulls the audio and video data to the local device; the audio and video data are de-encapsulated and decoded according to the file encapsulation format agreed in the protocol, wherein the de-encapsulated audio and video data are audio and video encoded data frames, and decoding the encoded data yields the corresponding raw audio and video data that can be rendered; the decoded raw audio and video data are played synchronously;
the de-encapsulated audio and video encoded data frames are backed up, including the audio and video data content and the timestamp information associated with each frame; the decoded image data of the video key frames and the associated timestamp information are retained; the mapping relation between each key frame and the audio and video encoded data frames is recorded; while the main picture is played in real time, the key frame pictures that have already been played are displayed beside the main picture;
the user browses and selects a key frame picture of interest; according to the association between the selected key frame and the backed-up audio and video encoded frame sequence, the corresponding backup audio and video data frames are located, and the audio and video frame sequence starting from the selected key frame is taken as the data to be played back; and a new video playing window is opened beside the live main picture, and the playback data are played with the backup audio and video data to be played back as the data source.
The specific implementation of each step can be referred to the above method embodiments, and will not be described herein.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
Since the computer program stored in the storage medium can execute the steps of any video playing processing method provided in the embodiments of the present application, it can achieve the beneficial effects of any such method; see the foregoing embodiments for details, which are not repeated here.
It should be understood that the exemplary embodiments described herein are illustrative and not limiting. Although one or more embodiments of the present application have been described with reference to the accompanying drawings, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application as defined by the following claims.
Claims (9)
1. A method of video playback processing, comprising:
the playing terminal accesses a server providing a video source and pulls the audio and video data to the local device; the audio and video data are de-encapsulated and decoded according to the file encapsulation format agreed in the protocol, wherein the de-encapsulated audio and video data are audio and video encoded data frames, and decoding the encoded data yields the corresponding raw audio and video data that can be rendered; the decoded raw audio and video data are played synchronously;
backing up the de-encapsulated audio and video encoded data frames, including the audio and video data content and the timestamp information associated with each frame; retaining the decoded image data of the video key frames and the associated timestamp information; recording the mapping relation between each decoded key frame image and the backed-up audio and video encoded frame data; and, while the main picture is played in real time, presenting the key frame information that has already been played beside the main picture;
the user browses and selects a key frame picture of interest; according to the association between the selected key frame and the backed-up audio and video encoded frame sequence, the corresponding backup audio and video data frames are located, and the audio and video frame sequence starting from the selected key frame is taken as the data to be played back; and a new video playing window is opened beside the live main picture, and the playback data are played with the backup audio and video data to be played back as the data source.
2. The method of video playback processing of claim 1, further comprising: when the user plays back, starting synchronous editing of the playback content, recording the user's playback control actions and applying corresponding processing.
3. The method of video playback processing of claim 2, wherein recording the user's playback control actions and applying corresponding processing comprises: for content played at a variable speed, rewriting the timestamps of the associated video frame data by increasing or decreasing the time interval between data frames according to the actual playback speed.
4. The method of video playback processing of claim 2, wherein recording the user's playback control actions and applying corresponding processing comprises: for content played with jumps, discarding the skipped data, continuing the timestamps from the frame jumped to, and ensuring that the timestamps remain continuous.
5. The method of video playback processing of claim 3 or 4, wherein the video frames with rewritten timestamps are data-encapsulated.
6. The method of video playback processing of claim 5, further comprising adding audio data from the backup data source, or data from a user-selected sound file, as the audio content.
7. The method of video playback processing of any one of claims 1 to 4, wherein, in retaining the decoded image data of the video key frames and the associated timestamp information, the key frame image sequence is reduced to a sequence of smaller images for storage.
8. A computer device comprising at least a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the method of video playback processing of any one of claims 1 to 7.
9. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method of video playback processing of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111660904.3A CN114339289B (en) | 2021-12-30 | 2021-12-30 | Video playing processing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114339289A CN114339289A (en) | 2022-04-12 |
CN114339289B true CN114339289B (en) | 2023-08-15 |
Family
ID=81019435
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111660904.3A Active CN114339289B (en) | 2021-12-30 | 2021-12-30 | Video playing processing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114339289B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115002529A (en) * | 2022-05-07 | 2022-09-02 | 咪咕文化科技有限公司 | Video strip splitting method, device, equipment and storage medium |
CN115550680A (en) * | 2022-09-30 | 2022-12-30 | 河南华福包装科技有限公司 | Course recording and playing method and system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8307395B2 (en) * | 2008-04-22 | 2012-11-06 | Porto Technology, Llc | Publishing key frames of a video content item being viewed by a first user to one or more second users |
US11089373B2 (en) * | 2016-12-29 | 2021-08-10 | Sling Media Pvt Ltd | Seek with thumbnail generation and display during placeshifting session |
- 2021-12-30: Application CN202111660904.3A filed in China; granted as patent CN114339289B (status: active)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101917590A (en) * | 2009-12-17 | 2010-12-15 | 新奥特(北京)视频技术有限公司 | Network live broadcasting system with playback function and player |
CN105915985A (en) * | 2015-12-15 | 2016-08-31 | 乐视致新电子科技(天津)有限公司 | Method for performing watch-back in live broadcasting and device thereof |
CN106303585A (en) * | 2016-07-26 | 2017-01-04 | 华为技术有限公司 | Program look back method, media server, Set Top Box and program look back system |
CN106412677A (en) * | 2016-10-28 | 2017-02-15 | 北京奇虎科技有限公司 | Generation method and device of playback video file |
CN107484039A (en) * | 2017-08-22 | 2017-12-15 | 四川长虹电器股份有限公司 | A kind of method that streaming media on demand seek pictures are quickly shown |
CN107948715A (en) * | 2017-11-28 | 2018-04-20 | 北京潘达互娱科技有限公司 | Live network broadcast method and device |
Non-Patent Citations (1)
Title |
---|
Improving playback quality of peer-to-peer live streaming systems by joint scheduling and distributed hash table based compensation; Chen Zhuo et al.; China Communications; pp. 127-145 *
Also Published As
Publication number | Publication date |
---|---|
CN114339289A (en) | 2022-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10939069B2 (en) | Video recording method, electronic device and storage medium | |
CN112291627B (en) | Video editing method and device, mobile terminal and storage medium | |
US8868465B2 (en) | Method and system for publishing media content | |
CN111899322A (en) | Video processing method, animation rendering SDK, device and computer storage medium | |
US20070179979A1 (en) | Method and system for online remixing of digital multimedia | |
US20110041060A1 (en) | Video/Music User Interface | |
US20070169158A1 (en) | Method and system for creating and applying dynamic media specification creator and applicator | |
CN114339289B (en) | Video playing processing method | |
GB2587544A (en) | Video acquisition method and device, terminal and medium | |
JP2008262686A (en) | Method and device for recording broadcast data | |
US9373358B2 (en) | Collaborative media editing system | |
US9288248B2 (en) | Media system with local or remote rendering | |
US20090103835A1 (en) | Method and system for combining edit information with media content | |
US11678019B2 (en) | User interface (UI) engine for cloud UI rendering | |
US8768924B2 (en) | Conflict resolution in a media editing system | |
JP2019516331A (en) | Method and apparatus for optimizing regeneration | |
WO2022156646A1 (en) | Video recording method and device, electronic device and storage medium | |
CN113259705A (en) | Method and device for recording and synthesizing video | |
WO2007084870A2 (en) | Method and system for recording edits to media content | |
US8898253B2 (en) | Provision of media from a device | |
CN116095388A (en) | Video generation method, video playing method and related equipment | |
JP2020509624A (en) | Method and apparatus for determining a time bucket between cuts in audio or video | |
JP2015510727A (en) | Method and system for providing file data for media files | |
CN115695843B (en) | Prefabricated video playing method, server, terminal, medium and system | |
CN115334328B (en) | Method, device, live broadcast system, equipment and medium for entering live broadcast room page |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||