WO2008028361A1 - A method for synchronous playing video and audio data in mobile multimedia broadcasting - Google Patents
A method for synchronous playing video and audio data in mobile multimedia broadcasting
- Publication number
- WO2008028361A1 (PCT/CN2006/003735)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- audio
- data
- channel
- time stamp
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/2368—Multiplexing of audio and video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43072—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/414—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/41407—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4341—Demultiplexing of audio and video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/643—Communication protocols
- H04N21/64315—DVB-H
Definitions
- The video header records the common information of the video data, and the video units are separated by sync headers.
- A "relative time stamp" is set for each video unit in the video header to record that video unit's playback offset relative to the channel's starting playback time.
- The "absolute time stamp" is extracted from the channel header to determine the starting playback time of the channel data; the "relative time stamp" of each video unit is then extracted from the channel's video header and the "relative time stamp" of each audio unit from the channel's audio header, and each video or audio unit's "relative time stamp" is added to the "absolute time stamp" to determine that unit's starting playback time.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Television Signal Processing For Recording (AREA)
Abstract
Description
A method for synchronous playback of video and audio in mobile multimedia broadcasting

TECHNICAL FIELD

The present invention relates to a method for synchronously playing real-time video and audio streams in mobile multimedia broadcasting, and belongs to the technical field of mobile multimedia broadcasting, or mobile TV.

BACKGROUND OF THE INVENTION

Mobile multimedia broadcasting is a multimedia playback technology that has emerged in recent years. With a handheld terminal, television can be watched even while moving at high speed. The terminal receives the program guide over a wireless protocol and can select a channel it is entitled to watch, so that it receives the multimedia data of the selected channel and television is watched on the mobile terminal.

The over-the-air data transmitted by a mobile multimedia broadcasting system is divided into different channels, and the data of each channel comprises three types: video, audio and data. The terminal is required to keep video and audio playback synchronized, i.e. to maintain lip sync.

In the field of multimedia broadcasting there are two existing methods for ensuring lip sync. One is the TS (Transport Stream) protocol: while sending the video and audio data, the system attaches a presentation time stamp (PTS) so that the terminal knows the playback time of each segment of video and audio data, ensuring that the terminal plays in exactly the same time order as the encoder. The other is RTP (Real-time Transport Protocol): the system attaches a time stamp to every data packet and the terminal plays strictly according to the time stamps, thereby keeping video and audio playback synchronized.

The TS method is suitable for circuit-switched networks and also for one-way broadcast networks, but because each TS packet is relatively small it incurs a comparatively large network bandwidth overhead. The RTP method has a smaller bandwidth overhead, but it is only suitable for IP networks and not for mobile broadcast networks.

SUMMARY OF THE INVENTION

The object of the present invention is to overcome the above shortcomings of the prior art by providing a method that achieves synchronous video and audio playback in a mobile broadcast network with a transmission efficiency higher than that of the current TS approach. The technical solution of the present invention is as follows:

(1) Determine the video data, audio data and synchronization data within each channel's data, and provide a channel header to record the common information of each channel;

(2) Divide the video data of each channel into a plurality of video units and provide a video header to record the common information of the video data; divide the audio data of each channel into a plurality of audio units and provide an audio header to record the common information of the audio data;

(3) According to the starting playback time of each channel's data, set an "absolute time stamp" field in the channel header to record the starting playback time of the channel data;

(4) According to the offset between the starting playback time of each video unit and the starting playback time of the whole channel data, set a "relative time stamp" field for each video unit in the video header to record that video unit's playback offset; according to the offset between the starting playback time of each audio unit and the starting playback time of the whole channel data, set a "relative time stamp" field for each audio unit in the audio header to record that audio unit's playback offset;

(5) Send the channel data, carrying the "absolute time stamp" and "relative time stamp" information, from the transmitting end.

Further, the above method also includes:

(6) After receiving each channel's data, the receiving end extracts the "absolute time stamp" from the channel header to determine the starting playback time of the channel data;

(7) The receiving end extracts the "relative time stamp" of each video unit from the channel's video header and adds it to the "absolute time stamp" to determine the starting playback time of each video unit; it extracts the "relative time stamp" of each audio unit from the channel's audio header and adds it to the "absolute time stamp" to determine the starting playback time of each audio unit;

(8) The receiving end plays the video units and audio units in order, synchronously, according to the starting playback times thus determined.

The "absolute time stamp" and the "relative time stamp" are generated by the encoder. Their unit is the second, preferably with microsecond precision, to ensure playback accuracy. The "absolute time stamp" may be 4 bytes long and the "relative time stamp" 2 bytes long, which saves a certain amount of bandwidth. The initial value of the "absolute time stamp" is a random value, but it increases steadily over time. The "relative time stamp" of each video unit is recorded in the header of the video data to which that unit belongs, and the video units are separated by sync headers; the "relative time stamp" of each audio unit is recorded in the header of the audio data to which that unit belongs, and the audio units are separated by sync headers.

The method of the present invention uses the absolute time stamp of a mobile multimedia broadcast channel together with the relative time stamp of each video unit and audio unit to compute the playback time of every video and audio unit. It thereby achieves synchronous video and audio playback in a mobile broadcast network with high transmission efficiency, ensuring that users can watch programs normally while saving a certain amount of bandwidth. A sketch of one possible header layout follows.
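The description fixes only the field widths (a 4-byte absolute time stamp in the channel header and 2-byte relative time stamps per unit), not a concrete encoding, so the following C sketch is purely illustrative; the struct names, field names and the count field are assumptions, not anything specified in the patent.

```c
#include <stdint.h>

/* Illustrative layout only: the patent fixes the field widths (4-byte absolute
 * time stamp, 2-byte relative time stamps) but not names or exact packing. */

typedef struct {
    uint32_t absolute_ts;   /* starting playback time of the channel data          */
    /* ... other channel-level control and media description information ...       */
} channel_header;

typedef struct {
    uint16_t relative_ts;   /* playback offset of one unit from the absolute stamp */
} unit_entry;

typedef struct {
    uint16_t unit_count;    /* assumed field: number of units described below      */
    unit_entry units[64];   /* one relative time stamp per video (or audio) unit;
                               capacity chosen arbitrarily for this sketch         */
} media_header;             /* same shape serves as the video and the audio header */
```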
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic diagram of the absolute time stamp in the media stream header. FIG. 2 is a schematic diagram of the relative time stamps of the video data units. FIG. 3 is a schematic diagram of the relative time stamps of the audio data units.

DETAILED DESCRIPTION

As shown in FIG. 1, the data of one channel consists of a header, video data, audio data and synchronization data. The header carries the common information of the channel, including control information and media description information; one of its fields, the "absolute time stamp", indicates the starting playback time of the channel data.

As shown in FIG. 2, the video data comprises a video header and a plurality of video units. The video header records the common information of the video data, and the video units are separated by sync headers. According to the offset between the starting playback time of each video unit and the starting playback time of the whole channel data, a "relative time stamp" is set for each video unit in the video header to record that unit's playback offset. The actual playback time of each video unit is therefore the sum of the "absolute time stamp" and that video unit's "relative time stamp", for example:

playback time of video unit 1 = absolute time stamp + relative time stamp of video unit 1
playback time of video unit N = absolute time stamp + relative time stamp of video unit N

As shown in FIG. 3, the audio data comprises an audio header and a plurality of audio units. The audio header records the common information of the audio data, and the audio units are separated by sync headers. According to the offset between the starting playback time of each audio unit and the starting playback time of the whole channel data, a "relative time stamp" is set for each audio unit in the audio header to record that unit's playback offset. The actual playback time of each audio unit is therefore the sum of the "absolute time stamp" and that audio unit's "relative time stamp", for example:

playback time of audio unit 1 = absolute time stamp + relative time stamp of audio unit 1
playback time of audio unit N = absolute time stamp + relative time stamp of audio unit N

The transmitting end sends out the channel data carrying the "absolute time stamp" and "relative time stamp" information. After receiving the channel data, the receiving end first extracts the "absolute time stamp" from the channel header to determine the starting playback time of the channel data. It then extracts the "relative time stamp" of each video unit from the channel's video header and the "relative time stamp" of each audio unit from the channel's audio header and, following the formulas above, adds each video unit's "relative time stamp" to the "absolute time stamp" to determine that video unit's starting playback time, and adds each audio unit's "relative time stamp" to the "absolute time stamp" to determine that audio unit's starting playback time. The receiving end then plays each video unit and audio unit at its determined starting playback time, thereby achieving synchronous video and audio playback.

The time stamps are generated by the encoder; the initial value may be a random value, but it must increase steadily over time and must be accurate. The unit of the time stamps is the second, and they may be accurate to the microsecond to guarantee playback precision.
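As a concrete illustration of the receiver-side computation just described, the short C program below reconstructs the starting playback time of each unit by adding its relative time stamp to the channel's absolute time stamp and then schedules the units in time order. It is a minimal sketch under assumed values: the absolute time stamp of 1000 s and the relative offsets are made up for the example, and it is not the patent's reference implementation.

```c
#include <stdio.h>
#include <stdlib.h>

/* Receiver-side sketch: playback time of unit m = absolute time stamp
 * + relative time stamp of unit m (all values below are made-up examples). */

typedef struct {
    const char *kind;     /* "video" or "audio"                    */
    int         index;    /* unit number within its stream         */
    double      play_at;  /* computed starting playback time, in s */
} scheduled_unit;

static int by_play_time(const void *a, const void *b) {
    double da = ((const scheduled_unit *)a)->play_at;
    double db = ((const scheduled_unit *)b)->play_at;
    return (da > db) - (da < db);
}

int main(void) {
    double absolute_ts = 1000.0;                     /* from the channel header   */
    double video_rel[] = {0.000, 0.040, 0.080};      /* from the video header (s) */
    double audio_rel[] = {0.000, 0.050, 0.100};      /* from the audio header (s) */

    scheduled_unit units[6];
    int n = 0;
    for (int i = 0; i < 3; i++)
        units[n++] = (scheduled_unit){"video", i + 1, absolute_ts + video_rel[i]};
    for (int i = 0; i < 3; i++)
        units[n++] = (scheduled_unit){"audio", i + 1, absolute_ts + audio_rel[i]};

    /* Play the units in order of their computed starting playback times. */
    qsort(units, n, sizeof units[0], by_play_time);
    for (int i = 0; i < n; i++)
        printf("%s unit %d plays at %.3f s\n", units[i].kind, units[i].index, units[i].play_at);
    return 0;
}
```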
By the method of this patent, the number of bytes used to represent the playback time of each video unit and audio unit can be reduced. For example, whereas the playback time of every video and audio unit would otherwise require 4 bytes, with this method only the "absolute time stamp" needs 4 bytes, and each remaining "relative time stamp" needs only 2 bytes to represent the playback time of a video or audio unit. This saves a certain amount of bandwidth.

The invention is further described below by way of an example. In this example, one channel has a rate of 256 kbit/s, the frame rate is 25 frames per second, and the audio is sampled once every 50 ms, so that there are 25 video units and 20 audio units. One channel's worth of data is sent every second, and the absolute time stamp T is first obtained from the channel header.

The relative time stamps of the 25 video units are taken from the video header; when playing video, the playback time of the m-th frame = absolute time stamp + relative time stamp of the m-th video unit. The relative time stamps of the 20 audio units are taken from the audio header; when playing audio, the playback time of the m-th frame = absolute time stamp + relative time stamp of the m-th audio unit. By playing at the times computed in this way, the terminal achieves synchronous playback of video and audio.
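To put rough numbers on the saving claimed above (illustrative arithmetic based on the example's own figures, not values stated in the patent): with 25 video units and 20 audio units per second, stamping every unit with a 4-byte playback time would cost (25 + 20) × 4 = 180 bytes per second per channel, whereas one 4-byte absolute time stamp plus 45 two-byte relative time stamps costs 4 + 45 × 2 = 94 bytes per second, roughly halving the time-stamp overhead.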
Claims
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN200610112078.8 | 2006-08-29 | ||
| CN2006101120788A CN1960485B (en) | 2006-08-29 | 2006-08-29 | Method for playing back video and audio synchronistically in mobile media broadcast |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2008028361A1 (en) | 2008-03-13 |
Family
ID=38071944
Family Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2006/003735 Ceased WO2008028361A1 (en) | 2006-08-29 | 2006-12-30 | A method for synchronous playing video and audio data in mobile multimedia broadcasting |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN1960485B (en) |
| WO (1) | WO2008028361A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2405649A4 (en) * | 2009-04-27 | 2012-12-05 | Zte Corp | Method and terminal for synchronously recording sounds and images of opposite ends based on circuit domain video telephone |
Families Citing this family (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101123611B (en) * | 2007-09-25 | 2012-05-23 | 中兴通讯股份有限公司 | A method for sending streaming media data |
| CN101272499B (en) * | 2008-05-13 | 2010-08-18 | 中兴通讯股份有限公司 | Method and system for video and audio co-stream transmission |
| WO2010060240A1 (en) * | 2008-11-25 | 2010-06-03 | 中兴通讯股份有限公司 | Method for transmitting and receiving the service data of handset tv |
| CN101854533B (en) * | 2010-06-10 | 2012-05-23 | 华为技术有限公司 | Frequency channel switching method, device and system |
| CN102510488B (en) * | 2011-11-04 | 2015-11-11 | 播思通讯技术(北京)有限公司 | A kind of utilize broadcast characteristic to carry out audio-visual synchronization method and device |
| CN104125534B (en) * | 2013-07-18 | 2017-01-11 | 中国传媒大学 | Synchronous multi-channel audio recording and playing method and system |
| CN105681889A (en) * | 2015-12-31 | 2016-06-15 | 中科创达软件股份有限公司 | Audio play delay determining method |
| CN106411447B (en) * | 2016-10-08 | 2018-12-11 | 广东欧珀移动通信有限公司 | playing control method, device and terminal |
| CN108521601B (en) * | 2018-02-28 | 2022-04-29 | 海信视像科技股份有限公司 | Method and device for rapidly playing non-standard code stream |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE60211157T2 (en) * | 2002-09-06 | 2007-02-08 | Sony Deutschland Gmbh | Synchronous playback of media packages |
- 2006
- 2006-08-29 CN CN2006101120788A patent/CN1960485B/en not_active Expired - Fee Related
- 2006-12-30 WO PCT/CN2006/003735 patent/WO2008028361A1/en not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1561642A (en) * | 2001-09-29 | 2005-01-05 | 皇家飞利浦电子股份有限公司 | Robust method for recovering a program time base in MPEG-2 transport streams and achieving audio/video synchronization |
| CN1720749A (en) * | 2002-12-04 | 2006-01-11 | 皇家飞利浦电子股份有限公司 | Method of automatically testing audio/video synchronization |
| US20050204052A1 (en) * | 2004-02-13 | 2005-09-15 | Nokia Corporation | Timing of quality of experience metrics |
| JP2006050656A (en) * | 2005-09-02 | 2006-02-16 | Nippon Telegr & Teleph Corp <Ntt> | Stream transmitting apparatus, receiving apparatus, and transmission / reception method |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2405649A4 (en) * | 2009-04-27 | 2012-12-05 | Zte Corp | Method and terminal for synchronously recording sounds and images of opposite ends based on circuit domain video telephone |
| AU2009345285B2 (en) * | 2009-04-27 | 2013-05-02 | Zte Corporation | Method and terminal for synchronously recording sounds and images of opposite ends based on circuit domain video telephone |
| US8493429B2 (en) | 2009-04-27 | 2013-07-23 | Zte Corporation | Method and terminal for synchronously recording sounds and images of opposite ends based on circuit domain video telephone |
Also Published As
| Publication number | Publication date |
|---|---|
| CN1960485A (en) | 2007-05-09 |
| CN1960485B (en) | 2011-12-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN101282482B (en) | Apparatus, system and method for synchronously playing video data and audio data | |
| JP5086285B2 (en) | Video distribution system, video distribution apparatus, and synchronization correction processing apparatus | |
| US8776144B2 (en) | Mobile TV system and method for synchronizing the rendering of streaming services thereof | |
| JP3544963B2 (en) | Method and apparatus for synchronous playback | |
| JP4649091B2 (en) | Communication terminal, server device, relay device, broadcast communication system, broadcast communication method, and program | |
| WO2011113315A1 (en) | Stream media live service system and implementation method thereof | |
| CN101827271B (en) | Audio and video synchronized method and device as well as data receiving terminal | |
| CN109565466B (en) | Method and device for lip synchronization between multiple devices | |
| CN101854533A (en) | Channel switching method, device and system | |
| JP5372143B2 (en) | Device and method for synchronizing interactive marks with streaming content | |
| WO2010075743A1 (en) | Method and device for displaying time of internet protocol television (iptv) | |
| WO2008028361A1 (en) | A method for synchronous playing video and audio data in mobile multimedia broadcasting | |
| CN100450163C (en) | A method for synchronously playing video and audio in mobile multimedia broadcasting | |
| CN101272200B (en) | Multimedia stream synchronization caching method and system | |
| WO2008028367A1 (en) | A method for realizing multi-audio tracks for mobile mutilmedia broadcasting system | |
| JP4735666B2 (en) | Content server, information processing apparatus, network device, content distribution method, information processing method, and content distribution system | |
| CN101202613A (en) | A terminal for clock synchronization | |
| CN103269448A (en) | Realization of Audio and Video Synchronization Method Based on RTP/RTCP Feedback Early Warning Algorithm | |
| JP5092493B2 (en) | Reception program, reception apparatus, communication system, and communication method | |
| CN100473171C (en) | A Method of Clock Synchronization in Broadcasting Network | |
| WO2008031293A1 (en) | A method for quickly playing the multimedia broadcast channels | |
| CN100544448C (en) | A Clock Synchronization System for Mobile Multimedia Network | |
| CN101202918B (en) | A method for terminal correction clock | |
| CN111726669B (en) | A distributed decoding device and method for synchronizing audio and video thereof | |
| CN1960509B (en) | Method for implementing error isolation when transmitting mobile multimedia broadcasting media data |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 06840765; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | NENP | Non-entry into the national phase | Ref country code: RU |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 06840765; Country of ref document: EP; Kind code of ref document: A1 |