WO2012099359A2 - Receiving apparatus for receiving a plurality of real-time transport streams, transmitting apparatus therefor, and method of playing multimedia content - Google Patents
- Publication number
- WO2012099359A2 (PCT/KR2012/000271, KR 2012000271 W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- real time
- transport stream
- time transport
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/24—Systems for the transmission of television signals using pulse code modulation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/167—Synchronising or controlling image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals image signals comprising non-image signal components, e.g. headers or format information
- H04N13/178—Metadata, e.g. disparity information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/194—Transmission of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43072—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4345—Extraction or processing of SI, e.g. extracting service information from an MPEG stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/462—Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
- H04N21/4622—Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6106—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
- H04N21/6112—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving terrestrial transmission, e.g. DVB-T
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6106—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
- H04N21/6125—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/816—Monomedia components thereof involving special video data, e.g 3D video
Definitions
- the present invention relates to a receiving apparatus for receiving a plurality of real-time transport streams, a transmitting apparatus therefor, and a method of playing multimedia content, and more particularly, to a receiving apparatus, a transmitting apparatus, and a playback method for transmitting and receiving a single piece of multimedia content through different paths.
- because 3D content includes both a left-eye image and a right-eye image, its size is larger than that of existing 2D content.
- in addition, the security of content downloaded in advance may be a concern, and it is difficult to add a mass storage device to store such content.
- the present invention is directed to the above-described needs, and an object of the present invention is to provide a receiving apparatus capable of receiving a plurality of real-time transport streams transmitted through different paths to play multimedia content, a method of playing the multimedia content, and a transmitting apparatus for transmitting the transport streams.
- a receiving apparatus comprising: a first receiving unit receiving a first real-time transport stream through a broadcasting network; a second receiving unit receiving a second real-time transport stream through a communication network; a delay processor for delaying at least one of the first and second real-time transport streams to synchronize them; a first detector for detecting first data from the first real-time transport stream; a second detection unit for detecting second data from the second real-time transport stream; a signal processing unit for combining the first data and the second data to configure multimedia content; and a playback unit for reproducing the multimedia content.
- the first real time transport stream includes address information
- the second receiver accesses a server in the communication network using the address information, receives a metadata file from the server, and uses the metadata file to access a source of the second real-time transport stream.
- the metadata file may include information about a source of the second real time transport stream.
- the address information may be recorded in at least one of: a reserved area in the PMT of the first real-time transport stream, a descriptor area in the PMT, a reserved area of the first real-time transport stream, a private data area of the first real-time transport stream, a reserved area in the PES of the first real-time transport stream, a private data area in the PES of the first real-time transport stream, a user area in the ES header, a private area in the ES header, and an SEI message defined in the H.264 standard.
- the second data may include a plurality of data units having at least one size adaptively set according to the state of the communication network.
- one of the first data and the second data may include a left eye image, the other may include a right eye image, and the multimedia content may be 3D content.
- the first real time transport stream may include first synchronization information
- the second real time transport stream may include second synchronization information
- the first and second synchronization information may include at least one of content start information indicating a starting point of the multimedia content, a time stamp difference value between the first data and the second data, and a frame index.
- the receiving apparatus may further include a controller configured to correct at least one of the time stamps of the frames included in the first data and the time stamps of the frames included in the second data using the first and second synchronization information, and to control the signal processor to configure the multimedia content by combining the frames of the first and second data according to the corrected time stamps.
- the first real time transport stream may include first synchronization information
- the second real time transport stream may include second synchronization information
- the first and second synchronization information may be time code information of an image frame.
- the transmitting apparatus includes: a stream configuration unit constituting a first real-time transport stream including first data and first synchronization information; an output unit for outputting the first real-time transport stream; and a control unit for controlling the output unit to delay the output timing of the first real-time transport stream to match the output timing of another transmitting apparatus that outputs a second real-time transport stream.
- the second real-time transport stream includes second data and second synchronization information, wherein the first and second data are data for composing one piece of multimedia content, and the first and second synchronization information are transmitted for synchronization of the first data and the second data.
- the transmitting apparatus includes a stream configuration unit constituting a first real-time transport stream including first data and address information, and an output unit for outputting the first real-time transport stream, wherein the address information is address information on a metadata file through which the second data, which together with the first data constitutes the multimedia content, can be obtained from a communication network.
- the method comprises: receiving a first real-time transport stream through a broadcasting network; receiving a second real-time transport stream through a communication network; delaying at least one of the first and second real-time transport streams to synchronize them; detecting first data from the first real-time transport stream and second data from the second real-time transport stream; combining the first data and the second data to construct multimedia content; and playing the multimedia content.
- the receiving of the second real-time transport stream through the communication network may include: detecting address information included in the first real-time transport stream; accessing a server in the communication network using the address information; receiving a metadata file from the server; and accessing a source of the second real-time transport stream using the metadata file to receive the second real-time transport stream.
- One of the first data and the second data may include a left eye image, the other may include a right eye image, and the multimedia content may be 3D content.
- the second data may include a plurality of data units having at least one size adaptively set according to the state of the communication network.
- the first real time transport stream includes first synchronization information
- the second real time transport stream includes second synchronization information
- the first and second synchronization information may include at least one of content start information indicating a starting point of the multimedia content, a time stamp difference value between the first data and the second data, a frame index, and a time code.
- high-quality multimedia content can be reproduced by receiving and synchronizing a real-time transport stream through a plurality of different paths.
- FIG. 1 is a view showing the configuration of a multimedia content transmission and reception system according to an embodiment of the present invention.
- FIG. 2 is a view showing the configuration of a receiving apparatus according to an embodiment of the present invention.
- FIG. 3 is a view for explaining a process of synchronizing and playing a transport stream in a receiving device.
- FIG. 5 is a diagram illustrating an operation of receiving a plurality of real-time transport streams through a broadcasting network and a communication network.
- FIGS. 6 to 9 are diagrams for explaining a method of delivering address information in the HTTP scheme.
- FIG. 10 is a diagram illustrating a configuration of an HTTP stream including an MPD file.
- FIG. 11 is a diagram illustrating a configuration of an HTTP stream including synchronization information.
- FIG. 12 is a diagram illustrating a transmission process of dividing and transmitting multimedia content into a plurality of streams.
- FIG. 13 is a diagram for explaining a process of obtaining a transport stream in a multimedia content transmission and reception system.
- FIG. 14 is a diagram illustrating a configuration of a stream in which synchronization information is included in a PMT.
- FIG. 15 is a diagram illustrating a configuration of a PMT in which synchronization information is recorded.
- FIG. 16 is a view for explaining a method of transmitting synchronization information using a TS adaptation field.
- FIG. 17 is a view for explaining a method of transmitting synchronization information using a PES header.
- FIG. 18 is a view for explaining a method of transmitting synchronization information using an EIT.
- FIG. 19 is a view for explaining a method of transmitting synchronization information using a private stream.
- FIG. 20 is a view for explaining a frame index transfer method using a PMT.
- FIG. 21 is a view for explaining a frame index delivery method using a private stream.
- FIG. 22 is a diagram showing a plurality of transport streams to which time codes are respectively assigned.
- FIGS. 23 to 26 are views illustrating various examples of a method for transmitting various synchronization information.
- FIGS. 27 to 29 are block diagrams illustrating a configuration of a transmitting apparatus according to various embodiments of the present disclosure.
- FIG. 30 is a flowchart for explaining a method of playing multimedia content according to an embodiment of the present invention.
- FIG. 31 is a flowchart illustrating a method of obtaining a second real-time transport stream using address information included in the first real-time transport stream.
- the multimedia content reproduction system includes a plurality of transmitters 200-1 and 200-2 and a receiver 100.
- the first and second transmission devices 200-1 and 200-2 transmit different signals through different paths.
- the first transmission device 200-1 transmits a first signal through a broadcasting network
- the second transmission device 200-2 transmits a second signal through a communication network 10.
- the first signal and the second signal may each be configured as a real-time transport stream including different data that together constitute one piece of multimedia content.
- a left-eye image or a right-eye image may be included in the first real-time transport stream and transmitted through the broadcasting network, and the other image may be included in the second real-time transport stream and transmitted through the communication network.
- the first data included in the first signal and the second data included in the second signal may be implemented as various types of data in addition to the left eye image and the right eye image.
- the multimedia content may be divided into video data and audio data, or into video data, subtitle data, and other data, which are transmitted in the first and second real-time transport streams, respectively.
- the reception device 100 receives and buffers a real time transport stream transmitted from transmission devices 1 and 2, respectively. In this process, at least one real time transport stream is delayed to synchronize with each other.
- the second real-time transport stream transmitted through the communication network 10 may be streamed using various streaming schemes such as the Real-time Transport Protocol (RTP) or the Hypertext Transfer Protocol (HTTP).
- first real time transport stream includes first synchronization information together with the first data
- second real time transport stream includes second synchronization information together with the second data
- Various types of information may be used as the first and second synchronization information. Specifically, at least one of content start information indicating a start point of the multimedia content, a time stamp difference value between the first data and the second data, a frame index, time code information, UTC information, and frame count information may be used as the synchronization information.
- a transport stream for transmitting broadcast data includes a PCR (Program Clock Reference) and a PTS (Presentation Time Stamp).
- the PCR is reference time information that allows a receiving apparatus (set-top box, TV, etc.) complying with the MPEG standard to align its time reference with the transmitter side.
- the receiver adjusts the value of the STC (System Time Clock) according to the PCR.
- PTS refers to a time stamp indicating a reproduction time for synchronizing video and audio in a broadcast system according to the MPEG standard. In this specification, it is called a time stamp.
- the synchronization information is included in each real-time transport stream transmitted through the different paths, and the reception device 100 can use the synchronization information to correct the time stamps of the image frames included in each transport stream, or can directly compare the synchronization information for playback.
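As an illustrative sketch (not part of the original disclosure), the time-stamp correction described above can be modeled as follows. Frames are represented here as plain dictionaries with a `pts` key, and the synchronization information is assumed to supply a single PTS difference value; both are hypothetical simplifications of the stream structures in the patent.

```python
def correct_timestamps(frames, pts_difference):
    """Shift the PTS of each frame in one stream by the known difference
    value so that both streams share a common timeline.
    `frames` is a list of dicts with a 'pts' key (hypothetical layout)."""
    return [{**f, "pts": f["pts"] + pts_difference} for f in frames]


def match_frames(first, second):
    """Pair frames whose corrected time stamps are equal, i.e. frames
    that should be rendered together as one 3D frame."""
    by_pts = {f["pts"]: f for f in second}
    return [(f, by_pts[f["pts"]]) for f in first if f["pts"] in by_pts]


# Example: the right-eye stream carries time stamps offset by +100 ticks.
left = [{"pts": 0, "eye": "L"}, {"pts": 3000, "eye": "L"}]
right = [{"pts": 100, "eye": "R"}, {"pts": 3100, "eye": "R"}]
pairs = match_frames(left, correct_timestamps(right, -100))
```

After correction, each left-eye frame pairs with exactly one right-eye frame, which corresponds to the combining step performed by the signal processor.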
- the reception apparatus 100 includes a first receiver 110, a second receiver 120, a delay processor 130, a first detector 140, a second detector 150, and a signal processor 160. And a playback unit 170 and a controller 180.
- the first receiver 110 receives a first real time transport stream transmitted through a broadcast network.
- the first receiver 110 may be implemented in a form including an antenna, a tuner, a demodulator, an equalizer, and the like.
- the second receiver 120 accesses an external source through a communication network to receive a second real time transport stream.
- the second receiver 120 may include a network interface card.
- the delay processor 130 delays at least one of the first and second real time transport streams and synchronizes them with each other.
- the delay processor 130 may delay the transport stream using various methods such as a personal video recorder (PVR), time shift, memory buffering, and the like.
- the delay processor 130 may delay the stream by using a buffer provided separately in the reception apparatus 100 or a buffer provided in the delay processor 130 itself. For example, if the first real time transport stream is received first and the second real time transport stream is not received, the delay processor 130 stores the first real time transport stream in a buffer and delays it. In this state, when the second real time transport stream is received, the delayed first real time transport stream is read from the buffer and provided together with the second real time transport stream to the first and second detection units.
- the delay processor 130 may analyze each stream to match the timing at which the first and second real-time transport streams are provided to the first and second detectors 140 and 150, respectively. That is, the delay processor 130 may analyze the streams to determine how much delay to apply to at least one of the first and second real-time transport streams. For example, the delay processor 130 may identify the portions of the first and second real-time transport streams to be synchronized with each other using information such as content start information, a time stamp difference value, and the time stamp of each stream. Alternatively, the delay processor 130 may identify the portions to be synchronized with each other by comparing information such as the frame index or the time code of the two streams.
- the delay processor 130 adjusts the delay state so that timings provided to the first and second detectors 140 and 150 can be matched with each other.
- Information such as content start information, time stamp difference value, frame index, time code, etc. may be included in each stream as synchronization information or received in a separate private stream.
- the delay processor 130 may determine the degree of delay using the synchronization information and apply the delay accordingly.
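The buffering behavior of the delay processor can be sketched as follows (an illustrative simplification, not the patented implementation): the stream that arrives first is held in a buffer, and a frame is released only when the other stream reaches the same frame index. The `{'index': n}` frame layout is hypothetical.

```python
from collections import deque


class DelayBuffer:
    """Minimal sketch of delay processing: hold frames of the
    earlier-arriving stream until the other stream catches up."""

    def __init__(self):
        self.buffer = deque()

    def push(self, frame):
        # Buffer a frame from the stream that arrived first.
        self.buffer.append(frame)

    def pop_matching(self, other_index):
        # Discard frames the other stream has already passed, then
        # release the frame with the matching index, if buffered.
        while self.buffer and self.buffer[0]["index"] < other_index:
            self.buffer.popleft()
        if self.buffer and self.buffer[0]["index"] == other_index:
            return self.buffer.popleft()
        return None
```

When the second stream delivers frame index 1, the delayed first stream releases its own frame 1, so both are handed to the detectors at the same timing.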
- the first detector 140 detects the first data from the first real-time transport stream
- the second detector 150 detects the second data from the second real-time transport stream, and the first and second data are then respectively provided to the signal processing unit 160.
- the signal processor 160 configures the multimedia content by combining the first data and the second data. Specifically, when the first data is video data and the second data is audio data, the signal processor 160 decodes each data and provides the decoded data to the display unit and the speaker in the playback unit 170, respectively. This allows two data to be output at the same timing.
- the signal processor 160 may process the data in various ways according to the 3D display method. That is, in the polarization method, the signal processor 160 may alternately arrange a part of the synchronized left eye image and a part of the right eye image to configure one or two frames. Accordingly, the corresponding frame may be output through the display panel to which the lenticular lens or the parallax barrier is added. On the other hand, in the shutter glass method, the signal processor 160 may alternately arrange the synchronized left eye image and the right eye image and sequentially display them through the display panel.
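The polarization-method composition described above (alternating parts of the synchronized left-eye and right-eye images into one frame) can be sketched as a row interleave. Representing a frame as a list of rows is a hypothetical simplification for illustration only.

```python
def interleave_rows(left_frame, right_frame):
    """Sketch of the polarization method: take even rows from the
    synchronized left-eye frame and odd rows from the right-eye frame
    to compose a single output frame. Frames are lists of rows."""
    return [
        left_frame[i] if i % 2 == 0 else right_frame[i]
        for i in range(len(left_frame))
    ]
```

For the shutter-glass method, the same pair of synchronized frames would instead be emitted sequentially rather than merged.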
- the playback unit 170 plays back the multimedia content processed by the signal processor 160.
- the playback unit 170 may include at least one of a display unit and a speaker according to the type of the reception device 100, or may be implemented as an interface unit connected to an external display device.
- the controller 180 may control the delay processor 130 to delay the first received stream. In addition, the controller 180 may control the signal processor 160 to perform the operation of playing the multimedia content by combining the first data and the second data.
- the controller 180 may control the signal processor 160 to correct, using the synchronization information, at least one of the time stamps of the frames included in the first data and the time stamps of the frames included in the second data, and to configure the multimedia content by combining the frames of the first and second data according to the corrected time stamps.
- the signal processor 160 may be controlled such that the same frames are reproduced together by directly comparing the time code or the frame index without correcting the time stamp.
- controller 180 may control the operation of each component included in the reception device 100.
- the synchronization operation for synchronizing and reproducing frames corresponding to each other by using synchronization information included in the first and second real time transport streams may be performed by the signal processor 160 or the controller 180.
- FIG. 3 is a diagram illustrating a process of achieving synchronization by delaying one of a plurality of real-time transport streams in the receiving device of FIG. 2.
- the transmitter 1 200-1 transmits a real time transport stream through a broadcasting network
- the transmitter 2 200-2 transmits a real time transport stream through a communication network. Even if the transmission point itself is the same, one of the two may arrive first due to environmental differences between the broadcasting network and the communication network.
- FIG. 3 illustrates that the first real-time transport stream transmitted through the broadcasting network is delayed by about two frames and then synchronized with the second real-time transport stream. As a result, a 3D video delayed by about two frames is reproduced.
- FIG. 4 is a view for explaining another embodiment of reducing the delay time.
- the image to be transmitted is divided into access units of various sizes, and the smallest of these is transmitted first to reduce the delay time.
- the quality of the image to be transmitted is gradually improved in consideration of the communication status.
- an SD class frame is transmitted as the first frame, and an HD class frame is transmitted from the second frame.
- compared with FIG. 3, it can be seen that the delay time is reduced by about one frame.
- the size of the image, such as its resolution, may vary depending on the state of the communication network. In other words, when the communication bandwidth is insufficient or the communication speed is low, the smallest-resolution data is transmitted first, as shown in FIG. 4, to minimize the delay time, and the resolution may then be gradually increased.
- the second real time transport stream includes a plurality of data units having at least one size adaptively set according to the state of the communication network. In the case of audio data as well as image data, data sizes may be determined and transmitted differently according to the state of the communication network. Accordingly, the reception apparatus 100 may perform synchronization while minimizing delay times of the plurality of real time transport streams.
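A sketch of how a data-unit size could be chosen adaptively from the communication-network state follows. The bitrate ladder and labels here are made-up example values, not figures from the patent.

```python
def pick_unit_size(bandwidth_kbps,
                   ladder=((500, "SD"), (2000, "HD"), (8000, "FHD"))):
    """Choose the largest data-unit class whose minimum bitrate the
    current bandwidth can sustain. `ladder` is (min_kbps, label) pairs
    in ascending order; these thresholds are hypothetical."""
    chosen = ladder[0][1]  # fall back to the smallest unit
    for min_kbps, label in ladder:
        if bandwidth_kbps >= min_kbps:
            chosen = label
    return chosen
```

Under this scheme, a constrained link first carries small (SD-class) units to minimize delay, and larger units are selected as measured bandwidth improves, matching the gradual quality increase described above.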
- the second real time transport stream may be transmitted and received using a protocol such as RTP or HTTP.
- Streaming using HTTP is a method in which the client drives the processing, minimizing the burden on the server.
- the second receiver 120 achieves streaming by using a file transfer request or a partial file transfer request of HTTP.
- in order to respond adaptively to changes in the transmission rate of the network, the sender must place files compressed at various transmission rates for a single piece of content on the server.
- in addition, the entire content file should be divided into a plurality of segments and stored as separate files.
- the sender must provide metadata to the receiver to indicate how to sequentially obtain the plurality of separated files to play the multimedia content.
- Metadata is information for indicating where the multimedia content can be received. Metadata files can be classified in various ways depending on the type of HTTP-based streaming.
- in the case of IIS Smooth Streaming, an ism (Internet Information Service (IIS) Smooth Streaming Media) file is used as the metadata file.
- in the case of HTTP Live Streaming, an m3u8 file is used as the metadata file.
- in the case of adaptive HTTP streaming, an MPD (Media Presentation Description) file is used as the metadata file.
- the metadata file may include information that the client needs to know in advance, such as the location on the content timeline corresponding to each of the plurality of separate files, the URL of the source providing each file, and its size.
- address information on a source from which the metadata file may be obtained may be included in the first real time transport stream.
- the transmission device 1 200-1 transmits a first real-time transport stream TS including address information through the broadcasting network.
- the receiving apparatus 100 detects the address information and identifies the server that provides the metadata file.
- the first detector 140 may detect address information and provide the address information to the second receiver 120.
- the second receiver 120 accesses the server 200-3 in the communication network by using the address information.
- the server 200-3 transmits the metadata file at the request of the second receiver 120.
- the second receiver 120 accesses the source 200-2 of the second real-time transport stream using the metadata file, requests transmission of the second real-time transport stream, and receives it.
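The two-step flow above (address information → metadata file → second stream source) can be sketched as follows. `http_get` is a stand-in for a real HTTP client, and all field names (`address_info`, `source`) are hypothetical; only the sequence of lookups mirrors the described procedure.

```python
def fetch_second_stream(first_stream_info, http_get):
    """Sketch of the receiver-side flow: read the address information
    carried in the first stream, fetch the metadata file from the
    server it names, then request the second real-time transport
    stream from the source listed in the metadata."""
    metadata_url = first_stream_info["address_info"]  # e.g. a Hybrid3DMetaURL value
    metadata = http_get(metadata_url)                 # metadata file from the server
    source_url = metadata["source"]                   # source of the second stream
    return http_get(source_url)                       # the second real-time TS


# Example with an in-memory stand-in for the network:
resources = {
    "http://meta.example": {"source": "http://src.example"},
    "http://src.example": "second-real-time-TS",
}
stream = fetch_second_stream({"address_info": "http://meta.example"}, resources.get)
```

Keeping the metadata on a separate server, as the text notes, means only the metadata file needs updating when the stream source changes; the address carried in the broadcast stream stays fixed.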
- the metadata file includes information about the source of the second real time transport stream.
- the address information may be included in various areas in the first real time transport stream and transmitted.
- the address information may be URL information such as Hybrid3DURL or Hybrid3DMetaURL.
- Such address information may be recorded and transmitted in various sections in the first real time transport stream.
- FIGS. 6 through 9 illustrate examples in which address information may be transmitted using various regions in the first real-time transport stream.
- address information may be recorded in a reserve area or a descriptor area in the PMT.
- the address information may be recorded in the reserve area of the first real time transport stream or the private data area of the first real time transport stream.
- the address information may be recorded in a user (data) area or a private area in the ES header.
- the address information may be recorded in a reserved area or a private data area in a packetized elementary stream (PES) of the first real time transport stream.
- in the case of the H.264 standard, the address information may be recorded in supplemental enhancement information (SEI).
- This address information indicates the source from which the metadata file can be obtained, that is, the address of the server providing the metadata file.
- the reception apparatus 100 accesses a corresponding source using address information included in the first real time transport stream and receives a metadata file from the corresponding source. As such, when a separate server managing the metadata file is used, the metadata file can be easily updated.
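- the client-side flow described above (detect the address in the broadcast stream, fetch the metadata file, then request the second stream from the source it names) can be sketched as follows. The field names `sourceURL`, `PID`, and `linkedContents` and the dictionary shape are illustrative assumptions; the patent does not fix a concrete metadata syntax.

```python
# Hypothetical client-side flow: the metadata file fetched from the address
# found in the first stream tells the receiver where the second stream lives.
def parse_metadata(metadata: dict) -> dict:
    """Pull out the fields the receiver needs before requesting stream 2."""
    return {
        "source_url": metadata["sourceURL"],         # server for the 2nd stream
        "pid": metadata.get("PID"),                  # packet identifier
        "synchronized": metadata.get("linkedContents", False),
    }

# Simulated metadata file as it might arrive from the server (assumed shape).
metadata_file = {
    "sourceURL": "http://example.com/stream2",
    "PID": 0x1011,
    "linkedContents": True,
}

info = parse_metadata(metadata_file)
print(info["source_url"])   # where the second real time stream is requested
```

Because the metadata file lives on a separate server, updating it (for example, moving the second stream to a new source) requires no change to the broadcast stream itself.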
- the metadata file may basically include PID (Packet Identifier) information.
- various link information provided for inter-channel interworking services may be included.
- the link information includes link_original_network_id, which is the original network ID of the 3D additional video service linked to the corresponding channel, and linked_carrier_frequency, which is the radio frequency value of the channel providing the 3D additional video service.
- link_logical_channel_number, which is the logical channel number providing the 3D additional video service linked to the channel
- link_transport_stream_id, which is an identifier for distinguishing a transport stream on a network
- link_service_id, which is an identifier for distinguishing a service in a transport stream
- link_url_indicator, which indicates that URL information is present
- link_source_URL, which is the address of the source
- link_service_start_time, which is the time when the linked service is provided in the case of a download or NRT service
- the metadata file may also include modulation information of the provided broadcast stream.
- the modulation information may include SCTE_mode_1: 64-QAM, SCTE_mode_2: 256-QAM, ATSC (8VSB), AVSB (16VSB), and the like.
- when the second real time transport stream is transmitted through the communication network, the reception device 100 does not immediately play back the second data, but plays it back in synchronization with the first real time transport stream. Accordingly, information for adjusting the reproduction time of the second data included in the second real time transport stream is needed.
- This information can be added to the metadata file. Specifically, information such as linkedContents, indicating that the content requires synchronized playback; playableRestriction, indicating that content requests through the streaming channel are not allowed before synchronized playback; and designatedPlayTime, providing an exact playback start time or a start time offset, may be added to the metadata file.
- designatedPlayTime follows the UTC (Coordinated Universal Time) format.
- the reception apparatus 100 withholds reproduction of the second data until the playback start time obtained from designatedPlayTime, and then performs synchronized playback using the synchronization information.
- synchronization information may be added to the metadata file. This information can be added as a period level element.
- the synchronization information may include startPTS, PTSdiff, and frame index.
- the startPTS indicates a time stamp of a starting point of the multimedia content.
- startPTS may be referred to as content start information in that startPTS is information indicating a starting point of multimedia content.
- PTSdiff represents a difference value between a time stamp given to each frame of the first real time transport stream and a time stamp given to each frame of the second real time transport stream.
- FIG. 10 shows an example of a notation method of an MPD file including frame index information.
- frame index information is indicated in the MPD file.
- time stamps of data for implementing the same content may be different due to time differences in the signal processing and the transmission process.
- the receiving apparatus 100 may perform synchronization either by correcting the time stamps of frames having the same frame index in the first data and the second data to the same value, or by comparing the frame indexes directly and reproducing frames whose indexes match together.
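- a minimal sketch of the index-based correction described above: frames with the same frame index are forced onto the same time stamp. Representing frames as `(index, pts)` tuples is an illustrative simplification; real streams carry the index in the PMT or a private stream.

```python
# Each frame: (frame_index, pts). Frames with the same index belong together,
# so the second stream's PTS is overwritten with the first stream's PTS.
def align_by_index(first, second):
    pts_of = {idx: pts for idx, pts in first}
    return [(idx, pts_of.get(idx, pts)) for idx, pts in second]

first  = [(1, 1000), (2, 1030), (3, 1060)]   # reference (broadcast) stream
second = [(1, 5000), (2, 5030), (3, 5060)]   # additional (IP) stream, offset PTS

aligned = align_by_index(first, second)
print(aligned)   # second stream now shares the reference time stamps
```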
- the transmitter 200-3 may provide synchronization information together with the MPD.
- the synchronization information may include content start information indicating a start point of the multimedia content, a time stamp difference value between the first data and the second data, a frame index, and the like.
- the synchronization information may be included in the first real time transport stream and the second real time transport stream, respectively, and may be transmitted. However, when the synchronization information is included in the metadata file, the synchronization point may be known before the second real time transport stream is transmitted.
- the first data and the second data included in each of the first real time transport stream and the second real time transport stream are processed together to form one multimedia content. Therefore, the first and second data are preferably produced together.
- FIG. 12 is a diagram for describing a transmission process of preparing first and second data together and transmitting them through two different paths.
- multimedia content photographed by one camera 310 is separated into first data and second data.
- the separated data are each encoded by the encoder 320 and then provided to different transmission apparatuses 200-1 and 200-2, respectively. That is, the first data corresponding to the reference image is encoded by the encoder 320 and then provided to the transmission device 1 200-1.
- the transmitting device 1 200-1 converts the corresponding data into a transport stream and broadcasts it through a broadcasting network in the form of an RF signal.
- the second data corresponding to the additional video is separated and encoded in units of an access unit and then provided to the transmitting apparatus 2 200-2.
- the transmitting device 2 200-2 buffers the data and transmits the data to the receiving device 100 through the communication network.
- the transmitting device 2 200-2 may also be referred to as a contents provide server.
- the transmitting device 2 200-2 stores the data provided from the encoder 320 up to the buffer size. Upon request from the receiving device 100, the requested data is provided to the receiving device 100.
- a plurality of encoders 320 may be provided, one for each data stream.
- FIG. 13 is a diagram for describing a process of transmitting and receiving first and second data.
- the first real-time transport stream including the first data is broadcast by the transmitting device 1 200-1 and transmitted to the receiving device 100.
- after detecting the address information included in the first real-time transport stream, the reception apparatus 100 obtains a metadata file using the address information.
- the reception apparatus 100 accesses the transmission apparatus 2 200-2 by using the metadata file and requests the second data.
- the transmitting device 2 200-2 transmits a second real time transport stream including the second data to the receiving device 100 according to a request.
- the second data includes a plurality of data units having at least one size adaptively set according to the state of the communication network. That is, the transmitter 2 (200-2) adaptively determines the size of the second data in consideration of the state of the communication network, specifically, the communication bandwidth, the communication speed, and the like.
- the resolution of the image stored in the buffer may be determined in consideration of the communication bandwidth. For example, the communication bandwidth may be measured in the process of exchanging requests and responses between the receiving device 100 and the transmitting device 2 200-2.
- the transmitting device 2 200-2 selects an image (eg, an SD class or HD class image) optimized for a network state in consideration of the measured bandwidth, and transmits the selected image to the receiving device 100. This can minimize delays.
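- the bandwidth-based selection can be sketched as follows; the variant names and bitrate thresholds are assumptions for illustration, not values from the patent.

```python
# Pick the stored variant (e.g. SD vs HD) whose bitrate fits the measured
# bandwidth; thresholds are illustrative.
VARIANTS = [("SD", 2_000_000), ("HD", 8_000_000)]   # (name, required bps)

def select_variant(measured_bps: int) -> str:
    best = VARIANTS[0][0]                 # fall back to the lowest variant
    for name, required in VARIANTS:
        if measured_bps >= required:
            best = name
    return best

print(select_variant(10_000_000))   # enough bandwidth for HD
print(select_variant(3_000_000))    # falls back to SD
```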
- the first and second real time transport streams may include synchronization information together with data.
- the synchronization information may include at least one of content start information, a time stamp difference value between the first data and the second data, a frame index, time code information, UTC information, and frame count information.
- the reception device 100 uses the information to determine a start time of multimedia content.
- the signal processor 160 may perform such an operation.
- the signal processor 160 compares the time stamp of each frame included in the first data and the time stamp of each frame included in the second data with the start time. Based on the comparison result, the frame index of each data can be extracted, and synchronization is performed using the extracted frame indexes.
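- assuming evenly spaced frames and a 1-based index convention (both assumptions for illustration), the frame index follows directly from the frame's time stamp, the content start time stamp (startPTS), and the per-frame time stamp interval:

```python
# If frames are evenly spaced, a frame's index follows from its PTS,
# the content start PTS, and the per-frame timestamp interval.
def frame_index(pts: int, start_pts: int, interval: int) -> int:
    return (pts - start_pts) // interval + 1   # 1-based index

# 90 kHz clock at 30 fps -> 3000 ticks per frame
print(frame_index(126000, 120000, 3000))   # third frame of the content
```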
- for example, when the time stamp interval is determined to be 30, the signal processor 160 matches the R1 frame to the n-th frame and the R2 frame to the n+1-th frame, so that the L2 frame and the R2 frame are synchronized at the n+1-th frame position.
- the signal processor 160 corrects the time stamp of the right eye image frame or the time stamp of the left eye image frame so that the time stamps of the two matching frames are the same.
- the right eye image frame matches the next frame of the left eye image frame.
- the signal processor 160 corrects and synchronizes the time stamp of the right eye image frame to be the same as the time stamp of the next frame of the left eye image frame.
- a time stamp difference value between two data may be used as synchronization information. That is, the first synchronization information and the second synchronization information may each include a difference value between the time stamp of the left eye image and the time stamp of the right eye image.
- the signal processor 160 corrects and synchronizes at least one of a time stamp of the left eye image and a time stamp of the right eye image by reflecting the difference value.
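- a sketch of the difference-value correction above; the sign convention (subtracting the signaled difference from the right-eye time stamp) is an assumption for illustration.

```python
# Apply the signaled time stamp difference so both eyes share one timeline.
def correct_with_ptsdiff(right_pts: int, pts_diff: int) -> int:
    """Shift a right-eye PTS onto the left-eye timeline (sign assumed)."""
    return right_pts - pts_diff

left_pts  = 1030
right_pts = 5030
pts_diff  = right_pts - left_pts      # value carried as synchronization info

print(correct_with_ptsdiff(right_pts, pts_diff))   # matches the left-eye PTS
```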
- the content start information and the time stamp difference value information may be recorded in an event information table (EIT), a PMT, a private stream, a transport stream header, and the like.
- synchronization information may be recorded in an mdhd or stts box.
- the signal processor 160 may calculate a frame rate using a time scale or a duration, and synchronize the playback time by comparing the calculated frame rates.
- the signal processor 160 may synchronize the two signals by using the relative reproduction timing and the start time.
- frame index information may be used as synchronization information.
- the frame index information means identification information provided for each frame.
- the signal processor 160 may correct the time stamps of the frames having the same frame index to be the same.
- FIG. 14 illustrates a configuration of a stream including a program map table (PMT).
- the PMT is periodically included in the first signal and the second signal transmitted from each of the transmitters 200-1 and 200-2.
- Various synchronization information such as the above-described content start information, time stamp difference value, frame index, etc. may be included in the PMT and transmitted.
- FIG. 15 is a diagram illustrating the structure of a PMT. According to FIG. 15, various synchronization information may be transmitted using a reserved area, a new descriptor, an extended area of an existing descriptor, and the like in the PMT.
- FIG. 16 is a diagram for explaining a method of transmitting various synchronization information using an adaptation field of a transport stream.
- random_access_indicator, transport_private_data_flag, private_data_byte, and the like are provided in the adaptation field.
- random_access_indicator is implemented with 1 bit, and when set to 1, indicates the start of a sequence header. In other words, it represents a random access point of the transport stream.
- transport_private_data_flag is also implemented as 1 bit. If set to 1, it means that there is more than 1 byte of private data.
- the private_data_byte is implemented with 4 to 5 bytes, and this part may include synchronization information such as content start information, time stamp difference value, frame index, and the like.
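- the adaptation field layout above follows ISO/IEC 13818-1, so reaching the private_data_byte area can be sketched as a small parser. The interpretation of the private bytes (here, a frame index) is specific to this scheme and assumed for illustration.

```python
# Minimal parse of an MPEG-2 TS adaptation field to reach private_data_byte
# (layout per ISO/IEC 13818-1).
def parse_private_data(packet):
    assert packet[0] == 0x47 and len(packet) == 188   # sync byte, packet size
    afc = (packet[3] >> 4) & 0x3                      # adaptation_field_control
    if afc not in (2, 3):                             # no adaptation field
        return None
    af_len = packet[4]
    if af_len == 0:
        return None
    flags = packet[5]
    if not flags & 0x02:                              # transport_private_data_flag
        return None
    pos = 6
    if flags & 0x10: pos += 6                         # skip PCR
    if flags & 0x08: pos += 6                         # skip OPCR
    if flags & 0x04: pos += 1                         # skip splice_countdown
    length = packet[pos]                              # transport_private_data_length
    return packet[pos + 1 : pos + 1 + length]

# Build a test packet: adaptation field with only the private-data flag set,
# carrying 4 private bytes (e.g. a frame index of 42).
af_body = bytes([0x02, 4]) + (42).to_bytes(4, "big")  # flags, length, data
af = bytes([len(af_body)]) + af_body
pkt = bytes([0x47, 0x00, 0x00, 0x30]) + af
pkt = pkt + bytes(188 - len(pkt))
print(int.from_bytes(parse_private_data(pkt), "big"))  # recovered frame index: 42
```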
- PES_private_data_flag may be set to 1 and synchronization information may be recorded in the PES_private_data part.
- FIG. 18 illustrates a method of delivering synchronization information such as content start information, a time stamp difference value, a frame index, etc. using an event information table (EIT).
- a private stream in which synchronization information such as content start information, time stamp information, and frame index information is recorded, that is, a data bit stream, may be included and transmitted separately from the packetized elementary stream (PES).
- the stream ID of the PES header may use a reserved value in addition to the predefined 0xBD and 0xBF.
- time code, UTC, frame count information, or the like may also be transmitted using a private stream. This will be described later.
- a transport stream carries video, audio, and other data.
- Information of each program is recorded in a program map table (PMT).
- FIG. 20 illustrates a structure in which a frame index is inserted into a PMT
- the frame index may be inserted into a video stream header, an audio stream header, a TS header, and the like.
- the frame index of the next frame is recorded in each PMT.
- the value of Hybridstream_Info_Descriptor() is defined to indicate the same frame index. If the descriptor can be inserted in I-frame units in the multiplexer of the transmitting apparatus, data duplication can be prevented.
- the receiving apparatus 100 may detect an index of a frame with reference to each PMT, and then synchronize the frames of the first signal and the second signal using the index.
- the frame index may be provided in a different manner.
- FIG. 21 is a diagram illustrating a case in which a frame index is transmitted through a separate private stream.
- a private stream may be provided separately from a multimedia stream such as video or audio in the first signal, and a frame index value synchronized with the second signal may be provided through the corresponding private stream.
- when the second signal is also a real-time transport stream having the structure shown in FIG. 21, the frame index may be detected from the private stream of that transport stream and used for synchronization.
- time code may also be used as synchronization information.
- FIG. 22 is a diagram for describing a real-time transmission method using time codes of images captured by a plurality of cameras.
- the first data and the second data photographed by the plurality of cameras are respectively encoded and then transmitted through a broadcasting network or a communication network.
- the same time code is given to corresponding data frames. That is, the frames 51, 52, and 53 of the first data and the frames 61, 62, and 63 of the second data have different time stamps (PTSs) but are given the same time codes.
- This time code may be used as synchronization information at the receiving end.
- a time code is a series of pulse signals produced by a time code generator and is a signal standard developed for easy editing management.
- the same time code is used for synchronized management of left and right eye images. Therefore, paired frames keep the same time code regardless of when the stream is generated or delivered.
- according to SMPTE 12M, defined by the Society of Motion Picture and Television Engineers (SMPTE), the time code is represented in the form "hour:minute:second:frame".
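- for a non-drop-frame rate, converting between the "hour:minute:second:frame" form and an absolute frame number is straightforward (drop-frame time code is omitted for brevity):

```python
# SMPTE 12M "hour:minute:second:frame" time code <-> absolute frame number,
# for a non-drop-frame rate.
def timecode_to_frames(tc: str, fps: int) -> int:
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def frames_to_timecode(n: int, fps: int) -> str:
    f = n % fps; n //= fps
    s = n % 60;  n //= 60
    m = n % 60;  h = n // 60
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

n = timecode_to_frames("01:00:00:15", 30)
print(n)                            # 108015
print(frames_to_timecode(n, 30))    # round-trips back to "01:00:00:15"
```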
- the SMPTE time code may be classified into longitudinal time code (LTC) or vertical interval time code (VITC) according to the recording method.
- the LTC is recorded according to the advancing direction of the tape.
- the LTC may be configured as a total of 80 bits of data, including time information (25 bits), user information (32 bits), synchronization information (16 bits), a storage area (4 bits), and a frame mode indication (2 bits).
- VITC is recorded on two horizontal lines within the vertical blanking period of the video signal.
- SMPTE RP-188 defines an interface specification that allows LTC or VITC type timecode to be transmitted as ancillary data.
- the time code and additional information related to the time code can be newly defined and transmitted according to this interface standard.
- the additional information related to the time code may include a time code for the other image, provided when the time codes of the left eye image and the right eye image do not match; 2D/3D conversion information indicating whether the current image is a stereoscopic image; and start point information of the stereoscopic image. Such additional information may be provided through a user information area or a storage area (or an unassigned area).
- the time code space may be extended and used in a network protocol. For example, a time code may be provided through an RTP header extension.
- a time code may be recorded as 25 bits of data.
- the time code may be delivered to the receiving device 100 in GoP units.
- the time code may be recorded in a private stream and transmitted. That is, the private stream in which the time code is recorded, that is, the data bit stream, may be included and transmitted separately from the packetized elementary stream (PES).
- the stream ID of the PES header may use a reserved value in addition to the predefined 0xBD and 0xBF.
- UTC or frame count information may be transmitted similarly to the time code.
- the time code may be transmitted using supplemental enhancement information (SEI) defined in AVC (Advanced Video Coding: ISO / IEC 14496-10). That is, as shown in FIG. 24, a time code may be delivered using seconds_value, minutes_value, hours_value, and n_frames defined in Picture timing SEI.
- FIG. 25 shows a stream structure when providing a time code using an audio stream.
- an audio stream has a structure in which sync frames are continuously arranged.
- the information on the time code may be provided in a bit stream information (bsi) area that provides information of the sync frame among the configurations of the sync frame.
- FIG. 26 shows PMT syntax when a time code is provided through PMT.
- a time code may be provided through a reserved area or a descriptor of the periodically transmitted PMT.
- the PMT provision interval may be made in GoP units or frame units to give a synchronized time code.
- a PMT is transmitted every two frames, but a PMT including a time code may be provided every frame.
- various pieces of information may be used as synchronization information, and the location thereof may also be variously set.
- FIG. 27 is a block diagram illustrating an example of a transmitting device that transmits a real time transport stream.
- the transmitting apparatus of FIG. 27 may be implemented as any one of the transmitting apparatuses 1 and 2 in the system of FIG. 1.
- the transmitter includes a stream configuration unit 710, an output unit 720, and a control unit 730.
- the stream configuration unit 710 configures a first real time transport stream including the first data and the first synchronization information.
- the first data may be one of a left eye image and a right eye image.
- the second data which is the other of the left eye image and the right eye image, may be provided from the other transmitting apparatus to the receiving apparatus. Accordingly, the first and second data may be combined and represented as a 3D image.
- the first data may be at least one of video data, audio data, subtitle data, and additional data constituting the multimedia content.
- the first synchronization information is information for synchronizing the first data and the second data. Since the types of synchronization information have been described above, duplicate description thereof is omitted.
- the output unit 720 transmits the stream generated by the stream configuration unit 710 to the receiving device 100 side.
- the detailed configuration of the output unit 720 may be implemented differently according to the type of the stream.
- the output unit 720 may be implemented in a form including an RS encoder, an interleaver, a trellis encoder, a modulator, and the like.
- when the transmitting apparatus of FIG. 27 is a web server transmitting a stream through a network such as the Internet, the output unit 720 may be implemented as a network interface module communicating with the receiving apparatus, that is, a web client, according to the HTTP protocol.
- the control unit 730 controls the output unit 720 to delay the output timing of the first real time transport stream in accordance with the output timing of the other transmitting device.
- the other transmitting device refers to an apparatus for transmitting a second real time transport stream including second data and second synchronization information.
- the second data refers to data for composing one multimedia content together with the first data.
- the output timing can be adjusted by sharing broadcast program time information between the transmitting devices.
- there may be various stream generating entities, such as a broadcasting station transmitting video and audio, a third party transmitting additional data such as subtitles, and another third party providing a related game.
- One of these stream generating subjects may transmit a time plan to other subjects based on a time code.
- Each stream generator may generate synchronization information using a time schedule and add the synchronization information to the transport stream, and may delay the transmission timing of the transport stream to match other transmitters.
- unlike the time schedule provided by a conventional EPG, the time plan or synchronization information is frame-level information accurate enough for synchronization at the stream generation stage.
- each stream generating subject may share a reference time, that is, a PCR, through a reference time server.
- to match the output timing, the transmission timing can be delayed.
- the same DTS and PTS may be generated and added to the same frame of content.
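- a sketch of how two transmitters sharing a reference clock (PCR, 90 kHz base) can derive identical PTS values for the same content frame independently; the helper name and the numeric values are illustrative.

```python
# With a shared reference clock (PCR, 90 kHz base), every stream generator
# can compute the same PTS for the same content frame on its own.
PCR_HZ = 90_000

def pts_for_frame(shared_pcr_base: int, frame_index: int, fps: int) -> int:
    return shared_pcr_base + frame_index * PCR_HZ // fps

# Two transmitters using the same reference time server agree per frame:
tx1 = pts_for_frame(1_000_000, 10, 30)
tx2 = pts_for_frame(1_000_000, 10, 30)
print(tx1, tx1 == tx2)   # 1030000 True
```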
- the controller 730 controls the stream configuration unit 710 and the output unit 720 to perform such a delay operation and synchronization information generation.
- the transmitting device transmitting the stream including the first data delays the transmission.
- another transmitting device transmitting the stream including the second data may delay the transmission.
- the other transmitting device may have a configuration as shown in FIG. 27.
- the reception apparatus may not need to delay the stream after receiving the stream. That is, in the whole system including the transmitting apparatus and the receiving apparatus, the stream processing delay task may be performed only at the transmitting apparatus or at the receiving apparatus. Therefore, when the transmitting apparatus itself delays stream transmission as shown in FIG. 27, the receiving apparatus may not be implemented as shown in FIG. 1.
- Fig. 28 is a block diagram showing the configuration of a transmitting device for transmitting a real time transport stream by the HTTP streaming method.
- the transmitting apparatus includes a stream constructing unit 710 and an output unit 720, and the stream constructing unit 710 includes an encoding unit 711 and a multiplexing unit 712.
- the stream configuration unit of FIG. 28 configures a first real time transport stream including first data and address information.
- the address information refers to the address of a metadata file through which the second data, which constitutes the multimedia content together with the first data, can be acquired over the communication network. Specifically, it may be URL information of the server providing the metadata file.
- the encoder 711 receives the first data from the content producer.
- the encoder 711 encodes the first data and provides the encoded first data to the multiplexer 712.
- the multiplexer 712 multiplexes the encoded first data and the address information to generate a first real time transport stream.
- the encoder 711 may also be provided with signaling information from the content producer.
- Signaling information means basic information necessary for generating synchronization information.
- the encoder 711 generates synchronization information using the signaling information and adds the synchronization information to the encoded first data.
- if the synchronization information is content start information, the encoder 711 generates a time stamp of the first frame based on the PCR and adds the time stamp as the synchronization information.
- the signaling information may be implemented as information on PCR of another transmitting device generating and transmitting the second data.
- the encoder 711 may generate, as synchronization information, a time stamp difference between the first and second data based on the signaling information, and add it to the encoded first data.
- the first data and the synchronization information may be input to the encoder 711 without additional signaling information.
- the encoder 711 encodes the first data and the synchronization information as they are and provides the encoded data to the multiplexer 712.
- the address information itself may be input together to the encoder 711 and encoded together with the first data.
- a configuration for performing video data compression or the like conforming to the MPEG standard may be further added to the stream configuration unit 710, but illustration and description of such a configuration are omitted.
- the multiplexer 712 multiplexes additional data with the data generated by the encoder 711 to generate transmission data.
- the additional data may be PSIP and EPG information.
- the output unit 720 performs channel encoding, modulation, and the like on the transport stream provided by the multiplexer 712, converts it into a transmission signal, and then transmits it.
- for modulation, the 8-VSB scheme used in terrestrial broadcasting systems, the 16-VSB scheme, which is a high data rate scheme for cable TV, and the like may be used.
- FIG. 29 shows a configuration of a transmitting apparatus according to another embodiment of the present invention.
- the transmitter of FIG. 29 processes and transmits the time code as a separate private stream.
- the transmitter includes an A / V encoder 510, a time code detector 520, a time code encoder 530, and a multiplexer 540.
- the A / V encoder 510 encodes A / V data included in the input multimedia data.
- the encoding scheme may vary depending on the standard applied to the transmitting apparatus.
- the time code detector 520 detects a time code of an image from the input multimedia data and provides the time code to the time code encoder 530.
- the detected time code may be stored as a timeline data file. In this case, not only the time code but also various additional information may be detected and provided together to the time code encoder 530.
- the time code encoder 530 encapsulates the detected time code into an appropriate transmission format and combines it with a presentation time stamp calculated using the same program system clock as the A/V encoder 510, thereby synchronizing it with the audio/video (A/V) data.
- the time code information processed by the time code encoder 530 is provided to the multiplexer 540 together with the A / V data processed by the A / V encoder 510.
- the multiplexer 540 multiplexes this data and outputs MPEG2-TS.
- various components such as a pilot inserter, a modulator, an interleaver, a randomizer, an RF upconverter, and the like may be added to the transmitter. Since these configurations correspond to the general configurations of the transmission apparatus, detailed illustration and description are omitted.
- FIG. 30 is a flowchart illustrating a method of playing multimedia content according to an embodiment of the present invention.
- the first data and the second data are detected from each stream (S2240). Then, the multimedia content is composed by combining the detected first and second data (S2250), and then played back (S2260).
- FIG. 31 is a flowchart for describing a method of receiving a second real-time transport stream in detail.
- when the first real-time transport stream is received, it is analyzed (S2310) and address information is detected (S2320).
- the communication network is accessed using the detected address information (S2330).
- the metadata file is received from the server corresponding to the address information (S2340), and the source is accessed using the metadata file (S2350).
- a second real time transport stream is received from the source.
- the first and second real time transport streams may each include synchronization information.
- since the structure of the metadata file, the recording position of the address information in the stream, and the like have been described in detail above, duplicate description is omitted.
- the first and second data may be data constituting 3D content, such as a left eye image and a right eye image, or may be partial data constituting a single multimedia content, such as video, audio, and subtitles.
- the program for performing the method according to various embodiments of the present disclosure described above may be stored and used in various types of recording media.
- the code for performing the above-described methods may be stored in various types of recording media readable by a terminal, such as random access memory (RAM), flash memory, read only memory (ROM), erasable programmable ROM (EPROM), electronically erasable and programmable ROM (EEPROM), a register, a hard disk, a removable disk, a memory card, USB memory, and a CD-ROM.
Claims (15)
- A receiving apparatus comprising: a first receiver which receives a first real time transport stream through a broadcasting network; a second receiver which receives a second real time transport stream through a communication network; a delay processor which synchronizes the first and second real time transport streams by delaying at least one of them; a first detector which detects first data from the first real time transport stream; a second detector which detects second data from the second real time transport stream; a signal processor which combines the first data and the second data to construct multimedia content; and a player which plays back the multimedia content.
- The receiving apparatus of claim 1, wherein the first real time transport stream includes address information, the second receiver accesses a server in the communication network using the address information, receives a metadata file from the server, and receives the second real time transport stream using the metadata file, and the metadata file includes information on a source of the second real time transport stream.
- The receiving apparatus of claim 2, wherein the address information is recorded in at least one of a reserved area in a PMT of the first real time transport stream, a descriptor area in the PMT, a reserved area of the first real time transport stream, a private data area of the first real time transport stream, a reserved area in a PES of the first real time transport stream, a private data area in the PES of the first real time transport stream, a user area in an ES header, a private area in the ES header, and, in the case of the H.264 standard, SEI.
- 제1항에 있어서,상기 제2 데이터는,상기 통신 망의 상태에 따라 적응적으로 설정된 적어도 하나의 사이즈를 가지는 복수의 데이터 유닛을 포함하는 것을 특징으로 하는 수신 장치.
- 제1항 내지 제4항 중 어느 한 항에 있어서,상기 제1 데이터 및 상기 제2 데이터 중 하나는 좌안 영상을 포함하고, 다른 하나는 우안 영상을 포함하며, 상기 멀티미디어 컨텐츠는 3D 컨텐츠인 것을 특징으로 하는 수신 장치.
- 제5항에 있어서,상기 제1 실시간 전송 스트림은 제1 동기화 정보를 포함하고,상기 제2 실시간 전송 스트림은 제2 동기화 정보를 포함하며,상기 제1 및 제2 동기화 정보는상기 멀티미디어 컨텐츠의 시작 지점을 알리는 컨텐츠 스타트 정보, 상기 제1 데이터 및 상기 제2 데이터 간의 타임 스탬프 차이 값, 프레임 인덱스 중 적어도 하나를 포함하는 것을 특징으로 하는 수신 장치.
- 제6항에 있어서,상기 제1 및 제2 동기화 정보를 이용하여 상기 제1 데이터에 포함된 각 프레임의 타임 스탬프 및 상기 제2 데이터에 포함된 각 프레임의 타임 스탬프 중 적어도 하나를 보정하고, 보정된 타임 스탬프에 따라 상기 제1 및 제2 데이터의 각 프레임을 조합하여 상기 멀티미디어 컨텐츠를 구성하도록 상기 신호 처리부를 제어하는 제어부;를 더 포함하는 것을 특징으로 하는 수신 장치.
- 제5항에 있어서,상기 제1 실시간 전송 스트림은 제1 동기화 정보를 포함하고,상기 제2 실시간 전송 스트림은 제2 동기화 정보를 포함하며,상기 제1 및 제2 동기화 정보는 영상 프레임의 타임 코드 정보인 것을 특징으로 하는 수신 장치.
- 송신 장치에 있어서,제1 데이터 및 제1 동기화 정보를 포함하는 제1 실시간 전송 스트림을 구성하는 스트림 구성부;상기 제1 실시간 전송 스트림을 출력하는 출력부; 및,제2 실시간 전송 스트림을 출력하는 타 송신 장치의 출력 타이밍에 맞추어 상기 제1 실시간 전송 스트림의 출력 타이밍을 지연시키도록 상기 출력부를 제어하는 제어부;를 포함하며,상기 제2 실시간 전송 스트림은 제2 데이터 및 제2 동기화 정보를 포함하고,상기 제1 및 제2 데이터는 하나의 멀티미디어 컨텐츠를 구성하기 위한 데이터이며,상기 제1 동기화 정보 및 상기 제2 동기화 정보는 상기 제1 데이터 및 상기 제2 데이터의 동기화를 위하여 전송되는 정보인 것을 특징으로 하는 송신 장치.
- 송신 장치에 있어서,제1 데이터 및 주소 정보를 포함하는 제1 실시간 전송 스트림을 구성하는 스트림 구성부; 및상기 제1 실시간 전송 스트림을 출력하는 출력부;를 포함하며,상기 주소 정보는, 상기 제1 데이터와 함께 멀티미디어 컨텐츠를 구성하는 제2 데이터를 통신망에서 획득할 수 있는 메타데이터 파일에 대한 주소 정보인 것을 특징으로 하는 송신 장치.
- 수신 장치의 멀티미디어 컨텐츠 재생 방법에 있어서,제1 실시간 전송 스트림을 방송망을 통해 수신하는 단계;제2 실시간 전송 스트림을 통신망을 통해 수신하는 단계;상기 제1 및 제2 실시간 전송 스트림 중 적어도 하나를 지연(delay)시켜 동기화하는 단계;상기 제1 실시간 전송 스트림으로부터 제1 데이터를 검출하고, 상기 제2 실시간 전송 스트림으로부터 제2 데이터를 검출하는 단계;상기 제1 데이터 및 상기 제2 데이터를 조합하여 멀티미디어 컨텐츠를 구성하는 단계; 및상기 멀티미디어 컨텐츠를 재생하는 단계;를 포함하는 멀티미디어 컨텐츠 재생 방법.
- 제11항에 있어서,상기 제2 실시간 전송 스트림을 통신망을 통해 수신하는 단계는,상기 제1 실시간 전송 스트림에 포함된 주소 정보를 검출하는 단계;상기 주소 정보를 이용하여 상기 통신망 내의 서버로 액세스하여 상기 서버로부터 메타데이터 파일을 수신하는 단계; 및상기 메타데이터 파일을 이용하여 상기 제2 실시간 전송 스트림의 소스에 액세스하여, 상기 제2 실시간 전송 스트림을 수신하는 단계;를 포함하는 것을 특징으로 하는 멀티미디어 컨텐츠 재생 방법.
- 제12항에 있어서,상기 제1 데이터 및 상기 제2 데이터 중 하나는 좌안 영상을 포함하고, 다른 하나는 우안 영상을 포함하며, 상기 멀티미디어 컨텐츠는 3D 컨텐츠인 것을 특징으로 하는 멀티미디어 컨텐츠 재생 방법.
- 제11항에 있어서,상기 제2 데이터는, 상기 통신 망의 상태에 따라 적응적으로 설정된 적어도 하나의 사이즈를 가지는 복수의 데이터 유닛을 포함하는 것을 특징으로 하는 멀티미디어 컨텐츠 재생 방법.
- 제14항에 있어서,상기 제1 실시간 전송 스트림은 제1 동기화 정보를 포함하고,상기 제2 실시간 전송 스트림은 제2 동기화 정보를 포함하며,상기 제1 및 제2 동기화 정보는상기 멀티미디어 컨텐츠의 시작 지점을 알리는 컨텐츠 스타트 정보, 상기 제1 데이터 및 상기 제2 데이터 간의 타임 스탬프 차이 값, 프레임 인덱스, 타임 코드 중 적어도 하나를 포함하는 것을 특징으로 하는 멀티미디어 컨텐츠 재생 방법.
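The synchronization recited in the claims above (a signalled time stamp difference value used to correct one stream's time stamps, then frame-by-frame combination) can be sketched as follows. This is a minimal illustration under assumed data shapes, not the patented implementation; the frame tuples and the offset value are hypothetical:

```python
# Hypothetical sketch of the claimed timestamp correction: shift the
# second stream's time stamps by the signalled difference value, then
# pair left-eye and right-eye frames with matching corrected stamps.

def correct_timestamps(frames, ts_difference):
    """Shift every (timestamp, payload) frame by the signalled difference."""
    return [(ts - ts_difference, payload) for ts, payload in frames]

def combine(left_frames, right_frames):
    """Pair left- and right-eye frames whose corrected time stamps match."""
    right_by_ts = dict(right_frames)
    return [(ts, left, right_by_ts[ts])
            for ts, left in left_frames if ts in right_by_ts]

left = [(1000, "L0"), (1033, "L1")]
right_raw = [(1500, "R0"), (1533, "R1")]  # this stream runs 500 ticks ahead
paired = combine(left, correct_timestamps(right_raw, 500))
print(paired)  # [(1000, 'L0', 'R0'), (1033, 'L1', 'R1')]
```

Each paired tuple would then feed the signal processor that renders one 3D frame from a left-eye and a right-eye image.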
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2013550392A JP5977760B2 (ja) | 2011-01-19 | 2012-01-11 | 複数のリアルタイム伝送ストリームを受信する受信装置と、その送信装置およびマルチメディアコンテンツ再生方法 |
| US13/980,679 US20130293677A1 (en) | 2011-01-19 | 2012-01-11 | Reception device for receiving a plurality of real-time transfer streams, transmission device for transmitting same, and method for playing multimedia content |
| BR112013018340A BR112013018340A2 (pt) | 2011-01-19 | 2012-01-11 | dispositivo de recepção, dispositivo de transmissão, e método de reprodução de conteúdo de multimídia em um dispositivo de recepção |
| CN2012800060016A CN103329551A (zh) | 2011-01-19 | 2012-01-11 | 接收多个实时传输流的接收装置及其发送装置以及多媒体内容再现方法 |
| EP12737016.1A EP2645727A4 (en) | 2011-01-19 | 2012-01-11 | RECEPTION DEVICE FOR RECEIVING MULTIPLE REAL-TIME TRANSMISSION FLOWS, TRANSMISSION DEVICE FOR TRANSMITTING THESE FLOWS, AND METHOD FOR REPRODUCING MULTIMEDIA CONTENT |
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201161434107P | 2011-01-19 | 2011-01-19 | |
| US61/434,107 | 2011-01-19 | ||
| US201161450818P | 2011-03-09 | 2011-03-09 | |
| US61/450,818 | 2011-03-09 | ||
| KR10-2011-0128644 | 2011-12-02 | ||
| KR1020110128644A KR20120084252A (ko) | 2011-01-19 | 2011-12-02 | 복수의 실시간 전송 스트림을 수신하는 수신 장치와 그 송신 장치 및 멀티미디어 컨텐츠 재생 방법 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2012099359A2 true WO2012099359A2 (ko) | 2012-07-26 |
| WO2012099359A3 WO2012099359A3 (ko) | 2012-12-06 |
Family
ID=46715247
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2012/000271 Ceased WO2012099359A2 (ko) | 2011-01-19 | 2012-01-11 | 복수의 실시간 전송 스트림을 수신하는 수신 장치와 그 송신 장치 및 멀티미디어 컨텐츠 재생 방법 |
Country Status (7)
| Country | Link |
|---|---|
| US (1) | US20130293677A1 (ko) |
| EP (1) | EP2645727A4 (ko) |
| JP (1) | JP5977760B2 (ko) |
| KR (1) | KR20120084252A (ko) |
| CN (1) | CN103329551A (ko) |
| BR (1) | BR112013018340A2 (ko) |
| WO (1) | WO2012099359A2 (ko) |
Families Citing this family (46)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9491437B2 (en) | 2010-12-07 | 2016-11-08 | Samsung Electronics Co., Ltd. | Transmitter for transmitting data for constituting content, receiver for receiving and processing data, and method therefor |
| KR101831775B1 (ko) * | 2010-12-07 | 2018-02-26 | 삼성전자주식회사 | 멀티미디어 컨텐츠를 송수신하는 송신 장치 및 수신 장치와, 그 재생 방법 |
| CN108366070A (zh) | 2011-03-16 | 2018-08-03 | 韩国电子通信研究院 | 用于提供媒体内容的方法和客户端 |
| US12212791B2 (en) * | 2011-06-14 | 2025-01-28 | Comcast Cable Communications, Llc | Metadata delivery system for rendering supplementary content |
| US9762967B2 (en) | 2011-06-14 | 2017-09-12 | Comcast Cable Communications, Llc | System and method for presenting content with time based metadata |
| KR20130018208A (ko) * | 2011-08-12 | 2013-02-20 | 한국방송공사 | 송신 장치, 수신 장치 및 그 송수신 방법 |
| US9516086B2 (en) | 2011-08-12 | 2016-12-06 | Samsung Electronics Co., Ltd. | Transmitting device, receiving device, and transceiving method thereof |
| US8948249B2 (en) * | 2011-08-19 | 2015-02-03 | Google Technology Holdings LLC | Encoder-aided segmentation for adaptive streaming |
| KR101719998B1 (ko) * | 2011-12-12 | 2017-03-27 | 엘지전자 주식회사 | 미디어 컨텐트를 수신하는 장치 및 방법 |
| WO2013162256A1 (ko) * | 2012-04-23 | 2013-10-31 | 엘지전자 주식회사 | 3d 서비스를 위한 신호 처리 장치 및 방법 |
| US9426506B2 (en) * | 2012-08-22 | 2016-08-23 | University-Industry Cooperation Group Of Kyung Hee University | Apparatuses for providing and receiving augmented broadcasting service in hybrid broadcasting environment |
| KR101385606B1 (ko) * | 2012-08-28 | 2014-04-16 | 국민대학교산학협력단 | 3차원 스트리밍 방송 수신 방법 및 그 방법을 실행하는 멀티 모드 단말기 |
| US9813325B2 (en) * | 2012-12-27 | 2017-11-07 | Comcast Cable Communications, Llc | Information stream management |
| KR101591179B1 (ko) * | 2013-06-28 | 2016-02-04 | 한국전자통신연구원 | 3d 영상 재생 장치 및 방법 |
| CN104601977A (zh) * | 2013-10-31 | 2015-05-06 | 立普思股份有限公司 | 感测装置及其信号处理方法 |
| KR20150057149A (ko) * | 2013-11-18 | 2015-05-28 | 한국전자통신연구원 | 재전송망에 기초한 3d 방송 서비스 제공 시스템 및 방법 |
| CN106464677A (zh) * | 2014-04-09 | 2017-02-22 | Lg电子株式会社 | 发送/接收广播信号的方法和设备 |
| CN106165433B (zh) * | 2014-04-09 | 2019-06-25 | Lg电子株式会社 | 广播发送装置、广播接收装置以及广播接收装置的操作方法 |
| WO2015160137A1 (ko) | 2014-04-18 | 2015-10-22 | 엘지전자 주식회사 | 방송 신호 송신 장치, 방송 신호 수신 장치, 방송 신호 송신 방법, 및 방송 신호 수신 방법 |
| DE102014109088A1 (de) * | 2014-06-27 | 2015-12-31 | Deutsche Telekom Ag | Verfahren zum kontinuierlichen Überwachen einer Synchronität zwischen verschiedenen beim HTTP adaptiven Streaming verwendeten Qualitätsprofilen |
| CN104618673B (zh) * | 2015-01-20 | 2018-05-01 | 武汉烽火众智数字技术有限责任公司 | 一种基于nvr的多路录像同步回放控制方法和装置 |
| DE102015001622A1 (de) | 2015-02-09 | 2016-08-11 | Unify Gmbh & Co. Kg | Verfahren zur Übertragung von Daten in einem Multimedia-System, sowie Softwareprodukt und Vorrichtung zur Steuerung der Übertragung von Daten in einem Multimedia-System |
| US10887644B2 (en) * | 2015-09-01 | 2021-01-05 | Sony Corporation | Reception device, data processing method, and program |
| CN108475103A (zh) | 2015-09-30 | 2018-08-31 | 惠普发展公司,有限责任合伙企业 | 交互式显示器 |
| CN106686523A (zh) * | 2015-11-06 | 2017-05-17 | 华为终端(东莞)有限公司 | 数据处理方法及装置 |
| US20180309972A1 (en) * | 2015-11-11 | 2018-10-25 | Sony Corporation | Image processing apparatus and image processing method |
| US10764473B2 (en) * | 2016-01-14 | 2020-09-01 | Disney Enterprises, Inc. | Automatically synchronizing multiple real-time video sources |
| JP6740002B2 (ja) * | 2016-05-24 | 2020-08-12 | キヤノン株式会社 | 制御装置、制御方法及びプログラム |
| US10148722B2 (en) | 2016-07-04 | 2018-12-04 | Znipe Esports AB | Methods and nodes for synchronized streaming of a first and a second data stream |
| SE541208C2 (en) * | 2016-07-04 | 2019-04-30 | Znipe Esports AB | Methods and nodes for synchronized streaming of a first and a second data stream |
| KR102263223B1 (ko) | 2017-03-14 | 2021-06-09 | 삼성전자 주식회사 | 전자장치 및 그 제어방법 |
| US10594758B2 (en) * | 2017-12-15 | 2020-03-17 | Cisco Technology, Inc. | Latency reduction by sending audio and metadata ahead of time |
| KR102085441B1 (ko) | 2017-12-26 | 2020-03-05 | (주)스코넥엔터테인먼트 | 가상 현실 제어 시스템 |
| JP6504294B2 (ja) * | 2018-03-23 | 2019-04-24 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
| CN108737807B (zh) * | 2018-05-10 | 2020-05-26 | Oppo广东移动通信有限公司 | 一种数据处理方法、终端、服务器和计算机存储介质 |
| CN108900928A (zh) * | 2018-07-26 | 2018-11-27 | 宁波视睿迪光电有限公司 | 裸眼3d直播的方法及装置、3d屏客户端、流媒体云服务器 |
| CN109194971B (zh) * | 2018-08-27 | 2021-05-18 | 咪咕视讯科技有限公司 | 一种为多媒体文件的生成方法及装置 |
| KR102029604B1 (ko) * | 2018-12-07 | 2019-10-08 | 스타십벤딩머신 주식회사 | 실시간 방송 편집 시스템 및 편집 방법 |
| CN110418207B (zh) * | 2019-03-29 | 2021-08-31 | 腾讯科技(深圳)有限公司 | 信息处理方法、装置及存储介质 |
| KR102714326B1 (ko) * | 2019-09-30 | 2024-10-07 | 한화비전 주식회사 | 비디오 데이터 및 메타 데이터를 실시간 동기화하는 영상 수신 장치 및 그 방법 |
| KR102445069B1 (ko) * | 2020-12-01 | 2022-09-21 | 주식회사 마젠타컴퍼니 | 복수의 미디어 소스를 동기화하여 통합 전송하는 시스템 및 그 방법 |
| KR102445495B1 (ko) * | 2021-02-17 | 2022-09-21 | 주식회사 엘지유플러스 | 3d 콘텐츠 재생 장치 및 방법 |
| US11615727B2 (en) * | 2021-04-12 | 2023-03-28 | Apple Inc. | Preemptive refresh for reduced display judder |
| CN117157988A (zh) | 2021-04-16 | 2023-12-01 | 抖音视界有限公司 | 最小化直播流中的初始化延迟 |
| KR102555481B1 (ko) * | 2021-10-25 | 2023-07-13 | 주식회사 픽스트리 | 멀티뷰 서비스를 위한 다중 입력 영상 동기화 방법 및 시스템 |
| CN120416571A (zh) * | 2024-01-31 | 2025-08-01 | 抖音视界有限公司 | 用于流媒体数据传输的方法、装置、设备和介质 |
Family Cites Families (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH09149419A (ja) * | 1995-11-24 | 1997-06-06 | Ekushingu:Kk | Mpeg2データの伝送システム及び伝送方法 |
| JP3567696B2 (ja) * | 1997-09-24 | 2004-09-22 | 松下電器産業株式会社 | ソフトウェアダウンロードシステム |
| WO1999066722A1 (fr) * | 1998-06-17 | 1999-12-23 | Hitachi, Ltd. | Procede et recepteur de diffusion |
| JP2001136496A (ja) * | 1999-11-05 | 2001-05-18 | Nec Corp | 受信機器、映像・データ同期装置及び方法 |
| JP4489932B2 (ja) * | 2000-11-27 | 2010-06-23 | 富士通株式会社 | 複数の通信を同期させるシステム及び方法 |
| JP2002142233A (ja) * | 2000-11-01 | 2002-05-17 | Hitoshi Ishida | 立体画像を提供するための画像提供装置および画像提供方法、受信装置および受信方法、並びに立体画像を提供するための立体画像提供システムおよび立体画像提供方法。 |
| JP4252324B2 (ja) * | 2003-01-28 | 2009-04-08 | 三菱電機株式会社 | 受信機、放送送出装置及び補助コンテンツサーバ |
| JP2004266497A (ja) * | 2003-02-28 | 2004-09-24 | Rikogaku Shinkokai | ステレオ映像放送受信用セットトップボックスおよびステレオ映像放送方法 |
| US8312335B2 (en) * | 2006-07-06 | 2012-11-13 | Lg Electronics Inc. | Method and apparatus for correcting errors in a multiple subcarriers communication system using multiple antennas |
| JP4597927B2 (ja) * | 2006-08-30 | 2010-12-15 | 日本テレビ放送網株式会社 | 放送中継システム及びその方法 |
| KR100864826B1 (ko) * | 2006-09-29 | 2008-10-23 | 한국전자통신연구원 | 디지털 방송기반의 3차원 정지영상 서비스 방법 및 장치 |
| WO2008056622A1 (en) * | 2006-11-06 | 2008-05-15 | Panasonic Corporation | Receiver |
| KR100947737B1 (ko) * | 2008-04-17 | 2010-03-17 | 에스케이 텔레콤주식회사 | 이동통신 방송 시스템과 동기 화면 검출 방법 및 방송컨텐츠와 부가 정보 간의 동기 방법 |
| KR20100050426A (ko) * | 2008-11-04 | 2010-05-13 | 한국전자통신연구원 | 3차원 방송 서비스 송수신 방법 및 시스템 |
| KR100972792B1 (ko) * | 2008-11-04 | 2010-07-29 | 한국전자통신연구원 | 스테레오스코픽 영상을 동기화하는 장치 및 방법과 이를 이용한 스테레오스코픽 영상 제공 장치 및 방법 |
| WO2010076933A1 (ko) * | 2008-12-30 | 2010-07-08 | (주)엘지전자 | 이차원 영상과 3d 영상의 통합 서비스가 가능한 디지털 방송 수신방법, 및 이를 이용한 디지털 방송 수신장치 |
| JP5559977B2 (ja) * | 2009-03-31 | 2014-07-23 | 日本放送協会 | 連携受信システム及びプログラム |
| JP2011066871A (ja) * | 2009-08-21 | 2011-03-31 | Sony Corp | コンテンツ伝送方法及び表示装置 |
| US8786670B2 (en) * | 2010-10-14 | 2014-07-22 | Cisco Technology, Inc. | Network synchronization video for composite video streams |
- 2011
- 2011-12-02 KR KR1020110128644A patent/KR20120084252A/ko not_active Ceased
- 2012
- 2012-01-11 US US13/980,679 patent/US20130293677A1/en not_active Abandoned
- 2012-01-11 BR BR112013018340A patent/BR112013018340A2/pt not_active IP Right Cessation
- 2012-01-11 WO PCT/KR2012/000271 patent/WO2012099359A2/ko not_active Ceased
- 2012-01-11 EP EP12737016.1A patent/EP2645727A4/en not_active Withdrawn
- 2012-01-11 CN CN2012800060016A patent/CN103329551A/zh active Pending
- 2012-01-11 JP JP2013550392A patent/JP5977760B2/ja not_active Expired - Fee Related
Non-Patent Citations (1)
| Title |
|---|
| None |
Cited By (74)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014116662A (ja) * | 2012-12-06 | 2014-06-26 | Nippon Hoso Kyokai <Nhk> | 復号装置およびプログラム |
| CN103024452A (zh) * | 2012-12-21 | 2013-04-03 | 北京牡丹电子集团有限责任公司数字电视技术中心 | 一种3d立体电视节目复用方法及系统 |
| WO2014171718A1 (ko) * | 2013-04-16 | 2014-10-23 | 엘지전자 주식회사 | 방송 전송 장치, 방송 수신 장치, 방송 전송 장치의 동작 방법 및 방송 수신 장치의 동작 방법 |
| JP2016521500A (ja) * | 2013-04-16 | 2016-07-21 | エルジー エレクトロニクス インコーポレイティド | 放送伝送装置、放送受信装置、放送伝送装置の動作方法及び放送受信装置の動作方法 |
| JP7591462B2 (ja) | 2013-07-22 | 2024-11-28 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 再生装置およびコンテンツ送信装置 |
| JP2023090754A (ja) * | 2013-07-22 | 2023-06-29 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 再生方法および再生装置 |
| JP2019134489A (ja) * | 2013-07-22 | 2019-08-08 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 再生方法、コンテンツ送信方法、再生装置、およびコンテンツ送信装置 |
| CN110266985A (zh) * | 2013-07-22 | 2019-09-20 | 太阳专利托管公司 | 再现方法、内容传输方法、再现装置、内容传输装置 |
| JP2021122144A (ja) * | 2013-07-22 | 2021-08-26 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 再生装置およびコンテンツ送信装置 |
| JP7483091B2 (ja) | 2013-07-22 | 2024-05-14 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 再生方法および再生装置 |
| US11019320B2 (en) | 2013-07-22 | 2021-05-25 | Sun Patent Trust | Storage method, playback method, storage apparatus, and playback apparatus |
| JP7703750B2 (ja) | 2013-07-25 | 2025-07-07 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信方法および送信装置 |
| CN105230026B (zh) * | 2013-07-25 | 2020-03-06 | 太阳专利托管公司 | 发送方法、接收方法、发送装置及接收装置 |
| CN105230026A (zh) * | 2013-07-25 | 2016-01-06 | 松下电器(美国)知识产权公司 | 发送方法、接收方法、发送装置及接收装置 |
| CN111314762A (zh) * | 2013-07-25 | 2020-06-19 | 太阳专利托管公司 | 发送方法、接收方法、发送装置及接收装置 |
| US12273591B2 (en) | 2013-07-25 | 2025-04-08 | Sun Patent Trust | Transmission method, reception method, transmission device, and reception device |
| JP2020031437A (ja) * | 2013-07-25 | 2020-02-27 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 送信方法および受信方法 |
| JP2021057918A (ja) * | 2013-07-25 | 2021-04-08 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 送信方法および受信方法 |
| JP2024114751A (ja) * | 2013-07-25 | 2024-08-23 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信方法および送信装置 |
| JP2015027082A (ja) * | 2013-07-25 | 2015-02-05 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | 送信方法および受信方法 |
| US11711580B2 (en) | 2013-07-25 | 2023-07-25 | Sun Patent Trust | Transmission method, reception method, transmission device, and reception device |
| WO2015011915A1 (ja) * | 2013-07-25 | 2015-01-29 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信方法および受信方法ならびに送信装置および受信装置 |
| JP7280408B2 (ja) | 2013-07-25 | 2023-05-23 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信方法および受信方法 |
| JP2022089899A (ja) * | 2013-07-25 | 2022-06-16 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信方法および受信方法 |
| JP7057411B2 (ja) | 2013-07-25 | 2022-04-19 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信方法および受信方法 |
| US10356474B2 (en) | 2013-07-25 | 2019-07-16 | Sun Patent Trust | Transmission method, reception method, transmission device, and reception device |
| US11102547B2 (en) | 2013-07-25 | 2021-08-24 | Sun Patent Trust | Transmission method, reception method, transmission device, and reception device |
| US12192561B2 (en) | 2013-08-29 | 2025-01-07 | Panasonic Intellectual Property Corporation Of America | Transmitting method, receiving method, transmitting apparatus, and receiving apparatus |
| CN110139140B (zh) * | 2013-08-29 | 2021-09-14 | 松下电器(美国)知识产权公司 | 发送方法、接收方法、发送装置及接收装置 |
| CN105340289B (zh) * | 2013-08-29 | 2019-06-07 | 松下电器(美国)知识产权公司 | 发送方法、接收方法、发送装置及接收装置 |
| US11082733B2 (en) | 2013-08-29 | 2021-08-03 | Panasonic Intellectual Property Corporation Of America | Transmitting method, receiving method, transmitting apparatus, and receiving apparatus |
| JP2022051796A (ja) * | 2013-08-29 | 2022-04-01 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信方法および受信方法 |
| US11765414B2 (en) | 2013-08-29 | 2023-09-19 | Panasonic Intellectual Property Corporation Of America | Transmitting method, receiving method, transmitting apparatus, and receiving apparatus |
| CN110139140A (zh) * | 2013-08-29 | 2019-08-16 | 松下电器(美国)知识产权公司 | 发送方法、接收方法、发送装置及接收装置 |
| JP7453266B2 (ja) | 2013-08-29 | 2024-03-19 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信装置 |
| JP2024063222A (ja) * | 2013-08-29 | 2024-05-10 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信装置 |
| CN105340289A (zh) * | 2013-08-29 | 2016-02-17 | 松下电器(美国)知识产权公司 | 发送方法、接收方法、发送装置及接收装置 |
| JP7665821B2 (ja) | 2013-08-29 | 2025-04-21 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信装置 |
| JP2015050769A (ja) * | 2013-08-29 | 2015-03-16 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | 送信方法および受信方法 |
| WO2015029401A1 (ja) * | 2013-08-29 | 2015-03-05 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信方法および受信方法ならびに送信装置および受信装置 |
| CN110278474A (zh) * | 2013-08-30 | 2019-09-24 | 松下电器(美国)知识产权公司 | 接收方法、发送方法、接收装置及发送装置 |
| US12192548B2 (en) | 2013-08-30 | 2025-01-07 | Panasonic Intellectual Property Corporation Of America | Reception method, transmission method, reception device, and transmission device |
| WO2015029402A1 (ja) * | 2013-08-30 | 2015-03-05 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 受信方法、送信方法、受信装置、及び送信装置 |
| US10911805B2 (en) | 2013-08-30 | 2021-02-02 | Panasonic Intellectual Property Corporation Of America | Reception method, transmission method, reception device, and transmission device |
| US11284142B2 (en) | 2013-08-30 | 2022-03-22 | Panasonic Intellectual Property Corporation Of America | Reception method, transmission method, reception device, and transmission device |
| JP2015050768A (ja) * | 2013-08-30 | 2015-03-16 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | 受信方法、送信方法、受信装置、及び送信装置 |
| JP7410107B2 (ja) | 2013-08-30 | 2024-01-09 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 受信方法、及び、受信装置 |
| JP2019115071A (ja) * | 2013-08-30 | 2019-07-11 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 受信方法、及び、受信装置 |
| JP2022010382A (ja) * | 2013-08-30 | 2022-01-14 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 受信方法、及び、受信装置 |
| US10277931B2 (en) | 2013-08-30 | 2019-04-30 | Panasonic Intellectual Property Corporation Of America | Reception method, transmission method, reception device, and transmission device |
| JP2019169948A (ja) * | 2013-10-11 | 2019-10-03 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 送信方法、受信方法、送信装置および受信装置 |
| WO2015052908A1 (ja) * | 2013-10-11 | 2015-04-16 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信方法、受信方法、送信装置および受信装置 |
| JP2015076881A (ja) * | 2013-10-11 | 2015-04-20 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | 送信方法、受信方法、送信装置および受信装置 |
| US20150341634A1 (en) * | 2013-10-16 | 2015-11-26 | Intel Corporation | Method, apparatus and system to select audio-video data for streaming |
| CN105612757A (zh) * | 2013-10-31 | 2016-05-25 | 松下电器(美国)知识产权公司 | 包发送方法、内容再现方法、包发送系统以及终端 |
| CN105612757B (zh) * | 2013-10-31 | 2020-06-23 | 松下电器(美国)知识产权公司 | 一种发送被同步显示的广播内容和线路内容的方法及内容再现方法 |
| US10080055B2 (en) | 2013-12-23 | 2018-09-18 | Lg Electronics Inc. | Apparatuses and methods for transmitting or receiving a broadcast content via one or more networks |
| JP2017508327A (ja) * | 2013-12-23 | 2017-03-23 | エルジー エレクトロニクス インコーポレイティド | 一つ以上のネットワークで放送コンテンツを送受信する装置及び方法 |
| US10911800B2 (en) | 2014-01-13 | 2021-02-02 | Lg Electronics Inc. | Apparatuses and methods for transmitting or receiving a broadcast content via one or more networks |
| JP2017510119A (ja) * | 2014-01-13 | 2017-04-06 | エルジー エレクトロニクス インコーポレイティド | 一つ以上のネットワークを介して放送コンテンツを送受信する装置及び方法 |
| US11665385B2 (en) | 2014-01-13 | 2023-05-30 | Lg Electronics Inc. | Apparatuses and methods for transmitting or receiving a broadcast content via one or more networks |
| JP2017515351A (ja) * | 2014-03-27 | 2017-06-08 | サムスン エレクトロニクス カンパニー リミテッド | Mmt及びdashを使用するブロードキャスト及びブロードバンドハイブリッドサービス |
| US11329741B2 (en) | 2014-04-04 | 2022-05-10 | Saturn Licensing, Llc | Receiving apparatus, receiving method, transmitting apparatus, and transmitting method |
| JP2015201699A (ja) * | 2014-04-04 | 2015-11-12 | ソニー株式会社 | 受信装置、受信方法、送信装置、及び、送信方法 |
| US10291341B2 (en) | 2014-04-04 | 2019-05-14 | Sony Corporation | Receiving apparatus, receiving method, transmitting apparatus, and transmitting method |
| US10771175B2 (en) | 2014-04-04 | 2020-09-08 | Saturn Licensing Llc | Receiving apparatus, receiving method, transmitting apparatus, and transmitting method |
| WO2016129966A1 (ko) * | 2015-02-13 | 2016-08-18 | 에스케이텔레콤 주식회사 | 저지연 생방송 컨텐츠 제공을 위한 프로그램을 기록한 기록매체 및 장치 |
| CN107113474A (zh) * | 2015-02-13 | 2017-08-29 | Sk电信有限公司 | 具有记录在其上以用于提供低延迟直播内容的程序的记录介质和装置 |
| US10148725B2 (en) | 2015-02-13 | 2018-12-04 | Sk Telecom Co., Ltd. | Apparatus and computer-readable recording medium having program recorded therein for providing low-latency real-time broadcast content |
| CN107113474B (zh) * | 2015-02-13 | 2020-03-03 | Sk电信有限公司 | 具有记录在其中以用于提供低延迟实时广播内容的程序的设备和计算机可读记录介质 |
| WO2016129973A1 (ko) * | 2015-02-15 | 2016-08-18 | 엘지전자 주식회사 | 방송 신호 송신 장치, 방송 신호 수신 장치, 방송 신호 송신 방법, 및 방송 신호 수신 방법 |
| WO2016159636A1 (ko) * | 2015-03-30 | 2016-10-06 | 엘지전자 주식회사 | 방송 신호 송수신 방법 및 장치 |
| WO2016167632A1 (ko) * | 2015-04-17 | 2016-10-20 | 삼성전자 주식회사 | 방송 서비스를 위한 서비스 시그널링을 송수신하는 방법 및 장치 |
| US11317138B2 (en) | 2015-04-17 | 2022-04-26 | Samsung Electronics Co., Ltd. | Method and apparatus for transmitting or receiving service signaling for broadcasting service |
Also Published As
| Publication number | Publication date |
|---|---|
| JP5977760B2 (ja) | 2016-08-24 |
| CN103329551A (zh) | 2013-09-25 |
| WO2012099359A3 (ko) | 2012-12-06 |
| US20130293677A1 (en) | 2013-11-07 |
| JP2014509111A (ja) | 2014-04-10 |
| BR112013018340A2 (pt) | 2016-10-04 |
| EP2645727A2 (en) | 2013-10-02 |
| KR20120084252A (ko) | 2012-07-27 |
| EP2645727A4 (en) | 2015-01-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2012099359A2 (ko) | 복수의 실시간 전송 스트림을 수신하는 수신 장치와 그 송신 장치 및 멀티미디어 컨텐츠 재생 방법 | |
| WO2012077982A2 (ko) | 멀티미디어 컨텐츠를 송수신하는 송신 장치 및 수신 장치와, 그 재생 방법 | |
| US11622163B2 (en) | System and method for synchronizing metadata with audiovisual content | |
| WO2013154402A1 (en) | Receiving apparatus for receiving a plurality of signals through different paths and method for processing signals thereof | |
| WO2013154397A1 (en) | Transmitting system and receiving apparatus for providing hybrid service, and service providing method thereof | |
| WO2012128563A2 (ko) | 이종망 기반 연동형 방송콘텐츠 송수신 장치 및 방법 | |
| WO2013154350A1 (en) | Receiving apparatus for providing hybrid service, and hybrid service providing method thereof | |
| WO2013025035A2 (ko) | 송신 장치, 수신 장치 및 그 송수신 방법 | |
| WO2012011724A2 (ko) | 미디어 파일 송수신 방법 및 그를 이용한 송수신 장치 | |
| WO2011059274A2 (en) | Adaptive streaming method and apparatus | |
| WO2011013995A2 (en) | Method and apparatus for generating 3-dimensional image datastream including additional information for reproducing 3-dimensional image, and method and apparatus for receiving the 3-dimensional image datastream | |
| WO2013025032A1 (ko) | 수신 장치 및 그 수신 방법 | |
| WO2012177041A2 (ko) | 미디어 컨텐트 송수신 방법 및 그를 이용한 송수신 장치 | |
| WO2013019042A1 (ko) | 실시간으로 전송되는 기준 영상과 별도로 전송되는 부가 영상 및 콘텐츠를 연동하여 3d 서비스를 제공하기 위한 전송 장치 및 방법, 및 수신 장치 및 방법 | |
| WO2012060581A2 (ko) | 미디어 콘텐트 송수신 방법 및 그를 이용한 송수신 장치 | |
| WO2016089093A1 (ko) | 방송 신호 송수신 방법 및 장치 | |
| WO2012121572A2 (ko) | 프로그램 연동형 스테레오스코픽 방송 서비스를 제공하기 위한 송신 장치 및 방법, 및 수신 장치 및 방법 | |
| WO2012077981A2 (ko) | 컨텐츠를 구성하는 데이터를 송신하는 송신 장치와 그 데이터를 수신하여 처리하는 수신 장치 및 그 방법 | |
| WO2012144857A2 (en) | Receiver for receiving and displaying a plurality of streams through separate routes, method for processing the plurality of streams and transmitting method thereof | |
| WO2012023787A2 (ko) | 디지털 수신기 및 디지털 수신기에서의 컨텐트 처리 방법 | |
| WO2012144795A2 (en) | Apparatus for outputting broadcast recorded by schedule recording and control method thereof | |
| Veenhuizen et al. | Frame accurate media synchronization of heterogeneous media sources in an HBB context | |
| WO2017043943A1 (ko) | 방송 신호 송신 장치, 방송 신호 수신 장치, 방송 신호 송신 방법, 및 방송 신호 수신 방법 | |
| WO2013055032A1 (ko) | 융합형 3dtv에서 컨텐츠 스트림에 접근하는 컨텐츠 제공 장치 및 방법, 그리고 컨텐츠 재생 장치 및 방법 | |
| WO2013077629A1 (ko) | 3dtv 방송을 위한 송수신 장치 및 그 제어 방법 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12737016 Country of ref document: EP Kind code of ref document: A2 |
|
| ENP | Entry into the national phase |
Ref document number: 2013550392 Country of ref document: JP Kind code of ref document: A |
|
| REEP | Request for entry into the european phase |
Ref document number: 2012737016 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2012737016 Country of ref document: EP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 13980679 Country of ref document: US |
|
| REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112013018340 Country of ref document: BR |
|
| ENP | Entry into the national phase |
Ref document number: 112013018340 Country of ref document: BR Kind code of ref document: A2 Effective date: 20130718 |