
HK1159388A - A method and system for video processing - Google Patents

A method and system for video processing

Info

Publication number
HK1159388A
Authority
HK
Hong Kong
Prior art keywords
video
dimensional
optical viewing
viewing device
display
Prior art date
Application number
HK11113873.2A
Other languages
Chinese (zh)
Inventor
Ilya Klebanov
Xuemin Chen
Samir Hulyalkar
Marcus Kellerman
Original Assignee
Broadcom Corporation
Application filed by Broadcom Corporation
Publication of HK1159388A


Description

Video processing method and system
Technical Field
The present invention relates to video processing, and more particularly, to a method and system for synchronizing three-dimensional (3D) glasses with a three-dimensional (3D) video display.
Background
Display devices, such as televisions (TVs), may be used to output or play audiovisual or multimedia streams, which may include television broadcasts and/or local audio/video (A/V) material from one or more available user devices, such as videocassette recorders (VCRs) and/or digital video disc (DVD) players. Television broadcasts and/or audiovisual or multimedia material may be input directly to a television set or may be delivered indirectly through one or more dedicated set-top boxes that may provide any required processing operations. Types of connectors used to input data to a television include, but are not limited to, F-connectors, S-video connectors, composite and/or component video connectors, and/or, more recently, High Definition Multimedia Interface (HDMI) connectors.
Television transmissions are typically transmitted by a television headend over a broadcast channel via an RF carrier or a cable connection. The television headend may be a terrestrial TV headend, a cable TV (CATV) headend, a satellite TV headend, and/or a broadband TV headend. A terrestrial television headend may utilize, for example, a set of terrestrial broadcast channels, which in the United States may include, for example, channels 2-69. Cable television (CATV) broadcasts may utilize a greater number of broadcast channels. Television distribution involves the transmission of video and/or audio information that may be encoded into a broadcast channel by one of a number of available modulation methods. Television transmissions may utilize analog and/or digital modulation formats. In analog television systems, picture and sound information is encoded into and transmitted over an analog signal, where the video/audio information may be conveyed over an amplitude- and/or frequency-modulated television signal based on an analog television encoding standard. Analog television broadcasters may, for example, encode their signals using the NTSC, PAL, and/or SECAM analog encoding standards and then modulate these signals onto, for example, a VHF or UHF RF carrier.
In a digital television (DTV) system, television transmissions may be communicated by terrestrial, cable, and/or satellite headends via discrete (digital) signals using one of the available digital modulation methods, which may include, for example, QAM, VSB, QPSK, and/or OFDM. Because digital signals typically require less bandwidth than analog signals when conveying the same information, DTV systems allow broadcasters to provide more digital channels in the same spectrum available to analog television systems. Additionally, the use of digital television signals may enable broadcasters to provide High Definition Television (HDTV) broadcasts and/or other non-television-related services over digital systems. Available digital television systems include, for example, ATSC, DVB, DMB-T/H, and/or ISDB-based systems. Video and/or audio information may be encoded into a digital television signal using various video and/or audio encoding and/or compression algorithms, which may include, for example, MPEG-1/2, MPEG-4 AVC, MP3, AC-3, AAC, and/or HE-AAC.
Most television transmissions (and similar multimedia material) now utilize video format standards that allow video images to be communicated in a bitstream. These video standards may utilize various interpolation and/or rate-conversion functions to present content comprising still and/or moving images on a display device. For example, a de-interlacing function may be utilized to convert interlaced content to another format suitable for display device types that are not capable of handling interlaced content. Television transmissions, and similar video material, may be interlaced or progressive. Interlaced video comprises fields, each of which may be captured at a distinct time interval. A frame may comprise a pair of fields, for example, a top field and a bottom field. The pictures forming the video may comprise a plurality of lines arranged in sequence. In one time interval, the even lines of the video content are captured; during a subsequent time interval, the odd lines are captured. The even lines may be collectively referred to as the top field and the odd lines as the bottom field, or vice versa. For a progressive video frame, all the lines of the frame are captured or played in sequence within a single time interval. Interlaced video may include fields converted from progressive frames. For example, a progressive frame can be converted into two interlaced fields by organizing the even lines into one field and the odd lines into another field.
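The even/odd line split described above can be sketched in a few lines of code. The following is an illustrative sketch, not part of the patent specification; the function names and the list-of-scan-lines frame model are assumptions made for clarity.

```python
def split_progressive_frame(frame):
    """Split a progressive frame (modeled as a list of scan lines) into
    two interlaced fields: even-indexed lines form the top field and
    odd-indexed lines the bottom field (one common convention)."""
    top_field = frame[0::2]     # lines 0, 2, 4, ...
    bottom_field = frame[1::2]  # lines 1, 3, 5, ...
    return top_field, bottom_field

def weave_fields(top_field, bottom_field):
    """Recombine two fields into a progressive frame (a simple 'weave'
    de-interlace, valid when both fields come from the same frame)."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.extend([top_line, bottom_line])
    return frame
```

With an even number of lines, weaving the two fields back together recovers the original progressive frame exactly.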
Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with some aspects of the present invention as set forth in the remainder of the present application with reference to the drawings.
Disclosure of Invention
A method and/or system for synchronizing three-dimensional (3D) glasses with a three-dimensional video display, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
According to an aspect of the present invention, there is provided a video processing method including:
executed by one or more processors and/or circuits in an optical viewing device:
determining an operating mode for playing the three-dimensional video content;
configuring the optical viewing device based on the determined operating mode to synchronize with playback of the three-dimensional video content.
Preferably, the method further comprises synchronizing the optical viewing device prior to starting the playing of the three-dimensional video content and/or dynamically synchronizing the optical viewing device during the playing of the three-dimensional video content.
Preferably, the method further comprises communicating with a video processing device for processing and/or displaying the three-dimensional video content to facilitate the configuring of the optical viewing device.
Preferably, the method further comprises communicating with the video processing device over one or more wireless interfaces.
Preferably, the one or more wireless interfaces comprise a Wireless Personal Area Network (WPAN) interface and/or a Wireless Local Area Network (WLAN) interface.
Preferably, the operating mode comprises a polarization mode and/or a shutter mode.
Preferably, the three-dimensional video content comprises frames or fields of stereoscopic left and right view video sequences.
Preferably, the method further comprises, when the optical viewing device is operating in a polarization mode, synchronizing the polarization of the left eye viewed through the optical viewing device with the polarization of the stereoscopic left view video sequence and/or synchronizing the polarization of the right eye viewed through the optical viewing device with the polarization of the stereoscopic right view video sequence.
Preferably, the method further comprises, when the optical viewing device is operating in a shutter mode, synchronizing the shutter of the left eye viewed through the optical viewing device with the display of frames and/or fields of the stereoscopic left view video sequence and/or synchronizing the shutter of the right eye viewed through the optical viewing device with the display of frames and/or fields of the stereoscopic right view video sequence.
According to another aspect of the present invention, there is provided a video processing method including:
performed by one or more processors and/or circuits in a video processing system:
generating a three-dimensional (3D) output video stream for display based on a plurality of view sequences extracted from a three-dimensional input video stream; and
communicating with an optical viewing device for viewing the three-dimensional output video stream before and/or during playback of the three-dimensional output video stream, so as to configure the optical viewing device for said viewing and/or so as to synchronize said viewing by the optical viewing device.
According to still another aspect of the present invention, there is provided a video processing system including:
one or more circuits and/or processors in an optical viewing device to determine an operating mode for playing three-dimensional video content; and
the one or more circuits and/or processors are operable to configure the optical viewing device to synchronize with the playback of the three-dimensional video content based on the determined operational mode.
Preferably, the one or more circuits and/or processors are operable to synchronize the optical viewing device prior to starting the playback of the three-dimensional video content and/or to dynamically synchronize the optical viewing device during the playback of the three-dimensional video content.
Preferably, the one or more circuits and/or processors are operable to communicate with a video processing device for processing and/or displaying the three-dimensional video content to facilitate the configuring of the optical viewing device.
Preferably, the one or more circuits and/or processors are operable to communicate with the video processing device over one or more wireless interfaces.
Preferably, the one or more wireless interfaces comprise a Wireless Personal Area Network (WPAN) interface and/or a Wireless Local Area Network (WLAN) interface.
Preferably, the operating mode comprises a polarization mode and/or a shutter mode.
Preferably, the three-dimensional video content comprises frames or fields of stereoscopic left and right view video sequences.
Preferably, the one or more circuits and/or processors are operable, when the optical viewing device is operating in a polarization mode, to synchronize the polarization of a left eye viewed through the optical viewing device with the polarization of the stereoscopic left-view video sequence and/or synchronize the polarization of a right eye viewed through the optical viewing device with the polarization of the stereoscopic right-view video sequence.
Preferably, the one or more circuits and/or processors are operable, when the optical viewing device is operating in a shutter mode, to synchronize the shutter of the left eye viewed through the optical viewing device with the display of frames and/or fields of the stereoscopic left view video sequence and/or to synchronize the shutter of the right eye viewed through the optical viewing device with the display of frames and/or fields of the stereoscopic right view video sequence.
According to still another aspect of the present invention, there is provided a video processing system including:
one or more circuits and/or processors to generate a three-dimensional (3D) output video stream for display based on a plurality of view sequences extracted from a three-dimensional input video stream; and
the one or more circuits and/or processors are operable to communicate with an optical viewing device for viewing the three-dimensional output video stream prior to and/or during playback of the three-dimensional output video stream to configure the optical viewing device for the viewing and/or to synchronize the viewing by the optical viewing device.
Various advantages, aspects and novel features of the invention, as well as details of an illustrated embodiment thereof, will be more fully described in the following description and drawings.
Drawings
Fig. 1 is a schematic block diagram of a video system supporting TV broadcasting and/or local multimedia material according to an embodiment of the present invention;
Fig. 2A is a schematic block diagram of a video system for providing three-dimensional video communication according to an embodiment of the present invention;
Fig. 2B is a schematic block diagram of a video processing system for generating a video stream containing three-dimensional video according to an embodiment of the present invention;
Fig. 2C is a schematic block diagram of a video processing system for processing and displaying video input including three-dimensional video and for synchronizing three-dimensional video playback operations with three-dimensional glasses according to an embodiment of the present invention;
Fig. 3 is a flow chart of exemplary steps for synchronizing three-dimensional glasses with a three-dimensional video display, according to an embodiment of the present invention.
Detailed Description
Some embodiments of the present invention provide a method and system for synchronizing three-dimensional glasses with a three-dimensional video display. In various embodiments of the present invention, an optical viewing device may be used to determine an operating mode for use in viewing the playback of three-dimensional video content, and to configure and/or synchronize its operation with the playback of the three-dimensional video content based on the determined operating mode. Exemplary operating modes may include a polarization mode and/or a shutter mode. Synchronization of the optical viewing device may be performed during initialization of the optical viewing device, prior to starting to play the three-dimensional video content, and/or dynamically during play of the three-dimensional video content. The optical viewing device may communicate with a video processing device for processing and/or displaying three-dimensional video content to facilitate configuration and/or synchronization of the optical viewing device. The optical viewing device may communicate with the video processing device through one or more wireless interfaces. Exemplary wireless interfaces may include a Wireless Personal Area Network (WPAN) interface and/or a Wireless Local Area Network (WLAN) interface.
The three-dimensional video content may comprise, for example, frames or fields of stereoscopic left and right view video sequences. Thus, when the optical viewing device is operating in a polarization mode, the polarization of the left eye viewed through the optical viewing device is synchronized with the polarization of the stereoscopic left view video sequence and/or the polarization of the right eye viewed through the optical viewing device is synchronized with the polarization of the stereoscopic right view video sequence. When the optical viewing device is operating in a shutter mode, a shutter for a left eye viewed through the optical viewing device is synchronized with the display of frames and/or fields of the stereoscopic left view video sequence and/or a shutter for a right eye viewed through the optical viewing device is synchronized with the display of frames and/or fields of the stereoscopic right view video sequence.
Fig. 1 is a schematic block diagram of a video system supporting television broadcasting and/or local multimedia material according to an embodiment of the present invention. Referring to Fig. 1, a media system 100 is shown. The media system 100 may include a display device 102, a terrestrial television headend 104, a television tower 106, a television antenna 108, a cable television (CATV) headend 110, a cable television (CATV) distribution network 112, a satellite television headend 114, a satellite television receiver 116, a broadband television headend 118, a broadband network 120, a set-top box 122, and an audio-visual (AV) playback device 124.
The display device 102 may comprise suitable logic, circuitry, interfaces and/or code that may enable playing a multimedia stream comprising audio-visual (AV) data. The display device 102 may include, for example, a television, a display, and/or other display and/or audio playback device, and/or components for playing back video streams and/or corresponding audio data received directly by the display device 102 and/or indirectly through an intermediary device, such as a set-top box 122, and/or from a local media recording/playback device and/or storage resource, such as an AV playback device 124.
The terrestrial television front end 104 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to radio broadcast television signals via one or more television towers 106. The terrestrial television front end 104 may broadcast analog and/or digitally encoded terrestrial television signals. The television antenna 108 may comprise suitable logic, circuitry, interfaces and/or code that may enable receiving television signals transmitted by the terrestrial television front end 104 via the television tower 106. The CATV headend 110 may comprise suitable logic, circuitry, interfaces and/or code that may enable communication of cable television signals. The CATV headend 110 may be used to broadcast cable television signals in analog and/or digital format. The CATV distribution network 112 may include a suitable distribution system such that communications from the CATV headend 110 to a plurality of cable television receivers including, for example, the display device 102, may occur. For example, the CATV distribution network 112 may include a fiber and/or coaxial cable network to enable connection of one or more CATV head-ends 110 with the display devices 102.
The satellite television front end 114 may comprise suitable logic, circuitry, interfaces and/or code that may enable downlink communication of satellite television signals to terrestrial receivers such as the display device 102. The satellite television front end 114 may comprise, for example, one of a plurality of orbiting satellite nodes in a satellite television system. The satellite television receiver 116 may comprise suitable logic, circuitry, interfaces and/or code that may enable receiving downlink satellite television signals transmitted by the satellite television front end 114. For example, the satellite receiver 116 may include a dedicated parabolic antenna for receiving satellite television signals from a satellite television front end and reflecting and/or focusing the received satellite signals to a focal point, wherein one or more Low Noise Amplifiers (LNAs) may be used to down-convert the received signals to corresponding intermediate frequencies for further processing, enabling audio/video data to be extracted, for example, by the set-top box 122. In addition, since most satellite television downlink material is securely encoded and/or scrambled, the satellite television receiver 116 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to decode, descramble and/or decrypt received satellite television material.
The broadband television front end 118 may comprise suitable logic, circuitry, interfaces and/or code that may enable multimedia/television broadcasting over the broadband network 120. Broadband network 120 may comprise an interconnected system of networks for exchanging information and/or data between a plurality of nodes based on one or more networking standards, which may include, for example, TCP/IP. Broadband network 120 may comprise a plurality of broadband-capable sub-networks, which may include, for example, satellite networks, cable networks, DVB networks, the Internet, and/or similar local or wide area networks that together deliver data containing multimedia content to a plurality of end users. Connectivity over broadband network 120 may be provided via copper-wire and/or fiber-optic wired connections, wireless interfaces, and/or interfaces based on other standards. The broadband television front end 118 and the broadband network 120 may correspond to, for example, an Internet Protocol Television (IPTV) system.
The set-top box 122 may comprise suitable logic, circuitry, interfaces and/or code that may enable processing of television and/or multimedia streams/signals transmitted by one or more television front ends external to the display device 102. The AV playback device 124 may comprise suitable logic, circuitry, interfaces and/or code that may enable provision of video/audio material to the display device 102. For example, the AV playback device 124 may include a Digital Video Disc (DVD) player, a Blu-ray player, a Digital Video Recorder (DVR), a video game player, a surveillance system (surveillance), and/or a Personal Computer (PC) capture/playback card. Although the set-top box 122 and the AV playback device 124 are shown as separate entities, at least a portion of the functions performed by the set-top box 122 and/or the AV playback device 124 may be integrated directly into the display device 102.
In operation, the display device 102 may be used to play media streams received from one of the available broadcast headends and/or from one or more local resources. The display device 102 may receive, via the television antenna 108, terrestrial television broadcasts transmitted over the air by the terrestrial television front end 104 through the television tower 106. The display device 102 may also receive cable television broadcasts, which are transmitted by the CATV headend 110 over the CATV distribution network 112; satellite television broadcasts, which are transmitted by the satellite headend 114 and received by the satellite television receiver 116; and/or Internet media broadcasts, delivered by the broadband television headend 118 over the broadband network 120.
Television headends may use various encoding methods in television broadcasting. Traditionally, television broadcasts have used analog modulation formats including, for example, NTSC, PAL, and/or SECAM. Audio encoding may use separate modulation methods including, for example, BTSC, NICAM, mono FM, and/or AM. However, there is now a steady push towards digital television (DTV) based broadcasts. For example, the terrestrial television front end 104 may use ATSC and/or DVB based standards to facilitate DTV terrestrial broadcasts. Similarly, the CATV headend 110 and/or the satellite headend 114 may also use suitable encoding standards to facilitate cable and/or satellite based broadcasting.
The display device 102 may be used to directly process multimedia/television broadcasts to play corresponding video and/or audio data. Alternatively, the processing operations and/or functions may be performed by an external device, such as set top box 122, which may be used to extract video and/or audio data from a received media stream, and then the extracted audio/video data may be played through display device 102.
In one aspect of the invention, the media system 100 may be used to support three-dimensional (3D) video. Recently, there has been a strong push to develop three-dimensional video as a successor to two-dimensional video. Three-dimensional video images may be captured, generated (at capture or play time), and/or displayed using various methods. One of the most commonly used methods for implementing three-dimensional video is stereoscopic three-dimensional video. In stereoscopic three-dimensional video applications, the three-dimensional effect is generated by presenting multiple views, typically two: a left view and a right view, corresponding to the viewer's left and right eyes, to produce depth in the displayed image. Accordingly, left view and right view video sequences may be captured and/or processed to generate three-dimensional images. The left and right view data may then be transmitted as separate streams, or they may be merged into a single transport stream that is separated into the different view sequences only by the end user's receiving/display device. Three-dimensional video may be communicated via live television broadcast. In this regard, one or more television headends may be used to transmit three-dimensional video content to the display device 102 directly and/or through the set-top box 122. Communication of stereoscopic three-dimensional video may also be achieved by using a multimedia storage device, such as a DVD or Blu-ray disc, which may be used to store three-dimensional video data that may then be played by a suitable player, such as the AV playback device 124. In the communication of stereoscopic three-dimensional video, the view sequences may be compressed and/or encoded into a transport stream using various compression/encoding standards.
For example, the separate left and right view video sequences may be compressed based on MPEG-2 MVP, H.264, and/or MPEG-4 Advanced Video Coding (AVC) or MPEG-4 Multiview Video Coding (MVC).
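The idea of merging left- and right-view sequences into one stream and separating them again at the receiving device can be sketched as follows. This is an illustrative sketch, not part of the patent; frames are modeled as plain values and the view tags "L"/"R" are assumptions, standing in for real transport-stream multiplexing.

```python
def merge_views(left_views, right_views):
    """Merge stereoscopic left- and right-view frame sequences into a
    single stream by tagging each frame with its view, as a stand-in
    for multiplexing both views into one transport stream."""
    merged = []
    for left, right in zip(left_views, right_views):
        merged.append(("L", left))
        merged.append(("R", right))
    return merged

def separate_views(merged):
    """Split a merged stream back into left- and right-view sequences,
    as the end user's receiving/display device would before playback."""
    left = [frame for view, frame in merged if view == "L"]
    right = [frame for view, frame in merged if view == "R"]
    return left, right
```

Separating a merged stream recovers the original two view sequences, mirroring the round trip from transmitter to receiver described above.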
In various embodiments of the present invention, during the playing of three-dimensional video through the display device 102, three-dimensional glasses may be used for three-dimensional viewing, and the operation of the three-dimensional glasses may be synchronized with the operation of the display device 102 to facilitate three-dimensional video viewing. In some instances, the display device 102 may play three-dimensional video without using any additional devices. For example, the display device 102 may implement an autostereoscopic three-dimensional display using one or more techniques such as lenticular screens and/or parallax barriers. In other instances, however, the display device 102 may not be capable of independently displaying video images that generate a three-dimensional viewing experience. In that case, a dedicated optical device, such as glasses with three-dimensional capability, may be used together with the display device 102 to provide the desired three-dimensional viewing experience. Glasses with three-dimensional capability may employ various three-dimensional viewing methods. Exemplary techniques used in three-dimensional eyewear include polarization-based and/or shutter-based operation.
In polarization-based operation, the lens on each side may have a different polarization, so that each eye receives a differently polarized image; when the two images are combined in the brain, a three-dimensional effect is perceived. For example, in stereoscopic three-dimensional video playback, differently polarized right and left view images may be presented on the display device 102. To facilitate three-dimensional viewing, polarized three-dimensional glasses may be used in which the polarizations of the right and left lenses match those of the right and left view images, respectively. Therefore, the right eye sees only the right view image and the left eye sees only the left view image, and a three-dimensional perception is generated when the right and left eye images are combined in the brain.
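The polarization-matching rule above can be sketched as a small idealized model. This is not part of the patent; the polarization labels and function names are assumptions (real systems may use linear or circular polarization), and partial transmission through a mismatched lens is ignored.

```python
def configure_polarized_glasses(left_view_pol, right_view_pol):
    """Configure the lenses so that each lens's polarization matches the
    polarization of the corresponding displayed view sequence."""
    return {"left_lens": left_view_pol, "right_lens": right_view_pol}

def lens_passes_image(lens_polarization, image_polarization):
    """In this idealized model, a polarized lens passes an image only
    when the lens and image polarizations match exactly."""
    return lens_polarization == image_polarization
```

Configured this way, each eye passes only its own view image and blocks the other, which is what produces the depth perception described above.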
In shutter-based operation, the lens on each side may alternately open and close, so that each eye receives a different image in alternation; when the two images are combined in the brain, a three-dimensional effect is perceived. For example, in stereoscopic three-dimensional video playback, the right and left view images presented by the display device 102 alternate. To facilitate three-dimensional viewing, shutter-based three-dimensional glasses may be used in which the right and left lenses open and close in synchronization with the rate at which the right and left view images are presented. Therefore, the right eye sees only the right view image and the left eye sees only the left view image, and a three-dimensional perception is generated when the right and left eye images are combined in the brain.
In one aspect of the invention, the operation of three-dimensional glasses can be actively synchronized to provide three-dimensional viewing. Existing three-dimensional glasses employ passive polarization and/or shutter technology; that is, the glasses operate in a preconfigured and/or non-adjustable manner. To enhance the use of three-dimensional glasses, however, the configuration and/or operation of the glasses may be changed and/or adjusted before and/or during video playback. For example, when the three-dimensional glasses are operating in a polarization mode, the polarization parameters and/or operation of the glasses may be configured such that the polarization directions of the glasses match those of the right and left view sequences displayed by the display device 102. Likewise, when the three-dimensional glasses are operating in a shutter mode, the shutter operation of the glasses may be synchronized with the presentation rate of each view (e.g., the right and left view presentation) displayed by the display device 102. Three-dimensional glasses synchronization may be performed based on information transmitted by the display device 102. The synchronization may be performed before the three-dimensional video playback operation begins and/or dynamically during the three-dimensional video playback operation.
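The active synchronization described above, where the display device transmits configuration information that the glasses apply before or during playback, might look like the following sketch. This is not part of the patent and does not describe any real protocol; the message fields and class names are hypothetical, standing in for whatever the wireless (e.g., WPAN/WLAN) link actually carries.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SyncMessage:
    """Hypothetical synchronization message from the display device;
    all field names are illustrative, not from a real protocol."""
    mode: str                                # "polarization" or "shutter"
    left_polarization: Optional[str] = None  # used in polarization mode
    right_polarization: Optional[str] = None # used in polarization mode
    frame_rate_hz: Optional[float] = None    # used in shutter mode

class ActiveGlasses:
    """Sketch of glasses that reconfigure themselves from a SyncMessage,
    either before playback starts or dynamically during playback."""
    def __init__(self):
        self.mode = None
        self.config = {}

    def apply(self, msg):
        """Adopt the operating mode and parameters the display sent."""
        self.mode = msg.mode
        if msg.mode == "polarization":
            self.config = {"left": msg.left_polarization,
                           "right": msg.right_polarization}
        elif msg.mode == "shutter":
            self.config = {"shutter_rate_hz": msg.frame_rate_hz}
```

Because `apply` can be called repeatedly, the same mechanism covers both one-time configuration at initialization and dynamic re-synchronization mid-playback.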
Fig. 2A is a schematic block diagram of a video system for providing three-dimensional video communication according to one embodiment of the present invention. Referring to Fig. 2A, a three-dimensional video transmitting unit (3D-VTU) 202 and a three-dimensional video receiving unit (3D-VRU) 204 are shown.
The 3D-VTU 202 may comprise suitable logic, circuitry, interfaces and/or code that may enable generation of a video stream comprising encoded/compressed three-dimensional video data that may be transmitted to, for example, the 3D-VRU 204 for display and/or playback. The three-dimensional video generated by the 3D-VTU 202 may be transmitted by one or more television headends via television distribution. The three-dimensional video generated by the 3D-VTU 202 may also be stored in a multimedia storage device such as a DVD or Blu-ray disc.
The 3D-VRU 204 may comprise suitable logic, circuitry, interfaces and/or code that may enable receiving and/or processing a video stream containing three-dimensional video data for playback. The 3D-VRU 204 may be used, for example, to receive and/or process transport streams containing three-dimensional video data, which may be transmitted directly by, for example, the 3D-VTU 202 via television distribution. The 3D-VRU 204 may also be used to receive and/or process video streams read from a multimedia storage device, which may be played directly through the 3D-VRU 204 and/or through a suitable local playback device. In this regard, the operations of the 3D-VRU 204 may be performed by, for example, the display device 102, the set-top box 122, and/or the AV playback device 124 shown in Fig. 1. The received video stream may comprise encoded/compressed three-dimensional video data. Thus, the 3D-VRU 204 may be used to process the received video stream to extract the various video content in the transport stream, and may be used to decode and/or process the extracted video streams and/or content to facilitate display operations.
In operation, the 3D-VTU 202 may be used to generate a video stream containing three-dimensional video data. The 3D-VTU 202 may compress and/or encode three-dimensional video data, for example, as stereoscopic three-dimensional video that includes left and right view sequences. The 3D-VRU 204 may be configured to receive and process a video stream to facilitate playing of video content contained in the video stream via a suitable display device. In this regard, the 3D-VRU 204 may be used, for example, to demultiplex a received transport stream into an encoded three-dimensional video stream and/or an additional video stream. The 3D-VRU 204 may decode and/or decompress three-dimensional video data in the received video stream for display.
In various embodiments of the invention, three-dimensional glasses may be used to enable three-dimensional viewing while playing three-dimensional video received from the 3D-VRU 204. In addition, the operation of the three-dimensional glasses may be synchronized with the video playing process of the 3D-VRU 204 in order to achieve the desired three-dimensional video viewing, as described with reference to fig. 1. In this regard, for example, in the polarization mode, the three-dimensional glasses may be synchronized with the polarization of the right and left view sequences of the stereoscopic three-dimensional video content, and/or in the shutter mode, the three-dimensional glasses may be synchronized with the display frequency at which the right and left views are displayed.
Fig. 2B is a schematic block diagram of a video processing system for generating a video stream containing three-dimensional video according to an embodiment of the present invention. Referring to fig. 2B, a video processing system 220, a three-dimensional video source 222, a base view encoder 224, an enhanced view encoder 226, and a transport multiplexer 228 are shown.
The video processing system 220 may comprise suitable logic, circuitry, interfaces and/or code that may enable capturing, generating and/or processing three-dimensional video data and for generating a transport stream containing three-dimensional video. The video processing system 220 may include, for example, a three-dimensional video source 222, a base view encoder 224, an enhancement view encoder 226, and/or a transport multiplexer 228. The video processing system 220 may be integrated into the 3D-VTU 202 to facilitate the generation of video and/or transport streams containing three-dimensional video data.
The three-dimensional video source 222 may comprise suitable logic, circuitry, interfaces and/or code that may enable capturing and/or generating source three-dimensional video content. The three-dimensional video source 222 may be used to generate stereoscopic three-dimensional video containing left-view and right-view video data from captured source three-dimensional video content for three-dimensional video display/playback. The left view video and the right view video may be transmitted to a base view encoder 224 and an enhanced view encoder 226, respectively, for video compression.
The base view encoder 224 may comprise suitable logic, circuitry, interfaces and/or code that may enable encoding of left view video from the three-dimensional video source 222, for example, on a frame basis. The base view encoder 224 may be used to form compressed and/or encoded video content of left view video from the three-dimensional video source 222 using various video encoding and/or compression algorithms, such as MPEG-2, MPEG-4, AVC, VC1, VP6, and/or other video formats, among others. In addition, the base view encoder 224 may be used to transmit information for enhancement view encoding, such as scene information from base view encoding, to the enhancement view encoder 226.
The enhanced view encoder 226 may comprise suitable logic, circuitry, interfaces and/or code that may enable encoding of right view video from the three-dimensional video source 222, for example, on a frame basis. The enhanced view encoder 226 may be used to form compressed and/or encoded video content of a right view video from the three-dimensional video source 222 utilizing various video encoding and/or compression algorithms, such as MPEG-2, MPEG-4, AVC, VC1, VP6, and/or other video formats, among others. Although only one enhanced view encoder 226 is shown in fig. 2B, the present invention is not so limited. Accordingly, any number of enhanced view encoders may be used to process the left view video and the right view video generated by the three-dimensional video source 222 without departing from the spirit and scope of the present invention.
Transport multiplexer 228 may comprise suitable logic, circuitry, interfaces and/or code that may enable combining multiple video sequences into one composite video stream. The composite video stream may comprise a left (base) view video sequence, a right (enhanced) view video sequence and a plurality of additional video streams, which may comprise, for example, advertisement streams.
In operation, three-dimensional video source 222 may be used to capture and/or generate source three-dimensional video content to produce, for example, stereoscopic three-dimensional video data for video compression, which may include left view video and right view video. The left view video may be encoded by a base view encoder 224 to produce a left (base) view video sequence. The right view video may be encoded by an enhanced view encoder 226 to produce a right (enhanced) view video sequence. The base view encoder 224 may be used to provide information for enhancement view encoding, such as scene information, to the enhanced view encoder 226 to generate, for example, depth data. Transport multiplexer 228 may be used to merge the left (base) view video sequence and the right (enhanced) view video sequence to generate one composite video stream. In addition, one or more additional video streams may be composited into the composite video stream by transport multiplexer 228. The resulting video stream may then be transmitted, for example, to the 3D-VRU 204, as described with reference to fig. 2A.
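The multiplexing step described above can be sketched as a simple round-robin packet interleaver; the stream identifiers below are illustrative stand-ins for real MPEG-2 transport-stream PIDs, not part of the disclosure:

```python
from itertools import zip_longest

def multiplex(streams):
    """Round-robin interleave of several elementary streams into one
    composite stream; every packet is tagged with its stream ID so that
    the receiver can demultiplex it later."""
    out = []
    for packets in zip_longest(*streams.values()):
        for stream_id, pkt in zip(streams.keys(), packets):
            if pkt is not None:  # shorter streams run out first
                out.append((stream_id, pkt))
    return out

def demultiplex(composite, stream_id):
    """Recover one elementary stream from the composite stream."""
    return [pkt for sid, pkt in composite if sid == stream_id]

ts = multiplex({"base": ["L0", "L1"], "enhanced": ["R0", "R1"]})
# ts == [("base", "L0"), ("enhanced", "R0"), ("base", "L1"), ("enhanced", "R1")]
```

Additional streams, such as the advertisement streams mentioned above, would simply be further entries in the dictionary passed to `multiplex`.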
In various embodiments of the present invention, three-dimensional video content generated, captured, and/or processed by the video processing system 220 may be viewed using three-dimensional enabled glasses. In this regard, three-dimensional viewing may be performed with three-dimensional glasses while playing three-dimensional video received through, for example, the 3D-VRU 204. Three-dimensional glasses can provide three-dimensional viewing by having the left and right eyes perceive separate left view and right view video sequences, respectively. Therefore, a three-dimensional effect can be generated by merging the left and right images in the brain. In one aspect of the invention, the operation of the three-dimensional glasses may be synchronized with the video playback process of the 3D-VRU 204 based on information communicated through, for example, the 3D-VRU 204 to facilitate desired three-dimensional video viewing.
Fig. 2C is a schematic block diagram of a video processing system for processing and displaying a video input including three-dimensional video and for synchronizing three-dimensional video playback operations with three-dimensional glasses according to one embodiment of the present invention. Referring to fig. 2C, there is shown video processing system 240, host processor 242, system memory 244, video decoder 246, store and play module 248, video processor 250, viewing controller 252, communication module 254, antenna subsystem 256, display conversion module 258, display 260, and three-dimensional glasses 262.
The video processing system 240 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to receive and process three-dimensional video data in a compressed format and to provide reconstructed output video for display. Video processing system 240 may include, for example, a host processor 242, a system memory 244, a video decoder 246, a store and play module 248, a video processor 250, a viewing controller 252, a communication module 254, and/or a display conversion module 258. For example, the video processing system 240 may be integrated into the 3D-VRU 204 to facilitate the reception and/or processing of a transport stream containing three-dimensional video content, which is transmitted by the 3D-VTU 202. The video processing system 240 may be used to handle interlaced video fields and/or progressive video frames. In this regard, the video processing system 240 may be used to decompress and/or upconvert interlaced video and/or progressive video. A video field, such as an interlaced field and/or a progressive video frame, may be referred to as a field, a video field, a frame, or a video frame. In one aspect of the invention, the video processing system 240 may be used to interface with an optical viewing device, such as the three-dimensional glasses 262, to synchronize the operation of the three-dimensional glasses 262 during three-dimensional video playback.
The host processor 242 may comprise suitable logic, circuitry, interfaces and/or code that may enable processing data and/or controlling the operation of the video processing system 240. In this regard, the host processor 242 may be used to configure and/or control the operation of various other portions and/or subsystems of the video processing system 240 by providing control signals to them. Host processor 242 may also control data transfers within the video processing system 240, for example, during video processing. Host processor 242 may execute applications, programs, and/or code stored in system memory 244 to, for example, perform various video processing operations such as decompression, motion compensation, interpolation, or other processing of three-dimensional video data.

The system memory 244 may comprise suitable logic, circuitry, interfaces and/or code that may enable storage of information including parameters and/or code that may affect the operation of the video processing system 240. The parameters may include configuration data and the code may include operational code, such as software and/or firmware, but the information is not so limited. Additionally, the system memory 244 may be used to store three-dimensional video data, including, for example, left and right views of stereoscopic image data.
The video decoder 246 may comprise suitable logic, circuitry, interfaces and/or code that may enable processing of encoded video data. In this regard, the video decoder 246 may be used to demultiplex and/or parse a received transport stream to extract stream entries and/or sequences therefrom, and/or to decompress video data carried by the received transport stream, and/or may perform additional security operations such as digital rights management. The compressed video data in the received transport stream may comprise three-dimensional video data corresponding to frames or fields of a multi-view stereoscopic video sequence, such as left and right views. The received video data may be compressed and/or encoded by, for example, the MPEG-2 Transport Stream (TS) protocol or the MPEG-2 Program Stream (PS) container format. In various embodiments of the present invention, the left view data and the right view data may be received in separate streams or separate files. In such embodiments, the video decoder 246 may decompress the received separate left and right view video data based on, for example, MPEG-2 MVP, H.264, and/or MPEG-4 Advanced Video Coding (AVC) or MPEG-4 Multiview Video Coding (MVC). In another embodiment of the invention, the stereoscopic left and right views may be merged into one frame sequence. For example, a line-based, top-and-bottom-based, and/or checkerboard-based three-dimensional encoder may convert frames from a three-dimensional stream containing left view data and right view data into a single compressed frame and may use MPEG-2, H.264, AVC, and/or other encoding techniques. In such embodiments, the video decoder 246 may decompress video data based on, for example, MPEG-4 AVC and/or MPEG-2 Main Profile (MP).
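The merged-frame formats mentioned above (line-interleaved and top-and-bottom packing) can be illustrated with a minimal sketch of how a decoder might split one packed frame back into left and right views; the row-parity convention shown is an assumption, since the actual layout is signalled by the encoder:

```python
def split_top_bottom(frame):
    """Vertically packed frame: top half carries the left view, bottom
    half the right view (one common convention)."""
    h = len(frame) // 2
    return frame[:h], frame[h:]

def split_line_interleaved(frame):
    """Line-interleaved frame: even rows carry the left view, odd rows
    the right view (assumed parity)."""
    return frame[0::2], frame[1::2]

# Each list element stands in for one full row of pixels.
left, right = split_line_interleaved(["L0", "R0", "L1", "R1"])
# left == ["L0", "L1"], right == ["R0", "R1"]
```

A checkerboard packing would split each row in addition to alternating rows, but the recovery principle is the same: each view is reconstructed from the pixel positions assigned to it by the encoder.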
The store and play module 248 may comprise suitable logic, circuitry, interfaces and/or code that may enable caching of three-dimensional video data, such as left and/or right views, as the three-dimensional video data is transferred from one process and/or component to another. In this regard, the store and play module 248 may receive data from the video decoder 246 and may transmit data to the display conversion module 258, the video processor 250, and/or the viewing controller 252. Additionally, the store and play module 248 may buffer decompressed reference frames and/or fields, for example, when the display conversion module 258 performs frame interpolation and/or when contrast enhancement processing operations are performed. The store and play module 248 may exchange control signals with, for example, the host processor 242 and/or write data to the system memory 244 for long term storage.
The video processor 250 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to perform video processing operations on received video data to facilitate generation of an output video stream that may be played via the display 260. Video processor 250 may be used, for example, to generate video frames based on a plurality of view sequences extracted from a received transport stream, which may provide three-dimensional video playback via display 260. In this regard, video processor 250 may use video data, such as luminance and/or chrominance data, in frames and/or fields of a received sequence of views.
The viewing controller 252 may comprise suitable logic, circuitry, interfaces and/or code that may enable managing interaction with an optical viewing device, such as three-dimensional glasses 262. In this regard, the viewing controller 252 may be used, for example, to determine and/or adjust the polarization of each of the left and right view sequences in the stereoscopic three-dimensional video to transmit polarization information and/or parameters to the three-dimensional glasses 262 via the communication module 254 to perform synchronization operations in the three-dimensional glasses 262. Similarly, when the three-dimensional glasses 262 operate in the shutter mode, the viewing controller 252 may be used to determine and/or adjust the frame rate and/or the alternation frequency of the left and right view sequences in the stereoscopic three-dimensional video, and/or transmit shutter-related information and/or parameters to the three-dimensional glasses 262 via the communication module 254 to perform a synchronization operation in the three-dimensional glasses 262.
The communication module 254 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to provide a communication link between the video processing system 240 and one or more devices, such as the three-dimensional glasses 262, communicatively coupled to the video processing system 240. In this regard, the communication module 254 may process signals transmitted and/or received through, for example, the antenna subsystem 256. Communication module 254 may be used, for example, to amplify, filter, modulate, and/or upconvert baseband signals to RF signals for transmission, and/or to amplify, filter, demodulate, and/or downconvert received RF signals to baseband signals, according to one or more wireless standards. Exemplary wireless standards may include Wireless Personal Area Networks (WPANs), Wireless Local Area Networks (WLANs), and/or proprietary wireless standards. In this regard, the communication module 254 may be configured to communicate via a Bluetooth, ZigBee, 60 GHz, Ultra Wideband (UWB), and/or IEEE 802.11 (e.g., WiFi) interface.
Communication module 254 may perform the necessary conversion between the received RF signal and a baseband frequency signal, which may be processed by, for example, a digital baseband processor (not shown). In the receive direction, for example, communication module 254 may generate the necessary signals, e.g., local oscillator signals, to receive and process RF signals at a particular frequency. The communication module 254 may then down-convert the received RF signal, either directly or indirectly, to, for example, a baseband frequency signal. In some examples, communication module 254 may analog-to-digital convert the baseband signal components before passing them to the digital baseband processor. In the transmit direction, communication module 254 may generate the necessary signals, such as local oscillator signals, to transmit and/or process RF signals at a particular frequency. Communication module 254 may then perform the necessary conversion between the baseband frequency signal, which may be generated by, for example, a digital baseband processor, and the transmitted RF signal. In some examples, communication module 254 may perform digital-to-analog conversion on the baseband signal components.
The antenna subsystem 256 may comprise suitable logic, circuitry, and/or code that may enable transmitting and/or receiving RF signals via one or more antennas that may be configured for RF communications within a particular band corresponding to one or more supported wireless interfaces. For example, the antenna subsystem 256 may transmit and/or receive RF signals in the 2.4 GHz band, which may be suitable for Bluetooth and/or WLAN RF transmission and/or reception.
The display conversion module 258 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to process video data generated and/or processed by the video processing system 240 to generate an output video stream that may be suitable for playback via the display 260. In this regard, the display conversion module 258 may perform, for example, frame conversion based on motion estimation and/or motion compensation to increase the number of frames when the frame rate of the display 260 is higher than that of the input video stream, and may convert three-dimensional video data generated and/or processed by the video processing system 240 to two-dimensional output video when the display 260 is not three-dimensional capable. In this regard, the three-dimensional video that is converted to the two-dimensional output stream may comprise a mixture of three-dimensional input video and three-dimensional graphics. In one aspect of the invention, the display conversion module 258 may be used to adjust and/or modify certain characteristics of the three-dimensional video output stream to ensure synchronized viewing through the three-dimensional glasses 262. For example, the display conversion module 258 may adjust the polarization of the left and/or right view sequences in the output stream based on feedback from, for example, the viewing controller 252 to ensure that the polarization of the right and/or left eye in the three-dimensional glasses 262 is synchronized with the polarization of the right and/or left view sequences.
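The frame rate upconversion described above can be illustrated with a simplified sketch that blends neighbouring frames linearly; a real implementation would use the motion estimation and/or motion compensation the text describes, so the linear blend here is only a stand-in:

```python
def upconvert(frames, factor=2):
    """Increase the frame rate by inserting (factor - 1) linearly
    blended frames between each pair of neighbours. Frames are flat
    lists of pixel values; blending is per-pixel."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for k in range(1, factor):
            t = k / factor  # interpolation weight toward frame b
            out.append([(1 - t) * pa + t * pb for pa, pb in zip(a, b)])
    out.append(frames[-1])  # the last source frame has no successor
    return out

doubled = upconvert([[0, 0], [10, 20]])
# doubled == [[0, 0], [5.0, 10.0], [10, 20]]
```

With `factor=2` this doubles the frame count between the first and last frames, which is the situation described above where the display's refresh rate exceeds that of the input video stream.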
The display 260 may comprise suitable logic, circuitry, interfaces and/or code that may enable receiving reconstructed fields and/or frames of video data processed by the display conversion module 258 and may display corresponding images. The display 260 may be a separate device or the display 260 and the video processing system 240 may be implemented as a unitary device. Display 260 may be used to perform two-dimensional and/or three-dimensional video display. In this regard, a two-dimensional display may be used to display video generated and/or processed using three-dimensional techniques.
The three-dimensional glasses 262 may comprise suitable logic, circuitry, interfaces and/or code that may enable three-dimensional viewing in conjunction with a display device that may not be capable of independently providing three-dimensional display. For example, the video processing system 240 may receive stereoscopic three-dimensional video content while the display 260 lacks autostereoscopic three-dimensional playback functionality, and therefore cannot independently present three-dimensional video images and/or provide three-dimensional viewing effects. In such cases, the three-dimensional glasses 262 may be used to make the left and right eyes of the user perceive separate images, so that the resultant effect corresponds to a three-dimensional effect. In this regard, viewing settings and/or operations of the three-dimensional glasses 262 may be configured and/or synchronized with display and/or playback operations of the display 260 to ensure that the desired three-dimensional effect is produced.
In operation, video processing system 240 may be configured to facilitate the reception and processing of transport streams containing video data, and may be configured to generate and process an output video stream that may be played through a local display device, such as display 260. Processing the received transport stream may include demultiplexing the transport stream to extract a plurality of compressed videos, which may correspond to, for example, a sequence of views and/or additional information. Demultiplexing the transport stream may be performed in the video decoder 246 or by a separate component (not shown). Video decoder 246 may be used to receive transport streams, e.g., in a multi-view compression format, containing compressed stereoscopic video data, and to decode and/or decompress such video data. For example, the received transport stream may include left and right stereoscopic views. Video decoder 246 may be used to decompress received stereoscopic video data and may buffer the decompressed data via the store and play module 248. The decompressed video data may then be processed for playback via the display 260. The video processor 250 may be used to generate a three-dimensional and/or two-dimensional output video stream based on the decompressed video data. In this regard, when stereoscopic three-dimensional video is used, video processor 250 may use decompressed reference frames and/or fields corresponding to the multiple view sequences buffered by the store and play module 248 to generate a corresponding three-dimensional video stream that may be further processed by the display conversion module 258 and/or the viewing controller 252 before being played back via the display 260. For example, the display conversion module 258 may perform motion compensation and/or may insert interpolated pixel data in one or more frames between received frames to implement frame rate upconversion, if desired.
The viewing controller 252 may be used to provide local graphics processing to composite, for example, graphics into the generated video output stream, and the final video output stream may then be played through the display 260.
In various embodiments of the invention, three-dimensional glasses 262 may be used to facilitate three-dimensional viewing of three-dimensional video streams received and/or processed by video processing system 240. In this regard, the three-dimensional glasses 262 may be used to enable three-dimensional viewing while playing three-dimensional video content that corresponds to the output video generated by the video processing system 240 and is displayed by the display 260. In one aspect of the invention, the operation of the three-dimensional glasses 262 may be synchronized with the operation of the video processing system 240 and/or the display 260 during viewing of the three-dimensional video through the three-dimensional glasses 262. For example, when the input video stream includes stereoscopic three-dimensional video content, display 260 may enable independent three-dimensional video playback by employing one or more techniques, such as lenticular screens and/or parallax barriers, which may enable autostereoscopic three-dimensional video playback. When the display 260 does not have the functionality to independently present three-dimensional images, three-dimensional glasses 262 may be used in conjunction with the display 260 to provide a desired three-dimensional viewing experience. In this regard, the three-dimensional glasses 262 may be used to change the viewing of the left and right eyes corresponding to the left and right view video content, respectively, such that a three-dimensional effect may be generated based on the composite effect of the right and left eyes. To facilitate proper three-dimensional viewing through the three-dimensional glasses 262, the operation and/or settings of the three-dimensional glasses 262 and/or the display 260 may be synchronized during three-dimensional playback. 
The three-dimensional glasses 262 may be communicatively coupled to the video processing system 240 to facilitate configuration and/or management of viewing operations of the three-dimensional glasses while playing the three-dimensional video. For example, the three-dimensional glasses 262 may communicate with the video processing system 240 via one or more wireless links, which may be supported by the communication module 254 and/or the antenna subsystem 256.
Synchronization of the three-dimensional glasses 262 may be achieved based on the operational mode of the three-dimensional glasses 262 and/or the associated inherent characteristics of the input video stream and/or the output video stream. For example, the three-dimensional glasses 262 may use polarization-based and/or shutter-based operations. During polarization-based operation, the viewing lens or lenses of each eye may have different polarizations, so that the eyes may simultaneously receive differently polarized images, which when mixed will exhibit the desired three-dimensional effect. When stereoscopic three-dimensional video is played through, for example, display 260, differently polarized right and left view frames or fields may be presented on display 260. To facilitate three-dimensional viewing, right and left eye view polarizations of the three-dimensional glasses 262 may be configured and/or adjusted based on communication with the video processing system 240 via the communication module 254 such that the polarization of each eye of the three-dimensional glasses 262 matches the corresponding polarization of the displayed right and left view images.
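The polarization matching described above can be sketched as a simple per-eye comparison; the dictionary format, angle representation, and tolerance are illustrative assumptions:

```python
def polarization_synchronized(display_pol, glasses_pol, tol_deg=1.0):
    """True when each lens's polarization angle matches the polarization
    of the corresponding displayed view (angles in degrees)."""
    return all(abs(display_pol[eye] - glasses_pol[eye]) <= tol_deg
               for eye in ("left", "right"))

ok = polarization_synchronized({"left": 45.0, "right": 135.0},
                               {"left": 45.0, "right": 135.0})
# ok is True: both lenses already match the displayed views
```

When such a check fails, the glasses would adjust their lens polarizations to the values received from the video processing system, as described above.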
During shutter mode operation, the viewing lens or lenses of each eye of the three-dimensional eyewear may be closed and/or open such that each eye sees only the corresponding view image. For example, during the playing of stereoscopic three-dimensional video, the left eye can only see the left view frames or fields and/or the right eye can only see the right view frames or fields. Thus, to facilitate three-dimensional viewing, the right and left eye shutters of the three-dimensional glasses 262 may be configured and/or adjusted based on communication with the video processing system 240 via the communication module 254 to ensure that the shutter frequency and/or open duration of each side of the three-dimensional glasses 262 exactly corresponds to the alternating left and right frames or fields displayed via the display 260.
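The alternating shutter timing described above can be sketched as follows, assuming the display alternates left and right views starting with the left view (the starting parity and refresh rate are illustrative):

```python
def shutter_schedule(refresh_hz, n_frames):
    """For each displayed frame, return which eye's shutter is open and
    the time (seconds) at which that frame is presented."""
    frame_s = 1.0 / refresh_hz
    return [("left" if i % 2 == 0 else "right", i * frame_s)
            for i in range(n_frames)]

sched = shutter_schedule(120.0, 4)
# Each shutter is open for one frame time (1/120 s), so each eye sees
# its view at an effective 60 Hz.
```

Keeping this schedule aligned with the display's actual frame presentation times is exactly the synchronization the communication module 254 is described as providing.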
The configuration of the three-dimensional glasses 262 may be performed based on communication with the video processing system 240 before the playback process through the display 260 begins. The operation of the three-dimensional glasses 262 may also be adjusted and/or managed during playback to accommodate any changes in parameters and/or characteristics of the output video stream, for example, as displayed by the display 260.
Fig. 3 is a flow chart of exemplary steps for synchronizing three-dimensional glasses with a three-dimensional video display, according to one embodiment of the present invention. Referring to fig. 3, a flow chart 300 is shown that includes exemplary steps that may be performed to synchronize three-dimensional eyewear with a three-dimensional video display.
In step 302, a three-dimensional input video stream may be received and processed. For example, the video processing system 240 may receive and process a video stream containing compressed video data, which corresponds to stereoscopic three-dimensional video. In this regard, the compressed video data may correspond to frames or fields of a multi-view video sequence, including, for example, left and right view streams, which may be used to render three-dimensional images via a display device, such as display 260. In step 304, a plurality of view sequences, e.g., containing left and right view video streams, may be generated based on processing of the received three-dimensional input stream. Frames and/or fields in the left and right video streams may be used to present images through display 260, and display 260 may produce a three-dimensional appearance when viewed in an appropriate manner. In this regard, when the display 260 does not have a function of independently generating a three-dimensional effect by using, for example, a lenticular screen, an additional device such as the three-dimensional glasses 262 may be used to provide the desired three-dimensional effect.
In step 306, a communication link may be established with the three-dimensional glasses. For example, the video processing system 240 and/or the three-dimensional glasses 262 may establish one or more communication links via, for example, the communication module 254 and/or the antenna subsystem 256 to enable interaction between the video processing system 240 and the three-dimensional glasses 262 during three-dimensional playback of the display 260. In step 308, various characteristics of the operation of the three-dimensional glasses and/or the output video stream may be determined. For example, it may be determined whether the three-dimensional glasses 262 operate in the polarization mode or the shutter mode. The polarization and/or alternate rendering frequency of the left and right view fields or frames may also be determined when the video stream processed by the video processing system 240 contains stereoscopic three-dimensional video content. The above determination process may be performed independently and/or collectively by the three-dimensional glasses 262 and/or by the video processing system 240. In addition, in performing the above determination, the three-dimensional glasses 262 and the video processing system 240 may communicate via, for example, the communication module 254 to exchange information and/or data relating to, for example, characteristics of the three-dimensional video content being processed and/or played by the video processing system 240. In step 310, the operation of the three-dimensional glasses may be synchronized with the video playback process. For example, during the playing of a video through the display 260, the viewing operation of the three-dimensional glasses 262 may be synchronized with the operation of the video processing system 240, as described with reference to fig. 2C.
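The steps above can be sketched as a single synchronization routine; the dictionary keys and mode names are illustrative assumptions standing in for the real devices and messages:

```python
def synchronize(glasses, video_system):
    """Minimal sketch of steps 306-310 of flow chart 300. Both arguments
    are plain dicts standing in for the real devices; every key used
    here is hypothetical."""
    # Step 306: establish a communication link (modelled as a flag).
    link_up = True
    # Step 308: determine the glasses' operating mode and the matching
    # characteristics of the output video stream.
    mode = glasses["mode"]
    if mode == "shutter":
        params = {"alternation_hz": video_system["alternation_hz"]}
    else:
        params = {"left_deg": video_system["left_polarization_deg"],
                  "right_deg": video_system["right_polarization_deg"]}
    # Step 310: apply the parameters so display and glasses stay in step.
    return {"link_up": link_up, "mode": mode, **params}

state = synchronize({"mode": "shutter"}, {"alternation_hz": 120.0})
# state == {"link_up": True, "mode": "shutter", "alternation_hz": 120.0}
```

In a real system this routine would run before playback begins and again whenever the output stream's characteristics change, matching the dynamic resynchronization described above.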
Various embodiments of the present invention include a method and system for synchronizing three-dimensional glasses with a three-dimensional video display. Three-dimensional glasses 262 may be used to determine an operational mode for use in viewing three-dimensional video content played through, for example, video processing system 240, and to configure and/or synchronize its operation with the playing of the three-dimensional video content of display 260 based on the determined operational mode. Exemplary operating modes may include a polarization mode and/or a shutter mode. The synchronization of the three-dimensional glasses 262 may be performed dynamically during initialization of the three-dimensional glasses 262, before playback of the three-dimensional video content begins, and/or during playback of the three-dimensional video content. The three-dimensional glasses 262 may communicate with the video processing system 240 via, for example, the communication module 254 to facilitate configuration of the three-dimensional glasses 262 and/or synchronization of viewing operations of the three-dimensional glasses 262 while playing three-dimensional video content via, for example, the display 260. The three-dimensional glasses 262 may communicate with the video processing system 240 via one or more wireless interfaces that may be supported by the video processing system 240 via the communication module 254. Exemplary wireless interfaces may include a Wireless Personal Area Network (WPAN) interface and/or a Wireless Local Area Network (WLAN) interface. The three-dimensional video content may comprise, for example, frames or fields of stereoscopic left and right view video sequences. 
Accordingly, when the operating mode of the three-dimensional glasses 262 and/or the playback mode of the three-dimensional video content on the display 260 is the polarization mode, the polarization of the left lens of the three-dimensional glasses 262 may be synchronized with the polarization of the stereoscopic left-view video sequence, and/or the polarization of the right lens of the three-dimensional glasses 262 may be synchronized with the polarization of the stereoscopic right-view video sequence. When the operating mode of the three-dimensional glasses 262 and/or the playback mode of the three-dimensional video content on the display 260 is the shutter mode, the left-lens shutter of the three-dimensional glasses 262 may be synchronized with the presentation frequency of the frames and/or fields of the stereoscopic left-view video sequence presented by the display 260, and/or the right-lens shutter of the three-dimensional glasses 262 may be synchronized with the presentation frequency of the frames and/or fields of the stereoscopic right-view video sequence presented by the display 260.
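For the shutter mode, the alternation described above amounts to opening the left and right lenses on alternating display intervals. The sketch below illustrates this timing under the assumption that left-view and right-view frames simply alternate at the display's presentation rate; the function name and the convention that even intervals carry left-view frames are assumptions for illustration, and a real system would derive the timing from synchronization signaling sent by the video processing system:

```python
def shutter_schedule(refresh_hz: float, n_frames: int):
    """Return (time_s, open_lens) pairs for alternate-frame shutter
    operation: the left lens opens for left-view frames and the right
    lens for right-view frames, at the display's presentation rate."""
    period = 1.0 / refresh_hz
    return [(i * period, "left" if i % 2 == 0 else "right")
            for i in range(n_frames)]
```

For example, at a 120 Hz presentation rate each lens opens every other 1/120 s interval, so each eye effectively sees its view sequence at 60 Hz.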
Another embodiment of the present invention may provide a machine- and/or computer-readable storage and/or medium having stored thereon machine code and/or a computer program having at least one code section executable by a machine and/or computer, thereby causing the machine and/or computer to perform the steps described above for synchronizing three-dimensional glasses with a three-dimensional video display.
Accordingly, the present invention may be realized in hardware, software, or a combination of hardware and software. The present invention can be realized in a centralized fashion in at least one computer system, or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software could be a general purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
The present invention may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods. Computer program in the present context means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
Cross-reference to related applications
This patent application makes reference to, and incorporates the contents of, the following patent applications:
United States provisional patent application No. 61/287,689 (Attorney Docket No. 20697US01), filed on December 17, 2009;
United States provisional patent application No. 61/287,624 (Attorney Docket No. 20677US01), filed on December 17, 2009;
United States provisional patent application No. 61/287,634 (Attorney Docket No. 20678US01), filed on December 17, 2009;
U.S. patent application No. 12/554,416 (Attorney Docket No. 20679US01), filed on September 4, 2009;
U.S. patent application No. 12/546,644 (Attorney Docket No. 20680US01), filed on August 24, 2009;
U.S. patent application No. 12/619,461 (Attorney Docket No. 20681US01), filed on November 6, 2009;
U.S. patent application No. 12/578,048 (Attorney Docket No. 20682US01), filed on December 13, 2009;
United States provisional patent application No. 61/287,653 (Attorney Docket No. 20683US01), filed on December 17, 2009;
U.S. patent application No. 12/604,980 (Attorney Docket No. 20684US01), filed on October 23, 2009;
U.S. patent application No. 12/545,679 (Attorney Docket No. 20686US01), filed on August 21, 2009;
U.S. patent application No. 12/560,554 (Attorney Docket No. 20687US01), filed on September 16, 2009;
U.S. patent application No. 12/560,578 (Attorney Docket No. 20688US01), filed on September 16, 2009;
U.S. patent application No. 12/560,592 (Attorney Docket No. 20689US01), filed on September 16, 2009;
U.S. patent application No. 12/604,936 (Attorney Docket No. 20690US01), filed on October 23, 2009;
United States provisional patent application No. 61/287,668 (Attorney Docket No. 20691US01), filed on December 17, 2009;
U.S. patent application No. 12/573,746 (Attorney Docket No. 20692US01), filed on October 5, 2009;
U.S. patent application No. 12/573,771 (Attorney Docket No. 20693US01), filed on October 5, 2009;
United States provisional patent application No. 61/287,673 (Attorney Docket No. 20694US01), filed on December 17, 2009;
United States provisional patent application No. 61/287,682 (Attorney Docket No. 20695US01), filed on December 17, 2009;
U.S. patent application No. 12/605,039 (Attorney Docket No. 20696US01), filed on October 23, 2009; and
United States provisional patent application No. 61/287,692 (Attorney Docket No. 20698US01), filed on December 17, 2009.
Each of the above-referenced patent applications is hereby incorporated herein by reference in its entirety.

Claims (10)

1. A video processing method, comprising:
executing, by one or more processors and/or circuits in an optical viewing device:
determining an operating mode for playing the three-dimensional video content; and
configuring the optical viewing device, based on the determined operating mode, to synchronize with the playing of the three-dimensional video content.
2. The method of claim 1, comprising synchronizing the optical viewing device prior to starting the playing of the three-dimensional video content and/or dynamically synchronizing the optical viewing device during the playing of the three-dimensional video content.
3. The method of claim 1, comprising communicating with a video processing device for processing and/or displaying the three-dimensional video content to facilitate the configuring of the optical viewing device.
4. The method of claim 3, comprising communicating with the video processing device over one or more wireless interfaces.
5. A video processing method, comprising:
performed by one or more processors and/or circuits in a video processing system:
generating a three-dimensional (3D) output video stream for display based on a plurality of view sequences extracted from a three-dimensional input video stream; and
communicating with an optical viewing device for viewing the three-dimensional output video stream before and/or during playback of the three-dimensional output video stream, so as to configure the optical viewing device for said viewing and/or so as to synchronize said viewing by the optical viewing device.
6. A video processing system, comprising:
one or more circuits and/or processors in an optical viewing device to determine an operating mode for playing three-dimensional video content; and
the one or more circuits and/or processors are operable to configure the optical viewing device, based on the determined operating mode, to synchronize with the playing of the three-dimensional video content.
7. The system according to claim 6, wherein said one or more circuits and/or processors are operable to synchronize said optical viewing device prior to starting said playing of said three-dimensional video content and/or to dynamically synchronize said optical viewing device during said playing of said three-dimensional video content.
8. The system according to claim 6, wherein said one or more circuits and/or processors are operable to communicate with a video processing device for processing and/or displaying said three-dimensional video content to facilitate said configuring of said optical viewing device.
9. The system according to claim 8, wherein said one or more circuits and/or processors are operable to communicate with said video processing device via one or more wireless interfaces.
10. A video processing system, comprising:
one or more circuits and/or processors to generate a three-dimensional (3D) output video stream for display based on a plurality of view sequences extracted from a three-dimensional input video stream; and
the one or more circuits and/or processors are operable to communicate with an optical viewing device for viewing the three-dimensional output video stream prior to and/or during playback of the three-dimensional output video stream to configure the optical viewing device for the viewing and/or to synchronize the viewing by the optical viewing device.
HK11113873.2A 2009-12-17 2011-12-22 A method and system for video processing HK1159388A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US61/287,689 2009-12-17
US12/698,814 2010-02-02

Publications (1)

Publication Number Publication Date
HK1159388A (en) 2012-07-27
