
HK1141377B - A method and system for processing signal - Google Patents


Info

Publication number
HK1141377B
HK1141377B (application HK10107643.4A)
Authority
HK
Hong Kong
Prior art keywords
video
wireless
motion vector
received
video frames
Prior art date
Application number
HK10107643.4A
Other languages
Chinese (zh)
Other versions
HK1141377A1 (en)
Inventor
Xuemin Chen (陈雪敏)
Marcus Kellerman (马库斯‧凯勒曼)
Original Assignee
Broadcom Corporation (美国博通公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/400,736 external-priority patent/US20100046623A1/en
Application filed by Broadcom Corporation
Publication of HK1141377A1 publication Critical patent/HK1141377A1/en
Publication of HK1141377B publication Critical patent/HK1141377B/en


Description

Signal processing method and system
Cross Reference to Related Applications
This application makes reference to, and claims priority from, U.S. provisional patent application No. 61/090,075, filed August 19, 2008, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to digital video processing, and more particularly, to a method and system for performing motion compensated frame rate up-conversion on compressed and decompressed video streams.
Background
A significant revolution in video display technology is the flat screen based on Liquid Crystal Display (LCD) or Plasma Display Panel (PDP) technology, which has quickly replaced Cathode Ray Tube (CRT) technology, the primary display technology for more than half a century. The significance of this new display technology is that images can be displayed on a flat screen, scanned line by line, at a higher picture rate. This new display technology may also accelerate the transition from Standard Definition (SD) television to High Definition (HD) television.
Conventional video compression systems may use low picture-rate formats to deliver conventional video to modern display screens, and limited channel capacity may constrain the picture rate that can be delivered. For example, consider a 30Hz video sequence distributed over a mobile network to terminals (e.g., mobile phones that may receive the encoded video sequence from a server). Due to bandwidth limitations, only low bit-rate video sequences can be communicated, so the encoder may drop two out of every three pictures before transmission, resulting in a sequence with a picture rate of only about 10Hz. The capacity of the available channels may differ between video services, and different systems may be used in different parts of the world, such as NTSC, SECAM, or PAL.
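The picture-dropping described above can be sketched numerically. This is an illustrative sketch, not the patent's method: an encoder keeps only every third frame, so one second of 30Hz video becomes 10 frames, an effective rate of about 10Hz.

```python
# Illustrative frame decimation: drop two out of every three pictures
# to fit a bandwidth-limited channel (toy example, not the patent's method).

def decimate(frames, keep_every=3):
    """Keep every `keep_every`-th frame (indices 0, 3, 6, ...)."""
    return frames[::keep_every]

source = list(range(30))      # one second of 30 Hz video, frames 0..29
reduced = decimate(source)    # 10 frames -> effective rate of about 10 Hz
```

The receiver-side frame rate up-conversion described later in this document is the inverse operation: it synthesizes the dropped intermediate frames again at the display.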
Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with some aspects of the present invention as set forth in the remainder of the present application with reference to the drawings.
Disclosure of Invention
A system and/or method for motion compensated frame rate up-conversion for compressing and decompressing a video bitstream, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
According to an aspect of the invention, a method of signal processing comprises:
in a video receiver:
receiving a video bitstream comprising a plurality of video frames and corresponding coding information;
extracting the coding information from the received video bitstream; and
performing frame rate up-conversion on the received plurality of video frames using the extracted coding information.
Preferably, the extracted coding information comprises one or more of block motion vectors, block coding modes, quantization levels, and/or quantized residual data.
Preferably, the coding information is generated by a video transmitter through entropy decoding of compressed video from a video source (video feed), the video source being one of a cable television network, an IP television network, a satellite broadcasting network, a mobile communication network, a camera, and/or a video camera.
Preferably, the received plurality of video frames comprises a plurality of decoded video frames constructed in the video transmitter by decompressing the compressed video from the video source.
Preferably, the method further comprises:
generating a pixel motion vector for each of the received plurality of decoded video frames based on the extracted coding information; and
calculating one or both of motion vector reliability and/or motion vector consistency corresponding to the generated pixel motion vectors.
Preferably, the method further comprises generating a plurality of interpolated video frames from the received plurality of decoded video frames based on the generated pixel motion vector and one or both of the calculated motion vector reliability and/or motion vector consistency.
Preferably, the received video bitstream comprises compressed video.
Preferably, the method further comprises: decompressing the received compressed video into a plurality of decoded video frames.
Preferably, the method further comprises:
generating pixel motion vectors for each of the plurality of decoded video frames based on the extracted coding information; and
calculating one or both of motion vector reliability and/or motion vector consistency corresponding to the generated pixel motion vectors.
Preferably, the method further comprises generating a plurality of interpolated video frames from the plurality of decoded video frames based on the generated pixel motion vector and one or both of the calculated motion vector reliability and/or motion vector consistency.
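The claimed method steps (receive the bitstream, extract the coding information, generate interpolated frames) can be sketched end-to-end. The container format, function names, and plain frame-averaging interpolation below are all illustrative assumptions, far simpler than the motion-compensated interpolation actually claimed:

```python
# A minimal sketch of the claimed receiver-side flow, not the patented
# implementation. Frames are 1-D lists of pixel values; the "bitstream"
# is a hypothetical dict carrying frames plus coding information.

def extract_coding_info(bitstream):
    # Hypothetical container: the transmitter packed frames together
    # with coding information such as block motion vectors.
    return bitstream["frames"], bitstream["coding_info"]

def interpolate(prev_frame, next_frame):
    # Simplest possible interpolation: average the two reference frames.
    return [(a + b) / 2 for a, b in zip(prev_frame, next_frame)]

def frame_rate_up_convert(bitstream):
    frames, _info = extract_coding_info(bitstream)
    out = []
    for prev, nxt in zip(frames, frames[1:]):
        out.extend([prev, interpolate(prev, nxt)])
    out.append(frames[-1])
    return out

stream = {"frames": [[0, 0], [2, 4], [4, 8]], "coding_info": {}}
up = frame_rate_up_convert(stream)   # 3 frames in, 5 frames out
```

Inserting one interpolated frame between each pair of received frames roughly doubles the frame rate; the claims refine this by weighting the interpolation with motion vector reliability and consistency.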
According to another aspect of the invention, a system for signal processing comprises:
one or more circuits for use in a video receiver, the one or more circuits operable to receive a video bitstream, the video bitstream comprising a plurality of video frames and corresponding coding information;
the one or more circuits are operable to extract the coding information from the received video bitstream;
the one or more circuits are operable to perform frame rate up-conversion on the received plurality of video frames using the extracted coding information.
Preferably, the extracted coding information comprises one or more of block motion vectors, block coding modes, quantization levels, and/or quantized residual data.
Preferably, the coding information is generated by a video transmitter through entropy decoding of compressed video from a video source, the video source being one of a cable television network, an IP television network, a satellite broadcasting network, a mobile communication network, a camera, and/or a video camera.
Preferably, the received plurality of video frames comprises a plurality of decoded video frames constructed in the video transmitter by decompressing the compressed video from the video source.
Preferably, the one or more circuits are operable to generate pixel motion vectors for each of the received plurality of decoded video frames based on the extracted coding information; and
to calculate one or both of motion vector reliability and/or motion vector consistency corresponding to the generated pixel motion vectors.
Preferably, the one or more circuits are operable to generate a plurality of interpolated video frames from the received plurality of decoded video frames based on the generated pixel motion vectors and one or both of the calculated motion vector reliability and/or motion vector consistency.
Preferably, the received video bitstream comprises compressed video.
Preferably, the one or more circuits are operable to decompress the received compressed video into a plurality of decoded video frames.
Preferably, the one or more circuits are operable to generate pixel motion vectors for each of the plurality of decoded video frames based on the extracted coding information; and
to calculate one or both of motion vector reliability and/or motion vector consistency corresponding to the generated pixel motion vectors.
Preferably, the one or more circuits are operable to generate a plurality of interpolated video frames from the plurality of decoded video frames based on the generated pixel motion vectors and one or both of the calculated motion vector reliability and/or motion vector consistency.
Various advantages, aspects and novel features of the invention, as well as details of an illustrated embodiment thereof, will be more fully described with reference to the following description and drawings.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is an exemplary block diagram of a wireless HD system that transmits a video bitstream from a wireless HD transmitter to a wireless HD receiver over a wireless HD transmission link in one embodiment in accordance with the invention;
FIG. 2 is an exemplary schematic diagram of a wireless HD transmitter operable to transmit a decompressed video bitstream over a wireless HD transmission link in accordance with one embodiment of the invention;
FIG. 3 is an exemplary diagram of a decompression engine employed in a wireless transmitter for video decompression processing in accordance with one embodiment of the present invention;
FIG. 4 is an exemplary diagram of a wireless HD receiver employed to receive a decompressed video bitstream over a wireless HD transmission link in accordance with one embodiment of the present invention;
FIG. 5 is an exemplary diagram of a frame rate up-conversion engine employed by a wireless HD receiver for motion compensated interpolation according to one embodiment of the present invention;
FIG. 6 is a block diagram of an interpolated video frame being interpolated between two reference video frames in accordance with an embodiment of the present invention;
FIG. 7 is an exemplary block diagram of interpolating motion vectors of a video frame in accordance with one embodiment of the present invention;
FIG. 8 is an exemplary schematic diagram of a wireless HD transmitter operable to transmit a compressed video bitstream over a wireless HD transmission link in accordance with one embodiment of the invention;
FIG. 9 is an exemplary schematic diagram of a wireless HD receiver for receiving a compressed video bitstream over a wireless HD transmission link in accordance with one embodiment of the invention;
FIG. 10 is a flowchart of exemplary steps for motion compensated frame rate up-conversion of compressed and decompressed video bitstreams using wireless HD, in accordance with one embodiment of the present invention;
FIG. 11 is a flowchart of exemplary steps for video decompression in accordance with one embodiment of the present invention;
FIG. 12 is a flowchart illustrating exemplary steps performed by a wireless HD receiver to perform motion compensated frame rate up-conversion on compressed and decompressed video bitstreams in accordance with one embodiment of the present invention.
Detailed Description
Some embodiments of the invention may be found in a system and/or method for performing motion compensated frame rate up-conversion on compressed and decompressed video bitstreams. Various embodiments of the present invention may include a video receiver (e.g., a wireless high definition receiver) operable to receive a video bitstream from a video transmitter, such as a wireless high definition transmitter, over, for example, a wireless high definition transmission link. The received video bitstream may include a plurality of video frames for display along with coding information. The coding information may be extracted and used for frame rate up-conversion processing on the plurality of video frames for display. The coding information, such as block motion vectors, block coding modes, quantization levels, and/or quantized residual data, may be generated by a wireless High Definition (HD) transmitter by entropy decoding compressed video from a video source, such as an IP television network or a satellite broadcast network. The received video bitstream may be uncompressed or compressed. In the case where a plurality of decoded video frames is received, the wireless HD receiver is operable to generate a plurality of interpolated video frames for the received decoded video frames by using the extracted associated coding information and associated measurements, such as one or both of the reliability and/or consistency of the motion vectors.
In the case where compressed video (e.g., MPEG-2 or MPEG-4) is received, the wireless HD receiver is operable to decompress the received compressed video, resulting in the generation of a plurality of decoded video frames. Decompression may occur prior to frame rate up-conversion; as previously described, the wireless HD receiver may be operable to perform frame rate up-conversion based on the generated plurality of decoded video frames.
Fig. 1 is an exemplary block diagram of a wireless HD system for transmitting a video bitstream from a wireless HD transmitter to a wireless HD receiver over a wireless HD transmission link, according to one embodiment of the present invention. Referring to fig. 1, the wireless HD system 100 is shown.
Wireless HD system 100 may include video source 110, wireless HD transmitter 120, antenna 122, wireless HD transmission link 130, wireless HD receiver 140, antenna 142, and display device 150. The video source 110 may include a cable television network 111, an IP television network 112, a satellite broadcast network 113, a mobile communications network 114, a camera 115, and/or a video camera 116. Wireless HD system 100 is capable of transmitting high definition audio and video over a wireless link, such as wireless HD transmission link 130. The wireless HD system may be configured to support a variety of industry standards, such as the WirelessHD (WiHD) standard and/or the Wireless Home Digital Interface (WHDI) standard.
The video source 110 may comprise suitable logic, circuitry, and/or code that may enable providing a compressed video bitstream with a low picture rate to the wireless HD transmitter 120. The compressed video bitstream may be formed using a variety of video compression algorithms, such as those specified by MPEG-2, MPEG-4/AVC, VC1, VP6, and/or other video formats that may allow for forward, backward, and bi-directional predictive coding. The compressed video bitstream may be provided by direct video sources, such as the camera 115 and/or the video camera 116, or by indirect video sources such as the cable television network 111, the IP television network 112, the satellite broadcast network 113, and/or the mobile communications network 114.
The antenna 122 may comprise suitable logic, circuitry, and/or code that may enable communication of signals in a Radio Frequency (RF) band. In this regard, the signal transmitted to the wireless HD receiver 140 may comprise uncompressed video data and/or compressed video data. Although fig. 1 shows only a single antenna 122, the invention is not limited thereto; one or more antennas may be used to transmit signals from the wireless HD transmitter 120 to the wireless HD receiver 140 over a Radio Frequency (RF) band without departing from the spirit and scope of various embodiments of the present invention.
The wireless HD transmitter 120 may comprise suitable logic, circuitry, and/or code that may enable transmission of various data, such as compressed video data and/or decompressed video data, over the wireless HD transmission link 130 to the wireless HD receiver 140. The wireless HD transmitter 120 may be configured to accept a low picture-rate compressed video bitstream from the video source 110. The accepted compressed video bitstream may be transmitted over the wireless HD transmission link 130 to the wireless HD receiver 140. In one embodiment of the present invention, the wireless HD transmitter 120 may be used to communicate with the wireless HD receiver 140 to determine the video formats that the wireless HD receiver 140 can support. The determined video format may include, for example, uncompressed, MPEG-2, MPEG-4, VC1, and/or VP6. In this regard, wireless HD transmitter 120 may be configured to transmit the accepted compressed video bitstream in the determined video format, which may be uncompressed or compressed.
In examples where the wireless HD transmitter is used to send uncompressed video bitstreams to the wireless HD receiver 140, the wireless HD transmitter 120 may first be configured to decompress the accepted compressed video bitstream from the video source 110 and then transmit the decompressed video bitstream to the wireless HD receiver 140 over the wireless HD transmission link 130. In another embodiment of the present invention, the wireless HD transmitter 120 may be used to extract coding information from the received compressed video bitstream from the video source 110, for example, by entropy decoding. The extracted coding information may include information related to the received video bitstream such as block motion vectors, block coding modes, quantization levels, and/or quantized residual data. The extracted coding information may be formatted or reformatted with the determined video format and may be transmitted with the accepted compressed video bitstream or the decompressed video bitstream over the wireless HD transmission link 130 to the wireless HD receiver 140.
The wireless HD transmission link 130 may comprise suitable logic, circuitry, and/or code that may enable transmission of wireless High Definition (HD) signals. The wireless HD transmission link 130 may be configured to transmit HD signals in accordance with a standard such as wireless HD. The wireless HD standard is specified over roughly 7GHz of contiguous bandwidth around the 60GHz radio frequency band. Wireless HD may be employed for uncompressed digital transmission of full HD video, audio, and data signal combinations. Functionally, wireless HD is roughly equivalent to the High Definition Multimedia Interface (HDMI), a compact audio/video interface that transfers uncompressed digital signals. In this regard, the wireless HD transmission link 130 may be configured to transmit uncompressed video bitstreams and compressed video bitstreams between the wireless HD transmitter 120 and the wireless HD receiver 140. The wireless HD transmission link 130 may be configured to handle data transmission rates up to, for example, 25Gbit/s, allowing the video bitstream to scale to higher resolution, color depth, and/or color gamut.
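To see why a multi-gigabit link is needed for uncompressed transmission, a back-of-envelope calculation helps. The parameters below (1080p resolution, 60 frames per second, 24 bits per pixel) are illustrative assumptions, not figures from this document:

```python
# Back-of-envelope bandwidth of uncompressed HD video (assumed parameters):
# 1920x1080 pixels, 60 frames per second, 24 bits per pixel.
width, height = 1920, 1080
fps = 60
bits_per_pixel = 24

bitrate = width * height * fps * bits_per_pixel   # bits per second
gbit_per_s = bitrate / 1e9                        # roughly 2.99 Gbit/s
```

At roughly 3 Gbit/s for a single uncompressed 1080p60 stream, a link rated in the tens of gigabits per second leaves headroom for higher resolutions, deeper color, and audio, which is the scaling the paragraph above refers to.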
The antenna 142 may comprise suitable logic, circuitry, and/or code that may enable receiving signals over a radio frequency band. In this regard, the antenna 142 may be used to receive video signals including uncompressed or compressed video bitstreams from the wireless HD transmitter 120. Although fig. 1 shows only a single antenna 142, the present invention is not limited thereto. Wireless receiver 140 may thus receive signals in a Radio Frequency (RF) band using one or more antennas without departing from the spirit and scope of various embodiments of the present invention.
The wireless HD receiver 140 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to receive various data, such as a compressed video bitstream and/or a decompressed video bitstream, from the wireless HD transmitter 120 via the antenna 142 over the wireless HD transmission link 130. In one embodiment of the present invention, the wireless HD receiver 140 may be used to communicate with the wireless HD transmitter 120 to provide supported video formats. The video formats may include, for example, uncompressed, MPEG-2, MPEG-4, VC1, and/or VP6. In this regard, the wireless HD receiver 140 may be configured to receive either an uncompressed video bitstream or a compressed video bitstream according to the video format determined by the wireless HD transmitter 120. In the example of receiving an uncompressed video bitstream from the wireless HD transmitter 120, the wireless HD receiver 140 may be used to extract the coding information from the received uncompressed video bitstream. The extracted coding information may include, for example, block motion vectors, block coding modes, quantization levels, and/or quantized residual data associated with the original compressed video bitstream underlying the received uncompressed video bitstream. The coding information may be used in the wireless HD receiver 140 for frame rate up-conversion processing. For each uncompressed video frame in the received uncompressed video bitstream, the wireless HD receiver 140 may be used to interpolate one or more intermediate video frames when performing frame rate up-conversion.
Wireless HD receiver 140 may be configured to transmit the interpolated video frames to display device 150 via an interface, such as an HDMI interface and/or DisplayPort, to enable the interpolated video frames to be displayed for viewing by a user. In the example of receiving a compressed video bitstream from the wireless HD transmitter 120, the wireless HD receiver 140 may extract the coding information by entropy decoding from the received compressed video bitstream. The wireless HD receiver 140 may be configured to decompress the received compressed video bitstream to generate a sequence of decoded video frames. Wireless HD receiver 140 may be configured to use the decoded video frame sequence as base video frames and perform frame rate up-conversion processing with reference to the extracted coding information, such as block motion vectors. In performing frame rate up-conversion, wireless HD receiver 140 may be used to interpolate one or more intermediate video frames for each decoded video frame, which may be transmitted, for example, over HDMI and/or DisplayPort to display device 150 for display of the interpolated video frames to a user.
The display device 150 may comprise suitable logic, circuitry, and/or code that may enable display of video frames received from the wireless HD receiver 140 to a user. The display device 150 may be used to communicate with the wireless HD receiver 140 using a variety of interfaces, such as HDMI, Ethernet, and/or DisplayPort.
Although a wireless HD system is illustrated in fig. 1, the present invention is not limited thereto. In this regard, the wireless HD transmitter 120 and the wireless HD receiver 140 may be used to support wireless or wired communications, which may be HD or Standard Definition (SD), without departing from the spirit and scope of various embodiments of the present invention.
In operation, the wireless HD transmitter 120 may be configured to accept a compressed video bitstream from the video source 110. The wireless HD transmitter 120 may be used to extract coding information from the accepted video bitstream. The extracted coding information may include information such as block motion vectors, block coding modes, quantization levels, and/or quantized residual data associated with the accepted compressed video bitstream. The wireless HD transmitter 120 may be used to communicate with a target receiver, such as the wireless HD receiver 140, over the wireless HD transmission link 130 to determine a video format, such as uncompressed, MPEG-2, MPEG-4, VC1, and/or VP6, for video transmission to the wireless HD receiver 140. The extracted coding information and the accepted video bitstream may be formatted or reformatted in the determined video format and sent together to the wireless HD receiver 140. The wireless HD receiver 140 may be used to extract the coding information from the received video bitstream, whether uncompressed or compressed, for use in performing frame rate up-conversion. In the case where the received video bitstream is compressed, the wireless HD receiver 140 may be configured to perform video decompression prior to frame rate up-conversion to generate a sequence of decoded video frames. The wireless HD receiver may be used to interpolate one or more intermediate video frames for each uncompressed video frame or decoded video frame in the frame rate up-conversion process. These interpolated video frames may be transmitted to the display device 150, such as over HDMI, Ethernet, and/or DisplayPort, for display to a user.
Fig. 2 is an exemplary diagram of a wireless HD transmitter operable to transmit a decompressed video bitstream over a wireless HD transmission link in accordance with one embodiment of the present invention. The wireless HD transmitter 200 may include a decompression engine 210, a processor 220, and a memory 230.
The decompression engine 210 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to decompress the compressed video bitstream received from the video source 110 to generate/construct decoded video frames. The decompression engine 210 may be used to implement a variety of video decompression techniques, such as entropy decoding, inverse quantization, inverse transform, and motion compensated prediction. The decompression engine 210 may be used to provide coding information such as block motion vectors, block coding modes, quantization levels, and quantized residual data. Target receivers, such as the wireless HD receiver 140, may use this coding information provided by the decompression engine 210 to perform frame rate up-conversion.
The processor 220 may comprise suitable logic, circuitry, interfaces and/or code that may enable accepting a compressed video bitstream from the video source 110. The processor 220 may be used to transmit the accepted compressed video bitstream to the decompression engine 210 for various video decoding and/or decompression operations, such as entropy decoding, inverse quantization, inverse transformation, and motion compensated prediction. The decoded video frames provided by the decompression engine 210 and the extracted coding information may be transmitted to a target video receiver, such as the wireless HD receiver 140. The processor 220 is operable to communicate with the memory 230 to provide the decompression engine 210 with a variety of video decoding algorithms for various decoding operations. The processor 220 may be configured to communicate with the wireless HD receiver 140 to determine the video formats supported for the corresponding video transmission. The processor 220 may both format the decoded video frames using the determined video format and provide the coding information for transmission to the wireless HD receiver 140.
The memory 230 may comprise suitable logic, circuitry, interfaces, and/or code that may enable storage of information, such as executable instructions and data, used by the processor 220 and the decompression engine 210. The executable instructions may include decoding algorithms employed by the decompression engine 210 for various video decoding operations. The data may include decoded video frames and extracted coding information such as block motion vectors, block coding modes, quantization levels, and quantized residual data. The memory 230 may include RAM, ROM, low latency nonvolatile memory (e.g., flash memory), and/or other suitable electronic data storage.
In operation, the processor 220 may be used to accept a compressed video bitstream with a low frame rate from a video source, such as the IP television network 112. The processor 220 may be used to communicate the accepted compressed video bitstream to the decompression engine 210 for various video decoding and/or decompression operations, such as entropy decoding, inverse quantization, inverse transformation, and motion compensated prediction. The decompression engine 210 may be configured to provide decoded video frames and associated coding information such as block motion vectors, block coding modes, quantization levels, and quantized residual data to the processor 220. The decompression engine 210 may employ a variety of decoding algorithms stored in the memory 230 for the corresponding video processing operations. The processor 220 is operable to combine the decoded video frames and the extracted coding information in the determined format suitable for a target receiver, such as the wireless HD receiver 140.
Fig. 3 is an exemplary diagram of a decompression engine employed in a wireless transmitter for video decompression processing according to one embodiment of the invention. Referring to fig. 3, a decompression engine 300 is shown. The decompression engine 300 may include an entropy decoding unit 310, an inverse quantization unit 320, an inverse transform unit 330, a combiner 340, and a motion compensated prediction unit 350.
The entropy decoding unit 310 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to decode entropy-encoded data. The entropy decoding unit 310 may be used to convert the binary bits of the entropy-encoded data into symbols (quantized residual data) that may be fed or transmitted to subsequent decoding units (e.g., the inverse quantization unit 320 and the inverse transform unit 330) for decoding to obtain decoded video frames. This conversion from binary bits to symbols can be done in a number of ways; for example, in MPEG, entropy decoding can be done by first using variable-length decoding and then run-length decoding. The entropy-encoded data may be the accepted compressed video bitstream from the video source 110. In this regard, the entropy decoding unit 310 may be used to extract coding information, such as block motion vectors, from the received compressed video bitstream. The extracted coding information may include, for example, block motion vectors, block coding modes, quantization levels, and/or quantized residual data associated with the received compressed video bitstream.
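The run-length step mentioned above can be illustrated with a toy decoder. This sketch assumes MPEG-style (run, level) pairs, where each pair encodes a run of zeros followed by one nonzero coefficient; it is not a bit-exact MPEG decoder:

```python
# Illustrative run-length decoding in the MPEG style mentioned above:
# each (run, level) pair means "`run` zeros, then the coefficient `level`".
# Toy example only, not a bit-exact MPEG entropy decoder.

def run_length_decode(pairs, block_size=8):
    coeffs = []
    for run, level in pairs:
        coeffs.extend([0] * run)   # the run of zero coefficients
        coeffs.append(level)       # the nonzero coefficient that ends it
    coeffs.extend([0] * (block_size - len(coeffs)))  # pad trailing zeros
    return coeffs

decoded = run_length_decode([(0, 12), (2, -3), (1, 5)])
# -> [12, 0, 0, -3, 0, 5, 0, 0]
```

In a real codec, variable-length decoding would first recover these (run, level) pairs from the bitstream before this expansion step.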
The inverse quantization unit 320 may comprise suitable logic, circuitry, and/or code that may enable scaling and/or rescaling of the quantized residual data from the entropy decoding unit 310 to construct a video frame with a finite set of colors, in which each color is associated with its closest representation. The inverse quantization unit 320 may be used, for example, to reduce visual distortion in the reconstructed image.
The inverse transform unit 330 may comprise suitable logic, circuitry, interfaces and/or code that may enable formation of residual macroblocks for each dequantized video frame from the inverse quantization unit 320 in conjunction with a standard base format.
The Motion Compensated Prediction (MCP) unit 350 may comprise suitable logic, circuitry, and/or code that may be operable to provide predictions for macroblocks in an uncompressed video frame. The pixel gray levels of a macroblock within the current frame may be predicted based on the pixel gray levels of macroblocks within preceding and/or following reference frames. The difference between the predicted pixel gray levels and the current actual pixel gray levels may be considered the prediction error. The prediction may be passed to the combiner 340 for use in reconstructing the corresponding uncompressed macroblock within the current frame.
The combiner 340 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to combine the redundant macroblocks from the inverse transform unit 330 with corresponding prediction error information from the Motion Compensated Prediction (MCP) unit 350 to generate reconstructed uncompressed macroblocks.
In operation, the entropy decoding unit 310 may be used to receive a compressed video bitstream from the video source 110. Entropy decoding unit 310 may be used to convert the binary bits of the received compressed video bitstream into quantized redundant data. The quantized redundant data may be fed to the inverse quantization unit 320. Inverse quantization unit 320 may be used to rescale the quantized redundant data to reconstruct a video frame with a finite set of colors. The reconstructed video frame may be transmitted to the inverse transform unit 330. The inverse transform unit 330 may be used to inverse transform the reconstructed video frame to form a redundant video frame comprising a plurality of redundant macroblocks. Redundant macroblocks can be formed in a redundant picture by comparing the reconstructed video frame to one or more standard base formats. The redundant video frames and the prediction from the Motion Compensated Prediction (MCP) unit 350 may be combined at combiner 340 to generate reconstructed decoded/uncompressed video frames.
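The final combining step of the flow above can be sketched as follows. This is a highly simplified illustration, assuming frames as flat 1-D lists of 8-bit samples; the function name and clipping range are assumptions, not the actual logic of units 310 through 350.

```python
def reconstruct_frame(residual, prediction):
    """Combine a redundant (residual) frame with its motion compensated
    prediction and clip to the valid 8-bit pixel range, as combiner 340
    conceptually does for each macroblock."""
    return [max(0, min(255, r + p)) for r, p in zip(residual, prediction)]

# Example: residual values added to predicted pixels, with clipping
frame = reconstruct_frame([-5, 10, 300], [100, 120, 10])
```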
Figure 4 is an exemplary diagram of a wireless HD receiver employed to receive a decompressed video bitstream over a wireless HD transmission link in accordance with one embodiment of the present invention. Referring to fig. 4, a wireless HD receiver 400 is shown. The wireless HD receiver 400 may include a frame rate up conversion engine 410, a processor 420, and a memory 430.
The frame rate up-conversion engine 410 may comprise suitable logic, circuitry, interfaces and/or code that may enable up-conversion of a frame rate to provide high quality video effects for high quality video sources including, for example, digital video cameras, camcorder video, and/or telecine. In this regard, the frame rate up-conversion engine 410 may be configured to perform frame rate up-conversion using encoded information extracted from the uncompressed video bitstream received by the wireless HD receiver 400. The extracted coding information may include motion vectors, block coding, quantization levels, and quantized redundant data associated with the original compressed video bitstream from which the received uncompressed video bitstream was generated. The frame rate up-conversion engine 410 may employ various frame rate up-conversion algorithms, such as frame repetition and linear interpolation through temporal filtering, to construct interpolated video frames with a high frame rate to be displayed on an existing screen, such as the display device 150.
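The two up-conversion algorithms named above, frame repetition and linear interpolation through temporal filtering, can be sketched for a doubled frame rate as follows. This is a hypothetical illustration with frames as flat pixel lists; it is not the engine's actual algorithm, and all names are assumptions.

```python
def interpolate(f1, f2, k=0.5):
    """Linear interpolation (temporal filtering) at fraction k between frames."""
    return [round((1 - k) * a + k * b) for a, b in zip(f1, f2)]

def upconvert_2x(frames, method="interpolate"):
    """Double the frame rate by inserting one frame between each pair:
    either a repeat of the earlier frame or a temporal interpolation."""
    out = []
    for f1, f2 in zip(frames, frames[1:]):
        out.append(f1)
        out.append(list(f1) if method == "repeat" else interpolate(f1, f2))
    out.append(frames[-1])
    return out
```

For example, up-converting two frames yields three, with the middle frame either repeated or averaged depending on the chosen method.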
The processor 420 may comprise suitable logic, circuitry, interfaces and/or code that may enable processing of decoded video frames received from the wireless HD transmitter 120. Processor 420 may be configured to pass the received decoded or uncompressed video frames to frame rate up-conversion engine 410 for frame rate up-conversion of the received decoded video frames. Processor 420 may be configured to perform video frame interpolation by frame rate up-conversion engine 410 and the generated interpolated video frames may be displayed on display device 150.
The memory 430 may comprise suitable logic, circuitry, interfaces and/or code that may enable storage of information such as executable instructions and data used by the processor 420 and the frame rate up conversion engine 410. The executable instructions may include a frame rate up-conversion algorithm used by the frame rate up-conversion engine 410. The data may include decoded video frames and said extracted coding information such as motion vectors, block codes, quantization levels, and quantized redundancy data. The data may include interpolated video frames constructed by the frame rate up-conversion engine 410 for display on the display device 150. Memory 430 may comprise RAM, ROM, low latency nonvolatile memory such as flash memory, and/or suitable electronic data storage.
In operation, processor 420 is operable to receive a decoded or uncompressed video bitstream with a low frame rate sent by the wireless HD transmitter 120 over the wireless HD transmission link 130. Processor 420 may be configured to pass the received decoded video bitstream to frame rate up-conversion engine 410 for frame rate up-conversion of the received decoded video frames. The processor 420 and the frame rate up-conversion engine 410 may use the memory 430 for frame rate up-conversion. Frame rate up-conversion engine 410 may be used to perform frame rate up-conversion and generate interpolated video frames. Processor 420 may be used to communicate with display device 150 to display the generated interpolated video frames for a user.
Fig. 5 is an exemplary diagram of a frame rate up-conversion engine employed by the wireless HD receiver for motion compensated interpolation according to an embodiment of the present invention. Referring to fig. 5, there are shown a digital noise reduction filter 510, a pixel motion vector generator 520, a pixel motion vector evaluator 530, a frame rate up-converter 540, and a scene change detector 550.
The digital noise reduction filter 510 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to denoise decoded video frames received from the wireless HD transmitter 120. Performing noise reduction before the other processing steps is essential to achieving good image quality. The digital noise reduction filter 510 may apply various noise reduction techniques, such as de-blocking, de-ringing, and/or other noise reduction filtering techniques, to the received decoded video frames (reference images) before frame rate up-conversion.
The pixel motion vector generator 520 may comprise suitable logic, circuitry, and/or code that may enable generation of pixel motion vectors. The pixel motion vectors may be generated from block motion vectors extracted from decoded video frames received from the wireless HD transmitter 120. The pixel motion vector generator 520 may be used to refine the extracted block motion vectors and decompress the refined block motion vectors into pixel motion vectors. The pixel motion vectors may be further scaled or rescaled to construct interpolated video frames. Pixel motion vector generator 520 may be configured to transmit the pixel motion vectors to pixel motion vector evaluator 530 and frame rate up-converter 540.
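One simple way to expand block motion vectors into per-pixel motion vectors is nearest-block assignment, sketched below; the generator 520's refinement stage is omitted here, and all names, the grid layout, and the block size are illustrative assumptions rather than the patented method.

```python
def block_to_pixel_mvs(block_mvs, block_size):
    """Give every pixel the motion vector of the block that contains it.

    `block_mvs` is a 2-D grid of (mvx, mvy) tuples; the result is a
    pixel-resolution grid `block_size` times larger in each dimension.
    """
    height = len(block_mvs) * block_size
    width = len(block_mvs[0]) * block_size
    return [[block_mvs[y // block_size][x // block_size]
             for x in range(width)]
            for y in range(height)]

# Example: a 1x2 grid of block vectors expanded with 2x2-pixel blocks
pixel_mvs = block_to_pixel_mvs([[(4, 0), (0, -2)]], block_size=2)
```

A production refinement stage would additionally smooth vectors across block boundaries rather than leaving the hard block edges this sketch produces.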
Pixel motion vector evaluator 530 may comprise suitable logic, circuitry, interfaces and/or code that may enable evaluation of one or both of motion vector reliability and/or motion vector consistency associated with the pixel motion vectors generated by pixel motion vector generator 520. The reliability of the generated pixel motion vectors may be calculated in a number of ways, for example using the quantized redundant data and associated quantization levels of decoded video frames received from the wireless HD transmitter 120. The quantized redundant data and associated quantization levels may be extracted from the received decoded video frames. A smaller quantization level together with less redundant data indicates a more reliable generated motion vector, while a larger quantization level together with more redundant data indicates a less reliable generated motion vector. Motion vector consistency may be generated by comparing the differences between neighboring block motion vectors and between motion compensated block boundary pixels. For example, one or both of motion vector reliability and/or motion vector consistency may be used to generate a reliability-consistency measure for motion judder filtering.
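The stated relationship, small quantization level and little redundant data implying a reliable vector, can be captured by a simple heuristic score. The formula, normalization constants, and name below are assumptions for illustration, not the evaluator 530's actual measure.

```python
def mv_reliability(residual_energy, quant_level,
                   max_energy=1000.0, max_quant=31):
    """Map residual energy and quantization level into [0, 1].

    Small residual energy at a small quantization level yields a score
    near 1 (trustworthy vector); large values push the score toward 0.
    """
    e = min(residual_energy / max_energy, 1.0)   # normalized residual energy
    q = min(quant_level / max_quant, 1.0)        # normalized quantization level
    return (1.0 - e) * (1.0 - q)
```

Such a score could then be combined with a consistency term (e.g., disagreement with neighboring block vectors) to form the reliability-consistency measure used downstream.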
The frame rate up-converter 540 may comprise suitable logic, circuitry, interfaces and/or code that may enable frame rate up-conversion of the decoded video frames received from the wireless HD transmitter 120. The frame rate up-converter 540 may be used to perform motion compensated frame rate up-conversion using the encoded information provided by the wireless HD transmitter 120. The pixel motion vectors and/or associated motion vector reliability-consistency of the received decoded video frames may be used for motion compensated frame rate up-conversion. Frame rate up-converter 540 may be used to interpolate the received decoded video frames using the pixel motion vectors along with the reliability-consistency measures of the associated motion vectors. For example, in instances where motion vector reliability is low, frame rate up-converter 540 may be configured to interpolate a reference frame as a still image, e.g., by frame repetition. In instances where motion vector reliability is high, fully motion-vector-dependent interpolation may be performed. The interpolated video frames may be transmitted to the scene change detector 550.
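The reliability-gated fallback described above, motion compensated output when the vector is trustworthy and frame repetition otherwise, can be sketched per block as follows. The threshold value and names are illustrative assumptions.

```python
def interpolate_block(prev_block, mc_block, reliability, threshold=0.5):
    """Choose between motion compensated interpolation and frame repetition.

    `mc_block` is the motion compensated candidate; when the motion
    vector's reliability score falls below `threshold`, fall back to
    repeating the previous reference block (a still image).
    """
    return mc_block if reliability >= threshold else prev_block
```

In practice a soft blend weighted by the reliability score is also common, rather than the hard switch shown here.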
The scene change detector 550 may comprise suitable logic, circuitry, and/or code that may enable detection of scene changes in the received interpolated video frames. The scene change detector 550 may be operable to process the received interpolated video frames, such as by a non-linear filtering process, to reduce artifacts in the final interpolated video frames. The scene change detector 550 may consider the reliability-consistency measurements of the motion vectors to determine if and when motion compensated interpolation will fail. The scene change detector 550 may be used to identify problematic regions in the final interpolated video frames by a variety of methods, such as non-linear filtering.
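A common way to flag the situations where motion compensated interpolation fails outright is to test whether consecutive reference frames differ too much to belong to the same scene. The sketch below uses mean absolute difference; the metric, threshold, and names are assumptions for illustration, not the detector 550's actual logic.

```python
def is_scene_change(f1, f2, threshold=40.0):
    """Flag a scene change when the mean absolute pixel difference
    between consecutive reference frames is large; interpolation between
    unrelated scenes would produce severe artifacts and can be
    suppressed in favor of frame repetition."""
    mad = sum(abs(a - b) for a, b in zip(f1, f2)) / len(f1)
    return mad > threshold
```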
In operation, the wireless HD receiver 140 may be used to receive decoded video frames from the wireless HD transmitter 120. The received decoded video frames are passed to the digital noise reduction filter 510 prior to other processing, and the digital noise reduction filter 510 may be used to denoise the received decoded video frames using a variety of noise reduction techniques, such as deblocking, deringing, or other noise reduction filtering techniques. The filtered decoded video frames may be passed to the pixel motion vector generator 520, the pixel motion vector evaluator 530, the frame rate up-converter 540, and the scene change detector 550, respectively, for further processing. Pixel motion vector generator 520 may be used to generate pixel motion vectors from encoded information, such as block motion vectors extracted from the filtered decoded video frames. The generated pixel motion vectors may be provided to the pixel motion vector evaluator 530 and the frame rate up-converter 540, respectively. The pixel motion vector evaluator 530 may be configured to evaluate one or both of the motion vector reliability and/or motion vector consistency of the generated pixel motion vectors and provide a motion vector reliability-consistency measure to the scene change detector 550. Frame rate up-converter 540 may be used to up-convert the frame rate of the filtered decoded video frames using the pixel motion vectors generated by pixel motion vector generator 520. The generated interpolated video frames from frame rate up-converter 540 may be sent to the scene change detector 550. The scene change detector 550 may be used to detect scene changes in the received interpolated video frames and to process the received interpolated video frames to reduce artifacts in the final interpolated video frames.
Measurements of one or both of motion vector reliability and/or motion vector consistency may be considered for identifying problematic regions in the received interpolated video frames. Various means, such as nonlinear filters, may be employed to mask the problematic regions. The final interpolated video frames may be sent to display device 150 for display.
Fig. 6 is a block diagram of an interpolated video frame between two reference video frames according to an embodiment of the present invention. Referring to fig. 6, there are shown the positions of a plurality of decoded video frames (reference video frames), e.g., P1 602 and P2 604, and the interpolated video frame 606. For example, the interpolated video frame 606 may be inserted at k time units from the decoded video frame P1 602.
Fig. 7 is a block diagram of exemplary motion vectors of an interpolated video frame in accordance with an embodiment of the invention. Referring to fig. 7, there are shown a plurality of decoded video frames, such as P1 702 and P2 704, and the interpolated video frame 706. For example, the interpolated video frame 706 may be inserted at k time units from the decoded video frame P1 702. The motion vector 708 may be directed from an area of the previous video frame P1 702 to an area of the next video frame P2 704, such that the motion vector 708 may capture the motion that occurred between the original video frames P1 702 and P2 704. Motion vector 709 may be a shifted version of motion vector 708. Motion vector 709 may be shifted to align with the interpolated video frame 706.
Motion vector 709 may be decomposed into two motion vectors, e.g., motion vectors MV1 709a and MV2 709b. Motion vectors MV1 709a and MV2 709b may be scaled for motion compensated interpolation. The directions of the scaled motion vectors may be opposite to each other. The length of each scaled motion vector may be proportional to the temporal difference between the interpolated video frame 706 and the corresponding original video frame (e.g., video frame P1 702).
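The decomposition and temporal scaling described above can be sketched as follows: a vector pointing from P1 to P2 is split into two opposed vectors whose lengths are proportional to the temporal distance of the interpolated frame from each reference. The function name and the exact sign convention are illustrative assumptions.

```python
def split_motion_vector(mv, k):
    """Decompose a P1->P2 motion vector for an interpolated frame placed
    at fraction k of the way from P1 to P2 (0 < k < 1).

    Returns (mv1, mv2): opposed vectors scaled by k and (1 - k),
    pointing back toward P1 and forward toward P2 respectively.
    """
    mvx, mvy = mv
    mv1 = (k * mvx, k * mvy)                  # scaled toward P1
    mv2 = (-(1 - k) * mvx, -(1 - k) * mvy)    # opposite direction, toward P2
    return mv1, mv2

# Example: interpolated frame one quarter of the way from P1 to P2
mv1, mv2 = split_motion_vector((8, -4), k=0.25)
```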
Fig. 8 is an exemplary diagram of a wireless HD transmitter that may be used to transmit a compressed video bitstream over a wireless HD transmission link in accordance with one embodiment of the present invention. Referring to fig. 8, a wireless HD transmitter 800 is shown. The wireless HD transmitter 800 may include an entropy decoding unit 810, a processor 820, and a memory 830.
The entropy decoding unit 810 may comprise suitable logic, circuitry, interfaces and/or code that may enable decoding of entropy encoded data. The entropy decoding unit 810 may operate in the same manner as the entropy decoding unit 310 described with respect to fig. 3. The entropy decoding unit 810 may be used to provide encoded information from a compressed video bitstream received from the video source 110. The coding information may include block motion vectors, block coding modes, quantization levels, and/or quantized redundant data. The coding modes may include information such as inter-block based coding or intra-block based coding and a block size. The extracted encoded information may be sent to the processor 820.
The processor 820 may comprise suitable logic, circuitry, interfaces and/or code that may enable processing of the compressed video bitstream received from the video source 110. The processor 820 may be configured to insert the extracted coding information, such as block motion vectors, block coding, quantization levels, and/or quantized redundancy data, into the received compressed video bitstream for communication with a target video receiver (e.g., the wireless HD receiver 140) over the wireless HD transmission link 130. Processor 820 may be in communication with memory 830 to provide a variety of algorithms for use by entropy decoding unit 810. Processor 820 may be configured to communicate with wireless HD receiver 140 to determine a video format that supports a corresponding video transmission. The processor 820 transmits the compressed video bitstream along with the inserted coding information to the wireless HD receiver 140 in the determined video format.
The memory 830 may comprise suitable logic, circuitry, interfaces and/or code that may enable storage of information, such as executable instructions and data, for use by the processor 820 and the entropy decoding unit 810. The executable instructions may include video decoding algorithms for use by the entropy decoding unit 810 for a variety of entropy decoding operations. The data may comprise said received compressed video bitstream and said extracted coding information. Memory 830 may comprise RAM, ROM, low latency nonvolatile memory such as flash memory, and/or suitable electronic data storage.
In operation, the processor 820 may be used to receive a compressed video bitstream with a low frame rate from a video source, such as the IP television network 112. The processor 820 may be used to transmit the received compressed video bitstream to the entropy decoding unit 810 for entropy decoding. The entropy decoding unit 810 may be configured to provide encoding information, such as block motion vectors, block-wise encoding, quantization levels, and/or quantized redundant data, to the processor 820. The processor 820 may be configured to insert coding information into the received compressed video bitstream for communication with a wireless HD receiver in a supported format.
Fig. 9 is an exemplary diagram of a wireless HD receiver for receiving a compressed video bitstream over a wireless HD transmission link in accordance with one embodiment of the present invention. Referring to fig. 9, a wireless HD receiver 900 is shown. The wireless HD receiver 900 may include a decompression engine 910, a frame rate up-conversion engine 920, a processor 930, and a memory 940.
The decompression engine 910 may comprise suitable logic, circuitry, interfaces and/or code that may be operable to decompress a compressed video bitstream received from the wireless HD transmitter 120 to generate decoded video frames. The decompression engine 910 may be used to perform various video decoding/decompression operations such as entropy decoding, inverse quantization, inverse transformation, and motion compensated prediction. Decompression engine 910 may be used to provide decoded video frames to frame rate up-conversion engine 920 for further decoding processing.
The frame rate up-conversion engine 920 may comprise suitable logic, circuitry, interfaces and/or code that may enable up-conversion of a frame rate to provide high quality video effects for high quality video sources, including digital video cameras and/or telecine. In this regard, the frame rate up-conversion engine 920 may be used to extract encoded information, such as block motion vectors, block-wise coding, quantization levels, and/or quantized redundant data, from the compressed video bitstream received from the wireless HD transmitter 120. The extracted coding information may be used to perform frame rate up-conversion on the received compressed video bitstream. Frame rate up-conversion engine 920 may employ various frame rate up-conversion algorithms, such as frame repetition and linear interpolation through temporal filtering, to construct interpolated video frames with high frame rates for display on existing screens, such as display device 150.

The processor 930 may comprise suitable logic, circuitry, interfaces and/or code that may enable processing of compressed video frames received from the wireless HD transmitter 120. Processor 930 may be operative to transmit the received compressed video bitstream to decompression engine 910 to obtain corresponding decoded video frames of the received compressed video bitstream. In frame rate up-conversion, the decoded video frames may be used as reference video frames for the final interpolated video frames. Processor 930 may be used to communicate the final interpolated video frames to display device 150 for display.
Memory 940 may comprise suitable logic, circuitry, interfaces and/or code that may enable storage of information, such as executable instructions and data, for use by processor 930, frame rate up conversion engine 920 and/or decompression engine 910. Executable instructions may include various video processing algorithms, such as decompression and frame rate up-conversion, for decompression engine 910 and frame rate up-conversion engine 920, respectively. The data may include a compressed video bitstream received from the wireless HD receiver 120, encoded information extracted from the received compressed video bitstream, decoded video frames, and/or interpolated video frames. The extracted coding information may include, for example, block motion vectors, block based coding, quantization levels, and/or quantized redundancy data for use by frame rate up-conversion engine 920. Memory 940 may include RAM, ROM, low latency nonvolatile memory such as flash memory, and/or suitable electronic data storage.
In operation, the processor 930 may be configured to receive a compressed video bitstream from the wireless HD transmitter 120. Processor 930 may be operative to transmit the received compressed video bitstream to decompression engine 910 for a corresponding decoded video frame. Frame rate up-conversion engine 920 may interpolate the decoded video frames to generate interpolated video frames. In this regard, the frame rate up-conversion engine 920 may be configured to use encoded information extracted from the received compressed video bitstream in frame rate up-conversion. The final interpolated video frame constructed by frame rate up-conversion engine 920 may be sent to display device 150 for display.
Figure 10 is a flowchart illustrating exemplary steps for motion compensated frame rate up-conversion of compressed and decompressed video bitstreams using wireless HD in accordance with one embodiment of the present invention. The exemplary steps begin at step 1002, where the wireless HD transmitter 120 is operable to receive a compressed video bitstream from the video source 110. In step 1004, the wireless HD transmitter 120 may be configured to extract the encoded information from the received compressed video bitstream using entropy decoding. The extracted coding information may include, for example, block motion vectors, block based coding, quantization levels, and/or quantized redundant data. In step 1006, the wireless HD transmitter 120 may be used to obtain information, such as information regarding the video formats that the target receiver (e.g., wireless HD receiver 140) is capable of supporting.
In step 1008, the wireless HD receiver 140 may be configured to provide video format information to the wireless HD transmitter 120 for video transmission. In step 1010, the wireless HD transmitter 120 may be used to determine or select a video format for use in video communication with the wireless HD receiver 140. In step 1012, the wireless HD transmitter 120 may be configured to format or reformat the extracted encoded information using the determined or selected video format. In step 1014, it is determined whether the wireless HD transmitter 120 is configured to transmit an uncompressed video bitstream to the wireless HD receiver 140. In instances where the wireless HD transmitter 120 is configured to send an uncompressed video bitstream to the wireless HD receiver 140, the exemplary steps proceed to step 1016, where the wireless HD transmitter 120 may be used to decode or decompress the received compressed video bitstream via the decompression engine 210 to generate corresponding decoded video frames.
In step 1018, the wireless HD transmitter 120 may be configured to transmit an uncompressed video bitstream to the wireless HD receiver 140, the uncompressed video bitstream including the decoded video frames and the formatted or reformatted extracted encoded information. In step 1020, the wireless HD receiver 140 is operable to receive the transmitted uncompressed video bitstream. The wireless HD receiver may be configured to extract encoded information from the received uncompressed video bitstream. In step 1022, the wireless HD receiver may perform frame rate up-conversion on the received decoded video frames using the extracted encoded information to construct final interpolated video frames. In step 1024, wireless HD receiver 140 may be used to transmit the constructed final interpolated video frame to display device 150 for display, which exemplary step may return to step 1002.
In step 1014, where the wireless HD transmitter is configured to transmit a compressed video bitstream to the wireless HD receiver 140, then step 1026 is entered where the wireless HD transmitter 120 may be configured to transmit the received compressed video bitstream to the wireless HD receiver 140 along with the formatted or reformatted extracted encoded information. In step 1028, the wireless HD receiver 140 may be configured to extract coding information from the received compressed video bitstream from the wireless HD transmitter 120. In step 1030, the wireless HD receiver 140 may be configured to decompress the received compressed video bitstream from the wireless HD transmitter 120 to generate corresponding decoded video frames, exemplary steps continuing in step 1022.
Fig. 11 is a flow diagram illustrating exemplary steps for video decompression in accordance with one embodiment of the present invention. Referring to fig. 11, exemplary steps begin at step 1110, where a decompression engine, such as decompression engine 210 of wireless HD transmitter 200 and/or decompression engine 910 of wireless HD receiver 900, may be used to receive a compressed video bitstream. The compressed video bitstream received by decompression engine 210 may be received directly from video source 110. However, the compressed video bitstream received by decompression engine 910 may be transmitted by wireless HD transmitter 120 over a wireless HD transmission link. In step 1120, the decompression engine 210 or 910 may be used to perform entropy decoding on a current compressed video frame of the received compressed video bitstream to generate quantized redundancy data for the current compressed video frame.
In step 1122, a determination is made as to whether the decompression engine is located within the wireless HD transmitter 120. In the case where a decompression engine, such as decompression engine 210, is located within the wireless HD transmitter 120, the exemplary steps proceed to step 1130, and the decompression engine 210 may be used to generate encoded information for the current compressed video frame by entropy decoding. In step 1140, the decompression engine 210 or 910 predicts a current uncompressed video frame, via motion compensation techniques, using the quantized redundant data generated from the current compressed video frame and one or more previously decoded video frames of the received compressed video bitstream.
In step 1150, the decompression engine 210 or 910 may be used to dequantize the current compressed video frame. In step 1160, the decompression engine 210 or 910 may be used to combine the current dequantized compressed video frame and the current predicted uncompressed video frame to generate a current decoded video frame. In step 1170, it is determined whether every compressed video frame in the received compressed video bitstream has been decoded. In instances where compressed video frames in the received compressed video bitstream remain to be decoded, the exemplary steps may continue with the next available compressed video frame in the received compressed video bitstream, returning to step 1120.
In step 1122, in the case where a decompression engine, such as decompression engine 910, is located within the wireless HD receiver 140, the exemplary steps continue at step 1140. In step 1170, in instances where every compressed video frame of the received compressed video bitstream has been decoded, the exemplary steps end at step 1190.
Figure 12 is a flowchart illustrating exemplary steps performed by a wireless HD receiver to perform motion compensated frame rate up-conversion on compressed and decompressed video bitstreams in accordance with one embodiment of the present invention. Referring to fig. 12, exemplary steps begin at step 1210 when a frame rate up-conversion engine (e.g., frame rate up-conversion engine 410 and/or 920 located in wireless HD receiver 140) is operable to receive a decoded video frame and associated encoded information, including, for example, block motion vectors, block based coding, quantization levels, and/or quantized redundancy data. In step 1220, a frame rate up-conversion engine (e.g., frame rate up-conversion engine 410 and/or 920) may be used to perform digital noise reduction filtering on the received decoded video frames.
In step 1230, a frame rate up-conversion engine (e.g., frame rate up-conversion engine 410 and/or 920) may modify each block motion vector using the corresponding filtered decoded video frame and/or one or more associated forward and/or backward filtered decoded video frames. In step 1240, a motion vector reliability-consistency measure may be determined for each modified block motion vector. In step 1250, a pixel motion vector may be generated for each filtered decoded video frame, for example, by decompressing the corresponding modified block motion vectors. In step 1260, a frame rate up-conversion engine (e.g., frame rate up-conversion engine 410 and/or 920) may perform motion compensated interpolation for each filtered decoded video frame using the corresponding generated pixel motion vectors. In step 1270, the interpolated decoded video frames for each filtered decoded video frame may be filtered and/or guarded in view of the corresponding determined motion vector reliability-consistency measure. The filtered interpolated decoded video frames may be communicated to display device 150 for display. The exemplary steps then return to step 1210.
Various aspects of the present invention provide methods and systems for motion compensated frame rate up-conversion of compressed and decompressed video bitstreams. According to various embodiments of the invention, a video receiver (e.g., wireless HD receiver 140) may be used to receive a video bitstream from a video transmitter (e.g., wireless HD transmitter 120) over, for example, the wireless HD transmission link 130. The received video bitstream may include encoded information and a plurality of video frames for display on display device 150. The wireless HD receiver 140 may be used to extract the encoded information from the received video bitstream. The wireless HD receiver 140 may be configured to perform frame rate up-conversion on the received plurality of video frames by the frame rate up-conversion engine 410 or 920 using the encoded information. The wireless HD transmitter 120 may generate the encoded information by entropy decoding compressed video from the video source 110, such as the cable 111, IP television 112, satellite broadcast 113, mobile communications 114, camera 115, and/or video 116 networks. The extracted coding information may include one or more of block motion vectors, block based coding, quantization levels, and/or quantized redundant data.
The received video bitstream may comprise uncompressed video or compressed video. In the case where the plurality of video frames for display in the received video bitstream includes a plurality of decoded video frames such as described with respect to fig. 2 through 8, the received plurality of decoded video frames may be generated by the wireless HD transmitter 120. The wireless HD transmitter 120 may decompress the compressed video from the video source 110 using a decompression engine 210. The decompression engine 210 may be used to perform a variety of video decoding operations, such as entropy decoding, inverse quantization, inverse transformation, and motion compensated prediction. The digital noise filter 510 within the wireless HD receiver 400 may be used to perform digital noise reduction filtering on each of the received plurality of decoded video frames.
Pixel motion vector generator 520 may utilize the extracted encoding information (e.g., block motion vectors) and the filtered decoded video frames to generate pixel motion vectors for each received decoded video frame. Within pixel motion vector evaluator 530, one or both of the associated motion vector reliability and/or motion vector consistency may be calculated for the generated pixel motion vectors. A plurality of interpolated video frames may be generated from the received plurality of decoded video frames by frame rate up-converter 540 based on the generated pixel motion vectors and on one or both of the motion vector reliability and/or motion vector consistency calculated in pixel motion vector evaluator 530. The generated plurality of interpolated video frames may be further processed, for example, by scene change detector 550. Artifacts such as motion judder may be masked by performing noise reduction filtering using one or both of the calculated motion vector reliability and/or motion vector consistency information.
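A minimal sketch of how per-pixel motion vectors and a motion vector consistency measure could be derived from decoded block motion vectors (illustrative only; the nearest-block expansion and the 3x3 neighbourhood deviation are assumed stand-ins for generator 520 and evaluator 530, not the disclosed algorithms):

```python
import numpy as np

def block_to_pixel_mvs(block_mvs, block_size):
    """Expand one motion vector per block to one vector per pixel
    (nearest-block assignment)."""
    return np.repeat(np.repeat(block_mvs, block_size, axis=0),
                     block_size, axis=1)

def mv_consistency(block_mvs):
    """Per-block consistency score: Euclidean deviation of each vector
    from its 3x3 neighbourhood mean. Low values mean neighbouring
    vectors agree, so interpolation along them is safer."""
    H, W, _ = block_mvs.shape
    padded = np.pad(block_mvs, ((1, 1), (1, 1), (0, 0)), mode='edge')
    neigh_mean = np.zeros_like(block_mvs, dtype=np.float32)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            neigh_mean += padded[1 + dy:1 + dy + H, 1 + dx:1 + dx + W]
    neigh_mean /= 9.0
    return np.linalg.norm(block_mvs - neigh_mean, axis=-1)

mvs = np.zeros((4, 4, 2), dtype=np.float32)  # uniform (static) motion field
pix = block_to_pixel_mvs(mvs, block_size=8)  # -> shape (32, 32, 2)
score = mv_consistency(mvs)                  # all zero: fully consistent
```

A real evaluator would combine such a consistency measure with a reliability term (e.g., based on the matching error along each vector) before the up-converter trusts the vector field.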
In instances where the received video bitstream at a video receiver, such as wireless HD receiver 900, comprises compressed video, the compressed video may be formatted, for example, as MPEG-2, MPEG-4, AVC, VC1, and/or VP6. The wireless HD receiver 900 may perform video decoding on the received compressed video via decompression engine 910, which may employ various video decoding techniques including, for example, entropy decoding, inverse quantization, inverse transformation, and/or motion compensated prediction. The constructed plurality of decoded video frames may be passed by decompression engine 910 to digital noise reduction filter 510 for noise reduction of the generated plurality of video frames. Pixel motion vector generator 520 may utilize the extracted encoding information (e.g., block motion vectors) and the filtered plurality of decoded video frames to generate a pixel motion vector for each of the plurality of decoded video frames. Within pixel motion vector evaluator 530, one or both of the associated motion vector reliability and/or motion vector consistency may be calculated to provide a measure of confidence in the generated pixel motion vectors. A plurality of interpolated video frames may be generated from the received plurality of decoded video frames by frame rate up-converter 540 based on the generated pixel motion vectors and on one or both of the calculated motion vector reliability and/or motion vector consistency. The generated plurality of interpolated video frames may be further processed, for example, by scene change detector 550. Artifacts such as motion judder may be masked by performing noise reduction filtering using one or both of the calculated motion vector reliability and/or motion vector consistency information.
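The frame interpolation step performed by a frame rate up-converter can likewise be sketched as bidirectional motion-compensated averaging with rounded vectors (illustrative assumptions only, not the disclosed up-converter 540; reliability weighting and occlusion handling are omitted):

```python
import numpy as np

def interpolate_frame(prev, nxt, pix_mvs, alpha=0.5):
    """Interpolate a frame at temporal position alpha between two frames.

    prev, nxt: (H, W) luma frames; pix_mvs: (H, W, 2) per-pixel (dy, dx)
    motion from prev to nxt. Each output pixel averages the two frames
    sampled along the (rounded, border-clipped) motion trajectory.
    """
    H, W = prev.shape
    yy, xx = np.mgrid[0:H, 0:W]
    # Sample positions in the earlier and later frames along the trajectory.
    fy = np.clip(np.round(yy - alpha * pix_mvs[..., 0]).astype(int), 0, H - 1)
    fx = np.clip(np.round(xx - alpha * pix_mvs[..., 1]).astype(int), 0, W - 1)
    by = np.clip(np.round(yy + (1 - alpha) * pix_mvs[..., 0]).astype(int), 0, H - 1)
    bx = np.clip(np.round(xx + (1 - alpha) * pix_mvs[..., 1]).astype(int), 0, W - 1)
    return (1 - alpha) * prev[fy, fx] + alpha * nxt[by, bx]

# With a zero motion field this reduces to a plain temporal average:
mid = interpolate_frame(np.zeros((4, 4)), np.full((4, 4), 2.0),
                        np.zeros((4, 4, 2)))
```

In practice the averaging weights would be modulated by the computed reliability and consistency measures, falling back toward simple blending where the vector field cannot be trusted.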
Yet another embodiment of the present invention may provide a machine-readable storage, having stored thereon, a computer program comprising at least one code section for execution by a machine to cause the machine to perform the above method for performing motion compensated frame rate up-conversion of compressed and decompressed video bitstreams.
Accordingly, the present invention may be realized in hardware, software, or a combination of hardware and software. The present invention can be realized in a centralized fashion in at least one computer system, or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software could be a general purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
The present invention may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods. A computer program, in the present context, means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims (6)

1. A method of signal processing, the method comprising:
in a video receiver:
receiving a video bitstream comprising a plurality of video frames and corresponding encoding information, the encoding information being generated by a video transmitter via entropy decoding of compressed video from a video source, the received plurality of video frames comprising a plurality of decoded video frames constructed in the video transmitter by decompressing the compressed video from the video source;
extracting the encoding information from the received video bitstream;
performing frame rate up-conversion on the received plurality of video frames using the extracted encoding information;
generating a pixel motion vector for each of the received plurality of decoded video frames based on the extracted encoding information;
calculating one or both of motion vector reliability and/or motion vector consistency corresponding to the generated pixel motion vectors; and
generating a plurality of interpolated video frames from the received plurality of decoded video frames based on the generated pixel motion vectors and one or both of the calculated motion vector reliability and/or motion vector consistency.
2. The method of claim 1, wherein the extracted encoding information comprises one or more of block motion vectors, block coding modes, quantization levels, and/or quantized residual data.
3. The method of claim 1, wherein the video source is from one of a cable television network, an IP television network, a satellite broadcast network, a mobile communication network, a camera, and/or a video camera.
4. A system for signal processing, the system comprising:
one or more circuits for use in a video receiver, the one or more circuits being operable to receive a video bitstream comprising a plurality of video frames and corresponding encoding information, the encoding information being generated by a video transmitter via entropy decoding of compressed video from a video source, the received plurality of video frames comprising a plurality of decoded video frames constructed in the video transmitter by decompressing the compressed video from the video source;
the one or more circuits are operable to extract the encoding information from the received video bitstream;
the one or more circuits are operable to perform frame rate up-conversion on the received plurality of video frames using the extracted encoding information;
the one or more circuits are operable to generate a pixel motion vector for each of the received plurality of decoded video frames based on the extracted encoding information;
the one or more circuits are operable to calculate one or both of motion vector reliability and/or motion vector consistency corresponding to the generated pixel motion vectors; and
the one or more circuits are operable to generate a plurality of interpolated video frames from the received plurality of decoded video frames based on the generated pixel motion vectors and one or both of the calculated motion vector reliability and/or motion vector consistency.
5. The system of claim 4, wherein the extracted encoding information comprises one or more of block motion vectors, block coding modes, quantization levels, and/or quantized residual data.
6. The system of claim 4, wherein the video source is from one of a cable television network, an IP TV network, a satellite broadcast network, a mobile communication network, a camera, and/or a video camera.
HK10107643.4A 2008-08-19 2010-08-10 A method and system for processing signal HK1141377B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US9007508P 2008-08-19 2008-08-19
US61/090,075 2008-08-19
US12/400,736 US20100046623A1 (en) 2008-08-19 2009-03-09 Method and system for motion-compensated frame-rate up-conversion for both compressed and decompressed video bitstreams
US12/400,736 2009-03-09

Publications (2)

Publication Number Publication Date
HK1141377A1 HK1141377A1 (en) 2010-11-05
HK1141377B true HK1141377B (en) 2012-11-16

Similar Documents

Publication Publication Date Title
KR101056096B1 (en) Method and system for motion compensated frame rate up-conversion for both compression and decompression video bitstreams
US9185426B2 (en) Method and system for motion-compensated frame-rate up-conversion for both compressed and decompressed video bitstreams
JP2795420B2 (en) Method and apparatus and system for compressing digitized video signal
US20130156113A1 (en) Video signal processing
US10225561B2 (en) Method and apparatus for syntax signaling in image and video compression
JP2009100424A (en) Receiving device and receiving method
US6040875A (en) Method to compensate for a fade in a digital video input sequence
SG188221A1 (en) Video signal processing
US8767831B2 (en) Method and system for motion compensated picture rate up-conversion using information extracted from a compressed video stream
US9258517B2 (en) Methods and apparatuses for adaptively filtering video signals
US8243798B2 (en) Methods and apparatus for scalable video bitstreams
US8848793B2 (en) Method and system for video compression with integrated picture rate up-conversion
HK1141377B (en) A method and system for processing signal
US20070230918A1 (en) Video Quality Enhancement and/or Artifact Reduction Using Coding Information From a Compressed Bitstream
HK1149661A (en) Method and system for signal process
KR100367727B1 (en) Methods and arrangements for converting a high definition image to a lower definition image using wavelet transforms
US7804899B1 (en) System and method for improving transrating of MPEG-2 video
Challapali et al. Video compression for digital television applications
Luengo et al. HEVC Mezzanine Compression for UHD Transport over SDI and IP Infrastructures
CN108574842A (en) A kind of video information processing method and processing system
JPH07322243A (en) Image transmission equipment
KR20140092189A (en) Method of Broadcasting for Ultra HD
HK1132117B (en) A method and system for processing video data