WO2006013510A1 - Desentrelacement (De-interlacing) - Google Patents
- Publication number
- WO2006013510A1 (PCT/IB2005/052431)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixels
- motion
- group
- particular group
- border
- Prior art date
- Legal status
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/01—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
- H04N7/0117—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal
- H04N7/012—Conversion between an interlaced and a progressive signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/01—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
- H04N7/0135—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes
- H04N7/014—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes involving the use of motion vectors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/01—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
- H04N7/0135—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes
- H04N7/0142—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes the interpolation being edge adaptive
Definitions
- the invention relates to a motion based de-interlacing unit for computing an image on basis of a sequence of video fields, the de-interlacing unit comprising: motion determining means for determining a motion indicator for a particular group of pixels of the image; and interpolation means for computing the values of the particular group of pixels on basis of the sequence of video fields and on basis of the motion indicator.
- the invention further relates to an image processing apparatus comprising such a motion based de-interlacing unit.
- the invention further relates to a computer program product to be loaded by a computer arrangement, comprising instructions for de-interlacing a sequence of video fields into an image, the computer arrangement comprising processing means and a memory, the computer program product, after being loaded, providing said processing means with the capability to carry out: determining a motion indicator for a particular group of pixels of the image; and computing the values of the particular group of pixels on basis of the sequence of video fields and on basis of the motion indicator.
- the invention further relates to a method of motion based de-interlacing comprising: determining a motion indicator for a particular group of pixels of the image; and computing the values of the particular group of pixels on basis of the sequence of video fields and on basis of the motion indicator.
- the motion threshold is not constant but adapted to the local frequency content of the picture. If the frequency content is relatively high then also the motion threshold is relatively high.
- the de-interlacing unit further comprises border detection means for detecting whether the particular group of pixels is located at a horizontal border of an object in the image and that the interpolation means is forced to perform a non-motion compensated inter-field interpolation if the border detection means determine that the particular group of pixels is located at a horizontal border of an object.
- By a horizontal border is meant that there is a first region in the image which is located above a second region in the image, and that a relatively strong transition in color and/or grey value is noticeable between the first region and the second region.
- This transition corresponds to the border between the first region and the second region.
- the size of the first region and the second region may be substantially different.
- the first region corresponds to only a limited number of horizontal lines such as a part of a ticker tape, i.e. a banner, while the second region corresponds to the rest of the image.
- an achievement of an embodiment of the de-interlacing unit according to the invention is that it is arranged to protect horizontal borders, by overriding the decision or output of the motion determining means. Even if the horizontal border is directly adjacent to a moving object, the de-interlacing unit according to the invention will use the interpolation which is suitable for still objects, i.e. no motion.
- the interpolation of the pixels of the moving object is such that mixing of pixels from both sides of the border is prevented. That means that for the selection of input pixels for the interpolation only pixels are selected which are located on a predetermined location relative to the detected border.
- An embodiment of the motion based de-interlacing unit is a motion adaptive de-interlacing unit, whereby: the motion determining means comprises a motion decision means for computing the motion indicator which represents whether there is motion for the particular group of pixels of the image; and whereby the interpolation means is arranged to perform:
- An embodiment of the motion based de-interlacing unit is a motion compensated de-interlacing unit, whereby: the motion determining means comprises a motion vector computing means for computing the motion indicator which represents a motion vector for the particular group of pixels of the image; and the interpolation means is arranged to perform: motion compensated inter-field interpolation on basis of the motion vector if the border detection means determine that the particular group of pixels is not located at a horizontal border of an object; and non-motion compensated inter-field interpolation if the border detection means determine that the particular group of pixels is located at a horizontal border of an object.
- the border detection means comprises vertical inconsistency computing means for computing a vertical inconsistency measure which is based on differences between respective pixels of a first group of pixels and a second group of pixels, the first group of pixels being vertically neighboring the second group of pixels.
- the border detection means comprises comparing means for comparing the vertical inconsistency measure with a predetermined threshold, which is arranged to determine that the particular group of pixels is located at a horizontal border of an object if the vertical inconsistency measure is larger than the predetermined threshold.
- the border detection means further comprises: horizontal inconsistency computing means for computing a horizontal inconsistency measure which is based on further differences between respective pixels of a third group of pixels and a fourth group of pixels, the third group of pixels and the fourth group of pixels belonging to a previous field which is temporally preceding the image or belonging to a next field which is temporally succeeding the image, the third group of pixels, the fourth group of pixels and the particular group of pixels having the same vertical coordinate; and - comparing means for comparing the horizontal inconsistency measure with the vertical inconsistency measure and being arranged to determine that the particular group of pixels is located at a horizontal border of an object if the vertical inconsistency measure is relatively large compared with the horizontal inconsistency measure.
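The border test described above can be sketched as follows. This is an illustrative reading, not the patent's implementation: the group shapes, the dominance ratio and all function names are assumptions. The vertical inconsistency compares a group with its vertical neighbour; the horizontal inconsistency compares two groups on the same video line (taken from the previous or next field); the group is flagged as lying on a horizontal border when the vertical inconsistency clearly dominates.

```python
def sad(a, b):
    """Sum of absolute differences between two equal-length pixel groups."""
    return sum(abs(p - q) for p, q in zip(a, b))

def at_horizontal_border(group, above_group, left_group, right_group, ratio=2.0):
    """Illustrative border test (the ratio of 2.0 is an assumed constant).

    Vertical inconsistency: difference between the group and the group
    vertically neighbouring it. Horizontal inconsistency: difference between
    two horizontally neighbouring groups with the same vertical coordinate.
    """
    vertical = sad(group, above_group)
    horizontal = sad(left_group, right_group)
    return vertical > ratio * horizontal

# A flat banner row under a much darker row: vertically inconsistent,
# horizontally consistent -> treated as a horizontal border.
print(at_horizontal_border([200] * 8, [20] * 8, [200] * 8, [200] * 8))  # True
# A vertically smooth region next to a strong horizontal transition -> no border.
print(at_horizontal_border([90] * 8, [100] * 8, [20] * 8, [200] * 8))   # False
```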
- the vertical inconsistency measure is computed by selecting the first group of pixels out of a set of groups of pixels, the selecting based on evaluation of the set of groups of pixels. That means that the selection of the first group of pixels is not pre-determined but based on evaluation of a number of candidate groups of pixels. Preferably, the evaluation is based on maximizing the differences between pixel values.
- the horizontal inconsistency measure is computed by selecting the third group of pixels out of a further set of groups of pixels, the selecting based on evaluation of the further set of groups of pixels. That means that the selection of the third group of pixels is not predetermined but based on evaluation of a number of candidate groups of pixels.
- the evaluation is based on maximizing the differences between pixel values.
- the vertical inconsistency computing means is arranged to compute the vertical inconsistency measure by calculating a sum of differences between respective pixel values of the first group of pixels and the second group of pixels.
- the inconsistency measure might be the Sum of Absolute Difference (SAD).
- the computation of the horizontal inconsistency measure is also based on calculating a sum of absolute differences.
- It is a further object of the invention to provide an image processing apparatus of the kind described in the opening paragraph which provides output images with a relatively high quality.
- the de-interlacing unit further comprises border detection means for detecting whether the particular group of pixels is located at a horizontal border of an object in the image and that the interpolation means is forced to perform a non-motion compensated inter-field interpolation if the border detection means determine that the particular group of pixels is located at a horizontal border of an object.
- the image processing apparatus may comprise additional components, e.g. a display device for displaying the output images.
- the image processing unit might support one or more of the following types of image processing: Video compression, i.e. encoding or decoding, e.g. according to the MPEG standard.
- Image rate conversion: from a series of original input images a larger series of output images is calculated. Output images are temporally located between two original input images;
- Temporal noise reduction: this can also involve spatial processing, resulting in spatial-temporal noise reduction.
- the image processing apparatus might e.g. be a TV, a set top box, a VCR (Video Cassette Recorder) player, a satellite tuner, a DVD (Digital Versatile Disk) player or recorder.
- This object of the invention is achieved by further detecting whether the particular group of pixels is located at a horizontal border of an object in the image and that the interpolation is a non-motion compensated inter-field interpolation if the border detection indicates that the particular group of pixels is located at a horizontal border of an object.
- Modifications of the de-interlacing unit and variations thereof may correspond to modifications and variations thereof of the image processing apparatus, the method and the computer program product, being described.
- Fig. 1 schematically shows portions of three consecutive video fields
- Fig. 2 schematically shows an embodiment of the de-interlacing unit according to the invention
- Fig. 3A schematically shows an embodiment of a motion adaptive de-interlacing unit according to the invention;
- Fig. 3B schematically shows another embodiment of a motion adaptive de-interlacing unit according to the invention;
- Fig. 3C schematically shows another embodiment of a motion adaptive de-interlacing unit according to the invention, having a controllable match error computing unit;
- Fig. 4 schematically shows an embodiment of a motion compensated de-interlacing unit according to the invention;
- Fig. 5A schematically shows a sequence of output images which has been generated by means of a de-interlacing unit according to the prior art;
- Fig. 5B schematically shows a sequence of output images which has been generated by means of an embodiment of the de-interlacing unit according to the invention
- Fig. 6A schematically shows an embodiment of the border detection unit according to the invention
- Fig. 6B schematically shows an alternative embodiment of the border detection unit according to the invention.
- Fig. 7 schematically shows an image processing apparatus according to the invention.
- Interlacing is the common video broadcast procedure for transmitting the odd or even numbered video frame lines alternately, i.e. field by field.
- De-interlacing attempts to restore the full vertical resolution, i.e. make odd and even lines available simultaneously for each video frame. In this specification with an image is meant either a field or a frame.
- There are several types of de-interlacing. The focus in this document is on motion based de-interlacing. Motion based de-interlacing may be motion adaptive de-interlacing, whereby for a particular group of pixels it is determined whether there is motion or not. The type of interpolation is based on this determination.
- An alternative motion based de-interlacing is motion compensated de-interlacing whereby for a particular group of pixels the amount of motion and the direction of motion is estimated, i.e. a motion vector is computed. On basis of the computed motion vector a motion compensated interpolation is performed.
- the method of interpolation is pre-determined in the event of a detected border.
- Fig. 1 schematically shows portions of three consecutive video fields 100, 102 and 106.
- Fig. 1 also shows an output image 104.
- the blocks, e.g. 108-114 which are depicted in Fig. 1 correspond to groups of pixels. Each group of pixels comprises e.g. 8 pixels. A group of pixels might also comprise only one pixel. Notice that a part of the output image 104 and one of the three consecutive video fields 102 are the same.
- the blocks which are depicted with solid lines correspond to pixel values which are actually present in the video signal as received, i.e. the input video fields.
- the blocks which are depicted with dashed lines correspond to pixel values which have to be computed on basis of the pixel values which are actually present in the video signal as received.
- the combination of actually received pixel values and computed pixel values together form the output video image 104.
- the odd video lines y-3, y-1, y+1, y+3 of the output image 104 are copies from the video lines of the current video field 102.
- the depicted portions of the even video lines y-2, y, y+2, y+4 are based on corresponding video lines from one or more other video fields if no motion has been detected for these portions. This type of interpolation is called inter-field interpolation. However, if motion has been detected, then these even video lines y-2, y, y+2, y+4 are also based on the video lines of the current video field 102.
- the pixel values of the portions of the even video lines of the output image 104 are computed by means of edge dependent, i.e. orientation dependent interpolation. This latter type of interpolation is called intra-field interpolation.
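The edge-dependent intra-field interpolation can be sketched for a single missing pixel. The three-direction test below is an assumption for illustration (the patent only names the idea of orientation-dependent interpolation): the direction with the smallest luminance difference between the line above and the line below is taken as the local edge orientation, and the pixel is averaged along it.

```python
def intra_field_pixel(above, below, x):
    """Edge-adaptive intra-field interpolation of the missing pixel at
    column x, using only the existing lines `above` and `below` of the
    current field (a sketch; direction set and tie-breaking are assumed)."""
    best = None
    for d in (-1, 0, 1):                 # diagonal left, vertical, diagonal right
        if 0 <= x + d < len(below) and 0 <= x - d < len(above):
            diff = abs(above[x - d] - below[x + d])
            if best is None or diff < best[0]:
                best = (diff, (above[x - d] + below[x + d]) // 2)
    return best[1]

# A bright/dark edge that slants between the two lines: the diagonal
# direction wins and the missing pixel follows the edge.
above = [10, 10, 10, 200, 200]
below = [10, 200, 200, 200, 200]
print(intra_field_pixel(above, below, 2))  # → 200
```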
- interpolation, i.e. inter-field interpolation or intra-field interpolation
- a preferred method of determining whether there is motion will be explained for a particular group of pixels 108.
- the particular group of pixels 108 is located at a particular spatial position x,y.
- the method of determining whether there is motion comprises the following steps: a match error E(x,y) is computed on basis of comparing a first group of pixels 114 of a first video field 100 with a second group of pixels 112 of a second video field 106.
- the spatial position of the first group of pixels 114, the spatial position of the second group of pixels 112 and the spatial position of the particular group of pixels 108 are mutually equal, i.e. x,y.
- the match error E(x,y) is computed by means of calculating a sum of absolute differences between respective pixel values of the first group of pixels 114 and pixel values of the second group of pixels 112.
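The match error computation above can be sketched directly. The 8-pixel, one-line group shape is an assumption for illustration; fields are represented as lists of pixel rows.

```python
def match_error(prev_field, next_field, x, y, width=8):
    """E(x, y): sum of absolute differences between the group of pixels at
    spatial position (x, y) in the previous field and the group at the same
    spatial position in the next field."""
    return sum(abs(prev_field[y][x + i] - next_field[y][x + i])
               for i in range(width))

prev_field = [[10, 10, 10, 10, 10, 10, 10, 10]]
next_field = [[10, 10, 50, 50, 10, 10, 10, 10]]
print(match_error(prev_field, next_field, 0, 0))  # → 80
```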
- a motion threshold T(x,y) is computed for the particular group of pixels 108.
- the motion threshold T(x,y) is based on a number of components: a predetermined constant, a noise dependent component, a contrast dependent component and an environment dependent component. Most of these components are optional but the latter one is obligatory. The different optional components are explained in more detail in connection with Fig. 3A. The environment dependent component is explained in more detail below; a motion indicator M(x,y) which represents whether there is motion for the particular group of pixels 108 is computed by means of comparing the match error E(x,y) with the motion threshold T(x,y).
- If the match error E(x,y) is lower than the motion threshold T(x,y), then the motion indicator M(x,y) indicates that there is no motion; otherwise it indicates motion.
- the values of the pixels of the particular group of pixels are computed by means of interpolation.
- the type of interpolation depends on the computed value of the motion indicator M(x,y), as explained above.
- the motion threshold T(x,y) comprises an environment dependent component. That means that the values of the motion indicators M(x,y) being computed for other groups of pixels, e.g. a fourth group of pixels 110, are taken into account to compute the current motion threshold T(x,y) belonging to the particular group of pixels 108.
- the basic rule is that the values of a predetermined number of motion indicators in the spatial and/or temporal environment of the particular group of pixels 108 are evaluated. For instance the values of the following motion indicators are evaluated: M(x-3, y), M(x-1, y), M(x, y-2), M(x+2, y-2), M(x, y+3).
- If the number of motion indicators in this set having a value corresponding to motion is relatively high, then the motion threshold T(x,y) for the particular group of pixels 108 is relatively low. If the number of motion indicators in this set having a value corresponding to no motion is relatively high, then the motion threshold T(x,y) for the particular group of pixels 108 is relatively high.
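The environment-dependent threshold and the motion decision can be sketched as follows. Only the obligatory environment component is modelled; `base` and `step` are assumed constants, not values from the patent.

```python
def motion_threshold(neighbour_indicators, base=16, step=4):
    """T(x, y) with only the environment-dependent component sketched: the
    more neighbouring groups already classified as moving (indicator 1), the
    lower the threshold; the more classified as still (0), the higher it is."""
    moving = sum(neighbour_indicators)
    still = len(neighbour_indicators) - moving
    return base + step * (still - moving)

def motion_indicator(error, threshold):
    """M(x, y): 1 (motion) when E(x, y) exceeds T(x, y), else 0 (still)."""
    return 1 if error > threshold else 0

# A mostly moving neighbourhood lowers the threshold, so the same small
# match error is now classified as motion.
neighbours = [1, 1, 1, 0, 1]        # e.g. M(x-3,y), M(x-1,y), M(x,y-2), ...
t = motion_threshold(neighbours)    # 16 + 4 * (1 - 4) = 4
print(t, motion_indicator(10, t))   # → 4 1
```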
- Fig. 2 schematically shows an embodiment of the de-interlacing unit 200 according to the invention.
- the motion based de-interlacing unit 200 comprises: a border detection unit 230 for detecting whether the particular group of pixels 108 is located at a horizontal border of an object. This border detection unit will be explained in more detail in connection with Fig. 6A and Fig. 6B.
- the border detection unit 230 may be provided with three inputs: image data from a previous field, a current field and a next field. Depending on the type of border detection, some of these inputs are optional.
- a motion determining unit 201 for computing a motion indicator which represents motion for a group of pixels, for which the values have to be computed by means of interpolation.
- the motion determining unit may be a motion decision unit 206 as described in connection with Figs. 3A-3C or may be a motion vector computing unit for computing a motion vector for the particular group of pixels 108 of the image; and an interpolation unit 208 for computing pixel values.
- the interpolation unit 208 is controlled by the motion determining unit 201 and the border detection unit 230.
- the interpolation unit 208 is provided with input video fields and provides video frames at its output connector 216.
- the interpolation unit 208 is arranged to switch per pixel or group of pixels between multiple types of interpolation on basis of the values of the provided motion indicators and the output of the border detection unit 230.
- the interpolation unit 208 is forced to perform a non motion compensated inter-field interpolation if the border detection unit 230 determines that the particular group of pixels is located at a horizontal border of an object.
- the border detection unit 230, the motion determining unit 201, and the interpolation unit 208 may be implemented using one processor. Normally, these functions are performed under control of a software program product. During execution, normally the software program product is loaded into a memory, like a RAM, and executed from there. The program may be loaded from a background memory, like a ROM, hard disk, or magnetical and/or optical storage, or may be loaded via a network like Internet. Optionally an application specific integrated circuit provides the disclosed functionality.
- the de-interlacing unit 200 is provided with consecutive video fields at its respective input connectors 210-214.
- the image processing apparatus in which the de-interlacing unit 200 is comprised has a storage device for temporary storage of a number of video fields. This storage device is not depicted in Fig. 2.
- Fig. 3A schematically shows an embodiment of the de-interlacing unit 300 according to the invention.
- the motion adaptive de-interlacing unit 300 comprises: a border detection unit 230 for detecting whether the particular group of pixels 108 is located at a horizontal border of an object; a match error computing unit 202 for computing match errors on basis of comparing groups of pixels of different video fields; a motion threshold computing unit 204 for computing motion thresholds; a motion decision unit 206 for computing motion indicators which represent whether there is motion for a group of pixels, for which the values have to be computed by means of interpolation.
- the motion decision unit 206 is provided with the match errors being computed by the match error computing unit 202 and is provided with the motion thresholds being computed by the motion threshold computing unit 204.
- the motion decision unit 206 is arranged to compute the motion indicators by means of comparing the match errors with the corresponding motion thresholds; and an interpolation unit 208 for computing pixel values.
- the interpolation unit 208 is arranged to switch per pixel or group of pixels between inter-field interpolation and intra field interpolation on basis of the values of the provided motion indicators and the output of the border detection unit 230.
- the interpolation unit 208 is forced to perform inter-field interpolation if the border detection unit 230 determines that the particular group of pixels is located at a horizontal border of an object.
- the match error computing unit 202, the motion threshold computing unit 204, the motion decision unit 206, the interpolation unit 208 and the border detection unit 230 may be implemented using one processor. Normally, these functions are performed under control of a software program product. During execution, normally the software program product is loaded into a memory, like a RAM, and executed from there. The program may be loaded from a background memory, like a ROM, hard disk, or magnetical and/or optical storage, or may be loaded via a network like Internet. Optionally an application specific integrated circuit provides the disclosed functionality.
- the match error computing unit 202 optionally comprises two horizontal low pass filters 218 and 220 for low pass filtering the input data. That means that low pass filtered pixel values of a first video field, typically the previous video field, are compared with low pass filtered pixel values of the second video field, typically the next video field. Preferably, comparing is based on computing differences. The reason for low pass filtering is to compensate for sync jitter.
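The horizontal low-pass filtering applied to both inputs of the match error computation can be sketched with a small separable kernel. The [1, 2, 1]/4 kernel and the edge clamping are assumptions; the patent only states that the inputs are low-pass filtered to compensate for sync jitter.

```python
def lowpass_row(row, taps=(1, 2, 1)):
    """Horizontal low-pass filter over one video line (sketch). Pixels at
    the line ends are clamped so the output has the same length as the
    input; integer arithmetic mirrors typical fixed-point hardware."""
    n, s = len(row), sum(taps)
    half = len(taps) // 2
    return [sum(t * row[min(max(i + k - half, 0), n - 1)]
                for k, t in enumerate(taps)) // s
            for i in range(n)]

# A single-pixel spike (e.g. jitter on a sync edge) is spread out and
# attenuated, so it no longer dominates the SAD match error.
print(lowpass_row([0, 0, 100, 0, 0]))  # → [0, 25, 50, 25, 0]
```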
- the motion threshold computing unit 204 comprises processing means 222- 226 for computing the different components of the motion thresholds.
- the motion threshold computing unit 204 comprises an environment evaluation unit 226 to evaluate the motion indicators in the environment, as described in connection with Fig. 1. Notice the feedback connection 228 by means of which values of motion indicators are provided from the motion decision unit 206 to the motion threshold computing unit, in particular the environment evaluation unit 226.
- the de-interlacing unit 200 comprises not depicted storage means for temporary storage of motion indicators.
- the motion threshold computing unit 204 preferably comprises a contrast computing unit 222 for computing local contrast. Preferably, mutual differences between pixel values within groups of pixels of multiple video fields are applied to compute the contrast values.
- the contrast value is relatively high, then the corresponding motion threshold is also relatively high. If the contrast value is relatively low, then the corresponding motion threshold is also relatively low.
- the motion threshold computing unit 204 optionally comprises a noise computing unit 224 for computing a global noise value.
- a noise computing unit 224 for computing a global noise value.
- Mutual differences between pixel values within groups of pixels are used to compute the global noise value.
- a limited number of groups of pixels being located at a number of locations within the video field are used. These groups of pixels are not necessarily located in a direct environment of the particular group of pixels. Preferably, the groups of pixels are located in regions where there is no motion. If the noise value is relatively high, then the motion thresholds are also relatively high. If the noise value is relatively low, then the motion thresholds are also relatively low.
- Fig. 3B schematically shows another embodiment of the de-interlacing unit 301.
- This embodiment of the de-interlacing unit 301 is almost equal to the de-interlacing unit 200 which is described in connection with Fig. 3A.
- the only difference is the final decision unit 302.
- the control of the interpolation unit 208 is not only based on the current motion indicator M(x,y) for a particular group of pixels 108 being located at the spatial position (x,y) and output of the border detection unit 230 but is also based on other motion indicators in the environment.
- the final motion indicator F(x,y) is based on motion indicators of adjacent groups of pixels, e.g.
- F(x,y) = M(x, y) * M(x-1, y) * M(x+1, y) * M(x, y-1) * M(x, y+1), where a "1" indicates motion and a "0" indicates no motion, i.e. still.
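The final-decision formula above can be implemented directly; the product acts as a logical AND over the group and its four direct neighbours, so an isolated motion detection is suppressed unless the neighbourhood agrees.

```python
def final_indicator(m, x, y):
    """F(x, y) = M(x,y) * M(x-1,y) * M(x+1,y) * M(x,y-1) * M(x,y+1).

    `m` is a 2-D grid of motion indicators (1 = motion, 0 = still); the
    position (x, y) is assumed to be in the interior of the grid."""
    return (m[y][x] * m[y][x - 1] * m[y][x + 1]
            * m[y - 1][x] * m[y + 1][x])

m = [[1, 1, 1],
     [1, 1, 1],
     [1, 0, 1]]
print(final_indicator(m, 1, 1))  # → 0, the still neighbour below vetoes motion
```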
- Fig. 3C schematically shows another embodiment of the de-interlacing unit 302, having a controllable match error computing unit 202.
- This embodiment of the de-interlacing unit 302 is relatively similar to the de-interlacing unit 301 which is described in connection with Fig. 3B. The major difference is the deployment and working of the environment evaluation unit 226.
- the match error computing unit 202 of the embodiment of the de- interlacing unit 302 comprises the environment evaluation unit 226.
- the de-interlacing unit 302 comprises not depicted storage means for temporary storage of motion indicators.
- the environment evaluation unit 226 is arranged to evaluate the motion indicators in the environment. However, now the output of the environment evaluation unit 226 is opposite to the output in the case of the de-interlacing unit 200 as described in connection with Fig. 3A.
- the match error E(x,y) comprises a local component and an environment dependent component.
- the local component is based on pixel value differences of pixels being located at spatial position x,y.
- the environment dependent component is based on values of motion indicators being computed for other groups of pixels. That means that the values of the motion indicators being computed for other groups of pixels, e.g. a fourth group of pixels 110, are taken into account to compute the current match error E(x,y) belonging to the particular group of pixels 108.
- the basic rule is that the values of a predetermined number of motion indicators in the spatial and/or temporal environment of the particular group of pixels 108 are evaluated.
- the values of the following motion indicators are evaluated: M(x-3, y), M(x-1, y), M(x, y-2), M(x+2, y-2), M(x, y+3). If the number of motion indicators in this set having a value corresponding to motion is relatively high, then the match error E(x,y) for the particular group of pixels 108 is relatively high. If the number of motion indicators in this set having a value corresponding to no motion is relatively high, then the match error E(x,y) for the particular group of pixels 108 is relatively low.
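This variant, where the environment biases the match error rather than the threshold, can be sketched as the mirror image of the threshold adaptation. The additive form and the `step` weight are assumptions for illustration.

```python
def biased_match_error(local_sad, neighbour_indicators, step=4):
    """E(x, y) = local component + environment-dependent component: the
    match error is pushed upward when the neighbourhood is mostly moving
    and downward when it is mostly still (1 = motion, 0 = still)."""
    moving = sum(neighbour_indicators)
    still = len(neighbour_indicators) - moving
    return local_sad + step * (moving - still)

# Same local SAD, opposite neighbourhoods: the moving environment raises
# the error, the still environment lowers it.
print(biased_match_error(10, [1, 1, 1, 0, 1]))  # → 22
print(biased_match_error(10, [0, 0, 0, 0, 0]))  # → -10
```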
- Fig. 4 schematically shows an embodiment of a motion compensated de-interlacing unit 400 according to the invention.
- the motion based de-interlacing unit 400 comprises: a border detection unit 230 for detecting whether the particular group of pixels 108 is located at a horizontal border of an object; a motion vector computing unit 201 for computing a motion indicator which represents motion for a group of pixels, for which the values have to be computed by means of interpolation.
- the motion vector computing unit is arranged to compute a motion vector for the particular group of pixels 108 of the image; and an interpolation unit 208 for computing pixel values.
- the interpolation unit is arranged to compute the values of the particular group of pixels 108 of the image.
- the interpolation unit 208 is controlled by the motion determining unit 201 and the border detection unit 230.
- the interpolation unit 208 is provided with input video fields and provides output images at its output connector 216.
- the interpolation unit 208 is arranged to switch per pixel or group of pixels between multiple types of interpolation on basis of the values of the provided motion indicators and the output of the border detection unit 230.
- the interpolation unit 208 is arranged to perform motion compensated inter-field interpolation on basis of the motion vector if the border detection means 230 determine that the particular group of pixels is not located at a horizontal border of an object; and is arranged to perform non-motion compensated inter-field interpolation if the border detection means 230 determine that the particular group of pixels is located at a horizontal border of an object.
- Non motion compensated inter-field interpolation may be performed by field insertion, i.e. without taking motion into account, or by inter-field interpolation using a motion vector which is equal to the null vector.
- the values of the particular group of pixels 108 are copies of corresponding values of another group of pixels from the previous or the next field, whereby the spatial coordinates of the particular group of pixels and of the said another group of pixels are mutually equal.
- the values of the particular group of pixels are based on combining the values of a first group of pixels from the previous field and a second group of pixels from the next field, whereby the spatial coordinates of the first group of pixels, the second group of pixels and of the particular group of pixels are mutually equal.
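The two non-motion-compensated variants above, field insertion and the combination of the co-located groups from the previous and next fields, can be sketched as follows. This is a minimal illustration assuming fields stored as Python lists of pixel rows; the function names and the `(field, y, x, n)` calling convention are choices of this sketch, not taken from the patent.

```python
def field_insertion(other_field, y, x, n):
    """Field insertion: copy the n pixel values at the same spatial
    coordinates (x, y) from the previous or the next field."""
    return other_field[y][x:x + n]

def field_averaging(prev_field, next_field, y, x, n):
    """Combine the co-located groups of the previous and the next
    field, here by simple averaging."""
    return [(p + q) // 2
            for p, q in zip(prev_field[y][x:x + n],
                            next_field[y][x:x + n])]
```

Field insertion ignores motion entirely, while averaging the two neighboring fields additionally suppresses noise but blurs moving content; both are therefore only applied where no reliable motion vector should be used.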
- Motion vector computing unit 201 comprises: a motion vector candidate generating unit 402 for generating a set of motion vector candidates; a motion vector candidate evaluation unit 404 for evaluating the motion vector candidates of the set; and a motion vector selection unit 406 for selecting the motion vector for the particular group of pixels on basis of the evaluation of the motion vector candidates.
- the border detection unit 230, the motion determining unit 201, and the interpolation unit 208 may be implemented using one processor.
- Fig. 5A schematically shows a sequence of output images 502-506 which has been generated by means of an embodiment of a de-interlacing unit according to the prior art.
- Fig. 5B schematically shows a sequence of output images 514-518 which has been generated by means of an embodiment of the de-interlacing unit according to the invention.
- the sequence of output images 502-506 originates from a television broadcast.
- the upper part 508 of the images 502-506 shows a driving car, moving from the right to the left.
- the lower part 510 corresponds to a banner, e.g. for displaying prices at the stock exchange.
- the banner comprises a gray region on which characters move from the left to the right.
- the banner comprises a graphics line 512.
- the graphics line 512 is visible in only half of the images: visible, not visible, visible, not visible, et cetera. The effect is a flickering which is perceived as very annoying. It is caused by applying intra-field interpolation.
- Fig. 5B schematically shows a sequence of output images 514-518 which is based on the same sequence of input fields as the sequence shown in Fig. 5A. However, in the sequence 514-518 the graphics line 512 is continuously visible. This is achieved by inter-field interpolation.
- the selection of inter-field interpolation instead of intra-field interpolation is based on the detection of the horizontal border between the upper part 508 and the lower part 510 of the images.
- Fig. 6A schematically shows an embodiment of the border detection unit 230 according to the invention.
- the border detection unit 230 comprises: a vertical inconsistency computing unit 602 for computing a vertical inconsistency measure which is based on differences between respective pixels of a first group of pixels and a second group of pixels, the first group of pixels vertically neighboring the second group of pixels; and a comparing unit 604 for comparing the vertical inconsistency measure with a predetermined border threshold.
- the comparing unit 604 is arranged to determine that the particular group of pixels 108 is located at a horizontal border of an object if the vertical inconsistency measure is larger than the predetermined border threshold.
- the border detection unit 230 further comprises: an input connector 608 for providing the border detection unit 230 with input data, i.e. a video field 100, 102, 106; an output connector 610 for outputting a value indicating whether a border is detected or not; and a further input connector 606 for providing the border detection unit 230 with the predetermined threshold.
- the predetermined threshold may be based on image content.
- the working of the border detection unit 230 as depicted in Fig. 6A is as follows. See also Fig. 1. Suppose that for a particular group of pixels 108 in the current field it has to be determined whether there is a border or not.
- V(x,y) = SAD(P_p(x,y), P_p(x,y-2)) (1) or with
- V(x,y) = SAD(P_p(x,y), P_p(x,y+2)), (2) whereby SAD(P_p(x,y), P_p(x,y-2)) means the sum of absolute differences between the respective pixels of the groups of pixels located at spatial positions (x,y) and (x,y-2), respectively.
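The SAD operator used throughout Equations 1-7 can be written down directly. A minimal sketch assuming integer luminance values stored as lists of rows; `group` and the `vertical_inconsistency` helper (here implementing Equation 1) are illustrative names, not from the patent. The default group size of 8 pixels matches the typical value mentioned later in the text.

```python
def sad(group_a, group_b):
    """Sum of absolute differences between two equally sized pixel groups."""
    return sum(abs(a - b) for a, b in zip(group_a, group_b))

def group(field, x, y, n=8):
    """The horizontal n-pixel group whose leftmost pixel is at (x, y)."""
    return field[y][x:x + n]

def vertical_inconsistency(prev_field, x, y, n=8):
    """Equation 1: compare the group at (x, y) with the group two lines
    above it, both taken from the previous field P_p."""
    return sad(group(prev_field, x, y, n), group(prev_field, x, y - 2, n))
```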
- Alternatively, the maximum of the values of Equations 1 and 2 is used:
- V(x,y) = max(SAD(P_p(x,y), P_p(x,y-2)), SAD(P_p(x,y), P_p(x,y+2))) (3)
- a second way to do this is based on computing the vertical inconsistency measure V(x,y) with pixel values of the current field P c , e.g.
- V(x,y) = SAD(P_c(x,y-1), P_c(x,y+1)) (4)
- a third way to do this is based on computing the vertical inconsistency measure V(x, y) with pixel values of the next field P n , e.g.
- V(x,y) = SAD(P_n(x,y), P_n(x,y-2)) (5) or with
- V(x,y) = SAD(P_n(x,y), P_n(x,y+2)) (6)
- V(x,y) = max(SAD(P_n(x,y), P_n(x,y-2)), SAD(P_n(x,y), P_n(x,y+2))) (7)
- the border detection proceeds with comparing the vertical inconsistency measure with the predetermined border threshold B: if V(x,y) ≥ B then it is assumed that there is a border for the particular group of pixels 108, else it is assumed that there is no border for the particular group of pixels 108.
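Putting these pieces together, the Fig. 6A variant reduces to a single comparison against the fixed threshold B. A sketch under the same list-of-rows assumption, using the Equation 3 form of the vertical inconsistency; the threshold value used in the example below is arbitrary.

```python
def sad(group_a, group_b):
    """Sum of absolute differences between two equally sized pixel groups."""
    return sum(abs(a - b) for a, b in zip(group_a, group_b))

def border_detected(prev_field, x, y, border_threshold, n=8):
    """Fig. 6A variant: the vertical inconsistency measure V(x, y)
    (Equation 3) is compared with the predetermined border threshold."""
    g    = prev_field[y][x:x + n]
    up   = prev_field[y - 2][x:x + n]
    down = prev_field[y + 2][x:x + n]
    v = max(sad(g, up), sad(g, down))
    return v >= border_threshold
```

As the text notes, the threshold may be based on image content; a fixed value is used here only to keep the sketch self-contained.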
- Fig. 6B schematically shows an alternative embodiment of the border detection unit 230 according to the invention.
- the border detection unit 230 comprises: a vertical inconsistency computing unit 602 for computing a vertical inconsistency measure which is based on differences between respective pixels of a first group 114 of pixels and a second group of pixels, the first group of pixels vertically neighboring the second group of pixels; a horizontal inconsistency computing unit 614 for computing a horizontal inconsistency measure which is based on further differences between respective pixels of a third group of pixels 114 and a fourth group of pixels 112, the third group of pixels 114 and the fourth group of pixels 112 belonging to a previous field which is temporally preceding the image 104 or to a next field 100 which is temporally succeeding the image 104, the third group of pixels 114, the fourth group of pixels 112 and the particular group of pixels 108 having the same vertical coordinate y; and a comparing unit 604 for comparing the horizontal inconsistency measure with the vertical inconsistency measure, arranged to determine that the particular group of pixels is located at a horizontal border of an object if the vertical inconsistency measure is larger than the horizontal inconsistency measure.
- the working of the border detection unit 230 as depicted in Fig. 6B is as follows. See also Fig. 1. Suppose that for a particular group of pixels 108 in the current field it has to be determined whether there is a border or not.
- the vertical inconsistency measure can be computed with one of the Equations 1-7
- the horizontal inconsistency measure can be computed with one of the following Equations:
- H(x,y) = SAD(P_p(x,y), P_n(x,y)) (8) or
- H(x,y) = SAD(P_p(x-1,y), P_p(x+1,y)) (9) or
- H(x,y) = SAD(P_p(x-1,y), P_p(x,y)) (10) or
- H(x,y) = SAD(P_n(x-1,y), P_n(x,y)) (13) or
- Alternatively, the maximum of the values of Equations 8-14 is used.
- the border detection proceeds with comparing the vertical inconsistency measure with the horizontal inconsistency measure H(x,y): if V(x,y) > C*H(x,y) + D then it is assumed that there is a border for the particular group of pixels 108, else it is assumed that there is no border for the particular group of pixels 108. Here C and D are constants; typical values are 6 and 128, respectively. A group of pixels typically comprises 8 pixels.
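The decision rule of the Fig. 6B variant, with the typical values C = 6, D = 128 and a group size of 8, can be sketched as follows. Equation 1 is used for V and Equation 9 for H; this is an arbitrary choice among the listed alternatives, and the field layout assumptions are the same as before.

```python
def sad(group_a, group_b):
    """Sum of absolute differences between two equally sized pixel groups."""
    return sum(abs(a - b) for a, b in zip(group_a, group_b))

def border_detected(prev_field, x, y, C=6, D=128, n=8):
    """Fig. 6B variant: a border is assumed where the vertical
    inconsistency V clearly dominates the horizontal inconsistency H."""
    # Equation 1: group at (x, y) versus the group two lines above it.
    v = sad(prev_field[y][x:x + n], prev_field[y - 2][x:x + n])
    # Equation 9: the same line shifted one pixel left versus one pixel right.
    h = sad(prev_field[y][x - 1:x - 1 + n], prev_field[y][x + 1:x + 1 + n])
    return v > C * h + D
```

Scaling H by C makes the test relative: a large vertical difference only counts as a horizontal object border when the content is horizontally quiet, so textured or moving areas do not trigger the inter-field fallback.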
- Fig. 7 schematically shows an image processing apparatus 700 according to the invention, comprising: receiving means 702 for receiving a signal representing input images; the de-interlacing unit 704 as described in connection with any of Figs. 2, 3 or 4; and an optional display device 706.
- the signal may be a broadcast signal received via an antenna or cable but may also be a signal from a storage device like a VCR (Video Cassette Recorder) or Digital Versatile Disk (DVD).
- the signal is provided at the input connector 708.
- the image processing apparatus 700 might e.g. be a TV or a personal computer comprising a video signal receiving module. Alternatively the image processing apparatus 700 does not comprise the optional display device 706 but provides the output images to an apparatus that does comprise a display device 706. Then the image processing apparatus 700 might be e.g. a set top box, a satellite-tuner, a VCR player, a DVD player or recorder.
- the image processing apparatus 700 comprises storage means, like a hard-disk or means for storage on removable media, e.g. optical disks.
- the image processing apparatus 700 might also be a system being applied by a film-studio or broadcaster.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Television Systems (AREA)
Abstract
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP04103649.2 | 2004-07-29 | ||
| EP04103649 | 2004-07-29 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2006013510A1 true WO2006013510A1 (fr) | 2006-02-09 |
Family
ID=35149144
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2005/052431 Ceased WO2006013510A1 (fr) | 2004-07-29 | 2005-07-20 | Desentrelacement |
Country Status (2)
| Country | Link |
|---|---|
| TW (1) | TW200627951A (fr) |
| WO (1) | WO2006013510A1 (fr) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2063636A4 (fr) * | 2006-09-15 | 2010-05-05 | Panasonic Corp | Dispositif de traitement vidéo et procédé de traitement vidéo |
| CN101600061B (zh) * | 2009-07-09 | 2012-07-25 | 杭州士兰微电子股份有限公司 | 视频运动自适应去隔行的方法及装置 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP0318760A2 (fr) * | 1987-12-02 | 1989-06-07 | Blaupunkt-Werke GmbH | Récepteur de télévision avec un dispositif de suppression des perturbations de scintillement |
| EP0395263A2 (fr) * | 1989-04-27 | 1990-10-31 | Sony Corporation | Traitement de signal vidéo dépendant du mouvement |
| EP0687105A2 (fr) * | 1994-06-10 | 1995-12-13 | NOKIA TECHNOLOGY GmbH | Procédé pour détecter le mouvement dans un signal vidéo |
| WO2001017244A1 (fr) * | 1999-08-27 | 2001-03-08 | Trident Microsystems, Inc. | Desentrelacement adapte au mouvement et au contour |
| EP1353509A1 (fr) * | 2000-12-27 | 2003-10-15 | Matsushita Electric Industrial Co., Ltd. | Dispositif d'evaluation de l'etat statique et dispositif permettant d'intercaler une ligne d'exploration |
-
2005
- 2005-07-20 WO PCT/IB2005/052431 patent/WO2006013510A1/fr not_active Ceased
- 2005-07-26 TW TW094125287A patent/TW200627951A/zh unknown
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP0318760A2 (fr) * | 1987-12-02 | 1989-06-07 | Blaupunkt-Werke GmbH | Récepteur de télévision avec un dispositif de suppression des perturbations de scintillement |
| EP0395263A2 (fr) * | 1989-04-27 | 1990-10-31 | Sony Corporation | Traitement de signal vidéo dépendant du mouvement |
| EP0687105A2 (fr) * | 1994-06-10 | 1995-12-13 | NOKIA TECHNOLOGY GmbH | Procédé pour détecter le mouvement dans un signal vidéo |
| WO2001017244A1 (fr) * | 1999-08-27 | 2001-03-08 | Trident Microsystems, Inc. | Desentrelacement adapte au mouvement et au contour |
| EP1353509A1 (fr) * | 2000-12-27 | 2003-10-15 | Matsushita Electric Industrial Co., Ltd. | Dispositif d'evaluation de l'etat statique et dispositif permettant d'intercaler une ligne d'exploration |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2063636A4 (fr) * | 2006-09-15 | 2010-05-05 | Panasonic Corp | Dispositif de traitement vidéo et procédé de traitement vidéo |
| US8432495B2 (en) | 2006-09-15 | 2013-04-30 | Panasonic Corporation | Video processor and video processing method |
| CN101600061B (zh) * | 2009-07-09 | 2012-07-25 | 杭州士兰微电子股份有限公司 | 视频运动自适应去隔行的方法及装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| TW200627951A (en) | 2006-08-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US6262773B1 (en) | System for conversion of interlaced video to progressive video using edge correlation | |
| US5784115A (en) | System and method for motion compensated de-interlacing of video frames | |
| US6269484B1 (en) | Method and apparatus for de-interlacing interlaced content using motion vectors in compressed video streams | |
| KR100973429B1 (ko) | 배경 움직임 벡터 선택기, 업-변환 유닛, 이미지 처리 장치, 배경 움직임 벡터 선택 방법 및 컴퓨터 판독 가능한 기록 매체 | |
| KR101135454B1 (ko) | 특정 이미지의 특정 픽셀 값 결정 방법, 픽셀 값 결정 유닛, 이미지 처리 장치 및 컴퓨터 판독 가능한 저장 매체 | |
| US8068175B2 (en) | Method for detecting interlaced material and field order | |
| US6404461B1 (en) | Method for detecting static areas in a sequence of video pictures | |
| US20060209957A1 (en) | Motion sequence pattern detection | |
| US20100177239A1 (en) | Method of and apparatus for frame rate conversion | |
| US9918041B1 (en) | Motion adaptive de-interlacing and advanced film mode detection | |
| EP1039746B1 (fr) | Procédé et dispositif pour l'interpolation de lignes | |
| US7705914B2 (en) | Pull-down signal detection apparatus, pull-down signal detection method and progressive-scan conversion apparatus | |
| KR100422575B1 (ko) | 디인터레이싱을 위한 공간축/시간축 보간 시스템 및 방법 | |
| KR20060047638A (ko) | 필름 모드 판정 방법, 움직임 보상 화상 처리 방법, 필름모드 검출기 및 움직임 보상기 | |
| KR100722773B1 (ko) | 동영상에서 그래픽 영역을 검출하는 방법 및 장치 | |
| US7499102B2 (en) | Image processing apparatus using judder-map and method thereof | |
| EP1958451B1 (fr) | Correction de champ de vecteurs de mouvement | |
| KR20070030223A (ko) | 픽셀 보간 | |
| WO2006013510A1 (fr) | Desentrelacement | |
| JP2003179886A (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
| US20060158513A1 (en) | Recognizing film and video occurring in parallel in television fields | |
| US7327398B2 (en) | 3D vector method of inter-field motion detection | |
| WO2005091625A1 (fr) | Desentrelacement | |
| US20080246874A1 (en) | Method for detecting image sequences having linewise repeated data |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |