
GB2253760A - Video image processing - Google Patents

Video image processing

Info

Publication number
GB2253760A
GB2253760A (application GB9202119A)
Authority
GB
United Kingdom
Prior art keywords
images
image
regions
interpolation
motion vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9202119A
Other versions
GB9202119D0 (en)
GB2253760B (en)
Inventor
Martin Weston
Graham Alexander Thomas
Brian R Mason
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
British Broadcasting Corp
Vistek Electronics Ltd
Original Assignee
British Broadcasting Corp
Vistek Electronics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GB919102197A external-priority patent/GB9102197D0/en
Application filed by British Broadcasting Corp, Vistek Electronics Ltd filed Critical British Broadcasting Corp
Priority to GB9202119A priority Critical patent/GB2253760B/en
Publication of GB9202119D0 publication Critical patent/GB9202119D0/en
Publication of GB2253760A publication Critical patent/GB2253760A/en
Application granted granted Critical
Publication of GB2253760B publication Critical patent/GB2253760B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0135Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes
    • H04N7/014Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes involving the use of motion vectors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/577Motion compensation with bidirectional frame interpolation, i.e. using B-pictures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/587Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal sub-sampling or interpolation, e.g. decimation or subsequent interpolation of pictures in a video sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Systems (AREA)

Abstract

Motion vectors are determined for a video image by reference to the following image only. Vectors so determined are used to control a multi-tap motion-compensated temporal interpolation filter in order to interpolate one or more images between the image and the following image. The temporal interpolation filter takes its samples primarily or wholly from the following images, since this approach gives valid answers in areas of both foreground and revealed background, whereas the motion estimation algorithm cannot distinguish between such regions. The same principle may be applied when motion vectors are derived by reference to preceding images only; in that case the filter takes contributions predominantly or wholly from preceding images.

Description

VIDEO IMAGE PROCESSING

This invention relates to video image processing, and more particularly to motion-compensated image interpolation in regions of revealed or obscured background.
Our International Patent Application No. PCT/GB91/01621 describes a method of motion estimation in which regions of revealed and obscured background are detected, allowing a subsequent motion-compensated temporal interpolation process to take account of such regions. Reference should be made to that application for a discussion of problems concerning revealed and obscured background and some possible solutions to those problems. In that application a method of temporal interpolation is described in which information in areas of the image being interpolated that correspond to revealed background are derived only from the following field or fields, and information in areas corresponding to obscured background are taken only from preceding field(s).
The present invention is concerned with the problem of revealed and obscured background in a motion-compensated temporal interpolation process controlled by motion vectors derived in a simpler way than that described in our above-mentioned previous Patent Application. Specifically, it concerns systems in which the motion vectors are determined for pixels in each original field by reference to either the preceding field or the following field, and this information is used in the generation of new images at any time instant between this field and an adjacent field.
The invention concerns the design of the temporal interpolation filter in order to reduce the deficiencies in the interpolated images under such conditions. The invention is equally applicable to the two cases (preceding and following) and will be described below for the case where vectors are determined by reference to the following field.
The invention provides an alternative method of overcoming the problems which arise when one object in a scene moves in front of another to that described in our International Patent Application No. PCT/GB91/00982, publication No. WO91/20155. The present system works almost as well, while enabling a possible reduction in cost.
The invention in its various aspects is defined in the appended claims to which reference should now be made.
The invention will be described by way of example with reference to the sole figure of the drawing, which is a timing diagram illustrating the generation of an interpolated output field from a sequence of input fields.
The Figure shows four successive fields in a television signal, numbered field 1 to field 4. On the vertical axis is shown in one dimension a spatial section across the image; the horizontal axis represents time. The images represent an object moving slowly against a rapidly-moving background.
Motion vectors are shown that have been determined for pixels in field 2 by matching to displaced pixels in field 3. Some regions of field 2 may not have a vector assigned to them, particularly if that region is obscured in the following field.
A number of possible motion estimation algorithms could be used to generate such a motion vector field, for example that described in our United Kingdom Patent No. 2,188,510B.
In order to interpolate a new field in the television signal between fields 2 and 3, a temporal filter could be used which operates on the four sample values along the estimated motion trajectory. The motion vector used would be that assigned to the pixel in the immediately preceding field at the same spatial location as the pixel to be interpolated. The temporal position of a field being interpolated and the motion trajectory X to be used for the interpolation of a pixel P in this field using the vector assigned to the same spatial location Q in the preceding field are shown in the Figure.
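As a concrete illustration, the four-tap interpolation along a motion trajectory might be sketched as follows. This is a minimal Python sketch; the function name, data layout and nearest-pixel rounding are assumptions for illustration, not taken from the patent.

```python
def sample_along_trajectory(fields, x, y, v, t, coeffs):
    """Sum filter taps along an estimated motion trajectory.

    fields : list of four 2-D arrays (fields 1..4 of the Figure)
    x, y   : spatial position of the pixel being interpolated
    v      : (vx, vy) motion vector, in pixels per field period, assigned to
             the pixel at the same spatial location in field 2
    t      : temporal position of the output between field 2 (t=0) and field 3 (t=1)
    coeffs : one tap weight per field
    """
    out = 0.0
    for i, (field, c) in enumerate(zip(fields, coeffs)):
        dt = i - 1 - t  # field 1 -> -(1+t), field 2 -> -t, field 3 -> 1-t, field 4 -> 2-t
        sx = int(round(x + v[0] * dt))  # displaced sample position along the trajectory
        sy = int(round(y + v[1] * dt))
        out += c * field[sy][sx]
    return out
```

In a flat region with a correctly estimated vector, any aperture whose coefficients sum to one passes the picture value unchanged.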
However, we have realised that this known method of interpolation may generate artefacts in the interpolated images in regions of revealed background. This can be understood by considering the pixel in the image being interpolated that lies on the motion trajectory shown in the Figure. A four-tap temporal filter might be used having coefficient values of, for example, -0.1, 0.6, 0.6, -0.1 in the four fields for generating an output field at the mid-point between fields 2 and 3. Such a filter will produce an incorrect result for the pixel P shown in the Figure because the contributions from fields 1 and 2 come not from the background that should be present at this point but from the adjacent object. This gives the effect of a "halo" around the trailing edge of the object in the interpolated image, because some picture material from within the body of the object is displayed just behind it.
In accordance with the present invention, an asymmetric filter aperture is used which takes little (or no) contribution from preceding fields. An example of such an aperture is 0.0, 0.0, 1.2, -0.2, which would produce the correct result for all pixels being interpolated in the Figure. That is, if the motion estimator cannot distinguish between foreground regions and those corresponding to revealed background, then it is assumed that all the regions correspond to revealed background. Such an approach will work in both revealed areas and foreground areas having valid vectors.
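The effect of the two apertures can be checked numerically. Only the two coefficient sets come from the text; the sample values below are invented for illustration of a revealed-background pixel.

```python
# Along the trajectory for a revealed-background pixel, fields 3 and 4 hold the
# true background value (here 20), while fields 1 and 2 are contaminated by the
# adjacent foreground object (here 200). Values are illustrative assumptions.
samples = [200.0, 200.0, 20.0, 20.0]       # fields 1..4 along the trajectory

symmetric = [-0.1, 0.6, 0.6, -0.1]         # mid-point aperture from the text
asymmetric = [0.0, 0.0, 1.2, -0.2]         # aperture of the invention

def apply_aperture(coeffs, samples):
    # Both apertures sum to 1, so a flat region is passed unchanged.
    return sum(c * s for c, s in zip(coeffs, samples))

print(apply_aperture(symmetric, samples))  # contaminated result (about 110): the "halo"
print(apply_aperture(asymmetric, samples)) # correct background value (about 20)
```

The asymmetric aperture ignores the two fields the motion estimator never looked at, so the foreground contamination in fields 1 and 2 cannot reach the output.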
More generally, it can be seen that when motion vectors are generated for samples of an input field by reference to either the preceding or the following field, but not both, the distinguishing of foreground picture regions and those being obscured or revealed respectively is not possible. The example above considered the former case only (vector generation by reference to following fields only, making the detection of revealed background impossible). The principle is equally applicable to the case where the motion estimator cannot distinguish between foreground regions and those corresponding to obscured background, in which case it assumes that all the regions correspond to obscured background. The problem is overcome in either situation by the use of an asymmetric temporal filter to generate new images that takes little or no information from fields which the vector estimator itself did not use. The asymmetry of the filter is introduced because of the asymmetry in the motion estimation process.
The system is of particular relevance when motion vectors are applied to time instants other than those at which they were generated, as in the example above. Under such conditions, the interpolation filter cannot reliably use sample values along the motion trajectory in earlier fields not used by the motion estimator (field 1 in the Figure), nor sample values in the field to which the vectors were assigned (field 2 in the Figure).
In the example shown in the Figure, the region of obscured background will be detected by the motion estimator, as no match will be found for this region in the following field.
In the known system, a temporal interpolation process operating in such detected regions will be switched to a "fallback" mode.
Such a mode may, for example, involve the use of a motion vector set to zero. Although this does not allow perfect interpolation in such regions, the artefacts introduced by the use of a zero vector are generally not too objectionable and present much less of a problem than the appearance of the "halos" described above. There may be an advantage in retaining a symmetric filter aperture in such fallback regions. Indeed, it is possible to switch gradually from a symmetric interpolation aperture to an asymmetric aperture according to a measure of confidence in the estimated motion vectors. A measure of confidence may be generated in a number of ways, for example by measuring the difference in luminance levels along an estimated motion trajectory. Such a confidence signal is often produced by motion estimation algorithms and is sometimes termed a prediction error.
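One way the gradual switch could be realised is a linear cross-fade between the two apertures. The cross-fade and the 0-to-1 confidence scale are assumptions for illustration; the text requires only that the change be gradual.

```python
def blended_aperture(confidence,
                     symmetric=(-0.1, 0.6, 0.6, -0.1),
                     asymmetric=(0.0, 0.0, 1.2, -0.2)):
    """Cross-fade the fallback (symmetric) and asymmetric apertures.

    confidence: 0.0 (unreliable vector -> symmetric fallback aperture)
                to 1.0 (reliable vector -> fully asymmetric aperture).
    """
    c = max(0.0, min(1.0, confidence))
    return [c * a + (1.0 - c) * s for a, s in zip(asymmetric, symmetric)]
```

Because both component apertures sum to one, every intermediate blend also sums to one, so flat picture areas are unaffected by the switching.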
The method is applicable to any system employing motion-compensated temporal interpolation where the motion vectors are generated by a method unable to distinguish between foreground regions and regions of either revealed or obscured background. The invention lies in the use of a filter to generate new images that works successfully in both foreground regions and either revealed or obscured regions, according to the type of region that the motion estimator cannot distinguish.
Examples of applications include motion-compensated video standards conversion between, for example, 50Hz and 60Hz field rates and the interpolation of additional video images for high field rate displays or high quality slow motion replay.
In some applications, the interpolation process does not operate purely in the temporal domain. For example, if the input signal is interlaced, it is necessary to perform both vertical and temporal interpolation when generating new fields.
The filter then operates in the vertical-temporal domain and will be more complex than the simple four-tap filter used as an example above. Nevertheless, a suitable asymmetric filter aperture can still be derived. One way of deriving such a filter given a symmetric interpolation filter is to move filter taps previously in odd fields on one side of the filter aperture to odd fields on the other side of the aperture (and similarly for even fields). This maintains the correct vertical positioning of filter coefficients. The displacement vector applied to the coefficients must of course correspond to the field to which they are moved rather than that from which they were taken. Alternatively, the filter can be re-designed as an extrapolation rather than interpolation filter, having by definition a non-zero group delay and hence an asymmetric aperture.
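The tap-moving rule for deriving an asymmetric aperture from a symmetric vertical-temporal one might be sketched as follows. The data representation is an assumption, and recomputing the displacement vectors for the moved taps, which the text requires, is omitted for brevity.

```python
def mirror_to_following(taps):
    """taps: {field_index: {line_offset: coefficient}}.

    Interpolating between fields 2 and 3, field 1 shares parity with field 3
    and field 2 with field 4, so taps in fields 1 and 2 are moved two fields
    later, keeping their vertical (line) positions.
    """
    out = {f: dict(v) for f, v in taps.items()}
    for f in (1, 2):  # the earlier side of the aperture
        for line, c in out.pop(f, {}).items():
            dst = out.setdefault(f + 2, {})
            dst[line] = dst.get(line, 0.0) + c
    return out

# Degenerate, purely temporal example: one vertical tap per field.
sym = {1: {0: -0.1}, 2: {0: 0.6}, 3: {0: 0.6}, 4: {0: -0.1}}
asym = mirror_to_following(sym)  # taps now lie only in fields 3 and 4
```

Note that a moved tap merges with any tap already at the same field and line, so the coefficients still sum to one and the aperture becomes one-sided.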
The operations described here would generally be carried out by digital electronic circuitry. The design of such circuitry incorporating the principles described here is straightforward, as will be readily understood by a person skilled in the art. For example, the design of both a motion estimator and a temporal interpolator is described by Borer, T.J. et al.,
"Motion-Compensated Display Field-Rate Up-Conversion", IBC '90, IEE Conference Publication No. 327, pp. 321-325, September 1990.

Claims (10)

1. A machine method of image interpolation, comprising the steps of: determining a motion vector for regions of each original image by reference to the following image in the sequence; and generating one or more images corresponding to time instants between the two images by reference to samples along an estimated motion trajectory according to the motion vector generated for the corresponding region in the first of the two images, in such a way that most or all of the image information is taken from following images.
2. A machine method of image interpolation, comprising the steps of: determining a motion vector for regions of each original image by reference to the preceding image in the sequence; and generating one or more images corresponding to time instants between the two images by reference to samples along an estimated motion trajectory according to the motion vector generated for the corresponding region in the second of the two images, in such a way that most or all of the image information is taken from preceding images.
3. A machine method of image interpolation, in which motion vectors are used in the interpolation process which have been derived in such a way that foreground regions and regions of revealed background cannot be distinguished, and in which in the interpolation method all image regions are treated as if they correspond to revealed background by taking contributions primarily or entirely from following images.
4. A machine method of image interpolation, in which motion vectors are used in the interpolation process which have been derived in such a way that foreground regions and regions of obscured background cannot be distinguished, and in which in the interpolation method all image regions are treated as if they correspond to obscured background by taking contributions primarily or entirely from preceding images.
5. A method according to any of the preceding claims, in which regions to which a motion vector cannot be assigned are interpolated using a filter that takes information from preceding and following images.
6. A method according to any of the preceding claims, in which the filter used to generate new images is switched gradually from being symmetric to asymmetric in proportion to the measured level of confidence or reliability of the motion vector, such that vectors having a low confidence level are used for interpolation with a symmetric interpolation filter.
7. A method according to any of the preceding claims, in which the filter taking information from either preceding or following fields is a filter designed for temporal extrapolation of images.
8. A method according to any of the preceding claims, in which the image sequence is interlaced and the filter used for generating new images is a vertical-temporal filter.
9. Apparatus for image interpolation, comprising: means for determining a motion vector for regions of each original image by reference to the following image in the sequence; and means for generating one or more images corresponding to time instants between the two images by reference to samples along an estimated motion trajectory according to the motion vector generated for the corresponding region in the first of the two images, in such a way that most or all of the image information is taken from following images.
10. Apparatus for image interpolation, comprising: means for determining a motion vector for regions of each original image by reference to the preceding image in the sequence; and means for generating one or more images corresponding to time instants between the two images by reference to samples along an estimated motion trajectory according to the motion vector generated for the corresponding region in the second of the two images, in such a way that most or all of the image information is taken from preceding images.
GB9202119A 1991-02-01 1992-01-31 Video image processing Expired - Fee Related GB2253760B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB9202119A GB2253760B (en) 1991-02-01 1992-01-31 Video image processing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB919102197A GB9102197D0 (en) 1991-02-01 1991-02-01 Video image processing
GB9202119A GB2253760B (en) 1991-02-01 1992-01-31 Video image processing

Publications (3)

Publication Number Publication Date
GB9202119D0 GB9202119D0 (en) 1992-03-18
GB2253760A true GB2253760A (en) 1992-09-16
GB2253760B GB2253760B (en) 1994-07-27

Family

ID=26298373

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9202119A Expired - Fee Related GB2253760B (en) 1991-02-01 1992-01-31 Video image processing

Country Status (1)

Country Link
GB (1) GB2253760B (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2231743A (en) * 1989-04-27 1990-11-21 Sony Corp Motion dependent video signal processing


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2265783A (en) * 1992-04-01 1993-10-06 Kenneth Stanley Jones Bandwidth reduction employing a DATV channel
GB2265783B (en) * 1992-04-01 1996-05-29 Kenneth Stanley Jones Bandwidth reduction employing a classification channel
WO1996031069A1 (en) * 1995-03-24 1996-10-03 National Semiconductor Corporation Motion vector based frame insertion process for increasing the frame rate of moving images
US5943096A (en) * 1995-03-24 1999-08-24 National Semiconductor Corporation Motion vector based frame insertion process for increasing the frame rate of moving images
US6621864B1 (en) 1995-03-24 2003-09-16 National Semiconductor Corporation Motion vector based frame insertion process for increasing the frame rate of moving images
WO2001078406A1 (en) * 2000-04-07 2001-10-18 Snell & Wilcox Limited Video signal processing
WO2001078388A1 (en) * 2000-04-07 2001-10-18 Snell & Wilcox Limited Method of conversion from an interlaced format to a progressive format having a lower frame rate
US7202909B2 (en) * 2000-04-07 2007-04-10 Snell & Wilcox Limited Video signal processing with two stage motion compensation
CN100459693C (en) * 2005-11-08 2009-02-04 逐点半导体(上海)有限公司 A motion compensation frame insertion device and frame insertion method
US10672104B2 (en) 2014-12-22 2020-06-02 Interdigital Ce Patent Holdings, Sas Method and apparatus for generating an extrapolated image based on object detection

Also Published As

Publication number Publication date
GB9202119D0 (en) 1992-03-18
GB2253760B (en) 1994-07-27

Similar Documents

Publication Publication Date Title
US5642170A (en) Method and apparatus for motion compensated interpolation of intermediate fields or frames
US5929919A (en) Motion-compensated field rate conversion
US6940557B2 (en) Adaptive interlace-to-progressive scan conversion algorithm
De Haan et al. True-motion estimation with 3-D recursive search block matching
US6005639A (en) Vector assignment for video image motion compensation
US5444493A (en) Method and apparatus for providing intra-field interpolation of video signals with adaptive weighting based on gradients of temporally adjacent fields
KR100396558B1 (en) Apparatus and method for converting frame and/or field rate using adaptive motion compensation
US7057665B2 (en) Deinterlacing apparatus and method
US6625333B1 (en) Method for temporal interpolation of an image sequence using object-based image analysis
US8254439B2 (en) Apparatus and methods for motion vector correction
CN100438609C (en) Image processing unit with degradation
US20030103568A1 (en) Pixel data selection device for motion compensated interpolation and method thereof
US20090208123A1 (en) Enhanced video processing using motion vector data
US5610662A (en) Method and apparatus for reducing conversion artifacts
JPH08307820A (en) System and method for generating high image quality still picture from interlaced video
EP0765572B1 (en) Motion-compensated field rate conversion
US20100177239A1 (en) Method of and apparatus for frame rate conversion
EP1721458A1 (en) Reducing artefacts in scan-rate conversion of image signals by combining interpolation and extrapolation of images
GB2253760A (en) Video image processing
EP0648046B1 (en) Method and apparatus for motion compensated interpolation of intermediate fields or frames
EP0575862B1 (en) Method and apparatus for adaptive interpolation
KR100382651B1 (en) Method and apparatus for detecting motion using region-wise motion decision information in video signal processing system and data interpolating method and apparatus therefor
Kalevo et al. Deinterlacing of video signals using nonlinear interpolation with simple motion compensation
Choquet et al. Multipredictive Motion Estimation Scheme With A Prediction Along Motion Axis
KR960010194B1 (en) Hd-mac. system mode generator

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 19990131