WO2019103542A1 - Image processing method based on intra prediction mode, and device therefor - Google Patents
Image processing method based on intra prediction mode, and device therefor
- Publication number
- WO2019103542A1 (PCT/KR2018/014559)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mpm
- candidate list
- block
- prediction mode
- current block
- Prior art date
- Legal status: Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/103—Selection of coding mode or of prediction mode
- H04N19/105—Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/103—Selection of coding mode or of prediction mode
- H04N19/11—Selection of coding mode or of prediction mode among a plurality of spatial predictive coding modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/593—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial prediction techniques
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/70—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards
Definitions
- the present invention relates to a still image or moving image processing method, and more particularly, to a method of encoding / decoding a still image or moving image based on an intra prediction mode and an apparatus for supporting the same.
- Compression encoding refers to a series of signal processing techniques for transmitting digitized information over a communication line or storing it in a form suitable for a storage medium.
- Media such as video, image, and audio can be subject to compression coding.
- a technique for performing compression coding on an image is referred to as video image compression.
- Next-generation video content will feature high spatial resolution, high frame rate, and high dimensionality of scene representation. Processing such content will result in a tremendous increase in terms of memory storage, memory access rate, and processing power.
- an object of the present invention is to propose a method of effectively signaling an increased intra prediction mode using MPM (Most Probable Mode).
- a method of encoding an image based on an intra prediction mode, comprising: generating a most probable mode (MPM) candidate list based on an intra prediction mode of a block neighboring the current block; encoding an MPM flag indicating whether the current block is encoded using the MPM, when the intra prediction mode of the current block is included in the MPM candidate list; sorting the order of the MPM candidate list using a template region of the current block; and encoding an MPM index indicating the intra prediction mode within the sorted MPM candidate list.
- MPM most probable mode
- the step of sorting the order of the MPM candidate list may include sorting the order of the MPM candidate list based on a difference value between a prediction block of the template region, generated based on an intra prediction mode included in the MPM candidate list and the surrounding reference samples of the template region, and a reconstruction block of the template region.
- the step of sorting the order of the MPM candidate list may include a step of sorting the candidates included in the MPM candidate list in ascending order of the difference value.
- the difference value may be a sum of absolute transformed differences (SATD), a sum of absolute differences (SAD), or a sum of squared errors (SSE).
- the template region may include a first region of a predetermined size adjacent to the left side of the current block and a second region of a predetermined size adjacent to the upper side of the current block.
- a method of decoding an image based on an intra prediction mode, comprising: decoding an MPM flag indicating whether a current block is encoded using an MPM (Most Probable Mode); generating an MPM candidate list based on an intra prediction mode of a block neighboring the current block when the current block is encoded using the MPM; sorting the order of the MPM candidate list using a template region of the current block; decoding an MPM index indicating a prediction mode applied to intra prediction of the current block within the sorted MPM candidate list; and generating a prediction block of the current block based on the prediction mode specified by the MPM index.
- MPM Most Probable Mode
- the step of sorting the order of the MPM candidate list may include sorting the order of the MPM candidate list based on a difference value between a prediction block of the template region, generated based on an intra prediction mode included in the MPM candidate list and the surrounding reference samples of the template region, and a reconstruction block of the template region.
- the step of sorting the order of the MPM candidate list may include a step of sorting the candidates included in the MPM candidate list in ascending order of the difference value.
- the difference value may be a sum of absolute transformed differences (SATD), a sum of absolute differences (SAD), or a sum of squared errors (SSE).
- the template region may include a first region of a predetermined size adjacent to the left side of the current block and a second region of a predetermined size adjacent to the upper side of the current block.
- an apparatus for encoding an image based on an intra prediction mode, comprising: an MPM candidate list generation unit for generating an MPM (Most Probable Mode) candidate list based on an intra prediction mode of a block neighboring the current block; an MPM flag encoding unit for encoding an MPM flag indicating whether the current block is encoded using the MPM when the intra prediction mode of the current block is included in the MPM candidate list; an MPM candidate list sorting unit for sorting the order of the MPM candidate list using a template region of the current block; and an MPM index encoding unit for encoding an MPM index indicating the intra prediction mode within the sorted MPM candidate list.
- an apparatus for decoding an image based on an intra prediction mode, comprising: an MPM flag decoding unit for decoding an MPM flag indicating whether a current block is encoded using an MPM (Most Probable Mode); an MPM candidate list generation unit for generating an MPM candidate list based on an intra prediction mode of a block neighboring the current block when the current block is encoded using the MPM; an MPM candidate list sorting unit for sorting the order of the MPM candidate list using a template region of the current block; an MPM index decoding unit for decoding an MPM index indicating a prediction mode to be applied to intra prediction of the current block within the sorted MPM candidate list; and a prediction block generator for generating a prediction block of the current block based on the prediction mode specified by the MPM index.
- bits for expressing the intra prediction mode can be saved, and compression performance can be improved, by efficiently arranging the prediction modes included in the MPM (Most Probable Mode) candidate list and allocating indexes accordingly.
- FIG. 1 is a schematic block diagram of an encoder in which still image or moving picture signal encoding is performed according to an embodiment of the present invention.
- FIG. 2 is a schematic block diagram of a decoder in which still image or moving picture signal decoding is performed according to an embodiment of the present invention.
- FIG. 3 is a diagram for explaining a block division structure of a QT (QuadTree, hereinafter referred to as 'QT') to which the present invention can be applied.
- FIG. 4 is a diagram for explaining a BT (Binary Tree, hereinafter referred to as 'BT') block division structure to which the present invention can be applied.
- BT Binary Tree
- FIG. 5 is a diagram for explaining a block division structure of a TT (Ternary Tree) block according to an embodiment of the present invention.
- FIG. 6 is a diagram for explaining an AT (Asymmetric Tree) block partitioning structure to which the present invention can be applied.
- FIG. 7 is a diagram illustrating an intra prediction method according to an embodiment to which the present invention is applied.
- FIG. 8 illustrates a prediction direction according to an intra prediction mode.
- FIG. 9 is a diagram illustrating a prediction direction according to an intra prediction mode, to which the present invention is applied.
- FIG. 10 is a diagram illustrating a method of interpolating a reference sample to generate a prediction sample, to which the present invention is applied.
- FIG. 11 is a diagram for explaining a method of constructing an MPM (Most Probable Mode) using a prediction mode of a neighboring block, to which the present invention is applied.
- FIG. 12 is a flowchart illustrating an image encoding method using an MPM (Most Probable Mode) according to an embodiment of the present invention.
- FIG. 13 is a flowchart illustrating an image decoding method using MPM (Most Probable Mode) according to an embodiment of the present invention.
- FIG. 14 is a diagram for explaining a template region according to an embodiment to which the present invention is applied.
- FIG. 15 is a diagram specifically illustrating an intra predictor according to an embodiment of the present invention.
- FIG. 16 is a diagram specifically illustrating an intra predictor according to an embodiment of the present invention.
- FIG. 17 shows a video coding system to which the present invention is applied.
- FIG. 18 shows a structure of a contents streaming system as an embodiment to which the present invention is applied.
- 'processing unit' means a unit in which an encoding / decoding process such as prediction, transform, and / or quantization is performed.
- the processing unit may be referred to as a " processing block " or a " block "
- the processing unit may be interpreted to include a unit for the luma component and a unit for the chroma component.
- the processing unit may correspond to a coding tree unit (CTU), a coding unit (CU), a prediction unit (PU), or a transform unit (TU).
- CTU coding tree unit
- CU coding unit
- PU prediction unit
- TU transform unit
- the processing unit can be interpreted as a unit for a luminance (luma) component or as a unit for a chroma component.
- the processing unit may include a Coding Tree Block (CTB), a Coding Block (CB), a Prediction Block (PB), or a Transform Block (TB).
- CTB Coding Tree Block
- CB Coding Block
- PB Prediction Block
- TB Transform Block
- the processing unit may be interpreted to include a unit for the luma component and a unit for the chroma component.
- processing unit is not necessarily limited to a square block, but may be configured as a polygonal shape having three or more vertexes.
- a pixel, a pel, or the like is hereinafter collectively referred to as a sample.
- using a sample may mean using a pixel value or the like.
- FIG. 1 is a schematic block diagram of an encoder in which still image or moving picture signal encoding is performed according to an embodiment of the present invention.
- an encoder 100 includes an image divider 110, a subtractor 115, a transformer 120, a quantizer 130, an inverse quantizer 140, an inverse transformer 150, a filtering unit 160, a decoded picture buffer (DPB) 170, a prediction unit 180, and an entropy encoding unit 190.
- the prediction unit 180 may include an inter prediction unit 181 and an intra prediction unit 182.
- the image divider 110 divides an input video signal (or a picture, a frame) input to the encoder 100 into one or more processing units.
- the subtractor 115 subtracts a prediction signal (or prediction block) output from the prediction unit 180 (i.e., the inter prediction unit 181 or the intra prediction unit 182) from the input video signal to generate a residual signal (or residual block).
- the generated residual signal (or residual block) is transmitted to the transform unit 120.
- the transform unit 120 transforms the residual signal (or residual block) using a transform technique (for example, DCT (Discrete Cosine Transform), DST (Discrete Sine Transform), GBT (Graph-Based Transform), KLT (Karhunen-Loève Transform), etc.) to generate transform coefficients.
- the transform unit 120 may generate transform coefficients by performing transform using a transform technique determined according to a prediction mode applied to a difference block and a size of a difference block.
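As an illustrative sketch only (the patent does not prescribe a particular implementation, and the choice among DCT, DST, GBT, and KLT is made by the encoder according to the prediction mode and block size), a separable 2D DCT-II of a residual block can be written as follows; the function names are assumptions of this sketch.

```python
import numpy as np

def dct2_matrix(n):
    """Orthonormal DCT-II basis matrix of size n x n."""
    k = np.arange(n).reshape(-1, 1)
    i = np.arange(n).reshape(1, -1)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def forward_transform(residual_block):
    """Apply a separable 2D DCT-II to a square residual block."""
    n = residual_block.shape[0]
    c = dct2_matrix(n)
    return c @ residual_block @ c.T

residual = np.random.randint(-32, 32, size=(4, 4)).astype(float)
coeffs = forward_transform(residual)   # transform coefficients passed on to quantization
```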
- the quantization unit 130 quantizes the transform coefficients and transmits the quantized transform coefficients to the entropy encoding unit 190.
- the entropy encoding unit 190 entropy-codes the quantized signals and outputs them as a bitstream.
- the quantized signal output from the quantization unit 130 may be used to generate a prediction signal.
- the quantized signal can be reconstructed by applying inverse quantization and inverse transformation through the inverse quantization unit 140 and the inverse transform unit 150 in the loop.
- a reconstructed signal can be generated by adding the reconstructed difference signal to a prediction signal output from the inter prediction unit 181 or the intra prediction unit 182.
- the filtering unit 160 applies filtering to the restored signal and outputs the restored signal to the playback apparatus or the decoded picture buffer 170.
- the filtered signal transmitted to the decoded picture buffer 170 may be used as a reference picture in the inter-prediction unit 181. As described above, not only the picture quality but also the coding efficiency can be improved by using the filtered picture as a reference picture in the inter-picture prediction mode.
- the decoded picture buffer 170 may store the filtered picture for use as a reference picture in the inter-prediction unit 181.
- the inter-prediction unit 181 performs temporal prediction and / or spatial prediction to remove temporal redundancy and / or spatial redundancy with reference to a reconstructed picture.
- Since the reference picture used for prediction is a signal that has undergone quantization and inverse quantization in units of blocks during previous encoding / decoding, blocking artifacts or ringing artifacts may exist.
- the inter-prediction unit 181 can interpolate signals between pixels by sub-pixel by applying a low-pass filter in order to solve the performance degradation due to discontinuity or quantization of such signals.
- a subpixel means a virtual pixel generated by applying an interpolation filter
- an integer pixel means an actual pixel existing in a reconstructed picture.
- As the interpolation method, linear interpolation, bi-linear interpolation, a Wiener filter, and the like can be applied.
- the interpolation filter may be applied to a reconstructed picture to improve the accuracy of the prediction.
- the inter-prediction unit 181 generates an interpolated pixel by applying an interpolation filter to integer pixels, and can perform prediction using an interpolated block composed of interpolated pixels as a prediction block.
- the intra predictor 182 predicts a current block by referring to samples in the vicinity of a block to be currently encoded.
- the intra prediction unit 182 may perform the following procedure to perform intra prediction. First, a reference sample necessary for generating a prediction signal can be prepared. Then, a prediction signal can be generated using the prepared reference sample. Thereafter, the prediction mode is encoded. At this time, reference samples can be prepared through reference sample padding and / or reference sample filtering. Since the reference samples have undergone prediction and reconstruction processes, quantization errors may exist. Therefore, a reference sample filtering process can be performed for each prediction mode used for intra prediction to reduce such errors.
- a prediction signal (or prediction block) generated through the inter prediction unit 181 or the intra prediction unit 182 is used to generate a reconstruction signal (or reconstruction block) or a residual signal (or residual block).
- FIG. 2 is a schematic block diagram of a decoder in which still image or moving picture signal decoding is performed according to an embodiment of the present invention.
- the decoder 200 includes an entropy decoding unit 210, an inverse quantization unit 220, an inverse transform unit 230, an adder 235, a filtering unit 240, a decoded picture buffer (DPB) unit 250, and a prediction unit 260.
- the prediction unit 260 may include an inter prediction unit 261 and an intra prediction unit 262.
- the reconstructed video signal output through the decoder 200 may be reproduced through a reproducing apparatus.
- the decoder 200 receives a signal (i.e., a bit stream) output from the encoder 100 of FIG. 1, and the received signal is entropy-decoded through the entropy decoding unit 210.
- a signal i.e., a bit stream
- the inverse quantization unit 220 obtains a transform coefficient from the entropy-decoded signal using the quantization step size information.
- the inverse transform unit 230 obtains a residual signal (or a difference block) by inverse transforming the transform coefficient by applying an inverse transform technique.
- the adder 235 adds the obtained residual signal (or residual block) to the prediction signal output from the prediction unit 260 (i.e., the inter prediction unit 261 or the intra prediction unit 262) to generate a reconstructed signal (or reconstruction block).
- the filtering unit 240 applies filtering to a reconstructed signal (or a reconstructed block) and outputs it to a reproducing apparatus or transmits the reconstructed signal to a decoding picture buffer unit 250.
- the filtered signal transmitted to the decoding picture buffer unit 250 may be used as a reference picture in the inter prediction unit 261.
- the embodiments described for the filtering unit 160, the inter-prediction unit 181, and the intra-prediction unit 182 of the encoder 100 can be equally applied to the filtering unit 240, the inter-prediction unit 261, and the intra-prediction unit 262 of the decoder, respectively.
- FIG. 3 is a diagram for explaining a block division structure of a QT (QuadTree, hereinafter referred to as 'QT') to which the present invention can be applied.
- One block in video coding can be segmented based on QT (QuadTree).
- QT QuadTree
- one sub-block divided by QT can be further recursively partitioned using QT.
- a leaf block that is not QT-divided can be divided by at least one of BT (Binary Tree), TT (Ternary Tree), or AT (Asymmetric Tree).
- BT can have two types of segmentation: horizontal BT (2NxN, 2NxN) and vertical BT (Nx2N, Nx2N).
- TT can have two types of segmentation: horizontal TT (2Nx1/2N, 2NxN, 2Nx1/2N) and vertical TT (1/2Nx2N, Nx2N, 1/2Nx2N).
- AT can have four types of segmentation: horizontal-up AT (2Nx1/2N, 2Nx3/2N), horizontal-down AT (2Nx3/2N, 2Nx1/2N), vertical-left AT (1/2Nx2N, 3/2Nx2N), and vertical-right AT (3/2Nx2N, 1/2Nx2N).
- Each BT, TT, and AT can be recursively further partitioned using BT, TT, and AT.
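As a sketch under the 2N x 2N convention used above (the split-type labels are informal assumptions made here for illustration, not the patent's syntax), the child block sizes of each split can be computed as follows.

```python
def split_block(width, height, split):
    """Return the (width, height) of each child block for a given split type."""
    if split == "QT":           # four equal quadrants
        return [(width // 2, height // 2)] * 4
    if split == "BT_HOR":       # two 2N x N blocks
        return [(width, height // 2)] * 2
    if split == "BT_VER":       # two N x 2N blocks
        return [(width // 2, height)] * 2
    if split == "TT_HOR":       # 2N x 1/2N, 2N x N, 2N x 1/2N
        return [(width, height // 4), (width, height // 2), (width, height // 4)]
    if split == "TT_VER":       # 1/2N x 2N, N x 2N, 1/2N x 2N
        return [(width // 4, height), (width // 2, height), (width // 4, height)]
    if split == "AT_HOR_UP":    # 2N x 1/2N, 2N x 3/2N
        return [(width, height // 4), (width, 3 * height // 4)]
    if split == "AT_HOR_DOWN":  # 2N x 3/2N, 2N x 1/2N
        return [(width, 3 * height // 4), (width, height // 4)]
    if split == "AT_VER_LEFT":  # 1/2N x 2N, 3/2N x 2N
        return [(width // 4, height), (3 * width // 4, height)]
    if split == "AT_VER_RIGHT": # 3/2N x 2N, 1/2N x 2N
        return [(3 * width // 4, height), (width // 4, height)]
    raise ValueError("unknown split type")

print(split_block(64, 64, "TT_VER"))  # [(16, 64), (32, 64), (16, 64)]
```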
- FIG. 3 shows an example of QT division.
- the block A can be divided into four sub-blocks (A0, A1, A2, A3) by QT.
- the sub-block A1 can be further divided into four sub-blocks (B0, B1, B2, B3) by QT.
- FIG. 4 is a diagram for explaining a BT (Binary Tree, hereinafter referred to as 'BT') block division structure to which the present invention can be applied.
- BT Binary Tree
- FIG. 4 shows an example of BT division.
- Block B3 which is no longer partitioned by QT, can be divided into vertical BT (C0, C1) or horizontal BT (D0, D1).
- each sub-block can be further recursively partitioned, such as in the form of horizontal BT (E0, E1) or vertical BT (F0, F1).
- FIG. 5 is a diagram for explaining a block division structure of a TT (Ternary Tree) block according to an embodiment of the present invention.
- FIG. 5 shows an example of TT division.
- Block B3 which is no longer partitioned by QT, may be divided into vertical TT (C0, C1, C2) or horizontal TT (D0, D1, D2).
- each sub-block can be further recursively divided into a horizontal TT (E0, E1, E2) or a vertical TT (F0, F1, F2).
- FIG. 6 is a diagram for explaining an AT (Asymmetric Tree) block partitioning structure to which the present invention can be applied.
- Block B3 which is no longer partitioned by QT, may be partitioned into vertical AT (C0, C1) or horizontal AT (D0, D1).
- each subblock can be further recursively partitioned, such as in the form of horizontal AT (E0, E1) or vertical AT (F0, F1).
- BT, TT, and AT segmentation can be used together.
- a subblock divided by BT can be divided by TT or AT.
- subblocks divided by TT can be divided by BT or AT.
- a subblock divided by AT can be divided by BT or TT.
- For example, after a horizontal BT partition, each subblock may be further partitioned into a vertical BT, or after a vertical BT partition, each subblock may be further partitioned into a horizontal BT.
- the two kinds of division methods have the same shape in the final division although the division order is different.
- When a block is divided, searching is performed from left to right and from top to bottom. Searching for a block refers to the order of determining whether each divided sub-block is further divided, the coding order of each sub-block when the block is not further divided, or the search order when referring to information of other neighboring blocks in a sub-block.
- The decoded portion of the current picture, or of other pictures containing the current processing unit, may be used to reconstruct the current processing unit on which decoding is performed.
- A picture (slice) that uses only the current picture for reconstruction, that is, a picture (slice) that performs only intra-picture prediction, is referred to as an intra picture or I picture (slice).
- A picture (slice) using at most one motion vector and one reference index to predict each unit may be referred to as a predictive picture or P picture (slice), and a picture (slice) using at most two motion vectors and reference indices may be referred to as a bi-predictive picture or B picture (slice).
- Intra prediction refers to a prediction method that derives the current processing block from a data element (e.g., a sample value, etc.) of the same decoded picture (or slice). That is, it means a method of predicting the pixel value of the current processing block by referring to the reconstructed areas in the current picture.
- a data element e.g., a sample value, etc.
- Inter prediction refers to a prediction method of deriving a current processing block based on a data element (e.g., a sample value or a motion vector) of a picture other than the current picture. That is, this means a method of predicting pixel values of a current processing block by referring to reconstructed areas in other reconstructed pictures other than the current picture.
- a data element e.g., a sample value or a motion vector
- intra prediction (or intra prediction) will be described in more detail.
- Intra prediction or intra prediction
- FIG. 7 is a diagram illustrating an intra prediction method according to an embodiment to which the present invention is applied.
- the decoder derives an intra prediction mode of the current processing block (S701).
- In intra prediction, a prediction mode may have a prediction direction with respect to the positions of the reference samples used for prediction.
- An intra prediction mode having a prediction direction is referred to as an intra directional prediction mode (Intra_Angular prediction mode).
- intra prediction mode Intra_Angular prediction mode
- There are an intra-planar (INTRA_PLANAR) prediction mode and an intra-DC (INTRA_DC) prediction mode as intra prediction modes having no prediction direction.
- Table 1 illustrates the intra-prediction mode and related names
- FIG. 8 illustrates the prediction direction according to the intra-prediction mode.
- In intra prediction, prediction is performed on the current processing block based on the derived prediction mode. Since the reference samples and the concrete prediction method used for prediction differ according to the prediction mode, when the current block is encoded in an intra prediction mode, the decoder derives the prediction mode of the current block in order to perform prediction.
- the decoder checks whether neighboring samples of the current processing block can be used for prediction, and constructs reference samples to be used for prediction (S702).
- The neighboring samples of the current processing block of size nS x nS include a total of 2 x nS samples adjacent to the left boundary and neighboring the bottom-left, a total of 2 x nS samples adjacent to the top boundary and neighboring the top-right, and one sample neighboring the top-left of the current processing block.
- the decoder may substitute samples that are not available with the available samples to construct reference samples for use in prediction.
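A minimal sketch (not part of the disclosure; the substitution rule is deliberately simplified here) of collecting the 2 x nS left, 2 x nS top, and one top-left neighboring samples and replacing unavailable ones with the nearest available value:

```python
import numpy as np

def in_bounds(a, y, x):
    return 0 <= y < a.shape[0] and 0 <= x < a.shape[1]

def substitute(samples, default=128):
    """Replace NaN entries by propagating the previous available sample (simplified rule)."""
    out = samples.copy()
    last = default
    for i in range(len(out)):
        if np.isnan(out[i]):
            out[i] = last
        else:
            last = out[i]
    return out

def build_reference_samples(recon, avail, x0, y0, nS):
    """Collect 4*nS + 1 neighboring reference samples of an nS x nS block at (x0, y0).

    recon : 2D array of reconstructed samples (indexed [y][x])
    avail : 2D boolean array marking which samples are already reconstructed
    Returns (left, top): left holds 2*nS samples down the left / below-left column,
    top holds the top-left sample followed by 2*nS samples along the top / above-right row.
    """
    left = np.zeros(2 * nS)
    top = np.zeros(2 * nS + 1)
    for i in range(2 * nS):                      # left and below-left column
        y, x = y0 + i, x0 - 1
        left[i] = recon[y, x] if in_bounds(recon, y, x) and avail[y, x] else np.nan
    y, x = y0 - 1, x0 - 1                        # top-left sample
    top[0] = recon[y, x] if in_bounds(recon, y, x) and avail[y, x] else np.nan
    for i in range(2 * nS):                      # top and above-right row
        y, x = y0 - 1, x0 + i
        top[i + 1] = recon[y, x] if in_bounds(recon, y, x) and avail[y, x] else np.nan
    return substitute(left), substitute(top)

recon = np.full((16, 16), 100)
avail = np.zeros((16, 16), dtype=bool)
avail[:8, :] = True                              # assume only the upper half is reconstructed
left, top = build_reference_samples(recon, avail, x0=8, y0=4, nS=4)
print(left, top)
```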
- the decoder may perform filtering of the reference samples based on the intra prediction mode (S703).
- Whether or not the filtering of the reference sample is performed can be determined based on the size of the current processing block.
- the filtering method of the reference sample may be determined by a filtering flag transmitted from the encoder.
- the decoder generates a prediction block for the current processing block based on the intra prediction mode and the reference samples (S704). That is, the decoder generates the prediction block for the current processing block (i.e., generates prediction samples) based on the intra prediction mode derived in the intra prediction mode deriving step S701 and the reference samples obtained through the reference sample constructing step S702 and the reference sample filtering step S703.
- When the intra prediction mode is the INTRA_DC mode, the left boundary samples of the prediction block (i.e., the samples in the prediction block adjacent to the left boundary) and the upper boundary samples (i.e., the samples in the prediction block adjacent to the upper boundary) may be filtered in step S704.
- In addition, filtering may be applied to the left boundary samples or the upper boundary samples, similarly to the INTRA_DC mode, for the vertical direction mode and the horizontal direction mode among the intra directional prediction modes.
- the value of a predicted sample can be derived based on a reference sample located in a prediction direction.
- the boundary sample which is not located in the prediction direction may be adjacent to the reference sample which is not used for prediction. That is, the distance from the reference sample that is not used for prediction may be much closer than the distance from the reference sample used for prediction.
- the decoder may adaptively apply filtering to the left boundary samples or the upper boundary samples according to whether the intra-prediction direction is vertical or horizontal. That is, when the intra prediction direction is vertical, filtering is applied to the left boundary samples, and filtering is applied to the upper boundary samples when the intra prediction direction is the horizontal direction.
- FIG. 9 is a diagram illustrating a prediction direction according to an intra prediction mode, to which the present invention is applied.
- Among the intra prediction modes, the remaining 65 directional prediction modes, excluding the non-directional DC mode and planar mode, can have prediction directions as shown in FIG. 9, and the encoder / decoder can perform intra prediction by copying a reference sample determined according to the direction.
- the prediction mode numbers from 2 to 66 can be sequentially allocated from the lower left prediction direction to the upper right prediction direction, respectively.
- the method proposed in the present invention is mainly described in terms of intra prediction using the 65 prediction modes recently discussed, but can also be applied in the same manner to intra prediction using the 35 conventional prediction modes.
- FIG. 10 is a diagram illustrating a method of interpolating a reference sample to generate a prediction sample, to which the present invention is applied.
- the encoder / decoder can generate a prediction sample by copying a reference sample determined according to the prediction direction of the intra prediction mode. If the reference sample determined according to the prediction direction is not at an integer pixel position, the encoder / decoder can interpolate adjacent integer-pixel reference samples to compute a reference sample at the fractional pixel location and copy it to generate the prediction sample.
- the encoder / decoder can compute an interpolated reference sample using the reference samples at the corresponding two integer pixel positions and the distance ratio obtained through the angle of the prediction mode, as shown in FIG. 10.
- the encoder / decoder can then generate a predicted sample by copying the computed interpolated reference sample.
- a tangent (tan θ) value of the angle θ of the prediction mode may be defined to calculate the position of the sub-pixel (i.e., the fractional pixel). In addition, in order to reduce the complexity of the operation, the value can be defined by scaling to integer units, and tan θ can be determined for each of the 67 prediction modes by using Table 2 below.
- tan⁻¹ θ values for some prediction modes can be determined using Table 3 below.
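A hedged sketch of the fractional-position copying described above, assuming the tangent of the prediction angle has been pre-scaled to 1/32-sample units (the actual scaled values of Tables 2 and 3 are not reproduced here, and only a non-negative vertical-family angle is handled); names are illustrative.

```python
def predict_column_angular(ref_top, pred_angle, width, height):
    """Generate prediction samples by projecting onto the top reference row.

    ref_top    : top reference samples, index 0 at the block's left edge
    pred_angle : tangent of the prediction angle, pre-scaled to 1/32-sample units (>= 0)
    A two-tap linear interpolation is used when the projected position falls
    between two integer-position reference samples.
    """
    pred = [[0] * width for _ in range(height)]
    for y in range(height):
        pos = (y + 1) * pred_angle          # displacement in 1/32 samples
        idx = pos >> 5                      # integer part
        frac = pos & 31                     # fractional part (0..31)
        for x in range(width):
            a = ref_top[x + idx]
            b = ref_top[x + idx + 1]
            # weighted copy of the two nearest integer-position reference samples
            pred[y][x] = ((32 - frac) * a + frac * b + 16) >> 5
    return pred

ref = list(range(40))  # enough reference samples for an 8x8 block with a small positive angle
print(predict_column_angular(ref, 5, 8, 8)[0][:4])  # [0, 1, 2, 3] for this gently sloped example
```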
- the encoder / decoder may apply an interpolation filter on the integer pixel reference samples.
- the interpolation filter can be selectively determined according to the size of the current processing block.
- the encoder / decoder performs interpolation on reference samples at integer pixel locations, and when the width or height of the current processing block is less than or equal to 8, a cubic filter is used as the interpolation filter. If the width or height of the current processing block is greater than 8, the encoder / decoder may use a Gaussian filter as the interpolation filter.
- the directional prediction modes may be classified into vertical direction prediction modes when the prediction mode is greater than or equal to mode 34, and horizontal direction prediction modes when the prediction mode is smaller than mode 34. If the mode is a vertical direction prediction mode, the encoder / decoder selects the interpolation filter based on the width of the current processing block; if the mode is a horizontal direction prediction mode, the interpolation filter can be selected based on the height of the current processing block.
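The size- and direction-dependent filter choice described above can be sketched as follows; the threshold of 8 and the vertical/horizontal split at mode 34 follow the text, while the function name and return values are illustrative assumptions.

```python
def select_interpolation_filter(pred_mode, width, height, diagonal_mode=34):
    """Choose the reference-sample interpolation filter for a directional mode.

    Modes greater than or equal to the diagonal mode are treated as vertical and
    use the block width; smaller directional modes are treated as horizontal and
    use the block height.
    """
    size = width if pred_mode >= diagonal_mode else height
    return "cubic" if size <= 8 else "gaussian"

print(select_interpolation_filter(50, 16, 8))  # 'gaussian' (vertical mode, width 16)
print(select_interpolation_filter(18, 16, 8))  # 'cubic' (horizontal mode, height 8)
```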
- the DC mode indicates a prediction method of constructing a prediction block using an average value of reference samples located around the current block.
- An effective prediction can be expected when the pixels in the current processing block are homogeneous.
- If the values of the reference samples are not uniform, a discontinuity may occur between the prediction block and the reference samples.
- To compensate for this, a planar prediction method has been devised.
- the planar prediction method constructs a prediction block by performing horizontal linear prediction and vertical linear prediction using surrounding reference samples and then averaging them.
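A minimal sketch of the DC and planar constructions described above (the reference-sample indexing and rounding here are simplifying assumptions; standardized designs differ in detail):

```python
import numpy as np

def dc_prediction(ref_left, ref_top, size):
    """DC mode: fill the block with the average of the surrounding reference samples."""
    dc = int(round((np.sum(ref_left[:size]) + np.sum(ref_top[:size])) / (2 * size)))
    return np.full((size, size), dc, dtype=int)

def planar_prediction(ref_left, ref_top, size):
    """Planar mode: average a horizontal and a vertical linear prediction.

    Simplified sketch: the horizontal pass interpolates between the left reference
    sample of each row and the above-right reference sample; the vertical pass
    interpolates between the top reference sample of each column and the
    below-left reference sample.
    """
    pred = np.zeros((size, size), dtype=int)
    top_right = ref_top[size]
    bottom_left = ref_left[size]
    for y in range(size):
        for x in range(size):
            hor = (size - 1 - x) * ref_left[y] + (x + 1) * top_right
            ver = (size - 1 - y) * ref_top[x] + (y + 1) * bottom_left
            pred[y, x] = (hor + ver + size) // (2 * size)
    return pred

ref_l = np.arange(1, 18)   # 2*size + 1 reference samples for size 8
ref_t = np.arange(1, 18)
print(dc_prediction(ref_l, ref_t, 8)[0, 0])
print(planar_prediction(ref_l, ref_t, 8)[0, :4])
```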
- the encoder / decoder may perform post-processing filtering to alleviate the discontinuity between the reference samples and the prediction block boundary for blocks predicted in the horizontal direction, the vertical direction, and the DC mode. Thereafter, the decoder can reconstruct a block encoded by intra prediction by summing the prediction block and the inverse-transformed residual signal in the pixel domain.
- The prediction mode information is transmitted to the decoder, and MPM (Most Probable Mode) can be used for efficient encoding of the prediction mode.
- MPM Most Probable Mode
- the MPM starts from the assumption that the intra prediction mode of the current processing block will be the same as or similar to the prediction mode of a previously intra-predicted neighboring block. This will be described with reference to the following drawing.
- FIG. 11 is a diagram for explaining a method of constructing an MPM (Most Probable Mode) using a prediction mode of a neighboring block, to which the present invention is applied.
- the encoder / decoder may use a prediction mode of a neighboring block to construct an MPM candidate list.
- the maximum number of MPM candidates constituting the MPM candidate list is 6.
- the present invention is not limited to this.
- the number of MPM candidates applied to the proposed method may be 3, 4, or 5, or may be 7 or more.
- the encoder / decoder may construct an MPM candidate list of six candidates (which may be referred to as an MPM list, MPM candidates, an MPM candidate group, etc. in the present invention) using the prediction modes of the neighboring blocks.
- the encoder / decoder can construct an MPM candidate list in the order of Left (L), Above (A), Planar, DC, Below left (BL), Above right (AR) and Above left (AL).
- In addition, a prediction mode defined as a default mode may be added to the MPM list, as shown in FIG. 11.
- the default mode represents a predicting mode (or a prediction mode group) which is preferentially considered, and may include statistically selected prediction modes.
- the default mode may be configured to include a total of six prediction modes: 50, 18, 2, 34, 60, and 65 prediction modes.
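A sketch of the candidate-list construction described above, using the order L, A, Planar, DC, BL, AR, AL and then the default modes; the mode numbering (0 = planar, 1 = DC) and the helper names are assumptions of this sketch.

```python
PLANAR, DC = 0, 1
DEFAULT_MODES = [50, 18, 2, 34, 60, 65]   # default modes listed in the text

def build_mpm_candidate_list(neighbor_modes, max_candidates=6):
    """Build an MPM candidate list from neighboring intra prediction modes.

    neighbor_modes : dict with optional keys 'L', 'A', 'BL', 'AR', 'AL';
                     a missing key means the neighbor is unavailable or not intra coded.
    """
    mpm = []

    def add(mode):
        if mode is not None and mode not in mpm and len(mpm) < max_candidates:
            mpm.append(mode)

    # neighboring modes and non-directional modes in the predefined order
    for key in ("L", "A"):
        add(neighbor_modes.get(key))
    add(PLANAR)
    add(DC)
    for key in ("BL", "AR", "AL"):
        add(neighbor_modes.get(key))

    # fill any remaining slots with the default modes
    for mode in DEFAULT_MODES:
        add(mode)
    return mpm

print(build_mpm_candidate_list({"L": 18, "A": 50}))
# [18, 50, 0, 1, 2, 34] -> padded with default modes
```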
- If the intra prediction mode of the current block is a prediction mode existing in the MPM candidate list, index information indicating the prediction mode to be applied to the current intra prediction within the MPM list is transmitted. Therefore, compared with directly signaling one of all prediction modes (in which case a total of 7 bits is required), the number of bits used for prediction mode signaling can be saved.
- the index of the MPM may be binarized in a truncated unary manner.
- In addition, encoding / decoding can be performed by separating the MPM modes into three context tables, for the horizontal direction, the vertical direction, and the non-directional modes, according to the direction of the MPM mode.
- If the prediction mode does not exist in the MPM candidate list, one of the modes excluding the six MPM candidates will be applied to the current block, so that the prediction mode information can be accurately transmitted by signaling only among the remaining 61 prediction modes.
- When the current block to be decoded is coded in an intra mode, the decoder decodes the residual signal from the video signal transmitted from the encoder. At this time, the decoder performs entropy decoding on the symbols based on their probabilities, and then performs inverse quantization and inverse transform to restore the residual signal of the pixel domain. Meanwhile, the intra-prediction unit 262 of the decoder generates a prediction block using the prediction mode transmitted from the encoder and the already reconstructed neighboring reference samples. Thereafter, the prediction signal and the decoded residual signal are summed to reconstruct the intra-predicted block.
- As described above, the MPM candidate list is constructed using the prediction modes of neighboring blocks in a predetermined order (for example, Left (L), Above (A), Planar, DC, Below left (BL), Above right (AR), Above left (AL)).
- This order takes into account the spatial redundancy characteristic, which is the starting point of intra prediction, and follows from the assumption that the already reconstructed modes around the block are similar to that of the current block.
- the binarization of the MPM index can be performed by a truncated unary binarization method as shown in Table 4 below.
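As an illustration of the truncated unary binarization mentioned above (Table 4 itself is not reproduced, and the exact bin convention may differ), an MPM index i can be coded as i ones followed by a terminating zero, with the zero omitted for the last index:

```python
def truncated_unary(index, max_index):
    """Truncated unary binarization of an MPM index.

    Index i < max_index is coded as i ones followed by a terminating zero;
    the last index max_index omits the terminating zero.
    """
    if index < max_index:
        return "1" * index + "0"
    return "1" * max_index

# MPM indices 0..5 for a six-candidate list (max_index = 5)
print([truncated_unary(i, 5) for i in range(6)])
# ['0', '10', '110', '1110', '11110', '11111']
```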
- Accordingly, the present invention proposes a method of generating an optimized MPM candidate list by addressing the above limitation and constructing the MPM candidates in order of increasing distortion.
- FIG. 12 is a flowchart illustrating an image encoding method using an MPM (Most Probable Mode) according to an embodiment of the present invention.
- the encoder generates an MPM candidate list based on the intra prediction mode of a block neighboring the current block (S1201). In FIG. 12, it is assumed that the maximum number of MPM candidates constituting the MPM candidate list is 6. However, the present invention is not limited thereto. For example, the number of MPM candidates applied to the proposed method may be 3, 4, or 5, or may be 7 or more. As an example, the encoder can generate the MPM candidate list by applying the method described in FIG. 11.
- the encoder encodes the MPM flag indicating whether the current block is encoded using the MPM (S1202).
- the encoder aligns the order of the MPM candidate list using the template area of the current block (S1203).
- In the same manner in the encoder and the decoder, the MPM candidate list can be sorted based on the degree of similarity between the prediction block (or predicted area, predicted samples) of the template area of the current block, generated based on each candidate (or prediction mode) in the MPM candidate list, and the reconstruction block (or reconstructed area, reconstructed samples) of the template area. That is, step S1203 may include sorting the order of the MPM candidate list based on a difference value between the prediction block of the template region, generated based on an intra prediction mode included in the MPM candidate list and the surrounding reference samples of the template region, and the reconstruction block of the template region.
- the encoder can sort the candidates included in the MPM candidate list in ascending order of the difference value.
- For example, the encoder generates a prediction block of the template area using each derived candidate mode, sorts the MPM candidates in ascending order of the sum of absolute transformed differences (SATD) between the prediction block and the reconstruction block of the template area, and can then assign indices in that order.
- SATD sum of absolute transformed differences
- SAD sum of absolute difference
- SSE sum of squared error
- the encoder and decoder can use the already reconstructed information around for the calculation of the degree of distortion.
- In other words, the encoder can specify the template using the reconstructed samples around the current block, and can then sort the order of the MPM candidates in ascending order of the SATD (or SAD, SSE) derived using the template.
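The proposed template-based sorting can be sketched as follows. The prediction of the template region for each candidate mode is abstracted behind a placeholder callable (`predict_template`, an assumption of this sketch), and SAD is used as the distortion measure; SATD or SSE could be substituted.

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two sample arrays."""
    return int(np.sum(np.abs(a.astype(int) - b.astype(int))))

def sort_mpm_candidates(mpm_list, template_recon, predict_template, cost=sad):
    """Sort MPM candidates by the distortion measured on the template region.

    mpm_list         : list of candidate intra prediction modes
    template_recon   : already reconstructed samples of the template region
    predict_template : callable(mode) -> predicted samples of the template region,
                       generated from the template's own surrounding reference
                       samples (placeholder for this sketch)
    cost             : distortion measure (SAD here; SATD or SSE may be used)
    The candidate with the smallest distortion receives MPM index 0.
    """
    costs = {mode: cost(predict_template(mode), template_recon) for mode in mpm_list}
    return sorted(mpm_list, key=lambda mode: costs[mode])

# Illustrative usage with a dummy template predictor
template_recon = np.full((4, 16), 100)
def predict_template(mode):
    # placeholder: flat prediction whose level depends on the mode
    return np.full((4, 16), 100 + abs(mode - 50))

mpm = [18, 50, 0, 1, 2, 34]
print(sort_mpm_candidates(mpm, template_recon, predict_template))
# [50, 34, 18, 2, 1, 0] -> modes closer to 50 give smaller template distortion here
```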
- the encoder encodes an MPM index indicating the intra-prediction mode in the sorted MPM candidate list in step S1203 (S1204).
- the encoder may generate a prediction block of the current block based on an intra prediction mode of the current block.
- the encoder may determine the order of the MPM candidates in the MPM candidate list and signal the determined order to the decoder.
- the encoder can signal the order of the MPM candidate list (i.e., the MPM order) in units of a sequence, picture, slice, tile, coding tree unit (CTU), or coding unit (CU). That is, the encoder can calculate the degree of distortion in the same manner as in the above-described embodiment, determine the optimum order, and transmit it to the decoder.
- In addition, the encoder may determine the order of the MPM candidates based on the degree of distortion with respect to the original image using various other methods, and may signal the determined order to the decoder.
- the MPM order between the encoder and the decoder may be predetermined based on the location of the current processing block (or coding block) in the image.
- the MPM order can be determined based on the coordinates of the current block in the image or on the specific area containing the current block.
- the MPM order may be determined according to the coordinate information of the image, or a specific area including a current block such as a picture, a slice, and a tile.
- the order of the MPMs may be preset based on statistics for the optimal mode depending on the location of the block within the image, slice, or tile.
- In other words, the encoder can determine the MPM order according to the coordinates of the currently coded block in the image, or according to the region containing it, using various methods.
- FIG. 13 is a flowchart illustrating an image decoding method using MPM (Most Probable Mode) according to an embodiment of the present invention.
- the decoder decodes the MPM flag indicating whether or not the current block is encoded using MPM (Most Probable Mode) (S1301).
- If the current block is coded using the MPM, the decoder generates an MPM candidate list based on the intra prediction mode of a block neighboring the current block (S1302). In FIG. 13, it is assumed that the maximum number of MPM candidates constituting the MPM candidate list is 6. However, the present invention is not limited thereto. For example, the number of MPM candidates applied to the proposed method may be 3, 4, or 5, or may be 7 or more. For example, the decoder can generate the MPM candidate list by applying the method described in FIG. 11.
- the decoder arranges the order of the MPM candidate list using the template area of the current block (S1303).
- In the same manner as the encoder, the decoder can sort the MPM candidate list based on the degree of similarity between the prediction block (or predicted area, predicted samples) of the template area of the current block, generated based on each candidate (or prediction mode) in the MPM candidate list, and the reconstruction block (or reconstructed area, reconstructed samples) of the template area.
- That is, step S1303 may include sorting the order of the MPM candidate list based on a difference value between the prediction block of the template region, generated based on an intra prediction mode included in the MPM candidate list and the surrounding reference samples of the template region, and the reconstruction block of the template region.
- the decoder can sort the candidates included in the MPM candidate list in ascending order of the difference value.
- For example, the decoder generates a prediction block of the template region using each derived candidate mode, sorts the MPM candidates in ascending order of the sum of absolute transformed differences (SATD) between the prediction block and the reconstruction block of the template region, and can then assign indices in that order.
- SATD sum of absolute transformed differences
- SAD sum of absolute difference
- SSE sum of squared error
- the encoder and decoder can use the already reconstructed information around for the calculation of the degree of distortion.
- In other words, the decoder can specify the template using the reconstructed samples around the current block, and can then sort the order of the MPM candidates in ascending order of the SATD (or SAD, SSE) derived using the template.
- the decoder decodes the MPM index indicating the prediction mode applied to intra-prediction of the current block in the sorted MPM candidate list in step S1303 (S1304).
- the decoder generates a prediction block of the current block based on the prediction mode specified by the MPM index (S1305).
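Illustratively (this is not the normative decoding process), the decoder-side steps S1301 to S1304 could be strung together as in the following sketch, reusing `build_mpm_candidate_list`, `sort_mpm_candidates`, `template_recon`, and `predict_template` from the earlier sketches; the bitstream accessors are hypothetical stand-ins for the actual entropy decoding.

```python
class StubBitstream:
    """Trivial stand-in for entropy decoding, for illustration only."""
    def __init__(self, mpm_flag, mpm_index):
        self.mpm_flag, self.mpm_index = mpm_flag, mpm_index
    def read_flag(self):
        return self.mpm_flag
    def read_mpm_index(self):
        return self.mpm_index

def decode_intra_mode(bitstream, neighbor_modes, template_recon, predict_template):
    """Sketch of the decoder flow of FIG. 13 (reuses helpers from the earlier sketches)."""
    if not bitstream.read_flag():                                      # S1301: MPM flag
        raise NotImplementedError("non-MPM mode signalling not sketched here")
    mpm = build_mpm_candidate_list(neighbor_modes)                     # S1302
    mpm = sort_mpm_candidates(mpm, template_recon, predict_template)   # S1303
    return mpm[bitstream.read_mpm_index()]                             # S1304: mode for prediction

mode = decode_intra_mode(StubBitstream(True, 1), {"L": 18, "A": 50},
                         template_recon, predict_template)
print(mode)  # the second-best candidate after template-based sorting
```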
- the encoder may determine the order of the MPM candidates in the MPM candidate list and signal the determined order to the decoder.
- the decoder may receive the order of the MPM candidate list (i.e., the MPM order) from the encoder in units of a sequence, picture, slice, tile, coding tree unit (CTU), or coding unit (CU). That is, the encoder can calculate the degree of distortion in the same manner as in the above-described embodiment, determine the optimum order, and transmit it to the decoder.
- In addition, the encoder may determine the order of the MPM candidates based on the degree of distortion with respect to the original image using various other methods, and may signal the determined order to the decoder.
- the MPM order between the encoder and the decoder may be predetermined based on the location of the current processing block (or coding block) in the image.
- the MPM order can be determined based on the coordinates of the current block in the image or on the specific area containing the current block.
- the MPM order may be determined according to the coordinate information of the image, or a specific area including a current block such as a picture, a slice, and a tile.
- the order of the MPMs may be preset based on statistics for the optimal mode depending on the location of the block within the image, slice, or tile.
- In other words, the decoder can determine the MPM order according to the coordinates of the currently coded block in the image, or according to the region containing it, in the same manner as the encoder, using various methods.
- FIG. 14 is a diagram for explaining a template region according to an embodiment to which the present invention is applied.
- the encoder / decoder can arrange the order of the MPM candidate list using a template area as shown in FIG.
- the encoder / decoder can arrange the MPM candidate list based on the similarity between the prediction block of the template area and the restoration block of the template area, which is generated based on the prediction mode in the MPM candidate list as shown in Fig. That is, the template region may be a specific region that is predetermined among the reconstructed neighboring regions of the current block or the reference block.
- the size of the template area may be set to a restored area of a specific size adjacent to the left and top of the current block (or reference block, target block).
- For example, the template region may be an already reconstructed L x N sample region on the left side of the current block and an already reconstructed N x L sample region on the upper side, as shown in FIG. 14.
- the encoder / decoder can order the MPM candidates based on distortion of the template region.
- the encoder / decoder can assign a prediction mode index (or MPM index) based on the difference value of the template area for each MPM candidate.
- the encoder / decoder may also generate a prediction block of the template region using the surrounding (or neighboring) reference samples of the template based on the candidate prediction mode.
- The reference samples used may be, for example, the reference samples of the template shown in FIG. 14.
- In one embodiment, the encoder / decoder can set a different template region for a processing block located adjacent to the CTU boundary within the CTU. For example, if the current block is located adjacent to the CTU boundary within the CTU, the template region may be preset to a region of a certain size adjacent to the left side only, unlike that shown in FIG. 14.
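As a sketch of the template layout described above (the coordinate convention, the meaning of `thickness`, and the CTU-boundary behaviour here are illustrative assumptions of this sketch):

```python
def template_regions(x0, y0, width, height, thickness, at_ctu_boundary=False):
    """Return the (x, y, w, h) rectangles of the left and top template regions.

    (x0, y0) is the top-left corner of the current block. The left template is a
    thickness x height region and the top template is a width x thickness region;
    when the block lies on the CTU boundary (assumed here to be the upper one),
    only the left template is used, as described above.
    """
    left = (x0 - thickness, y0, thickness, height)
    if at_ctu_boundary:
        return [left]
    top = (x0, y0 - thickness, width, thickness)
    return [left, top]

print(template_regions(32, 32, 16, 16, thickness=4))
# [(28, 32, 4, 16), (32, 28, 16, 4)]
```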
- 15 is a diagram specifically illustrating an intra predictor according to an embodiment of the present invention.
- Although the intra prediction unit is shown as one block in FIG. 15 for convenience of explanation, the intra prediction unit may be implemented as a configuration included in the encoder and / or the decoder.
- the intra prediction unit implements the functions, procedures, and / or methods proposed in FIGS. 7 to 14 above.
- the intra prediction unit may include an MPM candidate list generation unit 1501, an MPM flag encoding unit 1502, an MPM candidate list sorting unit 1503, and an MPM index encoding unit 1504.
- the MPM candidate list generation unit 1501 generates an MPM candidate list based on an intra prediction mode of a block neighboring the current block. In FIG. 15, it is assumed that the maximum number of MPM candidates constituting the MPM candidate list is 6. However, the present invention is not limited thereto. For example, the number of MPM candidates applied to the proposed method may be 3, 4, or 5, or may be 7 or more. For example, the MPM candidate list generation unit 1501 can generate an MPM candidate list by applying the method described in FIG. 11.
- the MPM flag encoding unit 1502 encodes an MPM flag indicating whether or not the current block is encoded using the MPM.
- the MPM candidate list sorting unit 1503 arranges the order of the MPM candidate list using the template region of the current block.
- In the same manner in the encoder and the decoder, the MPM candidate list can be sorted based on the degree of similarity between the prediction block (or predicted area, predicted samples) of the template area of the current block, generated based on each candidate (or prediction mode) in the MPM candidate list, and the reconstruction block (or reconstructed area, reconstructed samples) of the template area.
- As an embodiment, the MPM candidate list sorting unit 1503 can sort the order of the MPM candidate list based on a difference value between the prediction block of the template region, generated based on an intra prediction mode included in the MPM candidate list and the surrounding reference samples of the template region, and the reconstruction block of the template region. At this time, the MPM candidate list sorting unit 1503 can sort the candidates included in the MPM candidate list in ascending order of the difference value.
- For example, the encoder generates a prediction block of the template area using each derived candidate mode, sorts the MPM candidates in ascending order of the sum of absolute transformed differences (SATD) between the prediction block and the reconstruction block of the template area, and can then assign indices in that order.
- SATD sum of absolute transformed differences
- SAD sum of absolute difference
- SSE sum of squared error
- the encoder and decoder can use the already reconstructed surrounding information for the calculation of the degree of distortion. The order of the MPM candidates can be sorted in ascending order of the SATD (or SAD, SSE) derived using a template composed of the already reconstructed samples around the current block.
- the MPM index encoding unit 1504 encodes an MPM index indicating the intra prediction mode within the sorted MPM candidate list.
- the intra prediction unit may generate a prediction block of the current block based on an intra prediction mode of the current block.
- the intra-prediction unit may determine the order of the MPM candidates in the MPM candidate list and signal the determined order to a decoder.
- the intra prediction unit may signal the order of the MPM candidate list (i.e., the MPM order) in units of a sequence, picture, slice, tile, coding tree unit (CTU), or coding unit (CU). That is, the intra prediction unit can calculate the degree of distortion in the same manner as in the above-described embodiment, determine the optimum order, and transmit it to the decoder.
- In addition, the intra prediction unit may determine the order of the MPM candidates based on the degree of distortion with respect to the original image using various other methods, and may signal the determined order to the decoder.
- the MPM order between the encoder and the decoder may be predetermined based on the location of the current processing block (or coding block) in the image.
- the MPM order can be determined based on the coordinates of the current block in the image or on the specific area containing the current block.
- the MPM order may be determined according to the coordinate information of the image, or a specific area including a current block such as a picture, a slice, and a tile.
- the order of the MPMs may be preset based on statistics for the optimal mode depending on the location of the block within the image, slice, or tile.
- the intra predictor can determine the MPM sequence according to the coordinates or the area of the current coding block in the video using various methods.
- 16 is a diagram specifically illustrating an intra predictor according to an embodiment of the present invention.
- Although the intra prediction unit is shown as one block in FIG. 16 for convenience of explanation, the intra prediction unit may be implemented as a configuration included in an encoder and / or a decoder.
- the intra prediction unit implements the functions, procedures and / or methods proposed in FIGS. 7 to 14 above.
- the intra prediction unit may include an MPM flag decoding unit 1601, an MPM candidate list generation unit 1602, an MPM candidate list sorting unit 1603, an MPM index decoding unit 1604, and a prediction block generation unit 1605.
- the MPM flag decoding unit 1601 decodes the MPM flag indicating whether the current block is encoded using the MPM (Most Probable Mode).
- the MPM candidate list generation unit 1602 generates an MPM candidate list based on an intra prediction mode of a block neighboring the current block. In FIG. 16, it is assumed that the maximum number of MPM candidates constituting the MPM candidate list is 6. However, the present invention is not limited thereto. For example, the number of MPM candidates applied to the proposed method may be 3, 4, or 5, or may be 7 or more. For example, the MPM candidate list generation unit 1602 can generate an MPM candidate list by applying the method described in FIG. 11.
- the MPM candidate list sorting unit 1603 arranges the order of the MPM candidate list using the template area of the current block.
- In the same manner in the encoder and the decoder, the MPM candidate list can be sorted based on the degree of similarity between the prediction block (or predicted area, predicted samples) of the template area of the current block, generated based on each candidate (or prediction mode) in the MPM candidate list, and the reconstruction block (or reconstructed area, reconstructed samples) of the template area. That is, as an embodiment, the MPM candidate list sorting unit 1603 can sort the order of the MPM candidate list based on a difference value between the prediction block of the template region, generated based on an intra prediction mode included in the MPM candidate list and the surrounding reference samples of the template region, and the reconstruction block of the template region.
- At this time, the MPM candidate list sorting unit 1603 may sort the candidates included in the MPM candidate list in ascending order of the difference value.
- For example, the MPM candidate list sorting unit 1603 generates a prediction block of the template region using each derived candidate mode, sorts the MPM candidates in ascending order of the sum of absolute transformed differences (SATD) between the prediction block and the reconstruction block of the template region, and can then assign indices in that order.
- SATD sum of absolute transformed differences
- SAD sum of absolute difference
- SSE sum of squared error
- the encoder and decoder can use the already reconstructed information around for the calculation of the degree of distortion.
- In other words, the MPM candidate list sorting unit 1603 can specify a template using the already reconstructed samples around the current block and can sort the order of the MPM candidates in ascending order of the SATD (or SAD, SSE) derived using the template.
- the MPM index decoding unit 1604 decodes an MPM index indicating a prediction mode applied to intra prediction of the current block in the aligned MPM candidate list.
- the prediction block generation unit 1605 generates a prediction block of the current block based on the prediction mode specified by the MPM index.
- the encoder may determine the order of the MPM candidates in the MPM candidate list and signal the determined order to the decoder.
- the decoder may receive the order of the MPM candidate list (i.e., the MPM order) from the encoder in units of a sequence, picture, slice, tile, coding tree unit (CTU) or coding unit (CU). That is, the encoder can calculate the degree of distortion in the same manner as in the above-described embodiment, determine the optimum order, and signal it to the decoder.
- alternatively, the encoder may determine the MPM order based on the degree of distortion with respect to the original image using various methods, and signal the determined order to the decoder.
- the MPM order between the encoder and the decoder may be predetermined based on the location of the current processing block (or coding block) in the image.
- the MPM order can be determined based on the coordinates of the current block in the image or on the specific area containing the current block.
- the MPM order may be determined according to coordinate information within the image, or according to a specific area that includes the current block, such as a picture, a slice, or a tile.
- the order of the MPMs may be preset based on statistics for the optimal mode depending on the location of the block within the image, slice, or tile.
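- As one purely hypothetical realization of such a position-dependent preset, the encoder and decoder could share a small lookup keyed by the block's region within the picture; the regions and permutations below are illustrative assumptions, not values taken from this document.

```python
# Hypothetical preset permutations of the six MPM positions, assumed to have
# been chosen offline from statistics of the optimal mode per picture region.
PRESETS = {
    "top_region":  [1, 0, 2, 3, 4, 5],   # e.g. a permutation chosen for blocks near the top
    "left_region": [0, 1, 3, 2, 4, 5],   # e.g. a permutation chosen for blocks near the left edge
    "default":     [0, 1, 2, 3, 4, 5],
}

def preset_mpm_order(x0, y0, pic_width, pic_height):
    """Pick a preset MPM reordering from the block position (illustrative)."""
    if y0 < pic_height // 8:
        return PRESETS["top_region"]
    if x0 < pic_width // 8:
        return PRESETS["left_region"]
    return PRESETS["default"]

def apply_preset(mpm_candidates, order):
    """Apply a preset permutation to an MPM candidate list of matching length."""
    return [mpm_candidates[i] for i in order]
```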
- the decoder can determine the MPM order according to the coordinates of the current coding block in the image, or according to the area containing the block, in the same manner as the encoder, using various methods.
- FIG. 17 shows a video coding system to which the present invention is applied.
- the video coding system may include a source device and a receiving device.
- the source device may deliver the encoded video / image information or data in the form of a file or stream to a receiving device via a digital storage medium or network.
- the source device may include a video source, an encoding apparatus, and a transmitter.
- the receiving device may include a receiver, a decoding apparatus, and a renderer.
- the encoding apparatus may be referred to as a video / image encoding apparatus, and the decoding apparatus may be referred to as a video / image decoding apparatus.
- the transmitter may be included in the encoding device.
- the receiver may be included in the decoding apparatus.
- the renderer may include a display unit, and the display unit may be composed of a separate device or an external component.
- a video source can acquire a video / image through a process of capturing, compositing, or generating the video / image.
- the video source may include a video / video capture device and / or a video / video generation device.
- the video / image capture device may include, for example, one or more cameras, video / image archives including previously captured video / images, and the like.
- the video / image generation device may include, for example, a computer, a tablet, a smart phone, and the like, and may (electronically) generate a video / image.
- a virtual video / image may be generated through a computer or the like. In this case, the video / image capturing process may be replaced by the process of generating the related data.
- the encoding device may encode the input video / image.
- the encoding apparatus can perform a series of procedures such as prediction, transform, and quantization for compression and coding efficiency.
- the encoded data (encoded video / image information) can be output in the form of a bitstream.
- the transmitting unit may transmit the encoded video / image information or data output in the form of a bit stream to a receiving unit of the receiving device through a digital storage medium or a network in the form of a file or a stream.
- the digital storage medium may include various storage media such as USB, SD, CD, DVD, Blu-ray, HDD, SSD and the like.
- the transmission unit may include an element for generating a media file through a predetermined file format, and may include an element for transmission over a broadcast / communication network.
- the receiving unit may extract the bitstream and transmit it to the decoding apparatus.
- the decoding apparatus may perform a series of procedures such as inverse quantization, inverse transformation, and prediction corresponding to the operation of the encoding apparatus to decode the video / image.
- the renderer may render the decoded video / image.
- the rendered video / image can be displayed through the display unit.
- FIG. 18 shows a structure of a contents streaming system as an embodiment to which the present invention is applied.
- the content streaming system to which the present invention is applied may include an encoding server, a streaming server, a web server, a media repository, a user device, and a multimedia input device.
- the encoding server compresses content input from multimedia input devices such as a smart phone, a camera, and a camcorder into digital data to generate a bitstream, and transmits the bitstream to the streaming server.
- when multimedia input devices such as a smart phone, a camera, or a camcorder directly generate a bitstream, the encoding server may be omitted.
- the bitstream may be generated by an encoding method or a bitstream generating method to which the present invention is applied, and the streaming server may temporarily store the bitstream in the process of transmitting or receiving the bitstream.
- the streaming server transmits multimedia data to a user device based on a user request through the web server, and the web server serves as a medium for informing the user of what services are available.
- when the user requests a desired service from the web server, the web server delivers the request to the streaming server, and the streaming server transmits the multimedia data to the user.
- the content streaming system may include a separate control server. In this case, the control server controls commands / responses among the devices in the content streaming system.
- the streaming server may receive content from a media repository and / or an encoding server. For example, when receiving the content from the encoding server, the content can be received in real time. In this case, in order to provide a smooth streaming service, the streaming server can store the bit stream for a predetermined time.
- Examples of the user device include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, a wearable device (e.g., a smartwatch, smart glasses, a head mounted display (HMD)), a digital TV, a desktop computer, and digital signage.
- PDA: personal digital assistant
- PMP: portable multimedia player
- Each of the servers in the content streaming system can be operated as a distributed server. In this case, data received at each server can be distributed.
- the embodiments described in the present invention can be implemented and executed on a processor, a microprocessor, a controller, or a chip.
- the functional units depicted in the figures may be implemented and executed on a computer, processor, microprocessor, controller, or chip.
- the decoder and encoder to which the present invention is applied may be included in multimedia communication devices such as a multimedia broadcasting transmitting and receiving device, a mobile communication terminal, a home cinema video device, a digital cinema video device, a surveillance camera, a video chatting device, a storage medium, a camcorder, a video on demand (VoD) service provision device, an OTT (over-the-top) video device, a three-dimensional (3D) video device, a video telephony video device, a medical video device, and the like, and may be used to process video signals or data signals.
- the OTT (over-the-top) video device may include a game console, a Blu-ray player, an Internet access TV, a home theater system, a smart phone, a tablet PC, a digital video recorder (DVR), and the like.
- the processing method to which the present invention is applied may be produced in the form of a computer-executed program, and may be stored in a computer-readable recording medium.
- the multimedia data having the data structure according to the present invention can also be stored in a computer-readable recording medium.
- the computer-readable recording medium includes all kinds of storage devices and distributed storage devices in which computer-readable data is stored.
- the computer-readable recording medium may include, for example, a Blu-ray Disc (BD), a Universal Serial Bus (USB), a ROM, a PROM, an EPROM, an EEPROM, a RAM, a CD-ROM, and other data storage devices.
- the computer-readable recording medium also includes media implemented in the form of a carrier wave (for example, transmission over the Internet).
- the bit stream generated by the encoding method can be stored in a computer-readable recording medium or transmitted over a wired or wireless communication network.
- an embodiment of the present invention may be embodied as a computer program product by program code, and the program code may be executed in a computer according to an embodiment of the present invention.
- the program code may be stored on a carrier readable by a computer.
- Embodiments in accordance with the present invention may be implemented by various means, for example, hardware, firmware, software, or a combination thereof.
- an embodiment of the present invention may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and the like.
- ASICs: application specific integrated circuits
- DSPs: digital signal processors
- DSPDs: digital signal processing devices
- PLDs: programmable logic devices
- an embodiment of the present invention may be implemented in the form of a module, a procedure, a function, or the like which performs the functions or operations described above.
- the software code can be stored in memory and driven by the processor.
- the memory is located inside or outside the processor and can exchange data with the processor by various means already known.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
The present invention relates to an image processing method based on an intra prediction mode, and to an associated device. In particular, a method for decoding an image on the basis of an intra prediction mode may comprise: a step of decoding a most probable mode (MPM) flag indicating whether a current block has been encoded using an MPM; a step of generating an MPM candidate list on the basis of an intra prediction mode of a block neighboring the current block, if the current block has been encoded using the MPM; a step of sorting the order of the MPM candidate list using a template region of the current block; a step of decoding an MPM index indicating a prediction mode, within the sorted MPM candidate list, to be applied to the intra prediction of the current block; and a step of generating a prediction block of the current block on the basis of the prediction mode specified by the MPM index.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201762590342P | 2017-11-23 | 2017-11-23 | |
| US62/590,342 | 2017-11-23 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019103542A1 true WO2019103542A1 (fr) | 2019-05-31 |
Family
ID=66631693
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2018/014559 Ceased WO2019103542A1 (fr) | 2017-11-23 | 2018-11-23 | Procédé de traitement d'image fondé sur un mode de prédiction intra, et dispositif associé |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2019103542A1 (fr) |
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20140008503A (ko) * | 2012-07-10 | 2014-01-21 | 한국전자통신연구원 | 영상 부호화/복호화 방법 및 장치 |
| US20160127725A1 (en) * | 2014-10-31 | 2016-05-05 | Ecole De Technologie Superieure | Method and system for fast mode decision for high efficiency video coding |
| WO2017105097A1 (fr) * | 2015-12-17 | 2017-06-22 | 삼성전자 주식회사 | Procédé de décodage vidéo et appareil de décodage vidéo utilisant une liste de candidats de fusion |
| KR20170100211A (ko) * | 2016-02-25 | 2017-09-04 | 주식회사 케이티 | 비디오 신호 처리 방법 및 장치 |
| WO2017188652A1 (fr) * | 2016-04-26 | 2017-11-02 | 인텔렉추얼디스커버리 주식회사 | Procédé et dispositif destinés au codage/décodage d'image |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240214560A1 (en) * | 2021-09-10 | 2024-06-27 | Hyundai Motor Company | Video encoding/decoding method and apparatus |
| EP4207758A1 (fr) * | 2022-01-04 | 2023-07-05 | FG Innovation Company Limited | Dispositif et procédé de décodage de données vidéo |
| US12363289B2 (en) | 2022-01-04 | 2025-07-15 | Sharp Kabushiki Kaisha | Device and method for decoding video data |
| WO2024007158A1 (fr) * | 2022-07-05 | 2024-01-11 | Oppo广东移动通信有限公司 | Procédé de construction de liste de candidats, procédé, appareil et système de codage et de décodage vidéo |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019190181A1 (fr) | Procédé de codage d'image/de vidéo basé sur l'inter-prédiction et dispositif associé | |
| WO2020141879A1 (fr) | Procédé et dispositif de décodage de vidéo basé sur une prédiction de mouvement affine au moyen d'un candidat de fusion temporelle basé sur un sous-bloc dans un système de codage de vidéo | |
| WO2020017861A1 (fr) | Procédé d'inter-prédiction pour une prédiction d'informations de mouvement temporel dans une unité de sous-bloc, et dispositif associé | |
| WO2020145775A1 (fr) | Procédé et dispositif de codage d'image permettant la réalisation d'une prédiction intra fondée sur un mrl | |
| WO2020171632A1 (fr) | Procédé et dispositif de prédiction intra fondée sur une liste mpm | |
| WO2020256344A1 (fr) | Signalisation d'informations indiquant un ensemble de noyaux de transformée dans un codage d'image | |
| WO2021040402A1 (fr) | Codage d'image ou de vidéo basé sur un codage de palette | |
| WO2021137597A1 (fr) | Procédé et dispositif de décodage d'image utilisant un paramètre de dpb pour un ols | |
| WO2020256346A1 (fr) | Codage d'informations concernant un ensemble de noyaux de transformation | |
| WO2020149630A1 (fr) | Procédé et dispositif de décodage d'image basé sur une prédiction cclm dans un système de codage d'image | |
| WO2020145604A1 (fr) | Procédé et dispositif de codage vidéo basé sur une prédiction intra à l'aide d'une liste mpm | |
| WO2020180100A1 (fr) | Codage vidéo ou d'image basé sur un codage intra-bloc | |
| WO2019103542A1 (fr) | Procédé de traitement d'image fondé sur un mode de prédiction intra, et dispositif associé | |
| WO2021145725A1 (fr) | Dispositif et procédé de codage d'image basés sur la signalisation d'information relative au filtrage | |
| WO2020256506A1 (fr) | Procédé et appareil de codage/décodage vidéo utilisant une prédiction intra à multiples lignes de référence, et procédé de transmission d'un flux binaire | |
| WO2020145620A1 (fr) | Procédé et dispositif de codage d'image basé sur une prédiction intra utilisant une liste mpm | |
| WO2023075563A1 (fr) | Procédé et dispositif de codage/décodage de caractéristique et support d'enregistrement stockant un flux binaire | |
| WO2019199093A1 (fr) | Procédé de traitement d'image basé sur un mode d'intraprédiction, et dispositif associé | |
| WO2021137589A1 (fr) | Procédé et dispositif de décodage d'image | |
| WO2020180044A1 (fr) | Procédé de codage d'images basé sur un lmcs et dispositif associé | |
| WO2024080766A1 (fr) | Procédé de codage/décodage d'images sur la base d'une transformée non séparable, procédé de transmission de flux binaire et support d'enregistrement pour enregistrer un flux binaire | |
| WO2024010356A1 (fr) | Procédé et appareil de codage/décodage d'image, et support d'enregistrement dans lequel est stocké un flux binaire | |
| WO2021137590A1 (fr) | Procédé de décodage d'image lie à un codage d'unité nal ph et dispositif associé | |
| WO2021091255A1 (fr) | Procédé et dispositif de signalisation de syntaxe de haut niveau pour codage image/vidéo | |
| WO2021006651A1 (fr) | Procédé de codage d'image sur la base d'un filtrage de déblocage, et appareil associé |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18880308; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18880308; Country of ref document: EP; Kind code of ref document: A1 |