US20110128282A1 - Method for Generating the Depth of a Stereo Image - Google Patents
Method for Generating the Depth of a Stereo Image
- Publication number
- US20110128282A1 (application US12/780,074)
- Authority
- US
- United States
- Prior art keywords
- image
- pixels
- paths
- depths
- dynamic programming
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/97—Determining parameters from multiple pictures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
Definitions
- the disclosure relates in general to a method for generating image depth of a stereo image, and more particularly to a method for generating image depth of a stereo image through multiple paths with greater gradient.
- the belief propagation algorithm and the dynamic programming algorithm are commonly used in stereo matching technology. Take the technology disclosed in United States Patent Publication No. 2009/0129667 for example: although the image depths obtained by the belief propagation algorithm are more accurate, a larger memory and a longer computing time are required. Take the technology disclosed in U.S. Pat. No. 7,570,804 for example.
- the dynamic programming algorithm has the advantages of requiring less memory and less computing time. In the conventional method, which computes the image depth by the dynamic programming algorithm, the entire scan line (a single column or a single row) is optimized.
- the disclosure is directed to a method for generating image depth of a stereo image.
- a method for generating image depth of a stereo image includes the following steps. Firstly, a stereo image is received. Next, a number of paths with greater gradient are searched in the stereo image. Then, the image depths of a number of first pixels in the paths are generated. After that, the image depths of a number of second pixels not in the paths are generated according to the image depths of the first pixels.
- FIG. 1 is a flowchart of a method for generating image depth of a stereo image according to an embodiment of the disclosure
- FIGS. 2A-2D are schematic diagrams showing an example of obtaining a path by the greedy algorithm
- FIG. 3 is a diagram showing multiple paths
- FIG. 4 is a block diagram of a system used for performing the method for generating image depth of a stereo image of FIG. 1 .
- Referring to FIG. 1, a flowchart of a method for generating image depth of a stereo image according to an embodiment of the disclosure is shown.
- the method disclosed in the present embodiment of the disclosure includes the following steps. Firstly, the method begins at step 102 , a stereo image is received. Next, the method proceeds to step 104 , multiple paths with greater gradient are searched in the stereo image. Then, the method proceeds to step 106 , multiple image depths of the first pixels in the paths are generated. After that, the method proceeds to step 108 , according to the image depths of the first pixels, multiple image depths of the second pixels not in the paths are generated.
- the multiple paths with greater gradient preferably are paths with greater color change.
- the depths are more likely to be wrongly calculated in regions with smaller color change, and a calculation method using one row or one column as a unit tends to produce streak noise in the depth chart.
- paths with greater gradient, such as paths with greater color change, are searched in the image first, and the depths of the pixels in those paths are calculated. After that, the depths of the other pixels of the image are calculated by using other algorithms.
- the depths of the pixels obtained in the paths with greater color change have higher accuracy.
- the accuracy of the image depths is increased if the depths of the pixels in the paths, which have higher accuracy, are obtained first, and the depths of the pixels not in the paths are obtained afterwards.
- the occurrence of streak noise in the depths is effectively reduced, and the quality of the three-dimensional image generated according to the depth is increased.
- the received stereo image includes, for example, a left-eye two-dimensional image and a right-eye two-dimensional image.
- multiple paths with greater gradient can be searched according to one of the left-eye two-dimensional image and the right-eye two-dimensional image.
- the paths with greater gradient can be obtained by using the greedy algorithm or the dynamic programming algorithm, but the disclosure is not limited thereto.
- Referring to FIGS. 2A-2D, an example of obtaining a path by the greedy algorithm is shown.
- let the starting point of the path be pixel P1.
- the three pixels adjacent to the pixel P1 are candidate points, as indicated by the arrows.
- the pixel whose color or grey value differs from that of the pixel P1 the most is selected as the second pixel in the path.
- the selected pixel P2 is indicated in FIG. 2B.
- the pixel whose color or grey value differs from that of the pixel P2 the most is selected as the third pixel in the path, as indicated in FIG. 2C.
- the above step is repeated so as to obtain n points in the path, as indicated in FIG. 2D.
- a path L1 composed of P1, P2, . . . , Pn is thereby obtained.
- another path L2, as indicated in FIG. 3, is obtained by repeating the steps of FIGS. 2A-2D.
- other paths (illustrated in FIG. 3) are obtained in the same manner.
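- For illustration, the following Python sketch shows one way such a greedy path could be grown from a starting pixel: at each step, the candidate pixel in the next column whose grey value differs most from the current pixel is selected. The function name, the column-by-column scan direction, and the random test image are the editor's assumptions rather than details fixed by the patent.

```python
import numpy as np

def greedy_path(gray: np.ndarray, start_row: int) -> list[tuple[int, int]]:
    """Grow one path column by column: from the current pixel, pick the
    neighbouring pixel in the next column whose grey value differs the most
    (a sketch of the greedy selection illustrated in FIGS. 2A-2D)."""
    rows, cols = gray.shape
    y = start_row
    path = [(0, y)]                                   # (column, row), starting at P1
    for x in range(1, cols):
        prev_val = float(gray[y, x - 1])
        # Candidate rows in the next column: one up, same row, one down.
        candidates = [r for r in (y - 1, y, y + 1) if 0 <= r < rows]
        y = max(candidates, key=lambda r: abs(float(gray[r, x]) - prev_val))
        path.append((x, y))
    return path

# Example usage on a random grey image; repeating from other starting pixels
# yields further paths L2, L3, and so on.
gray = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
path_L1 = greedy_path(gray, start_row=240)
```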
- the energy function e1 of each pixel in an image is defined as:
- I denotes the brightness value of the pixel.
- the path s^y is defined as:
- (j, y(j)) denotes the coordinates of the pixels in the path, and m denotes the number of pixels included in one row of the image.
- the difference in the y coordinate between two successive pixels in the path is within one pixel.
- the path to be searched in the present embodiment of the disclosure is the path with the largest sum of the energy of all its pixels, and must conform to the following expression of s*:
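- The expressions for e1, s^y and s* are reproduced in the patent figures rather than in this text. A plausible reconstruction, assuming the seam-carving-style formulation suggested by the surrounding definitions (the exact expressions in the patent may differ), is:

```latex
e_1(I) = \left|\frac{\partial}{\partial x} I\right| + \left|\frac{\partial}{\partial y} I\right|,
\qquad
s^{y} = \{(j,\, y(j))\}_{j=1}^{m}, \quad \text{s.t. } \forall j,\ |y(j) - y(j-1)| \le 1,
\qquad
s^{*} = \arg\max_{s} \sum_{p \in s} e_1(p)
```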
- an accumulative energy function M(i, j) is defined as:
- M(i, j) = e(i, j) + max(M(i−1, j−1), M(i−1, j), M(i−1, j+1))
- the maximum value of M(i, j) can be found by using the dynamic programming algorithm, so as to obtain the entire path with the largest energy by backtracking.
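- A minimal Python sketch of this dynamic-programming search is given below: it fills the accumulative energy table M column by column using the recurrence above and then backtracks from the largest entry in the last column. The indexing convention (i as column, j as row) and the gradient-based energy in the usage example are the editor's assumptions.

```python
import numpy as np

def max_energy_path(e: np.ndarray) -> list[tuple[int, int]]:
    """Find the path with the largest accumulated energy by dynamic programming.

    e[j, i] is the per-pixel energy (row j, column i).  The path visits one pixel
    per column and moves at most one row up or down between adjacent columns,
    following M(i, j) = e(i, j) + max(M(i-1, j-1), M(i-1, j), M(i-1, j+1)).
    """
    rows, cols = e.shape
    M = np.zeros_like(e, dtype=np.float64)
    M[:, 0] = e[:, 0]
    # Forward pass: accumulate the maximum energy reachable at each pixel.
    for i in range(1, cols):
        for j in range(rows):
            lo, hi = max(j - 1, 0), min(j + 1, rows - 1)
            M[j, i] = e[j, i] + M[lo:hi + 1, i - 1].max()
    # Backtracking: start from the largest entry in the last column.
    j = int(M[:, -1].argmax())
    path = [(cols - 1, j)]
    for i in range(cols - 2, -1, -1):
        lo, hi = max(j - 1, 0), min(j + 1, rows - 1)
        j = lo + int(M[lo:hi + 1, i].argmax())
        path.append((i, j))
    return path[::-1]                                 # (column, row) pairs, left to right

# Example: use the absolute image gradient as the per-pixel energy e1.
gray = np.random.rand(120, 160)
e1 = np.abs(np.gradient(gray, axis=1)) + np.abs(np.gradient(gray, axis=0))
path = max_energy_path(e1)
```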
- the image depths of the first pixels in the paths are preferably obtained by the dynamic programming algorithm.
- the energy function of the dynamic programming algorithm includes, for example, a matching cost function and a penalty function.
- the present embodiment of the disclosure uses the following energy function:
- E_path(d(x, y)) = Σ_{(x, y)∈s*} C(x, y, d(x, y)) + Σ_{(x, y)∈s*} ρ(x, y) · λ(d(x, y) − d(x+1, y_{x+1}))
- C(x, y, d(x,y)) denotes the matching cost when the disparity of the pixel (x,y) equals d(x,y).
- ρ(x, y) and λ(d) are penalty functions that can be arbitrarily defined.
- I_Left(x, y) and I_Right(x, y) respectively denote the brightness values of the left-eye image pixel (x, y) and the right-eye image pixel (x, y), and k is a given constant.
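- The matching cost itself is shown only in the patent figures. A common truncated absolute-difference form consistent with the symbols listed above (offered here as an assumption, not as the patent's exact definition) is:

```latex
C(x, y, d) = \min\bigl(\,\lvert I_{Left}(x, y) - I_{Right}(x - d,\, y)\rvert,\; k\,\bigr)
```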
- E path is minimized by using the dynamic programming algorithm so as to obtain the image depths corresponding to all pixels in the path s*.
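- As an illustration of minimizing such an energy along a path, the sketch below runs a Viterbi-style dynamic program over disparity labels for the ordered pixels of one path. The truncated absolute-difference cost and the constant smoothness weight lam are the editor's stand-ins for the patent's C, ρ and λ, and max_disp, k and lam are hypothetical parameters.

```python
import numpy as np

def depths_along_path(left: np.ndarray, right: np.ndarray,
                      path: list[tuple[int, int]],
                      max_disp: int = 32, k: float = 20.0, lam: float = 2.0) -> np.ndarray:
    """Assign a disparity to every pixel of `path` by dynamic programming.

    The energy is a sum of truncated matching costs plus a penalty
    lam * |d_t - d_{t+1}| between consecutive path pixels (a stand-in for E_path).
    """
    n, num_d = len(path), max_disp + 1
    cost = np.empty((n, num_d))
    for t, (x, y) in enumerate(path):
        for d in range(num_d):
            xr = max(x - d, 0)                        # clamp at the image border
            cost[t, d] = min(abs(float(left[y, x]) - float(right[y, xr])), k)

    disp = np.arange(num_d)
    smooth = lam * np.abs(disp[:, None] - disp[None, :])    # |d - d'| transition penalty

    # Viterbi forward pass: best accumulated energy ending at (t, d).
    acc = cost.copy()
    back = np.zeros((n, num_d), dtype=int)
    for t in range(1, n):
        total = acc[t - 1][:, None] + smooth          # rows: previous label, cols: current
        back[t] = total.argmin(axis=0)
        acc[t] = cost[t] + total.min(axis=0)

    # Backtrack the minimizing disparity sequence.
    d = int(acc[-1].argmin())
    out = [d]
    for t in range(n - 1, 0, -1):
        d = int(back[t, d])
        out.append(d)
    return np.array(out[::-1])
```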
- multiple image depths of the second pixels not in the paths can be generated by using the bilateral filter or by using the dynamic programming algorithm.
- the second pixel is, for example, the pixel P1′ of FIG. 3.
- the method of generating the image depths by using the bilateral filter is disclosed below.
- the bilateral filter is a low-pass filter which preserves the details of image edges.
- the bilateral filter is used for generating the depth values of the pixels of the depth chart that are not in the paths, so as to produce a high-quality depth chart.
- p denotes the pixel to which the filter is applied
- Ω denotes a mask range centered at p
- I_pf denotes the color of the filtered pixel
- I_p and I_q respectively denote the colors of pixels p and q
- G_s and G_r denote two low-pass filters; the former operates in the pixel (spatial) domain, and the latter operates in the color (range) domain.
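- The filter equation itself appears only in the figures. The standard bilateral filter expression consistent with these symbols, including the usual normalization term W_p which the symbol list does not mention, is given here as an assumed reconstruction:

```latex
I_{pf} = \frac{1}{W_p} \sum_{q \in \Omega} G_s(\lVert p - q \rVert)\, G_r(\lvert I_p - I_q \rvert)\, I_q,
\qquad
W_p = \sum_{q \in \Omega} G_s(\lVert p - q \rVert)\, G_r(\lvert I_p - I_q \rvert)
```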
- the present embodiment of the disclosure uses the bilateral grid method disclosed in "Real-Time Edge-Aware Image Processing With The Bilateral Grid" by Chen, J., Paris, S., and Durand, F., ACM SIGGRAPH 2007 (San Diego, Calif., Aug. 5-9, 2007).
- Bilateral grid is a data structure which maps a two-dimensional image onto a three-dimensional space grid, wherein the mapping function is expressed as:
- r and s denote two adjustable parameters; (u, v) denotes the coordinates of a pixel in the two-dimensional image; I(u, v) denotes the brightness value of the pixel (u, v); (x, y, z) denotes the coordinate after the pixel (u, v) is mapped into the three-dimensional space grid.
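- The mapping function is likewise shown only in the figures; the usual bilateral-grid mapping consistent with the parameters r and s (again an assumed form) is:

```latex
(x, y, z) = \left(\left\lfloor \frac{u}{s} \right\rfloor,\; \left\lfloor \frac{v}{s} \right\rfloor,\; \left\lfloor \frac{I(u, v)}{r} \right\rfloor\right)
```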
- the mask range must be large enough, for example covering 1/36 to 1/4 of the image.
- the I(u, v) of the mapping function uses the brightness value of the source image, but the values stored in the grid are changed to (d, n), wherein d denotes the sum of the depth estimates of all pixels mapped into the grid cell, and n denotes the number of pixels mapped into the grid cell.
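- A compact Python sketch of this grid-based propagation is given below: sparse depth estimates from the path pixels are splatted into a coarse (x, y, z) grid as (d, n) pairs, the grid is blurred with a Gaussian, and each remaining pixel reads back d/n from its grid cell. The grid spacings s and r, the Gaussian blur, the 8-bit grey assumption, and the nearest-cell slicing are the editor's simplifications of the bilateral-grid method cited above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fill_depths_bilateral_grid(gray: np.ndarray, sparse_depth: np.ndarray,
                               s: int = 16, r: float = 16.0, sigma: float = 1.0) -> np.ndarray:
    """Propagate known depths (NaN = unknown) to all pixels via a bilateral grid.

    `gray` is an 8-bit grey image; each known pixel is splatted into cell
    (v // s, u // s, I // r) as (sum of depths d, count n), the grid is blurred,
    and unknown pixels read back d / n from their own cell.
    """
    h, w = gray.shape
    gx, gy, gz = h // s + 1, w // s + 1, int(255 // r) + 1
    d_grid = np.zeros((gx, gy, gz))
    n_grid = np.zeros((gx, gy, gz))

    vs, us = np.nonzero(~np.isnan(sparse_depth))              # pixels with known depth
    cells = (vs // s, us // s, (gray[vs, us] / r).astype(int))
    np.add.at(d_grid, cells, sparse_depth[vs, us])            # accumulate depth sums d
    np.add.at(n_grid, cells, 1.0)                             # accumulate counts n

    d_grid = gaussian_filter(d_grid, sigma)                   # low-pass in space and range
    n_grid = gaussian_filter(n_grid, sigma)

    # Slicing: every pixel looks up its own cell and takes d / n.
    vv, uu = np.mgrid[0:h, 0:w]
    zz = (gray / r).astype(int)
    n = n_grid[vv // s, uu // s, zz]
    d = d_grid[vv // s, uu // s, zz]
    out = np.where(n > 1e-6, d / np.maximum(n, 1e-6), np.nan)
    known = ~np.isnan(sparse_depth)
    out[known] = sparse_depth[known]                          # keep the original path depths
    return out
```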
- the unknown depths of the remaining second pixels can also be obtained by using the dynamic programming algorithm.
- the unknown depths of the remaining second pixels can be compensated by the scan-line optimization utilized in the conventional method.
- the depths of the second pixels not in the paths can be obtained according to the depths of the first pixels in the paths obtained in step 106.
- Multiple image depths of the second pixels are calculated along the row direction or along the column direction.
- to save computing time, bilateral filtering can be performed in parallel by dividing the stereo image into a number of blocks, treating each block as an operation unit, and performing the filtering operation on the blocks in parallel.
- the disclosure provides a system used for performing the method for generating image depth of a stereo image of FIG. 1 , wherein the block diagram of the system is indicated in FIG. 4 .
- the system 400 includes an image processing unit 402 and a storage unit 404 .
- the image processing unit 402 is for receiving the stereo image Im and performing steps 102-108 of FIG. 1
- the storage unit 404 is for storing the stereo image Im and the image depths of the first pixels and the second pixels.
- the method for generating image depth of a stereo image disclosed in the above embodiments of the disclosure increases the accuracy of image depth and is conducive to enhancing the quality of subsequent three-dimensional image.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- Image Processing (AREA)
- Image Generation (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW98141004 | 2009-12-01 | ||
| TW098141004A TWI398158B (zh) | 2009-12-01 | 2009-12-01 | Method for generating image depth of a stereo image |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20110128282A1 (en) | 2011-06-02 |
Family
ID=44068520
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US12/780,074 Abandoned US20110128282A1 (en) | 2009-12-01 | 2010-05-14 | Method for Generating the Depth of a Stereo Image |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US20110128282A1 (zh) |
| TW (1) | TWI398158B (zh) |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| AU2011203028B1 (en) * | 2011-06-22 | 2012-03-08 | Microsoft Technology Licensing, Llc | Fully automatic dynamic articulated model calibration |
| TWI456526B (zh) * | 2011-11-03 | 2014-10-11 | Multi-view stereo image generation method and multi-view stereo image generation apparatus using the same |
| KR101888969B1 (ko) * | 2012-09-26 | 2018-09-20 | LG Innotek Co., Ltd. | Stereo matching apparatus using image characteristics |
| EP3236657A1 (en) * | 2016-04-21 | 2017-10-25 | Ultra-D Coöperatief U.A. | Dual mode depth estimator |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CA2693666A1 (en) * | 2007-07-12 | 2009-01-15 | Izzat H. Izzat | System and method for three-dimensional object reconstruction from two-dimensional images |
- 2009-12-01 TW TW098141004A patent/TWI398158B/zh active
- 2010-05-14 US US12/780,074 patent/US20110128282A1/en not_active Abandoned
Patent Citations (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6384859B1 (en) * | 1995-03-29 | 2002-05-07 | Sanyo Electric Co., Ltd. | Methods for creating an image for a three-dimensional display, for calculating depth information and for image processing using the depth information |
| US6266153B1 (en) * | 1998-05-12 | 2001-07-24 | Xerox Corporation | Image forming device having a reduced toner consumption mode |
| US6674903B1 (en) * | 1998-10-05 | 2004-01-06 | Agfa-Gevaert | Method for smoothing staircase effect in enlarged low resolution images |
| US6885771B2 (en) * | 1999-04-07 | 2005-04-26 | Matsushita Electric Industrial Co. Ltd. | Image recognition method and apparatus utilizing edge detection based on magnitudes of color vectors expressing color attributes of respective pixels of color image |
| US7085409B2 (en) * | 2000-10-18 | 2006-08-01 | Sarnoff Corporation | Method and apparatus for synthesizing new video and/or still imagery from a collection of real video and/or still imagery |
| US7034963B2 (en) * | 2001-07-11 | 2006-04-25 | Applied Materials, Inc. | Method for adjusting edges of grayscale pixel-map images |
| US8036451B2 (en) * | 2004-02-17 | 2011-10-11 | Koninklijke Philips Electronics N.V. | Creating a depth map |
| US20050286758A1 (en) * | 2004-06-28 | 2005-12-29 | Microsoft Corporation | Color segmentation-based stereo 3D reconstruction system and process employing overlapping images of a scene captured from viewpoints forming either a line or a grid |
| US7570804B2 (en) * | 2004-12-07 | 2009-08-04 | Electronics And Telecommunications Research Institute | Apparatus and method for determining stereo disparity based on two-path dynamic programming and GGCP |
| US20070024614A1 (en) * | 2005-07-26 | 2007-02-01 | Tam Wa J | Generating a depth map from a two-dimensional source image for stereoscopic and multiview imaging |
| US7518618B2 (en) * | 2005-12-23 | 2009-04-14 | Xerox Corporation | Anti-aliased tagging using look-up table edge pixel identification |
| US7639891B2 (en) * | 2005-12-23 | 2009-12-29 | Xerox Corporation | Corner sharpening using look-up table edge pixel identification |
| US8249333B2 (en) * | 2006-01-10 | 2012-08-21 | Microsoft Corporation | Segmenting image elements |
| US7602531B2 (en) * | 2006-03-22 | 2009-10-13 | Lexmark International, Inc. | Halftone edge enhancement for production by an image forming device |
| US20090129667A1 (en) * | 2007-11-16 | 2009-05-21 | Gwangju Institute Of Science And Technology | Device and method for estimating depth map, and method for generating intermediate image and method for encoding multi-view video using the same |
| US8411080B1 (en) * | 2008-06-26 | 2013-04-02 | Disney Enterprises, Inc. | Apparatus and method for editing three dimensional objects |
| US20110169823A1 (en) * | 2008-09-25 | 2011-07-14 | Koninklijke Philips Electronics N.V. | Three dimensional image data processing |
| US20110074784A1 (en) * | 2009-09-30 | 2011-03-31 | Disney Enterprises, Inc | Gradient modeling toolkit for sculpting stereoscopic depth models for converting 2-d images into stereoscopic 3-d images |
Non-Patent Citations (3)
| Title |
|---|
| Avidan, Shai; Shamir, Ariel; "Seam Carving for Content-Aware Image Resizing"; * |
| Huang, Wei-Jia, Chen-Te Wu, Kai-Che Liu, "Seam based dynamic programming for stereo matching"; SIGGRAPH ASIA '09 ACM SIGGRAPH ASIA 2009 Posters, * |
| Michael Rubinstein, Ariel Shamir, Shai Avidan; "Improved Seam Carving for Video Retargeting", ACM SIGGRAPH * |
Cited By (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9047656B2 (en) * | 2009-01-20 | 2015-06-02 | Entropic Communications, Inc. | Image processing using a bilateral grid |
| US9552631B2 (en) | 2009-01-20 | 2017-01-24 | Entropic Communications, Llc | Image processing using a bilateral grid |
| US8401278B2 (en) * | 2009-08-12 | 2013-03-19 | Hitachi, Ltd. | Image processing apparatus and image processing method |
| US20110038529A1 (en) * | 2009-08-12 | 2011-02-17 | Hitachi, Ltd. | Image processing apparatus and image processing method |
| US20120092462A1 (en) * | 2010-10-14 | 2012-04-19 | Altek Corporation | Method and apparatus for generating image with shallow depth of field |
| US8810634B2 (en) * | 2010-10-14 | 2014-08-19 | Altek Corporation | Method and apparatus for generating image with shallow depth of field |
| US9007441B2 (en) | 2011-08-04 | 2015-04-14 | Semiconductor Components Industries, Llc | Method of depth-based imaging using an automatic trilateral filter for 3D stereo imagers |
| WO2013075611A1 (zh) * | 2011-11-23 | 2013-05-30 | Huawei Technologies Co., Ltd. | Depth image filtering method, and method and apparatus for obtaining a depth image filtering threshold |
| US9594974B2 (en) | 2011-11-23 | 2017-03-14 | Huawei Technologies Co., Ltd. | Depth image filtering method, and depth image filtering threshold obtaining method and apparatus |
| US9070196B2 (en) | 2012-02-27 | 2015-06-30 | Samsung Electronics Co., Ltd. | Apparatus and method for estimating disparity using visibility energy model |
| US9338437B2 (en) * | 2012-04-03 | 2016-05-10 | Hanwha Techwin Co., Ltd. | Apparatus and method for reconstructing high density three-dimensional image |
| US20130258064A1 (en) * | 2012-04-03 | 2013-10-03 | Samsung Techwin Co., Ltd. | Apparatus and method for reconstructing high density three-dimensional image |
| WO2020113824A1 (zh) * | 2018-12-04 | 2020-06-11 | Shenzhen China Star Optoelectronics Semiconductor Display Technology Co., Ltd. | Image processing method |
| US10992873B2 (en) * | 2019-01-18 | 2021-04-27 | Qualcomm Incorporated | Systems and methods for color matching for realistic flash images |
Also Published As
| Publication number | Publication date |
|---|---|
| TW201121300A (en) | 2011-06-16 |
| TWI398158B (zh) | 2013-06-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20110128282A1 (en) | Method for Generating the Depth of a Stereo Image | |
| US7876954B2 (en) | Method and device for generating a disparity map from stereo images and stereo matching method and device therefor | |
| US10321112B2 (en) | Stereo matching system and method of operating thereof | |
| US9030469B2 (en) | Method for generating depth maps from monocular images and systems using the same | |
| CN101635859B (zh) | Method and apparatus for converting planar video into stereoscopic video | |
| US9053540B2 (en) | Stereo matching by census transform and support weight cost aggregation | |
| US8773427B2 (en) | Method and apparatus for multiview image generation using depth map information | |
| US20150097827A1 (en) | Target Region Fill Utilizing Transformations | |
| CN101437170A (zh) | Multi-view image generation system and multi-view image generation method | |
| CN106815594A (zh) | Stereo matching method and apparatus | |
| US8416989B2 (en) | Image processing apparatus, image capture apparatus, image processing method, and program | |
| EP3963546B1 (en) | Learnable cost volume for determining pixel correspondence | |
| CN104091339A (zh) | Fast stereo matching method and apparatus for images | |
| JP2017068577A (ja) | Arithmetic device, method and program | |
| CN105323573A (zh) | Three-dimensional image display apparatus and method | |
| Ma et al. | Depth-guided inpainting algorithm for free-viewpoint video | |
| CN108335267A (zh) | Depth image processing method, apparatus, device and storage medium | |
| WO2014198029A1 (en) | Image completion based on patch offset statistics | |
| CN119741434B (zh) | Dynamic adaptive multi-view stereo reconstruction method and model based on dual-domain information fusion | |
| Hu et al. | Adaptive region aggregation for multi‐view stereo matching using deformable convolutional networks | |
| Hallek et al. | Real-time stereo matching on CUDA using Fourier descriptors and dynamic programming | |
| de Oliveira et al. | On the performance of DIBR methods when using depth maps from state-of-the-art stereo matching algorithms | |
| CN114049436B (zh) | Improved cascade-structure multi-view stereo reconstruction method | |
| CN112907645B (zh) | Disparity map acquisition method and apparatus, training method, electronic device, and medium | |
| US9077963B2 (en) | Systems and methods for generating a depth map and converting two-dimensional data to stereoscopic data |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE, TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, CHIN-YUAN;HO, CHIA-HANG;WU, CHUN-TE;AND OTHERS;REEL/FRAME:024385/0908 Effective date: 20100409 |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |