
WO2011071313A2 - Method and apparatus for extracting a texture image and a depth image - Google Patents

Method and apparatus for extracting a texture image and a depth image

Info

Publication number
WO2011071313A2
WO2011071313A2 PCT/KR2010/008758 KR2010008758W WO2011071313A2 WO 2011071313 A2 WO2011071313 A2 WO 2011071313A2 KR 2010008758 W KR2010008758 W KR 2010008758W WO 2011071313 A2 WO2011071313 A2 WO 2011071313A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
pattern
target object
scene
texture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2010/008758
Other languages
English (en)
Korean (ko)
Other versions
WO2011071313A3 (fr)
Inventor
김진웅
블란코 리베라루제
김태원
장은영
김욱중
허남호
이수인
황승구
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electronics and Telecommunications Research Institute ETRI
Original Assignee
Electronics and Telecommunications Research Institute ETRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Electronics and Telecommunications Research Institute ETRI filed Critical Electronics and Telecommunications Research Institute ETRI
Priority to EP10836208.8A priority Critical patent/EP2512142A4/fr
Priority to CN2010800633447A priority patent/CN102884798A/zh
Priority to US13/514,807 priority patent/US20130083165A1/en
Publication of WO2011071313A2 publication Critical patent/WO2011071313A2/fr
Publication of WO2011071313A3 publication Critical patent/WO2011071313A3/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/254Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/257Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals

Definitions

  • The present invention relates to an apparatus and method for simultaneously extracting a texture image and a depth image by irradiating a pattern image that becomes white light when temporally integrated.
  • Conventionally, depth information in 3D technology is extracted using a stereo method.
  • The stereo method uses a left image and a right image for 3D display.
  • Transmitting both images, however, may strain the bandwidth of the transmission channel.
  • A newly proposed method can solve this bandwidth problem by transmitting a texture image, corresponding to a color image, together with its depth image, and synthesizing a stereo image through scene synthesis.
  • However, the conventional method has difficulty effectively acquiring a texture image and a depth image when the target object is dynamic.
  • In particular, when the target object is a human, the white light irradiated onto the body to acquire the texture image and the depth image is uncomfortable for the human eye.
  • In addition, the color of the target object in the 3D image, or the ambient illumination on the target object, degraded the accuracy of the texture image.
  • The present invention provides a method and apparatus for simultaneously acquiring a texture image and a depth image of a target object by continuously irradiating the target object with a pattern image that becomes white light when temporally integrated, and photographing the resulting scene images.
  • The present invention also provides a method and apparatus that avoid discomfort to human vision, since the target object is irradiated not with white light directly but with a pattern image that only becomes white light when temporally integrated.
  • The present invention further provides a method and apparatus for finding corresponding points robustly against the surface color of the target object and the environmental lighting, by checking the color of the scene image against the texture image of the target object and decoding the encoded pattern image.
  • An apparatus according to the present invention includes a pattern image irradiation unit that irradiates a pattern image onto the target object; an image capturing unit that captures a scene image in which the pattern image is reflected by the target object; and an image processor that extracts a texture image and a depth image from the captured scene image.
  • A method according to the present invention includes irradiating, by a pattern image irradiator, a pattern image onto a target object; photographing, by an image capturing unit, a scene image in which the pattern image is reflected by the target object; and extracting, by an image processor, a texture image and a depth image from the captured scene image.
  • According to the present invention, the texture image and the depth image of the target object may be obtained simultaneously by irradiating the target object with a pattern image that becomes white light and continuously photographing the scene images.
  • According to the present invention, by irradiating the target object with a pattern image that becomes white light when temporally integrated, the discomfort to human vision caused by irradiating white light directly can be prevented.
  • According to the present invention, by identifying the color of the scene image through the texture image of the target object and decoding the encoded pattern image, corresponding points can be found robustly against the surface color of the target object and the environmental lighting.
  • FIG. 1 is a diagram illustrating an apparatus for simultaneously extracting a texture image and a depth image according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a pattern arrangement of a pattern image according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a process of extracting a texture image from a scene image according to an embodiment of the present invention.
  • FIG. 4 is a diagram illustrating a process of capturing a scene image in which a pattern image is integrated in time according to an embodiment of the present invention.
  • FIG. 5 is a diagram illustrating a process of capturing a scene image using a beam splitter according to an embodiment of the present invention.
  • FIG. 6 illustrates a relationship between a surface of a target object and reflected light according to an embodiment of the present invention.
  • FIG. 7 is a diagram illustrating a CIELab chart according to an embodiment of the present invention.
  • FIG. 8 is a diagram for checking the color of a scene image according to an embodiment of the present invention.
  • FIG. 9 is a diagram illustrating a synchronization signal between a pattern irradiator and an image photographing unit according to an exemplary embodiment of the present invention.
  • FIG. 10 is a diagram illustrating a synchronization signal between a pattern irradiator and an image photographing unit according to another exemplary embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a method of simultaneously extracting a texture image and a depth image according to an embodiment of the present invention.
  • FIG. 1 is a diagram illustrating an apparatus for simultaneously extracting a texture image and a depth image according to an embodiment of the present invention.
  • the apparatus 101 may include a pattern image irradiator 102, an image capturing unit 103, and an image processor 104.
  • the pattern image irradiator 102 may irradiate the target object 105 with the pattern image.
  • the pattern image irradiator 102 may store the pattern image in the buffer and sequentially and repeatedly irradiate the pattern image stored in the buffer according to the synchronization signal of the image processor 104. That is, the pattern image irradiator 102 may irradiate the target object 105 with the pattern image encoded with specific information according to the structured light technique.
  • the pattern image irradiator 102 may irradiate the pattern image based on a temporal or spatial pattern arrangement.
  • The pattern image irradiator 102 may irradiate the pattern image based on a de Bruijn pattern array among the spatial pattern arrays.
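  • The value of a de Bruijn arrangement is that every window of n consecutive stripe colors is unique, so any stripe can be located within the pattern from its local neighborhood alone. As a sketch of the idea only (the actual stripe count, widths, and encoding used by the irradiator are not specified in this document), a three-color stripe sequence with this property can be generated as follows:

```python
def de_bruijn(k, n):
    """Generate a de Bruijn sequence B(k, n) over the alphabet 0..k-1
    using the standard Lyndon-word concatenation algorithm."""
    a = [0] * k * n
    sequence = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                sequence.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return sequence

# Map the 3-symbol sequence onto stripe colours (hypothetical R/G/B stripes)
colors = "RGB"
stripes = [colors[s] for s in de_bruijn(3, 3)]
print(len(stripes))  # 27 stripes; every cyclic window of 3 is unique
```

Because every length-3 window occurs exactly once (cyclically), decoding three adjacent stripes in the captured scene image identifies their absolute position in the projected pattern.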
  • the pattern image according to the exemplary embodiment of the present invention is irradiated to the target object 105 at high speed according to the pattern arrangement.
  • When the pattern image is integrated for a predetermined time, the same effect as irradiating white light is obtained. That is, a high-speed camera can distinguish the pattern image into R (Red), G (Green), and B (Blue), whereas a general camera perceives the combination of R, G, and B as white light.
  • the image capturing unit 103 may capture a scene image in which the pattern image is reflected by the target object 105.
  • In detail, the image capturing unit 103 may capture the scene image in which the pattern image is irradiated onto the target object 105 and reflected, according to the synchronization signal of the image processor 104.
  • the image capturing unit 103 may be configured of at least one camera, and the type of camera may be a general camera or a high speed camera.
  • Here, a general camera is a camera that does not distinguish the pattern image into R (Red), G (Green), and B (Blue) and recognizes it as white light.
  • A high-speed camera is a camera that can distinguish the pattern image into R (Red), G (Green), and B (Blue).
  • the image processor 104 may extract the texture image 106 and the depth image 107 of the target object 105 using the captured scene image.
  • the image processor 104 may extract the depth image 107 by applying the structured light technique and the stereo matching technique.
  • a stereo matching method may be used as a passive approach to extract depth information of the target object 105.
  • a structured light method may be used as an active approach to extract depth information of the object 105.
  • Stereo matching is basically a method of capturing two or more images from different viewpoints and finding corresponding points between the captured images.
  • 3D depth information may then be derived by applying triangulation to the corresponding points.
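  • For rectified stereo cameras, the triangulation step above reduces to the classic relation Z = f·B/d between depth Z, focal length f, baseline B, and disparity d. A minimal sketch (the camera parameters below are illustrative, not taken from the embodiment):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole-stereo triangulation: Z = f * B / d.
    focal_px: focal length in pixels; baseline_m: camera separation in
    metres; disparity_px: horizontal offset between corresponding points."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: f = 800 px, 10 cm baseline, corresponding points 16 px apart
z = depth_from_disparity(800.0, 0.10, 16.0)
print(z)  # 5.0 (metres)
```

This is why finding reliable corresponding points is the crux: once d is known at a pixel, depth follows directly.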
  • With the stereo matching technique, when the surface of the target object 105 is homogeneous and without texture, ambiguity may occur and it may be difficult to find corresponding points.
  • the structured light technique may calculate a corresponding point by irradiating specific pattern light to remove ambiguity.
  • the structured light technique may find a corresponding point by irradiating a pattern image encoded with specific information to the target object 105, and then photographing and decoding a scene image reflected by the target object 105.
  • the color of the scene image may be confirmed based on the texture image of the target object 105.
  • The image processor 104 may extract the texture image 106 of the target object 105 in various ways, depending on whether the image capturing unit 103 is a high-speed camera or a general camera. When the image capturing unit 103 is a high-speed camera, the image processor 104 may extract the texture image 106 of the target object 105 by averaging the scene images captured by the image capturing unit 103. When the image capturing unit 103 is a general camera, it may capture a temporally integrated scene image by keeping the shutter open longer (a slow shutter speed), and the image processor 104 may then extract the texture image 106 of the target object 105 from that temporally integrated scene image.
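  • The high-speed-camera averaging step can be sketched as follows. Since the R, G, and B pattern images sum to white over one cycle, the per-pixel mean of the scene images approximates the texture under white light; the toy one-pixel frames below are purely illustrative:

```python
import numpy as np

def texture_from_scene_images(frames):
    """Average a list of same-shaped RGB frames (H, W, 3).
    Over one R/G/B pattern cycle the mean approximates the
    white-lit texture of the object."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Toy 1x1-pixel scene: a white surface lit by pure R, G, B patterns
frames = [np.array([[[255, 0, 0]]]), np.array([[[0, 255, 0]]]),
          np.array([[[0, 0, 255]]])]
tex = texture_from_scene_images(frames)
print(tex)  # [[[85. 85. 85.]]] -- a neutral (grey/white) texture value
```

The general-camera case needs no explicit averaging: the sensor itself integrates the three patterns during the long exposure, yielding an equivalent result in hardware.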
  • the image capturing unit 103 is a general camera and a high speed camera
  • the general camera and the high speed camera may respectively capture a scene image in which the pattern image is reflected on the target object 105 through a beam splitter.
  • FIG. 2 is a diagram illustrating a pattern arrangement of a pattern image according to an embodiment of the present invention.
  • The pattern image irradiator 102 may irradiate the pattern image according to the de Bruijn pattern arrangement illustrated in FIG. 2.
  • As shown, a pattern image may be irradiated onto a target object spatially in the order of pattern 1 (201), pattern 2 (202), and pattern 3 (203). That is, the color of any one stripe changes in the order Red → Green → Blue across pattern 1 (201), pattern 2 (202), and pattern 3 (203).
  • Then, the human eye integrates the pattern images over a short period of time and ultimately perceives them as white light.
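  • This temporal integration can be checked numerically: because each stripe position cycles through Red, Green, and Blue over the three patterns, the per-stripe sum is the same white value everywhere, regardless of which color a stripe starts with. The 6-stripe layout below is a hypothetical illustration, not the patent's actual pattern:

```python
import numpy as np

# Three pattern frames: each stripe cycles Red -> Green -> Blue over time
RED, GREEN, BLUE = [255, 0, 0], [0, 255, 0], [0, 0, 255]
cycle = [RED, GREEN, BLUE]
n_stripes = 6
patterns = [np.array([cycle[(s + t) % 3] for s in range(n_stripes)])
            for t in range(3)]

# Temporal integration: summing the three frames gives white at every stripe
integrated = sum(p.astype(int) for p in patterns)
print(integrated)  # every row is [255, 255, 255]: white
```

This is exactly why a slow observer (the human eye, or a general camera with a long exposure) sees uniform white light while a high-speed camera still resolves the individual colored stripes.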
  • FIG. 3 is a diagram illustrating a process of extracting a texture image from a scene image according to an embodiment of the present invention.
  • In FIG. 3, a process of extracting a texture image from a scene image in which a pattern image is reflected on a target object is illustrated.
  • FIG. 3 illustrates a process of extracting the texture image 304 when the image capturing unit 103, which is a high speed camera, photographs the scene images 301 to 303.
  • In detail, the image capturing unit 103 may capture the reflected scene image 1 (301), scene image 2 (302), and scene image 3 (303) according to the pattern images irradiated by the pattern image irradiator 102. Then, the image processor 104 may extract the texture image 304 of the target object by calculating the average of scene image 1 (301), scene image 2 (302), and scene image 3 (303).
  • The texture image 304 is the same as the texture image that would be extracted if white light were irradiated; that is, the texture image 304 may be extracted by exploiting this property of white light. If the target object is a human, direct irradiation with white light may be uncomfortable for the eyes; by continuously irradiating the pattern images instead, the same effect as white light is obtained without the discomfort.
  • FIG. 4 is a diagram illustrating a process of capturing a scene image in which a pattern image is integrated in time according to an embodiment of the present invention.
  • FIG. 4 illustrates a case where the image capturing unit 103, which is a general camera, captures a scene image.
  • In detail, the image capturing unit 103 may capture a scene image in which three consecutive pattern images are temporally integrated, by keeping the camera shutter open longer to lengthen the exposure time.
  • the scene image in which the pattern image is temporally integrated has the same effect as the scene image reflected by irradiating white light to the target object.
  • the image processor 104 may extract a texture image of the target object from the captured scene image.
  • FIG. 5 is a diagram illustrating a process of capturing a scene image using a beam splitter according to an embodiment of the present invention.
  • FIG. 5 illustrates the image photographing unit 501, which is a general camera, and the image photographing unit 502, which is a high-speed camera, simultaneously capturing, through the beam splitter 503, the scene image reflected after the pattern image is irradiated onto the target object 504.
  • In this configuration, a texture image that appears as if white light had been projected, owing to the time-integration effect, is obtained through the general camera (image photographing unit 501), while the pattern images obtained through the high-speed camera can be used to extract depth information.
  • FIG. 6 illustrates a relationship between a surface of a target object and reflected light according to an embodiment of the present invention.
  • In order for the image processing unit 104 to decode the pattern image encoded according to the structured light technique, the color of the scene image photographed by the image capturing unit 103 must be checked.
  • the scene image refers to an image in which the pattern image is reflected on the target object.
  • the image processing unit 104 may check the color of the scene image using the texture image of the target object.
  • By using the texture image of the target object, the influence of the surface color of the target object and of the environmental lighting can be eliminated, so that decoding can be performed more robustly.
  • the color of the reflected scene image may vary depending on the color of the surface of the target object or the environmental lighting.
  • For example, when the surface color of the target object is R (Red), only R (Red) among the white light irradiated onto the target object is reflected, and all other colors are absorbed.
  • Here, the white light has the same effect as sequentially irradiating the pattern images.
  • The scene image reflected by the target object then consists of R (Red).
  • Similarly, when the surface color of the target object is G (Green), only G (Green) among the white light irradiated onto the target object is reflected and all remaining colors are absorbed; the scene image reflected by the target object then consists of G (Green).
  • FIG. 7 is a diagram illustrating a CIELab chart according to an embodiment of the present invention.
  • FIG. 7 shows a CIELab chart illustrating how the color of the scene image captured by the image capturing unit 103 changes when the surface color of the target object is reddish, greenish, or blueish, and the color of the light irradiated onto the target object is red, green, blue, or white.
  • the color of light irradiated to the target object corresponds to the color of the pattern image irradiated to the target object.
  • As shown, the coordinates of the four lights irradiated onto the target object are biased toward the surface color of the target object.
  • When the image capturing unit 103 photographs the scene images reflected after red, green, and blue light are irradiated onto the target object, and the image processing unit 104 calculates the average of the captured scene images, the surface color of the target object can be obtained.
  • From this, the color of the scene image may be confirmed; the process of checking the color of the scene image is described in detail with reference to FIG. 8.
  • FIG. 8 is a diagram for checking the color of a scene image according to an embodiment of the present invention.
  • The average color of a scene image is shown at 801. Then, at 802, among the three shifted colors generated by adding R (Red), G (Green), and B (Blue) to the average color, the shifted color having the smallest difference from the color of the scene image is selected; in this way the colors c1, c2, and c3 of the scene image may be identified.
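  • The selection step can be sketched as a nearest-shifted-color test. The sketch below works in RGB space for simplicity, whereas FIG. 7 suggests the comparison may be performed in CIELab; the shift magnitudes and the clipping rule are illustrative assumptions, not the patent's exact procedure:

```python
import numpy as np

def identify_stripe_color(pixel, avg_color):
    """Decide which of R/G/B was projected at this pixel: shift the
    average (texture) colour by each channel and choose the shift
    closest to the observed pixel value."""
    shifts = {'R': np.array([255.0, 0.0, 0.0]),
              'G': np.array([0.0, 255.0, 0.0]),
              'B': np.array([0.0, 0.0, 255.0])}
    best, best_dist = None, float('inf')
    for name, shift in shifts.items():
        candidate = np.clip(avg_color + shift, 0, 255)  # shifted colour
        d = np.linalg.norm(pixel - candidate)           # colour distance
        if d < best_dist:
            best, best_dist = name, d
    return best

# Reddish surface (average colour); observed pixel is brightened in green
avg = np.array([120.0, 40.0, 40.0])
obs = np.array([120.0, 240.0, 50.0])
print(identify_stripe_color(obs, avg))  # 'G'
```

Normalizing against the average (texture) color in this way is what makes the decoding robust to the object's surface color, as the surrounding text describes.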
  • FIG. 9 is a diagram illustrating a synchronization signal between a pattern irradiator and an image photographing unit according to an exemplary embodiment of the present invention.
  • a synchronization signal is required between the pattern image irradiator 102 and the image capturing unit 103.
  • FIG. 9 illustrates a time diagram of the shutter signal of the image capturing unit 103 relative to the trigger signal of the pattern image irradiating unit 102, which irradiates the pattern image. Different time diagrams result depending on whether the image capturing unit 103 is a high-speed camera or a general camera.
  • the shutter signal of the image capturing unit 103 is maintained during the period in which the trigger signal of the pattern image irradiating unit 102 is generated.
  • When the image capturing unit 103 is a high-speed camera, the shutter signal cycle is short, whereas for a general camera the cycle is long. That is, referring to FIG. 9, a high-speed camera captures three scene images, one for each of R, G, and B of the pattern image, while a general camera captures one scene image in which R, G, and B of the pattern image are integrated in time.
  • FIG. 10 is a diagram illustrating a synchronization signal between a pattern irradiator and an image photographing unit according to another exemplary embodiment of the present invention.
  • FIG. 10 shows the period of the trigger signal of the pattern image irradiating unit 102 being shorter than in FIG. 9; that is, FIG. 10 illustrates a case where the movement of the target object is fast. When the target object moves quickly, the pattern image must be irradiated faster, so the trigger signal period in FIG. 10 is shorter than in FIG. 9.
  • FIG. 11 is a flowchart illustrating a method of simultaneously extracting a texture image and a depth image according to an embodiment of the present invention.
  • the pattern image irradiator 102 of the apparatus 101 may irradiate the pattern image to the target object in operation S1101.
  • the pattern image irradiator 102 may sequentially irradiate the pattern image based on a temporal or spatial pattern arrangement.
  • The pattern image irradiator 102 may irradiate the pattern image based on a de Bruijn pattern array among the spatial pattern arrays.
  • the pattern image may be irradiated to the target object at high speed according to the pattern arrangement, and may be white light when integrated for a predetermined time.
  • the image capturing unit 103 of the apparatus 101 may capture a scene image in which the pattern image is reflected by the target object (S1102).
  • the image capturing unit 103 may capture a scene image in which the pattern image is reflected from the target object by using a general camera and a high speed camera using a beam splitter.
  • the image processor 104 of the apparatus 101 may extract the texture image and the depth image by using the captured scene image (S1103).
  • The image processor 104 may extract the depth image by applying the structured light technique and the stereo matching technique to the scene image. At this time, the image processor 104 may check the color of the scene image using the texture image of the target object, and extract the corresponding points using the color of the scene image according to the structured light technique. In detail, the image processor 104 checks the color of the scene image using the difference between the shifted colors, generated by adding the R (Red), G (Green), and B (Blue) channels to the average color of the scene image of the target object, and the original color of the scene image.
  • In addition, the image processor 104 may extract the texture image of the target object by averaging the captured scene images.
  • Alternatively, the image processing unit 104 may extract the texture image of the target object using the scene image in which the pattern image is temporally integrated.
  • Methods according to an embodiment of the present invention can be implemented in the form of program instructions that can be executed by various computer means and recorded in a computer readable medium.
  • the computer readable medium may include program instructions, data files, data structures, and the like, alone or in combination.
  • Program instructions recorded on the media may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well-known and available to those having skill in the computer software arts.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Image Analysis (AREA)
  • Image Generation (AREA)
  • Image Processing (AREA)

Abstract

Disclosed are an apparatus and method for extracting a texture image and a depth image. The apparatus according to the present invention irradiates a pattern image onto a target object, photographs the scene image in which the pattern image is reflected by the target object, and simultaneously extracts a texture image and a depth image using the scene image.
PCT/KR2010/008758 2009-12-08 2010-12-08 Procédé et appareil d'extraction d'une image de texture et d'une image de profondeur Ceased WO2011071313A2 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP10836208.8A EP2512142A4 (fr) 2009-12-08 2010-12-08 Procédé et appareil d'extraction d'une image de texture et d'une image de profondeur
CN2010800633447A CN102884798A (zh) 2009-12-08 2010-12-08 提取纹理图像和深度图像的装置及方法
US13/514,807 US20130083165A1 (en) 2009-12-08 2010-12-08 Apparatus and method for extracting texture image and depth image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20090120920 2009-12-08
KR10-2009-0120920 2009-12-08

Publications (2)

Publication Number Publication Date
WO2011071313A2 true WO2011071313A2 (fr) 2011-06-16
WO2011071313A3 WO2011071313A3 (fr) 2011-10-20

Family

ID=44146051

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2010/008758 Ceased WO2011071313A2 (fr) 2009-12-08 2010-12-08 Procédé et appareil d'extraction d'une image de texture et d'une image de profondeur

Country Status (5)

Country Link
US (1) US20130083165A1 (fr)
EP (1) EP2512142A4 (fr)
KR (1) KR101407818B1 (fr)
CN (1) CN102884798A (fr)
WO (1) WO2011071313A2 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8983121B2 (en) 2010-10-27 2015-03-17 Samsung Techwin Co., Ltd. Image processing apparatus and method thereof

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101346982B1 (ko) * 2010-11-08 2014-01-02 한국전자통신연구원 텍스쳐 영상과 깊이 영상을 추출하는 장치 및 방법
KR101282352B1 (ko) * 2011-09-30 2013-07-04 주식회사 홀코 가변패턴을 이용한 3차원 이미지 촬영장치 및 방법
KR101913317B1 (ko) * 2012-03-20 2018-10-30 삼성전자주식회사 장면 정보를 획득하는 방법 및 그 장치
KR101359099B1 (ko) * 2012-11-14 2014-02-06 재단법인대구경북과학기술원 카메라의 밝기 조절을 이용한 깊이 정보 획득 장치 및 방법

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BR9709679A (pt) * 1996-06-13 2000-01-11 Leuven K U Res & Dev Sistema e processo para alcançar uma descrição de forma tridimencional.
US6542185B1 (en) * 1998-01-07 2003-04-01 Intel Corporation Method and apparatus for automated optimization of white and color balance on video camera
JP3729035B2 (ja) 2000-06-30 2005-12-21 富士ゼロックス株式会社 3次元画像撮像装置および3次元画像撮像方法
JP3855053B2 (ja) * 2003-01-30 2006-12-06 国立大学法人 東京大学 画像処理装置、画像処理方法、及び画像処理プログラム
KR100528343B1 (ko) * 2003-07-14 2005-11-15 삼성전자주식회사 3차원 객체의 영상 기반 표현 및 편집 방법 및 장치
JP2005128006A (ja) 2003-09-29 2005-05-19 Brother Ind Ltd 3次元形状検出装置、撮像装置、及び、3次元形状検出プログラム
US7421111B2 (en) * 2003-11-07 2008-09-02 Mitsubishi Electric Research Laboratories, Inc. Light pen system for pixel-based displays
DE102004029552A1 (de) * 2004-06-18 2006-01-05 Peter Mäckel Verfahren zur Sichtbarmachung und Messung von Verformungen von schwingenden Objekten mittels einer Kombination einer synchronisierten, stroboskopischen Bildaufzeichnung mit Bildkorrelationsverfahren
WO2007061632A2 (fr) * 2005-11-09 2007-05-31 Geometric Informatics, Inc. Methode et appareil pour une imagerie de surface tridimensionnelle a coordonnees absolues
KR100806201B1 (ko) * 2006-10-30 2008-02-22 광주과학기술원 깊이영상의 계층적 분해를 이용한 삼차원 비디오 생성방법, 이를 위한 장치, 및 그 시스템과 기록 매체
CA2693666A1 (fr) * 2007-07-12 2009-01-15 Izzat H. Izzat Systeme et procede pour une reconstruction d'objet tridimensionnelle a partir d'images bidimensionnelles
KR100918862B1 (ko) * 2007-10-19 2009-09-28 광주과학기술원 참조영상을 이용한 깊이영상 생성방법 및 그 장치, 생성된깊이영상을 부호화/복호화하는 방법 및 이를 위한인코더/디코더, 그리고 상기 방법에 따라 생성되는 영상을기록하는 기록매체
CN101281023A (zh) * 2008-05-22 2008-10-08 北京中星微电子有限公司 一种获取三维目标外形的方法及系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8983121B2 (en) 2010-10-27 2015-03-17 Samsung Techwin Co., Ltd. Image processing apparatus and method thereof

Also Published As

Publication number Publication date
CN102884798A (zh) 2013-01-16
KR101407818B1 (ko) 2014-06-17
EP2512142A4 (fr) 2014-02-26
WO2011071313A3 (fr) 2011-10-20
KR20110065399A (ko) 2011-06-15
US20130083165A1 (en) 2013-04-04
EP2512142A2 (fr) 2012-10-17

Similar Documents

Publication Publication Date Title
KR101493064B1 (ko) 고도의 정확성으로 결함이 있는 눈을 검출하는 카메라에 기반한 방법
WO2017204571A1 (fr) Appareil de détection de caméra pour obtenir des informations tridimensionnelles d'un objet, et appareil de simulation de golf virtuel l'utilisant
WO2014035127A1 (fr) Appareil de génération d'image de profondeur
WO2009142390A2 (fr) Appareil de mesure d'un profil de surface
WO2014081107A1 (fr) Procédé et dispositif pour obtenir une image 3d
WO2014035128A1 (fr) Système de traitement d'image
WO2015016459A1 (fr) Appareil de capture d'image de champ lumineux comprenant un réseau de microlentilles décalées
WO2018174535A1 (fr) Système et procédé pour une carte de profondeur
WO2016068560A1 (fr) Marque translucide, procédé de synthèse et de détection de marque translucide, marque transparente, et procédé de synthèse et de détection de marque transparente
WO2015115802A1 (fr) Dispositif et procédé d'extraction d'informations de profondeur
WO2011087337A2 (fr) Dispositif d'inspection de substrat
WO2011071313A2 (fr) Procédé et appareil d'extraction d'une image de texture et d'une image de profondeur
WO2016200096A1 (fr) Appareil de mesure de forme tridimensionnelle
ATE285079T1 (de) 3d- bilderzeugungssystem
WO2010076988A2 (fr) Procédé d'obtention de données d'images et son appareil
JP5694332B2 (ja) オブジェクトの外観を強調するための照明システム及び方法
WO2020101431A1 (fr) Procédé de rendu d'image en trois dimensions, dispositif de traitement d'image utilisant ledit procédé, dispositif de capture d'image en interfonctionnement avec ledit dispositif de traitement d'image, procédé de capture d'image par ledit dispositif de capture d'image, et système de rendu d'image en trois dimensions
WO2013176482A1 (fr) Procédé de mesure de la hauteur pour un dispositif de mesure de formes tridimensionnelles
WO2018101746A2 (fr) Appareil et procédé de reconstruction d'une zone bloquée de surface de route
CN110264529A (zh) 二维标定板、三维标定体、相机系统及相机标定方法、标定支撑架
WO2016099154A1 (fr) Procédé de contrôle et appareil de contrôle pour un substrat sur lequel des composants sont chargés
WO2017195984A1 (fr) Dispositif et procédé de numérisation 3d
WO2012002601A1 (fr) Procédé et appareil permettant de reconnaître une personne à l'aide d'informations d'image 3d
WO2013100223A1 (fr) Procédé pour créer les informations de hauteur d'un dispositif d'inspection de substrat
WO2017086522A1 (fr) Procédé de synthèse d'image d'incrustation couleur sans écran d'arrière-plan

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201080063344.7

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10836208

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2010836208

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 13514807

Country of ref document: US