TW201233141A - Scanning projectors and image capture modules for 3D mapping - Google Patents
- Publication number
- TW201233141A (application TW100128723A)
- Authority
- TW
- Taiwan
- Prior art keywords
- image
- pattern
- region
- interest
- scanned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2518—Projection by scanning of the object
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/254—Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/363—Image reproducers using image projection screens
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/365—Image reproducers using digital micromirror devices [DMD]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Electromagnetism (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Geometry (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Mechanical Optical Scanning Systems (AREA)
Abstract
Description
VI. Description of the Invention

[Technical Field] The present invention relates generally to methods and apparatus for projecting and capturing optical radiation, and in particular to projection and image capture for the purpose of 3D mapping.
[Prior Art] Various methods for optical 3D mapping (i.e., by processing an optical image of an object to produce a 3D contour of the surface of the object) are known in the art. Such a 3D profile is also referred to as a 3D image, depth map or depth image, and 3D mapping is also referred to as depth mapping. Some 3D mapping methods are based on projecting a laser streak pattern onto the object and then analyzing an image of the pattern on the object. For example, the PCT International Publication No. WO 2007/043036, the disclosure of which is hereby incorporated by reference in its entirety, is incorporated herein by reference in its entirety, the disclosure of the disclosure of the disclosure of the entire disclosure of the disclosure of System and method for object reconstruction. An imaging unit detects the light response of the illuminated area and produces image data. The pattern in the image of the object is used to reconstruct a 3D image of the object in real time with respect to the displacement of one of the reference images of the pattern. Further methods for the use of zebra patterns for 3D mapping are described, for example, in PCT International Publication No. WO 2007/105205, the disclosure of which is incorporated herein by reference. Other optical 3D mapping methods project different kinds of patterns onto the object to be mapped. For example, the disclosure of PCT International Publication No. WO 2008/120217, the disclosure of which is incorporated herein by reference in its entirety in its entire entire entire entire entire entire entire entire entire entire entire entire disclosure Assembly. A light source is irradiated with light to expose the transparent sheet to project the pattern onto an object. An image capture assembly captures an image of the pattern on the object and the image is processed to reconstruct a 3D image of the object. SUMMARY OF THE INVENTION Embodiments of the invention described below provide methods and apparatus for efficient projection of patterns, particularly for 3D mapping, and for imaging such projected patterns. Accordingly, in accordance with an embodiment of the present invention, there is provided an apparatus for mapping comprising an illumination module, the illumination module comprising: a radiation source configured to emit a beam of light; and a scanner It is configured to receive and scan the beam over a selected angular range. The illumination optics are configured to project the scanned beam to form a pattern of spots that extend within a region of interest. An imaging module is configured to capture an image of the pattern projected onto an object in the region of interest. A processor is configured to process the image to construct a three-dimensional (3D) image of the object. The pattern of spots such as shai can be uncorrelated within a depth range mapped by the device. In some embodiments, the source of radiation is controlled to modulate the intensity of one of the beams as the scanner scans the beam, thereby forming the pattern of the spot of light on the region of interest. The illumination module can be configured to modify the pattern in response to the image captured by the imaging module. The illumination module can be configured to control at least one of the radiation source and the scanner to modify a corner of the array of pixels in a selected portion of the region of interest 158131.doc -5 - 201233141 degree. 
Alternatively or additionally, the illumination module can be configured to control at least one of the radiation source and the scanner to modify a brightness of one of the light spots in a selected region of the region of interest. In an alternate embodiment, the scanner is configured to scan the beam over a first angular range, and the optical devices include a beam splitter configured to form greater than the first A plurality of angularly spaced replicas of the scanned beam that are coextensive within one of the angular extents. The sweeper and the beam splitter can be configured to collage the region of interest β with the pattern formed by the plurality of angularly spaced replicas of the scanned beam. In another embodiment The optical device includes a patterned element configured to form a delta pattern within a first angular range upon illumination by the beam, and the scanner is configured to direct the beam continuously The patterned elements are struck at a plurality of different angles to form a plurality of angularly spaced replicas of the pattern that are coextensive within a second angle range that is greater than one of the first angular ranges. The scanner and the patterning element can be configured to collage the region of interest with the plurality of angularly spaced replicas of the pattern. In still another embodiment 'the scanner is configured to have a beam of light within a first angular range' and the specialized optical device includes a scan extension element configured to disperse the scanned beam In order to cover a second angular range greater than the first angular range in a spatial pattern. The scan extension element can be selected from a face reflector and a diffractive optical element - component group 0. In one disclosed embodiment, the illumination module includes at least one beam sensation 158131.doc • 6 - 201233141 The at least one beam sensor periodically (4) scans the scanned beam at a selected angle within the angular range scanned by the scanner and thereby verifies that the scanner is operating. Typically, the illumination module is configured to inhibit emission of the beam from the source when the sensor fails to periodically receive the scanned beam. In some embodiments, the radiation source includes: a first radiation source that emits an infrared light beam that is modulated to form the pattern of the light spots; and a second radiation source that emits a The visible light beam is modulated to project a visible image onto the region of interest. The scanners and optics are configured to simultaneously project both the infrared beam and the visible beam onto the region of interest. Typically the second (four) source is controlled to project the visible image onto the object in response to the 3D image. In the disclosed embodiment, the processor is configured to obtain a respective difference between the light spots in the plurality of regions of the captured image and the corresponding reference spot positions of the reference image belonging to the pattern. An offset is derived to derive the 3D image, wherein the respective offsets indicate respective distances between the regions and the image capture assembly. In some embodiments, the imaging module includes a position sensitive detector configured to sense and output each of the light in the pattern projected onto the object by the illumination module One of the spots is offset at the point. 
The imaging module can be configured to scan the position sensitive field of view with the scanner in the illumination module or with the beam from the radiation source. Alternatively, or in addition, the illumination module and the imaging module are configured such that the offsets occur in a first direction, and the imaging module includes a configuration 158131.doc 201233141 extending along the first direction An array of one or more columns of detector elements and astigmatic optics configured to image the pattern onto the array and having an optical power that is greater in the first direction than in a second vertical direction . In some embodiments, the imaging module includes a sensor and imaging optics, the sensors and imaging optics defining scanning in the region of interest in synchronization with the scanned beam of the illumination module One of the sensing areas. The sensor can include an image sensor having a rolling shutter, wherein the scrolling gate is synchronized with the scanned beam. Alternatively or in the alternative, the scanner in the illumination module can be controllable to dynamically change the selected angular range 'and the county module can include an imaging scanner, and the image scanner is in a state of $ The money zone is dynamically scanned to match the selected angular extent of the scanned #beam. According to an embodiment of the present invention, there is also provided a method for mapping a line comprising an illumination module, the illumination module comprising a source configured to transmit with a modulation according to a specified time Change one of the intensity - the radiation beam. The scanner is configured to receive and scan the beam within the region of interest to project the radiation onto the region by a spatial intensity pattern determined by the temporal modulation of the beam. An imaging module is configured to capture an image of the spatial intensity pattern projected onto the object in the region of interest. - The processor passes (4) to process the image to construct a three-dimensional (3D) image of the object. In one disclosed embodiment, the time modulation is binary, and wherein the spatial intensity pattern comprises one of the arrays of light dots 0 158131.doc 201233141 generated by the time modulation. . In an embodiment 10, the imaging module includes a sensor and imaging optics, wherein the sensors and imaging optics are defined in the region of interest in synchronization with the scanned cat beam of the illumination module. Scan one of the sensing areas. In accordance with an embodiment of the present invention, a method for mapping is further provided that the method includes scanning a beam of light within a selected angular range to form a map of light spots extending within the region of interest. Manipulate # and process the image of the pattern projected onto an object in the area of interest to construct a three-dimensional (3D) image of the object. In accordance with an embodiment of the present invention, a method for mapping is provided that includes a radiation beam having an intensity that varies according to a specified time modulation. The beam is scanned within the region of interest to project the radiation onto the region by a spatial intensity pattern determined by the temporal modulation of the beam. An image of one of the s-space intensity patterns projected onto the object of interest is manipulated and processed to construct a three-dimensional (3D) image of the object. 
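Several of the aspects summarized above depend on the projected spot pattern being uncorrelated over the range of shifts that can occur within the depth range mapped by the device, so that each local window of the pattern can be matched unambiguously against the reference image. The following sketch illustrates one way this property could be checked numerically; it is an illustrative aid rather than part of the disclosure, and the pattern generator, image size and shift range are assumptions chosen for the example.

```python
import numpy as np

def max_off_peak_autocorrelation(pattern, max_shift):
    """Return the largest normalized correlation between the pattern and a
    copy of itself shifted along X by 1..max_shift pixels.  For a pattern
    that is 'uncorrelated' in the sense used here, this value stays small
    for every shift larger than the spot size (illustrative check only)."""
    p = pattern.astype(float)
    p = (p - p.mean()) / (p.std() + 1e-12)
    worst = 0.0
    for dx in range(1, max_shift + 1):
        a = p[:, dx:]          # pattern shifted left by dx
        b = p[:, :-dx]         # original, cropped to the same width
        worst = max(worst, abs(float((a * b).mean())))
    return worst

# Example: a sparse pseudo-random binary spot pattern (assumed parameters).
rng = np.random.default_rng(0)
spots = (rng.random((240, 320)) < 0.05).astype(float)
print(max_off_peak_autocorrelation(spots, max_shift=40))  # stays near zero
```

Random, pseudo-random and quasi-periodic patterns of the kind mentioned in the text typically pass such a check.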
The present invention will be more fully understood from the following detailed description of embodiments thereof, taken together with the drawings, in which:

[Embodiments]

The embodiments of the invention that are described hereinbelow provide, inter alia, methods and apparatus for efficient projection of patterns, particularly for 3D mapping, and for efficient imaging of such projected patterns.

In some embodiments of the invention, an illumination module projects a pattern of spots onto a region of interest, and an imaging module captures an image of the pattern appearing on objects in the region. This image is processed in order to find the locations of the spots in the image and, on this basis, to construct a 3D map of an object in the region of interest. The depth coordinates in the map are typically computed by triangulation, based on the offsets of the spots in the image relative to corresponding reference spot locations in a reference image of the pattern.

In the disclosed embodiments, the pattern is projected dynamically: rather than being projected simultaneously over the entire region, it is formed by scanning a beam emitted by a radiation source. The beam is scanned over a selected angular range. (Some of the disclosed embodiments involve controlling and/or extending this range.) The intensity of the beam is typically modulated during the scan in order to form the desired pattern. The scan is "dynamic" in the sense that aspects of the pattern, such as its density, brightness and/or angular range, can be modified in the course of mapping a given scene. Although the embodiments described below are drawn specifically to spot patterns, the principles of the present invention may similarly be applied in forming other kinds of patterns for purposes of 3D mapping.

This dynamic scanning approach is advantageous in a number of important respects. For example, dynamic scanning of this sort affords flexibility in forming the pattern, and particularly in modifying the pattern on the basis of images of the region of interest: the angular density of the spots in the pattern and/or the brightness of the spots may differ from region to region, according to scene conditions and the characteristics of objects of interest in the scene.

The imaging module may likewise be operated dynamically, in conjunction with the scanning of the illumination beam, so that the active field of view of the imaging module tracks the area of the pattern that is actually illuminated at any moment during the scan. The image of the region of interest that is formed in this way is not necessarily captured all at once, as in a conventional image sensor, but may rather be assembled electronically, as part of the process of forming the 3D map, from local signals captured by a detector during the scan. Concentrating the illumination and the detection in a small, moving area in this manner enhances the signal/background ratio of the detected pattern and thus improves the accuracy of the 3D mapping. The field of view of the imaging module may track the scan of the illumination beam optically (possibly using, at least in part, the same scanner as the illumination module) or electronically (using an image sensor with a rolling shutter), for example.

Some of the embodiments described below involve expanding the angular range of the scan provided by the illumination module.
These embodiments address the need for some 31) mapping systems to have a wide field of view that is much larger than the scanning range of the H-scan. In one such embodiment, the optics of the projection module includes a beam splitter' which simultaneously forms a plurality of angularly spaced replicas of the scanned beam. These replicas are coextensive in a range that is greater than the scan range. In another embodiment, the scanner directs the beam from the light source to continuously strike a patterned element at a plurality of different angles, and thus form a plurality of angularly spaced replicas of the pattern. In either case, the components of the illumination module can be configured to use the pattern to tile the region of interest 1 in this manner, with the adjacent replica of the pattern to cover the region without copying between the regions Significant overlap or gap. (In this context, 'gap or overlap is considered "significant" if it is approximately the spacing between the spots or greater than this magnitude.) Alternatively or additionally, the illumination module may contain, for example, A convex reflector or a diffractive optical element (DOE) - (E) - a known extension element that extends the angular extent covered by the scanned beam. 158131.doc -11. 201233141 The following is a description of other applications of components of a 3D mapping system using a scanned radiation source and variations of components of a 3D mapping system using a scanned radiation source. System Description FIG. 1 is a schematic top plan view of a system 2 for 3D mapping in accordance with an embodiment of the present invention. System 20 is constructed around a mapping device 22 that is configured to capture images and produce a 3D image of a scene. This includes an object 28 such as one of the hands of the device. The depth information in the view produced by device 22 can be used, for example, by a host computer (not shown) to enable the user to interact with the game and the tablet application running on the computer and the components displayed on the display screen. One of the interactions of the user interface - part. (This functionality is set forth, for example, in U.S. Patent Application Publication No. 2009/0183, which is incorporated herein by reference.) The mapping function of the device can also be used for other (4), and is applicable to virtually any suitable type of scene and 3D object. In the picture! In the example shown, the photo-radiation pattern is projected onto the object 28 in the mapping device 22, as will be explained in detail below. The optical radiation used for this purpose is usually in the infrared (IR) _, but is the same as visible or ultraviolet light. (In the embodiment shown in FIG. 7 - the imaging module projects both infrared and visible radiation" an imaging module 3: decodes the image of the pattern on the object to produce each image in the image The number-shift value. The shift value represents the - component (usually the spot) of the pattern in each of the captured images and the pattern - 158131.doc •12· 201233141 Corresponding to the offset between the reference positions of one of the pattern elements. These offsets do not correspond to the respective distances between the points in the actual scene of the pixel and the image manipulation assembly. The original pixel value may be output 38, and the shift value may be calculated by another component of device 22 or by the host computer. 
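The shift values described above, i.e. the offset of the pattern in each small region of the captured image relative to the corresponding region of a reference image of the pattern, amount to a one-dimensional matching problem, since in the configuration of Fig. 1 the shifts occur only along the X axis. The brute-force sketch below shows what quantity such a shift value represents; the window size, search range and correlation score are illustrative assumptions and not details taken from the patent.

```python
import numpy as np

def pattern_shifts(captured, reference, win=15, max_shift=48):
    """Estimate the X-shift of the projected pattern at each pixel by
    matching a small window of the captured image against the reference
    pattern image along the X axis (simplified, brute-force sketch)."""
    h, w = captured.shape
    half = win // 2
    shifts = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            block = captured[y - half:y + half + 1, x - half:x + half + 1]
            block = block - block.mean()
            best_score, best_dx = -np.inf, 0
            lo = max(half, x - max_shift)
            hi = min(w - half - 1, x + max_shift)
            for xr in range(lo, hi + 1):
                ref = reference[y - half:y + half + 1,
                                xr - half:xr + half + 1]
                ref = ref - ref.mean()
                score = float((block * ref).sum())  # un-normalized correlation
                if score > best_score:
                    best_score, best_dx = score, xr - x
            shifts[y, x] = best_dx
    return shifts
```

A practical implementation would use optimized or hardware-assisted matching; the loops here are written for clarity only.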
One of the processors 46 in device 22 processes the shift values (calculated by module 38 as necessary) After the shift value of the original pixel value) to generate a depth map of the region of interest that is imaged and imaged by device 22. The depth map contains a 3D coordinate array containing the surface of the object - a predefined field of view Depth (z) coordinate value at each point (X, Y) (In the context of an image-related data array, such (χ, Υ) points are also referred to as pixels.) In this embodiment, the process II is based on the lateral shift of the pattern at each pixel by a triangular amount. The 30 coordinates of the point on the surface of the object 28 are measured. The principle of the 4 diopter nose is described, for example, in the above-mentioned pCT publication application WO 2007/043036, WO 2007/105205, and 2〇〇8/12〇 217. In an alternate embodiment, with the necessary modifications, the elements of device 22 may be used in other types of depth mapping systems, such as systems or stereo systems based on measurements of time of flight of light pulses to and from the scene of interest, and Other types of applications using the projected beam. In Figure 1, the X-axis is considered to be along the horizontal direction of the front of the device 22, the Y-axis is vertical (the page is exceeded in this view), and the Z-axis is along the object. The general direction of imaging of the assembly extends away from the device 22. The optical axes of the modules 3A and 38 are parallel to the Z axis 'where the respective pupils on the X axis are separated by a known distance. 158131.doc -13- 201233141 In this configuration, by module 38撷The lateral shift of the pattern in the image will be limited to (in the range of the error) in the X direction, as explained in the above-mentioned PCT published application. As described above, the illumination module 30 uses a pattern of light spots (such as The light spot - an unrelated pattern to illuminate the object of interest. In the context of this patent application and in the scope of the patent application, the word "unrelated pattern" means that its position is not in the plane transverse to the axis of the projection beam. a projected pattern of one of the associated spots (which may be bright or dark). The autocorrelation of the patterns in the position as a function of lateral displacement is greater than the spot size and no greater than may occur from mapping by the system Any shift in the maximum shift within the depth range is not significant in the sense of being irrelevant. Random, pseudo-random and quasi-periodic patterns are usually not related to the extent specified by the above definition. To create a pattern of spots, the module 30 typically includes a suitable source of radiation 32, such as a collimated diode laser or a light emitting diode (LED) or other source of light having a suitably shaped radiation beam. The beam is scanned over a range of angles by a suitable broom 34 and illumination optics 35. The beam is modulated during the scan to produce the pattern. For example, the beam is modulated in time by turning the source 32 on and off to form a spot or other form of binary pattern. Optical device 35 typically includes one or more lenses and/or other opticals, and can be employed in a variety of different forms in different embodiments, as described below. 
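The triangulation step performed by processor 46, as described above, converts each X-direction shift of the pattern into a depth (Z) coordinate using the known baseline between the pupils of modules 30 and 38. A minimal sketch of that conversion under a simple pinhole model is given below; the reference-plane distance, the sign convention of the shift and the numerical values are assumptions made for illustration.

```python
def shift_to_depth(shift_px, baseline_m, focal_px, z_ref_m):
    """Convert a pattern shift (in pixels, measured relative to a reference
    image captured at a known plane distance z_ref_m) into a depth estimate,
    using a simple pinhole/triangulation model.

    disparity(z) = focal_px * baseline_m / z, so the shift relative to the
    reference plane is disparity(z) - disparity(z_ref).  The sign convention
    (positive shift = closer) is an assumption for this sketch."""
    d_ref = focal_px * baseline_m / z_ref_m
    d = d_ref + shift_px
    if d <= 0:
        return float("inf")   # shift beyond the working depth range
    return focal_px * baseline_m / d

# Example with assumed numbers: 7.5 cm baseline, 600 px focal length,
# reference plane at 2 m; a +6 px shift places the point nearer the device.
z = shift_to_depth(shift_px=6.0, baseline_m=0.075, focal_px=600.0, z_ref_m=2.0)
```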
The pattern is projected onto the scene over a certain angular range that defines a projection field of view (FOV) 36, thereby converting the time modulation of source 32 into a desired spatial intensity pattern extending over the objects in the region of interest of system 20. In the disclosed embodiments, scanner 34 comprises a scanning mirror 50 with a mechanical scan drive, although other types of scanners (such as acousto-optic scanners) may also be used. Scanner 34 may comprise, for example, a single bidirectional scanning mirror or a pair of unidirectional scanning mirrors. Such mirrors may be based on integrated micro-electro-mechanical systems (MEMS) technology. Scanning mirrors of this kind are produced by a number of manufacturers, such as Microvision, Inc. (Redmond, Washington).
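Converting the desired spot pattern into a drive signal for source 32, as described above, amounts to deciding at which points along the mirror's scan trajectory the laser is switched on. The sketch below shows one simple way such an on/off schedule could be generated for a raster scan so that the resulting spots are pseudo-randomly placed; the scan parameters, duty cycle and random-number seed are illustrative assumptions, not values from the patent.

```python
import numpy as np

def spot_schedule(n_lines=400, samples_per_line=1000, duty=0.04, seed=1):
    """Return a list of (line, sample) indices at which the laser is pulsed,
    producing a pseudo-random (uncorrelated) spot pattern as the mirror
    raster-scans the field of view (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    pulses = []
    for line in range(n_lines):
        # Pick a random subset of sample positions on this scan line.
        n_spots = int(duty * samples_per_line)
        cols = rng.choice(samples_per_line, size=n_spots, replace=False)
        pulses.extend((line, int(c)) for c in sorted(cols))
    return pulses

# The drive electronics would fire the laser whenever the mirror reaches a
# scheduled (line, sample) position; everywhere else the source stays off,
# so the temporal modulation becomes the spatial spot pattern on the scene.
schedule = spot_schedule()
```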
Imaging module 38 typically comprises objective optics 42, which form an image on a sensor 40 of the projected pattern appearing on the scene in the region of interest. In the example depicted in Fig. 1, sensor 40 comprises a CMOS image sensor containing a two-dimensional matrix of detector elements 41. The rows and columns of the matrix are aligned with the X and Y axes. Alternatively, other types of sensors may be used in module 38, as described below. Sensor 40 and objective optics 42 define an imaging field of view 44, which is typically contained within FOV 36 in the region of interest of device 22. Although sensor 40 is shown in Fig. 1 as having roughly equal numbers of rows and columns of detector elements 41, in other embodiments described hereinbelow the sensor may comprise only a small number of rows, or even only a single row or a single position-sensitive detector element.

As noted above, radiation source 32 typically emits infrared radiation. Sensor 40 may comprise a monochrome sensor, without an infrared cutoff filter, in order to detect the image of the projected pattern with high sensitivity. To enhance the contrast of the images captured by sensor 40, optics 42 or the sensor itself may comprise a bandpass filter (not shown), which passes the wavelength of radiation source 32 while blocking ambient radiation in other bands.

Processor 46 typically comprises an embedded microprocessor, which is programmed in software (or firmware) to carry out the processing and control functions that are described herein. The processor may, for example, dynamically control illumination module 30 and/or imaging module 38 to adjust parameters such as pattern density, brightness and angular range, as described in detail hereinbelow. A memory 48 may hold program code, lookup tables and/or interim computed results. Alternatively or additionally, processor 46 may comprise programmable hardware logic circuits for carrying out some or all of its functions. Details of an implementation of a depth-mapping processor that may be applied to processor 46 are provided in U.S. Patent Application Publication 2010/0007717, whose disclosure is incorporated herein by reference.

Scanning Illumination Module

Figs. 2A and 2B are schematic top views of illumination module 30 in two different phases of operation, in accordance with an embodiment of the present invention. In this embodiment, radiation source 32 comprises a laser diode 52 and a collimating lens 54. The beam from the radiation source is scanned by scanning mirror 50 over an angular range that is limited by the mechanical and optical properties of the scanner. (The scanner mechanism is omitted from this and subsequent figures for the sake of simplicity.)
WA shows that the mirror is at approximately the middle of the scan and in Figure 2] 8 the § mirror is at its most extreme deflection. This deflection can be defined by The wide range of angles covered by the scanned beam. To extend this range, a beam splitting beam, such as a suitable diffractive optical element (D〇E), splits the scanned beam to form a plurality of angular (four) copies of the scanned beam. Object 56' 58, 60. (In the absence of the beam splitter, the module 30 will project only the beam 56.) When the mirror 5 scans the Korean beam, the 'replica 56, 58' 60 is in focus. The areas are swept in parallel to cover a larger angular range than the scan range provided by the scanner alone. I58131.doc •16- 201233141 For the sake of simplicity, Figures 2A and 2B show three replicated beams, but the beam splits The device 55 can also be configured to impart only two replicated beams or to impart a greater number of replicated beams. In general, the beam splitter 55 can be configured to produce substantially any desired layout depending on the application requirements. mXn beam replica Figure 2C is a schematic front elevational view of one of the radiation patterns projected by the illumination module of Figures 2A and 2B. The on/off of the laser diode 52 is shown in accordance with one embodiment of the present invention. The off-modulation causes each beam replica 56, 58, 6, ... to form a respective pattern 64 of spots 66 in a corresponding sub-region of the field of view 36. The fan-out angle of the beam splitting 55 and the scanner 34 The angular scan range is typically selected such that pattern 64 tiles the regions of interest into substantially non-porous and without overlap between the patterns. Such a tile configuration can be effectively used in a wide angle range in a 3D mapping system. An inner projection pattern. Alternatively, the fanout angle and scan range may be selected such that the patterns 64 overlap. The patterns may be light spot patterns as in the depicted embodiment, or may include other types of structured light. Figure 3 is a schematic side elevational view of one of the illumination modules 3A in accordance with an alternative embodiment of the present invention. This embodiment can be used to form the same tiled pattern shown in Figure 2C. However, this is different from here. Other embodiments as described because of its scanning A combination of a diffractive optical element (D〇E) is used as a spatial modulator to form a patterned illumination of the scene. Since this configuration requires the mirror 50 to be 'reduced to a much slower scan rate. It is possible, or the mirror 50 can simply jump between discrete positions, and the illumination source 32 can be pulsed on and off at a much slower rate. In terms of optical principles, this embodiment is similarly incorporated by reference. The present invention is directed to a DOE-based solution as set forth in US Patent Nos. 2009/0185274 and 2010/0284082. These publications disclose a method for forming a diffraction pattern using a pair of DOEs. One of the pair splits an input beam into an output beam of a matrix, while the other applies a pattern to each of the output beams, such as shai. The two D〇E thus collectively project the radiation onto a spatial region of a plurality of B-neighbor items of the pattern. In the present embodiment, the scan pattern of mirror 50 replaces one of the D 〇 E to split the input beam from radiation source 32 into a plurality of intermediate beams η. 
For this purpose, mirror 50 is scanned in the X and gamma directions to each of the predetermined angles of the matrix and resides at each of these angles for a period of time typically of the order of a few milliseconds. Each dwell point defines a beam 72. The DOE 70 diffracts each of the beams 72 along a respective axis 76 into a patterned output beam 74. The fan-out angle between the axis 76 and the divergence angle of the beam 74 can be selected (by appropriate design of the scan pattern of the DOE 70 and mirror 50) such that the beam is tiled into the field of view 36 in the manner shown in Figure 2C. The embodiment of Figure 3 can be combined with various types of image capture modules 38, as described below. Since the beam 74 is sequentially illuminated, the image capture pattern of the module 38 is synchronized with the illumination sequence to maximize the apostrophe/background ratio in the captured image of the scene of interest. Some types of image sensors, such as CMOS sensors, have a rolling shutter that can be proposed on April 19, 2000, for example, as disclosed in the disclosure of which is incorporated herein by reference. The application in the United States specifically requests the technique in 12/762,373 to synchronize with the illumination sequence. Figure 4 is a side elevational view of one of the illumination modules 3 158 158131.doc 18- 201233141 in accordance with another embodiment of the present invention. Assuming the source 32 is a laser, the beam it emits is strong and should be continuously scanned by the mirror 50. The new eye (4) is all n. In normal operation, the source 32 emits the beam only when the mirror 50 is moved, so that The residence time at all locations of the sights is short and therefore poses no danger to the eyes. Then, if the mechanism of the drive mirror 50 is stuck or mishandled, the light beam can reside at a position for an extended period of time. To avoid this event, the module 30 includes one or more of the beam sensors 8A, 82, ... that are surfaced to the processor 46 (not shown). The sensors are positioned at a selected angle or angles within the angular range of the mirror scan to periodically receive the scanned beam and thereby verify that the scanner is operating. In this example, two sensors are shown on opposite sides of f〇v362, but a single security sensor or a larger number of such sensors can also be used. The mechanism for driving mirror 50 can be programmed, for example, to direct the beam from source 32 toward sensing (4) at the beginning of each scan and to direct beam from source 32 to sensor 82 at the end of each scan. When the beam strikes one of the sensors, the sensor outputs a pulse to the processor 46. The processor monitors the pulses and tracks the time elapsed between pulses. If the time exceeds a predetermined maximum value, the processor will immediately suppress the emission of the beam from the source 32 (usually by (4) simply cutting the beam). Such a material event will occur if the mirror 5 is jammed at a given location. Therefore, in such a case, the beam from the module will be cut off immediately and any potential safety hazards will be avoided. A schematic diagram 5 is a projection side view of a projection module according to still another embodiment of the present invention. 158131.doc -19-201233141 Sexual side view This embodiment specifically relates to the expansion of the scanning range relative to the mirror π by F〇V 36°. 
In some technologies such as MEMS, the scanning range of the mirror 5 is small, and a (10) mapping application requires a problem of mapping on a wide field of view. In the depicted embodiment, mirror 5G is scanned over an angle equal to %_/2, such that the weight is typically about 1 〇. To 3 〇. Width a - one of the initial FOVs. The beam from the mirror 5 strikes the __scanning extension element (in this case, the convex reflector 88). The scanning extension element extends the beam range such that the device 36 has a determinable amount of about 6Q. To m. The width a. For two-dimensional (XY) knowledge, the element 60 may be spherical, or it may have one surface having eight different radii of curvature along the X and Y directions to produce a width that is wider in one dimension than in the other. A field of view' or it may have some other aspherical shape. Alternatively, the scan spread reflector can be replaced by a DOE or a diffractive element (not shown) having similar scanning spread properties. In addition, the function of the reflector 88 can be performed by a combination of one of the same type or different types of optical elements. Figure 6 is a schematic, pictorial representation of a 3〇 mapping system % in operation, in accordance with an embodiment of the present invention. In this system, the device is used in conjunction with a game console 92 to operate an interactive game with one of the two participants 94 and 96. To this end, device 22 projects a pattern of spots 1 至 onto objects in its field of view (including participants 98 and one of the walls 98 (and other components) of the room in which system 9 is located). The device 22 captures and processes the image of the pattern, as explained above, to form a 3D image of the participant and the scene, the object. The console % responds to the participant's body movement 158131.doc • 20- 201233141 to control the game, and the participant's body movement is detected by the device 22 or console 92 by segmenting and analyzing the changes in the 3D image. System 90 is shown here to illustrate some of the difficulties encountered by 3D mapping systems. Objects on the mapped scene can vary greatly in size, and often small objects (such as participating hands, legs, and heads) move quickly and change their appearance. Moreover, it is often the case that such objects need to be accurately mapped for the purpose of games or other interactive applications running on console 92. At the same time, different objects in the region of interest may reflect the image at a greatly different intensity due to the difference in reflectance and the distance from the device, and the reflection is returned to the device 22β. The patterns in the shirts that are captured may be too dark to provide accurate depth reading. To overcome these problems, the illumination module 3 in the device 22 is adaptively projected' to change the density and/or brightness of the pattern in response to the geometry of the scene. The information about the geometry of the scene is controlled by the imaging module and/or by the image generated by processing the images. During the operation of the system 90, the radiation source 32 and the scanner I are dynamically controlled to Small or rapid changes, or important objects that need close attention or better depth coverage (such as the body of participants 94 and 96) with a larger frost mill is · county {
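The adaptive behavior described above, in which more spots and more optical power are directed toward regions that are dark, distant or of particular interest, and less toward bright nearby surfaces, can be thought of as a control loop over a coarse map of the scene. The sketch below is meant only to make that idea concrete; the region grid, the weighting formula and the clipping limits are assumptions rather than values given in the patent.

```python
import numpy as np

def region_weights(depth_map, brightness_map, interest_mask,
                   grid=(8, 8), gain_cap=4.0):
    """Compute a relative power/density weight for each region of the field
    of view from a coarse depth map, the measured pattern brightness, and a
    mask of regions flagged as 'of interest' (illustrative control sketch)."""
    h, w = depth_map.shape
    gh, gw = grid
    weights = np.ones(grid)
    for i in range(gh):
        for j in range(gw):
            ys = slice(i * h // gh, (i + 1) * h // gh)
            xs = slice(j * w // gw, (j + 1) * w // gw)
            z = np.nanmedian(depth_map[ys, xs])            # farther -> more power
            b = np.median(brightness_map[ys, xs]) + 1e-6   # darker -> more power
            roi = interest_mask[ys, xs].mean()             # tracked object -> denser
            weights[i, j] = (z / b) * (1.0 + roi)
    weights /= weights.mean()
    return np.clip(weights, 1.0 / gain_cap, gain_cap)

# The illumination module could then scale the laser power and/or the local
# spot density (for example, the mirror dwell time) in each region by the
# returned weight on the next scan of the scene.
```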
以便補償所擷取景物内 〜叫,m —稀疏圖案來涵蓋大 裝置22可回應於景物之變化而 裝置22可動態地調整輻射源32之輸 取景物内之距離及反射率之變化。 i58131.doc •21· 201233141 囚此 ,〜w衣直U之物體 (諸如背景98)方向以更大亮度投影光點,同時減少明亮、 附近物體上之所投影功率。另一選擇係或另外,可控制鏡 之本端掃描速度及因此在掃描範圍内之每一位置處之駐留 時間以在需要更強照射之區中賦予更長本端駐留時間及因 此更大本端投影能量^此等種類之自適應功率控制增強系 統90之動態範圍且使對可用輻射功率之利用最優化。 作為系統90之動態操作之另一個態樣(未圖解說明於圖6 中),亦可動態地調整裝置22中之照射及成像模組之角範 圍。舉例而言’在首先掏取—廣角影像並形成所關注區域 之一廣角、低解析度3D映像之後,可控制裝置22以放大已 在該區域内識別出之特定物體。因此,可減小所投影圖案 之角掃描範圍及成像模組之感測範圍以提供參與者叫及% 之身體之更高解析度深度映像。當參與者在該景物内移動 時’可相應地調整掃描及感測範圍。 圖7係根據本發明之再一實施例,一照射模組11〇之一示 意性側視圖。模組11〇可用來代替裝置22中之模組3〇(圖 1),且提供在使用同一掃描硬體來同時投影紅外圖案(用於 3D映射)及可由該裝置之一使用者觀看之可見内容兩者方 面之附加功能。 在此種實施例中,裝置22可使用紅外圖案來形成一給定 物體之個3D映像,且隨後可將適合該物體之形狀及外形 之一可見影像投影至該物體上。此種功能例如在呈現使用 者介面圖形及文字方面,且特定而言在「擴增實境」應用 158I31.doc -22· 201233141 (針對其裝置22甚至可整合成由使用者佩帶之護目鏡以使 得映射及可見影像投影與使用者之視域對準)方面係有用 的。此種類之應用例如闡述於其揭示内容以引用方式併入 本文中之於2011年7月18日提出申請之PCT專利申請案 PCT/IB2011/053192 中。 如圖7中所示,一光束組合器114(諸如二向色反射器)將 來自輻射源32之紅外光束與來自一可見光源112之一可見 光束對準。源112可係單色的或多色的。舉例而言,源m 可包含一適用於單色照射之雷射二極體或LED,或其可包 含其光束經調變及組合以便在視域中之每一點處投影所期 望色彩之不同色彩之多個雷射二極體或LED(未展示)。出 於後一種目的,組合器114可組合兩個或更多個二向色元 件(未展示)以便對準所有該等不同色彩及紅外光束。 當鏡50在FOV 36内掃描時,處理器46同時調變源32及 72 .源32經調變以在視域中之每一點處產生用於3D映射之 所期望圖案,而源112係根據欲在同一點處投影之可見影 像(其可基於物體於彼點處之3D映像)之像素值(強度及可 能色彩)調變。由於可見及紅外光束係光學對準及同軸 的’因此可見影像將自動地與3d映射配準。 影像擷取組態 圖8係根據本發明之一實施例,成像模組刊之一示意性 形象化圖。此貫施例利用照射模組3 0之掃描圖案與成像模 組之續出圖案之間的同步。此同步使得能夠使用具有相對 於仃之數目之一相對小數目個列122之偵測器元件41之一 15813I.doc -23· 201233141 影像感測器12 0。換§之,該影像感測器本身具有沿γ方向 (圖中之垂直方向)之低解析度’但沿χ方向之高解析度。 在所描繪實施例中’影像感測器80具有不到十個列且可具 有例如一千個或更多個行。另一選擇係,影像感測器可具 有更大或更小數目個列,但仍遠少於行之列。 物鏡光學器件42包含將視域44映射至影像感測器12〇之 一像散成像元件。光學器件42具有沿γ方向較沿又方向為 大之放大率,以使得影像感測器之每—列122擷取來自該 視域中之一對應矩形區124之光。舉例而言,每—矩形區 124之長寬比可大約為10:1 (χ:γ),而列122具有大約為 1000:1之一長寬比。可根據影像感測器中之列及行之數目 及視域之所期望尺寸按任一所期望比例來選取光學器件4 2 之不同X及Υ放大率。舉例而言,在一項實施例中,光學 益件42可包含一圓柱形透鏡,且感測器12〇可包含僅一單 個列之偵測器元件。 ”、、射模組30以一光栅圖案在視域44内掃描來自輻射源 之光束,從而用多個水平掃描線來涵蓋區124中之每一 者。當每一個線由來自該照射模組之光點掃描時,對應列 122擷取自該景物反射之輻射。自感測器12〇之讀出與照射 掃描同步’以使得實質上僅在該景物中之對應區124由該 掃描照射時讀出該等列122之偵測器元件41。因此,減小 其間每一列122針對每—讀出整合環境光之時間長度,且 因此增強感測器12〇之輪出中之信號/環境比。 在此實施例中由模組38擷取之影像之解析度取決於照射 158131.doc •24· 201233141 掃描之解析度,因為與掃描同步多次讀出每一列i22之感 測器120。換言之,在照射模組掃描對應區124時多次掃描 第列,並隨後多次掃描第二線,然後多次掃描第三線等 ^。舉例而言,若影像感測器中之第一列負責擷取照射模 組之刖—百個掃描線(垂直阳乂之1/10),則在掃描第二線 之刖—百次掃描第一列。為此目的,感測器120包括類似 於例如-習用、全解析度CM0S影像感測器中之讀出電路 之合適讀出電路(未展示)。另一選擇係,可在照射側上使 用一垂直扇出元件,且影像感測器之線可同時掃描與對應 照射知描同步之每一者β 另選擇係,可與照射掃描同步垂直多工對影像感測器 120之掃描。舉例而言,在此方案中,使用具有一百個列 之成像感測器,該影像感測器之第一列擷取例如照射掃 才田之第一個、第101個、第201個、第301個掃描線等等。 另外或另一選擇係,成像模組38可實施例如闡述於其揭示 内谷以引用方式併入本文中之於2〇1〇年12月6日提出申請 之美國臨時專利申請案61/419,891中所述之種類之經空間 多工成像方案。亦可使用上述掃描技術之一組合。 展示於圖8中之影像擷取模組3 8之配置維持沿χ方向之全 解析度,而依賴於照射模組3〇之經同步掃描來提供γ方向 解析度。此貫施例因此允許藉助更簡單的讀出電路來使影 像感測器120變得更小且成本更低,同時潛在地提供沿乂方 向之增大之解析度。此χ解析度係有用的,因為在圖丨中所 不之組態中,僅由模組38擷取之影像中之圖案之χ方向移 15813I.doc -25- 201233141 位指示景物内之深度變化,如上文所解釋。沿又方向之高 解析度因此提供對光點偏移之準確讀取,此又會達成對深 度座標之準確計算。 圖9係根據本發明之另一實施例,成像模組刊之一示意 性形象化圖。此實施例亦利用照射模組3〇之掃描圖案與成 像模組之讀出圖案之間的同步。然而在此種情況下,有效 地掃描成像模組3 8之視域44。 在圖9之實施例中,影像擷取模組38包含一位置靈敏偵 測器13 0,該位置靈敏偵測器經組態以感測並輸出由照射 模組30投影至物體上之圖案中之每一光點之一X偏移。換 言之,當模組30依次將每一光點投影至所關注區域中之對 應位置上時,偵測器13〇感測其影像並指示其相對於對應 參考光點位置之各別偏移(及因此景物於彼位置處之深度 座私)。偵測器130在該圖中展示為具有沿χ方向延伸之一 單個列之偵測器元件41之一線掃描感測器9〇。另一選擇 係,位置靈敏债測器Π0可包含具有指示當前成像至該债 測器上之光點之位置之一類比讀出之一單式偵測器元件。 在所描繪實施例中,物鏡光學器件42將視域44中之一矩 形區134映射至感測器13〇中之該列之偵測器元件41上。光 學器件42可如同在該先前實施例中一樣係像散的,其具有 沿X方向較沿Υ方向為大之光功,以使得區134具有一較該 感測器中之該列之偵測器元件為低之長寬比(χ:γ)。一掃 描鏡132與照射模組30之光柵掃描同步沿γ方向在視域梢内 掃描區134,以使得區134始終含有當前處於圖案化照射下 158131.doc •26- 201233141 之水平線。以此方式’該影像擷取模組擷取具有高解析度 及间^號/環境比之景物上之圖案之一影像,同時使用一 簡單一維感測器。 圖10係根據本發明之再一實施例,一個3D映射裝置140 之示思性、形象化圖。在此實施例中,一照射模組142 及成像模組144之視域共同地由一共同鏡150沿Y方向掃 描。照射模組142包含輻射源32及呈一掃描鏡152之形式之 一光束掃描器,該光束掃描器沿X方向掃描輻射光束。成 
像模組144包含一偵測器154,該偵測器之視域與掃描鏡 152同步由包含一掃描鏡156之一成像掃描器沿X方向掃 描。投影光學器件146將照射光束投影至裝置14〇之所關注 區域上以便形成景物中之物體上之光點之所期望圖案,且 物鏡光學器件148將該等光點成像至偵測器ι54上。 因此與照射模組142之所掃描光束同步地在所關注區域 内掃描成像模組144之感測區。偵測器154可包含例如如同 在圖9之實施例中一樣之一位置靈敏偵測器或一其中谓測 器元件配置成列及行之一小面積影像感測器。如同在該等 先前實施例中一樣,成像模組144提供對光點偏移之量 測’該等量測隨後可經處理以產生一深度映像。儘管圖 中之鏡150方便地由照射及成像模組兩者共用,但每—模 組亦可具有其自帶、經同步Y方向掃描鏡。在後—種情兄 下’可使用MEMS技術來產生裝置14〇中之所有該等鏡 然而’由於Y方向掃描相對慢,因此具有一步進式驅動 之單個鏡150對於此應用係可行的且有利於維持精確同 158131.doc -27- 201233141 步ο 圖10中所示之配置可用來實施不僅對所投影圖案令之光 點密度及亮度’而且對掃描區之動態控制,如 丄又所解 釋。可操作鏡150及152以改變在其内掃描照射光束之角範 圍,且鏡150及156然後將掃描偵測器154之掃描區以與所 掃描光束之角範圍相匹配。以此方式,例如,可首先操作 裝置140以擷取一整個景物之一粗3D映像,從而涵蓋一寬 的角範圍。可分割該3D映像以便識別一所關注物體,且隨 後可動態地調整照射總成及成像總成之掃描範圍以擷取並 映射僅具有高解析度之物體之區。對此種動態控制之其他 應用將為熟習此項技術者所知且視為歸屬於本發明之範 鳴。 上文已展示並闡述了用以增強用於圖案投影及影像擷 之掃描架構之若干個具體方式。此等實施例以舉例方式 解說明本發明之態樣如何可尤其用於改善眼睛安全、增 視域、同時使用同一掃描硬體來投影紅外圓案及可見内 兩者、以及藉由使成像模組與照射模組同步來減小成像 組之尺寸。以上實施例之替代實施方案及組合亦視為歸 於本發明之範疇。此等方案可使用以下之各種組合:用 3D映射之掃描投影以及對可見資訊之投影;用以成形或 裂掃描光束之繞射光學器件;用以擴大投H統之減 折射及域繞射光學器件;及對投影及影㈣取之㈣ 掃描。 且 因此應瞭解,列舉上文所述各實施例僅為舉例說明, 158131.doc -28· 201233141 本發明不僅限於上文中特別展示及闡述之内容。而是,本 發明之範疇包括上文中所述之各種特徵之組合及子組合兩 者’以及熟習此項技術者在閱讀上述說明後可想到且在先 前技術中未揭示之本發明之變化形式及修改形式。 【圖式簡單說明】 圖1係根據本發明之一實施例,一用於3D映射之系統之 一示意性俯視圖; 圖2 A及圖2B係根據本發明之一實施例,一照射模組於 兩個不同操作階段中之示意性俯視圖; 圖2C係根據本發明之一實施例,由圖2A及圖2B之模組 才又影之一圖案之一示意性正視圖; 圖3至圖5係根據本發明之其他實施例,照射模組之示意 性俯視圖; 圖6係根據本發明之一實施例,一個3]〇映射系統在操作 中之一示意性、形象化圖; 圖7係根據本發明之再一實施例,一照射模組之一示意 性俯視圖; 圖8及圖9係根據本發明之實施例,成像模組之示意性、 形象化圖;及 圖1 〇係根據本發明之一實施例,一個3]〇成像裝置之一示 意性、形象化圖。 【主要元件符號說明】 20 系統 22 映射裝置 158131.doc -29· 物體 照射模組 輻射源 掃描器 照射光學器件 投影視域 成像模組 感測器 偵測器元件 物鏡光學器件 成像視域 處理器 記憶體 掃描鏡 雷射二極體 準直透鏡 光東分裂器 所掃描光束之多個有角間隔開之複製物 所掃描光束之多個有角間隔開之複製物 所掃描光束之多個有角間隔開之複製物 圖案 光點 、繞射光學元件 中間光束 •30- 201233141 74 76 80 82 88 90 92 94 96 98 100 110 112 114 120 122 124 130 132 134 140 142 144 146 圖案化輸出光束 軸 光束感測器 光束感測器 凸面反射器 三維映射系統 遊戲控制臺 參與者 參與者 背景 光點 照射模組 可見光源 組合器 影像感測器 列 矩形區 位置靈敏偵測器 掃描鏡 矩形區 三維映射裝置 照射模組 成像模組 投影光學器件 158131.doc 201233141 148 物鏡光學器件 150 共同鏡 152 掃描鏡 154 偵測器 156 掃描鏡 158131.doc -32·In order to compensate for the size of the captured scene, the m-sparse pattern covers the large device 22 in response to changes in the scene. The device 22 can dynamically adjust the change in the distance and reflectance within the input scene of the radiation source 32. I58131.doc •21· 201233141 Prison this, ~w clothes straight U objects (such as background 98) direction to project light spots with greater brightness, while reducing the projected power on bright, nearby objects. Alternatively or additionally, the local scanning speed of the mirror and thus the dwell time at each location within the scanning range can be controlled to impart a longer local residence time and therefore a larger margin in areas where more illumination is required End-projection energy ^ These types of adaptive power control enhance the dynamic range of system 90 and optimize the utilization of available radiated power. As another aspect of the dynamic operation of system 90 (not illustrated in Figure 6), the angular extent of the illumination and imaging modules in device 22 can also be dynamically adjusted. For example, after first capturing a wide-angle image and forming a wide-angle, low-resolution 3D image of the region of interest, the device 22 can be controlled to magnify the particular object that has been identified within the region. Thus, the angular scan range of the projected pattern and the sensing range of the imaging module can be reduced to provide a higher resolution depth map of the body of the participant and %. 
FIG. 7 is a schematic side view of an illumination module 110, in accordance with yet another embodiment of the present invention. Module 110 may be used in place of module 30 in device 22 (FIG. 1), and provides the added capability of using the same scanning hardware to project simultaneously both an infrared pattern (for 3D mapping) and visible content that can be viewed by a user of the device.

In an embodiment of this kind, device 22 may use the infrared pattern to form a 3D map of a given object, and may then project onto the object a visible image suited to the shape and contours of that object. This sort of capability is useful, for example, in presenting user-interface graphics and text, and particularly in "augmented reality" applications, for which device 22 may even be integrated into goggles worn by the user, so that the mapping and the projected visible image are aligned with the user's field of view. Applications of this kind are described, for example, in PCT patent application PCT/IB2011/053192, filed July 18, 2011, whose disclosure is incorporated herein by reference.

As shown in FIG. 7, a beam combiner 114, such as a dichroic reflector, aligns the infrared beam from radiation source 32 with a visible beam from a visible light source 112. Source 112 may be monochromatic or polychromatic. For example, source 112 may comprise a laser diode or LED suitable for monochromatic illumination, or it may comprise multiple laser diodes or LEDs of different colors (not shown), whose beams are modulated and combined so as to project the desired color at each point in the field of view. For the latter purpose, combiner 114 may combine two or more dichroic elements (not shown) in order to align all of the different colored beams with the infrared beam.

As mirror 50 scans over FOV 36, processor 46 modulates sources 32 and 112 simultaneously: source 32 is modulated so as to generate the desired pattern for 3D mapping at each point in the field of view, while source 112 is modulated according to the pixel value (intensity and possibly color) of the visible image to be projected at the same point, which may itself be based on the 3D map of the object at that point. Because the visible and infrared beams are optically aligned and coaxial, the visible image is automatically registered with the 3D map.
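Purely as an illustrative sketch, the generator below shows the kind of per-point, dual-source modulation described above; the function name and data layout are assumptions of this example rather than anything taken from the patent.

```python
def drive_frame(pattern_mask, visible_image):
    """Illustrative sketch: for each mirror position of the raster scan, switch
    the infrared source according to the mapping pattern and drive the visible
    source with the image pixel to be shown at the same point."""
    height, width = len(pattern_mask), len(pattern_mask[0])
    for y in range(height):        # slow (Y) scan axis
        for x in range(width):     # fast (X) scan axis
            ir_level = 1.0 if pattern_mask[y][x] else 0.0
            r, g, b = visible_image[y][x]
            yield (x, y), ir_level, (r, g, b)   # drive commands for the two sources
```

Because both drive commands are issued for the same mirror position, the infrared pattern and the visible image remain registered by construction, as the text above explains.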
Image Capture Configurations

FIG. 8 is a schematic, pictorial illustration of imaging module 38, in accordance with an embodiment of the present invention. This embodiment takes advantage of synchronization between the scan pattern of illumination module 30 and the readout pattern of the imaging module. The synchronization makes it possible to use an image sensor 120 having a relatively small number of rows 122 of detector elements 41, relative to the number of columns. In other words, the image sensor itself has low resolution in the Y-direction (the vertical direction in the figure) but high resolution in the X-direction. In the pictured embodiment, image sensor 120 has fewer than ten rows, but may have, for example, a thousand or more columns. Alternatively, the image sensor may have a larger or smaller number of rows, though still far fewer rows than columns.

Objective optics 42 comprise an astigmatic imaging element, which maps field of view 44 onto image sensor 120. Optics 42 have greater magnification in the Y-direction than in the X-direction, so that each row 122 of the image sensor captures light from a corresponding rectangular region 124 in the field of view. For example, each rectangular region 124 may have an aspect ratio of roughly 10:1 (X:Y), while rows 122 have an aspect ratio of roughly 1000:1. The different X- and Y-magnifications of optics 42 may be chosen in any desired proportion, according to the numbers of rows and columns in the image sensor and the desired dimensions of the field of view. In one embodiment, for example, optics 42 may comprise a cylindrical lens, and sensor 120 may comprise only a single row of detector elements.

Illumination module 30 scans the beam from the radiation source over field of view 44 in a raster pattern, covering each of regions 124 with multiple horizontal scan lines. As each such line is scanned by the spot from the illumination module, the corresponding row 122 captures the radiation reflected from the scene. Readout from sensor 120 is synchronized with the illumination scan, so that the detector elements 41 of a given row 122 are read out substantially only while the corresponding region 124 of the scene is illuminated by the scan. The length of time over which each row 122 integrates ambient light between readouts is thereby reduced, and the signal-to-ambient ratio in the output of sensor 120 is accordingly enhanced.

The resolution of the image captured by module 38 in this embodiment depends on the resolution of the illumination scan, since sensor 120 reads out each row 122 multiple times in synchronization with the scan. In other words, the first row is read multiple times while the illumination module scans the corresponding region 124, then the second row is read multiple times, then the third, and so forth. For example, if the first row of the image sensor is responsible for capturing the first hundred scan lines of the illumination module (one tenth of the vertical field of view), then the first row is read out a hundred times before the second row is scanned. For this purpose, sensor 120 comprises suitable readout circuits (not shown), similar, for example, to the readout circuits of a conventional, full-resolution CMOS image sensor. Alternatively, a vertical fan-out element may be used on the illumination side, and the rows of the image sensor may then be scanned simultaneously, each in synchronization with its corresponding illumination scan.

As a further alternative, the scan of image sensor 120 may be vertically multiplexed in synchronization with the illumination scan. In such a scheme, using an imaging sensor with one hundred rows, for example, the first row of the image sensor captures the first, 101st, 201st and 301st scan lines of the illumination scan, and so forth. Additionally or alternatively, imaging module 38 may implement spatially-multiplexed imaging schemes of the kinds described in U.S. Provisional Patent Application 61/419,891, filed December 6, 2010, whose disclosure is incorporated herein by reference. Combinations of the above scanning techniques may also be used.

The configuration of image capture module 38 shown in FIG. 8 maintains full resolution in the X-direction, while relying on the synchronized scanning of illumination module 30 to provide resolution in the Y-direction. This embodiment therefore allows image sensor 120 to be made smaller and less costly, with simpler readout circuits, while potentially providing increased resolution in the X-direction.
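The row-selection logic implied by this synchronized readout can be sketched as follows. This is a simplification for illustration only, assuming a uniform raster scan and equal-height regions 124; it is not the readout-circuit design of the patent.

```python
def row_to_read(scan_line, total_scan_lines, sensor_rows):
    """Illustrative sketch: a sensor row is read out only while the raster scan
    is inside the rectangular region that the astigmatic optics map onto it."""
    lines_per_row = max(1, total_scan_lines // sensor_rows)
    return min(scan_line // lines_per_row, sensor_rows - 1)

# With 1000 illumination scan lines and a 10-row sensor, scan lines 0-99 are
# accumulated and read from row 0, lines 100-199 from row 1, and so on.
assert row_to_read(150, 1000, 10) == 1
```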
The X-resolution is useful because, in the configuration shown in FIG. 1, it is only the X-direction shift of the pattern in the images captured by module 38 that indicates depth variations within the scene, as explained above. High resolution in the X-direction therefore gives an accurate reading of the spot offsets, which in turn yields accurate computation of the depth coordinates.

FIG. 9 is a schematic, pictorial illustration of imaging module 38, in accordance with another embodiment of the present invention. This embodiment likewise takes advantage of synchronization between the scan pattern of illumination module 30 and the readout pattern of the imaging module. In this case, however, the field of view 44 of imaging module 38 is itself effectively scanned.

In the embodiment of FIG. 9, image capture module 38 comprises a position-sensitive detector 130, which is configured to sense and output the X-direction offset of each of the spots in the pattern that illumination module 30 projects onto the object. In other words, as module 30 projects each spot in turn onto the corresponding location in the region of interest, detector 130 senses the image of the spot and indicates its respective offset relative to the corresponding reference spot location (and hence the depth of the scene at that location). Detector 130 is shown in the figure as a line-scan sensor comprising a single row of detector elements 41 extending in the X-direction. Alternatively, position-sensitive detector 130 may comprise a unitary detector element with an analog readout that indicates the position of the spot currently imaged onto the detector.

In the pictured embodiment, objective optics 42 map a rectangular region 134 of field of view 44 onto the row of detector elements 41 in sensor 130. Optics 42 may be astigmatic, as in the preceding embodiment, with greater optical power in the X-direction than in the Y-direction, so that region 134 has a lower aspect ratio (X:Y) than the row of detector elements in the sensor. A scanning mirror 132 scans region 134 over field of view 44 in the Y-direction, in synchronization with the raster scan of illumination module 30, so that region 134 always contains the horizontal line that is currently under the patterned illumination. In this manner the image capture module captures an image of the pattern on the scene with high resolution and a high signal-to-ambient ratio, while using a simple one-dimensional sensor.
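For illustration only, the sketch below shows how measured spot offsets of this sort might be converted into depth coordinates. The triangulation formula and the calibration parameters (reference distance, baseline, focal length) are assumptions of the example and are not specified in the patent text.

```python
def depth_from_offset(offset_px, z_ref_m, baseline_m, focal_px):
    """Illustrative sketch: convert the X-offset of a projected spot, measured
    relative to its reference position at a known calibration distance, into a
    depth estimate using a standard projector/camera triangulation relation."""
    d_ref = focal_px * baseline_m / z_ref_m   # disparity at the reference plane
    d = d_ref + offset_px                     # total disparity after the measured shift
    return focal_px * baseline_m / d

# A 5-pixel shift relative to a 2 m reference plane, with a 5 cm baseline and a
# focal length of 600 pixels, corresponds to a surface at about 1.5 m.
print(round(depth_from_offset(5.0, 2.0, 0.05, 600.0), 3))
```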
FIG. 10 is a schematic, pictorial illustration of a 3D mapping device 140, in accordance with yet another embodiment of the present invention. In this embodiment, the fields of view of an illumination module 142 and an imaging module 144 are scanned together in the Y-direction by a common mirror 150. Illumination module 142 comprises radiation source 32 and a beam scanner, in the form of a scanning mirror 152, which scans the radiation beam in the X-direction. Imaging module 144 comprises a detector 154, whose field of view is scanned in the X-direction, in synchronization with scanning mirror 152, by an imaging scanner comprising a scanning mirror 156. Projection optics 146 project the illumination beam onto the region of interest of device 140 so as to form the desired pattern of spots on objects in the scene, and objective optics 148 image the spots onto detector 154.

The sensing area of imaging module 144 is thus scanned over the region of interest in synchronization with the scanned beam of illumination module 142. Detector 154 may comprise, for example, a position-sensitive detector as in the embodiment of FIG. 9, or a small-area image sensor in which the detector elements are arranged in rows and columns. As in the preceding embodiments, imaging module 144 provides measurements of the spot offsets, which may then be processed to generate a depth map. Although mirror 150 in the figure is conveniently shared by the illumination and imaging modules, each module may alternatively have its own, synchronized Y-direction scanning mirror. In the latter case, MEMS technology may be used to produce all of the mirrors in device 140. Because the Y-direction scan is relatively slow, however, a single mirror 150 with a stepped drive is feasible for this application and is advantageous in maintaining precise synchronization.

The configuration shown in FIG. 10 can be used to implement dynamic control not only of the spot density and brightness of the projected pattern, but also of the scanned area, as explained above. Mirrors 150 and 152 may be operated to vary the angular range over which the illumination beam is scanned, and mirrors 150 and 156 will then match the scanned area of detector 154 to the angular range of the scanned beam. In this way, for example, device 140 may first be operated to capture a coarse 3D map of an entire scene, covering a wide angular range. The 3D map may be segmented in order to identify an object of interest, and the scan ranges of the illumination and imaging assemblies may then be adjusted dynamically so as to capture and map only the area of that object, with high resolution. Other applications of this sort of dynamic control will be apparent to those skilled in the art and are considered to be within the scope of the present invention.

A number of specific ways of enhancing scanning architectures for pattern projection and image capture have been shown and described above. These embodiments illustrate, by way of example, how aspects of the present invention may be used, among other things, to improve eye safety, to enlarge the field of view, to project both an infrared pattern and visible content using the same scanning hardware, and to reduce the size of the imaging module by synchronizing it with the illumination module. Alternative implementations and combinations of the above embodiments are likewise considered to be within the scope of the present invention. Such schemes may use various combinations of the following: scanned projection for 3D mapping together with projection of visible information; diffractive optics for shaping or splitting the scanned beam; refractive and/or diffractive optics for expanding the field of the projection system; and synchronized scanning for both projection and image capture.

It will thus be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and sub-combinations of the various features described hereinabove, as well as variations and modifications thereof that would occur to persons skilled in the art upon reading the foregoing description and that are not disclosed in the prior art.
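The coarse-to-fine control described above might be sketched, purely for illustration, as follows; the depth-window segmentation rule, the margins, and the return values are assumptions of this example rather than the patent's method.

```python
import numpy as np

def refine_scan_range(coarse_depth, full_range_deg=(60.0, 46.0),
                      near_m=0.5, far_m=1.5, margin=0.05):
    """Illustrative sketch: segment a coarse depth map by a depth window to find
    an object of interest, then shrink the projector and imager scan ranges to
    the fraction of the field of view occupied by its bounding box."""
    rows, cols = coarse_depth.shape
    mask = (coarse_depth > near_m) & (coarse_depth < far_m)
    if not mask.any():
        return full_range_deg, (0.0, 0.0)        # nothing found: keep the wide scan
    ys, xs = np.nonzero(mask)
    x0, x1 = xs.min() / cols - margin, xs.max() / cols + margin
    y0, y1 = ys.min() / rows - margin, ys.max() / rows + margin
    h_deg = (x1 - x0) * full_range_deg[0]
    v_deg = (y1 - y0) * full_range_deg[1]
    center = ((x0 + x1) / 2 - 0.5, (y0 + y1) / 2 - 0.5)  # fractional offset of the ROI center
    return (h_deg, v_deg), center
```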
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic top view of a system for 3D mapping, in accordance with an embodiment of the present invention;

FIGS. 2A and 2B are schematic top views of an illumination module in two different phases of operation, in accordance with an embodiment of the present invention;

FIG. 2C is a schematic frontal view of a pattern projected by the module of FIGS. 2A and 2B, in accordance with an embodiment of the present invention;

FIGS. 3-5 are schematic top views of illumination modules, in accordance with other embodiments of the present invention;

FIG. 6 is a schematic, pictorial illustration of a 3D mapping system in operation, in accordance with an embodiment of the present invention;

FIG. 7 is a schematic side view of an illumination module, in accordance with yet another embodiment of the present invention;

FIGS. 8 and 9 are schematic, pictorial illustrations of imaging modules, in accordance with embodiments of the present invention; and

FIG. 10 is a schematic, pictorial illustration of a 3D imaging device, in accordance with an embodiment of the present invention.

[Description of main reference numerals]

20 system; 22 mapping device; object; 30 illumination module; 32 radiation source; scanner; illumination optics; 36 projection field of view; 38 imaging module; sensor; 41 detector elements; 42 objective optics; 44 imaging field of view; 46 processor; memory; 50 scanning mirror; laser diode; collimating lens; beam splitter; angularly spaced replicas of the scanned beam; angularly spaced replicas of the scanned beam; angularly spaced replicas of the scanned beam; pattern; spots; diffractive optical element; intermediate beam; 74 patterned output beam; 76 axis; 80 beam sensor; 82 beam sensor; 88 convex reflector; 90 3D mapping system; 92 game console; 94 participant; 96 participant; 98 background; 100 spots; 110 illumination module; 112 visible light source; 114 combiner; 120 image sensor; 122 rows; 124 rectangular region; 130 position-sensitive detector; 132 scanning mirror; 134 rectangular region; 140 3D mapping device; 142 illumination module; 144 imaging module; 146 projection optics; 148 objective optics; 150 common mirror; 152 scanning mirror; 154 detector; 156 scanning mirror
Claims (1)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US37272910P | 2010-08-11 | 2010-08-11 | |
| US201061425788P | 2010-12-22 | 2010-12-22 | |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| TW201233141A | 2012-08-01 |
| TWI513273B (en) | 2015-12-11 |
Family ID=45567416
Family Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW100128723A TWI513273B (en) | 2010-08-11 | 2011-08-11 | Scanning projectors and image capture modules for 3d mapping |
Country Status (4)
| Country | Link |
|---|---|
| US (4) | US9098931B2 (en) |
| CN (1) | CN103053167B (en) |
| TW (1) | TWI513273B (en) |
| WO (1) | WO2012020380A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| TWI549476B (en) * | 2013-12-20 | 2016-09-11 | 友達光電股份有限公司 | Display system and method for adjusting visible range |
Families Citing this family (211)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| IL165212A (en) | 2004-11-15 | 2012-05-31 | Elbit Systems Electro Optics Elop Ltd | Device for scanning light |
| US8400494B2 (en) * | 2005-10-11 | 2013-03-19 | Primesense Ltd. | Method and system for object reconstruction |
| US8600120B2 (en) | 2008-01-03 | 2013-12-03 | Apple Inc. | Personal computing device control using face detection and recognition |
| US8933876B2 (en) | 2010-12-13 | 2015-01-13 | Apple Inc. | Three dimensional user interface session control |
| US9035876B2 (en) | 2008-01-14 | 2015-05-19 | Apple Inc. | Three-dimensional user interface session control |
| US8384997B2 (en) | 2008-01-21 | 2013-02-26 | Primesense Ltd | Optical pattern projection |
| JP5588310B2 (en) | 2009-11-15 | 2014-09-10 | プライムセンス リミテッド | Optical projector with beam monitor |
| US20110187878A1 (en) | 2010-02-02 | 2011-08-04 | Primesense Ltd. | Synchronization of projected illumination with rolling shutter of image sensor |
| US9825425B2 (en) | 2013-06-19 | 2017-11-21 | Apple Inc. | Integrated structured-light projector comprising light-emitting elements on a substrate |
| WO2012011044A1 (en) | 2010-07-20 | 2012-01-26 | Primesense Ltd. | Interactive reality augmentation for natural interaction |
| US9201501B2 (en) | 2010-07-20 | 2015-12-01 | Apple Inc. | Adaptive projector |
| CN103053167B (en) * | 2010-08-11 | 2016-01-20 | 苹果公司 | Scanning projector and the image capture module mapped for 3D |
| US10739460B2 (en) | 2010-08-11 | 2020-08-11 | Apple Inc. | Time-of-flight detector with single-axis scan |
| US9036158B2 (en) | 2010-08-11 | 2015-05-19 | Apple Inc. | Pattern projector |
| US8959013B2 (en) | 2010-09-27 | 2015-02-17 | Apple Inc. | Virtual keyboard for a non-tactile three dimensional user interface |
| EP2643659B1 (en) | 2010-11-19 | 2019-12-25 | Apple Inc. | Depth mapping using time-coded illumination |
| US9131136B2 (en) | 2010-12-06 | 2015-09-08 | Apple Inc. | Lens arrays for pattern projection and imaging |
| US8872762B2 (en) | 2010-12-08 | 2014-10-28 | Primesense Ltd. | Three dimensional user interface cursor control |
| CN106125921B (en) | 2011-02-09 | 2019-01-15 | 苹果公司 | Gaze detection in 3D map environment |
| US9857868B2 (en) | 2011-03-19 | 2018-01-02 | The Board Of Trustees Of The Leland Stanford Junior University | Method and system for ergonomic touch-free interface |
| US8840466B2 (en) | 2011-04-25 | 2014-09-23 | Aquifi, Inc. | Method and system to create three-dimensional mapping in a two-dimensional game |
| US11165963B2 (en) | 2011-06-05 | 2021-11-02 | Apple Inc. | Device, method, and graphical user interface for accessing an application in a locked device |
| US9377865B2 (en) | 2011-07-05 | 2016-06-28 | Apple Inc. | Zoom-based gesture user interface |
| US9459758B2 (en) | 2011-07-05 | 2016-10-04 | Apple Inc. | Gesture-based interface with enhanced features |
| US8881051B2 (en) | 2011-07-05 | 2014-11-04 | Primesense Ltd | Zoom-based gesture user interface |
| US8908277B2 (en) | 2011-08-09 | 2014-12-09 | Apple Inc | Lens array projector |
| US8749796B2 (en) | 2011-08-09 | 2014-06-10 | Primesense Ltd. | Projectors of structured light |
| US9030498B2 (en) | 2011-08-15 | 2015-05-12 | Apple Inc. | Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface |
| US9122311B2 (en) | 2011-08-24 | 2015-09-01 | Apple Inc. | Visual feedback for tactile and non-tactile user interfaces |
| US9218063B2 (en) | 2011-08-24 | 2015-12-22 | Apple Inc. | Sessionless pointing user interface |
| WO2013095679A1 (en) | 2011-12-23 | 2013-06-27 | Intel Corporation | Computing system utilizing coordinated two-hand command gestures |
| US10345911B2 (en) | 2011-12-23 | 2019-07-09 | Intel Corporation | Mechanism to provide visual feedback regarding computing system command gestures |
| WO2013095677A1 (en) * | 2011-12-23 | 2013-06-27 | Intel Corporation | Computing system utilizing three-dimensional manipulation command gestures |
| KR102273746B1 (en) * | 2012-01-11 | 2021-07-06 | 시리얼 테크놀로지즈 에스.에이. | Optical apparatus for illuminating a pixel matrix and/or a controllable spatial light modulator for a display |
| US8854433B1 (en) | 2012-02-03 | 2014-10-07 | Aquifi, Inc. | Method and system enabling natural user interface gestures with an electronic system |
| US9329080B2 (en) | 2012-02-15 | 2016-05-03 | Aplle Inc. | Modular optics for scanning engine having beam combining optics with a prism intercepted by both beam axis and collection axis |
| US9157790B2 (en) | 2012-02-15 | 2015-10-13 | Apple Inc. | Integrated optoelectronic modules with transmitter, receiver and beam-combining optics for aligning a beam axis with a collection axis |
| US9229534B2 (en) | 2012-02-28 | 2016-01-05 | Apple Inc. | Asymmetric mapping for tactile and non-tactile user interfaces |
| WO2013140308A1 (en) | 2012-03-22 | 2013-09-26 | Primesense Ltd. | Diffraction-based sensing of mirror position |
| US9335220B2 (en) | 2012-03-22 | 2016-05-10 | Apple Inc. | Calibration of time-of-flight measurement using stray reflections |
| US9435638B2 (en) | 2012-03-22 | 2016-09-06 | Apple Inc. | Gimbaled scanning mirror array |
| US11169611B2 (en) | 2012-03-26 | 2021-11-09 | Apple Inc. | Enhanced virtual touchpad |
| US9111135B2 (en) | 2012-06-25 | 2015-08-18 | Aquifi, Inc. | Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera |
| US9098739B2 (en) | 2012-06-25 | 2015-08-04 | Aquifi, Inc. | Systems and methods for tracking human hands using parts based template matching |
| TWI648561B (en) | 2012-07-16 | 2019-01-21 | 美商唯亞威方案公司 | Optical filter and sensor system |
| US10060728B2 (en) * | 2012-07-26 | 2018-08-28 | Nec Corporation | Three-dimensional object-measurement device, medium, and control method |
| DE112013003679B4 (en) | 2012-07-26 | 2023-05-04 | Apple Inc. | Dual axis scanning mirror and method of scanning |
| JP2015206590A (en) * | 2012-08-30 | 2015-11-19 | 三洋電機株式会社 | Information acquisition device and object detection device |
| US8836768B1 (en) | 2012-09-04 | 2014-09-16 | Aquifi, Inc. | Method and system enabling natural user interface gestures with user wearable glasses |
| KR20150063540A (en) | 2012-10-23 | 2015-06-09 | 애플 인크. | Production of micro-mechanical devices |
| US9152234B2 (en) | 2012-12-02 | 2015-10-06 | Apple Inc. | Detecting user intent to remove a pluggable peripheral device |
| US9092665B2 (en) | 2013-01-30 | 2015-07-28 | Aquifi, Inc | Systems and methods for initializing motion tracking of human hands |
| US9129155B2 (en) | 2013-01-30 | 2015-09-08 | Aquifi, Inc. | Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map |
| WO2014125427A1 (en) | 2013-02-14 | 2014-08-21 | Primesense Ltd. | Flexible room controls |
| EP2972081B1 (en) * | 2013-03-15 | 2020-04-22 | Apple Inc. | Depth scanning with multiple emitters |
| US9298266B2 (en) | 2013-04-02 | 2016-03-29 | Aquifi, Inc. | Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects |
| US9042717B2 (en) * | 2013-07-31 | 2015-05-26 | Delphi Technologies, Inc. | Camera system with rotating mirror |
| US9798388B1 (en) | 2013-07-31 | 2017-10-24 | Aquifi, Inc. | Vibrotactile system to augment 3D input systems |
| US9898642B2 (en) | 2013-09-09 | 2018-02-20 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs |
| US10203399B2 (en) | 2013-11-12 | 2019-02-12 | Big Sky Financial Corporation | Methods and apparatus for array based LiDAR systems with reduced interference |
| WO2015085982A1 (en) * | 2013-12-11 | 2015-06-18 | Api International Ag | Device for 3-d measurement of a surface and projection unit, and method for 3-d measurement |
| US9528906B1 (en) | 2013-12-19 | 2016-12-27 | Apple Inc. | Monitoring DOE performance using total internal reflection |
| US20150176977A1 (en) * | 2013-12-20 | 2015-06-25 | Lemoptix Sa | Methods and devices for determining position or distance |
| KR101710003B1 (en) | 2014-01-07 | 2017-02-24 | 한국전자통신연구원 | Real time dynamic non planar projection apparatus and method |
| US9507417B2 (en) | 2014-01-07 | 2016-11-29 | Aquifi, Inc. | Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects |
| CN106415361B (en) | 2014-01-19 | 2018-11-13 | 苹果公司 | Couple scheme for the scanning lens array equipped with universal joint |
| US9619105B1 (en) | 2014-01-30 | 2017-04-11 | Aquifi, Inc. | Systems and methods for gesture based interaction with viewpoint dependent user interfaces |
| JP2015141372A (en) * | 2014-01-30 | 2015-08-03 | 株式会社ミツトヨ | Irradiation device, irradiation method, measurement device, and measurement method |
| US20150244949A1 (en) * | 2014-02-21 | 2015-08-27 | Rajiv Laroia | Illumination methods and apparatus |
| CN104898124A (en) * | 2014-03-07 | 2015-09-09 | 光宝科技股份有限公司 | Sampling method for depth detection and optical device thereof |
| US9360554B2 (en) | 2014-04-11 | 2016-06-07 | Facet Technology Corp. | Methods and apparatus for object detection and identification in a multiple detector lidar array |
| TWI524050B (en) * | 2014-04-15 | 2016-03-01 | 聚晶半導體股份有限公司 | Image capturing device, image depth generating device and method thereof |
| US9633441B2 (en) * | 2014-06-09 | 2017-04-25 | Omnivision Technologies, Inc. | Systems and methods for obtaining image depth information |
| EP3167311B1 (en) * | 2014-07-08 | 2019-11-13 | Facebook Technologies, LLC | Method and system for adjusting light pattern for structured light imaging |
| JP6456084B2 (en) * | 2014-09-24 | 2019-01-23 | キヤノン株式会社 | Image processing apparatus, image processing method, and program |
| US20160096072A1 (en) * | 2014-10-07 | 2016-04-07 | Umm Al-Qura University | Method and system for detecting, tracking, and visualizing joint therapy data |
| US9835853B1 (en) | 2014-11-26 | 2017-12-05 | Apple Inc. | MEMS scanner with mirrors of different sizes |
| US9784838B1 (en) | 2014-11-26 | 2017-10-10 | Apple Inc. | Compact scanner with gimbaled optics |
| US9674415B2 (en) * | 2014-12-22 | 2017-06-06 | Google Inc. | Time-of-flight camera system with scanning illuminator |
| CN104634277B (en) * | 2015-02-12 | 2018-05-15 | 上海图漾信息科技有限公司 | Capture apparatus and method, three-dimension measuring system, depth computing method and equipment |
| US11493634B2 (en) | 2015-02-13 | 2022-11-08 | Carnegie Mellon University | Programmable light curtains |
| US10359277B2 (en) | 2015-02-13 | 2019-07-23 | Carnegie Mellon University | Imaging system with synchronized dynamic control of directable beam light source and reconfigurably masked photo-sensor |
| US11425357B2 (en) | 2015-02-13 | 2022-08-23 | Carnegie Mellon University | Method for epipolar time of flight imaging |
| US11972586B2 (en) | 2015-02-13 | 2024-04-30 | Carnegie Mellon University | Agile depth sensing using triangulation light curtains |
| US10679370B2 (en) | 2015-02-13 | 2020-06-09 | Carnegie Mellon University | Energy optimized imaging system with 360 degree field-of-view |
| US9798135B2 (en) | 2015-02-16 | 2017-10-24 | Apple Inc. | Hybrid MEMS scanning module |
| US9934574B2 (en) * | 2015-02-25 | 2018-04-03 | Facebook, Inc. | Using intensity variations in a light pattern for depth mapping of objects in a volume |
| US10036801B2 (en) | 2015-03-05 | 2018-07-31 | Big Sky Financial Corporation | Methods and apparatus for increased precision and improved range in a multiple detector LiDAR array |
| US11002531B2 (en) | 2015-04-20 | 2021-05-11 | Samsung Electronics Co., Ltd. | CMOS image sensor for RGB imaging and depth measurement with laser sheet scan |
| US11736832B2 (en) | 2015-04-20 | 2023-08-22 | Samsung Electronics Co., Ltd. | Timestamp calibration of the 3D camera with epipolar line laser point scanning |
| US10250833B2 (en) * | 2015-04-20 | 2019-04-02 | Samsung Electronics Co., Ltd. | Timestamp calibration of the 3D camera with epipolar line laser point scanning |
| KR102473740B1 (en) * | 2015-04-20 | 2022-12-05 | 삼성전자주식회사 | Concurrent rgbz sensor and system |
| US10145678B2 (en) * | 2015-04-20 | 2018-12-04 | Samsung Electronics Co., Ltd. | CMOS image sensor for depth measurement using triangulation with point scan |
| US20160309135A1 (en) | 2015-04-20 | 2016-10-20 | Ilia Ovsiannikov | Concurrent rgbz sensor and system |
| US9525863B2 (en) | 2015-04-29 | 2016-12-20 | Apple Inc. | Time-of-flight depth mapping with flexible scan pattern |
| US10012831B2 (en) | 2015-08-03 | 2018-07-03 | Apple Inc. | Optical monitoring of scan parameters |
| US10895643B2 (en) * | 2015-08-05 | 2021-01-19 | Ams Sensors Singapore Pte. Ltd. | Intelligent illumination systems using modulated light |
| US9880267B2 (en) | 2015-09-04 | 2018-01-30 | Microvision, Inc. | Hybrid data acquisition in scanned beam display |
| US10503265B2 (en) * | 2015-09-08 | 2019-12-10 | Microvision, Inc. | Mixed-mode depth detection |
| US9897801B2 (en) | 2015-09-30 | 2018-02-20 | Apple Inc. | Multi-hinge mirror assembly |
| US9703096B2 (en) | 2015-09-30 | 2017-07-11 | Apple Inc. | Asymmetric MEMS mirror assembly |
| KR102473735B1 (en) * | 2015-11-09 | 2022-12-05 | 삼성전자주식회사 | Operation method of imaging apparatus |
| US10708577B2 (en) * | 2015-12-16 | 2020-07-07 | Facebook Technologies, Llc | Range-gated depth camera assembly |
| US10324171B2 (en) | 2015-12-20 | 2019-06-18 | Apple Inc. | Light detection and ranging sensor |
| JP6631261B2 (en) * | 2016-01-14 | 2020-01-15 | セイコーエプソン株式会社 | Image recognition device, image recognition method, and image recognition unit |
| JP2019505843A (en) | 2016-01-22 | 2019-02-28 | コーニング インコーポレイテッド | Wide-view personal display device |
| KR102406327B1 (en) | 2016-02-02 | 2022-06-10 | 삼성전자주식회사 | Device and operating method thereof |
| US9866816B2 (en) | 2016-03-03 | 2018-01-09 | 4D Intellectual Properties, Llc | Methods and apparatus for an active pulsed 4D camera for image acquisition and analysis |
| CN108885260B (en) | 2016-04-08 | 2022-06-03 | 苹果公司 | Time-of-flight detector with single axis scanning |
| KR102454228B1 (en) * | 2016-04-29 | 2022-10-14 | 엘지전자 주식회사 | Multi-vision device |
| CN105959668A (en) * | 2016-04-29 | 2016-09-21 | 信利光电股份有限公司 | Shooting module with 3D scanning function and 3D scanning method thereof |
| US9648225B1 (en) * | 2016-05-10 | 2017-05-09 | Howard Preston | Method, apparatus, system and software for focusing a camera |
| US20170328990A1 (en) * | 2016-05-11 | 2017-11-16 | Texas Instruments Incorporated | Scalable field of view scanning in optical distance measurement systems |
| US11106030B2 (en) | 2016-05-11 | 2021-08-31 | Texas Instruments Incorporated | Optical distance measurement system using solid state beam steering |
| WO2017200896A2 (en) | 2016-05-18 | 2017-11-23 | James O'keeffe | A dynamically steered lidar adapted to vehicle shape |
| WO2018128655A2 (en) | 2016-09-25 | 2018-07-12 | Okeeffe James | Distributed laser range finder with fiber optics and micromirrors |
| US11340338B2 (en) | 2016-08-10 | 2022-05-24 | James Thomas O'Keeffe | Distributed lidar with fiber optics and a field of view combiner |
| US10578719B2 (en) | 2016-05-18 | 2020-03-03 | James Thomas O'Keeffe | Vehicle-integrated LIDAR system |
| WO2018031830A1 (en) | 2016-08-10 | 2018-02-15 | Okeeffe James | Laser range finding with enhanced utilization of a remotely located mirror |
| CN107726053B (en) * | 2016-08-12 | 2020-10-13 | 通用电气公司 | Probe system and detection method |
| US10145680B2 (en) * | 2016-08-12 | 2018-12-04 | Microvision, Inc. | Devices and methods for providing depth mapping with scanning laser image projection |
| US9766060B1 (en) * | 2016-08-12 | 2017-09-19 | Microvision, Inc. | Devices and methods for adjustable resolution depth mapping |
| US10298913B2 (en) | 2016-08-18 | 2019-05-21 | Apple Inc. | Standalone depth camera |
| WO2018126248A1 (en) * | 2017-01-02 | 2018-07-05 | Okeeffe James | Micromirror array for feedback-based image resolution enhancement |
| WO2018044958A1 (en) | 2016-08-29 | 2018-03-08 | Okeeffe James | Laser range finder with smart safety-conscious laser intensity |
| US10073004B2 (en) | 2016-09-19 | 2018-09-11 | Apple Inc. | DOE defect monitoring utilizing total internal reflection |
| US10488652B2 (en) * | 2016-09-21 | 2019-11-26 | Apple Inc. | Prism-based scanner |
| DK179978B1 (en) | 2016-09-23 | 2019-11-27 | Apple Inc. | IMAGE DATA FOR ENHANCED USER INTERACTIONS |
| US10200683B2 (en) | 2016-12-21 | 2019-02-05 | Microvision, Inc. | Devices and methods for providing foveated scanning laser image projection with depth mapping |
| US10158845B2 (en) | 2017-01-18 | 2018-12-18 | Facebook Technologies, Llc | Tileable structured light projection for wide field-of-view depth sensing |
| US10681331B2 (en) | 2017-02-06 | 2020-06-09 | MODit 3D, Inc. | System and method for 3D scanning |
| DE102017204668A1 (en) * | 2017-03-21 | 2018-09-27 | Robert Bosch Gmbh | An object detection apparatus and method for monitoring a light projection surface for intrusion of an object |
| US10969488B2 (en) * | 2017-03-29 | 2021-04-06 | Luminar Holdco, Llc | Dynamically scanning a field of regard using a limited number of output beams |
| JP6626036B2 (en) | 2017-04-18 | 2019-12-25 | ファナック株式会社 | Laser processing system with measurement function |
| US10210648B2 (en) | 2017-05-16 | 2019-02-19 | Apple Inc. | Emojicon puppeting |
| US10542245B2 (en) | 2017-05-24 | 2020-01-21 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
| US10613413B1 (en) | 2017-05-31 | 2020-04-07 | Facebook Technologies, Llc | Ultra-wide field-of-view scanning devices for depth sensing |
| US11163042B2 (en) | 2017-06-06 | 2021-11-02 | Microvision, Inc. | Scanned beam display with multiple detector rangefinding |
| WO2018229358A1 (en) * | 2017-06-14 | 2018-12-20 | Majo | Method and device for constructing a three-dimensional image |
| CN107121886A (en) * | 2017-06-20 | 2017-09-01 | 深圳奥比中光科技有限公司 | 3D imaging electronicses |
| US10181200B1 (en) | 2017-06-28 | 2019-01-15 | Facebook Technologies, Llc | Circularly polarized illumination and detection for depth sensing |
| EP3662406B1 (en) * | 2017-08-01 | 2023-11-22 | Apple Inc. | Determining sparse versus dense pattern illumination |
| US11445094B2 (en) | 2017-08-07 | 2022-09-13 | Apple Inc. | Electronic device having a vision system assembly held by a self-aligning bracket assembly |
| US10268234B2 (en) | 2017-08-07 | 2019-04-23 | Apple Inc. | Bracket assembly for a multi-component vision system in an electronic device |
| US10976551B2 (en) | 2017-08-30 | 2021-04-13 | Corning Incorporated | Wide field personal display device |
| US10153614B1 (en) | 2017-08-31 | 2018-12-11 | Apple Inc. | Creating arbitrary patterns on a 2-D uniform grid VCSEL array |
| US10574973B2 (en) * | 2017-09-06 | 2020-02-25 | Facebook Technologies, Llc | Non-mechanical beam steering for depth sensing |
| KR102301599B1 (en) | 2017-09-09 | 2021-09-10 | 애플 인크. | Implementation of biometric authentication |
| CN109829281A (en) | 2017-09-09 | 2019-05-31 | 苹果公司 | The realization of biometric authentication |
| WO2018226265A1 (en) | 2017-09-09 | 2018-12-13 | Apple Inc. | Implementation of biometric authentication |
| EP3482345B1 (en) | 2017-09-09 | 2021-12-08 | Apple Inc. | Implementation of biometric authentication with detection and display of an error indication |
| KR102185854B1 (en) | 2017-09-09 | 2020-12-02 | 애플 인크. | Implementation of biometric authentication |
| CN114777683A (en) | 2017-10-06 | 2022-07-22 | 先进扫描仪公司 | Generating one or more luminance edges to form a three-dimensional model of an object |
| US11415675B2 (en) | 2017-10-09 | 2022-08-16 | Luminar, Llc | Lidar system with adjustable pulse period |
| US11353559B2 (en) * | 2017-10-09 | 2022-06-07 | Luminar, Llc | Adjustable scan patterns for lidar system |
| CN111512180B (en) * | 2017-10-22 | 2024-11-26 | 魔眼公司 | Adjust the projection system of the distance sensor to optimize the beam layout |
| US10612912B1 (en) | 2017-10-31 | 2020-04-07 | Facebook Technologies, Llc | Tileable structured light projection system |
| US10460509B2 (en) * | 2017-11-07 | 2019-10-29 | Dolby Laboratories Licensing Corporation | Parameterizing 3D scenes for volumetric viewing |
| US11592530B2 (en) * | 2017-11-30 | 2023-02-28 | Cepton Technologies, Inc. | Detector designs for improved resolution in lidar systems |
| US10942244B2 (en) * | 2017-12-12 | 2021-03-09 | Waymo Llc | Systems and methods for LIDARs with adjustable resolution and failsafe operation |
| US10521926B1 (en) | 2018-03-21 | 2019-12-31 | Facebook Technologies, Llc | Tileable non-planar structured light patterns for wide field-of-view depth sensing |
| US11073603B2 (en) * | 2018-04-03 | 2021-07-27 | GM Global Technology Operations LLC | Controlled scan pattern transition in coherent lidar |
| US10694168B2 (en) * | 2018-04-22 | 2020-06-23 | Corephotonics Ltd. | System and method for mitigating or preventing eye damage from structured light IR/NIR projector systems |
| US11170085B2 (en) | 2018-06-03 | 2021-11-09 | Apple Inc. | Implementation of biometric authentication |
| US11422292B1 (en) | 2018-06-10 | 2022-08-23 | Apple Inc. | Super-blazed diffractive optical elements with sub-wavelength structures |
| WO2019243046A1 (en) * | 2018-06-18 | 2019-12-26 | Lumileds Holding B.V. | Lighting device comprising led and grating |
| CN110623763B (en) * | 2018-06-22 | 2023-03-14 | 阿莱恩技术有限公司 | Intraoral 3D scanner with multiple miniature cameras and miniature pattern projectors |
| US11896461B2 (en) | 2018-06-22 | 2024-02-13 | Align Technology, Inc. | Intraoral 3D scanner employing multiple miniature cameras and multiple miniature pattern projectors |
| CN110824599B (en) | 2018-08-14 | 2021-09-03 | 白金科技股份有限公司 | Infrared band-pass filter |
| US11100349B2 (en) | 2018-09-28 | 2021-08-24 | Apple Inc. | Audio assisted enrollment |
| US10860096B2 (en) | 2018-09-28 | 2020-12-08 | Apple Inc. | Device control using gaze information |
| DE102018124835B3 (en) * | 2018-10-09 | 2019-11-07 | Sick Ag | Optoelectronic sensor and method for detecting objects |
| US11287644B2 (en) | 2019-02-14 | 2022-03-29 | Microvision, Inc. | Alteration of resonant mode frequency response in mechanically resonant device |
| US11150088B2 (en) | 2019-05-13 | 2021-10-19 | Lumileds Llc | Depth sensing using line pattern generators |
| GB201907188D0 (en) * | 2019-05-21 | 2019-07-03 | Cambridge Mechatronics Ltd | Apparatus |
| US11754682B2 (en) | 2019-05-30 | 2023-09-12 | Microvision, Inc. | LIDAR system with spatial beam combining |
| US11828881B2 (en) | 2019-05-30 | 2023-11-28 | Microvision, Inc. | Steered LIDAR system with arrayed receiver |
| US11796643B2 (en) | 2019-05-30 | 2023-10-24 | Microvision, Inc. | Adaptive LIDAR scanning methods |
| US11320663B2 (en) | 2019-06-04 | 2022-05-03 | Omni Vision Technologies, Inc. | Diffractive optical elements made of conductive materials |
| US11480660B2 (en) | 2019-07-09 | 2022-10-25 | Microvision, Inc. | Arrayed MEMS mirrors for large aperture applications |
| US11579256B2 (en) | 2019-07-11 | 2023-02-14 | Microvision, Inc. | Variable phase scanning lidar system |
| US11604347B2 (en) | 2019-08-18 | 2023-03-14 | Apple Inc. | Force-balanced micromirror with electromagnetic actuation |
| US11812203B2 (en) * | 2019-08-28 | 2023-11-07 | Panasonic Intellectual Property Management Co., Ltd. | Projection system and projection method |
| US11681019B2 (en) | 2019-09-18 | 2023-06-20 | Apple Inc. | Optical module with stray light baffle |
| US11506762B1 (en) | 2019-09-24 | 2022-11-22 | Apple Inc. | Optical module comprising an optical waveguide with reference light path |
| TWI758672B (en) | 2019-12-19 | 2022-03-21 | 宏正自動科技股份有限公司 | Electronic device and power distribution method |
| WO2021158703A1 (en) * | 2020-02-03 | 2021-08-12 | Nanotronics Imaging, Inc. | Deep photometric learning (dpl) systems, apparatus and methods |
| WO2021163444A1 (en) | 2020-02-12 | 2021-08-19 | Imagineoptix Corporation | Optical elements for integrated ir and visible camera for depth sensing and systems incorporating the same |
| CN115176171B (en) * | 2020-02-26 | 2025-08-12 | 株式会社电装 | Optical detection device and optical axis deviation determination method in optical detection device |
| JP7322908B2 (en) * | 2020-02-26 | 2023-08-08 | 株式会社デンソー | Optical detection device and method for determining optical axis deviation in optical detection device |
| US11754767B1 (en) | 2020-03-05 | 2023-09-12 | Apple Inc. | Display with overlaid waveguide |
| US12066545B2 (en) * | 2020-03-24 | 2024-08-20 | Magic Leap, Inc. | Power-efficient hand tracking with time-of-flight sensor |
| US11317031B2 (en) * | 2020-03-30 | 2022-04-26 | Amazon Technologies, Inc. | High intensity pattern projection and general illumination using rolling shutter camera and a synchronized scanning laser |
| US11443447B2 (en) | 2020-04-17 | 2022-09-13 | Samsung Electronics Co., Ltd. | Three-dimensional camera system |
| CN111610534B (en) | 2020-05-07 | 2022-12-02 | 广州立景创新科技有限公司 | Imaging device and imaging method |
| CN116034289A (en) | 2020-05-13 | 2023-04-28 | 卢米诺有限责任公司 | LiDAR system with high-resolution scanning patterns |
| EP4185888A4 (en) * | 2020-07-21 | 2024-08-21 | Leddartech Inc. | BEAM STEERING DEVICE, PARTICULARLY FOR LIDAR SYSTEMS |
| CN115218820B (en) * | 2021-04-20 | 2025-08-08 | 上海图漾信息科技有限公司 | Structured light projection device, depth data measurement head, computing device and measurement method |
| JP7508150B2 (en) | 2020-07-22 | 2024-07-01 | 上海図漾信息科技有限公司 | Depth data measuring device and structured light projection unit |
| EP4264460B1 (en) * | 2021-01-25 | 2025-12-24 | Apple Inc. | Implementation of biometric authentication |
| WO2022159899A1 (en) | 2021-01-25 | 2022-07-28 | Apple Inc. | Implementation of biometric authentication |
| US12210603B2 (en) | 2021-03-04 | 2025-01-28 | Apple Inc. | User interface for enrolling a biometric feature |
| US20240175991A1 (en) * | 2021-03-10 | 2024-05-30 | Pioneer Corporation | Sensor device, control device, control method, program, and storage medium |
| WO2022197339A1 (en) | 2021-03-17 | 2022-09-22 | Apple Inc. | Waveguide-based transmitters with adjustable lighting |
| DE102021107194A1 (en) | 2021-03-23 | 2022-09-29 | Sick Ag | Photoelectric sensor and method for detecting objects |
| US12216754B2 (en) | 2021-05-10 | 2025-02-04 | Apple Inc. | User interfaces for authenticating to perform secure operations |
| US11942090B2 (en) | 2021-06-04 | 2024-03-26 | Apple Inc. | Accessory device based authentication for digital assistant requests |
| US12277205B2 (en) | 2021-09-20 | 2025-04-15 | Apple Inc. | User interfaces for digital identification |
| US12219267B2 (en) * | 2021-09-24 | 2025-02-04 | Apple Inc. | Adaptive-flash photography, videography, and/or flashlight using camera, scene, or user input parameters |
| US20250285458A1 (en) * | 2022-04-25 | 2025-09-11 | Virginia Tech Intellectual Properties, Inc. | Automated objects labeling in video data for machine learning and other classifiers |
| CN115379182B (en) * | 2022-08-19 | 2023-11-24 | 四川大学 | A bidirectional structured light encoding and decoding method, device, electronic equipment and storage medium |
Family Cites Families (258)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB1353470A (en) | 1970-10-19 | 1974-05-15 | Post D | Position measuring apparatus utilizing moire fringe multiplication |
| DE2951207A1 (en) | 1978-12-26 | 1980-07-10 | Canon Kk | METHOD FOR THE OPTICAL PRODUCTION OF A SPREADING PLATE |
| US4542376A (en) | 1983-11-03 | 1985-09-17 | Burroughs Corporation | System for electronically displaying portions of several different images on a CRT screen through respective prioritized viewports |
| DE3570529D1 (en) | 1984-03-05 | 1989-06-29 | Siemens Ag | Optical system for the simultaneous reception of thermal and laser radiation |
| JPH0762869B2 (en) | 1986-03-07 | 1995-07-05 | 日本電信電話株式会社 | Position and shape measurement method by pattern projection |
| US4843568A (en) | 1986-04-11 | 1989-06-27 | Krueger Myron W | Real time perception of and response to the actions of an unencumbered participant/user |
| JPH0615968B2 (en) | 1986-08-11 | 1994-03-02 | 伍良 松本 | Three-dimensional shape measuring device |
| US4850673A (en) | 1987-11-23 | 1989-07-25 | U. S. Philips Corporation | Optical scanning apparatus which detects scanning spot focus error |
| JPH01240863A (en) | 1988-03-23 | 1989-09-26 | Kowa Co | Method and apparatus for generating speckle pattern |
| US5090797A (en) | 1989-06-09 | 1992-02-25 | Lc Technologies Inc. | Method and apparatus for mirror control |
| JPH0340591A (en) | 1989-07-06 | 1991-02-21 | Katsuji Okino | Method and device for image pickup and display of stereoscopic image |
| JPH0743683Y2 (en) | 1989-07-28 | 1995-10-09 | 日本電気株式会社 | Optical fiber extra length processing structure |
| DE69128808T2 (en) | 1990-04-12 | 1998-07-23 | Matsushita Electric Ind Co Ltd | Optical head with hologram-connected objective lens |
| JP3083834B2 (en) | 1990-08-21 | 2000-09-04 | オリンパス光学工業株式会社 | Optical pickup device |
| US5075562A (en) | 1990-09-20 | 1991-12-24 | Eastman Kodak Company | Method and apparatus for absolute Moire distance measurements using a grating printed on or attached to a surface |
| GB9116151D0 (en) | 1991-07-26 | 1991-09-11 | Isis Innovation | Three-dimensional vision system |
| US5691989A (en) | 1991-07-26 | 1997-11-25 | Accuwave Corporation | Wavelength stabilized laser sources using feedback from volume holograms |
| US5483261A (en) | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
| EP0559978B1 (en) | 1992-03-12 | 1998-08-05 | International Business Machines Corporation | Image processing method |
| US5636025A (en) | 1992-04-23 | 1997-06-03 | Medar, Inc. | System for optically measuring the surface contour of a part using more fringe techniques |
| US5477383A (en) | 1993-02-05 | 1995-12-19 | Apa Optics, Inc. | Optical array method and apparatus |
| JP3353365B2 (en) | 1993-03-18 | 2002-12-03 | 静岡大学長 | Displacement and displacement velocity measuring device |
| US5406543A (en) | 1993-04-07 | 1995-04-11 | Olympus Optical Co., Ltd. | Optical head with semiconductor laser |
| JP3623250B2 (en) | 1993-06-23 | 2005-02-23 | オリンパス株式会社 | Video display device |
| US5856871A (en) | 1993-08-18 | 1999-01-05 | Applied Spectral Imaging Ltd. | Film thickness mapping using interferometric spectral imaging |
| JP3537881B2 (en) | 1994-03-29 | 2004-06-14 | 株式会社リコー | LED array head |
| US5557397A (en) | 1994-09-21 | 1996-09-17 | Airborne Remote Mapping, Inc. | Aircraft-based topographical data collection and processing system |
| US6041140A (en) | 1994-10-04 | 2000-03-21 | Synthonics, Incorporated | Apparatus for interactive image correlation for three dimensional image production |
| JPH08186845A (en) | 1994-12-27 | 1996-07-16 | Nobuaki Yanagisawa | Focal distance controlling stereoscopic-vision television receiver |
| US5630043A (en) | 1995-05-11 | 1997-05-13 | Cirrus Logic, Inc. | Animated texture map apparatus and method for 3-D image displays |
| IL114278A (en) | 1995-06-22 | 2010-06-16 | Microsoft Internat Holdings B | Camera and method |
| KR19990029064A (en) | 1995-07-18 | 1999-04-15 | 낸시 엘. 후체슨 | Moiré Interference System and Method with Extended Image Depth |
| JP2768320B2 (en) | 1995-09-04 | 1998-06-25 | 日本電気株式会社 | Tunable optical filter |
| JPH09261535A (en) | 1996-03-25 | 1997-10-03 | Sharp Corp | Imaging device |
| US5701326A (en) | 1996-04-16 | 1997-12-23 | Loral Vought Systems Corporation | Laser scanning system with optical transmit/reflect mirror having reduced received signal loss |
| US5614948A (en) | 1996-04-26 | 1997-03-25 | Intel Corporation | Camera having an adaptive gain control |
| DE19638727A1 (en) | 1996-09-12 | 1998-03-19 | Ruedger Dipl Ing Rubbert | Method for increasing the significance of the three-dimensional measurement of objects |
| JP3402138B2 (en) | 1996-09-27 | 2003-04-28 | 株式会社日立製作所 | Liquid crystal display |
| IL119341A (en) | 1996-10-02 | 1999-09-22 | Univ Ramot | Phase-only filter for generating an arbitrary illumination pattern |
| IL119831A (en) | 1996-12-15 | 2002-12-01 | Cognitens Ltd | Apparatus and method for 3d surface geometry reconstruction |
| WO1998028593A1 (en) | 1996-12-20 | 1998-07-02 | Pacific Title And Mirage, Inc. | Apparatus and method for rapid 3d image parametrization |
| US5838428A (en) | 1997-02-28 | 1998-11-17 | United States Of America As Represented By The Secretary Of The Navy | System and method for high resolution range imaging with split light source and pattern mask |
| US6002520A (en) | 1997-04-25 | 1999-12-14 | Hewlett-Packard Company | Illumination system for creating a desired irradiance profile using diffractive optical elements |
| JPH10327433A (en) | 1997-05-23 | 1998-12-08 | Minolta Co Ltd | Display device for composted image |
| US6031611A (en) | 1997-06-03 | 2000-02-29 | California Institute Of Technology | Coherent gradient sensing method and system for measuring surface curvature |
| US6525821B1 (en) * | 1997-06-11 | 2003-02-25 | Ut-Battelle, L.L.C. | Acquisition and replay systems for direct-to-digital holography and holovision |
| US6008813A (en) | 1997-08-01 | 1999-12-28 | Mitsubishi Electric Information Technology Center America, Inc. (Ita) | Real-time PC based volume rendering system |
| DE19736169A1 (en) | 1997-08-20 | 1999-04-15 | Fhu Hochschule Fuer Technik | Method to measure deformation or vibration using electronic speckle pattern interferometry |
| US6101269A (en) | 1997-12-19 | 2000-08-08 | Lifef/X Networks, Inc. | Apparatus and method for rapid 3D image parametrization |
| US6560019B2 (en) | 1998-02-05 | 2003-05-06 | Canon Kabushiki Kaisha | Diffractive optical element and optical system having the same |
| DE19815201A1 (en) | 1998-04-04 | 1999-10-07 | Link Johann & Ernst Gmbh & Co | Measuring arrangement for detecting dimensions of test specimens, preferably of hollow bodies, in particular of bores in workpieces, and methods for measuring such dimensions |
| US6750906B1 (en) | 1998-05-08 | 2004-06-15 | Cirrus Logic, Inc. | Histogram-based automatic gain control method and system for video applications |
| US6731391B1 (en) | 1998-05-13 | 2004-05-04 | The Research Foundation Of State University Of New York | Shadow moire surface measurement using Talbot effect |
| DE19821611A1 (en) | 1998-05-14 | 1999-11-18 | Syrinx Med Tech Gmbh | Recording method for spatial structure of three-dimensional surface, e.g. for person recognition |
| GB2352901A (en) | 1999-05-12 | 2001-02-07 | Tricorder Technology Plc | Rendering three dimensional representations utilising projected light patterns |
| US6912293B1 (en) | 1998-06-26 | 2005-06-28 | Carl P. Korobkin | Photogrammetry engine for model construction |
| US6377700B1 (en) | 1998-06-30 | 2002-04-23 | Intel Corporation | Method and apparatus for capturing stereoscopic images using image sensors |
| JP3678022B2 (en) | 1998-10-23 | 2005-08-03 | コニカミノルタセンシング株式会社 | 3D input device |
| US6084712A (en) | 1998-11-03 | 2000-07-04 | Dynamic Measurement And Inspection,Llc | Three dimensional imaging using a refractive optic design |
| US8965898B2 (en) | 1998-11-20 | 2015-02-24 | Intheplay, Inc. | Optimizations for live event, real-time, 3D object tracking |
| US6759646B1 (en) | 1998-11-24 | 2004-07-06 | Intel Corporation | Color interpolation for a four color mosaic pattern |
| JP2001166810A (en) | 1999-02-19 | 2001-06-22 | Sanyo Electric Co Ltd | Device and method for providing solid model |
| US6259561B1 (en) | 1999-03-26 | 2001-07-10 | The University Of Rochester | Optical system for diffusing light |
| US6636538B1 (en) | 1999-03-29 | 2003-10-21 | Cutting Edge Optronics, Inc. | Laser diode packaging |
| US6815687B1 (en) * | 1999-04-16 | 2004-11-09 | The Regents Of The University Of Michigan | Method and system for high-speed, 3D imaging of optically-invisible radiation |
| US6751344B1 (en) | 1999-05-28 | 2004-06-15 | Champion Orthotic Investments, Inc. | Enhanced projector system for machine vision |
| JP2000348367A (en) | 1999-06-04 | 2000-12-15 | Olympus Optical Co Ltd | Optical unit and optical pickup |
| US6512385B1 (en) | 1999-07-26 | 2003-01-28 | Paul Pfaff | Method for testing a device under test including the interference of two beams |
| US6268923B1 (en) | 1999-10-07 | 2001-07-31 | Integral Vision, Inc. | Optical method and system for measuring three-dimensional surface topography of an object having a surface contour |
| JP2001141430A (en) | 1999-11-16 | 2001-05-25 | Fuji Photo Film Co Ltd | Image pickup device and image processing device |
| LT4842B (en) | 1999-12-10 | 2001-09-25 | Uab "Geola" | Universal digital holographic printer and method |
| US6301059B1 (en) | 2000-01-07 | 2001-10-09 | Lucent Technologies Inc. | Astigmatic compensation for an anamorphic optical system |
| US6937348B2 (en) | 2000-01-28 | 2005-08-30 | Genex Technologies, Inc. | Method and apparatus for generating structural pattern illumination |
| US6700669B1 (en) | 2000-01-28 | 2004-03-02 | Zheng J. Geng | Method and system for three-dimensional imaging using light pattern having multiple sub-patterns |
| US20020071169A1 (en) | 2000-02-01 | 2002-06-13 | Bowers John Edward | Micro-electro-mechanical-system (MEMS) mirror device |
| JP4560869B2 (en) | 2000-02-07 | 2010-10-13 | ソニー株式会社 | Glasses-free display system and backlight system |
| JP3662162B2 (en) | 2000-03-03 | 2005-06-22 | シャープ株式会社 | Bi-directional optical communication module |
| JP4265076B2 (en) | 2000-03-31 | 2009-05-20 | 沖電気工業株式会社 | Multi-angle camera and automatic photographing device |
| US6690467B1 (en) | 2000-05-05 | 2004-02-10 | Pe Corporation | Optical system and method for optically analyzing light from a sample |
| KR100355718B1 (en) | 2000-06-10 | 2002-10-11 | 주식회사 메디슨 | System and method for 3-d ultrasound imaging using an steerable probe |
| US6810135B1 (en) | 2000-06-29 | 2004-10-26 | Trw Inc. | Optimized human presence detection through elimination of background interference |
| JP2002026452A (en) | 2000-07-12 | 2002-01-25 | Toyota Central Res & Dev Lab Inc | Surface emitting light source, method of manufacturing the same, light source for laser beam machine |
| TW527518B (en) | 2000-07-14 | 2003-04-11 | Massachusetts Inst Technology | Method and system for high resolution, ultra fast, 3-D imaging |
| US7227526B2 (en) | 2000-07-24 | 2007-06-05 | Gesturetek, Inc. | Video-based image control system |
| US6686921B1 (en) | 2000-08-01 | 2004-02-03 | International Business Machines Corporation | Method and apparatus for acquiring a set of consistent image maps to represent the color of the surface of an object |
| US6754370B1 (en) | 2000-08-14 | 2004-06-22 | The Board Of Trustees Of The Leland Stanford Junior University | Real-time structured light range scanning of moving scenes |
| US6639684B1 (en) | 2000-09-13 | 2003-10-28 | Nextengine, Inc. | Digitizer using intensity gradient to image features of three-dimensional objects |
| US6583873B1 (en) | 2000-09-25 | 2003-06-24 | The Carnegie Institution Of Washington | Optical devices having a wavelength-tunable dispersion assembly that has a volume dispersive diffraction grating |
| US6813440B1 (en) | 2000-10-10 | 2004-11-02 | The Hong Kong Polytechnic University | Body scanner |
| JP3689720B2 (en) | 2000-10-16 | 2005-08-31 | 住友大阪セメント株式会社 | 3D shape measuring device |
| ATE463004T1 (en) | 2000-11-06 | 2010-04-15 | Koninkl Philips Electronics Nv | METHOD FOR MEASURING THE MOTION OF AN INPUT DEVICE |
| JP2002152776A (en) | 2000-11-09 | 2002-05-24 | Nippon Telegr & Teleph Corp <Ntt> | Range image encoding method and apparatus, and range image decoding method and apparatus |
| JP2002191058A (en) | 2000-12-20 | 2002-07-05 | Olympus Optical Co Ltd | Three-dimensional image acquisition device and three- dimensional image acquisition method |
| JP2002213931A (en) | 2001-01-17 | 2002-07-31 | Fuji Xerox Co Ltd | Instrument and method for measuring three-dimensional shape |
| US6841780B2 (en) | 2001-01-19 | 2005-01-11 | Honeywell International Inc. | Method and apparatus for detecting objects |
| US6611000B2 (en) | 2001-03-14 | 2003-08-26 | Matsushita Electric Industrial Co., Ltd. | Lighting device |
| CN1220283C (en) | 2001-04-23 | 2005-09-21 | Matsushita Electric Works, Ltd. | Light emitting device comprising LED chip |
| JP2002365023A (en) | 2001-06-08 | 2002-12-18 | Koji Okamoto | Apparatus and method for measurement of liquid level |
| AU2002354681A1 (en) | 2001-07-13 | 2003-01-29 | Mems Optical, Inc. | Autostereoscopic display with rotated microlens-array and method of displaying multidimensional images, especially color images |
| US6741251B2 (en) | 2001-08-16 | 2004-05-25 | Hewlett-Packard Development Company, L.P. | Method and apparatus for varying focus in a scene |
| US20030090818A1 (en) | 2001-11-02 | 2003-05-15 | Wittenberger John Carl | Co-aligned receiver and transmitter for wireless link |
| AU2003217587A1 (en) | 2002-02-15 | 2003-09-09 | Canesta, Inc. | Gesture recognition system using depth perceptive sensors |
| US7369685B2 (en) | 2002-04-05 | 2008-05-06 | Identix Corporation | Vision-based operating method and system |
| US7811825B2 (en) | 2002-04-19 | 2010-10-12 | University Of Washington | System and method for processing specimens and images for optical tomography |
| US20030227614A1 (en) | 2002-06-05 | 2003-12-11 | Taminiau August A. | Laser machining apparatus with automatic focusing |
| WO2003105289A2 (en) | 2002-06-07 | 2003-12-18 | University Of North Carolina At Chapel Hill | Methods and systems for laser based real-time structured light depth extraction |
| US7006709B2 (en) | 2002-06-15 | 2006-02-28 | Microsoft Corporation | System and method for deghosting mosaics using multiperspective plane sweep |
| US20040001145A1 (en) | 2002-06-27 | 2004-01-01 | Abbate Jeffrey A. | Method and apparatus for multifield image generation and processing |
| US6924915B2 (en) | 2002-08-26 | 2005-08-02 | Canon Kabushiki Kaisha | Oscillation device, optical deflector using the oscillation device, and image display device and image forming apparatus using the optical deflector, and method of manufacturing the oscillation device |
| US6859326B2 (en) | 2002-09-20 | 2005-02-22 | Corning Incorporated | Random microlens array for optical beam shaping and homogenization |
| KR100624405B1 (en) | 2002-10-01 | 2006-09-18 | 삼성전자주식회사 | Optical component mounting board and its manufacturing method |
| US7194105B2 (en) | 2002-10-16 | 2007-03-20 | Hersch Roger D | Authentication of documents and articles by moiré patterns |
| GB2395261A (en) | 2002-11-11 | 2004-05-19 | Qinetiq Ltd | Ranging apparatus |
| TWI291040B (en) | 2002-11-21 | 2007-12-11 | Solvision Inc | Fast 3D height measurement method and system |
| US7103212B2 (en) | 2002-11-22 | 2006-09-05 | Strider Labs, Inc. | Acquisition of three-dimensional images by an active stereo technique using locally unique patterns |
| US20040174770A1 (en) | 2002-11-27 | 2004-09-09 | Rees Frank L. | Gauss-Rees parametric ultrawideband system |
| US7639419B2 (en) | 2003-02-21 | 2009-12-29 | Kla-Tencor Technologies, Inc. | Inspection system using small catadioptric objective |
| US7127101B2 (en) | 2003-03-10 | 2006-10-24 | Cranial Technologies, Inc. | Automatic selection of cranial remodeling device trim lines |
| EP1606576A4 (en) | 2003-03-24 | 2006-11-22 | D3D L P | Laser scanning system for dental applications |
| US20040213463A1 (en) | 2003-04-22 | 2004-10-28 | Morrison Rick Lee | Multiplexed, spatially encoded illumination system for determining imaging and range estimation |
| US7539340B2 (en) | 2003-04-25 | 2009-05-26 | Topcon Corporation | Apparatus and method for three-dimensional coordinate measurement |
| US7295330B2 (en) | 2003-07-11 | 2007-11-13 | Chow Peter P | Film mapping system |
| US20070057946A1 (en) | 2003-07-24 | 2007-03-15 | Dan Albeck | Method and system for the three-dimensional surface reconstruction of an object |
| CA2435935A1 (en) | 2003-07-24 | 2005-01-24 | Guylain Lemelin | Optical 3d digitizer with enlarged non-ambiguity zone |
| US6940583B2 (en) | 2003-07-28 | 2005-09-06 | International Business Machines Corporation | Method and apparatus for amplitude filtering in the frequency plane of a lithographic projection system |
| US7064876B2 (en) | 2003-07-29 | 2006-06-20 | Lexmark International, Inc. | Resonant oscillating scanning device with multiple light sources |
| US20050111705A1 (en) | 2003-08-26 | 2005-05-26 | Roman Waupotitsch | Passive stereo sensing for 3D facial shape biometrics |
| US7187437B2 (en) | 2003-09-10 | 2007-03-06 | Shearographics, Llc | Plurality of light sources for inspection apparatus and method |
| US6934018B2 (en) | 2003-09-10 | 2005-08-23 | Shearographics, Llc | Tire inspection apparatus and method |
| US7874917B2 (en) | 2003-09-15 | 2011-01-25 | Sony Computer Entertainment Inc. | Methods and systems for enabling depth and direction detection when interfacing with a computer program |
| EP1517166B1 (en) | 2003-09-15 | 2015-10-21 | Nuvotronics, LLC | Device package and methods for the fabrication and testing thereof |
| US7064810B2 (en) | 2003-09-15 | 2006-06-20 | Deere & Company | Optical range finder with directed attention |
| US7112774B2 (en) | 2003-10-09 | 2006-09-26 | Avago Technologies Sensor Ip (Singapore) Pte. Ltd | CMOS stereo imaging system and method |
| US7250949B2 (en) | 2003-12-23 | 2007-07-31 | General Electric Company | Method and system for visualizing three-dimensional data |
| US20050135555A1 (en) | 2003-12-23 | 2005-06-23 | Claus Bernhard Erich H. | Method and system for simultaneously viewing rendered volumes |
| US8134637B2 (en) | 2004-01-28 | 2012-03-13 | Microsoft Corporation | Method and system to increase X-Y resolution in a depth (Z) camera using red, blue, green (RGB) sensing |
| US7961909B2 (en) | 2006-03-08 | 2011-06-14 | Electronic Scripting Products, Inc. | Computer interface employing a manipulated object with absolute pose detection component and a display |
| US20070165243A1 (en) | 2004-02-09 | 2007-07-19 | Cheol-Gwon Kang | Device for measuring 3d shape using irregular pattern and method for the same |
| JP2005236513A (en) | 2004-02-18 | 2005-09-02 | Fujinon Corp | Imaging apparatus |
| JP4572312B2 (en) | 2004-02-23 | 2010-11-04 | スタンレー電気株式会社 | LED and manufacturing method thereof |
| US7227618B1 (en) | 2004-03-24 | 2007-06-05 | Baokang Bi | Pattern generating systems |
| US7304735B2 (en) | 2004-04-02 | 2007-12-04 | Kla-Tencor Technologies | Broadband wavelength selective filter |
| US7427981B2 (en) | 2004-04-15 | 2008-09-23 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Optical device that measures distance between the device and a surface |
| US7308112B2 (en) | 2004-05-14 | 2007-12-11 | Honda Motor Co., Ltd. | Sign based human-machine interaction |
| KR20070028533A (en) | 2004-06-16 | 2007-03-12 | Koninklijke Philips Electronics N.V. | Apparatus and method for generating a scanning beam in an optical pickup head, an optical storage system having a small optical pickup head and a small pickup head |
| CN101031837B (en) | 2004-07-23 | 2011-06-15 | 通用电气医疗集团尼亚加拉有限公司 | Method and apparatus for fluorescent confocal microscopy |
| US20060017656A1 (en) | 2004-07-26 | 2006-01-26 | Visteon Global Technologies, Inc. | Image intensity control in overland night vision systems |
| KR101238608B1 (en) | 2004-07-30 | 2013-02-28 | 익스트림 리얼리티 엘티디. | A system and method for 3D space-dimension based image processing |
| US7120228B2 (en) | 2004-09-21 | 2006-10-10 | Jordan Valley Applied Radiation Ltd. | Combined X-ray reflectometer and diffractometer |
| JP5128047B2 (en) | 2004-10-07 | 2013-01-23 | Towa株式会社 | Optical device and optical device production method |
| JP2006128818A (en) | 2004-10-26 | 2006-05-18 | Victor Co Of Japan Ltd | Recording program and reproducing program corresponding to stereoscopic video and 3d audio, recording apparatus, reproducing apparatus and recording medium |
| IL165212A (en) | 2004-11-15 | 2012-05-31 | Elbit Systems Electro Optics Elop Ltd | Device for scanning light |
| US7076024B2 (en) | 2004-12-01 | 2006-07-11 | Jordan Valley Applied Radiation, Ltd. | X-ray apparatus with dual monochromators |
| US20060156756A1 (en) | 2005-01-20 | 2006-07-20 | Becke Paul E | Phase change and insulating properties container and method of use |
| US20060221218A1 (en) | 2005-04-05 | 2006-10-05 | Doron Adler | Image sensor with improved color filter |
| EP1875162B1 (en) | 2005-04-06 | 2014-06-11 | Dimensional Photonics International, Inc. | Determining positional error of an optical component using structured light patterns |
| JP2006310417A (en) | 2005-04-27 | 2006-11-09 | Sony Corp | Photoelectric conversion device, its manufacturing method, and optical information processing device |
| US7750356B2 (en) | 2005-05-04 | 2010-07-06 | Avago Technologies Fiber Ip (Singapore) Pte. Ltd. | Silicon optical package with 45 degree turning mirror |
| US7560679B1 (en) | 2005-05-10 | 2009-07-14 | Siimpel, Inc. | 3D camera |
| US7609875B2 (en) * | 2005-05-27 | 2009-10-27 | Orametrix, Inc. | Scanner system and method for mapping surface of three-dimensional object |
| CN1725042B (en) | 2005-06-30 | 2010-11-24 | Kunming University of Science and Technology | Scanning grating writing method based on Talbot interferometer and scanning Talbot interferometer |
| US7583875B2 (en) | 2005-07-22 | 2009-09-01 | Seiko Epson Corporation | Illumination device, image display device, and projector |
| US8400494B2 (en) | 2005-10-11 | 2013-03-19 | Primesense Ltd. | Method and system for object reconstruction |
| US8050461B2 (en) | 2005-10-11 | 2011-11-01 | Primesense Ltd. | Depth-varying light fields for three dimensional sensing |
| US9330324B2 (en) | 2005-10-11 | 2016-05-03 | Apple Inc. | Error compensation in three-dimensional mapping |
| US20110096182A1 (en) | 2009-10-25 | 2011-04-28 | Prime Sense Ltd | Error Compensation in Three-Dimensional Mapping |
| US8018579B1 (en) | 2005-10-21 | 2011-09-13 | Apple Inc. | Three-dimensional imaging and display system |
| US8792978B2 (en) | 2010-05-28 | 2014-07-29 | Lockheed Martin Corporation | Laser-based nerve stimulators for, e.g., hearing restoration in cochlear prostheses and method |
| US20070133840A1 (en) | 2005-11-04 | 2007-06-14 | Clean Earth Technologies, Llc | Tracking Using An Elastic Cluster of Trackers |
| US7856125B2 (en) | 2006-01-31 | 2010-12-21 | University Of Southern California | 3D face reconstruction from 2D images |
| US7433024B2 (en) | 2006-02-27 | 2008-10-07 | Prime Sense Ltd. | Range mapping using speckle decorrelation |
| JP4692329B2 (en) | 2006-02-28 | 2011-06-01 | 日本ビクター株式会社 | Optical wireless communication device |
| CN101496033B (en) | 2006-03-14 | 2012-03-21 | 普莱姆森斯有限公司 | Depth-varying light fields for three dimensional sensing |
| EP1994503B1 (en) | 2006-03-14 | 2017-07-05 | Apple Inc. | Depth-varying light fields for three dimensional sensing |
| US7423821B2 (en) | 2006-03-24 | 2008-09-09 | Gentex Corporation | Vision system |
| US7869649B2 (en) | 2006-05-08 | 2011-01-11 | Panasonic Corporation | Image processing device, image processing method, program, storage medium and integrated circuit |
| US8488895B2 (en) | 2006-05-31 | 2013-07-16 | Indiana University Research And Technology Corp. | Laser scanning digital camera with pupil periphery illumination and potential for multiply scattered light imaging |
| US8139142B2 (en) | 2006-06-01 | 2012-03-20 | Microsoft Corporation | Video manipulation of red, green, blue, distance (RGB-Z) data including segmentation, up-sampling, and background substitution techniques |
| US7352499B2 (en) | 2006-06-06 | 2008-04-01 | Symbol Technologies, Inc. | Arrangement for and method of projecting an image with pixel mapping |
| WO2008014826A1 (en) | 2006-08-03 | 2008-02-07 | Alterface S.A. | Method and device for identifying and extracting images of multiple users, and for recognizing user gestures |
| US7737394B2 (en) | 2006-08-31 | 2010-06-15 | Micron Technology, Inc. | Ambient infrared detection in solid state sensors |
| KR20090052889A (en) | 2006-09-04 | 2009-05-26 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Method for determining depth map from images and device for determining depth map |
| US7256899B1 (en) | 2006-10-04 | 2007-08-14 | Ivan Faul | Wireless methods and systems for three-dimensional non-contact shape sensing |
| US8542421B2 (en) * | 2006-11-17 | 2013-09-24 | Celloptic, Inc. | System, apparatus and method for extracting three-dimensional information of an object from received electromagnetic radiation |
| US8090194B2 (en) | 2006-11-21 | 2012-01-03 | Mantis Vision Ltd. | 3D geometric modeling and motion capture using both single and dual imaging |
| US7990545B2 (en) | 2006-12-27 | 2011-08-02 | Cambridge Research & Instrumentation, Inc. | Surface measurement of in-vivo subjects using spot projector |
| US7840031B2 (en) | 2007-01-12 | 2010-11-23 | International Business Machines Corporation | Tracking a range of body movement based on 3D captured image streams of a user |
| US8350847B2 (en) | 2007-01-21 | 2013-01-08 | Primesense Ltd | Depth mapping using multi-beam illumination |
| US7894078B2 (en) | 2007-04-23 | 2011-02-22 | California Institute Of Technology | Single-lens 3-D imaging device using a polarization-coded aperture mask combined with a polarization-sensitive sensor |
| US20080212835A1 (en) | 2007-03-01 | 2008-09-04 | Amon Tavor | Object Tracking by 3-Dimensional Modeling |
| JP4232835B2 (en) | 2007-03-07 | 2009-03-04 | セイコーエプソン株式会社 | Actuator, optical scanner and image forming apparatus |
| US8150142B2 (en) | 2007-04-02 | 2012-04-03 | Prime Sense Ltd. | Depth mapping using projected patterns |
| US8493496B2 (en) | 2007-04-02 | 2013-07-23 | Primesense Ltd. | Depth mapping using projected patterns |
| US8488868B2 (en) | 2007-04-03 | 2013-07-16 | Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Industry, Through The Communications Research Centre Canada | Generation of a depth map from a monoscopic color image for rendering stereoscopic still and video images |
| US7835561B2 (en) | 2007-05-18 | 2010-11-16 | Visiongate, Inc. | Method for image processing and reconstruction of images for optical tomography |
| US8494252B2 (en) | 2007-06-19 | 2013-07-23 | Primesense Ltd. | Depth mapping using optical elements having non-uniform focal characteristics |
| CN101785025B (en) | 2007-07-12 | 2013-10-30 | 汤姆森特许公司 | System and method for three-dimensional object reconstruction from two-dimensional images |
| JP4412362B2 (en) | 2007-07-18 | 2010-02-10 | 船井電機株式会社 | Compound eye imaging device |
| CN101371786B (en) * | 2007-08-24 | 2011-01-12 | 北京师范大学珠海分校 | A method and system for three-dimensional reconstruction of X-ray images |
| US20090060307A1 (en) | 2007-08-27 | 2009-03-05 | Siemens Medical Solutions Usa, Inc. | Tensor Voting System and Method |
| DE102007045332B4 (en) | 2007-09-17 | 2019-01-17 | Seereal Technologies S.A. | Holographic display for reconstructing a scene |
| KR101439434B1 (en) | 2007-10-05 | 2014-09-12 | 삼성전자주식회사 | Image sensor and manufacturing method thereof |
| KR100858034B1 (en) | 2007-10-18 | 2008-09-10 | (주)실리콘화일 | Single chip vitality image sensor |
| JP5012463B2 (en) | 2007-12-03 | 2012-08-29 | セイコーエプソン株式会社 | Scanning image display system and scanning image display device |
| US8166421B2 (en) | 2008-01-14 | 2012-04-24 | Primesense Ltd. | Three-dimensional user interface |
| US8176497B2 (en) | 2008-01-16 | 2012-05-08 | Dell Products, Lp | Method to dynamically provision additional computer resources to handle peak database workloads |
| JP5588353B2 (en) | 2008-01-21 | 2014-09-10 | プライムセンス リミテッド | Optical design for zero order reduction |
| US8384997B2 (en) | 2008-01-21 | 2013-02-26 | Primesense Ltd | Optical pattern projection |
| WO2009095862A1 (en) * | 2008-02-01 | 2009-08-06 | Koninklijke Philips Electronics N.V. | Autostereoscopic display device |
| US8331006B2 (en) | 2008-02-13 | 2012-12-11 | Nokia Corporation | Display device and a method for illuminating a light modulator array of a display device |
| KR20090091610A (en) | 2008-02-25 | 2009-08-28 | 삼성전자주식회사 | MEMS Mirror and Scanning Actuator |
| DE102008011350A1 (en) | 2008-02-27 | 2009-09-03 | Loeffler Technology Gmbh | Apparatus and method for real-time detection of electromagnetic THz radiation |
| US8121351B2 (en) | 2008-03-09 | 2012-02-21 | Microsoft International Holdings B.V. | Identification of objects in a 3D video using non/over reflective clothing |
| US8035806B2 (en) | 2008-05-13 | 2011-10-11 | Samsung Electronics Co., Ltd. | Distance measuring sensor including double transfer gate and three dimensional color image sensor including the distance measuring sensor |
| US8456517B2 (en) | 2008-07-09 | 2013-06-04 | Primesense Ltd. | Integrated processor for 3D mapping |
| EP2362936B1 (en) * | 2008-10-28 | 2012-10-17 | 3Shape A/S | Scanner with feedback control |
| KR20100063996A (en) | 2008-12-04 | 2010-06-14 | 삼성전자주식회사 | Scanner and image forming apparatus employing the same |
| US8462207B2 (en) | 2009-02-12 | 2013-06-11 | Primesense Ltd. | Depth ranging with Moiré patterns |
| US7949024B2 (en) | 2009-02-17 | 2011-05-24 | Trilumina Corporation | Multibeam arrays of optoelectronic devices for high frequency operation |
| EP2226652B1 (en) | 2009-03-02 | 2013-11-20 | Sick Ag | Optoelectronic sensor with alignment light transmitter |
| US8786682B2 (en) | 2009-03-05 | 2014-07-22 | Primesense Ltd. | Reference image techniques for three-dimensional sensing |
| US8717417B2 (en) | 2009-04-16 | 2014-05-06 | Primesense Ltd. | Three-dimensional mapping and imaging |
| US8503720B2 (en) | 2009-05-01 | 2013-08-06 | Microsoft Corporation | Human body pose estimation |
| US8744121B2 (en) | 2009-05-29 | 2014-06-03 | Microsoft Corporation | Device for identifying and tracking multiple humans over time |
| EP2275990B1 (en) | 2009-07-06 | 2012-09-26 | Sick Ag | 3D sensor |
| WO2011013079A1 (en) | 2009-07-30 | 2011-02-03 | Primesense Ltd. | Depth mapping based on pattern matching and stereoscopic information |
| US8773514B2 (en) | 2009-08-27 | 2014-07-08 | California Institute Of Technology | Accurate 3D object reconstruction using a handheld device with a projected light pattern |
| CN102667495A (en) | 2009-09-28 | 2012-09-12 | 喷特路姆科技有限公司 | Method, apparatus and system for remote wind sensing |
| US8305502B2 (en) | 2009-11-11 | 2012-11-06 | Eastman Kodak Company | Phase-compensated thin-film beam combiner |
| JP5588310B2 (en) | 2009-11-15 | 2014-09-10 | プライムセンス リミテッド | Optical projector with beam monitor |
| JP5452197B2 (en) | 2009-12-03 | 2014-03-26 | パナソニック株式会社 | MEMS optical scanner |
| US8830227B2 (en) | 2009-12-06 | 2014-09-09 | Primesense Ltd. | Depth-based gain control |
| US8320621B2 (en) | 2009-12-21 | 2012-11-27 | Microsoft Corporation | Depth projector system with integrated VCSEL array |
| DE102010005993B4 (en) | 2010-01-27 | 2016-10-20 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Laser scanner device and method for three-dimensional non-contact environmental detection with a laser scanner device |
| US20110187878A1 (en) | 2010-02-02 | 2011-08-04 | Primesense Ltd. | Synchronization of projected illumination with rolling shutter of image sensor |
| US20110188054A1 (en) | 2010-02-02 | 2011-08-04 | Primesense Ltd | Integrated photonics module for optical projection |
| US8786757B2 (en) | 2010-02-23 | 2014-07-22 | Primesense Ltd. | Wideband ambient light rejection |
| US8982182B2 (en) | 2010-03-01 | 2015-03-17 | Apple Inc. | Non-uniform spatial resource allocation for depth mapping |
| US8279418B2 (en) | 2010-03-17 | 2012-10-02 | Microsoft Corporation | Raster scanning for depth detection |
| US8330804B2 (en) | 2010-05-12 | 2012-12-11 | Microsoft Corporation | Scanned-beam depth mapping to 2D image |
| US8654152B2 (en) | 2010-06-21 | 2014-02-18 | Microsoft Corporation | Compartmentalizing focus area within field of view |
| CN103053167B (en) * | 2010-08-11 | 2016-01-20 | Apple Inc. | Scanning projector and image capture module for 3D mapping |
| US9036158B2 (en) | 2010-08-11 | 2015-05-19 | Apple Inc. | Pattern projector |
| WO2012027410A1 (en) | 2010-08-23 | 2012-03-01 | Lighttime, Llc | Ladar using mems scanning |
| EP2643659B1 (en) | 2010-11-19 | 2019-12-25 | Apple Inc. | Depth mapping using time-coded illumination |
| US9280718B2 (en) | 2010-11-24 | 2016-03-08 | Nocimed, Llc | Systems and methods for automated voxelation of regions of interest for magnetic resonance spectroscopy |
| US9131136B2 (en) | 2010-12-06 | 2015-09-08 | Apple Inc. | Lens arrays for pattern projection and imaging |
| US9030528B2 (en) | 2011-04-04 | 2015-05-12 | Apple Inc. | Multi-zone imaging sensor and lens array |
| US8908277B2 (en) | 2011-08-09 | 2014-12-09 | Apple Inc | Lens array projector |
| US8749796B2 (en) | 2011-08-09 | 2014-06-10 | Primesense Ltd. | Projectors of structured light |
| US9684075B2 (en) | 2011-10-27 | 2017-06-20 | Microvision, Inc. | Scanning laser time of flight 3D imaging |
| US9157790B2 (en) | 2012-02-15 | 2015-10-13 | Apple Inc. | Integrated optoelectronic modules with transmitter, receiver and beam-combining optics for aligning a beam axis with a collection axis |
| US9396382B2 (en) | 2012-08-17 | 2016-07-19 | Flashscan3D, Llc | System and method for a biometric image sensor with spoofing detection |
| US20140063189A1 (en) | 2012-08-28 | 2014-03-06 | Digital Signal Corporation | System and Method for Refining Coordinate-Based Three-Dimensional Images Obtained from a Three-Dimensional Measurement System |
| US8948482B2 (en) | 2012-11-01 | 2015-02-03 | Align Technology, Inc. | Motion compensation in a three dimensional scan |
| KR20150057011A (en) | 2013-11-18 | 2015-05-28 | Samsung Electronics Co., Ltd. | A camera integrated with a light source |
| US20160125638A1 (en) | 2014-11-04 | 2016-05-05 | Dassault Systemes | Automated Texturing Mapping and Animation from Images |
| US10107914B2 (en) | 2015-02-20 | 2018-10-23 | Apple Inc. | Actuated optical element for light beam scanning device |
- 2011
  - 2011-08-10 CN CN201180037859.4A patent/CN103053167B/en active Active
  - 2011-08-10 US US13/810,451 patent/US9098931B2/en active Active
  - 2011-08-10 WO PCT/IB2011/053560 patent/WO2012020380A1/en not_active Ceased
  - 2011-08-11 TW TW100128723A patent/TWI513273B/en active
- 2015
  - 2015-06-25 US US14/749,654 patent/US9677878B2/en active Active
- 2017
  - 2017-05-08 US US15/588,719 patent/US10218963B2/en active Active
- 2019
  - 2019-01-09 US US16/243,106 patent/US10721459B2/en active Active
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| TWI549476B (en) * | 2013-12-20 | 2016-09-11 | AU Optronics Corp. | Display system and method for adjusting visible range |
Also Published As
| Publication number | Publication date |
|---|---|
| US9677878B2 (en) | 2017-06-13 |
| US10721459B2 (en) | 2020-07-21 |
| CN103053167B (en) | 2016-01-20 |
| TWI513273B (en) | 2015-12-11 |
| US10218963B2 (en) | 2019-02-26 |
| US9098931B2 (en) | 2015-08-04 |
| CN103053167A (en) | 2013-04-17 |
| US20150292874A1 (en) | 2015-10-15 |
| US20170244955A1 (en) | 2017-08-24 |
| WO2012020380A1 (en) | 2012-02-16 |
| US20130127854A1 (en) | 2013-05-23 |
| US20190149805A1 (en) | 2019-05-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| TWI513273B (en) | | Scanning projectors and image capture modules for 3D mapping |
| US11733524B1 (en) | | Depth camera assembly based on near infra-red illuminator |
| US10395111B2 (en) | | Gaze-tracking system and method |
| KR102594058B1 (en) | | Method and system for tracking eye movements with optical scanning projector |
| CN105659106B (en) | | 3D depth mapping using dynamic structured light |
| JP7150966B2 (en) | | Tracking the optical flow of backscattered laser speckle patterns |
| TW201712371A (en) | | Projection device for data eyeglasses, data eyeglasses, and process to operate a projection device for data eyeglasses, enabling a user to have a projected image on their data eyeglasses |
| US10726257B2 (en) | | Gaze-tracking system and method of tracking user's gaze |
| JP3450801B2 (en) | | Pupil position detecting device and method, viewpoint position detecting device and method, and stereoscopic image display system |
| JP6430813B2 (en) | | Position detection apparatus, position detection method, gazing point detection apparatus, and image generation apparatus |
| JP2014224995A (en) | | Scanning image display device and exit pupil enlarging method |
| Xia et al. | | Towards an Expanded Eyebox for a Wide-Field-of-View Augmented Reality Near-eye Pinlight Display with 3D Pupil Localization |
| KR20210070799A (en) | | Apparatus for display |
| JP2007017180A (en) | | Marker recognition method and apparatus in optical motion capture |