
WO2009066998A3 - Apparatus and method for multiple-touch spatial sensors - Google Patents


Info

Publication number
WO2009066998A3
Authority
WO
WIPO (PCT)
Prior art keywords
spatial
image data
touch
dimensional
cameras
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/MY2008/000164
Other languages
French (fr)
Other versions
WO2009066998A2 (en)
Inventor
Hock Woon Hon
Shern Shiou Tan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mimos Bhd
Original Assignee
Mimos Bhd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mimos Bhd filed Critical Mimos Bhd
Publication of WO2009066998A2
Publication of WO2009066998A3
Anticipated expiration
Ceased (current legal status)

Links

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to an apparatus and a method for multiple-touch three-dimensional contactless control for spatial sensing. The apparatus comprises two cameras (101, 102) having spatial sensors to capture object positions in the form of an image, a register for registering the spatial directions as sensed by the spatial sensors, a data processing unit, and a computer for computing object point derivation and blob analysis. The method of multiple-touch three-dimensional contactless control for spatial sensing comprises the steps of: positioning the first and second cameras (101, 102) and capturing image data of an object (107) with the cameras (101, 102); passing the captured image data of the object (107) through background and foreground segmentation using an image processing function; determining the spatial position of each image of the object (107) and deriving the three-dimensional spatial position of the point; and processing the captured image data through blob analysis.
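The processing chain the abstract describes — background/foreground segmentation, blob analysis, and derivation of a three-dimensional point from two camera views — can be sketched as follows. This is an illustrative reconstruction, not the patent's actual implementation: the function names, the fixed difference threshold, and the rectified-stereo disparity model (Z = f·B/d) are assumptions introduced here.

```python
# Illustrative sketch only: function names, the fixed threshold, and the
# rectified-stereo disparity model are assumptions, not the patent's design.

def segment_foreground(frame, background, threshold=30):
    """Background/foreground segmentation by per-pixel differencing:
    a pixel is foreground (1) if it differs from the stored background
    image by more than `threshold`, else background (0)."""
    return [[1 if abs(p - b) > threshold else 0
             for p, b in zip(f_row, b_row)]
            for f_row, b_row in zip(frame, background)]

def find_blobs(mask):
    """Blob analysis via 4-connected component labelling on the binary
    mask; returns one (x, y) centroid per connected foreground region."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:  # flood-fill one blob
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                centroids.append((sum(p[1] for p in pixels) / len(pixels),
                                  sum(p[0] for p in pixels) / len(pixels)))
    return centroids

def depth_from_disparity(x_left, x_right, focal_px, baseline):
    """Depth of a blob centroid matched across a rectified stereo pair:
    Z = f * B / (x_left - x_right)."""
    return focal_px * baseline / (x_left - x_right)
```

Given a matched centroid in both views, the remaining coordinates follow from the pinhole model (X = Z·(x − c_x)/f, Y = Z·(y − c_y)/f), which is how a full three-dimensional touch position could be recovered from the two cameras.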
PCT/MY2008/000164 2007-11-23 2008-11-24 Apparatus and method for multiple-touch spatial sensors Ceased WO2009066998A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MYPI20072085A MY147059A (en) 2007-11-23 2007-11-23 Apparatus and method for multiple-touch spatial sensors
MYPI20072085 2007-11-23

Publications (2)

Publication Number Publication Date
WO2009066998A2 (en) 2009-05-28
WO2009066998A3 (en) 2009-10-15

Family

ID=40668031

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/MY2008/000164 Ceased WO2009066998A2 (en) 2007-11-23 2008-11-24 Apparatus and method for multiple-touch spatial sensors

Country Status (2)

Country Link
MY (1) MY147059A (en)
WO (1) WO2009066998A2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2947348B1 (en) * 2009-06-25 2011-08-19 Immersion DEVICE FOR HANDLING AND VISUALIZING A VIRTUAL OBJECT
CN107945172A (en) * 2017-12-08 2018-04-20 博众精工科技股份有限公司 A kind of character detection method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0913790A1 (en) * 1997-10-29 1999-05-06 Takenaka Corporation Hand pointing apparatus
US20040108990A1 (en) * 2001-01-08 2004-06-10 Klony Lieberman Data input device
KR20070061153A (en) * 2005-12-08 2007-06-13 한국전자통신연구원 Hand Tracking 3D Input Device and Method Using Multiple Cameras
US7277599B2 (en) * 2002-09-23 2007-10-02 Regents Of The University Of Minnesota System and method for three-dimensional video imaging using a single camera


Also Published As

Publication number Publication date
WO2009066998A2 (en) 2009-05-28
MY147059A (en) 2012-10-15

Similar Documents

Publication Publication Date Title
TWI708216B (en) Method and system for calibrating vision system in environment
WO2007025300A3 (en) Capturing and processing facial motion data
CN104626169B (en) Robot part grabbing method based on vision and mechanical comprehensive positioning
WO2008123466A1 (en) Image processing device, control program, computer-readable recording medium, electronic device, and image processing device control method
GB201119501D0 (en) An apparatus, method and system
JP2010539557A5 (en)
WO2017077925A1 (en) Method and system for estimating three-dimensional pose of sensor
WO2008090608A1 (en) Image reading device, image reading program, and image reading method
EP3114647A2 (en) Method and system for 3d capture based on structure from motion with simplified pose detection
JP2008116373A5 (en)
WO2009053848A3 (en) Methods and processes for detecting a mark on a playing surface and for tracking an object
CN101799717A (en) Man-machine interaction method based on hand action catch
CN111488775B (en) Device and method for judging degree of visibility
WO2017215351A1 (en) Method and apparatus for adjusting recognition range of photographing apparatus
JP2013186042A (en) Distance calculating device and distance calculating method
WO2009112761A3 (en) System for measuring clearances and degree of flushness and corresponding method
CN101782370A (en) Measurement positioning method based on universal serial bus (USB) camera and method for measuring movement locus of moving object
WO2008123462A1 (en) Image processing device, control program, computer-readable recording medium, electronic device, and image processing device control method
CN104700385A (en) Binocular vision positioning device based on FPGA
WO2009125132A3 (en) Method for determining a three-dimensional representation of an object using a sequence of cross-section images, computer program product, and corresponding method for analyzing an object and imaging system
TW201741938A (en) Motion sensing method and device
CN104376323B (en) A kind of method and device for determining target range
JP2008309595A (en) Object recognizing device and program used for it
WO2009066998A3 (en) Apparatus and method for multiple-touch spatial sensors
CN104113684B (en) Control method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08852074

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08852074

Country of ref document: EP

Kind code of ref document: A2