
WO2009072736A1 - User adaptive gesture recognition method and system - Google Patents

User adaptive gesture recognition method and system

Info

Publication number
WO2009072736A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
user
information
interface
user gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2008/005100
Other languages
English (en)
Inventor
Jong Hong Jeon
Seung Yun Lee
Sung Han Kim
Kang Chan Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electronics and Telecommunications Research Institute ETRI
Original Assignee
Electronics and Telecommunications Research Institute ETRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020080022182A external-priority patent/KR100912511B1/ko
Application filed by Electronics and Telecommunications Research Institute ETRI filed Critical Electronics and Telecommunications Research Institute ETRI
Priority to US12/745,800 priority Critical patent/US20100275166A1/en
Publication of WO2009072736A1 publication Critical patent/WO2009072736A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F1/1615 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F1/1616 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694 Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163 Indexing scheme relating to constructional details of the computer
    • G06F2200/1637 Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72445 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting Internet browser applications
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H04M2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • The present invention relates to a user adaptive gesture recognition method and a user adaptive gesture recognition system.
  • Mobile digital apparatuses include a cellular phone, a PDA (personal digital assistant), a PMP (portable multimedia player), an MP3P (moving picture experts group audio layer-3 player), a digital camera, and the like.
  • Such mobile apparatuses provide a user interface by means of a button having a directional key function or a keypad.
  • Recently, the touch screen has come into wide use, and thus the interface is provided in various ways.
  • Such a mobile apparatus packs a display device for information display and an input unit for input operations into a compact terminal. Accordingly, unlike on a personal computer, it is difficult to use a user interface such as a mouse on the mobile apparatus. This inconveniences the user in environments where the movements among screens are complex, for example a mobile browsing environment.
  • An exemplary embodiment of the present invention provides a user adaptive gesture recognition system that recognizes a user gesture based on information collected by a terminal equipped with a sensor.
  • The system includes: a sensing information processing unit that extracts a coordinate value from sensing information collected by the sensor; a user adaptive gesture processing unit that extracts position conversion information from the extracted coordinate value to recognize a user gesture, and outputs association information for driving one of a browser function and application program functions in association with the user gesture, or stores the user gesture; and an association unit that associates an interface with the user gesture based on the output association information.
  • Another embodiment of the present invention provides a user adaptive gesture recognition method that recognizes a user gesture based on information collected by a terminal equipped with a sensor.
  • The method includes: extracting a coordinate value from sensing information collected by the sensor; extracting position conversion information from the extracted coordinate value, and recognizing a user gesture based on the extracted position conversion information; determining whether or not interface information corresponding to the recognized user gesture is stored; and, if it is determined that the interface information corresponding to the user gesture is stored, generating interface information for associating the corresponding interface with the gesture and associating the interface with the gesture. (A rough sketch of these steps follows.)
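As a reading aid only (not part of the specification), the claimed pipeline of coordinate extraction, position conversion, gesture recognition, and interface association could be sketched as follows; every class, method, and data-layout name here is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class GestureRecognitionSystem:
    # gesture label -> interface/program driving information (hypothetical store)
    associations: dict = field(default_factory=dict)

    def extract_coordinates(self, sensing_info):
        # sensing information processing: raw sensor samples -> (x, y, z) values
        return [(s["x"], s["y"], s["z"]) for s in sensing_info]

    def position_conversion(self, coords):
        # deltas between consecutive samples stand in for position conversion info
        return [tuple(b - a for a, b in zip(p, q))
                for p, q in zip(coords, coords[1:])]

    def recognize(self, deltas):
        # stand-in classifier: sign of the dominant axis of total displacement
        totals = [sum(d[i] for d in deltas) for i in range(3)]
        axis = max(range(3), key=lambda i: abs(totals[i]))
        return ("+" if totals[axis] >= 0 else "-") + "xyz"[axis]

    def process(self, sensing_info):
        deltas = self.position_conversion(self.extract_coordinates(sensing_info))
        gesture = self.recognize(deltas)
        if gesture in self.associations:
            return self.associations[gesture]  # drive browser/application function
        self.associations[gesture] = None      # store the gesture for later definition
        return None
```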
  • Yet another embodiment of the present invention provides a user adaptive gesture recognition method that recognizes a user gesture based on information collected by a terminal equipped with a sensor.
  • The method includes: determining whether or not a gesture registration request is input; when the gesture registration request is input, extracting a coordinate value from sensing information collected by the sensor; extracting position conversion information from the extracted coordinate value, and recognizing a user gesture based on the extracted position conversion information; determining whether or not standard gesture information corresponding to the recognized user gesture is stored; and, if it is determined that the standard gesture information is not stored, defining and storing a command of the user gesture and interface information corresponding to the user gesture.
  • According to the embodiments, the user gesture can be recognized and processed by using the acceleration sensor in the mobile apparatus.
  • Moreover, the user adaptive gesture can be stored in the mobile apparatus by using the acceleration sensor, and thus a mobile application can be utilized with a simple gesture.
  • The present invention can be applied to various mobile apparatuses, thereby improving the user interface of the mobile apparatus.
  • FIG. 1 is a diagram illustrating the principle of general acceleration sensors.
  • FIG. 2 is a diagram illustrating the detection principle of a general acceleration sensor.
  • FIG. 3 is a diagram illustrating a keypad-type terminal equipped with an acceleration sensor according to an exemplary embodiment of the present invention.
  • FIG. 4 is a diagram illustrating a touch screen-type terminal equipped with an acceleration sensor according to an exemplary embodiment of the present invention.
  • FIG. 5 is a diagram illustrating the structure of a user adaptive gesture recognition system according to an exemplary embodiment of the present invention.
  • FIG. 6 is a diagram illustrating the detailed structure of a user adaptive gesture processing unit according to an exemplary embodiment of the present invention.
  • FIGS. 7 and 8 are diagrams illustrating user gestures according to an exemplary embodiment of the present invention.
  • FIG. 9 is a diagram illustrating user gesture patterns according to an exemplary embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a successive user gesture recognition processing according to an exemplary embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating user gesture registration processing according to an exemplary embodiment of the present invention.

Mode for the Invention
  • FIG. 1 is a diagram illustrating the principle of general acceleration sensors.
  • The acceleration sensor is commonly used in an automobile airbag, where it instantaneously detects the impact when the automobile crashes.
  • An acceleration sensor is an element that detects the change in velocity per unit time.
  • In the past, mechanical-type sensors were used, but at present semiconductor-type sensors are widely used.
  • The semiconductor-type sensor can be made small and performs accurate detection.
  • For example, a semiconductor-type sensor installed in a mobile terminal measures the inclination and thereby corrects the screen display.
  • The semiconductor-type sensor is also used in a pedometer to detect shaking during movement.
  • The mechanical-type acceleration sensor primarily includes a proof mass 10, a spring 20, and a damper 30.
  • The acceleration is calculated from the change in position of the proof mass by Math Figure 1: since the spring force balances the inertial force (kx = ma), the acceleration is a = (k/m)x, where k is the spring constant, m is the proof mass, and x is the displacement.
  • Because the mechanical-type acceleration sensor covers only a small acceleration range, it is not suitable for a small and thin portable electronic apparatus. Accordingly, a semiconductor-type acceleration sensor having a proof mass, shown in (b) of FIG. 1, is attracting attention.
  • The acceleration sensor in practical use, shown in (b) of FIG. 1, outputs the magnitude of the acceleration applied to the object, and is classified according to the number of detection axes.
  • That is, acceleration sensors include a one-axis acceleration sensor, a two-axis acceleration sensor, and a three-axis acceleration sensor.
  • The three-axis acceleration sensor, which has a detection range in three directions, can measure the acceleration in a three-dimensional space along the x, y, and z axes.
  • In a mobile terminal, the three-axis acceleration sensor is used to detect the inclination of the terminal.
  • Acceleration sensors are also used in automobile airbags, in controlling the walking posture of a robot, and in detecting shocks in an elevator.
  • FIG. 2 is a diagram illustrating the detection principle of a general acceleration sensor.
  • The acceleration of gravity measured at each inclination may be as shown in FIG. 2.
  • For example, when the measured acceleration along an axis is 0.5 G, the gradient is 30° (since sin 30° = 0.5).
  • If the acceleration in the x-axis direction is 0 G and the acceleration in the y-axis direction is 1 G, the sensor stands vertical along the y-axis direction. Meanwhile, if the acceleration in the x-axis direction is 1 G and the acceleration in the y-axis direction is 0 G, the sensor lies along the x-axis direction.
  • If the acceleration sensor is inclined at 45° in the x-axis direction, the measured acceleration is calculated as 1 G × sin 45°, that is, about 0.707 G. In this way, the inclination of the sensor with respect to the ground can be detected, as in the short example below.
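For illustration only (this computation is implied by FIG. 2 rather than spelled out in the text), the inclination of one axis can be recovered from its static reading in units of G:

```python
import math

def tilt_angle_deg(a_axis_g):
    """Inclination of one sensor axis versus the ground from its static
    acceleration reading in units of G: 0 G -> 0 deg, 1 G -> 90 deg."""
    a = max(-1.0, min(1.0, a_axis_g))  # clamp so noise cannot exceed |1 G|
    return math.degrees(math.asin(a))

print(tilt_angle_deg(0.5))    # ~30.0 degrees
print(tilt_angle_deg(0.707))  # ~45.0 degrees
print(tilt_angle_deg(1.0))    # 90.0 degrees
```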
  • The detection sensitivity [V/g] of the acceleration sensor represents the change in output voltage per unit of acceleration.
  • For mobile use, an acceleration sensor needs to be small and thin, and should have excellent detection sensitivity and impact resistance.
  • Acceleration sensors may be divided into a piezo-resistive type, a capacitive type, a heat-distribution detection type, and a magnetic type according to the acceleration detection method.
  • Among these, the piezo-resistive type and the capacitive type are attracting attention.
  • FIG. 3 is a diagram illustrating a keypad-type terminal equipped with an acceleration sensor according to an exemplary embodiment of the present invention.
  • As shown in FIG. 3, a terminal 100 includes a keypad or buttons.
  • A user gesture is recognized by an acceleration sensor installed in the terminal. That is, when the terminal 100 executes mobile browsing or a mobile application, the contents of the mobile application or of mobile browsing are displayed on a display unit 110 of the terminal 100.
  • User gesture recognition by the acceleration sensor installed in the terminal 100 may proceed as follows.
  • First, user gesture recognition may be based on single recognition.
  • When the user wants to input a gesture to the terminal, he/she performs the gesture while pressing a button assigned the recognition request function, and then releases the button.
  • In this way, a gesture is input.
  • The gesture input may be achieved through the buttons 120 to 123 according to the characteristics of the terminal.
  • Alternatively, the gesture input may be achieved by a function unique to each button.
  • Then, an interface or a program corresponding to the specific gesture is executed.
  • Second, user gesture recognition may be based on successive recognition.
  • The user presses one of the buttons 120 to 123 assigned the successive recognition request function to start the successive gesture recognition function, such that user gestures are successively recognized.
  • Third, the user may register a gesture in advance and then use it.
  • The user inputs a user gesture to be registered while pressing one of the buttons 120 to 123 assigned the user gesture registration request function, and then releases the button. In this way, the user gesture to be registered is input. Subsequently, user gesture registration is performed.
  • FIG. 4 is a diagram illustrating a touch screen-type terminal equipped with an acceleration sensor according to an exemplary embodiment of the present invention.
  • The terminal shown in FIG. 4 includes a touch panel but performs gesture recognition through an internal acceleration sensor, and operates similarly to the terminal having a keypad or buttons shown in FIG. 3. However, since the terminal shown in FIG. 4 includes a touch panel, gesture recognition is invoked differently from that of the terminal shown in FIG. 3.
  • A terminal equipped with a touch screen 140 assigns predetermined regions of the touch screen to virtual buttons 150 to 152 in advance.
  • The assigned regions function as a successive recognition processing function call virtual button 150, a single recognition processing function call virtual button 151, and a user gesture recognition call virtual button 152.
  • If a terminal that includes a touch screen separately provides one or more buttons 160 to 162, the functions may be assigned to those buttons, like the terminal having a keypad or buttons.
  • Hereinafter, a user adaptive gesture recognition system that receives sensing information from an acceleration sensor in a terminal according to an exemplary embodiment of the present invention, and recognizes and processes a user gesture, will be described with reference to FIG. 5.
  • In the following description, the user adaptive gesture recognition system 200 is installed in the terminal, but this is not intended to limit the present invention.
  • FIG. 5 is a diagram illustrating the structure of a user adaptive gesture recognition system according to an exemplary embodiment of the present invention.
  • As shown in FIG. 5, the user adaptive gesture recognition system includes a button recognition unit 210, a sensing information processing unit 220, a user adaptive gesture processing unit 230, and an association unit 240.
  • The association unit 240 includes an in-terminal function association unit 241, a mobile browser association unit 242, and a mobile application association unit 243.
  • The button recognition unit 210 recognizes a user gesture, or determines to register the user gesture, when the user presses a button assigned the user gesture recognition request function, a button assigned the user gesture registration request function, or the corresponding region of the touch screen.
  • The sensing information processing unit 220 receives sensing information from the terminal 100 as soon as the button recognition unit 210 recognizes the operation of the button, and extracts a coordinate value collected by the acceleration sensor.
  • A method of extracting a coordinate value is well known in the art, and a detailed description thereof is omitted here.
  • The user adaptive gesture processing unit 230 recognizes the user gesture based on the coordinate value extracted by the sensing information processing unit 220. Then, the user adaptive gesture processing unit 230 searches for interface or program driving information that is pre-registered by the user in association with the recognized gesture, and drives an in-terminal function, a mobile browser function, or a function of a mobile application program in association with that interface or program.
  • The user adaptive gesture processing unit 230 will be described in detail with reference to FIG. 6.
  • FIG. 6 is a diagram illustrating the detailed structure of the user adaptive gesture processing unit according to an exemplary embodiment of the present invention.
  • The user adaptive gesture processing unit 230 includes a user gesture learning unit 232, a user adaptive gesture recognition unit 231, a user gesture-application program association processing unit 233, and an information storage unit.
  • The information storage unit includes a user gesture-interface association information storage unit 234, a user gesture-interface association information registration unit 237, a standard gesture registration storage unit 235, and a user gesture registration storage unit 236.
  • The user adaptive gesture recognition unit 231 recognizes the user gesture based on a coordinate value extracted from the sensing information.
  • The user gesture learning unit 232 records the user gesture recognized by the user adaptive gesture recognition unit 231, searches for interface association information corresponding to the user gesture, and determines whether or not to register the user gesture.
  • Recording the user gesture means that the user gesture recognized by the user adaptive gesture recognition unit 231 is temporarily recorded before being stored in the appropriate storage unit according to the situation.
  • The user gesture-application program association processing unit 233 receives user gesture information from the user gesture learning unit 232 and outputs application program information for driving a program or an interface corresponding to the user gesture information. That is, the user gesture-application program association processing unit 233 searches the association information about the application program or interface stored in the user gesture-interface association information storage unit 234, and if program or interface information corresponding to the user gesture information is stored, outputs the application program information through the interface so as to drive the program or interface. If the program or interface information corresponding to the user gesture information is not stored, the user gesture-application program association processing unit 233 performs control to store the user gesture information.
  • The user gesture-interface association information storage unit 234 stores, in association with the user gesture information, association information on the application program or interface to be driven when the user performs the corresponding gesture.
  • The user gesture-interface association information registration unit 237 registers the program or interface information for the user gesture.
  • The registered information includes the program or interface information held in the user gesture-interface association information storage unit 234. That is, while the user gesture-interface association information storage unit 234 stores the program or interface information that is pre-set by the user, the user gesture-interface association information registration unit 237 stores information on the programs or interfaces that can be executed on the terminal.
  • The standard gesture registration storage unit 235 stores feature values of individual standard gestures for user gesture recognition.
  • The standard gesture feature value is information on a predefined gesture. Accordingly, even if the user does not input information on a user adaptive gesture, a service can be provided with a gesture that is pre-stored in the standard gesture registration storage unit 235.
  • The user gesture registration storage unit 236 stores feature values of individual user gestures.
  • The user gesture feature value is stored in association with the program or interface information stored in the user gesture-interface association information storage unit 234. (A sketch of matching against these stored feature values follows.)
  • Here, the user gesture registration storage unit 236 and the user gesture-interface association information storage unit 234 are provided separately from each other, but this is not intended to limit the present invention.
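The specification does not fix a matching rule for these feature values; one plausible sketch (all names and the threshold are illustrative) compares a recognized gesture's feature vector against the stored user and standard templates by Euclidean distance, checking the user store first to keep the behavior user adaptive:

```python
import math

def match_gesture(features, user_store, standard_store, threshold=1.0):
    """Return the label of the closest stored template within threshold, else None.

    user_store / standard_store: dicts mapping gesture label -> feature vector,
    standing in for storage units 236 and 235 respectively (hypothetical layout).
    """
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    best_label, best_dist = None, threshold
    for store in (user_store, standard_store):
        for label, template in store.items():
            d = distance(features, template)
            if d < best_dist:
                best_label, best_dist = label, d
    return best_label
```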
  • The association unit 240 shown in FIG. 5 includes the in-terminal function association unit 241, which performs association with various functions in the terminal; the mobile browser association unit 242, which performs association with a mobile browser; and the mobile application association unit 243, which performs association with a mobile application.
  • That is, the association unit 240 performs association with one of a function in the terminal, a mobile browser, and a mobile application according to the user gesture.
  • FIGS. 7 and 8 are diagrams illustrating user gestures according to an exemplary embodiment of the present invention.
  • As shown in FIG. 7, the user may perform a gesture with the terminal while pressing a button for gesture recognition, for example a pre-registered enlargement gesture or reduction gesture for enlarging or reducing the size of the display screen.
  • The gesture stored in the user gesture registration storage unit 236 is based on the sensing information collected by the acceleration sensor while the user presses the button for successive motion recognition.
  • FIG. 7 illustrates an example where the screen size is enlarged or reduced as the terminal is moved forward or backward.
  • If the terminal includes a touch screen, the user may touch a virtual button to execute the same function.
  • FIG. 8 illustrates a gesture of up and down motion in a three-dimensional space.
  • Here, the screen is reduced or enlarged as the terminal is moved up or down. That is, an interface function that reduces or enlarges the display screen size is executed; a toy mapping is sketched below.
  • Again, if the terminal includes a touch screen, the user may touch a virtual button to execute the same function.
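Purely as an illustration of the enlarge/reduce association in FIGS. 7 and 8 (the patent prescribes neither an API nor a scale factor, and the direction-to-action pairing is assumed):

```python
ZOOM_STEP = 1.25  # illustrative scale factor per recognized gesture

def apply_zoom_gesture(gesture, scale):
    """Map the registered gestures to the display-scaling interface."""
    if gesture in ("forward", "up"):   # assumed enlargement gestures
        return scale * ZOOM_STEP
    if gesture in ("back", "down"):    # assumed reduction gestures
        return scale / ZOOM_STEP
    return scale                       # other gestures leave the screen as-is

scale = 1.0
scale = apply_zoom_gesture("forward", scale)  # screen enlarged to 1.25x
```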
  • FIG. 9 is a diagram illustrating user gesture patterns according to an exemplary embodiment of the present invention.
  • As shown in FIG. 9, various patterns may be formed according to the three-dimensional direction from a start point to an end point, the kind of turn, and the rotation direction.
  • In addition, other gesture patterns may be defined by the user.
  • The defined gesture patterns are used in association with related programs.
  • FIG. 10 is a flowchart illustrating successive user gesture recognition processing according to an exemplary embodiment of the present invention.
  • First, the button recognition unit 210 of the terminal determines whether or not an input to execute the acceleration sensor-based gesture recognition function is received (S100).
  • Here, the user presses an acceleration sensor-based gesture recognition start button and generates an input signal so as to perform the input to execute the acceleration sensor-based gesture recognition function, but this is not intended to limit the present invention.
  • If the input is received, the sensing information processing unit 220 collects acceleration sensing information (S110).
  • The collected acceleration sensing information means the coordinate values of the acceleration sensor as it moves.
  • The user adaptive gesture recognition unit 231 of the user adaptive gesture processing unit 230 receives the acceleration sensing information as coordinate values from the sensing information processing unit 220, and extracts successive three-dimensional position conversion information.
  • The user adaptive gesture recognition unit 231 recognizes a user gesture from the extracted position conversion information (S120), and transmits the user gesture to the user gesture learning unit 232.
  • The user gesture learning unit 232 records the user gesture based on the acceleration sensing information, and then determines whether or not the recorded gesture is stored in, and can be identified from, the user gesture registration storage unit 236 (S130).
  • If the user gesture learning unit 232 determines that the gesture recognized based on the acceleration sensing information can be identified, it is confirmed whether or not a program or an interface is predefined in association with the corresponding gesture (S140). Whether or not a program or interface is predefined in association with the gesture is determined by whether the corresponding program or interface can be found in the user gesture-interface association information storage unit 234. If the program or interface is predefined, interface information is output for association with the corresponding program or interface (S150).
  • If it is determined in step S140 that no program or interface is defined in association with the gesture in the user gesture-interface association information storage unit 234, it is determined whether or not to define a new program or interface in association with the corresponding gesture (S160). If it is determined to define one, the user gesture learning unit 232 transmits information on the program or interface associated with the corresponding gesture to the user gesture-interface association information registration unit 237 and stores it therein (S170).
  • If it is determined in step S130 that the gesture cannot be identified, recognition of the corresponding gesture is interrupted, and the process returns to step S100, in which it is determined whether or not an input to execute the gesture recognition function is received. (The whole flow is sketched in code below.)
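Steps S100 to S170 can be rendered as the following control-flow sketch; the unit names echo FIGS. 5 and 6, but every method here is a hypothetical paraphrase of the flowchart, not an API from the patent:

```python
def successive_recognition_loop(system):
    while True:
        if not system.recognition_input_received():              # S100
            continue
        info = system.collect_acceleration_info()                # S110
        gesture = system.recognize_gesture(info)                 # S120
        if not system.gesture_store.can_identify(gesture):       # S130: not identifiable,
            continue                                             # back to S100
        interface = system.association_store.lookup(gesture)     # S140
        if interface is not None:
            system.output_interface_info(interface)              # S150
        elif system.should_define_new_association(gesture):      # S160
            system.registration_unit.store_association(gesture)  # S170
```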
  • FIG. 11 is a flowchart illustrating user gesture registration processing according to an exemplary embodiment of the present invention.
  • First, the button recognition unit 210 determines whether or not the user presses the acceleration sensor-based gesture registration button to input a request to execute the gesture registration function (S200). If the user presses the button and requests gesture registration, the sensing information processing unit 220 collects acceleration sensing information from the acceleration sensor (S210).
  • Next, the button recognition unit 210 determines whether or not the user releases the acceleration sensor-based gesture registration button to end the registration request input (S220). When the registration request input ends, the user adaptive gesture recognition unit 231 recognizes a gesture from the received acceleration sensing information (S230). The user gesture learning unit 232 then determines whether or not the gesture recognized by the user adaptive gesture recognition unit 231 is pre-registered in the user gesture registration storage unit 236 (S240). If it is determined that the recognized gesture is not registered, the user gesture learning unit 232 selects a command or interface to associate with the gesture, and registers the selected command or interface in the user gesture registration storage unit 236 (S250).
  • If it is determined in step S240 that the gesture is already registered, the user gesture learning unit 232 determines whether or not to define a new command or interface (S260). If the user gesture learning unit 232 determines to define a new command or interface, the new command or interface information is selected from the standard gesture registration storage unit 235 or the user gesture-interface association information registration unit 237, and is then input to and stored in the user gesture-interface association information storage unit 234 (S270). (This registration flow is sketched in code below.)
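Likewise, steps S200 to S270 of FIG. 11 can be paraphrased in code; again a hedged sketch in which every method name is invented for illustration:

```python
def gesture_registration(system):
    if not system.registration_button_pressed():               # S200
        return
    info = system.collect_acceleration_info()                  # S210: collect while held
    if system.registration_button_released():                  # S220: input finished
        gesture = system.recognize_gesture(info)               # S230
        if not system.gesture_store.is_registered(gesture):    # S240
            command = system.select_command_or_interface()
            system.gesture_store.register(gesture, command)    # S250
        elif system.should_define_new_command():               # S260
            command = system.pick_command(system.standard_store,
                                          system.registration_unit)
            system.association_store.store(gesture, command)   # S270
```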

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A user adaptive gesture recognition method and system are provided. By means of a terminal equipped with an acceleration sensor, the method and system can control mobile application software in the terminal, or process a browsing application program function to be displayed on the terminal, according to the acceleration information. This allows a user gesture to be recognized and processed by means of an acceleration sensor mounted in a mobile device. It is also possible to store the user adaptive gesture in the mobile device by means of the acceleration sensor and, consequently, to use a mobile application easily with a simple gesture.
PCT/KR2008/005100 2007-12-03 2008-08-29 User adaptive gesture recognition method and system Ceased WO2009072736A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/745,800 US20100275166A1 (en) 2007-12-03 2008-08-29 User adaptive gesture recognition method and user adaptive gesture recognition system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2007-0124592 2007-12-03
KR20070124592 2007-12-03
KR10-2008-0022182 2008-03-10
KR1020080022182A KR100912511B1 (ko) 2007-12-03 2008-03-10 User adaptive gesture recognition method and system

Publications (1)

Publication Number Publication Date
WO2009072736A1 (fr) 2009-06-11

Family

ID: 40717894

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2008/005100 Ceased WO2009072736A1 (fr) User adaptive gesture recognition method and system

Country Status (1)

Country Link
WO (1) WO2009072736A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6573883B1 (en) * 1998-06-24 2003-06-03 Hewlett Packard Development Company, L.P. Method and apparatus for controlling a computing device with gestures
WO2003036452A1 (fr) * 2001-10-24 2003-05-01 Sony Corporation Device for displaying image information
WO2005093550A2 (fr) * 2004-03-01 2005-10-06 Apple Computer, Inc. Methods and apparatuses for operating a portable device based on an accelerometer
KR20060027180A (ko) * 2004-09-22 2006-03-27 NCsoft Corporation Method for reflecting the movement of a mobile terminal in three-dimensional space in image information, and the mobile terminal

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010002677A1 (de) 2010-03-09 2011-09-15 Robert Bosch Gmbh Separation of translational and rotational motion for motion detection
CN101853073A (zh) * 2010-06-18 2010-10-06 South China University of Technology Distance measurement method for rotation feature codes applied to gesture recognition
WO2012091862A1 (fr) * 2010-12-27 2012-07-05 Sling Media, Inc. Systems and methods for adaptive gesture recognition
US9785335B2 (en) 2010-12-27 2017-10-10 Sling Media Inc. Systems and methods for adaptive gesture recognition
WO2012134914A1 (fr) * 2011-03-28 2012-10-04 Apple Inc. Systems and methods for defining print settings using a user interface
CN103430139A (zh) * 2011-03-28 2013-12-04 Apple Inc. System and method for defining print settings using an input interface
US8724146B2 (en) 2011-03-28 2014-05-13 Apple Inc. Systems and methods for defining print settings using device movements
EP2541392A3 (fr) * 2011-07-01 2013-10-23 Seiko Epson Corporation Portable terminal, printing system, control method for portable terminal, and computer program
CN104750386A (zh) * 2015-03-20 2015-07-01 Guangdong OPPO Mobile Telecommunications Co., Ltd. Gesture recognition method and apparatus
CN104750386B (zh) * 2015-03-20 2018-01-19 Guangdong OPPO Mobile Telecommunications Co., Ltd. Gesture recognition method and apparatus

Similar Documents

Publication Publication Date Title
US20100275166A1 (en) User adaptive gesture recognition method and user adaptive gesture recognition system
JP6121102B2 (ja) Haptic effects with proximity sensing
EP2353065B1 (fr) Controlling and accessing content using motion processing on mobile devices
CN102246125B (zh) Mobile devices with motion gesture recognition
WO2009072736A1 (fr) User adaptive gesture recognition method and system
JP5338662B2 (ja) Information processing apparatus, input apparatus, and information processing system
US20090262074A1 (en) Controlling and accessing content using motion processing on mobile devices
US7933738B2 (en) Determining a point of application of force on a surface element
CN102426490A (zh) Electronic device, processing method, and program
CN103262005A (zh) Detecting gestures involving intentional movement of a computing device
US9367169B2 (en) Method, circuit, and system for hover and gesture detection with a touch screen
TW201145146A (en) Handling tactile inputs
CN107144291A (zh) Data processing method and mobile terminal
JP5759659B2 (ja) Method for detecting pressing force on a touch panel, and portable terminal device
KR101365083B1 (ko) Interface apparatus using motion recognition, and control method thereof
CN103984407B (zh) Method and apparatus for motion recognition using motion sensor fusion
KR100777107B1 (ko) Apparatus and method for character recognition using an acceleration sensor
CN111145891A (zh) Information processing method and apparatus, and electronic device
US9235338B1 (en) Pan and zoom gesture detection in a multiple touch display
US7924265B2 (en) System and method for emulating wheel-style, rocker-style, or wheel-and-rocker style navigation with an analog pointing device
US20050110756A1 (en) Device and method for controlling symbols displayed on a display device
WO2012075629A1 (fr) User interface
JP5080409B2 (ja) Information terminal device
KR102194778B1 (ko) Method for controlling a terminal using spatial interaction, and the terminal therefor
EP2407866B1 (fr) Portable electronic device and method of determining a location of a touch

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08793599

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 12745800

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08793599

Country of ref document: EP

Kind code of ref document: A1