
WO2018061603A1 - Gesture operation system, gesture operation method, and program - Google Patents

Gesture operation system, gesture operation method, and program

Info

Publication number
WO2018061603A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
operator
gesture
gesture input
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/031413
Other languages
English (en)
Japanese (ja)
Inventor
祐司 篠村
藤原 直樹
克尚 平井
督之 市原
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shimane Prefecture
Yazaki Corp
Original Assignee
Shimane Prefecture
Yazaki Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shimane Prefecture, Yazaki Corp filed Critical Shimane Prefecture
Publication of WO2018061603A1 publication Critical patent/WO2018061603A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion

Definitions

  • The present invention relates to a gesture operation system and a gesture operation method for issuing operation instructions to an operation target in response to a gesture input by an operator.
  • The present invention has been made under the circumstances described above. Its object is to provide a gesture operation system, a gesture operation method, and a program that determine whether a detected hand is the right hand or the left hand, change the operation target accordingly, and allow an operator to issue operation instructions to a plurality of different operation targets more easily and accurately.
  • To achieve this, a gesture operation system according to the present invention includes: an imaging unit that images the hand of an operator who operates a vehicle steering device; a discriminating unit that, before a reception period for gesture input by the operator, detects a hand present in a detection region set between the steering device and the operator based on imaging data of the operator's hand, and determines whether it is the right hand or the left hand; a selection unit that, before the reception period, selects one operation target from a plurality of different operation targets related to the vehicle according to the pattern of the determined hand; and an instruction unit that, during the reception period, issues an operation instruction to the operation target based on a gesture input by the hand.
  • The gesture operation system may further include a setting unit that changes the shape of an operation region for receiving the gesture input according to whether the determined hand is the right hand or the left hand, and sets the operation region between the steering device and the operator; the instruction unit may then issue the operation instruction based on a gesture input by the hand within the operation region.
  • The shape of the operation region may be set so that the determined hand is positioned asymmetrically as viewed from the operator, so that operation of the steering device is not hindered and the hand can swipe up, down, left, and right.
  • The setting unit may position the operation region based on the position of the hand held up by the operator before the reception period.
  • The instruction unit may also issue an operation instruction to the operation target based on the content of an operation instruction associated with the determined state of the right hand or the left hand.
  • The gesture operation system may further include a generation unit that generates an operation region for receiving the gesture input between the steering device and the operator, and the instruction unit may issue the operation instruction based on a gesture input by the hand within that operation region.
  • The instruction unit may feed back the result of the operation instruction.
  • A gesture operation method according to the present invention includes: imaging the hand of an operator who operates a vehicle steering device; before a reception period for gesture input by the operator, detecting a hand present in a detection region set between the steering device and the operator based on the imaging data of the operator's hand, and determining whether it is the right hand or the left hand; before the reception period, selecting one operation target from a plurality of different operation targets related to the vehicle according to the pattern of the determined hand; and, during the reception period, issuing an operation instruction to the operation target based on a gesture input by the hand.
  • A program according to the present invention causes a computer to execute the above gesture operation method.
  • According to the present invention, it is possible to distinguish between the right hand and the left hand, change the operation target accordingly, and issue operation instructions to a plurality of different operation targets more easily and accurately.
  • The gesture operation system 1 issues operation instructions to vehicle-related operation target devices according to gesture inputs from an operator who is the driver of the vehicle.
  • FIG. 1 is a block diagram illustrating a configuration example of a gesture operation system 1 according to an embodiment of the present invention.
  • The gesture operation system 1 includes a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, and a sensor 31; these components 11, 12, 13, and 31 are connected to a bus 40.
  • The operation target 20 is also connected to the bus 40.
  • The operation target 20 includes, for example, a HUD (Head-Up Display) 21, a car navigation device 22 including a display unit 221, a room lamp or headlight 23, and the like.
  • The car navigation device 22 is described here as a stand-alone device, but its functions can also be integrated with the CPU 11, ROM 12, and RAM 13.
  • The CPU 11 is connected to the components 12, 13, 20, and 31 through the bus 40; it transfers control signals and data, executes and computes the various programs that realize the overall operation of the gesture operation system 1, and performs processing related to gesture input.
  • The ROM 12 stores an operating system program and the various data necessary for controlling the operation of the entire gesture operation system 1; the program is read into the RAM 13 and executed by the CPU 11.
  • The program may also be stored on a recording medium such as a DVD-ROM or an HDD.
  • The RAM 13 provides a recording area that temporarily holds the data and programs necessary for gesture input.
  • The sensor 31 is an imaging device for detecting gesture inputs performed by the operator while driving.
  • The sensor output is processed by the CPU 11, and an operation instruction is issued according to the gesture input.
  • A driver who operates the steering wheel (steering device) 5 performs gestures by hand. Any type of sensor can be used as the sensor 31 as long as it can detect those gestures.
  • The sensor 31 may be, for example, an RGB color sensor capable of imaging a hand, an infrared distance sensor, a laser distance sensor, an ultrasonic distance sensor, a distance image (depth) sensor, an electric field sensor, or an image sensor.
  • FIG. 2 is a diagram illustrating the interior of a vehicle on which the gesture operation system 1 is mounted.
  • A steering wheel 5 and an instrument panel 26 are provided in the vehicle.
  • The HUD 21 is provided on the windshield surface, where the operator can visually recognize it.
  • The sensor 31 is provided approximately at the center of the instrument panel 26.
  • The sensor 31 detects gestures made by the operator's hand.
  • The display unit 221 is provided in the car navigation device 22.
  • FIGS. 3A to 3E are diagrams explaining the outline of the operation in time series.
  • The operator 100 moves the right hand 101 in the x direction to input a gesture.
  • The CPU 11 detects the gesture input and issues an operation instruction to an operation target 20 such as the car navigation device 22.
  • The operation region s1 is generated after a hand is detected in the detection region t1. When a gesture input by the hand of the operator 100 is then accepted in the operation region s1, an operation instruction is issued to the operation target 20.
  • A predetermined period after the operation region s1 is generated is referred to as the "reception period" for gesture input.
  • The detection region t1 is set "before the reception period".
  • FIGS. 3A to 3E show the positioning of the detection region t1 before the gesture input reception period (FIGS. 3A and 3B) and of the operation region s1 during the reception period (FIGS. 3C to 3E).
  • FIG. 4 shows (a) a top view and (b) a side view of the detection region t1.
  • FIG. 5 shows (a) a top view and (b) a side view of the operation region s1. FIGS. 4 and 5 merely illustrate one example of the arrangement and shape of the regions t1 and s1.
  • FIG. 6 is a flowchart showing the entire gesture operation control process.
  • First, the preliminary motion detection process is performed before the gesture input reception period (step S10), and the main operation process is then performed (step S11).
  • Steps S10 and S11 are shown in detail in FIGS. 7 and 8, described later.
  • FIG. 7 is a flowchart of the processing executed sequentially before the gesture input reception period (the preliminary motion detection processing).
  • FIG. 8 is a flowchart of the processing executed sequentially during the gesture input reception period (the main operation processing).
  • First, the CPU 11 acquires one frame of image data (imaging data) captured by the sensor 31 (step S101) and determines whether the hand of the operator 100 is present in the image data (step S102). In step S102, the CPU 11 extracts from the image data of the sensor 31 only the region within a certain distance (for example, as a three-dimensional component), sets that region as the detection region t1 (FIG. 4), and determines whether the hand of the operator 100 is present within the detection region t1.
  • In this gesture operation system 1, the presence of a hand is determined from the analysis of various features obtained from the acquired image data, such as the amount of movement, the shape, and the center-of-gravity coordinates of an object (such as a hand).
  • Specifically, the CPU 11 binarizes the image data from the sensor 31 based on a preset threshold and detects the contours of the objects contained in the image data. The CPU 11 then extracts, from the detected contours, contours whose area falls within a preset range, and recognizes a contour as the hand 101 when its center-of-gravity coordinates are included in the detection region t1; in that case, it determines that a hand is present.
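As an illustration only, the binarize-contour-centroid test described above maps directly onto common image-processing primitives. The following Python/OpenCV sketch is not the patent's implementation; the binarization threshold, the plausible hand-area range, and the bounds of the detection region t1 are invented values.

```python
import cv2

# Hypothetical tuning values; the patent discloses no concrete numbers.
BINARY_THRESHOLD = 60
AREA_RANGE = (3000, 30000)               # plausible hand-contour area (px^2)
DETECTION_REGION = (100, 50, 400, 300)   # x_min, y_min, x_max, y_max of t1

def find_hand(gray_frame):
    """Binarize one frame, extract contours, and return (contour, centroid)
    for a contour whose center of gravity lies inside t1, else None."""
    _, binary = cv2.threshold(gray_frame, BINARY_THRESHOLD, 255,
                              cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    x0, y0, x1, y1 = DETECTION_REGION
    for contour in contours:
        if not AREA_RANGE[0] <= cv2.contourArea(contour) <= AREA_RANGE[1]:
            continue                      # too small or too large for a hand
        m = cv2.moments(contour)
        if m["m00"] == 0:
            continue
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # center of gravity
        if x0 <= cx <= x1 and y0 <= cy <= y1:
            return contour, (cx, cy)      # recognized as the hand 101
    return None
```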
  • If it is determined in step S102 that a hand is present, the CPU 11 determines whether the preliminary motion condition is satisfied (step S103).
  • For example, the CPU 11 determines that the preliminary motion condition is satisfied when the hand remains stationary for a predetermined time with the palm open.
  • Whether the palm is open can be determined, for example, from unevenness (convexity) information of the hand contour.
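The "unevenness information" can be read as contour convexity. One common realization, shown here as an assumption rather than the patent's method, counts the deep convexity defects (the valleys between extended fingers) of the hand contour:

```python
import cv2

def palm_is_open(contour, min_defect_depth=10000, min_valleys=3):
    """Heuristic: an open palm exhibits several deep valleys between fingers.
    Depths are in the fixed-point units of convexityDefects (distance * 256);
    both thresholds are illustrative guesses."""
    hull = cv2.convexHull(contour, returnPoints=False)
    if hull is None or len(hull) < 4:
        return False
    defects = cv2.convexityDefects(contour, hull)
    if defects is None:
        return False
    valleys = sum(1 for d in defects[:, 0] if d[3] > min_defect_depth)
    return valleys >= min_valleys
```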
  • Whether the hand is stationary can be determined from the amount of movement of the hand's center-of-gravity coordinates. For example, the hand is judged to be stationary when the movement of its center-of-gravity coordinates within a certain number of frames is equal to or less than a threshold.
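A minimal stillness test along these lines, with a hypothetical window length and pixel threshold, might look like this:

```python
from collections import deque

class StillnessDetector:
    """Judges the hand stationary when its centroid moves no more than
    max_move pixels over the last `window` frames (illustrative values)."""

    def __init__(self, window=30, max_move=8.0):
        self.history = deque(maxlen=window)
        self.max_move = max_move

    def update(self, cx, cy):
        """Feed one centroid per frame; True once the hand has been still."""
        self.history.append((cx, cy))
        if len(self.history) < self.history.maxlen:
            return False                  # not enough frames observed yet
        xs = [p[0] for p in self.history]
        ys = [p[1] for p in self.history]
        spread = max(max(xs) - min(xs), max(ys) - min(ys))
        return spread <= self.max_move
```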
  • The preliminary motion and the preliminary motion condition are not limited to the above examples and can be set freely.
  • The detection region t1 may be any region in which the hand of the operator 100 can be detected; various other positions, sizes, and shapes are also possible.
  • The detection region t1 need not be fixed at the same position; it may be moved at the time of the determination in step S103. With such movement, for example, the position of the detection region t1 at the time of the determination in step S103 can differ from its position at the time of the determination in step S102 (when determining whether a hand is present).
  • When it is determined in step S103 that the preliminary motion condition is satisfied, the CPU 11 performs the initial setting of the operation target 20 (step S104).
  • For example, it is conceivable to determine the corresponding operation target from among hardware targets (the HUD 21, the car navigation device 22, etc.) and software targets (music, map, etc.) according to the state (pattern) of the hand that satisfied the preliminary motion condition described above.
  • FIG. 9A is a table showing an example of data d1 associating hand states with the hardware to be determined.
  • FIG. 9B is a table showing an example of data d2 associating hand states with the software to be determined (invoked).
  • For example, when the left hand is determined to be in the "goo" (closed fist) state, the HUD 21 is determined as the operation target 20; when the right hand is determined to be in the "goo" state, the air conditioner is determined as the operation target 20.
  • In step S104 of FIG. 7, the CPU 11 reads the data d1 and d2 from the ROM 12 and determines the operation target 20 corresponding to the state of the right hand or the left hand.
  • Specifically, the CPU 11 (discriminating unit) determines, before the gesture input reception period, whether the operator's hand detected in step S102 is the right hand or the left hand (step S104). As one such determination method, the right or left hand can be discriminated based on whether the center-of-gravity position of the hand lies relatively to the right or to the left (within a preset range) on the horizontal plane with respect to the optical axis direction of the sensor 31.
  • The CPU 11 (selection unit) then selects, from the plurality of different operation targets, the operation target 20 associated with the state of that hand, as sketched below.
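The sketch below combines the two steps. The optical-axis x position and the table entries beyond the HUD/air-conditioner pairings stated above are assumptions, not values from the patent:

```python
OPTICAL_AXIS_X = 320   # image x of the sensor's optical axis (assumed)

def discriminate_hand(cx):
    """Right or left hand from the side of the optical axis the centroid
    falls on; a driver-facing sensor mirrors the image, so a real
    installation may need this mapping flipped."""
    return "right" if cx < OPTICAL_AXIS_X else "left"

# d1/d2-style data. Only (left, fist) -> HUD and (right, fist) -> air
# conditioner are stated in the text; the other rows are placeholders.
TARGET_TABLE = {
    ("left", "fist"): "HUD 21",
    ("right", "fist"): "air conditioner",
    ("left", "palm"): "car navigation device 22",
    ("right", "palm"): "headlight 23",
}

def select_target(hand, pattern):
    """Initial setting of the operation target 20 (step S104)."""
    return TARGET_TABLE.get((hand, pattern))
```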
  • The hand states are not limited to the examples shown in FIGS. 9A and 9B; various other states may be included.
  • FIGS. 10(a) to 10(o) illustrate various hand states. FIG. 10(i) shows the "goo" (fist) state of FIGS. 9A and 9B, FIG. 10(j) shows the "choki" (scissors) state, and FIG. 10(k) shows the "par" (open palm) state. The remaining figures show various other hand states. Correspondences with many different operation targets 20 can thereby be set.
  • Next, the CPU 11 (generation unit) generates the operation region s1 (step S105). For example, in FIGS. 5A and 5B, the CPU 11 detects the hand 101 held up by the operator 100 based on the image data from the sensor 31 and generates the operation region s1 around the hand 101, between the steering wheel 5 and the operator 100.
  • The operation region s1 is not limited to the shape shown in FIGS. 5A and 5B and can be set freely; one minimal way to realize it is sketched below.
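A minimal realization, assuming regions are axis-aligned boxes in image coordinates, centers a fixed-size box on the detected hand centroid; the dimensions are invented for illustration:

```python
def generate_operation_region(cx, cy, half_width=150, half_height=100):
    """Return the operation region s1 as (x_min, y_min, x_max, y_max),
    centered on the hand held up by the operator (sizes are assumptions)."""
    return (cx - half_width, cy - half_height,
            cx + half_width, cy + half_height)
```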
  • Next, the CPU 11 acquires one frame of image data captured by the sensor 31 (step S111) and determines whether a gesture input operation by the operator 100 is present in the image data (step S112).
  • Specifically, the CPU 11 analyzes the movement of the hand of the operator 100 within the operation region s1 from the image data of the sensor 31 and determines whether that movement is a gesture input operation.
  • In this gesture operation system 1, a gesture input operation is recognized from the analysis of various features such as the amount and direction of movement of the hand and changes in the state of the hand.
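For the "amount and direction of movement" criterion, a minimal swipe classifier might compare the centroid's start and end positions within the reception window against a hypothetical minimum travel distance:

```python
def classify_swipe(start, end, min_distance=60.0):
    """Classify a hand trajectory inside s1 as an up/down/left/right swipe,
    or None if the movement is too small to count as a gesture input."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if max(abs(dx), abs(dy)) < min_distance:
        return None
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"     # image y grows downward
```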
  • When it is determined that a gesture input operation is present, the CPU 11 (instruction unit) issues an operation instruction to the operation target 20 corresponding to the gesture input (step S113).
  • The contents of the operation instructions to the operation target 20 associated with the state of the right hand or the left hand are recorded as data in the ROM 12; the CPU 11 reads this data and issues the operation instruction.
  • FIGS. 11 to 13 are tables showing data d11 to d13 of the correspondence between operation targets 20 and operation instruction contents 25. The ROM 12 stores the data d11 to d13.
  • The CPU 11 reads these data from the ROM 12 and issues, to the operation target 20, the operation instruction corresponding to the gesture input in accordance with the operation instruction content 25.
  • In FIG. 11, the map scale and the like are given as operation targets 20, with operation instruction contents 25 for operations such as switching the map between wide-area and detailed views.
  • In FIG. 12, the engine and the like are given as operation targets 20, with operation instruction contents 25 for operations such as turning the engine on and off.
  • In FIG. 13, camera video and the like are given as operation targets 20, with operation instruction contents 25 for operations such as switching between the back-camera video and the panoramic video.
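The d11 to d13 correspondence data can be modeled as a dispatch table from (operation target, gesture) to an instruction. The pairings below echo the map/engine/camera examples of FIGS. 11 to 13, but the concrete gesture assignments are assumptions:

```python
# d11-d13-style data; the gesture assignments are illustrative only.
INSTRUCTION_TABLE = {
    ("map scale", "left"): "switch to wide-area view",
    ("map scale", "right"): "switch to detailed view",
    ("engine", "up"): "engine on",
    ("engine", "down"): "engine off",
    ("camera video", "left"): "show back-camera video",
    ("camera video", "right"): "show panoramic video",
}

def issue_instruction(target, gesture):
    """Step S113: look up and issue the instruction for a gesture input."""
    instruction = INSTRUCTION_TABLE.get((target, gesture))
    if instruction is not None:
        print(f"{target}: {instruction}")  # stand-in for the real actuator
    return instruction
```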
  • Next, the CPU 11 (instruction unit) feeds back the result of the operation instruction of step S113 (step S114).
  • Examples of such feedback include changing the display on the HUD 21, the display unit 221 of the car navigation device 22, or the instrument panel 26; changing the operation of the operation target 20 (for example, switching the air conditioner on or off); and outputting a sound from a speaker.
  • This feedback lets the operator 100 confirm, visually or audibly, that the operation instruction was issued to the intended operation target.
  • Next, the CPU 11 determines whether the operation has been completed (step S115). For example, when the hand has moved out of the detection region t1, or when the hand has moved out of the operation region s1, the CPU 11 determines that the operation has been completed. According to the flowchart of FIG. 8, even when the hand state does not match a pre-registered hand pattern (FIGS. 9A and 9B) in the determination of step S112, the operation is judged to be in the completed state.
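The completion test of step S115 then reduces to point-in-box checks against t1 and s1, assuming the same (x_min, y_min, x_max, y_max) region representation as in the sketches above:

```python
def inside(region, cx, cy):
    """True if the centroid lies inside an (x_min, y_min, x_max, y_max) box."""
    x0, y0, x1, y1 = region
    return x0 <= cx <= x1 and y0 <= cy <= y1

def operation_completed(detection_region, operation_region, cx, cy):
    """Step S115: the operation ends once the hand leaves either region."""
    return not (inside(detection_region, cx, cy)
                and inside(operation_region, cx, cy))
```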
  • As described above, in this gesture operation system 1, a hand present in the detection region t1 is detected before the gesture input reception period, and it is determined whether the hand is the right hand or the left hand.
  • The operation target is then selected according to the state of the hand.
  • During the reception period, an operation instruction to the operation target is issued based on the gesture input by the hand of the operator 100.
  • Since the hand states include various states of the right hand and the left hand, correspondences between these various hand states and a plurality of different operation targets 20 (including both hardware and software) can be set.
  • This means that the operator can issue operation instructions to many different kinds of operation targets 20. For example, operations of a plurality of different application functions (such as map display and music playback) can be instructed on the same hardware (such as the car navigation device).
  • (Modification 1) Although the position of the operation region s1 was not specified above, the operation region s1 may be arranged based on the position of the hand immediately after the preliminary motion.
  • FIG. 14 shows an arrangement example of the operation region s1.
  • In FIG. 14, the operation region s1 is arranged at a position corresponding to the position of the hand 101 that the operator 100 held up within the detection region t1 immediately after the preliminary motion. In this gesture operation system 1, the operator can therefore perform gesture input while moving the hand at a position of his or her choosing.
  • For example, the CPU 11 generates an operation region s1 in which the right hand 101 can be swiped to the left about the right elbow.
  • In this way, the operator 100, while driving, can perform swipe operations up, down, left, and right with the corresponding hand in a region that is asymmetric as viewed from the operator 100, so that operation of the steering wheel 5 is not hindered.
  • This improves the operability of gesture input.
  • FIG. 16 illustrates a mode in which the preliminary motion is detected using two detection regions t1 and t11.
  • In this mode, the preliminary motion condition is considered satisfied when the center-of-gravity position p of the hand 101 (for example, its values on the x, y, and z axes of the world coordinate system) meets a predetermined condition and the hand 101 remains stationary for a certain time. It is thereby possible to determine whether the preliminary motion condition is satisfied while determining the presence of the hand described in step S103.
  • In step S103 shown in FIG. 7, the center-of-gravity position p of the hand of the operator 100 is analyzed based on the image data from the sensor 31; when the state of the hand having the center-of-gravity position p matches a pre-registered hand state, the motion of the hand can be determined to satisfy the preliminary motion condition as the preliminary motion.
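Under this modification, the preliminary condition combines a world-coordinate position test with the stillness test. A sketch, assuming the centroid p has already been converted to (x, y, z) world coordinates and that each detection region is an axis-aligned volume with invented bounds:

```python
def preliminary_condition_met(p, region_bounds, is_still):
    """p: hand centroid (x, y, z) in world coordinates.
    region_bounds: ((x0, x1), (y0, y1), (z0, z1)) for t1 or t11 (assumed).
    is_still: result of the stillness test over recent frames."""
    (x0, x1), (y0, y1), (z0, z1) = region_bounds
    x, y, z = p
    return (x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1
            and is_still)
```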
  • The position of the sensor 31 can be changed as long as it can image the movements of the hands 101 and 102 of the operator 100 who operates the steering wheel 5.
  • FIG. 17 shows the cases in which (a) the sensor 31 is installed on the ceiling of the vehicle, (b) the sensor 31 is installed near the map lamp, (c) the sensor 31 is installed near the car navigation device 22, and (d) the sensor 31 is installed in the dashboard.
  • (Modification 7) The above description does not mention passengers other than the driver (in the passenger seat or rear seats), but such passengers may also perform gesture input as the operator 100 in the same manner as the driver.
  • In this case, a sensor 31 capable of imaging the passenger is provided, and the gesture operation system 1 according to the above embodiment performs the processing for the passenger's gesture input (steps S101 to S105 and S111 to S115) based on the image data from that sensor 31.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides a gesture operation system and the like that distinguishes between a right hand and a left hand, changes the corresponding operation target accordingly, and issues operation instructions to a plurality of different operation targets more easily and accurately. The gesture operation system comprises: an imaging unit for imaging the hands of an operator who operates the steering device of a vehicle; a discriminating unit for detecting a hand present in a detection region set between the steering device and the operator, based on imaging data of the operator's hands captured before a reception period for gesture input by the operator, and determining whether the hand is a right hand or a left hand; a selection unit for selecting, before the reception period and according to the pattern of the determined hand, one operation target from among a plurality of different operation targets associated with the vehicle; and an instruction unit for issuing, during the reception period, an operation instruction to the operation target based on a gesture input by the hand.
PCT/JP2017/031413 2016-09-30 2017-08-31 Gesture operation system, gesture operation method, and program Ceased WO2018061603A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016194322A JP2018055614A (ja) 2016-09-30 2016-09-30 Gesture operation system, gesture operation method, and program
JP2016-194322 2016-09-30

Publications (1)

Publication Number Publication Date
WO2018061603A1 (fr) 2018-04-05

Family

ID=61759573

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/031413 Ceased WO2018061603A1 (fr) Gesture operation system, gesture operation method, and program

Country Status (2)

Country Link
JP (1) JP2018055614A (fr)
WO (1) WO2018061603A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113408330A (zh) * 2020-02-28 2021-09-17 株式会社斯巴鲁 Vehicle occupant monitoring device

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110716648B (zh) * 2019-10-22 2021-08-24 上海商汤智能科技有限公司 Gesture control method and device
WO2022157880A1 (fr) * 2021-01-21 2022-07-28 三菱電機株式会社 Hand detection device, gesture recognition device, and hand detection method
JP7163526B1 (ja) 2021-07-20 2022-10-31 株式会社あかつき Information processing system, program, and information processing method
JP7052128B1 (ja) 2021-07-20 2022-04-11 株式会社あかつき Information processing system, program, and information processing method


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07282235A (ja) * 1994-04-15 1995-10-27 Matsushita Electric Ind Co Ltd Motion recognition device
JP2013205983A (ja) * 2012-03-27 2013-10-07 Sony Corp Information input device, information input method, and computer program
JP2014119295A (ja) * 2012-12-14 2014-06-30 Clarion Co Ltd Control device and portable terminal


Also Published As

Publication number Publication date
JP2018055614A (ja) 2018-04-05

Similar Documents

Publication Publication Date Title
JP5261554B2 Human-machine interface for vehicles based on fingertip pointing and gestures
JP6030430B2 Control device, vehicle, and portable terminal
CN104627094B Vehicle recognizing user gestures and method for controlling the vehicle
JP6316559B2 Information processing device, gesture detection method, and gesture detection program
JP6202810B2 Gesture recognition device, method, and program
WO2018061603A1 Gesture operation system, gesture operation method, and program
JP2018150043A System for transmitting information in a motor vehicle
CN106030460B Gesture guidance device for moving body, gesture guidance system for moving body, and gesture guidance method for moving body
KR102084032B1 User interface, transport means, and method for distinguishing users
US20140079285A1 Movement prediction device and input apparatus using the same
US20140168068A1 System and method for manipulating user interface using wrist angle in vehicle
CN103869970B System and method for operating a user interface through a 2D camera
CN105807912A Vehicle, method for controlling the vehicle, and gesture recognition device therein
JP6515028B2 Vehicle operation device
CN104040465A Method and device for controlling vehicle functions using gestures performed in three-dimensional space, and related computer program product
JP6011579B2 Gesture input device
JP2018203214A Parking assistance device, parking assistance method, driving assistance device, and driving assistance method
JP2016038621A Spatial input system
JP3933139B2 Command input device
WO2018061413A1 Gesture detection device
JP2006285370A Hand pattern switch device and hand pattern operation method
CN107291219A User interface, transportation means, and method for recognizing a user's hand
JP5136948B2 Vehicle operation device
JP6819539B2 Gesture input device
JP6315443B2 Input device, input detection method for multi-touch operation, and input detection program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17855561

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17855561

Country of ref document: EP

Kind code of ref document: A1