
TW201039210A - Touch-sensing device and procession method for man-machine interface - Google Patents

Touch-sensing device and procession method for man-machine interface

Info

Publication number
TW201039210A
Authority
TW
Taiwan
Prior art keywords
touch
human
touch object
coordinate
touch panel
Prior art date
Application number
TW99111355A
Other languages
Chinese (zh)
Inventor
Ming-Chuan Lin
Lin Lin
Chih-Chiang Lin
Yung-Chang Lin
Kuang-Wei Li
Chih-Chang Lai
Shyh-Jeng Chen
Original Assignee
Wintek Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wintek Corp filed Critical Wintek Corp
Priority to TW99111355A priority Critical patent/TW201039210A/en
Publication of TW201039210A publication Critical patent/TW201039210A/en

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

A touch-sensing device includes a touch panel, a touch panel controller, a data processing system, and a learning module for the man-machine interface. An object performs a touch action on the touch panel; the touch panel controller generates coordinate signals according to the touch action and outputs a variation value of the coordinate signals. The data processing system recognizes the change in position of the object from the variation value, and the learning module maps the position changes to hot keys or commands defined by software and stores the correspondences.
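The abstract describes a three-stage data flow: the controller emits a coordinate variation, the processing system reduces it to a position-change state, and the learning module stores the state-to-command correspondence. The following is a minimal sketch of that flow; every name and threshold in it is an assumption for illustration, not part of the patent.

```python
# Minimal sketch of the abstract's data flow. Every name and threshold here
# is illustrative; the patent does not specify an implementation.

def classify_variation(dx: float, dy: float) -> str:
    """Reduce a coordinate variation to a coarse position-change state."""
    if abs(dx) < 5 and abs(dy) < 5:
        return "tap"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"   # screen y grows downward

class LearningModule:
    """Stores user-defined position-change -> software-command mappings."""

    def __init__(self) -> None:
        self.bindings: dict[str, str] = {}

    def learn(self, state: str, command: str) -> None:
        self.bindings[state] = command  # remember the correspondence

    def lookup(self, state: str) -> str | None:
        return self.bindings.get(state)

module = LearningModule()
state = classify_variation(dx=40.0, dy=-3.0)    # -> "right"
module.learn(state, "Ctrl++")                   # bind it to a zoom-in command
assert module.lookup("right") == "Ctrl++"
```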

Description

TECHNICAL FIELD

The present invention relates to a touch device in which the user can define the movement states of a touch object, and to a man-machine interface processing method for such a device.

BACKGROUND

FIG. 1 is a schematic view of a conventional touch device. As shown in FIG. 1, a touch panel 102 outputs single-point or multi-point touch coordinates through a touch panel controller 106, and the coordinates are transmitted over a transmission interface 108, such as USB, to a processor system 104, which determines the gesture of the current touch operation from the touch coordinates. However, the gestures that correspond to the shortcut keys or commands of a piece of software are currently defined by the touch device manufacturer and are not necessarily the gestures the user actually prefers.

Among known designs, U.S. Patent No. 7,479,949 defines touch operations such as button presses, page turning, scrolling, and two-dimensional motions, but these operations are essentially defined by the mobile phone manufacturer and cannot be set freely by the user. U.S. Patent Publication No. 2007/0229466 describes how a tap gesture is generated, but the gesture cannot be defined by the user. Likewise, U.S. Patent Publication No. 2007/021009 describes how tap, drag, and double-tap gestures are generated, but these gestures also cannot be defined by the user.

SUMMARY

The present invention provides a touch device in which the user can define the movement states of a touch object, and a man-machine interface processing method for such a device. Other objects and advantages of the present invention can be further understood from the technical features disclosed herein.

According to one embodiment of the invention, a touch device includes a touch panel, a touch panel controller, a processor system, and a man-machine interface learning module. The touch panel lets a touch object produce a touch action; the touch panel controller generates coordinate signals according to the touch action and outputs a coordinate variation value. The processor system determines the position change of the touch object from the coordinate variation value, and the man-machine interface learning module maps the position-change state of the touch object to a shortcut key or command of a piece of software and stores the correspondence.

According to another embodiment of the invention, a touch device includes a touch panel, a touch panel controller, a processor system, and a man-machine interface learning module. The touch panel lets a touch object produce a touch action, and the touch panel controller generates a plurality of coordinate signals according to the touch action. The processor system receives these coordinate signals and determines the position change of the touch object from their variation. The man-machine interface learning module maps the position-change state of the touch object to a shortcut key or command of a piece of software and stores the correspondence.

According to yet another embodiment of the invention, a man-machine interface processing method for a touch device includes the following steps: launching a man-machine interface learning program; selecting the shortcut key or command to be defined; producing a touch action on the touch device with a touch object; reading the coordinate changes of the touch object to determine its movement state; and mapping the movement state of the touch object to the selected shortcut key or command.

With each of the designs above, the man-machine interface learning module maps the position-change state of a touch object to a shortcut key or command of a piece of software and stores the correspondence, so the user can define gesture or stylus actions freely and obtain a more user-friendly man-machine interface.
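To make the difference between the first two device embodiments concrete (in one the controller outputs a single precomputed variation value, in the other it forwards the raw coordinate signals for the processor system to analyze), here is a rough sketch. The function names and the stroke data are invented for illustration.

```python
# Hypothetical controller outputs for the two device embodiments above.

def controller_variation(points):
    """First embodiment: output one coordinate variation value."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    return (x1 - x0, y1 - y0)          # net displacement only

def controller_raw(points):
    """Second embodiment: forward every coordinate signal unprocessed."""
    return list(points)                # the processor system derives the change

stroke = [(10, 90), (30, 70), (60, 40), (90, 10)]
print(controller_variation(stroke))    # (80, -80): toward the upper right
print(controller_raw(stroke))          # full trace for the processor system
```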

The above and other objects, features, and advantages of the present invention will become more apparent from the following detailed description of preferred embodiments taken in conjunction with the accompanying drawings.

DETAILED DESCRIPTION

The foregoing and other technical contents, features, and effects of the present invention are presented clearly in the following detailed description of embodiments with reference to the drawings. Directional terms used in the embodiments, such as up, down, left, right, front, or back, refer only to directions in the accompanying drawings; they are used for explanation and are not intended to limit the invention.

FIG. 2 is a schematic diagram showing a touch device 10 having a man-machine interface learning module according to an embodiment of the invention. As shown in FIG. 2, the touch device 10 includes a touch panel 12, a processor system 14, a touch panel controller 16, a transmission interface 18, and a man-machine interface learning module 22. When, for example, a user's finger touches the touch panel 12, the touch panel controller 16 generates coordinate signals QS for the single-point or multi-point touch positions and transmits them to the processor system 14 through a transmission interface such as USB; the processor system 14 then determines the variation of the coordinate signals and recognizes the corresponding gesture change. Alternatively, in another embodiment, the touch panel controller 16 can itself determine the variation of the coordinate signals QS, recognize the corresponding gesture change, and transmit the recognized gesture to the processor system 14. Of course, the touch action of a touch object is not limited to gestures of the user's finger; the movement of a stylus, for example, also qualifies.

The processor system 14 hosts a resident man-machine interface learning module 22, which can be implemented, for example, as a man-machine interface learning program. The module maps the position-change state of a touch object to a shortcut key or command of a piece of software and stores the correspondence, so the user can define gesture or stylus actions freely and obtain a user-friendly man-machine interface.

The operation of the man-machine interface learning module 22 is illustrated by the following examples.

(Example 1)

As shown in FIG. 3, when the man-machine interface learning program is launched, a menu is displayed from which various commands can be selected. Suppose the Ctrl++ command is selected first (in Adobe software this command represents the zoom-in action). The user then moves the stylus 26 from the center toward the upper-right corner, as shown in FIG. 4; the touch panel controller 16 reads the corresponding coordinate changes, and the coordinate changes are mapped to the Ctrl++ command. The learning program thus learns that moving the stylus 26 from the center toward the upper-right corner means "zoom in". The next time the user moves the stylus 26 from the center toward the upper-right corner, the objects displayed on the screen are enlarged. In other words, the user can first select the Ctrl++ command and then make any motion on the touch panel 12; the learning program recognizes that motion and links it to the action of the Ctrl++ command.

(Example 2)

As shown in FIG. 3, after the learning program is launched, suppose the Ctrl-- command is selected (in Adobe software this command represents the zoom-out action). The user then pinches two fingers 24 together from the outside inward on the touch panel, as shown in FIG. 5; the touch panel controller 16 reads the corresponding coordinate changes, and the coordinate changes are mapped to the Ctrl-- command. The next time the user pinches two fingers 24 together from the outside inward, the zoom-out action is produced.

(Example 3)

As shown in FIG. 3, after the learning program is launched, suppose the ↓ command is selected (in Adobe software this command represents scrolling down). The user moves two fingers 24 from top to bottom, as shown in FIG. 6; the touch panel controller 16 reads the corresponding gesture change, and the gesture change is mapped to the ↓ command. The next time the user moves two fingers 24 from top to bottom, the scroll-down action is produced.

(Example 4)

As shown in FIG. 3, after the learning program is launched, suppose the ↑ command is selected (in Adobe software this command represents scrolling up). The user moves two fingers 24 from bottom to top, as shown in FIG. 7; the touch panel controller 16 reads the corresponding gesture change, and the gesture change is mapped to the ↑ command. The next time the user moves two fingers 24 from bottom to top, the scroll-up action is produced.

Therefore, through the design of the man-machine interface learning module 22, the user can map different gestures or stylus movement states to different shortcut keys or commands of a piece of software, achieving a personalized man-machine interface touch experience.
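As a concrete rendering of Example 1's learn-then-use cycle, consider the sketch below. It is an assumed implementation: the trace coordinates, the direction heuristic, and the binding table are invented, and only the Ctrl++ zoom-in binding comes from the patent's own example.

```python
# Sketch of Example 1: teach the device that a center-to-upper-right stroke
# means Ctrl++ (zoom in). All implementation details are assumptions.

def direction_of(trace):
    """Net direction of a stroke, from its first and last coordinates."""
    (x0, y0), (x1, y1) = trace[0], trace[-1]
    dx, dy = x1 - x0, y1 - y0
    horizontal = "right" if dx > 0 else "left"
    vertical = "up" if dy < 0 else "down"   # screen y grows downward
    return f"{vertical}-{horizontal}"

bindings = {}

# Learning phase: the user picks Ctrl++ in the menu, then draws the stroke.
selected_command = "Ctrl++"
training_trace = [(160, 120), (200, 90), (240, 60), (280, 30)]
bindings[direction_of(training_trace)] = selected_command

# Use phase: a similar stroke now triggers the learned command.
later_trace = [(150, 130), (220, 70), (300, 20)]
assert bindings[direction_of(later_trace)] == "Ctrl++"  # zoom in fires
```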

FIG. 8 is a flowchart illustrating a man-machine interface processing method that uses the man-machine interface learning module 22 according to an embodiment of the invention. In this embodiment, the touch panel controller 16 transmits the coordinate signals QS to the processor system 14 through the transmission interface 18, such as USB, and the gesture change is then determined on the processor system 14. The method includes the following steps:

Step 10: Start.

Step 12: Launch the man-machine interface learning program.

Step 14: Select the shortcut key or command to be defined.

Step 16: The user makes the corresponding gesture on the touch panel.

Step 18: The touch panel controller reads the corresponding coordinate changes.

Step 20: Map the coordinate changes to the selected shortcut key or command.

Step 22: End.
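Steps 10 through 22 compose naturally into a short host-side learning routine. The sketch below shows one possible arrangement, with stand-in callables for the menu and the touch panel controller; none of these names come from the patent.

```python
# Hypothetical composition of steps S10-S22. The callables stand in for the
# menu UI and the touch panel controller; they are not a real API.

def learn_gesture(select_command, read_coordinates, classify, bindings):
    command = select_command()            # S14: pick the shortcut key/command
    trace = read_coordinates()            # S16/S18: gesture -> coordinate changes
    bindings[classify(trace)] = command   # S20: map the change to the command
    return bindings                       # S22: done

demo = learn_gesture(
    select_command=lambda: "Ctrl--",      # zoom out, as in Example 2
    read_coordinates=lambda: [(300, 60), (200, 90), (100, 120)],
    classify=lambda t: "left" if t[-1][0] < t[0][0] else "right",
    bindings={},
)
print(demo)                               # {'left': 'Ctrl--'}
```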

FIG. 9 is a flowchart illustrating a man-machine interface processing method that uses the man-machine interface learning module 22 according to another embodiment of the invention. In this embodiment, the touch panel controller 16 directly interprets and edits the gesture and then transmits the gesture to the processor system 14 through the transmission interface 18. The method includes the following steps:

Step 30: Start.

Step 32: Launch the man-machine interface learning program.

Step 34: Select the shortcut key or command to be defined.

Step 36: The user makes the corresponding gesture on the touch panel.

Step 38: The touch panel controller reads the corresponding gesture change.

Step 40: Map the gesture change to the selected shortcut key or command.

Step 42: End.

Although the invention has been disclosed above by way of preferred embodiments, these embodiments are not intended to limit the invention. Anyone skilled in the art may make minor changes and refinements without departing from the spirit and scope of the invention; the scope of protection of the invention is therefore defined by the appended claims. In addition, no single embodiment or claim of the invention is required to achieve all of the objects, advantages, or features disclosed herein. Furthermore, the abstract and the title are provided only to assist patent document searching and are not intended to limit the scope of the invention.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic diagram of a conventional touch device.

FIG. 2 is a schematic diagram showing a touch device having a man-machine interface learning module according to an embodiment of the invention.

FIGS. 3 to 7 are schematic diagrams showing operation examples of the man-machine interface learning module of the invention.

FIG. 8 is a flowchart illustrating a man-machine interface processing method that uses the man-machine interface learning module according to an embodiment of the invention.

FIG. 9 is a flowchart illustrating a man-machine interface processing method that uses the man-machine interface learning module according to another embodiment of the invention.

DESCRIPTION OF REFERENCE NUMERALS

100 touch device
102 touch panel
104 processor system
106 touch panel controller
QS coordinate signal
S10-S22, S30-S42 method steps
10 touch device
12 touch panel
14 processor system
16 touch panel controller
18 transmission interface
22 man-machine interface learning module
24 finger
26 stylus

Claims (13)

1. A touch device, comprising: a touch panel for a touch object to produce a touch action on; a touch panel controller that generates coordinate signals according to the touch action and outputs a coordinate-signal variation value; a processor system that determines a position change of the touch object from the variation value; and a man-machine interface learning module that maps the position-change state of the touch object to a shortcut key or command of a piece of software and stores the correspondence.

2. The touch device of claim 1, wherein the touch object is a stylus.

3. The touch device of claim 1, wherein the touch object is a user's finger.

4. The touch device of claim 1, wherein the touch panel controller transmits the coordinate variation value to the processor system through a USB transmission interface.

5. A touch device, comprising: a touch panel for a touch object to produce a touch action on; a touch panel controller that generates a plurality of coordinate signals according to the touch action; a processor system that receives the coordinate signals and determines a position change of the touch object from the variation of the coordinate signals; and a man-machine interface learning module that maps the position-change state of the touch object to a shortcut key or command of a piece of software and stores the correspondence.

6. The touch device of claim 5, wherein the touch object is a stylus.

7. The touch device of claim 5, wherein the touch object is a user's finger.

8. The touch device of claim 5, wherein the touch panel controller transmits the position-change state to the processor system through a USB transmission interface.

9. A man-machine interface processing method for a touch device, comprising the following steps: launching a man-machine interface learning program; selecting a shortcut key or command to be defined; producing a touch action on the touch device with a touch object; reading the coordinate changes corresponding to the touch object to determine a movement state of the touch object; and mapping the movement state of the touch object to the selected shortcut key or command.

10. The man-machine interface processing method of claim 9, wherein the movement state of the touch object is determined by a touch panel controller.

11. The man-machine interface processing method of claim 9, wherein the movement state of the touch object is determined by a processor system.

12. The man-machine interface processing method of claim 9, wherein the touch object is a stylus.

13. The man-machine interface processing method of claim 9, wherein the touch object is a user's finger.
TW99111355A 2009-04-24 2010-04-13 Touch-sensing device and procession method for man-machine interface TW201039210A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW99111355A TW201039210A (en) 2009-04-24 2010-04-13 Touch-sensing device and procession method for man-machine interface

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW98113577 2009-04-24
TW99111355A TW201039210A (en) 2009-04-24 2010-04-13 Touch-sensing device and procession method for man-machine interface

Publications (1)

Publication Number Publication Date
TW201039210A true TW201039210A (en) 2010-11-01

Family

ID=44995379

Family Applications (1)

Application Number Title Priority Date Filing Date
TW99111355A TW201039210A (en) 2009-04-24 2010-04-13 Touch-sensing device and procession method for man-machine interface

Country Status (1)

Country Link
TW (1) TW201039210A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI559106B (en) * 2014-09-19 2016-11-21 Hakko Electronics Co Ltd Programmable controller system, programmable display


Similar Documents

Publication Publication Date Title
KR101361214B1 (en) Interface Apparatus and Method for setting scope of control area of touch screen
TWI552040B (en) Multi-region touchpad
EP2972669B1 (en) Depth-based user interface gesture control
JP5427279B2 (en) Touch event model
CN103105963B (en) Touch device and control method thereof
US8842084B2 (en) Gesture-based object manipulation methods and devices
US9841827B2 (en) Command of a device by gesture emulation of touch gestures
US9335899B2 (en) Method and apparatus for executing function executing command through gesture input
US20110227947A1 (en) Multi-Touch User Interface Interaction
JP5951886B2 (en) Electronic device and input method
JP2014241139A (en) Virtual touchpad
JP2011028524A (en) Information processing apparatus, program and pointing method
CN103218044B (en) A kind of touching device of physically based deformation feedback and processing method of touch thereof
WO2018019050A1 (en) Gesture control and interaction method and device based on touch-sensitive surface and display
JP6248462B2 (en) Information processing apparatus and program
TW201005598A (en) Touch-type mobile computing device and display method thereof
US20140068524A1 (en) Input control device, input control method and input control program in a touch sensing display
CN103809903B (en) Method and apparatus for controlling virtual screen
TW201218036A (en) Method for combining at least two touch signals in a computer system
JP5275429B2 (en) Information processing apparatus, program, and pointing method
JP2011081447A (en) Information processing method and information processor
KR101442438B1 (en) Single touch process to achieve dual touch experience field
TWI497357B (en) Multi-touch pad control method
KR20140067861A (en) Method and apparatus for sliding objects across touch-screen display
TW201039210A (en) Touch-sensing device and procession method for man-machine interface