
CN101699387B - Non-touch interactive system and method - Google Patents


Info

Publication number
CN101699387B
CN101699387B (application CN200910163965.1A)
Authority
CN
China
Prior art keywords
display
control circuit
contact
sensing region
image
Prior art date
Legal status
Expired - Fee Related
Application number
CN200910163965.1A
Other languages
Chinese (zh)
Other versions
CN101699387A (en)
Inventor
张锐
Y·吴
T·A·普罗赫尔
Current Assignee
Honeywell International Inc
Original Assignee
Honeywell International Inc
Priority date
Filing date
Publication date
Application filed by Honeywell International Inc filed Critical Honeywell International Inc
Publication of CN101699387A
Application granted
Publication of CN101699387B


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04805Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A non-contact interaction system and method are provided. A contactless display system enables a user to interact with a displayed image by moving a finger or pointer toward a selected portion of the image. The image can be dynamically magnified or panned in response to the detected movement. Operation can be switched manually between contact and non-contact modes for added flexibility.

Description

Non-touch interactive system and method

Technical Field

The present invention relates to a system and method for navigating small-screen displays. More specifically, it relates to such a system and method having non-contact sensors for tracking the trajectory of a user's finger as it points at a virtual keyboard.

Background

Various types of small-screen displays appear on products such as cell phones, personal digital assistants (PDAs), mobile computers, and imagers. Increasingly, users must navigate small-screen displays, for example to browse the web on a phone or to view photos on a PDA.

Small touch screens are commonly used to support interaction with programs running on portable devices such as PDAs and mobile phones. They are also finding their way into household products, such as Honeywell's TH8321U1006 thermostat, Honeywell's 6271V security panel, and various personal health monitoring devices, and they have been used for years in package delivery, retail warehousing, and refinery field operations.

Known systems navigate using zoom and pan controls or a fisheye viewer. In some situations, however, these controls are inconvenient and inefficient. For example, fisheye navigation is difficult to combine with a touch screen, and zooming and panning graphics with a mouse or touch screen is awkward. Stylus-based navigation on a small screen can be even more difficult for users.

There is therefore a continuing need for natural, easier-to-use alternatives for controlling the navigation of both large-scale and small-scale graphics.

It would also be desirable to apply such a method to navigating large-scale graphics such as maps or building floor plans.

Summary of the Invention

A display system includes: a multi-dimensional display device having a surface on which an image can be presented; a plurality of non-contact sensors positioned near that surface; and a control circuit coupled to the sensors and the display device. In response to signals from the sensors, the circuit determines the trajectory of a pointing member moving toward a region of the surface.

Brief Description of the Drawings

FIG. 1 is an illustration of a non-touch interactive system according to the present invention;

FIG. 2 is a block diagram of some of the software elements of the system of FIG. 1;

FIG. 3 is a flowchart illustrating a method of interaction;

FIGS. 4A and 4B show two different applications of the system of FIG. 1;

FIG. 5 is a non-touch-mode input screen, including building name, street number and address, city, zip code, first alarm, floor number, and detectors;

FIG. 6 is an intermediate non-touch-mode screen, including building name, street number and address, city, zip code, first alarm, floor number, and detectors;

FIG. 7 is a non-touch-mode area display screen with an exit button, including building name, street number and address, city, zip code, first alarm, floor number, and detectors;

FIGS. 8A, 8B, and 8C illustrate various aspects of the non-contact sensors of the system of FIG. 1;

FIGS. 9A, 9B, and 9C show one form of non-touch navigation of a virtual keyboard; in FIG. 9A, when a finger comes close enough to the touch screen, one block of the visible keyboard is selected and magnified according to the finger's position; and

FIGS. 10A, 10B, and 10C show another form of non-touch navigation of a virtual keyboard; in FIG. 10A, the virtual keyboard is magnified continuously according to the finger's position.

Detailed Description

While embodiments of this invention can take many different forms, specific embodiments are shown in the drawings and described in detail herein with the understanding that this disclosure is to be considered an exemplification of the principles of the invention, as well as certain best modes of practicing it, and is not intended to limit the invention to the specific embodiments illustrated.

Embodiments of the invention include a touchless or contactless interface that senses the position of a user's finger or hand in three dimensions. In the disclosed embodiments, a plurality of capacitive sensors can be arranged at the edges of a display device. The trajectory of a finger or hand pointing at a point on a virtual keyboard displayed on the device can be tracked. This enables the associated system to predict the point on the display screen that the user intends to select before the finger actually touches the screen.
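The prediction described above can be illustrated with a short sketch: given the most recent 3D finger samples reported by the sensors, a linear extrapolation of the trajectory to the screen plane (z = 0) estimates the intended touch point before contact. The function name, the sample format, and the two-sample linear extrapolation are illustrative assumptions, not details given in the patent.

```python
from typing import List, Tuple

Point3D = Tuple[float, float, float]  # (x, y, z); z is distance from the screen


def predict_touch_point(samples: List[Point3D]) -> Tuple[float, float]:
    """Extrapolate the finger trajectory to z == 0 (the screen surface)
    from the two most recent position samples."""
    (x0, y0, z0), (x1, y1, z1) = samples[-2], samples[-1]
    dz = z0 - z1  # how much the finger descended between samples
    if dz <= 0:   # finger is not approaching; fall back to its current x/y
        return (x1, y1)
    t = z1 / dz   # remaining steps, at this rate, until the screen is reached
    return (x1 + (x1 - x0) * t, y1 + (y1 - y0) * t)
```

For example, a finger seen at (0, 0) four units out and then at (1, 1) two units out is predicted to land at (2, 2).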

According to the invention, Z-axis finger position data can be used to control the zoom ratio or zoom range on, for example, a map display. Alternatively, a fisheye on the map display can be controlled using this non-touch pointing method. Several fisheye parameters, such as zoom ratio, zoom range, zoom shape (rectangle, rounded rectangle, ellipse, and so on), and the proportion of the distorted border around the fisheye, can be modified in the process.
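One plausible way to realize the z-axis-to-zoom mapping is a linear interpolation between a minimum zoom at the outer boundary of the sensing region and a maximum zoom at the inner boundary. The thresholds and zoom limits below are hypothetical values chosen only for illustration.

```python
def zoom_ratio(z: float, z_outer: float, z_inner: float,
               min_zoom: float = 1.0, max_zoom: float = 4.0) -> float:
    """Map finger height z to a zoom factor: no magnification outside the
    outer boundary, maximum magnification at the inner boundary, and a
    linear ramp in between."""
    if z >= z_outer:
        return min_zoom
    if z <= z_inner:
        return max_zoom
    frac = (z_outer - z) / (z_outer - z_inner)  # 0 at outer edge, 1 at inner
    return min_zoom + frac * (max_zoom - min_zoom)
```

A finger halfway between the boundaries thus yields a zoom factor halfway between the limits.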

As the user moves his or her finger, the graphic content of the display is updated accordingly. Finger movement can likewise control zoom/pan operations or a fisheye effect on a map display. These processes are efficient and intuitive for the user.

The same method can also be used to control and interact with a virtual keyboard on a small-screen display. It overcomes a long-standing problem with small virtual keyboards: the keys are always much smaller than a human fingertip.

Precise interaction requires magnifying only the target area of the keyboard (for example, a small subset of the keys). The non-touch interface uses z-axis data on the hand's position to infer the intended target area of the keyboard and automatically scales or magnifies that area of the virtual keyboard.
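A minimal sketch of this target-area inference, assuming a virtual keyboard laid out as a uniform grid: the predicted (x, y) landing point selects the key under the finger, and a small block of keys around it is chosen for magnification. The grid layout, key size, and 3x3 default block are assumptions for illustration, not specified by the patent.

```python
def target_key_block(x: float, y: float, key_size: float, block: int = 3):
    """Return the inclusive (row, col) index ranges of the block-by-block
    group of keys to magnify, centred on the key under the predicted
    finger landing point."""
    row, col = int(y // key_size), int(x // key_size)  # key under the finger
    half = block // 2
    return (max(0, row - half), row + half), (max(0, col - half), col + half)
```

For a 10-unit key pitch, a predicted landing point of (25, 35) selects rows 2-4 and columns 1-3 for magnification.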

In some applications, input signals from the non-touch device could disturb interactions that are not intended to be touchless. In one aspect of the invention, several different methods can be used to stop and start non-touch interaction intuitively and quickly. In one embodiment, the user's right hand points and controls the zoom or fisheye control while the left hand operates a button that starts or stops non-touch navigation. In this scheme, the left hand can also change fisheye or zoom parameters rapidly on the fly while the right hand points and drags, providing a very effective two-handed interaction.

FIG. 1 shows a touchless or contactless interaction system 100. The system 100 includes a programmable processing unit 151 connected via a data bus 152 to a touch-screen input buffer 153, a non-touch input buffer 154, a display buffer 155, an input/output buffer 156, and a storage unit or register 157. A touch screen 158 is coupled to the processor through the touch-screen buffer 153. A non-touch sensing arrangement 159, for example a plurality of capacitance-based non-contact sensors, is coupled to the processor through the non-touch input buffer 154.

A graphics display 160 is coupled to the processor through the display buffer 155. The display device 160 includes a display screen on which various images are presented. The non-touch sensors 159 are located around the periphery of the display screen of the device 160, as discussed in more detail below.

An input/output device 161 is coupled to the processor through the input/output buffer 156. The input/output device can include any combination of devices that allow the system to exchange information with the outside world.

The storage unit 157 holds the information and/or programs or executable software necessary for the processor 151 to implement the touchless interaction system. For example, display control software 157a can be stored in unit 157 in computer-readable form. Other system control software can likewise be stored in unit 157.

FIG. 2 shows various software modules 200 of the system 100 that are executed by the processor 151. The modules 200 are stored in unit 157 in magnetic or optical computer-readable form. The software 200 includes a command execution module 202, a command recognizer module 204, data receivers 206, a graphics system display module 208, and a domain model 210 that provides information about the displayed region. The operation of the various modules is discussed with respect to the process 250 of FIG. 3.

As shown in the flowchart of FIG. 3, at 252 data from sensors such as sensors 158 and 159 is loaded from buffers 153 and 154 into respective receivers such as 206a and 206b. At 254, that data is loaded into an input buffer 204b.

At 256, a gesture analyzer 204a analyzes the data. At 258, the gesture analyzer 204a sends a system command to the command execution module 202. At 260, the command execution changes the state of the domain objects of the model 210.

At 262, the command execution module 202 notifies the graphics system module 208 to change the state of the visual image on the display 160. At 264, the graphics system module 208 updates the image on the display unit 160. At 266, the command execution module 202 then updates the system state.
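The module chain of FIGS. 2 and 3 can be sketched as three cooperating objects: a gesture analyzer that turns sensor samples into a command (steps 256-258), a command executor that applies it and updates system state (260, 266), and a graphics system that redraws the display (262-264). The single-rule gesture analysis and the string-valued commands below are toy stand-ins for logic the patent does not spell out.

```python
class GraphicsSystem:
    """Stand-in for the graphics system display module 208."""
    def __init__(self):
        self.state = "initial"

    def update(self, new_state):
        self.state = new_state  # steps 262/264: change and redraw the image


class CommandExecutor:
    """Stand-in for the command execution module 202."""
    def __init__(self, graphics: GraphicsSystem):
        self.graphics = graphics
        self.system_state = "idle"

    def execute(self, command):
        self.graphics.update(command)   # notify graphics (262)
        self.system_state = command     # update system state (266)


class GestureAnalyzer:
    """Stand-in for the gesture analyzer 204a."""
    def __init__(self, executor: CommandExecutor):
        self.executor = executor

    def analyze(self, samples):
        # Toy rule: a decreasing z across the samples means the finger is
        # approaching the screen, so issue a zoom-in command.
        command = "zoom_in" if samples[-1][2] < samples[0][2] else "idle"
        self.executor.execute(command)
```

Wiring the three objects together and feeding in an approaching-finger sample run propagates a "zoom_in" command through to the graphics state.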

FIG. 4A shows one embodiment of the invention, a non-contact, navigable area display, which can be used, for example, to assess an alarm condition in a building. In a second embodiment, FIG. 4B shows non-contact navigation of a small display according to the invention.

FIG. 5 shows a display of the type or application of the embodiment of FIG. 4A in an initial display state. In step 1, as shown, the user can tap a button to enter the non-touch navigation state. FIG. 6 is a confirmation screen presented to the user on the display unit 160. The user can enter the non-touch navigation state as shown in step 3. FIG. 7 shows the screen presented on the display unit 160 while in the non-touch navigation state. A button is provided to exit the non-touch state.

FIGS. 8A, 8B, and 8C show various aspects of the non-contact sensors 159 arranged around the periphery of the screen 160a of the display unit 160. As illustrated, the sensors 159 define an outer frusto-conical sensing region 160b and an inner region 160c.

When the user's finger or pointing device is in the outer region 160b, the area display or map can be navigated or scrolled as well as zoomed. As the user's finger approaches the screen through regions 160b and 160c, for example, the presented image zooms from one level to a more detailed level. When the user's finger enters the inner region 160c, in one embodiment, the user can only zoom the map or display in and out. The region 160c helps the end user zoom the map or display smoothly, without jitter or jumping.
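The two-region behavior can be sketched as a classifier over finger position: the outer region 160b is modeled as a frustum whose radius shrinks linearly toward the screen, and crossing into the inner region 160c switches the permitted operations from scroll-plus-zoom to zoom only. All geometry parameters below are hypothetical, and a real implementation would derive the position from the capacitive sensor signals rather than receive it directly.

```python
import math


def classify_region(x: float, y: float, z: float,
                    z_outer: float, z_inner: float,
                    r_base: float, r_top: float) -> str:
    """Classify a finger position relative to a frusto-conical sensing
    volume centred on the screen origin.

    Behavior per region, per the described embodiment:
      "outer"   -> scrolling and zooming are both allowed
      "inner"   -> zooming only (for smooth, jitter-free zoom)
      "outside" -> input is ignored
    """
    if z > z_outer or z < 0:
        return "outside"
    # Frustum radius shrinks linearly from r_base at z_outer to r_top at z=0.
    radius = r_top + (r_base - r_top) * (z / z_outer)
    if math.hypot(x, y) > radius:
        return "outside"
    return "inner" if z <= z_inner else "outer"
```

A finger descending along the axis thus passes from "outer" (scroll and zoom) into "inner" (zoom only) as it nears the screen.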

FIGS. 9A, 9B, and 9C show the steps of one solution for navigating a virtual keyboard using the system 100. A region can be selected and enlarged by the user activating a key. A second region can then be selected and enlarged by the user activating a different key, and so on, until the desired entry is complete.

FIGS. 10A, 10B, and 10C show the steps of another solution for navigating a virtual keyboard using the system 100. In the embodiment of FIGS. 10A-10C, whatever portion of the keyboard the user's finger moves over can be enlarged or magnified, allowing the user to activate successive groups of keys seamlessly. It should be understood that embodiments of the invention can be incorporated into electronic devices such as wireless telephones, mobile computers, or imaging devices, all of which may have relatively small keyboards.

From the foregoing, it should be noted that numerous variations and modifications can be made without departing from the spirit and scope of the invention. It is to be understood that no limitation with respect to the specific apparatus described herein is intended or should be inferred. It is, of course, intended that the appended claims cover all such modifications as fall within their scope.

Claims (9)

1. A display system, comprising: a multi-dimensional display device having a surface on which an image can be presented; a plurality of non-contact sensors located near the surface; and a control circuit coupled to the sensors and the display device, the circuit establishing a first multi-dimensional sensing region near the surface of the display device and a second multi-dimensional sensing region within the first multi-dimensional sensing region, the first multi-dimensional sensing region having a predetermined shape and the second multi-dimensional sensing region having a shape different from that of the first multi-dimensional sensing region, and the circuit, in response to signals from the sensors, determining the trajectory of a pointing member moving toward a region of the surface and, in response thereto, changing the image on the surface; wherein the image can be scrolled as well as zoomed when the pointing member is in the first region, and can only be zoomed when it is in the second region.

2. The system of claim 1, including a display control circuit coupled to the control circuit, the display control circuit dynamically adjusting a magnification parameter of the image presented on the surface in response to the determined trajectory.

3. The system of claim 1, wherein the sensors comprise capacitive sensors.

4. The system of claim 1, wherein the sensors are configured to define a frusto-conical region within which the pointing member can be sensed.

5. The system of claim 1, including a manually operable element coupled to the control circuit for switching between a non-contact mode and a contact-type mode of interacting with the image on the surface.

6. The system of claim 1, wherein the control circuit is responsive to a selected one of a contact-type mode or a non-contact mode of interaction with the image on the surface.

7. The system of claim 1, including a plurality of contact-type sensors associated with a screen of the display device.

8. The system of claim 1, including display management software stored on a computer-readable medium coupled to the control circuit.

9. The system of claim 8, wherein the display management software, when executed by the control circuit, presents dynamically changing images on the surface of the display unit in response to the determined trajectory of the pointing member.
CN200910163965.1A 2008-07-01 2009-06-30 Non-touch interactive system and method Expired - Fee Related CN101699387B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/166,022 US8443302B2 (en) 2008-07-01 2008-07-01 Systems and methods of touchless interaction
US12/166,022 2008-07-01

Publications (2)

Publication Number Publication Date
CN101699387A CN101699387A (en) 2010-04-28
CN101699387B true CN101699387B (en) 2014-06-25

Family

ID=41172383

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910163965.1A Expired - Fee Related CN101699387B (en) 2008-07-01 2009-06-30 Non-touch interactive system and method

Country Status (3)

Country Link
US (1) US8443302B2 (en)
EP (1) EP2144147A3 (en)
CN (1) CN101699387B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10838504B2 (en) 2016-06-08 2020-11-17 Stephen H. Lewis Glass mouse

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100202656A1 (en) * 2009-02-09 2010-08-12 Bhiksha Raj Ramakrishnan Ultrasonic Doppler System and Method for Gesture Recognition
US10180746B1 (en) 2009-02-26 2019-01-15 Amazon Technologies, Inc. Hardware enabled interpolating sensor and display
US9740341B1 (en) 2009-02-26 2017-08-22 Amazon Technologies, Inc. Capacitive sensing with interpolating force-sensitive resistor array
DE102009032069A1 (en) * 2009-07-07 2011-01-13 Volkswagen Aktiengesellschaft Method and device for providing a user interface in a vehicle
US9785272B1 (en) 2009-07-31 2017-10-10 Amazon Technologies, Inc. Touch distinction
US9740340B1 (en) 2009-07-31 2017-08-22 Amazon Technologies, Inc. Visually consistent arrays including conductive mesh
CA2774867A1 (en) * 2009-09-21 2011-03-24 Extreme Reality Ltd. Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
US8810524B1 (en) 2009-11-20 2014-08-19 Amazon Technologies, Inc. Two-sided touch sensor
US20120095575A1 (en) * 2010-10-14 2012-04-19 Cedes Safety & Automation Ag Time of flight (tof) human machine interface (hmi)
US8421752B2 (en) * 2011-01-27 2013-04-16 Research In Motion Limited Portable electronic device and method therefor
EP2678764A4 (en) * 2011-02-22 2017-03-22 Hewlett-Packard Development Company, L.P. Control area for facilitating user input
CN102693063B (en) * 2011-03-23 2015-04-29 联想(北京)有限公司 Operation control method and device and electronic equipment
MX346223B (en) 2011-04-22 2017-03-10 Pepsico Inc Beverage dispensing system with social media capabilities.
US9218704B2 (en) 2011-11-01 2015-12-22 Pepsico, Inc. Dispensing system and user interface
US9594499B2 (en) * 2012-02-21 2017-03-14 Nokia Technologies Oy Method and apparatus for hover-based spatial searches on mobile maps
CN103312861A (en) * 2012-03-06 2013-09-18 联想(北京)有限公司 Control method, control system and equipment containing same
US8654076B2 (en) 2012-03-15 2014-02-18 Nokia Corporation Touch screen hover input handling
US9310895B2 (en) 2012-10-12 2016-04-12 Microsoft Technology Licensing, Llc Touchless input
US20140152566A1 (en) * 2012-12-05 2014-06-05 Brent A. Safer Apparatus and methods for image/sensory processing to control computer operations
US9511988B2 (en) * 2012-12-27 2016-12-06 Lancer Corporation Touch screen for a beverage dispensing system
US9881337B2 (en) 2013-02-22 2018-01-30 Cantor Futures Exchange, L.P. Systems and methods for providing seamless transitions between graphical images on a binary options interface
CN104102440B (en) * 2013-04-08 2018-05-25 华为技术有限公司 In user interface compression, the method and apparatus of decompression file
WO2015022498A1 (en) * 2013-08-15 2015-02-19 Elliptic Laboratories As Touchless user interfaces
EP2933743B1 (en) * 2014-04-18 2019-05-08 Université de Rennes 1 Method of characterizing molecular diffusion within a body from a set of diffusion-weighted magnetic resonance signals and apparatus for carrying out such a method
USD749115S1 (en) * 2015-02-20 2016-02-09 Translate Abroad, Inc. Mobile device with graphical user interface
US11340710B2 (en) 2016-06-08 2022-05-24 Architectronics Inc. Virtual mouse
CN106940608B (en) * 2017-03-07 2020-06-16 Oppo广东移动通信有限公司 A control method of a display screen, a display screen and an electronic device
US10747429B2 (en) 2018-08-01 2020-08-18 International Business Machines Corporation Compensating for user hand tremors when using hand-held electronic devices
US12014120B2 (en) 2019-08-28 2024-06-18 MFTB Holdco, Inc. Automated tools for generating mapping information for buildings
US10825247B1 (en) * 2019-11-12 2020-11-03 Zillow Group, Inc. Presenting integrated building information using three-dimensional building models
US11676344B2 (en) 2019-11-12 2023-06-13 MFTB Holdco, Inc. Presenting building information using building models
US12333655B2 (en) 2019-11-12 2025-06-17 MFTB Holdco, Inc. Presenting building information using video and building models
JP7672721B2 (en) * 2020-05-25 2025-05-08 エヌ・ゼット・テクノロジーズ・インコーポレイテッド Retrofit touchless interface for contact-type input devices
CN117113565B (en) * 2023-08-22 2024-09-13 鞍钢股份有限公司 A design method for non-uniform structure tubular heat exchanger

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2648558B2 (en) 1993-06-29 1997-09-03 インターナショナル・ビジネス・マシーンズ・コーポレイション Information selection device and information selection method
US7466307B2 (en) * 2002-04-11 2008-12-16 Synaptics Incorporated Closed-loop sensor on a solid-state object position detector
US7656393B2 (en) * 2005-03-04 2010-02-02 Apple Inc. Electronic device having display and surrounding touch sensitive bezel for user interface and control
JP2004038407A (en) 2002-07-01 2004-02-05 Arcadia:Kk Character input device and method
US8460103B2 (en) * 2004-06-18 2013-06-11 Igt Gesture controlled casino gaming system
GB2421161B (en) 2003-02-26 2007-08-29 Tomtom Bv Navigation device with touch screen
DE10310794B4 (en) 2003-03-12 2012-10-18 Hewlett-Packard Development Co., L.P. Operating device and communication device
US20040183833A1 (en) * 2003-03-19 2004-09-23 Chua Yong Tong Keyboard error reduction method and apparatus
EP1596271A1 (en) * 2004-05-11 2005-11-16 Hitachi Europe S.r.l. Method for displaying information and information display system
GB0412787D0 (en) 2004-06-09 2004-07-14 Koninkl Philips Electronics Nv Input system
JP2008505379A (en) 2004-06-29 2008-02-21 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Touchdown feedforward in 3D touch interaction
US7728825B2 (en) 2005-03-22 2010-06-01 Microsoft Corporation Targeting in a stylus-based user interface
US20060250376A1 (en) 2005-05-03 2006-11-09 Alps Electric Co., Ltd. Display device
DE102006037154A1 (en) * 2006-03-27 2007-10-18 Volkswagen Ag Navigation device and method for operating a navigation device
US7903094B2 (en) * 2006-06-23 2011-03-08 Wacom Co., Ltd Information processing apparatus, operation input method, and sensing device
US20080055259A1 (en) * 2006-08-31 2008-03-06 Honeywell International, Inc. Method for dynamically adapting button size on touch screens to compensate for hand tremor
US8316324B2 (en) * 2006-09-05 2012-11-20 Navisense Method and apparatus for touchless control of a device
JP2008197934A (en) * 2007-02-14 2008-08-28 Calsonic Kansei Corp Operator determining method
EP2135155B1 (en) * 2007-04-11 2013-09-18 Next Holdings, Inc. Touch screen system with hover and click input methods

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10838504B2 (en) 2016-06-08 2020-11-17 Stephen H. Lewis Glass mouse

Also Published As

Publication number Publication date
US20100005427A1 (en) 2010-01-07
EP2144147A2 (en) 2010-01-13
EP2144147A3 (en) 2013-07-03
US8443302B2 (en) 2013-05-14
CN101699387A (en) 2010-04-28

Similar Documents

Publication Publication Date Title
CN101699387B (en) Non-touch interactive system and method
KR101541928B1 (en) Visual feedback display
KR101384857B1 (en) User interface methods providing continuous zoom functionality
KR101424294B1 (en) A computer implemented method and computer readable medium for performing an operation in response to an input and a gesture received from a user of a touch screen device
KR101361214B1 (en) Interface Apparatus and Method for setting scope of control area of touch screen
EP2657811B1 (en) Touch input processing device, information processing device, and touch input control method
EP2708996A1 (en) Display device, user interface method, and program
US9459704B2 (en) Method and apparatus for providing one-handed user interface in mobile device having touch screen
US20140380209A1 (en) Method for operating portable devices having a touch screen
US20140062875A1 (en) Mobile device with an inertial measurement unit to adjust state of graphical user interface or a natural language processing unit, and including a hover sensing function
WO2009142880A1 (en) Proximity sensor device and method with subregion based swipethrough data entry
CN103838507A (en) Touch-sensing display device and driving method thereof
JP2004192241A (en) User interface device and portable information device
KR20110115683A (en) One-Hand Input Method on Touch Screen
JP2011081447A (en) Information processing method and information processor
KR20100136289A (en) Display Control Method of Mobile Terminal
US9235338B1 (en) Pan and zoom gesture detection in a multiple touch display
KR101920864B1 (en) Method and terminal for displaying of image using touchscreen
KR101403079B1 (en) method for zooming in touchscreen and terminal using the same
EP3433713B1 (en) Selecting first digital input behavior based on presence of a second, concurrent, input
KR101165388B1 (en) Method for controlling screen using different kind of input devices and terminal unit thereof
CN102789358A (en) Image output and display method, device and display equipment
KR101136327B1 (en) A touch and cursor control method for portable terminal and portable terminal using the same
KR20120078816A (en) Providing method of virtual touch pointer and portable device supporting the same
TW201928650A (en) Apparatus and method for zooming screen

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140625