WO2011061395A1 - Method, computer program and device for interacting with a computer - Google Patents
- Publication number
- WO2011061395A1 (PCT/FI2010/050924)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tip
- image
- control data
- computer
- relation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0414—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Numerical Control (AREA)
- Position Input By Displaying (AREA)
Abstract
The invention discloses a device for interacting with a computer and a user. The device comprises a first part comprising a tip for contacting a surface, a second part for contacting the user and a signalling means for interacting with the computer. The device further comprises moving means for moving the second part in relation to the tip while the tip is arranged to maintain contact with the surface. The movement of the second part in relation to the tip corresponds to a change in at least one image parameter, the image being projected on a display, and wherein the movement of the second part in relation to the tip provides the user with a haptic sense.
Description
METHOD, COMPUTER PROGRAM AND DEVICE FOR INTERACTING WITH A COMPUTER
FIELD OF THE INVENTION
The present invention relates to interaction with a computer. In particular, the present invention relates to a solution which improves interaction techniques when images are being projected on a display surface.
DESCRIPTION OF THE RELATED ART
A stylus is an input tool that can be used with various devices, e.g. Personal Digital Assistants (PDAs), personal computers, mobile phones etc., to input information. When a stylus is used, a user operates a touch screen with a stylus rather than a finger. It also improves the precision of the touch input, allowing use of smaller user interface elements. Styluses may be used for entering commands, handwriting or drawing on a touch-sensitive surface.
Nowadays, in advanced multimodal interfaces, besides spatial audio, enhancement of graphic cues is often based on the use of a haptic sense. Many stationary input devices have been developed to augment visual interaction with three-dimensional objects through a complementary haptic sense.
Different solutions have been proposed to modify or adapt the regular shape of a stylus to increase human performance in stylus-based interaction. However, in these solutions, changes of the stylus shape were not intended to influence human perception of pictorial cues.
Various solutions for different kinds of devices in three-dimensional pointing mostly rely on enhanced visual feedback, even when the person can apply different pressure on a stylus tip to change the cursor location along the direction of the applied force, as disclosed e.g. in patent application US2008/0225007. The use of stylus pressure against a stopped display surface usually has a limited travel distance and produces tactile feedback in a limited range. Because the haptic sense cannot provide high accuracy, the person has to rely exclusively on visual feedback. Therefore, pressure-based input requires significant effort to continuously hold a given value through micro-movements of the fingers, even when a special stylus design could provide greater displacements of the stylus tip.
Thus, there is a challenge to develop a new and more efficient solution for enhancing user experience during user interaction with images projected on a display surface.
SUMMARY OF THE INVENTION
One object of the invention is to improve user interaction techniques when images are being projected on a display. Another object of this invention is to produce kinesthetic and tactile signals which are complementary to pictorial three-dimensional cues.
According to one aspect of the invention, there is provided a device for interacting with a computer and a user. The device comprises a first part comprising a tip for contacting a surface, a second part for contacting the user and a signalling means for interacting with the computer. The device further comprises moving means for moving the second part in relation to the tip while the tip is arranged to maintain contact with the surface. The movement of the second part in relation to the tip corresponds to a change in at least one image parameter, the image being projected on a display. The movement of the second part in relation to the tip provides the user with a haptic sense.
In one embodiment, the signalling means are configured to receive control data from the computer, the control data depending on the at least one image parameter at a position of the tip on the image on the display, wherein the moving means are configured to move the second part in relation to the tip based on the control data.
In one embodiment, the at least one image parameter comprises a Z-coordinate value in the image at the tip position, the distance between the tip and the second part simulating movement along the Z-axis in the image.
In one embodiment, the at least one image parameter comprises a visual parameter. In one embodiment, the visual parameter comprises at least one of brightness and colour.
In one embodiment, the at least one image parameter comprises at least one physical property.
In one embodiment, the device comprises measuring means for determining the distance between the tip and the second part, and the signalling means are configured to send the distance information to the computer. In one embodiment, the moving means are configured to move the second part in relation to the tip when the user applies positive or negative pressure with the tip to the surface.
In one embodiment, the device further comprises tactile feedback means for providing tactile feedback to the user as a complementary feedback.
In one embodiment, the device comprises a stylus, wherein the second part comprises a grip part, arranged at least partially around the first part. In one embodiment, the moving means are configured to move the grip part in relation to the tip along a longitudinal axis of the stylus.
According to another aspect of the invention, there is provided a method for interacting with a device and a user. The method comprises determining a position of a tip of a first part of the device on an image on a display, generating control data based on the tip position, and providing the device with the control data. The control data depends on at least one image parameter at the tip position on the image on the display. The control data is configured to control moving means of the device to move a second part of the device in relation to the tip while the tip is arranged to maintain contact with a surface. A change in the distance between the tip and the second part corresponds to a change in the at least one image parameter at the tip position in the image.
In one embodiment, the at least one image parameter comprises a Z-coordinate value of the image at the tip position.
In one embodiment, the at least one image parameter comprises a visual parameter. In one embodiment, the visual parameter comprises at least one of brightness and colour.
In one embodiment, the at least one image parameter comprises at least one physical property.
In one embodiment, the method further comprises providing the device with additional control data comprising data for providing tactile feedback with the device.
According to another aspect of the invention, there is provided a method for interacting with a device and a user. The method comprises determining a position of a tip of a first part of the device on an image on a display. The method further comprises receiving control data from the device, the control data indicating the distance between the tip and a second part of the device, the second part being arranged to move in relation to the tip while the tip is arranged to maintain contact with a surface. The distance information is converted into a change in an image parameter in the image on the display.
In one embodiment, the converting comprises converting the distance information into a displacement of the image or a portion of the image along the Z-axis.
In one embodiment, the converting comprises converting the distance information into a change in a physical property.
In one embodiment, the method further comprises determining a pressure level of the tip on the surface, and sending data to the device to decrease the distance between the tip and the second part when the pressure level exceeds a first limit.
In one embodiment, the method further comprises determining a pressure level of the tip on the surface, and sending data to the device to increase the distance between the tip and the second part when the pressure level falls under a second limit.
In one embodiment, the method further comprises providing the device with additional control data including data for providing tactile feedback with the device.
According to another aspect of the invention, there is provided a computer program comprising program code configured to execute any of the above method characteristics when executed on a processing device. In one embodiment, the computer program is embodied on a computer-readable medium.
Advantages of the invention relate e.g. to enhanced interaction with a device as a user of the device receives cues e.g. about image depth properties via the device.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are included to provide a further understanding of the invention and constitute a part of this specification, illustrate embodiments of the invention and together with the description help to explain the principles of the invention. In the drawings:
Figure 1 discloses a block diagram illustrating a stylus based interaction process in accordance with one embodiment of the invention;
Figure 2 discloses a diagram illustrating an imaging mode in accordance with one embodiment of the invention;
Figure 3 discloses a diagram illustrating a pointing mode in accordance with one embodiment of the invention;
Figure 4 discloses a top view showing displacements of a stylus grip along the Z-axis in accordance with one embodiment of the invention; and
Figure 5 discloses a block diagram illustrating a general structure of a device in accordance with one embodiment of the invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings.
Figure 1 discloses a block diagram illustrating a stylus based interaction process in accordance with one embodiment of the invention.
The arrangement of Figure 1 includes a stylus 2. The stylus 2 comprises a first part comprising a tip 6 and a second part comprising a grip part 4. In this embodiment the grip part 4 is arranged to move in relation to an inner part of the stylus 2 along the longitudinal axis of the stylus 2. The arrangement also comprises a display 8, being e.g. a flat or autostereoscopic display. A detector 12 is used to detect the position of the tip 6. In one embodiment, the detector 12 comprises a separate drawing tablet connected to a computer. Furthermore, the display 8 is connected to the computer or the computer itself includes the display 8. In another embodiment, the detector 12 is a built-in feature of the display 8. This is the case e.g. when the display 8 is a touch-sensitive display. A controller 10 is configured to interact with the computer and to control movement of the grip part 4 in relation to the tip 6. The stylus 2 may also include means for determining the distance between the tip 6 and the grip part 4.
In one embodiment, during an exploration mode, an image is presented on the display 8. According to the coordinates of an image area received from the detector 12 of the tip 6 position, the controller 10 receives data (e.g. from the computer) concerning image depth or virtual elevation of the image area and changes the grip part 4 position with regard to the tip 6, providing kinesthetic feedback in an area 16. In order to change the position of the grip part 4 in relation to the tip 6, the stylus 2 comprises means for moving the grip part in relation to the tip 6 of the stylus 2. In one embodiment, the grip part 4 has been coupled with the first part comprising the tip 6 e.g. via an actuator (e.g. a linear motor etc.). When the actuator is run, the distance between the tip 6, which usually coincides with the display 8 surface, and the grip part 4, changes. This provides a user 14 with an enhanced haptic sense that is complementary to the pictorial cues (visual feedback 20).
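The exploration-mode behaviour described above can be summarised as a simple control loop: read the tip coordinates, look up the depth at that point, and drive the actuator so that the grip-to-tip distance tracks the depth. The following sketch is purely illustrative; the names (`Detector`, `DepthMap`, `Actuator`, the travel range) are assumptions for clarity and are not part of the disclosed embodiment.

```python
# Illustrative sketch of the exploration-mode loop described for Figure 1.
# All names (detector, depth_map, actuator, GRIP_TRAVEL_MM) are hypothetical.

GRIP_TRAVEL_MM = 10.0   # assumed maximum grip displacement range

def exploration_step(detector, depth_map, actuator):
    """One iteration: map image depth at the tip to a grip offset."""
    x, y = detector.tip_position()       # tip coordinates on the display
    z = depth_map.depth_at(x, y)         # normalised depth in [0.0, 1.0]
    # Deeper (further away) areas pull the grip closer to the tip,
    # simulating movement along the Z-axis of the image.
    offset = (1.0 - z) * GRIP_TRAVEL_MM
    actuator.move_grip_to(offset)        # kinesthetic feedback to the user
```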
As shown in Figure 1, the user 14 holds the stylus 2 by the grip part 4. During interaction with the display 8, the detector 12 detects the tip 6 position using any appropriate technology. In one embodiment, the coordinates of a pointed area on the display 8 are coupled with other pictorial cues (e.g. local brightness, colour, relative size, etc.) which provide perception of the sense of distance and relative proximity of this area with regard to other objects of the visual scene (i.e. the image on the display). During interaction with the display 8, the user shares attention between pictorial cues of the particular field of interest and other objects of the visual scene.
Usually, when the user moves the stylus over a flat display surface, the kinesthetic signals communicate that the distance between the grip part 4 and the stylus tip 6 is constant. However, as disclosed above, in the embodiment of Figure 1 the distance between the tip 6 and the grip part 4 is changed, thus simulating relative distances along the Z-axis in the image.
In one embodiment, the relative distances along the Z-axis are determined from pictorial cues (such as, for example, brightness, colour, relative size, etc.).
Figure 2 shows a diagram in which brightness is used as a parameter for determining image depth from an image. The numbering used follows the numbering of Figure 1.
As brightness increases, the distance between the grip part 4 and the tip 6 can change (increase or decrease). In other words, e.g. darker areas in the image are considered to be inside (concave) or further away along the Z-axis than other illuminated (convex) brighter areas, and vice versa. In another embodiment, the image itself may include parameter data indicating Z-axis information, and this can be used to determine the movement of the grip part 4 in relation to the tip 6.
Moreover, an opposite end 18 of the stylus 2 may produce complementary feedback coordinated with direction of the grip part 4 displacements e.g. via friction, vibration and/or skin stretch.
The embodiment disclosed in Figure 1 is useful e.g. for development and exploration of three-dimensional maps, images simulated for surgeon training purposes or playing three-dimensional games. Furthermore, the invention at hand eliminates discrepancies and ambiguities occurring during observation and pen/stylus-based interaction with images containing three-dimensional cues and projected on a flat display surface.
In another embodiment of Figure 1, instead of using depth information as parameter data from an image, physical property information can also be used as an image parameter. For example, some physical property information (e.g. density, viscosity, etc.) can be derived from the image, and the distance between the grip part 4 and the tip 6 is changed in response to changes in the physical parameter.
Figure 1 disclosed an embodiment of the invention wherein the invention is implemented as a stylus. A person skilled in the art understands that the implementation is not limited only to a stylus type of solution. The invention may be implemented with any appropriate device comprising a first part comprising a tip for contacting a surface and a second part for contacting the user. The device also comprises moving means for moving the second part in relation to the tip. The movement of the second part in relation to the tip corresponds to a change in at least one image parameter projected on a display. Thus, the invention may take various forms other than the stylus-based solution discussed above.
Figure 3 discloses a diagram illustrating a pointing mode in accordance with one embodiment of the invention. The numbering used below follows the numbering used in Figure 1.
During the pointing mode, the controller 10 receives data about the coordinates of the image area pointed at by the tip 6 from the detector 12 of the stylus tip 6 position.
In this embodiment, the detector 12 also detects the pressure applied to the display 8 surface. In another embodiment, the display 8 is a separate device from a drawing tablet and the detector 12 detects the stylus tip 6 position on the drawing tablet.
Pressure P1, as shown in Figure 3, is predefined to hold the proximity of the grip 4 in a local area continuously and steadily at a non-zero predefined pressure. There are three levels of pressure: P0, P1 and P2.
When the pressure of the tip 6 on the display 8 surface becomes higher than P2, the controller 10 decreases the proximity of the grip 4 to the tip 6 (i.e. the distance between the grip 4 and the tip 6 decreases) by aligning the pressure to P1. When the pressure of the stylus tip 6 becomes less than P0, the controller 10 increases the proximity of the grip 4 to the tip 6 (i.e. the distance between the grip 4 and the tip 6 increases) by aligning the pressure to P1. Thus, within a dynamic range of the grip 4 displacements the user can actively manage e.g. the distance of the pointed local area along the Z-axis.
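The three pressure levels define a simple hysteresis band: the controller only moves the grip when the measured tip pressure leaves the band between P0 and P2, and each correction restores the pressure towards the set point P1. The sketch below illustrates that logic under assumed names and units; the actual threshold values and step size are not specified in the disclosure.

```python
# Hypothetical pressure-band controller for the Figure 3 pointing mode.

P0, P1, P2 = 0.2, 0.5, 0.8     # assumed pressure levels (normalised units)
STEP_MM = 0.5                  # assumed grip adjustment per correction

def adjust_grip(pressure, grip_distance_mm):
    """Return the new grip-to-tip distance for a measured tip pressure."""
    if pressure > P2:
        # Pressing harder than P2: shorten the grip-to-tip distance,
        # which relieves the pressure back towards P1.
        return grip_distance_mm - STEP_MM
    if pressure < P0:
        # Pressing more lightly than P0: lengthen the distance,
        # restoring the pressure towards P1.
        return grip_distance_mm + STEP_MM
    return grip_distance_mm    # within [P0, P2]: hold position
```

Dead-band control of this kind avoids constant micro-corrections while the user holds a steady pressure, which matches the stated goal of keeping the grip proximity "continuously and steady".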
Displacements of the grip 4 position in relation to the tip 6 can be converted into displacements of a local area of a virtual 3D scene (i.e. the image on the display), the relative position of the pointer, the relocation of a virtual object, or a change in another property related to the Z-axis. The displacement of the grip 4 position in relation to the tip 6 may also simulate changes in a physical property, e.g. viscosity, density etc.
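On the computer side this amounts to dispatching the reported grip displacement to whichever scene property is bound to the Z-axis. The fragment below is one possible reading of that step; the mode names and every `scene` method are assumptions introduced purely for illustration.

```python
# Hypothetical computer-side conversion of grip displacement (Figure 3).

def apply_displacement(scene, mode, dz_mm):
    """Convert a grip displacement dz_mm into a change in the 3D scene."""
    if mode == "local_area":
        scene.raise_local_area(dz_mm)       # deform the pointed area along Z
    elif mode == "pointer":
        scene.move_pointer_z(dz_mm)         # shift the pointer along Z
    elif mode == "object":
        scene.relocate_object_z(dz_mm)      # relocate a virtual object
    elif mode == "physical":
        scene.set_viscosity_delta(dz_mm)    # simulate e.g. a viscosity change
```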
In one embodiment, an opposite end 18 of the stylus can produce appropriate tactile stimuli (e.g., via friction, vibration and skin stretch) as a kind of complementary feedback coordinated with direction of the grip 4 displacements.
Figure 4 discloses a top view showing displacements of a stylus grip along the Z-axis in accordance with one embodiment of the invention. The numbering again follows the numbering used in Figure 1.
When the stylus 2, and more particularly the tip 6, moves across a flat display 8 surface along the X-axis and/or Y-axis, displacements of the grip 4 simulate relative distances along the Z-axis. As can be seen from Figure 4, as the virtual elevation parameter on the Z-axis reduces, the distance between the tip 6 and the grip 4 grows, and vice versa.
Figure 5 discloses a block diagram illustrating a general structure of a device in accordance with one embodiment of the invention. A controller 50 is configured to interact with a computer, i.e. the controller 50 may send and/or receive data to/from the computer. The actual connection between the controller 50 and the computer may be a wired or a wireless connection. The controller 50 is also configured to operate and control other parts of the device, e.g. an actuator 52. When the actuator 52 is run, the distance between a first part of the device comprising a tip and a second part changes. The actuator 52 is e.g. a linear motor. In another embodiment, the actuator 52 may take any appropriate form which enables moving the second part in relation to the tip. The device may also include means 54 for measuring a distance from the second part to a surface contacted by the tip. Any appropriate technical solution can be used to measure the distance. In one embodiment, distance information may be sent to the computer by the controller 50.
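Taken together, the Figure 5 structure maps naturally onto a small set of cooperating components: a controller mediating between the computer link, the actuator and the distance sensor, with optional tactile feedback. The dataclass below is an assumed decomposition for illustration; none of these type or method names appear in the disclosure.

```python
# Hypothetical decomposition of the Figure 5 device structure.
from dataclasses import dataclass

@dataclass
class HapticStylus:
    link: "ComputerLink"          # wired or wireless connection (controller 50)
    actuator: "Actuator"          # e.g. a linear motor (actuator 52)
    distance_sensor: "Sensor"     # grip-to-surface measurement (means 54)
    tactile: "Vibrator" = None    # optional complementary feedback (58)

    def on_control_data(self, target_mm: float) -> None:
        """Apply control data received from the computer to the actuator."""
        self.actuator.move_grip_to(target_mm)

    def report_distance(self) -> None:
        """Send the measured grip-to-surface distance to the computer."""
        self.link.send(self.distance_sensor.read_mm())
```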
In one embodiment, the device may also comprise tactile feedback means 58 for providing tactile feedback (e.g. via vibration) to the user as a complementary feedback.
In one embodiment, the device may also comprise at least one optional input and/or output means 56. An input may refer e.g. to a button. A user may give an additional input with the button when using the device.
In addition to or instead of the input means the device may also include an output means providing additional output to the user. The output means may refer e.g. to means for providing sound.
The exemplary embodiments can include, for example, any suitable servers, workstations, PCs, laptop computers, personal digital assistants (PDAs), Internet appliances, handheld devices, cellular telephones, smart phones, wireless devices, other devices, and the like, capable of performing the processes of the exemplary embodiments. The devices and subsystems of the exemplary embodiments can communicate with each other using any suitable protocol and can be implemented using one or more programmed computer systems or devices.
It is to be understood that the exemplary embodiments are for exemplary purposes, as many variations of the specific hardware used to implement the exemplary embodiments are possible, as will be appreciated by those skilled in the hardware and/or software art(s). For example, the functionality of one or more of the components of the exemplary embodiments can be implemented via one or more hardware and/or software devices.
The exemplary embodiments can store information relating to various processes described herein. This information can be stored in one or more memories, such as a hard disk, optical disk, magneto-optical disk, RAM, and the like. One or more databases can store the information used to implement the exemplary embodiments of the present inventions. The databases can be organized using data structures (e.g., records, tables, arrays, fields, graphs, trees, lists, and the like) included in one or more memories or storage devices listed herein. The processes described with respect to the exemplary embodiments can include appropriate data structures for storing data collected and/or generated by the processes of the devices and subsystems of the exemplary embodiments in one or more databases.
All or a portion of the exemplary embodiments can be conveniently implemented using one or more general purpose processors, microprocessors, digital signal processors, micro-controllers, and the like, programmed according to the teachings of the exemplary embodiments of the present inventions, as will be appreciated by those skilled in the computer and/or software art(s). Appropriate software can be readily prepared by programmers of ordinary skill based on the teachings of the exemplary embodiments, as will be appreciated by those skilled in the software art. In addition, the exemplary embodiments can be implemented by the preparation of application-specific integrated circuits or by interconnecting an appropriate network of conventional component circuits, as will be appreciated by those skilled in the electrical art(s). Thus, the exemplary embodiments are not limited to any specific combination of hardware and/or software.
Stored on any one or on a combination of computer readable media, the exemplary embodiments of the present inventions can include software for controlling the components of the exemplary embodiments, for driving the components of the exemplary embodiments, for enabling the components of the exemplary embodiments to interact with a human user, and the like. Such software can include, but is not limited to, device drivers, firmware, operating systems, development tools, applications software, and the like. Such computer readable media further can include the computer program product of an embodiment of the present inventions for performing all or a portion (if processing is distributed) of the processing performed in implementing the inventions. Computer code devices of the exemplary embodiments of the present inventions can include any suitable interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs), Java classes and applets, complete executable programs, Common Object Request Broker Architecture (CORBA) objects, and the like. Moreover, parts of the processing of the exemplary embodiments of the present inventions can be distributed for better performance, reliability, cost, and the like.
As stated above, the components of the exemplary embodiments can include computer readable medium or memories for holding instructions programmed according to the teachings of the present inventions and for holding data structures, tables, records, and/or other data described herein. Computer readable medium can include any suitable medium that participates in providing instructions to a processor for execution. Such a medium can take many forms, including but not limited to, non-volatile media, volatile media, transmission media, and the like. Non-volatile media can include, for example, optical or magnetic disks, magneto-optical disks, and the like. Volatile media can include dynamic memories, and the like. Transmission media can include coaxial cables, copper wire, fiber optics, and the like. Common forms of computer-readable media can include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other suitable magnetic medium, a CD-ROM, CDR, CD-RW, DVD, DVD-ROM, DVD±RW, DVD±R, any other suitable optical medium, punch cards, paper tape, optical mark sheets, any other suitable physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other suitable memory chip or cartridge, a carrier wave or any other suitable medium from which a computer can read.
While the present inventions have been described in connection with a number of exemplary embodiments and implementations, the present inventions are not so limited, but rather cover various modifications and equivalent arrangements, which fall within the purview of prospective claims.
Claims
1. A device for interacting with a computer and a user, said device comprising:
a first part comprising a tip for contacting a surface;
a second part for contacting the user;
a signalling means for interacting with the computer;
characterized in that
the device comprises moving means for moving the second part in relation to the tip while the tip is arranged to maintain contact with the surface;
wherein the movement of the second part in relation to the tip corresponds to a change in at least one image parameter, the image being projected on a display, and wherein the movement of the second part in relation to the tip provides the user with a haptic sense.
2. The device of claim 1, wherein the signalling means are configured to receive control data from the computer, the control data depending on the at least one image parameter at a position of the tip on the image on the display, wherein the moving means are configured to move the second part in relation to the tip based on the control data.
3. The device of claim 2, wherein the at least one image parameter comprises a Z-coordinate value in the image at the tip position, the distance between the tip and the second part simulating movement along the Z-axis in the image.
4. The device of claim 2, wherein the at least one image parameter comprises a visual parameter.
5. The device of claim 4, wherein the visual parameter comprises at least one of brightness and colour.
6. The device of claim 2, wherein the at least one image parameter comprises at least one physical property.
7. The device of any of claims 1 to 6, wherein the device comprises measuring means for determining the distance between the tip and the second part, and the signalling means are configured to send the distance information to the computer.
8. The device of any of claims 1 to 7, wherein the moving means are configured to move the second part in relation to the tip when the user applies positive or negative pressure with the tip to the surface.
9. The device of any of claims 1 to 8, further comprising tactile feedback means for providing tactile feedback to the user as a complementary feedback.
10. The device of any of claims 1 to 9, in which the device comprises a stylus, wherein the second part comprises a grip part, arranged at least partially around the first part.
11. The device of claim 10, wherein the moving means are configured to move the grip part in relation to the tip along the longitudinal axis of the stylus.
12. A method for interacting with a device and a user, the method comprising:
determining a position of a tip of a first part of the device on an image on a display;
characterized in that
generating control data based on the tip position;
providing the device with the control data, wherein the control data depends on at least one image parameter at the tip position on the image on the display; wherein the control data is configured to control moving means of the device to move a second part of the device in relation to the tip while the tip is arranged to maintain contact with a surface, wherein a change in the distance between the tip and the second part corresponds to a change in the at least one image parameter at the tip position in the image.
13. The method of claim 12, wherein the at least one image parameter comprises a Z-coordinate value of the image at the tip position.
14. The method of claim 12, wherein the at least one image parameter comprises a visual parameter.
15. The method of claim 14, wherein the visual parameter comprises at least one of brightness and colour.
16. The method of any of claims 14 to 15, further comprising:
averaging the visual parameter over the whole image on the display or over a portion of the image on the display.
17. The method of claim 12, wherein the at least one image parameter comprises a physical property.
18. The method of any of claims 12 to 17, further comprising: providing the device with additional control data comprising data for providing tactile feedback with the device.
19. A method for interacting with a device and a user, the method comprising:
determining a position of a tip of a first part of the device on an image on a display;
receiving control data from the device, the control data indicating the distance between the tip and a second part of the device, the second part being arranged to move in relation to the tip while the tip is arranged to maintain contact with a surface;
converting the distance information into a change in an image parameter in the image on the display.
20. The method of claim 19, wherein the converting comprises converting the distance information into a displacement of the image or a portion of the image along Z-axis.
21. The method of claim 19, wherein the converting comprises converting the distance information into a change in a physical property.
22. The method of any of claims 19 to 21, further comprising:
determining a pressure level of the tip on the surface; and
sending data to the device to decrease the distance between the tip and the second part when the pressure level exceeds a first limit.
23. The method of any of claims 19 to 21, further comprising:
determining a pressure level of the tip on the surface; and
sending data to the device to increase the distance between the tip and the second part when the pressure level falls under a second limit.
24. The method of any of claims 19 to 23, further comprising: providing the device with additional control data including data for providing tactile feedback with the device.
25. A computer program comprising program code configured to execute the method of any of claims 12 to 18 when executed on a processing device.
26. The computer program of claim 25, wherein the computer program is embodied on a computer-readable medium.
27. A computer program comprising program code configured to execute the method of any of claims 19 to 24 when executed on a processing device.
28. The computer program of claim 27, wherein the computer program is embodied on a computer-readable medium.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| FI20090434A FI20090434A7 (en) | 2009-11-17 | 2009-11-17 | Method, computer program and device for interacting with a computer |
| FI20090434 | 2009-11-17 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2011061395A1 (en) | 2011-05-26 |
Family
ID=41395160
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/FI2010/050924 (WO2011061395A1, ceased) | Method, computer program and device for interacting with a computer | 2009-11-17 | 2010-11-16 |
Country Status (2)
| Country | Link |
|---|---|
| FI (1) | FI20090434A7 (en) |
| WO (1) | WO2011061395A1 (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020097223A1 (en) * | 1998-06-23 | 2002-07-25 | Immersion Corporation | Haptic feedback stylus and other devices |
| US6448959B1 (en) * | 1991-12-13 | 2002-09-10 | Fujitsu, Limited | Desired region specifying system in an image editing apparatus |
| DE102005027203A1 (en) * | 2005-06-13 | 2006-12-14 | Mühlburger, Nobert | Haptic-user interface for e.g. controlling computer system, has actuator producing definitive counter-force to shift input segment with respect to holding segment, so that length/height of input unit is increased or decreased |
| EP1821182A1 (en) * | 2004-10-12 | 2007-08-22 | Nippon Telegraph and Telephone Corporation | 3d pointing method, 3d display control method, 3d pointing device, 3d display control device, 3d pointing program, and 3d display control program |
| US20080143693A1 (en) * | 2000-05-24 | 2008-06-19 | Immersion Corporation | Haptic stylus utilizing an electroactive polymer |
| US20090079703A1 (en) * | 2007-09-20 | 2009-03-26 | Electronics And Telecommunications Research Institute | Device and system for providing user with sensation effect on touch screen |
-
2009
- 2009-11-17 FI FI20090434A patent/FI20090434A7/en not_active Application Discontinuation
-
2010
- 2010-11-16 WO PCT/FI2010/050924 patent/WO2011061395A1/en not_active Ceased
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014053695A1 (en) * | 2012-10-02 | 2014-04-10 | Nokia Corporation | An apparatus and associated methods for providing electrotactile feedback |
| US10133370B2 (en) | 2015-03-27 | 2018-11-20 | Tampereen Yliopisto | Haptic stylus |
| CN111746398A (en) * | 2019-03-27 | 2020-10-09 | 株式会社斯巴鲁 | Contactless operating device for vehicle and vehicle |
| JP2020194465A (en) * | 2019-05-30 | 2020-12-03 | 学校法人立命館 | Tactile sense presentation system, tactile sense presentation device, controller, and method of presenting tactile sense in virtual world to user in real world |
| JP7442778B2 (en) | 2019-05-30 | 2024-03-05 | 学校法人立命館 | Haptic presentation system, tactile presentation device, controller, and method for presenting tactile sensations in a virtual world to a user in the real world |
Also Published As
| Publication number | Publication date |
|---|---|
| FI20090434L (en) | 2011-05-18 |
| FI20090434A7 (en) | 2011-05-18 |
| FI20090434A0 (en) | 2009-11-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11907448B2 (en) | Systems, devices, and methods for physical surface tracking with a stylus device in an AR/VR environment | |
| US20240402791A1 (en) | Three-dimensional object tracking to augment display area | |
| CN110603509B (en) | Joint of direct and indirect interactions in a computer-mediated reality environment | |
| US11068111B2 (en) | Hover-based user-interactions with virtual objects within immersive environments | |
| KR101570800B1 (en) | Systems and methods for controlling a cursor on a display using a trackpad input device | |
| US8416066B2 (en) | Active vibrations | |
| US9430106B1 (en) | Coordinated stylus haptic action | |
| CN117348743A (en) | Computer, rendering method and position indication device | |
| CN102317892A (en) | Method of controlling information input device, information input device, program and information storage medium | |
| CN103890703A (en) | Input control device, input control method, and input control program | |
| US11397478B1 (en) | Systems, devices, and methods for physical surface tracking with a stylus device in an AR/VR environment | |
| CN109643206A (en) | Control device, display device, program and detection method | |
| CN108431734A (en) | Touch feedback for non-touch surface interaction | |
| CN109804638A (en) | Dual Mode Augmented Reality Interface for Mobile Devices | |
| WO2011061395A1 (en) | Method, computer program and device for interacting with a computer field of the invention | |
| CN109871117A (en) | Information processing unit, display device and information processing system | |
| Knierim et al. | The SmARtphone controller: leveraging smartphones as input and output modality for improved interaction within mobile augmented reality environments | |
| CN102902412B (en) | Many pointers indirect input device based on accelerate mutual | |
| CN104137026A (en) | Interactive Cartographic Recognition | |
| KR102322968B1 (en) | a short key instruction device using finger gestures and the short key instruction method using thereof | |
| CN110119226A (en) | Hand input device, hand-written input system and hand-written inputting method | |
| WO2023234822A1 (en) | An extended-reality interaction system | |
| JPH04257014A (en) | Input device | |
| Abdullah et al. | A virtual environment with haptic feedback for better distance estimation | |
| JP7707629B2 (en) | Information processing device, information processing program, and information processing system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10831200; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 10831200; Country of ref document: EP; Kind code of ref document: A1 |