US20170315698A1 - User interface for use in computing device with sensitive display - Google Patents
- Publication number
- US20170315698A1 (U.S. application Ser. No. 14/833,140)
- Authority
- US
- United States
- Prior art keywords
- user interface
- graphical user
- screen
- interface object
- hover
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the present disclosure relates to a user interface for use in a computing device with a sensitive display.
- Such computing devices nowadays have one or more large memories and one or more powerful processors, thereby becoming multi-functional with various computer programs executable on the device.
- Such computer programs may include, for example, a text editor, a WWW (World Wide Web) browser, and video games.
- the multi-functional computing device should be welcomed but, at the same time, may require complicated operations from users, thereby making usability worse. Improvement of the user interface in such computing devices is of great concern for making usability better.
- the user interface is improved by use of a sensitive display.
- the sensitive display typically displays graphics and also detects a contact or a tap of an object, such as a user's finger, onto its surface.
- the sensitive display may be advantageous in that it may enable intuitive and easy operations for the users. More improvement in a user interface through the sensitive display has been sought.
- the multi-functional computing device is a handheld device that is small for mobility and is provided with a tiny display and a tiny loudspeaker.
- the video displayed on its local display is so small that the user may feel he/she would like to enjoy the video with a larger remote or external display device.
- the sound outputted through its local loudspeaker is so unsatisfactory that the user may feel he/she would like to enjoy the music with a larger remote or external loudspeaker.
- the multi-functional device can communicate with a remote media-playing device such as a remote display device and a remote loudspeaker.
- a first aspect of the present invention is a method of a user interface for use in a computing device with a sensitive display.
- a tap of an object such as a user's finger onto the sensitive display and hover of the object in proximity over the sensitive display are detected.
- a popup dialog is displayed at a location determined based on the detected tap or hover.
- a second aspect of the present invention is a method of a user interface for use in a computing device with a sensitive display. According to the second aspect, hover of an object such as a user's finger in proximity over the sensitive display is detected. Responsive to hover detected above a predetermined area within a screen of a computer program displayed on the sensitive display, a menu is displayed.
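The hover-triggered menu of the second aspect amounts to a hit test of the hover location against the predetermined area. The following is a minimal sketch; the rectangle representation and function names are illustrative assumptions, not taken from the disclosure:

```python
def point_in_area(x, y, area):
    """Return True if (x, y) lies inside a rectangular area.

    area is (left, top, right, bottom) in screen pixels (illustrative layout).
    """
    left, top, right, bottom = area
    return left <= x <= right and top <= y <= bottom

def on_hover(x, y, menu_area, show_menu):
    """Display the menu only while hover is detected above the predetermined area."""
    if point_in_area(x, y, menu_area):
        show_menu()
        return True
    return False
```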
- a third aspect of the present invention is a method of a user interface for use in a computing device with a sensitive display.
- hover of an object such as a user's finger in proximity over the sensitive display is detected.
- an assistant object is displayed for assisting the tappable object to be tapped.
- a fourth aspect of the present invention is a method of a user interface for use in a computing device that is provided with a sensitive display and is operable in connection with a remote display device.
- hover of an object such as a user's finger in proximity over the sensitive display is detected.
- Video signals representing a screen of a computer program executed in the computing device can be sent to the remote display device.
- video signals representing an indicator indicative of the detected hover can be sent to the remote display device.
- a fifth aspect of the present invention is a method of a user interface for use in a computing device that is provided with a sensitive display and is operable in connection with a remote display device.
- a screen of a computer program executed in the computing device can be changed according to whether or not communication is active between the computing device and the remote display device.
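The fifth aspect reduces to selecting the screen variant from the state of the communication link. A minimal sketch, assuming just two illustrative screen variants:

```python
def choose_screen(remote_link_active):
    """Pick the screen of the computer program according to whether
    communication with the remote display device is active (fifth aspect).
    The screen names are illustrative assumptions, not from the disclosure."""
    if remote_link_active:
        return "remote-playback screen"
    return "local screen"
```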
- the term "tappable" used in this application means the possibility or ability of being tapped.
- a tappable object means an object which is to be tapped, or which a user can tap on.
- FIG. 1 illustrates a front view of a computing device according to a first embodiment.
- FIG. 2 is a block diagram illustrating means and/or circuitry provided in a computing device according to a first embodiment.
- FIG. 3 is a flowchart illustrating operations performed by a computing device according to a first aspect of the first embodiment.
- FIG. 4 illustrates how the location for a notification dialog is determined based on hover according to the first aspect of the first embodiment.
- FIGS. 5, 6, 7, and 8 illustrate how the location for a notification dialog is determined based on hover according to the first aspect of the first embodiment.
- FIGS. 9, 10, 11, and 12 illustrate how a notification dialog is displayed according to the first aspect of the first embodiment.
- FIGS. 13, 14, 15, and 16 illustrate how a notification dialog is displayed according to the first aspect of the first embodiment.
- FIG. 17 is a flowchart illustrating operations performed by a computing device according to a second aspect of the first embodiment.
- FIG. 18 illustrates how the location for a notification dialog is determined based on tap according to the second aspect of the first embodiment.
- FIGS. 19, 20, 21, and 22 illustrate how the location for a notification dialog is determined based on tap according to the second aspect of the first embodiment.
- FIGS. 23, 24, 25, and 26 illustrate how a notification dialog is displayed according to the second aspect of the first embodiment.
- FIGS. 27, 28, 29, and 30 illustrate how a notification dialog is displayed according to the second aspect of the first embodiment.
- FIG. 31 is a flowchart illustrating operations performed by a computing device according to a third aspect of the first embodiment.
- FIGS. 32, 33, 34, and 35 illustrate how a menu is displayed according to the third aspect of the first embodiment.
- FIG. 36 is a flowchart illustrating operations performed by a computing device according to a fourth aspect of the first embodiment.
- FIGS. 37, 38, 39, and 40 illustrate how an assistant object is displayed according to the fourth aspect of the first embodiment.
- FIGS. 41, 42, and 43 illustrate how an assistant object is displayed according to the fourth aspect of the first embodiment.
- FIG. 44 is a flowchart illustrating operations performed by a computing device according to a fifth aspect of the first embodiment.
- FIGS. 45, 46, 47, and 48 illustrate how an assistant object is displayed according to the fifth aspect of the first embodiment.
- FIGS. 49, 50, and 51 illustrate how an assistant object is displayed according to the fifth aspect of the first embodiment.
- FIG. 52 illustrates a system including a computing device and a remote display device according to a second embodiment.
- FIG. 53 illustrates a system including a computing device and a remote display device according to the second embodiment.
- FIG. 54 is a block diagram illustrating means and/or circuitry provided in a computing device according to the second embodiment.
- FIG. 55 is a flowchart illustrating operations performed by a computing device according to a first aspect of the second embodiment.
- FIG. 56 is a flowchart illustrating operations performed by a computing device according to a second aspect of the second embodiment.
- FIG. 57 is a flowchart illustrating operations performed by a computing device according to a third aspect of the second embodiment.
- FIG. 58 is a flowchart illustrating operations performed by a computing device according to a fourth aspect of the second embodiment.
- FIGS. 59 and 60 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment.
- FIGS. 61 and 62 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment.
- FIGS. 63 and 64 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment.
- FIGS. 65 and 66 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment.
- FIGS. 67 and 68 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment.
- FIGS. 69 and 70 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment.
- FIG. 71 is a flowchart illustrating operations performed by a computing device according to a fifth aspect of the second embodiment.
- FIG. 72 is a flowchart illustrating operations performed by a computing device according to a sixth aspect of the second embodiment.
- FIG. 73 is a flowchart illustrating operations performed by a computing device according to a seventh aspect of the second embodiment.
- FIG. 74 is a flowchart illustrating operations performed by a computing device according to an eighth aspect of the second embodiment.
- FIGS. 75 and 76 illustrate how a tappable object is emphasized according to the fifth to eighth aspects of the second embodiment.
- FIGS. 77 and 78 illustrate how a tappable object is emphasized according to the fifth to eighth aspects of the second embodiment.
- FIGS. 79 and 80 illustrate how a tappable object is emphasized according to the fifth to eighth aspects of the second embodiment.
- FIGS. 81 and 82 illustrate how a tappable object is emphasized according to the fifth to eighth aspects of the second embodiment.
- FIGS. 83 and 84 illustrate how a tappable object is emphasized according to the fifth to eighth aspects of the second embodiment.
- FIG. 85 is a flowchart illustrating operations performed by a computing device according to a ninth aspect of the second embodiment.
- FIG. 86 is a flowchart illustrating operations performed by a computing device according to a tenth aspect of the second embodiment.
- FIG. 87 is a flowchart illustrating operations performed by a computing device according to an eleventh aspect of the second embodiment.
- FIGS. 88 through 90 illustrate how an indicator is displayed according to the ninth to eleventh aspects of the second embodiment.
- FIGS. 91 through 93 illustrate how an indicator is displayed according to the ninth to eleventh aspects of the second embodiment.
- FIGS. 94 through 96 illustrate how an indicator is displayed according to the ninth to eleventh aspects of the second embodiment.
- FIG. 97 is a flowchart illustrating operations performed by a computing device according to a twelfth aspect of the second embodiment.
- FIG. 98 is a flowchart illustrating operations performed by a computing device according to a thirteenth aspect of the second embodiment.
- FIG. 99 is a flowchart illustrating operations performed by a computing device according to a fourteenth aspect of the second embodiment.
- FIG. 100 is a flowchart illustrating operations performed by a computing device according to a fifteenth aspect of the second embodiment.
- FIGS. 101 and 102 illustrate how an indicator is displayed and how a tappable object is emphasized according to the twelfth to fifteenth aspects of the second embodiment.
- FIGS. 103 and 104 illustrate how an indicator is displayed and how a tappable object is emphasized according to the twelfth to fifteenth aspects of the second embodiment.
- FIGS. 105 and 106 illustrate how an indicator is displayed and how a tappable object is emphasized according to the twelfth to fifteenth aspects of the second embodiment.
- FIGS. 107 and 108 illustrate how an indicator is displayed and how a tappable object is emphasized according to the twelfth to fifteenth aspects of the second embodiment.
- FIGS. 109 and 110 illustrate how an indicator is displayed and how a tappable object is emphasized according to the twelfth to fifteenth aspects of the second embodiment.
- FIG. 111 is a flowchart illustrating operations performed by a computing device according to a sixteenth aspect of the second embodiment.
- FIGS. 112, 113, 114, and 115 illustrate how a software keyboard is displayed according to the sixteenth aspect of the second embodiment.
- a computing device 1 can detect taps of an object such as a user's finger onto its local sensitive display 3 and hover of the object in proximity over such sensitive display 3 .
- the computing device 1 controls display of various graphical objects responsive to the detected tap and/or hover.
- the computing device 1 controls display of a dialog on the sensitive display 3 when an event to pop up the dialog occurs while a computer program is being executed and a screen of the computer program is being displayed on the sensitive display 3 .
- the computing device 1 controls display of a menu associated with a computer program in a screen of the computer program while the computer program is being executed.
- the computing device 1 controls display of an assistant object for assisting a tappable object, which appears in a screen of a computer program, to be tapped while the computer program is being executed.
- the computing device 1 performs the display controls based on a location of a tap and/or hover of an object such as a user's finger on and/or over the sensitive display 3 .
- FIG. 1 depicts the computing device 1 .
- FIG. 1 is a front view of the computing device 1 .
- the computing device 1 is a multi-functional computing device suitable in size for mobility.
- the computing device 1 can be a cell phone, a tablet computer, a laptop computer, or another similar computing device.
- FIG. 2 is a block diagram of the computing device 1 for illustrating the configuration of the computing device 1 in more detail.
- the computing device 1 mainly has a processor 2 , the sensitive display 3 , telecommunication circuitry 4 , and a memory 5 .
- the processor 2 generally processes instructions of computer programs stored in the memory 5 to execute the computer programs, so as to realize a variety of functions of the computing device 1 .
- the processor 2 can be a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a DSP (Digital Signal Processor), or one or a combination of other general or dedicated processors.
- the sensitive display 3 is a display device composed essentially of a display 3 a and a sensor 3 b .
- the display 3 a can be an LCD (Liquid Crystal Display), an EL (Electroluminescent) display, or one of other similar types of display devices.
- the display 3 a displays graphics and video in accordance with video signals sent from the processor 2 .
- the sensor 3 b is a sensor to distinctively detect (i) taps of one or more objects, such as a user's finger and a stylus, made onto the sensor 3 b and (ii) hover of such object made in proximity over the sensor 3 b .
- the sensor 3 b sends to the processor 2 signals representing (i) the location of detected tap as long as such tap is detected and (ii) the location of detected hover as long as such hover is detected.
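The signals described above can be modeled as a small event record carrying the kind of detection and its location. A hedged sketch; the type and field names are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class SensorSignal:
    """One signal from the sensor 3b to the processor 2."""
    kind: str  # "tap" or "hover"
    x: int     # horizontal location on/over the sensitive display
    y: int     # vertical location

def describe(signal):
    """Render a signal for logging or debugging."""
    return f"{signal.kind} at ({signal.x}, {signal.y})"
```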
- a tap may be a touch or a contact in other words.
- the sensor 3 b detects gestures by (i) continuously detecting hover made in proximity above the sensor 3 b or (ii) continuously detecting a movement of the object while a tap is maintained on the sensor 3 b .
- the technologies of sensing of taps, hover, and/or gestures are disclosed, for example, in the U.S. patent publications Nos.
- the display 3 a and the sensor 3 b may be mechanically integrated together.
- the sensitive display 3 displays graphics and video as well as detects taps, hover, and gestures of an object like the user's finger or a stylus on or above the sensitive display 3 .
- the telecommunication circuitry 4 is circuitry for telecommunication over a telecommunication network.
- the telecommunication circuitry 4 can be circuitry for telecommunication pursuant to CDMA (Code Division Multiple Access) or other similar telecommunication standards or protocols.
- the memory 5 is a memory device, for example, a flash memory, an EEPROM, an HDD (Hard Disk Drive), a combination thereof, or one or a combination of other similar memory devices.
- the memory 5 stores computer programs to be executed by the processor 2 .
- the memory 5 stores an OS (Operating System) 5 a , a WWW (World Wide Web) browser 5 b , a video game 5 c , a text editor 5 d , a media player 5 e , a display control program 5 f , and a telecommunication program 5 g .
- the WWW browser 5 b , the video game 5 c , the text editor 5 d , and the media player 5 e are typically application programs that run on the OS 5 a .
- the programs 5 b to 5 e are often collectively referred to as application programs.
- the display control program 5 f and the telecommunication program 5 g can also run on the OS 5 a , or can be incorporated in the OS 5 a running as part of the OS 5 a .
- the display control program 5 f and the telecommunication program 5 g run in the background as long as the OS 5 a is running.
- One or more of the application programs 5 b to 5 e are executed on the OS 5 a in response to user's selection.
- the display control program 5 f and the telecommunication program 5 g are executed while the OS 5 a is executed.
- the processor 2 sends video signals to the sensitive display 3 in accordance with instructions of the OS 5 a , the application programs 5 b to 5 e , the display control program 5 f , and/or the telecommunication program 5 g .
- the sensitive display 3 displays graphics and video in accordance with the video signals.
- the graphics and the video to be displayed include screens, icons, dialogs, menus and other graphical objects or contents.
- the screen may contain one or more tappable objects or contents, such as an HTML (Hyper Text Markup Language) link, a text-input field, a software button, and a software keyboard.
- the dialog is a graphical object, with a message and one or more tappable objects, which pops up in response to occurrence of one or more given events associated with the OS 5 a , the application programs 5 b to 5 e , the display control program 5 f , and/or the telecommunication program 5 g .
- the menu is associated with the OS 5 a , the application programs 5 b to 5 e , the display control program 5 f , and/or the telecommunication program 5 g , and is a tappable object for operation of the OS 5 a , the application programs 5 b to 5 e , the display control program 5 f , and/or the telecommunication program 5 g.
- one or more icons representing one or more of the application programs 5 b to 5 e are displayed on the sensitive display 3 in accordance with the instruction of the OS 5 a .
- a screen of one of the application programs 5 b to 5 e is displayed on the sensitive display 3 in accordance with the instructions of the one of the application programs 5 b to 5 e .
- a dialog for notifying a user of an incoming call over the telecommunication network is popped up in accordance with the instructions of the telecommunication program 5 g.
- when an object such as the user's finger hovers over the sensitive display 3 , the sensitive display 3 detects the hover and determines the location above which the hover exists over the sensitive display 3 . The sensitive display 3 continuously sends to the processor 2 signals each representing the determined hover location during the hover detection.
- in other words, the location may be a position or X-Y coordinates.
- when a tap of an object such as the user's finger is made onto the sensitive display 3 , the sensitive display 3 detects the tap and determines the location at which the tap is made on the sensitive display 3 . The sensitive display 3 then sends to the processor 2 a signal representing the determined tap location.
- in other words, the location may be a position or X-Y coordinates.
- the processor 2 receives the signals from the sensitive display 3 . Based on the received signals, the processor 2 determines the location of the hover and tap within a screen displayed on the sensitive display 3 . The processor 2 then operates in response to the hover and tap in accordance with the instructions of the OS 5 a , the application programs 5 b to 5 e , the display control program 5 f , and/or the telecommunication program 5 g.
- if the processor 2 determines that a tap is made on an icon representing the WWW browser 5 b , the processor 2 launches the WWW browser 5 b.
- if the processor 2 determines that a tap is made on a text-input field, the processor 2 launches a software keyboard.
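The two dispatch examples above follow one pattern: map the tap location to the tappable object under it and run that object's action. A minimal sketch, with an illustrative registry of rectangles and actions (all names and coordinates are assumptions):

```python
def dispatch_tap(x, y, objects):
    """objects: list of (rect, action) pairs, rect = (left, top, right, bottom).

    Runs the action of the first object whose rectangle contains the tap
    location, mirroring the icon-launch and text-field examples above.
    """
    for (left, top, right, bottom), action in objects:
        if left <= x <= right and top <= y <= bottom:
            return action()
    return None  # tap landed on no tappable object

# Illustrative registry: an icon for the WWW browser and a text-input field.
registry = [
    ((0, 0, 40, 40), lambda: "launch WWW browser"),
    ((0, 50, 200, 80), lambda: "show software keyboard"),
]
```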
- upon executing the programs 5 a to 5 g , the processor 2 generates and sends video signals representing a screen of one of the programs 5 a to 5 g to the sensitive display 3 , so as for the sensitive display 3 to display the screen. Also, the processor 2 receives operations of the program whose screen is displayed by way of, for example, the user's tapping software buttons or other tappable graphical objects that appear in the screen.
- when executing the OS 5 a without executing any of the application programs 5 b to 5 e , the processor 2 displays the screen of the OS 5 a on the sensitive display 3 .
- the screen may contain icons representing the application programs 5 b to 5 e .
- the processor 2 receives operations for launching one of the application programs 5 b to 5 e through the user's tapping on the one of the icons.
- when executing the WWW browser 5 b on the OS 5 a , the processor 2 displays the screen of the WWW browser 5 b on the sensitive display 3 .
- the screen may contain software buttons.
- the processor 2 receives operations for connecting to WWW pages through the user's tapping on the software buttons.
- the processor 2 usually does not display screens of the display control program 5 f and the telecommunication program 5 g because the programs 5 f and 5 g run in the background while the OS 5 a is running.
- One or more events are associated with one or more of the programs 5 a to 5 g . If an event occurs in one of the programs 5 a to 5 g , the processor 2 generates and sends video signals representing a dialog associated with the event to the sensitive display 3 so as for the sensitive display 3 to pop up the dialog over the screen already displayed.
- the dialog may contain one or more tappable graphical objects.
- responsive to the user's tapping one of the tappable graphical objects, the processor 2 executes a predetermined action associated with the tapped graphical object.
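The event-to-dialog flow can be sketched as an event handler that pops up a dialog whose tappable buttons carry predetermined actions. The class, event names, and button labels here are illustrative assumptions:

```python
class Dialog:
    """A popped-up dialog: a message plus tappable buttons mapped to actions."""

    def __init__(self, message, buttons):
        self.message = message
        self.buttons = buttons  # button label -> predetermined action

    def tap(self, label):
        """Execute the predetermined action of the tapped button."""
        return self.buttons[label]()

def on_event(event):
    """Pop up the dialog associated with a given event, if any."""
    if event == "incoming_call":
        return Dialog("Incoming call", {"Answer": lambda: "call established"})
    return None
```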
- an incoming call event is associated with the telecommunication program 5 g .
- the processor 2 continuously monitors for an incoming call from some distant caller over the telecommunication network through the telecommunication circuitry 4 . Responsive to arrival of an incoming call, the sensitive display 3 pops up a dialog for notifying the user of the incoming call over the already-displayed screen.
- the dialog may contain a software button for answering the incoming call. Responsive to the user's tapping the software button, the processor 2 establishes telecommunication between the user, namely, the computing device 1 and the caller.
- a virus detection event is associated with the WWW browser 5 b .
- the processor 2 continuously monitors for computer viruses maliciously hidden in WWW pages. Responsive to detection of a hidden computer virus, the sensitive display 3 pops up a dialog for notifying the user of the detected computer virus over the already-displayed screen.
- the dialog may contain a software button for responding to the notice and selecting countermeasures against the computer virus. Responsive to the user's tapping the software button, the processor 2 quarantines or eliminates the virus.
- FIG. 3 is a flowchart illustrating a first aspect of the display control in accordance with the display control program 5 f .
- the display control is executed responsive to occurrence of one of the above-mentioned given events triggering a popup of a dialog.
- the processor 2 determines whether or not hover of an object such as the user's finger in proximity over the sensitive display 3 is being detected (S 101 ). Namely, the processor 2 determines whether or not signals representing the location of hover are being sent from the sensitive display 3 (S 101 ).
- the processor 2 pops up a dialog associated with the event at a predetermined location over the screen on the sensitive display 3 (S 102 ).
- the predetermined location may be the center of the screen, the bottom area of the screen, or the like.
- the processor 2 determines a location that is predetermined pixels away from the determined hover location (S 103 ).
- a dialog is to be displayed at the determined location. As illustrated in FIG. 4 , the determination in S 103 can be done by simply selecting one of the locations L each of which is predetermined pixels away from the determined hover location 11 .
- the determination in S 103 can be done by determining first and second areas based on the determined hover location and then determining a given location within the second area, as described below with reference to FIGS. 5 to 8 .
- the first area 12 is determined based on the determined hover location 11 .
- the first area 12 can be defined from vertexes each of which is predetermined pixels away from the determined hover location 11 as illustrated in FIG. 5 .
- the predetermined pixels may be identical among all of the vertexes to form the first area 12 to be a regular square, or may be different among the vertexes to form an irregular square.
- the first area 12 may be formed from four vertexes to be a square, or may be formed from more than or less than four vertexes to be a polygon other than the square. Instead, as illustrated in FIG. 7 , the first area 12 can be defined as a circle having a radius of predetermined pixels from the detected hover location 11 .
- the second area 13 is determined based on the first area 12 .
- the second area 13 , shaded in FIGS. 6 and 8 , is defined to be the area of the screen other than the first area 12 .
- a given location L is determined within the second area.
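The area-based determination in S 103 can be sketched in Python as follows; this is a minimal illustrative sketch, not the disclosed implementation — the candidate anchor points, the farthest-point selection rule, and all names are assumptions. The first area is modeled as the exclusion circle of FIG. 7, and the second area is the rest of the screen.

```python
# Hypothetical sketch of S103: choose a dialog location in the second
# area, i.e. outside an exclusion circle of the given radius centered
# on the detected hover location. Candidate points and the selection
# rule (farthest candidate) are assumptions for illustration.
import math

def pick_dialog_location(hover_x, hover_y, screen_w, screen_h, radius):
    """Return a point in the second area (outside the exclusion circle)."""
    # Candidate anchor points: the screen center and the four quadrant centers.
    candidates = [
        (screen_w // 2, screen_h // 2),
        (screen_w // 4, screen_h // 4),
        (3 * screen_w // 4, screen_h // 4),
        (screen_w // 4, 3 * screen_h // 4),
        (3 * screen_w // 4, 3 * screen_h // 4),
    ]

    def dist(p):
        return math.hypot(p[0] - hover_x, p[1] - hover_y)

    # Keep only candidates in the second area, then take the farthest one,
    # so the dialog appears well away from the hovering finger.
    second_area = [p for p in candidates if dist(p) >= radius]
    return max(second_area, key=dist) if second_area else max(candidates, key=dist)
```

For a hover near the upper-left corner of an 800×600 screen, the sketch places the dialog toward the opposite corner, mirroring FIGS. 11 and 12.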
- the processor 2 pops up a dialog associated with the given event at the determined location over the screen on the sensitive display 3 (S 104 ).
- the dialog is popped up to be displayed some pixels away from the user's finger hovering in proximity over the screen.
- the processor 2 determines whether or not one or more tappable graphical objects contained in the dialog are tapped (S 105 ).
- FIGS. 9, 10, 11, 12, 13, 14, 15, and 16 illustrate examples describing how a dialog is popped up in accordance with the above-mentioned display control.
- FIGS. 9 to 12 illustrate an example of popping up a dialog responsive to occurrence of an incoming call event associated with the telecommunication program 5 g while the OS 5 a is being executed.
- the screen of the OS 5 a is displayed on the sensitive display 3 as illustrated in FIG. 9 .
- the screen contains icons 14 representing one or more of the application programs 5 b to 5 e .
- the processor 2 receives operation for selecting and launching one of the application programs 5 b to 5 e by way of the user's tapping on an icon 14 .
- the telecommunication program 5 g is running in the background, and its screen is not displayed. If an incoming call event occurs, the processor 2 determines whether or not hover is being detected. If there is no hover detected, the processor 2 then displays a dialog 15 for notifying the user of the incoming call over the screen of the active OS 5 a at a predetermined location, such as the substantially center of the screen, as depicted in FIG. 10 .
- the processor 2 determines a location which is predetermined pixels (for example, Z pixels in FIG. 12 ) away from the location of the detected hover, and then displays the dialog 15 at the determined location as depicted in FIGS. 11 and 12 .
- the dialog 15 contains a tappable object 15 a for answering the incoming call as well as a tappable object 15 b for denying the incoming call.
- Responsive to the user's tapping the graphical object 15 a , the processor 2 establishes telecommunication between the user, namely, the computing device 1 , and the caller through the telecommunication circuitry 4 , in accordance with the instructions of the telecommunication program 5 g.
- the above-mentioned display control can avoid the user's erroneous tapping on the suddenly displayed dialog 15 .
- if the dialog 15 popped up near the finger 10 just when the user was about or ready to tap any of the icons 14 while hovering the finger 10 over the screen of the OS 5 a , the user might erroneously tap the tappable object 15 a or 15 b against his/her intention.
- the user might feel bad if he/she erroneously answered the incoming call by tapping the tappable object 15 a against his/her intention.
- the dialog 15 is always popped up distantly from the user's hovering finger 10 , and so the erroneous operation can be avoided.
- FIGS. 13 to 16 illustrate an example of popping up a dialog responsive to occurrence of a virus detection event associated with the WWW browser 5 b while the WWW browser 5 b is running.
- the screen of the WWW browser 5 b is displayed on the sensitive display 3 as illustrated in FIG. 13 .
- the screen contains software buttons 16 and hyperlink buttons 17 .
- the processor 2 receives operation of scrolling forward or back WWW pages by way of the user's tapping the software buttons 16 , and operations of connecting to other WWW pages by way of the user's tapping the hyperlink buttons 17 .
- the WWW browser 5 b continuously monitors computer viruses hidden in the WWW pages in the background.
- if a virus detection event occurs, the processor 2 determines whether or not hover of the user's finger is being detected. If there is no hover detected, the processor 2 then displays a dialog 18 for notifying the user of the detected virus over the screen at a predetermined location, such as the substantially center of the screen, as depicted in FIG. 14 .
- the processor 2 determines a location which is predetermined pixels (for example, Z pixels in FIG. 16 ) away from the location of the detected hover, and then displays the dialog 18 at the determined location as depicted in FIGS. 15 and 16 .
- the dialog 18 contains a tappable object 18 a for checking the details of the virus as well as a tappable object 18 b for eliminating the virus. Responsive to the user's tapping the graphical objects 18 a , the processor 2 displays detail information about the virus. Responsive to the user's tapping the graphical objects 18 b , the processor 2 executes the instructions of WWW browser 5 b to exterminate the virus.
- the above-mentioned display control can avoid the user's erroneous tapping on the suddenly displayed dialog 18 .
- if the dialog 18 popped up near the finger 10 just when the user was about or ready to tap the software buttons 16 or the hyperlink buttons 17 while hovering the finger 10 above the screen, the user might erroneously tap the tappable object 18 a or 18 b against his/her intention.
- the user might feel bad if he/she erroneously eliminated the virus by tapping the object 18 b against his/her intention to analyze the virus carefully.
- the dialog 18 is popped up distantly from the user's hovering finger 10 , and so the erroneous operation can be avoided.
- FIG. 17 is a flowchart illustrating a second aspect of the display control in accordance with the display control program 5 f .
- the display control is executed upon occurrence of one of the above-mentioned given events triggering a popup of a dialog.
- the processor 2 determines whether or not a tap of an object such as the user's finger on the sensitive display 3 is being detected (S 111 ). Namely, the processor 2 determines whether or not signals representing the location of the tap are being sent from the sensitive display 3 (S 111 ).
- the processor 2 pops up a dialog associated with the event at a predetermined location over the screen on the sensitive display 3 (S 112 ).
- the predetermined location may be the center of the screen, the bottom area of the screen, or the like.
- the processor 2 determines a location that is predetermined pixels away from the determined tap location (S 113 ).
- a dialog is to be displayed at the determined location. As illustrated in FIG. 18 , the determination in S 113 can be done by simply selecting one of the locations L each of which is predetermined pixels away from the determined tap location 11 .
- the determination in S 113 can be done by determining first and second areas based on the determined tap location and then determining a given location within the second area, as described below with reference to FIGS. 19 to 22 .
- the first area 12 is determined based on the determined tap location 11 .
- the first area 12 can be defined from vertexes each of which is predetermined pixels away from the determined tap location 11 as illustrated in FIG. 19 . Note that the predetermined pixels may be identical among all of the vertexes to form the first area 12 to be a regular square, or may be different among the vertexes to form an irregular square.
- the first area 12 may be formed from four vertexes to be a square, or may be formed from more than or less than four vertexes to be a polygon other than the square. Instead, as illustrated in FIG. 21 , the first area 12 can be defined as a circle having a radius of predetermined pixels from the detected tap location 11 .
- the second area 13 is determined based on the first area 12 .
- the second area 13 , shaded in FIGS. 20 and 22 , is defined to be the area of the screen other than the first area 12 .
- a given location L is determined within the second area.
- the processor 2 pops up a dialog associated with the given event at the determined location over the screen on the sensitive display 3 (S 114 ).
- the dialog is popped up to be displayed some pixels away from the user's finger tapped on the screen.
- the processor 2 determines whether or not one or more tappable graphical objects contained in the dialog are tapped (S 115 ).
- the processor 2 executes a given action associated with the tapped graphical object (S 116 ).
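The first aspect (hover-driven, S 101 to S 104) and the second aspect (tap-driven, S 111 to S 114) share the same placement logic; only the pointer source differs. The shared branch can be sketched as follows, where the function name, the quadrant-based offset rule, and the clamping are illustrative assumptions:

```python
# Hypothetical sketch of the shared placement branch of S101-S104 and
# S111-S114: if a pointer (hover or tap) is detected, place the dialog
# Z pixels away from it; otherwise fall back to the predetermined
# location. The offset direction rule is an assumption.
def popup_location(pointer_xy, screen_w, screen_h, offset_px, default_xy=None):
    """Return the screen location at which to pop up the dialog."""
    if pointer_xy is None:
        # S101/S111: no hover or tap detected -> S102/S112: use the
        # predetermined location, e.g. the center of the screen.
        return default_xy or (screen_w // 2, screen_h // 2)
    x, y = pointer_xy
    # S103/S113: move offset_px away from the pointer, toward the
    # farther half of the screen, clamped to the screen bounds.
    dx = offset_px if x < screen_w // 2 else -offset_px
    dy = offset_px if y < screen_h // 2 else -offset_px
    return (min(max(x + dx, 0), screen_w - 1),
            min(max(y + dy, 0), screen_h - 1))
```

For example, with no pointer detected on an 800×600 screen the sketch returns the center; with a tap at (700, 500) and Z = 200 it returns (500, 300), away from the finger.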
- FIGS. 23, 24, 25, 26, 27, 28, 29, and 30 illustrate examples describing how a dialog is popped up in accordance with the above-mentioned display control.
- FIGS. 23 to 26 illustrate an example of popping up a dialog responsive to occurrence of an incoming call event associated with the telecommunication program 5 g while the OS 5 a is being executed.
- the screen of the OS 5 a is displayed on the sensitive display 3 as illustrated in FIG. 23 .
- the screen contains icons 14 representing one or more of the application programs 5 b to 5 e .
- the processor 2 receives operation for selecting and launching one of the application programs 5 b to 5 e by way of the user's tapping on an icon 14 .
- the telecommunication program 5 g is running in the background, and its screen is not displayed. If an incoming call event occurs, the processor 2 determines whether or not a tap is being detected. If there is no tap detected, the processor 2 then displays a dialog 15 for notifying the user of the incoming call over the screen of the active OS 5 a at a predetermined location, such as the substantially center of the screen, as depicted in FIG. 24 .
- the processor 2 determines a location which is predetermined pixels (for example, Z pixels in FIG. 26 ) away from the location of the detected tap, and then displays the dialog 15 at the determined location as depicted in FIGS. 25 and 26 .
- the dialog 15 contains a tappable object 15 a for answering the incoming call as well as a tappable object 15 b for denying the incoming call.
- Responsive to the user's tapping the graphical object 15 a , the processor 2 establishes telecommunication between the user, namely, the computing device 1 , and the caller through the telecommunication circuitry 4 , in accordance with the instructions of the telecommunication program 5 g.
- the above-mentioned display control can avoid the user's erroneous tapping on the suddenly displayed dialog 15 .
- if the dialog 15 popped up near the finger 10 just when the user was tapping on the screen of the OS 5 a for operation of the OS 5 a , the user might erroneously tap the tappable object 15 a or 15 b against his/her intention.
- the user might feel bad if he/she erroneously answered the incoming call by tapping the tappable object 15 a against his/her intention.
- the dialog 15 is always popped up distantly from the user's tapped finger 10 , and so the erroneous operation can be avoided.
- FIGS. 27 to 30 illustrate an example of popping up a dialog responsive to occurrence of a virus detection event associated with the WWW browser 5 b while the WWW browser 5 b is running.
- the screen of the WWW browser 5 b is displayed on the sensitive display 3 as illustrated in FIG. 27 .
- the screen contains software buttons 16 and hyperlink buttons 17 .
- the processor 2 receives operation of scrolling forward or back WWW pages by way of the user's tapping the software buttons 16 , and operations of connecting to other WWW pages by way of the user's tapping the hyperlink buttons 17 .
- the WWW browser 5 b continuously monitors computer viruses hidden in the WWW pages in the background.
- if a virus detection event occurs, the processor 2 determines whether or not a tap of the user's finger is being detected. If there is no tap detected, the processor 2 then displays a dialog 18 for notifying the user of the detected virus over the screen at a predetermined location, such as the substantially center of the screen, as depicted in FIG. 28 .
- the processor 2 determines a location which is predetermined pixels (for example, Z pixels in FIG. 30 ) away from the location of the detected tap, and then displays the dialog 18 at the determined location as depicted in FIGS. 29 and 30 .
- the dialog 18 contains a tappable object 18 a for checking the details of the virus as well as a tappable object 18 b for eliminating the virus. Responsive to the user's tapping the objects 18 a , the processor 2 displays detail information about the virus. Responsive to the user's tapping the graphical objects 18 b , the processor 2 executes the instructions of WWW browser 5 b to exterminate the virus.
- the above-mentioned display control can avoid the user's erroneous tapping on the suddenly displayed dialog 18 .
- if the dialog 18 popped up near the finger 10 just when the user was tapping on the screen, the user might erroneously tap the object 18 a or 18 b against his/her intention. The user might feel bad if he/she erroneously eliminated the virus by tapping the tappable object 18 b against his/her intention to analyze the virus carefully.
- the dialog 18 is popped up distantly from the user's tapped finger 10 , and so the erroneous operation can be avoided.
- FIG. 31 is a flowchart illustrating a third aspect of the display control in accordance with the display control program 5 f .
- the display control is executed while a screen of one of the programs 5 a to 5 g is displayed on the sensitive display 3 .
- One or more given locations or areas within the screen are assigned for popup of a menu for operation of that program. For example, the upper right part of the screen can be assigned.
- While the screen of the program is displayed on the sensitive display 3 (S 200 ), the processor 2 continuously determines whether or not hover of an object such as the user's finger is being detected at the assigned location above the screen for more than a predetermined period based on the signals from the sensitive display 3 (S 201 ). If the processor 2 has continuously received the signals representing hover above the assigned location for more than the predetermined period, the processor 2 determines affirmatively.
- the processor 2 displays a menu at a first location over the screen (S 202 ).
- the first location is defined to be predetermined pixels away from the assigned location.
- the menu is a tappable graphical object for operation of the executed program.
- the processor 2 continuously determines whether or not hover anywhere above the screen is kept detected based on the signals from the sensitive display 3 (S 203 ).
- the processor 2 determines negatively if the sensitive display 3 stops detecting hover because, for example, the user has moved his/her finger away from the sensitive display 3 .
- the processor 2 determines affirmatively if the sensitive display 3 keeps detecting hover because, for example, the user has kept his/her finger in proximity to the sensitive display 3 .
- the processor 2 receives the user's tap on the menu through the sensitive display 3 (S 206 ).
- the processor 2 executes a given action associated with the menu in accordance with the instructions of the executed program (S 207 ).
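The dwell-then-show behavior of S 200 to S 203 can be sketched as a small controller fed with periodic hover samples. This is a hedged sketch only: the tick-based timing, the rectangle representation of the assigned area, and all names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the third aspect (S200-S203): the menu pops up
# after hover dwells over the assigned area for dwell_ticks consecutive
# samples, stays visible while hover continues anywhere over the screen,
# and is dismissed when hover ends.
class HoverMenuController:
    def __init__(self, assigned_area, dwell_ticks):
        self.assigned_area = assigned_area  # (x0, y0, x1, y1) rectangle
        self.dwell_ticks = dwell_ticks      # required dwell, in samples
        self.ticks = 0
        self.menu_visible = False

    def _in_area(self, xy):
        x0, y0, x1, y1 = self.assigned_area
        return x0 <= xy[0] <= x1 and y0 <= xy[1] <= y1

    def on_sample(self, hover_xy):
        """Feed one hover sample (None means no hover detected)."""
        if hover_xy is None:
            # S203: NO — hover lost, dismiss the menu and reset the dwell.
            self.ticks = 0
            self.menu_visible = False
        elif self.menu_visible:
            pass  # S203: YES — keep the menu while hover continues anywhere
        elif self._in_area(hover_xy):
            # S201: accumulate dwell over the assigned area.
            self.ticks += 1
            if self.ticks >= self.dwell_ticks:
                self.menu_visible = True  # S202: pop up the menu
        else:
            self.ticks = 0  # hover moved off the assigned area before dwell
        return self.menu_visible
```

Note that once the menu is shown, hover anywhere over the screen keeps it visible (matching S 203), so the user can move the finger from the assigned area toward the menu without dismissing it.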
- FIGS. 32 to 35 illustrate examples describing how a menu is popped up in accordance with the above-mentioned display control while the text editor 5 c is being executed.
- the screen of the text editor 5 c contains a text input field 20 and a software keyboard 21 .
- the processor 2 receives taps on tappable alphabetical keys within the software keyboard 21 through the sensitive display 3 to input texts in the text input field 20 .
- a given part 22 in the upper right of the screen is assigned for menu popup.
- If the processor 2 detects hover of the user's finger 10 above the part 22 for more than a predetermined period, the processor 2 displays a menu 23 at a first location that is predetermined pixels (for example, Y pixels in FIG. 34 ) away from the part 22 , as depicted in FIG. 34 .
- the menu 23 is a graphical object containing tappable software buttons entitled “save”, “edit”, and “option” by way of example.
- the processor 2 keeps displaying the menu 23 as long as hover of the finger 10 is kept over the screen.
- the processor 2 executes actions associated with any one of the software buttons within the menu 23 by receiving a tap on the one of the software buttons in accordance with the instructions of the text editor 5 c .
- the processor 2 can save a document created through text inputs in the memory 5 by receiving a tap on the software button entitled “save”.
- the processor 2 stops displaying the menu 23 if the predetermined time has lapsed or detection of hover has stopped before reception of tap on the menu 23 .
- the above-mentioned display control can hide the menu 23 to keep the screen from being occupied by the menu 23 unless or until the user hopes or needs to operate by use of the menu 23 , so as to enhance screen visibility.
- the menu 23 is displayed some pixels away from the hovering finger 10 , so as to prevent the finger 10 itself from obscuring the menu 23 .
- display of the menu 23 can be kept or ceased through easy operation of the user's keeping the finger 10 in proximity over the screen or moving the finger 10 away from the screen. Accordingly, usability in operating the programs can be improved.
- FIG. 36 is a flowchart illustrating a fourth aspect of the display control in accordance with the display control program 5 f .
- the display control is executed while a screen of one of the programs 5 a to 5 g is displayed on the sensitive display 3 .
- the screen may contain one or more tappable objects, such as texts and images, associated with a given action defined by instructions of one of the programs 5 a to 5 g .
- a screen of the WWW browser 5 b displays a WWW page and may contain a tappable object linked to another WWW page. Tapping on the object may initiate a given action, namely, connection to and display of the linked WWW page.
- While the screen of one of the programs 5 a to 5 g is displayed on the sensitive display 3 (S 300 ), the processor 2 continuously determines whether or not hover of an object such as the user's finger 10 is being detected above a tappable object over the screen for more than a predetermined period (S 301 ). If the processor 2 has continuously received the signals representing hover above the location of the tappable object for more than the predetermined period, the processor 2 determines affirmatively.
- If hover is detected (S 301 : YES), the processor 2 generates and displays an assistant object at a location which is predetermined pixels away from the tappable object over the screen (S 302 ).
- the assistant object is an object for assisting the tappable object to be tapped.
- the assistant object can be, for example, generated by enlarging the tappable object.
- the processor 2 continuously determines whether or not hover above the tappable object is kept detected based on the signals from the sensitive display 3 (S 303 ).
- the processor 2 determines negatively if the sensitive display 3 stops detecting hover because, for example, the user has moved his/her finger away from the sensitive display 3 .
- the processor 2 determines affirmatively if the sensitive display 3 keeps detecting hover because, for example, the user has kept his/her finger in proximity above the tappable object.
- If detection of hover has stopped (S 303 : NO), the processor 2 stops displaying the assistant object (S 304 ). On the other hand, as long as detection of hover is kept (S 303 : YES), the processor 2 keeps displaying the assistant object until a predetermined time has lapsed (S 305 ).
- the processor 2 receives the user's tap on the tappable object through the sensitive display 3 (S 306 ).
- the processor 2 executes a given action associated with the tappable object in accordance with the instructions of the program (S 307 ).
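The assistant-object generation of S 302, where an enlarged copy of the tappable object is displayed predetermined pixels away, can be sketched as below. The dictionary layout, the horizontal offset, and the scale factor are illustrative assumptions; the disclosure only requires that the assistant be easier to read and placed away from the original.

```python
# Hypothetical sketch of S302: build an enlarged copy of a tappable
# object and place it a predetermined number of pixels away from the
# original, so the hovering finger does not obscure it. The dict-based
# object representation is an assumption for illustration.
def make_assistant_object(tappable, offset_px, scale=2.0):
    """Return an assistant object derived from the given tappable object."""
    return {
        "label": tappable["label"],          # same content, easier to read
        "x": tappable["x"] + offset_px,      # X pixels away from the original
        "y": tappable["y"],
        "w": int(tappable["w"] * scale),     # enlarged for readability
        "h": int(tappable["h"] * scale),
    }
```

In this fourth aspect the assistant is display-only: the user still taps the original tappable object (S 306), and the assistant merely shows what that tap would do.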
- FIGS. 37 to 40 illustrate an example describing how the assistant object is displayed in accordance with the above-mentioned display control while the WWW browser 5 b is being executed.
- the WWW page contains tappable text objects 31 to 33 and a tappable image object 30 .
- Each of the tappable text objects 31 to 33 consists of texts, whereas the tappable image object 30 consists of an image.
- Each of the tappable objects 30 to 33 is linked to another WWW page.
- the processor 2 receives taps on the tappable objects 30 to 33 through the sensitive display 3 to connect to the linked WWW page and display a screen of the linked WWW page.
- If the processor 2 detects hover of an object such as the user's finger 10 above one of the tappable objects 30 to 33 , for example the tappable text object 31 , for more than a predetermined period as depicted in FIG. 38 , the processor 2 generates an assistant object 41 for assisting the tappable text object 31 to be tapped. The processor 2 then displays the assistant object 41 near but predetermined pixels (for example, X pixels in FIG. 39 ) away from the original tappable text object 31 , as depicted in FIG. 39 .
- the processor 2 keeps displaying the assistant object 41 as long as hover of the finger 10 is kept detected above the tappable text object 31 .
- the processor 2 executes an action associated with the tappable text object 31 by receiving tap on the tappable text object 31 in accordance with the instructions of the WWW browser 5 b .
- the processor 2 can connect to another WWW page linked to the tappable text object 31 and display a screen of the linked WWW page.
- the processor 2 stops displaying the assistant object 41 if the predetermined time has lapsed or detection of hover has stopped because, for example, the user has moved the finger 10 away from the tappable text object 31 before reception of tap on the tappable text object 31 .
- If the processor 2 again detects hover above one of the tappable objects 30 to 33 , for example the tappable image object 30 because the user has moved the finger 10 from above the tappable text object 31 to above the tappable image object 30 , the processor 2 generates and displays an assistant object 42 for assisting the tappable image object 30 to be tapped, near but predetermined pixels (for example, X pixels in FIG. 40 ) away from the original tappable image object 30 , as depicted in FIG. 40 .
- the assistant object generated in accordance with the above display control may be embodied in various manners within the scope of its purpose of assisting the original tappable object to be tapped.
- the assistant object can be generated by copying the original tappable object, and enlarging the copied object, as depicted in FIGS. 41 and 42 .
- the assistant object can be a thumbnail or screenshot of a WWW page linked to the original tappable object as depicted in FIG. 43 .
- the above-mentioned display control can enhance usability in the user tapping on tappable objects that appear in the screen, and can avoid the user from erroneously tapping on the tappable objects. More specifically, even in case the tappable objects are displayed too small for the user to easily read or recognize what is written in the tappable object or what will occur upon tapping on the tappable object because the sensitive display 3 is tiny, the user can read or recognize what is written in the tappable object or what will occur upon tapping on the tappable object by taking a look at the assistant object. Accordingly, the user's erroneous tapping on the tappable objects against his/her intention can be avoided. Therefore, usability in operating the programs can be improved.
- FIG. 44 is a flowchart illustrating a fifth aspect of the display control in accordance with the display control program 5 f .
- the display control is executed while the screen of one of the programs 5 a to 5 g is displayed on the sensitive display 3 .
- the screen may contain one or more tappable objects, such as texts and images, associated with a given action for operation of the one of the programs 5 a to 5 g .
- a screen of the WWW browser 5 b , which is a WWW page, may contain a tappable object linked to another WWW page. Tapping on the graphical object may execute a given action, namely, connection to and display of the linked WWW page.
- While the screen of one of the programs 5 a to 5 g is displayed on the sensitive display 3 (S 310 ), the processor 2 continuously determines whether or not hover of an object such as the user's finger is being detected above a tappable object over the screen for more than a predetermined period (S 311 ). If the processor 2 has continuously received the signals representing hover at the location of the tappable object for more than the predetermined period, the processor 2 determines affirmatively.
- If hover is detected (S 311 : YES), the processor 2 generates and displays an assistant object at a location which is predetermined pixels away from the tappable object over the screen (S 312 ).
- the assistant object is an object for assisting the tappable object to be tapped.
- the assistant object is generated in a tappable form, and is associated with the same given action as the action originally associated with the tappable object. In other words, the same instructions as those originally assigned to the tappable object for executing the given action are also assigned to the assistant object.
- the processor 2 determines whether or not hover anywhere over the screen is kept continuously detected based on the signals from the sensitive display 3 (S 313 ).
- the processor 2 determines negatively if the sensitive display 3 stops detecting hover because, for example, the user has moved his/her finger away from the sensitive display 3 .
- the processor 2 determines affirmatively if the sensitive display 3 keeps detecting hover because, for example, the user has kept his/her finger in proximity anywhere over the screen.
- If detection of hover has stopped (S 313 : NO), the processor 2 stops displaying the assistant object (S 314 ). On the other hand, as long as detection of hover is kept (S 313 : YES), the processor 2 keeps displaying the assistant object until a predetermined time has lapsed (S 315 ).
- the processor 2 receives the user's tap on the assistant object through the sensitive display 3 (S 316 ).
- the processor 2 executes a given action associated with the assistant object, which corresponds to the action originally associated with the tappable object, in accordance with the instructions of the program (S 317 ).
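The difference between the fourth and fifth aspects is that here the assistant object is itself tappable and carries the same action as the original (S 312, S 316, S 317). That delegation can be sketched as below; the dictionary layout, callable-based action, and names are assumptions for illustration only.

```python
# Hypothetical sketch of S312 and S316/S317 in the fifth aspect: the
# assistant object copies the original object's action (e.g. its link),
# so tapping the assistant executes the same given action as tapping
# the original tappable object would.
def make_tappable_assistant(tappable, offset_px, scale=2.0):
    """Return a tappable assistant object sharing the original's action."""
    return {
        "label": tappable["label"],
        "x": tappable["x"] + offset_px,     # placed X pixels away (S312)
        "w": int(tappable["w"] * scale),    # enlarged, hence easier to tap
        "h": int(tappable["h"] * scale),
        "action": tappable["action"],       # same given action as original
    }

def on_tap(obj):
    # S317: execute the action associated with the tapped object.
    return obj["action"]()
```

Because the assistant shares the original's action, a tap on either object connects to the same linked WWW page, which is why the enlarged assistant improves usability on a tiny sensitive display.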
- FIGS. 45 to 48 illustrate an example describing how the assistant object is displayed in accordance with the above-mentioned display control while the WWW browser 5 b is being executed.
- the screen of the WWW browser 5 b , which is a WWW page, contains tappable text objects 31 to 33 and a tappable image object 30 .
- Each of the tappable objects 30 to 33 is linked to another WWW page.
- the processor 2 receives taps on the tappable objects 30 to 33 through the sensitive display 3 to connect to the linked WWW page and display a screen of the linked WWW page.
- If the processor 2 detects hover of an object such as the user's finger 10 above one of the tappable objects 30 to 33 , for example the tappable text object 31 , for more than a predetermined period as depicted in FIG. 46 , the processor 2 generates an assistant object 41 for the tappable text object 31 .
- This assistant object 41 is associated with the action originally associated with the tappable text object 31 , namely, has a link to the WWW page to which the tappable text object 31 is originally linked.
- the processor 2 displays the assistant object 41 near but predetermined pixels (for example, X pixels in FIG. 47 ) away from the original tappable text object 31 , as depicted in FIG. 47 .
- the processor 2 keeps displaying the assistant object 41 as long as hover of the finger 10 is kept anywhere over the screen.
- the processor 2 executes an action associated with the assistant object 41 , which corresponds to the action originally associated with the tappable text object 31 , by receiving tap on the assistant object 41 in accordance with the instructions of the WWW browser 5 b .
- the processor 2 can connect to another WWW page linked to the assistant object 41 , which corresponds to the WWW page originally linked to the tappable text object 31 , and display a screen of the linked WWW page.
- the processor 2 stops displaying the assistant object 41 if the predetermined time has lapsed or detection of hover has stopped because, for example, the user has moved the finger 10 away from the screen before reception of tap on the assistant object 41 .
- If the processor 2 again detects hover above one of the tappable objects 30 to 33 , for example the tappable image object 30 because the user has moved the finger 10 to approach the tappable image object 30 , the processor 2 generates and displays an assistant object 42 for the tappable image object 30 near but predetermined pixels (for example, X pixels in FIG. 48 ) away from the original tappable image object 30 , as depicted in FIG. 48 .
- the assistant object generated in accordance with the above display control may be embodied in various manners within the scope of its purpose of assisting the tappable object to be tapped.
- the assistant object can be generated by copying the original tappable object, and enlarging the copied object, as depicted in FIGS. 49 and 50 .
- the assistant object can be a thumbnail or screenshot of the WWW page linked to the assistant object, which corresponds to the WWW page originally linked to the original tappable object, such as the tappable object 31 , as depicted in FIG. 51 .
- the above-mentioned display control can enhance usability in the user executing a given action associated with tappable objects that appear in the screen. More specifically, even in case the tappable objects are displayed too small for the user to easily tap on the tappable object for executing the given action because the sensitive display 3 is tiny, the user can execute the given action easily by tapping on an assistant object instead of the original tappable object. Therefore, usability in operating the programs can be improved.
- a second embodiment is disclosed with reference to FIGS. 52 to 110 .
- a computing device 50 is operable in connection with a remote display device 51 that is physically separated from the computing device 50 .
- the computing device 50 can detect taps of an object such as a user's finger onto its local sensitive display 54 and hover of such an object in proximity over the sensitive display 54 .
- the computing device 50 controls display of a screen of a computer program executed in the computing device 50 based on communication between the computing device 50 and the remote display device 51 .
- FIGS. 52 and 53 depict the computing device 50 and the remote display device 51 .
- the computing device 50 is a multi-functional computing device suitable in size for mobility.
- the computing device 50 can be a cell phone, a tablet computer, a laptop computer, or another similar computing device.
- the computing device 50 has communication circuitry 55 for wirelessly communicating with communication circuitry 52 coupled to the remote display 51 .
- the remote display device 51 is a typical desktop display device suitable for use, for example, on a desk, a table, or in a living room.
- the size of the remote display device 51 can be 20 inches, 32 inches, 40 inches, 60 inches, and so on.
- the remote display device 51 is physically different from the computing device 50 .
- the computing device 50 and the remote display device 51 have their components and circuitry housed in housings different from each other.
- the remote display device 51 is coupled to the communication circuitry 52 for wirelessly communicating with the communication circuitry 55 .
- the communication circuitry 52 can be provided inside the remote display device 51 as depicted in FIG. 52 , or can be an external device attachable to the remote display device 51 by way of, for example, USB (Universal Serial Bus) or another interface as depicted in FIG. 53 .
- the communication circuitry 55 and 52 can communicate with each other in accordance with, for example, the Bluetooth (registered trademark of Bluetooth SIG, INC.) protocol, the FireWire (registered trademark of Apple Inc.) protocol, the WiMAX (registered trademark of WiMAX Forum Corporation) protocol, the wireless LAN (Local Area Network) protocol, or another wireless communication protocol.
- the communication circuitry 52 can receive video signals streamed through the communication circuitry 55 , and can output the received video signals to the remote display device 51 .
- the computing device 50 can send video signals to the remote display device 51 , thereby making the remote display device 51 display graphics or video represented by the video signals.
- FIG. 54 is a block diagram of the computing device 50 for illustrating the configuration of the computing device 50 in more detail.
- the computing device 50 mainly has a processor 53 , the sensitive display 54 , the communication circuitry 55 , and a memory 56 .
- the processor 53 generally processes instructions of computer programs stored in the memory 56 to execute the computer programs, so as to realize a variety of functions of the computing device 50 .
- the processor 53 can be a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a DSP (Digital Signal Processor), or another general or dedicated processor.
- the sensitive display 54 is a display device composed essentially of a display 54 a and a sensor 54 b .
- the display 54 a can be an LCD (Liquid Crystal Display), an EL (Electro-Luminescence) display, or one of other similar types of display devices.
- the display 54 a displays graphics and video in accordance with video signals sent from the processor 53 .
- the sensor 54 b is a sensor to distinctively detect (i) taps of one or more objects, such as a user's finger and a stylus, made onto the sensor 54 b and (ii) hover of such object made in proximity over the sensor 54 b .
- the sensor 54 b sends to the processor 53 signals representing (i) the location of detected tap as long as such tap is detected and (ii) the location of detected hover as long as such hover is detected.
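The signals described in the two preceding paragraphs can be modeled as a stream of typed events, one per sampling tick, for as long as the tap or hover persists. The following Python sketch is an illustrative model only; the class and function names are assumptions, not from the patent.

```python
# Illustrative model of the signals the sensor 54b sends to the processor 53:
# each signal distinctively reports its kind (tap vs. hover) together with the
# detected location, and signals keep flowing while detection continues.

from dataclasses import dataclass
from typing import Literal

@dataclass(frozen=True)
class SensorSignal:
    kind: Literal["tap", "hover"]  # distinctively detected by the sensor 54b
    x: int                         # X coordinate on the sensitive display
    y: int                         # Y coordinate on the sensitive display

def signals_while_detected(kind, locations):
    """Yield one signal per sampling tick while the tap/hover is detected."""
    for (x, y) in locations:
        yield SensorSignal(kind, x, y)
```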
- in other words, a tap may be a touch or a contact.
- the sensor 54 b detects gestures by (i) continuously detecting hover made in proximity above the sensor 54 b or (ii) continuously detecting a movement of the object while a tap is maintained on the sensor 54 b .
- the technologies of sensing of taps, hover, and/or gestures are disclosed, for example, in the U.S. patent publications Nos.
- the display 54 a and the sensor 54 b may be mechanically integrated together.
- the sensitive display 54 displays graphics and video as well as detects taps, hover, and gestures of an object like the user's finger or a stylus on or above the sensitive display 54 .
- the communication circuitry 55 is circuitry for wireless communication with the communication circuitry 52 .
- video signals are transmitted to the communication circuitry 52 through the communication circuitry 55 under control by the processor 53 .
- the communication circuitry 55 can communicate in accordance with the Bluetooth (registered trademark of Bluetooth SIG, INC.) protocol, the FireWire (registered trademark of Apple Inc.) protocol, the WiMAX (registered trademark of WiMAX Forum Corporation) protocol, the wireless LAN (Local Area Network) protocol, or another wireless communication protocol.
- the memory 56 is a memory device such as, for example, a flash memory, an EEPROM, an HDD (Hard Disk Drive), or another similar memory device.
- the memory 56 stores computer programs to be executed by the processor 53 .
- the memory 56 stores an OS (Operating System) 56 a , a WWW (World Wide Web) browser 56 b , a video game 56 c , a text editor 56 d , a media player 56 e , and a display control program 56 f .
- the WWW browser 56 b , the video game 56 c , the text editor 56 d , and the media player 56 e are typically application programs that run on OS 56 a .
- the programs 56 b to 56 e are often collectively referred to as application programs.
- the display control program 56 f can also run on the OS 56 a , or can be incorporated in the OS 56 a running as part of the OS 56 a.
- One or more of the application programs 56 b to 56 e are executed on the OS 56 a in response to the user's selection.
- the display control program 56 f is executed while the OS 56 a and/or one or more of the application programs 56 b to 56 e are executed.
- the processor 53 sends video signals to the sensitive display 54 in accordance with instructions of the OS 56 a , the application programs 56 b to 56 e , and/or the display control program 56 f .
- the sensitive display 54 displays graphics and video in accordance with the video signals.
- the graphics and the video to be displayed include screens, icons, and other graphical objects or contents.
- the screen may contain one or more tappable graphical objects or contents, such as an HTML (Hyper Text Markup Language) link, a text-input field, a software button, and a software keyboard.
- one or more icons representing one or more of the application programs 56 b to 56 e are displayed on the sensitive display 54 in accordance with the instruction of the OS 56 a .
- a screen of one of the application programs 56 b to 56 e is displayed on the sensitive display 54 in accordance with the instruction of the application programs 56 b to 56 e.
- when an object such as the user's finger hovers over the sensitive display 54 , the sensitive display 54 detects the hover and determines the location above which the hover is made over the sensitive display 54 . The sensitive display 54 continuously sends to the processor 53 signals representing the determined hover location during the hover detection.
- in other words, the location may be a position or X-Y coordinates.
- a gesture may be defined by continuous hover in a predetermined path.
- the sensitive display 54 may detect a gesture responsive to detecting left-to-right linear continuous hover or circular continuous hover. In this case, the sensitive display 54 may then send to the processor 53 a signal representing the detected gesture.
- when a tap of an object such as the user's finger is made onto the sensitive display 54 , the sensitive display 54 detects the tap and determines the location at which the tap is made within the sensitive display 54 . The sensitive display 54 then sends to the processor 53 a signal representing the determined tap location.
- in other words, the location may be a position or X-Y coordinates.
- a gesture may be defined by a movement of the object while a tap is once detected and maintained on the sensitive display 54 .
- the sensitive display 54 may detect a gesture responsive to detecting a left-to-right linear movement of the object or a circular movement of the object. In this case, the sensitive display 54 may then send to the processor 53 a signal representing the detected gesture.
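The gesture detection described above (a left-to-right linear movement, for instance) can be sketched as a simple check over the continuously sampled path. The thresholds and function name below are assumptions for illustration; the patent does not specify a detection algorithm.

```python
# Hedged sketch of gesture detection: report a "left-to-right" gesture when a
# path sampled during continuous hover (or a maintained tap) moves
# monotonically rightward far enough, with little vertical drift.

def detect_left_to_right(path, min_dx=50, max_dy=20):
    """path: list of (x, y) samples taken while hover/tap is maintained."""
    if len(path) < 2:
        return False
    xs = [p[0] for p in path]
    ys = [p[1] for p in path]
    rightward = all(b >= a for a, b in zip(xs, xs[1:]))  # monotonic in X
    long_enough = xs[-1] - xs[0] >= min_dx               # moved far enough
    flat_enough = max(ys) - min(ys) <= max_dy            # little vertical drift
    return rightward and long_enough and flat_enough
```

A circular-gesture detector would follow the same pattern, testing the sampled path against a circular template instead.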
- the processor 53 receives the signals from the sensitive display 54 . Based on the received signals, the processor 53 determines the location of the hover and tap within a screen displayed on the sensitive display 54 . The processor 53 then operates in response to the hover and tap in accordance with the instructions of the OS 56 a , the application programs 56 b to 56 e , and/or the display control program 56 f.
- if the processor 53 determines that a tap is made onto an icon representing the WWW browser 56 b , the processor 53 launches the WWW browser 56 b.
- if the processor 53 determines that a tap is made onto a text-input field, the processor 53 launches a software keyboard.
- if the processor 53 determines that hover is made anywhere above a screen displayed on the sensitive display 54 , the processor 53 generates a video signal representing an indicator indicative of the determined hover location if the communication circuitry 55 is active.
- the processor 53 activates or deactivates the communication circuitry 55 in response to the user's operation, for example, through the sensitive display 54 .
- an icon for activation or deactivation of the communication circuitry 55 is displayed on the sensitive display 54 .
- responsive to the user's tap on the icon, the processor 53 may activate the communication circuitry 55 if the communication circuitry 55 has not been active, and vice versa.
- Responsive to detection of a predetermined gesture made on or above the sensitive display 54 , the processor 53 may activate the communication circuitry 55 if the communication circuitry 55 has not been active, and vice versa.
- FIG. 55 is a flowchart illustrating a first aspect of the display control in accordance with the display control program 56 f.
- the processor 53 launches, namely, executes one of the application programs 56 b to 56 e in response to the user's selection (S 400 ).
- the selection is made by way of, for example, the user's tap on an icon representing any of the application programs 56 b to 56 e on the sensitive display 54 , and the processor 53 detects the tap on the icon.
- the processor 53 determines whether or not the communication circuitry 55 is active (S 401 ).
- the communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on the sensitive display 54 or a predetermined gesture on or above the sensitive display 54 .
- if the communication circuitry 55 is not active (S 401 : No), the processor 53 generates video signals representing a screen of the executed application program and sends the video signals to the sensitive display 54 (S 402 ). Accordingly, the screen of the executed application program is displayed on the sensitive display 54 .
- the screen of the executed application program may contain one or more tappable objects such as an HTML link, a text-input field, a software button, a software keyboard, and the like.
- the processor 53 does not generate video signals representing an indicator indicative of the location of hover even if such hover is detected by the sensitive display 54 .
- if the communication circuitry 55 is active (S 401 : Yes), the processor 53 generates video signals representing the screen of the executed application program and also generates video signals representing an indicator over the screen. The indicator indicates the hover location by being displayed, over the screen, at the location of hover detected by the processor 53 . The processor 53 then sends the generated video signals of the screen and the indicator to the communication circuitry 52 through the communication circuitry 55 (S 403 ).
- the screen of the executed application program and the indicator indicative of the location of the user's finger's hover over the screen are displayed on the remote display device 51 .
- the screen of the executed application program may contain one or more tappable objects such as an HTML link, a text-input field, a software button, and a software keyboard.
- the processor 53 operates in response to the user's tap on the tappable objects contained in the screen in accordance with the executed application program (S 404 ).
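The first aspect of the display control (FIG. 55, steps S 400 to S 404) can be summarized as a per-frame routing decision. The following Python sketch is an assumed structure for illustration only; the function name, the layer representation, and the destination strings are not from the patent.

```python
# Minimal sketch of FIG. 55: route the application screen to the local
# sensitive display when the communication circuitry is inactive (S 402), or
# to the remote display with a hover indicator composited over it when the
# circuitry is active (S 403).

def render_frame(screen, circuitry_active, hover_location=None):
    """Return (destination, layers) for one video frame.

    screen: opaque screen image of the executed application (S 400).
    circuitry_active: result of the S 401 check.
    hover_location: (x, y) of detected hover, or None if no hover.
    """
    if not circuitry_active:
        # S 402: local display; no indicator even if hover is detected
        return ("sensitive_display", [screen])
    layers = [screen]
    if hover_location is not None:
        # S 403: indicator drawn over the screen at the hover location
        layers.append(("indicator", hover_location))
    return ("remote_display", layers)
```

The second aspect (FIG. 56) follows the same structure with the OS 56 a screen in place of the application screen.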
- FIG. 56 is a flowchart illustrating a second aspect of the display control in accordance with the display control program 56 f.
- the processor 53 launches, namely, executes the OS 56 a in response to activating or powering on the computing device 50 (S 500 ).
- the activation is made, for example, by way of the user's turning on the computing device 50 .
- the processor 53 determines whether or not the communication circuitry 55 is active (S 501 ).
- the communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on the sensitive display 54 or a predetermined gesture on or above the sensitive display 54 .
- if the communication circuitry 55 is not active (S 501 : No), the processor 53 generates video signals representing a screen of the OS 56 a and sends the video signals to the sensitive display 54 (S 502 ). Accordingly, the screen of the OS 56 a is displayed on the sensitive display 54 .
- the screen of the OS 56 a may contain one or more tappable icons representing one or more of the application programs 56 b to 56 e.
- if the communication circuitry 55 is active (S 501 : Yes), the processor 53 generates video signals representing the screen of the OS 56 a and also generates video signals representing an indicator over the screen. The indicator indicates the hover location by being displayed, over the screen, at the location of hover detected by the processor 53 . The processor 53 then sends the generated video signals of the screen and the indicator to the communication circuitry 52 through the communication circuitry 55 (S 503 ).
- the screen of the OS 56 a and the indicator indicative of the location of the user's finger's hover over the screen are displayed on the remote display device 51 .
- the screen of the OS 56 a contains one or more tappable icons representing one or more of the application programs 56 b to 56 e.
- the processor 53 operates in response to the user's tap on the tappable icons contained in the screen in accordance with the OS 56 a (S 504 ).
- FIG. 57 is a flowchart illustrating a third aspect of the display control in accordance with the display control program 56 f.
- the display control of FIG. 57 is operated when the communication circuitry 55 becomes activated. While the communication circuitry 55 is not active, as mentioned above with reference to FIGS. 55 and 56 , the screen of the OS 56 a or one of the application programs 56 b to 56 e is displayed on the sensitive display 54 (S 402 , S 502 ).
- the processor 53 activates the communication circuitry 55 (S 600 ).
- the processor 53 then stops displaying the screen on the sensitive display 54 (S 601 ). More specifically, the processor 53 may stop sending the video signals of the screen to the sensitive display 54 .
- the processor 53 starts generating video signals representing an indicator indicative of the location of hover detected by the processor 53 .
- the processor 53 then starts sending to the remote display device 51 via the communication circuitry 55 video signals of the screen and the indicator (S 602 ).
- FIG. 58 is a flowchart illustrating a fourth aspect of the display control in accordance with the display control program 56 f.
- the display control of FIG. 58 is operated when the communication circuitry 55 becomes deactivated. While the communication circuitry 55 is active, as mentioned above with reference to FIGS. 55 and 56 , video signals representing the screen of the OS 56 a or one of the application programs 56 b to 56 e as well as an indicator indicative of the location of hover detected by the processor 53 are being sent to the remote display device 51 .
- the processor 53 deactivates the communication circuitry 55 (S 700 ).
- the processor 53 then stops sending the video signals of the screen and the indicator (S 701 ).
- the processor 53 may also stop generating the video signals representing the indicator.
- the processor 53 then starts displaying the screen, without the indicator, on the sensitive display 54 (S 702 ). More specifically, the processor 53 starts sending video signals of the screen to the sensitive display 54 .
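The third and fourth aspects (FIGS. 57 and 58) are mirror-image transitions: activation hands the screen off to the remote display and enables the indicator, while deactivation reverses both. A hedged sketch, with state names assumed for illustration:

```python
# Illustrative state machine for the handoff transitions of FIGS. 57 and 58.

class DisplayRouter:
    def __init__(self):
        self.circuitry_active = False
        self.local_screen_shown = True
        self.indicator_enabled = False

    def activate(self):
        """FIG. 57: S 600 to S 602."""
        self.circuitry_active = True
        self.local_screen_shown = False   # S 601: stop the local display
        self.indicator_enabled = True     # S 602: stream screen + indicator

    def deactivate(self):
        """FIG. 58: S 700 to S 702."""
        self.circuitry_active = False
        self.indicator_enabled = False    # S 701: stop indicator signals
        self.local_screen_shown = True    # S 702: resume the local display
```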
- FIGS. 59 and 60 illustrate how the screen is displayed if the executed application program is the WWW (World Wide Web) browser 56 b.
- WWW World Wide Web
- the screen of the WWW browser 56 b is displayed on the sensitive display 54 .
- An indicator indicative of hover is not generated and displayed over the screen even if such hover is detected by the sensitive display 54 .
- HTML links 60 and graphical buttons 61 contained in the screen are tappable by the user's finger 62 for operation of the WWW browser 56 b.
- the video signals of the screen of the WWW browser 56 b and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen at the remote display device 51 .
- FIGS. 61 and 62 illustrate how the screen is displayed if the executed application program is a video game 56 c.
- the screen of the video game 56 c is displayed on the sensitive display 54 .
- An indicator indicative of hover is not generated and displayed over the screen even if such hover is detected by the sensitive display 54 .
- Graphical buttons 64 contained in the screen are tappable by the user's finger 62 for operation of the video game 56 c.
- the video signals of the screen of the video game 56 c and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen.
- FIGS. 63 and 64 illustrate how the screen is displayed if the executed application program is the text editor 56 d.
- the screen of the text editor 56 d is displayed on the sensitive display 54 .
- An indicator indicative of hover is not generated and displayed over the screen even if such hover is detected by the sensitive display 54 .
- Graphical keyboard 65 contained in the screen is tappable for operation of the text editor 56 d , namely, for text inputting.
- the video signals of the screen of the text editor 56 d and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen.
- FIGS. 65 and 66 illustrate how the screen is displayed if the executed application program is the media player 56 e.
- the screen of the media player 56 e is displayed on the sensitive display 54 .
- An indicator indicative of hover is not generated and displayed over the screen even if such hover is detected by the sensitive display 54 .
- Thumbnails 66 of pictures or movies that appear in the screen are tappable for operation of the media player 56 e , namely, for displaying an enlarged picture corresponding to the tapped thumbnail at an area 67 or for playing a movie corresponding to the tapped thumbnail at the area 67 .
- the video signals of the screen of the media player 56 e and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen.
- FIGS. 67 and 68 illustrate how the screen is displayed when the OS 56 a is executed.
- the screen of the OS 56 a is displayed on the sensitive display 54 .
- An indicator indicative of hover is not generated and displayed over the screen even if such hover is detected by the sensitive display 54 .
- Icons 68 representing the application programs 56 b to 56 e that appear in the screen are tappable for operation of the OS 56 a , namely, for launching one of the application programs 56 b to 56 e corresponding to the tapped icon.
- the video signals of the screen of the OS 56 a and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen.
- the indicator 63 can be shaped and/or colored in any manner within the scope of its intention to indicate hover location.
- the indicator 63 can be shaped in the form of an arrow as depicted in FIG. 69 , or can be shaped in the form of a circle and colored translucently or transparently as depicted in FIG. 70 .
- the indicator 63 indicates, on the remote display device 51 , the location of the user's finger hovering over the sensitive display 54 . Therefore, the user can easily recognize where in the sensitive display 54 he/she should tap in order to tap the tappable objects 60 , 61 , 64 , 65 , 66 , or 68 within the screen while he/she is watching the screen displayed on the remote display device 51 .
- while the screen is displayed on the sensitive display 54 , the indicator 63 is not displayed over the screen because the user can see the finger hovering in proximity over the sensitive display 54 while watching the screen, and can therefore recognize where to tap without the indicator 63 .
- FIG. 71 is a flowchart illustrating a fifth aspect of the display control performed in accordance with the display control program 56 f.
- the processor 53 launches, namely, executes one of the application programs 56 b to 56 e in response to the user's selection (S 800 ).
- the selection is made by way of, for example, the user's tap on an icon representing one of the application programs 56 b to 56 e on the sensitive display 54 , and the processor 53 detects the tap on the icon.
- the processor 53 determines whether or not the communication circuitry 55 is active (S 801 ).
- the communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on the sensitive display 54 or a predetermined gesture on or above the sensitive display 54 .
- if the communication circuitry 55 is not active (S 801 : No), the processor 53 generates video signals representing a screen of the executed application program, and sends the video signals to the sensitive display 54 (S 802 ). Accordingly, the screen of the executed application program is displayed on the sensitive display 54 .
- the screen of the executed application program may contain one or more tappable objects such as an HTML link, a text-input field, a software button, a software keyboard, and the like.
- Each tappable object in a screen has a predefined default appearance such as, for example, size and color.
- each tappable object is displayed in its predefined default appearance.
- the processor 53 continuously determines whether or not hover is made above a tappable object in the displayed screen (S 803 ). The determination can be made by, for example, comparing the location of a tappable object with the location of hover detected by the processor 53 . If the processor 53 determines that hover is made above a tappable object (S 803 : Yes), the processor 53 emphasizes the tappable object (S 804 ). The emphasizing can be made by, for example, changing the predefined default appearance of the tappable object.
- the changing includes, without limitation, enlarging or zooming the predefined default size of the tappable object and highlighting the predefined default color of the tappable object.
- the processor 53 then generates video signals representing the screen of the executed application program with the emphasized tappable object, and sends the generated video signals to the communication circuitry 52 through the communication circuitry 55 (S 805 ).
- if the processor 53 determines that hover is not made above a tappable object (S 803 : No), the processor 53 does not perform the emphasizing.
- the screen of the executed application program is displayed on the remote display device 51 , with the tappable object emphasized while hover of the user's finger exists above the tappable object.
- the steps S 801 to S 805 may be continuously performed while an application program is executed.
- the processor 53 operates in response to the user's tap on the tappable objects contained in the screen in accordance with the executed application program (S 806 ).
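The emphasis step of the fifth aspect (S 803 to S 804) amounts to a hit test followed by an appearance change. The sketch below illustrates one assumed emphasizing method, enlarging the object about its center; the function name, rectangle representation, and scale factor are illustrative, not from the patent.

```python
# Illustrative hit-test sketch of S 803/S 804: a tappable object is drawn
# emphasized (here, enlarged) only while the detected hover location falls
# inside its bounds; otherwise it keeps its predefined default appearance.

def emphasized_rect(rect, hover, scale=1.5):
    """rect: (x, y, w, h) of the tappable object; hover: (x, y) or None."""
    x, y, w, h = rect
    inside = hover is not None and x <= hover[0] < x + w and y <= hover[1] < y + h
    if not inside:
        return rect                        # default appearance
    nw, nh = int(w * scale), int(h * scale)
    # enlarge around the object's center so it stays in place on screen
    return (x - (nw - w) // 2, y - (nh - h) // 2, nw, nh)
```

Highlighting the default color instead of resizing would follow the same structure, returning a changed color rather than a changed rectangle.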
- FIG. 72 is a flowchart illustrating a sixth aspect of the display control performed in accordance with the display control program 56 f.
- the processor 53 launches, namely, executes the OS 56 a in response to activating or powering on the computing device 50 (S 900 ).
- the activation is made, for example, by way of the user's turning on the computing device 50 .
- the processor 53 determines whether or not the communication circuitry 55 is active (S 901 ).
- the communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on the sensitive display 54 or a predetermined gesture on or above the sensitive display 54 .
- if the communication circuitry 55 is not active (S 901 : No), the processor 53 generates video signals representing a screen of the OS 56 a and sends the video signals to the sensitive display 54 (S 902 ). Accordingly, the screen of the OS 56 a is displayed on the sensitive display 54 .
- the screen of the OS 56 a may contain one or more tappable icons representing one or more of the application programs 56 b to 56 e . Each tappable icon in a screen has a predefined default appearance such as, for example, size and color. In S 902 , each tappable icon is displayed in its predefined default appearance.
- the processor 53 continuously determines whether or not hover is made above a tappable icon in the displayed screen (S 903 ). The determination can be made by, for example, comparing the location of a tappable icon with the location of hover detected by the processor 53 . If the processor 53 determines that hover is made above a tappable icon (S 903 : Yes), the processor 53 emphasizes the tappable icon (S 904 ). The emphasizing can be made by, for example, changing the predefined default appearance of the tappable icon.
- the changing includes, without limitation, enlarging or zooming the predefined default size of the tappable icon and highlighting the predefined default color of the tappable icon.
- the processor 53 then generates video signals representing the screen of the OS 56 a with the emphasized tappable icon, and sends the generated video signals to the communication circuitry 52 through the communication circuitry 55 (S 905 ).
- if the processor 53 determines that hover is not made above a tappable icon (S 903 : No), the processor 53 does not perform the emphasizing.
- the screen of the OS 56 a is displayed on the remote display device 51 , with the tappable icon emphasized when hover of the user's finger exists above the tappable icon.
- the steps S 901 to S 905 may be continuously performed while the OS 56 a is executed.
- the processor 53 operates in response to the user's tap on the tappable icons contained in the screen in accordance with the OS 56 a (S 906 ).
- FIG. 73 is a flowchart illustrating a seventh aspect of the display control in accordance with the display control program 56 f.
- the display control of FIG. 73 is operated when the communication circuitry 55 becomes activated. While the communication circuitry 55 is not active, as mentioned above with reference to FIGS. 71 and 72 , the screen of the OS 56 a or one of the application programs 56 b to 56 e is displayed on the sensitive display 54 (S 802 , S 902 ).
- the processor 53 activates the communication circuitry 55 (S 1000 ).
- the processor 53 then stops displaying the screen on the sensitive display 54 (S 1001 ). More specifically, the processor 53 may stop sending video signals of the screen to the sensitive display 54 .
- the processor 53 starts sending video signals of the screen to the remote display device 51 via the communication circuitry 55 (S 1002 ).
- the processor 53 also starts emphasizing the tappable object or icon above which hover is made as described in S 804 , S 805 , S 904 , and S 905 (S 1003 ).
- FIG. 74 is a flowchart illustrating an eighth aspect of the display control in accordance with the display control program 56 f.
- the display control of FIG. 74 is operated when the communication circuitry 55 becomes deactivated. While the communication circuitry 55 is active, as mentioned above with reference to FIGS. 71 and 72 , video signals representing the screen of the OS 56 a or one of the application programs 56 b to 56 e are being sent to the remote display device 51 . A tappable object or a tappable icon contained in the screen becomes emphasized if the processor 53 determines that hover exists above the tappable object or tappable icon.
- the processor 53 deactivates the communication circuitry 55 (S 1100 ).
- the processor 53 then stops sending the video signals of the screen (S 1101 ).
- the processor 53 also stops emphasizing the tappable object or icon even if hover is detected above the tappable object or icon (S 1102 ).
- the processor 53 then starts displaying the screen on the sensitive display 54 (S 1103 ). More specifically, the processor 53 starts sending video signals of the screen to the sensitive display 54 .
- FIGS. 75 and 76 illustrate how the screen is displayed if the executed application program is the WWW (World Wide Web) browser 56 b.
- HTML links 60 and graphical buttons 61 contained in the screen are tappable by the user's finger 62 for operation of the WWW browser 56 b .
- Each of the HTML links 60 and the graphical buttons 61 is displayed in its predefined default appearance.
- the processor 53 continuously determines whether or not hover is made above any of the HTML links 60 and the graphical buttons 61 . As illustrated in FIG. 76 , if hover is made above the HTML link 60 entitled “Today's topic”, the HTML link 60 entitled “Today's topic” is emphasized. Accordingly, the video signals of the screen of the WWW browser 56 b , with the HTML link entitled “Today's topic” being emphasized, are sent from the computing device 50 to the remote display device 51 as long as hover exists above the HTML link entitled “Today's topic”. The HTML link 60 entitled “Today's topic” is thus displayed in the emphasized appearance as long as such hover exists.
- the emphasizing may be, for example, enlarging the HTML link 60 entitled “Today's topic” as depicted in FIG. 76 .
- the emphasizing stops if hover no longer exists above the HTML link 60 entitled “Today's topic” because, for example, the user has moved his/her finger 62 away from above the HTML link 60 entitled “Today's topic”, and then the HTML link 60 entitled “Today's topic” is displayed in its predefined default appearance again.
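- The emphasize-on-hover, restore-on-exit behavior described above can be sketched as follows. The scale factor and attribute names are assumptions, since the source says only that the link is enlarged:

```python
# Minimal sketch of hover emphasis, assuming a scale-based appearance model.

DEFAULT_SCALE = 1.0
EMPHASIZED_SCALE = 1.5  # assumed enlargement factor; not given in the source

class TappableObject:
    """A tappable object (e.g. an HTML link) with a default appearance."""

    def __init__(self, label):
        self.label = label
        self.scale = DEFAULT_SCALE

    def set_hover(self, hovered):
        # Emphasize by enlarging while hover exists above the object;
        # restore the predefined default appearance when hover ends.
        self.scale = EMPHASIZED_SCALE if hovered else DEFAULT_SCALE
```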
- FIGS. 77 and 78 illustrate how the screen is displayed if the executed application program is a video game 56 c.
- buttons 64 contained in the screen are tappable by the user's finger 62 for operation of the video game 56 c.
- the processor 53 continuously determines whether or not hover is made above any of the graphical buttons 64 .
- the circular graphical button 64 is emphasized. Accordingly, the video signals of the screen of the video game 56 c , with the circular graphical button 64 being emphasized, are sent from the computing device 50 to the remote display device 51 as long as hover exists above the circular graphical button 64 .
- the circular graphical button 64 is thus displayed in the emphasized appearance as long as hover exists above the circular graphical button 64 .
- the emphasizing may be, for example, changing the size and the color of the circular graphical button 64 as illustrated in FIG. 78 .
- FIGS. 79 and 80 illustrate how the screen is displayed if the executed application program is the text editor 56 d.
- the screen of the text editor 56 d is displayed on the sensitive display 54 .
- The graphical keyboard 65 contained in the screen is tappable for operation of the text editor 56 d , namely, for text inputting.
- the graphical keyboard 65 is displayed with every key displayed in its predefined default appearance.
- the processor 53 continuously determines whether or not hover is made above any key of the graphical keyboard 65 .
- hover is made above a key, namely, a “V” key in the graphical keyboard 65
- the “V” key is emphasized.
- the video signals of the screen of the text editor 56 d , with the “V” key being emphasized, are sent from the computing device 50 to the remote display device 51 as long as hover exists above the “V” key.
- the graphical keyboard 65 is thus displayed with the “V” key displayed in the emphasized appearance as long as hover exists above the “V” key.
- the emphasizing may be, for example, enlarging the “V” key as illustrated in FIG. 80 .
- the emphasizing stops if hover does not exist because, for example, the user has moved his/her finger 62 away from above the “V” key, and then the “V” key is displayed in its predefined default appearance again.
- FIGS. 81 and 82 illustrate how the screen is displayed if the executed application program is the media player 56 e.
- the screen of the media player 56 e is displayed on the sensitive display 54 .
- Thumbnails 66 of pictures or movies that appear in the screen are tappable for operation of the media player 56 e , namely, for displaying an enlarged picture corresponding to the tapped thumbnail at an area 67 or for playing a movie corresponding to the tapped thumbnail at the area 67 .
- Each thumbnail 66 is displayed in its predefined default appearance.
- the processor 53 continuously determines whether or not hover is made above any of the thumbnails 66 . As illustrated in FIG. 82 , if hover is made above the upper thumbnail 66 , the upper thumbnail 66 is emphasized. Accordingly, the video signals of the screen of the media player 56 e , with the upper thumbnail 66 emphasized, are sent from the computing device 50 to the remote display device 51 as long as hover exists above the upper thumbnail 66 . The upper thumbnail 66 is thus displayed in the emphasized appearance as long as hover exists above the upper thumbnail 66 .
- the emphasizing may be, for example, changing the size and color of the upper thumbnail 66 as illustrated in FIG. 82 . The emphasizing stops if hover does not exist above the upper thumbnail 66 because, for example, the user has moved his/her finger 62 away from above the upper thumbnail 66 , and then the upper thumbnail 66 is displayed in its predefined default appearance again.
- FIGS. 83 and 84 illustrate how the screen is displayed when the OS 56 a is executed.
- the screen of the OS 56 a is displayed on the sensitive display 54 .
- Icons 68 representing the application programs 56 b to 56 e that appear in the screen are tappable for operation of the OS 56 a , namely, for launching one of the application programs 56 b to 56 e corresponding to the tapped icon.
- Each icon 68 is displayed in its predefined default appearance.
- the processor 53 continuously determines whether or not hover is made above any of the icons 68 .
- the “e” icon representing the WWW browser 56 b
- the “e” icon is emphasized.
- the video signals of the screen of the OS 56 a , with the “e” icon emphasized, are sent from the computing device 50 to the remote display device 51 as long as hover exists above the “e” icon.
- the “e” icon is thus displayed in the emphasized appearance as long as hover exists above the “e” icon.
- the emphasizing may be, for example, changing the size and shape of the “e” icon as illustrated in FIG. 84 .
- the emphasizing stops if hover does not exist because, for example, the user has moved his/her finger 62 away from above the “e” icon, and then the “e” icon is displayed in its predefined default appearance again.
- when the user enjoys the computer programs with his/her eyes on the screen displayed on the remote display device 51 with the communication circuitry 55 being active, a tappable object which is about to be tapped is emphasized. Therefore, the user can easily recognize where in the sensitive display 54 he/she should tap or how far he/she should move the finger in order to tap the tappable objects 60 , 61 , 64 , 65 , 66 , or 68 within the screen while watching the screen displayed on the remote display device 51 .
- FIG. 85 is a flowchart illustrating a ninth aspect of the display control in accordance with the display control program 56 f.
- the processor 53 launches, namely, executes one of the application programs 56 b to 56 e in response to the user's selection (S 1300 ).
- the selection is made by way of, for example, the user's tap on an icon representing any of the application programs 56 b to 56 e on the sensitive display 54 , and the processor 53 detects the tap on the icon.
- the processor 53 determines whether or not the communication circuitry 55 is active (S 1301 ).
- the communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on the sensitive display 54 or a predetermined gesture on or above the sensitive display 54 .
- the processor 53 determines whether or not a screen of the executed application program contains one or more tappable objects such as an HTML link, a text-input field, a software button, and a software keyboard (S 1303 ). If at least one tappable object is contained in the screen (S 1303 : Yes), the processor 53 generates video signals representing the screen of the executed application program and also generates video signals representing an indicator over the screen. The indicator indicates the hover location by being displayed at the location of hover detected by the processor 53 over the screen. The processor 53 then sends the generated video signals of the screen and the indicator to the communication circuitry 52 through the communication circuitry 55 (S 1305 ). In this manner, the screen of the executed application program and the indicator indicative of the location of, for example, the user's finger's hover over the screen are displayed on the remote display device 51 .
- if the processor 53 determines that no tappable object is contained in the screen (S 1303 : No), the processor 53 generates video signals representing the screen of the executed application program, but does not generate video signals representing the indicator even if hover is detected. The processor 53 then sends the generated video signals of the screen to the communication circuitry 52 through the communication circuitry 55 (S 1304 ).
- the processor 53 generates video signals representing a screen of the executed application program and sends the video signals to the sensitive display 54 (S 1302 ). Accordingly, the screen of the executed application program is displayed on the sensitive display 54 .
- the screen of the executed application program may contain one or more tappable objects such as an HTML link, a text-input field, a software button, a software keyboard, and the like.
- the processor 53 does not determine whether or not the screen contains one or more tappable objects, or generate video signals representing the indicator even if hover is detected by the sensitive display 54 .
- the processor 53 operates in response to the user's tap on the tappable objects contained in the screen in accordance with the executed application program (S 1306 ).
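- The branching of FIG. 85 (S 1301 to S 1305 ) can be sketched as a function that decides which layers are rendered. The function and parameter names below are assumptions used only for illustration:

```python
# Sketch of the FIG. 85 branching: the indicator is included only when the
# remote circuitry is active and the screen contains a tappable object.

def compose_layers(circuitry_active, has_tappable_objects, hover_detected):
    """Return the list of layers to render, per S1301-S1305 of FIG. 85."""
    if not circuitry_active:
        # S1302: display the screen on the sensitive display, no indicator.
        return ["screen"]
    if has_tappable_objects and hover_detected:
        # S1305: send the screen plus an indicator at the hover location.
        return ["screen", "indicator"]
    # S1304: send the screen only (no tappable object, or no hover detected).
    return ["screen"]
```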
- FIG. 86 is a flowchart illustrating a tenth aspect of the display control in accordance with the display control program 56 f.
- the display control of FIG. 86 is operated when the communication circuitry 55 becomes activated. While the communication circuitry 55 is not active, as mentioned above with reference to FIG. 85 , the screen of one of the application programs 56 b to 56 e is displayed on the sensitive display 54 (S 1302 ).
- the processor 53 activates the communication circuitry 55 (S 1400 ).
- the processor 53 then stops displaying the screen on the sensitive display 54 (S 1401 ). More specifically, the processor 53 may stop sending the video signals of the screen to the sensitive display 54 . The processor 53 may also turn off the sensitive display 54 .
- the processor 53 starts determining whether or not the screen contains at least one tappable object (S 1402 ).
- the processor 53 starts generating video signals based on the determination at S 1402 and sending the generated video signals to the communication circuitry 52 through the communication circuitry 55 (S 1403 ). Namely, the processor 53 generates video signals representing the screen and the indicator indicative of the location of hover detected by the processor 53 as long as the processor 53 determines that at least one tappable object is contained in the screen, whereas the processor 53 generates video signals representing the screen but does not generate video signals representing the indicator as long as the processor 53 determines that no tappable object is contained.
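- The activation sequence of FIG. 86 (S 1400 to S 1403 ) can be sketched as a state transition. The dictionary keys are assumptions used only for illustration:

```python
# Sketch of the S1400-S1403 activation sequence of FIG. 86.
# The state keys are illustrative assumptions, not from the source.

def on_activate(state):
    """Apply the activation sequence of FIG. 86 to a state dict."""
    state["circuitry_active"] = True    # S1400: activate the circuitry
    state["local_display"] = False      # S1401: stop the local display
    state["checking_tappables"] = True  # S1402: start tappable determination
    state["streaming_remote"] = True    # S1403: start sending video signals
    return state
```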
- FIG. 87 is a flowchart illustrating an eleventh aspect of the display control in accordance with the display control program 56 f.
- the display control of FIG. 87 is operated when the communication circuitry 55 becomes deactivated. While the communication circuitry 55 is active, as mentioned above with reference to FIG. 85 , video signals representing the screen only or representing the screen and the indicator are being sent to the remote display device 51 based on the determination whether or not the screen contains at least one tappable object.
- the processor 53 deactivates the communication circuitry 55 (S 1500 ).
- the processor 53 stops determining whether or not the screen contains at least one tappable object (S 1501 ) and also stops sending the video signals to the remote display device 51 (S 1502 ).
- the processor 53 then starts displaying the screen, without the indicator, on the sensitive display 54 (S 1503 ). More specifically, the processor 53 starts sending video signals of the screen only to the sensitive display 54 .
- FIGS. 88 through 90 illustrate how the screen is displayed if the executed application program is the WWW (World Wide Web) browser 56 b.
- the screen of the WWW browser 56 b is displayed on the sensitive display 54 .
- An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54 .
- Tappable objects of HTML links 60 and graphical buttons 61 may appear in the screen for operation of the WWW browser 56 b.
- the video signals of the screen and the indicator 63 indicative of hover location are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen at the remote display device 51 .
- the video signals of the screen only are sent from the computing device 50 to the remote display device 51 .
- the screen may contain no tappable object when, for example, the browser 56 b displays a movie streamed from a video streaming website in a full-screen manner.
- FIGS. 91 through 93 illustrate how the screen is displayed if the executed application program is a video game 56 c.
- the screen of the video game 56 c is displayed on the sensitive display 54 .
- An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54 .
- the screen may contain graphical buttons 64 to be tapped by the user's finger 62 for operation of the video game 56 c.
- while the communication circuitry 55 is active, as long as the screen contains the graphical buttons 64 , the video signals of the screen of the video game 56 c and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen.
- even while the communication circuitry 55 is active, as long as the screen contains none of the graphical buttons 64 , the video signals of the screen only are sent from the computing device 50 to the remote display device 51 .
- the screen contains none of the graphical buttons 64 when, for example, the video game 56 c can be operated by the user's gesture on or above the screen instead of tapping on the graphical buttons 64 .
- FIGS. 94 through 96 illustrate how the screen is displayed if the executed application program is the media player 56 e.
- the screen of the media player 56 e is displayed on the sensitive display 54 .
- An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54 .
- Thumbnails 66 of pictures or movies may appear in the screen for operation of the media player 56 e , namely, for displaying an enlarged picture corresponding to the tapped thumbnail at an area 67 or for playing a movie corresponding to the tapped thumbnail at the area 67 .
- while the communication circuitry 55 is active, as long as the screen contains the thumbnails 66 , the video signals of the screen of the media player 56 e and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen.
- even while the communication circuitry 55 is active, as long as the screen contains none of the thumbnails 66 , the video signals of the screen only are sent from the computing device 50 to the remote display device 51 .
- the screen contains none of the thumbnails 66 when, for example, the enlarged picture or movie is played and displayed at the area 67 in a full-screen manner.
- the indicator 63 can be shaped and/or colored in any manner, as long as it serves its purpose of indicating the hover location.
- the indicator 63 can be shaped to be a form of an arrow as depicted in FIG. 33A , or can be shaped to be a form of a circle and colored translucently or transparently as depicted in FIG. 70 .
- the indicator 63 does not appear on the remote display device 51 as long as the user does not need to recognize where to tap because no tappable object appears on the remote display device 51 .
- an indicator indicative of detected hover is displayed only when suited to the user's need. Therefore, usability in operation through the sensitive display 54 can be improved.
- FIG. 97 is a flowchart illustrating a twelfth aspect of the display control performed in accordance with the display control program 56 f.
- the processor 53 launches, namely, executes one of the application programs 56 b to 56 e in response to the user's selection (S 1600 ).
- the selection is made by way of, for example, the user's tap on an icon representing the application programs 56 b to 56 e on the sensitive display 54 , and the processor 53 detects the tap on the icon.
- the processor 53 determines whether or not the communication circuitry 55 is active (S 1601 ).
- the communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on the sensitive display 54 or a predetermined gesture on or above the sensitive display 54 .
- the processor 53 continuously determines whether or not hover is made above a tappable object in the displayed screen (S 1602 ). The determination can be made by, for example, comparing the location of a tappable object with the location of hover detected by the processor 53 . If the processor 53 determines that hover is made above a tappable object (S 1602 : Yes), the processor 53 emphasizes the tappable object (S 1603 ). The emphasizing can be made by, for example, changing the predefined default appearance of the tappable object.
- the changing includes, without limitation, enlarging the tappable object beyond its predefined default size and highlighting it in a color different from its predefined default color.
- the processor 53 displays the screen of the executed application program, with the emphasized tappable object, on the sensitive display 54 (S 1604 ).
- if the processor 53 determines that hover is not made above a tappable object (S 1602 : No), the processor 53 does not perform the emphasizing and displays the screen on the sensitive display 54 (S 1604 ).
- the screen of the executed application program is displayed on the sensitive display 54 , with the tappable object emphasized while hover of the user's finger exists above the tappable object.
- the steps S 1602 to S 1604 may be continuously performed while an application program is executed as long as the communication circuitry is not active.
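- The S 1602 determination, comparing the location of a tappable object with the location of hover, can be sketched as a point-in-rectangle hit test. The bounds representation below is an assumption for illustration:

```python
# Sketch of the S1602 determination as a point-in-rectangle hit test.
# The (left, top, width, height) bounds format is an assumed representation.

def hover_target(tappables, hover_x, hover_y):
    """Return the first tappable object whose bounds contain the hover
    point, or None when hover is not above any tappable object (S1602: No)."""
    for obj in tappables:
        x, y, w, h = obj["bounds"]
        if x <= hover_x < x + w and y <= hover_y < y + h:
            return obj
    return None
```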
- if the communication circuitry 55 is active (S 1601 : Yes), the processor 53 generates video signals representing the screen and also generates video signals representing an indicator over the screen.
- the indicator indicates the hover location by being displayed at the location of hover detected by the processor 53 over the screen.
- the processor 53 then sends the generated video signals of the screen and the indicator to the communication circuitry 52 through the communication circuitry 55 (S 1605 ). Accordingly, the screen and the indicator are displayed on the remote display device 51 .
- the processor 53 operates in response to the user's tap on the tappable objects contained in the screen in accordance with the executed application program (S 1606 ).
- FIG. 98 is a flowchart illustrating a thirteenth aspect of the display control performed in accordance with the display control program 56 f.
- the processor 53 launches, namely, executes the OS 56 a in response to activating or powering on the computing device 50 (S 1700 ).
- the activation is made, for example, by way of the user's turning on the computing device 50 .
- the processor 53 determines whether or not the communication circuitry 55 is active (S 1701 ).
- the communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on the sensitive display 54 or a predetermined gesture on or above the sensitive display 54 .
- the processor 53 continuously determines whether or not hover is made above a tappable object in the displayed screen (S 1702 ). The determination can be made by, for example, comparing the location of a tappable object with the location of hover detected by the processor 53 . If the processor 53 determines that hover is made above a tappable object (S 1702 : Yes), the processor 53 emphasizes the tappable object (S 1703 ). The emphasizing can be made by, for example, changing the predefined default appearance of the tappable object.
- the changing includes, without limitation, enlarging the tappable object beyond its predefined default size and highlighting it in a color different from its predefined default color.
- the processor 53 then displays the screen of the OS 56 a , with the emphasized tappable object, on the sensitive display 54 (S 1704 ).
- if the processor 53 determines that hover is not made above a tappable object (S 1702 : No), the processor 53 does not perform the emphasizing and displays the screen on the sensitive display 54 (S 1704 ).
- the screen of the OS 56 a is displayed on the sensitive display 54 , with the tappable object emphasized while hover of the user's finger exists above the tappable object.
- the steps S 1702 to S 1704 may be continuously performed while the OS 56 a is executed as long as the communication circuitry is not active.
- if the communication circuitry 55 is active (S 1701 : Yes), the processor 53 generates video signals representing the screen and also generates video signals representing an indicator over the screen. The indicator indicates the hover location by being displayed at the location of hover detected by the processor 53 over the screen. The processor 53 then sends the generated video signals of the screen and the indicator to the communication circuitry 52 through the communication circuitry 55 (S 1705 ). Accordingly, the screen and the indicator are displayed on the remote display device 51 .
- the processor 53 operates in response to the user's tap on the tappable objects contained in the screen in accordance with the OS 56 a (S 1706 ).
- FIG. 99 is a flowchart illustrating a fourteenth aspect of the display control in accordance with the display control program 56 f.
- the display control of FIG. 99 is operated when the communication circuitry 55 becomes activated. While the communication circuitry 55 is not active, as mentioned above with reference to FIGS. 97 and 98 , the screen of the OS 56 a or one of the application programs 56 b to 56 e is displayed on the sensitive display 54 (S 1604 , S 1704 ).
- the processor 53 activates the communication circuitry 55 (S 1800 ).
- the processor 53 stops determining whether or not hover is made above a tappable object (S 1801 ), stops the emphasizing of tappable objects (S 1802 ), and also stops displaying the screen on the sensitive display 54 (S 1803 ). More specifically, the processor 53 may also stop sending video signals of the screen to the sensitive display 54 .
- the processor 53 starts generating video signals representing the indicator indicative of the location of hover detected by the sensitive display 54 , and starts sending the video signals representing the screen and the indicator to the remote display device 51 via the communication circuitry 55 (S 1804 ).
- FIG. 100 is a flowchart illustrating a fifteenth aspect of the display control in accordance with the display control program 56 f.
- the display control of FIG. 100 is operated when the communication circuitry 55 becomes deactivated. While the communication circuitry 55 is active, as mentioned above with reference to FIGS. 97 and 98 , video signals representing the screen of the OS 56 a or one of the application programs 56 b to 56 e as well as the indicator are being sent to the remote display device 51 .
- the processor 53 deactivates the communication circuitry 55 (S 1900 ).
- the processor 53 then stops sending the video signals of the screen and the indicator (S 1901 ).
- the processor 53 may also stop generating the video signals representing the indicator.
- the processor 53 then starts determining whether or not hover is made above a tappable object contained in the screen (S 1902 ), and starts emphasizing the tappable object responsive to determination that hover is made above the tappable object (S 1903 ). Accordingly, the tappable object contained in the screen becomes emphasized if the processor 53 determines that hover exists above the tappable object.
- the processor 53 then displays the screen on the sensitive display 54 (S 1904 ). More specifically, the processor 53 starts sending video signals of the screen to the sensitive display 54 .
- FIGS. 101 and 102 illustrate how the screen is displayed if the executed application program is the WWW (World Wide Web) browser 56 b.
- HTML links 60 and graphical buttons 61 contained in the screen are tappable by the user's finger 62 for operation of the WWW browser 56 b .
- Each of the HTML links 60 and the graphical buttons 61 is displayed in its predefined default appearance. As illustrated in FIG. 101 , if hover of an object such as the user's finger 62 is made above the HTML link 60 a labeled “Today's topic”, the HTML link 60 a is emphasized.
- the video signals of the screen of the WWW browser 56 b and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen at the remote display device 51 .
- FIGS. 103 and 104 illustrate how the screen is displayed if the executed application program is a video game 56 c.
- the screen of the video game 56 c is displayed on the sensitive display 54 .
- An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54 .
- Graphical buttons 64 contained in the screen are tappable by the user's finger 62 for operation of the video game 56 c .
- the graphical button 64 a is emphasized.
- the video signals of the screen of the video game 56 c and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen.
- FIGS. 105 and 106 illustrate how the screen is displayed if the executed application program is the text editor 56 d.
- the screen of the text editor 56 d is displayed on the sensitive display 54 .
- An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54 .
- The graphical keyboard 65 contained in the screen is tappable for operation of the text editor 56 d , namely, for text inputting.
- FIG. 105 if hover of an object such as the user's finger 62 is made above a “V” key of the graphical keyboard 65 , the “V” key is emphasized.
- the video signals of the screen of the text editor 56 d and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen.
- FIGS. 107 and 108 illustrate how the screen is displayed if the executed application program is the media player 56 e.
- the screen of the media player 56 e is displayed on the sensitive display 54 .
- An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54 .
- Thumbnails 66 of pictures or movies that appear in the screen are tappable for operation of the media player 56 e , namely, for displaying an enlarged picture corresponding to the tapped thumbnail at an area 67 or for playing a movie corresponding to the tapped thumbnail at the area 67 .
- the thumbnail 66 a is emphasized.
- the video signals of the screen of the media player 56 e and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen.
- FIGS. 109 and 110 illustrate how the screen is displayed when the OS 56 a is executed.
- the screen of the OS 56 a is displayed on the sensitive display 54 .
- An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54 .
- Icons 68 representing the application programs 56 b to 56 e that appear in the screen are tappable for operation of the OS 56 a , namely, for launching one of the application programs 56 b to 56 e corresponding to the tapped icon.
- hover of an object such as the user's finger 62 is made above the icon 68 representing the browser 56 b
- the icon 68 is emphasized.
- the video signals of the screen of the OS 56 a and the indicator 63 are sent from the computing device 50 to the remote display device 51 . Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen.
- the indicator 63 can be shaped and/or colored in any manner, as long as it serves its purpose of indicating the hover location.
- the indicator 63 can be shaped to be a form of an arrow as depicted in FIG. 69 , or can be shaped to be a form of a circle and colored translucently or transparently as depicted in FIG. 70 .
- the indicator 63 indicates, on the remote display device 51 , the location of his/her finger hovering over the sensitive display 54 . Therefore, the user can easily recognize where in the sensitive display 54 he/she should tap on in order to tap the tappable objects 60 , 61 , 64 , 65 , 66 , or 68 within the screen while he/she is keeping watching the screen displayed on the remote display device 51 .
- the indicator 63 is not displayed over the screen because the user can recognize where to tap without the indicator 63 ; he/she can see the finger hovering in proximity over the sensitive display 54 while watching the screen displayed on the sensitive display 54 .
- the processor 53 controls display of tappable graphical keys or software keys for text input depending on whether the communication circuitry 55 is active or not.
- FIG. 111 is a flowchart illustrating the sixteenth aspect of the display control in accordance with the display control program 56 f.
- the processor 53 continuously determines the location at which a tap is made within a screen based on signals representing taps sent from the sensitive display 54 .
- the processor 53 then continuously determines whether or not a tap is made at a text input field in the screen.
- the text input field is associated with one or more tappable software keys for text input.
- the processor 53 determines whether or not the communication circuitry 55 is active (S 1201 ).
- if the communication circuitry 55 is not active (S 1201 : No), namely, when the screen is displayed on the sensitive display 54 according to the first to fifteenth aspects of the display control, the processor 53 generates video signals representing one or more tappable software keys for text input with a first size and sends the video signals to the sensitive display 54 (S 1202 ).
- the video signals representing the tappable software keys may be sent in parallel with or along with video signals representing the screen.
- the first size is smaller than the size of the screen. Accordingly, the one or more software keys are displayed overlappingly along with the screen on the sensitive display 54 .
- If the communication circuitry 55 is active (S 1201 : Yes), namely, when the screen is displayed on the remote display device 51 according to the first to fifteenth aspects of the display control, the processor 53 generates video signals representing one or more tappable software keys for text input with a second size and sends the video signals to the sensitive display 54 (S 1203 ).
- the second size is larger than the first size, and can preferably be a full-screen size. Accordingly, the one or more software keys are displayed alone on the sensitive display 54 .
- the processor 53 receives text inputs in response to the user's tap on the tappable software keys (S 1204 ).
- the processor 53 stops sending the video signals of the tappable software keys to the sensitive display 54 (S 1205 , S 1206 ).
- the processor 53 may turn off the sensitive display 54 because there is no longer anything to be displayed on the sensitive display 54 until a tap on the text input field is detected again.
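The keyboard-sizing branch of steps S 1201 to S 1203 described above can be sketched as follows. This is an illustrative assumption, not the disclosure's code: the function name and the pixel sizes are hypothetical, chosen only to show that the second size is larger than the first (and can be full-screen).

```python
# Hypothetical sketch of S1201-S1203: choose the software-keyboard size
# depending on whether the communication circuitry (to the remote
# display device) is active. Sizes are illustrative pixel values.

FIRST_SIZE = (480, 160)    # small keyboard overlaid on the local screen
SECOND_SIZE = (480, 800)   # (near) full-screen keyboard while the
                           # screen is shown on the remote display device

def keyboard_video_signal(communication_active):
    """Return a description of the keyboard video signal to send."""
    size = SECOND_SIZE if communication_active else FIRST_SIZE
    return {"object": "software_keys", "size": size}
```

When the remote display is in use, the local sensitive display is free, so the keyboard can occupy it entirely; otherwise the keyboard must share the display with the screen.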
- FIGS. 112 to 115 depict how the tappable software keys are displayed when the screen of the WWW browser 56 b is displayed.
- the screen of the WWW browser 56 b contains a text input field 70 at which a URL (Uniform Resource Locator) can be input by the user.
- the screen is displayed on the sensitive display 54 in accordance with the first to fifteenth aspects of the display control, as depicted in FIG. 112 .
- the processor 53 determines whether the communication circuitry 55 is active. If the communication circuitry 55 is determined not to be active, the processor 53 generates and sends video signals of a graphical keyboard 71 with the first size to the sensitive display 54 .
- the graphical keyboard 71 includes plural tappable alphabetical keys for text input. Accordingly, the graphical keyboard 71 is displayed along with the screen of the WWW browser 56 b as depicted in FIG. 113 . In this situation, the user can perform text inputs through the graphical keyboard 71 while watching the screen on the sensitive display 54 although the graphical keyboard 71 is relatively small.
- the screen of the WWW browser 56 b is displayed on the remote display device 51 in accordance with the first to fifteenth aspects of the display control, as depicted in FIG. 114 .
- the processor 53 determines whether the communication circuitry 55 is active. If the communication circuitry 55 is determined to be active, the processor 53 generates and sends video signals of a graphical keyboard 71 with the second size to the sensitive display 54 . Accordingly, the graphical keyboard 71 is displayed alone, occupying substantially the entire sensitive display 54 , apart from the screen of the WWW browser 56 b displayed on the remote display device 51 , as depicted in FIG. 115 . In this situation, the user can perform text inputs easily because the displayed graphical keyboard 71 is relatively large.
- the sixteenth aspect of the display control begins when a tap on a text input field contained in the screen is detected (S 1200 ). Instead, the sixteenth aspect of the display control may begin when hover is detected in proximity above the text input field over the screen for more than a predetermined period.
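The alternative hover-dwell trigger mentioned above can be sketched as follows. The function name, the sample format, and the one-second threshold are all hypothetical assumptions for illustration.

```python
# Hypothetical sketch: begin the keyboard flow once hover has remained
# above the text input field longer than a threshold (in seconds).

HOVER_DWELL_THRESHOLD = 1.0

def should_begin(hover_samples, threshold=HOVER_DWELL_THRESHOLD):
    """hover_samples: list of (timestamp, over_field) pairs, oldest
    first, where over_field is True while hover is above the field.
    Returns True once hover has stayed over the field for `threshold`."""
    dwell_start = None
    for t, over_field in hover_samples:
        if over_field:
            if dwell_start is None:
                dwell_start = t        # dwell begins
            if t - dwell_start >= threshold:
                return True            # dwell long enough: trigger
        else:
            dwell_start = None         # hover left the field: reset
    return False
```

Resetting the dwell timer whenever hover leaves the field avoids triggering on a finger that merely passes over the field.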
- the video signals generated by the computing devices 1 and 50 may be analog video signals or digital video signals. Also, the video signals may be non-encoded or encoded pursuant to a protocol such as MPEG (Moving Picture Experts Group).
Abstract
A user interface at a computing device with a sensitive display is disclosed. A dialog is popped up, on the sensitive display, at a position that is predetermined pixels away from the position of hover or a tap of an object detected by the sensitive display. A menu for operation of a computer program is displayed responsive to such hover being detected above a predetermined location within a screen of the computer program on the sensitive display. An assistance object for assisting a tappable object to be tapped is displayed responsive to such hover being detected above the tappable object on the sensitive display. A tappable object is emphasized responsive to such hover being detected above the tappable object on the sensitive display.
Description
- This application claims the benefit of U.S. provisional patent application 61/584,850 filed on Jan. 10, 2012 and entitled “Display Control for use in Handheld Computing Device through Sensitive Display”, the content of which is incorporated herein by reference in its entirety. - The present disclosure relates to a user interface for use in a computing device with a sensitive display.
- Computing devices such as PCs (Personal Computers), laptops, mobile phones, and tablets have been marketed. Such a computing device nowadays has one or more large memories and one or more powerful processors, thereby becoming multi-functional, with various computer programs executable on the device. Such computer programs may include, for example, a text editor, a WWW (World Wide Web) browser, and video games.
- The multi-functional computing device is to be welcomed but, at the same time, may require complicated operations from users, thereby making usability worse. Improvement in the user interface of such a computing device is of great concern for making usability better. For example, the user interface is improved by use of a sensitive display. The sensitive display typically detects a contact or a tap of an object, like a user's finger, onto the surface of the sensitive display, as well as displays graphics. The sensitive display may be advantageous in that it may enable intuitive and easy operations for users. Further improvement in a user interface through the sensitive display has been sought.
- Usability may be made worse when the multi-functional computing device is a handheld device that is small for mobility and is provided with a tiny display and a tiny loudspeaker. For example, in playing video with such a computing device, the video displayed on its local display is so small that the user may wish to enjoy the video on a larger remote or external display device. In another example, in playing music with such a computing device, the sound output through its local loudspeaker is so unsatisfactory that the user may wish to enjoy the music through a larger remote or external loudspeaker. To do so, it is advantageous if the multi-functional device can communicate with a remote media-playing device such as a remote display device or a remote loudspeaker. Thus, improvement in a user interface for use in a computing device in connection with a remote media-playing device is also of great concern.
- A first aspect of the present invention is a method of a user interface for use in a computing device with a sensitive display. According to the first aspect, a tap of an object such as a user's finger onto the sensitive display and hover of the object in proximity over the sensitive display are detected. Within a screen of a computer program executed in the computing device, a popup dialog is displayed at a location determined based on the detected tap or hover.
- A second aspect of the present invention is a method of a user interface for use in a computing device with a sensitive display. According to the second aspect, hover of an object such as a user's finger in proximity over the sensitive display is detected. Responsive to hover detected above a predetermined area within a screen of a computer program displayed on the sensitive display, a menu is displayed.
- A third aspect of the present invention is a method of a user interface for use in a computing device with a sensitive display. According to the third aspect, hover of an object such as a user's finger in proximity over the sensitive display is detected. Responsive to hover detected above a tappable object within a screen of a computer program displayed on the sensitive display, an assistant object is displayed for assisting the tappable object to be tapped.
- A fourth aspect of the present invention is a method of a user interface for use in a computing device that is provided with a sensitive display and is operable in connection with a remote display device. According to the fourth aspect, hover of an object such as a user's finger in proximity over the sensitive display is detected. Video signals representing a screen of a computer program executed in the computing device can be sent to the remote display device. Also, video signals representing an indicator indicative of the detected hover can be sent to the remote display device.
- A fifth aspect of the present invention is a method of a user interface for use in a computing device that is provided with a sensitive display and is operable in connection with a remote display device. According to the fifth aspect, a screen of a computer program executed in the computing device can be changed according to whether or not communication is active between the computing device and the remote display device.
- The word “tappable” used in this application means capable of being tapped. For example, a tappable object means an object which is to be tapped, or which a user can tap on.
-
FIG. 1 illustrates a front view of a computing device according to a first embodiment. -
FIG. 2 is a block diagram illustrating means and/or circuitry provided in a computing device according to a first embodiment. -
FIG. 3 is a flowchart illustrating operations performed by a computing device according to a first aspect of the first embodiment. -
FIG. 4 illustrates how the location for a notification dialog is determined based on hover according to the first aspect of the first embodiment. -
FIGS. 5, 6, 7, and 8 illustrate how the location for a notification dialog is determined based on hover according to the first aspect of the first embodiment. -
FIGS. 9, 10, 11, and 12 illustrate how a notification dialog is displayed according to the first aspect of the first embodiment. -
FIGS. 13, 14, 15, and 16 illustrate how a notification dialog is displayed according to the first aspect of the first embodiment. -
FIG. 17 is a flowchart illustrating operations performed by a computing device according to a second aspect of the first embodiment. -
FIG. 18 illustrates how the location for a notification dialog is determined based on tap according to the second aspect of the first embodiment. -
FIGS. 19, 20, 21, and 22 illustrate how the location for a notification dialog is determined based on tap according to the second aspect of the first embodiment. -
FIGS. 23, 24, 25, and 26 illustrate how a notification dialog is displayed according to the second aspect of the first embodiment. -
FIGS. 27, 28, 29, and 30 illustrate how a notification dialog is displayed according to the second aspect of the first embodiment. -
FIG. 31 is a flowchart illustrating operations performed by a computing device according to a third aspect of the first embodiment. -
FIGS. 32, 33, 34, and 35 illustrate how a menu is displayed according to the third aspect of the first embodiment. -
FIG. 36 is a flowchart illustrating operations performed by a computing device according to a fourth aspect of the first embodiment. -
FIGS. 37, 38, 39, and 40 illustrate how an assistant object is displayed according to the fourth aspect of the first embodiment. -
FIGS. 41, 42, and 43 illustrate how an assistant object is displayed according to the fourth aspect of the first embodiment. -
FIG. 44 is a flowchart illustrating operations performed by a computing device according to a fifth aspect of the first embodiment. -
FIGS. 45, 46, 47, and 48 illustrate how an assistant object is displayed according to the fifth aspect of the first embodiment. -
FIGS. 49, 50, and 51 illustrate how an assistant object is displayed according to the fifth aspect of the first embodiment. -
FIG. 52 illustrates a system including a computing device and a remote display device according to a second embodiment. -
FIG. 53 illustrates a system including a computing device and a remote display device according to the second embodiment. -
FIG. 54 is a block diagram illustrating means and/or circuitry provided in a computing device according to the second embodiment. -
FIG. 55 is a flowchart illustrating operations performed by a computing device according to a first aspect of the second embodiment. -
FIG. 56 is a flowchart illustrating operations performed by a computing device according to a second aspect of the second embodiment. -
FIG. 57 is a flowchart illustrating operations performed by a computing device according to a third aspect of the second embodiment. -
FIG. 58 is a flowchart illustrating operations performed by a computing device according to a fourth aspect of the second embodiment. -
FIGS. 59 and 60 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment. -
FIGS. 61 and 62 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment. -
FIGS. 63 and 64 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment. -
FIGS. 65 and 66 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment. -
FIGS. 67 and 68 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment. -
FIGS. 69 and 70 illustrate how an indicator is displayed according to the first to fourth aspects of the second embodiment. -
FIG. 71 is a flowchart illustrating operations performed by a computing device according to a fifth aspect of the second embodiment. -
FIG. 72 is a flowchart illustrating operations performed by a computing device according to a sixth aspect of the second embodiment. -
FIG. 73 is a flowchart illustrating operations performed by a computing device according to a seventh aspect of the second embodiment. -
FIG. 74 is a flowchart illustrating operations performed by a computing device according to an eighth aspect of the second embodiment. -
FIGS. 75 and 76 illustrate how a tappable object is emphasized according to the fifth to eighth aspects of the second embodiment. -
FIGS. 77 and 78 illustrate how a tappable object is emphasized according to the fifth to eighth aspects of the second embodiment. -
FIGS. 79 and 80 illustrate how a tappable object is emphasized according to the fifth to eighth aspects of the second embodiment. -
FIGS. 81 and 82 illustrate how a tappable object is emphasized according to the fifth to eighth aspects of the second embodiment. -
FIGS. 83 and 84 illustrate how a tappable object is emphasized according to the fifth to eighth aspects of the second embodiment. -
FIG. 85 is a flowchart illustrating operations performed by a computing device according to a ninth aspect of the second embodiment. -
FIG. 86 is a flowchart illustrating operations performed by a computing device according to a tenth aspect of the second embodiment. -
FIG. 87 is a flowchart illustrating operations performed by a computing device according to an eleventh aspect of the second embodiment. -
FIGS. 88 through 90 illustrate how an indicator is displayed according to the ninth to eleventh aspects of the second embodiment. -
FIGS. 91 through 93 illustrate how an indicator is displayed according to the ninth to eleventh aspects of the second embodiment. -
FIGS. 94 through 96 illustrate how an indicator is displayed according to the ninth to eleventh aspects of the second embodiment. -
FIG. 97 is a flowchart illustrating operations performed by a computing device according to a twelfth aspect of the second embodiment. -
FIG. 98 is a flowchart illustrating operations performed by a computing device according to a thirteenth aspect of the second embodiment. -
FIG. 99 is a flowchart illustrating operations performed by a computing device according to a fourteenth aspect of the second embodiment. -
FIG. 100 is a flowchart illustrating operations performed by a computing device according to a fifteenth aspect of the second embodiment. -
FIGS. 101 and 102 illustrate how an indicator is displayed and how a tappable object is emphasized according to the twelfth to fifteenth aspects of the second embodiment. -
FIGS. 103 and 104 illustrate how an indicator is displayed and how a tappable object is emphasized according to the twelfth to fifteenth aspects of the second embodiment. -
FIGS. 105 and 106 illustrate how an indicator is displayed and how a tappable object is emphasized according to the twelfth to fifteenth aspects of the second embodiment. -
FIGS. 107 and 108 illustrate how an indicator is displayed and how a tappable object is emphasized according to the twelfth to fifteenth aspects of the second embodiment. -
FIGS. 109 and 110 illustrate how an indicator is displayed and how a tappable object is emphasized according to the twelfth to fifteenth aspects of the second embodiment. -
FIG. 111 is a flowchart illustrating operations performed by a computing device according to a sixteenth aspect of the second embodiment. -
FIGS. 112, 113, 114, and 115 illustrate how a software keyboard is displayed according to the sixteenth aspect of the second embodiment. - A first embodiment is disclosed with reference to
FIGS. 1 to 51 . In the first embodiment, a computing device 1 can detect taps of an object such as a user's finger onto its local sensitive display 3 and hover of the object in proximity over such sensitive display 3 . The computing device 1 controls display of various graphical objects responsive to the detected tap and/or hover. - In first and second aspects of the first embodiment, the
computing device 1 controls display of a dialog on the sensitive display 3 when an event to pop up the dialog occurs while a computer program is being executed and a screen of the computer program is being displayed on the sensitive display 3 . - In a third aspect of the first embodiment, the
computing device 1 controls display of a menu associated with a computer program in a screen of the computer program while the computer program is being executed. - In fourth and fifth aspects of the first embodiment, the
computing device 1 controls display of an assistant object for assisting a tappable object, which appears in a screen of a computer program, to be tapped while the computer program is being executed. - In the above aspects, the
computing device 1 performs the display controls based on a location of a tap and/or hover of an object such as a user's finger on and/or over the sensitive display 3 . - (Configuration)
-
FIG. 1 depicts the computing device 1 . FIG. 1 is a front view of the computing device 1 . - The
computing device 1 is a multi-functional computing device suitable in size for mobility. The computing device 1 can be a cell phone, a tablet computer, a laptop computer, or another similar computing device. -
FIG. 2 is a block diagram of the computing device 1 for illustrating the configuration of the computing device 1 in more detail. - The
computing device 1 mainly has a processor 2 , the sensitive display 3 , telecommunication circuitry 4 , and a memory 5 . - The
processor 2 generally processes instructions of computer programs stored in the memory 5 to execute the computer programs, so as to realize a variety of functions of the computing device 1 . The processor 2 can be a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a DSP (Digital Signal Processor), or one or a combination of other general or dedicated processors. - The
sensitive display 3 is a display device composed essentially of a display 3 a and a sensor 3 b . The display 3 a can be an LCD (Liquid Crystal Display), an EL (Electro-Luminescence) display, or one of other similar types of display devices. The display 3 a displays graphics and video in accordance with video signals sent from the processor 2 . The sensor 3 b is a sensor to distinctively detect (i) taps of one or more objects, such as a user's finger and a stylus, made onto the sensor 3 b and (ii) hover of such an object made in proximity over the sensor 3 b . The sensor 3 b sends to the processor 2 signals representing (i) the location of a detected tap as long as such a tap is detected and (ii) the location of detected hover as long as such hover is detected. A tap may be a touch or a contact in other words. Further, the sensor 3 b detects gestures by (i) continuously detecting hover made in proximity above the sensor 3 b or (ii) continuously detecting a movement of the object while a tap is maintained on the sensor 3 b . The technologies of sensing of taps, hover, and/or gestures are disclosed, for example, in the U.S. patent publications Nos. 2009/194344 invented by Harley et al, 2008/297487 invented by Hotelling et al, 2009/289914 invented by CHO, 2006/26521 invented by Hotelling et al, 2006/244733 invented by Geaghan et al, 2010/45633 invented by Gettemy et al, 2011/169780 invented by Goertz et al, 2008/158172 invented by Hotelling et al, and the issued U.S. Pat. No. 7,653,883 invented by Hotelling et al, U.S. Pat. No. 8,232,990 invented by King et al, U.S. Pat. No. 7,880,732 invented by Goertz, U.S. Pat. No. 7,663,607 invented by Hotelling et al, U.S. Pat. No. 7,855,718 invented by Westerman, U.S. Pat. No. 7,777,732 invented by HERZ et al, U.S. Pat. No. 7,924,271 invented by Christie et al, U.S. Pat. No. 8,219,936 invented by Kim et al, U.S. Pat. No. 8,284,173 invented by Morrison, U.S. Pat. No. 6,803,906 invented by Morrison, U.S. Pat. No. 6,954,197 invented by Morrison et al, U.S. Pat. No. 7,692,627 invented by Wilson, the contents of which are incorporated herein by reference in their entirety. The display 3 a and the sensor 3 b may be mechanically integrated together. As a result, the sensitive display 3 displays graphics and video as well as detects taps, hover, and gestures of an object like the user's finger or a stylus on or above the sensitive display 3 . - The
telecommunication circuitry 4 is circuitry for telecommunication over a telecommunication network. For example, the telecommunication circuitry 4 can be circuitry for telecommunication pursuant to CDMA (Code Division Multiple Access) or other similar telecommunication standards or protocols. - The
memory 5 is a memory device such as, for example, a flash memory, an EEPROM, an HDD (Hard Disk Drive), a combination thereof, or one or a combination of other similar memory devices. The memory 5 stores computer programs to be executed by the processor 2 . In particular, the memory 5 stores an OS (Operating System) 5 a , a WWW (World Wide Web) browser 5 b , a video game 5 c , a text editor 5 d , a media player 5 e , a display control program 5 f , and a telecommunication program 5 g . The WWW browser 5 b , the video game 5 c , the text editor 5 d , and the media player 5 e are typically application programs that run on the OS 5 a . The programs 5 b to 5 e are often collectively referred to as application programs. The display control program 5 f and the telecommunication program 5 g can also run on the OS 5 a , or can be incorporated in the OS 5 a , running as part of the OS 5 a . The display control program 5 f and the telecommunication program 5 g run in the background as long as the OS 5 a is running. - One or more of the
application programs 5 b to 5 e are executed on the OS 5 a in response to the user's selection. The display control program 5 f and the telecommunication program 5 g are executed while the OS 5 a is executed. - The
processor 2 sends video signals to the sensitive display 3 in accordance with instructions of the OS 5 a , the application programs 5 b to 5 e , the display control program 5 f , and/or the telecommunication program 5 g . The sensitive display 3 displays graphics and video in accordance with the video signals. - For example, the graphics and the video to be displayed include screens, icons, dialogs, menus, and other graphical objects or contents. The screen may contain one or more tappable objects or contents within the screen, such as an HTML (Hyper Text Markup Language) link, a text-input field, a software button, and a software keyboard. The dialog is a graphical object, with a message and one or more tappable objects, which pops up in response to occurrence of one or more given events associated with the
OS 5 a , the application programs 5 b to 5 e , the display control program 5 f , and/or the telecommunication program 5 g . The menu is associated with the OS 5 a , the application programs 5 b to 5 e , the display control program 5 f , and/or the telecommunication program 5 g , and is a tappable object for operation of the OS 5 a , the application programs 5 b to 5 e , the display control program 5 f , and/or the telecommunication program 5 g . - In this manner, for example, one or more icons representing one or more of the
application programs 5 b to 5 e are displayed on the sensitive display 3 in accordance with the instructions of the OS 5 a . For example, a screen of one of the application programs 5 b to 5 e is displayed on the sensitive display 3 in accordance with the instructions of that one of the application programs 5 b to 5 e . For example, a dialog for notifying a user of an incoming call over the telecommunication network is popped up in accordance with the instructions of the telecommunication program 5 g . - When an object such as the user's finger hovers over the
sensitive display 3 , the sensitive display 3 detects the hover and determines the location above which the hover exists over the sensitive display 3 . The sensitive display 3 continuously sends to the processor 2 signals each representing the determined hover location during the hover detection. The location may be a position or X-Y coordinates in other words. - When a tap of an object such as the user's finger is made onto the
sensitive display 3 , the sensitive display 3 detects the tap and determines the location at which the tap is made on the sensitive display 3 . The sensitive display 3 then sends to the processor 2 a signal representing the determined tap location. The location may be a position or X-Y coordinates in other words. - The
processor 2 receives the signals from the sensitive display 3 . Based on the received signals, the processor 2 determines the location of the hover and tap within a screen displayed on the sensitive display 3 . The processor 2 then operates in response to the hover and tap in accordance with the instructions of the OS 5 a , the application programs 5 b to 5 e , the display control program 5 f , and/or the telecommunication program 5 g . - For example, in accordance with the instructions of the
OS 5 a , if the processor 2 determines that a tap is made on an icon representing the WWW browser 5 b , the processor 2 launches the WWW browser 5 b . - For example, in accordance with the instructions of the
WWW browser 5 b , if the processor 2 determines that a tap is made on a text-input field, the processor 2 launches a software keyboard. - Upon executing the
programs 5 a to 5 g , the processor 2 generates and sends video signals representing a screen of one of the programs 5 a to 5 g to the sensitive display 3 , so as for the sensitive display 3 to display the screen. Also, the processor 2 receives operations of the program whose screen is displayed by way of, for example, the user's tapping software buttons or other tappable graphical objects that appear in the screen. - For example, when executing the
OS 5 a without executing any of the application programs 5 b to 5 e , the processor 2 displays the screen of the OS 5 a on the sensitive display 3 . The screen may contain icons representing the application programs 5 b to 5 e . And, the processor 2 receives operations for launching one of the application programs 5 b to 5 e through the user's tapping on the one of the icons. - For example, when executing the
WWW browser 5 b on the OS 5 a , the processor 2 displays the screen of the WWW browser 5 b on the sensitive display 3 . The screen may contain software buttons. And, the processor 2 receives operations for connecting to WWW pages through the user's tapping on the software buttons. - The
processor 2 usually does not display screens of the display control program 5 f and the telecommunication program 5 g because the programs 5 f and 5 g run in the background while the OS 5 a is running. - One or more events are associated with one or more of the
programs 5 a to 5 g . If an event occurs in one of the programs 5 a to 5 g , the processor 2 generates and sends video signals representing a dialog associated with the event to the sensitive display 3 so as for the sensitive display 3 to pop up the dialog over the screen already displayed. The dialog may contain one or more tappable graphical objects. In response to the user's tapping on a graphical object, the processor 2 executes a predetermined action associated with the tapped graphical object. - For example, an incoming call event is associated with the
telecommunication program 5 g . While the telecommunication program 5 g is running in the background, the processor 2 continuously monitors for an incoming call over the telecommunication network through the telecommunication circuitry 4 from some distant caller. Responsive to arrival of an incoming call, the sensitive display 3 pops up a dialog for notifying the user of the incoming call over the already-displayed screen. The dialog may contain a software button for answering the incoming call. Responsive to the user's tapping the software button, the processor 2 establishes telecommunication between the user, namely, the computing device 1 , and the caller. - For example, a virus detection event is associated with the
WWW browser 5 b . When the WWW browser 5 b is running, the processor 2 continuously monitors for computer viruses maliciously hidden in WWW pages. Responsive to detection of a hidden computer virus, the sensitive display 3 pops up a dialog for notifying the user of the detected computer virus over the already-displayed screen. The dialog may contain a software button for responding to the notice and selecting countermeasures against the computer virus. Responsive to the user's tapping the software button, the processor 2 quarantines or eliminates the virus. - The detail of the display control performed by the
processor 2 in accordance with the display control program 5 f is described below. -
FIG. 3 is a flowchart illustrating a first aspect of the display control in accordance with thedisplay control program 5 f. According to the first aspect of the present embodiment, the display control is executed responsive to occurrence of one of the above-mentioned given events triggering a popup of a dialog. - If a given event occurs (S100), the
processor 2 determines whether or not hover of an object such as the user's finger in proximity over thesensitive display 3 is being detected (S101). Namely, theprocessor 2 determines whether or not signals representing the location of hover are being sent from the sensitive display 3 (S101). - If no hover is being detected (S101: NO), the
processor 2 pops up a dialog associated with the event at a predetermined location over the screen on the sensitive display 3 (S102). The predetermined location may be the center of the screen, the bottom area of the screen, or the likes. - If hover is being detected and the location of the hover is determined (S101: YES), the
processor 2 determines a location that is predetermined pixels away from the determined hover location (S103). A dialog is to be displayed at the determined location. As illustrated in FIG. 4, the determination in S103 can be done by simply selecting one of the locations L, each of which is predetermined pixels away from the determined hover location 11. - More specifically, and without limitation, the determination in S103 can be done by determining first and second areas based on the determined hover location and then determining a given location within the second area, as described below with reference to FIGS. 5 to 8. As illustrated in FIG. 5 or 7, first, the first area 12 is determined based on the determined hover location 11. The first area 12 can be defined from vertexes, each of which is predetermined pixels away from the determined hover location 11, as illustrated in FIG. 5. Note that the predetermined pixels may be identical among all of the vertexes so that the first area 12 forms a regular square, or may differ among the vertexes so that the first area 12 forms an irregular quadrilateral. Also note that the first area 12 may be formed from four vertexes to be a square, or from more or fewer than four vertexes to be another polygon. Instead, as illustrated in FIG. 7, the first area 12 can be defined as a circle having a radius of predetermined pixels from the detected hover location 11. Next, the second area 13 is determined based on the first area 12. The second area 13, shaded in FIGS. 6 and 8, is defined as the area of the screen other than the first area 12. Finally, a given location L is determined within the second area. - Returning to
FIG. 3, after determination of the location for a dialog, the processor 2 pops up a dialog associated with the given event at the determined location over the screen on the sensitive display 3 (S104). As a result, the dialog is displayed some pixels away from the user's finger hovering in proximity over the screen. - After popup of the dialog in accordance with S102 or S104, the
processor 2 determines whether or not one or more tappable graphical objects contained in the dialog are tapped (S105). - If a tappable graphical object is tapped (S105: YES), the
processor 2 executes a given action associated with the tapped graphical object (S106). -
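- As a minimal sketch only, the first-aspect flow of S100 to S104 can be expressed as follows. All identifiers here (`handle_event`, `hover_location`, `offset_location`, `show_dialog`, `DEFAULT_LOCATION`) are illustrative assumptions and do not appear in the source; the actual control is defined by the display control program 5f.

```python
# Sketch of S100-S104: choose the popup location for a dialog depending
# on whether hover is currently detected. Names are assumptions.
DEFAULT_LOCATION = (240, 400)  # S102: e.g. the center of the screen

def handle_event(event, hover_location, offset_location, show_dialog):
    """Decide where the dialog for `event` pops up (S101-S104)."""
    hover = hover_location()            # S101: query the sensitive display
    if hover is None:
        where = DEFAULT_LOCATION        # S102: predetermined location
    else:
        where = offset_location(hover)  # S103: predetermined pixels away
    show_dialog(event, where)           # S104: pop up the dialog there
    return where
```

Here `hover_location()` returning `None` models the case where no signals representing hover are being sent from the sensitive display 3.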
FIGS. 9, 10, 11, 12, 13, 14, 15, and 16 illustrate examples describing how a dialog is popped up in accordance with the above-mentioned display control. -
FIGS. 9 to 12 illustrate an example of popping up a dialog responsive to occurrence of an incoming call event associated with the telecommunication program 5g while the OS 5a is being executed. - While the
OS 5a is executed, the screen of the OS 5a is displayed on the sensitive display 3 as illustrated in FIG. 9. The screen contains icons 14 representing one or more of the application programs 5b to 5e. The processor 2 receives an operation for selecting and launching one of the application programs 5b to 5e by way of the user's tapping on an icon 14. - The telecommunication program 5g is running in the background, and its screen is not displayed. If an incoming call event occurs, the processor 2 determines whether or not hover is being detected. If no hover is detected, the processor 2 displays a dialog 15 for notifying the user of the incoming call over the screen of the active OS 5a at a predetermined location, such as substantially the center of the screen, as depicted in FIG. 10. - On the other hand, if there is hover of the user's finger 10 detected, the processor 2 determines a location which is predetermined pixels (for example, Z pixels in FIG. 12) away from the location of the detected hover, and then displays the dialog 15 at the determined location as depicted in FIGS. 11 and 12. - The dialog 15 contains a tappable object 15a for answering the incoming call as well as a tappable object 15b for denying the incoming call. Responsive to the user's tapping the tappable object 15a, the processor 2 establishes telecommunication between the user, namely, the computing device 1, and the caller through the telecommunication circuitry 4, in accordance with the instructions of the telecommunication program 5g. - Advantageously, the above-mentioned display control can avoid the user's erroneous tapping on the suddenly displayed dialog 15. Specifically, if the dialog 15 popped up near the finger 10 just when the user was about to tap one of the icons 14 while hovering the finger 10 over the screen of the OS 5a, the user might erroneously tap the tappable object 15a or 15b against his/her intention. The user might be annoyed if he/she erroneously answered the incoming call by tapping the tappable object 15a against his/her intention. Thanks to the above-mentioned display control, the dialog 15 is always popped up distantly from the user's hovering finger 10, and so such erroneous operation can be avoided. -
FIGS. 13 to 16 illustrate an example of popping up a dialog responsive to occurrence of a virus detection event associated with the WWW browser 5b while the WWW browser 5b is running. - While the
WWW browser 5b is executed on the OS 5a, the screen of the WWW browser 5b is displayed on the sensitive display 3 as illustrated in FIG. 13. The screen contains software buttons 16 and hyperlink buttons 17. The processor 2 receives operations of scrolling WWW pages forward or back by way of the user's tapping the software buttons 16, and operations of connecting to other WWW pages by way of the user's tapping the hyperlink buttons 17. Also, the WWW browser 5b continuously monitors for computer viruses hidden in the WWW pages in the background. - If a virus detection event occurs, the processor 2 determines whether or not hover of the user's finger is being detected. If no hover is detected, the processor 2 displays a dialog 18 for notifying the user of the detected virus over the screen at a predetermined location, such as substantially the center of the screen, as depicted in FIG. 14. - On the other hand, if there is hover of the user's finger 10 detected, the processor 2 determines a location which is predetermined pixels (for example, Z pixels in FIG. 16) away from the location of the detected hover, and then displays the dialog 18 at the determined location as depicted in FIGS. 15 and 16. - The dialog 18 contains a tappable object 18a for checking the details of the virus as well as a tappable object 18b for eliminating the virus. Responsive to the user's tapping the tappable object 18a, the processor 2 displays detailed information about the virus. Responsive to the user's tapping the tappable object 18b, the processor 2 executes the instructions of the WWW browser 5b to exterminate the virus. - Advantageously, the above-mentioned display control can avoid the user's erroneous tapping on the suddenly displayed dialog 18. Specifically, if the dialog 18 popped up near the finger 10 just when the user was about to tap the software buttons 16 or the hyperlink buttons 17 while hovering the finger 10 above the screen, the user might erroneously tap the tappable object 18a or 18b against his/her intention. The user might be annoyed if he/she erroneously eliminated the virus by tapping the tappable object 18b against his/her intention to analyze the virus carefully. Thanks to the above-mentioned display control, the dialog 18 is popped up distantly from the user's hovering finger 10, and so such erroneous operation can be avoided. -
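- The location determination of S103 described above (in its circular-first-area variant of FIG. 7) can be sketched as follows. The function name, candidate list, and radius value are illustrative assumptions; the source specifies only "predetermined pixels" away from the detected hover location.

```python
# Hypothetical sketch of S103: pick a popup location lying in the second
# area, i.e. at least `radius` pixels away from the hover location 11.
import math

def popup_location(hover, screen_w, screen_h, radius=150, candidates=None):
    """Return a candidate location outside the circular first area."""
    hx, hy = hover
    if candidates is None:
        # Assumed candidates: screen center and the four quadrant centers.
        candidates = [
            (screen_w // 2, screen_h // 2),
            (screen_w // 4, screen_h // 4),
            (3 * screen_w // 4, screen_h // 4),
            (screen_w // 4, 3 * screen_h // 4),
            (3 * screen_w // 4, 3 * screen_h // 4),
        ]
    for cx, cy in candidates:
        if math.hypot(cx - hx, cy - hy) >= radius:
            return (cx, cy)
    # Fall back to the screen corner farthest from the hover location.
    corners = [(0, 0), (screen_w, 0), (0, screen_h), (screen_w, screen_h)]
    return max(corners, key=lambda c: math.hypot(c[0] - hx, c[1] - hy))
```

The same sketch applies to the tap-based determination of S113, with the detected tap location substituted for the hover location.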
FIG. 17 is a flowchart illustrating a second aspect of the display control in accordance with the display control program 5f. According to the second aspect of the present embodiment, the display control is executed upon occurrence of one of the above-mentioned given events triggering a popup of a dialog. - If a given event occurs (S110), the
processor 2 determines whether or not a tap of an object, such as the user's finger, on the sensitive display 3 is being detected (S111). Namely, the processor 2 determines whether or not signals representing the location of the tap are being sent from the sensitive display 3 (S111). - If no tap is being detected (S111: NO), the processor 2 pops up a dialog associated with the event at a predetermined location over the screen on the sensitive display 3 (S112). The predetermined location may be, for example, the center of the screen or the bottom area of the screen. - If a tap is being detected and the location of the tap is determined (S111: YES), the processor 2 determines a location that is predetermined pixels away from the determined tap location (S113). A dialog is to be displayed at the determined location. As illustrated in FIG. 18, the determination in S113 can be done by simply selecting one of the locations L, each of which is predetermined pixels away from the determined tap location 11. - More specifically, and without limitation, the determination in S113 can be done by determining first and second areas based on the determined tap location and then determining a given location within the second area, as described below with reference to FIGS. 19 to 22. As illustrated in
FIG. 19 or 21, first, the first area 12 is determined based on the determined tap location 11. The first area 12 can be defined from vertexes, each of which is predetermined pixels away from the determined tap location 11, as illustrated in FIG. 19. Note that the predetermined pixels may be identical among all of the vertexes so that the first area 12 forms a regular square, or may differ among the vertexes so that the first area 12 forms an irregular quadrilateral. Also note that the first area 12 may be formed from four vertexes to be a square, or from more or fewer than four vertexes to be another polygon. Instead, as illustrated in FIG. 21, the first area 12 can be defined as a circle having a radius of predetermined pixels from the detected tap location 11. Next, the second area 13 is determined based on the first area 12. The second area 13, shaded in FIGS. 20 and 22, is defined as the area of the screen other than the first area 12. Finally, a given location L is determined within the second area. - Returning to
FIG. 17, after determination of the location for a dialog, the processor 2 pops up a dialog associated with the given event at the determined location over the screen on the sensitive display 3 (S114). As a result, the dialog is displayed some pixels away from the point where the user's finger tapped the screen. - After popup of the dialog in accordance with S112 or S114, the
processor 2 determines whether or not one or more tappable graphical objects contained in the dialog are tapped (S115). - If a tappable graphical object is tapped (S115: YES), the
processor 2 executes a given action associated with the tapped graphical object (S116). -
FIGS. 23, 24, 25, 26, 27, 28, 29, and 30 illustrate examples describing how a dialog is popped up in accordance with the above-mentioned display control. -
FIGS. 23 to 26 illustrate an example of popping up a dialog responsive to occurrence of an incoming call event associated with the telecommunication program 5g while the OS 5a is being executed. - While the
OS 5a is executed, the screen of the OS 5a is displayed on the sensitive display 3 as illustrated in FIG. 23. The screen contains icons 14 representing one or more of the application programs 5b to 5e. The processor 2 receives an operation for selecting and launching one of the application programs 5b to 5e by way of the user's tapping on an icon 14. - The telecommunication program 5g is running in the background, and its screen is not displayed. If an incoming call event occurs, the processor 2 determines whether or not a tap is being detected. If no tap is detected, the processor 2 displays a dialog 15 for notifying the user of the incoming call over the screen of the active OS 5a at a predetermined location, such as substantially the center of the screen, as depicted in FIG. 24. - On the other hand, if there is a tap detected, the processor 2 determines a location which is predetermined pixels (for example, Z pixels in FIG. 26) away from the location of the detected tap, and then displays the dialog 15 at the determined location as depicted in FIGS. 25 and 26. - The dialog 15 contains a tappable object 15a for answering the incoming call as well as a tappable object 15b for denying the incoming call. Responsive to the user's tapping the tappable object 15a, the processor 2 establishes telecommunication between the user, namely, the computing device 1, and the caller through the telecommunication circuitry 4, in accordance with the instructions of the telecommunication program 5g. - Advantageously, the above-mentioned display control can avoid the user's erroneous tapping on the suddenly displayed dialog 15. Specifically, if the dialog 15 popped up near the finger 10 just when the user was tapping on the screen of the OS 5a to operate the OS 5a, the user might erroneously tap the tappable object 15a or 15b against his/her intention. The user might be annoyed if he/she erroneously answered the incoming call by tapping the tappable object 15a against his/her intention. Thanks to the above-mentioned display control, the dialog 15 is always popped up distantly from the location tapped by the user's finger 10, and so such erroneous operation can be avoided. -
FIGS. 27 to 30 illustrate an example of popping up a dialog responsive to occurrence of a virus detection event associated with the WWW browser 5b while the WWW browser 5b is running. - While the
WWW browser 5b is executed on the OS 5a, the screen of the WWW browser 5b is displayed on the sensitive display 3 as illustrated in FIG. 27. The screen contains software buttons 16 and hyperlink buttons 17. The processor 2 receives operations of scrolling WWW pages forward or back by way of the user's tapping the software buttons 16, and operations of connecting to other WWW pages by way of the user's tapping the hyperlink buttons 17. Also, the WWW browser 5b continuously monitors for computer viruses hidden in the WWW pages in the background. - If a virus detection event occurs, the processor 2 determines whether or not a tap of the user's finger is being detected. If no tap is detected, the processor 2 displays a dialog 18 for notifying the user of the detected virus over the screen at a predetermined location, such as substantially the center of the screen, as depicted in FIG. 28. - On the other hand, if there is a tap detected, the processor 2 determines a location which is predetermined pixels (for example, Z pixels in FIG. 30) away from the location of the detected tap, and then displays the dialog 18 at the determined location as depicted in FIGS. 29 and 30. - The dialog 18 contains a tappable object 18a for checking the details of the virus as well as a tappable object 18b for eliminating the virus. Responsive to the user's tapping the tappable object 18a, the processor 2 displays detailed information about the virus. Responsive to the user's tapping the tappable object 18b, the processor 2 executes the instructions of the WWW browser 5b to exterminate the virus. - Advantageously, the above-mentioned display control can avoid the user's erroneous tapping on the suddenly displayed dialog 18. Specifically, if the dialog 18 popped up near the finger 10 just when the user was tapping on the screen, the user might erroneously tap the tappable object 18a or 18b against his/her intention. The user might be annoyed if he/she erroneously eliminated the virus by tapping the tappable object 18b against his/her intention to analyze the virus carefully. Thanks to the above-mentioned display control, the dialog 18 is popped up distantly from the location tapped by the user's finger 10, and so such erroneous operation can be avoided. -
FIG. 31 is a flowchart illustrating a third aspect of the display control in accordance with the display control program 5f. According to the third aspect of the present embodiment, the display control is executed while a screen of one of the programs 5a to 5g is displayed on the sensitive display 3. One or more given locations or areas within the screen are assigned for popup of a menu for operation of that program. For example, the upper right part of the screen can be assigned. - While the screen of the program is displayed on the sensitive display 3 (S200), the processor 2 continuously determines whether or not hover of an object, such as the user's finger, is being detected at the assigned location above the screen for more than a predetermined period, based on the signals from the sensitive display 3 (S201). If the processor 2 has continuously received the signals representing hover above the assigned location for more than the predetermined period, the processor 2 determines affirmatively. - If hover is detected (S201: YES), the
processor 2 displays a menu at a first location over the screen (S202). The first location is defined to be predetermined pixels away from the assigned location. The menu is a tappable graphical object for operation of the executed program. - Once the
processor 2 displays the menu, the processor 2 continuously determines whether or not hover anywhere in the screen keeps being detected, based on the signals from the sensitive display 3 (S203). The processor 2 determines negatively if the sensitive display 3 stops detecting hover because, for example, the user has moved his/her finger away from the sensitive display 3. The processor 2 determines affirmatively if the sensitive display 3 keeps detecting hover because, for example, the user has kept his/her finger in proximity to the sensitive display 3. - If detection of hover stops (S203: NO), the
processor 2 stops displaying the menu (S204). On the other hand, as long as detection of hover is kept (S203: YES), the processor 2 keeps displaying the menu until a predetermined time has lapsed (S205). - While the menu is displayed, the
processor 2 receives the user's tap on the menu through the sensitive display 3 (S206). - If the menu is tapped (S206: YES), the
processor 2 executes a given action associated with the menu in accordance with the instructions of the executed program (S207). -
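- The third-aspect state machine of S200 to S205 (dwell to show, stop hovering or time out to hide) can be sketched as follows. The class name, the dwell and timeout values, and the tick-driven interface are all illustrative assumptions, not part of the source.

```python
# Sketch of S200-S205: a menu appears after hover dwells over the
# assigned region, and disappears when hover stops or a timeout lapses.
class HoverMenuController:
    def __init__(self, dwell_ms=500, timeout_ms=5000):
        self.dwell_ms = dwell_ms      # S201: required hover duration
        self.timeout_ms = timeout_ms  # S205: auto-dismiss period
        self.hover_started = None
        self.menu_shown_at = None

    def on_tick(self, now_ms, hovering_over_region, hovering_anywhere):
        if self.menu_shown_at is None:
            # S201: wait for hover to dwell over the assigned region.
            if hovering_over_region:
                if self.hover_started is None:
                    self.hover_started = now_ms
                elif now_ms - self.hover_started >= self.dwell_ms:
                    self.menu_shown_at = now_ms  # S202: show the menu
            else:
                self.hover_started = None
        else:
            # S203-S205: keep the menu only while hover continues,
            # and at most until the timeout lapses.
            if (not hovering_anywhere
                    or now_ms - self.menu_shown_at >= self.timeout_ms):
                self.menu_shown_at = None        # S204: hide the menu
                self.hover_started = None

    @property
    def menu_visible(self):
        return self.menu_shown_at is not None
```

Driving `on_tick` from the sensitive display's hover signals reproduces the keep-or-cease behavior described above.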
FIGS. 32 to 35 illustrate examples describing how a menu is popped up in accordance with the above-mentioned display control while the text editor 5c is being executed. - As illustrated in FIG. 32, the screen of the text editor 5c contains a text input field 20 and a software keyboard 21. The processor 2 receives taps on tappable alphabetical keys within the software keyboard 21 through the sensitive display 3 to input text into the text input field 20. In addition, a given part 22 in the upper right of the screen is assigned for menu popup. - If the processor 2 detects hover of the user's finger 10 above the part 22 for more than a predetermined period, the processor 2 displays a menu 23 at a first location that is predetermined pixels (for example, Y pixels in FIG. 34) away from the part 22, as depicted in FIG. 34. The menu 23 is a graphical object containing tappable software buttons entitled "save", "edit", and "option" by way of example. - The
processor 2 keeps displaying the menu 23 as long as hover of the finger 10 is kept over the screen. The processor 2 executes an action associated with any one of the software buttons within the menu 23 by receiving a tap on that software button, in accordance with the instructions of the text editor 5c. For example, the processor 2 can save a document created through text input in the memory 5 by receiving a tap on the software button entitled "save". - On the other hand, the
processor 2 stops displaying the menu 23 if the predetermined time has lapsed, or if detection of hover has stopped before reception of a tap on the menu 23. - Advantageously, the above-mentioned display control can hide the menu 23 so that the screen is not occupied by the menu 23 until the user needs to operate by use of the menu 23, thereby enhancing screen visibility. - Further, the menu 23 is displayed some pixels away from the hovering finger 10, so as to prevent the finger 10 itself from obscuring the menu 23. - In addition, display of the menu 23 can be kept or ceased through the easy operation of the user's keeping the finger 10 in proximity over the screen or moving the finger 10 away from the screen. Accordingly, usability in operating the programs can be improved. -
FIG. 36 is a flowchart illustrating a fourth aspect of the display control in accordance with the display control program 5f. According to the fourth aspect of the present embodiment, the display control is executed while a screen of one of the programs 5a to 5g is displayed on the sensitive display 3. The screen may contain one or more tappable objects, such as texts and images, associated with a given action defined by instructions of one of the programs 5a to 5g. For example, a screen of the WWW browser 5b displays a WWW page and may contain a tappable object linked to another WWW page. Tapping on the object may initiate a given action, namely, connection to and display of the linked WWW page. - While the screen of one of the programs 5a to 5g is displayed on the sensitive display 3 (S300), the processor 2 continuously determines whether or not hover of an object, such as the user's finger 10, is being detected above a tappable object over the screen for more than a predetermined period (S301). If the processor 2 has continuously received the signals representing hover above the location of the tappable object for more than the predetermined period, the processor 2 determines affirmatively. - If hover is detected (S301: YES), the processor 2 generates and displays an assistant object at a location which is predetermined pixels away from the tappable object over the screen (S302). The assistant object is an object for assisting the tappable object to be tapped. The assistant object can be generated, for example, by enlarging the tappable object. - Once the
processor 2 displays the assistant object, the processor 2 continuously determines whether or not hover above the tappable object keeps being detected, based on the signals from the sensitive display 3 (S303). The processor 2 determines negatively if the sensitive display 3 stops detecting hover because, for example, the user has moved his/her finger away from the sensitive display 3. The processor 2 determines affirmatively if the sensitive display 3 keeps detecting hover because, for example, the user has kept his/her finger in proximity above the tappable object. - If detection of hover stops (S303: NO), the
processor 2 stops displaying the assistant object (S304). On the other hand, as long as detection of hover is kept (S303: YES), the processor 2 keeps displaying the assistant object until a predetermined time has lapsed (S305). - While the assistant object is displayed, the
processor 2 receives the user's tap on the tappable object through the sensitive display 3 (S306). - If the tappable object is tapped (S306: YES), the
processor 2 executes a given action associated with the tappable object in accordance with the instructions of the program (S307). -
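- The generation step S302 (an enlarged copy placed predetermined pixels away from the original) can be sketched as follows. The `Box` dataclass, the offset of 80 pixels, the scale factor, and the above-the-original placement are illustrative assumptions; the source specifies only "predetermined pixels away".

```python
# Sketch of S302: place an enlarged assistant object a fixed number of
# pixels away from the hovered tappable object. Values are assumptions.
from dataclasses import dataclass

@dataclass
class Box:
    x: int
    y: int
    w: int
    h: int

def make_assistant(tappable: Box, offset_px=80, scale=2.0) -> Box:
    """Return an enlarged copy of `tappable`, displaced upward so the
    hovering finger does not cover it (FIG. 39 style placement)."""
    return Box(
        x=tappable.x,
        y=tappable.y - offset_px - int(tappable.h * scale),  # above original
        w=int(tappable.w * scale),
        h=int(tappable.h * scale),
    )
```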
FIGS. 37 to 40 illustrate an example describing how the assistant object is displayed in accordance with the above-mentioned display control while the WWW browser 5b is being executed. - As illustrated in
FIG. 37, as the screen of the WWW browser 5b, a WWW page is displayed. The WWW page contains tappable text objects 31 to 33 and a tappable image object 30. Each of the tappable text objects 31 to 33 consists of text, whereas the tappable image object 30 consists of an image. Each of the tappable objects 30 to 33 is linked to another WWW page. The processor 2 receives taps on the tappable objects 30 to 33 through the sensitive display 3 to connect to the linked WWW page and display a screen of the linked WWW page. - If the processor 2 detects hover of an object such as the user's finger 10 above one of the tappable objects 30 to 33, for example the tappable text object 31, for more than a predetermined period as depicted in FIG. 38, the processor 2 generates an assistant object 41 for assisting the tappable text object 31 to be tapped. The processor 2 then displays the assistant object 41 near but predetermined pixels (for example, X pixels in FIG. 39) away from the original tappable text object 31, as depicted in FIG. 39. - The
processor 2 keeps displaying the assistant object 41 as long as hover of the finger 10 is kept detected above the tappable text object 31. The processor 2 executes an action associated with the tappable text object 31 by receiving a tap on the tappable text object 31, in accordance with the instructions of the WWW browser 5b. For example, the processor 2 can connect to another WWW page linked to the tappable text object 31 and display a screen of the linked WWW page. - On the other hand, the
processor 2 stops displaying the assistant object 41 if the predetermined time has lapsed, or if detection of hover has stopped because, for example, the user has moved the finger 10 away from the tappable text object 31 before reception of a tap on the tappable text object 31. After stopping display of the assistant object 41, if the processor 2 again detects hover above one of the tappable objects 30 to 33, for example the tappable image object 30 because the user has moved the finger 10 from above the tappable text object 31 to above the tappable image object 30, the processor 2 then generates and displays an assistant object 42 for assisting the tappable image object 30 to be tapped, near but predetermined pixels (for example, X pixels in FIG. 40) away from the original tappable image object 30, as depicted in FIG. 40. - The assistant object generated in accordance with the above display control may be embodied in various manners within the scope of its purpose of assisting the original tappable object to be tapped.
- For example, the assistant object can be generated by copying the original tappable object, and enlarging the copied object, as depicted in
FIGS. 41 and 42 . - For example, the assistant object can be a thumbnail or screenshot of a WWW page linked to the original tappable object as depicted in
FIG. 43. - Advantageously, the above-mentioned display control can enhance usability when the user taps on tappable objects that appear in the screen, and can prevent the user from erroneously tapping on the tappable objects. More specifically, even in a case where the tappable objects are displayed too small for the user to easily read or recognize what is written in a tappable object, or what will occur upon tapping on it, because the sensitive display 3 is tiny, the user can read or recognize what is written in the tappable object, or what will occur upon tapping on it, by taking a look at the assistant object. Accordingly, the user's erroneous tapping on the tappable objects against his/her intention can be avoided. Therefore, usability in operating the programs can be improved. -
FIG. 44 is a flowchart illustrating a fifth aspect of the display control in accordance with the display control program 5f. According to the fifth aspect of the present embodiment, the display control is executed while the screen of one of the programs 5a to 5g is displayed on the sensitive display 3. The screen may contain one or more tappable objects, such as texts and images, associated with a given action for operation of that program. For example, a screen of the WWW browser 5b, which is a WWW page, may contain a tappable object linked to another WWW page. Tapping on the graphical object may execute a given action, namely, connection to and display of the linked WWW page. - While the screen of one of the
programs 5a to 5g is displayed on the sensitive display 3 (S310), the processor 2 continuously determines whether or not hover of an object, such as the user's finger, is being detected above a tappable object over the screen for more than a predetermined period (S311). If the processor 2 has continuously received the signals representing hover at the location of the tappable object for more than the predetermined period, the processor 2 determines affirmatively. - If hover is detected (S311: YES), the
processor 2 generates and displays an assistant object at a location which is predetermined pixels away from the tappable object over the screen (S312). The assistant object is an object for assisting the tappable object to be tapped. Specifically, the assistant object is generated in a tappable form and is associated with the same given action as the action originally associated with the tappable object. In other words, the same instructions as those originally assigned to the tappable object for executing the given action are also assigned to the assistant object. - Once the
processor 2 displays the assistant object, the processor 2 determines whether or not hover anywhere over the screen keeps being detected, based on the signals from the sensitive display 3 (S313). The processor 2 determines negatively if the sensitive display 3 stops detecting hover because, for example, the user has moved his/her finger away from the sensitive display 3. The processor 2 determines affirmatively if the sensitive display 3 keeps detecting hover because, for example, the user has kept his/her finger in proximity anywhere over the screen. - If detection of hover stops (S313: NO), the
processor 2 stops displaying the assistant object (S314). On the other hand, as long as detection of hover is kept (S313: YES), the processor 2 keeps displaying the assistant object until a predetermined time has lapsed (S315). - While the assistant object is displayed, the
processor 2 receives the user's tap on the assistant object through the sensitive display 3 (S316). - If the assistant object is tapped (S316: YES), the
processor 2 executes a given action associated with the assistant object, which corresponds to the action originally associated with the tappable object, in accordance with the instructions of the program (S317). -
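- The distinguishing feature of the fifth aspect, namely that the assistant object is itself tappable and carries the same instructions as the original object (S312, S317), can be sketched as follows. The dictionary representation and all names here are illustrative assumptions.

```python
# Sketch of S312/S317: the assistant object shares the action (e.g. the
# same link) originally assigned to the tappable object. Names assumed.
def make_tappable_assistant(original):
    """Build an assistant object bound to the original's action."""
    return {
        "label": original["label"],
        "enlarged": True,
        # Same instructions as originally assigned to the tappable object:
        "on_tap": original["on_tap"],
    }

visited = []
link = {"label": "news", "on_tap": lambda: visited.append("news-page")}
assistant = make_tappable_assistant(link)
assistant["on_tap"]()  # tapping the assistant runs the original action
```

Because the callback object is shared rather than copied, tapping the assistant object is indistinguishable, from the program's point of view, from tapping the original object.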
FIGS. 45 to 48 illustrate an example describing how the assistant object is displayed in accordance with the above-mentioned display control while the WWW browser 5b is being executed. - As illustrated in
FIG. 45, the screen of the WWW browser 5b, which is a WWW page, contains tappable text objects 31 to 33 and a tappable image object 30. Each of the tappable objects 30 to 33 is linked to another WWW page. The processor 2 receives taps on the tappable objects 30 to 33 through the sensitive display 3 to connect to the linked WWW page and display a screen of the linked WWW page. - If the
processor 2 detects hover of an object such as the user's finger 10 above one of the tappable objects 30 to 33, for example the tappable text object 31, for more than a predetermined period as depicted in FIG. 46, the processor 2 generates an assistant object 41 for the tappable text object 31. This assistant object 41 is associated with the action originally associated with the tappable text object 31, namely, has a link to the WWW page to which the tappable text object 31 is originally linked. The processor 2 then displays the assistant object 41 near but predetermined pixels (for example, X pixels in FIG. 47) away from the original tappable text object 31, as depicted in FIG. 47. - The
processor 2 keeps displaying the assistant object 41 as long as hover of the finger 10 is kept anywhere over the screen. The processor 2 executes the action associated with the assistant object 41, which corresponds to the action originally associated with the tappable text object 31, by receiving a tap on the assistant object 41, in accordance with the instructions of the WWW browser 5b. For example, the processor 2 can connect to the WWW page linked to the assistant object 41, which corresponds to the WWW page originally linked to the tappable text object 31, and display a screen of the linked WWW page. - On the other hand, the
processor 2 stops displaying the assistant object 41 if the predetermined time has elapsed or detection of hover has stopped because, for example, the user has moved the finger 10 away from the screen before reception of a tap on the assistant object 41. After stopping display of the assistant object 41, if the processor 2 again detects hover above one of the tappable objects 30 to 33, for example the tappable image object 30 because the user has moved the finger 10 toward the tappable image object 30, the processor 2 then generates and displays an assistant object 42 for the tappable image object 30 near, but a predetermined number of pixels (for example, X pixels in FIG. 48 ) away from, the original tappable object 30, as depicted in FIG. 48 . - The assistant object generated in accordance with the above display control may be embodied in various manners within the scope of its purpose of assisting the tappable object to be tapped.
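The hover-dwell behavior described above can be outlined in code. The following Python fragment is an illustrative sketch only: the dwell threshold, the pixel offset, and all names (TappableObject, AssistantObject, maybe_create_assistant) are assumptions made for illustration and are not part of the disclosed embodiment.

```python
# Illustrative sketch of the assistant-object logic: after hover has
# dwelt above a tappable object for a predetermined period, an assistant
# object inheriting the original object's action is placed a fixed
# number of pixels away. All names and constants are assumptions.
from dataclasses import dataclass
from typing import Optional

HOVER_DWELL_MS = 500   # assumed "predetermined period"
OFFSET_PX = 20         # assumed "X pixels" offset from the original object

@dataclass
class TappableObject:
    object_id: str
    x: int
    y: int
    link: str  # the action (here, a link) originally associated with the object

@dataclass
class AssistantObject:
    source: TappableObject
    x: int
    y: int

    @property
    def link(self) -> str:
        # The assistant object inherits the original object's action.
        return self.source.link

def maybe_create_assistant(obj: TappableObject,
                           hover_ms: int) -> Optional[AssistantObject]:
    """Create an assistant object once hover has dwelt long enough,
    placed OFFSET_PX away from the original tappable object."""
    if hover_ms < HOVER_DWELL_MS:
        return None
    return AssistantObject(source=obj, x=obj.x + OFFSET_PX, y=obj.y)
```

Tapping the returned assistant object would then trigger the same action as the original tappable object, per the disclosure.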
- For example, the assistant object can be generated by copying the original tappable object, and enlarging the copied object, as depicted in
FIGS. 49 and 50 . - For example, the assistant object can be a thumbnail or screenshot of a WWW page linked to the assistant object, which corresponds to the WWW page originally linked to the original tappable object, like the
tappable object 31, as depicted in FIG. 51 . - Advantageously, the above-mentioned display control can enhance usability when the user executes a given action associated with tappable objects that appear in the screen. More specifically, even if the tappable objects are displayed too small for the user to tap easily for executing the given action because the
sensitive display 3 is tiny, the user can execute the given action easily by tapping on an assistant object instead of the original tappable object. Therefore, usability in operating the programs can be improved. - A second embodiment is disclosed with reference to
FIGS. 52 to 110 . In the second embodiment, a computing device 50 is operable in connection with a remote display device 51 that is physically separated from the computing device 50. The computing device 50 can detect taps of an object such as a user's finger onto its local sensitive display 54 and hover of such an object in proximity over the sensitive display 54. The computing device 50 controls display of a screen of a computer program executed in the computing device 50 based on communication between the computing device 50 and the remote display device 51. - Each of
FIGS. 52 and 53 depicts thecomputing device 50 and theremote display device 51. - The
computing device 50 is a multi-functional computing device suitable in size for mobility. Thecomputing device 50 can be a cell phone, a tablet computer, a laptop computer, and other similar computing device. - The
computing device 50 hascommunication circuitry 55 for wirelessly communicating withcommunication circuitry 52 coupled to theremote display 51. - The
remote display device 51 is a typical desktop display device suitable for use, for example, on a desk, a table, or in a living room. The size of theremote display device 51 can be 20 inches, 32 inches, 40 inches, 60 inches, and so on. - The
remote display device 51 is physically different from thecomputing device 50. In other words, thecomputing device 50 and theremote display device 51 have their components and circuitry housed in housings different from each other. - The
remote display device 51 is coupled to thecommunication circuitry 52 for wirelessly communicating with thecommunication circuitry 55. Thecommunication circuitry 52 can be provided inside theremote display device 51 as depicted inFIG. 52 , or can be an external device attachable to theremote display device 51 by way of, for example, USB (Universal Serial Bus) or another interface as depicted inFIG. 53 . - The
communication circuitry 55 and 52 can communicate with each other in accordance with, for example, the Bluetooth (registered trademark of Bluetooth SIG, INC.) protocol, the FireWire (registered trademark of Apple Inc.) protocol, the WiMAX (registered trademark of WiMAX Forum Corporation) protocol, the wireless LAN (Local Area Network) protocol, or another wireless communication protocol. - The
communication circuitry 52 can receive video signals streamed through thecommunication circuitry 55, and can output the received video signals to theremote display device 51. - Thanks to the above configuration of the
computing device 50 and theremote display device 51, thecomputing device 50 can send video signals to theremote display device 51, thereby making theremote display device 51 display graphics or video represented by the video signals. -
FIG. 54 is a block diagram of thecomputing device 50 for illustrating the configuration of thecomputing device 50 in more detail. - The
computing device 50 mainly has aprocessor 53, thesensitive display 54, thecommunication circuitry 55, and amemory 56. - The
processor 53 generally processes instructions of computer programs stored in the memory 56 to execute the computer programs, so as to realize a variety of functions of the computing device 50. The processor 53 can be a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a DSP (Digital Signal Processor), or another general or dedicated processor. - The
sensitive display 54 is a display device composed essentially of a display 54 a and a sensor 54 b. The display 54 a can be an LCD (Liquid Crystal Display), an EL (Electro-Luminescence) display, or one of other similar types of display devices. The display 54 a displays graphics and video in accordance with video signals sent from the processor 53. The sensor 54 b is a sensor to distinctively detect (i) taps of one or more objects, such as a user's finger and a stylus, made onto the sensor 54 b and (ii) hover of such an object made in proximity over the sensor 54 b. The sensor 54 b sends to the processor 53 signals representing (i) the location of a detected tap as long as such a tap is detected and (ii) the location of detected hover as long as such hover is detected. A tap may be, in other words, a touch or a contact. Further, the sensor 54 b detects gestures by (i) continuously detecting hover made in proximity above the sensor 54 b or (ii) continuously detecting a movement of the object while a tap is maintained on the sensor 54 b. Technologies for sensing taps, hover, and/or gestures are disclosed, for example, in U.S. patent publications Nos. 2009/194344 invented by Harley et al., 2008/297487 invented by Hotelling et al., 2009/289914 invented by Cho, 2006/26521 invented by Hotelling et al., 2006/244733 invented by Geaghan et al., 2010/45633 invented by Gettemy et al., 2011/169780 invented by Goertz et al., and 2008/158172 invented by Hotelling et al., and in issued U.S. Pat. Nos. 7,653,883 invented by Hotelling et al., 8,232,990 invented by King et al., 7,880,732 invented by Goertz, 7,663,607 invented by Hotelling et al., 7,855,718 invented by Westerman, 7,777,732 invented by Herz et al., 7,924,271 invented by Christie et al., 8,219,936 invented by Kim et al., 8,284,173 invented by Morrison, 6,803,906 invented by Morrison, 6,954,197 invented by Morrison et al., and 7,692,627 invented by Wilson, the contents of which are incorporated herein by reference in their entirety. The display 54 a and the sensor 54 b may be mechanically integrated together. As a result, the sensitive display 54 displays graphics and video as well as detects taps, hover, and gestures of an object like the user's finger or a stylus on or above the sensitive display 54. - The
communication circuitry 55 is circuitry for wireless communication with thecommunication circuitry 52. In particular, video signals are transmitted to thecommunication circuitry 52 through thecommunication circuitry 55 under control by theprocessor 53. Thecommunication circuitry 55 can communicate in accordance with the Bluetooth (registered trademark of Bluetooth SIG, INC.) protocol, the FireWire (registered trademark of Apple Inc.) protocol, the WiMAX (registered trademark of WiMAX Forum Corporation) protocol, the wireless LAN (Local Area Network) protocol, or another wireless communication protocol. - The
memory 56 is a memory device such as, for example, a flash memory, an EEPROM, an HDD (Hard Disk Drive), or another similar memory device. The memory 56 stores computer programs to be executed by the processor 53. In particular, the memory 56 stores an OS (Operating System) 56 a, a WWW (World Wide Web) browser 56 b, a video game 56 c, a text editor 56 d, a media player 56 e, and a display control program 56 f. The WWW browser 56 b, the video game 56 c, the text editor 56 d, and the media player 56 e are typically application programs that run on the OS 56 a. The programs 56 b to 56 e are often collectively referred to as application programs. The display control program 56 f can also run on the OS 56 a, or can be incorporated in the OS 56 a, running as part of the OS 56 a. - One or more of the
application programs 56 b to 56 e are executed on theOS 56 a in response to the user's selection. Thedisplay control program 56 f is executed while theOS 56 a and/or one or more of theapplication programs 56 b to 56 e are executed. - The
processor 53 sends video signals to thesensitive display 54 in accordance with instructions of theOS 56 a, theapplication programs 56 b to 56 e, and/or thedisplay control program 56 f. Thesensitive display 54 displays graphics and video in accordance with the video signals. - For example, the graphics and the video to be displayed include screens, icons, and other graphical objects or contents. The screen may contain one or more tappable graphical objects or contents within the screen, such as a HTML (Hyper Text Markup Language) link, a text-input field, a software button, and a software keyboard.
- In this manner, for example, one or more icons representing one or more of the
application programs 56 b to 56 e are displayed on thesensitive display 54 in accordance with the instruction of theOS 56 a. For example, a screen of one of theapplication programs 56 b to 56 e is displayed on thesensitive display 54 in accordance with the instruction of theapplication programs 56 b to 56 e. - When an object such as the user's finger hovers over the
sensitive display 54, the sensitive display 54 detects the hover and determines the location above which the hover is made over the sensitive display 54. The sensitive display 54 continuously sends to the processor 53 signals representing the determined hover location during the hover detection. The location may be, in other words, a position or a pair of X-Y coordinates. - Also, a gesture may be defined by continuous hover in a predetermined path. For example, the
sensitive display 54 may detect a gesture responsive to detecting left-to-right linear continuous hover or circular continuous hover. In this case, thesensitive display 54 may then send to the processor 53 a signal representing the detected gesture. - When a tap of an object such as the user's finger is made onto the
sensitive display 54, the sensitive display 54 detects the tap and determines the location at which the tap is made within the sensitive display 54. The sensitive display 54 then sends to the processor 53 a signal representing the determined tap location. The location may be, in other words, a position or a pair of X-Y coordinates. - Also, a gesture may be defined by a movement of the object while a tap is once detected and maintained on the
sensitive display 54. For example, thesensitive display 54 may detect a gesture responsive to detecting a left-to-right linear movement of the object or a circular movement of the object. In this case, thesensitive display 54 may then send to the processor 53 a signal representing the detected gesture. - The
processor 53 receives the signals from thesensitive display 54. Based on the received signals, theprocessor 53 determines the location of the hover and tap within a screen displayed on thesensitive display 54. Theprocessor 53 then operates in response to the hover and tap in accordance with the instructions of theOS 56 a, theapplication programs 56 b to 56 e, and/or thedisplay control program 56 f. - For example, in accordance with the instructions of the
OS 56 a, if theprocessor 53 determines that a tap is made onto an icon representing theWWW browser 56 b, theprocessor 53 launches theWWW browser 56 b. - For example, in accordance with the instructions of the
WWW browser 56 b, if theprocessor 53 determines that a tap is made onto a text-input field, theprocessor 53 launches a software keyboard. - For example, in accordance with the instructions of the
display control program 56 f, if theprocessor 53 determines that hover is made anywhere above a screen displayed on thesensitive display 54, theprocessor 53 generates a video signal representing an indicator indicative of the determined hover location if thecommunication circuitry 55 is active. - (Communication Circuitry Operation)
- The
processor 53 activates or deactivates thecommunication circuitry 55 in response to the user's operation, for example, through thesensitive display 54. - For example, an icon for activation or deactivation of the
communication circuitry 55 is displayed on the sensitive display 54. Responsive to detection of a tap or hover of the user's finger or a stylus on or above the icon, the processor 53 may activate the communication circuitry 55 if the communication circuitry 55 had not been active, and vice versa. Alternatively, responsive to detection of a predetermined gesture made on or above the sensitive display 54, the processor 53 may activate the communication circuitry 55 if the communication circuitry 55 had not been active, and vice versa. -
FIG. 55 is a flowchart illustrating a first aspect of the display control in accordance with thedisplay control program 56 f. - The
processor 53 launches, namely, executes one of theapplication programs 56 b to 56 e in response to the user's selection (S400). The selection is made by way of, for example, the user's tap on an icon representing any of theapplication programs 56 b to 56 e on thesensitive display 54, and theprocessor 53 detects the tap on the icon. - While the one of the
application programs 56 b to 56 e is executed, the processor 53 determines whether the communication circuitry 55 is active (S401). - As described above, the
communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on the sensitive display 54 or a predetermined gesture on or above the sensitive display 54. - If the
communication circuitry 55 is not active (S401: No), theprocessor 53 generates video signals representing a screen of the executed application program and sends the video signals to the sensitive display 54 (S402). Accordingly, the screen of the executed application program is displayed on thesensitive display 54. The screen of the executed application program may contain one or more tappable objects such as a HTML link, a text-input field, a software button, a software keyboard, and the like. Theprocessor 53 does not generate video signals representing an indicator indicative of the location of hover even if such hover is detected by thesensitive display 54. - If the
communication circuitry 55 is active (S401: Yes), theprocessor 53 generates video signals representing the screen of the executed application program and also generates video signals representing an indicator over the screen. The indicator indicates the hover location by being displayed at the location of hover detected by theprocessor 53 over the screen. Theprocessor 53 then sends the generated video signals of the screen and the indicator to thecommunication circuitry 52 through the communication circuitry 55 (S403). - In this manner, the screen of the executed application program and the indicator indicative of the location of the user's finger's hover over the screen are displayed on the
remote display device 51. The screen of the executed application program may contain one or more tappable objects such as a HTML link, a text-input field, a software button, and a software keyboard. - While the screen of the executed application program and the indicator are displayed on the
sensitive display 54 or on theremote display device 51 in accordance with S402 or S403, theprocessor 53 operates in response to the user's tap on the tappable objects contained in the screen in accordance with the executed application program (S404). -
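The branch at S401 to S403 can be summarized as a routing decision. The following Python fragment is a hedged sketch, not the disclosed implementation; the function name compose_frame and the dictionary keys are hypothetical, since the embodiment describes video-signal routing rather than code.

```python
# Illustrative sketch of the first aspect (S400-S404): route the
# application screen to the local sensitive display when the
# communication circuitry is inactive, or to the remote display with a
# hover indicator when it is active. All names are assumptions.
from typing import Optional, Tuple

def compose_frame(screen: str,
                  circuitry_active: bool,
                  hover_location: Optional[Tuple[int, int]]) -> dict:
    """Return which display receives the frame and whether an
    indicator is drawn at the detected hover location."""
    if not circuitry_active:
        # S402: local display; no hover indicator is generated.
        return {"target": "sensitive_display", "screen": screen,
                "indicator": None}
    # S403: remote display; the indicator marks the hover location.
    return {"target": "remote_display", "screen": screen,
            "indicator": hover_location}
```

In either branch, taps on tappable objects in the screen are still handled by the executed application program (S404).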
FIG. 56 is a flowchart illustrating a second aspect of the display control in accordance with thedisplay control program 56 f. - The
processor 53 launches, namely, executes theOS 56 a in response to activating or powering on the computing device 50 (S500). The activation is made, for example, by way of the user's turning on thecomputing device 50. - While the
OS 56 a is executed, the processor 53 determines whether the communication circuitry 55 is active (S501). - As described above, the
communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on the sensitive display 54 or a predetermined gesture on or above the sensitive display 54. - If the
communication circuitry 55 is not active (S501: No), theprocessor 53 generates video signals representing a screen of theOS 56 a and sends the video signals to the sensitive display 54 (S502). Accordingly, the screen of theOS 56 a is displayed on thesensitive display 54. The screen of theOS 56 a may contain one or more tappable icons representing one or more of theapplication programs 56 b to 56 e. - If the
communication circuitry 55 is active (S501: Yes), theprocessor 53 generates video signals representing the screen of theOS 56 a and also generates video signals representing an indicator over the screen. The indicator indicates the hover location by being displayed at the location of hover detected by theprocessor 53 over the screen. Theprocessor 53 then sends the generated video signals of the screen and the indicator to thecommunication circuitry 52 through the communication circuitry 55 (S503). - In this manner, the screen of the
OS 56 a and the indicator indicative of the location of the user's finger's hover over the screen are displayed on theremote display device 51. The screen of theOS 56 a contains one or more tappable icons representing one or more of theapplication programs 56 b to 56 e. - While the screen of the
OS 56 a and the indicator are displayed on thesensitive display 54 or on theremote display device 51 in accordance with S502 or S503, theprocessor 53 operates in response to the user's tap on the tappable icons contained in the screen in accordance with theOS 56 a (S504). -
FIG. 57 is a flowchart illustrating a third aspect of the display control in accordance with thedisplay control program 56 f. - The display control of
FIG. 57 is operated when thecommunication circuitry 55 becomes activated. While thecommunication circuitry 55 is not active, as mentioned above with reference toFIGS. 55 and 56 , the screen of theOS 56 a or one of theapplication programs 56 b to 56 e is displayed on the sensitive display 54 (S402, S502). - In this situation, if the user operates to activate the
communication circuitry 55 by, for example, tapping on a predetermined icon on thesensitive display 54, theprocessor 53 activates the communication circuitry 55 (S600). - The
processor 53 then stops displaying the screen on the sensitive display 54 (S601). More specifically, theprocessor 53 may stop sending the video signals of the screen to thesensitive display 54. - Instead, the
processor 53 starts generating video signals representing an indicator indicative of the location of hover detected by the processor 53. The processor 53 then starts sending, to the remote display device 51 via the communication circuitry 55, video signals of the screen and the indicator (S602). -
FIG. 58 is a flowchart illustrating a fourth aspect of the display control in accordance with thedisplay control program 56 f. - The display control of
FIG. 58 is operated when the communication circuitry 55 becomes deactivated. While the communication circuitry 55 is active, as mentioned above with reference to FIGS. 55 and 56 , video signals representing the screen of the OS 56 a or one of the application programs 56 b to 56 e, as well as an indicator indicative of the location of hover detected by the processor 53, are being sent to the remote display device 51. - In this situation, if the user operates to deactivate the
communication circuitry 55 by, for example, tapping on a predetermined icon on thesensitive display 54, theprocessor 53 deactivates the communication circuitry 55 (S700). - The
processor 53 then stops sending the video signals of the screen and the indicator (S701). In S701, theprocessor 53 may also stop generating the video signals representing the indicator. - Instead, the
processor 53 then starts displaying the screen, without the indicator, on the sensitive display 54 (S702). More specifically, the processor 53 starts sending video signals of the screen to the sensitive display 54. - Details of how the screen is displayed in accordance with the above-mentioned first to fourth aspects of display control are explained below.
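The switchovers of the third and fourth aspects (S600 to S602 and S700 to S702) amount to a small state machine. The class and method names in the following Python sketch are hypothetical, offered only to summarize the toggling behavior under stated assumptions.

```python
# A small state-machine sketch: toggling the communication circuitry
# switches the screen between the local sensitive display (no
# indicator) and the remote display (screen plus hover indicator).
class DisplayRouter:
    def __init__(self) -> None:
        self.circuitry_active = False
        self.local_output = True    # screen shown on the sensitive display
        self.remote_output = False  # screen + indicator streamed remotely

    def activate_circuitry(self) -> None:
        # S600-S602: stop local display; start streaming with indicator.
        self.circuitry_active = True
        self.local_output = False
        self.remote_output = True

    def deactivate_circuitry(self) -> None:
        # S700-S702: stop streaming; resume local display without indicator.
        self.circuitry_active = False
        self.remote_output = False
        self.local_output = True
```

At any time exactly one output is driven, which matches the mutually exclusive branches of FIGS. 57 and 58.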
-
FIGS. 59 and 60 illustrate how the screen is displayed if the executed application program is the WWW (World Wide Web)browser 56 b. - As illustrated in
FIG. 59 , while the communication circuitry 55 is not active, the screen of the WWW browser 56 b is displayed on the sensitive display 54. An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54. HTML links 60 and graphical buttons 61 contained in the screen are tappable by the user's finger 62 for operation of the WWW browser 56 b. - As illustrated in
FIG. 60 , while thecommunication circuitry 55 is active, the video signals of the screen of theWWW browser 56 b and theindicator 63 are sent from thecomputing device 50 to theremote display device 51. Accordingly, theindicator 63 indicative of the location of hover detected by theprocessor 53 during the display of the screen is displayed over the screen at theremote display device 51. -
FIGS. 61 and 62 illustrate how the screen is displayed if the executed application program is avideo game 56 c. - As illustrated in
FIG. 61 , while the communication circuitry 55 is not active, the screen of the video game 56 c is displayed on the sensitive display 54. An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54. Graphical buttons 64 contained in the screen are tappable by the user's finger 62 for operation of the video game 56 c. - As illustrated in
FIG. 62 , while the communication circuitry 55 is active, the video signals of the screen of the video game 56 c and the indicator 63 are sent from the computing device 50 to the remote display device 51. Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen. -
FIGS. 63 and 64 illustrate how the screen is displayed if the executed application program is thetext editor 56 d. - As illustrated in
FIG. 63 , while the communication circuitry 55 is not active, the screen of the text editor 56 d is displayed on the sensitive display 54. An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54. The graphical keyboard 65 contained in the screen is tappable for operation of the text editor 56 d, namely, for text inputting. - As illustrated in
FIG. 64 , while thecommunication circuitry 55 is active, the video signals of the screen of thetext editor 56 d and theindicator 63 are sent from thecomputing device 50 to theremote display device 51. Accordingly, theindicator 63 indicative of the location of hover detected by theprocessor 53 during the display of the screen is displayed over the screen. -
FIGS. 65 and 66 illustrate how the screen is displayed if the executed application program is themedia player 56 e. - As illustrated in
FIG. 65 , while the communication circuitry 55 is not active, the screen of the media player 56 e is displayed on the sensitive display 54. An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54. Thumbnails 66 of pictures or movies that appear in the screen are tappable for operation of the media player 56 e, namely, for displaying an enlarged picture corresponding to the tapped thumbnail at an area 67 or for playing a movie corresponding to the tapped thumbnail at the area 67. - As illustrated in
FIG. 66 , while the communication circuitry 55 is active, the video signals of the screen of the media player 56 e and the indicator 63 are sent from the computing device 50 to the remote display device 51. Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen. -
FIGS. 67 and 68 illustrate how the screen is displayed when theOS 56 a is executed. - As illustrated in
FIG. 67 , while the communication circuitry 55 is not active, the screen of the OS 56 a is displayed on the sensitive display 54. An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54. Icons 68 representing the application programs 56 b to 56 e that appear in the screen are tappable for operation of the OS 56 a, namely, for launching the one of the application programs 56 b to 56 e corresponding to the tapped icon. - As illustrated in
FIG. 68 , while the communication circuitry 55 is active, the video signals of the screen of the OS 56 a and the indicator 63 are sent from the computing device 50 to the remote display device 51. Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen. - The
indicator 63 can be shaped and/or colored in any manner within the scope of its intention to indicate hover location. For example, theindicator 63 can be shaped to be a form of an arrow as depicted inFIG. 69 , or can be shaped to be a form of a circle and colored translucently or transparently as depicted inFIG. 70 . - According to the above-mentioned first to fourth aspects of the second embodiment, when the user enjoys the computer programs with his/her eyes on the screen displayed on the
remote display device 51 with the communication circuitry 55 being active, the indicator 63 indicates, on the remote display device 51, the location of his/her finger hovering over the sensitive display 54. Therefore, the user can easily recognize where on the sensitive display 54 he/she should tap in order to tap the tappable objects 60, 61, 64, 65, 66, or 68 within the screen while he/she keeps watching the screen displayed on the remote display device 51. - On the other hand, when the user enjoys the computer programs with his/her eyes on the screen displayed on the
sensitive display 54, the indicator 63 is not displayed over the screen; the user can recognize where to tap without the indicator 63 because he/she can see the finger hovering in proximity over the sensitive display 54 while watching the screen displayed on the sensitive display 54. - In this way, usability can be highly improved because an indicator indicative of detected hover is displayed suitably depending on whether or not a computing device is able to communicate with a remote display device. Therefore, usability in operation through the
sensitive display 54 can be improved. -
FIG. 71 is a flowchart illustrating a fifth aspect of the display control performed in accordance with thedisplay control program 56 f. - The
processor 53 launches, namely, executes one of the application programs 56 b to 56 e in response to the user's selection (S800). The selection is made by way of, for example, the user's tap on an icon representing one of the application programs 56 b to 56 e on the sensitive display 54, and the processor 53 detects the tap on the icon. - While the one of the
application programs 56 b to 56 e is executed, the processor 53 determines whether the communication circuitry 55 is active (S801). - As described above, the
communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on the sensitive display 54 or a predetermined gesture on or above the sensitive display 54. - If the
communication circuitry 55 is not active (S801: No), theprocessor 53 generates video signals representing a screen of the executed application program, and sends the video signals to the sensitive display 54 (S802). Accordingly, the screen of the executed application program is displayed on thesensitive display 54. The screen of the executed application program may contain one or more tappable objects such as a HTML link, a text-input field, a software button, a software keyboard, and the like. Each tappable object in a screen has a predefined default appearance such as, for example, size and color. In S802, each tappable object is displayed in its predefined default appearance. - If the
communication circuitry 55 is active (S801: Yes), theprocessor 53 continuously determines whether or not hover is made above a tappable object in the displayed screen (S803). The determination can be made by, for example, comparing the location of a tappable object with the location of hover detected by theprocessor 53. If theprocessor 53 determines that hover is made above a tappable object (S803: Yes), theprocessor 53 emphasizes the tappable object (S804). The emphasizing can be made by, for example, changing the predefined default appearance of the tappable object. The changing includes, without limitation, enlarging or zooming up the predefined default size of the tappable object and highlighting the predefined default color of the tappable object. Theprocessor 53 then generates video signals representing the screen of the executed application program with the emphasized tappable object, and sends the generated video signals to thecommunication circuitry 52 through the communication circuitry 55 (S805). - If the
processor 53 determines that hover is not made above a tappable object (S803: NO), theprocessor 53 does not perform the emphasizing. - In this manner, the screen of the executed application program is displayed on the
remote display device 51, with the tappable object emphasized while hover of the user's finger exists above the tappable object. The steps S801 to S805 may be continuously performed while an application program is executed. - While the screen of the executed application program is displayed on the
sensitive display 54 or on theremote display device 51 in accordance with S802 or S805, theprocessor 53 operates in response to the user's tap on the tappable objects contained in the screen in accordance with the executed application program (S806). -
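The determination and emphasis of S803 and S804 can be sketched as a hit-test followed by an appearance change. In the Python fragment below, the rectangular hit-test, the 1.5x scale factor, and the function names are assumptions for illustration; the disclosure leaves the exact emphasis method open (enlarging or highlighting, without limitation).

```python
# Sketch of the fifth aspect (S803-S805): emphasize a tappable object
# while hover is detected above it. Hit-test shape and scale factor
# are assumed details, not specified in the text.
from typing import Optional, Tuple

Rect = Tuple[int, int, int, int]  # x, y, width, height

def hover_hits(rect: Rect, hover: Optional[Tuple[int, int]]) -> bool:
    """S803: compare the hover location with the object's bounds."""
    if hover is None:
        return False
    x, y, w, h = rect
    hx, hy = hover
    return x <= hx < x + w and y <= hy < y + h

def emphasized_size(rect: Rect, hover: Optional[Tuple[int, int]],
                    scale: float = 1.5) -> Rect:
    """S804: enlarge the object's predefined default size under hover;
    otherwise keep the predefined default appearance."""
    if not hover_hits(rect, hover):
        return rect
    x, y, w, h = rect
    return (x, y, int(w * scale), int(h * scale))
```

The resulting screen, with the emphasized object, would then be streamed to the remote display (S805).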
FIG. 72 is a flowchart illustrating a sixth aspect of the display control performed in accordance with thedisplay control program 56 f. - The
processor 53 launches, namely, executes theOS 56 a in response to activating or powering on the computing device 50 (S900). The activation is made, for example, by way of the user's turning on thecomputing device 50. - While the
OS 56 a is executed, the processor 53 determines whether the communication circuitry 55 is active (S901). - As described above, the
communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on thesensitive display 54 and a predetermined gesture on or above thesensitive display 54. - If the
communication circuitry 55 is not active (S901: No), theprocessor 53 generates video signals representing a screen of theOS 56 a and sends the video signals to the sensitive display 54 (S902). Accordingly, the screen of theOS 56 a is displayed on thesensitive display 54. The screen of theOS 56 a may contain one or more tappable icons representing one or more of theapplication programs 56 b to 56 e. Each tappable icon in a screen has a predefined default appearance such as, for example, size and color. In S902, each tappable icon is displayed in its predefined default appearance. - If the
communication circuitry 55 is active (S901: Yes), theprocessor 53 continuously determines whether or not hover is made above a tappable icon in the displayed screen (S903). The determination can be made by, for example, comparing the location of a tappable icon with the location of hover detected by theprocessor 53. If theprocessor 53 determines that hover is made above a tappable icon (S903: Yes), theprocessor 53 emphasizes the tappable icon (S904). The emphasizing can be made by, for example, changing the predefined default appearance of the tappable icon. The changing includes, without limitation, enlarging or zooming up the predefined default size of the tappable icon and highlighting the predefined default color of the tappable icon. Theprocessor 53 then generates video signals representing the screen of theOS 56 a with the emphasized tappable icon, and sends the generated video signals to thecommunication circuitry 52 through the communication circuitry 55 (S905). - If the
processor 53 determines that hover is not made above a tappable icon (S903: NO), theprocessor 53 does not perform the emphasizing. - In this manner, the screen of the
OS 56 a is displayed on theremote display device 51, with the tappable icon emphasized when hover of the user's finger exists above the tappable icon. The steps S901 to S905 may be continuously performed while theOS 56 a is executed. - While the screen of the
OS 56 a is displayed on thesensitive display 54 or on theremote display device 51 in accordance with S902 or S905, theprocessor 53 operates in response to the user's tap on the tappable icons contained in the screen in accordance with theOS 56 a (S906). -
FIG. 73 is a flowchart illustrating a seventh aspect of the display control in accordance with thedisplay control program 56 f. - The display control of
FIG. 73 is operated when thecommunication circuitry 55 becomes activated. While thecommunication circuitry 55 is not active, as mentioned above with reference toFIGS. 71 and 72 , the screen of theOS 56 a or one of theapplication programs 56 b to 56 e is displayed on the sensitive display 54 (S802, S902). - In this situation, if the user operates to activate the
communication circuitry 55 by, for example, tapping on a predetermined icon on thesensitive display 54, theprocessor 53 activates the communication circuitry 55 (S1000). - The
processor 53 then stops displaying the screen on the sensitive display 54 (S1001). More specifically, theprocessor 53 may also stop sending video signals of the screen to thesensitive display 54. - Instead, the
processor 53 starts sending video signals of the screen to the remote display device 51 via the communication circuitry 55 (S1002). The processor 53 also starts emphasizing the tappable object or icon above which hover is made, as described in S804, S805, S904, and S905 (S1003). -
FIG. 74 is a flowchart illustrating an eighth aspect of the display control in accordance with the display control program 56 f. - The display control of
FIG. 74 is operated when the communication circuitry 55 becomes deactivated. While the communication circuitry 55 is active, as mentioned above with reference to FIGS. 71 and 72, video signals representing the screen of the OS 56 a or one of the application programs 56 b to 56 e are being sent to the remote display device 51. A tappable object or a tappable icon contained in the screen becomes emphasized if the processor 53 determines that hover exists above the tappable object or tappable icon. - In this situation, if the user operates to deactivate the
communication circuitry 55 by, for example, tapping on a predetermined icon on thesensitive display 54, theprocessor 53 deactivates the communication circuitry 55 (S1100). - The
processor 53 then stops sending the video signals of the screen (S1101). Theprocessor 53 also stops emphasizing the tappable object or icon even if hover is detected above the tappable object or icon (S1102). - Instead, the
processor 53 then starts displaying the screen on the sensitive display 54 (S1103). More specifically, theprocessor 53 starts sending video signals of the screen to thesensitive display 54. - The detail of how the screen is displayed in accordance with the above-mentioned fifth to eighth aspects of display control is explained below.
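The activation and deactivation transitions of the seventh and eighth aspects (S1000 to S1003 and S1100 to S1103) amount to toggling three pieces of state. A minimal sketch follows, with illustrative names not taken from the specification:

```python
class DisplayController:
    """Tracks which display receives the screen and whether hover emphasis is on."""

    def __init__(self):
        self.comm_active = False       # communication circuitry 55
        self.local_output = True       # screen shown on the sensitive display 54
        self.emphasis_enabled = False  # hover-based emphasis of tappable objects

    def activate_communication(self):
        """S1000-S1003: stop the local display, start remote output with emphasis."""
        self.comm_active = True
        self.local_output = False      # S1001: stop sending video signals locally
        self.emphasis_enabled = True   # S1003: emphasize the hovered object or icon

    def deactivate_communication(self):
        """S1100-S1103: stop remote output and emphasis, resume the local display."""
        self.comm_active = False
        self.emphasis_enabled = False  # S1102
        self.local_output = True       # S1103
```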
-
FIGS. 75 and 76 illustrate how the screen is displayed if the executed application program is the WWW (World Wide Web)browser 56 b. - As illustrated in
FIG. 75 , while thecommunication circuitry 55 is not active, the screen of theWWW browser 56 b is displayed on thesensitive display 54. HTML links 60 andgraphical buttons 61 contained in the screen are tappable by the user'sfinger 62 for operation of theWWW browser 56 b. Each of the HTML links 60 and thegraphical buttons 61 is displayed in its predefined default appearance. - While the
communication circuitry 55 is active, the processor 53 continuously determines whether or not hover is made above any of the HTML links 60 and the graphical buttons 61. As illustrated in FIG. 76, if hover is made above one of the HTML links 60, entitled "Today's topic", the HTML link 60 entitled "Today's topic" is emphasized. Accordingly, the video signals of the screen of the WWW browser 56 b, with the HTML link entitled "Today's topic" being emphasized, are sent from the computing device 50 to the remote display device 51 as long as hover exists above the HTML link entitled "Today's topic". The HTML link 60 entitled "Today's topic" is thus displayed in the emphasized appearance as long as hover exists above it. The emphasizing may be, for example, enlarging the HTML link 60 entitled "Today's topic" as depicted in FIG. 76. The emphasizing stops if hover no longer exists above the HTML link 60 entitled "Today's topic" because, for example, the user has moved his/her finger 62 away from above it, and then the HTML link 60 entitled "Today's topic" is displayed in its predefined default appearance again. -
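The enlarging emphasis described for FIG. 76 can be expressed as a simple scaling of the predefined default size; the scale factor below is an assumption for illustration only.

```python
def emphasized_size(default_w: int, default_h: int, hovered: bool, scale: float = 1.5):
    """Return the display size of a tappable object: enlarged while hovered,
    restored to the predefined default once hover no longer exists."""
    if hovered:
        return int(default_w * scale), int(default_h * scale)
    return default_w, default_h
```

The same helper would apply to highlighting if it returned a color instead of a size; the patent leaves the exact change of appearance open.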
FIGS. 77 and 78 illustrate how the screen is displayed if the executed application program is avideo game 56 c. - As illustrated in
FIG. 77 , while thecommunication circuitry 55 is not active, the screen of thevideo game 56 c is displayed on thesensitive display 54.Graphical buttons 64 contained in the screen are tappable by the user'sfinger 62 for operation of thevideo game 56 c. - While the
communication circuitry 55 is active, the processor 53 continuously determines whether or not hover is made above any of the graphical buttons 64. As illustrated in FIG. 78, if hover is made above one of the graphical buttons 64, namely, a circular graphical button 64, the circular graphical button 64 is emphasized. Accordingly, the video signals of the screen of the video game 56 c, with the circular graphical button 64 being emphasized, are sent from the computing device 50 to the remote display device 51 as long as hover exists above the circular graphical button 64. The circular graphical button 64 is thus displayed in the emphasized appearance as long as hover exists above it. The emphasizing may be, for example, changing the size and the color of the circular graphical button 64 as illustrated in FIG. 78. The emphasizing stops if hover no longer exists above the circular graphical button 64 because, for example, the user has moved his/her finger 62 away from above it, and then the circular graphical button 64 is displayed in its predefined default appearance again. -
FIGS. 79 and 80 illustrate how the screen is displayed if the executed application program is thetext editor 56 d. - While the
communication circuitry 55 is not active, the screen of thetext editor 56 d is displayed on thesensitive display 54.Graphical keyboard 65 contained in the screen is tappable for operation of thetext editor 56 d, namely, for text inputting. Thegraphical keyboard 65 is displayed with every key displayed in its predefined default appearance. - While the
communication circuitry 55 is active, theprocessor 53 continuously determines whether or not hover is made above any key of thegraphical keyboard 65. As illustrated inFIG. 80 , if hover is made above a key, namely, a “V” key in thegraphical keyboard 65, the “V” key is emphasized. Accordingly, the video signals of the screen of thetext editor 56 d, with the “V” key being emphasized, are sent from thecomputing device 50 to theremote display device 51 as long as hover exists above the “V” key. Thegraphical keyboard 65 is thus displayed with the “V” key displayed in the emphasized appearance as long as hover exists above the “V” key. The emphasizing may be, for example, enlarging the “V” key as illustrated inFIG. 80 . The emphasizing stops if hover does not exist because, for example, the user has moved his/herfinger 62 away from above the “V” key, and then the “V” key is displayed in its predefined default appearance again. -
FIGS. 81 and 82 illustrate how the screen is displayed if the executed application program is themedia player 56 e. - As illustrated in
FIG. 81 , while thecommunication circuitry 55 is not active, the screen of themedia player 56 e is displayed on thesensitive display 54.Thumbnails 66 of pictures or movies that appear in the screen are tappable for operation of themedia player 56 e, namely, for displaying an enlarged picture corresponding to the tapped thumbnail at anarea 67 or for playing a movie corresponding to the tapped thumbnail at thearea 67. Eachthumbnail 66 is displayed in its predefined default appearance. - While the
communication circuitry 55 is active, theprocessor 53 continuously determines whether or not hover is made above any of thethumbnails 66. As illustrated inFIG. 82 , if hover is made above theupper thumbnail 66, theupper thumbnail 66 is emphasized. Accordingly, the video signals of the screen of themedia player 56 e, with theupper thumbnail 66 emphasized, are sent from thecomputing device 50 to theremote display device 51 as long as hover exists above theupper thumbnail 66. Theupper thumbnail 66 is thus displayed in the emphasized appearance as long as hover exists above theupper thumbnail 66. The emphasizing may be, for example, changing the size and color of theupper thumbnail 66 as illustrated inFIG. 82 . The emphasizing stops if hover does not exist above theupper thumbnail 66 because, for example, the user has moved his/herfinger 62 away from above theupper thumbnail 66, and then theupper thumbnail 66 is displayed in its predefined default appearance again. -
FIGS. 83 and 84 illustrate how the screen is displayed when theOS 56 a is executed. - As illustrated in
FIG. 83 , while thecommunication circuitry 55 is not active, the screen of theOS 56 a is displayed on thesensitive display 54.Icons 68 representing theapplication programs 56 b to 56 e that appear in the screen are tappable for operation of theOS 56 a, namely, for launching one of theapplication programs 56 b to 56 e corresponding to the tapped icon. Eachicon 68 is displayed in its predefined default appearance. - While the
communication circuitry 55 is active, theprocessor 53 continuously determines whether or not hover is made above any of theicons 68. As illustrated inFIG. 84 , if hover is made above the “e” icon representing theWWW browser 56 b, the “e” icon is emphasized. Accordingly, the video signals of the screen of theOS 56 a, with the “e” icon emphasized, are sent from thecomputing device 50 to theremote display device 51 as long as hover exists above the “e” icon. The “e” icon is thus displayed in the emphasized appearance as long as hover exists above the “e” icon. The emphasizing may be, for example, changing the size and shape of the “e” icon as illustrated inFIG. 84 . The emphasizing stops if hover does not exist because, for example, the user has moved his/herfinger 62 away from above the “e” icon, and then the “e” icon is displayed in its predefined default appearance again. - According to the fifth to eighth aspects of display control of the second embodiment, when the user enjoys the computer programs with his/her eyes on the screen displayed on the
remote display device 51 with the communication circuitry 55 being active, a tappable object which is about to be tapped is emphasized. Therefore, the user can easily recognize where on the sensitive display 54 he/she should tap, or how far he/she should move the finger in order to tap the tappable objects 60, 61, 64, 65, 66, or 68 within the screen, while he/she keeps watching the screen displayed on the remote display device 51. - On the other hand, when the user enjoys the computer programs with his/her eyes on the screen displayed on the
sensitive display 54, tappable objects are not emphasized, because he/she can recognize where to tap without the emphasizing: he/she can see the finger hovering in proximity over the sensitive display 54 while watching the screen displayed on it. - In this way, a tappable object is emphasized suitably depending on whether or not the computing device is able to communicate with a remote display device. Therefore, usability in operation through the sensitive display 54 can be greatly improved. -
FIG. 85 is a flowchart illustrating a ninth aspect of the display control in accordance with thedisplay control program 56 f. - The
processor 53 launches, namely, executes one of theapplication programs 56 b to 56 e in response to the user's selection (S1300). The selection is made by way of, for example, the user's tap on an icon representing any of theapplication programs 56 b to 56 e on thesensitive display 54, and theprocessor 53 detects the tap on the icon. - While the one of the
application programs 56 b to 56 e is executed, the processor 53 determines whether or not the communication circuitry 55 is active (S1301). - As described above, the
communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on thesensitive display 54 and a predetermined gesture on or above thesensitive display 54. - If the
communication circuitry 55 is active (S1301: Yes), the processor 53 determines whether or not a screen of the executed application program contains one or more tappable objects such as an HTML link, a text-input field, a software button, and a software keyboard (S1303). If at least one tappable object is contained in the screen (S1303: Yes), the processor 53 generates video signals representing the screen of the executed application program and also generates video signals representing an indicator over the screen. The indicator indicates the hover location by being displayed at the location of hover detected by the processor 53 over the screen. The processor 53 then sends the generated video signals of the screen and the indicator to the communication circuitry 52 through the communication circuitry 55 (S1305). In this manner, the screen of the executed application program and the indicator indicative of the location of, for example, the hover of the user's finger over the screen are displayed on the remote display device 51. - On the other hand, if no tappable object is contained in the screen (S1303: No), the
processor 53 generates video signals representing the screen of the executed application program, but does not generate video signals representing the indicator even if hover is detected. Theprocessor 53 then sends the generated video signals of the screen to thecommunication circuitry 52 through the communication circuitry 55 (S1304). - Back in S1301, if the
communication circuitry 55 is not active (S1301: No), the processor 53 generates video signals representing a screen of the executed application program and sends the video signals to the sensitive display 54 (S1302). Accordingly, the screen of the executed application program is displayed on the sensitive display 54. The screen of the executed application program may contain one or more tappable objects such as an HTML link, a text-input field, a software button, a software keyboard, and the like. The processor 53 does not determine whether or not the screen contains one or more tappable objects, nor generate video signals representing the indicator, even if hover is detected by the sensitive display 54. - While the screen of the executed application program and the indicator are displayed on the
remote display device 51 according to S1305 or the screen is displayed on the sensitive display 54 according to S1302, the processor 53 operates in response to the user's tap on the tappable objects contained in the screen in accordance with the executed application program (S1306). -
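The branch at S1303 to S1305, in which the hover indicator is included only when the screen contains at least one tappable object, can be sketched as follows; the names are illustrative assumptions.

```python
def compose_remote_frame(tappable_objects, hover):
    """Return (screen, indicator) for the remote display per S1303-S1305:
    the indicator is generated only when at least one tappable object exists."""
    has_tappable = len(tappable_objects) > 0               # S1303
    if has_tappable and hover is not None:
        return tappable_objects, {"indicator_at": hover}   # S1305: screen + indicator
    return tappable_objects, None                          # S1304: screen only
```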
FIG. 86 is a flowchart illustrating a tenth aspect of the display control in accordance with thedisplay control program 56 f. - The display control of
FIG. 86 is operated when thecommunication circuitry 55 becomes activated. While thecommunication circuitry 55 is not active, as mentioned above with reference toFIG. 85 , the screen of one of theapplication programs 56 b to 56 e is displayed on the sensitive display 54 (S1302). - In this situation, if the user operates to activate the
communication circuitry 55 by, for example, tapping on a predetermined icon on thesensitive display 54, theprocessor 53 activates the communication circuitry 55 (S1400). - The
processor 53 then stops displaying the screen on the sensitive display 54 (S1401). More specifically, theprocessor 53 may stop sending the video signals of the screen to thesensitive display 54. Theprocessor 53 may also turn off thesensitive display 54. - Instead, the
processor 53 starts determining whether or not the screen contains at least one tappable object (S1402). The processor 53 starts generating video signals based on the determination at S1402 and sending the generated video signals to the communication circuitry 52 through the communication circuitry 55 (S1403). Namely, the processor 53 generates video signals representing the screen and the indicator indicative of the location of hover detected by the processor 53 as long as the processor 53 determines that at least one tappable object is contained in the screen, whereas the processor 53 generates video signals representing the screen but does not generate video signals representing the indicator as long as the processor 53 determines that no tappable object is contained. -
FIG. 87 is a flowchart illustrating an eleventh aspect of the display control in accordance with thedisplay control program 56 f. - The display control of
FIG. 87 is operated when thecommunication circuitry 55 becomes deactivated. While thecommunication circuitry 55 is active, as mentioned above with reference toFIG. 85 , video signals representing the screen only or representing the screen and the indicator are being sent to theremote display device 51 based on the determination whether or not the screen contains at least one tappable object. - In this situation, if the user operates to deactivate the
communication circuitry 55 by, for example, tapping on a predetermined icon on thesensitive display 54, theprocessor 53 deactivates the communication circuitry 55 (S1500). - The
processor 53 then stops determining whether or not the screen contains at least one tappable object (S1501) and also stops sending the video signals to the remote display device 51 (S1502). - Instead, the
processor 53 then starts displaying the screen, without the indicator, on the sensitive display 54 (S1503). More specifically, theprocessor 53 starts sending video signals of the screen only to thesensitive display 54. - The detail of how the screen is displayed in accordance with the above-mentioned ninth to eleventh aspects of display control is explained below.
-
FIGS. 88 through 90 illustrate how the screen is displayed if the executed application program is the WWW (World Wide Web)browser 56 b. - As illustrated in
FIG. 88 , while thecommunication circuitry 55 is not active, the screen of theWWW browser 56 b is displayed on thesensitive display 54. An indicator indicative of hover is not generated and displayed over the screen even if such hover is detected by thesensitive display 54. Tappable objects of HTML links 60 andgraphical buttons 61 may appear in the screen for operation of theWWW browser 56 b. - As illustrated in
FIG. 89, while the communication circuitry 55 is active, as long as the tappable objects 60 and 61 appear in the screen of the WWW browser 56 b, the video signals of the screen and the indicator 63 indicative of the hover location are sent from the computing device 50 to the remote display device 51. Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen at the remote display device 51. On the other hand, even while the communication circuitry 55 is active, as long as the screen contains no tappable object, the video signals of the screen only are sent from the computing device 50 to the remote display device 51. As illustrated in FIG. 90, the screen may contain no tappable object when, for example, the browser 56 b displays a movie streamed from a video streaming website in a full-screen manner. -
FIGS. 91 through 93 illustrate how the screen is displayed if the executed application program is avideo game 56 c. - As illustrated in
FIG. 91 , while thecommunication circuitry 55 is not active, the screen of thevideo game 56 c is displayed on thesensitive display 54. An indicator indicative of hover is not generated and displayed over the screen even if such hover is detected by thesensitive display 54. The screen may containgraphical buttons 64 to be tapped by the user'sfinger 62 for operation of thevideo game 56 c. - As illustrated in
FIG. 92, while the communication circuitry 55 is active, as long as the screen contains the graphical buttons 64, the video signals of the screen of the video game 56 c and the indicator 63 are sent from the computing device 50 to the remote display device 51. Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen. On the other hand, even while the communication circuitry 55 is active, as long as the screen contains none of the graphical buttons 64, the video signals of the screen only are sent from the computing device 50 to the remote display device 51. As illustrated in FIG. 93, the screen contains none of the graphical buttons 64 when, for example, the video game 56 c can be operated by the user's gesture on or above the screen instead of tapping on the graphical buttons 64. -
FIGS. 94 through 96 illustrate how the screen is displayed if the executed application program is themedia player 56 e. - As illustrated in
FIG. 94 , while thecommunication circuitry 55 is not active, the screen of themedia player 56 e is displayed on thesensitive display 54. An indicator indicative of hover is not generated and displayed over the screen even if such hover is detected by thesensitive display 54.Thumbnails 66 of pictures or movies may appear in the screen for operation of themedia player 56 e, namely, for displaying an enlarged picture corresponding to the tapped thumbnail at anarea 67 or for playing a movie corresponding to the tapped thumbnail at thearea 67. - As illustrated in
FIG. 95, while the communication circuitry 55 is active, as long as the screen contains the thumbnails 66, the video signals of the screen of the media player 56 e and the indicator 63 are sent from the computing device 50 to the remote display device 51. Accordingly, the indicator 63 indicative of the location of hover detected by the processor 53 during the display of the screen is displayed over the screen. On the other hand, even while the communication circuitry 55 is active, as long as the screen contains none of the thumbnails 66, the video signals of the screen only are sent from the computing device 50 to the remote display device 51. As illustrated in FIG. 96, the screen contains none of the thumbnails 66 when, for example, the enlarged picture or movie is played and displayed at the area 67 in a full-screen manner. -
indicator 63 can be shaped and/or colored in any manner so long as it serves its purpose of indicating the hover location. For example, the indicator 63 can be shaped in the form of an arrow as depicted in FIG. 33A, or in the form of a circle and colored translucently or transparently as depicted in FIG. 70. - According to the above-mentioned ninth to eleventh aspects of the second embodiment, even when the user enjoys the computer programs with his/her eyes on the screen displayed on the
remote display device 51 with the communication circuitry 55 being active, the indicator 63 does not appear on the remote display device 51 as long as the user does not need to recognize where to tap, that is, as long as no tappable object appears on the remote display device 51. In other words, an indicator indicative of detected hover is displayed suitably depending on the user's need. Therefore, usability in operation through the sensitive display 54 can be improved. -
FIG. 97 is a flowchart illustrating a twelfth aspect of the display control performed in accordance with thedisplay control program 56 f. - The
processor 53 launches, namely, executes one of theapplication programs 56 b to 56 e in response to the user's selection (S1600). The selection is made by way of, for example, the user's tap on an icon representing theapplication programs 56 b to 56 e on thesensitive display 54, and theprocessor 53 detects the tap on the icon. - While the one of the
application programs 56 b to 56 e is executed, the processor 53 determines whether or not the communication circuitry 55 is active (S1601). - As described above, the
communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on thesensitive display 54 and a predetermined gesture on or above thesensitive display 54. - If the
communication circuitry 55 is not active (S1601: No), theprocessor 53 continuously determines whether or not hover is made above a tappable object in the displayed screen (S1602). The determination can be made by, for example, comparing the location of a tappable object with the location of hover detected by theprocessor 53. If theprocessor 53 determines that hover is made above a tappable object (S1602: Yes), theprocessor 53 emphasizes the tappable object (S1603). The emphasizing can be made by, for example, changing the predefined default appearance of the tappable object. The changing includes, without limitation, enlarging or zooming up the predefined default size of the tappable object and highlighting the predefined default color of the tappable object. Theprocessor 53 then displays the screen of the executed application program, with the emphasized tappable object, on the sensitive display 54 (S1604). - If the
processor 53 determines that hover is not made above a tappable object (S1602: NO), theprocessor 53 does not perform the emphasizing and displays the screen on the sensitive display 54 (S1604). - In this manner, the screen of the executed application program is displayed on the
sensitive display 54, with the tappable object emphasized while hover of the user's finger exists above the tappable object. The steps S1602 to S1604 may be continuously performed while an application program is executed as long as the communication circuitry is not active. - Back in S1601, if the
communication circuitry 55 is active (S1601: Yes), theprocessor 53 generates video signals representing the screen and also generates video signals representing an indicator over the screen. The indicator indicates the hover location by being displayed at the location of hover detected by theprocessor 53 over the screen. Theprocessor 53 then sends the generated video signals of the screen and the indicator to thecommunication circuitry 52 through the communication circuitry 55 (S1605). Accordingly, the screen and the indicator are displayed on theremote display device 51. - While the screen only is displayed on the
sensitive display 54 according to S1604 or the screen and the indicator are displayed on theremote display device 51 according to S1605, theprocessor 53 operates in response to the user's tap on the tappable objects contained in the screen in accordance with the executed application program (S1606). -
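The twelfth aspect inverts the earlier routing: emphasis on the sensitive display when the communication circuitry is inactive, a hover indicator on the remote display when it is active. A compact sketch, with illustrative names not taken from the specification:

```python
def route_frame(comm_active, hover, hovered_object):
    """Decide the destination display and the decoration for the current frame
    per S1601-S1605 of the twelfth aspect."""
    if comm_active:                                   # S1601: Yes
        # S1605: send the screen plus an indicator at the hover location remotely
        return "remote", {"indicator": hover}
    # S1602-S1604: emphasize the hovered tappable object on the local display
    decoration = {"emphasize": hovered_object} if hovered_object else {}
    return "local", decoration
```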
FIG. 98 is a flowchart illustrating a thirteenth aspect of the display control performed in accordance with thedisplay control program 56 f. - The
processor 53 launches, namely, executes theOS 56 a in response to activating or powering on the computing device 50 (S1700). The activation is made, for example, by way of the user's turning on thecomputing device 50. - While the
OS 56 a is executed, the processor 53 determines whether or not the communication circuitry 55 is active (S1701). - As described above, the
communication circuitry 55 becomes activated or deactivated by way of, for example, the user's operation such as a tap or hover of his/her finger on or above a predetermined icon displayed on thesensitive display 54 and a predetermined gesture on or above thesensitive display 54. - If the
communication circuitry 55 is not active (S1701: No), theprocessor 53 continuously determines whether or not hover is made above a tappable object in the displayed screen (S1702). The determination can be made by, for example, comparing the location of a tappable object with the location of hover detected by theprocessor 53. If theprocessor 53 determines that hover is made above a tappable object (S1702: Yes), theprocessor 53 emphasizes the tappable object (S1703). The emphasizing can be made by, for example, changing the predefined default appearance of the tappable object. The changing includes, without limitation, enlarging or zooming up the predefined default size of the tappable object and highlighting the predefined default color of the tappable object. Theprocessor 53 then displays the screen of theOS 56 a, with the emphasized tappable object, on the sensitive display 54 (S1704). - If the
processor 53 determines that hover is not made above a tappable object (S1702: NO), theprocessor 53 does not perform the emphasizing and displays the screen on the sensitive display 54 (S1704). - In this manner, the screen of the
OS 56 a is displayed on the sensitive display 54, with the tappable object emphasized while hover of the user's finger exists above the tappable object. The steps S1702 to S1704 may be continuously performed while the OS 56 a is executed, as long as the communication circuitry is not active. - Back in S1701, if the
communication circuitry 55 is active (S1701: Yes), theprocessor 53 generates video signals representing the screen and also generates video signals representing an indicator over the screen. The indicator indicates the hover location by being displayed at the location of hover detected by theprocessor 53 over the screen. Theprocessor 53 then sends the generated video signals of the screen and the indicator to thecommunication circuitry 52 through the communication circuitry 55 (S1705). Accordingly, the screen and the indicator are displayed on theremote display device 51. - While the screen only is displayed on the
sensitive display 54 according to S1604 or the screen and the indicator are displayed on theremote display device 51 according to S1605, theprocessor 53 operates in response to the user's tap on the tappable objects contained in the screen in accordance with theOS 56 a (S1706). -
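The branch described above (S1701 through S1705) amounts to routing each frame either to the sensitive display with hover emphasis, or to the remote display with a hover indicator. The following is a minimal, illustrative Python sketch of that routing decision; all function and field names are invented for illustration, as the patent describes the behavior only as flowchart steps:

```python
# Illustrative sketch of the S1701-S1705 branch. Names are assumptions;
# the patent does not provide source code.

def route_frame(remote_active, hover_xy, tappable_objects):
    """Decide what to draw and where for one frame.

    remote_active    -- whether the communication circuitry is active (S1701)
    hover_xy         -- detected hover location as (x, y), or None
    tappable_objects -- list of dicts, each with a "bounds" rect (x, y, w, h)
    Returns (destination, emphasized_indices, indicator_xy).
    """
    def hit(rect, xy):
        x, y, w, h = rect
        return xy is not None and x <= xy[0] < x + w and y <= xy[1] < y + h

    if not remote_active:
        # S1702-S1704: emphasize any tappable object under the hover and
        # show the screen locally; no indicator is drawn.
        emphasized = [i for i, obj in enumerate(tappable_objects)
                      if hit(obj["bounds"], hover_xy)]
        return ("sensitive_display", emphasized, None)
    # S1705: send the screen plus an indicator at the hover location to the
    # remote display; no emphasis is applied.
    return ("remote_display", [], hover_xy)
```

A hover at (5, 5) over an object bounded by (0, 0, 10, 10) would thus be emphasized locally when the remote link is down, and rendered as an indicator remotely when the link is up.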
FIG. 99 is a flowchart illustrating a fourteenth aspect of the display control in accordance with the display control program 56f.
- The display control of FIG. 99 is operated when the communication circuitry 55 becomes activated. While the communication circuitry 55 is not active, as mentioned above with reference to FIGS. 97 and 98, the screen of the OS 56a or one of the application programs 56b to 56e is displayed on the sensitive display 54 (S1604, S1704).
- In this situation, if the user operates to activate the communication circuitry 55 by, for example, tapping on a predetermined icon on the sensitive display 54, the processor 53 activates the communication circuitry 55 (S1800).
- The processor 53 then stops determining whether or not hover is made above a tappable object (S1801), stops the emphasizing of tappable objects (S1802), and also stops displaying the screen on the sensitive display 54 (S1803). More specifically, the processor 53 may also stop sending video signals of the screen to the sensitive display 54.
- Instead, the processor 53 starts generating video signals representing the indicator indicative of the location of hover detected by the sensitive display 54, and starts sending the video signals representing the screen and the indicator to the remote display device 51 via the communication circuitry 55 (S1804).
-
FIG. 100 is a flowchart illustrating a fifteenth aspect of the display control in accordance with the display control program 56f.
- The display control of FIG. 100 is operated when the communication circuitry 55 becomes deactivated. While the communication circuitry 55 is active, as mentioned above with reference to FIGS. 97 and 98, video signals representing the screen of the OS 56a or one of the application programs 56b to 56e, as well as the indicator, are being sent to the remote display device 51.
- In this situation, if the user operates to deactivate the communication circuitry 55 by, for example, tapping on a predetermined icon on the sensitive display 54, the processor 53 deactivates the communication circuitry 55 (S1900).
- The processor 53 then stops sending the video signals of the screen and the indicator (S1901). The processor 53 may also stop generating the video signals representing the indicator.
- Instead, the processor 53 then starts determining whether or not hover is made above a tappable object contained in the screen (S1902), and starts emphasizing the tappable object in response to a determination that hover is made above the tappable object (S1903). Accordingly, the tappable object contained in the screen becomes emphasized if the processor 53 determines that hover exists above the tappable object. The processor 53 then displays the screen on the sensitive display 54 (S1904). More specifically, the processor 53 starts sending video signals of the screen to the sensitive display 54.
- The details of how the screen is displayed in accordance with the above-mentioned twelfth to fifteenth aspects of display control are explained below.
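The activation transition of FIG. 99 and the deactivation transition of FIG. 100 are mirror images of one another, and can be summarized as a pair of state changes. The following is a minimal, illustrative Python sketch; the class and flag names are invented for illustration, as the patent specifies the behavior only as flowchart steps:

```python
# Hedged sketch of the transitions of FIGS. 99 and 100 (S1800-S1804 and
# S1900-S1904). Names are assumptions, not the patent's own implementation.

class DisplayController:
    def __init__(self):
        self.remote_active = False
        self.hover_emphasis_enabled = True   # S1702/S1703 hover-emphasis loop
        self.indicator_enabled = False       # hover-indicator generation
        self.local_output = True             # video signals to display 54
        self.remote_output = False           # video signals to display 51

    def activate_remote(self):
        """FIG. 99: the user activates the communication circuitry (S1800)."""
        self.remote_active = True
        self.hover_emphasis_enabled = False  # S1801, S1802
        self.local_output = False            # S1803
        self.indicator_enabled = True        # S1804
        self.remote_output = True            # S1804

    def deactivate_remote(self):
        """FIG. 100: the user deactivates the communication circuitry (S1900)."""
        self.remote_active = False
        self.remote_output = False           # S1901
        self.indicator_enabled = False       # S1901
        self.hover_emphasis_enabled = True   # S1902, S1903
        self.local_output = True             # S1904
```

In this sketch the hover-emphasis loop and the hover indicator are mutually exclusive: exactly one of them is enabled at any time, tracking whether the communication circuitry is active.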
-
FIGS. 101 and 102 illustrate how the screen is displayed if the executed application program is the WWW (World Wide Web) browser 56b.
- As illustrated in FIG. 101, while the communication circuitry 55 is not active, the screen of the WWW browser 56b is displayed on the sensitive display 54. HTML links 60 and graphical buttons 61 contained in the screen are tappable by the user's finger 62 for operation of the WWW browser 56b. Each of the HTML links 60 and the graphical buttons 61 is displayed in its predefined default appearance. As illustrated in FIG. 101, if hover of an object such as the user's finger 62 is made above the HTML link 60a labeled “Today's topic”, the HTML link 60a is emphasized.
- As illustrated in FIG. 102, while the communication circuitry 55 is active, the video signals of the screen of the WWW browser 56b and the indicator 63 are sent from the computing device 50 to the remote display device 51. Accordingly, the indicator 63, indicative of the location of hover detected by the processor 53 during the display of the screen, is displayed over the screen at the remote display device 51.
- FIGS. 103 and 104 illustrate how the screen is displayed if the executed application program is a video game 56c.
- As illustrated in FIG. 103, while the communication circuitry 55 is not active, the screen of the video game 56c is displayed on the sensitive display 54. An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54. Graphical buttons 64 contained in the screen are tappable by the user's finger 62 for operation of the video game 56c. As illustrated in FIG. 103, if hover of an object such as the user's finger 62 is made above the circular graphical button 64a, the graphical button 64a is emphasized.
- As illustrated in FIG. 104, while the communication circuitry 55 is active, the video signals of the screen of the video game 56c and the indicator 63 are sent from the computing device 50 to the remote display device 51. Accordingly, the indicator 63, indicative of the location of hover detected by the processor 53 during the display of the screen, is displayed over the screen.
-
FIGS. 105 and 106 illustrate how the screen is displayed if the executed application program is the text editor 56d.
- As illustrated in FIG. 105, while the communication circuitry 55 is not active, the screen of the text editor 56d is displayed on the sensitive display 54. An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54. A graphical keyboard 65 contained in the screen is tappable for operation of the text editor 56d, namely, for text input. As illustrated in FIG. 105, if hover of an object such as the user's finger 62 is made above a “V” key of the graphical keyboard 65, the “V” key is emphasized.
- As illustrated in FIG. 106, while the communication circuitry 55 is active, the video signals of the screen of the text editor 56d and the indicator 63 are sent from the computing device 50 to the remote display device 51. Accordingly, the indicator 63, indicative of the location of hover detected by the processor 53 during the display of the screen, is displayed over the screen.
- FIGS. 107 and 108 illustrate how the screen is displayed if the executed application program is the media player 56e.
- As illustrated in FIG. 107, while the communication circuitry 55 is not active, the screen of the media player 56e is displayed on the sensitive display 54. An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54. Thumbnails 66 of pictures or movies that appear in the screen are tappable for operation of the media player 56e, namely, for displaying an enlarged picture corresponding to the tapped thumbnail at an area 67 or for playing a movie corresponding to the tapped thumbnail at the area 67. As illustrated in FIG. 107, if hover of an object such as the user's finger 62 is made above the upper thumbnail 66a, the thumbnail 66a is emphasized.
- As illustrated in FIG. 108, while the communication circuitry 55 is active, the video signals of the screen of the media player 56e and the indicator 63 are sent from the computing device 50 to the remote display device 51. Accordingly, the indicator 63, indicative of the location of hover detected by the processor 53 during the display of the screen, is displayed over the screen.
-
FIGS. 109 and 110 illustrate how the screen is displayed when the OS 56a is executed.
- As illustrated in FIG. 109, while the communication circuitry 55 is not active, the screen of the OS 56a is displayed on the sensitive display 54. An indicator indicative of hover is neither generated nor displayed over the screen even if such hover is detected by the sensitive display 54. Icons 68 representing the application programs 56b to 56e that appear in the screen are tappable for operation of the OS 56a, namely, for launching the one of the application programs 56b to 56e corresponding to the tapped icon. As illustrated in FIG. 109, if hover of an object such as the user's finger 62 is made above the icon 68 representing the browser 56b, the icon 68 is emphasized.
- As illustrated in FIG. 110, while the communication circuitry 55 is active, the video signals of the screen of the OS 56a and the indicator 63 are sent from the computing device 50 to the remote display device 51. Accordingly, the indicator 63, indicative of the location of hover detected by the processor 53 during the display of the screen, is displayed over the screen.
- The indicator 63 can be shaped and/or colored in any manner consistent with its purpose of indicating the hover location. For example, the indicator 63 can be shaped as an arrow as depicted in FIG. 69, or shaped as a circle and colored translucently or transparently as depicted in FIG. 70.
- According to the above-mentioned twelfth to fifteenth aspects of the second embodiment, when the user enjoys the computer programs with his/her eyes on the screen displayed on the remote display device 51 while the communication circuitry 55 is active, the indicator 63 indicates, on the remote display device 51, the location of his/her finger hovering over the sensitive display 54. Therefore, the user can easily recognize where on the sensitive display 54 he/she should tap in order to tap the tappable objects 60, 61, 64, 65, 66, or 68 within the screen while keeping his/her eyes on the screen displayed on the remote display device 51.
- On the other hand, when the user enjoys the computer programs with his/her eyes on the screen displayed on the sensitive display 54, the indicator 63 is not displayed over the screen. The user can recognize where to tap without the indicator 63 because he/she can see the finger hovering in proximity over the sensitive display 54 while watching the screen displayed on the sensitive display 54.
- In this way, usability can be greatly improved because an indicator indicative of detected hover is displayed suitably depending on whether or not the computing device is able to communicate with a remote display device. Therefore, usability in operation through the sensitive display 54 can be improved.
- In a sixteenth aspect of the present embodiment, the
processor 53 controls display of tappable graphical keys, or software keys, for text input depending on whether the communication circuitry 55 is active or not.
- FIG. 111 is a flowchart illustrating the sixteenth aspect of the display control in accordance with the display control program 56f.
- While the screen of the OS 56a or one of the application programs 56b to 56e is displayed according to the first to fifteenth aspects of the display control depicted in FIGS. 52 to 110, the processor 53 continuously determines the location at which a tap is made within a screen based on signals representing taps sent from the sensitive display 54. The processor 53 then continuously determines whether or not a tap is made at a text input field in the screen. The text input field is associated with one or more tappable software keys for text input.
- If a tap on a text input field contained in the screen is detected (S1200), the processor 53 determines whether or not the communication circuitry 55 is active (S1201).
- If the communication circuitry 55 is not active (S1201: No), namely, when the screen is displayed on the sensitive display 54 according to the first to fifteenth aspects of the display control, the processor 53 generates video signals representing one or more tappable software keys for text input with a first size and sends the video signals to the sensitive display 54 (S1202). The video signals representing the tappable software keys may be sent in parallel with or along with video signals representing the screen.
- The first size is smaller than the size of the screen. Accordingly, the one or more software keys are displayed on the sensitive display 54, overlapping the screen.
- If the communication circuitry 55 is active (S1201: Yes), namely, when the screen is displayed on the remote display device 51 according to the first to fifteenth aspects of the display control, the processor 53 generates video signals representing one or more tappable software keys for text input with a second size and sends the video signals to the sensitive display 54 (S1203).
- The second size is larger than the first size, and can preferably be a full-screen size. Accordingly, the one or more software keys are displayed alone on the sensitive display 54.
- While the one or more tappable software keys are displayed on the sensitive display 54 in accordance with S1202 or S1203, the processor 53 receives text inputs in response to the user's taps on the tappable software keys (S1204).
- When the user has completed the text inputs through tapping on the tappable software keys on the sensitive display 54, the processor 53 stops sending the video signals of the tappable software keys to the sensitive display 54 (S1205, S1206). Preferably, in S1206, the processor 53 may turn off the sensitive display 54 because there is no longer anything to be displayed on the sensitive display 54 until a tap on the text input field is detected again.
-
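The size selection of S1201 through S1203 can be sketched as a single decision. The following illustrative Python fragment assumes concrete proportions and names, since the patent states only that the first size is smaller than the screen and the second size is larger, preferably full-screen:

```python
# Hedged sketch of the sixteenth aspect (FIG. 111, S1200-S1206). The
# one-third-of-screen "first size" is an assumption for illustration.

def keyboard_layout(remote_active, screen_w, screen_h):
    """Return (width, height, overlaps_screen) for the graphical keyboard.

    S1202: remote link inactive -> first size, smaller than the screen,
           drawn over the locally displayed screen.
    S1203: remote link active   -> second, larger size (here full-screen),
           drawn alone because the screen itself is on the remote display.
    """
    if not remote_active:
        first_size = (screen_w, screen_h // 3)   # assumed "first size"
        return (*first_size, True)
    second_size = (screen_w, screen_h)           # preferably full-screen
    return (*second_size, False)
```

On an assumed 1080 by 1920 panel, the keyboard would occupy the lower third of the display while the screen is shown locally, and the entire display while the screen is shown remotely.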
FIGS. 112 to 115 depict how the tappable software keys are displayed when the screen of the WWW browser 56b is displayed. The screen of the WWW browser 56b contains a text input field 70 at which a URL (Uniform Resource Locator) can be input by the user.
- While the communication circuitry 55 is not active, the screen is displayed on the sensitive display 54 in accordance with the first to fifteenth aspects of the display control, as depicted in FIG. 112.
- If a tap by the user's finger 62 at the text input field 70 in the screen is detected through the sensitive display 54, the processor 53 determines whether the communication circuitry 55 is active. If the communication circuitry 55 is determined to be not active, the processor 53 generates and sends video signals of a graphical keyboard 71 with the first size to the sensitive display 54. The graphical keyboard 71 includes plural tappable alphabetical keys for text input. Accordingly, the graphical keyboard 71 is displayed along with the screen of the WWW browser 56b, as depicted in FIG. 113. In this situation, the user can perform text inputs through the graphical keyboard 71 while watching the screen on the sensitive display 54, although the graphical keyboard 71 is relatively small.
- While the communication circuitry 55 is active, the screen of the WWW browser 56b is displayed on the remote display device 51 in accordance with the first to fifteenth aspects of the display control, as depicted in FIG. 114.
- If a tap by the user's finger 62 at the text input field 70 in the screen is detected through the sensitive display 54, the processor 53 determines whether the communication circuitry 55 is active. If the communication circuitry 55 is determined to be active, the processor 53 generates and sends video signals of a graphical keyboard 71 with the second size to the sensitive display 54. Accordingly, the graphical keyboard 71 is displayed alone, substantially filling the sensitive display 54, apart from the screen of the WWW browser 56b displayed on the remote display device 51, as depicted in FIG. 115. In this situation, the user can perform text inputs easily because the displayed graphical keyboard 71 is relatively large.
- Thanks to the above-mentioned display control, usability can be greatly improved because one or more software keys are displayed suitably depending on whether or not a computing device is able to communicate with a remote display device.
- As illustrated in FIG. 111, the sixteenth aspect of the display control begins when a tap on a text input field contained in the screen is detected (S1200). Alternatively, the sixteenth aspect of the display control may begin when hover is detected in proximity above the text input field over the screen for more than a predetermined period.
- In the above-mentioned embodiments, the video signals generated by the computing devices 1 and 50 may be analog video signals or digital video signals. Also, the video signals may be non-encoded or encoded pursuant to a protocol such as MPEG (Moving Picture Experts Group).
- Further modifications and alternative embodiments will be apparent to those skilled in the art in view of this disclosure. Accordingly, the above description is to be construed as illustrative only and is for the purpose of teaching those skilled in the art a manner of carrying out the invention. It is to be understood that the forms of the invention herein shown and described are to be taken as exemplary embodiments. Various modifications may be made without departing from the scope of the invention. For example, equivalent elements or materials may be substituted for those illustrated and described herein, and certain features of the invention may be utilized independently of the use of other features, all as would be apparent to one skilled in the art after having the benefit of this description of the invention. In addition, the terms “a” and “an” are generally used in the present disclosure to mean one or more.
- Thanks to the above-mentioned user interface with improved display control according to the embodiments, usability can be highly improved.
Claims (19)
1-12. (canceled)
13. A method for a computing device with a sensitive display, comprising:
displaying a screen of a computer program that contains a first graphical user interface object, on the sensitive display;
detecting hover of a physical object in proximity above the sensitive display while the physical object is hovering above the sensitive display;
detecting a tap of the physical object on the sensitive display when the physical object comes in contact with the sensitive display subsequently to the hover;
determining a hover location above which the physical object is hovering in the screen on the basis of the detection of the hover;
determining a tap location on which the physical object comes in contact in the screen on the basis of the detection of the tap;
in response to determining that the physical object is hovering above the first graphical user interface object for more than a first predetermined time period on the basis of the determination of the hover location, displaying a second graphical user interface object associated with the first graphical user interface object a given number of pixels away from the first graphical user interface object in the screen;
after the display of the second graphical user interface object, in response to at least one of failure of detecting the hover and lapse of a second predetermined time period, stopping the display of the second graphical user interface object; and
in response to determining that the physical object is tapped onto the second graphical user interface object on the basis of the determination of the tap location before the stop of the display of the second graphical user interface object, invoking a predetermined event associated with the second graphical user interface object.
14. A method according to claim 13, wherein:
the first graphical user interface object is a given location or area within the screen that is assigned for popup of a menu for operation of the computer program;
the second graphical user interface object is the menu with one or more software buttons each of which is assigned a discrete predetermined action associated with the computer program;
displaying the second graphical user interface object includes popping up the menu the given number of pixels away from the given location or area; and
invoking the predetermined event includes, in response to determining that the physical object is tapped onto one of the software buttons on the basis of the determination of the tap location, performing the predetermined action assigned to the one of the software buttons.
15. A method according to claim 13, wherein:
the first graphical user interface object is assigned a predetermined action associated with the computer program so that the predetermined action is performed in response to determining that the physical object is tapped onto the first graphical user interface object on the basis of the tap location; and
invoking the predetermined event includes, in response to determining that the physical object is tapped onto the second graphical user interface object on the basis of the determination of the tap location before the stop of the display of the second graphical user interface object, performing the predetermined action assigned to the first graphical user interface object.
16. A method according to claim 15, wherein:
the first graphical user interface object is a hypertext link to a World Wide Web (WWW) page, and is assigned access to the hyperlinked WWW page as the predetermined action so that access to the hyperlinked WWW page is performed in response to determining that the physical object is tapped onto the first graphical user interface object; and
invoking the predetermined event includes accessing the hyperlinked WWW page.
17. A method according to claim 13, wherein the second graphical user interface object corresponds to magnification of the first graphical user interface object.
18. A method according to claim 13, wherein:
the first graphical user interface object is a hypertext link to a World Wide Web (WWW) page; and
the second graphical user interface object is one of (i) magnification of the first graphical user interface object and (ii) a screenshot or thumbnail of the hyperlinked WWW page.
19. A mobile user device, comprising:
a sensitive display configured to detect (a) hover of a physical object in proximity above the sensitive display while a physical object is hovering above the sensitive display and (b) a tap of the physical object on the sensitive display when the physical object comes in contact with the sensitive display subsequently to the hover; and
a processor configured to:
display a screen of a computer program that contains a first graphical user interface object, on the sensitive display;
determine a hover location above which the physical object is hovering in the screen on the basis of the detection of the hover;
determine a tap location on which the physical object comes in contact in the screen on the basis of the detection of the tap;
in response to determining that the physical object is hovering above the first graphical user interface object for more than a first predetermined time period on the basis of the determination of the hover location, display a second graphical user interface object associated with the first graphical user interface object a given number of pixels away from the first graphical user interface object in the screen;
after the display of the second graphical user interface object, in response to at least one of failure of detecting the hover and lapse of a second predetermined time period, stop the display of the second graphical user interface object; and
in response to determining that the physical object is tapped onto the second graphical user interface object on the basis of the determination of the tap location before the stop of the display of the second graphical user interface object, invoke a predetermined event associated with the second graphical user interface object.
20. A mobile user device according to claim 19, wherein:
the first graphical user interface object is a given location or area within the screen that is assigned for popup of a menu for operation of the computer program;
the second graphical user interface object is the menu with one or more software buttons each of which is assigned a discrete predetermined action associated with the computer program;
displaying the second graphical user interface object includes popping up the menu the given number of pixels away from the given location or area; and
invoking the predetermined event includes, in response to determining that the physical object is tapped onto one of the software buttons on the basis of the determination of the tap location, performing the predetermined action assigned to the one of the software buttons.
21. A mobile user device according to claim 19, wherein:
the first graphical user interface object is assigned a predetermined action associated with the computer program so that the predetermined action is performed in response to determining that the physical object is tapped onto the first graphical user interface object on the basis of the tap location; and
invoking the predetermined event includes, in response to determining that the physical object is tapped onto the second graphical user interface object on the basis of the determination of the tap location before the stop of the display of the second graphical user interface object, performing the predetermined action assigned to the first graphical user interface object.
22. A mobile user device according to claim 21, wherein:
the first graphical user interface object is a hypertext link to a World Wide Web (WWW) page, and is assigned access to the hyperlinked WWW page as the predetermined action so that access to the hyperlinked WWW page is performed in response to determining that the physical object is tapped onto the first graphical user interface object; and
invoking the predetermined event includes accessing the hyperlinked WWW page.
23. A mobile user device according to claim 19, wherein the second graphical user interface object corresponds to magnification of the first graphical user interface object.
24. A mobile user device according to claim 19, wherein:
the first graphical user interface object is a hypertext link to a World Wide Web (WWW) page; and
the second graphical user interface object is one of (i) magnification of the first graphical user interface object and (ii) a screenshot or thumbnail of the hyperlinked WWW page.
25. A computer program product embodied on a non-transitory computer-readable medium, the computer program product including computer program instructions that, when executed by a processor coupled to a sensitive display configured to detect (a) hover of a physical object in proximity above the sensitive display while a physical object is hovering above the sensitive display and (b) a tap of the physical object on the sensitive display when the physical object comes in contact with the sensitive display subsequently to the hover, cause the processor to perform operations comprising:
displaying a screen of a computer program that contains a first graphical user interface object, on the sensitive display;
determining a hover location above which the physical object is hovering in the screen on the basis of the detection of the hover;
determining a tap location on which the physical object comes in contact in the screen on the basis of the detection of the tap;
in response to determining that the physical object is hovering above the first graphical user interface object for more than a first predetermined time period on the basis of the determination of the hover location, displaying a second graphical user interface object associated with the first graphical user interface object a given number of pixels away from the first graphical user interface object in the screen;
after the display of the second graphical user interface object, in response to at least one of failure of detecting the hover and lapse of a second predetermined time period, stopping the display of the second graphical user interface object; and
in response to determining that the physical object is tapped onto the second graphical user interface object on the basis of the determination of the tap location before the stop of the display of the second graphical user interface object, invoking a predetermined event associated with the second graphical user interface object.
26. A computer program product according to claim 25, wherein:
the first graphical user interface object is a given location or area within the screen that is assigned for popup of a menu for operation of the computer program;
the second graphical user interface object is the menu with one or more software buttons each of which is assigned a discrete predetermined action associated with the computer program;
displaying the second graphical user interface object includes popping up the menu the given number of pixels away from the given location or area; and
invoking the predetermined event includes, in response to determining that the physical object is tapped onto one of the software buttons on the basis of the determination of the tap location, performing the predetermined action assigned to the one of the software buttons.
27. A computer program product according to claim 25, wherein:
the first graphical user interface object is assigned a predetermined action associated with the computer program so that the predetermined action is performed in response to determining that the physical object is tapped onto the first graphical user interface object on the basis of the tap location; and
invoking the predetermined event includes, in response to determining that the physical object is tapped onto the second graphical user interface object on the basis of the determination of the tap location before the stop of the display of the second graphical user interface object, performing the predetermined action assigned to the first graphical user interface object.
28. A computer program product according to claim 27, wherein:
the first graphical user interface object is a hypertext link to a World Wide Web (WWW) page, and is assigned access to the hyperlinked WWW page as the predetermined action so that access to the hyperlinked WWW page is performed in response to determining that the physical object is tapped onto the first graphical user interface object; and
invoking the predetermined event includes accessing the hyperlinked WWW page.
29. A computer program product according to claim 25, wherein the second graphical user interface object corresponds to magnification of the first graphical user interface object.
30. A computer program product according to claim 25, wherein:
the first graphical user interface object is a hypertext link to a World Wide Web (WWW) page; and
the second graphical user interface object is one of (i) magnification of the first graphical user interface object and (ii) a screenshot or thumbnail of the hyperlinked WWW page.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/833,140 US20170315698A1 (en) | 2012-01-10 | 2015-08-23 | User interface for use in computing device with sensitive display |
| US16/824,607 US10969930B2 (en) | 2012-01-10 | 2020-03-19 | User interface for use in computing device with sensitive display |
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201261584850P | 2012-01-10 | 2012-01-10 | |
| US13/732,407 US9116598B1 (en) | 2012-01-10 | 2013-01-01 | User interface for use in computing device with sensitive display |
| US14/833,140 US20170315698A1 (en) | 2012-01-10 | 2015-08-23 | User interface for use in computing device with sensitive display |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/732,407 Continuation US9116598B1 (en) | 2012-01-10 | 2013-01-01 | User interface for use in computing device with sensitive display |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/824,607 Continuation US10969930B2 (en) | 2012-01-10 | 2020-03-19 | User interface for use in computing device with sensitive display |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20170315698A1 true US20170315698A1 (en) | 2017-11-02 |
Family
ID=53838435
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/732,407 Active 2033-07-10 US9116598B1 (en) | 2012-01-10 | 2013-01-01 | User interface for use in computing device with sensitive display |
| US14/833,140 Abandoned US20170315698A1 (en) | 2012-01-10 | 2015-08-23 | User interface for use in computing device with sensitive display |
| US16/824,607 Active US10969930B2 (en) | 2012-01-10 | 2020-03-19 | User interface for use in computing device with sensitive display |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/732,407 Active 2033-07-10 US9116598B1 (en) | 2012-01-10 | 2013-01-01 | User interface for use in computing device with sensitive display |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/824,607 Active US10969930B2 (en) | 2012-01-10 | 2020-03-19 | User interface for use in computing device with sensitive display |
Country Status (1)
| Country | Link |
|---|---|
| US (3) | US9116598B1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10788964B1 (en) * | 2019-05-10 | 2020-09-29 | GE Precision Healthcare LLC | Method and system for presenting function data associated with a user input device at a main display in response to a presence signal provided via the user input device |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10852913B2 (en) | 2016-06-21 | 2020-12-01 | Samsung Electronics Co., Ltd. | Remote hover touch system and method |
| CN107291443B (en) * | 2017-05-05 | 2020-10-23 | Beijing Kingsoft Internet Security Software Co., Ltd. | Information processing method and device and terminal equipment |
| CN108700985A (en) * | 2017-06-28 | 2018-10-23 | Huawei Technologies Co., Ltd. | Icon display method and device |
| USD921691S1 (en) * | 2018-12-11 | 2021-06-08 | Lg Electronics Inc. | Display screen with graphical user interface |
| JP2022108147A (en) * | 2021-01-12 | 2022-07-25 | レノボ・シンガポール・プライベート・リミテッド | Information processing device and control method |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20040049541A1 (en) * | 2002-09-10 | 2004-03-11 | Swahn Alan Earl | Information retrieval and display system |
| US20070201863A1 (en) * | 2006-02-28 | 2007-08-30 | Microsoft Corporation | Compact interactive tabletop with projection-vision |
| US20080229197A1 (en) * | 2003-11-26 | 2008-09-18 | International Business Machines Corporation | Dynamic and intelligent hover assistance |
| US20120154255A1 (en) * | 2010-12-17 | 2012-06-21 | Microsoft Corporation | Computing device having plural display parts for presenting plural spaces |
| US20120254808A1 (en) * | 2011-03-30 | 2012-10-04 | Google Inc. | Hover-over gesturing on mobile devices |
| US20130083074A1 (en) * | 2011-10-03 | 2013-04-04 | Nokia Corporation | Methods, apparatuses and computer program products utilizing hovering, in part, to determine user interface orientation |
Family Cites Families (44)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5327161A (en) | 1989-08-09 | 1994-07-05 | Microtouch Systems, Inc. | System and method for emulating a mouse input device with a touchpad input device |
| US8479122B2 (en) | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
| US7663607B2 (en) | 2004-05-06 | 2010-02-16 | Apple Inc. | Multipoint touchscreen |
| US6803906B1 (en) | 2000-07-05 | 2004-10-12 | Smart Technologies, Inc. | Passive touch system and method of detecting user input |
| US7289083B1 (en) | 2000-11-30 | 2007-10-30 | Palm, Inc. | Multi-sided display for portable computer |
| US9164654B2 (en) | 2002-12-10 | 2015-10-20 | Neonode Inc. | User interface for mobile computer unit |
| US8095879B2 (en) | 2002-12-10 | 2012-01-10 | Neonode Inc. | User interface for mobile handheld computer unit |
| SE0103835L (en) | 2001-11-02 | 2003-05-03 | Neonode Ab | Touch screen realized by display unit with light transmitting and light receiving units |
| US6954197B2 (en) | 2002-11-15 | 2005-10-11 | Smart Technologies Inc. | Size/scale and orientation determination of a pointer in a camera-based touch system |
| US8902196B2 (en) | 2002-12-10 | 2014-12-02 | Neonode Inc. | Methods for determining a touch location on a touch screen |
| US20050110756A1 (en) * | 2003-11-21 | 2005-05-26 | Hall Bernard J. | Device and method for controlling symbols displayed on a display device |
| US20050162402A1 (en) * | 2004-01-27 | 2005-07-28 | Watanachote Susornpol J. | Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback |
| US7653883B2 (en) | 2004-07-30 | 2010-01-26 | Apple Inc. | Proximity detector in handheld device |
| US7692627B2 (en) | 2004-08-10 | 2010-04-06 | Microsoft Corporation | Systems and methods using computer vision and capacitive sensing for cursor control |
| JP4351599B2 (en) | 2004-09-03 | 2009-10-28 | パナソニック株式会社 | Input device |
| US7489306B2 (en) | 2004-12-22 | 2009-02-10 | Microsoft Corporation | Touch screen accuracy |
| US20060244733A1 (en) | 2005-04-28 | 2006-11-02 | Geaghan Bernard O | Touch sensitive device and method using pre-touch information |
| US7934156B2 (en) | 2006-09-06 | 2011-04-26 | Apple Inc. | Deletion gestures on a portable multifunction device |
| US8090087B2 (en) | 2006-10-26 | 2012-01-03 | Apple Inc. | Method, system, and graphical user interface for making conference calls |
| US20080297487A1 (en) | 2007-01-03 | 2008-12-04 | Apple Inc. | Display integrated photodiode matrix |
| US8026904B2 (en) * | 2007-01-03 | 2011-09-27 | Apple Inc. | Periodic sensor panel baseline adjustment |
| US8054296B2 (en) * | 2007-01-03 | 2011-11-08 | Apple Inc. | Storing baseline information in EEPROM |
| US7855718B2 (en) | 2007-01-03 | 2010-12-21 | Apple Inc. | Multi-touch input discrimination |
| US8970501B2 (en) | 2007-01-03 | 2015-03-03 | Apple Inc. | Proximity and multi-touch sensor detection and demodulation |
| US7777732B2 (en) | 2007-01-03 | 2010-08-17 | Apple Inc. | Multi-event input system |
| US7924271B2 (en) | 2007-01-05 | 2011-04-12 | Apple Inc. | Detecting gestures on multi-event sensitive devices |
| US7975242B2 (en) | 2007-01-07 | 2011-07-05 | Apple Inc. | Portable multifunction device, method, and graphical user interface for conference calling |
| JP4605170B2 (en) | 2007-03-23 | 2011-01-05 | 株式会社デンソー | Operation input device |
| US8094137B2 (en) | 2007-07-23 | 2012-01-10 | Smart Technologies Ulc | System and method of detecting contact on a display |
| US8219936B2 (en) | 2007-08-30 | 2012-07-10 | Lg Electronics Inc. | User interface for a mobile device using a user's gesture in the proximity of an electronic device |
| US8138896B2 (en) | 2007-12-31 | 2012-03-20 | Apple Inc. | Tactile feedback in an electronic device |
| US20090194344A1 (en) | 2008-01-31 | 2009-08-06 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Single Layer Mutual Capacitance Sensing Systems, Device, Components and Methods |
| US8576181B2 (en) | 2008-05-20 | 2013-11-05 | Lg Electronics Inc. | Mobile terminal using proximity touch and wallpaper controlling method thereof |
| JP4743267B2 (en) | 2008-12-12 | 2011-08-10 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
| JP4734435B2 (en) * | 2009-03-23 | 2011-07-27 | 株式会社スクウェア・エニックス | Portable game device with touch panel display |
| US9323398B2 (en) * | 2009-07-10 | 2016-04-26 | Apple Inc. | Touch and hover sensing |
| US8766926B2 (en) * | 2009-10-14 | 2014-07-01 | Blackberry Limited | Touch-sensitive display and method of controlling same |
| US20110157089A1 (en) | 2009-12-28 | 2011-06-30 | Nokia Corporation | Method and apparatus for managing image exposure setting in a touch screen device |
| US8232990B2 (en) | 2010-01-05 | 2012-07-31 | Apple Inc. | Working with 3D objects |
| US10048725B2 (en) * | 2010-01-26 | 2018-08-14 | Apple Inc. | Video out interface for electronic device |
| WO2011105996A1 (en) * | 2010-02-23 | 2011-09-01 | Hewlett-Packard Development Company, L.P. | Skipping through electronic content on an electronic device |
| JP4818454B1 (en) * | 2010-08-27 | 2011-11-16 | 株式会社東芝 | Display device and display method |
| US8614693B2 (en) * | 2010-08-27 | 2013-12-24 | Apple Inc. | Touch and hover signal drift compensation |
| US9116611B2 (en) * | 2011-12-29 | 2015-08-25 | Apple Inc. | Devices, methods, and graphical user interfaces for providing multitouch inputs and hardware-based features using a single touch input |
- 2013-01-01 US US13/732,407 patent/US9116598B1/en active Active
- 2015-08-23 US US14/833,140 patent/US20170315698A1/en not_active Abandoned
- 2020-03-19 US US16/824,607 patent/US10969930B2/en active Active
Also Published As
| Publication number | Publication date |
|---|---|
| US10969930B2 (en) | 2021-04-06 |
| US9116598B1 (en) | 2015-08-25 |
| US20200285381A1 (en) | 2020-09-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10969930B2 (en) | User interface for use in computing device with sensitive display | |
| US12045440B2 (en) | Method, device, and graphical user interface for tabbed and private browsing | |
| US20250168270A1 (en) | User interfaces for content streaming | |
| US11644966B2 (en) | Coordination of static backgrounds and rubberbanding | |
| US11500516B2 (en) | Device, method, and graphical user interface for managing folders | |
| US11650733B2 (en) | Device, method, and graphical user interface for controlling multiple devices in an accessibility mode | |
| US10152228B2 (en) | Enhanced display of interactive elements in a browser | |
| US9733812B2 (en) | Device, method, and graphical user interface with content display modes and display rotation heuristics | |
| US10304163B2 (en) | Landscape springboard | |
| US10353550B2 (en) | Device, method, and graphical user interface for media playback in an accessibility mode | |
| US9626098B2 (en) | Device, method, and graphical user interface for copying formatting attributes | |
| CN102754071B (en) | There is equipment and the method for multiple application program display modes of the pattern comprising the display resolution with another equipment | |
| US8806362B2 (en) | Device, method, and graphical user interface for accessing alternate keys | |
| US20120266079A1 (en) | Usability of cross-device user interfaces | |
| KR101441217B1 (en) | Apparatus and method for conditionally enabling or disabling soft buttons | |
| CN117435094A (en) | Multifunctional device control of another electronic device | |
| WO2012164170A1 (en) | Method and apparatus for spatially indicating notifications | |
| US20230393720A1 (en) | Content item scrubbing techniques |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STCV | Information on status: appeal procedure | Free format text: NOTICE OF APPEAL FILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |