HK1117614A - Stripe user interface - Google Patents
- Publication number
- HK1117614A (application HK08112230.7A)
- Authority
- HK
- Hong Kong
- Prior art keywords
- information
- region
- user interface
- user
- content
Description
Cross Reference to Related Applications
This application claims priority from provisional U.S. patent application No. 60/586,855, filed on July 9, 2004, and from U.S. patent application Serial No. EL990506282, entitled "Cute User Interface", filed on July 11, 2005 and assigned to the assignee of the present application, the disclosures of which are hereby incorporated by reference in their entirety.
Technical Field
The present invention relates generally to the field of graphical user interfaces. More particularly, the disclosed embodiments relate to user interfaces for mobile computing devices.
Background
Generally, information systems on mobile platforms tend to process each type of information separately, and the relationships between different types of information, and between information hierarchies, are not always clear. This can make it difficult to quickly and efficiently locate information stored in the system. A convenient and efficient way to correlate information stored in the system, and to use those relationships to support locating information, would be beneficial.
Currently, mobile platform user interfaces ("UIs") use a single indicator icon (e.g., a generic message indicator) to notify the user that one or more events of the same type have occurred. If one indicator icon is used to inform the user of, or to indicate, several items of the same type, the user does not know how many individual events are bundled behind the single icon. Nor is the user aware of the order in which the events occurred, or of the period of time over which they occurred.
Typically, a pop-up window will be used if an event requires more screen "real estate" to display information associated with the event. However, pop-up windows tend to obscure the view below, and the user interface objects below the pop-up window are no longer visible to the user.
Visual notifications may be displayed in different ways depending on the priority of the notification. One way is to use an icon in a dedicated screen location to give notice of an event that has occurred. This screen location (e.g., the Windows(TM) system tray area or the Nokia(TM) Series 60 universal indicator pane) may hold several indicators, one for each event type, and the order of the icons has no obvious meaning. The icons are often small, and they disappear after the user has interacted with the event associated with the notification.
Pop-up windows (dialog windows) are used in cases where more display space and/or user intervention is required to present event information. Visual annunciators (notifiers) are often reinforced by audio and tactile (vibration) outputs.
Graphical user interfaces typically provide a method by which a user can control a device, such as a computing system or mobile phone. Current mobile user interfaces are typically application driven, which means that, in order to perform a task, the user must use one or more applications to achieve his or her purpose. For example, sending a multimedia messaging service ("MMS") message requires the user to use a phonebook, a media library, and a text editor. Each of these applications may have its own user interface conventions, which can result in poor performance and redundant steps in each task.
Mobile user interfaces are also mainly menu driven, which means that the main functions of an application can only be accessed through a menu structure. As mobile devices become more versatile and feature-rich, menu structures have grown correspondingly deeper, which can lead to significant usability problems.
One popular type of graphical user interface ("GUI") display is based on a visual metaphor: the monitor screen is defined as a work area, called a "desktop", in which the contents of files and documents are presented in floating areas called "windows". In addition to windows, graphical user interfaces typically include icons that represent various objects in a computer system. In this context, the term "object" refers to any software entity that exists in computer memory and constitutes an instance of a particular class. For example, an object may be a data file containing document content. It may also be an application or another type of service provider, such as a hard drive. An object may also be a container for other objects, such as a folder or a window.
Another problem relates to notifying the user of events that are (possibly) of interest. Notifications may not provide the user with sufficient information about the objects/events associated therewith, thus requiring more user attention and interaction with the device. For example, if a user receives a new message, the user is typically unaware of the sender or subject of the message without opening the message. Moreover, notifications are often disruptive in that they interrupt the execution of the user's primary task.
A user interface with a navigation mode adapted for one-handed interaction would be helpful, especially in terms of interaction and navigation.
Disclosure of Invention
The present invention relates to a user interface for an electronic device. In one embodiment, the user interface includes a system area, a summary stripe area, and an overview area that presents or displays information about the selected stripe. Information and data relating to different categories may be presented in each stripe region. Each summary stripe provides an overview of the events and objects of the selected category. The summary stripe area may include a search category, a content category, a calendar category, a people or contacts category, an application category, and an environment category. The number and type of stripes may vary depending on the device. Embodiments of the user interface of the present invention allow for simultaneous interaction between content, user, task, environment, and system related information; display an overview of terminal content and status; display proximity and context information; provide cognitive information about people and events; and support easy customization and extensibility.
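As a rough sketch of the three-region layout described above (all names and structure are illustrative, not taken from the patent), the system area, summary stripe area, and overview area might be modeled as:

```python
from dataclasses import dataclass, field

@dataclass
class Stripe:
    """One summary stripe: a category row with its indicator icons."""
    name: str
    icons: list = field(default_factory=list)

@dataclass
class StripeUI:
    """Hypothetical model of the layout: a list of summary stripes plus
    an overview area that shows detail for the currently selected one."""
    stripes: list
    selected: int = 0

    def select(self, name):
        # Highlighting a stripe makes it the subject of the overview area.
        self.selected = next(i for i, s in enumerate(self.stripes) if s.name == name)

    def overview(self):
        # The overview area presents information about the selected stripe.
        return self.stripes[self.selected]

ui = StripeUI([Stripe(n) for n in
               ("search", "environment", "people", "calendar", "content", "applications")])
ui.select("people")
```

The six category names mirror the embodiment of FIG. 2A; as the text notes, the number and type of stripes may vary by device, so the list is deliberately just data.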
Drawings
The foregoing aspects and other features of the invention are explained in the following description, taken in connection with the accompanying drawings, wherein:
FIG. 1A is a schematic diagram of one embodiment of an electronic device incorporating features of the present invention.
FIG. 1B illustrates one embodiment of a device including a user interface incorporating features of the present invention.
FIG. 2A illustrates one embodiment of a user interface incorporating features of the present invention.
FIG. 2B illustrates an exemplary dynamic icon that may be used in embodiments of the present invention.
FIG. 3 illustrates one embodiment of a user interface incorporating features of the present invention in which a library or content schema/category is selected.
FIG. 4 illustrates one embodiment of a user interface incorporating features of the present invention in which an environmental mode/category is selected.
FIG. 5 illustrates one embodiment of a user interface incorporating features of the present invention in which a contact or people mode/category is selected.
FIG. 6 illustrates one embodiment of a user interface incorporating features of the present invention in which a calendar mode/category is selected.
FIG. 7 illustrates one embodiment of a user interface incorporating features of the present invention in which an application mode/category is selected.
FIG. 8 illustrates one embodiment of a user interface incorporating features of the present invention in which a search mode/category is selected.
FIG. 9 illustrates one embodiment of a system area of a user interface incorporating features of the present invention.
FIG. 10 is a flow chart of one embodiment of a method incorporating features of the present invention.
FIG. 11 illustrates one embodiment of different zoom levels in a user interface incorporating features of the present invention.
FIG. 12 illustrates hierarchical relationships and grid-like navigation in a user interface incorporating features of the present invention.
FIG. 13 illustrates the extensibility of the stripe region in one embodiment of a user interface having an extensible information region incorporating features of the present invention.
FIG. 14 is a pictorial representation of various embodiments/configurations of a user interface incorporating features of the present invention.
FIG. 15 is a pictorial representation or screen shot of various embodiments and layouts of a user interface incorporating features of the present invention.
FIG. 16 is a pictorial representation of one embodiment of a three dimensional view or display of a user interface incorporating features of the present invention.
FIGS. 17A-17I are pictorial representations of embodiments of features and functions of a user interface incorporating features of the present invention that are suitable for use in the display area of a device using the user interface of the present invention.
FIG. 18 is a flow diagram of one embodiment of interacting with a user interface incorporating features of the present invention.
FIG. 19 is a flow chart illustrating one embodiment of a method incorporating features of the present invention.
FIGS. 20A-20H illustrate screen shots of one embodiment for searching and locating information using a user interface incorporating features of the present invention.
FIGS. 21A-21E illustrate screenshots of a search application of one embodiment of a user interface incorporating features of the present invention.
FIGS. 22A-22I illustrate screenshots of an active idle state of one embodiment of a user interface incorporating features of the present invention.
FIGS. 23A-23F illustrate screen shots of event notifications in one embodiment of a user interface incorporating features of the present invention.
FIGS. 24A-24F illustrate screen shots of one embodiment of a magnifier feature in a user interface incorporating features of the present invention.
FIGS. 25A-25H illustrate screenshots of a device management system in one embodiment of a user interface incorporating features of the present invention.
FIG. 26 is a block diagram of one embodiment of an architecture that can be used to practice embodiments of the present invention.
FIG. 27 is a flow chart illustrating one embodiment of a method incorporating features of the present invention.
FIG. 28 is a flow chart illustrating one embodiment of a method incorporating features of the present invention.
FIG. 29 is a flow chart illustrating one embodiment of a method incorporating features of the present invention.
FIG. 30 is a flow chart illustrating one embodiment of a method incorporating features of the present invention.
Detailed Description
Referring to FIG. 1A, there is shown a schematic or block diagram of a system 100 incorporating features of the present invention. While the invention will be described in conjunction with the embodiments shown in the drawings, it should be understood that the invention can be embodied in many alternate forms of embodiments. Also, any suitable size, shape or type of elements or materials could be used.
The user interface of the present invention generally provides a visual means of perceiving important information in the device. The disclosed embodiments support simultaneous interaction between categories of information stored on, or available through, a device, where the information may include information related to content, users, tasks, environments, and other system information and applications.
For example, referring to FIG. 2A, one embodiment of a user interface incorporating features of the present invention is shown. As shown in fig. 2A, the user interface provides a graphical display of features, functions, and information that may be stored in or available through the device to a user. The user interface of the present invention allows a user to interact between categories of information simultaneously.
The disclosed embodiments provide a user interface with a "bar" or stripe-like layout for system-related functions and information. Each bar 210 in FIG. 2A is generally a horizontal display of features and functions available through the user interface 200. Interaction with the user interface may be performed using, for example, a five-way joystick or a cursor. Information at each stripe, e.g., content, people, events, or services, can be easily accessed and queried locally, nearby, or remotely, which enhances the search functionality. The user interface is easily scalable and is not limited to any particular screen size, scale, shape, or orientation. Through different types of notifications, awareness of, overview of, and access to objects and events of interest are provided.
The user interface of the present invention is typically provided on a display of an electronic device, such as a portable terminal device including a mobile phone. FIG. 1A illustrates a schematic diagram of one embodiment of an electronic device 100 incorporating features of the present invention. The device 100 may be a wireless terminal device operating in a communication system such as a GSM, GPRS, UMTS, or Bluetooth system. The electronic device may also be, for example, a handheld, portable, or desktop computer, a gaming device or console, or a personal digital assistant ("PDA"). In alternative embodiments, the user interface of the present invention may be implemented on any content and task driven electronic device. The device 100 typically comprises a processor 101 and a memory 102 for operating tasks of the device 100 and for running applications 103 stored in the device. For operation in a communication system, the device may include a transceiver 104 and an antenna 105. For data entry, device 100 may include a keyboard 106, which may be a soft keyboard or a touch-sensitive area on the surface of the device that presents visual symbols or other indicators that the user selects by touch. Device 100 may also include other input devices such as a joystick, rocker switch controller, touch-sensitive display, or voice command capability. For data output, the device 100 includes a display 107, which may be a monochrome or color display, an LCD panel, a touch-sensitive panel, or other suitable display, as well as a vibration motor for tactile output. In one embodiment, the processor 101 may cooperate with other input/output devices 109 and a speaker/microphone 108.
The present invention may be embodied in various forms. One embodiment includes a mobile device, such as a personal digital assistant (PDA), mobile terminal, cellular telephone, or similar device, having an organizational-mode navigation display. The organizational schema graphically represents features of a computer program for the device. Referring to FIG. 1B, one embodiment of a mobile device 110 to which the present invention may be applied generally includes a display 112 and a keypad 118. The keypad 118 may include a left shift key 120, a right shift key 122, an up arrow key 124, a down arrow key 126, an input scroll wheel 128, and other input keys 130. The keys 120, 122, 124, and 126 may also serve as soft keys whose function depends on the state of the user interface. The input scroll wheel 128 may be rotated to provide rotational input to the device, and may be pressed as a whole, like a key, to provide selection input. In another embodiment, the input scroll wheel is located on the keypad as a rotatable key that can be rotated clockwise and counterclockwise, or pressed as a whole.
The display 112 shows an organizational schema 132 of a computer program stored in the memory 102 of FIG. 1A. The memory 102 also contains instructions for displaying the organizational schema 132 and for using the organizational schema to navigate through the computer program. A computer program, as used herein, may refer to any computer program whose features an operator may navigate, such as an operating system, word processor, spreadsheet, email or telephone computer program, game, and so forth. In operation, the processor 101 of FIG. 1A processes instructions in the memory 102 in accordance with the computer program and receives input from the keypad 106 or another input device for modifying the view shown on the display, as is well known in the art of graphical user interfaces (GUIs). The keypad 106, display 107, and processor 101 may be collectively referred to as a graphical user interface, through which a user may interact with the device 110 of FIG. 1B.
The user uses the organizational schema 132 and the GUI to navigate through the computer program and its features.
Referring to FIG. 2A, in one embodiment, the user interface 200 of the present invention generally provides a graphical user interface or display that includes a system area 201, a summary area 210, and an overview area 220, wherein the overview area 220 may provide detailed information related to selected summary area categories 210A-210F.
If the user wants to use functionality provided in a menu of the user interface, the user selects an area, such as "people" 210C of FIG. 2A, by moving the selection or controller device in the appropriate direction. Referring to FIG. 1B, this may include, for example, moving the joystick controller in an "up" direction, pressing the appropriate toggle or soft key 124, 128, or "tapping" or contacting a corresponding portion of the touch-sensitive display 129, for example with a stylus 127. The user interface then loads the "people" application, and the user can access the "people" features, as described below.
The summary area 210 shown in FIG. 2A typically includes six categories. In alternative embodiments, any suitable number of categories may be used in the summary area 210, depending on the device and application. The summary area 210 of FIG. 2A generally includes a search category or application 210A, an environment category 210B, a people category 210C, a calendar category 210D, a content category 210E, and an application category 210F. In alternative embodiments, the categories of the summary area 210 may vary depending on the particular application and the device on which the user interface 200 resides. For example, if the particular device on which the user interface of the present invention is used is a gaming device, a summary area category called "N-Gage arena" might be included. It is also noted that the title of each category area is exemplary, intended only to give the user a short identifying description of what the underlying application relates to and what information it may provide or make accessible. Thus, the particular embodiment shown in FIG. 2A is merely exemplary, and the layout and categories may be arranged in any suitable manner corresponding to particular devices and applications.
In one embodiment, referring to FIG. 2A, each summary area 210, also referred to herein as a "stripe," provides an overview of the events and objects of the selected category 210A-210F. Each summary area category 210A-210F may also include one or more indicators or icons 230. The icons can include, for example, text, images, dynamic icons, hypertext, and user interface widgets. An indicator or icon 230 typically provides a visual link to other information related to the summary area category. The indicators 230 may be of any suitable type, number, and combination, depending on the particular application and device.
Referring to FIG. 2B, examples of some dynamic icons are shown. Dynamic icons are icons whose appearance may change to reflect changes in an associated object or application. For example, referring to FIGS. 2A and 2B, one indicator 230A in the people category 210C may include a "friends" icon 231. If a "buddy" is downloading, for example, the dynamic icon 231 may change to the image 232 to reflect that a "buddy" is downloading. Another example of a dynamic icon is the message count icon 233. The message count icon 233 may indicate the number of messages received, shown as six in state 233. If a new message is received, the icon may change as shown by icon 234 to indicate the presence of a "new", or perhaps "unread", message.
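The state change behind the message count icon 233/234 can be sketched as follows (a toy model; the class, its fields, and the textual rendering are invented for illustration, not part of the patent):

```python
class DynamicIcon:
    """Sketch of a dynamic icon whose appearance tracks its object's state."""

    def __init__(self, count=0):
        self.count = count    # number of received messages
        self.unread = False   # whether a "new"/"unread" message is pending

    def on_new_message(self):
        # Receiving a message both increments the count and switches the
        # icon to its "new message" appearance (icon 233 -> icon 234).
        self.count += 1
        self.unread = True

    def render(self):
        # A textual stand-in for the icon graphic: "*" marks unread state.
        return f"{self.count}{'*' if self.unread else ''}"

icon = DynamicIcon(count=6)      # state 233: six messages, none new
assert icon.render() == "6"
icon.on_new_message()            # state 234: a new message arrives
```

After the call, the icon renders with its "new" marker, mirroring how the appearance of icon 233 changes to icon 234.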
Referring again to FIG. 2A, each stripe 210A-210F can indicate a different level of information related to the particular category assigned to the particular stripe 210A-210F.
For example, referring to fig. 3, the content stripe 210E of fig. 2A may include library information. The library information may relate to files and other information stored in the device. Fig. 3 illustrates the selection of content or a library stripe 301. When selected, the library stripe 301 is highlighted and detailed information 310 relating to the stored content is shown in detail area 220 of FIG. 2A. The content on the library information stripe 301 may generally provide awareness and overview of events and objects related to, for example, personal media, and support activities related to accessing, playing, and sharing data.
In one embodiment, the library information 301 may include one or more hierarchical levels of information that a user may access when the library information is highlighted. For example, a hierarchy 310 of information may be displayed in the detailed information area 220 as an expansion of the library stripe 301. Each hierarchy 311, 313, 315 and 317 may include appropriate indicators or icons 312, 314, 316 and 318 detailing and allowing selection of personal files and information. For example, hierarchy 311 has an indicator or icon 312 that provides information related to the most recently accessed files. The icons or objects provide access to the underlying files in a known manner. Other levels of information related to content in the device may include game sessions 313, streaming/broadcast media 315, and available media and files 317. In alternative embodiments, different tiers 310 may include any suitable categories of information. The level 310 may be considered a "top" level category for information available in other sub-levels.
A second hierarchical level associated with the highlighted or selected library stripe 301 may provide previews and thumbnails of the stored information, with controls for accessing the information. For example, a sub-hierarchy of 310 may provide access to "favorites," such as playlists, image collections, bookmarks, channel collections, authored content, and different types of content views. Different content views may be shared, rated, annotated, located, and used based on type. For example, further sub-levels may add an overview of contacts (e.g., "get" or "share"), calendar functions (e.g., "maintain", "archive", "backup", or "synchronize"), and environment (e.g., "get", "share", or "maintain"). The number of levels or sub-levels is limited only by the desired information and by the system and application using the user interface.
In one embodiment, moving between the various levels may be performed in a process referred to as "zooming". Referring to FIG. 11, in one embodiment each summary stripe 210 of FIG. 2A may have, for example, four discrete zoom levels L0-L3. In alternate embodiments, any suitable number of zoom levels may be used. For example, referring to FIG. 11, zoom level L0 presents the information visible in the summary stripe area 210 of FIG. 2A. The zoom level L0 may include one or more dynamic icons L001 with one or more status and attribute indications. The zoom level L0 of FIG. 11 presents a "friends" list, which would be a component of the "people" bar 210C of FIG. 2A. Each "buddy", from "buddy 1" through "buddy 5", has a dynamic icon associated with it that can display further information related to that particular "buddy". In one embodiment, the detailed information section 220 of FIG. 2A may include the L2 information. Selecting one of "buddy 1" through "buddy 5" presents or displays zoom level L1. Zoom level L1 typically presents the selected object (in this case "buddy 2", also identified in this example as "Jane Smith") as a larger thumbnail or preview, together with information related to the selected object. As shown in the example of FIG. 11, the information includes a name L101, availability L102, and message information L103. In one embodiment, zoom level L1 may also include the most commonly used controls, such as "message" L104, "chat" L105, "call" L106, and "more" L107. In alternate embodiments, any suitable controls may be displayed. For example, bar 210C may display zoom level L1 with dynamic height, or zoom level L1 may appear as a tooltip. In alternative embodiments, the zoom level L1 may be displayed in any suitable manner, shape, or appearance.
If more detailed information is required from zoom level L1, an object or application of zoom level L1 may be opened to create zoom level L2. The zoom level L2 generally includes more detailed information related to the selected object. In this example, friend 2's name is more prominently displayed as "Jane Smith" in a corner region L201 of the display for level L2. The zoom level L2 may also include an informational data field L202 for viewing, editing, and entering further information related to the selected object. One or more controls L203 may be displayed, which may be selected. The open objects/applications of zoom level L2 may use the window space of a particular device or display as needed. This may include sizing and resizing as needed to dynamically maximize or minimize window space. The zoom level L2 may be of any suitable size and use a screen view of a particular display or device.
The zoom level L3 shown in the embodiment of FIG. 11 presents the information from zoom level L2 in such a way that its relationships to other objects L301-L304 can be displayed in the application window border area L306. In the example shown in FIG. 11, zoom level L3 uses a full-screen view, and the object L305 is generally located in the central area of the screen, with the objects message L301, store L302, calendar L303, and contact L304 around the border area L306. In alternative embodiments, the different zoom levels may be presented in any suitable form, typically with more detailed information displayed at each level. The number of levels is limited only by the needs of a particular application, device, or user.
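The discrete L0-L3 levels described above amount to a small, clamped state machine: zooming in reveals more detail, zooming out returns toward the stripe overview, and neither can move past the ends. A minimal sketch (the class and method names are illustrative):

```python
# Hypothetical zoom model for the four discrete levels L0-L3 described
# above: L0 is the stripe overview, L3 the full-screen relational view.
ZOOM_LEVELS = ["L0", "L1", "L2", "L3"]

class ZoomState:
    def __init__(self):
        self.level = 0  # start at the L0 stripe overview

    def zoom_in(self):
        # Reveal more detail; clamp at the deepest level (L3).
        self.level = min(self.level + 1, len(ZOOM_LEVELS) - 1)

    def zoom_out(self):
        # Return toward the overview; clamp at L0.
        self.level = max(self.level - 1, 0)

    @property
    def name(self):
        return ZOOM_LEVELS[self.level]

z = ZoomState()
z.zoom_in()   # L0 -> L1: thumbnail/preview of the selected object
z.zoom_in()   # L1 -> L2: opened object with detailed information
```

Clamping matches the text's framing that the number of levels is fixed per stripe (four in this embodiment), though a device could define any suitable number.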
FIG. 10 illustrates one embodiment of a method incorporating features of the present invention. In one embodiment, the user selects or highlights 1002 a region or stripe. Information related to the stripe is displayed 1004, and objects in the stripe may be selected. An object in the selected region is selected 1006, for example, by clicking on it. A thumbnail preview of objects corresponding to content in the selected region may be displayed 1008. If desired, the size of the stripe or display area may be automatically set or scaled to accommodate the preview information, while the other, unselected stripes are adjusted accordingly. An object in the preview may be selected and opened 1012, with more detailed information related to the selected object displayed. Dynamic sizing 1014 may be applied if desired. A full-screen view may be selected or applied 1016, in which case the relationships between the displayed object and the applications in the display border area are identified.
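The steps of this method can be laid out as an ordered pipeline, using the step numbers from FIG. 10 (the step descriptions and the helper function are illustrative paraphrases, not text from the patent):

```python
# The FIG. 10 flow, sketched as an ordered list of (step number, action).
FLOW = [
    (1002, "select or highlight a stripe"),
    (1004, "display information related to the stripe"),
    (1006, "select an object in the stripe"),
    (1008, "display thumbnail previews of the object's content"),
    (1012, "open the object with more detailed information"),
    (1014, "apply dynamic sizing if desired"),
    (1016, "apply full-screen view with border-area relationships"),
]

def next_step(current):
    """Return the step number that follows `current`, or None at the end."""
    ids = [step for step, _ in FLOW]
    i = ids.index(current)
    return ids[i + 1] if i + 1 < len(ids) else None
```

Note the figure's numbering skips 1010; the list above preserves the reference numerals exactly as the text gives them.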
More detailed information related to each summary area category 210 of FIG. 2A will now be discussed. Referring to FIG. 4, one embodiment of a selected environment category or stripe 401 is shown. The information or applications associated with the environment category 401 are typically for interacting with the environment, objects, and people in the vicinity of the device implementing the user interface of the present invention. As shown in FIG. 4, the top level, or zoom level L0 of FIG. 11, is shown for the selected environment category 401. The selected environment category 401 includes, at zoom level L0, indicators for the information hierarchies "friends" 402, "devices" 403, and "services" 404. In alternative embodiments, any appropriate indicators may be used and displayed. Selecting an object from level L0 in screen 410 may bring the user to a level of more detailed information, corresponding to L1 of FIG. 11. The zoom level L1 for the environment 401 may include more detail related to the selected information level 402, 403, or 404. The object may be selected on the screen 410 by any known or suitable means. Zoom level L3 may add the possibility of viewing a map. In one embodiment, interaction occurs only with objects in the general area.
Referring to FIG. 5, a screen is shown displaying a selected contacts category 501. "Contacts", which may also be referred to as "people", generally provides information and access for calls, messaging, and contacts. At level L0, indicators are provided for categories such as missed calls, new messages 503, contacts online 502, and chat requests 504. Selecting one of the icons 502a, 503a, or 504a allows the user to access the next level or sub-level of information associated with the respective category. For example, the zoom level L2 associated with the contacts category 501 may provide access to standard messaging clients, PEC with communication history, chat, and rich voice calls. The zoom level L3 for contacts 501 may be essentially zoom level L2 with an overview of content, calendar, applications, and environment based on the selected contact. This hierarchy allows the user to select a contact and view information related to the contact at different but related levels.
FIG. 6 illustrates an exemplary embodiment of a selected calendar category 601. Generally, the calendar category 601 displays events and calendars, and shows temporal relationships between objects. As shown in FIG. 6, at the first information level L0, a visual timeline 602 is shown along with indicators or notifications 603 for upcoming events, tasks, things to do, communications, and contact logs. In one embodiment, information level L1 may add some controls to the L0 view of FIG. 6. Information level L2 may include a typical calendar view.
FIG. 7 illustrates one embodiment of the L0 information hierarchy for the selected application category 701. The application class 701 is generally used to support application-driven methods and to access third-party applications. Level L0 of FIG. 7 includes a "taskbar" view that illustrates the currently active applications.
FIG. 8 illustrates one embodiment of a display of the user interface of the present invention after selection of the search category 210A of FIG. 2A. While in the selected search category or mode 801, the user may search for content, people, and events, either locally or remotely. The information level L0 of FIG. 8 displays text boxes 802 and/or 803 for entering a search string. Other options for searching may also be provided at the top level, or at subsequent levels for advanced searching. The other options may include, for example, dates 803, 804 and type 805 (including, for example, content type, event, people, and service location (local or remote)). Each subsequent information level may provide advanced search options (e.g., keyword-type metadata), search history, and saved searches and search results. In one embodiment, subsequent levels can support and present search results for contacts, applications, environments, and calendars.
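The described search, a text string optionally narrowed by a type filter, can be sketched as a simple predicate over stored objects (the function, field names, and sample items are invented for illustration; the patent does not specify a data model):

```python
def search(items, text="", kind=None):
    """Minimal sketch of the described search: filter stored objects by a
    search string and an optional type (content, event, person, service)."""
    text = text.lower()
    return [it for it in items
            if text in it["name"].lower()
            and (kind is None or it["type"] == kind)]

# Hypothetical device content spanning several stripe categories.
ITEMS = [
    {"name": "Jane Smith", "type": "person"},
    {"name": "Holiday photos", "type": "content"},
    {"name": "Team meeting", "type": "event"},
]

matches = search(ITEMS, "jane")
```

A real implementation would also cover the remote/local scope, date ranges, and metadata keywords mentioned in the text; this sketch shows only the core text-plus-type filtering.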
Fig. 9 illustrates one embodiment of the system area 201 of fig. 2A. The system area 901 is typically reserved for indicators or tools that are not directly related to any summary stripe category. The general purpose of the system area 901 is to manage and sense current connections and traffic 902, provide operator and other service provider information 903, sense current battery status and provide charging instructions 904, provide time and date data 905 and attribute data or information 906. Selecting or highlighting any of the controls or indicators 902-906 may provide more detailed information related to the selected category or utility.
Navigation within the interface 200 of FIG. 2A is generally based on grid-like navigation. For example, FIG. 12 illustrates one embodiment of a grid-like navigation system. Using a 5-way joystick, for example, the user can navigate between the system area 1201 and the various stripes 1202-1208. For example, to move between the stripes 1202 and 1208, the user moves the cursor in the "up/down" direction. To select further information in a selected stripe, such as the contact 1206 in screen 1210, the user moves the cursor "left/right." As shown in FIG. 12, by moving the cursor to the "right" and selecting the next object in the stripe, different levels of information are displayed in screens 1211, 1212, and 1213. In alternate embodiments, any suitable navigation system or device may be used. For example, the primary input device for one-handed interaction may be a 5-way joystick, rocker key, or trackball. Secondary devices may include, for example, soft keys or a capacitive slider. Two-handed interactions may include, for example, a stylus and a (capacitive) touch screen. In alternate embodiments, any suitable input device may be used to select, edit, and enter information.
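The stripe-grid navigation described above can be sketched as a small state machine: "up/down" moves between stripes, and "left/right" moves through the information levels (L0, L1, L2, ...) within the selected stripe. This is a minimal illustration, not the patented implementation; the class name, stripe names, and level cap are invented for the example.

```python
# Hypothetical sketch of the stripe-grid navigation of FIG. 12.
class StripeGridNavigator:
    def __init__(self, stripes, max_level=3):
        self.stripes = stripes          # e.g. ["system", "contacts", ...]
        self.row = 0                    # index of the selected stripe
        self.level = 0                  # information level within the stripe
        self.max_level = max_level

    def move(self, direction):
        if direction == "up":
            self.row = max(0, self.row - 1)
            self.level = 0              # entering a new stripe resets to L0
        elif direction == "down":
            self.row = min(len(self.stripes) - 1, self.row + 1)
            self.level = 0
        elif direction == "right":      # descend into more detailed levels
            self.level = min(self.max_level, self.level + 1)
        elif direction == "left":       # back toward the L0 overview
            self.level = max(0, self.level - 1)
        return self.stripes[self.row], self.level
```

For example, starting at the system area, two "right" presses after a "down" press would correspond to moving from screen 1210 through screens 1211 and 1212.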
The size and shape of the objects displayed on the user interface 200 of FIG. 2A may vary, and may be scaled and sized to fit the display of the desired device application. Referring to FIG. 13, the height, width, and/or shape of the selected stripe 1301 may also be dynamically changed according to the available information and the information to be displayed at the selected level. For example, in the selected stripe 1301, information blocks 1310 and 1312 are displayed; the stripe 1301 has been adjusted and shaped to display these information blocks 1310 and 1312. In one embodiment, a summary stripe that does not contain any activity indicators can be minimized, while a stripe that requires more space is stretched and arranged so that the entire screen may be used, as shown in screens 1302, 1303, and 1304, where only three categories or stripes are shown even though more are available. Also, all of the stripes need not be visible at the same time, and one or more stripes may be partially revealed or hidden.
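The dynamic stripe sizing described above can be illustrated with a simple layout rule. This is a hypothetical sketch: the function name, the unit-based heights, and the proportional-allocation rule are assumptions chosen only to show how stripes without activity indicators could collapse while stripes with content expand to fill the screen.

```python
# Hypothetical sketch of the dynamic stripe sizing of FIG. 13: stripes with
# no activity indicators collapse to a minimum height, and the remaining
# screen space is divided among the stripes that have content.
MIN_HEIGHT = 1  # minimized stripe height, in abstract layout units

def layout_stripes(screen_height, indicator_counts):
    """Return a height per stripe given each stripe's activity-indicator count."""
    active = [name for name, n in indicator_counts.items() if n > 0]
    idle = [name for name, n in indicator_counts.items() if n == 0]
    heights = {name: MIN_HEIGHT for name in idle}
    remaining = screen_height - MIN_HEIGHT * len(idle)
    if active:
        total = sum(indicator_counts[name] for name in active)
        for name in active:
            # allocate space in proportion to how much there is to show
            heights[name] = remaining * indicator_counts[name] // total
    return heights
```

A stripe with two indicators thus gets roughly twice the height of a stripe with one, while an empty stripe stays minimized.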
FIG. 14 illustrates different embodiments of the user interface of the present invention scaled, shaped, and sized for a particular screen size and shape. As shown in FIG. 14, the layout and appearance of an embodiment of the user interface of the present invention is limited only by the type of device/display that uses the user interface.
In one embodiment, the user interface 200 of FIG. 2A may be rotated from a portrait orientation to a landscape orientation. For example, referring to FIG. 15, the user interface 1500 may be changed from portrait orientation 1505 to landscape orientation 1506 using capacitive sliders 1501 and 1504. In alternative embodiments, the user interface display may be rotated from one orientation to another using any suitable method. In one embodiment, as shown in FIG. 16, the user interface of the present invention may be a three-dimensional (3-D) image or representation 1601. As shown in FIG. 16, each side 1602-1604 of the three-dimensional object 1601 may provide a view that includes more detailed hierarchical information. For example, side 1603 includes the L0 zoom level as previously described. Side 1602 may include a map view with more detailed information related to the environment category.
The disclosed embodiments provide different ways of notifying the user. Notifications can use sound, touch, or visuals (animation, sliding), and there are typically four main notification types. A pop-up notifier is used to notify the user that an action is required; it pops up in front of all windows and receives the input focus. A passive (soft) notifier is a pop-up window that notifies the user of information without gaining input focus; for example, passive notifiers do not interfere with the user's current task or activity. A status notifier is used to indicate status, active functions, received messages, and the like, and remains active until the status changes; for example, an icon appears in a designated location of the user interface. The presence of the icon may indicate information, and a modifier may be used to provide further information. A field notifier is associated with an error in an input field or entry.
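The four notification types above can be summarized in a small dispatch table. The type names follow the paragraph, but the two boolean attributes (whether the notifier takes input focus, and whether it persists until a state change) are a hedged reading of the text, not an actual API.

```python
# Hypothetical sketch of the four notification types described above.
from dataclasses import dataclass

@dataclass
class Notification:
    kind: str        # "popup" | "passive" | "status" | "field"
    message: str

def notification_behavior(n):
    """Map a notification to (takes_input_focus, persists_until_state_change)."""
    if n.kind == "popup":      # requires user action; grabs input focus
        return (True, False)
    if n.kind == "passive":    # informs without interrupting the current task
        return (False, False)
    if n.kind == "status":     # stays visible until the status changes
        return (False, True)
    if n.kind == "field":      # tied to an error in an input field or entry
        return (False, True)
    raise ValueError(f"unknown notification kind: {n.kind}")
```

The key design point the paragraph makes is that only the pop-up notifier steals focus; the other three inform without interrupting.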
FIGS. 17A-17I illustrate one embodiment of a simplified user interface system incorporating features of the present invention for a mobile phone. In such embodiments, the visual display area has been reduced to fit the size of a particular screen (e.g., "NOKIA S60"). As can be seen in FIG. 17A, the different stripes 1701-1705 are distributed substantially along a vertical line. The user can move or switch between the stripes by moving the joystick, cursor, or other pointing device left/right rather than up/down. Pressing or activating a select/open option or function activates a menu point, and a further key press may open a menu bar. The left/right menu options may be replaced by up/down options depending on the layout of the particular device.
For example, in the screen shown in FIG. 17A, pressing the "right" key selects the stripe 1702, as shown in FIG. 17B. Further "right" key presses will select stripe 1703 or 1705, as shown in FIG. 17C or FIG. 17D, respectively.
If, in the screen shown in FIG. 17A, the user activates or presses the "open" function, the functions associated with the stripe 1702 may be displayed, as shown in FIG. 17E. FIG. 17F shows the functionality associated with the stripe 1702 of FIG. 17B when "select" or "open" 1707 is activated. While in the screen shown in FIG. 17F, when a feature or task item 1710 is selected and the "select" option 1708 is activated, further functions and options may be accessed.
FIG. 17G shows the function associated with stripe 1703 of FIG. 17C when the "select" or "open" function 1711 is activated. Similarly, FIG. 17H shows the function associated with stripe 1705 of FIG. 17D when "open" 1712 is activated.
The user interface of the disclosed embodiments divides information into categories. As shown in FIG. 20A, these categories may include, for example, content, environment, contacts, calendar, and applications. The user selects a category to view the corresponding information, also referred to as a zoom level. The selection of categories is typically done by using a pen or stylus on a touch screen.
One feature of the present invention is the ability to view desired information based on the relationship between the information and the categories. An example of this is described in the U.S. patent application entitled "cut user interface," filed on 11.7.2005, the disclosure of which is incorporated herein by reference in its entirety.
Referring to FIG. 20A, when viewing an item in the user interface of the present invention, a user can easily search for and find information associated with the item being viewed. The user selects an item, or any content within the item, such as a word in an SMS or file, and drags it onto the stripe to be searched. Any information found in that stripe that is relevant to the selected item may then be displayed.
For example, as shown in FIG. 20A, each summary stripe area 2011-2015 may include one or more blocks or items. In one embodiment, the blocks represent commonly used functions. For example, the "MMS" box 2010 in the application stripe 2015 represents "send multimedia message." Selecting the MMS box 2010 and dragging it onto a different stripe indicates a search for information related to the MMS box 2010 in that different function or stripe. Dragging the selected item 2010 into the "other area" 2016, instead of onto a particular stripe, will search all stripes or functions for information related to the selected box or item 2010.
For example, referring to FIG. 20B, the user is browsing in the contacts area or stripe 2013. In the contacts area 2013, there is an item "David" 2021. The user can select the object "David" 2021 and view personal information related to it. If, for example, the user wishes to identify any tasks related to "David" 2021, the user may select or highlight the item or object 2021 corresponding to "David" and "drag" it into, for example, the calendar stripe 2014 in any convenient manner, as shown in FIG. 20C. The calendar function or system of the device incorporating the user interface of the present invention will then search for "David" and list any tasks or other information associated with or related to "David" 2021. In FIG. 20D, area 2041 of the user interface displays the information and tasks found to be associated with "David" from the calendar function 2014.
Referring to FIG. 20E, the user returns to the contacts stripe 2013. If the user wishes to determine whether a contact or friend "David" 2021 is available, or is in or near a predetermined area, the user drags the item or object associated with "David" 2021 into the environment stripe 2012, as shown in FIG. 20E. The system then searches the environment function for "David" 2021 and informs the user if "David" is available, as shown in FIG. 20F. The term "available" generally means that the individual, or a device associated with the individual, is in communication with a system incorporating the present user interface or is located in a predetermined area or location. For example, if a user searches for "David" 2021, a system incorporating the user interface of the present invention may identify at least the approximate location of the mobile communication device associated with "David" in any suitable or well-known manner. From the location information it may be determined whether "David" is located within a predetermined area, or in a certain location or proximity of the mobile device. If location information cannot be determined, "David" may be considered "unavailable." However, if "David" is located, as shown in FIG. 20F, an information bar or message 2061 may be displayed on the user interface to notify the user.
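The "availability" determination described above can be sketched as a simple location test. The planar-coordinate model and radius check are assumptions made for illustration; a real system would use whatever positioning method the device supports, and would also treat an active communication link as availability.

```python
# Hypothetical sketch of the availability test described above: a contact is
# "available" when a location can be determined for the contact's device and
# that location falls within a predetermined area.
def is_available(location, area_center, radius):
    """location: (x, y) coordinates, or None when the device cannot be located."""
    if location is None:
        return False                      # no location info -> "unavailable"
    dx = location[0] - area_center[0]
    dy = location[1] - area_center[1]
    return dx * dx + dy * dy <= radius * radius
```

When the test succeeds, the interface would display a message such as the information bar 2061 of FIG. 20F.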
Embodiments of the present invention establish relationships between information stored in or available to a system. Information in the system is connected and related and the user can easily and efficiently search for and find relevant information.
For example, referring to FIGS. 20G and 20H, the user is viewing information in the selected contacts stripe 2013. The displayed information includes a short message service (SMS) message 2074. "Message 1," shown in FIG. 20G, is a request for a "monthly report." The user selects "monthly report" 2072 in the SMS message and "drags" it onto the "content" stripe 2011, where "drag" is used in its well-known sense. The content stripe 2011 includes access to information and files stored in the device. When the text "monthly report" 2072 is dragged onto the content stripe 2011, the system searches its data stores and saved content to determine whether a file or data related to or corresponding to the "monthly report" is stored therein. If a matching file is found, it will be displayed in the detailed information section 2082 associated with the content stripe 2011. In this example, as shown in FIG. 20H, a file "monthly report" 2083 is displayed in the latest files area 2084 of the detailed information section 2082.
As shown in FIG. 20H, to send or transmit the file "monthly report" 2083 to the requestor, the user drags the "monthly report" 2083 object into the application stripe 2015. A multimedia messaging service ("MMS") feature may be selected to send the "monthly report" 2083 file. In alternate embodiments, the file may be transmitted using any suitable method or application.
FIG. 18 illustrates one embodiment of a method incorporating features of the present invention. In one embodiment, the user has selected a region/stripe 1802 and information and content related thereto is displayed 1804. An item is selected 1806 from the detail region and dragged 1808 into at least one other zone. The zone searches 1810 for information and content related to the item. Any information or content found may be identified and displayed 1812.
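The drag-to-search method of FIG. 18 (select an item 1806, drag it into a zone 1808, search the zone 1810, display results 1812) can be sketched as follows, assuming — purely for illustration — that each stripe exposes its items as plain strings and that "related" means a case-insensitive substring match.

```python
# Hypothetical sketch of the drag-to-search method of FIG. 18.
def drag_to_search(item, target_stripes):
    """Drag `item` onto one or more stripes and collect related content.

    target_stripes maps a stripe name to its list of content strings.
    Returns only the stripes in which related content was found.
    """
    results = {}
    for name, contents in target_stripes.items():
        matches = [c for c in contents if item.lower() in c.lower()]
        if matches:
            results[name] = matches
    return results
```

Dragging an item into the "other area" (2016 of FIG. 20A) would correspond to calling this with every stripe as a target, while dragging onto one stripe passes only that stripe.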
The user interface of the disclosed embodiments also provides the ability to search for information stored in the mobile platform. As shown in FIG. 2A, the user interface 200 provides categories of information or functionality that may be divided into, for example, an environment 210B, contacts or people 210C, a calendar 210D, content 210E, and applications 210F.
The user selects any one of the category regions or bars 210 to access the function and view the corresponding information associated with the selected category in the detailed information section 220.
Referring to FIG. 21A, the user interface of the present invention allows searching for information in the stripe categories, either individually or as a whole. The user may also review relevant information found as a result of the search. For example, as shown in FIG. 21A, one embodiment of a user interface 2100 of the present invention generally includes three primary regions: a system area 2101, a category or "stripe" area 2102, and a details area 2103. The system area 2101 typically includes three selection icons or objects that serve as access points to the underlying functions and applications of the system. In alternative embodiments, the system area 2101 may include any suitable number of icons corresponding to underlying system functions. In the example shown in FIG. 21A, the system area 2101 includes soft keys for "System" 2104, "Find" 2105, and "Primary" 2106.
The stripe area 2102 is used to select and display a particular category of information (210A-210F of FIG. 2A). Information related to the selected category is displayed in the details area 2103. Movement and navigation between icons may be accomplished in any suitable manner, including with a joystick or keypad. Referring to FIG. 1B, keys or cursor functions such as "OK," "Backward," "Up," "Down," "Left," and "Right" may be located on the mobile platform's keypad 330.
Referring to the embodiment illustrated in FIG. 21A, a method of searching for information in the user interface of the present invention is illustrated. The "Find" selection object 2105 in the system area 2101 provides entry to the search function. When "Find" 2105 is selected, screen 2110 is displayed on the user interface 2100. Through this search user interface, the user may search all of the information categories in the stripe area 2102, or may search any of the categories individually. When search results are listed, the user may view other information related to them. For example, to search all of the information categories in the stripe area 2102, a search phrase or criteria is entered into area 2111. The search results may be displayed in the details area 2103.
Referring to FIG. 21B, a user wishes to search the contact area for a phone number or other contact details related to a contact. The find 2105 function has been activated and search contacts bar 2114 is selected. In the details area 2103, a search criteria input area 2111 is provided to allow the user to enter search criteria.
The user then moves to the details area 2103 to enter a search phrase or criteria into field 2111. As shown in FIG. 21C, the search phrase "Tom" is entered in area 2111, and the search function or key 2117 is activated. The search results 2118 may be displayed on the user interface. Other categories in the stripe area 2102 that are not selected as the primary search area will be displayed as "related." If any information relevant to the search criteria is identified in the "related" categories 2120, 2122, 2123, and 2124, that information may be indicated in the respective stripes. For example, the number on an icon 2119 on each of the stripes 2120, 2122, 2123, and 2124 indicates how many items of each indicated type have been found to be associated with the search criterion "Tom."
For example, by searching the contacts area 2114 for "Tom," the corresponding phone number and other contact details are displayed in area 2118. The "related content" stripe 2120 indicates that information related to "Tom" has been found, shown as icons 2120A and 2120B. Referring to FIG. 21D, if the user selects the "related content" stripe 2120, the information 2130 in stripe 2120 related to "Tom" is displayed. Similarly, the related calendar stripe 2123 shows that at least one item 2123A in category 2123 has been identified in the search as being related to "Tom." As shown in FIG. 21E, the user may directly select or move to the related calendar category 2123 to view the indicated meeting information 2123A.
As previously described, the different areas, regions, and icons of the user interface 2100 of FIGS. 21A-21E may be navigated between by any suitable navigation tool, such as a stylus or multi-directional cursor device for a touch screen. Soft key devices that change function based on the mode of the user interface may also be used.
FIG. 19 illustrates one embodiment of a process incorporating features of the present invention. In one embodiment, a search mode of the user interface is activated 1902. A category or stripe in which content is to be searched is selected 1904. The search criteria are input 1906. The search is run and the results are displayed 1908. Any unselected areas having content relevant to the search criteria are identified or highlighted 1910. To view the contents of an unselected area, the "related" information area is selected 1912.
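The search flow of FIG. 19 can be sketched in a few lines. The substring-matching model and dictionary-based categories are assumptions made for illustration; the point is that the selected category returns full results while unselected categories return only "related" hit counts, suitable for highlighting (steps 1908-1910).

```python
# Hypothetical sketch of the search process of FIG. 19.
def search_categories(criteria, categories, selected):
    """Search one selected category fully; count hits in the others.

    categories maps a category name to its list of item strings.
    Returns (primary_results, related_hit_counts).
    """
    needle = criteria.lower()
    hits = {name: [c for c in items if needle in c.lower()]
            for name, items in categories.items()}
    primary = hits.pop(selected, [])          # full results for the search area
    related = {name: len(found) for name, found in hits.items() if found}
    return primary, related
```

In the "Tom" example of FIGS. 21C-21E, `primary` would populate the details area 2118, and `related` would drive the numbered icons 2119 on the "related" stripes.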
The disclosed embodiments may also include an idle screen for the user interface. As is known in the user interface art, during periods of inactivity or non-use, the user interface or display may revert to a mode commonly referred to as an "idle" mode. In idle mode, "wallpaper," a screen saver, or another image may be presented over a major portion of the display screen. In embodiments of the present invention, the user interface may enter a mode that may be referred to as an "active idle" mode, in which a screen saver or image is displayed but the functionality and modes of the user interface remain at least partially visible and active.
For example, referring to FIG. 22A, one embodiment of an active idle screen 2201 incorporating features of the present invention is shown. In one embodiment, the active idle screen 2201 displays icons 2202, 2203, 2204, 2205, 2206, and 2207 that generally correspond to or are related to categories of information available to or that can be made available to a user or operator. For example, icons 2202, 2203, 2204, 2205, 2206, and 2207 shown in FIG. 22A are general representations of icons 210A-210F of FIG. 2A. These categories typically include functionality and modes as previously described, which may include, for example, search 2202, environment 2203, people 2204, calendar 2205, content 2206, and applications 2207. The icons, images or graphics used to represent the various categories or functions associated with each of the icons 2202-2207 may be in any desired or suitable form, and the scope of the invention is not limited to the exemplary display shown in fig. 22A.
To identify the active idle state of the user interface of FIG. 2A, the icons 2202-2207 of FIG. 22A are generally smaller in size than the icons or stripes of FIG. 2A. To conserve screen space and enhance visibility of the "wallpaper" during idle mode, the category appearance may be reduced to a minimum, or to any desired size of icons or images. A feature of the invention is that, through customization and scalability, it allows basic management of incoming events while the device is in an idle state, and provides a direct link from the idle state to a selected event in the active state of the user interface.
As shown in FIG. 22A, the images, icons, and category appearances are reduced in size relative to the embodiment of the stripes 210A-210F shown in FIG. 2A, to save screen space and enhance wallpaper visibility in the idle state of the device. Icons 2202-2207 include icons or images representing the underlying applications or categories. The remaining space may be used, for example, for wallpaper. In alternative embodiments, the remaining space may be used for any suitable purpose, including displaying text or images, or for other applications such as gaming.
Although the appearance of the categories of FIG. 22A is reduced, whenever a new event is detected by the device, the detection or occurrence of the event is signaled on the idle screen by a notifier or notification. For example, in one embodiment, referring to FIG. 22A, if an event or activity is detected or occurs with respect to one of the categories 2202-2207, it may be signaled by an indicator or notifier 2209, 2211. The notifier may be displayed in a location that enables the user to connect or associate the notification with the corresponding category. For example, as shown in FIG. 22A, notifiers 2209 and 2211 are sufficiently adjacent to their respective corresponding categories, or can be seen to derive from them. In alternative embodiments, the notifiers may appear in any suitable location on the display 2201, so long as the user is able to recognize the category to which each notifier corresponds. For example, in one embodiment, a notifier may appear anywhere on or within the display area, with text or images that associate it with the category for which it provides notification.
For example, in one implementation, the notifier 2211 is activated and displayed when a new "message" or incoming call related to the people category 2204 is detected or received. Notifier 2209 indicates that an event is occurring or has occurred with respect to the environment category 2203.
Notifiers 2209 and 2211 generally comprise icons that include an arrow 2210. In alternate embodiments, any suitable image or icon may be used for notification. FIGS. 22B-22I show other examples of possible notifiers.
In one implementation, referring to FIG. 22B, notifiers 2209 and 2211 of FIG. 22A can have a simplified state and an expanded state. For example, FIG. 22B shows the expanded state 2220 of the notifier 2211 of FIG. 22A. The expanded state 2220 may include other details and information related to the event for which notification is provided. As shown in this example, the notifier 2210 of FIG. 22A indicates that an event has occurred in the people or contacts area 2204. In the expanded state 2220, the notifier indicates that the event is "Benjamin Online." The expanded state 2220 may be any suitable shape and size necessary to display the desired information. Aspects of the notifier may be highlighted in any suitable manner, including, for example, by size, font, or color. The user may customize the appearance, as reflected by the examples shown in FIGS. 22B-22I.
In one embodiment, the expanded state 2220 may occur automatically, either simultaneously with the occurrence of an event or initial notification, or within a specified time thereafter. For example, if the presence of a contact from the contact list associated with the people category 2204 is detected, a notification 2211 may initially be displayed. Within a period of time after the event or initial notification, the notification 2211 may expand to the expanded-state notification 2220 of FIG. 22B. This may occur automatically or based on user activity.
For example, after display of the notification 2211, some action by the operator may be required to cause the notification 2211 to expand into the expanded state notification 2220 of fig. 22B. This may include, for example, "clicking" on any portion of the icon or image of the notification 2211, such as arrow 2210. In alternate embodiments, any suitable action or activity may change the simplified status notification to the extended status notification. For example, in one implementation, the presence of arrow icon 2210 indicates the availability of further information related to the notification 2211. Clicking or acting on the arrow icon 2210 of FIG. 22A causes the expanded state 2220 to be displayed, as shown in FIG. 22B.
In one embodiment, the notification of the expanded state 2220 shown in FIG. 22B may change state back to another simplified state after a predetermined period of time, such as the notification 2230 of FIG. 22C. For example, referring to FIG. 22A, a system incorporating the user interface of the present invention detects that a contact is online. As shown in fig. 22A, a notification 2209 associated with the "people" category 2204 appears on the display 2201. Notification 2209 changes state to expanded state 2220 of FIG. 22B either automatically or after user activity. As shown in FIG. 22B, the extended state 2220 provides further information about the event to the user, i.e., "Benjamin is online". As shown in FIG. 22B, the phrase "online" is highlighted to more particularly identify the event. After a predetermined period of time, e.g., 15-30 seconds, the notification 2220 of FIG. 22B changes state to the simplified contact online notification 2230 shown in FIG. 22C. In one implementation, simplified notification 2230 may be generally the same as notifier 2211 of FIG. 22A. As shown in fig. 22C, notifier 2230 provides an indication of the event type via icon 2222. In general, the simplified contact online notification provides information about the type of event and the number of events occurring in each category of events corresponding to the function. For example, icon 2222 in simplified contacts online notification 2230 indicates that: in people category 2204, one contact is online. In alternative embodiments, the simplified notification may provide any suitable or desired information related to the event.
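The notifier life cycle described above (initial notifier, expanded state, then a simplified summary after a 15-30 second timeout) can be sketched as a small state machine. The class name, state labels, and explicit clock parameter are assumptions made for illustration.

```python
# Hypothetical sketch of the notifier life cycle of FIGS. 22A-22C.
class Notifier:
    def __init__(self, detail, timeout=20.0):
        self.state = "initial"      # "initial" -> "expanded" -> "summary"
        self.detail = detail        # e.g. "Benjamin is online"
        self.timeout = timeout      # e.g. 15-30 seconds, per the text
        self._expanded_at = None

    def expand(self, now):
        """Expand automatically or on user action (e.g. clicking the arrow)."""
        if self.state == "initial":
            self.state = "expanded"
            self._expanded_at = now
        return self.state

    def tick(self, now):
        """After the timeout, collapse to a simplified summary icon."""
        if self.state == "expanded" and now - self._expanded_at >= self.timeout:
            self.state = "summary"  # simplified icon with an event count
        return self.state
```

Passing the clock in explicitly keeps the sketch testable; a device implementation would drive `tick` from a UI timer.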
Referring to FIG. 22A, notifications 2209 and 2211 indicate that events have occurred in two categories, namely, environment 2203 and people 2204. In one implementation, the categories 2203 and 2204 may be highlighted by a change in color or appearance when an event or notification occurs. In general, any desired change may be made to emphasize the occurrence of an event, including, for example, an audible or mechanical notification such as a sound or vibration.
For example, in one embodiment, referring to FIG. 22B, the category icon 2204 is highlighted, as is the expanded icon 2220, with the text "online" 2221 highlighted in a different color than the remaining text. In FIG. 22C, the category icon 2204 is highlighted and the simplified notification icon 2230 is highlighted, for example, by a different color, hue, or font, to notify the user that the contact is online. The appearance of the arrow icon 2208 indicates that further information about the event is available, or that the notifier 2230 can be expanded.
FIG. 22D illustrates another embodiment of a notifier 2240 associated with the contacts category 2204. As shown in FIG. 22D, icon 2240 shows multiple events occurring in different sub-categories. For example, icon 2222 corresponds to the number of contacts "online" (2); notification 2240 shows that "2" contacts are online. An exemplary envelope-style icon 2241 may indicate the presence of mail messages (3). In this example, the number "3" adjacent to the icon 2241 indicates that three mail messages have been received in this category. Notification 2240 can expand if other events occur in other sub-categories, and can decrease in size or appearance if an event ceases (i.e., an online contact goes offline) or is handled by the user. Notifier 2240 may also be reduced to a simplified state, either automatically or based on a user action such as clicking indicator 2208. In one embodiment, the indicator 2208 can be reversed to show that acting on it will return the notifier to a reduced state.
In one implementation, the notification 2211 of FIG. 22A can be expanded to provide detailed information about the notification. For example, referring to FIG. 22E, if the user "points and clicks" on, for example, category 2204 of FIG. 22A, an event list 2250 may be displayed that provides detailed information related to each event. A list of icons 2256 may also be displayed to allow the user to filter the display of event categories. In the example of FIG. 22E, the icon 2258 for "all" events is highlighted and selected, causing all events occurring in category 2204 to be displayed in the list 2250. If an icon related to, for example, sub-category 2260 is selected, only the "online contacts" events associated with this sub-category 2260, such as event 2253, will be displayed. A scroll bar 2262 or similar device may be provided to allow the user to scroll or navigate through the different events, and may also indicate the viewable total number of events. Any suitable method can be used to select or scroll to an event in the list 2250, such as a graphical user interface pen or mouse. Icon 2263 may also indicate that there are more events in the displayed list that can be viewed.
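The event-list filtering of FIG. 22E can be sketched as a one-function filter: selecting "all" shows every event for the category, while selecting a sub-category icon shows only that sub-category's events. Representing events as (sub_category, description) pairs is an assumption made for illustration.

```python
# Hypothetical sketch of the event-list filtering of FIG. 22E.
def filter_events(events, sub_category="all"):
    """events: list of (sub_category, description) pairs.

    Returns the descriptions to show in the event list 2250.
    """
    if sub_category == "all":
        return [desc for _, desc in events]
    return [desc for sub, desc in events if sub == sub_category]
```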
FIG. 22F illustrates another example of a notifier 2270 associated with the environment category 2203. Referring to FIGS. 22A and 22F, the device detects the occurrence of an event in the environment category 2203, and notification 2209 appears on the display 2201. The notification 2209 may then change to an expanded state 2270, highlighting the category 2203 in some form. The information in the expanded notification 2270 indicates that a network device identified as a "Gil laptop" 2272 has been detected, and may also indicate a connection type (e.g., "Bluetooth"). In alternate embodiments, any suitable indication, icon, and information type and description may be displayed. Arrow 2271 may indicate that there is more detailed information to be displayed and viewed. As noted earlier, arrow 2271 may be an active or dynamic icon.
FIG. 22G shows an example of a notifier 2280 belonging to the content category 2206. In this example, the indication or icon 2281 may indicate pending or available activity, such as "MP3" playback. In the example shown in FIG. 22G, an icon associated with category 2206 is highlighted to indicate the occurrence or activation of the event. Icon 2281 is displayed to indicate the sub-category, characteristic, or type of the event. In one embodiment, the color may be changed to correspond to, for example, the media type, which the user perceives or is able to determine from the icon or indication. The content notification icon 2280 shown in FIG. 22G is in a minimal or minimized state. Icon 2280 may remain displayed as long as the content remains active or until the user performs some other activity.
Fig. 22H shows a content notification icon 2280 that has been expanded to a more detailed state 2290. In this example, a title 2291 of particular content, or a portion thereof, is displayed. The user may customize the description or amount of information displayed in notification 2290 to display any desired information in any particular form, image, or type of image.
FIG. 22I shows the notifier 2290 of FIG. 22H expanded to include other information. Active or dynamic icons may be displayed in association with the notifier 2295. For example, a control 2296 is displayed that allows the user to play the content. Arrow indicators 2297 and 2298 may allow the user to view more details or text about the information in block 2291. When the device and user interface are in an idle state and notification 2280 is present, the notifier 2295 can be set to appear when any key, or a particular key, is pressed. For example, if the device is a mobile phone in an idle state with the keypad locked, notification 2280 notifies the user of the event in content category 2206. When the phone is unlocked, activation of a hard key or soft key may cause the view 2295 to appear. The view 2295 may be maintained until closed by the user, or may be maintained only temporarily.
FIG. 27 illustrates one embodiment of a method incorporating features of the present invention. An idle state of the device is detected 2702 and activated. The stripe region is reduced in size 2704 upon activation of the idle state. The occurrence of a region-related event is detected 2706. A notifier in a first state is displayed 2708 in conjunction with the region. The notifier then expands 2710 to a second state to provide more detail about the event. The notifier then changes 2712 to a third state, which presents summary information about the event. The region is selected 2712 to view the event. The expected future event type is selected 2714. The event list is displayed 2718, which may be performed by user activity or automatically.
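The three-state notifier sequence described above can be sketched as follows. This is an illustrative model only; the class and state names are assumptions rather than terms from the specification.

```python
# Illustrative sketch of the three-state notifier of FIG. 27.
# Class and state names are hypothetical, not from the specification.

MINIMIZED, DETAILED, SUMMARY = "minimized", "detailed", "summary"

class Notifier:
    """A notifier displayed in conjunction with a stripe region."""

    def __init__(self, event):
        self.event = event
        self.state = None

    def display(self):
        # First state: minimal indication shown with the region (step 2708).
        self.state = MINIMIZED
        return self.state

    def expand(self):
        # Second state: more detail about the event (step 2710).
        self.state = DETAILED
        return self.state

    def summarize(self):
        # Third state: summary information about the event (step 2712).
        self.state = SUMMARY
        return self.state

notifier = Notifier("missed_call")
states = [notifier.display(), notifier.expand(), notifier.summarize()]
```

The same object steps through the three states in order, matching the sequence of the method.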
Embodiments of the present invention generally provide or display event notifications or annunciators for the categories 210 of FIG. 2A. These event notifiers may include, for example: notification of an information message, a Bluetooth ("BT") request, a "buddy" coming online, a missed call, or an upcoming calendar event. In general, notifications may be provided in response to the detection of events relating to the functions and features of the device. Referring to figs. 23A-23F, in one embodiment, the user interface of the present invention displays or presents event notifications on a timeline or on a buddy list. As shown in FIG. 23A, in one embodiment, the timeline of events may be presented as, for example, a "chord" or line 2302. Depending on the screen size and layout, the chord direction may be horizontal or vertical. Also, the chord may be a straight line or a curved line; the geometry does not limit the scope of the invention. The line may be approximately 1 pixel wide, or much larger, depending on the size of the display or user preference.
Initially, referring to fig. 23A, there are no objects on the chord 2302 and the user interface or display 2301 does not display any annunciators. The display 2301 may display a background or other image, if desired. A system area 2300 may be provided in a portion of the display area. Although the term "chord" is used to describe this embodiment of the invention, any suitable graphical image or icon may be used, including any suitable descriptive term.
In one embodiment, the chord 2302 begins to vibrate or move when a device including the user interface of the present invention detects that a new event is about to occur or has occurred. In one embodiment, the frequency of vibration may depend on the importance of the event or on how soon the event will become active. For example, the device may detect that a message is about to be communicated. The system may react differently to the detection of a message to be received and of a message already received.
Referring to fig. 23B, the detection or occurrence of a device event is indicated by the appearance of a notification icon on the screen, which is displayed in fig. 23B as, for example, a balloon 2304. The type, size, and shape of the notification icon 2304 are not limited to the example of fig. 23B, and may include any suitable icon. As shown in fig. 23B, balloon 2304 contains a small icon 2304A that represents the type of event associated with the notification.
In one embodiment, a portion of the icon 2304 appears from the top of the display screen 2301. The icon 2304 moves to the other end of the line and more of the icon 2304 becomes visible until it is fully displayed. When the icon 2304 becomes fully visible, a pop-up window 2303 may be displayed on the screen 2301 to provide the user with more detailed information about the event. The pop-up window 2303 may include (hyper) text, icons, images, or other user interface components (e.g., a progress bar for displaying download status). In alternative embodiments, the pop-up window may include any suitable information. If the pop-up window 2303 is associated with more than one event/object, the number of such objects may be displayed in the pop-up window.
After a short period of time (e.g., 1.5-3 seconds), pop-up window 2303 may be hidden and only balloon 2304 remains visible. Balloon 2304 may then begin or continue to move toward the other end of chord 2302. For example, the icon 2304 slowly appears on the screen 2301, changing from a partial image to a full image as shown in fig. 23B. When the full image appears, pop-up window 2303 appears beside it. Pop-up window 2303 remains for a predetermined period of time and is then automatically removed. Icon 2304 then continues to move along line 2302. The movement speed of the icon 2304 may be, for example, 1 pixel/minute, although any suitable speed may be implemented. Object 2304 may also be moved in order to make room for other notifiers, such as notifier 2305 shown in FIG. 23C.
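The gradual movement of an icon along the line can be modeled with a simple position function. The 1 pixel/minute speed comes from the text above; the 176-pixel line length is an assumed example value (taken from the screen width mentioned later in the description).

```python
def icon_position(elapsed_minutes, speed_px_per_min=1, line_length_px=176):
    """Position of a notifier icon along the chord, in pixels from its
    starting end; the icon stops when it reaches the other end."""
    return min(elapsed_minutes * speed_px_per_min, line_length_px)
```

After ten minutes the icon has moved ten pixels; after a long idle period it comes to rest at the far end of the line.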
There may be cases where screen 2301 already contains a notifier for a similar event. In that case, when the newer object begins to move toward the older annunciator, the two are joined into a combined annunciator, e.g., annunciator 2307 in FIG. 23D. If several events of the same type are combined, the diameter of balloon 2307 may be increased to indicate the combination. As shown in fig. 23D, balloon 2307 has an associated pop-up window 2308 specifying that a new message has been received and indicating the name of the sender of the message.
The chord 2302 may become "overcrowded" when a certain number of balloons (depending on screen size) have accumulated. Thus, to accommodate more balloons along the chord, the balloons may be interlocked or overlapped.
The balloon may "pop" or be removed from the display when the event associated with the notifier ends (e.g., the message has been read) or the user has explicitly dismissed the event. In one embodiment, the annunciator may disappear after a certain period of time (e.g., 15 minutes) has elapsed, or, when the chord is full, annunciators may be removed using a "first-in, first-out" method.
The user may also interact with the annunciators, for example through a 5-way joystick, a touch screen, or a stylus. The user may change the input focus to the chord by, for example, pressing the left soft key. The user may then scroll through the annunciators using the up and down keys. Upon selection of an event, the notifier may open the associated event/object.
FIGS. 23E and 23F illustrate an alternative visualization in which the annunciators 2310, 2320 pop out of the border region 2315 of the display 2301 and retract into the border region after a predetermined period of time.
These embodiments of the present invention generally require reduced display screen real estate, display the temporal relationship of notifications, and may provide "soft notifications" to the user by vibrating the chord of the timeline. When a new event is notified, a pop-up window may display more information through text, images, or UI widgets, and several events may be incorporated into one notifier.
FIG. 28 illustrates one embodiment of a method incorporating features of the present invention. The device detects 2802 the occurrence of an event. The line appears on the screen and begins vibrating 2804. An icon begins to appear 2806 at one end of the line. When the icon moves along the line and becomes fully visible, a pop-up window appears 2808 with information about the event. After a predetermined period of time, the pop-up window disappears 2810. The icon continues to move along the line toward its other end. If there are other icons for the same or a similar type of event, the icons may be merged 2812 to form a single icon for that event type. Alternatively, the icons may be linked with other icons on the line or share the space 2814. When the event terminates or ends, the icon disappears 2816.
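The merging step 2812 can be sketched as grouping icons by event type. The base diameter and growth increment below are assumed values, chosen only to show the diameter increasing with the number of combined events, as described for balloon 2307.

```python
def merge_notifiers(icons):
    """Merge notifier icons of the same event type into a single icon
    whose diameter grows with the number of combined events
    (cf. balloon 2307 of FIG. 23D)."""
    counts = {}
    for icon in icons:
        counts[icon["type"]] = counts.get(icon["type"], 0) + 1
    # Assumed sizing: 16 px base diameter, +4 px per additional event.
    return [{"type": t, "count": c, "diameter": 16 + 4 * (c - 1)}
            for t, c in counts.items()]
```

Two message events and one call event would yield two icons, with the message icon enlarged to indicate the combination.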
In one embodiment, the user interface of the present invention can provide multiple, simultaneous views of the same information without the use of a split pane. For example, the display of a handheld device may not provide enough screen space to present information to a user in a parallel fashion. The information is thus divided into several windows and may be displayed sequentially.
In one embodiment, the present invention provides a "zoomable" user interface for a small screen device such as a mobile phone or personal digital assistant ("PDA"). The zoomable interface allows for simultaneous interaction with information related to content, users, tasks, ambiance, applications, and systems, even when the available display area is limited. Objects associated with the currently selected object are highlighted. The user interface displays an overview of the terminal content and the status of the content. Proximity and context information may be displayed, and perceptual information about people and events may be provided. Embodiments of the present invention also support easy customization and extensibility. In different embodiments, the width of each zone or category may vary. The user may hide or minimize areas not relevant to the current task, or, if the user needs more space to view objects of a certain area, the user may expand that area. In one embodiment, the system may perform this type of zone scaling automatically. For example, when focus is on a single region, the other regions may be deformed in such a way that they do not consume much screen space, but still provide contextual information about the zoom and navigation state.
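The zone scaling described above, where a focused region expands while the others shrink to thin contextual strips, could be computed as in the following sketch; the pixel values are assumptions for illustration.

```python
def zone_widths(zones, focused, total_px=176, min_px=16):
    """Allocate display width: each unfocused zone keeps a minimal strip
    for contextual information, and the focused zone takes the rest."""
    widths = {zone: min_px for zone in zones}
    widths[focused] = total_px - min_px * (len(zones) - 1)
    return widths
```

With three zones on a 176-pixel-wide display, the focused zone receives 144 pixels while the other two keep 16-pixel contextual strips.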
Although interaction through a stylus and a touch screen or some other pointing device is more direct, in other embodiments the interaction may be scaled down for use with, for example, a 5-way joystick. In this case, the layout is simplified so that it is grid based and each zone contains no more than one column. Access to detailed information and functionality may be achieved in several ways. For example, if the user selects an item by pressing the 5-way joystick, the default activity associated with the summary object is performed. If the user presses long on the object, a context menu pops up, and moving up and down highlights menu items. Another option is that the menu pops up after a single press of the joystick, and moving up and down changes the highlighted part of the menu. Since the default activity is first on the list, it can be accessed by double-clicking. The menu may be closed by selecting an item or by using an additional soft key.
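The joystick interactions above amount to a small dispatch rule: a short or double press runs the default (first) activity, and a long press opens the context menu. A minimal sketch, with hypothetical names:

```python
def handle_joystick(press, menu):
    """Dispatch a 5-way joystick press against an object's context menu.
    `menu` is an ordered list of actions; the default activity is first."""
    if press in ("short", "double"):
        # A short press or double-click performs the default activity.
        return ("run", menu[0])
    if press == "long":
        # A long press opens the context menu for up/down highlighting.
        return ("open_menu", menu)
    raise ValueError("unknown press type: " + press)
```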
Furthermore, a zoomable user interface is considered beneficial because mobile devices have limited interaction and rendering capabilities, and the ZUI eliminates the need for scrolling long lists and focus operations. In a GUI, the size of a window limits the amount of content that can be viewed, and if the size of the content is larger than the size of the window, scrolling of the window content is required. The ZUI, however, uses the screen itself as the canvas, and the user can pan and zoom the content. It has no overlapping windows and uses screen space efficiently. Window boundaries and controls (control bars, minimize and maximize buttons, etc.) do not consume screen space, yet the same familiar user interface components (widgets) can be used. With focus-and-context visualization, the ZUI can maintain the spatial relationships of the objects. Furthermore, the present invention reduces the amount of information displayed by having the user select what information to display and where to display it.
For example, referring to FIG. 24A, embodiments of the zoomable user interface of the present invention generally include two main areas, a system area 2402 and a canvas area 2420.
The system area 2402 generally provides the same or similar functionality as the system area described with respect to fig. 1. The system area 2402 may include information related to device status and navigation as well as system tools.
If the user moves the input focus to system area 2402 or selects the system area, the system area is maximized. Information about connections and traffic, as well as links to areas for changing system settings in the device, is provided. For example, system state information may include information related to active links and traffic 2403, battery settings and status 2404, operator information 2405, and date and time information 2406, 2407.
The system area 2402 may also provide navigation/system functions and tools. These may include, for example, back and home keys 2408, 2409, and view controls 2410, 2411. These controls can be used to change the type of view (e.g., list, grid, tiled plane) in a content category. This type of control is needed because different types of content objects may require different views. For example, a grid full of thumbnails may fit images or video, but may not display the long object titles typically associated with audio files. The user may also create a custom view best suited to the desired purpose, such as an "R" related display type, a column/grid display, or re-enabling a previously recorded canvas layout.
The search areas 2412, 2413 generally provide the same search functionality as previously described. The areas 2412, 2413 may be used to search for content via various search criteria, e.g., locally from the user's own device, from a peer-to-peer ("P2P") network, nearby, or on the Internet. When the user begins typing a search string in the text box of the search field area 2412, the system begins filtering irrelevant objects from the categories 2421, 2423, 2425, 2427, 2429, and 2431. If the search results in an empty category, the category may contain a link or button for extending the search beyond the local device. The user may access advanced search features by activating or pressing a key associated with the search area. The search results are presented on the canvas by filtering out (hiding) irrelevant objects. The search fields generally include a search input field 2443 and a search category field 2442, as shown in FIG. 24B. FIG. 24B illustrates one embodiment of a user interface display of the information and content described with reference to FIG. 24A, using icons, objects, and text images.
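The incremental filtering described above might look like the following sketch, where a category whose objects all fail the filter retains only a link for extending the search. Function and placeholder names are assumptions.

```python
def filter_canvas(categories, query):
    """Hide objects that do not match the search string. A category left
    empty keeps a link for extending the search beyond the local device."""
    q = query.lower()
    filtered = {}
    for name, objects in categories.items():
        hits = [obj for obj in objects if q in obj.lower()]
        filtered[name] = hits if hits else ["<extend search>"]
    return filtered
```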
The user interface of the present invention allows interaction through a pointing device. Referring to FIG. 24B, the user moves a cursor (cf. magic lenses) over the canvas 2420. When the cursor is moved over an object and tapped once, such as object 2444 in fig. 24B, the detailed information area 2445 becomes visible. Selecting the desired function zooms in (or opens a window) to a view presenting the desired features and information. To assist navigation, zooming and other view switching may be enhanced by animation. Also, effects other than magnifying objects may be used to manipulate the selected object.
Referring to FIG. 24A, the canvas 2420 is a container that can be used to display objects. The object on the canvas may be, for example, a file, a message, a calendar event, a link to an application or service, or a contact. Basically, an object may comprise any type of entity or application stored and presented in a terminal. The objects may be displayed as images, icons, text, or any combination thereof. The canvas area 2420 is further divided into content categories/sections 2421-2432. The zones 2420A-2420F are containers for objects. Each zone typically contains objects that share similar characteristics. The exact number, as well as the shape and size of the categories 2421, 2423, 2425, 2427, 2429, and 2431 can vary.
Referring to fig. 24B, one example of a category is contacts 2440B. Contact category 2440B may include, for example, phonebook contacts and methods of communicating with those contacts (e.g., voice calls, instant messages, chat). The message category 2440D may include, for example, email, SMS, and MMS tools for exchanging information with a person. Calendar category 2440C may include calendar events and other objects having a time dimension. The environment category 2440F (also referred to as proximity) typically displays context or location information as well as proximity data. The content or application category 2440E typically includes all objects stored in the device or accessible to the user through the device. Application section 2440E may also provide space for creating new content or applications, including third-party applications, that do not fit in one defined category. Included in the application category 2440E may be a separate "content" category, which may include, for example, the object "media" containing all of the media files (images, videos, animations, music, etc.) of the device. Other zones may be added if desired. In one embodiment, the canvas 2420 may include an "events" category in which all types of event notifiers are presented centrally.
Referring to fig. 24B, a menu containing common commands/actions (e.g., sort, create new, delete, minimize) for all objects of category 2440A may be accessed using, for example, the zone header 2443. When an object is selected, object-specific commands/actions can be accessed through the "detailed information" area that is displayed. For example, in fig. 24B, when the object 2444 of "friend 2" is highlighted and selected, a detailed information area 2445 appears, which includes functions and commands related to the object that can be selected and operated.
It is possible to minimize the appearance of other objects on the screen when detailed information about the selected object (dynamic region) is displayed. In fig. 24C, contact category 2440B is highlighted. Additional information related to each item in contact category 2440B is displayed, while the unselected notifier category 2443 is reduced in size and remains in a reduced-size state. Also, in FIG. 24C, if the user selects object 2451 in contacts category 2440B, a portion of the canvas may be used to present detailed information about the selected object (e.g., the most important metadata) and the most common functions related to the selected object.
As shown in FIG. 24C, both the contacts category 2440B and the calendar category 2440C have been expanded to include more detailed information related to each item in the respective categories. The details may appear proximate to the respective icon.
The user interface of the present invention also provides a related view mode that allows the user to identify all objects that are related to the selected object. For example, referring to FIG. 24D, the object "friend 2" 2461 is selected. By activating the related view mode, all data or objects related to the selected object 2461 can be viewed. The relevant information is displayed in area 2462 and may include location information, recent contact data, and recent files exchanged between the user and friend 2. In alternate embodiments, any suitable category of related information may be displayed.
Once the user selects an object, e.g., 2461, the size of the unselected categories may be reduced in such a manner that a portion of the canvas can be used to display items related to the selected object. For example, a "display related" key, which may be a hardware key or a soft key, may be presented. If the user activates this function, all objects related to the selected object are emphasized. This may be done, for example, with a secondary highlight, or by graying out or hiding irrelevant objects in the categories that remain visible. If some categories do not contain any relevant objects (all of their objects are hidden), those categories can be minimized. Also, an icon representing the "parent" item is displayed in the related item control area 2460. If the device has a pointing device, such as a stylus or trackball, hovering the cursor over a secondary-highlighted item displays its relationship to the parent item (e.g., as a tooltip, or by emphasizing the attributes and values of the associated metadata in the detailed information region).
FIG. 24E illustrates one embodiment of a user interface incorporating features of the present invention having a screen size reduced to approximately 176 by 208 pixels. The categories 2470 and items 2471 may be scaled to a desired screen size for a particular device.
FIG. 24F illustrates another embodiment of a user interface incorporating features of the present invention, showing categories 2480 and files in use 2481.
FIG. 29 illustrates one embodiment of a method incorporating features of the present invention. The title of a zone is selected or highlighted 2902. The size of the region, e.g., its width, is expanded 2904 to display more information about the region and the objects in the region. An object is selected 2908 in the zone. Information related to the object from the zone is displayed. The related content function is activated 2910, and information related to the selected object from the other, unselected areas is displayed 2912.
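The related-content step 2912 can be sketched as a lookup over a relation map: objects in the unselected zones are kept only if they are linked to the selected object. The data shapes below are assumptions for illustration.

```python
def related_objects(zones, selected, relations):
    """For each zone, keep only the objects related to `selected`.
    `relations` maps an object to the set of objects linked with it."""
    linked = relations.get(selected, set())
    return {zone: [obj for obj in objects if obj in linked]
            for zone, objects in zones.items()}
```

Zones whose lists come back empty would then be minimized, as described above for irrelevant categories.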
In one embodiment, the present invention provides a user interface for accessing, consuming, managing, and sharing digital content between multiple connected devices. Many types of devices may be used to obtain, create, pay for, share, and manage digital content. Some of these devices include, for example: DVB-T/H receiver, game console, PC, camera, MP3 player, smart phone, PDA and mobile phone. When these devices are connected or interconnected in some manner, they may form a device ecosystem, such as that shown in FIG. 25A. Embodiments of the present invention provide a user interface for accessing, consuming, and managing digital content between a plurality of connected devices.
Generally, devices that provide digital content provide a separate user interface for their features and functions. For example, referring to fig. 25A, a user may have several devices 2501, 2502, 2503, and 2504 with which the user may interact and access digital content. U1-U3 represent users with whom (part of) the content is shared and who are able to interact with and access the content using devices 2501-2504. The problem is that, for example, although device 2502 may be able to play content 2511, content 2511 can only be accessed by interacting with device 2501. The device 2501 may have different user interfaces and interaction conventions than the device 2502, which must be learned. It can be difficult to remember what content each user can access. Searching and managing the content becomes tedious because the user cannot keep track of the content in each area shown in fig. 25A. The present invention provides a user interface that can combine the content of several devices into a single view.
Referring to FIG. 25B, in one embodiment, the user interface of the present invention includes three main areas: a people area 2521, a device area 2522, and a content area 2523. The people or user area 2521 typically identifies the people with whom content in the selected device may be shared; this section identifies the users, groups, and individuals with whom the content is or can be shared. The device area identifies all media devices to which the user has access and may include devices and components for storing and accessing content. The content area identifies digital content accessible through the selected device. Such an embodiment is generally referred to herein as a "device ecosystem". The user interface may be interacted with through a stylus and touch screen using "drag and drop" techniques or other suitable navigation methods.
Referring to fig. 25B, each category or region includes a title bar 2530A, 2530B, 2530C, an area 2540A, 2540B, 2540C for describing its contents, and a status bar 2550A, 2550B, 2550C for providing additional information about the selection. Although not shown in the figures, the functionality of the user interface may be accessed through a menu bar, a pop-up (context) menu, or a toolbar.
FIG. 25C illustrates a more detailed example of one embodiment of a user interface when the user group 2561 is selected. The people region 2521 displays the groups/people with whom content is shared, as well as the user himself. The user is depicted by icon 2562 and the other individual users are depicted by icons 2563-2567. A user may be highlighted in different ways depending on whether the user is online or offline. Icon 2568 shows a workgroup with 2/6 users online. Icon 2569 represents a separate offline user with access. Region 2570 may provide the total number of online or offline users.
The device area 2522 displays all media devices owned by, or accessible to, the user. If a device is not available (e.g., the device owner is not connected, does not share the device, or the user's device cannot connect to it for some reason), it is highlighted differently than the other device icons, for example by graying it out. At the top, the device currently used to access the ecosystem, in this example device 2571, is shown. Region 2572 indicates the number of accessible devices.
The content region 2523 displays categories based on metadata, in which the digital personal content accessible (via the selected device) is displayed. The content is shown as a hierarchy in this example, but other types of views are possible.
Fig. 25D shows an example when a single device 2524 is selected. The selected device has a secondary highlight, grayed out. The users that can access the device 2524 are displayed in the people area 2521; these are users 2562-2567. Users 2568, 2569 that have no access are "grayed out". Region 2570 indicates the number of users with access to the selected device.
The content region 2523 lists the files 2580 accessible through the selected device. Categories containing no accessible content are grayed out.
It is also possible to select multiple devices simultaneously (toggle selection on/off).
In FIG. 25E, a single device is selected, emphasized by a thin black border, and the input focus is on the content area 2523, where the file 2581 is selected. The device 2525 that actually stores the file 2581 is highlighted. The people having access to the file 2581 are highlighted in the people area 2521; these are users 2562, 2563, 2564, 2565 and 2569. The content region 2523 may also provide playback controls for the selected file as a toolbar or pop-up menu.
Figs. 25F and 25G describe examples of drag-and-drop interaction, file sharing, and copying. In FIG. 25F, the user shares a single file 2581 with a single user 2569 by dragging the file 2581 over the user 2569 and dropping it there. Similarly, in FIG. 25G, the user copies file 2581 to another device 2526. After the user drags the object 2581 over the device item 2526 and holds it there for some period of time (e.g., 1.5 seconds), a menu with the most important functions opens and the file 2581 is copied.
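The drag-and-drop behavior above can be reduced to a small decision rule: dropping on a user shares the file, while hovering over a device past a hold threshold opens a function menu instead of performing a plain copy. The 1.5-second value comes from the text; the function name is hypothetical.

```python
def drop_action(target_kind, hover_seconds, hold_threshold=1.5):
    """Outcome of dragging a file onto a target in the ecosystem view."""
    if target_kind == "user":
        return "share"  # cf. FIG. 25F: dropping on a user shares the file
    if target_kind == "device":
        # Holding past the threshold opens the menu of functions (FIG. 25G).
        return "open_menu" if hover_seconds >= hold_threshold else "copy"
    raise ValueError("unknown target: " + target_kind)
```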
Fig. 25H is an example in which two regions (the people region 2521 and the device region 2522) are minimized and the content region 2523 is expanded. In the event that the user is not interested in, for example, the people region 2521 or the device region 2522, these regions may be minimized by, for example, clicking on the respective title bar. The remaining area is maximized, and the active selection of each minimized bar is displayed as an icon and text in the minimized bar. The maximized area may display additional information about its items.
The present invention provides a unified user interface (i.e., similar interaction, content presentation, and structure) for all devices of a device ecosystem; provides the ability to manage access rights of users, devices, and content in a single view; provides the ability to view the content of several devices simultaneously; and provides interaction enhanced by drag-and-drop features, easily minimizing regions of no interest and maximizing the display space of regions of interest.
FIG. 30 illustrates one embodiment of a method incorporating features of the present invention. A user is selected 3002 from the user area to identify the devices and content to which the user has access permission. A device is selected 3004 from the device area to identify the users with permission to access it and the content associated with the device. A content file may be selected 3008 from the content area to identify the devices that may use the content and the users that may access it. To share content with another user, the content item for the file is selected and dragged 3010 over the other user. The content is then shared 3012.
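The three linked views of FIG. 30 imply an underlying access mapping between users, devices, and content. The sketch below is a hypothetical minimal model of that mapping, not an implementation from the specification.

```python
class Ecosystem:
    """Minimal model of user/device access in a device ecosystem."""

    def __init__(self):
        self.device_users = {}  # device name -> set of users with access

    def users_for_device(self, device):
        # Selecting a device identifies the users with access (step 3004).
        return self.device_users.get(device, set())

    def devices_for_user(self, user):
        # Selecting a user identifies accessible devices (step 3002).
        return {d for d, users in self.device_users.items() if user in users}

    def share(self, device, user):
        # Dragging content over a user grants access (steps 3010-3012),
        # simplified here to device-level access.
        self.device_users.setdefault(device, set()).add(user)
```

Selecting from any one of the three areas then amounts to querying this mapping from a different direction.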
The above steps can be implemented using standard, well-known programming techniques. The novelty of the above-described embodiment lies not in the specific programming techniques but in the use of the described steps to achieve the described results. Software program code embodying the present invention is typically stored in some type of permanent storage, such as the permanent storage of a computer configured to include the GUI of the present invention. In a client/server environment, such software program code may be stored in storage associated with a server. The software program code may be embodied on any of a variety of known media for use with a data processing system, such as a diskette, hard drive, or CD-ROM. The code may be distributed on such media, or from the memory or storage of one computer system over a network of some type to other computer systems for use by users of those other systems. The techniques of embodying software program code on physical media and/or distributing software code over networks are well known and will not be further described herein. For example, a computer memory may be encoded with executable instructions representing computer code that cause a computer to operate in a particular mode.
It will be understood that each element of the description and combinations of elements in the description can be implemented by general and/or special purpose hardware-based systems that perform the specified functions or steps, or by combinations of general and/or special-purpose hardware and computer instructions.
These program instructions may be provided to a processor to produce a machine, such that the instructions which execute on the processor create means for implementing the functions specified in the illustrations. The computer program instructions are executable by the processor to cause the processor to perform a series of operational steps to produce a computer-implemented process such that the instructions that execute on the processor provide steps for implementing the functions specified in the illustrations. Accordingly, the figures support combinations of means for performing the specified functions, combinations of steps for performing the specified functions and program instruction means for performing the specified functions.
The present invention may be implemented using software, hardware, or a combination of both. Software for the present invention is stored on one or more processor-readable storage devices, including: hard disk drives, CD-ROMs, DVDs, optical disks, floppy disks, tape drives, RAM, ROM, flash memory, or other suitable storage devices. In alternative embodiments, some or all of the software may be replaced by dedicated hardware, including custom integrated circuits, gate arrays, FPGAs, PLDs, and dedicated processors. In one embodiment, one or more processors are programmed with software implementing the present invention. The one or more processors may communicate with one or more storage devices (hard disk drives, CD-ROMs, DVDs, optical disks, floppy disks, tape drives, RAM, ROM, flash memory, or other suitable storage devices), peripherals (printers, monitors, keyboards, pointing devices), and/or communication interfaces (e.g., network cards, wireless transmitters/receivers, etc.).
FIG. 26 is one embodiment of an exemplary system incorporating features of the present invention that may be used to practice the present invention. As shown, computer system 2600 can be linked to another computer system 2602, such that computers 2602 and 2604 can send information to and receive information from each other. In one embodiment, the computer system 2602 may include a server computer adapted to communicate with a network 2604, such as the Internet. In an alternative embodiment, system 2600 may comprise a peer-to-peer ("P2P") network, wherein each computer forms a network node and operates as both a client and a server at the same time. Computer systems 2602 and 2604 may be linked together by any conventional means, including a modem, a hard-wired connection, a wireless connection, or a fiber optic link. Generally, information can be made available to both computer systems 2602 and 2604 using a communication protocol sent over a communication channel or through a dial-up connection on an ISDN line. Computers 2602 and 2604 are generally adapted to use program storage devices embodying machine-readable computer source code adapted to cause the computers 2602 and 2604 to perform the method steps of the present invention. Program storage devices incorporating features of the present invention may be designed, made, and used as components of machines that use optics, magnetic properties, electromagnetic signals, and/or electronics to perform the procedures and methods of the present invention. In alternative embodiments, the program storage device may include magnetic media, such as a diskette or computer hard drive, that is readable and executable by a computer. In other alternative embodiments, the program storage devices may include optical disks, read-only memories ("ROMs"), floppy disks, and semiconductor materials and chips.
Computer systems 2600 and 2602 may also include a microprocessor for executing stored programs. Computer 2600 may include a data storage device 2606 on its program storage device for the storage of information and data. The computer programs and software incorporating the processes and method steps of the present invention may be stored on one or more of computers 2600 and 2602 on otherwise conventional program storage devices. In one embodiment, computers 2600 and 2602 may include a user interface 2610 and a display interface 2608 through which features of the invention may be accessed. The user interface 2610 and display interface 2608 may be adapted to allow queries and commands to be entered into the system and to present the results of those commands and queries.
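As a minimal, illustrative sketch of the stripe-based navigation described above and recited in the claims below — regions as containers for objects, a first input selecting a region, a second input selecting an object, and unselected regions searched for related information — consider the following Python model. All class, method, and data names here are the author's illustrations, not terminology from the patent.

```python
from dataclasses import dataclass, field


@dataclass
class Region:
    """A stripe-shaped window region acting as a container for objects."""
    name: str
    objects: list = field(default_factory=list)


class StripeUI:
    """Minimal model of the claimed flow: select a region, select an
    object in it, then gather related information from other regions."""

    def __init__(self, regions):
        self.regions = {r.name: r for r in regions}
        self.selected_region = None
        self.selected_object = None

    def select_region(self, name):
        # First user input: select one of the displayed regions.
        self.selected_region = self.regions[name]
        return self.selected_region

    def select_object(self, obj):
        # Second user input: select an object in the selected region.
        assert obj in self.selected_region.objects
        self.selected_object = obj
        return obj

    def related_in_unselected(self, match):
        """Search every unselected region for information related to the
        selected object; `match` decides what counts as related."""
        hits = {}
        for name, region in self.regions.items():
            if region is self.selected_region:
                continue
            related = [o for o in region.objects if match(self.selected_object, o)]
            if related:
                hits[name] = related
        return hits


# Usage example: objects are (label, tag) pairs, related when tags match.
ui = StripeUI([
    Region("contacts", [("Alice", "trip"), ("Bob", "work")]),
    Region("calendar", [("Flight", "trip")]),
    Region("content",  [("itinerary.pdf", "trip"), ("report.doc", "work")]),
])
ui.select_region("contacts")
ui.select_object(("Alice", "trip"))
hits = ui.related_in_unselected(lambda a, b: a[1] == b[1])
print(hits)  # {'calendar': [('Flight', 'trip')], 'content': [('itinerary.pdf', 'trip')]}
```

Dragging an object into another region (claim 3) would then amount to calling `related_in_unselected` with the target region alone; the sketch omits any rendering concerns.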
Claims (46)
1. A method for navigating information in a mobile terminal, comprising:
displaying, on a graphical user interface of the mobile terminal, a plurality of window regions, wherein each region is a container for objects and provides connectivity, information and related functions of the terminal;
receiving a first user input to select one of the regions;
receiving a second user input to select an object in the selected region; and
in response to the second user input, displaying a first hierarchical level of detailed objects and information related to the selected object in a vicinity of the display.
2. The method of claim 1, further comprising selecting an information item displayed in the vicinity of the display and moving the selected information item into one of the regions, wherein the region is automatically searched for information relevant to the information item.
3. The method of claim 1, further comprising selecting at least one object in a selected region and dragging the selected object into another region, searching the other region for any relevant information of the selected object, and displaying any relevant information of the selected object found in the other region.
4. The method of claim 1, further comprising displaying each region along a substantially horizontal line, the regions being stacked in a substantially vertical column.
5. The method of claim 1, further comprising:
selecting an object in a region and displaying first hierarchical information having at least one dynamic icon and a status indicator;
selecting one of the at least one dynamic icon and displaying second hierarchical information having more detailed data associated with the selected icon;
opening an object associated with the second hierarchical information and displaying third hierarchical information associated with the opened object; and
expanding the third hierarchical information to a fourth hierarchical level to display relationships with other objects in the application window border region.
6. The method of claim 1, further comprising, upon said selecting one of said regions, resizing said selected and unselected regions so that summary information related to said selected region can be displayed within a display area of said device.
7. The method of claim 1, further comprising selecting a region to be searched using search criteria, performing the search and displaying search results of the region search, and, for each non-selected region, providing an indication if any information relevant to the search criteria is available in the non-selected region.
8. The method of claim 7, further comprising selecting an unselected region for which there is an indication of information relevant to the search criteria, and displaying the relevant information.
9. The method of claim 1, further comprising entering an idle mode of the device during periods of inactivity, wherein entering the idle mode comprises reducing the size of the regions to a minimum and expanding a displayed wallpaper area.
10. The method of claim 9, further comprising:
detecting an occurrence of an event in the device; and
indicating detection of the event by providing a notifier in a first state on the display related to the category in which the event occurred.
11. The method of claim 10, further comprising, after a predetermined period of time, expanding the notifier from the first state to a second state, the second state providing more detailed information about the event than the first state.
12. The method of claim 11, further comprising changing the second state of the notifier to a third state, the third state indicating an occurrence of a new event, an event type, and a number of similar events occurring in the same category.
13. The method of claim 12, further comprising, in the third state, providing at least one control function for operating or controlling the event.
14. The method of claim 10, further comprising selecting a category in which the event occurs and displaying a list of all events occurring in the selected category.
15. The method of claim 14, further comprising, after selecting the category, displaying an event filter list, selecting an event type from the filter list, and displaying in the list only those events that have occurred and are related to the event type selected from the filter list.
16. The method of claim 1, further comprising:
detecting an occurrence of an event related to a device function;
displaying at least a portion of a line segment on a portion of the display of the device;
vibrating the line segment at a predetermined frequency and for a predetermined period of time to indicate the detection of the occurrence of the event;
forming at least a portion of an icon at one end of the line segment, the icon corresponding to a type of the detected event;
moving the at least a portion of the icon from the one end of the line segment toward the other end, the at least a portion of the icon changing state to the entire icon;
forming a pop-up window in the vicinity of the entire icon, the pop-up window providing information about the event and disappearing from view after a predetermined period of time; and
moving the entire icon to the other end of the line segment.
17. The method of claim 16, wherein a frequency of vibration of the line segment is dependent on the event type.
18. The method of claim 16, further comprising, in response to detection of a new event, forming a new icon on the line segment, determining that the new icon for the new event is similar to a previous icon on the line segment for a previous event, and merging the new icon and the previous icon.
19. The method of claim 16, further comprising selecting the icon to open an associated event object.
20. The method of claim 1, further comprising:
accessing a title area of a region to select the region; and
selecting an object within the region, wherein the size of the selected object is expanded to display at least one function associated with the selected object.
21. The method of claim 20, further comprising, upon said accessing of said title area of said region, expanding a width of said selected region to expand a display area associated with objects in said region and reducing a width of any unselected regions.
22. The method of claim 1, further comprising, after selecting an object in a region, displaying information corresponding to the selected object from the region and displaying information related to the selected object taken from any unselected regions.
23. The method of claim 1, further comprising, upon selection of the object in the selected region, highlighting objects in unselected regions that are related to the selected object in the selected region.
24. The method of claim 23, wherein a device object is selected in a device region of the user interface and all users having access to the selected device are highlighted in a user region of the user interface and all content categories associated with the selected device are displayed in a content region of the user interface.
25. The method of claim 24, further comprising selecting a content file in the content region and dragging the selected content file into the user region and over a user to provide the user with access to the selected content file.
26. The method of claim 24, further comprising selecting a content file in the content region and dragging the selected content file into the device region and over a device such that the selected content file is available through the device.
27. The method of claim 1, further comprising:
providing a user region identifying a list of users;
providing a device region identifying a list of available devices;
providing a content region identifying a list of available content files;
selecting a user, wherein all devices to which the selected user has access are highlighted, and content files in the content region that are accessible through the highlighted devices are highlighted;
selecting a device, wherein all users in the user region having access to the device are highlighted, and all content files in the content region accessible by the selected device are highlighted; and
selecting a file from the content region, wherein all devices in the device region through which the selected file is accessible are highlighted, and all users in the user region having access to the highlighted devices in the device region are highlighted.
28. A user interface for an electronic device, comprising:
a system area;
a summary stripe area comprising a display of categories of information accessible using the device; and
a detailed information area providing at least an overview of any relevant information for a selected stripe in the stripe area.
29. The user interface of claim 28, wherein each summary stripe region includes one or more objects linked to information related to the stripe region.
30. The user interface of claim 29, wherein the summary stripe area comprises a content category stripe, a calendar category stripe, a contacts category stripe, an application category stripe, and an environment category stripe.
31. The user interface of claim 28, further comprising summary stripes oriented in a substantially horizontal direction on the display.
32. The user interface of claim 28, wherein each summary stripe includes at least one selectable dynamic icon, image, text, or hypertext.
33. The user interface of claim 28, wherein each summary stripe includes at least a first level of information displayed when the summary stripe is selected and a second level of information displayed when an icon on the first level is selected, the second level providing more detailed information about the selected stripe than the first level.
34. The user interface of claim 33, further comprising a third tier of information, displayed upon selection of an icon in the second tier, that displays the information of the second tier and displays the relationships of that information to other objects in the displayed application window border region of the device.
35. The user interface of claim 28, wherein the user interface comprises a computer and a software program operating on the computer.
36. A method for providing and accessing menu functions on a display of an electronic device, the method comprising:
providing one or more categories of information in a menu structure on the display;
when one of the categories is selected, reformatting the selected menu function to be displayed as a primary object on the display;
determining which unselected categories have a relationship with the selected menu function; and
reformatting the unselected menu functions associated with the selected menu function to be displayed on the display as secondary objects associated with the primary object.
37. A graphical user interface for a terminal device, comprising:
a display screen;
a user input device;
a processor configured to display content of the device on the display screen;
means for dividing the content of the device, which the processor is configured to display, into a plurality of regions, each region providing information relating to the content and functionality of the terminal;
a primary region; and
at least one secondary region comprising more detailed information about a selected object in the primary region.
38. The graphical user interface of claim 37, further comprising:
means for providing a list of devices accessible through the user interface;
means for providing a list of users having access to the listed devices; and
means for providing a list of content accessible from the listed devices.
39. The graphical user interface of claim 37 wherein the graphical user interface comprises a computer and a software program operating on the computer.
40. A software product comprising instructions executable by a processor unit to enable the processor unit to perform the steps of claim 1.
41. The software product according to claim 40, wherein the software product is stored on a magnetic or optical data carrier.
42. The software product of claim 40, wherein the software product is stored in a computer memory.
43. The software product of claim 40, wherein the software product is stored on a read-only memory.
44. The software product according to claim 40, wherein the software product is stored on a computer remote from the mobile terminal and is transmittable as electronic or electromagnetic signals.
45. A computer program product, comprising:
a computer usable medium having computer readable code means embodied thereon for causing a computer to generate a user interface for a terminal device, said computer readable code means in said computer program product comprising:
computer readable program code means for causing a computer to display a plurality of window regions on a graphical user interface of the mobile terminal, wherein each region is a container for objects and provides connectivity, information and related functions of the terminal;
computer readable program code means for causing a computer to receive a first user input to select one of the regions;
computer readable program code means for causing a computer to receive a second input to select an object in the selected region;
computer readable program code means for causing a computer to display a first hierarchical level of detailed objects and information related to said selected object in a vicinity of said display.
46. A computer program having a program code stored on a machine-readable carrier for implementing the method according to claim 1, when the program runs on a computer.
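The three-region cross-highlighting recited in claims 24–27 — selecting a user highlights that user's devices and the content reachable through them, and selecting content highlights the devices and users with access — can be sketched as simple set operations over access relations. The data and function names below are the author's illustrations, not part of the patent.

```python
# Illustrative access relations: which users may use which devices, and
# which devices expose which content files.
user_devices = {
    "alice": {"phone", "tablet"},
    "bob": {"phone"},
}
device_content = {
    "phone": {"photo.jpg", "song.mp3"},
    "tablet": {"movie.avi"},
}


def highlight_for_user(user):
    """Selecting a user highlights the devices the user may access and
    the content files reachable through those devices."""
    devices = user_devices.get(user, set())
    content = set().union(*(device_content[d] for d in devices)) if devices else set()
    return devices, content


def highlight_for_content(filename):
    """Selecting a content file highlights the devices through which it
    is accessible and the users who may access those devices."""
    devices = {d for d, files in device_content.items() if filename in files}
    users = {u for u, ds in user_devices.items() if ds & devices}
    return devices, users


devices, content = highlight_for_user("alice")
print(sorted(devices))  # ['phone', 'tablet']
print(sorted(content))  # ['movie.avi', 'photo.jpg', 'song.mp3']
```

Dragging a content file over a user or device (claims 25–26) would then simply add an entry to `user_devices` or `device_content`; the sketch ignores rendering and permissions enforcement.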
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US11/179,024 | 2005-07-11 | 2005-07-11 | Stripe user interface |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| HK1117614A true HK1117614A (en) | 2009-01-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20060020904A1 (en) | Stripe user interface | |
| JP7026183B2 (en) | Devices, methods, and graphical user interfaces for interacting with user interface objects corresponding to an application | |
| KR102004553B1 (en) | Managing workspaces in a user interface | |
| KR101678271B1 (en) | Systems and methods for displaying notifications received from multiple applications | |
| US20060010395A1 (en) | Cute user interface | |
| KR101025259B1 (en) | Enhanced pocket computer and associated methods | |
| KR20090017626A (en) | Improved portable electronic devices and related methods | |
| JP2008542868A (en) | Improved pocket computer and related methods | |
| US20070045961A1 (en) | Method and system providing for navigation of a multi-resource user interface | |
| HK1117614A (en) | Stripe user interface |