WO2006001794A1 - Method for operating a menu system - Google Patents
Method for operating a menu system (Procede de fonctionnement d'un systeme de menu)
- Publication number
- WO2006001794A1 (application PCT/US2004/018600, US2004018600W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- menu
- user
- menu system
- desired location
- voice
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/487—Arrangements for providing information services, e.g. recorded voice services or time announcements
- H04M3/493—Interactive information services, e.g. directory enquiries ; Arrangements therefor, e.g. interactive voice response [IVR] systems or voice portals
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/26—Speech to text systems
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/223—Execution procedure of a spoken command
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/487—Arrangements for providing information services, e.g. recorded voice services or time announcements
- H04M3/493—Interactive information services, e.g. directory enquiries ; Arrangements therefor, e.g. interactive voice response [IVR] systems or voice portals
- H04M3/4936—Speech interaction details
Definitions
- The present invention generally relates to menu systems such as voice menu systems, and more particularly, to a method for operating a menu system that enables, among other things, users to quickly navigate to a desired location in a menu provided by the menu system.
- Menu systems such as voice menu systems are widely used by consumers.
- Voice menu systems are often employed by commercial and governmental entities to handle incoming telephone calls from consumers related to matters such as informational requests, sales, customer service, technical support and/or other matters.
- The use of such menu systems is cost-effective and desirable for entities since they tend to reduce the need for human resources, and thereby reduce the associated overhead costs. While menu systems such as voice menu systems may be cost-effective and desirable for the entities that employ them, consumers tend to dislike using them.
- Menus provided by menu systems are often configured as decision trees having a plurality of hierarchical layers. Users navigating through a given menu may be provided with several selectable options at various decision points of the decision tree. With a voice menu system, for example, a user may have to listen to these options at various decision points (without the ability to see ahead in the menu), and enter a given key input to select a desired option. This process can be time-consuming and inconvenient for users, particularly if the user has to navigate past several decision points before reaching a desired location in the menu.
- The process of navigating through a given menu may become significantly more inconvenient if a user must navigate through the same menu multiple times. This may occur, for example, if a user repeatedly accesses a particular menu to do things, such as receive information (e.g., technical support, etc.), place orders, and/or the like. In such cases, the user must navigate through the menu to the desired menu location each time the menu is accessed, thereby wasting his or her valuable time.
- One approach for addressing this problem is for the user to attempt to write down the necessary steps to get to a desired location in a given menu.
- A user could write down the specific key inputs needed to get to a specific location in a given menu the first time the menu is accessed. Then, when the menu is subsequently accessed, the user may attempt to advance to the desired location in the menu without listening to all of the available menu options.
- This approach is not optimal since it may be cumbersome, or even impossible in some cases, for users to write down potentially long sequences of specific key inputs.
- Moreover, some menu systems may not allow users to select the next option until the entire list of options has been output. The approach of writing down key inputs is also undesirable because a specific sequence of key inputs may no longer bring the user to the desired location in the menu if the menu hierarchy changes.
- Accordingly, there is a need for a method for operating a menu system, such as a voice menu system, which addresses the foregoing problems and thereby enables users to quickly navigate to a desired location in a menu provided by the menu system.
- The present invention disclosed herein addresses these and/or other issues.
- A method for operating a menu system is disclosed.
- The method comprises steps of enabling a user to navigate to a desired location in a menu provided by the menu system, and providing an address corresponding to the desired menu location to the user responsive to a first user input.
- A menu system is disclosed.
- The menu system comprises memory means for storing data including addresses for locations in a menu provided by the menu system.
- Processing means enables a user to navigate to a desired one of the locations, and provides the address corresponding to the desired location to the user responsive to a first user input.
- A voice menu system is disclosed.
- The voice menu system comprises a memory operative to store data including addresses for locations in a menu provided by the voice menu system.
- A processor is operative to enable a user to navigate to a desired one of the locations, and provide the address corresponding to the desired location to the user responsive to a first user input.
- FIG. 1 is an exemplary environment suitable for implementing the present invention
- FIG. 2 is a block diagram providing further details of the menu system of FIG. 1 according to an exemplary embodiment of the present invention
- FIG. 3 is a flowchart illustrating steps according to an exemplary embodiment of the present invention.
- The exemplifications set out herein illustrate preferred embodiments of the invention, and such exemplifications are not to be construed as limiting the scope of the invention in any manner.
- Referring to FIG. 1, an environment 100 suitable for implementing the present invention is shown.
- Environment 100 comprises user input and/or output (I/O) means such as user I/O device 10, and menu means such as menu system 20.
- In the exemplary embodiment of FIG. 1, user I/O device 10 is embodied as a telephone and menu system 20 is embodied as a voice menu system.
- User I/O device 10 is operative to enable user interaction with menu system 20 via any type of wired and/or wireless link.
- User I/O device 10 is embodied as any type of telephone device such as a wired, wireless, cellular, and/or other type of telephone. Accordingly, user I/O device 10 may include voice I/O means, a keypad, and/or a display. According to another exemplary embodiment, user I/O device 10 may be embodied as a user I/O device for a consumer electronics device such as a television signal receiver, computer, and/or other device. Accordingly, user I/O device 10 may be embodied as a hand-held remote control device, wired and/or wireless keyboard, or other suitable device or element. Menu system 20 is operative to provide various menu functions.
- Menu system 20 is embodied as a voice menu system, but may be embodied as any type of menu system such as, but not limited to, a menu system included in a consumer electronics product such as a television signal receiver and/or computer.
- Menu system 20 provides one or more menus each configured as a decision tree having a plurality of hierarchical layers. Users may interact with menu system 20 via user I/O device 10 and thereby "navigate" through a given menu to a desired location therein. At various decision points of a menu, users may be presented with a plurality of selectable options.
- A user may be instructed to press a first key to select a first option, press a second key to select a second option, press a third key to select a third option, and so on.
- Menu system 20 may advance the user to another decision point where further selectable options are presented. In this manner, the user may navigate through a given menu until he or she reaches a desired location in the menu.
- The arrangement and organization of a given menu and the number of selectable options provided by the menu are a matter of design choice. All types of such menus are considered to be within the scope of the present invention.
- Each menu provided by menu system 20 is divided into a plurality of "nodes" which define each and every user-accessible location in the menu.
- Each node, or location, has a unique address for identification.
- Each decision point and branch of a menu's decision tree may be identified by its address.
- Each such address may be represented by one or more alphabetic characters and/or numbers. As will be described later herein, these addresses are provided to enable users to quickly navigate through a menu.
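- The following is a minimal, hypothetical sketch (not taken from the patent) of how such a menu decision tree of uniquely addressed nodes might be represented in software; the class names, fields, and helper function are illustrative assumptions only.
```python
# Illustrative sketch: one possible in-memory representation of a menu
# decision tree in which every node (menu location) carries a unique address.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class MenuNode:
    address: str                    # unique node address, e.g. "349" or "ACT"
    prompt: str                     # text spoken or displayed at this location
    options: Dict[str, "MenuNode"] = field(default_factory=dict)  # key input -> child node

    def add_option(self, key: str, child: "MenuNode") -> "MenuNode":
        """Attach a child node reachable by the given key input."""
        self.options[key] = child
        return child

def build_address_table(root: MenuNode) -> Dict[str, MenuNode]:
    """Walk the tree and build the address -> node lookup table that a menu
    system could hold in memory (cf. memory 24 in the description)."""
    table: Dict[str, MenuNode] = {}
    stack = [root]
    while stack:
        node = stack.pop()
        table[node.address] = node
        stack.extend(node.options.values())
    return table
```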
- A user may provide a predetermined input (e.g., key input, voice input, etc.) to menu system 20 via user I/O device 10 whenever he or she has reached a desired location (i.e., node) in a given menu.
- Menu system 20 provides the address corresponding to the desired location to the user in response to the predetermined input.
- Once the user has the address for the desired location, he or she can automatically advance through the given menu to the desired location (i.e., node) whenever the given menu is accessed, by directly inputting the address. In this manner, users can advantageously save a great amount of time navigating through a given menu after the given menu is accessed for the first time.
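- Purely as an illustration of this behavior, the sketch below (building on the hypothetical MenuNode class above) shows a session object that either follows a selected option or, on a predetermined input assumed here to be the "#" key, announces the address of the current menu location; these names and key choices are assumptions, not details from the patent.
```python
# Illustrative sketch: follow menu selections, and report the address of the
# current location when the (assumed) predetermined "#" input is received.
ADDRESS_REQUEST_KEY = "#"   # assumed predetermined input; the patent leaves this open

class MenuSession:
    def __init__(self, root: "MenuNode"):
        self.current = root     # start at the top of the menu

    def handle_input(self, key: str) -> str:
        if key == ADDRESS_REQUEST_KEY:
            # The user indicates the desired location has been reached;
            # answer with the address so it can be noted for later visits.
            return f"The address of this menu location is {self.current.address}."
        child = self.current.options.get(key)
        if child is None:
            return "Invalid selection. " + self.current.prompt
        self.current = child    # advance to the next decision point
        return child.prompt
```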
- Menu system 20 of FIG. 2 comprises processing means such as processor 22, and memory means such as memory 24.
- The foregoing elements of FIG. 2 may be embodied using integrated circuits (ICs), and such elements may for example be included on one or more ICs.
- Processor 22 is operative to perform various processing and control functions of menu system 20.
- Processor 22 includes logic elements and is operative to execute software code to enable various menu functions, such as, but not limited to, enabling users to access and navigate through menus, enabling menu outputs (e.g., voice, text, etc.) to users, providing address information corresponding to menu locations (i.e., nodes) to users responsive to user inputs, automatically advancing users to desired menu locations responsive to user inputs, and other functions.
- Memory 24 is operative to perform data storage functions of menu system 20.
- Memory 24 stores data including, but not limited to, software code, menu data, address data for menu locations (i.e., nodes), and/or other data which enable the aforementioned and/or other functions of processor 22.
- Referring to FIG. 3, a flowchart 300 illustrating steps according to an exemplary embodiment of the present invention is shown.
- The steps of FIG. 3 will be described with reference to the elements of environment 100 shown in FIGS. 1 and 2.
- The steps of FIG. 3 are merely exemplary, and are not intended to limit the present invention in any manner.
- At step 310, a user accesses a menu provided by menu system 20 for the first time.
- Where user I/O device 10 is embodied as a telephone device and menu system 20 is embodied as a voice menu system, the user may access the menu at step 310 by dialing a telephone number associated with menu system 20.
- Where menu system 20 is included in a consumer electronics device, the user may access the menu at step 310 by entering one or more predetermined key inputs to the consumer electronics device via user I/O device 10. For example, the user may press a "menu" key and/or other key of user I/O device 10 to access the menu at step 310.
- In this case, the menu may be provided as a sequence of on-screen menu displays provided via a display device associated with the consumer electronics device.
- At step 320, the user navigates to a desired location in the menu.
- The user may navigate to the desired location in the menu at step 320 by entering key and/or voice inputs via user I/O device 10 responsive to automated voice messages provided by menu system 20 under the control of processor 22.
- Automated voice messages may present the user with various selectable options at various decision points of the menu. For example, assume that the user has accessed a menu provided by menu system 20 desiring information about a hardware problem with a particular model of a home laptop computer that he or she owns. At a first decision point in the menu, the user may for example enter a specific key and/or voice input to select a "Tech Support" option.
- At a second decision point, the user may for example enter a specific key and/or voice input to select a "Home Users" option.
- At a third decision point, the user may for example enter a specific key and/or voice input to select a "Laptops" option.
- At a fourth decision point, the user may for example enter a specific key and/or voice input to select a "Hardware Problems" option.
- At a fifth decision point, the user may for example enter a specific key and/or voice input to select a "Model XYZ" option for his or her specific laptop model (i.e., the desired location).
- In this manner, the user may navigate through a given menu under the control of processor 22 until a desired location in the given menu is reached at step 320 (the example path above is sketched in code below).
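- A hypothetical sketch of the example path above, reusing the MenuNode and MenuSession classes from the earlier sketches: the key assignments, prompts, and the address "349" for the "Model XYZ" location are all invented for illustration.
```python
# Hypothetical example: five decision points leading to the "Model XYZ"
# location. Key "1" and address "349" are arbitrary illustrative choices.
root = MenuNode("0", "Main menu: press 1 for Tech Support ...")
tech = root.add_option("1", MenuNode("1", "Tech Support: press 1 for Home Users ..."))
home = tech.add_option("1", MenuNode("11", "Home Users: press 1 for Laptops ..."))
laptops = home.add_option("1", MenuNode("111", "Laptops: press 1 for Hardware Problems ..."))
hardware = laptops.add_option("1", MenuNode("1111", "Hardware Problems: press 1 for Model XYZ ..."))
hardware.add_option("1", MenuNode("349", "Model XYZ hardware support ..."))

session = MenuSession(root)
for key in ["1", "1", "1", "1", "1"]:   # navigate past the five decision points (step 320)
    session.handle_input(key)
print(session.handle_input("#"))        # -> "The address of this menu location is 349."
```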
- Where user I/O device 10 is embodied as a hand-held remote control device, wired and/or wireless keyboard, or other device and menu system 20 is included in a consumer electronics device, the user may navigate to the desired location in the menu at step 320 by entering one or more key inputs to the consumer electronics device via user I/O device 10 responsive to the on-screen menu displays provided via the display device associated with the consumer electronics device.
- At step 330, the user provides a predetermined input to menu system 20 via user I/O device 10 to indicate that the desired location in the menu has been reached.
- The user may provide the predetermined input to menu system 20 at step 330 by entering a predetermined key (e.g., pound key, star key, menu key, etc.), and/or a predetermined voice input (e.g., the word "yes," etc.).
- Where menu system 20 is included in a consumer electronics device, the user may provide the predetermined input to menu system 20 at step 330 by entering a predetermined key input (e.g., menu key, etc.).
- At step 340, menu system 20 provides an address corresponding to the desired location in the menu to the user.
- Processor 22 retrieves the address corresponding to the desired menu location from memory 24 and provides the address aurally and/or visually to the user via user I/O device 10 responsive to the predetermined input of step 330.
- Where menu system 20 is included in a consumer electronics device, processor 22 retrieves the address corresponding to the desired menu location from memory 24 and provides the address to the user aurally and/or visually via the consumer electronics device and/or an associated display responsive to the predetermined input of step 330.
- The address is represented by one or more alphabetic characters (e.g., ACT) and/or numbers (e.g., 349).
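- How such short addresses are generated is left open by the description; purely as one assumed possibility, the sketch below mints compact base-36 strings (digits and letters, e.g. "349" or "ACT") for nodes as a menu is built.
```python
# Illustrative sketch: minting short alphanumeric node addresses. The base-36
# scheme and the three-character width are assumptions, not patent details.
import string

_ALPHABET = string.digits + string.ascii_uppercase   # "0"-"9" then "A"-"Z"
_counter = 0

def next_address(width: int = 3) -> str:
    """Return the next compact base-36 address, zero-padded to `width`."""
    global _counter
    n = _counter
    _counter += 1
    digits = []
    while True:
        n, remainder = divmod(n, 36)
        digits.append(_ALPHABET[remainder])
        if n == 0:
            break
    return "".join(reversed(digits)).rjust(width, "0")

# Example: the first few calls yield "000", "001", "002", ...
```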
- At step 350, the user accesses the menu of menu system 20 again.
- Where user I/O device 10 is embodied as a telephone device and menu system 20 is embodied as a voice menu system, the user may access the menu at step 350 via user I/O device 10 by dialing a telephone number associated with menu system 20.
- Where user I/O device 10 is embodied as a hand-held remote control device, wired and/or wireless keyboard, or other device and menu system 20 is included in a consumer electronics device, the user may access the menu at step 350 by entering one or more predetermined key inputs via user I/O device 10.
- Step 350 is substantially identical to step 310, except that the user has the address corresponding to the desired menu location when accessing menu system 20 at step 350, and can thereby quickly navigate through the menu.
- At step 360, the user inputs the address provided at step 340 to menu system 20.
- Where menu system 20 is embodied as a voice menu system, the user may input the address to menu system 20 at step 360 by entering key and/or voice inputs via user I/O device 10.
- Where menu system 20 is included in a consumer electronics device, the user may input the address to menu system 20 at step 360 by entering key inputs via user I/O device 10.
- The user may be prompted by menu system 20 to enter the address at step 360.
- Where menu system 20 is embodied as a voice menu system, the user may be prompted to enter the address at the beginning of the call which accesses menu system 20 at step 350.
- Where menu system 20 is included in a consumer electronics device, the user may be prompted to enter the address immediately after menu system 20 is accessed at step 350.
- The user may first enter one or more predetermined inputs (e.g., voice, key, etc.) before entering the actual address to thereby inform menu system 20 that the address is about to be entered.
- The user may enter a predetermined sequence of one or more key inputs at any time after accessing menu system 20 at step 350 to thereby inform menu system 20 that the next input represents an address.
- Processor 22 of menu system 20 receives and detects the address input at step 360.
- Menu system 20 then automatically advances the user to the desired location in the menu.
- In particular, processor 22 processes the address input at step 360 and automatically advances the user to the desired location in the menu responsive to that address input.
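- Continuing the hypothetical sketches above, the automatic advance might be handled as below: a prefix key (assumed here to be "*") announces that an address follows, the address is resolved through the stored address table, and the session is moved directly to that node. The error prompt and key choices are assumptions, not details from the patent.
```python
# Illustrative sketch: resolve an entered address (step 360) and advance the
# session directly to the corresponding menu location.
ADDRESS_PREFIX_KEY = "*"    # assumed key announcing that an address follows

def jump_to_address(session: "MenuSession", table: dict, address: str) -> str:
    node = table.get(address.strip().upper())
    if node is None:
        return "That address was not recognized. " + session.current.prompt
    session.current = node  # advance directly to the desired location
    return node.prompt

# Usage, continuing the earlier example: on a later visit the user keys in
# "*" followed by "349" instead of re-navigating past five decision points.
table = build_address_table(root)
new_session = MenuSession(root)
print(jump_to_address(new_session, table, "349"))   # -> "Model XYZ hardware support ..."
```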
- The menu in question may be a voice menu or an on-screen menu.
- In the example described above, the user would be automatically advanced to the point in the menu following the user's selection of the "Model XYZ" option for his or her specific laptop model. Accordingly, the user would avoid having to re-navigate past five decision points, thereby saving a great amount of time and potential frustration.
- The present invention provides a method for operating a menu system such as a voice menu system that enables, among other things, users to quickly navigate to a desired location in a menu provided by the menu system.
- The present invention may be applicable to various systems, devices and/or apparatuses having a menu system, either with or without a display device.
- The phrases "consumer electronics device" or "television signal receiver" as used herein may refer to systems or apparatuses including, but not limited to, television sets, computers or monitors that include a display device, and systems or apparatuses such as set-top boxes, video cassette recorders (VCRs), digital versatile disk (DVD) players, video game boxes, personal video recorders (PVRs), computers or other apparatuses that may not include a display device. While this invention has been described as having a preferred design, the present invention can be further modified within the spirit and scope of this disclosure. This application is therefore intended to cover any variations, uses, or adaptations of the invention using its general principles. Further, this application is intended to cover such departures from the present disclosure as come within known or customary practice in the art to which this invention pertains and which fall within the limits of the appended claims.
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Computational Linguistics (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2004/018600 WO2006001794A1 (fr) | 2004-06-10 | 2004-06-10 | Procede de fonctionnement d'un systeme de menu |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2004/018600 WO2006001794A1 (fr) | 2004-06-10 | 2004-06-10 | Procede de fonctionnement d'un systeme de menu |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2006001794A1 (fr) | 2006-01-05 |
Family
ID=34958083
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2004/018600 (WO2006001794A1, Ceased) | Procede de fonctionnement d'un systeme de menu | 2004-06-10 | 2004-06-10 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2006001794A1 (fr) |
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP0827320A2 (fr) * | 1996-08-22 | 1998-03-04 | AT&T Corp. | Méthode et système d'optimisation de menu vocal |
| EP1020789A1 (fr) * | 1999-01-18 | 2000-07-19 | THOMSON multimedia | Appareil comportant une interface vocale ou manuelle et procédé d'aide a l'apprentissage des commandes vocales d'un tel appareil |
| EP1107544A2 (fr) * | 1999-12-07 | 2001-06-13 | SAMSUNG ELECTRONICS Co. Ltd. | Méthode de définition de touches programmables pour la sélection des fonctions désirées par l'utilisateur d'un terminal de communication |
Non-Patent Citations (2)
| Title |
|---|
| "MEMORY LIFE MENU SYSTEM FOR ACOUSTIC APPLICATIONS", IBM TECHNICAL DISCLOSURE BULLETIN, IBM CORP. NEW YORK, US, vol. 37, no. 10, 1 October 1994 (1994-10-01), pages 537, XP000475768, ISSN: 0018-8689 * |
| "USER INTERFACE SHORTCUT", IBM TECHNICAL DISCLOSURE BULLETIN, IBM CORP. NEW YORK, US, vol. 33, no. 3A, 1 August 1990 (1990-08-01), pages 413 - 414, XP000120530, ISSN: 0018-8689 * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US6317143B1 (en) | Programmable graphical user interface control system and method | |
| US6615248B1 (en) | Method and system for presenting content selection options | |
| US5638438A (en) | System and method for automatically creating new pages in a touch screen based telephony product having a hierarchical repertory | |
| JP4532585B2 (ja) | インプット・デバイスのためのタイプアヘッド・キーパッド・インプット | |
| US6075843A (en) | User device-independent transaction model | |
| US20190019156A1 (en) | Presenting favorite contacts information to a user of a computing device | |
| US20070216659A1 (en) | Mobile communication terminal and method therefore | |
| US9588865B2 (en) | System and method for displaying usage history of applications executed between devices | |
| US20080313574A1 (en) | System and method for search with reduced physical interaction requirements | |
| CN102426511A (zh) | 系统级搜索的用户界面 | |
| EP2126732A2 (fr) | Conservation l'expérience de contenu d'un utilisateur sur de multiples dispositifs de calcul au moyen d'informations d'emplacement | |
| US10013263B2 (en) | Systems and methods method for providing an interactive help file for host software user interfaces | |
| EP2452251B1 (fr) | Entrée conviviale d'éléments de texte | |
| CN109189954B (zh) | 内容推荐方法和装置 | |
| US9733897B2 (en) | Method and apparatus of searching content | |
| US8423481B2 (en) | Self-learning method for keyword based human machine interaction and portable navigation device | |
| JP2011524586A (ja) | モバイル・テレビジョン及びマルチメディア・プレーヤ・キーの提示 | |
| US6727921B1 (en) | Mixed mode input for a graphical user interface (GUI) of a data processing system | |
| CN101978364B (zh) | 在多个输入装置之间实现一致性操作的操作系统 | |
| WO2006001794A1 (fr) | Procede de fonctionnement d'un systeme de menu | |
| EP2127296B1 (fr) | Procédé et dispositif permettant de personnaliser des sources de données souscrites | |
| US20060107233A1 (en) | Method and system for navigating through a plurality of features | |
| US7636082B2 (en) | Dialing methods and related devices | |
| CN105302836A (zh) | 网页标签的关闭方法和系统 | |
| JPH0535459B2 (fr) |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
| AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| NENP | Non-entry into the national phase |
Ref country code: DE |
| WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
| 122 | Ep: pct application non-entry in european phase |