
WO2018198285A1 - Display control device, navigation device, and display control method - Google Patents


Info

Publication number
WO2018198285A1
WO2018198285A1 (PCT/JP2017/016805)
Authority
WO
WIPO (PCT)
Prior art keywords
icon
abstraction level
unit
displayed
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/016805
Other languages
English (en)
Japanese (ja)
Inventor
翔伍 岡本
三浦 紳
卓爾 森本
季美果 池上
博彬 柴田
晃平 鎌田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to PCT/JP2017/016805 priority Critical patent/WO2018198285A1/fr
Priority to JP2019514997A priority patent/JP6671544B2/ja
Publication of WO2018198285A1 publication Critical patent/WO2018198285A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Definitions

  • This invention relates to a display control device for controlling the display of icons.
  • Patent Document 1 describes an in-vehicle information output device that switches the displayed electronic program guide according to whether or not the vehicle is running. When the vehicle is not running, the in-vehicle information output device displays character information related to program categories and program content; when the vehicle is running, it displays graphic information indicating program categories and program content.
  • The present invention has been made to solve the above-described problem, and an object thereof is to obtain a display control device that allows the user to easily understand the meaning of a displayed icon even when the icon is displayed in display forms with different abstraction levels.
  • The display control device according to the present invention includes: an information acquisition unit that acquires icon information indicating an icon when the icon, which is associated with a function, is operated or displayed; an information management unit that manages the display form of the icon corresponding to each abstraction level, the number of times the icon has been used, and the abstraction level at which the icon is displayed; a usage count changing unit that changes the managed number of uses when the icon is operated; an abstraction level changing unit that, when the icon is displayed, changes the abstraction level at which the icon is displayed according to the number of uses managed by the information management unit; and an output unit that, when the icon is displayed, outputs display information indicating the icon in the display form corresponding to the abstraction level managed by the information management unit.
  • According to the present invention, since the abstraction level at which an icon is displayed is changed according to the number of times the icon is used, the meaning of the displayed icon is easy for the user to understand even when the icon is displayed in display forms with different abstraction levels.
  • FIG. 1 is a diagram showing the configuration of the display control apparatus according to Embodiment 1 of the present invention. FIG. 2 is a diagram showing the configuration when the display control apparatus according to Embodiment 1 is applied to a navigation apparatus. FIG. 3 is an example of the icons of Embodiment 1. FIG. 4 is an example of the information regarding icon display in Embodiment 1.
  • FIGS. 5A and 5B are diagrams showing a hardware configuration example of the display control apparatus according to Embodiment 1 of the present invention. FIG. 6 is a flowchart showing the display processing of Embodiment 1, and FIG. 7 is a flowchart showing the operation processing of Embodiment 1.
  • FIG. 9A is a diagram showing a case where a function is executed by operating an icon in Embodiment 2 of the present invention.
  • A flowchart shows the operation processing of Embodiment 2 of the present invention.
  • FIG. 15A and FIG. 15B are diagrams for explaining a display form for each user according to the third embodiment of the present invention.
  • FIG. 16A and FIG. 16B are diagrams for explaining a display form for each user according to the third embodiment of the present invention.
  • In Embodiment 1, a mode will be described in which the display form of an icon indicating a function to be executed is switched in a stepwise manner, according to the number of times the icon is used, from a display form including both character data and graphic data to a display form mainly composed of graphic data.
  • Here, each stage of the icon display form is defined as an abstraction level; the greater the proportion of graphic data in the icon, the higher the abstraction level. Each time the number of times the icon is used reaches a set number, the abstraction level is raised to the next higher level, and the display form of the displayed icon is switched toward a display form mainly composed of graphic data.
  • FIG. 1 is a diagram showing the configuration of a display control apparatus 1A according to Embodiment 1 of the present invention.
  • The display control apparatus 1A includes an information acquisition unit 1, a usage count changing unit 2, an abstraction level changing unit 3, an information management unit 4, and an output unit 5.
  • the information acquisition unit 1 acquires icon information.
  • the icon information includes the content of the function executed when the icon is operated, that is, the content of the function associated with the icon and the icon ID corresponding to the content of the function.
  • the usage count changing unit 2 changes the usage count of the icon managed by the information management unit 4.
  • The abstraction level changing unit 3 changes the abstraction level at which the icon is displayed, according to the number of uses of the icon managed by the information management unit 4. The abstraction level at which the icon is displayed is also managed by the information management unit 4.
  • The information management unit 4 manages information using a storage unit (not shown). As will be described later with reference to FIG. 3, the information management unit 4 manages the association between the icon ID, the character data indicating the function content, and the graphic data indicating the function content. For each icon, the information management unit 4 defines a plurality of display forms, the abstraction level corresponding to each display form, and the number of uses of the icon needed to raise the abstraction level to the next higher level. The information management unit 4 thus manages, for each icon, the display form according to the abstraction level, the number of uses, and the abstraction level at which the icon is currently displayed.
  • the output unit 5 selects a display form according to the abstraction level when the icon is displayed, and outputs display information indicating the icon of the selected display form. As described above, the level of abstraction when icons are displayed is managed by the information management unit 4.
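The interaction among the five units can be illustrated with a minimal Python sketch for a single icon. This is our illustration only; the class, method, and attribute names are not from the patent, and the patent does not specify an implementation language:

```python
class DisplayControlDevice:
    """Illustrative skeleton of the five units of FIG. 1 for one icon
    (names are ours, not the patent's)."""

    def __init__(self, forms: dict, thresholds: dict):
        # Information management unit: display form and level-increase
        # threshold per abstraction level (None marks the highest level).
        self.forms = forms
        self.thresholds = thresholds
        self.level = 1                           # current abstraction level
        self.uses = {lv: 0 for lv in forms}      # uses per display form

    def acquire(self, icon_info: dict) -> None:
        """Information acquisition unit: receive icon information."""
        self.icon_info = icon_info

    def operated(self) -> None:
        """Usage count changing unit: count one use at the current level."""
        self.uses[self.level] += 1

    def displayed(self) -> str:
        """Abstraction level changing unit plus output unit: raise the level
        when its usage threshold is reached, then emit the display form."""
        t = self.thresholds[self.level]
        if t is not None and self.uses[self.level] >= t:
            self.level += 1
        return self.forms[self.level]
```

With illustrative thresholds of 2 uses per level, two operations of the icon are enough for the next display to appear one abstraction level higher.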
  • the display control device 1A can be applied to a device such as a navigation device, a smartphone, a personal computer, or a television, for example.
  • A device such as a navigation device, smartphone, personal computer, or television to which the display control device 1A is applied can display icons on a screen and executes the function associated with an icon when the icon is operated.
  • FIG. 2 is a diagram showing a configuration when the display control device 1A is applied to, for example, the navigation device 1B.
  • the navigation device 1B includes a display control device 1A, a navigation unit 10, an input unit 11, and a display unit 12.
  • The input unit 11 and the display unit 12 may be devices separate from the navigation device 1B, connected to it by a wired or wireless connection.
  • the navigation unit 10 controls processing such as searching for a route from the current location to the input destination, input processing of the navigation device 1B, display processing of the navigation device 1B, and the like.
  • The navigation unit 10 manages the content of the function associated with each icon and the icon ID corresponding to that content.
  • When an icon is operated or displayed, the navigation unit 10 outputs the content of the function associated with the icon and the icon ID corresponding to that content to the display control device 1A as icon information.
  • Even when the application destination of the display control device 1A is a device such as a smartphone, a personal computer, or a television, these devices have a configuration corresponding to the navigation unit 10, that is, a configuration that manages the content of the function associated with each icon and the corresponding icon ID, and that outputs them to the display control device 1A as icon information when the icon is operated or displayed.
  • the input unit 11 is an input device such as a touch panel or a remote controller.
  • A currently displayed icon is selected and operated by, for example, a touch operation on a touch panel or a button operation on a remote controller.
  • the input unit 11 outputs information indicating the operation content to the navigation unit 10.
  • When the input unit 11 is a touch panel, information on the touched coordinate position is output to the navigation unit 10.
  • The navigation unit 10 uses the coordinate position information to identify which icon was operated, that is, the content of the function associated with the icon and the icon ID corresponding to that content.
  • Even when the application destination of the display control device 1A is a device such as a smartphone, a personal computer, or a television, these devices have a configuration corresponding to the input unit 11 and, when an icon is operated, output information indicating the operation content to the configuration corresponding to the navigation unit 10.
  • the display unit 12 is a display device such as a liquid crystal display or a head-up display.
  • the navigation unit 10 acquires the display information output from the display control device 1A.
  • the navigation unit 10 performs display processing such as icon drawing using the display information, creates an image signal indicating a display screen on the display unit 12, and outputs the image signal to the display unit 12.
  • the display unit 12 displays a display screen indicated by the image signal output from the navigation unit 10. Even if the application destination of the display control device 1A is a device such as a smartphone, a personal computer, or a television, these devices have the same configuration as the display unit 12 and display a display screen configured by icons and the like.
  • In Embodiment 1, and in Embodiments 2 and 3 to be described later, a case where the display control device 1A is applied to a navigation device will be described as an example.
  • FIG. 3 is a diagram showing an example of correspondence between icon IDs, graphic data indicating function contents, and character data indicating function contents for icons used in the navigation device 1B.
  • the icon ID, the graphic data indicating the content of the function, and the character data indicating the content of the function are managed by the information management unit 4 in a table state as shown in FIG.
  • The first column from the left is the icon ID; the second column is the graphic data indicating the content of the function associated with the icon; and the third column is the character data indicating the content of the function associated with the icon.
  • the character data in the third column from the left and the icon ID in the first column from the left are also used in the icon information output by the navigation unit 10.
  • The slash “/” shown in FIG. 3 does not appear in the function content included in the icon information.
  • The graphic data in the second column from the left corresponds to the character data in the third column from the left. The character data is divided by the slash “/”, and the graphic data in the second column is composed of a plurality of pieces of graphic data corresponding to the divided character data.
  • For example, the function associated with the icon whose icon ID is 01 is the route-search content “return home with priority on general roads”; this function is expressed, as shown in the second column from the left, using graphic data indicating “general road priority” and “home” respectively.
  • The function associated with the icon whose icon ID is 02 is the facility-search content “search for nearby convenience stores”; this function is expressed, as shown in the second column from the left, using graphic data indicating “peripheral”, “convenience store”, and “search”.
  • The information shown in FIG. 3 is, for example, already set at the time of product shipment of the navigation device 1B and managed by the information management unit 4. In that case, the positions at which the character data is delimited by the slash “/” are predetermined.
  • Alternatively, the information management unit 4 may sequentially build the table shown in FIG. 3 by dividing the character data under specific conditions and converting it into graphic data.
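The FIG. 3 correspondence table can be modeled as a small lookup structure. The following Python sketch is illustrative only: the icon IDs and label strings follow the text, while the encoding itself (`ICON_TABLE`, `character_segments`) is our assumption, and the labels stand in for the actual image resources the device would hold:

```python
# Hypothetical encoding of the FIG. 3 table:
# icon ID -> (slash-delimited character data, graphic-data labels).
ICON_TABLE = {
    "01": ("general road priority/home",
           ["general road priority", "home"]),
    "02": ("peripheral/convenience store/search",
           ["peripheral", "convenience store", "search"]),
}

def character_segments(icon_id: str) -> list:
    """Split the slash-delimited character data into the segments that map
    one-to-one onto pieces of graphic data, as described for FIG. 3."""
    delimited, _graphic_labels = ICON_TABLE[icon_id]
    return delimited.split("/")
```

Each character-data segment produced by the split corresponds to exactly one graphic-data entry, which is the invariant the table relies on.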
  • FIG. 4 is a description example of information related to icon display. This information is managed by the information management unit 4 in a table state as shown in FIG.
  • the information management unit 4 manages each piece of information shown in FIG. 4 for each icon by associating it with an icon ID.
  • FIG. 4 gives as an example an icon associated with the function of “searching for cafes around the route”. For example, when the icon is selected and operated using the input unit 11 of the navigation device 1B, the function of “search for cafes around the route” is executed, and the search results for cafes around the route from the current location to the destination are displayed on the display unit 12 in a list format.
  • The “display target” in the first column from the left indicates the abstraction level at which the icon is displayed.
  • The abstraction level marked with a circle is the abstraction level at which the “search for cafes around the route” icon is displayed.
  • The display target moves one row below the row in which the value of “current number of uses” in the fifth column from the left of FIG. 4 equals the value of “number of uses necessary for level increase” in the fourth column.
  • Initially, the display target is set to abstraction level 1. When the display-target setting changes, the circle in FIG. 4 moves to the row of the corresponding abstraction level. In the example of FIG. 4, since the display target is set to abstraction level 2, the circle is written in the abstraction level 2 row.
  • the “abstraction level” in the second column from the left defines the abstraction level corresponding to each display form.
  • the abstraction level is defined to increase by one as character data is partially deleted in the display form.
  • the initial level of abstraction is level 1.
  • FIG. 4 shows a case where the “search for cafes around the route” icon takes a four-stage display form.
  • the display form of abstraction level 1 is a form composed of character data indicating the content of the function associated with the icon and graphic data representing the character data.
  • graphic data corresponding to “route neighborhood”, “cafe”, and “search” are used for the content of the function “search for cafes around the route”.
  • In the abstraction level 2 display form, the character data “cafe” is deleted from the abstraction level 1 display form, leaving the character data “search around the route”. The graphic data is the same as in the abstraction level 1 display form.
  • In the abstraction level 3 display form, the character data “search” is further deleted from the abstraction level 2 display form, leaving the character data “around the route”. The graphic data is the same as in the abstraction level 1 display form.
  • The abstraction level 4 display form is a form in which all character data is deleted, leaving only graphic data. The graphic data is the same as in the abstraction level 1 display form.
  • The “number of uses necessary for level increase” in the fourth column from the left defines the number of times the icon must be used for the abstraction level at which it is displayed to rise. In FIG. 4, for example, 20 uses are required to rise from abstraction level 1 to abstraction level 2, 20 more to go from abstraction level 2 to abstraction level 3, and 15 more to go from abstraction level 3 to abstraction level 4. Since there is no level higher than the highest abstraction level, no value is defined in FIG. 4 for abstraction level 4, which corresponds to the highest abstraction level.
  • The “abstraction level” in the second column from the left, the “display form” in the third column, and the “number of uses necessary for level increase” in the fourth column are, for example, already set and managed by the information management unit 4 at the time of product shipment of the navigation device 1B. However, the information management unit 4 may use the information shown in FIG. 3 to sequentially define the display form corresponding to each abstraction level and the number of uses necessary for a level increase.
  • The information management unit 4 may record the information shown in FIG. 4 on an external server. In this way, for example, when a new icon is added by updating the navigation device 1B, the display forms and the numbers of uses required for level increases can be updated together on the server side, which is efficient. Similarly, the information shown in FIG. 3 may be recorded on an external server.
  • the “current number of uses” in the fifth column from the left indicates the number of uses by the user up to that point in time for each display form.
  • FIG. 4 shows a case where an abstraction level 1 display form icon is selected 20 times and an abstraction level 2 display form icon is selected 7 times. For this reason, the display target of the “search for cafes around the route” icon is set at the abstraction level 2.
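The per-icon information of FIG. 4 can be represented as one record per abstraction level. This Python sketch uses the values quoted in the text for the “search for cafes around the route” icon; the field and variable names are hypothetical, and `uses_for_increase=0` at level 4 stands in for “no value defined”:

```python
from dataclasses import dataclass, field

@dataclass
class LevelEntry:
    display_form: str        # summary of the level's character/graphic mix
    uses_for_increase: int   # "number of uses necessary for level increase"
    current_uses: int = 0    # "current number of uses" in this display form

@dataclass
class IconRecord:
    icon_id: str
    levels: dict = field(default_factory=dict)  # abstraction level -> LevelEntry
    display_target: int = 1                     # level marked with the circle

# The "search for cafes around the route" icon as described for FIG. 4.
icon_07 = IconRecord(
    icon_id="07",
    levels={
        1: LevelEntry("search for cafes around the route + graphics", 20, 20),
        2: LevelEntry("search around the route + graphics", 20, 7),
        3: LevelEntry("around the route + graphics", 15, 0),
        4: LevelEntry("graphics only", 0, 0),
    },
    display_target=2,
)
```

Because level 1 has reached its 20 required uses while level 2 has only 7, the display target sits one row below level 1, at abstraction level 2, exactly as the circle in FIG. 4 indicates.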
  • the processing circuit may be dedicated hardware or a CPU (Central Processing Unit) that executes a program stored in a memory.
  • the CPU is also called a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, a processor, or a DSP (Digital Signal Processor).
  • the communication device 102 implements the functions of acquiring information from outside the display control device 1A, which is a function of the information acquisition unit 1, and outputting information to the outside of the display control device 1A, which is a function of the output unit 5.
  • the communication device 102 is, for example, a CAN (Controller Area Network).
  • FIG. 5A is a diagram illustrating a hardware configuration example in a case where the functions of the usage count changing unit 2, the abstraction level changing unit 3, and the information management unit 4 are realized by the processing circuit 101, which is dedicated hardware.
  • the processing circuit 101 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
  • The functions of the usage count changing unit 2, the abstraction level changing unit 3, and the information management unit 4 may be realized by separate processing circuits 101, or the functions of the respective units may be realized collectively by one processing circuit 101.
  • FIG. 5B is a diagram illustrating a hardware configuration example in a case where the functions of the usage count changing unit 2, the abstraction level changing unit 3, and the information management unit 4 are realized by the CPU 105 executing a program stored in the memory 104.
  • The functions of the usage count changing unit 2, the abstraction level changing unit 3, and the information management unit 4 are realized by software, firmware, or a combination of software and firmware.
  • Software and firmware are described as programs and stored in the memory 104.
  • The CPU 105 reads out the program stored in the memory 104, expands it in a RAM (Random Access Memory) 106, and executes it, thereby realizing the functions of the respective units.
  • The display control apparatus 1A has a memory 104 for storing programs that, when executed, carry out each step shown in the flowcharts of FIGS. 6, 7, 10, 13, and 14 described later. It can also be said that these programs cause a computer to execute the procedures or methods of the usage count changing unit 2, the abstraction level changing unit 3, and the information management unit 4.
  • The memory 104 is, for example, a nonvolatile or volatile semiconductor memory such as a RAM, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable Programmable ROM), or a disk-shaped recording medium such as a magnetic disk, a flexible disk, an optical disk, a compact disk, a mini disk, or a DVD (Digital Versatile Disc).
  • Part of the functions of the usage count changing unit 2, the abstraction level changing unit 3, and the information management unit 4 may be realized by dedicated hardware and part by software or firmware.
  • For example, the function of the information management unit 4 can be realized by a processing circuit as dedicated hardware, while the functions of the usage count changing unit 2 and the abstraction level changing unit 3 can be realized by the processing circuit reading and executing a program stored in a memory.
  • In this way, the processing circuit can realize the functions of the usage count changing unit 2, the abstraction level changing unit 3, and the information management unit 4 by hardware, software, firmware, or a combination thereof.
  • FIG. 6 is a flowchart showing processing when an icon is displayed.
  • When it is necessary to display an icon on the navigation device 1B, the navigation unit 10 outputs icon information indicating the icon to be displayed to the information acquisition unit 1 of the display control device 1A. The information acquisition unit 1 of the display control device 1A then acquires the icon information output from the navigation unit 10 (step ST1). For example, when the navigation device 1B is to display the “search for cafes around the route” icon, the information acquisition unit 1 acquires from the navigation unit 10, as icon information, the character data indicating the content of the function associated with that icon and the icon ID corresponding to that content, for example “07”.
  • In the subsequent steps, processing is performed for the icon corresponding to the icon ID acquired by the information acquisition unit 1 as icon information in step ST1.
  • the abstraction level changing unit 3 acquires the current display level, that is, the current abstraction level when the icon is displayed, from the information managed by the information management unit 4 (step ST2). Taking FIG. 4 as an example, the abstraction level changing unit 3 acquires “2” as the abstraction level.
  • The abstraction level changing unit 3 determines whether the abstraction level acquired in step ST2 is the highest level, that is, whether the abstraction level can still be raised (step ST3). For example, in FIG. 4, when the current display target is set to abstraction level 2, the process proceeds to step ST4 as described later, and when the current display target is set to abstraction level 4, the process proceeds to step ST7 as described later. When the abstraction level acquired in step ST2 is the highest level (step ST3; YES), the process proceeds to step ST7 described later.
  • When the abstraction level acquired in step ST2 is not the highest level (step ST3; NO), the abstraction level changing unit 3 acquires, from the information managed by the information management unit 4, the number of uses of the icon in the display form of the abstraction level acquired in step ST2 (step ST4). Taking FIG. 4 as an example, the abstraction level changing unit 3 acquires “7” as the number of uses.
  • the abstraction level changing unit 3 determines whether or not the number of uses acquired in step ST4 has reached the set number of times (step ST5).
  • The set number of times is the “number of uses necessary for level increase” corresponding to the abstraction level acquired in step ST2. For example, in FIG. 4, when the current display target is set to abstraction level 2 and the number of uses of the icon in the abstraction level 2 display form is 20, the count has reached “20”, the number of uses necessary for a level increase at abstraction level 2, so the process proceeds to step ST6 as described later.
  • On the other hand, when the current display target is set to abstraction level 2 and the number of uses of the abstraction level 2 display form is 7, the count has not reached “20”, the number of uses necessary for a level increase at abstraction level 2, so the process proceeds to step ST7 as described later. If the number of uses acquired in step ST4 has not reached the set number (step ST5; NO), the process proceeds to step ST7 described later.
  • When the number of uses acquired in step ST4 has reached the set number (step ST5; YES), the abstraction level changing unit 3 raises the abstraction level at which the icon is displayed to the next higher level (step ST6). This change by the abstraction level changing unit 3 is reflected in the “display target” of the information managed by the information management unit 4.
  • Taking FIG. 4 as an example, the abstraction level changing unit 3 raises the abstraction level from 2 to the next higher level, so that the abstraction level at which the icon is displayed becomes abstraction level 3.
  • In this case, the circle mark in the display-target column shown in FIG. 4 moves to the abstraction level 3 row.
  • the output unit 5 outputs display information indicating an icon in a display form corresponding to the abstraction level when the icon is displayed from the information managed by the information management unit 4 (step ST7).
  • the output unit 5 outputs display information indicating icons in the display form of abstraction level 2.
  • the navigation unit 10 performs display processing such as icon drawing.
  • The display unit 12 displays the icon in the display form of the abstraction level set as the display target.
  • Taking FIG. 4 as an example, the display unit 12 displays an icon in the abstraction level 2 display form, consisting of the character data “search around the route” and the graphic data corresponding to “route neighborhood”, “cafe”, and “search” respectively.
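The display-time flow of FIG. 6 (steps ST2 through ST7) condenses to a few lines. This is a sketch under an assumed data layout, not the patent's code: `levels` maps each abstraction level to its (threshold, current-use-count) pair, with `None` marking the highest level, where no threshold is defined:

```python
def on_icon_display(levels: dict, display_target: int) -> int:
    """FIG. 6 sketch (ST2-ST7). Returns the abstraction level whose display
    form the output unit should emit, raising it one level if the current
    level's usage threshold has been reached."""
    uses_needed, current_uses = levels[display_target]     # ST2, ST4
    # ST3: at the highest level nothing can be raised; ST5: compare counts.
    if uses_needed is not None and current_uses >= uses_needed:
        display_target += 1                                # ST6: raise one level
    return display_target                                  # ST7: output this level

# FIG. 4 example: target level 2 with 7 of the 20 required uses stays at 2.
levels = {1: (20, 20), 2: (20, 7), 3: (15, 0), 4: (None, 0)}
```

Under these values, a display target of 2 remains at 2 (7 < 20), while a target already at level 4 passes straight to output, mirroring the ST3 YES branch.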
  • FIG. 7 is a flowchart showing processing when an icon is operated.
  • When an icon displayed on the navigation device 1B is operated by the user, the navigation unit 10 outputs icon information indicating the operated icon to the information acquisition unit 1 of the display control device 1A. The information acquisition unit 1 of the display control device 1A then acquires the icon information output from the navigation unit 10 (step ST11). For example, when the “search for cafes around the route” icon is operated on the navigation device 1B, the information acquisition unit 1 acquires from the navigation unit 10, as icon information, the character data indicating the content of the function associated with that icon and the icon ID corresponding to that content, such as “07”. In subsequent steps ST12 to ST14, processing is performed for the icon corresponding to the icon ID acquired by the information acquisition unit 1 as icon information in step ST11.
  • the usage count changing unit 2 acquires the current abstraction level when the icon is displayed from the information managed by the information management unit 4 (step ST12). Taking FIG. 4 as an example, the usage count changing unit 2 acquires “2” as the abstraction level.
  • the usage count changing unit 2 acquires the usage count of the icon in the abstract level display form acquired in step ST12 (step ST13). Taking FIG. 4 as an example, the usage count changing unit 2 acquires “7” as the usage count.
  • the usage count changing unit 2 adds 1 to the usage count of the icon in the abstract level display form acquired in step ST12 (step ST14). Taking FIG. 4 as an example, the number of uses is changed from “7” to “8”. The change in the number of uses by the use number changing unit 2 is reflected in the information managed by the information management unit 4.
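The operation-time flow of FIG. 7 (steps ST12 through ST14) is, in essence, a counter increment at the current abstraction level. A sketch with an illustrative data layout (the dictionary shape is our assumption):

```python
def on_icon_operated(uses_per_level: dict, display_target: int) -> int:
    """FIG. 7 sketch: after the icon is operated (ST11), look up the current
    abstraction level (ST12), read its usage count (ST13), and add 1 (ST14).
    `uses_per_level` is mutated in place, mirroring the change being
    reflected in the information managed by the information management unit."""
    uses_per_level[display_target] += 1
    return uses_per_level[display_target]

# FIG. 4 example: the icon is shown at abstraction level 2 with 7 uses;
# operating it once changes the count from 7 to 8.
uses = {1: 20, 2: 7, 3: 0, 4: 0}
```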
  • As described above, the display control device 1A can be applied to a device such as a navigation device, a smartphone, a personal computer, or a television; in the example above, the display control device 1A is built into the navigation device 1B.
  • Alternatively, the display control device 1A may be built into an external server. In this case, the external server controls the display form of icons displayed on a device such as a navigation device, smartphone, personal computer, or television by transmitting and receiving information to and from that device.
  • When the display control device 1A is built into a mobile terminal such as a smartphone, the mobile terminal may control the display form of icons displayed on a navigation device, a personal computer, a television, or the like by transmitting and receiving information to and from that device.
  • Alternatively, a device 20 such as a navigation device, a personal computer, or a television, an external server 30, and a mobile terminal 40 may be linked in a state where information can be transmitted and received, and the information acquisition unit 1, the usage count changing unit 2, the abstraction level changing unit 3, the information management unit 4, and the output unit 5 may be distributed among the device 20, the external server 30, and the mobile terminal 40. In this case, the display control device 1A is configured by the device 20, the external server 30, and the mobile terminal 40.
  • In the above description, the information management unit 4 manages the current number of uses for each abstraction level. Alternatively, the information management unit 4 may manage the total number of times the icon has been operated instead of a number for each abstraction level. In that case, the number of uses required for a level increase is set so that it can be compared with the total number of times the icon has been operated. For example, the number of uses required for the abstraction level to increase from level 2 to level 3 is set to 40, the number required to increase from level 3 to level 4 is set to 55, and each is compared with the total number of uses of the “search for cafes around the route” icon.
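A minimal sketch of this alternative follows. The thresholds 40 and 55 come from the example above; the level 1 to 2 threshold of 20 is an invented placeholder, and the function name is hypothetical:

```python
# Hypothetical sketch: manage one total operation count per icon and
# derive the abstraction level from cumulative thresholds.
# Thresholds 40 (level 2 -> 3) and 55 (level 3 -> 4) follow the example
# in the text; the level 1 -> 2 threshold of 20 is an assumed placeholder.

LEVEL_UP_TOTALS = [20, 40, 55]  # total uses needed to reach levels 2, 3, 4

def abstraction_level(total_uses):
    """Return the abstraction level implied by the icon's total use count."""
    level = 1
    for threshold in LEVEL_UP_TOTALS:
        if total_uses >= threshold:
            level += 1
    return level

print(abstraction_level(39))  # 2: still short of the 40-use threshold
print(abstraction_level(40))  # 3: reaches level 3
print(abstraction_level(55))  # 4: reaches level 4
```

Compared with the per-level counts of FIG. 4, this variant stores a single integer per icon, at the cost of not being able to reset the count for one level independently.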
  • In the above description, display forms that include both character data and graphic data and a display form consisting mainly of graphic data are switched in stages as the display form of the icon. Therefore, the user can operate the icon while learning the meaning of its graphic data by comparing it with the changing character data, and even when the icon is displayed in a display form with a different abstraction level, the meaning of the displayed icon is easy to understand. Further, according to the display control device 1A, since the icon is finally displayed with graphic data only, the user can operate the icon intuitively without the trouble of reading character data. In addition, since the icon is finally displayed with graphic data only, the display area occupied by the icon can be reduced, and the display area other than the icon can be widened.
  • Alternatively, a display form corresponding to each abstraction level may be defined by a stepwise change of character data alone, without using graphic data. For example, the display form of abstraction level 1 is defined as only the character data “return to home with priority on general road”, the display form of abstraction level 2 as only the character data “general road priority home”, and the display form of abstraction level 3 as only the character data “general home”.
  • However, when the icon display form is a combination of character data and graphic data according to the abstraction level, there is the advantage that the higher the abstraction level, the less character data is shown and the more intuitively the icon can be operated by means of the graphic data.
  • Further, the information management unit 4 may record the managed information on an external server. In that case, the information can be updated collectively on the server side, which is efficient.
  • In Embodiment 2, a mode will be described in which, when a function associated with an icon is executed without the icon being operated, the abstraction level at which the icon is displayed is lowered to the next lower level.
  • When the function associated with the icon can also be executed by another operation on the device itself and the user instructs execution of the function by a method other than operating the icon, it is presumed that the current display form of the icon is difficult for the user to understand or has not been fully learned by the user; the icon is therefore returned to the previously displayed display form with a lower abstraction level.
  • the configuration of the display control apparatus 1A according to the second embodiment is the same as the configuration shown in FIG. 1, and the same reference numerals are used for configurations having the same or corresponding functions as those already described in the first embodiment. A description thereof will be omitted or simplified.
  • The information acquisition unit 1 according to the second embodiment acquires, as icon information, a flag indicating whether or not the function was executed by an operation other than the operation of the icon. Hereinafter, this flag is referred to as the “icon flag” as appropriate.
  • The usage count changing unit 2 changes the usage count of the icons managed by the information management unit 4. In doing so, the usage count changing unit 2 also changes the usage count as appropriate according to the icon flag.
  • the abstraction level changing unit 3 according to the second embodiment changes the abstraction level when the icon is displayed according to the number of times the icon is used and the icon flag managed by the information management unit 4.
  • The icon in the second embodiment has a plurality of display forms according to the abstraction level, and is an icon whose associated function is executed either by operating the icon or by an operation other than operating the icon.
  • When executing the function designated by the user using the input unit 11, the navigation unit 10 determines whether the execution is due to the operation of the icon, and indicates the determination result in the icon flag.
  • In the following, it is assumed that the icon flag is set to TRUE when the function is executed by operating the icon, and that the icon flag is set to FALSE when the function is executed by an operation other than operating the icon.
  • For example, when the user touches an icon displayed on the touch panel, the navigation unit 10 determines that the icon has been operated and sets the icon flag to TRUE. Likewise, when the user selects and operates an icon with a button on a remote controller, the navigation unit 10 determines that the icon has been operated and sets the icon flag to TRUE. On the other hand, when the user inputs an execution instruction for the function associated with the icon being displayed on the display unit 12 using an HW (hardware) key or a menu command of the navigation device 1B itself, the navigation unit 10 sets the icon flag to FALSE.
  • Similarly, in a device such as a personal computer, the icon flag is set to TRUE when the function is executed by operating the icon, and is set to FALSE when the function is executed by another operation such as a keyboard input. In other devices as well, the icon flag is set to TRUE or FALSE in the same manner as in the navigation device 1B and the personal computer.
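The flag assignment described above can be illustrated roughly as follows; the event names are invented for illustration and do not appear in the patent:

```python
# Hypothetical sketch of how a navigation unit could derive the icon
# flag from the kind of input event that triggered the function.
# The event names are assumptions made for this example.

ICON_EVENTS = {"touch_icon", "remote_select_icon"}     # direct icon operations
OTHER_EVENTS = {"hw_key", "menu_command", "keyboard"}  # other ways to run it

def icon_flag_for(event):
    """TRUE when the function ran via the icon, FALSE otherwise."""
    if event in ICON_EVENTS:
        return True
    if event in OTHER_EVENTS:
        return False
    raise ValueError(f"unknown event: {event}")

print(icon_flag_for("touch_icon"))  # True: the icon itself was operated
print(icon_flag_for("hw_key"))      # False: the function ran another way
```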
  • FIGS. 9A and 9B are diagrams illustrating a case in which a function associated with an icon is executed by an icon operation and a case in which the function is executed by an operation other than an icon operation. FIG. 9A shows an example of operating the icon, and FIG. 9B shows an example of operating using an HW key or the like of the navigation device 1B instead of the icon.
  • In FIG. 9A, the touch panel portion is shown with the HW keys below it. On the touch panel, the current location screen is shown, and icons associated with the functions “return to home”, “switch to general road priority”, and “delete route” are displayed on the right side of the screen in order from the top. FIG. 9A shows a case where the user operates the icon to execute the “delete route” function. In this case, the icon flag is set to TRUE.
  • FIG. 9B shows a situation in which the screen transitions, in order from the left in the figure, through the current location screen, the NAVI screen, and the route editing screen. Without operating the icon, the user selects “NAVI” with the HW key on the current location screen, “Edit route” on the NAVI screen, and “Delete route” on the route editing screen, whereby the “delete route” function is executed. In this case, the icon flag is set to FALSE.
  • FIG. 10 is a flowchart showing processing when an icon is operated. Processes that are the same as or correspond to the processes described using FIG. 7 are assigned the same reference numerals, and descriptions thereof are omitted or simplified. Further, the processing when the icon is displayed by the display control apparatus 1A of the second embodiment is the same as the flowchart shown in FIG.
  • When the navigation unit 10 executes the function designated by the user using the input unit 11, the navigation unit 10 outputs, as icon information to the information acquisition unit 1 of the display control device 1A, the character data indicating the content of the function, the icon ID corresponding to the content of the function, and the icon flag indicating whether the execution of the function was due to an icon operation. The information acquisition unit 1 of the display control device 1A then acquires the icon information output from the navigation unit 10 (step ST11). In the subsequent processing, the icon corresponding to the icon ID acquired as icon information by the information acquisition unit 1 in step ST11 is processed.
  • The usage count changing unit 2 performs the processes of step ST12 and step ST13 already described. Subsequently, the usage count changing unit 2 determines whether the icon flag of the icon information acquired by the information acquisition unit 1 is FALSE (step ST21). For example, when the user operates the icon as shown in FIG. 9A, the icon flag is TRUE, so the process proceeds to step ST14 as described later; when the user instructs execution of the function associated with the icon by an operation other than the icon operation as shown in FIG. 9B, the icon flag is FALSE, so the process proceeds to step ST22 as described later.
  • When the icon flag is TRUE (step ST21; NO), the usage count changing unit 2 performs the process of step ST14 already described.
  • When the icon flag is FALSE (step ST21; YES), the usage count changing unit 2 subsequently determines whether the abstraction level acquired in step ST12 is the lowest level, that is, whether the abstraction level cannot be lowered any further (step ST22). For example, in FIG. 4, when the current display target is set to abstraction level 2, the process proceeds to step ST23 as described later. In FIG. 4, when the current display target is set to abstraction level 1, the abstraction level cannot be lowered any further, so the display control device 1A ends the process as described later.
  • When the abstraction level acquired in step ST12 is the lowest level (step ST22; YES), the display control device 1A ends the process.
  • When the abstraction level acquired in step ST12 is not the lowest level (step ST22; NO), the usage count changing unit 2 sets the usage count of the icon in the display form of the abstraction level acquired in step ST12 to “0” (step ST23). The change in the usage count by the usage count changing unit 2 is reflected in the information managed by the information management unit 4.
  • the abstraction level changing unit 3 lowers the abstraction level when the icon is displayed to the next lower level (step ST24). That is, the abstraction level when the icon is displayed is lowered to one level below the abstraction level acquired in step ST12. The change of the abstraction level when the icon is displayed by the abstraction level changing unit 3 is reflected in the “display target” of the information managed by the information management unit 4.
  • Next, the usage count changing unit 2 sets to “0” the usage count of the display form of the abstraction level that has become, by the processing of step ST24, the abstraction level at which the icon is displayed, that is, the level lowered to by the abstraction level changing unit 3 (step ST25). The change in the usage count by the usage count changing unit 2 is reflected in the information managed by the information management unit 4.
  • Note that the usage count changing unit 2 may set the usage count not to “0” but to some other value, for example, half of the usage count required for a level increase.
  • Alternatively, the information management unit 4 may manage the number of times the icon flag has become FALSE, and the abstraction level at which the icon is displayed may be lowered only when that number reaches a threshold.
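The branch structure of steps ST21 to ST25 can be sketched as follows. This is an illustrative model only; the state layout and function name are assumptions:

```python
# Illustrative sketch of steps ST21-ST25: when the icon flag is FALSE,
# zero the current level's usage count, drop the display target one
# level, and zero the count at the new level too. Names are assumed.

def on_function_executed(state, icon_flag, lowest_level=1):
    """state = {"display_target": level, "usage": {level: count}}."""
    level = state["display_target"]
    if icon_flag:                            # ST21; NO -> ordinary ST14
        state["usage"][level] = state["usage"].get(level, 0) + 1
        return state
    if level == lowest_level:                # ST22; YES -> nothing to lower
        return state
    state["usage"][level] = 0                # ST23: reset current level
    state["display_target"] = level - 1      # ST24: one level down
    state["usage"][level - 1] = 0            # ST25 (the text notes this could
                                             # instead be e.g. half the
                                             # level-up requirement)
    return state

s = {"display_target": 2, "usage": {1: 12, 2: 7}}
on_function_executed(s, icon_flag=False)
print(s)  # {'display_target': 1, 'usage': {1: 0, 2: 0}}
```

The variant that lowers the level only after the flag has been FALSE a threshold number of times would add one more counter to `state` and check it before step ST23.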
  • As described above, according to the display control device 1A of the second embodiment, when the display form of an icon is switched in stages toward display forms with a higher abstraction level, if the display form after switching is difficult for the user to understand or has not been fully learned by the user, the display form is returned to the previously displayed display form with a lower abstraction level. The switching of the display form can therefore be controlled according to the user's understanding.
  • In Embodiment 3, a mode in which the user of the display control device 1A is specified and the icon display form is switched for each user will be described.
  • the third embodiment can be combined with both the first and second embodiments.
  • the configuration of the display control apparatus 1A according to the third embodiment is the same as the configuration shown in FIG. 1, and the same reference numerals are used for configurations having the same or corresponding functions as those already described in the first embodiment. A description thereof will be omitted or simplified.
  • the information acquisition unit 1 of Embodiment 3 acquires user specifying information in addition to icon information.
  • the user specifying information is information for specifying the user of the display control apparatus 1A, and is, for example, a user ID assigned to each user.
  • the user specifying information is output from a user specifying unit 13 described later.
  • the information management unit 4 manages the number of times an icon is used for each user, the abstraction level when the icon is displayed, and the like.
  • the output unit 5 according to the third embodiment performs selection according to the user when selecting the icon display form.
  • The user specifying unit 13 specifies a user of the display control device 1A, that is, a user of the navigation device 1C, and outputs information indicating the specified user, for example a user ID, to the display control device 1A as user specifying information.
  • the user specifying unit 13 manages information specific to the user, for example, biometric information such as a fingerprint or a face image, personal information used by the user for login, and the user ID in association with each other.
  • the user specifying unit 13 acquires information specific to the user from a biosensor, a camera, a personal information input device, or the like (not shown), and outputs a user ID corresponding to the user of the navigation device 1C.
  • The user specifying unit 13 may be a device separate from the navigation device 1C and may be connected to the navigation device 1C by a wired or wireless connection.
  • When the application destination of the display control device 1A is a device such as a smartphone, a personal computer, or a television, that device has a configuration equivalent to the user specifying unit 13, and the user specifying information is output to the display control device 1A.
  • FIG. 12 shows information managed by the information management unit 4 for each icon. Unlike what is shown in FIG. 4, “display target” and “current number of times of use” are managed for each user.
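The per-user bookkeeping of FIG. 12 might be modeled roughly as follows; the record layout, user IDs, and form labels are invented for illustration:

```python
# Hypothetical sketch of FIG. 12's per-user management: the display
# forms per abstraction level are shared across users, while the
# "display target" and "current usage count" are kept per user ID.

icon_record = {
    "icon_id": "07",
    "forms": {1: "text+figure", 2: "short text+figure",
              3: "minimal text+figure", 4: "figure only"},  # shared
    "per_user": {
        "user_A": {"display_target": 2, "usage": {1: 12, 2: 7}},
        "user_B": {"display_target": 1, "usage": {1: 3}},
    },
}

def current_form(record, user_id):
    """Select the display form for the level this user is currently on."""
    level = record["per_user"][user_id]["display_target"]
    return record["forms"][level]

print(current_form(icon_record, "user_A"))  # short text+figure
print(current_form(icon_record, "user_B"))  # text+figure
```

With this layout, the flows of FIGS. 13 and 14 are the Embodiment 1 flows applied to `record["per_user"][user_id]` instead of a single shared entry.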
  • FIG. 13 is a flowchart illustrating processing when an icon is displayed.
  • When it is necessary to display an icon on the navigation device 1C, the navigation unit 10 outputs icon information indicating the icon to be displayed. Moreover, the user specifying unit 13 outputs the user specifying information, and the information acquisition unit 1 acquires the icon information and the user specifying information (step ST31).
  • the subsequent processing of step ST32 to step ST37 is processing for the user specified by the user specifying information acquired in step ST31, that is, processing related to the “display target” and “current number of uses” of the user.
  • the processing contents are the same as those in steps ST2 to ST7 shown in FIG.
  • FIG. 14 is a flowchart illustrating processing when an icon is operated.
  • When an icon displayed on the navigation device 1C is operated by the user, the navigation unit 10 outputs icon information indicating the operated icon. Moreover, the user specifying unit 13 outputs the user specifying information, and the information acquisition unit 1 acquires the icon information and the user specifying information (step ST41).
  • the subsequent processing of step ST42 to step ST44 is processing for the user specified by the user specifying information acquired in step ST41, that is, processing related to the “display target” and “current number of uses” of the user.
  • the processing contents are the same as those in steps ST12 to ST14 shown in FIG.
  • In the above description, the “display target” and the “current usage count” are managed for each user as shown in FIG. 12.
  • the display form may be managed for each user by the information management unit 4.
  • For example, in FIG. 4, the display form of abstraction level 2 is composed of graphic data and the character data “search around route”. In contrast, a display form composed of the graphic data and the character data “search in the vicinity of the route” may be used as the abstraction level 2 display form for user A, and a display form composed of the graphic data and the character data “route neighborhood cafe” as the abstraction level 2 display form for user B. In this way, a display form is managed for each user by changing the character data for each user.
  • However, the number of segments of the character data is the same for the same abstraction level. For example, in the display form of abstraction level 1, the character data consists of three segments such as “around the route / cafes / search”; in the display form of abstraction level 2, it consists of two segments such as “around the route / search”; in the display form of abstraction level 3, it consists of one segment such as “around the route”, “cafe”, or “search”; and since there is no character data at abstraction level 4, the number of segments is zero. That is, the tendency for the number of segments of character data to decrease as the abstraction level becomes higher is common to all users, but the content of the character data changes for each user.
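The segment rule described above can be sketched as follows. This is a simplification that keeps the first N segments (the patent's examples suggest which segments are kept may vary), and all phrases and user IDs are illustrative:

```python
# Hypothetical sketch: the number of character-data segments per
# abstraction level is common to all users (3, 2, 1, 0), while the
# wording of the segments differs per user. Phrases are assumptions.

SEGMENTS = {
    "user_A": ["around the route", "cafes", "search"],
    "user_B": ["route neighborhood", "cafe", "look"],
}
SEGMENT_COUNT = {1: 3, 2: 2, 3: 1, 4: 0}  # segments per abstraction level

def character_data(user_id, level):
    """Keep the first N segments for the given abstraction level."""
    n = SEGMENT_COUNT[level]
    return " / ".join(SEGMENTS[user_id][:n])

print(character_data("user_A", 1))  # around the route / cafes / search
print(character_data("user_A", 3))  # around the route
print(character_data("user_B", 4))  # '' (no character data at level 4)
```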
  • the graphic data may be changed for each user.
  • In FIGS. 15A and 15B, the impression or color of the graphic data is changed for each user.
  • FIG. 15A is an example of rounded graphic data, and is used for the user A.
  • FIG. 15B is an example of angular figure data and is used for the user B.
  • Alternatively, for example, pink-based graphic data may be used for user A, and green-based graphic data for user B.
  • In FIGS. 16A and 16B, the symbol of the graphic data itself is changed for each user.
  • FIG. 16A uses a “magnifying glass” as the graphic data indicating “search” and is used for user A. On the other hand, FIG. 16B uses “eyes” as the graphic data indicating “search” and is used for user B. As described with reference to FIGS. 15A to 16B, the impression, the color, or the symbol itself of the graphic data is changed for each user so as to be familiar and easy to understand for each user.
  • The display form settings for each user may already be set at the time of product shipment of the navigation device 1C, may be set by the user inputting his or her preferences when the navigation device 1C is used for the first time, or may be set or updated from an external server even after use of the navigation device 1C has started.
  • As described above, according to the display control device 1A of the third embodiment, the display form can be switched for each user, and the icon is displayed in a display form with an abstraction level suitable for each user.
  • With the display control device according to the present invention, the user can easily understand the meaning of a displayed icon even when the icon is displayed in a display form with a different abstraction level, so the display control device is suitable for use in a device such as a navigation device.
  • 1 information acquisition unit, 1A display control device, 1B, 1C navigation device, 2 usage count changing unit, 3 abstraction level changing unit, 4 information management unit, 5 output unit, 10 navigation unit, 11 input unit, 12 display unit, 13 user specifying unit, 20 device, 30 external server, 40 mobile terminal, 101 processing circuit, 102 communication device, 103 bus, 104 memory, 105 CPU, 106 RAM.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to the invention, an information management unit (4) manages display forms of each icon associated with different abstraction levels, a usage count for each icon, and the abstraction level at which each icon is to be displayed. A usage count changing unit (2) changes the managed usage count of each icon whenever the icon is operated. An abstraction level changing unit (3) changes the abstraction level at which each icon is to be displayed according to the icon's managed usage count.
PCT/JP2017/016805 2017-04-27 2017-04-27 Dispositif de commande d'affichage, dispositif de navigation et procédé de commande d'affichage Ceased WO2018198285A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2017/016805 WO2018198285A1 (fr) 2017-04-27 2017-04-27 Dispositif de commande d'affichage, dispositif de navigation et procédé de commande d'affichage
JP2019514997A JP6671544B2 (ja) 2017-04-27 2017-04-27 表示制御装置、ナビゲーション装置、及び、表示制御方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/016805 WO2018198285A1 (fr) 2017-04-27 2017-04-27 Dispositif de commande d'affichage, dispositif de navigation et procédé de commande d'affichage

Publications (1)

Publication Number Publication Date
WO2018198285A1 true WO2018198285A1 (fr) 2018-11-01

Family

ID=63919702

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/016805 Ceased WO2018198285A1 (fr) 2017-04-27 2017-04-27 Dispositif de commande d'affichage, dispositif de navigation et procédé de commande d'affichage

Country Status (2)

Country Link
JP (1) JP6671544B2 (fr)
WO (1) WO2018198285A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005102001A (ja) * 2003-09-26 2005-04-14 Kyocera Mita Corp 画像処理装置
JP2010004301A (ja) * 2008-06-19 2010-01-07 Konica Minolta Business Technologies Inc 情報処理装置
JP2013246536A (ja) * 2012-05-24 2013-12-09 Carecom Co Ltd アイコン表示装置

Also Published As

Publication number Publication date
JPWO2018198285A1 (ja) 2019-11-07
JP6671544B2 (ja) 2020-03-25

Similar Documents

Publication Publication Date Title
US10825456B2 (en) Method and apparatus for performing preset operation mode using voice recognition
US11592968B2 (en) User terminal apparatus and management method of home network thereof
US11163425B2 (en) User terminal apparatus and management method of home network thereof
US10564813B2 (en) User terminal apparatus and management method of home network thereof
US9134972B2 (en) User interface generation apparatus
JP5429060B2 (ja) 表示制御装置、表示制御方法、表示制御プログラム並びにこの表示制御プログラムが記録された記録媒体
US20130254714A1 (en) Method and apparatus for providing floating user interface
JP2007287135A (ja) 画像表示制御装置および画像表示制御装置用のプログラム
JPWO2008035489A1 (ja) ナビゲーションシステムおよび同システムにおける操作ガイダンス表示方法
US8640130B2 (en) Information processing apparatus, application control method, and program
JP2012138076A (ja) メディアコンテンツを検索するためのユーザインタフェース
TW201820109A (zh) 控制使用者介面之方法、程式及裝置
JP2009210641A (ja) 画像表示処理装置、画像表示処理方法および画像表示処理プログラム
US10871898B2 (en) Display apparatus for providing preview UI and method of controlling display apparatus
JP6671544B2 (ja) 表示制御装置、ナビゲーション装置、及び、表示制御方法
JP4532988B2 (ja) 操作画面の制御方法及びプログラム、並びに表示制御装置
JP2009088653A (ja) リモートコントローラおよび遠隔操作方法
JP2009025905A (ja) 情報処理装置およびアイコン表示方法
KR20190055489A (ko) 전자 장치 및 그 제어 방법
JP2009199456A (ja) 情報処理装置、表示方法及びプログラム
WO2009084084A1 (fr) Dispositif de reproduction de support d'enregistrement, procédé de reproduction de support d'enregistrement, programme de reproduction de support d'enregistrement et support d'enregistrement sur lequel un programme de reproduction de support d'enregistrement est mémorisé
JP7729735B2 (ja) 情報処理装置、情報処理方法、および情報処理プログラム
JP4765893B2 (ja) タッチパネル搭載装置、外部装置、及び外部装置の操作方法
US9990338B2 (en) Display device for controlling enlargement of displayed image data, and data processing device and non-transitory computer readable medium
US8910047B2 (en) Device-specific and application-specific computing device, playback device and method for controlling playback device using computing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17907980

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019514997

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17907980

Country of ref document: EP

Kind code of ref document: A1