WO2017099213A1 - Onomatopoeia presentation device relating to evaluation results of the surrounding environment - Google Patents
Onomatopoeia presentation device relating to evaluation results of the surrounding environment
- Publication number
- WO2017099213A1 (PCT/JP2016/086696)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- onomatopoeia
- evaluation
- surrounding environment
- unit
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- A—HUMAN NECESSITIES
- A42—HEADWEAR
- A42B—HATS; HEAD COVERINGS
- A42B3/00—Helmets; Helmet covers ; Other protective head coverings
- A42B3/04—Parts, details or accessories of helmets
- A42B3/0406—Accessories for helmets
- A42B3/0433—Detecting, signalling or lighting devices
-
- A—HUMAN NECESSITIES
- A42—HEADWEAR
- A42B—HATS; HEAD COVERINGS
- A42B3/00—Helmets; Helmet covers ; Other protective head coverings
- A42B3/04—Parts, details or accessories of helmets
- A42B3/30—Mounting radio sets or communication systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/26—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using acoustic output
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/178—Warnings
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B7/00—Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00
- G08B7/06—Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096716—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information does not generate an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096775—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2430/00—Signal processing covered by H04R, not provided for in its groups
- H04R2430/20—Processing of the output signals of the acoustic transducers of an array for obtaining a desired directivity characteristic
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R3/00—Circuits for transducers, loudspeakers or microphones
- H04R3/12—Circuits for transducers, loudspeakers or microphones for distributing signals to two or more loudspeakers
Definitions
- the present invention relates to a technique for presenting an evaluation result of a surrounding environment to a user. More specifically, the present invention relates to a technique for providing information on the outside world to a user who observes the situation of the outside world, such as a driver driving a vehicle.
- A driver of an automobile understands the situation of the surrounding environment from what is visible through the windshield and makes various decisions.
- For example, when the driver sees a pedestrian on a crosswalk, the driver decides whether it is necessary to stop; when the driver sees a car approaching an intersection, the driver decides whether it is necessary to slow down.
- The driver also makes various judgments based on the sounds of the vehicle and the sounds from outside the vehicle that reach the ears.
- The same applies to a rider. While performing driving operations, the rider pays attention to the images entering the eyes and the sounds entering the ears, and grasps the situation of the surrounding environment. The rider then performs driving operations according to the grasped situation. In this way, when a human moves on a vehicle, he or she acts according to the surrounding environment while continuously confirming it.
- Patent Document 1 relates to a head-up display that forms an image on the windshield of an automobile by projecting light onto the windshield.
- the information displayed by the head-up display includes vehicle information indicating the vehicle status such as the traveling speed and direction indication of the vehicle, and foreground information indicating the status on the route along which the vehicle moves.
- vehicle information and foreground information displayed on a head-up display are displayed in different modes so that they can be identified by a driver. For example, when a bicycle traveling in front of a car is recognized as foreground information, an indicator for notifying the appearance of the bicycle is displayed on the head-up display.
- In Patent Document 1, showing an indicator on the windshield allows the driver's attention to be directed to a car or bicycle appearing on the route. In other words, the driver can be made aware of the appearance of cars and bicycles.
- However, although the driver alerted by the indicator can pay attention to the appearance of cars and bicycles, it is the driver himself who must perform all the analysis of the specific situation.
- the driver who is cautioned grasps the situation of the car or bicycle based on the actual image recognized through the windshield. In other words, the driver determines the situation on the route based on the image of the car or bicycle that has actually entered the sight.
- An object of the present invention is to provide a technique for conveying information about the outside world to a human in as easily understandable a form as possible.
- The onomatopoeia presentation device relating to the evaluation result of the surrounding environment, which is one embodiment of the present invention, includes a measurement result input unit, an onomatopoeia data acquisition unit, and an onomatopoeia output unit.
- The measurement result input unit receives a measurement result regarding the state of the user's surrounding environment measured by a measurement device.
- The onomatopoeia data acquisition unit acquires onomatopoeia data indicating an onomatopoeia corresponding to an evaluation of the state of the surrounding environment performed based on the measurement result input by the measurement result input unit.
- The onomatopoeia output unit outputs the onomatopoeia data acquired by the onomatopoeia data acquisition unit to a notification device capable of notifying the user by voice or text, and causes the notification device to present the onomatopoeia corresponding to the evaluation of the surrounding environment to the user, by voice or text, while the user is operating in the surrounding environment.
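- As a concrete illustration of how these three units could cooperate, a minimal Python sketch is given below. It is purely illustrative: the class, method, and key names are assumptions introduced here and do not appear in the disclosure.
```python
# Illustrative sketch only: all names below are assumptions and do not appear
# in the disclosure. It shows how the three claimed units could cooperate.
from typing import Callable, Mapping


class SurroundingsOnomatopoeiaPresenter:
    def __init__(self,
                 evaluate: Callable[[Mapping[str, float]], str],
                 onomatopoeia_db: Mapping[str, dict],
                 notify: Callable[[dict], None]) -> None:
        self._evaluate = evaluate      # evaluation of the surrounding environment
        self._db = onomatopoeia_db     # onomatopoeia data (text / voice)
        self._notify = notify          # notification device (display or speaker)

    def input_measurement_result(self, measurement: Mapping[str, float]) -> None:
        """Measurement result input unit: receive one set of sensor readings."""
        evaluation = self._evaluate(measurement)           # e.g. "warm"
        data = self.acquire_onomatopoeia_data(evaluation)
        self.output_onomatopoeia(data)

    def acquire_onomatopoeia_data(self, evaluation: str) -> dict:
        """Onomatopoeia data acquisition unit: map an evaluation to onomatopoeia data."""
        return self._db[evaluation]

    def output_onomatopoeia(self, data: dict) -> None:
        """Onomatopoeia output unit: hand the data to the notification device."""
        self._notify(data)


# Example wiring (threshold, onomatopoeia and sensor key are invented):
presenter = SurroundingsOnomatopoeiaPresenter(
    evaluate=lambda m: "warm" if m["temperature"] >= 20 else "cold",
    onomatopoeia_db={"warm": {"text": "pokapoka"}, "cold": {"text": "keen"}},
    notify=print,
)
presenter.input_measurement_result({"temperature": 23.0})   # -> {'text': 'pokapoka'}
```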
- The onomatopoeia data acquisition unit may acquire onomatopoeia data indicating one onomatopoeia that represents a single evaluation performed on a plurality of evaluation items.
- If onomatopoeia is not used, the evaluations relating to a plurality of evaluation items must be explained with text or the like, and it takes the user time to grasp the contents. It is also difficult to read and understand a long explanation while continuing some operation. In this respect, since a single onomatopoeia concisely expresses the evaluations of a plurality of evaluation items, the user can easily grasp the evaluation results.
- The onomatopoeia data acquisition unit may acquire onomatopoeia data indicating an onomatopoeia that expresses, in multiple stages, a multi-stage evaluation performed on a single evaluation item.
- Without onomatopoeia, expressing multi-stage evaluation contents for a given evaluation item requires an explanation using text or the like, and it takes the user time to grasp the contents. It is also difficult to read and understand a long explanation while continuing some operation. In this respect, an onomatopoeia can concisely express a multi-stage evaluation of a given evaluation item, so the user can easily grasp the evaluation result.
- The onomatopoeia data acquired by the onomatopoeia data acquisition unit may be data indicating an onomatopoeia corresponding to an evaluation of the surrounding environment obtained by comparing the measurement result, or a value calculated based on the measurement result, with a predetermined threshold value.
- The user may include a driver of a vehicle, and the notification device may include an image display unit provided on a transparent or translucent member arranged ahead of the driver's line of sight.
- With this configuration, the driver can confirm the onomatopoeia displayed ahead of the line of sight.
- The state of the surrounding environment may include the state of an object existing around the vehicle, and the onomatopoeia presentation device relating to the evaluation result of the surrounding environment may further include a position specifying unit that specifies the position of the object. The image display unit may display the onomatopoeia in the vicinity of the region where a straight line connecting the driver's viewpoint and the position of the object specified by the position specifying unit intersects the member.
- With this configuration, the driver can recognize the object by turning the gaze in the direction in which the onomatopoeia, presented as character information, is displayed.
- The user may include a driver of the vehicle, and the notification device may include an audio output unit provided in a helmet worn by the driver.
- With this configuration, the driver can grasp information from the onomatopoeia output from the speaker simply by wearing the helmet.
- The state of the surrounding environment may include the state of an object existing around the vehicle, and the onomatopoeia presentation device relating to the evaluation result of the surrounding environment may further include a position specifying unit for specifying the position of the object. The audio output unit may include a directional audio output unit having directivity, and the directional audio output unit may output the onomatopoeia by voice so that the position of the object specified by the position specifying unit becomes the apparent sound source.
- With this configuration, the driver can recognize that the object is present in the direction from which the onomatopoeia, presented as voice information, is heard.
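- As a rough illustration of this idea, the sketch below pans an onomatopoeia voice sample between the left and right helmet speakers according to the bearing of the object. This is an assumption-laden simplification (two speakers and constant-power panning) rather than the directional audio output unit itself; all names and values are invented.
```python
import math

def stereo_gains_for_bearing(bearing_deg: float) -> tuple[float, float]:
    """Constant-power pan between two helmet speakers.
    bearing_deg: 0 = straight ahead, negative = object to the left,
    positive = object to the right (values outside +/-90 are clamped)."""
    pan = max(-90.0, min(90.0, bearing_deg)) / 90.0   # -1.0 .. 1.0
    theta = (pan + 1.0) * math.pi / 4.0               # 0 .. pi/2
    return math.cos(theta), math.sin(theta)           # (left gain, right gain)

# Example: an object detected about 40 degrees to the rider's right.
left, right = stereo_gains_for_bearing(40.0)
# The onomatopoeia voice sample would be played with these gains so that
# it appears to come from the direction of the object.
print(round(left, 2), round(right, 2))   # -> 0.42 0.91
```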
- the onomatopoeia presentation program relating to the evaluation result of the surrounding environment of the present invention causes the computer to execute a measurement result input process, an onomatopoeia data acquisition process, and an onomatopoeia output process.
- In the measurement result input process, a measurement result regarding the state of the user's surrounding environment measured by a measurement device is input.
- The onomatopoeia data acquisition process acquires onomatopoeia data indicating an onomatopoeia corresponding to an evaluation of the state of the surrounding environment, which is performed based on the measurement result input in the measurement result input process.
- The onomatopoeia output process outputs the onomatopoeia data acquired in the onomatopoeia data acquisition process to a notification device capable of notifying the user by voice or text, and causes the notification device to present the onomatopoeia corresponding to the evaluation of the surrounding environment to the user, by voice or text, while the user is operating in the surrounding environment.
- In the onomatopoeia presentation program relating to the evaluation result of the surrounding environment of Configuration 9, the onomatopoeia data acquisition process may be a process of acquiring onomatopoeia data indicating one onomatopoeia that represents a single evaluation performed on a plurality of evaluation items.
- If onomatopoeia is not used, the evaluations relating to a plurality of evaluation items must be explained with text or the like, and it takes the user time to grasp the contents. It is also difficult to read and understand a long explanation while continuing some operation. In this respect, since a single onomatopoeia concisely expresses the evaluations of a plurality of evaluation items, the user can easily grasp the evaluation results.
- The onomatopoeia data acquisition process may be a process of acquiring onomatopoeia data indicating an onomatopoeia that expresses, in multiple stages, a multi-stage evaluation performed on a single evaluation item.
- Without onomatopoeia, expressing multi-stage evaluation contents for a given evaluation item requires an explanation using text or the like, and it takes the user time to grasp the contents. It is also difficult to read and understand a long explanation while continuing some operation. In this respect, an onomatopoeia can concisely express a multi-stage evaluation of a given evaluation item, so the user can easily grasp the evaluation result.
- The onomatopoeia data acquired in the onomatopoeia data acquisition process may be data indicating an onomatopoeia corresponding to an evaluation of the surrounding environment obtained by comparing the measurement result, or a value calculated based on the measurement result, with a predetermined threshold value.
- the onomatopoeia presentation method relating to the evaluation result of the surrounding environment of the present invention includes a measurement result input step, an onomatopoeia data acquisition step, and an onomatopoeia output step.
- In the measurement result input step, a measurement result regarding the state of the user's surrounding environment measured by a measurement device is input.
- The onomatopoeia data acquisition step acquires onomatopoeia data indicating an onomatopoeia corresponding to an evaluation of the state of the surrounding environment, which is performed based on the measurement result input in the measurement result input step.
- The onomatopoeia output step outputs the acquired onomatopoeia data to a notification device capable of notifying the user by voice or text, and causes the notification device to present the onomatopoeia corresponding to the evaluation of the surrounding environment to the user, by voice or text, while the user is operating in the surrounding environment.
- The onomatopoeia data acquisition step may be a step of acquiring onomatopoeia data indicating one onomatopoeia that represents a single evaluation performed on a plurality of evaluation items.
- If onomatopoeia is not used, the evaluations relating to a plurality of evaluation items must be explained with text or the like, and it takes the user time to grasp the contents. It is also difficult to read and understand a long explanation while continuing some operation. In this respect, since a single onomatopoeia concisely expresses the evaluations of a plurality of evaluation items, the user can easily grasp the evaluation results.
- The onomatopoeia data acquisition step may be a step of acquiring onomatopoeia data indicating an onomatopoeia that expresses, in multiple stages, a multi-stage evaluation performed on a single evaluation item.
- Without onomatopoeia, expressing multi-stage evaluation contents for a given evaluation item requires an explanation using text or the like, and it takes the user time to grasp the contents. It is also difficult to read and understand a long explanation while continuing some operation. In this respect, an onomatopoeia can concisely express a multi-stage evaluation of a given evaluation item, so the user can easily grasp the evaluation result.
- The onomatopoeia data acquired in the onomatopoeia data acquisition step may be data indicating an onomatopoeia corresponding to an evaluation of the surrounding environment obtained by comparing the measurement result, or a value calculated based on the measurement result, with a predetermined threshold value.
- the onomatopoeia output unit causes the notification device to present onomatopoeia while the user is operating in the surrounding environment.
- However, the onomatopoeia presentation device relating to the evaluation result of the surrounding environment of the present invention does not necessarily need to include a means for detecting whether or not the user is operating.
- In other words, a detection unit that detects whether the user is performing an operation may or may not be provided.
- When a detection unit is provided, it may be a sensor that detects an input operation on the device being operated, or a sensor that detects the movement of the device.
- For example, if the user is operating a vehicle, a sensor that detects the operation amount of an accelerator, a brake, a steering wheel, or the like can be used as the detection unit.
- Alternatively, a sensor that detects the user's movement can be used as the detection unit.
- When a detection unit is provided, the onomatopoeia output unit may be controlled so that the onomatopoeia is output while the user's operation is being detected.
- When the onomatopoeia presentation device does not include a detection unit, the onomatopoeia output unit may be configured so that the time from the input of the measurement information to the output of the onomatopoeia is short; for example, by using a CPU with high processing capability, the onomatopoeia can be output while the user is still operating.
- Since the onomatopoeia presentation device relating to the evaluation result of the surrounding environment according to the present invention presents the onomatopoeia to the user while the user is operating, the user can recognize the onomatopoeia in real time.
- Here, "real time" means a timing at which the onomatopoeia can be presented before the user's operation is completed, and is not limited to an instantaneous timing.
- the timing when the onomatopoeia output unit outputs onomatopoeia is not limited to when the user is operating.
- the onomatopoeia output unit may start onomatopoeia output while the user is operating, and may continue to output onomatopoeia even after the user's operation ends. That is, the onomatopoeia may be output both during and after the user's operation.
- As a minimum configuration, the onomatopoeia presentation device relating to the evaluation result of the surrounding environment of the present invention only needs to include the measurement result input unit, the onomatopoeia data acquisition unit, and the onomatopoeia output unit.
- the onomatopoeia presentation device may further include an evaluation unit, or the evaluation unit may be provided in an external device.
- the onomatopoeia presentation device may include a measurement device.
- The comparison with the threshold value can be performed under various criteria.
- For example, temperature, humidity, the moving speed of an object, and the like can be used as criteria.
- Not only a single criterion but also a plurality of criteria can be used for a comprehensive evaluation.
- The information providing apparatus is an apparatus that presents the state of the surrounding environment to the user, and includes a measurement unit that measures the state of the surrounding environment, an evaluation unit that evaluates the state of the surrounding environment based on the measurement result of the measurement unit, a data acquisition unit that acquires onomatopoeia data corresponding to the evaluation performed by the evaluation unit, and an output unit that presents the state of the surrounding environment to the user by outputting the onomatopoeia data acquired by the data acquisition unit.
- If onomatopoeia is not used, the contents of the environmental evaluation must be explained with long sentences or the like, and it takes the user time to grasp the contents. It is also difficult to read and understand a long explanation while continuing some operation. In the present embodiment, by using onomatopoeia, the evaluation of an environmental evaluation item can be expressed concisely, and the user can grasp the evaluation result while performing some operation.
- the driver who drives the car or the rider who drives the motorcycle grasps the situation of the surrounding environment from the video entering the field of view and makes various judgments. The same applies to a person who is not on a vehicle, that is, a person who is walking or running. People always take action according to the surrounding environment while keeping track of the surrounding environment.
- the inventor of the present application has noticed that there are some events in the surrounding environment that can be easily judged by the observer while others cannot be easily judged by the observer. If the object to be observed can be easily determined by the observer, many purposes can be achieved by prompting the observer that the object to be observed exists. An observer who is urged to watch may view the observation object with his / her eyes and grasp the situation. However, the observer may not be able to easily determine what state the observation object is in and how it moves.
- The inventor therefore came up with the idea of using onomatopoeia as a means for presenting the situation of the surrounding environment to an observer who has little time and attention to spare for receiving the presented information.
- Using onomatopoeia to provide information has the following advantages. First, evaluation contents relating to a plurality of evaluation items can be expressed by a single onomatopoeia. Second, a multi-stage evaluation value for a single evaluation item can be expressed by an onomatopoeia.
- The inventor of the present application thus realized that onomatopoeia can be used effectively in a mechanism that provides the observer with the situation of the surrounding environment grasped from a plurality of evaluation items, or with the situation of the surrounding environment evaluated in multiple stages for a single evaluation item. The inventor therefore conceived that, by using an external information providing system based on onomatopoeia, information on the surrounding environment can be given to the observer in a simple form.
- The inventor of the present application further conceived of giving the user information on the surrounding environment while the user is operating, by using the “onomatopoeia presentation device regarding the evaluation result of the surrounding environment”, which is an example of such an external information providing system.
- Here, “giving information about the surrounding environment while the user is operating” means giving the user, while a certain operation or action is being performed, the information on the surrounding environment that the user wants to know when performing that operation or action. In other words, it means providing information relating to the surrounding environment to the user while the user is operating in that surrounding environment. For example, if the user is riding a motorcycle, it means providing information on the surrounding environment while the user is operating the motorcycle. If the user is running, it means that information on the surrounding environment is given while the user is running.
- an external information providing apparatus will be described as an example of an onomatopoeia presentation apparatus related to the evaluation result of the surrounding environment of the present invention.
- the “measurement information input unit” in the following embodiments corresponds to the “measurement result input unit” of the present invention.
- the “onomatopoeia acquisition unit” in the embodiment corresponds to the “onomatopoeia data acquisition unit” of the present invention.
- the “output unit” in the embodiment corresponds to the “onomatopoeia output unit” of the present invention.
- “Speaker” and “display” in the embodiment are examples of the “notification device” of the present invention.
- the external information providing apparatus is mounted on a motorcycle.
- the rider of the motorcycle grasps the situation of the surrounding environment and performs a driving operation according to the grasped situation.
- the external information providing apparatus evaluates the surrounding environment and acquires onomatopoeia corresponding to the evaluation content.
- the motorcycle provides the onomatopoeia acquired by the external information providing apparatus to the rider as an evaluation result of the surrounding environment.
- the rider controls the motorcycle and acquires information indicating the evaluation result of the surrounding environment expressed in onomatopoeia while continuing to travel.
- An onomatopoeic word is a word that imitates the cry of an animal or the sound made by an object.
- Examples of onomatopoeic words include “Wanwan” (“wan-wan” in Japanese speech, “bow-wow” in English speech: an onomatopoeic word representing a dog's bark), “Gachan” (“gacyan” in Japanese speech, “crash” in English speech: an onomatopoeic word representing the sound of pottery or the like breaking), and “Gatagata” (“gata-gata” in Japanese speech, “bang” in English speech: an onomatopoeic word representing the rattling sound made, for example, when a desk's legs are of different lengths).
- A mimetic word is a word that expresses the state of a thing or the behavior of a person as if it were such a sound.
- Examples of mimetic words include “Tsurutsuru” (“tsuru-tsuru” in Japanese speech: a mimetic word representing a smooth surface), “Puyopuyo” (“puyo-puyo” in Japanese speech: a mimetic word representing a soft, elastic state), and “Yochiyochi” (“yochi-yochi” in Japanese speech: a mimetic word representing a baby toddling forward little by little on its limbs).
- Other examples of mimetic words include “Sorosoro” (“soro-soro” in Japanese speech, “slo-slo-slo” in English speech: a mimetic word representing slow, careful movement), “Byun” (“byun” in Japanese speech, “zoom” in English speech: a mimetic word representing fast movement), “Kyuiin” (“cueen” in Japanese speech, “zoooooom” in English speech: a mimetic word representing quick, brisk progress), and “Furafura” (“fla-fla” in Japanese speech, “diz-diz-diz” in English speech: a mimetic word representing unsteady, wobbling movement).
- In this description, “onomatopoeia” is used as a general term covering both onomatopoeic words and mimetic words.
- Japanese is a language that frequently uses onomatopoeia, but onomatopoeia is also used in English.
- For example, onomatopoeia such as “beep” is used to represent ringing sounds such as buzzers.
- “Boing!”, “whoo!”, and the like are used as onomatopoeia representing jumping or bouncing.
- the motorcycle 1 is a naked type motorcycle as shown in FIG.
- the external information providing apparatus according to the present embodiment can be applied to all types of motorcycles such as a racer type and an American type.
- the external information providing apparatus according to the present embodiment can be applied to a scooter.
- the left and right and front and rear directions of the motorcycle 1 refer to the left and right and front and rear directions viewed from the rider seated on the seat 81 of the motorcycle 1.
- arrow F indicates the forward direction of the motorcycle 1
- arrow R indicates the rear direction of the motorcycle 1
- arrow U indicates the upward direction of the motorcycle 1
- arrow D indicates the downward direction of the motorcycle 1.
- the motorcycle 1 includes a body frame 6.
- the vehicle body frame 6 includes a head pipe 61.
- the head pipe 61 is disposed at the front portion of the vehicle body.
- a steering shaft 70 is inserted into the head pipe 61 so as to be rotatable in the left-right direction.
- a handle 71 is fixed to the upper portion of the steering shaft 70. The handle 71 can rotate in the left-right direction integrally with the steering shaft 70.
- the front fork 72 is fixed to the steering shaft 70 via a bracket.
- a front wheel 73 is rotatably attached to the lower end of the front fork 72. When the rider steers the handle 71 in the left-right direction, the front wheel 73 can rotate in the left-right direction.
- a front suspension 74 is provided on the front fork 72.
- a seat 81 is provided at a portion slightly rearward from the middle in the longitudinal direction of the vehicle body.
- the seat 81 is supported by the body frame 6.
- a fuel tank 82 is provided in front of the seat 81.
- the fuel tank 82 is supported by the vehicle body frame 6.
- An engine 83 is provided below the fuel tank 82.
- the engine 83 is supported by the body frame 6.
- a rear arm 75 is provided behind the engine 83.
- the rear arm 75 is supported on a pivot shaft provided on the vehicle body frame 6 so as to be swingable in the vertical direction.
- a rear wheel 76 is rotatably provided at the rear portion of the rear arm 75.
- a rear suspension 77 is provided between the rear wheel 76 and the vehicle body frame 6.
- a transmission device 84 is provided behind the engine 83.
- the transmission device 84 is supported by the vehicle body frame 6.
- the output of the engine 83 is transmitted to the rear wheel 76 via the transmission device 84 and a chain.
- the motorcycle 1 is equipped with a large number of sensors 101 to 109 as shown in FIG.
- the image sensor 101 is provided in the front part of the vehicle body.
- the image sensor 101 is a sensor that can image the front of the vehicle.
- a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary MOS) image sensor can be used.
- the voice sensor 102 is a microphone that is provided near the handle 71 and collects the voice of the surrounding environment or the voice of the rider.
- the image sensor 101 and the audio sensor 102 may be attached at different locations.
- the distance measuring sensor 103 is disposed above the headlamp and measures the distance to the observation object.
- the distance measuring sensor 103 reflects the light irradiated from the light source on the observation object, and analyzes the reflected light to measure the distance to the observation object.
- An LED or the like can be used as the light source.
- the direction sensor 104 is attached below the seat 81 and measures the direction.
- a magnetic sensor or the like can be used.
- the position information sensor 105 is provided below the seat 81 and acquires the position information of the motorcycle 1.
- a GPS (Global Positioning System) receiver can be used.
- the temperature sensor 106 is provided below the seat 81 and measures the outside air temperature.
- the humidity sensor 107 is provided below the seat 81 and measures the humidity of the outside air.
- the FM receiver 108 is provided below the seat 81, receives FM (Frequency Modulation) multiplex broadcast, and acquires text information such as weather forecast and traffic information.
- the expressway information receiving unit 109 is provided below the seat 81, and acquires traffic jam information, construction information, and the like on the expressway.
- The external information providing apparatus 10 includes a measurement unit 100, an environment evaluation module 10P, and an output unit 500.
- the environment evaluation module 10P includes a recognition unit 200, an onomatopoeia acquisition unit 300, and an onomatopoeia database 450.
- the measuring unit 100 includes the sensors 101 to 109 described above.
- In the present embodiment, the measurement unit 100 includes the nine types of sensors 101 to 109 as sensors for obtaining outside world information. However, this is only an example; a configuration including only some of these sensors may be used, and a configuration further including other sensors may also be used. For example, an infrared sensor or a stereo camera may be provided. Alternatively, a sensor that measures the rider's biological information (a blood flow sensor or a perspiration sensor) may be provided.
- the measuring unit 100 measures an external event necessary for grasping an environmental situation.
- the recognition unit 200 is a processing unit that evaluates the surrounding environment based on the information measured by the measurement unit 100.
- the recognition unit 200 includes a measurement information input unit 201, an evaluation unit 202, an image processing unit 203, a display position determination unit 204, a measurement information database 250, a determination criterion database 251, and an image database 252.
- the measurement information input unit 201 inputs information measured by the measurement unit 100. That is, information measured by the various sensors 101 to 109 is input.
- the measurement information input unit 201 stores information other than image data in the measurement information database 250 among the input measurement information. The contents of the measurement information database 250 will be described in detail later.
- the measurement information input unit 201 stores the image data acquired by the image sensor 101 in the image database 252. The contents of the image database 252 will be described in detail later.
- the image processing unit 203 performs image processing on the image data stored in the image database 252.
- the processing contents of the image processing unit 203 will be described in detail later, but the image processing unit 203 recognizes an object such as a car or a bicycle included in the image data and analyzes the movement of the object such as the car or the bicycle. Alternatively, the image processing unit 203 recognizes the road and analyzes the state of the road.
- The evaluation unit 202 evaluates the surrounding environment based on the measurement information stored in the measurement information database 250, the image analysis results produced by the image processing unit 203, and the determination criterion tables stored in the determination criterion database 251.
- the evaluation unit 202 determines an onomatopoeia that represents the evaluation result of the surrounding environment. The contents of the evaluation method by the evaluation unit 202 will be described in detail later.
- the display position determination unit 204 determines a position for displaying onomatopoeia. As will be described later, in the present embodiment, a part of the shield of the helmet worn by the rider is used as a display for displaying onomatopoeia. The display position determination unit 204 determines at which position of the display the onomatopoeia is to be displayed.
- the onomatopoeia acquisition unit 300 accesses the onomatopoeia database 450 and acquires data corresponding to the onomatopoeia determined by the evaluation unit 202.
- the onomatopoeia acquisition unit 300 outputs the acquired onomatopoeia data to the output unit 500.
- the output unit 500 outputs onomatopoeia data and notifies the rider of the evaluation result of the surrounding environment.
- the output unit 500 includes a speaker 501 and a display 502.
- When voice data is used as the onomatopoeia data, the speaker 501 is used as the output unit 500.
- When text data or image data is used as the onomatopoeia data, the display 502 is used as the output unit 500.
- In the text data, the onomatopoeia is recorded as character information; in the image data, the onomatopoeia is recorded as image information.
- the “environment evaluation module 10P” of the present embodiment includes the function of the “onomatopoeia presentation device regarding the evaluation result of the surrounding environment” of the present invention.
- the “measurement result input unit” included in the “onomatopoeia presentation device relating to the evaluation result of the surrounding environment” of the present invention corresponds to the “measurement information input unit 201” included in the “environment evaluation module 10P”.
- the “onomatopoeia data acquisition unit” included in the “onomatopoeia presentation device relating to the evaluation result of the surrounding environment” of the present invention corresponds to the “onomatopoeia acquisition unit 300” included in the “environment evaluation module 10P”.
- The function of the “onomatopoeia output unit” included in the “onomatopoeia presentation device relating to the evaluation result of the surrounding environment” of the present invention is included in the “onomatopoeia acquisition unit 300” of the “environment evaluation module 10P”.
- the “notification device” that outputs the onomatopoeia data by the “onomatopoeia presentation device regarding the evaluation result of the surrounding environment” of the present invention corresponds to the “output unit 500”.
- the “onomatopoeia presentation device regarding the evaluation result of the surrounding environment” of the present invention may include the measurement information database 250 or may hold the measurement information database 250 outside the device.
- the “onomatopoeia presentation device regarding the evaluation result of the surrounding environment” of the present invention may include the evaluation unit 202 or may use an evaluation unit outside the device.
- the evaluation unit may be placed on a server connected via a network.
- the “onomatopoeia presentation device regarding the evaluation result of the surrounding environment” of the present invention may include the onomatopoeia database 450 or may hold the onomatopoeia database 450 outside the apparatus.
- the “onomatopoeia presentation device regarding the evaluation result of the surrounding environment” of the present invention may include the image database 252 or may hold the image database 252 outside the device.
- the “onomatopoeia presentation device regarding the evaluation result of the surrounding environment” of the present invention may include the image processing unit 203 or may use an image processing unit outside the device.
- the image processing unit may be placed on a server connected via a network.
- the “onomatopoeia presentation device regarding the evaluation result of the surrounding environment” of the present invention may include the display position determination unit 204 or may hold the display position determination unit 204 outside the apparatus.
- the display position determination unit may be placed on a server connected via a network.
- FIG. 4 is a diagram showing a helmet 85 to be worn when the rider gets on the motorcycle 1.
- the helmet 85 is provided with left and right speakers 501 and 501.
- the rider can check the onomatopoeia output from the speakers 501 and 501 by voice while wearing the helmet 85 and operating the motorcycle 1.
- a part of the shield of the helmet 85 is used as the display 502.
- An image output from an AR (Augmented Reality) projector 5021 provided in the helmet 85 is displayed on the display 502.
- the rider can check the onomatopoeia output to the display 502 as an image while wearing the helmet 85 and operating the motorcycle 1.
- the AR projector 5021 of the present embodiment draws an onomatopoeia at a position where a straight line connecting the rider's viewpoint and the observation object intersects the display 502. That is, the AR projector 5021 performs control so that the onomatopoeia is displayed at the tip of the line of sight of the rider who views the observation object.
- the rider can naturally bring the observation object into the field of view.
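- To make the display-position geometry concrete, the following sketch computes the point where the line from the rider's eye to the observation object crosses the shield, modelled here as a plane for simplicity. All names and coordinate values are invented for the illustration; a real implementation would also have to account for the shield's curvature and for head tracking.
```python
import numpy as np
from typing import Optional

def onomatopoeia_display_position(eye: np.ndarray,
                                  obj: np.ndarray,
                                  plane_point: np.ndarray,
                                  plane_normal: np.ndarray) -> Optional[np.ndarray]:
    """Point where the line from the rider's eye to the observation object
    crosses the shield plane, or None if there is no forward intersection."""
    direction = obj - eye
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-9:          # line of sight parallel to the shield
        return None
    t = float(np.dot(plane_normal, plane_point - eye)) / denom
    if t <= 0.0:                   # shield is not in front of the eye
        return None
    return eye + t * direction     # draw the onomatopoeia near this point

# Example (metres; x forward, y left, z up; all values invented):
eye = np.array([0.0, 0.0, 1.4])
car = np.array([20.0, -2.0, 1.0])
shield_point = np.array([0.15, 0.0, 1.4])    # a point on the shield plane
shield_normal = np.array([1.0, 0.0, 0.0])    # plane x = 0.15
print(onomatopoeia_display_position(eye, car, shield_point, shield_normal))
# -> [ 0.15  -0.015  1.397]
```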
- the speaker that outputs the onomatopoeia is provided in the helmet worn by the rider, but the speaker may be provided in another location.
- a speaker may be provided near the front cowl or the handle.
- the helmet shield worn by the rider is used as the display for displaying onomatopoeia, but the display is not limited to this.
- However, in order to make it easy for the rider to grasp the onomatopoeia, it is desirable to arrange the display on a transparent or translucent member placed ahead of the rider's line of sight.
- The measurement information input unit 201, the evaluation unit 202, the image processing unit 203, the display position determination unit 204, and the onomatopoeia acquisition unit 300 are realized by software operating on a CPU (Central Processing Unit) that uses a memory as a work area.
- Components such as the measurement information input unit 201, the evaluation unit 202, the image processing unit 203, the display position determination unit 204, and the onomatopoeia acquisition unit 300, together with a work memory, a ROM, and the like, and memories storing the measurement information database 250, the determination criterion database 251, the image database 252, and the onomatopoeia database 450, are arranged on one or more boards as the environment evaluation module 10P. As shown in FIG. 2, in the present embodiment, the environment evaluation module 10P is disposed below the seat 81.
- A part or all of the measurement information input unit 201, the evaluation unit 202, the image processing unit 203, the display position determination unit 204, and the onomatopoeia acquisition unit 300 may be implemented in hardware. In this case as well, components such as the hardware realizing these units, together with the memories storing the measurement information database 250, the determination criterion database 251, the image database 252, and the onomatopoeia database 450, are arranged on one or more boards as the environment evaluation module 10P.
- the measurement information database 250 will be described with reference to FIG.
- the measurement information input unit 201 stores information other than image data in the measurement information database 250 among the measurement information input from the measurement unit 100. That is, the measurement information input unit 201 stores the measurement information input from the various sensors 102 to 109 in the measurement information database 250. Data stored in the measurement information database 250 is recorded as a record using time information as key information, as shown in FIG. That is, the measurement information acquired at each time is stored in the measurement information database 250 in association with the time information. The time information is acquired from a timer (not shown) provided in the environment evaluation module 10P.
- Latitude information and longitude information are measurement information acquired by the position information sensor 105.
- the temperature information is measurement information acquired by the temperature sensor 106.
- the humidity information is measurement information acquired by the humidity sensor 107.
- the weather information is acquired from the character information received by the FM receiver 108.
- FIG. 5 shows the information measured by only some of the sensors 102 to 109, but the measurement information acquired by each of the sensors 102 to 109 is stored in the measurement information database 250 in association with the time information.
- In FIG. 5, there are locations where NaN (Not a Number) is entered as the measurement information. This indicates cases where measurement information could not be acquired from the sensor.
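- The short sketch below illustrates one way such time-keyed records with NaN placeholders could be stored. It is only an illustration; the field names and the in-memory list are assumptions, not the actual measurement information database 250.
```python
import math
import time

measurement_db = []   # illustrative stand-in for the measurement information database 250

def store_measurement(readings: dict) -> None:
    """Store one record keyed by time; sensors that returned nothing
    are recorded as NaN, mirroring the NaN entries in FIG. 5."""
    measurement_db.append({
        "time": time.time(),
        "latitude": readings.get("latitude", math.nan),
        "longitude": readings.get("longitude", math.nan),
        "temperature": readings.get("temperature", math.nan),
        "humidity": readings.get("humidity", math.nan),
        "weather": readings.get("weather"),   # text from the FM multiplex broadcast
    })

# GPS fix unavailable in this cycle, so latitude/longitude become NaN.
store_measurement({"temperature": 23.5, "humidity": 60.0})
```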
- the image database 252 stores image data acquired by the image sensor 101.
- the format of the image data is not particularly limited, but for example, a format such as JPEG (Joint Photographic Experts Group) or TIFF (Tagged Image File Format) can be used.
- the image database 252 stores, for example, an image in front of the motorcycle 1 captured by the image sensor 101 at a frame rate such as 10 frames / second.
- In the present embodiment, the external information providing apparatus 10 needs to recognize observation objects such as automobiles and bicycles appearing on the route of the motorcycle 1 and to present the status of these observation objects to the rider as an environmental evaluation result. Therefore, in order to analyze the state of the observation objects as accurately as possible, it is desirable that the image data be acquired at a relatively high frame rate such as 10 frames per second.
- When the external information providing apparatus is used for other purposes and it is not necessary to present the status of the surrounding environment in real time, a low frame rate of the image data poses no problem. The frame rate of the image data may simply be determined according to the purpose for which the external information providing apparatus is used.
- The image data may be stored for a certain period; for example, data for the past 10 minutes is stored in the image database 252. The retention period may be set according to the purpose of use.
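- As an illustration of such a bounded retention policy, the sketch below keeps only the most recent frames in memory; the frame rate and retention period are the example values from the text, while the class itself and its methods are assumptions.
```python
from collections import deque

FRAME_RATE = 10           # frames per second (example value from the text)
RETENTION_SECONDS = 600   # keep the past 10 minutes, as in the example

class ImageBuffer:
    """Illustrative stand-in for the image database 252: keeps only the most
    recent RETENTION_SECONDS worth of frames at FRAME_RATE."""
    def __init__(self) -> None:
        self._frames: deque = deque(maxlen=FRAME_RATE * RETENTION_SECONDS)

    def add(self, timestamp: float, jpeg_bytes: bytes) -> None:
        self._frames.append((timestamp, jpeg_bytes))   # oldest frame drops out automatically

    def latest(self) -> tuple:
        return self._frames[-1]
```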
- the onomatopoeia database 450 stores text data, image data, audio data, and the like that record onomatopoeia.
- In the text data, a character string representing the onomatopoeia is recorded as character codes.
- In the image data, a character string representing the onomatopoeia is recorded as image information.
- As the image data format, JPEG (Joint Photographic Experts Group) or TIFF (Tagged Image File Format) can be used, and moving image data such as MPEG can also be used.
- In the audio data, the onomatopoeia is recorded as voice information.
- As the audio data format, MP3 (MPEG-1 Audio Layer-3) or AAC (Advanced Audio Coding) can be used.
- If text data or image data is stored for a certain onomatopoeia, the onomatopoeia acquisition unit 300 can output that onomatopoeia to the display 502. Further, if voice data is stored for a certain onomatopoeia, the onomatopoeia acquisition unit 300 can output the onomatopoeia to the speaker 501.
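- A minimal in-memory sketch of how entries in such an onomatopoeia database could be organized is shown below; the entries, keys, and file names are invented for the example and are not taken from the disclosure.
```python
# Illustrative in-memory stand-in for the onomatopoeia database 450.
# The entries, keys and file names are invented for the example.
onomatopoeia_db = {
    "pokapoka": {                # mimetic word for "warm"
        "text": "ぽかぽか",       # character-code (text) data
        "image": "pokapoka.png", # image data (e.g. JPEG or TIFF)
        "audio": "pokapoka.mp3", # voice data (e.g. MP3 or AAC)
    },
    "keen": {                    # mimetic word for "cold"
        "text": "キーン",
        "image": "keen.png",
        "audio": None,           # no voice data: display output only
    },
}

def acquire(onomatopoeia: str) -> dict:
    """Fetch whatever data exists for the onomatopoeia; audio would go to the
    speaker 501, text or image data to the display 502."""
    return onomatopoeia_db[onomatopoeia]
```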
- the measurement information input unit 201 stores the measurement information measured by the measurement unit 100 in the measurement information database 250.
- image data acquired by the image sensor 101 is stored in the image database 252 by the measurement information input unit 201.
- the image processing unit 203 executes various image analysis processes on the image data stored in the image database 252.
- the evaluation unit 202 refers to the measurement information database 250, the image analysis result in the image processing unit 203, and the determination criterion database 251, and evaluates the surrounding environment.
- the evaluation unit 202 determines an onomatopoeia corresponding to the evaluation of the surrounding environment.
- the evaluation unit 202 evaluates the surrounding environment based on a single measurement information stored in the measurement information database 250 for a certain environmental evaluation item. For example, the evaluation unit 202 evaluates the surrounding environment related to the temperature based on the temperature information measured by the temperature sensor 106.
- the evaluation unit 202 evaluates the surrounding environment based on a plurality of pieces of measurement information stored in the measurement information database 250 for other environment evaluation items. For example, the evaluation unit 202 evaluates the surrounding environment based on the temperature information measured by the temperature sensor 106 and the humidity information measured by the humidity sensor 107.
- The temperature information is information that can be used to evaluate the temperature.
- The humidity information is information that can be used to evaluate comfort.
- In this case, the evaluation unit 202 performs one environmental evaluation with respect to two environmental evaluation items, namely warmth and comfort.
- The evaluation unit 202 also evaluates the surrounding environment using calculated information, that is, information obtained by processing a single piece of measurement information stored in the measurement information database 250, rather than the measurement information as it is.
- the information obtained by processing the measurement information includes an image analysis result performed in the image processing unit 203 on the image data acquired by the image sensor 101.
- the evaluation unit 202 performs environment evaluation based on information on a moving object detected based on image data acquired by the image sensor 101.
- The evaluation unit 202 also evaluates the surrounding environment using a calculated value computed from a plurality of pieces of measurement information stored in the measurement information database 250 (that is, information obtained by processing the measurement information, not the raw values themselves).
- the information obtained by processing the measurement information includes an image analysis result performed in the image processing unit 203 on the image data acquired by the image sensor 101.
- For example, the evaluation unit 202 performs an environmental evaluation based on information on quietness calculated from the sound data measured by the voice sensor 102 and information on the degree of congestion calculated from the traffic congestion information acquired by the expressway information receiving unit 109.
- In this case, the evaluation unit 202 performs one environmental evaluation for the two environmental evaluation items of quietness and congestion.
- FIG. 7 is a diagram illustrating a determination criterion table T1 stored in the determination criterion database 251 that is referred to by the evaluation unit 202.
- the evaluation unit 202 performs environmental evaluation on a certain environmental evaluation item (in this case, environmental evaluation item A) based on single measurement information.
- In the criterion table T1, when a measurement value sa satisfies the condition sa < 3, the onomatopoeia X1 representing the environmental evaluation for environmental evaluation item A is associated.
- When the measurement value sa does not satisfy that condition (sa ≥ 3), the onomatopoeia X2 representing the environmental evaluation for environmental evaluation item A is associated.
- For example, the evaluation unit 202 performs an environmental evaluation on the temperature based on the temperature information.
- In this case, the environmental evaluation item A is an evaluation item related to the temperature.
- Onomatopoeia X1 is associated with “pokapoka” (the Japanese mimetic word “poka-poka”, indicating warmth), and onomatopoeia X2 is associated with “keen” (the Japanese mimetic word “keen”, indicating cold).
- the onomatopoeia X1 and the onomatopoeia X2 can express a multi-stage evaluation for one environmental evaluation item of the temperature.
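- A minimal sketch of this kind of single-item, threshold-based lookup follows; the concrete threshold value is invented for the example and is not taken from the table.
```python
def onomatopoeia_for_temperature(temp_c: float, threshold_c: float = 15.0) -> str:
    """Single evaluation item (temperature) with a single threshold,
    in the spirit of criterion table T1; the threshold value is invented."""
    return "pokapoka" if temp_c >= threshold_c else "keen"

print(onomatopoeia_for_temperature(22.0))   # -> pokapoka (warm)
print(onomatopoeia_for_temperature(3.0))    # -> keen (cold)
```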
- FIG. 11 shows evaluation items related to the strength of the wind.
- the evaluation unit 202 determines onomatopoeia according to the speed at which the tree recognized by the image analysis result swings.
- three threshold values Th1, Th2, and Th3 are set for the speed at which the tree swings.
- "Shin” a mimetic word “Shen” in Japanese voice
- "Saa” a mimetic word “Saa” in Japanese voice
- "Zawazawa” Japanese voice
- FIG. 12 shows evaluation items related to road conditions.
- the evaluation unit 202 determines the onomatopoeia according to the diameter of the stone falling on the road recognized by the image analysis result.
- two threshold values Th1 and Th2 are set for the stone diameter.
- "Korokoro" (the Japanese mimetic word "coro-coro"), "Gorogoro" (the Japanese mimetic word "goro-goro"), and "Gorongoron" (the Japanese mimetic word "goron-goron") are the three onomatopoeia associated with the respective ranges of the stone diameter.
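- The threshold-based selection described for the determination criterion table T1 and the examples of FIGS. 11 and 12 can be sketched as a simple band lookup. The following Python fragment is only an illustrative sketch, not the patent's implementation; the function name and the concrete threshold values (5 cm and 15 cm standing in for the unspecified Th1 and Th2 of FIG. 12) are assumptions introduced for the example.

```python
import bisect

def select_onomatopoeia(value, thresholds, labels):
    """Return the label of the band that `value` falls into.

    `thresholds` must be sorted in ascending order, and `labels` must have
    exactly len(thresholds) + 1 entries, one per band.
    """
    return labels[bisect.bisect_right(thresholds, value)]

# Hypothetical thresholds standing in for Th1 and Th2 of FIG. 12
# (stone diameter on the road surface, in centimetres).
STONE_THRESHOLDS_CM = [5.0, 15.0]
STONE_LABELS = ["korokoro", "gorogoro", "gorongoron"]

print(select_onomatopoeia(3.0, STONE_THRESHOLDS_CM, STONE_LABELS))   # korokoro
print(select_onomatopoeia(22.0, STONE_THRESHOLDS_CM, STONE_LABELS))  # gorongoron
```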
- FIG. 13 shows evaluation items related to the weather.
- the evaluation unit 202 determines the onomatopoeia according to the weather information obtained from the measurement information, the size of raindrops recognized from the image analysis result, or the density of rain. In the example of FIG. 13, four threshold values Th1, Th2, Th3, Th4 are set for the rainfall.
- FIG. 8 is a diagram illustrating a determination criterion table T2 stored in the determination criterion database 251 that is referred to by the evaluation unit 202.
- the evaluation unit 202 performs environment evaluation based on a plurality of pieces of measurement information for a plurality of environment evaluation items (here, the environment evaluation item B and the environment evaluation item C).
- In the determination criterion table T2, when a piece of measurement information (measured value) sb and a piece of measurement information (measured value) sc satisfy one combination of conditions with respect to the threshold values 2 and 12, the onomatopoeia X3 representing a single environmental evaluation for the two environmental evaluation items (the environmental evaluation item B and the environmental evaluation item C) is associated.
- For another combination of conditions, the onomatopoeia X4 representing a single environmental evaluation for the environmental evaluation items B and C is associated.
- For yet another combination of conditions, the onomatopoeia X5 is associated.
- the evaluation unit 202 performs environmental evaluation based on the temperature information measured by the temperature sensor 106 and the humidity information measured by the humidity sensor 107.
- In this case, the environmental evaluation item B is an evaluation item related to warmth,
- and the environmental evaluation item C is an evaluation item related to comfort.
- The onomatopoeia X3 is associated with "mushimushi" (the Japanese mimetic word "mushi-mushi", indicating warm and high humidity).
- The onomatopoeia X4 is associated with "pokapoka" (the Japanese mimetic word "poka-poka", indicating warm and low humidity).
- The onomatopoeia X5 is associated with "betabeta" (the Japanese mimetic word "beta-beta", indicating high humidity).
- The onomatopoeia X6 is associated with "sarasara" (the Japanese mimetic word "sara-sara", indicating cool and low humidity).
- In this way, the evaluation unit 202 performs a single environmental evaluation on the two environmental evaluation items of warmth and comfort, and associates a single onomatopoeia with it.
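- Producing one onomatopoeia for the combination of warmth and comfort, as in the determination criterion table T2, can be sketched as a lookup over two thresholded measurements. The following Python fragment is a minimal sketch; the function name and the 20 °C / 60 % thresholds are assumptions, since the patent only states that one onomatopoeia is associated with each combination of conditions on the two measured values.

```python
def onomatopoeia_for_climate(temperature_c, humidity_pct,
                             warm_threshold_c=20.0, humid_threshold_pct=60.0):
    """Map a (temperature, humidity) pair to a single onomatopoeia."""
    warm = temperature_c >= warm_threshold_c
    humid = humidity_pct >= humid_threshold_pct
    if warm and humid:
        return "mushimushi"   # warm and high humidity
    if warm:
        return "pokapoka"     # warm and low humidity
    if humid:
        return "betabeta"     # high humidity
    return "sarasara"         # cool and low humidity

print(onomatopoeia_for_climate(28.0, 75.0))  # mushimushi
print(onomatopoeia_for_climate(12.0, 30.0))  # sarasara
```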
- FIG. 15 shows an example of evaluating the road surface condition.
- evaluation regarding road surface luminance information and evaluation of temperature are performed.
- the evaluation unit 202 determines the onomatopoeia based on the road surface brightness information recognized from the image analysis result and the temperature information obtained from the measurement information.
- A threshold value Th1 is set for the luminance and a threshold value Th2 is set for the temperature.
- the state where the luminance exceeds the threshold Th1 means that a glossy substance such as water or ice is detected on the road surface. For example, 0 degree Celsius is set as the threshold value Th2 related to the temperature.
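- Combining an image-derived quantity with a sensor reading, as in the road-surface example of FIG. 15, can be sketched as follows. The luminance threshold and the return labels are assumptions introduced for illustration; only the 0 °C temperature threshold is taken from the text.

```python
def classify_road_surface(luminance, luminance_threshold, temperature_c,
                          freezing_point_c=0.0):
    """Classify the road surface from image luminance and air temperature.

    Luminance above the threshold is taken to indicate a glossy substance
    such as water or ice on the road; the temperature then separates a
    frozen surface from a merely wet one.
    """
    if luminance <= luminance_threshold:
        return "dry"
    return "frozen" if temperature_c <= freezing_point_c else "wet"

print(classify_road_surface(0.9, 0.7, -2.0))  # frozen
```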
- FIG. 9 is a diagram illustrating a determination criterion table T3 stored in the determination criterion database 251 that is referred to by the evaluation unit 202.
- The evaluation unit 202 evaluates the environment for a certain environmental evaluation item (here, the environmental evaluation item D) based on a calculated value (information obtained by processing the measurement information) computed from a single piece of measurement information.
- the information obtained by processing the measurement information includes an image analysis result performed in the image processing unit 203 on the image data acquired by the image sensor 101.
- FIG. 10 is a diagram illustrating a determination criterion table T4 stored in the determination criterion database 251 that is referred to by the evaluation unit 202.
- The evaluation unit 202 performs a single environmental evaluation for a plurality of environmental evaluation items (here, the environmental evaluation items E, F, and G) based on a plurality of calculated values (information obtained by processing the measurement information) computed from a plurality of pieces of measurement information.
- the information obtained by processing the measurement information includes an image analysis result performed in the image processing unit 203 on the image data acquired by the image sensor 101.
- the onomatopoeia X10 representing one environmental evaluation for the three environmental evaluation items E, F, and G is associated.
- the onomatopoeia X11 representing one environmental evaluation for the three environmental evaluation items E, F, and G is associated.
- Similarly, onomatopoeia are associated with other value ranges of the calculated values f(se), f(sf), and f(sg).
- In this way, the evaluation unit 202 refers to the measurement information database 250, the image analysis results produced by the image processing unit 203, and the determination criterion database 251, performs the environmental evaluation based on single or multiple pieces of measurement information, on calculated values computed from them, or on the image analysis results, and specifies the onomatopoeia representing the content of the environmental evaluation.
- In step S1, the measurement information input unit 201 inputs various measurement information from the measurement unit 100.
- the measurement information input unit 201 inputs measurement information measured by the sensors 101 to 109.
- In step S2, the measurement information input unit 201 stores the input measurement information in the measurement information database 250.
- step S1 when measurement information about a certain sensor cannot be acquired, NaN (Not a Number) is input as measurement information of the sensor.
- In step S3, the measurement information input unit 201 stores the input image data in the image database 252.
- In step S4, the evaluation unit 202 determines whether or not evaluation of the surrounding environment is possible. Whether the surrounding environment can be evaluated is checked at predetermined time intervals, such as one-second intervals.
- the evaluation unit 202 measures time with a timer (not shown) and determines whether or not the timing for evaluating the surrounding environment has arrived.
- Being able to evaluate the surrounding environment means that the measurement information needed for the items related to the surrounding environment, such as the weather, the road surface condition, and moving objects, is available and the environmental evaluation is ready to be performed.
- If the surrounding environment cannot be evaluated (NO in step S4), the process returns to step S1. If the surrounding environment can be evaluated (YES in step S4), the process proceeds to step S5. In step S5, the evaluation unit 202 acquires the evaluation result of the surrounding environment.
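- The measurement and evaluation loop of steps S1 to S5 can be sketched as below. The function and parameter names are assumptions, the one-second interval follows the text, and the image-data handling of step S3 is omitted for brevity.

```python
import time

EVALUATION_INTERVAL_S = 1.0  # the text checks the evaluation timing at one-second intervals

def read_sensor_or_nan(read_fn):
    """Return the sensor reading, or NaN when the sensor cannot be read (step S1)."""
    try:
        return read_fn()
    except Exception:
        return float("nan")

def measurement_loop(sensors, store, can_evaluate, evaluate, max_cycles=None):
    """Skeleton of steps S1 to S5: keep storing measurements and evaluate periodically."""
    last_check = time.monotonic()
    cycles = 0
    while max_cycles is None or cycles < max_cycles:
        record = {name: read_sensor_or_nan(fn) for name, fn in sensors.items()}
        store(time.time(), record)                      # step S2: store with time information
        now = time.monotonic()
        if now - last_check >= EVALUATION_INTERVAL_S:   # step S4: has the evaluation timing arrived?
            last_check = now
            if can_evaluate():
                evaluate()                              # step S5: acquire the evaluation result
        cycles += 1
```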
- FIG. 18 is a flowchart showing the specific processing contents of the evaluation result acquisition processing (step S5) in FIG.
- the evaluation unit 202 first evaluates the surrounding environment related to the moving object based on the image analysis result and measurement information by the image processing unit 203 (step S51). For example, the evaluation unit 202 acquires the image analysis result of the image processing unit 203 in order to evaluate the speed of a bicycle moving in front of the motorcycle 1. Alternatively, an image analysis result is acquired in order to evaluate the speed of a pedestrian moving around the motorcycle 1.
- The evaluation unit 202 evaluates the road condition based on the image analysis result produced by the image processing unit 203 and the measurement information (step S52). For example, based on information such as the color and brightness of the road obtained from the image analysis result and the temperature information obtained from the measurement information, the evaluation unit 202 evaluates whether the road surface is frozen, whether there is a puddle or snow on the road surface, and what size of stones are lying on the road surface.
- the evaluation unit 202 evaluates the surrounding environment regarding the weather condition based on the image analysis result and the measurement information by the image processing unit 203 (step S53). For example, the evaluation unit 202 performs weather-related evaluation based on the sky color obtained from the image analysis result and the weather information or temperature information obtained from the measurement information.
- the evaluation unit 202 evaluates the additional information based on the image analysis result and the measurement information by the image processing unit 203 (step S54). For example, the evaluation unit 202 evaluates the degree of congestion on the road based on the traffic jam information obtained from the measurement information.
- In step S6, the display position determination unit 204 determines a position for displaying the onomatopoeia. As shown in FIG. 19, the display position determination unit 204 determines the position where the line connecting the rider's viewpoint and the observation object intersects the display 502 as the display position of the onomatopoeia.
- the observation object is recognized by the image analysis process executed in the image processing unit 203.
- a bicycle is recognized by image analysis.
- the display position determination unit 204 holds in advance the position of the eyes of the rider who is riding the motorcycle 1.
- The display position determination unit 204 calculates a straight line connecting the position of the bicycle estimated from the image analysis result and the position of the rider's eyes, and determines the position where this straight line intersects the display 502 as the onomatopoeia display position.
- the display position determination unit 204 may use the distance measuring sensor 109 to specify the position of the observation object.
- The display position determination unit 204 corrects the position of the observation object estimated from the image analysis result using the distance to the observation object measured by the distance measuring sensor 109, so that an accurate distance to the observation object can be obtained.
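- The display position described above is the intersection of the line from the rider's eye to the observation object with the display. Treating the shield as a flat plane, a minimal sketch of the calculation is shown below; the coordinate frame, the function name, and the example numbers are assumptions (a real shield is curved, so this is only an approximation).

```python
def onomatopoeia_display_position(eye, target, plane_point, plane_normal):
    """Intersect the eye-to-target line with a planar display.

    All arguments are (x, y, z) tuples in the same vehicle-fixed frame.
    Returns the intersection point, or None if the line is parallel to the plane.
    """
    direction = tuple(t - e for t, e in zip(target, eye))
    denom = sum(n * d for n, d in zip(plane_normal, direction))
    if abs(denom) < 1e-9:
        return None
    t = sum(n * (p - e) for n, p, e in zip(plane_normal, plane_point, eye)) / denom
    return tuple(e + t * d for e, d in zip(eye, direction))

# Hypothetical geometry: eye 0.6 m behind a vertical display plane, bicycle 8 m ahead.
print(onomatopoeia_display_position(eye=(0.0, 0.0, 1.4),
                                    target=(8.0, -1.0, 1.0),
                                    plane_point=(0.6, 0.0, 1.2),
                                    plane_normal=(1.0, 0.0, 0.0)))
```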
- the display position determination unit 204 may specify the position of the observation object using the position information sensor 105.
- the display position determination unit 204 creates display position information and gives it to the evaluation unit 202.
- The evaluation unit 202 provides the display position information to the onomatopoeia determination unit 300 together with the evaluation result of the surrounding environment. Note that when the display position of the onomatopoeia does not need to be adjusted according to the position of an observation object, a default position is designated as the display position information. For example, the onomatopoeia is displayed at the default position when the display position carries no special meaning, such as for a congestion situation.
- In step S7, the onomatopoeia acquisition unit 300 determines whether or not the onomatopoeia determination timing has arrived.
- the onomatopoeia acquisition unit 300 determines that the onomatopoeia determination timing has arrived when the environmental evaluation result is acquired in step S5.
- If the onomatopoeia determination timing has not arrived (NO in step S7), the process returns to step S1 and measurement information continues to be input. Measurement information associated with time information is thus accumulated in the measurement information database 250.
- the evaluation unit 202 determines onomatopoeia based on the evaluation result of the surrounding environment obtained in step S5 (step S8 in FIG. 17). For example, the onomatopoeia is determined by executing the processing described with reference to FIGS.
- the onomatopoeia acquisition unit 300 accesses the onomatopoeia database 450 and acquires onomatopoeia data representing the onomatopoeia determined in step S8 (step S9).
- the onomatopoeia acquisition unit 300 outputs the acquired onomatopoeia data to the output unit 500 together with the display position information obtained in step S6 (step S10).
- the acquired onomatopoeia is output from the speaker 501, for example.
- the acquired onomatopoeia is output from the display 502.
- Onomatopoeia is displayed on the display 502 based on the display position information.
- FIG. 20 is a diagram illustrating an example of onomatopoeia displayed on the display 502 based on the display position information.
- FIG. 20 shows the scene as seen by the rider, that is, the view seen through the shield of the helmet 85.
- In this example, a bicycle is approaching on the road ahead of the motorcycle 1.
- The bicycle is recognized by the image processing unit 203, and the image analysis processing evaluates its moving speed as very slow. Therefore, the onomatopoeia determination unit 300 determines "sorosoro" (the Japanese mimetic word "soro-soro") as the onomatopoeia corresponding to the evaluated moving speed of the bicycle.
- The display position determination unit 204 sets the display position of the onomatopoeia to a position near the bicycle based on the image analysis result and the measurement result from the distance measuring sensor 103. As a result, as shown in FIG. 20, the onomatopoeia "sorosoro" is displayed near the bicycle.
- the left and right speakers 501 may be provided with directivity. Thereby, when outputting onomatopoeia with sound, the position of the observation object can be transmitted to the rider by adjusting the sound output from the left and right speakers 501. Specifically, it is possible to make the rider recognize the position of the sound source from the phase difference between the sounds output to the left and right speakers. By using this method, it is possible to transmit to the rider where the moving object appears and in which direction the puddle of the road surface exists.
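- The directional presentation through the left and right speakers can be sketched with a simple interaural time and level difference, as below. The speaker spacing and the constant-power pan law are assumptions; the patent only states that the phase difference between the left and right outputs lets the rider locate the sound source.

```python
import math

SPEED_OF_SOUND_M_S = 343.0
SPEAKER_SPACING_M = 0.18  # assumed distance between the left and right helmet speakers

def stereo_cues(azimuth_deg):
    """Per-channel gain and onset delay for a source at `azimuth_deg`.

    Zero degrees is straight ahead; positive angles are to the rider's right.
    A source on the right is rendered louder and earlier on the right channel.
    """
    az = math.radians(azimuth_deg)
    itd = SPEAKER_SPACING_M * math.sin(az) / SPEED_OF_SOUND_M_S  # interaural time difference
    pan = (math.sin(az) + 1.0) / 2.0                             # 0 = full left, 1 = full right
    left = {"gain": math.cos(pan * math.pi / 2), "delay_s": max(itd, 0.0)}
    right = {"gain": math.sin(pan * math.pi / 2), "delay_s": max(-itd, 0.0)}
    return left, right

print(stereo_cues(30.0))  # a source 30 degrees to the right: right channel louder and earlier
```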
- An example has been described in which components such as a memory storing the determination criterion database 251, the image database 252, and the onomatopoeia database 450 are arranged on one or more substrates as the environment evaluation module 10P.
- For example, the environment evaluation module 10P is disposed below the seat 81.
- the “onomatopoeia presentation device relating to the evaluation result of the surrounding environment” of the present invention is mounted on the motorcycle 1 as a vehicle part.
- a portable information terminal such as a smartphone may be used as the “environment evaluation module 10P” which is the “onomatopoeia presentation device regarding the evaluation result of the surrounding environment” of the present invention. That is, by attaching the smartphone held by the rider to the motorcycle 1, it can be used as an “environment evaluation module”.
- Riders can use the smartphone as an “environment evaluation module” by downloading and installing “environment evaluation module software” from the server on the network to the smartphone.
- a program for causing a computer to operate as an onomatopoeia presentation device related to an evaluation result of the surrounding environment is also included in the embodiment of the present invention.
- a computer-readable non-transitory recording medium in which such a program is recorded is also included in the embodiment of the present invention.
- the sensors 101 to 109 described in the above embodiment are “measurement devices” that can be used in the present invention.
- The "measuring device" that can be used in the present invention may be mounted on the motorcycle 1, like the sensors 101 to 109, or may not be mounted on the motorcycle 1.
- For example, various sensors mounted on a smart watch can be used as the "measuring device".
- an acceleration sensor, an image sensor, a biological information sensor, a GPS receiver, a temperature sensor, or the like included in the smart watch can be used.
- the measuring device can be a sensor that detects a state quantity of the surrounding environment that is of interest to the rider (user).
- FIG. 21 is a diagram illustrating a configuration example of the measurement apparatus 100, the environment evaluation module 10P, and the output unit 500 according to the embodiment.
- FIG. 21 is a diagram in which the smart watch 10W is used as the “measuring device” and the smartphone 10F is used as the “environment evaluation module” and the “output unit”.
- the rider uses the smart watch 10W as the measuring device 100A by mounting the smart watch 10W on the arm.
- the rider uses the smartphone 10F as the environment evaluation module 10PA by mounting the smartphone 10F installed with the “environment evaluation module software” on the vehicle. Further, the rider uses a speaker or a display included in the smartphone 10F as the output unit 500A.
- the smartphone 10F inputs various pieces of measurement information from the smart watch 10W functioning as the measurement device 100A using communication means such as Bluetooth.
- the smartphone 10F that functions as the environment evaluation module 10PA acquires onomatopoeia data based on the measurement information.
- the smartphone 10F outputs the acquired onomatopoeia as sound from a speaker included in the smartphone 10F.
- the smartphone 10F outputs onomatopoeia as an image to a display included in the smartphone 10F.
- Smart watch 10W may be used as an “output unit”.
- the smartphone 10F transmits the acquired onomatopoeia data to the smart watch 10W using communication means such as Bluetooth.
- the smart watch 10W outputs the received onomatopoeia data as a sound from a speaker included in the smart watch 10W or from a headphone connected to the smart watch 10W.
- the smart watch 10W outputs onomatopoeia as an image to a display included in the smart watch 10W.
- the smartphone 10F may output the acquired onomatopoeia data using a speaker or a display provided in the helmet using communication means such as Bluetooth.
- FIG. 22 is a diagram illustrating another configuration example of the measurement apparatus 100, the environment evaluation module 10P, and the output unit 500 according to the embodiment.
- FIG. 22 is a diagram in which the smart watch 10W is used as the "measuring device", and the smartphone 10F is used as part of the functions of the "environment evaluation module" and as the "output unit".
- FIG. 22 is a diagram using a computer server 10S connected via a network as an “evaluation unit”.
- the rider uses the smart watch 10W as the measuring device 100B by wearing the smart watch 10W on the arm.
- the rider uses the smartphone 10F as the environment evaluation module 10PB by mounting the smartphone 10F in which the “environment evaluation module software” is installed on the vehicle.
- the environment evaluation module 10PB does not have the function of the “evaluation unit”.
- the rider uses a speaker or a display included in the smartphone 10F as the output unit 500B.
- the computer server 10S connected via the network is used as the evaluation unit 202B.
- the smartphone 10F inputs various pieces of measurement information from the smart watch 10W functioning as the measurement device 100B using a communication means such as Bluetooth.
- The smartphone 10F functioning as the environment evaluation module 10PB receives the measurement information and transmits it to the computer server 10S.
- the computer server 10S functioning as the evaluation unit 202B evaluates the measurement information and determines onomatopoeia data.
- the smartphone 10F receives onomatopoeia data determination information from the computer server 10S.
- the smartphone 10F acquires onomatopoeia data based on the determination information from the onomatopoeia database.
- the smartphone 10F outputs the acquired onomatopoeia as sound from a speaker included in the smartphone 10F.
- the smartphone 10F outputs onomatopoeia as an image to a display included in the smartphone 10F.
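- The division of roles in FIG. 22, where the smartphone forwards measurement information to a server and only maps the returned determination back to onomatopoeia data, could look roughly like the following. The server URL, the JSON shape, and the tiny in-memory database are assumptions; the patent does not specify the communication protocol.

```python
import json
import urllib.request

ONOMATOPOEIA_DB = {"X3": "mushimushi", "X4": "pokapoka"}  # stand-in for the onomatopoeia database

def fetch_onomatopoeia(measurements, server_url):
    """Send measurement information to the evaluation server and resolve the
    returned determination identifier against the local onomatopoeia database."""
    body = json.dumps({"measurements": measurements}).encode("utf-8")
    request = urllib.request.Request(server_url, data=body,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request, timeout=5) as response:
        determination = json.load(response)          # e.g. {"onomatopoeia_id": "X3"}
    return ONOMATOPOEIA_DB.get(determination["onomatopoeia_id"])
```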
- a speedometer provided in the motorcycle 1 may be used.
- a display may be provided on a part of the speedometer to display onomatopoeia.
- Since the environment evaluation module 10P includes the onomatopoeia database 450, the onomatopoeia acquisition unit 300 can acquire onomatopoeia data in real time.
- As a result, the output unit 500 can present the onomatopoeia to the rider in real time as the environmental evaluation.
- the onomatopoeia database may be arranged on a server on the network without holding the onomatopoeia database in the motorcycle 1.
- the onomatopoeia acquisition unit 300 acquires onomatopoeia data using wireless communication via the network.
- the onomatopoeia as an environmental evaluation can be presented to the rider, although the real-time property of presenting the onomatopoeia data is reduced.
- the rider can receive onomatopoeia as an environmental evaluation while continuing the driving operation of the motorcycle 1.
- the onomatopoeia acquisition unit 300 may automatically generate onomatopoeia corresponding to environmental evaluation.
- Onomatopoeia are words that appeal to a shared human sensibility. It is therefore possible to build a database of the impressions people associate with a specific word or phrase.
- a plurality of existing onomatopoeia may be combined to generate a new onomatopoeia.
- For example, the onomatopoeia "bechabecha" (the Japanese mimetic word "becha-becha", indicating wetness)
- and the onomatopoeia "fuwafuwa" (the Japanese mimetic word "fuwa-fuwa", indicating a soft, cotton-like fluffiness) exist.
- Each of these onomatopoeia has its own implication. To describe a situation that has both of these implications, the two may be combined to generate and present a new onomatopoeia such as "bechafuwa" (the Japanese mimetic word "becha-fuwa"). Because the onomatopoeia acquisition unit 300 has such an automatic onomatopoeia generation function, it can present onomatopoeia for the environmental evaluation more appropriately.
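- A naive version of that combination rule, assuming the reduplicated romanised forms used in this description, might look as follows; the halving rule is an assumption introduced for illustration and would not generalise to non-reduplicated onomatopoeia.

```python
def combine_onomatopoeia(first, second):
    """Blend two reduplicated onomatopoeia by joining their leading halves.

    "bechabecha" contributes "becha" and "fuwafuwa" contributes "fuwa",
    giving the combined onomatopoeia "bechafuwa".
    """
    return first[: len(first) // 2] + second[: len(second) // 2]

print(combine_onomatopoeia("bechabecha", "fuwafuwa"))  # bechafuwa
```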
- The information providing apparatus described above is an information providing apparatus that presents the state of the surrounding environment to the user, and includes a measurement unit that measures the state of the surrounding environment,
- an evaluation unit that evaluates the state of the surrounding environment based on the measurement results of the measurement unit,
- a data acquisition unit that acquires onomatopoeia data corresponding to the evaluation performed by the evaluation unit, and an output device that outputs the onomatopoeia data acquired by the data acquisition unit to the user.
- If onomatopoeia were not used, the content of the environmental evaluation would have to be explained with long sentences or the like, and it would take the user time to grasp the content. It is also difficult to read and understand a long explanation while continuing some operation. In this respect, since an onomatopoeia expresses the evaluation of the environmental evaluation items concisely, the user can easily grasp the evaluation result.
- The measurement results (for example, sensor information) are also used, so that environmental situations that are difficult to judge from the image entering the observer's field of view, such as road conditions and wind strength, can be presented to the observer.
- Situations that cannot be judged by the user's visual observation, or that can be predicted to some extent by visual observation but cannot be judged accurately, can thus also be communicated to the user.
- the evaluation unit performs one evaluation for a plurality of evaluation items
- the data acquisition unit acquires one onomatopoeia data for the plurality of evaluation items.
- the evaluation unit performs multi-stage evaluation on one evaluation item
- the data acquisition unit acquires onomatopoeia data representing multi-stage evaluation on one evaluation item.
- the output device includes an image display unit that outputs onomatopoeia data as character information.
- Onomatopoeia representing the situation of the surrounding environment can be transmitted through the user's vision.
- the output device includes a voice output unit that outputs onomatopoeia data as voice information.
- An onomatopoeia representing the situation of the surrounding environment can be transmitted through the user's hearing.
- the evaluation unit evaluates the surrounding environment by comparing the measurement result or a value calculated based on the measurement result with a predetermined threshold value.
- An environmental evaluation result can be obtained in real time by comparing the measurement result and the like with a predetermined threshold.
- The user includes the driver of a vehicle, and the image display unit is provided on a transparent or translucent member disposed ahead of the driver's line of sight.
- the driver can confirm the onomatopoeia displayed at the end of the line of sight.
- the member includes a helmet shield worn by the driver.
- the driver can check the onomatopoeia displayed on the shield of the helmet positioned ahead of the line of sight.
- the state of the surrounding environment includes the state of the object existing around the vehicle
- the information providing apparatus further includes a position specifying unit that specifies the position of the object
- the image display unit displays character information in the vicinity of a region where a straight line connecting the driver's viewpoint and the position of the object specified by the position specifying unit intersects the member.
- the driver can recognize the object by turning his / her line of sight in the direction in which the onomatopoeia as character information is displayed.
- the user includes a vehicle driver
- the audio output unit includes a speaker provided in the vehicle.
- the driver can grasp the information with the onomatopoeia output from the speaker.
- the user includes a driver of the vehicle, and the audio output unit includes a speaker provided in a helmet worn by the driver.
- the driver can grasp the information by onomatopoeia output from the speaker by wearing the helmet.
- the state of the surrounding environment includes the state of the object existing around the vehicle
- the information providing apparatus further includes a position specifying unit that specifies the position of the object
- the audio output unit includes a directional audio output unit having directivity, and the directional audio output unit outputs audio information so that the position of the object specified by the position specifying unit becomes a sound source.
- the driver can recognize that the object is present in the direction in which the onomatopoeia as voice information is output.
- the output apparatus outputs the onomatopoeia data acquired by the data acquisition unit in real time.
- the evaluation result of the surrounding environment can be quickly presented to the user.
- the information providing apparatus further includes a storage unit that accumulates onomatopoeia data, and the data acquisition unit acquires onomatopoeia data from the storage unit.
- the evaluation result of the surrounding environment can be presented to the user in real time.
- The data acquisition unit acquires onomatopoeia data from an external server using wireless communication. This makes it possible to use shared onomatopoeia data and to share the latest onomatopoeia database.
- the data acquisition unit automatically generates onomatopoeia data according to the evaluation.
- An onomatopoeia that describes the situation of the surrounding environment in more detail can be presented.
- the external information providing apparatus 10 mounted on the motorcycle 1 has been described as an example of the external information providing apparatus of the present invention. That is, the embodiment in which the surrounding environment of the motorcycle 1 is evaluated and the evaluation result is provided to the rider by onomatopoeia has been described as an example.
- the external information providing apparatus of the present invention may be mounted on a vehicle other than a motorcycle such as an automobile or a bicycle.
- the onomatopoeia can present the evaluation result of the surrounding environment to the person who rides the vehicle.
- a person who is not on the vehicle, that is, a person who is walking or running may wear the external information providing apparatus of the present invention. In this case, it is possible to present the evaluation result of the surrounding environment to the person who is walking or running by onomatopoeia.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
- Audible And Visible Signals (AREA)
- Stereophonic System (AREA)
Abstract
Description
An onomatopoeia presentation device relating to the evaluation result of the surrounding environment, which is one embodiment of the present invention, includes a measurement result input unit, an onomatopoeia data acquisition unit, and an onomatopoeia output unit.
The measurement result input unit receives measurement results, measured by a measuring device, regarding the state of the user's surrounding environment.
The onomatopoeia data acquisition unit acquires onomatopoeia data indicating an onomatopoeia corresponding to an evaluation of the state of the surrounding environment performed based on the measurement results received by the measurement result input unit.
The onomatopoeia output unit outputs the onomatopoeia data acquired by the onomatopoeia data acquisition unit to a notification device capable of notifying the user by voice or text, and causes the notification device to present the onomatopoeia corresponding to the evaluation of the surrounding environment to the user by voice or text while the user is acting in the surrounding environment.
In the onomatopoeia presentation device relating to the evaluation result of the surrounding environment of configuration 1 above, the onomatopoeia data acquisition unit may acquire onomatopoeia data indicating a single onomatopoeia for a plurality of evaluation items, obtained by performing a single evaluation on the plurality of evaluation items.
In the onomatopoeia presentation device relating to the evaluation result of the surrounding environment of configuration 1 or 2 above, the onomatopoeia data acquisition unit may acquire onomatopoeia data indicating an onomatopoeia that expresses, in multiple stages, an evaluation of a single evaluation item obtained by performing a multi-stage evaluation on that item.
In the onomatopoeia presentation device relating to the evaluation result of the surrounding environment of any one of configurations 1 to 3 above, the onomatopoeia data acquired by the onomatopoeia data acquisition unit may be data indicating an onomatopoeia corresponding to an evaluation of the surrounding environment obtained by comparing the measurement result, or a value calculated based on the measurement result, with a predetermined threshold value.
In the onomatopoeia presentation device relating to the evaluation result of the surrounding environment of any one of configurations 1 to 4 above, the user may include a driver of a vehicle, and the notification device may include an image display unit provided on a transparent or translucent member disposed ahead of the driver's line of sight.
In the onomatopoeia presentation device relating to the evaluation result of the surrounding environment of configuration 5 above, the state of the surrounding environment may include the state of an object present around the vehicle, the onomatopoeia presentation device may further include a position specifying unit that specifies the position of the object, and the image display unit may display the onomatopoeia as text in the vicinity of a region where a straight line connecting the driver's viewpoint and the position of the object specified by the position specifying unit intersects the member.
In the onomatopoeia presentation device relating to the evaluation result of the surrounding environment of any one of configurations 1 to 6 above, the user may include a driver of a vehicle, and the notification device may include an audio output unit provided in a helmet worn by the driver.
In the onomatopoeia presentation device relating to the evaluation result of the surrounding environment of configuration 7 above, the state of the surrounding environment may include the state of an object present around the vehicle, the onomatopoeia presentation device may further include a position specifying unit that specifies the position of the object, the audio output unit may include a directional audio output unit having directivity, and the directional audio output unit may output the onomatopoeia as audio so that the position of the object specified by the position specifying unit becomes the sound source.
The onomatopoeia presentation program relating to the evaluation result of the surrounding environment of the present invention causes a computer to execute a measurement result input process, an onomatopoeia data acquisition process, and an onomatopoeia output process.
The measurement result input process receives measurement results, measured by a measuring device, regarding the state of the user's surrounding environment.
The onomatopoeia data acquisition process acquires onomatopoeia data indicating an onomatopoeia corresponding to an evaluation of the state of the surrounding environment performed based on the measurement results received in the measurement result input process.
The onomatopoeia output process outputs the onomatopoeia data acquired by the onomatopoeia data acquisition unit to a notification device capable of notifying the user by voice or text, and causes the notification device to present the onomatopoeia corresponding to the evaluation of the surrounding environment to the user by voice or text while the user is acting in the surrounding environment.
In the onomatopoeia presentation program relating to the evaluation result of the surrounding environment of configuration 9 above, the onomatopoeia data acquisition process may be a process of acquiring onomatopoeia data indicating a single onomatopoeia for a plurality of evaluation items, obtained by performing a single evaluation on the plurality of evaluation items.
In the onomatopoeia presentation program relating to the evaluation result of the surrounding environment of configuration 9 or 10 above, the onomatopoeia data acquisition process may be a process of acquiring onomatopoeia data indicating an onomatopoeia that expresses, in multiple stages, an evaluation of a single evaluation item obtained by performing a multi-stage evaluation on that item.
In the onomatopoeia presentation program relating to the evaluation result of the surrounding environment of any one of configurations 9 to 11 above, the onomatopoeia data acquired in the onomatopoeia data acquisition process may be obtained as the evaluation of the surrounding environment by comparing the measurement result, or a value calculated based on the measurement result, with a predetermined threshold value.
The onomatopoeia presentation method relating to the evaluation result of the surrounding environment of the present invention includes a measurement result input step, an onomatopoeia data acquisition step, and an onomatopoeia output step.
The measurement result input step receives measurement results, measured by a measuring device, regarding the state of the user's surrounding environment.
The onomatopoeia data acquisition step acquires onomatopoeia data indicating an onomatopoeia corresponding to an evaluation of the state of the surrounding environment performed based on the measurement results received in the measurement result input step.
The onomatopoeia output step outputs the acquired onomatopoeia data to a notification device capable of notifying the user by voice or text, and causes the notification device to present the onomatopoeia corresponding to the evaluation of the surrounding environment to the user by voice or text while the user is acting in the surrounding environment.
In the onomatopoeia presentation method relating to the evaluation result of the surrounding environment of configuration 13 above, the onomatopoeia data acquisition step may be a step of acquiring onomatopoeia data indicating a single onomatopoeia for a plurality of evaluation items, obtained by performing a single evaluation on the plurality of evaluation items.
In the onomatopoeia presentation method relating to the evaluation result of the surrounding environment of configuration 13 or 14 above, the onomatopoeia data acquisition step may be a step of acquiring onomatopoeia data indicating an onomatopoeia that expresses, in multiple stages, an evaluation of a single evaluation item obtained by performing a multi-stage evaluation on that item.
In the onomatopoeia presentation method relating to the evaluation result of the surrounding environment of any one of configurations 13 to 15 above, the onomatopoeia data acquired in the onomatopoeia data acquisition step may be obtained as the evaluation of the surrounding environment by comparing the measurement result, or a value calculated based on the measurement result, with a predetermined threshold value.
The external information providing apparatus of the present embodiment is mounted on a motorcycle. The rider of a motorcycle grasps the situation of the surrounding environment and performs driving operations according to the grasped situation. The external information providing apparatus evaluates the surrounding environment and acquires an onomatopoeia corresponding to the content of the evaluation. The motorcycle provides the onomatopoeia acquired by the external information providing apparatus to the rider as the evaluation result of the surrounding environment. The rider, while operating the motorcycle and continuing to ride, obtains information indicating the evaluation result of the surrounding environment expressed by the onomatopoeia.
The definitions of terms used in this specification will be described with reference to FIG. 1. An onomatopoeic (or mimetic sound) word is a word that expresses an animal's cry or the sound made by an object. Examples of onomatopoeic words include "wanwan" ("wan-wan" in Japanese speech, "bow-wow" in English: the sound of a dog barking), "gachan" ("gacyan" in Japanese speech, "clash" in English: the sound of pottery or the like breaking), and "gatagata" ("gata-gata" in Japanese speech, "bonk" in English: the rattling sound a desk makes when its legs are of different lengths).
The configuration of the motorcycle 1 according to the present embodiment will be described with reference to FIG. 2. In the present embodiment, as shown in FIG. 2, the motorcycle 1 is a naked-type motorcycle. However, the external information providing apparatus according to the present embodiment is applicable to all types of motorcycles, such as racer types and cruiser (American) types. Alternatively, the external information providing apparatus according to the present embodiment can also be applied to a scooter.
As shown in FIG. 2, the motorcycle 1 of the present embodiment is equipped with a large number of sensors 101 to 109. The image sensor 101 is provided at the front of the vehicle body. The image sensor 101 is a sensor capable of imaging the area ahead of the vehicle; for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary MOS) image sensor can be used. The audio sensor 102 is a microphone provided near the handlebar 71 that picks up the sound of the surrounding environment or the rider's voice. The image sensor 101 and the audio sensor 102 may be mounted at other locations.
Next, the configuration of the external information providing apparatus 10 mounted on the motorcycle 1 will be described with reference to FIG. 3. The external information providing apparatus 10 of the present embodiment includes a measurement unit 100, an environment evaluation module 10P, and an output unit 500. The environment evaluation module 10P includes a recognition unit 200, an onomatopoeia acquisition unit 300, and an onomatopoeia database 450.
The measurement information database 250 will be described with reference to FIG. 5. Of the measurement information received from the measurement unit 100, the measurement information input unit 201 stores the information other than the image data in the measurement information database 250. That is, the measurement information input unit 201 stores the measurement information received from the various sensors 102 to 109 in the measurement information database 250. As shown in FIG. 5, the data stored in the measurement information database 250 is recorded as records keyed by time information. That is, the measurement information acquired at each time is stored in the measurement information database 250 in association with the time information. The time information is obtained from a timer (not shown) provided in the environment evaluation module 10P.
The image database 252 stores the image data acquired by the image sensor 101. The format of the image data is not particularly limited; for example, formats such as JPEG (Joint Photographic Experts Group) or TIFF (Tagged Image File Format) can be used.
The onomatopoeia database 450 will be described with reference to FIG. 6. The onomatopoeia database 450 stores text data, image data, audio data, and the like in which onomatopoeia are recorded. In the text data, character strings representing onomatopoeia are recorded as character codes. In the image data, character strings representing onomatopoeia are recorded as image information. As the image data format, JPEG (Joint Photographic Experts Group), TIFF (Tagged Image File Format), and the like can be used, but the format is not particularly limited and various formats are usable. Moving-image data such as MPEG can also be used as the image data. In the audio data, character strings representing onomatopoeia are recorded as audio information. As the audio data format, MP3 (MPEG-1 Audio Layer-3), AAC (Advanced Audio Coding), and the like can be used, but the format is not particularly limited and various formats are usable.
Next, the processing method of the environmental evaluation will be described. As described above, the measurement information measured by the measurement unit 100 is stored in the measurement information database 250 by the measurement information input unit 201. In addition, the image data acquired by the image sensor 101 is stored in the image database 252 by the measurement information input unit 201. The image processing unit 203 executes various image analysis processes on the image data stored in the image database 252. The evaluation unit 202 refers to the measurement information database 250, the image analysis results from the image processing unit 203, and the determination criterion database 251, and evaluates the surrounding environment. The evaluation unit 202 then determines the onomatopoeia corresponding to the evaluation of the surrounding environment.
Next, the flow of processing of the external information providing apparatus according to the present embodiment will be described with reference to the flowcharts of FIGS. 16 and 17.
10 External information providing apparatus
10P Environment evaluation module
250 Measurement information database
251 Determination criterion database
252 Image database
450 Onomatopoeia database
Claims (15)
- A measurement result input unit to which measurement results regarding a state in the user's surrounding environment measured by a measuring device are input;
an onomatopoeia data acquisition unit that acquires onomatopoeia data indicating an onomatopoeia corresponding to an evaluation of the state of the surrounding environment performed based on the measurement results input by the measurement result input unit; and
an onomatopoeia output unit that outputs the onomatopoeia data acquired by the onomatopoeia data acquisition unit to a notification device capable of notifying the user by voice or text, and causes the notification device to present the onomatopoeia corresponding to the evaluation of the surrounding environment to the user by voice or text while the user is acting in the surrounding environment;
an onomatopoeia presentation device relating to the evaluation result of the surrounding environment comprising the above. - The onomatopoeia presentation device relating to the evaluation result of the surrounding environment according to claim 1,
wherein the onomatopoeia data acquisition unit acquires onomatopoeia data indicating a single onomatopoeia for a plurality of evaluation items, obtained by performing a single evaluation on the plurality of evaluation items. - The onomatopoeia presentation device relating to the evaluation result of the surrounding environment according to claim 1 or 2,
wherein the onomatopoeia data acquisition unit acquires onomatopoeia data indicating an onomatopoeia that expresses, in multiple stages, an evaluation of a single evaluation item obtained by performing a multi-stage evaluation on that item. - The onomatopoeia presentation device relating to the evaluation result of the surrounding environment according to any one of claims 1 to 3,
wherein the onomatopoeia data acquired by the onomatopoeia data acquisition unit is data indicating an onomatopoeia corresponding to an evaluation of the surrounding environment obtained by comparing the measurement result, or a value calculated based on the measurement result, with a predetermined threshold value. - The onomatopoeia presentation device relating to the evaluation result of the surrounding environment according to any one of claims 1 to 4,
wherein the user includes a driver of a vehicle, and
the notification device includes an image display unit provided on a transparent or translucent member disposed ahead of the driver's line of sight. - The onomatopoeia presentation device relating to the evaluation result of the surrounding environment according to claim 5,
wherein the state of the surrounding environment includes a state of an object present around the vehicle,
the onomatopoeia presentation device relating to the evaluation result of the surrounding environment further includes
a position specifying unit that specifies the position of the object,
and
the image display unit displays the onomatopoeia as text in the vicinity of a region where a straight line connecting the driver's viewpoint and the position of the object specified by the position specifying unit intersects the member. - The onomatopoeia presentation device relating to the evaluation result of the surrounding environment according to any one of claims 1 to 4,
wherein the user includes a driver of a vehicle, and
the notification device includes an audio output unit provided in a helmet worn by the driver. - The onomatopoeia presentation device relating to the evaluation result of the surrounding environment according to claim 7,
wherein the state of the surrounding environment includes a state of an object present around the vehicle,
the onomatopoeia presentation device relating to the evaluation result of the surrounding environment further includes
a position specifying unit that specifies the position of the object,
and
the audio output unit includes a directional audio output unit having directivity, and
the directional audio output unit outputs the onomatopoeia as audio so that the position of the object specified by the position specifying unit becomes the sound source. - A measurement result input process of inputting measurement results regarding a state in the user's surrounding environment measured by a measuring device;
an onomatopoeia data acquisition process of acquiring onomatopoeia data indicating an onomatopoeia corresponding to an evaluation of the state of the surrounding environment performed based on the measurement results input in the measurement result input process; and
an onomatopoeia output process of outputting the onomatopoeia data acquired by the onomatopoeia data acquisition unit to a notification device capable of notifying the user by voice or text, and causing the notification device to present the onomatopoeia corresponding to the evaluation of the surrounding environment to the user by voice or text while the user is acting in the surrounding environment;
an onomatopoeia presentation program relating to the evaluation result of the surrounding environment that causes a computer to execute the above. - The onomatopoeia presentation program relating to the evaluation result of the surrounding environment according to claim 9,
wherein the onomatopoeia data acquisition process acquires onomatopoeia data indicating a single onomatopoeia for a plurality of evaluation items, obtained by performing a single evaluation on the plurality of evaluation items. - The onomatopoeia presentation program relating to the evaluation result of the surrounding environment according to claim 9 or 10,
wherein the onomatopoeia data acquisition process acquires onomatopoeia data indicating an onomatopoeia that expresses, in multiple stages, an evaluation of a single evaluation item obtained by performing a multi-stage evaluation on that item. - The onomatopoeia presentation program relating to the evaluation result of the surrounding environment according to any one of claims 9 to 11,
wherein the onomatopoeia data acquired in the onomatopoeia data acquisition process is data indicating an onomatopoeia corresponding to an evaluation of the surrounding environment obtained by comparing the measurement result, or a value calculated based on the measurement result, with a predetermined threshold value. - A measurement result input step of inputting measurement results regarding a state in the user's surrounding environment measured by a measuring device;
an onomatopoeia data acquisition step of acquiring onomatopoeia data indicating an onomatopoeia corresponding to an evaluation of the state of the surrounding environment performed based on the measurement results input in the measurement result input step; and
an onomatopoeia output step of outputting the onomatopoeia data acquired in the onomatopoeia data acquisition step to a notification device capable of notifying the user by voice or text, and causing the notification device to present the onomatopoeia corresponding to the evaluation of the surrounding environment to the user by voice or text while the user is acting in the surrounding environment;
an onomatopoeia presentation method relating to the evaluation result of the surrounding environment comprising the above. - The onomatopoeia presentation method relating to the evaluation result of the surrounding environment according to claim 13,
wherein the onomatopoeia data acquisition step acquires onomatopoeia data indicating a single onomatopoeia for a plurality of evaluation items, obtained by performing a single evaluation on the plurality of evaluation items. - The onomatopoeia presentation method relating to the evaluation result of the surrounding environment according to claim 13 or 14,
wherein the onomatopoeia data acquisition step acquires onomatopoeia data indicating an onomatopoeia that expresses, in multiple stages, an evaluation of a single evaluation item obtained by performing a multi-stage evaluation on that item.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| ES16873106T ES2903165T3 (es) | 2015-12-11 | 2016-12-09 | Dispositivo para presentar una onomatopeya referente a resultados de evaluar un entorno circundante y método para presentar una onomatopeya |
| EP16873106.5A EP3389023B1 (en) | 2015-12-11 | 2016-12-09 | Device for presenting an onomatopoeia pertaining to results of evaluating a surrounding environment and method for presenting an onomatopoeia |
| JP2017555155A JP6917311B2 (ja) | 2015-12-11 | 2016-12-09 | 周辺環境の評価結果に関するオノマトペ提示装置、オノマトペ提示プログラム及びオノマトペ提示方法 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2015242385 | 2015-12-11 | ||
| JP2015-242385 | 2015-12-11 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017099213A1 true WO2017099213A1 (ja) | 2017-06-15 |
Family
ID=59014149
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2016/086696 Ceased WO2017099213A1 (ja) | 2015-12-11 | 2016-12-09 | 周辺環境の評価結果に関するオノマトペ提示装置 |
Country Status (4)
| Country | Link |
|---|---|
| EP (1) | EP3389023B1 (ja) |
| JP (1) | JP6917311B2 (ja) |
| ES (1) | ES2903165T3 (ja) |
| WO (1) | WO2017099213A1 (ja) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110549353A (zh) * | 2018-05-31 | 2019-12-10 | 国立大学法人名古屋大学 | 力觉视觉化装置、机器人以及存储力觉视觉化程序的计算机可读介质 |
| WO2021240901A1 (ja) * | 2020-05-28 | 2021-12-02 | 本田技研工業株式会社 | 鞍乗り型車両 |
| WO2022102446A1 (ja) * | 2020-11-12 | 2022-05-19 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、情報処理システム、及びデータ生成方法 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007334149A (ja) * | 2006-06-16 | 2007-12-27 | Akira Hata | 聴覚障害者用ヘッドマウントディスプレイ装置 |
| JP2013063718A (ja) * | 2011-09-20 | 2013-04-11 | Nippon Seiki Co Ltd | 車両用警告装置 |
| WO2013190637A1 (ja) * | 2012-06-19 | 2013-12-27 | 三菱電機株式会社 | 擬音発生システムおよび地図データベース |
| JP2015003707A (ja) | 2013-06-24 | 2015-01-08 | 株式会社デンソー | ヘッドアップディスプレイ、及びプログラム |
| JP2015136953A (ja) * | 2014-01-20 | 2015-07-30 | アルパイン株式会社 | 二輪車両情報提供装置 |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0971198A (ja) * | 1995-09-06 | 1997-03-18 | Mazda Motor Corp | 障害物確認装置 |
| JP4660872B2 (ja) * | 2000-02-09 | 2011-03-30 | ソニー株式会社 | 運転支援装置及び運転支援方法 |
| US7561966B2 (en) * | 2003-12-17 | 2009-07-14 | Denso Corporation | Vehicle information display system |
| JP2010231337A (ja) * | 2009-03-26 | 2010-10-14 | Mazda Motor Corp | 車両の車線逸脱警報装置 |
| US9064152B2 (en) * | 2011-12-01 | 2015-06-23 | Elwha Llc | Vehicular threat detection based on image analysis |
| US9417838B2 (en) * | 2012-09-10 | 2016-08-16 | Harman International Industries, Incorporated | Vehicle safety system using audio/visual cues |
- 2016-12-09 JP JP2017555155A patent/JP6917311B2/ja active Active
- 2016-12-09 ES ES16873106T patent/ES2903165T3/es active Active
- 2016-12-09 EP EP16873106.5A patent/EP3389023B1/en active Active
- 2016-12-09 WO PCT/JP2016/086696 patent/WO2017099213A1/ja not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007334149A (ja) * | 2006-06-16 | 2007-12-27 | Akira Hata | 聴覚障害者用ヘッドマウントディスプレイ装置 |
| JP2013063718A (ja) * | 2011-09-20 | 2013-04-11 | Nippon Seiki Co Ltd | 車両用警告装置 |
| WO2013190637A1 (ja) * | 2012-06-19 | 2013-12-27 | 三菱電機株式会社 | 擬音発生システムおよび地図データベース |
| JP2015003707A (ja) | 2013-06-24 | 2015-01-08 | 株式会社デンソー | ヘッドアップディスプレイ、及びプログラム |
| JP2015136953A (ja) * | 2014-01-20 | 2015-07-30 | アルパイン株式会社 | 二輪車両情報提供装置 |
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110549353A (zh) * | 2018-05-31 | 2019-12-10 | 国立大学法人名古屋大学 | 力觉视觉化装置、机器人以及存储力觉视觉化程序的计算机可读介质 |
| KR20210055650A (ko) * | 2018-05-31 | 2021-05-17 | 도요타지도샤가부시키가이샤 | 역각 시각화 장치, 로봇 및 역각 시각화 프로그램 |
| US11279037B2 (en) | 2018-05-31 | 2022-03-22 | National University Corporation Nagoya University | Force-sense visualization apparatus, robot, and force-sense visualization program |
| KR102452924B1 (ko) * | 2018-05-31 | 2022-10-11 | 도요타지도샤가부시키가이샤 | 역각 시각화 장치, 로봇 및 역각 시각화 프로그램 |
| CN110549353B (zh) * | 2018-05-31 | 2022-10-14 | 国立大学法人名古屋大学 | 力觉视觉化装置、机器人以及存储力觉视觉化程序的计算机可读介质 |
| WO2021240901A1 (ja) * | 2020-05-28 | 2021-12-02 | 本田技研工業株式会社 | 鞍乗り型車両 |
| JPWO2021240901A1 (ja) * | 2020-05-28 | 2021-12-02 | ||
| JP7312321B2 (ja) | 2020-05-28 | 2023-07-20 | 本田技研工業株式会社 | 鞍乗り型車両 |
| US12377928B2 (en) | 2020-05-28 | 2025-08-05 | Honda Motor Co., Ltd. | Saddle-ride type vehicle with microphone located on operation-portion support portion |
| WO2022102446A1 (ja) * | 2020-11-12 | 2022-05-19 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、情報処理システム、及びデータ生成方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3389023A4 (en) | 2019-04-24 |
| EP3389023B1 (en) | 2021-12-08 |
| EP3389023A1 (en) | 2018-10-17 |
| ES2903165T3 (es) | 2022-03-31 |
| JP6917311B2 (ja) | 2021-08-11 |
| JPWO2017099213A1 (ja) | 2018-11-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10496889B2 (en) | Information presentation control apparatus, autonomous vehicle, and autonomous-vehicle driving support system | |
| US10079929B2 (en) | Determining threats based on information from road-based devices in a transportation-related context | |
| US10453260B2 (en) | System and method for dynamic in-vehicle virtual reality | |
| US20160378112A1 (en) | Autonomous vehicle safety systems and methods | |
| JP7521920B2 (ja) | 運転支援装置及びデータ収集システム | |
| CN108538086A (zh) | 辅助驾驶员进行道路车道变更 | |
| KR20240091285A (ko) | 개인 이동성 시스템을 이용한 증강 현실 강화된 게임플레이 | |
| CN118020052B (zh) | 基于ar的个人移动系统的性能调节 | |
| CN104067326A (zh) | 用户辅助的位置情况标识 | |
| WO2017172142A1 (en) | Preceding traffic alert system and method | |
| WO2017128801A1 (zh) | 一种智能头戴设备和智能头戴设备的控制方法 | |
| JP6653398B2 (ja) | 情報提供システム | |
| JPWO2018179305A1 (ja) | 走行経路提供システムおよびその制御方法、並びにプログラム | |
| US12462502B2 (en) | AR odometry sensor fusion using personal vehicle sensor data | |
| JP6558356B2 (ja) | 自動運転システム | |
| US20210209949A1 (en) | Roadside apparatus and vehicle-side apparatus for road-to-vehicle communication, and road-to-vehicle communication system | |
| JP6917311B2 (ja) | 周辺環境の評価結果に関するオノマトペ提示装置、オノマトペ提示プログラム及びオノマトペ提示方法 | |
| JP7656676B2 (ja) | 交通安全支援システム及びコンピュータプログラム | |
| JP2021534490A (ja) | 自律型車両のための行列の検出および行列に対する応答 | |
| CN118476207B (zh) | 用于使用增强现实设备定位个人移动系统的方法、计算装置和非暂态计算机可读存储介质 | |
| JP2022047580A (ja) | 情報処理装置 | |
| KR20240124386A (ko) | 개인 차량으로부터의 센서 데이터를 이용한 주행거리 측정 | |
| JP2014134503A (ja) | 疲労軽減サポート装置 | |
| US20240362925A1 (en) | Situational awareness systems and methods and micromobility platform | |
| Kashevnik et al. | Context-based driver support system development: Methodology and case study |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16873106; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2017555155; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWE | Wipo information: entry into national phase | Ref document number: 2016873106; Country of ref document: EP |
| | ENP | Entry into the national phase | Ref document number: 2016873106; Country of ref document: EP; Effective date: 20180711 |