
WO2014034779A1 - Personal characteristic estimation device and personal characteristic estimation method - Google Patents

Personal characteristic estimation device and personal characteristic estimation method

Info

Publication number
WO2014034779A1
Authority
WO
WIPO (PCT)
Prior art keywords
characteristic
scene
instantaneous
driver
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2013/073140
Other languages
English (en)
Japanese (ja)
Inventor
Mitsunobu Kaminuma (神沼 充伸)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Publication of WO2014034779A1 publication Critical patent/WO2014034779A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W40/09Driving style or behaviour
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/165Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera

Definitions

  • The present invention relates to a personal characteristic estimation device and a personal characteristic estimation method for estimating a driver's characteristics based on the driver's operation information, the vehicle's behavior, and the driving scene when the driver drives the vehicle.
  • Patent Document 1 describes an apparatus for estimating driver characteristics (a general term for personality, tendency, temperament, etc.) based on behavior when a driver drives a vehicle. Patent Document 1 discloses recognizing a driver's habit based on the driving operation when driving the vehicle, and supporting driving based on that habit.
  • However, Patent Document 1 merely recognizes a driver's habit and supports driving; it does not estimate the basic characteristics of the driver.
  • An object of the present invention is to provide a personal characteristic estimation device and a personal characteristic estimation method capable of estimating a basic characteristic of a driver.
  • A personal characteristic estimation device includes: vehicle information acquisition means for acquiring vehicle information; scene estimation means for estimating, based on the vehicle information, a scene in which the vehicle travels; instantaneous characteristic model storage means for setting a plurality of scenes and a plurality of characteristic tendency labels indicating an instantaneous characteristic tendency of the driver, and storing, as an instantaneous characteristic model, a characteristic model associating each scene with the characteristic tendency labels easily expressed in that scene; instantaneous characteristic estimation means for estimating the driver's instantaneous characteristic tendency by referring to the instantaneous characteristic model based on the scene estimated by the scene estimation means; and characteristic tendency estimation means for statistically analyzing the instantaneous characteristic tendencies estimated by the instantaneous characteristic estimation means and estimating a static characteristic tendency of the driver.
  • The personal characteristic estimation method sets a plurality of scenes, sets a plurality of characteristic tendency labels indicating an instantaneous characteristic tendency of the driver, and stores, as an instantaneous characteristic model, a characteristic model associating each scene with the characteristic tendency labels easily expressed in that scene. The scene in which the vehicle travels is estimated based on the vehicle information, the driver's instantaneous characteristic tendency is estimated by referring to the instantaneous characteristic model based on the estimated scene, and a static characteristic tendency of the driver is estimated by statistically analyzing the estimated instantaneous characteristic tendencies.
  • FIG. 1 is a block diagram showing the configuration of the personal characteristic estimation apparatus according to the first embodiment of the present invention.
  • FIG. 2 is an explanatory diagram schematically showing processing of the estimation unit of the personal characteristic estimation device according to the first embodiment of the present invention.
  • FIG. 3 is a flowchart showing a processing procedure of the personal characteristic estimation apparatus according to the first embodiment of the present invention.
  • FIG. 4 is a block diagram showing a configuration of the personal characteristic estimation apparatus according to the second embodiment of the present invention.
  • FIG. 5 is an explanatory diagram schematically showing an instantaneous characteristic model of the personal characteristic estimation device according to the second embodiment of the present invention.
  • FIG. 6 is an explanatory diagram schematically showing how a characteristic model is selected by the model switching unit of the personal characteristic estimation device according to the second embodiment of the present invention.
  • FIG. 7 is a flowchart showing a processing procedure of the personal characteristic estimation apparatus according to the second embodiment of the present invention.
  • FIG. 8A is an explanatory diagram illustrating an example of a likelihood threshold according to each embodiment of the present invention.
  • FIG. 8B is an explanatory diagram illustrating an example of a calculated likelihood value according to each embodiment of the present invention.
  • FIG. 9A is an explanatory diagram illustrating a relationship between a frequency threshold value and a frequency result according to each embodiment of the present invention.
  • FIG. 9B is an explanatory diagram illustrating a relationship between a frequency threshold value and a frequency result according to each embodiment of the present invention.
  • FIG. 10 is an explanatory diagram showing a relationship between a plurality of scenes and a characteristic tendency label that can be detected in each scene according to each embodiment of the present invention.
  • FIG. 11A is an explanatory diagram showing the situation of the scene (1) according to each embodiment of the present invention.
  • FIG. 11B is an explanatory diagram showing the situation of the scene (2) according to each embodiment of the present invention.
  • FIG. 11C is an explanatory diagram showing a situation of the scene (3) according to each embodiment of the present invention.
  • FIG. 11D is an explanatory diagram showing a situation of the scene (4) according to each embodiment of the present invention.
  • FIG. 12A is an explanatory diagram showing the situation of the scene (5) according to each embodiment of the present invention.
  • FIG. 12B is an explanatory diagram showing the situation of the scene (6) according to each embodiment of the present invention.
  • FIG. 12C is an explanatory diagram showing the situation of the scene (7) according to each embodiment of the present invention.
  • FIG. 12D is an explanatory diagram showing a situation of the scene (8) according to each embodiment of the present invention.
  • FIG. 13 is an explanatory diagram showing the relationship between the driving signal and the driver's static characteristics according to each embodiment of the present invention.
  • FIG. 14A is an explanatory diagram showing a procedure for designing an environmental corpus according to each embodiment of the present invention.
  • FIG. 14B is an explanatory diagram showing a procedure for designing an environmental corpus according to each embodiment of the present invention.
  • FIG. 15 is an explanatory diagram showing a procedure for designing a characteristic corpus according to each embodiment of the present invention.
  • FIG. 16 is an explanatory diagram showing the structure of the environmental model and the characteristic model according to each embodiment of the present invention.
  • FIG. 1 is a block diagram showing the configuration of the personal characteristic estimation apparatus according to this embodiment.
  • The personal characteristic estimation apparatus 100 includes a signal detection unit (vehicle information acquisition means) 11, an estimation unit 12, an instantaneous characteristic model storage unit (instantaneous characteristic model storage means) 15, and a tendency estimation unit (characteristic tendency estimation means) 16.
  • The estimation unit 12 includes a scene estimation unit (scene estimation means) 13 and an instantaneous characteristic estimation unit (instantaneous characteristic estimation means) 14.
  • the signal detection unit 11 acquires information related to the operation of the vehicle and information related to the behavior of the vehicle from a CAN (Controller Area Network) mounted on the vehicle.
  • Examples of information relating to the operation of the vehicle include the brake operation, the steering angle, and the accelerator opening.
  • Examples of information regarding the behavior of the vehicle include the traveling speed of the vehicle, the yaw rate, and the discharge level of the battery.
  • the signal detection unit 11 acquires information on the outside of the vehicle.
  • the signal detection unit 11 acquires information related to the external environment based on a surrounding image captured by an in-vehicle camera (not shown) or a traveling path of the host vehicle detected by a GPS device or the like.
  • the signal detection unit 11 acquires information related to operation history of the navigation device, the air conditioner, and the audio mounted on the vehicle.
  • Hereinafter, the information related to the operation of the vehicle, the information related to the behavior of the vehicle, and the information related to the operation history are collectively referred to as a "vehicle signal".
  • "Vehicle information" is a concept indicating at least one of the information regarding the operation of the vehicle, the information regarding the behavior of the vehicle, the information regarding the operation history, and the information on the outside of the vehicle.
  • the instantaneous characteristic model storage unit 15 stores a plurality of preset scenes (scenes (1) to (8) shown in FIGS. 11A to 12D described later).
  • The instantaneous characteristic model storage unit 15 sets a plurality of characteristic tendency labels indicating the characteristics of the driver, and stores an environmental model used for estimating the scene and, as an instantaneous characteristic model, a characteristic model associating each scene with the characteristic tendency labels (hereinafter abbreviated as "labels") easily expressed in that scene.
  • Specifically, as shown in the correspondence table of FIG. 10, in which a plurality of scenes related to vehicle travel are taken in the row direction and a plurality of labels are taken in the column direction, the instantaneous characteristic model storage unit 15 stores, as the instantaneous characteristic model, the environmental model used for estimating the scene and the characteristic models, each associating a scene with a label (a driver's instantaneous characteristic) prominently expressed in that scene.
  • In the row direction, scenes (1) to (8) are set, classified according to conditions such as road shape, time zone, traffic jam information, external state, presence of pedestrians, presence of vehicles, and presence of oncoming vehicles.
  • In the column direction, various characteristic tendency labels (1) to (9) indicating the characteristics of the driver are set.
  • One or more labels that are easily expressed in each scene are indicated by circles. For example, for the scene (1) in which the vehicle travels on a "crosswalk without a signal on a narrow street" (FIG. 11A), the instantaneous characteristic model storage unit 15 stores two characteristic models in which the labels (1) and (4) are associated with the scene (1).
  • Since the estimation unit 12 uses a characteristic model combining a scene and a characteristic (label) when estimating the instantaneous characteristic of the driver, the instantaneous characteristic model storage unit 15 stores sixteen characteristic models, corresponding to the number of circles shown in FIG. 10.
  • the correspondence table shown in FIG. 10 is merely an example.
  • The "determinism" column of label (9) is not circled for scene (1). This does not mean that a driver characteristic such as determinism is difficult to express in the scene (1); rather, data indicating the relationship between the scene (1) and the label (9) has not yet been obtained. Therefore, the correspondence table shown in FIG. 10 may be changed by future research.
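  • The correspondence table of FIG. 10 can be sketched as a simple lookup structure. The following Python encoding is illustrative only: the text states only the scene (1) association, and all identifier names are assumptions.

```python
# A plausible encoding of the FIG. 10 correspondence table. Only the
# association for scene (1) is stated in the text; names are illustrative.
CORRESPONDENCE = {
    "scene_1_crosswalk_without_signal_on_narrow_street": [
        "label_1_other_person_empathy",
        "label_4_non_empathy",
    ],
    # scenes (2) to (8) would be filled in the same way
}

def labels_for(scene):
    """Return the labels that are easily expressed in the given scene."""
    return CORRESPONDENCE.get(scene, [])
```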
  • A static characteristic here is one detected by a general, static characteristic tendency measuring method such as the AQ test (Autism-Spectrum Quotient).
  • Such a characteristic is adopted when it is greater than a predetermined value (for example, 0.2 or higher).
  • the “instantaneous characteristic” is an instantaneous characteristic estimated based on the behavior of the driver.
  • A "static characteristic", which is a personal characteristic of the driver (a general term for personality, tendency, temperament, etc.), is obtained based on a number of "instantaneous characteristics".
  • The scene estimation unit 13 shown in FIG. 1 estimates the scene in which the host vehicle travels by referring to the environmental model stored in the instantaneous characteristic model storage unit 15, based on the vehicle signal (information on vehicle operation, information on vehicle behavior, and information on operation history) detected by the signal detection unit 11 and the information on the outside of the vehicle.
  • the scene estimation unit 13 estimates a scene by grasping a situation in which the host vehicle is currently traveling based on a surrounding image captured by an in-vehicle camera (not shown).
  • The scene estimation unit 13 refers to the environmental model and estimates to which of the scenes (1) to (8), indicated in the row direction of the correspondence table shown in FIG. 10, the current situation corresponds.
  • The instantaneous characteristic estimation unit 14 estimates the instantaneous characteristics of the driver by referring to the characteristic models stored in the instantaneous characteristic model storage unit 15, based on the vehicle signal detected by the signal detection unit 11 and the scene estimated by the scene estimation unit 13.
  • Specifically, the instantaneous characteristic estimation unit 14 obtains the likelihood of each label using an HMM (Hidden Markov Model) algorithm (an algorithm for statistically analyzing characteristic tendencies) or the like based on the vehicle signal. The detailed procedure will be described later.
  • the personal characteristic estimation apparatus 100 can be configured as an integrated computer including a central processing unit (CPU), RAM, ROM, and storage means such as a hard disk.
  • The general questionnaire C1 asks the subject (the driver in this embodiment) about his/her behavior and introspection when a specific environment is given, and estimates the static characteristics of the subject from the answers. Therefore, in order to estimate, from a driving operation signal or a vehicle operation signal (hereinafter referred to as a driving signal), a static characteristic similar to that estimated by the general questionnaire C1, it suffices to be able to estimate, based on the driving signal, the specific environment given by the questionnaire C1 and the behavior of the driver when that environment is given.
  • Since driving behavior is determined instantaneously by the driver's characteristics, the environment, surrounding conditions, traffic rules, and manners, the driver's instantaneous characteristics can be detected by grasping the driving behavior.
  • However, since the driving signal itself does not include information representing the relationship between driving behavior and the driver's static characteristics, a driving-version questionnaire C2 is required to represent that relationship.
  • In the driving questionnaire C2, a question such as "Do you think pedestrians should be given way in any case when they are trying to cross?" is given to the subject.
  • The subject answers the question while considering the actual driving scene, for example "I would stop before the pedestrian crossing" or "I would go first while observing the pedestrian."
  • the static characteristics of the driver can be estimated by observing the driving behavior described in the driving questionnaire C2.
  • For example, when the driver stops before the pedestrian crossing, a signal indicating a temporary stop after stepping on the brake appears before the pedestrian crossing; likewise, "going first while observing the pedestrian" is expressed in the driving signal.
  • Thereby, the driving signal C3 produced when the driver takes an action in a specific environment can be extracted, and a label explaining the driving action (for example, "other-person empathy") can be given.
  • As a result, the driving behavior can be estimated from the driving signal alone, and the instantaneous characteristics of the driver, observed instantaneously, can be estimated from the driving behavior.
  • Consequently, static characteristics equivalent to the driver's static characteristics expected from the score (answer result) of the psychological test (general questionnaire C1) can be estimated automatically.
  • the estimation unit 12 first divides the operation signal input from the signal detection unit 11 for each unit time and performs environment recognition.
  • The estimation unit 12 uses an environmental model based on an HMM (Hidden Markov Model) and, based on the input driving signal, recognizes one pattern from among patterns prepared in advance that represent a plurality of environments, such as road shapes (right/left turns, narrow roads) and traffic jam situations.
  • the estimation unit 12 selects a characteristic model based on the environment represented by the recognized pattern.
  • For example, when the recognized environment is "right turn", the estimation unit 12 selects a characteristic model learned in advance so that the instantaneous characteristics of the driver can be recognized in the "right turn" environment.
  • the estimation unit 12 recognizes the instantaneous characteristics of the driver observed instantaneously using the selected characteristic model.
  • Each time a recognition result is obtained, the recognized instantaneous characteristic is accumulated.
  • the tendency estimation unit 16 obtains the driver's static characteristics based on the occurrence frequency and distribution of a plurality of recognized instantaneous characteristics.
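  • The per-unit-time loop described above can be sketched as follows. This is an illustrative outline only: the function names and the stand-in recognizers are assumptions, and the real recognizers would be the HMM-based environmental and characteristic models.

```python
# Minimal sketch of the described flow: divide the driving signal per unit
# time, recognize the environment, select the characteristic model learned
# for that environment, and accumulate the recognized instantaneous
# characteristics. Recognizers are placeholders, not the patent's models.
def estimation_loop(driving_signal, unit_time, recognize_environment,
                    characteristic_models, accumulated):
    for start in range(0, len(driving_signal), unit_time):
        segment = driving_signal[start:start + unit_time]
        env = recognize_environment(segment)        # e.g. "right_turn"
        model = characteristic_models[env]          # model learned for env
        accumulated.append(model(segment))          # instantaneous characteristic
    return accumulated
```

The tendency estimation step would then derive a static characteristic from the occurrence frequency and distribution of the accumulated results.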
  • the environmental corpus and the characteristic corpus are designed in advance by the following method.
  • An environmental model is obtained by pre-learning based on an environmental corpus, and a characteristic model is obtained by pre-learning based on a characteristic corpus.
  • FIGS. 14A and 14B are explanatory diagrams showing examples of collection and extraction of driving signals (vehicle speed in this example) in three environments by each of driver numbers D1 to Dn.
  • Environment S1 shown in FIG. 14A represents a straight-road traffic jam, S2 a narrow-road crossing, and S3 a right turn at a T-junction.
  • FIG. 14B shows how vehicle speed data is cut out for each environment (S1 to S3) and environmental symbols (straight-road traffic jam, narrow-road crossing, T-junction right turn) are attached to the cut-out vehicle speed data.
  • For the characteristic corpus, driving signals in a plurality of environments are collected from each of the drivers D1 to Dn over a long period, and vehicle speed data is extracted and classified for each environmental symbol. The vehicle speed data collected for each environmental symbol is further classified by driving action corresponding to the driver's instantaneous characteristics, and a characteristic symbol is given to the vehicle speed data collected for each driving action.
  • FIG. 15 shows how characteristic symbols (empathetic behavior, non-empathetic behavior) are given to the vehicle speed data collected for each driving behavior (a1, a2) when the environmental symbol is "narrow-road crossing, with pedestrians".
  • FIG. 16 is an explanatory diagram showing the structure of the environmental model and the characteristic model; both use the Left_to_Right model used in speech recognition.
  • For each symbol, the transition probabilities between states and the output probabilities observed when an input parameter series is given are learned using each corpus, thereby generating the instantaneous characteristic model.
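  • The left-to-right topology mentioned above constrains each HMM state to transition only to itself or to the next state. A minimal sketch of such a transition matrix follows; the state count and probabilities are illustrative, not taken from the patent.

```python
# Sketch of a left-to-right ("Left_to_Right") HMM transition structure as
# used in speech recognition: each state transitions only to itself or to
# the next state. Probabilities here are illustrative placeholders; in the
# described system they would be learned from the corpora.
def left_to_right_transitions(n_states, stay_prob=0.6):
    A = [[0.0] * n_states for _ in range(n_states)]
    for i in range(n_states - 1):
        A[i][i] = stay_prob            # remain in the current state
        A[i][i + 1] = 1.0 - stay_prob  # advance to the next state
    A[-1][-1] = 1.0                    # final state is absorbing
    return A
```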
  • In step S11, the signal detection unit 11 acquires various vehicle signals and information on the outside of the vehicle. Specifically, it acquires, via the CAN mounted on the vehicle, vehicle signals such as the accelerator opening, brake operation, steering angle, vehicle speed, and yaw rate, as well as external video signals captured by an in-vehicle camera and current location information from GPS.
  • In step S12, the signal detection unit 11 calculates, and uses as parameters, the primary difference between signals acquired at a certain time interval (referred to as a difference Δ) and the secondary difference between the primary differences (referred to as a difference ΔΔ).
  • For example, for the accelerator opening, the signal detection unit 11 obtains a difference Δ1 at times t1 and t2, a difference Δ2 at times t2 and t3, a difference Δ3 at times t3 and t4, and a difference Δ4 at times t4 and t5. Further, the signal detection unit 11 obtains a difference ΔΔ1 between the differences Δ1 and Δ2, a difference ΔΔ2 between the differences Δ2 and Δ3, and a difference ΔΔ3 between the differences Δ3 and Δ4.
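  • The Δ/ΔΔ parameterization described for step S12 amounts to taking first and second differences of the sampled signal. A minimal sketch (function name is an assumption):

```python
# Sketch of the step-S12 parameterization: first differences of the signal
# sampled at a fixed time interval (Δ), then differences of those
# differences (ΔΔ).
def delta_features(samples):
    delta = [b - a for a, b in zip(samples, samples[1:])]    # Δ1, Δ2, ...
    delta_delta = [b - a for a, b in zip(delta, delta[1:])]  # ΔΔ1, ΔΔ2, ...
    return delta, delta_delta
```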
  • Next, the scene estimation unit 13 refers to the environmental model and estimates the scene in which the host vehicle is traveling based on the various vehicle signals detected by the signal detection unit 11 and the information on the outside of the vehicle.
  • Then, the instantaneous characteristic estimation unit 14 refers to the characteristic model and calculates the likelihood of each label associated with the estimated scene (the labels circled in the correspondence table shown in FIG. 10).
  • Specifically, the instantaneous characteristic estimation unit 14 calculates the likelihood of each label using an HMM (Hidden Markov Model) algorithm based on the difference Δ and the difference ΔΔ obtained in step S12.
  • That is, the instantaneous characteristic estimation unit 14 obtains the likelihood of each label by substituting the data of the difference Δ and the difference ΔΔ obtained in step S12 into the HMM algorithm. Since the details of the HMM algorithm are well known, they are not described here.
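  • The per-label scoring and selection of the maximum-likelihood label can be sketched as follows. Note that the scoring function below is a toy stand-in (a squared-distance log-likelihood around a per-label mean), not the patent's HMM; only the argmax-over-labels structure reflects the described procedure.

```python
# Sketch: score each candidate label's model against the Δ/ΔΔ parameters
# and pick the label with the maximum likelihood. A real implementation
# would evaluate each label's HMM; the per-label-mean scoring here is an
# illustrative placeholder.
def best_label(params, label_means):
    def log_lik(mean):
        # proportional to a Gaussian log-likelihood around `mean`
        return -sum((p - mean) ** 2 for p in params)
    scores = {label: log_lik(mean) for label, mean in label_means.items()}
    label = max(scores, key=scores.get)
    return label, scores[label]
```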
  • For example, when the scene estimation unit 13 estimates that the vehicle is traveling on the "crosswalk without a signal on a narrow road" of scene (1) in the correspondence table shown in FIG. 10, the likelihood of "other-person empathy" of label (1) and the likelihood of "non-empathy" of label (4) are calculated.
  • In step S14, the instantaneous characteristic estimation unit 14 determines whether the likelihood of the label having the maximum likelihood exceeds a preset threshold value.
  • For example, in the case of scene (1) in the correspondence table shown in FIG. 10, the likelihood threshold for "other-person empathy" of label (1) and the likelihood threshold for "non-empathy" of label (4) are set to "3000" and "2000" shown in FIG. 8A, respectively.
  • the instantaneous characteristic estimation unit 14 sets a histogram for each label.
  • In step S15, the tendency estimation unit 16 increments the numerical value of the histogram of the label having the maximum likelihood.
  • For example, assume that at detection time t1 the processing by the HMM algorithm yields a likelihood of "2500" for "other-person empathy" of label (1) and "2300" for "non-empathy" of label (4). Although label (1) has the maximum likelihood at detection time t1, its likelihood "2500" does not exceed the likelihood threshold "3000" (NO in step S14), so the tendency estimation unit 16 does not increment the numerical value of the histogram of label (1).
  • On the other hand, when label (4) has the maximum likelihood and its likelihood exceeds the threshold "2000", the tendency estimation unit 16 increments the numerical value of the histogram of label (4).
  • the histogram is an accumulation of the number of times that the likelihood exceeds the likelihood threshold.
  • the histogram is stored in a memory or the like.
  • FIG. 9A is an explanatory diagram showing the relationship between the frequency threshold and the frequency result.
  • FIG. 9B is an explanatory diagram showing a frequency threshold and a histogram of frequency results.
  • In this way, the tendency estimation unit 16 increments the numerical value of the histogram of the label with the maximum likelihood when that likelihood exceeds the likelihood threshold.
  • That is, the tendency estimation unit 16 increments the "frequency result" value shown in FIGS. 9A and 9B.
  • In step S16, the tendency estimation unit 16 determines whether the maximum frequency value appearing in the histogram exceeds a preset frequency threshold value.
  • In the example shown in FIG. 9B, the frequency result of "non-empathy" of label (4), which is the maximum frequency value, exceeds the frequency threshold (here, the frequency threshold is "4"). Therefore, the tendency estimation unit 16 determines that the maximum frequency value appearing in the histogram exceeds the frequency threshold (YES in step S16).
  • In step S17, the tendency estimation unit 16 outputs "non-empathy" of label (4) as a static characteristic of the driver. That is, the tendency estimation unit 16 determines that the static characteristic of the driver who drives the vehicle is "non-empathy", and outputs the determination result to a subsequent device (not shown).
  • Note that the frequency results shown in the histogram of FIG. 9B increase with time and eventually exceed the frequency threshold. To avoid this, the frequency results may be accumulated for a certain period of time, or for every predetermined number of detections, and a value obtained by normalizing the accumulated data may be used as the frequency result.
  • Scene (1) is the situation shown in FIG. 11A.
  • Next, the scenes (2) to (8) in the correspondence table shown in FIG. 10 will be described with reference to FIGS. 11B to 12D.
  • FIG. 11B shows a situation in which the host vehicle passes through the “crosswalk without a stop line on a narrow road” in the scene (2).
  • FIG. 11C shows a situation in which the vehicle makes a “left turn from a narrow road to a wide road” in the scene (3).
  • FIG. 11D shows a situation in which the host vehicle passes through “a pedestrian crossing on a one-lane road with good visibility” in scene (4).
  • FIG. 12A shows a situation where the host vehicle makes a “left turn from a wide road to a narrow road” in the scene (5).
  • FIG. 12B shows a situation in which the host vehicle performs the “straight line in a congested narrow road” in the scene (6).
  • FIG. 12C shows a situation in which the host vehicle travels in “Merge with heavy traffic” in scene (7).
  • FIG. 12D shows a situation in which the host vehicle performs the “lane changing action in a traffic jam” of the scene (8).
  • The instantaneous characteristic estimation unit 14 then calculates the likelihood only for the labels associated with the scene estimated by the scene estimation unit 13 among scenes (1) to (8) (the labels marked with circles in the correspondence table shown in FIG. 10).
  • As described above, the personal characteristic estimation device 100 stores, as an instantaneous characteristic model, an environmental model used for estimating a scene and a characteristic model in which each scene is associated with the labels (the driver's instantaneous characteristics) that are prominently expressed in that scene.
  • The personal characteristic estimation device 100 then estimates the scene in which the host vehicle travels based on the vehicle information, with reference to the environmental model.
  • Further, the personal characteristic estimation device 100 refers to the characteristic model and calculates the likelihood of each label associated with the estimated scene, using an HMM algorithm or the like, based on the information operated by the driver.
  • the personal characteristic estimation device 100 increments the numerical value of the histogram of the label having the maximum likelihood.
  • The personal characteristic estimation device 100 then determines that the label having the maximum frequency value (the driver's instantaneous characteristic) is the static characteristic of the driver.
  • Thereby, the personal characteristic estimation device 100 can estimate the driver's static characteristics (disposition, tendency, temperament, etc.) with high accuracy, based on the behavior the driver takes while driving the vehicle and the surrounding environment of the vehicle.
  • For example, the estimated static characteristic can be used for guidance given to the driver via an HMI (Human Machine Interface). When the navigation device detects a traffic jam and searches for a detour, the following three guidances are conceivable: (1) explain to the driver the reason for detouring and automatically set the detour; (2) let the driver select whether to take the detour or the original route; (3) automatically set the detour without explaining anything to the driver. If the driver's static characteristic estimated by the personal characteristic estimation device 100 is used, an appropriate guidance can be selected from guidances (1) to (3) according to the driver's static characteristic.
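The guidance selection described above can be sketched as a simple mapping. The specific assignment of characteristics to guidances below is a hypothetical example; the description does not fix any particular mapping.

```python
def select_guidance(static_characteristic):
    """Choose one of the three detour guidances (1)-(3) according to the
    driver's estimated static characteristic. The mapping is illustrative."""
    if static_characteristic == "other-person sympathy":
        # (1) Explain the reason for detouring and set the detour automatically.
        return 1
    if static_characteristic == "non-sympathy":
        # (3) Set the detour automatically without any explanation.
        return 3
    # (2) Otherwise, let the driver choose between the detour and the route.
    return 2

choice = select_guidance("non-sympathy")
```

A navigation device could call such a function whenever a detour candidate is found, instead of always using one fixed guidance style.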
  • Conventionally, the driver's static characteristics have been estimated by a psychological method such as having a subject answer an AQ test questionnaire.
  • However, such a conventional method has the problems that a true answer cannot always be obtained and that personal information may leak.
  • In contrast, the personal characteristic estimation device 100 of the present embodiment estimates the static characteristics of the driver based on the driver's behavior in each scene while driving the vehicle, so the static characteristics of the driver can be estimated accurately and at low cost.
  • Further, since the instantaneous characteristic model storage unit 15 stores a characteristic model that combines each scene with its characteristics (labels), the personal characteristic estimation device 100 can reduce the required storage capacity.
  • The signal detection unit 11 acquires at least one signal among the accelerator opening of the vehicle, the brake operation, the steering angle, the yaw rate, the vehicle speed, and the battery discharge level. Since the estimation unit 12 estimates the scene using the scene estimation unit 13 based on the signals acquired by the signal detection unit 11 and detects the driver's behavior using the instantaneous characteristic estimation unit 14, the personal characteristic estimation device 100 can estimate the static characteristic of the driver with high accuracy based on the vehicle signals.
  • Since the scene estimation unit 13 estimates the scene in which the host vehicle travels based on data such as the road shape, time zone, traffic jam information, external conditions, and the presence or absence of pedestrians, surrounding vehicles, and oncoming vehicles, the personal characteristic estimation device 100 can estimate an appropriate scene according to the situation around the host vehicle and can estimate the static characteristics of the driver with high accuracy.
  • The instantaneous characteristic estimation unit 14 uses a hidden Markov model (HMM) as the method for statistically analyzing the results estimated using the instantaneous characteristic model.
  • Since the personal characteristic estimation device 100 sets, as characteristic tendency labels, characteristics having a correlation of a predetermined value or more (for example, 0.2 or more) with characteristics detected by a conventional method of measuring general and static characteristic tendencies such as the AQ test, labels from which the static characteristics of the driver are easily detected can be appropriately selected, and the static characteristics of the driver can be estimated with high accuracy.
  • Since the tendency estimation unit 16 detects the label having the maximum likelihood at arbitrary time intervals, accumulates the detected labels to form a histogram, calculates the distribution for each label, and estimates the static characteristic of the driver based on a label whose histogram value exceeds a threshold value, the personal characteristic estimation device 100 can estimate the static characteristic of the driver with high accuracy.
  • FIG. 4 is a block diagram showing the configuration of the personal characteristic estimation apparatus according to this embodiment.
  • As shown in FIG. 4, the personal characteristic estimation apparatus 101 includes a signal detection unit 11, a scene estimation unit 13, an instantaneous characteristic estimation unit 14, an instantaneous characteristic model storage unit 15, a tendency estimation unit 16, and a model switching unit (model switching means) 17.
  • Since the signal detection unit 11 has the same configuration as the signal detection unit 11 of the first embodiment, a description thereof is omitted.
  • The instantaneous characteristic model storage unit 15 stores, as an instantaneous characteristic model, a plurality of preset scenes (scenes (1) to (8) shown in FIGS. 11A to 12D), a plurality of characteristic tendency labels (hereinafter abbreviated as "labels") indicating the characteristics of the driver, an environmental model used for estimating the scene, and a characteristic model in which each scene is associated with the characteristic tendency labels that are easily expressed in that scene.
  • That is, as shown by the correspondence relationships in the correspondence table of FIG. 10, in which a plurality of scenes related to vehicle travel are taken in the row direction and a plurality of labels are taken in the column direction, the instantaneous characteristic model storage unit 15 stores a characteristic model in which the labels (the driver's instantaneous characteristics) prominently expressed in each scene are associated with that scene.
  • Furthermore, the instantaneous characteristic model storage unit 15 stores a plurality of characteristic models classified for each scene. Specifically, as shown in FIG. 5, label (1), label (4), and garbage (others) are stored as the characteristic model of scene (1), and label (2), label (6), and garbage are stored as the characteristic model of scene (2). Similarly, one or more labels associated with each of scenes (3) onward are stored as the characteristic model of that scene.
  • Based on the information about the external environment of the host vehicle detected by the signal detection unit 11 and the vehicle signals (information about vehicle operation, information about vehicle behavior, and information about the operation history), the scene estimation unit 13 estimates the scene in which the host vehicle travels with reference to the environmental model stored in the instantaneous characteristic model storage unit 15.
  • For example, the scene estimation unit 13 grasps the situation in which the host vehicle is currently traveling based on a surrounding image captured by an in-vehicle camera (not shown), and estimates the scene.
  • Specifically, the scene estimation unit 13 refers to the environmental model and estimates to which of scenes (1) to (8) indicated in the row direction of the correspondence table shown in FIG. 10 the current situation corresponds.
  • The model switching unit 17 selects a desired characteristic model from the plurality of characteristic models stored in the instantaneous characteristic model storage unit 15 based on the scene estimated by the scene estimation unit 13. That is, as illustrated in FIG. 6, the model switching unit 17 selects, from the plurality of characteristic models, the characteristic model associated with the scene estimated by the scene estimation unit 13. Specifically, when the current scene is the "crosswalk without a signal on a narrow road" of scene (1) in the correspondence table shown in FIG. 10, the model switching unit 17 selects the characteristic model associated with scene (1), which includes "other-person sympathy" of label (1), "non-sympathy" of label (4), and "garbage".
  • The instantaneous characteristic estimation unit 14 estimates the driver's instantaneous characteristic using the vehicle signals detected by the signal detection unit 11 and the characteristic model selected by the model switching unit 17. For example, as shown in FIG. 6, when the model switching unit 17 selects the characteristic model of scene (2), the label with the maximum likelihood is extracted from among label (2), label (6), and garbage stored as the characteristic model of scene (2), and is output to the tendency estimation unit 16. The method for extracting the label with the maximum likelihood will be described later.
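The per-scene model selection performed by the model switching unit can be sketched as a lookup table keyed by scene number, mirroring FIG. 5. The label names below are placeholders taken from the correspondence-table notation, not concrete values from the embodiment.

```python
# Characteristic models keyed by scene, mirroring FIG. 5: each scene is
# associated with its labels plus a "garbage" (others) class.
CHARACTERISTIC_MODELS = {
    1: ["label(1)", "label(4)", "garbage"],
    2: ["label(2)", "label(6)", "garbage"],
}

def switch_model(estimated_scene):
    """Return the characteristic model associated with the estimated scene,
    or None when no model is registered for that scene."""
    return CHARACTERISTIC_MODELS.get(estimated_scene)

model = switch_model(2)
```

Restricting the subsequent likelihood calculation to the returned labels is what lets the second embodiment reduce its computation load.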
  • In step S31, the signal detection unit 11 acquires various vehicle signals and information on the outside of the vehicle. Specifically, via the CAN mounted on the vehicle, it acquires vehicle signals such as the accelerator opening, brake operation, steering angle, vehicle speed, and yaw rate, as well as external video signals captured by an in-vehicle camera and GPS navigation information.
  • In step S32, the signal detection unit 11 calculates, as parameters, the primary difference of each acquired signal at a certain time interval (referred to as the difference Δ) and the secondary difference between the primary differences (referred to as the difference ΔΔ).
  • In step S33, the scene estimation unit 13 refers to the environmental model, calculates the scene likelihood of the host vehicle based on the information on the outside of the vehicle detected by the signal detection unit 11 (specifically, the external image captured by the in-vehicle camera and the position of the host vehicle measured by a GPS device), and estimates the scene in which the host vehicle is traveling.
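The scene-likelihood step can be sketched as picking the scene with the maximum likelihood and falling back to "garbage" when no scene is sufficiently likely. The threshold value and likelihood figures here are assumptions for illustration only.

```python
def estimate_scene(scene_likelihoods, garbage_threshold=0.5):
    """Pick the scene with the maximum likelihood; if even the best scene
    is not likely enough, classify the situation as "garbage"
    (a sketch of steps S33/S34; the threshold is hypothetical)."""
    scene, likelihood = max(scene_likelihoods.items(), key=lambda kv: kv[1])
    return scene if likelihood >= garbage_threshold else "garbage"

best = estimate_scene({"scene(1)": 0.7, "scene(2)": 0.2, "scene(3)": 0.1})
```

When the result is "garbage", the flow simply returns to signal acquisition, as described for step S34.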
  • In step S34, the model switching unit 17 determines whether or not the scene estimated by the scene estimation unit 13 is garbage. If the estimated scene is garbage, the process returns to step S31; if not, the process proceeds to step S35.
  • In step S35, the model switching unit 17 selects the characteristic model associated with the estimated scene from among the plurality of characteristic models (see FIG. 5) respectively associated with scenes (1) to (8). For example, when the estimated scene is scene (2), the characteristic model associated with scene (2) is selected as shown in FIG. 6.
  • In step S36, the instantaneous characteristic estimation unit 14 calculates the likelihood (characteristic likelihood) of each label using an HMM (Hidden Markov Model) algorithm, based on the difference Δ and the difference ΔΔ obtained in step S32.
  • Taking the accelerator opening as an example, the signal detection unit 11 calculates the difference Δ1 of the accelerator opening obtained at times t1 and t2, the difference Δ2 obtained at times t2 and t3, the difference Δ3 obtained at times t3 and t4, and the difference Δ4 obtained at times t4 and t5. Further, the signal detection unit 11 obtains the difference ΔΔ1 between the differences Δ1 and Δ2, the difference ΔΔ2 between the differences Δ2 and Δ3, and the difference ΔΔ3 between the differences Δ3 and Δ4.
  • The instantaneous characteristic estimation unit 14 then calculates the likelihood for each label by substituting these data into the HMM algorithm.
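The parameterization of step S32 can be sketched as follows: the primary differences Δ between successive samples and the secondary differences ΔΔ between those. The accelerator-opening sample values are made up for illustration.

```python
def parameterize(samples):
    """Compute the primary differences (delta) between successive samples
    and the secondary differences (delta-delta) between those primary
    differences, as fed to the HMM algorithm in step S36."""
    delta = [b - a for a, b in zip(samples, samples[1:])]
    delta_delta = [b - a for a, b in zip(delta, delta[1:])]
    return delta, delta_delta

# Accelerator opening sampled at times t1..t5 (hypothetical values).
delta, delta_delta = parameterize([10.0, 12.0, 15.0, 15.0, 13.0])
```

Five samples thus yield four Δ values (Δ1 to Δ4) and three ΔΔ values (ΔΔ1 to ΔΔ3), matching the accelerator-opening example above.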
  • For example, when the scene estimation unit 13 estimates that the host vehicle is traveling on the "crosswalk without a signal on a narrow road" of scene (1) in the correspondence table shown in FIG. 10, the instantaneous characteristic estimation unit 14 calculates the likelihood of "other-person sympathy" of label (1) and the likelihood of "non-sympathy" of label (4).
  • In step S37, the instantaneous characteristic estimation unit 14 determines whether or not the likelihood of the label having the maximum likelihood exceeds a preset threshold value. For example, in the case of scene (1) in the correspondence table shown in FIG. 10, the likelihood threshold for "other-person sympathy" of label (1) and the likelihood threshold for "non-sympathy" of label (4) are "3000" and "2000" shown in FIG. 8A, respectively.
  • In step S38, the instantaneous characteristic estimation unit 14 sets a histogram for each label, and the tendency estimation unit 16 increments the numerical value of the histogram of the label having the maximum likelihood.
  • Suppose, for example, that the likelihood of "other-person sympathy" of label (1) and the likelihood of "non-sympathy" of label (4) obtained by the HMM algorithm are "2500" and "2300", respectively.
  • In this case, the likelihood "2500" of label (1) does not exceed its likelihood threshold "3000" (NO in step S37), so the tendency estimation unit 16 does not increment the numerical value of the histogram of label (1). On the other hand, since the likelihood "2300" of label (4) exceeds its likelihood threshold "2000", the tendency estimation unit 16 increments the numerical value of the histogram of label (4).
  • The histogram is an accumulation of the number of times the likelihood has exceeded the likelihood threshold, and is stored in a memory or the like.
  • FIG. 9A is an explanatory diagram showing the relationship between the frequency threshold and the frequency result.
  • FIG. 9B is an explanatory diagram showing a histogram of frequency results.
  • As described above, when the likelihood of the label with the maximum likelihood exceeds the likelihood threshold, the tendency estimation unit 16 increments the numerical value of the histogram of that label; that is, it increments the value of the "frequency result" shown in FIGS. 9A and 9B.
  • In step S39, the tendency estimation unit 16 determines whether or not the normalized maximum frequency value appearing in the histogram exceeds a preset frequency threshold value. In the example shown in FIG. 9B, the frequency result of "non-sympathy" of label (4), which is the normalized maximum frequency value, exceeds the frequency threshold (the frequency threshold here being "4"). Therefore, the tendency estimation unit 16 determines that the normalized maximum frequency value appearing in the histogram exceeds the frequency threshold (YES in step S39).
  • In step S40, the tendency estimation unit 16 outputs "non-sympathy" of label (4) as the static characteristic of the driver. That is, the tendency estimation unit 16 determines that the static characteristic of the driver driving the vehicle is "non-sympathy", and outputs the determination result to a subsequent device (not shown).
  • As described above, the personal characteristic estimation device 101 stores, as an instantaneous characteristic model, an environmental model used for estimating a scene and a characteristic model in which each scene is associated with the labels (the driver's instantaneous characteristics) that are prominently expressed in that scene.
  • The personal characteristic estimation apparatus 101 then estimates the scene in which the host vehicle travels based on the vehicle information, with reference to the environmental model.
  • Further, the personal characteristic estimation device 101 refers to the characteristic model, identifies the labels associated with the estimated scene, and calculates the likelihood of each of those labels using an HMM algorithm or the like based on the information operated by the driver.
  • the personal characteristic estimation apparatus 101 increments the numerical value of the histogram of the label having the maximum likelihood.
  • The personal characteristic estimation device 101 then determines that the label having the normalized maximum frequency value (the driver's instantaneous characteristic) is the static characteristic of the driver.
  • Thereby, the personal characteristic estimation apparatus 101 can estimate the driver's static characteristics (disposition, tendency, temperament, etc.) with high accuracy, based on the behavior the driver takes while driving the vehicle and the surrounding environment of the vehicle.
  • For example, when guidance based on the estimated static characteristics of the driver is performed via an HMI (Human Machine Interface), an appropriate response corresponding to the driver's static characteristics can be taken.
  • Since the model switching unit 17 selects the characteristic model associated with the scene estimated by the scene estimation unit 13 from the plurality of characteristic models (see FIG. 5), and the instantaneous characteristic estimation unit 14 calculates the likelihood of each label based on the selected characteristic model, the personal characteristic estimation device 101 can reduce the calculation load when calculating the likelihood of each label.
  • The tendency estimation unit 16 accumulates, for a predetermined time or for every predetermined number of detections, the maximum-likelihood labels detected by the instantaneous characteristic estimation unit 14 at arbitrary time intervals, normalizes the accumulated data to create a histogram, and calculates the distribution of each label. Since the personal characteristic estimation apparatus 101 thereby converts the numerical value of the histogram into an appropriate value when estimating the driver's static characteristic, the static characteristic of the driver can be estimated with high accuracy.
  • The personal characteristic estimation apparatus and personal characteristic estimation method of the present invention are not limited to the first and second embodiments described above, and can be appropriately modified without departing from the gist of the present invention.
  • For example, although the HMM (Hidden Markov Model) is used as the method for statistically analyzing the estimated instantaneous characteristic tendencies, the present invention is not limited to this; other techniques having equivalent functions may be used instead.
  • the present invention can be used to estimate personal characteristics based on behavior during vehicle driving.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

Personal characteristic estimation devices (100, 101) comprising a signal detection unit (11), a scene estimation unit (13), an instantaneous characteristic estimation unit (14), an instantaneous characteristic model storage unit (15), and a tendency estimation unit (16). The signal detection unit (11) acquires vehicle information. The scene estimation unit (13) estimates, based on the vehicle information, the scenes through which a host vehicle travels. The instantaneous characteristic model storage unit (15) stores, as an instantaneous characteristic model, a characteristic model in which each scene is associated with a characteristic tendency label easily expressed in that scene. Based on the scene estimated by the scene estimation unit (13), the instantaneous characteristic estimation unit (14) refers to the instantaneous characteristic model and estimates an instantaneous characteristic tendency of a driver. The tendency estimation unit statistically analyzes the estimated instantaneous characteristic tendency using an HMM or another algorithm, and estimates a static characteristic tendency of the driver.
PCT/JP2013/073140 2012-08-31 2013-08-29 Unité d'évaluation de caractéristique individuelle et procédé d'évaluation de caractéristique individuelle Ceased WO2014034779A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-190952 2012-08-31
JP2012190952 2012-08-31

Publications (1)

Publication Number Publication Date
WO2014034779A1 true WO2014034779A1 (fr) 2014-03-06

Family

ID=50183584

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/073140 Ceased WO2014034779A1 (fr) 2012-08-31 2013-08-29 Unité d'évaluation de caractéristique individuelle et procédé d'évaluation de caractéristique individuelle

Country Status (1)

Country Link
WO (1) WO2014034779A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019016238A (ja) * 2017-07-07 2019-01-31 Kddi株式会社 運転車両信号から個人特性を特定しやすい道路区間を推定する推定装置、車両端末、プログラム及び方法
WO2019193660A1 (fr) * 2018-04-03 2019-10-10 株式会社ウフル Système de commutation de modèles appris par machine, dispositif périphérique, procédé de commutation de modèles appris par machine et programme
EP4273013A3 (fr) * 2022-05-02 2024-01-10 Toyota Jidosha Kabushiki Kaisha Dispositif de gestion de caractéristiques individuelles, procédé de gestion de caractéristiques individuelles, support de stockage non transitoire stockant un programme, et procédé de génération de modèle appris

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008126908A (ja) * 2006-11-22 2008-06-05 Denso Corp 運転行動推定方法および装置
JP2009073465A (ja) * 2007-08-28 2009-04-09 Fuji Heavy Ind Ltd 安全運転支援システム
JP2010221962A (ja) * 2009-03-25 2010-10-07 Denso Corp 運転行動推定装置
JP2012113631A (ja) * 2010-11-26 2012-06-14 Toyota Motor Corp 運転支援システム及び運転支援管理センター

Similar Documents

Publication Publication Date Title
JP6074553B1 (ja) 情報処理システム、情報処理方法、およびプログラム
JP6800575B2 (ja) 自己の乗り物のドライバを支援する方法およびシステム
CN110114810B (zh) 信息处理系统、信息处理方法以及存储介质
JP6052530B1 (ja) 情報処理システム、情報処理方法、およびプログラム
JP6341311B2 (ja) ドライバの動的道路シーンに対する精通度インデックスのリアルタイム作成
JP6307356B2 (ja) 運転コンテキスト情報生成装置
US10220841B2 (en) Method and system for assisting a driver of a vehicle in driving the vehicle, vehicle and computer program
JP5867296B2 (ja) 運転シーン認識装置
JP5840046B2 (ja) 情報提供装置、情報提供システム、情報提供方法、及びプログラム
JP5598411B2 (ja) 車両用情報提供装置
CN112203916A (zh) 用于确定目标车辆变道相关信息的方法和设备、用于确定车辆舒适性度量用以预测目标车辆的驾驶机动操纵的方法和设备以及计算机程序
CN115038936A (zh) 用于场景感知交互的系统和方法
JP6418574B2 (ja) 危険度推定装置、危険度推定方法及び危険度推定用コンピュータプログラム
JP5867524B2 (ja) 運転評価装置及び運転評価方法、並びに運転支援システム
JP6206022B2 (ja) 運転支援装置
US20230048304A1 (en) Environmentally aware prediction of human behaviors
JP6511982B2 (ja) 運転操作判別装置
CN107133568A (zh) 一种基于车载前视相机的限速提示及超速告警方法
Rahman et al. Driving behavior profiling and prediction in KSA using smart phone sensors and MLAs
WO2014034779A1 (fr) Unité d'évaluation de caractéristique individuelle et procédé d'évaluation de caractéristique individuelle
CN117751394A (zh) 用于支持自动行驶车辆的环境识别的方法和装置
JP6926644B2 (ja) 異常推定装置および表示装置
CN105447511B (zh) 一种基于Adaboost Haar-Like特征的SVM目标检测方法
JP2014046820A (ja) 運転者特性推定装置
CN107003143B (zh) 用于识别人员的目的地的方法和目的地识别单元

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13833851

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13833851

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP