
JP2009098900A - Aside look status determination device - Google Patents

Aside look status determination device

Info

Publication number
JP2009098900A
JP2009098900A
Authority
JP
Japan
Prior art keywords
aside
driver
state
observation
state determination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2007269455A
Other languages
Japanese (ja)
Other versions
JP5311793B2 (en)
Inventor
Toshiyuki Nanba
利行 難波
Hiroaki Sekiyama
博昭 関山
Keisuke Okamoto
圭介 岡本
Yoshihiro Oe
義博 大栄
Yoichi Sato
洋一 佐藤
Yoshihiro Suda
義大 須田
Takahiro Suzuki
高宏 鈴木
Daisuke Yamaguchi
大助 山口
Shiro Kumano
史朗 熊野
Kenichi Horiguchi
研一 堀口
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Tokyo NUC
Toyota Motor Corp
Original Assignee
University of Tokyo NUC
Toyota Motor Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Tokyo NUC, Toyota Motor Corp
Priority to JP2007269455A
Priority to US12/738,576
Priority to EP08840164.1A
Priority to PCT/IB2008/002739
Publication of JP2009098900A
Application granted
Publication of JP5311793B2
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/06 Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/251 Fusion techniques of input or preprocessed data
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/29 Graphical models, e.g. Bayesian networks
    • G06F18/295 Markov models or related models, e.g. semi-Markov models; Markov random fields; Networks embedding Markov models
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/803 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/84 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using probabilistic graphical models from image or video features, e.g. Markov models or Bayesian networks
    • G06V10/85 Markov-related models; Markov random fields
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Databases & Information Systems (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Probability & Statistics with Applications (AREA)
  • Traffic Control Systems (AREA)
  • Instructional Devices (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a looking-aside state determination device that can determine more accurately whether a driver is looking aside. SOLUTION: The looking-aside state determination device 100 includes: observed-quantity acquisition means 10 for acquiring, as observed quantities, at least one of information on the driver's line of sight, information on the driving operation, and information on the running state; and looking-aside state determination means 12 for determining the looking-aside state based on a probabilistic relationship between the looking-aside state and the observed quantities, together with the observed quantities acquired by the observed-quantity acquisition means 10. The looking-aside state determination means 12 calculates the probability that the driver is looking aside and determines that the driver is looking aside when the calculated probability exceeds a threshold. COPYRIGHT: (C)2009, JPO&INPIT

Description

The present invention relates to a looking-aside state determination device that determines whether a driver is looking aside, and in particular to a looking-aside state determination device that acquires, as observed quantities, information on the driver's line of sight together with at least one of information on the driving operation, information on the running state, and the like, and determines the current looking-aside state based on a probabilistic relationship between known looking-aside states and known observed quantities together with the current observed quantities.

Conventionally, a driver state detection device is known that determines whether the driver is looking aside based on the directional frequency distribution of the driver's gaze within a predetermined past time and the frequency distribution of the steering angular velocity within a predetermined past period (see, for example, Patent Document 1).

This driver state detection device sets the following precondition for the case where another vehicle is present in the adjacent lane and the driver is trying to keep the lane: while the driver is driving without looking aside, steering operations are relatively frequent and the steering angular velocity relatively large, because the driver is strongly conscious of keeping the vehicle position constant; while the driver is driving while looking aside, steering operations are relatively slow and the steering angular velocity relatively small. When the directional frequency of the driver's gaze within the predetermined past time reaches a predetermined state, the device then determines, based on the frequency distribution of the steering angular velocity within the predetermined past period, whether the driver is driving while looking aside.

In this way, when the directional frequency of the driver's gaze within the predetermined past time reaches a predetermined state, the driver state detection device attempts to determine with high accuracy whether the driver is driving while looking aside by checking whether a specific precondition, set in advance based on statistics of the steering angular velocity over a predetermined period, is satisfied.
Patent Document 1: JP 2003-80969 A
Non-Patent Document 1: Yoshihiro Suda, Yoshiyuki Takahashi, Masaaki Onuki, "Universal Driving Simulator for Research", Jidosha Gijutsu (Journal of the Society of Automotive Engineers of Japan), Vol. 59, No. 7 (2005), pp. 83-88
Non-Patent Document 2: Kenji Oka, Yusuke Sugano, Yoichi Sato, "Real-time Head Pose Estimation with Automatic Construction of a Deformable Head Model", IPSJ Transactions on Computer Vision and Image Media, Vol. 47, No. SIG10 (CVIM15), pp. 185-194

However, since the driver state detection device described in Patent Document 1 defines a specific precondition on observable data (the steering angular velocity) in order to determine whether the driver is looking aside, it cannot handle steering operations that do not fit that precondition; it may therefore determine that the driver is driving while looking aside even though the driver is not, or, conversely, determine that the driver is not looking aside even though the driver is.

In view of the above, an object of the present invention is to provide a looking-aside state determination device that can determine more accurately whether a driver is looking aside.

To achieve the above object, a looking-aside state determination device according to a first aspect of the invention comprises: observed-quantity acquisition means for acquiring, as observed quantities, information on the driver's line of sight and at least one of information on the driving operation or information on the running state; and looking-aside state determination means for determining the looking-aside state based on a probabilistic relationship between the looking-aside state and the observed quantities, together with the observed quantities acquired by the observed-quantity acquisition means.

A second aspect is the looking-aside state determination device according to the first aspect, wherein the looking-aside state determination means calculates the probability that the driver is looking aside and determines that the driver is looking aside when the calculated probability exceeds a threshold.

A third aspect is the looking-aside state determination device according to the second aspect, wherein the looking-aside state determination means calculates the probability that the driver is looking aside using a Bayesian network.

A fourth aspect is the looking-aside state determination device according to the second or third aspect, wherein the looking-aside state determination means learns a transition model and an observation model in advance, and calculates the probability that the driver is looking aside based on the transition model, the observation model, and the observed quantities acquired by the observed-quantity acquisition means.

A fifth aspect is the looking-aside state determination device according to any one of the first to fourth aspects, wherein the information on the driving operation includes information on a steering angle, an accelerator depression amount, a brake depression amount, or a turn-signal operation.

A sixth aspect is the looking-aside state determination device according to any one of the first to fourth aspects, wherein the information on the running state includes the speed or acceleration of the vehicle.

A seventh aspect is the looking-aside state determination device according to any one of the first to fourth aspects, wherein the information on the running state includes the distance between the vehicle and the center line or the angle between the vehicle and the center line.

An eighth aspect is the looking-aside state determination device according to any one of the first to fourth aspects, wherein the information on the running state includes information on the surrounding environment.

A ninth aspect is the looking-aside state determination device according to any one of the first to fourth aspects, wherein the information on the driver's line of sight includes a head inclination angle.

By the above means, the present invention can provide a looking-aside state determination device that can determine more accurately whether a driver is looking aside.

Hereinafter, the best mode for carrying out the present invention will be described with reference to the drawings.

FIG. 1 is a block diagram showing a configuration example of a looking-aside state determination device according to the present invention. The looking-aside state determination device 100 is an in-vehicle apparatus composed of a control unit 1, devices 20 to 28 for acquiring various observed quantities, an input device 29 that accepts input from the driver, and an audio output device 30 that outputs various sounds.

Here, an "observed quantity" is a quantity that can be observed as a numerical value. As shown in FIG. 2, the observed quantities fall, for example, into the categories "driving operation", "vehicle state", and "driver information": "steering angle", "accelerator depression amount", "brake depression amount", and "turn-signal setting" in the "driving operation" category; "speed", "acceleration", "distance between the vehicle and the center line", and "angle of the vehicle with respect to the center line" in the "vehicle state" category; and "head yaw angle" (the rotation angle of the driver's head about the vertical axis) in the "driver information" category.

The observed quantities may also include any other observable quantity that can affect the looking-aside state, such as accelerator depression speed, brake depression speed, steering angular velocity, vehicle position (latitude, longitude, altitude), road type (straight road, curved road, near an intersection, etc.), surrounding environment (busy districts or scenic spots that readily induce looking aside, expressways that rarely induce it, etc.), the presence or absence of pedestrians, the speed and acceleration of other vehicles, the distance between the host vehicle and other vehicles, head pitch angle (the rotation angle of the driver's head about the axis extending in the lateral direction of the vehicle), and head roll angle (the rotation angle of the driver's head about the axis extending in the longitudinal direction of the vehicle).

An observed quantity may be a continuous value (analog data), but discrete values obtained by dividing the continuous range into several levels (digital data; for example, if the observed quantity is "speed", levels in 5 km/h steps such as 5, 10, 15) or a set of indices (for example, "1" as the index for a straight road and "2" as the index for a curved road under road type) may be adopted instead.

Note that the looking-aside state determination device 100 preferably divides each observed quantity into twelve discrete levels. With too few levels it becomes difficult to extract the characteristics of the observed quantities, while with too many levels the contamination by noise (outliers) can no longer be ignored.
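As a concrete illustration of this twelve-level quantization, the sketch below maps one continuous observed quantity into a bin index. The range bounds (e.g. 0-120 km/h for speed) and the clamping behavior are illustrative assumptions, not values taken from the patent.

```python
def discretize(value, lo, hi, levels=12):
    """Map a continuous observed quantity into one of `levels` discrete bins.

    `lo`/`hi` bound the expected range of the quantity (an assumed choice);
    values outside the range are clamped into the first or last bin.
    """
    if hi <= lo:
        raise ValueError("hi must exceed lo")
    # Clamp, then scale into [0, levels) and truncate to a bin index.
    clamped = min(max(value, lo), hi)
    idx = int((clamped - lo) / (hi - lo) * levels)
    return min(idx, levels - 1)  # value == hi falls into the last bin

# e.g. vehicle speed 0-120 km/h split into 12 bins of 10 km/h each
assert discretize(57, 0, 120) == 5
assert discretize(120, 0, 120) == 11
```

The same function could quantize head yaw angle, steering angle, and so on, each with its own assumed range.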

For example, while continuously acquiring the driver's head yaw angle and the various observed quantities related to the looking-aside state, the looking-aside state determination device 100 expresses the looking-aside state, which cannot be observed directly, as a conditional probability based on the instantaneous values and statistics (mean, maximum, minimum, variance, median, etc.) of the observed quantities at times when the head yaw angle becomes equal to or larger than a predetermined angle, together with a predetermined transition model and a predetermined observation model.

Here, the "transition model" is the conditional probability distribution over states when two or more states transition among one another: for example, the conditional probability of being in the looking-aside state at time t+1 (X_{t+1} = ON) given that the driver was not looking aside at time t (X_t = OFF), written P(X_{t+1} = ON | X_t = OFF).

The "observation model" is the conditional probability distribution of an observed quantity: for example, the conditional probability that the observed quantity Z_t^(k) takes a predetermined value z given the looking-aside state at time t (X_t = ON), written P(Z_t^(k) = z | X_t = ON). Here, Z^(k) denotes the value of the k-th of the N kinds of observed quantities.
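The patent does not spell out the update formula, but a transition model and per-quantity observation models of this form combine naturally in a recursive Bayes (hidden-Markov-style) filter. The sketch below is one such combination under the assumption that the observed quantities are conditionally independent given X_t; the function names, dictionary layout, and the 0.5 decision threshold are all illustrative choices, not the patent's specification.

```python
def filter_step(prior_on, transition, observation, z):
    """One recursive update of P(X_t = ON | Z_1..t).

    prior_on    : P(X_{t-1} = ON | Z_1..t-1)
    transition  : transition[x_prev][x] = P(X_t = x | X_{t-1} = x_prev)
    observation : observation[k][x][z_k] = P(Z^(k) = z_k | X = x)
    z           : tuple of the current discretized observed values
    Observed quantities are assumed conditionally independent given the state.
    """
    belief = {}
    for x in ("ON", "OFF"):
        # Predict: propagate the previous belief through the transition model.
        predicted = (prior_on * transition["ON"][x]
                     + (1.0 - prior_on) * transition["OFF"][x])
        # Correct: weight by the likelihood of the current observations.
        likelihood = 1.0
        for k, z_k in enumerate(z):
            likelihood *= observation[k][x][z_k]
        belief[x] = predicted * likelihood
    total = belief["ON"] + belief["OFF"]
    return belief["ON"] / total if total > 0 else prior_on

def is_looking_aside(prob_on, threshold=0.5):
    """Decision rule of the determination means: the driver is judged to be
    looking aside when the probability exceeds a threshold (0.5 is assumed)."""
    return prob_on > threshold
```

Calling `filter_step` once per frame with the learned models yields the looking-aside probability that is then compared with the threshold.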

The looking-aside state determination device 100 is assumed to have the transition model and the observation model prepared in advance; they are learned beforehand for each driver using, for example, a driving simulator (see, for example, Non-Patent Document 1).

FIG. 3 is a schematic diagram showing a configuration example of the driving simulator. FIG. 3(A) shows that a subject (driver) DR can simulate driving a vehicle while watching images on a screen SC that respond to the various controls of a vehicle mock-up VC (accelerator pedal, brake pedal, steering wheel, and so on, not shown).

FIG. 3(A) also shows that the driving simulator DS has a driver camera 21 in front of the driver's seat and a verification camera MC, for verifying the subject DR's driving operations, at the upper rear of the driver's seat.

FIG. 3(A) further shows that the driving simulator DS has projectors P1 and P2 on the left and right of the rear seat for displaying a target image TG that induces the driver DR to look aside; the driver DR cannot see the target image TG without turning his or her head (that is, unless the head yaw angle reaches a predetermined angle φ or more).

In this way, the driving simulator DS can detect, from the images acquired by the driver camera 21 and using a conventional technique (see, for example, Non-Patent Document 2), the head yaw angle, which is considered to reflect the looking-aside state best.

The driving simulator DS randomly determines the display frequency, display duration, display position, and type of the target image TG; for example, with a display frequency of 2.2 times per minute, a display duration of 5 seconds each time, and 50 types of target image TG, it projects a projection image PG containing the target image TG onto the left and right wall surfaces W1 and W2 using the left and right projectors P1 and P2.

FIG. 3(B) shows an example of the projection image PG projected onto the right wall surface W2, in a state where the projection image PG contains a pedestrian-shaped target image TG in its left region.

Besides pedestrians, the driving simulator DS may use, as the target image TG, other images that attract the subject DR's interest, such as road signs or four-character idioms; it may also announce by voice whether the target image TG is displayed to the left or to the right, so that the subject DR looks aside naturally without mental strain.

The driving simulator DS defines the state in which the subject DR's head yaw angle is φ (for example, ±10°) or more while the target image TG is displayed as the state in which looking aside is occurring, and has the subject DR repeat the simulated driving a predetermined number of times (this procedure is called supervised learning).

Note that the state in which looking aside is occurring may be defined using observed quantities other than the head yaw angle, and the state in which looking aside is not occurring may be defined using arbitrary observed quantities (for example, a state in which the steering angle is equal to or larger than a predetermined angle may be defined as a state in which looking aside is not occurring, even if the head yaw angle is φ or more).
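A minimal sketch of this frame-labeling rule follows. The function name, parameter names, and the optional steering-override value are hypothetical: the source leaves the exact override angle unspecified, so it is kept as a parameter.

```python
PHI_DEG = 10.0  # predetermined head-yaw threshold φ (±10° in the example)

def label_frame(head_yaw_deg, target_displayed, steering_deg=0.0,
                steering_override_deg=None):
    """Label one simulator frame for supervised learning.

    Looking aside (ON) is defined as |head yaw| >= φ while the target image
    TG is displayed. `steering_override_deg` models the optional exception
    from the text: if set, a steering angle at or above it forces OFF even
    when the yaw condition holds (the exact value is an assumption).
    """
    if steering_override_deg is not None and abs(steering_deg) >= steering_override_deg:
        return "OFF"
    if abs(head_yaw_deg) >= PHI_DEG and target_displayed:
        return "ON"
    return "OFF"

assert label_frame(12.0, True) == "ON"
assert label_frame(12.0, False) == "OFF"
```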

Based on the various observed quantities acquired by the above method, the driving simulator DS learns (generates) the transition model and the observation model.

The transition model and the observation model are derived based on numbers of frames (a frame is a predetermined unit of time; for example, a frame in the sense of the capture rate (frames/second) of the driver camera 21).

Specifically, the transition model is expressed by equation (1) and the observation model by equation (2).

(1) P(X_{t+1} = x' | X_t = x) = frame{X_{t+1} = x', X_t = x} / frame{X_t = x}

(2) P(Z_t^(k) = z | X_t = x) = frame{Z^(k) = z, X = x} / frame{X = x}

Here, frame{y} denotes the number of frames that satisfy condition y.

Referring again to FIG. 1, the components of the looking-aside state determination device 100 will now be described.

The control unit 1 is a computer comprising a CPU (Central Processing Unit) and a storage unit 15 such as a RAM (Random Access Memory) and a ROM (Read Only Memory); it stores programs corresponding to the observed-quantity acquisition means 10, the looking-aside state learning means 11, and the looking-aside state determination means 12 in the storage unit 15, and causes the CPU to execute the processing corresponding to each means.

The storage unit 15 stores, for each driver, the transition model 150 and the observation model 151 learned using the driving simulator DS, so that the looking-aside state determination device 100 can selectively use the transition model 150 and the observation model 151 corresponding to whichever driver is driving the vehicle.

The road camera 20 is a camera for recognizing lane markings, road markings, and the like on the road; it is installed, for example, near the front grille and captures the lane boundary lines and the like on the road on which the host vehicle travels.

The driver camera 21 is a camera for capturing images of the driver; it is installed, for example, inside the instrument panel or on the dashboard and captures the driver's head. Preferably, the driver camera 21 has the same placement and is of the same model as the one used in the driving simulator DS, so that the transition model 150 and the observation model 151 obtained with the driving simulator DS can be used effectively.

The steering angle sensor 22 is a sensor for measuring the rotation angle of the steering shaft, which is related to the steering angle of the wheels; for example, it detects the rotation angle of the steering shaft by reading, with an MR (Magnetic Resistance) element, the magnetic resistance produced by a magnet embedded in the steering shaft, and outputs the detection result to the control unit 1.

The accelerator depression amount sensor 23 is a sensor for detecting the depression amount of the accelerator; for example, it detects the depression amount of the accelerator pedal with a potentiometer using a variable resistor and outputs the detection result to the control unit 1.

Like the accelerator depression amount sensor 23, the brake depression amount sensor 24 is a means for detecting the depression amount of the brake.

The turn signal 25 is a device for notifying the outside of the host vehicle's direction of travel; for example, it blinks turn-signal lamps to notify the outside of the direction of travel, and outputs an electrical signal indicating its state to the control unit 1.

The speed sensor 26 is a sensor that measures the speed of the vehicle; for example, an MR element reads, as changes in magnetic resistance, the changes in the magnetic field produced by a magnet attached to each wheel and rotating with it, and extracts them as a pulse signal proportional to the rotation speed, thereby detecting the rotation speed of the wheels and the speed of the vehicle, and the detection result is output to the control unit 1.

The acceleration sensor 27 is a sensor that detects the acceleration of the vehicle. For example, it uses semiconductor strain gauges to detect acceleration along the vehicle's three axes (longitudinal, vertical, and lateral) and outputs the detection result to the control unit 1.

The navigation device 28 guides the vehicle while indicating the route to a destination, based on vehicle position information acquired through a GPS (Global Positioning System) function and map information stored on a hard disk, DVD, or the like. It outputs to the control unit 1 information such as the vehicle position (latitude, longitude, altitude), the type of road being traveled (straight road, curved road, near an intersection, and so on), or the surrounding environment (for example, a busy district or scenic area likely to induce looking aside, or a highway unlikely to do so).

The input device 29 is a device for receiving input from the driver. For example, it is a steering switch arranged on the steering wheel that lets the driver enter feedback on whether a determination made by the looking-aside state determination device 100 was correct, and it outputs that input to the control unit 1.

The audio output device 30 is a device for outputting voice messages, warnings, and the like. For example, it is an in-vehicle speaker or alarm that, when the looking-aside state determination device 100 determines that the driver is looking aside, outputs a warning message or an alarm sound in response to a control signal sent from the control unit 1.

Next, the various means included in the control unit 1 will be described.

The observed-quantity acquisition means 10 is a means for acquiring the various observed quantities; it continuously receives the outputs of the devices 20 to 28 and stores the acquired quantities in the storage unit 15. The observed-quantity acquisition means 10 may also delete or discard an observed quantity once its most recent value has been reflected in the stored statistics.

The observed-quantity acquisition means 10 may make available all observed quantities acquired over the entire period from the start of driving to the current time, or only those acquired within a predetermined past period.

For example, the observed-quantity acquisition means 10 receives the road images output by the road camera 20 and applies image processing such as binarization and edge detection to extract road markings, lane boundary lines, and the like.

Further, from the relationship between an extracted lane boundary line and the known camera position, the observed-quantity acquisition means 10 calculates the distance between the vehicle and the lane boundary line, and the angle formed between them.

The observed-quantity acquisition means 10 also receives, for example, the images of the driver's head output by the driver camera 21, detects the head pose using a known head-pose estimation method (see, for example, Non-Patent Document 2), and derives the yaw, pitch, and roll angles of the driver's head.

Furthermore, the observed-quantity acquisition means 10 may estimate the driver's gaze direction by receiving the images of the driver's face output by the driver camera 21 and applying image processing such as binarization and edge detection to extract the positions of the eyes, nose, and mouth within the face, or the position of the pupil within the eye.

The looking-aside state learning means 11 is a means for training the transition model 150 and the observation model 151. For example, when the looking-aside state determination device 100 has once determined that the driver is looking aside and caused the audio output device 30 to output a warning, and it is later detected via the input device 29 that the driver was not in fact looking aside, the learning means reflects the erroneous determination in the transition model 150 and the observation model 151.

The looking-aside state learning means 11 also identifies the driver with driver identification means such as a fingerprint authentication device, an iris authentication device, or a password, and updates the transition model 150 and the observation model 151 separately for each driver. Preparing the models so that they incorporate each driver's individual characteristics further improves the accuracy of the looking-aside determination performed by the looking-aside state determination means 12 described later.

In this way, the looking-aside state learning means 11 can continuously improve the determination accuracy of the looking-aside state determination device 100.

The looking-aside state determination means 12 is a means for determining whether the driver is looking aside. For example, using a dynamic Bayesian network (hereinafter "DBN"), it derives the probability of looking aside at the current time from the various observed quantities obtained up to that time, and determines that the driver is looking aside when the derived probability is equal to or greater than a predetermined value ξ (for example, 0.8).

The predetermined value ξ is determined, for example, by evaluating the looking-aside determination accuracy over multiple simulation runs on the driving simulator DS using the leave-one-out method, while considering the resulting values of TP (true positive rate: the proportion of looking-aside states determined to be looking aside), FP (false positive rate: the proportion of non-looking-aside states determined to be looking aside), FN (false negative rate: the proportion of looking-aside states determined not to be looking aside), and TN (true negative rate: the proportion of non-looking-aside states determined not to be looking aside).
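As a rough illustration of how such a threshold study might be organized, the following Python sketch counts TP/FP/FN/TN over held-out simulator trials and picks the candidate ξ with the best true-positive/false-positive trade-off. The function names, the data layout (posterior, ground-truth pairs), and the selection criterion are assumptions made for illustration; the patent does not specify them.

```python
def confusion_rates(results, xi):
    """results: list of (posterior, truly_looking_aside) pairs from held-out
    simulator trials. Returns (TP, FP, FN, TN) rates as defined in the text."""
    tp = fp = fn = tn = 0
    for posterior, truth in results:
        predicted = posterior >= xi          # device decides "looking aside"
        if predicted and truth:
            tp += 1
        elif predicted and not truth:
            fp += 1
        elif not predicted and truth:
            fn += 1
        else:
            tn += 1
    pos = max(tp + fn, 1)                    # avoid division by zero
    neg = max(fp + tn, 1)
    return tp / pos, fp / neg, fn / pos, tn / neg

def choose_xi(results, candidates=(0.5, 0.6, 0.7, 0.8, 0.9)):
    """Pick the xi maximizing TP rate minus FP rate (one possible criterion;
    the text only says the four rates are 'considered')."""
    def score(xi):
        tp, fp, _, _ = confusion_rates(results, xi)
        return tp - fp
    return max(candidates, key=score)
```

With trials whose posteriors separate the two classes well, the selected ξ lands at the boundary that rejects the false positives while keeping the true positives.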

A DBN is a technique that uses a directed graph to describe the stochastic relationship, where one exists, between hidden state variables (variables that cannot be obtained directly from the history of observed quantities, such as the looking-aside state) and observed variables (variables that can be obtained directly from that history, such as head yaw angle, speed, and acceleration).

By assuming this relationship to be stationary and learning it quantitatively, the DBN probabilistically computes, from the history of the observed variables up to a given time, the likelihood of the hidden state variable's value at that time.

In this embodiment, it is assumed that the hidden state variable (the looking-aside state) follows a first-order Markov process and that the hidden state variable at each time influences the observed variables at that time.

FIG. 4 is a conceptual diagram of the DBN's network topology: the transitions of the hidden state variable under the first-order Markov process are shown by broken arrows, and the influence of the hidden state variable (looking-aside state) at each time on the observed variables at that time is shown by solid arrows.

Here, X_t and Z_t (Z_t = {Z_t^(1), ..., Z_t^(N)}) denote, respectively, the value of the state variable (looking-aside state) at time t and the values of the observed variables (head yaw angle, speed, acceleration, and so on) at time t; X_t takes either the value ON (looking aside) or OFF (not looking aside), and N is the number of kinds of observed variables. Each observed variable Z_t^(k) takes a discrete value obtained by discretizing a continuous quantity; the discrete values are divided into, for example, 12 levels.
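As a minimal sketch of the kind of discretization described here: the text states only that each observed variable is divided into, e.g., 12 levels (and elsewhere mentions ±4° increments for head yaw), so the equal-width binning, the range bounds, and the function name below are assumptions for illustration.

```python
def discretize(value, lo, hi, levels=12):
    """Map a continuous observation (e.g. head yaw angle in degrees) onto one
    of `levels` equal-width bins; out-of-range values clamp to the end bins."""
    if value <= lo:
        return 0
    if value >= hi:
        return levels - 1
    width = (hi - lo) / levels               # e.g. 48 deg / 12 bins = 4 deg/bin
    return int((value - lo) // width)
```

For example, with a ±24° range and 12 bins, yaw angles of 8° and 10° fall into the same bin, which is exactly the boundary effect discussed later for the threshold angle φ.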

Expression (3) is used to obtain the posterior probability P(X_t | Z_1:t) of the looking-aside state X_t at time t from the information Z_1:t on all observed quantities from the time observation began up to time t; the DBN uses expression (3) to derive the probability of looking aside when the observed quantities take given values. Here, "posterior probability" is a kind of conditional probability; in this embodiment it expresses how likely looking aside is once the observations already obtained are taken into account.

[Expression (3), reconstructed from the surrounding description (shown as an image in the original):]

P(X_{t+1} | Z_{1:t+1}) = α · Π_k P(Z_{t+1}^(k) | X_{t+1}) · Σ_{x_t} P(X_{t+1} | x_t) · P(x_t | Z_{1:t})

Here, α is a normalization constant corresponding to P(Z_{t+1}); P(Z_{t+1}^(k) | X_{t+1}) is determined using the observation model 151 stored in the storage unit 15, and P(X_{t+1} | x_t) is determined using the transition model 150 stored in the storage unit 15.

Accordingly, the posterior probability of looking aside at the current time, P(X_t = ON | Z_1:t), is obtained by deriving P(x_t | Z_1:t^(k)) from the observed quantities Z_1:t obtained so far.
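The recursion behind this derivation, for a binary hidden state and conditionally independent discretized observations, can be sketched as follows. This is a generic first-order Bayesian filter update consistent with the description above, not code from the patent; the dictionary-based representation of the transition model 150 and observation model 151 is an assumption.

```python
def filter_step(prior, transition, obs_models, z):
    """One recursive update of P(X_{t+1} | Z_{1:t+1}).

    prior:      {state: P(X_t = state | Z_1:t)} for state in ("ON", "OFF")
    transition: {(prev, nxt): P(X_{t+1} = nxt | X_t = prev)}     (transition model)
    obs_models: list of {(z_value, state): P(Z^(k) = z_value | X = state)}
    z:          tuple of the k discretized observations at time t+1
    """
    posterior = {}
    for nxt in ("ON", "OFF"):
        # prediction: sum over the previous state (first-order Markov assumption)
        pred = sum(transition[(prev, nxt)] * prior[prev] for prev in ("ON", "OFF"))
        # correction: observations assumed conditionally independent given the state
        like = 1.0
        for model, zk in zip(obs_models, z):
            like *= model[(zk, nxt)]
        posterior[nxt] = like * pred
    alpha = sum(posterior.values())          # normalization over both states
    return {s: p / alpha for s, p in posterior.items()}
```

Calling `filter_step` once per time step with the latest observations maintains P(X_t = ON | Z_1:t) online, which is the quantity compared against ξ.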

The looking-aside state determination means 12 compares the posterior probability of looking aside at the current time, P(X_t = ON | Z_1:t), derived with the DBN as described above, against the predetermined value ξ (for example, 0.8), and determines that the current state (here, a state in which the head yaw angle is φ or greater) is looking aside when the posterior probability P(X_t = ON | Z_1:t) is equal to or greater than ξ.

Next, with reference to FIG. 5, the process by which the looking-aside state determination device 100 alerts the driver to looking aside (hereinafter, the "looking-aside alerting process") will be described. FIG. 5 is a flowchart of this process; the looking-aside state determination device 100 executes it repeatedly while the vehicle is traveling at or above a predetermined speed.

First, the control unit 1 of the looking-aside state determination device 100 causes the looking-aside state determination means 12 to calculate the posterior probability of looking aside at the current time using the DBN (step S1).

Next, the control unit 1 causes the observed-quantity acquisition means 10 to derive the yaw angle of the driver's head from the image output by the driver camera 21 and to compare that yaw angle with a predetermined angle φ (step S2).

If the head yaw angle is less than the predetermined angle φ (NO in step S2), the control unit 1 ends the current cycle of the looking-aside alerting process, since no alert is needed when the yaw angle is below φ.

If the head yaw angle is equal to or greater than φ (YES in step S2), the control unit 1 causes the looking-aside state determination means 12 to compare the calculated posterior probability with the predetermined value ξ (step S3). If the posterior probability is equal to or less than ξ (NO in step S3), the control unit 1 ends the current cycle of the process: a low posterior probability means no alert is needed, and ending here prevents a warning from being output when none is required.

If the calculated posterior probability is greater than ξ (YES in step S3), the control unit 1 outputs a control signal to the audio output device 30 to sound a warning, alerting the driver to stop looking aside (step S4), and then ends the process.
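Collapsed into code, one cycle of the decision logic in steps S2 through S4 might look like the following sketch. The posterior is assumed to have already been computed in step S1; φ = 10° and ξ = 0.8 follow the example values given in the text, and the function name is hypothetical.

```python
def alert_cycle(head_yaw_deg, posterior_on, phi=10.0, xi=0.8):
    """One cycle of the FIG. 5 alerting process.

    Returns True when a warning should be sounded (step S4)."""
    if abs(head_yaw_deg) < phi:   # step S2: gaze roughly ahead, no alert needed
        return False
    if posterior_on <= xi:        # step S3: posterior too low, no alert needed
        return False
    return True                   # step S4: warn the driver to stop looking aside
```

In the device this cycle would run repeatedly while the vehicle travels at or above the predetermined speed, with `posterior_on` refreshed by the DBN each time.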

In the looking-aside alerting process shown in FIG. 5, the posterior probability of looking aside at the current time may instead be calculated with the DBN only when the head yaw angle is equal to or greater than the predetermined angle φ.

This is because the head yaw angle derived by the observed-quantity acquisition means 10 is handled as a discrete value, and the predetermined angle φ can fall in the middle of a bin (for example, when the bins are set in increments of ±4° and the predetermined angle φ (±10°) falls within the ±8° to ±12° bin); in that case a head yaw angle of ±8° and one of ±10° are treated as the same discrete value. This means the driver may be determined to be looking aside even when the head yaw angle is less than φ.

With the configuration described above, the looking-aside state determination device 100 determines whether the driver is looking aside while computing, from the stochastic relationship between the various observed quantities and the looking-aside state, the probability of a looking-aside state that cannot be observed directly; it can therefore determine the presence or absence of looking aside more accurately.

Moreover, because the looking-aside state determination device 100 determines the looking-aside state through a stochastic relationship, it can flexibly handle relationships between the observed quantities and the looking-aside state for which no regularity is apparent at first glance, again allowing more accurate determination.

The looking-aside state determination device 100 can also distinguish more accurately between cases in which the driver's gaze leaves the road ahead without looking aside (such as safety checks when turning left or right or when changing lanes) and cases of genuine looking aside, preventing a warning from being output on the basis of an erroneous determination when the gaze merely leaves the road ahead.

Although preferred embodiments of the present invention have been described in detail above, the present invention is not limited to those embodiments, and various modifications and substitutions may be made to them without departing from the scope of the present invention.

For example, in the embodiment described above, the stochastic relationship between the observed quantities output by the devices 20 to 28 and the looking-aside state is described by a DBN, and looking aside is determined by computing its posterior probability; however, the looking-aside state determination device according to the present invention may make the determination while also feeding further observed quantities into the DBN, such as the inter-vehicle distance output by a radar sensor or the lane type (traveling lane, passing lane, and so on) output by a road-to-vehicle communication device.

The looking-aside state determination device according to the present invention may also determine whether the driver is looking aside using another probabilistic model, such as a neural network.

FIG. 1 is a block diagram showing a configuration example of the looking-aside state determination device according to the present invention.
FIG. 2 is an example of a classification table of the observed quantities acquired by the looking-aside state determination device.
FIG. 3 is a schematic diagram showing a configuration example of the driving simulator.
FIG. 4 is a conceptual diagram showing the network topology of the DBN.
FIG. 5 is a flowchart showing the flow of the looking-aside alerting process.

Explanation of symbols

1 Control unit
10 Observed-quantity acquisition means
11 Looking-aside state learning means
12 Looking-aside state determination means
15 Storage unit
20 Road camera
21 Driver camera
22 Steering angle sensor
23 Accelerator depression amount sensor
24 Brake depression amount sensor
25 Direction indicator
26 Speed sensor
27 Acceleration sensor
28 Navigation device
29 Input device
30 Audio output device
100 Looking-aside state determination device
150 Transition model
151 Observation model
DR Subject
DS Driving simulator
MC Verification camera
P1, P2 Projectors
PG Projected image
SC Screen
TG Target image
VC Vehicle model
W1, W2 Wall surfaces

Claims (9)

1. A looking-aside state determination device comprising:
observed-quantity acquisition means for acquiring, as observed quantities, information relating to the driver's line of sight together with at least one of information relating to driving operations and information relating to the running state; and
looking-aside state determination means for determining a looking-aside state based on the stochastic relationship between the looking-aside state and the observed quantities, and on the observed quantities acquired by the observed-quantity acquisition means.

2. The looking-aside state determination device according to claim 1, wherein the looking-aside state determination means calculates the probability that the driver is looking aside and determines that the driver is looking aside when the calculated probability exceeds a threshold.

3. The looking-aside state determination device according to claim 2, wherein the looking-aside state determination means calculates the probability that the driver is looking aside using a Bayesian network.

4. The looking-aside state determination device according to claim 2 or 3, wherein the looking-aside state determination means learns a transition model and an observation model in advance and calculates the probability that the driver is looking aside based on the transition model, the observation model, and the observed quantities acquired by the observed-quantity acquisition means.

5. The looking-aside state determination device according to any one of claims 1 to 4, wherein the information relating to driving operations includes information relating to a steering angle, an accelerator depression amount, a brake depression amount, or a turn-signal operation.

6. The looking-aside state determination device according to any one of claims 1 to 4, wherein the information relating to the running state includes the speed or acceleration of the vehicle.

7. The looking-aside state determination device according to any one of claims 1 to 4, wherein the information relating to the running state includes the distance between the vehicle and the center line or the angle between the vehicle and the center line.

8. The looking-aside state determination device according to any one of claims 1 to 4, wherein the information relating to the running state includes information relating to the surrounding environment.

9. The looking-aside state determination device according to any one of claims 1 to 4, wherein the information relating to the driver's line of sight includes the tilt angle of the head.
JP2007269455A 2007-10-16 2007-10-16 Aside look status determination device Active JP5311793B2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2007269455A JP5311793B2 (en) 2007-10-16 2007-10-16 Aside look status determination device
US12/738,576 US8665099B2 (en) 2007-10-16 2008-10-16 Inattentive state determination device and method of determining inattentive state
EP08840164.1A EP2201496B1 (en) 2007-10-16 2008-10-16 Inattentive state determination device and method of determining inattentive state
PCT/IB2008/002739 WO2009050564A2 (en) 2007-10-16 2008-10-16 Inattentive state determination device and method of determining inattentive state

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007269455A JP5311793B2 (en) 2007-10-16 2007-10-16 Aside look status determination device

Publications (2)

Publication Number Publication Date
JP2009098900A true JP2009098900A (en) 2009-05-07
JP5311793B2 JP5311793B2 (en) 2013-10-09

Family

ID=40551472

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007269455A Active JP5311793B2 (en) 2007-10-16 2007-10-16 Aside look status determination device

Country Status (4)

Country Link
US (1) US8665099B2 (en)
EP (1) EP2201496B1 (en)
JP (1) JP5311793B2 (en)
WO (1) WO2009050564A2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011033840A1 (en) * 2009-09-18 2011-03-24 トヨタ自動車株式会社 Driving evaluation system, vehicle-mounted machine, and information processing center
JP2017504867A (en) * 2013-11-25 2017-02-09 ロベルト・ボッシュ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツングRobert Bosch Gmbh Method for evaluating driver motion in a vehicle
JP2017191367A (en) * 2016-04-11 2017-10-19 株式会社デンソー Safe driving support device and safe driving support program
JP2017224254A (en) * 2016-06-17 2017-12-21 アイシン精機株式会社 Visual recognition direction estimation device
KR20220095430A (en) * 2020-12-30 2022-07-07 한국과학기술원 Computer system for predicting concentration of driver through real-time analysis of driving data, and method thereof

Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7859392B2 (en) 2006-05-22 2010-12-28 Iwi, Inc. System and method for monitoring and updating speed-by-street data
US9067565B2 (en) 2006-05-22 2015-06-30 Inthinc Technology Solutions, Inc. System and method for evaluating driver behavior
US9129460B2 (en) 2007-06-25 2015-09-08 Inthinc Technology Solutions, Inc. System and method for monitoring and improving driver behavior
US9117246B2 (en) 2007-07-17 2015-08-25 Inthinc Technology Solutions, Inc. System and method for providing a user interface for vehicle mentoring system users and insurers
US8397228B2 (en) * 2007-11-14 2013-03-12 Continental Automotive Systems, Inc. Systems and methods for updating device software
US8520070B1 (en) 2008-10-30 2013-08-27 Rosco Inc. Method and system with multiple camera units installed in protective enclosure
US8963702B2 (en) * 2009-02-13 2015-02-24 Inthinc Technology Solutions, Inc. System and method for viewing and correcting data in a street mapping database
US8847771B2 (en) * 2013-01-25 2014-09-30 Toyota Motor Engineering & Manufacturing North America, Inc. Method and apparatus for early detection of dynamic attentive states for providing an inattentive warning
WO2014152802A1 (en) * 2013-03-14 2014-09-25 Gerald Murphy Driver control assistance sensor and method
DE102013013869B4 (en) * 2013-08-20 2019-06-19 Audi Ag Method for operating a driver assistance system to warn against excessive deflection and motor vehicle
TWI493511B (en) * 2013-09-04 2015-07-21 Ind Tech Res Inst Method and system for detecting conditions of drivers, and electronic apparatus thereof
US9172477B2 (en) 2013-10-30 2015-10-27 Inthinc Technology Solutions, Inc. Wireless device detection using multiple antennas separated by an RF shield
DE102014215461B4 (en) * 2014-08-05 2023-06-15 Robert Bosch Gmbh Method and device for operating a vehicle, in particular a railway vehicle
US10552750B1 (en) 2014-12-23 2020-02-04 Amazon Technologies, Inc. Disambiguating between multiple users
US10438277B1 (en) 2014-12-23 2019-10-08 Amazon Technologies, Inc. Determining an item involved in an event
US10475185B1 (en) * 2014-12-23 2019-11-12 Amazon Technologies, Inc. Associating a user with an event
US9878663B1 (en) 2016-12-07 2018-01-30 International Business Machines Corporation Cognitive dialog system for driving safety
US9718405B1 (en) 2015-03-23 2017-08-01 Rosco, Inc. Collision avoidance and/or pedestrian detection system
JP6516187B2 (en) * 2015-07-23 2019-05-22 パナソニックIpマネジメント株式会社 Sleepiness judging device, sleepiness judging method, sleepiness judging program and recording medium
KR102137213B1 (en) 2015-11-16 2020-08-13 삼성전자 주식회사 Apparatus and method for traning model for autonomous driving, autonomous driving apparatus
JP2019518287A (en) 2016-06-13 2019-06-27 ジーボ インコーポレーテッドXevo Inc. Method and system for car parking space management using virtual cycle
CN106384509A (en) * 2016-10-08 2017-02-08 大连理工大学 Urban road driving time distribution estimation method considering taxi operation states
US10832148B2 (en) 2016-12-07 2020-11-10 International Business Machines Corporation Cognitive dialog system for driving safety
EP3559897A4 (en) 2016-12-22 2020-08-12 Xevo Inc. Method and system for providing artificial intelligence analytic (aia) services using operator fingerprints and cloud data
US10373332B2 (en) * 2017-12-08 2019-08-06 Nvidia Corporation Systems and methods for dynamic facial analysis using a recurrent neural network
JP6888542B2 (en) * 2017-12-22 2021-06-16 トヨタ自動車株式会社 Drowsiness estimation device and drowsiness estimation method
US10552695B1 (en) * 2018-12-19 2020-02-04 GM Global Technology Operations LLC Driver monitoring system and method of operating the same
WO2020165908A2 (en) * 2019-02-17 2020-08-20 Guardian Optical Technologies Ltd System, device, and methods for detecting and obtaining information on objects in a vehicle
JP7047821B2 (en) * 2019-07-18 2022-04-05 トヨタ自動車株式会社 Driving support device
US10752253B1 (en) * 2019-08-28 2020-08-25 Ford Global Technologies, Llc Driver awareness detection system
CN113569699B (en) * 2021-07-22 2024-03-08 上汽通用五菱汽车股份有限公司 Attention analysis methods, vehicles and storage media
DE102023204260A1 (en) * 2023-05-09 2024-11-14 Robert Bosch Gesellschaft mit beschränkter Haftung Method and device for operating a vehicle and driver assistance system for a vehicle

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08207617A (en) * 1995-02-08 1996-08-13 Toyota Motor Corp Inattentive driving determination device
JP2001138767A (en) * 1999-11-17 2001-05-22 Nissan Motor Co Ltd Vehicle alarm system
JP2002331850A (en) * 2001-05-07 2002-11-19 Nissan Motor Co Ltd Driving behavior intention detection device
JP2003080969A (en) * 2001-09-11 2003-03-19 Nissan Motor Co Ltd Driver status detection device
JP2005208943A (en) * 2004-01-22 2005-08-04 Denso It Laboratory Inc System for providing service candidate, user side communication device, and service candidate server
JP2006202015A (en) * 2005-01-20 2006-08-03 Ntt Docomo Inc Gaze state detection device and gaze state estimation method
JP2006343904A (en) * 2005-06-08 2006-12-21 Xanavi Informatics Corp Driving support device
JP2007183931A (en) * 2005-12-07 2007-07-19 Matsushita Electric Ind Co Ltd Secure device, information processing terminal, server, and authentication method
JP2007183831A (en) * 2006-01-06 2007-07-19 Toyota Motor Corp Vehicle control device
JP2007261486A (en) * 2006-03-29 2007-10-11 Toyota Motor Corp Vehicle motion design device and vehicle motion control device

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3727078B2 (en) * 1994-12-02 2005-12-14 富士通株式会社 Display device
KR100214087B1 (en) * 1996-06-07 1999-08-02 서정욱 Information offering system and method thereof
US5798687A (en) * 1996-10-15 1998-08-25 Terry Mijal Vehicular safety system
US6198397B1 (en) * 1998-03-18 2001-03-06 Charles D. Angert Magnetic steering wheel movement sensing device
US6191686B1 (en) * 1999-09-27 2001-02-20 Edwin Z. Gabriel Collision avoidance system for cars
US6218947B1 (en) * 2000-05-23 2001-04-17 Ronald L. Sutherland Driver sleep alarm
JP3073732U (en) 2000-06-01 2000-12-08 一八 星山 Vehicle running recording device with car navigation function
US6317686B1 (en) * 2000-07-21 2001-11-13 Bin Ran Method of providing travel time
US7233933B2 (en) * 2001-06-28 2007-06-19 Microsoft Corporation Methods and architecture for cross-device activity monitoring, reasoning, and visualization for providing status and forecasts of a user's presence and availability
JP2004058799A (en) * 2002-07-26 2004-02-26 Murakami Corp On-vehicle passenger photographing device
JP4110902B2 (en) 2002-09-27 2008-07-02 Nissan Motor Co Ltd Looking-aside detection device
DE10355221A1 (en) * 2003-11-26 2005-06-23 Daimlerchrysler Ag A method and computer program for detecting inattentiveness of the driver of a vehicle
JP4389567B2 (en) * 2003-12-03 2009-12-24 日産自動車株式会社 Lane departure prevention device
JP4193765B2 (en) * 2004-01-28 2008-12-10 トヨタ自動車株式会社 Vehicle travel support device
US7639146B2 (en) * 2004-09-29 2009-12-29 Baura Gail D Blink monitor for detecting blink occurrence in a living subject

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011033840A1 (en) * 2009-09-18 2011-03-24 Toyota Motor Corp Driving evaluation system, vehicle-mounted machine, and information processing center
JP2011065527A (en) * 2009-09-18 2011-03-31 Toyota Motor Corp Driving evaluation system, vehicle-mounted machine, and information processing center
CN102549628A (en) * 2009-09-18 2012-07-04 Toyota Motor Corp Driving evaluation system, vehicle-mounted machine, and information processing center
JP2017504867A (en) * 2013-11-25 2017-02-09 Robert Bosch GmbH Method for evaluating driver motion in a vehicle
JP2017191367A (en) * 2016-04-11 2017-10-19 Denso Corp Safe driving support device and safe driving support program
JP2017224254A (en) * 2016-06-17 2017-12-21 Aisin Seiki Co Ltd Visual recognition direction estimation device
WO2017217044A1 (en) * 2016-06-17 2017-12-21 Aisin Seiki Co Ltd Viewing direction estimation device
KR20220095430A (en) * 2020-12-30 2022-07-07 한국과학기술원 Computer system for predicting concentration of driver through real-time analysis of driving data, and method thereof
KR102438689B1 (en) * 2020-12-30 2022-09-01 한국과학기술원 Computer system for predicting concentration of driver through real-time analysis of driving data, and method thereof

Also Published As

Publication number Publication date
US20100265074A1 (en) 2010-10-21
EP2201496B1 (en) 2018-05-23
WO2009050564A2 (en) 2009-04-23
WO2009050564A8 (en) 2010-07-15
JP5311793B2 (en) 2013-10-09
EP2201496A2 (en) 2010-06-30
WO2009050564A3 (en) 2009-09-03
US8665099B2 (en) 2014-03-04

Similar Documents

Publication Publication Date Title
JP5311793B2 (en) Aside look status determination device
CN108068821B (en) Device, system and method for determining concentration degree of driver
JP7544502B2 (en) Driving assistance device and data collection system
JP6497915B2 (en) Driving support system
CN107176169B (en) Automatic driving control system of vehicle
KR102051142B1 (en) System for managing dangerous driving index for vehicle and method thereof
US10336252B2 (en) Long term driving danger prediction system
FI124068B (en) Procedure for improving driving safety
US20200290628A1 (en) Personalized device and method for monitoring a motor vehicle driver
JP6024679B2 (en) Sign recognition device
CN111383477A (en) Information prompt device
JP2004157880A (en) Confirmation behavior evaluation device
US20220363266A1 (en) Systems and methods for improving driver attention awareness
JP6631569B2 (en) Operating state determining apparatus, operating state determining method, and program for determining operating state
JP7263997B2 (en) DRIVING ACTION EVALUATION DEVICE, METHOD AND PROGRAM
JP2020024551A (en) Driving consciousness estimation device
CN110383361A (en) Method and apparatus for reminding driver to start at optical signal equipment
JP7185251B2 (en) Driving information processing device and driving information processing program
JP6678147B2 (en) Danger sign determination device, danger sign determination system, vehicle, and program
US12223701B2 (en) Systems and methods for training event prediction models for camera-based warning systems
JP7376996B2 (en) Vehicle dangerous situation determination device, vehicle dangerous situation determination method, and program
JP2012103849A (en) Information provision device
JP2019012480A (en) Driving diagnostic device and driving diagnostic method
JP2023177045A (en) Seat position estimation device, seat position estimation method, and computer program for seat position estimation
CN119636759B (en) Driving behavior prediction method, device, electronic device and storage medium

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20100816

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20111208

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120110

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120223

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120612

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120712

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20130108

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130305

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A821

Effective date: 20130315

A911 Transfer to examiner for re-examination before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20130409

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20130611

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20130702

R151 Written notification of patent or utility model registration

Ref document number: 5311793

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250