
WO2014054152A1 - In-vehicle information processing device (Dispositif de traitement d'informations embarqué) - Google Patents


Info

Publication number
WO2014054152A1
WO2014054152A1 (application PCT/JP2012/075794, also referenced as JP2012075794W)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
driver
processing apparatus
passenger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2012/075794
Other languages
English (en)
Japanese (ja)
Inventor
下谷 光生
秀彦 大木
御厨 誠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to JP2014539539A priority Critical patent/JP5885852B2/ja
Priority to PCT/JP2012/075794 priority patent/WO2014054152A1/fr
Publication of WO2014054152A1 publication Critical patent/WO2014054152A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/161 Decentralised systems, e.g. inter-vehicle communication
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection

Definitions

  • the present invention relates to an in-vehicle information processing apparatus that alerts a driver of a host vehicle or controls traveling of the host vehicle based on other vehicle information acquired from another vehicle.
  • In Patent Document 1, whether or not the other vehicle is a vehicle that warrants attention is determined based on information (static information) unique to the driver of the other vehicle, which provides a certain alerting effect.
  • Depending on the state inside the vehicle, including the state of a passenger, the vehicle may still correspond to a vehicle to which attention should be paid.
  • Patent Document 1 does not determine whether the other vehicle warrants attention based on the state inside that vehicle, including the passenger's state. Therefore, the driver of the host vehicle is not sufficiently alerted that the other vehicle is one to which attention should be paid.
  • the present invention has been made to solve these problems, and an object of the present invention is to provide an in-vehicle information processing apparatus capable of sufficiently alerting the driver of the host vehicle.
  • An in-vehicle information processing apparatus according to the present invention includes: an other-vehicle position detection unit that detects the position of another vehicle existing around the host vehicle; a communication unit that acquires, by communication from the other vehicle detected by the other-vehicle position detection unit, other-vehicle information including other in-vehicle information indicating the state of a passenger other than the driver or of the vehicle itself; and a control unit that, based on the other in-vehicle information acquired by the communication unit, alerts the driver of the host vehicle or controls the traveling of the host vehicle.
  • Since the apparatus includes the other-vehicle position detection unit that detects the position of another vehicle existing around the host vehicle, the communication unit that acquires from the detected other vehicle the other-vehicle information including other in-vehicle information indicating the state of a passenger other than the driver or of the vehicle itself, and the control unit that, based on the acquired other-vehicle information, alerts the driver of the host vehicle or controls its traveling, the driver of the host vehicle can be sufficiently alerted.
  • FIG. 1 is a diagram illustrating an application example of the in-vehicle information processing apparatuses 100 and 200 according to the first embodiment.
  • the vehicle A and the vehicle B are traveling in the same direction, and the vehicle C is traveling in the oncoming lane.
  • the vehicle information processing apparatus 100 is mounted on the vehicle A
  • the vehicle information processing apparatus 200 is mounted on the vehicle B, so that the vehicle A and the vehicle B can communicate with each other by inter-vehicle communication.
  • the in-vehicle information processing apparatus 100 will be described as a receiving-side apparatus that receives information transmitted from the vehicle B.
  • the on-vehicle information processing apparatus 200 will be described as a transmission-side apparatus that transmits information to the vehicle A.
  • FIG. 2 is a block diagram illustrating an example of the configuration of the in-vehicle information processing apparatus 100.
  • the host vehicle is described as vehicle A and the other vehicle is described as vehicle B.
  • The in-vehicle information processing apparatus 100 includes an other vehicle position detection unit 101, a communication unit 102, a GUI (Graphical User Interface) unit 103, an attention level calculation unit 104, a map DB (Database) 105, an in-vehicle sensor I/F (interface) unit 106, and a control unit 107.
  • the other vehicle position detection unit 101 is connected to the ultrasonic sensor 108 and the image sensor 109.
  • the other vehicle position detection unit 101 detects the relative position of the other vehicle (vehicle B) existing around the host vehicle (vehicle A) based on the detection result by the ultrasonic sensor 108 or the image sensor 109.
  • An example of the image sensor 109 is a camera.
  • the communication unit 102 performs inter-vehicle communication with the vehicle B and acquires other vehicle information from the vehicle B.
  • the other vehicle information is information including all of the information related to the other vehicle (vehicle B).
  • the communication means may be any means such as a wireless LAN (Local Area Network), UWB (Ultra Wide Band), or optical communication.
  • the GUI unit 103 is connected to a touch panel 110, a liquid crystal monitor 111 (display unit), and a speaker 112.
  • the GUI unit 103 inputs driver operation information acquired via the touch panel 110 to the control unit 107.
  • display information input from the control unit 107 is output to the liquid crystal monitor 111
  • audio information input from the control unit 107 is output to the speaker 112.
  • the attention level calculation unit 104 calculates the attention level for the vehicle B based on the other vehicle information acquired from the vehicle B via the communication unit 102.
  • The attention level is the degree to which the driver of the vehicle A should pay attention to the vehicle B, and the attention level calculation unit 104 calculates at least two attention levels (two steps).
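The two-step attention determination described above can be pictured as follows. This is a minimal sketch; the function and field names are illustrative assumptions, not taken from the patent, and the triggering conditions are drawn from the in-vehicle states the patent mentions elsewhere (a crying infant, a passenger talking to the driver, an unstable load).

```python
def calculate_attention_level(other_in_vehicle_info: dict) -> int:
    """Return 0 (no special attention needed) or 1 (attention required),
    as a two-step attention level computed from other in-vehicle information."""
    # Each condition corresponds to an in-vehicle state the patent treats
    # as potentially distracting for the other vehicle's driver.
    if other_in_vehicle_info.get("infant_crying"):
        return 1
    if other_in_vehicle_info.get("passenger_talking_to_driver"):
        return 1
    if other_in_vehicle_info.get("load_unstable"):
        return 1
    return 0
```

A level of 0 would lead to the "no caution" display (the white triangle of FIG. 5), and a level of 1 to the highlighted display (FIG. 6).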
  • the map DB 105 stores map data.
  • The in-vehicle sensor I/F unit 106 is connected to a GPS (Global Positioning System) 113, a vehicle speed pulse 114, a gyro sensor 115, a vehicle control device 116, an engine control device 117, a body system control device 118, and the like via an in-vehicle LAN 119.
  • the control unit 107 can receive and instruct various information via the in-vehicle LAN 119 and the in-vehicle sensor I / F unit 106.
  • Information acquired by each of the GPS 113, the vehicle speed pulse 114, and the gyro sensor 115 is input to the control unit 107 via the in-vehicle sensor I/F unit 106, and the control unit 107 detects the position of the host vehicle from it. That is, the control unit 107 also functions as a host-vehicle position detection unit.
  • the vehicle control device 116 inputs an operation by a driver from a brake pedal, an accelerator pedal, or a steering wheel, and controls the traveling of the host vehicle.
  • the speed of the host vehicle is controlled by controlling the engine speed, the brake system, etc., or the traveling direction of the host vehicle is controlled by controlling the attitude of the shaft. It also controls semi-automatic driving functions such as auto cruise.
  • the engine control device 117 performs fuel control and ignition timing control.
  • the body system control device 118 controls operations that are not directly related to traveling in the host vehicle. For example, it controls wiper driving, lighting information transmission, blinker lighting, door opening and closing, window opening and closing.
  • the control unit 107 controls each component of the in-vehicle information processing apparatus 100.
  • FIG. 3 is a block diagram showing an example of the configuration of the in-vehicle information processing apparatus 200.
  • the host vehicle is described as vehicle B and the other vehicle is described as vehicle A.
  • the in-vehicle information processing apparatus 200 includes an in-vehicle state detection unit 201, a communication unit 202, a GUI unit 203, a driver dynamic state detection unit 204, a map DB 205, and a position detection unit 206.
  • the driver static information acquisition unit 207 and the control unit 208 are provided.
  • The in-vehicle state detection unit 201 is connected to the in-vehicle detection sensor 209 and detects the state inside the vehicle B based on its detection result: for example, the presence or absence of a passenger, the passenger's state, and the state of the vehicle interior.
  • The in-vehicle detection sensor 209 is, for example, a camera serving as an image sensor, a pressure sensor provided in each seat to detect whether a passenger is sitting there, or a microphone that picks up sound inside the vehicle B.
  • Information indicating the state in the vehicle B detected by the in-vehicle state detection unit 201 can be used as in-vehicle information, and the in-vehicle information can be included in the own vehicle information and transmitted to the vehicle A by the communication unit 202.
  • Here, a passenger means a person present in the vehicle other than the driver, and on-board objects include loads and animals (pets) present in the vehicle.
  • the communication unit 202 performs inter-vehicle communication with the vehicle A, and transmits own vehicle information to the vehicle A.
  • The host vehicle information is information including all information related to the host vehicle (vehicle B) transmitted to the other vehicle (vehicle A), and corresponds to the other vehicle information acquired by the communication unit 102 in FIG. 2.
  • the communication means may be any means such as wireless LAN, UWB, or optical communication.
  • the GUI unit 203 is connected to the touch panel 210 and the liquid crystal monitor 211.
  • the GUI unit 203 inputs driver operation information acquired via the touch panel 210 to the control unit 208.
  • the display information input from the control unit 208 is output to the liquid crystal monitor 211.
  • The driver dynamic state detection unit 204 detects the current activity state of the driver of the vehicle B. The detected information is used as driver dynamic information, included in the own vehicle information, and transmitted to the vehicle A by the communication unit 202.
  • the map DB 205 stores map data.
  • the position detection unit 206 is connected to the GPS 212 and the vehicle speed pulse 213.
  • the position detection unit 206 detects the position of the host vehicle based on information acquired from each of the GPS 212 and the vehicle speed pulse 213.
  • the driver static information acquisition unit 207 acquires driver static information that is information unique to the driver of the vehicle B.
  • Examples of the driver static information include information related to driver sign display (information such as beginners and elderly people), driver's license information, or accident history information.
  • the driver static information acquired by the driver static information acquisition unit 207 can be included in the host vehicle information and transmitted to the vehicle A by the communication unit 202.
  • the control unit 208 controls each component of the in-vehicle information processing apparatus 200.
  • the H / F (Hands Free) device 214 is a device for performing H / F calls (hands-free calls), and is connected to the control unit 208.
  • the AV (Audio Visual) device 215 is a device for reproducing audio or video such as radio or music, and is connected to the control unit 208.
  • The in-vehicle information detected by the in-vehicle state detection unit 201 includes passenger static information (information unique to a passenger), passenger dynamic information (information indicating the passenger's current activity state), and load information indicating the loading state of loads present in the vehicle.
  • Passenger static information includes, for example, information on the presence or absence of a passenger, information on the seating position of the passenger, or information on the attributes of the passenger.
  • the passenger attributes include age, sex, and presence / absence of handicap.
  • the passenger's attribute or seating position may be acquired by image recognition.
  • Passenger dynamic information is, for example, information indicating that a passenger is talking to the driver, that the driver is alone in the vehicle, or that an infant is crying.
  • Passenger dynamic information may be acquired by image processing or voice recognition processing.
  • the load information is information indicating the loading state of the load in the vehicle, and may be acquired by image processing using a camera.
  • information indicating the state of an animal such as a pet present in the vehicle may be included in the in-vehicle information.
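The categories of in-vehicle information above can be pictured as a single payload that the communication unit 202 would include in the own vehicle information. This is only a sketch of one possible data shape; all class and field names are assumptions for illustration, not definitions from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class PassengerStaticInfo:
    """Information unique to one passenger (presence, seat, attributes)."""
    present: bool
    seat: Optional[str] = None      # e.g. "front-left", "rear-right"
    age: Optional[int] = None
    handicap: bool = False

@dataclass
class InVehicleInfo:
    """State inside the vehicle as detected by the in-vehicle state detection unit."""
    passenger_static: List[PassengerStaticInfo] = field(default_factory=list)
    passenger_dynamic: dict = field(default_factory=dict)  # e.g. {"infant_crying": True}
    load_info: dict = field(default_factory=dict)          # e.g. {"load_unstable": False}
    pet_state: Optional[str] = None                        # e.g. "restless"

@dataclass
class OwnVehicleInfo:
    """Own vehicle information transmitted to the other vehicle."""
    vehicle_id: str
    position: Tuple[float, float]   # (latitude, longitude)
    in_vehicle: InVehicleInfo = field(default_factory=InVehicleInfo)
```

On the receiving side this same payload is what the patent calls the other vehicle information.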
  • FIG. 4 is a flowchart showing an example of the operation of the in-vehicle information processing apparatus 100.
  • step S41 the control unit 107 detects the current position of the vehicle A, which is the host vehicle, based on the information acquired by the GPS 113, the vehicle speed pulse 114, and the gyro sensor 115.
  • the control unit 107 generates image data that displays the position of the host vehicle (the position of the vehicle A) on the map.
  • the generated image data is input to the liquid crystal monitor 111 via the GUI unit 103, and an image is displayed on the liquid crystal monitor 111.
  • In step S42, it is determined whether the vehicle B, another vehicle around the vehicle A, is detected. If it is detected, the process proceeds to step S43; otherwise, the process proceeds to step S46.
  • the vehicle B is detected by the other vehicle position detection unit 101 based on information from the ultrasonic sensor 108 or the image sensor 109.
  • step S43 the communication unit 102 acquires other vehicle information including other in-vehicle information of the vehicle B through inter-vehicle communication.
  • Other vehicle information is acquired at predetermined intervals (for example, every 0.1 seconds).
  • The vehicle A may acquire the other vehicle information after making a communication request to the vehicle B. Alternatively, when the vehicle B continuously transmits its information, the vehicle A may simply receive what is transmitted.
  • step S44 the attention level calculation unit 104 calculates the attention level based on the in-vehicle information of the vehicle B included in the other vehicle information.
  • Here, the caution level calculation unit 104 calculates two caution levels (two steps): attention required or attention not required.
  • control unit 107 determines a display method of the vehicle B on the map based on the attention level calculated by the attention level calculation unit 104.
  • step S45 the control unit 107 outputs the image data to the liquid crystal monitor 111 via the GUI unit 103 so as to display the display method determined in step S44.
  • the liquid crystal monitor 111 displays the vehicle B on the map based on the image data input from the control unit 107.
  • In step S46, it is determined whether the driving of the vehicle A has ended. If it has, the process ends; if it has not, the process returns to step S41.
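One cycle of the receiving-side flow of FIG. 4 (steps S41 through S45) can be sketched as below. The unit objects and their method names are placeholders standing in for the blocks of FIG. 2; their interfaces are assumptions for this sketch, not the patent's.

```python
def receive_side_cycle(position_unit, detector, comm, attention_calc, display):
    """One pass through steps S41-S45 of the receiving-side apparatus 100."""
    # S41: detect the host vehicle position and show it on the map.
    own_position = position_unit.detect()
    display.show_own_vehicle(own_position)
    # S42: check whether another vehicle (vehicle B) is detected nearby.
    other = detector.detect_other_vehicle()
    if other is None:
        return  # no other vehicle; the real loop would go to S46
    # S43: acquire other vehicle information by inter-vehicle communication.
    other_info = comm.acquire_other_vehicle_info(other)
    # S44: calculate the attention level from the other in-vehicle information.
    level = attention_calc(other_info["in_vehicle"])
    # S45: display vehicle B with a method reflecting the attention level.
    display.show_other_vehicle(other, attention_required=(level > 0))
```

The end-of-driving check (S46) would wrap this function in a loop in a full implementation.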
  • FIG. 5 is a diagram illustrating an example of display on the vehicle A when attention to the vehicle B is unnecessary.
  • the caution level calculation unit 104 calculates the caution level based on the information in the other vehicle included in the other vehicle information acquired from the vehicle B.
  • When the control unit 107 determines, based on the attention level calculated by the attention level calculation unit 104, that attention to the vehicle B is unnecessary (that is, that the state inside the vehicle B is good), the vehicle B is displayed on the liquid crystal monitor 111 accordingly: for example, as shown in FIG. 5, as a white triangle.
  • the driver of the vehicle A can easily recognize that the vehicle B is not a vehicle requiring attention.
  • FIG. 6 is a diagram illustrating an example of display on the vehicle A when attention to the vehicle B is necessary.
  • the caution level calculation unit 104 calculates the caution level based on the information in the other vehicle included in the other vehicle information acquired from the vehicle B.
  • When the control unit 107 determines, based on the attention level calculated by the attention level calculation unit 104, that attention to the vehicle B is necessary (that is, that attention must be paid to the state inside the vehicle B), the vehicle B is displayed on the liquid crystal monitor 111 accordingly: for example, as shown in FIG. 6, in a color different from that of the vehicle A (different hatching in FIG. 6).
  • the driver of the vehicle A can easily recognize that the vehicle B is a vehicle requiring attention.
  • FIG. 7 is a flowchart showing an example of the operation of the in-vehicle information processing apparatus 200.
  • step S71 the control unit 208 detects the current position of the vehicle B, which is the host vehicle, based on the information acquired by the GPS 212 and the vehicle speed pulse 213.
  • the control unit 208 generates image data for displaying the own vehicle position (the position of the vehicle B) on the map based on the position detection result of the vehicle B and the map data stored in the map DB 205.
  • the generated image data is input to the liquid crystal monitor 211 via the GUI unit 203, and an image is displayed on the liquid crystal monitor 211.
  • step S72 the in-vehicle state detection unit 201 detects the state in the vehicle B.
  • step S73 the control unit 208 determines whether there is a communication request from the vehicle A, which is another vehicle, via the communication unit 202. If there is a communication request from the vehicle A, the process proceeds to step S74. On the other hand, if there is no communication request from the vehicle A, the process proceeds to step S75. That is, when there is a communication request from the vehicle A, the control unit 208 controls the communication unit 202 to transmit the own vehicle information to the vehicle A.
  • step S74 information indicating the state in the vehicle B detected by the in-vehicle state detection unit 201 is used as the in-vehicle information, and the in-vehicle information is included in the own vehicle information and transmitted from the communication unit 202 to the vehicle A.
  • The own vehicle information and own in-vehicle information transmitted in step S74 correspond to the other vehicle information and other in-vehicle information acquired in step S43 of FIG. 4.
  • step S75 it is determined whether or not the driving of the vehicle B has been completed. When the driving of the vehicle B is finished, the process is finished. On the other hand, if the driving of the vehicle B has not ended, the process proceeds to step S71.
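The transmitting-side flow of FIG. 7 (steps S71 through S74) can be sketched in the same way. Again, the unit interfaces are placeholder assumptions for the blocks of FIG. 3.

```python
def transmit_side_cycle(position_unit, in_vehicle_sensor, comm):
    """One pass through steps S71-S74 of the transmitting-side apparatus 200."""
    # S71: detect the host vehicle (vehicle B) position; map display omitted here.
    own_position = position_unit.detect()
    # S72: detect the state inside the vehicle B.
    in_vehicle_info = in_vehicle_sensor.detect_state()
    # S73: check for a communication request from the vehicle A.
    if comm.has_request():
        # S74: include the in-vehicle information in the own vehicle
        # information and transmit it to the vehicle A.
        comm.send_own_vehicle_info({
            "position": own_position,
            "in_vehicle": in_vehicle_info,
        })
    # S75: the end-of-driving check would decide whether to loop back to S71.
```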
  • According to the first embodiment, it can easily be determined whether the other vehicle is one requiring caution, because the display method of the other vehicle is changed based on the state inside it. The driver of the host vehicle can therefore be sufficiently alerted.
  • <Modification 1> In the first embodiment, the display method of the other vehicle is determined based on the attention level calculated by the attention level calculation unit 104 in step S44 of FIG. 4; however, the invention is not limited to this.
  • the traveling of the host vehicle may be controlled based on the attention level.
  • For example, the control unit 107 controls the vehicle control device 116, which handles semi-automatic driving such as auto cruise, based on the dynamic state of the driver of the other vehicle. Under this control, the vehicle control device 116 increases the inter-vehicle distance when attention to the other vehicle is required, and keeps the inter-vehicle distance at its normal length when attention is not required.
  • a warning earlier than usual may be notified to the driver when the attention level is high.
  • a warning by voice or the like may be output to the driver of the own vehicle based on the attention level.
  • For example, the control unit 107 performs control so that an alarm is output from the speaker 112 when, based on the attention level, attention to the other vehicle is required.
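Modification 1 (adjusting the auto-cruise following distance and issuing an audible warning from the attention level) can be sketched as follows. The threshold, gap distances, and object interfaces are illustrative assumptions, not values from the patent.

```python
# Illustrative gap lengths in metres (assumed values, not from the patent).
NORMAL_GAP_M = 40
CAUTION_GAP_M = 60

def apply_attention(level: int, vehicle_control, speaker) -> None:
    """React to the calculated attention level: lengthen the following
    distance and sound a warning when attention is required."""
    if level > 0:
        vehicle_control.set_following_distance(CAUTION_GAP_M)  # widen the gap
        speaker.play_warning()                                  # audible alert
    else:
        vehicle_control.set_following_distance(NORMAL_GAP_M)    # normal gap
```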
  • the in-vehicle information processing apparatus 200 of the vehicle B may include a caution level calculation unit (not shown), and the caution level calculation unit may calculate the caution level.
  • the vehicle B includes information on the calculated attention level in its own vehicle information and transmits it to the vehicle A.
  • Based on the received attention level, the vehicle A alerts its driver or controls its traveling.
  • information regarding passengers of the vehicle B may be set and stored in advance.
  • For example, attributes such as the passenger's age and information on the seating position may be set in advance via an input unit such as the touch panel 210 and stored, and the stored setting may be selected at the time of departure.
  • the above information may be set when the vehicle B departs.
  • Since attributes such as the passenger's age and the seating position can be stored and selected in advance, it is not necessary to detect them with the in-vehicle detection sensor 209.
  • Alternatively, in response to an inquiry from the in-vehicle information processing apparatus 200, the passenger's communication terminal may automatically reply with information about its owner that was registered in the terminal in advance. In this case, the passenger is saved the trouble of entering it.
  • The seating position may be detected automatically through communication between the terminal and an in-vehicle communication device installed near each seat. Alternatively, the vehicle or the terminal may detect the position of the passenger who owns the terminal when a contactless terminal such as FeliCa (registered trademark) is touched to a reader installed near the seat.
  • the information regarding the passenger of the vehicle B may be settable from a communication terminal (for example, a smartphone) possessed by the passenger.
  • For example, an input interface for entering passenger information is acquired (downloaded) from an external server to the communication terminal, and information about the passenger (attributes such as age, and the seating position) is entered and set through it. The set information is then transmitted from the communication terminal to the in-vehicle information processing apparatus 200.
  • the information regarding the passengers may be in a form of replying to an inquiry from the in-vehicle information processing apparatus 200.
  • Since attributes such as the passenger's age and the seating position are acquired by communication with the passenger's terminal, there is no need to detect them with the in-vehicle detection sensor 209.
  • In the above, the other vehicle position detection unit 101 detects the position of the other vehicle using the ultrasonic sensor 108 and the image sensor 109; however, the detection method is not limited to this.
  • For example, the vehicle number, which is information unique to the other vehicle, is recognized by image processing with the image sensor 109, and the vehicle number information included in the other vehicle information received via the communication unit 102 is acquired. The other vehicle may then be identified by collating the number recognized by the image sensor 109 with the number acquired via the communication unit 102.
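The collation of a camera-recognized plate number against the numbers carried in received other vehicle information can be sketched as below; the function name and data shapes are assumptions for illustration.

```python
from typing import Optional

def match_vehicle(recognized_plate: str, received_infos: list) -> Optional[dict]:
    """Identify which communicating vehicle is the one seen by the camera,
    by matching the recognized plate number against the vehicle number
    contained in each received other-vehicle information record."""
    for info in received_infos:
        if info.get("vehicle_number") == recognized_plate:
            return info
    return None  # no communicating vehicle matches the camera reading
```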
  • In the second embodiment, the in-vehicle information processing apparatus 100 does not include the other vehicle position detection unit 101, the ultrasonic sensor 108, or the image sensor 109 that the apparatus according to the first embodiment includes (see FIG. 2).
  • Other configurations and operations are the same as those of the first embodiment, and thus description thereof is omitted here.
  • In the host vehicle (hereinafter, vehicle A), a communication request is issued to another vehicle (hereinafter, vehicle B) by inter-vehicle communication. If a response is received from the vehicle B (that is, communication with the vehicle B is possible), the vehicle B is assumed to exist. Thereafter, the position information of the vehicle B is acquired from the vehicle B by inter-vehicle communication.
  • the on-vehicle information processing apparatus 100 does not include the other vehicle position detection unit 101, the ultrasonic sensor 108, and the image sensor 109.
  • the configuration can be simplified as compared with the first embodiment.
  • The method by which the other vehicle detects its own position is arbitrary, but adopting position detection using the quasi-zenith satellite is particularly effective because of its good accuracy.
  • In Embodiment 3 of the present invention, a case is described in which communication between the host vehicle (hereinafter, vehicle A) and another vehicle (hereinafter, vehicle B) is performed through a predetermined communication network other than inter-vehicle communication.
  • description thereof is omitted here.
  • the vehicle A and the vehicle B may communicate with each other via a wide area communication network such as a mobile phone.
  • communication may be performed via DSRC (Dedicated Short Range Communication) (registered trademark) or road-to-vehicle communication by wireless LAN.
  • the vehicle A acquires the position information of the vehicle B, it may be acquired from an apparatus for detecting a vehicle installed on the roadside.
  • As described above, the communication unit 102 of the vehicle A can acquire the other vehicle information from the vehicle B via a predetermined communication network, and the same effects as in the first and second embodiments are obtained.
  • FIG. 8 is a diagram showing an example of display on the host vehicle (vehicle A) when traveling on a road having a plurality of lanes. Vehicles B, C, and D indicate other vehicles.
  • Which lane each vehicle is traveling in can be detected based on lane information included in the map data of the map DB (for example, map DB 105 or 205) provided in each of the vehicles A to D, together with white-line recognition by a camera provided in each vehicle (for example, the image sensor 109 of the vehicle A).
  • Alternatively, the lane can be detected based on the lane information in the map DB (for example, map DB 105 or 205) of each of the vehicles A to D and the host-vehicle position information obtained in each vehicle using the quasi-zenith satellite.
  • The vehicle A acquires, from the vehicles B to D, information on the lanes and positions in which they travel. That is, the positions of the vehicles B to D are specified based on their position information, or on information specifying the traveling road, included in the other vehicle information. Then, from the acquired lane and position information of the vehicles B to D and the lane and position in which the vehicle A travels, it can be judged where the vehicles B to D are traveling relative to the vehicle A.
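The relative-placement judgment described above can be sketched as follows. The coordinate convention (a lane index increasing toward the right, and a position in metres along the road) is an assumption made for this sketch.

```python
def relative_placement(own: dict, other: dict) -> str:
    """Judge where another vehicle travels relative to the host vehicle,
    from lane indices and along-road positions."""
    lane_diff = other["lane"] - own["lane"]   # >0: other vehicle is to the right
    ahead = other["position_m"] > own["position_m"]
    if lane_diff == 0:
        return "ahead, same lane" if ahead else "behind, same lane"
    side = "right" if lane_diff > 0 else "left"
    return f"{'ahead' if ahead else 'behind'}, lane to the {side}"
```

The result would drive where each of the vehicles B to D is drawn relative to the vehicle A on the display.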
  • FIGS. 9(a) to 9(d) show the position display presented in each of the vehicles A to D. Each vehicle can thereby display in which lane and at which position the other vehicles travel relative to itself.
  • The positions of the other vehicles (vehicles B to D) may be displayed relative to the host vehicle (vehicle A) so that the driver can grasp them easily.
  • In addition, the display content for each other vehicle is changed depending on whether or not that vehicle is in a state requiring caution.
  • Since the positions of the vehicles B to D are specified based on their position information, or on information specifying the traveling road, included in the other vehicle information acquired from them, it is possible to determine where the vehicles B to D are traveling relative to the vehicle A and to alert the driver of the host vehicle accordingly.
  • In Embodiments 1 to 4, the attention level calculation unit 104 calculates a two-step attention level based on the in-vehicle information of the other vehicle (hereinafter referred to as vehicle B). In Embodiment 5, a multi-step attention level is calculated based on the driver dynamic information, driver static information, in-vehicle information, and position information of vehicle B. The other configurations and operations are the same as in Embodiments 1 to 4, and their description is therefore omitted here.
  • The attention level calculation unit 104 can calculate a more detailed attention level by acquiring such information indicating the state of the driver of the other vehicle.
  • Predetermined levels or coefficients are set according to the state of each piece of information.
  • The levels set according to the state of each piece of information will be described with reference to FIGS. 10 to 14.
  • FIG. 10 is a diagram showing an example of the relationship between driver dynamic information and level.
  • the level L1 is set according to the activity state (dynamic state) of the driver of the vehicle B.
  • “during loud music” means that the driver is listening to music at a loud volume.
  • “when arousal is reduced” refers to a state in which the driver feels drowsy, for example.
  • FIG. 11 is a diagram showing an example of the relationship between the driver static information and the level.
  • the level L2 is set according to information unique to the driver of the vehicle B.
  • A gold license means a driver's license issued to a superior driver (no accidents and no violations during the five years before the license expiration date); the color of the driver's license is gold.
  • A normal license means a driver's license issued to a driver other than a superior driver; the color of the driver's license is green or blue.
  • A driver-sign display vehicle refers to a vehicle that displays a sign indicating the driver's state, for example, a vehicle displaying a beginner driver sign (beginner mark), an elderly driver sign (elderly driver mark), a physically handicapped driver sign (handicapped person mark), or a hearing-impaired driver sign (deaf person mark).
  • FIG. 12 is a diagram showing an example of the relationship between the state of a passenger or pet and the level.
  • The in-vehicle state of vehicle B is here taken to be the state of a passenger or pet, and the level L3 is set according to that state.
  • FIG. 13 is a diagram showing an example of the relationship between the load information and the level.
  • level L4 is set according to the state of the load.
  • "Predetermined height or higher" indicates that the load is loaded to a predetermined height or above, and "predetermined height or lower" indicates that the load is loaded below the predetermined height.
  • FIG. 14 is a diagram illustrating an example of the relationship between the vehicle position of the vehicle B and the coefficient R.
  • the coefficient R is set according to the vehicle position of the vehicle B.
  • The control unit 107 performs control for alerting the driver and semi-automatic driving control (control of the inter-vehicle distance) based on the attention level calculated according to equation (1).
  • FIG. 15 is a diagram illustrating an example of the relationship between the attention level L and the attention calling method.
  • The control unit 107 performs alerting according to the multi-step attention level L calculated by the attention level calculation unit 104.
  • the “display” column shows a display example of the vehicle B on the map of the liquid crystal monitor 111.
  • the “voice” column shows an example of the voice output from the speaker 112.
  • As described above, a multi-step attention level is calculated based on the driver dynamic information, driver static information, other in-vehicle information, and position information of the other vehicle (vehicle B). The control unit 107 can therefore perform appropriate alerting and semi-automatic driving control according to the state (attention level) of the other vehicle. For example, when the driver of the other vehicle is in a state of reduced alertness, extra attention to that vehicle is required, so the driver of the host vehicle can be alerted to be more careful.
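Equation (1) itself is not reproduced in this excerpt; a minimal sketch of one plausible form, in which the four levels are summed and scaled by the position coefficient R, is shown below. The specific level values are hypothetical:

```python
def attention_level(l1, l2, l3, l4, r):
    """Combine the per-category levels into one attention level L.

    Assumed form (equation (1) is not given in this excerpt): the
    driver-dynamic, driver-static, passenger/pet, and load levels are
    summed, then scaled by the position coefficient R.
    """
    return (l1 + l2 + l3 + l4) * r

# Hypothetical inputs: drowsy driver (L1=3), normal license (L2=1),
# no passengers (L3=0), no load (L4=0), vehicle directly ahead (R=1.0).
l = attention_level(3, 1, 0, 0, 1.0)
print(l)  # 4.0
```

The control unit would then select the alerting method from a table such as FIG. 15 according to the resulting value of L.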
  • In Embodiment 6 of the present invention, a case involving another vehicle that can communicate with the host vehicle (hereinafter referred to as vehicle A), referred to as vehicle B, and another vehicle that cannot communicate with vehicle A, referred to as vehicle C, will be described with reference to FIGS. 16 to 20. The configuration and operation are the same as in Embodiments 1 to 5, and their description is therefore omitted here.
  • FIG. 16 is a diagram illustrating an example of display on the vehicle A.
  • Vehicle B travels in front of vehicle A, and vehicle C travels behind vehicle A. Since inter-vehicle communication is established between vehicle A and vehicle B (communication is possible), an antenna (the downward triangle added to vehicle A and vehicle B) is displayed on vehicle A and vehicle B. The display of vehicle B may be changed according to, for example, the attention level L shown in FIG. 15.
  • Since inter-vehicle communication has not been established between vehicle A and vehicle C (communication is not possible), no antenna is displayed on vehicle C.
  • In FIG. 17, vehicle A and vehicle B are displayed so as to be connected by a dashed arrow. The other display elements are the same as in FIG. 16.
  • the display as shown in FIG. 17 makes it easier for the driver to visually recognize that the communication between the vehicle A and the vehicle B is established.
  • The display as shown in FIG. 16 may be used, or the display as shown in FIG. 17 may be used.
  • Here, the display contents are determined by the value of L, but the individual states of L1, L2, L3, and L4 may also be expressed.
  • For example, the state of the driver may be displayed by changing the color of the triangle display, and the triangle display may likewise be changed according to the state of a passenger or of the vehicle.
  • FIG. 18 is a diagram illustrating another example of display on the vehicle A.
  • Vehicle B travels in front of vehicle A, and vehicle C travels behind vehicle A. Inter-vehicle communication has been established between vehicle A and vehicle B (communication is possible). At this time, vehicle B is in a state that does not require attention.
  • the vehicle C is displayed in a somewhat three-dimensional manner.
  • the display as shown in FIG. 19 makes it easy for the driver of the vehicle A to visually recognize that the vehicle B is a vehicle to be aware of.
  • FIG. 20 is a diagram illustrating another example of display on the vehicle A.
  • A black square is displayed on vehicle A, and white squares are displayed on vehicles B and C. This indicates that vehicle A can communicate with vehicles B and C.
  • an initial driver sign (beginner mark) is displayed on the vehicle C.
  • a square is not displayed on the vehicle D. This indicates that the vehicle D cannot communicate between vehicles.
  • An elderly driver sign (elderly person mark) is displayed on vehicle D.
  • The beginner driver sign (beginner mark) of vehicle C and the elderly driver sign (elderly person mark) of vehicle D can be acquired by the image sensor 109 provided in vehicle A. Moreover, as long as vehicles capable of inter-vehicle communication can be displayed distinguishably, any shape, not only a square, may be used.
  • Since the control unit 107 controls the liquid crystal monitor 111 so that the display relating to vehicle B, with which communication is possible and whose in-vehicle information is available, differs from the display relating to vehicle C, with which communication is not possible, the driver of vehicle A can easily see the state of the other vehicles. Sufficient attention can therefore be called to the driver.
  • The control unit 107 can also control the traveling of vehicle A according to whether communication with another vehicle is possible (vehicle B) or not (vehicle C) and, when it is possible, according to the acquired in-vehicle information.
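A sketch of this display selection follows. The marker names and the two-state attention flag are illustrative assumptions; the actual embodiment draws an antenna triangle for communicating vehicles and changes the display for vehicles requiring attention:

```python
def display_marker(can_communicate, needs_attention=False):
    """Choose how to draw another vehicle on the monitor.

    A communicating vehicle gets an antenna mark (as in FIG. 16); if it
    additionally requires attention, it is drawn highlighted. A vehicle
    that cannot communicate gets no antenna.
    """
    if not can_communicate:
        return "plain"
    return "antenna-highlighted" if needs_attention else "antenna"

markers = {
    "B": display_marker(can_communicate=True, needs_attention=True),
    "C": display_marker(can_communicate=False),
}
print(markers)  # {'B': 'antenna-highlighted', 'C': 'plain'}
```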
  • In Embodiment 7, a case will be described in which the in-vehicle information processing apparatus has both a transmission-side function for transmitting host-vehicle information and a reception-side function for receiving other-vehicle information transmitted from another vehicle.
  • FIG. 21 is a diagram illustrating an example of the configuration of the in-vehicle information processing device 300 according to the seventh embodiment.
  • the in-vehicle information processing apparatus 300 has a configuration in which the in-vehicle information processing apparatus 100 shown in FIG. 2 and the in-vehicle information processing apparatus 200 shown in FIG. 3 are combined.
  • the configuration and operation of the in-vehicle information processing apparatus 300 are the same as those of the in-vehicle information processing apparatuses 100 and 200 according to Embodiments 1 to 6, and thus description thereof is omitted here.
  • Since the in-vehicle information processing apparatus 300 has both the transmission-side function and the reception-side function, the drivers of the respective vehicles can be alerted to exercise caution.
  • In the above, the control unit 107 has been described as detecting the host-vehicle position based on information acquired by the GPS 113, the vehicle speed pulse 114, and the gyro sensor 115; however, the in-vehicle sensor I/F unit 106 may instead have the function of detecting the host-vehicle position.
  • The detection of the relative position of another vehicle existing around the host vehicle using the ultrasonic sensor 108 and the image sensor 109 has been described; however, the method of detecting the position of the other vehicle is not limited to this.
  • For example, the absolute position of another vehicle can be detected by adding the position information obtained by the GPS 113 of the host vehicle to the detection results obtained by the ultrasonic sensor 108 and the image sensor 109.
  • In Embodiment 1, the case where one vehicle B is detected as another vehicle in FIG. 4 has been described; however, a plurality of other vehicles may be detected.
  • When a plurality of other vehicles are detected, the detection priority order may be determined based on the coefficient R corresponding to the position of the other vehicle relative to the host vehicle, as shown in FIG. 14. Specifically, when other vehicles are traveling both in front of and behind the host vehicle, the other vehicle in front of the host vehicle is detected first and its other-vehicle information is acquired, and then the other vehicle behind the host vehicle is detected and its other-vehicle information is acquired. That is, the vehicles may be detected in descending order of the value of the coefficient R shown in FIG. 14.
  • The user may arbitrarily set the priority order, or may arbitrarily set the positions at which vehicles are preferentially detected.
  • the priority order may be set based on the attention level calculated by the attention level calculation unit 104 (for example, in descending order of attention level).
  • the priority order may be set in the same manner as described above for semi-automatic driving control (travel control).
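The coefficient-based ordering above can be sketched as follows. The R values per relative position are hypothetical, since the actual values of FIG. 14 are not reproduced in this text:

```python
# Hypothetical coefficient R per relative position (the actual values
# of FIG. 14 are not given here).
POSITION_COEFF = {"front": 1.0, "rear": 0.8, "side": 0.6}

def detection_order(vehicles):
    """Sort detected vehicles in descending order of coefficient R.

    `vehicles` is a list of (name, relative_position) pairs.
    """
    return sorted(vehicles, key=lambda v: POSITION_COEFF[v[1]], reverse=True)

detected = [("C", "rear"), ("B", "front"), ("D", "side")]
print(detection_order(detected))  # [('B', 'front'), ('C', 'rear'), ('D', 'side')]
```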
  • In the above, vehicle B is displayed filled in as an example of the display used when attention to vehicle B is necessary; however, the present invention is not limited to this. For example, vehicle B may be displayed three-dimensionally or displayed larger.
  • The host vehicle acquires the driver static information from the other vehicle (vehicle B); however, once the driver static information has been acquired, it need not be acquired again thereafter.
  • the calculation of the attention level is not limited to the equation (1).
  • the attention level calculation unit 104 calculates the attention level based on the driver dynamic information, the driver static information, the other in-vehicle information, and the position information of the other vehicle (vehicle B).
  • the attention level may be calculated by combining arbitrary information among driver dynamic information, driver static information, other in-vehicle information, or position information. Further, the attention level may be calculated in consideration of the passenger's seating position. Further, the calculated attention level may be three or more stages as shown in FIG. 15, or may be two stages as in the first embodiment.
  • The driver static information includes a gold license, a normal license, and driver-sign display vehicles; however, this information is based on the traffic rules of Japan. In other countries, a level L2 corresponding to information equivalent to that shown in FIG. 11 may be set.
  • the values of the levels L1 to L4 or the coefficient R may be arbitrary values.
  • the alerting method can be arbitrarily changed.
  • FIG. 15 shows a display example of another vehicle, but the color, color density, shape, and degree of three-dimensionality may be arbitrary.
  • a number indicating the attention level L may be displayed beside or inside the shape indicating the other vehicle. That is, the display may be different depending on the attention level L.
  • vehicle-to-vehicle communication has been described as an example of communication means, but other communication means (for example, see Embodiment 3) may be used.
  • In the above, alerting of the driver of the host vehicle and travel control (control of the inter-vehicle distance) are performed based on the attention level calculated by the attention level calculation unit 104; in addition, an alarm may be notified.
  • For example, the inter-vehicle distance to another vehicle may be detected by the ultrasonic sensor 108, and an alarm may be notified from the liquid crystal monitor 111 or the speaker 112 when the inter-vehicle distance becomes equal to or less than a predetermined distance.
  • The inter-vehicle distance serving as the threshold for alerting may be changed depending on the attention level. For example, when the value of the attention level is large, the threshold inter-vehicle distance may be increased. Further, when there is another vehicle that cannot communicate, the threshold inter-vehicle distance may be increased.
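A sketch of such an attention-dependent alarm threshold follows. The base distance and the scaling factors are illustrative assumptions, not values from the embodiment:

```python
def alarm_threshold(attention_level, can_communicate, base=20.0):
    """Return the inter-vehicle distance [m] below which an alarm is raised.

    The threshold grows with the attention level, and is enlarged
    further for a vehicle that cannot communicate (whose state is
    unknown to the host vehicle).
    """
    threshold = base + 5.0 * attention_level
    if not can_communicate:
        threshold += 10.0
    return threshold

def should_alarm(distance, attention_level, can_communicate):
    """Decide whether to notify an alarm for the measured distance."""
    return distance <= alarm_threshold(attention_level, can_communicate)

print(should_alarm(30.0, attention_level=3, can_communicate=True))   # True
print(should_alarm(30.0, attention_level=0, can_communicate=True))   # False
```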
  • FIG. 20 shows a driver sign, but the present invention is not limited to this.
  • a mark indicating that there is an infant in the car may be displayed.

PCT/JP2012/075794 2012-10-04 2012-10-04 Dispositif de traitement d'informations embarqué Ceased WO2014054152A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2014539539A JP5885852B2 (ja) 2012-10-04 2012-10-04 車載情報処理装置
PCT/JP2012/075794 WO2014054152A1 (fr) 2012-10-04 2012-10-04 Dispositif de traitement d'informations embarqué

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/075794 WO2014054152A1 (fr) 2012-10-04 2012-10-04 Dispositif de traitement d'informations embarqué

Publications (1)

Publication Number Publication Date
WO2014054152A1 true WO2014054152A1 (fr) 2014-04-10

Family

ID=50434511

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/075794 Ceased WO2014054152A1 (fr) 2012-10-04 2012-10-04 Dispositif de traitement d'informations embarqué

Country Status (2)

Country Link
JP (1) JP5885852B2 (fr)
WO (1) WO2014054152A1 (fr)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0719939A (ja) * 1993-06-30 1995-01-20 Toshiba Corp 自重計機能付きナビゲーション装置
JP2004343399A (ja) * 2003-05-15 2004-12-02 Alpine Electronics Inc 車載システム
JP2009048564A (ja) * 2007-08-22 2009-03-05 Toyota Motor Corp 車両位置予測装置
WO2010084568A1 (fr) * 2009-01-20 2010-07-29 トヨタ自動車株式会社 Système de contrôle de circulation en ligne et véhicule
JP2010205123A (ja) * 2009-03-05 2010-09-16 Nec System Technologies Ltd 運転支援方法、運転支援装置及び運転支援用プログラム
JP2010272083A (ja) * 2009-05-25 2010-12-02 Denso Corp 車載通信装置および通信システム
WO2012056688A1 (fr) * 2010-10-27 2012-05-03 三洋電機株式会社 Dispositif de terminal
JP2012132821A (ja) * 2010-12-22 2012-07-12 Aisin Aw Co Ltd 車載制御装置、車載制御方法及び車載制御用プログラム
JP2012173930A (ja) * 2011-02-21 2012-09-10 Mitsubishi Electric Corp 車車間通信装置および車載ナビゲーション装置

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4040441B2 (ja) * 2002-12-04 2008-01-30 トヨタ自動車株式会社 車両用通信装置
JP4480613B2 (ja) * 2005-03-29 2010-06-16 アルパイン株式会社 ナビゲーション装置
JP2010217956A (ja) * 2009-03-13 2010-09-30 Omron Corp 情報処理装置及び方法、プログラム、並びに情報処理システム
JP2011175368A (ja) * 2010-02-23 2011-09-08 Clarion Co Ltd 車両制御装置


Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016024557A (ja) * 2014-07-17 2016-02-08 本田技研工業株式会社 メッセージの交換を行うプログラム、方法、及び電子機器
CN111885078A (zh) * 2015-01-20 2020-11-03 松下电器(美国)知识产权公司 不正常应对方法以及电子控制单元
CN111885078B (zh) * 2015-01-20 2022-03-08 松下电器(美国)知识产权公司 不正常应对方法以及电子控制单元
WO2016121174A1 (fr) * 2015-01-30 2016-08-04 ソニー株式会社 Système de traitement d'informations et procédé de commande
CN107209019A (zh) * 2015-01-30 2017-09-26 索尼公司 信息处理系统和控制方法
JPWO2016121174A1 (ja) * 2015-01-30 2017-11-09 ソニー株式会社 情報処理システムおよび制御方法
US10302444B2 (en) 2015-01-30 2019-05-28 Sony Corporation Information processing system and control method
CN112762956A (zh) * 2015-01-30 2021-05-07 索尼公司 信息处理系统和控制方法
KR20170109427A (ko) * 2016-03-21 2017-09-29 한국철도기술연구원 차량 부착 카메라를 이용한 물류 모니터링 시스템
KR101872698B1 (ko) * 2016-03-21 2018-06-29 한국철도기술연구원 차량 부착 카메라를 이용한 물류 모니터링 시스템
US11302189B2 (en) 2018-01-18 2022-04-12 Toyota Jidosha Kabushiki Kaisha Agent cooperation system, agent cooperation method, and non-transitory storage medium
KR102190186B1 (ko) * 2018-01-18 2020-12-11 도요타지도샤가부시키가이샤 에이전트 제휴 시스템, 에이전트 제휴 방법 및 비일시적인 기억 매체
KR20190088416A (ko) * 2018-01-18 2019-07-26 도요타지도샤가부시키가이샤 에이전트 제휴 시스템, 에이전트 제휴 방법 및 비일시적인 기억 매체
CN110858278A (zh) * 2018-08-22 2020-03-03 上海博泰悦臻网络技术服务有限公司 车辆、车机设备及其车辆关键信息的屏显提示方法
WO2020067009A1 (fr) * 2018-09-28 2020-04-02 株式会社Jvcケンウッド Système de support commercial et procédé de support commercial
JP2020050324A (ja) * 2018-09-28 2020-04-02 株式会社Jvcケンウッド 業務支援システムおよび業務支援方法
JP7070298B2 (ja) 2018-09-28 2022-05-18 株式会社Jvcケンウッド 業務支援システムおよび業務支援方法
JP2023155282A (ja) * 2020-04-01 2023-10-20 株式会社デンソー 提示制御装置及び提示制御プログラム
JP7616287B2 (ja) 2020-04-01 2025-01-17 株式会社デンソー 提示制御装置及び提示制御プログラム
JP2025019089A (ja) * 2020-04-01 2025-02-06 株式会社デンソー 提示制御装置及び提示制御プログラム
JP7747155B2 (ja) 2020-04-01 2025-10-01 株式会社デンソー 提示制御装置及び提示制御プログラム

Also Published As

Publication number Publication date
JPWO2014054152A1 (ja) 2016-08-25
JP5885852B2 (ja) 2016-03-16

Similar Documents

Publication Publication Date Title
JP5931208B2 (ja) 車載情報処理装置
JP5885852B2 (ja) 車載情報処理装置
JP7371671B2 (ja) 車両に安全に追い付けるように運転を支援するシステムおよび方法
JP5885853B2 (ja) 車載情報処理装置
JP6773046B2 (ja) 運転支援装置及び運転支援方法、並びに移動体
JP5278292B2 (ja) 情報提示装置
JP5397735B2 (ja) 車両用緊急車両接近検出システム
EP2679447B1 (fr) Systèmes et procédés permettant de désactiver un klaxon de véhicule
CN108235780A (zh) 用于向车辆传送消息的系统和方法
CN111161551B (zh) 用于检测、警报和响应紧急车辆的设备、系统和方法
JP7372381B2 (ja) 交通安全支援システム
JP7372382B2 (ja) 交通安全支援システム
JP2020050204A (ja) 車両の走行制御方法及び走行制御装置
JP2018507479A (ja) 自動車用の運転者支援システム
JP6261812B1 (ja) 車載装置、携帯端末装置、認識支援システム、認識支援方法、及び認識支援プログラム
EP3002557B1 (fr) Procédé et système pour identifier une situation avec un pilote potentiellement non vigilant
JP2006254055A (ja) 車載電話システム
JP7731841B2 (ja) 交通安全支援システム
JP7775135B2 (ja) 交通安全支援システム
JP7767219B2 (ja) 交通安全支援システム
GB2631691A (en) Controlling output of alert signals to an occupant of a vehicle
JP2023151973A (ja) 運転支援装置
JP2025140865A (ja) 運転支援装置、運転支援方法、及びプログラム
JP2025140864A (ja) 運転支援装置、運転支援方法、及びプログラム
JP2025140871A (ja) 運転支援装置、運転支援方法、及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12886027

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2014539539

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12886027

Country of ref document: EP

Kind code of ref document: A1