
GB2525654A - Driver incident assistance system and method - Google Patents


Info

Publication number
GB2525654A
GB2525654A (application GB1407713.5A)
Authority
GB
United Kingdom
Prior art keywords
incident
driver
vehicle
information
assistance system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1407713.5A
Other versions
GB2525654B (en)
GB201407713D0 (en)
Inventor
Harpreet Singh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1407713.5A (patent GB2525654B)
Publication of GB201407713D0
Publication of GB2525654A
Application granted
Publication of GB2525654B
Legal status: Active (granted)


Classifications

    • G — PHYSICS
    • G07 — CHECKING-DEVICES
    • G07C — TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 — Registering or indicating the working of vehicles
    • G07C5/008 — Registering or indicating the working of vehicles communicating information to a remotely located station
    • G07C5/08 — Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0816 — Indicating performance data, e.g. occurrence of a malfunction
    • G07C5/0841 — Registering performance data
    • G07C5/085 — Registering performance data using electronic data carriers
    • G07C5/0866 — Registering performance data using electronic data carriers, the electronic data carrier being a digital video recorder in combination with video camera

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)

Abstract

A driver incident assistance system (1) for outputting information to a driver of a vehicle (3) following an incident, such as an accident. The driver incident assistance system (1) includes one or more processors (19) configured to receive sensor data from one or more sensors (10). The sensor data is analysed to determine that an incident has occurred and to assign a classification to said incident. The driver incident assistance system (1) controls the output of information to the driver (D) in dependence on the assigned classification of the incident. The incident may be classified into one of a number of incident levels, such as being a minor incident or a major incident. A different set of information may be output for a minor incident than for a major incident. The system may save sensor data and determine if any other vehicles were involved in the incident and output information accordingly. Image sensors (cameras) may be used to assign a classification to the incident, determine the state of the vehicle occupants or identify characteristics of vehicles involved in the incident. The invention may display a checklist for a driver to complete when an event, impact, crash or collision, is detected.

Description

DRIVER INCIDENT ASSISTANCE SYSTEM AND METHOD
TECHNICAL FIELD
The present disclosure relates to a driver incident assistance system and method. Aspects of the present disclosure relate to a system for controlling the output of information in the event of an incident; to a method of controlling the output of information following an incident; and to a vehicle incorporating a driver incident assistance system.
BACKGROUND
A typical human machine interface (HMI) in a vehicle is configured to respond to interactions initiated by the user, for example the driver. When responding to the user input, the human machine interface does not consider the condition of the driver. This is particularly relevant if the vehicle has been involved in an incident, such as an accident or collision, which might affect the driver condition.
DE102008060567A1 discloses a mobile communication device setting an emergency call for automatically generating and transmitting a voice message. US2011098016A1 and US2012220258A1 both disclose vehicle communication systems which place an emergency call upon detection of an emergency event.
US2008257252 discloses an accident assistance system for a motor vehicle having an accident detector for detecting an accident. Instructions are printed on an information carrier which is ejected when an airbag is triggered in the event of an accident. The instructions concerning escape measures and emergency measures are advantageously printed onto the airbag.
The present disclosure relates to a driver incident assistance system and method which overcomes or ameliorates at least some of the problems associated with prior art systems.
SUMMARY OF THE INVENTION
Aspects of the present invention relate to a system for controlling the output of information; to a method of controlling the output of information; and to a vehicle incorporating a driver incident assistance system, as set out in the appended claims.
According to a further aspect of the present invention there is provided a driver incident assistance system for outputting information to a driver of a vehicle following an incident, the driver incident assistance system comprising one or more processors configured to: receive sensor data from one or more sensors; analyse the sensor data to determine that an incident has occurred; assign a classification to said incident; and control the output of information to the driver in dependence on the assigned classification of said incident. The incident typically takes the form of a collision or an accident resulting in damage to the vehicle. In use, the driver incident assistance system can modify the information output to the driver based on the classification of the incident. At least in certain embodiments, the information output to the driver can be tailored to suit the detected incident. The likelihood of the driver being distracted by unnecessary or irrelevant information can be reduced.
The classification can indicate a severity of the incident, for example a minor incident or a major incident. A first set of information can be output if the incident is classified as being a minor incident. A second set of information is output if the incident is classified as being a major incident. The first and second sets of information can differ from each other. The information can be stored in one or more dialogue trees. The traversal of the or each dialogue tree can depend on one or more responses made by the driver to a prompt and/or question output by the driver incident assistance system. The traversal of the dialogue tree can also be influenced by driver preferences stored in a driver preference file. The driver preferences can be based on historic decisions made by the driver, or by preferences input by the driver. The dialogue tree can be selected from a plurality of different dialogue trees in dependence on the classification of the incident. For example, a first dialogue tree can be selected if the incident is classed as a minor incident, whereas a second dialogue tree can be selected if the incident is classed as a major incident. Similarly, a third dialogue tree can be selected if the incident involves another vehicle, whereas a fourth dialogue tree can be selected if the incident involves a stationary object.
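The classification-dependent selection and traversal of dialogue trees described above can be sketched as a simple lookup. This is an illustrative assumption of one possible data layout, not the claimed implementation; the class labels, tree contents and function names are hypothetical.

```python
# Sketch of classification-dependent dialogue tree selection.
# The labels, prompts and tree structures below are illustrative
# assumptions, not taken from the patent text.

MINOR, MAJOR = "minor", "major"
OTHER_VEHICLE, STATIONARY_OBJECT = "other_vehicle", "stationary_object"

# Each tree maps a node id to (prompt, {driver_response: next_node}).
DIALOGUE_TREES = {
    (MINOR, STATIONARY_OBJECT): {
        "root": ("Minor contact detected. Are you able to continue?",
                 {"yes": "log_incident", "no": "offer_assistance"}),
    },
    (MAJOR, OTHER_VEHICLE): {
        "root": ("A collision with another vehicle was detected. "
                 "Do you need emergency services?",
                 {"yes": "place_ecall", "no": "exchange_details"}),
    },
}

def select_dialogue_tree(severity, involvement):
    """Pick a dialogue tree based on the incident classification."""
    return DIALOGUE_TREES.get((severity, involvement))

def traverse(tree, node, response):
    """Advance one step through the tree given a driver response."""
    prompt, transitions = tree[node]
    return transitions.get(response)
```

In this sketch a driver preference file could be consulted inside `traverse` to bias the next node, matching the description of preference-influenced traversal.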
The information can be output to one or more output devices. The output devices can comprise a display screen and/or an audio system. The one or more processors can be configured to output an audio signal and/or a video signal to cause said one or more output devices to output the information. The audio signal and/or video signal can be stored in memory.
The classification can indicate the nature of the incident. For example, the classification can determine whether another vehicle is involved or if the incident involved a stationary object.
The one or more processors can be configured to determine if another vehicle is involved in said incident. The one or more processors can be configured to control the output of information to the driver in dependence on the determination of whether another vehicle is involved in said incident. For example, the driver may be under different legal obligations depending on whether another vehicle is involved in the incident.
The one or more processors can be configured selectively to output sensor data generated at the time of the incident to memory. The memory can be provided in the vehicle.
Alternatively, the memory can be separate from the vehicle, for example in a remote data centre. The sensor data can be transmitted over a wireless network, such as a cellular telephone network. The sensor data generated over a time period contemporaneous with the detected incident can be selected for storage. For example, the sensor data generated immediately prior to the incident and immediately after the incident can be stored. The sensor data can be archived, for example to support an insurance claim. The sensor data can be transmitted over a cellular network, for example to a cellular telephone associated with the driver or to a predefined electronic mailing address.
The one or more processors can be configured to receive image data from at least one external image sensor and/or at least one interior image sensor. The one or more processors can be configured to analyse said image data received from said at least one external image sensor to assign the classification to said incident; and/or to identify one or more characteristics of another vehicle involved in the incident. One or more image frames can be selected from the image data for storage.
The one or more processors can be configured to analyse said image data received from the at least one internal image sensor to track movements and/or recognise gestures of a vehicle occupant following the incident. For example, the one or more processors can analyse said image data to identify one or more of the following: (a) a movement of the vehicle occupant; (b) a gesture performed by the vehicle occupant; and (c) a facial expression of the vehicle occupant. The driver condition can be assessed with reference to one or more of these functions. The one or more processors can be configured to assign a classification to a condition of said vehicle occupant in dependence on said tracked movements and/or recognised gestures of the vehicle occupant.
The one or more sensors can comprise one or more of the following: an ultrasonic sensor; a radar sensor; a lateral accelerometer; a longitudinal accelerometer; a vehicle speed sensor; a crash sensor; a pressure sensor; a gyroscope; a satellite navigation system (GPS) sensor (to identify a location of the vehicle); or a driver monitoring sensor. It will be appreciated that more than one of each type of sensor can be provided. The one or more processors can be configured to assign the classification to said incident based on one or more thresholds defined for said one or more sensors. For example, a proximity threshold can be defined for said ultrasonic sensor and/or said radar sensor. An acceleration threshold can be defined for said lateral accelerometer and/or longitudinal accelerometer. The acceleration threshold can be defined to avoid system errors, for example due to a driver-initiated dynamic event. The one or more processors can monitor a driver braking request and/or a torque request event to avoid erroneous detection of an incident. The pressure sensor and/or the crash sensor can detect and/or classify the incident.
The identification and/or classification of the incident can be based on a determined condition of a driver of the vehicle. The driver condition can be determined based on output(s) from one or more driver monitoring sensors, for example health sensor data output by one or more health monitoring sensors. The output of information can be controlled in dependence on a determined driver condition.
The output of information can be controlled in dependence on a determined geographical location of the vehicle. For example, the information to be output can be selected in dependence on local guidelines and/or legislation.
According to a further aspect of the present invention there is provided a vehicle comprising a driver incident assistance system as described herein coupled to one or more sensors.
According to a yet further aspect of the present invention there is provided a method of controlling the output of information to a driver of a vehicle following an incident, the method comprising: receiving sensor data from one or more sensors; analysing the sensor data to determine that an incident has occurred; assigning a classification to said incident; and controlling the output of information to the driver in dependence on the assigned classification of said incident. The form and/or substance of the information to be output to the driver can be controlled in dependence on the assigned classification. For example, in certain scenarios, the method can comprise outputting the information as an audio signal rather than a video signal.
The identification and/or classification of the incident can be based on a determined condition of a driver of the vehicle. The output of information can be controlled based on the determined condition of the driver, for example based on a signal output from one or more driver monitoring sensors. The driver monitoring sensors can be health monitoring sensors configured to assess the driver condition.
The output of information can be controlled in dependence on a determined geographical location of the vehicle, for example from a signal output from a global positioning system (GPS). For example, the information to be output can be selected in dependence on local guidelines and/or legislation.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the present invention will now be described, by way of example only, with reference to the accompanying figures, in which:
Figure 1 shows a schematic representation of a vehicle incorporating a driver incident assistance system in accordance with an embodiment of the present invention;
Figure 2 is a block diagram illustrating the driver incident assistance system shown in Figure 1;
Figure 3 is a schematic representation of an interior of the vehicle shown in Figure 1;
Figure 4 is a first flow chart illustrating a first operating mode of the driver incident assistance system shown in Figure 1; and
Figure 5 is a second flow chart illustrating a second operating mode of the driver incident assistance system shown in Figure 1.
DETAILED DESCRIPTION
A driver incident assistance system 1 for a vehicle 3 in accordance with an embodiment of the present invention will now be described with reference to the accompanying Figures. The driver incident assistance system 1 outputs context-relevant information to a driver in the event of an incident, such as a collision or accident, involving the vehicle 3.
The driver incident assistance system 1 is coupled to a driver communication system 5 for controlling the output of information and notifications to a driver D; and to a vehicle incident monitoring system 7 configured to monitor one or more vehicle parameters to identify and classify an incident involving the vehicle 3. The driver communication system 5 controls the output of notification and information to a driver D of the vehicle 3 in dependence on the vehicle incident monitoring system 7.
A schematic representation of the vehicle 3 is shown in Figure 1. The vehicle 3 comprises an electronic control unit (ECU) 9 which is configured to implement the driver incident assistance system 1, the driver communication system 5 and the vehicle incident monitoring system 7 described herein. As shown in Figure 2, the ECU 9 communicates with vehicle systems and the vehicle sensors 10 over a CAN bus (not shown). The vehicle sensors 10 comprise one or more speed sensors 11 configured to determine the vehicle speed; one or more ultrasonic sensors 12; one or more accelerometers 13 to measure lateral and/or longitudinal vehicle acceleration; a radar system 14; and first and second internal image sensors 15-1, 15-2 for generating image data associated with the interior of the vehicle 3 (referred to herein as internal image data); and first, second and third external image sensors 17-1, 17-2, 17-3 for generating image data associated with the exterior of the vehicle 3 (referred to herein as external image data). The vehicle sensors 10 could also comprise one or more of the following: a gyroscope, a crash (collision) sensor and a pressure sensor. The vehicle sensors 10 can be used with other vehicle systems, for example the ultrasonic sensors 12 can form part of a parking system, and the radar system 14 can form part of an adaptive cruise control system. The ECU 9 receives data from said vehicle sensors 10 at least substantially in real time.
The ECU 9 comprises one or more processors 19 (only one of which will be described for clarity) coupled to memory 21 comprising one or more memory devices. The processor 19 is coupled to said vehicle sensors 10; one or more input devices 23; and one or more output devices 25. The input devices 23 comprise one or more switches 27, such as a mechanical switch or a capacitive switch; and optionally a microphone (not shown) for receiving audio signals within a cabin of the vehicle 3. The output devices 25 include an audio speaker 29 and a display screen 31. The display screen 31 can optionally be a touch screen which functions as both an input and output device. The display screen 31 can be provided in an instrument cluster or in a centre console of the vehicle 3. The ECU 9 can be coupled to a wireless transceiver 33 for transmitting and receiving communication signals, for example over a cellular telephone network. The memory 21 is configured to provide a storage buffer for recording sensor data generated by said vehicle sensors 10. The memory 21 can provide a storage buffer to record sensor data generated over a time period of 60 seconds. It will be appreciated that the period of time for which sensor data is stored in the memory 21 can be configurable.
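The configurable storage buffer described above behaves like a rolling (ring) buffer that can be snapshotted when an incident is detected. A minimal sketch follows; the 60-second window matches the description, while the sample rate and class interface are illustrative assumptions.

```python
from collections import deque

class SensorBuffer:
    """Rolling buffer of recent sensor samples. A snapshot can be
    archived when an incident is detected. The 60 s default window
    follows the description; the 10 Hz sample rate and this interface
    are illustrative assumptions."""

    def __init__(self, window_s=60, rate_hz=10):
        self._buf = deque(maxlen=window_s * rate_hz)

    def push(self, sample):
        # The oldest sample drops automatically once the buffer is full.
        self._buf.append(sample)

    def snapshot(self):
        """Return a copy of the buffered samples for archiving."""
        return list(self._buf)
```

On incident detection, `snapshot()` would be written to persistent memory (or transmitted via the wireless transceiver 33) so that data from before the incident is preserved.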
The processor 19 is arranged to output an audio signal and/or an image signal to the output devices 25. The audio and image signals can be stored in said memory 21 and retrieved by said processor 19. The audio and image signals are converted to respective audio and visual outputs to communicate system information and/or notifications to the driver D. The input devices 23 receive user inputs to control the driver incident assistance system 1.
The user inputs can include one or more of the following: motion gestures; activation of a predefined region of a touch-screen; detection of a gesture on the touch-screen; or manual actuation of a button or switch. If the processor 19 is coupled to a microphone, the driver D can issue voice commands which are interpreted using speech recognition algorithms operating on said processor 19 or on a remote server (via said wireless transceiver 33).
The ECU 9 receives signals from the vehicle sensors 10 via the CAN bus to monitor vehicle dynamic parameters and/or information relating to the vehicle surroundings. With reference to one or more predefined incident detection thresholds, the ECU 9 can determine when an incident has occurred. For example, if the accelerometer 13 detects a positive or negative change in vehicle acceleration which is greater than a predefined acceleration threshold, the ECU 9 can determine that an unexpected incident, such as a collision, has occurred. The incident detection thresholds can be selected in dependence on other vehicle dynamic parameters, such as vehicle speed. For example, lower incident detection thresholds can be applied when the vehicle 3 is stationary than when the vehicle 3 is moving. The ECU 9 can also monitor driver requests, such as a torque request signal and/or a brake request signal, to identify a requested change in vehicle acceleration which is not indicative of an incident.
The thresholds can also be calibrated to help prevent erroneous incident detection events, for example resulting from acceleration due to gradient changes. The thresholds defined in respect of lateral acceleration can be different from those defined in respect of longitudinal acceleration.
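The detection step described above — a speed-dependent acceleration threshold gated by driver torque/brake requests — can be sketched as follows. The threshold values are illustrative assumptions; the description specifies only that a lower threshold applies when the vehicle is stationary and that driver-initiated events are filtered out.

```python
def incident_detected(accel_g, vehicle_speed, driver_request_active,
                      moving_threshold_g=2.5, stationary_threshold_g=0.8):
    """Return True when a change in acceleration exceeds the applicable
    incident detection threshold. The numeric thresholds here are
    illustrative assumptions, not values from the patent.

    A driver-initiated torque or brake request explains the acceleration
    change, so it is not treated as an incident.
    """
    if driver_request_active:
        return False
    # Lower thresholds apply when the vehicle is stationary.
    threshold = stationary_threshold_g if vehicle_speed == 0 else moving_threshold_g
    return abs(accel_g) > threshold
```

Separate thresholds could equally be defined for lateral and longitudinal acceleration, as the description notes.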
The ECU 9 analyses the data from said vehicle sensors 10 to classify the incident based on one or more predefined thresholds. The ECU 9 can, for example, classify the incident as minor (for example, a small dent or scratch) or major (for example, vehicle body damage).
The ECU 9 can, for example, classify the incident as a major or minor incident in dependence on whether an airbag was deployed as a result of the incident. One or more pressure sensors and/or one or more crash sensors could be used to detect the incident and optionally also to classify the incident. The driver incident assistance system 1 can control the subsequent output of information to the driver D in dependence on the incident classification. The ECU 9 can be configured to determine other details of the incident based on the analysis of said data. The vehicle sensors 10 could be configured to identify a change in direction of the vehicle and/or vehicle spinning as a result of the detected incident. For example, the ECU 9 could identify a change in direction based on analysis of data from one or more of the following vehicle sensors 10: the accelerometer 13; image data from the internal image sensors 15-1, 15-2 and/or the external image sensors 17-1, 17-2, 17-3.
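The minor/major classification described above can be sketched as a simple rule. Airbag deployment implying a major incident follows the description; the acceleration fallback and its threshold are illustrative assumptions.

```python
def classify_incident(airbag_deployed, peak_accel_g,
                      major_accel_threshold_g=5.0):
    """Classify an incident as 'major' or 'minor'. Airbag deployment
    implies a major incident, per the description. The acceleration
    fallback threshold is an illustrative assumption standing in for
    the crash/pressure sensor logic."""
    if airbag_deployed or peak_accel_g > major_accel_threshold_g:
        return "major"
    return "minor"
```

The returned label would then select which set of information (or dialogue tree) is output to the driver.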
Equally, the crash (collision) sensors and/or the gyroscope and/or a global satellite positioning system could be used to detect a change in direction. The magnitude and/or direction of the incident could be detected through the crash sensor(s) and/or the pressure sensor(s) which determine which airbags are to be deployed within the vehicle 3 (for example on which side of the vehicle 3). The magnitude and direction of the incident could also be used as a parameter to provide relevant information to the driver (for example the vehicle 3 is facing in the wrong direction as a result of the incident). The measured magnitude and/or direction (i.e. lateral and/or longitudinal) of the detected vehicle acceleration could provide an indication of the nature of the incident, for example the portion of the vehicle 3 affected and/or whether the incident involved a stationary object or a moving vehicle.
The first and second internal image sensors 15-1, 15-2 are driver facing and, in use, generate image data. As shown in Figure 3, the first internal image sensor 15-1 is located in a centre position within the cabin C and the second internal image sensor 15-2 is located on a driver side of the cabin C. The field of view matrix of each of the first and second internal image sensors 15-1, 15-2 is illustrated in Figure 3 by respective view frustums VF1IN, VF2IN.
In operation, the first and second internal image sensors 15-1, 15-2 each generate internal image data which is output to the processor 19. The processor 19 analyses the internal image data to identify driver movements and behaviour. The internal image data is analysed by image processing software running on the processor 19 to monitor driver behaviour. This analysis can be used to assess the wellbeing of the driver in the event of an incident. For example, the internal image data can be analysed to track head and/or body movements.
Equally, the internal image data can be analysed to determine body position of the driver.
The internal image data could be used to assess driver state or condition as a result of the incident.
The first, second and third external image sensors 17-1, 17-2, 17-3 are outwardly facing. As shown in Figure 1, the first external image sensor 17-1 is located on a left side of the vehicle 3 (for example mounted in a left wing mirror) and the second image sensor 17-2 is located on a right side of the vehicle 3 (for example mounted in a right wing mirror). The third external image sensor 17-3 is provided in a central position at the rear of the vehicle (for example mounted to a tailgate). The field of view matrix of each of the first, second and third external image sensors 17-1, 17-2, 17-3 is illustrated in Figure 1 by respective view frustums VF1EX, VF2EX, VF3EX. In operation, the first, second and third external image sensors 17-1, 17-2, 17-3 each generate external image data which is output to the processor 19. The external image data is buffered in the memory 21, for example to store external image data generated for the preceding 30 seconds, 60 seconds or longer. As described herein, the processor 19 can selectively archive said external image data for future reference. In particular, if an incident is detected, the processor 19 can archive external image data generated prior to, during and immediately after the incident. The processor 19 could also operate selectively to archive internal image data generated by said first and second internal image sensors 15-1, 15-2.
The external image data could optionally be analysed to determine further information relating to the nature of an incident. For example, analysis of the external image data can be performed to identify a vehicle registration number of another vehicle involved in the incident. The analysis can identify a vehicle registration number of another vehicle involved in the incident for future reference. An image frame showing the vehicle registration number can be flagged and archived for future reference. The image data could be augmented to highlight the vehicle number plate. The image analysis can be executed by the processor 19 operating image analysis software. Alternatively, the image analysis can be performed remotely, for example using the wireless transceiver 33 to transmit the external image data over a wireless network to a remote server. The same techniques can be applied for analysis of the internal image data.
The driver incident assistance system 1 can be configured continuously to monitor for an incident involving the vehicle 3 or proximal to the vehicle 3. In a variant, the driver incident assistance system 1 could periodically monitor for an incident. The driver incident assistance system 1 can be arranged to initiate a communication with the driver D to notify them that an incident has occurred and to offer assistance. The driver incident assistance system 1 can optionally include natural language voice recognition, gesture recognition and driver state detection systems. The driver incident assistance system 1 can provide information to the driver and accept responses in any communication medium, including one or more of the following: voice, gestures, buttons and/or a touchscreen.
The driver incident assistance system 1 provides an emergency response service automatically activated upon detection of an accident or other emergency incident. The driver incident assistance system 1 can classify the incident, for example as a minor or major incident. An incident can be classified as minor when it does not activate the vehicle airbags and the vehicle 3 suffers only small body damage, e.g. a scratch or dent. A major incident can be identified when the incident activates the vehicle airbags.
The driver incident assistance system 1 uses vehicle sensors 10, including ultrasonic and radar sensors, to detect any object within a threshold distance of the vehicle 3. Once an object is detected in close proximity to the vehicle 3, the driver incident assistance system 1 can activate video processing to determine if the incident involves another vehicle or a static object, e.g. a wall or tree. By determining the nature of the incident, the driver incident assistance system 1 can avoid providing the driver D with unnecessary information. Upon detection of a genuine incident (major or minor), the driver incident assistance system 1 controls communication of information and/or notifications to the driver D. The driver incident assistance system 1 initiates the following functions simultaneously:
1. Analyse data from the vehicle sensors 10 to classify damage to the vehicle 3 as a result of a detected incident; and
2. Determine the state (well-being) of the driver following an incident.
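The proximity gate described above — enabling the (computationally heavier) video processing only when the ultrasonic or radar sensors report a nearby object — can be sketched as follows. The 2 m threshold is an illustrative assumption; the patent specifies only "a threshold distance".

```python
def should_activate_video(ultrasonic_range_m, radar_range_m,
                          proximity_threshold_m=2.0):
    """Enable video processing only when either sensor reports an
    object within the proximity threshold. The 2 m default is an
    illustrative assumption, not a value from the patent."""
    nearest = min(ultrasonic_range_m, radar_range_m)
    return nearest <= proximity_threshold_m
```

Gating the image analysis this way avoids running classification on every frame when nothing is near the vehicle.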
It is envisaged that, post-incident, the driver will be confused and/or anxious. The driver incident assistance system 1 detects the condition of the driver following an incident. The driver incident assistance system 1 initiates communication by advising the driver of appropriate actions to take and, as appropriate, assisting in the performance of those actions. For example, the driver incident assistance system 1 can provide information relating to one or more of the following: health and safety procedures, information and evidence required for initiating an insurance claim, making an emergency call, etc. The information can be tailored in dependence on additional parameters, such as the geographical location of the vehicle, to comply with local laws and regulations (e.g. the emergency contact number, or the nearest hospital or services preferred by the driver based on historic data). The driver incident assistance system 1 also provides a facility for the driver D to store photographic image(s), for example to record details of damage to the vehicle 3, and details of any other vehicle involved in the incident. The driver incident assistance system 1 can provide an option to geographically tag the location using satellite or mobile phone data as evidence to support an insurance claim or for other future use. A time stamp can be added to the photograph which can then be sent to an email account or cellular telephone, for example associated with the driver of the vehicle 3.
The driver incident assistance system 1 can control communication of information to the driver and receive inputs from the driver D using the input devices 23, such as a voice system and/or touch screen. The first and second internal image sensors 15-1, 15-2 generate internal image data which can be analysed by the processor 19 to detect movements and gestures. The processor 19 can be configured to detect small movements and gestures, for example head and/or hand movements, when the driver D is not in a condition to operate the input devices 23 in a normal manner (such as using voice commands). The driver incident assistance system 1 can also include a learning system to adapt to driver behaviour and preferences. Based on historic driver behaviour and preferences data, the driver incident assistance system 1 can propose an appropriate action/response to the identified incident and/or identify a telephone number of an individual to be contacted when an incident of this type occurs. The driver incident assistance system 1 can initiate a telephone call to the preferred contact number subject to approval from the driver D. The contact number can be pre-defined by the driver or learnt from historical usage data.
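One simple way to realise the "learnt from historical usage data" behaviour is to count past post-incident calls per contact and propose the most frequent one for the current incident type. This frequency-count approach, and the record format, are assumptions; the patent does not prescribe a learning method.

```python
from collections import Counter

# Assumed sketch: the patent only states that a preferred contact can be
# learnt from historical usage data; this counts past calls per incident type.

def preferred_contact(call_history, incident_type):
    """Return the most frequently called contact for this incident type,
    or None if there is no relevant history (fall back to asking the driver)."""
    calls = [c["number"] for c in call_history if c["incident"] == incident_type]
    if not calls:
        return None
    return Counter(calls).most_common(1)[0][0]
```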
The driver incident assistance system 1 can be useful in a situation where the driver D may unknowingly take improper actions following an incident, such as an accident. These improper actions can potentially result in serious problems for the driver D. The driver incident assistance system 1 can output initial guidance to the driver D to make sure the driver D is following the proper steps and can help to reassure the driver. In a worst case scenario (for example, the driver has a previous health condition which worsens post-incident), the driver incident assistance system 1 can also be linked to an eCall or SOS text message system to notify emergency response services.
The operation of the driver incident assistance system 1 will now be described with reference to the flowcharts shown in Figures 4 and 5. A first flow chart 100 is shown in Figure 4 to illustrate the operation of the driver incident assistance system 1 when the vehicle 3 is stationary and unoccupied (corresponding to the vehicle 3 being parked).
The driver incident assistance system 1 is enabled by a driver (STEP 101). The driver incident assistance system 1 activates the vehicle sensors 10 and the generated sensor data is buffered in memory 21 for a time period of 60 seconds (STEP 103). The vehicle sensors 10 actively monitor the vehicle surroundings to determine when an incident, such as a collision or accident, occurs. If an incident is detected, a check is performed to determine whether an object is identified in close/dangerous proximity to the vehicle 3 (STEP 105). If no such object is detected, the vehicle sensors 10 continue actively to monitor the vehicle surroundings. If, however, an object is detected in proximity to the vehicle 3, an object identification feature is activated to identify the type of object, for example to determine whether the object is a person, animal, static object, another parked vehicle, a moving vehicle etc. (STEP 107). An accident classification procedure is then implemented (STEP 109). The classification procedure can, for example, determine that an accident did not occur and revert the driver incident assistance system 1 to monitoring the vehicle surroundings.
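The 60-second buffering of sensor data (STEP 103) can be modelled as a fixed-length ring buffer. Assuming one sample per second (an assumption; the patent does not state a sampling rate), a deque of length 60 always holds the most recent minute of data, ready to be archived when an incident is detected:

```python
from collections import deque

# Sketch of the 60-second pre-incident buffer (STEP 103). One sample per
# second is an assumed rate; the patent specifies only the 60-second window.

class SensorBuffer:
    def __init__(self, seconds=60):
        self._buf = deque(maxlen=seconds)   # oldest samples discarded automatically

    def push(self, sample):
        """Record the latest sensor sample."""
        self._buf.append(sample)

    def snapshot(self):
        """Archive the buffered window once an incident is detected (STEP 111)."""
        return list(self._buf)
```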
Alternatively, the classification procedure can determine that a minor accident (for example, a small dent) or a major accident (for example, vehicle body damage) occurred. The driver incident assistance system 1 can record the type of incident and temporarily archive data derived from the vehicle sensors 10, for example image data from the internal image sensors 15-1, 15-2 and/or the external image sensors 17-1, 17-2, 17-3; and/or ultrasonic sensors 12, before the incident occurred (STEP 111).
The driver incident assistance system 1 then accesses defined driver preferences to determine an appropriate response to the incident (STEP 113). The driver preferences can, for example, include a geographical location of the vehicle 3 (derived from a satellite navigation system), an assessment of the vehicle damage, insurance-specific rules etc. The driver incident assistance system 1 then determines when the driver D enters the vehicle 3 (STEP 115). The driver incident assistance system 1 notifies the driver D that an incident occurred while the vehicle 3 was unoccupied (STEP 117). The driver notification can, for example, comprise a report of the incident and provide relevant guidance information to the driver D. The information can include, but is not limited to, one or more of the following: (a) health and safety rules as determined by the geographical location; (b) any emergency action legally required to be undertaken by the driver D as governed by local legislation; (c) contact details for the local emergency services, such as telephone numbers; and (d) information required by an insurer to make a claim under the vehicle insurance.
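Tailoring the guidance to geographical location could be as simple as a region-keyed lookup. The region codes and guidance strings below are illustrative placeholders rather than content from the patent (999 and 112 are the genuine UK and pan-European emergency numbers):

```python
# Illustrative lookup only: the patent states that information is tailored to
# location; the table contents and fallback behaviour here are assumptions.

GUIDANCE = {
    "GB": {"emergency_number": "999",
           "note": "Exchange insurance details with any other driver involved."},
    "DE": {"emergency_number": "112",
           "note": "Placing a warning triangle behind the vehicle is mandatory."},
}

def guidance_for(region_code, default_region="GB"):
    """Return location-specific guidance, falling back to a default region."""
    return GUIDANCE.get(region_code, GUIDANCE[default_region])
```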
The driver incident assistance system 1 prompts the driver to indicate whether they require details of the incident. The driver responds to the prompt using one of the input devices 23 (STEP 119). If the driver response to the incident information prompt is positive, the driver incident assistance system 1 displays recorded image data to the driver D along with external image data, including the vehicle registration number of any other vehicles involved in the incident (STEP 121).
The driver incident assistance system 1 then prompts the driver D to indicate whether they require more information about making an insurance claim (STEP 123). If the driver response to the incident information prompt (STEP 119) was negative, the driver incident assistance system 1 does not display the image data but proceeds directly to this prompt (STEP 123). A check is performed to determine the driver response to the insurance information prompt (STEP 125). If the driver response is positive, the driver incident assistance system 1 provides relevant information for making an insurance claim (STEP 127), for example outlining the steps involved and/or providing contact information for the driver's insurance company.
The driver incident assistance system 1 then prompts the driver to confirm whether they wish to retrieve a photographic image which is geographically tagged to support an insurance claim (STEP 129). A check is performed to determine the driver response to the prompt (STEP 131). If the driver response is positive, the driver incident assistance system 1 accesses the image data stored in memory 21 to extract an image frame showing the vehicle, optionally highlighting the vehicle number plate (STEP 133). The driver incident assistance system 1 can time stamp the image data along with geographical data (STEP 135), such as a grid reference and/or longitude/latitude information identifying the location where the incident occurred. The driver incident assistance system 1 can transmit the data to a cellular telephone associated with the driver D and/or to an email address specified by the driver D. The driver incident assistance system 1 then identifies from the driver history and/or personal settings if the driver D typically calls a third party in this type of situation (STEP 137). If the driver indicates that they do not wish to retain a picture of the incident (STEP 131), the driver communication system 5 accesses the driver's historic data to determine whether a third party should be contacted (STEP 137). A check is then performed by the driver incident assistance system 1 to determine whether contact details are available for the third party (STEP 139). If contact details are available, the driver incident assistance system 1 asks the driver D if they would like to contact the preferred third party or someone else (STEP 141). If contact details are not available, the driver incident assistance system 1 enquires whether the driver D would like to contact someone else (STEP 143). A check is performed to determine the response received from the driver D (STEP 145).
If the driver D confirms the contact name, the driver incident assistance system 1 initiates a telephone call with the requested third party and saves this action in a user preference file (STEP 147). If the driver D responds negatively to the request, the driver incident assistance system 1 records a negative response in the user preference file (STEP 149). The driver incident assistance system 1 then goes to a sleep mode (referred to as an "idle state" in the flow chart) until the driver requests additional information (STEP 151). When the driver incident assistance system 1 is in this sleep mode, it will not initiate another dialogue until a new event is detected or the driver requests information.
A second flow chart 200 is shown in Figure 5 to illustrate the operation of the driver incident assistance system 1 when the vehicle 3 is moving.
The driver incident assistance system 1 is enabled by a driver (STEP 201). The driver incident assistance system 1 activates the vehicle sensors 10 and the generated sensor data is buffered in memory 21 for a time period of 60 seconds (STEP 203). The vehicle sensors 10 actively monitor the vehicle surroundings to determine when an incident, such as a collision or accident, occurs. If an incident is detected, a check is performed to determine whether an object is identified in close/dangerous proximity to the vehicle 3 (STEP 205). If no such object is detected, the vehicle sensors 10 continue actively to monitor the vehicle surroundings. If, however, an object is detected in proximity to the vehicle 3, an object identification feature is activated to identify the type of object, for example to determine whether the object is a person, animal, static object, a parked vehicle, a moving vehicle etc. (STEP 207). An accident classification procedure is then implemented (STEP 209). The classification procedure can, for example, determine that an accident did not occur and revert the driver incident assistance system 1 to monitoring the vehicle surroundings.
Alternatively, the classification procedure can determine that a minor accident (for example, a small dent) or a major accident (for example, vehicle body damage) occurred. The driver incident assistance system 1 can record the type of incident and temporarily archive data derived from the vehicle sensors 10, for example image data from the internal image sensors 15-1, 15-2 and/or the external image sensors 17-1, 17-2, 17-3; and/or ultrasonic sensors 12, before the incident occurred (STEP 211).
The driver incident assistance system 1 then accesses defined driver preferences to determine an appropriate response to the incident (STEP 213). The driver preferences can, for example, include a geographical location of the vehicle 3 (derived from a satellite navigation system), an assessment of the vehicle damage, insurance-specific rules etc. The driver incident assistance system 1 determines that an accident has occurred (STEP 215). A check is performed to determine if the accident has resulted in any critical issues relating to the driver's health (STEP 217).
If the check determines that critical injuries have resulted from the incident, the driver incident assistance system 1 transmits the driver health condition details to a local emergency response system as part of a vehicle "SOS" system (STEP 219). The driver incident assistance system 1 performs an analysis to determine whether further questions will increase the driver workload and adversely impact their condition (STEP 221). If the driver incident assistance system 1 determines that further questions would impact their condition, the driver incident assistance system 1 goes into a sleep mode until the driver requests additional information (STEP 223). If stored user preferences and/or historic data indicate that the driver D prefers not to receive additional information, the driver incident assistance system 1 can go directly to the sleep mode until the driver requests additional information (STEP 223).
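The branching in STEPs 217 to 223 can be summarised in a small decision function. The numeric condition score and its threshold are invented for the sketch; the patent describes only the branching behaviour, not how driver condition is quantified:

```python
# Assumed decision sketch for STEPs 217-223. The condition score (0 = worst,
# 1 = best) and the critical threshold are illustrative inventions.

def next_action(condition_score, prefers_silence, critical_threshold=0.3):
    """Decide whether to alert emergency services, enter sleep mode,
    or offer the driver emergency guidance."""
    if condition_score < critical_threshold:
        return "send_sos"                 # critical injuries: transmit SOS (STEP 219)
    if prefers_silence:
        return "sleep"                    # stored preference: no further dialogue (STEP 223)
    return "ask_emergency_guidance"       # proceed to guidance dialogue (STEP 225)
```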
If the driver incident assistance system 1 determines that further questions would not impact on the condition of the driver D, the driver incident assistance system 1 can ask the driver if they require emergency guidance (STEP 225). Equally, if the check performed to assess driver condition (STEP 217) determines that no critical issues have arisen, the driver incident assistance system 1 can initiate a dialogue with the driver to determine if they require emergency guidance (STEP 225). The driver response is monitored (STEP 227) and, if positive, the driver incident assistance system 1 outputs emergency guidance information to the driver (STEP 227). The guidance information can include, but is not limited to, one or more of the following: (a) health and safety rules as determined by the geographical location; (b) any emergency action legally required to be undertaken by the driver D as governed by local legislation; (c) contact details for the local emergency services, such as telephone numbers; and (d) other information, such as information required by an insurer to process a claim under the vehicle insurance. The information could include information related to the vehicle condition, for example brake system condition, coolant levels (high/low) and so on. The advice could be based upon the vehicle condition and also the position/location of the road. For example, if the vehicle 3 is in a high-speed lane but the vehicle 3 is safe, the advice could be to remain in the vehicle 3 until an emergency vehicle arrives. The information displayed could be determined based on a geographical location, for example to comply with local guidelines and/or legislation. Thus, the system can provide guidance based on the geographical location of the vehicle. Alternatively, or in addition, the information displayed could be determined based on a determined condition of the driver, for example based on a signal output by one or more driver monitoring sensors.
The driver incident assistance system 1 then prompts the driver D to indicate whether they require more information about making an insurance claim (STEP 229). Similarly, if the driver indicates that they do not require emergency guidance, the driver incident assistance system 1 prompts the driver D to indicate whether they require more information about making an insurance claim (STEP 229).
If the driver response to the system prompt is negative, the driver incident assistance system 1 does not display the image data but instead outputs a prompt asking the driver to confirm whether additional information is required about making an insurance claim (STEP 231). A check is performed to determine the driver response to the insurance information prompt (STEP 233). If the driver response is positive, the driver incident assistance system 1 provides relevant information for making an insurance claim (STEP 235), for example outlining the steps involved and/or providing contact information for the driver's insurance company.
A check is then performed to determine if the accident involves a third party (STEP 237). If the incident did involve a third party, the driver incident assistance system 1 asks the driver D if they wish to take a photographic image with geographical tagging to support an insurance claim (STEP 239). A check is performed to determine the driver response to the photographic image prompt (STEP 241). If the driver response is positive, the processor 19 analyses the image data from the external image sensors 17-1, 17-2, 17-3 to determine if the other vehicle involved in the incident is within the field of view matrix of one or more of the external image sensors 17-1, 17-2, 17-3 (STEP 243). If the other vehicle is not within the field of view matrix, the driver incident assistance system 1 accesses the image data stored in memory 21 to extract an image frame showing the vehicle and optionally highlighting the vehicle number plate. The driver incident assistance system 1 can time stamp the image data along with the geographical data (STEP 245), such as a grid reference and/or longitude/latitude information, to identify the location where the incident occurred. The driver incident assistance system 1 can transmit the data to a cellular telephone associated with the driver D and/or to an email address specified by the driver D. The driver incident assistance system 1 then identifies from the driver history and/or personal settings if the driver D typically calls a third party in this type of situation (STEP 247). If the driver indicates that they do not wish to retain a picture of the incident (STEP 241), the driver communication system 5 accesses the driver's historic data to determine whether a third party should be contacted (STEP 247). The driver incident assistance system 1 asks the driver D if they would like to contact the preferred third party or someone else (STEP 249). A check is performed to determine the response received from the driver D (STEP 251).
If the driver D confirms the contact name, the driver incident assistance system 1 initiates a telephone call with the third party and saves this action in a user preference file (STEP 253). If the driver D responds negatively to the request, the driver incident assistance system 1 records a negative response in the user preference file (STEP 255). The driver incident assistance system 1 then goes to a sleep mode until the driver requests additional information (STEP 257).
It will be appreciated that various changes and modifications can be made to the driver incident assistance system 1 described herein without departing from the scope of the present invention.
Other aspects of the present invention are set out in the following numbered paragraphs: 1. A driver incident assistance system for outputting information to a driver of a vehicle following an incident, the driver incident assistance system comprising one or more processors configured to: receive sensor data from one or more sensors; analyse the sensor data to determine that an incident has occurred; assign a classification to said incident; and control the output of information to the driver in dependence on the assigned classification of said incident.
2. A driver incident assistance system as described in paragraph 1, the one or more processors being configured to classify said incident as being a minor incident or a major incident.
3. A driver incident assistance system as described in paragraph 2, wherein a first set of information is output if the incident is classified as being a minor incident; and a second set of information is output if the incident is classified as being a major incident.
4. A driver incident assistance system as described in paragraph 1, the one or more processors being configured to determine if another vehicle is involved in said incident.
5. A driver incident assistance system as described in paragraph 4, the one or more processors being configured to control the output of information to the driver in dependence on the determination of whether another vehicle is involved in said incident.
6. A driver incident assistance system as described in paragraph 1, the one or more processors being configured selectively to output sensor data generated at the time of the incident to memory.
7. A driver incident assistance system as described in paragraph 1, the one or more processors being configured to receive image data from at least one external image sensor and/or at least one interior image sensor.
8. A driver incident assistance system as described in paragraph 7, the one or more processors being configured to analyse said image data received from said at least one external image sensor to assign a classification to said incident; and/or to identify one or more characteristics of another vehicle involved in the incident.
9. A driver incident assistance system as described in paragraph 7, the one or more processors being configured to analyse said image data received from the at least one internal image sensor to track movements and/or recognise gestures of a vehicle occupant following the incident.
10. A driver incident assistance system as described in paragraph 9, the one or more processors being configured to assign a classification to a condition of said vehicle occupant in dependence on said tracked movements and/or recognised gestures of the vehicle occupant.
11. A driver incident assistance system as described in paragraph 1, wherein said one or more sensors comprise one or more of the following: an ultrasonic sensor; a radar sensor; a lateral accelerometer; a longitudinal accelerometer; a vehicle speed sensor; a global positioning system (GPS) sensor; a driver monitoring sensor.
12. A driver incident assistance system as described in paragraph 11, the one or more processors being configured to assign a classification to said incident based on one or more thresholds defined for said one or more sensors.
13. A driver incident assistance system as described in paragraph 1, wherein the output of information is controlled in dependence on a determined geographical location of the vehicle; and/or a determined driver condition.
14. A vehicle comprising a driver incident assistance system as described in paragraph 1 coupled to one or more sensors.
15. A method of controlling the output of information to a driver of a vehicle following an incident, the method comprising: receiving sensor data from one or more sensors; analysing the sensor data to determine that an incident has occurred; assigning a classification to said incident; and controlling the output of information to the driver in dependence on the assigned classification of said incident.
16. A method as described in paragraph 15, wherein the output of information is controlled in dependence on a determined geographical location of the vehicle; and/or a determined driver condition.

Claims (19)

CLAIMS:
1. A system for outputting information to a driver of a vehicle following an incident, the system comprising one or more processors configured to: receive sensor data from one or more sensors; analyse the sensor data to determine that an incident has occurred; assign a classification to said incident; and control the output of information to the driver in dependence on the assigned classification of said incident.
2. A system as claimed in claim 1, the one or more processors being configured to classify said incident as being a minor incident or a major incident.
3. A system as claimed in claim 2, wherein a first set of information is output if the incident is classified as being a minor incident; and a second set of information is output if the incident is classified as being a major incident.
4. A system as claimed in any one of claims 1, 2 or 3, the one or more processors being configured to determine if another vehicle is involved in said incident.
5. A system as claimed in claim 4, the one or more processors being configured to control the output of information to the driver in dependence on the determination of whether another vehicle is involved in said incident.
6. A system as claimed in any one of the preceding claims, the one or more processors being configured selectively to output sensor data generated at the time of the incident to memory.
7. A system as claimed in any one of the preceding claims, the one or more processors being configured to receive image data from at least one external image sensor and/or at least one interior image sensor.
8. A system as claimed in claim 7, the one or more processors being configured to analyse said image data received from said at least one external image sensor to assign a classification to said incident; and/or to identify one or more characteristics of another vehicle involved in the incident.
9. A system as claimed in claim 7 or claim 8, the one or more processors being configured to analyse said image data received from the at least one internal image sensor to track movements and/or recognise gestures of a vehicle occupant following the incident.
10. A system as claimed in claim 9, the one or more processors being configured to assign a classification to a condition of said vehicle occupant in dependence on said tracked movements and/or recognised gestures of the vehicle occupant.
11. A system as claimed in any one of the preceding claims, wherein said one or more sensors comprise one or more of the following: an ultrasonic sensor; a radar sensor; a lateral accelerometer; a longitudinal accelerometer; a vehicle speed sensor; a global positioning system (GPS) sensor; a driver monitoring sensor.
12. A system as claimed in claim 11, the one or more processors being configured to assign a classification to said incident based on one or more thresholds defined for said one or more sensors.
13. A system as claimed in any one of the preceding claims, wherein the output of information is controlled in dependence on a determined geographical location of the vehicle; and/or a determined driver condition.
14. A vehicle comprising a system as claimed in any one of the preceding claims coupled to one or more sensors.
15. A method of controlling the output of information to a driver of a vehicle following an incident, the method comprising: receiving sensor data from one or more sensors; analysing the sensor data to determine that an incident has occurred; assigning a classification to said incident; and controlling the output of information to the driver in dependence on the assigned classification of said incident.
16. A method as claimed in claim 15, wherein the output of information is controlled in dependence on a determined geographical location of the vehicle; and/or a determined driver condition.
17. A system substantially as herein described with reference to the accompanying figures.
18. A method substantially as herein described with reference to the accompanying figures.
19. A vehicle substantially as herein described with reference to the accompanying figures.
GB1407713.5A 2014-05-01 2014-05-01 Driver incident assistance system and method Active GB2525654B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1407713.5A GB2525654B (en) 2014-05-01 2014-05-01 Driver incident assistance system and method


Publications (3)

Publication Number Publication Date
GB201407713D0 GB201407713D0 (en) 2014-06-18
GB2525654A true GB2525654A (en) 2015-11-04
GB2525654B GB2525654B (en) 2018-12-26


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2554559A (en) * 2016-09-23 2018-04-04 Auto Logisitic Solutions Ltd Vehicle accident detection and notification
CN112639894A (en) * 2018-07-20 2021-04-09 C·博诺 Computer-aided method for analyzing accident data related to persons
US12175816B2 (en) 2021-11-29 2024-12-24 Amazon Technologies, Inc. Fleet data collection using a unified model to collect data from heterogenous vehicles
US12340636B2 (en) 2021-01-27 2025-06-24 Amazon Technologies, Inc. Vehicle data extraction service
US12462618B1 (en) * 2021-02-01 2025-11-04 Amazon Technologies, Inc. Vehicle analysis service for providing logic for local analysis and additional remote support

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6714894B1 (en) * 2001-06-29 2004-03-30 Merritt Applications, Inc. System and method for collecting, processing, and distributing information to promote safe driving
US20050230947A1 (en) * 2004-04-16 2005-10-20 Hon Hai Precision Industry Co., Ltd. Electronic safety device for a motor vehicle
US20100157061A1 (en) * 2008-12-24 2010-06-24 Igor Katsman Device and method for handheld device based vehicle monitoring and driver assistance
WO2012080741A1 (en) * 2010-12-15 2012-06-21 Andrew William Wright Method and system for logging vehicle behaviour
US20130006674A1 (en) * 2011-06-29 2013-01-03 State Farm Insurance Systems and Methods Using a Mobile Device to Collect Data for Insurance Premiums



