US20160325676A1 - System and method for warning a driver of pedestrians and other obstacles - Google Patents
- Publication number
- US20160325676A1
- Authority
- US
- United States
- Prior art keywords
- vehicle
- warning
- pedestrian
- driver
- pedestrians
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/008—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
Definitions
- the present application generally relates to a vehicle warning system, and, more particularly, to a system and method for enhancing a driver's awareness of pedestrians and other objects by showing projected travel paths of the vehicle, pedestrian and/or moving objects both within and outside an effective field of view of the driver.
- Warning systems may be general warning systems that may inform the driver of different operating conditions of the vehicle.
- vehicles may be equipped with warning systems which may be used to warn the driver of low fuel amounts, high and/or low engine temperature, a drop in oil pressure, problems in charging the vehicle battery, doors and/or trunks that may be open, as well other vehicle conditions.
- Motorized vehicles may be equipped with more advanced warning systems which may be used to warn the driver of the vehicle about potentially dangerous situations involving other traffic participants.
- warning systems have been designed for vehicles that may be used to provide a driver with back-up collision warning, forward collision warning, blind spot detection, lane-departure warnings, as well as other driving condition warnings.
- While the above systems do provide the driver with warnings of potentially dangerous situations, these systems generally do not provide any information about a potential path of the projected danger. For example, while a blind spot detection system is able to monitor if a vehicle or other object is located in a blind spot of a vehicle, the blind spot detection system does not provide any information as to whether the vehicle and/or object detected is moving towards or away from the vehicle. Similarly, while back-up warning systems may alert a driver to potential objects located behind the vehicle while the vehicle is in reverse, these systems fail to provide any information as to whether the object is moving towards or away from the vehicle. Present warning systems generally do not analyze how a pedestrian is actually moving. Beyond walking directly across a street, pedestrians may have different trajectories or paths. For example, the pedestrian may analyze traffic patterns and move in an indirect path.
- a system for warning of potential hazards when driving a vehicle has a sensor coupled to the vehicle capturing data of objects located around the vehicle.
- a control unit is coupled to the sensor for processing the data captured by the sensor and generating graphical representations of the objects captured by the sensor and warning indicators alerting a driver of the vehicle to the objects.
- the control unit calculates projected paths of the objects and a projected travel path of the vehicle.
- the warning indicators are generated when a specified projected path of a specified object and the projected travel path intersect.
- a heads-up display (HUD) displays the graphical representations of the objects and the warning indicators. The warning indicators are positioned on the HUD in directions toward each associated object.
- a system for warning of potential hazards when driving a vehicle has image sensors coupled to the vehicle capturing pedestrians located around the vehicle. Monitoring sensors are coupled to the vehicle measuring speed and travel direction of the vehicle.
- a control unit is coupled to the image sensors and monitoring sensors processing data from the image sensors and monitoring sensors and generating graphical representations of pedestrians captured by the image sensors and warning indicators alerting a driver of the vehicle to the pedestrians captured.
- the control unit calculates projected paths for the pedestrians and a projected travel path of the vehicle.
- the warning indicators are generated when a specified projected path of a specified pedestrian and the projected travel path of the vehicle intersect.
- a heads-up display (HUD) shows the graphical representations of pedestrians captured by the sensor and the warning indicators. The warning indicators are positioned on the HUD in directions toward each corresponding pedestrian. The HUD displays a single warning indicator at a time, for the closest pedestrian.
- a system for warning of potential hazards when driving a vehicle has image sensors coupled to the vehicle and capturing pedestrians located around the vehicle. Monitoring sensors coupled to the vehicle measuring speed and travel direction of the vehicle.
- a control unit is coupled to the image sensors and monitoring sensors processing data from the image sensors and monitoring sensors and generating three dimensional graphical representations of pedestrians captured by the image sensors and warning indicators alerting a driver of the vehicle to the pedestrians captured.
- the control unit calculates projected paths for the pedestrians and a projected travel path of the vehicle.
- the warning indicators are generated when a specified projected path of a specified pedestrian and the projected travel path of the vehicle intersect.
- a heads-up display (HUD) shows the graphical representations of pedestrians captured by the sensor and the warning indicators.
- the warning indicators are positioned on the HUD in directions toward each corresponding pedestrian.
- the HUD displays a single warning indicator at a time, for the closest pedestrian.
- the HUD displays a first warning type indicator for a pedestrian within a field of view (FOV) of the driver and a second warning type indicator for a pedestrian out of the FOV.
- FOV field of view
- FIG. 1 is an elevated perspective view of a vehicle implementing an exemplary warning system that may be used to enhance a driver's awareness of pedestrians and objects around the vehicle in accordance with one aspect of the present application;
- FIG. 2 is a simplified functional block diagram of the exemplary warning system depicted in FIG. 1 in accordance with one aspect of the present application;
- FIG. 3 is a simplified functional block diagram of a control unit shown in FIG. 2 in accordance with one aspect of the present application;
- FIG. 4 is a top view depicting operation of the exemplary warning system in accordance with one aspect of the present application;
- FIG. 5 is an elevated perspective view depicting generation of a warning indicator for use in the exemplary warning system in accordance with one aspect of the present application
- FIGS. 6A-6E depict different examples of how the warning indicator may be generated by the exemplary warning system in accordance with one aspect of the present application;
- FIG. 7 is a top view depicting operation of the exemplary warning system in accordance with another aspect of the present application;
- FIG. 8 shows one embodiment of an exemplary Augmented Reality (AR) visual generated by the warning system of FIG. 1 in accordance with one aspect of the present application;
- AR Augmented Reality
- FIGS. 9A-9B show side views of the generation of a warning indicator by the warning system of FIG. 1 in accordance with one aspect of the present application;
- FIG. 10A is a top view indicating the different fields of view (FOVs) of a driver of the vehicle; and
- FIG. 10B is a side view showing the generation of the warning indicator of FIGS. 9A-9B in accordance with one aspect of the present application.
- the vehicle 10 may be equipped with a warning system 12 that may be used to enhance a driver's awareness of pedestrians and other objects by showing an estimated driving area of the vehicle 10 and projected moving paths of pedestrians and/or moving objects both within and outside an effective field of view of the driver.
- the warning system 12 may be configured to project actual locations and dynamics of pedestrians or other moving objects.
- the warning system 12 may display a warning when the moving path of a pedestrian and/or moving object intersects with the estimated driving area of the vehicle 10 .
- the warning system 12 may have a plurality of sensors 16 .
- the sensors 16 may be positioned around a perimeter of the vehicle 10 .
- the sensors 16 may be configured to be mounted within the body of the vehicle 10 .
- the sensors 16 may be used to capture data of objects located around the vehicle 10 .
- the sensors 16 may be cameras, image sensors, ultrasonic or radar sensors, or other types of image capturing devices. Alternatively, the sensors 16 may be located on the street on which the vehicle 10 is traveling.
- the sensors 16 may be configured to capture data of objects located around the vehicle 10 and transmit the captured data to the vehicle 10.
- the warning system 12 may have one or more monitoring sensors 18 .
- the monitoring sensors 18 may be coupled to one or more operating systems 20 of the vehicle 10 .
- the monitoring sensors 18 may be used to detect operating conditions of the vehicle 10 .
- the monitoring sensors 18 may be used to monitor a speed of the vehicle 10 , whether the vehicle 10 is making a turn, or other operating conditions of the vehicle 10 .
- the warning system 12 may have a Global Positioning System (GPS) unit 28 located in the vehicle 10 .
- GPS Global Positioning System
- the GPS unit 28 may be used to determine a geographical location of the vehicle 10 , provide turn-by-turn driving instructions, indicate various points of interest, as well as provide other directional data.
- the GPS unit 28 may be used to determine if the vehicle 10 is moving and/or turning, the speed of the vehicle 10 , as well as other operating conditions of the vehicle 10 .
- the sensors 16 and the monitoring sensors 18 may be coupled to a control unit 22 .
- the control unit 22 may take and process the data captured by the sensors 16 .
- the control unit 22 may process the data in order to detect and identify the different objects detected by the sensors 16 .
- the control unit 22 may identify the position of the different objects as well as whether the object is moving. If moving, the control unit 22 may be used to calculate the speed and direction of the moving object.
- the control unit 22 may then take the processed data and generate graphical representations of the objects captured by the sensors 16 and provide graphical representations of the projected paths of the moving objects.
- the control unit 22 may process data generated by the monitoring sensors 18 of the vehicle 10 .
- the control unit 22 may receive data from the monitoring sensors 18 in order to determine the speed and/or the direction the vehicle 10 may be traveling.
- the control unit 22 may then take the speed and/or directional data and determine a projected travel path of the vehicle 10 .
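The projected travel path derived from speed and direction could be sketched as a simple dead-reckoning projection. This is an illustrative sketch, not the application's actual method; the time horizon, number of steps, and constant-heading assumption are all assumptions.

```python
import math

def project_travel_path(x, y, heading_rad, speed_mps, horizon_s=3.0, steps=6):
    """Project a straight-line travel path ahead of the vehicle from its
    current position (x, y), heading, and speed, sampled at evenly spaced
    times out to horizon_s seconds (simplified dead reckoning)."""
    path = []
    for i in range(1, steps + 1):
        t = horizon_s * i / steps
        path.append((x + speed_mps * t * math.cos(heading_rad),
                     y + speed_mps * t * math.sin(heading_rad)))
    return path
```

In practice the control unit would refresh such a projection continuously as new speed and direction readings arrive from the monitoring sensors.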
- the warning system 12 is a dynamic system.
- the control unit 22 may continuously update graphical representations of the objects captured by the sensors 16 as the vehicle 10 is moving.
- graphical representations of projected paths of the objects detected may be continuously updated.
- Graphical representations indicating the projected travel path of the vehicle 10 may also be continuously updated.
- the control unit 22 may be coupled to a display 24 .
- the display 24 may be used to show the graphical representations generated by the control unit 22 of the objects captured by the sensors 16 , projected paths of the moving objects, as well as the graphical representation indicating the projected traveling path of the vehicle 10 .
- the control unit 22 may be coupled to a Heads Up Display (HUD) system 26 .
- the HUD system 26 may be used to display the graphical representations generated by the control unit 22 of the objects captured by the sensors 16 , projected paths of the moving objects, as well as the graphical representation indicating the traveling path of the vehicle 10 .
- the warning system 12 may be configured so that the display 24 and/or HUD system 26 displays the general area viewable in front of the driver when the driver is seated in the vehicle 10 (hereinafter Field of View (FOV) of the driver).
- FOV Field of View
- a text string indicating a general position of the detected object and a potential danger level may be generated and shown on the display 24 and/or HUD system 26 .
- a Global Positioning System (GPS) unit 28 may be coupled to the control unit 22 .
- the GPS unit 28 may be used to provide geographical information to the control unit 22 . Based on the location indicated by the GPS unit 28 , the GPS unit 28 may load and transfer location data about the indicated location. For example, the GPS unit 28 may load and transfer satellite imagery of the current location. This imagery may be sent to the control unit 22 , which may generate a graphical representation of the satellite images to be shown on the display 24 and/or HUD system 26 . Further, as disclosed above, the GPS unit 28 may be used to determine if the vehicle 10 is moving by monitoring the speed and direction of the vehicle 10 .
- the control unit 22 may have an image processing module 30 .
- the image processing module 30 may be coupled to the sensors 16 .
- the image processing module 30 may process the data from the sensors 16 in order to detect and identify the different objects detected by the sensors 16 .
- the image processing module 30 may identify the different objects detected as well as determine whether the object is moving and/or a potential moving path of the object.
- the image processing module 30 may be configured to identify and distinguish pedestrians from other objects detected around the vehicle 10 .
- the image processing module 30 may be used to identify potentially hazardous situations to the driver, such as pedestrians or bicyclists, as opposed to non-hazardous objects such as a fire hydrant, garbage can, or other non-hazardous objects.
- the control unit 22 may be configured so that potentially hazardous objects like pedestrians are shown, while non-hazardous objects may not be processed and/or displayed.
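A minimal sketch of this hazardous/non-hazardous filtering, assuming the image processing module has already labeled each detection upstream (the label names and the tuple format are hypothetical, not from the application):

```python
# Object classes the system would treat as potentially hazardous;
# fire hydrants, garbage cans, etc. would be filtered out and
# neither processed further nor displayed.
HAZARDOUS_LABELS = {"pedestrian", "bicyclist"}

def filter_hazards(detections):
    """Keep only detections that should be processed and displayed.
    Each detection is a (label, position) tuple, a hypothetical
    format for output from the image processing module."""
    return [d for d in detections if d[0] in HAZARDOUS_LABELS]
```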
- the image processing module 30 may be used to calculate the speed, direction and/or potential moving path of the object. Based on the calculated speed and direction of the moving object, a path processing module 32 of the control unit 22 may be used to calculate a projected path of the moving object. The path processing module 32 may further be used to calculate a projected pathway of the vehicle 10 . The path processing module 32 may receive data from the monitoring sensor 18 and or GPS unit 28 indicating speed and directional information of the vehicle 10 . Based on this information, the path processing module 32 may calculate the projected pathway of the vehicle 10 .
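The speed/direction calculation and path projection described for these modules might look like the following constant-velocity sketch; estimating motion from just two successive sensor observations is an illustrative simplification, not the application's stated method.

```python
import math

def estimate_motion(p0, p1, dt_s):
    """Estimate speed (m/s) and heading (radians) of a tracked object
    from two successive sensor positions observed dt_s seconds apart."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    speed = math.hypot(dx, dy) / dt_s
    heading = math.atan2(dy, dx)
    return speed, heading

def project_position(p, speed, heading, t_s):
    """Project the object's position t_s seconds ahead, assuming
    constant speed and heading."""
    return (p[0] + speed * t_s * math.cos(heading),
            p[1] + speed * t_s * math.sin(heading))
```

Projecting several such positions at increasing times yields the object's projected path, which can then be compared against the vehicle's projected pathway.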
- An augmented reality processing module 34 of the control unit 22 may be used to generate graphical representations of the different objects detected by the sensors 16 , graphical representations of the projected paths of the moving objects and graphical representations indicating the projected traveling pathway of the vehicle 10 .
- the augmented reality processing module 34 may further generate graphical representations of location data provided by the GPS unit 28 .
- the graphical representations generated by the augmented reality processing module 34 may be two dimensional representations or three dimensional representations.
- the graphical representations generated by the augmented reality processing module 34 may be shown on a display 24 located within the vehicle 10 .
- the HUD system 26 may be used to display the graphical representations generated by the augmented reality processing module 34 .
- the system 12 may calculate an estimated driving pathway 38 of the vehicle 10 based on data from the monitoring sensors 18 and/or the GPS unit 28 .
- the driving pathway 38 may be dynamic and may adjust in length/size based on the speed and movement of the vehicle 10 .
- the system 12 may monitor for pedestrians, bicyclists, or similar hazardous objects 40 (hereinafter pedestrians 40 ) located near the vehicle 10 using the sensors 16 . If the pedestrian 40 is moving, the system 12 may calculate an estimated moving vector 42 of the pedestrian 40 . If the moving vector 42 and/or the pedestrian 40 intersects with the estimated driving pathway 38 , the system 12 may generate a potential collision warning indicator 44 (hereinafter warning indicator 44 ).
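The intersection test between the moving vector 42 and the estimated driving pathway 38 could be sketched as follows; modeling the pathway as an axis-aligned box and sampling points along the vector are both illustrative simplifications.

```python
def vector_enters_pathway(ped_pos, move_vec, pathway_box, steps=10):
    """Sample along the pedestrian's estimated moving vector and report
    whether any sampled point falls inside the estimated driving
    pathway, modeled here as a box (xmin, ymin, xmax, ymax)."""
    (px, py), (dx, dy) = ped_pos, move_vec
    xmin, ymin, xmax, ymax = pathway_box
    for i in range(steps + 1):
        t = i / steps
        x, y = px + dx * t, py + dy * t
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return True   # moving vector crosses the pathway -> warn
    return False
```

A True result here would correspond to the system generating the warning indicator 44.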
- the warning indicator 44 may be located on a display 24 or on a display area 46 of the HUD 26 . The warning indicator 44 may appear in the general area where the pedestrian 40 is located and where the potential impact may occur. As shown in FIG. 4 , the warning indicator 44 may be located on the upper left side of the display area 46 which may correspond to the general location of the pedestrian 40 .
- the warning indicator 44 may be generated if the moving vector 42 and/or the pedestrians 40 intersect with the estimated driving pathway 38 .
- a warning base 48 may be associated with each pedestrians 40 detected by the sensors 16 .
- the warning base 48 may be an area proximate the pedestrians 40 where potential impact with the vehicle 10 may occur.
- the warning base 48 may indicate a general area in which the pedestrian 40 has a potential of moving and still pose a potential threat of impact with the vehicle 10 . Since the pedestrian 40 may not move directly across the street but may alter the trajectory to analyze traffic patterns, avoid potholes or other obstacles, or for other reasons, the warning base 48 may indicate a potential movement area that may pose a danger for the pedestrian 40 .
- in the embodiment shown, the warning base 48 may be circular or oval in shape and located proximate the pedestrian 40 . However, this is shown only as an example; the warning base 48 may be other shapes and/or sizes.
- the warning base 48 may be dynamic and may change in shape and/or size as the pedestrian 40 detected changes speed and/or direction of movement.
- the system 12 may determine an impact zone 50 .
- the impact zone 50 may be configured to include an area in front of the vehicle 10 where impact with an object may occur.
- the area where the warning base 48 and impact zone 50 coincide may determine a size and shape of the warning indicator 44 as well as the location of the warning indicator 44 on the display area 46 of the HUD 26 .
- Each warning indicator 44 may be formed where the warning base 48 and impact zone 50 coincide and may be positioned on the display area 46 of the HUD 26 to draw the attention of the driver to the pedestrian 40 .
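The coincidence of the circular warning base 48 with the impact zone 50, and how the overlap could drive the indicator's size and placement, might be sketched as below. The circle-versus-rectangle geometry, units, and the returned (anchor, extent) pair are illustrative assumptions.

```python
def warning_indicator(base_center, base_radius, zone):
    """Compute the overlap of a circular warning base with a rectangular
    impact zone (xmin, ymin, xmax, ymax). Returns None when they do not
    coincide; otherwise an (anchor_x, extent) pair that could size the
    indicator and place it toward the pedestrian on the HUD."""
    cx, cy = base_center
    xmin, ymin, xmax, ymax = zone
    # Closest point of the impact zone to the warning-base center
    nx = min(max(cx, xmin), xmax)
    ny = min(max(cy, ymin), ymax)
    d2 = (cx - nx) ** 2 + (cy - ny) ** 2
    if d2 > base_radius ** 2:
        return None               # no overlap -> no warning indicator
    extent = base_radius - d2 ** 0.5
    return nx, extent             # anchor toward pedestrian, sized by overlap
```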
- the warning base 48 may be wholly within the impact zone 50 .
- the warning indicators 44 may be positioned in a center area of the display area 46 of the HUD 26 as the pedestrian 40 may be directly in front of the vehicle 10 .
- a portion of the warning base 48 and impact zone 50 may coincide.
- the warning indicators 44 may each be positioned in an area of the display area 46 of the HUD 26 where the pedestrian 40 may be located. The position of the warning indicators 44 may be used to draw the attention of the driver to the location of the pedestrian 40 as indicated by the arrow 52 .
- the warning indicators 44 may be positioned on a left-hand side of the display area 46 of the HUD 26 where the pedestrian 40 may be located to draw the attention of the driver to the pedestrian 40 .
- the warning indicators 44 may be positioned in a center area of the display area 46 of the HUD 26 as the pedestrian 40 may be directly in front of the vehicle 10 .
- the system 12 may detect multiple pedestrians 40 located near the vehicle 10 .
- the pedestrian 40 A may be located closest to the vehicle 10 , while the pedestrian 40 B may be further away from the vehicle 10 than pedestrian 40 A.
- Pedestrian 40 A may have an associated warning base 48 A and impact zone 50 A which may form warning indicator 44 A.
- Pedestrian 40 B may have an associated warning base 48 B and impact zone 50 B which may form warning indicator 44 B.
- the system 12 may be configured so that the display area 46 of the HUD 26 shows the closest warning indicator 44 A. Showing the closest warning indicator 44 A may lessen distractions to the driver and may draw the driver's attention to the more imminent issue.
- the warning indicator 44 B may be shown on the display area 46 once the vehicle 10 passes the pedestrian 40 A, if pedestrian 40 A moves away from the vehicle 10 or when the warning base 48 A and impact zone 50 A no longer coincide.
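Selecting the single indicator to display, that of the closest pedestrian, could be as simple as the sketch below; the (distance, id) candidate format is a hypothetical one.

```python
def active_indicator(candidates):
    """Given (distance_m, indicator_id) pairs for pedestrians whose
    warning base and impact zone currently coincide, return the id of
    the indicator to show: the one closest to the vehicle. Returns
    None when no warning is active, so a farther indicator (e.g. 44 B)
    appears only after the closer one (44 A) clears."""
    if not candidates:
        return None
    return min(candidates)[1]
```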
- the associated warning base 48 A and impact zone 50 A may be smaller than the warning base 48 B and impact zone 50 B.
- Warning base 48 B and impact zone 50 B may be larger to coincide with the larger field of view of the driver to objects located at a greater distance from the vehicle 10 .
- the augmented reality shown may be the FOV in front of the driver when the driver is seated in the vehicle 10 .
- the control unit 22 may generate one or more graphical representations of pedestrians 40 A- 40 B detected by the sensors 16 .
- two pedestrians 40 may be seen as being in the FOV of the driver.
- the warning indicator 44 A may be placed adjacent to the pedestrian 40 A.
- the warning indicator 44 A may be positioned to draw the driver's attention to the pedestrian 40 A in the FOV of the driver.
- the warning indicator 44 may flash and/or blink to further draw the driver's attention to the pedestrian 40 A in the FOV of the driver.
- the system 12 may be configured so that the display area 46 of the HUD 26 shows the closest warning indicator 44 A. Showing the closest warning indicator 44 A may lessen distractions to the driver and may draw the driver's attention to the more imminent issue.
- the warning indicator 44 B may be shown on the display area 46 once the vehicle 10 passes the pedestrian 40 A, if pedestrian 40 A moves away from the vehicle 10 or when the warning base 48 A and impact zone 50 A no longer coincide.
- when driving, the driver may have an effective FOV 52 and a general FOV 54 .
- the effective FOV 52 may be the area directly in front of the driver where the driver may be focusing when driving.
- the general FOV 54 may be a larger area than the effective FOV 52 and may include areas the driver may see if the driver were to move his or her eyes to the left and/or right or if the driver were to rotate his or her head to the left or right.
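The effective/general FOV distinction can be sketched as a bearing-angle check; the half-angle values below are illustrative assumptions, not figures from the application.

```python
def classify_fov(bearing_deg, effective_half=15.0, general_half=45.0):
    """Classify a pedestrian's bearing (degrees off the driver's
    straight-ahead line of sight) into the effective FOV, the general
    FOV, or outside both; half-angles are hypothetical."""
    b = abs(bearing_deg)
    if b <= effective_half:
        return "effective"    # directly in the driver's focus area
    if b <= general_half:
        return "general"      # visible with eye or head movement
    return "outside"          # warrants an out-of-FOV warning
```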
- the system 12 may inform the driver of the potential danger.
- the pedestrians 40 C and 40 D may be located proximate the vehicle 10 but outside the effective FOV 52 of the driver.
- the system 12 may generate a warning indicator 56 to indicate a general position of the pedestrians 40 C- 40 D who are out of or towards the outer edge of the FOV 52 of the driver.
- the warning indicator 56 may signal to the driver to turn and look in the direction of the warning indicator 56 to see the pedestrians 40 C- 40 D.
- the warning indicator 56 may flash and/or blink to draw the driver's attention to the pedestrians 40 C- 40 D.
- An audible warning may also be provided.
- the warning indicator 56 may be a light array 56 A which moves towards the pedestrians 40 C- 40 D.
- the light array 56 may start on a left side of the display area 46 and move towards the right side of the display area 46 as the pedestrian 40 C moves closer to the FOV 52 .
- the light array 56 may start on a right side of the display area 46 and move towards the left side of the display area 46 as the pedestrian 40 D moves closer to the FOV 52 .
- the light array 56 may change in color as the pedestrians 40 C- 40 D move closer to the vehicle.
- the light array 56 may be color-coded to indicate the relative position of the pedestrians 40 C- 40 D to the vehicle 10 and the potential danger of the vehicle 10 colliding with one of the pedestrians 40 C- 40 D.
- a first section of the light array 56 may start off in a first color (e.g., green), change to a second color (e.g., yellow) in a second section of the light array 56 as the pedestrians 40 C- 40 D move closer to the vehicle, and turn to a third color (e.g., red) in a third section of the light array 56 if the pedestrians 40 C- 40 D move within the estimated driving pathway 38 of the vehicle 10 .
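The green/yellow/red progression of the light array 56 might be sketched as below; the distance threshold is an illustrative assumption, and in_pathway stands for the pedestrian having moved within the estimated driving pathway 38.

```python
def light_array_color(distance_m, in_pathway):
    """Map pedestrian distance and pathway status to a light-array
    color, following the first/second/third color progression; the
    10 m threshold is a hypothetical value."""
    if in_pathway:
        return "red"       # pedestrian inside the estimated driving pathway
    if distance_m < 10.0:
        return "yellow"    # pedestrian moving closer to the vehicle
    return "green"         # pedestrian detected but still distant
```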
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
- The present application generally relates to a vehicle warning system, and, more particularly, to a system and method for enhancing a driver's awareness of pedestrians and other objects by showing projected travel paths of the vehicle, pedestrian and/or moving objects both within and outside an effective field of view of the driver.
- Motorized vehicles may be equipped with various kinds of warning systems. These warning systems may be general warning systems that may inform the driver of different operating conditions of the vehicle. For example, vehicles may be equipped with warning systems which may be used to warn the driver of low fuel amounts, high and/or low engine temperature, a drop in oil pressure, problems in charging the vehicle battery, doors and/or trunks that may be open, as well other vehicle conditions.
- Motorized vehicles may be equipped with more advanced warning systems which may be used to warn the driver of the vehicle about potentially dangerous situations involving other traffic participants. For example, warning systems have been designed for vehicles that may be used to provide a driver with back-up collision warning, forward collision warning, blind spot detection, lane-departure warnings, as well as other driving condition warnings.
- While the above systems do provide the driver with warnings of potentially dangerous situations, these systems generally do not provide any information about a potential path of the projected danger. For example, while a blind spot detection system is able to monitor if a vehicle or other object is located in a blind spot of a vehicle, the blind spot detection system does not provide any information as to whether the vehicle and/or object detected is moving towards or away from the vehicle. Similarly, while back-up warning systems may alert a driver to potential objects located behind the vehicle while the vehicle is in reverse, these systems fail to provide any information as to whether the object is moving towards or away from the vehicle. Present warning systems generally do not analyze how a pedestrian is actually moving. Beyond walking directly across a street, pedestrians may have different trajectories or paths. For example, the pedestrian may analyze traffic patterns and move in an indirect path.
- Therefore, it would be desirable to provide a vehicle warning system and method that overcome the above problems.
- In accordance with one embodiment, a system for warning of potential hazards when driving a vehicle is disclosed. The system has a sensor coupled to the vehicle capturing data of objects located around the vehicle. A control unit is coupled to the sensor for processing the data captured by the sensor and generating graphical representations of the objects captured by the sensor and warning indicators alerting a driver of the vehicle to the objects. The control unit calculates projected paths of the objects and a projected travel path of the vehicle. The warning indicators are generated when a specified projected path of a specified object and the projected travel path intersect. A heads-up display (HUD) displays the graphical representations of the objects and the warning indicators. The warning indicators are positioned on the HUD in directions toward each associated object.
- In accordance with one embodiment, a system for warning of potential hazards when driving a vehicle is disclosed. The system has image sensors coupled to the vehicle capturing pedestrians located around the vehicle. Monitoring sensors are coupled to the vehicle measuring speed and travel direction of the vehicle. A control unit is coupled to the image sensors and monitoring sensors processing data from the image sensors and monitoring sensors and generating graphical representations of pedestrians captured by the image sensors and warning indicators alerting a driver of the vehicle to the pedestrians captured. The control unit calculates projected paths for the pedestrians and a projected travel path of the vehicle. The warning indicators are generated when a specified projected path of a specified pedestrian and the projected travel path of the vehicle intersect. A heads-up display (HUD) shows the graphical representations of pedestrians captured by the sensor and the warning indicators. The warning indicators are positioned on the HUD in directions toward each corresponding pedestrian. The HUD displays a single warning indicator at a time, for the closest pedestrian.
- In accordance with one embodiment, a system for warning of potential hazards when driving a vehicle is disclosed. The system has image sensors coupled to the vehicle and capturing pedestrians located around the vehicle. Monitoring sensors are coupled to the vehicle measuring speed and travel direction of the vehicle. A control unit is coupled to the image sensors and monitoring sensors processing data from the image sensors and monitoring sensors and generating three dimensional graphical representations of pedestrians captured by the image sensors and warning indicators alerting a driver of the vehicle to the pedestrians captured. The control unit calculates projected paths for the pedestrians and a projected travel path of the vehicle. The warning indicators are generated when a specified projected path of a specified pedestrian and the projected travel path of the vehicle intersect. A heads-up display (HUD) shows the graphical representations of pedestrians captured by the sensor and the warning indicators. The warning indicators are positioned on the HUD in directions toward each corresponding pedestrian. The HUD displays a single warning indicator at a time, for the closest pedestrian. The HUD displays a first warning type indicator for a pedestrian within a field of view (FOV) of the driver and a second warning type indicator for a pedestrian out of the FOV.
- The present application is further detailed with respect to the following drawings. These figures are not intended to limit the scope of the present application but rather illustrate certain attributes thereof.
FIG. 1 is an elevated perspective view of a vehicle implementing an exemplary warning system that may be used to enhance a driver's awareness of pedestrians and objects around the vehicle in accordance with one aspect of the present application; -
FIG. 2 is a simplified functional block diagram of the exemplary warning system depicted in FIG. 1 in accordance with one aspect of the present application; -
FIG. 3 is a simplified functional block diagram of a control unit shown in FIG. 2 in accordance with one aspect of the present application; -
FIG. 4 is a top view depicting operation of the exemplary warning system in accordance with one aspect of the present application; -
FIG. 5 is an elevated perspective view depicting generation of a warning indicator for use in the exemplary warning system in accordance with one aspect of the present application; -
FIGS. 6A-6E depict different examples of how the warning indicator may be generated by the exemplary warning system in accordance with one aspect of the present application; -
FIG. 7 is a top view depicting operation of the exemplary warning system in accordance with another aspect of the present application; -
FIG. 8 shows one embodiment of an exemplary Augmented Reality (AR) visual generated by the warning system of FIG. 1 in accordance with one aspect of the present application; -
FIGS. 9A-9B show side views of the generation of a warning indicator by the warning system of FIG. 1 in accordance with one aspect of the present application; and -
FIG. 10A is a top view indicating the different fields of view (FOVs) of a driver of the vehicle; and -
FIG. 10B is a side view showing the generation of the warning indicator of FIGS. 9A-9B in accordance with one aspect of the present application. - The description set forth below in connection with the appended drawings is intended as a description of presently preferred embodiments of the disclosure and is not intended to represent the only forms in which the present disclosure can be constructed and/or utilized. The description sets forth the functions and the sequence of steps for constructing and operating the disclosure in connection with the illustrated embodiments. It is to be understood, however, that the same or equivalent functions and sequences can be accomplished by different embodiments that are also intended to be encompassed within the spirit and scope of this disclosure.
Referring to FIG. 1, an exemplary vehicle 10 is shown. The vehicle 10 may be equipped with a warning system 12 that may be used to enhance a driver's awareness of pedestrians and other objects by showing an estimated driving area of the vehicle 10 and projected moving paths of pedestrians and/or moving objects both within and outside an effective field of view of the driver. The warning system 12 may be configured to project actual locations and dynamics of pedestrians or other moving objects. The warning system 12 may display a warning when the moving path of a pedestrian and/or moving object intersects with the estimated driving area of the vehicle 10. - Referring now to
FIGS. 1 and 2, the warning system 12 may have a plurality of sensors 16. The sensors 16 may be positioned around a perimeter of the vehicle 10. In the embodiment shown in FIG. 1, the sensors 16 may be configured to be mounted within the body of the vehicle 10. The sensors 16 may be used to capture data of objects located around the vehicle 10. The sensors 16 may be cameras, image sensors, ultrasonic or radar sensors, or other types of image capturing devices. Alternatively, the sensors 16 may be located along the street on which the vehicle 10 is traveling. The sensors 16 may be configured to capture data of objects located around the vehicle 10 and transmit the captured data to the vehicle 10. - The
warning system 12 may have one or more monitoring sensors 18. The monitoring sensors 18 may be coupled to one or more operating systems 20 of the vehicle 10. The monitoring sensors 18 may be used to detect operating conditions of the vehicle 10. For example, the monitoring sensors 18 may be used to monitor a speed of the vehicle 10, whether the vehicle 10 is making a turn, or other operating conditions of the vehicle 10. - The
warning system 12 may have a Global Positioning Satellite (GPS) unit 28 located in the vehicle 10. The GPS unit 28 may be used to determine a geographical location of the vehicle 10, provide turn-by-turn driving instructions, indicate various points of interest, as well as provide other directional data. The GPS unit 28 may be used to determine whether the vehicle 10 is moving and/or turning, the speed of the vehicle 10, as well as other operating conditions of the vehicle 10. - The
sensors 16 and the monitoring sensors 18 may be coupled to a control unit 22. The control unit 22 may take and process the data captured by the sensors 16. The control unit 22 may process the data in order to detect and identify the different objects detected by the sensors 16. The control unit 22 may identify the position of the different objects as well as whether each object is moving. If an object is moving, the control unit 22 may be used to calculate the speed and direction of the moving object. The control unit 22 may then take the processed data and generate graphical representations of the objects captured by the sensors 16 and provide graphical representations of the projected paths of the moving objects. - The
control unit 22 may process data generated by the monitoring sensors 18 of the vehicle 10. The control unit 22 may receive data from the monitoring sensors 18 in order to determine the speed and/or the direction in which the vehicle 10 may be traveling. The control unit 22 may then take the speed and/or directional data and determine a projected travel path of the vehicle 10. - The
warning system 12 is a dynamic system. Thus, the control unit 22 may continuously update the graphical representations of the objects captured by the sensors 16 as the vehicle 10 is moving. Graphical representations of projected paths of the objects detected may be continuously updated. Graphical representations indicating the projected travel path of the vehicle 10 may also be continuously updated. - The
control unit 22 may be coupled to a display 24. The display 24 may be used to show the graphical representations generated by the control unit 22 of the objects captured by the sensors 16, projected paths of the moving objects, as well as the graphical representation indicating the projected traveling path of the vehicle 10. The control unit 22 may be coupled to a Heads Up Display (HUD) system 26. The HUD system 26 may be used to display the graphical representations generated by the control unit 22 of the objects captured by the sensors 16, projected paths of the moving objects, as well as the graphical representation indicating the traveling path of the vehicle 10. - The
warning system 12 may be configured so that the display 24 and/or HUD system 26 displays the general area viewable in front of the driver when the driver is seated in the vehicle 10 (hereinafter the Field of View (FOV) of the driver). However, for objects detected by the sensors 16 but not in the FOV of the driver or towards a far edge of the FOV, for example a pedestrian located behind the vehicle 10, a string indicating a general position of the detected object and a potential danger level may be generated and shown on the display 24 and/or HUD system 26. - As stated above, a Global Positioning Satellite (GPS)
unit 28 may be coupled to the control unit 22. The GPS unit 28 may be used to provide geographical information to the control unit 22. Based on the location indicated by the GPS unit 28, the GPS unit 28 may load and transfer location data about the indicated location. For example, the GPS unit 28 may load and transfer satellite imagery of the current location. This imagery may be sent to the control unit 22, which may generate a graphical representation of the satellite images to be shown on the display 24 and/or HUD system 26. Further, as disclosed above, the GPS unit 28 may be used to determine if the vehicle 10 is moving by monitoring the speed and direction of the vehicle 10. - Referring now to
FIG. 3, a functional block diagram of the control unit 22 may be seen. The control unit 22 may have an image processing module 30. The image processing module 30 may be coupled to the sensors 16. The image processing module 30 may process the data from the sensors 16 in order to detect and identify the different objects detected by the sensors 16. The image processing module 30 may identify the different objects detected as well as determine whether an object is moving and/or a potential moving path of the object. In accordance with one embodiment, the image processing module 30 may be configured to identify and distinguish pedestrians from other objects detected around the vehicle 10. Thus, the image processing module 30 may be used to identify potentially hazardous situations for the driver, such as pedestrians or bicyclists, as opposed to non-hazardous objects such as a fire hydrant or garbage can. The control unit 22 may be configured so that potentially hazardous objects like pedestrians are shown, while non-hazardous objects may not be processed and/or displayed. - If an object is detected as moving, the
image processing module 30 may be used to calculate the speed, direction and/or potential moving path of the object. Based on the calculated speed and direction of the moving object, a path processing module 32 of the control unit 22 may be used to calculate a projected path of the moving object. The path processing module 32 may further be used to calculate a projected pathway of the vehicle 10. The path processing module 32 may receive data from the monitoring sensors 18 and/or GPS unit 28 indicating speed and directional information of the vehicle 10. Based on this information, the path processing module 32 may calculate the projected pathway of the vehicle 10. - An augmented
reality processing module 34 of the control unit 22 may be used to generate graphical representations of the different objects detected by the sensors 16, graphical representations of the projected paths of the moving objects and graphical representations indicating the projected traveling pathway of the vehicle 10. The augmented reality processing module 34 may further generate graphical representations of location data provided by the GPS unit 28. The graphical representations generated by the augmented reality processing module 34 may be two dimensional representations or three dimensional representations. - The graphical representations generated by the augmented
reality processing module 34 may be shown on a display 24 located within the vehicle 10. Alternatively, the HUD system 26 may be used to display the graphical representations generated by the augmented reality processing module 34. - Referring to
FIGS. 1-4, operation of the system 12 may be described in accordance with one embodiment of the present application. As the vehicle 10 moves along a road 36, the system 12 may calculate an estimated driving pathway 38 of the vehicle 10 based on data from the monitoring sensors 18 and/or the GPS unit 28. The driving pathway 38 may be dynamic and may adjust in length/size based on the speed and movement of the vehicle 10. - The
system 12 may monitor for pedestrians, bicyclists, or similar hazardous objects 40 (hereafter pedestrians 40) located near the vehicle 10 using the sensors 16. If a pedestrian 40 is moving, the system 12 may calculate an estimated moving vector 42 of the pedestrian 40. If the moving vector 42 and/or the pedestrian 40 intersects with the estimated driving pathway 38, the system 12 may generate a potential collision warning indicator 44 (hereinafter warning indicator 44). The warning indicator 44 may be located on a display 24 or on a display area 46 of the HUD 26. The warning indicator 44 may appear in the general area where the pedestrian 40 is located and where the potential impact may occur. As shown in FIG. 4, the warning indicator 44 may be located on the upper left side of the display area 46, which may correspond to the general location of the pedestrian 40. - Referring to
FIG. 5, generation of the warning indicator 44 may be described. The warning indicator 44 may be generated if the moving vector 42 and/or the pedestrian 40 intersects with the estimated driving pathway 38. A warning base 48 may be associated with each pedestrian 40 detected by the sensors 16. The warning base 48 may be an area proximate the pedestrian 40 where potential impact with the vehicle 10 may occur. The warning base 48 may indicate a general area in which the pedestrian 40 has a potential of moving and still pose a potential threat of impact with the vehicle 10. Since the pedestrian 40 may not move directly across the street but may alter the trajectory to analyze traffic patterns, avoid potholes or other obstacles, or for other reasons, the warning base 48 may indicate a potential movement area that may pose a danger for the pedestrian 40. In the embodiment shown in FIG. 5, the warning base 48 may be circular or oval in shape and located proximate the pedestrian 40. However, this is shown only as an example; the warning base 48 may be other shapes and/or sizes. The warning base 48 may be dynamic and may change in shape and/or size as the detected pedestrian 40 changes speed and/or direction of movement. - Based on the current speed and direction of the
vehicle 10, the system 12 may determine an impact zone 50. The impact zone 50 may be configured to include an area in front of the vehicle 10 where impact with an object may occur. The area where the warning base 48 and impact zone 50 coincide may determine a size and shape of the warning indicator 44 as well as the location of the warning indicator 44 on the display area 46 of the HUD 26. - Referring to
FIGS. 6A-6E, different exemplary warning indicators 44 may be seen. Each warning indicator 44 may be formed where the warning base 48 and impact zone 50 coincide and may be positioned on the display area 46 of the HUD 26 to draw the attention of the driver to the pedestrian 40. As shown in FIG. 6A, the warning base 48 may be wholly within an impact zone 50. In this embodiment, the warning indicator 44 may be positioned in a center area of the display area 46 of the HUD 26 as the pedestrian 40 may be directly in front of the vehicle 10. - In
FIGS. 6B-6E, a portion of the warning base 48 and impact zone 50 may coincide. The warning indicators 44 may each be positioned in an area of the display area 46 of the HUD 26 where the pedestrian 40 may be located. The position of the warning indicators 44 may be used to draw the attention of the driver to the location of the pedestrian 40, as indicated by the arrow 52. Thus, in FIGS. 6B-6D, the warning indicators 44 may be positioned on a left-hand side of the display area 46 of the HUD 26, where the pedestrian 40 may be located, to draw the attention of the driver to the pedestrian 40. In FIG. 6E, the warning indicator 44 may be positioned in a center area of the display area 46 of the HUD 26 as the pedestrian 40 may be directly in front of the vehicle 10. - Referring to
FIG. 7, as the vehicle 10 moves along a road 36, the system 12 may detect multiple pedestrians 40 located near the vehicle 10. The pedestrian 40A may be located closest to the vehicle 10, while the pedestrian 40B may be further away from the vehicle 10 than pedestrian 40A. Pedestrian 40A may have an associated warning base 48A and impact zone 50A, which may form warning indicator 44A. Pedestrian 40B may have an associated warning base 48B and impact zone 50B, which may form warning indicator 44B. The system 12 may be configured so that the display area 46 of the HUD 26 shows only the closest warning indicator 44A. Showing only the closest warning indicator 44A may lessen distractions to the driver and may draw the driver's attention to the more imminent issue. The warning indicator 44B may be shown on the display area 46 once the vehicle 10 passes the pedestrian 40A, if pedestrian 40A moves away from the vehicle 10, or when the warning base 48A and impact zone 50A no longer coincide. - As shown in
FIG. 7, the associated warning base 48A and impact zone 50A may be smaller than the warning base 48B and impact zone 50B. Warning base 48B and impact zone 50B may be larger to coincide with the larger field of view of the driver to objects located at a greater distance from the vehicle 10. - Referring now to
FIGS. 1-8, an exemplary embodiment of an augmented reality generated and displayed by the warning system 12 may be seen. The augmented reality shown may be the FOV in front of the driver when the driver is seated in the vehicle 10. The control unit 22 may generate one or more graphical representations of pedestrians 40A-40B detected by the sensors 16. In the present embodiment, two pedestrians 40 may be seen as being in the FOV of the driver. To highlight the position of the pedestrian 40A, the warning indicator 44A may be placed adjacent to the pedestrian 40A. The warning indicator 44A may be positioned to draw the driver's attention to the pedestrian 40A in the FOV of the driver. The warning indicator 44A may flash and/or blink to further draw the driver's attention to the pedestrian 40A in the FOV of the driver. The system 12 may be configured so that the display area 46 of the HUD 26 shows only the closest warning indicator 44A. Showing only the closest warning indicator 44A may lessen distractions to the driver and may draw the driver's attention to the more imminent issue. The warning indicator 44B may be shown on the display area 46 once the vehicle 10 passes the pedestrian 40A, if pedestrian 40A moves away from the vehicle 10, or when the warning base 48A and impact zone 50A no longer coincide. - Referring to
FIGS. 9A-10B, when driving, the driver may have an effective FOV 52 and a general FOV 54. The effective FOV 52 may be the area directly in front of the driver where the driver may be focusing when driving. The general FOV 54 may be a larger area than the effective FOV 52 and may include areas the driver may see if the driver were to move the eyes to the left and/or right or if the driver were to rotate his head to the left or right. - If a
pedestrian 40 is out of the general FOV 54 or towards the outer edge of the general FOV 54 of the driver, the system 12 may inform the driver of the potential danger. As may be seen in FIGS. 9A-9B, the pedestrians 40C and 40D may be located proximate the vehicle 10 but outside the general FOV 54 of the driver. The system 12 may generate a warning indicator 56 to indicate a general position of the pedestrians 40C-40D who are out of or towards the outer edge of the FOV of the driver. The warning indicator 56 may signal to the driver to turn and look in the direction of the warning indicator 56 to see the pedestrians 40C-40D. The warning indicator 56 may flash and/or blink to draw the driver's attention to the pedestrians 40C-40D. An audible warning may also be generated. - The
warning indicator 56 may be a light array 56A which moves towards the pedestrians 40C-40D. Thus, for the pedestrian 40C, the light array 56A may start on a left side of the display area 46 and move towards the right side of the display area 46 as the pedestrian 40C moves closer to the FOV 52. For the pedestrian 40D, the light array 56A may start on a right side of the display area 46 and move towards the left side of the display area 46 as the pedestrian 40D moves closer to the FOV 52. The light array 56A may change in color as the pedestrians 40C-40D move closer to the vehicle. The light array 56A may be color coded to indicate the relative position of the pedestrians 40C-40D to the vehicle 10 and the potential danger of the vehicle 10 colliding with one of the pedestrians 40C-40D. For example, a first section of the light array 56A may start off in a first color (i.e., green), change to a second color (i.e., yellow) in a second section of the light array 56A as the pedestrians 40C-40D move closer to the vehicle, and turn to a third color (i.e., red) in a third section of the light array 56A if the pedestrians 40C-40D move within the estimated driving pathway 38 of the vehicle 10.
- While embodiments of the disclosure have been described in terms of various specific embodiments, those skilled in the art will recognize that the embodiments of the disclosure may be practiced with modifications within the spirit and scope of the claims.
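The green/yellow/red progression of the light array described in the specification can be summarized as a simple threshold mapping. The sketch below is purely illustrative; the specification defines only the color progression and the red condition (pedestrian within the estimated driving pathway), so the numeric distance threshold and all names are assumptions.

```python
# Hypothetical sketch of the color-coded light array: the color steps from
# green to yellow as the pedestrian nears the vehicle, and to red once the
# pedestrian enters the estimated driving pathway. The 15 m threshold is
# an illustrative assumption, not a value from the specification.
def light_array_color(distance_m: float, in_driving_pathway: bool) -> str:
    if in_driving_pathway:
        return "red"       # pedestrian inside the estimated driving pathway 38
    if distance_m < 15.0:
        return "yellow"    # pedestrian approaching the vehicle
    return "green"         # pedestrian detected but still distant
```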
Claims (20)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/707,905 US9505346B1 (en) | 2015-05-08 | 2015-05-08 | System and method for warning a driver of pedestrians and other obstacles |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/707,905 US9505346B1 (en) | 2015-05-08 | 2015-05-08 | System and method for warning a driver of pedestrians and other obstacles |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20160325676A1 true US20160325676A1 (en) | 2016-11-10 |
| US9505346B1 US9505346B1 (en) | 2016-11-29 |
Family
ID=57222263
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/707,905 Expired - Fee Related US9505346B1 (en) | 2015-05-08 | 2015-05-08 | System and method for warning a driver of pedestrians and other obstacles |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US9505346B1 (en) |
Cited By (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9981602B2 (en) * | 2016-08-19 | 2018-05-29 | 2236008 Ontario Inc. | System and method for pedestrian alert |
| DE102017204529A1 (en) | 2017-03-17 | 2018-09-20 | Volkswagen Aktiengesellschaft | Arrangement for attentiveness control of the driver of a vehicle and method for driving a device for attentiveness control of the driver of a vehicle |
| EP3454013A1 (en) * | 2017-09-12 | 2019-03-13 | Volkswagen Aktiengesellschaft | Method, device and a computer readable storage medium with instructions for controlling a display of an augmented reality display device for a motor vehicle |
| US10269161B2 (en) * | 2015-09-28 | 2019-04-23 | Nissan Motor Co., Ltd. | Vehicular display device and vehicular display method |
| US10332292B1 (en) * | 2017-01-17 | 2019-06-25 | Zoox, Inc. | Vision augmentation for supplementing a person's view |
| US10384596B2 (en) * | 2011-04-07 | 2019-08-20 | Pioneer Corporation | System for detecting surrounding conditions of moving body |
| US20200114820A1 (en) * | 2017-04-12 | 2020-04-16 | Aisin Seiki Kabushiki Kaisha | Obstacle detecting and notifying device, method, and computer program product |
| US10803307B2 (en) * | 2017-08-30 | 2020-10-13 | Honda Motor Co., Ltd | Vehicle control apparatus, vehicle, vehicle control method, and storage medium |
| US20220063613A1 (en) * | 2020-08-31 | 2022-03-03 | Toyota Jidosha Kabushiki Kaisha | Display device for a vehicle, display method and program |
| DE102021103067A1 (en) | 2021-02-10 | 2022-08-11 | Bayerische Motoren Werke Aktiengesellschaft | WARNING ISSUE FOR MOTOR VEHICLE |
| US20220358826A1 (en) * | 2020-01-06 | 2022-11-10 | Aptiv Technologies Limited | Driver-Monitoring System |
| US20220392346A1 (en) * | 2021-06-07 | 2022-12-08 | Honda Motor Co.,Ltd. | Alert control apparatus, moving body, alert control method, and computer-readable storage medium |
| US11718314B1 (en) | 2022-03-11 | 2023-08-08 | Aptiv Technologies Limited | Pedestrian alert system |
| EP4253152A1 (en) * | 2022-03-31 | 2023-10-04 | Toyota Jidosha Kabushiki Kaisha | Light emission control device, light emitting device, vehicle, light emission control method, and computer program |
| US11890933B2 (en) | 2020-01-03 | 2024-02-06 | Aptiv Technologies Limited | Vehicle occupancy-monitoring system |
| WO2024188566A1 (en) | 2023-03-13 | 2024-09-19 | Volkswagen Aktiengesellschaft | Method for marking an object, computer program product and vehicle |
| US20240367522A1 (en) * | 2023-05-05 | 2024-11-07 | Honda Motor Co., Ltd. | Display control device and display control method thereof |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11107229B2 (en) | 2018-01-10 | 2021-08-31 | Samsung Electronics Co., Ltd. | Image processing method and apparatus |
| US10621858B1 (en) | 2019-02-06 | 2020-04-14 | Toyota Research Institute, Inc. | Systems and methods for improving situational awareness of a user |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2008139365A1 (en) * | 2007-05-11 | 2008-11-20 | Philips Intellectual Property & Standards Gmbh | Driver device for leds |
| US8629903B2 (en) * | 2009-04-02 | 2014-01-14 | GM Global Technology Operations LLC | Enhanced vision system full-windshield HUD |
| US8912978B2 (en) * | 2009-04-02 | 2014-12-16 | GM Global Technology Operations LLC | Dynamic vehicle system information on full windshield head-up display |
| US9196168B2 (en) * | 2009-05-20 | 2015-11-24 | Textron Innovations Inc. | Collision avoidance and warning system |
| JP4952765B2 (en) | 2009-10-21 | 2012-06-13 | トヨタ自動車株式会社 | Vehicle night vision support device |
| US8098170B1 (en) | 2010-10-08 | 2012-01-17 | GM Global Technology Operations LLC | Full-windshield head-up display interface for social networking |
| EP2759998B1 (en) | 2011-09-20 | 2018-11-28 | Toyota Jidosha Kabushiki Kaisha | Pedestrian action prediction device and pedestrian action prediction method |
| DE102011120878A1 (en) | 2011-12-09 | 2013-06-13 | Volkswagen Aktiengesellschaft | Method for producing light sport runner-figures on windscreen and display of motor car for marking pedestrians, involves producing ordered groups of movement-performing light spots, which represent locations of body parts, as runner-figures |
| US9625720B2 (en) * | 2012-01-24 | 2017-04-18 | Accipiter Radar Technologies Inc. | Personal electronic target vision system, device and method |
| WO2013169601A2 (en) * | 2012-05-07 | 2013-11-14 | Honda Motor Co., Ltd. | Method to generate virtual display surfaces from video imagery of road based scenery |
| JP2014006700A (en) | 2012-06-25 | 2014-01-16 | Mitsubishi Motors Corp | Pedestrian detection device |
| US20160054563A9 (en) | 2013-03-14 | 2016-02-25 | Honda Motor Co., Ltd. | 3-dimensional (3-d) navigation |
| US20140354684A1 (en) | 2013-05-28 | 2014-12-04 | Honda Motor Co., Ltd. | Symbology system and augmented reality heads up display (hud) for communicating safety information |
| JP2014232501A (en) | 2013-05-30 | 2014-12-11 | 日産自動車株式会社 | Traffic information providing apparatus and traffic information providing method |
| JP2015049842A (en) | 2013-09-04 | 2015-03-16 | トヨタ自動車株式会社 | Alert display device and alert display method |
| US9291819B2 (en) | 2013-09-05 | 2016-03-22 | Texas Instruments Incorporated | Multi-focus heads-up display using single picture generator unit |
| CN104260669B (en) | 2014-09-17 | 2016-08-31 | 北京理工大学 | A kind of intelligent automobile HUD |
2015
- 2015-05-08 US US14/707,905 patent/US9505346B1/en not_active Expired - Fee Related
Cited By (37)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10988077B2 (en) | 2011-04-07 | 2021-04-27 | Pioneer Corporation | System for detecting surrounding conditions of moving body |
| US11479165B2 (en) | 2011-04-07 | 2022-10-25 | Pioneer Corporation | System for detecting surrounding conditions of moving body |
| US11577643B2 (en) | 2011-04-07 | 2023-02-14 | Pioneer Corporation | System for detecting surrounding conditions of moving body |
| US10946795B2 (en) | 2011-04-07 | 2021-03-16 | Pioneer Corporation | System for detecting surrounding conditions of moving body |
| US10625666B2 (en) * | 2011-04-07 | 2020-04-21 | Pioneer Corporation | System for detecting surrounding conditions of moving body |
| US10384596B2 (en) * | 2011-04-07 | 2019-08-20 | Pioneer Corporation | System for detecting surrounding conditions of moving body |
| US10269161B2 (en) * | 2015-09-28 | 2019-04-23 | Nissan Motor Co., Ltd. | Vehicular display device and vehicular display method |
| US10793066B2 (en) | 2016-08-19 | 2020-10-06 | 2236008 Ontario Inc. | System and method for vehicular alert |
| US10363869B2 (en) | 2016-08-19 | 2019-07-30 | 2236008 Ontario Inc. | System and method for pedestrian alert |
| US9981602B2 (en) * | 2016-08-19 | 2018-05-29 | 2236008 Ontario Inc. | System and method for pedestrian alert |
| US10332292B1 (en) * | 2017-01-17 | 2019-06-25 | Zoox, Inc. | Vision augmentation for supplementing a person's view |
| DE102017204529A1 (en) | 2017-03-17 | 2018-09-20 | Volkswagen Aktiengesellschaft | Arrangement for attentiveness control of the driver of a vehicle and method for driving a device for attentiveness control of the driver of a vehicle |
| US10940797B2 (en) * | 2017-04-12 | 2021-03-09 | Aisin Seiki Kabushiki Kaisha | Obstacle detecting and notifying device, method, and computer program product |
| US20200114820A1 (en) * | 2017-04-12 | 2020-04-16 | Aisin Seiki Kabushiki Kaisha | Obstacle detecting and notifying device, method, and computer program product |
| US10803307B2 (en) * | 2017-08-30 | 2020-10-13 | Honda Motor Co., Ltd | Vehicle control apparatus, vehicle, vehicle control method, and storage medium |
| KR102154727B1 (en) * | 2017-09-12 | 2020-09-11 | 폭스바겐 악티엔게젤샤프트 | Method, apparatus and computer readable storage medium having instructions for controlling a display of an augmented-reality-display-device for a motor vehicle |
| US10766498B2 (en) * | 2017-09-12 | 2020-09-08 | Volkswagen Aktiengesellschaft | Method, apparatus, and computer readable storage medium having instructions for controlling a display of an augmented reality display device for a transportation vehicle |
| CN109484299A (en) * | 2017-09-12 | 2019-03-19 | 大众汽车有限公司 | Control method, apparatus, the storage medium of the display of augmented reality display device |
| US20190077417A1 (en) * | 2017-09-12 | 2019-03-14 | Volkswagen Aktiengesellschaft | Method, apparatus, and computer readable storage medium having instructions for controlling a display of an augmented reality display device for a transportation vehicle |
| EP3454013A1 (en) * | 2017-09-12 | 2019-03-13 | Volkswagen Aktiengesellschaft | Method, device and a computer readable storage medium with instructions for controlling a display of an augmented reality display device for a motor vehicle |
| KR20190029488A (en) * | 2017-09-12 | 2019-03-20 | 폭스바겐 악티엔 게젤샤프트 | Method, apparatus and computer readable storage medium having instructions for controlling a display of an augmented-reality-display-device for a motor vehicle |
| US11890933B2 (en) | 2020-01-03 | 2024-02-06 | Aptiv Technologies Limited | Vehicle occupancy-monitoring system |
| US20220358826A1 (en) * | 2020-01-06 | 2022-11-10 | Aptiv Technologies Limited | Driver-Monitoring System |
| US11685385B2 (en) * | 2020-01-06 | 2023-06-27 | Aptiv Technologies Limited | Driver-monitoring system |
| US12168437B2 (en) * | 2020-08-31 | 2024-12-17 | Toyota Jidosha Kabushiki Kaisha | Display device for a vehicle, display method and program |
| US20220063613A1 (en) * | 2020-08-31 | 2022-03-03 | Toyota Jidosha Kabushiki Kaisha | Display device for a vehicle, display method and program |
| DE102021103067A1 (en) | 2021-02-10 | 2022-08-11 | Bayerische Motoren Werke Aktiengesellschaft | WARNING ISSUE FOR MOTOR VEHICLE |
| US11922813B2 (en) * | 2021-06-07 | 2024-03-05 | Honda Motor Co., Ltd. | Alert control apparatus, moving body, alert control method, and computer-readable storage medium |
| US20220392346A1 (en) * | 2021-06-07 | 2022-12-08 | Honda Motor Co.,Ltd. | Alert control apparatus, moving body, alert control method, and computer-readable storage medium |
| US11718314B1 (en) | 2022-03-11 | 2023-08-08 | Aptiv Technologies Limited | Pedestrian alert system |
| US12179787B2 (en) | 2022-03-11 | 2024-12-31 | Aptiv Technologies AG | Pedestrian alert system |
| EP4253152A1 (en) * | 2022-03-31 | 2023-10-04 | Toyota Jidosha Kabushiki Kaisha | Light emission control device, light emitting device, vehicle, light emission control method, and computer program |
| US12103552B2 (en) | 2022-03-31 | 2024-10-01 | Toyota Jidosha Kabushiki Kaisha | Light emission control device, light emitting device, vehicle, light emission control method, and non-transitory storage medium |
| WO2024188566A1 (en) | 2023-03-13 | 2024-09-19 | Volkswagen Aktiengesellschaft | Method for marking an object, computer program product and vehicle |
| DE102023202243A1 (en) | 2023-03-13 | 2024-09-19 | Volkswagen Aktiengesellschaft | Method for marking an object, computer program product and vehicle |
| US20240367522A1 (en) * | 2023-05-05 | 2024-11-07 | Honda Motor Co., Ltd. | Display control device and display control method thereof |
| US12515524B2 (en) * | 2023-05-05 | 2026-01-06 | Honda Motor Co., Ltd. | Display control device and display control method thereof |
Also Published As
| Publication number | Publication date |
|---|---|
| US9505346B1 (en) | 2016-11-29 |
Similar Documents
| Publication | Title |
|---|---|
| US9505346B1 (en) | System and method for warning a driver of pedestrians and other obstacles |
| US9514650B2 (en) | System and method for warning a driver of pedestrians and other obstacles when turning |
| US9965957B2 (en) | Driving support apparatus and driving support method |
| JP5718942B2 (en) | Apparatus and method for assisting safe operation of transportation means |
| US20190340522A1 (en) | Event prediction system, event prediction method, recording media, and moving body |
| EP3125212B1 (en) | Vehicle warning device |
| JP4940767B2 (en) | Vehicle surrounding information notification device |
| US9452712B1 (en) | System and method for warning a driver of a potential rear end collision |
| WO2019098323A1 (en) | Vehicle drive assist system, vehicle drive assist method, and vehicle drive assist program |
| JP6307895B2 (en) | Vehicle periphery monitoring device |
| CN113998034A (en) | Rider assistance system and method |
| WO2019098216A1 (en) | Vehicle driving assistance system, vehicle driving assistance method, and vehicle driving assistance program |
| CN106462727A (en) | Systems and methods for end-of-lane recognition |
| JP2008131648A (en) | Method and system for presenting video images |
| CN106030609A (en) | System and method for imitating a preceding vehicle |
| JP2016020876A (en) | Vehicle display device |
| US20190244515A1 (en) | Augmented reality DSRC data visualization |
| JP5888339B2 (en) | Display control device |
| JP2020091663A (en) | Vehicle display control device |
| JP2008003762A (en) | Obstacle recognition judgment device |
| JP4872245B2 (en) | Pedestrian recognition device |
| JP5003473B2 (en) | Warning device |
| JP5192009B2 (en) | Vehicle periphery monitoring device |
| CN115527184A (en) | Camera-based vehicle blind spot detection system |
| JP4986069B2 (en) | Ambient monitoring device for vehicles |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 2015-05-06 | AS | Assignment | Owner name: HONDA MOTOR CO., LTD., JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAMADA, HAJIME;REEL/FRAME:035600/0504; Effective date: 20150506 |
| 2016-01-14 | AS | Assignment | Owner name: HONDA MOTOR CO., LTD., JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAMADA, HAJIME;REEL/FRAME:037670/0623; Effective date: 20160114 |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY; Year of fee payment: 4 |
| | FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| | LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| | STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
| 2024-11-29 | FP | Lapsed due to failure to pay maintenance fee | Effective date: 20241129 |