GB2615290A - Blindspot assist system for a vehicle - Google Patents
Blindspot assist system for a vehicle
- Publication number
- GB2615290A GB2615290A GB2115233.5A GB202115233A GB2615290A GB 2615290 A GB2615290 A GB 2615290A GB 202115233 A GB202115233 A GB 202115233A GB 2615290 A GB2615290 A GB 2615290A
- Authority
- GB
- United Kingdom
- Prior art keywords
- vehicle
- collision
- obstacles
- control unit
- rider
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4802—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/008—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62J—CYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
- B62J27/00—Safety equipment
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62J—CYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
- B62J45/00—Electrical equipment arrangements specially adapted for use as accessories on cycles, not otherwise provided for
- B62J45/40—Sensor arrangements; Mounting thereof
- B62J45/42—Sensor arrangements; Mounting thereof characterised by mounting
- B62J45/422—Sensor arrangements; Mounting thereof characterised by mounting on the handlebar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/87—Combinations of systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0133—Traffic data processing for classifying traffic situation
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/50—Systems of measurement based on relative movement of target
- G01S17/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Mechanical Engineering (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Human Computer Interaction (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
A blind spot assist system 100 on a vehicle 102 comprises a control unit (108, Fig.1) and light detection and ranging (LiDAR) sensors 104 operable to monitor attributes of obstacles 110 (e.g. their distance, velocity, angle of approach or dimensions) from a real-time position of the vehicle and correspondingly generate a first set of signals. The control unit is configured to extract the attributes from the signals and use them to generate a collision signature for the vehicle with respect to the obstacles. It determines a percentage of matching of the generated collision signature with a set of predetermined collision signatures pertaining to known collisions and uses it to generate a set of alert signals. The percentage of matching is indicative of a severity level of collision of the vehicle with the obstacles. The vehicle may be a bicycle or two-wheeler motor vehicle and the alert signals may cause haptic feedback (e.g. vibration) in its handlebar.
Description
BLINDSPOT ASSIST SYSTEM FOR A VEHICLE
TECHNICAL FIELD OF INVENTION
[0001] The present disclosure relates to the field of blindspot assistance systems. More particularly, the present disclosure relates to a simple, efficient, cost-effective, and reliable blindspot assist system for a vehicle, which continuously monitors oncoming traffic in a blindspot region of a rider of the vehicle, and safely alerts the rider by giving haptic feedback on the handlebar of the vehicle, allowing the rider to take precautionary measures.
BACKGROUND FOR THE INVENTION
[0002] Riding a vehicle such as a bicycle, e-bike, or two-wheeler motor vehicle, especially on roads with traffic, necessitates a high level of concentration on the part of the rider, especially when changing from one lane to an adjacent lane. Before changing lanes, the rider must visually check whether any other vehicles are approaching from behind, which requires turning back or looking over the shoulder. This operation may, however, be hazardous if, for example, a vehicle is approaching from behind at high speed and the rider does not correctly evaluate its distance, which may lead to an accident and casualties.
[0003] Vehicles are provided with side mirrors to allow the rider to see incoming traffic and drive accordingly. However, the mirrors do not provide a complete view of the area behind the rider, so a blindspot region is created that prevents the rider from seeing all the oncoming traffic. It is also difficult for the rider to estimate the speed and direction of oncoming traffic using the mirrors, as objects in the mirror are closer than they appear. In addition, it is distracting and unsafe for the rider to continuously look into the mirrors while driving. This distraction may in turn create a blindspot in front of the rider or vehicle and may cause the rider to miss traffic and obstacles ahead of the vehicle, leading to an accident.
[0004] Various blindspot assist systems are available in the market, which monitor the oncoming traffic in the blindspot (or blind spot) region of the rider and accordingly alert the rider about the oncoming traffic using visual or audible means. Generally, ultrasonic sensors and radars are used in blindspot assist systems. Ultrasonic sensors are mechanical devices that work on the principle of echolocation, in which a transmitter emits sound waves and the distance is estimated from the time it takes for the sound wave to return to a receiver. These ultrasonic sensors are positioned on the rear side of the vehicle and continuously emit sound waves to detect oncoming vehicles. However, this is not a very accurate approach for monitoring oncoming traffic over long distances, since the speed of sound in air changes with the ambient temperature. Due to this limitation, large ultrasonic sensors would be required to monitor oncoming traffic over large distances, which is not structurally feasible.
[0005] To overcome the issues with ultrasonic-sensor-based blindspot assist systems, radar-based blindspot assist systems are also available in the market. As radars work on the principle of radio-wave ranging, they do not require large mechanical transducers like ultrasonic sensors, but they do require expensive equipment to transmit and interpret the radio waves; radar-based systems are therefore too costly to be practical for blindspot assistance.
[0006] Besides, existing blindspot assist systems are not efficient and reliable, as they rely only on the real-time data captured by the ultrasonic sensors or radars for identifying collision threats, which is not sufficient for predicting the severity of a collision of the vehicle with the oncoming traffic. Thus, there is a need to improve the data processing capability of existing blindspot assist systems and to provide them with enriched data to make them accurate, efficient, and reliable in identifying collision threats.
[0007] Another issue with existing blindspot assist systems is how the rider is alerted about incoming traffic, because an alert that distracts the rider may itself lead to a crash, which such systems are meant to avoid. Existing systems alert the rider about the oncoming traffic using visual or audible means, which catch the rider's attention but also distract them. For instance, indicator lights used for alerting the rider are flashy and may distract the rider. Similarly, the sudden generation of a sound to alert the rider about oncoming traffic may also distract the rider. Thus, a visual or sound-based alert system is not a viable option for a blindspot assist system.
[0008] Patent document US10455882 discloses a method and system for providing a rear collision warning within a helmet. The system includes an image capturing device mounted on a helmet for identifying a target vehicle. The image capturing device makes the helmet heavier and uncomfortable for the rider. Besides, any movement of the rider's head while driving makes the system inaccurate and inefficient in identifying rear collisions.
[0009] Patent document US10144474 discloses a bicycle collision system, apparatus, and methods, which include or use one or more sensing apparatus to detect vehicles or other objects that may collide, or potentially collide, with a bicycle. The sensing apparatus may include at least one of a side sensing apparatus, a rear sensing apparatus, and a front sensing apparatus. Further, an alert apparatus may be used to alert not only the cyclist but also the driver of a vehicle of an imminent collision. The disclosed system provides separate longitudinal and lateral distances, which can slow down the response time and makes the system more complicated. Thus, there is a requirement for a system that gives a single distance at a properly calibrated angle, making it fast and reliable.
[0010] Patent document US10377308 discloses a motorcycle with a device for detecting a vehicle approaching from the rear. It uses a radar or a television camera for detecting approaching vehicles, and the device further evaluates the speed of the incoming vehicle. However, the use of radar makes the disclosed system complex and expensive, as radars require costly equipment to transmit and interpret the radio waves, which makes radar-based systems impractical for blindspot assistance.
[0011] Therefore, there is a need to overcome the above drawbacks, shortcomings, and limitations of existing blindspot assist systems and to provide a simple, improved, cost-effective, and efficient blindspot assist system for vehicles, which monitors oncoming traffic in the blindspot region of the rider in all environmental conditions and safely alerts the rider without distracting them.
OBJECTS OF THE PRESENT INVENTION
[0012] Some of the objectives of the present invention, which at least one embodiment herein satisfies are as listed herein below.
[0013] It is an object of the present invention to prevent collisions of vehicles with oncoming traffic.
[0014] It is an object of the present invention to provide a blindspot assist system for vehicles, which continuously monitors oncoming traffic in blindspot regions, especially behind the rider.
[0015] It is an object of the present invention to alert riders of vehicles about incoming traffic in a safe and non-distracting way.
[0016] It is an object of the present invention to provide a blindspot assist system for vehicles, which continuously monitors oncoming traffic in a blindspot region of the rider of the vehicle, and safely alerts the rider without distracting the rider.
[0017] It is an object of the present invention to provide a blindspot assist system for vehicles, which is simple, compact, and cost-effective, yet efficient and reliable.
[0018] It is an object of the present invention to provide a blindspot assist system for vehicles, which is capable of working efficiently in all environmental conditions.
[0019] It is an object of the present invention to improve the data processing capability of the blindspot assist system by providing enriched data to make the system accurate, efficient, and reliable in identifying collision threats.
SUMMARY OF THE PRESENT INVENTION
[0020] The present invention relates to a simple, efficient, cost-effective, and reliable blindspot assist system for a vehicle, which continuously monitors oncoming traffic in a blindspot region of a rider of the vehicle, and safely alerts the rider by giving haptic feedback on the handlebar of the vehicle, allowing the rider to take precautionary measures.
[0021] An aspect of the present invention relates to a blindspot assist system for a vehicle (also referred to as the host vehicle herein), including but not limited to a bicycle, e-bike, cargo bike, or two-wheeler motor vehicle. The system may comprise a set of light detection and ranging (LiDAR) sensors configured with the vehicle, which are operable to monitor one or more attributes of one or more obstacles within a predefined region from a real-time position of the vehicle. The obstacles may be target vehicles, people, animals, objects, and the like. The attributes may be the distance of the obstacles from the vehicle, the velocity of the obstacles with respect to the vehicle, the angle of approach of the obstacles towards the vehicle, and the dimensions of the obstacles. Further, the LiDAR sensors may be configured at predefined positions on the vehicle and oriented in predefined directions to cover a blindspot region of a rider of the vehicle or the rear side of the vehicle. For instance, since the rear side of any vehicle is generally a blindspot region for the rider, the LiDAR sensors may be configured at the rear side of the vehicle to cover and monitor obstacles in that region.
[0022] The system may comprise a control unit operatively coupled to the set of LiDAR sensors. The control unit may use the captured attributes of the obstacles and correspondingly generate a collision signature for the vehicle with respect to the obstacles. Further, the control unit may store a set of predetermined collision signatures pertaining to one or more known collisions. The control unit may compare the generated collision signature with the set of predetermined collision signatures pertaining to known collisions, and may further generate a set of alert signals based on a percentage of matching of the generated collision signature with the set of predetermined collision signatures. The percentage of matching may be indicative of a severity level of collision of the vehicle with the obstacles.
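The signature-matching step described above can be sketched in code. This is an illustrative assumption, not the patented method: the collision signature is taken as a vector of the monitored attributes (distance, relative velocity, approach angle, obstacle dimension), the percentage of matching as a normalized distance to each stored signature, and the thresholds mapping percentage to severity are invented for illustration.

```python
import math

def match_percentage(signature, known_signature):
    """Similarity of two signatures as a percentage (100 = identical).

    Both signatures are tuples of attributes, e.g.
    (distance_m, rel_velocity_mps, approach_angle_deg, dimension_m).
    The normalization by the known signature's magnitude is an assumption.
    """
    dist = math.dist(signature, known_signature)
    scale = math.dist(known_signature, (0.0,) * len(known_signature)) or 1.0
    return max(0.0, 100.0 * (1.0 - dist / scale))

def severity_level(signature, known_signatures):
    """Best match against the stored set; a higher percentage of matching
    indicates a higher severity level. Thresholds are illustrative."""
    best = max(match_percentage(signature, k) for k in known_signatures)
    if best >= 80.0:
        return best, "high"
    if best >= 50.0:
        return best, "medium"
    return best, "low"
```

For example, an obstacle 2 m behind, closing at 15 m/s, would match a stored signature of a known rear collision almost exactly and yield a high severity level, while a distant, slow obstacle would match poorly and yield a low one.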
[0023] In an aspect, the system may comprise a haptic feedback device configured with the handlebar of the vehicle, which may be operatively coupled to the control unit. Based on the severity level of predicted collision of the vehicle with the obstacles, the control unit may actuate the haptic feedback device of the handlebar to provide haptic feedback to a rider of the vehicle in a controlled and safe manner such that the rider may not get distracted while driving. Thus, based on the haptic feedback felt by the rider on the handlebar, the rider may accordingly drive the vehicle and may take precautionary measures to avoid any collision.
[0024] In an aspect, the control unit may comprise a machine learning unit configured with the processor and operable to generate the set of predetermined collision signatures pertaining to the known collisions based on a set of training and testing datasets. Further, the control unit may update the training and testing datasets with the currently generated first set of signals and the corresponding collision signature, thereby enriching the datasets with new data, which may further improve the accuracy and capability of the system to detect obstacles and predict collisions in real-time and live cases.
[0025] In addition, as the proposed system uses LiDAR sensors, which emit infrared light, the operation of the LiDAR sensors remains unaffected by weather conditions (unlike ultrasonic sensors), so the proposed system may work efficiently in all weather conditions. Besides, the small size of the LiDAR sensors and the use of a minimum number of components make the proposed system compact, cost-effective, and easily implementable in existing vehicles without compromising its effectiveness and reliability.
BRIEF DESCRIPTION OF DRAWINGS
[0026] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure. The diagrams are for illustration only, which thus is not a limitation of the present disclosure.
[0027] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[0028] FIG. 1 illustrates an exemplary block diagram of the proposed system, in accordance with an embodiment of the present invention.
[0029] FIG. 2A illustrates an exemplary view of a two-wheeled vehicle with the proposed system of FIG. 1.
[0030] FIG. 2B illustrates an exemplary view depicting the operation of the proposed system while monitoring oncoming obstacles on the rear side of the two-wheeled vehicle of FIG. 2A.
[0031] FIG. 3 illustrates exemplary functional blocks of the control unit of the proposed system, in accordance with an embodiment of the present invention.
[0032] FIG. 4 illustrates an exemplary process flow diagram of the proposed system, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF THE PRESENT INVENTION
[0033] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0034] Embodiments of the present disclosure relate to a simple, efficient, cost-effective, and reliable blindspot assist system for a vehicle, which continuously monitors oncoming traffic in a blindspot region of a rider of the vehicle, and safely alerts the rider by giving haptic feedback on the handlebar of the vehicle, allowing the rider to take precautionary measures.
[0035] According to an aspect, the present disclosure elaborates upon a blindspot assist system for a vehicle. The system can include a set of light detection and ranging (LiDAR) sensors configured with the vehicle and operable to monitor one or more attributes of one or more obstacles within a predefined region from a real-time position of the vehicle and correspondingly generate a first set of signals. Further, the system can include a control unit operatively coupled to the set of LiDAR sensors. The control unit can include a processor operatively coupled to a memory storing instructions executable by the processor. The control unit can be configured to receive the first set of signals from the LiDAR sensors and extract the one or more attributes from the first set of signals. Further, the control unit can generate a collision signature for the vehicle with respect to the one or more obstacles based on the extracted one or more attributes and can compare the generated collision signature with a set of predetermined collision signatures pertaining to one or more known collisions. Accordingly, the control unit can generate a set of alert signals based on a percentage of matching of the generated collision signature with the set of predetermined collision signatures, where the percentage of matching can be indicative of a severity level of collision of the vehicle with the one or more obstacles.
[0036] In an embodiment, the system can include a haptic feedback device configured with one or more components of the vehicle. The haptic feedback device can be operatively coupled to the control unit and can be configured to be actuated and provide haptic feedback to a rider of the vehicle upon receipt of the set of alert signals from the control unit.
[0037] In an embodiment, the haptic device can be configured with the handlebar of the vehicle. The haptic feedback device can include any or a combination of a vibration device and a vibration pad.
[0038] In an embodiment, the predefined region from the real-time position of the vehicle can be a blind spot region of a rider of the vehicle. Further, the set of LiDAR sensors can be configured at predefined positions on the vehicle and can be oriented in predefined directions to cover the blind spot region.
[0039] In an embodiment, the set of LiDAR sensors can be positioned at predefined positions selected from any or a combination of a rear side of a seat of the vehicle, a rear side of a helmet of the user, number plate of the vehicle, a rear side of a frame of the vehicle, tail light of the vehicle, and extreme ends of a handlebar of the vehicle.
[0040] In an embodiment, the predefined region can include a circular sector on the rear side of the vehicle, the circular sector having a predefined central angle and a predefined radius.
[0041] In an embodiment, the one or more attributes of the one or more obstacles can be any or a combination of the distance of the one or more obstacles from the vehicle, velocity of the one or more obstacles with respect to the vehicle, angle of approach of the one or more obstacles towards the vehicle, and the dimension of the one or more obstacles.
[0042] In an embodiment, the vehicle can be selected from a bicycle, e-bike, cargo bike, and two-wheeler motor vehicle. In another embodiment, the vehicle can be a four-wheeler vehicle such as a car, truck, or bus. Further, the haptic feedback devices can be configured with the steering wheel of the four-wheeler vehicle.
[0043] In an embodiment, the control unit can include a machine learning unit configured with the processor, and operable to generate the set of predetermined collision signatures pertaining to the one or more known collisions based on a set of training and testing datasets. The control unit can update the training and testing datasets with the generated first set of signals and the corresponding collision signature.
[0044] Referring now to FIGs. 1 to 2B, according to an embodiment, the proposed blindspot assist system 100 for a vehicle 102 is disclosed. System 100 can include a set of light detection and ranging (LiDAR) sensors 104 configured with the vehicle 102 (also referred to as host vehicle 102, herein). The LiDAR sensors 104 can monitor one or more attributes of one or more obstacles 110-1 to 110-2 (collectively referred to as obstacles 110, herein) within a predefined region from a real-time position of the vehicle 102. In an exemplary embodiment, the obstacles 110 can be target vehicles, people, animals, objects, and the like. The attributes of the obstacles 110 can be the distance of the obstacles 110 from the vehicle 102, the velocity of the obstacles 110 with respect to the vehicle 102, the angle of approach of the obstacles 110 towards the vehicle 102, and the dimensions of the obstacles 110. The LiDAR sensors 104 can continuously emit infrared light in the predefined region around the real-time position of the vehicle 102, and the attributes of the obstacles 110 can be estimated by the LiDAR sensors 104 based on the time it takes for the infrared light to return to the sensors 104 after being reflected from the obstacles 110.
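The time-of-flight ranging principle described above can be illustrated with a short sketch. The function names and the two-sample velocity estimate are assumptions for illustration; the underlying physics is standard: the pulse travels out and back, so the range is half the round-trip time multiplied by the speed of light, which, unlike the speed of sound, does not vary with ambient temperature.

```python
C = 299_792_458.0  # speed of light in m/s; effectively constant in air,
                   # unaffected by ambient temperature (unlike sound)

def tof_distance(round_trip_s: float) -> float:
    """Distance to the reflecting obstacle from the round-trip time of the
    infrared pulse; divide by 2 because the light travels out and back."""
    return C * round_trip_s / 2.0

def closing_speed(d_earlier: float, d_later: float, dt_s: float) -> float:
    """Relative (approach) speed estimated from two successive range
    measurements taken dt_s seconds apart; positive means closing in."""
    return (d_earlier - d_later) / dt_s
```

For instance, a pulse returning after about 66.7 ns corresponds to an obstacle 10 m behind the vehicle, and two such ranges taken 0.2 s apart give the obstacle's approach speed.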
[0045] In an embodiment, the LiDAR sensors 104 can be configured at predefined positions on vehicle 102 and can be oriented in predefined directions to cover a blind spot region of a rider 112 (also referred to as user 112, herein) of vehicle 102. In an implementation, as the rear side of any vehicle is generally the blindspot region for any rider, the LiDAR sensors 104 can be positioned on the rear of the vehicle 102, facing rearward, to cover and monitor obstacles 110 in the blindspot region of the vehicle 102. In an exemplary embodiment, the LiDAR sensors 104 can be positioned at predefined positions selected from any or a combination of the rear side of a seat 102-2 of the vehicle, the rear side of a helmet of the rider 112, the number plate of the vehicle, the rear side 102-3 of a frame of the vehicle, the tail light 102-4 of the vehicle, the extreme ends of a handlebar of the vehicle, and the like.
[0046] In another implementation, the LiDAR sensors 104 can also be positioned at the front of the vehicle 102 and oriented towards the front side of vehicle 102 to cover and monitor obstacles in the front area of vehicle 102 as required, so that rider 112 does not miss any oncoming or stationary obstacles 110 in front of the vehicle 102. In such a case, the LiDAR sensors 104 can be positioned at predefined positions selected from any or a combination of the front side of a helmet of the user, the front number plate of the vehicle, the front side of a frame of the vehicle, the front light of the vehicle, the extreme ends of a handlebar of the vehicle, and the like.
[0047] In an embodiment, system 100 can include a control unit 108 operatively coupled to the LiDAR sensors 104. The control unit 108 can receive a first set of signals corresponding to the monitored attributes of the obstacles 110 from the LiDAR sensors 104. The control unit 108 can extract the attributes of the obstacles 110 from the first set of signals and can correspondingly generate a collision signature of the vehicle 102 with respect to the obstacles 110. Further, control unit 108 can store a set of predetermined collision signatures pertaining to one or more known collisions. The control unit 108 can then compare the generated collision signature with the set of predetermined collision signatures pertaining to known collisions, and can accordingly generate a set of alert signals based on a percentage of matching of the generated collision signature with the set of predetermined collision signatures. The percentage of matching can be indicative of the severity level of a potential collision of the vehicle 102 with the obstacles 110.
[0048] In an exemplary implementation, as illustrated in FIG. 2B, the LiDAR sensors 104 configured on the rear side of the vehicle 102 can emit infrared light towards the rear (blindspot region) of the vehicle 102, forming a circular sector, and can continuously monitor any obstacles in that sector. The circular sector covered by the LiDAR sensors 104 can have a predefined central angle and a predefined radius covering a pie-shaped area behind the vehicle 102. Any obstacle 110 entering the pie-shaped area or circular sector behind vehicle 102 can be detected by the LiDAR sensors 104. The orientation and position of the LiDAR sensors 104 can be adjusted such that the radius of the circular sector corresponds to the maximum distance behind the vehicle 102 up to which the obstacles 110 are to be monitored. Further, the central angle of the circular sector can be adjusted based on the maximum sideways distance from vehicle 102 up to which the obstacles 110 are to be monitored. As illustrated, the LiDAR sensors 104 cover a circular sector with a central angle of 25 degrees and a radius of 10 meters (i.e., a 10-meter distance) behind vehicle 102.
[0049] In an embodiment, the control unit 108 can include a machine learning unit configured with a processor. The machine learning unit can train the control unit 108 prior to real-time operation, using a set of training and testing datasets including the predetermined attributes of known obstacles and the corresponding predetermined collision signatures pertaining to previously known collisions. This can enable the control unit 108 to efficiently and accurately determine the chances and severity level of collision of the vehicle 102 with the obstacles 110 in real-time, live implementations. Further, the control unit 108 can update the training and testing datasets with the real-time attributes of the obstacles 110 and the corresponding collision signature, thereby enriching the training and testing datasets and making the control unit 108 more accurate and efficient in blindspot assist.
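The disclosure does not name a learning algorithm; as one minimal interpretation of the train-then-update flow above, a nearest-neighbour model can store labelled signatures from known collisions and be enriched with real-time observations. The class name, the 1-NN choice, and the example values are all illustrative assumptions:

```python
import math

class CollisionSignatureModel:
    """Nearest-neighbour sketch of the training step described in [0049].
    Each example is (signature_vector, collided: bool); training stores
    labelled signatures, and prediction returns the label of the nearest
    known example."""

    def __init__(self):
        self.examples = []

    def train(self, dataset):
        # Pre-operation training on known-collision data.
        self.examples.extend(dataset)

    def update(self, signature, collided):
        # Enrich the dataset with a real-time observation, as in [0049].
        self.examples.append((signature, collided))

    def predict(self, signature):
        nearest = min(self.examples, key=lambda ex: math.dist(ex[0], signature))
        return nearest[1]

model = CollisionSignatureModel()
model.train([((2.0, 15.0, 5.0, 1.8), True),    # close, fast, near-head-on
             ((9.0, 2.0, 40.0, 0.5), False)])  # far, slow, oblique
print(model.predict((2.5, 14.0, 8.0, 1.6)))    # True
```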
[0050] In an embodiment, system 100 can include a haptic feedback device 106 configured with one or more components 102-1 to 102-4 of the vehicle 102, which can be operatively coupled to the control unit 108. In an exemplary embodiment, the haptic feedback device 106 can be a vibrating device or a vibrating pad configured with a handlebar (102-1) or seat of the vehicle, but not limited to the like. Based on the severity level of predicted collision of the vehicle 102 with the obstacles 110, the control unit 108 can accordingly actuate the haptic feedback device 106 of the handlebar 102-1 to provide haptic feedback to a rider 112 of the vehicle 102 in a controlled and safe manner such that the rider 112 does not get distracted while driving. Thus, based on the haptic feedback felt by rider 112 on the handlebar 102-1, rider 112 can accordingly drive vehicle 102 and can take precautionary measures to avoid any collision.
[0051] Those skilled in the art would also appreciate that, as the proposed system uses LiDAR sensors, which emit infrared light, the operation of the LiDAR sensors remains unaffected by weather conditions (unlike ultrasonic sensors). Thus, the proposed system works efficiently in all weather conditions. Besides, the small size of the LiDAR sensors and the use of a minimum number of components make the present invention compact, cost-effective, and easily implementable in existing vehicles without compromising the effectiveness and reliability of the proposed system.
[0052] FIG. 3 illustrates exemplary functional components of the control unit of the proposed system, in accordance with an embodiment of the present disclosure.
[0053] In an embodiment, control unit 108 may comprise one or more processor(s) 302.
The one or more processor(s) 302 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 302 are configured to fetch and execute computer-readable instructions stored in a memory 304 of the control unit 108. The memory 304 may store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 304 may comprise any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0054] Control unit 108 may also comprise an interface(s) 306. The interface(s) 306 may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 306 may facilitate communication of control unit 108. The interface(s) 306 may also provide a communication pathway for one or more components of the control unit 108. Examples of such components include, but are not limited to, processing engine(s) 308 and data 310.
[0055] The processing engine(s) 308 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 308. In the examples described herein, such combinations of hardware and programming may be implemented in several different ways.
For example, the programming for the processing engine(s) 308 may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) 308 may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) 308. In such examples, the control unit 108 may comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to control unit (108) and the processing resource. In other examples, the processing engine(s) 308 may be implemented by electronic circuitry.
[0056] Data 310 may comprise data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 308 or system 100.
[0057] In an exemplary embodiment, the processing engine(s) 308 may include a collision signature unit 312, a comparison unit 314, an alert and haptic feedback unit 316, a machine learning unit 318, and other engine(s). The other engine(s) can supplement the functionalities of the processing engine(s) 308 or the control unit 108.
[0058] In an embodiment, the collision signature unit 312 can enable the processor 302 of the control unit 108 to receive a first set of signals pertaining to attributes of the obstacles 110 from the LiDAR sensors 104, and generate a collision signature based on the attributes of the obstacles 110. The collision signature unit 312 enables the processor 302 to extract the attributes selected from a distance of the obstacles 110 from the vehicle 102, a velocity of the obstacles 110 with respect to the vehicle 102, an angle of approach of the obstacles 110 towards the vehicle 102, and a dimension of the obstacles 110, and to use at least one of these attributes to generate the collision signature.
[0059] In an embodiment, the comparison unit 314 can enable the processor 302 to compare the collision signature generated by the collision signature unit with a set of predetermined collision signatures pertaining to one or more known collisions stored in the database 310 of the control unit 108. The percentage of matching between the collision signature and the set of predetermined collision signatures can be indicative of a severity level or chances of collision of the vehicle 102 with the obstacles 110.
[0060] In an embodiment, the alert and haptic feedback unit 316, based on the percentage of matching between the collision signature and the set of predetermined collision signatures, can correspondingly generate a set of alert signals. Upon receiving the set of alert signals, the haptic feedback device 106 provided on the handlebar 102-1 of the vehicle 102 can be actuated to provide haptic feedback to the rider 112 of the vehicle 102 in a controlled and safe manner such that the rider 112 does not get distracted while driving. Thus, based on the haptic feedback felt by the rider 112 on the handlebar 102-1, the rider 112 can accordingly drive the vehicle 102 and can take precautionary measures to avoid any collision.
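The disclosure requires only that the feedback scale with severity without distracting the rider; one way to sketch this is a mapping from the match percentage to a vibration pattern for the handlebar pad. The thresholds, intensities, and pulse periods below are illustrative assumptions:

```python
def haptic_pattern(match_percent):
    """Map a match percentage to an assumed vibration pattern
    (intensity 0..1, pulse period in seconds) for the handlebar pad,
    or None when no alert should fire."""
    if match_percent < 50.0:
        return None            # low severity: no alert, keep monitoring
    if match_percent < 75.0:
        return (0.4, 1.0)      # moderate severity: gentle, slow pulses
    return (0.9, 0.3)          # high severity: strong, rapid pulses

print(haptic_pattern(80.0))    # (0.9, 0.3)
```

Graduating the pattern in this way is one plausible reading of "controlled and safe" feedback: stronger, faster pulses convey urgency while weak alerts stay unobtrusive.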
[0061] In an embodiment, the machine learning unit 318 can enable the processor 302 to train the control unit 108 prior to real-time operation, using a set of training and testing datasets including the predetermined attributes of known obstacles and corresponding predetermined collision signatures pertaining to previously known collisions. This can enable the control unit 108 to efficiently and accurately determine the chances and severity level of collision of the vehicle 102 with the obstacles 110 in real-time and live case implementation. Further, the machine learning unit 318 can enable the processor 302 to update the training and testing dataset (stored in database 310) with the real-time attributes of the obstacles 110 and the corresponding generated collision signature, thereby enriching the training and testing datasets 310, and making the control unit 108 accurate and efficient in blindspot assist.
[0062] FIG. 4 illustrates an exemplary process flow diagram 400 of the proposed system 100, in accordance with an embodiment of the present invention. As illustrated, at step 402, the control unit can obtain LiDAR data of known collisions, and can correspondingly generate a set of predetermined collision signatures at step 404. The LiDAR data obtained at step 402 can include the predetermined attributes of known obstacles. Further, at step 406, the LiDAR can monitor the attributes of the obstacles within the predefined region from the real-time position of the vehicle, and correspondingly transmit a set of data packets to the control unit. The control unit can convert the LiDAR captured data of step 406 into a collision signature. Further, at step 408, the control unit can compare the collision signature of known collisions of step 404 with the collision signature generated in real-time at step 406. In an embodiment, when the collision signature of step 406 does not match with the collision signature of known collisions of step 404, the system can again follow step 406 of continuously monitoring the attributes of the obstacles within the blindspot region using the LiDAR sensors. Further, when the collision signature of step 406 matches with the collision signature of known collisions of step 404, the control unit, based on the percentage of matching, can send feedback to the rider at step 410 by triggering the haptic feedback provided on the handlebar of the vehicle. The percentage of matching at step 410 can be indicative of a severity level or chances of collision of the vehicle with the obstacles.
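One pass of the monitor-compare-alert loop of FIG. 4 (steps 406-410) can be sketched as follows. The signature encoding, the similarity measure, and the 50% alert threshold are assumptions; the disclosure specifies only that a match triggers feedback while a non-match loops back to monitoring:

```python
def match_percentage(sig, known, scales=(10.0, 20.0, 180.0, 5.0)):
    """Assumed similarity in [0, 100] between two attribute vectors."""
    diffs = [min(abs(a - b) / s, 1.0) for a, b, s in zip(sig, known, scales)]
    return 100.0 * (1.0 - sum(diffs) / len(diffs))

def blindspot_step(attributes, known_signatures, threshold=50.0):
    """One pass of FIG. 4: convert monitored attributes (step 406) into a
    signature, compare against known-collision signatures (step 408), and
    return the match percentage when feedback should be triggered (step 410),
    or None to loop back to monitoring."""
    signature = tuple(attributes)
    percent = max(match_percentage(signature, k) for k in known_signatures)
    return percent if percent >= threshold else None
```

In operation this function would be called on every LiDAR update, with the returned percentage driving the intensity of the haptic feedback on the handlebar.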
[0063] It would be obvious to a person skilled in the art that, while various embodiments and drawings of the present disclosure elaborate upon implementation of the proposed blindspot assist system 100 in a two-wheeled vehicle 102, the proposed system 100 can also be implemented in other vehicles such as cars, trucks, rickshaws, and buses, and all such embodiments are also well within the scope of the present invention.
[0064] Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C, ... and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.
[0065] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions, or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT INVENTION
[0066] The present invention restricts the collision of vehicles with oncoming traffic.
[0067] The present invention provides a blindspot assist system for vehicles, which continuously monitors oncoming traffic from blindspot regions, especially the rear side of the rider.
[0068] The present invention alerts riders of the vehicle about oncoming traffic in a safer and non-distracting way.
[0069] The present invention provides a blindspot assist system for vehicles, which continuously monitors oncoming traffic from a blindspot region of the rider of the vehicle, and safely alerts the rider without distracting the rider.
[0070] The present invention provides a blindspot assist system for vehicles, which is simple, compact, cost-effective yet efficient, and reliable.
[0071] The present invention provides a blindspot assist system for vehicles, which is capable of efficiently working in all environmental conditions.
[0072] The present invention improves the data processing capability of the blindspot assist system by providing it with enriched data to make it accurate, efficient, and reliable in identifying collision threats.
Claims (9)
- CLAIMS
- 1. A blindspot assist system (100) implemented in a vehicle (102), the system (100) comprising: a set of light detection and ranging (LiDAR) sensors (104) configured with the vehicle (102) and operable to monitor one or more attributes of one or more obstacles (110) within a predefined region from a real-time position of the vehicle (102), and correspondingly generate a first set of signals; and a control unit (108) operatively coupled to the set of LiDAR sensors (104), the control unit (108) comprising a processor (302) operatively coupled to a memory (304) storing instructions executable by the processor (302), and configured to: extract the one or more attributes from the first set of signals; generate a collision signature for the vehicle (102) with respect to the one or more obstacles (110) based on the extracted one or more attributes; compare the generated collision signature with a set of predetermined collision signatures pertaining to one or more known collisions; and generate a set of alert signals based on a percentage of matching of the generated collision signature with the set of predetermined collision signatures, wherein the percentage of matching is indicative of a severity level of collision of the vehicle (102) with the one or more obstacles (110).
- 2. The system (100) of claim 1, wherein the system (100) comprises a haptic feedback device (106) configured with one or more components of the vehicle (102), and wherein the haptic feedback device (106) is operatively coupled to the control unit (108) and configured to be actuated and provide haptic feedback to a rider (112) of the vehicle (102) upon receipt of the set of alert signals from the control unit (108).
- 3. The system (100) of claim 1 or 2, wherein the haptic feedback device (106) is configured with a handlebar (102-1) of the vehicle (102).
- 4. The system (100) of any of preceding claims 1-3, wherein the haptic feedback device (106) comprises any or a combination of a vibration device, a vibration motor, and a vibration pad.
- 5. The system (100) of any of preceding claims 1-4, wherein the predefined region is a blind spot region of a rider (112) of the vehicle (102), and wherein the set of LiDAR sensors (104) are configured at predefined positions on the vehicle (102) and oriented in predefined directions to cover the blind spot region.
- 6. The system (100) of any of preceding claims 1-5, wherein the set of LiDAR sensors (104) are positioned at predefined positions selected from any or a combination of rear side of a seat (102-2) of the vehicle (102), rear side of a helmet of the rider (112), number plate (102-4) of the vehicle (102), rear side of a frame (102-3) of the vehicle (102), tail light of the vehicle (102), and extreme ends of a handlebar (102-1) of the vehicle (102).
- 7. The system (100) of any of preceding claims 1-6, wherein the predefined region comprises a circular sector on a rear side of the vehicle (102), the circular sector having a predefined central angle and a predefined radius.
- 8. The system (100) of any of preceding claims 1-7, wherein the one or more attributes comprise any or a combination of a distance of the one or more obstacles (110) from the vehicle (102), a velocity of the one or more obstacles (110) with respect to the vehicle (102), an angle of approach of the one or more obstacles (110) towards the vehicle (102), and a dimension of the one or more obstacles (110).
- 9. The system (100) of any of preceding claims 1-8, wherein the vehicle (102) is selected from a bicycle, an e-bike, a cargo bike, and a two-wheeler motor vehicle.
- 10. The system (100) of any of preceding claims 1-9, wherein the control unit (108) comprises a machine learning unit (318) configured with the processor (302), and operable to generate the set of predetermined collision signatures pertaining to the one or more known collisions based on a set of training and testing datasets, and wherein the control unit (108) updates the training and testing datasets with the generated first set of signals and the corresponding collision signature.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB2115233.5A GB2615290B (en) | 2021-10-22 | 2021-10-22 | Blindspot assist system for a vehicle |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB2115233.5A GB2615290B (en) | 2021-10-22 | 2021-10-22 | Blindspot assist system for a vehicle |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| GB202115233D0 GB202115233D0 (en) | 2021-12-08 |
| GB2615290A true GB2615290A (en) | 2023-08-09 |
| GB2615290B GB2615290B (en) | 2024-12-18 |
Family
ID=78805902
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB2115233.5A Active GB2615290B (en) | 2021-10-22 | 2021-10-22 | Blindspot assist system for a vehicle |
Country Status (1)
| Country | Link |
|---|---|
| GB (1) | GB2615290B (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130311075A1 (en) * | 2012-05-18 | 2013-11-21 | Continental Automotive Systems, Inc. | Motorcycle and helmet providing advance driver assistance |
| US20200066158A1 (en) * | 2017-04-06 | 2020-02-27 | Samsung Electronics Co., Ltd. | Electronic device, warning message providing method therefor, and non-transitory computer-readable recording medium |
| US20200231170A1 (en) * | 2017-11-09 | 2020-07-23 | Robert Bosch Gmbh | Method and control device for monitoring the blind spot of a two-wheeled vehicle |
| US20200312050A1 (en) * | 2019-03-28 | 2020-10-01 | Nidec Mobility Corporation | Control device, server, safety system, and control method of control device |
| WO2021104833A1 (en) * | 2019-11-28 | 2021-06-03 | Volkswagen Aktiengesellschaft | Method and device for promoting driving safety of vehicle |
| US20220020274A1 (en) * | 2018-12-06 | 2022-01-20 | Robert Bosch Gmbh | Processor and processing method for rider-assistance system of straddle-type vehicle, rider-assistance system of straddle-type vehicle, and straddle-type vehicle |
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130311075A1 (en) * | 2012-05-18 | 2013-11-21 | Continental Automotive Systems, Inc. | Motorcycle and helmet providing advance driver assistance |
| US20200066158A1 (en) * | 2017-04-06 | 2020-02-27 | Samsung Electronics Co., Ltd. | Electronic device, warning message providing method therefor, and non-transitory computer-readable recording medium |
| US20200231170A1 (en) * | 2017-11-09 | 2020-07-23 | Robert Bosch Gmbh | Method and control device for monitoring the blind spot of a two-wheeled vehicle |
| US20220020274A1 (en) * | 2018-12-06 | 2022-01-20 | Robert Bosch Gmbh | Processor and processing method for rider-assistance system of straddle-type vehicle, rider-assistance system of straddle-type vehicle, and straddle-type vehicle |
| US20200312050A1 (en) * | 2019-03-28 | 2020-10-01 | Nidec Mobility Corporation | Control device, server, safety system, and control method of control device |
| WO2021104833A1 (en) * | 2019-11-28 | 2021-06-03 | Volkswagen Aktiengesellschaft | Method and device for promoting driving safety of vehicle |
Also Published As
| Publication number | Publication date |
|---|---|
| GB202115233D0 (en) | 2021-12-08 |
| GB2615290B (en) | 2024-12-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11462021B2 (en) | Obstacle detection and notification for motorcycles | |
| US10077007B2 (en) | Sidepod stereo camera system for an autonomous vehicle | |
| KR102797059B1 (en) | Vehicle and method for controlling thereof | |
| CN106985780B (en) | Vehicle safety auxiliary system | |
| US11091173B2 (en) | Driving safety enhancing system and method for making or enabling highly accurate judgment and providing advance early warning | |
| CN107487258B (en) | Blind area detection system and method | |
| JP5769163B2 (en) | Alarm device | |
| US20130090806A1 (en) | Method for operating a driver assistance system of a motor vehicle and driver assistance system for a motor vehicle | |
| KR20210083462A (en) | Advanced Driver Assistance System, Vehicle having the same and method for controlling the vehicle | |
| SE1551086A1 (en) | Method, control unit and system for avoiding collision with vulnerable road users | |
| US8946990B1 (en) | Vehicle headlight detection system | |
| KR102712223B1 (en) | Vehicle and method for controlling thereof | |
| CN107380164A (en) | Driver assistance system and support system based on computer vision | |
| CN106463058B (en) | Driver assistance system including warnings sensed by vehicle sensors mounted on opposite vehicle sides | |
| CN109703557B (en) | Rear side early warning device and method for learning driving mode | |
| US11545032B2 (en) | Roadside apparatus and vehicle-side apparatus for road-to-vehicle communication, and road-to-vehicle communication system | |
| US12280791B2 (en) | Driving assistance device, driving assistance method, and storage medium | |
| US20230093042A1 (en) | Vehicle collision detection and driver notification system | |
| US9589470B2 (en) | Method and apparatus for detecting vehicle running in blind spot, and method and apparatus for giving warning in changing cruising lane | |
| JP2007072641A (en) | Dangerous vehicle detection device | |
| EP2797027A1 (en) | A vehicle driver alert arrangement, a vehicle and a method for alerting a vehicle driver | |
| US20200143684A1 (en) | Vehicle Threat Mitigation Technologies | |
| JP2019109795A (en) | Driving support device and driving support system | |
| US12030512B2 (en) | Collision warning system for a motor vehicle having an augmented reality head up display | |
| US11403948B2 (en) | Warning device of vehicle and warning method thereof |