US20210192937A1 - System and method for vehicle identification - Google Patents
- Publication number
- US20210192937A1 (application US16/724,480)
- Authority
- US
- United States
- Prior art keywords
- motion data
- vehicle
- computing module
- broadcast
- vehicles
- Legal status: Granted (as listed by Google Patents; the status is an assumption and is not a legal conclusion)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
- H04W4/44—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/017—Detecting movement of traffic to be counted or controlled identifying vehicles
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/012—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from other sources than vehicle or roadside beacons, e.g. mobile networks
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/052—Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
Definitions
- Intelligent infrastructure systems, such as parking lots and toll booths, may gather data regarding usage, such as by tracking vehicles entering the area.
- These systems may include various types of sensors that are statically mounted near the system.
- Vehicle-to-outside (V2X) communication, such as vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication, is increasingly used to improve vehicle safety and convenience.
- Smart infrastructure systems may offer features by communicating with a nearby vehicle, such as reserving a parking spot or providing directions to an open parking spot.
- A method of identifying a target vehicle includes providing a sensor within a system. Motion data of a plurality of vehicles within the system is detected with the sensor. Broadcast motion data is received from a vehicle unit within the system. Which of the plurality of vehicles is the target vehicle is determined based on a comparison of the detected motion data and the broadcast motion data.
- The detected motion data and the broadcast motion data have the same parameters.
- The detected motion data and the broadcast motion data comprise a vehicle speed.
- The detected motion data and the broadcast motion data comprise a vehicle yaw rate.
- The detected motion data and the broadcast motion data comprise a vertical acceleration.
- The detected motion data and the broadcast motion data comprise at least two parameters.
- The detecting, receiving, and determining steps are performed by a computing module.
- The computing module is in communication with the sensor and the vehicle unit.
- The computing module is configured to send information to the target vehicle.
- The vehicle unit communicates with the computing module wirelessly.
- Broadcast motion data is received from multiple vehicle units within the system.
- The vehicle unit is mounted within one of the plurality of vehicles.
- The vehicle unit is a mobile device located within one of the plurality of vehicles.
- A plurality of sensors is provided within the system.
- The system is a portion of a paid or restricted access area.
- A system for identifying a target vehicle includes a sensor configured to detect motion data of a plurality of vehicles within the system.
- A vehicle unit is mounted on a vehicle.
- The vehicle unit is configured to track motion of the vehicle and broadcast the tracked motion data to a computing module.
- The computing module is in communication with the sensor and the vehicle unit.
- The sensor is configured to send the detected motion data to the computing module.
- The computing module identifies the vehicle as the target vehicle based on a comparison of the detected motion data and the broadcast motion data.
- The detected motion data and the broadcast motion data comprise at least one of a vehicle speed, a vehicle yaw rate, and a vertical acceleration.
- The detected motion data and the broadcast motion data comprise at least two parameters.
- The vehicle unit is configured to communicate with the computing module wirelessly.
- The computing module is configured to send information to the target vehicle via the vehicle unit.
- The computing module is configured to receive broadcast motion data from multiple vehicle units.
- FIG. 1 schematically illustrates an example smart parking lot system.
- FIG. 2 schematically illustrates vehicle movement in the example smart parking lot system.
- FIG. 3 illustrates example vehicle speed data.
- FIG. 4 illustrates example yaw rate data.
- FIG. 5 illustrates example vertical acceleration data.
- FIG. 6 illustrates an example method of identifying a target vehicle.
- The subject invention provides a system and method for identifying vehicles within a smart system, such as a parking system.
- A sensor within the parking system tracks the motion of vehicles within the system.
- A vehicle unit on the vehicle broadcasts motion of the vehicle.
- The system determines which vehicle within the system is a target vehicle based on a comparison of the tracked motion data and the broadcast motion data.
- FIG. 1 illustrates an example smart infrastructure system, such as a parking system 10.
- The parking system 10 generally includes a plurality of parking spaces 12.
- The parking spaces 12 may include empty spaces 18 and occupied spaces 16, which are occupied by parked vehicles 14.
- An aisle 24 extends between the parking spaces 12.
- Some of the parking spaces 12 may be designated as handicapped spaces 20 or high priority spaces 22.
- Although a parking system 10 is shown and described herein, it should be understood that the disclosed system and method may be used for other systems, such as toll systems, for example.
- The system 10 generally includes a sensor 30 and a computing module 34.
- The sensor 30 is in communication with the computing module 34.
- The sensor 30 may communicate with the computing module 34 via communication hardware, or wirelessly. In other embodiments, the sensor 30 and computing module 34 may be integrated into a single unit.
- The system 10 may include multiple sensors 30 mounted in different locations within the system 10, each of the sensors 30 in communication with the computing module 34.
- The sensor 30 detects and tracks objects, such as vehicles 26, within the system 10.
- The sensor 30 may be a camera, a radar sensor, a lidar sensor, an ultrasonic sensor, or a light beam, for example.
- The sensor 30 detects motion of the vehicles 26.
- The sensor 30 then sends detected motion data about the vehicles 26 to the computing module 34.
- The sensor 30 may detect motion data such as speed, acceleration, yaw rate, and steering angle, for example.
- The sensor 30 may detect motion data about multiple vehicles 26 within the system 10 simultaneously.
- The vehicle 26 has a vehicle unit 28 that is in communication with the computing module 34.
- The vehicle unit 28 detects motion data of the vehicle 26 from aboard the vehicle 26.
- The vehicle unit 28 may be integrated into the vehicle 26, or may be a smart device located within the vehicle 26, such as a smart phone or tablet.
- The vehicle unit 28 communicates wirelessly with the computing module 34.
- The computing module 34 compares the detected motion data from the sensor 30 and the broadcast motion data from the vehicle unit 28 to pair a particular detected vehicle 26 with a particular subscriber.
- The computing module 34 may be calibrated with data regarding the physical features of the parking system 10.
- For example, the computing module 34 may be calibrated with information regarding parking spaces 12 and aisles 24.
- The system 10 may use one or more of the following connection classes, for example: WLAN connection (e.g., based on IEEE 802.11), ISM (Industrial, Scientific, and Medical band) connection, Bluetooth® connection, ZigBee connection, UWB (ultra-wideband) connection, WiMAX® (Worldwide Interoperability for Microwave Access) connection, infrared connection, mobile radio connection, and/or radar-based communication.
- The system 10 may include one or more controllers comprising a processor, memory, and one or more input and/or output (I/O) device interface(s) that are communicatively coupled via a local interface.
- The local interface can include, for example but not limited to, one or more buses and/or other wired or wireless connections.
- The local interface may have additional elements, which are omitted for simplicity, such as controllers, buffers (caches), drivers, repeaters, and receivers to enable communications. Further, the local interface may include address, control, and/or data connections to enable appropriate communications among the aforementioned components.
- The computing module 34 may include a hardware device for executing software, particularly software stored in memory, such as an algorithm for comparing motion data.
- The computing module 34 may include a custom-made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the computing module 34, a semiconductor-based microprocessor (in the form of a microchip or chip set), or generally any device for executing software instructions.
- The memory can include any one or combination of volatile memory elements (e.g., random access memory (RAM), such as DRAM, SRAM, SDRAM, or VRAM) and/or nonvolatile memory elements (e.g., ROM, hard drive, tape, or CD-ROM).
- The memory may incorporate electronic, magnetic, optical, and/or other types of storage media. The memory can also have a distributed architecture, where various components are situated remotely from one another but can be accessed by the processor.
- The software in the memory may include one or more separate programs, each of which includes an ordered listing of executable instructions for implementing logical functions.
- A system component embodied as software may also be construed as a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed.
- When constructed as a source program, the program is translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory.
- The controller can be configured to execute software stored within the memory, to communicate data to and from the memory, and to generally control operations of the computing module 34 pursuant to the software.
- Software in memory, in whole or in part, is read by the processor, perhaps buffered within the processor, and then executed. This software may be used to store and compare the motion data from the sensor 30 and vehicle unit 28, for example.
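The store-and-compare role described above can be pictured with a small per-object buffer of motion samples. The following is a hypothetical sketch only; the `MotionBuffer` class, its fields, and the sample layout are illustrative assumptions, not structures from the patent:

```python
# Illustrative per-object buffer for motion samples awaiting comparison.
# The class name and fields are assumptions, not from the patent.
from collections import deque

class MotionBuffer:
    """Keeps the most recent motion samples for one tracked object."""
    def __init__(self, max_samples=100):
        self.samples = deque(maxlen=max_samples)  # old samples evicted automatically

    def add(self, timestamp, speed, yaw_rate, vertical_accel):
        self.samples.append(
            {"t": timestamp, "speed": speed,
             "yaw": yaw_rate, "v_acc": vertical_accel})

    def history(self, key):
        """Return one parameter's history, e.g. all buffered speed values."""
        return [s[key] for s in self.samples]

buf = MotionBuffer(max_samples=3)
for t in range(5):  # the two oldest samples fall out of the window
    buf.add(t, speed=10.0 + t, yaw_rate=0.0, vertical_accel=9.8)
print(buf.history("speed"))  # → [12.0, 13.0, 14.0]
```

A bounded `deque` keeps memory constant no matter how long a vehicle stays in the lot, which matches the idea of comparing recent motion histories rather than full trajectories.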
- Intelligent infrastructure services, such as parking and toll systems, are rapidly expanding along with the expansion of smart devices and internet-enabled vehicles.
- Such systems may offer features such as synchronized directions to a reserved parking spot, premium versus economy parking in the same lot, hourly billing, exclusive travel lanes, and other features that require the system to pair a subscriber or user's account with the vehicle accepting the service. This allows proper access to be pushed to the vehicle, charges to be made, or, in some cases, notifications to be sent to maintenance or towing staff and/or police.
- In high-traffic areas, such as a busy parking lot with several simultaneous subscribers, some infrastructure sensor systems do not have enough information to properly distinguish between vehicles, and are thus unable to ensure the correct users are getting the services they are allocated.
- Such systems may require the customer to report the service received, such as a parking spot number, which disrupts the otherwise automated nature of the service.
- The parking system 10 utilizes both detected motion data from the sensor 30 and motion data broadcast from the vehicle 26 to determine a unique match between the vehicles observed in the environment and the particular subscriber's vehicle.
- Smart vehicles or smart devices that are actively subscribed to the infrastructure-based service will broadcast their dynamic motion data.
- This dynamic motion data may be speed, acceleration, angular velocity or yaw rate, or steering angle, for example.
- The infrastructure system then distinguishes among vehicles by matching motion data histories. Once the infrastructure system has identified a particular vehicle, it can send detailed directions to the subscriber, such as directions to an open parking space.
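The history-matching step above can be sketched as follows. This is a minimal illustration, assuming equal-length, time-aligned speed samples and using mean squared error as the distance; the function name, object names, and toy values are assumptions, not from the patent:

```python
# Minimal sketch of motion-history matching: pick the tracked object
# whose speed history lies closest to the broadcast speed history.

def match_broadcast_to_tracked(tracked, broadcast):
    """Return the tracked-object id with the lowest mean squared error
    between its speed history and the broadcast speed history."""
    def mse(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return min(tracked, key=lambda obj_id: mse(tracked[obj_id], broadcast))

# Toy data loosely mirroring FIG. 3: three tracked speed traces and one
# broadcast trace that should pair with the third tracked object.
tracked = {
    "tracked_1": [5.0, 6.0, 7.0, 6.5],
    "tracked_2": [2.0, 2.5, 3.0, 3.5],
    "tracked_3": [8.0, 8.5, 9.0, 8.0],
}
broadcast_1 = [8.1, 8.4, 9.1, 7.9]
print(match_broadcast_to_tracked(tracked, broadcast_1))  # → tracked_3
```

Because each vehicle follows a unique path, the distances between a broadcast history and the wrong tracked histories tend to grow quickly, making the minimum well separated.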
- FIG. 2 schematically illustrates an example parking system 10 with several vehicles 40 , 42 , 44 moving through the system 10 .
- Vehicles 40, 44 have vehicle units 28 that are broadcasting data to the computing module 34.
- The vehicle 40 has connected with the system 10 at point 60.
- The vehicle 40 did not connect to the system 10 until after it had already entered the parking lot.
- The vehicle 44 connected with the system at point 62.
- The vehicle 44 connected to the system 10 before passing through an entrance 32 of the parking lot.
- Vehicle 42 is not broadcasting data to the computing module 34 via a vehicle unit 28.
- The computing module 34 gathers detected motion data for each of the vehicles 40, 42, 44 within the system 10.
- Each vehicle 40, 42, 44 has a unique path 50, 54, 52, respectively.
- Each path 50, 54, 52 will result in unique motion data for the respective vehicle.
- Motion for each vehicle 40, 42, 44 is detected by the sensor 30, and motion is broadcast for each vehicle that is connected to the system 10.
- The detected and broadcast data are then compared to identify particular vehicles. For example, the data may be compared to identify which vehicle is a target vehicle.
- FIGS. 3-5 illustrate examples of detected motion data and broadcast motion data for vehicles within the system 10 .
- FIG. 3 illustrates example speed 70 over time 72 data for several tracked and broadcasting vehicles.
- The speed over time 74, 76, 78 of tracked objects 1, 2, and 3 is the detected speed of the vehicles 40, 42, 44 detected by the sensor 30.
- The speed over time 75, 77 for broadcasting objects 1 and 2 is the broadcast speed of vehicles 40, 44 after they are connected to the system 10.
- The broadcast data may cover a shorter time if the broadcasting vehicle does not connect to the system 10 right away.
- The computing module 34 compares the data for tracked objects 1, 2, 3 with the data for broadcasting objects 1 and 2 to identify which of the tracked objects are which. Here, based on the speed data, the computing module 34 is able to determine that broadcasting object 1 corresponds to tracked object 3, and broadcasting object 2 corresponds to tracked object 1.
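Because a vehicle like vehicle 40 may connect only after entering the lot, its broadcast trace is shorter than the tracked trace, so the comparison can be restricted to the time window both traces share. A hypothetical sketch, assuming timestamped (time, speed) pairs and illustrative names:

```python
# Sketch of comparing a late-starting broadcast trace against a longer
# tracked trace over their common timestamps only. Names are hypothetical.

def overlap_distance(tracked, broadcast):
    """Mean squared speed difference over the timestamps both traces share."""
    tracked_by_t = dict(tracked)
    shared = [(v, tracked_by_t[t]) for t, v in broadcast if t in tracked_by_t]
    if not shared:
        return float("inf")  # no common window: the traces cannot be compared
    return sum((a - b) ** 2 for a, b in shared) / len(shared)

tracked_trace = [(0, 5.0), (1, 6.0), (2, 7.0), (3, 6.0)]  # full tracked history
late_broadcast = [(2, 7.1), (3, 5.9)]                     # connected at t = 2
print(overlap_distance(tracked_trace, late_broadcast))     # ≈ 0.01
```

Returning infinity when there is no overlap keeps a just-connected vehicle from being paired on no evidence at all.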
- FIG. 4 illustrates example yaw rate 80 over time 72 data for several tracked and broadcasting vehicles.
- The yaw rate over time 84, 86, 88 of tracked objects 1, 2, and 3 is the detected yaw rate of the vehicles 40, 42, 44 detected by the sensor 30.
- The yaw rate over time 85, 87 for broadcasting objects 1 and 2 is the broadcast yaw rate of vehicles 40, 44 after they are connected to the system 10.
- The computing module 34 compares the data for tracked objects 1, 2, 3 with the data for broadcasting objects 1 and 2 to identify which of the tracked objects are which.
- The computing module 34 is able to determine that broadcasting object 1 corresponds to tracked object 3, and broadcasting object 2 corresponds to tracked object 1.
- FIG. 5 illustrates example vertical acceleration 90 over time 72 data for several tracked and broadcasting vehicles.
- The vertical acceleration over time 94, 96, 98 of tracked objects 1, 2, and 3 is the detected vertical acceleration of the vehicles 40, 42, 44 detected by the sensor 30.
- The vertical acceleration over time 95, 97 for broadcasting objects 1 and 2 is the broadcast vertical acceleration of vehicles 40, 44 after they are connected to the system 10.
- The changes in vertical acceleration generally correspond to speed bumps 36 (shown in FIG. 2) within the system 10.
- The computing module 34 compares the data for tracked objects 1, 2, 3 with the data for broadcasting objects 1 and 2 to identify which of the tracked objects are which.
- The computing module 34 is able to determine that broadcasting object 1 corresponds to tracked object 3, but broadcasting object 2 could correspond to tracked object 1 or 2.
- Because the data 97 could correspond to either the data 94 or the data 96, additional parameters may be needed to help identify particular vehicles.
- The computing module 34 may also have stored data regarding the location of speed bumps 36 to help identify particular vehicles within the system 10 based on the vehicle location during changes in vertical acceleration. Although the illustrated example shows speed bumps 36 as the primary source of changes in vertical acceleration, other features of the system 10 may impact vertical acceleration, such as potholes and hills.
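One plausible way to use the stored speed-bump data is to detect vertical-acceleration spikes in a broadcast trace and check them against the times at which a candidate tracked vehicle crossed known bumps. The spike threshold, the 9.8 m/s² gravity baseline, the tolerance, and all names below are assumptions for illustration:

```python
# Hypothetical sketch: match vertical-acceleration spikes to known
# speed-bump crossing times. Thresholds and names are assumptions.

def spike_times(trace, threshold=2.0):
    """Timestamps where vertical acceleration deviates strongly from 1 g."""
    return [t for t, v_acc in trace if abs(v_acc - 9.8) > threshold]

def bumps_match(broadcast_trace, tracked_bump_times, tolerance=0.5):
    """True if each detected spike falls near a known bump-crossing time."""
    spikes = spike_times(broadcast_trace)
    return len(spikes) == len(tracked_bump_times) and all(
        abs(s - b) <= tolerance
        for s, b in zip(spikes, sorted(tracked_bump_times)))

# Broadcast (time, vertical acceleration) samples with spikes at t = 2, 4.
trace = [(0, 9.8), (1, 9.9), (2, 13.5), (3, 9.7), (4, 13.2)]
print(bumps_match(trace, [2, 4]))  # → True: spikes line up with these bumps
print(bumps_match(trace, [1, 3]))  # → False: wrong crossing times
```

This kind of event-based check complements the continuous speed and yaw-rate comparisons, since bump crossings are sparse but highly distinctive.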
- In some examples, a single motion parameter is sufficient to identify a particular vehicle.
- In other examples, multiple motion parameters may be required to identify vehicles.
- The computing module 34 may rely on any combination of the above-described parameters, or on additional parameters, such as steering angle.
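Combining parameters as described above could be sketched as a weighted sum of per-parameter distances, so that a parameter which is ambiguous on its own (like vertical acceleration in FIG. 5) is disambiguated by another. The weights, names, and toy data are illustrative assumptions:

```python
# Sketch of fusing several motion parameters into one match score;
# lower is better. Weights and names are illustrative assumptions.

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def combined_score(tracked, broadcast, weights):
    """Weighted sum of per-parameter mean squared errors."""
    return sum(w * mse(tracked[p], broadcast[p]) for p, w in weights.items())

tracked_obj = {"speed": [5.0, 6.0], "yaw": [0.1, 0.2]}
other_obj   = {"speed": [5.0, 6.0], "yaw": [0.9, 0.8]}  # same speed, different yaw
bcast       = {"speed": [5.0, 6.1], "yaw": [0.1, 0.25]}
weights     = {"speed": 1.0, "yaw": 1.0}

best = min([("obj_a", tracked_obj), ("obj_b", other_obj)],
           key=lambda item: combined_score(item[1], bcast, weights))
print(best[0])  # → obj_a
```

Here the two candidates have identical speed traces, so the yaw-rate term alone breaks the tie, mirroring how the module may need at least two parameters.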
- Although a parking system 10 is shown and described, it should be understood that the disclosed system and method may be used for other systems.
- The system may be any subscription-based service for a region of a paid and/or restricted access area.
- The system may be a toll road, private drive, garage, or vehicle elevator, for example.
- FIG. 6 summarizes an example method 100 of identifying vehicles within the parking system 10 .
- Motion of vehicles 26 within the system 10 is detected with a sensor 30 at 102.
- The sensor 30 may detect the motion of all vehicles 26 within the system 10.
- Multiple sensors 30 may be utilized to detect the motion of the vehicles 26 within the system 10.
- This detected motion data is sent to a computing module 34.
- The computing module 34 also gathers broadcast motion data from broadcasting vehicles in the system 10 at 104. Broadcasting vehicles are those that send motion data from the vehicle itself to the computing module 34 via a vehicle unit 28.
- The computing module 34 then compares the detected motion data with the broadcast motion data at 106. Based on this comparison, the computing module 34 may identify a target vehicle at 108.
- The target vehicle may be a subscriber to a parking or other smart system that is broadcasting motion data, for example.
- The computing module 34 may then send information to the target vehicle at 110.
- For example, the computing module 34 may send directions to a parking space, or other information useful to the subscriber. This method utilizes motion data from both a static sensor 30 within the system 10 and motion data from the vehicle itself, broadcast via a vehicle unit 28, to identify vehicles within the system.
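The steps of method 100 can be tied together in a single sketch: detect (102), gather broadcasts (104), compare (106), identify (108), and send information (110). The sensor and vehicle-unit inputs are replaced by toy data, and all names and the message text are hypothetical stand-ins:

```python
# End-to-end sketch of method 100 with hypothetical inputs in place of
# the sensor 30 and vehicle units 28.

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def identify_and_notify(detected, broadcasts, info="Open spot: row 3"):
    """Pair each broadcasting subscriber with its tracked vehicle and
    return the messages the module would send (step 110)."""
    messages = {}
    for subscriber, history in broadcasts.items():  # step 104: gather broadcasts
        # steps 106/108: compare histories and pick the closest tracked vehicle
        target = min(detected, key=lambda v: mse(detected[v], history))
        messages[subscriber] = (target, info)       # step 110: send information
    return messages

detected = {"vehicle_40": [3.0, 4.0, 5.0], "vehicle_42": [7.0, 7.5, 8.0]}
broadcasts = {"subscriber_A": [3.1, 3.9, 5.0]}
print(identify_and_notify(detected, broadcasts))
# → {'subscriber_A': ('vehicle_40', 'Open spot: row 3')}
```

Note that vehicle 42, which broadcasts nothing, is still tracked and compared against, but never receives a message, matching the behavior described for non-broadcasting vehicles.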
- The disclosed system and method provide a way to identify vehicles within a parking lot or other system.
- Some known systems rely on a link to a vehicle license plate or other identifying features of the vehicle. These known systems may intrude on a user's privacy by monitoring and tracking particular features of the user and/or the user's vehicle. These systems may also require a link between a subscriber's account and a particular vehicle, which may be inconvenient for users with multiple vehicles or driving a rental car.
- When the vehicle unit 28 is a mobile device, such as a smart phone, a subscriber can use a single account for multiple vehicles.
- The disclosed system may also be used to communicate with drivers of the vehicle about features in the parking lot.
- For example, handicapped or premium parking spaces may be reserved digitally and communicated to subscribers through the vehicle unit 28.
- If such a space is not needed, the system 10 may choose to make that space into a normal parking space to accommodate more vehicles in the parking lot. This information may be communicated to subscribers using vehicle units 28, improving the efficiency of the parking system 10.
Abstract
A method of identifying a target vehicle in a system includes providing a sensor within a system. Motion data of a plurality of vehicles within the system is detected with the sensor. Broadcast motion data is received from a vehicle unit within the system. Which of the plurality of vehicles is a target vehicle is determined based on a comparison of the detected motion data and the broadcast motion data.
Description
- Intelligent infrastructure systems, such as parking lots and toll booths, may gather data regarding usage, such as by tracking vehicles entering the area. These systems may include various types of sensors that are statically mounted near the system.
- Infrastructure sensor systems may communicate with nearby vehicles. Vehicle to outside systems (V2X) communication, such as vehicle-to-vehicle (V2V) communication and vehicle-to-infrastructure (V2I) communication are-increasingly used as inputs to improve vehicle safety and convenience. Smart infrastructure systems may offer features by communicating with a nearby vehicle, such as reserving a parking spot or providing directions to an open parking spot.
- In one exemplary embodiment, a method of identifying a target vehicle in a system includes providing a sensor within a system. Motion data of a plurality of vehicles within the system is detected with the sensor. Broadcast motion data is received from a vehicle unit within the system. Which of the plurality of vehicles is a target vehicle is determined based on a comparison of the detected motion data and the broadcast motion data.
- In a further embodiment of any of the above, the detected motion data and the broadcast motion data have the same parameters.
- In a further embodiment of any of the above, the detected motion data and the broadcast motion data comprise a vehicle speed.
- In a further embodiment of any of the above, the detected motion data and the broadcast motion data comprise a vehicle yaw rate.
- In a further embodiment of any of the above, the detected motion data and the broadcast motion data comprise a vertical acceleration.
- In a further embodiment of any of the above, the detected motion data and the broadcast motion data comprise at least two parameters.
- In a further embodiment of any of the above, the detecting, receiving, and determining steps are performed by a computing module. The computing module is in communication with the sensor and the vehicle unit.
- In a further embodiment of any of the above, the computing module is configured to send information to the target vehicle.
- In a further embodiment of any of the above, the vehicle unit communicates with the computing module wirelessly.
- In a further embodiment of any of the above, broadcast motion data is received from multiple vehicle units within the system.
- In a further embodiment of any of the above, the vehicle unit is mounted within one of the plurality of vehicles.
- In a further embodiment of any of the above, the vehicle unit is a mobile device located within one of the plurality of vehicles.
- In a further embodiment of any of the above, a plurality of sensors is provided within the system.
- In a further embodiment of any of the above, the system is a portion of a paid or restricted access area.
- In another exemplary embodiment, a system for identifying a target vehicle within a system includes a sensor configured to detect motion data of a plurality of vehicles within a system. A vehicle unit is mounted on a vehicle. The vehicle unit is configured to track motion of the vehicle and broadcast the tracked motion data to the computing module. A computing module is in communication with the sensor and the vehicle unit. The sensor is configured to send the detected motion data to the computing module. The vehicle unit is configured to broadcast the tracked motion data to the computing module. The computing module identifies the vehicle as a target vehicle based on a comparison of the detected motion data and the broadcast motion data.
- In a further embodiment of any of the above, the detected motion data and the broadcast motion data comprise at least one of a vehicle speed, a vehicle yaw rate, and a vertical acceleration.
- In a further embodiment of any of the above, the detected motion data and the broadcast motion data comprise at least two parameters.
- In a further embodiment of any of the above, the vehicle unit is configured to communicate with the computing module wirelessly.
- In a further embodiment of any of the above, the computing module is configured to send information to the target vehicle via vehicle unit.
- In a further embodiment of any of the above, the computing module is configured to receive broadcast motion data from multiple vehicle units.
- The disclosure can be further understood by reference to the following detailed description when considered in connection with the accompanying drawings wherein:
-
FIG. 1 schematically illustrates an example smart parking lot system. -
FIG. 2 schematically illustrates vehicle movement in the example smart parking lot system. -
FIG. 3 illustrates example vehicle speed data. -
FIG. 4 illustrates example yaw rate data. -
FIG. 5 illustrates example vertical acceleration data. -
FIG. 6 illustrates an example method of identifying a target vehicle. - The subject invention provides a system and method for identifying vehicles within a smart system, such as a parking system. A sensor within the parking system tracks the motion of vehicles within the system. A vehicle unit on the vehicle broadcasts motion of the vehicle. The system then determines which vehicle within the system is a target vehicle based on a comparison of the tracked motion data and the broadcast motion data.
-
FIG. 1 illustrates an example smart infrastructure system, such as aparking system 10. Theparking system 10 generally includes a plurality ofparking spaces 12. Theparking spaces 12 may includeempty spaces 18 and occupiedspaces 16, which are occupied by parkedvehicles 14. Anaisle 24 extends between theparking spaces 12. In some examples, some of theparking spaces 12 may be designated ashandicapped spaces 20 orhigh priority spaces 22. Although aparking system 10 is shown and described herein, it should be understood that the disclosed system and method may be used for other systems, such as toll systems, for example. - The
system 10 generally includes asensor 30 and acomputing module 34. Thesensor 30 is in communication with thecomputing module 34. Thesensor 30 may communicate with thecomputing module 34 via communication hardware, or wirelessly. In other embodiments, thesensor 30 andcomputing module 34 may be integrated into a single unit. Thesystem 10 may includemultiple sensors 30 mounted in different locations within thesystem 10, each of thesensors 30 in communication with thecomputing module 34. - The
sensor 30 detects and tracks objects, such asvehicles 26, within thesystem 10. Thesensor 30 may be a camera, a radar sensor, a lidar sensor, an ultrasonic sensor, or light beam, for example. Thesensor 30 detects motion of thevehicles 26. Thesensor 30 then sends detected motion data about thevehicles 26 to thecomputing module 34. Thesensor 30 may detect motion data such as speed, acceleration, yaw rate, and steering angle, for example. Thesensor 30 may detect motion data aboutmultiple vehicles 26 within thesystem 10 simultaneously. Thevehicle 26 has avehicle unit 28 that is in communication with thecomputing module 34. Thevehicle unit 28 detects motion data of thevehicle 26 from aboard thevehicle 26. Thevehicle unit 28 may be integrated into thevehicle 26, or may be a smart device located within thevehicle 26, such as a smart phone or tablet. Thevehicle unit 28 communicates wirelessly with thecomputing module 34. Thecomputing module 34 compares the detected motion data from thesensor 30 and the broadcasted motion data from thevehicle unit 28 to pair a particular detectedvehicle 26 with a particular subscriber. - The
computing module 34 may be calibrated to have data regarding the physical features of theparking system 10. For example, thecomputing module 34 may be calibrated to have information regardingparking spaces 12 andaisles 24. Thesensor 30 may communicate with thecomputing module 34 via communication hardware, or may communicate wirelessly. Thesystem 10 may use one or more of the following connection classes, for example: WLAN connection, e.g. based on IEEE 802.11, ISM (Industrial, Scientific, Medical Band) connection, Bluetooth® connection, ZigBee connection, UWB (ultrawide band) connection, WiMax® (Worldwide Interoperability for Microwave Access) connection, infrared connection, mobile radio connection, and/or radar-based communication. - The
system 10, and in particular thecomputing module 34, may include one or more controllers comprising a processor, memory, and one or more input and/or output (I/O) device interface(s) that are communicatively coupled via a local interface. The local interface can include, for example but not limited to, one or more buses and/or other wired or wireless connections. The local interface may have additional elements, which are omitted for simplicity, such as controllers, buffers (caches), drivers, repeaters, and receivers to enable communications. Further, the local interface may include address, control, and/or data connections to enable appropriate communications among the aforementioned components. - The
computing module 34 may include a hardware device for executing software, particularly software stored in memory, such as an algorithm for comparing motion data. Thecomputing module 34 may include a custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with thecomputing module 34, a semiconductor based microprocessor (in the form of a microchip or chip set), or generally any device for executing software instructions. The memory can include any one or combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, VRAM, etc.)) and/or nonvolatile memory elements (e.g., ROM, hard drive, tape, CD-ROM, etc.). Moreover, the memory may incorporate electronic, magnetic, optical, and/or other types of storage media. Note that the memory can also have a distributed architecture, where various components are situated remotely from one another, but can be accessed by the processor. - The software in the memory may include one or more separate programs, each of which includes an ordered listing of executable instructions for implementing logical functions. A system component embodied as software may also be construed as a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed. When constructed as a source program, the program is translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory.
- The controller can be configured to execute software stored within the memory, to communicate data to and from the memory, and to generally control operations of the
computing module 34 pursuant to the software. Software in memory, in whole or in part, is read by the processor, perhaps buffered within the processor, and then executed. This software may be used to store and compare the motion data from the sensor 30 and vehicle unit 28, for example. - Intelligent infrastructure services, such as parking and toll systems, are rapidly expanding along with the expansion of smart devices and internet enabled vehicles. Such systems may offer features such as synchronized directions to a reserved parking spot, premium versus economy parking in the same lot, hourly billing, exclusive travel lanes, and other features that require the system to pair a subscriber or user's account with the vehicle accepting the service. This allows proper access to be pushed to the vehicle, charges to be made, or in some cases, notifications sent to maintenance or towing staff and/or police. In high traffic areas such as a busy parking lot with several simultaneous subscribers, some infrastructure sensor systems do not have enough information to properly distinguish between vehicles, and are thus unable to ensure the correct users are getting the services they are allocated. Such systems may require the customer to report the service received, such as parking spot number, which disrupts the otherwise automated nature of the service.
- The
parking system 10 utilizes both detected motion data from the sensor 30 and motion data broadcast from the vehicle 26 to determine a unique match between the vehicles observed in the environment and the particular subscriber's vehicle. Smart vehicles or smart devices that are actively subscribed to the infrastructure-based service will broadcast their dynamic motion data. This dynamic motion data may be speed, acceleration, angular velocity or yaw rate, or steering angle, for example. The infrastructure system then distinguishes among vehicles by matching motion data histories. Once the infrastructure system has identified a particular vehicle, it can send detailed directions to the subscriber, such as directions to an open parking space. -
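As a sketch, one sample of the dynamic motion data broadcast by a subscribed vehicle unit 28 might be represented as below. The patent names speed, acceleration, angular velocity/yaw rate, and steering angle as example parameters; the field names, types, and units here are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical shape of one broadcast dynamic-motion sample. Field
# names and units are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class MotionSample:
    timestamp: float       # seconds since the vehicle unit connected
    speed: float           # m/s
    acceleration: float    # m/s^2
    yaw_rate: float        # rad/s
    steering_angle: float  # rad

sample = MotionSample(timestamp=0.1, speed=4.2, acceleration=0.3,
                      yaw_rate=0.01, steering_angle=0.0)
```

A vehicle unit would emit a stream of such samples once connected, forming the broadcast motion history that the computing module compares against its own detected histories.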
FIG. 2 schematically illustrates an example parking system 10 with several vehicles 40, 42, 44 moving through the system 10. In this example, there are three vehicles 40, 42, 44 detected by the sensor 30 that are moving within the system 10. In this example, vehicles 40, 44 have vehicle units 28 that are broadcasting data to the computing module 34. The vehicle 40 has connected with the system 10 at point 60. Thus, the vehicle 40 did not connect to the system 10 until after it had already entered the parking lot. The vehicle 44 connected with the system at point 62. Thus, the vehicle 44 connected to the system 10 before passing through an entrance 32 of the parking lot. Accordingly, there will be a longer time period of broadcast data for vehicle 44 than for vehicle 40. Vehicle 42 is not broadcasting data to the computing module 34 via a vehicle unit 28. - The
computing module 34 gathers detected motion data for each of the vehicles 40, 42, 44 within the system 10. Each vehicle 40, 42, 44 has a unique path 50, 54, 52, respectively. Each path 50, 54, 52 will result in unique motion data for the respective vehicle. Motion for each vehicle 40, 42, 44 is detected by the sensor 30, and motion is broadcast for each vehicle that is connected to the system 10. The detected and broadcast data are then compared to identify particular vehicles. For example, the data may be compared to identify which vehicle is a target vehicle. -
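The comparison above can be sketched, for a single motion parameter, as a nearest-history search. This is only an illustrative sketch under assumed names and data shapes, not the algorithm claimed in the patent:

```python
# Illustrative sketch (not the claimed algorithm): match one vehicle's
# broadcast motion history to the tracked object whose detected history
# is closest, using mean squared difference over the overlapping window.

def match_track(broadcast, tracked_histories):
    """Return the id of the tracked object that best matches `broadcast`.

    broadcast         -- list of (time, value) samples from a vehicle unit
    tracked_histories -- dict: track id -> list of (time, value) samples
                         detected by the infrastructure sensor
    """
    def score(track):
        by_time = dict(track)
        # Only compare instants present in both histories: a vehicle that
        # connects late (like vehicle 40 at point 60) has a shorter
        # broadcast history than the sensor's track of that vehicle.
        pairs = [(value, by_time[t]) for t, value in broadcast if t in by_time]
        if not pairs:
            return float("inf")
        return sum((b - d) ** 2 for b, d in pairs) / len(pairs)

    return min(tracked_histories, key=lambda tid: score(tracked_histories[tid]))

tracked = {1: [(0, 5.0), (1, 6.0), (2, 7.0)],
           2: [(0, 2.0), (1, 2.5), (2, 3.0)]}
late_broadcast = [(1, 2.4), (2, 3.1)]  # connected after entering the lot
best = match_track(late_broadcast, tracked)
```

Here `best` resolves to track 2, whose detected speeds stay closest to the broadcast samples over the shared time window.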
FIGS. 3-5 illustrate examples of detected motion data and broadcast motion data for vehicles within the system 10. FIG. 3 illustrates example speed 70 over time 72 data for several tracked and broadcasting vehicles. The speed over time 74, 76, 78 of tracked objects 1, 2, and 3 is the detected speed of the vehicles 40, 42, 44 detected by the sensor 30. The speed over time 75, 77 for broadcasting objects 1 and 2 is the broadcast speed of the vehicles 40, 44 after they are connected to the system 10. In some examples, the broadcast data may be over a shorter time if the broadcasting vehicle does not connect to the system 10 right away. The computing module 34 compares the data for tracked objects 1, 2, 3 with the data for broadcasting objects 1 and 2 to identify which of the tracked objects are which. Here, based on the speed data, the computing module 34 is able to determine that broadcasting object 1 corresponds to tracked object 3, and broadcasting object 2 corresponds to tracked object 1. -
FIG. 4 illustrates example yaw rate 80 over time 72 data for several tracked and broadcasting vehicles. The yaw rate over time 84, 86, 88 of tracked objects 1, 2, and 3 is the detected yaw rate of the vehicles 40, 42, 44 detected by the sensor 30. The yaw rate over time 85, 87 for broadcasting objects 1 and 2 is the broadcast yaw rate of the vehicles 40, 44 after they are connected to the system 10. The computing module 34 compares the data for tracked objects 1, 2, 3 with the data for broadcasting objects 1 and 2 to identify which of the tracked objects are which. Here, based on the yaw rate data, the computing module 34 is able to determine that broadcasting object 1 corresponds to tracked object 3, and broadcasting object 2 corresponds to tracked object 1. -
FIG. 5 illustrates example vertical acceleration 90 over time 72 data for several tracked and broadcasting vehicles. The vertical acceleration over time 94, 96, 98 of tracked objects 1, 2, and 3 is the detected vertical acceleration of the vehicles 40, 42, 44 detected by the sensor 30. The vertical acceleration over time 95, 97 for broadcasting objects 1 and 2 is the broadcast vertical acceleration of the vehicles 40, 44 after they are connected to the system 10. Here, the changes in vertical acceleration generally correspond to speed bumps 36 (shown in FIG. 2) within the system 10. The computing module 34 compares the data for tracked objects 1, 2, 3 with the data for broadcasting objects 1 and 2 to identify which of the tracked objects are which. Here, based on the vertical acceleration data, the computing module 34 is able to determine that broadcasting object 1 corresponds to tracked object 3, and broadcasting object 2 corresponds to tracked object 1 or 2. In this case, since the data 97 could correspond to the data 94 or 96, additional parameters may be needed to help identify particular vehicles. The computing module 34 may also have stored data regarding the location of speed bumps 36 to help identify particular vehicles within the system 10 based on the vehicle location during changes in vertical acceleration. Although the illustrated example shows speed bumps 36 as the primary changes in vertical acceleration, other features of the system 10 may impact vertical acceleration, such as potholes and hills. - In some of these examples, a single motion parameter is sufficient to identify a particular vehicle. In other examples, multiple motion parameters may be required to identify vehicles. For example, the
computing module 34 may rely on any combination of the above-described parameters, or additional parameters, such as steering angle. Although a parking system 10 is shown and described, it should be understood that the disclosed system and method may be used for other systems. The system may be any subscription-based service for a region of a paid and/or restricted access area. The system may be a toll road, private drive, garage, or vehicle elevator, for example. -
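When a single trace is ambiguous, as with the two matching speed-bump signatures in FIG. 5, per-parameter distances can be combined into one score. The following is a minimal sketch assuming equal weighting of parameters; the patent does not specify how parameters are combined:

```python
# Sketch of disambiguating with multiple motion parameters. Two tracked
# objects cross the same speed bump, so their vertical accelerations are
# identical; adding yaw rate breaks the tie. Equal weighting of
# parameters is an assumption made for illustration.

def combined_score(broadcast, tracked):
    """Sum of mean squared differences across all shared parameters.

    Each argument is a dict: parameter name -> list of samples taken at
    the same instants.
    """
    total = 0.0
    for param, b_vals in broadcast.items():
        t_vals = tracked[param]
        total += sum((b - t) ** 2 for b, t in zip(b_vals, t_vals)) / len(b_vals)
    return total

# Identical vertical acceleration (same speed bump), differing yaw rate:
tracked_1 = {"vert_accel": [0.0, 2.0, 0.0], "yaw_rate": [0.00, 0.00, 0.00]}
tracked_2 = {"vert_accel": [0.0, 2.0, 0.0], "yaw_rate": [0.30, 0.40, 0.30]}
broadcast = {"vert_accel": [0.0, 2.0, 0.0], "yaw_rate": [0.29, 0.41, 0.31]}

best = min((tracked_1, tracked_2), key=lambda t: combined_score(broadcast, t))
```

The vertical acceleration term contributes nothing to either score, so the yaw rate term alone selects `tracked_2`, mirroring how the computing module might fall back on a second parameter.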
FIG. 6 summarizes an example method 100 of identifying vehicles within the parking system 10. Motion of vehicles 26 within the system 10 is detected with a sensor 30 at 102. The sensor 30 may detect the motion of all vehicles 26 within the system 10. In some examples, multiple sensors 30 may be utilized to detect the motion of the vehicles 26 within the system 10. This detected motion data is sent to a computing module 34. The computing module 34 also gathers broadcast motion data from broadcasting vehicles in the system 10 at 104. Broadcasting vehicles are those that send motion data from the vehicle itself to the computing module 34 via a vehicle unit 28. The computing module 34 then compares the detected motion data with the broadcast motion data at 106. Based on this comparison, the computing module 34 may identify a target vehicle at 108. The target vehicle may be a subscriber to a parking or other smart system that is broadcasting motion data, for example. In some examples, the computing module 34 may then send information to the target vehicle at 110. For example, the computing module 34 may send directions to a parking space, or other information useful to the subscriber. This method utilizes both motion data from a static sensor 30 within the system 10 and motion data broadcast from the vehicle itself via a vehicle unit 28 to identify vehicles within the system. - The disclosed system and method provides a way to identify vehicles within a parking lot or other system. Some known systems rely on a link to a vehicle license plate or other identifying features of the vehicle. These known systems may intrude on a user's privacy by monitoring and tracking particular features of the user and/or user's vehicle. These systems may also require a link between a subscriber's account and a particular vehicle, which may be inconvenient for users with multiple vehicles or driving a rental car. When the
vehicle unit 28 is a mobile device, such as a smart phone, a subscriber can use a single account for multiple vehicles. The disclosed system may also be used to communicate with drivers of the vehicle about features in the parking lot. For example, handicapped or premium parking spaces may be reserved digitally, and communicated to subscribers through the vehicle unit 28. Additionally, during busy times, when a space normally reserved for handicapped parking is not being used, the system 10 may choose to make that space into a normal parking space to accommodate more vehicles in the parking lot. This information may be communicated to subscribers using vehicle units 28, and improve the efficiency of the parking system 10. - It should also be understood that although a particular component arrangement is disclosed in the illustrated embodiment, other arrangements will benefit herefrom. Although particular step sequences are shown, described, and claimed, it should be understood that steps may be performed in any order, separated or combined unless otherwise indicated and will still benefit from the present invention.
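The steps of the example method 100 summarized above can be sketched end to end as follows; all function names and data shapes are illustrative assumptions, and the matching step stands in for whatever comparison the computing module actually performs:

```python
# Compact sketch of the example method 100 from FIG. 6: detect motion
# (102), gather broadcast motion data (104), compare (106), and
# identify the target vehicle (108). Names are assumptions.

def identify_targets(detected, broadcasts):
    """Map each broadcasting subscriber to its best-matching tracked object.

    detected   -- track id -> detected speed history (from the sensor, 102)
    broadcasts -- subscriber id -> broadcast speed history (from units, 104)
    """
    matches = {}
    for subscriber, history in broadcasts.items():
        # Step 106: compare the broadcast history against every detected
        # history; step 108: keep the closest one as the target.
        matches[subscriber] = min(
            detected,
            key=lambda tid: sum((a - b) ** 2
                                for a, b in zip(history, detected[tid])))
    return matches

detected = {"track_1": [5.0, 6.0, 7.0], "track_2": [2.0, 2.5, 3.0]}
broadcasts = {"subscriber_A": [2.1, 2.4, 3.0]}
matches = identify_targets(detected, broadcasts)
# Step 110 would then push directions or other information to each
# matched target vehicle.
```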
- Although the different examples have specific components shown in the illustrations, embodiments of this invention are not limited to those particular combinations. It is possible to use some of the components or features from one of the examples in combination with features or components from another one of the examples.
- Although an example embodiment has been disclosed, a worker of ordinary skill in this art would recognize that certain modifications would come within the scope of the claims. For that reason, the following claims should be studied to determine their true scope and content.
Claims (20)
1. A method of identifying a target vehicle in a system, comprising:
providing a sensor within a system;
detecting motion data of a plurality of vehicles within the system with the sensor;
receiving broadcast motion data from a vehicle unit within the system; and
determining which of the plurality of vehicles is a target vehicle based on a comparison of the detected motion data and the broadcast motion data.
2. The method of claim 1, wherein the detected motion data and the broadcast motion data have the same parameters.
3. The method of claim 1, wherein the detected motion data and the broadcast motion data comprise a vehicle speed.
4. The method of claim 1, wherein the detected motion data and the broadcast motion data comprise a vehicle yaw rate.
5. The method of claim 1, wherein the detected motion data and the broadcast motion data comprise a vertical acceleration.
6. The method of claim 1, wherein the detected motion data and the broadcast motion data comprise at least two parameters.
7. The method of claim 1, wherein the detecting, receiving, and determining steps are performed by a computing module, the computing module in communication with the sensor and the vehicle unit.
8. The method of claim 7, wherein the computing module is configured to send information to the target vehicle.
9. The method of claim 7, wherein the vehicle unit communicates with the computing module wirelessly.
10. The method of claim 1, comprising receiving broadcast motion data from multiple vehicle units within the system.
11. The method of claim 1, wherein the vehicle unit is mounted within one of the plurality of vehicles.
12. The method of claim 1, wherein the vehicle unit is a mobile device located within one of the plurality of vehicles.
13. The method of claim 1, comprising providing a plurality of sensors within the system.
14. The method of claim 1, wherein the system is a portion of a paid or restricted access area.
15. A system for identifying a target vehicle within a system, comprising:
a sensor configured to detect motion data of a plurality of vehicles within a system;
a vehicle unit mounted on a vehicle, the vehicle unit configured to track motion of the vehicle and broadcast the tracked motion data; and
a computing module in communication with the sensor and the vehicle unit, the sensor configured to send the detected motion data to the computing module, and the vehicle unit configured to broadcast the tracked motion data to the computing module, wherein the computing module identifies the vehicle as a target vehicle based on a comparison of the detected motion data and the broadcast motion data.
16. The system of claim 15, wherein the detected motion data and the broadcast motion data comprise at least one of a vehicle speed, a vehicle yaw rate, and a vertical acceleration.
17. The system of claim 16, wherein the detected motion data and the broadcast motion data comprise at least two parameters.
18. The system of claim 15, wherein the vehicle unit is configured to communicate with the computing module wirelessly.
19. The system of claim 15, wherein the computing module is configured to send information to the target vehicle via the vehicle unit.
20. The system of claim 15, wherein the computing module is configured to receive broadcast motion data from multiple vehicle units.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/724,480 US11043118B1 (en) | 2019-12-23 | 2019-12-23 | System and method for vehicle identification |
| PCT/US2020/070932 WO2021134100A1 (en) | 2019-12-23 | 2020-12-18 | System and method for vehicle identification |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US11043118B1 (en) | 2021-06-22 |
| US20210192937A1 true US20210192937A1 (en) | 2021-06-24 |
Also Published As
| Publication number | Publication date |
|---|---|
| US11043118B1 (en) | 2021-06-22 |
| WO2021134100A1 (en) | 2021-07-01 |
Legal Events
| Code | Title | Description |
|---|---|---|
| FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STCF | Information on status: patent grant | PATENTED CASE |
| MAFP | Maintenance fee payment | PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |