US20170261613A1 - Counter drone system - Google Patents
- Publication number
- US20170261613A1 (application US15/443,143)
- Authority
- US
- United States
- Prior art keywords
- drone
- lidar
- target drone
- sensor
- target
- Prior art date
- 2016-03-11
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- F41H11/02 — Anti-aircraft or anti-guided missile or anti-torpedo defence installations or systems
- B64U10/13 — Type of UAV: rotorcraft flying platforms
- G01S17/026
- G01S17/04 — Systems determining the presence of a target
- G01S17/66 — Tracking systems using electromagnetic waves other than radio waves
- G01S17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
- G01S17/87 — Combinations of systems using electromagnetic waves other than radio waves
- G01S17/89 — Lidar systems specially adapted for mapping or imaging
- G01S17/933 — Lidar systems specially adapted for anti-collision purposes of aircraft or spacecraft
- G01S7/4817 — Constructional features, e.g. arrangements of optical elements, relating to scanning
- G01S7/495 — Counter-measures or counter-counter-measures using electronic or electro-optical means
- G01S7/51 — Display arrangements
- G05D1/0016 — Control of vehicles using automatic pilots with a remote control arrangement characterised by the operator's input device
- G05D1/0088 — Control of vehicles characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
- G05D1/0094 — Control of vehicles involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
- G05D1/106 — Simultaneous three-dimensional control of aircraft with change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
- G05D1/12 — Target-seeking control
- G06T17/05 — Three dimensional [3D] modelling: geographic models
- G06V20/13 — Scene-specific elements: satellite images
- G06V20/17 — Scene-specific elements: terrestrial scenes taken from planes or by drones
- G08G5/22 — Arrangements for acquiring, generating, sharing or displaying aircraft traffic information, located on the ground
- G08G5/55 — Navigation or guidance aids for a single aircraft
- G08G5/57 — Navigation or guidance aids for unmanned aircraft
- G08G5/727 — Arrangements for monitoring traffic from a ground station
- H04W4/021 — Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
- B64U2101/15 — UAVs specially adapted for conventional or electronic warfare
- B64U2201/10 — UAVs with autonomous flight controls, e.g. using inertial navigation systems [INS]
- B64U2201/20 — UAVs with remote flight controls
- F41H13/0006 — Ballistically deployed systems for restraining persons or animals, e.g. ballistically deployed nets
- G06V2201/07 — Target detection
Definitions
- This disclosure relates generally to drones and more particularly to a technique to detect and track drones.
- In recent years, unmanned aerial vehicles, or drones, have matured to the point that they are readily available at nominal cost for purchase by private individuals.
- The ready availability of drones to private individuals raises additional concerns for law enforcement and security personnel, since drones can be used for unwanted or illegal activity.
- For example, a drone carrying contraband can be flown over a prison to deliver the contraband within the prison walls.
- Drones can be flown into private areas, carry explosives, or deliver contraband to personnel located in those areas.
- Drones can also be flown into airspace in a way that prevents manned airplanes from flying a desired course.
- Other misuses are limited only by the imagination of unlawful actors; hence it is desirable for a system to locate a drone and render it useless when the drone is identified as a nuisance or danger.
- A counter drone system includes a cueing sensor to detect the presence of an object, wherein the cueing sensor cues the presence of a target drone; a long range LIDAR system having a sensor pointed in the direction of the target drone to acquire and track the target drone at long range and provide an accurate location of it, wherein once a track is acquired the motion of the target drone is tracked; and a threat detector that processes the LIDAR data to determine whether the target drone is a threat.
- With such an arrangement, the target drone can be located and rendered useless when it is identified as a nuisance or danger.
- A counter drone system may also include a semi-autonomous response planner system with inputs from the threat detector and an operator to determine countermeasures appropriate for the cued target drone. With such an arrangement, suitable countermeasures can be determined and implemented to counter any threats posed by the target drone.
- A method includes: detecting a presence of a target drone using a cueing sensor; acquiring, in response to initial data from the cueing sensor, a target drone and then tracking the target drone using a long range LIDAR system to provide an accurate location of the target drone and to maintain the track of the target drone; and providing LIDAR data to a threat detector to determine if the target drone is a threat.
- Such a technique provides an indication of a threat by a target drone so that countermeasures can be taken.
- FIG. 1 is a diagram of a drone detection system
- FIG. 2 is a diagram of using a LIDAR element to provide an electronic fence to protect an area of concern
- FIG. 3 is a sketch of a tracking LIDAR with a field of view of a camera picture of a target and the corresponding LIDAR image taken from a LIDAR scanner;
- FIG. 3A is a diagram of a planned intercept course
- FIG. 4 is a diagram of an early detection system with a tracking LIDAR to track a target
- FIG. 5 is a diagram of a plurality of LIDAR elements disposed to provide an electronic fence with a long range tracking LIDAR;
- FIG. 6 is a diagram of a drone detection system with a three dimensional scene model for analyzing an environment
- FIG. 6A is a diagram of a geo-locator and labeler included within the drone detection system of FIG. 6;
- FIG. 6B is a diagram of an object avoidance path
- FIG. 6C is an example of a three dimensional scene
- FIG. 6D is another example of a three dimensional scene
- FIG. 7 is a screen shot of a computer screen with an example of a three dimensional scene
- FIG. 8 is a diagram of a drone viewing a target drone
- FIG. 9 is a block diagram of a system to implement a drone detection system
- FIG. 10 is a diagram where a user designates a target drone using a pointing device.
- FIG. 11 is a block diagram of a computer that can be used to implement certain features of the system.
- Light detection and ranging (LIDAR) can be used to create three-dimensional (3D) imagery of a field of view.
- A LIDAR system includes a light source, such as a laser, that generates and directs pulses of light. The light pulses are reflected by the surfaces of various objects, such as the ground, a tree, or a building, or by an object in the air such as a drone.
- A sensor in the LIDAR system detects the reflections. The relative location of the reflecting surface can be determined by the LIDAR from the elapsed time between when the light pulse is generated and when its reflection is detected. This cycle of pulse and detection may be repeated thousands of times per second.
- the coordinate frame of detection can be translated into another coordinate frame for display using common methods.
- the reflected light pulses are used to create a 3D image of the scanned area or field of view. An operator may then use pan and zoom commands to change the camera or sensor orientation and see different portions of the scanned area or field of view.
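As a concrete illustration of the time-of-flight ranging just described, here is a minimal sketch; the elapsed-time value is hypothetical:

```python
# Minimal sketch of LIDAR time-of-flight ranging. The pulse travels to
# the target and back, so range is half the round-trip distance.
# Values are illustrative only.

C = 299_792_458.0  # speed of light, m/s

def range_from_elapsed_time(elapsed_s: float) -> float:
    """Range to the reflecting surface from pulse round-trip time."""
    return C * elapsed_s / 2.0

# A return detected 6.67 microseconds after the pulse was emitted
# corresponds to a target roughly 1 km away:
print(range_from_elapsed_time(6.67e-6))  # ~999.8 m
```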
- A LIDAR has advantages over other sensors for tracking drones. A short range LIDAR (~100 m) can interrogate all of its airspace and detect a drone; however, a range of 100 meters has limited value. A long range LIDAR (~1000 m), on the other hand, has such a narrow field of view that it is not practical for it to perform detection on its own. This disclosure therefore uses a two-tiered approach in which an alerting system cues the long range LIDAR, so that the advantages of the long range LIDAR can be exploited. To make a long range LIDAR feasible, a second sensor is used to alert (cue) that there is a drone present to track.
- The second sensor does not need to perform long range tracking well; it only needs to provide a small area for the long range LIDAR to search to find the drone.
- LIDAR also provides very precise three dimensional (3D) location information and is capable of detecting the physical presence of an object in almost all lighting conditions. It does not require the drone to emit RF, and it works whether the drone is stationary, slow, or fast, and whether it is close to the ground or high in the air.
- LIDAR has advantages over radar in that LIDAR allows for more accurate location and has a smaller spot size allowing for a more accurate image of a target to be formed.
- Referring now to FIG. 1, a drone detection system 100 (sometimes referred to as a counter drone system) is shown to include a plurality of detection sensors 110 arranged to detect an object, more specifically a drone 130.
- A detection processor 112 captures the existence of an object and cues the presence of a drone 130 to a tracking sensor 114, which acquires and tracks the drone 130 at long range using a target tracker 116.
- An image of a target drone, once cued, can be fed to a target identifier 118 for target recognition, and the image 120 of the cued target can be displayed to an operator 122 on a display 124 so the operator 122 can verify and analyze the cued target.
- The target tracker 116 also feeds the target tracks to a semi-autonomous response planner system 126 which, with inputs also from the operator 122, can determine countermeasures 128 appropriate for the cued target.
- For example, an interceptor drone 132 can be deployed.
- From the latter, it can be seen that a counter drone system is provided wherein a cueing sensor, provided by the detection sensors 110, is able to detect the presence of an object and cue the presence of a target drone.
- A long range LIDAR system, provided by the tracking sensor 114 and the target tracker 116 with a sensor pointed in the direction of the target drone to acquire and track it at long range, can provide an accurate location of the target drone; once a track is acquired, the motion of the target drone along with a Kalman Filter is used to maintain the track of the target drone.
- A threat detector, provided by the target identifier 118, uses the LIDAR data provided to it to determine if the target drone is a threat.
- Countermeasures 128, in response to the operator 122 or the semi-autonomous response planner 126, can then be implemented to render the target drone useless when it is identified as a nuisance or danger.
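The Kalman Filter named above is a standard means of maintaining a track between detections. A minimal constant-velocity sketch follows; the state layout, time step, and noise magnitudes are illustrative assumptions, not values from this disclosure:

```python
import numpy as np

# Minimal constant-velocity Kalman filter sketch for maintaining a 3D
# track between LIDAR detections. State is [x, y, z, vx, vy, vz].

dt = 0.1  # seconds between LIDAR updates (assumed)

F = np.eye(6)
F[0, 3] = F[1, 4] = F[2, 5] = dt              # position += velocity * dt
H = np.hstack([np.eye(3), np.zeros((3, 3))])  # LIDAR measures position only
Q = np.eye(6) * 0.01                          # process noise (assumed)
R = np.eye(3) * 0.05                          # measurement noise (assumed)

x = np.zeros(6)        # state estimate
P = np.eye(6) * 10.0   # state covariance

def predict():
    """Propagate the track forward; used to aim the sensor between hits."""
    global x, P
    x = F @ x
    P = F @ P @ F.T + Q
    return x[:3]  # predicted 3D position

def update(z):
    """Fold a new LIDAR detection (a 3D point) into the track."""
    global x, P
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(6) - K @ H) @ P
```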
- Optionally, cameras can be aimed at the track as well.
- LIDAR (and optional camera) data is given to the human operator 122 to determine threat vs. non-threat, or automated techniques can be used as well.
- Sensor fusion techniques can also be used to combine the camera and LIDAR data to assist in threat determination.
- A camera can be aimed toward the target to get further information about the target. Where to aim the camera can be based on the target tracker and knowledge of the camera coordinate frame and the tracker sensor coordinate frame, as discussed further herein below and sketched next.
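A minimal sketch of aiming a pan/tilt camera from a track, assuming the rigid transform between the tracker frame and the camera frame has been surveyed in advance; the transform values and function names are illustrative:

```python
import numpy as np

# Sketch of aiming a pan/tilt camera at a tracked point. The rotation
# and translation from the tracker frame to the camera frame are
# assumed known from survey; the values below are placeholders.

R_cam_from_trk = np.eye(3)                   # rotation, tracker -> camera
t_cam_from_trk = np.array([1.0, 0.0, 0.0])   # camera offset, metres

def pan_tilt_for(track_point_trk: np.ndarray) -> tuple[float, float]:
    """Return (pan, tilt) in radians to point the camera at the track."""
    p = R_cam_from_trk @ track_point_trk + t_cam_from_trk
    pan = np.arctan2(p[1], p[0])                   # azimuth in camera frame
    tilt = np.arctan2(p[2], np.hypot(p[0], p[1]))  # elevation
    return pan, tilt
```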
- It should be appreciated that drone detection and tracking is accomplished wherein one sensor 110, a LIDAR (or alternatively acoustics, infrared, etc.), cues the presence, but not the high resolution location, of a drone, and a LIDAR tracking sensor 114 (flash, Geiger mode, or line scanning) is aimed to acquire and track the target at long range to provide an accurate location. Once the track is acquired, the sensing of the target and the prediction of the motion of the target using standard means (such as a Kalman Filter) are used to maintain the track of the target.
- It should be understood that any line scanning LIDAR is a suitable cueing sensor.
- Examples include a Quanergy M8 or a Velodyne VLP-16. This is configured as a light fence facing upward, as described in FIG. 2.
- A line scanning LIDAR such as a Velodyne VLP-16 or similar can be configured as an upwards facing light fence.
- An object that breaks the light fence will be registered by the LIDAR, and its location can be translated into a coordinate and, in the case of a multibeam LIDAR, a vector, as sketched below. This defines a search space in which the long range LIDAR hunts for the object that has broken the light fence.
- Several such LIDARs configured as a light fence may be networked together to form a perimeter around a location to be protected, such as the White House, an airport, or a prison.
- Acoustic sensor systems could also be used to cue the sensor; in this case the audible signature of the drone is detected by a microphone array and translated into an approximate location. Similarly, a radar could be used to cue the sensor.
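A minimal sketch of turning light-fence breaks into a search cue for the long range LIDAR, per the description above; the beam-geometry convention and function names are illustrative assumptions:

```python
import numpy as np

# Sketch of converting light-fence breaks into a cue: each break yields
# a 3D point in the fence sensor's frame, and two successive breaks
# yield an approximate inbound vector for the long range LIDAR to
# search along. Beam geometry here is illustrative.

def beam_return_to_point(azimuth_rad, elevation_rad, range_m):
    """Convert one beam return into a point in the fence sensor frame."""
    ce = np.cos(elevation_rad)
    return range_m * np.array([
        ce * np.cos(azimuth_rad),
        ce * np.sin(azimuth_rad),
        np.sin(elevation_rad),
    ])

def cue_from_breaks(p_first, p_second, dt_s):
    """Approximate position and velocity cue from two fence breaks."""
    velocity = (p_second - p_first) / dt_s
    return p_second, velocity  # search near p_second, along velocity
```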
- The long range LIDAR will "hunt" for a flying object, defined as an object that is off the ground and in open space previously known to have been empty. If the object is moving, it is tracked. If the object is stationary, it is observed as stationary.
- The response planner 126 then performs its planning tasks when an object is observed.
- Referring now to FIG. 2, a LIDAR sensor 10 in one embodiment provides 16 beams and has a range of approximately 100 meters.
- The LIDAR sensor 10 is disposed so that the beams 12 are pointed upward such that the beams 12 can detect an object, here a drone 14, when the drone 14 enters the range of the LIDAR sensor 10.
- The LIDAR sensor 10 is disposed on a surface, and the beam 12, scanned from one horizon up into the air to the other horizon, creates a fan 16 that interrogates the airspace within the range of the LIDAR sensor 10.
- A plurality of sensors 10 can be arranged along a line and networked together to provide a light-fence 18.
- By then disposing a plurality of light fences 18 around an area to be protected, a fence can be created to detect objects entering the light fence 18.
- Thus a detection system 210 for a drone detection and tracking system is provided for cueing a tracking system, where a line scanning LIDAR is pointed upward to make a light-fence, and objects detected by the light-fence can be used to cue a tracker.
- Several such light-fence sections can be established together around the perimeter of an asset to establish a light fence around the asset.
- The inbound vector of an object can be given to a second LIDAR (flash, Geiger mode, or line scanning) that is aimed to acquire and track the target to provide an accurate location. Once the track is acquired, the motion of the drone is used as input to maintain the track of the target.
- A system may thus include a three dimensional line-scanner LIDAR sensor disposed on its side to provide a set of fanned beams that travel from one horizon, up into the air, to the other horizon to detect an object and create a track for the object; a long range sensor can be provided to track the object detected by the line-scanner LIDAR sensor in response to an initial track of the object created by the line-scanner LIDAR sensor.
- In this way, a system can be alerted when a drone is flying through a vertical plane.
- Interested parties are alerted when a drone is invading their space.
- By putting a line-scanning LIDAR on its side, a set of fanned beams is created that goes from one horizon, into the air, and to the other horizon (left, up, right). Anything flying through these beams can be detected, and a track can be established.
- Furthermore, an alert can be provided whenever something flies into the monitored airspace.
- The system can be used to alert a long range LIDAR to the presence of a drone so that the long range LIDAR can track it. Because of its narrow field of view, it is not practical for the long range LIDAR to do detection itself.
- The light fence provides a technique for detection and provides an accurate location where the long range LIDAR should look.
- Line-scanning LIDARs are available from several vendors, including models such as the Velodyne VLP-16, Velodyne HDL-32, SICK LMS 111, and Quanergy M8.
- A light fence is well known in the art. In general, to make a light fence: turn on the LIDAR, then take a few scans so the LIDAR can learn the expected return ranges for all beams at all angles. For example, at 132 degrees the light may travel 30 meters before reflecting off a branch; we know the space between 0 and 30 meters is open because the beam reflected back at 30 meters. At 140 degrees there may be no return at all because the beam went up in the air and nothing reflected back. This profile is stored for each beam, as sketched below.
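A minimal sketch of that per-beam profile, assuming returns are binned by beam and integer degree; the bin counts and the 1-meter margin are illustrative:

```python
import numpy as np

# Sketch of the light-fence procedure described above: learn the
# expected return range for every (beam, angle) bin from a few clean
# scans, then flag any later return that comes back noticeably shorter
# than expected, i.e., something is now in previously open space.

N_BEAMS, N_ANGLES = 16, 360
expected = np.full((N_BEAMS, N_ANGLES), np.inf)  # inf = no return (open sky)

def learn(scans):
    """scans: iterable of (beam, angle_deg, range_m) returns from a
    clean environment; keep the closest range ever seen per bin."""
    for beam, angle, rng in scans:
        expected[beam, angle] = min(expected[beam, angle], rng)

def breaks(scan, margin_m=1.0):
    """Yield returns that are shorter than the learned profile."""
    for beam, angle, rng in scan:
        if rng < expected[beam, angle] - margin_m:
            yield beam, angle, rng  # something broke the fence here
```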
- The drone cueing system 210 gives its best vector information to the tracking sensor 114 and the target tracker (tracking controller) 116.
- The tracking controller 116 aims the flash LIDAR at the predicted track location and starts hunting for the object in the sky.
- An object is segmented from the background by virtue of being in open air, as sketched below.
- The object is tracked in the LIDAR frame using existing LIDAR tracking code, and the tracking information is fed back into the tracking controller 116.
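A minimal sketch of such open-air segmentation, assuming the surveyed background is kept as a set of occupied voxel keys; the voxel size and data layout are illustrative:

```python
import numpy as np

# Sketch of "open air" segmentation: any return that falls in space
# known from the background survey to be empty is attributed to the
# target rather than to fixed background. An occupancy set of voxel
# keys is an illustrative representation.

VOXEL = 0.5  # metres, assumed grid resolution

def voxel_key(p):
    """Quantize a 3D point into an integer voxel key."""
    return tuple(np.floor(p / VOXEL).astype(int))

def segment_target(points, background_voxels):
    """Keep only returns that are not explained by the background.
    background_voxels: set of voxel keys occupied in the clean survey."""
    return [p for p in points if voxel_key(p) not in background_voxels]
```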
- Optionally, cameras can be aimed at the track as well.
- LIDAR (and optional camera) data is given to the human operator 122 to determine threat vs. non-threat, or automated techniques can be used as well. Sensor fusion techniques can also be used.
- Referring now to FIG. 3, a tracking LIDAR 20 is shown, where an ASC Tiger Cub flash LIDAR emits a flash of laser light and uses a CCD to capture range information. The field of view is narrow, like that of a camera.
- The tracking LIDAR 20, pointed toward a target 26, will return an image 28 of the target 26.
- The range can be up to 1 km. At 1 km, pixels are about 20 cm; at 500 m, pixels are about 10 cm; and at 100 m, pixels are about 2 cm.
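Those pixel sizes are mutually consistent with a fixed angular pixel footprint (instantaneous field of view) of roughly 0.2 milliradians, since footprint = range × IFOV; a quick check:

```python
# Back-solving from the figures above gives an IFOV of ~0.2 mrad;
# this value is inferred from the quoted numbers, not stated in the
# disclosure.

IFOV_RAD = 0.2e-3  # ~0.2 milliradians

for range_m in (1000.0, 500.0, 100.0):
    print(f"{range_m:6.0f} m -> pixel ~{range_m * IFOV_RAD * 100:.0f} cm")
# 1000 m -> ~20 cm, 500 m -> ~10 cm, 100 m -> ~2 cm
```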
- A given inbound track from the cueing detection system 210 provides the location information of a target drone such that the tracking LIDAR 20, mounted on a pan/tilt head, can scan the sky to find a UAV or drone. Once a UAV is found, the tracking LIDAR 20 can track the UAV, provide 3D coordinates for countermeasures, provide a 3D model of the object for classification, and give a clean view of the object to an operator for a go/no-go decision.
- Referring now to FIG. 4, a drone detection and tracking system 200 includes an early detection system, provided by the drone cueing system 210, for detecting the presence of a drone.
- The cue sensor 10, facing upward, uses its modality to detect the presence of drones.
- The initial detectors could be acoustic, infrared, radar, or other sensors, but here we are describing a LIDAR sensor.
- The detection is made from the vector of an object flying through the fan.
- The cue sensor 10 of the early detection system 210 gives its best vector information to a long range tracker 220.
- The long range tracker 220 aims its flash LIDAR at the predicted track location and starts hunting for the object 222 in the sky.
- The object 222 is segmented from the background by virtue of being in open air.
- The object 222 is tracked in the LIDAR frame using existing LIDAR tracking code, and the tracking information is fed back into the tracking controller of the long range tracker 220.
- Optionally, cameras can be aimed at the track as well.
- LIDAR (and optional camera) data is given to a human operator 122 to determine threat vs. non-threat, and automated techniques can be used as well. Sensor fusion techniques can also be used.
- Referring now to FIG. 5, a drone detection and tracking system 300 is shown where a line scanning LIDAR 310 is pointed upward to make a light-fence, and flying entities that fly through the light fence establish an inbound vector.
- The inbound vector is given to a second LIDAR 320 (flash, Geiger mode, or line scanning) that is aimed to acquire and track the target to provide an accurate location.
- The motion of the drone is used as input to maintain the track of the target.
- One or more intercept drones, each carrying a countermeasure device such as a localized jammer, a net, or a net gun, are then tasked to the location of the first drone based on the track from the ground based system.
- Referring now to FIG. 6, a drone detection system 400 is shown to include a plurality of detection sensors 410 arranged to detect an object, more specifically a drone 440.
- A detection processor 412 captures the existence of an object and cues the presence of a drone to a tracking sensor 414, which acquires and tracks the drone at long range using a target tracker 416.
- An image of a target drone, once cued and tracked, can be fed to a target identifier 418 for target recognition, and the image 420 of the target is displayed to an operator 422 on a display 424 so the operator 422 can verify and analyze the target.
- The target tracker 416 also feeds the target tracks to a semi-autonomous response planner system 426 which, with inputs also from the operator 422, can determine countermeasures 428 appropriate for the target.
- The drone detection system 400 also includes a system 430 for creating a three dimensional model of an environment, wherein the detection sensors 410 and the tracking sensor 414 with the target identifier 418 provide a scanning system to scan an environment and produce an image 434 of the scanned environment; a geo-locator 452 is used to tag a plurality of points within the image with geo-reference points, and a labeler 454 is used to label features of interest within the image and to identify possible access paths within the features of interest that could provide an access path for a target drone.
- A real-time pedestrian model system 432 is provided to track the locations of pedestrians in an environment 436.
- The environment 436 can include a portion of the image 434, all of the image 434, or more than the environment captured by the image 434.
- Surveying a site by LIDAR to create a 3D model of the environment can be used as input for: a) explaining false positives when detecting and tracking drones; b) calculating fields of view when detecting and tracking drones; c) optimizing countermeasures for drones; and d) planning routes for countermeasures for drones.
- A 3D scan of the environment is made, producing a detailed point cloud of fixed objects, and points are geo-referenced in this model.
- The model is loaded into command and control software.
- The command and control software is written to use this model when planning waypoints for interception, avoiding objects that are possible collisions (e.g., trees) without requiring on-board sensing.
- The model is used when reading and considering new tracks (from LIDAR or other sensors, e.g., radar or acoustics) to determine whether the location of a new track is likely to be noise (traffic, a waving flag, fireworks, . . . ) or in fact a potential target.
- The model is used when evaluating blind spots of the system for deployed sensors, by placing their locations and fields of view into the model and tracing each field of view for intersections with fixed objects in the model (buildings, trees).
- The model is used when deciding the windows of opportunity for countermeasures and prioritizing their use, by considering how long a window of opportunity to intercept is possible, whether there is collateral damage (pedestrians), the chance of interference (radio tower, multi-path off a building), etc., based on modality (jamming, projectile, etc.).
- The system 400 can create a 3D model of the environment (buildings, trees, roads, parking lots, etc.) and use the context of the world to perform better tracking, better false positive rejection, better intercept planning, obstacle avoidance for the intercept vehicle, and better site preparation for a counter drone detection, tracking, and intercepting platform, for example, as shown in FIG. 6B.
- The system 400 can make a 3D scan of the environment producing a detailed point cloud of fixed objects, and the objects are geo-referenced in this 3D scan model.
- Referring now to FIG. 6A, a system for creating a three dimensional model of an environment includes the 3D scene model 430, where a LIDAR scanning system scans an environment to provide an image of the scanned environment which is stored as data 450; a geo-locator 452 is used to tag a plurality of points within the image with geo-reference points, and a labeler 454 is used to label features of interest within the image and to identify possible access paths within the features of interest that could provide an access path for a target drone.
- The system 400 scans the environment with a LIDAR detection sensor. This can be done by an aerial platform, a mobile mapping platform, or a stationary platform using the detection sensors 410. See, for example, the images of the scenes in FIG. 6B or FIG. 6C.
- The system 400 geo-references the points in the scene with GPS using known techniques. This is common practice.
- The system 400 will next label the scene with features of interest. Examples include: roads (roads have cars, and cars move); trees (trees sway in the wind and move slightly; trees are obstacles to avoid with drones); buildings (buildings are high value items we do not want to hurt); areas with people (areas where we want to avoid collateral damage); and other features of interest.
- Labeling of this data could be done by hand, by automated methods, or by geo-referencing other data sources. Having the latter information available, a mission planner can now consider placement of assets in the model as well as predict where enemy drones may come from. A mission planner can consider windows of opportunity for countermeasures and analyze blind spots. The mission planner can analyze areas where false positives (birds, for example) may come from. By playing what-if scenarios, the mission planner can arrive at a better placement of assets to protect what needs protection. Furthermore, having the latter information available, when a track is first discovered the mission planner can consider the likelihood that it is a false track based on where it originated. For example, if it came from a tree, there is a possibility it may be a bird. Other sensor reports can be considered.
- Acoustic detections can be evaluated in the model, as well as radar detections, if desired. Radars may produce false tracks off cars, etc. More false positives can be rejected by understanding where the false positives originate, as sketched below.
- The mission planner can use the 3D model to plan which countermeasure can and should be launched, and determine when an opportunity to intercept is most likely. If the selected countermeasure is deploying another drone, the mission planner can pilot an intercept drone around obstacles because the obstacles have been mapped in the scene a priori.
- The mission planner can plan for the best opportunity with minimal collateral damage because a 3D model of the scene is available.
- The mission planner can compute the firing angles, debris patterns, and effects of range of various systems, and choose to engage at a time and place likely to cause the least damage.
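A minimal sketch of screening a new track against the labeled scene model, as described above; the labels, prior probabilities, and search radius are illustrative assumptions:

```python
import numpy as np

# Sketch of false-track screening: when a new track first appears, look
# up the nearest labeled feature in the 3D scene model and assign a
# prior probability that the track is a false positive (trees -> birds,
# roads -> cars, etc.). Labels and weights are placeholders.

FALSE_POSITIVE_PRIOR = {"tree": 0.6, "road": 0.5, "flag": 0.7, "open_air": 0.05}

def track_suspicion(origin, labeled_points, labels, radius_m=5.0):
    """Prior probability that a track starting at `origin` is false.
    labeled_points: (N, 3) array of scene points; labels: N strings."""
    d = np.linalg.norm(labeled_points - origin, axis=1)
    i = int(np.argmin(d))
    label = labels[i] if d[i] <= radius_m else "open_air"
    return FALSE_POSITIVE_PRIOR.get(label, 0.1)
```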
- A geo-locator is used to tag a plurality of points within the image with geo-reference points, and a labeler is used to label features of interest within the image and to identify possible access paths within the features of interest.
- Referring now to FIG. 7, a screen shot 700 of a computer screen with an example of a three dimensional scene 702 is shown.
- Again, a geo-locator is used to tag a plurality of points within the image with geo-reference points, and a labeler is used to label features of interest.
- An intercept drone 82 can be deployed.
- The intercept drone has on-board GPS and the ability to fly to GPS locations; the intercept drone can receive waypoints by radio; the tracking system is surveyed in so its GPS location is known; the tracking system can translate a track into GPS coordinates; and the intercept drone is commanded over a radio link to fly to GPS coordinates that put it in range of a tracked target. The coordinates may be an offset or a projection from the tracked target (above, ahead, below, etc.). Triggering of a countermeasure (jammer, net, etc.) can be done automatically or by a human pressing a button.
- An intercept drone 82 may be stationed at high altitude (~400 ft) by a tether to ground power, allowing it to stay in place 24 hours a day until needed to deploy, dropping the ground tether and intercepting from above.
- Referring now to FIG. 9, a drone detection system 500 is shown to include a plurality of detection sensors 510 arranged to detect an object, more specifically a drone 502.
- A detection processor 512 captures the existence of the drone 502 and cues the presence of the drone 502 to a tracking sensor 514, which acquires and tracks the drone at long range using a target tracker 516.
- The target tracker 516 provides the track and the tracking sensor 514's GPS coordinates to a command and control processor 520, which in turn translates the track from sensor coordinates into GPS coordinates and provides the GPS coordinates of the drone 502 to a ground control station 522.
- The command and control processor 520, knowing the current location of the intercept drone 526 in GPS coordinates through the ground control station 522, can determine a proper intercept course for the intercept drone 526 and command the velocity and vector of travel for the intercept drone 526 to intercept the drone 502.
- The ground control station 522 then provides controls to a drone controller 524, which controls an intercept drone 526.
- The ground control station 522 also provides image data to a tablet 528, such as an Android Tactical Assault Kit (ATAK) tablet 528.
- The intercept drone 526 has an on-board GPS receiver, has the ability to fly to GPS locations, and can receive waypoints by radio.
- The tracking system is surveyed in so its GPS location is known.
- The tracking system can translate a track into GPS coordinates, as sketched below.
- The intercept drone 526 is commanded over a radio link 530 to fly to GPS coordinates that put it in range of the tracked target. The coordinates may be an offset or a projection from the tracked target drone 502 (above, ahead, below, etc.).
- The command and control processor 520 or the ground control station 522 can then trigger a countermeasure (jammer, net, etc.), initiated automatically or by a human pressing a button.
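A minimal sketch of the sensor-to-GPS translation and offset waypoint described above, using a flat-earth approximation around the surveyed sensor location (adequate only over small areas; all names and the offset value are illustrative):

```python
import math

# Sketch of translating a local east/north/up track point into GPS
# coordinates around the surveyed sensor location, then commanding the
# interceptor to a point offset above the target.

EARTH_R = 6_378_137.0  # WGS-84 equatorial radius, metres

def enu_to_gps(east_m, north_m, up_m, sensor_lat, sensor_lon, sensor_alt):
    """Convert a local ENU track point to (lat, lon, alt), flat-earth."""
    lat = sensor_lat + math.degrees(north_m / EARTH_R)
    lon = sensor_lon + math.degrees(
        east_m / (EARTH_R * math.cos(math.radians(sensor_lat))))
    return lat, lon, sensor_alt + up_m

def intercept_waypoint(target_lat, target_lon, target_alt, above_m=10.0):
    """Command the interceptor to a point directly above the target."""
    return target_lat, target_lon, target_alt + above_m
```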
- In this way, a high powered intercept drone can be flown under the supervised autonomy of a system that is tracking a threat drone with a long range LIDAR.
- The supervised autonomy is performed by processing the detection and tracking information and sending command instructions to the intercept drone to fly to the location of the threat drone.
- The location of the threat drone is updated by the tracking performed by the long range LIDAR.
- The intercept drone can carry any of a number of payloads appropriate to disable the threat when in sufficient range.
- The present approach allows the intercept drone to carry many different kinds of packages into close range of the target drone. By waiting until close range to the target before using a countermeasure, collateral damage can be minimized; jamming ranges, for example, can be reduced to a few feet.
- A human operator can safely abort the intercept after launch.
- The intercept drone can be controlled at far ranges while an accurate track of the target drone is maintained.
- Referring now to FIG. 10, drone tracking is accomplished where a human 90 designates a target drone 92, either with a pointing device (laser designator) 94 or by an approximate coordinate, and a second, autonomous intercept drone 96 uses on-board sensing 98 (a camera or a range sensor being two examples) to follow the first drone 92 after it has been selected as the target.
- This is designed to allow a "first responder" (probably two working as a team) to get rid of a nuisance drone.
- The method of disabling the target is left open, as many different payloads could exist.
- Target selection can be done by illumination using the pointing device 94, or by giving an approximate GPS location (via an ATAK tablet 91 or any available user device such as a smart phone or computer).
- The intercept drone 96 gives back its understanding of the selected target, which is displayed on the ATAK tablet 91 for the human 90 to confirm.
- The intercept drone 96 then self-pilots to the location of the target.
- The intercept drone 96 follows the motion of the target drone 92 to update the destination GPS coordinates. Tracking can be done with a camera, LIDAR, or another sensor, and the intercept drone 96 can use on-board sensing or a pre-loaded model for obstacle detection, including stereo vision, radar, LIDAR, ultrasound, or the like.
- An inner loop of next-step GPS waypoints, or a series of thrust commands along a vector, is given to a flight controller (standard robotics practice) to facilitate the course of flight. Bearing and range can be used to project the next waypoint, as sketched below, and the status is updated on the ATAK tablet 91.
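A minimal sketch of projecting the next waypoint from bearing and range, again using a small-step flat-earth approximation suitable for an inner control loop (function names illustrative):

```python
import math

# Sketch of the inner-loop waypoint projection: given the interceptor's
# current GPS position plus a bearing and step distance toward the
# target, project the next waypoint ahead of the interceptor.

EARTH_R = 6_378_137.0  # metres

def project_waypoint(lat, lon, bearing_deg, step_m):
    """Next waypoint, step_m along bearing_deg from (lat, lon)."""
    north = step_m * math.cos(math.radians(bearing_deg))
    east = step_m * math.sin(math.radians(bearing_deg))
    new_lat = lat + math.degrees(north / EARTH_R)
    new_lon = lon + math.degrees(
        east / (EARTH_R * math.cos(math.radians(lat))))
    return new_lat, new_lon
```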
- The human 90 (user) is allowed to pause, abort, or aid the system, or to trigger an on-board countermeasure (net, jammer, etc.).
- An indication of the target is provided with the pointing device (a laser designator, or drawing on a screen), and feedback is given to the user by communicating back to the tablet.
- The intercept drone can be commanded without human intervention by using supervised autonomy, where autonomous seek-to-destination with obstacle avoidance is applied to the flight path.
- The ATAK tablet provides a user interface such that the drone gives back its understanding of the selected target, which is displayed on the ATAK tablet for the human to confirm and to control the mission, with a method of steering or aborting the process if necessary.
- Tracking is done with a camera, LIDAR, or other sensor 98, where the drone self-pilots to the location of the target, and the intercept drone 96 follows the motion of the target to update the destination GPS coordinates using on-board sensing or a pre-loaded model for obstacle detection.
- The drone detection system 500 can include the plurality of detection sensors 510 arranged to detect an object, or optionally a radar sensor or an acoustic sensor can be used to initially detect an object, more specifically the drone 92.
- The second, autonomous intercept drone 96 then uses its on-board sensing 98 (a camera or a range sensor being two examples) to follow the first drone 92 after it has been selected as the target.
- Referring now to FIG. 11, a computer 540 includes a processor 552, a volatile memory 554, a non-volatile memory 556 (e.g., a hard disk), and a user interface (UI) 558 (e.g., a graphical user interface, a mouse, a keyboard, a display, a touch screen, and so forth).
- The non-volatile memory 556 stores computer instructions 562, an operating system 566, and data 568.
- The computer instructions 562 are executed by the processor 552 out of the volatile memory 554 to perform all or part of the processes described herein.
- The processes and techniques described herein are not limited to use with the hardware and software of FIG. 11; they may find applicability in any computing or processing environment and with any type of machine or set of machines capable of running a computer program.
- The processes described herein may be implemented in hardware, software, or a combination of the two.
- The processes described herein may be implemented in computer programs executed on programmable computers/machines that each include a processor, a non-transitory machine-readable medium or other article of manufacture that is readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and one or more output devices.
- Program code may be applied to data entered using an input device to perform any of the processes described herein and to generate output information.
- The system may be implemented, at least in part, via a computer program product (e.g., in a non-transitory machine-readable storage medium such as, for example, a non-transitory computer-readable medium) for execution by, or to control the operation of, data processing apparatus (e.g., a programmable processor, a computer, or multiple computers).
- Each such program may be implemented in a high level procedural or object-oriented programming language to communicate with a computer system.
- Alternatively, the programs may be implemented in assembly or machine language.
- The language may be a compiled or an interpreted language, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- A computer program may be deployed to be executed on one computer or on multiple computers at one site, or distributed across multiple sites and interconnected by a communication network.
- A computer program may be stored on a non-transitory machine-readable medium that is readable by a general or special purpose programmable computer, for configuring and operating the computer when the non-transitory machine-readable medium is read by the computer to perform the processes described herein.
- The processes described herein may also be implemented as a non-transitory machine-readable storage medium configured with a computer program, where, upon execution, instructions in the computer program cause the computer to operate in accordance with the processes.
- A non-transitory machine-readable medium may include, but is not limited to, a hard drive, compact disc, flash memory, non-volatile memory, volatile memory, magnetic diskette, and so forth, but does not include a transitory signal per se.
- The processing blocks associated with implementing the system may be performed by one or more programmable processors executing one or more computer programs to perform the functions of the system. All or part of the system may be implemented as special purpose logic circuitry (e.g., an FPGA (field-programmable gate array) and/or an ASIC (application-specific integrated circuit)). All or part of the system may be implemented using electronic hardware circuitry that includes electronic devices such as, for example, at least one of a processor, a memory, a programmable logic device, or a logic gate.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Automation & Control Theory (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Geometry (AREA)
- Multimedia (AREA)
- Computer Graphics (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Astronomy & Astrophysics (AREA)
- Mechanical Engineering (AREA)
- Business, Economics & Management (AREA)
- Mathematical Physics (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Game Theory and Decision Science (AREA)
- Medical Informatics (AREA)
- Traffic Control Systems (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
A counter drone system includes a cueing sensor to detect the presence of an object, wherein the cueing sensor cues the presence of a target drone; a long range LIDAR system having a sensor pointed in the direction of the target drone to acquire and track the target drone at long range and provide an accurate location of it, wherein once a track is acquired the motion of the target drone is used to maintain the track of the target drone; and a threat detector, wherein LIDAR data is provided to the threat detector to determine if the target drone is a threat.
Description
- This application claims priority from U.S. Provisional Patent Application Ser. No. 62/364,368, filed on Jul. 20, 2016, and U.S. Provisional Patent Application Ser. No. 62/306,841, filed on Mar. 11, 2016, both of which are incorporated herein by reference in their entirety.
- The details of one or more embodiments of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the disclosure will be apparent from the description and drawings, and from the claims.
- Like reference symbols in the various drawings indicate like elements.
- The present disclosure describes techniques to use LIDAR as a sensor to track drones. Light detection and ranging (LIDAR) can be used to create three-dimensional (3D) imagery of a field of view. A LIDAR system includes a light source, such as a laser, that generates and directs pulses of light. The light pulses are reflected by the surface of various objects, such as the ground, a tree, or a building or an object in the air such as a drone. A sensor in the LIDAR system detects the reflections. The relative location of the reflecting surface can be determined by the lidar from the elapsed time from when the light pulse is generated and when it is detected. This cycle of pulse and detection may be repeated thousands of times per second. The coordinate frame of detection can be translated into another coordinate frame for display using common methods. The reflected light pulses are used to create a 3D image of the scanned area or field of view. An operator may then use pan and zoom commands to change the camera or sensor orientation and see different portions of the scanned area or field of view.
- A LIDAR has advantages over other sensors for tracking drones. Short range LIDARs (˜100 m) can interrogate all of their airspace and detect a drone, however the range of 100 meters has limited value. If we use a long range LIDAR (1000 m) however because of the narrow field of view, it is not practical for the long range LIDAR to do detection. Our disclosure uses a two-tiered approach of using an alerting system to cue the long range LIDAR so we may take advantage of the long range LIDAR. To make a long range LIDAR feasible we use a second sensor to alert (cue) that there is a drone present to track. The second sensor does not need to do a good job of long range tracking, it only needs to provide a small area to search with the long range LIDAR to find the drone. LIDAR also provides very precise three dimensional (3D) location information and is capable of detecting the physical presence of an object in most all lighting conditions. It doesn't require the drone to emit RF and it works if the drone is stationary or slow or fast or regardless of being close to the ground or high in the air.
- LIDAR has advantages over radar in that LIDAR allows for more accurate location and has a smaller spot size allowing for a more accurate image of a target to be formed.
- Referring now to FIG. 1, a drone detection system 100 (sometimes referred to as a counter drone system) is shown to include a plurality of detection sensors 110 arranged to detect an object, more specifically a drone 130. A detection processor 112 captures the existence of an object and cues the presence of a drone 130 to a tracking sensor 114, which acquires and tracks at long range the drone 130 using target tracker 116. An image of a target drone, once cued, can be fed to a target identifier 118 for target recognition, and the image 120 of the cued target can be displayed to an operator 122 on display 124 so the operator 122 can verify and analyze the cued target. The target tracker 116 also feeds the target tracks to a semi-autonomous response planner system 126 which, with inputs also from the operator 122, can determine countermeasures 128 appropriate for the cued target. For example, an interceptor drone 132 can be deployed.
- From the latter, it can be seen a counter drone system is provided wherein a cueing sensor provided by the detection sensors 110 is able to detect the presence of an object and cue the presence of a target drone. A long range LIDAR system provided by the tracking sensor 114 and the target tracker 116, with a sensor pointed in a direction of the target drone to acquire and track at long range the target drone, can provide an accurate location of the target drone; once a track is acquired, the motion of the target drone along with a Kalman Filter is used to maintain the track of the target drone. A threat detector provided by the target identifier 118 uses LIDAR data which is provided to the threat detector to determine if the target drone is a threat. Furthermore, countermeasures 128, in response to the operator 122 or the semi-autonomous response planner 126, can then be implemented to render useless the target drone when the target drone is identified as a nuisance or danger. Optionally, cameras can be aimed at the track as well. LIDAR (and optional camera) data is given to human operator 122 to determine threat vs. non-threat, or automated techniques can be used as well. Sensor fusion techniques can also be used to combine the camera and LIDAR data to assist in threat determination.
- A camera can be aimed toward the target to get further information about the target. Where to aim the camera can be based on the target tracker and knowledge about the camera coordinate frame and the tracker sensor coordinate frame, as to be discussed further herein below.
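- The disclosure invokes a Kalman Filter to maintain the track but does not spell out the filter equations. As one hedged sketch, a constant-velocity Kalman filter over 3D LIDAR position measurements could maintain such a track; the state layout, update interval, and noise values below are our assumptions, not values from the disclosure:

```python
import numpy as np

# Illustrative constant-velocity Kalman filter for maintaining a 3D track.
# State x = [px, py, pz, vx, vy, vz]; the LIDAR measures position only.
dt = 0.1                                       # assumed update interval (s)
F = np.eye(6); F[:3, 3:] = dt * np.eye(3)      # constant-velocity motion model
H = np.hstack([np.eye(3), np.zeros((3, 3))])   # measurement picks off position
Q = 0.5 * np.eye(6)                            # assumed process noise
R = 0.04 * np.eye(3)                           # assumed ~20 cm measurement noise

def predict(x, P):
    """Carry the track forward between detections."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Fold a new 3D LIDAR detection z into the track."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    return x + K @ (z - H @ x), (np.eye(6) - K @ H) @ P

x, P = np.zeros(6), 100.0 * np.eye(6)          # weak prior before the first cue
for z in (np.array([100.0, 50.0, 30.0]), np.array([100.5, 50.2, 30.1])):
    x, P = predict(x, P)
    x, P = update(x, P, z)
print(x[:3])                                   # current position estimate
```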
- It should be appreciated drone detection and tracking is accomplished wherein one sensor 110, being a LIDAR (or alternatively acoustics, infrared, etc.), cues the presence but not the high resolution location of a drone, and a LIDAR tracking sensor 114 (flash, Geiger mode, line scanning) is aimed to acquire and track at long range the target to provide an accurate location. Once the track is acquired, the sensing of the target, and the prediction of the motion of the target using standard means (such as a Kalman Filter), is used to maintain the track of the target.
- It should be understood any line scanning LIDAR is a suitable cuing sensor. Examples include a Quanergy M8 or a Velodyne VLP16. This is configured as a light fence facing upward and is described in FIG. 2.
- A line scanning LIDAR such as a Velodyne VLP-16 or similar can be configured as an upwards facing light fence. An object that breaks the light fence will be registered by the LIDAR, and its location can be translated into a coordinate and, in the case of a multibeam LIDAR, a vector. This defines a search space for the long range LIDAR to hunt for the object that has broken the light fence. Several such LIDARs configured as a light fence may be networked together to form a perimeter around a location to protect the location, such as the White House, an airport, or a prison. Acoustic sensor systems could also be used to cue the sensor; in this case the audible signature of the drone is detected by a microphone array and translated into an approximate location. Similarly, a radar could be used to cue the sensor.
- Once cued, the long range LIDAR will “hunt” for a flying object, defined as an object that is off the ground and in open space previously known to have been empty. If the object is moving it is tracked. If the object is stationary it is observed as stationary.
- It should be understood the response planner 126 will do the following tasks when an object is observed:
- Display the raw lidar data 24 (FIG. 3) to a human operator.
- Aim a camera at the location of the target and present the operator with the camera view 22 (FIG. 3).
- Plan an intercept course for the intercept asset to the object based on its trajectory (FIG. 3A); a sketch of this computation follows the list.
- If authorized, launch the intercept drone. This action is of “low regret” because the operator can still override the interceptor; however, this allows the interceptor to close range on the target.
- The response planner can also take into consideration the 3D site models as described herein.
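- By way of a non-limiting sketch of the intercept-course planning of FIG. 3A: if the target is assumed to hold a constant velocity and the interceptor a constant speed, the earliest meeting point has a closed form. The function below is our illustration; none of its names come from the disclosure.

```python
import numpy as np

def intercept_point(target_pos, target_vel, interceptor_pos, interceptor_speed):
    """Earliest point where a constant-speed interceptor can meet a target
    flying at constant velocity. Solves |r + v*t| = s*t for the smallest t > 0."""
    r = np.asarray(target_pos, float) - np.asarray(interceptor_pos, float)
    v = np.asarray(target_vel, float)
    a = v @ v - interceptor_speed ** 2
    b = 2.0 * (r @ v)
    c = r @ r
    if abs(a) < 1e-9:                        # speeds match: equation is linear
        ts = [-c / b] if abs(b) > 1e-9 else []
    else:
        disc = b * b - 4.0 * a * c
        if disc < 0.0:
            return None                      # interceptor cannot catch the target
        root = np.sqrt(disc)
        ts = [(-b - root) / (2 * a), (-b + root) / (2 * a)]
    ts = [t for t in ts if t > 0.0]
    if not ts:
        return None
    return np.asarray(target_pos, float) + v * min(ts)

# Example: aim the interceptor where the target will be, not where it is now.
aim = intercept_point([500, 300, 80], [-10, -5, 0], [0, 0, 0], 25.0)
```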
- Referring now to FIG. 2, a drone cuing system 210 is shown having a plurality of LIDAR sensors 10. LIDAR sensor 10 in one embodiment provides 16 beams and has a range of approximately 100 meters. LIDAR sensor 10 is disposed so that the beams 12 are pointed upward such that the beams 12 can detect an object, here drone 14, when the drone 14 enters the range of the LIDAR sensor 10. The LIDAR sensor 10 is disposed on a surface, and when the beam 12 is scanned from one horizon into the air to the other horizon it creates a fan 16 that interrogates the air space within the range of the LIDAR sensor 10. A plurality of sensors 10 can be arranged along a line and networked together to provide a light-fence 18. By then disposing a plurality of light fences 18 around an area to be protected, a fence can be created to detect objects entering the light fence 18. With such an arrangement, a detection system 210 for a drone detection and tracking system for cuing a tracking system is provided where a line scanning LIDAR is pointed upward to make a light-fence, and objects detected by the light-fence can be used to cue a tracker. Several such light-fence sections can be established together around a perimeter of an asset to establish a light fence around the asset. The inbound vector of an object can be given to a second LIDAR (flash, Geiger mode, line scanning) that is aimed to acquire and track the target to provide an accurate location. Once the track is acquired, the motion of the drone is used as input to maintain the track of the target.
- From the latter, it can be seen a system according to the disclosure includes a three dimensional line-scanner LIDAR sensor disposed on a side to provide a set of fanned beams that travel from one horizon into the air to the other horizon to detect an object and create a track for the object, and a long range sensor can be provided to track the object detected by the line-scanner LIDAR sensor in response to an initial track of the object created by the line-scanner LIDAR sensor.
- As described above, a system can be alerted when a drone is flying through a vertical plane, and interested parties are alerted when a drone is invading their space. By putting a line-scanning LIDAR on its side, a set of fanned beams is created that goes from one horizon, into the air, and to the other horizon (left, up, right). Anything flying through these beams can be detected, and a track can be established. This becomes a detection system that can cue another sensor like a long range LIDAR. By surrounding a valuable object (or location) with a light fence, an alert can be provided whenever something flies into the monitored airspace. The system can be used to alert a long range LIDAR to the presence of a drone so that the long range LIDAR can track it. Because of its narrow field of view, it is not practical for the long range LIDAR to do detection. The light fence provides a technique for detection and provides an accurate location where the long range LIDAR should look.
- It should be understood a line-scanning LIDAR is available from several vendors, including models such as a Velodyne VLP16, HDL32, SICK LMS 111, or a Quanergy M8. It should also be understood that the concept of a light fence is well known in the art. In general, to make a light fence: turn on the LIDAR and take a few scans so the LIDAR learns the expected return ranges for all beams at all angles. For example, at 132 degrees the light may travel 30 meters before reflecting off a branch; we know 0-30 meters is open space because the beam reflected back at 30 meters. At 140 degrees there may be no return because the beam went up in the air and nothing reflected back. This profile is stored for each beam. When watching the fence, you are looking for deviation from the expected pattern. If at 132 degrees there is a return at 18 meters, something has broken the open space and blocked the beam before the expected 30 meter range. If at 140 degrees there is a return at 93 meters, then an object has appeared at 93 meters in what was previously open air. If the LIDAR has multiple beams, several such breaks in different beams will establish a vector. By networking the LIDAR sensors together, a fence can be created to detect when an object penetrates the fence. Networking the LIDARs together is nothing more than turning them all on with appropriate power and data connections; they do not need to know about each other and can all operate independently. To form a coordinate system around these sensors they need to be surveyed in, so that beam breakages can be translated into a global coordinate frame such as GPS.
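- The baseline-then-deviation procedure above lends itself to a short sketch. Here a scan is assumed to be a map from (beam, angle) to the range of the first return, with None meaning the beam went unanswered into open air; the tolerance value is our assumption:

```python
# Hedged illustration of the light fence logic described above. A scan is
# assumed to be a dict mapping (beam_id, angle_deg) -> range in meters of the
# first return, or None when the beam went up into open air unanswered.

TOLERANCE_M = 1.0   # assumed allowance for sensor noise and swaying branches

def learn_baseline(scans):
    """Average the expected return range for every (beam, angle) over a few scans."""
    sums, counts = {}, {}
    for scan in scans:
        for key, rng in scan.items():
            if rng is not None:
                sums[key] = sums.get(key, 0.0) + rng
                counts[key] = counts.get(key, 0) + 1
    return {key: sums[key] / counts[key] for key in sums}   # absent => open air

def fence_breaks(baseline, scan):
    """Every return that blocks known open space or appears in open air."""
    breaks = []
    for (beam, angle), rng in scan.items():
        if rng is None:
            continue
        expected = baseline.get((beam, angle))
        if expected is None or rng < expected - TOLERANCE_M:
            breaks.append((beam, angle, rng))
    return breaks

# Breaks seen across several beams of a multibeam LIDAR can be fit to a line,
# handing the long range LIDAR an approximate inbound vector to search.
```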
- Referring now also to FIG. 1, it should now be appreciated that for the fan LIDAR, the detection is made by an object flying through the fan. The drone cuing system 210 gives best vector information to the tracking sensor 114 and target tracker (tracking controller) 116. The tracking controller 116 aims the flash LIDAR to the predicted track location and starts hunting for the object in the sky. An object is segmented from background by being in open air. The object is tracked in the LIDAR frame using existing LIDAR tracking code, and the tracking information is fed back into tracking controller 116. Optionally, cameras can be aimed at the track as well. LIDAR (and optional camera) data is given to human operator 122 to determine threat vs. non-threat, or automated techniques can be used as well. Sensor fusion techniques can also be used.
- Referring now to FIG. 3, a tracking LIDAR 20 is shown where an ASC Tiger Cub Flash LIDAR emits a flash of laser light and uses a CCD to capture range information. The field of view is narrow, like that of a camera. The tracking LIDAR 20 pointed toward a target 26 will return an image 28 of the target 26. The range can be up to 1 km. At 1 km, pixels are about 20 cm; at 500 m, pixels are about 10 cm; and at 100 m, pixels are about 2 cm. A given inbound track from a cuing detection system 210 provides the location information of a target drone such that a tracking LIDAR 20 can scan the sky on a pan/tilt head to find a UAV or drone. Once a UAV is found, the tracking LIDAR 20 can track the UAV, provide 3D coordinates for counter measures, provide a 3D model of the object for classification, and give a clean view of an object to an operator for a go/no-go decision.
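- The quoted pixel sizes are consistent with a footprint that grows linearly with range, about 0.2 milliradian per pixel; the angular figure below is inferred by us from the quoted numbers rather than stated in the disclosure:

```python
# Footprint per pixel grows linearly with range: footprint = range * angle.
# The ~0.2 mrad per-pixel angle is inferred from "20 cm at 1 km"; it is not
# a published specification of the sensor.
PIXEL_ANGLE_RAD = 0.2e-3

for range_m in (1000, 500, 100):
    print(f"{range_m} m -> {PIXEL_ANGLE_RAD * range_m * 100:.0f} cm per pixel")
# 1000 m -> 20 cm, 500 m -> 10 cm, 100 m -> 2 cm
```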
- Referring now to FIG. 4, a drone detection and tracking system 200 includes an early detection system provided by drone cuing system 210 for detecting the presence of a drone. The cue sensor 10 facing upward uses its modality to detect the presence of drones. It should be appreciated that the initial detectors could be acoustic, infrared, radar or other sensors, but here we are describing a LIDAR sensor. For the fan LIDAR sensor, the detection is made by an object flying through the fan. The cue sensor 10 from the early detection system 210 gives best vector information to long range tracker 220. The long range tracker 220 aims the flash LIDAR at an object 222 at a predicted track location and starts hunting for the object 222 in the sky. The object 222 is segmented from background by being in open air. The object 222 is tracked in the LIDAR frame using existing LIDAR tracking code, and the tracking information is fed back into the tracking controller of long range tracker 220. Optionally, cameras can be aimed at the track as well. As described with FIG. 1, LIDAR (and optional camera) data is given to a human operator 122 to determine threat vs. non-threat, and automated techniques can be used as well. Sensor fusion techniques can also be used.
- Referring now to FIG. 5, a drone detection and tracking system 300 is shown where a line scanning LIDAR 310 is pointed upward to make a light-fence, and flying entities that fly through the light fence establish an inbound vector. Several such light-fence sections can be established together around the perimeter of an asset. The inbound vector is given to a second LIDAR 320 (flash, Geiger mode, line scanning) that is aimed to acquire and track the target to provide an accurate location. Once the track is acquired, the motion of the drone is used as input to maintain the track of the target. As to be described hereinafter in connection with FIG. 8, one or more intercept drones are then tasked to the location of the first drone, carrying a counter measure device such as a localized jammer, net, or net gun, based on the track from the ground based system.
- Referring now to FIGS. 6 and 6A, a drone detection system 400 is shown to include a plurality of detection sensors 410 arranged to detect an object, more specifically a drone 440. A detection processor 412 captures the existence of an object and cues the presence of a drone to a tracking sensor 414, which acquires and tracks at long range the drone using target tracker 416. An image of a target drone, once cued and tracked, can be fed to a target identifier 418 for target recognition, and the image 420 of the target is displayed to an operator 422 on display 424 so the operator 422 can verify and analyze the target. The target tracker 416 also feeds the target tracks to a semi-autonomous response planner system 426 which, with inputs also from the operator 422, can determine countermeasures 428 appropriate for the target. The drone detection system 400 also includes a system 430 for creating a three dimensional model of an environment wherein the detection sensors 410 and the tracking sensors 414 with the target identifier 418 provide a scanning system to scan an environment to provide an image 434 of the scanned environment, a geo-locator 452 is used to tag a plurality of points within the image with geo-reference points, and a labeler 454 is used to label features of interest within the image and to identify possible access paths within the features of interest potentially providing an access path for a target drone. Furthermore, a real-time pedestrian model system 432 is provided to track locations of pedestrians in an environment 436. It should be noted the environment 436 can include a portion of the image 434, include all of the image 434, or include more than the environment captured by image 434.
- It should be appreciated surveying a site by LIDAR to create a 3D model of the environment can be used as input for: a) explaining false positives when detecting and tracking drones, b) calculating fields of view when detecting and tracking drones, c) optimizing countermeasures for drones, and d) planning routes for countermeasures for drones. Using known methods, a 3D scan of the environment is made, producing a detailed point cloud of fixed objects, and points are geo-referenced in this model. The model gets loaded into command and control software. The command and control software is written to use this model when planning way points for interception by avoiding objects that are possible collisions (e.g., trees) without requiring on board sensing. The model is used when reading and considering new tracks (from LIDAR or other sensors (e.g., radar, acoustics)) to determine if the location of a new track is likely to really be from noise (traffic, waving flag, fireworks, . . . ) or in fact a potential target. The model is used when evaluating blind spots of the system for deployed sensors by placing their location and field of view into the model and tracing their field of view for intersections with fixed objects in the model (buildings, trees). The model is used when deciding the windows of opportunity for counter measures and prioritizing their use by considering how long a window of opportunity to intercept is possible, whether there is collateral damage (pedestrians), the chance of interference (radio tower, multi-path off a building), etc., based on modality (jamming, projectile, etc.).
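- The blind-spot evaluation just described can be pictured as ray tracing sensor bearings through a voxelized site model; the grid layout, step size, and names below are our assumptions, not the disclosure's implementation:

```python
import numpy as np

def visible_range(sensor_pos, bearing, occupied, max_range_m=1000.0, step_m=1.0):
    """March a ray from the sensor through a voxel set of fixed objects
    (assumed 1 m cells keyed by integer (i, j, k)); return the distance at
    which the bearing is cut off, or max_range_m if it stays clear."""
    d = np.asarray(bearing, float)
    d = d / np.linalg.norm(d)
    pos = np.asarray(sensor_pos, float)
    dist = 0.0
    while dist < max_range_m:
        pos = pos + d * step_m
        dist += step_m
        if tuple(np.floor(pos).astype(int)) in occupied:
            return dist          # blocked here; airspace beyond is a blind spot
    return max_range_m

# Sweeping visible_range over a sensor's bearings yields a coverage map:
# bearings that terminate early mark airspace that sensor cannot watch.
```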
- It should now be appreciated the system 400 can create a 3D model of the environment (buildings, trees, roads, parking lots, etc.) and use the context of the world to perform better tracking, better false positive rejection, better intercept planning, obstacle avoidance for the intercept vehicle, and better site preparation for a counter drone detection, tracking and intercepting platform, for example, as shown in FIG. 6B. Using known methods, the system 400 can make a 3D scan of the environment producing a detailed point cloud of fixed objects, and the objects are geo-referenced in this 3D scan model.
- Referring now to FIG. 6A, a system for creating a three dimensional model of an environment includes the 3D scene model 430, where a LIDAR scanning system to scan an environment provides an image of the scanned environment which is stored as data 450, a geo-locator 452 is used to tag a plurality of points within the image with geo-reference points, and a labeler 454 is used to label features of interest within the image and to identify possible access paths within the features of interest potentially providing an access path for a target drone.
- To implement the described technique, the system 400 scans the environment with a LIDAR detection sensor. This can be done by an aerial platform, mobile mapping platform, or a stationary platform using detection sensors 410. See for example the images of the scenes in FIG. 6B or FIG. 6C. Next, the system 400 geo-references the points in the scene with GPS using known techniques; this is common practice. The system 400 will next label the scene with features of interest. Examples include: roads (roads have cars, cars move); trees (trees sway in the wind, move slightly; trees are obstacles to avoid with drones); buildings (buildings are high value items we don't want to hurt); areas with people (areas where we want to avoid collateral damage); and other features of interest can be considered. Labeling of this data could be done by hand, by automated methods, or by geo-referencing other data sources. Having the latter information available, a mission planner can now consider placement of assets in the model as well as predict where enemy drones may come from. A mission planner can consider windows of opportunity for counter measures and analyze blind spots. The mission planner can analyze areas where false positives (birds, for example) may come from. By playing what-if scenarios, the mission planner can come up with a better placement of assets to protect what needs protection. Furthermore, having the latter information available, when a track is first discovered, the mission planner can consider the likelihood it is a false track by where it originated from; for example, if it came from a tree, there is a possibility it may be a bird. Other sensor reports can be considered. Acoustic solution detections can be evaluated in the model, as well as radar detections, if desired. Radars may produce false tracks off cars, etc. More false positives can be rejected by understanding where the false positives are originating from. With such a technique, when a track is validated as a threat and countermeasures will be launched, the mission planner can use the 3D model to plan which countermeasure can or should be launched, and determine when an opportunity to intercept is most likely. If the selected countermeasure is deploying another drone, the mission planner can pilot an intercept drone around obstacles because the obstacles have been mapped in the scene a priori. Also with such a technique, if a counter measure may do collateral damage (cause debris to fall, overshoot, jam RF in a cone, etc.), the mission planner can plan the best opportunity for minimal collateral damage because a 3D model of the scene is available. The mission planner can compute the firing angles, debris patterns, and effects of range of various systems and choose to engage at a time and place likely to cause the least damage.
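- As a hedged sketch of the false-track reasoning above, the origin of a new track can be looked up in the labeled scene and scored; the labels and weights below are invented for illustration and are not values given in the disclosure:

```python
# Score how likely a new track is a false positive from where it originated
# in the labeled 3D scene. Labels and weights are our inventions.
FALSE_POSITIVE_WEIGHT = {
    "tree": 0.8,        # birds launch from trees
    "road": 0.7,        # radar ghosts off moving cars
    "building": 0.3,    # rooftop clutter, waving flags
    "open_field": 0.1,
}

def label_at(scene_labels, origin, radius_m=5.0):
    """Nearest labeled feature within radius_m; scene_labels maps (x, y, z) -> label."""
    best, best_d = None, radius_m
    for (x, y, z), label in scene_labels.items():
        d = ((x - origin[0])**2 + (y - origin[1])**2 + (z - origin[2])**2) ** 0.5
        if d < best_d:
            best, best_d = label, d
    return best

def false_positive_score(scene_labels, track_origin):
    label = label_at(scene_labels, track_origin)
    return FALSE_POSITIVE_WEIGHT.get(label, 0.2)   # default for unlabeled space

# A high-scoring track earns extra scrutiny (or a second sensor look) before
# any countermeasure is considered.
```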
- Referring now to FIGS. 6C and 6D, examples of a three dimensional scene are shown. As described above, a geo-locator is used to tag a plurality of points within the image with geo-reference points, and a labeler is used to label features of interest within the image and to identify possible access paths within the features of interest.
- Referring now to FIG. 7, a screen shot 700 of a computer screen with an example of a three dimensional scene 702 is shown. As described above, a geo-locator is used to tag a plurality of points within the image with geo-reference points, and a labeler is used to label features of interest.
- Referring now to FIG. 8, to implement countermeasures to render a target drone 80 useless when the drone is identified as a nuisance or danger, an intercept drone 82 can be deployed. In certain embodiments, the intercept drone has on-board GPS and the ability to fly to GPS locations; the intercept drone can receive waypoints by radio; the tracking system is surveyed in so its GPS location is known; the tracking system can translate a track into GPS coordinates; and the intercept drone is commanded over a radio link to fly to GPS coordinates to put it in range of a tracked target. Coordinates may be an offset or a projection from the tracked target (above, ahead, below, etc.). Triggering of a counter measure (jammer, net, etc.) can be done automatically or by a human pressing a button. To speed deployment, an intercept drone 82 may be stationed at high altitude (~400 ft) by tether to ground power, allowing it to stay in place 24 hrs/day until needed to deploy, dropping the ground tether and intercepting from above.
- Referring now to FIG. 9, a drone detection system 500 is shown to include a plurality of detection sensors 510 arranged to detect an object, more specifically a drone 502. A detection processor 512 captures the existence of the drone 502 and cues the presence of the drone 502 to a tracking sensor 514, which acquires and tracks at long range the drone using target tracker 516. Target tracker 516 provides the track and the tracking sensor 514's GPS coordinates to command and control processor 520, which in turn translates the track from sensor coordinates into GPS coordinates and provides GPS coordinates of the drone 502 to ground control station 522. Alternatively, the control processor, knowing the current location of the intercept drone 526 in GPS coordinates through the ground control station 522, can determine a proper intercept course for the intercept drone 526 and command the velocity and vector of travel for intercept drone 526 to intercept the drone 502. The ground control station 522 then provides controls to drone controller 524, which controls an intercept drone 526. The ground control station 522 also provides image data to a tablet 528 such as an Android Tactical Assault Kit (ATAK) tablet 528. The intercept drone 526 has an on-board GPS receiver and the ability to fly to GPS locations, and can receive waypoints by radio. The tracking system is surveyed in so its GPS location is known. The tracking system can translate a track into GPS coordinates. The intercept drone 526 is commanded over radio link 530 to fly to GPS coordinates to put it in range of the tracked target. Coordinates may be an offset or a projection from tracked target drone 502 (above, ahead, below, etc.). The command and control processor 520 or the ground control station 522 can then trigger a counter measure (jammer, net, etc.) initiated automatically or by a human pressing a button.
- With such an arrangement, a high powered intercept drone can be flown under supervised autonomy of a system that is tracking a threat drone with a long range LIDAR. The supervised autonomy is performed by processing the detection and tracking information, and sending command instructions to the intercept drone to fly to the location of the threat drone. The location of the threat drone is updated by the tracking performed by the long range LIDAR. The intercept drone can carry any of a number of payloads that are appropriate to disable the threat when in sufficient range. The present approach will allow the intercept drone to carry many different kinds of packages into close range of the target drone. By waiting until close range to the target before using a counter measure, collateral damage can be minimized and jamming ranges can be reduced to a few feet. By using an intercept drone, a human operator can safely abort the intercept after launch. By using a long range LIDAR, the intercept drone can be controlled at far ranges and maintain an accurate track of the target drone.
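- The sensor-coordinate to GPS translation performed by the command and control processor 520 can be pictured as below. We assume the tracker frame is a local east-north-up (ENU) frame surveyed in at a known origin, and use a flat-earth approximation that is adequate over LIDAR ranges of about a kilometer; the example values are arbitrary:

```python
import math

# Hedged sketch of translating a track from the surveyed tracker's local
# east-north-up (ENU) frame into GPS coordinates.
EARTH_RADIUS_M = 6_378_137.0

def enu_track_to_gps(origin_lat, origin_lon, origin_alt, east_m, north_m, up_m):
    """Flat-earth conversion of an ENU offset at a surveyed origin to lat/lon/alt."""
    lat = origin_lat + math.degrees(north_m / EARTH_RADIUS_M)
    lon = origin_lon + math.degrees(
        east_m / (EARTH_RADIUS_M * math.cos(math.radians(origin_lat))))
    return lat, lon, origin_alt + up_m

# Waypoint offset from the tracked target ("above, ahead, below, etc."); the
# origin and the 10 m offset are example values, not from the disclosure.
lat, lon, alt = enu_track_to_gps(40.0, -105.0, 1600.0,
                                 east_m=250.0, north_m=400.0, up_m=120.0)
waypoint = (lat, lon, alt + 10.0)   # command the intercept drone above the target
```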
- Referring now to FIG. 10, drone tracking is accomplished where a human 90 designates a target drone 92 either by a pointing device (laser designator) 94 or an approximate coordinate, and a second autonomous intercept drone 96 uses on board sensing (a camera or range sensor being two examples) 98 to follow the first drone 92 after it has been selected as the target. This is designed to allow a “first responder” (probably two working as a team) to get rid of a nuisance drone. The method of disabling the target is left open, as many different payloads could exist. Target selection can be done by illumination using the pointing device 94, or by giving an approximate GPS location (via an ATAK tablet 91 or any available user device such as a smart phone or computer). The intercept drone 96 gives back its understanding of the selected target, which is displayed on ATAK tablet 91 for human 90 to confirm. The intercept drone 96 then self pilots to the location of the target. The intercept drone 96 follows the motion of the target drone 92 to update destination GPS coordinates. Tracking can be done with a camera, LIDAR, or other sensor, and the intercept drone 96 can use on board sensing or a pre-loaded model for obstacle detection, to include stereo vision, radar, lidar, ultrasound or the like. An inner loop of next step GPS way points, or a series of thrust commands in a vector, is given to a flight controller (standard robotics practice) to facilitate the course of flight. Bearing and range can be used to project the next waypoint, and the status is updated on ATAK tablet 91. The human 90 (user) is allowed to pause, abort, aid the system, or trigger an onboard counter measure (net, jammer, etc.).
- With such an arrangement, an indication of the target is provided with the pointing device (laser designator or drawing on a screen) and feedback is given to the user by communicating back to a tablet. The intercept drone can be commanded without human intervention (self propelled) by using supervised autonomy, where autonomous seek-to-destination with obstacle avoidance is applied to the flight path. The ATAK tablet provides a user interface such that the drone gives back its understanding of the selected target, which is displayed on the ATAK tablet for the human to confirm and to control the mission with a method of steering or aborting the process, if necessary. As described above, tracking is done with a camera, LIDAR, or other sensor 98, where the drone self pilots to the location of the target and the intercept drone 96 follows the motion of the target to update the destination GPS coordinate using on board sensing or a pre-loaded model for obstacle detection.
- Referring again also to FIG. 9, instead of a human providing the initial targeting as shown in FIG. 10, the drone detection system 500 can include the plurality of detection sensors 510 arranged to detect an object, or optionally a radar sensor or an acoustical sensor can be used to initially detect an object, more specifically the drone 92. Once it is initially detected, the second autonomous intercept drone 96 uses on board sensing (a camera or range sensor being two examples) 98 to follow the first drone 92 after it has been selected as the target.
- Referring to FIG. 11, a computer 540 includes a processor 552, a volatile memory 554, a non-volatile memory 556 (e.g., hard disk) and a user interface (UI) 558 (e.g., a graphical user interface, a mouse, a keyboard, a display, touch screen and so forth). The non-volatile memory 556 stores computer instructions 562, an operating system 566 and data 568. In one example, the computer instructions 562 are executed by the processor 552 out of volatile memory 554 to perform all or part of the processes described herein.
- The processes and techniques described herein are not limited to use with the hardware and software of FIG. 11; they may find applicability in any computing or processing environment and with any type of machine or set of machines that is capable of running a computer program. The processes described herein may be implemented in hardware, software, or a combination of the two. The processes described herein may be implemented in computer programs executed on programmable computers/machines that each include a processor, a non-transitory machine-readable medium or other article of manufacture that is readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and one or more output devices. Program code may be applied to data entered using an input device to perform any of the processes described herein and to generate output information.
- The system may be implemented, at least in part, via a computer program product (e.g., in a non-transitory machine-readable storage medium such as, for example, a non-transitory computer-readable medium), for execution by, or to control the operation of, data processing apparatus (e.g., a programmable processor, a computer, or multiple computers). Each such program may be implemented in a high level procedural or object-oriented programming language to communicate with a computer system. However, the programs may be implemented in assembly or machine language. The language may be a compiled or an interpreted language, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program may be deployed to be executed on one computer or on multiple computers at one site, or distributed across multiple sites and interconnected by a communication network. A computer program may be stored on a non-transitory machine-readable medium that is readable by a general or special purpose programmable computer for configuring and operating the computer when the non-transitory machine-readable medium is read by the computer to perform the processes described herein. For example, the processes described herein may also be implemented as a non-transitory machine-readable storage medium, configured with a computer program, where upon execution, instructions in the computer program cause the computer to operate in accordance with the processes. A non-transitory machine-readable medium may include but is not limited to a hard drive, compact disc, flash memory, non-volatile memory, volatile memory, magnetic diskette and so forth, but does not include a transitory signal per se.
- The processes described herein are not limited to the specific examples described. Rather, any of the processing blocks as described above may be re-ordered, combined or removed, performed in parallel or in serial, as necessary, to achieve the results set forth above.
- The processing blocks associated with implementing the system may be performed by one or more programmable processors executing one or more computer programs to perform the functions of the system. All or part of the system may be implemented as special purpose logic circuitry (e.g., an FPGA (field-programmable gate array) and/or an ASIC (application-specific integrated circuit)). All or part of the system may be implemented using electronic hardware circuitry that includes electronic devices such as, for example, at least one of a processor, a memory, a programmable logic device or a logic gate.
- Elements of different embodiments described herein may be combined to form other embodiments not specifically set forth above. Other embodiments not specifically described herein are also within the scope of the following claims.
Claims (20)
1. A counter drone system comprising:
a cueing sensor to detect the presence of an object wherein the cueing sensor cues the presence of a target drone;
a long range LIDAR system having a sensor pointed in a direction of the target drone to acquire and track at long range the target drone to provide an accurate location of the target drone wherein once a track is acquired, the motion of the target drone is used to maintain the track of the target drone; and
a threat detector wherein LIDAR data is provided to the threat detector to determine if the target drone is a threat.
2. The counter drone system as recited in claim 1 comprising a semi-autonomous response planner system with inputs from the threat detector and an operator to determine countermeasures appropriate for the cued target drone.
3. The counter drone system as recited in claim 1 wherein the cueing sensor comprises a LIDAR to cue the presence of the target drone but not a high resolution location of the target drone.
4. The counter drone system as recited in claim 1 wherein the long range LIDAR system comprises a LIDAR tracking sensor to acquire and track at long range the target drone to provide an accurate location in response to the cueing sensor detecting the initial presence of the target drone.
5. The counter drone system as recited in claim 1 wherein the cueing sensor is a line scanning LIDAR.
6. The system as recited in claim 2 wherein a break in the light fence provides initial vector information to a tracking sensor.
7. The system as recited in claim 1 wherein the plurality of three dimensional line-scanner LIDAR sensors provide an initial detection of an object and triggers a long range sensor to acquire and track the object detected by at least one of the plurality of three dimensional line-scanner LIDAR sensors.
8. The counter drone system as recited in claim 1 further comprising a camera to be aimed toward the target drone to get further information about the target drone.
9. The counter drone system as recited in claim 1 wherein the cueing sensor is selected from one of a LIDAR sensor, an acoustic sensor, a radar sensor or an infrared sensor.
10. The counter drone system as recited in claim 1 wherein the LIDAR sensor is selected from one of a flash LIDAR sensor, a Geiger mode LIDAR sensor or a line scanning LIDAR sensor.
11. The counter drone system as recited in claim 1 wherein the threat detector comprises:
a target identifier for target recognition with an image of the target drone; and
a display to display the image of the target drone to an operator.
12. The counter drone system as recited in claim 2 wherein the semi-autonomous response planner system provides raw lidar data to a human operator.
13. The counter drone system as recited in claim 2 wherein the semi-autonomous response planner system comprises a planning module to plan an intercept course for an intercept drone to the target drone.
14. A method comprising:
detecting a presence of a target drone using a cueing sensor;
acquiring, in response to initial data from the cueing sensor, a target drone and then tracking the target drone using a long range LIDAR system to provide an accurate location of the target drone and to maintain the track of the target drone; and
providing LIDAR data to a threat detector to determine if the target drone is a threat.
15. The method as recited in claim 14 comprising determining appropriate countermeasures using a semi-autonomous response planner system with inputs from the threat detector and an operator to determine countermeasures appropriate for the cued target drone.
16. The method as recited in claim 14 comprising aiming a camera toward the target drone to get further information about the target drone.
17. The method as recited in claim 14 comprising providing raw lidar data to a human operator for analysis by the operator.
18. The method as recited in claim 14 comprising planning an intercept course for an intercept drone to the target drone.
19. The method as recited in claim 14 wherein the cueing sensor comprises a LIDAR to cue the presence of the target drone but not a high resolution location of the target drone.
20. A system, comprising circuitry configured to:
detect a presence of a target drone using a cueing sensor;
acquire, in response to initial data from the cueing sensor, a target drone and then track the target drone using a long range LIDAR system to provide an accurate location of the target drone and to maintain the track of the target drone;
provide LIDAR data to a threat detector to determine if the target drone is a threat; and
determine appropriate countermeasures using a semi-autonomous response planner system with inputs from the threat detector and an operator to determine countermeasures appropriate for the cued target drone.
Cited By (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160284221A1 (en) * | 2013-05-08 | 2016-09-29 | Matternet, Inc. | Route planning for unmanned aerial vehicles |
US9835709B2 (en) * | 2016-02-02 | 2017-12-05 | Bao Tran | Systems and methods for permission based control of robots |
US10156631B2 (en) | 2014-12-19 | 2018-12-18 | Xidrone Systems, Inc. | Deterrent for unmanned aerial systems |
US10281570B2 (en) | 2014-12-19 | 2019-05-07 | Xidrone Systems, Inc. | Systems and methods for detecting, tracking and identifying small unmanned systems such as drones |
US10291348B2 (en) * | 2016-02-02 | 2019-05-14 | Bao Tran | Systems and methods for control of drones |
US10324527B2 (en) | 2014-01-24 | 2019-06-18 | Tobii Ab | Gaze driven interaction for a vehicle |
WO2019161076A1 (en) | 2018-02-19 | 2019-08-22 | Digital Global Systems, Inc. | Systems, methods, and devices for unmanned vehicle detection and threat management |
US10720068B2 (en) | 2012-05-09 | 2020-07-21 | Singularity University | Transportation using network of unmanned aerial vehicles |
WO2020236238A1 (en) * | 2019-05-17 | 2020-11-26 | Anduril Industries Inc. | Counter drone system |
US10907940B1 (en) | 2017-12-12 | 2021-02-02 | Xidrone Systems, Inc. | Deterrent for unmanned aerial systems using data mining and/or machine learning for improved target detection and classification |
US20220172380A1 (en) * | 2019-04-08 | 2022-06-02 | Shenzhen Vision Power Technology Co., Ltd. | Three-dimensional light field technology-based optical unmanned aerial vehicle monitoring system |
US11521498B2 (en) | 2017-01-23 | 2022-12-06 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
WO2022260712A1 (en) | 2021-06-09 | 2022-12-15 | Raytheon Company | Method and flexible apparatus permitting advanced radar signal processing, tracking, and classification/identification design and evaluation using single unmanned air surveillance (uas) device |
US11549976B2 (en) | 2017-01-23 | 2023-01-10 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within a spectrum |
US11558764B2 (en) | 2013-03-15 | 2023-01-17 | Digital Global Systems, Inc. | Systems, methods, and devices having databases for electronic spectrum management |
US11588562B2 (en) | 2013-03-15 | 2023-02-21 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
DE102021209154A1 (en) | 2021-08-20 | 2023-02-23 | Atlas Elektronik Gmbh | Effective target engagement by a military facility having a first unclassified information network and a second classified information network |
WO2023021094A1 (en) | 2021-08-20 | 2023-02-23 | Atlas Elektronik Gmbh | Effective target engagement using a military device comprising a first network for non-classified information and a second network for classified information |
US11601833B2 (en) | 2013-03-15 | 2023-03-07 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection with temporal feature extraction within a spectrum |
US11617089B2 (en) | 2013-03-15 | 2023-03-28 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US11622170B2 (en) | 2017-01-23 | 2023-04-04 | Digital Global Systems, Inc. | Systems, methods, and devices for unmanned vehicle detection |
US11647409B2 (en) | 2013-03-15 | 2023-05-09 | Digital Global Systems, Inc. | Systems, methods, and devices having databases and automated reports for electronic spectrum management |
US11646918B2 (en) | 2013-03-15 | 2023-05-09 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying open space |
US11653236B2 (en) | 2013-03-15 | 2023-05-16 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US11665664B2 (en) | 2013-03-15 | 2023-05-30 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying signal-emitting devices |
US11676472B2 (en) | 2018-08-24 | 2023-06-13 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time |
US11747115B2 (en) | 2020-04-20 | 2023-09-05 | United States Of America As Represented By The Secretary Of The Air Force | Apparatus and process for drone locating, interdiction and recovery |
US11764883B2 (en) | 2017-01-23 | 2023-09-19 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within an electromagnetic spectrum |
US11820507B2 (en) | 2015-11-10 | 2023-11-21 | Matternet, Inc. | Methods and systems for transportation using unmanned aerial vehicles |
EP4303626A1 (en) | 2022-07-05 | 2024-01-10 | AGC Glass Europe | Drone detection device |
US11974149B2 (en) | 2013-03-15 | 2024-04-30 | Digital Global Systems, Inc. | Systems, methods, and devices having databases and automated reports for electronic spectrum management |
KR20240148531A (en) * | 2023-04-04 | 2024-10-11 | 국민대학교산학협력단 | Autonomous navigation system and method of unmanned aerial vehicle using microphone sensor |
EP4481439A1 (en) * | 2023-06-19 | 2024-12-25 | CS Group - France | Method and system for detecting a flying apparatus in an airspace |
US12183213B1 (en) | 2017-01-23 | 2024-12-31 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US12205477B2 (en) | 2017-01-23 | 2025-01-21 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US12256233B2 (en) | 2013-03-15 | 2025-03-18 | Digital Global Systems, Inc. | Systems and methods for automated financial settlements for dynamic spectrum sharing |
RU2838977C1 (en) * | 2024-10-21 | 2025-04-24 | Федеральное государственное автономное образовательное учреждение высшего образования "Национальный исследовательский Томский государственный университет" | System for distributed control of intelligent robots for fighting unmanned vehicles |
US12302144B2 (en) | 2013-03-15 | 2025-05-13 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12356206B2 (en) | 2013-03-15 | 2025-07-08 | Digital Global Systems, Inc. | Systems and methods for automated financial settlements for dynamic spectrum sharing |
Families Citing this family (83)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8233880B2 (en) | 2006-08-16 | 2012-07-31 | Global Tel*Link Corporation | Integration of cellular phone detection and reporting into a prison telephone system |
US10244504B2 (en) | 2013-03-15 | 2019-03-26 | DGS Global Systems, Inc. | Systems, methods, and devices for geolocation with deployable large scale arrays |
US8750156B1 (en) | 2013-03-15 | 2014-06-10 | DGS Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying open space |
US9508212B2 (en) | 2013-09-18 | 2016-11-29 | Renovo Software, Inc. | Apparatus for controlling access to and use of portable electronic devices |
DE102015011058A1 (en) * | 2015-08-27 | 2017-03-02 | Rheinmetall Waffe Munition Gmbh | Threat prevention system |
US12123950B2 (en) | 2016-02-15 | 2024-10-22 | Red Creamery, LLC | Hybrid LADAR with co-planar scanning and imaging field-of-view |
US11556000B1 (en) | 2019-08-22 | 2023-01-17 | Red Creamery Llc | Distally-actuated scanning mirror |
US12399278B1 (en) | 2016-02-15 | 2025-08-26 | Red Creamery Llc | Hybrid LIDAR with optically enhanced scanned laser |
US12399279B1 (en) | 2016-02-15 | 2025-08-26 | Red Creamery Llc | Enhanced hybrid LIDAR with high-speed scanning |
US10735131B2 (en) * | 2016-08-24 | 2020-08-04 | Global Tel*Link Corporation | System and method for detecting and controlling contraband devices in a correctional facility utilizing portable electronic devices |
US10538326B1 (en) * | 2016-08-31 | 2020-01-21 | Amazon Technologies, Inc. | Flare detection and avoidance in stereo vision systems |
CN110583014B (en) * | 2016-10-11 | 2021-04-20 | 深圳市前海腾际创新科技有限公司 | Method and system for detecting and locating intruders using laser detection and ranging device |
US10353388B2 (en) * | 2016-10-17 | 2019-07-16 | X Development Llc | Drop-off location planning for delivery vehicle |
US10206064B2 (en) | 2016-12-14 | 2019-02-12 | Global Tel*Link Corp. | System and method for detecting and locating contraband devices in a secure environment |
US10825345B2 (en) * | 2017-03-09 | 2020-11-03 | Thomas Kenji Sugahara | Devices, methods and systems for close proximity identification of unmanned aerial systems |
US20180295560A1 (en) | 2017-04-11 | 2018-10-11 | Global Tel*Link Corporation | System and method for detecting and controlling contraband devices |
KR20200024763A (en) * | 2017-05-17 | 2020-03-09 | 에어로바이론먼트, 인크. | Systems and Methods for Interception and Countering Unmanned Aerial Vehicles (UAVs) |
CN107168364A (en) * | 2017-05-31 | 2017-09-15 | 陈泽涛 | A kind of unmanned aerial vehicle (UAV) control method, device and unmanned plane |
US10783796B2 (en) * | 2017-09-01 | 2020-09-22 | Qualcomm Incorporated | Collision management for a robotic vehicle |
CN107656543A (en) * | 2017-10-09 | 2018-02-02 | 四川九洲防控科技有限责任公司 | A kind of anti-unmanned plane interference system |
CN107885232A (en) * | 2017-10-23 | 2018-04-06 | 上海机电工程研究所 | A kind of filtering method for how tactful maneuver tracking |
JP7023492B2 (en) * | 2017-12-02 | 2022-02-22 | 学校法人早稲田大学 | Follow-up image presentation system for moving objects |
EP3495771A1 (en) * | 2017-12-11 | 2019-06-12 | Hexagon Technology Center GmbH | Automated surveying of real world objects |
DE102017011592A1 (en) * | 2017-12-14 | 2019-06-19 | Diehl Defence Gmbh & Co. Kg | Method for controlling a drone defense system |
US10448629B2 (en) * | 2017-12-19 | 2019-10-22 | Charles Koch | Turkey call |
US10742338B2 (en) * | 2018-01-26 | 2020-08-11 | Clip Interactive, Llc | Seamless integration of radio broadcast audio with streaming audio |
JP2019203821A (en) * | 2018-05-24 | 2019-11-28 | パナソニックIpマネジメント株式会社 | Flying object detector, flying object detection method, and flying object detection system |
CN110770668A (en) * | 2018-08-22 | 2020-02-07 | 深圳市大疆创新科技有限公司 | Control method of movable platform, movable platform and readable storage medium |
PL3853857T3 (en) | 2018-09-22 | 2025-04-22 | Pierce Aerospace Incorporated | Systems and methods of identifying and managing remotely piloted and piloted air traffic |
US12033516B1 (en) | 2018-09-22 | 2024-07-09 | Pierce Aerospace Incorporated | Systems and methods for remote identification of unmanned aircraft systems |
US11192646B2 (en) * | 2018-10-03 | 2021-12-07 | Sarcos Corp. | Anchored aerial countermeasures for rapid deployment and neutralizing of target aerial vehicles |
US11465741B2 (en) | 2018-10-03 | 2022-10-11 | Sarcos Corp. | Deployable aerial countermeasures for neutralizing and capturing target aerial vehicles |
US11697497B2 (en) | 2018-10-03 | 2023-07-11 | Sarcos Corp. | Aerial vehicles having countermeasures deployed from a platform for neutralizing target aerial vehicles |
US11440656B2 (en) | 2018-10-03 | 2022-09-13 | Sarcos Corp. | Countermeasure deployment system facilitating neutralization of target aerial vehicles |
US11472550B2 (en) | 2018-10-03 | 2022-10-18 | Sarcos Corp. | Close proximity countermeasures for neutralizing target aerial vehicles |
WO2020084322A1 (en) * | 2018-10-22 | 2020-04-30 | Pantazis Alexandros | Modular system for the detection, identification and combating of unmanned aerial systems (uas), of unmanned ground vehicles (ugv) and of chemical, biological, radioactive and nuclear (cbrn) particles |
EP3690383A1 (en) * | 2019-02-04 | 2020-08-05 | CMI Defence S.A. | Operational section of armoured vehicles communicating with a flotilla of drones |
CN109960277A (en) * | 2019-03-08 | 2019-07-02 | 沈阳无距科技有限公司 | Expel unmanned plane and its interference method, device, storage medium and electronic equipment |
US10523342B1 (en) * | 2019-03-12 | 2019-12-31 | Bae Systems Information And Electronic Systems Integration Inc. | Autonomous reinforcement learning method of receiver scan schedule control |
CN110288633B (en) * | 2019-06-04 | 2021-07-23 | 东软集团股份有限公司 | Target tracking method and device, readable storage medium and electronic equipment |
US11157023B2 (en) | 2019-06-05 | 2021-10-26 | International Business Machines Corporation | Automatic relocation of a vehicle based on proximity |
DE102019119049A1 (en) * | 2019-07-15 | 2021-01-21 | Rheinmetall Electronics Gmbh | Net catching drone, system and method for catching a flying drone |
CN110450974B (en) * | 2019-07-15 | 2024-09-17 | 中国农业大学 | An indoor inspection system and method for the spraying performance of a multi-rotor plant protection drone |
FR3098929B1 (en) * | 2019-07-16 | 2021-06-18 | Yellowscan | Method for determining extraneous calibration parameters of a measuring system |
CN110491179B (en) * | 2019-09-02 | 2021-12-28 | 孔吉 | Airport scene monitoring system with dynamic virtual electronic fence |
RU2746090C2 (en) | 2019-09-30 | 2021-04-06 | Акционерное общество "Лаборатория Касперского" | System and method of protection against unmanned aerial vehicles in airspace settlement |
CN112580420B (en) * | 2019-09-30 | 2025-01-03 | 卡巴斯基实验室股份制公司 | Systems and methods for countering unmanned aerial vehicles |
RU2755603C2 (en) | 2019-09-30 | 2021-09-17 | Акционерное общество "Лаборатория Касперского" | System and method for detecting and countering unmanned aerial vehicles |
US11729372B2 (en) * | 2019-10-23 | 2023-08-15 | Alarm.Com Incorporated | Drone-assisted sensor mapping |
CN111198561B (en) | 2019-12-05 | 2021-10-22 | 浙江大华技术股份有限公司 | Motion control method and device for target tracking, computer equipment and storage medium |
CN111123963B (en) * | 2019-12-19 | 2021-06-08 | 南京航空航天大学 | Autonomous Navigation System and Method in Unknown Environment Based on Reinforcement Learning |
ES2975496T3 (en) * | 2020-03-06 | 2024-07-08 | Bae Systems Plc | Drone interception |
EP3876071A1 (en) * | 2020-03-06 | 2021-09-08 | BAE SYSTEMS plc | Drone interception |
GB2592916B (en) * | 2020-03-06 | 2024-10-16 | Bae Systems Plc | Drone interception |
US12008807B2 (en) * | 2020-04-01 | 2024-06-11 | Sarcos Corp. | System and methods for early detection of non-biological mobile aerial target |
CN111605719B (en) * | 2020-05-06 | 2023-11-10 | 嘉兴释探信息技术有限公司 | Method and system for detecting micro unmanned aerial vehicle with communication function |
IL275792B (en) | 2020-07-01 | 2021-08-31 | Imi Systems Ltd | Incoming aerial threat protection system and method |
CN114002700A (en) * | 2020-07-28 | 2022-02-01 | 北京理工大学 | Networking control method for laser terminal guidance aircraft |
CN111857187B (en) * | 2020-08-21 | 2021-07-06 | 烟台大学 | A UAV-based T beam construction tracking system and method |
US20240020968A1 (en) * | 2020-10-08 | 2024-01-18 | Edgy Bees Ltd. | Improving geo-registration using machine-learning based object identification |
CN112304315B (en) * | 2020-10-20 | 2024-07-02 | 青岛中科防务科技有限公司 | Positioning method for aerial striking unmanned aerial vehicle |
US20230088169A1 (en) * | 2020-11-08 | 2023-03-23 | Noam Kenig | System and methods for aiming and guiding interceptor UAV |
CN112416018B (en) * | 2020-11-24 | 2021-07-09 | 广东技术师范大学 | Unmanned aerial vehicle obstacle avoidance method and device based on multi-signal acquisition and path planning model |
CN113138381B (en) * | 2020-12-24 | 2023-03-17 | 北京理工大学 | Anti-low-slow small unmanned aerial vehicle method based on radar and photoelectric detection system |
CN112908039B (en) * | 2021-01-27 | 2022-02-25 | 深圳协鑫智慧能源有限公司 | Airspace control method based on intelligent street lamp and intelligent street lamp |
US12165523B2 (en) * | 2021-03-15 | 2024-12-10 | Aurora Flight Sciences Corporation, a subsidiary of The Boeing Company | Systems and methods for tracking objects relative to an aircraft within an air space |
CN113095161B (en) * | 2021-03-23 | 2024-05-31 | 深圳力维智联技术有限公司 | Dangerous behavior identification method, device, terminal equipment and computer storage medium |
US11980813B2 (en) | 2021-05-04 | 2024-05-14 | Ztag, Inc. | System and method of using a virtual focal point in real physical game |
US20220373649A1 (en) * | 2021-05-17 | 2022-11-24 | Raytheon Company | Mode chaining for multifunction laser radar |
CN113357965B (en) * | 2021-06-01 | 2022-07-12 | 吉林大学 | Unmanned aerial vehicle capturing device and method based on annular scanning type millimeter wave radar point cloud imaging |
US12277850B1 (en) | 2021-06-11 | 2025-04-15 | Essential Aero, Inc. | Automatic foreign object debris inspection system |
US12045059B1 (en) | 2021-06-11 | 2024-07-23 | Essential Aero, Inc. | Method and system for autonomous collection of airfield FOD |
CN113554680B (en) * | 2021-07-21 | 2024-11-05 | 清华大学 | Target tracking method, device, drone and storage medium |
CN113741532B (en) * | 2021-09-16 | 2024-07-05 | Institute of Big Data and Information Technology, Wenzhou University | Counter-drone target tracking and countermeasure system |
CN114185361A (en) * | 2021-11-26 | 2022-03-15 | Aviation Industry Information Center | Intelligent-control-based anti-UAV method using dense hard-kill impacts from an interceptor cluster |
CN114332158B (en) * | 2021-12-17 | 2024-05-07 | Chongqing University | 3D real-time multi-target tracking method based on fusion of camera and laser radar |
US11594141B1 (en) * | 2022-01-19 | 2023-02-28 | King Abdulaziz University | System and methods to neutralize an attacking UAV based on acoustic features |
CN115390582B (en) * | 2022-07-15 | 2023-04-07 | Jiangxi University of Science and Technology | Method and system for tracking and intercepting multi-rotor UAVs based on point clouds |
EP4339650A1 (en) * | 2022-09-14 | 2024-03-20 | Aurora Flight Sciences Corporation, a subsidiary of The Boeing Company | Systems and methods for tracking objects relative to an aircraft within an air space |
CN118882407A (en) * | 2024-07-08 | 2024-11-01 | Zhongke Zhiyuan Information Technology Co., Ltd. | Fully automatic intelligent drone countermeasure system |
CN118506619B (en) * | 2024-07-15 | 2024-10-01 | Guoke Xingtu (Shenzhen) Digital Technology Industry R&D Center Co., Ltd. | Method for setting up dynamic geographic information fences for aircraft based on airspace gridding |
CN118549092B (en) * | 2024-07-26 | 2024-11-12 | Jiangxi Flight Academy | Automatically rotating laser anti-UAV test device and control method |
CN119737823B (en) * | 2024-12-23 | 2025-09-09 | Tongji University | Laser guidance device for unmanned aerial vehicle targets |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4160974A (en) | 1976-10-29 | 1979-07-10 | The Singer Company | Target sensing and homing system |
US7248342B1 (en) * | 2003-02-14 | 2007-07-24 | United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Three-dimension imaging lidar |
US7046841B1 (en) | 2003-08-29 | 2006-05-16 | Aerotec, Llc | Method and system for direct classification from three dimensional digital imaging |
WO2007030026A1 (en) | 2005-09-09 | 2007-03-15 | Industrial Research Limited | A 3d scene scanner and a position and orientation system |
US8788118B2 (en) * | 2006-09-06 | 2014-07-22 | Jeffrey A. Matos | Systems and methods for detecting and managing the unauthorized use of an unmanned aircraft |
US7821619B2 (en) | 2008-03-19 | 2010-10-26 | Raytheon Company | Rapid scan LADAR 3D imaging with compact digital beam formation |
JP4691581B2 (en) | 2008-06-13 | 2011-06-01 | Hitachi-GE Nuclear Energy, Ltd. | Underwater moving object position detection device |
US8212995B2 (en) | 2010-03-16 | 2012-07-03 | Raytheon Company | Laser imaging system with uniform line illumination and method for generating images |
US8599367B2 (en) * | 2010-08-04 | 2013-12-03 | Alliant Techsystems Inc. | Apparatus and methods for obtaining multi-dimensional spatial and spectral data with LIDAR detection |
WO2012119132A2 (en) | 2011-03-02 | 2012-09-07 | Aerovironment, Inc. | Unmanned aerial vehicle angular reorientation |
US20130141735A1 (en) | 2011-06-09 | 2013-06-06 | Michael Sogard | Target for large scale metrology system |
US8811720B2 (en) * | 2011-07-12 | 2014-08-19 | Raytheon Company | 3D visualization of light detection and ranging data |
US9474265B2 (en) * | 2012-11-27 | 2016-10-25 | Elwha Llc | Methods and systems for directing birds away from equipment |
US8939081B1 (en) | 2013-01-15 | 2015-01-27 | Raytheon Company | Ladar backtracking of wake turbulence trailing an airborne target for point-of-origin estimation and target classification |
US9891321B2 (en) * | 2013-01-21 | 2018-02-13 | Vricon Systems Aktiebolag | Method and arrangement for developing a three dimensional model of an environment |
US20150260824A1 (en) | 2014-03-13 | 2015-09-17 | Chester Charles Malveaux | Unmanned aerial system drone situational awareness flight safety and tracking system |
US9354317B2 (en) | 2014-04-09 | 2016-05-31 | Raytheon Company | Simultaneous forward and inverse synthetic aperture imaging LADAR |
US9275645B2 (en) | 2014-04-22 | 2016-03-01 | Droneshield, Llc | Drone detection and classification methods and apparatus |
US10399674B2 (en) | 2014-07-28 | 2019-09-03 | Insitu, Inc. | Systems and methods countering an unmanned air vehicle |
EP3862837B1 (en) * | 2014-07-30 | 2023-05-03 | SZ DJI Technology Co., Ltd. | Systems and methods for target tracking |
CN108139757A (en) | 2015-09-11 | 2018-06-08 | SZ DJI Technology Co., Ltd. | Systems and methods for detecting and tracking movable objects |
CN105214288B (en) | 2015-11-17 | 2018-01-05 | Danyang Zhengfang Nano Electronics Co., Ltd. | UAV-based golf ball identification, locating and tracking, and information communication system |
- 2017
- 2017-02-27 US US15/443,143 patent/US20170261613A1/en not_active Abandoned
- 2017-02-27 US US15/443,182 patent/US20170261604A1/en not_active Abandoned
- 2017-02-27 US US15/443,165 patent/US10690772B2/en active Active
- 2017-02-27 US US15/443,156 patent/US20180364741A1/en not_active Abandoned
- 2017-02-27 US US15/443,173 patent/US10408936B2/en active Active
Cited By (136)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10720068B2 (en) | 2012-05-09 | 2020-07-21 | Singularity University | Transportation using network of unmanned aerial vehicles |
US12131656B2 (en) | 2012-05-09 | 2024-10-29 | Singularity University | Transportation using network of unmanned aerial vehicles |
US12003990B2 (en) | 2013-03-15 | 2024-06-04 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12267714B2 (en) | 2013-03-15 | 2025-04-01 | Digital Global Systems, Inc. | Systems, methods, and devices having databases and automated reports for electronic spectrum management |
US12028121B2 (en) | 2013-03-15 | 2024-07-02 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12401433B2 (en) | 2013-03-15 | 2025-08-26 | Digital Global Systems, Inc. | Systems and methods for spectrum analysis utilizing signal degradation data |
US12395875B2 (en) | 2013-03-15 | 2025-08-19 | Digital Global Systems, Inc. | Systems, methods, and devices having databases for electronic spectrum management |
US12388690B2 (en) | 2013-03-15 | 2025-08-12 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying open space |
US12382424B2 (en) | 2013-03-15 | 2025-08-05 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying signal-emitting devices |
US12382326B2 (en) | 2013-03-15 | 2025-08-05 | Digital Global Systems, Inc. | Systems, methods, and devices having databases and automated reports for electronic spectrum management |
US12375194B2 (en) | 2013-03-15 | 2025-07-29 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12363552B2 (en) | 2013-03-15 | 2025-07-15 | Digital Global Systems, Inc. | Systems and methods for automated financial settlements for dynamic spectrum sharing |
US12356206B2 (en) | 2013-03-15 | 2025-07-08 | Digital Global Systems, Inc. | Systems and methods for automated financial settlements for dynamic spectrum sharing |
US12348995B2 (en) | 2013-03-15 | 2025-07-01 | Digital Global Systems, Inc. | Systems, methods, and devices having databases for electronic spectrum management |
US12302146B2 (en) | 2013-03-15 | 2025-05-13 | Digital Global Systems, Inc. | Systems, methods, and devices having databases and automated reports for electronic spectrum management |
US12302144B2 (en) | 2013-03-15 | 2025-05-13 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12284538B2 (en) | 2013-03-15 | 2025-04-22 | Digital Global Systems, Inc. | Systems, methods, and devices having databases and automated reports for electronic spectrum management |
US12284539B2 (en) | 2013-03-15 | 2025-04-22 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12279141B2 (en) | 2013-03-15 | 2025-04-15 | Digital Global Systems, Inc. | Systems, methods, and devices having databases for electronic spectrum management |
US11558764B2 (en) | 2013-03-15 | 2023-01-17 | Digital Global Systems, Inc. | Systems, methods, and devices having databases for electronic spectrum management |
US12278669B2 (en) | 2013-03-15 | 2025-04-15 | Digital Global Systems, Inc. | Systems and methods for spectrum analysis utilizing signal degradation data |
US11588562B2 (en) | 2013-03-15 | 2023-02-21 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12267117B2 (en) | 2013-03-15 | 2025-04-01 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12256233B2 (en) | 2013-03-15 | 2025-03-18 | Digital Global Systems, Inc. | Systems and methods for automated financial settlements for dynamic spectrum sharing |
US11601833B2 (en) | 2013-03-15 | 2023-03-07 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection with temporal feature extraction within a spectrum |
US12224888B2 (en) | 2013-03-15 | 2025-02-11 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying open space |
US11617089B2 (en) | 2013-03-15 | 2023-03-28 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12207118B1 (en) | 2013-03-15 | 2025-01-21 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection with temporal feature extraction within a spectrum |
US11637641B1 (en) | 2013-03-15 | 2023-04-25 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12207119B1 (en) | 2013-03-15 | 2025-01-21 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection with temporal feature extraction within a spectrum |
US12191925B2 (en) | 2013-03-15 | 2025-01-07 | Digital Global Systems, Inc. | Systems and methods for spectrum analysis utilizing signal degradation data |
US11647409B2 (en) | 2013-03-15 | 2023-05-09 | Digital Global Systems, Inc. | Systems, methods, and devices having databases and automated reports for electronic spectrum management |
US11646918B2 (en) | 2013-03-15 | 2023-05-09 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying open space |
US11653236B2 (en) | 2013-03-15 | 2023-05-16 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US11665664B2 (en) | 2013-03-15 | 2023-05-30 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying signal-emitting devices |
US11665565B2 (en) | 2013-03-15 | 2023-05-30 | Digital Global Systems, Inc. | Systems, methods, and devices having databases for electronic spectrum management |
US12185143B2 (en) | 2013-03-15 | 2024-12-31 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12177701B2 (en) | 2013-03-15 | 2024-12-24 | Digital Global Systems, Inc. | Systems, methods, and devices having databases for electronic spectrum management |
US11706651B1 (en) | 2013-03-15 | 2023-07-18 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection with temporal feature extraction within a spectrum |
US12160762B2 (en) | 2013-03-15 | 2024-12-03 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection with temporal feature extraction within a spectrum |
US11736952B2 (en) | 2013-03-15 | 2023-08-22 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12160763B2 (en) | 2013-03-15 | 2024-12-03 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection with temporal feature extraction within a spectrum |
US12126392B2 (en) | 2013-03-15 | 2024-10-22 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12127021B2 (en) | 2013-03-15 | 2024-10-22 | Digital Global Systems, Inc. | Systems, methods, and devices having databases and automated reports for electronic spectrum management |
US12119966B2 (en) | 2013-03-15 | 2024-10-15 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying open space |
US11792762B1 (en) | 2013-03-15 | 2023-10-17 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying signal-emitting devices |
US11791913B2 (en) | 2013-03-15 | 2023-10-17 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12101655B2 (en) | 2013-03-15 | 2024-09-24 | Digital Global Systems, Inc. | Systems, methods, and devices having databases for electronic spectrum management |
US11838154B2 (en) | 2013-03-15 | 2023-12-05 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying open space |
US11838780B2 (en) | 2013-03-15 | 2023-12-05 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection with temporal feature extraction within a spectrum |
US12095518B2 (en) | 2013-03-15 | 2024-09-17 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US12028729B2 (en) | 2013-03-15 | 2024-07-02 | Digital Global Systems, Inc. | Systems, methods, and devices having databases for electronic spectrum management |
US11985013B2 (en) | 2013-03-15 | 2024-05-14 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying open space |
US11991547B2 (en) | 2013-03-15 | 2024-05-21 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection with temporal feature extraction within a spectrum |
US11974149B2 (en) | 2013-03-15 | 2024-04-30 | Digital Global Systems, Inc. | Systems, methods, and devices having databases and automated reports for electronic spectrum management |
US11901963B1 (en) | 2013-03-15 | 2024-02-13 | Digital Global Systems, Inc. | Systems and methods for analyzing signals of interest |
US11943737B2 (en) | 2013-03-15 | 2024-03-26 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying signal-emitting devices |
US11930382B2 (en) | 2013-03-15 | 2024-03-12 | Digital Global Systems, Inc. | Systems, methods, and devices having databases and automated reports for electronic spectrum management |
US20160284221A1 (en) * | 2013-05-08 | 2016-09-29 | Matternet, Inc. | Route planning for unmanned aerial vehicles |
US10324527B2 (en) | 2014-01-24 | 2019-06-18 | Tobii Ab | Gaze driven interaction for a vehicle |
US10795010B2 (en) | 2014-12-19 | 2020-10-06 | Xidrone Systems, Inc. | Systems and methods for detecting, tracking and identifying small unmanned systems such as drones |
US10156631B2 (en) | 2014-12-19 | 2018-12-18 | Xidrone Systems, Inc. | Deterrent for unmanned aerial systems |
US11965977B2 (en) | 2014-12-19 | 2024-04-23 | Xidrone Systems, Inc. | Deterrent for unmanned aerial systems |
US10739451B1 (en) | 2014-12-19 | 2020-08-11 | Xidrone Systems, Inc. | Systems and methods for detecting, tracking and identifying small unmanned systems such as drones |
US12092756B1 (en) | 2014-12-19 | 2024-09-17 | Xidrone Systems, Inc. | Deterrent for unmanned aerial systems |
US12298378B2 (en) | 2014-12-19 | 2025-05-13 | Xidrone Systems, Inc. | Counter unmanned aerial system with navigation data to Intercept and/or disable an unmanned aerial vehicle threat |
US10281570B2 (en) | 2014-12-19 | 2019-05-07 | Xidrone Systems, Inc. | Systems and methods for detecting, tracking and identifying small unmanned systems such as drones |
US11378651B2 (en) | 2014-12-19 | 2022-07-05 | Xidrone Systems, Inc. | Deterrent for unmanned aerial systems |
US11644535B2 (en) | 2014-12-19 | 2023-05-09 | Xidrone Systems, Inc. | Deterrent for unmanned aerial systems |
US11820507B2 (en) | 2015-11-10 | 2023-11-21 | Matternet, Inc. | Methods and systems for transportation using unmanned aerial vehicles |
US10291348B2 (en) * | 2016-02-02 | 2019-05-14 | Bao Tran | Systems and methods for control of drones |
US9835709B2 (en) * | 2016-02-02 | 2017-12-05 | Bao Tran | Systems and methods for permission based control of robots |
US12301976B2 (en) | 2017-01-23 | 2025-05-13 | Digital Global Systems, Inc. | Systems, methods, and devices for unmanned vehicle detection |
US12307905B2 (en) | 2017-01-23 | 2025-05-20 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US12101132B2 (en) | 2017-01-23 | 2024-09-24 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within an electromagnetic spectrum |
US12407914B1 (en) | 2017-01-23 | 2025-09-02 | Digital Global Systems, Inc. | Systems, methods, and devices for unmanned vehicle detection |
US11956025B2 (en) | 2017-01-23 | 2024-04-09 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within an electromagnetic spectrum |
US11783712B1 (en) | 2017-01-23 | 2023-10-10 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US11764883B2 (en) | 2017-01-23 | 2023-09-19 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within an electromagnetic spectrum |
US12387608B1 (en) | 2017-01-23 | 2025-08-12 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US11965922B2 (en) | 2017-01-23 | 2024-04-23 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within a spectrum |
US12143162B2 (en) | 2017-01-23 | 2024-11-12 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within an electromagnetic spectrum |
US11893893B1 (en) | 2017-01-23 | 2024-02-06 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US11750911B2 (en) | 2017-01-23 | 2023-09-05 | Digital Global Systems, Inc. | Systems, methods, and devices for unmanned vehicle detection |
US12372563B2 (en) | 2017-01-23 | 2025-07-29 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within a spectrum |
US12243431B2 (en) | 2017-01-23 | 2025-03-04 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US12323196B1 (en) | 2017-01-23 | 2025-06-03 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within an electromagnetic spectrum |
US12183213B1 (en) | 2017-01-23 | 2024-12-31 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US12184963B2 (en) | 2017-01-23 | 2024-12-31 | Digital Global Systems, Inc. | Systems, methods, and devices for unmanned vehicle detection |
US11668739B2 (en) | 2017-01-23 | 2023-06-06 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within a spectrum |
US11645921B2 (en) | 2017-01-23 | 2023-05-09 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US11860209B2 (en) | 2017-01-23 | 2024-01-02 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within a spectrum |
US11871103B2 (en) | 2017-01-23 | 2024-01-09 | Digital Global Systems, Inc. | Systems, methods, and devices for unmanned vehicle detection |
US12205477B2 (en) | 2017-01-23 | 2025-01-21 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US11622170B2 (en) | 2017-01-23 | 2023-04-04 | Digital Global Systems, Inc. | Systems, methods, and devices for unmanned vehicle detection |
US12309483B1 (en) | 2017-01-23 | 2025-05-20 | Digital Global Systems, Inc. | Systems, methods, and devices for unmanned vehicle detection |
US12298337B2 (en) | 2017-01-23 | 2025-05-13 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within a spectrum |
US11521498B2 (en) | 2017-01-23 | 2022-12-06 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US11549976B2 (en) | 2017-01-23 | 2023-01-10 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within a spectrum |
US12255694B1 (en) | 2017-01-23 | 2025-03-18 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within an electromagnetic spectrum |
US12261650B2 (en) | 2017-01-23 | 2025-03-25 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within an electromagnetic spectrum |
US12266272B1 (en) | 2017-01-23 | 2025-04-01 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US12272258B2 (en) | 2017-01-23 | 2025-04-08 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US10907940B1 (en) | 2017-12-12 | 2021-02-02 | Xidrone Systems, Inc. | Deterrent for unmanned aerial systems using data mining and/or machine learning for improved target detection and classification |
WO2019161076A1 (en) | 2018-02-19 | 2019-08-22 | Digital Global Systems, Inc. | Systems, methods, and devices for unmanned vehicle detection and threat management |
US12198527B2 (en) | 2018-08-24 | 2025-01-14 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time |
US12087147B2 (en) | 2018-08-24 | 2024-09-10 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time |
US12277849B2 (en) | 2018-08-24 | 2025-04-15 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time |
US12243406B2 (en) | 2018-08-24 | 2025-03-04 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time |
US12380793B1 (en) | 2018-08-24 | 2025-08-05 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time |
US11869330B2 (en) | 2018-08-24 | 2024-01-09 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time |
US11948446B1 (en) | 2018-08-24 | 2024-04-02 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time |
US11676472B2 (en) | 2018-08-24 | 2023-06-13 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time |
US12142127B1 (en) | 2018-08-24 | 2024-11-12 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time |
US20220172380A1 (en) * | 2019-04-08 | 2022-06-02 | Shenzhen Vision Power Technology Co., Ltd. | Three-dimensional light field technology-based optical unmanned aerial vehicle monitoring system |
US11978222B2 (en) * | 2019-04-08 | 2024-05-07 | Shenzhen Vision Power Technology Co., Ltd. | Three-dimensional light field technology-based optical unmanned aerial vehicle monitoring system |
JP2022532483A (en) * | 2019-05-17 | 2022-07-15 | Anduril Industries, Inc. | Counter drone system |
JP7312272B2 (en) | 2019-05-17 | 2023-07-20 | Anduril Industries, Inc. | Counter drone system |
US20230082239A1 (en) * | 2019-05-17 | 2023-03-16 | Anduril Industries, Inc. | Counter drone system |
AU2023237207B2 (en) * | 2019-05-17 | 2025-09-25 | Anduril Industries, Inc. | Counter drone system |
US11899473B2 (en) * | 2019-05-17 | 2024-02-13 | Anduril Industries, Inc. | Counter drone system |
US11385659B2 (en) * | 2019-05-17 | 2022-07-12 | Anduril Industries, Inc. | Counter drone system |
EP3969834A4 (en) * | 2019-05-17 | 2023-02-08 | Anduril Industries Inc. | Counter drone system |
US12282340B2 (en) * | 2019-05-17 | 2025-04-22 | Anduril Industries, Inc. | Counter drone system |
WO2020236238A1 (en) * | 2019-05-17 | 2020-11-26 | Anduril Industries Inc. | Counter drone system |
US20240142996A1 (en) * | 2019-05-17 | 2024-05-02 | Anduril Industries, Inc. | Counter drone system |
US11747115B2 (en) | 2020-04-20 | 2023-09-05 | United States Of America As Represented By The Secretary Of The Air Force | Apparatus and process for drone locating, interdiction and recovery |
WO2022260712A1 (en) | 2021-06-09 | 2022-12-15 | Raytheon Company | Method and flexible apparatus permitting advanced radar signal processing, tracking, and classification/identification design and evaluation using single unmanned air surveillance (uas) device |
WO2023021094A1 (en) | 2021-08-20 | 2023-02-23 | Atlas Elektronik Gmbh | Effective target engagement using a military device comprising a first network for non-classified information and a second network for classified information |
DE102021209154A1 (en) | 2021-08-20 | 2023-02-23 | Atlas Elektronik Gmbh | Effective target engagement by a military facility having a first unclassified information network and a second classified information network |
EP4303626A1 (en) | 2022-07-05 | 2024-01-10 | AGC Glass Europe | Drone detection device |
KR102837847B1 (en) | 2023-04-04 | 2025-07-23 | Kookmin University Industry-Academic Cooperation Foundation | Autonomous navigation system and method for an unmanned aerial vehicle using a microphone sensor |
KR20240148531A (en) * | 2023-04-04 | 2024-10-11 | Kookmin University Industry-Academic Cooperation Foundation | Autonomous navigation system and method for an unmanned aerial vehicle using a microphone sensor |
EP4481439A1 (en) * | 2023-06-19 | 2024-12-25 | CS Group - France | Method and system for detecting a flying apparatus in an airspace |
RU2838977C1 (en) * | 2024-10-21 | 2025-04-24 | National Research Tomsk State University | System for distributed control of intelligent robots for countering unmanned vehicles |
US12431992B2 (en) | 2025-02-04 | 2025-09-30 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within an electromagnetic spectrum |
Also Published As
Publication number | Publication date |
---|---|
US20170261604A1 (en) | 2017-09-14 |
US20190004176A1 (en) | 2019-01-03 |
US10690772B2 (en) | 2020-06-23 |
US10408936B2 (en) | 2019-09-10 |
US20170261999A1 (en) | 2017-09-14 |
US20180364741A1 (en) | 2018-12-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10690772B2 (en) | LIDAR site model to aid counter drone system | |
US10514711B2 (en) | Flight control using computer vision | |
Khan et al. | On the detection of unauthorized drones—Techniques and future perspectives: A review | |
Svanström et al. | Real-time drone detection and tracking with visible, thermal and acoustic sensors | |
Alam et al. | A survey of safe landing zone detection techniques for autonomous unmanned aerial vehicles (UAVs) | |
US20200162489A1 (en) | Security event detection and threat assessment | |
US9429945B2 (en) | Surveying areas using a radar system and an unmanned aerial vehicle | |
US10474144B2 (en) | Remote information collection, situational awareness, and adaptive response system for improving advance threat awareness and hazardous risk avoidance | |
US10416668B2 (en) | Scanning environments and tracking unmanned aerial vehicles | |
Ma'Sum et al. | Simulation of intelligent unmanned aerial vehicle (UAV) for military surveillance | |
US9026272B2 (en) | Methods for autonomous tracking and surveillance | |
US8996207B2 (en) | Systems and methods for autonomous landing using a three dimensional evidence grid | |
WO2019067695A1 (en) | Flight control using computer vision | |
US20110285981A1 (en) | Sensor Element and System Comprising Wide Field-of-View 3-D Imaging LIDAR | |
US20220188553A1 (en) | Estimating auto exposure values of camera by prioritizing object of interest based on contextual inputs from 3d maps | |
US20240248477A1 (en) | Multi-drone beyond visual line of sight (bvlos) operation | |
KR20230141037A (en) | Drone that can autonomously fly indoors based on artificial intelligence | |
JP2024054889A (en) | Air traffic control device and air traffic control method | |
KR20190141941A (en) | Apparatus for detecting and neutralizing a flight vehicle capable of flying underground, and method thereof |
US10989797B2 (en) | Passive altimeter system for a platform and method thereof | |
CN107885231A (en) | Drone capture method and system based on visible-light image recognition |
Alturas | Modeling and Optimization of an Obstacle Detection System for Small UAVs | |
JP7721996B2 (en) | Guidance device, guidance system, guidance method, and computer program | |
US20220214702A1 (en) | Systems and methods enabling evasive uav movements during hover and flight | |
KR20250074793A | Reconnaissance drone control system with trace function |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: RAYTHEON BBN TECHNOLOGIES CORP., MASSACHUSETTS; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: VAN VOORST, BRIAN R.; REEL/FRAME: 041397/0962; Effective date: 20170224 |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |