
WO2015029007A1 - Robotic system and method for complex indoor combat - Google Patents


Info

Publication number
WO2015029007A1
Authority
WO
WIPO (PCT)
Prior art keywords
threat
drone
engagement
data
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IL2014/000043
Other languages
English (en)
Inventor
Ronen Izidor GABBAY
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of WO2015029007A1
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41A FUNCTIONAL FEATURES OR DETAILS COMMON TO BOTH SMALLARMS AND ORDNANCE, e.g. CANNONS; MOUNTINGS FOR SMALLARMS OR ORDNANCE
    • F41A23/00 Gun mountings, e.g. on vehicles; Disposition of guns on vehicles
    • F41A23/56 Arrangements for adjusting the gun platform in the vertical or horizontal position
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41G WEAPON SIGHTS; AIMING
    • F41G3/00 Aiming or laying means
    • F41G3/14 Indirect aiming means
    • F41G3/16 Sighting devices adapted for indirect laying of fire
    • F41G3/165 Sighting devices adapted for indirect laying of fire using a TV-monitor
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0038 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0044 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/15 UAVs specially adapted for particular uses or applications for conventional or electronic warfare
    • B64U2101/18 UAVs specially adapted for particular uses or applications for conventional or electronic warfare for dropping bombs; for firing ammunition
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/70 UAVs specially adapted for particular uses or applications for use inside enclosed spaces, e.g. in buildings or in vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/20 Remote controls

Definitions

  • the present invention relates generally to robotic systems and methods, and more specifically to robotic systems and methods for complex indoor combat.
  • Robotic solutions are lacking for various scenarios of confined and complex indoor combat, such as, but not limited to, combat within commercial and private residences, educational establishments, commercial establishments, transport systems (such as underground trains), tunnels, airplanes and ships, and for prevention of boarding by hostile persons, hijacking prevention, hostage rescue and securing a ship's bridge and/or staff.
  • improved unmanned methods and apparatus are provided for complex indoor combat.
  • a system comprising an integrated robotic drone comprising multiple sensors, optics, combat elements and supporting software, including video analytics, adapted for mission planning, with a heads-up display (HUD).
  • flying robotic systems are provided for indoor combat.
  • the present invention provides a robotic stalk-and-attack aerial hover system ("drone") designed, geared and aimed at dense urban, confined and complex indoor combat.
  • the system of the present invention is constructed and configured to effectively identify, incriminate and eliminate threats and/or designated targets in urban scenarios, namely confined, complex indoor environments, autonomously and/or piloted, in a single or a coordinated pack configuration.
  • the robotic drone of the present invention is configured to effectively operate in a complex and lethal environment.
  • a system for indoor engagement of a threat, including:
  • a. a motorized apparatus adapted to hover and hold
  • an onboard processor adapted to receive and process said detection data from said at least one detection element and said location data from said at least one apparatus location data element, said processor being configured to output at least one command to said at least one engagement element to inactivate said threat;
  • v. at least one communication element; and b. a ground station for monitoring said motorized apparatus and for receiving data from said at least one communication element.
  • the processor is further adapted to perform video analytics on the detection data. Furthermore, according to an embodiment of the present invention, the processor is further adapted to identify the threat responsive to the video analytics.
  • the motorized apparatus includes robotics adapted to activate the at least one engagement element to engage and inactivate the threat.
  • the motorized apparatus is adapted to navigate three-dimensional indoor contours.
  • the at least one engagement element is selected from the group consisting of a gun, a missile, a projector, a gas canister, a fire extinguisher, a grenade, a non-lethal weapon, an immobilizer weapon, a submachine gun and combinations thereof.
  • the at least one detection element is selected from the group consisting of a sound sensor, an infrared sensor, a position sensor, an ultrasonic sensor, a movement detection sensor, a camera, a laser, a visualization sensor, an optic flow sensor and combinations thereof.
  • the at least one location element is selected from the group consisting of a global position system (GPS) element, a position sensor, a camera, a smartphone, an optic sensor and combinations thereof.
  • GPS global position system
  • motorized apparatus and the ground station communicate via at least one communication link.
  • the at least one communication link is selected from the group consisting of an IP peer-to-peer communication link, a cellular communication link, a satellite communication link, an RF communication link, an internet link and combinations thereof.
  • the ground station further includes a screen adapted to display a real-time heads up display (HUD) of the motorized apparatus.
  • HUD heads up display
  • the motorized apparatus is unmanned and is selected from an airborne drone, a flying hovering apparatus, a plane, a helicopter and a hovercraft.
  • the ground station includes a computer; a hand-operated remote control apparatus; an antenna; and at least one communications link.
  • the motorized apparatus is a drone, adapted for indoor use.
  • the motorized apparatus is adapted to move along vertical and horizontal conduits.
  • the drone includes robotics adapted to activate the at least one engagement element.
  • the ground station includes on-screen heads up display.
  • the detecting step includes employing video analytics and/or detection sensors for identification of the threat.
  • the activating step enables the apparatus to act autonomously.
  • the activating step enables the apparatus to act semi-autonomously.
  • the activating step enables the apparatus to travel along horizontal and vertical conduits to approach the threat.
  • the apparatus is adapted for non-GPS navigation.
  • the apparatus is adapted for day and night activation.
  • the apparatus has a predefined mission plan.
  • the mission plan supports a plurality of engagement modes.
  • the apparatus is controlled from a ground station by heads up display.
  • the motorized apparatus acts autonomously, without communicating with the ground station.
  • the engagement of the threat occurs in an underground environment.
  • the underground environment includes at least one tunnel.
  • the motorized apparatus acts autonomously in the at least one tunnel.
  • a software product for indoor engagement of a threat including a computer-readable medium in which program instructions are stored, which instructions, when read by a computer (on the drone), cause the computer to:
  • the product comprises a plurality of algorithms, each algorithm providing a set of instructions to said motorized apparatus to activate at least one of robotics and the at least one engagement element.
  • a system for neutralization of an onboard threat on a waterborne vehicle including: a. a motorized apparatus adapted to hover and hold:
  • an onboard processor adapted to receive and process the detection data from the at least one detection element and the location data from the at least one apparatus location data element, the processor being configured to output at least one command to the at least one engagement element to inactivate the threat;
  • a ground station for monitoring the motorized apparatus and for receiving data from the at least one communication element.
  • the drone (motorized apparatus) is adapted to act autonomously with no need for an operator to be present. Alternatively, the operator can merely watch the action without interfering.
  • the present invention will be more fully understood from the following detailed description of the preferred embodiments thereof, taken together with the drawings.
  • Fig. 1 is a simplified pictorial illustration showing a system for complex indoor combat, in accordance with an embodiment of the present invention
  • Fig. 2 is a simplified pictorial illustration showing a ground station of the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 3A is a simplified schematic illustration of a drone in the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 3B is a simplified schematic illustration of a system for complex indoor combat, in accordance with an embodiment of the present invention
  • Fig. 4 is a simplified pictorial illustration of a drone in the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 5 is a vertical cross section of the drone of Fig. 4, in accordance with an embodiment of the present invention.
  • Fig. 6A is a top view of a drone showing at least one collision sensor, in accordance with an embodiment of the present invention.
  • Fig. 6B is a bottom view of a drone showing at least one collision sensor, in accordance with an embodiment of the present invention
  • Fig. 6C is a side view of a drone showing at least one threat detection sensor, in accordance with an embodiment of the present invention
  • Fig. 7A is a top view of a drone showing at least one optical sensor, in accordance with an embodiment of the present invention
  • Fig. 7B is a side view of a drone showing at least one optical sensor, in accordance with an embodiment of the present invention
  • Fig. 7C is another side view of a drone showing at least one optical sensor, in accordance with an embodiment of the present invention.
  • Fig. 8A is a top view of a drone showing at least one motion detection sensor, in accordance with an embodiment of the present invention.
  • Fig. 8B is a bottom view of a drone showing at least one motion detection sensor, in accordance with an embodiment of the present invention.
  • Fig. 8C is a side view of a drone showing at least one motion detection sensor, in accordance with an embodiment of the present invention
  • Fig. 9A is a side view of a drone showing at least one combat element, in accordance with an embodiment of the present invention.
  • Fig. 9B is another side view of a drone showing at least one combat element, in accordance with an embodiment of the present invention.
  • Fig. 9C is another side view of a drone showing at least one combat element, in accordance with an embodiment of the present invention.
  • Fig. 10A is a top view of a drone showing at least one antenna, in accordance with an embodiment of the present invention.
  • Fig. 10B is a side view of a drone showing at least one antenna, in accordance with an embodiment of the present invention
  • Fig. 11A is a simplified pictorial illustration showing a portable ground station of the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 11B is a simplified pictorial illustration showing a portable touch-screen ground station of the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 12 is a simplified flow chart of a mobile configuration method for complex indoor combat, in accordance with an embodiment of the present invention
  • Fig. 13 is a simplified flow chart of a standby mode method for complex indoor combat, in accordance with an embodiment of the present invention
  • Fig. 14 is a simplified flow chart of a sleep mode method for complex indoor combat, in accordance with an embodiment of the present invention
  • Fig. 15 is a screen shot on a ground station screen for activating a drone choosing mode of operation, in accordance with an embodiment of the present invention
  • Fig. 16 is a screen shot on a ground station screen for video analytics activation of a drone, in accordance with an embodiment of the present invention
  • Fig. 17 is a screen shot (Rounds and Bursts) on a ground station screen for activation of the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 18 is a screen shot on a ground station screen for choosing a mode of engagement in the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 19 is a screen shot on a ground station screen for heads up display in the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 20 is a simplified threat map in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • Fig. 21 is another simplified threat map in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • similar reference numerals identify similar parts.
  • Fig. 1 is a simplified pictorial illustration showing a system 100 for complex indoor combat, in accordance with an embodiment of the present invention.
  • System 100 comprises at least one drone 110, a ground station 120, at least one communication device 130, adapted to communicate with the ground station and the drone via at least one wireless communication network 140. Further details of ground station 120 are shown in Fig. 2. Schematics of the drone are shown in Fig. 3 A. A schematic of system 100 is shown in Fig. 3B. Further mechanical details of the drone are shown in Fig. 4. It should be understood from Fig. 1 that the system may use one or more drones of various configurations, one or more communication systems and one or more mobile devices 130.
  • FIG. 2 is a simplified pictorial illustration showing a ground station 200 of the system of Fig. 1, in accordance with an embodiment of the present invention.
  • Ground station 200, similar or identical to ground station 120 of Fig. 1, is housed in a suitcase 201 and comprises a remote controller 202, an antenna 204 servicing the remote controller, a computer 210 and an antenna servicing the computer.
  • Drone 300 may be similar or identical to drone 110 of system 100.
  • the drone comprises an onboard processor 302 in communication with a flight system 304, at least one of a camera 314 or smartphone, and sensors 306, 308, 310, 312, exemplified by, but not limited to, sound sensor 306, infrared sensor 308, ultrasonic sensor 310 and optic flow sensor 312.
  • the arrows shown in Fig. 3A represent one embodiment of data flow within the drone.
  • FIG. 3B is a simplified schematic illustration of a system 350 for complex indoor combat, in accordance with an embodiment of the present invention.
  • System 350 comprises a ground station 352 in communication with an on-board system 358, a flight system controller 360 (for example, a Pixhawk autopilot) and a copter 354.
  • the flight system (such as, but not limited to, a Pixhawk) is in communication with a sensor controller 362 and sensors 356 communicating therewith.
  • Some of the components of system 100 (Fig. 1) and/or system 350 (Fig. 3B) and their functions are provided in Table 1. It should be understood that each of these components may be purchased commercially from commercial establishments in this field.
  • Radio controller receiver: receives control commands from a distant radio controller
  • Fig. 4 is a simplified pictorial illustration of a drone 400 (also called a copter or hover apparatus) in system 100 of Fig. 1, in accordance with an embodiment of the present invention.
  • the drones of the present invention are lightweight and have a protective frame 402.
  • the frame may be made of plastic, a polymeric material, a metal, aluminum, carbon fiber, an alloy and combinations thereof.
  • the light protective frame of drone 400 may be constructed in various colors, with various camouflages, making it both easy to move and durable.
  • the drone is constructed and configured to be a robotic stalk & attack hover apparatus 400, which may be stationed indoors, underground or onboard a ship, at strategic locations (such as on a bridge, in an engine room or in crew living quarters).
  • Drone 400 is developed, designed, geared and aimed at dense, confined and complex indoor combat and thus can effectively identify, incriminate and eliminate threats and/or designated targets in confined and complex indoor combat, within ships, for example, and in various operational scenarios, such as, but not limited to, terrorist attacks in buildings, airports, airplanes, educational facilities, underground trains, commercial establishments and residential buildings, boarding prevention, hijacking prevention, hostage rescue, and securing the bridge and staff of a ship.
  • when triggered by an order, command or an alarm, the drone is constructed and configured to operate autonomously and/or piloted, in a single-drone or a coordinated multiple-drone pack configuration.
  • this robotic drone is constructed to operate effectively in the most complex and lethal indoor environments.
  • Drone 400 (similar or identical to drone 110, Fig. 1) typically comprises the following basic functionalities and components: a. a protective frame 402 constructed as a hardened structure to protect rotors and components; b. at least one engine 404 adapted to activate rotors and powered by batteries or another power supply; c. 4-8 propellers/rotors 406 (the number of rotors is based on operational requirements, for example, weight of payload), supporting a payload of a minimum of 2 kg; d. batteries 504 (Fig. 5) or another electrical or fuel power plant adapted to provide hover endurance of a minimum of 10 minutes; e. a camera 408 (which may be part of a smartphone 409 (not shown)); f.
  • the drones of the present invention are, according to some embodiments, small (up to 0.75 meter in length/width), adapted for easy penetration through standard doorways, windows, corridors, elevator shafts and staircases.
  • the drones of the present invention are, according to some embodiments, silent with suppressed engine and rotors noise.
  • the drones may be in standby mode with components, mentioned hereinabove, activated, with the engine off, with immediate engine restart support.
  • the drones of the present invention are, according to some embodiments, lightweight, fabricated of light yet durable materials, such as composite materials known in the art.
  • the drones of the present invention are, according to some embodiments, activated in accordance with any one or more of the following algorithms, embedded, for example, in processor 302 (in Fig. 3A):
  • the algorithm integrates data provided by the system's sensors (see Fig. 3A - infrared, ultrasonic, optic flow, video, other) into one coherent orientation picture, thus preventing the drone from colliding with obstacles.
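The patent does not give an implementation of this fusion. As an illustrative sketch only (the sensor names, the minimum-clearance rule and the threshold are assumptions, not taken from the patent), per-direction range readings could be merged into one conservative orientation picture like this:

```python
# Hypothetical sketch of the orientation/anti-collision fusion described above.
# The fusion rule (keep the smallest reported clearance per direction) is an
# illustrative assumption.

MIN_CLEARANCE_M = 0.5  # a "definable distance parameter"

def fuse_clearances(readings):
    """Merge per-sensor range estimates into one clearance per direction.

    readings: dict mapping direction -> list of (sensor_name, distance_m);
    a distance of None models a sensor with no valid return.
    Returns dict direction -> most conservative (smallest) distance.
    """
    picture = {}
    for direction, samples in readings.items():
        distances = [d for _, d in samples if d is not None]
        picture[direction] = min(distances) if distances else float("inf")
    return picture

def collision_warnings(picture, threshold=MIN_CLEARANCE_M):
    """Directions whose fused clearance falls below the threshold."""
    return [d for d, dist in picture.items() if dist < threshold]

readings = {
    "front": [("ultrasonic", 2.1), ("infrared", 1.9)],
    "up":    [("ultrasonic", 0.4)],
    "down":  [("ultrasonic", 1.2), ("optic_flow", None)],
}
picture = fuse_clearances(readings)
print(collision_warnings(picture))  # only the "up" clearance is below threshold
```

Taking the minimum across sensors is the conservative choice for obstacle avoidance: any single sensor reporting a close obstacle is enough to trigger a warning.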
  • 3D Mapping Algorithm - sensors map the site for immediate and future navigation, without a provided map of the indoor site and without GPS.
  • the created map enables the system; in both autonomous and semi- autonomous mode, to navigate indoors without GPS.
  • the created map can then: i. be sent to following forces (more systems or humans);
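As an illustrative sketch of the GPS-free mapping idea (the occupancy-grid representation, resolution and export format are assumptions, not the patent's method), sensor-detected obstacles could be accumulated into a 3D grid that both guides navigation and can be handed off to following forces:

```python
# Illustrative occupancy-grid sketch of GPS-free indoor mapping. All names
# and the 0.25 m default resolution are assumptions for illustration only.

class IndoorMap:
    def __init__(self, resolution_m=0.25):
        self.resolution = resolution_m
        self.occupied = set()  # (x, y, z) grid cells flagged as obstacles

    def _cell(self, x, y, z):
        r = self.resolution
        return (round(x / r), round(y / r), round(z / r))

    def mark_obstacle(self, x, y, z):
        """Record a sensor-detected obstacle at a world position (metres)."""
        self.occupied.add(self._cell(x, y, z))

    def is_free(self, x, y, z):
        """True if the cell has not been flagged, so the drone may enter it."""
        return self._cell(x, y, z) not in self.occupied

    def export(self):
        """Snapshot of the map, e.g. to hand off to following forces."""
        return sorted(self.occupied)

m = IndoorMap()
m.mark_obstacle(1.0, 2.0, 0.0)   # e.g. a wall segment seen by the ultrasonic sensor
print(m.is_free(1.0, 2.0, 0.0))  # False
print(m.is_free(5.0, 5.0, 0.0))  # True
```

Because cells are keyed by quantized position rather than absolute coordinates from a satellite fix, the same structure works in both autonomous and semi-autonomous modes without GPS.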
  • Threat Prioritization Algorithm - the algorithm integrates data provided by the system's relevant sensors (infrared, ultrasonic, optic flow, video, other) into one coherent threat map:
  • System 100 is constructed and configured to prioritize the threats: i. by pre-defined threat profiles stored in the video analytics system;
  • the drone engages the threat according to the calculated prioritization
  • the handler can intervene and change the priority in real-time at all times.
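One way such prioritization with real-time handler override might be structured (the scoring rule, field names and tie-breaking by distance are illustrative assumptions; the patent only states that threats are prioritized and that the handler can intervene):

```python
# Hypothetical sketch of a threat map with handler override. Lower sort keys
# rank as more urgent; a handler override always outranks computed scores.

class ThreatMap:
    def __init__(self):
        self._threats = {}  # threat_id -> sort key (lower = more urgent)

    def report(self, threat_id, profile_score, distance_m):
        # Illustrative rule: profile match dominates, proximity breaks ties.
        self._threats[threat_id] = (-profile_score, distance_m)

    def handler_override(self, threat_id, priority):
        """Real-time handler intervention takes precedence over the score."""
        self._threats[threat_id] = (float("-inf"), priority)

    def ordered(self):
        """Threat IDs from most to least urgent."""
        return [tid for tid, key in sorted(self._threats.items(),
                                           key=lambda kv: kv[1])]

tm = ThreatMap()
tm.report("A", profile_score=0.9, distance_m=12.0)
tm.report("B", profile_score=0.4, distance_m=3.0)
print(tm.ordered())          # A outranks B on profile match
tm.handler_override("B", 0)  # the handler promotes B in real time
print(tm.ordered())          # B now ranks first
```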
  • a beacon (for example, an IR LED indicator)
  • a friendly force is identified and marked as such on the system's HUD (Heads Up Display 1900 - Fig. 19);
  • the projectile shooting device (combat element) carries an external device for better aiming, such as a laser pointer (Fig. 9A.).
  • the drones of the present invention comprise, according to some embodiments, anti-collision sensors 812 (Fig. 8B), 822 (Fig. 8C) for both vertical (floor/ceiling) and horizontal (walls) obstacles, with definable distance parameters, supporting stairway climbing.
  • the drones of the present invention comprise, according to some embodiments, an auto-staircase-climbing self-orientation algorithm, constructed and configured to enable auto-independent drone navigation in corridors, elevator shafts and staircases by collecting data from sensors and integrating the data into one coherent navigation picture, thereby enabling the drones to independently navigate an indoor environment.
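A minimal sketch of the vertical part of such self-orientation, assuming top/bottom anti-collision range sensors and a target floor clearance (the control rule, parameter names and values are assumptions, not the patent's algorithm): when a stair step rises under the drone the measured floor clearance shrinks, so holding a constant clearance makes the drone climb the staircase.

```python
# Hypothetical vertical-hold rule for stairway climbing, driven only by the
# two vertical (floor/ceiling) distance sensors described above.

FLOOR_CLEARANCE_M = 1.0  # a "definable distance parameter"

def vertical_command(dist_to_floor, dist_to_ceiling,
                     target=FLOOR_CLEARANCE_M, deadband=0.1):
    """Return 'climb', 'descend' or 'hold' from the two vertical ranges."""
    if dist_to_ceiling < target:             # ceiling too close: give way first
        return "descend"
    if dist_to_floor < target - deadband:    # a step rose under the drone
        return "climb"
    if dist_to_floor > target + deadband:    # drifted too high
        return "descend"
    return "hold"

# Climbing a staircase: each step shrinks the floor clearance until the
# drone climbs back to its target height.
print(vertical_command(0.7, 2.0))  # climb
print(vertical_command(1.0, 2.0))  # hold
```

The same clearance-holding idea extends horizontally (wall sensors) to centre the drone in corridors and elevator shafts.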
  • FIG. 5 is a vertical cross section 500 of the drone 400 of Fig. 4, in accordance with an embodiment of the present invention.
  • This figure shows the embedded (in-body) power plant (batteries or fuel tanks 504) of the drone, landing gear 502 and a battery holder door 506.
  • Fig. 6 A is a top view 600 of a drone showing at least one collision sensor 602, in accordance with an embodiment of the present invention.
  • the collision sensor shown in Fig. 6A is an upper face collision sensor 602.
  • Fig. 6B is a bottom view of a drone 610 showing at least one collision sensor 612 (lower face collision sensor), in accordance with an embodiment of the present invention.
  • a gun-holder 616 can also be seen, as well as propeller arms 614, each adapted to hold one propeller 406, or two in a coaxial engine configuration.
  • Fig. 6C is a side view of a drone 620 showing at least one threat detection sensor 624 and a gun holder 622, in accordance with an embodiment of the present invention.
  • Figs. 6A-6C show the anti-collision sensors of drone 400, enabling it to effectively self-navigate and/or be controlled in a dense urban, confined and complex indoor environment.
  • Fig. 7A is a top view 700 of drone 400 showing at least one optical sensor 312, 314, 316, in accordance with an embodiment of the present invention.
  • the drone comprises replaceable HD cameras 314, 316, adapted for either day-only or day-and-night operation.
  • Fig. 7B is a side view 710 of drone 400 showing at least one optical sensor 712, in accordance with an embodiment of the present invention.
  • Fig. 7C is another side view 720 of drone 400 showing a phone/camera attachment element 724 and at least one optical sensor 722, in accordance with an embodiment of the present invention
  • Fig. 8A is a top view 800 of drone 110 showing at least one motion detection sensor (acoustics/laser) 802, adapted to act as a gunshot detection sensor, in accordance with an embodiment of the present invention.
  • a motion detection sensor acoustics/laser
  • Fig. 8B is a bottom view 810 of drone 110 showing the position of at least one motion detection sensor 812, in accordance with an embodiment of the present invention.
  • These are motion sensors on each side of the drone, which cover 360 degrees.
  • the ones on top/bottom are the distance/anti- collision sensors.
  • Fig. 8C is a side view 820 of a drone showing the position of at least one motion detection sensor 822, in accordance with an embodiment of the present invention.
  • FIG. 9A is side view 900 of a drone showing at least one combat element 902 (handgun) with a laser pointer 904, in accordance with an embodiment of the present invention.
  • Fig. 9B is another side view 910 of a drone showing at least one combat element 912 (submachine gun), in accordance with an embodiment of the present invention.
  • Fig. 9C is another side view 920 of drone 110 showing at least one combat element 922 (hand grenade), in accordance with an embodiment of the present invention.
  • System 100 (Fig. 1) is thus configured to use these combat elements and sensors, thereby enabling the system to effectively function in combat (search and engage, incriminate and eliminate).
  • FIG. 10A is a top view 1000 of dronel lO showing at least one antenna 1002, in accordance with an embodiment of the present invention.
  • In Fig. 10B, a side view 1010 of drone 110 is seen, showing the at least one antenna, in accordance with an embodiment of the present invention.
  • Fig. 11 A is a simplified pictorial illustration showing a portable ground station 1100 of system 100 of Fig. 1, in accordance with an embodiment of the present invention.
  • the portable ground station 1100 comprises a screen 1102 on which screen shots shown hereinbelow can be displayed.
  • the ground station is configured to be a game-like, user-friendly system controller and has a drone height control and rotation controller 1112, a drone directional controller 1110, an X-Y controller 1104 and further control buttons 1106, 1108.
  • FIG. 11B is another simplified pictorial illustration showing a portable touch-screen ground station 1120 of system 100 of Fig. 1, equipped with a touchscreen 1122, in accordance with an embodiment of the present invention.
  • FIG. 12 is a simplified flow chart of a method 1200 for complex indoor combat, in accordance with an embodiment of the present invention.
  • a mobile configuration of system 100 is provided.
  • the ground station 120 or sometimes, the entire system 100 is carried to an engagement/combat site by a task team (not shown) in a carry system to site step 1202.
  • This configuration is relevant to the following implementations: military ground forces, military special forces, law enforcement, and intelligence and security organizations.
  • Upon call, the task team arrives at the site and deploys the system: it takes the hovercraft (drone 110) out of its case (201, Fig. 2).
  • the personnel (handler) turns on the computer 210 (Fig. 2).
  • In a mission planning step 1206, the handler then launches a mission planner software application module (not shown; in computer 210), installed on a tablet/computer, where the mission parameters and required behavior of the system are configured.
  • In a choosing mode of operation step 1208, the handler decides on the suitable mode of operation per the given mission, scenario and task team's doctrine, selected from: a) a semi-autonomous mode activation step 1210, in which the handler flies the system and manages its priorities and engagement, yet is robotically supported with:
  • threat map - motion, gunshot and sound sensors alert, identify and direct towards potential threats; video analytics - identifies known threats, differentiates friend/foe, prioritizes threats and directs towards threats
  • mapping- sensors are mapping the site for immediate and future navigation
  • the handler can navigate the system by either real-time mapping or by a provided map (without GPS); or b) an autonomous mode activation step 1212, in which the system is fully autonomous and acts per pre-defined priorities, behaviors and algorithms:
  • mapping- sensors are mapping the site in real-time for immediate and future navigation
  • in either mode, the system can navigate by either real-time mapping or by a provided map (without GPS).
  • a screen shot associated with step 1208 is shown in Fig. 15 hereinbelow.
  • having chosen to activate one of the two above modes, the handler then chooses whether to activate video analytics in a video analytics activation step 1214.
  • a screen shot associated with the video analytics step is shown in Fig. 16.
  • the handler decides whether to use the video analytics module per the given mission, scenario and task team's doctrine. If he activates the video analytics in step 1214, then the system is constructed and configured to biometrically analyze pre-defined profiles in a real-time analysis step 1216, enabling one or more of the following: a. a handler can load a profile (image) from a database (not shown) in the computer; b. a handler can insert and load a recently taken picture of a target; c. several profiles can be loaded at the same time.
  • the system is operative to search and mark each identified profile on an HUD (heads on display 1900, Fig. 19) thereby triggering the "prioritization” and “engagement” algorithms described herein; d. target profiles can be prioritized per handler directive thus triggering the "prioritization” and “engagement” algorithms.
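The profile loading and prioritization behavior of steps 1214-1216 might look roughly like the following sketch. All names are hypothetical, and the actual biometric matching is outside its scope.

```python
from dataclasses import dataclass

@dataclass
class TargetProfile:
    name: str
    priority: int        # lower number = higher priority to engage
    image_ref: str = ""  # reference to the profile image in the database

def rank_identified(profiles, identified_names):
    """Given the loaded profiles and the set of names the video analytics
    has just identified in the frame, return the matches ordered so the
    highest-priority target would be marked first on the HUD."""
    hits = [p for p in profiles if p.name in identified_names]
    return sorted(hits, key=lambda p: p.priority)
```

Several profiles can be loaded at once, matching item c. above; re-sorting on each frame keeps the HUD markings consistent with the handler's priorities.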
  • If the video analytics is not activated in step 1214, then the system does not biometrically analyze pre-defined people's profiles, but is operative to address real-time threats only, in step 1218, and marks people on the HUD.
  • The handler decides on the suitable number of projectiles (i.e. rounds) in the magazine and on the number of projectiles shot per pull of trigger (i.e. burst) per the given mission, scenario and task team's doctrine. Screen shots associated with this step are shown in Figs. 17 and 18 hereinbelow.
  • In a defining rounds and bursts step 1220, the handler is operative to define/set the number of projectiles (i.e. rounds) in the magazine of the at least one combat element on the drone, in a rounds in magazine setting step 1222, enabling the system to count and display on the HUD the ammunition status in order to follow its consumption (see Figs. 17 and 18).
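The rounds and bursts bookkeeping that lets the HUD follow ammunition consumption can be sketched as a simple counter. The names are hypothetical; the real system would take the settings from the mission planner screens of Figs. 17 and 18.

```python
class AmmunitionCounter:
    """Tracks magazine state from the rounds-in-magazine and
    rounds-per-burst settings so the HUD can display consumption."""

    def __init__(self, rounds_in_magazine: int, rounds_per_burst: int):
        self.remaining = rounds_in_magazine
        self.rounds_per_burst = rounds_per_burst

    def fire_burst(self) -> int:
        # One pull of the trigger: expend a full burst, or a partial
        # burst if the magazine runs dry first.
        fired = min(self.rounds_per_burst, self.remaining)
        self.remaining -= fired
        return fired

    def hud_status(self) -> str:
        return f"AMMO {self.remaining}"
```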
  • In a choose mode of engagement step 1226, the handler decides on the suitable mode of engagement per the given mission, scenario and task team's doctrine:
  • an activate autonomous shot mode step 1228 (shoot on sight), in which the system autonomously engages and shoots a projectile at a threat immediately as it identifies it; or
  • an activate manual shot mode step 1230, in which the handler triggers the shot from the ground station to the combat element on the drone.
  • The system suggests engagement options to the handler by notifications and/or markings on the HUD (based on the threat prioritization algorithm and/or video analytics); and finally, v) a system launch step 1232 (go): the pre-mission planning enabled by the mission planning module ends here with the launch of the real-time control of the drone, enabled by the HUD module (see Fig. 19).
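The two engagement modes of steps 1228-1230 reduce to a small dispatch rule. The sketch below is illustrative only, with hypothetical names:

```python
from enum import Enum, auto

class EngagementMode(Enum):
    AUTONOMOUS_SHOT = auto()  # step 1228: shoot on sight
    MANUAL_SHOT = auto()      # step 1230: handler triggers from ground station

def on_threat_identified(mode: EngagementMode, handler_confirmed: bool) -> str:
    # Autonomous mode fires immediately upon identification; manual mode
    # only suggests the engagement on the HUD until the handler confirms.
    if mode is EngagementMode.AUTONOMOUS_SHOT:
        return "fire"
    return "fire" if handler_confirmed else "suggest-on-hud"
```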
  • Fig. 13 is a simplified flow chart of a standby mode method 1300 for complex indoor combat, in accordance with an embodiment of the present invention.
  • System 100 is deployed on-site; the engines 404 (Fig. 4) of drone 400 are switched off, but all detection systems onboard the drone are on (sensors, camera, video analytics, etc.).
  • In an identifying threat step 1304, any threat detected by the drone and/or ground station is defined as an active threat.
  • The drone is then activated and switched on in an activation step 1306.
  • The handler chooses, per a potential threat, a mode of operation. If a semi-autonomous mode is chosen in step 1309, then the handler is operative to activate the drone in semi-autonomous mode (per a notification by the system). If an autonomous mode was chosen in an autonomous mode choosing step 1307, then the drone is activated to function autonomously: the drone re-fires its engines and engages the threat (autonomously) after auto-launching in an auto-launch drone step 1310.
  • The pre-mission planning enabled by the mission planning module (ground station 120, Fig. 1) ends here with the launch of the real-time control of the system, enabled by the HUD module (see Fig. 19).
  • This configuration is relevant to the following implementations (refer to best mode of implementation): anti-piracy (fighting onboard pirates), homeland security (facilities and installations), indoor fire fighting, and alarm verification.
  • Sensors and/or video analytics identify a threat in step 1304, and trigger the system/drone.
  • The system acts as it was preset to on the mission planner module. Alternatively, if a semi-autonomous mode is chosen in step 1308, then the handler needs to decide in a decision step 1311 whether to activate the drone to engage the threat in a drone activation step 1312, or to leave it dormant in a standby mode (step 1314).
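The standby flow of Fig. 13 behaves like a small state machine: engines off but detection on, then either an auto-launch or a handler decision. A minimal sketch, with hypothetical state names:

```python
class StandbyDrone:
    """Engines off, all detection systems on (standby/ambush mode)."""

    def __init__(self, autonomous: bool):
        self.autonomous = autonomous
        self.state = "standby"

    def on_threat_detected(self) -> str:
        # Autonomous preset: re-fire engines and engage (step 1310).
        # Semi-autonomous: wait for the handler's decision (step 1311).
        self.state = "engaging" if self.autonomous else "awaiting-handler"
        return self.state

    def handler_decision(self, engage: bool) -> str:
        # Step 1312 (activate to engage) or step 1314 (stay dormant).
        if self.state == "awaiting-handler":
            self.state = "engaging" if engage else "standby"
        return self.state
```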
  • Fig. 14 is a simplified flow chart of a sleep mode method 1400 for complex indoor combat, in accordance with an embodiment of the present invention.
  • the handler decides on a mode of operation, such as a sleep mode.
  • Drone 110, 400, etc. is deployed on-site in a deploying onsite step 1404. Its engines and detection systems are off (sensors, camera, video analytics, etc.).
  • Upon a detected threat, the system will start its engines and detection sensors and will engage the threat (autonomously or by handler).
  • This configuration is relevant to the following implementations: anti-piracy (fighting on-board pirates), homeland security (facilities and installations), indoor fire-fighting and alarm verification.
  • In a threat detection step 1406, an alarm is activated and the drone is triggered.
  • The handler decides whether to activate the drone autonomously or semi-autonomously, by remotely activating that mode in steps 1408, 1410, respectively.
  • The system is operative to ask the handler whether to engage a threat in an asking step 1412. Typically, this is performed by the handler receiving a message on his phone or at the ground station. If yes, the drone is activated in a drone activation step 1414. If no, the drone remains in a sleep mode 1416. Per pre-defined directives set on the mission planner module of ground station 120 (Fig. 1), system 100 addresses the threat. According to one embodiment, the system acts as it was preset to on the mission planner module: if the autonomous mode was activated in step 1418, then, upon detection of a threat, the drone is auto-launched. The pre-mission planning enabled by the mission planning module (ground station) ends here with the launch of the real-time control of the drone, enabled by the HUD module (see Fig. 19).
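The sleep-mode decision of steps 1412-1418 can be summarized as a single function. This is a sketch under assumed names; per the text, the actual notification channel is a message to the handler's phone or to the ground station.

```python
def sleep_mode_response(autonomous_preset: bool, handler_reply=None) -> str:
    """Fig. 14 flow: the drone is fully off until an alarm triggers it.
    With an autonomous preset it auto-launches (step 1418); otherwise the
    handler is asked (step 1412) and the drone launches only on 'yes'."""
    if autonomous_preset:
        return "auto-launch"
    if handler_reply is None:
        return "sleep"  # no reply yet: remain dormant (step 1416)
    return "launch" if handler_reply else "sleep"
```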
  • Fig. 15 is a screen shot 1500 on a screen 1502 of a ground station 120 (Fig. 1) for activating drone 110, in accordance with an embodiment of the present invention.
  • a handler or user may choose, according to a location, time of day, threat type et cetera, whether to activate drone 110 in a semi-autonomous mode, by activating a semi- autonomous activation element 1504. Alternatively, he/she may activate the drone in an autonomous mode by activating an autonomous activation element 1506.
  • Screen 1502 further comprises a display of date and time data 1522, next buttons 1508, tabs for a mission 1510, an analytics tab 1512, a burst tab 1514, an engagement tab 1516, a heads up display tab 1518 and a settings tab 1520.
  • Fig. 16 is a screen shot 1600 on ground station screen 1502 for video analytics activation of drone 110, in accordance with an embodiment of the present invention.
  • Screen 1502, for video analytics activation, typically comprises a first face image 1602, a second face image 1604, a third face image 1606 and a fourth face image 1608.
  • Each face image is associated with respective face data 1610, 1612, 1614, 1616. This data may include, for example, a name, an age, a gender, an organization affiliation, a danger level and the like. Additionally, there may be spaces 1618 for uploading additional face data. Additionally or alternatively, the face images may be replaced with full body images.
  • Screen shot 1600 may further comprise a prioritize button 1624, for adjusting, resetting or setting the relative prioritization of the people's image profiles (1602, 1604, 1606, 1608): the higher the priority, the higher the priority for the drone to engage the actual person associated with the image.
  • The screen shot may further comprise a drag or drop symbol 1628, a skip button 1620 and a next button 1622.
  • Fig. 17 is a screen shot 1700 on a screen of ground station 120 (Fig. 1) for activation of rounds and bursts in system 100 of Fig. 1, in accordance with an embodiment of the present invention.
  • A user can set a number of rounds in a magazine by manipulating a rounds control number using button 1704 in a rounds control element 1702, and a number of rounds per burst by manipulating a rounds per burst control number 1708 in a rounds per burst control element 1706.
  • Fig. 18 is a screen shot 1800 on a ground station screen for choosing a mode of engagement in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • a user can choose a shoot on sight mode by activating an autonomous mode select element 1802 or a manual shot by activating a manual mode select element 1804.
  • Fig. 19 is a screen shot 1900 on a ground station screen for heads up display (HUD) in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • Screen shot 1900 presents the handler controller's graphic user interface (GUI) and its components: a mini map 1934, a current speed on a speedometer 1906, an altitude on an altitude meter 1908, a local time on date and time data 1522, a battery/energy indicator 1912 with a time estimate of the power percentage remaining, ammunition availability data 1916 with the type and number of rounds in the munitions store, a firing crosshair 1918 with a threat directional indicator 1920, a compass 1902 and an artificial horizon or azimuth 1904.
  • The HUD further shows a threat map 1992, showing a triangle of a shot 1930, a shot threat detected 1924, a movement detection 1926 and a triangle of movement 1932 (shown in greater detail in Figs. 20 and 21).
  • Figs. 20-21 deal with the threat map rather than with the 3D map. 3D mapping algorithm: sensors (Fig. 3A) map the site for immediate and future navigation, without a provided map of the indoor site and without GPS.
  • the screen shot further shows a return to base button 1942, an ambush or standby button 1944, an operation button 1946, a video analytics button 1948 and a 3D map button 1950.
  • Fig. 20 is a simplified threat map 2000 in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • The threat map includes a drone orientation direction 2001, a camera coverage area 2002, a gunshot direction 2003, a movement direction 2004, a movement and gunshot direction 2005, a gunshot location and position 2006, a movement location and position 2007 and a movement and gunshot location and position 2008.
  • Fig. 21 is another simplified threat map 2100 in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • The threat map includes a drone orientation direction 2101, a camera coverage area 2102, a gunshot direction 2103, a movement direction 2104, a movement and gunshot direction 2105, a gunshot location and position 2106, a movement location and position 2107 and a movement and gunshot location and position 2108.
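The threat-map items above combine a detection type with a direction relative to the drone's orientation. A sketch of how such indications might be represented and fused, using hypothetical names and a hypothetical 15-degree fusion tolerance:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str           # "gunshot", "movement" or "movement+gunshot"
    bearing_deg: float  # absolute bearing of the indication

def relative_bearing(detection: Detection, drone_heading_deg: float) -> float:
    # Bearing relative to the drone orientation direction, so the
    # indication can be drawn at the right angle on the threat map.
    return (detection.bearing_deg - drone_heading_deg) % 360.0

def fuse(gunshot: Detection, movement: Detection, tolerance_deg: float = 15.0):
    # If a gunshot and a movement arrive from (nearly) the same bearing,
    # merge them into a single movement-and-gunshot indication (cf. item 2105).
    diff = abs((gunshot.bearing_deg - movement.bearing_deg + 180.0) % 360.0 - 180.0)
    if diff <= tolerance_deg:
        return Detection("movement+gunshot", movement.bearing_deg)
    return None
```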
  • The modes of operation of drone 110 are described further as follows, with reference to items shown in the drawings, particularly with reference to Figs. 12-14.
  • Threat map: motion, gunshot and sound sensors alert, identify and direct towards potential threats.
  • Video analytics: identifies known threats, differentiates friend/foe, prioritizes threats and directs towards threats.
  • Mapping: sensors map the site for immediate and future navigation.
  • Navigation: the handler can navigate the system by either real-time mapping or by a provided map (without GPS).
  • Video analytics: identifies known threats, differentiates friend/foe, prioritizes threats and directs towards threats.
  • Mapping: sensors map the site in real-time for immediate and future navigation.
  • Navigation: the system can navigate by either real-time mapping or by a provided map (without GPS).
  • The system does not biometrically analyze pre-defined people's profiles, and addresses real-time threats only.
  • Threats can be prioritized per handler directive, thus triggering the "prioritization" and "engagement" algorithms.
  • The system biometrically analyzes pre-defined profiles in real-time; the handler can load a profile (image) from the database.
  • The handler can prioritize the objectives:
  • The system can then search and mark each identified profile on the HUD (thus triggering the "prioritization" and "engagement" algorithms).
  • Profiles can be prioritized per handler directive, thus triggering the "prioritization" and "engagement" algorithms.
  • The handler decides whether to engage and shoot a projectile.
  • Threat map: a graphic display mapping the type and direction of the sensors' indications, in relation to hovercraft movement.
  • 3D map: sensors map the site for immediate and future navigation.
  • The sensors' 360-degree coverage maps the surroundings and thus enables the display of a dynamically created 3D map.
  • The map is automatically saved and stored for later use.
  • The handler may change the priority, and thus the numbering, in real-time by touching the objective he now prioritizes as number one.
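Touch-to-reprioritize can be sketched as moving the touched objective to the front of the ordering and renumbering the rest. This is an illustrative sketch only; the function name is hypothetical.

```python
def touch_prioritize(objectives, touched):
    """Make the touched objective number one; the others keep their
    relative order behind it. Returns the new objective -> number map."""
    reordered = [touched] + [o for o in objectives if o != touched]
    return {name: i + 1 for i, name in enumerate(reordered)}
```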
  • Ambush/standby mode (Fig. 13): the system enters ambush mode, where it lands (autonomously or by handler) and shuts its engines but keeps all detection systems on (sensors, camera, video analytics, etc.). Upon a detected threat, the system resumes its engines and engages the threat (autonomously or by handler).
  • The handler can choose to change the mode of operation (see 1.) during the actual mission, rather than as a pre-defined mode at the mission planner module.
  • The handler can launch the video analytics interface (see 2.) during the actual mission (rather than as a pre-defined mode at the mission planner module). If the handler skipped this phase at the mission planning stage, he can still launch it through the HUD.
  • The handler can remove/display the 3D map overlay.
  • Pack/squadron coordinated flight/fight support: i. formation flight and attack algorithm; ii. 'one leader, others follow' or 'coordinated flight and attack of independent drones'.
  • Socket-enabled charging: any socket, any current.
  • Video content analytics: i. automatic intrusion detection, ensuring perimeter control; ii. automatic abnormal behavior detection, minimizing civilian casualties by incriminating the real threats.
  • Stalking/ambush mode: all systems ON, engine OFF.
  • Threat detection and engagement algorithm: i. collecting data from location sensors, sensors and video analytics.
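One plausible reading of step i. is evidence fusion: each data source contributes to a threat score, and corroborated detections are engaged first. The weights and threshold below are purely illustrative assumptions, not values from the patent.

```python
def threat_score(location_hit: bool, sensor_hit: bool, analytics_hit: bool) -> int:
    # Fuse the three sources named in the algorithm: location sensors,
    # onboard sensors and video analytics. Analytics confirms a *known*
    # threat, so it is weighted more heavily here (an assumed weighting).
    score = 0
    if location_hit:
        score += 1
    if sensor_hit:
        score += 1
    if analytics_hit:
        score += 2
    return score

def should_engage(score: int, threshold: int = 2) -> bool:
    # Engage only when at least two independent indications (or one
    # analytics confirmation) corroborate the threat.
    return score >= threshold
```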
  • The drone is to be stationed onboard a ship, at strategic locations such as the bridge, engine room, crew quarters, etc.
  • The drone addresses the on-board pirates and is operative to engage them under various operational scenarios, for example boarding prevention, hijacking prevention, hostage rescue, and securing the bridge and staff.
  • The drone uses the distributed Wi-Fi network to operate within the vessel and the satellite network for external communication.
  • The system is to be stationed onboard the ship, at strategic locations, for example the bridge, engine room, goods' storage, etc.
  • The system can operate autonomously and/or piloted, in a single or a coordinated pack configuration.
  • Command is initiated by the video content analytics (either of the ship or of the system itself), or by the embedded sensors once a movement or gunshots are identified; iii. direct order by the crew or by a remote/online security officer located (remotely) at base.
  • Control options: mobile mode (Fig. 12);
  • the system can identify a breach or an unauthorized access to a secure perimeter and respond;
  • v. Hijacking prevention: the access prevention and secured bridge are to foil any attempt to hijack the ship, its crew and its cargo;
  • A drone (110, Fig. 1) is stationed onboard a ship, at strategic locations, for example the bridge, engine room and crew quarters.
  • the drone addresses the on-board pirates and engages them under various operational scenarios, for example, boarding prevention, hijacking prevention, hostage rescue, securing the bridge and staff.
  • the drone uses the distributed Wi-Fi network to operate within the vessel and satellite network for external communication.
  • A fire alarm, for example in a warehouse.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to systems and methods for combating a threat, the system comprising a motorized apparatus capable of hovering and of supporting at least one engagement element; at least one detection element capable of conveying detection data; an apparatus positioning element capable of conveying system positioning data; an onboard processor capable of receiving and processing the detection data from the at least one detection element and the positioning data from the at least one apparatus positioning element, the processor being configured to output at least one command to the at least one engagement element in order to deactivate the threat; at least one communication element; and a ground station for controlling the motorized apparatus and for receiving data from the at least one communication element.
PCT/IL2014/000043 2013-08-31 2014-08-28 Robotic system and method for complex indoor combat Ceased WO2015029007A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361872638P 2013-08-31 2013-08-31
US61/872,638 2013-08-31
US201361916815P 2013-12-17 2013-12-17
US61/916,815 2013-12-17

Publications (1)

Publication Number Publication Date
WO2015029007A1 true WO2015029007A1 (fr) 2015-03-05

Family

ID=52585686

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2014/000043 Ceased WO2015029007A1 (fr) 2014-08-28 Robotic system and method for complex indoor combat

Country Status (2)

Country Link
IL (1) IL234372B (fr)
WO (1) WO2015029007A1 (fr)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016162342A1 (fr) * 2015-04-07 2016-10-13 Pixiel Systeme de declenchement du decollage d'un drone a la suite de la detection d'un evenement survenu a un endroit determine
BE1022965B1 (nl) * 2015-04-21 2016-10-24 Airobot Samenstel voor onbemand luchtvaartuig, onbemand luchtvaartuig met het samenstel, en werkwijze voor het aansturen ervan
JP2017036007A (ja) * 2015-08-12 2017-02-16 富士ゼロックス株式会社 画像形成装置および画像形成システム
ES2607723A1 (es) * 2015-10-02 2017-04-03 Universidad De Castilla La Mancha Dispositivo para la detección a distancia de elementos perturbadores sobre una superficie
DE102015014502A1 (de) * 2015-11-10 2017-05-11 Mbda Deutschland Gmbh Hilfstragflügeleinrichtung
EP3182390A1 (fr) * 2015-12-08 2017-06-21 Micro APPS Group Inventions LLC Dispositif de sûreté et de sécurité autonome sur une plate-forme sans équipage sous commande et contrôle d'un téléphone cellulaire
WO2017127491A1 (fr) 2016-01-20 2017-07-27 Babak Rezvani Dispositif de commande de drone
RU2628351C1 (ru) * 2016-04-14 2017-08-16 Сергей Николаевич ПАВЛОВ Противотанковая мина "Стрекоза-М" с возможностью пространственного перемещения с зависанием и переворачиванием в воздухе, разведки, нейтрализации и поражения мобильных бронированных целей
WO2017137393A1 (fr) * 2016-02-10 2017-08-17 Tyco Fire & Security Gmbh Système de détection d'incendie utilisant un drone
DE102016109242A1 (de) * 2016-05-19 2017-11-23 Keil Group GmbH Überwachungssystem
WO2018010909A1 (fr) * 2016-07-12 2018-01-18 Minimax Gmbh & Co. Kg Système et procédé de détermination vérifiée d'un état d'incendie ainsi que véhicule et unité centrale associés
CN107643762A (zh) * 2017-08-07 2018-01-30 中国兵器工业计算机应用技术研究所 自主导航的无人机系统及其导航方法
WO2018063076A1 (fr) * 2016-09-29 2018-04-05 Dynamic Solutions Group Sweden Ab Système de support aérien proche portable et transporteur de charge utile
GR1009313B (el) * 2017-03-30 2018-06-19 Τεχνολογικο Εκπαιδευτικο Ιδρυμα Ανατολικης Μακεδονιας Και Θρακης Συστημα αυτοματοποιημενης διασωσης
GR20160100501A (el) * 2016-10-04 2018-06-27 Ηλιας Θωμα Σαραφης Συστημα ρομποτικης εναεριας επεμβασης για παροχη βοηθειας σε χρηστες
WO2018150492A1 (fr) * 2017-02-15 2018-08-23 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Procédé d'affichage d'image, système d'affichage d'image, objet volant, programme et support d'enregistrement
US10134293B2 (en) 2016-03-21 2018-11-20 Walmart Apollo, Llc Systems and methods for autonomous drone navigation
EP3268276A4 (fr) * 2015-03-12 2018-12-05 Alarm.com Incorporated Assistance robotisée pendant un contrôle de sécurité
CN109189099A (zh) * 2018-11-09 2019-01-11 福州大学 一种四旋翼无人机的图形化控制组态方法
EP3447436A1 (fr) * 2017-08-25 2019-02-27 Aurora Flight Sciences Corporation Système d'interception de véhicule aérien
JP2019064465A (ja) * 2017-09-29 2019-04-25 株式会社エアロネクスト プロペラガード
DE102017223753A1 (de) * 2017-12-22 2019-06-27 Thyssenkrupp Ag Drohnensystem, Schächte für ein Drohnensystem und Verfahren zum Transport von Lasten in einem Schacht mit einer Drohne
US10551810B2 (en) 2017-06-20 2020-02-04 Ademco Inc. System and method to improve the privacy of homes and other buildings having a connected home security/control system and subject to intrusions by unmanned aerial vehicles
DE102019110205A1 (de) * 2019-04-17 2020-10-22 Krauss-Maffei Wegmann Gmbh & Co. Kg Verfahren zum Betrieb eines vernetzten militärischen Verbands
US11009877B2 (en) 2016-07-12 2021-05-18 Minimax Gmbh & Co. Kg Unmanned vehicle, system, and method for initiating a fire extinguishing action
US11009887B2 (en) 2018-07-26 2021-05-18 Toyota Research Institute, Inc. Systems and methods for remote visual inspection of a closed space
US11064184B2 (en) 2017-08-25 2021-07-13 Aurora Flight Sciences Corporation Aerial vehicle imaging and targeting system
WO2022069957A1 (fr) * 2020-09-29 2022-04-07 Rafael Advanced Defense Systems Ltd. Plate-forme aérienne armée
US11767129B2 (en) 2020-01-31 2023-09-26 Southeastern Pennsylvania Unmanned Aircraft Systems, Llc Drone delivery system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110204188A1 (en) * 2010-02-24 2011-08-25 Robert Marcus Rotocraft
US8521339B2 (en) * 2008-09-09 2013-08-27 Aeryon Labs Inc. Method and system for directing unmanned vehicles

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8521339B2 (en) * 2008-09-09 2013-08-27 Aeryon Labs Inc. Method and system for directing unmanned vehicles
US20110204188A1 (en) * 2010-02-24 2011-08-25 Robert Marcus Rotocraft

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DR. NILS MELZER, EUROPEAN PARLIAMENT, HUMAN RIGHTS IMPLICATIONS OF THE USAGE OF DRONES AND UNMANNED ROBOTS IN WARFARE, 3 May 2013 (2013-05-03), pages 7 - 13 *

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10698403B2 (en) 2015-03-12 2020-06-30 Alarm.Com Incorporated Robotic assistance in security monitoring
EP3268276A4 (fr) * 2015-03-12 2018-12-05 Alarm.com Incorporated Assistance robotisée pendant un contrôle de sécurité
AU2022201806B2 (en) * 2015-03-12 2024-01-04 Alarm.Com Incorporated Robotic assistance in security monitoring
US11409277B2 (en) 2015-03-12 2022-08-09 Alarm.Com Incorporated Robotic assistance in security monitoring
AU2021269286B2 (en) * 2015-03-12 2022-01-27 Alarm.Com Incorporated Robotic assistance in security monitoring
AU2020213380B2 (en) * 2015-03-12 2021-11-04 Alarm.Com Incorporated Robotic assistance in security monitoring
FR3034884A1 (fr) * 2015-04-07 2016-10-14 Pixiel Systeme de declenchement du decollage d'un drone a la suite de la detection d'un evenement survenu a un endroit determine
WO2016162342A1 (fr) * 2015-04-07 2016-10-13 Pixiel Systeme de declenchement du decollage d'un drone a la suite de la detection d'un evenement survenu a un endroit determine
BE1022965B1 (nl) * 2015-04-21 2016-10-24 Airobot Samenstel voor onbemand luchtvaartuig, onbemand luchtvaartuig met het samenstel, en werkwijze voor het aansturen ervan
JP2017036007A (ja) * 2015-08-12 2017-02-16 富士ゼロックス株式会社 画像形成装置および画像形成システム
ES2607723A1 (es) * 2015-10-02 2017-04-03 Universidad De Castilla La Mancha Dispositivo para la detección a distancia de elementos perturbadores sobre una superficie
DE102015014502A1 (de) * 2015-11-10 2017-05-11 Mbda Deutschland Gmbh Hilfstragflügeleinrichtung
EP3182390A1 (fr) * 2015-12-08 2017-06-21 Micro APPS Group Inventions LLC Dispositif de sûreté et de sécurité autonome sur une plate-forme sans équipage sous commande et contrôle d'un téléphone cellulaire
US10768625B2 (en) 2016-01-20 2020-09-08 Alarm.Com Incorporated Drone control device
US10228695B2 (en) 2016-01-20 2019-03-12 Alarm.Com Incorporated Drone control device
WO2017127491A1 (fr) 2016-01-20 2017-07-27 Babak Rezvani Dispositif de commande de drone
EP3405846A4 (fr) * 2016-01-20 2019-01-23 Alarm.com Incorporated Dispositif de commande de drone
WO2017137393A1 (fr) * 2016-02-10 2017-08-17 Tyco Fire & Security Gmbh Système de détection d'incendie utilisant un drone
US10134293B2 (en) 2016-03-21 2018-11-20 Walmart Apollo, Llc Systems and methods for autonomous drone navigation
RU2628351C1 (ru) * 2016-04-14 2017-08-16 Сергей Николаевич ПАВЛОВ Противотанковая мина "Стрекоза-М" с возможностью пространственного перемещения с зависанием и переворачиванием в воздухе, разведки, нейтрализации и поражения мобильных бронированных целей
DE102016109242A1 (de) * 2016-05-19 2017-11-23 Keil Group GmbH Überwachungssystem
US11009877B2 (en) 2016-07-12 2021-05-18 Minimax Gmbh & Co. Kg Unmanned vehicle, system, and method for initiating a fire extinguishing action
CN109416864A (zh) * 2016-07-12 2019-03-01 德国美力有限两合公司 用于经验证地确定火灾状态的系统和方法以及用于此的行驶工具和中央单元
CN109416864B (zh) * 2016-07-12 2020-12-22 德国美力有限两合公司 用于经验证地确定火灾状态的系统和方法以及用于此的行驶工具和中央单元
US10825335B2 (en) 2016-07-12 2020-11-03 Minimax Gmbh & Co. Kg System and method for the verified determining of a fire status, as well as vehicle and central unit for this purpose
WO2018010909A1 (fr) * 2016-07-12 2018-01-18 Minimax Gmbh & Co. Kg Système et procédé de détermination vérifiée d'un état d'incendie ainsi que véhicule et unité centrale associés
WO2018063076A1 (fr) * 2016-09-29 2018-04-05 Dynamic Solutions Group Sweden Ab Système de support aérien proche portable et transporteur de charge utile
GR1009387B (el) * 2016-10-04 2018-10-25 Θωμας Ηλια Σαραφης Συστημα ρομποτικης εναεριας επεμβασης για παροχη βοηθειας σε χρηστες
GR20160100501A (el) * 2016-10-04 2018-06-27 Ηλιας Θωμα Σαραφης Συστημα ρομποτικης εναεριας επεμβασης για παροχη βοηθειας σε χρηστες
US11082639B2 (en) 2017-02-15 2021-08-03 SZ DJI Technology Co., Ltd. Image display method, image display system, flying object, program, and recording medium
JPWO2018150492A1 (ja) * 2017-02-15 2019-12-19 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd 画像表示方法、画像表示システム、飛行体、プログラム、及び記録媒体
WO2018150492A1 (fr) * 2017-02-15 2018-08-23 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Procédé d'affichage d'image, système d'affichage d'image, objet volant, programme et support d'enregistrement
GR1009313B (el) * 2017-03-30 2018-06-19 Τεχνολογικο Εκπαιδευτικο Ιδρυμα Ανατολικης Μακεδονιας Και Θρακης Συστημα αυτοματοποιημενης διασωσης
US10551810B2 (en) 2017-06-20 2020-02-04 Ademco Inc. System and method to improve the privacy of homes and other buildings having a connected home security/control system and subject to intrusions by unmanned aerial vehicles
CN107643762A (zh) * 2017-08-07 2018-01-30 中国兵器工业计算机应用技术研究所 自主导航的无人机系统及其导航方法
JP2019060589A (ja) * 2017-08-25 2019-04-18 オーロラ フライト サイエンシズ コーポレーション 航空輸送体の迎撃システム
KR20190022406A (ko) * 2017-08-25 2019-03-06 오로라 플라이트 사이언시스 코퍼레이션 공중 비히클 요격 시스템
US10495421B2 (en) 2017-08-25 2019-12-03 Aurora Flight Sciences Corporation Aerial vehicle interception system
KR102600479B1 (ko) * 2017-08-25 2023-11-08 오로라 플라이트 사이언시스 코퍼레이션 공중 비히클 요격 시스템
EP3447436A1 (fr) * 2017-08-25 2019-02-27 Aurora Flight Sciences Corporation Système d'interception de véhicule aérien
US11064184B2 (en) 2017-08-25 2021-07-13 Aurora Flight Sciences Corporation Aerial vehicle imaging and targeting system
US11126204B2 (en) 2017-08-25 2021-09-21 Aurora Flight Sciences Corporation Aerial vehicle interception system
JP2019064465A (ja) * 2017-09-29 2019-04-25 株式会社エアロネクスト プロペラガード
DE102017223753A1 (de) * 2017-12-22 2019-06-27 Thyssenkrupp Ag Drohnensystem, Schächte für ein Drohnensystem und Verfahren zum Transport von Lasten in einem Schacht mit einer Drohne
US11009887B2 (en) 2018-07-26 2021-05-18 Toyota Research Institute, Inc. Systems and methods for remote visual inspection of a closed space
CN109189099B (zh) * 2018-11-09 2021-07-13 福州大学 一种四旋翼无人机的图形化控制组态方法
CN109189099A (zh) * 2018-11-09 2019-01-11 福州大学 一种四旋翼无人机的图形化控制组态方法
DE102019110205A1 (de) * 2019-04-17 2020-10-22 Krauss-Maffei Wegmann Gmbh & Co. Kg Verfahren zum Betrieb eines vernetzten militärischen Verbands
US11767129B2 (en) 2020-01-31 2023-09-26 Southeastern Pennsylvania Unmanned Aircraft Systems, Llc Drone delivery system
WO2022069957A1 (fr) * 2020-09-29 2022-04-07 Rafael Advanced Defense Systems Ltd. Plate-forme aérienne armée
IL277712B1 (en) * 2020-09-29 2024-02-01 Rafael Advanced Defense Systems Ltd Armed aerial platform
US11981459B2 (en) 2020-09-29 2024-05-14 Rafael Advanced Defense Systems Ltd. Armed aerial platform
IL277712B2 (en) * 2020-09-29 2024-06-01 Rafael Advanced Defense Systems Ltd Armed aerial platform

Also Published As

Publication number Publication date
IL234372B (en) 2020-02-27

Similar Documents

Publication Publication Date Title
WO2015029007A1 (fr) Robotic system and method for complex indoor combat
US20220406151A1 (en) Threat identification device and system with optional active countermeasures
US11879705B2 (en) System and method for active shooter defense
US10099785B1 (en) Drone with ring assembly
US6903676B1 (en) Integrated radar, optical surveillance, and sighting system
US20140251123A1 (en) Unmanned range-programmable airburst weapon system for automated tracking and prosecution of close-in targets
US20250028335A1 (en) Systems and methods for managing unmanned vehicle interactions with various payloads
US20180362157A1 (en) Modular unmanned aerial system
US20130192451A1 (en) Anti-sniper targeting and detection system
KR102034494B1 (ko) 악용된 드론을 무력화하는 안티드론 시스템 및 운용방법
JP2019070510A (ja) 航空輸送体の撮像及び照準システム
EP3625125A1 (fr) Syst me et procédé pour intercepter et contrer des véhicules aériens sans pilote (uav)
CN110624189B (zh) 无人机机载灭火弹装置、消防无人机以及发射控制方法
KR20130009894A (ko) 공간정보기술을 이용한 근거리 정밀타격 무인항공기시스템
CN212332970U (zh) 无人机机载灭火弹装置、消防无人机
US9716862B1 (en) System and methods for capturing situational awareness
Sinclair Proposed rules to determine the legal use of autonomous and semi-autonomous platforms in domestic US law enforcement
Sözübir UAV Autonomy in Turkey and Around the World: The “Terminator” Debate
Fortune et al. Counter‐Unmanned Aerial Vehicle Systems: Technical, Training, and Regulatory Challenges
Kesavaraj et al. Security framework for net gun-equipped unmanned aerial vehicles
Vas et al. Comprehensive Study of Military and Civil Drone Applications: Assessing Key Areas of Significance and Future Prospects
KR102009637B1 (ko) 재난 및 비상상황 시 구호활동을 하는 드론
KR20190097609A (ko) 범죄자 무력화 드론
Snyder Design requirements for weaponizing man-portable UAS in support of counter-sniper operations
Jan et al. Use of modern technologies for combat units preparation and management

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14839549

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14839549

Country of ref document: EP

Kind code of ref document: A1