
WO2015029007A1 - Robotic system and method for complex indoor combat - Google Patents

Robotic system and method for complex indoor combat

Info

Publication number
WO2015029007A1
Authority
WO
WIPO (PCT)
Prior art keywords
threat
drone
engagement
data
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IL2014/000043
Other languages
French (fr)
Inventor
Ronen Izidor GABBAY
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of WO2015029007A1 publication Critical patent/WO2015029007A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41A FUNCTIONAL FEATURES OR DETAILS COMMON TO BOTH SMALLARMS AND ORDNANCE, e.g. CANNONS; MOUNTINGS FOR SMALLARMS OR ORDNANCE
    • F41A23/00 Gun mountings, e.g. on vehicles; Disposition of guns on vehicles
    • F41A23/56 Arrangements for adjusting the gun platform in the vertical or horizontal position
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41G WEAPON SIGHTS; AIMING
    • F41G3/00 Aiming or laying means
    • F41G3/14 Indirect aiming means
    • F41G3/16 Sighting devices adapted for indirect laying of fire
    • F41G3/165 Sighting devices adapted for indirect laying of fire using a TV-monitor
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0038 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0044 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/15 UAVs specially adapted for particular uses or applications for conventional or electronic warfare
    • B64U2101/18 UAVs specially adapted for particular uses or applications for conventional or electronic warfare for dropping bombs; for firing ammunition
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/70 UAVs specially adapted for particular uses or applications for use inside enclosed spaces, e.g. in buildings or in vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/20 Remote controls

Definitions

  • the present invention relates generally to robotic systems and methods, and more specifically to methods and systems for robotic systems for complex indoor combat.
  • Robotic solutions are lacking for various scenarios of confined and complex indoors combat, such as, but not limited to, within commercial and private residences, in educational establishments, commercial establishments, in transport systems, such as underground trains, in tunnels, in airplanes, within ships, for prevention of boarding of hostile persons, hijacking prevention, hostage rescue and securing a bridge and/or staff.
  • improved unmanned methods and apparatus are provided for complex indoor combat.
  • a system comprising an integrated robotic drone comprising multiple sensors, optics, combat elements and supporting software, including video analytics, adapted for mission planning, with a heads-up display (HUD).
  • flying robotic systems are provided for indoor combat.
  • the present invention provides a robotic stalk and attack aerial hover system ("drone") designed, geared and aimed at dense urban confined and complex indoors combat.
  • the system of the present invention is constructed and configured to effectively identify, incriminate and eliminate threats and/or designated targets at urban scenarios; namely confined complex indoors environments, autonomously and/or piloted; on a single or a coordinated pack configuration.
  • the robotic drone of the present invention is configured to effectively operate in a complex and lethal environment.
  • a system for indoors engagement of a threat, including:
  • a. a motorized apparatus adapted to hover and hold
  • an onboard processor adapted to receive and process said detection data from said at least one detection element and said location data from said at least one apparatus location data element, said processor being configured to output at least one command to said at least one engagement element to inactivate said threat;
  • v. at least one communication element; and b. a ground station for monitoring said motorized apparatus and for receiving data from said at least one communication element.
  • the processor is further adapted to perform video analytics on data from the detection data. Furthermore, according to an embodiment of the present invention, the processor is further adapted to identify the threat responsive to the video analytics.
  • the motorized apparatus includes robotics adapted to activate the at least one engagement element to engage and inactivate the threat.
  • the motorized apparatus is adapted to navigate three-dimensional indoor contours.
  • the at least one engagement element is selected from the group consisting of a gun, a missile, a projector, a gas canister, a fire extinguisher, a grenade, a non-lethal weapon, an immobilizer weapon, a submachine gun and combinations thereof.
  • the at least one detection element is selected from the group consisting of a sound sensor, an infrared sensor, a position sensor, an ultrasonic sensor, a movement detection sensor, a camera, a laser, a visualization sensor, an optic flow sensor and combinations thereof.
  • the at least one location element is selected from the group consisting of a global position system (GPS) element, a position sensor, a camera, a smartphone, an optic sensor and combinations thereof.
  • the motorized apparatus and the ground station communicate via at least one communication link.
  • the at least one communication link is selected from the group consisting of an IP peer-to-peer communication link, a cellular communication link, a satellite communication link, an RF communication link, an internet link and combinations thereof.
  • the ground station further includes a screen adapted to display a real-time heads up display (HUD) of the motorized apparatus.
  • the motorized apparatus is unmanned and is selected from an airborne drone, a flying hovering apparatus, a plane, a helicopter and a hovercraft.
  • the ground station includes a computer; a hand-operated remote control apparatus; an antenna; and at least one communications link.
  • the motorized apparatus is a drone, adapted for indoor use.
  • the motorized apparatus is adapted to move along vertical and horizontal conduits.
  • the drone includes robotics adapted to activate the at least one engagement element.
  • the ground station includes on-screen heads up display.
  • the detecting step includes employing video analytics and/or detection sensors for identification of the threat.
  • the activating step enables the apparatus to act autonomously.
  • the activating step enables the apparatus to act semi-autonomously.
  • the activating step enables the apparatus to travel along horizontal and vertical conduits to approach the threat.
  • the apparatus is adapted for non-GPS navigation.
  • the apparatus is adapted for day and night activation.
  • the apparatus has a predefined mission plan.
  • the mission plan supports a plurality of engagement modes.
  • the apparatus is controlled from a ground station by heads up display.
  • the motorized apparatus acts autonomously, without communicating with the ground station.
  • the engagement of the threat occurs in an underground environment.
  • the underground environment includes at least one tunnel.
  • the motorized apparatus acts autonomously in the at least one tunnel.
  • a software product for indoor engagement of a threat, including a computer-readable medium in which program instructions are stored, which instructions, when read by a computer (on the drone), cause the computer to:
  • the product comprises a plurality of algorithms, each algorithm providing a set of instructions to said motorized apparatus to activate at least one of robotics and the at least one engagement element.
  • a system for neutralization of an onboard threat on a waterborne vehicle including: a. a motorized apparatus adapted to hover and hold:
  • an onboard processor adapted to receive and process the detection data from the at least one detection element and the location data from the at least one apparatus location data element, the processor being configured to output at least one command to the at least one engagement element to inactivate the threat;
  • a ground station for monitoring the motorized apparatus and for receiving data from the at least one communication element.
  • the drone (motorized apparatus) is adapted to act autonomously with no need for an operator to be present. Alternatively, the operator can merely watch the action without interfering.
  • the present invention will be more fully understood from the following detailed description of the preferred embodiments thereof, taken together with the drawings.
  • Fig. 1 is a simplified pictorial illustration showing a system for complex indoor combat, in accordance with an embodiment of the present invention
  • Fig. 2 is a simplified pictorial illustration showing a ground station of the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 3A is a simplified schematic illustration of a drone in the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 3B is a simplified schematic illustration of a system for complex indoor combat, in accordance with an embodiment of the present invention
  • Fig. 4 is a simplified pictorial illustration of a drone in the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 5 is a vertical cross section of the drone of Fig. 4, in accordance with an embodiment of the present invention.
  • Fig. 6A is a top view of a drone showing at least one collision sensor, in accordance with an embodiment of the present invention.
  • Fig. 6B is a bottom view of a drone showing at least one collision sensor, in accordance with an embodiment of the present invention
  • Fig. 6C is a side view of a drone showing at least one threat detection sensor, in accordance with an embodiment of the present invention
  • Fig. 7A is a top view of a drone showing at least one optical sensor, in accordance with an embodiment of the present invention
  • Fig. 7B is a side view of a drone showing at least one optical sensor, in accordance with an embodiment of the present invention
  • Fig. 7C is another side view of a drone showing at least one optical sensor, in accordance with an embodiment of the present invention.
  • Fig. 8A is a top view of a drone showing at least one motion detection sensor, in accordance with an embodiment of the present invention.
  • Fig. 8B is a bottom view of a drone showing at least one motion detection sensor, in accordance with an embodiment of the present invention.
  • Fig. 8C is a side view of a drone showing at least one motion detection sensor, in accordance with an embodiment of the present invention
  • Fig. 9A is a side view of a drone showing at least one combat element, in accordance with an embodiment of the present invention.
  • Fig. 9B is another side view of a drone showing at least one combat element, in accordance with an embodiment of the present invention.
  • Fig. 9C is another side view of a drone showing at least one combat element, in accordance with an embodiment of the present invention.
  • Fig. 10A is a top view of a drone showing at least one antenna, in accordance with an embodiment of the present invention.
  • Fig. 10B is a side view of a drone showing at least one antenna, in accordance with an embodiment of the present invention
  • Fig. 11A is a simplified pictorial illustration showing a portable ground station of the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 11B is a simplified pictorial illustration showing a portable touch-screen ground station of the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 12 is a simplified flow chart of a mobile configuration method for complex indoor combat, in accordance with an embodiment of the present invention
  • Fig. 13 is a simplified flow chart of a standby mode method for complex indoor combat, in accordance with an embodiment of the present invention
  • Fig. 14 is a simplified flow chart of a sleep mode method for complex indoor combat, in accordance with an embodiment of the present invention
  • Fig. 15 is a screen shot on a ground station screen for activating a drone and choosing a mode of operation, in accordance with an embodiment of the present invention
  • Fig. 16 is a screen shot on a ground station screen for video analytics activation of a drone, in accordance with an embodiment of the present invention
  • Fig. 17 is a screen shot (rounds and bursts) on a ground station screen for activation of rounds and bursts in the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 18 is a screen shot on a ground station screen for choosing a mode of engagement in the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 19 is a screen shot on a ground station screen for heads up display in the system of Fig. 1, in accordance with an embodiment of the present invention
  • Fig. 20 is a simplified threat map in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • Fig. 21 is another simplified threat map in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • similar reference numerals identify similar parts.
  • Fig. 1 is a simplified pictorial illustration showing a system 100 for complex indoor combat, in accordance with an embodiment of the present invention.
  • System 100 comprises at least one drone 110, a ground station 120, at least one communication device 130, adapted to communicate with the ground station and the drone via at least one wireless communication network 140. Further details of ground station 120 are shown in Fig. 2. Schematics of the drone are shown in Fig. 3 A. A schematic of system 100 is shown in Fig. 3B. Further mechanical details of the drone are shown in Fig. 4. It should be understood from Fig. 1 that the system may use one or more drones of various configurations, one or more communication systems and one or more mobile devices 130.
  • FIG. 2 is a simplified pictorial illustration showing a ground station 200 of the system of Fig. 1, in accordance with an embodiment of the present invention.
  • Ground station 200, similar or identical to ground station 120 of Fig. 1, is housed in a suitcase 201 and comprises a remote controller 202, an antenna 204 servicing the remote controller, a computer 210 and an antenna servicing the computer.
  • Drone 300 may be similar or identical to drone 110 of system 100.
  • the drone comprises an onboard processor 302 in communication with a flight system 304, at least one of a camera 314 or smartphone, and sensors 306, 308, 310, 312, exemplified by, but not limited to, a sound sensor 306, an infrared sensor 308, an ultrasonic sensor 310 and an optic flow sensor 312.
  • the arrows shown in Fig. 3A represent one embodiment of data flow within the drone.
  • FIG. 3B is a simplified schematic illustration of a system 350 for complex indoor combat, in accordance with an embodiment of the present invention.
  • System 350 comprises a ground station 352 in communication with an on-board system 358 and a flight system controller 360 (for example, a Pixhawk autopilot) of a copter 354.
  • the flight system (such as, but not limited to, a Pixhawk) is in communication with a sensor controller 362 and sensors 356 communicating therewith.
  • Some of the components of system 100 (Fig. 1) and/or system 350 (Fig. 3B) and their functions are provided in Table 1. It should be understood that each of these components may be purchased commercially from commercial establishments in this field.
  • Radio controller receiver - receives control commands from a distant radio controller
  • Fig. 4 is a simplified pictorial illustration of a drone 400 (also called a copter or hover apparatus) in system 100 of Fig. 1, in accordance with an embodiment of the present invention.
  • the drones of the present invention are lightweight and have a protective frame 402.
  • the frame may be made of plastic, a polymeric material, a metal, aluminum, carbon fiber, an alloy and combinations thereof.
  • the light protective frame of drone 400 may be constructed in various colors, with various camouflages, making it both easy to move and durable.
  • the drone is constructed and configured to be a robotic stalk & attack hover apparatus 400, which may be stationed indoors, underground or onboard a ship, at strategic locations (such as on a bridge, in an engine room or in crew living quarters).
  • Drone 400 is developed, designed, geared and aimed at dense confined and complex indoor combat and thus can effectively identify, incriminate and eliminate threats and/or designated targets at the confined and complex indoors combat within ships, for example and at various operational scenarios, such as, but not limited to, terrorist attacks in buildings, airports, airplanes, educational facilities, underground trains, commercial establishments, residential buildings, boarding prevention, hijacking prevention, hostage rescue, securing the bridge and staff of a ship.
  • the drone When triggered; by an order, command or an alarm, the drone is constructed and configured to operate autonomously and/or piloted; in a single drone or in a coordinated multiple drone pack configuration.
  • this robotic drone is to effectively operate in the most complex and lethal indoor environments.
  • Drone 400 (similar or identical to drone 110, Fig. 1) typically comprises the following basic functionalities and components: a. a protective frame 402 constructed as a hardened structure to protect rotors and components; b. at least one engine 404 adapted to activate rotors and powered by batteries or other power supply; c. 4-8 propellers/rotors 406 (the number of rotors is based on operational requirements, for example, weight of payload), supporting a payload of a minimum of 2 kg; d. batteries 504 (Fig. 5) or another electrical or fuel power plant adapted to provide hover endurance of a minimum of 10 minutes; e. a camera 408 (which may be part of a smartphone 409, not shown); f.
  • the drones of the present invention are, according to some embodiments, small (up to 0.75 meter in length/width), adapted for easy penetration through standard doorways, windows, corridors, elevator shafts and staircases.
  • the drones of the present invention are, according to some embodiments, silent with suppressed engine and rotors noise.
  • the drones may be in standby mode with components, mentioned hereinabove, activated, with the engine off, with immediate engine restart support.
  • the drones of the present invention are, according to some embodiments, lightweight, fabricated of light yet durable materials, such as composite materials known in the art.
  • the drones of the present invention are, according to some embodiments, activated in accordance with any one or more of the following algorithms, embedded, for example, in processor 302 (Fig. 3A):
  • the algorithm integrates data provided by the system's sensors (see Fig. 3A: infrared, ultrasonic, optic flow, video, other) into one coherent orientation picture, thus preventing the drone from colliding with obstacles.
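By way of illustration only (this sketch is not part of the patent disclosure), the sensor-fusion idea above can be pictured as merging all range readings into a single nearest-obstacle-per-direction picture that vetoes unsafe motion commands; all names, thresholds and the velocity convention below are assumptions:

```python
# Hypothetical sketch of the sensor-fusion step: readings from several range
# sensors are merged into one "orientation picture" (nearest obstacle per
# direction), which a guard rule then uses to veto motion commands that would
# cause a collision. All identifiers are illustrative.
from dataclasses import dataclass

@dataclass
class RangeReading:
    direction: str    # "front", "back", "left", "right", "up", "down"
    distance_m: float
    source: str       # "infrared", "ultrasonic", "optic_flow", ...

def fuse_orientation_picture(readings):
    """Keep the most pessimistic (closest) obstacle reading per direction."""
    picture = {}
    for r in readings:
        if r.direction not in picture or r.distance_m < picture[r.direction]:
            picture[r.direction] = r.distance_m
    return picture

def safe_velocity(picture, wanted, min_clearance_m=0.5):
    """Zero out any velocity component that points at a too-close obstacle."""
    vx, vy, vz = wanted
    if picture.get("front", float("inf")) < min_clearance_m and vx > 0: vx = 0.0
    if picture.get("back", float("inf")) < min_clearance_m and vx < 0: vx = 0.0
    if picture.get("right", float("inf")) < min_clearance_m and vy > 0: vy = 0.0
    if picture.get("left", float("inf")) < min_clearance_m and vy < 0: vy = 0.0
    if picture.get("up", float("inf")) < min_clearance_m and vz > 0: vz = 0.0
    if picture.get("down", float("inf")) < min_clearance_m and vz < 0: vz = 0.0
    return vx, vy, vz
```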
  • 3D mapping algorithm - sensors map the site for immediate and future navigation, without a provided map of the indoors site and without GPS.
  • the created map enables the system; in both autonomous and semi- autonomous mode, to navigate indoors without GPS.
  • the created map can then: i. be sent to following forces (more systems or humans);
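A minimal sketch of such GPS-free mapping, assuming a dead-reckoned pose and a coarse voxel grid (both assumptions, not details from the patent), might accumulate obstacle hits as the drone explores and serialize the result for hand-off to following forces:

```python
# Illustrative occupancy-grid sketch of the GPS-free mapping idea above: each
# range return is rasterized into a voxel grid keyed by the drone's
# dead-reckoned pose, and the grid can be serialized for "following forces".
import json

class VoxelMap:
    def __init__(self, voxel_size_m=0.25):
        self.voxel_size = voxel_size_m
        self.occupied = set()            # set of (ix, iy, iz) voxel indices

    def _index(self, x, y, z):
        s = self.voxel_size
        return (int(x // s), int(y // s), int(z // s))

    def mark_obstacle(self, pose_xyz, hit_offset_xyz):
        """Record an obstacle at pose + sensor hit offset (body frame taken as
        world frame for brevity; a real system would rotate by the heading)."""
        px, py, pz = pose_xyz
        ox, oy, oz = hit_offset_xyz
        self.occupied.add(self._index(px + ox, py + oy, pz + oz))

    def is_free(self, x, y, z):
        return self._index(x, y, z) not in self.occupied

    def export(self):
        """Serialize for hand-off to following forces or other drones."""
        return json.dumps(sorted(self.occupied))
```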
  • Threat prioritization algorithm - the algorithm integrates data provided by the system's relevant sensors (infrared, ultrasonic, optic flow, video, other) into one coherent threat map:
  • System 100 is constructed and configured to prioritize the threats: i. by pre-defined threat profiles stored in the video analytics system;
  • the drone engages the threat according to the calculated prioritization.
  • the handler can intervene and change the priority in real-time at all times.
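One hedged way to read the prioritization rules above is as a weighted scoring function in which profile matches outrank raw sensor indications and a real-time handler override outranks everything; the weights and field names below are illustrative assumptions:

```python
# Sketch of a threat-prioritization scoring function. Weights are arbitrary
# illustrations; only their ordering (override > profile match > gunshot >
# movement > sound) reflects the behavior described above.
SENSOR_WEIGHTS = {"gunshot": 100, "movement": 10, "sound": 5}

def prioritize(threats, handler_override_id=None):
    """threats: list of dicts like
    {"id": 3, "indications": ["gunshot", "movement"], "profile_match": True}"""
    def score(t):
        s = sum(SENSOR_WEIGHTS.get(i, 0) for i in t.get("indications", []))
        if t.get("profile_match"):
            s += 1000                    # known target profile dominates
        if t.get("id") == handler_override_id:
            s += 10**6                   # real-time handler intervention wins
        return s
    return sorted(threats, key=score, reverse=True)
```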
  • a beacon (for example, an IR LED indicator)
  • a friendly force is identified and marked as such on the system's HUD (heads up display 1900, Fig. 19);
  • the projectile shooting device (combat element) carries an external device for better aiming, such as a laser pointer (Fig. 9A.).
  • the drones of the present invention comprise, according to some embodiments, anti-collision sensors 812 (Fig. 8B), 822 (Fig. 8C) for both vertical (floor/ceiling) and horizontal (walls) obstacles, with definable distance parameters, supporting stairway climbing.
  • the drones of the present invention comprise, according to some embodiments, an auto-staircase climbing - self-orientation algorithm, constructed and configured to enable the drone auto-independent navigation in corridors, elevator shafts and staircases by collecting data from sensors and integrating data to one coherent navigation picture, thereby enabling the drones to independently navigate an indoor environment.
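As a rough sketch of the stairway-climbing behavior (the control law and thresholds are assumptions, not taken from the patent), the vertical anti-collision sensors can drive a simple clearance-hold loop: a rising staircase shortens the measured floor distance, so restoring the set clearance makes the drone climb:

```python
# Hedged sketch of staircase climbing via the vertical anti-collision sensors:
# the drone holds a set clearance over each tread and steps its altitude up
# when the floor distance shrinks, which is what a rising staircase looks like
# from above. All thresholds are illustrative assumptions.
def staircase_climb_step(floor_dist_m, ceiling_dist_m,
                         hold_clearance_m=1.0, min_headroom_m=0.4):
    """Return a vertical velocity command (m/s, + is up) for one control tick."""
    if ceiling_dist_m < min_headroom_m:
        return -0.2                      # too close to the ceiling: ease down
    error = hold_clearance_m - floor_dist_m
    # Proportional hold: climbing stairs raises the floor under the drone,
    # shrinking floor_dist, so the drone climbs to restore the set clearance.
    return max(-0.5, min(0.5, 0.8 * error))
```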
  • FIG. 5 is a vertical cross section 500 of the drone 400 of Fig. 4, in accordance with an embodiment of the present invention.
  • This figure shows the embedded (in-body) power plant (batteries or fuel tanks 504) of the drone, landing gear 502 and a battery holder door 506.
  • Fig. 6 A is a top view 600 of a drone showing at least one collision sensor 602, in accordance with an embodiment of the present invention.
  • the collision sensor shown in Fig. 6A is an upper face collision sensor 602.
  • Fig. 6B is a bottom view of a drone 610 showing at least one collision sensor 612 (lower face collision sensor), in accordance with an embodiment of the present invention.
  • a gun-holder 616 can also be seen, as well as propeller arms 614, each adapted to hold one propeller 406, or two in a coaxial engine configuration.
  • Fig. 6C is a side view of a drone 620 showing at least one threat detection sensor 624 and a gun holder 622, in accordance with an embodiment of the present invention.
  • Figs. 6A-6C show the anti-collision sensors of drone 400, enabling it to effectively self-navigate and/or be controlled in a dense urban confined and complex indoors environment.
  • Fig. 7A is a top view 700 of drone 400 showing at least one optical sensor 312, 314, 316, in accordance with an embodiment of the present invention.
  • the drone comprises replaceable HD cameras 314, 316, adapted for either day-only or day-and-night operation.
  • Fig. 7B is a side view 710 of drone 400 showing at least one optical sensor 712, in accordance with an embodiment of the present invention.
  • Fig. 7C is another side view 720 of drone 400 showing a phone/camera attachment element 724 and at least one optical sensor 722, in accordance with an embodiment of the present invention
  • Fig. 8A is a top view 800 of drone 110 showing at least one motion detection sensor (acoustics/laser) 802, adapted to act as a gunshot detection sensor, in accordance with an embodiment of the present invention.
  • Fig. 8B is a bottom view 810 of drone 110 showing the position of at least one motion detection sensor 812, in accordance with an embodiment of the present invention.
  • These are motion sensors on each side of the drone, which cover 360 degrees.
  • the ones on the top/bottom are the distance/anti-collision sensors.
  • Fig. 8C is a side view 820 of a drone showing the position of at least one motion detection sensor 822, in accordance with an embodiment of the present invention.
  • Fig. 9A is a side view 900 of a drone showing at least one combat element 902 (handgun) with a laser pointer 904, in accordance with an embodiment of the present invention.
  • Fig. 9B is another side view 910 of a drone showing at least one combat element 912 (submachine gun), in accordance with an embodiment of the present invention.
  • In Fig. 9C there is seen another side view 920 of drone 110 showing at least one combat element 922 (hand grenade), in accordance with an embodiment of the present invention.
  • System 100 (Fig. 1) is thus configured to use these combat elements and sensors, thereby enabling the system to effectively function in combat (search and engage, incriminate and eliminate).
  • Fig. 10A is a top view 1000 of drone 110 showing at least one antenna 1002, in accordance with an embodiment of the present invention.
  • In Fig. 10B, a side view 1010 of drone 110 is seen, showing the at least one antenna, in accordance with an embodiment of the present invention.
  • Fig. 11A is a simplified pictorial illustration showing a portable ground station 1100 of system 100 of Fig. 1, in accordance with an embodiment of the present invention.
  • the portable ground station 1100 comprises a screen 1102 on which screen shots shown hereinbelow can be displayed.
  • the ground station is configured as a game-like, user-friendly system controller, having a drone height control and rotation controller 1112, a drone directional controller 1110, an X-Y controller 1104 and further control buttons 1106, 1108.
  • FIG. 11B is another simplified pictorial illustration showing a portable touch-screen ground station 1120 of system 100 of Fig. 1, equipped with a touchscreen 1122, in accordance with an embodiment of the present invention.
  • FIG. 12 is a simplified flow chart of a method 1200 for complex indoor combat, in accordance with an embodiment of the present invention.
  • a mobile configuration of system 100 is provided.
  • the ground station 120 or sometimes, the entire system 100 is carried to an engagement/combat site by a task team (not shown) in a carry system to site step 1202.
  • This configuration is relevant to the following implementations: military ground forces, military special forces, law enforcement, and intelligence and security organizations.
  • Upon call, the task team arrives at the site and deploys the system: it takes the hovercraft (drone 110) out of its case (201, Fig. 2).
  • the personnel (handler) turns on the computer 210 (Fig. 2).
  • In a mission planning step 1206, the handler then launches a mission planner software application module (not shown, in computer 210), installed on a tablet/computer, where the mission parameters and required behavior of the system are configured.
  • In a choosing mode of operation step 1208, the handler decides on the suitable mode of operation per the given mission, scenario and task team's doctrine, selected from: a) a semi-autonomous mode activation step 1210, in which the handler flies the system and manages its priorities and engagement, yet robotically supported with:
  • threat map - motion, gunshot and sound sensors alert, identify and direct towards potential threats; video analytics - identifies known threats, differentiates friend/foe, prioritizes threats and directs towards threats
  • mapping - sensors map the site for immediate and future navigation
  • navigation - the handler can navigate the system by either real-time mapping or by a provided map (without GPS); or b) an autonomous mode activation step 1212, in which the system is fully autonomous and acts as per pre-defined priorities, behaviors and algorithms:
  • mapping- sensors are mapping the site in real-time for immediate and future navigation
  • In either mode chosen in step 1208, the system can navigate by either real-time mapping or by a provided map (without GPS).
  • a screen shot associated with step 1208 is shown in Fig. 15 hereinbelow.
  • Having chosen to activate one of the two above modes, the handler then chooses whether to activate video analytics in a video analytics activation step 1214.
  • a screen shot associated with the video analytics step is shown in Fig. 16.
  • The handler decides whether to use the video analytics module per the given mission, scenario and task team's doctrine. If he activates the video analytics in step 1214, then the system is constructed and configured to biometrically analyze pre-defined profiles in a real-time analysis step 1216, enabling one or more of the following: a. a handler can load a profile (image) from a database (not shown) in the computer; b. a handler can insert and load a recently taken picture of a target; c. several profiles can be loaded at the same time.
  • The system is operative to search and mark each identified profile on an HUD (heads up display 1900, Fig. 19), thereby triggering the "prioritization" and "engagement" algorithms described herein; d. target profiles can be prioritized per handler directive, thus triggering the "prioritization" and "engagement" algorithms.
  • If the video analytics is not activated in step 1214, then the system does not biometrically analyze pre-defined people's profiles, but is operative to address real-time threats only, in an address step 1218, and marks people on the HUD.
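A hedged sketch of the profile-matching step follows; it assumes that faces in the video feed and the loaded target profiles have already been reduced to fixed-length embedding vectors by some face-recognition model (an assumption; the patent names no specific algorithm), and marks any feed face whose similarity to a loaded profile crosses a threshold:

```python
# Illustrative profile matching against loaded target profiles. Embeddings
# are assumed inputs; the threshold is an arbitrary placeholder.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def match_profiles(frame_faces, loaded_profiles, threshold=0.8):
    """frame_faces / loaded_profiles: lists of (label, embedding_vector).
    Returns HUD marks for every face matching a loaded profile."""
    hud_marks = []
    for face_label, face_vec in frame_faces:
        for prof_label, prof_vec in loaded_profiles:
            if cosine(face_vec, prof_vec) >= threshold:
                hud_marks.append({"seen": face_label, "profile": prof_label})
    return hud_marks
```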
  • The handler decides on the suitable number of projectiles (i.e., rounds) in the magazine and on the number of projectiles shot per pull of the trigger (i.e., burst), per the given mission, scenario and task team's doctrine. Screen shots associated with this step are shown in Figs. 17 and 18 hereinbelow.
  • In a defining rounds and bursts step 1220, the handler sets the number of projectiles (i.e., rounds) in the magazine of the at least one combat element on the drone, in a rounds in magazine setting step 1222, enabling the system to count and display on the HUD the ammunition status in order to follow its consumption (see Figs. 17 and 18).
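The rounds/bursts bookkeeping configured in steps 1220-1222 can be sketched as a small counter that the HUD reads; the class and field names below are illustrative assumptions:

```python
# Minimal sketch of the rounds/bursts bookkeeping: magazine size and burst
# length are set pre-mission, and each trigger pull decrements the count
# that the HUD displays.
class Magazine:
    def __init__(self, rounds_in_magazine, rounds_per_burst):
        self.remaining = rounds_in_magazine
        self.burst = rounds_per_burst

    def fire_burst(self):
        """Return the number of rounds actually fired this trigger pull."""
        fired = min(self.burst, self.remaining)
        self.remaining -= fired
        return fired

    def hud_status(self):
        return f"AMMO {self.remaining}"

mag = Magazine(rounds_in_magazine=15, rounds_per_burst=3)
mag.fire_burst()          # fires 3 rounds
print(mag.hud_status())   # "AMMO 12"
```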
  • a choose mode of engagement step 1226, in which the handler decides on the suitable mode of engagement per the given mission, scenario and task team's doctrine;
  • an activate autonomous shot mode step 1228 (shoot on sight), in which the system autonomously engages and shoots a projectile at a threat immediately as it identifies it; or
  • an activate manual shot mode step 1230, in which the handler triggers the shot from the ground station to the combat element on the drone.
  • The system suggests engagement options to the handler by notifications and/or markings on the HUD (based on the threat prioritization algorithm and/or video analytics); and finally, v) a system launch step 1232 (go) - the pre-mission planning enabled by the mission planning module ends here with the launch of the real-time control of the drone, enabled by the HUD module (see Fig. 19).
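A minimal sketch of how the two engagement modes of steps 1228 and 1230 could be dispatched at runtime follows; the callables stand in for the real fire-control and HUD-notification paths and are assumptions:

```python
# Illustrative dispatch between the two engagement modes described above.
def on_threat_identified(threat, mode, fire, notify_handler):
    if mode == "shoot_on_sight":
        fire(threat)                      # step 1228: autonomous engagement
    elif mode == "manual":
        notify_handler(threat)            # step 1230: mark on HUD, await trigger
    else:
        raise ValueError(f"unknown engagement mode: {mode!r}")
```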
  • Fig. 13 is a simplified flow chart of a standby mode method 1300 for complex indoor combat, in accordance with an embodiment of the present invention.
  • System 100 is deployed on-site; the engines 404 (Fig. 4) of drone 400 are switched off, but all detection sensors onboard the drone are on (sensors, camera, video analytics, et cetera).
  • In an identifying threat step 1304, any threat which is detected by the drone and/or ground station is defined as an active threat.
  • The drone is then activated and switched on in an activation step 1306.
  • A handler chooses, per a potential threat, a mode of operation. If a semi-autonomous mode is chosen in step 1309, then the handler is operative to activate the drone in semi-autonomous mode (per a notification by the system). If an autonomous mode was chosen in an autonomous mode choosing step 1307, then the drone is activated to function autonomously. The drone re-fires its engines and engages the threat (autonomously) after auto-launching in an auto-launch drone step 1310.
  • The pre-mission planning enabled by the mission planning module (ground station 120, Fig. 1) ends here with the launch of the real-time control of the system, enabled by the HUD module (see Fig. 19).
  • This configuration is relevant to the following implementations (refer to best mode of implementation): anti-piracy - fighting (onboard) pirates; homeland security - facilities and installations; indoors fire fighting; alarm verification.
  • Sensors and/or video analytics identify a threat in step 1304 and trigger the system/drone.
  • The system acts as it was preset to on the mission planner module. Alternatively, if a semi-autonomous mode is chosen in step 1308, then the handler needs to decide, in a decision step 1311, whether to activate the drone to engage the threat in a drone activation step 1312, or to leave it dormant in a standby mode (step 1314).
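Read as a state machine, the standby flow of Fig. 13 might look like the following sketch (state names and structure are assumptions layered over the flow-chart steps):

```python
# Sketch of the standby-mode flow of Fig. 13: engines off, sensors on; a
# detection either auto-launches the drone (autonomous) or asks the handler
# (semi-autonomous). Step numbers in comments follow the flow chart loosely.
class StandbyController:
    def __init__(self, mode):                # "autonomous" | "semi"
        self.mode = mode
        self.state = "STANDBY"               # engines off, all sensors on

    def on_threat_detected(self):            # step 1304
        if self.mode == "autonomous":
            self.state = "ENGAGING"          # steps 1307/1310: auto-launch
        else:
            self.state = "AWAIT_HANDLER"     # steps 1308/1311: notify handler

    def on_handler_decision(self, engage):   # steps 1312/1314
        if self.state == "AWAIT_HANDLER":
            self.state = "ENGAGING" if engage else "STANDBY"
```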
  • Fig. 14 is a simplified flow chart of a sleep mode method 1400 for complex indoor combat, in accordance with an embodiment of the present invention.
  • the handler decides on a mode of operation, such as a sleep mode.
  • Drone 110, 400, et cetera, is deployed on-site in a deploying onsite step 1404. Its engines and detection sensors are off (sensors, camera, video analytics, et cetera).
  • the system will start its engines and detection sensors and will engage the threat (autonomously or by handler).
  • This configuration is relevant to the following implementations: anti-piracy - fighting (on-board) pirates; homeland security - facilities and installations; indoors fire-fighting; and alarm verification.
  • In a threat detection step 1406, an alarm is activated and the drone is triggered.
  • the handler decides whether to activate the drone autonomously or semi-autonomously, by remotely activating that mode in steps 1408, 1410, respectively.
  • The system is operative to ask the handler whether to engage a threat in an asking step 1412. Typically, this is performed by the handler receiving a message on his phone or at the ground station. If yes, the drone is activated in a drone activation step 1414. If no, the drone remains in a sleep mode (step 1416). Per pre-defined directives set on the mission planner module of ground station 120 (Fig. 1), system 100 addresses the threat. According to one embodiment, the system acts as it was preset to on the mission planner module: if the autonomous mode was activated in step 1418, then, upon detection of a threat, the drone is auto-launched. The pre-mission planning enabled by the mission planning module (ground station) ends here with the launch of the real-time control of the drone, enabled by the HUD module (see Fig. 19).
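The sleep-mode wake-up of Fig. 14 can be sketched in the same spirit; the message transport (phone or ground station) is a placeholder, and the function names are assumptions:

```python
# Sketch of the sleep-mode wake-up of Fig. 14: everything is off until an
# external alarm triggers the drone; in semi-autonomous mode the system asks
# the handler (step 1412) before powering up and engaging.
def on_alarm(mode, send_message, power_up, engage):
    if mode == "autonomous":                 # step 1418
        power_up()                           # start engines and sensors
        engage()                             # auto-launch at the threat
    else:                                    # semi-autonomous: ask first
        send_message("Threat detected - engage? (yes/no)")  # step 1412

def on_handler_reply(reply, power_up, engage):
    """Handler's answer to the step-1412 query (steps 1414/1416)."""
    if reply == "yes":
        power_up()
        engage()
    # "no": the drone simply remains in sleep mode
```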
  • Fig. 15 is a screen shot 1500 on a screen 1502 of a ground station 120 (Fig. 1) for activating drone 110, in accordance with an embodiment of the present invention.
  • a handler or user may choose, according to a location, time of day, threat type et cetera, whether to activate drone 110 in a semi-autonomous mode, by activating a semi-autonomous activation element 1504. Alternatively, he/she may activate the drone in an autonomous mode by activating an autonomous activation element 1506.
  • Screen 1502 further comprises a display of date and time data 1522, next buttons 1508, tabs for a mission 1510, an analytics tab 1512, a burst tab 1514, an engagement tab 1516, a heads up display tab 1518 and a settings tab 1520.
  • Fig. 16 is a screen shot 1600 on a ground station screen for video analytics activation 1502 of drone 110, in accordance with an embodiment of the present invention.
  • Screen for video analytics activation 1502 typically comprises a first face image 1602, a second face image 1604, a third face image 1606 and a fourth face image 1608.
  • Face data 1610, 1612, 1614, 1616 may include, for example, a name, an age, a gender, an organization affiliation, a danger level and the like. Additionally, there may be spaces 1618 for uploading additional face data. Additionally or alternatively, the face images may be replaced with full body images.
  • Screen shot 1600 may further comprise a prioritize button 1624, for adjusting, resetting or setting the relative prioritizations of the people's image profiles (1602, 1604, 1606, 1608): the higher the priority, the higher the drone's priority to engage the actual person associated with the image.
  • The screen shot may further comprise a drag or drop symbol 1628, a skip button 1620 and a next button 1622.
  • Fig. 17 is a screen shot 1700 on a screen of ground station 120 (Fig. 1) for activation of rounds and bursts in system 100 of Fig. 1, in accordance with an embodiment of the present invention.
  • a user can set a number of rounds in a magazine by manipulating a rounds control number using button 1704 in a rounds control element 1702 and a number of rounds per burst by manipulating a rounds control per burst number 1708 in a rounds per burst control element 1706.
  • Fig. 18 is a screen shot 1800 on a ground station screen for choosing a mode of engagement in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • a user can choose a shoot on sight mode by activating an autonomous mode select element 1802 or a manual shot by activating a manual mode select element 1804.
  • Fig. 19 is a screen shot 1900 on a ground station screen for heads up display (HUD) in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • Screen shot 1900 presents the handler controller's graphic user interface (GUI) and its components: a mini map 1934, a current speed on a speedometer 1906, an altitude on an altitude meter 1908, a local time on date and time data 1522, a battery/energy indicator with a time estimate of the power percentage remaining 1912, ammunition availability data with the type and number of rounds in the munitions store 1916, a firing crosshair 1918 with a threat directional indicator 1920, a compass 1902 and an artificial horizon or azimuth 1904.
  • The HUD further shows a threat map 1992, showing a triangle of a shot 1930, a detected shot threat 1924, movement detection 1926 and a triangle of movement 1932 (shown in greater detail in Figs. 20 and 21).
  • Although Figs. 20-21 deal with the threat map rather than the 3D map, the 3D mapping algorithm applies here as well: sensors (Fig. 3A) map the site for immediate and future navigation, without a provided map of the indoors site and without GPS.
  • the screen shot further shows a return to base button 1942, an ambush or standby button 1944, an operation button 1946, a video analytics button 1948 and a 3D map button 1950.
  • Fig. 20 is a simplified threat map 2000 in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • The threat map includes a drone orientation direction 2001, a camera coverage area 2002, a gunshot direction 2003, a movement direction 2004, a movement and gunshot direction 2005, a gunshot location and position 2006, a movement location and position 2007 and a movement and gunshot location and position 2008.
  • Fig. 21 is another simplified threat map 2100 in the system of Fig. 1, in accordance with an embodiment of the present invention.
  • The threat map includes a drone orientation direction 2101, a camera coverage area 2102, a gunshot direction 2103, a movement direction 2104, a movement and gunshot direction 2105, a gunshot location and position 2106, a movement location and position 2107 and a movement and gunshot location and position 2108.
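For illustration, the labelled items of the threat maps of Figs. 20 and 21 suggest a simple data model like the sketch below; the field names and types are assumptions, not taken from the patent:

```python
# Illustrative data model for a threat map, mirroring the labelled items of
# Figs. 20-21 (drone orientation, camera coverage, gunshot/movement
# directions and positions).
from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class ThreatMapEntry:
    kind: str                                        # "gunshot", "movement", "both"
    direction_deg: float                             # bearing relative to drone nose
    position: Optional[Tuple[float, float]] = None   # (x, y) if localized

@dataclass
class ThreatMap:
    drone_heading_deg: float = 0.0                   # item 2001/2101
    camera_fov_deg: float = 90.0                     # item 2002/2102
    entries: list = field(default_factory=list)      # items 2003-2008 / 2103-2108
```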
  • The modes of operation of drone 110 are described further as follows, with reference to items shown in the drawings, particularly with reference to Figs. 12-14.
  • Threat Map - motion, gunshot and sound sensors alert, identify and direct towards potential threats
  • Video Analytics - identifies known threats, differentiates friend/foe, prioritizes threats and directs towards threats
  • Mapping - sensors map the site for immediate and future navigation
  • Navigation - the handler can navigate the system by either real-time mapping or by a provided map (without GPS)
  • Video Analytics - identifies known threats, differentiates friend/foe, prioritizes threats and directs towards threats
  • Mapping - sensors map the site in real-time for immediate and future navigation
  • Navigation - the system can navigate by either real-time mapping or by a provided map (without GPS)
  • the system does not biometrically analyze pre-defined people's profiles, and addresses real-time threats only
  • threats can be prioritized per handler directive, thus triggering the "prioritization" and "engagement" algorithms
  • the system biometrically analyzes pre-defined profiles in real-time; the handler can load a profile (image) from the database
  • the handler can prioritize the objectives:
  • the system can then search and mark each identified profile on the HUD (thus triggering the "prioritization" and "engagement" algorithms)
  • profiles can be prioritized per handler directive, thus triggering the "prioritization" and "engagement" algorithms
  • Handler decides whether to engage and shoot a projectile
  • Threat Map - a graphic display mapping, displaying the type and direction of sensors' indications in relation to hover craft movement
  • 3D Map - sensors map the site for immediate and future navigation.
  • the sensors' 360 degrees coverage maps the surroundings and thus enables the display of a dynamically created 3D Map
  • the map is automatically saved and stored for later use
  • The handler may change the priority, and thus the numbering, in real-time by touching the objective he now prioritizes as number one.
  • Ambush/Standby mode (Fig. 13) - the system enters Ambush mode, where it lands (autonomously or by the handler) and shuts down its engines but keeps ALL detection systems ON (sensors, camera, video analytics, et cetera). Upon a detected threat, the system restarts its engines and engages the threat (autonomously or by the handler).
  • the handler can choose to change the mode of operation (see 1.) during the actual mission, rather than as a pre-defined mode as at the mission planner module.
  • the handler can launch the video analytics interface (see 2.) during the actual mission (rather than as a pre-defined mode as at the mission planner module). If the handler skipped this phase at the mission planning stage, he can still launch it through the HUD.
  • the handler can remove/display the 3D Map overlay.
  • Pack/squadron coordinated flight/fight support - i. formation flight and attack algorithm; ii. 'one leader, others follow' or 'coordinated flight and attack of independent drones'
  • Socket-enabled charging - any socket, any current
  • Video content analytics - i. automatic intrusion detection - ensures perimeter control; ii. automatic abnormal behavior detection - minimizes civilian casualties by incriminating the real threats
  • Stalking/ambush mode - all systems ON, engine OFF, with immediate engine restart support
  • Threat detection and engagement algorithm - i. collecting data from location sensors, sensors and video analytics
  • the drone is to be stationed onboard a ship, at strategic locations, such as the bridge, engine room, crew quarters, et cetera.
  • the drone addresses the on-board pirates and is operative to engage them under various operational scenarios. For example, boarding prevention, hijacking prevention, hostage rescue, securing the bridge and staff.
  • the drone uses the distributed Wi-Fi network to operate within the vessel and satellite network for external communication.
  • the system is to be stationed onboard the ship, at strategic locations, for example, the bridge, engine room, goods' storage, et cetera.
  • the system can operate autonomously and/or piloted; on a single or a coordinated pack configuration.
  • Command initiated by the video content analytics (either of the ship or of the system itself) or by the embedded sensors once movement or gunshots are identified; iii. direct order by the crew or by a remote/online security officer located (remotely) at base.
  • Control options - mobile mode (Fig. 12);
  • the system can identify a breach or an unauthorized access to a secure perimeter and respond;
  • v. hijacking prevention - the access prevention and secured bridge are to foil any attempt to hijack the ship, its crew and its cargo;
  • a drone (110, Fig 1) is stationed onboard a ship, at strategic locations, for example, the bridge, engine room, crew quarters.
  • the drone addresses the on-board pirates and engages them under various operational scenarios, for example, boarding prevention, hijacking prevention, hostage rescue, securing the bridge and staff.
  • the drone uses the distributed Wi-Fi network to operate within the vessel and satellite network for external communication.
  • a fire alarm, for example, in a warehouse.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention provides systems and methods for combatting a threat, the system including a motorized apparatus adapted to hover and hold at least one engagement element, at least one detection element adapted to convey detection data, an apparatus location element adapted to convey system location data, an onboard processor adapted to receive and process the detection data from the at least one detection element and the location data from the at least one apparatus location data element, the processor being configured to output at least one command to the at least one engagement element to inactivate the threat; and at least one communication element, and a ground station for monitoring the motorized apparatus and for receiving data from the at least one communication element.

Description

ROBOTIC SYSTEM AND METHOD FOR COMPLEX INDOOR COMBAT
FIELD OF THE INVENTION
The present invention relates generally to robotic systems and methods, and more specifically to methods and systems for robotic systems for complex indoor combat.
BACKGROUND OF THE INVENTION
At present, there are few non-human systems effective in eliminating threats in complex indoor situations. Robotic solutions are lacking for various scenarios of confined and complex indoors combat, such as, but not limited to, within commercial and private residences, in educational establishments, commercial establishments, in transport systems, such as underground trains, in tunnels, in airplanes, within ships, for prevention of boarding of hostile persons, hijacking prevention, hostage rescue and securing a bridge and/or staff.
There thus remains an unmet need to provide efficient robotic systems for unmanned indoor combat systems and methods.
SUMMARY OF THE INVENTION
It is an object of some aspects of the present invention to provide robotic methods and systems for complex indoor combat.
In some embodiments of the present invention, improved unmanned methods and apparatus are provided for complex indoor combat.
In other embodiments of the present invention, a method and system is described for providing robotic unmanned systems for indoors combat.
Additional embodiments of the present invention provide a system comprising an integrated robotic drone comprising multiple sensors, optics, combat elements and supporting software, including video analytics, adapted for mission planning, with a heads-up display (HUD).
In further embodiments of the present invention, flying robotic systems are provided for indoor combat.
The present invention provides a robotic stalk and attack aerial hover system ("drone") designed, geared and aimed at dense urban confined and complex indoors combat. The system of the present invention is constructed and configured to effectively identify, incriminate and eliminate threats and/or designated targets at urban scenarios; namely confined complex indoors environments, autonomously and/or piloted; on a single or a coordinated pack configuration.
With its innovative concept, design, development, features and components, the robotic drone of the present invention is configured to effectively operate in a complex and lethal environment.
There is thus provided according to an embodiment of the present invention, a system for indoors engagement of a threat including:
a. a motorized apparatus adapted to hover and hold:
i. at least one engagement element;
ii. at least one detection element adapted to convey detection data; and
iii. an apparatus location element adapted to convey system location data;
iv. an onboard processor adapted to receive and process said detection data from said at least one detection element and said location data from said at least one apparatus location data element, said processor being configured to output at least one command to said at least one engagement element to inactivate said threat; and
v. at least one communication element; and b. a ground station for monitoring said motorized apparatus and for receiving data from said at least one communication element.
Additionally, according to an embodiment of the present invention, the processor is further adapted to perform video analytics on data from the detection data. Furthermore, according to an embodiment of the present invention, the processor is further adapted to identify the threat responsive to the video analytics.
Moreover, according to an embodiment of the present invention, the motorized apparatus includes robotics adapted to activate the at least one engagement element to engage and inactivate the threat.
Further, according to an embodiment of the present invention, the motorized apparatus is adapted to navigate three-dimensional indoor contours.
Yet further, according to an embodiment of the present invention, the at least one engagement element is selected from the group consisting of a gun, a missile, a projector, a gas canister, a fire extinguisher, a grenade, a non-lethal weapon, an immobilizer weapon, a submachine gun and combinations thereof.
Additionally, according to an embodiment of the present invention, the at least one detection element is selected from the group consisting of a sound sensor, an infrared sensor, a position sensor, an ultrasonic sensor, a movement detection sensor, a camera, a laser, a visualization sensor, an optic flow sensor and combinations thereof.
Moreover, according to an embodiment of the present invention, the at least one location element is selected from the group consisting of a global position system (GPS) element, a position sensor, a camera, a smartphone, an optic sensor and combinations thereof. Further, according to an embodiment of the present invention, the motorized apparatus and the ground station communicate via at least one communication link.
Yet further, according to an embodiment of the present invention, the at least one communication link is selected from the group consisting of an IP peer-to-peer communication link, a cellular communication link, a satellite communication link, an RF communication link, an internet link and combinations thereof.
Notably, according to an embodiment of the present invention, the ground station further includes a screen adapted to display a real-time heads up display (HUD) of the motorized apparatus.
Importantly, according to an embodiment of the present invention, the motorized apparatus is unmanned and is selected from an airborne drone, a flying hovering apparatus, a plane, a helicopter and a hovercraft.
Additionally, according to an embodiment of the present invention, the ground station includes a computer; a hand-operated remote control apparatus; an antenna; and at least one communications link.
Moreover, according to an embodiment of the present invention, the motorized apparatus is a drone, adapted for indoor use.
Further, according to an embodiment of the present invention the motorized apparatus is adapted to move along vertical and horizontal conduits.
Yet further, according to an embodiment of the present invention, the drone includes robotics adapted to activate the at least one engagement element.
Importantly, according to an embodiment of the present invention, the ground station includes on-screen heads up display.
There is thus provided according to another embodiment of the present invention a method for indoor engagement of a threat, the method including:
a) detecting the threat by a motorized hovering apparatus;
b) processing at least one of location data and detection data from the apparatus to determine a nature of the threat; and
c) activating at least one engagement element on the apparatus to inactivate the threat indoors, responsive to at least one of the location data and the detection data.
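For illustration only, the claimed three-step method might be orchestrated as in the following sketch, in which `apparatus` and its methods are assumed interfaces rather than anything defined by the patent:

```python
# Minimal sketch of the claimed detect -> process -> activate flow, under
# assumed interfaces: (a) detect a threat, (b) combine location and
# detection data to classify it, (c) activate an engagement element.
def engage_indoor_threat(apparatus):
    detection = apparatus.detect()                     # step (a)
    if detection is None:
        return None
    nature = apparatus.classify(detection,             # step (b)
                                apparatus.location())
    if nature.requires_engagement:
        apparatus.engagement_element.activate(nature)  # step (c)
    return nature
```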
Additionally, according to an embodiment of the present invention, the detecting step includes employing video analytics and/or detection sensors for identification of the threat.
Furthermore, according to an embodiment of the present invention, the activating step enables the apparatus to act autonomously.
Moreover, according to an embodiment of the present invention, the activating step enables the apparatus to act semi-autonomously.
Notably, according to an embodiment of the present invention, the activating step enables the apparatus to travel along horizontal and vertical conduits to approach the threat.
Additionally, according to an embodiment of the present invention, the apparatus is adapted for non-GPS navigation.
Further, according to an embodiment of the present invention, the apparatus is adapted for day and night activation.
Yet further, according to an embodiment of the present invention, the apparatus has a predefined mission plan.
Additionally, according to an embodiment of the present invention, the mission plan supports a plurality of engagement modes.
Importantly, according to an embodiment of the present invention, the apparatus is controlled from a ground station by heads up display.
Additionally, according to an embodiment of the present invention, the motorized apparatus acts autonomously, without communicating with the ground station.
Furthermore, according to an embodiment of the present invention, the engagement of the threat occurs in an underground environment. In some cases the underground environment includes at least one tunnel. Additionally, according to an embodiment of the present invention, the motorized apparatus acts autonomously in the at least one tunnel.
There is thus provided according to an additional embodiment of the present invention a software product for indoor engagement of a threat, the product including a computer-readable medium in which program instructions are stored, which instructions, when read by a computer (on the drone), cause the computer to:
a. detect the threat by a motorized hovering apparatus;
b. process at least one of location data and detection data from the apparatus to determine a nature of the threat; and
c. activate at least one engagement element on the apparatus to inactivate the threat indoors, responsive to at least one of the location data and the detection data.
Additionally, according to an embodiment of the present invention, the product comprises a plurality of algorithms, each algorithm providing a set of instructions to said motorized apparatus to activate at least one of robotics and the at least one engagement element.
Additionally, according to an embodiment of the present invention, there is provided a system for neutralization of an onboard threat on a waterborne vehicle, the system including: a. a motorized apparatus adapted to hover and hold:
i. at least one engagement element;
ii. at least one detection element adapted to convey detection data; and
iii. an apparatus location element adapted to convey system location data;
iv. an onboard processor adapted to receive and process the detection data from the at least one detection element and the location data from the at least one apparatus location data element, the processor being configured to output at least one command to the at least one engagement element to inactivate the threat; and
v. at least one communication element;
b. a ground station for monitoring the motorized apparatus and for receiving data from the at least one communication element.
Unless control is taken by the operator, the drone (motorized apparatus) is adapted to act autonomously, with no need for an operator to be present. Alternatively, the operator can merely watch the action without interfering. The present invention will be more fully understood from the following detailed description of the preferred embodiments thereof, taken together with the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described in connection with certain preferred embodiments with reference to the following illustrative figures so that it may be more fully understood. With specific reference now to the figures in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice.
Fig. 1 is a simplified pictorial illustration showing a system for complex indoor combat, in accordance with an embodiment of the present invention;
Fig. 2 is a simplified pictorial illustration showing a ground station of the system of Fig. 1, in accordance with an embodiment of the present invention;
Fig. 3A is a simplified schematic illustration of a drone in the system of Fig. 1, in accordance with an embodiment of the present invention; Fig. 3B is a simplified schematic illustration of a system for complex indoor combat, in accordance with an embodiment of the present invention;
Fig. 4 is a simplified pictorial illustration of a drone in the system of Fig. 1, in accordance with an embodiment of the present invention;
Fig. 5 is a vertical cross section of the drone of Fig. 4, in accordance with an embodiment of the present invention;
Fig. 6A is a top view of a drone showing at least one collision sensor, in accordance with an embodiment of the present invention;
Fig. 6B is a bottom view of a drone showing at least one collision sensor, in accordance with an embodiment of the present invention; Fig. 6C is a side view of a drone showing at least one threat detection sensor, in accordance with an embodiment of the present invention;
Fig. 7A is a top view of a drone showing at least one optical sensor, in accordance with an embodiment of the present invention; Fig. 7B is a side view of a drone showing at least one optical sensor, in accordance with an embodiment of the present invention;
Fig. 7C is another side view of a drone showing at least one optical sensor, in accordance with an embodiment of the present invention;
Fig. 8A is a top view of a drone showing at least one motion detection sensor, in accordance with an embodiment of the present invention;
Fig. 8B is a bottom view of a drone showing at least one motion detection sensor, in accordance with an embodiment of the present invention;
Fig. 8C is a side view of a drone showing at least one motion detection sensor, in accordance with an embodiment of the present invention; Fig. 9A is a side view of a drone showing at least one combat element, in accordance with an embodiment of the present invention;
Fig. 9B is another side view of a drone showing at least one combat element, in accordance with an embodiment of the present invention;
Fig. 9C is another side view of a drone showing at least one combat element, in accordance with an embodiment of the present invention;
Fig. 10A is a top view of a drone showing at least one antenna, in accordance with an embodiment of the present invention;
Fig. 10B is a side view of a drone showing at least one antenna, in accordance with an embodiment of the present invention; Fig. 11A is a simplified pictorial illustration showing a portable ground station of the system of Fig. 1, in accordance with an embodiment of the present invention;
Fig. 11B is a simplified pictorial illustration showing a portable touch-screen ground station of the system of Fig. 1, in accordance with an embodiment of the present invention; Fig. 12 is a simplified flow chart of a mobile configuration method for complex indoor combat, in accordance with an embodiment of the present invention;
Fig. 13 is a simplified flow chart of a standby mode method for complex indoor combat, in accordance with an embodiment of the present invention; Fig. 14 is a simplified flow chart of a sleep mode method for complex indoor combat, in accordance with an embodiment of the present invention;
Fig. 15 is a screen shot on a ground station screen for activating a drone choosing mode of operation, in accordance with an embodiment of the present invention; Fig. 16 is a screen shot on a ground station screen for video analytics activation of a drone, in accordance with an embodiment of the present invention;
Fig. 17 is a screen shot (rounds and bursts) on a ground station screen for activating rounds and bursts in the system of Fig. 1, in accordance with an embodiment of the present invention;
Fig. 18 is a screen shot on a ground station screen for choosing a mode of engagement in the system of Fig. 1, in accordance with an embodiment of the present invention;
Fig. 19 is a screen shot on a ground station screen for heads up display in the system of Fig. 1, in accordance with an embodiment of the present invention;
Fig. 20 is a simplified threat map in the system of Fig. 1, in accordance with an embodiment of the present invention; and
Fig. 21 is another simplified threat map in the system of Fig. 1, in accordance with an embodiment of the present invention. In all the figures similar reference numerals identify similar parts.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
In the detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that these are specific embodiments and that the present invention may be practiced also in different ways that embody the characterizing features of the invention as described and claimed herein.
Reference is now made to Fig. 1, which is a simplified pictorial illustration showing a system 100 for complex indoor combat, in accordance with an embodiment of the present invention. System 100 comprises at least one drone 110, a ground station 120 and at least one communication device 130, adapted to communicate with the ground station and the drone via at least one wireless communication network 140. Further details of ground station 120 are shown in Fig. 2. Schematics of the drone are shown in Fig. 3A. A schematic of system 100 is shown in Fig. 3B. Further mechanical details of the drone are shown in Fig. 4. It should be understood from Fig. 1 that the system may use one or more drones of various configurations, one or more communication systems and one or more mobile devices 130.
Reference is now made to Fig. 2, which is a simplified pictorial illustration showing a ground station 200 of the system of Fig. 1, in accordance with an embodiment of the present invention. Ground station 200, similar or identical to ground station 120 of Fig. 1, is housed in a suitcase 201 and comprises a remote controller 202, an antenna 204 servicing the remote controller, a computer 210 and an antenna servicing the computer.
Reference is now made to Fig. 3A, which is a simplified schematic illustration of a drone 300 in the system 100 of Fig. 1, in accordance with an embodiment of the present invention. Drone 300 may be similar or identical to drone 110 of system 100.
The drone comprises an onboard processor 302 in communication with a flight system 304, at least one of a camera 314 or a smartphone, and sensors 306, 308, 310 and 312, exemplified by, but not limited to, a sound sensor 306, an infrared sensor 308, an ultrasonic sensor 310 and an optic flow sensor 312. The arrows shown in Fig. 3A represent one embodiment of data flow within the drone.
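By way of non-limiting illustration, the data flow of Fig. 3A might be realized as a simple polling loop in which the onboard processor reads each sensor and hands a fused snapshot to the flight system. All class, field and method names in the following Python sketch are illustrative assumptions and do not describe any specific implementation.

    import time
    from dataclasses import dataclass

    @dataclass
    class SensorSnapshot:
        """One fused reading of the drone's detection sensors (cf. Fig. 3A)."""
        sound_level: float          # sound sensor 306
        ir_range_m: float           # infrared sensor 308
        ultrasonic_range_m: float   # ultrasonic sensor 310
        optic_flow: tuple           # optic flow sensor 312: (dx, dy)

    class OnboardProcessor:
        """Polls the sensors and forwards fused snapshots to the flight system."""

        def __init__(self, sensors, flight_system):
            self.sensors = sensors              # dict of zero-argument read callables
            self.flight_system = flight_system  # callable consuming a SensorSnapshot

        def run(self, cycles: int = 100, hz: float = 20.0):
            for _ in range(cycles):
                snapshot = SensorSnapshot(
                    sound_level=self.sensors["sound"](),
                    ir_range_m=self.sensors["infrared"](),
                    ultrasonic_range_m=self.sensors["ultrasonic"](),
                    optic_flow=self.sensors["optic_flow"](),
                )
                self.flight_system(snapshot)    # hand the fused data onward
                time.sleep(1.0 / hz)            # illustrative 20 Hz polling rate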
Reference is now made to Fig. 3B, which is a simplified schematic illustration of a system 350 for complex indoor combat, in accordance with an embodiment of the present invention. System 350 comprises a ground station 352 in communication with an on-board system 358, a flight-system controller 360 (for example, a Pixhawk autopilot) and a copter 354. The flight system (such as, but not limited to, a Pixhawk) is in communication with a sensor controller 362 and sensors 356 communicating therewith. It should be understood that, for the sake of simplicity, some of the elements shown in this schematic illustration are shown in other system diagrams, such as Fig. 1, Fig. 3A et cetera. Some of the components of system 100 (Fig. 1) and/or system 350 (Fig. 3B) and their functions are provided in Table 1. It should be understood that each of these components may be purchased commercially from commercial establishments in this field.
Table 1. General hardware components of the system

Ref.   Name                        Function
360    Flight controller           Controls the flight of the drone
362    Sensor controller           Receives and sends the sensor data
308    Infrared sensor             Proximity detection based on light
310    Ultrasonic sensor           Proximity detection based on ultrasonic sound
311    Motion sensor               Detects motion
306    Sound sensor                Detects sounds
130    Smartphone                  Runs the autonomous system of the drone robot
352    Tablet                      Controls and monitors the drone activity from a distance
354    Copter                      The flight system
355    GPS                         Global positioning unit for the copter
357    Accelerometers              Measure the copter's velocities and acceleration in each 3D direction
359    Compass                     Measures the heading of the copter
361    Indoor positioning system   Locates the relative position of the copter in spaces where no GPS signal exists
363    Gyroscopes                  Measure the attitude of the copter
365    RC receiver                 Radio-control receiver for control commands from a distant radio controller
Reference is now made to Fig. 4, which is a simplified pictorial illustration of a drone 400 (also called a copter or hover apparatus) in system 100 of Fig. 1, in accordance with an embodiment of the present invention.
The drones of the present invention are lightweight and have a protective frame 402. The frame may be made of plastic, a polymeric material, a metal, aluminum, carbon fiber, an alloy and combinations thereof. The light protective frame of drone 400 may be constructed in various colors, with various camouflages, making it both easy to move and durable.
The drone is constructed and configured to be a robotic stalk & attack hover apparatus 400, which may be stationed indoors, underground or onboard a ship, at strategic locations (such as on a bridge, in an engine room or in crew living quarters).
Drone 400 is designed, developed and geared for dense, confined and complex indoor combat, and thus can effectively identify, incriminate and eliminate threats and/or designated targets in confined and complex indoor combat, for example within ships, and in various operational scenarios, such as, but not limited to, terrorist attacks in buildings, airports, airplanes, educational facilities, underground trains, commercial establishments and residential buildings, as well as boarding prevention, hijacking prevention, hostage rescue and securing the bridge and staff of a ship.
When triggered by an order, command or an alarm, the drone is constructed and configured to operate autonomously and/or piloted, in a single-drone or in a coordinated multiple-drone pack configuration.
With its innovative concept, design, development, features and components, this robotic drone is designed to operate effectively in the most complex and lethal indoor environments.
Drone 400 (similar or identical to drone 110, Fig. 1) typically comprises the following basic functionalities and components:
a. a protective frame 402 constructed as a hardened structure to protect rotors and components;
b. at least one engine 404 adapted to activate the rotors and powered by batteries or another power supply;
c. four to eight propellers/rotors 406 (the number of rotors is based on operational requirements, for example, the weight of the payload), supporting a payload of at least 2 kg;
d. batteries 504 (Fig. 5) or another electrical or fuel power plant adapted to provide a hover endurance of at least 10 minutes;
e. a camera 408 (which may be part of a smartphone 409, not shown);
f. at least one battery holder 410;
g. at least one sound sensor 411;
h. an antenna 412 including a remote-control datalink - direct (radio and/or WiFi/cellular) and/or relay and/or satellite (not shown);
i. a GPS sensor 355;
j. at least one gun/munition holder 418; and
k. landing gear 416.
The drones of the present invention are, according to some embodiments, small (up to 0.75 meter in length/width), adapted for easy penetration through standard doorways, windows, corridors, elevator shafts and staircases.
The drones of the present invention are, according to some embodiments, silent, with suppressed engine and rotor noise. The drones may be held in standby mode with the components mentioned hereinabove activated but the engine off, with immediate engine-restart support.
The drones of the present invention are, according to some embodiments, lightweight, fabricated of light yet durable materials, such as composite materials known in the art.
The drones of the present invention are, according to some embodiments, activated in accordance with any one or more of the following algorithms, embedded, for example, in processor 302 (Fig. 3A):
1. Anti-collision algorithm - sphere-like 3D collision prevention with vertical and horizontal 360-degree coverage: the algorithm integrates data provided by the system's sensors (see Fig. 3A - infrared, ultrasonic, optic flow, video, other) into one coherent orientation picture, thereby preventing the drone from colliding with obstacles. The algorithm supports the handler in semi-autonomous mode and enables a fully-autonomous mode (a non-limiting code sketch follows the list below) by:
a. Slowing per current speed and distance from an obstacle;
b. Stopping per pre-defined distance from an obstacle;
c. Maintaining pre-defined distance from an obstacle; and
d. Enabling obstacle avoidance - walls, shafts, doorways, corridors, staircases, tunnels and the like.
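By way of non-limiting illustration only, the following Python sketch shows one possible realization of the slowing and stopping behaviors of items a and b above; the function name, thresholds and units are illustrative assumptions and do not form part of the described system.

    def collision_avoidance_speed(distance_m: float,
                                  cruise_ms: float,
                                  stop_m: float = 0.5,
                                  slow_m: float = 2.0) -> float:
        """Return a commanded forward speed from the nearest-obstacle distance."""
        if distance_m <= stop_m:
            return 0.0                            # b. stop at the pre-defined distance
        if distance_m < slow_m:
            scale = (distance_m - stop_m) / (slow_m - stop_m)
            return cruise_ms * scale              # a. slow as the obstacle nears
        return cruise_ms                          # clear of obstacles: keep speed

    # Example: an obstacle 1.25 m ahead halves a 2 m/s cruise speed.
    assert collision_avoidance_speed(1.25, 2.0) == 1.0

In practice, the sphere-like coverage would apply such a rule per sensor direction, taking the minimum commanded speed over all directions of travel.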
2. 3D mapping algorithm - the sensors (Fig. 3A) map the site for immediate and future navigation, without a provided map of the indoor site and without GPS:
a. The vertical and horizontal 360 degrees sensors' coverage draws a detailed map of the surroundings;
b. The created map enables the system, in both autonomous and semi-autonomous modes, to navigate indoors without GPS.
c. The created map can then:
i. be sent to following forces (more systems or humans); and
ii. enable a "Return Home" capability once the drone is damaged or per demand.
3. Threat prioritization algorithm - the algorithm integrates data provided by the system's relevant sensors (infrared, ultrasonic, optic flow, video, other) into one coherent threat map:
a. People, movements and gunshots are detected;
b. The created threat map enables the system, in both autonomous and semi-autonomous modes, to draw a detailed map of imminent threats.
c. System 100 is constructed and configured to prioritize the threats:
i. by pre-defined threats - profiles stored in the video analytics system;
ii. by threats shooting first and then by threat movement;
iii. by threat distance from the drone;
iv. the system marks and numbers the threats (on the system's HUD - heads-up display) by priority, in descending order - see further discussion herein.
d. In Autonomous mode, the drone engages the threat according to the calculated prioritization;
i. The handler can intervene and change the priority in real-time at all times.
e. In semi-autonomous mode, the system suggests to the handler whom to address first, for the handler to decide.
4. Friend-or-foe algorithm - the algorithm identifies a friend and prevents the system from engaging him (a non-limiting code sketch of the prioritization and friend-or-foe rules follows this list):
a. By carrying a beacon (for example, an IR LED indicator), a friendly force is identified and marked as such on the system's HUD (heads-up display 1900, Fig. 19); and
b. The system will then be prevented from engaging it.
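A minimal Python sketch of the prioritization and friend-or-foe rules above follows. The data fields and the exact ordering (shooters before movers, then nearest first, friendlies excluded) are illustrative assumptions drawn from items i-iii and the friend-or-foe rule, not a definitive implementation.

    from dataclasses import dataclass

    @dataclass
    class Contact:
        contact_id: int
        distance_m: float
        shooting: bool = False   # gunshot detected from this contact
        moving: bool = False     # movement detected
        friendly: bool = False   # friend beacon (e.g., IR LED) identified

    def prioritize(contacts):
        """Order contacts for engagement: friendlies are excluded outright;
        shooters outrank movers, and closer threats outrank farther ones."""
        threats = [c for c in contacts if not c.friendly]   # friend-or-foe filter
        return sorted(threats,
                      key=lambda c: (not c.shooting, not c.moving, c.distance_m))

    # Example: the distant shooter outranks the nearby mover; the friend is skipped.
    contacts = [Contact(1, 12.0, moving=True),
                Contact(2, 30.0, shooting=True),
                Contact(3, 5.0, friendly=True)]
    assert [c.contact_id for c in prioritize(contacts)] == [2, 1]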
5. Engagement and aim algorithm - the algorithm navigates the drone to the optimal shooting position (a non-limiting positioning sketch follows this list):
a. In autonomous mode: i. engages the threats by their priority (in descending order); ii. positions the drone at an optimal shooting position using video analytics support;
b. In semi-autonomous mode: i. the HUD provides a crosshair 1918 for aiming (Fig. 19);
ii. the projectile shooting device (combat element) carries an external device for better aiming, such as a laser pointer (Fig. 9A).
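In its simplest form, the "optimal shooting position" step could reduce to closing to a pre-set firing range along the line of sight to the threat. The toy sketch below assumes a flat 2D coordinate frame and ignores the obstacle map and video analytics; it illustrates the geometry only.

    import math

    def shooting_position(drone_xy, threat_xy, preferred_range_m=4.0):
        """Return a point on the drone-to-threat line at the preferred range."""
        dx, dy = threat_xy[0] - drone_xy[0], threat_xy[1] - drone_xy[1]
        dist = math.hypot(dx, dy)
        if dist == 0.0:
            return drone_xy                     # co-located: hold position
        ux, uy = dx / dist, dy / dist           # unit vector toward the threat
        return (threat_xy[0] - ux * preferred_range_m,
                threat_xy[1] - uy * preferred_range_m)

    # Example: from (0, 0) toward a threat at (10, 0), hold 4 m short of it.
    assert shooting_position((0, 0), (10, 0)) == (6.0, 0.0)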
The drones of the present invention comprise, according to some embodiments, anti-collision sensors 812 (Fig. 8B), 822 (Fig. 8C) for both vertical (floor/ceiling) and horizontal (wall) obstacles, with definable distance parameters, supporting stairway climbing.
The drones of the present invention comprise, according to some embodiments, an auto-staircase-climbing, self-orientation algorithm, constructed and configured to enable auto-independent navigation of the drone in corridors, elevator shafts and staircases by collecting data from the sensors and integrating the data into one coherent navigation picture, thereby enabling the drones to independently navigate an indoor environment.
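As a non-limiting illustration of this self-orientation principle, the sketch below accumulates range-sensor hits into a coarse occupancy grid (a 2D slice of the 3D map, for brevity). Cell size, grid extent and all names are illustrative assumptions.

    import math

    class OccupancyGrid:
        """Coarse occupancy map built from ranging sensors; no GPS, no prior map."""

        def __init__(self, size: int = 100, cell_m: float = 0.25):
            self.cell_m = cell_m
            self.size = size
            self.cells = [[0] * size for _ in range(size)]  # 0 = free/unknown, 1 = occupied
            self.origin = size // 2                         # drone's start is the grid centre

        def mark_hit(self, pose_xy, heading_rad, range_m):
            """Mark the cell where a range sensor reports an obstacle."""
            hx = pose_xy[0] + range_m * math.cos(heading_rad)
            hy = pose_xy[1] + range_m * math.sin(heading_rad)
            col = self.origin + math.floor(hx / self.cell_m)
            row = self.origin + math.floor(hy / self.cell_m)
            if 0 <= row < self.size and 0 <= col < self.size:
                self.cells[row][col] = 1

    # Example: a wall detected 2 m straight ahead of a drone facing along +x.
    grid = OccupancyGrid()
    grid.mark_hit((0.0, 0.0), 0.0, 2.0)
    assert grid.cells[50][58] == 1   # 2 m / 0.25 m per cell = 8 cells from centre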
Reference is now made to Fig. 5, which is a vertical cross section 500 of the drone 400 of Fig. 4, in accordance with an embodiment of the present invention. This figure shows the embedded (in-body) power plant (batteries or fuel tanks 504) of the drone, landing gear 502 and a battery holder door 506.
Reference is now made to Fig. 6A, which is a top view 600 of a drone showing at least one collision sensor 602, in accordance with an embodiment of the present invention. The collision sensor shown in Fig. 6A is an upper face collision sensor 602.
Reference is now made to Fig. 6B, which is a bottom view of a drone 610 showing at least one collision sensor 612 (lower face collision sensor), in accordance with an embodiment of the present invention. A gun-holder 616 can also be seen, as well as propeller arms 614, each adapted to hold one propeller 406, or two in a coaxial engine configuration.
Reference is now made to Fig. 6C, which is a side view of a drone 620 showing at least one threat detection sensor 624 and a gun holder 622, in accordance with an embodiment of the present invention.
Thus, Figs. 6A-6C show the anti-collision sensors of drone 400, enabling it to effectively self-navigate and/or be controlled in a dense urban confined and complex indoors environment.
Reference is now made to Fig. 7A, which is a top view 700 of drone 400 showing at least one optical sensor 312, 314, 316, in accordance with an embodiment of the present invention. According to some embodiments, the drone comprises replaceable HD cameras 314, 316, adapted for either day-only or day-and-night operation. Reference is now made to Fig. 7B, which is a side view 710 of drone 400 showing at least one optical sensor 712, in accordance with an embodiment of the present invention.
Reference is now made to Fig. 7C, which is another side view 720 of drone 400 showing a phone/camera attachment element 724 and at least one optical sensor 722, in accordance with an embodiment of the present invention.
Reference is now made to Fig. 8A, which is a top view 800 of drone 110 showing at least one motion detection sensor (acoustics/laser) 802, adapted to act as a gunshot detection sensor, in accordance with an embodiment of the present invention.
Reference is now made to Fig. 8B, which is a bottom view 810 of drone 110 showing the position of at least one motion detection sensor 812, in accordance with an embodiment of the present invention. These are motion sensors on each side of the drone, which together cover 360 degrees. The sensors on the top/bottom are the distance/anti-collision sensors.
Reference is now made to Fig. 8C, which is a side view 820 of a drone showing the position of at least one motion detection sensor 822, in accordance with an embodiment of the present invention.
Reference is now made to Fig. 9A, which is a side view 900 of a drone showing at least one combat element 902 (handgun) with a laser pointer 904, in accordance with an embodiment of the present invention. Reference is now made to Fig. 9B, which is another side view 910 of a drone showing at least one combat element 912 (submachine gun), in accordance with an embodiment of the present invention. Turning to Fig. 9C, there is seen another side view 920 of drone 110 showing at least one combat element 922 (hand grenade), in accordance with an embodiment of the present invention. System 100 (Fig. 1) is thus configured to use these combat elements and sensors, thereby enabling the system to function effectively in combat (search and engage, incriminate and eliminate).
Reference is now made to Fig. 10A, which is a top view 1000 of drone 110 showing at least one antenna 1002, in accordance with an embodiment of the present invention. Turning to Fig. 10B, a side view 1010 of drone 110 is seen, showing the at least one antenna, in accordance with an embodiment of the present invention.
Reference is now made to Fig. 11A, which is a simplified pictorial illustration showing a portable ground station 1100 of system 100 of Fig. 1, in accordance with an embodiment of the present invention. The portable ground station 1100 comprises a screen 1102 on which the screen shots shown hereinbelow can be displayed. The ground station is configured as a game-like, user-friendly system controller having a drone height and rotation controller 1112, a drone directional controller 1110, an X-Y controller 1104 and further control buttons 1106, 1108.
Reference is now made to Fig. 11B, which is another simplified pictorial illustration showing a portable touch-screen ground station 1120 of system 100 of Fig. 1, equipped with a touchscreen 1122, in accordance with an embodiment of the present invention.
Reference is now made to Fig. 12, which is a simplified flow chart of a method 1200 for complex indoor combat, in accordance with an embodiment of the present invention. In this flowchart, a mobile configuration of system 100 is provided.
The ground station 120, or sometimes the entire system 100, is carried to an engagement/combat site by a task team (not shown) in a carry-system-to-site step 1202. This configuration is relevant to the following implementations: military - ground forces, military - special forces, law enforcement, and intelligence & security organizations. Upon call, the task team arrives at the site and deploys the system: it takes the hover craft (drone 110) out of its case (201, Fig. 2). In an activating step 1204, the personnel (handler) turns on the computer 210 (Fig. 2).
In a mission planning step 1206, the handler then launches a mission planner software application module (not shown; in computer 210), installed on a tablet/computer, where the mission parameters and required behavior of the system are configured.
In a choosing mode of operation step 1208, the handler decides on the suitable mode of operation for the given mission, scenario and task team's doctrine, selected from: a) a semi-autonomous mode activation step 1210, in which the handler flies the system and manages its priorities and engagement, yet is robotically supported with:
- avoiding obstacles- walls, shafts, doorways, staircases, et cetera.
- threat map - motion, gunshot and sound sensors alert, identify and direct towards potential threats
- video analytics - identifies known threats, differentiates friend/foe, prioritizes threats and directs towards threats
- 3d mapping- sensors are mapping the site for immediate and future navigation
- navigation - the handler can navigate the system by either real-time mapping or by a provided map (without GPS); or b) an autonomous mode activation step 1212, in which the system is fully autonomous and acts as per pre-defined priorities, behaviors and algorithms:
- avoiding obstacles- walls, shafts, doorways, staircases, et cetera. - addressing threats- motion, gunshot and sound sensors alert, identify and direct towards potential threats
- video analytics - identifies known threats, differentiates friend/foe, prioritizes threats and directs towards threats
- 3D mapping - sensors map the site in real time for immediate and future navigation
- navigation- the system can navigate by either real-time mapping or by a provided map (without GPS). A screen shot associated with step 1208 is shown in Fig. 15 hereinbelow.
Having chosen to activate one of the two above modes, the handler then chooses whether to activate video analytics in a video analytics activation step 1214. A screen shot associated with the video analytics step is shown in Fig. 16. The handler decides whether to use the video analytics module for the given mission, scenario and task team's doctrine. If he activates the video analytics in step 1214, then the system is constructed and configured to biometrically analyze pre-defined profiles in a real-time analysis step 1216, enabling one or more of the following: a. a handler can load a profile (image) from a database (not shown) in the computer; b. a handler can insert and load a recently taken picture of a target; c. several profiles can be loaded at the same time; the system is operative to search for and mark each identified profile on an HUD (heads-up display 1900, Fig. 19), thereby triggering the "prioritization" and "engagement" algorithms described herein; d. target profiles can be prioritized per handler directive, thus triggering the "prioritization" and "engagement" algorithms.
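A much-simplified sketch of the profile-matching step follows. It assumes that the video-analytics module supplies face embeddings (numeric vectors) both for detected faces and for each loaded profile; that interface, the threshold and all names are illustrative assumptions, not a description of any particular analytics product.

    import math

    def cosine_similarity(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    def match_profiles(face_embedding, profiles, threshold=0.8):
        """Return (name, score, priority) for loaded profiles matching a face.

        `profiles` maps a profile name to (embedding, priority); priority 1 is
        highest. Matches are sorted by handler-set priority, then by score.
        """
        hits = [(name, cosine_similarity(face_embedding, emb), prio)
                for name, (emb, prio) in profiles.items()]
        hits = [h for h in hits if h[1] >= threshold]
        return sorted(hits, key=lambda h: (h[2], -h[1]))

    # Example with toy 3-dimensional "embeddings": only target B clears the threshold.
    profiles = {"target A": ([1.0, 0.0, 0.0], 2), "target B": ([0.0, 1.0, 0.0], 1)}
    print(match_profiles([0.1, 0.99, 0.0], profiles))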
If the video analytics is not activated in step 1214, then the system does not biometrically analyze pre-defined people's profiles, but is operative to address real-time threats only, in an address step 1218:
- marks people on the HUD
- marks friend or foe on the HUD
- marks imminent threats on the HUD (for example, gunshots, movements)- see Figs. 20 and 21.
- threats can be prioritized per handler directive thus triggering the "prioritization" and "engagement" algorithms.
Thereafter, in a defining rounds and bursts step 1220, the handler decides on the suitable number of projectiles (i.e. rounds) in the magazine and on the number of projectiles shot per pull of the trigger (i.e. burst) for the given mission, scenario and task team's doctrine. Screen shots associated with this step are shown in Figs. 17 and 18 hereinbelow.
The handler is operative in the defining rounds and bursts step 1220 to define/set a number of rounds in the magazine of a combat element on the drone: the handler sets the number of projectiles (i.e. rounds) in the magazine of the at least one combat element on the drone, in a rounds-in-magazine setting step 1222, enabling the system to count and display on the HUD the ammunition status in order to follow its consumption (see Figs. 17 and 18). This is followed by: i) a set rounds per burst step 1224 - the handler sets the number of projectiles (i.e. rounds) released per each pull of a trigger; ii) a choose mode of engagement step 1226, in which the handler decides on the suitable mode of engagement for the given mission, scenario and task team's doctrine; iii) an activate autonomous shot mode step 1228 - shoot on sight - the system autonomously engages and shoots a projectile at a threat immediately as it identifies it; iv) an activate manual shot mode step 1230, in which the handler triggers the shot from the ground station to the combat element on the drone; according to some embodiments, the system suggests engagement options to the handler by notifications and/or markings on the HUD (based on the threat prioritization algorithm and/or video analytics); and finally, v) a system launch step 1232 - go - the pre-mission planning enabled by the mission planning module ends here with the launch of the real-time control of the drone, enabled by the HUD module (see Fig. 19).
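The pre-mission parameters gathered in steps 1208-1230 can be regarded as a single mission-plan record that is validated before launch (step 1232). The following dataclass is a non-limiting sketch; the field names, defaults and validation rules are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class MissionPlan:
        """Mission-planner settings gathered before launch (names illustrative)."""
        operation_mode: str = "semi-autonomous"   # step 1208: or "autonomous"
        video_analytics: bool = False             # step 1214
        rounds_in_magazine: int = 15              # step 1222
        rounds_per_burst: int = 1                 # step 1224
        engagement_mode: str = "manual"           # step 1226: or "shoot-on-sight"

        def validate(self):
            if self.operation_mode not in ("semi-autonomous", "autonomous"):
                raise ValueError("unknown mode of operation")
            if self.engagement_mode not in ("manual", "shoot-on-sight"):
                raise ValueError("unknown mode of engagement")
            if self.rounds_per_burst > self.rounds_in_magazine:
                raise ValueError("burst length exceeds magazine capacity")

    plan = MissionPlan(operation_mode="autonomous", video_analytics=True,
                       rounds_in_magazine=30, rounds_per_burst=3,
                       engagement_mode="shoot-on-sight")
    plan.validate()   # launch (step 1232) proceeds only after validation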
Reference is now made to Fig. 13, which is a simplified flow chart of a standby mode method 1300 for complex indoor combat, in accordance with an embodiment of the present invention.
In an activate standby mode step 1302, system 100 is deployed on-site; the engines 404 (Fig. 4) of drone 400 are switched off but all detection sensors onboard the drone are on (sensors, camera, video analytics, et cetera). In an identifying threat step 1304, any threat which is detected by the drone and/or ground station is defined as an active threat.
The drone is then activated and switched on in an activation step 1306.
In a choosing mode of operation step 1308, a handler chooses a mode of operation per a potential threat. If a semi-autonomous mode is chosen in step 1309, then the handler is operative to activate the drone in semi-autonomous mode (per a notification by the system). If an autonomous mode was chosen in an autonomous mode choosing step 1307, then the drone is activated to function autonomously: the drone re-fires its engines and engages the threat (autonomously) after auto-launching in an auto-launch drone step 1310. The pre-mission planning enabled by the mission planning module (ground station 120, Fig. 1) ends here with the launch of the real-time control of the system, enabled by the HUD module (see Fig. 19). Per pre-defined directives set in the mission planner module (ground station) - see the detailed process in flowchart 1200 - the system addresses the threat. This configuration is relevant to the following implementations (refer to the best mode of implementation): anti-piracy - fighting (onboard) pirates, homeland security - facilities and installations, indoor fire fighting, and alarm verification. Sensors and/or video analytics identify a threat in step 1304 and trigger the system/drone; the system acts as it was preset to in the mission planner module. Alternatively, if a semi-autonomous mode is chosen in step 1308, then the handler needs to decide, in a decision step 1311, whether to activate the drone to engage the threat in a drone activation step 1312, or to leave it dormant in a standby mode (step 1314).
Reference is now made to Fig. 14, which is a simplified flow chart of a sleep mode method 1400 for complex indoor combat, in accordance with an embodiment of the present invention. In a choosing mode of operation step 1402, the handler decides on a mode of operation, such as a sleep mode. In a set sleep mode configuration step, the drone (110, 400, et cetera) is deployed on-site in a deploying on-site step 1404; its engines and detection sensors are off (sensors, camera, video analytics, et cetera). As per an order, command or alarm, the system will start its engines and detection sensors and will engage the threat (autonomously or by the handler). This configuration is relevant to the following implementations: anti-piracy - fighting (on-board) pirates, homeland security - facilities and installations, indoor fire-fighting and alarm verification. In a threat detection step 1406, an alarm is activated and the drone is triggered.
The handler decides whether to activate the drone autonomously or semi-autonomously, by remotely activating that mode in steps 1408, 1410, respectively.
If in semi-autonomous mode, the system is operative to ask the handler whether to engage a threat in an asking step 1412. Typically, this is performed by the handler receiving a message on his phone or at the ground station. If yes, the drone is activated in a drone activation step 1414. If no, the drone remains in a sleep mode (step 1416). Per pre-defined directives set in the mission planner module (ground station 120, Fig. 1), system 100 addresses the threat; according to one embodiment, the system acts as it was preset to in the mission planner module. If the autonomous mode was activated in step 1418, then, upon detection of a threat, the drone is auto-launched. The pre-mission planning enabled by the mission planning module (ground station) ends here with the launch of the real-time control of the drone, enabled by the HUD module (see Fig. 19).
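The standby (Fig. 13) and sleep (Fig. 14) flows reduce to a small state machine. The transition function below is a non-limiting sketch with simplified state names; the actual flows additionally distinguish engines-off-sensors-on (standby) from all-systems-off (sleep).

    def next_state(state, threat_detected, mode, handler_approves=None):
        """Simplified transition rule for the dormant-mode flows."""
        if state in ("standby", "sleep") and threat_detected:
            if mode == "autonomous":
                return "engaging"             # auto-launch (cf. steps 1310, 1418)
            if handler_approves is True:
                return "engaging"             # handler activates the drone (step 1414)
            if handler_approves is False:
                return state                  # drone remains dormant (step 1416)
            return "awaiting-handler"         # system asks the handler (step 1412)
        return state

    assert next_state("sleep", True, "autonomous") == "engaging"
    assert next_state("standby", True, "semi-autonomous") == "awaiting-handler"
    assert next_state("standby", False, "autonomous") == "standby"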
Reference is now made to Fig. 15, which is a screen shot 1500 on a screen 1502 of a ground station 120 (Fig. 1) for activating drone 110, in accordance with an embodiment of the present invention.
A handler or user may choose, according to a location, time of day, threat type et cetera, whether to activate drone 110 in a semi-autonomous mode, by activating a semi- autonomous activation element 1504. Alternatively, he/she may activate the drone in an autonomous mode by activating an autonomous activation element 1506.
The elements on the screen may be activated by touch or by a cursor on an area of the screen, as is known in the art. Screen 1502 further comprises a display of date and time data 1522, next buttons 1508, tabs for a mission 1510, an analytics tab 1512, a burst tab 1514, an engagement tab 1516, a heads up display tab 1518 and a settings tab 1520. There may also be one or more operator data tabs 1524, including a name and password entry tab, as is known in the art.
Reference is now made to Fig. 16, which is a screen shot 1600 on a ground station screen 1502 for video analytics activation of drone 110, in accordance with an embodiment of the present invention.
The screen for video analytics activation 1502 typically comprises a first face image 1602, a second face image 1604, a third face image 1606 and a fourth face image 1608. Associated with each face image is respective face data 1610, 1612, 1614, 1616. This data may include, for example, a name, an age, a gender, an organization affiliation, a danger level and the like. Additionally, there may be spaces 1618 for uploading additional face data. Additionally or alternatively, the face images may be replaced with full body images. Screen shot 1600 may further comprise a prioritize button 1624 for adjusting, resetting or setting the relative prioritizations of the people's image profiles (1602, 1604, 1606, 1608): the higher the priority, the higher the priority for the drone to engage the actual person associated with the image. The screen shot may further comprise a drag-or-drop symbol 1628, a skip button 1620 and a next button 1622.
Reference is now made to Fig. 17, which is a screen shot 1700 on a ground station screen 120 (Fig. 1) for activation of rounds and bursts in system 100 of Fig. 1, in accordance with an embodiment of the present invention.
A user can set a number of rounds in a magazine by manipulating a rounds control number using button 1704 in a rounds control element 1702 and a number of rounds per burst by manipulating a rounds control per burst number 1708 in a rounds per burst control element 1706.
Reference is now made to Fig. 18, which is a screen shot 1800 on a ground station screen for choosing a mode of engagement in the system of Fig. 1, in accordance with an embodiment of the present invention. A user can choose a shoot on sight mode by activating an autonomous mode select element 1802 or a manual shot by activating a manual mode select element 1804.
Reference is now made to Fig. 19, which is a screen shot 1900 on a ground station screen for heads up display (HUD) in the system of Fig. 1, in accordance with an embodiment of the present invention.
Screen shot 1900 presents the handler controller's graphic user interface (GUI) and its components: a mini map 1934, a current speed on a speedometer 1906, an altitude on an altitude meter 1908, a local time on date and time data 1522, a battery/energy indicator with a time estimate of the power percentage remaining 1912, available ammunition data with the type and number of rounds in the munitions store 1916, a firing crosshair 1918 with a threat directional indicator 1920, a compass 1902 and an artificial horizon or azimuth 1904. The HUD further shows a threat map 1992, showing a triangle of a shot 1930, a shot threat detected 1924, movement detection 1926 and a triangle of movement 1932 (shown in greater detail in Figs. 20 and 21, which deal with the threat map rather than with the 3D map).
The screen shot further shows a return to base button 1942, an ambush or standby button 1944, an operation button 1946, a video analytics button 1948 and a 3D map button 1950.
Reference is now made to Fig. 20, which is a simplified threat map 2000 in the system of Fig. 1, in accordance with an embodiment of the present invention.
The threat map includes a drone orientation direction 2001, a camera coverage area 2002, a gunshot direction 2003, a movement direction 2004, a movement and gunshot direction 2005, a gunshot location and position 2006, a movement location and position 2007 and a movement and gunshot location and position 2008.
Reference is now made to Fig. 21, which is another simplified threat map 2100 in the system of Fig. 1, in accordance with an embodiment of the present invention. The threat map includes a drone orientation direction 2101, a camera coverage area 2102, a gunshot direction 2103, a movement direction 2104, a movement and gunshot direction 2105, a gunshot location and position 2106, a movement location and position 2107 and a movement and gunshot location and position 2108.
The modes of operation of drone 110 are described further as follows, with reference to items shown in the drawings, particularly with reference to Figs. 12- 14.
Control Application: Mission Planner and Heads Up Display - Main Functions
1. Modes of Operation
1.1. Semi-Autonomous Mode - The handler flies the system, manages its priorities and engagements, yet is robotically supported with:
- Avoiding obstacles- walls, shafts, doorways, staircases, Et cetera.
Threat Map- Motion, Gunshot and sound sensors alert, identify and direct towards potential threats
Video Analytics - identifies known threats, differentiates friend/foe, prioritizes threats and directs towards threats
- 3D mapping- Sensors are mapping the site for immediate and future navigation
Navigation- The handler can navigate the system by either real-time mapping or by a provided map (without GPS)
1.2. Autonomous Mode - The system is fully autonomous and acts as per pre-defined priorities, behaviors and algorithms:
Avoiding obstacles- walls, shafts, doorways, staircases, Et cetera.
Addressing threats- Motion, Gunshot and sound sensors alert, identify and direct towards potential threats
Video Analytics - identifies known threats, differentiates friend/foe, prioritizes threats and directs towards threats
3D mapping- Sensors are mapping the site in real-time for immediate and future navigation
Navigation- The system can navigate by either real-time mapping or by a provided map (without GPS)
1.3. Next- will save the chosen mode and will proceed to the next planning phase.
2. Video Analytics
2.1. Skip - The mission will be performed with limited Video Analytics capabilities:
The System does not biometrically analyze pre-defined people's profiles, and addresses real-time threats only
- Marks people on the HUD
Marks friend or Foe on the HUD
Marks imminent threats on the HUD (For example, gunshots, movements)
Threats can be prioritized per handler directive thus triggering the "Prioritization" and "engagement" Algorithms
2.2. Upload Photo - The system biometrically analyzes pre-defined profiles in real-time:
- Handler can load a profile (image) from the database
- Handler can insert and load a recently taken picture of target
Several profiles can be loaded at the same time
2.3. Priority Set- If several profiles are loaded, the Priority Set option is activated.
Thus, the Handler can prioritize the objectives:
The system can then search for and mark each identified profile on the HUD (thus triggering the "Prioritization" and "Engagement" algorithms)
Profiles can be prioritized per handler directive thus triggering the "Prioritization" and "engagement" Algorithms
2.4. Drag to Prioritize Objectives- By 'dragging and dropping' the objectives' photos, the Handler can reset the prioritization of the objectives
2.5. Next- will save the chosen mode and will proceed to the next planning phase.
3. Rounds and Bursts
3.1. Rounds in magazine- Handler sets the number of projectiles in Magazine in order to follow their consumption
3.2. Rounds per burst- Handler sets the number of projectiles released per each pull of a trigger
3.3. Next - will save the chosen mode and will proceed to the next planning phase.
4. Mode of Engagement
4.1. Autonomous Shot - Shoot on sight: The system autonomously engages and shoots a projectile at a threat (based on the Threat Prioritization algorithm)
4.2. Manual Shot - Handler will trigger the shot: The system suggests engagement options to the Handler (based on the Threat Prioritization algorithm). The Handler decides whether to engage and shoot a projectile.
4.3. Next - will save the chosen mode and will proceed to the App's second module, the Heads Up Display.
5. Heads Up Display (hereafter HUD; Fig. 19)
5.1. Number of remaining/available rounds
5.2. Crosshair - Graphic aim assistance
5.3. Directional Hit Indicator - Surrounding the crosshair, 8 colored slices indicate the type and direction of the sensors' indications (a non-limiting slice-index sketch follows the color list below):
- Red for gunshot;
- Orange for movement;
- Red-orange zebra for movement + gunshot.
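Mapping a sensor bearing to one of the eight slices, and a slice to its color, is straightforward. The sketch below assumes bearings measured clockwise from the drone's nose; that convention and the function names are illustrative only.

    def hit_slice(bearing_deg: float) -> int:
        """Map a bearing (0 = drone nose, clockwise) to a slice index 0..7."""
        return int((bearing_deg % 360) // 45)

    def slice_color(gunshot: bool, movement: bool) -> str:
        """Color coding of the directional hit indicator described above."""
        if gunshot and movement:
            return "red-orange zebra"
        return "red" if gunshot else "orange"

    assert hit_slice(10) == 0      # just right of the nose
    assert hit_slice(350) == 7     # just left of the nose
    assert slice_color(True, True) == "red-orange zebra"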
5.4. Threat Map - Graphic display mapping the type and direction of the sensors' indications, in relation to the hover craft's movement
5.5. 3D Map- Sensors are mapping the site for immediate and future navigation.
The sensors' 360 degrees coverage maps the surroundings and thus enables the display of a dynamically created 3D Map
The map is automatically saved and stored for later use
5.6. Video Analytics (frames) markings- Marks friend or Foe with prioritization numbering
Different colors are used for friend and foe.
- A friend will never be engaged.
The handler may change the priority, and thus the numbering, in real-time by touching the objective he now prioritizes as number one.
5.7. Return to Base - The system will navigate back to its departure location by either real-time mapping (see 5.5), a provided map or, if available, GPS (a non-limiting breadcrumb-retracing sketch follows).
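In the real-time-mapping case, return to base can be as simple as retracing the recorded flight path. The sketch below assumes a breadcrumb list of poses logged on the way out and is illustrative only; with GPS available, a direct route could be computed instead.

    def return_to_base(breadcrumbs):
        """Retrace the stored outbound path in reverse order."""
        return list(reversed(breadcrumbs))

    # Example: poses logged while flying out, replayed backwards to the start point.
    outbound = [(0, 0), (3, 0), (3, 4), (7, 4)]
    assert return_to_base(outbound) == [(7, 4), (3, 4), (3, 0), (0, 0)]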
Ambush/Standby Mode (Fig. 13) - The system will go into ambush mode, where it lands (autonomously or by the handler) and shuts its engines but keeps ALL detection systems ON (sensors, camera, video analytics, et cetera). Upon a detected threat, the system will restart its engines and will engage the threat (autonomously or by the handler).
Operation - The handler can choose to change the Mode of Operation (see 1.) during an actual mission, rather than only as a pre-defined mode at the Mission Planner module.
Analytics - The handler can launch the Video Analytics interface (see 2.) during an actual mission, rather than only as a pre-defined mode at the Mission Planner module. If the handler skipped this phase at mission planning, he can still launch it through the HUD.
3D Map - As per his preferences, the handler can remove/display the 3D Map overlay.
a. Navigation -
i. On (to) target: GPS and non-GPS (in GPS-denied areas) auto-independent navigation (using its embedded sensors)
ii. Off target/come home: GPS and non-GPS (in GPS-denied areas) auto-independent navigation (using its embedded sensors)
b. Pack/squadron coordinated flight/fight support -
i. Formation flight and attack algorithm
ii. 'One leader, others follow' or 'coordinated flight and attack of independent drones'
c. Night/day and open/urban replaceable camouflage
d. Socket-enabled charging (any socket, any current)
e. Switchable batteries for a quick return to action
f. Secured datalink - line-of-sight (LOS) and non-line-of-sight (NLOS) data link for indoor scenarios
Drone 400 Stalking Functionality and Components
a. Observation - Dual HD Camera (day/Night)
b. Motion sensors - Direction and distance indicators for immediate threat detection and engagement
c. Acoustic/Laser Gunshot Sensor - Direction and distance indicators for immediate threat detection and engagement
d. Video Content Analytics - i. Automatic Intrusion detection - Ensures perimeter control ii. Automatic Abnormal behavior detection - Minimizing civilian casualties by incriminating the real threats
iii. Face Recognition - Spotting and identifying a pre-defined target
iv. Avoiding "friendly-Fire"
e. Silent-wait mode support
i. Sitting, scouting, content analyzing
ii. Stalking/ambush mode: all systems ON, engine OFF, with immediate engine-restart support
f. Zoom/Crosshair support - Targeting and shooting accurately.
Drone 400 Engagement Functionality and Components
a. Threat detection and engagement algorithm -
i. Collecting data from location sensors, sensors and Video Analytics
ii. Integrating the data into one coherent threat/combat picture
iii. Engaging the threat
b. Weapon payloads
i. Standard issued 9mm handgun (For example, Glock 19), or ii. Submachine gun- (For example, Heckler & Koch MP7)
Day/Night (green) Laser designator
iii. Grenade (all NATO variants)
c. Remote operated trigger (for handgun/submachine gun and grenade)
System 100 Controller and Pilots' Display Functionality and Components
a. Game-like controller
b. 10"-15" (touch) Screen display
c. One GUI screen with functional layers (interactive real-time popups) - i. Mini map -
1. showing the position of the drone over a map
2. Clicking on it will enlarge it
3. Will show friendly and enemy entities
ii. Current Speed
iii. Altitude
iv. Local Time
v. Battery/Energy (with time estimate of conclusion %) vi. Available ammunition - type and number of rounds (bullets) vii. Firing crosshair- see below
viii. Compass ix. Artificial horizon plus crosshair.
Some non-limiting examples of use of the drones and systems of the present invention are provided hereinbelow.
Law Enforcement - Police and SWAT (local to national level) a. Out-of-car pursuit;
b. Eliminating entrenched criminal threats;
c. Providing cover and aggressive recon at Indoor scenarios; d. Close Quarters Combat Scenarios management, engagement and support (for example, hostage rescue, arrests)
e. Interacting with the suspect (embedded loud-speaker).
Anti-piracy - fighting (on-board) pirates
a. The drone is to be stationed onboard a ship, at strategic locations, such as, the bridge, engine room, crew quarters, et cetera.
b. The drone addresses the on-board pirates and is operative to engage them under various operational scenarios. For example, boarding prevention, hijacking prevention, hostage rescue, securing the bridge and staff.
c. The drone uses the distributed Wi-Fi network to operate within the vessel and satellite network for external communication.
Homeland Security - Facilities and Installations
a. Guarding and proactive protection of complex facilities
b. Indoor /Outdoor deployment.
Military - Ground Forces
a. Eliminating snipers, urban ambushes and enemy scouts.
b. Providing cover and aggressive recon in urban combat scenarios.
c. Close quarters combat (indoor) management, engagement and support. d. Supporting/replacing human soldiers.
Military - Special Forces
a. Eliminating indoor designated targets.
b. Initiating indoor ambushes.
c. Searching in tunnels.
d. Providing cover and aggressive recon in urban scenarios. e. Close quarters combat management, engagement and support f. Hostage rescue.
Intelligence & Security Organizations
a. Eliminating indoor designated targets ("targeted killing")
b. Initiating Indoor Ambushes
c. Supporting/replacing human operatives.
Indoors fire Fighting
a. Responding to a fire alarm (For example, in a warehouse)
b. Locating the fire source by various sensors (For example, heat
detector)
c. Deploying a Fire Extinguisher to extinguish the fire at its early stages (Simply replacing the weapons payload by a Fire Extinguisher).
Alarm Verification
a. Responding to an alarm (in warehouse, homes, facilities).
b. Locating the alarms' trigger source and thus verifying the validity of the alarm and the level of threat.
c. Deploying a non-lethal weapon (For example, electric stun-gun) to hold the intruder (Simply replacing the weapons payload by a non- lethal weapon).
MODE OF IMPLEMENTATION-ONBOARD SHIP
a. Fighting piracy- The system is to be stationed onboard the ship, at strategic locations. For example, the bridge, engine room, goods' storage, et cetera.
b. When triggered; by a direct order, command or an alarm, the system can operate autonomously and/or piloted; on a single or a coordinated pack configuration.
c. Optional triggers:
i. General alarm; initiated by the crew
ii. Command, initiated by the Video Content Analytics (either of the ship or of the system itself) or by the embedded sensors once movement or gunshots are identified; iii. Direct order by the crew or by a remote/online security officer located (remotely) at base.
1. Control options: mobile mode (Fig. 12);
standby/ambush mode (Fig.13) and sleep mode (Fig. 14).
2. For example, a threat that is firing, entering a forbidden zone, and the like.
d. As the system is ON and deployed, it may prevent, combat and foil the attack of the boarding pirates. Several examples follow:
i. Boarding Prevention - Will engage and attack pirates as they try to board the ship or at their initial boarding stages;
ii. Securing the Bridge and Staff - Will engage and attack pirates as they try to breach into the Bridge or Staff Quarters;
iii. Hostage Rescue - With its Sensors and Video support, the system may eliminate Pirates holding hostages. Its ability to move fast, differentiate between friend and foe and be accurate makes it perfect for the task;
iv. Guarding and preventing access - With its embedded Video Content Analytics (or that of the ship), the system can identify a breach or an unauthorized access to a secure perimeter and respond;
v. Hijacking Prevention - The access prevention and secured bridge are to foil any attempt to hijack the ship, its crew and its cargo;
vi. Eliminating entrenched pirates - Its small size, agility, and accuracy enables the system to reach and eliminate a threat even if entrenched; and
vii. Providing cover and aggressive recon - The system can lead the crew to safety by providing reconnaissance and armed cover.
It is to be understood that the invention is not limited in its application to the details set forth in the description contained herein or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways. Those skilled in the art will readily appreciate that various modifications and changes can be applied to the embodiments of the invention as hereinbefore described without departing from its scope, defined in and by the appended claims.

Claims

1. A system for indoors engagement of a threat comprising:
a. a motorized apparatus adapted to hover and hold:
i. at least one engagement element;
ii. at least one detection element adapted to convey detection data;
iii. an apparatus location element adapted to convey system location data;
iv. an onboard processor adapted to receive and process said detection data from said at least one detection element and said location data from said at least one apparatus location data element, said processor being configured to output at least one command to said at least one engagement element to inactivate said threat; and
v. at least one communication element; and
b. a ground station for monitoring said motorized apparatus and for receiving data from said at least one communication element.
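By way of non-limiting illustration, one possible realization of the processing chain recited in claim 1 is sketched below; every interface (detector.read, locator.read, comms.send, engagement.aim/trigger) and the classify placeholder are assumptions of this sketch, not the claimed structure.

```python
import time


def classify(detection, location):
    """Placeholder threat decision: engage any detection flagged
    hostile. The real decision logic is not specified by this sketch."""
    if detection and detection.get("hostile"):
        return detection
    return None


def engagement_loop(detector, locator, engagement, comms, period_s=0.05):
    """Sense-process-act loop over the elements recited in claim 1:
    read detection data (ii) and system location data (iii), stream
    both to the ground station (v, b), and command the engagement
    element (i) when a threat is confirmed (iv)."""
    while True:
        detection = detector.read()
        location = locator.read()
        comms.send({"detection": detection, "location": location})
        threat = classify(detection, location)
        if threat is not None:
            engagement.aim(threat["bearing"])
            engagement.trigger()
        time.sleep(period_s)
```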
2. A system according to claim 1, wherein said processor is further adapted to perform video analytics on said detection data.
3. A system according to claim 2, wherein said processor is further adapted to identify said threat responsive to said video analytics.
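By way of non-limiting illustration, the sketch below shows a conventional video-analytics front end (background subtraction yielding moving-object boxes) of the kind claims 2-3 could build on for threat identification; it uses the standard OpenCV 4.x API, and the threshold values are assumptions of this sketch.

```python
import cv2


def detect_movers(frames, min_area_px=500):
    """Background subtraction over a frame sequence, returning per-frame
    bounding boxes of moving objects for a downstream threat
    classifier."""
    subtractor = cv2.createBackgroundSubtractorMOG2()
    boxes_per_frame = []
    for frame in frames:
        mask = subtractor.apply(frame)
        # Keep only confident foreground pixels (MOG2 marks shadows ~127).
        _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        boxes_per_frame.append([cv2.boundingRect(c) for c in contours
                                if cv2.contourArea(c) >= min_area_px])
    return boxes_per_frame
```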
4. A system according to claim 3, wherein said motorized apparatus comprises robotics adapted to activate said at least one engagement element to engage and inactivate said threat.
5. A system according to claim 1, wherein said motorized apparatus is adapted to navigate three-dimensional indoor contours.
6. A system according to claim 1, wherein said at least one engagement element is selected from the group consisting of a gun, a missile, a projector, a gas canister, a fire extinguisher, a grenade, a non-lethal weapon, an immobilizer weapon, a submachine gun and combinations thereof.
7. A system according to claim 1, wherein said at least one detection element is selected from the group consisting of a sound sensor, an infrared sensor, a position sensor, an ultrasonic sensor, a movement detection sensor, a camera, an optic flow sensor, a laser, an optic visualization element and combinations thereof.
8. A system according to claim 1, wherein said apparatus location element is selected from the group consisting of a global positioning system (GPS) element, a position sensor, a camera, a smartphone, an optic sensor and combinations thereof.
9. A system according to claim 1, wherein said motorized apparatus and said ground station are adapted to communicate via at least one communication link.
10. A system according to claim 9, wherein said at least one communication link is selected from the group consisting of an IP communication link, a satellite communication link, a cellular communication link, an RF communication link and combinations thereof.
11. A system according to claim 1, wherein said ground station further comprises a screen adapted to display a real-time heads up display (HUD) of said motorized apparatus.
12. A system according to claim 1, wherein said motorized apparatus is unmanned and is selected from an airborne drone, a flying hovering apparatus, a plane, a helicopter and a hovercraft.
13. A system according to claim 1, wherein said ground station comprises:
a. a computer;
b. a hand-operated remote control apparatus;
c. an antenna; and
d. at least one communications link.
14. A system according to claim 1, wherein said motorized apparatus is a drone, adapted for indoor use.
15. A system according to claim 14, wherein said drone is adapted to move along vertical and horizontal conduits.
16. A system according to claim 15, wherein said drone comprises robotics adapted to support navigation and moving indoors of said drone.
17. A system according to claim 16, wherein said ground station comprises an on-screen heads up display (HUD).
18. A method for indoor engagement of a threat, the method comprising:
a) detecting said threat by a motorized hovering apparatus;
b) processing at least one of location data and detection data from said apparatus to determine a nature of said threat; and
c) activating at least one engagement element on said apparatus to inactivate said threat indoors, responsive to at least one of said location data and said detection data.
19. A method according to claim 18, wherein said detecting step comprises employing video analytics for identification of said threat.
20. A method according to claim 18, wherein said activating step enables said apparatus to act autonomously.
21. A method according to claim 18, wherein said activating step enables said apparatus to act semi-autonomously.
22. A method according to claim 18, wherein said activating step enables said apparatus to travel along horizontal and vertical conduits to approach said threat.
23. A method according to claim 18, wherein said apparatus is adapted for non-GPS navigation.
24. A method according to claim 18, wherein said apparatus is adapted for day and night activation.
25. A method according to claim 18, wherein said apparatus has a predefined mission plan.
26. A method according to claim 25, wherein said mission plan supports a plurality of engagement modes.
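By way of non-limiting illustration, a predefined mission plan supporting several engagement modes (claims 25-26) might be represented as below; the mode names and the fallback rule are assumptions of this sketch.

```python
from dataclasses import dataclass
from enum import Enum, auto


class EngagementMode(Enum):
    """Illustrative engagement modes; the names are assumptions."""
    RECON_ONLY = auto()       # observe and report, never fire
    SEMI_AUTONOMOUS = auto()  # operator confirms each engagement
    AUTONOMOUS = auto()       # engage within the predefined rules


@dataclass
class MissionPlan:
    waypoints: list
    allowed_modes: frozenset
    default_mode: EngagementMode = EngagementMode.RECON_ONLY

    def select_mode(self, requested):
        """Honor a requested mode only if the plan permits it."""
        return requested if requested in self.allowed_modes else self.default_mode
```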
27. A method according to claim 18, wherein said apparatus is controlled from a ground station by a heads up display.
28. A method according to claim 18, wherein said motorized hovering apparatus acts autonomously, without communicating with a ground station.
29. A method according to claim 18, wherein said engagement of said threat occurs in an underground environment.
30. A method according to claim 29, wherein said underground environment comprises at least one tunnel.
31. A method according to claim 30, wherein said motorized apparatus acts autonomously in said at least one tunnel.
32. A software product for indoor engagement of a threat, said product comprising a computer-readable medium in which program instructions are stored, which instructions, when read by a computer, cause the computer to:
a. detect said threat by a motorized hovering apparatus;
b. process at least one of location data and detection data from said apparatus to determine a nature of said threat; and
c. activate at least one engagement element on said apparatus to inactivate said threat indoors, responsive to at least one of said location data and said detection data.
33. A software product according to claim 32, wherein said product comprises a plurality of algorithms, each algorithm providing a set of instructions to said motorized apparatus to activate at least one of robotics and said at least one engagement element.
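By way of non-limiting illustration, the plurality of algorithms recited in claim 33 might be organized as a dispatch table keyed by algorithm name, each entry issuing instructions to the robotics, the engagement element, or both; the names and both interfaces are assumptions of this sketch.

```python
def build_algorithm_table(robotics, engagement):
    """Map algorithm names to the instruction sets they issue; each
    entry drives the robotics, the engagement element, or both."""
    return {
        "hover_and_scan": lambda: robotics.hover_scan(),
        "traverse_conduit": lambda: robotics.follow_conduit(),
        "engage_threat": lambda: (robotics.hold_position(),
                                  engagement.trigger()),
    }


def run_algorithm(table, name):
    """Look up and execute one stored algorithm by name."""
    return table[name]()
```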
34. A system for neutralization of an onboard threat on a waterborne vehicle, the system comprising:
a. a motorized apparatus adapted to hover and hold:
i. at least one engagement element;
ii. at least one detection element adapted to convey detection data;
iii. an apparatus location element adapted to convey system location data;
iv. an onboard processor adapted to receive and process said detection data from said at least one detection element and said system location data from said apparatus location element, said processor being configured to output at least one command to said at least one engagement element to inactivate said threat; and
v. at least one communication element; and
b. a ground station for monitoring said motorized apparatus and for receiving data from said at least one communication element.
35. A system for indoors engagement of a threat comprising:
a. a plurality of drones, each adapted to hover and hold:
i. at least one engagement element;
ii. at least one detection element adapted to convey detection data;
iii. an apparatus location element adapted to convey system location data;
iv. an onboard processor adapted to receive and process said detection data from said at least one detection element and said system location data from said apparatus location element, said processor being configured to output at least one command to said at least one engagement element to inactivate said threat; and
v. at least one communication element; and
b. a ground station for monitoring said plurality of drones and for receiving data from said at least one communication element on each said drone.
36. A system according to claim 35, wherein said plurality of drones is controlled by said ground station with coordinated flight/fight support.
37. A system according to claim 35, further comprising a flight formation and attack algorithm.
38. A system according to claim 37, wherein said system further enables the drones to form a flight formation.
39. A system according to claim 37, wherein said system further enables the drones to form a line in flight.
40. A system according to claim 37, wherein said system further enables coordinated flight of said plurality of drones.
41. A system according to claim 37, wherein said system enables an attack by one drone independently of the rest of the plurality of drones.
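By way of non-limiting illustration, two fragments of a flight-formation-and-attack algorithm of the kind recited in claims 37-41 are sketched below: line-abreast formation offsets and selection of a single attacking drone; the spacing parameter and the drone record format are assumptions of this sketch.

```python
import math


def line_formation_offsets(n, spacing_m=1.5, heading_rad=0.0):
    """Offsets (meters, leader frame) placing n drones in a line
    abreast, perpendicular to the leader's heading (claims 38-40)."""
    px, py = -math.sin(heading_rad), math.cos(heading_rad)  # perpendicular
    mid = (n - 1) / 2.0
    return [((i - mid) * spacing_m * px, (i - mid) * spacing_m * py)
            for i in range(n)]


def assign_attacker(drones, threat_position):
    """Send the single drone nearest the threat to attack while the
    rest hold formation (claim 41); a stand-in for the unspecified
    flight-formation-and-attack algorithm of claim 37."""
    return min(drones, key=lambda d: math.dist(d["position"], threat_position))
```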
PCT/IL2014/000043 2013-08-31 2014-08-28 Robotic system and method for complex indoor combat Ceased WO2015029007A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361872638P 2013-08-31 2013-08-31
US61/872,638 2013-08-31
US201361916815P 2013-12-17 2013-12-17
US61/916,815 2013-12-17

Publications (1)

Publication Number Publication Date
WO2015029007A1 (en) 2015-03-05

Family

ID=52585686

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2014/000043 Ceased WO2015029007A1 (en) 2013-08-31 2014-08-28 Robotic system and method for complex indoor combat

Country Status (2)

Country Link
IL (1) IL234372B (en)
WO (1) WO2015029007A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8521339B2 (en) * 2008-09-09 2013-08-27 Aeryon Labs Inc. Method and system for directing unmanned vehicles
US20110204188A1 (en) * 2010-02-24 2011-08-25 Robert Marcus Rotocraft

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DR. NILS MELZER, EUROPEAN PARLIAMENT, HUMAN RIGHTS IMPLICATIONS OF THE USAGE OF DRONES AND UNMANNED ROBOTS IN WARFARE, 3 May 2013 (2013-05-03), pages 7 - 13 *

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10698403B2 (en) 2015-03-12 2020-06-30 Alarm.Com Incorporated Robotic assistance in security monitoring
EP3268276A4 (en) * 2015-03-12 2018-12-05 Alarm.com Incorporated Robotic assistance in security monitoring
AU2022201806B2 (en) * 2015-03-12 2024-01-04 Alarm.Com Incorporated Robotic assistance in security monitoring
US11409277B2 (en) 2015-03-12 2022-08-09 Alarm.Com Incorporated Robotic assistance in security monitoring
AU2021269286B2 (en) * 2015-03-12 2022-01-27 Alarm.Com Incorporated Robotic assistance in security monitoring
AU2020213380B2 (en) * 2015-03-12 2021-11-04 Alarm.Com Incorporated Robotic assistance in security monitoring
FR3034884A1 (en) * 2015-04-07 2016-10-14 Pixiel SYSTEM FOR STARTING THE TAKE-OFF OF A DRONE FOLLOWING THE DETECTION OF AN EVENT IN A DETERMINED LOCATION
WO2016162342A1 (en) * 2015-04-07 2016-10-13 Pixiel System for triggering the take-off of a drone following the detection of an event that occurred at a predetermined location
BE1022965B1 (en) * 2015-04-21 2016-10-24 Airobot Assembly for unmanned aircraft, unmanned aircraft with the assembly, and method for controlling it
JP2017036007A (en) * 2015-08-12 2017-02-16 富士ゼロックス株式会社 Image formation device and image formation system
ES2607723A1 (en) * 2015-10-02 2017-04-03 Universidad De Castilla La Mancha Device for the distance detection of perturbating elements on a surface (Machine-translation by Google Translate, not legally binding)
DE102015014502A1 (en) * 2015-11-10 2017-05-11 Mbda Deutschland Gmbh Auxiliary airfoil device
EP3182390A1 (en) * 2015-12-08 2017-06-21 Micro APPS Group Inventions LLC Autonomous safety and security device on an unmanned platform under command and control of a cellular phone
US10768625B2 (en) 2016-01-20 2020-09-08 Alarm.Com Incorporated Drone control device
US10228695B2 (en) 2016-01-20 2019-03-12 Alarm.Com Incorporated Drone control device
WO2017127491A1 (en) 2016-01-20 2017-07-27 Babak Rezvani Drone control device
EP3405846A4 (en) * 2016-01-20 2019-01-23 Alarm.com Incorporated DRONE CONTROL DEVICE
WO2017137393A1 (en) * 2016-02-10 2017-08-17 Tyco Fire & Security Gmbh A fire detection system using a drone
US10134293B2 (en) 2016-03-21 2018-11-20 Walmart Apollo, Llc Systems and methods for autonomous drone navigation
RU2628351C1 (en) * 2016-04-14 2017-08-16 Сергей Николаевич ПАВЛОВ Anti-tank mine "strekosa-m" with possibility of spatial movement with hovering and reversibility in air, reconnaissance, neutralisation, and damage of mobile armoured targets
DE102016109242A1 (en) * 2016-05-19 2017-11-23 Keil Group GmbH monitoring system
US11009877B2 (en) 2016-07-12 2021-05-18 Minimax Gmbh & Co. Kg Unmanned vehicle, system, and method for initiating a fire extinguishing action
CN109416864A (en) * 2016-07-12 2019-03-01 德国美力有限两合公司 System and method for proven fire state determination and travel tool and central unit therefor
CN109416864B (en) * 2016-07-12 2020-12-22 德国美力有限两合公司 System and method for proven fire state determination and travel tool and central unit therefor
US10825335B2 (en) 2016-07-12 2020-11-03 Minimax Gmbh & Co. Kg System and method for the verified determining of a fire status, as well as vehicle and central unit for this purpose
WO2018010909A1 (en) * 2016-07-12 2018-01-18 Minimax Gmbh & Co. Kg System and method for the verified determining of a fire status, as well as vehicle and central unit for this purpose
WO2018063076A1 (en) * 2016-09-29 2018-04-05 Dynamic Solutions Group Sweden Ab Portable close air support system and payload carrier
GR1009387B (en) * 2016-10-04 2018-10-25 Θωμας Ηλια Σαραφης Robotic air intervention system providing emergency assistance to the users
GR20160100501A (en) * 2016-10-04 2018-06-27 Ηλιας Θωμα Σαραφης AUTOMATIC AIR INTERVENTIONAL INTERVENTION SYSTEM FOR USER ASSISTANCE
US11082639B2 (en) 2017-02-15 2021-08-03 SZ DJI Technology Co., Ltd. Image display method, image display system, flying object, program, and recording medium
JPWO2018150492A1 (en) * 2017-02-15 2019-12-19 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Image display method, image display system, flying object, program, and recording medium
WO2018150492A1 (en) * 2017-02-15 2018-08-23 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Image display method, image display system, flying object, program, and recording medium
GR1009313B (en) * 2017-03-30 2018-06-19 Τεχνολογικο Εκπαιδευτικο Ιδρυμα Ανατολικης Μακεδονιας Και Θρακης AUTOMATED RESCUE SYSTEM
US10551810B2 (en) 2017-06-20 2020-02-04 Ademco Inc. System and method to improve the privacy of homes and other buildings having a connected home security/control system and subject to intrusions by unmanned aerial vehicles
CN107643762A (en) * 2017-08-07 2018-01-30 中国兵器工业计算机应用技术研究所 The UAS and its air navigation aid of independent navigation
JP2019060589A (en) * 2017-08-25 2019-04-18 オーロラ フライト サイエンシズ コーポレーション Aerial vehicle interception system
KR102600479B1 (en) * 2017-08-25 2023-11-08 오로라 플라이트 사이언시스 코퍼레이션 Aerial vehicle interception system
EP3447436A1 (en) * 2017-08-25 2019-02-27 Aurora Flight Sciences Corporation Aerial vehicle interception system
US10495421B2 (en) 2017-08-25 2019-12-03 Aurora Flight Sciences Corporation Aerial vehicle interception system
US11064184B2 (en) 2017-08-25 2021-07-13 Aurora Flight Sciences Corporation Aerial vehicle imaging and targeting system
KR20190022406A (en) * 2017-08-25 2019-03-06 오로라 플라이트 사이언시스 코퍼레이션 Aerial vehicle interception system
US11126204B2 (en) 2017-08-25 2021-09-21 Aurora Flight Sciences Corporation Aerial vehicle interception system
JP2019064465A (en) * 2017-09-29 2019-04-25 株式会社エアロネクスト Propeller guard
DE102017223753A1 (en) * 2017-12-22 2019-06-27 Thyssenkrupp Ag Drone system, manholes for a drone system and method for transporting loads in a manhole with a drone
US11009887B2 (en) 2018-07-26 2021-05-18 Toyota Research Institute, Inc. Systems and methods for remote visual inspection of a closed space
CN109189099B (en) * 2018-11-09 2021-07-13 福州大学 A Graphical Control Configuration Method of Quadrotor UAV
CN109189099A (en) * 2018-11-09 2019-01-11 福州大学 A kind of graphical control configuration method of quadrotor drone
DE102019110205A1 (en) * 2019-04-17 2020-10-22 Krauss-Maffei Wegmann Gmbh & Co. Kg Procedure for operating a networked military formation
US11767129B2 (en) 2020-01-31 2023-09-26 Southeastern Pennsylvania Unmanned Aircraft Systems, Llc Drone delivery system
WO2022069957A1 (en) * 2020-09-29 2022-04-07 Rafael Advanced Defense Systems Ltd. Armed aerial platform
IL277712B1 (en) * 2020-09-29 2024-02-01 Rafael Advanced Defense Systems Ltd Armed aerial platform
US11981459B2 (en) 2020-09-29 2024-05-14 Rafael Advanced Defense Systems Ltd. Armed aerial platform
IL277712B2 (en) * 2020-09-29 2024-06-01 Rafael Advanced Defense Systems Ltd Armed aerial platform

Also Published As

Publication number Publication date
IL234372B (en) 2020-02-27

Similar Documents

Publication Publication Date Title
WO2015029007A1 (en) Robotic system and method for complex indoor combat
US20220406151A1 (en) Threat identification device and system with optional active countermeasures
US10099785B1 (en) Drone with ring assembly
US20210063120A1 (en) System and method for active shooter defense
US6903676B1 (en) Integrated radar, optical surveillance, and sighting system
US20140251123A1 (en) Unmanned range-programmable airburst weapon system for automated tracking and prosecution of close-in targets
US20150321758A1 (en) UAV deployment and control system
US20250028335A1 (en) Systems and methods for managing unmanned vehicle interactions with various payloads
US20180362157A1 (en) Modular unmanned aerial system
US20130192451A1 (en) Anti-sniper targeting and detection system
KR102034494B1 (en) Anti-Drones system and operation methode to neutralize abusing drones
JP2019070510A (en) Aerial vehicle imaging and targeting system
EP3625125A1 (en) System and method for interception and countering unmanned aerial vehicles (uavs)
CN110624189B (en) Unmanned aerial vehicle-mounted fire extinguishing bomb device, fire-fighting unmanned aerial vehicle and emission control method
US20230343229A1 (en) Systems and methods for managing unmanned vehicle interactions with various payloads
KR20130009894A (en) Unmanned aeriel vehicle for precision strike of short-range
KR20130009891A (en) Complex unmanned aerial vehicle system for low and high-altitude
CN212332970U (en) Unmanned aerial vehicle machine carries fire extinguishing bomb device, fire control unmanned aerial vehicle
US9716862B1 (en) System and methods for capturing situational awareness
Sinclair Proposed rules to determine the legal use of autonomous and semi-autonomous platforms in domestic US law enforcement
Sözübir UAV Autonomy in Turkey and Around the World: The “Terminator” Debate
Kesavaraj et al. Security framework for net gun-equipped unmanned aerial vehicles
KR102009637B1 (en) Drone for relief activity in disaster and emergency situations
Vas et al. Comprehensive Study of Military and Civil Drone Applications: Assessing Key Areas of Significance and Future Prospects
KR20190097609A (en) Drone for overpowering the criminals

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14839549

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14839549

Country of ref document: EP

Kind code of ref document: A1