
US20060021498A1 - Optical muzzle blast detection and counterfire targeting system and method - Google Patents

Optical muzzle blast detection and counterfire targeting system and method

Info

Publication number
US20060021498A1
Authority
US
United States
Prior art keywords
filter
processor
targeting
target
image
Prior art date
Legal status
Abandoned
Application number
US11/052,921
Inventor
Stanley Moroz
Myron Pauli
William Seisler
Duane Burchick
Mehmet Ertern
Eric Heidhausen
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to US11/052,921
Publication of US20060021498A1
Legal status: Abandoned

Classifications

    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41G - WEAPON SIGHTS; AIMING
    • F41G 3/00 - Aiming or laying means
    • F41G 3/14 - Indirect aiming means
    • F41G 3/147 - Indirect aiming means based on detection of a firing weapon
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 3/00 - Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S 3/78 - Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S 3/782 - Systems for determining direction or deviation from predetermined direction
    • G01S 3/783 - Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived from static detectors or detector systems
    • G01S 3/784 - Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived from static detectors or detector systems using a mosaic of detectors



Abstract

An automated system for remote detection of muzzle blasts produced by rifles, artillery and other weapons, and similar explosive events. The system includes an infrared camera, image processing circuits, targeting computation circuits, displays, user interface devices, weapon aim point measurement devices, confirmation sensors, target designation devices and counterfire weapons. The camera is coupled to the image processing circuits. The image processing circuits are coupled to the targeting location computation circuits. The aim point measurement devices are coupled to the target computation processor. The system includes visual target confirmation sensors which are coupled to the targeting computation circuits.

Description

    BACKGROUND OF THE INVENTION
  • The present invention relates to (1) an optical muzzle blast detection and counterfire targeting system for remotely detecting the location of muzzle blasts produced by rifles, artillery and other weapons and similar explosive events, especially sniper fire; and (2) a system for directing counterfire weapons on to this location.
  • Prior Art
  • Hillis U.S. Pat. No. 5,686,889 relates to an infrared sniper detection enhancement system. According to this Hillis patent, firing of small arms results in a muzzle flash that produces a distinctive signature which is used in automated or machine-aided detection with an IR (infrared) imager. The muzzle flash is intense and abrupt in the 3 to 5 μm band. A sniper detection system operating in the 3 to 5 μm region must deal with the potential problem of false alarms from solar clutter. Hillis reduces the false alarm rate of an IR based muzzle flash or bullet tracking system (during day time) by adding a visible light (standard video) camera. The IR and visible light video are processed using temporal and/or spatial filtering to detect intense, brief signals like those from a muzzle flash. The standard video camera helps detect (and then discount) potential sources of false alarm caused by solar clutter. If a flash is detected in both the IR and the visible spectrum at the same time, then the flash is most probably the result of solar clutter from a moving object. According to Hillis, if a flash is detected only in the IR, then it is most probably a true weapon firing event.
  • In Hirshberg U.S. Pat. No. 3,936,822 a round detecting method and apparatus are disclosed for automatically detecting the firing of weapons, such as small arms, or the like. According to this Hirshberg patent, radiant and acoustic energy produced upon occurrence of the firing of a weapon and emanating from the muzzle thereof are detected at known, substantially fixed, distances therefrom. Directionally sensitive radiant and acoustic energy transducer means directed toward the muzzle to receive the radiation and acoustic pressure waves therefrom may be located adjacent each other for convenience. In any case, the distances from the transducers to the muzzle, and the different propagation velocities of the radiant and acoustic waves are known. The detected radiant (e.g. infrared) and acoustic signals are used to generate pulses, with the infrared initiated pulse being delayed and/or extended so as to at least partially coincide with the acoustic initiated pulse; the extension or delay time being made substantially equal to the difference in transit times of the radiant and acoustic signals in traveling between the weapon muzzle and the transducers. The simultaneous occurrence of the generated pulses is detected to provide an indication of the firing of the weapon. With this arrangement extraneously occurring radiant and acoustic signals detected by the transducers will not function to produce an output from the apparatus unless the sequence is correct and the timing thereof fortuitously matches the above-mentioned differences in signal transit times. If desired, the round detection information may be combined with target miss-distance information for further processing and/or recording.
  • SUMMARY OF THE INVENTION
  • According to the present invention, an infrared camera stares at its field of view and generates a video signal proportional to the intensity of light. The camera is sensitive in the infrared spectral band where the intensity signature of the flash to be detected, minus atmospheric attenuation, is maximized. The video signal is transmitted to an image processor where temporal and spatial filtering via digital signal processing is performed to detect the signature of a flash and determine the flash location within the camera's field of view. The image processing circuits are analog and digital electronic elements. In another aspect and feature of the invention, the image processing circuits are coupled to target location computation circuits and flash location information is transmitted to the targeting location computation circuits. The targeting computation circuit is digital electronic circuitry with connections to the other devices in the system. The field of view of the camera is correlated to the line of sight of the confirmation sensor by using aim point measurement devices which are coupled to the target computation processor. The displays are video displays and show camera derived imagery superimposed with detection and aiming symbology and system status reports. The user interface devices are keypads and audible or vibrational alarms which control the operation of the system and alert the user to flash detections which are equated to sniper firing, for example. In still another aspect of the invention, the weapon aim point measurement devices include inertial measurement units, gyroscopes, angular rate sensors, magnetometer-inclinometers, or gimbaled shaft encoders. Visual target confirmation sensors are binoculars or rifle scopes with associated aim point measurement devices. Counterfire weapons contemplate rifles, machine guns, mortars, artillery, missiles, bombs, and rockets.
  • OBJECTS OF THE INVENTION
  • The basic objective of the present invention is to provide an automated and improved muzzle blast detector system and method which uses multi-mode filtering to eliminate and/or minimize false alarms.
  • Another object of the invention is to provide a muzzle blast detector which accurately locates direction and range to muzzle blast source.
  • Another object of the invention is to provide a sniper detection method and apparatus which uses temporal, spatial and spectral filtering to discriminate between actual muzzle blasts and non-muzzle-blast infrared-generating events.
  • DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a general block diagram of a muzzle blast detection system incorporating the invention,
  • FIG. 2 is a further block diagram of the detection system of the invention,
  • FIGS. 3A and 3B are graphs of simulated event signatures and corresponding matched filter for 60 FPS video,
  • FIG. 4 is a diagrammatic representation of the event filter,
  • FIG. 5 illustrates a sample detection filter,
  • FIG. 6 is a circuit diagram of a detector with an adaptive threshold level,
  • FIG. 7 is a depiction of a low pass spatial filter response h(k,l),
  • FIG. 8 is a circuit diagram showing the adaptive detection system with low-pass filtered σ and high-pass filtered e²,
  • FIG. 9 illustrates the decision filter,
  • FIG. 10 illustrates the overall detection and location algorithm,
  • FIG. 11 illustrates the video acquisition subsystem,
  • FIG. 12 is a schematic diagram of an embodiment of the instant invention, and
  • FIG. 13 is a flow chart of an embodiment of the instant invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • One aspect of the invention comprises an infrared camera 10 connected to image processing circuits 11 and a video display 14 which may include an annunciator 14A to provide an immediate audible or tactile indication of the muzzle blast event. The camera 10 stares at a field of view, and the video signal is fed to the image processor 11. The pedestal and gain controls of the camera are controlled by the image processor.
  • Detection
  • The image processor outputs the live infrared video to the display. Concurrently, algorithms to detect the presence of a muzzle flash in the image are executed on the image processor. When a muzzle flash is detected, the image processor 11 overlays a symbol on the display around the pixel location where the flash was detected. The algorithms that detect the muzzle flash operate by processing several frames of video data through a temporal and spatial digital filter. The activity level at each pixel location is adaptively tracked and the effect of background clutter is reduced by varying the detection threshold at each pixel according to the past activity around that pixel location. The detection algorithms are described in more detail in the section entitled Detection of Short Duration Transient Events Against a Cluttered Background.
  • Automatic Pedestal and Gain
  • An algorithm is used for automatic adjustment of the pedestal and gain values of the imaging system to achieve high dynamic range. Additional user control over these settings allows certain regions of the image to be dark or saturated. This algorithm is described in the section entitled Automatic Pedestal and Gain Adjustment Algorithm.
  • Targeting
  • The coordinates of the detected muzzle flash are fed to targeting circuitry 12 to guide a visual target confirmation sensor 13, such as binoculars or a telescope, and a counterfire weapon, such as a rifle, onto the target.
  • Weapon Aim Point to Camera Coordinate Calibration
  • Given weapon aim point measurement readings 15, the corresponding image coordinates in the camera field of view are derived. The aim point measurement devices generate an azimuth and elevation reading. The calibration procedure includes aiming the weapon at three known calibration points. These points are marked by the user on the display 14 using a cursor. The image coordinates and the aim point measurements for these points are used to generate a mathematical transformation so that, given any aim point measurement, its corresponding image location can be calculated. Symbology denoting the current weapon aim point is displayed on screen 14, and the difference in target screen locations is used to guide the return fire shooter onto the target.
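  • By way of illustration only, the following sketch shows one way such a mapping could be fit from three calibration pairs, assuming a simple affine transformation from (azimuth, elevation) readings to image (x, y) coordinates; the function names and sample values are illustrative and not taken from the patent.

```python
import numpy as np

def fit_aim_to_image_transform(aim_points, image_points):
    """Fit an affine map (az, el) -> (x, y) from three calibration pairs.

    aim_points   : 3x2 array of (azimuth, elevation) readings
    image_points : 3x2 array of corresponding (x, y) pixel locations
    Returns a 2x3 matrix A such that [x, y] = A @ [az, el, 1].
    """
    aim = np.asarray(aim_points, dtype=float)
    img = np.asarray(image_points, dtype=float)
    H = np.hstack([aim, np.ones((3, 1))])   # homogeneous rows (az, el, 1)
    A_T = np.linalg.solve(H, img)           # exact solution with three points
    return A_T.T

def aim_to_image(A, azimuth, elevation):
    """Map an aim point reading to its predicted image location."""
    return A @ np.array([azimuth, elevation, 1.0])

# Example: three calibration points marked by the user on the display.
A = fit_aim_to_image_transform(
    aim_points=[(10.0, 2.0), (25.0, 2.5), (18.0, 8.0)],           # degrees
    image_points=[(40.0, 120.0), (200.0, 115.0), (128.0, 30.0)])  # pixels
print(aim_to_image(A, 18.0, 8.0))   # ~ (128, 30)
```

  • With only three points the affine fit is exact; if more calibration points were used, a least-squares fit would take their place.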
  • Visual Confirmation
  • An aim point measuring device 15 is aligned with the confirmation sensor. This device provides the azimuth and elevation (line of sight) of the sensor. The aim point measurement device 15 is correlated to the camera optical axis and orientation using a multipoint calibration procedure, thereby relating azimuth and elevation readings to camera pixel locations. The targeting processor calculates the difference between muzzle flash detection location and the instantaneous pointing location and displays guidance symbology to direct the confirmation sensor to the target.
  • Confirmation Sensor Aim Point to Camera Coordinate Calibration
  • The line of sight of the confirmation sensor is calibrated to camera coordinates using the three-point calibration algorithm used for calibrating the weapon aim points to camera coordinates. Either the same or different calibration points can be used for weapon to camera and confirmation sensor to camera calibration. Symbology denoting the current confirmation sensor line of sight is displayed on screen, and the difference in target screen locations is used to guide the observer onto the target.
  • Calibration Using Gimbaled Telescope with Encoders
  • A telescope on a gimbal with shaft encoders, mounted on the camera, is used to determine the location of the calibration points. The user points the telescope at a calibration point. The telescope gimbal is aligned with the camera, and the image coordinates of the telescope line of sight are known. By selecting three calibration points and aiming the weapon or confirmation sensor at these points, the transformation between the aim point measurement devices and camera coordinates can be calculated.
  • User Interface
  • The user interface includes a keyboard KB and cursor control mechanism M to control the operation of the system, a video display screen 14, and a detection alarm 14A. The user is alerted to a detection through an audible alarm, a silent tactical vibration, or other type of silent alarm device which is triggered by the targeting processor. The user is then guided through symbology overlaid on the display to move the confirmation sensor or weapon until the line of sight is aligned with the detected flash.
  • Ring Display
  • A peripheral vision aiming device is also used to guide a confirmation sensor or weapon to the target. The aiming device consists of a ring of individual lights controlled by the targeting processor. The ring may be placed on the front of a rifle scope, in line with the rifle's hard sights or other locations in the peripheral view of the operator. When a detection is made, the targeting processor activates one or more lights to indicate the direction and distance the confirmation sensor/weapon must be moved to achieve alignment with the flash. The number of activated lights increases as the degree of alignment increases. All lights are activated when alignment is achieved.
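  • The light-selection logic is not specified in detail above; the following is a minimal, hypothetical sketch of how a targeting processor might choose which ring lights to activate from the angular error between the current line of sight and the detected flash. The light count and alignment tolerance are assumed values.

```python
import math

def ring_display_state(d_az, d_el, num_lights=16, full_alignment_deg=0.25):
    """Return the set of lights to activate in a peripheral-vision ring.

    d_az, d_el : aim point error (degrees) between the weapon/sensor line of
                 sight and the detected flash location.
    The light nearest the direction of required motion is lit; more lights
    are added as the error shrinks, and all lights turn on at alignment.
    """
    error = math.hypot(d_az, d_el)
    if error <= full_alignment_deg:
        return set(range(num_lights))            # aligned: all lights on
    bearing = math.atan2(d_el, d_az)             # direction the sight must move
    center = round(bearing / (2 * math.pi) * num_lights) % num_lights
    # Illustrative mapping: closer alignment -> wider lit arc.
    lit = max(1, int(num_lights * full_alignment_deg / error))
    half = lit // 2
    return {(center + k) % num_lights for k in range(-half, half + 1)}

print(sorted(ring_display_state(d_az=2.0, d_el=0.5)))
```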
  • The following section describes the adaptive algorithm for detection of rapid transient events where a noisy background is present. The theoretical background and a sample implementation are given.
  • It is desired to detect and locate transient events against a noisy background in real time. The detection and location of such an event requires a priori knowledge about the spectral, spatial and temporal signatures of typical events. It is also desirable to have information about the background conditions in which the detection system is expected to operate. This information consists of the spectral, spatial and temporal characteristics of the background.
  • If the statistics of the four-dimensional signal which is specified as the signature of a typical event (spectral, spatial and temporal axes) are known, and if the same statistics for various backgrounds are measured, it becomes a simple matter of applying standard stochastic analysis methods (matched filtering) in order to solve the problem. However, this information is not readily available and there are several other problems which make this approach unfeasible.
  • The first difficulty is that the instrumentation to simultaneously extract all components of signals that have spectral, spatial and temporal components is not readily available. Equipment is available to acquire simultaneously either the spectral and temporal (spectrometry), or the spatial and temporal (video) components from a scene. It is also possible, through the use of several imagers to acquire multispectral image sets, essentially sampling the scene at several spectral bands.
  • Operating at a suitably chosen fixed spectral band, the intensity variation as a function of time was the easiest component of the event signature to detect.
  • Detection Methods Which Deal Only with Spatially and Temporally Varying Signal Components at a Fixed Spectral Band
  • The concept of matched filtering can be used if the statistics of the events to be detected and backgrounds are available. However, many factors, such as humidity, ambient temperature, range, sun angle, etc. influence these statistics. It is not practicable to gather data for all combinations of rapid transient events and background scenes. Thus, for the detection algorithm to reliably work against different background environments, it has to adapt to these environments.
  • The Detection System
  • The video signal from the camera 10, under control of controller 18, is digitized 16 and supplied to an image processing system 17 and continuously stored in memory M at frame rates (FIG. 2). In this invention, the image processor 17 is adapted to operate on the latest and several of the most recent frames captured. Although in this case the processor operates on progressively scanned 256*256 pixel frames at a rate of 60 frames per second, the algorithm can be used at other resolutions and frame rates.
  • The camera 10 being used is a CCD imager, which integrates the light falling on each pixel until that pixel's charge is read out. The read out time is typically much less than the typical transient event duration. This means that the imager effectively has a 100% duty cycle, with no dead times between frames for each pixel. The camera pedestal and gain settings are set to fully utilize the dynamic range of the image processing system. The algorithms for this are described later herein.
  • The first stage of the detection algorithm includes a temporal Event Filter 20 which is tuned to detect rapid transient signatures, followed by a spatio-temporal Detection Filter designed to reject background clutter. The output of this first stage is a list of candidate event times and locations. These coordinates form the input to a logical processing stage which then estimates the probability of the candidate event actually being due to a single uncorrelated rapid transient.
  • The Event Filter 20
  • The event filter 20 is a finite impulse response matched filter which operates on each pixel in the sequence. The impulse response of the filter is derived by estimating the signature of the typical transient event.
  • The events to be detected typically have much shorter duration than the frame period. Therefore, most of the time the rapid transients occur wholly inside one frame. However, it is possible to have a single event overlapping two adjacent frames. The time of occurrence of a transient event and the frame times are uncorrelated processes, and the overlap can be modeled by considering the event time to be uniformly distributed over the frame interval.
  • A simple model of a rapid transient signature consists of a pair of exponentials, one on the rising edge and another on the falling edge of the event. FIG. 3 shows the case where a rising time constant τ(r) of 0.125 ms and a falling time constant τ(f) of 0.5 ms are chosen. This waveform is convolved with the rectangular window of the frame, and integrating the result over successive frame periods yields the optimal matched filter coefficients.
  • The event filter then is a tapped delay line finite impulse response filter and its output, the error signal, can be written as the simple convolution:
    e(i,j,n) = Σk h(k)·I(i,j,n−k)   (1)
    where I(i,j,n) denotes the intensity of pixel (i,j) in frame n.
  • Since h(k), the impulse response of the Event Filter is indexed only to the frame number, this filter is purely temporal and has no spatial effects.
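  • As an illustrative sketch (not the patent's implementation), the matched-filter taps h(k) can be approximated by sampling a double-exponential flash model with τ(r) = 0.125 ms and τ(f) = 0.5 ms, integrating it over successive 60 FPS frame windows, and applying the taps as a purely temporal FIR filter over the most recent frames. The zero-mean normalization and the switch point between the rising and falling exponentials are added assumptions.

```python
import numpy as np

FRAME_PERIOD_S = 1.0 / 60.0         # 60 FPS video
TAU_R, TAU_F = 0.125e-3, 0.5e-3     # rise/fall time constants (seconds)

def event_filter_taps(n_taps=3, dt=1e-6):
    """Integrate a double-exponential flash model over successive frame
    periods to obtain illustrative matched-filter coefficients h(k)."""
    spf = int(round(FRAME_PERIOD_S / dt))          # samples per frame window
    t = np.arange(n_taps * spf) * dt
    flash = np.where(t < 5 * TAU_R,
                     1.0 - np.exp(-t / TAU_R),           # rising edge
                     np.exp(-(t - 5 * TAU_R) / TAU_F))   # falling edge
    per_frame = flash.reshape(n_taps, spf).sum(axis=1) * dt
    h = per_frame - per_frame.mean()   # zero-mean to reject constant background (assumption)
    return h / np.linalg.norm(h)

def event_filter(frames, h):
    """Purely temporal FIR filter: e(i,j,n) = sum_k h(k) * I(i,j,n-k).

    frames : the most recent len(h) frames, newest first, shape (len(h), rows, cols).
    """
    return np.tensordot(h, frames, axes=1)

h = event_filter_taps()
frames = np.random.rand(len(h), 256, 256).astype(np.float32)  # stand-in imagery
e = event_filter(frames, h)          # event-filter output for the latest frame
print(h, e.shape)
```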
  • The Detection Filter
  • The simplest detection scheme for a transient event consists of an event filter 20 followed by a threshold device (comparator 21, FIG. 5). This system works reasonably well in cases where the background scenery is not noisy and where false alarm rejection requirements are not demanding.
  • The simple detector approach is also useful in serving as a baseline to compare the performance of more complicated algorithms. A figure of merit can be devised for other algorithms by comparing their detection performance to the simple detector.
  • In order to reduce the false alarm rate additional processing is performed. The approach taken here is to use adaptive filtering methods to vary the decision threshold spatially, so that image areas of high activity have higher and areas of less activity have lower threshold levels. Thus, the threshold level becomes a varying surface across the image.
  • A good estimate of the activity level for each pixel in the image is the mean square of the signal e(i,j,n), the event filter output. Since this signal is already generated, its calculation imposes no additional computational burden. The calculation of the mean square however still needs to be performed.
  • Instead of the actual mean square computation to estimate the energy of the intensity signal at each pixel, a recursive estimate is used. Thus we define:
    σ(i,j,n) = μ·σ(i,j,n−1) + (1−μ)·e²(i,j,n)   (2)
    where μ, the learning rate, is a constant between 0 and 1. A typical value for μ is 0.95. The best choice for the learning rate will be determined depending on the stationarity of the background scene (in both the statistical and the physical senses).
  • The recursive formulation for σ(i,j,n) makes it easy to implement. The infinite impulse response filter 32 that implements this has a low pass transfer function, and thus tends to “average out” the activity at each pixel location over its recent past.
  • To simplify implementation, it is possible to remove the square-root operation 33 on the threshold surface, and compare the estimated variance of the signal e to the square of its instantaneous value. Thus, the output of the comparator essentially becomes a measure of the difference of the instantaneous energy in the signal to the estimated average energy at that pixel.
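  • A minimal sketch of equation (2) and of the squared comparison follows; the array shapes and the scale factor applied to the threshold are illustrative assumptions.

```python
import numpy as np

MU = 0.95   # learning rate for the per-pixel activity (variance) estimate

def update_activity(sigma, e):
    """Recursive per-pixel energy estimate, eq. (2):
    sigma(i,j,n) = mu*sigma(i,j,n-1) + (1-mu)*e(i,j,n)^2."""
    return MU * sigma + (1.0 - MU) * e**2

def raw_detector(e, sigma, gain=8.0):
    """Compare instantaneous energy to the locally estimated average energy.
    'gain' is an illustrative scale on the adaptive threshold, not from the patent."""
    return e**2 - gain * sigma

sigma = np.zeros((256, 256), dtype=np.float32)
e = np.random.randn(256, 256).astype(np.float32)   # event-filter output (stand-in)
det = raw_detector(e, sigma)
sigma = update_activity(sigma, e)
```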
  • Some of the physical phenomena that cause false alarms are edge effects, thermal effects such as convection, camera vibration, and moving objects. A significant portion of these can be eliminated by performing a spatial low pass operation on the variance estimate signal σ. This is to spread the threshold raising effect of high energy pixels to their neighbors. However, a pure low pass operation would also lower the σ values at the peaks of the curves. To offset this, a “rubber-sheeting” low pass filter is used. This is mathematically analogous to laying a sheet of elastic material over the threshold surface. The threshold surface thus generated is calculated by:
    θ(i,j,n) = max[σ(i,j,n), σLP(i,j,n)]   (3)
    where σLP is the low pass filtered estimated variance, calculated by the convolution:
    σLP(i,j,n) = Σk Σl h(k,l)·σ(i−k, j−l, n)   (4)
  • The low pass spatial filter 45 coefficients h(k,l) are chosen depending on the sharpness desired. A set of values which gives good results, generated using a normalized sinc function, is plotted in FIG. 7.
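  • The following sketch illustrates the “rubber-sheeting” threshold surface of equations (3) and (4): the activity estimate σ is low pass filtered with a normalized-sinc kernel h(k,l) and the pixel-wise maximum of the two surfaces is taken. The kernel size and cutoff are illustrative choices.

```python
import numpy as np

def sinc_kernel(size=9, cutoff=0.2):
    """Separable normalized-sinc low pass kernel h(k,l) (illustrative parameters)."""
    k = np.arange(size) - size // 2
    h1 = np.sinc(cutoff * k)
    h2d = np.outer(h1, h1)
    return h2d / h2d.sum()

def rubber_sheet_threshold(sigma, h):
    """theta(i,j,n) = max[sigma(i,j,n), sigma_LP(i,j,n)], eqs. (3)-(4)."""
    pad = h.shape[0] // 2
    padded = np.pad(sigma, pad, mode="edge")
    sigma_lp = np.zeros_like(sigma)
    # Direct 2-D convolution (slow but dependency-free; kernel is symmetric).
    for k in range(h.shape[0]):
        for l in range(h.shape[1]):
            sigma_lp += h[k, l] * padded[k:k + sigma.shape[0], l:l + sigma.shape[1]]
    return np.maximum(sigma, sigma_lp)

theta = rubber_sheet_threshold(np.random.rand(256, 256), sinc_kernel())
```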
  • A possible enhancement to the detection algorithm is the inclusion of a spatial high pass filter 42 to reject image events which occupy large areas. Depending on the application (i.e. whether rapid transient events which occupy relatively large areas are desired to be detected or not), such a filter may reduce the system's susceptibility to false alarms due to events which are not of interest. The block diagram of the detector incorporating these modifications is shown in FIG. 8.
  • It should also be noted that in the system shown the comparator 43 output is no longer a binary decision but a difference signal. While it is possible to use the comparator's binary output as a final decision stage, it is convenient to further process the output of the detection filter.
  • The Decision Filter (FIG. 9)
  • For each pixel, a value for the detector signal det(i,j,n) is generated at the frame rate. Thus, the data rate of the detector output is comparable to the raw image data rate. The detector signal is a measure of the likelihood that an event has occurred at the corresponding pixel. This information has to be reduced to a simple indication of the presence and location of the event. The decision filter performs the required operation.
  • The detector output can be filtered in several ways. The obvious and simple method is to compare it with a set threshold value. Another way is to first localize the possible location of the one most likely event in each frame, and then to decide whether it actually is present or not. This approach is simple to implement and results in significant reduction in the amount of data to be processed. Its limitation is that it does not allow the detection of multiple transient events occurring within a single frame.
  • The location of a single candidate transient event per frame is done in locator 50 by finding the pixel location with the maximum detector output. If this signal exceeds a detector threshold T, then a “Transient Detected In Frame” indication is output, otherwise the output indicates “No Transient Detected In Frame”.
  • The decision filter 51 operations are as follows:
    d(n) = max over (i,j) of det(i,j,n)  (5)
    T(n) = α T(n−1) + (1 − α) d(n)  (6)
  • This operation, similar to the calculation of σ, is a recursive implementation of an adaptive threshold. The learning rate α (again chosen between 0 and 1 and typically about 0.9) determines the speed with which the system adapts to changes in the background levels.
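A minimal sketch of the locator and decision filter, following the operations of equations (5) and (6) above; the function and variable names are illustrative only.

```python
import numpy as np

ALPHA = 0.9  # decision-filter learning rate

def decision_filter_step(det, T_prev, alpha=ALPHA):
    """Locate the single most likely transient in a frame and decide (illustrative sketch).

    det    : detector output for the frame (2-D array)
    T_prev : adaptive detection threshold carried over from the previous frame
    Returns (transient_detected, (i, j), updated threshold).
    """
    i, j = np.unravel_index(np.argmax(det), det.shape)  # locator 50: pixel with maximum detector output
    d = det[i, j]                                       # equation (5)
    detected = d > T_prev                               # "Transient Detected In Frame" if threshold exceeded
    T_new = alpha * T_prev + (1.0 - alpha) * d          # equation (6): adaptive threshold update
    return detected, (i, j), T_new
```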
  • The decision filter block diagram is shown in FIG. 9.
  • The overall block diagram of the adaptive detection algorithm is shown in FIG. 10.
  • Using the approach presented here, it is possible to determine the presence or absence of short duration transient events. The invention is especially useful when the background scene is cluttered and contains elements whose statistical properties are similar to those of the events being searched for. This is achieved by utilizing as much as possible of the available knowledge about the spectral, spatial, and temporal characteristics of the events to be detected.
  • Automatic Pedestal and Gain Adjustment Algorithm
  • The detection of a rapid transient event in a noisy background is significantly degraded if the full dynamic range of the imaging system is not used. Presented here is a simple algorithm for automatic adjustment of the pedestal and gain values of the imaging system to achieve high dynamic range. In some situations it is desirable to have additional control over exposure, allowing certain regions of the image to be dark or saturated. A version of the algorithm with exposure control is given below.
  • Automatic Pedestal and Gain Adjustment Algorithms
  • The pedestal and gain adjustment algorithm presented here assumes an 8 bit imaging system with a roughly linear response; however, the algorithm will also work well with nonlinear imagers. The image acquisition subsystem block diagram is shown in FIG. 11.
  • Two versions of the algorithm are presented here. The simpler first version automatically sets the pedestal and gain values to equalize the image so that the pixel values spread over the full range of the imaging system. The coefficients of the system have to be adjusted so that the response is not oscillatory (i.e., their values must be chosen so that the closed loop transfer function has magnitude less than unity). In the slightly more complex second version, the user is given an additional control to allow under- or over-exposure as desired.
  • The following procedure summarizes the detection system algorithm without exposure control:
  • Grab one frame of data. Within a region of interest (typically the whole picture minus a border around the edges), count the number of saturated pixels (n(sat)) and the number at full darkness (n(zer)). Measure the value of the darkest pixel (botgap) and the distance between the brightest pixel and 255 (topgap). Change the pedestal and gain settings to spread the histogram of the image. Repeat for the next frame.
  • The dynamic pedestal and gain equations are:
    Δp = p(1) n(zer) − p(2) botgap
    Δg = −g(1) n(sat) + g(2) topgap − k Δp
    pedestal = pedestal + Δp
    gain = gain + Δg
  • Optimal values for the tracking parameters p(1), p(2), g(1), g(2) and k depend on the camera response. However, since feedback is used, the control loop is effectively "linearized," and suitable values can be derived empirically depending on the temporal response desired.
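A sketch of one iteration of the loop without exposure control, combining the frame measurements and the dynamic equations above; the border handling, parameter values, and plain arithmetic on pedestal and gain are assumptions made purely for illustration (a real system would write the new values to the camera hardware).

```python
import numpy as np

def measure_frame(frame, border=8):
    """Histogram measurements for an 8-bit frame, excluding a border around the edges."""
    roi = frame[border:-border, border:-border]
    n_sat = int(np.count_nonzero(roi >= 255))  # saturated pixels
    n_zer = int(np.count_nonzero(roi <= 0))    # fully dark pixels
    botgap = int(roi.min())                    # value of the darkest pixel
    topgap = int(255 - roi.max())              # distance of the brightest pixel from 255
    return n_sat, n_zer, botgap, topgap

def update_pedestal_gain(pedestal, gain, n_sat, n_zer, botgap, topgap,
                         p1=0.05, p2=0.05, g1=0.01, g2=0.01, k=0.5):
    """One feedback step of the dynamic pedestal and gain equations (illustrative gains)."""
    dp = p1 * n_zer - p2 * botgap
    dg = -g1 * n_sat + g2 * topgap - k * dp
    return pedestal + dp, gain + dg
```

The update would be called once per frame, with the returned pedestal and gain applied before the next frame is grabbed.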
  • The following describes the detection algorithm with exposure control.
  • This version is slightly more complex in that it adds an exposure control input to the original algorithm. The variable exposure determines the amount of under- or overexposure desired. This operates in a manner analogous to the exposure control found in automatic cameras. When exposure is set at a positive value, the pedestal and gain dynamics are set to allow a number of pixels to stay saturated (overexposure). Similarly, a negative exposure control allows a number of pixels to stay at zero (underexposure). The dynamic equations are:
    n(up) = n(zer) + min(exposure, 0)
    n(down) = n(sat) − max(exposure, 0)
    Δp = p(1) n(up) − p(2) botgap
    Δg = −g(1) n(down) + g(2) topgap − k Δp
    pedestal = pedestal + Δp
    gain = gain + Δg
  • Thus, with a positive exposure setting, the only effect is at the top end of the digitization range: n(up) is not altered (it stays equal to n(zer)), but n(down) becomes less than n(sat). This means that a number of pixels (equal to the magnitude of exposure) are allowed to stay saturated. Conversely, with a negative exposure, n(down) is unaltered but n(up) is allowed to go to a negative number, signifying that a number of pixels are allowed to stay dark.
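The exposure-controlled variant changes only the pixel counts fed into the same update; a short sketch follows, reusing the hypothetical parameter names from the previous example.

```python
def update_with_exposure(pedestal, gain, n_sat, n_zer, botgap, topgap, exposure,
                         p1=0.05, p2=0.05, g1=0.01, g2=0.01, k=0.5):
    """Pedestal/gain step allowing deliberate under- or over-exposure (illustrative)."""
    n_up = n_zer + min(exposure, 0)     # negative exposure: allow pixels to stay dark
    n_down = n_sat - max(exposure, 0)   # positive exposure: allow pixels to stay saturated
    dp = p1 * n_up - p2 * botgap
    dg = -g1 * n_down + g2 * topgap - k * dp
    return pedestal + dp, gain + dg
```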
  • The above description of the VIPER suite incorporates by reference herein U.S. Pat. No. 6,496,593 to Krone, Jr. et al.
  • Decreased Response Time for Confirmation
  • In FIG. 12, a standard two axis pan and tilt gimbal 1210 is operably connected to the Targeting Processor 1215. An alignment, including registration and calibration, is performed between the gimbal position and the detection camera pixel locations. The alignment is accomplished using reference sources located at a distance from the sensors. The gimbaled camera sensors are calibrated so that the differing fields of view are matched to each other. After this calibration, the gimbal can rapidly point at a given location corresponding to a triggering event. A standard joystick 1220 is interfaced to the Targeting Processor 1215 to enable the user to move the gimbal 1210 independently to locate areas of interest.
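One way to realize the pixel-to-gimbal registration and calibration described above is a least-squares affine fit from detection-camera pixel coordinates to pan/tilt angles measured against the distant reference sources. The affine model and the function names below are assumptions made for illustration, not the calibration actually disclosed.

```python
import numpy as np

def fit_pixel_to_gimbal(pixels, angles):
    """Fit an affine map from detection-camera pixels (u, v) to gimbal (pan, tilt) angles.

    pixels : (N, 2) array of pixel locations of reference sources
    angles : (N, 2) array of gimbal (pan, tilt) angles, in degrees, that center each source
    """
    P = np.asarray(pixels, dtype=float)
    A = np.hstack([P, np.ones((len(P), 1))])   # rows of [u, v, 1]
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(angles, dtype=float), rcond=None)
    return coeffs                              # (3, 2) affine coefficients

def pixel_to_gimbal(coeffs, u, v):
    """Convert a triggering-event pixel to a gimbal slew command."""
    pan, tilt = np.array([u, v, 1.0]) @ coeffs
    return pan, tilt
```

With the fit stored, a triggering event at pixel (u, v) maps directly to a pan/tilt command for the gimbal 1210.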
  • Day/Night Functionality
  • A Day/Night Color Vision System ("DNCVS") 1225 is placed on the high speed gimbal 1210. This subsystem 1225 serves as an adjunct system to the instant VIPER suite. The DNCVS 1225 provides the user with a day/night "visual" validation of the triggering event. The DNCVS 1225 comprises standard multi-spectral cameras that are sensitive to both daytime and nighttime environments. Such multi-spectral cameras include, for example, standard long wave infrared, standard short wave infrared ("SWIR"), and standard visible band (e.g., video) cameras. The use of multiple cameras permits viewing of camouflage, cold targets, hot targets, and reflective (white/black) targets in several spectral bands. Use of multiple bands optimizes target contrast and provides better penetration through obscurants such as smoke and fog. Selection of the proper bands for the situation enables the operator to observe the scene under a wide variety of conditions. For day operations, the visible video cameras provide the best performance. For twilight operation, the SWIR cameras provide superior performance. For starless nights, the long wave IR cameras offer the best performance. Combining the sensors for transition periods, e.g., day to night, can give the best performance as the environmental conditions change.
  • Variable camera fields-of-regard are embodied by, for instance, standard zoom optics or standard controlled flip lenses. The operator may select either automated or manual zoom controls allowing optimization of the fields-of-regard.
  • The operator's user interface 1230 permits selection of specific cameras of the DNCVS 1225 that can be displayed. This display 1230 can be selected to be either monochromatic or color. Various false color display schemes are available. Color fusion schemes, such as described in U.S. patent application Ser. No. 09/840,235 to Penny G. Warren, entitled "Apparatus and Method for Color Image Fusion," filed Apr. 24, 2001, and incorporated herein by reference, are selectable for combination of multiple cameras into a single display. Fusion of previously stored images with real-time sensor imagery is also available. Each camera can be optimized for maximum scene contrast by user-selected options. Both analog and digital sensor data are available for processing or storage. For highlighting features at long ranges, super-resolution enhancements can be employed. Frame summation techniques can be employed for highlighting dim targets. Laser or other illuminators are used to highlight dim objects or designate an area of interest for external observers. Additionally, it is possible to convert individual wide-band cameras into multi-color operation by use of laser (or other narrow-band) illuminators in an on-off contrast fashion. These capabilities are controlled by the operator through hardware and/or software interfaces.
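Frame summation for dim targets can be as simple as a running average over the most recent frames; the class below is a hypothetical illustration of that technique, with the window length chosen arbitrarily.

```python
import collections
import numpy as np

class FrameSummer:
    """Running average of the last N frames to raise the visibility of dim targets."""

    def __init__(self, window=8):
        self.frames = collections.deque(maxlen=window)

    def push(self, frame):
        """Add the newest frame and return the averaged frame for display."""
        self.frames.append(frame.astype(np.float32))
        return sum(self.frames) / len(self.frames)
```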
  • The IR detection camera is mounted, for instance, on a plate along with the gimbal 1210, while the rest of the sensors are mounted on the gimbal 1210 itself. The camera 1260 is a detection camera, such as a midwave IR detection camera. The other cameras are attached to the gimbal 1210, for example, because their fields of regard are much smaller than that of the detection camera 1260; these other cameras are included in the DNCVS 1225.
  • Video Storage
  • Camera imagery is also passed into a recording device 1235, e.g., a Video Storage Device. The storage device 1235 enables archiving and analysis of data and events at a later time.
  • Enhanced User Interface with External Display
  • An external portable display 1230 (e.g. a monocular with a shuttered eyecup) is linked to the Targeting Processor 1215. This enables multiple people in nearby locations to view the same real time data that is presented on the system display.
  • Geolocation/Mapping
  • A standard Laser Range Finder 1240 is fixed to the gimbal 1210, permitting ranging to a designated object of interest. A standard magnetometer/digital compass 1245 and standard GPS 1250 are interfaced to the Targeting Processor 1215, providing a positional reference for the detection system. The combination of the information from the magnetometer 1245, gimbal 1210, laser range finder 1240 and GPS 1250 provides the capability of geolocating the place where the event occurred. The specific place is then referenced and displayed on a stored map in the system and provided to the system operator. Standard commercial software is available for this function, such as the Weapons Systems Mapping software produced by DCS Corporation. This information can be passed to external entities in order to enable them to react to the event.
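A simplified, flat-earth sketch of that geolocation step, combining own GPS position, compass azimuth, gimbal elevation, and laser range; the local east/north approximation is an assumption made for brevity, and fielded software such as the mapping package named above would use proper geodetic computations.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius; adequate for a rough, short-range estimate

def geolocate(lat_deg, lon_deg, azimuth_deg, elevation_deg, range_m):
    """Estimate target latitude/longitude from own position, bearing, and laser range."""
    ground_range = range_m * math.cos(math.radians(elevation_deg))  # project slant range onto the ground
    east = ground_range * math.sin(math.radians(azimuth_deg))
    north = ground_range * math.cos(math.radians(azimuth_deg))
    dlat = math.degrees(north / EARTH_RADIUS_M)
    dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon
```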
  • Tailored Spectral Bands for Different Missions
  • Several narrow-band cold filter settings have been developed which optimize the performance of the present VIPER detection system. Cold filters are filters that are cooled to avoid the noise generated by the filter's own heat, which would otherwise drive down contrast. These spectral band settings are chosen based upon the characteristics of the gunfire or ordnance to be observed as well as the properties of the background clutter and the intervening atmosphere. For example, for urban operations, narrowing the midwave IR camera passband reduces false alarms at the cost of shorter detection ranges. By choosing the spectral band, the instant VIPER system is optimized for daytime or nighttime operation; long-range or short-range detection; or urban versus rural settings. Proper choice of narrow spectral bands enhances system operation when the system is on a moving platform. The optimization can be fixed for a given situation. A variable filter setting is employed if a standard tunable filter is available to adjust to the specific situation. Alternatively, multiple cameras with individually optimized filters are used instead.
  • Anamorphic Lens Improvement
  • A standard very wide angle anamorphic lens 1255 has been developed and implemented that provides a wide angle field of view in one dimension. This lens optimizes the field-of-regard of the detection camera 1260, eliminating the need for multiple cameras to provide the wide angular coverage.
  • Increasing Reactive Coordination through Optical Illumination/Designation
  • Optical illuminators/designators 1265 are attached to the gimbal 1210. They can be aligned in such a fashion to enable the user to illuminate/designate the event of interest. This cues external entities to the existence/relative location of a possible target.
  • Perimeter Defense Operations
  • The high speed gimbal 1210 containing, or communicating with, the DNCVS 1225 can also be used as a Perimeter Defense surveillance subsystem. This allows the operator to do a sweep-scan or a step-stare over selected angular regions. The timing and selection of the coverage is operator-controlled. The Perimeter Defense surveillance subsystem enhances the situational awareness of the operator by highlighting events such as intrusions. Motion and scene change detection processing can be added to the Perimeter Defense surveillance subsystem to highlight features. The operator can examine the user display and decide to dwell on objects of interest within the Perimeter Defense coverage. Optionally, a triggering event, such as a muzzle flash, overrides the Perimeter Defense surveillance so that the event can be identified and/or targeted.
  • FIG. 13 shows an illustrative method according to the instant invention.
  • In Step S1310, a physical flash has occurred in the IR Detection Camera field of regard.
  • In Step S1320, the detection camera images the flash through a sequence of frames. The Image Processor then filters the imagery and determines whether a shot has occurred. If so, it passes a message to the Targeting Processor. In this instance, this is accomplished through an Ethernet interface between the Image Processor and the Targeting Processor.
  • In Step S1330, after a detected shot, the Image Processor can alert the users with a vibration, an audible alarm, and/or a visible cue to alert friendly forces in the area.
  • In Step S1340, the Targeting Processor display then alerts the user and updates the display to indicate such. For instance, this may be accomplished through adding the detected event to a list of already detected events and/or drawing an icon on a display representing the detection camera field of regard.
  • In Step S1350, in the case that multiple events occur in a short period of time, to prevent confusion of the operator, a Gimbal Slew Override allows the user to deal with individual events in a serial fashion.
  • In Step S1360, if the Gimbal Slew Override is on, the user is busy attending to a previous event, so the gimbal is not deviated from its current position. Meanwhile, the user has available a selection of Commands (Step S1380) to help react to the previous event.
  • In Step S1370, if the Gimbal Slew Override is off, the Targeting Processor then drives the gimbal to the position corresponding to the alerted event. Imagery of the area of interest is displayed on the Target Processor display.
  • In Step S1380, the Available User Commands are a set of controls that help the user to adapt to various conditions. For instance, a user may select to view a different color of imagery based on day or night.
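The per-event flow of Steps S1320 through S1380 can be summarized in a short sketch; the handler object and method names below are hypothetical and simply mirror the steps described above.

```python
def handle_detection(event, targeting, gimbal, slew_override_on):
    """Illustrative event-handling flow for a detected shot (Steps S1320-S1380)."""
    targeting.alert_user(event)             # S1330/S1340: vibration/audible/visible cue; update display and event list
    if slew_override_on:
        return                              # S1350/S1360: user is handling a previous event; do not slew
    pan, tilt = targeting.event_to_angles(event)
    gimbal.slew(pan, tilt)                  # S1370: drive the gimbal to the alerted event
    targeting.show_area_of_interest(event)  # display imagery of the area of interest
```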
  • While the invention has been described and illustrated in relation to preferred embodiments of the invention, it will be appreciated that other embodiments, adaptations and modifications of the invention will be readily apparent to those skilled in the art.

Claims (11)

1. (canceled)
2. An apparatus comprising:
a spectral filter;
a temporal filter;
a spatial filter;
an image processor cooperating with at least one of said spectral filter, said temporal filter, and said spatial filter to detect a flash event;
a targeting processor cooperating with said image processor to determine a target location based on the flash event; and
a gimbal cooperating with said targeting processor to slew toward the target location.
3. The apparatus according to claim 2, further comprising at least one of:
a target confirmation sensor cooperating with said targeting processor and with said gimbal; and
a counterfire device connected to said target confirmation sensor.
4. The apparatus according to claim 3, wherein said target confirmation sensor comprises one of a pair of binoculars and a telescope.
5. The apparatus according to claim 3, further comprising:
an aim point measurement device aligned with said target confirmation sensor.
6. The apparatus according to claim 2, wherein said image processor, for an image comprising a plurality of pixels, adjusts at least one of a pedestal value and a gain value of at least a portion of the plurality of pixels, thereby spreading a histogram of the image.
7. The apparatus according to claim 6, wherein said image processor comprises an exposure control.
8. The apparatus according to claim 2, wherein said spectral filter comprises a cold filter setting corresponding to at least one of a gunfire characteristic, an ordnance characteristic, a background clutter characteristic, and an atmospheric characteristic.
9. The apparatus according to claim 2, further comprising at least one of:
a range finder connected to said gimbal;
a magnetometer interfacing with said targeting processor;
a compass interfacing with said targeting processor; and
a global positioning satellite transceiver interfacing with said targeting processor,
wherein said targeting processor geolocates the flash event based at least in part from data from at least one of said range finder, said magnetometer, said compass, and said global positioning satellite transceiver.
10. The apparatus according to claim 2, further comprising:
a detection camera comprising a wide angle anamorphic lens cooperating with at least one of said spectral filter, said temporal filter, and said spatial filter.
11. The apparatus according to claim 2, further comprising:
an optical illuminator connected to said gimbal to identify the target.
US11/052,921 2003-12-17 2005-02-09 Optical muzzle blast detection and counterfire targeting system and method Abandoned US20060021498A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/052,921 US20060021498A1 (en) 2003-12-17 2005-02-09 Optical muzzle blast detection and counterfire targeting system and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US73807403A 2003-12-17 2003-12-17
US11/052,921 US20060021498A1 (en) 2003-12-17 2005-02-09 Optical muzzle blast detection and counterfire targeting system and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US73807403A Continuation 2003-12-17 2003-12-17

Publications (1)

Publication Number Publication Date
US20060021498A1 true US20060021498A1 (en) 2006-02-02

Family

ID=35730690

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/052,921 Abandoned US20060021498A1 (en) 2003-12-17 2005-02-09 Optical muzzle blast detection and counterfire targeting system and method

Country Status (1)

Country Link
US (1) US20060021498A1 (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4266463A (en) * 1978-01-18 1981-05-12 Aktiebolaget Bofors Fire control device
US5258621A (en) * 1992-02-25 1993-11-02 General Electric Company Cold shield for a scanned linear IR detector array
US5327089A (en) * 1992-09-30 1994-07-05 Raytheon Company Portable assembly for supporting magnetic and electrical sensors
US5596509A (en) * 1994-05-12 1997-01-21 The Regents Of The University Of California Passive infrared bullet detection and tracking
US6072571A (en) * 1996-04-22 2000-06-06 The United States Of America As Represented By The Secretary Of The Navy Computer controlled optical tracking system
US6496593B1 (en) * 1998-05-07 2002-12-17 University Research Foundation, Inc. Optical muzzle blast detection and counterfire targeting system and method
US6263160B1 (en) * 1999-06-11 2001-07-17 Wescam Inc. Stabilized platform systems for payloads
US7210392B2 (en) * 2000-10-17 2007-05-01 Electro Optic Systems Pty Limited Autonomous weapon system
US6720905B2 (en) * 2002-08-28 2004-04-13 Personnel Protection Technologies Llc Methods and apparatus for detecting concealed weapons
US7289272B2 (en) * 2005-09-16 2007-10-30 Raytheon Company Optical system including an anamorphic lens

Similar Documents

Publication Publication Date Title
US20060021498A1 (en) Optical muzzle blast detection and counterfire targeting system and method
US6496593B1 (en) Optical muzzle blast detection and counterfire targeting system and method
US5686889A (en) Infrared sniper detection enhancement
US8896701B2 (en) Infrared concealed object detection enhanced with closed-loop control of illumination by.mmw energy
US8336777B1 (en) Covert aiming and imaging devices
US5834676A (en) Weapon-mounted location-monitoring apparatus
US10425569B2 (en) Camera system
EP2956733B1 (en) Firearm aiming system with range finder, and method of acquiring a target
US9897688B2 (en) Laser detection and image fusion system and method
US20130333266A1 (en) Augmented Sight and Sensing System
US20120090216A1 (en) Electronic Firearm Sight and method for adjusting the reticle thereof
US20090040308A1 (en) Image orientation correction method and system
CN103097935B (en) Optoelectronic system with hyper-hemispheric field of view
US20120117848A1 (en) Electronic sight for firearm, and method of operating same
CA2938227C (en) Method for detecting and classifying events of a scene
US20180039061A1 (en) Apparatus and methods to generate images and display data using optical device
US20110084868A1 (en) Variable range millimeter wave method and system
JPH02105087A (en) Method and device for discriminating start and flight of body
US7193214B1 (en) Sensor having differential polarization and a network comprised of several such sensors
US4804843A (en) Aiming systems
JP2024511798A (en) telescopic sight
KR102485302B1 (en) Portable image display apparatus and image display method
CA3234451A1 (en) Enhanced picture-in-picture
GB2280563A (en) Apparatus for detecting and indicating the position of a source of transient optical radiation
KR20020042328A (en) Image Magnifying System for Distinguishing between Our Side and Their Side, and Weapon Having The Same

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION