
WO1994015165A1 - Target acquisition training apparatus and method of training in target acquisition - Google Patents

Target acquisition training apparatus and method of training in target acquisition

Info

Publication number
WO1994015165A1
Authority
WO
WIPO (PCT)
Prior art keywords
aimpoint
zone
display medium
coordinates
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/GB1993/002587
Other languages
English (en)
Inventor
Philip David Samuel Irwin
Mark Tweedie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Short Brothers PLC
Original Assignee
Short Brothers PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Short Brothers PLC filed Critical Short Brothers PLC
Priority to GB9511352A priority Critical patent/GB2289521B/en
Publication of WO1994015165A1 publication Critical patent/WO1994015165A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41WEAPONS
    • F41GWEAPON SIGHTS; AIMING
    • F41G3/00Aiming or laying means
    • F41G3/26Teaching or practice apparatus for gun-aiming or gun-laying
    • F41G3/2616Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device
    • F41G3/2622Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device for simulating the firing of a gun or the trajectory of a projectile
    • F41G3/2655Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device for simulating the firing of a gun or the trajectory of a projectile in which the light beam is sent from the weapon to the target
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41WEAPONS
    • F41GWEAPON SIGHTS; AIMING
    • F41G3/00Aiming or laying means
    • F41G3/26Teaching or practice apparatus for gun-aiming or gun-laying
    • F41G3/2616Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device
    • F41G3/2622Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device for simulating the firing of a gun or the trajectory of a projectile
    • F41G3/2627Cooperating with a motion picture projector

Definitions

  • the present invention relates to target acquisition training apparatus for use on a target acquisition range for training one or more aimers assembled on the range and a method of training in target acquisition.
  • the first option "how-to-shoot” is usually taught on medium to long target ranges to achieve the highest accuracy possible and therefore requires the highest aimpoint measurement accuracy.
  • the "when-to- shoot” option is, primarily, decision-based training, carried out on shorter target ranges, against rapidly moving targets, as might appear on a battlefield, in ambush and hostage situations and the like, where the prime purpose is to train the aimer to select targets correctly and to make a decision as to whether or not to shoot, while achieving satisfactory aimpoint accuracy depending on the circumstances presented to him when shooting.
  • This option requires about one order of magnitude less in aimpoint measurement accuracy than the "how-to-shoot” option, because effective "when-to-shoot” training is achieved by a basic assessment of whether a relatively large target was hit or missed.
  • the present invention can meet the requirements for both "how-to-shoot" and "when-to-shoot" training.
  • the weapon aimpoint is often determined by measuring the position of a near-infrared (IR) spot which is produced from a collimated source on the weapon and which is reflected from the wall or screen.
  • the spot position is usually detected by a camera system viewing the wall or screen and linked to suitable hardware and/or software.
  • the camera system can be calibrated to the visible projected image by prior measurement of a projected visible test pattern; an IR pass filter may then subsequently be used to cut out the visible wavelengths to prevent the camera system, after calibration, detecting the visible image, so that the IR spots may appear in a largely blank background, for ease of detection.
  • the conventional hardware/software used in such systems is typically based on a simple video level thresholding operation for spot detection and cannot unambiguously identify two separate IR spots appearing within any one video frame sampled by the camera spot measurement system.
  • with N aimers, aimpoint detection can take place at a maximum rate of only (1/N) times the normal video frame rate as the IR spots must be modulated so that only one at a time appears in any one camera frame.
  • aimpoint could be determined at the point of firing, by measuring the time of firing relative to time of aimpoint readouts before and after firing.
  • the aimpoint coordinates before and after firing could then be used for interpolation based on trigger timing measurements to find the aimpoint coordinates at firing.
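
As a rough illustration of this interpolation approach, the sketch below linearly interpolates between the two aimpoint readouts that bracket the trigger event; the function name, timestamps and coordinates are assumptions for illustration, not taken from the patent.

```python
# Minimal sketch: estimate the aimpoint at the instant of firing by linear
# interpolation between the aimpoint readouts before and after the trigger
# pull.  Timestamps are in seconds, coordinates in screen units (assumed).

def interpolate_aimpoint(t_fire, t_before, xy_before, t_after, xy_after):
    """Linearly interpolate the (x, y) aimpoint at time t_fire."""
    if not (t_before <= t_fire <= t_after):
        raise ValueError("t_fire must lie between the two readout times")
    span = t_after - t_before
    w = 0.0 if span == 0 else (t_fire - t_before) / span
    x = xy_before[0] + w * (xy_after[0] - xy_before[0])
    y = xy_before[1] + w * (xy_after[1] - xy_before[1])
    return (x, y)

# Example: readouts 40 ms apart (1/N of the video rate for N aimers),
# trigger pulled 10 ms after the first readout.
print(interpolate_aimpoint(0.010, 0.000, (312.0, 240.5), 0.040, (318.0, 238.1)))
```
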
  • the low tracking rate for N aimers would, as N increases, make this increasingly subject to inaccuracy, particularly if any aimpoint disturbance mechanism is implemented just after firing to simulate weapon recoil.
  • An alternative approach in such spot measurement systems is to forgo tracking of the aimpoint and to switch on the IR spots, for about one video frame time, only on trigger pull, thus measuring only the hitpoint. Again, this is a disadvantage should two or more aimers fire within the one camera frame time, since predetermined relative delays in switching on the IR spots would be required to identify unambiguously the aimpoint for the respective aimers.
  • target acquisition training apparatus for use on a target acquisition range for simultaneously training members of a group of aimers assembled on the range comprising projection means for projecting onto a display medium a target acquisition display comprising one or more target images, target acquisition means for each aimer for simulated acquisition of the target image or a selected one of the target images from a sighting position remote from the display medium, each target acquisition means comprising beam transmitting means for transmitting a beam of radiation aligned with an aimpoint axis of the target acquisition means to produce on the display medium a localised radiation aimpoint zone representing the aimpoint of the target acquisition means on the display medium, a video camera positioned to frame scan the display medium and to generate from each frame scan video signals representing the aimpoint zones appearing on the display medium, and computing means to select the camera video signals in each frame representing the aimpoint zones appearing on the display medium and to generate therefrom aimpoint coordinates for each aimpoint zone for aimpoint assessment of the aimers.
  • by target acquisition is meant the bringing of an aimpoint of a device such as a weapon or simulated weapon on to a target by an aimer of the device, or his attempts thereat; it may, but need not, include the firing or simulated firing of the device at the target.
  • the video camera is arranged to generate analogue video signals in each frame
  • an analogue to digital converter is provided to convert the analogue video signals to digitised video signals
  • the computing means comprises a digital computing means to select the digitised video signals in each frame representing the aimpoint zones appearing on the display medium and to generate therefrom aimpoint coordinates for each aimpoint zone for aimpoint assessment of the aimers.
  • the analogue to digital converter is such as to convert the analogue video signals into digitised pixel video signals with each digitised pixel video signal having a discrete value identifying a grey level per pixel location.
  • the digital computing means subjects the digitised pixel video signals in each frame representing the grey level value per pixel location to a centroiding process to determine the centre of each aimpoint zone to within a fraction of the central pixel within each aimpoint zone to provide aimpoint sub-pixel centroid coordinates for each aimpoint zone.
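
The centroiding step can be pictured with the following sketch, which computes an intensity-weighted centre over a small patch of grey levels assumed to have been segmented around one aimpoint zone; names and values are illustrative only.

```python
import numpy as np

# Minimal sketch of the intensity-weighted (sub-pixel) centroiding step,
# assuming the aimpoint zone has already been segmented as a small patch of
# grey-level values.  Function and variable names are assumptions.

def subpixel_centroid(patch, origin=(0, 0)):
    """Return the intensity-weighted centre (x, y) of a grey-level patch.

    patch  : 2-D array of grey levels (e.g. 0..255) covering one spot.
    origin : (row, col) of the patch's top-left pixel in field coordinates.
    """
    patch = np.asarray(patch, dtype=float)
    total = patch.sum()
    if total == 0:
        raise ValueError("empty patch: no intensity to centroid")
    rows = np.arange(patch.shape[0])
    cols = np.arange(patch.shape[1])
    cy = (patch.sum(axis=1) * rows).sum() / total + origin[0]
    cx = (patch.sum(axis=0) * cols).sum() / total + origin[1]
    return cx, cy   # sub-pixel column (x) and row (y)

# A spot whose intensity falls off around the brightest pixel, as the
# centroiding technique assumes:
spot = [[10,  40, 12],
        [45, 220, 60],
        [14,  55, 18]]
print(subpixel_centroid(spot, origin=(100, 200)))
```
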
  • the video camera is arranged to generate the analogue video signals in each frame by sequential scanning of interlaced scanning fields
  • the analogue to digital converter converts the analogue video signals field by field to digitised field signals
  • the digital computing means selects the digitised field signals in each field representing the aimpoint zones appearing on the display medium and generates therefrom aimpoint coordinates for each aimpoint zone for aimpoint assessment of the aimers.
  • the digital computing means preferably then includes a field store and the digitised field signals in each field are held for one field time in the field store to allow analysis of the digitised field signals.
  • the computing means is provided with algorithms to discriminate the aimpoint zones.
  • each beam transmitting means is arranged to transmit a beam of radiation modulated to provide a beam on-time, per camera field time, less than the camera field time to reduce movement induced aimpoint zone blur.
  • each beam transmitting means transmits a beam of radiation such as to produce on the display medium a localised radiation aimpoint zone having an identifying characteristic different from that produced by each of the other beam transmitting means.
  • the identifying characteristic of the zone may, for example, be its shape or its intensity.
  • target acquisition training apparatus for use on a target acquisition range for training an aimer assembled on the range comprising projection means for projecting onto a display medium a target acquisition display comprising a target image, target acquisition means for the aimer for simulated acquisition of the target image from a sighting position remote from the display medium, the target acquisition means comprising beam transmitting means for transmitting a beam of radiation aligned with an aimpoint axis of the target acquisition means to produce on the display medium a localised radiation aimpoint zone representing the aimpoint of the target acquisition means on the display medium, a video camera positioned to frame scan the display medium and to generate from each frame scan an analogue video signal or signals representing the aimpoint zone appearing on the display medium, an analogue to digital converter which converts the analogue video signal or signals into a digitised pixel video signal or signals each having a discrete value identifying a grey level per pixel location, and digital computing means to select the digitised pixel video signal or signals in each frame representing the aimpoint zone appearing on the display medium and to subject them to a centroiding process to determine the centre of the aimpoint zone to within a fraction of the central pixel within the aimpoint zone, thereby to provide aimpoint sub-pixel centroid coordinates for the aimpoint zone for aimpoint assessment of the aimer.
  • the computing means includes a store to store in projected image coordinates one or more target outlines for each frame, and the aimpoint coordinates of each aimpoint zone are compared with the projected image coordinates of the target outline or a selected one of the target outlines for aimpoint assessment of each aimer.
  • the computing means generates the aimpoint coordinates for each aimpoint zone in camera coordinates
  • the aimpoint camera coordinates of the aimpoint zones are arranged to be compared with reference camera coordinates stored in the computing means which are sub-pixel centroid coordinates of a plurality of reference zones projected onto the display medium by the projection means at predetermined projected image coordinates, on the basis of such comparison the aimpoint camera coordinates of the aimpoint zones are converted into aimpoint projected image coordinates
  • the aimpoint projected image coordinates of each aimpoint zone are compared with the stored projected image coordinates of the target outline or a selected one of the target outlines for aimpoint assessment of the aimers.
  • the aimpoint sub-pixel centroid coordinates calculated for the aimpoint zone are in camera coordinates
  • the aimpoint camera coordinates of the aimpoint zone are arranged to be compared with reference camera coordinates stored in the computing means which are sub-pixel centroid coordinates of a plurality of reference zones projected onto the display medium by the projection means at predetermined projected image coordinates, on the basis of such comparison the aimpoint camera coordinates of the aimpoint zone are converted into aimpoint projected image coordinates
  • the aimpoint projected image coordinates of the aimpoint zone are compared with the stored projected image coordinates of the target outline or a selected one of the target outlines for aimpoint assessment of the aimer.
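
The patent does not prescribe how the comparison with the stored reference-zone centroids converts camera coordinates into projected image coordinates; one plausible sketch, assuming a least-squares affine model fitted to the matched reference points, is shown below (all names and values are illustrative).

```python
import numpy as np

# Minimal sketch, assuming a least-squares affine fit is an acceptable model
# of the camera-to-projected-image mapping established by the reference
# (calibration) zones.  This is one plausible use of the stored reference
# centroids, not the patent's prescribed method.

def fit_affine(camera_pts, image_pts):
    """Fit image = [x, y, 1] @ coeffs from matched reference-zone centroids."""
    cam = np.asarray(camera_pts, dtype=float)
    img = np.asarray(image_pts, dtype=float)
    M = np.hstack([cam, np.ones((cam.shape[0], 1))])        # N x 3
    coeffs, *_ = np.linalg.lstsq(M, img, rcond=None)         # 3 x 2
    return coeffs

def camera_to_image(coeffs, xy):
    x, y = xy
    return tuple(np.array([x, y, 1.0]) @ coeffs)

# Reference zones: sub-pixel centroids measured in camera coordinates,
# projected at known image coordinates (values illustrative only).
cam_ref = [(102.3, 98.7), (410.9, 101.2), (104.1, 380.4), (412.8, 383.0)]
img_ref = [(64, 64), (576, 64), (64, 448), (576, 448)]

A = fit_affine(cam_ref, img_ref)
print(camera_to_image(A, (260.5, 240.2)))   # aimpoint centroid -> image coords
```
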
  • the beam of radiation producing the localised radiation aimpoint zone is preferably a beam of infrared radiation which may take the form of laser radiation.
  • the video camera is a monochrome CCD video camera sensitive to visible and infrared radiation and means are provided to interpose an infrared pass filter between the video camera and the display medium whereby the video camera generates only or substantially only video signals representing the aimpoint zones appearing on the display medium.
  • the video camera integration period is preferably made less than the camera's field time to reduce the adverse effects of ambient infrared background lighting.
  • the projector is preferably in the form of a video projector and may be positioned centrally with respect to the screen and the video camera may conveniently be mounted on the video projector.
  • each target acquisition means is a weapon or simulated weapon and the beam transmitting means is mounted on the weapon to represent on the display medium the aimpoint of the weapon.
  • a method of aimpoint assessment in simultaneously training members of a group of aimers in target acquisition on a target acquisition range comprising the steps of projecting onto a display medium a target acquisition display comprising one or more target images, arranging for each aimer to simulate acquisition of the target image or a selected one of the target images from a sighting position remote from the display medium using target acquisition means which comprises beam transmitting means which transmits a beam of radiation aligned with an aimpoint axis of the target acquisition means and produces on the display medium a localised radiation aimpoint zone representing the aimpoint of the target acquisition means on the display medium, frame scanning the display medium using a video camera, generating from each frame scan video signals representing the aimpoint zones appearing on the display medium, selecting the video signals in each frame representing the aimpoint zones appearing on the display medium and generating therefrom aimpoint coordinates for each aimpoint zone for aimpoint assessment of the aimers.
  • the video signals in each frame are arranged to be digitised pixel video signals each having a discrete value identifying a grey level per pixel location, and the digitised pixel video signals in each frame representing the aimpoint zones appearing on the display medium are selected and subjected to a centroiding process to determine the centre of each aimpoint zone to within a fraction of the central pixel within each aimpoint zone to provide aimpoint sub-pixel centroid coordinates for each aimpoint zone.
  • the projected display is frame scanned by sequential scanning of interlaced scanning fields, the digitised pixel video signals are generated field by field, and the digitised pixel video signals in each field representing the aimpoint zones appearing on the display medium are selected and subjected to a centroiding process to determine the centre of each aimpoint zone to within a fraction of the central pixel within each aimpoint zone to provide aimpoint sub-pixel centroid coordinates for each aimpoint zone.
  • the digitised pixel video signals in each field may be held for one field time to allow analysis thereof.
  • means are provided to discriminate the aimpoint zones.
  • each beam transmitting means transmits a beam of radiation such as to produce on the display medium a localised radiation aimpoint zone having an identifying characteristic different from that produced by each of the other beam transmitting means.
  • the identifying characteristic of the zone may, for example, be its shape or its intensity.
  • a method of aimpoint assessment in training an aimer in target acquisition on a target acquisition range comprising the steps of projecting onto a display medium a target acquisition display comprising a target image, arranging for the aimer to simulate acquisition of the target image from a sighting position remote from the display medium using a target acquisition means which comprises a beam transmitting means which transmits a beam of radiation aligned with an aimpoint axis of the target acquisition means and produces on the display medium a localised radiation aimpoint zone representing the aimpoint of the target acquisition means on the display medium, frame scanning the display medium using a video camera, generating from each frame scan a digitised pixel video signal or signals each having a discrete value identifying a grey level per pixel location, selecting the digitised pixel video signal or signals in each frame representing the aimpoint zone appearing on the display medium and subjecting the digitised pixel video signal or signals to a centroiding process to determine the centre of the aimpoint zone to within a fraction of the central pixel within the aimpoint zone, thereby to provide aimpoint sub-pixel centroid coordinates for the aimpoint zone for aimpoint assessment of the aimer.
  • one or more target outlines are stored for each frame in projected image coordinates, and the aimpoint coordinates of each aimpoint zone are compared with the projected image coordinates of the target outline or a selected one of the target outlines for aimpoint assessment of each aimer.
  • the aimpoint coordinates are generated in camera coordinates
  • the aimpoint camera coordinates of the aimpoint zones are compared with reference camera coordinates which are sub-pixel centroid coordinates of a plurality of reference zones projected onto the display medium at predetermined projected image coordinates
  • the aimpoint camera coordinates of the aimpoint zones are converted into projected image coordinates on the basis of such comparison
  • the aimpoint projected image coordinates of each aimpoint zone are compared with the stored projected image coordinates of the target outline or a selected one of the target outlines for aimpoint assessment of the aimers.
  • the aimpoint sub-pixel centroid coordinates of the aimpoint zone are arranged to be in camera coordinates
  • the aimpoint camera coordinates of the aimpoint zone are compared with reference camera coordinates which are sub-pixel centroid coordinates of a plurality of reference zones projected onto the display medium at predetermined projected image coordinates
  • the aimpoint camera coordinates of the aimpoint zone are converted into projected image coordinates on the basis of such comparison
  • the aimpoint projected image coordinates of the aimpoint zone are compared with the projected image coordinates of the target outline or a selected one of the target outlines for aimpoint assessment of the aimer.
  • each beam transmitting means transmits a beam of infrared radiation which may take the form of infrared laser radiation.
  • the reference zones are arranged to be projected onto the display medium prior to projection of the target acquisition display thereon, the display medium is frame scanned with the video camera and digitised pixel video signals representing the reference zones appearing on the display medium, each having a discrete value identifying a grey level per pixel location, are generated therefrom, and the digitised pixel video signals representing the reference zones appearing on the display medium are selected and subjected to a centroiding process to determine the centre of the reference zones to within a fraction of the central pixel within the reference zones to provide sub-pixel centroid coordinates for each reference zone.
  • Fig 1 is a schematic perspective view of part of a weapon-fire training range embodying target acquisition training apparatus according to the invention
  • Fig 2 is an enlarged view of a part of the screen forming part of the target acquisition training apparatus according to the invention illustrated in Fig 1
  • Fig 3 is a block schematic diagram of the range illustrated in Fig 1 showing sub-assemblies of the target acquisition training apparatus according to the invention and their interconnections
  • Fig 4 is a flow chart illustrating a calibration procedure for the target acquisition training apparatus according to the invention
  • Fig 5 is a flow chart illustrating how aimpoint assessment of an aimer is achieved in an embodiment of the target acquisition training apparatus according to the invention
  • Fig 6 is a block schematic diagram of a single sensor/single processor forming part of the computing means for use in the target acquisition training apparatus illustrated in Figs 1 and 3
  • Fig 7 is a schematic block diagram showing in more detail the analogue to digital, digital to analogue and man-machine interface modules of the computing means illustrated in Fig 6
  • Fig 8 is a block schematic diagram of the transputer processor module of the computing means illustrated in Fig 6 and
  • Fig 9 is a block schematic diagram of a multi-sensor/multi-processor system forming the computing means of the target acquisition training apparatus according to the invention and based on four single sensor/single processors described with reference to Figs 6 to 8
  • the weapon fire training range shown is an indoor range for use by four marksmen, only two of whom are schematically illustrated and indicated by reference numerals 11 and 12.
  • the first marksman 11 occupies an end lane L1 and is provided with a simulated weapon 14 which he directs at target images 15 projected on to a screen 16 by a video projector assembly 17.
  • the projector 17 provides for full coverage of the screen 16 and presents target images 15 for use in training all four marksmen in target acquisition. Training of the four marksmen in this embodiment takes place simultaneously.
  • the second marksman 12 is provided with a simulated weapon 18 for use against the target images 15 displayed on the screen 16 by the projector 17. It is to be noted that the target acquisition training of the two marksmen 11 and 12 is provided in respect of weapons of different type and the other marksmen not shown can if desired be provided with simulated weapons of the same or other types.
  • each simulated weapon has mounted thereon an infrared laser diode which transmits a collimated infrared laser beam which converges with the optical sight of the weapon at a predetermined range of the marksman from the screen 16.
  • the weapon 14 used by the first marksman 11 is shown with an infrared laser diode 50 mounted thereon which transmits an infrared laser beam 5.
  • each infrared laser diode produces on the screen 16 an aimpoint spot 10 coincident with the aimpoint of the weapon, as illustrated in Fig 2.
  • the aimpoint spots 10 produced on the screen 16 by the laser diodes of each weapon 14 and 18 in this manner may, as illustrated in Fig 2, be of different identifying characteristics, such as for example of different shapes or intensities.
  • the training range is placed under the control of a controller 19 who is provided with target assessment displays on a monitor screen 20 of a master console sub-assembly 21.
  • each of the marksmen 11 and 12 is provided with floor box sub- assemblies 22 and 23 which provide on monitors 24 and 25 information as to their own target acquisition performance.
  • a video camera 13 is mounted on the video projector 17 and is arranged to have a field of view of the screen 16 corresponding to the full field projected by the video projector 17.
  • Each of the floor box sub-assemblies 22 and 23 and each of the other two floor box sub-assemblies includes a lane microprocessor for processing lane information applied to it and lane monitor screens 24 and 25 for use by the marksmen.
  • the sub-assemblies of the target acquisition training apparatus shown in Fig 1 are illustrated in block diagram form and include the screen 16, the video projector 17, the video camera 13 mounted thereon, simulated weapons 14 and 18 together with two further simulated weapons 34 and 35 for use by marksmen in the other two lanes of the range, floor box sub-assemblies 22 and 23, including the lane monitors 24 and 25 and lane microprocessors 36 and 37, together with two further floor box sub-assemblies 38 and 39 including further lane monitors 40 and 41 and further lane microprocessors 42 and 43 and the master console sub-assembly 21.
  • the video camera 13 is shown to comprise a camera box 44 mounted on the projector 17, a CCD sensor array device 45, a camera lens body 46, a front window 47 and interposed between the window 47 and the lens body 46 an infrared pass filter 48 movable into and out of the field of view of the camera 13 under the control of a filter selection motor 49.
  • the weapons 14, 18, 34 and 35 are provided with infrared laser diodes 50, 51, 52 and 53, which are arranged to transmit collimated infrared laser beams, each of which is set to converge with the optical sight of the weapon at a predetermined range of the marksman from the screen 16.
  • a camera output path 80 feeds the video signal from the camera 13 to a multi-sensor/multi-processor system 54 which determines the tracking coordinates of each aimpoint IR spot 10 on the screen 16.
  • the tracking coordinates are fed to the master console 21 along a multi-sensor/multi-processor system output path 81 and then to a video/graphics display control system 55 which controls the projection of video images from the projector 17 along a master console output path 82.
  • the infrared pass filter 48 is, during tracking of the aimpoint IR spots 10 on the screen 16, automatically positioned in the field of view of the video camera 13, specifically to prevent the camera 13 detecting the visible image projected by the projector 17 as this would interfere with the aimpoint IR spot detection process.
  • a grid of visible white spots, with predefined projection image coordinates, is projected on to the screen 16.
  • the infrared pass filter 48 is removed at step 101 and the video camera 13 samples the calibration spots at step 102 and generates signals representing the calibration spots appearing on the screen 16 which are fed to the processor system 54 along the camera output path 80.
  • the processor system 54 determines at step 103 the centroids of the calibration spots in camera coordinates and stores them in memory at step 104.
  • the infrared pass filter 48 is then repositioned in the field of view of the camera 13 at step 105 so that training can commence.
  • the camera coordinate calibration centroids can then be related directly back to each projected image, where the target positions and shapes have been stored in memory for each image frame displayed. Therefore, after calibration, the position of any aimpoint IR spot 10 is measured in camera coordinates, which, via the calibration grid mapping, are then translated into image coordinates, since the calibration grid was, firstly, defined in image coordinates. Then, since the target positions and sizes, in image coordinates, are stored for each frame displayed, in the processor system memory, the coordinates of the aimpoint IR spots 10 with respect to the target images can be determined, and an assessment made of whether the target was hit or missed, on firing of the weapon and, if required, the degree of accuracy attained.
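
A minimal sketch of the final comparison step, assuming the stored target outline for a displayed frame is held as a polygon in projected image coordinates and tested with a ray-casting hit/miss check (the names and coordinates are illustrative, not taken from the patent):

```python
# Minimal sketch (assumed names) of comparing a mapped aimpoint with a stored
# target outline: the outline for the displayed frame is held as a polygon in
# projected image coordinates and a ray-casting test decides hit or miss.

def point_in_outline(pt, outline):
    """Ray-casting point-in-polygon test; outline is a list of (x, y)."""
    x, y = pt
    inside = False
    n = len(outline)
    for i in range(n):
        x1, y1 = outline[i]
        x2, y2 = outline[(i + 1) % n]
        crosses = (y1 > y) != (y2 > y)
        if crosses and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside
    return inside

# Target outline stored for one displayed frame (coordinates illustrative).
outline = [(240, 120), (320, 120), (330, 300), (230, 300)]
print(point_in_outline((280, 200), outline))   # True  -> hit
print(point_in_outline((150, 200), outline))   # False -> miss
```
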
  • while the target acquisition training apparatus hereinbefore described is shown being used for training a plurality of aimers, it will be appreciated that the apparatus is equally suitable for training an individual assembled on the range.
  • the flow chart set out in Fig 5 illustrates the relationship between the camera coordinate calibration centroids, the stored target outlines and an aimpoint spot 10 tracked for aimpoint assessment of an aimer in training.
  • the centroids of the calibration spots in camera coordinates and the projected image coordinates of a target image outline in each frame are stored in memory.
  • a target acquisition display comprising the target image 15 is projected onto the screen 16.
  • an aimer simulates acquisition of the target image 15.
  • the video camera 13 samples the screen 16 and generates along the camera output path 80 for each frame scan video signals representing the aimpoint spot 10 appearing on the screen 16.
  • the processor system 54 processes the camera video signals received along output path 80 and determines the camera coordinates of the aimpoint spot 10 in each frame at step 153.
  • the camera coordinates of the aimpoint spot 10 are then compared with the stored camera coordinates of the calibration spots at step 154 and on the basis of such comparison converted into projected image coordinates.
  • the projected image coordinates of the aimpoint spot 10 are then compared with the stored projected image coordinates of the target outline for corresponding frames at step 155 to enable aimpoint assessment of the aimer to take place at step 156.
  • although Fig 5 refers to a single target image being projected onto the screen 16 for a single aimer, it will be appreciated that the flow chart is equally applicable to the case of simultaneous training of a group of aimers using one or more projected target images.
  • a multi-sensor/multi-processor architecture as hereinafter to be described with reference to Fig 9 is required for processing the aimpoint information when two or more systems are combined, to allow many aimers to be trained, simultaneously, over multiple projection screens.
  • the architecture for a single-sensor/single processor system will however first be described in detail with reference to Figs 6 to 8.
  • the single sensor/single processor system comprises a video A/D module 56, a front end processing module 57, which performs a segmentation operation in real-time, and a transputer processor module 58, which performs further data dependent processing.
  • An RGB display monitor 61 is used for display purposes. Due to the design, the system may be expanded simply by adding further processor boards and adapting the software to use them, and additional sensors may be integrated into the system with the addition of suitable A/D and front end processing modules as hereinafter to be described.
  • the video A/D module 56 converts a PAL video signal into a 512 x 512 pixel image of 8 bit data, and is arranged to cope with both interlaced and non-interlaced formats. It also provides the timing signals for the digitised data, namely: pixel clock, line synch, frame synch, valid data indicator, odd/even field indicator, pixel number and line number.
  • Interlaced video comprises two half-frame fields which are distinguishable by the relationship of the line and frame timing, and the odd/even signal indicates the current field in this case. Active image data is indicated by the valid data signal.
  • Circuits 62, 63, 64 and 65 for generating the above timing signals are implemented on a first Xilinx PGA 66 and the outputs from the circuits are used by a D/A module 67 to create a display and by the front end processing module 57 which performs a segmentation operation.
  • the front end processing module 57 is required to segment the real-time image data in order to reduce the amount of information to be processed by the transputer module 58.
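
The segmentation itself is performed in hardware, but the operation can be illustrated in software: keep only pixels above a grey-level threshold and group neighbouring bright pixels into candidate spot patches, so that downstream processing sees a short list of patches rather than a full field. The sketch below is an assumption-laden illustration of that idea, not the module's actual implementation.

```python
import numpy as np

# Software illustration (names and threshold assumed) of the segmentation
# step: threshold the field, then flood-fill neighbouring bright pixels into
# candidate spot patches for downstream processing.

def segment_spots(field, threshold=128):
    """Return a list of (origin, patch) pairs for bright regions in field."""
    field = np.asarray(field)
    mask = field >= threshold
    labels = np.zeros(field.shape, dtype=int)
    regions = []
    for r, c in zip(*np.nonzero(mask)):
        if labels[r, c]:
            continue                      # already assigned to a region
        stack, pixels = [(r, c)], []
        while stack:
            y, x = stack.pop()
            if not (0 <= y < field.shape[0] and 0 <= x < field.shape[1]):
                continue
            if labels[y, x] or not mask[y, x]:
                continue
            labels[y, x] = 1
            pixels.append((y, x))
            stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
        ys, xs = zip(*pixels)
        r0, r1 = int(min(ys)), int(max(ys))
        c0, c1 = int(min(xs)), int(max(xs))
        regions.append(((r0, c0), field[r0:r1 + 1, c0:c1 + 1].copy()))
    return regions

demo = np.zeros((8, 8), dtype=int)
demo[2:4, 2:4] = 200          # one bright spot
demo[6, 6] = 250              # another
for origin, patch in segment_spots(demo):
    print(origin, patch.shape)
```
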
  • a block diagram of the transputer based processor is shown in Figure 8. As the figure shows, the transputer 58 has access not only to standard areas of RAM and EPROM, but also has an area of video RAM (VRAM) mapped into its memory space.
  • the VRAM contains two distinct fields of data, each 512 x 256 pixels in size. This allows processing to occur on one field while the other is being written to with new image data.
  • This image data, coming from the front-end processing module 57, is clocked in a serial fashion into a register on the VRAM. The transfer can occur simultaneously with the read/write operations of the processor, apart from the loss of one processor cycle per video line (512 pixels).
  • a DMA control block 71 supervises the transfers.
  • a new field arrives every 20ms (for 50Hz operation) and its arrival is signalled to the transputer via an event controller 72.
  • the transputer can then switch attention to the newly arrived field allowing the next field to be transferred to the other section of VRAM.
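
The double-buffered field handling can be pictured with the software analogy below, using a queue and a thread as stand-ins for the DMA controller and event controller; it illustrates only the fill-one-buffer-while-processing-the-other idea, not the hardware design itself.

```python
import queue
import threading
import time

# Software analogy (assumed, not the hardware design) of the double-buffered
# VRAM scheme: one field buffer is filled with new image data while the
# previously filled buffer is processed, each arrival being signalled to the
# processing side.

FIELD_PERIOD = 0.02               # 20 ms per field at 50 Hz operation
fields = queue.Queue(maxsize=2)   # two buffers: fill one, process the other

def camera(n_fields=5):
    for i in range(n_fields):
        time.sleep(FIELD_PERIOD)          # field transfer time
        fields.put(("field", i))          # arrival event -> processor
    fields.put(None)                      # end of sequence

def processor():
    while True:
        item = fields.get()               # wait for the arrival signal
        if item is None:
            break
        _, i = item
        print(f"processing field {i} while field {i + 1} is being written")

t = threading.Thread(target=camera)
t.start()
processor()
t.join()
```
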
  • Transputer serial links 73 are available for communication with other processors and are taken off the board on a standard connector.
  • the hardware hereinbefore described constitutes a single-sensor/single-processor image processing system. It is, however, designed in a modular manner with the result that conversion to a multi-sensor/multi-processor system can readily be made. Additional processor boards can be added by simply slotting them into the rack, connecting their serial links in the desired manner and altering the software to utilise the extra processing power. All of the processors will therefore have access to all of the image data. This allows data dependent algorithms to be run on particular processors without the need to transfer image data through the serial links. This not only increases the efficiency of the system but greatly improves its flexibility.
  • a multi-sensor/multi-processor system can therefore be created by simply adding suitable combinations of sensor processing systems to the hardware described with reference to Figs 6 to 8 and allowing the transputers to communicate via their serial links 73 as illustrated in Fig 9.
  • sub-pixel resolution may be employed to determine the centroids of the calibration and aimpoint spots to within a fraction of the central pixel within each spot.
  • Sub-pixel resolution essentially involves generating a pixel image by digitising the video signal or signals generated by the video camera 13 on frame scanning the screen 16 into a signal or signals each having a discrete value identifying a single grey level per pixel location and representing the intensities in the image detected by the video camera 13, for example an IR or visible spot.
  • a simple centroiding technique is then used to find the centre of the light distribution to within a certain fraction of the central pixel of, in this example, the spot, in both x and y axes.
  • centroiding techniques assume a distinguishable fall-off in light intensity on pixels immediately adjacent to the central, that is to say, the brightest pixel.
  • the system is therefore engineered to ensure that this, in fact, is the case if it is to work accurately - eg IR spots need to be modulated so that they are switched on for significantly less than one camera field time, in order to reduce movement-induced blur, which would otherwise distort the sub-pixel resolution process.
  • the calibration method is highly accurate since the calibration spot centroids are measured to sub-pixel accuracy, and can be performed, simultaneously, up to the edges of the projection screen area.
  • Other systems can only maintain relative accuracies over small portions of such a screen ie they cannot maintain absolute accuracy over most of the screen, because they do not use such an advanced calibration system.
  • in such other systems, marksmanship relative accuracy may be achieved over any small portion of the screen, but weapon re-zeroing is required for use on any other part of the screen, because the absolute accuracy is not preserved.
  • the calibration method according to the present invention thus has a distinct advantage, in preserving the absolute accuracy over most of the screen.
  • Sub-pixel accuracy depends on calculating an intensity-weighted centroid, based on the stored grey levels per spot. Pixel jitter timing artefacts are minimised by stripping off the CCD camera video sync signals, and restarting the pixel sampling clock for the video field store, on the relevant rising edge of the sync signal.
  • the field store pixel clock thus operates in a phase-locked loop mode. Therefore, there has been constructed in accordance with the present invention a wide angle camera measurement system, using sub-pixel resolution techniques, capable of producing marksmanship-type measurement accuracies over a wide screen, using a conventional 512 pixels/line resolution field store, whereas previously this would have required multiple cameras and field stores to achieve the same measurement accuracy.
  • the target acquisition training apparatus described provides sufficient accuracy for "when-to-shoot” and “how-to-shoot” training, with particular relevance to team training, where the effectiveness of the system is greatly enhanced by the use of appropriate hardware and software, for automatic aimpoint detection and tracking.
  • in use of the target acquisition training apparatus, the weapon-mounted IR laser sources may be switched on for continuous illumination of the screen or wall, and the resulting aimpoint spots may be tracked continuously at video frame rates, where the aimers are each engaging separate targets.
  • the higher tracking rate consistently inherent in the apparatus provides not only better tracking information, but also greater hitpoint assessment accuracy, since firstly, interpolation between aimpoint updates, at (1/N) times video rates for N aimers, is not required, where separate targets per aimer are engaged, and secondly, multiple aimers may all fire simultaneously within one video frame time without relative aimpoint measurement delays between firers being inherently required in the system.
  • a means of discriminating between aimers on the basis of laser spot shape, or intensity may be implemented. This means that if several aimers engage one target at the same time, the apparatus hereinbefore described may still correctly identify the ownership of the IR spots, without having to modulate them in sequence over successive video frames. In these circumstances also, the tracking rate advantage is preserved over conventional systems.
  • the identification of IR spots on the basis of either shape or intensity may be implemented by converting the analogue video signal per field to a digitised signal, giving for example, 256 grey levels per pixel location.
  • the digitised camera field is held for 1 field time, after digitisation, in a field store, to allow analysis. Since the video grey levels are stored for 1 field time, this ultimately allows complex processing such as discrimination on the basis of spot shape or intensity, whereas a simple thresholding of the video signal could be more limited in discrimination capability, particularly as regards overlapping spots, or for sub-pixel resolution.
  • more complex processing algorithms can be used to achieve separation and identification of overlapping spots, or alternatively sub-pixel resolution, on single laser spots.
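
One way such discrimination might look, assuming each weapon's laser is set up to give a spot with a distinct peak grey level and pixel area, is sketched below; the signature table and thresholds are invented for illustration.

```python
import numpy as np

# Illustrative sketch (assumed names and thresholds) of discriminating
# between aimers on the basis of spot characteristics held in the field
# store: each weapon's laser is assumed to produce a spot with a distinct
# peak grey level and pixel area.

AIMER_SIGNATURES = {
    "lane 1": {"peak": (220, 255), "area": (4, 12)},
    "lane 2": {"peak": (120, 180), "area": (4, 12)},
    "lane 3": {"peak": (220, 255), "area": (16, 40)},   # larger spot shape
}

def identify_aimer(patch):
    """Match a segmented grey-level patch to an aimer signature, if any."""
    patch = np.asarray(patch)
    peak = int(patch.max())
    area = int((patch > 0).sum())
    for aimer, sig in AIMER_SIGNATURES.items():
        if (sig["peak"][0] <= peak <= sig["peak"][1]
                and sig["area"][0] <= area <= sig["area"][1]):
            return aimer
    return None

print(identify_aimer([[0, 150, 0], [140, 170, 150], [0, 160, 0]]))   # -> lane 2
```
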
  • the weapon-mounted IR laser diode sources are modulated so that they are switched on in each camera field time, for much less than the actual field time, in order to reduce movement-induced blur which would otherwise distort spot shape and reduce intensity.
  • This feature can therefore allow very much higher tracking rates than would be achieved with simple unmodulated cw laser sources, while still preserving IR spot shape and intensity.
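
A small worked example, with assumed numbers, of why gating the laser on for only part of a field time limits movement-induced blur:

```python
# Worked illustration with assumed numbers: a spot slewing across the screen
# smears by slew_rate * on_time pixels during the laser on-time.  Gating the
# laser on for 2 ms instead of a full 20 ms field reduces the smear from
# 4 pixels to 0.4 pixel, preserving spot shape for the centroiding step.

slew_rate_px_per_s = 200.0
for on_time_s in (0.020, 0.002):
    blur_px = slew_rate_px_per_s * on_time_s
    print(f"on-time {on_time_s * 1000:.0f} ms -> blur {blur_px:.1f} px")
```
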
  • the apparatus according to the invention can, however, track the IR spot as the aimer slews it on to the target and attempts to track it; the storage and replay of such aimpoint information thus provides valuable feedback on weapon handling, when subsequently overlaid on the projected image sequence used.
  • the generally higher tracking rate of the apparatus according to the invention, for multiple aimers, is highly beneficial as regards the accuracy of tracking, and, hence, produces improved training capability.
  • Various recorded or computer generated scenarios may be projected by a video or graphic projector on to a wall or screen to multiple aimers, and the apparatus according to the invention using the digitised video output of the camera system viewing the wall or screen determines the aimpoint coordinates, of the IR spots, produced by weapon-mounted IR laser sources.
  • the accuracy achieved by digitising the video signal from the camera 13, viewing the wall or screen, to determine the IR spot aimpoints, can easily be arranged to be satisfactory for both "when-to-shoot" (hit or miss) and "how-to-shoot" training.
  • the target sizes and locations, for each frame of the video imagery, will be previously determined by using a target box, the coordinates and size of which are stored in memory along with the relevant video frame number.
  • the aimpoint with respect to the target box coordinates can then be used to determine if the shot hit or missed the target and, if required, the degree of accuracy of a hit or miss.
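
A minimal sketch of that assessment, assuming the target box for each video frame is stored as axis-aligned limits in projected image coordinates (frame numbers and coordinates below are illustrative):

```python
# Minimal sketch (names assumed) of the hit/miss assessment: the target box
# stored per video frame is compared with the aimpoint, in projected image
# coordinates, at the frame in which the shot was fired.

TARGET_BOXES = {
    # frame number: (x_min, y_min, x_max, y_max) in projected image coords
    412: (250, 140, 310, 260),
    413: (254, 140, 314, 260),
}

def assess_shot(frame_no, aimpoint_xy):
    box = TARGET_BOXES.get(frame_no)
    if box is None:
        return "no target stored for this frame"
    x, y = aimpoint_xy
    x0, y0, x1, y1 = box
    if x0 <= x <= x1 and y0 <= y <= y1:
        # offset from the box centre gives a simple accuracy measure
        cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
        return f"hit, offset from centre = ({x - cx:+.1f}, {y - cy:+.1f})"
    return "miss"

print(assess_shot(412, (281.4, 205.0)))
print(assess_shot(413, (180.0, 205.0)))
```
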
  • a numbered tracking gate or box is placed around each laser spot detected on the screen by the target acquisition training apparatus.
  • all the laser spots may be uniquely identified, and continuously tracked, from one camera field time to the next, without having to reduce the tracking rate below the camera field rate for multiple lasers, because the unique identification of each laser spot is preserved by the target acquisition training apparatus.
  • the target acquisition training apparatus provides very much greater flexibility in tracking the laser spots, because no division of the screen into discrete tracking zones per aimer is required. This is because the apparatus can track any laser spot uniquely over the complete video screen at full camera field rates. It is only necessary to reduce the tracking rate when the tracking boxes around the laser spots overlap, for example, when 2 or more lasers are aimed in the vicinity of 1 target, in scenario mode.
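
The tracking-gate behaviour can be sketched as a nearest-neighbour association between the spot centroids found in each new field and the gates carried over from the previous field; the gate radius and names below are assumptions for illustration.

```python
import math

# Illustrative sketch (assumed gate size and names) of numbered tracking
# gates: each detected spot centroid in a new field is assigned to the
# nearest existing gate if it falls within the gate radius, otherwise a new
# numbered gate is opened.  Overlapping gates are the case needing extra
# handling (e.g. a reduced rate or shape/intensity discrimination).

GATE_RADIUS = 15.0   # pixels, assumed

class Tracker:
    def __init__(self):
        self.gates = {}          # gate number -> last centroid (x, y)
        self.next_gate = 1

    def update(self, centroids):
        assignments = {}
        for xy in centroids:
            best, best_d = None, GATE_RADIUS
            for gate, last_xy in self.gates.items():
                d = math.dist(xy, last_xy)
                if d <= best_d:
                    best, best_d = gate, d
            if best is None:
                best = self.next_gate
                self.next_gate += 1
            self.gates[best] = xy
            assignments[best] = xy
        return assignments

trk = Tracker()
print(trk.update([(100.0, 100.0), (300.0, 220.0)]))   # gates 1 and 2 opened
print(trk.update([(103.0, 101.5), (298.5, 224.0)]))   # same gates, updated
```
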
  • the use of the target acquisition training apparatus according to the invention therefore, provides useful gains in overall tracking rate and, therefore, accuracies.
  • target acquisition training apparatus in accordance with the invention provides for the following uses, namely:
  • the apparatus according to the invention may have the following features:
  • a high resolution video projector positioned centrally with respect to the projection screen wall.
  • a visible and IR sensitive, monochrome, CCD video camera for example positioned on the video projector for viewing the projection wall or screen to detect IR spots from weapon-mounted collimated laser sources.
  • An analogue to digital conversion board to digitise the video output of the CCD camera, with a frame store to hold the digitised grey levels for 1 field time.
  • target acquisition training apparatus can be used singly or combined with further ones of the target acquisition training apparatus to allow many aimers to be trained simultaneously over multiple projection screens.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Engineering & Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)

Abstract

The invention relates to target acquisition training apparatus for use on a target acquisition range for training one or more aimers (11, 12) assembled on the range. The apparatus comprises projection means (17) for projecting onto a display medium (16) a target acquisition display comprising one or more target images (15), and target acquisition means (14, 18) for the or each aimer (11, 12) for simulated acquisition of the target image (15) or a selected one of the target images (15) from a sighting position remote from the display medium (16); the or each target acquisition means (14, 18) comprises beam transmitting means (50, 51) for transmitting a beam (5) of radiation aligned with an aimpoint axis of the or each target acquisition means (14, 18) to produce on the display medium (16) a localised radiation aimpoint zone or zones (10) representing the aimpoint of the target acquisition means (14, 18) on the display medium (16). The apparatus also comprises a video camera (13) positioned to frame scan the display medium (16) and to generate from each frame scan video signals representing the aimpoint zone or zones (10) appearing on the display medium (16), and computing means (54) for selecting the camera video signals in each frame representing the aimpoint zone or zones (10) appearing on the display medium (16) and for generating therefrom aimpoint coordinates for the or each aimpoint zone (10), for aimpoint assessment of the aimer or aimers (11, 12).
PCT/GB1993/002587 1992-12-18 1993-12-17 Target acquisition training apparatus and method of training in target acquisition Ceased WO1994015165A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB9511352A GB2289521B (en) 1992-12-18 1993-12-17 Target acquisition training apparatus and method of training in target acquisition

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB9226389.6 1992-12-18
GB929226389A GB9226389D0 (en) 1992-12-18 1992-12-18 Target acquisition training apparatus

Publications (1)

Publication Number Publication Date
WO1994015165A1 true WO1994015165A1 (fr) 1994-07-07

Family

ID=10726810

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1993/002587 Ceased WO1994015165A1 (fr) 1992-12-18 1993-12-17 Appareil d'entrainement d'acquisition de cibles et procede de formation a l'acquisition de cibles

Country Status (2)

Country Link
GB (2) GB9226389D0 (fr)
WO (1) WO1994015165A1 (fr)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0728503A1 (fr) * 1995-02-21 1996-08-28 Konami Co., Ltd. Machine de jeu de tir
EP0848365A1 (fr) * 1996-12-10 1998-06-17 Bernd Hagenlocher Caisse à sable
US5800263A (en) * 1995-02-21 1998-09-01 Konami Co., Ltd. Game machine
US5816817A (en) * 1995-04-21 1998-10-06 Fats, Inc. Multiple weapon firearms training method utilizing image shape recognition
WO2001094872A3 (fr) * 2000-06-09 2002-04-11 Beamhit Llc Systeme et procede d'instruction laser pour armes a feu facilitant l'entrainement aux armes a feu au moyen de diverses cibles et du retour visuel d'emplacements simules d'impacts de projectiles
WO2001051877A3 (fr) * 2000-01-13 2002-05-02 Beamhit Llc Systeme de jeu et de simulation d'arme a feu et procede permettant de connecter de maniere operationnelle une arme a feu peripherique a un systeme informatique
US6579098B2 (en) 2000-01-13 2003-06-17 Beamhit, Llc Laser transmitter assembly configured for placement within a firing chamber and method of simulating firearm operation
RU2251652C2 (ru) * 2003-08-27 2005-05-10 Закрытое акционерное общество "Дженерал Телеком" Способ определения места попадания пули в мишень на полевом стрельбище
US6935864B2 (en) 2000-01-13 2005-08-30 Beamhit, Llc Firearm laser training system and method employing modified blank cartridges for simulating operation of a firearm
AU783018B2 (en) * 1997-08-25 2005-09-15 Beamhit, L.L.C. Network-linked laser target firearm training system
WO2006019974A3 (fr) * 2004-07-15 2006-05-04 Cubic Corp Point de visee ameliore dans des systemes d'apprentissage simules
US7329127B2 (en) 2001-06-08 2008-02-12 L-3 Communications Corporation Firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control
CN101869765A (zh) * 2009-04-22 2010-10-27 剑扬股份有限公司 使用嵌入式光感测面板的射击训练系统及其方法
EP1825209A4 (fr) * 2004-11-24 2011-11-09 Dynamic Animation Systems Inc Environnement de formation dirigee par un instructeur et interfaces associees
WO2014185764A1 (fr) * 2013-05-17 2014-11-20 4°Bureau De L'etat-Major General Des Forces Armees Royales Simulateur de tir en salle pour armes légères et lance-roquettes antichars
RU2622820C1 (ru) * 2016-08-04 2017-06-20 Акционерное общество "Научно-производственное объединение Русские базовые информационные технологии" Способ идентификации лазерных точек прицеливания
CN109949648A (zh) * 2019-04-30 2019-06-28 上海亿湾特训练设备科技有限公司 一种模拟交战训练系统和模拟交战训练方法
US10480903B2 (en) 2012-04-30 2019-11-19 Trackingpoint, Inc. Rifle scope and method of providing embedded training
EP3769029A4 (fr) * 2018-03-21 2021-12-01 InVeris Training Solutions, Inc. Appareils et procédés de détection d'événement de coup de feu

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10077969B1 (en) 2017-11-28 2018-09-18 Modular High-End Ltd. Firearm training system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0146466A2 (fr) * 1983-12-15 1985-06-26 GIRAVIONS DORAND, Société dite: Dispositif d'entraînement au tir en salle
GB2161251A (en) * 1984-07-07 1986-01-08 Ferranti Plc Weapon training apparatus
US4948371A (en) * 1989-04-25 1990-08-14 The United States Of America As Represented By The United States Department Of Energy System for training and evaluation of security personnel in use of firearms
JPH03134499A (ja) * 1989-10-17 1991-06-07 Japan Radio Co Ltd 照準位置検出方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0146466A2 (fr) * 1983-12-15 1985-06-26 GIRAVIONS DORAND, Société dite: Dispositif d'entraînement au tir en salle
GB2161251A (en) * 1984-07-07 1986-01-08 Ferranti Plc Weapon training apparatus
US4948371A (en) * 1989-04-25 1990-08-14 The United States Of America As Represented By The United States Department Of Energy System for training and evaluation of security personnel in use of firearms
JPH03134499A (ja) * 1989-10-17 1991-06-07 Japan Radio Co Ltd 照準位置検出方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 15, no. 346 (M - 1153) 3 September 1991 (1991-09-03) *

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5800263A (en) * 1995-02-21 1998-09-01 Konami Co., Ltd. Game machine
EP0728503A1 (fr) * 1995-02-21 1996-08-28 Konami Co., Ltd. Machine de jeu de tir
US5816817A (en) * 1995-04-21 1998-10-06 Fats, Inc. Multiple weapon firearms training method utilizing image shape recognition
EP0848365A1 (fr) * 1996-12-10 1998-06-17 Bernd Hagenlocher Caisse à sable
US5954517A (en) * 1996-12-10 1999-09-21 Bernd Hagenlocher Interactive sand box for training
AU783018B2 (en) * 1997-08-25 2005-09-15 Beamhit, L.L.C. Network-linked laser target firearm training system
US6935864B2 (en) 2000-01-13 2005-08-30 Beamhit, Llc Firearm laser training system and method employing modified blank cartridges for simulating operation of a firearm
WO2001051877A3 (fr) * 2000-01-13 2002-05-02 Beamhit Llc Systeme de jeu et de simulation d'arme a feu et procede permettant de connecter de maniere operationnelle une arme a feu peripherique a un systeme informatique
US6579098B2 (en) 2000-01-13 2003-06-17 Beamhit, Llc Laser transmitter assembly configured for placement within a firing chamber and method of simulating firearm operation
US6966775B1 (en) 2000-06-09 2005-11-22 Beamhit, Llc Firearm laser training system and method facilitating firearm training with various targets and visual feedback of simulated projectile impact locations
JP2003536045A (ja) * 2000-06-09 2003-12-02 ビームヒット,リミティド ライアビリティー カンパニー 多種類ターゲットとシミュレートされた発射物衝突位置の視覚フィードバックを有する、小火器訓練を行う為のレーザー小火器訓練システム及び方法
WO2001094872A3 (fr) * 2000-06-09 2002-04-11 Beamhit Llc Systeme et procede d'instruction laser pour armes a feu facilitant l'entrainement aux armes a feu au moyen de diverses cibles et du retour visuel d'emplacements simules d'impacts de projectiles
US7329127B2 (en) 2001-06-08 2008-02-12 L-3 Communications Corporation Firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control
RU2251652C2 (ru) * 2003-08-27 2005-05-10 Закрытое акционерное общество "Дженерал Телеком" Способ определения места попадания пули в мишень на полевом стрельбище
WO2006019974A3 (fr) * 2004-07-15 2006-05-04 Cubic Corp Point de visee ameliore dans des systemes d'apprentissage simules
US7345265B2 (en) 2004-07-15 2008-03-18 Cubic Corporation Enhancement of aimpoint in simulated training systems
US7687751B2 (en) 2004-07-15 2010-03-30 Cubic Corporation Enhancement of aimpoint in simulated training systems
EP1825209A4 (fr) * 2004-11-24 2011-11-09 Dynamic Animation Systems Inc Environnement de formation dirigee par un instructeur et interfaces associees
EP2249117A1 (fr) * 2009-04-22 2010-11-10 Integrated Digital Technologies, Inc. Systèmes d'entraînement au tir utilisant un panneau photosensible incorporé
CN101869765A (zh) * 2009-04-22 2010-10-27 剑扬股份有限公司 使用嵌入式光感测面板的射击训练系统及其方法
CN101869765B (zh) * 2009-04-22 2013-12-11 剑扬股份有限公司 使用嵌入式光感测显示装置的射击训练系统及其方法
US10480903B2 (en) 2012-04-30 2019-11-19 Trackingpoint, Inc. Rifle scope and method of providing embedded training
WO2014185764A1 (fr) * 2013-05-17 2014-11-20 4°Bureau De L'etat-Major General Des Forces Armees Royales Simulateur de tir en salle pour armes légères et lance-roquettes antichars
RU2622820C1 (ru) * 2016-08-04 2017-06-20 Акционерное общество "Научно-производственное объединение Русские базовые информационные технологии" Способ идентификации лазерных точек прицеливания
EP3769029A4 (fr) * 2018-03-21 2021-12-01 InVeris Training Solutions, Inc. Appareils et procédés de détection d'événement de coup de feu
CN109949648A (zh) * 2019-04-30 2019-06-28 上海亿湾特训练设备科技有限公司 一种模拟交战训练系统和模拟交战训练方法

Also Published As

Publication number Publication date
GB2289521B (en) 1996-07-24
GB2289521A (en) 1995-11-22
GB9511352D0 (en) 1995-09-06
GB9226389D0 (en) 1993-02-10

Similar Documents

Publication Publication Date Title
WO1994015165A1 (fr) Target acquisition training apparatus and method of training in target acquisition
KR101222447B1 (ko) 시뮬레이팅된 트레이닝 시스템들에서의 조준점의 강화
US7391887B2 (en) Eye tracking systems
US5589942A (en) Real time three dimensional sensing system
EP0294101B1 (fr) Système de mesure du déplacement angulaire d'un objet
US4034401A (en) Observer-identification of a target or other point of interest in a viewing field
US6201579B1 (en) Virtual studio position sensing system
US4706296A (en) Modularly expansible system for real time processing of a TV display, useful in particular for the acquisition of coordinates of known shape objects
US6778180B2 (en) Video image tracking engine
US4424943A (en) Tracking system
US20020163576A1 (en) Position detector and attitude detector
JP2001508211A (ja) 動き解析システム
KR20090034824A (ko) 애플리케이션 제어 방법, 위치 정보 제공 시스템 및 컴퓨터판독가능 매체
JP2001350577A (ja) 座標入力装置、座標入力方法、座標入力指示具及び記憶媒体、コンピュータプログラム
US6125308A (en) Method of passive determination of projectile miss distance
JPH0124275B2 (fr)
US9052161B2 (en) Perspective tracking system
JP3143319B2 (ja) 電子光学機器
WO1993007437A1 (fr) Dispositif d'entrainement a l'acquisition d'un but
EP3538913B1 (fr) Système de reconnaissance de la position et de l'orientation d'un objet dans un champ de tir d'entraînement
JP2000050145A (ja) 自動追尾装置
JPH1123262A (ja) 三次元位置計測システム
Andersson A low-latency 60 Hz stereo vision system for real-time visual control
US6964607B2 (en) Game system and game method
Soetedjo et al. Implementation of sensor on the gun system using embedded camera for shooting training

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA GB US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA