WO1993007437A1 - Target acquisition training apparatus - Google Patents
- Publication number
- WO1993007437A1 WO1993007437A1 PCT/GB1992/001794 GB9201794W WO9307437A1 WO 1993007437 A1 WO1993007437 A1 WO 1993007437A1 GB 9201794 W GB9201794 W GB 9201794W WO 9307437 A1 WO9307437 A1 WO 9307437A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target
- optical image
- aimpoint
- camera
- weapon
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/003—Simulators for teaching or training purposes for military purposes and tactics
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G3/00—Aiming or laying means
- F41G3/26—Teaching or practice apparatus for gun-aiming or gun-laying
- F41G3/2605—Teaching or practice apparatus for gun-aiming or gun-laying using a view recording device cosighted with the gun
- F41G3/2611—Teaching or practice apparatus for gun-aiming or gun-laying using a view recording device cosighted with the gun coacting with a TV-monitor
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G3/00—Aiming or laying means
- F41G3/26—Teaching or practice apparatus for gun-aiming or gun-laying
- F41G3/2616—Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device
- F41G3/2622—Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device for simulating the firing of a gun or the trajectory of a projectile
- F41G3/2627—Cooperating with a motion picture projector
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/785—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
- G01S3/786—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
- G01S3/7864—T.V. type tracking systems
Definitions
- the present invention relates to target acquisition training apparatus and is particularly although not exclusively concerned with weapon-fire training apparatus in which an optical image including one or more target images is projected on to a screen and in which the optical image is derived from a video signal source, that is to say, is built up from picture elements (pixels) produced by sequential scanning of a field along scanning lines which provide coverage of the field with the scanning of each line producing along the line throughout one dimension of the field a succession of picture elements which are derived from successive signal values of an optical image generating video signal.
- pixels (picture elements)
- the aimpoint accuracy achieved is usually limited by the pixel stability of the projected optical image.
- Substantial pixel jitter can occur along the scanning lines in such systems, so that the accuracy achieved, when measured on a single video line may be too low for effective aimpoint assessment in training.
- the projected optical image can be subject to substantial line jitter.
- target acquisition training apparatus comprising projection means for projecting an optical image including one or more targets, the projection means including optical image generating means which generates the optical image from picture elements derived in succession from successive signal values of an optical image generating video signal, target acquisition means for simulated acquisition of a target by an aimer who brings an aimpoint axis of the target acquisition means on to the target, the target acquisition means having associated therewith a video camera which generates camera video signals representative of a field of view about the aimpoint axis, and computing means to produce from the camera video signals an output enabling an assessment to be made of the target acquisition skill of the aimer, characterised in that the optical image generating means provides for the inclusion in the optical image of a plurality of high visibility discrete zones spaced about the target, targets or each target and that the computing means includes aimpoint assessment output means responsive to the camera video signals representing the high visibility zones to produce aimpoint assessment outputs measured with respect to optical image reference coordinates which are based on the coordinates of centroids of the high visibility zones.
- target acquisition is meant the bringing of an aimpoint of a device such as a weapon or simulated weapon on to a target by an aimer of the device or his attempts thereat and may but not necessarily include the firing or simulated firing of the device at the target.
- the aimpoint assessment output means produces aimpoint assessment outputs measured with respect to optical image reference coordinates which correspond to or are based on the average value of the coordinates of the centroids of the high visibility zones.
- picture elements which form the projected optical image are built up by sequential scanning of a field along scanning lines which provide coverage of the field with the scanning of each line producing along the line throughout one dimension of the field a succession of picture elements which are derived from successive signal values of the optical image generating video signal, wherein the high visibility zones are formed in the optical image on each side of the target, targets or each target in a high visibility bar pattern which extends transverse to the scanning lines, the high visibility discrete zones being formed as white sections of each bar pattern which alternate with black sections thereof and wherein the aimpoint assessment output means is responsive to the camera video signals to select from each scanning line in each white section a picture element or elements having a predetermined characteristic and a corresponding picture element or elements from each of the other lines in the white section to produce a centroid coordinate of the selected picture elements for each white section and to average the centroid coordinates to produce an optical image reference coordinate.
- the aimpoint assessment output means may also be made responsive to the camera video signals to select predetermined lines of picture elements in each white section to produce a line centroid coordinate for the selected lines in each white section and to average the line centroid coordinates of the white sections to produce a further optical image reference coordinate.
- the bar patterns are arranged on each side of the target, targets or each target and are arranged parallel to each other and at a fixed distance apart and wherein the computing means includes range aimpoint assessment output means which is responsive to the camera video signals representing the bar patterns to produce from data representing the fixed distance apart of the two bar patterns a range output representing the range of the aimer from the projected optical image.
- the computing means may also include parallax error correction means responsive to the range output to calculate the parallax error and to produce corrected aimpoint assessment outputs to offset the parallax error.
- the optical image may further include bar patterns parallel to each other, and the computing means may then include cant angle measuring means responsive to the camera generated video signals representing the bar patterns to produce a cant angle output representing the cant angle of the target acquisition means.
- the target acquisition means takes the form of a weapon or simulated weapon which will usually include a sight for use by the aimer for simulated acquisition of a target.
- a method of producing aimpoint assessment output data in target acquisition training comprising the steps of projecting an optical image which includes one or more targets and in which the image is generated from picture elements derived in succession from successive signal values of an optical image generating video signal, providing an aimer with target acquisition means for simulated acquisition of a target by the aimer who brings an aimpoint axis of the target acquisition means on to the target, providing in association with the target acquisition means a video camera which generates camera video signals representative of a field of view about the aimpoint axis of the target acquisition means and computing from the camera video signals an output enabling an assessment to be made of the target acquisition skill of the aimer, characterised by the steps of including in the optical image a plurality of high visibility discrete zones spaced about the target, targets or each target and computing from the camera video signals representing the high visibility zones aimpoint assessment outputs measured with respect to optical image reference coordinates which are based on the coordinates of centroids of the high visibility zones.
- the aimpoint assessment outputs are measured with respect to optical image reference coordinates which correspond to or are based on the average value of the coordinates of the centroids of the high visibility zones.
- Fig 1 is a schematic perspective view of part of a multi- arms multi-lane weapon-fire training range embodying target acquisition training apparatus according to the invention
- Fig 2 is a schematic elevation of an optical image projected on to a projection screen of the range illustrated in Fig 1 and including a plurality of targets for use in target acquisition training of a marksman utilising one of four firing lanes in the range and including a high visibility bar pattern on each side of the targets for use in accordance with the invention
- Fig 3 is a block schematic diagram of the range illustrated in Fig 1 showing sub-assemblies of the target acquisition training apparatus according to the invention and their interconnections
- Fig 4 is a schematic side elevation of a part of a weapon with a camera mounted on it for use in the target acquisition training apparatus illustrated in Figs 1 and 3
- Figs 5 and 6 are schematic plan views of displays provided by the target acquisition training apparatus shown in Figs 1 and 3 for target acquisition assessment monitoring and
- Fig 7 is a schematic diagram of waveforms of signals processed in the target acquisition training apparatus illustrated in Figs 1 to 6.
- the multi-lane multi-arms weapon-fire training range shown is an indoor 4-lane range for use by marksmen, only two of which are schematically illustrated and indicated by reference numerals 11 and 12.
- the marksman 11 occupies an end lane L1 and is provided with a simulated weapon 14 which he directs at target images 15 projected on to a screen 16 by a video projector sub-assembly 17.
- the projector 17 provides full coverage of the screen 16 and presents target images 15 for each of the other lanes.
- the marksman 12 is provided with a simulated weapon 18 for use against target images 15 displayed on the screen 16 in lane L2 by the projector 17. It is to be noted that the target acquisition training of the two marksmen 11 and 12 is provided in respect of weapons of different type and the other lanes (not shown) can if desired be used by other marksmen training in target acquisition using simulated weapons of the same or other types.
- the training range shown in Fig 1 is placed under the control of a controller 19 who is provided with target assessment displays on a monitor screen 20 of a master console sub-assembly 21.
- each of the marksmen is provided with floor box sub-assemblies 22 and 23 which provide on monitors 24 and 25 information as to their own target acquisition performance.
- Each of the simulated weapons 14 and 18 has mounted thereon a video camera 13 which generates camera video signals representative of a field of view about the aimpoint axis of the simulated weapon.
- Each of the floor box sub-assemblies 22 and 23 and each of the other two floor box sub-assemblies includes a microprocessor for processing the camera video signals from the associated weapons to provide aimpoint assessment outputs to the monitors 24 and 25 and to the monitor 20 of the range controller's console sub-assembly 21 in accordance with the invention and as hereinafter to be described.
- the target images 15 as displayed on the screen 16 in each of the lanes are part of an optical image field projected on to the screen by the video projector 17.
- the image field is built up by the sequential scanning by the projector 17 along horizontal scanning lines which provide coverage of the screen 16 with the scanning of each line producing along the line throughout the horizontal dimension of the screen a succession of picture elements which are derived from successive signal values of the video signal of the projector 17.
- Such images are subject to pixel and line jitter as hereinbefore described.
- the target images 15 comprise outline target images 15A, 15B and 15C representing targets at successively increasing ranges, and at which the marksman 11 can aim using his simulated weapon 14 and make a strike at the target by simulated firing of the weapon 14.
- Included in the projected optical image field at the sides of the target images 15 are high visibility bar patterns 27 and 28 which extend vertically on the screen and therefore transverse to the horizontal scanning lines of the projected optical image field.
- Each of the bar patterns 27 and 28 is formed by white sections 29 alternating with black sections 30.
- the white sections 29 form high visibility discrete zones which are formed by a succession of picture elements along successive scanning lines passing through the zone.
- the video camera mounted on the simulated weapon 14 provides camera video output signals representative of a field of view about the aimpoint axis of the weapon 14 and views not only the target images 15, when the marksman brings the weapon sight on to the target images, but also the bar patterns 27 and 28.
- the camera video signals generated by the weapon video camera include not only signals representative of the target images 15 but also signals representative of the high visibility zones provided by the white sections 29 of the bar patterns 27 and 28.
- the camera video signals from the camera mounted on the weapon 14 are transmitted for processing by a microprocessor in floor box sub-assembly 22 which analyzes the camera video signals, detects the relative positions of the bar patterns 27 and 28 within the camera field of view and from stored information of the target positions between the bar patterns, calculates the marksman's aimpoint with respect to the target images.
- the microprocessor detects the location of peaks in the video signals corresponding to the white sections of the bar patterns 27 and 28, averages the peak coordinates per white section to calculate the centroid coordinates of each white section, then averages the values of the coordinates for all the white sections to produce reference coordinates which are then converted to an aimpoint assessment output.
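The two-stage averaging just described (peaks per line, then centroids per white section, then an averaged reference coordinate) can be sketched as follows; the function names and the peak-list data layout are illustrative assumptions, not taken from the patent:

```python
def white_section_centroid(peaks):
    """Average the (x, y) peak coordinates found on the scan lines of one
    white section to give that section's centroid."""
    xs = [p[0] for p in peaks]
    ys = [p[1] for p in peaks]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def bar_pattern_reference(sections):
    """Average the per-section centroids to give one reference coordinate
    for the whole bar pattern."""
    cents = [white_section_centroid(s) for s in sections]
    x = sum(c[0] for c in cents) / len(cents)
    y = sum(c[1] for c in cents) / len(cents)
    return (x, y)
```

With, say, two white sections of two detected peaks each, `bar_pattern_reference([[(10, 1), (10, 2)], [(12, 5), (12, 6)]])` yields the averaged reference coordinate `(11.0, 3.5)`.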
- This averaging process, performed per video frame, dramatically reduces the adverse effects of pixel jitter on aimpoint measurement and thereby significantly increases the measurement accuracy.
- the sub-assemblies of the target acquisition training apparatus shown in Fig 1 are illustrated in block diagram form and include a screen sub-assembly 31 carrying the screen 16 and provided with speakers 32 and 33, the projector sub-assembly 17, weapon sub-assemblies 14 and 18 together with two further weapon sub-assemblies 34 and 35 for use by marksmen in the other two lanes of the range, floor box sub-assemblies 22 and 23 together with further floor box sub-assemblies 36 and 37, the electronics rack sub-assembly 26, the master control sub-assembly 21 and a compressor sub-assembly 38 which provides for the application to the simulated weapons of the weapon sub-assemblies 14, 18, 34 and 35 pneumatic pulses providing recoil of the simulated weapons when fired by the marksmen.
- Pixel jitter can be regarded as contributing noise to the aimpoint measurement process, which shows up, per video frame, as a distribution in the signals representing the optical image reference coordinates. This distribution of signals is then averaged to reduce the noise. For instance, if the microprocessor selects a picture element having a predetermined peak characteristic in, say, a total of 10 consecutive camera video lines per "white" section of the bar pattern and there are 4 "white" sections in each bar pattern, then the processor will calculate 40 x-coordinates per bar pattern. The pixel jitter will produce, in the case of a perfectly vertical bar pattern image, a distribution in the 40 x-coordinates for each bar pattern.
- the numerical averaging process over, for example, the above 40 x-coordinates will result in the pixel jitter induced noise in the x-direction being reduced from, say, ±N pixels, as measured on 1 camera video line, to ±N/√40, as measured over 40 lines in 1 complete video frame.
- the noise in the aimpoint assessment output is thus reduced by the square root of the number of samples in each video frame.
- Averaging the 40 y-coordinates together would not, however, reduce jitter, as the highest and lowest y-coordinates would be averaged and they would still be subject to jitter. Therefore, if there are 4 "white" sections in the example, the middle y-coordinate of each section is obtained (by averaging) and these centre y-coordinates are then averaged. This results in a jitter noise reduction in the y-coordinate of the aimpoint assessment output equal to the square root of twice the number of "white" sections, i.e. the y-coordinate jitter noise is reduced by a factor of 2.8 in this example.
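The square-root noise reduction claimed above can be checked with a small simulation (illustrative only; the Gaussian jitter model and sample counts are assumptions): averaging 40 independently jittered line measurements should shrink the spread by roughly √40 ≈ 6.3.

```python
import random
import statistics

random.seed(1)
N_LINES = 40   # peak samples per bar pattern per video frame
JITTER = 1.0   # single-line jitter, standard deviation in pixels

# Spread of single-line measurements vs. per-frame averages of 40 lines.
single = [random.gauss(0.0, JITTER) for _ in range(10000)]
averaged = [statistics.fmean(random.gauss(0.0, JITTER) for _ in range(N_LINES))
            for _ in range(1000)]

# Ratio of the two spreads; should come out near sqrt(40).
ratio = statistics.stdev(single) / statistics.stdev(averaged)
```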
- the microprocessor never actually measures the pixel and line jitter directly, but rather significantly reduces its adverse effect on aimpoint assessment accuracy, by producing reference coordinates per video frame, averaged over many video line samples of peaks in the white sections.
- a typical optical sight used on a weapon would have a magnification factor of about 4x, and therefore a field of view of about plus and minus 8°.
- the simulated weapon 14 is shown to represent a self loading rifle, with the video camera 13 mounted on the rifle barrel.
- the mounting of the video camera 13 is shown more fully in Fig 4 to which reference is now made.
- the forward end of the rifle barrel 39 carries a combined camera and lens housing 40 supported on the barrel 39 by brackets 41 and 42.
- the housing includes a video camera unit 43 and a lens body 44 which presents to the camera 43 an image of a field of view about the optical axis 46 of the lens body 44, which is arranged to converge with the optical sight of the weapon 14 at a predetermined range of, say, 8 m of the marksman 11 from the screen 16.
- a high-resolution video/graphics projector 17 is used to project an optical video image on to the screen 16.
- the optical video image is derived from a video digitisation and storage board, which allows for superposition of representative target images 15 and bar patterns 27 and 28 on top of the background range image. This also allows for the target images 15 to be positioned accurately at pre-designated points between the bar patterns. The measurement of the latter's position within the weapon camera's field of view thus allows the aimpoint with respect to the target to be determined, since the target images 15 are already at a known location between the bar patterns 27 and 28.
- the projector 17 in the embodiment illustrated produces a 4-lane range video image on a 4 m wide by 2 m high screen, with the marksman positioned at for example 8 m from the screen 16.
- the projected image would, typically, be digitised to 1688 pixels horizontally, by 928 lines vertically.
- one pixel in the projected image would be nominally 2.37 mm wide on the screen, and would subtend an angle of approximately 0.3 mrads at the marksman's position, 8 m from the screen.
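The figures quoted can be reproduced with the arithmetic below, a worked check using the screen width, horizontal resolution and viewing distance given above:

```python
SCREEN_W_MM = 4000.0   # 4 m wide screen
PIXELS_H = 1688        # horizontal digitisation of the projected image
RANGE_MM = 8000.0      # marksman's distance from the screen

# Width of one projected pixel on the screen.
pixel_mm = SCREEN_W_MM / PIXELS_H            # ~2.37 mm
# Angle that pixel subtends at the marksman's position.
pixel_mrad = pixel_mm / RANGE_MM * 1000.0    # ~0.3 mrad
```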
- the required accuracy of good marksmanship training is about 0.1 mrads from point-to-point, or an error of plus or minus 0.05 mrads.
- the projected image must be sampled only over a small area, with a high resolution video camera on the marksman's weapon, to produce the necessary measurement accuracy.
- the lens used must give a horizontal field of view of about 50 mrads.
- the camera must view a portion of the 4 m x 2 m video screen 16, of dimensions 400 mm horizontally x 300 mm vertically, taking account of the visual 4:3 aspect ratio of CCD sensors designed for use in normal TV cameras.
- the required lens focal length is 125 mm.
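The focal length figure follows from similar-triangles optics; the sketch below assumes a nominal 6.4 mm active width for a 1/2-inch format sensor, a value not stated in the text, which gives 128 mm, close to the 125 mm quoted:

```python
SENSOR_W_MM = 6.4      # nominal active width of a 1/2" CCD (assumption)
FIELD_W_MM = 400.0     # required horizontal coverage on the screen
RANGE_MM = 8000.0      # marksman's distance from the screen

# Focal length so the 400 mm field fills the sensor width at 8 m.
focal_mm = SENSOR_W_MM * RANGE_MM / FIELD_W_MM
```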
- the camera is provided with a pixel clock that corresponds exactly to the number of pixels on the CCD array.
- the pixel clock is used to count time along the video lines output from the camera, and thus pixel position.
- the peaks in the video image may thus be located as spatial coordinates on the CCD sensor, based on the timing measurements from the pixel clock. It is possible to increase the measurement accuracy by use of a stable pixel clock that is oversampled compared to the number of pixels on the CCD sensor, or alternatively, to maintain the same angular accuracy but increase the camera's field of view in conjunction with a correspondingly increased pixel clock. Essentially, for a fixed angular measurement accuracy, one can trade off pixel clock frequency against the camera lens focal length/field of view, so that the final system is not restricted to the figures given in the examples.
- a typical example of bar pattern dimensions may be taken to be about 100 mm high on screen, spaced at 200 mm apart, thus allowing plus or minus about 100 mm movement of the camera (for detection of both bar patterns in one image) to either side of the target, or an angle of plus or minus 12.5 mrads at 8 m from the screen.
- To put this in context, a typical standing man target at 100 m would subtend 4 mrads horizontally by 17 mrads vertically, or be displayed as 32 mm (H) by 136 mm (V) in this example, allowing aimpoint to be detected over a considerable region round the target. This is a system requirement, to deal with weapon handling, trigger snatch, poor aiming, sight zeroing, simulated wind effects and the like.
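The standing-man figures follow from the same angular arithmetic; the 0.4 m by 1.7 m target dimensions used below are assumptions chosen to reproduce the quoted numbers:

```python
TARGET_W_M, TARGET_H_M = 0.4, 1.7   # assumed standing-man dimensions
SIM_RANGE_M = 100.0                 # simulated target range
SCREEN_RANGE_M = 8.0                # marksman-to-screen distance

# Angles subtended by the target at the simulated range.
w_mrad = TARGET_W_M / SIM_RANGE_M * 1000.0   # 4 mrad horizontally
h_mrad = TARGET_H_M / SIM_RANGE_M * 1000.0   # 17 mrad vertically
# Displayed size on the screen preserving those angles at 8 m.
w_mm = w_mrad * SCREEN_RANGE_M               # 32 mm (H)
h_mm = h_mrad * SCREEN_RANGE_M               # 136 mm (V)
```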
- the video projector, with 928 lines in 2 m, or 2.16 mm per projected video line, would use 6 or 7 projector lines to represent the same white section of the bar pattern.
- the distinction between the pixels and lines used by the video projector and those used by the weapon-mounted camera is also to be noted.
- the microprocessor using the CCD camera video output, attempts to find the peak coordinates, per camera video line, of each white section 29 of the bar pattern, per video frame. In this example, it could find a maximum of 27 peak coordinates per white section 29, if one peak position is detected on each of the 27 video lines that the white section subtends vertically on the CCD sensor.
- the centroid of each white section 29 is computed, and the 4 centroids are averaged, to give a single averaged (x,y) coordinate per bar pattern.
- the sensor used must be such as to ensure adequate signal to noise ratio for the external processing circuitry.
- a 1/2" format interline transfer CCD sensor array with a peak response in the visible region, preferably at about 550 nm, near the video projector's peak output wavelength; a minimum of 500 pixels horizontally, should suffice, and the sensor would need to be of a type having an integral microlens array on its surface, for maximum sensitivity.
- a lens aperture of F/4 to F/2 should provide sufficient light throughput.
- the camera processing boards would probably need to be set up with a maximum gain of +24 dB, to provide sufficient signal, and an automatic gain control circuit to cope with variations in projector output levels.
- Each video frame output at 40 ms intervals may be analysed in hardware, in conjunction with a pixel clock, to locate the white section peak coordinates per video line. These coordinates may then be transferred to another processor for data validation, and then averaging to give aimpoint assessment data.
- the aimpoint assessment data is then transmitted to the computer monitor 24 beside the marksman 11, and shows his aimpoint tracking as illustrated in Fig 5, plus where appropriate bullet strike point when ballistics, wind and cant effects are considered, at the correct position, on or near the target at which he is aiming.
- the lay-out of the bar patterns inherently gives information on range changes of the weapon with respect to the screen, as required for parallax error correction, and on cant angle of the weapon.
- measurements based on a single pixel in the projected video image cannot contain enough information to enable these parameters to be determined, and additional sensors would be needed in this case, notwithstanding that the measurement accuracy would in any event be poor due to pixel jitter.
- the convergence angle is approx 10 mrads.
- if the weapon is moved 1 m further back to 9 m, the sight and camera now converge at 1 m in front of the screen, which means that the camera aimpoint on the screen is 10 mm high with respect to the sight's aimpoint.
- This 10 mm error at 8 m is equivalent to an aimpoint sensing error of 1.25 mrads, or for example 125 mm at 100 m; it is to be remembered that the desired system accuracy is plus or minus 0.05 mrads, so the above error must be detected and corrected.
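The parallax numbers can be checked as follows, a worked sketch using the 10 mrad convergence angle and the ranges given above:

```python
CONV_MRAD = 10.0      # sight/camera convergence angle
ZERO_RANGE_M = 8.0    # range at which the axes converge on the screen
range_m = 9.0         # marksman has stepped 1 m further back

# The camera axis crosses the sight line 1 m in front of the screen,
# so it is displaced on the screen by the convergence angle times 1 m.
offset_mm = CONV_MRAD * (range_m - ZERO_RANGE_M)        # 10 mm on screen
# Expressed as an angular aimpoint sensing error at the 8 m zero range.
error_mrad = offset_mm / (ZERO_RANGE_M * 1000.0) * 1000.0   # 1.25 mrad
```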
- the external video microprocessor is housed in the floor box sub-assembly 22 at the marksman's firing position along with the feedback monitor 24.
- the video processor PCB has a low pass filter input with a cut-off frequency of 500 kHz. This corresponds to a sinusoid of period slightly less than twice the width of the video pulse from the camera caused by a white section of a bar pattern. This filtered pulse then passes through an automatic gain control stage and has automatic clamping of the video black level. The next section of the circuit, an edge detector, splits the signal, passes one copy through a low pass filter and adds a voltage offset.
- the waveforms are then compared to the original using a comparator and if the original signal is above the filtered offset signal, a TTL pulse is output to the next section.
- the waveforms are illustrated in Figure 7.
- the TTL pulse is then input into a digital IC along with a pixel clock and camera line sync, so that x- and y-coordinates for pulses are output. These coordinates are logged into memory and a microcontroller performs the x and y averaging as explained earlier.
- the calculated centres of each bar pattern are then passed via an RS232 link for computer calculation of cant angle, parallax correction and aimpoint in real time.
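The comparator stage described above can be mimicked in software. The sketch below is an illustrative digital analogue of the analogue circuit (compare the raw signal against a smoothed, offset copy of itself); the window and offset values are arbitrary assumptions:

```python
def detect_pulses(signal, window=5, offset=0.1):
    """Return a 0/1 pulse train: 1 wherever the raw sample exceeds a
    moving-average (low pass filtered) copy of the signal plus a fixed
    voltage offset -- the comparator condition in the circuit."""
    pulses = []
    for i in range(len(signal)):
        seg = signal[max(0, i - window):i + window + 1]
        smoothed = sum(seg) / len(seg)
        pulses.append(1 if signal[i] > smoothed + offset else 0)
    return pulses
```

Running this on a synthetic camera line with one white-section pulse, e.g. `detect_pulses([0.0] * 10 + [1.0] * 3 + [0.0] * 10)`, emits 1s only across the bright section, which is where the pixel clock coordinates would be latched.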
- the aimpoint assessment output may in these circumstances be produced only when the trigger of the weapon is pulled, so as to hide the high visibility bar patterns from the marksman, and only to flash them on to the projected video image on trigger pull, for a couple of frames.
- an alternative treatment of the bar patterns involves making them a feature of the displayed scene, for example as lane marker posts, as usually seen on a traditional marksmanship range. They are then displayed continuously, and so the weapon aimpoint is measured continuously, both before and after trigger pull. It is this sequence of aimpoint measurements which gives vital information on the weapon handling capabilities of the marksman, both before and after trigger pull.
- while the target acquisition training apparatus may be used both for high accuracy range marksmanship training and for lower accuracy combat practice skills on rapidly moving targets, it is more advantageous to use it to provide the high accuracy range-type marksmanship training as hereinbefore described.
- the invention hereinbefore described provides a system, primarily for marksmanship simulation and training on range type targets, using a weapon-mounted camera coupled to an external processor which interrogates the scene projected on to a large screen by a video/graphics projector, displaying multi-lane range type marksmanship targets for multiple marksmen, and calculates the relative aimpoint for each marksman, with respect to the target displayed in his lane, to an accuracy significantly greater than that possible using "light pen" - type measurements on the large screen projected video.
- apparatus achieves the required accuracy on a large screen display, of lower video resolution than the measurement accuracy, thus allowing a plurality of marksmen to engage separate targets simultaneously, on the same large screen display.
- the result is a much more realistic simulation of a marksmanship range for weaponry training.
- An essential feature of the system is the provision of high visibility discrete zones in bar patterns in the projected video image. These are conveniently disguised as range type lane marker posts, overlaid on the background range image, with the targets displayed at known relative locations between the lane marker posts.
- the processor analyzing the video signal from the weapon-mounted camera detects the relative positions of the high visibility zones within the camera field of view, and from stored information of target position relative to the zones, calculates the marksman's aimpoint with respect to the target.
- the bar patterns therefore, form the basis of the aimpoint measurement method employed, and in addition, when disguised as described, are highly acceptable to the user.
- the external processor analyzing the video signal from the weapon-mounted camera detects the location of the peaks in the video signal, corresponding to the white sections of the bar patterns.
- the peak positions per white section are then averaged to calculate the centroid coordinates of each white section, which are then averaged to produce reference coordinates for producing an aimpoint assessment output.
- This averaging process, performed per video frame, dramatically reduces the effect of random noise sources on the aimpoint measurement, and thereby significantly increases the measurement accuracy.
- the two main noise sources which would cause aimpoint errors, but are now averaged out to a much lower level are pixel jitter oscillations (horizontal and vertical) in the video graphics projected image, and random noise in the CCD camera used to view the projected image.
- The bar patterns are, furthermore, necessarily spatially extended within the camera's field of view, which means that their image size changes measurably as the weapon-to-screen range changes.
- Range changes can be measured from the processing results, and this allows parallax errors (arising from the fact that the weapon sight and camera only converge on the screen at one range and are thus misconverged elsewhere) to be computed and corrected for; this capability arises directly from the implementation of the bar patterns described and requires no external range-measuring devices.
- The vertical extension of the bar patterns, and the fact that centroids are calculated one for each white section (for example, four white sections per bar pattern), also allow the weapon cant angle from the vertical to be determined by the processor, without additional external measurement devices.
- If the first bar pattern's reference coordinates are (x₁, y₁) and the other bar pattern's corresponding coordinates are (x₂, y₂), the cant angle is tan⁻¹((y₂ − y₁) / (x₂ − x₁)).
- Cant angle from the vertical can conveniently be displayed on the marksman's monitor as a clock face, as illustrated in Fig. 6 at the top right-hand corner of the display, where it is indicated by the reference numeral 48, a strike of the target being indicated by the reference numeral 49.
- Where a dummy weapon is used, the video camera mounted on it may conveniently be in the form of a simulated barrel of the gun.
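The centroid-averaging and cant-angle steps described above can be sketched as follows. This is an illustrative reconstruction in Python, not the patent's implementation: the per-frame peak detection is not shown, and all function names and data shapes are assumptions.

```python
import math

def section_centroid(peaks):
    """Average the (x, y) peak positions detected within one white
    section of a bar pattern to obtain that section's centroid."""
    xs = [x for x, _ in peaks]
    ys = [y for _, y in peaks]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def pattern_reference(sections):
    """Average the centroids of a bar pattern's white sections
    (e.g. four per pattern) into one reference coordinate, as used
    for the aimpoint assessment output."""
    centroids = [section_centroid(s) for s in sections]
    rx = sum(x for x, _ in centroids) / len(centroids)
    ry = sum(y for _, y in centroids) / len(centroids)
    return (rx, ry)

def cant_angle_deg(ref1, ref2):
    """Cant angle from the vertical, per the stated formula
    tan⁻¹((y2 − y1) / (x2 − x1)) applied to the two bar-pattern
    reference coordinates; atan2 avoids division by zero when the
    horizontal separation vanishes."""
    (x1, y1), (x2, y2) = ref1, ref2
    return math.degrees(math.atan2(y2 - y1, x2 - x1))
```

When the two bar-pattern references lie on a horizontal line, the cant angle is zero; any vertical offset between them tilts that line and yields the weapon's cant directly, which is why no external inclinometer is needed.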
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Business, Economics & Management (AREA)
- Electromagnetism (AREA)
- Remote Sensing (AREA)
- Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Apparatus for providing aimpoint assessment results in target acquisition training comprises a projector (17) which projects onto a screen (16) an optical image, generated from picture elements derived from a video signal source, that includes a target (15), and a weapon or dummy weapon (14) handled by a marksman (11) for acquisition of the target (15). The marksman aligns an aimpoint axis of the weapon (14) with the target (15), and a video camera (13) mounted on the weapon (14) generates video signals representing an optical field around the aimpoint axis of the weapon (14). The optical image projected by the projector (17) includes high-visibility bar patterns (27, 28) extending transversely to the scanning lines of the projected optical image on either side of the target and presenting a plurality of discrete high-visibility zones, which appear as white portions of each bar pattern (27, 28) alternating with black portions. A microprocessor calculates, from the video signals representing the high-visibility zones, the aimpoint assessment results with respect to reference coordinates based on the centroid coordinates of the high-visibility zones.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB9120930.4 | 1991-10-02 | ||
| GB919120930A GB9120930D0 (en) | 1991-10-02 | 1991-10-02 | Target acquisition training apparatus |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO1993007437A1 (fr) | 1993-04-15 |
Family
ID=10702312
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/GB1992/001794 Ceased WO1993007437A1 (fr) | 1991-10-02 | 1992-09-30 | Target acquisition training apparatus |
Country Status (2)
| Country | Link |
|---|---|
| GB (2) | GB9120930D0 (fr) |
| WO (1) | WO1993007437A1 (fr) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP0728503A1 (fr) * | 1995-02-21 | 1996-08-28 | Konami Co., Ltd. | Shooting game machine |
| US5800263A (en) * | 1995-02-21 | 1998-09-01 | Konami Co., Ltd. | Game machine |
Families Citing this family (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB2268252A (en) * | 1992-06-30 | 1994-01-05 | British Aerospace Simulation L | Weapon training |
| FR2772908B1 (fr) | 1997-12-24 | 2000-02-18 | Aerospatiale | Missile firing simulator with immersion of the shooter in a virtual space |
| RU2179698C2 (ru) * | 1999-08-24 | 2002-02-20 | Государственное унитарное предприятие Пензенское конструкторское бюро моделирования | Trainer for gunner-operators of rocket launch installations or for firing guns and machine guns |
| RU2205346C2 (ru) * | 2000-09-25 | 2003-05-27 | Гущин Николай Иванович | Anti-aircraft trainer |
| US7329127B2 (en) * | 2001-06-08 | 2008-02-12 | L-3 Communications Corporation | Firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control |
| RU2211433C1 (ru) * | 2002-07-17 | 2003-08-27 | Институт прикладной механики УрО РАН | Optoelectronic small-arms trainer for collective combat |
| RU2310150C2 (ru) * | 2002-12-15 | 2007-11-10 | Серпуховский военный институт ракетных войск (СВИ РВ) | Shooting trainer |
| RU2251652C2 (ru) * | 2003-08-27 | 2005-05-10 | Закрытое акционерное общество "Дженерал Телеком" | Method for determining the point of impact of a bullet on a target at a field shooting range |
| RU2247296C1 (ru) * | 2003-11-17 | 2005-02-27 | Государственное унитарное предприятие "Конструкторское бюро приборостроения" | Trainer for teaching gunners |
| UA92462C2 (ru) | 2004-07-15 | 2010-11-10 | Кьюбик Корпорейшн | Method for predicting the position of an aimpoint in a simulated environment (variants), system and computer system for implementing it, and method for improved tracking of aimpoints on targets in a simulated environment |
| US8106884B2 (en) | 2006-03-20 | 2012-01-31 | Samsung Electronics Co., Ltd. | Pointing input device, method, and system using image pattern |
| RU2397423C1 (ru) * | 2009-03-20 | 2010-08-20 | Борис Иванович Кудряков | Shooting trainer |
| US10077969B1 (en) | 2017-11-28 | 2018-09-18 | Modular High-End Ltd. | Firearm training system |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US3945133A (en) * | 1975-06-20 | 1976-03-23 | The United States Of America As Represented By The Secretary Of The Navy | Weapons training simulator utilizing polarized light |
| GB2046410A (en) * | 1979-04-04 | 1980-11-12 | Detras Training Aids Ltd | Target apparatus |
| US4290757A (en) * | 1980-06-09 | 1981-09-22 | The United States Of America As Represented By The Secretary Of The Navy | Burst on target simulation device for training with rockets |
| GB2152645A (en) * | 1984-01-04 | 1985-08-07 | Hendry Electronics Ltd D | Target trainer |
| GB2160298A (en) * | 1984-06-14 | 1985-12-18 | Ferranti Plc | Weapon aim-training apparatus |
| US4824374A (en) * | 1986-08-04 | 1989-04-25 | Hendry Dennis J | Target trainer |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB2161251B (en) * | 1984-07-07 | 1987-11-25 | Ferranti Plc | Weapon training apparatus |
- 1991
  - 1991-10-02 GB GB919120930A patent/GB9120930D0/en active Pending
- 1992
  - 1992-09-30 GB GB9220650A patent/GB2260188A/en not_active Withdrawn
  - 1992-09-30 WO PCT/GB1992/001794 patent/WO1993007437A1/fr not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| GB9120930D0 (en) | 1991-11-27 |
| GB9220650D0 (en) | 1992-11-11 |
| GB2260188A (en) | 1993-04-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP0852961B1 (fr) | Shooting video game apparatus | |
| JP3748271B2 (ja) | Shooting game device | |
| WO1993007437A1 (fr) | Target acquisition training apparatus | |
| US6540607B2 (en) | Video game position and orientation detection system | |
| US4619616A (en) | Weapon aim-training apparatus | |
| EP0728503B1 (fr) | Shooting game machine | |
| US5208417A (en) | Method and system for aiming a small caliber weapon | |
| US4164081A (en) | Remote target hit monitoring system | |
| KR101603281B1 (ko) | Laser shooting training system and method | |
| US8610778B2 (en) | Method and device for use in calibration of a projector image display towards a display screen, and a display screen for such use | |
| US4923402A (en) | Marksmanship expert trainer | |
| US6997716B2 (en) | Continuous aimpoint tracking system | |
| US5991043A (en) | Impact position marker for ordinary or simulated shooting | |
| US20070190495A1 (en) | Sensing device for firearm laser training system and method of simulating firearm operation with various training scenarios | |
| US9910507B2 (en) | Image display apparatus and pointing method for same | |
| CN110836616B (zh) | Image correction detection method for accurately locating the point of impact in laser-simulated shooting | |
| US9052161B2 (en) | Perspective tracking system | |
| US6663391B1 (en) | Spotlighted position detection system and simulator | |
| US20140375562A1 (en) | System and Process for Human-Computer Interaction Using a Ballistic Projectile as an Input Indicator | |
| CN117046085A (zh) | 模拟射击方法和模拟射击系统 | |
| KR100751503B1 (ko) | Laser gun shooting practice apparatus | |
| RU2627019C2 (ru) | Methods for determining the weapon aiming point in an image of a background-target environment in shooting simulators, and a device for implementing them | |
| EP1194733A1 (fr) | Method and arrangement for measuring a jump distance | |
| JPS6232987A (ja) | Laser gun game device and hit detection method in a laser gun game device | |
| US6964607B2 (en) | Game system and game method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A1 Designated state(s): CA US |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL SE |
|
| 122 | Ep: pct application non-entry in european phase | ||
| NENP | Non-entry into the national phase |
Ref country code: CA |