CN116499303B - Shooting simulation training method and device - Google Patents
- Publication number
- CN116499303B CN116499303B CN202310637097.6A CN202310637097A CN116499303B CN 116499303 B CN116499303 B CN 116499303B CN 202310637097 A CN202310637097 A CN 202310637097A CN 116499303 B CN116499303 B CN 116499303B
- Authority
- CN
- China
- Prior art keywords
- information
- shooting
- training
- simulated
- firearm
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41A—FUNCTIONAL FEATURES OR DETAILS COMMON TO BOTH SMALLARMS AND ORDNANCE, e.g. CANNONS; MOUNTINGS FOR SMALLARMS OR ORDNANCE
- F41A33/00—Adaptations for training; Gun simulators
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)
Abstract
Embodiments of the invention relate to the technical field of simulated training and disclose a shooting simulation training method. The method comprises: sending a firearm control instruction to a corresponding simulated firearm so that the simulated firearm is controlled into a working state, the working state being one in which both the pneumatic box and the simulated firearm are switched on; projecting a configured training scene onto a corresponding display screen through a projection device; and acquiring, through a camera laser identification component, shooting parameter information of the simulated firearm during the simulated training, the shooting parameter information comprising movement information, namely the laser track movement of the simulated firearm within a preset time before the shot is fired; and determining the gun-holding stability of the trained personnel according to the movement information. By using the camera identification component to acquire shooting parameters during simulated training, the method accurately captures multiple items of state information about the trained personnel while shooting, helping them improve their training results.
Description
Technical Field
The invention relates to the technical field of simulated training, in particular to a shooting simulated training method and device.
Background
At present, the key index for evaluating a simulated firearm is its simulation fidelity: the more realistic the effect when the simulated firearm is triggered, the better the training effect obtained when using it. The simulation fidelity mainly covers the feel of holding the device, the sound effect of the firing process, the recoil effect, and the like.
Traditional simulated shooting equipment is mostly designed around electromagnetic trainers, in which the manual and fully automatic firing modes are implemented in software; the simulation quality of this switching control is relatively poor, making such trainers hard to apply in more demanding professional scenarios. Moreover, existing schemes can only present results, for example only target parameters such as the ring score of a shot, and offer no further parameters for a more comprehensive evaluation of the trained personnel, so no targeted suggestions for deeper improvement can be given. Designing a solution that provides multidimensional training parameters to enhance users' practical shooting ability is therefore a technical problem to be solved by those skilled in the art.
Disclosure of Invention
To address the above shortcomings, embodiments of the invention disclose a shooting simulation training method that realizes more accurate simulated training for users and provides training information of more dimensions to help improve training results.
The first aspect of the embodiment of the invention discloses a shooting simulation training method, which comprises the following steps:
sending a firearm control instruction to a corresponding simulated firearm so that the simulated firearm is controlled into a working state, wherein the working state is one in which both the pneumatic box and the simulated firearm are switched on;
projecting the configured training scene onto a corresponding display screen through a projection device;
acquiring, through a camera laser identification component, shooting parameter information of the simulated firearm during the simulated training, wherein the shooting parameter information comprises movement information, the movement information being the laser track movement of the simulated firearm within a preset time before the shot is fired;
and determining the gun-holding stability of the trained personnel according to the movement information.
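The four steps can be sketched as a minimal control loop. All names, the instruction payload and the 10-pixel spread threshold are illustrative assumptions, not the patent's actual interfaces:

```python
# Minimal sketch of the four claimed steps (all names are illustrative).

def run_training_session(send_instruction, project_scene, capture_track):
    # Step 1: put the simulated firearm (and its pneumatic box) into the working state.
    send_instruction({"power": "on", "pneumatic_box": "on"})

    # Step 2: project the configured training scene onto the display screen.
    project_scene("fixed_target")

    # Step 3: collect the laser track within the preset time before firing.
    track = capture_track(preset_seconds=2.0)

    # Step 4: derive gun-holding stability from the spread of the track.
    xs = [p[0] for p in track]
    ys = [p[1] for p in track]
    spread = max(max(xs) - min(xs), max(ys) - min(ys))
    return "stable" if spread < 10.0 else "unstable"


# Tiny stand-ins so the sketch is runnable:
log = []
result = run_training_session(
    send_instruction=log.append,
    project_scene=log.append,
    capture_track=lambda preset_seconds: [(100, 100), (102, 101), (99, 103)],
)
```

The stand-in track stays within a few pixels, so the session is graded "stable"; the real stability determination is detailed in step S104 of the embodiments.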
In the first aspect of the embodiments of the present invention, the shooting parameter information further comprises shooting point information, and after the shooting parameter information of the simulated firearm during the training is acquired through the camera laser identification component, the method further comprises:
determining the target practice result of the corresponding trained personnel based on the shooting point information;
and determining a comprehensive training result for the trained personnel according to the target practice result and the gun-holding stability.
In the first aspect of the embodiments of the present invention, the movement information is the movement track of the laser on the target, and the determining of the gun-holding stability of the trained personnel according to the movement information comprises:
determining a gun holding mobile position of the trained personnel based on the mobile information;
determining the aiming center point of the trained personnel based on the training scene, and determining movement interval information based on the aiming center point;
and matching the gun-holding movement position against the movement interval information to determine the gun-holding stability of the trained personnel.
As an optional implementation manner, in the first aspect of the embodiment of the present invention, before the sending of the firearm control command, the method further includes:
receiving the training subjects, training conditions and trained personnel configured at an instructor terminal;
and issuing a corresponding training task, generating a training task scene based on the training task, and associating the training task scene with the training terminals of the corresponding trained personnel.
As an optional implementation, in the first aspect of the embodiments of the present invention, the firearm control instruction comprises a frame header, a frame tail, a safety state, a bolt state, a magazine state, a round count, a trigger state, a round-count mode, a trigger firing count, a trigger linearity value, and a firearm model;
The shooting simulation training method comprises the following steps:
and in the training terminal, the firearm control instruction is updated once for every simulated shot fired.
As an optional implementation, in the first aspect of the embodiments of the present invention, the controlling of the simulated firearm into the working state comprises:
acquiring the recoil spring force information, recoil stroke information, static air pressure information and piston inner-cavity volume information associated with the corresponding simulated firearm;
inputting the recoil spring force information, the recoil stroke information, the static air pressure information and the piston inner-cavity volume information as constants into simulation software associated with the simulated firearm;
inputting different air valve opening and closing times to simulate the motion of the simulated firearm, and collecting the recoil time data and counter-recoil time data of the simulated firearm as the firing time of a simulated shot;
and controlling the simulated firearm into the working state based on the firing time and the laser control information.
As an optional implementation manner, in the first aspect of the embodiment of the present invention, before the acquiring, by the camera laser identification component, shooting parameter information of the simulated firearm in the simulated training process, the method further includes:
in a calibration state, acquiring initial position information of the laser spot of the simulated firearm while aiming;
performing offset calibration on the initial position information to obtain calibrated position information;
storing the calibrated position information in association with the user information of the current simulated firearm;
and receiving the spot identification area, spot identification perimeter and movement trajectory settings configured by the user for the camera laser identification component, wherein the spot identification area and the spot identification perimeter are configured as interval thresholds;
The acquiring, through the camera laser identification component, of the shooting parameter information of the simulated firearm during the simulated training comprises:
analyzing the camera's real-time picture by calling a graphics library and capturing laser point positions;
and mapping the detected laser point information onto a display interface, wherein the display interface comprises a target image, laser point count information and laser point track information.
A second aspect of an embodiment of the present invention discloses a shooting simulation training system, including:
the transmission module is used for sending a firearm control instruction to a corresponding simulated firearm so that the simulated firearm is controlled into a working state, wherein the working state is one in which both the pneumatic box and the simulated firearm are switched on;
the projection module is used for projecting the set training scene to the corresponding display screen through the projection equipment;
the parameter acquisition module is used for acquiring, through the camera laser identification component, shooting parameter information of the simulated firearm during the simulated training, wherein the shooting parameter information comprises movement information, the movement information being the laser track movement of the simulated firearm within a preset time before the shot is fired;
and the stability determining module is used for determining the gun holding stability of the trained personnel according to the movement information.
A third aspect of the embodiments of the invention discloses an electronic device comprising a memory storing executable program code and a processor coupled with the memory, wherein the processor calls the executable program code stored in the memory to execute the shooting simulation training method disclosed in the first aspect of the embodiments of the invention.
A fourth aspect of the embodiment of the present invention discloses a computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute the shooting simulation training method disclosed in the first aspect of the embodiment of the present invention.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
with the shooting simulation training method of the embodiments, a camera identification component is used to acquire shooting parameters during simulated training, so that multiple items of state information about the trained personnel while shooting are acquired accurately; presenting this state information provides all-round shooting parameters and helps the trained personnel improve their training results.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a shooting simulation training method disclosed in an embodiment of the present invention;
FIG. 2 is a schematic diagram of a process for determining a targeting training result according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of a gun-holding stability analysis disclosed in an embodiment of the present invention;
FIG. 4 is a schematic flow diagram of training scenario distribution disclosed in an embodiment of the present invention;
FIG. 5 is a schematic flow chart of controlling the working state of a simulated firearm according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart of camera calibration as disclosed in an embodiment of the present invention;
FIG. 7 is a schematic diagram of a movement track display disclosed in an embodiment of the present invention;
FIG. 8 is a schematic diagram of an aiming stability analysis curve disclosed in an embodiment of the present invention;
FIG. 9 is a further schematic illustration of an aiming stability analysis curve disclosed in an embodiment of the present invention;
FIG. 10 is a schematic flow chart of simulated training of a rifle as disclosed in an embodiment of the invention;
Fig. 11 is a schematic flow chart of simulated training of a pistol according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of a camera parameter adjustment page disclosed in an embodiment of the present invention;
FIG. 13 is a schematic view showing a pressing stroke curve according to an embodiment of the present invention;
FIG. 14 is a schematic illustration of a simulated trigger real-time status display as disclosed in an embodiment of the present invention;
FIG. 15 is a schematic view of a shooting simulation training apparatus according to an embodiment of the present invention;
Fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that the terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present invention are used for distinguishing between different objects and not necessarily for describing a particular sequential or chronological order. The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
Traditional simulated shooting equipment is mostly designed around electromagnetic trainers, in which the manual and fully automatic firing modes are implemented in software; the simulation quality of this switching control is relatively poor, making such trainers hard to apply in more demanding professional scenarios. Moreover, existing schemes can only present results, for example only target parameters such as the ring score of a shot, and offer no further parameters for a more comprehensive evaluation of the trained personnel, so no targeted suggestions for deeper improvement can be given. Based on this, embodiments of the invention disclose a shooting simulation training method and device, an electronic device and a storage medium, which use a camera identification component to acquire shooting parameters during simulated training so as to accurately obtain multiple items of state information about the trained personnel while shooting; presenting this state information provides all-round shooting parameters and helps the trained personnel improve their training results.
Example 1
Referring to fig. 1, fig. 1 is a schematic flow chart of a shooting simulation training method according to an embodiment of the invention. The subject executing the method described in the embodiments of the invention is composed of software and/or hardware; it can receive related information in a wired and/or wireless manner and can send certain instructions, and it may also have certain processing and storage functions. The executing subject may control a plurality of devices: it may be a remote physical server or cloud server together with related software, or a local host or server together with related software that performs related operations for devices deployed somewhere. In some scenarios, it may also control a plurality of storage devices, which may be co-located with the devices or located elsewhere.
As shown in fig. 1, the shooting simulation training method comprises the following steps:
S101, sending a firearm control instruction to a corresponding simulated firearm so that the simulated firearm is controlled into a working state, wherein the working state is one in which both the pneumatic box and the simulated firearm are switched on;
When preparation begins, the corresponding training terminal needs to be put into the working state through the master control end, that is, the simulated firearm is controlled into the working state for the subsequent shooting operations.
More preferably, fig. 4 is a schematic flow chart of training scenario distribution disclosed in an embodiment of the present invention. As shown in fig. 4, before the firearm control instruction is sent, the method further comprises:
S100a, receiving the training subjects, training conditions and trained personnel configured at an instructor terminal;
and S100b, issuing a corresponding training task, generating a training task scene based on the training task, and carrying out data association on the training task scene and a training terminal of a corresponding trained person.
Before training starts, various parameters need to be configured at the instructor end, such as the number of trained personnel and the training subjects; the trained personnel can carry out the subsequent simulated training only after the instructor-end configuration is completed. Different training flows are adopted for different training subjects, as shown in fig. 10 and fig. 11.
More preferably, fig. 5 is a schematic flow chart of controlling the working state of a simulated firearm according to an embodiment of the present invention. As shown in fig. 5, the controlling of the simulated firearm into the working state comprises:
S1011, acquiring the recoil spring force information, recoil stroke information, static air pressure information and piston inner-cavity volume information associated with the corresponding simulated firearm;
S1012, inputting the recoil spring force information, the recoil stroke information, the static air pressure information and the piston inner-cavity volume information as constants into simulation software associated with the simulated firearm;
S1013, inputting different air valve opening and closing times to simulate the motion of the simulated firearm, and collecting the recoil time data and counter-recoil time data of the simulated firearm as the firing time of a simulated shot;
And S1014, controlling the simulated firearm into the working state based on the firing time and the laser control information.
Realistic simulation is a key problem to be solved in implementation. For small arms, an indistinguishable simulation of the weapon's working behavior mainly requires solving two problems: indistinguishable simulation of the weapon's recoil, and indistinguishable simulation of the firing rate on the premise that the recoil stroke remains consistent with that of a real gun.
In implementation, recoil can be simulated faithfully by reasonably designing the piston structure, the air supply pressure and the air supply time. Faithful simulation of the firing rate is the key technical difficulty of the embodiments of the invention, and it determines whether the working behavior of the simulated weapon matches that of the real firearm. To give the trained subject an operating experience essentially consistent with a real gun, the simulated weapon must, besides simulating recoil and counter-recoil, also reproduce in hardware a simulated firing rate essentially consistent with the theoretical firing rate of the real gun.
In the embodiments of the invention, realistic firing-rate simulation is solved mainly with motion simulation software. According to the design of the simulated weapon, parameters such as the recoil spring force, the designed recoil stroke of the simulated bolt, the static air pressure and the piston inner-cavity volume are input into the simulation software as constants. Different air valve opening and closing times (air supply intervals) are then input so that the software runs motion simulations and collects the recoil and counter-recoil data; one complete cycle (recoil plus counter-recoil) is the firing time of one simulated shot. From the technical parameters of the real gun, the time taken to complete one shot during continuous fire under ideal conditions is obtained, and the air valve opening/closing design consistent with that time is selected from the simulation test data;
this design is input into the pneumatic control module of the shooting-capability training simulation support system, and an actual test is carried out with the simulated weapon. Actual data are collected, and the air valve opening/closing design is revised accordingly until, in actual simulated shooting, the hardware's simulated firing rate is essentially consistent with that of the real gun. In this way, the firing state can be simulated more realistically.
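The valve-timing search described above can be sketched as a parameter sweep. Every number here is an assumption (the firing rate, and the toy cycle-time model standing in for the motion-simulation software); only the selection logic mirrors the procedure in the text:

```python
# Hypothetical sweep over air-valve open/close times.  The cycle-time model
# is a toy stand-in for the motion-simulation software; real values would
# come from that simulation and from the real gun's technical parameters.

REAL_GUN_RATE_RPM = 650.0                      # assumed theoretical firing rate
TARGET_CYCLE_S = 60.0 / REAL_GUN_RATE_RPM      # time for one recoil + counter-recoil

def simulated_cycle_time(valve_open_ms):
    # Toy model: a longer air pulse drives the piston back faster (shorter
    # recoil time), while the counter-recoil time is spring-dominated.
    recoil = 0.030 + 0.5 / valve_open_ms       # seconds
    counter_recoil = 0.045                     # seconds
    return recoil + counter_recoil

def best_valve_timing(candidates_ms):
    # Pick the opening time whose simulated cycle is closest to the real gun's.
    return min(candidates_ms,
               key=lambda t: abs(simulated_cycle_time(t) - TARGET_CYCLE_S))

chosen = best_valve_timing([5, 10, 20, 30, 40, 50])
```

In practice the chosen timing would then be loaded into the pneumatic control module and refined against measurements from the physical simulated weapon, as the text describes.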
S102, projecting a set training scene to a corresponding display screen through projection equipment;
In implementation, because scenes must be simulated, the training scene is projected onto the corresponding display screen through the projection device; for example, moving human-shaped targets or fixed targets can be placed to simulate a specific scene.
S103, acquiring, through a camera laser identification component, shooting parameter information of the simulated firearm during the simulated training, wherein the shooting parameter information comprises movement information, the movement information being the laser track movement of the simulated firearm within a preset time before the shot is fired;
In actual shooting training, a critical time window is the period just before the shot. If the trained personnel are in a stable state before the bullet is fired, that is, if they can keep the laser within a fixed range around the shooting point, the accuracy of the final shot is greatly improved. Therefore, in implementation, the invention determines the user's gun-holding stability by detecting the track within the preset time.
More preferably, fig. 6 is a schematic flow chart of camera calibration disclosed in the embodiment of the present invention, and as shown in fig. 6, before the shooting parameter information of the simulated firearm in the simulated training process is obtained by the camera laser identification component, the method further includes:
S1021, in a calibration state, acquiring initial position information of the laser spot of the simulated firearm while aiming;
S1022, performing offset calibration on the initial position information to obtain calibrated position information;
S1023, storing the calibrated position information in association with the user information of the current simulated firearm;
S1024, receiving the spot identification area, spot identification perimeter and movement trajectory settings configured by the user for the camera laser identification component, wherein the spot identification area and the spot identification perimeter are configured as interval thresholds;
In practice, different trained personnel aim differently: with the same equipment, different users show certain deviations when aiming. For example, the laser point should be aimed at a middle position, but because of a trainee's line-of-sight deviation, the trainee appears to themselves to be aiming at the middle while, to an external observer, the aim is offset. Therefore, in implementation, the simulated training gun can be calibrated for different users, for example by adjusting the offset by a number of pixels, and the offset data are finally stored in association with the specific user. Since the number of simulated firearms is limited, storing the data per firearm is the more cost-effective and practical choice. The system can calibrate the camera automatically based on image analysis; once calibration is completed, no second calibration is needed as long as the positions of the camera and the screen remain unchanged. The scheme of the embodiments can also adjust key identification parameters such as the camera's shutter threshold, identification threshold, spot identification area setting and scattering setting; a specific camera parameter adjustment page is shown in fig. 12.
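The per-user offset calibration described above can be sketched as follows. The names and the in-memory dictionary are illustrative; a real system would persist the offsets with the firearm's user data as the text describes:

```python
# Sketch of per-user offset calibration (names and storage are illustrative).
# The laser position measured while the user aims is compared with the
# intended center; the difference is stored per user and applied to every
# later detection for that user.

calibration_store = {}   # user_id -> (dx, dy) offset in pixels

def calibrate(user_id, measured_point, intended_point):
    dx = intended_point[0] - measured_point[0]
    dy = intended_point[1] - measured_point[1]
    calibration_store[user_id] = (dx, dy)

def apply_calibration(user_id, raw_point):
    dx, dy = calibration_store.get(user_id, (0, 0))
    return (raw_point[0] + dx, raw_point[1] + dy)

# A trainee's sight line makes the laser land 4 px right / 2 px low of center:
calibrate("trainee_01", measured_point=(324, 242), intended_point=(320, 240))
corrected = apply_calibration("trainee_01", (330, 250))
```

Users without a stored offset fall back to (0, 0), i.e. no correction, which matches the idea that calibration is optional per user.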
The acquiring, through the camera laser identification component, of the shooting parameter information of the simulated firearm during the simulated training comprises:
analyzing the camera's real-time picture by calling a graphics library and capturing laser point positions;
and mapping the detected laser point information onto a display interface, wherein the display interface comprises a target image, laser point count information and laser point track information.
The laser scanning track of the simulated firearm can be obtained efficiently through the camera laser identification component, and when the trigger is pulled, the corresponding shooting point information is captured, which facilitates the subsequent determination of results.
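The spot-capture step can be illustrated without committing to a specific vision library. A real implementation would analyze camera frames through a graphics library as described; here a frame is a plain grayscale grid so the thresholding-and-centroid logic stays visible:

```python
# Minimal stand-in for the laser-spot capture step.  The threshold value and
# the frame representation (nested lists of grayscale values) are assumptions
# for illustration only.

def find_laser_spot(frame, threshold=200):
    # Collect pixels bright enough to be the laser dot, then average their
    # coordinates to get the spot's centroid.
    bright = [(x, y)
              for y, row in enumerate(frame)
              for x, value in enumerate(row)
              if value >= threshold]
    if not bright:
        return None
    cx = sum(x for x, _ in bright) / len(bright)
    cy = sum(y for _, y in bright) / len(bright)
    return (cx, cy)

# 5x5 frame with a bright 2x2 spot in the lower right:
frame = [
    [10, 10, 10, 10, 10],
    [10, 10, 10, 10, 10],
    [10, 10, 10, 10, 10],
    [10, 10, 10, 250, 255],
    [10, 10, 10, 240, 251],
]
spot = find_laser_spot(frame)
```

Repeating this per frame yields the laser track; the point detected at the moment the trigger is pulled becomes the shooting point for scoring.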
S104, determining the gun holding stability of the trained personnel according to the movement information.
In implementation, gun-holding stability analysis can be performed on the movement track, as shown in figs. 7, 8 and 9. With this kind of detection, the movement track of the laser at the corresponding position can be captured well, and a subsequent shot review can be carried out based on the movement track and position, helping to improve the training results of the trained personnel and giving them a more specific training direction.
More preferably, the movement information is the movement track of the laser on the target.
FIG. 3 is a schematic flow chart of a gun-holding stability analysis according to an embodiment of the present invention, as shown in FIG. 3, wherein the determining the gun-holding stability of the trained personnel according to the movement information includes:
S1041, determining the gun-holding movement position of the trained personnel based on the movement information;
S1042, determining the aiming center point of the trained personnel based on the training scene, and determining movement interval information based on the aiming center point;
and S1043, matching the gun-holding movement position against the movement interval information to determine the gun-holding stability of the trained personnel.
The above approach uses interval matching to detect gun-holding stability: if the user's track stays essentially near the aiming center point, the gun-holding stability can be judged high; if the movement interval is too large, the trained personnel's gun-holding stability can be judged poor, and shooting deviations are then likely in subsequent shots. The time just before firing is what matters: if the user barely moves during that time, the final impact point is the aiming point.
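The interval-matching idea can be sketched as a simple score: the fraction of pre-shot track samples falling inside a tolerance interval around the aiming center. The radius and the 0.8 grading threshold below are illustrative assumptions, not values from the patent:

```python
# Sketch of gun-holding stability via interval matching.  The tolerance
# radius (pixels) and the grading threshold are assumed for illustration.

def holding_stability(track, center, radius):
    # Share of track samples whose distance to the aiming center is within
    # the tolerance radius.
    inside = sum(1 for (x, y) in track
                 if (x - center[0]) ** 2 + (y - center[1]) ** 2 <= radius ** 2)
    return inside / len(track)

def grade(stability):
    return "stable" if stability >= 0.8 else "unstable"

# Four of five pre-shot samples stay near the center; one strays far off:
track = [(100, 100), (101, 99), (100, 102), (120, 130), (99, 100)]
score = holding_stability(track, center=(100, 100), radius=5)
```

A score near 1.0 means the laser barely left the interval before the shot, which, per the text, predicts that the impact point will coincide with the aiming point.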
More preferably, fig. 2 is a schematic flow chart of determining a target practice result according to an embodiment of the present invention. As shown in fig. 2, the shooting parameter information further comprises shooting point information, and after the shooting parameter information of the simulated firearm during the training is acquired through the camera laser identification component, the method further comprises:
S105, determining the target practice result of the corresponding trained personnel based on the shooting point information;
S106, determining a comprehensive training result for the trained personnel according to the target practice result and the gun-holding stability.
In the embodiments of the invention, providing the laser detection device makes the analysis of the shooting point track more convenient; moreover, owing to the characteristics of the laser simulated firearm, the detection sensitivity differs depending on whether the user is far from or near the shooting target, which better reflects the real situation and yields better simulation fidelity.
More preferably, the firearm control instruction comprises a frame head and a frame tail, a safety state, a bolt state, a magazine state, a bullet quantity number, a trigger state, a bullet quantity mode, a number of trigger firing times, a trigger linearity value and a gun shape.
The shooting simulation training method comprises the following step:
in the training terminal, the firearm control instruction is updated each time a simulated shot is fired.
In order to simulate the actual situation more realistically, the control information at each site of the simulated firearm can be obtained through sensors at design time; for example, the firearm is provided with a number of bullets, each trigger pull performs one virtual shot, and when the bullets in the firearm are used up, ammunition is replenished through a loading operation.
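A possible layout for the firearm control instruction listed above (frame head/tail plus the state fields), updated after each simulated shot, is sketched below. The field widths, encodings, frame head/tail byte values and the magazine capacity are all assumptions; the patent names the fields but does not specify their representation.

```python
from dataclasses import dataclass

@dataclass
class FirearmControlFrame:
    """Hypothetical encoding of the firearm control instruction; field names
    mirror the patent's list, but all byte values here are assumed."""
    safety_state: int        # 0 = on safe, 1 = ready (assumed encoding)
    bolt_state: int
    magazine_state: int
    bullet_count: int        # bullets remaining in the simulated firearm
    trigger_state: int
    bullet_mode: int         # e.g. single / burst (assumed)
    trigger_fire_count: int  # number of trigger firing times
    trigger_linearity: int
    gun_shape: int

    FRAME_HEAD = 0xAA        # assumed delimiter bytes
    FRAME_TAIL = 0x55

    def encode(self) -> bytes:
        body = bytes([self.safety_state, self.bolt_state, self.magazine_state,
                      self.bullet_count, self.trigger_state, self.bullet_mode,
                      self.trigger_fire_count, self.trigger_linearity,
                      self.gun_shape])
        return bytes([self.FRAME_HEAD]) + body + bytes([self.FRAME_TAIL])

    def after_shot(self) -> "FirearmControlFrame":
        """Frame to send after one simulated shot: decrement the bullet count
        and, when the bullets are used up, replenish via a loading operation."""
        bullets = self.bullet_count - 1
        if bullets <= 0:
            bullets = 30  # assumed magazine capacity after loading
        return FirearmControlFrame(self.safety_state, self.bolt_state,
                                   self.magazine_state, bullets,
                                   self.trigger_state, self.bullet_mode,
                                   self.trigger_fire_count + 1,
                                   self.trigger_linearity, self.gun_shape)
```

Calling `after_shot()` once per trigger pull mirrors the "update the control instruction each time a simulated shot is fired" step in the text.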
The shooting simulation training method further comprises the following steps:
In the simulation training process, detecting, through a Hall element arranged on the simulated firearm shell, the position change of a magnet arranged at the simulated trigger relative to the Hall element, so as to determine corresponding position sensing information according to the position change;
and acquiring pressing stroke information of the simulated trigger according to the position sensing information, and determining the performance of the trained personnel based on the pressing stroke information.
In specific implementation, trigger position detection can be carried out mainly through the combination of the magnet and the Hall element: the magnet is fixed on the simulated trigger, the trigger controls the distance between the magnet and the Hall element, the circuit board forms data according to how strongly the magnet influences the Hall element at that distance, and the software then generates a real-time graph from the data, so that the trigger pulling condition of the trained personnel can be observed and their trigger pulling state determined.
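Converting the Hall-element reading into trigger pressing stroke could be done along these lines; the linear response and the ADC calibration constants are simplifying assumptions (a real magnet/Hall pair responds nonlinearly with distance and would need a calibration table).

```python
def trigger_travel_mm(hall_raw, adc_idle, adc_full, travel_full_mm=6.0):
    """Map a raw Hall-element ADC reading to trigger press travel in mm.

    As the magnet on the simulated trigger approaches the Hall element, the
    reading moves from adc_idle (trigger released) toward adc_full (trigger
    fully pressed). A linear response is assumed here purely for
    illustration; adc_idle/adc_full would come from calibration.
    """
    span = adc_full - adc_idle
    fraction = (hall_raw - adc_idle) / span
    fraction = min(1.0, max(0.0, fraction))  # clamp to the physical stroke
    return fraction * travel_full_mm
```

Sampling this value over time yields exactly the pressing-stroke curve that the software plots as a real-time graph.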
More preferably, after the obtaining the pressing stroke information of the simulation trigger according to the position sensing information, the method further comprises:
And acquiring a simulation image associated with the simulation firearm, and performing simulation update display on a trigger position at the simulation image based on the pressing stroke information.
As shown in fig. 13 and fig. 14, the trigger can be displayed in various manners: it can be monitored as a pressing curve as shown in fig. 13, and the trigger state can be displayed in real time as shown in fig. 14, so that the trigger pulling state of the user is better presented.
More preferably, the number of the simulation firearms and the number of the camera laser identification components are multiple, the simulation firearms and the camera laser identification components are in one-to-one correspondence, and the camera laser identification components are used for acquiring shooting point position information and laser track information associated with the simulation firearms.
In the prior art, a single camera laser recognition component detects the whole display picture during acquisition. In the embodiment of the invention, acquisition is instead carried out in a one-to-one correspondence mode for accurate recognition; owing to this linked design, situations where one trainee's shot lands on another trainee's target can be accurately distinguished during implementation, improving the stability of final result recognition. In specific implementation, comprehensive target shooting judgment can even be carried out in combination with the laser moving track, further improving the accuracy of result recognition.
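The one-to-one correspondence between camera laser recognition components and simulated firearms can be sketched as a simple assignment-and-check step; the data layout and the rectangular target regions here are hypothetical, chosen only to show how cross-target shots become distinguishable once each component reports for exactly one firearm.

```python
def assign_shots(components):
    """Each camera-laser recognition component is paired one-to-one with a
    simulated firearm, so a shot point reported by component i is known to
    belong to firearm i. Shots landing outside that firearm's own target
    region are flagged as cross-target hits (one trainee hitting another's
    target), which a single whole-screen detector could not attribute.

    components -- dict: firearm_id -> {"shot": (x, y),
                                       "target": (x0, y0, x1, y1)}
    """
    results = {}
    for firearm_id, data in components.items():
        x, y = data["shot"]
        x0, y0, x1, y1 = data["target"]
        on_own_target = x0 <= x <= x1 and y0 <= y <= y1
        results[firearm_id] = "hit" if on_own_target else "cross-target"
    return results
```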
More preferably, the acquiring the pressing stroke information of the simulation trigger according to the position sensing information, and determining the performance of the trained personnel based on the pressing stroke information includes:
Acquiring pressing curve information of a simulation trigger according to the position sensing information;
Dividing the pressing curve information into multiple stroke sections based on a section threshold value, obtaining multiple pressing stroke curve segments;
Calculating first distance information from each pressing stroke curve segment to its corresponding center point, and second distance information from each segment to the overall center point;
And determining the stability of the trained personnel pulling the trigger based on the first distance information and the second distance information. In specific implementation, comprehensive detection can be carried out using multi-level distance region division: the pressing stroke curve is divided into regions, inter-region detection is carried out on the different regions, and comprehensive multi-level pressing stability detection is thereby realized.
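The segmented distance computation above can be sketched as follows, assuming the pressing curve is a list of trigger-travel samples over one press. Interpreting "corresponding center point" as the per-segment mean and "the center point" as the overall mean is this sketch's assumption, as is the use of mean absolute deviation as the distance measure.

```python
def press_stability(curve, n_segments=3):
    """Split the pressing curve into segments and measure, for each segment,
    the spread around its own mean (first distance information) and around
    the overall mean (second distance information). Smaller totals indicate
    a steadier trigger pull. Segment count and the mean-absolute-deviation
    metric are illustrative assumptions.

    curve -- list of trigger travel samples over one press
    """
    seg_len = max(1, len(curve) // n_segments)
    overall_mean = sum(curve) / len(curve)
    first_total = second_total = 0.0
    for i in range(0, len(curve), seg_len):
        seg = curve[i:i + seg_len]
        seg_mean = sum(seg) / len(seg)
        first_total += sum(abs(v - seg_mean) for v in seg) / len(seg)
        second_total += sum(abs(v - overall_mean) for v in seg) / len(seg)
    return first_total, second_total
```

A perfectly smooth press yields (0.0, 0.0); jitter within a segment raises the first total, while drift between segments raises the second, giving the multi-level detection the text describes.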
According to the shooting simulation training method, the camera recognition component is adopted to obtain shooting parameters in the simulation training process, so that various state information of the trained personnel during shooting is accurately obtained and presented as all-round shooting parameters, helping the trained personnel improve their training effect.
Example two
Referring to fig. 15, fig. 15 is a schematic structural diagram of a shooting simulation training apparatus according to an embodiment of the invention. As shown in fig. 15, the shooting simulation training apparatus may include:
The transmitting module 21 is used for transmitting a firearm control instruction to a corresponding simulated firearm so that the simulated firearm is controlled into a working state, wherein the working state is that the pneumatic box and the simulated firearm are both in an on state;
the projection module 22 is used for projecting the set training scene onto a corresponding display screen through the projection equipment;
The parameter acquisition module 23 is used for acquiring shooting parameter information of the simulated firearm in the simulation training process through the camera laser identification component, wherein the shooting parameter information comprises moving information, and the moving information is laser track moving information of the simulated firearm within a preset time before firing;
and the stability determining module 24 is used for determining the gun holding stability of the trained personnel according to the movement information.
According to the shooting simulation training apparatus, the camera recognition component is adopted to obtain shooting parameters in the simulation training process, so that various state information of the trained personnel during shooting is accurately obtained and presented as all-round shooting parameters, helping the trained personnel improve their training effect.
Example III
Referring to fig. 16, fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the invention. The electronic device may be a computer, a server, or the like, and of course, may also be an intelligent device such as a mobile phone, a tablet computer, a monitor terminal, or the like, and an image acquisition device having a processing function. As shown in fig. 16, the electronic device may include:
A memory 510 storing executable program code;
a processor 520 coupled to the memory 510;
wherein the processor 520 invokes the executable program code stored in the memory 510 to perform some or all of the steps in the shooting simulation training method of the first embodiment.
An embodiment of the present invention discloses a computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute some or all of the steps in the shooting simulation training method in the first embodiment.
The embodiment of the invention also discloses a computer program product, wherein the computer program product enables the computer to execute part or all of the steps in the shooting simulation training method in the first embodiment.
The embodiment of the invention also discloses an application release platform, wherein the application release platform is used for releasing the computer program product, and the computer program product enables the computer to execute part or all of the steps in the shooting simulation training method in the first embodiment when running on the computer.
In the various embodiments of the present invention, it should be understood that the magnitude of the sequence numbers of the processes does not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-accessible memory. Based on this understanding, the technical solution of the present invention, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, comprising several instructions for causing a computer device (which may be a personal computer, a server or a network device, etc., and in particular may be a processor in a computer device) to execute some or all of the steps of the method according to the embodiments of the present invention.
In the embodiments provided herein, it should be understood that "B corresponding to a" means that B is associated with a, from which B can be determined. It should also be understood that determining B from a does not mean determining B from a alone, but may also determine B from a and/or other information.
Those of ordinary skill in the art will appreciate that some or all of the steps of the various methods of the described embodiments may be implemented by hardware associated with a program that may be stored in a computer-readable storage medium, including Read-Only Memory (ROM), Random Access Memory (RAM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), One-time Programmable Read-Only Memory (OTPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM) or other optical disk memory, magnetic disk memory, tape memory, or any other medium capable of being used to carry or store data.
The shooting simulation training method, device, electronic equipment and storage medium disclosed in the embodiments of the present invention are described in detail above. Specific examples are used herein to illustrate the principles and embodiments of the present invention, and the description of the above examples is only intended to aid understanding of the method and its core concept. Meanwhile, those skilled in the art may, according to the concept of the present invention, make variations in the specific embodiments and the scope of application; therefore, this disclosure should not be interpreted as limiting the present invention.
Claims (9)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202310637097.6A CN116499303B (en) | 2023-05-31 | 2023-05-31 | Shooting simulation training method and device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN116499303A CN116499303A (en) | 2023-07-28 |
| CN116499303B true CN116499303B (en) | 2025-06-20 |
Family
ID=87323180
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202310637097.6A Active CN116499303B (en) | 2023-05-31 | 2023-05-31 | Shooting simulation training method and device |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN116499303B (en) |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114909948A (en) * | 2022-04-26 | 2022-08-16 | 厦门恒兴兴业机械有限公司 | Simulated shooting analysis equipment and method |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20080077737A (en) * | 2007-02-21 | 2008-08-26 | 주식회사 이그잼 | Dynamic shooting training system |
| KR102041461B1 (en) * | 2018-03-26 | 2019-12-02 | 육군사관학교 산학협력단 | Device for analyzing impact point improving the accuracy of ballistic and impact point by applying the shooting environment of actual personal firearm ing virtual reality and vitual shooting training simulation using the same |
| CN110186323B (en) * | 2019-06-17 | 2025-01-07 | 中国电子科技集团公司第十一研究所 | A training laser pistol |
| CN110779395A (en) * | 2019-11-06 | 2020-02-11 | 北京宏大天成防务装备科技有限公司 | Target shooting correction system and method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||