
WO2019190019A1 - Impact point analysis apparatus for improving the precision of a ballistic trajectory and impact point by applying a real personal firearm shooting environment to virtual reality, and virtual shooting training simulation using the same - Google Patents


Info

Publication number
WO2019190019A1
WO2019190019A1 (PCT application PCT/KR2018/014647)
Authority
WO
WIPO (PCT)
Prior art keywords
information
bullet
impact point
virtual
impact
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2018/014647
Other languages
English (en)
Korean (ko)
Inventor
이병학
박석봉
이원우
김동욱
신규용
김주희
박영준
최현호
김종환
강원석
곽윤기
최홍철
김남혁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Korea Military Academy R&db Foundation
Optimus System Co Ltd
Original Assignee
Korea Military Academy R&db Foundation
Optimus System Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020180130030A external-priority patent/KR102041461B1/ko
Application filed by Korea Military Academy R&db Foundation, Optimus System Co Ltd filed Critical Korea Military Academy R&db Foundation
Priority to US17/041,009 priority Critical patent/US20210102781A1/en
Publication of WO2019190019A1 publication Critical patent/WO2019190019A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G3/00: Aiming or laying means
    • F41G3/26: Teaching or practice apparatus for gun-aiming or gun-laying
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10: Services
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00: Simulators for teaching or training purposes

Definitions

  • The present invention relates to an apparatus capable of calculating ballistic trajectories and impact points identical to those of a real environment by applying the shooting environment of a real personal firearm to shooting training in virtual reality, and to a shooting training simulation system using the same.
  • Conventional shooting training is performed with a real firearm, firing live rounds and evaluating whether the target is hit.
  • In virtual shooting training, by contrast, the target is hit according to the direction of the trigger pull. Since equipment such as real firearms and live ammunition and a physical shooting range are unnecessary, such training is free from restrictions on danger, headcount, and location.
  • However, the ballistic trajectory and the resulting impact point in a virtual shooting training simulation are applied differently from those of actual shooting training.
  • Virtual shooting training simulations currently in operation assume that the center of mass of the shot fired by the virtual gun simply translates in a straight line. That is, because the ballistic trajectory and impact point are formed under the assumption that the bullet is unaffected by the laws of motion and the environment, they differ from the trajectory and impact point produced in actual shooting training.
  • An object of the present invention is to provide an impact point analysis device that improves precision by applying the shooting environment of a real personal firearm in virtual reality, improving the realism of the trajectory and impact point generated in virtual shooting by reflecting the types of gun and bullet and environmental elements, and a virtual shooting training simulation system using the same.
  • To realize the above object, an impact point analysis device for improving the precision of a ballistic trajectory and an impact point by applying the shooting environment of a real personal firearm includes, with respect to a virtual firearm, which is a model carried by a user in real space:
  • a firearm analysis module for generating firearm information on the firearm structure of the virtual firearm;
  • a bullet analysis module configured to generate bullet information on the structure of a bullet applied to the virtual firearm;
  • an environment analysis module configured to detect the environmental state of shooting training content output on a screen and generate environment information about it; and
  • an impact point generation module configured to generate impact point information regarding the position where the bullet hits the target displayed on the screen, by referring to at least one of the firearm information, the bullet information, and the environment information.
  • Here, the firearm information may include firearm type information, barrel length information, and rifling information.
  • The bullet information may include bullet type information, bullet mass information, bullet shape information, and bullet pressure center information.
  • The impact point generation module may generate first bullet motion information regarding the motion of the bullet inside the virtual firearm by referring to the firearm information and the bullet information.
  • A virtual shooting training simulation system that corrects the image and reflects impact point precision according to another embodiment of the present invention for realizing the above object includes: an image sensing device for sensing the user and the virtual firearm, which is a model carried by the user, relative to the screen on which the shooting training content is output in real space, and generating object image information, which is image information about them;
  • an image correction device for analyzing the object image information, comparing reference image information detected at a reference position with change image information detected at a change position, which is a position changed from the reference position, generating correction information as the resulting value, and generating corrected image information reflecting the correction information in the change image information; and
  • an impact point analysis device for generating impact point information on the position at which the bullet hits the target displayed on the screen, with reference to firearm information on the structure of the virtual firearm, bullet information on the structure of the bullet applied to the virtual firearm, and environment information on the environmental state of the shooting training content.
  • Here, the image correction apparatus may include: a reference image module that, when the position information of the user and the virtual firearm, which are the objects in front of the screen detected in real space, includes reference position information, which is a preset reference position, generates reference image information, which is the image corresponding to the reference position information;
  • a change image module configured to generate change image information, which is the image corresponding to change position information, when a change from the reference position to a change position due to movement of the object is detected; and
  • a correction module configured to compare the reference position information and the change position information to generate correction information as the resulting value, and to generate corrected image information reflecting the correction information in the change image information.
  • The reference position information may be generated by referring to screen coordinate information, which is the coordinate values of the screen; reference coordinate information, which is the coordinate value of a reference position in real space initially set by the image sensing apparatus; and the positional relationship between the screen coordinate information and the measurement position of a measuring device disposed in real space.
  • Specifically, the screen coordinate information may include screen reference coordinate information corresponding to the perimeter region of the screen and screen temporary coordinate information about a plurality of mutually spaced coordinates in the interior region of the screen. When input information corresponding to the screen temporary coordinate information is input, the measuring device generates measurement/screen position information about the positional relationship between the measurement position and the screen temporary coordinate information, generates measurement/reference position information about the positional relationship between the measurement position and the reference coordinate information, and transmits each of them to the reference image module. The reference image module matches the measurement/screen position information with the screen reference coordinate information to generate first reference relationship information about their correlation, generates second reference relationship information about the correlation between the measurement/screen position information and the measurement/reference position information, and, referring to the first and second reference relationship information, may generate third reference relationship information about the correlation between the measurement/reference position information and the screen reference coordinate information.
  • FIG. 1 is a conceptual diagram illustrating a method of operating an impact point analysis device 100 according to an embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a module configuration of the impact point analysis device 100 according to an embodiment of the present invention.
  • FIG. 3 is a view for explaining the barrel length of the virtual firearm (F) according to an embodiment of the present invention.
  • FIG. 4 is a view for explaining the type and structure of the bullet (B) according to an embodiment of the present invention.
  • FIG. 5 is a view for explaining environmental information of shooting training content output on the screen S according to an embodiment of the present invention.
  • FIG. 6 is a view for explaining a method of correcting the change in the impact point according to the movement time of the bullet (B) according to an embodiment of the present invention.
  • FIG. 7 is a view for explaining the motion of a howitzer round (B) according to an embodiment of the present invention.
  • FIG. 8 is a conceptual view illustrating a method of using the virtual shooting training simulation system 10 according to another embodiment of the present invention.
  • FIG. 9 is a block diagram illustrating a configuration of an image correcting apparatus included in the virtual shooting training simulation system 10 of FIG. 8.
  • FIG. 10 is a flowchart for explaining a method of operating the virtual shooting training simulation system 10 of FIG. 8.
  • FIGS. 11 to 16 are diagrams illustrating each step of the operating method of FIG. 10.
  • FIGS. 17 and 18 are diagrams for describing a method for deriving a ballistic trajectory according to an embodiment of the present invention.
  • FIG. 19 is a view for explaining how the impact point changes according to the ballistic trajectory, according to an embodiment of the present invention.
  • Hereinafter, an impact point analysis device for improving the precision of ballistic trajectories and impact points by applying the shooting environment of a real personal firearm in virtual reality according to a preferred embodiment of the present invention, and a virtual shooting training simulation system using the same, are described in detail with reference to the drawings.
  • In different embodiments, the same or similar reference numerals are assigned to the same or similar configurations, and their description is replaced by the first description.
  • FIG. 1 is a conceptual diagram illustrating a method of operating an impact point analysis device 100 according to an embodiment of the present invention.
  • a user U having a virtual firearm F which is a model firearm corresponding to an actual firearm, may be positioned with respect to the screen S in the real space.
  • the screen S may output shooting training content, which is a program related to shooting training performed in a virtual reality.
  • The shooting training content may be a shooting training program in virtual reality in which a target T, the object of shooting, is selectively generated or moved across various terrains and environments.
  • Such shooting training content may be received from an external server or an external terminal such as a website through a communication module and stored in a memory module.
  • the virtual firearm F may be a model firearm that can be linked to the shooting training content.
  • Its shape may be formed in the same manner as an actual firearm, and pulling the trigger may generate trigger information. The position information and direction information of the virtual firearm F, received through a vision sensing module not shown in this embodiment, may be reflected in the shooting training content output on the screen S.
  • In other words, when trigger information is received from the virtual firearm F, it can be reflected in the shooting training content by referring to the position information and direction information of the virtual firearm F at the moment of triggering.
  • For this purpose, a member such as a vision marker may be attached to the virtual firearm F or to the user's body, and the vision sensing module may detect the vision marker to generate information regarding the position and direction of the virtual firearm F.
  • Alternatively, the vision sensing module may detect image information of the virtual firearm F in real time and compare the detected image information with reference image information, a reference value stored in the memory module, to generate information about the position and orientation of the virtual firearm F.
  • the vision sensing module may detect the position and direction of the user U through the above-described method, and generate information about the same.
  • The impact point analysis device 100 may generate impact point information, which is information on the ballistic trajectory and the impact position with respect to the target, by referring to at least one of firearm information on the structure of the virtual firearm F, bullet information on the structure of the bullet B applied to the virtual firearm F, and environment information on the environmental state of the shooting training content.
  • Existing shooting training simulations exclude the distance between the virtual firearm F and the screen S as well as the factors that actually affect the trajectory, and are therefore designed around the first ballistic trajectory T1, formed in a straight line, and the corresponding first impact point A1.
  • In contrast, the impact point analysis device 100 of the present embodiment may generate impact point information considering all of the firearm information on the structure of the virtual firearm F carried by the user U, the bullet information on the structure of the bullet B applied to the virtual firearm F, and the wind direction/wind speed information W, air pressure information H, gravity information G, and temperature information TP of the shooting training content output on the screen S.
  • Accordingly, the curved second ballistic trajectory T2 and the second impact point A2 can be calculated.
  • The second ballistic trajectory T2 and second impact point A2 of the present embodiment are drawn exaggerated for illustration; the second ballistic trajectory T2 and second impact point A2 change according to the conditions of the various elements.
  • In addition, the impact point analysis apparatus 100 may generate impact point information by receiving first bullet movement distance information D1, which is the actual distance from the virtual firearm F to the screen S in real space, and second bullet movement distance information D2, which is the virtual distance from the screen S to the target T in the shooting training content, the virtual reality displayed on the screen S, and applying their combined distance as the bullet movement distance information D.
  • In other words, the first bullet movement distance information D1 is the actual distance between the virtual gun F and the screen S in real space, and the second bullet movement distance information D2 is the virtual distance to the target T in the shooting training content.
  • Thus, the bullet movement distance information D may be calculated by reflecting both the distance in real space and the distance in virtual space. For example, when the distance in real space is 2 m and the distance to the target T in virtual space is 5 m, the bullet movement distance information D is 7 m in total, and the impact point analysis apparatus 100 may generate the impact point information by applying a bullet travel of 7 m, as sketched below.
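  • A minimal sketch of this combined-distance step in Python (the function name and units are illustrative assumptions, not part of the disclosure):

        def total_bullet_distance(d1_real_m: float, d2_virtual_m: float) -> float:
            """Combine the real-space distance D1 (virtual firearm to screen) with
            the virtual-space distance D2 (screen to target) into the total bullet
            movement distance D used for impact point analysis."""
            return d1_real_m + d2_virtual_m

        # Example from the text: 2 m in real space + 5 m in virtual space = 7 m.
        assert total_bullet_distance(2.0, 5.0) == 7.0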
  • In other words, the ballistic trajectory and the impact point are calculated by applying all the information about the various elements that act as variables in an actual shooting environment, further improving the precision and reliability of the impact point.
  • The operating method of the impact point analysis apparatus 100 has been briefly described above; the configuration of the impact point analysis device 100 will now be described in detail with reference to FIG. 2.
  • According to the present invention configured as described above, applying the shooting environment of a real personal firearm to virtual reality improves the precision of the ballistic trajectory and impact point,
  • so an impact point similar to that of actual shooting can be generated, thereby improving the training efficiency of virtual shooting training.
  • In addition, since the viewpoint of the virtual space changes in accordance with the change of the user's viewpoint in real space, viewpoint matching between the real space and the virtual space can be further improved.
  • Furthermore, image distortion may be minimized by automatically adjusting the screen image ratio so as to include the corrected coordinates.
  • FIG. 2 is a block diagram illustrating the module configuration of an impact point analysis device 100 according to an embodiment of the present invention;
  • FIG. 3 is a view for explaining the barrel length of the virtual firearm F according to an embodiment of the present invention;
  • FIG. 4 is a view for explaining the type and structure of the bullet B according to an embodiment of the present invention;
  • FIG. 5 is a view for explaining environment information of the shooting training content output on the screen S according to an embodiment of the present invention; and
  • FIG. 6 is a view for explaining a method of correcting the change in impact point according to the travel time of the bullet B according to an embodiment of the present invention.
  • As shown, the impact point analysis device 100 may include a gun analysis module 110, a bullet analysis module 120, an environment analysis module 130, a vision sensing module 140, a distance analysis module 150, a time analysis module 160, and an impact point generation module 170.
  • The firearm analysis module 110 may generate firearm information about the firearm structure of the virtual firearm F, which is a model carried by the user U in real space.
  • The firearm information is information on the physical structure of the virtual firearm F and may include firearm type information, barrel length information, and rifling information.
  • The firearm type information may include information on firearm types on the market such as the K2, K1, M16, AK47, K5, and M16A4.
  • The firearm type information may be generated by recognizing a vision marker attached to the virtual firearm F, or by recognizing image information of the firearm through the vision sensing module 140 and matching that image information against a firearm type table stored in the memory module or an external server.
  • The barrel length information may be the length of the metal tube of the firearm through which the bullet B passes when fired from the virtual firearm F, as shown in FIG. 3. The barrel length therefore differs depending on the type of firearm: in FIG. 3, when the barrel length of the K2 (F1) is the first barrel length FD1, the barrel of the M16 (F2) may have a second barrel length FD2 longer than FD1. Since movement information such as the rotation rate of the bullet B differs according to barrel length, this information may need to be secured.
  • The rifling information is information about the spiral grooves (rifling) inside the barrel bore; the bullet B rotates along these grooves, acquiring rotational inertia and thus a stable trajectory.
  • In other words, the heavier the bullet B, the more rotation must be imparted to stabilize its trajectory.
  • The rifling information may include information on the presence or absence of rifling, the twist direction of the rifling, the number of grooves, and the like.
  • The bullet analysis module 120 may generate bullet information on the structure of the bullet B applied to the virtual firearm F.
  • The bullet information may include bullet type information, bullet length information, bullet mass information, bullet shape information, bullet pressure center information, and the like.
  • The bullet analysis module 120 may generate the bullet information corresponding to the firearm by referring to bullet table information for the bullets B of each gun, stored in the memory module or an external server.
  • The bullet information may be generated automatically as described above, but it may also be input through a user U input module (not shown). Manual input through the user U input module is not limited to the bullet B and may also be applied to the generation of firearm information.
  • The bullet information may include information on the shapes of the various bullets B and the gunpowder contained in the bullet B, as shown in FIG. 4.
  • the environment analysis module 130 may detect environment conditions of the shooting training content output on the screen S as in FIG. 5, and generate environment information about them.
  • the environment information may include atmospheric temperature information (TP), density information, barometric pressure information (H), wind direction / wind speed information (W), and gravity information (G) for the virtual reality output from the shooting training content.
  • Weather information on rain, snow, hail, typhoons, and the like may also be included in the environment information.
  • Because the environment analysis module 130 generates environment information for the scene of the shooting training content currently displayed on the screen S and shown to the user U, adaptive training for the various environmental situations of actual shooting can be performed.
  • The vision sensing module 140 may detect the positions and directions of the user U and the virtual firearm F and generate object position information about them. In other words, the vision sensing module 140 may generate the object position information by detecting a vision marker attached to the user's body, clothing, or the virtual firearm F, or by detecting image information of the user U and the virtual firearm F.
  • The distance analysis module 150 may generate bullet movement distance information, which is the distance the bullet fired from the virtual firearm F travels until it reaches the target T of the shooting training content.
  • The bullet movement distance information may include first bullet movement distance information, which is the actual distance the bullet travels from the virtual firearm F to the screen S in real space, and second bullet movement distance information, which is the virtual distance to the target T in the shooting training content, the virtual reality space.
  • Here, the distance analysis module 150 may generate the first bullet movement distance information by referring to the object position information generated by the vision sensing module 140, and may generate the second bullet movement distance information with reference to the shooting training content. The first and second bullet movement distance information generated in this way may be combined to generate the bullet movement distance information, the total travel distance of the bullet.
  • The time analysis module 160 may generate impact time information regarding the time it takes the bullet to travel from the virtual firearm F to the target T.
  • In the above-described embodiment, an impact point is formed at the target T when the virtual gun F is fired toward the screen S. In practice, a time delay arises between trigger and impact owing to the distance to the target T and various other factors; therefore, when the target T moves, the impact point may shift because of this delay.
  • The time analysis module 160 may generate the impact time information from the bullet movement distance information of the distance analysis module 150 and trigger time information on the moment at which the bullet B is fired by the virtual gun F.
  • In addition, the time analysis module 160 may refer to the firearm information, bullet information, environment information, and the like, and apply them to the impact time information, as sketched below.
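  • A minimal sketch of the moving-target correction implied here (the names and the one-dimensional simplification are assumptions for illustration):

        def impact_offset_on_target(target_vx_m_s: float, flight_time_s: float) -> float:
            """The bullet arrives flight_time_s after the trigger, so a target
            moving laterally at target_vx_m_s has shifted by this distance
            between trigger and impact; the impact point is corrected by it."""
            return target_vx_m_s * flight_time_s

        # Example: a target crossing at 2 m/s with a 0.35 s flight time
        # has moved 0.7 m by the time the bullet arrives.
        print(impact_offset_on_target(2.0, 0.35))  # 0.7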
  • The impact point generation module 170 may generate impact point information regarding the position at which the bullet B fired by the virtual firearm F hits the target T displayed on the screen S, by referring to the information generated by the above-described configurations. In other words, the impact point generation module 170 may generate impact point information reflecting the various pieces of variable information generated by the above-described modules.
  • First, the impact point generation module 170 may generate first bullet motion information about the motion of the bullet B inside the virtual firearm F by referring to the firearm information and bullet information.
  • Here, the first bullet motion information may be interior ballistics information covering the bullet B from the moment it starts moving inside the virtual firearm F until it leaves the muzzle, the first stage of ballistic motion.
  • the impact point generation module 170 may generate first bullet motion information by Equation 1 and Equation 2 below.
  • Here, the first bullet motion information may be kinetic energy information of the bullet.
  • The maximum distance of the object from the axis of rotation corresponds to the radius of the bullet B.
  • Thus, the first bullet motion information may include the acceleration and rotational force information generated by the propellant charge and the gun structure when a specific type of bullet B is fired from the virtual firearm F, as sketched below.
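  • Since Equations 1 and 2 are not reproduced in this text, the following is only a hedged sketch of a typical interior-ballistics energy bookkeeping consistent with the description (translational plus rotational kinetic energy at the muzzle, with the bullet modeled as a solid cylinder; all names and the twist model are assumptions):

        import math

        def muzzle_energy(m_kg: float, v_m_s: float, r_m: float, twist_m: float) -> float:
            """Translational + rotational kinetic energy at the muzzle.
            The bullet is modeled as a solid cylinder (I = m * r^2 / 2) spun
            by the rifling at one turn per twist_m metres of travel."""
            omega = 2.0 * math.pi * v_m_s / twist_m   # spin rate from the rifling, rad/s
            inertia = 0.5 * m_kg * r_m ** 2           # moment of inertia about the spin axis
            return 0.5 * m_kg * v_m_s ** 2 + 0.5 * inertia * omega ** 2

        # Example: 4 g bullet, 920 m/s muzzle velocity, 2.78 mm radius, 178 mm twist.
        print(muzzle_energy(0.004, 920.0, 0.00278, 0.178))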
  • Next, the impact point generation module 170 may generate second bullet motion information regarding the motion of the bullet B travelling between the virtual firearm F and the target T output on the screen S, by referring to the first bullet motion information and the environment information.
  • Here, the second bullet motion information may be exterior ballistics information regarding the motion changed by the environment (air pressure, gravity, temperature, wind direction/wind speed, and the like) while the bullet B flies through the air.
  • the impact point generation module 170 may generate the second bullet motion information by Equation 3 below.
  • Here, the second bullet motion information may include resistance energy information acting on the bullet B after it leaves the virtual firearm F. The resistance energy due to gravity, air density, and the like acts in the direction opposite to the bullet's travel.
  • In other words, the first bullet motion information may be information on the kinetic energy of the bullet B inside the firearm, and the second bullet motion information may be information on the kinetic energy from the position where the bullet B leaves the firearm, as sketched below.
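  • Equation 3 is likewise not reproduced here; a hedged sketch of the standard aerodynamic drag term it plausibly contains (names and the coefficient value are assumptions):

        import math

        def drag_force(rho_kg_m3: float, v_m_s: float, s_m2: float, c_d: float) -> float:
            """Quadratic aerodynamic drag F = 0.5 * rho * v^2 * S * C_d,
            acting opposite to the bullet's direction of travel."""
            return 0.5 * rho_kg_m3 * v_m_s ** 2 * s_m2 * c_d

        # Example: sea-level air, 900 m/s, 5.56 mm calibre cross-section, C_d = 0.25.
        S = math.pi * 0.00278 ** 2
        print(drag_force(1.225, 900.0, S, 0.25))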
  • Next, the impact point generation module 170 may generate the impact point information by referring to the bullet movement distance information, which is the distance from the virtual gun F to the target T, and the position and structure information of the target T.
  • Here, the position and structure information of the target T is information about the position and structure of the target T included in the shooting training content output on the screen S, and may include area information such as size; this information may be received from the shooting training content.
  • In addition, the impact point generation module 170 may reflect target T movement information regarding the movement of the target T and the impact time information in the impact point information. That is, when the target T moves, the impact point may be corrected by the delay, considering the impact time according to the travel distance of the bullet B.
  • In this way, the various elements that arise in an actual shooting environment, such as the firearm F, the bullet B, the environment, and the target T, are applied to the virtual shooting drill, improving the effectiveness of shooting training.
  • FIG. 7 is a view for explaining the motion of a howitzer round according to an embodiment of the present invention.
  • The impact point analysis device 100 of the present invention can also use the virtual gun F as a howitzer.
  • A howitzer is a type of firearm that hits a target T located behind an obstacle Z by firing over the obstacle in a curved trajectory. In the case of a howitzer, the round can therefore be moved along the parabolic howitzer trajectory T3.
  • Although the obstacle Z is represented on the flat screen S, the impact point analysis device 100 may generate impact point information by applying the distance to the obstacle Z from the screen S within the shooting training content.
  • Existing shooting training simulations fire the bullet in a straight line, making it difficult to implement a howitzer function.
  • In other words, existing technology neither reflects the real-space distance and the virtual-space distance together, as the present invention does, nor reflects the virtual-space distance of the obstacle Z, and therefore cannot calculate the impact point correctly.
  • To solve this, the impact point analysis apparatus 100 may collect muzzle angle information about the muzzle angle at the moment the virtual firearm F is fired, through the vision sensing module 140.
  • The impact point information regarding the impact point of the howitzer round can then be generated by referring to the various element information of the above-described embodiments together with the muzzle angle information.
  • In other words, the impact point analysis device 100 may generate impact point information by receiving the first bullet movement distance information, which is the actual distance from the virtual gun F to the screen S in real space, and the second bullet movement distance information, which is the virtual distance from the screen S to the target T in the shooting training content, the virtual reality space displayed on the screen S, and applying their combined distance as the bullet movement distance information D.
  • In addition, the impact point analysis device 100 may generate the impact point information in consideration of the first bullet movement distance information, the second bullet movement distance information, and, within the shooting training content displayed on the screen S, the distance from the screen S to the obstacle Z and the height of the obstacle Z.
  • In other words, by considering the distance from the screen S to the obstacle Z, the height of the obstacle Z, and the like within the shooting training content displayed on the screen S, the impact point analysis device 100 can determine whether the round hits the target T, as sketched below.
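  • A hedged sketch of such an obstacle-clearance check (a vacuum parabola; the drag, wind, and rifling terms of the full model are deliberately omitted, and all names are illustrative):

        import math

        G = 9.81  # gravitational acceleration, m/s^2

        def clears_obstacle(v0_m_s: float, angle_deg: float,
                            d_obstacle_m: float, h_obstacle_m: float) -> bool:
            """Does a round fired at speed v0_m_s and elevation angle_deg still
            sit above an obstacle of height h_obstacle_m at range d_obstacle_m?"""
            a = math.radians(angle_deg)
            t = d_obstacle_m / (v0_m_s * math.cos(a))        # time to reach the obstacle
            y = v0_m_s * math.sin(a) * t - 0.5 * G * t ** 2  # height at that range
            return y > h_obstacle_m

        # Example: 40 m/s at 45 degrees versus a 3 m obstacle at 60 m range.
        print(clears_obstacle(40.0, 45.0, 60.0, 3.0))  # True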
  • The generated impact point information may be reflected in the shooting training content and output to the user U on the screen S in real time.
  • Through the impact point analysis apparatus 100 of the present embodiment, not only flat-trajectory shooting training but also howitzer training over an obstacle Z may be performed just as in actual shooting, further improving the diversity of training.
  • FIG. 8 is a conceptual diagram illustrating a method of using a virtual shooting training simulation system according to another embodiment of the present invention.
  • In the configuration of the virtual shooting training simulation using the shooting training content, the simulation system 10 may perform image correction according to the positions of the user U and the virtual firearm G carried by the user U, together with the impact point analysis device described above with reference to FIGS. 1 to 7.
  • the simulation system may include an impact point analyzing apparatus 100, an image correcting apparatus 200, an image sensing apparatus 300, and an image output apparatus 400.
  • The image sensing apparatus 300 may be configured as a means such as a camera for detecting position information of the object O, such as the user U and the virtual firearm G carried by the user U, in real space.
  • The image sensing device 300 may be composed of a plurality of cameras and coupled to an upper portion of the screen S, which will be described later.
  • the image sensing apparatus 300 may collect image information of the object O to detect their location and generate location information thereof.
  • Since the impact point analysis device 100 is configured and operates as described above with reference to FIGS. 1 to 7, further description is omitted.
  • The image correction apparatus 200 may be a means for correcting the image information output on the screen according to a change in the position of the object O.
  • In other words, the image correction apparatus 200 may correct the image information (shooting training content) output on the screen S according to the position of the object O. A detailed configuration and operating method of the image correction apparatus will be described later with reference to FIG. 9.
  • the image sensing apparatus 300 may generate location information by sensing a vision marker M attached to the body of the user U, clothing, or the virtual firearm G.
  • the vision marker M may include a plurality of colors, patterns, and the like, and the image sensing apparatus 300 may be configured as an infrared camera capable of detecting the color or pattern of the corresponding vision marker M.
  • In addition, the vision marker M may be formed as a short-range communication tag or the like, or as an infrared-reflecting structure, and the image sensing apparatus 300 may be configured to correspond to the vision marker M used.
  • For example, when the vision marker M is a communication tag, the image sensing device 300 may be implemented in a configuration that can communicate with the corresponding tag.
  • The screen S is a means for outputting image information: it may be a blind- or roll-type structure onto which an image is projected in beam form, or a display unit capable of outputting its own image.
  • In the present embodiment, a fixed blind structure is described as an example.
  • However, the present invention is not limited thereto, and a mobile or variable screen, or a self-emitting image output display unit, may also be applied.
  • The image output apparatus 400 is a means for outputting an image toward the screen S and may be configured as a beam projector. Alternatively, the image output apparatus 400 may be configured as a display unit formed integrally with the screen S.
  • Although not shown, the shooting training simulation system 10 may further include a communication unit, a user input unit, a memory unit, and a controller, which is an integrated control unit for controlling the whole system.
  • According to the configuration described above, the image information output on the screen S is corrected according to the position of the object O, so that shooting training content matching the viewpoint of the object O located in real space may be output.
  • In addition, by applying variables similar to those of actual shooting training to the virtual firearm G and the shooting training content, training efficiency can be improved.
  • FIG. 9 is a block diagram illustrating a configuration of an image correcting apparatus 200 included in the virtual shooting training simulation system of FIG. 8.
  • The image correction apparatus 200 may be an apparatus for correcting image information of the shooting training simulation system 10 described above with reference to FIG. 8.
  • the image correction apparatus 200 may include a reference image module 210, a change image module 230, and a correction module 250.
  • The reference image module 210 may generate reference image information, which is the image corresponding to the reference position information, when the object is located at and detected in the reference position in real space.
  • In other words, reference image information may be generated such that, at the reference position, the coordinate information of the object and the center point coordinate information of the image lie on the same line. Accordingly, when the reference image information is output, the object's field of view may match the center point of the image.
  • When the object moves, the change image module 230 may detect change position information about the change position, the position to which the object has moved, and generate change image information corresponding to it.
  • The center point coordinate information of the change image information may be positioned on the same line as the change coordinate information of the object, in the same manner as for the reference image information described above, so that the object's field of view matches the center point of the change image information even at the change position.
  • The correction module 250 may generate correction information based on the change between the reference position information and the change position information, and generate corrected image information reflecting the generated correction information in the change image information.
  • In other words, the correction module 250 determines how much the reference image information must change, based on the difference between the reference position information and the change position information, and reflects this in the change image information, minimizing the mismatch between the real space and the virtual space.
  • For this purpose, the position information of the object in real space may include object coordinate information, which is the object's coordinate value, and reference coordinate information, which is the coordinate value of the reference image information, and change coordinate information, which is the coordinate value of the change image information, may be generated so as to correspond to the object coordinate information.
  • the correction module 250 may generate correction information so that the change coordinate information matches the reference coordinate information.
  • In addition, the correction module 250 may reset the screen information regarding the screen aspect ratio and generate corrected image information reflecting it.
  • Thus, the image correction apparatus 200 generates the corrected image information not only from the coordinate information of the object located in real space and the coordinate information of the reference image and change image in virtual space, but also in consideration of the aspect ratio of the screen, so that image distortion caused by the movement of the object and the resulting change of viewpoint can be minimized.
  • FIG. 10 is a flowchart illustrating the operating method of the virtual shooting training simulation system of FIG. 8, and FIGS. 11 to 16 are diagrams for describing the operating method of FIG. 10 step by step. The terms used in this embodiment are the same as those described above with reference to FIGS. 8 and 9, and reference numerals for the same elements are partly omitted.
  • Since the impact point analysis apparatus has been described above with reference to FIGS. 1 to 7, description of its configuration is omitted; note, however, that the information generated by the impact point analysis apparatus may be corrected by the image correction apparatus.
  • The shooting training simulation system will be described with reference to the configuration shown in FIG. 8.
  • First, the image correction apparatus of the shooting training simulation system may set reference position information (S11).
  • the reference position information may be setting information for matching an initial real space and a virtual space of the image correcting apparatus as illustrated in FIG. 11.
  • To this end, screen coordinate information, which is the coordinate information of the entire screen S, and reference coordinate information, which is the coordinate value of the reference position W in real space detected by the vision sensing unit, may be stored in advance.
  • the reference coordinate information may be coordinate information of an area detected by the vision sensing unit with respect to a specific point in the real space, which may be input and set through a user input module (not shown).
  • the measuring apparatus J may generate a positional relationship from a current position to a specific point by irradiating a laser like a laser tracker.
  • Specifically, when the measuring device J is disposed at the measurement position, which is one position in real space, and screen temporary coordinate information A1, A2, A3, A4, A5 regarding a plurality of mutually spaced coordinates in the interior region of the screen S is input, the laser is irradiated toward the screen temporary coordinates to calculate measurement/screen position information, which is the positional relationship between the measurement position and the screen temporary coordinate information.
  • In the present embodiment, five pieces of coordinate information are input as screen temporary coordinate information, but the present invention is not limited to this number of coordinates; any plurality of mutually different coordinates may be included.
  • the measurement / screen position information may include comprehensive positional relationship information about distance information, direction information, angle information, etc. of the measurement position with respect to each coordinate of the screen temporary coordinate information.
  • the measuring apparatus J may measure the above-described reference position and generate measurement / reference position information regarding the positional relationship between the measurement position and the reference coordinate information which is the reference position.
  • the measurement device J may transmit the generated measurement / screen position information and measurement / reference position information to the image correction device (reference image module).
  • The image correction apparatus may match the received measurement/screen position information with the screen reference coordinate information corresponding to the perimeter region of the screen S to generate first reference relationship information about their correlation.
  • the image correcting apparatus may generate second reference relationship information with respect to the correlation by referring to the received measurement / screen position information and the measurement / reference position information.
  • In addition, the image correction apparatus may generate third reference relationship information about the correlation between the measurement/reference position information and the screen reference coordinate information by referring to the first reference relationship information and the second reference relationship information.
  • As the image correction apparatus (reference image module) generates the third reference relationship information, the real space and the virtual space are matched according to it, and their positional relationship can be established more precisely, as sketched below.
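  • A hedged sketch of how such a third relation can be composed from the two measured relations, assuming, purely for illustration, that each relation is expressed as a 4x4 homogeneous transform:

        import numpy as np

        def compose_reference_relation(T_meas_to_screen: np.ndarray,
                                       T_meas_to_ref: np.ndarray) -> np.ndarray:
            """Given the transforms from the measurement position to the screen
            frame and to the reference frame, the reference-to-screen relation
            is their composition."""
            return T_meas_to_screen @ np.linalg.inv(T_meas_to_ref)

        def translation(x: float, y: float, z: float) -> np.ndarray:
            T = np.eye(4)
            T[:3, 3] = [x, y, z]
            return T

        # Example: screen 3 m and reference position 1 m in front of the device;
        # the composed relation places the reference frame 2 m from the screen.
        T = compose_reference_relation(translation(0, 0, 3), translation(0, 0, 1))
        print(T[:3, 3])  # [0. 0. 2.]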
  • the position information of the object O may be detected through the image sensing apparatus (S13).
  • When the image sensing apparatus detects the object, the image correction apparatus may determine whether the detected position information includes the reference position L1 information stored in the memory unit (S15).
  • Alternatively, the image correction apparatus may set the position information of the object O as first reference coordinate information and reflect the third reference relationship information in it, thereby converting the position information of the object O into reference position information. Accordingly, the reference position may be set automatically even if the object O is not located at the preset reference position.
  • The position information of the object O may be calculated by converting the real space into coordinate information and locating the coordinate information of the object O within it.
  • Here, the means for determining the position of the object O may recognize a region related to the eyes of the object O as the position reference.
  • For example, a vision marker may be attached near the user's eyes, and the image sensing device may generate position information by detecting that vision marker.
  • Alternatively, when the image sensing apparatus is configured as an image recognition means, the eyes of the object O in the recognized image information may be set as the reference point of the position information.
  • When the vision marker is attached to the virtual firearm, the virtual firearm may be set as the reference point.
  • the reference position L1 may be set according to an input signal in a state where the object O is located in the real space.
  • In other words, when an input signal is received, the controller may set the current position information of the object O as the reference position L1 information.
  • When the reference position L1 information is set, the image correction apparatus may generate reference image information SP, which is the image corresponding to the reference position L1 information (S17).
  • As shown in FIG. 12, the reference image information SP may be an image whose center coordinate information lies on the same line as the object O coordinate information, the real-space coordinate information of the object O. In other words, the reference image information SP may be an image in which the center coordinate information of the output content image is adjusted onto the same line as the object O coordinate information. Since the object O coordinate information is an area coordinate corresponding to the same viewpoint as the eyes of the object O, when that coordinate and the center point coordinate information C of the reference image information SP lie on the same line, the object O may be provided with an image output from its own viewpoint.
  • In FIG. 12, the reference image information SP is drawn separated from the screen S for convenience of description, but in practice it is displayed integrated into the screen S.
  • Thereafter, the image correction apparatus may determine whether the object O has moved from the reference position L1 (S19). This may be determined by whether the position information of the object O detected by the image sensing device includes change position L2 information, i.e., movement from the reference position L1 to another change position L2. In other words, it may be determined whether the position of the object O has changed through a change in the coordinate information of the object O in real space.
  • the image correction apparatus may generate change image information CP, which is an image corresponding to the change position L2 information (S21).
  • The change image information CP is substantially the same as the reference image information SP, differing in that the position information of the object O has changed from the reference position L1 to the change position L2. Accordingly, as illustrated in FIG. 13, the change image information CP may be image information having center point coordinate information C′ positioned on the same line as the object O coordinate information at the change position L2 of the object O. The object O may thus be provided with the change image information CP output from its viewpoint at the change position L2.
  • In this way, the object O may receive an image corresponding to the change of its position, but the change image information CP alone may exhibit a distortion phenomenon according to the viewpoint.
  • Specifically, the screen S may be implemented to function like a window located between the real space and the virtual space, in order to realize a more realistic virtual space. If the image direction were simply changed according to the position of the object O, a mismatch with the viewpoint of the object O could occur. Therefore, when the position of the object O moves, it is necessary to generate and apply correction information that can match the change image at the change position L2 to the reference image information SP.
  • To this end, the image correction apparatus may compare the reference position L1 information and the change position L2 information to generate correction information as the resulting value (S23).
  • the correction information may be a value for matching the change coordinate information, which is the coordinate value of the change image information CP, to the reference coordinate information, which is the coordinate value of the reference image information SP, at the change position L2.
  • Specifically, the reference coordinate information may include reference perimeter coordinate information SBP corresponding to the perimeter of the screen S in the reference image information SP, and the change image information CP may include change perimeter coordinate information CBP corresponding to the perimeter of the screen S in the change image information CP.
  • The image correction apparatus generates reference line information on virtual straight lines connecting the change position L2 to the reference perimeter coordinate information SBP, and may generate cross coordinate information P1, P2 for the intersections of the reference lines with the change coordinate information of the change image information CP.
  • In this case, change extension coordinate information CC, which extends the change coordinate information in the straight-line direction, may be generated, and the cross coordinate information P1 and P2 may be obtained by comparing it with the reference line information.
  • In the present embodiment, the perimeter coordinate information is presented only for both ends of the width of the screen S, but the present invention is not limited thereto; coordinate values corresponding to the width and height of the image for the left, right, top, and bottom positions of the object O may also be calculated.
  • Thereafter, the image correction apparatus may generate first correction coordinate information by Equation 4 below, in order to match the cross coordinate information P1, P2 to the reference image information SP.
  • the first corrected coordinate information may reduce the distortion generated when the object O views the screen S at the change position L2 by matching the cross coordinate information P1 and P2 with the existing image information.
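  • Equation 4 itself is not reproduced in this text; the geometric step it rests on, intersecting the straight line from the viewer's changed position through a reference perimeter coordinate with the screen plane, can be sketched as follows (coordinate conventions and names are assumptions):

        def line_plane_intersection(eye, point, plane_z=0.0):
            """Intersect the line from the viewer position `eye` through a
            reference perimeter coordinate `point` with the screen plane
            z = plane_z. Points are (x, y, z) tuples; returns (x, y) on screen."""
            ex, ey, ez = eye
            px, py, pz = point
            t = (plane_z - ez) / (pz - ez)   # parameter where the line meets the plane
            return (ex + t * (px - ex), ey + t * (py - ey))

        # Example: viewer moved 0.5 m sideways, 2 m in front of the screen,
        # sighting a perimeter point 1 m behind the screen plane.
        print(line_plane_intersection((0.5, 1.6, 2.0), (0.0, 1.0, -1.0)))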
  • the image correction apparatus may determine whether the corresponding correction coordinate information is included in the screen coordinate information (S25).
  • The screen coordinate information may be the reference coordinate values for the height and width of the screen S, and the images output on the screen S may be generated to correspond to it. Accordingly, when the first correction coordinate information is included in the screen coordinate information, the image correction apparatus may generate corrected image information reflecting the first correction coordinate information in the change image information CP (S29).
  • However, the first correction coordinate information may not be included in the screen coordinate information. This is the case where the first correction coordinate information generated by Equation 4 falls outside the screen coordinate information, as shown in FIG. 15.
  • In this case, the screen information regarding the aspect ratio of the screen S may be reset (S27). This may be implemented by Equation 5 below, which yields the second correction coordinate information (Un, Vn).
  • In other words, since the first screen coordinate information SR1 may not include the first correction coordinate information, extended screen information SR2 that enlarges it may be used. Accordingly, as shown in FIG. 16, the screen coordinate information may be expanded to include the first correction coordinate information, and the first correction coordinate information may be converted into second correction coordinate information reset to correspond to the extended screen information SR2, as sketched below.
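  • Equation 5 is not reproduced in this text either; a hedged sketch of the extension-and-rescale step it describes (uniform enlargement preserving the aspect ratio; names and normalization are assumptions):

        def rescale_to_extended_screen(u: float, v: float, w: float, h: float):
            """When a first correction coordinate (u, v) falls outside the w x h
            screen SR1, grow the screen uniformly until it contains the point
            (extended screen SR2), then re-express the point in the extended
            screen's normalized coordinates as (Un, Vn)."""
            scale = max(1.0, abs(u) / w, abs(v) / h)  # smallest uniform enlargement
            w2, h2 = w * scale, h * scale             # extended screen SR2
            return (u / w2, v / h2)                   # second correction coordinates

        # Example: a point 10% past the right edge of a 1.6 m x 0.9 m screen.
        print(rescale_to_extended_screen(1.76, 0.5, 1.6, 0.9))  # (1.0, ~0.505)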
  • When the second correction coordinate information has been calculated by resetting the screen information, the image correction apparatus generates corrected image information reflecting it in the change image information CP and outputs it to the image output apparatus (S29) so that it is displayed on the screen (S31).
  • Through this, the target image changes in real time to match the field of view as the position of the object O changes, and the coordinates of the corrected image are set in consideration of the aspect ratio, so that image distortion due to the change of view of the object O can be minimized; a sketch of this correction flow follows.
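  • The following hedged sketch makes the S25/S27/S29 branch concrete. Since Equations 4 and 5 are not reproduced in this text, the aspect-ratio-preserving rescale used for the second correction coordinate information is an assumption for illustration only.

```python
# Hedged sketch of the S25/S27/S29 flow described above. The rescaling rule
# is an assumption: the source text does not reproduce Equations 4 and 5.
from typing import List, Tuple

def correct_to_screen(points: List[Tuple[float, float]],
                      width: float, height: float) -> List[Tuple[float, float]]:
    # S25: is every first-correction coordinate inside the screen (SR1)?
    if all(0.0 <= x <= width and 0.0 <= y <= height for x, y in points):
        return points                       # S29: use the first correction as-is
    # S27: reset the screen information -- extend SR1 to SR2 with the same
    # aspect ratio until every corrected coordinate is included.
    scale = max([x / width for x, _ in points] +
                [y / height for _, y in points] + [1.0])
    # Second correction (U_n, V_n): map the points found on SR2 back to SR1.
    return [(x / scale, y / scale) for x, y in points]

# One corrected point falls outside a 16:9 screen, so the screen is
# notionally extended and both points are rescaled:
print(correct_to_screen([(17.6, 4.5), (8.0, 9.5)], 16.0, 9.0))
```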
  • In addition, by generating and applying impact point information that reflects the actual shooting environment through the impact point analysis device, it is possible to implement a virtual shooting training simulation whose environment is as close as possible to actual shooting training in real space.
  • FIGS. 17 and 18 are diagrams for describing a method of deriving a ballistic trajectory according to an embodiment of the present invention. FIG. 17 illustrates the trajectory derived by the impact point analyzing apparatus 100 on the X axis (straight-line distance: range) and the Y axis (height), and FIG. 18 illustrates the trajectory derived by the impact point analyzing apparatus 100 on the X axis (range) and the Z axis (drift distance: drift).
  • The impact point analysis apparatus 100 may derive a ballistic trajectory using Equation 6, where m is the warhead mass, S is the warhead cross-sectional area, ρ is the air density, V is the warhead velocity, and the linear drag coefficient, the linear lift coefficient, and the total angle of attack characterize the aerodynamic terms (a hedged numeric sketch follows below).
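  • Because Equation 6 itself is not reproduced in this text, the following is only a minimal point-mass trajectory sketch built from the quantities defined above; the force model, the omission of lift and spin drift, and every numeric value are illustrative assumptions.

```python
# Minimal point-mass trajectory sketch using the quantities named above
# (m, S, rho, a drag coefficient, velocity V). This is not the patent's
# Equation 6: lift and spin drift are omitted and all numbers are assumed.
import math

def trajectory(v0=920.0, angle_mrad=2.0, m=0.004, S=2.5e-5,
               rho=1.225, cd=0.25, dt=0.001, max_range=300.0):
    g = 9.80665
    vx = v0 * math.cos(angle_mrad / 1000.0)
    vy = v0 * math.sin(angle_mrad / 1000.0)
    x = y = 0.0
    path = [(x, y)]
    while x < max_range and y > -1.0:
        v = math.hypot(vx, vy)
        k = 0.5 * rho * S * cd * v / m   # quadratic-drag deceleration per unit speed
        vx += -k * vx * dt
        vy += (-g - k * vy) * dt
        x += vx * dt
        y += vy * dt
        path.append((x, y))
    return path

# Height above the muzzle axis near the 300 m range (cf. FIG. 17):
print(trajectory()[-1])
```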
  • The conventional laser method applies a trajectory without distinguishing the gravity, lift, drag, rotation, and the like acting on a bullet flying through the air (laser graph in FIG. 17). In contrast, the present invention reflects the forces (gravity, lift, drag, rotation, etc.) acting on the bullet as in Equation 6, so that the height (Y axis) and the drift (Z axis) vary depending on the range (X axis), and the corresponding ballistic trajectory (MPTMS graph in FIG. 17) can be derived.
  • Here, the Y axis is referenced to the shooter's line of sight ("0 cm" on the Y axis), and at the zero point of the X axis the muzzle, which defines the aiming direction, lies at about -6 cm on the Y axis.
  • At a given range, the conventional laser method takes the point at a height of about -6 cm and zero drift as the impact point, whereas the method according to the present invention takes the point at a height of about +10 cm and a drift of about -1 cm as the impact point. At a longer range, the conventional laser method still takes the point at a height of about -6 cm and zero drift as the impact point, whereas the method according to the present invention takes the point at a height of about +17 cm and a drift of about -2 cm as the impact point.
  • That is, the method according to the present invention reflects how the height (Y axis) and the drift (Z axis) of the impact point vary with the range, thereby deriving an impact point very similar to the actual trajectory.
  • FIG. 19 is a view for explaining how the impact point changes according to the ballistic trajectory, according to an embodiment of the present invention.
  • FIG. 19 shows the impact point on the target according to the conventional laser method and according to the method of the present invention for targets at 100 m / 200 m / 300 m: the existing laser method shows no change at all in the impact point 1800 with distance, whereas the method according to the invention shows a change in the impact point 1900 with distance. Because the conventional laser method reflects a constant impact point irrespective of distance, every shot is recognized as hitting the target center; the method according to the present invention instead reflects the impact point differently depending on the distance, so it is possible to determine accurately whether a shot misses the target center and by how many centimeters it deviates, as in the sketch below. Therefore, the method according to the present invention can derive an impact point very similar to the actual trajectory.
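  • The distance-dependent deviation described above can be read directly off a sampled trajectory. The following hypothetical sketch reuses the trajectory() helper from the earlier sketch and compares the ballistic impact height at 100 m / 200 m / 300 m with a constant laser-style baseline.

```python
# Hypothetical sketch: sample the simulated trajectory at several target
# ranges and compare it with the constant offset of the laser-style method,
# as described above. trajectory() is the illustrative helper sketched
# earlier; the laser baseline is simply a constant, per the text.

def height_at_range(path, target_range):
    """Linearly interpolate the trajectory height where it crosses target_range."""
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        if x0 <= target_range <= x1:
            f = (target_range - x0) / (x1 - x0)
            return y0 + f * (y1 - y0)
    return None

path = trajectory(max_range=310.0)
for rng in (100.0, 200.0, 300.0):
    h = height_at_range(path, rng)
    print(f"{rng:.0f} m: ballistic height {h:+.3f} m vs laser-style constant 0.000 m")
```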
  • The impact point analysis device for improving the precision of the trajectory and the impact point by applying the shooting environment of a real personal firearm in virtual reality, and the virtual shooting training simulation system using the same, are not limited to the configurations and operations of the embodiments described above. The above embodiments may be configured such that various modifications can be made by selectively combining all or some of them.


Abstract

The present invention relates to an impact point analysis apparatus for improving the accuracy of a ballistic trajectory and a point of impact by applying the shooting environment of a real personal firearm to virtual reality, and to a virtual shooting training simulation system using the apparatus. The impact point analysis apparatus comprises: a firearm analysis module for generating firearm information on the firearm structure of a virtual firearm, which is a dummy firearm held by a user in real space; a bullet analysis module for generating bullet information on the structure of a bullet applied to the virtual firearm; an environment analysis module for detecting an environmental state of shooting training content presented on a screen to generate environment information on the environmental state; and an impact point generation module for generating impact point information relating to a position at which the bullet strikes a target displayed on the screen, with reference to at least one of the firearm information, the bullet information, and the environment information.
PCT/KR2018/014647 2018-03-26 2018-11-26 Point-of-impact analysis apparatus for improving accuracy of ballistic trajectory and point of impact by applying shooting environment of real personal firearm to virtual reality, and virtual shooting training simulation using same Ceased WO2019190019A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/041,009 US20210102781A1 (en) 2018-03-26 2018-11-26 Point-of-impact analysis apparatus for improving accuracy of ballistic trajectory and point of impact by applying shooting environment of real personal firearm to virtual reality, and virtual shooting training simulation using same

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20180034594 2018-03-26
KR10-2018-0034594 2018-03-26
KR10-2018-0130030 2018-10-29
KR1020180130030A KR102041461B1 (ko) 2018-03-26 2018-10-29 Impact point analysis apparatus for improving the precision of the ballistic trajectory and the impact point by applying the shooting environment of a real personal firearm in virtual reality, and virtual shooting training simulation using the same

Publications (1)

Publication Number Publication Date
WO2019190019A1 true WO2019190019A1 (fr) 2019-10-03

Family

ID=68062284

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/014647 Ceased WO2019190019A1 (fr) 2018-03-26 2018-11-26 Point-of-impact analysis apparatus for improving accuracy of ballistic trajectory and point of impact by applying shooting environment of real personal firearm to virtual reality, and virtual shooting training simulation using same

Country Status (1)

Country Link
WO (1) WO2019190019A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05288495A (ja) * 1992-04-08 1993-11-02 Mitsubishi Heavy Ind Ltd Ballistic analysis apparatus
KR20000012160A (ko) * 1998-11-16 2000-03-06 윤욱선 Shooting training simulation system using augmented reality and method therefor
US20100301116A1 (en) * 2006-02-03 2010-12-02 Burris Company Trajectory compensating sighting device systems and methods
KR20120042382A (ko) * 2010-10-25 2012-05-03 주식회사 에프나인 Shooting management apparatus using images and sensors
KR20140062408A (ко) * 2012-11-14 2014-05-23 주식회사 도담시스템스 Simulated training system for indoor high-angle-fire marksmanship and control method thereof


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18911716

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18911716

Country of ref document: EP

Kind code of ref document: A1