Method And Apparatus For Providing Live-Fire Simulation Game

Information

  • Patent Application
    20190219364
  • Publication Number
    20190219364
  • Date Filed
    March 20, 2019
  • Date Published
    July 18, 2019
Abstract
Disclosed are a method and an apparatus for providing a live-fire simulation game. The present invention relates to a method and an apparatus for providing a shooting simulation using live rounds, wherein a simulation image is output on each of the image screens installed in a live-fire exercise room, and the impact point of a live round fired by a shooter in response to the output simulation image is calculated by sensing the position of the live round and is compared with the position of a target included in the simulation image, so as to generate shooting result information.
Description
BACKGROUND OF THE INVENTION
Technical Field

The present disclosure in one or more embodiments relates to a method and apparatus for providing a live-round shooting simulation game.


BACKGROUND

The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.


In a general live-round shooting system, a shooter shoots at a target a predetermined distance away in a shooting range. The live-round shooting system is mainly installed outdoors. For this reason, a large area is required, and the noise generated by shooting is severe. As a result, the conditions required to be met for installation of such a live-round shooting system are very stringent.


Consequently, it is very difficult to realize a shooting practice system that enables a shooter to shoot at a moving target, at targets located at different distances, or at a plurality of targets while actually moving.


In order to solve such limitations, a virtual shooting simulation system based on a laser gun (a gun for games) without using live rounds has been developed. However, this virtual shooting simulation system is similar to a game that can be experienced in an amusement hall, a game room, etc., and is different from a real shooting system in various aspects.


Meanwhile, a live-round shooting structure may be constructed using a large-sized structure (for example, a building, a city area, or a set) in order to apply a live-round shooting system. However, a very large area is required, and shooting practice is possible only in the predetermined live-round shooting structure. As a result, it is not possible to take shooting practice or play a shooting game in various situations. In addition, a plurality of shooters cannot simultaneously take shooting practice since live rounds are used. Furthermore, only stationary targets or targets that move in simple ways can be used.


The present disclosure provides a method and apparatus for providing a live-round shooting simulation game that outputs a simulation image to an image screen installed in a live-round shooting room, calculates the impact point of a live round fired by a shooter in response to the output simulation image by sensing the position of the live round, and compares the impact point with the position of a target included in the simulation image to generate shooting result information.


SUMMARY OF THE INVENTION

In accordance with some embodiments of the present disclosure, a live-round shooting simulation game provision method using a shooting simulation provision apparatus includes a simulation image output process of transmitting simulation image data to a projector installed in a live-round shooting room such that a simulation image is output to a specific area, a sensing processing process of acquiring sensing information about the position of a live round fired by a shooter from a sensor installed in the specific area, an impact point calculation process of calculating an impact point of the live round based on the sensing information, a target position calculation process of calculating target position information of a target included in the simulation image, and a result calculation process of measuring a shooting accuracy using the impact point and the target position information and generating shooting result information based on the shooting accuracy.


In accordance with some embodiments of the present disclosure, a shooting simulation provision apparatus using a live round includes a simulation image controller for transmitting simulation image data to a projector installed in a live-round shooting room such that a simulation image is output to a specific area, a sensing processing unit for acquiring sensing information about the position of a live round fired by a shooter from a sensor installed in the specific area, an impact point calculation unit for calculating an impact point of the live round based on the sensing information, a target position calculation unit for calculating target position information of a target included in the simulation image, and a result calculation unit for measuring a shooting accuracy using the impact point and the target position information and generating shooting result information based on the shooting accuracy.


In accordance with some embodiments of the present disclosure, a shooting simulation system for shooting practice using a live round includes a projector for projecting a simulation image to a specific area in a live-round shooting room, an image screen installed so as to be spaced apart from at least three inner walls of the live-round shooting room by a predetermined distance in the state of being parallel to the inner walls for outputting the simulation image, a headset including a sensor for sensing a gaze position or direction of a shooter, the headset being configured to provide a sound about the simulation image to the shooter, a live round sensor attached to the image screen for sensing the position of a live round fired by the shooter to generate sensing information, and a shooting simulation provision apparatus for transmitting simulation image data to the projector such that the simulation image is output, measuring a shooting accuracy using an impact point of the live round calculated based on the sensing information and target position information of a target included in the simulation image, and generating shooting result information based on the shooting accuracy.


According to the present disclosure as described above, live-round shooting practice for use in, for example, games, training, and operations may be taken using live rounds under the same conditions as an actual situation. In addition, live-round shooting practice may be taken under the same conditions as an actual situation, reflecting the actions of the shooter, such as replacement of a magazine and movement between positions.


All simulations may be performed in a live-round shooting room, and the shooting result using live rounds and the simulation result of all shooting simulations may be generated. In addition, shooting practice or games may be performed using live rounds with respect to a moving target, a plurality of targets, or a target corresponding to a shooting practice theme while using a smaller area than a conventional live-round shooting range.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 is a block diagram schematically showing a live-round shooting simulation system according to at least one embodiment;



FIG. 2 is a block diagram schematically showing a live-round shooting simulation provision apparatus according to at least one embodiment;



FIG. 3 is a flowchart describing a live-round shooting simulation game provision method according to at least one embodiment;



FIGS. 4a and 4b are illustration diagrams describing the structure of a live-round shooting simulation system according to at least one embodiment;



FIG. 5 is an illustration diagram describing the installation structure of sensors for sensing the position of a live round according to live-round shooting in a shooting simulation system according to at least one embodiment;



FIG. 6 is an illustration diagram showing a method of calculating shooting accuracy in a shooting simulation system according to at least one embodiment;



FIG. 7 is an illustration diagram showing the movement path of a shooter for calculating a simulation result in a shooting simulation system according to at least one embodiment; and



FIG. 8 is an illustration diagram showing a shooting simulation system according to at least one embodiment that is operated together with a plurality of live-round shooting rooms.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, at least one embodiment of the present disclosure will be described in detail with reference to the accompanying drawings.


A live-round shooting simulation system according to at least one embodiment is described as a system that is applicable to provide sports or games using live rounds, such as shooting practice and shooting games using live rounds. However, the present disclosure is not limited thereto. The live-round shooting simulation system is also applicable to military/training systems for military operations, terror training, and hostage rescue.


A live-round shooting simulation system according to at least one embodiment provides a simulation game that is capable of performing live-round shooting at various targets. The targets, which are objects for shooting, may be real targets, such as people, animals, and objects. However, the present disclosure is not limited thereto. The targets may also be virtual targets, such as zombies and extraterrestrials.


In addition, a live-round shooting simulation system according to at least one embodiment is realized such that a shooter (a game participant) shoots while looking at a game scene output on a screen. Alternatively, the live-round shooting simulation system may be realized such that a shooter (a game participant) shoots at a game scene that is output as a stereoscopic scene while wearing 3D glasses or smart glasses.


In addition, live rounds used in a live-round shooting simulation system according to at least one embodiment may be general live rounds for military or shooting. Alternatively, the live rounds may be live rounds for games, which are configured to have a reduced penetration force or a reduced amount of gunpowder so as to be used for live-round shooting simulation.



FIG. 1 is a block diagram schematically showing a live-round shooting simulation system according to at least one embodiment.


A live-round shooting simulation system according to at least one embodiment includes a projector 110, an image screen 120, a sensor 130, a sound device 140, and a shooting simulation provision apparatus 150. Here, the image screen 120, the sensor 130, and the sound device 140 are preferably provided in a live-round shooting room, and the shooting simulation provision apparatus 150 is preferably provided outside the live-round shooting room. However, the present disclosure is not limited thereto. The shooting simulation provision apparatus 150 may be provided in the live-round shooting room at one side thereof in the state of being covered with a bombproof material.


The projector 110 is a device provided in the live-round shooting room for outputting a simulation image. The projector 110 outputs a simulation image using simulation image data received from the shooting simulation provision apparatus 150. The simulation image may be projected on the image screen 120, which is installed in the live-round shooting room. In the case in which the image screen 120 is not installed in the live-round shooting room, the simulation image may be output to at least three inner walls of the live-round shooting room.


The projector 110 may include at least three lenses for projecting simulation images onto image screens 120 installed along at least three inner walls of the live-round shooting room. However, the present disclosure is not limited thereto. An image projection apparatus including at least three projectors, which are connected to each other, may be provided.


The simulation image is a first-person image in which a shooter takes shooting practice or shooting training or plays a shooting game while looking at a target, a background, etc. The simulation image may be an image similar to an image for a shooting game or a first-person shooter (FPS). The simulation image may be an image created so as to be suitable for a specific shooting practice program.


The image screen 120 displays the simulation image output by the projector 110. The image screen 120 includes a plurality of screens installed so as to be spaced apart from at least three inner walls of the live-round shooting room by a predetermined distance in the state of being parallel to the inner walls. Here, the image screen 120 may be realized by separate screens or a single screen installed on at least three inner walls.


Meanwhile, the image screen 120 is described as being installed so as to be parallel to each wall of the live-round shooting room (a rectangular form) when viewing the live-round shooting room from above. However, the present disclosure is not limited thereto. The image screen 120 may be installed in a circular shape about the center of the live-round shooting room.


Some of the sensors 130 may be installed at one side of the image screen 120. Live round sensors 136 for sensing the position of a live round fired through a gun of the shooter may be installed at the image screen 120. For example, live round sensors 136 may be disposed in the image screen 120 so as to be spaced apart from each other in a lattice arrangement, or may be disposed at a portion of the edge of the image screen 120.


The sensors 130 are devices for sensing the gaze and motion of a shooter, the state of the gun of the shooter, and the position of a live round fired through the gun of the shooter. The sensors 130 include a gaze sensor 132 for sensing the gaze of a shooter, a motion sensor 134 for sensing the movement of the shooter, and a live round sensor 136 for sensing a fired live round.


The gaze sensor 132 is a sensor for sensing the gaze of a shooter according to the simulation image. The gaze sensor 132 is preferably mounted at one side of the shooter. For example, the gaze sensor 132 may be mounted on a hat of the shooter or a sound device 140 that the shooter wears, such as a headset. The gaze sensor 132 may be mounted at any of various positions on the shooter, such as a vest, a hat, a military uniform, and a combat vest, as long as the gaze sensor is capable of sensing the gaze position of the shooter or the direction in which the shooter moves.


The gaze sensor 132 may include a radar sensor for sensing the gaze position of the shooter. However, the present disclosure is not limited thereto. The gaze sensor 132 may be realized by a composite sensor including a plurality of sensors, such as an infrared sensor and a gyro sensor.


The gaze sensor 132 transmits gaze sensing information about the gaze of the shooter to the shooting simulation provision apparatus 150. Here, the gaze sensing information may include variation of the gaze by being oriented upwards, downwards, leftwards, and rightwards relative to a predetermined reference gaze. However, the present disclosure is not limited thereto. In the case in which a coordinate value is set with respect to the wall of the live-round shooting room or the image screen 120, a coordinate value corresponding to the gaze of the shooter may be included. Here, the predetermined reference gaze is critical gaze coordinates set by the shooter or a manager in the initial stage of the simulation image in order to sense a change in the gaze of the shooter. For example, the predetermined reference gaze may be center coordinates of the image screen 120.
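As a minimal sketch of how the gaze variation relative to the predetermined reference gaze might be computed, assuming planar screen coordinates with the reference gaze at the screen center (the function name and all coordinate values here are illustrative, not taken from the disclosure):

```python
def gaze_deviation(gaze_xy, reference_xy):
    """Offset (dx, dy) of the sensed gaze point from the reference gaze.

    Positive dx means the gaze moved rightwards; positive dy, upwards,
    assuming a y-up screen coordinate system.
    """
    dx = gaze_xy[0] - reference_xy[0]
    dy = gaze_xy[1] - reference_xy[1]
    return dx, dy

# Reference gaze at a hypothetical screen center (960, 540); a sensed
# gaze of (1160, 440) is 200 units rightwards and 100 units downwards.
dx, dy = gaze_deviation((1160, 440), (960, 540))
```

The same pair of offsets could then drive the simulation image controller to pan the output scene.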


The motion sensor 134 is a sensor for sensing the motion of the shooter according to the simulation image. The motion sensor 134 senses the motion of the shooter for movement and transmits movement sensing information about the motion of the shooter to the shooting simulation provision apparatus 150. Here, the movement sensing information includes information about the number of steps, the movement distance, and the leg motion of the shooter.


The motion sensor 134 may be a sensor mounted at one side of a leg of the shooter (for example, a knee or an ankle). However, the present disclosure is not limited thereto. For example, in the case in which a moving pedal for the leg motion of the shooter is provided, the motion sensor 134 may be a sensor mounted at the moving pedal for sensing the number of steps and the movement distance of the shooter based on the number of motions of the pedal. Here, the moving pedal may be a pedal similar or identical to a pedal of an elliptical machine or a pedal of a treadmill.


The live round sensor 136 is a sensor for sensing the position of a live round fired through the gun of the shooter. The position of the live round means information about the live round penetration point and the trajectory of the live round sensed depending on the position at which the live round sensor 136 is installed. The live round sensor 136 includes a plurality of sensors for sensing the position of a live round, and may be installed in the live-round shooting room in various forms.


The live round sensor 136 may include a plurality of sensors installed at the image screen 120. For example, live round sensors 136 may be disposed in the image screen 120 so as to be spaced apart from each other in a lattice arrangement. In this case, the live round sensors 136 generate live round sensing information about the live round penetration point at which the live round has passed through the image screen 120.
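One plausible way to turn lattice-sensor readings into a penetration point is to take the centroid of the sensors that registered the passing round; this centroid approach, the grid spacing, and the sensor ids below are assumptions for illustration only:

```python
def penetration_point(triggered_ids, sensor_grid):
    """Estimate the live round penetration point as the centroid of the
    lattice sensors that registered the passing round."""
    xs = [sensor_grid[i][0] for i in triggered_ids]
    ys = [sensor_grid[i][1] for i in triggered_ids]
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Hypothetical 3 x 3 lattice with sensors 50 units apart, ids "s00".."s22"
# (row then column): sensor "s12" sits at x = 2 * 50, y = 1 * 50.
grid = {f"s{r}{c}": (c * 50, r * 50) for r in range(3) for c in range(3)}
point = penetration_point(["s11", "s12"], grid)
```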


Meanwhile, the live round sensors 136 may be disposed at a portion of the edge of the image screen 120. In this case, the live round sensors 136 generate live round sensing information about the trajectory of the live round that has passed through the image screen 120.


The live round sensors 136 are described as being installed at the image screen 120. However, the present disclosure is not limited thereto. Live round sensors 136 may be disposed at the wall of the live-round shooting room so as to be spaced apart from each other in a lattice arrangement, or may be disposed at a portion of the edge of the wall. In the case in which the live round sensors 136 are installed at the wall, the live round sensors 136 generate live round sensing information about the trajectory of the live round or the position of the wall in which the live round is lodged.


The live round sensor 136 may transmit the generated live round sensing information to the shooting simulation provision apparatus 150.


The sensors 130 may include a laser aiming sensor for additionally sensing the direction and the aiming point of the gun of the shooter and a trigger sensor for sensing the firing time of the gun. In the case in which the laser aiming sensor and the trigger sensor are used, the shooter may perform shooting simulation using a blank shot instead of a live round.


In a shooting simulation system according to at least one embodiment, the shooter takes shooting practice using a real gun. For this reason, the shooter must replace a magazine of the gun. In the shooting simulation system, therefore, it is not necessary to provide an additional sensor for recognizing the exchange of the magazine. That is, the shooting simulation system enables the shooter to take shooting practice in the same environment as a real environment.


The sound device 140 is a device for providing a sound from the simulation image to the shooter. The sound device 140 is preferably a headset, which the shooter may wear. However, the present disclosure is not limited thereto. The sound device 140 may be a sound output device (not shown) provided in the live-round shooting room for outputting a sound.


The sound device 140 includes a speaker. In the case in which the sound device 140 is a headset, a microphone for receiving a voice of the shooter and a gaze sensor for sensing the gaze of the shooter may be further included.



The shooting simulation provision apparatus 150 is an apparatus that is operated together with various devices included in the live-round shooting room in order to provide and control live-round shooting simulation.


The shooting simulation provision apparatus 150 transmits the simulation image data to the projector 110 such that the simulation image is displayed on a specific area in the live-round shooting room, for example, on the image screen 120.


The shooting simulation provision apparatus 150 determines the gaze position of the shooter (the point at which the shooter gazes) using the gaze sensing information acquired from the gaze sensor 132, which is mounted at the hat or the headset of the shooter, and controls the simulation image based on gaze position information determined based on the gaze sensing information.


The shooting simulation provision apparatus 150 may acquire live round sensing information about the position of the live round fired by the shooter from the live round sensor 136, which is installed at the image screen 120 for outputting the simulation image, and calculate the impact point of the live round based on the live round sensing information. Here, the impact point may be calculated by matching the sensing information included in the live round sensing information with a predetermined coordinate map on the image screen 120.


The shooting simulation provision apparatus 150 calculates target position information about the target for shooting practice included in the simulation image, measures a shooting accuracy using the impact point and the target position information, and generates shooting result information based on the shooting accuracy. Here, the shooting result information is result information including a score corresponding to the shooting accuracy. The shooting result information may be generated for each target for shooting practice, or may be generated by adding scores of a plurality of targets for shooting practice included in respective sections of the simulation image.


Also, in the case in which evaluation reference information corresponding to a shooting practice theme is provided, the shooting simulation provision apparatus 150 generates simulation result information about the shooting practice theme based on the evaluation reference information. In other words, the shooting simulation provision apparatus 150 calculates movement path information about the movement path from the departure (simulation start) point to the destination (simulation end) point depending on the contents of the shooting practice based on the simulation image, and generates simulation result information about the shooting practice theme using the movement path information and the shooting result information. Here, the simulation result information is information generated based on the score calculated by comparing a predetermined shortest path on the simulation image with the movement path information of the shooter and the score of the shooting result information.
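A sketch of how the path comparison and the shooting score might be combined into simulation result information; the weighting scheme, the 0.4/0.6 split, and the use of a length ratio for path efficiency are illustrative assumptions rather than details given in the disclosure:

```python
def simulation_result(shortest_path_len, shooter_path_len, shooting_score,
                      path_weight=0.4, shooting_weight=0.6):
    """Combine a path-efficiency score with the shooting score.

    Path efficiency is the ratio of the predetermined shortest path on
    the simulation image to the path the shooter actually travelled
    (1.0 = optimal).  The weights are illustrative assumptions.
    """
    efficiency = min(1.0, shortest_path_len / shooter_path_len)
    return path_weight * (efficiency * 100) + shooting_weight * shooting_score

# A shooter who travelled 125 m against a 100 m shortest path, with a
# shooting score of 90, scores 0.4 * 80 + 0.6 * 90 = 86.
score = simulation_result(100.0, 125.0, 90.0)
```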



FIG. 2 is a block diagram schematically showing a live-round shooting simulation provision apparatus according to at least one embodiment.


A live-round shooting simulation provision apparatus 150 according to at least one embodiment includes a simulation image controller 210, a gaze determination unit 212, a sensing processing unit 220, an impact point calculation unit 230, a target position calculation unit 240, a movement path calculation unit 250, and a result calculation unit 260. A live-round shooting simulation provision apparatus 150 according to at least one embodiment is described as being realized by a hardware module that provides shooting simulation for shooting practice using live rounds. However, the present disclosure is not limited thereto. The live-round shooting simulation provision apparatus may be realized by a software module loaded in a predetermined shooting control device so as to use hardware.


The simulation image controller 210 performs control such that the simulation image data is transmitted to the projector 110, which is installed in the live-round shooting room, and the simulation image is output to a specific area in the live-round shooting room. Here, the simulation image is a first-person image in which the shooter takes shooting practice or shooting training or plays a shooting game while looking at a target, a background, etc. The simulation image may be an image similar to an image for a shooting game or a first-person shooter (FPS). The simulation image may be output to at least three inner walls of the live-round shooting room or to an image screen 120 installed so as to be spaced apart from at least three inner walls by a predetermined distance in the state of being parallel to the inner walls.


The simulation image controller 210 may transmit the simulation image data previously stored in the live-round shooting simulation provision apparatus 150 to the projector 110. However, the present disclosure is not limited thereto. For example, the shooting simulation provision apparatus 150 may further include a data acquisition unit (not shown) for acquiring simulation image data corresponding to the shooting practice theme from an external server. Here, the external server may be a server for providing simulation image data corresponding to a shooting practice theme, such as a shooting game, basic shooting practice, shooting sport practice, a military operation, terror training, or hostage rescue. In this case, the simulation image controller 210 transmits the simulation image data corresponding to the shooting practice theme to the projector 110 such that the simulation image about the shooting practice theme is output.


The simulation image controller 210 may be operated together with the gaze determination unit 212 and the sensing processing unit 220 in order to change the simulation image data that is transmitted to the projector 110. In other words, the simulation image controller 210 may control the simulation image data in order to reflect, in real time, shooting practice situations that occur in the live-round shooting room in the simulation image based on the sensing information received from the sensors 130.


The gaze determination unit 212 determines the gaze position of the shooter based on the gaze sensing information acquired from the sensors 130. The gaze determination unit 212 may determine the gaze position of the shooter using the gaze sensing information acquired from the gaze sensor 132 mounted on a hat of the shooter or on a sound device 140 that the shooter wears, such as a headset. That is, the gaze determination unit 212 may determine the gaze position of the shooter, such as the gaze direction of the shooter, the screen corresponding to the gaze of the shooter, the height of the gaze of the shooter, and the movement of the gaze of the shooter, based on the gaze sensing information. Here, the gaze sensing information may include variation of the gaze by being oriented upwards, downwards, leftwards, and rightwards relative to the predetermined reference gaze, or the coordinate value at which the gaze of the shooter is sensed, which is one of the predetermined coordinate values on the wall of the live-round shooting room or on the image screen 120.


The gaze determination unit 212 may transmit the gaze position information, determined based on the gaze sensing information, to the simulation image controller 210 such that the simulation image is controlled. For example, the gaze determination unit 212 may determine the gaze position of the shooter, and may change the simulation image in response to the gaze position. The simulation image is described as being changed using only the gaze position information of the shooter, which is determined based on the gaze sensing information. Alternatively, movement sensing information may be further applied to change the simulation image. Here, the movement sensing information includes information about the number of steps, the movement distance, and the leg motion of the shooter.


For example, the gaze determination unit 212 determines the motion of the shooter, such as forward movement and backward movement, during the shooting practice based on the movement of the moving pedal or the leg motion of the shooter. In the case in which both gaze turning and motion occur, the screen of the simulation image is controlled such that the shooter moves in the gaze direction. In the case in which gaze turning occurs without motion, the screen of the simulation image is controlled such that only the gaze direction is moved in a still state.
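The per-frame decision described above can be sketched as a small dispatch function. The two stated branches follow the text; the behavior when motion occurs without gaze turning is not spelled out in the disclosure, so the "move_forward" branch is an assumption, as are all the result labels:

```python
def next_view_update(gaze_turned, motion_detected):
    """Per-frame decision for the simulation image: move the viewpoint in
    the gaze direction when gaze turning and motion occur together, rotate
    in place when only the gaze turns; the remaining branches are
    assumptions not spelled out in the text."""
    if gaze_turned and motion_detected:
        return "move_in_gaze_direction"
    if gaze_turned:
        return "rotate_view_only"
    if motion_detected:
        return "move_forward"   # assumed: step ahead along the current view
    return "hold"
```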


The sensing processing unit 220 acquires live round sensing information from the sensors 130, which are provided in the live-round shooting room, and processes the acquired live round sensing information. The sensing processing unit 220 may acquire live round sensing information about the position of the live round fired by the shooter from a live round sensor 136 installed at a specific area to which the simulation image is output. The live round sensing information may be sensing information acquired from a live round sensor 136 that is installed at the image screen 120 in the live-round shooting room or from a live round sensor 136 that is installed at the wall of the live-round shooting room. The sensing processing unit 220 may acquire various kinds of live round sensing information depending on the kind and position of the live round sensor 136. The live round sensing information includes sensing information about the live round penetration point at which the live round has passed through the image screen 120, the position of the wall in which the live round is lodged, and the trajectory of the live round.


In the case in which the acquired live round sensing information includes a plurality of pieces of sensing information, the sensing processing unit 220 may select the sensing information having the highest accuracy based on a predetermined priority, and may transmit live round sensing information including the selected sensing information to the impact point calculation unit 230. Here, the live round sensing information may include identification information of a plurality of image screens, identification information of the sensors that sensed the live round, and the sensed coordinate values of the live round.
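The priority-based selection might look like the following sketch, where the sensor-type ranking and the reading format are hypothetical (the disclosure only states that a predetermined priority exists):

```python
def select_best_sensing(readings, priority):
    """Pick the single reading whose sensor type ranks highest in the
    predetermined priority list (index 0 = most accurate)."""
    return min(readings, key=lambda r: priority.index(r["type"]))

# Hypothetical priority: screen lattice sensors over screen edge sensors
# over wall sensors.
priority = ["screen_lattice", "screen_edge", "wall"]
readings = [
    {"type": "wall", "coords": (410, 205)},
    {"type": "screen_lattice", "coords": (400, 200)},
]
best = select_best_sensing(readings, priority)
```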


The impact point calculation unit 230 calculates the impact point of the live round based on the live round sensing information received from the sensing processing unit 220. In other words, the impact point calculation unit 230 calculates the impact point of the live round by matching the sensing information included in the live round sensing information with a predetermined coordinate map on the image screen 120. Here, the predetermined coordinate map on the image screen 120 is a map or table including predetermined coordinate values for identifying the position of the image screen 120.
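In its simplest form, matching against the coordinate map is a lookup from sensed data to predetermined screen coordinates. The map layout, screen id, and cell labels below are illustrative assumptions:

```python
def impact_point(sensing, coord_map):
    """Resolve raw sensing data (screen id plus the sensed cell) against
    the predetermined coordinate map of that image screen, yielding the
    impact point as (x, y) screen coordinates."""
    screen_map = coord_map[sensing["screen_id"]]
    return screen_map[sensing["cell"]]

# Hypothetical map: on screen "front", sensed cell "B3" corresponds to
# screen coordinates (120, 340).
coord_map = {"front": {"B3": (120, 340)}}
pt = impact_point({"screen_id": "front", "cell": "B3"}, coord_map)
```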


The impact point calculation unit 230 preferably calculates the impact point of the live round in the form of (x, y) coordinates based on the image screen 120, on which the simulation image is displayed. However, the present disclosure is not limited thereto. A scene of a three-dimensional simulation image may be applied in order to calculate the impact point of the live round in the form of three-dimensional coordinates (x, y, z).


The target position calculation unit 240 calculates target position information about a target for shooting practice included in the simulation image. The target position calculation unit 240 calculates target position information by matching target information included in the simulation image data with the predetermined coordinate map on the image screen 120 in the live-round shooting room. Here, the target information included in the simulation image data may include range information, center-of-gravity information, and predetermined target coordinates of an area to which the target is output based on the entire screen of the simulation image. The target position information may include the actual coordinates of the target for shooting practice, applied to the image screen 120.
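If the target information in the simulation image data is expressed relative to the full image, converting it to actual coordinates on the image screen is a scaling step. The assumption that target positions arrive as 0.0 to 1.0 fractions is illustrative; the disclosure only says the target information is matched with the coordinate map:

```python
def target_screen_coords(target_norm_xy, screen_width, screen_height):
    """Convert a target position given as a fraction of the full simulation
    image (0.0-1.0 on each axis) into actual coordinates on the image
    screen, i.e. the same frame as the predetermined coordinate map."""
    nx, ny = target_norm_xy
    return nx * screen_width, ny * screen_height

# A target 25% across and 50% up a 400 x 300 screen maps to (100, 150).
xy = target_screen_coords((0.25, 0.5), 400, 300)
```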


The movement path calculation unit 250 calculates movement path information about the movement path from the departure (simulation start) point to the destination (simulation end) point depending on the contents of the shooting practice based on the simulation image.


The movement path calculation unit 250 calculates the movement path information in consideration of the movement direction, the movement speed, and the movement time of the shooter using the gaze sensing information and the movement sensing information of the shooter from the time at which the shooting practice is commenced using the simulation image. Here, the operation of calculating the movement path information is preferably performed only when the result calculation unit 260 generates simulation result information. However, the present disclosure is not limited thereto. The operation of calculating the movement path information may be performed in order to display the movement path and the position of the target in a combined state even in the case in which only the shooting result information is generated.


The result calculation unit 260 generates shooting result information about the live-round shooting of the shooter. The result calculation unit 260 measures shooting accuracy, using the impact point calculated by the impact point calculation unit 230 and the target position information calculated by the target position calculation unit 240, and generates shooting result information based on the shooting accuracy.


The result calculation unit 260 may measure the shooting accuracy based on a difference value calculated by comparing the impact point with the target position information. For example, in the case in which the coordinates of the impact point are (4, 4) and the target coordinates included in the target position information are (3, 3), the result calculation unit 260 compares the impact point with the target position information to calculate the square root of 2 (about 1.41) as a difference value, and measures a shooting accuracy corresponding to the difference value (for example, Level 3). Here, the shooting accuracy may be a value measured by applying the difference value to a predetermined reference table such as Table 1.











TABLE 1

Difference value (cm)   Shooting accuracy (level)   Score
1 to 2                  Level 3                     10
3 to 4                  Level 2                     5
5 to 6                  Level 1                     1
In Table 1, a target is assumed to have a size falling within a range of 6 cm from the target position information. The reference table may be changed based on the size of the target. In Table 1, the shooting accuracy has three levels. However, the present disclosure is not limited thereto. The variables for the difference value, the shooting accuracy, and the score in Table 1 may be changed.
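The Table 1 lookup can be sketched as follows. Rounding the difference value up to the next whole centimetre before binning is an assumption made for illustration; the disclosure does not specify how fractional distances are binned.

```python
import math

# Sketch of the Table 1 lookup: the distance between the impact point and
# the target coordinates is binned into a shooting accuracy level and a
# score. Bin edges follow Table 1; the ceiling step is an assumption.

TABLE_1 = [  # (max difference in cm, level, score)
    (2, "Level 3", 10),
    (4, "Level 2", 5),
    (6, "Level 1", 1),
]

def score_shot(impact, target):
    diff = math.dist(impact, target)
    for max_cm, level, score in TABLE_1:
        if math.ceil(diff) <= max_cm:
            return level, score
    return None, 0  # difference exceeds the assumed 6 cm target range

# The (4, 4) vs (3, 3) example from the text: difference is sqrt(2) ~ 1.41.
level, score = score_shot((4, 4), (3, 3))  # -> ("Level 3", 10)
```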


The result calculation unit 260 generates shooting result information including a score corresponding to the measured shooting accuracy. Here, the shooting result information may be generated for each target for shooting practice. Alternatively, the shooting result information may be generated by adding scores of a plurality of targets for shooting practice included in respective sections of the simulation image.


In the case in which evaluation reference information corresponding to a shooting practice theme is provided, the result calculation unit 260 may generate simulation result information about the shooting practice theme based on the evaluation reference information. The result calculation unit 260 may generate simulation result information about the shooting practice theme using the movement path information and the shooting result information calculated by the movement path calculation unit 250.


The result calculation unit 260 generates the simulation result information using the score calculated by comparing a predetermined shortest path on the simulation image with the movement path information of the shooter, together with the score of the shooting result information. Here, the score of the movement path is a score calculated based on a difference value obtained by subtracting the length of the predetermined shortest path from the length of the path included in the movement path information. The score of the movement path may be calculated based on Table 2.











TABLE 2

Movement path information (km)   Shortest path (km)   Score
10 to 20                         10                   10
21 to 30                         10                   5
31 to 40                         10                   1

In Table 2, the movement path information is described as having a longest path of 40 km and a shortest path of 10 km. However, the present disclosure is not limited thereto. The movement path information may be changed by the shooter or the manager.
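The Table 2 lookup can be sketched in the same way. Treating a path shorter than 10 km or longer than 40 km as scoring zero is an assumption for illustration; the disclosure states only the bins shown in Table 2.

```python
# Sketch of the Table 2 lookup: the measured movement path length in km
# is binned into a movement path score. Bin edges follow Table 2.

def movement_score(path_km):
    if 10 <= path_km <= 20:
        return 10
    if 21 <= path_km <= 30:
        return 5
    if 31 <= path_km <= 40:
        return 1
    return 0  # assumption: out-of-range paths score zero

score = movement_score(18)  # -> 10
```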


The result calculation unit 260 preferably generates the simulation result information by adding the score of the movement path and the score of the shooting result information. However, the present disclosure is not limited thereto.


The result calculation unit 260 may generate the simulation result information for the entire path based on the movement path information. However, the present disclosure is not limited thereto. For example, the result calculation unit 260 may divide the entire path into a plurality of sections based on the movement path information, and may generate the simulation result information using shooting result information about targets included in the respective sections and movement path information of the respective sections.


Meanwhile, in the case in which the simulation result information is generated based on a plurality of sections, the result calculation unit 260 may set the degree of difficulty for each section, and may apply different weights to the respective sections based on the set degree of difficulty in order to generate the simulation result information.
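The per-section weighting described above can be sketched as follows. The particular weight values derived from the degree of difficulty are assumptions; the disclosure states only that different weights are applied to the sections based on the set degree of difficulty.

```python
# Sketch of combining per-section shooting and movement scores with
# difficulty-based weights into simulation result information.

def simulation_result(sections):
    """Each section carries a shooting score, a movement score, and a
    difficulty-based weight; the weighted section totals are summed."""
    return sum(w * (shoot + move) for shoot, move, w in sections)

sections = [
    (10, 10, 1.0),  # easier section, unit weight (assumed)
    (5, 5, 1.5),    # harder section weighted more heavily (assumed)
]
total = simulation_result(sections)  # -> 35.0
```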



FIG. 3 is a flowchart describing a live-round shooting simulation game provision method according to at least one embodiment.


The shooting simulation provision apparatus 150 transmits simulation image data corresponding to a shooting practice theme to the projector 110 provided in the live-round shooting room (S310). The shooting simulation provision apparatus 150 may transmit pre-stored simulation image data to the projector 110. Alternatively, the shooting simulation provision apparatus 150 may acquire simulation image data corresponding to a shooting practice theme from an external control tower or an external server, and may transmit the acquired simulation image data to the projector 110. Here, the shooting practice theme may include a shooting game, basic shooting practice, shooting sport practice, a military operation, terror training, and hostage rescue.


The shooting simulation provision apparatus 150 outputs a simulation image to the interior of the live-round shooting room through the projector 110 in order for a shooter to start shooting practice (S312).


The shooting simulation provision apparatus 150 controls the simulation image based on information about the turning and the movement position of the shooter according to the simulation image (S320). For example, the shooting simulation provision apparatus 150 determines the gaze position and direction of the shooter using the gaze sensing information acquired from the gaze sensor 132 mounted on a hat of the shooter or on a sound device 140 that the shooter wears, and controls the simulation image based on the determined gaze position information and the motion of the shooter.


The shooter shoots according to the simulation image, and the shooting simulation provision apparatus 150 recognizes the shooter's live-round shooting at a target output to the simulation image (S322). The shooting simulation provision apparatus 150 may recognize the live-round shooting through a sensor provided on the gun of the shooter. However, the present disclosure is not limited thereto. The shooting simulation provision apparatus 150 may recognize the live-round shooting by sensing a shooting sound. Step S322 is a subordinate sensing step for calculating the position of a live round, and may be omitted depending on the settings of the manager.


The shooting simulation provision apparatus 150 acquires live round sensing information about the position of the live round from the sensors 130 provided in the live-round shooting room (S330), and calculates the impact point of the live round based on the live round sensing information (S340). The shooting simulation provision apparatus 150 may acquire the live round sensing information about the position of the live round based on the shooter's live-round shooting from the live round sensor 136 installed at a specific area to which the simulation image is output, and may calculate the impact point of the live round based on the live round sensing information. Here, the impact point may be calculated by matching the sensing information included in the live round sensing information with a predetermined coordinate map on the image screen 120.


The shooting simulation provision apparatus 150 calculates target position information about the target included in the simulation image (S350).


The shooting simulation provision apparatus 150 measures shooting accuracy using the impact point and the target position information (S360) and generates shooting result information based on the shooting accuracy (S370). Here, the shooting result information is result information including a score corresponding to the shooting accuracy. The shooting result information may be generated for each target for shooting practice, or may be generated by adding scores of a plurality of targets for shooting practice included in respective sections of the simulation image.


The shooting simulation provision apparatus 150 further acquires movement path information of the shooter and generates simulation result information about the shooting practice theme using the movement path information and the shooting result information (S380). The shooting simulation provision apparatus 150 may calculate movement path information about the movement path from the departure (simulation start) point to the destination (simulation end) point depending on the contents of the shooting practice based on the simulation image, and may generate simulation result information about the shooting practice theme using the movement path information and the shooting result information. Here, the simulation result information is information generated based on the score calculated by comparing a predetermined shortest path on the simulation image with the movement path information of the shooter, combined with the score of the shooting result information.


A live-round shooting simulation game provision method according to at least one embodiment shown in FIG. 3 described above may be implemented as a program and stored in a computer-readable recording medium. A computer-readable recording medium, in which a program for implementing a live-round shooting simulation game provision method according to at least one embodiment is recorded, includes all kinds of recording devices for storing data that can be read by a computer system. Examples of the computer-readable recording medium include ROM, RAM, CD-ROM, magnetic tapes, floppy disks, and optical data storage devices, and also include carrier-wave-type implementation (e.g., transmission over the Internet). In addition, the computer-readable recording medium may be distributed to a computer system connected over a network, and computer-readable code may be stored and executed thereon in a distributed manner. Functional programs, code, and code segments for implementing the method described above may be easily inferred by programmers in the art to which at least one embodiment pertains.



FIGS. 4a and 4b are illustration diagrams describing the structure of a live-round shooting simulation system according to at least one embodiment.


As shown in FIGS. 4a and 4b, a shooting simulation system using a live round includes a live-round shooting room and a shooting simulation provision apparatus 150.



FIG. 4a is an illustration diagram showing that a shooter takes shooting practice while moving according to a simulation image in a predetermined moving area 410 of the live-round shooting room. Here, the moving area 410 may be a quadrangular or circular area located in the center of the floor of the live-round shooting room. However, the present disclosure is not limited thereto. The moving area 410 may be the entire floor of the live-round shooting room.



FIG. 4b is an illustration diagram showing that the shooter takes shooting practice while moving according to the simulation image using a moving pedal 412 installed at the floor of the live-round shooting room. Here, the moving pedal 412 may be a pedal similar or identical to a pedal of an elliptical machine or a pedal of a treadmill.



FIG. 5 is an illustration diagram describing the installation structure of sensors for sensing the position of a live round according to live-round shooting in a shooting simulation system according to at least one embodiment.



FIG. 5(a) is an illustration diagram showing a wall 510 of a live-round shooting room and an image screen 120 when the live-round shooting room is viewed from top (ceiling) to bottom (floor). As shown in FIG. 5(a), the image screen 120 may be installed so as to be spaced apart from the wall 510 of the live-round shooting room in the state of being parallel to the wall 510. In addition, sensors for sensing the position of a live round fired by the shooter may be provided at the image screen 120 or the wall 510.


Meanwhile, FIG. 5(a) shows that the image screen 120 is installed inside the quadrangular wall 510 of the live-round shooting room so as to be parallel thereto. However, the present disclosure is not limited thereto. For example, a circular image screen 120 may be installed inside a quadrangular wall 510, or a circular image screen 120 may be installed inside a circular wall 510.



FIGS. 5(b) and 5(c) are illustration diagrams describing live round sensors 520 and 530 installed at the image screen 120. FIGS. 5(b) and 5(c) show the state in which a shooter looks at the image screen 120 in the live-round shooting room.


As shown in FIG. 5(b), first live round sensors 520 may be disposed in the image screen 120 so as to be spaced apart from each other in a lattice arrangement. For example, the first live round sensors 520 may be disposed in the image screen 120 in a lattice arrangement having a matrix of 5×4, and some of the first live round sensors 520 that are adjacent to a live round penetration point 512 formed by live-round shooting of the shooter may sense the live round penetration point 512. Here, the live round penetration point 512 may be sensed by four first live round sensors 520 adjacent to the live round penetration point 512. Sensing information about the live round penetration point 512 may include identification information of each of the four first live round sensors 520, the distance between each of the four first live round sensors 520 and the live round penetration point 512, and the number of live rounds.
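One possible way to turn the four adjacent sensors' distance readings into an estimated penetration point is an inverse-distance weighted average of the sensor positions. This estimator is an illustrative assumption; the disclosure does not fix a particular formula for combining the four readings.

```python
# Hypothetical sketch: estimating the live round penetration point from
# the four adjacent lattice sensors using inverse-distance weighting.

def estimate_penetration(sensors):
    """sensors: list of ((x, y) sensor position, distance-to-hole) pairs."""
    weights = [1.0 / max(d, 1e-9) for _, d in sensors]
    total = sum(weights)
    x = sum(w * p[0] for (p, _), w in zip(sensors, weights)) / total
    y = sum(w * p[1] for (p, _), w in zip(sensors, weights)) / total
    return (x, y)

# Four sensors at the corners of one lattice cell, hole at the centre:
sensors = [((0, 0), 1.41), ((2, 0), 1.41), ((0, 2), 1.41), ((2, 2), 1.41)]
point = estimate_penetration(sensors)  # -> (1.0, 1.0)
```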


As shown in FIG. 5(c), second live round sensors 530 may be disposed at a portion of the edge of the image screen 120. For example, the second live round sensors 530 may be installed at four corners of the image screen 120, and may sense the trajectory of a live round using a radar mode in order to sense the live round penetration point 512. Sensing information about the live round penetration point 512 may include identification information of each of the four second live round sensors 530, information about the trajectory of a live round sensed by each of the four second live round sensors 530, the distances between each of the four second live round sensors 530 and the trajectory of the live round, and the number of live rounds.



FIGS. 5(b) and 5(c) show that the first live round sensors 520 and the second live round sensors 530 are installed at the image screen 120. Alternatively, the sensors may be installed at the wall 510 of the live-round shooting room at corresponding positions in the same manner.



FIG. 6 is an illustration diagram showing a method of calculating shooting accuracy in a shooting simulation system according to at least one embodiment.


In the case in which a target 600 for shooting practice is provided in the image screen 120, as shown in FIG. 6, the shooter shoots live rounds to generate a first impact point 620 and a second impact point 622.


The shooting simulation provision apparatus 150 may measure (calculate) shooting accuracy by comparing target position information 610, including predetermined coordinates of the target 600 for shooting practice or coordinates of the center of gravity of the target, with the impact points 620 and 622. For example, in the case in which the coordinates of the target position information 610 are (15, 12) and the coordinates of the first impact point 620 are (16, 15), the shooting simulation provision apparatus 150 measures the shooting accuracy based on a difference value calculated by comparing the first impact point 620 with the target position information 610. Here, the shooting accuracy may be divided into a predetermined number of levels. However, the present disclosure is not limited thereto.


The shooting simulation provision apparatus 150 may calculate a score corresponding to the measured shooting accuracy in order to generate shooting result information about the shooting of the shooter.
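Working the FIG. 6 numbers through the earlier Table 1 bins gives the following. Applying Table 1 to this figure, and rounding the distance up to a whole centimetre, are assumptions carried over from the earlier description.

```python
import math

# Worked version of the FIG. 6 example: target position information
# (15, 12) compared with the first impact point 620 at (16, 15).
diff = math.dist((16, 15), (15, 12))  # sqrt(1**2 + 3**2) = sqrt(10) ~ 3.16

# Under the assumed Table 1 bins, a 3 to 4 cm difference is Level 2.
level = "Level 2" if 3 <= math.ceil(diff) <= 4 else "other"
```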



FIG. 7 is an illustration diagram showing the movement path of a shooter for calculating a simulation result in a shooting simulation system according to at least one embodiment.



FIG. 7(a) is a diagram showing a first movement path 720 of a first shooter from a departure point 710 to a destination point 730 in a simulation image. FIG. 7(b) is a diagram showing a second movement path 722 of a second shooter from a departure point 710 to a destination point 730 in a simulation image.


The shooting simulation provision apparatus 150 may calculate movement path information of each of the first shooter and the second shooter based on FIG. 7(a) and FIG. 7(b). The shooting simulation provision apparatus 150 may calculate the movement path information in consideration of the movement direction, the movement speed, and the movement time of the shooter using the gaze sensing information and the movement sensing information of the shooter from the time at which the shooting practice is commenced by the simulation image.


As shown in FIG. 7, the first movement path 720 is shorter than the second movement path 722. Consequently, the first shooter may acquire a higher score than the second shooter through the comparison of each shooter's movement path information with a predetermined shortest path. Here, the acquired score is combined with the shooting result information so as to be reflected in the simulation result information.



FIG. 8 is an illustration diagram showing a shooting simulation system according to at least one embodiment that is operated together with a plurality of live-round shooting rooms.


As shown in FIG. 8, the shooting simulation provision apparatus 150 may be operated together with a plurality of live-round shooting rooms such that a plurality of shooters simultaneously take shooting practice using live rounds. In other words, the shooting simulation provision apparatus 150 may output the same simulation image to the live-round shooting rooms, and may generate shooting result information or simulation result information about each of the shooters. Here, the shooters in the respective live-round shooting rooms may take shooting practice as a single team (the same team for games or training) or as different teams (opposing teams for games or training).


Although exemplary embodiments of the present disclosure have been described for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the essential characteristics of the disclosure. Therefore, exemplary embodiments of the present disclosure have been described for the sake of brevity and clarity. Accordingly, one of ordinary skill would understand that the scope of the disclosure is not limited by the embodiments explicitly described above but by the claims and equivalents thereof.

Claims
  • 1. A live-round shooting simulation game provision method using a shooting simulation provision apparatus, the method comprising: a simulation image output process of transmitting simulation image data to a projector installed in a live-round shooting room such that a simulation image is output to a specific area; a sensing processing process of acquiring sensing information about a position of a live round fired by a shooter from a sensor installed in the specific area; an impact point calculation process of calculating an impact point of the live round based on the sensing information; a target position calculation process of calculating target position information of a target included in the simulation image; and a result calculation process of measuring a shooting accuracy using the impact point and the target position information and generating shooting result information based on the shooting accuracy.
  • 2. The method of claim 1, wherein the method further comprises acquiring the simulation image data corresponding to a shooting practice theme, selected from among a shooting game, basic shooting practice, shooting sport practice, a military operation, terror training, and hostage rescue, from an external control tower or an external server, and wherein the simulation image output process comprises transmitting the simulation image data corresponding to the shooting practice theme to the projector such that the simulation image about the shooting practice theme is output.
  • 3. The method of claim 2, wherein, in a case in which evaluation reference information corresponding to the shooting practice theme is provided, the result calculation process comprises generating simulation result information about the shooting practice theme using movement path information of the shooter and the shooting result information based on the evaluation reference information.
  • 4. The method of claim 3, wherein the result calculation process comprises dividing an entire path into a plurality of sections based on the movement path information and generating the simulation result information using the shooting result information about a target included in each of the sections, the simulation result information being generated by setting a degree of difficulty for each of the sections and applying different weights to the sections based on the degree of difficulty.
  • 5. The method of claim 1, wherein the simulation image is output to at least three inner walls of the live-round shooting room.
  • 6. The method of claim 5, wherein the sensing processing process comprises acquiring the sensing information from the sensor attached to each of the inner walls of the live-round shooting room.
  • 7. The method of claim 6, wherein the sensing information is sensing information about a position of one of the inner walls in which the live round is lodged.
  • 8. The method of claim 1, wherein the simulation image is output to an image screen installed so as to be spaced apart from at least three inner walls of the live-round shooting room by a predetermined distance in a state of being parallel to the inner walls.
  • 9. The method of claim 8, wherein the sensing processing process comprises acquiring the sensing information from the sensor attached to the image screen in the live-round shooting room.
  • 10. The method of claim 9, wherein the sensing information is sensing information about a live round penetration point at which the live round has passed through the image screen.
  • 11. The method of claim 1, wherein the impact point calculation process comprises matching the sensing information with a predetermined coordinate map on an inner wall of the live-round shooting room or an image screen installed so as to be parallel to the inner wall in order to calculate the impact point.
  • 12. The method of claim 1, wherein the target position calculation process comprises matching target information included in the simulation image data with a predetermined coordinate map on an inner wall of the live-round shooting room or an image screen installed so as to be parallel to the inner wall in order to calculate the target position information.
  • 13. The method of claim 1, wherein the result calculation process comprises comparing the impact point with the target position information to calculate a difference value and generating the shooting result information using the shooting accuracy measured based on the difference value.
  • 14. A shooting simulation provision apparatus using a live round, the apparatus comprising: a simulation image controller for transmitting simulation image data to a projector installed in a live-round shooting room such that a simulation image is output to a specific area; a sensing processing unit for acquiring sensing information about a position of a live round fired by a shooter from a sensor installed in the specific area; an impact point calculation unit for calculating an impact point of the live round based on the sensing information; a target position calculation unit for calculating target position information of a target included in the simulation image; and a result calculation unit for measuring a shooting accuracy using the impact point and the target position information and generating shooting result information based on the shooting accuracy.
  • 15. The apparatus of claim 14, wherein, in a case in which evaluation reference information corresponding to a shooting practice theme, selected from among a shooting game, basic shooting practice, shooting sport practice, a military operation, terror training, and hostage rescue, is provided, the result calculation unit generates simulation result information about the shooting practice theme using movement path information of the shooter and the shooting result information based on the evaluation reference information.
  • 16. The apparatus of claim 15, wherein the result calculation unit divides an entire path into a plurality of sections based on the movement path information and generates the simulation result information using the shooting result information about a target included in each of the sections, the simulation result information being generated by setting a degree of difficulty for each of the sections and applying different weights to the sections based on the degree of difficulty.
  • 17. A shooting simulation system for shooting practice using a live round, the system comprising: a projector for projecting a simulation image to a specific area in a live-round shooting room; an image screen installed so as to be spaced apart from at least three inner walls of the live-round shooting room by a predetermined distance in a state of being parallel to the inner walls for outputting the simulation image; a headset comprising a sensor for sensing a gaze position or direction of a shooter, the headset being configured to provide a sound about the simulation image to the shooter; a live round sensor attached to the image screen for sensing a position of a live round fired by the shooter to generate sensing information; and a shooting simulation provision apparatus for transmitting simulation image data to the projector such that the simulation image is output, measuring a shooting accuracy using an impact point of the live round calculated based on the sensing information and target position information of a target included in the simulation image, and generating shooting result information based on the shooting accuracy.
  • 18. The system of claim 17, wherein in a case in which evaluation reference information corresponding to a shooting practice theme is provided, the shooting simulation provision apparatus generates simulation result information about the shooting practice theme using movement path information of the shooter and the shooting result information based on the evaluation reference information, and the simulation result information is generated by dividing an entire path into a plurality of sections based on the movement path information, setting a degree of difficulty for each of the sections, and applying different weights to the sections based on the degree of difficulty.
Priority Claims (1)
Number Date Country Kind
10-2016-0134788 Oct 2016 KR national
RELATED APPLICATIONS

The present application is a continuation of International Application No. PCT/KR2017/008026, filed Jul. 26, 2017, which is based upon and claims the benefit of priority from Korean Patent Application No. 10-2016-0134788 filed on Oct. 18, 2016. The disclosures of the above listed applications are hereby incorporated by reference herein in their entireties.

Continuations (1)
Number Date Country
Parent PCT/KR2017/008026 Jul 2017 US
Child 16359170 US