This disclosure relates to simulated firearms with integrated target accuracy detection capabilities, and to related methods and systems.
Firearms, such as pistols, rifles, and shotguns, have a variety of uses including recreation, protection, hunting, and combat. However, firearms are also dangerous tools that can cause severe injury or harm if used without proper training. Even among experienced users, firearm training can present significant risk of harm to the user and to nearby people and property. Furthermore, firearms are not suitable for use as toys.
In one aspect, a simulated firearm is provided. The simulated firearm may include a body, a camera, a memory, and a processor. The body may have a handle, a trigger, and a barrel portion. The barrel portion may define a shooting axis. The camera may be coupled to the body and aligned with the shooting axis. The camera may be configured to capture an image in response to actuation of the trigger. The memory may be coupled to the body and may store marker data corresponding to at least one target marker. The processor may be coupled to the body and configured to determine whether a portion of the image captured by the camera matches any one or more of the at least one target marker.
Numerous embodiments are described in this application, and are presented for illustrative purposes only. The described embodiments are not intended to be limiting in any sense. The invention is widely applicable to numerous embodiments, as is readily apparent from the disclosure herein. Those skilled in the art will recognize that the present invention may be practiced with modification and alteration without departing from the teachings disclosed herein. Although particular features of the present invention may be described with reference to one or more particular embodiments or figures, it should be understood that such features are not limited to usage in the one or more particular embodiments or figures with reference to which they are described.
The terms “an embodiment,” “embodiment,” “embodiments,” “the embodiment,” “the embodiments,” “one or more embodiments,” “some embodiments,” and “one embodiment” mean “one or more (but not all) embodiments of the present invention(s),” unless expressly specified otherwise.
The terms “including”, “comprising”, and variations thereof mean “including but not limited to,” unless expressly specified otherwise. A listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise. The terms “a,” “an” and “the” mean “one or more,” unless expressly specified otherwise.
As used herein and in the claims, two or more parts are said to be “coupled”, “connected”, “attached”, or “fastened” where the parts are joined or operate together either directly or indirectly (i.e., through one or more intermediate parts), so long as a link occurs. As used herein and in the claims, two or more parts are said to be “directly coupled”, “directly connected”, “directly attached”, or “directly fastened” where the parts are connected in physical contact with each other. As used herein, two or more parts are said to be “rigidly coupled”, “rigidly connected”, “rigidly attached”, or “rigidly fastened” where the parts are coupled so as to move as one while maintaining a constant orientation relative to each other. None of the terms “coupled”, “connected”, “attached”, and “fastened” distinguish the manner in which two or more parts are joined together.
As used herein and in the claims, a first element is said to be “received” in a second element where at least a portion of the first element is received in the second element, unless specifically stated otherwise.
As used herein and in the claims, two components are said to be “communicatively coupled” where at least one of the components is capable of communicating signals (e.g. electrical signals) to the other component, such as across a wired connection (e.g. conductive wire cable), or a wireless connection (e.g. radio frequency).
Simulated firearm 100 can simulate a firearm experience, without the risks associated with firing live ammunition. For example, simulated firearm 100 may be employed for training exercises such as target practice, or used as a recreational toy for playing games. Simulated firearm 100 includes a body 104 that simulates a firearm. As shown, body 104 includes a handle 108, a trigger 112, and a barrel portion 116.
Body 104 can be shaped to resemble any firearm. In the illustrated example, body 104 is shaped as a pistol. In other examples, body 104 may be shaped as a rifle, a shotgun, or another firearm.
Simulated firearm 100 is described below as having various components coupled to body 104. It will be understood that any of these components may be entirely or partially housed in body 104, or positioned outside of and physically attached to body 104. In the illustrated example, a camera 124, a memory 128, and a processor 132 are coupled to body 104.
In brief, a user may take a “shot” with simulated firearm 100 by actuating trigger 112, which causes camera 124 to capture an image. The processor 132 determines whether a portion of the captured image matches a target marker 136 for which corresponding marker data is stored in memory 128. The processor 132 may then score the shot based on a distance between a center of the image and the position of the image portion matching the target marker 136.
As shown, camera 124 is coupled to body 104 and oriented to align with shooting axis 120. This allows camera 124 to capture images centered on the shooting axis 120, which may simulate a flight path of ammunition from a genuine firearm. Camera 124 can be any optical imaging device that can capture a target marker 136 located at a shooting distance from a firing end 140 of simulated firearm 100. For example, camera 124 may be able to capture a target marker 136 located within a range of distances which includes at least a portion of the range 10 cm to 1.5 km. For example, camera 124 may include a CCD-sensor, a CMOS-sensor, or another optical sensor that can detect optical radiation at any wavelengths that can emanate (e.g. by emission or reflection) from a target marker 136. In some examples, camera 124 can detect optical radiation having wavelength(s) which reside in one or more (or all) of the visible spectrum (e.g. 390 nm to 700 nm), the infrared spectrum (e.g. 700 nm to 1000 nm), and the ultraviolet spectrum (e.g. 10 nm to 390 nm).
Trigger 112 is manually (i.e. by hand) user operable to cause firearm 100 to execute a shot sequence. In some cases, the shot sequence includes camera 124 capturing an image. In other cases (e.g. where the simulated ammunition count is zero), the shot sequence may not include camera 124 capturing an image (e.g. nothing may happen, or the user may be audibly or visually prompted to reload). Trigger 112 can be any device that allows the user to direct firearm 100 to fire a shot (e.g. to direct processor 132 to execute a shot sequence). In the illustrated example, trigger 112 is formed as a finger-trigger of the kind commonly found in firearms. For example, a user may actuate trigger 112 by grasping simulated firearm 100 by handle 108 and pulling trigger 112 towards handle 108 (e.g. away from firearm firing end 140) using one or more fingers. In other embodiments, trigger 112 may be formed by another user-input control, such as a depressible button, a lever, a switch, or a touch sensitive surface (e.g. based on resistive or capacitive technology).
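By way of illustration, the following is a minimal Python sketch of one possible shot-sequence flow described above; the class and method names are illustrative assumptions rather than identifiers used in this disclosure.

```python
# A minimal sketch of a shot sequence (assumed names throughout).
class ShotSequence:
    def __init__(self, ammo_count):
        self.ammo_count = ammo_count  # simulated rounds remaining

    def on_trigger_actuated(self, camera):
        """Run one shot sequence: capture an image if simulated ammunition
        remains; otherwise prompt the user to reload and capture nothing."""
        if self.ammo_count <= 0:
            print("Out of simulated ammunition -- reload to continue.")
            return None
        self.ammo_count -= 1
        return camera.capture_image()  # hypothetical camera interface
```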
Memory 128 provides electronic storage for marker data corresponding to one or more target markers 136. Target markers 136 can be any visible indicia that can be applied to, embedded into, or incorporated into a real-world (i.e. physical and tangible) shooting target (e.g. a can, sign, building, person, or vehicle), or that comprise the appearance of the shooting target. The physical embodiment of the target marker 136 emits or reflects optical radiation for detection by a camera, such as camera 124. In some examples, a target marker 136 can be a symbol, logo, or photograph that is painted, printed, molded, or sculpted into or onto a shooting target, or onto a substrate (e.g. adhesive sticker) which is applied to the shooting target. In other examples, a target marker 136 includes the appearance of a shooting target (e.g. a person's face). It will be appreciated that a shooting target can bear multiple of the same and/or different target markers 136, and that the same target marker 136 can be borne by multiple different shooting targets.
Marker data in memory 128 can be any information that identifies a target marker 136 for the purpose of matching a portion of an image to that target marker. In some examples, the marker data corresponding to a target marker 136 can include an image of the target marker 136, or an encoded or decoded representation of the target marker 136. For example, for a target marker 136 that is a 2D barcode, the marker data may include an image of the 2D barcode or the barcode information (e.g. ASCII text) encoded in the 2D barcode. In the latter case, the processor 132 may determine a match by encoding the barcode information into a 2D barcode prior to the comparison with a captured image, or the processor 132 may decode barcode information from a 2D barcode located in the captured image for comparison with the barcode information stored in memory 128.
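By way of illustration, the following Python sketch matches a captured image against stored barcode text, assuming the third-party pyzbar library for 2D-barcode decoding; the marker identifiers and stored text are hypothetical.

```python
# Matching a captured image against stored 2D-barcode text (a sketch,
# assuming the pyzbar library; marker ids and text are hypothetical).
from pyzbar.pyzbar import decode
from PIL import Image

MARKER_DATA = {"target_can": "MARKER-0001", "target_sign": "MARKER-0002"}

def match_markers(image_path):
    """Decode every 2D barcode in the captured image and return the ids
    of stored markers whose barcode text was found."""
    decoded = {d.data.decode("utf-8") for d in decode(Image.open(image_path))}
    return [mid for mid, text in MARKER_DATA.items() if text in decoded]
```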
Processor 132 is configured to determine whether a portion of an image captured by camera 124 matches a target marker 136 for which corresponding marker data is stored in memory 128. For this purpose, processor 132 is communicatively coupled to camera 124 and memory 128. Processor 132 may also be communicatively coupled to other components of simulated firearm 100 for directing the operation of simulated firearm 100. As used herein and in the claims, a “processor” is not limited to a single processing chip. Simulated firearm 100 may include a plurality of logic components within and between various components of simulated firearm 100 (e.g. camera 124, memory 128, trigger 112, etc.), and collectively these logic components may be included within the meaning of processor 132.
Processor 132 can determine whether a portion of a captured image matches a target marker 136 according to any known image recognition process or algorithm. It will be appreciated that such processes or algorithms may be able to determine a positive match where the captured image includes only a portion of a target marker (e.g. a partially obstructed target marker), or includes a distorted target marker (e.g. target marker is at an angle to shooting axis 120). Accordingly, determining a match may not be limited to identifying a perfect 1-to-1 representation of the target marker 136 in the captured image. For example, a target marker 136 may comprise the appearance of a shooting target (e.g. a drone) and processor 132 may determine a match where a captured image includes an image of that shooting target from any angle (e.g. front, back, top, bottom, left, or right side of the drone). The marker data in memory 128 may include a plurality of images of the shooting target taken from different angles (e.g. front, back, top, bottom, left, or right side of the drone) for use by processor 132 in determining a match with a captured image.
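By way of illustration, the following Python sketch performs the multi-view matching just described using OpenCV template matching; the template file names and threshold are illustrative assumptions, and a production implementation might use a more robust, scale- and rotation-invariant recognition algorithm.

```python
# Multi-view template matching (a sketch, assuming OpenCV; file names and
# the 0.8 threshold are illustrative).
import cv2

ANGLE_TEMPLATES = ["drone_front.png", "drone_back.png", "drone_top.png"]
MATCH_THRESHOLD = 0.8  # empirical; tune per marker

def find_target(captured_gray):
    """Return (score, top_left, template_name) for the best stored view
    matched above threshold, or None if no view matches."""
    best = None
    for name in ANGLE_TEMPLATES:
        template = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
        if template is None:
            continue  # template image not found on disk
        result = cv2.matchTemplate(captured_gray, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, top_left = cv2.minMaxLoc(result)
        if score >= MATCH_THRESHOLD and (best is None or score > best[0]):
            best = (score, top_left, name)
    return best
```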
In some embodiments, processor 132 scores a shot based at least in part on the captured image. For example, processor 132 may score a shot with a binary result (e.g. hit or miss, success or failure). Alternatively or in addition, processor 132 may score a shot quantitatively (e.g. on a scale of 1 to 100 based on accuracy). Alternatively or in addition, a shot may be scored qualitatively based on other characteristics of the shot taken (e.g. shot low or high, shot left or right, shot fast or slow, etc.).
In some embodiments, processor 132 scores a shot as a success if a center 164 of the captured image 152a overlaps with the image portion 156a or the target marker 136 therein. In the illustrated example, center 164 does not overlap image portion 156a, so processor 132 would score captured image 152a as a failure (e.g. a “miss”).
In some embodiments, processor 132 scores a shot (e.g. as a success or failure, or quantitatively) based on a distance between image portion 156a, or the target marker 136 therein, and a center of the captured image 152a. For example, processor 132 may score a shot as a success if a distance 160a between a center of the image portion 156a, or of the target marker 136 therein, and a center 164 of image 152a is less than a threshold distance. Alternatively, processor 132 may score a shot as a success if a closest distance 168a between the image portion 156a, or the target marker 136 therein, and a center 164 of image 152a is less than a threshold distance.
Processor 132 may express distances, such as the threshold distances mentioned above, as a function of any one or more of: the image dimension(s) (e.g. vertical or horizontal dimensions of image 152a), the image portion dimension(s) (e.g. vertical or horizontal dimension of image portion 156a), or the target marker dimension(s) (e.g. vertical or horizontal dimension of target marker 136 within image portion 156a).
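By way of illustration, the following is a minimal Python sketch of distance-based scoring with a threshold expressed as a fraction of the image's horizontal dimension, per one of the options above; the 10% threshold is an illustrative assumption.

```python
# Distance-based shot scoring (a sketch; the threshold fraction is an
# assumed value, expressed relative to the image's horizontal dimension).
import math

def score_shot(image_size, portion_center, threshold_fraction=0.10):
    """Score a shot as a hit if the matched image portion's center lies
    within threshold_fraction * image width of the image center."""
    img_w, img_h = image_size
    center = (img_w / 2, img_h / 2)
    dist = math.dist(center, portion_center)
    return dist <= threshold_fraction * img_w

# Example: a 1920x1080 image with the marker centered at (1000, 560).
print(score_shot((1920, 1080), (1000, 560)))  # True: within 192 px of center
```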
Gravity acts in a vertical direction. However, in some cases it may be possible to fire simulated firearm 100 while firearm 100 is tilted (e.g. rotated about a horizontal axis). In this circumstance, camera 124 may capture an image having a “Y” axis that does not align with gravity. In this case, it would be inaccurate to offset image center 164 along the Y-axis of the captured image 152b to account for gravity in determining the context-adjusted image center. In some embodiments, simulated firearm 100 includes an orientation sensor 188 which is communicatively coupled to processor 132. Orientation sensor 188 is any device that can sense the rotational orientation of simulated firearm 100 about at least one axis. For example, orientation sensor 188 may include one or more (or all) of a magnetometer (e.g. for sensing compass direction), an accelerometer (e.g. for sensing the direction of gravity), and a gyroscopic sensor (e.g. for sensing changes in rotational orientation). Processor 132 may receive orientation data from orientation sensor 188 to determine the direction of gravity relative to captured image 152b. The processor 132 can then offset image center 164 by a distance 180 in the determined direction of gravity to determine context-adjusted image center 184.
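By way of illustration, the following is a minimal Python sketch of this adjustment, assuming the orientation sensor reports a roll angle about the shooting axis and that a ballistic drop distance (in pixels) has already been determined; all function and parameter names are illustrative, not identifiers from this disclosure.

```python
# Gravity-adjusted image center (a sketch; roll convention is assumed:
# 0 degrees = upright, so gravity points down the image's +Y axis).
import math

def context_adjusted_center(image_size, drop_px, roll_deg):
    """Offset the image center by drop_px pixels in the direction of
    gravity, given the firearm's roll about the shooting axis."""
    cx, cy = image_size[0] / 2, image_size[1] / 2
    roll = math.radians(roll_deg)
    # Gravity direction expressed in image coordinates.
    gx, gy = -math.sin(roll), math.cos(roll)
    return (cx + drop_px * gx, cy + drop_px * gy)

# Upright firearm: the aim point drops straight down the image.
print(context_adjusted_center((1920, 1080), 50, 0))   # (960.0, 590.0)
# Firearm rolled 90 degrees: gravity now points along -X in the image.
print(context_adjusted_center((1920, 1080), 50, 90))  # (910.0, 540.0)
```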
Simulated firearm 100 may include a speaker 196 coupled to body 104 and communicatively coupled to processor 132. Speaker 196 can be any device capable of audibly communicating information to the user.
In some embodiments, processor 132 may direct weather data to be communicated to the user audibly by speaker 196, visually by display 200, or both. This allows simulated firearm 100 to inform the user of the weather data, such as when processor 132 generates the weather data, so that the user can take corrective action in taking their shot.
Simulated firearm 100 may include a display 200 coupled to the body 104 and communicatively coupled to processor 132. Display 200 can be any device capable of presenting visual information to the user. For example, display 200 may include a light (e.g. LED, incandescent, halogen, etc.), an LCD screen, an LED screen, an e-ink display, an OLED display, or combinations thereof. In some embodiments, processor 132 may direct display 200 to visually communicate a successful or failed shot indication, a successful shot counter, or a quantitative score (e.g. 1 to 100 or A to F) based on shot scoring performed by processor 132. Alternatively or in addition, processor 132 may direct display 200 to visually communicate firearm configuration information such as stored target marker(s) 136, ammunition information, and weather information. Alternatively or in addition, processor 132 may direct display 200 to visually communicate status information such as a count of remaining ammunition (e.g. bullets remaining).
In some embodiments, processor 132 may direct display 200 to visually communicate the target marker 136 matched in a portion of a captured image. For example, processor 132 may direct display 200 to visually communicate the target marker 136 for which the processor scored a successful hit. This can inform the user of the target marker 136 they successfully hit in a previous shot.
In some embodiments, processor 132 may direct display 200 to visually communicate the target marker(s) 136 not yet successfully hit from among the target marker(s) stored in memory 128 or from among the target marker(s) 136 belonging to an ongoing shooting program (e.g. training exercise or game). A shooting program may include a plurality of target markers 136 from among the plurality of target markers 136 stored in memory 128, and the training exercise may be completed when a successful hit is scored (or a threshold quantitative hit score is obtained) for each of the target markers 136. A shooting program may be generated by processor 132, user selected with user-input controls 204 from among shooting programs stored in memory 128, user generated with user-input controls 204, or combinations thereof.
In some embodiments, a shooting program may include a plurality of target markers 136 ordered according to a sequence. In this example, processor 132 may direct display 200 to visually communicate the target marker(s) 136 next in sequence, and once these target marker(s) 136 are hit processor 132 may direct display 200 to visually communicate the subsequent target marker(s) 136 in sequence, and so on.
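By way of illustration, a minimal Python sketch of such a sequenced shooting program follows; the marker identifiers are hypothetical.

```python
# A sequenced shooting program (a sketch; marker ids are hypothetical).
class ShootingProgram:
    def __init__(self, marker_sequence):
        self.sequence = list(marker_sequence)
        self.index = 0  # position of the next marker to hit

    def next_marker(self):
        """Marker the user should shoot next, or None when complete."""
        return self.sequence[self.index] if self.index < len(self.sequence) else None

    def record_hit(self, marker_id):
        """Advance the sequence only if the correct marker was hit."""
        if marker_id == self.next_marker():
            self.index += 1
            return True
        return False

program = ShootingProgram(["target_can", "target_sign", "target_drone"])
program.record_hit("target_can")
print(program.next_marker())  # target_sign
```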
In some embodiments, processor 132 may direct display 200 to visually communicate the target marker(s) 136 that are not to be shot (e.g. “friendlies”). Processor 132 may penalize the user in response to scoring a successful hit of a target marker 136 that is not to be hit. For example, the penalty may include processor 132 terminating a shooting program, deducting from the user's score, audibly or visually cautioning the user, or combinations thereof.
In some embodiments, processor 132 may determine shot commentary (e.g. corrective advice or shot error), and direct the shot commentary to be communicated visually with display 200 and/or audibly with speaker 196.
In some embodiments, camera 124 includes a zoom lens 224 that is operable to provide optical zoom to camera 124. Zoom lens 224 may include a plurality of zoom positions corresponding to different levels of zoom. By increasing the zoom position of zoom lens 224, camera 124 is provided with a smaller field of view, whereby a target marker 136 within the field of view can occupy a greater portion of the captured image 152 than if the field of view was larger. This can result in a target marker 136 with greater detail in the captured image 152 for more accurate and/or faster matching by processor 132 to marker data corresponding to that target marker 136 in memory 128.
The zoom position of zoom lens 224 may be manually adjustable (e.g. by user-input controls 204). Alternatively or in addition, processor 132 may direct camera 124 to change the zoom position of zoom lens 224 based on the determined subject distance 172. For example, processor 132 may determine subject distance 172 by reference to camera focus information, or by another method such as the methods described above. In some embodiments, processor 132 may direct camera 124 to increase the zoom position of zoom lens 224 to compensate for a long subject distance 172, and direct camera 124 to decrease the zoom position of zoom lens 224 to compensate for a short subject distance 172. This can provide camera 124 with a reduced field of view for distant shots where a target marker 136 is expected to occupy a smaller portion of the captured image 152, and can provide camera 124 with a greater field of view for close shots where target marker 136 is expected to occupy a greater portion of the captured image 152 (e.g. to ensure the target marker 136 can fit within the captured image 152).
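By way of illustration, the following Python sketch selects a zoom position from the subject distance, assuming a camera with a discrete set of zoom positions of known horizontal field of view and a marker of known physical width; all numeric values are illustrative assumptions.

```python
# Zoom-position selection from subject distance (a sketch; the field-of-
# view table, target fraction, and marker width are assumed values).
import math

ZOOM_FOV_DEG = {0: 60.0, 1: 40.0, 2: 25.0, 3: 15.0}  # zoom position -> FOV
TARGET_FRACTION = 0.3   # desired marker width / frame width
MARKER_WIDTH_M = 0.2    # physical marker width in meters

def pick_zoom(subject_distance_m):
    """Pick the zoom position whose field of view makes the marker occupy
    a fraction of the frame closest to TARGET_FRACTION."""
    def fraction(fov_deg):
        frame_width = 2 * subject_distance_m * math.tan(math.radians(fov_deg) / 2)
        return MARKER_WIDTH_M / frame_width
    return min(ZOOM_FOV_DEG, key=lambda z: abs(fraction(ZOOM_FOV_DEG[z]) - TARGET_FRACTION))

print(pick_zoom(0.5))   # 0: widest view for a close shot
print(pick_zoom(20.0))  # 3: narrowest view for a distant shot
```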
In some cases, the image recognition process employed by processor 132 is slower or requires greater processing power to match a target marker 136 in an image having very high resolution (e.g. high pixel count). For example, beyond a certain image resolution, the speed and accuracy benefits of capturing target marker 136 in greater detail may be outweighed or reduced by the burden of processing a larger image. The characteristics of some target markers 136 may be such that recognition accuracy improves little, if at all, when the target marker 136 is captured with detail above a threshold level (e.g. a threshold number of pixels). Alternatively or in addition to zoom lens 224 (and to manual or automatic zoom positioning), processor 132 may change the image capture resolution of camera 124. For example, based on the camera subject distance 172 and field of view, processor 132 may reduce (or increase) the image quality (e.g. pixel density) setting of camera 124 prior to image capture, or may crop the field of view of camera 124 prior to image capture (e.g. increase “digital zoom”) to optimize the detail of a captured image 152 and the target marker 136 within. This can reduce the time required for camera 124 to capture the image. In some cases, this can also reduce the time or processing power required for processor 132 to match a captured target marker 136 without significantly affecting matching accuracy. For example, processor 132 may determine an expected detail (e.g. a size such as pixel dimensions) of a target marker 136 at a given subject distance 172 and field of view, and then change the image capture resolution of camera 124 to optimize the detail of the captured image 152 and target marker 136 within (e.g. for matching speed, matching accuracy, or both).
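By way of illustration, the following Python sketch selects the smallest capture resolution at which a target marker 136 is expected to meet a threshold level of detail; the sensor modes and the 100-pixel threshold are illustrative assumptions.

```python
# Capture-resolution selection (a sketch; sensor modes and the detail
# threshold are assumed values).
RESOLUTIONS = [(640, 480), (1280, 720), (1920, 1080), (3840, 2160)]

def pick_resolution(marker_frame_fraction, detail_px=100):
    """Return the smallest resolution at which the marker is expected to
    span at least detail_px pixels horizontally; marker_frame_fraction is
    the expected marker width as a fraction of the frame width (derived
    from subject distance and field of view)."""
    for width, height in RESOLUTIONS:
        if width * marker_frame_fraction >= detail_px:
            return (width, height)
    return RESOLUTIONS[-1]  # even the largest mode falls short; use it

# A marker expected to fill 10% of the frame needs >= 1000-px frame width.
print(pick_resolution(0.10))  # (1280, 720)
```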
In some embodiments, the optimal detail of a target marker 136 is predetermined and included in marker data corresponding to that target marker 136 stored in memory 128. The predetermined optimal detail may be parametrically defined based on contextual shot information, such as lighting, subject distance, and camera movement for example. For example, the optimal detail of a target marker 136 may be greater in low lighting conditions than in bright lighting conditions; greater at farther distances than at near distances (e.g. due to air clarity and focus accuracy); and greater when camera 124 is moving than when camera 124 is still (e.g. due to motion blur).
In some embodiments, marker data corresponding to an additional target marker 136 may be downloaded to memory 128. For example, memory 128 may include removable memory (e.g. a flash memory card) for downloading marker data onto memory 128 (e.g. using an external computing device) and then reconnecting memory 128 to simulated firearm 100. Alternatively or in addition, memory 128 may store marker data received from an external computing device communicatively coupled to memory 128, such as an external computing device wirelessly coupled to simulated firearm 100 by a wireless communication device 210.
In some embodiments, marker data corresponding to target markers 136 can be selectively removed from memory 128 (e.g. to be excluded from matching against subsequently captured images). For example, a user may select marker data in memory 128 using user-input controls 204, and thereby direct processor 132 to remove that marker data from memory 128. Alternatively or in addition, a user can direct processor 132 to initiate a target marker removal process using user-input controls 204. The user may then direct camera 124 towards a target marker 136 to be removed and actuate trigger 112 to cause camera 124 to capture a setup image including the target marker 136. The processor 132 may attempt to match the captured target marker 136 to marker data stored in memory 128, and, if a match is found, delete that marker data from memory 128.
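By way of illustration, a minimal Python sketch of this removal flow follows, again assuming pyzbar-decodable barcode markers and an in-memory dictionary standing in for memory 128.

```python
# Marker removal (a sketch; the dict stands in for memory 128 and pyzbar
# is assumed for barcode decoding).
from pyzbar.pyzbar import decode
from PIL import Image

def remove_marker(setup_image_path, marker_data):
    """Delete the first stored marker whose barcode text is decoded from
    the setup image; return its id, or None if nothing matched."""
    decoded = {d.data.decode("utf-8") for d in decode(Image.open(setup_image_path))}
    for marker_id, text in list(marker_data.items()):
        if text in decoded:
            del marker_data[marker_id]  # exclude from future matching
            return marker_id
    return None
```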
Wireless communication device 210 can be any device that allows data signals to be sent or received wirelessly between simulated firearm 100 and a device external to and physically disconnected from simulated firearm 100 (an “external device”). For example, wireless communication device 210 may be a device that wirelessly communicates according to a wireless communication technology, such as infrared (e.g. IrDA), Wi-Fi (e.g. IEEE 802.11x), cellular (e.g. 2G, 3G, or LTE), or Bluetooth.
In some cases, processor 132 only directs wireless communication device 210 to transmit shot data where the shot was scored successfully or where the shot satisfies another criterion (e.g. a quantitative shot score above a threshold, or where a designated target marker 136 is matched in the captured image). For example, in a group shooting program (e.g. team and/or competitive game or training exercise) including a plurality of designated target markers 136 to be shot, simulated firearm 100 may identify a successfully shot target marker 136 to the other simulated firearm(s) 100b in system 232. This can allow the other simulated firearms 100b to remove the identified target marker 136 from a list of target markers 136 remaining in the shooting program for example.
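By way of illustration, the following Python sketch broadcasts shot data over a local network, assuming a simple JSON-over-UDP scheme; the disclosure does not specify a wire format, so the port number and message fields are illustrative assumptions.

```python
# Broadcasting shot data to other simulated firearms (a sketch; the port
# and message schema are assumed, not specified by this disclosure).
import json
import socket

SHOT_PORT = 50505  # hypothetical port shared by all firearms in the game

def broadcast_hit(marker_id, score):
    """Send a hit notification so other firearms can remove marker_id
    from their list of remaining targets."""
    message = json.dumps({"event": "hit", "marker": marker_id, "score": score})
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(message.encode("utf-8"), ("255.255.255.255", SHOT_PORT))
```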
Alternatively or in addition, simulated firearm 100 may transmit shot data to a target 146. In response to receiving the shot data, the target 146 may produce a visually or audibly perceptible reaction. The target 146 may be any object that embodies or carries a target marker 136, such as a signpost or a vehicle (e.g. land vehicle or aerial drone). For example, the target 146 may determine that the shot data identifies a target marker 136 carried by that target 146, and in response activate, modulate (e.g. change color, intensity, or pattern), or terminate a light emission, or activate, modulate (e.g. move faster, slower, or in a different pattern), or terminate a movement. In some embodiments, a target 146 is a remotely user-controlled device (e.g. a remote-controlled car or aerial drone).
In some embodiments, simulated firearm 100 includes a light source 244. Light source 244 can be any device that can emit optical radiation (e.g. in the infrared, visible, and/or ultraviolet light spectrums). The optical radiation (“light”) from light source 244 may be substantially collimated (e.g. as in a laser) or uncollimated (e.g. divergent or convergent). As shown, light source 244 is coupled to body 104 and oriented to emit light in parallel with shooting axis 120. This can allow light source 244 to illuminate target markers 136, which can improve the clarity of target markers 136 in captured images. Alternatively or in addition, light source 244 can visually communicate information to the user. For example, processor 132 may direct light source 244 to emit flashing light or to change the color of the emitted light in response to matching stored marker data to a portion of a captured image, and/or in response to scoring of a shot (e.g. green for successful and red for unsuccessful). This can allow the user to receive visual shot feedback without having to look at display 200 of simulated firearm 100, for example. Alternatively or in addition, light source 244 may include an aiming aid, such as a laser pointer for marking the intersection of the shooting axis 120 on the external environment.
In some embodiments, a target marker 136 may emit or reflect optical radiation outside of the visible spectrum, such as infrared or ultraviolet radiation. In this case, camera 124 may be configured to detect at least part of the non-visible radiation spectrum that the target marker 136 emits or reflects. This can allow usage of target markers 136 that are optically detectable by camera 124 for matching by processor 132, but which are not visible to the user. This can allow shooting programs (e.g. games or target practice) to better simulate real-world situations in which targets are not normally labelled with target markers 136. Where target markers 136 are reflective to non-visible radiation, simulated firearm 100 may include a light source 244 that emits non-visible optical radiation for the target marker 136 to reflect.
Simulated firearm 100 can include any user-input controls 204 that allow for manual (i.e. by hand) user selections receivable by processor 132. For example, user-input controls 204 may include one or more (or all) of buttons, switches, dials, slides, knobs, touch sensitive surfaces (e.g. capacitive or resistive based), wheels, or levers.
Simulated firearm 100 may include a battery 248 coupled to body 104. Battery 248 may be any one or more energy storage devices suitable for providing electrical power to one or more (or all) energy consuming components of simulated firearm 100, such as one or more (or all) of processor 132, camera 124, memory 128, range finder 176, orientation sensor 188, speaker 196, display 200, user-input controls 204, and light source 244. For example, battery 248 may include an alkaline, Ni-Cd, NiMH, or Li-ion battery that may be rechargeable or single-use disposable.
Memory 128 can be any one or more devices that can store electronic information, such as marker data corresponding to target markers 136. Memory 128 may include one or both of volatile memory (e.g. RAM), and non-volatile memory (e.g. ROM, flash memory, hard disk drive, solid state drive, diskette). In some cases, all of or a portion of memory 128 may be removable.
While the above description provides examples of the embodiments, it will be appreciated that some features and/or functions of the described embodiments are susceptible to modification without departing from the spirit and principles of operation of the described embodiments. Accordingly, what has been described above has been intended to be illustrative of the invention and non-limiting and it will be understood by persons skilled in the art that other variants and modifications may be made without departing from the scope of the invention as defined in the claims appended hereto. The scope of the claims should not be limited by the preferred embodiments and examples, but should be given the broadest interpretation consistent with the description as a whole.
Item 1: A simulated firearm comprising:
a body having a handle, a trigger, and a barrel portion, the barrel portion defining a shooting axis;
a camera coupled to the body and aligned with the shooting axis, the camera configured to capture an image in response to actuation of the trigger;
a memory coupled to the body, the memory storing marker data corresponding to at least one target marker; and
a processor coupled to the body, the processor configured to determine whether a portion of the image captured by the camera matches any one or more of the at least one target marker.
This application claims the benefit of U.S. Provisional Application No. 62/332,276 filed on May 5, 2016, which is incorporated by reference herein in its entirety.