The present disclosure is generally related to portable optical devices, such as rifle scopes, telescopes, and binoculars.
Portable optical devices, such as rifle scopes and gun-mounted cameras, typically include buttons or other controls that allow the shooter to adjust parameters, such as image focus, zoom, and other parameters. Additionally, when a shooter is in the field, the shooter may be making adjustments based on a particular target, yet that target may not be the ideal target, and such adjustments may not reflect the correct designated impact point. Because a companion or guide cannot see what the user/shooter sees, it can be very difficult, if not at times impossible, for the companion or guide to assist in the field. Furthermore, when a parent is attempting to teach his/her child to shoot or hunt, the process can be difficult and frustrating as the child tries to describe what he/she is seeing and the parent tries to understand and instruct the child.
In an embodiment, a firearm scope includes an optical sensor to capture video data, a display, a transceiver configured to communicate data wirelessly through a communication channel, and a controller. The controller can be configured to provide a portion of the video data to the display, provide media content including the video data to the transceiver for wireless transmission, receive a signal from the communication channel in response to the wireless transmission, and selectively modify the portion of the video data provided to the display in response to receiving the signal.
In another embodiment, a portable optical device includes an optical sensor to capture video data of a view area, a display configured to display a portion of the video data, a transceiver configured to communicate data through a wireless communication channel, and a controller. The controller is configured to provide the portion of the video data and overlay data to the display, provide media data including the video data and the overlay data to the transceiver for communication through the wireless communication channel, receive a signal from the wireless communication channel in response to the media data, and selectively modify at least one of the overlay data and the portion of the video data in response to receiving the signal.
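By way of illustration only, the controller behavior recited in these embodiments might be sketched as follows; the class and method names (ScopeController, sensor.capture, transceiver.poll, and so on) are assumptions made for the sketch and do not appear in the disclosure.

```python
# Minimal sketch of the recited controller behavior; all names are
# illustrative assumptions, not part of the disclosure.
class ScopeController:
    def __init__(self, sensor, display, transceiver):
        self.sensor = sensor            # optical sensor capturing video data
        self.display = display          # display within the scope
        self.transceiver = transceiver  # wireless transceiver
        self.overlay = {"reticle": (0.5, 0.5), "tag": None}

    def tick(self):
        frame = self.sensor.capture()
        # Provide a portion of the video data, plus overlay data, to the display.
        self.display.show(frame, self.overlay)
        # Provide media data (video and overlay) for wireless transmission.
        self.transceiver.send({"video": frame, "overlay": self.overlay})
        # Selectively modify the displayed data in response to a received signal.
        signal = self.transceiver.poll()
        if signal is not None:
            self.apply(signal)

    def apply(self, signal):
        if signal.get("command") == "move_tag":
            self.overlay["tag"] = signal.get("location")
```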
In still another embodiment, a method includes receiving media content from a gun scope at a computing device, providing the media content to a display of the computing device, receiving a user input corresponding to the media content at the computing device, and sending a signal to the gun scope in response to receiving the user input.
In yet another embodiment, a computer-readable storage device includes computer-readable instructions that, when executed by a processor, cause the processor to receive media data from a firearm scope, provide the media data from the firearm scope to a display, receive a user input corresponding to the media data, and send a signal related to the user input to the firearm scope.
In the following discussion, the same reference numbers are used in the various embodiments to indicate the same or similar elements.
In the following detailed description of the embodiments, reference is made to the accompanying drawings which form a part hereof, and in which are shown by way of illustration specific embodiments. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present disclosure.
Described below are embodiments of a portable optical device, such as a rifle scope, a telescope, binoculars, or other optical device that is configured to wirelessly communicate with a computing device. As used herein, the term “computing device” can refer to any electronic device configurable to couple to a communications network and to execute instructions, such as Internet browser applications, image rendering applications, and the like, and to receive user inputs, such as through interaction with a keypad or a touch-sensitive interface. The portable optical device may send media content to the computing device through a wireless communication link and may receive a signal in response thereto. As used herein, the term “media content” can include video data, audio data, text data, graphical data, processor-executable instructions, or any combination thereof. In an example, the portable optical device includes video capture (recording) functionality to capture video data associated with a view area, and includes a network transceiver. The portable optical device further includes data processing functionality configured to generate text data and graphical data (such as a reticle) and to present such data (as overlay data) together with at least a portion of the video data to a display. The portable optical device can be configured to communicate media content wirelessly to the computing device, and may be configured to adjust a visual display of the portable optical device in response to signals received from the computing device.
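As a purely illustrative aid, the "media content" defined above could be represented in software along the following lines; the field names are assumptions for the sketch, not terms from the disclosure.

```python
from dataclasses import dataclass, field

# One possible (assumed) in-memory representation of "media content":
# any combination of video, audio, text, graphics, and instructions.
@dataclass
class MediaContent:
    video_frames: list = field(default_factory=list)  # captured video data
    audio_samples: bytes = b""                        # captured audio, if any
    text: str = ""                                    # generated text data
    overlay: dict = field(default_factory=dict)       # e.g., reticle graphics
    instructions: bytes = b""                         # processor-executable data
```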
The computing device may be configured to run an application or process that displays media content from the portable optical device on a display. The display may be a touch-screen interface configured to accept inputs corresponding to the media content and to send a signal to the portable optical device based on the inputs. The signal includes data, such as commands, location data corresponding to a touch, or other data, which may be utilized by a controller of the portable optical device to adjust the portion of the video data provided to the display of the optical device and/or to adjust the overlay data.
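For illustration, a hedged sketch of such a signal follows, assuming a simple JSON encoding; the command names and the normalized coordinate convention are assumptions, not part of the disclosure.

```python
import json

# Hypothetical wire format for the signal sent to the portable optical
# device: a command plus optional location data from a touch.
def make_signal(command, touch_xy=None):
    msg = {"command": command}
    if touch_xy is not None:
        # Normalized (0..1) coordinates let the scope's controller map the
        # touch onto whatever portion of the view area it currently displays.
        msg["location"] = {"x": touch_xy[0], "y": touch_xy[1]}
    return json.dumps(msg).encode("utf-8")

signal = make_signal("designate_target", touch_xy=(0.42, 0.61))
```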
As used herein, the term “portable” refers to a device that can be carried by a user. In a particular embodiment, the portable optical device may be implemented as a gun scope that can be mounted to a small arms firearm. One possible example of an embodiment of a portable optical device implemented as a rifle scope is described below with respect to
In an embodiment, circuitry 120 includes optical sensors configured to capture video data associated with a view area of rifle scope 100 received through optical element 110. Circuitry 120 further includes logic circuitry (such as a digital signal processor (DSP), a microcontroller unit (MCU), and/or communications logic) configured to format the captured video into a media content format suitable for transmission through a communication link to a computing device. The communication link can be a short-range wireless link (such as a Bluetooth® link) or a logical communications link through a network, such as a mobile phone network, a cellular, digital, or satellite communication network, or another wireless communication network. The logical communications link may be any path or route through a network that communicatively couples the portable optical device to the computing device. In an example, rifle scope 100 sends the media content to a destination device. The destination device can be another optical device including another instance of circuitry 120, such as a spotting scope being used in conjunction with rifle scope 100. In another embodiment, the destination device may be a computing device, such as a desktop computer, laptop computer, tablet computing device, smart phone, or other device capable of executing instructions.
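Purely as a sketch of the formatting-for-transmission step, the media content might be framed for a logical link as below; the length-prefixed layout and the socket-like transport object are assumptions (an actual device might use a Bluetooth® or cellular stack instead).

```python
import json
import struct

# Illustrative framing of media content for a logical communications link.
# `sock` stands in for any connected, socket-like transport.
def send_media(sock, frame_bytes, metadata):
    header = json.dumps(metadata).encode("utf-8")
    # Length-prefix both parts so the destination device can parse the stream.
    sock.sendall(struct.pack("!II", len(header), len(frame_bytes)))
    sock.sendall(header)
    sock.sendall(frame_bytes)
```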
In an example, a user may attach rifle scope 100 to his/her rifle and carry the system into the field during a hunting expedition. When the user pulls the trigger, movement of the trigger is detected by circuitry 120, causing activation of the optical sensors to capture the video data. Detection of the trigger pull may further activate a microphone and audio processing circuitry to capture audio data. In other embodiments, circuitry 120 may be configured to continually capture optical and audio data for transmission.
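A minimal sketch of the trigger-driven capture logic described above follows; the class and hook names are assumptions, and the hardware-specific activation steps are left as stubs.

```python
# Assumed sketch: trigger movement activates video (and optionally audio)
# capture, unless the device is configured for continuous capture.
class CaptureManager:
    def __init__(self, continuous=False):
        self.continuous = continuous
        self.recording = continuous  # continuous mode captures at all times

    def on_trigger_moved(self):
        if not self.recording:
            self.recording = True
            self.start_video()  # activate the optical sensors
            self.start_audio()  # activate microphone and audio processing

    def start_video(self):
        pass  # hardware-specific; omitted in this sketch

    def start_audio(self):
        pass  # hardware-specific; omitted in this sketch
```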
In the above example, rifle scope 100 is configured to capture and send media content, including video data and/or other data associated with a view area of rifle scope 100, to a destination device through the network, allowing the user to share video of his/her hunting experience with another user in real time or near real time.
The computing device may receive the media content and present the media content to a display together with one or more user-selectable options, such as buttons or links. A user may interact with the user-selectable options and/or the media content and, in response to the user interactions, the computing device may send a signal to rifle scope 100, causing a controller within rifle scope 100 to selectively modify at least a portion of the video data and/or the overlay data provided to a display of rifle scope 100. For example, a user may interact with a touch-screen of the computing device to alter a zoom setting, and the signal sent from the computing device to rifle scope 100 may cause the controller within rifle scope 100 to adjust the zoom setting based on the signal. In another example, the media content from rifle scope 100 may represent real-time or near real-time video data corresponding to the portion of the view area provided to a display of the rifle scope. The user may touch a target within the media content via a touch-sensitive interface, causing the computing device to send the signal, and rifle scope 100 may alter the data provided to the display of rifle scope 100 to highlight or otherwise identify the selected target. One possible example of a system including a rifle scope in communication with a computing device is described below with respect to
Circuitry 120 of rifle scope 100 can be configured to wirelessly communicate with a computing device 204 via a communication link 226. As discussed above, the computing device may be a desktop computer, laptop computer, tablet computing device, smart phone, or other device including circuitry and software configured to communicate with circuitry 120 and to process and interact with media content received from rifle scope 100. The computing device 204 can include a display component 224, which can be used to display media content received from rifle scope 100. In an embodiment, display component 224 may be a touch-sensitive interface configured to display data and receive user input. As discussed above, the portion of the video data presented to the display of rifle scope 100 may be presented on display component 224, and the user may interact with display component 224, causing computing device 204 to send a signal to rifle scope 100 to selectively alter the portion of the video data and/or the overlay data provided to the display.
In an embodiment, a user may assist a shooter by observing the media content on display component 224 and by interacting with display component 224 to provide feedback and/or instructions (or to alter the portion of the video data provided to a display of rifle scope 100), making it possible for the user to train and/or otherwise interact with the shooter in real time or near real time. In an example, the user may be able to see the tag or visual marker on a selected target within the view area of the scope on display component 224 and to interact with display component 224 to provide feedback to the shooter through rifle scope 100. An example of the communication between a portable optical device and a computing device is described below with respect to
In an example, the optical element 110 captures video data associated with the view area in front of rifle scope 100. Circuitry 120 may provide a portion of the video data to the display and may provide a reticle and other data for presentation within the display. When a shooter zooms in to a magnified portion of the view area, the optical element 110 may still be capturing optical data associated with a wider view area than what appears on the display. Circuitry 120 may send video data of the entire view area or just the portion of the view area to the computing device 204. In an example, the user may interact with the display component 224 to adjust the view area, such as by dragging the view area to the right or left (for example), causing the displayed portion within the rifle scope 100 to shift right or left. In an embodiment, a user may interact with the display component 224 to zoom in or pan the view shown on display component 224 to display different portions of the view area than what the shooter sees in the eyepiece 102. In some examples, the user may elect to view other portions of the view area without altering the portion of the view area provided to a display within rifle scope 100.
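The crop-and-pan relationship described in this paragraph can be illustrated with a little arithmetic; the function below is a sketch under the assumption that the displayed portion is a zoom-scaled sub-rectangle of the full sensor frame.

```python
# Illustrative pan/zoom math: the sensor captures the full view area, and
# the display shows a sub-rectangle of it selected by zoom and pan.
def displayed_region(full_w, full_h, zoom, center_x, center_y):
    """Return a (left, top, width, height) crop for a zoom factor and a
    normalized (0..1) center point that panning moves around."""
    w, h = full_w / zoom, full_h / zoom
    left = min(max(center_x * full_w - w / 2, 0), full_w - w)
    top = min(max(center_y * full_h - h / 2, 0), full_h - h)
    return left, top, w, h

# Dragging the view at the computing device maps to a new center point;
# a pinch gesture maps to a new zoom factor.
print(displayed_region(1920, 1080, zoom=2.0, center_x=0.5, center_y=0.5))
```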
Other overlay data for the portion 302 of the view area may include environmental data, such as temperature, wind speed and direction, barometric pressure, range to the selected target, muzzle velocity, and/or other data. For example, rifle scope 100 may allow a shooter to "tag" a target, which may place a visual marker on a selected target. Rifle scope 100 may send media data to computing device 204. The media content may include the portion 302 of the view area, the potential target 304, the crosshairs 306, and any tag or target designation 308, sent by circuitry 120 using a wireless communication link 226. The media content may also include other overlay data. In an embodiment, the media content is presented on display component 224 of computing device 204, allowing the user of computing device 204 to see what the shooter sees.
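As an illustrative aggregation of the overlay data enumerated above, one might use a structure like the following; the field names and units are assumptions for the sketch.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Assumed container for the overlay data listed above; reference numerals
# in the comments refer to the elements discussed in the text.
@dataclass
class OverlayData:
    temperature_c: Optional[float] = None
    wind_speed_mps: Optional[float] = None
    wind_direction_deg: Optional[float] = None
    pressure_hpa: Optional[float] = None
    range_to_target_m: Optional[float] = None
    muzzle_velocity_mps: Optional[float] = None
    crosshairs: Tuple[float, float] = (0.5, 0.5)  # crosshairs 306
    tag: Optional[Tuple[float, float]] = None     # tag / visual marker 308
```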
Computing device 204 may be configured to execute an application to process the media data received from rifle scope 100. In an example embodiment, computing device 204 is a portable computing device, such as a tablet computer or smartphone, that includes an input interface, such as a touch-screen. Computing device 204 may receive the media data and provide the media data to a display. Computing device 204 may then receive user input corresponding to the media data and may send a signal to rifle scope 100, which may cause rifle scope 100 to alter at least one of the portion of the video data provided to the display and the overlay data.
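A sketch of such an application's main loop might read as follows; receive_media, render, and read_touch stand in for platform APIs and are assumptions, not interfaces from the disclosure.

```python
import json

# Hedged sketch of the computing-device application loop described above.
def run_app(link, screen):
    while True:
        media = link.receive_media()           # media data from the scope
        screen.render(media)                   # provide media data to a display
        touch = screen.read_touch(timeout=0)   # user input tied to the media
        if touch is not None:
            # Send a signal that may alter the scope's displayed portion of
            # the video data and/or its overlay data.
            link.send(json.dumps({"command": touch.command,
                                  "location": touch.location}).encode("utf-8"))
```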
In the illustrated embodiment, in addition to the media content received from rifle scope 100, interactive options are shown on display component 224. For example, options may include a "Move Tag" button 310 accessible by a user to adjust a location of tag or visual marker 308 within the portion 302 of the video data. Additionally, display component 224 depicts a "Designate Target" button 312 accessible by a user to initiate a target identification process and to identify a target, such as by touching an area of display component 224 corresponding to a location of the target within the displayed media content. The options further include a "Direct Field of View" button 314 that, when selected, allows the user to drag, scroll, or otherwise change the portion of the view area presented to a display of rifle scope 100. Alternatively, the "Direct Field of View" button 314 may be selected to place an arrow or other directional indication within the portion of the view area to direct the shooter to change the view area in the direction of the pointer. The options also include a "Zoom Control" button 316 that, when selected, allows the user to selectively alter the level of zoom of rifle scope 100. In an example, the user may touch the touch-screen and pinch or expand the view area to zoom in or zoom out. In another example, zoom in and zoom out buttons may be presented that the user may select to alter the zoom level. The options may include other buttons 318 as well. In some instances, the user may simply tap or otherwise interact with display component 224 to access user-selectable features. User interactions with display component 224 are treated as user input and can be transmitted as a signal to rifle scope 100 to selectively alter the overlay data and/or the portion of the view area presented to a shooter.
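For illustration, the buttons 310-318 might dispatch commands along these lines; the button identifiers and command names are assumptions chosen for the sketch.

```python
import json

# Hypothetical mapping from the on-screen buttons to command identifiers
# carried in the signal sent to the rifle scope.
BUTTON_COMMANDS = {
    "move_tag": "MOVE_TAG",                  # button 310
    "designate_target": "DESIGNATE_TARGET",  # button 312
    "direct_field_of_view": "DIRECT_FOV",    # button 314
    "zoom_control": "SET_ZOOM",              # button 316
}

def on_button_pressed(button_id, link, payload=None):
    # "Other buttons" 318 would extend the map; unknown ids are ignored.
    command = BUTTON_COMMANDS.get(button_id)
    if command is not None:
        link.send(json.dumps({"command": command, "payload": payload}).encode("utf-8"))
```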
In a particular example, when the user selects the "Move Tag" button 310, computing device 204 may allow the user to move a position of visual marker or tag 308 to adjust the selected location on the target. For example, a shooter may tag a target 304 using digital rifle scope 100, and a trainer using computing device 204 may want to relocate the tag to indicate a better location on the selected target. Thus, the communication between computing device 204 and rifle scope 100 may be used to facilitate training.
Some embodiments may have additional interactive options, as represented by the "other buttons" 318 element, and some embodiments may have fewer buttons. In an example embodiment using a touch-screen interface, interface buttons 310-318 may not be needed, and commands can be entered by means of specific touch feedback commands; for example, double-tapping may place a tag, while pressing down and dragging in a direction may adjust the field of view. In some embodiments, a user may input commands to the computing device 204 by means of a pointer device, such as a mouse or stylus, by means of a touch-screen interface as discussed, by voice commands, or by other means.
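A sketch of the button-free, gesture-only alternative follows, reusing the hypothetical command names from the previous sketch; the gesture event fields are likewise assumptions.

```python
# Illustrative gesture classifier for the touch-only embodiment: specific
# touch patterns map to the same hypothetical commands the buttons issue.
def classify_gesture(event):
    if event.kind == "double_tap":
        return {"command": "MOVE_TAG", "location": event.position}
    if event.kind == "drag":
        return {"command": "DIRECT_FOV", "delta": event.delta}
    if event.kind == "pinch":
        return {"command": "SET_ZOOM", "factor": event.scale}
    return None  # unrecognized gestures produce no signal
```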
User inputs received at the computing device may be converted into a signal appropriate for transmission over wireless communication link 226 and transmitted to rifle scope 100. Circuitry 120 in rifle scope 100 may process the signal and place or move tags, adjust the field of view, or make other modifications to the image provided to a display of rifle scope 100. For example, a shooter may see a new target designation through the eyepiece 102 of rifle scope 100.
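On the scope side, the signal processing described here might be sketched as a small dispatcher; the message format mirrors the hypothetical one used in the earlier sketches and is equally an assumption.

```python
import json

# Illustrative scope-side handling of a received signal.
def handle_signal(raw, overlay, view):
    msg = json.loads(raw.decode("utf-8"))
    cmd = msg.get("command")
    if cmd == "MOVE_TAG":
        overlay["tag"] = (msg["location"]["x"], msg["location"]["y"])
    elif cmd == "DIRECT_FOV":
        view.pan(msg["delta"])        # shift the displayed portion
    elif cmd == "SET_ZOOM":
        view.set_zoom(msg["factor"])  # alter the magnification
    # The updated overlay and view then drive what the shooter sees, e.g.,
    # a new target designation visible through the eyepiece.
```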
In an embodiment, the computing device 204 is configured to communicate with a plurality of portable optical devices, shown as Scope 1 through Scope N. Scope 1 through Scope N may communicate media data via wireless communication links, such as wireless communication link 226, to a network 410, such as the Internet, a short-range communication network, a cellular, digital, or satellite network, a secure private network, other networks, or any combination thereof. The computing device 204 may communicate with the network 410 using a network transceiver 406 to receive media data from each of the scopes 1 through N and to present the media data to portions of display component 224.
In an embodiment with multiple portable optical devices, such as the N scopes, computing device 204 can present media data from each of the scopes 1 through N on display component 224 simultaneously. In one embodiment, a single instructor can utilize display component 224 to monitor multiple student shooters. In another embodiment, a user may interact with display component 224 to generate a signal to selected ones of the scopes. In a particular embodiment, a spotter may utilize the views to designate targets for the users of each of the scopes, and the computing device 204 may communicate a signal to selected ones of the scopes to communicate the target designation information.
As described herein, the computing device 204 can accept user input in response to the presented media content from portable optical devices. A user of computing device 204 may be able to selectively provide user input corresponding to the media data from one of the plurality of scopes 1 to N, for example. The processor 404 can selectively transmit signals corresponding to the user inputs through the network 410 via the network transceiver 406 to a designated one of the scopes. When a user enters input in response to selected media data, the processor 404 can selectively transmit a signal to the scope corresponding to the selected video feed. In some embodiments, a user may wish to provide inputs to all connected optical devices, such as to display a message on the optical device display, which inputs can be transmitted to all of the scopes in a multi-cast type of transmission. The processor may also be configured to store recordings of the media content received or user inputs to the memory 402 to maintain a record.
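The selective and multi-cast signaling described here could be sketched as follows, assuming a mapping from scope identifiers to communication links; the API is hypothetical.

```python
# Illustrative selective vs. multi-cast signaling to scopes 1 through N.
# `links` maps a scope identifier to its communication link.
def send_signal(links, signal, scope_id=None):
    if scope_id is not None:
        links[scope_id].send(signal)  # input tied to one selected video feed
    else:
        for link in links.values():   # multi-cast, e.g., a broadcast message
            link.send(signal)
```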
Circuitry 120 can include a field programmable gate array (FPGA) 612 including one or more inputs coupled to outputs of image (optical) sensors 610. FPGA 612 may further include an input/output interface coupled to a memory 614, which can store data and instructions. FPGA 612 can include a first output coupled to a display 616 for displaying images and/or text and a second output coupled to a speaker 617. FPGA 612 may also be coupled to a digital signal processor (DSP) 630 and a microcontroller unit (MCU) 634 of an image processing circuit 618. Circuitry 120 can also include sensors 620 configured to measure one or more environmental parameters (such as wind speed and direction, humidity, temperature, and other environmental parameters) and/or to capture optical measurements, such as reflected laser range-finding data, and to provide the measurement data to MCU 634. Circuitry 120 can further include a microphone 628 to capture sounds and to convert the sounds into an electrical signal, which it can provide to an analog-to-digital converter (ADC) 629. ADC 629 may include an output coupled to an input of DSP 630. In some embodiments, the microphone 628 may be external to circuitry 120, and circuitry 120 may instead include an audio input jack or interface for receiving an electrical signal from microphone 628. In a particular example, the speaker 617 and microphone 628 may be incorporated in a headset worn by a user and coupled to circuitry 120 through an input/output interface (not shown).
DSP 630 can be coupled to a memory 632 and to MCU 634. MCU 634 may be coupled to a memory 636. MCU 634 can also be coupled to input interface 622, transceiver 624, and network transceiver 626. In an example, transceiver 624 can be part of an input/output interface, such as a Universal Serial Bus (USB) interface or another wired (or wireless) interface for communicating data to and receiving data from radio device 606, which may be configured to communicate bi-directionally with network 410. In a particular example, transceiver 624 is a wireless transceiver for communicating data to and receiving data from radio device 606. Network transceiver 626 can communicate media content to network 410 and receive data and/or media content from network 410. In an example, network 410 can be a communications network, such as the Internet, a wireless telephone network (cellular, digital, or satellite), an ad hoc wireless network, or any combination thereof. In a particular example, circuitry 120 may receive audio data from network 410 and output the audio data to a user through speaker 617 and may send audio data from microphone 628 through network 410, allowing the user to utilize circuitry 120 for full-duplex or half-duplex audio communication. Further, circuitry 120 can communicate media content, such as video and audio of a view area to a destination device through network 410, and receive signals through network 410 reflecting user input with regard to media data transferred over the network 410 to a computing device.
In an example, DSP 630 executes instructions stored in memory 632 to process audio data from microphone 628. MCU 634 processes instructions and settings data stored in memory 636 and is configured to control operation of circuitry 120. FPGA 612 is configured to process image data from image (optical) sensors 610. FPGA 612 processes the image data to enhance image quality through digital focusing and gain control. Further, FPGA 612 can perform image registration and stabilization. FPGA 612 may cooperate with DSP 630 to perform optical target tracking within the view area of the portable optical device that incorporates circuitry 120. FPGA 612 further cooperates with MCU 634 to mix the video data with overlay data, such as reticle information and target tracking information (from DSP 630), and to provide the resulting image data to display 616. As a target moves within the view area, DSP 630 can perform target tracking and can apply a visual marker to the target as shown on display 616. The FPGA 612, DSP 630, and MCU 634 can cooperate to modify a portion of the media content sent to the display 616 based on signals received over the network 410 corresponding to user inputs related to the media content.
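The mixing step can be illustrated conceptually with scalar pixel operations; this is a sketch only, since the actual mixing would occur in FPGA logic rather than software, and the frame layout (a grayscale list of rows) is an assumption.

```python
# Conceptual sketch of mixing video data with overlay data before the
# result reaches the display: draw the reticle center and any target tag.
def mix_overlay(frame, overlay):
    out = [row[:] for row in frame]  # copy the grayscale image rows
    h, w = len(out), len(out[0])
    cx, cy = overlay.get("reticle", (0.5, 0.5))
    out[int(cy * (h - 1))][int(cx * (w - 1))] = 255  # reticle marker
    if overlay.get("tag") is not None:
        tx, ty = overlay["tag"]                      # tracked-target marker
        out[int(ty * (h - 1))][int(tx * (w - 1))] = 255
    return out
```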
User-selectable (adjustable) elements 604 may allow the user to control circuitry 120 to transmit media content to a destination device through network 410. Thus, the user can share captured video data, audio data, text data, graphical data, or any combination thereof with a remote user through network 410. If circuitry 120 is incorporated in a rifle scope, such as rifle scope 100, the shooter can capture and transmit media content of his/her hunting experience in real-time or near real-time, and receive signals representing user feedback through the network 410. Signals received through the network 410 can be processed by circuitry 120 and influence a portion of media content displayed on display 616.
In an example where circuitry 120 is incorporated in a rifle scope or optical scope, circuitry 120 can communicate directly with network 410 or can communicate indirectly with network 410 through an intermediate device, such as radio device 606. In some instances, radio device 606 can be configured to communicate directly with other radio devices, forming an ad hoc wireless network or secure network (such as a battlefield network). In one example, circuitry 120 transmits location data, image data, and other information to such radio devices, which information is shared with one or more other radio devices coupled to the ad hoc wireless network or battlefield network.
While the example of
Proceeding to 710, circuitry 120 receives a signal from the destination device in response to sending the media content, the signal corresponding to user input at the destination device. For example, circuitry 120 may receive a signal in response to a user tagging a target depicted in the video data at the destination device. In another example, circuitry 120 may receive zoom data and/or a target designation signal from the destination device. Advancing to 712, circuitry 120 selectively modifies the portion of the video data provided to the display 616 of the rifle scope 100 in response to receiving the signal. For example, the target tagged by the user at the destination device may now be tagged in the portion of the video data shown in the display 616.
Although the above methods are directed towards a computing device and a rifle scope, the teachings can be applied to telescopes, binoculars, or other portable optical devices. Similarly, steps of the methods may be performed by device elements other than those described, or some elements may be combined or eliminated, without departing from the scope of the present disclosure.
In accordance with another embodiment, the methods described herein may be implemented as one or more software programs running on a computing device, such as a personal computer, telephone, tablet, or other device. Dedicated hardware implementations including, but not limited to, application specific integrated circuits, programmable logic arrays, and other hardware devices can likewise be constructed to implement the methods described herein. Further, the methods described herein may be implemented as a computer-readable storage medium including instructions that, when executed, cause a processor to perform the methods.
While the present invention has been described with reference to particular embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the scope of the disclosure.