Optical Device Including a Network Transceiver

Information

  • Patent Application
    20190243230
  • Publication Number
    20190243230
  • Date Filed
    August 20, 2018
  • Date Published
    August 08, 2019
Abstract
A rifle scope includes a housing configured to mount to a rifle. The rifle scope further includes a network transceiver within the housing and configured to communicate bi-directionally with at least one of a network and an electronic device configurable to communicate with the network.
Description
FIELD

The present disclosure is generally related to portable optical devices, such as rifle scopes, telescopes and binoculars.


BACKGROUND

Conventionally, camera devices have been mounted onto guns to take still images and/or to capture video. Such camera devices are often mounted to the outside of an optical device, such as a targeting scope, adding to the bulk and weight of the firearm. In another instance, the camera device has been incorporated within the rifle scope. The recorded video or picture can then be recovered from the camera device at a later time, such as by removing the film (or cassette or flash memory device) from the camera to retrieve the images.


SUMMARY

In an embodiment, a rifle scope includes a housing configured to mount to a rifle. The rifle scope further includes a network transceiver within the housing that is configured to communicate bi-directionally with at least one of a network and an electronic device configurable to communicate with the network.


In another embodiment, a binocular display device includes a housing having at least one optical element. The binocular display device further includes a network transceiver within the housing configured to communicate bi-directionally with at least one of a network and an electronic device configurable to communicate with the network.


In still another embodiment, a portable telescope includes a housing including an optical element. The portable telescope further includes a network transceiver within the housing configured to communicate bi-directionally with at least one of a network and an electronic device configurable to communicate with the network.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective view of an embodiment of a rifle scope having circuitry including a network transceiver.



FIG. 2 is a front view into the rifle scope of FIG. 1 depicting a view area including a reticle and a potential target.



FIG. 3 is a perspective view of a binocular display device having circuitry, including a network transceiver, such as the circuitry of FIG. 1.



FIG. 4 is a block diagram of a system including the circuitry of FIGS. 1-3.



FIG. 5 is an expanded block diagram of a portion of the circuitry of FIGS. 1-4.



FIG. 6 is a side view of an example of a small arms firearm including the rifle scope of FIG. 1.



FIG. 7 is a block diagram of a system including the circuitry of FIGS. 1-6 configured to communicate with one or more devices through a communications network.



FIG. 8 is a flow diagram of an embodiment of a method of sharing media content from an optical device with a destination device.



FIG. 9 is a flow diagram of an embodiment of a method of selectively providing media content to a visual display.





In the following discussion, the same reference numbers are used in the various embodiments to indicate the same or similar elements.


DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS

Embodiments of a portable optical device, such as a telescope, binoculars, or a rifle scope, are described below that are configured to capture video and/or audio corresponding to a view area and to share media content with a destination device through a network. As used herein, the term “media content” includes video data, audio data, text data, graphical data, or any combination thereof. In an example, the portable optical device includes video camera functionality, an audio input, and a network transceiver. The portable optical device further includes data processing functionality configured to generate text data and graphical data (such as a reticle). The portable optical device is configured to communicate media content to the destination device through the network.
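
As a concrete illustration of how these kinds of data might be grouped before transmission, the following is a minimal Python sketch; the MediaContent name and its fields are assumptions made for this example and do not appear in the disclosure.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class MediaContent:
        """Illustrative grouping of the data types named above (assumed layout)."""
        video: Optional[bytes] = None     # encoded video frames
        audio: Optional[bytes] = None     # encoded audio samples
        text: str = ""                    # e.g., range or status annotations
        graphics: Optional[bytes] = None  # e.g., a rendered reticle overlay

        def is_empty(self) -> bool:
            return not (self.video or self.audio or self.text or self.graphics)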



FIG. 1 is a perspective view of an embodiment of a rifle scope 100 having circuitry 108 including a network transceiver. Rifle scope 100 includes an eyepiece 102 and an optical element 104 coupled to a housing 106. Housing 106 defines an enclosure sized to receive circuitry 108. Optical element 104 includes an objective lens and other components configured to receive light and to direct and focus the light toward optical sensors associated with circuitry 108.


Rifle scope 100 includes user-selectable buttons 110 and 112 on the outside of housing 106 that allow the user to interact with circuitry 108 to select between operating modes, to adjust settings, and so on. Further, rifle scope 100 includes thumbscrews 114, 116, and 118, which allow for manual adjustment of the rifle scope 100. In an example, thumbscrews 114, 116 and 118 can be turned, individually, to adjust the crosshairs within a view area of rifle scope 100.


Housing 106 includes a removable battery cover 120, which secures a battery within housing 106 for supplying power to circuitry 108. Housing 106 is coupled to a mounting structure 122, which is configured to mount to a surface of a rifle and which includes fasteners 124 and 126 that can be tightened to secure the housing to the rifle.


In an example, circuitry 108 includes optical sensors configured to capture video associated with a view area of rifle scope 100 received through optical element 104. Circuitry 108 further includes logic circuitry (such as a digital signal processor (DSP), a microcontroller unit (MCU), and/or communications logic) configured to format the captured video into a media content format suitable for transmission through a network, such as a mobile phone network, a satellite communication network, or another wireless communication network using Transmission Control Protocol (TCP)/Internet Protocol (IP) communication protocols. In an example, the destination device can be another optical device including another instance of circuitry 108. In an example, a user may attach rifle scope 100 to his/her rifle and carry the system into the field during a hunting expedition. When the user pulls the trigger, movement of the trigger is detected by circuitry 108, which activates the optical sensors to capture video data and may further activate a microphone and audio processing circuitry to capture audio data. The user may configure circuitry 108 to communicate media content to a destination device through a network. The network can be a wireless communication network, such as a cellular, digital, or satellite communication network. In another instance, the network can be a local area network, a wide area network, or a short-range wireless network, such as a Bluetooth® network.
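
As a rough sketch of the kind of formatting such logic circuitry might perform, the snippet below length-prefixes each encoded frame and sends it over a TCP connection; the host, port, and framing scheme are illustrative assumptions, not details from the disclosure.

    import socket
    import struct

    def send_video_frames(frames, host="203.0.113.5", port=5000):
        """Send each encoded frame as a 4-byte length prefix followed by the frame bytes."""
        # host/port are placeholder values for this example only
        with socket.create_connection((host, port)) as sock:
            for frame in frames:                         # frame: bytes of encoded video
                sock.sendall(struct.pack("!I", len(frame)) + frame)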


In the above example, rifle scope 100 is configured to capture and transmit video and/or audio associated with a view area of the rifle scope to a destination device through the network, allowing the user to share video of his/her hunting experience with another user in real-time or near real-time. In some instances, rifle scope 100 may include a memory configured to store the captured video and/or audio for subsequent transmission, such as when the transceiver establishes a connection to the communications network. One example of image data that may be transmitted through the network connection is described below with respect to FIG. 2.
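
One way to realize this store-then-transmit behavior is a simple buffer that holds media content until a connection exists; the transmit and is_connected callables below are assumed hooks for this sketch, not APIs from the disclosure.

    from collections import deque

    class MediaBuffer:
        """Buffer media content while offline and flush it when a link is available."""

        def __init__(self, transmit, is_connected):
            self._pending = deque()
            self._transmit = transmit          # callable(bytes) -> None (assumed hook)
            self._is_connected = is_connected  # callable() -> bool (assumed hook)

        def submit(self, content: bytes) -> None:
            if self._is_connected():
                self._transmit(content)        # share in real time or near real time
            else:
                self._pending.append(content)  # hold for later transmission

        def flush(self) -> None:
            while self._pending and self._is_connected():
                self._transmit(self._pending.popleft())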



FIG. 2 is a front view 200 into the rifle scope 100 of FIG. 1, depicting a view area 204 including lines 206 and 208 that form a reticle and including a potential target 210. Front view 200 includes thumbscrews 114, 116, and 118 and housing 106. Front view 200 further includes eyepiece 102, which includes an adjustable focusing element 202.


View area 204 can be captured by optical sensors associated with circuitry 108 within housing 106 and converted into video data that can be transmitted wirelessly to a destination device through a network. Circuitry 108 may also include a memory configured to store media content, which can be transmitted at a later time, for example, if a network connection is unavailable. Thus, a user can capture and share the video with a friend or post the video to a server to share with a large number of users. In this way, the user's hunting experience can be shared with others, allowing for social networking of the video associated with a hunt, for example.



FIG. 3 is a perspective view of a binocular display device 300 having circuitry 108 including a network transceiver, such as the circuitry 108 of FIG. 1. In this instance, binocular display device 300 includes eyepieces 302 and optical elements 304 coupled through a housing 306 that may include one or more prismatic components as well as circuitry 108. Binocular display device 300 further includes a binocular adjustment mechanism 308 allowing for physical adjustment of the eyepieces 302 to fit the user.


In this example, circuitry 108 is configured to capture video and/or audio associated with a view area observed through optical elements 304. In an example, circuitry 108 may include directional microphones configured to capture audio from the view area. Circuitry 108 includes a network transceiver and logic configured to selectively provide media content to a destination device through a network. In some instances, circuitry 108 stores video data, audio data, text, or any combination thereof in a memory. Circuitry 108 may store and transmit the media content concurrently and/or may subsequently transmit the media content when a network connection becomes available and/or in response to an input from the user. Thus, binocular display device 300 can be used to capture and share video and audio information in real-time (when a network connection is available), at a later time, or both.


In the examples of FIGS. 1-3, a rifle scope and a binocular display device are configured for wireless transmission of video and/or audio data to a destination device. The destination device can be a portable computing device, such as a smart phone, personal digital assistant (PDA), laptop, or tablet computer. The rifle scope 100 and binocular display device 300 are particular examples of portable optical devices. As used herein, the term “portable optical device” refers to an apparatus that can be carried by a user, for example, when walking through a field. In addition to a rifle scope or binocular display device, circuitry 108 with the network transceiver can be incorporated in any portable optical device for capturing video and/or audio data and for providing the captured media content to a destination device. An example of a system including circuitry 108, which can be incorporated in any such portable optical device, is described below with respect to FIG. 4.



FIG. 4 is a block diagram of a system 400 including the circuitry 108 of FIGS. 1-3. System 400 includes optical elements 402 configured to direct light toward image (optical) sensors 410 of circuitry 108. System 400 further includes user-selectable elements 404 (such as buttons 110 and 112 and/or thumb screws 114, 116, and 118 in FIG. 1) coupled to an input interface 422 of circuitry 108. System 400 also includes a radio device 406 (such as a hand-held radio frequency (RF) communications device, a portable base station, or another electronic device capable of communicating with a network 408) that is coupled to transceiver 424 through a wired or wireless communications link. System 400 is coupled to network 408 through a network transceiver 426. In an example, circuitry 108 can communicate bi-directionally with network 408 through network transceiver 426 or can communicate bi-directionally with radio device 406, which is configured to couple to network 408 and to facilitate communication between circuitry 108 and network 408. As used herein, the phrase “communicate bi-directionally” refers to the capability of sending data to the network 408 and receiving data from the network 408, not necessarily simultaneously.
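
The two communication paths described above (direct via network transceiver 426, or indirect via transceiver 424 and radio device 406) could be selected with logic along these lines; the is_linked and send method names are assumptions for illustration.

    def send_via_available_path(content, network_transceiver, radio_link):
        """Prefer the direct network link; otherwise relay through the radio device."""
        if network_transceiver.is_linked():
            network_transceiver.send(content)   # direct path to network 408
            return True
        if radio_link.is_linked():
            radio_link.send(content)            # radio device 406 relays to network 408
            return True
        return False                            # caller may buffer the content instead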


Circuitry 108 includes a field programmable gate array (FPGA) 412 including one or more inputs coupled to outputs of image (optical) sensors 410. FPGA 412 further includes an input/output interface coupled to a memory 414, which stores data and instructions. FPGA 412 includes a first output coupled to a display 416 for displaying images and/or text and a second output coupled to a speaker 417. FPGA 412 is also coupled to a digital signal processor (DSP) 430 and a microcontroller unit (MCU) 434 of an image processing circuit 418. Circuitry 108 also includes sensors 420 configured to measure one or more environmental parameters (such as wind speed and direction, humidity, and temperature) and/or to capture optical measurements, such as reflected laser range-finding data, and to provide the measurement data to MCU 434. Circuitry 108 further includes a microphone 428 that captures sounds and converts the sounds into an electrical signal, which is provided to an analog-to-digital converter (ADC) 429. ADC 429 includes an output coupled to an input of DSP 430. In some instances, microphone 428 may be external to circuitry 108, and circuitry 108 may instead include an audio input jack or interface for receiving an electrical signal from microphone 428. In a particular example, speaker 417 and microphone 428 may be incorporated in a headset worn by the user and coupled to circuitry 108 through an input/output interface (not shown).


DSP 430 is coupled to a memory 432 and to MCU 434. MCU 434 is coupled to a memory 436. MCU 434 is also coupled to input interface 422, transceiver 424, and network transceiver 426. In an example, transceiver 424 can be part of an input/output interface, such as a Universal Serial Bus (USB) interface or another wired interface for communicating data to and receiving data from radio device 406, which may be configured to communicate bi-directionally with network 408. In a particular example, transceiver 424 is a wireless transceiver for communicating data to and receiving data from radio device 406. Network transceiver 426 communicates media content to network 408 and receives data and/or media content from network 408. In an example, network 408 can be a communications network, such as the Internet, a wireless telephone network (cellular, digital, or satellite), an ad hoc wireless network, or any combination thereof. In a particular example, circuitry 108 may receive audio data from network 408 and output the audio data to a user through speaker 417 and may send audio data from microphone 428 through network 408, allowing the user to utilize circuitry 108 for full-duplex or half-duplex audio communication. Further, circuitry 108 can communicate media content, such as video and audio of a view area to a destination device through network 408.
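
A minimal sketch of the full-duplex audio path follows, assuming a connected socket plus capture_chunk and play_chunk callables for the microphone and speaker; none of these names come from the disclosure.

    import threading

    def run_duplex_audio(sock, capture_chunk, play_chunk, chunk_size=1024):
        """Send microphone audio and play received audio concurrently."""

        def uplink():                            # microphone 428 -> network 408
            while True:
                sock.sendall(capture_chunk(chunk_size))

        def downlink():                          # network 408 -> speaker 417
            while True:
                data = sock.recv(chunk_size)
                if not data:
                    break
                play_chunk(data)

        threading.Thread(target=uplink, daemon=True).start()
        threading.Thread(target=downlink, daemon=True).start()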


In an example, DSP 430 executes instructions stored in memory 432 to process audio data from microphone 428. MCU 434 processes instructions and settings data stored in memory 436 and is configured to control operation of circuitry 108. FPGA 412 is configured to process image data from image (optical) sensors 410. FPGA 412 processes the image data to enhance image quality through digital focusing and gain control. Further, FPGA 412 performs image registration and stabilization. FPGA 412 cooperates with DSP 430 to perform optical target tracking within the view area of the portable optical device that incorporates circuitry 108. FPGA 412 further cooperates with MCU 434 to mix the video data with reticle information and target tracking information (from DSP 430) and provides the resulting image data to display 416. As a target moves within the view area, DSP 430 can perform target tracking and can apply a visual marker to the target as shown on display 416.
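
The per-frame division of labor described above could be expressed as a simple pipeline; the stage callables are placeholders standing in for the FPGA, DSP, and MCU processing and are not part of the disclosure.

    def process_frame(raw_frame, stabilize, track_target, draw_overlay):
        """Stabilize a frame, track the target, then mix in the reticle and marker."""
        frame = stabilize(raw_frame)            # registration and stabilization (FPGA 412)
        target_box = track_target(frame)        # optical target tracking (DSP 430)
        return draw_overlay(frame, target_box)  # reticle + tracking marker for display 416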


In an embodiment, user-selectable elements 404 allow the user to select between live image content captured by image (optical) sensors 410 via optical elements 402 and stored, pre-processed video from memory 414. For example, the user may elect to view, on display 416, pre-recorded video that he/she captured at some point in the past to relive his/her hunting experience. Alternatively, the manufacturer may store pre-processed video (for example, from a hunt performed by a professional hunter) in memory, allowing the user to view the pre-processed video on display 416. Further, user-selectable elements 404 allow the user to control circuitry 108 to transmit media content to a destination device through network 408. Thus, the user can share captured video data, audio data, text data, graphical data, or any combination thereof with a remote user through network 408. If circuitry 108 is incorporated in a rifle scope, such as rifle scope 100, the user can capture and share video of his/her hunting experience in real-time or near real-time.


In an example where circuitry 108 is incorporated in a rifle scope or optical scope, circuitry 108 can communicate directly with network 408 or can communicate indirectly with network 408 through an intermediate device, such as radio device 406. In some instances, radio device 406 can be configured to communicate directly with other radio devices, forming an ad hoc wireless network or secure network (such as a battlefield network). In one example, circuitry 108 transmits location data, image data, and other information to such a radio device, which shares the information with one or more other radio devices coupled to the ad hoc wireless network or battlefield network.


While the example of FIG. 4 depicts some components of circuitry 108, at least some of the operations of circuitry 108 may be controlled using programmable instructions. In one instance, such instructions may be upgraded and/or replaced using transceiver 424. In one instance, the replacement instructions may be downloaded to a portable storage device, such as a thumb drive or radio device 406, which may then be coupled to transceiver 424. The user may then select and execute the upgrade instructions by interacting with user-selectable elements 404. An example of portions of circuitry 108, including an expanded view of memories 414, 432, and 436 showing instructions executable by FPGA 412, DSP 430, and MCU 434, respectively, is described below with respect to FIG. 5.



FIG. 5 is an expanded block diagram of a portion 500 of the circuitry 108 of FIGS. 1-4. Portion 500 includes image processing circuit 418, FPGA 412, memory 414, display 416, and speaker 417. Image processing circuit 418 includes memories 432 and 436. In this example, memory 436 stores instructions that, when executed by MCU 434, cause MCU 434 to perform operations to control operation of circuitry 108. Memory 436 includes display overlay generation instructions 502 that, when executed, cause MCU 434 to generate a reticle and/or other data that can be provided to display 416 in conjunction with video data. Memory 436 includes user adjustment instructions 504 that, when executed, cause MCU 434 to process inputs received from input interface 422 responsive to adjustments made by a user through user-selectable elements 404. Memory 436 includes peripheral device control instructions 506 that, when executed, cause MCU 434 to control radio device 406 or to control some other electronic device coupled to transceiver 424. Memory 436 also includes network communication instructions 508 that, when executed, cause MCU 434 to communicate media content to a destination device through network 408 via network transceiver 426 or via transceiver 424 through radio device 406. Memory 436 further includes video/image sharing instructions 510 that, when executed, cause MCU 434 to include video data in the media content provided to the destination device in real-time or near real-time and/or to share the media content when network transceiver 426 is able to establish a connection to network 408.


Memory 432 includes image processing instructions 512 that, when executed by DSP 430, cause DSP 430 to perform target tracking with respect to identified objects within a view area. Memory 432 further includes audio signal processing instructions 514 that, when executed, cause DSP 430 to process audio data from microphone 428 and ADC 429.


FPGA 412 is configured to process image/optical data and/or to store video data including overlay information 516 in memory 414. In an example where network transceiver 426 is unable to establish a communications link to network 408, FPGA 412 can store the media content in memory 414 and can retrieve and send the media content at a later time, such as when a communications link is established to network 408.


While the above examples have described portable optical devices including circuitry 108 in a variety of contexts and in block diagram form, the portable optical device including circuitry 108 can be mounted onto another apparatus, such as a rifle. An example of a rifle scope including circuitry 108 coupled to a peripheral circuit within a trigger assembly of a rifle is described below with respect to FIG. 6.



FIG. 6 is a side view of an example of a small arms firearm 600 including the rifle scope 100 of FIG. 1. As previously discussed, rifle scope 100 includes circuitry 108 including a network transceiver configured to communicate media content to a network. Rifle scope 100 is mounted to a rifle 602 and aligned with a muzzle 604 of rifle 602 to capture a view area in the target direction. Rifle 602 includes a trigger assembly 606 including peripheral circuit 607 and including a trigger shoe 608 to which a user may apply a force to discharge rifle 602. Rifle 602 further includes a trigger guard 610 and a handle 612 as well as a magazine 614.


In this example, in response to a force applied to trigger shoe 608 by a user, peripheral circuit 607 sends a signal to transceiver 424 of circuitry 108 within rifle scope 100, triggering circuitry 108 to capture video data and/or audio data associated with a view area of rifle scope 100. Circuitry 108 can communicate media content to a destination device through network 408. Further, MCU 434 of circuitry 108 can send control signals to and/or receive control signals from peripheral circuit 607.
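
A sketch of this trigger-initiated capture follows, assuming the scope object exposes the hypothetical methods shown; the method names are illustrative only.

    def on_trigger_pull(scope):
        """Handle the signal sent by peripheral circuit 607 on a trigger pull."""
        scope.start_recording(video=True, audio=True)   # capture the current view area
        if scope.network_available():
            scope.stream_to_destination()               # share in (near) real time
        else:
            scope.buffer_locally()                      # transmit when a link returns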


In the above discussion, circuitry 108 within a portable optical device includes a network transceiver 426 for communicating media content to one or more devices through a network 408. As previously discussed, media content can include video data, audio data, text data, graphical data, or any combination thereof. An example of a system that includes circuitry 108 configured to communicate with one or more remote devices through network 408 is described below with respect to FIG. 7.



FIG. 7 is a block diagram of a system 700 including the circuitry 108 of FIGS. 1-6 configured to communicate with one or more devices through communications network 408. System 700 includes a smart phone 706, a computer server 708, a content source 710, and a computing device 712, which are communicatively coupled to circuitry 108 through communications network 408. Circuitry 108 may be coupled to another instance of circuitry 108, identified as circuitry 108′, within another optical device through communications network 408. Further, a third instance of circuitry 108, labeled as circuitry 108″, is configured to communicate with an electronic device 716 through a wired or wireless link. Electronic device 716 can be a smart phone, a personal digital assistant, a computing device, a field radio (such as a military radio), a router, a satellite uplink, or some other electronic device that is configurable to communicate bi-directionally with network 408 and to communicate bi-directionally with circuitry 108″ to facilitate communication between circuitry 108″ and other devices (or circuitry 108′ and/or circuitry 108) through network 408. As used herein, the term “content source” refers to a device configured to store and share media content with other devices coupled to communications network 408. In an example, content source 710 can be a networked hard disk coupled to communications network 408. Further, as used herein, the term “computing device” refers to any electronic device configurable to couple to communications network 408 and to execute instructions, such as Internet browser applications, image rendering applications, and the like.


In this example, circuitry 108 captures video data, audio data, or any combination thereof associated with a view area and/or generates text data and graphical data, and communicates media content including at least one of the video, audio, and text data to one or more of smart phone 706, server 708, content source 710, computing device 712, and another instance of circuitry 108′. Network transceiver 426 within circuitry 108 communicates the media content wirelessly to communications network 408, making it possible for the user to capture and share media content with one or more remote users through network 408.



FIG. 8 is a flow diagram of an embodiment of a method 800 of sharing video from an optical device with a destination device. At 802, circuitry 108 captures a video stream corresponding to a view area of a rifle scope. Advancing to 804, MCU 434 formats the video stream into media content for transmission through a network, such as communications network 408. In an example, MCU 434 formats the video stream into media content packets for transmission via a TCP/IP network through a direct wireless connection between circuitry 108 and network 408 or through an intermediate device, such as an electronic device 716. Continuing to 806, MCU 434 controls network transceiver 426 (or transceiver 424 and electronic device 716 or radio device 406) to transmit the media content to a destination device through the network 408 to share the video stream.
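
Step 804 might wrap each captured frame in a small header (sequence number, timestamp, length) before handing the packets to the transceiver; the header layout below is an assumption for illustration, not a format from the disclosure.

    import struct
    import time

    def packetize(frames):
        """Wrap encoded frames in a sequence-number/timestamp/length header."""
        packets = []
        for seq, frame in enumerate(frames):
            header = struct.pack("!IdI", seq, time.time(), len(frame))
            packets.append(header + frame)
        return packets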


In an example, the video stream may be formatted and sent as a live media content stream to the destination device through network 408. In an alternative example, such as when a connection to network 408 is unavailable, circuitry 108 may store the media content in a memory, such as memory 414, and may communicate the media content at a later time, such as when the network connection becomes available or in response to a user selection. In still another example, the video stream may be formatted and provided to an electronic device 716 through a wired or wireless connection, and electronic device 716 can send media content to and/or receive media content from other devices or other circuitry through network 408.


While the above examples are directed largely to a portable optical device implementation configured to capture video, it is also possible to display pre-processed video content on the display of the portable optical device and to provide associated audio content to speaker 417. An example of a method of selectively providing one of the captured video and pre-processed video content to a display is described below with respect to FIG. 9.



FIG. 9 is a flow diagram of an embodiment of a method 900 of selectively providing video content to a visual display. At 902, video of a view area of a portable optical device is captured using circuitry 108, which includes a network transceiver and/or a transceiver configured to communicate through a wired or wireless communication link to electronic device 716, which can communicate with network 408 through a wired or wireless connection. Advancing to 904, circuitry 108 receives user input corresponding to a user-selectable element (such as a button) to select one of the video of the view area and a pre-processed video from a video source for display. In an example, the pre-processed video may be previously recorded video content that was captured by circuitry 108 or video content provided by a manufacturer or other content source. Continuing to 906, circuitry 108 selectively provides one of the video of the view area and the pre-processed video from the video source to a display of the portable optical device in response to the user input. In this example, the user can review stored media content using the display within the portable optical device.
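
A minimal sketch of the selection in method 900 follows, assuming two iterable frame sources and a display object with a show method; these names are assumptions for this example.

    def method_900(live_frames, stored_frames, display, show_live: bool):
        """Route either the live view or stored, pre-processed video to the display."""
        source = live_frames if show_live else stored_frames   # 904: user selection
        for frame in source:                                    # 906: provide to display
            display.show(frame)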


In conjunction with the systems, devices, and methods described above with respect to FIGS. 1-9, circuitry is described that includes optical sensors, a microphone, and a network transceiver. The circuitry is configured to capture video data and audio data associated with a view area and to selectively communicate media content to a destination device through a network, using TCP/IP or other communications protocols. The circuitry enables social networking of media content associated with a view area of a portable optical device, such as a rifle scope, a pair of binoculars, a telescope, or other portable optical device. The media content can be shared in real-time (i.e., live) or at a later time, such as when the network connection becomes available.


Although the present invention has been described with reference to preferred embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the scope of the invention.

Claims
  • 1. An optical scope comprising: a network interface configured to couple to a network; a display; one or more optical sensors configured to capture image data associated with a view area; a processor coupled to the network interface, to the display, and to the one or more optical sensors; and a memory accessible to the processor and configured to store the image data, the memory including instructions that, when executed, cause the processor to: selectively provide at least a portion of the image data to the display; and send media content including the image data to a destination device through the network.
  • 2. The optical scope of claim 1, wherein the destination device comprises a second optical scope.
  • 3. The optical scope of claim 1, wherein the destination device comprises a portable computing device.
  • 4. The optical scope of claim 1, further comprising: a microphone coupled to the processor and configured to capture audio data; and wherein the media content includes the image data and the audio data.
  • 5. The optical scope of claim 4, wherein the processor is configured to send the captured audio data to the destination device through the network and to receive second audio data from the destination device to facilitate bi-directional audio communications.
  • 6. The optical scope of claim 1, wherein the network interface comprises a wireless transceiver.
  • 7. The optical scope of claim 1, wherein the memory further includes instructions that, when executed, cause the processor to: receive second media content from the destination device; and provide image data from the second media content to the display.
  • 8. The optical scope of claim 1, wherein the memory further includes a stored video; and wherein the processor is configured to: retrieve the stored video from the memory; and provide the stored video to the display.
  • 9. A firearm system comprising: a gun; and an optical scope coupled to the gun, the optical scope including: a network interface configured to couple to a network; a microphone configured to capture audio data; a speaker configured to produce sounds; a processor coupled to the network interface, to the microphone, and to the speaker; and a memory accessible to the processor and configured to store instructions that, when executed, cause the processor to: receive audio data from the microphone; send the audio data to a destination device through the network; and receive second audio data from the destination device through the network to facilitate bi-directional communications.
  • 10. The firearm system of claim 9, wherein the optical device further includes one or more optical sensors coupled to the processor and configured to capture image data associated with a view area of the optical device.
  • 11. The firearm system of claim 10, further comprising: a display coupled to the processor; and wherein the memory further includes instructions that, when executed, cause the processor to: selectively provide at least a portion of the image data to the display; and send media content including the image data to the destination device through the network, the media content including the image data and the audio data.
  • 12. The firearm system of claim 9, wherein the destination device comprises a second optical scope.
  • 13. The firearm system of claim 9, wherein the destination device comprises a portable computing device.
  • 14. The firearm system of claim 9, wherein the network interface comprises a wireless transceiver.
  • 15. An optical scope comprising: a network interface configured to couple to a network; a display; an optical sensor configured to capture image data associated with a view area; a processor coupled to the network interface, to the display, and to the optical sensor; and a memory accessible to the processor and configured to store the image data, the memory including instructions that, when executed, cause the processor to: provide at least a portion of the image data to the display; and selectively transmit the image data to a destination device through the network.
  • 16. The optical scope of claim 15, further comprising: a microphone coupled to the processor and configured to capture audio data; and wherein the processor is further configured to transmit media content including the image data and the audio data to the destination device.
  • 17. The optical scope of claim 16, wherein the processor is configured to send the captured audio data to the destination device through the network and to receive second audio data from the destination device to facilitate bi-directional audio communications.
  • 18. The optical scope of claim 15, wherein the network interface comprises a wireless transceiver.
  • 19. The optical scope of claim 15, wherein the memory further includes instructions that, when executed, cause the processor to: receive second media content from the destination device; and provide image data from the second media content to the display.
  • 20. The optical scope of claim 15, wherein the memory further includes a stored video; and wherein the processor is configured to: retrieve the stored video from the memory; and provide the stored video to the display.
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application is a continuation of and claims priority to U.S. patent application Ser. No. 13/360,545 filed on Jan. 27, 2012 and entitled “Rifle Scope, Portable Telescope, and Binocular Display Device Including a Network Transceiver,” which is incorporated herein by reference in its entirety.

Continuations (1)
Number Date Country
Parent 13360545 Jan 2012 US
Child 16105516 US