The present disclosure relates generally to transmission of both a still image and a video stream and, more particularly, to a system that allows transmission of a still image while maintaining a video stream from an aircraft to a ground station.
Just as the Vietnam War is now remembered as the helicopter war, the current conflicts in Iraq and Afghanistan may be remembered for the use of unmanned drones or unmanned aerial vehicles (UAVs). Drones may facilitate remote intelligence gathering, alleviating the need for foot soldiers to enter hostile areas “blind,” with little or no information about the location and strength of hostile forces. Drones may also provide close combat support, such as identifying and eliminating targets of interest, alleviating the need to expose soldiers and/or airmen to potential small arms fire, mortars, rocket-propelled grenades, roadside bombs, anti-aircraft weaponry, missiles, and other dangers.
Identification of targets and reconnaissance typically involve analyzing video images acquired from cameras carried by the drones. Such cameras may maintain a real-time video feed that tracks targets as they move or change over a long period of time. Since video involves sending multiple still frame images from a camera each second, streaming video requires a great deal of bandwidth. Maintaining such a large bandwidth is a challenge both for the aircraft video systems that must process and stream the raw video data and for the ground stations that have limited bandwidth to receive the video feed. One tradeoff made to address these concerns is to degrade video quality by lowering the resolution (e.g., the number of pixels) and/or reducing the frame rate in order to decrease the required bandwidth. Thus, a video feed allows a remote operator to follow a target, but it does not provide a high-resolution image of the target for detailed analysis.
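For a rough sense of the scale involved, the sketch below estimates the raw bandwidth of an uncompressed video feed and shows how lowering the resolution and frame rate trades fidelity for throughput. The specific frame sizes, pixel depth, and frame rates are illustrative assumptions and are not figures from this disclosure.

```python
def raw_video_bandwidth_mbps(width, height, bytes_per_pixel, fps):
    """Raw (uncompressed) video bandwidth in megabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

# Illustrative numbers only: a 1280x720 feed at 30 frames per second versus
# a reduced 640x480 feed at 10 frames per second.
full = raw_video_bandwidth_mbps(1280, 720, 2, 30)     # ~442 Mbit/s before compression
reduced = raw_video_bandwidth_mbps(640, 480, 2, 10)   # ~49 Mbit/s before compression
print(f"full: {full:.0f} Mbit/s, reduced: {reduced:.0f} Mbit/s")
```

Even with video compression reducing these figures by one to two orders of magnitude, the gap illustrates why resolution and frame rate are the levers used to fit a live feed into a constrained downlink.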
Thus, there is a need for better image transmission from unmanned aerial vehicles.
Aspects of the present disclosure include a system for transmitting still images and a video feed to a remote location. The system includes an aircraft having a digital video camera to capture still images and video frames of an object. A video encoder is coupled to the camera to provide a video output including video packets. A file server is coupled to the camera to provide a still image output including image data packets. A multiplexer is coupled to the video output and the still image output. The multiplexer produces a data transmission including the video packets and the image data packets. A transmitter sends the data transmission to the remote location.
Another example is a system for receiving a combined data transmission of video stream packets and image data packets associated with a still image sent from an aircraft. The system includes a receiver for receiving a multiplexed data transmission including video stream packets and image data packets. A demultiplexer is coupled to the receiver. The demultiplexer separates the video stream packets and the image data packets. A video decoder is coupled to the demultiplexer to assemble the video packets to produce a video stream. A combiner is coupled to the demultiplexer to combine the image data packets to form a still image.
Another example is a method of transmitting a still image in a video data transmission. A still image is captured via a camera. A video stream is captured via the camera. The still image is converted into a plurality of image data packets. The video stream is converted into a plurality of video image packets. The image data packets and video image packets are combined into a data transmission. The combined transmission is sent to a remote receiver. The combined transmission is received at the remote receiver. The combined transmission is demultiplexed into the plurality of image data packets and video image packets. The video image packets are decoded into a video stream. The image data packets are combined into the still image.
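As a concrete illustration of this flow, the following sketch tags each packet with a one-byte type marker so that the receiver can route video packets to a decoder and image data packets to a combiner. The tag values and the simple round-robin interleaving are illustrative assumptions, not the multiplexing scheme specified by the disclosure.

```python
from itertools import zip_longest

VIDEO, IMAGE = 0x01, 0x02   # illustrative packet-type tags

def multiplex(video_packets, image_packets):
    """Tag and interleave video and image packets into one combined transmission."""
    stream = []
    for v, i in zip_longest(video_packets, image_packets):
        if v is not None:
            stream.append(bytes([VIDEO]) + v)
        if i is not None:
            stream.append(bytes([IMAGE]) + i)
    return stream

def demultiplex(stream):
    """Separate the combined transmission back into video and image packets."""
    video, image = [], []
    for packet in stream:
        (video if packet[0] == VIDEO else image).append(packet[1:])
    return video, image

video_out, image_out = demultiplex(multiplex([b"v0", b"v1"], [b"i0"]))
assert video_out == [b"v0", b"v1"] and image_out == [b"i0"]
```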
Another example is a system for transmitting data in a first format and data in a second format to a remote location. The system includes an aircraft having a first sensor to capture data in a first format and a second sensor to capture data in a second data format. A multiplexer is coupled to the first and second sensors. The multiplexer produces a data transmission including the packets of data in the first format and the packets of data in the second format. A transmitter sends the data transmission to the remote location.
The foregoing and additional aspects and implementations of the present disclosure will be apparent to those of ordinary skill in the art in view of the detailed description of various embodiments and/or aspects, which is made with reference to the drawings, a brief description of which is provided next.
The foregoing and other advantages of the present disclosure will become apparent upon reading the following detailed description and upon reference to the drawings.
While the invention is susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. It should be understood, however, that the invention is not intended to be limited to the particular forms disclosed. Rather, the invention is intended to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
Each of the aircraft 102, 104, and 106 in
In this example, the hub 118 includes a memory device for storing still images acquired from the aircraft 102 as well as mission data for programming flights for the aircraft 102. The hub 118 also provides a connector interface for cables coupled to a portable computer 130 and a hand controller 140. The hand controller 140 receives the analog video feed from the transceiver 114 via the hub 118. Of course, digital video data may also be sent to the hand controller 140 from the transceiver 114. As will be explained below, the portable computer 130 includes a display 132 and stored machine instructions to process both video and still images from the aircraft 102, 104, and 106 via the signals received by the transceiver 114 and to display the video or still images on the display 132.
The hand controller 140 includes a display 142 that displays video from the aircraft for purposes of piloting the aircraft or showing real-time video when the aircraft 102 is in automatic flight mode. The hand controller 140 includes a joystick 144 that may be used to control the aircraft or the positioning of a camera on board the aircraft to acquire video or still images. The hand controller 140 includes a throttle switch 146 that controls the altitude of the aircraft, a multi-function switch 148, and an enter key 150 to assist in controlling the aircraft 102 in the manual piloting mode.
In video mode, the camera 310 converts captured images to raw digital data frames that are output to a video encoder 312. The video encoder 312 is coupled to a video buffer 314. The camera 310 captures still images at a higher resolution, and these are sent to a file packet server 316. The file packet server 316 divides the captured still image pixel data into data blocks, since an image at the desired resolution contains a relatively large amount of image data. In this example, the video encoder 312 is an ASIC coupled to the output of the camera 310. The on-board image system 302 includes a packet multiplexer 320. The packet multiplexer 320 has an image file input 322, a video stream input 324, and a multiplexed output 326. The image file input 322 is coupled to the file packet server 316, and the video stream input 324 is coupled to the video encoder 312 and the video buffer 314. In this example, an FPGA is configured as the video buffer 314, the file packet server 316, and the multiplexer 320. Of course, other hardware such as ASICs, general processors, or DSPs may be used instead of the FPGA. Each of the separate components 314, 316, and 320 may be on a separate chip, or any combination may be on the same chip.
The multiplexed output 326 of the multiplexer 320 is coupled to a data link 330, which may be a receiver/transceiver in communication with the ground control station 110 in
The ground processing system 304 includes a data link 350, which is coupled to a packet demultiplexer 352. In this example, the data link 350 is a receiver/transmitter device such as the transceiver 114 in
The file transfer packet 342 includes a header field 412, a file ID field 414, a block ID field 416, a data field 418, and a CRC field 420. In this example, the header field 412 is 4 bytes and is made up of the file ID field 414 and the block ID field 416. The file ID field 414 is 2 bytes and identifies the particular file or separate image that the block belongs to. The block ID field 416 is 2 bytes and is indicative of the location of the block within the overall image. The data field 418 is 176 bytes and includes the image data for the block. The CRC field 420 is 2 bytes long and is used as a checksum to validate the data.
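Given the field layout just described, a minimal sketch of building and parsing such 182-byte packets might look like the following. The big-endian field order and the choice of CRC-CCITT (via Python's binascii.crc_hqx) are illustrative assumptions; the disclosure specifies only the field widths.

```python
import binascii
import struct

PACKET_FORMAT = ">HH176sH"   # file ID, block ID, 176-byte data block, CRC
DATA_BYTES = 176

def build_packet(file_id, block_id, data):
    """Build one 182-byte file transfer packet from a block of image data."""
    data = data.ljust(DATA_BYTES, b"\x00")   # pad the final, short block
    crc = binascii.crc_hqx(data, 0)          # 16-bit checksum over the data field
    return struct.pack(PACKET_FORMAT, file_id, block_id, data, crc)

def parse_packet(packet):
    """Unpack a packet and verify its checksum."""
    file_id, block_id, data, crc = struct.unpack(PACKET_FORMAT, packet)
    if binascii.crc_hqx(data, 0) != crc:
        raise ValueError("CRC mismatch")
    return file_id, block_id, data

def split_image(file_id, image_bytes):
    """Divide a still image into numbered blocks, one packet per block."""
    blocks = [image_bytes[i:i + DATA_BYTES]
              for i in range(0, len(image_bytes), DATA_BYTES)]
    return [build_packet(file_id, n, block) for n, block in enumerate(blocks)]

packets = split_image(file_id=7, image_bytes=b"\x10" * 500)
assert len(packets) == 3 and all(len(p) == 182 for p in packets)
assert parse_packet(packets[2])[1] == 2   # block ID of the third block
```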
The present system allows the transmission of high resolution still images during the transmission of a video stream without having to interrupt the video stream to wait for the download of a still image. As explained above, the system 300 takes data packets from both a video stream and a still image and combines them into a multiplexed data transmission 340 to the ground control station 110.
In operation, the camera 310 in the aircraft 102 will always send a video feed to the ground control station 110. A user may send commands via the transceiver 114 to the aircraft 102 to take a still image or images from the camera 310 in
A user may decide how much of the bandwidth to share between the video stream and the acquired still image. If the image is a priority, the user may prioritize the file packets, and the multiplexer 320 may then be controlled to accept more packets from the file packet server 316 in order to send the image at a faster rate. If the image reception is of lower importance, the user may control the multiplexer 320 to send fewer file image packets and share the transmission bandwidth more equitably between downloading the image and transmitting the video stream. One example of a control for the ratio of video packets to image packets is a slider control interface that ranges between zero percent video frames and 100 percent video frames, but other controls may be used. In this example, at the zero percent setting the ground control station 110 will continue to command minimal sending of video frames at the lowest resolution and lowest frame rate in order to maintain imagery coming from the aircraft, which can aid in controlling the aircraft 102 and in maintaining situational awareness of the real-time activities occurring on the ground. Other examples allow the video transmission to be stopped entirely in order to maximize the transmission rate of the still images. In other examples, the ground controller 110 can also adjust the resolution of the still images to increase the transmission rate of the images and/or to reduce the effect on the video transmission (e.g., to minimize the reduction in either the frame rate or the resolution of the video). Also, it should be noted that the ground controller 110 may control either or both of the video frame rate and the video resolution.
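One way to realize such a ratio control, sketched below under the assumption that the multiplexer draws from separate video and image packet queues, is a credit-based scheduler driven by a video-fraction setting between 0.0 and 1.0. The scheduling rule and the periodic minimum-video slot are illustrative; the disclosure only requires that a minimal video feed be maintained even at the zero percent setting.

```python
def packet_schedule(video_fraction, total_slots, min_video_every=10):
    """Choose a packet source ("video" or "image") for each transmission slot.

    video_fraction: share of slots given to the video queue (0.0 to 1.0).
    Even at 0.0, a video packet is inserted every `min_video_every` slots so
    a minimal, low-rate video feed is always maintained.
    """
    slots, credit = [], 0.0
    for slot in range(total_slots):
        credit += video_fraction
        if credit >= 1.0:
            slots.append("video")
            credit -= 1.0
        elif video_fraction == 0.0 and slot % min_video_every == 0:
            slots.append("video")   # keep a minimal video feed alive
        else:
            slots.append("image")
    return slots

print(packet_schedule(0.25, 8))
# ['image', 'image', 'image', 'video', 'image', 'image', 'image', 'video']
```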
The quality slide control 506 allows the user to move the slide between low quality and high quality for the captured image. The high quality setting sets the image resolution to the maximum number of pixels in each direction and applies the least compression. The low quality setting sets the resolution to a low number of pixels and applies the maximum compression. The priority slide control 508 varies between video and picture. When the priority slide control 508 is set at the video setting, transmission of video packets is given priority; when it is set at the picture setting, the image data packets are given priority.
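The mapping below sketches how a quality slider position might be translated into an image resolution and a compression setting. The specific resolutions, the linear interpolation, and the JPEG quality range are illustrative assumptions; the disclosure states only that high quality means the maximum pixel count with the least compression and low quality the reverse.

```python
def quality_settings(slider, max_size=(4000, 3000), min_size=(640, 480)):
    """Map a 0-100 quality slider position to an image size and a JPEG quality."""
    t = max(0, min(100, slider)) / 100.0
    width = int(min_size[0] + t * (max_size[0] - min_size[0]))
    height = int(min_size[1] + t * (max_size[1] - min_size[1]))
    jpeg_quality = int(10 + t * 85)   # 10 (heaviest compression) to 95 (lightest)
    return (width, height), jpeg_quality

print(quality_settings(100))   # ((4000, 3000), 95)
print(quality_settings(0))     # ((640, 480), 10)
```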
The ground control station 110 may also set up a system of retries. As explained above, each still image file is broken into N 182-byte blocks, or packets, in this example. When the aircraft processing system 302 reports to the ground that it has a file of N blocks, the ground control station 110 may give the file sole priority and halt the sending of video packets until all of the blocks of the still image are received.
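The disclosure leaves the retry bookkeeping open; the sketch below assumes the ground station simply tracks which block IDs of an N-block file have arrived and asks the aircraft to resend the rest. The request structure is a hypothetical illustration.

```python
def missing_blocks(total_blocks, received_block_ids):
    """List the block IDs of an N-block still image that have not yet arrived."""
    return sorted(set(range(total_blocks)) - set(received_block_ids))

def build_retry_request(file_id, total_blocks, received_block_ids):
    """Build a retry request naming the blocks the aircraft should resend."""
    missing = missing_blocks(total_blocks, received_block_ids)
    return {"file_id": file_id, "resend": missing} if missing else None

# A 10-block image in which blocks 3 and 7 were lost in transit.
print(build_retry_request(file_id=1, total_blocks=10,
                          received_block_ids=[0, 1, 2, 4, 5, 6, 8, 9]))
# {'file_id': 1, 'resend': [3, 7]}
```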
The ground control station 110 may also allow an operator to dynamically allocate the bandwidth of the broadcast channel among multiple aircraft. The ground control station 110 may include an arbiter device that decides which aircraft is allocated bandwidth based on predetermined factors such as maximum payload data output. Alternatively, the operator may set the priority used to allocate bandwidth. Such allocation controls are further described in U.S. Publication No. 20110065469.
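A simple proportional arbiter, sketched under the assumption that each aircraft is weighted by its maximum payload data output or by an operator-assigned priority, could divide the broadcast channel as follows. The proportional rule and the identifiers are illustrative and are not the allocation scheme of U.S. Publication No. 20110065469.

```python
def allocate_bandwidth(total_kbps, weights):
    """Split channel bandwidth among aircraft in proportion to their weights."""
    total_weight = sum(weights.values())
    return {uav: total_kbps * w / total_weight for uav, w in weights.items()}

print(allocate_bandwidth(2000, {"uav_102": 3, "uav_104": 1, "uav_106": 1}))
# {'uav_102': 1200.0, 'uav_104': 400.0, 'uav_106': 400.0}
```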
The components noted in
In addition, two or more computing systems or devices may be substituted for any one of the controllers described herein. Accordingly, principles and advantages of distributed processing, such as redundancy, replication, and the like, can also be implemented, as desired, to increase the robustness and performance of the controllers described herein. The controllers may also be implemented on a computer system or systems that extend across any network environment using any suitable interface mechanisms and communications technologies including, for example, telecommunications in any suitable form (e.g., voice, modem, and the like), Public Switched Telephone Networks (PSTNs), Packet Data Networks (PDNs), the Internet, intranets, a combination thereof, and the like.
Although the aircraft 102 in this example has a camera such as the CMOS camera 310, the aircraft 102 may include other types of payloads such as radiation detectors, radar, lidar, air samplers, etc. These sensors all produce different types of data that may be transmitted to a ground control station such as the ground control station 110. Accordingly, such data may also be combined with either still images or video images in the transmission to the ground control station 110 according to the examples described above. The ground control station 110 may accept transmission of data in a first format and data in a second format in a multiplexed data transmission. An aircraft such as the aircraft 102 includes a first sensor to capture data in a first format and a second sensor to capture data in a second data format. The sensors may include diverse sensors such as cameras, radiation detectors, radar, lidar, etc. A multiplexer is coupled to the first and second sensors and produces a data transmission including the packets of data in the first format and the packets of data in the second format. A transmitter on board the aircraft 102 sends the data transmission to the remote location such as the ground control station 110. The ground control station 110 may control the transmission ratio between the data in the first data format and the data in the second data format depending on the desired priority.
The operation of the example image and video combination sequence will now be described with reference to
The video ratio is then input from the ground control station 110 such as via the controls on the control panel shown in
While particular implementations and applications of the present disclosure have been illustrated and described, it is to be understood that the present disclosure is not limited to the precise construction and compositions disclosed herein and that various modifications, changes, and variations may be apparent from the foregoing descriptions without departing from the spirit and scope of the invention as defined in the appended claims.
This application is a continuation of U.S. patent application Ser. No. 13/220,197, filed Aug. 29, 2011, the contents of which are incorporated herein by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
3638502 | Leavitt et al. | Feb 1972 | A |
4217606 | Nordmann | Aug 1980 | A |
4819059 | Pape | Apr 1989 | A |
4855823 | Struhs et al. | Aug 1989 | A |
5153623 | Bouvier | Oct 1992 | A |
5251118 | Budnovitch et al. | Oct 1993 | A |
5383645 | Pedut et al. | Jan 1995 | A |
5424854 | Hashimoto | Jun 1995 | A |
5448568 | Delpuch | Sep 1995 | A |
5537446 | Lakshman | Jul 1996 | A |
5897223 | Tritchew et al. | Apr 1999 | A |
5936245 | Goillot et al. | Aug 1999 | A |
6056237 | Woodland | May 2000 | A |
6147701 | Tamura et al. | Nov 2000 | A |
6226125 | Levy et al. | May 2001 | B1 |
6269078 | Lakshman | Jul 2001 | B1 |
D452697 | Fallowfield et al. | Jan 2002 | S |
6349898 | Leonard | Feb 2002 | B1 |
6366311 | Monroe | Apr 2002 | B1 |
6529620 | Thompson | Mar 2003 | B2 |
6628338 | Elberbaum et al. | Sep 2003 | B1 |
6731331 | Watabe | May 2004 | B1 |
7000883 | Mercadal et al. | Feb 2006 | B2 |
7049953 | Monroe | May 2006 | B2 |
7058721 | Ellison et al. | Jun 2006 | B1 |
7131136 | Monroe | Oct 2006 | B2 |
7173526 | Monroe | Feb 2007 | B1 |
7253398 | Hughes et al. | Aug 2007 | B2 |
7280810 | Feher | Oct 2007 | B2 |
7302323 | Anderson | Nov 2007 | B2 |
7359622 | Monroe et al. | Apr 2008 | B2 |
7400348 | Hoyos | Jul 2008 | B2 |
7526183 | Takahashi | Apr 2009 | B2 |
7561037 | Monroe | Jul 2009 | B1 |
7610841 | Padan | Nov 2009 | B2 |
7634662 | Monroe | Dec 2009 | B2 |
7695647 | Smela et al. | Apr 2010 | B2 |
7747364 | Roy et al. | Jun 2010 | B2 |
7955006 | Harvey | Jun 2011 | B1 |
8091833 | von Flotow et al. | Jan 2012 | B2 |
8137007 | Harvey | Mar 2012 | B1 |
8140200 | Heppe et al. | Mar 2012 | B2 |
8174612 | Koehler | May 2012 | B1 |
D662120 | Deurwaarder | Jun 2012 | S |
8226039 | von Flotow et al. | Jul 2012 | B2 |
D668701 | Ohno et al. | Oct 2012 | S |
8341684 | Wei | Dec 2012 | B2 |
8523462 | Dimotakis | Sep 2013 | B2 |
8559801 | Dimotakis | Oct 2013 | B2 |
8589994 | Monroe | Nov 2013 | B2 |
8767041 | Yun | Jul 2014 | B2 |
8891539 | Ozawa | Nov 2014 | B2 |
9985717 | Alcorn | May 2018 | B2 |
20010043751 | Takahashi | Nov 2001 | A1 |
20030067542 | Monroe | Apr 2003 | A1 |
20030099457 | Takahashi | May 2003 | A1 |
20040026573 | Andersson et al. | Feb 2004 | A1 |
20040068583 | Monroe et al. | Apr 2004 | A1 |
20040173726 | Mercadal et al. | Sep 2004 | A1 |
20040230352 | Monroe | Nov 2004 | A1 |
20050044112 | Yamamoto | Feb 2005 | A1 |
20050204910 | Padan | Sep 2005 | A1 |
20050219639 | Fujise et al. | Oct 2005 | A1 |
20060033288 | Hughes et al. | Feb 2006 | A1 |
20060110155 | Kouchi et al. | May 2006 | A1 |
20060129727 | Park | Jun 2006 | A1 |
20060231675 | Bostan | Oct 2006 | A1 |
20060276942 | Anderson | Dec 2006 | A1 |
20060276943 | Anderson | Dec 2006 | A1 |
20070031151 | Cunningham et al. | Feb 2007 | A1 |
20070042774 | Alcorn | Feb 2007 | A1 |
20080204553 | Thompson | Aug 2008 | A1 |
20080205696 | Thompson | Aug 2008 | A1 |
20080215204 | Roy et al. | Sep 2008 | A1 |
20080267612 | Harvey | Oct 2008 | A1 |
20080277631 | Smela et al. | Nov 2008 | A1 |
20080316313 | Monroe et al. | Dec 2008 | A1 |
20090015674 | Alley et al. | Jan 2009 | A1 |
20090216394 | Heppe et al. | Aug 2009 | A1 |
20090218447 | von Flotow et al. | Sep 2009 | A1 |
20090273671 | Gardner | Nov 2009 | A1 |
20090279490 | Alcorn | Nov 2009 | A1 |
20090284644 | McKaughan et al. | Nov 2009 | A1 |
20100013628 | Monroe | Jan 2010 | A1 |
20100110162 | Yun | May 2010 | A1 |
20100141503 | Baumatz | Jun 2010 | A1 |
20100241931 | Choi et al. | Sep 2010 | A1 |
20100265329 | Doneker | Oct 2010 | A1 |
20100309344 | Zimmer et al. | Dec 2010 | A1 |
20110103021 | Janssen et al. | May 2011 | A1 |
20110154427 | Wei | Jun 2011 | A1 |
20110170556 | Ozawa | Jul 2011 | A1 |
20120104169 | von Flotow et al. | May 2012 | A1 |
20120106800 | Khan et al. | May 2012 | A1 |
20120200703 | Nadir et al. | Aug 2012 | A1 |
20120230423 | Esenlik | Sep 2012 | A1 |
20120287903 | Alcorn | Nov 2012 | A1 |
20120320203 | Liu | Dec 2012 | A1 |
20120322444 | Alcorn | Dec 2012 | A1 |
20120327844 | Alcorn | Dec 2012 | A1 |
20130048792 | Szarek et al. | Feb 2013 | A1 |
20130050486 | Omer et al. | Feb 2013 | A1 |
20130050487 | Omer et al. | Feb 2013 | A1 |
20130051778 | Dimotakis | Feb 2013 | A1 |
20130051782 | Dimotakis | Feb 2013 | A1 |
20130135471 | Giuffrida et al. | May 2013 | A1 |
20130142267 | Esenlik | Jun 2013 | A1 |
20130250057 | Choi | Sep 2013 | A1 |
20140161435 | Dimotakis | Jun 2014 | A1 |
20140192155 | Choi | Jul 2014 | A1 |
20140198850 | Choi | Jul 2014 | A1 |
20140328413 | Esenlik | Nov 2014 | A1 |
20160073086 | Choi | Mar 2016 | A1 |
Number | Date | Country |
---|---|---|
201099352 | Aug 2008 | CN |
101766049 | Jun 2010 | CN |
211058 | Aug 1993 | TW |
I311121 | Jun 2009 | TW |
2013074172 | May 2013 | WO |
2013074173 | May 2013 | WO |
2013074175 | May 2013 | WO |
2013074176 | May 2013 | WO |
2013074177 | May 2013 | WO |
Entry |
---|
Notice of Allowance for U.S. Appl. No. 13/220,619, dated Mar. 6, 2015, 7 pages. |
Notice of Allowance for U.S. Appl. No. 13/967,720, dated Mar. 25, 2015, 7 pages. |
Notice of Allowance in U.S. Appl. No. 13/220,562, dated May 1, 2013, 8 pages. |
Notice of Allowance in U.S. Appl. No. 13/220,617, dated Jun. 10, 2013, 9 pages. |
Office Action for U.S. Appl. No. 13/220,535, dated Dec. 2, 2014, 15 pages. |
Office Action for U.S. Appl. No. 13/220,535, dated Feb. 27, 2014, 11 pages. |
Office Action for U.S. Appl. No. 13/220,619, dated May 13, 2014, 11 pages. |
Office Action for U.S. Appl. No. 13/220,619, dated Oct. 8, 2014, 11 pages. |
Office Action for U.S. Appl. No. 13/220,535, dated Aug. 2, 2013, 10 pages. |
Office Action for U.S. Appl. No. 13/220,535, dated Aug. 1, 2014, 11 pages. |
Office Action for U.S. Appl. No. 13/967,720, dated Oct. 8, 2014, 14 pages. |
Office Action in U.S. Appl. No. 13/220,562, dated Nov. 23, 2012, 10 pages. |
Office Action in U.S. Appl. No. 13/220,617, dated Dec. 4, 2012, 18 pages. |
Office Action in U.S. Appl. No. 13/220,197, dated Nov. 7, 2013, 25 pages. |
Office Action in U.S. Appl. No. 13/220,197, dated Jun. 2, 2014, 32 pages. |
Office Action (Restriction Requirement) for U.S. Appl. No. 13/220,619, dated Dec. 9, 2013, 8 pages. |
PCT International Search Report and Written Opinion in International Application No. PCT/US12/52723, dated May 3, 2013, 10 pages. |
PCT International Search Report and Written Opinion in International Application No. PCT/US12/52725, dated May 3, 2013, 6 pages. |
PCT International Search Report and Written Opinion in International Application No. PCT/US12/52727, dated Mar. 18, 2013, 9 pages. |
PCT International Search Report and Written Opinion in International Application No. PCT/US12/52728, dated Mar. 19, 2013, 14 pages. |
PCT International Search Report and Written Opinion in International Application No. PCT/US12/52729, dated May 13, 2013, 8 pages. |
Taiwanese Office Action and Search Report for Taiwanese Application No. 101130827, dated Aug. 21, 2015, 8 pages. |
Taiwanese Office Action and Search Report for Taiwanese Application No. 101130828, dated Nov. 11, 2014, 24 pages. |
Taiwanese Office Action and Search Report for Taiwanese Application No. 101130829, completed May 14, 2014, 17 pages. |
Taiwanese Decision of Rejection for Taiwanese Application No. 101130829, dated Sep. 29, 2014, 7 pages. |
Taiwanese Office Action and Search Report for Taiwanese Application No. 101130830, dated Oct. 30, 2014, 12 pages. |
Taiwanese Office Action and Search Report for Taiwanese Application No. 101130832, dated Jun. 26, 2015, 18 pages. |
Taiwanese Office Action and Search Report for Taiwanese Application No. 101130827, dated Feb. 12, 2015, 17 pages. |
Office Action, dated Jun. 21, 2016, issued in Taiwanese Patent Application No. 105113973, 18 pages. |
Number | Date | Country | |
---|---|---|---|
20160165290 A1 | Jun 2016 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13220197 | Aug 2011 | US |
Child | 15010445 | US |