Camera ball turret having high bandwidth data transmission to external image processor

Information

  • Patent Grant
  • Patent Number
    11,401,045
  • Date Filed
    Monday, August 29, 2011
  • Date Issued
    Tuesday, August 2, 2022
Abstract
An unmanned aerial vehicle (UAV) includes a fuselage, a gimbal-mounted turret having one or more degrees of freedom relative to the fuselage, a camera disposed in the gimbal-mounted turret for motion therewith in the one or more degrees of freedom, and a central video image processor disposed exteriorly of the gimbal-mounted turret, the central video image processor configured to receive and process image data from the camera.
Description
TECHNICAL FIELD

The present disclosure relates generally to camera-equipped aircraft, for example unmanned aerial vehicles used for surveillance.


BACKGROUND

Aerial surveillance is an invaluable information-gathering tool. In battle settings, it provides intelligence about troop size, location, and movement, damage assessment, and a host of other factors that are critical to successful battle planning and prosecution. Various aircraft-mounted cameras can be used to provide this information in real time, in the form of still or moving (video) images, over a range of spectra, including infrared for penetrating visibility barriers such as haze and for nighttime operation. The cameras can be fixed or movable, individually or collectively, relative to the aircraft. Gimbaled mechanisms effect camera movement and generally comprise a turret typically having two degrees of freedom relative to the aircraft. Motion of the turret-mounted camera can be automated, for example in a preset scanning pattern, or user-actuated, depending on the specific application. For example, the operator can move or zoom the camera to concentrate attention on a particular area of interest, to capture higher resolution images, or to scan over a broad region in order to detect activity that warrants greater scrutiny, either in real time or during subsequent analysis of the images. Information gathered through surveillance can be processed locally, onboard the aircraft, or transmitted to remote operation centers.



FIG. 1 is a bottom view of an aircraft 100 on which a gimbaled turret 102 is mounted. Disposed in the turret is a camera 104 whose mounting provides it with the two degrees of freedom indicated by the pair of double-headed arrows in the drawing.


In addition to conventional manned aircraft, unmanned aerial vehicles, or UAVs, have gained widespread acceptance in the war theater. A primary advantage of UAVs is their pilotless nature, which reduces exposure and risk to human life during operation. The absence of a pilot and other human operators, with their attendant support systems, means the UAV can be made smaller, and payload can be dedicated to other components, such as armament and surveillance equipment. However, as reduced size becomes paramount, more exacting constraints are imposed. Among these are weight and range considerations, which translate to requirements of improved aerodynamics and compactness. For these reasons, UAV-mounted cameras need to be smaller and lighter in order to conserve power and range. Further, because they are mounted on the aircraft's exterior, the cameras need to be designed to present minimal drag or wind resistance.



FIG. 2 is a schematic view of a conventional turret-mounted camera used in a UAV. Details of the camera include main optical components (lenses, etc.) 202, a sensor 204, and a video processing circuit 206. All of these components are mounted within ball turret 102. Processed image information from the camera is delivered from the ball turret 102 to a transmitter (not shown) disposed in the fuselage of the aircraft. The means of transmission between the camera and the transmitter can include cables 208 or other expedients, such as slip rings (not shown), that are designed to eliminate interference with the motion of the ball turret while contending with the large throughput of information necessary to support high resolution still or moving images. Transmission from the fuselage to the ground station is wireless, for example via RF.
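
For contrast with the embodiments described later, the conventional arrangement of FIG. 2 can be sketched as follows, with the video processing circuit 206 living inside the turret. This is an illustrative sketch only; the class and method names are assumptions, not taken from the disclosure.

```python
class ConventionalBallTurret:
    """Prior-art arrangement: video processing happens inside the ball turret."""

    def __init__(self, optics, sensor, video_processor):
        self.optics = optics                    # 202: lenses, etc.
        self.sensor = sensor                    # 204: image sensor
        self.video_processor = video_processor  # 206: mounted in the turret

    def frame_for_transmitter(self):
        raw = self.sensor.read()
        # The heavy video processing is done here, in the turret, before the
        # result travels over cables 208 (or slip rings) to the fuselage
        # transmitter.
        return self.video_processor.process(raw)
```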


SUMMARY

As described herein, an unmanned aerial vehicle (UAV) includes a fuselage, a gimbal-mounted turret having one or more degrees of freedom relative to the fuselage, a camera disposed in the gimbal-mounted turret for motion therewith in the one or more degrees of freedom, and a central video image processor disposed exteriorly of the gimbal-mounted turret, the central video image processor configured to receive and process image data from the camera.


Also as described herein, a surveillance method includes capturing image information using a gimbaled camera mounted in a turret exterior to an aircraft fuselage, transmitting the captured image information to a central image processor disposed in the aircraft fuselage, and processing the transmitted captured image information in the central image processor.


Also as described herein, a device includes means for capturing image information using a gimbaled camera mounted in a turret exterior to an aircraft fuselage, means for transmitting the captured image information to a central image processor disposed in the aircraft fuselage, and means for processing the transmitted captured image information in the central image processor.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate one or more examples of embodiments and, together with the description of example embodiments, serve to explain the principles and implementations of the embodiments.


In the drawings:



FIG. 1 is a bottom view of a conventional manned aircraft having a gimbaled ball turret in which a camera is mounted;



FIG. 2 is a schematic view of a conventional gimbaled turret-mounted camera with some details thereof;



FIG. 3 is a block diagram of the system architecture for a UAV in accordance with one embodiment described herein;



FIG. 4 is a block diagram showing a centralized image capture approach;



FIG. 5 is a flow diagram of a surveillance process; and



FIG. 6 is a schematic diagram showing the surveillance operation of a multi-camera UAV in communication with a remote base station.





DESCRIPTION OF EXAMPLE EMBODIMENTS

Example embodiments are described herein in the context of a camera ball turret having high bandwidth data transmission to an external image processor. Those of ordinary skill in the art will realize that the following description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will now be made in detail to implementations of the example embodiments as illustrated in the accompanying drawings. The same reference indicators will be used to the extent possible throughout the drawings and the following description to refer to the same or like items.


In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals, such as compliance with application- and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of engineering for those of ordinary skill in the art having the benefit of this disclosure.


In accordance with this disclosure, the components, process steps, and/or data structures described herein may be implemented using various types of operating systems, computing platforms, computer programs, and/or general purpose machines. In addition, those of ordinary skill in the art will recognize that devices of a less general purpose nature, such as hardwired devices, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), or the like, may also be used without departing from the scope and spirit of the inventive concepts disclosed herein. Where a method comprising a series of process steps is implemented by a computer or a machine and those process steps can be stored as a series of instructions readable by the machine, they may be stored on a tangible medium such as a computer memory device (e.g., ROM (Read Only Memory), PROM (Programmable Read Only Memory), EEPROM (Electrically Erasable Programmable Read Only Memory), FLASH Memory, Jump Drive, and the like), magnetic storage medium (e.g., tape, magnetic disk drive, and the like), optical storage medium (e.g., CD-ROM, DVD-ROM, paper card, paper tape and the like) and other types of program memory.



FIG. 3 is a block diagram of the system architecture for a UAV in accordance with one embodiment. The dashed line 300 demarcates the separation between the turret payload 302 and the aircraft or fuselage payload 304. The turret payload 302 includes the camera optics, which are not shown for simplicity. The turret payload 302 also includes the gimbal mechanism 306, responsible for moving the camera optics through a range of azimuthal and elevational angles. Motion is effected using pan and tilt microprocessors and motors 308 and 310 in conjunction with angular sensors 312 that provide feedback and control for these mechanisms. EO (electro-optical) and IR (infrared) detectors 314 provide sensed information to a circuit 316, comprising an FPGA for instance, for serializing the data and for interfacing with the detectors, for example providing instructions regarding the size and orientation of frames to be grabbed, commands for AGC (automatic gain control) measurement, track or stab (stabilization) offsets, and synchronization signals. Track and stab offsets are parameters that affect the region of the detectors that is captured and returned in video, and are commands that can be generated in video processing on the fuselage. Stabilization is an approach for matching a frame with a previous frame in order to remove unintentional movements, with the effect of keeping objects in the video stationary.
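
To make the stab-offset concept concrete, the following is a minimal sketch of how fuselage-side video processing might derive a stabilization offset by matching a block of the current frame against the previous one. It is illustrative only: the function name, template size, and exhaustive block search are assumptions, not the implementation of this disclosure.

```python
import numpy as np

def estimate_stab_offset(prev_frame, curr_frame, template_size=64, search_radius=8):
    """Return a (dy, dx) stab offset that keeps scene objects stationary.

    Frames are assumed to be 2-D grayscale arrays comfortably larger than
    the template plus the search window.
    """
    h, w = prev_frame.shape
    cy, cx = h // 2, w // 2
    t = template_size // 2
    # Template: a central block of the previous frame.
    template = prev_frame[cy - t:cy + t, cx - t:cx + t].astype(np.float64)

    best_err, best_off = np.inf, (0, 0)
    # Exhaustive search over a small window of candidate shifts.
    for dy in range(-search_radius, search_radius + 1):
        for dx in range(-search_radius, search_radius + 1):
            patch = curr_frame[cy - t + dy:cy + t + dy,
                               cx - t + dx:cx + t + dx].astype(np.float64)
            err = np.mean((patch - template) ** 2)  # mean squared difference
            if err < best_err:
                best_err, best_off = err, (dy, dx)

    # Negate: if the scene moved by (dy, dx), capture a detector region
    # offset by (-dy, -dx) so objects stay stationary in the returned video.
    return -best_off[0], -best_off[1]
```

In the architecture of FIG. 3, an offset computed this way on the fuselage would be sent back as one of the track/stab offset commands that circuit 316 uses when selecting the detector region to capture.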


The output of circuit 316 is transmitted out of the turret into the aircraft. The turret payload 302 can contain other hardware and circuit components for operating the camera, such as for manipulation and control of frame capture, display orientation, scale, format (Bayer, monochrome), image stabilization/tracking, AGC measurement, track or stab offsets, and synchronization signals. However, the bulk of the video image processing is performed not by circuitry in the turret payload 302, as in prior art approaches, but by circuits that are disposed in the aircraft itself, as part of the aircraft or fuselage payload 304. This reduces the weight of the turret, its size, and the commensurate drag, and also reduces the amount of heat generated in the turret, where only limited heat management measures are available, particularly because the turret must be waterproofed against weather exposure. In addition, the size and power (and heat) of the motors required to actuate the turret in the various degrees of freedom are reduced, because the weight and the size of the turret are reduced. The reduction in weight reduces the inertia of the turret, so a lighter turret can be turned as fast by smaller motors, or faster by the same motors. In addition, costs are reduced by piggy-backing some or all of the video processing onto existing electronics in the aircraft, eliminating redundant components previously found in both the aircraft and the turret. Further, by centralizing the video image processing in one location in the aircraft, data from cameras other than those on the turret can be delivered to the centralized location for processing, reducing redundancy, cost, and weight, and possibly drag, even more. Because the UAV is battery operated, these reductions directly impact the airtime (e.g., flight mission time) and performance of the aircraft and are critical, outcome-determinative factors.

The centralized approach is depicted in the block diagram of FIG. 4, which shows a central image processor collecting and processing information from multiple cameras, including a landing camera and surveillance cameras 1 and 2. Only a single, central image processor is required for these multiple cameras, compounding the size, weight, and cost savings. Another advantage of this approach is the standardization of the image processing, enabling interchangeability of the optics of each of the multiple cameras. Thus the optical components of the camera in the turret, for example, can be readily swapped out for more specialized optical functionality (higher power magnification or zoom functionality, for instance) without the need to also replace the image processing circuitry, or to reconfigure the processing algorithms and protocols, as would be necessary if the image processing circuitry had to be replaced along with the optics. It should be noted that a surveillance camera and a landing camera may not be operative at the same time, as the tasks associated with each are different and separate. Therefore, separate video processing functionality at each camera may not be necessary, and use of the shared processing can be staggered over time between the two types of cameras. This is also true of the IR and EO cameras, which are often not operated simultaneously; their use of processing equipment can likewise be staggered so that both can share the same equipment.
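
The staggered sharing described above can be pictured as a single processing pipeline whose ownership changes with the mission phase. The sketch below is a loose illustration under that assumption; the class names, source labels, and stage list are invented for the example and are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Frame:
    source: str   # e.g. "turret_eo", "turret_ir", "landing"
    pixels: bytes

class CentralImageProcessor:
    """One shared processing path serving several cameras in the fuselage."""

    def __init__(self, stages: list[Callable[[Frame], Frame]]):
        self.stages = stages       # single pipeline shared by all cameras
        self.active_source = None  # only one source owns the pipeline at a time

    def select_source(self, source: str) -> None:
        # Staggering: e.g. the landing camera takes over from a surveillance
        # camera during the landing phase of the mission.
        self.active_source = source

    def process(self, frame: Frame) -> Frame | None:
        if frame.source != self.active_source:
            return None            # frames from inactive cameras are ignored
        for stage in self.stages:
            frame = stage(frame)
        return frame
```

During cruise, the pipeline would be assigned to a surveillance source and later reassigned (e.g., `select_source("landing")`) for the landing phase, so no camera needs its own dedicated processor.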


Returning to FIG. 3, central video image processor 318 is shown configured to receive image data and other information from EO and IR detectors 314, delivered by way of physical USB connections 320 and communication channels 322. The communication channels 322 may be selected from various types of connections, including but not limited to twisted pair conductors, coaxial cables, slip rings, or even wireless connections. The type of connection used may be a function of the data bandwidth and the amount of compression that is applied before transmission. Raw, uncompressed data requires minimal processing in the turret and can therefore help reduce the size of the turret. However, the transmission of raw data imposes the highest bandwidth requirements, and the transmission path would be configured accordingly, using coaxial or even optical cables, for example. The received image data and other information is processed by central video image processor 318, whose functions may include, but are not limited to, obtaining raw or partially conditioned data from the detectors 314, obtaining information on how to display individual frames, including rotating and scaling information, performing stabilization and/or tracking, and performing AGC measurements and providing results thereof. Central video image processor 318 is also configured to receive non-image related information, such as that from autopilot microprocessor 324. Some or all of this non-image related information is additionally provided to gimbal 306, augmented with other information relating to aircraft state estimates, body rates, mission information, flight mode information, joystick/control information, DTED (ground elevation data used by the payload to estimate where it should be pointing) information, and camera control data. Modules (not shown) of central video image processor 318 that perform the above and other functionalities can include a de-mosaicing module, a video conditioning module (for color correction, white balance, saturation, and contrast, for instance), an individual frame display information module that provides information on rotating, scaling, and offset, a template matching module for stabilization and tracking, and a video compression module.
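
As a rough illustration of why the choice of channel depends on bandwidth and compression, the arithmetic below estimates the rate of a raw sensor stream and matches it against a few hypothetical link capacities. All numbers (resolution, frame rate, bit depth, and capacities) are assumptions for the example, not values from this disclosure.

```python
# Raw video rate in Mbit/s for an uncompressed sensor stream.
def raw_video_rate_mbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e6

raw = raw_video_rate_mbps(1280, 720, 30, 12)   # hypothetical 12-bit Bayer sensor
print(f"raw sensor stream: {raw:.0f} Mbit/s")  # ~332 Mbit/s

# Hypothetical capacities for the turret-to-fuselage channel 322.
links_mbps = {"twisted pair": 200, "USB 2.0": 480, "coaxial": 1500}
for name, capacity in sorted(links_mbps.items(), key=lambda kv: kv[1]):
    if capacity >= raw:
        print(f"smallest link that carries the uncompressed stream: {name}")
        break
```

Compressing in the turret relaxes the link requirement but adds processing, size, and heat there, which is exactly the trade-off the passage above describes.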


A surveillance method 500 in accordance with one embodiment is described with reference to FIGS. 5 and 6. In this method, at 502, a gimbaled camera 600 (FIG. 6) mounted in a turret exterior to an aircraft fuselage 602 is used to capture image information. The image information is then transmitted, at 504, to a central image processor 604 disposed in the aircraft fuselage, where it is processed. Optionally, the processed image information is then transmitted to a remote base station 606 at 506. A second, landing camera 608 also transmits its image information to central image processor 604 and, optionally, to remote base station 606. Although the landing camera is shown on the underside of the fuselage, it could be disposed in other locations, for example the top side of the fuselage, for UAVs configured to land “upside down.”
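
The flow of FIG. 5 can be summarized as three steps. The sketch below assumes simple capture/process/transmit interfaces; all of the names are hypothetical.

```python
def surveillance_step(gimbaled_camera, central_processor, base_station=None):
    frame = gimbaled_camera.capture()             # 502: capture in the turret
    processed = central_processor.process(frame)  # 504: transmit to, and process
                                                  #      in, the fuselage
    if base_station is not None:                  # 506: optional downlink to the
        base_station.transmit(processed)          #      remote base station
    return processed
```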


While embodiments and applications have been shown and described, it would be apparent to those skilled in the art having the benefit of this disclosure that many more modifications than mentioned above are possible without departing from the inventive concepts disclosed herein. The invention, therefore, is not to be restricted except in the spirit of the appended claims.

Claims
  • 1. An unmanned aerial vehicle (UAV) comprising: a fuselage having a fuselage payload including: a central video image processor configured to receive and process image data, the central video image processor including: a de-mosaicing module, a video conditioning module configured to modify at least one of color correction, white balance, saturation, and contrast, an individual frame display information module configured to provide information on at least one of rotating, scaling, and offset, and a template matching module for at least one of stabilization, tracking, and video compression; and an autopilot microprocessor coupled to the central video image processor to relay non-image related information; and a gimbal-mounted turret having a turret payload that is separate from the fuselage payload, the gimbal-mounted turret coupled to the fuselage such that the turret payload has one or more degrees of freedom relative to the fuselage payload, the turret payload including: a camera coupled to the central video image processor to supply the image data; one or more motors; one or more angular sensors; and a motion microprocessor coupled to the autopilot microprocessor and the one or more angular sensors, the motion microprocessor further coupled to the one or more motors to effect motion of the camera, wherein motion of the camera with respect to the fuselage payload that is effected by the one or more motors results in motion of the turret payload with respect to the central video image processor.
  • 2. The UAV of claim 1, further comprising an additional camera coupled to the fuselage such that the additional camera is external to the turret payload, the additional camera being coupled to the central video image processor to supply additional image data, the central video image processor configured to receive and process the additional image data from the additional camera.
  • 3. The UAV of claim 1, wherein the additional camera is a landing camera.
  • 4. A surveillance method comprising: capturing image information using a gimbaled camera mounted in a turret payload exterior to an aircraft fuselage; transmitting the captured image information to a central image processor disposed in the aircraft fuselage, wherein transmitting the captured image information to the central image processor includes passing the captured image information from the turret payload to the aircraft fuselage, and wherein the central video image processor includes: a de-mosaicing module, a video conditioning module configured to modify at least one of color correction, white balance, saturation, and contrast, an individual frame display information module configured to provide information on at least one of rotating, scaling and offset, and a template matching module for at least one of stabilization, tracking, and video compression; relaying non-image related information from an autopilot microprocessor disposed in the aircraft fuselage to the central video image processor; processing the transmitted captured image information in the central image processor; and moving the turret payload with respect to the aircraft fuselage using one or more motors positioned in the turret payload, wherein moving the turret payload with respect to the fuselage includes moving the turret payload with respect to the central image processor, and wherein moving the turret payload includes receiving feedback data from one or more angular sensors in the turret payload and controlling the one or more motors using a motion microprocessor in the turret payload based on the feedback data.
  • 5. The method of claim 4, further comprising transmitting information processed by the central image processor to a remote location.
  • 6. The method of claim 4, further comprising: capturing additional image information using an additional camera, the additional camera being mounted exteriorly of the fuselage; and transmitting the captured additional image information from the additional camera to the central image processor, wherein transmitting the captured additional image information includes passing the captured additional information from the additional camera into the aircraft fuselage.
  • 7. A device comprising: means for capturing image information using a gimbaled camera mounted in a turret payload exterior to an aircraft fuselage; means for transmitting the captured image information to a central image processor disposed in the aircraft fuselage, wherein the means for transmitting the captured image information to the central image processor includes a means for passing the captured image information from the turret payload to the aircraft fuselage, and wherein the central video image processor includes: a de-mosaicing module, a video conditioning module configured to modify at least one of color correction, white balance, saturation, and contrast, an individual frame display information module configured to provide information on at least one of rotating, scaling and offset, and a template matching module for at least one of stabilization, tracking, and video compression; means for relaying non-image related information from an autopilot microprocessor disposed in the aircraft fuselage to the central video image processor; means for processing the transmitted captured image information in the central image processor; and means for effecting motion of the turret payload with respect to the aircraft fuselage, wherein the motion of the turret payload with respect to the aircraft fuselage includes motion of the turret payload with respect to the central image processor, and wherein moving the turret payload includes receiving feedback data from one or more angular sensors in the turret payload and controlling the one or more motors using a motion microprocessor in the turret payload based on the feedback data.
  • 8. The device of claim 7, further comprising means for transmitting information processed by the central image processor to a remote location.
  • 9. The device of claim 7, further comprising: means for capturing additional image information using an additional camera mounted exteriorly of the fuselage; and means for transmitting the captured additional image information from the additional camera to the central image processor, wherein the means for transmitting the captured additional image information includes a means for passing the captured additional information from the additional camera into the aircraft fuselage.
  • 10. The UAV of claim 2, wherein the additional camera is coupled to the fuselage external to the fuselage payload.
  • 11. The method of claim 6, wherein the additional camera is mounted exteriorly of the fuselage such that the additional camera is external to the turret payload.
  • 12. The device of claim 9, wherein the additional camera is mounted exteriorly of the fuselage such that the additional camera is external to the turret payload.
  • 13. The UAV of claim 1, wherein the gimbal-mounted turret is waterproofed.
  • 14. The method of claim 4, wherein the turret payload is contained within a waterproofed turret.
  • 15. The device of claim 7, wherein the turret payload is contained within a waterproofed turret.
US Referenced Citations (85)
Number Name Date Kind
3638502 Leavitt et al. Feb 1972 A
4217606 Nordmann Aug 1980 A
4855823 Struhs et al. Aug 1989 A
5153623 Bouvier Oct 1992 A
5251118 Budnovitch et al. Oct 1993 A
5383645 Pedut et al. Jan 1995 A
5897223 Tritchew et al. Apr 1999 A
5936245 Goillot et al. Aug 1999 A
6056237 Woodland May 2000 A
6147701 Tamura et al. Nov 2000 A
6226125 Levy et al. May 2001 B1
D452697 Fallowfield et al. Jan 2002 S
6366311 Monroe Apr 2002 B1
6529620 Thompson Mar 2003 B2
6628338 Elberbaum et al. Sep 2003 B1
7000883 Mercadal et al. Feb 2006 B2
7049953 Monroe May 2006 B2
7058721 Ellison et al. Jun 2006 B1
7131136 Monroe Oct 2006 B2
7173526 Monroe Feb 2007 B1
7253398 Hughes et al. Aug 2007 B2
7280810 Feher Oct 2007 B2
7359622 Monroe et al. Apr 2008 B2
7400348 Hoyos Jul 2008 B2
7526183 Takahashi et al. Apr 2009 B2
7561037 Monroe Jul 2009 B1
7634662 Monroe Dec 2009 B2
7695647 Smela et al. Apr 2010 B2
7747364 Roy et al. Jun 2010 B2
7955006 Harvey Jun 2011 B1
8091833 von Flotow et al. Jan 2012 B2
8137007 Harvey Mar 2012 B1
8140200 Heppe et al. Mar 2012 B2
8174612 Koehler May 2012 B1
D662120 Deuwaarder Jun 2012 S
8226039 von Flotow et al. Jul 2012 B2
D668701 Ohno et al. Oct 2012 S
8523462 Dimotakis Sep 2013 B2
8559801 Dimotakis Oct 2013 B2
8589994 Monroe Nov 2013 B2
8767041 Yun et al. Jul 2014 B2
8891539 Ozawa Nov 2014 B2
20010043751 Takahashi et al. Nov 2001 A1
20020145678 Suzuki Oct 2002 A1
20030067542 Monroe Apr 2003 A1
20030099457 Takahashi et al. May 2003 A1
20040026573 Andersson et al. Feb 2004 A1
20040068583 Monroe et al. Apr 2004 A1
20040173726 Mercadal et al. Sep 2004 A1
20040230352 Monroe Nov 2004 A1
20050219639 Fujise et al. Oct 2005 A1
20060033288 Hughes et al. Feb 2006 A1
20060110155 Kouchi et al. May 2006 A1
20060231675 Bostan Oct 2006 A1
20070031151 Cunningham Feb 2007 A1
20080204553 Thompson Aug 2008 A1
20080205696 Thompson Aug 2008 A1
20080215204 Roy et al. Sep 2008 A1
20080267612 Harvey Oct 2008 A1
20080277631 Smela et al. Nov 2008 A1
20080316313 Monroe et al. Dec 2008 A1
20090015674 Alley et al. Jan 2009 A1
20090216394 Heppe et al. Aug 2009 A1
20090218447 von Flotow et al. Sep 2009 A1
20090273671 Gardner Nov 2009 A1
20090284644 McKaughan et al. Nov 2009 A1
20100013628 Monroe Jan 2010 A1
20100110162 Yun et al. May 2010 A1
20100141503 Baumatz Jun 2010 A1
20100241931 Choi et al. Sep 2010 A1
20100265329 Doneker Oct 2010 A1
20100309344 Zimmer et al. Dec 2010 A1
20110103021 Janssen et al. May 2011 A1
20110170556 Ozawa Jul 2011 A1
20120104169 von Flotow et al. May 2012 A1
20120106800 Khan et al. May 2012 A1
20120200703 Nadir Aug 2012 A1
20120320203 Liu Dec 2012 A1
20130048792 Szarek et al. Feb 2013 A1
20130050486 Omer et al. Feb 2013 A1
20130050487 Omer et al. Feb 2013 A1
20130051778 Dimotakis Feb 2013 A1
20130051782 Dimotakis Feb 2013 A1
20130135471 Giuffrida et al. May 2013 A1
20140161435 Dimotakis Jun 2014 A1
Foreign Referenced Citations (9)
Number Date Country
201099352 Aug 2008 CN
101766049 Jun 2010 CN
211058 Aug 1993 TW
I311121 Jun 2009 TW
WO 2013074172 May 2013 WO
WO 2013074173 May 2013 WO
WO 2013074175 May 2013 WO
WO 2013074176 May 2013 WO
WO 2013074177 May 2013 WO
Non-Patent Literature Citations (29)
Entry
Office Action in U.S. Appl. No. 13/220,617 dated Dec. 4, 2012.
Office Action in U.S. Appl. No. 13/220,562 dated Nov. 23, 2012.
PCT International Search Report and Written Opinion in International Application No. PCT/US12/52727, dated Mar. 18, 2013.
PCT International Search Report and Written Opinion in International Application No. PCT/US12/52728, dated Mar. 19, 2013.
PCT International Search Report and Written Opinion in International Application No. PCT/US12/52723, dated May 3, 2013.
PCT International Search Report and Written Opinion in International Application No. PCT/US12/52725, dated May 3, 2013.
PCT International Search Report and Written Opinion in International Application No. PCT/US12/52729, dated May 13, 2013.
Notice of Allowance in U.S. Appl. No. 13/220,562, dated May 1, 2013.
Notice of Allowance in U.S. Appl. No. 13/220,617, dated Jun. 10, 2013.
Office Action and Search Report in TW Application No. 101130827 dated Feb. 12, 2015, 14 pages.
Notice of Allowance for U.S. Appl. No. 13/967,720, dated Mar. 25, 2015.
Notice of Allowance for U.S. Appl. No. 13/220,619, dated Mar. 6, 2015.
Office Action in U.S. Appl. No. 13/220,197, dated Nov. 7, 2013.
Final Office Action in U.S. Appl. No. 13/220,197, dated Jun. 2, 2014.
Office Action in U.S. Appl. No. 13/220,619, dated May 13, 2014.
Final Office Action in U.S. Appl. No. 13/220,619, dated Oct. 8, 2014.
Office Action and Search Report in Taiwanese Application No. 101130829, dated May 14, 2014.
Office Action in Taiwanese Application No. 101130829, dated Sep. 29, 2014.
Office Action and Search Report in Taiwanese Application No. 101130830 dated Oct. 30, 2014.
Office Action in U.S. Appl. No. 13/967,720, dated Oct. 8, 2014.
Office Action and Search Report in Taiwanese Application No. 101130828, dated Nov. 11, 2014.
Taiwanese Office Action and Search Report for TW 101130832, dated Jun. 26, 2015.
Office Action in U.S. Appl. No. 13/220,535, dated Aug. 2, 2013, 10 pages.
Office Action in U.S. Appl. No. 13/220,619, dated Dec. 9, 2013 (restriction), 8 pages.
Office Action in U.S. Appl. No. 13/220,619, dated Oct. 8, 2014, 11 pages.
Office Action and Search Report in Taiwanese Application No. 101130828, dated Nov. 11, 2014, 24 pages.
Office Action, dated Aug. 21, 2015, in Taiwanese Application No. 101130827.
Office Action, dated Jun. 10, 2016, issued in U.S. Appl. No. 15/010,445, 31 pages.
Office Action, dated Jun. 21, 2016, issued in Taiwanese Patent Application No. 105113973, 8 pages.
Related Publications (1)
Number Date Country
20130050487 A1 Feb 2013 US