The present invention generally relates to unmanned aerial vehicles (UAVs). More specifically, the present invention relates to course profiling and sharing by UAVs.
An unmanned aerial vehicle (UAV)—also commonly called a drone—is a type of aircraft that may be controlled with varying degrees of autonomy or direction by a remote human pilot. UAVs are available in a variety of sizes and configurations, with differing power, maneuverability, and peripheral devices, such as cameras, sensors, radar, sonar, etc. Common uses for UAVs include aerial photography, surveillance, and delivery of a variety of payloads, as well as recreational and hobby usage.
In a recreational context, UAVs may be flown in a variety of races or other types of UAV competitions. Such races and competitions have generally required competitors and their respective drones to compete at the same physical location (e.g., a race course or obstacle course) so that the competitors may face the same conditions and challenges. The requirement that the competitors be in the same location may limit, however, the number of competitors who may fly the course within a reasonable timeframe, as well as add a layer of difficulty and expense to the hobby. Recreating the same conditions across multiple venues in different locations may be difficult, expensive, and impractical.
There is, therefore, a need in the art for improved systems and methods for UAV course profiling and sharing.
Embodiments of the present invention allow unmanned aerial vehicle (UAV) course profiling. A plurality of images may be captured by a UAV flying along a course at a first location. A profile may be constructed for the course based on the images captured by the UAV. In some instances, the course profile may further include virtual elements generated by a virtual reality system at the first location. The constructed course profile is transmitted over a communication network to a virtual reality system at a second location. The virtual reality system may generate a virtual environment corresponding to the course based on the constructed course profile, and a second UAV at the second location may fly along the virtual course.
Various embodiments of the present invention may include systems for UAV course profiling. Such systems may include an unmanned aerial vehicle (UAV) that captures a plurality of images while flying along a course at a first location, a processor that executes instructions stored in memory to construct a profile for the course based on the images captured by the UAV, and a network interface that transmits the constructed course profile over a communication network to a virtual reality system at a second location. The virtual reality system may generate a virtual environment corresponding to the course based on the constructed course profile.
Additional embodiments of the present invention may further include methods for unmanned aerial vehicle (UAV) course profiling. Such methods may include capturing a plurality of images by a UAV flying along a course at a first location, executing instructions stored in memory to construct a profile for the course based on the images captured by the UAV while flying along the course, and transmitting the constructed course profile over a communication network to a virtual reality system at a second location. The virtual reality system may generate a virtual environment corresponding to the course based on the constructed course profile.
Further embodiments of the present invention may further include non-transitory computer-readable storage media, having embodied thereon a program executable by a processor to perform methods for unmanned aerial vehicle (UAV) course profiling as described herein.
In some embodiments, the motors 150 rotate (e.g., their drive shafts spin) about parallel axes. For example, the thrust provided by all propellers 155 can be in the Z direction. Alternatively, a motor 150 can rotate about an axis that is perpendicular (or at any angle that is not parallel) to the axis of rotation of another motor 150. For example, two motors 150 can be oriented to provide thrust in the Z direction (e.g., for use in takeoff and landing) while two motors 150 can be oriented to provide thrust in the X direction (e.g., for normal flight). In some embodiments, UAV 100 can dynamically adjust the orientation of one or more of its motors 150 for vectored thrust.
In some embodiments, the rotation of motors 150 can be configured to create or minimize gyroscopic forces. For example, if there are an even number of motors 150, then half of the motors can be configured to rotate counter-clockwise while the other half can be configured to rotate clockwise. Alternating the placement of clockwise and counter-clockwise motors can increase stability and enable UAV 100 to rotate about the z-axis by providing more power to one set of motors 150 (e.g., those that rotate clockwise) while providing less power to the remaining motors (e.g., those that rotate counter-clockwise).
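The differential-power scheme for yaw described above can be sketched as a simple motor mixer. This is an illustrative sketch only; the motor layout, channel names, and signs are assumptions, not the claimed control implementation:

```python
def mix_motors(throttle, roll, pitch, yaw):
    """Map normalized stick inputs (throttle 0..1, others -1..1) to four
    motor outputs for an X-quad with alternating CW/CCW propellers.

    Assumed layout (top view): front-left (CCW), front-right (CW),
    rear-right (CCW), rear-left (CW). A positive yaw command adds power
    to the CW pair and removes it from the CCW pair, so the airframe
    rotates about the z-axis while total thrust stays roughly constant.
    """
    m_fl = throttle + pitch + roll - yaw   # CCW
    m_fr = throttle + pitch - roll + yaw   # CW
    m_rr = throttle - pitch - roll - yaw   # CCW
    m_rl = throttle - pitch + roll + yaw   # CW
    # Clamp to the valid output range expected by the speed controllers.
    return [max(0.0, min(1.0, m)) for m in (m_fl, m_fr, m_rr, m_rl)]
```

With zero roll, pitch, and yaw, all four motors receive the same power; a yaw command shifts power between the clockwise and counter-clockwise pairs without changing the total.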
Motors 150 can be any combination of electric motors, internal combustion engines, turbines, rockets, etc. In some embodiments, a single motor 150 can drive multiple thrust components (e.g., propellers 155) on different parts of UAV 100 using chains, cables, gear assemblies, hydraulics, tubing (e.g., to guide an exhaust stream used for thrust), etc. to transfer the power.
In some embodiments, motor 150 is a brushless motor and can be connected to electronic speed controller 145. Electronic speed controller 145 can determine the orientation of magnets attached to a drive shaft within motor 150 and, based on the orientation, power electromagnets within motor 150. For example, electronic speed controller 145 can have three wires connected to motor 150, and electronic speed controller 145 can provide three phases of power to the electromagnets to spin the drive shaft in motor 150. Electronic speed controller 145 can determine the orientation of the drive shaft based on back-EMF on the wires or by directly sensing the position of the drive shaft.
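The three-phase drive described above follows the standard six-step (trapezoidal) commutation pattern for brushless motors. A hypothetical sketch of that sequence — each step energizes two windings and leaves the third floating for back-EMF sensing:

```python
# Six-step commutation sequence for a three-phase brushless motor.
# Each tuple is (phase A, phase B, phase C): '+' sources current,
# '-' sinks it, '.' floats (the floating winding is used to sense
# back-EMF and time the next step).
COMMUTATION_STEPS = [
    ("+", "-", "."),
    ("+", ".", "-"),
    (".", "+", "-"),
    ("-", "+", "."),
    ("-", ".", "+"),
    (".", "-", "+"),
]

def next_step(current_step):
    """Advance to the next commutation step, wrapping after six."""
    return (current_step + 1) % len(COMMUTATION_STEPS)
```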
Transceiver 165 can receive control signals from a control unit (e.g., a handheld control transmitter, a server, etc.). Transceiver 165 can receive the control signals directly from the control unit or through a network (e.g., a satellite, cellular, mesh, etc.). The control signals can be encrypted. In some embodiments, the control signals include multiple channels of data (e.g., “pitch,” “yaw,” “roll,” “throttle,” and auxiliary channels). The channels can be encoded using pulse-width-modulation or can be digital signals. In some embodiments, the control signals are received over TCP/IP or a similar networking stack.
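The pulse-width-modulation channel encoding mentioned above can be illustrated with a small decoder; the 1000–2000 microsecond pulse range is the common hobby-radio convention, assumed here for illustration:

```python
def decode_pwm_channel(pulse_us, min_us=1000, max_us=2000):
    """Convert a pulse-width-modulated control pulse (in microseconds)
    into a normalized channel value in [-1.0, 1.0].

    1000 us -> -1.0 (stick low), 1500 us -> 0.0 (center),
    2000 us -> +1.0 (stick high).
    """
    # Clamp out-of-range pulses rather than propagating bad readings.
    pulse_us = max(min_us, min(max_us, pulse_us))
    center = (min_us + max_us) / 2.0
    half_range = (max_us - min_us) / 2.0
    return (pulse_us - center) / half_range
```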
In some embodiments, transceiver 165 can also transmit data to a control unit. Transceiver 165 can communicate with the control unit using lasers, light, ultrasonic, infrared, Bluetooth, 802.11x, or similar communication methods, including a combination of methods. Transceiver 165 can communicate with multiple control units at a time.
Position sensor 135 can include an inertial measurement unit for determining the acceleration and/or the angular rate of UAV 100, a GPS receiver for determining the geolocation and altitude of UAV 100, a magnetometer for determining the surrounding magnetic fields of UAV 100 (for informing the heading and orientation of UAV 100), a barometer for determining the altitude of UAV 100, etc. Position sensor 135 can include a land-speed sensor, an air-speed sensor, a celestial navigation sensor, etc.
UAV 100 can have one or more environmental awareness sensors. These sensors can use sonar, LiDAR, stereoscopic imaging, computer vision, etc. to detect obstacles and determine the nearby environment. For example, a collision avoidance system can use environmental awareness sensors to determine how far away an obstacle is and, if necessary, change course.
Position sensor 135 and environmental awareness sensors can all be one unit or a collection of units. In some embodiments, some features of position sensor 135 and/or the environmental awareness sensors are embedded within flight controller 130.
In some embodiments, an environmental awareness system can take inputs from position sensor 135, environmental awareness sensors, and databases (e.g., a predefined mapping of a region) to determine the location of UAV 100, obstacles, and pathways. In some embodiments, this environmental awareness system is located entirely on UAV 100; alternatively, some data processing can be performed external to UAV 100.
Camera 105 can include an image sensor (e.g., a CCD sensor, a CMOS sensor, etc.), a lens system, a processor, etc. The lens system can include multiple movable lenses that can be adjusted to manipulate the focal length and/or field of view (e.g., zoom) of the lens system. In some embodiments, camera 105 is part of a camera system which includes multiple cameras 105. For example, two cameras 105 can be used for stereoscopic imaging (e.g., for first person video, augmented reality, etc.). Another example includes one camera 105 that is optimized for detecting hue and saturation information and a second camera 105 that is optimized for detecting intensity information. In some embodiments, camera 105 optimized for low latency is used for control systems while a camera 105 optimized for quality is used for recording a video (e.g., a cinematic video). Camera 105 can be a visual light camera, an infrared camera, a depth camera, etc.
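The stereoscopic imaging mentioned above recovers depth through the standard pinhole-stereo relation (depth = focal length × baseline / disparity). A minimal sketch, with illustrative parameter values not taken from the specification:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Estimate scene depth from a stereoscopic camera pair.

    focal_px     -- focal length of the (rectified) cameras, in pixels
    baseline_m   -- distance between the two camera centers, in meters
    disparity_px -- horizontal pixel shift of a feature between the views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    # Standard pinhole-stereo relation: Z = f * B / d.
    return focal_px * baseline_m / disparity_px
```

Nearby obstacles produce large disparities (small depth values), which is what a collision avoidance system would act on.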
A gimbal and dampeners can help stabilize camera 105 and remove erratic rotations and translations of UAV 100. For example, a three-axis gimbal can have three stepper motors that are positioned based on a gyroscope reading in order to prevent erratic spinning and/or keep camera 105 level with the ground.
Video processor 125 can process a video signal from camera 105. For example, video processor 125 can enhance the image of the video signal, down-sample or up-sample the resolution of the video signal, add audio (captured by a microphone) to the video signal, overlay information (e.g., flight data from flight controller 130 and/or position sensor 135), convert the signal between forms or formats, etc.
Video transmitter 120 can receive a video signal from video processor 125 and transmit it using an attached antenna. The antenna can be a cloverleaf antenna or a linear antenna. In some embodiments, video transmitter 120 uses a different frequency or band than transceiver 165. In some embodiments, video transmitter 120 and transceiver 165 are part of a single transceiver.
Battery 170 can supply power to the components of UAV 100. A battery elimination circuit can convert the voltage from battery 170 to a desired voltage (e.g., convert 12v from battery 170 to 5v for flight controller 130). A battery elimination circuit can also filter the power in order to minimize noise in the power lines (e.g., to prevent interference in transceiver 165 and video transmitter 120). Electronic speed controller 145 can contain a battery elimination circuit. For example, battery 170 can supply 12 volts to electronic speed controller 145, which can then provide 5 volts to flight controller 130. In some embodiments, a power distribution board can allow each electronic speed controller (and other devices) to connect directly to the battery.
In some embodiments, battery 170 is a multi-cell (e.g., 2S, 3S, 4S, etc.) lithium polymer battery. Battery 170 can also be a lithium-ion, lead-acid, nickel-cadmium, or alkaline battery. Other battery types and variants can be used as known in the art. In addition or as an alternative to battery 170, other energy sources can be used. For example, UAV 100 can use solar panels, wireless power transfer, a tethered power cable (e.g., from a ground station or another UAV 100), etc. In some embodiments, the other energy source can be utilized to charge battery 170 while in flight or on the ground.
Battery 170 can be securely mounted to main body 110. Alternatively, battery 170 can have a release mechanism. In some embodiments, battery 170 can be automatically replaced. For example, UAV 100 can land on a docking station, and the docking station can automatically remove a discharged battery 170 and insert a charged battery 170. In some embodiments, UAV 100 can pass through a docking station and replace battery 170 without stopping.
Battery 170 can include a temperature sensor for overload prevention. For example, when charging, the rate of charge can be thermally limited (the rate will decrease if the temperature exceeds a certain threshold). Similarly, the power delivery at electronic speed controllers 145 can be thermally limited—providing less power when the temperature exceeds a certain threshold. Battery 170 can include a charging and voltage protection circuit to safely charge battery 170 and prevent its voltage from going above or below a certain range.
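The thermal limiting described above can be sketched as a simple rate-derating function; the temperature thresholds and linear ramp are illustrative assumptions, not values from the specification:

```python
def thermally_limited_rate(requested_rate, temperature_c,
                           limit_c=45.0, cutoff_c=60.0):
    """Scale a requested charge (or power-delivery) rate based on a
    battery temperature reading.

    Below limit_c the full requested rate is allowed; between limit_c
    and cutoff_c the rate ramps down linearly; at or above cutoff_c
    the rate drops to zero.
    """
    if temperature_c <= limit_c:
        return requested_rate
    if temperature_c >= cutoff_c:
        return 0.0
    fraction = (cutoff_c - temperature_c) / (cutoff_c - limit_c)
    return requested_rate * fraction
```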
UAV 100 can include a location transponder. For example, in a racing environment, race officials can track UAV 100 using the location transponder. The actual location (e.g., X, Y, and Z) can be tracked using triangulation of the transponder. In some embodiments, gates or sensors in a track can determine if the location transponder has passed by or through the sensor or gate.
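The transponder triangulation described above can be illustrated with a planar trilateration sketch. The receiver positions and distances are hypothetical; a real system would also solve for Z and account for measurement noise:

```python
def trilaterate_2d(anchors, distances):
    """Estimate a transponder's (x, y) position from its distances to
    three fixed, non-collinear receivers (planar simplification).

    Subtracting the three circle equations pairwise eliminates the
    quadratic terms and leaves a 2x2 linear system, solved here with
    Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("receivers must not be collinear")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```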
Flight controller 130 can communicate with electronic speed controller 145, battery 170, transceiver 165, video processor 125, position sensor 135, and/or any other component of UAV 100. In some embodiments, flight controller 130 can receive various inputs (including historical data) and calculate current flight characteristics. Flight characteristics can include an actual or predicted position, orientation, velocity, angular momentum, acceleration, battery capacity, temperature, etc. of UAV 100. Flight controller 130 can then take the control signals from transceiver 165 and calculate target flight characteristics. For example, target flight characteristics might include “rotate x degrees” or “go to this GPS location”. Flight controller 130 can calculate response characteristics of UAV 100. Response characteristics can include how electronic speed controller 145, motor 150, propeller 155, etc. respond, or are expected to respond, to control signals from flight controller 130. Response characteristics can include an expectation for how UAV 100 as a system will respond to control signals from flight controller 130. For example, response characteristics can include a determination that one motor 150 is slightly weaker than other motors.
After calculating current flight characteristics, target flight characteristics, and response characteristics, flight controller 130 can calculate optimized control signals to achieve the target flight characteristics. Various control systems can be implemented during these calculations. For example, a proportional-integral-derivative (PID) controller can be used. In some embodiments, an open-loop control system (i.e., one that ignores current flight characteristics) can be used. In some embodiments, some of the functions of flight controller 130 are performed by a system external to UAV 100. For example, current flight characteristics can be sent to a server that returns the optimized control signals. Flight controller 130 can send the optimized control signals to electronic speed controllers 145 to control UAV 100.
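As one example of such a control system, a per-axis PID loop can be sketched as follows; the structure, gains, and update interface are illustrative, not the claimed implementation:

```python
class PID:
    """Minimal proportional-integral-derivative controller of the kind
    a flight controller might run per axis (e.g., pitch, roll, yaw)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, target, current, dt):
        """Return a control output driving `current` toward `target`,
        given the elapsed time `dt` since the previous update."""
        error = target - current
        self.integral += error * dt
        derivative = (0.0 if self.prev_error is None
                      else (error - self.prev_error) / dt)
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

The proportional term reacts to the current error, the integral term removes steady-state offset (e.g., a motor that is slightly weaker than the others), and the derivative term damps overshoot.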
In some embodiments, UAV 100 has various outputs that are not part of the flight control system. For example, UAV 100 can have a loudspeaker for communicating with people or other UAVs 100. Similarly, UAV 100 can have a flashlight or laser. The laser can be used to “tag” another UAV 100.
Virtual reality system 300 may generate a display of an artificial image to overlay the view of the real world (e.g., augmented reality) or to create an independent reality all its own. Display screen 310 may be partly transparent or translucent, thereby allowing the user to observe real-world surroundings, a displayed computer-generated image, or a combination of the two. The virtual environment generated by virtual reality system 300 and presented to the user may include the real-world surroundings, physical objects (which may be augmented or not), or wholly virtual objects.
In some embodiments, display screen 310 includes two screens, one for each eye; these screens can have separate signals for stereoscopic viewing. In some embodiments, receiver 315 may be coupled to display screen 310 (as shown in
A user or pilot may fly UAV 100A along course 420A, thereby navigating through or around the various course elements 410A placed along course 420A. The user may plot the course 420A anywhere within the space 400A. Course 420A may include, for example, maneuvering through or around certain course elements 410A, as well as various maneuvers in open space within space 400A.
The user may view the UAV from his or her physical location within the virtual environment while flying the UAV along the course 420A. Depending on settings of the virtual reality system 300, the user may also be provided with a first person view of the course 420A corresponding to the view as seen from the UAV. The virtual reality system 300 therefore provides the user with a flight simulation experience corresponding to the actual physical flight of the UAV.
While UAV 100A is flying along course 420A, its cameras and sensors may be capturing images and various other details regarding the physical space 400A (e.g., size or dimensional measurements) and the course 420A (e.g., various obstacles, such as walls, and course elements 410A). Using the various positional sensors discussed above with respect to
Such a course profile may include not only physical structures (e.g., walls of space 400A), but virtual structures as well. The virtual structures may be generated by a virtual reality system 300 based on user preferences and input. For example, the physical space 400A may be an empty room shaped as a simple rectangular prism in which course elements 410A may be added, whether as physical or virtual objects. In addition, the virtual reality system 300 may further add another layer of imagery where the walls of the space 400A may be made to appear as walls of an underground cave and the course elements 410A as narrow tunnel openings. Alternatively, the walls of space 400A may be made to look like outer space and the course elements 410A as openings between asteroids. Such virtual features may also be included in the course profile constructed for course 420A.
Space 400B may have different dimensions from space 400A, however. As such, the virtual environment generated for space 400A needs to be scaled for the dimensions of space 400B. In some instances, scaling may also occur based on the dimensions of the UAV 100B in space 400B. Such scaling may be based on the relative differences in dimensions between spaces 400A and 400B, the relative differences in dimensions between UAVs 100A and 100B, and various other characteristics of the respective spaces or UAVs. In some implementations, such as competitions, scaling may also apply to scores, points, times, or other metrics used to compare performance between a UAV 100A in a first space 400A and another UAV 100B in a second space 400B.
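One simple way to perform the spatial scaling described above is to apply a uniform factor derived from the two spaces' dimensions. This sketch is a hypothetical illustration (waypoint format and uniform-minimum-ratio policy are assumptions):

```python
def scale_course_profile(waypoints, source_dims, target_dims):
    """Scale course waypoints recorded in a source space so the course
    fits a differently sized target space.

    waypoints   -- list of (x, y, z) samples along the recorded course
    source_dims -- (x, y, z) dimensions of the space where it was flown
    target_dims -- (x, y, z) dimensions of the space re-creating it

    Uses the smallest per-axis ratio uniformly so the course keeps its
    proportions and is guaranteed to fit; per-axis (non-uniform)
    scaling would be an alternative design choice.
    """
    factor = min(t / s for s, t in zip(source_dims, target_dims))
    scaled = [tuple(coord * factor for coord in wp) for wp in waypoints]
    return scaled, factor
```

The returned factor could also be applied to scores or lap times so that runs in differently sized spaces remain comparable.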
Communication network 510 may be a local, proprietary network (e.g., an intranet) and/or may be a part of a larger wide-area network. The communications network 510 may be a local area network (LAN), which may be communicatively coupled to a wide area network (WAN) such as the Internet. The Internet is a broad network of interconnected computers and servers allowing for the transmission and exchange of Internet Protocol (IP) data between users connected through a network service provider. Examples of network service providers are the public switched telephone network, a cable service provider, a provider of digital subscriber line (DSL) services, or a satellite service provider. Communications network 510 allows for communication between the various components of network environment 500.
UAV courses 520A-C represent different spaces at different locations. Each UAV course 520A-C may be associated with a respective UAV and virtual reality system. The type of UAV and virtual reality system at each UAV course 520A-C may also differ. In an exemplary implementation, a course may be mapped and profiled at a first UAV course 520A. Such a course profile may include not only information regarding the physical space at UAV course 520A, but also any virtual objects or features thereof. In some instances, the course profile may also include information regarding the UAV used to map and profile the course at UAV course 520A.
The course profile for UAV course 520A may be sent over communication network 510 to any of the other UAV courses 520B-C or to server 530. The respective virtual reality systems at each of the other UAV courses 520B-C may generate a virtual environment corresponding to the course mapped for UAV course 520A. Where the dimensions of the space or UAV may differ, various scaling factors may be used to generate the respective virtual environments at UAV courses 520B-C. In some embodiments, the course profile may be sent to server 530 for storage and distribution.
Server 530 may include any type of server or other computing device as is known in the art, including standard hardware computing components such as network and media interfaces, non-transitory computer-readable storage (memory), and processors for executing instructions or accessing information that may be stored in memory. The functionalities of multiple servers may be integrated into a single server. Any of the aforementioned servers (or an integrated server) may take on certain client-side, cache, or proxy server characteristics. These characteristics may depend on the particular network placement of the server or certain configurations of the server.
In some embodiments, server 530 may act as a storage repository for course profiles for different UAV courses. Users who create course profiles by flying their UAV and augmenting or adding virtual objects to the same may further provide the course profiles to server 530. Such course profiles may be made accessible or distributed to other users upon request. In some embodiments, server 530 may apply various scaling and normalization factors so that a virtual UAV course may be generated in spaces having different dimensions.
Server 530 may be a game server or other type of host server capable of hosting a session that includes one or more systems (e.g., UAV courses 520A-C). For example, the server 530 may manage a competition in which a user at each course 520A-C may fly a UAV through a virtual course. Various scores, points, or times may be recorded, as well as scaled or otherwise normalized to account for differences in degree of difficulty related to the differences between the UAV courses and respective UAVs.
In step 610, images and other data are captured as a UAV is flown along a course at a first location. As discussed above, the UAV may be equipped with various cameras, position sensors, gimbals, environmental sensors, radar, lidar, lasers, and other types of sensors that not only allow for tracking of the UAV movement, but also gather data regarding the environment (e.g., obstacles, objects) through which the UAV moves.
In step 620, virtual elements may be added to the course. Such virtual elements may include augmentation of physical objects or wholly virtual elements. The addition of virtual elements may be based on user preference or selection. In some embodiments, the user may be offered a menu of virtual elements, themes, or templates that may be used to augment or add to a particular course.
In step 630, a course profile may be constructed based on the information captured in step 610, as well as the customizations added in step 620. The course profile may include mapping data of the course, as well as various physical and virtual elements on the course. Such a course profile may further specify sizes, distances, and other measurements of various elements of the course, as well as information regarding the UAV used to map the course.
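A course profile combining the elements listed above might be organized as follows. This is a hypothetical container for illustration only; the field names and structure are assumptions, not the claimed data format:

```python
from dataclasses import dataclass, field

@dataclass
class CourseProfile:
    """Illustrative container for a constructed course profile."""
    space_dimensions: tuple                                 # (x, y, z) of the physical space
    waypoints: list = field(default_factory=list)           # sampled flight path (step 610)
    physical_elements: list = field(default_factory=list)   # mapped obstacles and structures
    virtual_elements: list = field(default_factory=list)    # VR additions (step 620)
    uav_info: dict = field(default_factory=dict)            # characteristics of the mapping UAV
```

An instance of such a structure could be serialized and transmitted in step 640, then used by a receiving virtual reality system to regenerate the course.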
In step 640, the course profile may be transmitted over a communication network to other virtual reality systems or servers. Such transmission may occur upon request, automatic sharing or distribution settings, or other parameters selected by users or administrators.
In step 650, a virtual reality system (e.g., at a second location) that received the course profile may generate a virtual environment that is based upon the course profile and that includes a course corresponding to the mapped course. Depending on the differences in the available space at the locations and differences in the UAV, various scaling or normalization factors may be used to generate a virtual environment (including the virtual course and course elements) at the second location.
In step 660, a user in the second location may fly their UAV along the virtual course. In some embodiments, multiple virtual courses may be generated at different locations, and UAVs flown along the respective courses may be essentially racing against each other, albeit in different locations. Further, the respective virtual reality systems may communicate with each other to provide the real-time locations of each respective UAV along the course. As such, the user may view a virtual representation of their UAV competitors in the virtual environment, as well as various interactions that may occur between the physical UAV present at a location and virtual UAVs representing UAVs at other locations. Such interactions may result in various parameters being applied to the physical UAV. For example, a bump with a virtual UAV may result in limitations on the power or acceleration made available to the physical UAV.
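The virtual-bump interaction described above can be sketched as a proximity check followed by a power limitation. The bump radius and the 30% penalty are assumed values for illustration, not parameters from the specification:

```python
import math

def check_virtual_bump(physical_pos, virtual_pos, bump_radius=0.5):
    """Return True when the physical UAV comes within bump_radius
    (meters, assumed) of a virtual competitor's reported position."""
    return math.dist(physical_pos, virtual_pos) < bump_radius

def apply_bump_penalty(power_cap, penalty_fraction=0.3):
    """Reduce the power made available to the physical UAV after a
    bump with a virtual UAV."""
    return power_cap * (1.0 - penalty_fraction)
```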
The computing system 700 of
The components shown in
Mass storage device 730, which may be implemented with a magnetic disk drive or an optical disk drive, is a non-volatile storage device for storing data and instructions for use by processor unit 710. Mass storage device 730 can store the system software for implementing embodiments of the present invention for purposes of loading that software into main memory 720.
Portable storage device 740 operates in conjunction with a portable non-volatile storage medium, such as a floppy disk, compact disc, or digital video disc, to input and output data and code to and from the computer system 700 of
Input devices 760 provide a portion of a user interface. Input devices 760 may include an alpha-numeric keypad, such as a keyboard, for inputting alpha-numeric and other information, or a pointing device, such as a mouse, a trackball, stylus, or cursor direction keys. Additionally, the system 700 as shown in
Display system 770 may include a liquid crystal display (LCD) or other suitable display device. Display system 770 receives textual and graphical information, and processes the information for output to the display device.
Peripherals 780 may include any type of computer support device to add additional functionality to the computer system. For example, peripheral device(s) 780 may include a modem or a router.
The components contained in the computer system 700 of
Entertainment system 800 may be an electronic game console. Alternatively, the entertainment system 800 may be implemented as a general-purpose computer, a set-top box, a hand-held game device, a tablet computing device, or a mobile computing device or phone. Entertainment systems may contain more or fewer operating components depending on a particular form factor, purpose, or design.
The CPU 810, the vector unit 815, the graphics processing unit 820, and the I/O processor 825 of
The graphics processing unit 820 of
A user of the entertainment system 800 of
The present invention may be implemented in an application that may be operable by a variety of end user devices. For example, an end user device may be a personal computer, a home entertainment system (e.g., Sony PlayStation2® or Sony PlayStation3® or Sony PlayStation4®), a portable gaming device (e.g., Sony PSP® or Sony Vita®), or a home entertainment system of a different albeit inferior manufacturer. The present methodologies described herein are fully intended to be operable on a variety of devices. The present invention may also be implemented with cross-title neutrality wherein an embodiment of the present system may be utilized across a variety of titles from various publishers.
Non-transitory computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution. Such media can take many forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of non-transitory computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASHEPROM, and any other memory chip or cartridge.
Various forms of transmission media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus carries the data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU. Various forms of storage may likewise be implemented as well as the necessary network interfaces and network topologies to implement the same.
The foregoing detailed description of the technology has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology, its practical application, and to enable others skilled in the art to utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims.
The present application is a continuation and claims the priority benefit of U.S. patent application Ser. No. 16/728,833 filed Dec. 27, 2019, now U.S. Pat. No. 10,692,174, which is a continuation and claims the priority benefit of U.S. patent application Ser. No. 16/526,819 filed Jul. 30, 2019, now U.S. Pat. No. 10,540,746, which is a continuation and claims the priority benefit of U.S. patent application Ser. No. 15/394,511 filed Dec. 29, 2016, now U.S. Pat. No. 10,410,320, which claims the priority benefit of U.S. provisional patent application 62/402,584 filed on Sep. 30, 2016, the disclosures of which are incorporated herein by reference.
20090005167 | Arrasvuori et al. | Jan 2009 | A1 |
20090076665 | Hoisington et al. | Mar 2009 | A1 |
20090087029 | Coleman et al. | Apr 2009 | A1 |
20090118896 | Gustafsson | May 2009 | A1 |
20090125163 | Duggan et al. | May 2009 | A1 |
20090187389 | Dobbins et al. | Jul 2009 | A1 |
20090265105 | Davis et al. | Oct 2009 | A1 |
20100017114 | Tehan et al. | Jan 2010 | A1 |
20100083038 | Pierce et al. | Apr 2010 | A1 |
20100096491 | Whitelaw et al. | Apr 2010 | A1 |
20100121574 | Ariyur et al. | May 2010 | A1 |
20100228468 | D'Angelo | Sep 2010 | A1 |
20100305724 | Fry et al. | Dec 2010 | A1 |
20110102459 | Hall | May 2011 | A1 |
20110106339 | Phillips et al. | May 2011 | A1 |
20110145591 | Grzybowski | Jun 2011 | A1 |
20110184590 | Duggan et al. | Jul 2011 | A1 |
20110199376 | Salemane | Aug 2011 | A1 |
20110311949 | Preston et al. | Dec 2011 | A1 |
20120009845 | Schmelzer | Jan 2012 | A1 |
20120035799 | Ehrmann | Feb 2012 | A1 |
20120050325 | Joo | Mar 2012 | A1 |
20120093320 | Flaks et al. | Apr 2012 | A1 |
20120188078 | Soles et al. | Jul 2012 | A1 |
20120206452 | Geisner et al. | Aug 2012 | A1 |
20120212399 | Border et al. | Aug 2012 | A1 |
20120232867 | Ahrens et al. | Sep 2012 | A1 |
20120290948 | Elenzil | Nov 2012 | A1 |
20130014033 | Hamick | Jan 2013 | A1 |
20130128054 | Densham et al. | May 2013 | A1 |
20130137066 | Pollak et al. | May 2013 | A1 |
20130173089 | Bernstein et al. | Jul 2013 | A1 |
20130236040 | Crawford et al. | Sep 2013 | A1 |
20130328927 | Mount et al. | Dec 2013 | A1 |
20130345910 | Kerho et al. | Dec 2013 | A1 |
20140018979 | Goossen et al. | Jan 2014 | A1 |
20140043365 | Fialho | Feb 2014 | A1 |
20140244075 | Litwinowicz et al. | Aug 2014 | A1 |
20140316616 | Kugelmass | Oct 2014 | A1 |
20140324253 | Duggan et al. | Oct 2014 | A1 |
20140356670 | Haug et al. | Dec 2014 | A1 |
20150063610 | Mossner | Mar 2015 | A1 |
20150135144 | Kim | May 2015 | A1 |
20150141100 | Carter | May 2015 | A1 |
20150209659 | Barr et al. | Jul 2015 | A1 |
20150248785 | Holmquist | Sep 2015 | A1 |
20150323931 | Downey et al. | Nov 2015 | A1 |
20150346722 | Herz et al. | Dec 2015 | A1 |
20150370250 | Bachrach et al. | Dec 2015 | A1 |
20150378019 | Schissler et al. | Dec 2015 | A1 |
20160035224 | Yang et al. | Feb 2016 | A1 |
20160078759 | Nerayoff et al. | Mar 2016 | A1 |
20160082597 | Gorshechnikov et al. | Mar 2016 | A1 |
20160091894 | Zhang et al. | Mar 2016 | A1 |
20160111006 | Srivastava et al. | Apr 2016 | A1 |
20160117853 | Zhong et al. | Apr 2016 | A1 |
20160117931 | Chan et al. | Apr 2016 | A1 |
20160144734 | Wang et al. | May 2016 | A1 |
20160153801 | Cho | Jun 2016 | A1 |
20160196754 | Surace | Jul 2016 | A1 |
20160205654 | Robinson | Jul 2016 | A1 |
20160217698 | Liu et al. | Jul 2016 | A1 |
20160240087 | Kube et al. | Aug 2016 | A1 |
20160246474 | Shuster | Aug 2016 | A1 |
20160253908 | Chambers et al. | Sep 2016 | A1 |
20160257001 | Blasdel et al. | Sep 2016 | A1 |
20160279516 | Gupta | Sep 2016 | A1 |
20160284125 | Bostick et al. | Sep 2016 | A1 |
20160291593 | Hammond et al. | Oct 2016 | A1 |
20160292924 | Balachandreswaran et al. | Oct 2016 | A1 |
20160299506 | Bruggeman et al. | Oct 2016 | A1 |
20160307447 | Johnson et al. | Oct 2016 | A1 |
20160327950 | Bachrach et al. | Nov 2016 | A1 |
20160330601 | Srivastava | Nov 2016 | A1 |
20160358497 | Nguyen et al. | Dec 2016 | A1 |
20170031502 | Rosenberg | Feb 2017 | A1 |
20170036771 | Woodman et al. | Feb 2017 | A1 |
20170039859 | Hu et al. | Feb 2017 | A1 |
20170045886 | Liu et al. | Feb 2017 | A1 |
20170053169 | Cuban et al. | Feb 2017 | A1 |
20170061813 | Tao et al. | Mar 2017 | A1 |
20170069214 | Dupray et al. | Mar 2017 | A1 |
20170098947 | Wolski | Apr 2017 | A1 |
20170116723 | Aughey | Apr 2017 | A1 |
20170158353 | Schmick | Jun 2017 | A1 |
20170165575 | Ridihalgh et al. | Jun 2017 | A1 |
20170166204 | Yoo et al. | Jun 2017 | A1 |
20170168488 | Wierzynski et al. | Jun 2017 | A1 |
20170168556 | Goslin et al. | Jun 2017 | A1 |
20170173451 | Pedersen et al. | Jun 2017 | A1 |
20170177937 | Harmsen et al. | Jun 2017 | A1 |
20170182407 | Steele et al. | Jun 2017 | A1 |
20170244775 | Ha et al. | Aug 2017 | A1 |
20170251323 | Jo et al. | Aug 2017 | A1 |
20170283090 | Miller et al. | Oct 2017 | A1 |
20170295446 | Thagadur | Oct 2017 | A1 |
20170329347 | Passot et al. | Nov 2017 | A1 |
20170337826 | Moran et al. | Nov 2017 | A1 |
20170343375 | Kamhi et al. | Nov 2017 | A1 |
20170371353 | Millinger | Dec 2017 | A1 |
20170372617 | Bruno et al. | Dec 2017 | A1 |
20180322699 | Gray et al. | Jan 2018 | A1 |
20180027772 | Gordon et al. | Feb 2018 | A1 |
20180032071 | Wieneke | Feb 2018 | A1 |
20180039262 | Fox et al. | Feb 2018 | A1 |
20180046187 | Martirosyan et al. | Feb 2018 | A1 |
20180046560 | Gillies et al. | Feb 2018 | A1 |
20180093171 | Mallinson | Apr 2018 | A1 |
20180093768 | Castleman | Apr 2018 | A1 |
20180093781 | Mallinson | Apr 2018 | A1 |
20180094931 | Taylor | Apr 2018 | A1 |
20180095433 | Rico | Apr 2018 | A1 |
20180095461 | Taylor | Apr 2018 | A1 |
20180095463 | Castleman | Apr 2018 | A1 |
20180095714 | Taylor | Apr 2018 | A1 |
20180096455 | Taylor | Apr 2018 | A1 |
20180096611 | Kikuchi | Apr 2018 | A1 |
20180098052 | Black | Apr 2018 | A1 |
20180144525 | Gutierrez et al. | May 2018 | A1 |
20180213359 | Reinhardt et al. | Jul 2018 | A1 |
20180246514 | Mitomo et al. | Aug 2018 | A1 |
20180246529 | Hu et al. | Aug 2018 | A1 |
20180259339 | Johnson et al. | Sep 2018 | A1 |
20180299962 | Foss | Oct 2018 | A1 |
20180321692 | Castillo-Effen et al. | Nov 2018 | A1 |
20180329413 | Charalambides et al. | Nov 2018 | A1 |
20190019329 | Eyler et al. | Jan 2019 | A1 |
20190026936 | Gorur Sheshagiri | Jan 2019 | A1 |
20190047700 | Liu et al. | Feb 2019 | A1 |
20190075252 | Zhao et al. | Mar 2019 | A1 |
20190079722 | Taylor | Mar 2019 | A1 |
20190156563 | Wada | May 2019 | A1 |
20190156573 | Palos et al. | May 2019 | A1 |
20190235644 | Chen | Aug 2019 | A1 |
20190311548 | Wang et al. | Oct 2019 | A1 |
20190355092 | Taylor | Nov 2019 | A1 |
20200030700 | Mattar | Jan 2020 | A1 |
20200042278 | Eade | Feb 2020 | A1 |
20200336707 | Schmirler | Oct 2020 | A1 |
20200372815 | Kikuchi | Nov 2020 | A1 |
Number | Date | Country |
---|---|---|
WO 2018063594 | Apr 2018 | WO |
Entry |
---|
“KEITAI-Space: photo-based shared virtual space on cellular phones”; T. Nakao; 18th International Conference on Advanced Information Networking and Applications (AINA 2004), vol. 1, pp. 358-363 (Year: 2004). |
U.S. Appl. No. 15/394,285 Office Action dated Dec. 31, 2020. |
Kang et al.; “HRTF Measurement and Its Application for 3-D Sound Localization”; copyright 1997. 6 pages. |
OpenCV Modules and Introduction, retrieved from https://web.archive.org/web/20151224212423if_/https://docs.opencv.org/3.1.0/#gsc.tab=0 and publicly available on or before Dec. 24, 2015; 7 pages. |
U.S. Appl. No. 16/896,480 Office Action dated Jun. 17, 2021. |
U.S. Appl. No. 16/121,441 Final Office Action dated Oct. 5, 2020. |
U.S. Appl. No. 16/896,480, filed Jun. 9, 2020, Megumi Kikuchi, Collision Detection and Avoidance. |
Bai, Z., Blackwell, A., Coulouris, G.; Using augmented reality to elicit pretend play for children with autism. IEEE Transactions on Visualization & Computer Graphics. May 1, 2015(1):1. |
Fujii, Katsuya; Higuchi, Keita; Rekimoto, Jun; “Endless Flyer: A Continuous Flying Drone with Automatic Battery Replacement”, 2013 IEEE 10th International Conference on Ubiquitous Intelligence & Computing and 2013 IEEE 10th International Conference on Autonomic & Trusted Computing, pp. 216-223. |
Thon S., Serena-Allier D., Salvetat C., Lacotte F.; “Flying a dron in a museum: an augmented-reality serious game in Provence”, In Digital Heritage International Congress (DigitalHeritage), Oct. 28, 2013, vol. 2, pp. 669-676, IEEE. (Year: 2013). |
Williams, Elliot; “Real-life Space Invaders with Drones and Lasers,” Hackaday, Sep. 19, 2016. |
PCT Application No. PCT/US2017/048064 International Search Report and Written Opinion dated Nov. 7, 2017. |
PCT Application No. PCT/US2017/048064 International Preliminary Report on Patentability dated Apr. 2, 2019. |
U.S. Appl. No. 15/394,473 Office Action dated Jun. 10, 2019. |
U.S. Appl. No. 15/394,391 Office Action dated Aug. 24, 2018. |
U.S. Appl. No. 15/394,391 Office Action dated Feb. 23, 2018. |
U.S. Appl. No. 15/394,329 Final Office Action dated Feb. 25, 2019. |
U.S. Appl. No. 15/394,329 Office Action dated Aug. 7, 2018. |
U.S. Appl. No. 15/394,267 Final Office Action dated Apr. 19, 2019. |
U.S. Appl. No. 15/394,267 Office Action dated Aug. 24, 2018. |
U.S. Appl. No. 15/394,285 Office Action dated Jan. 8, 2020. |
U.S. Appl. No. 15/394,285 Final Office Action dated Feb. 26, 2019. |
U.S. Appl. No. 15/394,285 Office Action dated Aug. 3, 2018. |
U.S. Appl. No. 15/394,313 Office Action dated Oct. 18, 2017. |
U.S. Appl. No. 16/121,441 Office Action dated Feb. 19, 2020. |
U.S. Appl. No. 16/121,441 Final Office Action dated Sep. 6, 2019. |
U.S. Appl. No. 16/121,441 Office Action dated May 15, 2019. |
U.S. Appl. No. 15/393,855 Final Office Action dated May 17, 2019. |
U.S. Appl. No. 15/393,855 Office Action dated Feb. 1, 2019. |
U.S. Appl. No. 15/393,855 Final Office Action dated Oct. 12, 2018. |
U.S. Appl. No. 15/393,855 Office Action dated May 16, 2018. |
U.S. Appl. No. 15/711,695 Office Action dated Oct. 5, 2018. |
U.S. Appl. No. 15/711,961 Office Action dated Oct. 5, 2018. |
U.S. Appl. No. 15/394,285 Final Office Action dated Aug. 13, 2020. |
U.S. Appl. No. 16/121,441 Office Action dated Feb. 1, 2021. |
Number | Date | Country | |
---|---|---|---|
20200394754 A1 | Dec 2020 | US |
Number | Date | Country | |
---|---|---|---|
62402584 | Sep 2016 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 16728833 | Dec 2019 | US |
Child | 16909837 | US | |
Parent | 16526819 | Jul 2019 | US |
Child | 16728833 | US | |
Parent | 15394511 | Dec 2016 | US |
Child | 16526819 | US |