The present invention generally relates to unmanned aerial vehicles (UAVs). More specifically, the present invention relates to positional anchors for UAVs.
An unmanned aerial vehicle (UAV)—also commonly called a drone—is a type of aircraft that may be controlled with varying degrees of autonomy or directed by a remote human pilot. UAVs are available in a variety of sizes and configurations, with differing power, maneuverability, and peripheral devices, such as cameras, sensors, radar, sonar, etc. Common uses for UAVs include aerial photography, surveillance, and delivery of a variety of payloads, as well as recreational and hobby usage.
In a recreational context, UAVs may be flown in a variety of races, games, or other competitive activities. For more variety and challenge, such games may be placed in a virtual or augmented environment. Alternatively, variety and challenge may be added via various objects to be used in the game or other activity. Incorporating such objects in games taking place in a virtual or augmented environment may be challenging, however, as they may need to be tracked within the real world as well as the virtual environment.
There is, therefore, a need in the art for improved systems and methods for UAV positional anchors.
Embodiments of the present invention allow unmanned aerial vehicle (UAV) positional anchors. Signals may be broadcast via a signal interface of an anchor in a defined space which also includes a UAV. The UAV is at one location within the defined space, and the anchor is at another location within the defined space. A virtual environment may be generated that corresponds to the defined space. The virtual environment may include at least one virtual element, and a location of the virtual element within the virtual environment may be based on the location of the anchor within the defined space. A visual indication may be generated when the UAV is detected within a predetermined distance from the location of the anchor. In some embodiments, a visual element may be generated to augment the anchor where a location of the visual element is based on a location of the anchor within the defined space. The visual element may be changed when the UAV is flown to the location of the anchor within the defined space.
Various embodiments of the present invention may include systems for UAV positional anchors. Such systems may include an unmanned aerial vehicle (UAV) at one location within a defined space and at least one anchor at another location within the defined space. The anchor may include a signal interface that broadcasts signals. The system may further include a virtual reality system that generates a virtual environment corresponding to the defined space that includes at least one virtual element, whose placement within the virtual environment is based on the location of the anchor within the defined space. The virtual reality system may further generate a visual indication within the virtual environment when the UAV is detected within a predetermined distance from the location of the anchor within the defined space.
Additional embodiments of the present invention may further include methods for unmanned aerial vehicle (UAV) positional anchors. Such methods may include broadcasting signals via a signal interface of at least one anchor located within a defined space that also includes a UAV, generating a virtual environment corresponding to the defined space that includes at least one virtual element placed within the virtual environment based on the location of the anchor within the defined space, and generating a visual indication within the virtual environment when the UAV is detected within a predetermined distance from the location of the at least one anchor within the defined space.
Further embodiments of the present invention may further include non-transitory computer-readable storage media, having embodied thereon a program executable by a processor to perform methods for unmanned aerial vehicle (UAV) positional anchors as described herein.
In some embodiments, each motor 150 rotates (e.g., the drive shaft of motor 150 spins) about parallel axes. For example, the thrust provided by all propellers 155 can be in the Z direction. Alternatively, a motor 150 can rotate about an axis that is perpendicular (or any angle that is not parallel) to the axis of rotation of another motor 150. For example, two motors 150 can be oriented to provide thrust in the Z direction (e.g., to be used in takeoff and landing) while two motors 150 can be oriented to provide thrust in the X direction (e.g., for normal flight). In some embodiments, UAV 100 can dynamically adjust the orientation of one or more of its motors 150 for vectored thrust.
In some embodiments, the rotation of motors 150 can be configured to create or minimize gyroscopic forces. For example, if there are an even number of motors 150, then half of the motors can be configured to rotate counter-clockwise while the other half can be configured to rotate clockwise. Alternating the placement of clockwise and counter-clockwise motors can increase stability and enable UAV 100 to rotate about the z-axis by providing more power to one set of motors 150 (e.g., those that rotate clockwise) while providing less power to the remaining motors (e.g., those that rotate counter-clockwise).
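By way of illustration, below is a minimal Python sketch of such a mixing scheme. The quad-X layout, function name, and normalized command ranges are assumptions of the example rather than details from this description; actual flight controllers perform this mixing in firmware.

```python
# Illustrative motor mixer for a four-rotor layout (hypothetical names/ranges).
# Front-left and rear-right motors spin clockwise; front-right and rear-left
# spin counter-clockwise. A yaw command shifts power between the two pairs,
# as described above, while total thrust stays roughly constant.

def mix(throttle: float, roll: float, pitch: float, yaw: float) -> list[float]:
    """Map normalized commands (throttle 0..1; roll/pitch/yaw -1..1) to four motors."""
    front_left  = throttle + pitch + roll + yaw   # clockwise
    front_right = throttle + pitch - roll - yaw   # counter-clockwise
    rear_left   = throttle - pitch + roll - yaw   # counter-clockwise
    rear_right  = throttle - pitch - roll + yaw   # clockwise
    # Clamp each output to the valid range of its electronic speed controller.
    return [min(max(m, 0.0), 1.0)
            for m in (front_left, front_right, rear_left, rear_right)]

# A pure yaw request: clockwise motors speed up, counter-clockwise slow down.
print(mix(throttle=0.5, roll=0.0, pitch=0.0, yaw=0.2))
```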
Motors 150 can be any combination of electric motors, internal combustion engines, turbines, rockets, etc. In some embodiments, a single motor 150 can drive multiple thrust components (e.g., propellers 155) on different parts of UAV 100 using chains, cables, gear assemblies, hydraulics, tubing (e.g., to guide an exhaust stream used for thrust), etc. to transfer the power.
In some embodiments, motor 150 is a brushless motor and can be connected to electronic speed controller 145. Electronic speed controller 145 can determine the orientation of magnets attached to a drive shaft within motor 150 and, based on the orientation, power electromagnets within motor 150. For example, electronic speed controller 145 can have three wires connected to motor 150, and electronic speed controller 145 can provide three phases of power to the electromagnets to spin the drive shaft in motor 150. Electronic speed controller 145 can determine the orientation of the drive shaft based on back-EMF on the wires or by directly sensing the position of the drive shaft.
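As a rough illustration of how such three-phase commutation proceeds, the sketch below walks the conventional six-step (trapezoidal) sequence; the step table and trigger logic are assumptions of the example, since a real electronic speed controller 145 implements this in dedicated firmware or hardware at high speed.

```python
# Six-step commutation sketch (illustrative). Each step drives two of the
# three motor phases; the third is left floating so its back-EMF can be
# sampled to estimate the drive shaft's orientation.

# (phase driven high, phase driven low, floating phase) for each step
COMMUTATION_STEPS = [
    ("A", "B", "C"),
    ("A", "C", "B"),
    ("B", "C", "A"),
    ("B", "A", "C"),
    ("C", "A", "B"),
    ("C", "B", "A"),
]

def next_step(step: int, back_emf_zero_crossed: bool) -> int:
    """Advance to the next step when the floating phase's back-EMF crosses
    zero, indicating the rotor has reached the commutation point."""
    return (step + 1) % len(COMMUTATION_STEPS) if back_emf_zero_crossed else step
```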
Transceiver 165 can receive control signals from a control unit (e.g., a handheld control transmitter, a server, etc.). Transceiver 165 can receive the control signals directly from the control unit or through a network (e.g., satellite, cellular, mesh, etc.). The control signals can be encrypted. In some embodiments, the control signals include multiple channels of data (e.g., “pitch,” “yaw,” “roll,” “throttle,” and auxiliary channels). The channels can be encoded using pulse-width modulation or can be digital signals. In some embodiments, the control signals are received over TCP/IP or a similar networking stack.
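For instance, a pulse-width-modulated channel conventionally maps a 1000-2000 microsecond pulse to a stick position. The sketch below decodes such pulses into normalized commands; the channel names and ranges are conventional assumptions, not taken from this description.

```python
# Decode raw RC pulse widths (microseconds) into normalized channel values.
# 1500 us is the conventional center; values are clamped to the 1000-2000 us
# range before scaling to -1.0..1.0.

def decode_pwm(pulse_widths_us: dict[str, int]) -> dict[str, float]:
    return {channel: (min(max(us, 1000), 2000) - 1500) / 500.0
            for channel, us in pulse_widths_us.items()}

print(decode_pwm({"pitch": 1600, "yaw": 1900, "roll": 1500, "throttle": 1100}))
# {'pitch': 0.2, 'yaw': 0.8, 'roll': 0.0, 'throttle': -0.8}
```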
In some embodiments, transceiver 165 can also transmit data to a control unit. Transceiver 165 can communicate with the control unit using lasers, light, ultrasonic, infra-red, Bluetooth, 802.11x, or similar communication methods, including a combination of methods. Transceiver 165 can communicate with multiple control units at a time.
Position sensor 135 can include an inertial measurement unit for determining the acceleration and/or the angular rate of UAV 100, a GPS receiver for determining the geolocation and altitude of UAV 100, a magnetometer for determining the surrounding magnetic fields of UAV 100 (for informing the heading and orientation of UAV 100), a barometer for determining the altitude of UAV 100, etc. Position sensor 135 can include a land-speed sensor, an air-speed sensor, a celestial navigation sensor, etc.
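To make the barometer's contribution concrete, altitude can be recovered from static pressure with the standard international barometric formula. The sketch below uses the usual textbook constants, which are assumptions of the example rather than values from this description.

```python
# Convert static pressure to altitude via the international barometric
# formula (ISA model): h = 44330 * (1 - (p / p0)^(1/5.255)).

SEA_LEVEL_PA = 101325.0  # standard sea-level pressure

def pressure_to_altitude_m(pressure_pa: float,
                           sea_level_pa: float = SEA_LEVEL_PA) -> float:
    return 44330.0 * (1.0 - (pressure_pa / sea_level_pa) ** (1.0 / 5.255))

print(round(pressure_to_altitude_m(89875.0), 1))  # ~1000 m
```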
UAV 100 can have one or more environmental awareness sensors. These sensors can use sonar, LiDAR, stereoscopic imaging, computer vision, etc. to detect obstacles and determine the nearby environment. For example, a collision avoidance system can use environmental awareness sensors to determine how far away an obstacle is and, if necessary, change course.
Position sensor 135 and environmental awareness sensors can all be one unit or a collection of units. In some embodiments, some features of position sensor 135 and/or the environmental awareness sensors are embedded within flight controller 130.
In some embodiments, an environmental awareness system can take inputs from position sensor 135, environmental awareness sensors, and databases (e.g., a predefined mapping of a region) to determine the location of UAV 100, obstacles, and pathways. In some embodiments, this environmental awareness system is located entirely on UAV 100; alternatively, some data processing can be performed external to UAV 100.
Camera 105 can include an image sensor (e.g., a CCD sensor, a CMOS sensor, etc.), a lens system, a processor, etc. The lens system can include multiple movable lenses that can be adjusted to manipulate the focal length and/or field of view (e.g., zoom) of the lens system. In some embodiments, camera 105 is part of a camera system which includes multiple cameras 105. For example, two cameras 105 can be used for stereoscopic imaging (e.g., for first person video, augmented reality, etc.). Another example includes one camera 105 that is optimized for detecting hue and saturation information and a second camera 105 that is optimized for detecting intensity information. In some embodiments, a camera 105 optimized for low latency is used for control systems while a camera 105 optimized for quality is used for recording a video (e.g., a cinematic video). Camera 105 can be a visual light camera, an infrared camera, a depth camera, etc.
A gimbal and dampeners can help stabilize camera 105 and remove erratic rotations and translations of UAV 100. For example, a three-axis gimbal can have three stepper motors that are positioned based on a gyroscope reading in order to prevent erratic spinning and/or keep camera 105 level with the ground.
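A minimal sketch of this kind of stabilization, assuming a simple proportional correction per axis (the interface and gain are hypothetical, not from this description):

```python
# Per-axis gimbal correction: command each stepper motor to counter-rotate
# against the gyroscope-measured rotation so camera 105 stays level.

def gimbal_correction(gyro_rate_dps: float, dt_s: float, gain: float = 1.0) -> float:
    """Compensating angle (degrees) for one axis over one control interval."""
    return -gain * gyro_rate_dps * dt_s

# If the frame rolls at 30 deg/s for 10 ms, counter-rotate the roll axis 0.3 deg.
print(gimbal_correction(gyro_rate_dps=30.0, dt_s=0.01))
```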
Video processor 125 can process a video signal from camera 105. For example, video processor 125 can enhance the image of the video signal, down-sample or up-sample the resolution of the video signal, add audio (captured by a microphone) to the video signal, overlay information (e.g., flight data from flight controller 130 and/or position sensor 135), convert the signal between forms or formats, etc.
Video transmitter 120 can receive a video signal from video processor 125 and transmit it using an attached antenna. The antenna can be a cloverleaf antenna or a linear antenna. In some embodiments, video transmitter 120 uses a different frequency or band than transceiver 165. In some embodiments, video transmitter 120 and transceiver 165 are part of a single transceiver.
Battery 170 can supply power to the components of UAV 100. A battery elimination circuit can convert the voltage from battery 170 to a desired voltage (e.g., convert 12 v from battery 170 to 5 v for flight controller 130). A battery elimination circuit can also filter the power in order to minimize noise in the power lines (e.g., to prevent interference in transceiver 165 and video transmitter 120). Electronic speed controller 145 can contain a battery elimination circuit. For example, battery 170 can supply 12 volts to electronic speed controller 145, which can then provide 5 volts to flight controller 130. In some embodiments, a power distribution board can allow each electronic speed controller (and other devices) to connect directly to the battery.
In some embodiments, battery 170 is a multi-cell (e.g., 2S, 3S, 4S, etc.) lithium polymer battery. Battery 170 can also be a lithium-ion, lead-acid, nickel-cadmium, or alkaline battery. Other battery types and variants can be used as known in the art. In addition or as an alternative to battery 170, other energy sources can be used. For example, UAV 100 can use solar panels, wireless power transfer, a tethered power cable (e.g., from a ground station or another UAV 100), etc. In some embodiments, the other energy source can be utilized to charge battery 170 while in flight or on the ground.
Battery 170 can be securely mounted to main body 110. Alternatively, battery 170 can have a release mechanism. In some embodiments, battery 170 can be automatically replaced. For example, UAV 100 can land on a docking station and the docking station can automatically remove a discharged battery 170 and insert a charged battery 170. In some embodiments, UAV 100 can pass through a docking station and replace battery 170 without stopping.
Battery 170 can include a temperature sensor for overload prevention. For example, when charging, the rate of charge can be thermally limited (the rate will decrease if the temperature exceeds a certain threshold). Similarly, the power delivery at electronic speed controllers 145 can be thermally limited—providing less power when the temperature exceeds a certain threshold. Battery 170 can include a charging and voltage protection circuit to safely charge battery 170 and prevent its voltage from going above or below a certain range.
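A sketch of such thermal limiting appears below; the derating thresholds and linear profile are assumptions for illustration, not values from this description.

```python
# Linearly derate requested current once the battery temperature sensor
# reports readings above a soft limit; cut power entirely above a hard cutoff.

def thermally_limited_current(requested_a: float, temp_c: float,
                              soft_limit_c: float = 45.0,
                              hard_cutoff_c: float = 60.0) -> float:
    if temp_c <= soft_limit_c:
        return requested_a
    if temp_c >= hard_cutoff_c:
        return 0.0
    return requested_a * (hard_cutoff_c - temp_c) / (hard_cutoff_c - soft_limit_c)

print(thermally_limited_current(10.0, 50.0))  # ~6.7 A at 50 deg C
```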
UAV 100 can include a location transponder. For example, in a racing environment, race officials can track UAV 100 using the location transponder. The actual location (e.g., X, Y, and Z) can be tracked using triangulation of the transponder. In some embodiments, gates or sensors in a track can determine if the location transponder has passed by or through the sensor or gate.
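As an illustration of locating a transponder this way, the sketch below trilaterates a position from range measurements to fixed receivers; the receiver layout and least-squares approach are assumptions of the example, not details from this description.

```python
import numpy as np

# Estimate a transponder's (X, Y, Z) from ranges to known receivers by
# subtracting the first sphere equation from the rest, which linearizes
# the problem into A x = b, solved by least squares.

def locate(receivers: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    p0, r0 = receivers[0], ranges[0]
    A = 2.0 * (receivers[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(receivers[1:]**2, axis=1) - np.sum(p0**2))
    return np.linalg.lstsq(A, b, rcond=None)[0]

receivers = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0],
                      [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])
truth = np.array([3.0, 4.0, 2.0])
ranges = np.linalg.norm(receivers - truth, axis=1)
print(locate(receivers, ranges))  # ~[3. 4. 2.]
```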
Flight controller 130 can communicate with electronic speed controller 145, battery 170, transceiver 165, video processor 125, position sensor 135, and/or any other component of UAV 100. In some embodiments, flight controller 130 can receive various inputs (including historical data) and calculate current flight characteristics. Flight characteristics can include an actual or predicted position, orientation, velocity, angular momentum, acceleration, battery capacity, temperature, etc. of UAV 100. Flight controller 130 can then take the control signals from transceiver 165 and calculate target flight characteristics. For example, target flight characteristics might include “rotate x degrees” or “go to this GPS location”. Flight controller 130 can calculate response characteristics of UAV 100. Response characteristics can include how electronic speed controller 145, motor 150, propeller 155, etc. respond, or are expected to respond, to control signals from flight controller 130. Response characteristics can include an expectation for how UAV 100 as a system will respond to control signals from flight controller 130. For example, response characteristics can include a determination that one motor 150 is slightly weaker than other motors.
After calculating current flight characteristics, target flight characteristics, and response characteristics, flight controller 130 can calculate optimized control signals to achieve the target flight characteristics. Various control systems can be implemented during these calculations. For example, a proportional-integral-derivative (PID) controller can be used. In some embodiments, an open-loop control system (i.e., one that ignores current flight characteristics) can be used. In some embodiments, some of the functions of flight controller 130 are performed by a system external to UAV 100. For example, current flight characteristics can be sent to a server that returns the optimized control signals. Flight controller 130 can send the optimized control signals to electronic speed controllers 145 to control UAV 100.
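A minimal PID loop of the kind mentioned above might look like the following; the gains, axis, and update interval are illustrative assumptions, not the implementation described here.

```python
# One PID loop per flight axis: turn the error between a target and the
# current flight characteristic into a correction for the motor commands.

class PID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target: float, current: float, dt: float) -> float:
        error = target - current
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pitch = PID(kp=0.8, ki=0.2, kd=0.05)
# UAV pitched 5 degrees off a level target; compute the corrective output.
print(pitch.update(target=0.0, current=5.0, dt=0.01))
```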
In some embodiments, UAV 100 has various outputs that are not part of the flight control system. For example, UAV 100 can have a loudspeaker for communicating with people or other UAVs 100. Similarly, UAV 100 can have a flashlight or laser. The laser can be used to “tag” another UAV 100.
Augmented or virtual reality system 300 may generate a display of an artificial image to overlay the view of the real world (e.g., augmented reality) or to create an independent reality all its own (e.g., virtual reality). Depending on whether the system is set up for augmented or virtual reality, display screen 310 may be partly transparent or translucent—thereby allowing the user to observe real-world surroundings—or display screen 310 may display a computer-generated image, or a combination of the two. The virtual environment generated by augmented or virtual reality system 300 and presented to the user may include the real-world surroundings, physical objects (which may be augmented or not), or wholly virtual objects.
In some embodiments, display screen 310 includes two screens, one for each eye; these screens can have separate signals for stereoscopic viewing. In some embodiments, receiver 315 may be coupled to display screen 310 (as shown in FIG. 3).
Each anchor 410-430 is equipped with a signal interface that broadcasts signals throughout the space. Such signals may be ultrasonic, light-based, or other types of beacon signals known in the art. Such signals may be detected by an augmented or virtual reality system 300, which may use such signals to locate the anchor (which may or may not be moving during the game). The location of the anchor may be used to adjust the corresponding augmented or virtual representation. Where an anchor 410-420 moves or may be moved, the signals broadcast by the respective anchor allow the augmented or virtual reality system 300 to track its location in real-time, as well as to update the augmented or virtual display based on the real-time location.
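A sketch of how such tracking might be organized, assuming a hypothetical beacon-callback interface (none of these names come from the description above):

```python
import time

# Track one anchor's real-time position from its broadcast beacon. Each time
# the ultrasonic or light-based beacon is localized, the stored position is
# refreshed; staleness indicates the beacon is occluded or out of range.

class AnchorTracker:
    def __init__(self, anchor_id: str):
        self.anchor_id = anchor_id
        self.position: tuple[float, float, float] | None = None  # (x, y, z)
        self.last_seen = 0.0

    def on_beacon(self, position: tuple[float, float, float]) -> None:
        """Invoked whenever the system localizes this anchor's beacon."""
        self.position = position
        self.last_seen = time.monotonic()

    def is_stale(self, timeout_s: float = 0.5) -> bool:
        return time.monotonic() - self.last_seen > timeout_s

tracker = AnchorTracker("anchor-410")
tracker.on_beacon((2.0, 5.5, 1.2))
print(tracker.position, tracker.is_stale())
```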
Such anchors 410-430 may have different roles depending on the parameters of a game or competition. Some anchors 410 may be mobile and may be an object for the UAV 100 to chase (or to be chased by) through the space 400 during the course of a game. Some anchors 420 may be carried by the UAV 100, and other anchors 430 may be stationary. Different combinations of anchors 410-430 may be incorporated into various games in different capacities. When the UAV 100 is near an anchor 410-430, certain indications may be generated to indicate certain statuses, scores, bonuses, notifications, information regarding a new challenge, etc.
The object of the game may be for the UAV 100 to catch a mobile anchor 410, to find a hidden anchor 420, to bring one anchor 420 to another anchor 430, or to race from one anchor 410-430 to another. Such anchors 410-430 may represent markers where additional challenges or events may occur. Different anchors 410-430 may be associated with different points or scores, as may be the actions involving such anchors 410-430. Such game parameters may be indicated visually in the augmented or virtual environment.
The user may view the UAV from his or her physical location within the space 400 while flying the UAV. Depending on settings of the augmented or virtual reality system 300, the user may also be provided with a first person view of the augmented or virtual environment corresponding to the view as seen from the UAV. The augmented or virtual reality system 300 therefore provides the user with a flight simulation experience corresponding to the actual physical flight of the UAV 100.
In step 510, one or more anchors are distributed throughout a space. The number and type of anchors used depends on the object of a particular game or challenge. As described above, such anchors may vary in size/weight, mobility, etc. Stationary anchors may be distributed to serve as markers for a race or obstacle course. Mobile anchors may chase the UAV(s), or the UAV(s) may chase the mobile anchor. Further, some anchors may themselves be carried from one location to another (e.g., the location of another anchor).
In step 520, signals are broadcast from each anchor. As noted above, such signals may be in any form known in the art, including ultrasonic, light-based, or other types of beacon signals. Such signals may be detectable by an augmented or virtual reality system present in the space.
In step 530, the augmented or virtual reality system may generate augmentation or virtual elements that correspond to the anchor. An augmented reality system may simply augment the anchor, while a virtual reality system may generate a virtual environment corresponding to the space that includes a virtual element corresponding to the anchor. Such an anchor may be represented in the virtual environment by the virtual element, which may be placed within the virtual environment in accordance with the location of the anchor within the space. The type of augmentation or virtual elements may be based on user preference or selection. In some embodiments, the user may be offered a menu of virtual elements, themes, or templates that may be used to generate the augmentation or virtual element.
In step 540, a UAV may be detected as being near an anchor. The UAV may be flying through various locations within the space. When the UAV is detected as being within a predetermined distance from an anchor, such detection may serve as a trigger. Depending on the object of the game, the proximity of the UAV to the anchor may indicate that the UAV has won a race, reached a milestone or other goal, caught up to a quarry being chased, collided with an obstacle, been caught or tagged by a chaser, etc.
In step 550, a visual indication may be generated based on the detection of step 540. As above, the type of visual indication depends on the type of game, as well as what the proximity between the UAV and anchor may indicate. Such indications may include a score, an updated scoreboard, an in-game bonus, a notification, or information regarding a new challenge.
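The sketch below ties steps 540 and 550 together; the one-meter threshold, the anchor roles, and the indication mapping are illustrative assumptions rather than parameters from the description above.

```python
import math

# Step 540: trigger when the UAV is within a predetermined distance of an
# anchor. Step 550: choose a visual indication based on the anchor's role
# in the game.

INDICATIONS = {
    "finish": "race won",
    "checkpoint": "milestone reached",
    "quarry": "target caught",
    "obstacle": "collision",
}

def check_proximity(uav_pos: tuple[float, float, float],
                    anchor_pos: tuple[float, float, float],
                    anchor_role: str,
                    threshold_m: float = 1.0) -> str | None:
    """Return the visual indication to render, or None if out of range."""
    if math.dist(uav_pos, anchor_pos) <= threshold_m:
        return INDICATIONS.get(anchor_role, "score updated")
    return None

print(check_proximity((2.0, 3.0, 1.5), (2.4, 3.2, 1.5), "checkpoint"))
```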
Entertainment system 600 may be an electronic game console. Alternatively, the entertainment system 600 may be implemented as a general-purpose computer, a set-top box, a hand-held game device, a tablet computing device, or a mobile computing device or phone. Entertainment systems may contain more or fewer operating components depending on a particular form factor, purpose, or design.
The CPU 610, the vector unit 615, the graphics processing unit 620, and the I/O processor 625 of FIG. 6 communicate via a system bus.
The graphics processing unit 620 of FIG. 6 executes graphics instructions received from the CPU 610 and the vector unit 615 to produce images for display.
A user of the entertainment system 600 of FIG. 6 provides instructions to the CPU 610 via a controller interface.
The present invention may be implemented in an application that may be operable by a variety of end user devices. For example, an end user device may be a personal computer, a home entertainment system (e.g., Sony PlayStation2® or Sony PlayStation3® or Sony PlayStation4®), a portable gaming device (e.g., Sony PSP® or Sony Vita®), or a home entertainment system of a different manufacturer. The present methodologies described herein are fully intended to be operable on a variety of devices. The present invention may also be implemented with cross-title neutrality, wherein an embodiment of the present system may be utilized across a variety of titles from various publishers.
Non-transitory computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution. Such media can take many forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of non-transitory computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASH EPROM, and any other memory chip or cartridge.
Various forms of transmission media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus carries the data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU. Various forms of storage may likewise be implemented as well as the necessary network interfaces and network topologies to implement the same.
The foregoing detailed description of the technology has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology and its practical application, and to enable others skilled in the art to utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims.
The present application claims the priority benefit of U.S. provisional patent application 62/402,609 filed Sep. 30, 2016, the disclosure of which is incorporated herein by reference.
Publication: US 2018/0095461 A1, Apr. 2018 (US). Priority: U.S. Provisional Application No. 62/402,609, Sep. 2016 (US).