The present disclosure relates to autonomous aerial vehicle technology.
Vehicles can be configured to autonomously navigate a physical environment. For example, an autonomous vehicle with various onboard sensors can be configured to generate perception inputs based on the surrounding physical environment that are then used to estimate a position and/or orientation of the autonomous vehicle within the physical environment. In some cases, the perception inputs may include images of the surrounding physical environment captured by cameras on board the vehicle. An autonomous navigation system can then utilize these position and/or orientation estimates to guide the autonomous vehicle through the physical environment.
Example Implementation of an Autonomous Aerial Vehicle
In the example depicted in
In addition to the array of image capture devices 114, the UAV 100 depicted in
In many cases, it is generally preferable to capture images intended for viewing at as high a resolution as possible given hardware and software constraints. On the other hand, if used for visual navigation and/or object tracking, lower resolution images may be preferable in certain contexts to reduce processing load and provide more robust motion planning capabilities. Accordingly, in some embodiments, the image capture device 115 may be configured to capture relatively high resolution (e.g., above 3840×2160) color images, while the image capture devices 114 may be configured to capture relatively low resolution (e.g., below 320×240) grayscale images. Again, these configurations are examples provided to illustrate how image capture devices 114 and 115 may differ depending on their respective roles and constraints of the system. Other implementations may configure such image capture devices differently.
The UAV 100 can be configured to track one or more objects such as a human subject 102 through the physical environment based on images received via the image capture devices 114 and/or 115. Further, the UAV 100 can be configured to track image capture of such objects, for example, for filming purposes. In some embodiments, the image capture device 115 is coupled to the body of the UAV 100 via an adjustable mechanism that allows for one or more degrees of freedom of motion relative to a body of the UAV 100. The UAV 100 may be configured to automatically adjust an orientation of the image capture device 115 so as to track image capture of an object (e.g., human subject 102) as both the UAV 100 and object are in motion through the physical environment. In some embodiments, this adjustable mechanism may include a mechanical gimbal mechanism that rotates an attached image capture device about one or more axes. In some embodiments, the gimbal mechanism may be configured as a hybrid mechanical-digital gimbal system coupling the image capture device 115 to the body of the UAV 100. In a hybrid mechanical-digital gimbal system, orientation of the image capture device 115 about one or more axes may be adjusted by mechanical means, while orientation about other axes may be adjusted by digital means. For example, a mechanical gimbal mechanism may handle adjustments in the pitch of the image capture device 115, while adjustments in the roll and yaw are accomplished digitally by transforming (e.g., rotating, panning, etc.) the captured images so as to effectively provide at least three degrees of freedom in the motion of the image capture device 115 relative to the UAV 100.
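As an illustration of the digital portion of such a hybrid gimbal, roll compensation can be modeled as rotating the captured image about its center by the negative of the measured roll. The sketch below builds the corresponding affine transform with NumPy (the function names and conventions are assumptions, not from the source):

```python
import numpy as np

def roll_compensation_matrix(roll_deg: float, width: int, height: int) -> np.ndarray:
    """Build a 2x3 affine matrix that rotates an image about its center by
    -roll_deg, cancelling the vehicle's measured roll (hypothetical sketch)."""
    theta = np.deg2rad(-roll_deg)          # rotate opposite to the measured roll
    c, s = np.cos(theta), np.sin(theta)
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    # Rotation about (cx, cy): translate center to origin, rotate, translate back.
    return np.array([
        [c, -s, cx - c * cx + s * cy],
        [s,  c, cy - s * cx - c * cy],
    ])

def warp_point(m: np.ndarray, x: float, y: float) -> tuple:
    """Apply the 2x3 affine matrix to a single pixel coordinate."""
    px, py = m @ np.array([x, y, 1.0])
    return (px, py)
```

In practice the matrix would be applied to the whole image (e.g., with a GPU warp); applying it to individual points, as here, is just a convenient way to verify the geometry.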
In some embodiments, an autonomous aerial vehicle may instead be configured as a fixed-wing aircraft, for example, as depicted in
The mobile device 104 depicted in both
As shown in
In some embodiments, the motion planner 130, operating separately or in conjunction with the tracking system 140, is configured to generate a planned trajectory through a three-dimensional (3D) space of a physical environment based, for example, on images received from image capture devices 114 and/or 115, data from other sensors 112 (e.g., IMU, GPS, proximity sensors, etc.), and/or one or more control inputs 170. Control inputs 170 may be from external sources such as a mobile device operated by a user or may be from other systems on board the UAV 100.
In some embodiments, the navigation system 120 may generate control commands configured to cause the UAV 100 to maneuver along the planned trajectory generated by the motion planner 130. For example, the control commands may be configured to control one or more control actuators 110 (e.g., powered rotors and/or control surfaces) to cause the UAV 100 to maneuver along the planned 3D trajectory. Alternatively, a planned trajectory generated by the motion planner 130 may be output to a separate flight controller 160 that is configured to process trajectory information and generate appropriate control commands configured to control the one or more control actuators 110.
The tracking system 140, operating separately or in conjunction with the motion planner 130, may be configured to track one or more objects in the physical environment based, for example, on images received from image capture devices 114 and/or 115, data from other sensors 112 (e.g., IMU, GPS, proximity sensors, etc.), one or more control inputs 170 from external sources (e.g., from a remote user, navigation application, etc.), and/or one or more specified tracking objectives. Tracking objectives may include, for example, a designation by a user to track a particular detected object in the physical environment or a standing objective to track objects of a particular classification (e.g., people).
As alluded to above, the tracking system 140 may communicate with the motion planner 130, for example, to maneuver the UAV 100 based on measured, estimated, and/or predicted positions, orientations, and/or trajectories of the UAV 100 itself and of other objects in the physical environment. For example, the tracking system 140 may communicate a navigation objective to the motion planner 130 to maintain a particular separation distance to a tracked object that is in motion.
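A navigation objective like the separation-distance example above reduces to tracking a signed distance error, which might be sketched as follows (the function name and sign convention are assumptions):

```python
import math

def separation_error(uav_pos, subject_pos, desired_m: float) -> float:
    """Signed error between current and desired separation distance;
    positive means the UAV is farther from the tracked subject than desired.
    Positions are (x, y, z) tuples in meters (illustrative only)."""
    return math.dist(uav_pos, subject_pos) - desired_m
```

A motion planner could then penalize the magnitude of this error when scoring candidate trajectories.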
In some embodiments, the tracking system 140, operating separately or in conjunction with the motion planner 130, is further configured to generate control commands configured to cause one or more stabilization/tracking devices 152 to adjust an orientation of any image capture devices 114/115 relative to the body of the UAV 100 based on the tracking of one or more objects. Such stabilization/tracking devices 152 may include a mechanical gimbal or a hybrid digital-mechanical gimbal, as previously described. For example, while tracking an object in motion relative to the UAV 100, the tracking system 140 may generate control commands configured to adjust an orientation of an image capture device 115 so as to keep the tracked object centered in the field of view (FOV) of the image capture device 115 while the UAV 100 is in motion. Similarly, the tracking system 140 may generate commands or output data to a digital image processor (e.g., that is part of a hybrid digital-mechanical gimbal) to transform images captured by the image capture device 115 to keep the tracked object centered in the FOV of the image capture device 115 while the UAV 100 is in motion. The image capture devices 114/115 and associated stabilization/tracking devices 152 are collectively depicted in
In some embodiments, a navigation system 120 (e.g., specifically a motion planning component 130) is configured to incorporate multiple objectives at any given time to generate an output such as a planned trajectory that can be used to guide the autonomous behavior of the UAV 100. For example, certain built-in objectives, such as obstacle avoidance and vehicle dynamic limits, can be combined with other input objectives (e.g., a landing objective) as part of a trajectory generation process. In some embodiments, the trajectory generation process can include gradient-based optimization, gradient-free optimization, sampling, end-to-end learning, or any combination thereof. The output of this trajectory generation process can be a planned trajectory over some time horizon (e.g., 10 seconds) that is configured to be interpreted and utilized by a flight controller 160 to generate control commands (usable by control actuators 110) that cause the UAV 100 to maneuver according to the planned trajectory. A motion planner 130 may continually perform the trajectory generation process as new perception inputs (e.g., images or other sensor data) and objective inputs are received. Accordingly, the planned trajectory may be continually updated over some time horizon, thereby enabling the UAV 100 to dynamically and autonomously respond to changing conditions.
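A toy sketch of how multiple weighted objectives might be combined during such a trajectory generation process (here, a naive sampling approach; the function names and cost formulation are illustrative assumptions, not taken from the source):

```python
def trajectory_cost(traj, objectives):
    """Weighted sum of per-objective costs; `objectives` is a list of
    (cost_fn, weight) pairs (hypothetical interface)."""
    return sum(w * fn(traj) for fn, w in objectives)

def plan_by_sampling(candidates, objectives):
    """Pick the candidate trajectory with the lowest weighted total cost,
    a toy stand-in for a sampling-based trajectory generation step."""
    return min(candidates, key=lambda t: trajectory_cost(t, objectives))

# Example: candidates are 1D goal positions; one objective pulls toward a
# goal at 10, another heavily penalizes the obstacle location 5.
objectives = [
    (lambda t: abs(t - 10.0), 1.0),            # reach-goal objective
    (lambda t: 100.0 if t == 5 else 0.0, 1.0), # obstacle-avoidance objective
]
best = plan_by_sampling([4, 5, 9, 10], objectives)  # selects 10
```

Real planners would evaluate full trajectories against dynamics and perception data over a receding horizon, but the weighted-combination structure is the same.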
Each given objective of the set of one or more objectives 302 utilized in the motion planning process may include one or more defined parameterizations that are exposed through the API. For example,
The target 334 defines the goal of the particular objective that the motion planner 130 will attempt to satisfy when generating a planned trajectory 320. For example, the target 334 of a given objective may be to maintain line of sight with one or more detected objects or to fly to a particular position in the physical environment.
The dead-zone 336 defines a region around the target 334 in which deviations do not cause the motion planner 130 to take corrective action. This dead-zone 336 may be thought of as a tolerance level for satisfying a given target 334. For example, a target of an example image-relative objective may be to maintain image capture of a tracked object such that the tracked object appears at a particular position in the image space of a captured image (e.g., at the center). To avoid continuous adjustments based on slight deviations from this target, a dead-zone is defined to allow for some tolerance. For example, a dead-zone can be defined in a y-direction and x-direction surrounding a target location in the image space. In other words, as long as the tracked object appears within an area of the image bounded by the target and respective dead-zones, the objective is considered satisfied.
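The dead-zone test described above amounts to a simple bounds check in image space, which might be sketched as follows (the function name and parameters are hypothetical):

```python
def objective_satisfied(obj_px, target_px, dead_zone_px) -> bool:
    """Return True when the tracked object's image-space position lies within
    the dead-zone box around the target, so no corrective action is needed.
    All arguments are (x, y) pixel pairs; dead_zone_px gives the per-axis
    tolerance (illustrative sketch only)."""
    dx = abs(obj_px[0] - target_px[0])
    dy = abs(obj_px[1] - target_px[1])
    return dx <= dead_zone_px[0] and dy <= dead_zone_px[1]
```

For instance, with a target at the center of a 640×480 image and a (40, 30) pixel dead-zone, an object at (350, 250) would satisfy the objective while an object at (380, 240) would trigger a correction.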
The weighting factor 338 (also referred to as an “aggressiveness” factor) defines a relative level of impact the particular objective 332 will have on the overall trajectory generation process performed by the motion planner 130. Recall that a particular objective 332 may be one of several objectives 302 that may include competing targets. In an ideal scenario, the motion planner 130 will generate a planned trajectory 320 that perfectly satisfies all of the relevant objectives at any given moment. For example, the motion planner 130 may generate a planned trajectory that maneuvers the UAV 100 to a particular GPS coordinate while following a tracked object, capturing images of the tracked object, maintaining line of sight with the tracked object, and avoiding collisions with other objects. In practice, such an ideal scenario may be rare. Accordingly, the motion planner 130 may need to favor one objective over another when the satisfaction of both is impossible or impractical (for any number of reasons). The weighting factors for each of the objectives 302 define how they will be considered by the motion planner 130.
In an example embodiment, a weighting factor is a numerical value on a scale of 0.0 to 1.0. A value of 0.0 for a particular objective may indicate that the motion planner 130 can completely ignore the objective (if necessary), while a value of 1.0 may indicate that the motion planner 130 will make a maximum effort to satisfy the objective while maintaining safe flight. A value of 0.0 may similarly indicate an inactive objective; for example, the weighting factor may be set to 0.0 in response to toggling the objective from an active state to an inactive state. Low weighting factor values (e.g., 0.0-0.4) may be set for certain objectives that are based around subjective or aesthetic targets such as maintaining visual saliency in the captured images. Conversely, high weighting factor values (e.g., 0.5-1.0) may be set for more critical objectives such as avoiding a collision with another object.
In some embodiments, the weighting factor values 338 may remain static as a planned trajectory is continually updated while the UAV 100 is in flight. Alternatively, or in addition, weighting factors for certain objectives may dynamically change based on changing conditions, while the UAV 100 is in flight. For example, an objective to avoid an area associated with uncertain depth value calculations in captured images (e.g., due to low light conditions) may have a variable weighting factor that increases or decreases based on other perceived threats to the safe operation of the UAV 100. In some embodiments, an objective may be associated with multiple weighting factor values that change depending on how the objective is to be applied. For example, a collision avoidance objective may utilize a different weighting factor depending on the class of a detected object that is to be avoided. As an illustrative example, the system may be configured to more heavily favor avoiding a collision with a person or animal as opposed to avoiding a collision with a building or tree.
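A class-dependent weighting factor like the one described could be sketched as a simple lookup table (the class names and numeric values below are illustrative assumptions only):

```python
# Hypothetical per-class weighting for a collision-avoidance objective:
# collisions with people and animals are weighted more heavily than
# collisions with static structures (values are illustrative only).
AVOIDANCE_WEIGHTS = {
    "person": 1.0,
    "animal": 1.0,
    "vehicle": 0.8,
    "building": 0.6,
    "tree": 0.6,
}

def avoidance_weight(detected_class: str, default: float = 0.7) -> float:
    """Look up the collision-avoidance weighting factor for a detected
    object's class, falling back to a default for unknown classes."""
    return AVOIDANCE_WEIGHTS.get(detected_class, default)
```

The planner would then apply the returned weight to the avoidance cost term for that particular detected object.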
The UAV 100 shown in
The example aerial vehicles and associated systems described herein are described in the context of an unmanned aerial vehicle such as the UAV 100 for illustrative simplicity; however, the introduced aerial vehicle configurations are not limited to unmanned vehicles. The introduced technique may similarly be applied to configure various types of manned aerial vehicles, such as manned rotorcraft (e.g., helicopters) or manned fixed-wing aircraft (e.g., airplanes). For example, a manned aircraft may include an autonomous navigation system (similar to navigation system 120) in addition to a manual control (direct or indirect) system. During flight, control of the craft may switch over from a manual control system, in which an onboard pilot has direct or indirect control, to an automated control system to autonomously maneuver the craft without requiring any input from the onboard pilot or any other remote individual. Switchover from manual control to automated control may be executed in response to pilot input and/or automatically in response to a detected event such as a remote signal, environmental conditions, operational state of the aircraft, etc.
Arrangement of Image Capture Devices in Rotor Mounts
In some embodiments, one or more of the image capture devices (e.g., for navigation and/or subject capture) can be arranged proximate to the rotors of a UAV. Specifically, in some embodiments, one or more image capture devices may be arranged within and/or proximate to a structural mount associated with a rotor or a structural arm that connects a rotor mount to the body of the UAV. Arranging image capture devices within the rotor mounts (or rotor arms) of the UAV may provide several advantages, including freeing space within the body of the UAV (e.g., for other systems or batteries), reducing overall weight of the UAV (e.g., by consolidating support structures), and providing a wide baseline between the image capture devices for stereo, trinocular, or multi-view depth computation.
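The value of a wider baseline can be seen from the standard pinhole stereo relation, in which depth is proportional to baseline divided by disparity (a textbook formula, not specific to the source):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo relation: depth Z = f * B / d, with focal
    length f in pixels, baseline B in meters, and disparity d in pixels.
    A wider baseline (e.g., cameras in opposite rotor mounts) yields larger
    disparities for a given depth, improving depth resolution at range."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, with a focal length of 700 pixels and a 0.5 m baseline, a 35-pixel disparity corresponds to a depth of 10 m; halving the baseline would halve the disparity observed at that same depth.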
The walls 424 of the rotor housing 404 may be manufactured of any material or combination of materials that are suitably durable and lightweight for use in an aerial vehicle. For example, in some embodiments, the walls 424 can be made of plastic, metal (e.g., aluminum), carbon fiber, synthetic fiber, or some sort of composite material such as carbon fiber embedded in an epoxy resin. The actual materials used will depend on the performance requirements of a given embodiment. The walls 424 may be manufactured using any manufacturing process suited for the selected material. For example, in the case of plastic materials, the walls 424 may be manufactured using injection molding, extrusion molding, rotational molding, blow molding, 3D printing, milling, plastic welding, lamination, or any combination thereof. In the case of metal materials, the walls 424 may be manufactured using machining, stamping, casting, forming, metal injection molding, computer numeric control (CNC) machining, or any combination thereof. These are just example materials and manufacturing processes that are provided for illustrative purposes and are not to be construed as limiting.
The walls 424 of the rotor housing 404 may comprise a unitary structure or may represent multiple structural pieces that are affixed together, for example, using mechanical fasteners (e.g., clips, screws, bolts, etc.), adhesives (e.g., glue, tape, etc.), welding, or any other suitable process for affixing parts together. Further, as will be described, in some embodiments, the walls 424 of the rotor housing 404 of a rotor assembly 413 may be part of or otherwise integrate with walls forming other structural components of the aerial vehicle, such as a rotor arm 403 or the body 402. The rotor housing 404 is depicted in
As shown in
As previously mentioned, the motor 411 may be any type of motor capable of applying a torque to rotate the rotor blades 410. For illustrative purposes, the motor 411 is depicted in
The movable first motor assembly includes walls 440, 441 that form a first motor housing. For example, the first motor housing may include a proximal end and a distal end arranged along an axis 480. The first motor housing includes a generally cylindrical side wall 440 that is arranged about axis 480 and an end wall (or “top wall”) 441 intersecting axis 480 at the distal end of the first motor housing. The side wall 440 and end wall 441 define an interior space of the first motor housing with a generally circular opening at the proximal end of the first motor housing. Similarly, the second motor assembly includes walls 442, 443 that form a second motor housing. The second motor housing also has a proximal end and a distal end arranged along axis 480. The second motor housing includes a generally cylindrical side wall 442 that is arranged about axis 480 and an end wall (or “bottom wall”) 443 intersecting axis 480 at the distal end of the second motor housing. The side wall 442 and end wall 443 define an interior space of the second motor housing with a generally circular opening at the proximal end of the second motor housing.
The first motor assembly further includes an axle bearing 462 coupled to the first motor housing, and a stator stack arranged around the axle bearing 462. In an embodiment, the stator stack includes multiple stator coils 444 and optionally multiple stator teeth 446 which can divide an induced electromagnet into multiple sections. Axle bearing 462 is intended to accommodate the previously mentioned axle 460 such that axle 460 is freely rotatable within axle bearing 462. Axle bearing 462 may be of any type suitable to allow for rotation of axle 460. For example, in an embodiment, axle bearing 462 is a plain bearing including a generally cylindrical hollow space within which the shaft of axle 460 can rotate. In some embodiments, axle bearing 462 includes rolling elements such as ball bearings arranged between generally cylindrical races. The stator coils 444 may be made of a conductive material (e.g., copper) through which an electric current can be run to induce the electromagnet of the stator stack.
The second motor assembly further includes the axle 460, which is affixed to the second motor housing, and multiple magnets 450 that are affixed to an interior surface of the side wall 442 of the second motor housing. The magnets 450 are arranged such that they cause the first motor assembly to rotate about axis 480 when a current is applied (and therefore an electromagnetic force induced) via the stator stack of the first motor assembly, thereby causing the attached rotor blades 410 to rotate.
In some situations, operation of the motor 411 may cause vibrations or electromagnetic interference that may interfere with or otherwise affect the operation of an image capture device 414 in close proximity. To counteract the effects of such vibration and/or electromagnetic interference, the rotor assembly 413 may include an isolator component or system 470.
For example, to isolate the image capture device 414 from vibration caused by the motor 411, the isolator system 470 may include one or more active and/or passive motion dampeners. The one or more motion dampeners may isolate the image capture device 414 from the vibrations of the motor 411 and/or the motion of the surrounding walls of the rotor housing 404 (i.e., caused by the motion of the UAV). Similarly, the one or more motion dampeners may isolate the walls of the rotor housing 404 from the vibrations of the motor 411 so that those vibrations are not transferred to the image capture device 414.
Alternatively, or in addition, to isolate the image capture device 414 from electromagnetic interference caused by the motor 411, the isolator system 470 may include electromagnetic shielding. Electromagnetic shielding may include one or more barriers made of conductive and/or magnetic materials. Specific materials may include, for example, sheet metal, metal screen, or metal mesh. In some embodiments, the electromagnetic shield of the isolator system 470 may be configured as a barrier wall between the motor 411 and the image capture device 414, for example, as shown in
The image capture device 414 can also be arranged within any of the other structures extending from the central body of the UAV, such as one or more rotor support arms (e.g., arm 403 in
The image capture device 414 may be arranged at any point along a length of the support arm extending out from the body of the UAV. In some embodiments, the rotor housing may be substantially integrated as part of a support arm extending from the body of the UAV. For example,
In some embodiments, the walls of the rotor housing and/or support arm may not fully or substantially enclose the motor 411 and/or image capture device. For example, in some embodiments, the individual housings of the image capture device 414 and/or motor 411 may be sufficient to protect internal components from the elements.
For illustrative simplicity, embodiments of a UAV may be described herein with reference to just rotor assembly 413; however, such embodiments may similarly implement alternative rotor assemblies such as assemblies 413d, 413e, and 413f. Further, the various alternative rotor assemblies 413, 413d, 413e, and 413f are examples provided for illustrative purposes and shall not be construed as limiting. Other embodiments may combine certain features of the various rotor assemblies 413, 413d, 413e, and 413f. For example, another alternative rotor assembly (not depicted in the FIGS.) may include support arm and/or rotor housing walls that do not fully or substantially enclose the motor 411 and/or image capture device 414 (e.g., as depicted in
Arrangement of Image Capture Devices in an Aerial Vehicle
The example UAV 100 depicted in
Notably, the multiple image capture devices included in the rotor assemblies 513a-d and the body mounted image capture devices 517a-b are arranged such that the UAV 500 includes three upward-facing image capture devices and three downward-facing image capture devices. Specifically, the upward-facing image capture devices include image capture device 517a and the image capture devices of rotor assemblies 513a and 513b. Similarly, the downward-facing image capture devices include image capture device 517b and the image capture devices of rotor assemblies 513c and 513d. Note that rotor assemblies 513a and 513b may represent inverted arrangements of similar rotor assemblies 513c and 513d.
Each of the image capture devices of UAV 500 depicted in
As shown in
In the example UAV 500 depicted in
Similar to the rotor assemblies, the walls forming the body 502 of the UAV 500 may be manufactured of any material or combination of materials that are suitably durable and lightweight for use in an aerial vehicle. For example, in some embodiments, the walls of body 502 can be made of plastic, metal (e.g., aluminum), carbon fiber, synthetic fiber, or some sort of composite material such as carbon fiber embedded in an epoxy resin. The actual materials used will depend on the performance requirements of the UAV 500. The walls of the body of the UAV 500 may be manufactured using any manufacturing process suited for the selected material. For example, in the case of plastic materials, the walls may be manufactured using injection molding, extrusion molding, rotational molding, blow molding, 3D printing, milling, plastic welding, lamination, or any combination thereof. In the case of metal materials, the walls may be manufactured using machining, stamping, casting, forming, metal injection molding, CNC machining, or any combination thereof. These are just example materials and manufacturing processes that are provided for illustrative purposes and are not to be construed as limiting.
As previously mentioned, to enable trinocular image capture above and below the UAV 500, the rotor assemblies 513a-d with integrated image capture devices and body mounted image capture devices 517a-b are arranged such that the UAV 500 includes three upward-facing image capture devices and three downward-facing image capture devices.
In the example UAV 500, a first rotor assembly 513a extends from the port side of the body 502 and a second rotor assembly 513b extends from the starboard side. The first and second rotor assemblies 513a and 513b are substantially aligned with each other on opposite sides of the body 502 and are located proximate to the forward end of the body 502. Notably, the first and second rotor assemblies are oriented such that associated image capture devices are on a top side and the associated rotors are on a bottom side. Specifically, the first rotor assembly 513a includes a first image capture device 514a that is arranged on a top side of the first rotor assembly 513a and a first rotor 510a that is arranged on a bottom side of the first rotor assembly 513a. Similarly, the second rotor assembly 513b includes a second image capture device 514b that is arranged on a top side of the second rotor assembly 513b and a second rotor 510b that is arranged on a bottom side of the second rotor assembly 513b.
A third rotor assembly 513c extends from the port side of the body 502 and a fourth rotor assembly 513d extends from the starboard side. The third and fourth rotor assemblies 513c and 513d are substantially aligned with each other on opposite sides of the body 502 and are located proximate to the aft end of the body 502. Notably, the third and fourth rotor assemblies are oriented such that associated image capture devices are on a bottom side and the associated rotors are on a top side. Specifically, the third rotor assembly 513c includes a third image capture device 514c that is arranged on a bottom side of the third rotor assembly 513c and a third rotor 510c that is arranged on a top side of the third rotor assembly 513c. Similarly, the fourth rotor assembly 513d includes a fourth image capture device 514d that is arranged on a bottom side of the fourth rotor assembly 513d and a fourth rotor 510d that is arranged on a top side of the fourth rotor assembly 513d.
The fifth image capture device (i.e., image capture device 517a) is arranged along a top surface of body 502 proximate to the aft end and is substantially aligned with the longitudinal axis 590 of the UAV 500 as shown in
The first and second image capture devices 514a-b together with the fifth image capture device 517a form a first triangle of upward-facing image capture devices that enable trinocular image capture in a plurality of directions above the UAV 500. Similarly, the third and fourth image capture devices 514c-d, together with the sixth image capture device 517b, form a second triangle of downward-facing image capture devices that enable trinocular image capture in a plurality of directions below the UAV 500.
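One way to check that three such cameras actually provide independent baselines in multiple directions is to confirm they are non-collinear, e.g., that the triangle they form has non-zero area (an illustrative sketch; the positions are hypothetical planar coordinates, not taken from the source):

```python
def triangle_area(p1, p2, p3) -> float:
    """Area of the triangle formed by three camera positions projected into
    a plane; a non-zero area means the cameras are non-collinear, giving
    independent stereo baselines in multiple directions."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Half the magnitude of the 2D cross product of two edge vectors.
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0
```

Three collinear cameras would yield zero area and only a single baseline direction, whereas a triangular arrangement (as with the two forward rotor-mounted cameras plus the aft body-mounted camera) yields baselines spanning the plane.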
In some embodiments, a gimbaled image capture device can be coupled to a UAV to allow for capturing images of a subject in the physical environment. For example,
Otherwise, similar to UAV 500, example UAV 600 includes three upward-facing image capture devices (image capture device 617a mounted to body 602 and the integrated image capture devices of rotor assemblies 613a and 613b) and three downward-facing image capture devices (image capture device 617b and the integrated image capture devices of rotor assemblies 613c and 613d). In this example, the three upward-facing image capture devices and three downward-facing image capture devices may be utilized for visual navigation, while the gimbaled image capture device 615 is utilized to capture images of the surrounding physical environment for later display.
Notably, the gimbaled image capture device 615 is depicted in
As previously mentioned, the body and rotor assemblies may be arranged differently in other embodiments.
Specifically, the first body component 702a includes or is otherwise coupled to rotor assemblies 713a and 713b, and the second body component 702b includes or is otherwise coupled to rotor assemblies 713c and 713d. In other words, the multiple upward-facing image capture devices include image capture devices 718a and 718b that are arranged on top surfaces of rotor assemblies 713a and 713b (respectively), and another upward-facing image capture device 717a that is arranged on a top surface of the first body component 702a substantially along a central axis of the UAV 700 proximate to the aft end of the UAV 700. In the example depicted in
Similarly, the multiple downward-facing image capture devices include image capture devices 718c and 718d that are arranged on bottom surfaces of rotor assemblies 713c and 713d (respectively) and another downward-facing image capture device 717b that is arranged on a bottom surface of the second body component 702b substantially along the central axis of the UAV 700 proximate to the forward end of the UAV 700. In the example depicted in
In some embodiments, to simplify manufacture and parts replacement, the first body component 702a (including rotor assemblies 713a-b) may be substantially similar (in dimension and/or shape) to the second body component 702b (including rotor assemblies 713c-d), and the two body components may be coupled to each other in an overlapping and opposing manner, for example, as more clearly illustrated in
The body components 702a-b may be manufactured using any of the materials or manufacturing processes described with respect to body 502 of example UAV 500. In some embodiments, the body components 702a-b may collectively represent a unitary body. In other words, the two body components 702a and 702b may represent a single part that is formed of a single piece of material despite the separate component callouts in
Further, although not depicted in
In some embodiments, more than one camera can be integrated into a given rotor assembly.
A first image capture device 814a is arranged within the first interior space 826a of the first housing component 804a. Specifically, the first image capture device 814a is arranged within the first interior space 826a proximate to a first end (or “top side”) of the first housing component 804a and oriented such that light is received through an opening in the top side of the first housing component 804a. For example, the first image capture device 814a may include a lens 834a that extends from the top side of the first housing component 804a such that the first image capture device 814a captures images of the physical environment above the rotor assembly 813, while in use. In other words, the first image capture device 814a is an upward-facing image capture device.
The motor 811 and a second image capture device 814b are arranged within the second interior space 826b of the second housing component 804b. Specifically, the motor 811 is arranged within the second interior space 826b proximate to the top side of the second housing component 804b and the second image capture device 814b is arranged within the second interior space 826b proximate to a second end (or “bottom side”) of the second housing component 804b that is opposite the first end. Further, the motor 811 is oriented such that the attached rotor blades 810 extend from the top side of the second housing component 804b. Conversely, the second image capture device 814b is oriented such that light is received through an opening in the bottom side of the second housing component 804b. For example, the second image capture device 814b may include a lens 834b that extends from the bottom side of the second housing component 804b such that the second image capture device 814b captures images of the physical environment below the rotor assembly 813, while in use. In other words, the second image capture device 814b is a downward-facing image capture device. Note that the orientations of elements described with respect to the rotor assembly 813 depicted in
For illustrative purposes, the motor 811 is depicted in
The movable first motor assembly of motor 811 includes walls 840, 841 that form a first motor housing similar to walls 440, 441 of motor 411. Similarly, the second motor assembly includes walls 842, 843 that form a second motor housing similar to walls 442, 443 of motor 411.
The first motor assembly of motor 811 further includes an axle bearing 862 coupled to the first motor housing, and a stator stack arranged around the axle bearing 862. Note that the components of the stator stack of motor 811 are not specifically called out in
Notably, the axle bearing 862 is hollow along axis 880 such that the first housing component 804a can be affixed to the rest of the rotor assembly 813 above the plane of rotation of rotors 810. In other words, axle 860, to which the first housing component 804a is affixed, remains stationary while the first motor assembly of motor 811 (i.e., including walls 840, 841) rotates about axis 880 to rotate rotors 810 that are affixed to the axle bearing 862. In some embodiments, the axle 860 may have a hollow construction to enable wires (e.g., for power and/or data transfer) to pass through to connect first image capture device 814a to processing components on board the UAV.
As with the example rotor assembly 413 described with respect to
In some embodiments, as few as two image capture devices may be utilized to facilitate autonomous visual navigation.
In some embodiments, image capture devices may instead be coupled to the body of a UAV at opposing ends and oriented to capture images in front of and behind the UAV.
Adding additional image capture devices may improve depth estimation accuracy.
In some embodiments, the two upward-facing image capture devices and two downward-facing image capture devices may be arranged in or on rotor assemblies instead of in a central body to free up space in the central body. For example,
In the example UAV 1200, a first rotor assembly 1213a extends from the port side of the body 1202 and a second rotor assembly 1213b extends from the starboard side. The first and second rotor assemblies 1213a and 1213b are substantially aligned with each other on opposite sides of the body 1202 and are located proximate to the forward end of the body 1202. Notably, the first and second rotor assemblies are oriented such that associated image capture devices are on a top side and the associated rotors are on a bottom side. Specifically, the first rotor assembly 1213a includes a first image capture device 1214a that is arranged on a top side of the first rotor assembly 1213a and a first rotor 1210a that is arranged on a bottom side of the first rotor assembly 1213a. Similarly, the second rotor assembly 1213b includes a second image capture device 1214b that is arranged on a top side of the second rotor assembly 1213b and a second rotor 1210b that is arranged on a bottom side of the second rotor assembly 1213b.
A third rotor assembly 1213c extends from the port side of the body 1202 and a fourth rotor assembly 1213d extends from the starboard side. The third and fourth rotor assemblies 1213c and 1213d are substantially aligned with each other on opposite sides of the body 1202 and are located proximate to the aft end of the body 1202. Notably, the third and fourth rotor assemblies are oriented such that associated image capture devices are on a bottom side and the associated rotors are on a top side. Specifically, the third rotor assembly 1213c includes a third image capture device 1214c that is arranged on a bottom side of the third rotor assembly 1213c and a third rotor 1210c that is arranged on a top side of the third rotor assembly 1213c. Similarly, the fourth rotor assembly 1213d includes a fourth image capture device 1214d that is arranged on a bottom side of the fourth rotor assembly 1213d and a fourth rotor 1210d that is arranged on a top side of the fourth rotor assembly 1213d.
In some embodiments, a UAV with only two upward-facing and two downward-facing image capture devices (e.g., UAVs 1100 and 1200) may be configured to still achieve stereoscopic capture in multiple directions by, for example, adjusting the angles of the various image capture devices. For example, with reference to UAV 1200, the first image capture device 1214a may be arranged at an angle towards the third rotor assembly 1213c and the third image capture device 1214c may be arranged at an angle towards the first rotor assembly 1213a. Although the first image capture device 1214a and the third image capture device 1214c point in substantially opposite directions (i.e., upwards and downwards), a slight angle towards each other may be sufficient to provide a stereo baseline between the two.
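The depth benefit of such a stereo baseline follows the standard pinhole stereo relation z = f·B/d. The following sketch illustrates the computation; the baseline, focal length, and disparity values are hypothetical and chosen only for illustration:

```python
def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Estimate depth via the pinhole stereo relation z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical example: a 0.30 m baseline between two angled cameras,
# a 400 px focal length, and a feature observed with 12 px of disparity.
depth_m = depth_from_disparity(baseline_m=0.30, focal_px=400.0, disparity_px=12.0)
# depth_m evaluates to 10.0 (meters)
```

Note that a slight inward angle between the cameras reduces but does not eliminate the effective baseline, so the same relation applies after rectification.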
Protective Structure for Image Capture Devices
Arranging the image capture devices as shown in any one or more of the example UAVs of
The protective structural element 1390 is depicted in
The protective structural element 1390 may be manufactured of any material or combination of materials that are suitably durable and lightweight for use in an aerial vehicle. For example, in some embodiments, the protective structural element 1390 can be made of plastic, metal (e.g., aluminum), carbon fiber, synthetic fiber, or some sort of composite material such as carbon fiber embedded in an epoxy resin. The actual materials used will depend on the performance requirements of a given embodiment. The protective structural element 1390 may be manufactured using any manufacturing process suited for the selected material. For example, in the case of plastic materials, the protective structural element 1390 may be manufactured using injection molding, extrusion molding, rotational molding, blow molding, 3D printing, milling, plastic welding, lamination, or any combination thereof. In the case of metal materials, the protective structural element 1390 may be manufactured using machining, stamping, casting, forming, metal injection molding, CNC machining, or any combination thereof. These are just example materials and manufacturing processes that are provided for illustrative purposes and are not to be construed as limiting.
In some embodiments, the protective structural element 1390 may represent a portion of an exterior surface of a UAV. For example, the walls of any of the rotor housing 1304 and/or the rotor arm 1303 may be manufactured to include a portion that extends, for example, as depicted in
In some embodiments, a protective structural element similar to element 1390 may be arranged proximate to each of one or more image capture devices of a UAV. This may include upward-facing image capture devices to protect such devices from contact with the ground, for example, if the UAV lands upside down, or from contact with other surfaces above the UAV, such as a ceiling or the underside of a bridge. In some embodiments, the protective structural element 1390 may represent a part of a bezel or frame that is installed flush with a surface associated with the UAV and around a lens of an image capture device.
Removable Rotor Blades
The propellers on certain UAVs (e.g., quadcopter drones) are often considered consumable because they are the part of the aircraft most likely to be damaged in a collision with another object. Accordingly, manufacturers of such UAVs typically design the propellers to be user replaceable in the field, without the need for tools and with minimal effort. There are currently two widely used methods of attaching propellers to drones that meet this need. The first uses a separate, hand-tightened propeller nut that threads onto the propeller shaft or equivalent structure, pinching the propeller in place. This method has the downside of requiring a small separate part (i.e., the propeller nut) that, if lost, renders the UAV unusable, and that tends to spontaneously loosen and come off when subjected to rotational and vibrational loads. Such a propeller nut therefore requires frequent re-tightening or exotic left-handed threads. The second method uses a bayonet lock to attach a propeller directly to a motor. Since this method relies on a spring to keep the propeller seated in its locked position, the propeller cannot be relied upon to undergo any loading that would push back against this spring, and so cannot be used in a pusher configuration or for three-dimensional flight, where the propeller is run in both directions for maximum maneuverability. An improved technique for removable rotor blades is described below to address these challenges.
In some embodiments, the pins 1430a-b may be shaped and/or sized differently based on the type and/or arrangement of the motor 1410 to force proper installation by the user. For example, as previously described (e.g., with respect to UAV 500 in
The keyhole/pin attachment mechanism depicted in
The techniques for removable propeller attachments depicted in
Removable Battery Pack as a Launch Handle
In some embodiments, an autonomous UAV can be configured to launch and land from a user's hand. Such operation may require a prominent feature such as a handle on the UAV so that a person can easily grip the UAV during launch or landing. However, a prominent handle for launch and landing may negatively impact the transportability of the UAV. Instead, such a handle can be configured as a detachable component of the UAV. Further, in some embodiments, the detachable handle component can be configured to house a removable battery pack for powering the UAV.
In some embodiments, the removable battery pack component 1550 can be detachably coupled (both structurally and electrically) to the body 1502 of the UAV 1500 using one or more magnetic contacts/couplings. In addition to facilitating easy attaching and detaching of the battery pack 1550, a magnetic coupling also has the added benefit of allowing the battery pack 1550 to self-eject if the UAV 1500 runs into an obstacle while in flight. Allowing the battery pack 1550 (likely one of the more massive components on board the UAV) to eject upon impact may help to absorb some of the energy of the impact, thereby avoiding extensive damage to the body 1502 of the UAV 1500.
In some embodiments, a removable battery pack 1550 may include user interface features, for example, to allow a user to provide a control input to the UAV. In such embodiments, the user interface features (e.g., in the form of an input device) may be communicatively coupled to an internal control system (e.g., navigation system 120) of the UAV, for example, via the detachable magnetic contacts.
Gimbal Fastener
As previously discussed, a UAV may include a gimbaled image capture device (e.g., image capture device 115) configured for capturing images (including video) for later viewing. The gimbaled image capture device may be coupled to the body of the UAV via a gimbal mechanism that allows the image capture device to change position and/or orientation relative to the body of the UAV, for example, for image stabilization and/or subject tracking.
When not powered (i.e., when the UAV is off), the servo motors of the gimbal mechanism 1625 do not operate, thereby allowing the gimbal mechanism 1625 to freely rotate about the axes 1630a-b. Such freedom of motion may be problematic during storage or transport, as it may lead to damage of the attached image capture device 1615, the gimbal mechanism 1625, and/or the body 1602 of the UAV 1600. A gimbal locking mechanism can be implemented to secure the gimbal mechanism 1625 (and connected camera 1615) in place when the UAV is powered off.
The first locking component 1616 and second locking component 1626 may comprise, for example, opposing mechanical clips, opposing magnets, or any other types of elements configured to detachably couple to each other to prevent rotation of the image capture device 1615 about axis 1630a relative to the gimbal 1625 when the UAV 1600 is not powered.
Although not depicted in the figures, similar locking components can be utilized to prevent rotation about axis 1630b, or any other motion by the image capture device 1615 not depicted in the figures.
In some embodiments, a UAV may be configured with an auto-stowing feature that causes the motors of the gimbal mechanism 1625 to automatically actuate to rotate the attached image capture device 1615 into a locking position, for example, prior to powering down, in response to an environmental condition (e.g., high winds), in response to a system status (e.g., low battery or tracking/calibration errors), or in response to user input to secure the gimbal.
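For illustration, the stow decision described above can be sketched as a simple predicate over vehicle state. The condition names and threshold values below are hypothetical assumptions, not part of the disclosed system:

```python
def should_auto_stow(powering_down, wind_speed_mps, battery_pct, tracking_error,
                     user_requested, wind_limit_mps=12.0, battery_floor_pct=10.0):
    """Return True if any condition warrants stowing the gimbal.

    Mirrors the triggers described in the text: imminent power-down, an
    environmental condition (high wind), a system status (low battery or
    tracking/calibration error), or an explicit user request.
    Thresholds are illustrative tuning values.
    """
    return (powering_down
            or user_requested
            or wind_speed_mps > wind_limit_mps
            or battery_pct < battery_floor_pct
            or tracking_error)
```

In practice, such a predicate would be evaluated periodically by the control system, and a True result would command the gimbal motors to rotate the image capture device into its locking position.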
Fixed-Wing Configurations
In some embodiments, an autonomous UAV may be configured as a fixed-wing aircraft.
A fixed-wing UAV can include an autonomous visual navigation system similar to the visual navigation system 120 depicted in
The UAV 1900a depicted in
In some embodiments, the powered rotors of a fixed-wing UAV may be rearranged so as to not interfere with image capture devices arranged along leading or trailing edges of a fixed flight surface. For example,
A fixed-wing UAV can include more fixed flight surfaces than are depicted in
Aerial Vehicle—Example System
System 2300 is only one example of a system that may be part of any of the aforementioned aerial vehicles. Other aerial vehicles may include more or fewer components than shown in system 2300, may combine two or more components as functional units, or may have a different configuration or arrangement of the components. Some of the various components of system 2300 shown in
A propulsion system (e.g., comprising components 2302-2304) may comprise fixed-pitch rotors. The propulsion system may also include variable-pitch rotors (for example, using a gimbal mechanism), a variable-pitch jet engine, or any other mode of propulsion having the effect of providing force. The propulsion system may vary the applied thrust, for example, by using an electronic speed controller 2306 to vary the speed of each rotor.
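The thrust variation described above is commonly modeled with a quadratic relationship between rotor angular speed and thrust (T ≈ k·ω²). The following sketch, using a hypothetical thrust coefficient, illustrates how a speed controller might back out the rotor speed needed for a commanded thrust:

```python
import math

def rotor_speed_for_thrust(thrust_n, k_thrust=1.2e-6):
    """Invert the common quadratic rotor model T = k * omega^2.

    k_thrust (N*s^2/rad^2) is a hypothetical per-rotor coefficient; a real
    vehicle's coefficient would be measured for its specific rotor.
    """
    return math.sqrt(thrust_n / k_thrust)

# To hover, each of four rotors must carry a quarter of the vehicle weight.
mass_kg, g = 1.0, 9.81
per_rotor_thrust_n = mass_kg * g / 4.0
omega_rad_s = rotor_speed_for_thrust(per_rotor_thrust_n)
# roughly 1.4e3 rad/s for these illustrative numbers
```

An electronic speed controller would then regulate each motor toward its commanded speed, closing the loop between desired thrust and actual rotor state.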
Flight controller 2308 may include a combination of hardware and/or software configured to receive input data (e.g., sensor data from image capture devices 2334, generated trajectories from an autonomous navigation system 120, or any other inputs), interpret the data and output control commands to the propulsion systems 2302-2306 and/or aerodynamic surfaces (e.g., fixed-wing control surfaces) of the aerial vehicle. Alternatively, or in addition, a flight controller 2308 may be configured to receive control commands generated by another component or device (e.g., processors 2312 and/or a separate computing device), interpret those control commands and generate control signals to the propulsion systems 2302-2306 and/or aerodynamic surfaces (e.g., fixed-wing control surfaces) of the aerial vehicle. In some embodiments, the previously mentioned navigation system 120 may comprise the flight controller 2308 and/or any one or more of the other components of system 2300. Alternatively, the flight controller 2308 shown in
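A minimal sketch of the kind of feedback loop a flight controller might run per control axis is shown below. The proportional-derivative structure and the gain values are illustrative assumptions, not the disclosed controller:

```python
class PDController:
    """Minimal per-axis PD loop of the kind a flight controller might run.

    Gains are illustrative and not tuned for any real airframe.
    """
    def __init__(self, kp, kd):
        self.kp, self.kd = kp, kd
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        """Return a control output driving measurement toward setpoint."""
        error = setpoint - measurement
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative
```

In a system like system 2300, the setpoint might come from a generated trajectory, the measurement from the IMU, and the output would be mapped to rotor speed commands via the electronic speed controllers.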
Memory 2316 may include high-speed random-access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 2316 by other components of system 2300, such as the processors 2312 and the peripherals interface 2310, may be controlled by the memory controller 2314.
The peripherals interface 2310 may couple the input and output peripherals of system 2300 to the processor(s) 2312 and memory 2316. The one or more processors 2312 run or execute various software programs and/or sets of instructions stored in memory 2316 to perform various functions for the UAV 100 and to process data. In some embodiments, processors 2312 may include general central processing units (CPUs), specialized processing units, such as graphical processing units (GPUs), particularly suited to parallel processing applications, or any combination thereof. In some embodiments, the peripherals interface 2310, the processor(s) 2312, and the memory controller 2314 may be implemented on a single integrated chip. In some other embodiments, they may be implemented on separate chips.
The network communications interface 2322 may facilitate transmission and reception of communications signals often in the form of electromagnetic signals. The transmission and reception of electromagnetic communications signals may be carried out over physical media such as copper wire cabling or fiber optic cabling, or may be carried out wirelessly, for example, via a radiofrequency (RF) transceiver. In some embodiments, the network communications interface may include RF circuitry. In such embodiments, RF circuitry may convert electrical signals to/from electromagnetic signals and communicate with communications networks and other communications devices via the electromagnetic signals. The RF circuitry may include well-known circuitry for performing these functions, including, but not limited to, an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The RF circuitry may facilitate transmission and receipt of data over communications networks (including public, private, local, and wide area). For example, communication may be over a wide area network (WAN), a local area network (LAN), or a network or networks such as the Internet. Communication may be facilitated over wired transmission media (e.g., via Ethernet) or wirelessly. Wireless communication may be over a wireless cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other modes of wireless communication. 
The wireless communication may use any of a plurality of communications standards, protocols and technologies, including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11n and/or IEEE 802.11ac), Voice Over Internet Protocol (VoIP), Wi-MAX, or any other suitable communication protocols.
The audio circuitry 2324, including the speaker and microphone 2350, may provide an audio interface between the surrounding physical environment and the aerial vehicle. The audio circuitry 2324 may receive audio data from the peripherals interface 2310, convert the audio data to an electrical signal, and transmit the electrical signal to the speaker 2350. The speaker 2350 may convert the electrical signal to human-audible sound waves. The audio circuitry 2324 may also receive electrical signals converted by the microphone 2350 from sound waves. The audio circuitry 2324 may convert the electrical signal to audio data and transmit the audio data to the peripherals interface 2310 for processing. Audio data may be retrieved from and/or transmitted to memory 2316 and/or the network communications interface 2322 by the peripherals interface 2310.
The I/O subsystem 2360 may couple input/output peripherals of the aerial vehicle, such as an optical sensor system 2334, the mobile device interface 2338, and other input/control devices 2342, to the peripherals interface 2310. The I/O subsystem 2360 may include an optical sensor controller 2332, a mobile device interface controller 2336, and other input controller(s) 2340 for other input or control devices. The one or more input controllers 2340 receive/send electrical signals from/to other input or control devices 2342. The other input/control devices 2342 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, touchscreen displays, slider switches, joysticks, click wheels, and so forth.
The mobile device interface device 2338 along with mobile device interface controller 2336 may facilitate the transmission of data between the aerial vehicle and other computing devices such as a mobile device 104. According to some embodiments, communications interface 2322 may facilitate the transmission of data between the aerial vehicle and a mobile device 104 (for example, where data is transferred over a Wi-Fi network).
System 2300 also includes a power system 2318 for powering the various components. The power system 2318 may include a power management system, one or more power sources (e.g., battery, alternating current (AC), etc.), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)), and any other components associated with the generation, management, and distribution of power in a computerized device.
System 2300 may also include one or more image capture devices 2334. Image capture devices 2334 may be the same as any of the image capture devices associated with any of the aforementioned aerial vehicles including UAVs 100, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1500, 1500d, 1600, 1700, 1900a-b, 2000, 2100, or 2200.
UAV system 2300 may also include one or more proximity sensors 2330.
System 2300 may also include one or more accelerometers 2326.
System 2300 may include one or more inertial measurement units (IMUs) 2328. An IMU 2328 may measure and report the UAV's velocity, acceleration, orientation, and gravitational forces using a combination of gyroscopes and accelerometers (e.g., accelerometer 2326).
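One common way gyroscope and accelerometer readings are fused into an orientation estimate is a complementary filter, sketched below; the blending weight is a hypothetical tuning value:

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse gyro rate integration with an accelerometer-derived angle.

    alpha weights the gyro term (smooth but drifts over time) against the
    accelerometer term (noisy but drift-free); 0.98 is an illustrative value.
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Running this update at each IMU sample yields an angle estimate that tracks fast rotations from the gyro while the accelerometer slowly corrects accumulated drift.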
System 2300 may include a global positioning system (GPS) receiver 2320.
In some embodiments, the software components stored in memory 2316 may include an operating system, a communication module (or set of instructions), a flight control module (or set of instructions), a localization module (or set of instructions), a computer vision module (or set of instructions), a graphics module (or set of instructions), and other applications (or sets of instructions). For clarity, one or more modules and/or applications may not be shown in
An operating system (e.g., Darwin™, RTXC, Linux™, Unix™, Apple™ OS X, Microsoft Windows™, or an embedded operating system such as VxWorks™) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.), and facilitates communication between various hardware and software components.
A communications module may facilitate communication with other devices over one or more external ports 2344 and may also include various software components for handling data transmission via the network communications interface 2322. The external port 2344 (e.g., Universal Serial Bus (USB), Firewire, etc.) may be adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.).
A graphics module may include various software components for processing, rendering, and displaying graphics data. As used herein, the term “graphics” may include any object that can be displayed to a user, including, without limitation, text, still images, videos, animations, icons (such as user-interface objects including soft keys), and the like. The graphics module, in conjunction with a graphics processing unit (GPU) 2312, may process in real time, or near real time, graphics data captured by optical sensor(s) 2334 and/or proximity sensors 2330.
A computer vision module, which may be a component of a graphics module, provides analysis and recognition of graphics data. For example, while the aerial vehicle is in flight, the computer vision module, along with a graphics module (if separate), GPU 2312, and image capture devices(s) 2334, and/or proximity sensors 2330 may recognize and track the captured image of an object located on the ground. The computer vision module may further communicate with a localization/navigation module and flight control module to update a position and/or orientation of the aerial vehicle and to provide course corrections to fly along a planned trajectory through a physical environment.
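The course-correction behavior described above can be illustrated with a minimal sketch that maps a tracked object's horizontal offset in the image frame to a yaw-rate command; the gain and frame dimensions are hypothetical:

```python
def steering_correction(bbox_center_x, image_width, yaw_gain=0.5):
    """Map a tracked object's horizontal offset to a yaw-rate command
    that re-centers it in the frame (gain is an illustrative value)."""
    # Normalized offset in [-1, 1]; negative means the object is left of center.
    offset = (bbox_center_x - image_width / 2.0) / (image_width / 2.0)
    return yaw_gain * offset
```

A full tracking pipeline would first detect and localize the object in each frame (producing the bounding-box center used here) before issuing such corrections to the flight control module.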
A localization/navigation module may determine the location and/or orientation of the aerial vehicle and provide this information for use in various modules and applications (e.g., to a flight control module in order to generate commands for use by the flight controller 2308).
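As a simplified illustration of how a localization module might propagate position between fixes, consider a dead-reckoning step; a real system would fuse this with visual or GPS measurements to bound drift:

```python
def dead_reckon(position, velocity, accel, dt):
    """Propagate position and velocity one time step from measured acceleration.

    position, velocity, accel are same-length tuples of per-axis values;
    uses semi-implicit Euler integration (velocity updated first).
    """
    new_velocity = tuple(v + a * dt for v, a in zip(velocity, accel))
    new_position = tuple(p + v * dt for p, v in zip(position, new_velocity))
    return new_position, new_velocity
```

Because integration error grows without bound, the localization module would periodically correct this estimate against absolute references such as GPS fixes or visually recognized landmarks.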
Image capture devices(s) 2334, in conjunction with an image capture device controller 2332 and a graphics module, may be used to capture images (including still images and video) and store them into memory 2316.
The above identified modules and applications each correspond to a set of instructions for performing one or more functions described above. These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and, thus, various subsets of these modules may be combined or otherwise rearranged in various embodiments. In some embodiments, memory 2316 may store a subset of the modules and data structures identified above. Furthermore, memory 2316 may store additional modules and data structures not described above.
Example Computer Processing System
While the main memory 2406, non-volatile memory 2410, and storage medium 2426 (also called a “machine-readable medium”) are shown to be a single medium, the terms “machine-readable medium” and “storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store one or more sets of instructions 2428. The terms “machine-readable medium” and “storage medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the computing system and that cause the computing system to perform any one or more of the methodologies of the presently disclosed embodiments.
In general, the routines executed to implement the embodiments of the disclosure may be implemented as part of an operating system or a specific application, component, program, object, module, or sequence of instructions referred to as “computer programs.” The computer programs typically comprise one or more instructions (e.g., instructions 2404, 2408, 2428), set at various times in various memory and storage devices in a computer, and that, when read and executed by one or more processing units or processors 2402, cause the processing system 2400 to perform operations to execute elements involving the various aspects of the disclosure.
Moreover, while embodiments have been described in the context of fully functioning computers and computer systems, those skilled in the art will appreciate that the various embodiments are capable of being distributed as a program product in a variety of forms, and that the disclosure applies equally, regardless of the particular type of machine or computer-readable media used to actually effect the distribution.
Further examples of machine-readable storage media, machine-readable media, or computer-readable (storage) media include recordable type media such as volatile and non-volatile memory devices 2410, floppy and other removable disks, hard disk drives, optical discs (e.g., Compact Disc Read-Only Memory (CD-ROMS), Digital Versatile Discs (DVDs)), and transmission type media, such as digital and analog communication links.
The network adapter 2412 enables the computer processing system 2400 to mediate data in a network 2414 with an entity that is external to the computer processing system 2400, such as a network appliance, through any known and/or convenient communications protocol supported by the computer processing system 2400 and the external entity. The network adapter 2412 can include one or more of a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, a bridge router, a hub, a digital media receiver, and/or a repeater.
The network adapter 2412 can include a firewall which can, in some embodiments, govern and/or manage permission to access/proxy data in a computer network, and track varying levels of trust between different machines and/or applications. The firewall can be any number of modules having any combination of hardware and/or software components able to enforce a predetermined set of access rights between a particular set of machines and applications, machines and machines, and/or applications and applications, for example, to regulate the flow of traffic and resource sharing between these varying entities. The firewall may additionally manage and/or have access to an access control list which details permissions including, for example, the access and operation rights of an object by an individual, a machine, and/or an application, and the circumstances under which the permission rights stand.
As indicated above, the techniques introduced here may be implemented by, for example, programmable circuitry (e.g., one or more microprocessors) programmed with software and/or firmware, entirely in special-purpose hardwired (i.e., non-programmable) circuitry, or in a combination of such forms. Special-purpose circuitry can be in the form of, for example, one or more application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), etc.
Note that any of the embodiments described above can be combined with another embodiment, except to the extent that it may be stated otherwise above, or to the extent that any such embodiments might be mutually exclusive in function and/or structure.
Although the present invention has been described with reference to specific exemplary embodiments, it will be recognized that the invention is not limited to the embodiments described, but can be practiced with modification and alteration within the spirit and scope of the appended claims. Accordingly, the specifications and drawings are to be regarded in an illustrative sense, rather than a restrictive sense.
This application is a continuation of U.S. patent application Ser. No. 16/395,110, titled “AUTONOMOUS AERIAL VEHICLE HARDWARE CONFIGURATION,” filed Apr. 25, 2019; which is entitled to the benefit and/or right of priority of U.S. Provisional Patent Application No. 62/663,194, titled “AUTONOMOUS UAV HARDWARE CONFIGURATIONS,” filed Apr. 26, 2018, the contents of each of which are hereby incorporated by reference in their entirety for all purposes. This application is therefore entitled to a priority date of Apr. 26, 2018.
Number | Name | Date | Kind |
---|---|---|---|
2834573 | Stalker | May 1958 | A |
3081964 | Quenzler | Mar 1963 | A |
4093155 | Kincaid, Jr. | Jun 1978 | A |
5709357 | Von | Jan 1998 | A |
5727754 | Carter, Jr. | Mar 1998 | A |
6233003 | Ono | May 2001 | B1 |
6352411 | Bucher | Mar 2002 | B1 |
6435453 | Carter, Jr. | Aug 2002 | B1 |
6487779 | Underthun | Dec 2002 | B1 |
6655631 | Austen-Brown | Dec 2003 | B2 |
6802694 | Bucher | Oct 2004 | B2 |
7448571 | Carter, Jr. | Nov 2008 | B1 |
7959104 | Kuntz | Jun 2011 | B2 |
9030149 | Chen et al. | May 2015 | B1 |
9457901 | Bertrand | Oct 2016 | B2 |
9469394 | Vaughn | Oct 2016 | B2 |
9567076 | Zhang | Feb 2017 | B2 |
9643720 | Hesselbarth | May 2017 | B2 |
9678506 | Bachrach | Jun 2017 | B2 |
9764829 | Beckman | Sep 2017 | B1 |
9798322 | Bachrach | Oct 2017 | B2 |
9821909 | Moshe | Nov 2017 | B2 |
9834305 | Taylor | Dec 2017 | B2 |
9891621 | Bachrach | Feb 2018 | B2 |
9930298 | Bevirt | Mar 2018 | B2 |
D816546 | Wang | May 2018 | S |
9972212 | Sperindeo | May 2018 | B1 |
10011354 | Goldstein | Jul 2018 | B2 |
10104289 | Enriquez | Oct 2018 | B2 |
10137983 | Horn | Nov 2018 | B2 |
10315761 | McCullough et al. | Jun 2019 | B2 |
10351235 | Karem et al. | Jul 2019 | B2 |
10386188 | Tian | Aug 2019 | B2 |
10435176 | McClure | Oct 2019 | B2 |
10455155 | Kalinowski | Oct 2019 | B1 |
10494094 | Foley | Dec 2019 | B2 |
10513329 | Ou | Dec 2019 | B2 |
10520943 | Martirosyan | Dec 2019 | B2 |
10611048 | Helsing et al. | Apr 2020 | B1 |
10612523 | Srinivasan | Apr 2020 | B1 |
10618650 | Hasinski | Apr 2020 | B2 |
10647404 | Sugaki | May 2020 | B2 |
10669023 | Heinen et al. | Jun 2020 | B2 |
10676188 | Campbell | Jun 2020 | B2 |
10717524 | Boyes | Jul 2020 | B1 |
10793270 | Chen | Oct 2020 | B2 |
10816967 | Bachrach | Oct 2020 | B2 |
10831186 | Van Niekerk | Nov 2020 | B2 |
D906170 | Thompson | Dec 2020 | S |
10946959 | Nwosu | Mar 2021 | B2 |
10981661 | Oldroyd | Apr 2021 | B2 |
11091258 | Groninga et al. | Aug 2021 | B2 |
11124298 | Chen | Sep 2021 | B2 |
11292598 | Foley | Apr 2022 | B2 |
11453513 | Thompson | Sep 2022 | B2 |
11465739 | Hymer | Oct 2022 | B2 |
11530028 | Wiegman | Dec 2022 | B1 |
11636447 | Mishra et al. | Apr 2023 | B1 |
11738613 | Spikes et al. | Aug 2023 | B1 |
11829161 | Vander Mey et al. | Nov 2023 | B2 |
20010046442 | Bucher | Nov 2001 | A1 |
20060251522 | Matheny | Nov 2006 | A1 |
20080251308 | Molnar | Oct 2008 | A1 |
20090283629 | Kroetsch | Nov 2009 | A1 |
20100171001 | Karem | Jul 2010 | A1 |
20110001020 | Forgac | Jan 2011 | A1 |
20110268194 | Nagano | Nov 2011 | A1 |
20120232718 | Rischmuller | Sep 2012 | A1 |
20120287274 | Bevirt | Nov 2012 | A1 |
20130051778 | Dimotakis | Feb 2013 | A1 |
20130099063 | Grip | Apr 2013 | A1 |
20130175390 | Woodworth | Jul 2013 | A1 |
20130176423 | Rischmuller | Jul 2013 | A1 |
20130280084 | Nagle | Oct 2013 | A1 |
20140356174 | Wang | Dec 2014 | A1 |
20150136897 | Seibel et al. | May 2015 | A1 |
20150137424 | Lyons | May 2015 | A1 |
20150149000 | Rischmuller | May 2015 | A1 |
20150259066 | Johannesson et al. | Sep 2015 | A1 |
20150266571 | Bevirt et al. | Sep 2015 | A1 |
20150321758 | Sarna, II | Nov 2015 | A1 |
20150370250 | Bachrach | Dec 2015 | A1 |
20160129998 | Welsh | May 2016 | A1 |
20160137298 | Youngblood | May 2016 | A1 |
20160214713 | Cragg | Jul 2016 | A1 |
20160214715 | Meffert | Jul 2016 | A1 |
20160286128 | Zhou | Sep 2016 | A1 |
20160327950 | Bachrach | Nov 2016 | A1 |
20160340028 | Datta | Nov 2016 | A1 |
20160376000 | Kohstall | Dec 2016 | A1 |
20160376004 | Claridge | Dec 2016 | A1 |
20170036771 | Woodman | Feb 2017 | A1 |
20170057630 | Schwaiger | Mar 2017 | A1 |
20170075351 | Liu | Mar 2017 | A1 |
20170113793 | Toulmay | Apr 2017 | A1 |
20170113800 | Freeman | Apr 2017 | A1 |
20170158328 | Foley | Jun 2017 | A1 |
20170217585 | Hulsman | Aug 2017 | A1 |
20170233069 | Apkarian | Aug 2017 | A1 |
20170240061 | Waters | Aug 2017 | A1 |
20170240274 | Regev | Aug 2017 | A1 |
20170244270 | Waters | Aug 2017 | A1 |
20170297681 | Yamada | Oct 2017 | A1 |
20170297738 | von Flotow | Oct 2017 | A1 |
20170301111 | Zhao | Oct 2017 | A1 |
20170327218 | Morin | Nov 2017 | A1 |
20170329324 | Bachrach et al. | Nov 2017 | A1 |
20180024570 | Hutson | Jan 2018 | A1 |
20180032042 | Turpin | Feb 2018 | A1 |
20180046187 | Martirosyan | Feb 2018 | A1 |
20180093171 | Mallinson | Apr 2018 | A1 |
20180093781 | Mallinson | Apr 2018 | A1 |
20180157276 | Fisher et al. | Jun 2018 | A1 |
20180181119 | Lee | Jun 2018 | A1 |
20180194463 | Hasinski | Jul 2018 | A1 |
20180208301 | Ye | Jul 2018 | A1 |
20180208311 | Zhang | Jul 2018 | A1 |
20180284575 | Sugaki | Oct 2018 | A1 |
20180312253 | Zhao | Nov 2018 | A1 |
20180312254 | Ni | Nov 2018 | A1 |
20180326255 | Schranz | Nov 2018 | A1 |
20190002124 | Garvin | Jan 2019 | A1 |
20190023395 | Lee | Jan 2019 | A1 |
20190084673 | Chen | Mar 2019 | A1 |
20190100313 | Campbell | Apr 2019 | A1 |
20190102874 | Goja | Apr 2019 | A1 |
20190118944 | Kimchi | Apr 2019 | A1 |
20190135403 | Perry | May 2019 | A1 |
20190144100 | Samir | May 2019 | A1 |
20190168872 | Grubb | Jun 2019 | A1 |
20190215457 | Enke et al. | Jul 2019 | A1 |
20190248487 | Holtz | Aug 2019 | A1 |
20190289193 | Bevirt | Sep 2019 | A1 |
20190329903 | Thompson | Oct 2019 | A1 |
20190377345 | Bachrach | Dec 2019 | A1 |
20190378423 | Bachrach | Dec 2019 | A1 |
20190379268 | Adams | Dec 2019 | A1 |
20200001990 | Jiang | Jan 2020 | A1 |
20200023995 | Song | Jan 2020 | A1 |
20200041996 | Bachrach | Feb 2020 | A1 |
20200064135 | Lai | Feb 2020 | A1 |
20200073385 | Jobanputra | Mar 2020 | A1 |
20200108930 | Foley | Apr 2020 | A1 |
20200156749 | Scacchi | May 2020 | A1 |
20200164966 | Suzuki | May 2020 | A1 |
20200391864 | Lee | Dec 2020 | A1 |
20210009269 | Chen | Jan 2021 | A1 |
20210061482 | Ulrich | Mar 2021 | A1 |
20210107682 | Kozlenko | Apr 2021 | A1 |
20210109312 | Honjo | Apr 2021 | A1 |
20210125503 | Henry et al. | Apr 2021 | A1 |
20210206487 | Iqbal et al. | Jul 2021 | A1 |
20210214068 | Bry | Jul 2021 | A1 |
20210261252 | Dayan | Aug 2021 | A1 |
20210362849 | Bower et al. | Nov 2021 | A1 |
20210403177 | Thompson | Dec 2021 | A1 |
20220001982 | Chen | Jan 2022 | A1 |
20220009626 | Baharav et al. | Jan 2022 | A1 |
20220119102 | Shaanan | Apr 2022 | A1 |
20220177109 | Hefner et al. | Jun 2022 | A1 |
20220185475 | Foley | Jun 2022 | A1 |
20220234747 | Bower et al. | Jul 2022 | A1 |
20220278538 | Kainzmaier et al. | Sep 2022 | A1 |
20220315224 | Kominami et al. | Oct 2022 | A1 |
20220324569 | Zhou | Oct 2022 | A1 |
20220355952 | Thompson | Nov 2022 | A1 |
20220374013 | Bachrach et al. | Nov 2022 | A1 |
20220411102 | Thompson | Dec 2022 | A1 |
20220411103 | Thompson | Dec 2022 | A1 |
20230002074 | Thompson | Jan 2023 | A1 |
20230144408 | Thompson | May 2023 | A1 |
20230166862 | Thompson | Jun 2023 | A1 |
20230280742 | Bachrach et al. | Sep 2023 | A1 |
20230280746 | Bachrach et al. | Sep 2023 | A1 |
20230283902 | Flanigan et al. | Sep 2023 | A1 |
Foreign Patent Documents
Number | Date | Country |
---|---|---|
2018036231 | Mar 2018 | WO |
Related Publications

Number | Date | Country |
---|---|---|
20230002074 A1 | Jan 2023 | US |
Provisional Applications

Number | Date | Country |
---|---|---|
62663194 | Apr 2018 | US |
Continuations

Relation | Number | Date | Country |
---|---|---|---|
Parent | 16395110 | Apr 2019 | US |
Child | 17900662 | | US |