This invention relates generally to autonomous control systems, and more particularly to training computer models for autonomous control systems.
Autonomous control systems are systems that guide vehicles (e.g., automobiles, trucks, vans) without direct guidance by human operators. Autonomous control systems analyze the surrounding physical environment in various ways to guide vehicles in a safe manner. For example, an autonomous control system may detect and/or track objects in the physical environment, and responsive to a detected object, guide the vehicle away from the object such that collision with the object can be avoided. As another example, an autonomous control system may detect boundaries of lanes on the road such that the vehicle can be guided within the appropriate lane with the flow of traffic. Typically, the autonomous control system includes sensors that capture the surrounding environment as a set of sensor measurements in the form of images, videos, point cloud data, and the like.
Oftentimes, autonomous control systems use computer models to analyze the surrounding environment and perform detection and control operations. The computer models are trained using training data that resemble potential environments the autonomous control system would encounter during operation. The training data may correspond to the type of sensor data generated by the sensors of the autonomous control system. In preparation for the training process, portions of the training data are annotated to label various objects of interest. Computer models can learn representations of the objects through these annotations. For example, annotations for a camera image of a street may be regions of the image containing pedestrians, on which computer models can be trained to learn representations of people on the street.
Typically, annotations for training data can be generated by human operators who manually label the regions of interest, or can also be generated by annotation models that allow human operators to simply verify the annotations and relabel only those that are inaccurate. While fairly accurate labels can be easily and conveniently generated for certain types of sensor measurements, other types of sensor measurements can be difficult to annotate due to the format, size, or complexity of the data. For example, light detection and ranging (LIDAR) sensors generate sensor measurements in three-dimensional (3D) space that can be difficult for human operators to label compared to a two-dimensional (2D) image. In addition, although annotation models can be used to generate the annotations, this can also be difficult due to the significant amount of data that needs to be processed and the missing sensor measurements that result from the particular sensing mechanism.
An annotation system uses annotations for a first set of sensor measurements from a first sensor to identify annotations for a second set of sensor measurements from a second sensor. Annotations for the first set of sensor measurements may be generated relatively easily and conveniently, while annotations for the second set of sensor measurements may be more difficult to generate than those for the first set due to the sensing characteristics of the second sensor. In one embodiment, the first set of sensor measurements are from a camera that represent a scene in a two-dimensional (2D) space, and the second set of sensor measurements are from an active sensor, such as a light detection and ranging (LIDAR) sensor, that represent the scene in a three-dimensional (3D) space.
Specifically, the annotation system identifies reference annotations in the first set of sensor measurements that indicate a location of a characteristic object in the 2D space. The annotation system determines a spatial region in the 3D space of the second set of sensor measurements that corresponds to a portion of the scene represented in the annotation of the first set of sensor measurements. The spatial region is determined using at least a viewpoint of the first sensor and the location of the reference annotation in the 2D space. In one embodiment, the spatial region is represented as a viewing frustum, which is a pyramid of vision containing the region of space that may appear in the reference annotation in the 2D image. In one instance, the spatial region may be shaped as a rectangular pyramid.
The annotation system determines annotations within the spatial region of the second set of sensor measurements that indicate a location of the characteristic object in the 3D space. In one embodiment, the annotation system filters the spatial region from the second set of sensor measurements, and applies an annotation model to only the filtered region to determine the annotation for the second set of sensor measurements. The annotation system provides the annotations to human operators, such that the annotations can be verified and relabeled if needed.
By using the annotation for the first set of sensor measurements to help determine the annotation for the second set of sensor measurements, the annotation system can efficiently narrow down the spatial region that contains the characteristic object in the second set of sensor measurements. For example, when the annotation model is applied to the entire second set of sensor measurements, an incorrect annotation outside the spatial region can potentially be assigned the highest likelihood of containing the characteristic object. Since the annotation model is restricted to searching a smaller space that actually contains the characteristic object, there is a higher chance the annotation model will identify the appropriate annotation for the object without the need to search the entire space of the second set of sensor measurements. This way, the annotation system can improve the accuracy of annotations as well as save computational resources compared to applying the annotation model to the entire second set of sensor measurements.
The figures depict various embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
The autonomous control system 110 guides vehicles based on information related to the surrounding environment received from the one or more sensors attached to the vehicles. The vehicles are any means of conveyance or transport in or by which someone or something can travel from one place to another, and may include automobiles, trucks, vans, robotic transports, and the like. The autonomous control system 110 may guide a vehicle through one or more trips from one destination to another. For example, the autonomous control system 110 may guide a ride-sharing vehicle (e.g., a taxi) from a passenger's point of pick-up to their desired destination.
The autonomous control system 110 performs various detection and control algorithms based on sensor data to guide the vehicles in a safe and efficient manner. For example, the autonomous control system 110 may detect various objects (e.g., lamp post, cars) that are proximate to a vehicle in the captured sensor data of the environment, and guide the vehicle away from the objects to prevent collision of the vehicle with the objects. As another example, the autonomous control system 110 may detect boundaries of lanes on the road such that the vehicle can be guided within the appropriate lane with the flow of traffic. Other examples also include simulating sensor data, estimating sensor quality, and the like.
One or more sensors are attached to the vehicles to gather information used to generate the control of the vehicle. The sensors are devices that detect information related to the environment, and generate sensor measurements that characterize how the sensor perceives the environment. The information can be captured through many forms.
More generally, the autonomous control system 110 may include passive sensors or active sensors. Passive sensors include a receiver that detects and measures various forms of energy that are naturally emitted from the physical environment or constituents of the physical environment across various locations. In one instance, the passive sensors include a camera that generates a two-dimensional (2D) image of pixel data indicating intensities of detected light as sensor measurements. In another instance, the passive sensors include a microphone that generates a time series of air pressure values. In another instance, the passive sensors include a vibration sensor that generates a time series of physical displacements of the vibration sensor.
Active sensors emit energy and measure the energy that is reflected back to one or more receivers of the sensor. The reflected energy allows active sensors to probe for environmental information that may not otherwise be readily detected passively at the sensor. This may allow active sensors to represent the environment across a higher dimension compared to passive sensors. For example, active sensors may be capable of estimating distances of objects, and may represent the environment in a three-dimensional (3D) space rather than the 2D space of an image from a camera. Due to their sensing mechanism, active sensors may also output sparse sensor measurements that contain missing portions of data when, for example, objects are outside the sensing range of the sensor or in the presence of occlusions such as rain, fog, and snow.
In one instance, the active sensors include ultrasound sensors that emit ultrasound waves, radio detection and ranging (RADAR) sensors that emit microwaves, light detection and ranging (LIDAR) sensors that emit laser pulses in the near-IR or visible range, and IR sensors that emit IR waves. In particular, the sensor measurements of active sensors may include intensity and reflectance measurements of the reflected energy sensed at the receiver. The sensor measurements can be used to generate a depth map indicating how far away objects are from the sensor, or to generate a point cloud that represents the environment with reference to a 3D coordinate system, such as a Cartesian coordinate system or a spherical coordinate system. Each value in the point cloud designates the measurements of the actively-transmitted signal as received back at the receiver (e.g., depth or reflected intensity measurements).
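As an illustration of one such representation, the following is a minimal sketch (in Python with NumPy, assuming a common range/azimuth/elevation convention that is not prescribed here) that converts active-sensor returns given in spherical coordinates into a Cartesian point cloud, carrying the reflected intensity with each point.

```python
import numpy as np

def spherical_to_cartesian(ranges, azimuths, elevations, intensities):
    """Convert active-sensor returns (range, azimuth, elevation) into an
    N x 4 Cartesian point cloud [x, y, z, intensity].

    Angles are in radians; azimuth is measured in the x-y plane from +x,
    elevation from the x-y plane toward +z (a common, but not universal,
    LIDAR convention assumed for this sketch).
    """
    ranges = np.asarray(ranges, dtype=float)
    azimuths = np.asarray(azimuths, dtype=float)
    elevations = np.asarray(elevations, dtype=float)
    x = ranges * np.cos(elevations) * np.cos(azimuths)
    y = ranges * np.cos(elevations) * np.sin(azimuths)
    z = ranges * np.sin(elevations)
    return np.stack([x, y, z, np.asarray(intensities, dtype=float)], axis=1)
```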
In one embodiment, various functions of the autonomous control system 110 are performed through machine-learned computer models. The computer models may be configured to receive the sensor measurements and generate desired output data that is of interest to the autonomous control system 110. For example, a computer detection model may identify regions of a 3D LIDAR point cloud that contains pedestrians, vehicles, and other objects of interest, such that the vehicle can be guided away from these objects to prevent collision. In one embodiment, the machine-learned models are neural network models such as feed-forward networks, convolutional neural networks (CNN), deep neural networks (DNN), recurrent neural networks (RNN), self-organizing maps (SOM), and the like, that are trained by the model training system 130 based on training data.
Though described herein as an autonomous vehicle, the control decisions of the autonomous control system 110 may provide semi-autonomous control rather than complete control of the vehicle, for example to supplement or override user control, or as primary means of control that can be overridden by a user. In addition, although the autonomous control system 110 is described herein as a system that guides vehicles, the autonomous control system 110 may also guide other systems such as robotic arms or manufacturing equipment.
The model training system 130 trains machine-learned computer models for use in the autonomous control system 110. The computer models are trained using training data, which are known sensor measurements that resemble sensing of potential environments the autonomous control system 110 would encounter during operation. The training data may correspond to the type of sensor measurements generated by sensors of the autonomous control system 110. For example, the training data may include images from cameras that represent various scenes in 2D space, and point cloud measurements from active sensors such as LIDAR sensors, RADAR sensors, and the like that represent the scenes in 3D space.
In one embodiment, portions of the training data are annotated by the annotation system 140 with labels indicating various objects of interest, such as pedestrians, vehicles, and the like. The computer models can learn to detect the objects through these annotations. For example, annotations for a training data set of LIDAR sensor measurements may include 3D bounding boxes around vehicles that can be used to train computer models to predict bounding boxes containing the characteristic objects for a new set of LIDAR sensor measurements. The model training system 130 receives annotated training data from the annotation system 140.
The annotation system 140 provides annotated training data to the model training system 130. The annotations represent a desired type of metadata that correspond to the type of data the computer models are configured to predict. For example, annotated regions containing pedestrians can be used to train a computer model that outputs likelihoods that a region of an image contains a pedestrian. In one instance, the annotations are in the form of bounding boxes that enclose objects of interest, preferably within the smallest area or volume possible. In another instance, the annotations are in the form of labels that partition an image into different segments. A pixel or groups of pixels in the image may be assigned a label such that pixels with the same labels share certain characteristics.
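As a rough illustration of these two annotation forms, the following sketch (with hypothetical names and shapes, not part of the described system) represents a 2D bounding box as a small data class and rasterizes a set of boxes into a coarse per-pixel label map in which pixels sharing a label share a characteristic object.

```python
from dataclasses import dataclass
from typing import Dict, List

import numpy as np

@dataclass
class BoundingBox2D:
    """Axis-aligned 2D bounding box enclosing an object of interest."""
    label: str          # e.g., "pedestrian"
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def segmentation_mask(height: int, width: int, boxes: List[BoundingBox2D],
                      label_ids: Dict[str, int]) -> np.ndarray:
    """Rasterize boxes into a per-pixel label map (0 = background).

    A coarse stand-in for true segmentation labels: every pixel inside a
    box is assigned that box's label id.
    """
    mask = np.zeros((height, width), dtype=np.int32)
    for box in boxes:
        mask[int(box.y_min):int(box.y_max), int(box.x_min):int(box.x_max)] = \
            label_ids[box.label]
    return mask
```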
In one instance, the annotation system 140 obtains annotations in conjunction with human operators who manually label regions of interest through, for example, an interface provided by the annotation system 140. In another instance, the annotation system 140 automatically generates estimated annotations by applying an annotation model to the training data. Typically, the annotation model scans portions of the sensor measurements in an incremental fashion, and assigns to each of a set of estimated annotations a likelihood that the annotated region contains the object of interest. For example, the annotation model may sequentially scan portions of the sensor measurements defined by a rectangular bounding box across a particular direction (e.g., width) of the sensor measurements, and assign to each portion a likelihood that the portion contains the object of interest. The estimated annotations with the highest likelihoods are usually designated as the annotations for the training data. For example, the bounding boxes with likelihoods above a threshold amount may be designated as annotations for the training data. The annotation system 140 provides the annotations to human operators who verify the results and relabel those that are inaccurate.
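A minimal sketch of such an incremental scan is shown below, assuming a hypothetical `score_fn` that returns the likelihood that a window contains the object of interest; the annotation model contemplated here would typically be a learned model rather than a fixed scoring function.

```python
import numpy as np

def scan_for_annotations(image, box_w, box_h, score_fn, stride=8, threshold=0.9):
    """Slide a box_w x box_h window across the image, score each window with
    score_fn (a likelihood in [0, 1] that the window contains the object of
    interest), and keep the windows whose likelihood exceeds the threshold.

    Returns a list of (x_min, y_min, x_max, y_max, likelihood) tuples sorted
    by likelihood, so the highest-likelihood candidates come first.
    """
    height, width = image.shape[:2]
    candidates = []
    for y in range(0, height - box_h + 1, stride):
        for x in range(0, width - box_w + 1, stride):
            likelihood = score_fn(image[y:y + box_h, x:x + box_w])
            if likelihood >= threshold:
                candidates.append((x, y, x + box_w, y + box_h, likelihood))
    return sorted(candidates, key=lambda c: c[-1], reverse=True)
```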
While fairly accurate labels can be easily and conveniently generated for certain types of sensor measurements, other types of sensor measurements can be difficult to annotate due to the format, size, or complexity of the data. For example, high-quality annotations for a 2D camera image may be generated fairly easily using widely established annotation tools and models, while sensor measurements for active sensors, such as LIDAR sensors, may require annotations in the 3D space that can be more difficult for human operators to label. Although annotation models can also be used to generate the annotations, this may require scanning the entire set of sensor measurements in the 3D space that can be computationally burdensome. In addition, the annotations may have suboptimal accuracy due to the missing data points that result from the active sensing mechanism.
Thus, in one embodiment, the annotation system 140 uses annotations for a first set of sensor measurements from a first sensor to identify annotations for a second set of sensor measurements from a second sensor. Oftentimes, the training data contains multiple sets of sensor measurements that correspond to the same scene. For example, the training data may have been obtained from multiple sensors attached to a data collection vehicle. The data collection sensors may have the same or different viewpoints. The annotation system 140 takes advantage of the annotations for a first set of sensor measurements to determine annotations for a second set of sensor measurements that captures the same scene. Annotations for the first set of sensor measurements may be generated relatively easily and conveniently, while annotations for the second set of sensor measurements may be more difficult to generate than those for the first set due to the sensing characteristics of the second sensor.
Specifically, the annotation system 140 identifies reference annotations in the first set of sensor measurements that indicate a location of a characteristic object in the 2D space. The annotation system 140 determines a spatial region in the 3D space of the second set of sensor measurements that corresponds to a portion of the scene represented in the annotation of the first set of sensor measurements. The spatial region is determined using at least a viewpoint of the first sensor and the location of the annotation in the first set of sensor measurements. In one embodiment, the spatial region is represented as a viewing frustum, which is a pyramid of vision containing the region of space that may appear in the reference annotation in the 2D image. In one instance, the frustum may be shaped as a rectangular pyramid.
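One way such a viewing frustum could be derived, assuming a pinhole camera with a known intrinsic matrix K (an assumption made for this sketch and not required by the description above), is to back-project the corners of the reference bounding box into rays from the camera viewpoint; the rays and the viewpoint bound the pyramid of vision.

```python
import numpy as np

def frustum_rays(bbox, K):
    """Back-project the corners of a 2D bounding box into 3D viewing rays.

    bbox: (x_min, y_min, x_max, y_max) in pixel coordinates.
    K: 3x3 pinhole camera intrinsic matrix.
    Returns a 4x3 array of unit ray directions in the camera frame; together
    with the camera origin they bound the region of space that can appear
    inside the reference annotation.
    """
    x_min, y_min, x_max, y_max = bbox
    corners = np.array([[x_min, y_min, 1.0],
                        [x_max, y_min, 1.0],
                        [x_max, y_max, 1.0],
                        [x_min, y_max, 1.0]])
    rays = (np.linalg.inv(K) @ corners.T).T      # directions in the camera frame
    return rays / np.linalg.norm(rays, axis=1, keepdims=True)
```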
The annotation system 140 determines annotations within the spatial region of the second set of sensor measurements that indicate a location of the characteristic object in the 3D space. In one embodiment, the annotation system 140 filters the spatial region from the second set of sensor measurements, and applies an annotation model to only the filtered region to determine the annotations for the second set of sensor measurements. The annotation system 140 provides the annotations to client devices 116 associated with human operators, such that the annotations can be verified and relabeled if needed.
By using the annotation for the first set of sensor measurements to help determine the annotation for the second set of sensor measurements, the annotation system 140 can quickly narrow down on a spatial region that contains the characteristic object. For example, when the annotation model is applied to the entire second set of sensor measurements, an incorrect annotation outside the spatial region can potentially be assigned the highest likelihood, and thus, be designated as an annotation even though the region may not contain the characteristic object. Since the annotation model is restricted to searching a smaller space that contains the characteristic object, there is a higher chance the annotation model will identify the appropriate annotation for the object. This way, the annotation system 140 can improve the accuracy of annotations as well as save computational resources compared to applying the annotation model to the entire second set of sensor measurements.
In one particular embodiment referred to throughout the remainder of the specification, the first set of sensor measurements are sensor measurements from a camera that represent a scene as a two-dimensional (2D) image, and the second set of sensor measurements are sensor measurements from a LIDAR sensor that represent the scene in a three-dimensional (3D) space. However, it is appreciated that in other embodiments, the first set of sensor measurements and the second set of sensor measurements can be any other types of sensor measurements that capture the same scene, in which the portion of the scene labeled in the annotation of the first set of sensor measurements can be extrapolated to a region of space in the second set of sensor measurements that contains the portion of the scene.
The client devices 116 are associated with human operators that provide various forms of guidance to the annotation system 140 regarding annotations for training data. In one embodiment, the human operators interact with interfaces generated by the annotation system 140 via the client devices 116 to provide guidance on annotations. For example, a human operator may interact with the interface using a browser application of the client device 116. In one embodiment, the client devices 116 receive annotations generated by the annotation system 140, and the human operators verify the accuracy of the annotations. If the annotations are inaccurate, the human operators may also choose to manually relabel the annotations through the interface, such that the annotation system 140 can receive the corrected annotations.
The client devices 116 are configured to communicate via the network 120, which may comprise any combination of local area and/or wide area networks, using both wired and/or wireless communication systems. In one embodiment, the network 120 uses standard communications technologies and/or protocols. For example, the network 120 includes communication links using technologies such as Ethernet, 802.11, worldwide interoperability for microwave access (WiMAX), 3G, 4G, code division multiple access (CDMA), digital subscriber line (DSL), etc. Examples of networking protocols used for communicating via the network 120 include multiprotocol label switching (MPLS), transmission control protocol/Internet protocol (TCP/IP), hypertext transport protocol (HTTP), simple mail transfer protocol (SMTP), and file transfer protocol (FTP). Data exchanged over the network 120 may be represented using any suitable format, such as hypertext markup language (HTML) or extensible markup language (XML). In some embodiments, all or some of the communication links of the network 120 may be encrypted using any suitable technique or techniques.
The data management module 310 manages the sensor data store 350. The sensor data store 350 includes sensor measurements in the form of images, videos, point clouds, and the like that the annotation system 140 can annotate. The annotated data can be provided to the model training system 130 as training data for training the computer models. The sensor measurements may be generated from physical sensors, may be simulated with respect to virtual sensors or may be a combination of both. In particular, the sensor data store 350 may include sensor measurements from different sensors that correspond to the same scene from the same or different viewpoints.
In one instance, the sensor data store 350 includes sensor measurements from a camera. The sensor measurements from the camera may be arranged as pixels and each pixel may have one or more intensity values associated with it depending on whether the camera is a grayscale or color camera. For example, when the camera is a color camera describing a color of a pixel in red, green, and blue, the intensity value for each is typically an integer, such as an 8, 10, or 12-bit integer specifying the intensity of the red, green, or blue component. If the resolution of the picture were 100×100 pixels (having 10,000 total pixels), for every image, there would be 3 separate channels of 10,000 pixels.
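For instance, such an image could be held as a 100 x 100 x 3 array of 8-bit integers; the short NumPy sketch below is only one possible representation, not a required one.

```python
import numpy as np

# A 100 x 100 color image: three 8-bit intensity channels (red, green, blue),
# i.e., three separate channels of 10,000 pixels each.
image = np.zeros((100, 100, 3), dtype=np.uint8)
image[..., 0] = 255   # set the red channel of every pixel to full intensity
print(image.size)     # 30,000 values = 3 channels x 10,000 pixels
```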
In one instance, the sensor data store 350 includes sensor measurements from an active sensor. The sensor measurements from the active sensor may represent the scene in 3D space. In particular, the sensor data store 350 may include sensor measurements from a LIDAR sensor. The active sensor measurements may capture the same scene as the camera images, but from the same or a different viewpoint than the camera. For example, the training data may include an image of a vehicle on a road captured by a color camera near the dashboard of a data collection vehicle. The training data may also include a LIDAR point cloud of the vehicle on the road captured by a LIDAR sensor attached to the roof of the data collection vehicle.
In one instance, the active sensor measurements are arranged as depth maps. The depth maps include depth measurements that indicate how far away an object in the environment is from the sensor. Specifically, the depth is measured by triggering a timer when the energy is emitted and detecting the amount of time needed for the receiver to detect the reflected energy. Since the emitted energy travels to the object and back, the depth of an object is the traveling speed of the energy multiplied by half the elapsed time; by emitting energy signals in the direction of various objects, the depths of objects at various locations in the environment can be calculated. The depth maps may also include intensity measurements that indicate the intensity of the reflected energy detected at the receiver of the sensor. These intensity values may be represented as 8 or 16-bit integer values.
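A short sketch of this time-of-flight calculation, assuming a LIDAR pulse traveling at the speed of light:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, for LIDAR pulses

def depth_from_time_of_flight(elapsed_seconds, speed=SPEED_OF_LIGHT):
    """Depth of the reflecting object: the pulse travels to the object and
    back, so the one-way distance is the speed times half the elapsed time."""
    return speed * elapsed_seconds / 2.0

# Example: a reflection detected 200 nanoseconds after emission
print(depth_from_time_of_flight(200e-9))   # ~29.98 meters
```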
In another instance, the active sensor measurements are arranged as point clouds with reference to a 3D coordinate system, such as a Cartesian coordinate system or a spherical coordinate system. Each value in the point cloud designates the measurements of the actively-transmitted signal at the receiver (e.g., depth or reflected intensity measurements). The number of data points in the point cloud is related to the resolution of the sensor. Further, for a given sensor, the number of data points varies depending on factors such as what portion of the environment is within the sensor's range.
The transformation module 314 obtains reference annotations in a first set of sensor measurements and identifies a spatial region in a second set of sensor measurements that corresponds to a portion of the scene represented in the reference annotation.
The transformation module 314 determines a spatial region in the space of the second set of sensor measurements that corresponds to a portion of the scene captured in the reference annotations of the first set of sensor measurements. When the reference annotation is a bounding box, the portion of the scene may refer to the region contained within the bounding box. When the reference annotations are segmentation labels, the portion of the scene may refer to the region encompassed by the pixels labeled as the characteristic object. The transformation module 314 applies one or more geometric transformations to the annotated region of the first set of sensor measurements to determine the spatial region in the second set of measurements. In particular, when the spatial region is shaped as a viewing frustum, the transformation module 314 may determine the coordinates of the near plane and the far plane of the viewing frustum that contain the characteristic object in the second set of sensor measurements.
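As a sketch of one such geometric transformation, again assuming a pinhole camera with intrinsic matrix K and hypothetical near and far depths chosen to bracket the characteristic object (for example, the sensor's minimum and maximum range), the corner coordinates of the near and far planes of the viewing frustum could be computed as follows.

```python
import numpy as np

def frustum_planes(bbox, K, near, far):
    """Corner coordinates of the near and far planes of the viewing frustum
    for a 2D bounding box, expressed in the camera frame.

    bbox: (x_min, y_min, x_max, y_max) in pixels; K: 3x3 intrinsic matrix;
    near, far: depths along the camera z-axis chosen to bracket the object.
    Returns two 4x3 arrays: near-plane corners and far-plane corners.
    """
    x_min, y_min, x_max, y_max = bbox
    corners = np.array([[x_min, y_min, 1.0],
                        [x_max, y_min, 1.0],
                        [x_max, y_max, 1.0],
                        [x_min, y_max, 1.0]])
    directions = (np.linalg.inv(K) @ corners.T).T   # z-component equals 1.0
    return near * directions, far * directions
```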
The annotation module 318 determines the annotations for the second set of sensor measurements based on the spatial region identified by the transformation module 314. In one embodiment, the annotation module 318 filters the subset of sensor measurements contained in the spatial region and applies an annotation model to only the filtered subset to determine the annotations. In one instance, the annotations output by the annotation model may be 3D bounding boxes that are volumetric rectangular prisms that surround the object of interest in the 3D space. In another instance, the annotations may be segmentation labels that indicate which measurements correspond to characteristic objects.
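The following sketch illustrates the filtering step under the same pinhole-camera assumption, with a hypothetical extrinsic transform `T_cam_from_lidar` from the LIDAR frame to the camera frame; the trivial axis-aligned box fit at the end is only a stand-in for the annotation model, not the model itself.

```python
import numpy as np

def filter_points_in_frustum(points, bbox, K, T_cam_from_lidar):
    """Keep the LIDAR points whose camera-image projection falls inside the
    reference 2D bounding box (i.e., inside the viewing frustum).

    points: N x 3 LIDAR points; bbox: (x_min, y_min, x_max, y_max);
    K: 3x3 intrinsics; T_cam_from_lidar: 4x4 extrinsic transform.
    """
    homog = np.hstack([points, np.ones((len(points), 1))])
    cam = (T_cam_from_lidar @ homog.T).T[:, :3]
    z = cam[:, 2]
    in_front = z > 1e-6                      # ignore points behind the camera
    proj = (K @ cam.T).T
    with np.errstate(divide="ignore", invalid="ignore"):
        u = proj[:, 0] / z
        v = proj[:, 1] / z
    x_min, y_min, x_max, y_max = bbox
    keep = in_front & (u >= x_min) & (u <= x_max) & (v >= y_min) & (v <= y_max)
    return points[keep]

def fit_axis_aligned_box(points):
    """A trivial stand-in for the annotation model: the tightest axis-aligned
    3D box around the filtered points (min and max corners)."""
    return points.min(axis=0), points.max(axis=0)
```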
The foregoing description of the embodiments of the invention has been presented for the purpose of illustration; it is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Persons skilled in the relevant art can appreciate that many modifications and variations are possible in light of the above disclosure.
Some portions of this description describe the embodiments of the invention in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times, to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.
Any of the steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
Embodiments of the invention may also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
Embodiments of the invention may also relate to a product that is produced by a computing process described herein. Such a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
This application claims priority to, and is a continuation of, U.S. patent application Ser. No. 16/514,721, which claims the benefit of U.S. Provisional Application No. 62/701,441, filed Jul. 20, 2018. Each of the above-recited applications is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
6882755 | Silverstein et al. | May 2005 | B2 |
7209031 | Nakai et al. | Apr 2007 | B2 |
7747070 | Puri | Jun 2010 | B2 |
7904867 | Burch et al. | Mar 2011 | B2 |
7974492 | Nishijima | Jul 2011 | B2 |
8165380 | Choi et al. | Apr 2012 | B2 |
8369633 | Lu et al. | Feb 2013 | B2 |
8406515 | Cheatle et al. | Mar 2013 | B2 |
8509478 | Haas et al. | Aug 2013 | B2 |
8588470 | Rodriguez et al. | Nov 2013 | B2 |
8744174 | Hamada et al. | Jun 2014 | B2 |
8773498 | Lindbergh | Jul 2014 | B2 |
8912476 | Fogg et al. | Dec 2014 | B2 |
8913830 | Sun et al. | Dec 2014 | B2 |
8928753 | Han et al. | Jan 2015 | B2 |
8972095 | Furuno et al. | Mar 2015 | B2 |
8976269 | Duong | Mar 2015 | B2 |
9008422 | Eid et al. | Apr 2015 | B2 |
9081385 | Ferguson et al. | Jul 2015 | B1 |
9275289 | Li et al. | Mar 2016 | B2 |
9586455 | Sugai et al. | Mar 2017 | B2 |
9672437 | McCarthy | Jun 2017 | B2 |
9710696 | Wang et al. | Jul 2017 | B2 |
9738223 | Zhang et al. | Aug 2017 | B2 |
9754154 | Craig et al. | Sep 2017 | B2 |
9767369 | Furman et al. | Sep 2017 | B2 |
9965865 | Agrawal et al. | May 2018 | B1 |
10133273 | Linke | Nov 2018 | B2 |
10140252 | Fowers et al. | Nov 2018 | B2 |
10140544 | Zhao et al. | Nov 2018 | B1 |
10146225 | Ryan | Dec 2018 | B2 |
10152655 | Krishnamurthy et al. | Dec 2018 | B2 |
10167800 | Chung et al. | Jan 2019 | B1 |
10169680 | Sachdeva et al. | Jan 2019 | B1 |
10192016 | Ng et al. | Jan 2019 | B2 |
10216189 | Haynes | Feb 2019 | B1 |
10228693 | Micks et al. | Mar 2019 | B2 |
10242293 | Shim et al. | Mar 2019 | B2 |
10248121 | VandenBerg, III | Apr 2019 | B2 |
10262218 | Lee et al. | Apr 2019 | B2 |
10282623 | Ziyaee et al. | May 2019 | B1 |
10296828 | Viswanathan | May 2019 | B2 |
10303961 | Stoffel et al. | May 2019 | B1 |
10310087 | Laddha et al. | Jun 2019 | B2 |
10311312 | Yu et al. | Jun 2019 | B2 |
10318848 | Dijkman et al. | Jun 2019 | B2 |
10325178 | Tang et al. | Jun 2019 | B1 |
10331974 | Zia et al. | Jun 2019 | B2 |
10338600 | Yoon et al. | Jul 2019 | B2 |
10343607 | Kumon et al. | Jul 2019 | B2 |
10359783 | Williams et al. | Jul 2019 | B2 |
10366290 | Wang et al. | Jul 2019 | B2 |
10372130 | Kaushansky et al. | Aug 2019 | B1 |
10373019 | Nariyambut Murali et al. | Aug 2019 | B2 |
10373026 | Kim et al. | Aug 2019 | B1 |
10380741 | Yedla et al. | Aug 2019 | B2 |
10394237 | Xu et al. | Aug 2019 | B2 |
10395144 | Zeng et al. | Aug 2019 | B2 |
10402646 | Klaus | Sep 2019 | B2 |
10402986 | Ray et al. | Sep 2019 | B2 |
10414395 | Sapp et al. | Sep 2019 | B1 |
10423934 | Zanghi et al. | Sep 2019 | B1 |
10436615 | Agarwal et al. | Oct 2019 | B2 |
10452905 | Segalovitz et al. | Oct 2019 | B2 |
10460053 | Olson et al. | Oct 2019 | B2 |
10467459 | Chen et al. | Nov 2019 | B2 |
10468008 | Beckman et al. | Nov 2019 | B2 |
10468062 | Levinson et al. | Nov 2019 | B1 |
10470510 | Koh et al. | Nov 2019 | B1 |
10474160 | Huang et al. | Nov 2019 | B2 |
10474161 | Huang et al. | Nov 2019 | B2 |
10474928 | Sivakumar et al. | Nov 2019 | B2 |
10489126 | Kumar et al. | Nov 2019 | B2 |
10489972 | Atsmon | Nov 2019 | B2 |
10503971 | Dang et al. | Dec 2019 | B1 |
10514711 | Bar-Nahum et al. | Dec 2019 | B2 |
10528824 | Zou | Jan 2020 | B2 |
10529078 | Abreu et al. | Jan 2020 | B2 |
10529088 | Fine et al. | Jan 2020 | B2 |
10534854 | Sharma et al. | Jan 2020 | B2 |
10535191 | Sachdeva et al. | Jan 2020 | B2 |
10542930 | Sanchez et al. | Jan 2020 | B1 |
10546197 | Shrestha et al. | Jan 2020 | B2 |
10546217 | Albright et al. | Jan 2020 | B2 |
10552682 | Jonsson et al. | Feb 2020 | B2 |
10559386 | Neuman | Feb 2020 | B1 |
10565475 | Lecue et al. | Feb 2020 | B2 |
10567674 | Kirsch | Feb 2020 | B2 |
10568570 | Sherpa et al. | Feb 2020 | B1 |
10572717 | Zhu et al. | Feb 2020 | B1 |
10574905 | Srikanth et al. | Feb 2020 | B2 |
10579058 | Oh et al. | Mar 2020 | B2 |
10579063 | Haynes et al. | Mar 2020 | B2 |
10579897 | Redmon et al. | Mar 2020 | B2 |
10586280 | McKenna et al. | Mar 2020 | B2 |
10591914 | Palanisamy et al. | Mar 2020 | B2 |
10592785 | Zhu et al. | Mar 2020 | B2 |
10599701 | Liu | Mar 2020 | B2 |
10599930 | Lee et al. | Mar 2020 | B2 |
10599958 | He et al. | Mar 2020 | B2 |
10606990 | Tuli et al. | Mar 2020 | B2 |
10609434 | Singhai et al. | Mar 2020 | B2 |
10614344 | Anthony et al. | Apr 2020 | B2 |
10621513 | Deshpande et al. | Apr 2020 | B2 |
10627818 | Sapp et al. | Apr 2020 | B2 |
10628432 | Guo et al. | Apr 2020 | B2 |
10628686 | Ogale et al. | Apr 2020 | B2 |
10628688 | Kim et al. | Apr 2020 | B1 |
10629080 | Kazemi et al. | Apr 2020 | B2 |
10636161 | Uchigaito | Apr 2020 | B2 |
10636169 | Estrada et al. | Apr 2020 | B2 |
10642275 | Silva et al. | May 2020 | B2 |
10645344 | Marman et al. | May 2020 | B2 |
10649464 | Gray | May 2020 | B2 |
10650071 | Asgekar et al. | May 2020 | B2 |
10652565 | Zhang et al. | May 2020 | B1 |
10656657 | Djuric et al. | May 2020 | B2 |
10657391 | Chen et al. | May 2020 | B2 |
10657418 | Marder et al. | May 2020 | B2 |
10657934 | Kolen et al. | May 2020 | B1 |
10661902 | Tavshikar | May 2020 | B1 |
10664750 | Greene | May 2020 | B2 |
10671082 | Huang et al. | Jun 2020 | B2 |
10671886 | Price et al. | Jun 2020 | B2 |
10678244 | Iandola et al. | Jun 2020 | B2 |
10678839 | Gordon et al. | Jun 2020 | B2 |
10678997 | Ahuja et al. | Jun 2020 | B2 |
10679129 | Baker | Jun 2020 | B2 |
10685159 | Su et al. | Jun 2020 | B2 |
10685188 | Zhang et al. | Jun 2020 | B1 |
10692000 | Surazhsky et al. | Jun 2020 | B2 |
10692242 | Morrison et al. | Jun 2020 | B1 |
10693740 | Coccia et al. | Jun 2020 | B2 |
10698868 | Guggilla et al. | Jun 2020 | B2 |
10699119 | Lo et al. | Jun 2020 | B2 |
10699140 | Kench et al. | Jun 2020 | B2 |
10699477 | Levinson et al. | Jun 2020 | B2 |
10713502 | Tiziani | Jul 2020 | B2 |
10719759 | Kutliroff | Jul 2020 | B2 |
10725475 | Yang et al. | Jul 2020 | B2 |
10726264 | Sawhney et al. | Jul 2020 | B2 |
10726279 | Kim et al. | Jul 2020 | B1 |
10726374 | Engineer et al. | Jul 2020 | B1 |
10732261 | Wang et al. | Aug 2020 | B1 |
10733262 | Miller et al. | Aug 2020 | B2 |
10733482 | Lee et al. | Aug 2020 | B1 |
10733638 | Jain et al. | Aug 2020 | B1 |
10733755 | Liao et al. | Aug 2020 | B2 |
10733876 | Moura et al. | Aug 2020 | B2 |
10740563 | Dugan | Aug 2020 | B2 |
10740914 | Xiao et al. | Aug 2020 | B2 |
10748062 | Rippel et al. | Aug 2020 | B2 |
10748247 | Paluri | Aug 2020 | B2 |
10751879 | Li et al. | Aug 2020 | B2 |
10755112 | Mabuchi | Aug 2020 | B2 |
10755575 | Johnston et al. | Aug 2020 | B2 |
10757330 | Ashrafi | Aug 2020 | B2 |
10762396 | Vallespi et al. | Sep 2020 | B2 |
10768628 | Martin et al. | Sep 2020 | B2 |
10768629 | Song et al. | Sep 2020 | B2 |
10769446 | Chang et al. | Sep 2020 | B2 |
10769483 | Nirenberg et al. | Sep 2020 | B2 |
10769493 | Yu et al. | Sep 2020 | B2 |
10769494 | Xiao et al. | Sep 2020 | B2 |
10769525 | Redding et al. | Sep 2020 | B2 |
10776626 | Lin et al. | Sep 2020 | B1 |
10776673 | Kim et al. | Sep 2020 | B2 |
10776939 | Ma et al. | Sep 2020 | B2 |
10779760 | Lee et al. | Sep 2020 | B2 |
10783381 | Yu et al. | Sep 2020 | B2 |
10783454 | Shoaib et al. | Sep 2020 | B2 |
10789402 | Vemuri et al. | Sep 2020 | B1 |
10789544 | Fiedel et al. | Sep 2020 | B2 |
10790919 | Kolen et al. | Sep 2020 | B1 |
10796221 | Zhang et al. | Oct 2020 | B2 |
10796355 | Price et al. | Oct 2020 | B1 |
10796423 | Goja | Oct 2020 | B2 |
10798368 | Briggs et al. | Oct 2020 | B2 |
10803325 | Bai et al. | Oct 2020 | B2 |
10803328 | Bai et al. | Oct 2020 | B1 |
10803743 | Abari et al. | Oct 2020 | B2 |
10805629 | Liu et al. | Oct 2020 | B2 |
10809730 | Chintakindi | Oct 2020 | B2 |
10810445 | Kangaspunta | Oct 2020 | B1 |
10816346 | Wheeler et al. | Oct 2020 | B2 |
10816992 | Chen | Oct 2020 | B2 |
10817731 | Vallespi et al. | Oct 2020 | B2 |
10817732 | Porter et al. | Oct 2020 | B2 |
10819923 | McCauley et al. | Oct 2020 | B1 |
10824122 | Mummadi et al. | Nov 2020 | B2 |
10824862 | Qi et al. | Nov 2020 | B2 |
10828790 | Nemallan | Nov 2020 | B2 |
10832057 | Chan et al. | Nov 2020 | B2 |
10832093 | Taralova et al. | Nov 2020 | B1 |
10832414 | Pfeiffer | Nov 2020 | B2 |
10832418 | Karasev et al. | Nov 2020 | B1 |
10833785 | O'Shea et al. | Nov 2020 | B1 |
10836379 | Xiao et al. | Nov 2020 | B2 |
10838936 | Cohen | Nov 2020 | B2 |
10839230 | Charette et al. | Nov 2020 | B2 |
10839578 | Coppersmith et al. | Nov 2020 | B2 |
10843628 | Kawamoto et al. | Nov 2020 | B2 |
10845820 | Wheeler | Nov 2020 | B2 |
10845943 | Ansari et al. | Nov 2020 | B1 |
10846831 | Raduta | Nov 2020 | B2 |
10846888 | Kaplanyan et al. | Nov 2020 | B2 |
10853670 | Sholingar et al. | Dec 2020 | B2 |
10853739 | Truong et al. | Dec 2020 | B2 |
10860919 | Kanazawa et al. | Dec 2020 | B2 |
10860924 | Burger | Dec 2020 | B2 |
10867444 | Russell et al. | Dec 2020 | B2 |
10871444 | Al et al. | Dec 2020 | B2 |
10871782 | Milstein et al. | Dec 2020 | B2 |
10872204 | Zhu et al. | Dec 2020 | B2 |
10872254 | Mangla et al. | Dec 2020 | B2 |
10872326 | Garner | Dec 2020 | B2 |
10872531 | Liu et al. | Dec 2020 | B2 |
10885083 | Moeller-Bertram et al. | Jan 2021 | B2 |
10887433 | Fu et al. | Jan 2021 | B2 |
10890898 | Akella et al. | Jan 2021 | B2 |
10891715 | Li | Jan 2021 | B2 |
10891735 | Yang et al. | Jan 2021 | B2 |
10893070 | Wang et al. | Jan 2021 | B2 |
10893107 | Callari et al. | Jan 2021 | B1 |
10896763 | Kempanna et al. | Jan 2021 | B2 |
10901416 | Khanna et al. | Jan 2021 | B2 |
10901508 | Laszlo et al. | Jan 2021 | B2 |
10902551 | Mellado et al. | Jan 2021 | B1 |
10908068 | Amer et al. | Feb 2021 | B2 |
10908606 | Stein et al. | Feb 2021 | B2 |
10909368 | Guo et al. | Feb 2021 | B2 |
10909453 | Myers et al. | Feb 2021 | B1 |
10915783 | Hallman et al. | Feb 2021 | B1 |
10917522 | Segalis et al. | Feb 2021 | B2 |
10921817 | Kangaspunta | Feb 2021 | B1 |
10922578 | Banerjee et al. | Feb 2021 | B2 |
10924661 | Vasconcelos et al. | Feb 2021 | B2 |
10928508 | Swaminathan | Feb 2021 | B2 |
10929757 | Baker et al. | Feb 2021 | B2 |
10930065 | Grant et al. | Feb 2021 | B2 |
10936908 | Ho et al. | Mar 2021 | B1 |
10937186 | Wang et al. | Mar 2021 | B2 |
10943101 | Agarwal et al. | Mar 2021 | B2 |
10943132 | Wang et al. | Mar 2021 | B2 |
10943355 | Fagg et al. | Mar 2021 | B2 |
11361457 | Shen | Jun 2022 | B2 |
20030035481 | Hahm | Feb 2003 | A1 |
20050162445 | Sheasby et al. | Jul 2005 | A1 |
20060072847 | Chor et al. | Apr 2006 | A1 |
20060224533 | Thaler | Oct 2006 | A1 |
20060280364 | Ma et al. | Dec 2006 | A1 |
20070031064 | Zhao | Feb 2007 | A1 |
20080225048 | Bijankumar | Sep 2008 | A1 |
20080247635 | Davis | Oct 2008 | A1 |
20090016571 | Tijerina et al. | Jan 2009 | A1 |
20100118157 | Kameyama | May 2010 | A1 |
20120109915 | Kamekawa et al. | May 2012 | A1 |
20120110491 | Cheung | May 2012 | A1 |
20120128205 | Lee | May 2012 | A1 |
20120134595 | Fonseca et al. | May 2012 | A1 |
20150104102 | Carreira et al. | Apr 2015 | A1 |
20160132786 | Balan et al. | May 2016 | A1 |
20160328856 | Mannino et al. | Nov 2016 | A1 |
20170011281 | Dihkman et al. | Jan 2017 | A1 |
20170158134 | Shigemura | Jun 2017 | A1 |
20170206434 | Nariyambut et al. | Jul 2017 | A1 |
20180012082 | Satazoda | Jan 2018 | A1 |
20180012411 | Richey et al. | Jan 2018 | A1 |
20180018590 | Szeto et al. | Jan 2018 | A1 |
20180039853 | Liu et al. | Feb 2018 | A1 |
20180067489 | Oder et al. | Mar 2018 | A1 |
20180068459 | Zhang et al. | Mar 2018 | A1 |
20180068540 | Romanenko et al. | Mar 2018 | A1 |
20180074506 | Branson | Mar 2018 | A1 |
20180121762 | Han et al. | May 2018 | A1 |
20180129919 | Tang | May 2018 | A1 |
20180150081 | Gross et al. | May 2018 | A1 |
20180211403 | Hotson et al. | Jul 2018 | A1 |
20180308012 | Mummadi et al. | Oct 2018 | A1 |
20180314878 | Lee et al. | Nov 2018 | A1 |
20180357511 | Misra et al. | Dec 2018 | A1 |
20180374105 | Azout et al. | Dec 2018 | A1 |
20190023277 | Roger et al. | Jan 2019 | A1 |
20190025773 | Yang et al. | Jan 2019 | A1 |
20190042894 | Anderson | Feb 2019 | A1 |
20190042919 | Peysakhovich et al. | Feb 2019 | A1 |
20190042944 | Nair et al. | Feb 2019 | A1 |
20190042948 | Lee et al. | Feb 2019 | A1 |
20190057314 | Julian et al. | Feb 2019 | A1 |
20190065637 | Bogdoll et al. | Feb 2019 | A1 |
20190072978 | Levi | Mar 2019 | A1 |
20190079526 | Vallespi et al. | Mar 2019 | A1 |
20190080602 | Rice et al. | Mar 2019 | A1 |
20190095780 | Zhong et al. | Mar 2019 | A1 |
20190095946 | Azout et al. | Mar 2019 | A1 |
20190101914 | Coleman et al. | Apr 2019 | A1 |
20190108417 | Talagala et al. | Apr 2019 | A1 |
20190122111 | Min et al. | Apr 2019 | A1 |
20190130255 | Yim et al. | May 2019 | A1 |
20190145765 | Luo et al. | May 2019 | A1 |
20190146497 | Urtasun et al. | May 2019 | A1 |
20190147112 | Gordon | May 2019 | A1 |
20190147250 | Zhang et al. | May 2019 | A1 |
20190147254 | Bai et al. | May 2019 | A1 |
20190147255 | Homayounfar et al. | May 2019 | A1 |
20190147335 | Wang et al. | May 2019 | A1 |
20190147372 | Luo et al. | May 2019 | A1 |
20190158784 | Ahn et al. | May 2019 | A1 |
20190180154 | Orlov et al. | Jun 2019 | A1 |
20190185010 | Ganguli et al. | Jun 2019 | A1 |
20190189251 | Horiuchi et al. | Jun 2019 | A1 |
20190197357 | Anderson et al. | Jun 2019 | A1 |
20190204842 | Jafari et al. | Jul 2019 | A1 |
20190205402 | Sernau et al. | Jul 2019 | A1 |
20190205667 | Avidan et al. | Jul 2019 | A1 |
20190217791 | Bradley et al. | Jul 2019 | A1 |
20190227562 | Mohammadiha et al. | Jul 2019 | A1 |
20190228037 | Nicol et al. | Jul 2019 | A1 |
20190230282 | Sypitkowski et al. | Jul 2019 | A1 |
20190235499 | Kazemi et al. | Aug 2019 | A1 |
20190236437 | Shin et al. | Aug 2019 | A1 |
20190243371 | Nister et al. | Aug 2019 | A1 |
20190244138 | Bhowmick et al. | Aug 2019 | A1 |
20190250622 | Nister et al. | Aug 2019 | A1 |
20190250626 | Ghafarianzadeh et al. | Aug 2019 | A1 |
20190250640 | O'Flaherty et al. | Aug 2019 | A1 |
20190258878 | Koivisto et al. | Aug 2019 | A1 |
20190266418 | Xu et al. | Aug 2019 | A1 |
20190266610 | Ghatage et al. | Aug 2019 | A1 |
20190272446 | Kangaspunta et al. | Sep 2019 | A1 |
20190276041 | Choi et al. | Sep 2019 | A1 |
20190279004 | Kwon et al. | Sep 2019 | A1 |
20190286652 | Habbecke et al. | Sep 2019 | A1 |
20190286972 | El Husseini et al. | Sep 2019 | A1 |
20190287028 | St Amant et al. | Sep 2019 | A1 |
20190289281 | Badrinarayanan et al. | Sep 2019 | A1 |
20190294177 | Kwon et al. | Sep 2019 | A1 |
20190294975 | Sachs | Sep 2019 | A1 |
20190311290 | Huang et al. | Oct 2019 | A1 |
20190318099 | Carvalho et al. | Oct 2019 | A1 |
20190325088 | Dubey et al. | Oct 2019 | A1 |
20190325266 | Klepper et al. | Oct 2019 | A1 |
20190325269 | Bagherinezhad et al. | Oct 2019 | A1 |
20190325580 | Lukac et al. | Oct 2019 | A1 |
20190325595 | Stein et al. | Oct 2019 | A1 |
20190329790 | Nandakumar et al. | Oct 2019 | A1 |
20190332875 | Vallespi-Gonzalez et al. | Oct 2019 | A1 |
20190333232 | Vallespi-Gonzalez et al. | Oct 2019 | A1 |
20190336063 | Dascalu | Nov 2019 | A1 |
20190339989 | Liang et al. | Nov 2019 | A1 |
20190340462 | Pao et al. | Nov 2019 | A1 |
20190340492 | Burger et al. | Nov 2019 | A1 |
20190340499 | Burger et al. | Nov 2019 | A1 |
20190347501 | Kim et al. | Nov 2019 | A1 |
20190349571 | Herman et al. | Nov 2019 | A1 |
20190354782 | Kee et al. | Nov 2019 | A1 |
20190354786 | Lee et al. | Nov 2019 | A1 |
20190354808 | Park et al. | Nov 2019 | A1 |
20190354817 | Shlens et al. | Nov 2019 | A1 |
20190354850 | Watson et al. | Nov 2019 | A1 |
20190370398 | He et al. | Dec 2019 | A1 |
20190370575 | Nandakumar et al. | Dec 2019 | A1 |
20190370935 | Chang et al. | Dec 2019 | A1 |
20190373322 | Rojas-Echenique et al. | Dec 2019 | A1 |
20190377345 | Bachrach et al. | Dec 2019 | A1 |
20190377965 | Totolos et al. | Dec 2019 | A1 |
20190378049 | Widmann et al. | Dec 2019 | A1 |
20190378051 | Widmann et al. | Dec 2019 | A1 |
20190382007 | Casas et al. | Dec 2019 | A1 |
20190384303 | Muller et al. | Dec 2019 | A1 |
20190384304 | Towal et al. | Dec 2019 | A1 |
20190384309 | Silva et al. | Dec 2019 | A1 |
20190384994 | Frossard et al. | Dec 2019 | A1 |
20190385048 | Cassidy et al. | Dec 2019 | A1 |
20190385360 | Yang et al. | Dec 2019 | A1 |
20200004259 | Gulino et al. | Jan 2020 | A1 |
20200004351 | Marchant et al. | Jan 2020 | A1 |
20200012936 | Lee et al. | Jan 2020 | A1 |
20200017117 | Milton | Jan 2020 | A1 |
20200025931 | Liang et al. | Jan 2020 | A1 |
20200026282 | Choe et al. | Jan 2020 | A1 |
20200026283 | Barnes et al. | Jan 2020 | A1 |
20200026992 | Zhang et al. | Jan 2020 | A1 |
20200027210 | Haemel et al. | Jan 2020 | A1 |
20200033858 | Xiao | Jan 2020 | A1 |
20200033865 | Mellinger et al. | Jan 2020 | A1 |
20200034665 | Ghanta et al. | Jan 2020 | A1 |
20200034710 | Sidhu et al. | Jan 2020 | A1 |
20200036948 | Song | Jan 2020 | A1 |
20200039520 | Misu et al. | Feb 2020 | A1 |
20200051550 | Baker | Feb 2020 | A1 |
20200060757 | Ben-Haim et al. | Feb 2020 | A1 |
20200065711 | Clément et al. | Feb 2020 | A1 |
20200065879 | Hu et al. | Feb 2020 | A1 |
20200069973 | Lou et al. | Mar 2020 | A1 |
20200073385 | Jobanputra et al. | Mar 2020 | A1 |
20200074230 | Englard et al. | Mar 2020 | A1 |
20200086880 | Poeppel et al. | Mar 2020 | A1 |
20200089243 | Poeppel et al. | Mar 2020 | A1 |
20200089969 | Lakshmi et al. | Mar 2020 | A1 |
20200090056 | Singhal et al. | Mar 2020 | A1 |
20200097841 | Petousis et al. | Mar 2020 | A1 |
20200098095 | Borcs et al. | Mar 2020 | A1 |
20200103894 | Cella et al. | Apr 2020 | A1 |
20200104705 | Bhowmick et al. | Apr 2020 | A1 |
20200110416 | Hong et al. | Apr 2020 | A1 |
20200117180 | Cella et al. | Apr 2020 | A1 |
20200117889 | Laput et al. | Apr 2020 | A1 |
20200117916 | Liu | Apr 2020 | A1 |
20200117917 | Yoo | Apr 2020 | A1 |
20200118035 | Asawa et al. | Apr 2020 | A1 |
20200125844 | She et al. | Apr 2020 | A1 |
20200125845 | Hess et al. | Apr 2020 | A1 |
20200126129 | Lkhamsuren et al. | Apr 2020 | A1 |
20200134427 | Oh et al. | Apr 2020 | A1 |
20200134461 | Chai et al. | Apr 2020 | A1 |
20200134466 | Weintraub et al. | Apr 2020 | A1 |
20200134848 | El-Khamy et al. | Apr 2020 | A1 |
20200143231 | Fusi et al. | May 2020 | A1 |
20200143279 | West et al. | May 2020 | A1 |
20200148201 | King et al. | May 2020 | A1 |
20200149898 | Felip et al. | May 2020 | A1 |
20200151201 | Chandrasekhar et al. | May 2020 | A1 |
20200151619 | Mopur et al. | May 2020 | A1 |
20200151692 | Gao et al. | May 2020 | A1 |
20200158822 | Owens et al. | May 2020 | A1 |
20200158869 | Amirloo et al. | May 2020 | A1 |
20200159225 | Zeng et al. | May 2020 | A1 |
20200160064 | Wang et al. | May 2020 | A1 |
20200160104 | Urtasun et al. | May 2020 | A1 |
20200160117 | Urtasun et al. | May 2020 | A1 |
20200160178 | Kar et al. | May 2020 | A1 |
20200160532 | Urtasun et al. | May 2020 | A1 |
20200160558 | Urtasun et al. | May 2020 | A1 |
20200160559 | Urtasun et al. | May 2020 | A1 |
20200160598 | Manivasagam et al. | May 2020 | A1 |
20200162489 | Bar-Nahum et al. | May 2020 | A1 |
20200167438 | Herring | May 2020 | A1 |
20200167554 | Wang et al. | May 2020 | A1 |
20200174481 | Van Heukelom et al. | Jun 2020 | A1 |
20200175326 | Shen et al. | Jun 2020 | A1 |
20200175354 | Volodarskiy et al. | Jun 2020 | A1 |
20200175371 | Kursun | Jun 2020 | A1 |
20200175401 | Shen | Jun 2020 | A1 |
20200183482 | Sebot et al. | Jun 2020 | A1 |
20200184250 | Oko | Jun 2020 | A1 |
20200184333 | Oh | Jun 2020 | A1 |
20200192389 | ReMine et al. | Jun 2020 | A1 |
20200193313 | Ghanta et al. | Jun 2020 | A1 |
20200193328 | Guestrin et al. | Jun 2020 | A1 |
20200202136 | Shrestha et al. | Jun 2020 | A1 |
20200202196 | Guo et al. | Jun 2020 | A1 |
20200209857 | Djuric et al. | Jul 2020 | A1 |
20200209867 | Valois et al. | Jul 2020 | A1 |
20200209874 | Chen et al. | Jul 2020 | A1 |
20200210717 | Hou et al. | Jul 2020 | A1 |
20200210769 | Hou et al. | Jul 2020 | A1 |
20200210777 | Valois et al. | Jul 2020 | A1 |
20200216064 | du Toit et al. | Jul 2020 | A1 |
20200218722 | Mai et al. | Jul 2020 | A1 |
20200218979 | Kwon et al. | Jul 2020 | A1 |
20200223434 | Campos et al. | Jul 2020 | A1 |
20200225758 | Tang et al. | Jul 2020 | A1 |
20200226377 | Campos et al. | Jul 2020 | A1 |
20200226430 | Ahuja et al. | Jul 2020 | A1 |
20200238998 | Dasalukunte et al. | Jul 2020 | A1 |
20200242381 | Chao et al. | Jul 2020 | A1 |
20200242408 | Kim et al. | Jul 2020 | A1 |
20200242511 | Kale et al. | Jul 2020 | A1 |
20200245869 | Sivan et al. | Aug 2020 | A1 |
20200249685 | Elluswamy et al. | Aug 2020 | A1 |
20200250456 | Wang et al. | Aug 2020 | A1 |
20200250515 | Rifkin et al. | Aug 2020 | A1 |
20200250874 | Assouline et al. | Aug 2020 | A1 |
20200257301 | Weiser et al. | Aug 2020 | A1 |
20200257306 | Nisenzon | Aug 2020 | A1 |
20200258057 | Farahat et al. | Aug 2020 | A1 |
20200265247 | Musk et al. | Aug 2020 | A1 |
20200272160 | Djuric et al. | Aug 2020 | A1 |
20200272162 | Hasselgren et al. | Aug 2020 | A1 |
20200272859 | Iashyn et al. | Aug 2020 | A1 |
20200273231 | Schied et al. | Aug 2020 | A1 |
20200279354 | Klaiman | Sep 2020 | A1 |
20200279364 | Sarkisian et al. | Sep 2020 | A1 |
20200279371 | Wenzel et al. | Sep 2020 | A1 |
20200285464 | Brebner | Sep 2020 | A1 |
20200286256 | Houts et al. | Sep 2020 | A1 |
20200293786 | Jia et al. | Sep 2020 | A1 |
20200293796 | Sajjadi et al. | Sep 2020 | A1 |
20200293828 | Wang et al. | Sep 2020 | A1 |
20200293905 | Huang et al. | Sep 2020 | A1 |
20200294162 | Shah | Sep 2020 | A1 |
20200294257 | Yoo et al. | Sep 2020 | A1 |
20200294310 | Lee et al. | Sep 2020 | A1 |
20200297237 | Tamersoy et al. | Sep 2020 | A1 |
20200298891 | Liang et al. | Sep 2020 | A1 |
20200301799 | Manivasagam et al. | Sep 2020 | A1 |
20200302276 | Yang et al. | Sep 2020 | A1 |
20200302291 | Hong | Sep 2020 | A1 |
20200302627 | Duggal et al. | Sep 2020 | A1 |
20200302662 | Homayounfar et al. | Sep 2020 | A1 |
20200304441 | Bradley et al. | Sep 2020 | A1 |
20200306640 | Kolen et al. | Oct 2020 | A1 |
20200307562 | Ghafarianzadeh et al. | Oct 2020 | A1 |
20200307563 | Ghafarianzadeh et al. | Oct 2020 | A1 |
20200309536 | Omari et al. | Oct 2020 | A1 |
20200309923 | Bhaskaran et al. | Oct 2020 | A1 |
20200310442 | Halder et al. | Oct 2020 | A1 |
20200311601 | Robinson et al. | Oct 2020 | A1 |
20200312003 | Borovikov et al. | Oct 2020 | A1 |
20200315708 | Mosnier et al. | Oct 2020 | A1 |
20200320132 | Neumann | Oct 2020 | A1 |
20200324073 | Rajan et al. | Oct 2020 | A1 |
20200327192 | Hackman et al. | Oct 2020 | A1 |
20200327443 | Van et al. | Oct 2020 | A1 |
20200327449 | Tiwari et al. | Oct 2020 | A1 |
20200327662 | Liu et al. | Oct 2020 | A1 |
20200327667 | Arbel et al. | Oct 2020 | A1 |
20200331476 | Chen et al. | Oct 2020 | A1 |
20200334416 | Vianu et al. | Oct 2020 | A1 |
20200334495 | Al et al. | Oct 2020 | A1 |
20200334501 | Lin et al. | Oct 2020 | A1 |
20200334551 | Javidi et al. | Oct 2020 | A1 |
20200334574 | Ishida | Oct 2020 | A1 |
20200337648 | Saripalli et al. | Oct 2020 | A1 |
20200341466 | Pham et al. | Oct 2020 | A1 |
20200342350 | Madar et al. | Oct 2020 | A1 |
20200342548 | Mazed et al. | Oct 2020 | A1 |
20200342652 | Rowell et al. | Oct 2020 | A1 |
20200348909 | Das Sarma et al. | Nov 2020 | A1 |
20200350063 | Thornton et al. | Nov 2020 | A1 |
20200351438 | Dewhurst et al. | Nov 2020 | A1 |
20200356107 | Wells | Nov 2020 | A1 |
20200356790 | Jaipuria et al. | Nov 2020 | A1 |
20200356864 | Neumann | Nov 2020 | A1 |
20200356905 | Luk et al. | Nov 2020 | A1 |
20200361083 | Mousavian et al. | Nov 2020 | A1 |
20200361485 | Zhu et al. | Nov 2020 | A1 |
20200364481 | Kornienko et al. | Nov 2020 | A1 |
20200364508 | Gurel et al. | Nov 2020 | A1 |
20200364540 | Elsayed et al. | Nov 2020 | A1 |
20200364746 | Longano et al. | Nov 2020 | A1 |
20200364953 | Simoudis | Nov 2020 | A1 |
20200372362 | Kim | Nov 2020 | A1 |
20200372402 | Kursun et al. | Nov 2020 | A1 |
20200380362 | Cao et al. | Dec 2020 | A1 |
20200380383 | Kwong et al. | Dec 2020 | A1 |
20200393841 | Frisbie et al. | Dec 2020 | A1 |
20200394421 | Yu et al. | Dec 2020 | A1 |
20200394457 | Brady | Dec 2020 | A1 |
20200394495 | Moudgill et al. | Dec 2020 | A1 |
20200394813 | Theverapperuma et al. | Dec 2020 | A1 |
20200396394 | Zlokolica et al. | Dec 2020 | A1 |
20200398855 | Thompson | Dec 2020 | A1 |
20200401850 | Bazarsky et al. | Dec 2020 | A1 |
20200401886 | Deng et al. | Dec 2020 | A1 |
20200402155 | Kurian et al. | Dec 2020 | A1 |
20200402226 | Peng | Dec 2020 | A1 |
20200410012 | Moon et al. | Dec 2020 | A1 |
20200410224 | Goel | Dec 2020 | A1 |
20200410254 | Pham et al. | Dec 2020 | A1 |
20200410288 | Capota et al. | Dec 2020 | A1 |
20200410751 | Omari et al. | Dec 2020 | A1 |
20210004014 | Sivakumar | Jan 2021 | A1 |
20210004580 | Sundararaman et al. | Jan 2021 | A1 |
20210004611 | Garimella et al. | Jan 2021 | A1 |
20210004663 | Park et al. | Jan 2021 | A1 |
20210006835 | Slattery et al. | Jan 2021 | A1 |
20210011908 | Hayes et al. | Jan 2021 | A1 |
20210012116 | Urtasun et al. | Jan 2021 | A1 |
20210012210 | Sikka et al. | Jan 2021 | A1 |
20210012230 | Hayes et al. | Jan 2021 | A1 |
20210012239 | Arzani et al. | Jan 2021 | A1 |
20210015240 | Elfakhri et al. | Jan 2021 | A1 |
20210019215 | Neeter | Jan 2021 | A1 |
20210026360 | Luo | Jan 2021 | A1 |
20210027112 | Brewington et al. | Jan 2021 | A1 |
20210027117 | McGavran et al. | Jan 2021 | A1 |
20210030276 | Li et al. | Feb 2021 | A1 |
20210034921 | Pinkovich et al. | Feb 2021 | A1 |
20210042575 | Firner | Feb 2021 | A1 |
20210042928 | Takeda et al. | Feb 2021 | A1 |
20210046954 | Haynes | Feb 2021 | A1 |
20210049378 | Gautam et al. | Feb 2021 | A1 |
20210049455 | Kursun | Feb 2021 | A1 |
20210049456 | Kursun | Feb 2021 | A1 |
20210049548 | Grisz et al. | Feb 2021 | A1 |
20210049700 | Nguyen et al. | Feb 2021 | A1 |
20210056114 | Price et al. | Feb 2021 | A1 |
20210056306 | Hu et al. | Feb 2021 | A1 |
20210056317 | Golov | Feb 2021 | A1 |
20210056420 | Konishi et al. | Feb 2021 | A1 |
20210056701 | Vranceanu et al. | Feb 2021 | A1 |
Foreign Patent Documents
Number | Date | Country |
---|---|---|
2019261735 | Jun 2020 | AU |
2019201716 | Oct 2020 | AU |
110599537 | Dec 2019 | CN |
102737236 | Oct 2012 | CN |
103366339 | Oct 2013 | CN |
104835114 | Aug 2015 | CN |
103236037 | May 2016 | CN |
103500322 | Aug 2016 | CN |
106419893 | Feb 2017 | CN |
106504253 | Mar 2017 | CN |
107031600 | Aug 2017 | CN |
107169421 | Sep 2017 | CN |
107507134 | Dec 2017 | CN |
107885214 | Apr 2018 | CN |
108122234 | Jun 2018 | CN |
107133943 | Jul 2018 | CN |
107368926 | Jul 2018 | CN |
105318888 | Aug 2018 | CN |
108491889 | Sep 2018 | CN |
108647591 | Oct 2018 | CN |
108710865 | Oct 2018 | CN |
105550701 | Nov 2018 | CN |
108764185 | Nov 2018 | CN |
108845574 | Nov 2018 | CN |
108898177 | Nov 2018 | CN |
109086867 | Dec 2018 | CN |
107103113 | Jan 2019 | CN |
109215067 | Jan 2019 | CN |
109359731 | Feb 2019 | CN |
109389207 | Feb 2019 | CN |
109389552 | Feb 2019 | CN |
106779060 | Mar 2019 | CN |
109579856 | Apr 2019 | CN |
109615073 | Apr 2019 | CN |
106156754 | May 2019 | CN |
106598226 | May 2019 | CN |
106650922 | May 2019 | CN |
109791626 | May 2019 | CN |
109901595 | Jun 2019 | CN |
109902732 | Jun 2019 | CN |
109934163 | Jun 2019 | CN |
109948428 | Jun 2019 | CN |
109949257 | Jun 2019 | CN |
109951710 | Jun 2019 | CN |
109975308 | Jul 2019 | CN |
109978132 | Jul 2019 | CN |
109978161 | Jul 2019 | CN |
110060202 | Jul 2019 | CN |
110069071 | Jul 2019 | CN |
110084086 | Aug 2019 | CN |
110096937 | Aug 2019 | CN |
110111340 | Aug 2019 | CN |
110135485 | Aug 2019 | CN |
110197270 | Sep 2019 | CN |
110310264 | Oct 2019 | CN |
110321965 | Oct 2019 | CN |
110334801 | Oct 2019 | CN |
110399875 | Nov 2019 | CN |
110414362 | Nov 2019 | CN |
110426051 | Nov 2019 | CN |
110473173 | Nov 2019 | CN |
110516665 | Nov 2019 | CN |
110543837 | Dec 2019 | CN |
110569899 | Dec 2019 | CN |
110599864 | Dec 2019 | CN |
110619282 | Dec 2019 | CN |
110619283 | Dec 2019 | CN |
110619330 | Dec 2019 | CN |
110659628 | Jan 2020 | CN |
110688992 | Jan 2020 | CN |
107742311 | Feb 2020 | CN |
110751280 | Feb 2020 | CN |
110826566 | Feb 2020 | CN |
107451659 | Apr 2020 | CN |
108111873 | Apr 2020 | CN |
110956185 | Apr 2020 | CN |
110966991 | Apr 2020 | CN |
111027549 | Apr 2020 | CN |
111027575 | Apr 2020 | CN |
111047225 | Apr 2020 | CN |
111126453 | May 2020 | CN |
111158355 | May 2020 | CN |
107729998 | Jun 2020 | CN |
108549934 | Jun 2020 | CN |
111275129 | Jun 2020 | CN |
111275618 | Jun 2020 | CN |
111326023 | Jun 2020 | CN |
111428943 | Jul 2020 | CN |
111444821 | Jul 2020 | CN |
111445420 | Jul 2020 | CN |
111461052 | Jul 2020 | CN |
111461053 | Jul 2020 | CN |
111461110 | Jul 2020 | CN |
110225341 | Aug 2020 | CN |
111307162 | Aug 2020 | CN |
111488770 | Aug 2020 | CN |
111539514 | Aug 2020 | CN |
111565318 | Aug 2020 | CN |
111582216 | Aug 2020 | CN |
111598095 | Aug 2020 | CN |
108229526 | Sep 2020 | CN |
111693972 | Sep 2020 | CN |
106558058 | Oct 2020 | CN |
107169560 | Oct 2020 | CN |
107622258 | Oct 2020 | CN |
111767801 | Oct 2020 | CN |
111768002 | Oct 2020 | CN |
111783545 | Oct 2020 | CN |
111783971 | Oct 2020 | CN |
111797657 | Oct 2020 | CN |
111814623 | Oct 2020 | CN |
111814902 | Oct 2020 | CN |
111860499 | Oct 2020 | CN |
111881856 | Nov 2020 | CN |
111882579 | Nov 2020 | CN |
111897639 | Nov 2020 | CN |
111898507 | Nov 2020 | CN |
111898523 | Nov 2020 | CN |
111899227 | Nov 2020 | CN |
112101175 | Dec 2020 | CN |
112101562 | Dec 2020 | CN |
112115953 | Dec 2020 | CN |
111062973 | Jan 2021 | CN |
111275080 | Jan 2021 | CN |
112183739 | Jan 2021 | CN |
112232497 | Jan 2021 | CN |
112288658 | Jan 2021 | CN |
112308095 | Feb 2021 | CN |
112308799 | Feb 2021 | CN |
112313663 | Feb 2021 | CN |
112329552 | Feb 2021 | CN |
112348783 | Feb 2021 | CN |
111899245 | Mar 2021 | CN |
202017102235 | May 2017 | DE |
202017102238 | May 2017 | DE |
102017116017 | Jan 2019 | DE |
102018130821 | Jun 2020 | DE |
102019008316 | Aug 2020 | DE |
1215626 | Sep 2008 | EP |
2228666 | Sep 2012 | EP |
2420408 | May 2013 | EP |
2723069 | Apr 2014 | EP |
2741253 | Jun 2014 | EP |
3115772 | Jan 2017 | EP |
2618559 | Aug 2017 | EP |
3285485 | Feb 2018 | EP |
2863633 | Feb 2019 | EP |
3113080 | May 2019 | EP |
3525132 | Aug 2019 | EP |
3531689 | Aug 2019 | EP |
3537340 | Sep 2019 | EP |
3543917 | Sep 2019 | EP |
3608840 | Feb 2020 | EP |
3657387 | May 2020 | EP |
2396750 | Jun 2020 | EP |
3664020 | Jun 2020 | EP |
3690712 | Aug 2020 | EP |
3690742 | Aug 2020 | EP |
3722992 | Oct 2020 | EP |
3690730 | Nov 2020 | EP |
3739486 | Nov 2020 | EP |
3751455 | Dec 2020 | EP |
3501897 | Dec 2020 | EP |
3783527 | Feb 2021 | EP |
2402572 | Aug 2005 | GB |
2548087 | Sep 2017 | GB |
2577485 | Apr 2020 | GB |
2517270 | Jun 2020 | GB |
2578262 | Aug 1998 | JP |
3941252 | Jul 2007 | JP |
4282583 | Jun 2009 | JP |
4300098 | Jul 2009 | JP |
2015004922 | Jan 2015 | JP |
5863536 | Feb 2016 | JP |
6044134 | Dec 2016 | JP |
6525707 | Jun 2019 | JP |
2019101535 | Jun 2019 | JP |
2020101927 | Jul 2020 | JP |
2020173744 | Oct 2020 | JP |
100326702 | Feb 2002 | KR |
101082878 | Nov 2011 | KR |
101738422 | May 2017 | KR |
101969864 | Apr 2019 | KR |
101996167 | Jul 2019 | KR |
102022388 | Aug 2019 | KR |
102043143 | Nov 2019 | KR |
102095335 | Mar 2020 | KR |
102097120 | Apr 2020 | KR |
1020200085490 | Jul 2020 | KR |
102189262 | Dec 2020 | KR |
1020200142266 | Dec 2020 | KR |
200630819 | Sep 2006 | TW |
I294089 | Mar 2008 | TW |
I306207 | Feb 2009 | TW |
WO 02052835 | Jul 2002 | WO |
WO 16032398 | Mar 2016 | WO |
WO 16048108 | Mar 2016 | WO |
WO 16207875 | Dec 2016 | WO |
WO 17095580 | Jun 2017 | WO |
WO 17158622 | Sep 2017 | WO |
WO 19005547 | Jan 2019 | WO |
WO 19067695 | Apr 2019 | WO |
WO 19089339 | May 2019 | WO |
WO 19092456 | May 2019 | WO |
WO 19099622 | May 2019 | WO |
WO 19122952 | Jun 2019 | WO |
WO 19125191 | Jun 2019 | WO |
WO 19126755 | Jun 2019 | WO |
WO 19144575 | Aug 2019 | WO |
WO 19182782 | Sep 2019 | WO |
WO 19191578 | Oct 2019 | WO |
WO 19216938 | Nov 2019 | WO |
WO 19220436 | Nov 2019 | WO |
WO 20006154 | Jan 2020 | WO |
WO 20012756 | Jan 2020 | WO |
WO 20025696 | Feb 2020 | WO |
WO 20034663 | Feb 2020 | WO |
WO 20056157 | Mar 2020 | WO |
WO 20076356 | Apr 2020 | WO |
WO 20097221 | May 2020 | WO |
WO 20101246 | May 2020 | WO |
WO 20120050 | Jun 2020 | WO |
WO 20121973 | Jun 2020 | WO |
WO 20131140 | Jun 2020 | WO |
WO 20139181 | Jul 2020 | WO |
WO 20139355 | Jul 2020 | WO |
WO 20139357 | Jul 2020 | WO |
WO 20142193 | Jul 2020 | WO |
WO 20146445 | Jul 2020 | WO |
WO 20151329 | Jul 2020 | WO |
WO 20157761 | Aug 2020 | WO |
WO 20163455 | Aug 2020 | WO |
WO 20167667 | Aug 2020 | WO |
WO 20174262 | Sep 2020 | WO |
WO 20177583 | Sep 2020 | WO |
WO 20185233 | Sep 2020 | WO |
WO 20185234 | Sep 2020 | WO |
WO 20195658 | Oct 2020 | WO |
WO 20198189 | Oct 2020 | WO |
WO 20198779 | Oct 2020 | WO |
WO 20205597 | Oct 2020 | WO |
WO 20221200 | Nov 2020 | WO |
WO 20240284 | Dec 2020 | WO |
WO 20260020 | Dec 2020 | WO |
WO 20264010 | Dec 2020 | WO |
Related Publications
Number | Date | Country |
---|---|---|
20220375208 A1 | Nov 2022 | US |
Provisional Applications
Number | Date | Country |
---|---|---|
62701441 | Jul 2018 | US |
Parent and Child Applications
 | Number | Date | Country |
---|---|---|---|
Parent | 16514721 | Jul 2019 | US |
Child | 17806358 | US |