Cargo sensor system implemented using neural network

Information

  • Patent Grant
  • Patent Number
    11,475,680
  • Date Filed
    Tuesday, December 10, 2019
  • Date Issued
    Tuesday, October 18, 2022
  • Inventors
    • Saydag; Sait C. (Mission Viejo, CA, US)
  • Examiners
    • Kelley; Christopher S
    • Walsh; Kathleen M
  • Agents
    • Luedeka Neely Group, P.C.
Abstract
A cargo sensor system incorporates an optical cargo sensor to supply images to a convolutional neural network. In preferred embodiments, the neural network is implemented using a processor in a sensor module, and is trained by a machine learning system to determine the load state of the cargo container. Some embodiments also include a secondary sensor, such as a laser-ranging Time-of-Flight (ToF) sensor, that verifies the cargo reading determined by the optical cargo sensor.
Description
FIELD

This invention relates to the field of cargo transportation. More particularly, this invention relates to a system for providing a cargo occupancy reading to indicate the load status of a cargo container, such as a cargo trailer.


BACKGROUND

Knowledge of the location and loading state of cargo containers, such as cargo trailers, is important to cargo carriers. If a dispatcher knows that there is room to accept more cargo in a particular trailer that is en route to a destination, the dispatcher can divert the trailer to pick up a load at a nearby customer's facility. In this way, owners of trailers can make more efficient use of their assets, thereby increasing profitability and reducing waste.


Previous solutions have implemented ultrasonic sensors to detect cargo within a cargo container. Such sensors often provide false readings, because pieces of cargo having soft outer materials cannot be reliably detected by ultrasonic sensors.


Some prior optical cargo detection and analysis systems have relied on standard image processing algorithms, such as contrast detection and edge detection, to determine the load status of trailers. In situations in which the load conditions in the trailer are unknown, these optical methods can lead to false results.


What is needed, therefore, is a cargo sensor system that can be used on a cargo trailer or other cargo container to determine the cargo loading state without reliance on ultrasonic sensors or standard image processing algorithms.


SUMMARY

The above and other needs are met by a cargo sensor system that uses an optical cargo sensor, which may be camera-based (also referred to herein as an imaging sensor), that supplies images to a convolutional neural network. In preferred embodiments, the neural network is implemented using a processor in a sensor module, and is trained by a machine learning system to determine the load state of the cargo container. Some embodiments also include a secondary sensor, such as a laser-ranging Time-of-Flight (ToF) sensor, that verifies the cargo reading determined by the optical cargo sensor. This sensor combination results in much higher accuracy than either sensing method provides alone.


Embodiments of the cargo sensor system described herein may include an apparatus for detecting cargo within a cargo container. The apparatus preferably includes an optical sensor, memory, and a processor. The optical sensor captures an image of an interior space within the cargo container. The memory stores a neural network file which comprises neural network descriptions (layers, nodes, connections, and operations), weights, and biases. The neural network file is also referred to herein as a model file. The processor accesses the model file from the memory and processes the image from the optical sensor based on the model file to determine a first cargo loading status of the cargo container.
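
By way of illustration only, the following sketch (in Python, using the TensorFlow/Keras framework named later in the detailed description) shows how such a model file bundles the network description with its trained parameters; the file name is an assumption, not part of the disclosed design:

```python
# Sketch of what the stored "model file" comprises: loading it recovers both
# the network description (layers, nodes, connections, operations) and the
# trained weights and biases. Keras and the file name are assumptions.
import tensorflow as tf

model = tf.keras.models.load_model("cargo_model.keras")  # assumed file name
model.summary()                          # layer/node/connection descriptions
for layer in model.layers:
    for tensor in layer.get_weights():   # per-layer weights and biases
        print(layer.name, tensor.shape)
```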


In some embodiments, the apparatus includes a network interface for communicating cargo information via a data communication network from the apparatus to a server computer. The cargo information may include the first cargo loading status or the image from the optical sensor or both.


In some embodiments, the apparatus includes a distance sensor that generates distance information indicative of the presence of cargo within the interior space of the cargo container. The processor of these embodiments receives the distance information and determines a second cargo loading status based at least in part on the distance information. The processor compares the first cargo loading status to the second cargo loading status, and generates an alert message if the first cargo loading status is inconsistent with the second cargo loading status.
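
A minimal sketch of that consistency check follows; the string status values and the alert payload format are illustrative assumptions:

```python
# Sketch of the cross-check between the optical (first) and distance-based
# (second) cargo loading statuses. Status strings and the alert payload are
# illustrative assumptions, not mandated by the description.
LOADED, UNLOADED = "loaded", "unloaded"

def cross_check(first_status: str, second_status: str) -> dict | None:
    """Return an alert message if the two sensor readings disagree."""
    if first_status != second_status:
        return {
            "type": "cargo_status_mismatch",
            "optical_status": first_status,
            "distance_status": second_status,
        }
    return None  # readings agree; no alert needed

alert = cross_check(LOADED, UNLOADED)  # -> mismatch alert dict
```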


In another aspect, embodiments described herein provide a method for detecting cargo within a cargo container. A preferred embodiment of the method includes the following steps, illustrated in the sketch after the list:

    • (a) capturing one or more images of an interior space of the cargo container using one or more image sensors mounted within or on the cargo container;
    • (b) processing the one or more images based on a model file to determine a first cargo loading status of the cargo container, wherein the processing is performed by a processor disposed within or on the cargo container; and
    • (c) communicating cargo information via a data communication network to a server computer that is remote from the cargo container, wherein the cargo information includes one or both of the first cargo loading status and at least one of the images.
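
A minimal end-to-end sketch of steps (a) through (c); the capture and classification callables and the server endpoint are placeholders, not part of the disclosed design:

```python
# Sketch of method steps (a)-(c). capture_image and classify_image stand in
# for the sensor and neural network; the URL and JSON payload are assumptions.
import json
import urllib.request

def detect_and_report(capture_image, classify_image, server_url: str) -> None:
    # (a) capture one or more images of the container interior
    images = [capture_image() for _ in range(3)]
    # (b) process the images against the model file on the local processor
    first_status = classify_image(images[0])
    # (c) communicate cargo information to the remote server
    payload = json.dumps({"cargo_status": first_status}).encode("utf-8")
    req = urllib.request.Request(server_url, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)
```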


In some embodiments, the method includes (d) generating distance information indicative of the presence of cargo within the interior space of the cargo container using a distance sensor disposed on or within the cargo container, and (e) determining a second cargo loading status based at least in part on the distance information. Step (c) in these embodiments includes communicating the cargo information including the second cargo loading status.


In some embodiments, the method includes (d) comparing the first cargo loading status to the second cargo loading status, and (e) generating an alert message if the first cargo loading status is inconsistent with the second cargo loading status. Step (c) in these embodiments includes communicating the cargo information including the alert message.


In some embodiments, the method includes:

    • (d) prior to step (b), capturing a plurality of images of interior spaces of a plurality of cargo containers using image sensors mounted within or on the plurality of cargo containers;
    • (e) prior to step (b), generating image categorization information based on categorizing each of the plurality of images as depicting either a loaded cargo container or an unloaded cargo container; and
    • (f) prior to step (b), generating a neural network model based on the image categorization information, wherein the neural network model incorporates the model file.


      Step (b) in these embodiments includes processing the one or more images using the neural network model.


In some embodiments, step (b) includes selecting an optimal image from the one or more images for processing.


In some embodiments, step (b) includes calculating a probability distribution of the likelihood that the cargo container is loaded with cargo.


In some embodiments, the probability distribution has a value from zero to one, and step (b) includes determining that

    • the first cargo loading status is loaded if the value of the probability distribution is between zero and a lower threshold value, and
    • the first cargo loading status is unloaded if the value of the probability distribution is between an upper threshold value and one.


In some embodiments, if the value of the probability distribution is between the lower threshold value and the upper threshold value, the method includes (d) generating distance information indicative of the presence of cargo within the interior space of the cargo container using a distance sensor disposed on or within the cargo container, and (e) determining a second cargo loading status based at least in part on the distance information. Step (c) in these embodiments includes communicating the cargo information including the second cargo loading status.


In yet another aspect, embodiments described herein provide a cargo sensor module for determining a cargo loading status of a cargo container. In a preferred embodiment, the cargo sensor module includes an optical sensor, a distance sensor, memory, a processor, and a network interface. The optical sensor captures an image of an interior space within the cargo container. The distance sensor generates distance information indicative of the presence of cargo within the interior space of the cargo container. The memory stores a model file. The processor executes operational instructions to:

    • access the model file from the memory and process the image from the optical sensor based on the model file to determine a first cargo loading status of the cargo container;
    • receive the distance information and determine a second cargo loading status based at least in part on the distance information; and
    • compare the first cargo loading status to the second cargo loading status, and generate an alert message if the first cargo loading status is inconsistent with the second cargo loading status.


      The network interface communicates cargo information via a data communication network from the cargo sensor module to a server computer. The cargo information may include the alert message, the first cargo loading status, the second cargo loading status, the image from the optical sensor, or the distance information, or any combination thereof.





BRIEF DESCRIPTION OF THE DRAWINGS

Other embodiments of the invention will become apparent by reference to the detailed description in conjunction with the figures, wherein elements are not to scale, so as to more clearly show the details, wherein like reference numbers indicate like elements throughout the several views, and wherein:



FIGS. 1 and 2 depict a cargo sensing system according to an embodiment of the invention;



FIG. 3 depicts a typical cargo trailer attached to a tractor;



FIG. 4 depicts a graphical representation of a probability distribution indicating the probability of the presence of cargo in a cargo container; and



FIG. 5 depicts a process for sensing cargo in a cargo container according to a preferred embodiment.





DETAILED DESCRIPTION


FIGS. 1 and 2 depict an embodiment of a cargo sensing system 10. Generally, the system 10 includes one or more cargo sensor modules 20 and a tracker unit 40 mounted on a cargo container. In one embodiment depicted in FIG. 3, a cargo sensor module 20 and a tracker unit 40 are mounted on the front sheet (or nose) of the trailer 12. As shown in FIG. 1, each cargo sensor module 20 includes one or more optical imaging sensors 22, one or more optional distance sensors 24, memory 28 (such as local persistent flash storage and dynamic memory storage), a processor 26, a local data interface 30, and one or more illumination LEDs 32, all disposed on or within a cargo sensor module housing. In a preferred embodiment, the processor 26 runs Linux or another operating system, which provides the framework in which the software described herein runs.


In some embodiments, a hole is drilled in the front sheet of the trailer, through which the optical imaging sensor 22 of the cargo sensor module 20 “sees” the interior of the trailer's cargo compartment. The processor 26 communicates with and controls the imaging sensor 22 to capture images of the inside of the trailer 12. The processor 26 also controls the illumination LEDs 32 to illuminate the interior of a trailer 12 when the doors are closed, or when the interior of the trailer 12 is otherwise not illuminated. In a preferred embodiment, the imaging sensor 22 is sensitive to infrared (IR) frequencies, and the illumination LEDs 32 emit light at a corresponding or similar IR frequency.


In a preferred embodiment, the optional distance sensor 24 is a laser-ranging time-of-flight (ToF) sensor, such as the VL53L1X manufactured by STMicroelectronics. This secondary sensor 24 is generally limited to a short range (about 3-5 meters) and provides additional information to the processor 26 regarding the load status of the trailer 12.


Sensor data generated by the sensor module 20 are provided to the trailer tracker unit 40 and/or the backend server 50 for processing as described hereinafter. Generally, the tracker unit 40 monitors the location and health of the trailer 12, sends commands to the cargo sensor module 20 to detect the contents of the trailer, and sends information regarding the current cargo loading state of the trailer 12 to the server 50 based on cargo detected using the sensor module 20. The tracker unit 40 preferably includes a processor 46, GPS receiver electronics 42 for determining location coordinates of the trailer 12, a wireless data modem 48 for communicating with a backend server 50, and a local data interface 44 for communicating with the local data interface 30 of the sensor module 20. The local data interfaces 30 and 44 may each comprise one or more interfaces to implement wired communications protocols, such as Inter-Integrated Circuit (I2C), Serial Peripheral Interface (SPI), Universal Asynchronous Receiver/Transmitter (UART), or Ethernet, or to implement wireless communications protocols, such as Wi-Fi, Bluetooth® short-range wireless technology, or Bluetooth® Low Energy short-range wireless technology. The tracker unit 40 may also interface with multiple external sensors, such as door sensors and temperature sensors.
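
As a sketch of one possible local interface between the tracker unit 40 and the sensor module 20, the following assumes a UART link with a line-oriented command protocol; the "DETECT" command and reply format are illustrative assumptions, since the description names only the bus options:

```python
# Sketch of the tracker unit 40 commanding the cargo sensor module 20 over a
# local UART link using pyserial. The command/reply wire format is assumed.
import serial  # pyserial

def request_cargo_status(port: str = "/dev/ttyS0") -> str:
    with serial.Serial(port, baudrate=115200, timeout=5.0) as link:
        link.write(b"DETECT\n")                   # command a detection cycle
        reply = link.readline().decode().strip()  # e.g. "loaded" / "unloaded"
    return reply
```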


The cargo sensing system 10 as described herein is operable to detect cargo with no prior knowledge of what is loaded in the trailer 12. Thus, the system 10 is operable to detect the cargo load in any trailer type, regardless of whether the door is open or closed, or whether the trailer has a transparent roof or minor damage on the walls, floor or roof.



FIG. 5 depicts a preferred embodiment of a process for determining a cargo load status for a cargo trailer or other cargo container. As described in more detail hereinafter, some steps in the process are directed to training a convolutional neural network, and some steps are directed to using the trained neural network to determine the cargo load status. To facilitate the training of the neural network, a large number of images of loaded and unloaded trailers 12 are captured using sensor modules 20 on hundreds of trailers (step 102), and the captured images are uploaded to an image library 52 (FIG. 1) for processing by the image processing server 50 (step 104). These images are preferably delivered from each cargo sensor module 20 through its associated tracker unit 40 to the server 50 (or another server) through an Internet Protocol (IP) tunnel 34. Creation of such an IP tunnel is well known in the art. Using this technique, information can be delivered scalably and flexibly from the cargo sensor module 20 through the tracker unit 40 without requiring extensive changes in the tracker unit's operational software, other than the ability to set up the tunnel.


By storing images locally and providing the IP tunnel 34 to send the images to a backend server, additional information that is valuable to the customer can be made available. For example, if the customer determines that the trailer has been damaged within a certain period of time, the customer can command the cargo sensor module 20 to upload images that were captured at about the time the damage is believed to have occurred, and can collect information from the tracker unit 40 indicating who was in control of the trailer when it was damaged, and potentially how it was damaged.


Once a significant number of images have been uploaded to the image library 52, some or all of the images are categorized as “loaded” (indicating cargo is present) or “unloaded” (indicating no cargo is present) (step 106). Categorization may be handled by humans reviewing the images, or by artificial intelligence. As shown in FIG. 2, the categorized images 58 and 60 are provided to a machine learning system 62 which may be executed on the server 50 or another processor dedicated to that purpose (step 108). In a preferred embodiment, the machine learning system 62 is implemented by the TensorFlow machine learning framework. However, it will be appreciated that any similar neural network learning framework may be used for this purpose.


With continued reference to FIG. 5, after receiving the untrained (empty/skeleton) convolutional network and the categorized images 58 and 60, the machine learning system 62 generates a model file 64 that includes the neural network and a weights and biases value file 66, and the system 62 adjusts the weight and bias values in the file 66 to achieve the best accuracy (step 110). As will be appreciated by one of ordinary skill in the art, the weight values represent the strength of connections between nodes in a neural network, and the bias values are constants used in adjusting outputs. This process is generally referred to as training, and the output of the training process is referred to as a model or model file, which consists of the weights and biases.
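
A minimal training sketch under the TensorFlow/Keras framework named above; the directory layout, network architecture, image size, and hyperparameters are all illustrative assumptions:

```python
# Sketch of steps 108-110: feed the categorized "loaded"/"unloaded" images to
# a Keras fit loop that adjusts the weights and biases of a skeleton CNN.
import tensorflow as tf

# Categorized images assumed to live in subfolders "loaded/" and "unloaded/";
# labels are inferred from the folder names.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "image_library", label_mode="binary", color_mode="grayscale",
    image_size=(128, 128), batch_size=32)

model = tf.keras.Sequential([            # untrained (empty/skeleton) network
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=(128, 128, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability in [0, 1]
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)           # training adjusts weights and biases
model.save("cargo_model.keras")          # the resulting model file
```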


In a preferred embodiment, the model file 64 undergoes a post-processing step to reduce its size and make it deployable in the processor of the cargo sensor module 20. In one embodiment, the size of the model file 64 is reduced by reducing the number of significant digits in the weight values, which reduces the entropy of the entire file and makes it more compressible. Upon demand, the model file 64 is uploaded through the tunnel 34 to the memory 28 of the cargo sensor module 20 (step 116). Once the processor 26 of the cargo sensor module 20 is properly configured, the processor 26 uses the model file 64 to provide an indication of the trailer load status as described hereinafter.
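
The description does not name a specific size-reduction technique; post-training quantization in TensorFlow Lite is one concrete way to reduce weight precision (fewer significant digits) and file entropy, sketched here purely as an assumption:

```python
# One possible realization of the size-reduction step: TensorFlow Lite
# post-training quantization lowers weight precision, reducing file entropy
# and improving compressibility. The patent does not mandate TFLite.
import gzip
import tensorflow as tf

model = tf.keras.models.load_model("cargo_model.keras")
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantize the weights
tflite_model = converter.convert()

with gzip.open("cargo_model.tflite.gz", "wb") as f:   # compress for upload
    f.write(tflite_model)
```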


Upon receipt of a command from the tracker unit 40 or the server 50, the optical sensor 22 in the cargo sensor module 20 captures multiple images of the interior of the trailer 12 (step 118). In a preferred embodiment, the multiple images are captured with differing exposure times or differing levels of illumination or both. The images are analyzed by image processing software running on the processor 26 to select an optimal image for further processing (step 120). In various embodiments, the optimal image is chosen based on the highest brightness, or the highest brightness with a limited number of saturated pixels (pixels at the maximum value). The selected optimal image is then resized and normalized for compatibility with the input layer of the neural network (step 122). Based on analysis of the resized and normalized image, the neural network running on the processor 26 calculates a probability distribution value indicating the likelihood that cargo is present in the trailer (step 124).
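
A minimal sketch of steps 118-124 (optimal-image selection, resizing, normalization, and inference); the saturation limit, grayscale input format, and input dimensions are assumptions:

```python
# Sketch of optimal-image selection and inference. Images are assumed to be
# 2-D uint8 NumPy arrays; the 1% saturation limit and 128x128 input layer
# size are illustrative assumptions.
import numpy as np
import tensorflow as tf

def select_optimal_image(images, max_saturated_fraction=0.01):
    """Choose the brightest capture with few saturated (maximum-value) pixels."""
    usable = [img for img in images
              if np.mean(img == 255) <= max_saturated_fraction]
    return max(usable or images, key=lambda img: img.mean())

def cargo_probability(model, image):
    """Resize and normalize a capture, then run the neural network."""
    x = tf.image.resize(image[..., np.newaxis], (128, 128)) / 255.0
    return float(model(tf.expand_dims(x, 0))[0, 0])  # value in [0, 1]
```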


As depicted in FIG. 4, a probability value near zero is a strong indication that cargo is present in the trailer, whereas a probability value near one is a strong indication that no cargo is present. In some embodiments, if the probability value is in an uncertainty range (greater than a lower threshold value and less than an upper threshold value), this indicates that the neural network could not make a meaningful recommendation regarding the presence or absence of cargo. In that situation, software running on the processor 26 acquires distance information from the secondary sensor to determine whether cargo is present.


Thus, with reference again to FIG. 5, if the probability value is greater than or equal to zero but less than or equal to the lower threshold value (step 126), the processor 26 generates a trailer loaded message (step 128). If the probability value is greater than or equal to the upper threshold value but less than or equal to one (step 130), the processor 26 generates a trailer unloaded message (step 132). If the probability value is greater than the lower threshold value and less than the upper threshold value, the neural network has too much uncertainty regarding the cargo loading status. In this case, the processor 26 acquires the distance information from the secondary sensor (step 134), and generates a trailer loaded message or trailer unloaded message based on the distance information (step 136).
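
A minimal sketch of this decision flow (steps 126-136); the threshold values and the distance cutoff are illustrative assumptions, and, per FIG. 4, a probability near zero means loaded while a probability near one means unloaded:

```python
# Sketch of the FIG. 5 decision flow. LOWER/UPPER and the 3000 mm cutoff are
# assumed values; read_tof_distance_mm stands in for the secondary sensor.
LOWER, UPPER = 0.2, 0.8  # assumed lower/upper threshold values

def load_status(probability: float, read_tof_distance_mm) -> str:
    if 0.0 <= probability <= LOWER:      # step 126 -> step 128
        return "loaded"
    if UPPER <= probability <= 1.0:      # step 130 -> step 132
        return "unloaded"
    # uncertainty range: fall back to the secondary distance sensor
    distance = read_tof_distance_mm()    # step 134
    return "loaded" if distance < 3000 else "unloaded"  # step 136
```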


In some embodiments, if the probability value falls in the range between the lower threshold value and the upper threshold value, the selected captured image (from step 120) is flagged for upload to the image library 52 for categorization and use in a future machine learning process (step 108).


The presence of the IP tunnel 34 and the secondary short-range distance sensor 24 allows several inventive applications. In one embodiment, if the neural network renders an uncertain ruling regarding whether the image shows the presence or absence of cargo in the trailer, the secondary short-range distance sensor 24 may be queried to detect the presence of cargo. The information from the short-range sensor 24 may be sent over the tunnel 34 to the server 50 along with the associated image from the optical sensor 22. The image may then be added to the image library 52 for use in the training system 62, which generates a new model file 64 for upload to the cargo sensor module 20. In this way, the detection accuracy of the neural network continuously improves as the image library 52 grows larger.


As mentioned above, in an exemplary embodiment, use of the short-range secondary sensor 24 allows a cargo sensor module 20 to be deployed even before a significant number of images have been added to the image library 52. As the cargo sensor module 20 successfully reports cargo loading status using the short-range sensor 24, it also gathers associated images and sends them to the server 50 for analysis and inclusion in the image library 52 for use by the machine learning system 62.


The presence of the short-range secondary sensor 24 provides additional advantages. In one preferred embodiment, if the secondary sensor 24 measures a valid distance value between a predetermined short threshold and a predetermined long threshold, then the system 10 understands that the secondary sensor 24 has properly detected cargo in the trailer, and thus the cargo state is loaded. In this case, the energy and memory required to capture an image using the optical sensor 22 and the energy needed to activate the illumination LEDs 32 need not be expended. The load status result is communicated to the server 50 sooner, and the overall energy consumed to provide the load status message is greatly reduced. If the secondary sensor 24 measures a distance value outside the specified threshold range, then image capture and neural network processing are used to determine the load status.
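
A minimal sketch of this energy-saving path; the threshold values and the sensor/pipeline callables are assumptions:

```python
# Sketch of the distance-first, low-power flow: query the ToF sensor before
# waking the camera and LEDs. SHORT_MM/LONG_MM are assumed thresholds;
# read_tof_distance_mm and run_image_pipeline are placeholders.
SHORT_MM, LONG_MM = 100, 4000  # assumed predetermined short/long thresholds

def low_power_status(read_tof_distance_mm, run_image_pipeline) -> str:
    distance = read_tof_distance_mm()
    if SHORT_MM < distance < LONG_MM:
        return "loaded"          # valid distance: cargo detected, skip imaging
    return run_image_pipeline()  # otherwise spend energy on capture/inference
```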


In another preferred embodiment, if a malfunction in the system 10 is detected or suspected, information from the secondary short-range sensor 24 and captured images from the optical sensor 22 can be compared to ascertain what may be causing the malfunction, such as a mis-mounted or failed sensor.


In some embodiments, images captured by the optical sensor 22 may be processed and classified as indicating a “trailer door open” or “trailer door closed” condition. This classification may be made by the neural network after it has been trained in the same manner that it was trained to recognize “loaded” and “unloaded” cargo conditions. Using images showing open and closed door conditions, a neural network model may be trained and a corresponding model file 64 generated so that images captured by the optical sensor 22 can be processed to determine the state of the door 16 at the back of the trailer 12.


In some embodiments, the system 10 may be used to detect specific objects. For example, the neural network may be trained to recognize humans. If a human is detected within the cargo container and the door is detected to be closed, the system 10 may generate an alert message indicating that a person is trapped or hiding inside the cargo container. In another example, the neural network may be trained to detect a specific type or shape of cargo that may be important, expensive, or illegal, or otherwise worthy of generating an alert to the user.


In preferred embodiments, the optical sensor 22 is mounted in the front of the trailer 12. However, in other embodiments it may be mounted in the back of the trailer or on the door 16. In some embodiments, the optical sensor 22 may be disposed remotely from the cargo sensor module 20, and connected via a wired or wireless interface to the processor 26. The optical sensor 22 may also be integrated into the tracker unit 40 to provide a unified cargo sensor and tracker system. In such an embodiment, functions of the processor 26 may be performed by the tracker processor 46 to reduce system cost, power consumption, and physical size.


The foregoing description of preferred embodiments of this invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Obvious modifications or variations are possible in light of the above teachings. The embodiments were chosen and described in an effort to provide the best illustrations of the principles of the invention and its practical application, and thereby to enable one of ordinary skill in the art to utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated. All such modifications and variations are within the scope of the invention as determined by the appended claims when interpreted in accordance with the breadth to which they are fairly, legally, and equitably entitled.

Claims
  • 1. An apparatus for detecting cargo within a cargo container, the apparatus comprising:
    an optical sensor that captures an image of an interior space within the cargo container;
    memory that stores a model file; and
    a processor that accesses the model file from the memory and processes the image from the optical sensor based on the model file to determine a first cargo loading status of the cargo container, and wherein processing of the image includes:
    calculating a probability distribution of likelihood that the cargo container is loaded with cargo, wherein the probability distribution has a value from zero to one;
    determining that the first cargo loading status is loaded if the probability distribution is between zero and a lower threshold value, and
    determining that the first cargo loading status is unloaded if the probability distribution is between an upper threshold value and one.
  • 2. The apparatus of claim 1 further comprising a network interface for communicating cargo information via a data communication network from the apparatus to a server computer.
  • 3. The apparatus of claim 2 wherein the cargo information includes one or more of the first cargo loading status and the image from the optical sensor.
  • 4. The apparatus of claim 1 further comprising a distance sensor that generates distance information indicative of a presence of cargo within the interior space of the cargo container.
  • 5. The apparatus of claim 4 wherein the processor receives the distance information and determines a second cargo loading status based at least in part on the distance information.
  • 6. The apparatus of claim 5 wherein the processor compares the first cargo loading status to the second cargo loading status, and generates an alert message if the first cargo loading status is inconsistent with the second cargo loading status.
  • 7. The apparatus of claim 6 further comprising a network interface for communicating cargo information via a data communication network from the apparatus to a server computer, wherein the cargo information includes one or more of the alert message, the first cargo loading status, the second cargo loading status, the image from the optical sensor, and the distance information.
  • 8. A method for detecting cargo within a cargo container, the method comprising:
    (a) capturing one or more images of an interior space of the cargo container using the one or more image sensors mounted within or on the cargo container;
    (b) processing the one or more images based on a model file to determine a first cargo loading status of the cargo container, wherein the processing is performed by a processor disposed within or on the cargo container, and wherein the processing includes:
    calculating a probability distribution of likelihood that the cargo container is loaded with cargo, wherein the probability distribution has a value of zero to one;
    determining that the first cargo loading status is loaded if the value of the probability distribution is between zero and a lower threshold value, and
    determining that the first cargo loading status is unloaded if the value of the probability distribution is between an upper threshold value and one; and
    (c) communicating cargo information via a data communication network to a server computer that is remote from the cargo container, wherein the cargo information includes one or both of the first cargo loading status and at least one of the images.
  • 9. The method of claim 8 further comprising:
    (d) generating distance information indicative of a presence of cargo within the interior space of the cargo container using a distance sensor disposed on or within the cargo container; and
    (e) determining a second cargo loading status based at least in part on the distance information,
    wherein step (c) comprises communicating the cargo information including the second cargo loading status.
  • 10. The method of claim 9 further comprising:
    (f) comparing the first cargo loading status to the second cargo loading status; and
    (g) generating an alert message if the first cargo loading status is inconsistent with the second cargo loading status,
    wherein step (c) comprises communicating the cargo information including the alert message.
  • 11. The method of claim 8 further comprising:
    (d) prior to step (b), capturing a plurality of images of interior spaces of a plurality of cargo containers using image sensors mounted within or on the plurality of cargo containers;
    (e) prior to step (b), generating image categorization information based on categorizing each of the plurality of images as depicting either a loaded cargo container or an unloaded cargo container; and
    (f) prior to step (b), generating a neural network model based on the image categorization information, wherein the neural network model incorporates the model file,
    wherein step (b) comprises processing the one or more images using the neural network model.
  • 12. The method of claim 8 wherein step (b) includes selecting an optimal image from the one or more images for processing.
  • 13. The method of claim 8 further comprising, if the value of the probability distribution is between the lower threshold value and the upper threshold value:
    (d) generating distance information indicative of a presence of cargo within the interior space of the cargo container using a distance sensor disposed on or within the cargo container; and
    (e) determining a second cargo loading status based at least in part on the distance information,
    wherein step (c) comprises communicating the cargo information including the second cargo loading status.
  • 14. A cargo sensor module for determining a cargo loading status of a cargo container, the cargo sensor module comprising:
    an optical sensor that captures an image of an interior space within the cargo container;
    a distance sensor that generates distance information indicative of a presence of cargo within the interior space of the cargo container;
    memory that stores a model file;
    a processor that executes operational instructions to:
    access the model file from the memory and process the image from the optical sensor based on the model file to determine a first cargo loading status of the cargo container, wherein processing of the image includes:
    calculating a probability distribution of likelihood that the cargo container is loaded with cargo, wherein the probability distribution has a value from zero to one;
    determining that the first cargo loading status is loaded if the probability distribution is between zero and a lower threshold value, and
    determining that the first cargo loading status is unloaded if the probability distribution is between an upper threshold value and one;
    receive the distance information and determine a second cargo loading status based at least in part on the distance information; and
    compare the first cargo loading status to the second cargo loading status, and generate an alert message if the first cargo loading status is inconsistent with the second cargo loading status; and
    a network interface for communicating cargo information via a data communication network from the apparatus to a server computer, wherein the cargo information includes one or more of the alert message, the first cargo loading status, the second cargo loading status, the image from the optical sensor, and the distance information.
RELATED APPLICATIONS

This nonprovisional application claims priority to provisional patent application Ser. No. 62/778,553 filed Dec. 12, 2018, titled Neural Network Based Cargo Occupancy Sensor, the entire contents of which are incorporated herein by reference.

US Referenced Citations (205)
Number Name Date Kind
906021 Herrick Dec 1908 A
4633407 Freienstein et al. Dec 1986 A
4837700 Ando et al. Jun 1989 A
5119301 Shimizu et al. Jun 1992 A
5289369 Hirshberg Feb 1994 A
5299132 Wortham Mar 1994 A
5307277 Hirano Apr 1994 A
5870029 Otto et al. Feb 1999 A
5877956 Frank et al. Mar 1999 A
6025774 Forbes Feb 2000 A
6240365 Bunn May 2001 B1
6249217 Forbes Jun 2001 B1
6510381 Grounds et al. Jan 2003 B2
6512465 Flick Jan 2003 B2
6701234 Vogelsang Mar 2004 B1
6771970 Dan Aug 2004 B1
6816090 Teckchandani et al. Nov 2004 B2
6930638 Lloyd et al. Aug 2005 B2
6931309 Phelan et al. Aug 2005 B2
6985087 Soliman Jan 2006 B2
7034683 Ghazarian Apr 2006 B2
7091835 Boulay et al. Aug 2006 B2
7102510 Boling et al. Sep 2006 B2
7170390 Quiñones et al. Jan 2007 B2
7174243 Lightner et al. Feb 2007 B1
7177738 Diaz Feb 2007 B2
7215282 Boling et al. May 2007 B2
7266378 Norta et al. Sep 2007 B2
7346439 Bodin Mar 2008 B2
7366551 Hartley Apr 2008 B1
7405658 Richards Jul 2008 B2
7546151 Hartley Jun 2009 B2
7574195 Krasner et al. Aug 2009 B2
7593999 Nathanson Sep 2009 B2
7675423 Boling et al. Mar 2010 B2
7701363 Zlojutro Apr 2010 B1
7725216 Kim May 2010 B2
7817033 Motoyama Oct 2010 B2
7818098 Koepf et al. Oct 2010 B2
7830305 Boling et al. Nov 2010 B2
7893818 Smoyer et al. Feb 2011 B2
7970496 Koepf et al. Jun 2011 B2
8018332 Boling et al. Sep 2011 B2
8126601 Kapp et al. Feb 2012 B2
8237591 Holcomb et al. Aug 2012 B2
8330626 Adelson Dec 2012 B1
8330817 Foster Dec 2012 B1
8368561 Welch et al. Feb 2013 B2
8452673 Boling et al. May 2013 B2
8462021 Welch et al. Jun 2013 B2
8510200 Pearlman et al. Aug 2013 B2
8527135 Lowrey et al. Sep 2013 B2
8565963 Burke Oct 2013 B2
8612137 Harris et al. Dec 2013 B2
8626152 Farrell et al. Jan 2014 B2
8655544 Fletcher et al. Feb 2014 B2
8671063 Ehrman et al. Mar 2014 B2
8725326 Kapp et al. May 2014 B2
8760274 Boling et al. Jun 2014 B2
8799461 Herz et al. Aug 2014 B2
8933802 Baade Jan 2015 B2
8970701 Lao Mar 2015 B2
9008894 Bishop et al. Apr 2015 B2
9049564 Muetzel et al. Jun 2015 B2
9060213 Jones Jun 2015 B2
9070271 Baade et al. Jun 2015 B2
9316737 Baade Apr 2016 B2
9332404 Boling et al. May 2016 B2
9516394 Carlo et al. Dec 2016 B2
9551788 Epler Jan 2017 B2
9779379 Hall et al. Oct 2017 B2
9779449 Meyer et al. Oct 2017 B2
10089598 Reeder et al. Oct 2018 B2
10169822 Jarvis et al. Jan 2019 B2
10185892 Mishra Jan 2019 B1
10223744 Brady et al. Mar 2019 B2
10232823 Bobay et al. Mar 2019 B1
10255824 Pearlman et al. Apr 2019 B2
10311315 Drazan et al. Jun 2019 B2
10789789 Edman Sep 2020 B1
20010018639 Bunn Aug 2001 A1
20010034577 Grounds et al. Oct 2001 A1
20020000916 Richards Jan 2002 A1
20020014978 Flick Feb 2002 A1
20020059126 Ricciardi May 2002 A1
20020082025 Baese et al. Jun 2002 A1
20020184062 Diaz Dec 2002 A1
20020186144 Meunier Dec 2002 A1
20030083060 Menendez May 2003 A1
20030151501 Teckchandani et al. Aug 2003 A1
20030151507 Andre et al. Aug 2003 A1
20030174067 Soliman Sep 2003 A1
20040093291 Bodin May 2004 A1
20040125217 Jesson Jul 2004 A1
20040130440 Boulay et al. Jul 2004 A1
20040143378 Vogelsang Jul 2004 A1
20040162063 Quinones et al. Aug 2004 A1
20040225557 Phelan et al. Nov 2004 A1
20040246177 Lloyd et al. Dec 2004 A1
20050021199 Zimmerman et al. Jan 2005 A1
20050026627 Boling et al. Feb 2005 A1
20050134504 Harwood et al. Jun 2005 A1
20050215194 Boling et al. Sep 2005 A1
20050237166 Chen Oct 2005 A1
20060007038 Boling et al. Jan 2006 A1
20060055561 Kamali et al. Mar 2006 A1
20060087411 Chang Apr 2006 A1
20060129290 Zimmerman et al. Jun 2006 A1
20070013779 Gin et al. Jan 2007 A1
20070050332 Grenzberg et al. Mar 2007 A1
20070152844 Hartley et al. Jul 2007 A1
20070167147 Krasner et al. Jul 2007 A1
20070206856 Matsuda et al. Sep 2007 A1
20070290923 Norta et al. Dec 2007 A1
20080015748 Nagy Jan 2008 A1
20080147245 Koepf et al. Jun 2008 A1
20080162045 Lee Jul 2008 A1
20080176537 Smoyer et al. Jul 2008 A1
20080186135 Boling et al. Aug 2008 A1
20080198018 Hartley Aug 2008 A1
20080278314 Miller et al. Nov 2008 A1
20080287151 Fjelstad et al. Nov 2008 A1
20080288768 Barowski et al. Nov 2008 A1
20080294302 Basir Nov 2008 A1
20090027500 Elangovan et al. Jan 2009 A1
20090043445 Bishop et al. Feb 2009 A1
20090079591 Motoyama Mar 2009 A1
20090112394 Lepejian et al. Apr 2009 A1
20090112630 Collins, Jr. et al. Apr 2009 A1
20090140887 Breed et al. Jun 2009 A1
20090224966 Boling et al. Sep 2009 A1
20090287369 Nielsen Nov 2009 A1
20100094482 Schofield et al. Apr 2010 A1
20100103042 Bishop et al. Apr 2010 A1
20100117868 Wiemeersch et al. May 2010 A1
20100191412 Kim Jul 2010 A1
20100265104 Zlojutro Oct 2010 A1
20100299020 Koepf et al. Nov 2010 A1
20110016514 Carlo et al. Jan 2011 A1
20110090075 Armitage et al. Apr 2011 A1
20110093159 Boling et al. Apr 2011 A1
20110143669 Farrell et al. Jun 2011 A1
20110227722 Salvat Sep 2011 A1
20110241903 Welch et al. Oct 2011 A1
20120041618 Sun et al. Feb 2012 A1
20120077475 Holcomb et al. Mar 2012 A1
20120078497 Burke Mar 2012 A1
20120197484 Nath et al. Aug 2012 A1
20120299721 Jones Nov 2012 A1
20120299755 Jones Nov 2012 A1
20130059607 Herz et al. Mar 2013 A1
20130066757 Lovelace et al. Mar 2013 A1
20130088371 Welch et al. Apr 2013 A1
20130100286 Lao Apr 2013 A1
20130113637 Sin et al. May 2013 A1
20130127617 Baade et al. May 2013 A1
20130141249 Pearlman et al. Jun 2013 A1
20130144667 Ehrman et al. Jun 2013 A1
20130144770 Boling et al. Jun 2013 A1
20130144771 Boling et al. Jun 2013 A1
20130144805 Boling et al. Jun 2013 A1
20130147617 Boling et al. Jun 2013 A1
20130159214 Boling et al. Jun 2013 A1
20130185193 Boling et al. Jul 2013 A1
20130249713 Adelson Sep 2013 A1
20130297199 Kapp et al. Nov 2013 A1
20130302757 Pearlman et al. Nov 2013 A1
20140012634 Pearlman et al. Jan 2014 A1
20140036072 Lyall et al. Feb 2014 A1
20140052605 Beerle et al. Feb 2014 A1
20140074692 Beerle et al. Mar 2014 A1
20140095061 Hyde Apr 2014 A1
20140125500 Baade May 2014 A1
20140125501 Baade May 2014 A1
20140220966 Muetzel et al. Aug 2014 A1
20140267688 Aich et al. Sep 2014 A1
20140280658 Boling et al. Sep 2014 A1
20150006207 Jarvis et al. Jan 2015 A1
20150019270 Jarvis et al. Jan 2015 A1
20150024727 Hale-Pletka et al. Jan 2015 A1
20150032291 Lowrey et al. Jan 2015 A1
20150066362 Meyer et al. Mar 2015 A1
20150067312 Lewandowski et al. Mar 2015 A1
20150095255 Hall et al. Apr 2015 A1
20150168173 Lewis-Evans et al. Jun 2015 A1
20150172518 Lucas et al. Jun 2015 A1
20150186991 Meyer et al. Jul 2015 A1
20150260529 Petersen Sep 2015 A1
20150332525 Harris et al. Nov 2015 A1
20150356497 Reeder et al. Dec 2015 A1
20150373487 Miller et al. Dec 2015 A1
20160225072 Brady et al. Aug 2016 A1
20160282466 Epler Sep 2016 A1
20170262717 Drazan et al. Sep 2017 A1
20170313269 Breed Nov 2017 A1
20180300967 Winograd Oct 2018 A1
20180352198 Raasch et al. Dec 2018 A1
20190005442 Reeder et al. Jan 2019 A1
20190061692 Bobay et al. Feb 2019 A1
20190114577 Kilburn Apr 2019 A1
20190122173 Souder Apr 2019 A1
20190279494 Raasch et al. Sep 2019 A1
20200014888 Magal Jan 2020 A1
20200105008 Ehrman Apr 2020 A1
20210319582 Sangeneni Oct 2021 A1
Foreign Referenced Citations (62)
Number Date Country
2609106 Oct 2008 CA
2683208 Nov 2008 CA
2837320 Nov 2012 CA
2856796 May 2013 CA
2867447 Sep 2013 CA
2826902 Mar 2014 CA
2828835 Apr 2014 CA
2832185 May 2014 CA
2846134 Sep 2014 CA
2921908 Jul 2007 CN
101240734 Aug 2008 CN
101734228 Jun 2010 CN
101192322 Jul 2012 CN
102779407 Nov 2012 CN
103813477 May 2014 CN
104931066 Sep 2015 CN
4423328 Jan 1996 DE
0096252 May 1987 EP
0451482 Oct 1991 EP
0519630 Dec 1992 EP
0393935 Mar 1995 EP
0744727 Feb 1997 EP
1191500 Mar 2002 EP
1384635 Jan 2004 EP
2418461 Feb 2012 EP
2006123891 May 2006 JP
2014170000 Sep 2014 JP
20100109283 Oct 2010 KR
2009011420 Mar 2010 MX
2010001545 Aug 2010 MX
1984001823 May 1984 WO
1999063357 Dec 1999 WO
2000070530 Nov 2000 WO
2001024393 Apr 2001 WO
2001059601 Aug 2001 WO
2002089077 Nov 2002 WO
2003034089 Apr 2003 WO
2003036462 May 2003 WO
2003079717 Sep 2003 WO
2003012473 Mar 2004 WO
2004075090 Sep 2004 WO
2004102536 Jun 2005 WO
2005086933 Sep 2005 WO
2006028995 Mar 2006 WO
2006028995 Feb 2007 WO
2007146449 Dec 2007 WO
2008034097 Mar 2008 WO
2007146449 Oct 2008 WO
2008121612 Oct 2008 WO
2008141456 Nov 2008 WO
2008144411 Nov 2008 WO
2005086933 Dec 2008 WO
2009021117 Feb 2009 WO
2009058972 Jul 2009 WO
2009097595 Aug 2009 WO
2010047887 Apr 2010 WO
2012162358 Nov 2012 WO
2012162450 Nov 2012 WO
2013078291 May 2013 WO
2013138798 Sep 2013 WO
2014008752 Jan 2014 WO
2016061355 Apr 2016 WO
Related Publications (1)
Number Date Country
20200193196 A1 Jun 2020 US
Provisional Applications (1)
Number Date Country
62778553 Dec 2018 US