Scene change detection in a dimensioner

Information

  • Patent Grant
    9,940,721
  • Date Filed
    Friday, June 10, 2016
  • Date Issued
    Tuesday, April 10, 2018
Abstract
A package dimensioner is disclosed. A change in the pose of the package dimensioner is detected by background modeling the area of a measurement platform and then determining if a number of points in a scene are different in distance from the background model. Change in the pose can also be detected by comparing a count of support points in a 3D container generated from images taken in a training process with a count of support points in a subsequent image and determining how many support points are different.
Description
FIELD OF THE INVENTION

The present invention relates to a dimensioner device that uses image processing to measure the physical size of an object. More particularly, a method and apparatus are provided for determining if a scene has changed, indicating relative movement between a camera and a measurement platform.


BACKGROUND

A fixed-position package dimensioner is used to measure the X, Y and Z dimensions of an object such as a package. To provide accuracy, such a dimensioner assumes a static relationship between a camera and a plane upon which objects such as parcels are placed. Once this relationship is established, translation or rotation of the camera will usually lead to under- or over-estimates of a parcel's size. Likewise, the reference plane (e.g., a scale or platform) against which the dimensioner measures packages cannot usually move without introducing error. In other words, the sensors and reference plane should not move relative to each other after initialization. Independent movement of parts within the system can lead to poor accuracy of measurements of packages in all dimensions.


If a user intentionally moves the sensors to change the view of the scene, for example, previous knowledge about the reference plane becomes invalid. The user may not even be aware that changing a sensor's pose will invalidate the original setup and reduce measuring accuracy.


The user may also be unaware that the dimensioning system hardware has moved. Movement could be very gradual over time, due to, for example, a loose mounting bracket and vibration or jarring. A sensor on a wire or pole could slide slightly over time or be accidentally bumped out of position.


Therefore, a need exists for an automated process of re-discovering a reference plane when initial alignment has been disturbed.


SUMMARY

Accordingly, in one aspect, the present invention embraces a package dimensioner. Change in the pose of the package dimensioner is detected by background modeling the area of a measurement platform and then determining if a number of points in a scene are different in distance from the background model. Change in the pose can also be detected by comparing a count of support points in a 3D container generated from images taken in a training process with a count of support points in a subsequent image and determining how many support points are different.


In an example embodiment, a method of detecting a change in the pose of a package dimensioning system relative to its operating environment involves: initializing the dimensioning system by: at a range camera, capturing one or more initial reference images of a measurement platform and surrounding area; at a processor: generating a reference depth map from each initial reference image; generating and storing to a memory a background model from the initial reference depth maps; testing the dimensioning system for a scene change by: at the range camera, capturing a subsequent image of the measurement platform and surrounding area; at the processor: generating a current depth map from the subsequent image; comparing each pixel of the current depth map with a corresponding pixel of the background model; counting a number of pixels Pv of the current depth map that differ absolutely from the background model by more than a prescribed threshold THRESH1; and if the number of pixels Pv is greater than a threshold THRESH2, determining that a significant change in the image has occurred.


In certain illustrative embodiments, the testing is carried out on a periodic basis. In certain illustrative embodiments, the process further involves executing a dimensioning process to measure the dimensions of an object on the measurement platform. In certain illustrative embodiments, the testing is carried out whenever a prescribed period of inactivity in measuring dimensions of an object on the measurement platform has elapsed. In certain illustrative embodiments, the testing is carried out prior to each measurement of dimensions of an object on the measurement platform. In certain illustrative embodiments, the process further involves generating an alert upon determining that a significant scene change has occurred. In certain illustrative embodiments, upon determining that a significant scene change has occurred, the initializing is repeated.


In another example embodiment, a dimensioning system has a measurement platform. A range camera is mounted so as to capture an image of the measurement platform and surrounding area. A processor is programmed to carry out the following actions: initialize the dimensioning system by: receiving one or more initial reference images of the measurement platform and surrounding area from the range camera; generating and storing to a memory a background model from the one or more captured initial reference images; test the dimensioning system for a scene change by: receiving a subsequent image of the platform area from the range camera; generating a current depth map from the subsequent image; comparing each pixel of the current depth map with a corresponding pixel of the background model; counting a number of pixels Pv of the current depth map that differ absolutely from the background model by more than a prescribed threshold THRESH1; and if the number of pixels Pv is greater than a threshold THRESH2, determining that a significant change in the image has occurred.


In certain illustrative embodiments, the testing is carried out on a periodic basis. In certain illustrative embodiments, the processor further executes a dimensioning process to measure dimensions of an object on the measurement platform. In certain illustrative embodiments, the testing is carried out whenever a prescribed period of inactivity in measuring dimensions of an object on the measurement platform has elapsed. In certain illustrative embodiments, the testing is carried out prior to each measurement of dimensions of an object on the measurement platform. In certain illustrative embodiments, the processor generates an alert upon determining that a significant scene change has occurred. In certain illustrative embodiments, upon determining that a significant scene change has occurred, the processor repeats the initializing.


In another example embodiment, a method of detecting a change in the pose of a package dimensioning system relative to its operating environment involves: initializing the dimensioning system by: at a range camera, capturing an initial reference image of a measurement platform and surrounding area; at a processor, generating a three-dimensional container around the platform and storing the container to memory; at the processor, determining a count of the support points in the container from the reference image; testing the dimensioning system for a scene change by: at the range camera, capturing a subsequent image of the measurement platform and surrounding area; at the processor: counting support points in the subsequent image that are in the container; comparing the count of support points in the container in the subsequent image with the count of support points in the container in the reference image; based on the comparison, determining if a prescribed difference in the counts is present; and, upon determining that a prescribed difference in the counts exists, establishing that a significant scene change has occurred.


In certain illustrative embodiments, the testing is carried out on a periodic basis. In certain illustrative embodiments, the process further involves executing a dimensioning process to measure dimensions of an object on the platform. In certain illustrative embodiments, the testing is carried out whenever a prescribed period of inactivity in measuring dimensions of an object on the measurement platform has elapsed. In certain illustrative embodiments, the testing is carried out prior to each measurement of dimensions of an object on the measurement platform. In certain illustrative embodiments, the process further involves generating an alert upon determining that a significant scene change has occurred. In certain illustrative embodiments, the container comprises a right prism with a base approximating a convex polygon, where the base is parallel to a congruent convex polygon that bounds the measurement platform, and where the prism's height equals twice a maximum support distance. In certain illustrative embodiments, the container comprises a right cylinder with a circular base, where the base is parallel to a congruent circle that bounds the measurement platform, and where the cylinder's height equals twice the maximum support distance. In certain illustrative embodiments, when a significant scene change is deemed to have occurred, the process further involves searching the scene for the measurement platform at a location coplanar therewith. In certain illustrative embodiments, upon determining that a significant scene change has occurred, the process involves repeating the initializing.


In a further example embodiment, a dimensioning system has a measurement platform. A range camera is mounted so as to capture an image of the measurement platform and surrounding area. A processor is programmed to carry out the following actions: initialize the dimensioning system by: receiving an initial reference image of a platform area from the range camera; generating a three-dimensional container around the measurement platform and storing the container to memory; determining a count of the support points in the container from the reference image; test the dimensioning system for a scene change by: receiving a subsequent image of the measurement platform and surrounding area from the range camera; counting support points in the subsequent image that are in the container; comparing the count of support points in the container in the subsequent image with the count of support points in the container in the reference image; based on the comparison, determining if a prescribed difference in the counts of support points is present; and, upon determining that a prescribed difference in the counts exists, establishing that a significant scene change has occurred.


In certain illustrative embodiments, the testing is carried out on a periodic basis. In certain illustrative embodiments, the process further involves executing a dimensioning process to measure dimensions of an object on the measurement platform. In certain illustrative embodiments, the testing is carried out whenever a prescribed period of inactivity in measuring dimensions of an object on the measurement platform has elapsed. In certain illustrative embodiments, the testing is carried out prior to each measurement of dimensions of an object on the measurement platform. In certain illustrative embodiments, the process further involves generating an alert upon determining that a significant scene change has occurred. In certain illustrative embodiments, the container comprises a right prism with a base approximating a convex polygon, where the base is parallel to a congruent convex polygon that bounds the measurement platform, and where the prism's height equals twice the maximum support distance. In certain illustrative embodiments, the container comprises a right cylinder with a circular base, where the base is parallel to a congruent circle that bounds the measurement platform, and where the cylinder's height equals twice the maximum support distance. In certain illustrative embodiments, when a significant scene change is deemed to have occurred, the processor further searches the scene for the measurement platform at a location coplanar therewith. In certain illustrative embodiments, when a significant scene change has been established to have occurred, the processor repeats the initialization.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 depicts an example of a dimensioner arrangement consistent with certain example embodiments.



FIG. 2 depicts an example range camera configuration consistent with certain example embodiments.



FIG. 3 is an example of a flow chart of a process that uses a background model to determine if a scene change has taken place in a manner consistent with certain embodiments.



FIG. 4 is a further example of a flow chart of a process that uses a background model to determine if a scene change has taken place in a manner consistent with certain embodiments.



FIG. 5 is an example of a flow chart of a process that uses a prism constructed about a polygon to determine if a scene change has taken place in a manner consistent with certain embodiments.



FIG. 6 depicts an example of construction of a prism about a polygon in a manner consistent with certain embodiments.



FIG. 7 is a further example of a flow chart of a process that uses a prism constructed about a polygon to determine if a scene change has taken place in a manner consistent with certain embodiments.





DETAILED DESCRIPTION

The embodiments consistent with the present invention embrace several methods and apparatus for detecting that a dimensioner is out of alignment by virtue of movement of a component of the system with respect to other components of the system.


For purposes of this document, the term “Mixture of Gaussians” refers to a family of background modeling methods. Single-channel grayscale images, RGB images, or depth images could be used (depth images contain Z-values observed at each pixel location). Each pixel location has one or more associated Gaussian (Normal) probability distributions (Gaussians) based on the observed intensities. The multiple Gaussians are weighted and combined to form a mixture distribution. The Gaussians can be dynamic; new ones can be added, or they can merge, or the mean (mu) and standard deviation (sigma) can change.


The term “support points” means three-dimensional (XYZ) points that contribute to the definition of some geometric shape. Support points for a plane are points in 3D space detected by the system which are very close (in distance) to a reference plane. A threshold distance (for example, a few centimeters) defines how close a point must be to the geometric surface for the point to be considered a supporting point. A three-dimensional support point in the scene contributes to the equation for a three-dimensional reference plane. The point “supports” the reference plane in that the point is relatively close to the reference plane and is used in fitting the plane (i.e., determining a location of the reference plane in 3D space). For purposes of this document, a support point is within a threshold absolute distance from the plane, such as a maximum of, e.g., 2 cm away orthogonally. Also, the approximated surface normal at the support point is near the surface normal of the platform's plane (with a maximum threshold angle of, for example, a few degrees between 3D normal vectors). A support point approximates a true point somewhere on the plane, but due to noise, the support point may be slightly above or below the true plane of the reference plane.
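
To make the support-point test concrete, the following is a minimal Python/NumPy sketch of the two checks described above (orthogonal distance and normal agreement). The function name, parameter names, and the 2 cm / few-degree defaults are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def is_support_point(point, point_normal, plane_point, plane_normal,
                     max_dist=0.02, max_angle_deg=5.0):
    # A point "supports" the plane when (1) its orthogonal distance to the plane
    # is within max_dist (e.g., 2 cm) and (2) its estimated surface normal is
    # within max_angle_deg of the plane's normal.
    plane_normal = plane_normal / np.linalg.norm(plane_normal)
    point_normal = point_normal / np.linalg.norm(point_normal)
    dist = abs(np.dot(point - plane_point, plane_normal))
    cos_angle = abs(np.dot(point_normal, plane_normal))          # sign-insensitive
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return dist <= max_dist and angle_deg <= max_angle_deg
```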


In one embodiment, the processor may use a combination of RANSAC (random sample consensus) and least squares fitting to find a three-dimensional plane that approximates the platform's top surface. In RANSAC, the processor builds a large number of random candidate planes and outputs the plane with the largest number of support points. In each iteration of RANSAC, the processor chooses three three-dimensional points randomly from the set of points in the scene and constructs a plane through the points. The processor then counts the number of support points from the scene that lie within a threshold distance of the plane. Finally, given the plane with the largest number of support points and the list of support points, the processor fits a new plane through the support points using least squares.
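
The following sketch illustrates this RANSAC-plus-least-squares idea in Python/NumPy; the iteration count, distance threshold, and function name are illustrative assumptions rather than values from the patent.

```python
import numpy as np

def fit_plane_ransac(points, iterations=500, dist_thresh=0.02, rng=None):
    # points: (N, 3) array of 3D scene points.
    # Returns (centroid_on_plane, unit_normal, support_mask).
    if rng is None:
        rng = np.random.default_rng()
    best_mask = None
    for _ in range(iterations):
        p0, p1, p2 = points[rng.choice(len(points), size=3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-9:                      # degenerate (nearly collinear) sample
            continue
        n = n / np.linalg.norm(n)
        mask = np.abs((points - p0) @ n) < dist_thresh    # support points of this candidate plane
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask = mask
    # Least-squares refinement: plane through the centroid of the support points,
    # with normal given by the smallest singular vector of the centered points.
    support = points[best_mask]
    centroid = support.mean(axis=0)
    _, _, vt = np.linalg.svd(support - centroid)
    return centroid, vt[-1], best_mask
```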


The term “convex hull” means a mathematically constructed polygon in three-dimensional space that describes the outermost extent of the platform/scale. By “convex,” it is meant that the polygon has no “dents,” i.e., successive vertices do not change from a counter-clockwise to a clockwise orientation or vice-versa. The convex hull generally has a small number of vertices, all of which are coplanar in 3D.


The term “platform” or “measurement platform” is used to mean a reference plane for dimensioning such as a floor or table/counter top surface or a weight scale top surface.


The term “prism” is used to mean a mathematically constructed structure that uses shifted copies of the convex hull as bases. The convex hull is shifted up along the platform's normal vector to form the top base, and the convex hull is shifted down in the opposite direction of the platform's normal vector to form the lower base. The height of the prism is often only a few centimeters and is meant to contain the support points near the actual platform. In this manner, walls are mathematically constructed normal to the convex hull to define a space bordered by the platform. In this document, the prism can essentially be considered a bounding container (a mathematical construct—not an actual physical container) around the platform. The container could be a “right prism” with bases that are polygons and segments representing height that are perpendicular to the bases. Or, the container could be a “right cylinder” with circular or elliptical bases. In all cases, the prism represents a shell around a polygon representing the platform that is flat in 3D space. A height (e.g., +/−2 cm) is added to form the prism or cylinder.
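
As one way to picture the container test, the sketch below counts how many 3D points fall inside a prism built from a planar convex hull (the hull shifted plus/minus half_height along the platform normal). It assumes the hull vertices are coplanar and ordered counter-clockwise about the normal; the names and the 2 cm default are illustrative, not from the patent.

```python
import numpy as np

def count_points_in_prism(points, hull_vertices, normal, half_height=0.02):
    # points: (N, 3); hull_vertices: (M, 3) coplanar, counter-clockwise about `normal`.
    normal = normal / np.linalg.norm(normal)
    origin = hull_vertices[0]
    heights = (points - origin) @ normal                 # signed distance from the hull's plane
    inside = np.abs(heights) <= half_height              # within +/- half_height (e.g., 2 cm)
    proj = points - np.outer(heights, normal)            # project points onto the hull's plane
    m = len(hull_vertices)
    for i in range(m):                                   # must lie on the inner side of every edge
        a, b = hull_vertices[i], hull_vertices[(i + 1) % m]
        edge_inward = np.cross(normal, b - a)            # inward-pointing edge normal for CCW order
        inside &= ((proj - a) @ edge_inward) >= 0
    return int(inside.sum())
```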


In accord with certain embodiments a mechanism is provided to detect when a camera forming a part of a dimensioning system has moved—perhaps even by a small amount—and alert the user and/or shut down dimensioning operations until a re-initialization is completed to account for the movement. In the present document, “movement” means that the camera or platform is moved with respect to one another such that the image registered to the camera is changed from that image used in the initialization of the dimensioning system.


Further, minor changes in the scene that are not problematic should preferably be ignored if they do not significantly disturb the sensor-platform pose. For instance, if a user places a very large package in the scene that consumes most of a sensor's view, the system should not mistake it for a significant, global change and generate an alarm. Correspondingly, the detection module should avoid false alarms so as not to annoy the user or render the system unusable for its intended purpose.


Movement of components in a dimensioner system can be inhibited by incorporating physical restraints (to prevent motion of the camera with respect to the platform). Also, paper seals can be used to provide an indication of when such motion has taken place. But additional physical sealing may unnecessarily increase product cost and may need to be customized for conformity with local metrological certification rules. By contrast, an algorithmic approach does not add to the hardware cost and is consistent with the certification rules in that it renders the system inoperable when it is unable to produce an accurate result. Of course, physical restraints and seals can also be used in conjunction with the techniques disclosed herein.


Turning now to FIG. 1, a dimensioner system 10 is depicted in which a camera 14 is mounted in a fixed relationship to a measurement platform (“platform”) upon which parcels or other objects to be measured, such as parcel 22, are placed. The parcel 22 may be placed on a weight scale 24 positioned on the table top 18 so that weight can be determined at the same time as the overall dimensions in X, Y and Z directions are determined for the parcel being measured. In this instance, the top surface of the weight scale 24 becomes the reference plane or platform from which the dimensions are referenced. The dimension determinations are made by analysis of the images from camera 14 by a programmed processor 26 and associated memory/storage 28 (e.g., RAM, ROM, volatile and non-volatile memory and/or disc drives or flash memory, etc.), which can ultimately display the size on a display 30 forming part of a user interface for the system.


A range camera 14 such as that depicted in FIG. 2 is used in accord with certain example embodiments. Such a range camera 14 generates 3D spatial data. The camera 14 (or the camera 14 in conjunction with processor 26) provides 3D spatial data from which the processor 26 infers the reference plane. Camera 14, in certain implementations, includes an infrared (IR) camera 34 and an RGB camera 38 in a stereoscopic arrangement as shown, with a horizontal spacing between the lens/aperture of the two cameras. The camera 14 further incorporates an infrared projector 42 which projects structured light using a laser and diffractive optical element(s) to produce a large number (e.g., about 50,000 to 100,000, for certain example embodiments) of 3D reference points. While shown as part of camera 14, the projector can be a separate component without limitation.


Range cameras such as 14 are commercially available or can be custom built. It is desirable for enhancement of accuracy for the range camera to have a wide viewing angle and be able to focus on the objects that are being measured (e.g., within about 0.5 to 4.0 meters in certain example embodiments). It is also desirable for the camera to have a large baseline D (e.g., between about 8 and about 25 cm), since larger baselines will produce greater accuracy.


In each case of the present system, when the system is turned on, an “initialization phase” starts the process. During the initialization phase, the user selects a platform (e.g., a scale). This can be done by presenting an image from the RGB camera 38 of camera 14 and requesting the user to indicate a location on the platform, e.g., by clicking a mouse when the pointer is on the platform or tapping the platform on a touch-screen display. Other methods may occur to those skilled in the art upon consideration of the present teachings.


Two separate example methods for detecting a moved sensor and/or platform are provided by way of example herein, but the invention itself is not to be constrained by the details of the techniques disclosed. In each case, when it is established that a significant scene change has occurred, any number of actions can be carried out including, but not limited to, providing an audible or visual alert to the user and/or repeating the initialization phase to account for the change in pose.


Background Model Example


In one implementation, background modeling is used to detect a changed scene. In the background model example, the system prompts the user to select the platform during the initialization phase. The depth maps are used to train a background model. While testing a new scene, the previously trained background model is compared to a depth map of a current image. Such a depth map may contain information for 100,000 pixels in one example. The system then classifies a pixel in the image as foreground if the depth value for that pixel is significantly different from the corresponding depth of the trained background model. If most of the test scene is classified as foreground, then the system determines that the camera or platform has been moved (or that there has otherwise been a significant scene change).


Once movement of the camera or platform is detected, the system acquires a new platform location. The system finds the new location of the platform by searching for the largest plane that is coincident with the previous plane as defined by the planar equation of the platform. The planar parameters can then be adjusted according to the newly obtained location. In addition, the system may wait for a fixed amount of time before alerting the user regarding any relative movement between the camera and the platform so as to avoid false alerts.


If the relative position of the camera and platform changes significantly, the camera's current images should vary significantly from images captured during an initialization phase of operation of the system. FIG. 3 depicts an example flow chart of a process 100 consistent with an example implementation starting at 104 where the device is started or restarted and initialized. The user is prompted to select the desired platform at 108 and the system finds the platform at 112. The user can then approve this platform at 116.


The system captures, via camera 14, at least one depth map containing depth (or Z-values in X,Y,Z geometry) in a regular matrix (forming a range image). The depth map(s) are used to train a background model at 120 by generating per-pixel statistical (e.g., Gaussian) distributions consistent with the captured depth maps for a collection of successive captured images from camera 14. This trained background model represents a depth map of the environment of the platform as the platform appeared at the time when the system was trained at 120.
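
A minimal sketch of training such a per-pixel statistical model from successive depth frames is shown below (a single running Gaussian per pixel, updated with Welford's algorithm); the class name and structure are illustrative and not part of the patent. Training then simply amounts to calling update() on each frame captured of the empty scene during initialization.

```python
import numpy as np

class DepthBackgroundModel:
    # One running Gaussian (mean, variance) per pixel of the depth map.
    def __init__(self, shape):
        self.count = 0
        self.mean = np.zeros(shape, dtype=np.float64)
        self.m2 = np.zeros(shape, dtype=np.float64)      # running sum of squared deviations

    def update(self, depth_frame):
        # Fold one depth frame (same shape as the model, e.g., depth in mm) into the model.
        self.count += 1
        delta = depth_frame - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (depth_frame - self.mean)

    @property
    def std(self):
        # Per-pixel standard deviation of the trained model.
        return np.sqrt(self.m2 / max(self.count - 1, 1))
```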


Training the background model involves an initial training phase of a blank scene and then continual updates, even when testing. Generally, the background model of an empty scene (containing the platform but no parcel) is trained for, e.g., 5 to 10 minutes at startup. Then, during testing, the processor will find foreground objects that violate the background model and will also adapt/update the model. The initial training and update phases essentially make similar changes to the background model, but the update phase ignores foreground “blobs” and doesn't update the model for points where foreground was detected.


Initial training of the background model is carried out by placing the camera in a fixed position, such as on a tripod or a rigid mount. A plurality of frames is obtained from the camera for at least about 30 seconds. The scene should contain minimal disruptions like people walking through it. Then, one of the following processes is carried out for each frame:


1) Assuming use of depth maps (a regular matrix of depth values for each pixel), the frame will be a regular matrix with a depth value (e.g., expressed in mm) or disparity value at each pixel location.


2) Assuming use of a “mixture of Gaussians” model, each training frame is used to update a Gaussian model for each pixel location. Distributions could merge together, or new ones can be created. The mean and standard deviation are then updated for each Gaussian model for each pixel.


Once the background model is trained at 120, the system can begin operation to measure objects placed upon the platform (or placed upon a scale on the platform) in a dimensioning loop 124.


It is desirable to frequently check the alignment of the system to assure that the camera to platform orientation (the “pose”) has not changed. This is done in a “testing phase”. In certain example embodiments, this is checked at three different instances. A check can be done 1) prior to each parcel measurement, 2) after a periodic downtime has been reached (i.e., time between measurements), and 3) on a strictly periodic basis. Many variations will occur to those skilled in the art upon consideration of the present teachings.


During the testing phase, at each test frame, the background model is updated, skipping updates for foreground regions. Just as in the initialization phase, the statistics (e.g., the Gaussians) can be updated. Foreground objects left stationary for a long time will eventually become background as a result of these updates. The example system uses a maximum number of old frames that are stored as history and older frames are eventually deleted. For certain example implementations, the system “forgets” frames that were captured more than about 10,000 frames ago. In one example system, frames are captured at about 10 frames per second.


In the present example embodiment, a downtime timer Td is set prior to the first measurement, along with a periodic timer Tp, as indicated at 128. At 132, a new frame is received from camera 14. At 134, the user can start a new measurement by initiating a command via the user interface 30 at 136. When this happens, the system first runs a check (by going into a testing phase) to assure that there has been no movement to disturb alignment of the system. This occurs at 140 and 144 where the system checks to see if a large number (as defined by a threshold) of variations are present from the background model trained at 120. A first threshold THRESH1 defines the maximum distance between the current value of a pixel's depth and the value in the background model. The distance in this case is unsigned, since it does not matter whether the current depth is closer to or farther from the camera; the only thing of relevance is how far the depth is from the corresponding depth in the background model. Thus, if a pixel's depth value differs from the background model by a distance greater than THRESH1, the variation is deemed to be large enough to be considered a variation from the background model. The number of such variations is counted to produce a count Pv, and Pv is compared to a second threshold THRESH2 to determine if there is great enough variation to judge that there has been a change in the “pose” of the system.


When carrying out this test at 140 and 144, the current depth map is compared to the depth map representing the background model. The system classifies a pixel as foreground if its depth value differs from the trained value by more than THRESH1 (for example, if the difference is greater than about 5-20 mm in one example). If a large amount of the current depth map is classified as foreground points (e.g., if the number of points Pv is greater than THRESH2), then it can be concluded that the camera probably moved or the alignment of the system otherwise changed. Thus, at 144, the number of foreground points Pv is compared to the threshold THRESH2; if the number of foreground points is greater than THRESH2, movement is deemed to have occurred. In one illustrative example, the value of THRESH2 can be set at about 75-95% of the total number of points.
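
A compact sketch of this foreground-counting test follows, assuming the current depth map and the trained background depth map are NumPy arrays in millimeters; the 10 mm and 85% defaults are merely example values within the ranges mentioned above, and the function name is illustrative.

```python
import numpy as np

def significant_scene_change(current_depth, background_depth,
                             thresh1_mm=10.0, thresh2_fraction=0.85):
    # Count pixels whose depth differs (absolutely) from the background model by
    # more than THRESH1, then compare that count Pv to THRESH2, expressed here as
    # a fraction of the valid pixels.
    valid = (current_depth > 0) & (background_depth > 0)        # ignore pixels with no depth reading
    foreground = np.abs(current_depth - background_depth) > thresh1_mm
    pv = np.count_nonzero(foreground & valid)
    return pv > thresh2_fraction * np.count_nonzero(valid)
```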


Using depth maps for background modeling is robust against shadows and other lighting changes. It is noted that RGB/gray images can also be used in a background model. In addition, the background models of this example are not adaptive, but they could be modified to accept long-term changes, such as a newly introduced, semi-permanent object, into the background.


If, at 144, the system determines that the threshold THRESH2 has not been exceeded and the system has not been moved, the user can be prompted to place a parcel or other object to be measured on the platform at 148. The package can then be measured by the system at 152, the results displayed or otherwise used (e.g., as input to a point of sale system) at 156, and the downtime timer Td reset at 160.


If the user is between measurements and has not generated an instruction indicating that the measurement is to start at 138 for a time Td (e.g., 15 to 30 minutes), the process 100 goes to 164 to check to see if the downtime timer Td has expired. If Td has not expired, the process returns to 132 to await the next frame from the camera. If Td has expired, the timer Td is reset at 168 and a process identical to that defined at 140 and 144 is started at 172 and 176 to assure that the system is in alignment. At 144 or 176, if the process detects that there has been a camera movement or other change that affects alignment, the process generates an alert at 180 to make the user aware that the system is out of alignment and that an alignment initialization process is to be initiated. This alert can be an audible or visual alert that is intended to get the user's attention. The process then returns to an earlier point in the process, such as 108, to begin an alignment re-initialization of the system.


In addition to the downtime timer, a periodic timer can be used to check the calibration on a regular basis (e.g., every 15 minutes). If the downtime timer Td has not expired at 164, the system checks to see if the periodic timer Tp has expired at 184. If not, the process awaits a new frame from the camera 14 at 132. If timer Tp has expired, the periodic timer Tp is reset at 188 and control passes to 172.



FIG. 4 shows a somewhat more isolated and detailed example of a process 200, such as that carried out in process 100, starting at an initialization phase 202 where the Z direction values are captured as a depth map matrix. This depth map is used at 206 to train a background model that establishes which points are considered background.


At 210, a test of the current scene is initiated in which a new depth map is created for the current scene. This new depth map is compared with the trained background model (i.e., a reference depth map created at the time of initialization of the system). At 214, each pixel of the new depth map is compared to the corresponding pixel of the trained background model, and if the value differs greatly (for example, if the pixel differs by more than THRESH1), then the new pixel is classified as a foreground pixel 218. Otherwise, the pixel is classified as a background pixel 222. The numbers of foreground and background points are counted at 226, and the number of foreground points Pv is compared to a threshold THRESH2 at 230. If the number of foreground points is greater than THRESH2, then the system deems that a change (movement) has been detected at 234 and a new initialization and training process is initiated. If no movement is detected at 238, the system is deemed to be in condition to make dimensioning measurements. In example embodiments, the values of THRESH1 and THRESH2 can be optimized for any given dimensioning setup and system. Many variations will occur to those skilled in the art upon consideration of the present teachings.


When background modeling is used for the present system, a few possible models can be considered. A single range image from range camera 14 can be used for the model, and the process can determine the difference between the test image and the trained reference image to find foreground. When using one range image for the background model, in one implementation, a distance threshold THRESH1 of 5 mm and a threshold foreground percentage THRESH2 of 90% can be used. In other words, any new depth value that differs from the corresponding trained depth value by more than a threshold (e.g., 5 mm in one example) is deemed foreground, and if more than 90% of pixels are foreground in one example, then the camera or platform is deemed to have moved.


In another example, a single Gaussian at each pixel location can be used and trained with perhaps a few minutes (e.g., 1-5 minutes) of frames of a blank scene. Each pixel has one Normal distribution of depths with a mean value and standard deviation value. When testing, standardized scores (z-scores) can be computed over the scene. Absolute z-scores surpassing some threshold, for example three standard deviations, could be deemed foreground. Many variations are possible without deviating from the principles described herein.
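
A sketch of that per-pixel z-score test follows, assuming mean and standard deviation images were trained from a stack of blank-scene frames (for example, frames.mean(axis=0) and frames.std(axis=0)); the names and the three-sigma default are illustrative.

```python
import numpy as np

def zscore_foreground_mask(current_depth, mean_depth, std_depth, z_thresh=3.0):
    # Flag a pixel as foreground when its depth deviates from the trained per-pixel
    # Gaussian by more than z_thresh standard deviations.
    safe_std = np.maximum(std_depth, 1e-6)           # guard against zero variance
    z = np.abs(current_depth - mean_depth) / safe_std
    return z > z_thresh                              # boolean foreground mask
```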


Convex Hull Example


In another implementation, referred to as the convex hull method, the dimensioning system 10 checks to determine whether or not the platform remains in the location where it was first imaged by use of a convex hull as defined above.


In this example, the system prompts the user to select the platform during the initialization phase in the same manner previously described. Then, a convex hull is built around the selected platform. The convex hull contains the supporting points for the platform. While testing a new scene, the quantity of support points within a prism or cylinder is calculated using the original platform's convex hull as a reference end of the prism or cylinder. If too few or too many of the support points exist in the container, then the system concludes that the platform or the camera has been moved. Further, the system waits for a fixed amount of time (e.g., 5 seconds) before raising an alarm upon detection of relative movement between camera and platform. Once the movement of camera or platform is detected, the system acquires the new platform location. The system finds the new location of the platform by searching for the largest plane that is approximately coincident with the previous planar equation of the platform and adjusts the planar parameters according to the newly obtained location.


A process 300 representing this example implementation is depicted in FIG. 5. The process starts at 304 with startup or reset of the dimensioning system 10. At 308, the user selects the desired platform to be used for the dimensioning. This can be done by presenting an image from the RGB camera 38 of camera 14 and requesting the user to indicate a location on the platform, e.g. by clicking a mouse when the pointer is on the platform or tapping the platform on a touch-screen display. Other methods may occur to those skilled in the art upon consideration of the present teachings.


During the initialization phase at 312, the platform is found in an initial image captured by camera 14. The processor 26 then computes a “convex hull” around the platform. This convex hull should contain support points for the platform (as defined above). As explained above, the convex hull is essentially a polygon in three dimensions that is approximately coincident with the reference plane of the platform.



FIG. 6 depicts the process of constructing walls of a prism about the convex hull. The convex hull 311 in this simple example case is a four-sided parallelogram. But in other examples, the convex hull could be represented by a polygon that has three or more sides (or by a circle or oval) and is used to characterize the surface of the platform that serves as a reference plane. In this illustration, a plurality of support points is shown in the convex hull 311. While these support points are shown to be on the plane of the convex hull, they may differ slightly from actual coplanar points (e.g., points 313). The prism can be constructed mathematically by extending walls 314, 316, 318 and 320 upwards in a direction normal to the convex hull toward a projection of the convex hull that forms a plane parallel to the convex hull.


When the prism has been defined by the system at 312, the user may be prompted to approve the identified platform at 324, and if the user does not approve, he or she again selects the platform at 308 and the convex hull is again identified and the prism walls constructed at 312.


Once the user approves the proper selection of the platform at 324, the system is ready to carry out the dimensioning process, which operates as a continuous loop at 328. As in the prior example, this example is shown to have three separate timing arrangements for performing a scene change test on the dimensioning system. To accomplish this, timers Tp and Td are set at 332 and a new frame is received from camera 14 at 336.


The user can start a new measurement at 340 to dimension a parcel or other object. But until such measurement is initiated at 340, the system checks the status of the downtime timer Td at 342. If Td has not expired at 342, the periodic timer Tp is checked at 344. If Tp has not expired, the process returns to 336. If the downtime timer Td has expired at 342, it is reset at 348 and control passes to 350. Similarly, if the periodic timer Tp has expired at 344, it is reset at 352 and control passes to 350. In either case, at 350, the number of points in the prism is counted, and if this number differs significantly from the count obtained from the originally established trained reference prism, a significant scene change will be deemed to have occurred. This determination is made at 354, where the system determines if the count is within a suitable range based on a computation at 350 which determines a ratio R of the reference count to the current count. The scene can be deemed to represent no significant change if the value of this ratio R is between a lower acceptable ratio and a higher acceptable ratio (e.g., 0.8<R<1.25). If R is within this range at 354, no movement is deemed to have taken place and the process returns to 336. But if the value of R is outside this range at 354, a significant scene change is deemed to have occurred and an alert is issued at 358 to the user so that the system can be reinitialized starting at 308.


Whenever a user wishes to measure a parcel or other object at 340 and initiates such process, another initial check is carried out starting at 362, which carries out the same process as 350. A decision is made at 366 whether or not the ratio R is within range, and if not, control passes to 358 to alert the user and re-initialize the system. If no scene change is detected (no movement) at 366, the user is prompted to place the object on the platform at 370 and the parcel is measured at 374. Results can be displayed at 378, or other appropriate action can be taken (e.g., transferring data to a point of sale terminal or other system), and the downtime timer is reset at 382. Control then passes back to 336.


When testing the current scene, support points are identified near the original plane (e.g., within a few mm) according to the planar equation that defines the convex hull. The quantity of support points within the “prism” is counted using the original platform's convex hull as a base. If many support points are found to still exist within the prism, then the platform and/or camera can be deemed to have not moved. Otherwise, the user can be alerted of a changed pose in the dimensioning system. Once relative movement between the camera and platform is detected, the user is directed to re-find the platform. The old platform is invalid, so the process of reporting dimensions of packages is immediately halted.


In one implementation, the process attempts to reacquire the platform as discussed above. Reacquisition assumes the original planar equation is still valid, but the platform has simply moved within the same plane. For example, a user may move a scale to another location on a countertop, but the scale's planar equation is still valid, so the process can try to find the new location of the scale. In one method, the process finds the largest plane that is approximately coincident with the previous plane, and it ensures the new quantity of support points is similar (e.g., within a threshold) to the original count. The process can then appropriately adjust the planar parameters.


Downtime can be established by determining if the user has not measured anything in X seconds. Likewise, if an RGB frame of the scene has not changed significantly in X seconds, the system could run a check. It is desirable to find downtimes when the user is not actively measuring so that the integrity of the alignment can be confirmed with minimal disturbance of the user's operation of the dimensioner.


Referring now to FIG. 7, a process 400 consistent with the discussion above is depicted starting at an initialization phase at 404. During this initialization phase, the polygon defining the convex hull that contains support points is used as the base to construct the prism walls. At 408, once the system is initialized, the current scene is tested by finding support points that are near (e.g., within about 10-30 mm of) the original plane. At 412, the number of support points within the prism is counted using the original platform polygon as the base. At 416, a ratio R of counts is taken (the current count divided by the original count), and this ratio R is compared to a lower threshold (THRESH-LOW) and an upper threshold (THRESH-HIGH). If the ratio is between the lower and higher thresholds, the system deems at 420 that there has been no significant change in the scene (e.g., no movement of the camera with respect to the platform). In such case, normal dimensioning can be carried out. However, if the ratio R is outside this range, the system deems at 424 that a significant scene change has taken place (e.g., there has been movement of the camera with respect to the platform). In such case, an alarm or other alert can be generated to make the user aware of the situation and/or a new initialization process can be initiated as previously discussed.
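
The count-ratio decision at 416-424 can be sketched as follows; the THRESH-LOW and THRESH-HIGH defaults simply reuse the 0.8 and 1.25 example values mentioned earlier, and the function name is illustrative.

```python
def pose_unchanged(reference_count, current_count, thresh_low=0.8, thresh_high=1.25):
    # R = current support-point count / reference count; the pose is considered
    # unchanged only when THRESH-LOW < R < THRESH-HIGH.
    if reference_count == 0:
        return False                 # no reference support points: force re-initialization
    ratio = current_count / reference_count
    return thresh_low < ratio < thresh_high
```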


It is noted that the lower and upper thresholds can be adjusted and optimized so as to establish how much change can be tolerated before the change is considered significant enough to halt measurements until a re-initialization can be carried out. Changes in lighting and movement within the camera's view (e.g., hands passing over the platform) can contribute to noise that is accounted for in part by the range between the thresholds.


In the implementation of finding support points for the original planar equation, a small point-to-plane distance threshold was used. For instance, a point may be deemed to exist in the original plane's support if its distance from the plane is less than about 20 mm. The corresponding convex prism is short in height, perhaps having a height of as much as about twice this maximum distance (i.e., 2*20=40 mm).


In certain implementations, a countdown timer can be used for concluding that the camera actually moved. The system should observe a significantly changed scene for X seconds before concluding that the camera/platform changed pose. After the countdown expires, the process changes to the phase of requiring the user to select the reference plane again. This countdown helps to further prevent false alarms by calling for a sustained change to the scene before producing a “camera moved” alarm. For example, a user may place a large package in the scene that violates the background model, but if he/she removes it before the countdown expires, then the system will not throw an alarm. In practice, a countdown of several minutes (e.g., 5-10 minutes) was found to be appropriate to provide a good balance between false alarms and accurate dimensioning. Many variations and modifications will occur to those skilled in the art upon consideration of the present teachings.
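
One way to realize this debounce is sketched below: a small helper that only reports a pose change once the changed scene has persisted for the hold period (e.g., several minutes). The class and method names are illustrative, not from the patent.

```python
import time

class PoseChangeCountdown:
    # Report a camera/platform move only after the scene change persists for hold_seconds.
    def __init__(self, hold_seconds=300.0):
        self.hold_seconds = hold_seconds
        self._changed_since = None

    def update(self, scene_changed_now):
        # Call once per test frame with the current scene-change flag.
        now = time.monotonic()
        if not scene_changed_now:
            self._changed_since = None       # transient change (e.g., a large parcel removed in time)
            return False
        if self._changed_since is None:
            self._changed_since = now        # start the countdown
        return now - self._changed_since >= self.hold_seconds
```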


  • U.S. Patent Application Publication No. 2014/0326787;
  • U.S. Patent Application Publication No. 2014/0332590;
  • U.S. Patent Application Publication No. 2014/0344943;
  • U.S. Patent Application Publication No. 2014/0346233;
  • U.S. Patent Application Publication No. 2014/0351317;
  • U.S. Patent Application Publication No. 2014/0353373;
  • U.S. Patent Application Publication No. 2014/0361073;
  • U.S. Patent Application Publication No. 2014/0361082;
  • U.S. Patent Application Publication No. 2014/0362184;
  • U.S. Patent Application Publication No. 2014/0363015;
  • U.S. Patent Application Publication No. 2014/0369511;
  • U.S. Patent Application Publication No. 2014/0374483;
  • U.S. Patent Application Publication No. 2014/0374485;
  • U.S. Patent Application Publication No. 2015/0001301;
  • U.S. Patent Application Publication No. 2015/0001304;
  • U.S. Patent Application Publication No. 2015/0003673;
  • U.S. Patent Application Publication No. 2015/0009338;
  • U.S. Patent Application Publication No. 2015/0009610;
  • U.S. Patent Application Publication No. 2015/0014416;
  • U.S. Patent Application Publication No. 2015/0021397;
  • U.S. Patent Application Publication No. 2015/0028102;
  • U.S. Patent Application Publication No. 2015/0028103;
  • U.S. Patent Application Publication No. 2015/0028104;
  • U.S. Patent Application Publication No. 2015/0029002;
  • U.S. Patent Application Publication No. 2015/0032709;
  • U.S. Patent Application Publication No. 2015/0039309;
  • U.S. Patent Application Publication No. 2015/0039878;
  • U.S. Patent Application Publication No. 2015/0040378;
  • U.S. Patent Application Publication No. 2015/0048168;
  • U.S. Patent Application Publication No. 2015/0049347;
  • U.S. Patent Application Publication No. 2015/0051992;
  • U.S. Patent Application Publication No. 2015/0053766;
  • U.S. Patent Application Publication No. 2015/0053768;
  • U.S. Patent Application Publication No. 2015/0053769;
  • U.S. Patent Application Publication No. 2015/0060544;
  • U.S. Patent Application Publication No. 2015/0062366;
  • U.S. Patent Application Publication No. 2015/0063215;
  • U.S. Patent Application Publication No. 2015/0063676;
  • U.S. Patent Application Publication No. 2015/0069130;
  • U.S. Patent Application Publication No. 2015/0071819;
  • U.S. Patent Application Publication No. 2015/0083800;
  • U.S. Patent Application Publication No. 2015/0086114;
  • U.S. Patent Application Publication No. 2015/0088522;
  • U.S. Patent Application Publication No. 2015/0096872;
  • U.S. Patent Application Publication No. 2015/0099557;
  • U.S. Patent Application Publication No. 2015/0100196;
  • U.S. Patent Application Publication No. 2015/0102109;
  • U.S. Patent Application Publication No. 2015/0115035;
  • U.S. Patent Application Publication No. 2015/0127791;
  • U.S. Patent Application Publication No. 2015/0128116;
  • U.S. Patent Application Publication No. 2015/0129659;
  • U.S. Patent Application Publication No. 2015/0133047;
  • U.S. Patent Application Publication No. 2015/0134470;
  • U.S. Patent Application Publication No. 2015/0136851;
  • U.S. Patent Application Publication No. 2015/0136854;
  • U.S. Patent Application Publication No. 2015/0142492;
  • U.S. Patent Application Publication No. 2015/0144692;
  • U.S. Patent Application Publication No. 2015/0144698;
  • U.S. Patent Application Publication No. 2015/0144701;
  • U.S. Patent Application Publication No. 2015/0149946;
  • U.S. Patent Application Publication No. 2015/0161429;
  • U.S. Patent Application Publication No. 2015/0169925;
  • U.S. Patent Application Publication No. 2015/0169929;
  • U.S. Patent Application Publication No. 2015/0178523;
  • U.S. Patent Application Publication No. 2015/0178534;
  • U.S. Patent Application Publication No. 2015/0178535;
  • U.S. Patent Application Publication No. 2015/0178536;
  • U.S. Patent Application Publication No. 2015/0178537;
  • U.S. Patent Application Publication No. 2015/0181093;
  • U.S. Patent Application Publication No. 2015/0181109;
  • U.S. patent application Ser. No. 13/367,978 for a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.);
  • U.S. patent application Ser. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.);
  • U.S. patent application Ser. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.);
  • U.S. patent application Ser. No. 29/486,759 for an Imaging Terminal, filed Apr. 2, 2014 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering);
  • U.S. patent application Ser. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/277,337 for MULTIPURPOSE OPTICAL READER, filed May 14, 2014 (Jovanovski et al.);
  • U.S. patent application Ser. No. 14/283,282 for TERMINAL HAVING ILLUMINATION AND FOCUS CONTROL filed May 21, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/327,827 for a MOBILE-PHONE ADAPTER FOR ELECTRONIC TRANSACTIONS, filed Jul. 10, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/334,934 for a SYSTEM AND METHOD FOR INDICIA VERIFICATION, filed Jul. 18, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/339,708 for LASER SCANNING CODE SYMBOL READING SYSTEM, filed Jul. 24, 2014 (Xian et al.);
  • U.S. patent application Ser. No. 14/340,627 for an AXIALLY REINFORCED FLEXIBLE SCAN ELEMENT, filed Jul. 25, 2014 (Rueblinger et al.);
  • U.S. patent application Ser. No. 14/446,391 for MULTIFUNCTION POINT OF SALE APPARATUS WITH OPTICAL SIGNATURE CAPTURE filed Jul. 30, 2014 (Good et al.);
  • U.S. patent application Ser. No. 14/452,697 for INTERACTIVE INDICIA READER, filed Aug. 6, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/453,019 for DIMENSIONING SYSTEM WITH GUIDED ALIGNMENT, filed Aug. 6, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/462,801 for MOBILE COMPUTING DEVICE WITH DATA COGNITION SOFTWARE, filed on Aug. 19, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/483,056 for VARIABLE DEPTH OF FIELD BARCODE SCANNER filed Sep. 10, 2014 (McCloskey et al.);
  • U.S. patent application Ser. No. 14/513,808 for IDENTIFYING INVENTORY ITEMS IN A STORAGE FACILITY filed Oct. 14, 2014 (Singel et al.);
  • U.S. patent application Ser. No. 14/519,195 for HANDHELD DIMENSIONING SYSTEM WITH FEEDBACK filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,179 for DIMENSIONING SYSTEM WITH MULTIPATH INTERFERENCE MITIGATION filed Oct. 21, 2014 (Thuries et al.);
  • U.S. patent application Ser. No. 14/519,211 for SYSTEM AND METHOD FOR DIMENSIONING filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/519,233 for HANDHELD DIMENSIONER WITH DATA-QUALITY INDICATION filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,249 for HANDHELD DIMENSIONING SYSTEM WITH MEASUREMENT-CONFORMANCE FEEDBACK filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/527,191 for METHOD AND SYSTEM FOR RECOGNIZING SPEECH USING WILDCARDS IN AN EXPECTED RESPONSE filed Oct. 29, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/529,563 for ADAPTABLE INTERFACE FOR A MOBILE COMPUTING DEVICE filed Oct. 31, 2014 (Schoon et al.);
  • U.S. patent application Ser. No. 14/529,857 for BARCODE READER WITH SECURITY FEATURES filed Oct. 31, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/398,542 for PORTABLE ELECTRONIC DEVICES HAVING A SEPARATE LOCATION TRIGGER UNIT FOR USE IN CONTROLLING AN APPLICATION UNIT filed Nov. 3, 2014 (Bian et al.);
  • U.S. patent application Ser. No. 14/531,154 for DIRECTING AN INSPECTOR THROUGH AN INSPECTION filed Nov. 3, 2014 (Miller et al.);
  • U.S. patent application Ser. No. 14/533,319 for BARCODE SCANNING SYSTEM USING WEARABLE DEVICE WITH EMBEDDED CAMERA filed Nov. 5, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/535,764 for CONCATENATED EXPECTED RESPONSES FOR SPEECH RECOGNITION filed Nov. 7, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/568,305 for AUTO-CONTRAST VIEWFINDER FOR AN INDICIA READER filed Dec. 12, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/573,022 for DYNAMIC DIAGNOSTIC INDICATOR GENERATION filed Dec. 17, 2014 (Goldsmith);
  • U.S. patent application Ser. No. 14/578,627 for SAFETY SYSTEM AND METHOD filed Dec. 22, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/580,262 for MEDIA GATE FOR THERMAL TRANSFER PRINTERS filed Dec. 23, 2014 (Bowles);
  • U.S. patent application Ser. No. 14/590,024 for SHELVING AND PACKAGE LOCATING SYSTEMS FOR DELIVERY VEHICLES filed Jan. 6, 2015 (Payne);
  • U.S. patent application Ser. No. 14/596,757 for SYSTEM AND METHOD FOR DETECTING BARCODE PRINTING ERRORS filed Jan. 14, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/416,147 for OPTICAL READING APPARATUS HAVING VARIABLE SETTINGS filed Jan. 21, 2015 (Chen et al.);
  • U.S. patent application Ser. No. 14/614,706 for DEVICE FOR SUPPORTING AN ELECTRONIC TOOL ON A USER'S HAND filed Feb. 5, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/614,796 for CARGO APPORTIONMENT TECHNIQUES filed Feb. 5, 2015 (Morton et al.);
  • U.S. patent application Ser. No. 29/516,892 for TABLE COMPUTER filed Feb. 6, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/619,093 for METHODS FOR TRAINING A SPEECH RECOGNITION SYSTEM filed Feb. 11, 2015 (Pecorari);
  • U.S. patent application Ser. No. 14/628,708 for DEVICE, SYSTEM, AND METHOD FOR DETERMINING THE STATUS OF CHECKOUT LANES filed Feb. 23, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/630,841 for TERMINAL INCLUDING IMAGING ASSEMBLY filed Feb. 25, 2015 (Gomez et al.);
  • U.S. patent application Ser. No. 14/635,346 for SYSTEM AND METHOD FOR RELIABLE STORE-AND-FORWARD DATA HANDLING BY ENCODED INFORMATION READING TERMINALS filed Mar. 2, 2015 (Sevier);
  • U.S. patent application Ser. No. 29/519,017 for SCANNER filed Mar. 2, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/405,278 for DESIGN PATTERN FOR SECURE STORE filed Mar. 9, 2015 (Zhu et al.);
  • U.S. patent application Ser. No. 14/660,970 for DECODABLE INDICIA READING TERMINAL WITH COMBINED ILLUMINATION filed Mar. 18, 2015 (Kearney et al.);
  • U.S. patent application Ser. No. 14/661,013 for REPROGRAMMING SYSTEM AND METHOD FOR DEVICES INCLUDING PROGRAMMING SYMBOL filed Mar. 18, 2015 (Soule et al.);
  • U.S. patent application Ser. No. 14/662,922 for MULTIFUNCTION POINT OF SALE SYSTEM filed Mar. 19, 2015 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/663,638 for VEHICLE MOUNT COMPUTER WITH CONFIGURABLE IGNITION SWITCH BEHAVIOR filed Mar. 20, 2015 (Davis et al.);
  • U.S. patent application Ser. No. 14/664,063 for METHOD AND APPLICATION FOR SCANNING A BARCODE WITH A SMART DEVICE WHILE CONTINUOUSLY RUNNING AND DISPLAYING AN APPLICATION ON THE SMART DEVICE DISPLAY filed Mar. 20, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/669,280 for TRANSFORMING COMPONENTS OF A WEB PAGE TO VOICE PROMPTS filed Mar. 26, 2015 (Funyak et al.);
  • U.S. patent application Ser. No. 14/674,329 for AIMER FOR BARCODE SCANNING filed Mar. 31, 2015 (Bidwell);
  • U.S. patent application Ser. No. 14/676,109 for INDICIA READER filed Apr. 1, 2015 (Huck);
  • U.S. patent application Ser. No. 14/676,327 for DEVICE MANAGEMENT PROXY FOR SECURE DEVICES filed Apr. 1, 2015 (Yeakley et al.);
  • U.S. patent application Ser. No. 14/676,898 for NAVIGATION SYSTEM CONFIGURED TO INTEGRATE MOTION SENSING DEVICE INPUTS filed Apr. 2, 2015 (Showering);
  • U.S. patent application Ser. No. 14/679,275 for DIMENSIONING SYSTEM CALIBRATION SYSTEMS AND METHODS filed Apr. 6, 2015 (Laffargue et al.);
  • U.S. patent application Ser. No. 29/523,098 for HANDLE FOR A TABLET COMPUTER filed Apr. 7, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/682,615 for SYSTEM AND METHOD FOR POWER MANAGEMENT OF MOBILE DEVICES filed Apr. 9, 2015 (Murawski et al.);
  • U.S. patent application Ser. No. 14/686,822 for MULTIPLE PLATFORM SUPPORT SYSTEM AND METHOD filed Apr. 15, 2015 (Qu et al.);
  • U.S. patent application Ser. No. 14/687,289 for SYSTEM FOR COMMUNICATION VIA A PERIPHERAL HUB filed Apr. 15, 2015 (Kohtz et al.);
  • U.S. patent application Ser. No. 29/524,186 for SCANNER filed Apr. 17, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/695,364 for MEDICATION MANAGEMENT SYSTEM filed Apr. 24, 2015 (Sewell et al.);
  • U.S. patent application Ser. No. 14/695,923 for SECURE UNATTENDED NETWORK AUTHENTICATION filed Apr. 24, 2015 (Kubler et al.);
  • U.S. patent application Ser. No. 29/525,068 for TABLET COMPUTER WITH REMOVABLE SCANNING DEVICE filed Apr. 27, 2015 (Schulte et al.);
  • U.S. patent application Ser. No. 14/699,436 for SYMBOL READING SYSTEM HAVING PREDICTIVE DIAGNOSTICS filed Apr. 29, 2015 (Nahill et al.);
  • U.S. patent application Ser. No. 14/702,110 for SYSTEM AND METHOD FOR REGULATING BARCODE DATA INJECTION INTO A RUNNING APPLICATION ON A SMART DEVICE filed May 1, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/702,979 for TRACKING BATTERY CONDITIONS filed May 4, 2015 (Young et al.);
  • U.S. patent application Ser. No. 14/704,050 for INTERMEDIATE LINEAR POSITIONING filed May 5, 2015 (Charpentier et al.);
  • U.S. patent application Ser. No. 14/705,012 for HANDS-FREE HUMAN MACHINE INTERFACE RESPONSIVE TO A DRIVER OF A VEHICLE filed May 6, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/705,407 for METHOD AND SYSTEM TO PROTECT SOFTWARE-BASED NETWORK-CONNECTED DEVICES FROM ADVANCED PERSISTENT THREAT filed May 6, 2015 (Hussey et al.);
  • U.S. patent application Ser. No. 14/707,037 for SYSTEM AND METHOD FOR DISPLAY OF INFORMATION USING A VEHICLE-MOUNT COMPUTER filed May 8, 2015 (Chamberlin);
  • U.S. patent application Ser. No. 14/707,123 for APPLICATION INDEPENDENT DEX/UCS INTERFACE filed May 8, 2015 (Pape);
  • U.S. patent application Ser. No. 14/707,492 for METHOD AND APPARATUS FOR READING OPTICAL INDICIA USING A PLURALITY OF DATA SOURCES filed May 8, 2015 (Smith et al.);
  • U.S. patent application Ser. No. 14/710,666 for PRE-PAID USAGE SYSTEM FOR ENCODED INFORMATION READING TERMINALS filed May 13, 2015 (Smith);
  • U.S. patent application Ser. No. 29/526,918 for CHARGING BASE filed May 14, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/715,672 for AUGMENTED REALITY ENABLED HAZARD DISPLAY filed May 19, 2015 (Venkatesha et al.);
  • U.S. patent application Ser. No. 14/715,916 for EVALUATING IMAGE VALUES filed May 19, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/722,608 for INTERACTIVE USER INTERFACE FOR CAPTURING A DOCUMENT IN AN IMAGE SIGNAL filed May 27, 2015 (Showering et al.);
  • U.S. patent application Ser. No. 29/528,165 for IN-COUNTER BARCODE SCANNER filed May 27, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/724,134 for ELECTRONIC DEVICE WITH WIRELESS PATH SELECTION CAPABILITY filed May 28, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 14/724,849 for METHOD OF PROGRAMMING THE DEFAULT CABLE INTERFACE SOFTWARE IN AN INDICIA READING DEVICE filed May 29, 2015 (Barten);
  • U.S. patent application Ser. No. 14/724,908 for IMAGING APPARATUS HAVING IMAGING ASSEMBLY filed May 29, 2015 (Barber et al.);
  • U.S. patent application Ser. No. 14/725,352 for APPARATUS AND METHODS FOR MONITORING ONE OR MORE PORTABLE DATA TERMINALS (Caballero et al.);
  • U.S. patent application Ser. No. 29/528,590 for ELECTRONIC DEVICE filed May 29, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 29/528,890 for MOBILE COMPUTER HOUSING filed Jun. 2, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/728,397 for DEVICE MANAGEMENT USING VIRTUAL INTERFACES CROSS-REFERENCE TO RELATED APPLICATIONS filed Jun. 2, 2015 (Caballero);
  • U.S. patent application Ser. No. 14/732,870 for DATA COLLECTION MODULE AND SYSTEM filed Jun. 8, 2015 (Powilleit);
  • U.S. patent application Ser. No. 29/529,441 for INDICIA READING DEVICE filed Jun. 8, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/735,717 for INDICIA-READING SYSTEMS HAVING AN INTERFACE WITH A USER'S NERVOUS SYSTEM filed Jun. 10, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/738,038 for METHOD OF AND SYSTEM FOR DETECTING OBJECT WEIGHING INTERFERENCES filed Jun. 12, 2015 (Amundsen et al.);
  • U.S. patent application Ser. No. 14/740,320 for TACTILE SWITCH FOR A MOBILE ELECTRONIC DEVICE filed Jun. 16, 2015 (Bandringa);
  • U.S. patent application Ser. No. 14/740,373 for CALIBRATING A VOLUME DIMENSIONER filed Jun. 16, 2015 (Ackley et al.);
  • U.S. patent application Ser. No. 14/742,818 for INDICIA READING SYSTEM EMPLOYING DIGITAL GAIN CONTROL filed Jun. 18, 2015 (Xian et al.);
  • U.S. patent application Ser. No. 14/743,257 for WIRELESS MESH POINT PORTABLE DATA TERMINAL filed Jun. 18, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 29/530,600 for CYCLONE filed Jun. 18, 2015 (Vargo et al.);
  • U.S. patent application Ser. No. 14/744,633 for IMAGING APPARATUS COMPRISING IMAGE SENSOR ARRAY HAVING SHARED GLOBAL SHUTTER CIRCUITRY filed Jun. 19, 2015 (Wang);
  • U.S. patent application Ser. No. 14/744,836 for CLOUD-BASED SYSTEM FOR READING OF DECODABLE INDICIA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/745,006 for SELECTIVE OUTPUT OF DECODED MESSAGE DATA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/747,197 for OPTICAL PATTERN PROJECTOR filed Jun. 23, 2015 (Thuries et al.);
  • U.S. patent application Ser. No. 14/747,490 for DUAL-PROJECTOR THREE-DIMENSIONAL SCANNER filed Jun. 23, 2015 (Jovanovski et al.); and
  • U.S. patent application Ser. No. 14/748,446 for CORDLESS INDICIA READER WITH A MULTIFUNCTION COIL FOR WIRELESS CHARGING AND EAS DEACTIVATION, filed Jun. 24, 2015 (Xie et al.).


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.
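
As a purely illustrative aid to the claims that follow, the short Python sketch below shows one way the pixel-difference scene-change test of claims 1 and 8 could be realized: a background model is built from one or more reference depth maps, and a scene change is flagged when the count of pixels Pv differing from that model by more than THRESH1 exceeds THRESH2. The sketch is not part of the disclosed embodiments; the NumPy representation, the function names, and the example threshold values are assumptions chosen only for illustration.

```python
import numpy as np

# Illustrative sketch (assumed implementation) of the pixel-difference
# scene-change test of claims 1 and 8: build a background model from
# reference depth maps, then count pixels of a current depth map that
# differ absolutely from the model by more than THRESH1.

THRESH1 = 25.0   # assumed per-pixel depth-difference threshold (sensor units)
THRESH2 = 5000   # assumed count of changed pixels that signals a scene change


def build_background_model(reference_depth_maps):
    """Average one or more reference depth maps into a background model."""
    stack = np.stack(reference_depth_maps, axis=0).astype(np.float64)
    return stack.mean(axis=0)


def scene_changed(current_depth_map, background_model,
                  thresh1=THRESH1, thresh2=THRESH2):
    """Return True when the number of pixels Pv differing from the background
    model by more than thresh1 exceeds thresh2."""
    diff = np.abs(current_depth_map.astype(np.float64) - background_model)
    pv = int(np.count_nonzero(diff > thresh1))  # number of changed pixels Pv
    return pv > thresh2
```

In practice the two thresholds would be tuned to the range camera's depth noise and resolution; the values above are placeholders.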

Claims
  • 1. A method, comprising: initializing a dimensioning system by: at a range camera, capturing one or more initial reference images of a measurement platform and surrounding area; at a processor: generating a reference depth map from each initial reference image; generating and storing to a memory a background model from the reference depth maps; testing the dimensioning system for a scene change by: at the range camera, capturing a subsequent image of the measurement platform and surrounding area; at the processor: generating a current depth map from the subsequent image; comparing each pixel of the current depth map with a corresponding pixel of the background model; and counting a number of pixels Pv of the current depth map that differ absolutely from the background model by more than the prescribed threshold THRESH1, and if the number of pixels Pv is greater than a threshold THRESH2, determining that a significant change in the image has occurred.
  • 2. The method according to claim 1, where the testing is carried out on a periodic basis.
  • 3. The method according to claim 1, further comprising executing a dimensioning process to measure the dimensions of an object on the measurement platform.
  • 4. The method according to claim 3, where the testing is carried out whenever a prescribed period of inactivity in measuring dimensions of an object on the measurement platform has elapsed.
  • 5. The method according to claim 3, where the testing is carried out prior to each measurement of dimensions of an object on the measurement platform.
  • 6. The method according to claim 1, further comprising generating an alert upon determining that a significant scene change has occurred.
  • 7. The method according to claim 1, further comprising upon determining that a significant scene change has occurred, repeating the initializing.
  • 8. A dimensioning system, comprising: a measurement platform; a range camera mounted so as to capture an image of the measurement platform and surrounding area; a processor programmed to carry out the following actions: initialize the dimensioning system by: receiving one or more initial reference images of the measurement platform and surrounding area from the range camera; generating and storing to a memory a background model from the one or more captured initial reference images; test the dimensioning system for a scene change by: receiving a subsequent image of the platform area from the range camera; generating a current depth map from the subsequent image; comparing each pixel of the current depth map with a corresponding pixel of the background model; and counting a number of pixels Pv of the current depth map that differ absolutely from the background model by more than the prescribed threshold THRESH1, and if the number of pixels Pv is greater than a threshold THRESH2, determining that a significant change in the image has occurred.
  • 9. The system according to claim 8, where the testing is carried out on a periodic basis.
  • 10. The system according to claim 8, further comprising the processor executing a dimensioning process to measure dimensions of an object on the measurement platform.
  • 11. The system according to claim 10, where the testing is carried out whenever a prescribed period of inactivity in measuring dimensions of an object on the measurement platform has elapsed.
  • 12. The system according to claim 10, where the testing is carried out prior to each measurement of dimensions of an object on the measurement platform.
  • 13. The system according to claim 8, further comprising the processor generating an alert upon determining that a significant scene change has occurred.
  • 14. The system according to claim 8, further comprising upon determining that a significant scene change has occurred, the processor repeating the initializing.
  • 15. A method, comprising: initializing a dimensioning system by: at a range camera, capturing an initial reference image of a platform and surrounding area; at a processor, generating a three-dimensional container around the platform and storing the container to memory; and at the processor, determining a count of support points in the container from the initial reference image; testing the dimensioning system for a scene change by: at the range camera, capturing a subsequent image of the measurement platform and surrounding area; at the processor: counting support points in the subsequent image that are in the container; comparing the count of support points in the container in the subsequent image with the count of support points in the container in the initial reference image; based on the comparison, determining if a prescribed difference in the counts is present; and upon determining that a prescribed difference in the counts exists, establishing that a significant scene change has occurred.
  • 16. The method according to claim 15, where the testing is carried out on a periodic basis.
  • 17. The method according to claim 15, further comprising executing a dimensioning process to measure dimensions of an object on the platform.
  • 18. The method according to claim 17, where the testing is carried out whenever a prescribed period of inactivity in measuring dimensions of an object on the measurement platform has elapsed.
  • 19. The method according to claim 17, where the testing is carried out prior to each measurement of dimensions of an object on the measurement platform.
  • 20. The method according to claim 17, further comprising generating an alert upon determining that a significant scene change has occurred.
  • 21. The method according to claim 17, where the container comprises a right prism with a base approximating a convex polygon, where the base is parallel to a congruent convex polygon that bounds the measurement platform, and where the prism's height equals twice a maximum support distance.
  • 22. The method according to claim 15, where the container comprises a right cylinder with a circular base, where the base is parallel to a congruent circle that bounds the measurement platform, and where the cylinder's height equals twice the maximum support distance.
  • 23. The method according to claim 15, comprising: when a significant scene change is deemed to have occurred, searching the scene for the measurement platform at a location coplanar therewith.
  • 24. The method according to claim 15, further comprising upon establishing that a significant scene change has occurred, repeating the initializing.
  • 25. A dimensioning system, comprising: a measurement platform; a range camera mounted so as to capture an image of the measurement platform and surrounding area; a processor programmed to carry out the following actions: initialize the dimensioning system by: receiving an initial reference image of a platform area from the range camera; generating a three-dimensional container around the measurement platform and storing the container to memory; and determining a count of support points in the container from the initial reference image; test the dimensioning system for a scene change by: receiving a subsequent image of the measurement platform and surrounding area from the range camera; counting support points in the subsequent image that are in the container; comparing the count of support points in the container in the subsequent image with the count of support points in the container in the initial reference image; based on the comparison, determining if a prescribed difference in the counts of support points is present; and upon determining that a prescribed difference in the counts exists, establishing that a significant scene change has occurred.
  • 26. The system according to claim 25, where the testing is carried out on a periodic basis.
  • 27. The system according to claim 25, further comprising the processor executing a dimensioning process to measure dimensions of an object on the measurement platform.
  • 28. The system according to claim 27, where the testing is carried out whenever a prescribed period of inactivity in measuring dimensions of an object on the measurement platform has elapsed.
  • 29. The system according to claim 27, where the testing is carried out prior to each measurement of dimensions of an object on the measurement platform.
  • 30. The system according to claim 25, further comprising the processor generating an alert upon determining that a significant scene change has occurred.
  • 31. The system according to claim 25, where the container comprises a right prism with a base approximating a convex polygon, where the base is parallel to a congruent convex polygon that bounds the measurement platform, and where the prism's height equals twice the maximum support distance.
  • 32. The system according to claim 25, where the container comprises a right cylinder with a circular base, where the base is parallel to a congruent circle that bounds the measurement platform, and where the cylinder's height equals twice the maximum support distance.
  • 33. The system according to claim 25, further comprising when a significant scene change is deemed to have occurred, the processor searching the scene for the measurement platform at a location coplanar therewith.
  • 34. The system according to claim 25, further comprising the processor repeating the initializing upon establishing that a significant scene change has occurred.
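
As a similarly hedged illustration of the container-based test recited in claims 15 and 25 above, the sketch below counts 3D support points falling inside a container around the measurement platform and compares that count between the initial reference image and a subsequent image. For simplicity the container is modeled as an axis-aligned right prism whose height equals twice a maximum support distance; the point-cloud representation, the function names, and the difference threshold are assumptions for illustration only.

```python
import numpy as np

# Illustrative sketch (assumed implementation) of the support-point test of
# claims 15 and 25: count points inside a right prism bounding the measurement
# platform, with height equal to twice the maximum support distance, and flag
# a scene change when the count differs too much from the reference count.

def count_support_points(points, base_min_xy, base_max_xy,
                         platform_z, max_support_distance):
    """Count 3D points (N x 3 array) inside the prism-shaped container."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    in_base = ((x >= base_min_xy[0]) & (x <= base_max_xy[0]) &
               (y >= base_min_xy[1]) & (y <= base_max_xy[1]))
    in_height = np.abs(z - platform_z) <= max_support_distance
    return int(np.count_nonzero(in_base & in_height))


def container_scene_changed(reference_count, current_count,
                            max_count_difference=500):
    """Flag a scene change when the support-point counts differ by more than
    an assumed prescribed difference."""
    return abs(reference_count - current_count) > max_count_difference
```

A rectangular base is only a convenience here; claims 21, 22, 31, and 32 above recite convex-polygon and circular bases, which would change only the point-in-base test.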
US Referenced Citations (876)
Number Name Date Kind
3971065 Bayer Jul 1976 A
4026031 Siddall et al. May 1977 A
4279328 Ahlbom Jul 1981 A
4398811 Nishioka et al. Aug 1983 A
4495559 Gelatt, Jr. Jan 1985 A
4730190 Win et al. Mar 1988 A
4803639 Steele et al. Feb 1989 A
5184733 Amarson et al. Feb 1993 A
5198648 Hibbard Mar 1993 A
5220536 Stringer et al. Jun 1993 A
5331118 Jensen Jul 1994 A
5359185 Hanson Oct 1994 A
5384901 Glassner et al. Jan 1995 A
5548707 LoNegro et al. Aug 1996 A
5555090 Schmutz Sep 1996 A
5561526 Huber et al. Oct 1996 A
5590060 Granville et al. Dec 1996 A
5606534 Stringer et al. Feb 1997 A
5619245 Kessler et al. Apr 1997 A
5655095 LoNegro et al. Aug 1997 A
5661561 Wurz et al. Aug 1997 A
5699161 Woodworth Dec 1997 A
5729750 Ishida Mar 1998 A
5730252 Herbinet Mar 1998 A
5732147 Tao Mar 1998 A
5734476 Dlugos Mar 1998 A
5737074 Haga et al. Apr 1998 A
5748199 Palm May 1998 A
5767962 Suzuki et al. Jun 1998 A
5831737 Stringer et al. Nov 1998 A
5850370 Stringer et al. Dec 1998 A
5850490 Johnson Dec 1998 A
5869827 Rando Feb 1999 A
5870220 Migdal et al. Feb 1999 A
5900611 Hecht May 1999 A
5923428 Woodworth Jul 1999 A
5929856 LoNegro et al. Jul 1999 A
5938710 Lanza et al. Aug 1999 A
5959568 Woolley Sep 1999 A
5960098 Tao Sep 1999 A
5969823 Wurz et al. Oct 1999 A
5978512 Kim et al. Nov 1999 A
5979760 Freyman et al. Nov 1999 A
5988862 Kacyra et al. Nov 1999 A
5991041 Woodworth Nov 1999 A
6009189 Schaack Dec 1999 A
6025847 Marks Feb 2000 A
6035067 Ponticos Mar 2000 A
6049386 Stringer et al. Apr 2000 A
6053409 Brobst et al. Apr 2000 A
6064759 Buckley et al. May 2000 A
6067110 Nonaka et al. May 2000 A
6069696 McQueen et al. May 2000 A
6115114 Berg et al. Sep 2000 A
6137577 Woodworth Oct 2000 A
6177999 Wurz et al. Jan 2001 B1
6189223 Haug Feb 2001 B1
6232597 Kley May 2001 B1
6236403 Chaki May 2001 B1
6246468 Dimsdale Jun 2001 B1
6333749 Reinhardt et al. Dec 2001 B1
6336587 He et al. Jan 2002 B1
6369401 Lee Apr 2002 B1
6373579 Ober et al. Apr 2002 B1
6429803 Kumar Aug 2002 B1
6457642 Good et al. Oct 2002 B1
6507406 Yagi et al. Jan 2003 B1
6517004 Good et al. Feb 2003 B2
6519550 D'Hooge et al. Feb 2003 B1
6535776 Tobin et al. Mar 2003 B1
6674904 McQueen Jan 2004 B1
6705526 Zhu et al. Mar 2004 B1
6781621 Gobush et al. Aug 2004 B1
6824058 Patel et al. Nov 2004 B2
6832725 Gardiner et al. Dec 2004 B2
6858857 Pease et al. Feb 2005 B2
6922632 Foxlin Jul 2005 B2
6971580 Zhu et al. Dec 2005 B2
6995762 Pavlidis et al. Feb 2006 B1
7057632 Yamawaki et al. Jun 2006 B2
7085409 Sawhney et al. Aug 2006 B2
7086162 Tyroler Aug 2006 B2
7104453 Zhu et al. Sep 2006 B1
7128266 Zhu et al. Oct 2006 B2
7137556 Bonner et al. Nov 2006 B1
7159783 Walczyk et al. Jan 2007 B2
7161688 Bonner et al. Jan 2007 B1
7205529 Andersen et al. Apr 2007 B2
7214954 Schopp May 2007 B2
7277187 Smith et al. Oct 2007 B2
7307653 Dutta Dec 2007 B2
7310431 Gokturk et al. Dec 2007 B2
7353137 Vock et al. Apr 2008 B2
7413127 Ehrhart et al. Aug 2008 B2
7509529 Colucci et al. Mar 2009 B2
7527205 Zhu May 2009 B2
7586049 Wurz Sep 2009 B2
7602404 Reinhardt et al. Oct 2009 B1
7639722 Paxton et al. Dec 2009 B1
7726575 Wang et al. Jun 2010 B2
7780084 Zhang et al. Aug 2010 B2
7788883 Buckley et al. Sep 2010 B2
7974025 Topliss Jul 2011 B2
8027096 Feng et al. Sep 2011 B2
8028501 Buckley et al. Oct 2011 B2
8050461 Shpunt et al. Nov 2011 B2
8055061 Katano Nov 2011 B2
8072581 Breiholz Dec 2011 B1
8102395 Kondo et al. Jan 2012 B2
8132728 Dwinell et al. Mar 2012 B2
8134717 Pangrazio et al. Mar 2012 B2
8149224 Kuo et al. Apr 2012 B1
8194097 Xiao et al. Jun 2012 B2
8201737 Palacios Durazo et al. Jun 2012 B1
8212158 Wiest Jul 2012 B2
8212889 Chanas et al. Jul 2012 B2
8228510 Pangrazio et al. Jul 2012 B2
8230367 Bell et al. Jul 2012 B2
8294969 Plesko Oct 2012 B2
8305458 Hara Nov 2012 B2
8310656 Zalewski Nov 2012 B2
8313380 Zalewski et al. Nov 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8339462 Stec et al. Dec 2012 B2
8350959 Topliss et al. Jan 2013 B2
8351670 Ijiri et al. Jan 2013 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381976 Mohideen et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8437539 Komatsu et al. May 2013 B2
8441749 Brown et al. May 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8463079 Ackley et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8570343 Halstead Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8576390 Nunnink Nov 2013 B1
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8594425 Gurman et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8792688 Unsworth Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8810779 Hilde Aug 2014 B1
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8897596 Passmore et al. Nov 2014 B1
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8928896 Kennington et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9014441 Truyen et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9082023 Feng et al. Jul 2015 B2
9082195 Holeva et al. Jul 2015 B2
9142035 Rotman Sep 2015 B1
9171278 Kong et al. Oct 2015 B1
9224022 Ackley et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9233470 Bradski et al. Jan 2016 B1
9235899 Kirmani et al. Jan 2016 B1
9443123 Hejl Jan 2016 B2
9250712 Todeschini Feb 2016 B1
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9299013 Curlander et al. Mar 2016 B1
9310609 Rueblinger et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342724 McCloskey et al. May 2016 B2
9375945 Bowles Jun 2016 B1
D760719 Zhou et al. Jul 2016 S
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
D762647 Fitch et al. Aug 2016 S
9412242 Van Horn et al. Aug 2016 B2
9424749 Reed et al. Aug 2016 B1
D766244 Zhou et al. Sep 2016 S
9443222 Singel et al. Sep 2016 B2
9478113 Xie et al. Oct 2016 B2
9486921 Straszheim et al. Nov 2016 B1
9828223 Svensson et al. Nov 2017 B2
20010027995 Patel et al. Oct 2001 A1
20010032879 He et al. Oct 2001 A1
20020054289 Thibault et al. May 2002 A1
20020067855 Chiu et al. Jun 2002 A1
20020109835 Goetz Aug 2002 A1
20020118874 Chung et al. Aug 2002 A1
20020158873 Williamson Oct 2002 A1
20020167677 Okada et al. Nov 2002 A1
20020179708 Zhu et al. Dec 2002 A1
20020196534 Lizotte et al. Dec 2002 A1
20030038179 Tsikos et al. Feb 2003 A1
20030053513 Vatan et al. Mar 2003 A1
20030063086 Baumberg Apr 2003 A1
20030078755 Leutz et al. Apr 2003 A1
20030091227 Chang et al. May 2003 A1
20030156756 Gokturk et al. Aug 2003 A1
20030197138 Pease et al. Oct 2003 A1
20030225712 Cooper et al. Dec 2003 A1
20030235331 Kawaike et al. Dec 2003 A1
20040008259 Gokturk et al. Jan 2004 A1
20040019274 Galloway et al. Jan 2004 A1
20040024754 Mane et al. Feb 2004 A1
20040066329 Zeitfuss et al. Apr 2004 A1
20040073359 Ichijo et al. Apr 2004 A1
20040083025 Yamanouchi et al. Apr 2004 A1
20040089482 Ramsden et al. May 2004 A1
20040098146 Katae et al. May 2004 A1
20040105580 Hager et al. Jun 2004 A1
20040118928 Patel et al. Jun 2004 A1
20040122779 Stickler et al. Jun 2004 A1
20040132297 Baba et al. Jul 2004 A1
20040155975 Hart et al. Aug 2004 A1
20040165090 Ning Aug 2004 A1
20040184041 Schopp Sep 2004 A1
20040211836 Patel et al. Oct 2004 A1
20040214623 Takahashi et al. Oct 2004 A1
20040233461 Armstrong et al. Nov 2004 A1
20040258353 Gluckstad et al. Dec 2004 A1
20050006477 Patel Jan 2005 A1
20050117215 Lange Jun 2005 A1
20050128193 Popescu et al. Jun 2005 A1
20050128196 Popescu et al. Jun 2005 A1
20050168488 Montague Aug 2005 A1
20050211782 Martin Sep 2005 A1
20050257748 Kriesel et al. Nov 2005 A1
20050264867 Cho et al. Dec 2005 A1
20060047704 Gopalakrishnan Mar 2006 A1
20060078226 Zhou Apr 2006 A1
20060108266 Bowers et al. May 2006 A1
20060112023 Horhann May 2006 A1
20060151604 Zhu et al. Jul 2006 A1
20060159307 Anderson et al. Jul 2006 A1
20060159344 Shao et al. Jul 2006 A1
20060213999 Wang et al. Sep 2006 A1
20060230640 Chen Oct 2006 A1
20060232681 Okada Oct 2006 A1
20060255150 Longacre Nov 2006 A1
20060269165 Viswanathan Nov 2006 A1
20060291719 Ikeda et al. Dec 2006 A1
20070003154 Sun et al. Jan 2007 A1
20070025612 Iwasaki et al. Feb 2007 A1
20070031064 Zhao Feb 2007 A1
20070063048 Havens et al. Mar 2007 A1
20070116357 Dewaele May 2007 A1
20070127022 Cohen et al. Jun 2007 A1
20070143082 Degnan Jun 2007 A1
20070153293 Gruhlke et al. Jul 2007 A1
20070171220 Kriveshko Jul 2007 A1
20070177011 Lewin et al. Aug 2007 A1
20070181685 Zhu et al. Aug 2007 A1
20070237356 Dwinell et al. Oct 2007 A1
20070291031 Konev et al. Dec 2007 A1
20070299338 Stevick et al. Dec 2007 A1
20080013793 Hillis et al. Jan 2008 A1
20080035390 Wurz Feb 2008 A1
20080056536 Hildreth et al. Mar 2008 A1
20080062164 Bassi et al. Mar 2008 A1
20080077265 Boyden Mar 2008 A1
20080079955 Storm Apr 2008 A1
20080164074 Wurz Jun 2008 A1
20080204476 Montague Aug 2008 A1
20080212168 Olmstead et al. Sep 2008 A1
20080247635 Davis et al. Oct 2008 A1
20080273191 Kim et al. Nov 2008 A1
20080273210 Hilde Nov 2008 A1
20080278790 Boesser et al. Nov 2008 A1
20090038182 Lans et al. Feb 2009 A1
20090059004 Bochicchio Mar 2009 A1
20090081008 Somin et al. Mar 2009 A1
20090095047 Patel et al. Apr 2009 A1
20090134221 Zhu et al. May 2009 A1
20090195790 Zhu et al. Aug 2009 A1
20090225333 Bendall et al. Sep 2009 A1
20090237411 Gossweiler et al. Sep 2009 A1
20090268023 Hsieh Oct 2009 A1
20090272724 Gubler Nov 2009 A1
20090273770 Bauhahn et al. Nov 2009 A1
20090313948 Buckley et al. Dec 2009 A1
20090318815 Barnes et al. Dec 2009 A1
20090323084 Dunn et al. Dec 2009 A1
20090323121 Valkenburg Dec 2009 A1
20100035637 Varanasi et al. Feb 2010 A1
20100060604 Zwart et al. Mar 2010 A1
20100091104 Sprigle Apr 2010 A1
20100113153 Yen et al. May 2010 A1
20100118200 Gelman et al. May 2010 A1
20100128109 Banks May 2010 A1
20100161170 Siris Jun 2010 A1
20100171740 Andersen et al. Jul 2010 A1
20100172567 Prokoski Jul 2010 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100202702 Benos et al. Aug 2010 A1
20100208039 Stettner Aug 2010 A1
20100211355 Horst et al. Aug 2010 A1
20100217678 Goncalves Aug 2010 A1
20100220849 Colbert et al. Sep 2010 A1
20100220894 Ackley et al. Sep 2010 A1
20100223276 Al-Shameri et al. Sep 2010 A1
20100245850 Lee et al. Sep 2010 A1
20100254611 Amz Oct 2010 A1
20100274728 Kugelman Oct 2010 A1
20100303336 Abraham Dec 2010 A1
20100315413 Izadi et al. Dec 2010 A1
20100321482 Cleveland Dec 2010 A1
20110019155 Daniel et al. Jan 2011 A1
20110040192 Brenner et al. Feb 2011 A1
20110043609 Choi et al. Feb 2011 A1
20110099474 Grossman et al. Apr 2011 A1
20110169999 Grunow et al. Jul 2011 A1
20110188054 Petronius et al. Aug 2011 A1
20110188741 Sones et al. Aug 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20110234389 Mellin Sep 2011 A1
20110235854 Berger et al. Sep 2011 A1
20110249864 Venkatesan et al. Oct 2011 A1
20110254840 Halstead Oct 2011 A1
20110260965 Kim et al. Oct 2011 A1
20110279916 Brown et al. Nov 2011 A1
20110286007 Pangrazio et al. Nov 2011 A1
20110286628 Goncalves et al. Nov 2011 A1
20110288818 Thierman Nov 2011 A1
20110301994 Tieman Dec 2011 A1
20110303748 Lemma et al. Dec 2011 A1
20110310227 Konertz et al. Dec 2011 A1
20120024952 Chen Feb 2012 A1
20120056982 Katz et al. Mar 2012 A1
20120057345 Kuchibhotla Mar 2012 A1
20120067955 Rowe Mar 2012 A1
20120074227 Ferren et al. Mar 2012 A1
20120081714 Pangrazio et al. Apr 2012 A1
20120111946 Golant May 2012 A1
20120113223 Hilliges et al. May 2012 A1
20120126000 Kunzig et al. May 2012 A1
20120140300 Freeman Jun 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120179665 Baarman et al. Jul 2012 A1
20120185094 Rosenstein et al. Jul 2012 A1
20120190386 Anderson Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120197464 Wang et al. Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120218436 Rodriguez et al. Sep 2012 A1
20120223141 Good et al. Sep 2012 A1
20120224026 Bayer et al. Sep 2012 A1
20120224060 Gurevich et al. Sep 2012 A1
20120236288 Stanley Sep 2012 A1
20120242852 Hayward et al. Sep 2012 A1
20120113250 Farlotti et al. Oct 2012 A1
20120256901 Bendall Oct 2012 A1
20120261474 Kawashime et al. Oct 2012 A1
20120262558 Boger et al. Oct 2012 A1
20120280908 Rhoads et al. Nov 2012 A1
20120282905 Owen Nov 2012 A1
20120282911 Davis et al. Nov 2012 A1
20120284012 Rodriguez et al. Nov 2012 A1
20120284122 Brandis Nov 2012 A1
20120284339 Rodriguez Nov 2012 A1
20120284593 Rodriguez Nov 2012 A1
20120293610 Doepke et al. Nov 2012 A1
20120293625 Schneider et al. Nov 2012 A1
20120294549 Doepke Nov 2012 A1
20120299961 Ramkumar et al. Nov 2012 A1
20120300991 Mikio Nov 2012 A1
20120313848 Galor et al. Dec 2012 A1
20120314030 Datta Dec 2012 A1
20120314058 Bendall et al. Dec 2012 A1
20120316820 Nakazato et al. Dec 2012 A1
20130019278 Sun et al. Jan 2013 A1
20130038881 Pesach et al. Feb 2013 A1
20130038941 Pesach et al. Feb 2013 A1
20130043312 Van Horn Feb 2013 A1
20130050426 Sarmast et al. Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130093895 Palmer et al. Apr 2013 A1
20130094069 Lee et al. Apr 2013 A1
20130101158 Lloyd et al. Apr 2013 A1
20130156267 Muraoka et al. Jun 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130200150 Reynolds et al. Aug 2013 A1
20130201288 Billerbaeck et al. Aug 2013 A1
20130208164 Cazier et al. Aug 2013 A1
20130211790 Loveland et al. Aug 2013 A1
20130222592 Gieseke Aug 2013 A1
20130223673 Davis et al. Aug 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130291998 Konnerth Nov 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308013 Li et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130329012 Bartos Dec 2013 A1
20130329013 Metois et al. Dec 2013 A1
20130342342 Sabre et al. Dec 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140009586 McNamer et al. Jan 2014 A1
20140019005 Lee et al. Jan 2014 A1
20140021259 Moed et al. Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140031665 Pinto et al. Jan 2014 A1
20140034731 Gao et al. Feb 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039674 Motoyama et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140042814 Kather et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140058612 Wong et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140062709 Hyer et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140064624 Kim et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067104 Osterhout Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071430 Hansen et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140079297 Tadayon et al. Mar 2014 A1
20140091147 Evans et al. Apr 2014 A1
20140097238 Ghazizadeh Apr 2014 A1
20140098091 Hori Apr 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140100813 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104451 Todeschini et al. Apr 2014 A1
20140104664 Lee Apr 2014 A1
20140106594 Skvoretz Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140131448 Xian et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140135984 Hirata May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140139654 Takahashi May 2014 A1
20140140585 Wang May 2014 A1
20140142398 Patil et al. May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140152975 Ko Jun 2014 A1
20140158468 Adami Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Lumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168380 Heidemann et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140175172 Jovanovski et al. Jun 2014 A1
20140177931 Kocherscheidt et al. Jun 2014 A1
20140191644 Chaney Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140192187 Atwell et al. Jul 2014 A1
20140192551 Masaki Jul 2014 A1
20140197238 Lui et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140201126 Zadeh et al. Jul 2014 A1
20140203087 Smith et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140205150 Ogawa Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140225918 Mittal et al. Aug 2014 A1
20140225985 Klusza et al. Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140232930 Anderson Aug 2014 A1
20140240454 Lee Aug 2014 A1
20140247279 Nicholas et al. Sep 2014 A1
20140247280 Nicholas et al. Sep 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140267609 Laffargue Sep 2014 A1
20140268093 Tohme et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140270361 Amma et al. Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140284384 Lu et al. Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140306833 Ricci Oct 2014 A1
20140307855 Withagen et al. Oct 2014 A1
20140312121 Lu et al. Oct 2014 A1
20140313527 Askan Oct 2014 A1
20140319219 Liu et al. Oct 2014 A1
20140319220 Coyle Oct 2014 A1
20140319221 Oberpriller et al. Oct 2014 A1
20140320408 Zagorsek et al. Oct 2014 A1
20140326787 Barten Nov 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140333775 Naikal et al. Nov 2014 A1
20140344943 Todeschini et al. Nov 2014 A1
20140346233 Liu et al. Nov 2014 A1
20140347533 Ovsiannikov et al. Nov 2014 A1
20140350710 Gopalkrishnan et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140353373 Van Horn et al. Dec 2014 A1
20140361073 Qu et al. Dec 2014 A1
20140361082 Xian et al. Dec 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20140379613 Nishitani et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009100 Haneda et al. Jan 2015 A1
20150009301 Ribnick et al. Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150009610 London et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028102 Ren et al. Jan 2015 A1
20150028103 Jiang Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150036876 Marrion et al. Feb 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150042791 Metois et al. Feb 2015 A1
20150048168 Fritz et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053766 Havens et al. Feb 2015 A1
20150053768 Wang et al. Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062160 Sakamoto et al. Mar 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150062369 Gehring et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150069130 Gannon Mar 2015 A1
20150070158 Hayasaka Mar 2015 A1
20150071819 Todeschini Mar 2015 A1
20150083800 Li et al. Mar 2015 A1
20150086114 Todeschini Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150099557 Pettinelli et al. Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150116498 Vartiainen et al. Apr 2015 A1
20150117749 Chen et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150129659 Feng et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150136854 Lu et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150144701 Xian et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150163474 You Jun 2015 A1
20150169925 Chen et al. Jun 2015 A1
20150169929 Williams et al. Jun 2015 A1
20150178900 Kim et al. Jun 2015 A1
20150182844 Jang Jul 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150193644 Kearney et al. Jul 2015 A1
20150193645 Colavito et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150204662 Kobayashi et al. Jul 2015 A1
20150204671 Showering Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150213647 Laffargue et al. Jul 2015 A1
20150219748 Hyatt Aug 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150229838 Hakim et al. Aug 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150269403 Lei et al. Sep 2015 A1
20150201181 Herschbach Oct 2015 A1
20150276379 Ni et al. Oct 2015 A1
20150308816 Laffargue et al. Oct 2015 A1
20150316368 Moench et al. Nov 2015 A1
20150325036 Lee Nov 2015 A1
20150327012 Bian et al. Nov 2015 A1
20150332463 Galera et al. Nov 2015 A1
20150355470 Herschbach Dec 2015 A1
20160014251 Hejl Jan 2016 A1
20160169665 Deschenes et al. Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160048725 Holz et al. Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160070982 Li et al. Feb 2016 A1
20160063429 Varley et al. Mar 2016 A1
20160088287 Sadi et al. Mar 2016 A1
20160090283 Svensson et al. Mar 2016 A1
20160090284 Svensson et al. Mar 2016 A1
20160101936 Chamberlin Apr 2016 A1
20160102975 McCloskey et al. Apr 2016 A1
20160104019 Todeschini et al. Apr 2016 A1
20160104274 Jovanovski et al. Apr 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue et al. Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160138247 Conway et al. May 2016 A1
20160138248 Conway et al. May 2016 A1
20160138249 Svensson et al. May 2016 A1
20160171720 Todeschini Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160187186 Coleman et al. Jun 2016 A1
20160187187 Coleman et al. Jun 2016 A1
20160187210 Coleman et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160191801 Sivan Jun 2016 A1
20160125873 Braho et al. Jul 2016 A1
20160202478 Masson Jul 2016 A1
20160203641 Bostick et al. Jul 2016 A1
20160223474 Tang et al. Aug 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20160343176 Ackley Nov 2016 A1
20170115490 Hsieh et al. Apr 2017 A1
20170121158 Wong May 2017 A1
20170182942 Hardy et al. Jun 2017 A1
Foreign Referenced Citations (58)
Number Date Country
2004212587 Apr 2005 AU
201139117 Oct 2008 CN
3335760 Apr 1985 DE
10210813 Oct 2003 DE
102007037282 Mar 2008 DE
1111435 Jun 2001 EP
1443312 Aug 2004 EP
2013117 Jan 2009 EP
2286932 Feb 2011 EP
2372648 Oct 2011 EP
2381421 Oct 2011 EP
2533009 Dec 2012 EP
2562715 Feb 2013 EP
2722656 Apr 2014 EP
2779027 Sep 2014 EP
2833323 Feb 2015 EP
2843590 Mar 2015 EP
2845170 Mar 2015 EP
2966595 Jan 2016 EP
3006893 Mar 2016 EP
3012601 Mar 2016 EP
3007096 Apr 2016 EP
2503978 Jan 2014 GB
2525053 Oct 2015 GB
2531928 May 2016 GB
H04129902 Apr 1992 JP
200696457 Apr 2006 JP
2007084162 Apr 2007 JP
2008210276 Sep 2008 JP
2014210646 Nov 2014 JP
2015174705 Oct 2015 JP
20100020115 Feb 2010 KR
20110013200 Feb 2011 KR
20110117020 Oct 2011 KR
20120028109 Mar 2012 KR
9640452 Dec 1996 WO
0077726 Dec 2000 WO
0114836 Mar 2001 WO
2006095110 Sep 2006 WO
2007015059 Feb 2007 WO
200712554 Nov 2007 WO
2011017241 Feb 2011 WO
2012175731 Dec 2012 WO
2013021157 Feb 2013 WO
2013033442 Mar 2013 WO
2013163789 Nov 2013 WO
2013166368 Nov 2013 WO
2013173985 Nov 2013 WO
20130184340 Dec 2013 WO
2014019130 Feb 2014 WO
2014023697 Feb 2014 WO
2014102341 Jul 2014 WO
2014110495 Jul 2014 WO
2014149702 Sep 2014 WO
2014151746 Sep 2014 WO
2015006865 Jan 2015 WO
2016020038 Feb 2016 WO
2016061699 Apr 2016 WO
Non-Patent Literature Citations (128)
Entry
Thorlabs, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, downloaded from https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=6430, 4 pages.
EKSMA Optics, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, downloaded from http://eksmaoptics.com/optical-systems/f-theta-lenses/f-theta-lens-for-1064-nm/, 2 pages.
Sill Optics, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, http://www.silloptics.de/1/products/sill-encyclopedia/laser-optics/f-theta-lenses/, 4 pages.
Chinese Notice of Reexamination in related Chinese Application 201520810313.3, dated Mar. 14, 2017, English Computer Translation provided, 7 pages.
Extended European search report in related EP Application 16199707.7, dated Apr. 10, 2017, 15 pages.
Ulusoy et al., One-Shot Scanning using De Bruijn Spaced Grids, 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops, 7 pages [Cited in EP Extended search report dated Apr. 10, 2017].
European Examination report in related EP Application No. 14181437.6, dated Feb. 8, 2017, 5 pages.
Wikipedia, “Microlens”, Downloaded from https://en.wikipedia.org/wiki/Microlens, pp. 3. {Cited by Examiner in Feb. 9, 2017 Final Office Action in related matter}.
Fukaya et al., “Characteristics of Speckle Random Pattern and Its Applications”, pp. 317-327, Nouv. Rev. Optique, t.6, n.6. (1975) {Cited by Examiner in Feb. 9, 2017 Final Office Action in related matter: downloaded Mar. 2, 2017 from http://iopscience.iop.org}.
European Extended Search Report in related EP Application No. 16190017.0, dated Jan. 4, 2017, 6 pages.
European Extended Search Report in related EP Application No. 16173429.8, dated Dec. 1, 2016, 8 pages. [Only new references cited: US 2013/0038881 was previously cited].
Extended European Search Report in related EP Application No. 16175410.0, dated Dec. 13, 2016, 5 pages.
Peter Clarke, Actuator Developer Claims Anti-Shake Breakthrough for Smartphone Cams, Electronic Engineering Times, p. 24, May 16, 2011.
Spiller, Jonathan; Object Localization Using Deformable Templates, Master's Dissertation, University of the Witwatersrand, Johannesburg, South Africa, 2007; 74 pages.
Leotta, Matthew J.; Joseph L. Mundy; Predicting High Resolution Image Edges with a Generic, Adaptive, 3-D Vehicle Model; IEEE Conference on Computer Vision and Pattern Recognition, 2009; 8 pages.
European Search Report for Application No. EP13186043, dated Feb. 26, 2014 (now EP2722656, dated Apr. 23, 2014), 7 pages.
European Patent Office Action for Application No. 14157971.4-1906, dated Jul. 16, 2014, 5 pages.
European Patent Search Report for Application No. 14157971.4-1906, dated Jun. 30, 2014, 6 pages.
Caulier, Yannick et al., “A New Type of Color-Coded Light Structures for an Adapted and Rapid Determination of Point Correspondences for 3D Reconstruction.” Proc. of SPIE, vol. 8082 808232-3; 2011; 8 pages.
Kazantsev, Aleksei et al. "Robust Pseudo-Random Coded Colored Structured Light Techniques for 3D Object Model Recovery"; ROSE 2008 IEEE International Workshop on Robotic and Sensors Environments (Oct. 17-18, 2008), 6 pages.
Mouaddib E. et al. “Recent Progress in Structured Light in order to Solve the Correspondence Problem in Stereo Vision” Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Apr. 1997; 7 pages.
Proesmans, Marc et al. “Active Acquisition of 3D Shape for Moving Objects” 0-7803-3258-X/96 1996 IEEE; 4 pages.
Salvi, Joaquim et al. “Pattern Codification Strategies in Structured Light Systems” published in Pattern Recognition; The Journal of the Pattern Recognition Society, Received Mar. 6, 2003; Accepted Oct. 2, 2003; 23 pages.
EP Search and Written Opinion Report in related matter EP Application No. 14181437.6, dated Mar. 26, 2015, 7 pages.
Hetzel, Gunter et al.; "3D Object Recognition from Range Images using Local Feature Histograms," Proceedings 2001 IEEE Conference on Computer Vision and Pattern Recognition. CVPR 2001. Kauai, Hawaii, Dec. 8-14, 2001; pp. 394-399, XP010584149, ISBN: 978-0-7695-1272-3.
Second Chinese Office Action in related CN Application No. 201520810685.6, dated Mar. 22, 2016, 5 pages, no references.
European Search Report in related EP Application No. 15190315.0, dated Apr. 1, 2016, 7 pages.
International Search Report for PCT/US2013/039438 (WO2013166368), dated Oct. 1, 2013, 7 pages.
Lloyd, Ryan and Scott McCloskey, "Recognition of 3D Package Shapes for Single Camera Metrology" IEEE Winter Conference on Applications of Computer Vision, IEEE, Mar. 24, 2014, pp. 99-106, {retrieved on Jun. 16, 2014}, Authors are employees of common Applicant.
European Office Action for Application EP 13186043, dated Jun. 12, 2014 (now EP2722656, Apr. 23, 2014), 6 pages.
Zhang, Zhaoxiang; Tieniu Tan, Kaiqi Huang, Yunhong Wang; Three-Dimensional Deformable-Model-based Localization and Recognition of Road Vehicles; IEEE Transactions on Image Processing, vol. 21, No. 1, Jan. 2012, 13 pages.
U.S. Appl. No. 14/801,023, Tyler Doomenbal et al., filed Jul. 16, 2015, not published yet, Adjusting Dimensioning Results Using Augmented Reality, 39 pages.
Wikipedia, YUV description and definition, downloaded from http://www.wikipeida.org/wiki/YUV on Jun. 29, 2012, 10 pages.
YUV Pixel Format, downloaded from http://www.fource.org/yuv.php on Jun. 29, 2012; 13 pages.
YUV to RGB Conversion, downloaded from http://www.fource.org/fccyvrgb.php on Jun. 29, 2012; 5 pages.
Benos et al., “Semi-Automatic Dimensioning with Imager of a Portable Device,” U.S. Appl. No. 51/149,912, filed Feb. 4, 2009 (now expired), 56 pages.
Dimensional Weight—Wikipedia, the Free Encyclopedia, URL=http://en.wikipedia.org/wiki/Dimensional_weight, download date Aug. 1, 2008, 2 pages.
Dimensioning—Wikipedia, the Free Encyclopedia, URL=http://en.wikipedia.org/wiki/Dimensioning, download date Aug. 1, 2008, 1 page.
Decision to Grant in counterpart European Application No. 14157971.4 dated Aug. 6, 2015, pp. 1-2.
Leotta, Matthew, Generic, Deformable Models for 3-D Vehicle Surveillance, May 2010, Doctoral Dissertation, Brown University, Providence RI, 248 pages.
Ward, Benjamin, Interactive 3D Reconstruction from Video, Aug. 2012, Doctoral Thesis, University of Adelaide, Adelaide, South Australia, 157 pages.
Hood, Frederick W.; William A. Hoff, Robert King, Evaluation of an Interactive Technique for Creating Site Models from Range Data, Apr. 27-May 1, 1997 Proceedings of the ANS 7th Topical Meeting on Robotics & Remote Systems, Augusta GA, 9 pages.
Gupta, Alok; Range Image Segmentation for 3-D Objects Recognition, May 1988, Technical Reports (CIS), Paper 736, University of Pennsylvania Department of Computer and Information Science, retrieved from http://repository.upenn.edu/cis_reports/736, Accessed May 31, 2015, 157 pages.
Reisner-Kollmann, Irene; Anton L. Fuhrmann, Werner Purgathofer, Interactive Reconstruction of Industrial Sites Using Parametric Models, May 2010, Proceedings of the 26th Spring Conference of Computer Graphics SCCG 10, 8 pages.
Drummond, Tom; Roberto Cipolla, Real-Time Visual Tracking of Complex Structures, Jul. 2002, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, No. 7; 15 pages.
European Search Report for Related EP Application No. 15189214.8, dated Mar. 3, 2016, 9 pages.
Santolaria et al. "A one-step intrinsic and extrinsic calibration method for laser line scanner operation in coordinate measuring machines", dated Apr. 1, 2009, Measurement Science and Technology, IOP, Bristol, GB, vol. 20, No. 4; 12 pages.
Search Report and Opinion in Related EP Application 15176943.7, dated Jan. 8, 2016, 8 pages.
European Search Report for related EP Application No. 15188440.0, dated Mar. 8, 2016, 8 pages.
Second Chinese Office Action in related CN Application No. 2015220810562.2, dated Mar. 22, 2016, 5 pages. English Translation provided [No references].
European Search Report for related Application EP 15190249.1, dated Mar. 22, 2016, 7 pages.
Second Chinese Office Action in related CN Application No. 201520810313.3, dated Mar. 22, 2016, 5 pages. English Translation provided [No references].
U.S. Appl. No. 14/800,757, Eric Todeschini, filed Jul. 16, 2015, not published yet, Dimensioning and Imaging Items, 80 pages.
U.S. Appl. No. 14/747,197, Serge Thuries et al., filed Jun. 23, 2015, not published yet, Optical Pattern Projector; 33 pages.
U.S. Appl. No. 14/747,490, Brian L. Jovanovski et al., filed Jun. 23, 2015, not published yet, Dual-Projector Three-Dimensional Scanner; 40 pages.
Search Report and Opinion in related GB Application No. 1517112.7, dated Feb. 19, 2016, 6 Pages.
U.S. Appl. No. 14/793,149, H. Sprague Ackley, filed Jul. 7, 2015, not published yet, Mobile Dimensioner Apparatus for Use in Commerce; 57 pages.
U.S. Appl. No. 14/740,373, H. Sprague Ackley et al., filed Jun. 16, 2015, not published yet, Calibrating a Volume Dimensioner; 63 pages.
Intention to Grant in counterpart European Application No. 14157971.4 dated Apr. 14, 2015, pp. 1-8.
United Kingdom Search Report in related application GB1517842.9, dated Apr. 8, 2016, 8 pages.
Great Britain Search Report for related Application No. GB1517843.7, dated Feb. 23, 2016; 8 pages.
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned.
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned.
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned.
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages.
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages.
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages.
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages.
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages.
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages.
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages.
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al); 16 pages.
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages.
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages; now abandoned.
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages.
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages.
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages.
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages.
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages.
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages.
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages.
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages.
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages.
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Barndringa); 38 pages.
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages.
European extended search report in related EP Application 16190833.0, dated Mar. 9, 2017, 8 pages [only new art has been cited; US Publication 2014/0034731 was previously cited].
United Kingdom Combined Search and Examination Report in related Application No. GB1620676.5, dated Mar. 8, 2017, 6 pages [References have been previously cited; WO2014/151746, WO2012/175731, US 2014/0313527, GB2503978].
European Exam Report in related , EP Application No. 16168216.6, dated Feb. 27, 2017, 5 pages, [References have been previously cited; WO2011/017241 and US 2014/0104413].
Office Action in counterpart European Application No. 13186043.9 dated Sep. 30, 2015, pp. 1-7.
Lloyd et al., “System for Monitoring the Condition of Packages Throughout Transit”, U.S. Appl. No. 14/865,575, filed Sep. 25, 2015, 59 pages, not yet published.
McCloskey et al., “Image Transformation for Indicia Reading,” U.S. Appl. No. 14/928,032, filed Oct. 30, 2015, 48 pages, not yet published.
Great Britain Combined Search and Examination Report in related Application GB1517842.9, dated Apr. 8, 2016, 8 pages.
Search Report in counterpart European Application No. 15182675.7, dated Dec. 4, 2015, 10 pages.
Wikipedia, “3D projection” Downloaded on Nov. 25, 2015 from www.wikipedia.com, 4 pages.
M. Zahid Gurbuz, Selim Akyokus, Ibrahim Emiroglu, Aysun Guran, An Efficient Algorithm for 3D Rectangular Box Packing, 2009, Applied Automatic Systems: Proceedings of Selected AAS 2009 Papers, pp. 131-134.
European Extended Search Report in Related EP Application No. 16172995.9, dated Aug. 22, 2016, 11 pages.
European Extended search report in related EP Application No. 15190306.9, dated Sep. 9, 2016, 15 pages.
Collings et al., "The Applications and Technology of Phase-Only Liquid Crystal on Silicon Devices", Journal of Display Technology, IEEE Service Center, New York, NY, US, vol. 7, No. 3, Mar. 1, 2011 (Mar. 1, 2011), pp. 112-119.
European extended Search report in related EP Application 13785171.3, dated Sep. 19, 2016, 8 pages.
El-Hakim et al., “Multicamera vision-based approach to flexible feature measurement for inspection and reverse engineering”, published in Optical Engineering, Society of Photo-Optical Instrumentation Engineers, vol. 32, No. 9, Sep. 1, 1993, 15 pages.
El-Hakim et al., “A Knowledge-based Edge/Object Measurement Technique”, Retrieved from the Internet: URL: https://www.researchgate.net/profile/Sabry_E1-Hakim/publication/44075058_A_Knowledge_Based_EdgeObject_Measurement_Technique/links/00b4953b5faa7d3304000000.pdf [retrieved on Jul. 15, 2016] dated Jan. 1, 1993, 9 pages.
H. Sprague Ackley, “Automatic Mode Switching in a Volume Dimensioner”, U.S. Appl. No. 15/182,636, filed Jun. 15, 2016, 53 pages, Not yet published.
Bosch Tool Corporation, “Operating/Safety Instruction for DLR 130”, Dated Feb. 2, 2009, 36 pages.
European Search Report for related EP Application No. 16152477.2, dated May 24, 2016, 8 pages.
Mike Stensvold, "Get the Most Out of Variable Aperture Lenses", published on www.OutdoorPhotogrpaher.com; dated Dec. 7, 2010; 4 pages, [As noted on search report retrieved from URL: http://www.outdoorphotographer.com/gear/lenses/get-the-most-out-ofvariable-aperture-lenses.html on Feb. 9, 2016].
Houle et al., "Vehicle Positioning and Object Avoidance", U.S. Appl. No. 15/007,522 [not yet published], filed Jan. 27, 2016, 59 pages.
United Kingdom combined Search and Examination Report in related GB Application No. 1607394.2, dated Oct. 19, 2016, 7 pages.
European Search Report from related EP Application No. 16168216.6, dated Oct. 20, 2016, 8 pages.
Padzensky, Ron; “Augmera; Gesture Control”, Dated Apr. 18, 2015, 15 pages.
Grabowski, Ralph; "New Commands in AutoCAD 2010: Part 11 Smoothing 3D Mesh Objects" Dated 2011, 6 pages.
Theodoropoulos, Gabriel; “Using Gesture Recognizers to Handle Pinch, Rotate, Pan, Swipe, and Tap Gestures” dated Aug. 25, 2014, 34 pages.
European Exam Report in related EP Application No. 16152477.2, dated Jun. 20, 2017, 4 pages.
European Exam Report in related EP Application 16172995.9, dated Jul. 6, 2017, 9 pages.
United Kingdom Search Report in related Application No. GB1700338.5, dated Jun. 30, 2017, 5 pages.
European Search Report in related EP Application No. 17175357.7, dated Aug. 17, 2017, pp. 1-7.
Ralph Grabowski, "Smoothing 3D Mesh Objects," New Commands in AutoCAD 2010: Part 11, dated May 19, 2017; 6 pages.
European Exam Report in related EP Application No. 15176943.7, dated Apr. 12, 2017, 6 pages.
European Exam Report in related EP Application No. 15188440.0, dated Apr. 21, 2017, 4 pages.
Boavida et al., “Dam monitoring using combined terrestrial imaging systems”, 2009 Civil Engineering Survey Dec./Jan. 2009, pp. 33-38 {Notice of Allowance dated Sep. 15, 2017 in related matter}.
EP Search Report in related EP Application No. 17171844, dated Sep. 18, 2017, 4 pages.
EP Extended Search Report in related EP Application No. 17174843.7, dated Oct. 17, 2017, 5 pages.
UK Further Exam Report in related UK Application No. GB1517842.9, dated Sep. 1, 2017, 5 pages.
Ulusoy, Ali Osman et al.; “One-Shot Scanning using De Bruijn Spaced Grids”, Brown University; 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops, pp. 1786-1792 [EPO Search Report dated Dec. 5, 2017].
Extended European Search report in related EP Application No. 17189496.7 dated Dec. 5, 2017; 9 pages.
Examination Report in related EP Application No. 15190315, dated Jan. 26, 2018, 6 pages.
Examination Report in related GB Application No. GB1517843.7, dated Jan. 19, 2018, 4 pages.
Extended European Search report in related EP Application No. 17190323.0 dated Jan. 19, 2018; 6 pages.
Related Publications (1)
Number Date Country
20170358098 A1 Dec 2017 US