Dimensioning system

Information

  • Patent Grant
  • Patent Number
    10,908,013
  • Date Filed
    Monday, November 20, 2017
  • Date Issued
    Tuesday, February 2, 2021
Abstract
A terminal for measuring at least one dimension of an object includes a range camera, a visible camera, and a display that are fixed in position and orientation relative to each other. The range camera is configured to produce a range image of an area in which the object is located. The visible camera is configured to produce a visible image of an area in which the object is located. The display is configured to present information associated with the range camera's field of view and the visible camera's field of view.
Description
FIELD OF THE INVENTION

The present invention relates to the field of devices for weighing and dimensioning packages, more specifically, to an integrated dimensioning and weighing system for packages.


BACKGROUND

Shipping companies typically charge customers for their services based on package size (i.e., volumetric weight) and/or weight (i.e., dead weight). When printing a shipping label for a package to be shipped, a customer enters both the size and weight of the package into a software application that bills the customer based on that information. Typically, customers obtain this information by hand-measuring the package's dimensions (e.g., with a tape measure) and may weigh the package on a scale. In some cases, customers simply guess the weight of the package. Both guessing the weight and hand-measuring the dimensions are prone to error, particularly when packages have irregular shapes. When the shipping company later determines that the package is larger and/or heavier than reported by the customer, an additional bill may be issued to the customer. Additional bills may reduce customer satisfaction and, if the shipping customer is a retail company that has already passed the shipping cost along to an end customer, decrease that company's earnings.


Furthermore, shipping companies may also collect the package's origin, destination, and linear dimensions from a customer to determine the correct charges for shipping a package. Manual entry of this information by a customer or the shipping company is also error prone.


As such, there is a commercial need for systems that accurately collect a package's size, weight, linear dimensions, origin, and destination and for integration with billing systems to reduce errors in transcribing that data.


SUMMARY

Accordingly, in one aspect, the present invention embraces an object analysis system. The system includes a scale for measuring the weight of the object, a range camera configured to produce a range image of an area in which the object is located, and a computing device configured to determine the dimensions of the object based, at least in part, on the range image.


In an exemplary embodiment, the range camera is configured to produce a visible image of the scale's measured weight of the object and the computing device is configured to determine the weight of the object based, at least in part, on the visible image. The scale may be an analog scale having a gauge and the visible image produced by the range camera includes the scale's gauge. Alternatively, the scale may be a digital scale having a display and the visible image produced by the range camera includes the scale's display.


In yet another exemplary embodiment, the computing device is configured to execute shipment billing software.


In yet another exemplary embodiment, the object analysis system transmits the weight of the object and determined dimensions to a host platform configured to execute shipment billing software.


In yet another exemplary embodiment, the object analysis system includes a microphone for capturing audio from a user and the computing device is configured for converting the captured audio to text.


In yet another exemplary embodiment, the range camera is configured to project a visible laser pattern onto the object and produce a visible image of the object and the computing device is configured to determine the dimensions of the object based, at least in part, on the visible image of the object.


In yet another exemplary embodiment, the scale and the range camera are fixed in position and orientation relative to each other and the computing device is configured to determine the dimensions of the object based, at least in part, on ground plane data of the area in which the object is located. The ground plane data may be generated by capturing an initial range image and identifying a planar region in the initial range image that corresponds to a ground plane.


In another aspect, the present invention embraces a method for determining the dimensions of an object that includes capturing a range image of a scene that includes the object and determining the dimensions of the object based, at least in part, on the range image and ground plane data of the area in which the object is located.


In yet another aspect, the present invention embraces a terminal for measuring at least one dimension of an object that includes a range camera, a visible camera, and a display that are fixed in position and orientation relative to each other. The range camera is configured to produce a range image of an area in which the object is located. The visible camera is configured to produce a visible image of an area in which the object is located. The display is configured to present information associated with the range camera's field of view and the visible camera's field of view.


In an exemplary embodiment, the range camera's field of view is narrower than the visible camera's field of view and the display is configured to present the visible image produced by the visible camera and an outlined shape on the displayed visible image corresponding to the range camera's field of view.


In another exemplary embodiment, the display is configured to present the visible image produced by the visible camera and a symbol on the displayed visible image corresponding to the optical center of the range camera's field of view.


In yet another aspect, the present invention embraces a method for determining the dimensions of an object that includes projecting a laser pattern (e.g., a visible laser pattern) onto the object, capturing an image of the projected pattern on the object, and determining the dimensions of the object based, at least in part, on the captured image.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an object analysis system in accordance with one or more exemplary embodiments.



FIG. 2 illustrates a system for determining dimensions associated with an object in accordance with one or more embodiments of the present disclosure.



FIG. 3 illustrates a method for determining dimensions associated with an object in accordance with one or more embodiments of the present disclosure.



FIG. 4 is a schematic physical form view of one embodiment of a terminal in accordance with aspects of the present invention.



FIG. 5 is a block diagram of the terminal of FIG. 4.



FIG. 6 is a diagrammatic illustration of one embodiment of an imaging subsystem for use in the terminal of FIG. 4.



FIG. 7 is a flowchart illustrating one embodiment of a method for measuring at least one dimension of an object using the terminal of FIG. 4.



FIG. 8 is an illustration of a first image of the object obtained using the fixed imaging subsystem of FIG. 6.



FIG. 9 is a view of the terminal of FIG. 4 illustrating on the display the object disposed in the center of the display for use in obtaining the first image of FIG. 8.



FIG. 10 is a second aligned image of the object obtained using the movable imaging subsystem of FIG. 6.



FIG. 11 is a diagrammatic illustration of the geometry between an object and the image of the object on an image sensor array.



FIG. 12 is a diagrammatic illustration of another embodiment of an imaging subsystem for use in the terminal of FIG. 4, which terminal may include an aimer.



FIG. 13 is a diagrammatic illustration of another embodiment of a single movable imaging subsystem and actuator for use in the terminal of FIG. 4.



FIG. 14 is an elevational side view of one implementation of an imaging subsystem and actuator for use in the terminal of FIG. 4.



FIG. 15 is a top view of the imaging subsystem and actuator of FIG. 14.



FIG. 16 is a timing diagram illustrating one embodiment for use in determining one or more dimensions and for decoding decodable indicia performed by the indicia reading terminal of FIG. 4.



FIG. 17 depicts the near field relationship between a laser pattern and a camera system's field of view as employed in an exemplary method.



FIG. 18 depicts the far field relationship between a laser pattern and a camera system's field of view as employed in an exemplary method.



FIG. 19 depicts an exemplary arrangement of a standard rectilinear box-shaped object on a flat surface upon which a laser pattern has been projected in accordance with an exemplary method.



FIG. 20 schematically depicts a relationship between the width of a laser line and the size of the field of view of a small number of pixels within a camera system.





DETAILED DESCRIPTION

The present invention embraces a system that accurately collects a package's size, weight, linear dimensions, origin, and destination and that may be integrated with billing systems to reduce errors in transcribing that data.


In one aspect, the present invention embraces an object analysis system. FIG. 1 illustrates an exemplary object analysis system 11. As depicted, the system 11 includes a scale 12, a range camera 102, a computing device 104, and a microphone 18. Typically, the scale 12 measures the weight of the object 112, the range camera 102 is configured to produce a range image of an area 110 in which the object is located, and the computing device 104 is configured to determine the dimensions of the object 112 based, at least in part, on the range image.


As noted, the scale 12 measures the weight of the object 112. Exemplary scales 12 include analog scales having gauges and digital scales having displays. The scale 12 of FIG. 1 includes a window 13 for showing the measured weight of the object 112. The window 13 may be a gauge or a display depending on the type of scale 12.


The scale 12 also includes top surface markings 14 to guide a user to place the object in a preferred orientation for analysis by the system. For example, a particular orientation may improve the range image and/or visible image produced by range camera 102. Additionally, the scale may include top surface markings 16 to facilitate the computing device's estimation of a reference plane during the process of determining the dimensions of the object 112.


In exemplary embodiments, the scale 12 transmits the measured weight of the object 112 to the computing device 104 and/or a host platform 17. In this regard, the scale 12 may transmit this information via a wireless connection and/or a wired connection (e.g., a USB connection, such as USB 1.0, 2.0, and/or 3.0).


As noted, the object analysis system 11 includes a range camera 102 that is configured to produce a range image of an area 110 in which the object 112 is located. In exemplary embodiments, the range camera 102 is also configured to produce a visible image of the scale's measured weight of the object 112 (e.g., a visible image that includes window 13). The range camera 102 may be separate from the computing device 104, or the range camera 102 and the computing device 104 may be part of the same device. The range camera 102 is typically communicatively connected to the computing device 104.


The depicted object analysis system 11 includes a microphone 18. The microphone 18 may be separate from the range camera 102, or the microphone 18 and the range camera 102 may be part of the same device. Similarly, the microphone 18 may be separate from the computing device 104, or the microphone 18 and the computing device 104 may be part of the same device.


The microphone 18 captures audio from a user of the object analysis system 11, which may then be converted to text (e.g., ASCII text). In exemplary embodiments, the text may be presented to the user via a user-interface for validation or correction (e.g., by displaying the text on a monitor or by having a computerized reader speak the words back to the user). The text is typically used as an input for software (e.g., billing software and/or dimensioning software). For example, the text (i.e., as generated by converting audio from the user) may be an address, in which case the computing device may be configured to determine the components of the address. In this regard, exemplary object analysis systems reduce the need for error-prone manual entry of data.


Additionally, the text may be used as a command to direct software (e.g., billing software and/or dimensioning software). For example, if multiple objects are detected in the range camera's field of view, a user interface may indicate a numbering for each object and ask the user which package should be dimensioned. The user could then give a verbal command by saying a number, and the audio as captured by the microphone 18 can be converted into text which commands the dimensioning software. Similarly, the user could give verbal commands to describe the general class of the object (e.g., “measure a box”) or to indicate the type of information being provided (e.g., a command of “destination address” to indicate that an address will be provided next).


The computing device 104 may be configured for converting the audio captured by the microphone 18 to text. Additionally, the computing device 104 may be configured to transmit the captured audio (e.g., as a file or a live stream) to a speech-to-text module and receive the text. The captured audio may be transcoded as necessary by the computing device 104. The computing device 104 may or may not include the speech-to-text module. For example, the computing device 104 may transmit (e.g., via a network connection) the captured audio to an external speech-to-text service provider (e.g., Google's cloud-based speech-to-text service). In exemplary embodiments, the speech-to-text module transmits the text and a confidence measure of each converted phrase. The computing device 104 may be configured to enter the text into shipment billing software (e.g., by transmitting the text to a host platform 17 configured to execute shipment billing software).
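As a concrete illustration of how the per-phrase confidence measures described above might be used, the following Python sketch splits converted phrases into accepted text and phrases to read back to the user for validation. The threshold value, function name, and data shape are illustrative assumptions, not taken from the patent.

```python
def triage_transcription(phrases, accept_threshold=0.85):
    """Split speech-to-text output into accepted text and phrases to re-confirm.

    Each phrase arrives as (text, confidence), matching the description of a
    speech-to-text module that reports a confidence measure per converted
    phrase. The 0.85 threshold is an illustrative assumption.
    """
    accepted, needs_review = [], []
    for text, confidence in phrases:
        (accepted if confidence >= accept_threshold else needs_review).append(text)
    return accepted, needs_review

# Example: a destination address captured by the microphone.
accepted, review = triage_transcription([
    ("123 Main Street", 0.97),
    ("Springfield", 0.62),   # low confidence -> present back to the user
])
```

Phrases below the threshold would then be presented via the user interface (displayed or spoken back) for correction, as described above.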


As noted, the object analysis system 11 includes a computing device 104. The computing device 104 depicted in FIG. 1 includes a processor 106 and a memory 108. Additional aspects of processor 106 and memory 108 are discussed with respect to FIG. 2. Memory 108 can store executable instructions, such as, for example, computer readable instructions (e.g., software), that can be executed by processor 106. Although not illustrated in FIG. 1, memory 108 can be coupled to processor 106.


The computing device 104 is configured to determine the dimensions of an object 112 based, at least in part, on a range image produced by range camera 102. Exemplary methods of determining the dimensions of an object 112 are discussed with respect to FIGS. 2-16. The computing device 104 may also be configured to determine the weight of an object 112 based, at least in part, on a visible image produced by range camera 102. For example, the computing device 104 may execute software that processes the visible image to read the weight measured by the scale 12.


The computing device 104 may be configured to calculate the density of the object 112 based on its determined dimensions and weight. Furthermore, the computing device 104 may be configured to compare the calculated density to a realistic density threshold (e.g., as preprogrammed data or tables). If the calculated density exceeds a given realistic density threshold, the computing device 104 may: re-determine the dimensions of the object 112 based on the range image; instruct the range camera 102 to produce a new range image; instruct the range camera 102 to produce a new visible image; and/or instruct the scale 12 to re-measure the object 112.
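The density sanity check described above can be sketched as follows. The threshold value and function names are illustrative assumptions; a real system would use preprogrammed tables as the patent describes.

```python
# Illustrative threshold: densities well above solid aluminum (~2700 kg/m^3)
# are implausible for an ordinary shipped package.
MAX_REALISTIC_DENSITY = 3000.0  # kg/m^3 (assumed value)

def density_check(length_m, width_m, height_m, weight_kg,
                  max_density=MAX_REALISTIC_DENSITY):
    """Return the computed density and whether a re-measurement is advised."""
    volume = length_m * width_m * height_m  # m^3
    density = weight_kg / volume            # kg/m^3
    return density, density > max_density

# A 30 x 20 x 10 cm box weighing 2 kg: plausible, no re-measurement needed.
density, remeasure = density_check(0.30, 0.20, 0.10, 2.0)
```

If `remeasure` is true, the computing device would fall back to one of the corrective actions listed above (new range image, new visible image, or re-weighing).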


The computing device 104 may also be configured to compare the determined dimensions of the object 112 with the dimensions of the scale 12. In this regard, the scale's dimensions may be known (e.g., as preprogrammed data or tables), and the computing device 104 may be configured to determine the dimensions of the object based on the range image and the known dimensions of the scale 12. Again, if the determined dimensions exceed a given threshold of comparison, the computing device 104 may: re-determine the dimensions of the object 112 based on the range image; instruct the range camera 102 to produce a new range image; instruct the range camera 102 to produce a new visible image; and/or instruct the scale 12 to re-measure the object 112.


In exemplary embodiments, the computing device 104 may be configured to execute shipment billing software. In such embodiments, the computing device 104 may be a part of the same device as the host platform 17, or the object analysis system 11 may not include a host platform 17.


Alternatively, the object analysis system 11 may transmit (e.g., via a wireless connection and/or a wired connection, such as a USB connection) the weight of the object 112 and determined dimensions to a host platform 17 configured to execute shipment billing software. For example, the computing device 104 may transmit the weight of the object 112 and determined dimensions to the host platform 17.


In exemplary embodiments, the range camera 102 is configured to project a laser pattern (e.g., a visible laser pattern) onto the object 112 and produce a visible image of the object 112, and the computing device 104 is configured to determine the dimensions of the object 112 based, at least in part, on the visible image of the object 112. In this regard, the projection of the laser pattern onto the object 112 provides additional information, or an alternative or supplemental method, for determining the dimensions of the object 112. Furthermore, the laser pattern facilitates user placement of the object with respect to the range camera.


An exemplary object analysis system 11 includes a scale 12 and a range camera 102 that are fixed in position and orientation relative to each other. The computing device 104 of such an exemplary object analysis system 11 may be configured to determine the dimensions of the object 112 based, at least in part, on ground plane data of the area 110 in which the object is located. The ground plane data may include data generated by capturing an initial range image and identifying a planar region in the initial range image that corresponds to a ground plane.


The ground plane data may be stored on the computing device 104 during manufacturing after calibrating the object analysis system 11. The ground plane data may also be updated by the computing device 104 after installation of the object analysis system 11 or periodically during use by capturing an initial range image and identifying a planar region in the initial range image that corresponds to a ground plane.
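The "identify a planar region" step described above can be illustrated with a simple least-squares plane fit and an inlier-based validity check. This is a minimal sketch under assumed tolerances; a production dimensioner would first segment the range image and reject outliers (e.g., with RANSAC) before fitting.

```python
def fit_ground_plane(points):
    """Least-squares fit of z = a*x + b*y + c to (x, y, z) range points.

    Solves the 3x3 normal equations with Cramer's rule; returns (a, b, c).
    """
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] * p[0] for p in points); syy = sum(p[1] * p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points); syz = sum(p[1] * p[2] for p in points)

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]
    d = det3(A)
    solution = []
    for col in range(3):
        M = [row[:] for row in A]
        for r in range(3):
            M[r][col] = rhs[r]
        solution.append(det3(M) / d)
    return tuple(solution)

def plane_still_valid(plane, points, tol=0.01, min_inlier_frac=0.9):
    """Check stored ground-plane data against a fresh range image.

    Tolerance (1 cm) and inlier fraction are illustrative assumptions.
    """
    a, b, c = plane
    inliers = sum(1 for x, y, z in points if abs(a * x + b * y + c - z) <= tol)
    return inliers / len(points) >= min_inlier_frac
```

During use, `plane_still_valid` implements the verification step: if the stored plane no longer matches the identified planar region, the computing device would refit and update the stored ground plane data.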


The computing device 104 may be configured to verify the validity of the ground plane data by identifying a planar region in the range image produced by the range camera 102 that corresponds to a ground plane. If the ground plane data does not correspond to the identified planar region in the range image, the computing device 104 may update the ground plane data.


In exemplary embodiments, the computing device 104 may be configured to control the object analysis system in accordance with multiple modes. While in a detection mode, the computing device 104 may be configured to evaluate image viability and/or quality (e.g., of an infra-red image or visible image) in response to movement or the placement of an object in the range camera's field of view. Based on the evaluation of the image viability and/or quality, the computing device 104 may be configured to place the object analysis system in another mode, such as an image capture mode for capturing an image using the range camera 102 or an adjust mode for adjusting the position of the range camera 102.


In exemplary embodiments, the object analysis system may include positioning devices (e.g., servo motors, tilt motors, and/or three-axis accelerometers) to change the position of the range camera relative to the object. In this regard, the computing device 104 may be configured to control and receive signals from the positioning devices. After evaluating image viability and/or quality, the computing device may place the object analysis system in an adjust mode. The computing device may be configured to have two adjust modes: semiautomatic and automatic. In semiautomatic adjust mode, the computing device may be configured to provide visual or audio feedback to an operator, who then moves the range camera (e.g., adjusts the camera's tilt angle and/or height). In automatic mode, the computing device may be configured to control and receive signals from the positioning devices to adjust the position of the range camera. By adjusting the position of the range camera, the object analysis system can achieve higher dimensioning accuracy.


In another aspect, the present invention embraces a method for determining the dimensions of an object. The method includes capturing a range image of a scene that includes the object and determining the dimensions of the object based, at least in part, on the range image and ground plane data of the area in which the object is located. As noted with respect to an exemplary object analysis system, the ground plane data may include data generated by capturing an initial range image and identifying a planar region in the initial range image that corresponds to a ground plane. The method may also include verifying the validity of the ground plane data by identifying a planar region in the range image that corresponds to a ground plane.


This exemplary method for determining the dimensions of an object is typically used in conjunction with a range camera on a fixed mount at a given distance and orientation with respect to the area in which the object is placed for dimensioning. In this regard, utilizing the ground plane data, rather than identifying the ground plane for each implementation of the method, can reduce the time and resources required to determine the dimensions of the object.


In yet another aspect, the present invention embraces another method for determining the dimensions of an object. The method includes projecting a laser pattern (e.g., a visible laser pattern) onto an object, capturing an image of the projected pattern on the object, and determining the dimensions of the object based, at least in part, on the captured image. In an exemplary embodiment, the object has a rectangular box shape.


An exemplary method includes projecting a laser pattern (e.g., a grid or a set of lines) onto a rectangular box. Typically, the box is positioned such that two non-parallel faces are visible to the system or device projecting the laser pattern and a camera system with known field of view characteristics. The camera system is used to capture an image of the laser light reflecting off of the box. Using image analysis techniques (e.g., imaging software), the edges of the box are determined. The relative size and orientation of the faces is determined by comparing the distance between lines of the laser pattern in the captured image to the known distance between the lines of the laser pattern as projected while considering the characteristics of the camera system's field of view, such as size, aspect ratio, distortion, and/or angular magnification.


The distance from the camera system to the box may also be desired and may be used to determine the dimensions of the box. The distance between the camera system and the box can be determined using a variety of methods. For example, the distance from the camera system to the box may be determined from the laser pattern and the camera system's field of view. Additionally, sonar ranging techniques or considering the light time of flight may facilitate determination of this distance.
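One way the distance can be recovered from the laser pattern and the camera's field of view is the pinhole relation: a feature of known physical size spanning fewer pixels must be farther away. The sketch below illustrates this with the collimated central square of the pattern described later; the 50 mm square size and focal length are assumed values for illustration.

```python
def distance_from_pattern(square_width_m, square_width_px, focal_px):
    """Estimate camera-to-object distance with the pinhole camera model.

    Assumes the collimated central square of the laser pattern keeps a
    constant, known physical width as it propagates; focal_px is the
    camera focal length expressed in pixels. Parameter values below are
    illustrative assumptions.
    """
    return focal_px * square_width_m / square_width_px

# A 50 mm square imaged across 80 px by a camera with f = 1600 px:
d = distance_from_pattern(0.050, 80.0, 1600.0)  # ≈ 1.0 m
```

Sonar ranging or time-of-flight measurement, also mentioned above, would replace this computation with a direct distance reading.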


Another exemplary method includes projecting a laser pattern including two horizontal, parallel lines and two vertical, parallel lines. The distance between each set of parallel lines is constant. In this regard, the laser pattern is collimated, producing a constant-size square or rectangle in the center of the laser pattern as it propagates away from the device that generated the laser pattern.


An exemplary laser pattern including two horizontal, parallel lines and two vertical, parallel lines is depicted in FIGS. 17 and 18. The exemplary laser pattern is aligned to the field of view of the camera system, and the relationship between the laser pattern and the field of view is determined. This relationship may be determined by a precision alignment of the laser pattern to a known fixture pattern and/or by a software calibration process that processes two or more images from the camera system. FIG. 17 depicts the approximated relationship between the laser pattern and the camera's near-field field of view, and FIG. 18 depicts the approximated relationship between the laser pattern and the camera's far-field field of view.


The exemplary method typically includes projecting the laser pattern onto two faces of a standard rectilinear box-shaped object such that the two horizontal laser lines are parallel to and on opposite sides of the edge connecting the two faces (i.e., one horizontal laser line above the edge and the other horizontal line below the edge). Additionally, the laser pattern is typically projected such that the laser pattern fully traverses the visible faces of the object.



FIG. 19 depicts an exemplary arrangement of a standard rectilinear box-shaped object 5001 upon which a laser pattern 5002 has been projected. As depicted, the two horizontal laser lines are parallel to and on opposite sides of the edge connecting the two faces. Additionally, the laser pattern 5002 fully traverses the visible faces of the object 5001. Accordingly, a number of break points, typically ten break points, are formed in the projected laser pattern 5002. These break points are identified in FIG. 19 by open circles.


The exemplary method includes capturing an image of the projected laser pattern on the object (e.g., with a camera system). The dimensions of the object are then determined, at least in part, from the captured image. For example, a processor may be used to process the image to identify the break points in the projected laser pattern. Using the known relationship between the laser pattern and the field of view, the break points may be translated into coordinates in a three-dimensional space. Typically, any two break points which are connected by a laser line segment can be used to calculate a dimension of the object.


In an exemplary embodiment, the method includes determining the coordinates of the break points in a three-dimensional space based on the known size of the central rectangle (e.g., a square). In other words, the known size of the rectangle is used as a ruler or measuring stick in the image to determine the dimensions of the object.
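The "ruler" idea above can be sketched as a scale conversion: the constant-size central square fixes a metres-per-pixel factor, which then converts the pixel distance between two break points into a physical dimension. The 50 mm square size and the assumption that both break points lie at roughly the same depth as the square are illustrative simplifications.

```python
def object_dimension(break_pt_a_px, break_pt_b_px,
                     square_size_px, square_size_m=0.050):
    """Convert the pixel distance between two break points into metres.

    Uses the constant-size central square of the collimated laser pattern
    as an in-image ruler. Assumes the break points lie at approximately
    the same distance from the camera as the square; the 50 mm square
    size is an illustrative assumption.
    """
    ax, ay = break_pt_a_px
    bx, by = break_pt_b_px
    pixel_dist = ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
    metres_per_pixel = square_size_m / square_size_px
    return pixel_dist * metres_per_pixel

# Two break points 240 px apart, with the 50 mm square spanning 60 px:
dim = object_dimension((100, 100), (340, 100), 60.0)  # ≈ 0.20 m
```

In practice this would be applied to each pair of break points connected by a laser line segment, yielding one dimension of the box per pair.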


Exemplary methods include projecting a laser pattern including laser lines having a profile with a small divergence angle. In other words, the width of the laser lines increases as the distance from the device projecting the pattern increases. The divergence angle is typically between about 1 and 30 milliradians (e.g., between about 2 and 20 milliradians). In an exemplary embodiment, the divergence angle is between about 3 and 10 milliradians (e.g., about 6 milliradians).


In exemplary embodiments, the laser lines' divergence angle corresponds to the divergence of a small number of pixels (e.g., between about 2 and 10 pixels) within the camera system used to capture an image. Thus, as the field of view of this small number of pixels expands with increasing distance from the camera system, the width of the laser lines increases at a similar rate. Accordingly, the width of the laser lines covers approximately the same number of pixels, although not necessarily the same set of pixels, regardless of the projected laser pattern's distance from the camera system.
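The constant pixel coverage described above follows from small-angle geometry: both the line width and a pixel's footprint grow linearly with distance, so their ratio is distance-independent. The sketch below works through the arithmetic with assumed values (a 6 mrad line divergence and a 2 mrad per-pixel field of view).

```python
def line_width_px(divergence_rad, distance_m, pixel_fov_rad, beam_waist_m=0.0):
    """Approximate how many pixels a diverging laser line spans at a distance.

    Small-angle approximation: line width ≈ waist + divergence * distance,
    and one pixel's footprint ≈ pixel_fov * distance. All parameter values
    used below are illustrative assumptions.
    """
    line_width_m = beam_waist_m + divergence_rad * distance_m
    pixel_footprint_m = pixel_fov_rad * distance_m
    return line_width_m / pixel_footprint_m

# A 6 mrad line viewed by a camera resolving 2 mrad per pixel covers
# about 3 pixels, whether the object is at 1 m or at 4 m:
w_near = line_width_px(0.006, 1.0, 0.002)  # ≈ 3 px
w_far = line_width_px(0.006, 4.0, 0.002)   # ≈ 3 px
```

A nonzero beam waist makes the near-field coverage slightly larger, which is consistent with the far-field matching described in the following embodiment.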


In another exemplary embodiment, the laser pattern includes laser lines having a profile with a divergence angle such that the width of the laser line in the far field corresponds to the field of view of a small number of pixels in the far field. In this regard, the divergence angle of the laser lines does not necessarily match the field of view of the small number of pixels in the near field. FIG. 20 schematically depicts such a relationship between the laser lines' width and the field of view of a small number of pixels within a camera system. The depicted device 6000 includes the camera system and a laser projecting module.


Exemplary methods utilizing a laser pattern that includes laser lines having a profile with a small divergence angle prevent a loss of resolution in the far field. When projected laser lines are conventionally collimated, the laser lines appear increasingly thinner on a target object as the distance between the laser projection module and the target object increases. If the reflected light from a projected laser line falls on an area of the camera system's sensor that is approximately one pixel wide or smaller, the precision of the dimensioning method can be no greater than one pixel. In contrast, when projected laser lines have a profile with a small divergence angle, the projected line has an energy distribution encompassing multiple pixels, facilitating a more precise determination of the center of the projected line. Accordingly, methods employing projected laser lines having a profile with a small divergence angle facilitate measurements that exceed the resolution of the camera pixel sampling.
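A common way to exploit an energy distribution spread over multiple pixels is an intensity-weighted centroid, which locates the line center to a fraction of a pixel. The sketch below is a minimal one-dimensional illustration of that idea; the profile values are made up for the example.

```python
def subpixel_line_center(intensities, start_index=0):
    """Intensity-weighted centroid of a laser-line cross-section.

    With the line's energy spread over several pixels, the centroid gives
    the line center at subpixel precision, illustrating how precision can
    exceed the camera's pixel sampling. A minimal sketch; real systems
    would first subtract background illumination.
    """
    total = sum(intensities)
    if total == 0:
        raise ValueError("no signal in profile")
    centroid = sum(i * v for i, v in enumerate(intensities)) / total
    return start_index + centroid

# A line profile spread over 5 pixels, slightly brighter toward the left
# of center; the centroid lands between pixels 2 and 3:
center = subpixel_line_center([10, 40, 90, 60, 20])
```

Repeating this across every row (or column) the line crosses yields a subpixel trace of the projected line, from which break points can be located.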


In yet another aspect, the present invention embraces a terminal for measuring at least one dimension of an object. The terminal includes a range camera, a visible camera (e.g., a grayscale and/or RGB sensor), and a display that are fixed in position and orientation relative to each other. The range camera is configured to produce a range image of an area in which an object is located, and the visible camera is configured to produce a visible image of an area in which the object is located. The display is configured to present information associated with the range camera's field of view and the visible camera's field of view.


Typically, the range camera's field of view is narrower than the visible camera's field of view. To facilitate accurate dimensioning, the display is configured to present the visible image produced by the visible camera and an outlined shape on the displayed visible image corresponding to the range camera's field of view (e.g., a rectangle). The outlined shape shows the user of the terminal when the object to be dimensioned is within the range camera's field of view. In other words, the interior of the outlined shape typically corresponds to the intersection or overlap between the visible image and the range image.
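The outlined shape can be computed from the two cameras' fields of view. The sketch below assumes, as a simplification of the fixed-mount geometry, that both cameras share an optical center and axis; the image size and FOV angles are illustrative values, and a real terminal would also account for the offset between the two lenses.

```python
import math

def range_fov_outline(img_w, img_h, vis_hfov_deg, vis_vfov_deg,
                      rng_hfov_deg, rng_vfov_deg):
    """Pixel rectangle of the narrower range-camera FOV inside the visible image.

    Assumes coaxial cameras with a shared optical center (an illustrative
    simplification). Returns (left, top, right, bottom) in pixels.
    """
    def half_span(full_px, vis_fov, rng_fov):
        # Fraction of the visible half-angle covered by the range half-angle.
        frac = (math.tan(math.radians(rng_fov) / 2)
                / math.tan(math.radians(vis_fov) / 2))
        return frac * full_px / 2

    cx, cy = img_w / 2, img_h / 2
    hw = half_span(img_w, vis_hfov_deg, rng_hfov_deg)
    hh = half_span(img_h, vis_vfov_deg, rng_vfov_deg)
    return (cx - hw, cy - hh, cx + hw, cy + hh)

# 640x480 visible image with a 60°x45° visible FOV and 40°x30° range FOV:
box = range_fov_outline(640, 480, 60, 45, 40, 30)
```

The returned rectangle would be drawn on the displayed visible image so the user can confirm the object sits inside the range camera's field of view before dimensioning.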


In exemplary embodiments, the display is configured to present information associated with the optimal orientation of the range camera and visible camera with respect to the object. Such information further facilitates accurate dimensioning by encouraging the user to adjust the orientation of the terminal to an orientation that accelerates or improves the dimensioning process.


The display may be configured to present the visible image produced by the visible camera and a symbol on the displayed visible image corresponding to the optical center of the range camera's field of view. Again, presenting such a symbol on the display facilitates accurate dimensioning by encouraging the user to adjust the orientation of the terminal to an orientation that accelerates or improves the dimensioning process.


In exemplary embodiments, the symbol shown by the display is a crosshair target having three prongs. When the object is a rectangular box, the display may be configured to show the three prongs of the crosshairs on the displayed visible image in an orientation that corresponds to the optimal orientation of the range camera and visible camera with respect to a corner of the rectangular box.


When the object to be dimensioned is cylindrically shaped (e.g., having a medial axis and base), the display may be configured to show the visible image produced by the visible camera and a line on the displayed visible image in an orientation that corresponds to the optimal orientation of the range camera and visible camera with respect to the medial axis of the object. The display may also be configured to show the visible image produced by the visible camera and an ellipse on the displayed visible image in an orientation that corresponds to the optimal orientation of the range camera and visible camera with respect to the base of the object.


As noted, the configuration of the terminal's display presents information associated with the range camera's field of view and the visible camera's field of view. The information helps the user determine the three degrees of freedom for rotation and/or the three degrees of freedom for translation of the camera relative to the object that will ensure or at least facilitate an accurate measurement of the object.


In exemplary embodiments, the terminal may include a processor that is configured to automatically initiate a dimensioning method when the orientation of the terminal with respect to an object corresponds to an orientation that accelerates or improves the dimensioning process. Automatically initiating the dimensioning method in this manner prevents any undesirable motion of the terminal that may be induced when an operator presses a button or other input device on the terminal. Additionally, automatically initiating the dimensioning method typically improves the accuracy of the dimensioning method.


As noted, the terminal's display may be configured to present information associated with the optimal orientation of the range camera and visible camera with respect to the object. The terminal's processor may be configured to analyze the output of the display (i.e., the visible image and the information associated with the optimal orientation) and initiate the dimensioning method (e.g., including capturing a range image) when the orientation information and the visible image align. The terminal's processor may be configured to analyze the output of the display using image-based edge detection methods (e.g., a Canny edge detector).


For example, if the orientation information presented by the display is a crosshair target having three prongs, the processor may be configured to analyze the output of the display using edge detection methods and, when the combined edge strengths of the three prongs and three of the object's edges (i.e., at a corner) exceed a threshold, the processor automatically initiates a dimensioning method. In other words, when the three prongs align with the object's edges, the processor automatically initiates a dimensioning method. Typically, the edge detection methods are only applied in the central part of the display's output image (i.e., near the displayed orientation information) to reduce the amount of computation.
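A minimal sketch of the auto-trigger decision described above, assuming edge strengths have already been sampled (e.g., by a Canny or Sobel pass over the central region of the displayed image) along each of the three prongs. The function name and threshold value are illustrative assumptions, not from the patent.

```python
def should_auto_trigger(edge_strengths, threshold=2.0):
    """Return True when the combined edge strengths measured under the
    three crosshair prongs exceed a threshold, i.e., when the prongs
    appear aligned with the three edges meeting at a box corner.
    """
    return sum(edge_strengths) >= threshold

# Prongs well aligned with a box corner's three edges:
aligned = should_auto_trigger([0.9, 0.85, 0.8])    # strong response on all prongs
# Weak edge response under two prongs -- keep waiting:
misaligned = should_auto_trigger([0.9, 0.2, 0.1])
```

Restricting the edge-detection pass to the central region of the display, as the text notes, keeps the per-frame cost of this check low.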


In exemplary embodiments, the display is configured to present information associated with the optimal distance of the terminal from the object. Such information further facilitates accurate dimensioning by encouraging the user to position the terminal at a distance from the object that accelerates or improves the dimensioning process. For example, the range camera of the terminal typically has a shorter depth of view than does the visible camera. Additionally, when objects are very close to the terminal the range camera typically does not work as accurately, but the visible camera functions normally. Thus, when viewing the visible image produced by the visible camera on the display, objects outside of the range camera's optimal range (i.e., either too close or too far from the terminal to accurately determine the object's dimensions) appear normal.


Accordingly, the display may be configured to present the visible image produced by the visible camera modified such that portions of the visible image corresponding to portions of the range image with high values (e.g., distances beyond the range camera's optimal range) are degraded (e.g., a percentage of the pixels corresponding to the range image's high values are converted to a different color, such as white or grey). The amount of degradation (e.g., the percentage of pixels converted) typically corresponds to the range image's value beyond the upper end of the range camera's optimal range. In other words, the amount of degradation occurs such that the clarity of objects in the displayed visible image corresponds to the range camera's ability to determine the object's dimensions. The amount of degradation may begin at a low level corresponding to a threshold distance from the terminal and increase linearly up to a maximum distance, after which the degradation is such that the visible image is no longer displayed (e.g., only grey or white is depicted).


Similarly, the display may be configured to present the visible image produced by the visible camera modified such that portions of the visible image corresponding to portions of the range image with low values (e.g., distances less than the range camera's optimal range) are degraded (e.g., a percentage of the pixels corresponding to the range image's low values are converted to a different color, such as black or grey). The amount of degradation (e.g., the percentage of pixels converted) may correspond to the range image's value under the lower end of the range camera's optimal range. Typically, the degradation is complete (i.e., only black or grey) if the range image's value is less than the lower end of the range camera's optimal range. Additional aspects of an exemplary terminal and dimensioning method are described herein with respect to FIGS. 4-16.
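The two-sided, distance-dependent degradation described above can be sketched as a per-pixel function of depth. The near/far limits and the width of the linear fade-out ramp below are illustrative assumptions (the patent does not give specific distances).

```python
def degradation_fraction(depth_mm, near=750, far=4000, fade=1000):
    """Fraction of a pixel's degradation given its depth in millimeters.

    Inside the optimal range [near, far] nothing is degraded; beyond
    `far` degradation rises linearly, reaching 1.0 (fully grey/white)
    at far + fade; closer than `near` degradation is complete
    (black/grey), matching the behavior described in the text.
    """
    if depth_mm < near:
        return 1.0   # too close: complete degradation
    if depth_mm <= far:
        return 0.0   # within the optimal range: shown normally
    return min(1.0, (depth_mm - far) / fade)  # linear ramp beyond the range
```

Applying this fraction per pixel (e.g., as the probability of converting that pixel to grey) makes the displayed clarity track the range camera's ability to dimension the object at that distance.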


An exemplary method of determining the dimensions of an object using a range camera is described in U.S. patent application Ser. No. 13/278,559 filed at the U.S. Patent and Trademark Office on Oct. 21, 2011 and titled “Determining Dimensions Associated with an Object,” which is hereby incorporated by reference in its entirety.


In this regard, devices, methods, and systems for determining dimensions associated with an object are described herein. For example, one or more embodiments include a range camera configured to produce a range image of an area in which the object is located, and a computing device configured to determine the dimensions of the object based, at least in part, on the range image.


One or more embodiments of the present disclosure can increase the automation involved in determining the dimensions associated with (e.g., of) an object (e.g., a box or package to be shipped by a shipping company). For example, one or more embodiments of the present disclosure may not involve an employee of the shipping company physically contacting the object during measurement (e.g., may not involve the employee manually measuring the object and/or manually entering the measurements into a computing system) to determine its dimensions. Accordingly, one or more embodiments of the present disclosure can decrease and/or eliminate the involvement of an employee of the shipping company in determining the dimensions of the object. This can, for example, increase the productivity of the employee, decrease the amount of time involved in determining the object's dimensions, reduce and/or eliminate errors in determining the object's dimensions (e.g., increase the accuracy of the determined dimensions), and/or enable a customer to check in and/or pay for a package's shipping at an automated station (e.g., without the help of an employee), among other benefits.


In the following description, reference is made to FIGS. 2 and 3 that form a part hereof. The drawings show by way of illustration how one or more embodiments of the disclosure may be practiced. These embodiments are described in sufficient detail to enable those of ordinary skill in the art to practice one or more embodiments of this disclosure. It is to be understood that other embodiments may be utilized and that process, electrical, and/or structural changes may be made without departing from the scope of the present disclosure.


As will be appreciated, elements shown in the various embodiments herein can be added, exchanged, combined, and/or eliminated so as to provide a number of additional embodiments of the present disclosure. The proportion and the relative scale of the elements provided in FIGS. 2 and 3 are intended to illustrate the embodiments of the present disclosure, and should not be taken in a limiting sense. As used in the disclosure of this exemplary dimensioning method, “a” or “a number of” something can refer to one or more such things. For example, “a number of planar regions” can refer to one or more planar regions.



FIG. 2 illustrates a system 114 for determining dimensions associated with (e.g., of) an object 112 in accordance with one or more embodiments of the present disclosure of this exemplary dimensioning method. In the embodiment illustrated in FIG. 2, object 112 is a rectangular shaped box (e.g., a rectangular shaped package). However, embodiments of the present disclosure are not limited to a particular object shape, object scale, or type of object. For example, in some embodiments, object 112 can be a cylindrical shaped package. As an additional example, object 112 could be a rectangular shaped box with one or more arbitrarily damaged faces.


As shown in FIG. 2, system 114 includes a range camera 102 and a computing device 104. In the embodiment illustrated in FIG. 2, range camera 102 is separate from computing device 104 (e.g., range camera 102 and computing device 104 are separate devices). However, embodiments of the present disclosure are not so limited. For example, in some embodiments, range camera 102 and computing device 104 can be part of the same device (e.g., range camera 102 can include computing device 104, or vice versa). Range camera 102 and computing device 104 can be coupled by and/or communicate via any suitable wired or wireless connection (not shown in FIG. 2).


As shown in FIG. 2, computing device 104 includes a processor 106 and a memory 108. Memory 108 can store executable instructions, such as, for example, computer readable instructions (e.g., software), that can be executed by processor 106. Although not illustrated in FIG. 2, memory 108 can be coupled to processor 106.


Memory 108 can be volatile or nonvolatile memory. Memory 108 can also be removable (e.g., portable) memory, or non-removable (e.g., internal) memory. For example, memory 108 can be random access memory (RAM) (e.g., dynamic random access memory (DRAM) and/or phase change random access memory (PCRAM)), read-only memory (ROM) (e.g., electrically erasable programmable read-only memory (EEPROM) and/or compact-disc read-only memory (CD-ROM)), flash memory, a laser disc, a digital versatile disc (DVD) or other optical disk storage, and/or a magnetic medium such as magnetic cassettes, tapes, or disks, among other types of memory.


Further, although memory 108 is illustrated as being located in computing device 104, embodiments of the present disclosure are not so limited. For example, memory 108 can also be located internal to another computing resource (e.g., enabling computer readable instructions to be downloaded over the Internet or another wired or wireless connection).


In some embodiments, range camera 102 can be part of a handheld and/or portable device, such as a barcode scanner. In some embodiments, range camera 102 can be mounted on a tripod.


Range camera 102 can produce (e.g., capture, acquire, and/or generate) a range image of an area (e.g., scene). Range camera 102 can produce the range image of the area using, for example, structured near-infrared (near-IR) illumination, among other techniques for producing range images.


The range image can be a two-dimensional image that shows the distance to different points in the area from a specific point (e.g., from the range camera). The distance can be conveyed in real-world units (e.g., metric units such as meters or millimeters), or the distance can be an integer value (e.g., 11-bit) that can be converted to real-world units. The range image can be a two-dimensional matrix with one channel that can hold integers or floating point values. For instance, the range image can be visualized as different black and white shadings (e.g., different intensities, brightnesses, and/or darknesses) and/or different colors in any color space (e.g., RGB or HSV) that correspond to different distances between the range camera and different points in the area.


For example, range camera 102 can produce a range image of an area (e.g., area 110 illustrated in FIG. 2) in which object 112 is located. That is, range camera 102 can produce a range image of an area that includes object 112.


Range camera 102 can be located a distance d from object 112 when range camera 102 produces the range image, as illustrated in FIG. 2. Distance d can be, for instance, 0.75 to 5.0 meters. However, embodiments of the present disclosure are not limited to a particular distance between range camera 102 and object 112.


The range image produced by range camera 102 can be visualized as black and white shadings corresponding to different distances between range camera 102 and different portions of object 112. For example, the darkness of the shading can increase as the distance between range camera 102 and the different portions of object 112 decreases (e.g., the closer a portion of object 112 is to range camera 102, the darker the portion will appear in the range image). Additionally and/or alternatively, the range image can be visualized as different colors corresponding to the different distances between range camera 102 and the different portions of object 112. Computing device 104 can determine the dimensions (e.g., the length, width, height, diameter, etc.) of object 112 based, at least in part, on the range image produced by range camera 102. For instance, processor 106 can execute executable instructions stored in memory 108 to determine the dimensions of object 112 based, at least in part, on the range image.
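The darker-is-closer visualization described above can be sketched as a simple linear mapping from depth to 8-bit shading. This is an illustrative sketch only; the function name and the clipping behavior are assumptions.

```python
import numpy as np

def range_to_grayscale(range_img, d_min, d_max):
    """Map a single-channel range image (distances) to 8-bit shading so
    that closer points appear darker. Distances outside [d_min, d_max]
    are clipped to the ends of the scale.
    """
    r = np.clip(np.asarray(range_img, dtype=float), d_min, d_max)
    norm = (r - d_min) / (d_max - d_min)   # 0 = closest, 1 = farthest
    return (norm * 255).astype(np.uint8)   # darker shading = closer point
```

A color visualization (e.g., mapping `norm` through an RGB or HSV colormap) works the same way, substituting a palette lookup for the grayscale multiply.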


For example, computing device 104 can identify a number of planar regions in the range image produced by range camera 102. The identified planar regions may include planar regions that correspond to object 112 (e.g., to surfaces of object 112). That is, computing device 104 can identify planar regions in the range image that correspond to object 112. For instance, in embodiments in which object 112 is a rectangular shaped box (e.g., the embodiment illustrated in FIG. 2), computing device 104 can identify two or three mutually orthogonal planar regions that correspond to surfaces (e.g., faces) of object 112 (e.g., the three surfaces of object 112 shown in FIG. 2).


Once the planar regions that correspond to object 112 have been identified, computing device 104 can determine the dimensions of object 112 based, at least in part, on the identified planar regions (e.g., on the dimensions of the identified planar regions). For example, computing device 104 can determine the dimensions of the planar regions that correspond to object 112. For instance, computing device 104 can determine the dimensions of the planar regions that correspond to object 112 based, at least in part, on the distances of the planar regions within the range image. Computing device 104 can then determine the dimensions of object 112 based, at least in part, on the dimensions of the planar regions.


Computing device 104 can identify the planar regions in the range image that correspond to object 112 by, for example, determining (e.g., calculating) coordinates (e.g., real-world x, y, z coordinates in millimeters) for each point (e.g., each row, column, and depth tuple) in the range image. Intrinsic calibration parameters associated with range camera 102 can be used to convert each point in the range image into the real-world coordinates. The system can undistort the range image using, for example, the distortion coefficients for the camera to correct for radial, tangential, and/or other types of lens distortion. In some embodiments, the two-dimensional matrix of the real-world coordinates may be downsized by a factor between 0.25 and 0.5.
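The conversion of range-image points into real-world coordinates using intrinsic calibration parameters can be sketched with the standard pinhole-camera back-projection. The patent does not give its exact parameterization; the pinhole model below (focal lengths `fx`, `fy` in pixels, principal point `cx`, `cy`) is a common assumption, and the sketch presumes the range image has already been undistorted.

```python
import numpy as np

def depth_to_xyz(depth, fx, fy, cx, cy):
    """Convert a range image (depth per pixel, e.g., in millimeters) into
    an H x W x 3 matrix of real-world (x, y, z) coordinates using pinhole
    intrinsics. Each pixel's (row, col, depth) tuple becomes one point.
    """
    depth = np.asarray(depth, dtype=float)
    rows, cols = np.indices(depth.shape)
    x = (cols - cx) * depth / fx   # lateral offset scales with depth
    y = (rows - cy) * depth / fy
    return np.dstack((x, y, depth))
```

The resulting coordinate matrix is what would then be downsized (e.g., by a factor between 0.25 and 0.5, per the text) before plane fitting.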


Computing device 104 can then build a number of planar regions through the determined real-world coordinates. For example, a number of planar regions can be built near the points, wherein the planar regions may include planes of best fit to the points. Computing device 104 can retain the planar regions that are within a particular (e.g., pre-defined) size and/or a particular portion of the range image. The planar regions that are not within the particular size or the particular portion of the range image can be disregarded.
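Building a plane of best fit through a neighborhood of points can be sketched with a least-squares fit via singular value decomposition. The patent does not specify the fitting method; SVD on the centered points is one standard choice.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a set of 3-D points.

    Returns (centroid, unit_normal): the plane passes through the
    centroid, and the normal is the direction of least variance
    (last right-singular vector of the centered point cloud).
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)
```

The residual distance of each point from the fitted plane gives a natural criterion for the later refinement step, which keeps only points lying within an upper bound of the plane.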


Computing device 104 can then upsample each of the planar regions (e.g., the mask of each of the planar regions) that are within the particular size and/or the particular portion of the range image to fit in an image of the original (e.g., full) dimensions of the range image. Computing device 104 can then refine the planar regions to include only points that lie within an upper bound from the planar regions.


Computing device 104 can then fit a polygon to each of the planar regions that are within the particular size and/or the particular portion of the range image, and retain the planar regions whose fitted polygon has four vertices and is convex. These retained planar regions are the planar regions that correspond to object 112 (e.g., to surfaces of object 112). The planar regions whose fitted polygon does not have four vertices and/or is not convex can be disregarded. Computing device 104 can also disregard the planar regions in the range image that correspond to the ground plane and background clutter of area 110.
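The retention test described above (a fitted polygon with four vertices that is convex) can be sketched as follows; a quadrilateral is convex when the cross products of all consecutive edge pairs share a sign. Function and variable names are illustrative.

```python
def is_convex_quad(poly):
    """Return True when poly is a convex quadrilateral.

    poly: sequence of (x, y) vertices in traversal order. Convexity
    holds when every turn along the boundary bends the same way, i.e.,
    all nonzero cross products of consecutive edges share a sign.
    """
    if len(poly) != 4:
        return False
    signs = set()
    for i in range(4):
        ax, ay = poly[i]
        bx, by = poly[(i + 1) % 4]
        cx, cy = poly[(i + 2) % 4]
        cross = (bx - ax) * (cy - by) - (by - ay) * (cx - bx)
        if cross != 0:
            signs.add(cross > 0)
    return len(signs) == 1
```

Planar regions whose fitted polygon fails this test (wrong vertex count, or a concave turn) would be disregarded, per the text.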


Computing device 104 can disregard (e.g., ignore) edge regions in the range image that correspond to the edges of area 110 while identifying the planar regions in the range image that correspond to object 112. For example, computing device 104 can run a three dimensional edge detector on the range image before identifying planar regions in the range image, and can then disregard the detected edge regions while identifying the planar regions. The edge detection can also identify non-uniform regions that can be disregarded while identifying the planar regions.


Once the planar regions that correspond to object 112 have been identified, computing device 104 can determine the dimensions of object 112 based, at least in part, on the identified planar regions (e.g., on the dimensions of the identified planar regions). For example, computing device 104 can determine the dimensions of object 112 by arranging the identified planar regions (e.g., the planar regions whose fitted polygon has four vertices and is convex) into a shape corresponding to the shape of object 112, and determining a measure of centrality (e.g., an average) for the dimensions of clustered edges of the arranged shape. The dimensions of the edges of the arranged shape correspond to the dimensions of object 112.


Once the arranged shape (e.g., the bounding volume of the object) is constructed, computing device 104 can perform (e.g., run) a number of quality checks. For example, in embodiments in which object 112 is a rectangular shaped box, computing device 104 can determine whether the identified planar regions fit together into a rectangular arrangement that approximates a true rectangular box within (e.g., below) a particular error threshold.


In some embodiments, computing device 104 can include a user interface (not shown in FIG. 2). The user interface can include, for example, a screen that can provide (e.g., display and/or present) information to a user of computing device 104. For example, the user interface can provide the determined dimensions of object 112 to a user of computing device 104.


In some embodiments, computing device 104 can determine the volume of object 112 based, at least in part, on the determined dimensions of object 112. Computing device 104 can provide the determined volume to a user of computing device 104 via the user interface.



FIG. 3 illustrates a method 220 for determining dimensions associated with (e.g., of) an object in accordance with one or more embodiments of the present disclosure. The object can be, for example, object 112 previously described in connection with FIG. 2. Method 220 can be performed, for example, by computing device 104 previously described in connection with FIG. 2.


At block 222, method 220 includes capturing a range image of a scene that includes the object. The range image can be, for example, analogous to the range image previously described in connection with FIG. 2 (e.g., the range image of the scene can be analogous to the range image of area 110 illustrated in FIG. 2), and the range image can be captured in a manner analogous to that previously described in connection with FIG. 2.


At block 224, method 220 includes determining the dimensions (e.g., the length, width, height, diameter, etc.) associated with the object based, at least in part, on the range image. For example, the dimensions associated with (e.g., of) the object can be determined in a manner analogous to that previously described in connection with FIG. 2. In some embodiments, the volume of the object can be determined based, at least in part, on the determined dimensions associated with the object.


As an additional example, determining the dimensions associated with the object can include determining the dimensions of the smallest volume rectangular box large enough to contain the object based, at least in part, on the range image. The dimensions of the smallest volume rectangular box large enough to contain the object can be determined by, for example, determining and disregarding (e.g., masking out) the portion (e.g., part) of the range image containing information (e.g., data) associated with (e.g., from) the ground plane of the scene that includes the object, determining (e.g., finding) the height of a plane that is parallel to the ground plane and above which the object does not extend, projecting additional (e.g., other) portions of the range image on the ground plane, and determining (e.g., estimating) a bounding rectangle of the projected portions of the range image on the ground plane.
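The steps above (mask out the ground plane, find the height plane, project the remainder onto the ground, bound the projection with a rectangle) can be sketched as follows. This sketch assumes points are (x, y, z) with z measured up from a ground plane at `ground_z`, uses an axis-aligned bounding rectangle for simplicity (the patent's bounding rectangle need not be axis aligned), and the tolerance value is an illustrative assumption.

```python
import numpy as np

def bounding_box_dims(points, ground_z=0.0, tol=5.0):
    """Dimensions of the smallest axis-aligned rectangular box large
    enough to contain the object, following the projection method
    described above. tol (same units as the points) masks out points
    lying on or near the ground plane.
    """
    pts = np.asarray(points, dtype=float)
    obj = pts[pts[:, 2] > ground_z + tol]   # disregard the ground plane
    height = obj[:, 2].max() - ground_z     # plane above which the object does not extend
    xy = obj[:, :2]                         # project remaining points onto the ground
    width, depth = xy.max(axis=0) - xy.min(axis=0)
    return float(width), float(depth), float(height)
```

Replacing the axis-aligned bound with a minimum-area rotated rectangle (e.g., via a convex hull of `xy`) would tighten the estimate for objects not aligned with the coordinate axes.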


Although specific embodiments have been illustrated and described herein, those of ordinary skill in the art will appreciate that any arrangement calculated to achieve the same techniques can be substituted for the specific embodiments shown. This disclosure of exemplary methods of determining the dimensions of an object is intended to cover any and all adaptations or variations of various embodiments of the disclosure.


An exemplary method of determining the dimensions of an object and an exemplary terminal for dimensioning objects are described in U.S. patent application Ser. No. 13/471,973 filed at the U.S. Patent and Trademark Office on May 15, 2012 and titled “Terminals and Methods for Dimensioning Objects,” which is hereby incorporated by reference in its entirety.



FIG. 4 illustrates one embodiment of a terminal 1000 operable for measuring at least one dimension of an object 10 in accordance with aspects of the present invention. For example, terminal 1000 may determine a height H, a width W, and a depth D of an object. In addition, terminal 1000 may be operable to read a decodable indicia 15 such as a barcode disposed on the object. For example, the terminal may be suitable for shipping applications in which an object such as a package is subject to shipping from one location to another location. The dimension (dimensioning) information and other measurement (e.g., volume measurement information) respecting object 10 may be used, e.g., to determine a cost for shipping a package or for determining a proper arrangement of the package in a shipping container.


In one embodiment, a terminal in accordance with aspects of the present invention may include at least one or more imaging subsystems such as one or more camera modules and an actuator to adjust the pointing angle of the one or more camera modules to provide true stereo imaging. The terminal may be operable to attempt to determine at least one of a height, a width, and a depth based on effecting the adjustment of the pointing angle of the one or more camera modules.


For example, a terminal in accordance with aspects of the present invention may include at least one or more imaging subsystems such as camera modules and an actuator based on wires of nickel-titanium shape memory alloy (SMA) and an associated control and heating ASIC (application-specific integrated circuit) to adjust the pointing angle of the one or more camera modules to provide true stereo imaging. Using true stereo imaging, the distance to the package can be determined by measuring the amount of drive current or voltage drop across the SMA actuator. The terminal may be operable to attempt to determine at least one of a height, a width, a depth, based on the actuator effecting the adjustment of the pointing angle of the one or more camera modules, the measured distance, and the obtained image of the object.


With reference still to FIG. 4, terminal 1000 in one embodiment may include a trigger 1220, a display 1222, a pointer mechanism 1224, and a keyboard 1226 disposed on a common side of a hand held housing 1014. Display 1222 and pointer mechanism 1224 in combination can be regarded as a user interface of terminal 1000. Terminal 1000 may incorporate a graphical user interface and may present buttons 1230, 1232, and 1234 corresponding to various operating modes such as a setup mode, a spatial measurement mode, and an indicia decode mode, respectively. Display 1222 in one embodiment can incorporate a touch panel for navigation and virtual actuator selection in which case a user interface of terminal 1000 can be provided by display 1222. Hand held housing 1014 of terminal 1000 can in another embodiment be devoid of a display and can be in a gun style form factor. The terminal may be an indicia reading terminal and may generally include hand held indicia reading terminals, fixed indicia reading terminals, and other terminals. Those of ordinary skill in the art will recognize that the present invention is applicable to a variety of other devices having an imaging subassembly which may be configured as, for example, mobile phones, cell phones, satellite phones, smart phones, telemetric devices, personal data assistants, and other devices.



FIG. 5 depicts a block diagram of one embodiment of terminal 1000. Terminal 1000 may generally include at least one imaging subsystem 900, an illumination subsystem 800, hand held housing 1014, a memory 1085, and a processor 1060. Imaging subsystem 900 may include an imaging optics assembly 200 operable for focusing an image onto an image sensor pixel array 1033. An actuator 950 is operably connected to imaging subsystem 900 for moving imaging subsystem 900 and operably connected to processor 1060 (FIG. 5) via interface 952. Hand held housing 1014 may encapsulate illumination subsystem 800, imaging subsystem 900, and actuator 950. Memory 1085 is capable of storing and/or capturing a frame of image data, in which the frame of image data may represent light incident on image sensor array 1033. After an exposure period, a frame of image data can be read out. Analog image signals that are read out of array 1033 can be amplified by gain block 1036, converted into digital form by an analog-to-digital converter 1037, and sent to DMA unit 1070. DMA unit 1070, in turn, can transfer digitized image data into volatile memory 1080. Processor 1060 can address one or more frames of image data retained in volatile memory 1080 for processing of the frames for determining one or more dimensions of the object and/or for decoding of decodable indicia represented on the object.



FIG. 6 illustrates one embodiment of the imaging subsystem employable in terminal 1000. In this exemplary embodiment, an imaging subsystem 2900 may include a first fixed imaging subsystem 2210, and a second movable imaging subsystem 2220. An actuator 2300 may be operably connected to imaging subsystem 2220 for moving imaging subsystem 2220. First fixed imaging subsystem 2210 is operable for obtaining a first image or frame of image data of the object, and second movable imaging subsystem 2220 is operable for obtaining a second image or frame of image data of the object. Actuator 2300 is operable to bring the second image into alignment with the first image as described in greater detail below. In addition, either the first fixed imaging subsystem 2210 or the second movable imaging subsystem 2220 may also be employed to obtain an image of decodable indicia 15 (FIG. 4) such as a decodable barcode.



FIGS. 6-10 illustrate one embodiment of the terminal in a spatial measurement mode. For example, a spatial measurement mode may be made active by selection of button 1232 (FIG. 4). In a spatial measurement operating mode, terminal 1000 (FIG. 4) can perform one or more spatial measurements, e.g., measurements to determine one or more of a terminal to target distance (z distance) or a dimension (e.g., h, w, d) of an object or another spatial related measurement (e.g., a volume measurement, a distance measurement between any two points).


Initially, at block 602 as shown in FIG. 7, terminal 1000 may obtain or capture first image data, e.g., at least a portion of a frame of image data such as a first image 100 using fixed imaging subsystem 2210 (FIG. 6) within a field of view 20 (FIGS. 4 and 8). For example, a user may operate terminal 1000 to display object 10 using fixed imaging subsystem 2210 (FIG. 6) in the center of display 1222 as shown in FIG. 9. Terminal 1000 can be configured so that block 602 is executed responsively to trigger 1220 (FIG. 4) being initiated. Imaging the object generally in the center of the display results when the object is aligned with an imaging axis or optical axis 2025 of fixed imaging subsystem 2210. For example, the optical axis may be a line or an imaginary line that defines the path along which light propagates through the system. The optical axis may pass through the center of curvature of the imaging optics assembly and may be coincident with a mechanical axis of imaging subsystem 2210.


With reference again to FIG. 7, at 604, terminal 1000 may be adapted to move an optical axis 2026 (FIG. 6) of movable imaging subsystem 2220 (FIG. 6) using actuator 2300 (FIG. 6) to align second image data, e.g., at least a portion of a frame of image data such as a second image 120 using movable imaging subsystem 2220 (FIG. 6) within a field of view 20 (FIGS. 4 and 10) with the first image data. As shown in FIG. 6, optical axis 2026 of imaging subsystem 2220 may be pivoted, tilted or deflected, for example in the direction of double-headed arrow R1 in response to actuator 2300 to align the second image of the object with the object in the first image.


For example, the terminal may include a suitable software program employing a subtraction routine to determine when the image of the object in the second image data is aligned with the image of the object in the first image data. Subtracting the two images, e.g., subtracting the amplitudes of corresponding pixels of the two imagers, yields a residual that becomes smaller as the images align and match. The entire images of the object may be compared, or only a portion of each image of the object may be compared. Thus, the better the images of the object are aligned, the smaller the subtracted difference will be.
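The subtraction routine described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: grayscale images are assumed to be lists of pixel rows, and a search over a handful of candidate horizontal shifts stands in for the actuator adjustment.

```python
# Sketch of the subtraction-based alignment test described above
# (hypothetical helper names; the patent does not specify an API).

def alignment_error(first_image, second_image):
    """Sum of absolute per-pixel amplitude differences between two
    equally sized grayscale images (lists of rows of pixel values).
    A smaller result means the two images are better aligned."""
    return sum(
        abs(a - b)
        for row_a, row_b in zip(first_image, second_image)
        for a, b in zip(row_a, row_b)
    )

def best_shift(first_image, second_image, shifts):
    """Try candidate horizontal pixel shifts of the second image and
    return the shift whose subtraction residual is smallest, mimicking
    the search for the aligned actuator position."""
    def shifted(img, s):
        width = len(img[0])
        return [[row[(x - s) % width] for x in range(width)] for row in img]
    return min(shifts, key=lambda s: alignment_error(first_image, shifted(second_image, s)))
```

In a real terminal the "shift" would be the actuator's pivot of the movable subsystem's optical axis rather than a software translation, but the acceptance criterion, a minimal subtraction residual, is the same.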


As shown in FIG. 7, at 606, an attempt is made to determine at least one of a height, a width, and a depth dimension of the object based on moving the optical axis of the movable imaging subsystem to align the image of the object in the second image data with the image of the object in the first image data. For example, the angle of the optical axis is related to the distance between the terminal and the object, and that angle and/or distance may be used in combination with the number of pixels the object occupies on the image sensor array to determine the dimensions of the object.


With reference again to FIG. 6, the angle of the optical axis of the movable imaging subsystem relative to the terminal is related both to the distance from the movable imaging subsystem (e.g., the front of its image sensor array) to the object (e.g., a front surface, point, or edge) and to the distance from the fixed imaging subsystem (e.g., the front of its image sensor array) to the object.


For example, the relationship between an angle Θ of the optical axis of the movable imaging subsystem relative to the terminal, a distance A from the fixed imaging subsystem to the object, and a distance C between the fixed imaging subsystem and the movable imaging subsystem may be expressed as follows:

tan Θ=A/C.


The relationship between angle Θ of the optical axis of the movable imaging subsystem relative to the terminal, a distance B from the movable imaging subsystem to the object, and distance C between the fixed imaging subsystem and the movable imaging subsystem may be expressed as follows:

cos Θ=C/B.
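The two relations above can be combined to recover the terminal-to-object distances from the measured axis angle. The sketch below is an illustration only (the function name and argument names are assumptions, not from the patent): given the angle Θ and the fixed baseline C between the two subsystems, it returns A from tan Θ = A/C and B from cos Θ = C/B.

```python
import math

# Sketch of the triangulation relationships above: C is the fixed
# baseline between the two imaging subsystems, theta_rad the angle of
# the movable subsystem's optical axis relative to the terminal.

def distances_from_angle(theta_rad, baseline_c):
    """Return (A, B): the distance A implied by tan Θ = A/C and the
    distance B implied by cos Θ = C/B."""
    a = baseline_c * math.tan(theta_rad)
    b = baseline_c / math.cos(theta_rad)
    return a, b
```

For a 3-4-5 right triangle (Θ = atan(4/3), C = 3) this yields A = 4 and B = 5, which is a quick consistency check on the two formulas.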


With reference to FIG. 11, the actual size of an object relative to the size of the object observed on an image sensor array may be generally defined as follows:

h/f = H/D,

where h is the dimension of the object (such as height) as imaged on the image sensor array, f is the focal length of the imaging optics lens, H is the corresponding dimension (such as height) of the actual object, and D is the distance from the object to the imaging optics lens.


With reference to measuring, for example, a height dimension: knowing the vertical size of the imaging sensor (e.g., its height in millimeters or inches) and the number of pixels disposed vertically along the imaging sensor, the height of the image of the object occupying a portion of the imaging sensor is related to the ratio of the number of pixels forming the imaged object to the total number of pixels disposed vertically along the image sensor.


For example, a height of an observed image on the imaging sensor may be determined as follows:

h = [observed object image height (pixels) / height of sensor (pixels)] × height of sensor (e.g., in inches).
In one embodiment, an actual height measurement may be determined as follows:

H = D × h / f.
For example, where an observed image of the object is 100 pixels high, and a distance D is 5 feet, the actual object height would be greater than when the observed image of the object is 100 pixels high, and a distance D is 2 feet. Other actual dimensions (e.g., width and depth) of the object may be similarly obtained.
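The pixel-ratio and similar-triangles steps above can be sketched together as follows. This is a minimal illustration with invented helper names and numbers; the sensor size, focal length, and distances are example values, not figures from the patent.

```python
# Worked sketch of the relations above: first convert the object's
# pixel span to a physical image height h on the sensor, then apply
# H = D × h / f. (All numeric values below are illustrative.)

def image_height_on_sensor(object_pixels, sensor_pixels, sensor_height):
    """h = (pixels spanned by the object / total vertical pixels) × physical sensor height."""
    return (object_pixels / sensor_pixels) * sensor_height

def actual_height(distance_d, image_h, focal_length_f):
    """H = D × h / f, from the similar-triangles relation h/f = H/D."""
    return distance_d * image_h / focal_length_f
```

With a 480-pixel, 4.8 mm tall sensor, an object spanning 100 pixels gives h = 1.0 mm; holding h and f fixed, a 5-foot distance D yields a larger actual height H than a 2-foot distance, matching the example in the text.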


From the present description, it will be appreciated that the terminal may be set up using a suitable setup routine, accessed by a user or by a manufacturer, that correlates a predetermined actual object with dimensioning at various distances, e.g., records the voltage or current reading required for the actuator to align the object in the second image with the image of the object in the first image, to create a lookup table. Alternatively, suitable programming or algorithms employing, for example, the relationships described above may be employed to determine actual dimensions based on the number of pixels observed on the imaging sensor. In addition, suitable edge detection or shape identifier algorithms or processing may be employed in analyzing standard objects, e.g., boxes, cylindrical tubes, triangular packages, etc., to determine and/or confirm dimensional measurements.
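A setup routine of the kind described could populate a lookup table mapping actuator drive level to distance. The sketch below is hypothetical (the calibration values and function name are invented for the example) and uses simple linear interpolation between recorded calibration points.

```python
# Hypothetical lookup table built by a calibration routine: at several
# known object distances, the actuator drive level that aligned the
# second image was recorded. Later readings are converted back to a
# distance by linear interpolation. Values are illustrative only.

CALIBRATION = [  # (actuator drive in volts, distance in inches)
    (0.5, 12.0),
    (1.0, 24.0),
    (1.5, 48.0),
]

def distance_from_drive(volts):
    """Interpolate a terminal-to-object distance from an actuator
    drive reading, clamping outside the calibrated range."""
    points = sorted(CALIBRATION)
    if volts <= points[0][0]:
        return points[0][1]
    if volts >= points[-1][0]:
        return points[-1][1]
    for (v0, d0), (v1, d1) in zip(points, points[1:]):
        if v0 <= volts <= v1:
            return d0 + (d1 - d0) * (volts - v0) / (v1 - v0)
```

A real table would hold many more points (and possibly per-unit factory calibration), but the interpolation step would be similar.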



FIG. 12 illustrates another embodiment of an imaging subsystem employable in terminal 1000 (FIG. 4). Alignment of the second image may also be accomplished using an image pattern P projected from an aimer onto the object to determine the dimensions of the object. Upon activating the terminal, an aimer such as a laser aimer may project an aimer pattern onto the object. The projected aimer pattern may be a dot, a point, or another pattern. The imaged object with the dot in the second image may then be aligned, e.g., the actuator operates to move the movable imaging subsystem so that the laser dot in the second image aligns with the laser dot in the first image. The aimer pattern may also be orthogonal lines or a series of dots that a user may align adjacent to or along one or more sides or edges, such as orthogonal sides or edges, of the object.


In this exemplary embodiment, an imaging subsystem 3900 may include a first fixed imaging subsystem 3210, and a second movable imaging subsystem 3220. In addition, terminal 1000 (FIG. 4) may include an aiming subsystem 600 (FIG. 5) for projecting an aiming pattern onto the object, in accordance with aspects of the present invention. An actuator 3300 may be operably attached to imaging subsystem 3220 for moving imaging subsystem 3220. First fixed imaging subsystem 3210 is operable for obtaining a first image of the object having an aimer pattern P such as a point or other pattern. Second movable imaging subsystem 3220 is operable for obtaining a second image of the object. Actuator 3300 is operable to bring the second image into alignment with the first image by aligning point P in the second image with point P in the first image. For example, an optical axis 3026 of imaging subsystem 3220 may be pivoted, tilted or deflected, for example in the direction of double-headed arrow R2, in response to actuator 3300 to align the second image of the object with the object in the first image. In addition, either the first fixed imaging subsystem 3210 or the second movable imaging subsystem 3220 may also be employed to obtain an image of decodable indicia 15 (FIG. 4) such as a decodable barcode.



FIG. 13 illustrates another embodiment of an imaging subsystem employable in terminal 1000 (FIG. 4). In this embodiment, an imaging subsystem 4900 may be employed in accordance with aspects of the present invention. For example, an imaging subsystem 4900 may include a movable imaging subsystem 4100. An actuator 4300 may be operably attached to imaging subsystem 4100 for moving imaging subsystem 4100 from a first position to a second position remote from the first position. Movable imaging subsystem 4100 is operable for obtaining a first image of the object at the first position or orientation; after the first image is taken, the movable imaging subsystem is moved or translated to a second location or orientation, such as in the direction of arrow L1, using actuator 4300 to provide a distance L between the first position and the second position prior to aligning the object and obtaining a second image of the object. Actuator 4300 is also operable to bring the second image into alignment with the first image. For example, an optical axis 4026 of imaging subsystem 4100 may be pivoted, tilted or deflected, for example in the direction of double-headed arrow R3, in response to actuator 4300 to align the second image of the object with the object in the first image. As noted above, terminal 1000 (FIG. 4) may include an aiming subsystem 600 (FIG. 5) for projecting an aiming pattern onto the object in combination with imaging subsystem 4900. In addition, the movable imaging subsystem 4100 may also be employed to obtain an image of decodable indicia 15 (FIG. 4) such as a decodable barcode.


From the present description of the various imaging subsystems and actuators, it will be appreciated that the second, aligned image should be obtained within an operable time after the first image, so that the effect of the user holding and moving the terminal, or of the object moving, while the images are obtained does not introduce errors in determining the one or more dimensions of the object. It is desirable to minimize the time delay between the first image and the second aligned image. For example, it may be suitable that the images be obtained within about 0.5 second or less, or possibly within about ⅛ second or less, about 1/16 second or less, or about 1/32 second or less.


With reference to FIGS. 6, 12, and 13, the actuators employed in the various embodiments may comprise one or more actuators positioned in the terminal to move the movable imaging subsystem in accordance with instructions received from processor 1060 (FIG. 5). Examples of a suitable actuator include a shape memory alloy (SMA) element, which changes in length in response to an electrical bias, a piezo actuator, a MEMS actuator, and other types of electromechanical actuators. The actuator may allow for moving or pivoting the optical axis of the imaging optics assembly, or, in connection with the actuator in FIG. 13, also moving the imaging subsystem from side to side along a line or a curve.


As shown in FIGS. 14 and 15, an actuator 5300 may comprise four actuators 5310, 5320, 5330, and 5340 disposed beneath the corners of an imaging subsystem 5900 to movably support the imaging subsystem on a circuit board 5700. The actuators may be selected so that they are capable of compressing and expanding and, when mounted to the circuit board, are capable of pivoting the imaging subsystem relative to the circuit board. The movement of the imaging subsystem by the actuators may occur in response to a signal from the processor. The actuators may employ a shape memory alloy (SMA) member which cooperates with one or more biasing elements 5350, such as springs, for operably moving the imaging subsystem. In addition, although four actuators are shown, more or fewer than four actuators may be used. The processor may compare the first image to the observed image obtained from the movable imaging subsystem and, based on the comparison, determine the adjustment of the position of the movable imaging subsystem required to align the object in the second image with the object in the first image.


In addition, the terminal may include a motion sensor 1300 (FIG. 5) operably connected to processor 1060 (FIG. 5) via interface 1310 (FIG. 5) and operable to remove the effect of shaking, due to the user holding the terminal, while obtaining the first image and the second aligned image used for determining one or more dimensions of the object as described above. A suitable system for use in the above-noted terminal may include the image stabilizer for a microcamera disclosed in U.S. Pat. No. 7,307,653 issued to Dutta, the entire contents of which are incorporated herein by reference.


The imaging optics assembly may employ a fixed focus imaging optics assembly. For example, the optics may be focused at a hyperfocal distance so that objects in the images from some near distance to infinity will be sharp. The imaging optics assembly may be focused at a distance of 15 inches or greater, in the range of 3 to 4 feet, or at other distances. Alternatively, the imaging optics assembly may comprise an autofocus lens. The exemplary terminal may include a suitable shape memory alloy actuator apparatus for controlling an imaging subassembly such as a microcamera disclosed in U.S. Pat. No. 7,974,025 by Topliss, the entire contents of which are incorporated herein by reference.


From the present description, it will be appreciated that the exemplary terminal may be operably employed to separately obtain images and dimensions of the various sides of an object, e.g., two or more of a front elevational view, a side elevational view, and a top view, may be separately obtained by a user similar to measuring an object as one would with a ruler.


The exemplary terminal may include a suitable autofocusing microcamera such as a microcamera disclosed in U.S. Patent Application Publication No. 2011/0279916 by Brown et al., the entire contents of which are incorporated herein by reference.


In addition, it will be appreciated that the described imaging subsystems in the embodiments shown in FIGS. 6, 12, and 13 may employ fluid lenses or adaptive lenses. For example, a fluid lens or adaptive lens may comprise an interface between two fluids having dissimilar optical indices. The shape of the interface can be changed by the application of external forces so that light passing across the interface can be directed to propagate in desired directions. As a result, the optical characteristics of a fluid lens, such as its focal length and the orientation of its optical axis, can be changed. With use of a fluid lens or adaptive lens, for example, an actuator may be operable to apply pressure to the fluid to change the shape of the lens. In another embodiment, an actuator may be operable to apply a DC voltage across a coating of the fluid to decrease its water repellency, in a process called electrowetting, to change the shape of the lens. The exemplary terminal may include a suitable fluid lens as disclosed in U.S. Pat. No. 8,027,096 issued to Feng et al., the entire contents of which are incorporated herein by reference.


With reference to FIG. 16, a timing diagram may be employed for obtaining a first image of the object for use in determining one or more dimensions as described above, and also for decoding a decodable indicia disposed on the object using, for example, the first imaging subassembly. At the same time as, or generally simultaneously after, activation of the first imaging subassembly, the movable subassembly and actuator may be activated to determine one or more dimensions as described above. For example, the first frame of image data of the object obtained using the first imaging subassembly may be used in combination with the aligned image of the object obtained using the movable imaging subsystem.


A signal 7002 may be a trigger signal which can be made active by actuation of trigger 1220 (FIG. 4), and which can be deactivated by releasing of trigger 1220 (FIG. 4). A trigger signal may also become inactive after a time out period or after a successful decode of a decodable indicia.


A signal 7102 illustrates illumination subsystem 800 (FIG. 5) having an energization level, e.g., illustrating an illumination pattern where illumination or light is alternately turned on and off. Periods 7110, 7120, 7130, 7140, and 7150 illustrate where illumination is on, and periods 7115, 7125, 7135, and 7145 illustrate where illumination is off.


A signal 7202 is an exposure control signal illustrating active states defining exposure periods and inactive states intermediate the exposure periods for an image sensor of a terminal. For example, in an active state, an image sensor array of terminal 1000 (FIG. 4) is sensitive to light incident thereon. Exposure control signal 7202 can be applied to an image sensor array of terminal 1000 (FIG. 4) so that pixels of an image sensor array are sensitive to light during active periods of the exposure control signal and not sensitive to light during inactive periods thereof. During exposure periods 7210, 7220, 7230, 7240, and 7250, the image sensor array of terminal 1000 (FIG. 4) is sensitive to light incident thereon.


A signal 7302 is a readout control signal illustrating the exposed pixels in the image sensor array being transferred to memory or secondary storage in the imager, so that the imager is ready for the next active portion of the exposure control signal. In the timing diagram of FIG. 16, period 7410 may be used in combination with the movable imaging subsystem to determine one or more dimensions as described above. In addition, in the timing diagram of FIG. 16, periods 7410, 7420, 7430, and 7440 are periods in which processor 1060 (FIG. 5) may process one or more frames of image data. For example, periods 7410, 7420, 7430, and 7440 may correspond to one or more attempts to decode decodable indicia from images captured during periods when indicia reading terminal 1000 (FIG. 4) was illuminating the decodable indicia.


With reference again to FIG. 5, indicia reading terminal 1000 may include an image sensor 1032 comprising multiple pixel image sensor array 1033 having pixels arranged in rows and columns of pixels, associated column circuitry 1034 and row circuitry 1035. Associated with the image sensor 1032 can be amplifier circuitry 1036 (amplifier), and an analog to digital converter 1037 which converts image information in the form of analog signals read out of image sensor array 1033 into image information in the form of digital signals. Image sensor 1032 can also have an associated timing and control circuit 1038 for use in controlling, e.g., the exposure period of image sensor 1032, gain applied to the amplifier 1036, etc. The noted circuit components 1032, 1036, 1037, and 1038 can be packaged into a common image sensor integrated circuit 1040. Image sensor integrated circuit 1040 can incorporate fewer than the noted number of components. Image sensor integrated circuit 1040 including image sensor array 1033 and imaging lens assembly 200 can be incorporated in hand held housing 1014.


In one example, image sensor integrated circuit 1040 can be provided e.g., by an MT9V022 (752×480 pixel array) or an MT9V023 (752×480 pixel array) image sensor integrated circuit available from Aptina Imaging (formerly Micron Technology, Inc.). In one example, image sensor array 1033 can be a hybrid monochrome and color image sensor array having a first subset of monochrome pixels without color filter elements and a second subset of color pixels having color sensitive filter elements. In one example, image sensor integrated circuit 1040 can incorporate a Bayer pattern filter, so that defined at the image sensor array 1033 are red pixels at red pixel positions, green pixels at green pixel positions, and blue pixels at blue pixel positions. Frames that are provided utilizing such an image sensor array incorporating a Bayer pattern can include red pixel values at red pixel positions, green pixel values at green pixel positions, and blue pixel values at blue pixel positions. In an embodiment incorporating a Bayer pattern image sensor array, processor 1060 prior to subjecting a frame to further processing can interpolate pixel values at frame pixel positions intermediate of green pixel positions utilizing green pixel values for development of a monochrome frame of image data. Alternatively, processor 1060 prior to subjecting a frame for further processing can interpolate pixel values intermediate of red pixel positions utilizing red pixel values for development of a monochrome frame of image data. Processor 1060 can alternatively, prior to subjecting a frame for further processing interpolate pixel values intermediate of blue pixel positions utilizing blue pixel values. An imaging subsystem of terminal 1000 can include image sensor 1032 and lens assembly 200 for focusing an image onto image sensor array 1033 of image sensor 1032.
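The green-channel interpolation described above, filling in pixel positions intermediate of green pixel positions using green pixel values, can be sketched as follows. This is a generic demosaicing illustration under assumed data structures (2D lists plus a boolean mask), not the terminal's actual processing pipeline.

```python
# Hedged sketch of building a monochrome frame from a Bayer array's
# green channel: positions without a green sample are replaced by the
# average of their horizontally/vertically adjacent green samples.

def monochrome_from_green(raw, green_mask):
    """raw: 2D list of sensor values; green_mask: 2D list of booleans
    marking green pixel positions. Returns a frame where non-green
    positions hold the mean of neighboring green values."""
    rows, cols = len(raw), len(raw[0])
    out = [row[:] for row in raw]
    for y in range(rows):
        for x in range(cols):
            if green_mask[y][x]:
                continue  # green sample: keep as-is
            neighbors = [
                raw[ny][nx]
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                if 0 <= ny < rows and 0 <= nx < cols and green_mask[ny][nx]
            ]
            if neighbors:
                out[y][x] = sum(neighbors) / len(neighbors)
    return out
```

The same pattern applies when interpolating from red or blue positions instead, as the paragraph above notes.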


In the course of operation of terminal 1000, image signals can be read out of image sensor 1032, converted, and stored into a system memory such as RAM 1080. Memory 1085 of terminal 1000 can include RAM 1080, a nonvolatile memory such as EPROM 1082 and a storage memory device 1084 such as may be provided by a flash memory or a hard drive memory. In one embodiment, terminal 1000 can include processor 1060 which can be adapted to read out image data stored in memory 1080 and subject such image data to various image processing algorithms. Terminal 1000 can include a direct memory access unit (DMA) 1070 for routing image information read out from image sensor 1032 that has been subject to conversion to RAM 1080. In another embodiment, terminal 1000 can employ a system bus providing for bus arbitration mechanism (e.g., a PCI bus) thus eliminating the need for a central DMA controller. A skilled artisan would appreciate that other embodiments of the system bus architecture and/or direct memory access components providing for efficient data transfer between the image sensor 1032 and RAM 1080 are within the scope and the spirit of the present invention.


With reference still to FIG. 5, and referring to further aspects of terminal 1000, imaging lens assembly 200 can be adapted for focusing an image of decodable indicia 15 located within a field of view 20 on the object onto image sensor array 1033. A size in target space of a field of view 20 of terminal 1000 can be varied in a number of alternative ways. A size in target space of a field of view 20 can be varied, e.g., by changing a terminal to target distance, changing an imaging lens assembly setting, or changing a number of pixels of image sensor array 1033 that are subject to read out. Imaging light rays can be transmitted about an imaging axis. Lens assembly 200 can be adapted to be capable of multiple focal lengths and multiple planes of optimum focus (best focus distances).


Terminal 1000 may include illumination subsystem 800 for illumination of a target and projection of an illumination pattern (not shown). Illumination subsystem 800 may emit light having a random polarization. The illumination pattern, in the embodiment shown, can be projected to be proximate to but larger than an area defined by field of view 20, but can also be projected in an area smaller than an area defined by field of view 20. Illumination subsystem 800 can include a light source bank 500, comprising one or more light sources. Illumination subsystem 800 may further include one or more light source banks, each comprising one or more light sources, for example. Such light sources can illustratively include light emitting diodes (LEDs). LEDs with any of a wide variety of wavelengths and filters, or combinations of wavelengths or filters, may be used in various embodiments. Other types of light sources may also be used in other embodiments. The light sources may illustratively be mounted to a printed circuit board. This may be the same printed circuit board on which an image sensor integrated circuit 1040 having an image sensor array 1033 may illustratively be mounted.


Terminal 1000 can also include an aiming subsystem 600 for projecting an aiming pattern (not shown). Aiming subsystem 600 which can comprise a light source bank can be coupled to aiming light source bank power input unit 1208 for providing electrical power to a light source bank of aiming subsystem 600. Power input unit 1208 can be coupled to system bus 1500 via interface 1108 for communication with processor 1060.


In one embodiment, illumination subsystem 800 may include, in addition to light source bank 500, an illumination lens assembly 300, as is shown in the embodiment of FIG. 5. In addition to or in place of illumination lens assembly 300, illumination subsystem 800 can include alternative light shaping optics, e.g., one or more diffusers, mirrors and prisms. In use, terminal 1000 can be oriented by an operator with respect to a target (e.g., a piece of paper, a package, another type of substrate, a screen, etc.) bearing decodable indicia 15 in such manner that the illumination pattern (not shown) is projected on decodable indicia 15. In the example of FIG. 5, decodable indicia 15 is provided by a 1D barcode symbol. Decodable indicia 15 could also be provided by a 2D barcode symbol or optical character recognition (OCR) characters. Referring to further aspects of terminal 1000, lens assembly 200 can be controlled with use of an electrical power input unit 1202 which provides energy for changing a plane of optimum focus of lens assembly 200. In one embodiment, electrical power input unit 1202 can operate as a controlled voltage source, and in another embodiment, as a controlled current source. Electrical power input unit 1202 can apply signals for changing optical characteristics of lens assembly 200, e.g., for changing a focal length and/or a best focus distance of (a plane of optimum focus of) lens assembly 200. A light source bank electrical power input unit 1206 can provide energy to light source bank 500. In one embodiment, electrical power input unit 1206 can operate as a controlled voltage source. In another embodiment, electrical power input unit 1206 can operate as a controlled current source. In another embodiment, electrical power input unit 1206 can operate as a combined controlled voltage and controlled current source.
Electrical power input unit 1206 can change a level of electrical power provided to (energization level of) light source bank 500, e.g., for changing a level of illumination output by light source bank 500 of illumination subsystem 800 for generating the illumination pattern.


In another aspect, terminal 1000 can include a power supply 1402 that supplies power to a power grid 1404 to which electrical components of terminal 1000 can be connected. Power supply 1402 can be coupled to various power sources, e.g., a battery 1406, a serial interface 1408 (e.g., USB, RS232), and/or AC/DC transformer 1410.


Further, regarding power input unit 1206, power input unit 1206 can include a charging capacitor that is continually charged by power supply 1402. Power input unit 1206 can be configured to output energy within a range of energization levels. An average energization level of illumination subsystem 800 during exposure periods with a first illumination and exposure control configuration active can be higher than an average energization level of illumination subsystem 800 during exposure periods with a second illumination and exposure control configuration active.


Terminal 1000 can also include a number of peripheral devices including trigger 1220 which may be used to make active a trigger signal for activating frame readout and/or certain decoding processes. Terminal 1000 can be adapted so that activation of trigger 1220 activates a trigger signal and initiates a decode attempt. Specifically, terminal 1000 can be operative so that in response to activation of a trigger signal, a succession of frames can be captured by way of read out of image information from image sensor array 1033 (typically in the form of analog signals) and then storage of the image information after conversion into memory 1080 (which can buffer one or more of the succession of frames at a given time). Processor 1060 can be operative to subject one or more of the succession of frames to a decode attempt.


For attempting to decode a barcode symbol, e.g., a one dimensional barcode symbol, processor 1060 can process image data of a frame corresponding to a line of pixel positions (e.g., a row, a column, or a diagonal set of pixel positions) to determine a spatial pattern of dark and light cells and can convert each light and dark cell pattern determined into a character or character string via table lookup. Where a decodable indicia representation is a 2D barcode symbology, a decode attempt can comprise the steps of locating a finder pattern using a feature detection algorithm, locating matrix lines intersecting the finder pattern according to a predetermined relationship with the finder pattern, determining a pattern of dark and light cells along the matrix lines, and converting each light and dark cell pattern into a character or character string via table lookup.
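The 1D decode step described above can be sketched as follows. This is a toy illustration only: the lookup table below is a stand-in invented for the example, not a real symbology's code table, and the helper names are assumptions.

```python
# Minimal sketch of the 1D decode step: threshold a scan line of pixel
# values into a run of dark/light cells, then map the cell pattern to
# a character via table lookup. TOY_TABLE is a fabricated stand-in.

TOY_TABLE = {"dark,light,dark": "A", "dark,dark,light": "B"}

def cells_from_line(pixels, threshold=128):
    """Collapse a line of pixel values into alternating 'dark'/'light'
    cell labels (consecutive same-class pixels form one cell)."""
    cells = []
    for p in pixels:
        label = "dark" if p < threshold else "light"
        if not cells or cells[-1] != label:
            cells.append(label)
    return cells

def decode_line(pixels):
    """Look up the cell pattern of a scan line; None if unrecognized."""
    return TOY_TABLE.get(",".join(cells_from_line(pixels)))
```

A real symbology decoder would also measure cell widths and handle quiet zones, start/stop patterns, and checksums, but the thresholding and table-lookup structure is the same.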


Terminal 1000 can include various interface circuits for coupling various peripheral devices to system address/data bus (system bus) 1500, for communication with processor 1060 also coupled to system bus 1500. Terminal 1000 can include an interface circuit 1028 for coupling image sensor timing and control circuit 1038 to system bus 1500, an interface circuit 1102 for coupling electrical power input unit 1202 to system bus 1500, an interface circuit 1106 for coupling illumination light source bank power input unit 1206 to system bus 1500, and an interface circuit 1120 for coupling trigger 1220 to system bus 1500. Terminal 1000 can also include display 1222 coupled to system bus 1500 and in communication with processor 1060, via an interface 1122, as well as pointer mechanism 1224 in communication with processor 1060 via an interface 1124 connected to system bus 1500. Terminal 1000 can also include keyboard 1226 coupled to systems bus 1500 and in communication with processor 1060 via an interface 1126. Terminal 1000 can also include range detector unit 1210 coupled to system bus 1500 via interface 1110. In one embodiment, range detector unit 1210 can be an acoustic range detector unit. Various interface circuits of terminal 1000 can share circuit components. For example, a common microcontroller can be established for providing control inputs to both image sensor timing and control circuit 1038 and to power input unit 1206. A common microcontroller providing control inputs to circuit 1038 and to power input unit 1206 can be provided to coordinate timing between image sensor array controls and illumination subsystem controls.


A succession of frames of image data that can be captured and subject to the described processing can be full frames (including pixel values corresponding to each pixel of image sensor array 1033 or a maximum number of pixels read out from image sensor array 1033 during operation of terminal 1000). A succession of frames of image data that can be captured and subject to the described processing can also be “windowed frames” comprising pixel values corresponding to less than a full frame of pixels of image sensor array 1033. A succession of frames of image data that can be captured and subject to the above described processing can also comprise a combination of full frames and windowed frames. A full frame can be read out for capture by selectively addressing pixels of image sensor 1032 having image sensor array 1033 corresponding to the full frame. A windowed frame can be read out for capture by selectively addressing pixels or ranges of pixels of image sensor 1032 having image sensor array 1033 corresponding to the windowed frame. In one embodiment, a number of pixels subject to addressing and read out determine a picture size of a frame. Accordingly, a full frame can be regarded as having a first relatively larger picture size and a windowed frame can be regarded as having a relatively smaller picture size relative to a picture size of a full frame. A picture size of a windowed frame can vary depending on the number of pixels subject to addressing and readout for capture of a windowed frame.


Terminal 1000 can capture frames of image data at a rate known as a frame rate. A typical frame rate is 60 frames per second (FPS) which translates to a frame time (frame period) of 16.6 ms. Another typical frame rate is 30 frames per second (FPS) which translates to a frame time (frame period) of 33.3 ms per frame. A frame rate of terminal 1000 can be increased (and frame time decreased) by decreasing of a frame picture size.
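The frame-rate-to-frame-time arithmetic above is simply the reciprocal relationship, sketched here for clarity (the function name is an assumption):

```python
# Frame time (period) in milliseconds from a frame rate in FPS,
# as in the 60 FPS -> ~16.6 ms and 30 FPS -> ~33.3 ms figures above.

def frame_time_ms(frames_per_second):
    return 1000.0 / frames_per_second
```

Decreasing the picture size (e.g., reading out a windowed frame) raises the achievable frame rate, which by this relation shortens the frame time.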


In numerous cases herein wherein systems and apparatuses and methods are described as having a certain number of elements, it will be understood that such systems, apparatuses and methods can be practiced with fewer than the mentioned certain number of elements. Also, while a number of particular embodiments have been described, it will be understood that features and aspects that have been described with reference to each particular embodiment can be used with each remaining particularly described embodiment.


Another exemplary method of determining the dimensions of an object combines one or more of the foregoing methods to improve accuracy. In particular, the method includes capturing a range image of the object and capturing a visible image of the object (e.g., using a range camera with both an infra-red sensor and an RGB or monochrome camera). The range image and visible image are then aligned based on the relative positions from which the two images were captured.
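Alignment based on the cameras' relative positions can be sketched with a pinhole projection. The parallel-camera geometry, baseline, and intrinsic parameters below are illustrative assumptions, not values from the disclosure:

```python
# Map a 3D point measured by the range camera into pixel coordinates of the
# visible camera, assuming the two cameras are parallel and separated by a
# small horizontal baseline (a simplifying assumption).
def range_point_to_visible_pixel(point_xyz, baseline_m, fx, fy, cx, cy):
    x, y, z = point_xyz
    x_shifted = x - baseline_m      # move into the visible camera's frame
    u = fx * (x_shifted / z) + cx   # pinhole projection, horizontal
    v = fy * (y / z) + cy           # pinhole projection, vertical
    return u, v

# A point 1 m straight ahead of the range camera, with a 25 mm baseline
# and hypothetical intrinsics (fx = fy = 600, principal point 320, 240):
u, v = range_point_to_visible_pixel((0.0, 0.0, 1.0), 0.025,
                                    600.0, 600.0, 320.0, 240.0)
```

With real hardware the relative pose would come from calibration rather than an assumed baseline, and a full rotation would replace the simple horizontal shift.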


In an exemplary embodiment, the method includes performing a first method of determining the object's dimensions based on either the range image or the visible image. The method then includes performing a second method of determining the object's dimensions based on the other image (i.e., not the image used in the first method). The results of the first and second methods are then compared. If the compared results do not agree within a suitable threshold, new images may be captured or the first and second methods may be performed again using the original images.
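The compare-and-retry logic above can be sketched as follows; the tolerance value, retry policy, and averaging of agreeing estimates are illustrative assumptions:

```python
# True if corresponding dimensions differ by at most `tolerance` as a
# fraction of the larger value (threshold choice is an assumption).
def dims_agree(dims_a, dims_b, tolerance=0.05):
    return all(abs(a - b) <= tolerance * max(a, b)
               for a, b in zip(dims_a, dims_b))

# Run both dimensioning methods; accept the result only when they agree,
# otherwise re-run (e.g., on freshly captured images).
def cross_checked_dimensions(measure_from_range, measure_from_visible,
                             max_attempts=3):
    for _ in range(max_attempts):
        range_dims = measure_from_range()
        visible_dims = measure_from_visible()
        if dims_agree(range_dims, visible_dims):
            # e.g., average the two agreeing estimates
            return tuple((a + b) / 2
                         for a, b in zip(range_dims, visible_dims))
    return None  # no agreement: report failure to the caller
```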


In another exemplary embodiment, the method includes simultaneously performing a first method of determining the object's dimensions based on the range image and a second method of determining the object's dimensions based on the visible image. When one of the methods determines one of the object's dimensions, the determined dimension is provided to the other method, and the other method adjusts its process for determining the object's dimensions. For example, the other method may assume the determined dimension to be correct or the other method may verify the determined dimension in view of the image it is using to determine the object's dimensions. In other words, the method performs both dimensioning methods simultaneously and dynamically. Such dynamic sharing of information between dimensioning methods facilitates the efficient determination of reliable dimensions of the object.
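The dynamic sharing of determined dimensions between the two simultaneously running methods can be sketched as an interleaved loop; the step-wise estimator interface and the three-dimension stopping condition are illustrative assumptions:

```python
# Interleave two step-wise dimensioning methods. Each step receives the
# dimensions determined so far and may return a (name, value) pair for a
# newly determined dimension, which is then shared with the other method.
def dimension_object(range_steps, visible_steps):
    shared = {}  # dimensions determined so far, visible to both methods
    for range_step, visible_step in zip(range_steps, visible_steps):
        for step in (range_step, visible_step):
            result = step(shared)
            if result is not None:
                name, value = result
                shared.setdefault(name, value)  # first determination wins
        if len(shared) == 3:  # length, width, height all determined
            break
    return shared
```

In this sketch each method can either assume a shared dimension is correct (by not re-measuring it) or verify it against its own image before measuring the remaining dimensions.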


As would be recognized by one of ordinary skill in the art upon consideration of the present disclosure, the foregoing method may be implemented by an appropriately configured computing device (e.g., including a processor and memory).


The foregoing disclosure has presented a number of systems, methods, and devices for determining the dimensions of an object. Although methods have been disclosed with respect to particular systems and/or devices, the methods may be performed using different systems and/or devices than those particularly disclosed. Similarly, the systems and devices may perform different methods than those methods specifically disclosed with respect to a given system or device. Furthermore, the systems and devices may perform multiple methods for determining the dimensions of an object (e.g., to increase accuracy). Aspects of each of the methods for determining the dimensions of an object may be used in or combined with other methods. Finally, components (e.g., a range camera, camera system, scale, and/or computing device) of a given disclosed system or device may be incorporated into other disclosed systems or devices to provide increased functionality.


To supplement the present disclosure, this application incorporates entirely by reference commonly assigned U.S. patent application Ser. No. 13/784,933 for an “Integrated Dimensioning and Weighing System” filed Mar. 5, 2013 at the United States Patent and Trademark Office.


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. An object analysis system, comprising: a camera system comprising a field of view; a laser projection device configured to project a laser pattern aligned to the field of view of the camera system, the laser pattern comprising: a first linear subpattern traversing a first visible face and a second linear subpattern traversing a second visible face of an object, wherein the first visible face and the second visible face are on different sides of a first edge connecting the first visible face of the object and the second visible face of the object, wherein the first linear subpattern and the second linear subpattern are parallel to the first edge of the object; and wherein the laser pattern comprises a constant sized rectangle formed by laser lines in the laser pattern, wherein the laser pattern is formed by two horizontal parallel lines and two vertical parallel lines with constant distance between the parallel lines; wherein the camera system is configured to capture an image of the projected laser pattern on the object within the field of view of the camera system; and a processor configured to: calculate coordinates in three-dimensional space of a first break point in the projected laser pattern at a second edge of the object and a second break point in the projected laser pattern at a third edge of the object; and calculate coordinates in three-dimensional space of a third break point in the projected laser pattern at a fourth edge of the object and a fourth break point in the projected laser pattern at a fifth edge of the object, wherein a dimension between the first break point and second break point represents a dimension of the object in a first plane and a dimension between the third break point and the fourth break point represents a dimension of the object in a second plane.
  • 2. The object analysis system of claim 1, wherein the camera system comprises a range camera and visible camera.
  • 3. The object analysis system of claim 1, wherein the laser pattern comprises a rectangle formed by laser lines in the pattern.
  • 4. The object analysis system of claim 1, wherein the camera system comprises software calibrating a relationship between the laser pattern and the field of view.
  • 5. The object analysis system of claim 1, further comprising a scale configured to determine a weight of the object.
  • 6. The object analysis system of claim 1, wherein the scale comprises a top surface that has a marking to indicate an orientation of the object.
  • 7. The object analysis system of claim 1, wherein the processor is configured to identify a ground plane surface in the image generated by the camera system.
  • 8. The object analysis system of claim 1, wherein the laser pattern comprises laser lines that have a divergence angle between 1 and 30 milliradians.
  • 9. The object analysis system of claim 1, wherein the laser pattern comprises laser lines that have a divergence angle between 3 and 10 milliradians.
  • 10. The object analysis system of claim 1, wherein a first width of laser lines in the laser pattern on the object corresponds to a second width of between 2 and 10 pixels of a sensor in the camera system imaging the object and the pattern.
  • 11. The object analysis system of claim 1, wherein the laser pattern is projected from a laser projection device separate from the camera system.
  • 12. The object analysis system of claim 1, wherein there are between 5 and 10 break points within the camera system's field of view.
  • 13. The object analysis system of claim 1, wherein the processor is configured to execute shipment billing software.
  • 14. The object analysis system of claim 1, wherein the camera system comprises a range camera and a visible camera that are each configured to capture an image of the object and the pattern on the object.
  • 15. A method for object analysis, the method comprising: projecting, by a laser projection device, a laser pattern aligned to a field of view of a camera system, the laser pattern comprising: a first linear subpattern traversing a first visible face and a second linear subpattern traversing a second visible face of an object, wherein the first visible face and the second visible face are on different sides of a first edge connecting the first visible face of the object and the second visible face of the object, wherein the first linear subpattern and the second linear subpattern are parallel to the first edge of the object; and wherein the laser pattern comprises a constant sized rectangle formed by laser lines in the laser pattern, wherein the laser pattern is formed by two horizontal parallel lines and two vertical parallel lines with constant distance between the parallel lines; capturing, by the camera system, an image of the projected laser pattern on the object within the field of view of the camera system; computing, by a processor, coordinates in three-dimensional space of a first break point in the projected laser pattern at a second edge of the object and a second break point in the projected laser pattern at a third edge of the object; and computing, by the processor, coordinates in three-dimensional space of a third break point in the projected laser pattern at a fourth edge of the object and a fourth break point in the projected laser pattern at a fifth edge of the object, wherein a dimension between the first break point and second break point represents a dimension of the object in a first plane and a dimension between the third break point and the fourth break point represents a dimension of the object in a second plane.
  • 16. The method of claim 15, wherein the camera system comprises a range camera.
  • 17. The method of claim 15, wherein the laser pattern is projected from a laser projection device separate from the camera system.
  • 18. The method of claim 15, wherein there are between 5 and 10 break points within the camera system's field of view.
  • 19. The method of claim 15, wherein the camera system comprises software calibrating a relationship between the laser pattern and the field of view.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims the benefit of U.S. patent application Ser. No. 13/785,177 for a Dimensioning System filed Mar. 5, 2013 (and published Apr. 17, 2014 as U.S. Patent Publication No. 2014/0104414), which claims the benefit of U.S. Patent Application No. 61/714,394 for an Integrated Dimensioning and Weighing System filed Oct. 16, 2012. Each of the foregoing patent applications and patent publication is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 13/784,933 for an Integrated Dimensioning and Weighing System filed Mar. 5, 2013 (and published Apr. 17, 2014 as U.S. Patent Publication No. 2014/0104413) also claims the benefit of U.S. Patent Application No. 61/714,394. Each of the foregoing patent applications and patent publication is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 14/055,383 for a Dimensioning System filed Oct. 16, 2013 (and published Apr. 17, 2014 as U.S. Patent Publication No. 2014/0104416) also claims the benefit of U.S. Patent Application No. 61/714,394. U.S. patent application Ser. No. 14/055,383 also claims the benefit of U.S. Patent Application No. 61/787,414 for an Integrated Dimensioning and Weighing System filed Mar. 15, 2013 and U.S. Patent Application No. 61/833,517 for an Integrated Dimensioning and Weighing System filed Jun. 11, 2013. Each of the foregoing patent applications and patent publication is hereby incorporated by reference in its entirety.

US Referenced Citations (1171)
Number Name Date Kind
3699245 Scott Oct 1972 A
3971065 Bayer Jul 1976 A
4026031 Siddall et al. May 1977 A
4279328 Ahlbom Jul 1981 A
4398811 Nishioka et al. Aug 1983 A
4495559 Gelatt, Jr. Jan 1985 A
4634278 Ross et al. Jan 1987 A
4730190 Win et al. Mar 1988 A
4803639 Steele et al. Feb 1989 A
4914460 Caimi et al. Apr 1990 A
4974919 Muraki et al. Dec 1990 A
5111325 DeJager May 1992 A
5175601 Fitts Dec 1992 A
5184733 Amarson et al. Feb 1993 A
5198648 Hibbard Mar 1993 A
5220536 Stringer et al. Jun 1993 A
5243619 Albers et al. Sep 1993 A
5331118 Jensen Jul 1994 A
5359185 Hanson Oct 1994 A
5384901 Glassner et al. Jan 1995 A
5477622 Skalnik Dec 1995 A
5548707 LoNegro et al. Aug 1996 A
5555090 Schmutz Sep 1996 A
5561526 Huber et al. Oct 1996 A
5590060 Granville et al. Dec 1996 A
5592333 Lewis Jan 1997 A
5606534 Stringer et al. Feb 1997 A
5619245 Kessler et al. Apr 1997 A
5655095 LoNegro et al. Aug 1997 A
5661561 Wurz et al. Aug 1997 A
5699161 Woodworth Dec 1997 A
5729750 Ishida Mar 1998 A
5730252 Herbinet Mar 1998 A
5732147 Tao Mar 1998 A
5734476 Dlugos Mar 1998 A
5737074 Haga et al. Apr 1998 A
5748199 Palm May 1998 A
5767962 Suzuki et al. Jun 1998 A
5802092 Endriz Sep 1998 A
5808657 Kurtz et al. Sep 1998 A
5831737 Stringer et al. Nov 1998 A
5850370 Stringer et al. Dec 1998 A
5850490 Johnson Dec 1998 A
5869827 Rando Feb 1999 A
5870220 Migdal et al. Feb 1999 A
5900611 Hecht May 1999 A
5923428 Woodworth Jul 1999 A
5929856 LoNegro et al. Jul 1999 A
5938710 Lanza et al. Aug 1999 A
5959568 Woolley Sep 1999 A
5960098 Tao Sep 1999 A
5969823 Wurz et al. Oct 1999 A
5978512 Kim et al. Nov 1999 A
5979760 Freyman et al. Nov 1999 A
5988862 Kacyra et al. Nov 1999 A
5991041 Woodworth Nov 1999 A
6009189 Schaack Dec 1999 A
6025847 Marks Feb 2000 A
6035067 Ponticos Mar 2000 A
6049386 Stringer et al. Apr 2000 A
6053409 Brobst et al. Apr 2000 A
6064759 Buckley et al. May 2000 A
6067110 Nonaka et al. May 2000 A
6069696 McQueen et al. May 2000 A
6115114 Berg et al. Sep 2000 A
6137577 Woodworth Oct 2000 A
6177999 Wurz et al. Jan 2001 B1
6189223 Haug Feb 2001 B1
6232597 Kley May 2001 B1
6235403 Vinden et al. May 2001 B1
6236403 Chaki May 2001 B1
6246468 Dimsdale Jun 2001 B1
6333749 Reinhardt et al. Dec 2001 B1
6336587 He et al. Jan 2002 B1
6369401 Lee Apr 2002 B1
6373579 Ober et al. Apr 2002 B1
6429803 Kumar Aug 2002 B1
6457642 Good et al. Oct 2002 B1
6507406 Yagi et al. Jan 2003 B1
6517004 Good et al. Feb 2003 B2
6519550 D'Hooge et al. Feb 2003 B1
6535776 Tobin et al. Mar 2003 B1
6661521 Stern Sep 2003 B1
6674904 McQueen Jan 2004 B1
6705526 Zhu et al. Mar 2004 B1
6773142 Rekow Aug 2004 B2
6781621 Gobush et al. Aug 2004 B1
6804269 Lizotte et al. Oct 2004 B2
6824058 Patel et al. Nov 2004 B2
6832725 Gardiner et al. Dec 2004 B2
6858857 Pease et al. Feb 2005 B2
6912293 Korobkin Jun 2005 B1
6922632 Foxlin Jul 2005 B2
6971580 Zhu et al. Dec 2005 B2
6995762 Pavlidis et al. Feb 2006 B1
7057632 Yamawaki et al. Jun 2006 B2
7085409 Sawhney et al. Aug 2006 B2
7086162 Tyroler Aug 2006 B2
7104453 Zhu et al. Sep 2006 B1
7128266 Zhu et al. Oct 2006 B2
7137556 Bonner et al. Nov 2006 B1
7159783 Walczyk et al. Jan 2007 B2
7161688 Bonner et al. Jan 2007 B1
7205529 Andersen et al. Apr 2007 B2
7214954 Schopp May 2007 B2
7233682 Levine Jun 2007 B2
7277187 Smith et al. Oct 2007 B2
7307653 Dutta Dec 2007 B2
7310431 Gokturk et al. Dec 2007 B2
7313264 Crampton Dec 2007 B2
7353137 Vock et al. Apr 2008 B2
7413127 Ehrhart et al. Aug 2008 B2
7509529 Colucci et al. Mar 2009 B2
7527205 Zhu May 2009 B2
7586049 Wurz Sep 2009 B2
7602404 Reinhardt et al. Oct 2009 B1
7614563 Nunnink et al. Nov 2009 B1
7639722 Paxton et al. Dec 2009 B1
7726206 Terrafranca, Jr. et al. Jun 2010 B2
7726575 Wang et al. Jun 2010 B2
7780084 Zhang et al. Aug 2010 B2
7788883 Buckley et al. Sep 2010 B2
7912320 Minor Mar 2011 B1
7974025 Topliss Jul 2011 B2
8009358 Zalevsky et al. Aug 2011 B2
8027096 Feng et al. Sep 2011 B2
8028501 Buckley et al. Oct 2011 B2
8050461 Shpunt et al. Nov 2011 B2
8055061 Katano Nov 2011 B2
8061610 Nunnink Nov 2011 B2
8072581 Breiholz Dec 2011 B1
8102395 Kondo et al. Jan 2012 B2
8132728 Dwinell et al. Mar 2012 B2
8134717 Pangrazio et al. Mar 2012 B2
8149224 Kuo et al. Apr 2012 B1
8194097 Xiao et al. Jun 2012 B2
8201737 Palacios Durazo et al. Jun 2012 B1
8212158 Wiest Jul 2012 B2
8212889 Chanas et al. Jul 2012 B2
8224133 Popovich et al. Jul 2012 B2
8228510 Pangrazio et al. Jul 2012 B2
8230367 Bell et al. Jul 2012 B2
8294969 Plesko Oct 2012 B2
8301027 Shaw et al. Oct 2012 B2
8305458 Hara Nov 2012 B2
8310656 Zalewski Nov 2012 B2
8313380 Zalewski et al. Nov 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8320621 McEldowney Nov 2012 B2
8322622 Liu Dec 2012 B2
8339462 Stec et al. Dec 2012 B2
8350959 Topliss et al. Jan 2013 B2
8351670 Ijiri et al. Jan 2013 B2
8366005 Kotlarsky et al. Feb 2013 B2
8368762 Chen et al. Feb 2013 B1
8371507 Haggerty et al. Feb 2013 B2
8374498 Pastore Feb 2013 B2
8376233 Horn et al. Feb 2013 B2
8381976 Mohideen et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8437539 Komatsu et al. May 2013 B2
8441749 Brown et al. May 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8463079 Ackley et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8570343 Halstead Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8576390 Nunnink Nov 2013 B1
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8594425 Gurman et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8792688 Unsworth Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8810779 Hilde Aug 2014 B1
8820630 Qu et al. Sep 2014 B2
8822806 Cockerell et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8897596 Passmore et al. Nov 2014 B1
8903172 Smith Dec 2014 B2
8908277 Pesach et al. Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8928896 Kennington et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8993974 Goodwin Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9014441 Truyen et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9061527 Tobin et al. Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9066087 Shpunt Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9076459 Braho et al. Jul 2015 B2
9079423 Bouverie et al. Jul 2015 B2
9080856 Laffargue Jul 2015 B2
9082023 Feng et al. Jul 2015 B2
9082195 Holeva et al. Jul 2015 B2
9084032 Rautiola et al. Jul 2015 B2
9087250 Coyle Jul 2015 B2
9092681 Havens et al. Jul 2015 B2
9092682 Wilz et al. Jul 2015 B2
9092683 Koziol et al. Jul 2015 B2
9093141 Liu Jul 2015 B2
9098763 Lu et al. Aug 2015 B2
9104929 Todeschini Aug 2015 B2
9104934 Li et al. Aug 2015 B2
9107484 Chaney Aug 2015 B2
9111159 Liu et al. Aug 2015 B2
9111166 Cunningham Aug 2015 B2
9135483 Liu et al. Sep 2015 B2
9137009 Gardiner Sep 2015 B1
9141839 Xian et al. Sep 2015 B2
9142035 Rotman Sep 2015 B1
9147096 Wang Sep 2015 B2
9148474 Skvoretz Sep 2015 B2
9158000 Sauerwein Oct 2015 B2
9158340 Reed et al. Oct 2015 B2
9158953 Gillet et al. Oct 2015 B2
9159059 Daddabbo et al. Oct 2015 B2
9165174 Huck Oct 2015 B2
9171278 Kong et al. Oct 2015 B1
9171543 Emerick et al. Oct 2015 B2
9183425 Wang Nov 2015 B2
9189669 Zhu et al. Nov 2015 B2
9195844 Todeschini et al. Nov 2015 B2
9202458 Braho et al. Dec 2015 B2
9208366 Liu Dec 2015 B2
9208367 Wangu Dec 2015 B2
9219836 Bouverie et al. Dec 2015 B2
9224022 Ackley et al. Dec 2015 B2
9224024 Bremer et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9233470 Bradski et al. Jan 2016 B1
9235553 Fitch et al. Jan 2016 B2
9235899 Kirmani et al. Jan 2016 B1
9239950 Fletcher Jan 2016 B2
9245492 Ackley et al. Jan 2016 B2
9443123 Hejl Jan 2016 B2
9248640 Heng Feb 2016 B2
9250652 London et al. Feb 2016 B2
9250712 Todeschini Feb 2016 B1
9251411 Todeschini Feb 2016 B2
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9262660 Lu et al. Feb 2016 B2
9262662 Chen et al. Feb 2016 B2
9269036 Bremer Feb 2016 B2
9270782 Hala et al. Feb 2016 B2
9273846 Rossi et al. Mar 2016 B1
9274812 Doren et al. Mar 2016 B2
9275388 Havens et al. Mar 2016 B2
9277668 Feng et al. Mar 2016 B2
9280693 Feng et al. Mar 2016 B2
9286496 Smith Mar 2016 B2
9292723 Lu et al. Mar 2016 B2
9297900 Jiang Mar 2016 B2
9298964 Li et al. Mar 2016 B2
9299013 Curlander et al. Mar 2016 B1
9301427 Feng et al. Mar 2016 B2
9304376 Anderson Apr 2016 B2
9310609 Rueblinger et al. Apr 2016 B2
9313377 Todeschini et al. Apr 2016 B2
9317037 Byford et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342723 Liu et al. May 2016 B2
9342724 McCloskey May 2016 B2
9361882 Ressler et al. Jun 2016 B2
9365381 Colonel et al. Jun 2016 B2
9366861 Johnson Jun 2016 B1
9373018 Colavito et al. Jun 2016 B2
9375945 Bowles Jun 2016 B1
9378403 Wang et al. Jun 2016 B2
D760719 Zhou et al. Jul 2016 S
9360304 Chang et al. Jul 2016 B2
9383848 Daghigh Jul 2016 B2
9384374 Bianconi Jul 2016 B2
9390596 Todeschini Jul 2016 B1
9399557 Mishra et al. Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
9411386 Sauerwein Aug 2016 B2
9412242 Van Horn et al. Aug 2016 B2
9418269 Havens et al. Aug 2016 B2
9418270 Van Volkinburg et al. Aug 2016 B2
9423318 Lui et al. Aug 2016 B2
9424749 Reed et al. Aug 2016 B1
D766244 Zhou et al. Sep 2016 S
9443222 Singel et al. Sep 2016 B2
9454689 McCloskey et al. Sep 2016 B2
9464885 Lloyd et al. Oct 2016 B2
9465967 Xian et al. Oct 2016 B2
9470511 Maynard et al. Oct 2016 B2
9478113 Xie et al. Oct 2016 B2
9478983 Kather et al. Oct 2016 B2
D771631 Fitch et al. Nov 2016 S
9481186 Bouverie et al. Nov 2016 B2
9486921 Straszheim et al. Nov 2016 B1
9488986 Solanki Nov 2016 B1
9489782 Payne et al. Nov 2016 B2
9490540 Davies et al. Nov 2016 B1
9491729 Rautiola et al. Nov 2016 B2
9497092 Gomez et al. Nov 2016 B2
9507974 Todeschini Nov 2016 B1
9519814 Cudzilo Dec 2016 B2
9521331 Bessettes et al. Dec 2016 B2
9530038 Xian et al. Dec 2016 B2
D777166 Bidwell et al. Jan 2017 S
9558386 Yeakley Jan 2017 B2
9572901 Todeschini Feb 2017 B2
9595038 Cavalcanti et al. Mar 2017 B1
9606581 Howe et al. Mar 2017 B1
D783601 Schulte et al. Apr 2017 S
D785617 Bidwell et al. May 2017 S
D785636 Oberpriller et al. May 2017 S
9646189 Lu et al. May 2017 B2
9646191 Unemyr et al. May 2017 B2
9652648 Ackley et al. May 2017 B2
9652653 Todeschini et al. May 2017 B2
9656487 Ho et al. May 2017 B2
9659198 Giordano et al. May 2017 B2
D790505 Vargo et al. Jun 2017 S
D790546 Zhou et al. Jun 2017 S
9680282 Hanenburg Jun 2017 B2
9697401 Feng et al. Jul 2017 B2
9701140 Alaganchetty et al. Jul 2017 B1
9709387 Fujita et al. Jul 2017 B2
9736459 Mor et al. Aug 2017 B2
9741136 Holz Aug 2017 B2
9828223 Svensson et al. Nov 2017 B2
20010027995 Patel et al. Oct 2001 A1
20010032879 He et al. Oct 2001 A1
20020036765 McCaffrey Mar 2002 A1
20020054289 Thibault et al. May 2002 A1
20020067855 Chiu et al. Jun 2002 A1
20020105639 Roelke Aug 2002 A1
20020109835 Goetz Aug 2002 A1
20020113946 Kitaguchi et al. Aug 2002 A1
20020118874 Chung et al. Aug 2002 A1
20020158873 Williamson Oct 2002 A1
20020167677 Okada et al. Nov 2002 A1
20020179708 Zhu et al. Dec 2002 A1
20020186897 Kim et al. Dec 2002 A1
20020196534 Lizotte et al. Dec 2002 A1
20030038179 Tsikos et al. Feb 2003 A1
20030053513 Vatan et al. Mar 2003 A1
20030063086 Baumberg Apr 2003 A1
20030078755 Leutz et al. Apr 2003 A1
20030091227 Chang et al. May 2003 A1
20030156756 Gokturk et al. Aug 2003 A1
20030163287 Vock et al. Aug 2003 A1
20030197138 Pease et al. Oct 2003 A1
20030217018 Groff Nov 2003 A1
20030225712 Cooper et al. Dec 2003 A1
20030235331 Kawaike et al. Dec 2003 A1
20040008259 Gokturk Jan 2004 A1
20040019274 Galloway et al. Jan 2004 A1
20040024754 Mane et al. Feb 2004 A1
20040032974 Kriesel Feb 2004 A1
20040066329 Zeitfuss et al. Apr 2004 A1
20040073359 Ichijo et al. Apr 2004 A1
20040083025 Yamanouchi et al. Apr 2004 A1
20040089482 Ramsden et al. May 2004 A1
20040098146 Katae et al. May 2004 A1
20040105580 Hager et al. Jun 2004 A1
20040118928 Patel Jun 2004 A1
20040122779 Stickler et al. Jun 2004 A1
20040132297 Baba et al. Jul 2004 A1
20040155975 Hart et al. Aug 2004 A1
20040165090 Ning Aug 2004 A1
20040184041 Schopp Sep 2004 A1
20040211836 Patel et al. Oct 2004 A1
20040214623 Takahashi et al. Oct 2004 A1
20040233461 Armstrong et al. Nov 2004 A1
20040258353 Gluckstad et al. Dec 2004 A1
20050006477 Patel Jan 2005 A1
20050117215 Lange Jun 2005 A1
20050128193 Popescu et al. Jun 2005 A1
20050128196 Popescu et al. Jun 2005 A1
20050168488 Montague Aug 2005 A1
20050187887 Nicolas et al. Aug 2005 A1
20050211782 Martin Sep 2005 A1
20050240317 Kienzle-Lietl Oct 2005 A1
20050257748 Kriesel et al. Nov 2005 A1
20050264867 Cho et al. Dec 2005 A1
20060036556 Knispel Feb 2006 A1
20060047704 Gopalakrishnan Mar 2006 A1
20060078226 Zhou Apr 2006 A1
20060108266 Bowers et al. May 2006 A1
20060109105 Varner et al. May 2006 A1
20060112023 Horhann May 2006 A1
20060151604 Zhu et al. Jul 2006 A1
20060159307 Anderson et al. Jul 2006 A1
20060159344 Shao et al. Jul 2006 A1
20060213999 Wang et al. Sep 2006 A1
20060230640 Chen Oct 2006 A1
20060232681 Okada Oct 2006 A1
20060255150 Longacre Nov 2006 A1
20060269165 Viswanathan Nov 2006 A1
20060276709 Khamene et al. Dec 2006 A1
20060291719 Ikeda et al. Dec 2006 A1
20070003154 Sun et al. Jan 2007 A1
20070025612 Iwasaki et al. Feb 2007 A1
20070031064 Zhao et al. Feb 2007 A1
20070063048 Havens et al. Mar 2007 A1
20070116357 Dewaele May 2007 A1
20070127022 Cohen et al. Jun 2007 A1
20070143082 Degnan Jun 2007 A1
20070153293 Gruhlke et al. Jul 2007 A1
20070165013 Goulanian et al. Jul 2007 A1
20070171220 Kriveshko Jul 2007 A1
20070177011 Lewin et al. Aug 2007 A1
20070181685 Zhu et al. Aug 2007 A1
20070237356 Dwinell et al. Oct 2007 A1
20070291031 Konev et al. Dec 2007 A1
20070299338 Stevick et al. Dec 2007 A1
20080013793 Hillis et al. Jan 2008 A1
20080035390 Wurz Feb 2008 A1
20080047760 Georgitsis Feb 2008 A1
20080050042 Zhang et al. Feb 2008 A1
20080054062 Gunning et al. Mar 2008 A1
20080056536 Hildreth et al. Mar 2008 A1
20080062164 Bassi et al. Mar 2008 A1
20080065509 Williams Mar 2008 A1
20080077265 Boyden Mar 2008 A1
20080079955 Storm Apr 2008 A1
20080164074 Wurz Jun 2008 A1
20080156619 Patel et al. Jul 2008 A1
20080204476 Montague Aug 2008 A1
20080212168 Olmstead et al. Sep 2008 A1
20080247635 Davis et al. Oct 2008 A1
20080273191 Kim et al. Nov 2008 A1
20080273210 Hilde Nov 2008 A1
20080278790 Boesser et al. Nov 2008 A1
20090038182 Lans et al. Feb 2009 A1
20090046296 Kilpartrick et al. Feb 2009 A1
20090059004 Bochicchio Mar 2009 A1
20090081008 Somin et al. Mar 2009 A1
20090095047 Patel et al. Apr 2009 A1
20090114818 Casares et al. May 2009 A1
20090134221 Zhu et al. May 2009 A1
20090161090 Campbell et al. Jun 2009 A1
20090189858 Lev et al. Jul 2009 A1
20090195790 Zhu et al. Aug 2009 A1
20090225333 Bendall et al. Sep 2009 A1
20090237411 Gossweiler et al. Sep 2009 A1
20090268023 Hsieh Oct 2009 A1
20090272724 Gubler Nov 2009 A1
20090273770 Bauhahn et al. Nov 2009 A1
20090313948 Buckley et al. Dec 2009 A1
20090318815 Barnes et al. Dec 2009 A1
20090323084 Dunn et al. Dec 2009 A1
20090323121 Valkenburg Dec 2009 A1
20100035637 Varanasi et al. Feb 2010 A1
20100060604 Zwart et al. Mar 2010 A1
20100091104 Sprigle Apr 2010 A1
20100113153 Yen et al. May 2010 A1
20100118200 Gelman et al. May 2010 A1
20100128109 Banks May 2010 A1
20100149315 Qu Jun 2010 A1
20100161170 Siris Jun 2010 A1
20100171740 Andersen et al. Jul 2010 A1
20100172567 Prokoski Jul 2010 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100194709 Tamaki et al. Aug 2010 A1
20100202702 Benos et al. Aug 2010 A1
20100208039 Stettner Aug 2010 A1
20100211355 Horst et al. Aug 2010 A1
20100217678 Goncalves Aug 2010 A1
20100220849 Colbert et al. Sep 2010 A1
20100220894 Ackley et al. Sep 2010 A1
20100223276 Al-Shameri et al. Sep 2010 A1
20100245850 Lee et al. Sep 2010 A1
20100254611 Arnz Oct 2010 A1
20100274728 Kugelman Oct 2010 A1
20100303336 Abraham Dec 2010 A1
20100315413 Izadi et al. Dec 2010 A1
20100321482 Cleveland Dec 2010 A1
20110019155 Daniel et al. Jan 2011 A1
20110040192 Brenner et al. Feb 2011 A1
20110040407 Lim Feb 2011 A1
20110043609 Choi et al. Feb 2011 A1
20110075936 Deaver Mar 2011 A1
20110081044 Peeper Apr 2011 A1
20110099474 Grossman et al. Apr 2011 A1
20110169999 Grunow et al. Jul 2011 A1
20110180695 Li et al. Jul 2011 A1
20110188054 Petronius et al. Aug 2011 A1
20110188741 Sones et al. Aug 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20110234389 Mellin Sep 2011 A1
20110235854 Berger et al. Sep 2011 A1
20110243432 Hirsch et al. Oct 2011 A1
20110249864 Venkatesan et al. Oct 2011 A1
20110254840 Halstead Oct 2011 A1
20110260965 Kim et al. Oct 2011 A1
20110279916 Brown Nov 2011 A1
20110286007 Pangrazio et al. Nov 2011 A1
20110286628 Goncalves et al. Nov 2011 A1
20110288818 Thierman Nov 2011 A1
20110297590 Ackley et al. Dec 2011 A1
20110301994 Tieman Dec 2011 A1
20110303748 Lemma et al. Dec 2011 A1
20110310227 Konertz et al. Dec 2011 A1
20110310256 Shishido Dec 2011 A1
20120014572 Wong et al. Jan 2012 A1
20120024952 Chen Feb 2012 A1
20120056982 Katz et al. Mar 2012 A1
20120057345 Kuchibhotla Mar 2012 A1
20120067955 Rowe Mar 2012 A1
20120074227 Ferren Mar 2012 A1
20120081714 Pangrazio et al. Apr 2012 A1
20120082383 Kruglick Apr 2012 A1
20120111946 Golant May 2012 A1
20120113223 Hilliges et al. May 2012 A1
20120126000 Kunzig et al. May 2012 A1
20120140300 Freeman Jun 2012 A1
20120154607 Moed Jun 2012 A1
20120162413 Murphy et al. Jun 2012 A1
20120168509 Nunnink et al. Jul 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120179665 Baarman et al. Jul 2012 A1
20120185094 Rosenstein et al. Jul 2012 A1
20120190386 Anderson Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120197464 Wang et al. Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120218436 Rodriguez et al. Sep 2012 A1
20120223141 Good et al. Sep 2012 A1
20120224026 Bayer et al. Sep 2012 A1
20120224060 Gurevich et al. Sep 2012 A1
20120236212 Itoh et al. Sep 2012 A1
20120236288 Stanley Sep 2012 A1
20120242852 Hayward et al. Sep 2012 A1
20120113250 Farlotti et al. Oct 2012 A1
20120256901 Bendall Oct 2012 A1
20120261474 Kawashime et al. Oct 2012 A1
20120262558 Boger et al. Oct 2012 A1
20120280908 Rhoads et al. Nov 2012 A1
20120282905 Owen Nov 2012 A1
20120282911 Davis et al. Nov 2012 A1
20120284012 Rodriguez et al. Nov 2012 A1
20120284122 Brandis Nov 2012 A1
20120284339 Rodriguez Nov 2012 A1
20120284593 Rodriguez Nov 2012 A1
20120293610 Doepke et al. Nov 2012 A1
20120293625 Schneider et al. Nov 2012 A1
20120294478 Publicover et al. Nov 2012 A1
20120294549 Doepke Nov 2012 A1
20120299961 Ramkumar et al. Nov 2012 A1
20120300991 Mikio Nov 2012 A1
20120313848 Galor et al. Dec 2012 A1
20120314030 Datta Dec 2012 A1
20120314058 Bendall et al. Dec 2012 A1
20120314258 Moriya Dec 2012 A1
20120316820 Nakazato et al. Dec 2012 A1
20130019278 Sun et al. Jan 2013 A1
20130038881 Pesach et al. Feb 2013 A1
20130038941 Pesach et al. Feb 2013 A1
20130043312 Van Horn Feb 2013 A1
20130050426 Sarmast et al. Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130076857 Kurashige et al. Mar 2013 A1
20130093895 Palmer et al. Apr 2013 A1
20130094069 Lee et al. Apr 2013 A1
20130101158 Lloyd et al. Apr 2013 A1
20130156267 Muraoka et al. Jun 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130200150 Reynolds et al. Aug 2013 A1
20130201288 Billerbaeck et al. Aug 2013 A1
20130208164 Cazier et al. Aug 2013 A1
20130211790 Loveland et al. Aug 2013 A1
20130222592 Gieseke Aug 2013 A1
20130223673 Davis et al. Aug 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130291998 Konnerth Nov 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer Nov 2013 A1
20130308013 Li et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130317642 Asaria Nov 2013 A1
20130326425 Forstall et al. Dec 2013 A1
20130329012 Bartos Dec 2013 A1
20130329013 Metois et al. Dec 2013 A1
20130332524 Fiala et al. Dec 2013 A1
20130342343 Harring et al. Dec 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001258 Chan et al. Jan 2014 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140009586 McNamer et al. Jan 2014 A1
20140019005 Lee et al. Jan 2014 A1
20140021259 Moed et al. Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140031665 Pinto et al. Jan 2014 A1
20140100813 Showering Jan 2014 A1
20140034731 Gao et al. Feb 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039674 Motoyama et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140058612 Wong et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140062709 Flyer et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140064624 Kim et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067104 Osterhout Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071430 Hansen et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140079297 Tadayon et al. Mar 2014 A1
20140091147 Evans et al. Apr 2014 A1
20140097238 Ghazizadeh Apr 2014 A1
20140097252 He et al. Apr 2014 A1
20140098091 Hori Apr 2014 A1
20140098243 Ghazizadeh Apr 2014 A1
20140098244 Ghazizadeh Apr 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104664 Lee Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140121438 Long et al. May 2014 A1
20140121445 Fontenot et al. May 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125577 Hoang et al. May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140135984 Hirata May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140139654 Taskahashi May 2014 A1
20140140585 Wang May 2014 A1
20140142398 Patil et al. May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140152975 Ko Jun 2014 A1
20140157861 Jonas et al. Jun 2014 A1
20140158468 Adami Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140168380 Heidemann et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140177931 Kocherscheidt et al. Jun 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140192187 Atwell et al. Jul 2014 A1
20140192551 Masaki Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140201126 Zadeh et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140205150 Ogawa Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140225918 Mittal et al. Aug 2014 A1
20140225985 Klusza et al. Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140240454 Lee Aug 2014 A1
20140247279 Nicholas et al. Sep 2014 A1
20140247280 Nicholas et al. Sep 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140267609 Laffargue Sep 2014 A1
20140268093 Tohme et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140270361 Amma et al. Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140306833 Ricci Oct 2014 A1
20140307855 Vvithagen et al. Oct 2014 A1
20140313527 Askan Oct 2014 A1
20140319219 Liu et al. Oct 2014 A1
20140320408 Zagorsek et al. Oct 2014 A1
20140320605 Johnson Oct 2014 A1
20140333775 Naikal et al. Nov 2014 A1
20140347533 Ovsiannikov et al. Nov 2014 A1
20140347553 Ovsiannikov et al. Nov 2014 A1
20140350710 Gopalkrishnan et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20140379613 Nishitani et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009100 Haneda et al. Jan 2015 A1
20150009301 Ribnick et al. Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150016712 Rhoads et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150036876 Marrion et al. Feb 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150042791 Metois et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062160 Sakamoto et al. Mar 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150062369 Gehring et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150070158 Hayasaka Mar 2015 A1
20150070489 Hudman et al. Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150116498 Vartiainen et al. Apr 2015 A1
20150117749 Chen et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150130928 Maynard et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150163474 You Jun 2015 A1
20150178900 Kim et al. Jun 2015 A1
20150182844 Jang Jul 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150204662 Kobayashi et al. Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150213590 Brown et al. Jul 2015 A1
20150213647 Laffargue et al. Jul 2015 A1
20150219748 Hyatt Aug 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150229838 Hakim et al. Aug 2015 A1
20150243030 Pfeiffer Aug 2015 A1
20150248578 Utsumi Sep 2015 A1
20150253469 Le Gros et al. Sep 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150260830 Ghosh et al. Sep 2015 A1
20150269403 Lei et al. Sep 2015 A1
20150201181 Herschbach Oct 2015 A1
20150276379 Ni et al. Oct 2015 A1
20150308816 Laffargue et al. Oct 2015 A1
20150310243 Ackley Oct 2015 A1
20150310389 Crimm et al. Oct 2015 A1
20150316368 Moench et al. Nov 2015 A1
20150325036 Lee Nov 2015 A1
20150327012 Bian et al. Nov 2015 A1
20150332075 Burch Nov 2015 A1
20150332463 Galera et al. Nov 2015 A1
20150355470 Herschbach Dec 2015 A1
20160014251 Hejl Jan 2016 A1
20160169665 Deschenes et al. Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160048725 Holz et al. Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160070982 Li et al. Feb 2016 A1
20160062473 Bouchat et al. Mar 2016 A1
20160063429 Varley et al. Mar 2016 A1
20160065912 Peterson Mar 2016 A1
20160088287 Sadi et al. Mar 2016 A1
20160090283 Svensson et al. Mar 2016 A1
20160090284 Svensson et al. Mar 2016 A1
20160092805 Geisler et al. Mar 2016 A1
20160094016 Beach et al. Mar 2016 A1
20160101936 Chamberlin Apr 2016 A1
20160102975 McCloskey et al. Apr 2016 A1
20160104019 Todeschini et al. Apr 2016 A1
20160104274 Jovanovski et al. Apr 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160117627 Raj et al. Apr 2016 A1
20160117631 McCloskey et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160138247 Conway et al. May 2016 A1
20160138248 Conway et al. May 2016 A1
20160138249 Svensson et al. May 2016 A1
20160147408 Bevis et al. May 2016 A1
20160164261 Warren Jun 2016 A1
20160171597 Todeschini Jun 2016 A1
20160171666 McCloskey Jun 2016 A1
20160171720 Todeschini Jun 2016 A1
20160171775 Todeschini et al. Jun 2016 A1
20160171777 Todeschini et al. Jun 2016 A1
20160174674 Oberpriller et al. Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160178685 Young et al. Jun 2016 A1
20160178707 Young et al. Jun 2016 A1
20160178915 Mor et al. Jun 2016 A1
20160179132 Harr et al. Jun 2016 A1
20160179143 Bidwell et al. Jun 2016 A1
20160179368 Roeder Jun 2016 A1
20160179378 Kent et al. Jun 2016 A1
20160180130 Bremer Jun 2016 A1
20160180133 Oberpriller et al. Jun 2016 A1
20160180136 Meier et al. Jun 2016 A1
20160180594 Todeschini Jun 2016 A1
20160180663 McMahan et al. Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160180713 Bernhardt et al. Jun 2016 A1
20160185136 Ng et al. Jun 2016 A1
20160185291 Chamberlin Jun 2016 A1
20160186926 Oberpriller et al. Jun 2016 A1
20160187186 Coleman et al. Jun 2016 A1
20160187187 Coleman et al. Jun 2016 A1
20160187210 Coleman et al. Jun 2016 A1
20160188861 Todeschini Jun 2016 A1
20160188939 Sailors et al. Jun 2016 A1
20160188940 Lu et al. Jun 2016 A1
20160188941 Todeschini et al. Jun 2016 A1
20160188942 Good et al. Jun 2016 A1
20160188943 Linwood Jun 2016 A1
20160188944 Wilz, Sr. et al. Jun 2016 A1
20160189076 Mellott et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160189088 Pecorari et al. Jun 2016 A1
20160189092 George et al. Jun 2016 A1
20160189284 Mellott et al. Jun 2016 A1
20160189288 Todeschini Jun 2016 A1
20160189366 Chamberlin et al. Jun 2016 A1
20160189443 Smith Jun 2016 A1
20160189447 Valenzuela Jun 2016 A1
20160189489 Au et al. Jun 2016 A1
20160191684 DiPiazza et al. Jun 2016 A1
20160191801 Sivan Jun 2016 A1
20160192051 DiPiazza et al. Jun 2016 A1
20160125873 Braho et al. Jul 2016 A1
20160202478 Masson et al. Jul 2016 A1
20160202951 Pike et al. Jul 2016 A1
20160202958 Zabel et al. Jul 2016 A1
20160202959 Doubleday et al. Jul 2016 A1
20160203021 Pike et al. Jul 2016 A1
20160203429 Mellott et al. Jul 2016 A1
20160203641 Bostick et al. Jul 2016 A1
20160203797 Pike et al. Jul 2016 A1
20160203820 Zabel et al. Jul 2016 A1
20160204623 Haggert et al. Jul 2016 A1
20160204636 Allen et al. Jul 2016 A1
20160204638 Miraglia et al. Jul 2016 A1
20160210780 Paulovich et al. Jul 2016 A1
20160316190 McCloskey et al. Jul 2016 A1
20160223474 Tang et al. Aug 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20160323310 Todeschini et al. Nov 2016 A1
20160325677 Fitch et al. Nov 2016 A1
20160327614 Young et al. Nov 2016 A1
20160327930 Charpentier et al. Nov 2016 A1
20160328762 Pape Nov 2016 A1
20160328854 Kimura Nov 2016 A1
20160330218 Hussey et al. Nov 2016 A1
20160343163 Venkatesha et al. Nov 2016 A1
20160343176 Ackley Nov 2016 A1
20160364914 Todeschini Dec 2016 A1
20160370220 Ackley et al. Dec 2016 A1
20160372282 Bandringa Dec 2016 A1
20160373847 Vargo et al. Dec 2016 A1
20160377414 Thuries et al. Dec 2016 A1
20160377417 Jovanovski et al. Dec 2016 A1
20170010141 Ackley Jan 2017 A1
20170010328 Mullen et al. Jan 2017 A1
20170010780 Waldron et al. Jan 2017 A1
20170016714 Laffargue et al. Jan 2017 A1
20170018094 Todeschini Jan 2017 A1
20170046603 Lee et al. Feb 2017 A1
20170047864 Stang et al. Feb 2017 A1
20170053146 Liu et al. Feb 2017 A1
20170053147 Geramine et al. Feb 2017 A1
20170053647 Nichols et al. Feb 2017 A1
20170055606 Xu et al. Mar 2017 A1
20170060316 Larson Mar 2017 A1
20170061961 Nichols et al. Mar 2017 A1
20170064634 Van Horn et al. Mar 2017 A1
20170083730 Feng et al. Mar 2017 A1
20170091502 Furlong et al. Mar 2017 A1
20170091706 Lloyd et al. Mar 2017 A1
20170091741 Todeschini Mar 2017 A1
20170091904 Ventress Mar 2017 A1
20170092908 Chaney Mar 2017 A1
20170094238 Germaine Mar 2017 A1
20170098947 Wolski Apr 2017 A1
20170100949 Celinder et al. Apr 2017 A1
20170103545 Holz Apr 2017 A1
20170108838 Todeschini et al. Apr 2017 A1
20170108895 Chamberlin et al. Apr 2017 A1
20170115490 Hsieh et al. Apr 2017 A1
20170115497 Chen et al. Apr 2017 A1
20170116462 Ogasawara Apr 2017 A1
20170118355 Wong et al. Apr 2017 A1
20170121158 Wong May 2017 A1
20170123598 Phan et al. May 2017 A1
20170124369 Rueblinger et al. May 2017 A1
20170124396 Todeschini et al. May 2017 A1
20170124687 McCloskey et al. May 2017 A1
20170126873 McGary et al. May 2017 A1
20170126904 d'Armancourt et al. May 2017 A1
20170132806 Balachandreswaran May 2017 A1
20170139012 Smith May 2017 A1
20170139213 Schmidtlin May 2017 A1
20170140329 Bernhardt et al. May 2017 A1
20170140731 Smith May 2017 A1
20170147847 Berggren et al. May 2017 A1
20170148250 Angermayer May 2017 A1
20170150124 Thuries May 2017 A1
20170018294 Hardy et al. Jun 2017 A1
20170169198 Nichols Jun 2017 A1
20170171035 Lu et al. Jun 2017 A1
20170171703 Maheswaranathan Jun 2017 A1
20170171803 Maheswaranathan Jun 2017 A1
20170180359 Wolski et al. Jun 2017 A1
20170180577 Nguon et al. Jun 2017 A1
20170181299 Shi et al. Jun 2017 A1
20170182942 Hardy et al. Jun 2017 A1
20170190192 Delario et al. Jul 2017 A1
20170193432 Bernhardt Jul 2017 A1
20170193461 Jonas et al. Jul 2017 A1
20170193727 Van Horn et al. Jul 2017 A1
20170200108 Au et al. Jul 2017 A1
20170200275 McCloskey et al. Jul 2017 A1
20170200296 Jones et al. Jul 2017 A1
20170309108 Sadovsky et al. Oct 2017 A1
20170336870 Everett et al. Nov 2017 A1
20180018627 Ross Jan 2018 A1
Foreign Referenced Citations (65)
Number Date Country
2004212587 Apr 2005 AU
201139117 Oct 2008 CN
3335760 Apr 1985 DE
10210813 Oct 2003 DE
102007037282 Mar 2008 DE
0871008 Oct 1998 EP
1111435 Jun 2001 EP
1443312 Aug 2004 EP
1112483 May 2006 EP
1232480 May 2006 EP
2013117 Jan 2009 EP
2216634 Aug 2010 EP
2286932 Feb 2011 EP
2372648 Oct 2011 EP
2381421 Oct 2011 EP
2533009 Dec 2012 EP
2562715 Feb 2013 EP
2722656 Apr 2014 EP
2779027 Sep 2014 EP
2833323 Feb 2015 EP
2843590 Mar 2015 EP
2845170 Mar 2015 EP
2966595 Jan 2016 EP
3006893 Mar 2016 EP
3012601 Mar 2016 EP
3007096 Apr 2016 EP
3270342 Jan 2018 EP
2503978 Jan 2014 GB
2525053 Oct 2015 GB
2531928 May 2016 GB
H04129902 Apr 1992 JP
200696457 Apr 2006 JP
2007084162 Apr 2007 JP
2008210276 Sep 2008 JP
2014210646 Nov 2014 JP
2015174705 Oct 2015 JP
20100020115 Feb 2010 KR
20110013200 Feb 2011 KR
20110117020 Oct 2011 KR
20120028109 Mar 2012 KR
9640452 Dec 1996 WO
0077726 Dec 2000 WO
0114836 Mar 2001 WO
2006095110 Sep 2006 WO
2007015059 Feb 2007 WO
2007125554 Nov 2007 WO
2011017241 Feb 2011 WO
2012175731 Dec 2012 WO
2013021157 Feb 2013 WO
2013033442 Mar 2013 WO
2013173985 Nov 2013 WO
2013163789 Nov 2013 WO
2013166368 Nov 2013 WO
20130184340 Dec 2013 WO
2014019130 Feb 2014 WO
2014023697 Feb 2014 WO
2014102341 Jul 2014 WO
2014149702 Sep 2014 WO
2014151746 Sep 2014 WO
2015006865 Jan 2015 WO
2016020038 Feb 2016 WO
2016061699 Apr 2016 WO
2016085682 Jun 2016 WO
Non-Patent Literature Citations (156)
Entry
United Kingdom Further Examination Report in related GB Patent Application No. 1517842.9 dated Jul. 26, 2018; 5 pages [Cited art has been previously cited in this matter].
United Kingdom Further Examination Report in related GB Patent Application No. 1517112.7 dated Jul. 17, 2018; 4 pages [No art cited].
United Kingdom Further Examination Report in related GB Patent Application No. 1620676.5 dated Jul. 17, 2018; 4 pages [No art cited].
European Extended Search Report in related EP Application No. 17201794.9, dated Mar. 16, 2018, 10 pages [Only new art cited herein].
European Extended Search Report in related EP Application 17205030.4, dated Mar. 22, 2018, 8 pages.
European Exam Report in related EP Application 16172995.9, dated Mar. 15, 2018, 7 pages (Only new art cited herein).
United Kingdom Combined Search and Examination Report dated Mar. 21, 2018, 5 pages (Art has been previously cited).
European extended Search Report in related Application No. 17207882.6 dated Apr. 26, 2018, 10 pages.
Padzensky, Ron; “Augmera; Gesture Control”, Dated Apr. 18, 2015, 15 pages [Examiner Cited Art in Office Action dated Jan. 20, 2017 in related Application.].
Grabowski, Ralph; "New Commands in AutoCAD 2010: Part 11 Smoothing 3D Mesh Objects" Dated 2011, 6 pages, [Examiner Cited Art in Office Action dated Jan. 20, 2017 in related Application.].
Theodoropoulos, Gabriel; “Using Gesture Recognizers to Handle Pinch, Rotate, Pan, Swipe, and Tap Gestures” dated Aug. 25, 2014, 34 pages, [Examiner Cited Art in Office Action dated Jan. 20, 2017 in related Application.].
Boavida et al., "Dam monitoring using combined terrestrial imaging systems", 2009 Civil Engineering Survey Dec./Jan. 2009, pp. 33-38 {Cited in Notice of Allowance dated Sep. 15, 2017 in related matter}.
Ralph Grabowski, "Smoothing 3D Mesh Objects," New Commands in AutoCAD 2010: Part 11, Examiner Cited art in related matter Non Final Office Action dated May 19, 2017; 6 pages.
Wikipedia, "Microlens", Downloaded from https://en.wikipedia.org/wiki/Microlens, 3 pages. {Cited by Examiner in Feb. 9, 2017 Final Office Action in related matter}.
Fukaya et al., "Characteristics of Speckle Random Pattern and Its Applications", pp. 317-327, Nouv. Rev. Optique, t.6, n. 6. (1975) {Cited by Examiner in Feb. 9, 2017 Final Office Action in related matter; downloaded Mar. 2, 2017 from http://iopscience.iop.org}.
Thorlabs, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, downloaded from https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=6430, 4 pages.
Eksma Optics, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, downloaded from http://eksmaoptics.com/optical-systems/f-theta-lenses/f-theta-lens-for-1064-nm/, 2 pages.
Sill Optics, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, http://www.silloptics.de/1/products/sill-encyclopedia/laser-optics/f-theta-lenses/, 4 pages.
Office Action in counterpart European Application No. 13186043.9 dated Sep. 30, 2015, pp. 1-7.
Lloyd et al., “System for Monitoring the Condition of Packages Throughout Transit”, U.S. Appl. No. 14/865,575, filed Sep. 25, 2015, 59 pages, not yet published.
McCloskey et al., "Image Transformation for Indicia Reading," U.S. Appl. No. 14/928,032, filed Oct. 30, 2015, 48 pages, not yet published.
Great Britain Combined Search and Examination Report in related Application GB1517842.9, dated Apr. 8, 2016, 8 pages.
Search Report in counterpart European Application No. 15182675.7, dated Dec. 4, 2015, 10 pages.
Wikipedia, “3D projection” Downloaded on Nov. 25, 2015 from www.wikipedia.com, 4 pages.
M. Zahid Gurbuz, Selim Akyokus, Ibrahim Emiroglu, Aysun Guran, An Efficient Algorithm for 3D Rectangular Box Packing, 2009, Applied Automatic Systems: Proceedings of Selected AAS 2009 Papers, pp. 131-134.
European Extended Search Report in Related EP Application No. 16172995.9, dated Aug. 22, 2016, 11 pages.
European Extended search report in related EP Application No. 15190306.9, dated Sep. 9, 2016, 15 pages.
Collings et al., "The Applications and Technology of Phase-Only Liquid Crystal on Silicon Devices", Journal of Display Technology, IEEE Service Center, New York, NY, US, vol. 7, No. 3, Mar. 1, 2011 (Mar. 1, 2011), pp. 112-119.
European extended Search report in related EP Application 13785171.3, dated Sep. 19, 2016, 8 pages.
El-Hakim et al., “Multicamera vision-based approach to flexible feature measurement for inspection and reverse engineering”, published in Optical Engineering, Society of Photo-Optical Instrumentation Engineers, vol. 32, No. 9, Sep. 1, 1993, 15 pages.
El-Hakim et al., “A Knowledge-based Edge/Object Measurement Technique”, Retrieved from the Internet: URL: https://www.researchgate.net/profile/Sabry_E1-Hakim/publication/44075058_A_Knowledge_Based_EdgeObject_Measurement_Technique/links/00b4953b5faa7d3304000000.pdf [retrieved on Jul. 15, 2016] dated Jan. 1, 1993, 9 pages.
H. Sprague Ackley, “Automatic Mode Switching in a Volume Dimensioner”, U.S. Appl. No. 15/182,636, filed Jun. 15, 2016, 53 pages, Not yet published.
Bosch Tool Corporation, “Operating/Safety Instruction for DLR 130”, Dated Feb. 2, 2009, 36 pages.
European Search Report for related EP Application No. 16152477.2, dated May 24, 2016, 8 pages.
Mike Stensvold, “get the Most Out of Variable Aperture Lenses”, published on www.OutdoorPhotogrpaher.com; dated Dec. 7, 2010; 4 pages, [As noted on search report retrieved from URL: http://www.outdoorphotographer.com/gear/lenses/get-the-most-out-ofvariable-aperture-lenses.html on Feb. 9, 2016].
Houle et al., "Vehicle Positioning and Object Avoidance", U.S. Appl. No. 15/007,522 [not yet published], filed Jan. 27, 2016, 59 pages.
United Kingdom combined Search and Examination Report in related GB Application No. 1607394.2, dated Oct. 19, 2016, 7 pages.
European Search Report from related EP Application No. 16168216.6, dated Oct. 20, 2016, 8 pages.
Ulusoy, Ali Osman et al.; "One-Shot Scanning using De Bruijn Spaced Grids", Brown University; 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops, pp. 1786-1792 [Cited in EPO Search Report dated Dec. 5, 2017].
Extended European Search report in related EP Application No. 17189496.7 dated Dec. 5, 2017; 9 pages.
Extended European Search report in related EP Application No. 17190323.0 dated Jan. 19, 2018; 6 pages [Only new art cited herein].
Examination Report in related GB Application No. GB1517843.7, dated Jan. 19, 2018, 4 pages [Only new art cited herein].
Examination Report in related EP Application No. 15190315, dated Jan. 26, 2018, 6 pages [Only new art cited herein].
European Extended Search Report in related EP Application No. 16190017.0, dated Jan. 4, 2017, 6 pages.
European Extended Search Report in related EP Application No. 16173429.8, dated Dec. 1, 2016, 8 pages [US 2013/0038881 cited on separate IDS filed concurrently herewith].
Extended European Search Report in related EP Application No. 16175410.0, dated Dec. 13, 2016, 5 pages.
European extended search report in related EP Application 16190833.0, dated Mar. 9, 2017, 8 pages [US Publication 2014/0034731 cited on separate IDS filed concurrently herewith].
United Kingdom Combined Search and Examination Report in related Application No. GB1620676.5, dated Mar. 8, 2017, 6 pages [References cited on separate IDS filed concurrently herewith; WO2014/151746, WO2012/175731, US 2014/0313527, GB2503978].
European Exam Report in related EP Application No. 16168216.6, dated Feb. 27, 2017, 5 pages, [cited on separate IDS filed concurrently herewith; WO2011/017241 and US 2014/0104413].
EP Search Report in related EP Application No. 17171844 dated Sep. 18, 2017, 4 pages [Only new art cited herein; some art has been cited on separate IDS filed concurrently herewith].
EP Extended Search Report in related EP Application No. 17174843.7 dated Oct. 17, 2017, 5 pages [Only new art cited herein; some art has been cited on separate IDS filed concurrently herewith].
UK Further Exam Report in related UK Application No. GB1517842.9, dated Sep. 1, 2017, 5 pages (only new art cited herein; some art cited on separate IDS filed concurrently herewith).
European Exam Report in related EP Application No. 15176943.7, dated Apr. 12, 2017, 6 pages [Art cited on separate IDS filed concurrently herewith].
European Exam Report in related EP Application No. 15188440.0, dated Apr. 21, 2017, 4 pages [Art has been cited on separate IDS filed concurrently herewith.].
European Examination report in related EP Application No. 14181437.6, dated Feb. 8, 2017, 5 pages [References cited on separate IDS filed concurrently herewith].
Chinese Notice of Reexamination in related Chinese Application 201520810313.3, dated Mar. 14, 2017, English Computer Translation provided, 7 pages [References cited on separate IDS filed concurrently herewith].
Extended European search report in related EP Application 16199707.7, dated Apr. 10, 2017, 15 pages.
Ulusoy et al., One-Shot Scanning using De Bruijn Spaced Grids, 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops, 7 pages [Cited in EP Extended search report dated Apr. 10, 2017; NPL 14].
European Exam Report in related EP Application No. 16152477.2, dated Jun. 20, 2017, 4 pages [References cited on separate IDS filed concurrently herewith].
European Exam Report in related EP Application 16172995.9, dated Jul. 6, 2017, 9 pages [References cited on separate IDS filed concurrently herewith].
United Kingdom Search Report in related Application No. GB1700338.5, dated Jun. 30, 2017, 5 pages.
European Search Report in related EP Application No. 17175357.7, dated Aug. 17, 2017, pp. 1-7 [References cited on separate IDS filed concurrently herewith].
Peter Clarke, Actuator Developer Claims Anti-Shake Breakthrough for Smartphone Cams, Electronic Engineering Times, p. 24, May 16, 2011. [Previously cited and copy provided in parent application].
Spiller, Jonathan; Object Localization Using Deformable Templates, Master's Dissertation, University of the Witwatersrand, Johannesburg, South Africa, 2007; 74 pages [Previously cited and copy provided in parent application].
Leotta, Matthew J.; Joseph L. Mundy; Predicting High Resolution Image Edges with a Generic, Adaptive, 3-D Vehicle Model; IEEE Conference on Computer Vision and Pattern Recognition, 2009; 8 pages. [Previously cited and copy provided in parent application].
European Search Report for Application No. EP13186043, dated Feb. 26, 2014 (now EP2722656, dated Apr. 23, 2014), 7 pages [Previously cited and copy provided in parent application].
International Search Report for PCT/US2013/039438 (WO2013166368), dated Oct. 1, 2013, 7 pages [Previously cited and copy provided in parent application].
Lloyd, Ryan and Scott McCloskey, "Recognition of 3D Package Shapes for Single Camera Metrology," IEEE Winter Conference on Applications of Computer Vision, IEEE, Mar. 24, 2014, pp. 99-106 [retrieved on Jun. 16, 2014]. Authors are employees of common Applicant [Previously cited and copy provided in parent application].
European Office Action for Application EP 13186043, dated Jun. 12, 2014 (now EP2722656, dated Apr. 23, 2014), 6 pages [Previously cited and copy provided in parent application].
Zhang, Zhaoxiang; Tieniu Tan, Kaiqi Huang, Yunhong Wang; Three-Dimensional Deformable-Model-based Localization and Recognition of Road Vehicles; IEEE Transactions on Image Processing, vol. 21, No. 1, Jan. 2012, 13 pages. [Previously cited and copy provided in parent application].
U.S. Appl. No. 14/801,023, Tyler Doornenbal et al., filed Jul. 16, 2015, not published yet, Adjusting Dimensioning Results Using Augmented Reality, 39 pages [Previously cited and copy provided in parent application].
Wikipedia, YUV description and definition, downloaded from http://www.wikipeida.org/wiki/YUV on Jun. 29, 2012, 10 pages [Previously cited and copy provided in parent application].
YUV Pixel Format, downloaded from http://www.fource.org/yuv.php on Jun. 29, 2012; 13 pages. [Previously cited and copy provided in parent application].
YUV to RGB Conversion, downloaded from http://www.fource.org/fccyvrgb.php on Jun. 29, 2012; 5 pages [Previously cited and copy provided in parent application].
Benos et al., “Semi-Automatic Dimensioning with Imager of a Portable Device,” U.S. Appl. No. 61/149,912, filed Feb. 4, 2009 (now expired), 56 pages. [Previously cited and copy provided in parent application].
Dimensional Weight—Wikipedia, the Free Encyclopedia, URL=http://en.wikipedia.org/wiki/Dimensional_weight, download date Aug. 1, 2008, 2 pages. [Previously cited and copy provided in parent application].
Dimensioning—Wikipedia, the Free Encyclopedia, URL=http://en.wikipedia.org/wiki/Dimensioning, download date Aug. 1, 2008, 1 page [Previously cited and copy provided in parent application].
European Patent Office Action for Application No. 14157971.4-1906, dated Jul. 16, 2014, 5 pages. [Previously cited and copy provided in parent application].
European Patent Search Report for Application No. 14157971.4-1906, dated Jun. 30, 2014, 6 pages. [Previously cited and copy provided in parent application].
Caulier, Yannick et al., “A New Type of Color-Coded Light Structures for an Adapted and Rapid Determination of Joint Correspondences for 3D Reconstruction.” Proc. of SPIE, vol. 8082 808232-3; 2011; 8 pages [Previously cited and copy provided in parent application].
Kazantsev, Aleksei et al., "Robust Pseudo-Random Coded Colored Structured Light Techniques for 3D Object Model Recovery"; ROSE 2008 IEEE International Workshop on Robotic and Sensors Environments (Oct. 17-18, 2008), 6 pages [Previously cited and copy provided in parent application].
Mouaddib E. et al. “Recent Progress in Structured Light in order to Solve the Correspondence Problem in Stereo Vision” Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Apr. 1997; 7 pages [Previously cited and copy provided in parent application].
Proesmans, Marc et al. “Active Acquisition of 3D Shape for Moving Objects” 0-7803-3258-X/96 1996 IEEE; 4 pages [Previously cited and copy provided in parent application].
Salvi, Joaquim et al. “Pattern Codification Strategies in Structured Light Systems” published in Pattern Recognition; The Journal of the Pattern Recognition Society; Accepted Oct. 2, 2003; 23 pages [Previously cited and copy provided in parent application].
EP Search and Written Opinion Report in related matter EP Application No. 14181437.6, dated Mar. 26, 2015, 7 pages. [Previously cited and copy provided in parent application].
Hetzel, Gunter et al.; "3D Object Recognition from Range Images using Local Feature Histograms," Proceedings 2001 IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2001, Kauai, Hawaii, Dec. 8-14, 2001; pp. 394-399, XP010584149, ISBN: 978-0-7695-1272-3 [Previously cited and copy provided in parent application].
Second Chinese Office Action in related CN Application No. 201520810685.6, dated Mar. 22, 2016, 5 pages, no references. [Previously cited and copy provided in parent application].
European Search Report in related EP Application No. 15190315.0, dated Apr. 1, 2016, 7 pages [Previously cited and copy provided in parent application].
Second Chinese Office Action in related CN Application No. 201520810562.2, dated Mar. 22, 2016, 5 pages, English Translation provided [No references] [Previously cited and copy provided in parent application].
European Search Report for related Application EP 15190249.1, dated Mar. 22, 2016, 7 pages. [Previously cited and copy provided in parent application].
Second Chinese Office Action in related CN Application No. 201520810313.3, dated Mar. 22, 2016, 5 pages. English Translation provided [No references].
U.S. Appl. No. 14/800,757, Eric Todeschini, filed Jul. 16, 2015, not published yet, Dimensioning and Imaging Items, 80 pages [Previously cited and copy provided in parent application].
U.S. Appl. No. 14/747,197, Serge Thuries et al., filed Jun. 23, 2015, not published yet, Optical Pattern Projector; 33 pages [Previously cited and copy provided in parent application].
U.S. Appl. No. 14/747,490, Brian L. Jovanovski et al., filed Jun. 23, 2015, not published yet, Dual-Projector Three-Dimensional Scanner; 40 pages [Previously cited and copy provided in parent application].
Search Report and Opinion in related GB Application No. 1517112.7, dated Feb. 19, 2016, 6 Pages [Previously cited and copy provided in parent application].
U.S. Appl. No. 14/793,149, H. Sprague Ackley, filed Jul. 7, 2015, not published yet, Mobile Dimensioner Apparatus for Use in Commerce; 57 pages [Previously cited and copy provided in parent application].
U.S. Appl. No. 14/740,373, H. Sprague Ackley et al., filed Jun. 16, 2015, not published yet, Calibrating a Volume Dimensioner; 63 pages [Previously cited and copy provided in parent application].
Intention to Grant in counterpart European Application No. 14157971.4 dated Apr. 14, 2015, pp. 1-8 [Previously cited and copy provided in parent application].
Decision to Grant in counterpart European Application No. 14157971.4 dated Aug. 6, 2015, pp. 1-2 [Previously cited and copy provided in parent application].
Leotta, Matthew, Generic, Deformable Models for 3-D Vehicle Surveillance, May 2010, Doctoral Dissertation, Brown University, Providence RI, 248 pages [Previously cited and copy provided in parent application].
Ward, Benjamin, Interactive 3D Reconstruction from Video, Aug. 2012, Doctoral Thesis, University of Adelaide, Adelaide, South Australia, 157 pages [Previously cited and copy provided in parent application].
Hood, Frederick W.; William A. Hoff, Robert King, Evaluation of an Interactive Technique for Creating Site Models from Range Data, Apr. 27-May 1, 1997, Proceedings of the ANS 7th Topical Meeting on Robotics & Remote Systems, Augusta GA, 9 pages [Previously cited and copy provided in parent application].
Gupta, Alok; Range Image Segmentation for 3-D Object Recognition, May 1988, Technical Reports (CIS), Paper 736, University of Pennsylvania Department of Computer and Information Science, retrieved from http://repository.upenn.edu/cis_reports/736, Accessed May 31, 2015, 157 pages [Previously cited and copy provided in parent application].
Reisner-Kollmann, Irene; Anton L. Fuhrmann, Werner Purgathofer, Interactive Reconstruction of Industrial Sites Using Parametric Models, May 2010, Proceedings of the 26th Spring Conference on Computer Graphics SCCG '10, 8 pages [Previously cited and copy provided in parent application].
Drummond, Tom; Roberto Cipolla, Real-Time Visual Tracking of Complex Structures, Jul. 2002, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, No. 7; 15 pages. [Previously cited and copy provided in parent application].
European Search Report for Related EP Application No. 15189214.8, dated Mar. 3, 2016, 9 pages [Previously cited and copy provided in parent application].
Santolaria et al., "A one-step intrinsic and extrinsic calibration method for laser line scanner operation in coordinate measuring machines", dated Apr. 1, 2009, Measurement Science and Technology, IOP, Bristol, GB, vol. 20, No. 4; 12 pages [Previously cited and copy provided in parent application].
Search Report and Opinion in Related EP Application 15176943.7, dated Jan. 8, 2016, 8 pages [Previously cited and copy provided in parent application].
European Search Report for related EP Application No. 15188440.0, dated Mar. 8, 2016, 8 pages. [Previously cited and copy provided in parent application].
United Kingdom Search Report in related application GB1517842.9, dated Apr. 8, 2016, 8 pages [Previously cited and copy provided in parent application].
Great Britain Search Report for related Application No. GB1517843.7, dated Feb. 23, 2016; 8 pages [Previously cited and copy provided in parent application].
United Kingdom Further Exam Report in related Application GB1607394.2, dated Oct. 5, 2018; 5 pages [Only new art cited herein].
European Extended Search Report in related EP application 18184864.9, dated Oct. 30, 2018, 7 pages.
Combined Search and Examination Report in related UK Application No. GB1817189.2 dated Nov. 14, 2018, pp. 1-4 [Reference previously cited.].
Examination Report in related UK Application No. GB1517842.9 dated Dec. 21, 2018, pp. 1-7 [All references previously cited.].
Examination Report in European Application No. 16152477.2 dated Jun. 18, 2019, pp. 1-6.
Examination Report in European Application No. 17175357.7 dated Jun. 26, 2019, pp. 1-5 [All references previously cited.].
Examination Report in European Application No. 19171976.4 dated Jun. 19, 2019, pp. 1-8.
Examination Report in GB Application No. 1607394.2 dated Jul. 5, 2019, pp. 1-4.
Advisory Action (PTOL-303) dated Apr. 7, 2017 for U.S. Appl. No. 13/784,933.
Advisory Action (PTOL-303) dated Aug. 15, 2018 for U.S. Appl. No. 13/784,933.
Advisory Action (PTOL-303) dated Mar. 20, 2018 for U.S. Appl. No. 14/055,383.
Advisory Action (PTOL-303) dated Mar. 23, 2016 for U.S. Appl. No. 13/785,177.
Advisory Action (PTOL-303) dated May 12, 2016 for U.S. Appl. No. 13/784,933.
Advisory Action (PTOL-303) dated Nov. 25, 2016 for U.S. Appl. No. 14/055,383.
Examiner initiated interview summary (PTOL-413B) dated Apr. 12, 2017 for U.S. Appl. No. 13/785,177.
Examiner initiated interview summary (PTOL-413B) dated Aug. 22, 2019 for U.S. Appl. No. 14/055,383.
Final Rejection dated Dec. 11, 2015 for U.S. Appl. No. 13/785,177.
Final Rejection dated Dec. 26, 2017 for U.S. Appl. No. 14/055,383.
Final Rejection dated Dec. 30, 2016 for U.S. Appl. No. 13/784,933.
Final Rejection dated Feb. 1, 2016 for U.S. Appl. No. 13/784,933.
Final Rejection dated Feb. 4, 2019 for U.S. Appl. No. 14/055,383.
Final Rejection dated Jul. 5, 2016 for U.S. Appl. No. 14/055,383.
Final Rejection dated May 16, 2018 for U.S. Appl. No. 13/784,933.
Non-Final Rejection dated Apr. 20, 2017 for U.S. Appl. No. 14/055,383.
Non-Final Rejection dated Aug. 12, 2015 for U.S. Appl. No. 13/784,933.
Non-Final Rejection dated Dec. 16, 2015 for U.S. Appl. No. 14/055,383.
Non-Final Rejection dated Jul. 5, 2018 for U.S. Appl. No. 14/055,383.
Non-Final Rejection dated Jul. 16, 2015 for U.S. Appl. No. 13/785,177.
Non-Final Rejection dated Jun. 30, 2016 for U.S. Appl. No. 13/784,933.
Non-Final Rejection dated Oct. 10, 2017 for U.S. Appl. No. 13/784,933.
Non-Final Rejection dated Sep. 26, 2016 for U.S. Appl. No. 13/785,177.
Notice of Allowance and Fees Due (PTOL-85) dated Apr. 12, 2017 for U.S. Appl. No. 13/785,177.
Notice of Allowance and Fees Due (PTOL-85) dated Oct. 23, 2017 for U.S. Appl. No. 13/785,177.
Notice of Allowance for related U.S. Appl. No. 14/055,234 dated Sep. 8, 2015, 7 pages.
Office Action for related U.S. Appl. No. 14/055,234 dated May 15, 2015, 11 pages.
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned.
Combined Search and Examination Report in related UK Application No. GB1900752.5 dated Feb. 1, 2019, pp. 1-5.
Examination Report in related UK Application No. GB1517842.9 dated Mar. 8, 2019, pp. 1-4.
Examination Report in related EP Application No. 13193181.8 dated Mar. 20, 2019, pp. 1-4.
First Office Action in related CN Application No. 201510860188.1 dated Jan. 18, 2019, pp. 1-14 [All references previously cited.].
Examination Report in related EP Application No. 13785171.3 dated Apr. 2, 2019, pp. 1-5.
Lowe, David G., "Fitting Parameterized Three-Dimensional Models to Images", IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Computer Society, USA, vol. 13, No. 5, May 1, 1991, pp. 441-450.
Applicant Initiated Interview Summary (PTOL-413) dated Jan. 11, 2018 for U.S. Appl. No. 13/784,933.
Applicant Initiated Interview Summary (PTOL-413) dated May 18, 2017 for U.S. Appl. No. 13/784,933.
Related Publications (1)
20180073914 A1, Mar. 2018, US

Provisional Applications (1)
61/714,394, Oct. 2012, US

Continuations (1)
Parent: 13/785,177, Mar. 2013, US
Child: 15/817,840, US