Dimensioning system with guided alignment

Information

  • Patent Grant
  • Patent Number: 9,976,848
  • Date Filed: Wednesday, April 5, 2017
  • Date Issued: Tuesday, May 22, 2018
Abstract
A dimensioning system including a computing device running an alignment software program is disclosed. The alignment software uses range information from a range sensor in order to generate alignment messages. The alignment messages may help a user define a frame of reference and align the dimensioning system's range sensor for improved dimensioning performance.
Description
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims the benefit of U.S. patent application Ser. No. 14/453,019 for a Dimensioning System with Guided Alignment filed Aug. 6, 2014 (and published Feb. 11, 2016 as U.S. Patent Publication No. 2016/0040982), now U.S. Pat. No. 9,625,252. Each of the foregoing patent application, patent publication, and patent is hereby incorporated by reference in its entirety.


FIELD OF THE INVENTION

The present invention relates to the field of dimensioning systems, and more specifically, to a system and method for aligning a package dimensioning system.


BACKGROUND

Generally speaking, freight carriers calculate shipping costs based on package size and weight (i.e., volumetric weight). This helps prevent lightweight packages that require a large amount of space from becoming unprofitable for the freight carriers.


When printing a shipping label for a package, a freight carrier employee is required to enter the package's size and weight into a software application that uses this information to calculate the cost of shipping. Typically, freight carrier employees derive this information through hand measurements (e.g., with a tape measure) and may weigh the package on a scale. Hand measurements are prone to error, particularly when packages have an irregular shape. These errors may lead to dissatisfaction and/or financial loss. For example, when a shipping company determines, after shipping costs are negotiated, that a package is larger and/or heavier than reported by the customer, additional costs may be assessed. In addition, retailers that pass the shipping costs along to customers typically assume the extra shipping costs associated with these errors. As a result, automated dimensioning systems have been developed to bring more accuracy to package volume calculation.


One such automated dimensioning system uses a light projector to project a light pattern (e.g., point cloud) onto objects (e.g., packages) within a field of view. A range camera, physically offset from the light projector, creates a range image from the light pattern reflected from the packages. Software running on a computing device compares the light pattern in the range image to some reference (e.g., a reference image taken during calibration). Through this comparison, the dimensions of a package may be derived.


In order to dimension a specified size range (e.g., a range of package sizes) accurately, the dimensioning system may require a user to position (i.e., align) the range sensor into a particular pose (i.e., height and orientation). This positioning typically takes place during the installation of the dimensioning system. During positioning the pose is computed relative to a reference (i.e., ground) plane that is typically defined prior to positioning. The process of selecting the reference plane and positioning the range sensor is not easily handled by a typical user, and poorly installed range sensors may result in dimensioning errors or the inability to dimension. Therefore, a need exists for a method to assist the user with the selection of a reference plane and the positioning of a range sensor to ensure good performance of the dimensioning system.


SUMMARY

Accordingly, in one aspect, the present invention embraces a package dimensioning system including a range sensor for capturing a series of range images of the range sensor's field of view. The system also includes an adjustable range-sensor support to physically support and position the range sensor in a target pose. A computing device, communicatively coupled to the range sensor, is capable of executing an adjustment software program, which provides adjustment messages to facilitate the adjustment of the range sensor. The adjustment software program configures the computing device to receive the series of range images, process the series of range images to produce the adjustment messages, and transmit the adjustment messages to a display. The display is communicatively coupled to the computing device and displays the series of range images and the adjustment messages.


In another aspect, the present invention embraces a computer implemented method for generating adjustment messages to facilitate the positioning of a range sensor for dimensioning. The method includes the step of recording range images onto a computer-readable storage medium. The method also includes the step of reading the range images from the computer-readable storage medium. In addition, the method includes processing the range images to derive a result. The method further includes the step of generating adjustment messages based on the result.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying figures.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 schematically depicts an exemplary range sensor's stereo arrangement of a pattern projector and a range camera for capturing range information as a pixel displacement.



FIG. 2 graphically depicts an exemplary dimensioning system including a platform and an object for dimensioning.



FIG. 3 schematically depicts an exemplary dimensioning system.



FIG. 4 schematically depicts a flowchart of an exemplary adjustment software program.





DETAILED DESCRIPTION

The present invention embraces a dimensioning system to measure items, such as packages for shipment. The dimensioning system typically uses a range sensor, a computing device, and a display for this measurement. The range sensor may be an optical system that acquires information about a field of view, or it may use another range-sensing modality (e.g., ultrasonic). The optical system requires only that a user place an item within the system's field of view for measurement, making it very easy to use. For the system to give best results, however, the range sensor should be positioned in a range-sensor pose, with respect to a frame of reference shared by the package, that closely matches a target pose (e.g., below an adjustable threshold value of pose difference). The target pose represents the desired range-sensor position (e.g., height, pitch, roll, and/or yaw) with respect to the frame of reference that ensures good dimensioning performance. The target pose is chosen to allow the range sensor to obtain accurate dimensioning results for a given range of package sizes. For example, a target pose may be established so that the field of view contains the largest package to be dimensioned, while also ensuring that the smallest package may be resolved sufficiently for accurate measurement. The target pose is also chosen based on the range sensor's resolution, field of view, and/or other limitations (e.g., specular reflections, multipath interference, and/or mixed pixel responses). Similarly, a target pose may be chosen to minimize depth differences between a foreground package and clutter (e.g., background clutter).


The target pose may be stored on a computer-readable medium (e.g., non-transitory memory) communicatively coupled to the computing device and is typically set once for a particular application. In certain scenarios, however, the target pose could be adjustable. For example, an old target pose could be replaced with a new target pose, or, in another embodiment, target poses could be selected by a user to match a particular dimensioning application. For example, a user might want to replace the range sensor with a new range sensor that has a different sensing modality, resolution, and/or field of view. Here the user could update the target pose to match the new range sensor. Alternatively, the user might want to accommodate a new range of package sizes. Here the user could update the target pose to meet the requirements of the new range of package sizes.
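
As a concrete illustration (not taken from the patent), a stored target pose might be represented as a small record whose field names and values are purely hypothetical:

```python
# A minimal sketch, with hypothetical names and values, of a stored target
# pose that could be swapped out for a new sensor or new package-size range.
from dataclasses import dataclass

@dataclass
class TargetPose:
    height_m: float       # desired sensor height above the reference plane
    pitch_deg: float      # desired orientation relative to the reference plane
    roll_deg: float
    yaw_deg: float
    tolerance_deg: float  # adjustable threshold on the acceptable pose difference

# Updating the stored pose to suit a new range sensor or package-size range:
target = TargetPose(height_m=2.0, pitch_deg=-45.0, roll_deg=0.0,
                    yaw_deg=0.0, tolerance_deg=2.0)
```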


The mathematical representation of a physical pose requires a frame of reference. This frame of reference may be defined with a specified surface (e.g., planar surface) or a line. Alternatively, the frame of reference may be established with a set of 3D points that are arranged in some known way. For example, a pose may be calculated relative to a cylinder placed in front of the camera. Typically, however, a planar surface within the field of view, selected by a user, establishes the frame of reference. For example, the surface that the measured item (e.g., package) rests on during the measurement (e.g., scale or counter-top) may be chosen to serve as the reference surface (i.e., platform). In this way, the orientations of the range-sensor pose and the target pose (each relative to the frame of reference) can be derived mathematically and compared (e.g., compared by rotation matrix or axis-angle representation). For example, rotation vectors, with respect to the frame of reference, could be derived for each pose using Rodrigues' rotation formula. The angle between the two rotation vectors could then be computed and compared to obtain the difference between the range-sensor pose and the target pose (i.e., pose difference).
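
The comparison itself can be sketched with standard rotation algebra. The following is a minimal illustration (numpy, with illustrative names), not the patent's implementation: it computes the geodesic angle between two rotation matrices, each expressed relative to the reference plane.

```python
# A minimal sketch of comparing the range-sensor pose to the target pose:
# both orientations are 3x3 rotation matrices relative to the frame of
# reference; the returned angle is the orientation part of the pose difference.
import numpy as np

def rotation_angle_between(r_sensor: np.ndarray, r_target: np.ndarray) -> float:
    """Geodesic angle (radians) between two rotation matrices."""
    r_rel = r_target @ r_sensor.T              # rotation carrying sensor pose onto target pose
    cos_theta = (np.trace(r_rel) - 1.0) / 2.0  # trace formula for the axis-angle magnitude
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
```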


The process of establishing the platform and aligning the range-sensor pose with a target pose may be made easier through the use of software (e.g., one or more executable files, libraries, and/or scripts) to generate guidance advice for sensor alignment. Here, the adjustment software (i.e., adjustment software program) receives range images from the range sensor and produces feedback (i.e., adjustment messages) to help a user align the range sensor.


Range images are typically single-channel (e.g., gray scale) images that represent the distance between the range camera and the portion of the field of view represented by a pixel. Using these range images, the adjustment software may detect planar surfaces using an algorithm. For example, a random sample consensus (i.e., RANSAC) algorithm may identify planar surfaces within the range sensor's field of view. In the case where more than one planar surface is detected within the range camera's field of view, each planar surface may be indicated in a visual image presented on a display. In one possible embodiment, each planar surface may be indicated by an overlay (e.g., semi-transparent and/or colored overlay) image superimposed on the range image and presented on the display. In other possible embodiments, the reference surface may otherwise be highlighted (e.g., an outline). A prompt, generated by the adjustment software, may query a user to indicate which planar surface should be established as the reference surface (i.e., ground plane). Once the ground plane (i.e., platform) is established, the adjustment software may use the range image to compute the range camera's height and orientation (i.e., roll, pitch, and/or yaw) with respect to the ground plane.
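
A RANSAC plane fit of the kind referenced above can be sketched in a few lines. This is a generic illustration over an N×3 point cloud (e.g., back-projected from a range image), with illustrative iteration count and tolerance, not the patent's code:

```python
# A minimal RANSAC plane-detection sketch over an N x 3 point cloud derived
# from a range image; the parameters are illustrative assumptions.
import numpy as np

def ransac_plane(points: np.ndarray, iters: int = 500, tol_m: float = 0.01):
    """Return (normal, d, inlier_mask) for the dominant plane n.p + d = 0."""
    rng = np.random.default_rng(0)
    best_count, best = -1, None
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                         # skip degenerate (collinear) samples
            continue
        n = n / norm
        d = -float(np.dot(n, p0))
        mask = np.abs(points @ n + d) < tol_m   # point-to-plane distance test
        if mask.sum() > best_count:
            best_count, best = int(mask.sum()), (n, d, mask)
    return best
```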


During range sensor alignment (e.g., during installation of a package dimensioning system) the adjustment software may use the computed range sensor height and orientation to provide adjustment messages. These adjustment messages include indications of the adjustments necessary to align the range sensor's physical pose with the target pose. This feedback may be audible or visual. Visual messages could be text messages or graphical images displayed alone, in addition to, and/or superimposed on other images (e.g., range images, color images, or point-cloud images). By following these adjustment messages, a user may adjust the range-sensor support (e.g., adjustable tripod mount, pole mount, ceiling mount, and/or wall mount) to move the range sensor closer to the target pose. Many adjustment messages may be generated during the alignment process. In one possible embodiment, the process of analyzing range images and providing alignment messages (e.g., “move camera up”) may continue iteratively until the range sensor is aligned with the target pose. Once aligned, the adjustment software may provide an adjustment message indicating that the range sensor is in position and that the user should stop adjusting and secure the support. In another possible embodiment, the software may provide alignment messages that indicate the alignment of the range-sensor pose with the target pose in qualitative terms (e.g., good, better, or best). In still another possible embodiment, the software may simply provide real-time alignment information (e.g., pose difference results in numerical form) and allow the user to decide the ultimate alignment criteria.
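
One way such directional messages could be produced, purely as an illustration with made-up tolerances and message strings, is to compare each pose component against its target:

```python
# A minimal sketch of turning pose errors into the directional adjustment
# messages described above; tolerances and wording are assumptions.
def adjustment_messages(height_err_m: float, pitch_err_deg: float,
                        height_tol_m: float = 0.02, angle_tol_deg: float = 1.0):
    """height_err_m / pitch_err_deg are (measured - target) differences."""
    messages = []
    if height_err_m > height_tol_m:
        messages.append("move camera down")
    elif height_err_m < -height_tol_m:
        messages.append("move camera up")
    if pitch_err_deg > angle_tol_deg:
        messages.append("tilt camera down")
    elif pitch_err_deg < -angle_tol_deg:
        messages.append("tilt camera up")
    return messages or ["range sensor in position - secure the support"]
```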


While the adjustment software is typically used during the installation of the package dimensioning system, it may also be used periodically after the installation. For example, the adjustment software program may be configured to periodically check the range sensor's pose and compare this with the target pose. If the difference between the two poses (i.e., pose difference) is above some threshold value (e.g., from a misalignment caused by mechanical movement or vibration), the adjustment software may provide messages to alert a user that the range sensor is no longer in alignment and that an adjustment is necessary.
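
Such a periodic re-check could be as simple as the following loop, a sketch with hypothetical callables for reading the live pose difference and raising an alert; neither the callables nor the one-hour period comes from the patent:

```python
# A minimal sketch of a post-installation alignment monitor.
import time

def monitor_alignment(get_pose_difference, alert,
                      threshold_deg: float = 2.0, period_s: float = 3600.0):
    """Periodically compare the live pose difference against a threshold."""
    while True:
        if get_pose_difference() > threshold_deg:   # e.g., drift from vibration
            alert("range sensor out of alignment - adjustment needed")
        time.sleep(period_s)
```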


The alignment process may happen in real-time, with the display rendering real-time range images while simultaneously displaying adjustment messages. In one embodiment, for example, the adjustment software may guide the user to first adjust the range sensor's height and then adjust its orientation (i.e., roll, pitch, and/or yaw) separately and sequentially. In another embodiment, the software may allow a user to adjust the range sensor's height and orientation simultaneously.


Three-dimensional (i.e., 3D) sensors (e.g., range sensors) can be utilized effectively in dimensioning applications. The recent advent of relatively low-cost range sensors that can detect and display three-dimensional information has afforded greater opportunity for implementing automated dimensioning on a wider scale. Consequently, the package-dimensioning system disclosed here may include a range sensor to acquire a two-dimensional gray scale image that conveys range on a pixel-by-pixel basis (i.e., a range image). In an exemplary range image, darker pixels may indicate a point that is a shorter distance away from the range sensor than points represented by lighter pixels.
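
Rendering a depth map this way is straightforward; the following sketch (numpy, with an assumed linear scaling) maps nearer points to darker pixel values:

```python
# A minimal sketch of rendering a range image as a single-channel gray scale
# picture in which nearer points appear darker; the scaling is an assumption.
import numpy as np

def depth_to_gray(depth_m: np.ndarray) -> np.ndarray:
    """Map a float depth map (meters) to uint8, with darker = closer."""
    near, far = np.nanmin(depth_m), np.nanmax(depth_m)
    scaled = (depth_m - near) / max(far - near, 1e-9)   # 0.0 at the nearest point
    return (scaled * 255.0).astype(np.uint8)
```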


In the embodiment shown in FIG. 1, the range sensor includes a projector 1 and a range camera 2. The projector 1 may radiate a light pattern onto an item 4 within a field of view 3. The reflected light pattern 6 from the item may be imaged and detected by the range camera 2. If the item's range 7 changes, the range camera may sense this change as a displacement 5 in the detected light pattern. A processor within the range sensor may convert this range information into a range image. In this way, the pattern projector and range camera may together help to produce a range image. The projector 1 and the range camera 2 are positioned collinearly and are codirected towards the same field of view (i.e., are positioned in a stereo arrangement). The light from the projector (e.g., the point cloud) may be visible but is typically invisible to the human eye. The range camera is sensitive to this light. In some embodiments, the range sensor also includes a color (i.e., RGB) camera that is sensitive to visible light and which shares the field of view 3 with the projector and range camera. This color camera may be used to display images for a user that are easily interpreted and less confusing than the gray scale range images or point cloud images. In another possible embodiment, the range images may be displayed during the alignment process. In yet another possible embodiment, the raw images including the projected light pattern (point cloud images) may be displayed during the alignment process.
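
The displacement-to-range relationship is ordinary stereo triangulation. As an illustration, with made-up calibration values for the focal length and the projector-camera baseline (the patent does not specify any):

```python
# A minimal sketch of converting the detected pattern displacement (pixels)
# into range via triangulation; f_px and baseline_m are assumed calibration
# values, not figures from the patent.
def depth_from_disparity(disparity_px: float,
                         f_px: float = 580.0,       # focal length in pixels
                         baseline_m: float = 0.075  # projector-camera offset
                         ) -> float:
    """Range (meters) corresponding to an observed pattern displacement."""
    return f_px * baseline_m / disparity_px

# Example: a pattern dot displaced by 29 pixels lies about 1.5 m away.
assert abs(depth_from_disparity(29.0) - 1.5) < 0.01
```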


An exemplary package dimensioning system is graphically shown in FIG. 2. Here the range sensor 10 is physically supported and positioned by the range-sensor support 11. The support helps configure the range-sensor pose, which is defined by the range sensor's height 12 and orientation. The orientation may include the range sensor's pitch 13, yaw 14, and/or roll 15. A user may configure the range-sensor pose to match a target pose through the use of an adjustable range-sensor support 11. This support is shown in this embodiment as a tripod, though other support mechanisms (e.g., pole-mount, wall-mount, or ceiling-mount) may be used. An adjustment software program executed by a computing device 16 may display adjustment messages on a display 13. The range-sensor pose and the target pose are relative to a platform 18 (i.e., reference plane or ground plane) that a package 17 is placed on for dimensioning. The platform may be selected by a user from a plurality of planar surfaces detected within the field of view before the poses are calculated.


The schematic of the package dimensioning system, including a computing device 24 for package dimensioning, is shown in FIG. 3. Here a range sensor 20 includes a pattern projector 27 for creating a light pattern that can be imaged by the range camera 22 and mathematically transformed into a range image. The range image is transmitted from the range sensor 20 to a processor 23 that is integrated in the computing device 24 and communicatively coupled to the range sensor. The processor may store the range image in a computer-readable storage medium 25. Adjustment software stored in the storage medium 25 may configure the processor 23 to execute the program steps required for generating the adjustment messages necessary to facilitate the positioning of the range sensor 20 for dimensioning. The processor may transmit the adjustment messages to a display 26. These messages may be displayed along with an image of the range sensor's field of view. This image of the field of view may be the range camera's 22 image or may be a color image created by a color camera 21 configured with the same field of view as the range camera 22. The color camera is optional but may enhance the user's experience, as these images may be easier to understand than the gray scale range images.


As shown in FIG. 4, the adjustment software program 36 operates on range information (e.g., range images 31) to produce feedback information (e.g., adjustment messages 39); FIG. 4 thus illustrates the method for generating adjustment messages to facilitate the positioning of a range sensor for package dimensioning. The range sensor 30 produces a range image 31. The software analyzes the range image to detect planar surfaces within the field of view. A user may then be prompted to select a reference plane (i.e., platform) from the detected planar surfaces. Alternatively, the software may detect and select a platform automatically. Once the platform 32 is detected, a target pose 33 may be computed based on a stored pose 34. The stored pose information may be based on the range camera's field of view, the range of expected package sizes, and/or the resolution of the range image. This information may be stored in the computing device's non-transitory, computer-readable storage medium (e.g., hard drive). The range-sensor pose 35 may also be mathematically computed from the range image 31 and the platform 32. Mathematically, a pose may be defined as a vector relative to the platform 32. The adjustment software program 36 then computes the difference between the target pose 33 and the range-sensor pose 35 to determine a pose difference 37 (e.g., vector difference). If the pose difference is zero (or below some threshold value), then the camera is considered aligned; however, if the pose difference is above the threshold value, then a desired action 38 to minimize the pose difference is computed. Based on the desired action 38, an adjustment message 39 is created. This adjustment message is then transmitted with the range camera's image to the display 40 for viewing. The adjustment message could be a text message or a graphical image. In one possible embodiment, an arrow graphic indicating the direction to move the range sensor 30 could be overlaid with the range image 31 on the display 40. In another embodiment, the adjustment message could provide quantitative measurements (e.g., move camera up 10 cm). In another possible embodiment, the adjustment messages may be audio messages transmitted to a speaker for broadcast.
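
Put together, one pass of the FIG. 4 loop could look like the following sketch. The helper callables stand in for the steps described above (e.g., the RANSAC and rotation-angle sketches earlier) and are hypothetical, not the patent's functions:

```python
# A minimal sketch of one iteration of the FIG. 4 flow; detect_platform,
# compute_pose, and pose_difference are assumed helpers, and the threshold
# value is illustrative.
def alignment_step(range_image, stored_target_pose, detect_platform,
                   compute_pose, pose_difference, threshold_deg: float = 2.0):
    platform = detect_platform(range_image)            # platform 32 (user- or auto-selected)
    sensor_pose = compute_pose(range_image, platform)  # range-sensor pose 35
    diff = pose_difference(sensor_pose, stored_target_pose)  # pose difference 37
    if diff <= threshold_deg:
        return "range sensor aligned - stop adjusting"
    return f"adjust range sensor (pose difference {diff:.1f} deg)"  # action 38 -> message 39
```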


To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:

  • U.S. Pat. No. 6,832,725; U.S. Pat. No. 7,128,266;
  • U.S. Pat. No. 7,159,783; U.S. Pat. No. 7,413,127;
  • U.S. Pat. No. 7,726,575; U.S. Pat. No. 8,294,969;
  • U.S. Pat. No. 8,317,105; U.S. Pat. No. 8,322,622;
  • U.S. Pat. No. 8,366,005; U.S. Pat. No. 8,371,507;
  • U.S. Pat. No. 8,376,233; U.S. Pat. No. 8,381,979;
  • U.S. Pat. No. 8,390,909; U.S. Pat. No. 8,408,464;
  • U.S. Pat. No. 8,408,468; U.S. Pat. No. 8,408,469;
  • U.S. Pat. No. 8,424,768; U.S. Pat. No. 8,448,863;
  • U.S. Pat. No. 8,457,013; U.S. Pat. No. 8,459,557;
  • U.S. Pat. No. 8,469,272; U.S. Pat. No. 8,474,712;
  • U.S. Pat. No. 8,479,992; U.S. Pat. No. 8,490,877;
  • U.S. Pat. No. 8,517,271; U.S. Pat. No. 8,523,076;
  • U.S. Pat. No. 8,528,818; U.S. Pat. No. 8,544,737;
  • U.S. Pat. No. 8,548,242; U.S. Pat. No. 8,548,420;
  • U.S. Pat. No. 8,550,335; U.S. Pat. No. 8,550,354;
  • U.S. Pat. No. 8,550,357; U.S. Pat. No. 8,556,174;
  • U.S. Pat. No. 8,556,176; U.S. Pat. No. 8,556,177;
  • U.S. Pat. No. 8,559,767; U.S. Pat. No. 8,599,957;
  • U.S. Pat. No. 8,561,895; U.S. Pat. No. 8,561,903;
  • U.S. Pat. No. 8,561,905; U.S. Pat. No. 8,565,107;
  • U.S. Pat. No. 8,571,307; U.S. Pat. No. 8,579,200;
  • U.S. Pat. No. 8,583,924; U.S. Pat. No. 8,584,945;
  • U.S. Pat. No. 8,587,595; U.S. Pat. No. 8,587,697;
  • U.S. Pat. No. 8,588,869; U.S. Pat. No. 8,590,789;
  • U.S. Pat. No. 8,596,539; U.S. Pat. No. 8,596,542;
  • U.S. Pat. No. 8,596,543; U.S. Pat. No. 8,599,271;
  • U.S. Pat. No. 8,599,957; U.S. Pat. No. 8,600,158;
  • U.S. Pat. No. 8,600,167; U.S. Pat. No. 8,602,309;
  • U.S. Pat. No. 8,608,053; U.S. Pat. No. 8,608,071;
  • U.S. Pat. No. 8,611,309; U.S. Pat. No. 8,615,487;
  • U.S. Pat. No. 8,616,454; U.S. Pat. No. 8,621,123;
  • U.S. Pat. No. 8,622,303; U.S. Pat. No. 8,628,013;
  • U.S. Pat. No. 8,628,015; U.S. Pat. No. 8,628,016;
  • U.S. Pat. No. 8,629,926; U.S. Pat. No. 8,630,491;
  • U.S. Pat. No. 8,635,309; U.S. Pat. No. 8,636,200;
  • U.S. Pat. No. 8,636,212; U.S. Pat. No. 8,636,215;
  • U.S. Pat. No. 8,636,224; U.S. Pat. No. 8,638,806;
  • U.S. Pat. No. 8,640,958; U.S. Pat. No. 8,640,960;
  • U.S. Pat. No. 8,643,717; U.S. Pat. No. 8,646,692;
  • U.S. Pat. No. 8,646,694; U.S. Pat. No. 8,657,200;
  • U.S. Pat. No. 8,659,397; U.S. Pat. No. 8,668,149;
  • U.S. Pat. No. 8,678,285; U.S. Pat. No. 8,678,286;
  • U.S. Pat. No. 8,682,077; U.S. Pat. No. 8,687,282;
  • U.S. Pat. No. 8,692,927; U.S. Pat. No. 8,695,880;
  • U.S. Pat. No. 8,698,949; U.S. Pat. No. 8,717,494;
  • U.S. Pat. No. 8,717,494; U.S. Pat. No. 8,720,783;
  • U.S. Pat. No. 8,723,804; U.S. Pat. No. 8,723,904;
  • U.S. Pat. No. 8,727,223; U.S. Pat. No. D702,237;
  • International Publication No. 2013/163789;
  • International Publication No. 2013/173985;
  • International Publication No. 2014/019130;
  • U.S. Patent Application Publication No. 2008/0185432;
  • U.S. Patent Application Publication No. 2009/0134221;
  • U.S. Patent Application Publication No. 2010/0177080;
  • U.S. Patent Application Publication No. 2010/0177076;
  • U.S. Patent Application Publication No. 2010/0177707;
  • U.S. Patent Application Publication No. 2010/0177749;
  • U.S. Patent Application Publication No. 2011/0202554;
  • U.S. Patent Application Publication No. 2012/0111946;
  • U.S. Patent Application Publication No. 2012/0138685;
  • U.S. Patent Application Publication No. 2012/0168511;
  • U.S. Patent Application Publication No. 2012/0168512;
  • U.S. Patent Application Publication No. 2012/0193407;
  • U.S. Patent Application Publication No. 2012/0193423;
  • U.S. Patent Application Publication No. 2012/0203647;
  • U.S. Patent Application Publication No. 2012/0223141;
  • U.S. Patent Application Publication No. 2012/0228382;
  • U.S. Patent Application Publication No. 2012/0248188;
  • U.S. Patent Application Publication No. 2013/0043312;
  • U.S. Patent Application Publication No. 2013/0056285;
  • U.S. Patent Application Publication No. 2013/0070322;
  • U.S. Patent Application Publication No. 2013/0075168;
  • U.S. Patent Application Publication No. 2013/0082104;
  • U.S. Patent Application Publication No. 2013/0175341;
  • U.S. Patent Application Publication No. 2013/0175343;
  • U.S. Patent Application Publication No. 2013/0200158;
  • U.S. Patent Application Publication No. 2013/0214048;
  • U.S. Patent Application Publication No. 2013/0256418;
  • U.S. Patent Application Publication No. 2013/0257744;
  • U.S. Patent Application Publication No. 2013/0257759;
  • U.S. Patent Application Publication No. 2013/0270346;
  • U.S. Patent Application Publication No. 2013/0278425;
  • U.S. Patent Application Publication No. 2013/0287258;
  • U.S. Patent Application Publication No. 2013/0292474;
  • U.S. Patent Application Publication No. 2013/0292475;
  • U.S. Patent Application Publication No. 2013/0292477;
  • U.S. Patent Application Publication No. 2013/0293539;
  • U.S. Patent Application Publication No. 2013/0293540;
  • U.S. Patent Application Publication No. 2013/0306728;
  • U.S. Patent Application Publication No. 2013/0306730;
  • U.S. Patent Application Publication No. 2013/0306731;
  • U.S. Patent Application Publication No. 2013/0306734;
  • U.S. Patent Application Publication No. 2013/0307964;
  • U.S. Patent Application Publication No. 2013/0308625;
  • U.S. Patent Application Publication No. 2013/0313324;
  • U.S. Patent Application Publication No. 2013/0313325;
  • U.S. Patent Application Publication No. 2013/0313326;
  • U.S. Patent Application Publication No. 2013/0327834;
  • U.S. Patent Application Publication No. 2013/0341399;
  • U.S. Patent Application Publication No. 2013/0342717;
  • U.S. Patent Application Publication No. 2014/0001267;
  • U.S. Patent Application Publication No. 2014/0002828;
  • U.S. Patent Application Publication No. 2014/0008430;
  • U.S. Patent Application Publication No. 2014/0008439;
  • U.S. Patent Application Publication No. 2014/0021256;
  • U.S. Patent Application Publication No. 2014/0025584;
  • U.S. Patent Application Publication No. 2014/0027518;
  • U.S. Patent Application Publication No. 2014/0034723;
  • U.S. Patent Application Publication No. 2014/0034734;
  • U.S. Patent Application Publication No. 2014/0036848;
  • U.S. Patent Application Publication No. 2014/0039693;
  • U.S. Patent Application Publication No. 2014/0042814;
  • U.S. Patent Application Publication No. 2014/0049120;
  • U.S. Patent Application Publication No. 2014/0049635;
  • U.S. Patent Application Publication No. 2014/0061305;
  • U.S. Patent Application Publication No. 2014/0061306;
  • U.S. Patent Application Publication No. 2014/0061307;
  • U.S. Patent Application Publication No. 2014/0063289;
  • U.S. Patent Application Publication No. 2014/0066136;
  • U.S. Patent Application Publication No. 2014/0067692;
  • U.S. Patent Application Publication No. 2014/0070005;
  • U.S. Patent Application Publication No. 2014/0071840;
  • U.S. Patent Application Publication No. 2014/0074746;
  • U.S. Patent Application Publication No. 2014/0075846;
  • U.S. Patent Application Publication No. 2014/0076974;
  • U.S. Patent Application Publication No. 2014/0078341;
  • U.S. Patent Application Publication No. 2014/0078342;
  • U.S. Patent Application Publication No. 2014/0078345;
  • U.S. Patent Application Publication No. 2014/0084068;
  • U.S. Patent Application Publication No. 2014/0086348;
  • U.S. Patent Application Publication No. 2014/0097249;
  • U.S. Patent Application Publication No. 2014/0098284;
  • U.S. Patent Application Publication No. 2014/0098792;
  • U.S. Patent Application Publication No. 2014/0100774;
  • U.S. Patent Application Publication No. 2014/0100813;
  • U.S. Patent Application Publication No. 2014/0103115;
  • U.S. Patent Application Publication No. 2014/0104413;
  • U.S. Patent Application Publication No. 2014/0104414;
  • U.S. Patent Application Publication No. 2014/0104416;
  • U.S. Patent Application Publication No. 2014/0104451;
  • U.S. Patent Application Publication No. 2014/0106594;
  • U.S. Patent Application Publication No. 2014/0106725;
  • U.S. Patent Application Publication No. 2014/0108010;
  • U.S. Patent Application Publication No. 2014/0108402;
  • U.S. Patent Application Publication No. 2014/0108682;
  • U.S. Patent Application Publication No. 2014/0110485;
  • U.S. Patent Application Publication No. 2014/0114530;
  • U.S. Patent Application Publication No. 2014/0124577;
  • U.S. Patent Application Publication No. 2014/0124579;
  • U.S. Patent Application Publication No. 2014/0125842;
  • U.S. Patent Application Publication No. 2014/0125853;
  • U.S. Patent Application Publication No. 2014/0125999;
  • U.S. Patent Application Publication No. 2014/0129378;
  • U.S. patent application Ser. No. 13/367,978 for a Laser Scanning Module Employing An Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.);
  • U.S. patent application Ser. No. 29/436,337 for an Electronic Device, filed Nov. 5, 2012 (Fitch et al.);
  • U.S. patent application Ser. No. 13/736,139 for an Electronic Device Enclosure, filed Jan. 8, 2013 (Chaney);
  • U.S. patent application Ser. No. 13/771,508 for an Optical Redirection Adapter, filed Feb. 20, 2013 (Anderson);
  • U.S. patent application Ser. No. 13/780,356 for a Mobile Device Having Object-Identification Interface, filed Feb. 28, 2013 (Samek et al.);
  • U.S. patent application Ser. No. 13/852,097 for a System and Method for Capturing and Preserving Vehicle Event Data, filed Mar. 28, 2013 (Barker et al.);
  • U.S. patent application Ser. No. 13/902,110 for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Hollifield);
  • U.S. patent application Ser. No. 13/902,144, for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Chamberlin);
  • U.S. patent application Ser. No. 13/902,242 for a System For Providing A Continuous Communication Link With A Symbol Reading Device, filed May 24, 2013 (Smith et al.);
  • U.S. patent application Ser. No. 13/912,262 for a Method of Error Correction for 3D Imaging Device, filed Jun. 7, 2013 (Jovanovski et al.);
  • U.S. patent application Ser. No. 13/912,702 for a System and Method for Reading Code Symbols at Long Range Using Source Power Control, filed Jun. 7, 2013 (Xian et al.);
  • U.S. patent application Ser. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 13/922,339 for a System and Method for Reading Code Symbols Using a Variable Field of View, filed Jun. 20, 2013 (Xian et al.);
  • U.S. patent application Ser. No. 13/927,398 for a Code Symbol Reading System Having Adaptive Autofocus, filed Jun. 26, 2013 (Todeschini);
  • U.S. patent application Ser. No. 13/930,913 for a Mobile Device Having an Improved User Interface for Reading Code Symbols, filed Jun. 28, 2013 (Gelay et al.);
  • U.S. patent application Ser. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/459,681 for an Electronic Device Enclosure, filed Jul. 2, 2013 (Chaney et al.);
  • U.S. patent application Ser. No. 13/933,415 for an Electronic Device Case, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/459,785 for a Scanner and Charging Base, filed Jul. 3, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 29/459,823 for a Scanner, filed Jul. 3, 2013 (Zhou et al.);
  • U.S. patent application Ser. No. 13/947,296 for a System and Method for Selectively Reading Code Symbols, filed Jul. 22, 2013 (Rueblinger et al.);
  • U.S. patent application Ser. No. 13/950,544 for a Code Symbol Reading System Having Adjustable Object Detection, filed Jul. 25, 2013 (Jiang);
  • U.S. patent application Ser. No. 13/961,408 for a Method for Manufacturing Laser Scanners, filed Aug. 7, 2013 (Saber et al.);
  • U.S. patent application Ser. No. 14/018,729 for a Method for Operating a Laser Scanner, filed Sep. 5, 2013 (Feng et al.);
  • U.S. patent application Ser. No. 14/019,616 for a Device Having Light Source to Reduce Surface Pathogens, filed Sep. 6, 2013 (Todeschini);
  • U.S. patent application Ser. No. 14/023,762 for a Handheld Indicia Reader Having Locking Endcap, filed Sep. 11, 2013 (Gannon);
  • U.S. patent application Ser. No. 14/035,474 for Augmented-Reality Signature Capture, filed Sep. 24, 2013 (Todeschini);
  • U.S. patent application Ser. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/047,896 for Terminal Having Illumination and Exposure Control filed Oct. 7, 2013 (Jovanovski et al.);
  • U.S. patent application Ser. No. 14/053,175 for Imaging Apparatus Having Imaging Assembly, filed Oct. 14, 2013 (Barber);
  • U.S. patent application Ser. No. 14/055,234 for Dimensioning System, filed Oct. 16, 2013 (Fletcher);
  • U.S. patent application Ser. No. 14/053,314 for Indicia Reader, filed Oct. 14, 2013 (Huck);
  • U.S. patent application Ser. No. 14/065,768 for Hybrid System and Method for Reading Indicia, filed Oct. 29, 2013 (Meier et al.);
  • U.S. patent application Ser. No. 14/074,746 for Self-Checkout Shopping System, filed Nov. 8, 2013 (Hejl et al.);
  • U.S. patent application Ser. No. 14/074,787 for Method and System for Configuring Mobile Devices via NFC Technology, filed Nov. 8, 2013 (Smith et al.);
  • U.S. patent application Ser. No. 14/087,190 for Optimal Range Indicators for Bar Code Validation, filed Nov. 22, 2013 (Hejl);
  • U.S. patent application Ser. No. 14/094,087 for Method and System for Communicating Information in an Digital Signal, filed Dec. 2, 2013 (Peake et al.);
  • U.S. patent application Ser. No. 14/101,965 for High Dynamic-Range Indicia Reading System, filed Dec. 10, 2013 (Xian);
  • U.S. patent application Ser. No. 14/118,400 for Indicia Decoding Device with Security Lock, filed Nov. 18, 2013 (Liu);
  • U.S. patent application Ser. No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.);
  • U.S. patent application Ser. No. 14/154,207 for Laser Barcode Scanner, filed Jan. 14, 2014 (Hou et al.);
  • U.S. patent application Ser. No. 14/154,915 for Laser Scanning Module Employing a Laser Scanning Assembly having Elastomeric Wheel Hinges, filed Jan. 14, 2014 (Havens et al.);
  • U.S. patent application Ser. No. 14/158,126 for Methods and Apparatus to Change a Feature Set on Data Collection Devices, filed Jan. 17, 2014 (Berthiaume et al.);
  • U.S. patent application Ser. No. 14/159,074 for Wireless Mesh Point Portable Data Terminal, filed Jan. 20, 2014 (Wang et al.);
  • U.S. patent application Ser. No. 14/159,509 for MMS Text Messaging for Hand Held Indicia Reader, filed Jan. 21, 2014 (Kearney);
  • U.S. patent application Ser. No. 14/159,603 for Decodable Indicia Reading Terminal with Optical Filter, filed Jan. 21, 2014 (Ding et al.);
  • U.S. patent application Ser. No. 14/160,645 for Decodable Indicia Reading Terminal with Indicia Analysis Functionality, filed Jan. 22, 2014 (Nahill et al.);
  • U.S. patent application Ser. No. 14/161,875 for System and Method to Automatically Discriminate Between Different Data Types, filed Jan. 23, 2014 (Wang);
  • U.S. patent application Ser. No. 14/165,980 for System and Method for Measuring Irregular Objects with a Single Camera filed Jan. 28, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/166,103 for Indicia Reading Terminal Including Optical Filter filed Jan. 28, 2014 (Lu et al.);
  • U.S. patent application Ser. No. 14/176,417 for Devices and Methods Employing Dual Target Auto Exposure filed Feb. 10, 2014 (Meier et al.);
  • U.S. patent application Ser. No. 14/187,485 for Indicia Reading Terminal with Color Frame Processing filed Feb. 24, 2014 (Ren et al.);
  • U.S. patent application Ser. No. 14/200,405 for Indicia Reader for Size-Limited applications filed Mar. 7, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/222,994 for Method and Apparatus for Reading Optical Indicia Using a Plurality of Data filed Mar. 24, 2014 (Smith et al.);
  • U.S. patent application Ser. No. 14/230,322 for Focus Module and Components with Actuator filed Mar. 31, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/249,497 for Terminal Having Plurality of Operating Modes filed Apr. 10, 2014 (Grunow et al.);
  • U.S. patent application Ser. No. 14/250,923 for Reading Apparatus Having Partial Frame Operating Mode filed Apr. 11, 2014 (Deng et al.);
  • U.S. patent application Ser. No. 14/257,174 for Imaging Terminal Having Data Compression filed Apr. 21, 2014 (Barber et al.);
  • U.S. patent application Ser. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering);
  • U.S. patent application Ser. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/274,858 for Mobile Printer with Optional Battery Accessory filed May 12, 2014 (Marty et al.);
  • U.S. patent application Ser. No. 14/342,544 for Imaging Based Barcode Scanner Engine with Multiple Elements Supported on a Common Printed Circuit Board filed Mar. 4, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/342,551 for Terminal Having Image Data Format Conversion filed Mar. 4, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/345,735 for Optical Indicia Reading Terminal with Combined Illumination filed Mar. 19, 2014 (Ouyang);
  • U.S. patent application Ser. No. 29/486,759 for an Imaging Terminal, filed Apr. 2, 2014 (Oberpriller et al.); and
  • U.S. patent application Ser. No. 14/355,613 for Optical Indicia Reading Terminal with Color Image Sensor filed May 1, 2014 (Lu et al.).


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A dimensioning system comprising: a range sensor for capturing range images of a field of view; a range-sensor support to physically support and position the range sensor in a range-sensor pose; a computing device communicatively coupled to the range sensor; a display communicatively coupled to the computing device; and a memory comprising adjustment software and dimensioning software; wherein, when the computing device executes the adjustment software, the adjustment software configures the computing device to: (i) receive range images from the range sensor, (ii) compute, using at least one of the range images, a pose difference, wherein the pose difference comprises a difference between an initial range-sensor pose and a target range-sensor pose, (iii) generate, based on the pose difference, at least one adjustment message to facilitate adjustment of the range sensor pose to more closely align with the target range-sensor pose, and (iv) transmit the at least one adjustment message to the display communicatively coupled to the computing device, and wherein, when the computing device executes the dimensioning software, the dimensioning software configures the computing device to derive, from at least one of the range images, dimensions of an object in the field of view.
  • 2. The dimensioning system according to claim 1, wherein the adjustment software generates adjustment messages until the pose difference is minimized below an adjustable threshold value.
  • 3. The dimensioning system according to claim 1, wherein the range sensor comprises a stereo arrangement of (i) a pattern projector for projecting a light pattern within a field of view and (ii) a range camera for capturing images of the reflected light pattern.
  • 4. The dimensioning system according to claim 3, wherein the light pattern is invisible.
  • 5. The dimensioning system according to claim 1, wherein the range sensor comprises a color camera for capturing color images of visible light within the field of view.
  • 6. The dimensioning system according to claim 1, wherein the adjustment software further configures the computing device to detect a frame of reference in the range images.
  • 7. The dimensioning system according to claim 6, wherein the adjustment messages comprise indications for a user to select a platform from the detected frames of reference.
  • 8. The dimensioning system according to claim 6, wherein the adjustment software further configures the computing device to select a platform from the detected frames of reference.
  • 9. The dimensioning system according to claim 1, wherein the display displays two-dimensional gray scale images representative of range images from the range sensor.
  • 10. The dimensioning system according to claim 1, wherein the target range-sensor pose is stored on a non-transitory computer-readable storage medium communicatively coupled with the computing device.
  • 11. The dimensioning system according to claim 1, wherein the target range-sensor pose is adjustable.
  • 12. A method, implemented by a computer, for operating a dimensioning system, the method comprising: generating adjustment messages to facilitate the positioning of a range sensor of a dimensioning system, the range sensor communicatively coupled to the computer and capable of generating range images, the generating adjustment messages comprising: processing at least one of the range images from the range sensor to derive a result, comprising computing a difference between an initial range-sensor pose and a target range-sensor pose; and generating at least one adjustment message based on the result, the at least one adjustment message being generated to facilitate adjustment of the range sensor to more closely align with the target range-sensor pose; and deriving, from at least one of the range images, dimensions of an object in the field of view.
  • 13. The method according to claim 12, wherein the step of processing comprises identifying at least one frame of reference in at least one range image.
  • 14. The method according to claim 13, wherein the step of processing comprises computing the initial range-sensor pose and the target range-sensor pose.
  • 15. The method according to claim 14, wherein the adjustment messages comprise instructions for physically adjusting the range sensor in order to minimize the difference between the initial range-sensor pose and the target range-sensor pose.
  • 16. The method according to claim 12, wherein the range sensor comprises a projector for projecting a light pattern in a field of view and a range camera for detecting the light pattern in at least part of the field of view, wherein the projector and the range camera are in a stereo arrangement.
  • 17. The method according to claim 12, wherein the adjustment messages comprise range images with superimposed graphics and/or text.
  • 18. The method according to claim 12, wherein the range sensor comprises a color camera and the adjustment messages comprise color images with superimposed graphics and/or text.
  • 19. The method according to claim 12, wherein the adjustment messages comprise audio messages.
  • 20. The method according to claim 12, wherein the adjustment messages comprise visual information displayed on a display communicatively coupled to the computer.
US Referenced Citations (572)
Number Name Date Kind
4279328 Ahlbom Jul 1981 A
5175601 Fitts Dec 1992 A
5184733 Amarson et al. Feb 1993 A
5561526 Huber et al. Oct 1996 A
5748199 Palm May 1998 A
5767962 Suzuki et al. Jun 1998 A
5938710 Lanza et al. Aug 1999 A
5959568 Woolley Sep 1999 A
6115114 Berg et al. Sep 2000 A
6189223 Haug Feb 2001 B1
6519550 D'Hooge et al. Feb 2003 B1
6535776 Tobin et al. Mar 2003 B1
6832725 Gardiner et al. Dec 2004 B2
6922632 Foxlin Jul 2005 B2
7057632 Yamawaki et al. Jun 2006 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7205529 Andersen et al. Apr 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
8072581 Breiholz Dec 2011 B1
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8351670 Ijiri et al. Jan 2013 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue et al. Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9082023 Feng et al. Jul 2015 B2
9082195 Holeva et al. Jul 2015 B2
9142035 Rotman Sep 2015 B1
9224022 Ackley et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9250712 Todeschini Feb 2016 B1
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9310609 Rueblinger et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342724 McCloskey May 2016 B2
9375945 Bowles Jun 2016 B1
D760719 Zhou et al. Jul 2016 S
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
D762647 Fitch et al. Aug 2016 S
9412242 Van Horn et al. Aug 2016 B2
9424749 Reed et al. Aug 2016 B1
D766244 Zhou et al. Sep 2016 S
9443123 Hejl Sep 2016 B2
9443222 Singel et al. Sep 2016 B2
9478113 Xie et al. Oct 2016 B2
9486921 Straszheim et al. Nov 2016 B1
9828223 Svensson et al. Nov 2017 B2
20020105639 Roelke Aug 2002 A1
20020109835 Goetz Aug 2002 A1
20020196534 Lizotte et al. Dec 2002 A1
20030225712 Cooper et al. Dec 2003 A1
20040073359 Ichijo et al. Apr 2004 A1
20040083025 Yamanouchi et al. Apr 2004 A1
20040089482 Ramsden et al. May 2004 A1
20040098146 Katae et al. May 2004 A1
20040105580 Hager et al. Jun 2004 A1
20040132297 Baba et al. Jul 2004 A1
20040214623 Takahashi et al. Oct 2004 A1
20050257748 Kriesel et al. Nov 2005 A1
20060047704 Gopalakrishnan Mar 2006 A1
20060078226 Zhou Apr 2006 A1
20060108266 Bowers et al. May 2006 A1
20060109105 Varner et al. May 2006 A1
20060291719 Ikeda et al. Dec 2006 A1
20070003154 Sun et al. Jan 2007 A1
20070063048 Havens et al. Mar 2007 A1
20070143082 Degnan Jun 2007 A1
20070177011 Lewin et al. Aug 2007 A1
20080047760 Georgitsis Feb 2008 A1
20080050042 Zhang et al. Feb 2008 A1
20080062164 Bassi et al. Mar 2008 A1
20080079955 Storm Apr 2008 A1
20080273210 Hilde Nov 2008 A1
20090095047 Patel et al. Apr 2009 A1
20090134221 Zhu et al. May 2009 A1
20090318815 Barnes Dec 2009 A1
20100060604 Zwart et al. Mar 2010 A1
20100113153 Yen et al. May 2010 A1
20100118200 Gelman et al. May 2010 A1
20100171740 Andersen et al. Jul 2010 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100223276 Al-Shameri et al. Sep 2010 A1
20100274728 Kugelman Oct 2010 A1
20100321482 Cleveland Dec 2010 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20110234389 Mellin Sep 2011 A1
20110235854 Berger et al. Sep 2011 A1
20110260965 Kim et al. Oct 2011 A1
20110297590 Ackley et al. Dec 2011 A1
20110303748 Lemma et al. Dec 2011 A1
20120067955 Rowe Mar 2012 A1
20120111946 Golant May 2012 A1
20120126000 Kunzig et al. May 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120190386 Anderson Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20120224060 Gurevich et al. Sep 2012 A1
20120256901 Bendall Oct 2012 A1
20120293625 Schneider et al. Nov 2012 A1
20130019278 Sun et al. Jan 2013 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130093895 Palmer et al. Apr 2013 A1
20130156267 Muraoka Jun 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130208164 Cazier et al. Aug 2013 A1
20130211790 Loveland et al. Aug 2013 A1
20130222592 Gieseke Aug 2013 A1
20130223673 Davis et al. Aug 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedrao Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130329013 Metois et al. Dec 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140009586 McNamer et al. Jan 2014 A1
20140019005 Lee et al. Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140031665 Pinto et al. Jan 2014 A1
20140100813 Showering Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039674 Motoyama et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140042814 Kather et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140058612 Wong et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140062709 Hyer et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071430 Hansen et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140079297 Tadayon et al. Mar 2014 A1
20140098243 Ghazizadeh Apr 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140100813 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104451 Todeschini et al. Apr 2014 A1
20140106594 Skvoretz Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140131448 Xian et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140139654 Takahashi May 2014 A1
20140140585 Wang May 2014 A1
20140142398 Patil et al. May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158468 Adami Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140175172 Jovanovski et al. Jun 2014 A1
20140177931 Kocherscheidt et al. Jun 2014 A1
20140191644 Chaney Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197238 Lui et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140201126 Zadeh et al. Jul 2014 A1
20140203087 Smith et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140225985 Klusza et al. Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140232930 Anderson Aug 2014 A1
20140240454 Lee Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140270361 Amma et al. Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140284384 Lu et al. Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140306833 Ricci Oct 2014 A1
20140307855 Withagen et al. Oct 2014 A1
20140312121 Lu et al. Oct 2014 A1
20140319219 Liu et al. Oct 2014 A1
20140319220 Coyle Oct 2014 A1
20140319221 Oberpriller et al. Oct 2014 A1
20140326787 Barten Nov 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140344943 Todeschini et al. Nov 2014 A1
20140346233 Liu et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140353373 Van Horn et al. Dec 2014 A1
20140361073 Qu et al. Dec 2014 A1
20140361082 Xian et al. Dec 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20140379613 Nishitani et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009100 Haneda et al. Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150009610 London et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028102 Ren et al. Jan 2015 A1
20150028103 Jiang Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150048168 Fritz et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053766 Havens et al. Feb 2015 A1
20150053768 Wang et al. Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150062369 Gehring et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150069130 Gannon Mar 2015 A1
20150070158 Hayasaka Mar 2015 A1
20150071819 Todeschini Mar 2015 A1
20150083800 Li et al. Mar 2015 A1
20150086114 Todeschini Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150099557 Pettinelli et al. Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150116498 Vartiainen et al. Apr 2015 A1
20150117749 Chen et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150129659 Feng et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150136854 Lu et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150144701 Xian et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150169925 Chen et al. Jun 2015 A1
20150169929 Williams et al. Jun 2015 A1
20150178900 Kim et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150193644 Kearney et al. Jul 2015 A1
20150193645 Colavito et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150201181 Moore et al. Jul 2015 A1
20150204671 Showering Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150219748 Hyatt Aug 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150229838 Hakim et al. Aug 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150316368 Moench et al. Nov 2015 A1
20150325036 Lee Nov 2015 A1
20150327012 Bian et al. Nov 2015 A1
20150332463 Galera et al. Nov 2015 A1
20160014251 Hejl Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160048725 Holz et al. Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160065912 Peterson Mar 2016 A1
20160090283 Svensson et al. Mar 2016 A1
20160090284 Svensson et al. Mar 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160125873 Braho et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160138247 Conway et al. May 2016 A1
20160138248 Conway et al. May 2016 A1
20160138249 Svensson et al. May 2016 A1
20160169665 Deschenes et al. Jun 2016 A1
20160171720 Todeschini Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160187186 Coleman et al. Jun 2016 A1
20160187187 Coleman et al. Jun 2016 A1
20160187210 Coleman et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160191801 Sivan Jun 2016 A1
20160203641 Bostick et al. Jul 2016 A1
20160223474 Tang et al. Aug 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20170115490 Hsieh et al. Apr 2017 A1
20170121158 Wong et al. May 2017 A1
20170182942 Hardy et al. Jun 2017 A1
Foreign Referenced Citations (18)
Number Date Country
201139117 Oct 2008 CN
1232480 May 2006 EP
2013117 Jan 2009 EP
2286932 Feb 2011 EP
2372648 Oct 2011 EP
2562715 Feb 2013 EP
2833323 Feb 2015 EP
2525053 Oct 2015 GB
2006096457 Apr 2006 JP
2007084162 Apr 2007 JP
2015174705 Oct 2015 JP
20100020115 Feb 2010 KR
2013163789 Nov 2013 WO
2013173985 Nov 2013 WO
2013184340 Dec 2013 WO
2014019130 Feb 2014 WO
2014110495 Jul 2014 WO
2015006865 Jan 2015 WO
Non-Patent Literature Citations (52)
Chinese Notice of Reexamination in related Chinese Application 201520810313.3, dated Mar. 14, 2017, English Computer Translation provided, 7 pages [No new art cited].
Extended European Search Report in related EP Application 16199707.7, dated Apr. 10, 2017, 15 pages.
Ulusoy et al., "One-Shot Scanning using De Bruijn Spaced Grids," 2009 IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops), 7 pages [Cited in EP Extended Search Report dated Apr. 10, 2017].
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012 (Feng et al.); now abandoned.
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned.
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned.
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages.
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages.
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages.
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages.
European Exam Report in related EP Application No. 15176943.7, dated Apr. 12, 2017, 6 pages [Art previously cited in this matter].
European Exam Report in related EP Application No. 15188440.0, dated Apr. 21, 2017, 4 pages [No new art to cite].
Ralph Grabowski, "Smoothing 3D Mesh Objects," New Commands in AutoCAD 2010: Part 11; Examiner-cited art in related matter, Non-Final Office Action dated May 19, 2017; 6 pages.
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages.
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages.
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages.
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al.); 16 pages.
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages.
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages; now abandoned.
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages.
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages.
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages.
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages.
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages.
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages.
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages.
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages.
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages.
U.S. Appl. No. 14/740,320 for Tactile Switch For a Mobile Electronic Device filed Jun. 16, 2015 (Bandringa); 38 pages.
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages.
European Extended Search Report in related EP Application 16190833.0, dated Mar. 9, 2017, 8 pages [only new art has been cited; US Publication 2014/0034731 was previously cited].
United Kingdom Combined Search and Examination Report in related Application No. GB1620676.5, dated Mar. 8, 2017, 6 pages [References have been previously cited; WO2014/151746, WO2012/175731, US 2014/0313527, GB2503978].
European Exam Report in related EP Application No. 16168216.6, dated Feb. 27, 2017, 5 pages [References have been previously cited: WO2011/017241 and US 2014/0104413].
Thorlabs, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, downloaded from https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=6430, 4 pages.
EKSMA Optics, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, downloaded from http://eksmaoptics.com/optical-systems/f-theta-lenses/f-theta-lens-for-1064-nm/, 2 pages.
Sill Optics, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, http://www.silloptics.de/1/products/sill-encyclopedia/laser-optics/f-theta-lenses/, 4 pages.
European Exam Report in related EP Application No. 16152477.2, dated Jun. 20, 2017, 4 pages [No art to be cited].
European Exam Report in related EP Application 16172995.9, dated Jul. 6, 2017, 9 pages [No new art to be cited].
United Kingdom Search Report in related Application No. GB1700338.5, dated Jun. 30, 2017, 5 pages.
European Search Report in related EP Application No. 17175357.7, dated Aug. 17, 2017, pp. 1-7 [No new art to be cited].
Boavida et al., "Dam monitoring using combined terrestrial imaging systems", 2009 Civil Engineering Survey, Dec./Jan. 2009, pp. 33-38 [Notice of Allowance dated Sep. 15, 2017 in related matter].
EP Search Report in related EP Application No. 17171844, dated Sep. 18, 2017, 4 pages.
EP Extended Search Report in related EP Application No. 17174843.7, dated Oct. 17, 2017, 5 pages.
UK Further Exam Report in related UK Application No. GB1517842.9, dated Sep. 1, 2017, 5 pages.
Ulusoy, Ali Osman et al., "One-Shot Scanning using De Bruijn Spaced Grids", Brown University; 2009 IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops), pp. 1786-1792 [EPO Search Report dated Dec. 5, 2017].
Extended European Search Report in related EP Application No. 17189496.7, dated Dec. 5, 2017; 9 pages.
Extended European Search Report in related EP Application No. 17190323.0, dated Jan. 19, 2018; 6 pages.
Examination Report in related GB Application No. GB1517843.7, dated Jan. 19, 2018, 4 pages.
Examination Report in related EP Application No. 15190315, dated Jan. 29, 2018; 6 pages.
European Extended Search Report in related EP Application No. 17201794.9, dated Mar. 16, 2018, 10 pages.
European Extended Search Report in related EP Application 17205030.4, dated Mar. 22, 2018, 8 pages.
Related Publications (1)
Number Date Country
20170211931 A1 Jul 2017 US
Continuations (1)
Number Date Country
Parent 14453019 Aug 2014 US
Child 15479839 US