Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard

Information

  • Patent Grant
  • Patent Number
    11,353,319
  • Date Filed
    Wednesday, July 10, 2019
  • Date Issued
    Tuesday, June 7, 2022
Abstract
A mobile dimensioning device, i.e. a mobile dimensioner, is described that uses a dynamic accuracy while still being compatible with the NIST standard. Even though the accuracy division is dynamic and not predetermined, a mobile dimensioning device of the present invention reports the actual accuracy division prior to measurement capture and can therefore be certified and used in commercial transactions.
Description
FIELD OF THE INVENTION

The present invention relates to mobile volume dimensioning devices.


BACKGROUND

A traditional Multiple Dimensioning Measurement Device (MDMD) captures the three-dimensional size (i.e. length, width, height) of objects, such as parcels or pallets, based on the predetermined accuracy of the system. In the United States, the National Institute of Standards and Technology (NIST) standard refers to this predetermined accuracy level of the system as the accuracy division.


Some MDMDs support operation with different accuracy divisions, but these accuracy divisions are still predetermined. For example, an MDMD can provide a measurement with an accuracy of 1 cm for objects with dimensions smaller than 50 cm and a measurement with an accuracy of 2 cm for objects with dimensions greater than 50 cm.
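
For illustration only, a minimal Python sketch of applying such a predetermined accuracy division (the thresholds mirror the example above; a certified device would use its own approved table):

    def predetermined_accuracy_division_cm(largest_dimension_cm: float) -> float:
        """Return the predetermined accuracy division of a fixed MDMD.

        Mirrors the example above: 1 cm accuracy for objects smaller
        than 50 cm, 2 cm accuracy for larger objects.
        """
        return 1.0 if largest_dimension_cm < 50.0 else 2.0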


Predetermined accuracy divisions work for fixed dimensioning systems because the parameters of the measurement environment are known in fixed dimensioning systems. For example, for fixed MDMDs, the distance to the object to be measured, the viewing angle, and other parameters are limited by the installation of the device.


However, in the case of a Mobile Dimensioning Device (MDD), many of the parameters that influence the accuracy of the system cannot be controlled. Because of the dynamic nature of their accuracy, MDDs are not easily compatible with a NIST certification that requires the accuracy division to be reported in advance of the actual measurement. This lack of NIST certification generally prohibits MDDs from being used for commercial transactions.


Therefore, a need exists for a mobile dimensioning device that uses a dynamic accuracy division while remaining compatible with the NIST standard.


SUMMARY

Accordingly, one aspect of the present invention discloses a mobile dimensioning device, comprising: a display; non-volatile storage; one or more sensors; an input subsystem; one or more processors; and memory containing instructions executable by the one or more processors whereby the device is operable to: derive one or more accuracy parameters based on information received from the one or more sensors for a measurement environment of an object being measured; compute an accuracy level based on the one or more accuracy parameters; determine if the accuracy level corresponds to a sufficient measurement environment; if the accuracy level corresponds to a sufficient measurement environment: display, on the display, an indication that the measurement environment is sufficient and a capture icon to enable the measurement capture; in response to an input received at the capture icon, capture the measurement; display, on the display, the dimensions of the object; and record the dimensions of the object.


In additional exemplary embodiments, the accuracy level is the accuracy division as defined by the National Institute of Standards and Technology (NIST) standard.


In still other embodiments, the accuracy parameters comprise at least one of the group consisting of: distance to the object, viewing angle relative to the object, temperature, ambient light, and quality of data from the one or more sensors.


In further embodiments, the one or more sensors comprise at least one of the group consisting of: optical sensors and measurement sensors.


In additional embodiments, the optical sensors are selected from a group consisting of: a barcode sensor, a camera, and an image sensor.


In some embodiments, the measurement sensors are selected from a group consisting of: point-cloud projection sensors, structured light sensors, stereoscopic cameras, and n-scopic cameras.


In another embodiment, the sufficient measurement environment is an environment where the accuracy division has a low value.


In more embodiments, displaying, on the display, an indication that the measurement environment is sufficient comprises at least one of the group consisting of: displaying the accuracy division, displaying an icon to enable the measurement capture, removing the indications for improving the measurement environment, displaying a completed progress bar, and displaying a confirmation icon.


In still other embodiments, displaying, on the display, the dimensions of the object comprises displaying only the dimensions of the object.


And yet in further embodiments, displaying, on the display, the dimensions of the object comprises displaying the dimensions of the object and the corresponding accuracy divisions.


In some embodiments, computing an accuracy level based on the accuracy parameters comprises running multivariable regression on the accuracy parameters.


In other embodiments, the dimensions of the object and the accuracy level are stored in the non-volatile storage.


In still further embodiments, the device is further operable to: determine that the object being measured has been previously measured; retrieve the dimensions of the object and the accuracy level from the non-volatile storage; display, on the display, the dimensions of the object and the accuracy level from the non-volatile storage; and record the dimensions of the object and the accuracy level from the non-volatile storage.


In further embodiments, the device is further operable to: if the accuracy level does not correspond to a sufficient measurement environment, provide an indication for improving the measurement environment.


In still further embodiments, the indication for improving the measurement environment comprises at least one of the group consisting of: a textual instruction, a graphical instruction, and a graphical icon.


In additional embodiments, the indication for improving the measurement environment comprises at least one of the group consisting of: an indication for shortening the distance to the object, an indication for improving the viewing angle relative to the object, an indication to delay measurement pending a target operating temperature, an indication for improving the ambient light, and an indication for adjusting the one or more sensors to improve the quality of data.


An additional aspect of the present invention discloses a mobile dimensioning device, comprising: a display; non-volatile storage; one or more sensors; an input subsystem; one or more processors; and memory containing instructions executable by the one or more processors whereby the device is operable to: derive one or more accuracy parameters based on information received from the one or more sensors for a measurement environment of an object being measured; compute an accuracy level based on the one or more accuracy parameters; determine if the accuracy level corresponds to a sufficient measurement environment; if the accuracy level corresponds to a sufficient measurement environment: display, on the display, an acceptance icon to enable the display of the accuracy level; and display, on the display, the accuracy level and a capture icon to enable measurement capture.


In another embodiment, the device is further operable to: in response to an input received at the capture icon, capture the measurement; display, on the display, the dimensions of the object; and record the dimensions of the object.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of the hardware elements of a device according to embodiments of the disclosed subject matter.



FIGS. 2A, 2B, and 2C are an example of a graphical user interface (GUI) of the system in accordance with one embodiment of the disclosed subject matter.



FIGS. 3A, 3B, 3C, and 3D are an example of a GUI of the system in accordance with one embodiment of the disclosed subject matter.



FIG. 4 is a flow chart outlining the process for operating a device in accordance with embodiments of the disclosed subject matter.





DETAILED DESCRIPTION

The present invention embraces the concept of a mobile dimensioning device that uses a dynamic accuracy while still being compatible with the NIST standard. Even though the accuracy division is dynamic and not predetermined, a mobile dimensioning device of the present invention reports the actual accuracy division prior to measurement capture and can therefore be certified and used in commercial transactions. Moreover, since the NIST standard for MDMDs is derived from the International Organization of Legal Metrology (OIML) R 129 standard, a mobile dimensioning device of the present invention should be compliant with the OIML R 129 standard as well as any other standard derived from OIML R 129.



FIG. 1 illustrates an exemplary device 100, such as a mobile dimensioning device, for one embodiment of the present invention. The device 100 may include other components that are not shown in FIG. 1 and are not further discussed herein for the sake of brevity. One having ordinary skill in the art will understand the additional hardware and software included but not shown in FIG. 1.


In general, device 100 may be implemented in any form of digital computer or mobile device. Digital computers may include, but are not limited to, laptops, desktops, workstations, fixed vehicle computers, vehicle mount computers, hazardous environment computers, rugged mobile computers, servers, blade servers, mainframes, and other appropriate computers. Mobile devices may include, but are not limited to, cellular telephones, smart phones, personal digital assistants, tablets, pagers, two-way radios, netbooks, barcode scanners, radio frequency identification (RFID) readers, intelligent sensors, tracking devices, volume dimensioning devices, mobile dimensioning devices, and other similar computing devices.


In general, as shown, the mobile dimensioning device 100 of FIG. 1 includes a processing system 110 that includes one or more processors 111, such as Central Processing Units (CPUs), Application Specific Integrated Circuits (ASICs), and/or Field Programmable Gate Arrays (FPGAs), a memory controller 112, memory 113, which may include software 114, and other components that are not shown for brevity, such as busses, etc. The processing system may also include storage 115, such as a hard drive or solid state drive.


The processing system 110 also includes a peripherals interface 116 for communicating with other components of the mobile dimensioning device 100, including but not limited to: radio frequency (RF) circuitry 152, such as Wi-Fi and/or cellular communications circuitry, wireless Ethernet, Bluetooth, and near field communication (NFC); audio circuitry 154 for the audio input component 153, such as a microphone, and the audio output component 155, such as a speaker; one or more accelerometers 156; one or more other sensors 158, such as a location determination component, for example a Global Positioning System (GPS) chip; and one or more external ports 160, which may be used for smart card readers or for wired connections such as wired Ethernet, USB, serial, or I2C ports. The RF circuitry 152 and external ports 160 individually and collectively make up the communication interfaces for the mobile dimensioning device 100. The processing system 110 is also connected to a power system component 120, such as a battery or a power supply unit, that is used to power the mobile dimensioning device 100. The processing system 110 is also connected to a clock system component 130 that controls timing functions.


The peripherals interface 116 may also communicate with an Input/Output (I/O) subsystem 140, which includes a display(s) controller 141 operative to control display(s) 142. In some embodiments, the display(s) 142 is a touch-sensitive display system, and the display(s) controller 141 is further operative to process touch inputs on the touch-sensitive display 142. The I/O subsystem 140 may also include a keypad(s) controller 143 operative to control keypad(s) 144 on the mobile dimensioning device 100. The I/O subsystem 140 also includes an optical sensor(s) controller 145 operative to control one or more optical sensor(s) 146. The optical sensor(s) may include, but are not limited to, a barcode sensor, a camera, and an image sensor. The I/O subsystem 140 also includes a measurement sensor(s) controller 147 operative to control one or more measurement sensor(s) 148. The measurement sensor(s) may include, but are not limited to, a point-cloud projection sensor, a structured light sensor, a stereoscopic camera, and an n-scopic camera. The components of mobile dimensioning device 100 may be interconnected using one or more buses, represented generically by the arrows of FIG. 1, and may be mounted on a motherboard (not shown) or some other appropriate configuration.



FIG. 2A is an example of a graphical user interface (GUI) that would be displayed on the display 142 of the mobile dimensioning device 100 in accordance with one embodiment of the disclosed subject matter. FIG. 2A is a representative GUI during the phase while the mobile dimensioning device is finding a sufficient measurement environment for measuring an object.


The elements of FIG. 2A are now described. The main window of the interface 202 has a title field 204 and a progress bar 206 as well as a viewing window 208. The viewing window 208 currently shows an object to be measured 210, for example, a box to be shipped. In some embodiments, the object to be measured 210 can be highlighted in the viewing window 208 in some manner, such as with a green outline. Overlaid onto or integrated with the viewing window 208 is a guidance indication 212, represented by, but not limited to, an arrow in FIG. 2A.


The guidance indication 212 may be a textual instruction, a graphical instruction, a graphical icon, or any combination thereof. The guidance indication 212 provides information that guides the mobile dimensioning device 100 to a measurement environment sufficient for measuring an object. The guidance indication 212 is based on the dynamic accuracy level of the mobile dimensioning device 100.


In a preprocessing phase, the mobile dimensioning device 100 computes its accuracy level dynamically as a function of all of the parameters that influence it. Any kind of measurable parameter influencing accuracy can be included in the model for computing the accuracy level of the mobile dimensioning device 100. The list of parameters includes, but is not limited to, distance of the mobile dimensioning device 100 to the object being measured, the viewing angle of the camera or optical sensor in the mobile dimensioning device 100 relative to the object being measured, temperature of the mobile dimensioning device 100, ambient light, and quality of data from the one or more sensors of the mobile dimensioning device 100. Individually and collectively, these parameters make up the measuring environment for measuring the object. In one embodiment, the accuracy level may be computed, for example, using multivariable regression on the parameters influencing the accuracy.
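
The patent does not spell out the regression model, so the following Python sketch is only one plausible realization: an ordinary least-squares fit of dimensioning error against the parameters listed above, with entirely hypothetical calibration data.

    import numpy as np

    # Hypothetical calibration rows: [distance_m, viewing_angle_deg,
    # temperature_c, ambient_lux, sensor_quality]; y is the observed
    # dimensioning error (cm) for each measurement environment.
    X_calib = np.array([
        [0.8,  5.0, 20.0, 400.0, 0.95],
        [1.5, 20.0, 22.0, 250.0, 0.80],
        [2.5, 35.0, 30.0,  90.0, 0.60],
        [1.0, 10.0, 21.0, 350.0, 0.90],
        [2.0, 25.0, 27.0, 150.0, 0.70],
        [1.2, 15.0, 23.0, 300.0, 0.85],
    ])
    y_calib = np.array([0.4, 0.9, 2.1, 0.5, 1.4, 0.7])

    # Fit the multivariable regression (with an intercept term).
    A = np.hstack([X_calib, np.ones((X_calib.shape[0], 1))])
    coeffs, *_ = np.linalg.lstsq(A, y_calib, rcond=None)

    def accuracy_level(features: np.ndarray) -> float:
        """Predict the accuracy level (cm) of the current measurement
        environment from its accuracy parameter vector."""
        return float(np.append(features, 1.0) @ coeffs)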


Note that in some embodiments, the mobile dimensioning device 100 records a variety of raw data from the sensors. The mobile dimensioning device, through hardware and software, transforms that data into the accuracy parameters that are used to compute the accuracy level for a given measurement environment.
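
As an illustration of that transformation, the Python sketch below derives three of the listed parameters from a raw depth map; the depth-map format and the gradient-based angle estimate are assumptions made for the sake of the example, not details disclosed in the patent.

    import numpy as np

    def derive_accuracy_parameters(depth_map: np.ndarray) -> dict:
        """Turn raw sensor data into accuracy parameters (illustrative).

        Assumes depth_map is an HxW array of distances in meters, with
        zeros marking pixels the sensor could not resolve.
        """
        valid = depth_map[depth_map > 0]
        distance_m = float(np.median(valid))          # distance to the object
        sensor_quality = valid.size / depth_map.size  # fraction resolved
        # Crude viewing-angle proxy: a steep median depth gradient
        # suggests the object is viewed at a slant.
        gy, gx = np.gradient(depth_map)
        tilt = float(np.median(np.hypot(gx, gy)[depth_map > 0]))
        viewing_angle_deg = float(np.degrees(np.arctan(tilt)))
        return {"distance_m": distance_m,
                "viewing_angle_deg": viewing_angle_deg,
                "sensor_quality": sensor_quality}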


Once the dynamic accuracy level is computed, it is used to guide the mobile dimensioning device 100 to a measurement environment sufficient for measuring the object. In one embodiment, this is accomplished by identifying accuracy levels with a low accuracy division value. In general, dimensioning error is reduced as the mobile dimensioning device 100 gets closer to the object and has the proper viewing angle for capturing the object, thus reducing the accuracy division. The lower the accuracy division value, the better the measuring environment.
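
A minimal sketch of that sufficiency test, together with one plausible progress-bar mapping, assuming a hypothetical maximum acceptable accuracy division configured on the device:

    MAX_ACCEPTABLE_DIVISION_CM = 1.0  # hypothetical device configuration

    def environment_is_sufficient(division_cm: float) -> bool:
        """Sufficient when the dynamically computed accuracy division
        is at or below the configured ceiling."""
        return division_cm <= MAX_ACCEPTABLE_DIVISION_CM

    def progress_percent(division_cm: float) -> int:
        """Map the current division onto the 0-100% progress bar,
        reaching 100% exactly when the environment is sufficient."""
        if division_cm <= MAX_ACCEPTABLE_DIVISION_CM:
            return 100
        return int(100 * MAX_ACCEPTABLE_DIVISION_CM / division_cm)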


Examples of the types of guidance provided by the guidance indication 212 include, but are not limited to, shortening the distance to the object, improving the viewing angle relative to the object, delaying measurement pending a target operating temperature, improving the ambient light, and adjusting the one or more sensors to improve the quality of data.
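
One way to select among those guidance types, sketched under the assumption of hypothetical per-parameter limits (the patent leaves the selection logic open):

    GUIDANCE = {
        "distance_m": "Move closer to the object",
        "viewing_angle_deg": "Improve the viewing angle",
        "temperature_c": "Wait for the target operating temperature",
        "ambient_lux": "Improve the ambient light",
        "sensor_quality": "Adjust the sensors to improve data quality",
    }

    def pick_guidance(params: dict, limits: dict) -> list:
        """Return a guidance string (indication 212) for each accuracy
        parameter that falls outside its hypothetical (lo, hi) limits."""
        return [GUIDANCE[name]
                for name, (lo, hi) in limits.items()
                if name in params and not lo <= params[name] <= hi]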


In some embodiments, the progress bar 206 appears with other guidance indications 212. As shown in FIG. 2A, the progress bar 206 works in tandem with the guidance indication 212 to guide the mobile dimensioning device 100 to a sufficient measurement environment, showing a reading of 0% complete when the mobile dimensioning device has an insufficient measurement environment and 100% when the mobile dimensioning device has found a sufficient measurement environment.



FIG. 2B is an example of a GUI of the system in accordance with one embodiment of the disclosed subject matter. FIG. 2B is a representative GUI after the mobile dimensioning device has found a sufficient measurement environment for measuring an object.



FIG. 2B adds some additional elements beyond FIG. 2A. Once the mobile dimensioning device has found a sufficient measurement environment for measuring an object, the mobile dimensioning device 100 displays an accuracy division field 214. In FIG. 2B, the accuracy division field 214 shows one accuracy division per dimension, but the present invention is not limited thereto. Note also that the progress bar 206 now shows 100%, indicating that the measurement environment is sufficient. FIG. 2B also shows a capture icon 216, which is used to capture the measurement of the object in response to an input at the mobile dimensioning device 100. In other embodiments, the capture icon 216 could be implemented in any of a variety of ways using different elements understood in the art of GUI design for receiving input. In still other embodiments, the capture icon 216 would not be part of the GUI but rather would be a hardware button on the mobile dimensioning device 100 that becomes active when the device is enabled to capture the measurement. Note that the capture icon 216 is only visible when the measuring environment is sufficient for measuring the object. Note also that the guidance indication 212 is no longer shown, as the mobile dimensioning device has found a sufficient environment for taking the measurement. All of these visual cues in the GUI (i.e. the display of the accuracy division field 214, the completed progress bar 206, the capture icon 216, and the absence of the guidance indication 212) are examples of indications that the measurement environment is sufficient.


Note that because mobile dimensioning device 100 reveals the accuracy division prior to permitting or enabling the actual measurement of the object, the mobile dimensioning device is compatible with the NIST standard.



FIG. 2C is an example of a GUI of the system in accordance with one embodiment of the disclosed subject matter. FIG. 2C is a representative GUI after the mobile dimensioning device 100 has captured the measurements of the object.


In some embodiments, the mobile dimensioning device 100 records an infra-red (IR) image of a pattern of light projected on an object being measured. The mobile dimensioning device, through hardware and software, transforms the image into three-dimensional data about the object. That three-dimensional data is used to derive an accurate measurement for the object. This process of deriving the accurate measurement for the object is known as capturing the measurement. Capturing the measurement can be done by the mobile dimensioning device 100 after the accuracy division has been displayed, either automatically or in response to an input.
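
A sketch of that capture step, assuming the hardware pipeline has already produced an Nx3 point cloud of the object; the axis-aligned bounding box below is a simplification of real dimensioning algorithms, used here only to make the idea concrete:

    import numpy as np

    def capture_measurement(points: np.ndarray) -> tuple:
        """Derive (length, width, height) from an Nx3 array of 3-D
        points belonging to the object, in the point cloud's units."""
        extents = points.max(axis=0) - points.min(axis=0)
        length, width, height = sorted(extents, reverse=True)
        return float(length), float(width), float(height)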



FIG. 2C adds some additional elements beyond FIG. 2A and FIG. 2B. Because the measurement has now been captured, it is possible to present the dimension field 220. In some embodiments, an additional confirmation icon 218 is provided to confirm that the measurement has been captured, such as but not limited to the check mark icon shown in FIG. 2C.



FIGS. 3A, 3B, 3C, and 3D are an example of a GUI of the system in accordance with an alternative embodiment of the disclosed subject matter. In this embodiment, neither the dimensions of the object nor the accuracy division are shown until the measurement environment is sufficient for measuring the object. Then a button or icon is shown to enable the capture.



FIG. 3A has similar elements to FIG. 2A. FIG. 3A shows the preprocessing phase of the alternative embodiment. The main window of the interface 302 has a title field 304 and a progress bar 306 as well as a viewing window 308. The viewing window 308 currently shows an object to be measured 310, for example, a box to be shipped. In some embodiments, the object to be measured 310 can be highlighted in the viewing window 308 in some manner, such as with a green outline. Overlaid onto or integrated with the viewing window 308 is a guidance indication 312, represented by, but not limited to, an arrow in FIG. 3A. As described earlier, the guidance indication 312 may be a textual instruction, a graphical instruction, a graphical icon, or any combination thereof.


Also, as discussed earlier, in the background the mobile dimensioning device 100 computes its accuracy level dynamically to help the mobile dimensioning device 100 identify a measurement environment sufficient for measuring the object.



FIG. 3B is similar to FIG. 2B. FIG. 3B is a representative GUI after the mobile dimensioning device has found a sufficient measurement environment for measuring an object according to the alternative embodiment.


In FIG. 3B, the progress bar 306 now shows 100%, indicating that the measurement environment is sufficient. In this alternative embodiment, an acceptance icon 322 is provided to confirm that the measurement environment is sufficient to measure the object. Note that neither the dimensions nor the accuracy division is shown in FIG. 3B. The acceptance icon 322 enables the display of the accuracy level.


In response to an input received at the acceptance icon 322, the accuracy level will be displayed as shown in FIG. 3C. Note that only the accuracy division field 314 is shown. FIG. 3C also provides a capture icon 316 to enable the measurement capture. In response to an input received at the capture icon 316, the mobile dimensioning device will capture the measurements. In other embodiments, recording the measurements may be automatic after capture.



FIG. 3D is an exemplary GUI that is displayed after the measurement has been captured. Note that both the accuracy division field 314 and the actual dimension field 320 are now shown. In some embodiments, an additional confirmation icon 318, such as but not limited to the check mark icon shown in FIG. 3D, is provided to confirm that the measurement has been captured.



FIG. 4 is a flow chart outlining the process for operating a mobile dimensioning device in accordance with embodiments of the disclosed subject matter.


The process begins in FIG. 4 at Step 400 followed by Step 402 in which a check is made to see if an adjustment to the mobile dimensioning device has been detected. The adjustment to the mobile dimensioning device can be a random movement of the device itself or any of the components of the device. In some embodiments, the adjustment to the mobile dimensioning device represents movements that correspond to the types of guidance described earlier by the guidance indication 212, 312. If no adjustment has been detected (Path 403), then the process ends (Step 422).
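
One plausible way to detect such an adjustment (an assumption for illustration; the patent does not prescribe a mechanism) is to watch the accelerometers 156 for motion above a noise floor:

    import numpy as np

    MOTION_THRESHOLD_G = 0.05  # hypothetical noise floor, in g

    def adjustment_detected(accel_samples: np.ndarray) -> bool:
        """True if recent accelerometer samples (Nx3, in g) vary enough
        to suggest the measurement environment has changed."""
        magnitudes = np.linalg.norm(accel_samples, axis=1)
        return float(np.std(magnitudes)) > MOTION_THRESHOLD_G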


If an adjustment has been detected (Path 405), then the mobile dimensioning device 100 derives the new accuracy parameters that correspond to the new measurement environment (i.e. the measurement environment after the adjustment to the mobile dimensioning device) based on information received from the sensors (Step 404). The mobile dimensioning device 100 then computes an accuracy level based on the one or more accuracy parameters (Step 406).


The mobile dimensioning device 100 then checks to see if the measurement environment is sufficient for measurement of the object (Step 408). If not (Path 407), then guidance indications for improving the measurement environment are displayed (Step 410). The indications for improving the measurement environment were described earlier. These are the guidance indications 212, 312. Examples include, but are not limited to, shortening the distance to the object, improving the viewing angle relative to the object, delaying measurement pending a target operating temperature, improving the ambient light, and adjusting the one or more sensors to improve the quality of data.


If the measurement environment is sufficient for measurement of the object (Path 409), then the mobile dimensioning device displays indications of a sufficient measurement environment (Step 412). As described earlier, the indications of a sufficient measurement environment include, but are not limited to: displaying the accuracy division, displaying a completed progress bar, displaying a capture icon, and removing the guidance indications.


The mobile dimensioning device 100 then checks to see if a capture event is received (Step 414). The capture event triggers the actual measurement of the object. In some embodiments, the capture event occurs automatically. In other embodiments, the capture event occurs in response to an input received at the mobile dimensioning device 100.


If no capture event is detected (Path 411), then the mobile dimensioning device checks to see if an adjustment has been detected (Step 402), as described earlier.


If a capture event is detected (Path 413), then the dimensions of the object are actually measured (Step 416), the dimensions are displayed (Step 418), and the dimensions are recorded (Step 420). In some embodiments, when the object dimensions are displayed, the associated accuracy division for each dimension is also displayed. In other embodiments, only the object dimensions are displayed. The process then ends (Step 422).
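
Pulling the steps of FIG. 4 together, a hedged end-to-end sketch of the control loop; it reuses the illustrative helpers above, and the device hooks (detect_adjustment, read_depth_map, temperature_c, ambient_lux, and the display/record calls) are hypothetical, not part of the patent's disclosure:

    import numpy as np

    def dimensioning_loop(device):
        """Illustrative control loop for the process of FIG. 4."""
        while device.detect_adjustment():                              # Step 402
            env = derive_accuracy_parameters(device.read_depth_map())  # Step 404
            features = np.array([env["distance_m"], env["viewing_angle_deg"],
                                 device.temperature_c(), device.ambient_lux(),
                                 env["sensor_quality"]])
            division = accuracy_level(features)                        # Step 406
            if not environment_is_sufficient(division):                # Path 407
                device.show_guidance(env)                              # Step 410
                continue
            device.show_sufficient(division)                           # Step 412
            if device.capture_requested():                             # Step 414
                dims = capture_measurement(device.read_point_cloud())  # Step 416
                device.show_dimensions(dims, division)                 # Step 418
                device.record(dims, division)                          # Step 420
                break                                                  # Step 422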


In this respect, the processes described in the figures should make it clear to a person of ordinary skill in the art how the mobile dimensioning device 100 of the present invention uses a dynamic accuracy while still being compatible with the NIST standard, and can therefore be certified and used in commercial transactions.





In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A mobile dimensioning device, comprising: a display; a non-volatile storage; a sensor; an input subsystem; a processor; and a memory comprising computer-executable instructions that, when executed by the processor, cause the mobile dimensioning device to: derive an accuracy parameter based on information received from the sensor for a measurement environment of an object being measured; compute an accuracy level based on the accuracy parameter; in response to determining that the accuracy level corresponds to a sufficient measurement environment, display, on a user interface of the display, an acceptance icon to facilitate the display of the accuracy level; and in response to receiving a user selection of the acceptance icon via the user interface, trigger displaying, on the user interface of the display, the accuracy level associated with measuring a dimension of the object in the measurement environment and a capture icon to facilitate measuring the dimension of the object.
  • 2. The mobile dimensioning device of claim 1, wherein the accuracy level corresponds to a National Institute of Standards and Technology (NIST) standard associated with accuracy.
  • 3. The mobile dimensioning device of claim 2, wherein the sufficient measurement environment is an environment where an accuracy division has a value lower than a predetermined value.
  • 4. The mobile dimensioning device of claim 1, wherein the accuracy parameter is associated with at least one of: a distance to the object, a viewing angle relative to the object, a temperature, ambient light, or a quality of data from the sensor.
  • 5. The mobile dimensioning device of claim 1, wherein the sensor is at least one of: an optical sensor or a measurement sensor.
  • 6. The mobile dimensioning device of claim 5, wherein the optical sensor is at least one of: a barcode sensor, a camera, or an image sensor.
  • 7. The mobile dimensioning device of claim 5, wherein the measurement sensor is at least one of: a point-cloud projection sensor, a structured light sensor, a stereoscopic camera, or an n-scopic camera.
  • 8. The mobile dimensioning device of claim 1, wherein the computer-executable instructions, when executed by the processor, cause the mobile dimensioning device to further: display, on the display, an indication that the measurement environment is sufficient, comprising at least one of: displaying an accuracy division, displaying an icon to facilitate capture of dimensions of the object, removing indications for improving the measurement environment, displaying a completed progress bar, or displaying a confirmation icon.
  • 9. The mobile dimensioning device of claim 1, wherein computing the accuracy level based on the accuracy parameter comprises calculating multivariable regression on the accuracy parameter.
  • 10. The mobile dimensioning device of claim 1, wherein the computer-executable instructions, when executed by the processor, cause the mobile dimensioning device to further: in response to determining that the accuracy level does not correspond to the sufficient measurement environment, provide an indication for improving the measurement environment.
  • 11. The mobile dimensioning device of claim 10, wherein the indication for improving the measurement environment comprises at least one of: a textual instruction, a graphical instruction, or a graphical icon.
  • 12. The mobile dimensioning device of claim 10, wherein the indication for improving the measurement environment comprises at least one of: an indication for shortening a distance to the object, an indication for improving a viewing angle relative to the object, an indication to delay measurement pending a target operating temperature, an indication for improving an ambient light, or an indication for adjusting the sensor to improve a quality of data.
  • 13. The mobile dimensioning device of claim 1, wherein the computer-executable instructions, when executed by the processor, cause the mobile dimensioning device to further: in response to an input received associated with the capture icon, capture the measurement; display, on the display, dimensions of the object; and record the dimensions of the object.
  • 14. The mobile dimensioning device of claim 1, wherein, when displaying the accuracy level and the capture icon on the user interface, the computer-executable instructions, when executed by the processor, cause the mobile dimensioning device to further: display an accuracy division for each dimension of the object on the user interface.
  • 15. A method for measuring dimensions of an object, comprising: deriving, based on information received from a sensor, an accuracy parameter that is associated with an accuracy level for a measurement environment of the object being measured; computing the accuracy level based on the accuracy parameter; determining if the accuracy level is sufficient for the measurement environment; in response to the accuracy level being sufficient for the measurement environment, displaying, on a user interface of a display, an acceptance icon to facilitate displaying the accuracy level; and in response to receiving a user selection of the acceptance icon via the user interface, causing the accuracy level associated with measuring dimensions of the object in the measurement environment and a capture icon, configured to facilitate measuring the dimensions of the object, to be displayed on the display.
  • 16. The method for measuring the dimensions of the object of claim 15, wherein the capture icon is caused to be displayed on the display with an indication that the measurement environment is sufficient.
  • 17. The method for measuring the dimensions of the object of claim 15, wherein the accuracy parameter is associated with at least one of: a distance to the object, a viewing angle relative to the object, a temperature, ambient light, or a quality of data from the sensor.
  • 18. The method for measuring the dimensions of the object of claim 15 further comprising: in response to the accuracy level being not sufficient for the measurement environment, providing an indication for improving the measurement environment.
  • 19. The method for measuring the dimensions of the object of claim 18, wherein the indication for improving the measurement environment comprises at least one of: a textual instruction, a graphical instruction, or a graphical icon.
  • 20. The method for measuring the dimensions of the object of claim 15, wherein, when displaying the accuracy level and the capture icon on the user interface, the method further comprises: displaying an accuracy division for each dimension of the object on the user interface.
Priority Claims (1)
  • Number: 15176943; Date: Jul 2015; Country: EP; Kind: regional
CROSS-REFERENCE TO RELATED APPLICATION

The present application is a continuation application of U.S. application Ser. No. 15/146,084, filed May 4, 2016, which claims the benefit of European Patent Application No. 15176943.7 for a Method for a Mobile Dimensioning Device to Use a Dynamic Accuracy Compatible with NIST Standard filed on Jul. 15, 2015 at the European Patent Office, the contents of which are hereby incorporated by reference in their entireties.

Related Publications (1)
US 2019/0339057 A1, published Nov. 2019 (US)

Continuations (1)
Parent: U.S. Appl. No. 15/146,084, filed May 2016 (US)
Child: U.S. Appl. No. 16/507,338 (US)
Child 16507338 US