Mobile dimensioner apparatus for use in commerce

Information

  • Patent Grant
  • Patent Number
    9,804,013
  • Date Filed
    Tuesday, July 7, 2015
  • Date Issued
    Tuesday, October 31, 2017
Abstract
A mobile volume dimensioning device, i.e. a mobile dimensioner, is described that detects excessive measuring time and/or a repetitive range of measuring motion and receives a deactivation event upon detecting this inappropriate behavior. Deactivation prevents the systematic reporting of either the highest or lowest dimensions, mitigating unfair charging practices in commerce applications involving the shipping of goods.
Description
FIELD OF THE INVENTION

The present invention relates to volume dimensioning devices.


BACKGROUND

Volume dimensioning devices, i.e. dimensioners, are devices that are used for estimating sizes of items (such as boxes) and the sizes of empty spaces (such as the volume left in a delivery truck). Dimensioners may be larger devices that are a static part of a larger logistical system in a distribution center or warehouse, or they may be smaller mobile devices designed for portable use. Mobile dimensioners that are certified for use in commerce can be used to charge customers for shipment based on the dimensions of an item. The National Conference on Weights and Measures (NCWM) issues a National Type Evaluation Program (NTEP) Certificate of Conformance to mobile dimensioners that have been evaluated and found to produce accurate measurements capable of meeting applicable requirements of the National Institute of Standards and Technology (NIST) Handbook 44, entitled “Specifications, Tolerances, and Other Technical Requirements for Weighing and Measuring Devices.”


Despite the certification process, mobile dimensioners can have variable tolerances in measurements as a result of the inherent variations that arise from different methods of measurement. The same item could be measured from two different locations, resulting in two different methods of measurement, each with a different angle relative to the item being measured as well as a different distance to the item being measured. Consequently, it is possible to have two different measurements for an item, both of which are certifiable and correct, all because of the variable tolerances in the methods of measurement. More specifically, because the accuracy dimension (referred to as “d” in the NIST and NTEP documentation) can change, two different yet valid measurements can be obtained simply by moving the mobile dimensioner around.


By way of a non-limiting example, assume that one dimension of an item to be shipped has been measured with a mobile dimensioner to be 9.5 with an accuracy dimension of 0.5 (i.e. d=0.5). Simply moving the mobile dimensioner side to side or farther away might yield a measurement of 10 for the same dimension, with an accuracy dimension of 1.0 (i.e. d=1.0). In yet other situations, it is possible to produce variable measurements for the same dimension with the same accuracy dimension. Again, simply by moving the mobile dimensioner in and out, it would be possible to go from a measurement of 10 with d=1.0 to a measurement of 9 with d=1.0.
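The arithmetic behind the first part of this example can be sketched as a rounding rule. This is a hypothetical simplification for illustration only; the certified rounding behavior of an actual device may differ.

```python
def report(true_length: float, d: float) -> float:
    """Round a raw measurement to the nearest multiple of the
    accuracy dimension d. Hypothetical illustration, not the
    device's actual certified rounding rule."""
    return round(true_length / d) * d

# A true edge of 9.7 units reported at two accuracy dimensions:
print(report(9.7, 0.5))  # -> 9.5
print(report(9.7, 1.0))  # -> 10.0
```

The same physical edge thus yields two different, equally "correct" reported values depending only on the accuracy dimension in effect at the measuring position.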


One of the primary reasons behind government oversight of the measurement process is to ensure that vendors are not employing improper measurements in their business transactions with both customers and shipping companies. Since a mobile dimensioner has the inherent ability to produce different certifiable measurements, a disreputable vendor could, in practice, move the device back and forth within the usable range, for example closer and farther away, always watching the reported dimension and then picking the larger dimension for overcharging customers and the smaller dimension for cheating shippers. Therefore, over time, a disreputable vendor can employ a certified mobile dimensioner to determine a method of measurement designed to systematically defraud customers and shippers.


Therefore, a need exists for a mobile dimensioner designed to thwart activities intended to generate fraudulent measurements.


SUMMARY

Accordingly, one aspect of the present invention discloses a mobile dimensioner device, comprising: a display; one or more optical sensors; one or more measurement sensors; an input subsystem; a clock system; one or more processors; and memory containing instructions executable by the one or more processors whereby the device is operable to: receive a threshold time period; activate at least one of the one or more measurement sensors; derive a first set of dimensions for an object and an associated indication of the dimensional accuracy of each of the dimensions based on information received from the one or more measurement sensors; display, on the display, the first set of dimensions and the associated indication of the dimensional accuracy of each of the dimensions; determine the time interval since the first set of dimensions for the object was derived; and if the time interval exceeds the threshold time period, receive a deactivation event.


In other exemplary embodiments, the threshold time period is defined by one of the group consisting of: defined by the manufacturer of the device, defined to comply with certification standards set by a certification organization, defined in response to input received via the input subsystem at the device, and defined in response to information received at the device from a server.


In additional exemplary embodiments, the deactivation event is selected from the group consisting of: a power off event for the device, an event that turns off the ability of the device to take measurements, an event that turns off the one or more measurement sensors of the device, an event that restricts the ability of the device to report results, an event that turns off one or more communication interfaces of the device, an event that deactivates the measurement sensors and displays the first set of dimensions, an event that deactivates the measurement sensors and places the device in a state requiring reset, and an event that deactivates the measurement sensors and deletes the first set of dimensions.


In further embodiments, the one or more optical sensors are selected from a group consisting of: a barcode sensor, a camera, and an image sensor.


In yet other embodiments, the one or more measurement sensors are selected from a group consisting of: point-cloud projection sensors, structured light sensors, stereoscopic cameras, and n-scopic cameras.


Another aspect of the present invention discloses a mobile dimensioner device, comprising: a display; one or more optical sensors; one or more measurement sensors; an input subsystem; one or more processors; and memory containing instructions executable by the one or more processors whereby the device is operable to: receive a threshold number of contrary events; activate at least one of the one or more measurement sensors; derive a first set of dimensions for an object and an associated indication of the dimensional accuracy of each of the dimensions based on information received from the one or more measurement sensors; display, on the display, the first set of dimensions and the associated indication of the dimensional accuracy of each of the dimensions; display, on the display, an indication to obtain a better measurement of the object; detect a number of contrary events; if the number of contrary events detected exceeds the threshold number of contrary events, receive a deactivation event.


In still other exemplary embodiments, the device is further operable to: derive a set of preliminary dimensions for an object based on information received from the one or more measurement sensors.


In more embodiments, the contrary event is an action that does not correspond to an indication to obtain a better measurement of the object.


In some embodiments, the threshold number of contrary events is defined by one of the group consisting of: defined by the manufacturer of the device, defined to comply with certification standards set by a certification organization, defined in response to input received via the input subsystem at the device, and defined in response to information received at the device from a server.


An additional aspect of the present invention discloses a mobile dimensioner device, comprising: a display; one or more optical sensors; one or more measurement sensors; an input subsystem; one or more processors; and memory containing instructions executable by the one or more processors whereby the device is operable to: activate at least one of the one or more measurement sensors; derive a set of first dimensions for an object and an associated indication of the first dimensional accuracy of each of the first dimensions based on information received from the one or more measurement sensors; display, on the display, the set of first dimensions and the associated indication of the first dimensional accuracy of each of the first dimensions; derive a set of second dimensions for the object and an associated indication of the second dimensional accuracy of each of the second dimensions based on information received from the one or more measurement sensors; display, on the display, the set of second dimensions and the associated indication of the second dimensional accuracy of each of the second dimensions; in response to an input to capture the set of second dimensions, determine if the second dimensional accuracy is greater than the first dimensional accuracy; if the second dimensional accuracy is greater than the first dimensional accuracy, then receive a deactivation event; and if the second dimensional accuracy is not greater than the first dimensional accuracy, then capture the second set of dimensions.
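The capture gate in this aspect can be sketched as follows, reading "dimensional accuracy" as the accuracy-dimension value d, so that a greater value means a coarser measurement (consistent with the detailed description's treatment of d). The function and its return labels are hypothetical illustrations, not the claimed implementation.

```python
def on_capture(first_d: float, second_d: float) -> str:
    """Gate a capture request: if the second measurement is coarser
    (larger accuracy dimension d) than the first measurement already
    derived, receive a deactivation event instead of capturing.
    Hypothetical sketch of the claimed comparison."""
    if second_d > first_d:
        return "deactivate"  # second measurement is less accurate
    return "capture"         # capture the second set of dimensions

print(on_capture(0.5, 1.0))  # coarser second measurement -> deactivate
print(on_capture(1.0, 0.5))  # finer second measurement -> capture
```

In other words, once a fine-grained measurement exists, the sketch refuses to let a coarser one replace it, which is the behavior that blocks cherry-picking among valid measurements.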


In yet other embodiments, the device further comprises: a communication interface.


In still more embodiments, the communication interface is selected from the group consisting of: Bluetooth, Ethernet, wireless Ethernet, USB, serial, and I2C.


In other embodiments, the device is further operable to: send the second set of dimensions to a server.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A is a block diagram of the hardware elements of a device according to embodiments of the disclosed subject matter.



FIG. 1B and FIG. 1C are block diagrams of the hardware elements of the system in accordance with embodiments of the disclosed subject matter.



FIG. 2 is a flow chart outlining the process for deactivating a device in accordance with embodiments of the disclosed subject matter in response to the detection of excessive measuring time.



FIG. 3A and FIG. 3B are flow charts outlining the process for deactivating a device in accordance with embodiments of the disclosed subject matter in response to repetitive motion.



FIG. 4A and FIG. 4B are flow charts outlining the process for deactivating a device in accordance with embodiments of the disclosed subject matter in response to the detection of excessive measuring time and/or repetitive motion.



FIG. 5A and FIG. 5B are flow charts outlining the process for deactivating a device in accordance with embodiments of the disclosed subject matter involving the accuracy dimension.





DETAILED DESCRIPTION

The present invention embraces the concept of restricting a mobile dimensioner from reporting systematically either the highest or lowest dimensions. Because the measurement results from a mobile dimensioner are not predictable, e.g. a mobile dimensioner used at its farthest range will not necessarily produce larger or smaller dimensions, a disreputable vendor must move the mobile dimensioner in and out and/or right or left looking for a specific measurement conducive to defrauding customers and shippers. This behavior must be repeated with each measurement because each item being measured will have a different size and will produce different results. In embodiments of the present invention, a mobile dimensioner device detects excessive measuring time and/or a repetitive range of motion and deactivates upon detection of this inappropriate behavior. In other embodiments of the present invention, a mobile dimensioner device detects when a measurement with a greater accuracy has been taken (i.e. a small accuracy dimension “d” value) and restricts the dimensioner from reporting measurements with less accuracy (i.e. a large accuracy dimension “d” value).



FIG. 1A illustrates an exemplary device 100, such as a mobile dimensioner device, for one embodiment of the present invention. The device 100 may include other components that are not shown in FIG. 1A and are not further discussed herein for the sake of brevity. One having ordinary skill in the art will understand the additional hardware and software included but not shown in FIG. 1A.


In general, device 100 may be implemented in any form of digital computer or mobile device. Digital computers may include, but are not limited to, laptops, desktops, workstations, fixed vehicle computers, vehicle mount computers, hazardous environment computers, rugged mobile computers, servers, blade servers, mainframes, and other appropriate computers. Mobile devices may include, but are not limited to, cellular telephones, smart phones, personal digital assistants, tablets, pagers, two-way radios, netbooks, barcode scanners, radio frequency identification (RFID) readers, intelligent sensors, tracking devices, volume dimensioning devices, mobile dimensioners, and other similar computing devices.


In some embodiments of the present invention, the device 100 of FIG. 1A can be connected to other devices, designated 100-X. In one embodiment, device 100-1 may be connected to another device 100-2 via a network 170, as shown in FIG. 1B. The network 170 may be any type of wide area network (WAN), such as the Internet, Local Area Network (LAN), or the like, or any combination thereof, and may include wired components, such as Ethernet, wireless components, such as LTE, Wi-Fi, Bluetooth, or near field communication (NFC), or both wired and wireless components, collectively represented by the data links 172 and 174.


In other embodiments of the present invention, the device 100-1 may be connected to another device 100-2 via a wired communication channel 176, as shown in FIG. 1C. The wired communication channel 176 may be Universal Serial Bus (USB), serial, Inter-Integrated Circuit (I2C), or other computer bus.


In one embodiment, the device 100-1 is a mobile dimensioner device and the device 100-2 is a server that handles backend functions such as invoicing customers for the packages being shipped. In this embodiment, FIG. 1B and FIG. 1C represent ways that the devices can be connected to allow the measurement information from device 100-1 to be shared with the backend system of device 100-2.


In general, as shown, the device 100 of FIG. 1A includes a processing system 110 that includes one or more processors 111, such as Central Processing Units (CPUs), Application Specific Integrated Circuits (ASICs), and/or Field Programmable Gate Arrays (FPGAs), a memory controller 112, memory 113, which may include software 114, and other components that are not shown for brevity, such as busses, etc. The processing system may also include storage 115, such as a hard drive or solid state drive.


The processing system 110 also includes a peripherals interface 116 for communicating with other components of the device 100, including but not limited to, radio frequency (RF) circuitry 152, such as Wi-Fi and/or cellular communications circuitry such as wireless Ethernet, Bluetooth, and near field communication (NFC), audio circuitry 154 for the audio input component 153, such as a microphone, and audio output component 155, such as a speaker, one or more accelerometers 156, one or more other sensors 158, such as a location determination component such as a Global Positioning System (GPS) chip, and one or more external ports 160, which may be used for smart card readers or for wired connections such as wired Ethernet, USB, serial or I2C ports. The RF circuitry 152 and external ports 160 individually and collectively make up the communication interfaces for the device 100. The processing system 110 is also connected to a power system component 120 that is used to power the device 100, such as a battery or a power supply unit. The processing system 110 is also connected to a clock system component 130 that controls a timer for use by the disclosed embodiments.


The peripherals interface 116 may also communicate with an Input/Output (I/O) subsystem 140, which includes a display(s) controller 141 operative to control display(s) 142. In some embodiments the display(s) 142 is a touch-sensitive display system, and the display(s) controller 141 is further operative to process touch inputs on the touch sensitive display 142. The I/O subsystem 140 may also include a keypad(s) controller 143 operative to control keypad(s) 144 on the device 100. The I/O subsystem 140 also includes an optical sensor(s) controller 145 operative to control one or more optical sensor(s) 146. The optical sensor(s) may include, but is not limited to, a barcode sensor, a camera, and an image sensor. The I/O subsystem 140 also includes a measurement sensor(s) controller 147 operative to control one or more measurement sensor(s) 148. The measurement sensor(s) may include, but is not limited to, a point-cloud projection sensor, a structured light sensor, a stereoscopic camera, and an n-scopic camera. The components of device 100 may be interconnected using one or more buses, represented generically by the arrows of FIG. 1A, and may be mounted on a motherboard (not shown) or some other appropriate configuration.



FIG. 2 is a flow chart outlining the process for deactivating a device in accordance with embodiments of the disclosed subject matter in response to the detection of excessive measuring time. The process begins at Step 200 followed by Step 202 in which a check is made to see if an activation event has been received by the mobile dimensioner device 100. If not (Path 203), then the process ends (Step 220). If an activation event has been received (Path 205), then the process continues.


In some embodiments, the activation event comprises a power on event or a power cycling event for the mobile dimensioner device 100. In other embodiments, the activation event comprises an event that turns on the ability of the mobile dimensioner device 100 to take measurements, such as turning on one or more measurement sensors 148. In other embodiments, the activation event comprises an event that turns on the communication interfaces 160 and/or 152 of mobile dimensioner device 100-1 to report measurement results to the server 100-2. In still other embodiments, an activation event may also comprise a reset of any existing measurements currently in memory 113 or storage 115 of the mobile dimensioner device 100, or a reset of the mobile dimensioner device itself. In additional embodiments, the activation event includes a preliminary scan of the object to derive some preliminary dimensions. Note that in some embodiments, the object can be empty space, e.g. the amount of dirt removed from a hole or the amount of space remaining in a delivery truck. However, in other embodiments, the object will be a package that is being shipped.


Next, the mobile dimensioner device 100 resets a timer used in the detection of excessive measuring time. This timer is called the acc-meas-timer in FIG. 2 (Step 204), as it is used to track the amount of time that transpires from when an accurate measurement is obtained to when the measurement is captured. In some embodiments, the mobile dimensioner device 100 records an infra-red (IR) image of a pattern of light projected on an object being measured. The mobile dimensioner device, through hardware and software, transforms the image into three-dimensional data about the object. That three-dimensional data is used to derive an accurate measurement for the object.
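As a minimal sketch of that last step, the three-dimensional data can be treated as a point cloud and reduced to an axis-aligned bounding box. This is a deliberate simplification for illustration; an actual dimensioner would segment the object from its surroundings and fit an oriented box.

```python
def bounding_box_dims(points):
    """Derive length/width/height from a point cloud as an
    axis-aligned bounding box. Illustrative simplification of
    turning three-dimensional data into object dimensions."""
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

# Eight corners of a hypothetical 10 x 8 x 6 box:
corners = [(x, y, z) for x in (0, 10) for y in (0, 8) for z in (0, 6)]
print(bounding_box_dims(corners))  # -> (10, 8, 6)
```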


The mobile dimensioner device 100 then checks to see if the acc-meas-timer is greater than a specified threshold (Step 208). This threshold is called the acc-meas-time-threshold in FIG. 2. The acc-meas-time-threshold may be set by the manufacturer of the mobile dimensioner device 100, may be set to comply with certification standards set by a certification organization, may be set by a server 100-2, may be set in response to an input at the mobile dimensioner device 100, or may be set in some other manner.


If the acc-meas-timer is less than or equal to the acc-meas-time-threshold (Path 207), i.e. the amount of time spent between obtaining and capturing an accurate measurement is not excessive, then the mobile dimensioner device 100 checks to see if an accurate measurement has been obtained and displayed (Step 210). If no accurate measurement has been obtained and displayed (Path 211), then the mobile dimensioner device 100 resets the acc-meas-timer (Step 222), and the process then continues to Step 208 as described above.


Returning to Step 210, if an accurate measurement has been obtained and displayed (Path 213), then the mobile dimensioner device checks to see if the measurement has been captured (Step 216). In some embodiments, the measurement may be captured in response to an input at the mobile dimensioner device 100, or may be captured in some other manner. If the measurement is captured (Path 217), then the measurement results are reported (Step 218) and the process is complete (Step 220). If the measurement is not captured (Path 215), then the clock system 130 of the mobile dimensioner device 100 increments the acc-meas-timer with the passage of time (Step 206) and the process continues to Step 208 as described above.


Returning to Step 208, if the acc-meas-timer is greater than the acc-meas-time-threshold (Path 209), i.e. the amount of time spent in obtaining and capturing an accurate measurement is excessive, then the process continues to Step 214 where a deactivation event is received by the mobile dimensioner device 100, and the mobile dimensioner device returns to a state where it waits for an activation event (Step 202). In some embodiments, the deactivation event comprises a power off event for the mobile dimensioner device 100 itself. In alternative embodiments, the deactivation event comprises placing the device in a state requiring a reset, such as a key sequence to reset or a simple power cycle reset. In other embodiments, the deactivation event comprises an event that turns off the ability of the mobile dimensioner device 100 to take measurements, such as an event that turns off or temporarily disables one or more measurement sensors 148. In some embodiments, any active measurements in the mobile dimensioner device at the time of the deactivation event may be cleared, i.e. deleted or erased. In other embodiments, any active measurements in the mobile dimensioner device at the time of the deactivation event may be displayed. In yet other embodiments, the deactivation event comprises events restricting the ability of the mobile dimensioner device 100-1 to report the results to the server 100-2, such as events that turn off the communication interfaces 152 and/or 160 of the mobile dimensioner device 100. In some embodiments, the deactivation events are initiated by the mobile dimensioner device 100 itself in response to the criteria met in accordance with FIG. 2 as described above.


In this manner, FIG. 2 describes a use case where, for a mobile dimensioner device 100 that may or may not display an accuracy dimension, once an accurate measurement has been derived and displayed, it must be captured within a certain time period or the mobile dimensioner device 100 will be deactivated.
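The FIG. 2 process can be condensed into a sketch like the following. The callbacks standing in for Steps 210 and 216, and the return values, are hypothetical stand-ins for device behavior, not the patented implementation.

```python
import time

def measure_with_timeout(get_accurate_measurement, is_captured,
                         acc_meas_time_threshold: float):
    """Sketch of FIG. 2: once an accurate measurement is displayed,
    it must be captured before the acc-meas-timer exceeds the
    acc-meas-time-threshold, or the device deactivates."""
    start = time.monotonic()
    while True:
        acc_meas_timer = time.monotonic() - start      # Step 206
        if acc_meas_timer > acc_meas_time_threshold:   # Step 208
            return "deactivate"                        # Step 214
        measurement = get_accurate_measurement()       # Step 210
        if measurement is None:
            start = time.monotonic()                   # Step 222: reset timer
            continue
        if is_captured():                              # Step 216
            return measurement                         # Step 218: report

# An immediately captured measurement is reported; a measurement that
# lingers uncaptured past the threshold triggers deactivation.
print(measure_with_timeout(lambda: (10, 8, 6), lambda: True, 5.0))
print(measure_with_timeout(lambda: (10, 8, 6), lambda: False, 0.001))
```

Note that, as in the flowchart, the timer only accumulates while an accurate measurement is displayed but uncaptured; while no accurate measurement exists, the timer keeps resetting.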



FIG. 3A and FIG. 3B are flow charts outlining the process for deactivating a device in accordance with embodiments of the disclosed subject matter in response to repetitive motion. The process begins in FIG. 3A at Step 300 followed by Step 302 in which a check is made to see if an activation event has been received by the mobile dimensioner device 100. If not (Path 303), then the process ends (Step 328). If an activation event has been received (Path 305), then the process continues.


As described earlier, there are different embodiments for the activation event, including but not limited to: a power on event, a power cycling event, an event that turns on the ability to take measurements, an event that turns on the communication interfaces, an event that resets existing measurements, an event that resets the mobile dimensioner device, and an event that includes a preliminary scan of the object.


Next, the mobile dimensioner device 100 resets a counter used in the detection of repetitive motion. This counter is called the contrary-event-counter in FIG. 3A (Step 306), as it is used to track the number of times that a contrary event occurs.


A contrary event is an action by the mobile dimensioning device 100 that does not correspond to an indication for a better measurement of the object. An action may be a new measurement taken by the mobile dimensioning device 100, a movement of the mobile dimensioner device 100, or a combination of both.


An indication may either be text or graphics (or both) for a movement that the mobile dimensioner device 100 should take or a measurement that the mobile dimensioner device 100 should obtain (or both a movement and a measurement) in order to better measure the object being measured. By way of a non-limiting example, a movement indication may be a text instruction that provides directions, such as move left, move up, move in closer, etc., that allow the mobile dimensioner device 100 to be moved into a better position for measuring the subject being measured. In another non-limiting example, a movement indication may be an arrow that provides visual cues, such as move down, move right, move back further, etc., that allow the mobile dimensioner device 100 to be moved into a better position for measuring the subject being measured. A measurement indication may include, but is not limited to, a textual instruction, such as “measure the depth of the object”, that allows the mobile dimensioner device 100 to obtain a measurement of a particular dimension of the subject being measured. A measurement indication may also include, but is not limited to, a visual representation of the object being measured that highlights particular dimensions of the subject being measured for the mobile dimensioner device 100 to obtain, such as an icon of a box with the depth dimension blinking.


In this respect, a contrary event is more specifically defined as a measurement or movement (or both) by the mobile dimensioning device 100 that does not correspond to a text or graphic (or both) that provides information designed to help the mobile dimensioner device 100 obtain better measurements of the subject being measured. Accordingly, the contrary-event-counter is used to track the number of times that movements or measurements of the mobile dimensioner device 100 are not aligned with the goal of obtaining better measurements for the subject being measured.


In some embodiments, a contrary event also occurs whenever new accurate measurements are derived after an accurate measurement has already been derived but not captured.


Returning to FIG. 3A, the next steps in the process reset flags used to track certain events. The accurate-measurement-flag is a flag that is set to TRUE once an accurate measurement has been obtained by the mobile dimensioner device 100. This flag is initially set to FALSE (Step 308). The indication-measurement-flag is a flag that is set to TRUE if the mobile dimensioner device 100 has any indications for better measurement of the object but the actions of the mobile dimensioner device do not correspond to those indications. This flag is initially set to FALSE (Step 310).


Next, the mobile dimensioner device 100 checks to see if the contrary-event-counter is greater than a specified threshold (Step 312). This threshold is called the contrary-event-threshold in FIG. 3A. The contrary-event-threshold may be set by the manufacturer of the mobile dimensioner device 100, may be set to comply with certification standards set by a certification organization, may be set by a server 100-2, may be set in response to input at the mobile dimensioner device 100, or may be set in some other manner.


If the contrary-event-counter is less than or equal to the contrary-event-threshold (Path 307), i.e. the number of times that a contrary event has occurred is less than the allowed number, then the mobile dimensioner device 100 checks to see if an accurate measurement has been obtained and displayed (Step 314). If no accurate measurement has been obtained and displayed (Path 311), then the mobile dimensioner device 100 receives new actions (Step 316). In some embodiments, new actions may be new measurements taken by the mobile dimensioner device. In other embodiments, new actions may be movements of the mobile dimensioner device 100, including but not limited to movements in three dimensional space (up-and-down, side-to-side, front-to-back), or a repositioning of the viewing angle of the mobile dimensioner device 100 relative to the object being measured. In other embodiments, new actions include both new measurements and new movements. It should be noted that if the actions do not produce an accurate measurement that can be displayed, then the process set forth in FIG. 3A repeats until such an accurate measurement is derived.


Next, the mobile dimensioner device 100 checks to see if there are any indications for obtaining a better measurement of the object being measured (Step 322). If not (Path 315), then the process continues as indicated by the connector A. If there are indications (Path 317), then the mobile dimensioner device checks to see if the new actions followed or corresponded to the indications (Step 326). If the new actions followed the indications (Path 325), then the process continues as indicated by connector A. If the new actions did not correspond to the indications (Path 323), then the indication-measurement-flag is set to TRUE (Step 332) and the process continues as indicated by connector A.


Returning to Step 314, if an accurate measurement has been obtained and displayed (Path 313), then the accurate-measurement-flag is set to TRUE (Step 320), and the mobile dimensioner device checks to see if the measurement has been captured (Step 324). As described earlier, in some embodiments, the measurement may be captured in response to input at the mobile dimensioner device 100 or in some other manner. If the measurement is captured (Path 321), then the measurement results are reported (Step 330) and the process is complete (Step 328). If the measurement is not captured (Path 319), then the process continues to Step 316 where new actions are received by the mobile dimensioner device 100, as already described.


Connector A from FIG. 3A continues in FIG. 3B. In this part of the process, the mobile dimensioner device 100 checks the flags and increments the contrary-event-counter accordingly. The mobile dimensioner device 100 first checks to see if the accurate-measurement-flag is TRUE. If it is not (Path 327), then the process continues. If it is (Path 329), then the contrary-event-counter is incremented (Step 338), the accurate-measurement-flag is reset to FALSE (Step 340), and the process continues. The mobile dimensioner device 100 then checks to see if the indication-measurement-flag is TRUE (Step 336). If it is not (Path 331), then the process continues as indicated by connector B. If it is (Path 333), then the contrary-event-counter is incremented (Step 342), the indication-measurement-flag is reset to FALSE (Step 344), and the process continues as indicated by connector B.


Connector B from FIG. 3B then continues in FIG. 3A. At this point, the mobile dimensioner device 100 again checks to see if the contrary-event-counter is greater than a specified threshold (Step 312). If the contrary-event-counter is greater than the contrary-event-threshold (Path 309), i.e. the number of times that a contrary event has occurred is now greater than the allowed number, then the process continues to Step 318, where a deactivation event is received by the mobile dimensioner device 100, and the mobile dimensioner device returns to a state where it waits for an activation event (Step 302).


As described earlier, there are different embodiments for the deactivation event, including but not limited to: a power off event, an event that turns off the ability to take measurements, an event that turns off sensors, an event that restricts the reporting of results, an event that turns off communication interfaces, an event that deactivates sensors and displays the last set of dimensions, an event that deactivates sensors and requires a device reset, and an event that deactivates and deletes the last set of dimensions.


In this manner, FIGS. 3A and 3B describe a use case where, for a mobile dimensioner device 100 that may or may not display an accuracy dimension, if there are indications for getting a better measurement and they are repeatedly ignored or if an accurate measurement is obtained but is perpetually not captured, then the mobile dimensioner device 100 will be deactivated.
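By way of illustration only, the flag-and-counter bookkeeping described above for FIGS. 3A and 3B may be sketched in Python as follows. The ContraryEventTracker class, its method names, and the example threshold are illustrative assumptions and do not appear in the figures.

```python
class ContraryEventTracker:
    """Counts contrary events and signals deactivation past a threshold."""

    def __init__(self, contrary_event_threshold):
        self.contrary_event_threshold = contrary_event_threshold
        self.contrary_event_counter = 0
        self.accurate_measurement_flag = False
        self.indication_measurement_flag = False

    def tally_flags(self):
        """Connector A (FIG. 3B): increment the counter once per raised
        flag, then reset each flag to FALSE (Steps 338-344)."""
        if self.accurate_measurement_flag:
            self.contrary_event_counter += 1
            self.accurate_measurement_flag = False
        if self.indication_measurement_flag:
            self.contrary_event_counter += 1
            self.indication_measurement_flag = False

    def should_deactivate(self):
        """Step 312: deactivate once the counter exceeds the allowed number."""
        return self.contrary_event_counter > self.contrary_event_threshold
```

A tracker configured with a threshold of two, for example, tolerates two contrary events and triggers a deactivation event on the third.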



FIG. 4A and FIG. 4B represent an embodiment that combines elements of FIG. 2, FIG. 3A and FIG. 3B. FIG. 4A and FIG. 4B are flow charts outlining the process for deactivating a device in accordance with embodiments of the disclosed subject matter in response to the detection of excessive measuring time and/or repetitive motion.


The process begins in FIG. 4A at Step 400 followed by Step 402 in which a check is made to see if an activation event has been received by the mobile dimensioner device 100. If not (Path 403), then the process ends (Step 428). If an activation event has been received (Path 405), then the process continues.


As described earlier, there are different embodiments for the activation event, including but not limited to: a power on event, a power cycling event, an event that turns on the ability to take measurements, an event that turns on the communication interfaces, an event that resets existing measurements, an event that resets the mobile dimensioner device, and an event that includes a preliminary scan of the object.


Next, the mobile dimensioner device 100 resets a counter used to track the number of times that a contrary event occurs, i.e. the contrary-event-counter (Step 406). The next step in the process resets the accurate-measurement-flag (Step 408), which is used to track when an accurate measurement has been obtained. The indication-measurement-flag, which is used to track when indications for better measurements are not followed, is then reset (Step 410). Next, the mobile dimensioner device 100 resets the acc-meas-timer (Step 412), which is the timer used to track the amount of time that transpires before the mobile dimensioner device 100 derives an accurate measurement.


An additional timer, called the indication-timer, is then reset (Step 414). This timer is used to track the aggregate elapsed time that the mobile dimensioner device 100 spends in movements or measurements that are not aligned with the goal of obtaining better measurements for the subject being measured. The process then continues as indicated by connector C.
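By way of illustration only, the per-session state reset at Steps 406 through 414 may be sketched in Python as follows. The SessionState class is an illustrative assumption; the field names mirror the labels used in FIG. 4A.

```python
from dataclasses import dataclass


@dataclass
class SessionState:
    """State reset on each activation event (Steps 406-414 of FIG. 4A)."""
    contrary_event_counter: int = 0           # Step 406
    accurate_measurement_flag: bool = False   # Step 408
    indication_measurement_flag: bool = False # Step 410
    acc_meas_timer: float = 0.0               # Step 412: time to an accurate measurement
    indication_timer: float = 0.0             # Step 414: aggregate time spent ignoring indications

    def reset(self):
        """Re-run Steps 406-414 when a new activation event is received."""
        self.contrary_event_counter = 0
        self.accurate_measurement_flag = False
        self.indication_measurement_flag = False
        self.acc_meas_timer = 0.0
        self.indication_timer = 0.0
```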


Connector C from FIG. 4A then continues in FIG. 4B. At this point, the mobile dimensioner device 100 checks to see if the acc-meas-timer is greater than a specified threshold (Step 456). As described earlier, the acc-meas-time-threshold may be set by the manufacturer, may be set to comply with certification standards, may be set in response to input, or the like.


If the acc-meas-timer is less than or equal to the acc-meas-time-threshold (Path 435), i.e. the amount of time spent in obtaining and capturing an accurate measurement is not excessive, then the mobile dimensioner device 100 checks to see if the indication-timer is greater than a specified threshold (Step 458). This threshold is called the indication-time-threshold in FIG. 4B. Similar to the acc-meas-time-threshold, the indication-time-threshold may be set by the manufacturer, may be set to comply with certification standards, may be set in response to input, or the like.


If the indication-timer is less than or equal to the indication-time-threshold (Path 439), the mobile dimensioner device 100 then checks to see if the contrary-event-counter is greater than a specified threshold (Step 460). This threshold is called the contrary-event-threshold in FIG. 4B. As described earlier, the contrary-event-threshold may be set by the manufacturer, may be set to comply with certification standards, may be set by a server, may be set in response to input at the mobile dimensioner device 100, or may be set in some other manner.


If the contrary-event-counter is less than or equal to the contrary-event-threshold (Path 443), i.e. the number of times that a contrary event has occurred is less than or equal to the allowed number, then the process continues as indicated by connector F.


If the acc-meas-timer is greater than the acc-meas-time-threshold (Path 437), i.e. the amount of time spent in obtaining and capturing an accurate measurement is excessive, then the process continues to Step 462 where a deactivation event is received by the mobile dimensioner device 100. The process then continues as indicated by connector D.


If the indication-timer is greater than the indication-time-threshold (Path 441), i.e. the aggregate elapsed time that the mobile dimensioner device 100 spends in movements or measurements that are not aligned with the goal of obtaining better measurements for the object being measured is greater than what is allowed, then the process continues to Step 462, where a deactivation event is received by the mobile dimensioner device 100. The process then continues as indicated by connector D.


If the contrary-event-counter is greater than the contrary-event-threshold (Path 445), i.e. the number of times that a contrary event has occurred is now greater than the allowed number, then the process continues to Step 462, where a deactivation event is received by the mobile dimensioner device 100. The process then continues as indicated by connector D.
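By way of illustration only, the three limit checks of Steps 456, 458, and 460, any one of which causes a deactivation event to be received at Step 462, may be sketched in Python as follows. The function and parameter names are illustrative assumptions.

```python
def needs_deactivation(acc_meas_timer, acc_meas_time_threshold,
                       indication_timer, indication_time_threshold,
                       contrary_event_counter, contrary_event_threshold):
    """Return True when any limit in FIG. 4B is exceeded, signaling Step 462."""
    return (acc_meas_timer > acc_meas_time_threshold            # Step 456
            or indication_timer > indication_time_threshold     # Step 458
            or contrary_event_counter > contrary_event_threshold)  # Step 460
```

Only when all three quantities remain within their thresholds does the process continue via connector F.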


As described earlier, there are different embodiments for the deactivation event, including but not limited to: a power off event, an event that turns off the ability to take measurements, an event that turns off sensors, an event that restricts the reporting of results, an event that turns off communication interfaces, an event that deactivates sensors and displays the last set of dimensions, an event that deactivates sensors and requires a device reset, and an event that deactivates and deletes the last set of dimensions.


Connector D of FIG. 4B then continues in FIG. 4A, where a check is made to see if an activation event has been received by the mobile dimensioner device 100 (Step 402), and if not (Path 403) then the process ends (Step 428).


Returning to Step 460, Connector F of FIG. 4B then continues in FIG. 4A where the mobile dimensioner device 100 checks to see if an accurate measurement has been derived and displayed (Step 416). If no accurate measurement has been obtained and displayed (Path 407), then the acc-meas-timer is reset (Step 464), and the mobile dimensioner device 100 receives new actions (Step 418). As described above, new actions are movements, measurements, or both.


Next, the mobile dimensioner device 100 checks to see if there are any indications for obtaining a better measurement of the object being measured (Step 422). If not (Path 411), then the process continues as indicated by connector E. If there are indications (Path 413), then the mobile dimensioner device checks to see if the new actions followed or corresponded to the indications (Step 426). If the new actions follow the indications (Path 421), then the incrementing of the indication-timer with the passage of time, if it has been running, is stopped or paused (Step 434) and the process continues as indicated by connector E. If the new actions do not correspond to the indications (Path 419), then the indication-measurement-flag is set to TRUE (Step 432), the indication-timer is incremented with the passage of time (Step 436), and the process continues as indicated by connector E.


Connector E of FIG. 4A then continues in FIG. 4B. The mobile dimensioner device 100 first checks to see if the accurate-measurement-flag is TRUE. If it is not (Path 423), then the process continues. If it is (Path 425), then the contrary-event-counter is incremented (Step 444), the accurate-measurement-flag is reset to FALSE (Step 446), and the process continues. The mobile dimensioner device 100 then checks to see if the indication-measurement-flag is TRUE. If it is not (Path 427), then the process continues. If it is (Path 429), then the contrary-event-counter is incremented (Step 448), the indication-measurement-flag is reset to FALSE (Step 450), and the process continues.


The mobile dimensioner device 100 then checks to see if the number of contrary events is 0 (Step 442). If not (Path 431), then the process continues as indicated by connector C. If the number of contrary events is 0 (Path 433), then the acc-meas-time-threshold is augmented (Step 452), and the process continues as indicated by connector C. The augmentation of the acc-meas-time-threshold effectively rewards movements and measurements by the mobile dimensioner device 100 that are aligned with the goal of obtaining better measurements for the subject being measured by giving more time to derive an accurate measurement. Connector C from FIG. 4B then continues in FIG. 4A, as described above.
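By way of illustration only, the reward described at Steps 442 and 452 may be sketched in Python as follows. The function name and the 25% growth factor are illustrative assumptions; the figures do not specify the amount by which the acc-meas-time-threshold is augmented.

```python
def augment_threshold(acc_meas_time_threshold, contrary_event_counter,
                      growth_factor=1.25):
    """Step 452: enlarge the time budget for deriving an accurate
    measurement, but only when no contrary events have accrued (Step 442)."""
    if contrary_event_counter == 0:  # Step 442 check
        return acc_meas_time_threshold * growth_factor
    return acc_meas_time_threshold   # Path 431: threshold unchanged
```

In this way, an operator whose movements remain aligned with obtaining a better measurement is granted additional measuring time rather than being pushed toward deactivation.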


Returning to Step 416, if an accurate measurement has been derived and displayed (Path 409), then the accurate-measurement-flag is set to TRUE (Step 420), and then the mobile dimensioner device checks to see if the measurement has been captured (Step 424). As described earlier, in some embodiments, the measurement may be captured in response to input at the mobile dimensioner device 100 or in some other manner. If the measurement is captured (Path 417), then the measurement results are reported (Step 430) and the process is complete (Step 428). If the measurement is not captured (Path 415), then the clock system 130 of the mobile dimensioner device 100 increments the acc-meas-timer with the passage of time (Step 454), and the process then continues to Step 418, where new actions are received by the mobile dimensioner device 100. In alternative embodiments, once the clock system 130 of the mobile dimensioner device 100 increments the acc-meas-timer with the passage of time (Step 454), the process continues as indicated by connector C. In this embodiment, similar to FIG. 2, once the mobile dimensioner device 100 has an accurate measurement, it must be captured or the device will deactivate.


In this manner, FIGS. 4A and 4B describe a use case where, for a mobile dimensioner device 100 that may or may not display an accuracy dimension, if there are indications for getting an accurate measurement and they are ignored in sufficient quantity and/or duration, or if accurate measurements are derived but not captured after a certain number of times or within a certain time period, then the mobile dimensioner device 100 will be deactivated.



FIG. 5A and FIG. 5B are flow charts outlining the process for deactivating a device in accordance with embodiments of the disclosed subject matter involving the accuracy dimension. The process begins in FIG. 5A at Step 500 followed by Step 502, in which a check is made to see if an activation event has been received by the mobile dimensioner device 100. If not (Path 503), then the process ends (Step 520). If an activation event has been received (Path 505), then the process continues.


As described earlier, there are different embodiments for the activation event, including but not limited to: a power on event, a power cycling event, an event that turns on the ability to take measurements, an event that turns on the communication interfaces, an event that resets existing measurements, an event that resets the mobile dimensioner device, and an event that includes a preliminary scan of the object.


The mobile dimensioner device 100 first resets a variable used to store the previous measurements taken by the mobile dimensioner device 100 (Step 504). This is the prev-measurements variable in FIG. 5A. The mobile dimensioner device then resets a variable used to store the accuracy dimension (i.e. the "d" value) for the previous set of measurements taken by the mobile dimensioner device (Step 506). This is the prev-acc-value variable in FIG. 5A. The last variable reset by the mobile dimensioner device 100 is the curr-acc-value (Step 508), which is used to store the accuracy dimension of the current set of measurements taken by the mobile dimensioner device 100. Note that, in some embodiments, each of these variables may be implemented as separate but related variables having the attributes described herein.


Next, the mobile dimensioner device 100 checks to see if an accurate measurement has been derived and displayed (Step 510). If no accurate measurement has been derived and displayed yet (Path 507), then the mobile dimensioner device 100 receives new actions (Step 512). As described above, new actions are movements, measurements, or both. The process then continues as indicated by connector G.


Connector G in FIG. 5A continues in FIG. 5B. The mobile dimensioner device 100 then checks to see if the accuracy dimension for the current measurement is defined (Step 524). If not (Path 523), then the process continues as indicated by connector H. If the accuracy dimension is defined (Path 525), then the mobile dimensioner device 100 sets the curr-acc-value variable to the value of the accuracy dimension of the current measurement (Step 526). The mobile dimensioner device 100 then checks to see if the prev-acc-value variable is 0 or if the prev-acc-value variable is greater than the curr-acc-value variable (Step 528). If not (Path 527), then the process continues as indicated by connector H. If so (Path 529), then the mobile dimensioner device 100 sets the prev-acc-value variable to the curr-acc-value variable (Step 530) and sets the prev-measurements variable to the value of the current measurements (Step 532). The process then continues as indicated by connector H.
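By way of illustration only, the bookkeeping of Steps 524 through 532, which retains the measurement having the smallest accuracy dimension "d" observed so far, may be sketched in Python as follows. The function name is an illustrative assumption; as at Step 528, a prev-acc-value of 0 denotes that no previous value has been stored.

```python
def track_best_measurement(prev_acc_value, prev_measurements,
                           curr_acc_value, curr_measurements):
    """Return the (accuracy dimension, measurements) pair with the better
    (smaller) "d" value, per Steps 528-532 of FIG. 5B."""
    if prev_acc_value == 0 or prev_acc_value > curr_acc_value:  # Step 528
        # Steps 530-532: adopt the current, more accurate measurement.
        return curr_acc_value, curr_measurements
    return prev_acc_value, prev_measurements
```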


Connector H of FIG. 5B then continues in FIG. 5A. If an accurate measurement has been obtained and displayed (Path 509), then the mobile dimensioner device 100 checks to see if the measurement has been captured (Step 514). In some embodiments, the measurement may be captured in response to input at the mobile dimensioner device 100 or in some other manner. If the measurement is not captured (Path 511), then the mobile dimensioner device 100 receives new actions (Step 512), as described above. If the measurement is captured (Path 513), then the mobile dimensioner device 100 checks to see if the curr-acc-value variable exceeds the prev-acc-value variable (Step 516), i.e. whether the mobile dimensioner device 100 is attempting to capture a measurement with less accuracy (a large accuracy dimension "d" value) after having previously obtained a measurement with more accuracy (a small accuracy dimension "d" value). If not (Path 515), then the measurement results are reported (Step 522) and the process is complete (Step 520). If so (Path 517), then the mobile dimensioner device 100 checks to see if it should use the previous accurate measurements stored in the prev-measurements variable (Step 518). If so (Path 521), then again, the measurement results are reported (Step 522). If not, then a deactivation event is received by the mobile dimensioner device 100, and the mobile dimensioner device returns to a state where it waits for an activation event (Step 502).
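By way of illustration only, the capture gate described above may be sketched in Python as follows. The function name and the returned labels are illustrative assumptions; the sketch further assumes that prev-acc-value already holds the best "d" value observed during the session.

```python
def capture_decision(curr_acc_value, prev_acc_value, use_previous):
    """Decide the outcome of a capture attempt per FIG. 5A:
    report the current results, substitute the stored previous
    measurement, or receive a deactivation event."""
    if curr_acc_value <= prev_acc_value:
        # Not attempting a less accurate capture: report (Path 515).
        return "report-current"
    if use_previous:
        # Reuse the stored, more accurate measurements (Path 521).
        return "report-previous"
    # Attempting to report a worse measurement without substitution:
    # a deactivation event is received.
    return "deactivate"
```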


As described earlier, deactivation events may be a power off event, an event requiring a reset, an event that turns off the ability to take measurements, an event that clears out any active measurements, an event restricting the ability to report the results, or any combination therein.


In this manner, FIGS. 5A and 5B describe a use case where, for a mobile dimensioner device 100 that displays an accuracy dimension, once a measurement with a greater accuracy has been taken (i.e. a small accuracy dimension “d” value), the mobile dimensioner device 100 is prevented from reporting measurements with less accuracy (i.e. a large accuracy dimension “d” value).


In this respect, the processes described in FIG. 2, FIG. 3A & FIG. 3B, FIG. 4A & FIG. 4B, and FIG. 5A & FIG. 5B should make it clear to a person of ordinary skill in the art how the mobile dimensioner device 100 of the present invention detects excessive measuring time and/or a repetitive range of motion and receives a deactivation event upon detection of these activities in an attempt to mitigate the risk of systematic reporting of improper measurements designed to defraud customers and shippers.


To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:

  • U.S. Pat. No. 6,832,725; U.S. Pat. No. 7,128,266;
  • U.S. Pat. No. 7,159,783; U.S. Pat. No. 7,413,127;
  • U.S. Pat. No. 7,726,575; U.S. Pat. No. 8,294,969;
  • U.S. Pat. No. 8,317,105; U.S. Pat. No. 8,322,622;
  • U.S. Pat. No. 8,366,005; U.S. Pat. No. 8,371,507;
  • U.S. Pat. No. 8,376,233; U.S. Pat. No. 8,381,979;
  • U.S. Pat. No. 8,390,909; U.S. Pat. No. 8,408,464;
  • U.S. Pat. No. 8,408,468; U.S. Pat. No. 8,408,469;
  • U.S. Pat. No. 8,424,768; U.S. Pat. No. 8,448,863;
  • U.S. Pat. No. 8,457,013; U.S. Pat. No. 8,459,557;
  • U.S. Pat. No. 8,463,079;
  • U.S. Pat. No. 8,469,272; U.S. Pat. No. 8,474,712;
  • U.S. Pat. No. 8,479,992; U.S. Pat. No. 8,490,877;
  • U.S. Pat. No. 8,517,271; U.S. Pat. No. 8,523,076;
  • U.S. Pat. No. 8,528,818; U.S. Pat. No. 8,544,737;
  • U.S. Pat. No. 8,548,242; U.S. Pat. No. 8,548,420;
  • U.S. Pat. No. 8,550,335; U.S. Pat. No. 8,550,354;
  • U.S. Pat. No. 8,550,357; U.S. Pat. No. 8,556,174;
  • U.S. Pat. No. 8,556,176; U.S. Pat. No. 8,556,177;
  • U.S. Pat. No. 8,559,767; U.S. Pat. No. 8,599,957;
  • U.S. Pat. No. 8,561,895; U.S. Pat. No. 8,561,903;
  • U.S. Pat. No. 8,561,905; U.S. Pat. No. 8,565,107;
  • U.S. Pat. No. 8,571,307; U.S. Pat. No. 8,579,200;
  • U.S. Pat. No. 8,583,924; U.S. Pat. No. 8,584,945;
  • U.S. Pat. No. 8,587,595; U.S. Pat. No. 8,587,697;
  • U.S. Pat. No. 8,588,869; U.S. Pat. No. 8,590,789;
  • U.S. Pat. No. 8,596,539; U.S. Pat. No. 8,596,542;
  • U.S. Pat. No. 8,596,543; U.S. Pat. No. 8,599,271;
  • U.S. Pat. No. 8,600,158;
  • U.S. Pat. No. 8,600,167; U.S. Pat. No. 8,602,309;
  • U.S. Pat. No. 8,608,053; U.S. Pat. No. 8,608,071;
  • U.S. Pat. No. 8,611,309; U.S. Pat. No. 8,615,487;
  • U.S. Pat. No. 8,616,454; U.S. Pat. No. 8,621,123;
  • U.S. Pat. No. 8,622,303; U.S. Pat. No. 8,628,013;
  • U.S. Pat. No. 8,628,015; U.S. Pat. No. 8,628,016;
  • U.S. Pat. No. 8,629,926; U.S. Pat. No. 8,630,491;
  • U.S. Pat. No. 8,635,309; U.S. Pat. No. 8,636,200;
  • U.S. Pat. No. 8,636,212; U.S. Pat. No. 8,636,215;
  • U.S. Pat. No. 8,636,224; U.S. Pat. No. 8,638,806;
  • U.S. Pat. No. 8,640,958; U.S. Pat. No. 8,640,960;
  • U.S. Pat. No. 8,643,717; U.S. Pat. No. 8,646,692;
  • U.S. Pat. No. 8,646,694; U.S. Pat. No. 8,657,200;
  • U.S. Pat. No. 8,659,397; U.S. Pat. No. 8,668,149;
  • U.S. Pat. No. 8,678,285; U.S. Pat. No. 8,678,286;
  • U.S. Pat. No. 8,682,077; U.S. Pat. No. 8,687,282;
  • U.S. Pat. No. 8,692,927; U.S. Pat. No. 8,695,880;
  • U.S. Pat. No. 8,698,949; U.S. Pat. No. 8,717,494;
  • U.S. Pat. No. 8,720,783;
  • U.S. Pat. No. 8,723,804; U.S. Pat. No. 8,723,904;
  • U.S. Pat. No. 8,727,223; U.S. Pat. No. D702,237;
  • U.S. Pat. No. 8,740,082; U.S. Pat. No. 8,740,085;
  • U.S. Pat. No. 8,746,563; U.S. Pat. No. 8,750,445;
  • U.S. Pat. No. 8,752,766; U.S. Pat. No. 8,756,059;
  • U.S. Pat. No. 8,757,495; U.S. Pat. No. 8,760,563;
  • U.S. Pat. No. 8,763,909; U.S. Pat. No. 8,777,108;
  • U.S. Pat. No. 8,777,109; U.S. Pat. No. 8,779,898;
  • U.S. Pat. No. 8,781,520; U.S. Pat. No. 8,783,573;
  • U.S. Pat. No. 8,789,757; U.S. Pat. No. 8,789,758;
  • U.S. Pat. No. 8,789,759; U.S. Pat. No. 8,794,520;
  • U.S. Pat. No. 8,794,522; U.S. Pat. No. 8,794,526;
  • U.S. Pat. No. 8,798,367; U.S. Pat. No. 8,807,431;
  • U.S. Pat. No. 8,807,432; U.S. Pat. No. 8,820,630;
  • U.S. Pat. No. 8,854,633;
  • International Publication No. 2013/163789;
  • International Publication No. 2013/173985;
  • International Publication No. 2014/019130;
  • International Publication No. 2014/110495;
  • U.S. Patent Application Publication No. 2008/0185432;
  • U.S. Patent Application Publication No. 2009/0134221;
  • U.S. Patent Application Publication No. 2010/0177080;
  • U.S. Patent Application Publication No. 2010/0177076;
  • U.S. Patent Application Publication No. 2010/0177707;
  • U.S. Patent Application Publication No. 2010/0177749;
  • U.S. Patent Application Publication No. 2010/0202702;
  • U.S. Patent Application Publication No. 2010/0220894;
  • U.S. Patent Application Publication No. 2011/0202554;
  • U.S. Patent Application Publication No. 2012/0111946;
  • U.S. Patent Application Publication No. 2012/0138685;
  • U.S. Patent Application Publication No. 2012/0168511;
  • U.S. Patent Application Publication No. 2012/0168512;
  • U.S. Patent Application Publication No. 2012/0193423;
  • U.S. Patent Application Publication No. 2012/0203647;
  • U.S. Patent Application Publication No. 2012/0223141;
  • U.S. Patent Application Publication No. 2012/0228382;
  • U.S. Patent Application Publication No. 2012/0248188;
  • U.S. Patent Application Publication No. 2013/0043312;
  • U.S. Patent Application Publication No. 2013/0056285;
  • U.S. Patent Application Publication No. 2013/0070322;
  • U.S. Patent Application Publication No. 2013/0075168;
  • U.S. Patent Application Publication No. 2013/0082104;
  • U.S. Patent Application Publication No. 2013/0175341;
  • U.S. Patent Application Publication No. 2013/0175343;
  • U.S. Patent Application Publication No. 2013/0200158;
  • U.S. Patent Application Publication No. 2013/0256418;
  • U.S. Patent Application Publication No. 2013/0257744;
  • U.S. Patent Application Publication No. 2013/0257759;
  • U.S. Patent Application Publication No. 2013/0270346;
  • U.S. Patent Application Publication No. 2013/0278425;
  • U.S. Patent Application Publication No. 2013/0287258;
  • U.S. Patent Application Publication No. 2013/0292475;
  • U.S. Patent Application Publication No. 2013/0292477;
  • U.S. Patent Application Publication No. 2013/0293539;
  • U.S. Patent Application Publication No. 2013/0293540;
  • U.S. Patent Application Publication No. 2013/0306728;
  • U.S. Patent Application Publication No. 2013/0306730;
  • U.S. Patent Application Publication No. 2013/0306731;
  • U.S. Patent Application Publication No. 2013/0307964;
  • U.S. Patent Application Publication No. 2013/0308625;
  • U.S. Patent Application Publication No. 2013/0313324;
  • U.S. Patent Application Publication No. 2013/0313325;
  • U.S. Patent Application Publication No. 2013/0341399;
  • U.S. Patent Application Publication No. 2013/0342717;
  • U.S. Patent Application Publication No. 2014/0001267;
  • U.S. Patent Application Publication No. 2014/0002828;
  • U.S. Patent Application Publication No. 2014/0008430;
  • U.S. Patent Application Publication No. 2014/0008439;
  • U.S. Patent Application Publication No. 2014/0025584;
  • U.S. Patent Application Publication No. 2014/0027518;
  • U.S. Patent Application Publication No. 2014/0034734;
  • U.S. Patent Application Publication No. 2014/0036848;
  • U.S. Patent Application Publication No. 2014/0039693;
  • U.S. Patent Application Publication No. 2014/0042814;
  • U.S. Patent Application Publication No. 2014/0049120;
  • U.S. Patent Application Publication No. 2014/0049635;
  • U.S. Patent Application Publication No. 2014/0061305;
  • U.S. Patent Application Publication No. 2014/0061306;
  • U.S. Patent Application Publication No. 2014/0063289;
  • U.S. Patent Application Publication No. 2014/0066136;
  • U.S. Patent Application Publication No. 2014/0067692;
  • U.S. Patent Application Publication No. 2014/0070005;
  • U.S. Patent Application Publication No. 2014/0071840;
  • U.S. Patent Application Publication No. 2014/0074746;
  • U.S. Patent Application Publication No. 2014/0075846;
  • U.S. Patent Application Publication No. 2014/0076974;
  • U.S. Patent Application Publication No. 2014/0078341;
  • U.S. Patent Application Publication No. 2014/0078342;
  • U.S. Patent Application Publication No. 2014/0078345;
  • U.S. Patent Application Publication No. 2014/0084068;
  • U.S. Patent Application Publication No. 2014/0097249;
  • U.S. Patent Application Publication No. 2014/0098792;
  • U.S. Patent Application Publication No. 2014/0100774;
  • U.S. Patent Application Publication No. 2014/0100813;
  • U.S. Patent Application Publication No. 2014/0103115;
  • U.S. Patent Application Publication No. 2014/0104413;
  • U.S. Patent Application Publication No. 2014/0104414;
  • U.S. Patent Application Publication No. 2014/0104416;
  • U.S. Patent Application Publication No. 2014/0104451;
  • U.S. Patent Application Publication No. 2014/0106594;
  • U.S. Patent Application Publication No. 2014/0106725;
  • U.S. Patent Application Publication No. 2014/0108010;
  • U.S. Patent Application Publication No. 2014/0108402;
  • U.S. Patent Application Publication No. 2014/0108682;
  • U.S. Patent Application Publication No. 2014/0110485;
  • U.S. Patent Application Publication No. 2014/0114530;
  • U.S. Patent Application Publication No. 2014/0124577;
  • U.S. Patent Application Publication No. 2014/0124579;
  • U.S. Patent Application Publication No. 2014/0125842;
  • U.S. Patent Application Publication No. 2014/0125853;
  • U.S. Patent Application Publication No. 2014/0125999;
  • U.S. Patent Application Publication No. 2014/0129378;
  • U.S. Patent Application Publication No. 2014/0131438;
  • U.S. Patent Application Publication No. 2014/0131441;
  • U.S. Patent Application Publication No. 2014/0131443;
  • U.S. Patent Application Publication No. 2014/0131444;
  • U.S. Patent Application Publication No. 2014/0131445;
  • U.S. Patent Application Publication No. 2014/0131448;
  • U.S. Patent Application Publication No. 2014/0133379;
  • U.S. Patent Application Publication No. 2014/0136208;
  • U.S. Patent Application Publication No. 2014/0140585;
  • U.S. Patent Application Publication No. 2014/0151453;
  • U.S. Patent Application Publication No. 2014/0152882;
  • U.S. Patent Application Publication No. 2014/0158770;
  • U.S. Patent Application Publication No. 2014/0159869;
  • U.S. Patent Application Publication No. 2014/0160329;
  • U.S. Patent Application Publication No. 2014/0166755;
  • U.S. Patent Application Publication No. 2014/0166757;
  • U.S. Patent Application Publication No. 2014/0166759;
  • U.S. Patent Application Publication No. 2014/0166760;
  • U.S. Patent Application Publication No. 2014/0166761;
  • U.S. Patent Application Publication No. 2014/0168787;
  • U.S. Patent Application Publication No. 2014/0175165;
  • U.S. Patent Application Publication No. 2014/0175169;
  • U.S. Patent Application Publication No. 2014/0175172;
  • U.S. Patent Application Publication No. 2014/0175174;
  • U.S. Patent Application Publication No. 2014/0191644;
  • U.S. Patent Application Publication No. 2014/0191913;
  • U.S. Patent Application Publication No. 2014/0197238;
  • U.S. Patent Application Publication No. 2014/0197239;
  • U.S. Patent Application Publication No. 2014/0197304;
  • U.S. Patent Application Publication No. 2014/0203087;
  • U.S. Patent Application Publication No. 2014/0204268;
  • U.S. Patent Application Publication No. 2014/0214631;
  • U.S. Patent Application Publication No. 2014/0217166;
  • U.S. Patent Application Publication No. 2014/0217180;
  • U.S. Patent Application Publication No. 2014/0267609;
  • U.S. patent application Ser. No. 13/367,978 for a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.);
  • U.S. patent application Ser. No. 29/436,337 for an Electronic Device, filed Nov. 5, 2012 (Fitch et al.);
  • U.S. patent application Ser. No. 13/771,508 for an Optical Redirection Adapter, filed Feb. 20, 2013 (Anderson);
  • U.S. patent application Ser. No. 13/852,097 for a System and Method for Capturing and Preserving Vehicle Event Data, filed Mar. 28, 2013 (Barker et al.);
  • U.S. patent application Ser. No. 13/902,110 for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Hollifield);
  • U.S. patent application Ser. No. 13/902,144, for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Chamberlin);
  • U.S. patent application Ser. No. 13/902,242 for a System For Providing A Continuous Communication Link With A Symbol Reading Device, filed May 24, 2013 (Smith et al.);
  • U.S. patent application Ser. No. 13/912,262 for a Method of Error Correction for 3D Imaging Device, filed Jun. 7, 2013 (Jovanovski et al.);
  • U.S. patent application Ser. No. 13/912,702 for a System and Method for Reading Code Symbols at Long Range Using Source Power Control, filed Jun. 7, 2013 (Xian et al.);
  • U.S. patent application Ser. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 13/922,339 for a System and Method for Reading Code Symbols Using a Variable Field of View, filed Jun. 20, 2013 (Xian et al.);
  • U.S. patent application Ser. No. 13/927,398 for a Code Symbol Reading System Having Adaptive Autofocus, filed Jun. 26, 2013 (Todeschini);
  • U.S. patent application Ser. No. 13/930,913 for a Mobile Device Having an Improved User Interface for Reading Code Symbols, filed Jun. 28, 2013 (Gelay et al.);
  • U.S. patent application Ser. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/459,681 for an Electronic Device Enclosure, filed Jul. 2, 2013 (Chaney et al.);
  • U.S. patent application Ser. No. 13/933,415 for an Electronic Device Case, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/459,785 for a Scanner and Charging Base, filed Jul. 3, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 29/459,823 for a Scanner, filed Jul. 3, 2013 (Zhou et al.);
  • U.S. patent application Ser. No. 13/947,296 for a System and Method for Selectively Reading Code Symbols, filed Jul. 22, 2013 (Rueblinger et al.);
  • U.S. patent application Ser. No. 13/950,544 for a Code Symbol Reading System Having Adjustable Object Detection, filed Jul. 25, 2013 (Jiang);
  • U.S. patent application Ser. No. 13/961,408 for a Method for Manufacturing Laser Scanners, filed Aug. 7, 2013 (Saber et al.);
  • U.S. patent application Ser. No. 14/018,729 for a Method for Operating a Laser Scanner, filed Sep. 5, 2013 (Feng et al.);
  • U.S. patent application Ser. No. 14/019,616 for a Device Having Light Source to Reduce Surface Pathogens, filed Sep. 6, 2013 (Todeschini);
  • U.S. patent application Ser. No. 14/023,762 for a Handheld Indicia Reader Having Locking Endcap, filed Sep. 11, 2013 (Gannon);
  • U.S. patent application Ser. No. 14/035,474 for Augmented-Reality Signature Capture, filed Sep. 24, 2013 (Todeschini);
  • U.S. patent application Ser. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/055,234 for Dimensioning System, filed Oct. 16, 2013 (Fletcher);
  • U.S. patent application Ser. No. 14/053,314 for Indicia Reader, filed Oct. 14, 2013 (Huck);
  • U.S. patent application Ser. No. 14/065,768 for Hybrid System and Method for Reading Indicia, filed Oct. 29, 2013 (Meier et al.);
  • U.S. patent application Ser. No. 14/074,746 for Self-Checkout Shopping System, filed Nov. 8, 2013 (Hejl et al.);
  • U.S. patent application Ser. No. 14/074,787 for Method and System for Configuring Mobile Devices via NFC Technology, filed Nov. 8, 2013 (Smith et al.);
  • U.S. patent application Ser. No. 14/087,190 for Optimal Range Indicators for Bar Code Validation, filed Nov. 22, 2013 (Hejl);
  • U.S. patent application Ser. No. 14/094,087 for Method and System for Communicating Information in a Digital Signal, filed Dec. 2, 2013 (Peake et al.);
  • U.S. patent application Ser. No. 14/101,965 for High Dynamic-Range Indicia Reading System, filed Dec. 10, 2013 (Xian);
  • U.S. patent application Ser. No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.);
  • U.S. patent application Ser. No. 14/154,207 for Laser Barcode Scanner, filed Jan. 14, 2014 (Hou et al.);
  • U.S. patent application Ser. No. 14/165,980 for System and Method for Measuring Irregular Objects with a Single Camera filed Jan. 28, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/166,103 for Indicia Reading Terminal Including Optical Filter filed Jan. 28, 2014 (Lu et al.);
  • U.S. patent application Ser. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/250,923 for Reading Apparatus Having Partial Frame Operating Mode filed Apr. 11, 2014, (Deng et al.);
  • U.S. patent application Ser. No. 14/257,174 for Imaging Terminal Having Data Compression filed Apr. 21, 2014, (Barber et al.);
  • U.S. patent application Ser. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering);
  • U.S. patent application Ser. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/274,858 for Mobile Printer with Optional Battery Accessory filed May 12, 2014 (Marty et al.);
  • U.S. patent application Ser. No. 14/277,337 for MULTIPURPOSE OPTICAL READER, filed May 14, 2014 (Jovanovski et al.);
  • U.S. patent application Ser. No. 14/283,282 for TERMINAL HAVING ILLUMINATION AND FOCUS CONTROL filed May 21, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/300,276 for METHOD AND SYSTEM FOR CONSIDERING INFORMATION ABOUT AN EXPECTED RESPONSE WHEN PERFORMING SPEECH RECOGNITION, filed Jun. 10, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/305,153 for INDICIA READING SYSTEM EMPLOYING DIGITAL GAIN CONTROL filed Jun. 16, 2014 (Xian et al.);
  • U.S. patent application Ser. No. 14/310,226 for AUTOFOCUSING OPTICAL IMAGING DEVICE filed Jun. 20, 2014 (Koziol et al.);
  • U.S. patent application Ser. No. 14/327,722 for CUSTOMER FACING IMAGING SYSTEMS AND METHODS FOR OBTAINING IMAGES filed Jul. 10, 2014 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/327,827 for a MOBILE-PHONE ADAPTER FOR ELECTRONIC TRANSACTIONS, filed Jul. 10, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/329,303 for CELL PHONE READING MODE USING IMAGE TIMER filed Jul. 11, 2014 (Coyle);
  • U.S. patent application Ser. No. 14/333,588 for SYMBOL READING SYSTEM WITH INTEGRATED SCALE BASE filed Jul. 17, 2014 (Barten);
  • U.S. patent application Ser. No. 14/334,934 for a SYSTEM AND METHOD FOR INDICIA VERIFICATION, filed Jul. 18, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/336,188 for METHOD OF AND SYSTEM FOR DETECTING OBJECT WEIGHING INTERFERENCES, filed Jul. 21, 2014 (Amundsen et al.);
  • U.S. patent application Ser. No. 14/339,708 for LASER SCANNING CODE SYMBOL READING SYSTEM, filed Jul. 24, 2014 (Xian et al.);
  • U.S. patent application Ser. No. 14/340,627 for an AXIALLY REINFORCED FLEXIBLE SCAN ELEMENT, filed Jul. 25, 2014 (Rueblinger et al.);
  • U.S. patent application Ser. No. 14/340,716 for an OPTICAL IMAGER AND METHOD FOR CORRELATING A MEDICATION PACKAGE WITH A PATIENT, filed Jul. 25, 2014 (Ellis);
  • U.S. patent application Ser. No. 14/342,544 for Imaging Based Barcode Scanner Engine with Multiple Elements Supported on a Common Printed Circuit Board filed Mar. 4, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/345,735 for Optical Indicia Reading Terminal with Combined Illumination filed Mar. 19, 2014 (Ouyang);
  • U.S. patent application Ser. No. 14/355,613 for Optical Indicia Reading Terminal with Color Image Sensor filed May 1, 2014 (Lu et al.);
  • U.S. patent application Ser. No. 14/370,237 for WEB-BASED SCAN-TASK ENABLED SYSTEM AND METHOD OF AND APPARATUS FOR DEVELOPING AND DEPLOYING THE SAME ON A CLIENT-SERVER NETWORK filed Jul. 2, 2014 (Chen et al.);
  • U.S. patent application Ser. No. 14/370,267 for INDUSTRIAL DESIGN FOR CONSUMER DEVICE BASED SCANNING AND MOBILITY, filed Jul. 2, 2014 (Ma et al.);
  • U.S. patent application Ser. No. 14/376,472, for an ENCODED INFORMATION READING TERMINAL INCLUDING HTTP SERVER, filed Aug. 4, 2014 (Lu);
  • U.S. patent application Ser. No. 14/379,057 for METHOD OF USING CAMERA SENSOR INTERFACE TO TRANSFER MULTIPLE CHANNELS OF SCAN DATA USING AN IMAGE FORMAT filed Aug. 15, 2014 (Wang et al.);
  • U.S. patent application Ser. No. 14/452,697 for INTERACTIVE INDICIA READER, filed Aug. 6, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/453,019 for DIMENSIONING SYSTEM WITH GUIDED ALIGNMENT, filed Aug. 6, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/460,387 for APPARATUS FOR DISPLAYING BAR CODES FROM LIGHT EMITTING DISPLAY SURFACES filed Aug. 15, 2014 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/460,829 for ENCODED INFORMATION READING TERMINAL WITH WIRELESS PATH SELECTION CAPABILITY, filed Aug. 15, 2014 (Wang et al.);
  • U.S. patent application Ser. No. 14/462,801 for MOBILE COMPUTING DEVICE WITH DATA COGNITION SOFTWARE, filed on Aug. 19, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/446,387 for INDICIA READING TERMINAL PROCESSING PLURALITY OF FRAMES OF IMAGE DATA RESPONSIVELY TO TRIGGER SIGNAL ACTIVATION filed Jul. 30, 2014 (Wang et al.);
  • U.S. patent application Ser. No. 14/446,391 for MULTIFUNCTION POINT OF SALE APPARATUS WITH OPTICAL SIGNATURE CAPTURE filed Jul. 30, 2014 (Good et al.);
  • U.S. patent application Ser. No. 29/486,759 for an Imaging Terminal, filed Apr. 2, 2014 (Oberpriller et al.);
  • U.S. patent application Ser. No. 29/492,903 for an INDICIA SCANNER, filed Jun. 4, 2014 (Zhou et al.); and
  • U.S. patent application Ser. No. 29/494,725 for an IN-COUNTER BARCODE SCANNER, filed Jun. 24, 2014 (Oberpriller et al.).


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A mobile dimensioner device, comprising: a display; one or more optical sensors; one or more measurement sensors; an input subsystem; one or more processors; and memory containing instructions executable by the one or more processors whereby the device is operable to: receive a threshold number of contrary events; activate at least one of the one or more measurement sensors; derive a first set of dimensions for an object and an associated indication of the dimensional accuracy of each of the dimensions based on information received from the one or more measurement sensors; display, on the display, the first set of dimensions and the associated indication of the dimensional accuracy of each of the dimensions; display, on the display, an indication to obtain a better measurement of the object; detect a number of contrary events; if the number of contrary events detected exceeds the threshold number of contrary events, receive a deactivation event and deactivate the device in accordance with the deactivation event.
  • 2. The device of claim 1, wherein the device is further operable to: derive a set of preliminary dimensions for an object based on information received from the one or more measurement sensors.
  • 3. The device of claim 1, wherein the contrary event is an action that does not correspond to an indication to obtain a better measurement of the object.
  • 4. The device of claim 1, wherein the threshold number of contrary events is defined by one of the group consisting of: defined by the manufacturer of the device, defined to comply with certification standards set by a certification organization, defined in response to input received via the input subsystem at the device, and defined in response to information received at the device from a server.
  • 5. The device of claim 1, wherein the deactivation event is selected from the group consisting of: a power off event for the device, an event that turns off the ability of the device to take measurements, an event that turns off the one or more measurement sensors of the device, an event that restricts the ability of the device to report results, an event that turns off one or more communication interfaces of the device, an event that deactivates the measurement sensors and displays the first set of dimensions, an event that deactivates the measurement sensors and places the device in a state requiring reset, and an event that deactivates the measurement sensors and deletes the first set of dimensions.
  • 6. The device of claim 1, wherein the one or more optical sensors is selected from a group consisting of: a barcode sensor, a camera, and an image sensor.
  • 7. The device of claim 1, wherein the one or more measurement sensors is selected from a group consisting of: point-cloud projection, structured light, and stereoscopic cameras and n-scopic cameras.
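The contrary-event logic recited in claims 1, 3, 4, and 5 can be modeled as a simple counter compared against a configurable threshold. The sketch below is illustrative only and is not part of the patent text; all names (`MobileDimensioner`, `record_action`, `deactivate`) are hypothetical, and the deactivation behavior shown is just one of the options enumerated in claim 5.

```python
class MobileDimensioner:
    """Minimal model of the contrary-event threshold behavior of claim 1."""

    def __init__(self, contrary_event_threshold):
        # Per claim 4, the threshold may be set by the manufacturer, by
        # certification standards, via local input, or by a server.
        self.threshold = contrary_event_threshold
        self.contrary_events = 0
        self.active = True

    def record_action(self, follows_guidance):
        # Per claim 3, a contrary event is an operator action that does not
        # correspond to the displayed indication to obtain a better measurement.
        if not self.active:
            return
        if not follows_guidance:
            self.contrary_events += 1
        if self.contrary_events > self.threshold:
            self.deactivate()

    def deactivate(self):
        # One deactivation event from claim 5: turn off the ability of the
        # device to take measurements.
        self.active = False


device = MobileDimensioner(contrary_event_threshold=2)
for _ in range(3):
    device.record_action(follows_guidance=False)
print(device.active)  # False once contrary events exceed the threshold
```

Under this model, repeatedly ignoring the guidance to reposition for a better measurement (e.g., to systematically capture the lowest or highest dimensions) eventually trips the threshold and disables measurement until the device is reset.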
US Referenced Citations (797)
Number Name Date Kind
3971065 Bayer Jul 1976 A
4026031 Siddall et al. May 1977 A
4279328 Ahlbom Jul 1981 A
4398811 Nishioka et al. Aug 1983 A
4495559 Gelatt, Jr. Jan 1985 A
4730190 Win et al. Mar 1988 A
4803639 Steele et al. Feb 1989 A
5184733 Amarson et al. Feb 1993 A
5198648 Hibbard Mar 1993 A
5220536 Stringer et al. Jun 1993 A
5331118 Jensen Jul 1994 A
5359185 Hanson Oct 1994 A
5384901 Glassner et al. Jan 1995 A
5548707 LoNegro et al. Aug 1996 A
5555090 Schmutz Sep 1996 A
5561526 Huber et al. Oct 1996 A
5590060 Granville et al. Dec 1996 A
5606534 Stringer et al. Feb 1997 A
5619245 Kessler et al. Apr 1997 A
5655095 LoNegro et al. Aug 1997 A
5661561 Wurz et al. Aug 1997 A
5699161 Woodworth Dec 1997 A
5729750 Ishida Mar 1998 A
5730252 Herbinet Mar 1998 A
5732147 Tao Mar 1998 A
5734476 Dlugos Mar 1998 A
5737074 Haga et al. Apr 1998 A
5767962 Suzuki et al. Jun 1998 A
5831737 Stringer et al. Nov 1998 A
5850370 Stringer et al. Dec 1998 A
5850490 Johnson Dec 1998 A
5869827 Rando Feb 1999 A
5870220 Migdal et al. Feb 1999 A
5900611 Hecht May 1999 A
5923428 Woodworth Jul 1999 A
5929856 LoNegro et al. Jul 1999 A
5938710 Lanza et al. Aug 1999 A
5959568 Woolley Sep 1999 A
5960098 Tao Sep 1999 A
5969823 Wurz et al. Oct 1999 A
5978512 Kim et al. Nov 1999 A
5979760 Freyman et al. Nov 1999 A
5988862 Kacyra et al. Nov 1999 A
5991041 Woodworth Nov 1999 A
6009189 Schaack Dec 1999 A
6025847 Marks Feb 2000 A
6035067 Ponticos Mar 2000 A
6049386 Stringer et al. Apr 2000 A
6053409 Brobst et al. Apr 2000 A
6064759 Buckley et al. May 2000 A
6067110 Nonaka et al. May 2000 A
6069696 McQueen et al. May 2000 A
6115114 Berg et al. Sep 2000 A
6137577 Woodworth Oct 2000 A
6177999 Wurz et al. Jan 2001 B1
6189223 Haug Feb 2001 B1
6232597 Kley May 2001 B1
6236403 Chaki May 2001 B1
6246468 Dimsdale Jun 2001 B1
6333749 Reinhardt et al. Dec 2001 B1
6336587 He et al. Jan 2002 B1
6369401 Lee Apr 2002 B1
6373579 Ober et al. Apr 2002 B1
6429803 Kumar Aug 2002 B1
6457642 Good et al. Oct 2002 B1
6507406 Yagi et al. Jan 2003 B1
6517004 Good et al. Feb 2003 B2
6519550 D'Hooge et al. Feb 2003 B1
6674904 McQueen Jan 2004 B1
6705526 Zhu et al. Mar 2004 B1
6781621 Gobush et al. Aug 2004 B1
6824058 Patel et al. Nov 2004 B2
6832725 Gardiner et al. Dec 2004 B2
6858857 Pease et al. Feb 2005 B2
6922632 Foxlin Jul 2005 B2
6971580 Zhu et al. Dec 2005 B2
6995762 Pavlidis et al. Feb 2006 B1
7057632 Yamawaki et al. Jun 2006 B2
7085409 Sawhney et al. Aug 2006 B2
7086162 Tyroler Aug 2006 B2
7104453 Zhu et al. Sep 2006 B1
7128266 Zhu et al. Oct 2006 B2
7137556 Bonner et al. Nov 2006 B1
7159783 Walczyk et al. Jan 2007 B2
7161688 Bonner et al. Jan 2007 B1
7205529 Andersen et al. Apr 2007 B2
7214954 Schopp May 2007 B2
7277187 Smith et al. Oct 2007 B2
7307653 Dutta Dec 2007 B2
7310431 Gokturk et al. Dec 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7509529 Colucci et al. Mar 2009 B2
7527205 Zhu May 2009 B2
7586049 Wurz Sep 2009 B2
7602404 Reinhardt et al. Oct 2009 B1
7639722 Paxton et al. Dec 2009 B1
7726575 Wang et al. Jun 2010 B2
7780084 Zhang et al. Aug 2010 B2
7788883 Buckley et al. Sep 2010 B2
7974025 Topliss Jul 2011 B2
8027096 Feng et al. Sep 2011 B2
8028501 Buckley et al. Oct 2011 B2
8050461 Shpunt et al. Nov 2011 B2
8055061 Katano Nov 2011 B2
8072581 Breiholz Dec 2011 B1
8102395 Kondo et al. Jan 2012 B2
8132728 Dwinell et al. Mar 2012 B2
8134717 Pangrazio et al. Mar 2012 B2
8149224 Kuo et al. Apr 2012 B1
8194097 Xiao et al. Jun 2012 B2
8201737 Palacios Durazo et al. Jun 2012 B1
8212889 Chanas et al. Jul 2012 B2
8228510 Pangrazio et al. Jul 2012 B2
8230367 Bell et al. Jul 2012 B2
8294969 Plesko Oct 2012 B2
8305458 Hara Nov 2012 B2
8310656 Zalewski Nov 2012 B2
8313380 Zalewski et al. Nov 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8339462 Stec et al. Dec 2012 B2
8350959 Topliss et al. Jan 2013 B2
8351670 Ijiri et al. Jan 2013 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381976 Mohideen et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8437539 Komatsu et al. May 2013 B2
8441749 Brown et al. May 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8463079 Ackley et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8570343 Halstead Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8576390 Nunnink Nov 2013 B1
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8594425 Gurman et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8792688 Unsworth Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8810779 Hilde Aug 2014 B1
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8897596 Passmore et al. Nov 2014 B1
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8928896 Kennington et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9014441 Truyen et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber et al. Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9082023 Feng et al. Jul 2015 B2
9082195 Holeva et al. Jul 2015 B2
9142035 Rotman Sep 2015 B1
9171278 Kong et al. Oct 2015 B1
9233470 Bradski et al. Jan 2016 B1
9235899 Kirmani et al. Jan 2016 B1
9299013 Curlander et al. Mar 2016 B1
9424749 Reed et al. Aug 2016 B1
9486921 Straszheim et al. Nov 2016 B1
20010027995 Patel et al. Oct 2001 A1
20010032879 He et al. Oct 2001 A1
20020054289 Thibault et al. May 2002 A1
20020067855 Chiu et al. Jun 2002 A1
20020109835 Goetz Aug 2002 A1
20020118874 Chung et al. Aug 2002 A1
20020158873 Williamson Oct 2002 A1
20020167677 Okada et al. Nov 2002 A1
20020179708 Zhu et al. Dec 2002 A1
20020196534 Lizotte et al. Dec 2002 A1
20030038179 Tsikos et al. Feb 2003 A1
20030053513 Vatan et al. Mar 2003 A1
20030063086 Baumberg Apr 2003 A1
20030078755 Leutz et al. Apr 2003 A1
20030091227 Chang et al. May 2003 A1
20030156756 Gokturk et al. Aug 2003 A1
20030197138 Pease et al. Oct 2003 A1
20030225712 Cooper Dec 2003 A1
20030235331 Kawaike et al. Dec 2003 A1
20040008259 Gokturk et al. Jan 2004 A1
20040019274 Galloway et al. Jan 2004 A1
20040024754 Mane et al. Feb 2004 A1
20040066329 Zeitfuss et al. Apr 2004 A1
20040073359 Ichijo et al. Apr 2004 A1
20040083025 Yamanouchi et al. Apr 2004 A1
20040089482 Ramsden May 2004 A1
20040098146 Katae et al. May 2004 A1
20040105580 Hager et al. Jun 2004 A1
20040118928 Patel et al. Jun 2004 A1
20040122779 Stickler et al. Jun 2004 A1
20040155975 Hart et al. Aug 2004 A1
20040165090 Ning Aug 2004 A1
20040184041 Schopp Sep 2004 A1
20040211836 Patel et al. Oct 2004 A1
20040214623 Takahashi et al. Oct 2004 A1
20040233461 Armstrong et al. Nov 2004 A1
20040258353 Gluckstad et al. Dec 2004 A1
20050006477 Patel Jan 2005 A1
20050117215 Lange Jun 2005 A1
20050128196 Popescu et al. Jun 2005 A1
20050168488 Montague Aug 2005 A1
20050211782 Martin Sep 2005 A1
20050264867 Cho et al. Dec 2005 A1
20060047704 Gopalakrishnan Mar 2006 A1
20060078226 Zhou Apr 2006 A1
20060108266 Bowers May 2006 A1
20060112023 Horhann May 2006 A1
20060151604 Zhu et al. Jul 2006 A1
20060159307 Anderson et al. Jul 2006 A1
20060159344 Shao et al. Jul 2006 A1
20060213999 Wang et al. Sep 2006 A1
20060232681 Okada Oct 2006 A1
20060255150 Longacre Nov 2006 A1
20060269165 Viswanathan Nov 2006 A1
20060291719 Ikeda et al. Dec 2006 A1
20070003154 Sun et al. Jan 2007 A1
20070025612 Iwasaki et al. Feb 2007 A1
20070031064 Zhao et al. Feb 2007 A1
20070063048 Havens et al. Mar 2007 A1
20070116357 Dewaele May 2007 A1
20070127022 Cohen et al. Jun 2007 A1
20070143082 Degnan Jun 2007 A1
20070153293 Gruhlke et al. Jul 2007 A1
20070171220 Kriveshko Jul 2007 A1
20070177011 Lewin et al. Aug 2007 A1
20070181685 Zhu et al. Aug 2007 A1
20070237356 Dwinell et al. Oct 2007 A1
20070291031 Konev et al. Dec 2007 A1
20070299338 Stevick et al. Dec 2007 A1
20080013793 Hillis et al. Jan 2008 A1
20080035390 Wurz Feb 2008 A1
20080056536 Hildreth et al. Mar 2008 A1
20080062164 Bassi et al. Mar 2008 A1
20080077265 Boyden Mar 2008 A1
20080164074 Wurz Jul 2008 A1
20080204476 Montague Aug 2008 A1
20080212168 Olmstead et al. Sep 2008 A1
20080247635 Davis et al. Oct 2008 A1
20080273191 Kim et al. Nov 2008 A1
20080273210 Hilde Nov 2008 A1
20080278790 Boesser et al. Nov 2008 A1
20090059004 Bochicchio Mar 2009 A1
20090081008 Somin et al. Mar 2009 A1
20090095047 Patel et al. Apr 2009 A1
20090134221 Zhu et al. May 2009 A1
20090195790 Zhu et al. Aug 2009 A1
20090225333 Bendall et al. Sep 2009 A1
20090237411 Gossweiler et al. Sep 2009 A1
20090268023 Hsieh Oct 2009 A1
20090272724 Gubler Nov 2009 A1
20090273770 Bauhahn et al. Nov 2009 A1
20090313948 Buckley et al. Dec 2009 A1
20090318815 Barnes et al. Dec 2009 A1
20090323084 Dunn et al. Dec 2009 A1
20090323121 Valkenburg Dec 2009 A1
20100035637 Varanasi et al. Feb 2010 A1
20100060604 Zwart et al. Mar 2010 A1
20100091104 Sprigle Apr 2010 A1
20100118200 Gelman et al. May 2010 A1
20100128109 Banks May 2010 A1
20100161170 Siris Jun 2010 A1
20100171740 Andersen et al. Jul 2010 A1
20100172567 Prokoski Jul 2010 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100202702 Benos et al. Aug 2010 A1
20100208039 Stettner Aug 2010 A1
20100211355 Horst et al. Aug 2010 A1
20100217678 Goncalves Aug 2010 A1
20100220849 Colbert et al. Sep 2010 A1
20100220894 Ackley et al. Sep 2010 A1
20100223276 Al-Shameri et al. Sep 2010 A1
20100245850 Lee et al. Sep 2010 A1
20100254611 Arnz Oct 2010 A1
20100303336 Abraham Dec 2010 A1
20100315413 Izadi et al. Dec 2010 A1
20100321482 Cleveland Dec 2010 A1
20110019155 Daniel et al. Jan 2011 A1
20110040192 Brenner et al. Feb 2011 A1
20110043609 Choi et al. Feb 2011 A1
20110099474 Grossman et al. Apr 2011 A1
20110169999 Grunow et al. Jul 2011 A1
20110188054 Petronius et al. Aug 2011 A1
20110188741 Sones et al. Aug 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20110234389 Mellin Sep 2011 A1
20110235854 Berger et al. Sep 2011 A1
20110249864 Venkatesan et al. Oct 2011 A1
20110254840 Halstead Oct 2011 A1
20110279916 Brown et al. Nov 2011 A1
20110286007 Pangrazio et al. Nov 2011 A1
20110286628 Goncalves et al. Nov 2011 A1
20110288818 Thierman Nov 2011 A1
20110301994 Tieman Dec 2011 A1
20110303748 Lemma et al. Dec 2011 A1
20110310227 Konertz et al. Dec 2011 A1
20120024952 Chen Feb 2012 A1
20120056982 Katz et al. Mar 2012 A1
20120057345 Kuchibhotla Mar 2012 A1
20120067955 Rowe Mar 2012 A1
20120074227 Ferren et al. Mar 2012 A1
20120081714 Pangrazio et al. Apr 2012 A1
20120111946 Golant May 2012 A1
20120113223 Hilliges et al. May 2012 A1
20120113250 Farlotti et al. May 2012 A1
20120126000 Kunzig et al. May 2012 A1
20120140300 Freeman Jun 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120179665 Baarman et al. Jul 2012 A1
20120185094 Rosenstein et al. Jul 2012 A1
20120190386 Anderson Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120197464 Wang et al. Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120218436 Rodriguez et al. Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20120224026 Bayer et al. Sep 2012 A1
20120236288 Stanley Sep 2012 A1
20120242852 Hayward et al. Sep 2012 A1
20120256901 Bendall Oct 2012 A1
20120261474 Kawashime et al. Oct 2012 A1
20120262558 Boger et al. Oct 2012 A1
20120280908 Rhoads et al. Nov 2012 A1
20120282905 Owen Nov 2012 A1
20120282911 Davis et al. Nov 2012 A1
20120284012 Rodriguez et al. Nov 2012 A1
20120284122 Brandis Nov 2012 A1
20120284339 Rodriguez Nov 2012 A1
20120284593 Rodriguez Nov 2012 A1
20120293610 Doepke et al. Nov 2012 A1
20120294549 Doepke Nov 2012 A1
20120299961 Ramkumar et al. Nov 2012 A1
20120300991 Free Nov 2012 A1
20120313848 Galor et al. Dec 2012 A1
20120314030 Datta Dec 2012 A1
20120314058 Bendall et al. Dec 2012 A1
20120316820 Nakazato et al. Dec 2012 A1
20130019278 Sun et al. Jan 2013 A1
20130038881 Pesach et al. Feb 2013 A1
20130038941 Pesach et al. Feb 2013 A1
20130043312 Van Horn Feb 2013 A1
20130050426 Sarmast et al. Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130094069 Lee et al. Apr 2013 A1
20130101158 Lloyd et al. Apr 2013 A1
20130156267 Muraoka et al. Jun 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130200150 Reynolds et al. Aug 2013 A1
20130201288 Billerbaeck et al. Aug 2013 A1
20130208164 Cazier et al. Aug 2013 A1
20130211790 Loveland et al. Aug 2013 A1
20130223673 Davis et al. Aug 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130291998 Konnerth Nov 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308013 Li et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130329012 Bartos Dec 2013 A1
20130329013 Metois et al. Dec 2013 A1
20130342342 Sabre et al. Dec 2013 A1
20130342343 Harring et al. Dec 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140009586 McNamer et al. Jan 2014 A1
20140021259 Moed et al. Jan 2014 A1
Related Publications (1)
Number Date Country
20170010141 A1 Jan 2017 US