This disclosure generally relates to volume dimensioning systems, and particularly to systems and methods useful in promoting compliance with governmental or industry standard calibration guidelines.
Volume dimensioning systems are useful for providing dimensional and/or volumetric information related to three-dimensional objects. The objects may, for example, take the form of parcels or packages intended for transit via a carrier (e.g., courier), or other items intended for transit. Dimensional and/or volumetric information is useful, for example, in providing users with accurate shipping rates based on the actual size and/or volume of the object being shipped. Dimensional and/or volumetric information may be used by the carrier in selecting and scheduling appropriately sized vehicles and/or delivery routes. The ready availability of dimensional and/or volumetric information for all objects within a carrier's network assists the carrier in ensuring optimal use of available space in the many different vehicles and containers used in local, interstate, and international shipping.
Such may be of particular significance in today's economy, where many businesses rely on “just in time” manufacturing. Typically, every supplier in the supply chain must be able to ship necessary components or resources on demand or with very little lead time. Thus, efficient handling of cargo is required. It does a supplier no good to have the desired goods on hand if the supplier cannot readily ship them.
Automating volume dimensioning can speed parcel intake, improve the overall level of billing accuracy, and increase the efficiency of cargo handling. Unfortunately, parcels are not confined to a standard size or shape, and may, in fact, have virtually any size or shape. Additionally, parcels may have specialized shipping and/or handling instructions (e.g., fragile, this side up) that must be followed to protect the objects during shipping or handling.
Volume dimensioning devices are used throughout the package delivery and carriage industry to provide a rapid way of measuring the overall dimensions of an object and, in some instances, to provide shipping rates for the object based on one or more classes of service. Historically, shipping rates were principally a function of an object's weight—heavier objects were assigned higher shipping costs than lighter objects. However, such a costing system failed to appreciate that the volume occupied by an object also impacted shipping costs, since vehicles were limited not just in gross vehicle weight, but in internal volume as well. As a consequence, shippers began establishing shipping rates using both volume and weight as factors considered in determining the ultimate shipping rate charged to a customer.
The concept of volume dimensioning factors the shipping volume of an object into the overall shipping cost of the object. Thus, objects having a relatively light weight but a relatively large physical volume may have a shipping cost that exceeds the shipping cost of a physically smaller, but heavier, object. The use of volume in determining shipping costs increased the labor associated with package intake, since objects could no longer simply be weighed and a cost assigned. Instead, to accurately obtain a volume dimension, multiple dimensional measurements were taken and used to determine the volume of the object. Once the volume was determined, a shipping cost was assigned based on the measured volume and/or weight of the object. Thus, the shipping cost charged to a customer is a function of the weight of an object, the volume occupied by the object, or both the weight of and the volume occupied by the object. Automated volume dimensioning systems have replaced the laborious and error-prone derivation of an object's volume by manually obtaining multiple linear dimensions (e.g., the length, width, height, girth, etc.) of an object. The accuracy of a quoted shipping rate is thus dependent upon the accuracy with which an object can be dimensioned using a volume dimensioning system.
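By way of a non-limiting numerical illustration (the dim factor, rate, and function names below are hypothetical and form no part of this disclosure), a billable weight combining actual weight and volume might determine a shipping cost as follows:

```python
# Illustrative sketch only: the dim factor and rate below are hypothetical
# values chosen for demonstration, not values taken from this disclosure.

DIM_FACTOR_CM3_PER_KG = 5000.0   # divisor converting volume to "dimensional weight"
RATE_PER_KG = 4.25               # hypothetical cost per kilogram of billable weight

def billable_weight_kg(weight_kg: float, length_cm: float,
                       width_cm: float, height_cm: float) -> float:
    """Billable weight is the greater of actual and dimensional weight."""
    dimensional_weight = (length_cm * width_cm * height_cm) / DIM_FACTOR_CM3_PER_KG
    return max(weight_kg, dimensional_weight)

def shipping_cost(weight_kg, length_cm, width_cm, height_cm):
    return RATE_PER_KG * billable_weight_kg(weight_kg, length_cm, width_cm, height_cm)

# A light but bulky parcel can cost more than a heavier, compact one:
print(shipping_cost(1.0, 60, 50, 40))   # dimensional weight (24 kg) governs: 102.0
print(shipping_cost(5.0, 20, 15, 10))   # actual weight (5 kg) governs: 21.25
```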
There exists a need for new dimensioning systems that may accurately perform volume dimensioning of objects including parcels and packages as well as other objects.
The Applicants have developed systems and methods useful for adjusting the reported or displayed dimensional measurement accuracy and, consequently, the reported or displayed shipping or cartage rate obtained using dimensional or volumetric data supplied by the volume dimensioning system. The systems and methods described herein take into consideration the level of distortion (e.g., dimensional distortion, optical distortion, etc.) present in the image data provided by such automated volume dimensioning systems. In some instances, the system adjusts a dimensional accuracy of a representation of volume dimensioning information (e.g., dimensions, cost) based on the measured distortion present in the volume dimensioning system. Such may ensure that the dimensional and shipping cost data generated by the system is determined using the finest units of accuracy achievable given the current system operational parameters, to reliably provide the most accurate shipping or cartage costs. Such systems and methods can be used to promote or facilitate volume dimensioning system compliance with corporate, industry, or regulatory standards, best practices, or guidelines, for example National Institute of Standards and Technology (NIST) Handbook 44-2012, Chapter 5.58, “Multiple Dimension Measuring Devices.”
The systems and methods disclosed herein also facilitate the ongoing, operationally transparent, calibration of volume dimensioning systems. Such ongoing calibrations provide system users and consumers with a degree of confidence in the dimensional and shipping cost data provided by the volume dimensioning system and also provide an early indication that the system calibration can no longer be brought into compliance with corporate, industry, or regulatory standards, best practices, or guidelines.
A volume dimensioning system may be summarized as including at least one image sensor that provides image data representative of a number of images of a field of view of the at least one image sensor; and a control subsystem communicatively coupled to the at least one image sensor to receive the image data therefrom, the control subsystem including at least one non-transitory storage medium and at least one processor, the at least one non-transitory storage medium which stores at least one of information or processor executable instructions; and the at least one processor which: determines at least one distortion value indicative of an amount of distortion in the images based at least in part on at least a portion of a calibration pattern which appears in the field of view of the at least one image sensor in at least a portion of some of the images, the calibration pattern having a set of defined characteristics; assesses the at least one distortion value relative to a number of distortion threshold values; and adjusts a unit of accuracy in a representation of volume dimensioning related information based at least in part on the assessment of the at least one distortion value relative to the distortion threshold values.
The at least one processor may determine the at least one distortion value as at least one set of optical distortion values and at least one set of dimensional distortion values, the set of optical distortion values representative of an optical contribution to distortion in the image data and the set of dimensional distortion values representative of a dimensional contribution to distortion in the image data. The at least one processor may assess the at least one distortion value relative to a recalibration threshold value that represents distortion correctable via a self recalibration by the volume dimensioning system. The at least one processor may assess the at least one distortion value relative to a service required threshold value that represents distortion that can only be corrected via a servicing of the volume dimensioning system by a service technician. The at least one processor may adjust the unit of accuracy in the representation of volume dimensioning related information in response to an assessment that the at least one distortion value exceeds the recalibration threshold value and is below the service required threshold value. Responsive to the determination that the at least one distortion value is less than the recalibration threshold value, the at least one processor may recalibrate the volume dimensioning system to a fine unit of accuracy; and wherein responsive to the determination that the at least one distortion value exceeds the recalibration threshold value and is below the service required threshold value, the at least one processor may recalibrate the volume dimensioning system to a coarse unit of accuracy. The processor may further produce an alert in response to an assessment that the at least one distortion value exceeds the service required threshold value. The processor may further determine at least one of a set of calculated optical distortion correction factors or a set of calculated dimensional correction factors in response to an assessment that the at least one distortion value is within the recalibration threshold value and wherein the processor may further apply at least one of the set of calculated optical distortion correction factors or the set of calculated dimensional correction factors to the image data in determining the volume dimensioning related information. The processor may adjust a decimal place represented to adjust the unit of accuracy in the representation of volume dimensioning related information. The processor may adjust a dimensional unit of measurement represented to adjust the unit of accuracy in the representation of volume dimensioning related information. The processor may adjust a unit of currency represented to adjust the unit of accuracy in the representation of volume dimensioning related information. The volume dimensioning system may further include an illumination subsystem communicably coupled to the control subsystem, the illumination subsystem to at least partially illuminate the calibration pattern. The volume dimensioning system may further include a support structure to receive at least the at least one image sensor such that when the at least one image sensor is received by the support structure at least a portion of the pattern is within a field of view of the at least one image sensor. The system may be fixed or hand held. 
The at least one distortion value may be associated with at least one of data indicative of a date or data indicative of a time and wherein the at least one distortion value and the respective associated data indicative of a date or data indicative of a time may be stored in the non-transitory storage medium.
A volume dimensioning method may be summarized as including receiving by at least one dimensioning system processor image data representative of a number of images in a field of view of at least one image sensor; determining by the at least one dimensioning system processor at least one distortion value indicative of an amount of distortion in the images based at least in part on at least a portion of a calibration pattern which appears in the field of view of the at least one image sensor in at least some of the images, the calibration pattern having a set of defined characteristics; assessing by the at least one dimensioning system processor the at least one distortion value relative to a number of distortion threshold values stored in a non-transitory storage medium communicably coupled to the at least one dimensioning system processor; and adjusting by the at least one dimensioning system processor a unit of accuracy in a representation of volume dimensioning related information based at least in part on the assessment of the at least one distortion value relative to the distortion threshold values.
Assessing by the at least one dimensioning system processor the at least one distortion value relative to a number of distortion threshold values may include determining the at least one distortion value as at least one set of optical distortion values and at least one set of dimensional distortion values; wherein the set of optical distortion values represents an optical contribution to distortion in the image data; and wherein the set of dimensional distortion values represent a dimensional contribution to distortion in the image data. Assessing by the at least one dimensioning system processor the at least one distortion value relative to a number of distortion threshold values may include assessing the at least one distortion value relative to a recalibration threshold value representing distortion correctable via a recalibration of the volume dimensioning system. Assessing by the at least one dimensioning system processor the at least one distortion value relative to a number of distortion threshold values may include assessing the at least one distortion value relative to a service required threshold value representing distortion not correctable via recalibration of the volume dimensioning system. Assessing by the at least one dimensioning system processor the at least one distortion value relative to a number of distortion threshold values may include assessing the at least one distortion value to fall between the recalibration threshold value and the service required threshold value, representing distortion correctable via a recalibration of the volume dimensioning system. Adjusting a unit of accuracy in a representation of volume dimensioning related information based at least in part on the assessment of the at least one distortion value relative to the distortion threshold values may include recalibrating the volume dimensioning system to a fine unit of accuracy responsive to an assessment that the at least one distortion value relative to the recalibration threshold value indicates a distortion correctable via recalibration; recalibrating the volume dimensioning system to a coarse unit of accuracy responsive to an assessment that the at least one distortion value falls between the recalibration threshold value and the service required threshold value; and generating an alert responsive to an assessment that the at least one distortion value relative to the service required threshold value indicates a distortion not correctable via recalibration. The volume dimensioning method may further include, responsive to the determination that the at least one distortion value is within the recalibration threshold value, determining by the at least one dimensioning system processor at least one of a set of calculated optical distortion correction factors or a set of calculated dimensional correction factors; and applying at least one of the set of calculated optical distortion correction factors or the set of calculated dimensional correction factors to the image data in determining the volume dimensioning related information.
A volume dimensioning controller may be summarized as including at least one input to receive image data representative of a number of images of a field of view of at least one image sensor; at least one non-transitory storage medium; and at least one processor communicably coupled to the at least one input and to the at least one non-transitory storage medium, the at least one processor to: determine at least one distortion value indicative of an amount of distortion in the images based at least in part on at least a portion of a calibration pattern which appears in the field of view of the at least one image sensor in at least some of the images, the calibration pattern having a set of defined characteristics; assess the at least one distortion value relative to a number of distortion threshold values stored in the non-transitory storage medium; and adjust a unit of accuracy in a representation of volume dimensioning related information based at least in part on the assessment of the at least one distortion value relative to the distortion threshold values.
The at least one processor may determine the at least one distortion value as at least one set of optical distortion values and at least one set of dimensional distortion values, the set of optical distortion values representative of an optical contribution to distortion in the image data and the set of dimensional distortion values representative of a dimensional contribution to distortion in the image data. The at least one processor may assess the at least one distortion value relative to a recalibration threshold value that represents distortion correctable via a self recalibration by the volume dimensioning system. The at least one processor may assess the at least one distortion value relative to a service required threshold value that represents distortion that can only be corrected via a servicing of the volume dimensioning system by a service technician. The at least one processor may adjust the unit of accuracy in the representation of volume dimensioning related information in response to an assessment that the at least one distortion value exceeds the recalibration threshold value and is below the service required threshold value. Responsive to the determination that the at least one distortion value is less than the recalibration threshold value, the at least one processor may recalibrate the volume dimensioning system to a fine unit of accuracy; and wherein responsive to the determination that the at least one distortion value exceeds the recalibration threshold value and is below the service required threshold value, the at least one processor may recalibrate the volume dimensioning system to a coarse unit of accuracy.
In the drawings, identical reference numbers identify similar elements or acts. The sizes and relative positions of elements in the drawings are not necessarily drawn to scale. For example, the shapes of various elements and angles are not drawn to scale, and some of these elements are arbitrarily enlarged and positioned to improve drawing legibility. Further, the particular shapes of the elements as drawn are not intended to convey any information regarding the actual shape of the particular elements, and have been selected solely for ease of recognition in the drawings.
In the following description, certain specific details are set forth in order to provide a thorough understanding of various disclosed embodiments. However, one skilled in the relevant art will recognize that embodiments may be practiced without one or more of these specific details, or with other methods, components, materials, etc. In other instances, well-known structures associated with volume dimensioning systems, correction of optical and dimensional distortion in single and compound lens devices, wired, wireless and optical communications systems, and/or automatic data collection (ADC) readers have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments.
Unless the context requires otherwise, throughout the specification and claims which follow, the word “comprise” and variations thereof, such as “comprises” and “comprising,” are to be construed in an open, inclusive sense, that is, as “including, but not limited to.”
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
As used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the content clearly dictates otherwise. It should also be noted that the term “or” is generally employed in its broadest sense, that is, as meaning “and/or” unless the content clearly dictates otherwise.
The headings and Abstract of the Disclosure provided herein are for convenience only and do not interpret the scope or meaning of the embodiments.
The volume dimensioning system 100 includes a camera subsystem 102 and a control subsystem 104. The volume dimensioning system 100 optionally includes one or more of: a user interface (UI) subsystem 106, a communications subsystem 108, and/or an automatic data collection (ADC) subsystem 110.
The various subsystems 102-110 may be communicatively coupled by one or more couplers (e.g., electrically conductive paths, wires, optical fibers), for example via one or more buses 112 (only one shown) and/or control lines 114 (only two shown). The buses 112, or other couplers, may include power buses or lines, data buses, instruction buses, address buses, etc., which allow operation of the various subsystems 102-110 and interaction or intercommunication therebetween. The various subsystems 102-110 are each discussed in turn, below. While various individual components are generally easily categorizable into one or another of the subsystems, some components may optionally be implemented in one, two, or more of the subsystems 102-110. Thus, some components may be illustrated as part of one subsystem even though such components may alternatively or additionally form part of another.
The camera subsystem 102 includes an optional illumination subsystem 116 to provide or emit electromagnetic illumination outward from the volume dimensioning system 100 into an environment containing a target object (not shown), and a sensor subsystem 118 to capture image data representative of the environment within a field of view of the volume dimensioning system 100.
The illumination subsystem 116 includes an illumination device 120. The illumination device 120 may take the form of an array of individually addressable or controllable elements, and may take any of a variety of forms capable of producing electromagnetic energy having a spectral content useful for image collection by the sensor subsystem 118. The illumination subsystem 116 will typically include an illumination driver 122 which is coupled to control the individually addressable or controllable elements of the illumination device 120. Alternatively, the illumination device 120 may be controlled directly by the control subsystem 104 without the use of a dedicated illumination driver 122.
In particular, the illumination device 120 is controlled to produce or emit modulated electromagnetic energy in a number of wavelengths or ranges of wavelengths. For instance, illumination may include electromagnetic energy of wavelengths in an optical range or portion of the electromagnetic spectrum including wavelengths in a human-visible range or portion (e.g., approximately 390 nm-750 nm) and/or wavelengths in the near-infrared (NIR) (e.g., approximately 750 nm-1400 nm) or infrared (e.g., approximately 750 nm-1 mm) portions and/or the near-ultraviolet (NUV) (e.g., approximately 400 nm-300 nm) or ultraviolet (e.g., approximately 400 nm-122 nm) portions of the electromagnetic spectrum. The particular wavelengths are exemplary and not meant to be limiting. Other wavelengths of electromagnetic energy may be employed.
The sensor subsystem 118 includes an image transducer or image sensor 124, typically a two-dimensional array of photo-sensitive or photo-responsive elements, for instance a two-dimensional array of photodiodes or a two-dimensional array of charge coupled devices (CCDs). The sensor subsystem 118 may optionally include a buffer 125 communicatively coupled to the image sensor 124 to receive or otherwise acquire image data measured, captured, or otherwise sensed or acquired by the image sensor 124. The buffer 125 may comprise a non-transitory storage medium capable of temporarily storing image data until the image data is further processed by the volume dimensioning system 100. In at least some instances, the sensor subsystem 118 can include one or more sensors, systems, or devices for reading or scanning one or more optical machine-readable symbols or radio frequency machine-readable devices such as radio frequency identification (RFID) tags. Some possibly suitable systems are described in U.S. patent application Ser. No. 12/638,616, filed Dec. 15, 2009 and published as U.S. patent application publication no. US 2010-0220894, which is incorporated by reference herein in its entirety to the extent the subject matter therein does not contradict or conflict with the subject matter of the instant application.
The sensor subsystem 118 may further include one or more distance determination sensors (not shown), for example to determine a distance between the volume dimensioning system 100 and a target object within the field of view.
The control subsystem 104 includes one or more processors 126, for example one or more microprocessors (one shown) 126a, digital signal processors (DSP—one shown) 126b, application specific integrated circuits (ASIC), programmable gate arrays (PGA), programmable logic controllers (PLC), or the like. While the DSP 126b may be considered or provided or packaged as part of the control subsystem 104, the DSP 126b may in some applications be considered or provided or packaged as part of the camera subsystem 102.
The control subsystem 104 includes at least one non-transitory storage media 130. For example, the control subsystem 104 may include nonvolatile memory, for instance read only memory (ROM) or NAND Flash memory 130a. Additionally or alternatively, all or a portion of the at least one non-transitory storage media 130 may include volatile memory, for instance dynamic random access memory (DRAM) 130b. The at least one non-transitory storage media 130 may store one or more computer- or processor-executable instructions or data useful in causing the microprocessor, DSP, or other microcontroller to perform dimensional functions, volumetric functions, volume dimensioning functions, shipping cost calculation functions, or combinations thereof, for example by executing the various methods described herein.
In some instances the at least one non-transitory storage media 130 may store or otherwise retain a number of distortion values indicative of the quantitative or qualitative degree of distortion present in the image data provided by the volume dimensioning system 100. Such distortion may be present as an optical distortion, a dimensional distortion, or any other type of distortion including chromatic distortion that causes deviations between the image data and the scene within the field of view of the sensor subsystem 118. In yet other instances, the at least one non-transitory storage media 130 may store or otherwise retain a plurality of historical distortion values, such as a plurality of optical or dimensional distortion values that permit the historical trending of the optical or dimensional distortion values. Such historical data can also play a helpful role in demonstrating an ongoing compliance with one or more corporate, industry, or regulatory guidelines, best practices, or standards. In at least some instances, the at least one non-transitory storage media 130 can store or otherwise retain one or more sets of distortion correction factors useful in reducing or eliminating one or more forms of distortion present in the image data provided by the volume dimensioning system 100.
The optional UI subsystem 106 may include one or more user interface components which provide information to a user or allow a user to input information or control operation of the volume dimensioning system 100.
For example, the UI subsystem 106 may include a display 132 to visually provide information or control elements to the user. The display 132 may, for example, take the form of a liquid crystal display (LCD) panel. The display 132 may, for example, take the form of a touch sensitive display, allowing the display of user selectable icons (e.g., virtual keypad or keyboard, graphical user interface or GUI elements) in addition to the display of information. The display 132 may be coupled to the control subsystem 104 via a display driver 134 or similar component. The display driver 134 may control the presentation of information and icons on the display 132. The display driver 134 may additionally process signals indicative of user inputs made via the display 132.
The UI subsystem 106 may optionally include a physical keypad or keyboard 136, which allows a user to enter data and instructions or commands. The physical keypad or keyboard 136 may be integral to a housing (not shown) of the volume dimensioning system 100. Alternatively, the optional physical keypad or keyboard 136 may be separate from the housing, communicatively coupled thereto via a wireless connection or wired connection for instance a Universal Serial Bus (USB®) interface.
The UI subsystem 106 may optionally include a speaker 138 to provide audible information, cues and/or alerts to a user. The UI subsystem 106 may optionally include a microphone 140 to receive spoken information, instructions or commands from a user.
The communications subsystem 108 may include one or more wireless communications components and/or one or more wired communications components to allow communications with devices external from the volume dimensioning system 100.
For example the communications subsystem 108 may include one or more radios (e.g., transmitters, receivers, transceivers) 142 and associated antenna(s) 144. The radio(s) 142 may take any of a large variety of forms using any of a large variety of communications protocols, for instance IEEE 802.11, including WI-FI®, BLUETOOTH®, various cellular protocols for instance CDMA, TDMA, EDGE®, 3G, 4G, GSM.
Also for example, the communications subsystem 108 may include one or more communications ports 146. The communications ports 146 may take any of a large variety of forms, for example wired communications ports for instance ETHERNET® ports, USB® ports, FIREWIRE® ports, THUNDERBOLT® ports, etc. The communications ports 146 may even take the form of wireless ports, for instance an optical or radio frequency transceiver.
The ADC subsystem 110 may include one or more ADC readers to perform automatic data collection activities, for instance with respect to a target object.
For example, the ADC subsystem 110 may include a radio frequency identification (RFID) reader or interrogator 148 and associated antenna 150 to wirelessly read from and/or write to wireless transponders (e.g., RFID tags or transponders) (not shown). Any of a large variety of RFID readers or interrogators 148 may be employed, including fixed or stationary RFID readers or portable or handheld RFID readers. RFID reader(s) 148 may be used to read information from a transponder physically or at least proximally associated with a target object (not shown).
Also for example, the ADC subsystem 110 may include a machine-readable symbol reader 152 to wirelessly read machine-readable symbols (e.g., one-dimensional or barcode symbols, two-dimensional or matrix code symbols) (not shown). Any of a large variety of machine-readable symbol readers 152 may be employed. For example, scanner-based machine-readable symbol readers 152 may be employed, such as those that scan a point of light (e.g., laser) across a symbol, detect light returned from the symbol, and decode information encoded in the symbol. Also for example, imager-based machine-readable symbol readers 152 may be employed, such as those that employ flood illumination (e.g., LEDs) of a symbol, detect or capture an image of the symbol, and decode information encoded in the symbol. The machine-readable symbol reader(s) 152 may include fixed or stationary machine-readable symbol readers or portable or handheld machine-readable symbol readers. The machine-readable symbol reader(s) 152 may be used to read information from a machine-readable symbol physically or at least proximally associated with a target object. Such information may, for instance, include recipient information including an address and/or telephone number, sender information including an address and/or telephone number, or specific handling instructions (e.g., fragile, keep a given side up, temperature range, security information).
While not illustrated, the volume dimensioning system 100 may include a self-contained, discrete source of power, for example one or more chemical battery cells, ultracapacitor cells, and/or fuel cells. While also not illustrated, the volume dimensioning system 100 may include a recharging circuit, for example to recharge secondary chemical battery cells. Alternatively or additionally, the volume dimensioning system 100 may be wired to an external power source, such as mains, residential, or commercial power.
The dimensions of the reference pattern 202 are defined, fixed, and known by the volume dimensioning system 100 prior to imaging. This allows the volume dimensioning system 100 to analyze pattern image data 212, 222 containing at least a portion of the reference pattern 202 to assess or quantify the distortion present in the image and to generate at least one distortion value indicative of the distortion. The distortion may be global throughout the image or may be localized with different portions of the image exhibiting different types or amounts of distortion.
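By way of a non-limiting sketch (the use of OpenCV and the choice of RMS reprojection error as the distortion value are assumptions of this illustration, not requirements of the disclosure), a scalar distortion value might be derived from an imaged checkerboard pattern of known geometry as follows:

```python
# Sketch: derive a scalar distortion value from an imaged checkerboard of
# known geometry. OpenCV and RMS reprojection error are illustrative choices.
import cv2
import numpy as np

PATTERN_SIZE = (9, 6)    # interior corners of the reference pattern (assumed)
SQUARE_SIZE_CM = 2.5     # defined, fixed square size of the pattern (assumed)

def distortion_value(gray_image: np.ndarray) -> float | None:
    """Return a scalar distortion value, or None if the pattern is not found."""
    found, corners = cv2.findChessboardCorners(gray_image, PATTERN_SIZE)
    if not found:
        return None
    # Ideal corner locations, known a priori from the defined pattern geometry.
    obj_pts = np.zeros((PATTERN_SIZE[0] * PATTERN_SIZE[1], 3), np.float32)
    obj_pts[:, :2] = (np.mgrid[0:PATTERN_SIZE[0], 0:PATTERN_SIZE[1]]
                      .T.reshape(-1, 2) * SQUARE_SIZE_CM)
    # Fit a camera model; the RMS reprojection error quantifies how far the
    # imaged pattern deviates from its known geometry.
    rms, _, _, _, _ = cv2.calibrateCamera(
        [obj_pts], [corners], gray_image.shape[::-1], None, None)
    return rms
```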
Although shown in two different figures for clarity and ease of discussion, optical and dimensional distortion, along with other forms of distortion such as color or chromatic aberration, may appear in combination in the image data produced by a volume dimensioning system 100. Such combinations further complicate the accurate determination of dimensional or volumetric information therefrom. Uncorrected, such optical and dimensional distortion in the image can cause the calculation or propagation of erroneous dimensional information and, consequently, erroneous volumetric information and volume-based shipping cost information.
Optical distortion may be present in the image data received from the sensor subsystem 118 in many forms. Typical forms of optical distortion present in image data can include radial distortion, chromatic or spherical aberration, linear distortion, geometric distortion, and combinations thereof. Such optical distortion may not necessarily be a consequence of a latent defect in the sensor subsystem 118 but may be inherent in the design or manufacture of the optics used to provide the sensor subsystem 118 or characteristic of the image processing hardware, firmware, or software employed by the volume dimensioning system 100. Such optical distortion may variously be referred to as pincushion distortion, barrel distortion, or mustache distortion depending on the visual appearance of the distortion present in the displayed pattern image data 212, 222. Regardless of the cause, the presence of distortion in the image data compromises the ability of the volume dimensioning system 100 to accurately determine dimensional or volumetric data for an object. Uncorrected, such optical and dimensional distortion may adversely impact the accuracy of the shipping costs provided by the volume dimensioning system 100 and also may hinder a shipper's ability to schedule and load shipping containers, trucks, railcars, or the like based on the dimensional and volumetric data.
In some instances, optical distortion may be present but non-uniformly distributed across an image. Such distortion may result in a portion of an image suffering little or no optical distortion while other portions of the same image suffer significant distortion. For example, little optical distortion may be present in the center portion of an image while all or a portion of the periphery of the same image may suffer a much greater degree of optical distortion. In other instances, a first type of distortion may be distributed more-or-less uniformly across the image while a second type of distortion may be present in one or more localized areas. In yet other instances, an object of dimensional interest may lie within only a portion of an optically distorted image captured by the image sensor. In such instances, it may be advantageous to locally correct the distortion present in the area of the image in which the object of dimensional interest lies. In at least some instances, if the type and extent of such local distortion present in an image can be assessed or is of a known type, extent, and/or magnitude, then local dimensional correction may be possible within the image. The ability to locally correct distortion present in an image advantageously avoids the application of such distortion correction in portions of the image where such distortion is not present.
Although the reference pattern 202 is depicted as a checkerboard, any number of machine recognizable indicia including one or more machine-readable symbols, machine-readable patterns, calibration patterns, calibration targets, calibration points, or the like may be similarly employed as a tool for assessing the distortion (e.g., amount, type, location) present in the image data. Physical parameters associated with the reference pattern 202 can be provided to the volume dimensioning system 100 either as one or more factory settings (e.g., preloaded values placed into the read only portion of the non-transitory storage medium) or communicated to the volume dimensioning system (e.g., via a network, Bluetooth, or similar connection). All or a portion of such physical parameters may include color information associated with the reference pattern 202 including the overall reference pattern size, the size of the white areas 204, the size of the colored areas 206, or combinations thereof. All or a portion of such physical parameters may include the spatial or geometric relationship between the various white and colored areas 204, 206 in the reference pattern 202. All or a portion of such physical parameters may include information encoded into one or more regions or portions of the reference pattern 202 in the form of one or more machine readable indicia or symbols. Pattern image data 212, 222 is used by the at least one processor 126 to detect and quantify the distortion (e.g., optical or dimensional distortion) present in the image data using the one or more known physical parameters. The quantity of distortion present may be expressed as at least one distortion value. In at least some instances, all or a portion of the reference pattern 202 may be useful in calibrating the volume dimensioning system 100.
In at least some instances, the entire electromagnetic spectrum reflected or otherwise returned from the reference pattern 202 may be used by the at least one processor 126 to determine all or a portion of the at least one distortion value. In other instances, only a portion of the electromagnetic spectrum reflected or otherwise returned from the reference pattern 202 may be used by the at least one processor 126 to determine the at least one distortion value. In some instances, the reference pattern 202 may return pattern image data unique to the electromagnetic spectrum illuminating the reference pattern 202 (e.g., the pattern image data returned in a near-ultraviolet spectrum may differ from that returned in a visible spectrum). Portions of the electromagnetic spectrum used by the at least one processor 126 may include, but are not limited to, the near ultraviolet portion of the electromagnetic spectrum, the near infrared portion of the electromagnetic spectrum, one or more portions of the visible electromagnetic spectrum, or any portion or combination thereof.
Although the entire reference pattern 202 is shown within the field of view 210 of the image sensor 124 in both instances, in other instances only a portion of the reference pattern 202 may appear within the field of view 210, and the at least one distortion value may be determined based at least in part on that portion of the reference pattern 202.
Identification data may in some instances be created, generated or otherwise provided by the at least one processor 126 and associated with the at least one determined distortion value. Such identification data may include chronological data such as data indicative of the date or the time at which the at least one distortion value was obtained, calculated, or otherwise determined by the at least one processor 126. Such identification data may include chronological data such as data indicative of the date or the time at which one or more sets of distortion correction factors are determined by the at least one processor 126. Such identification data may include chronological data associated with system events (e.g., distortion value determination, distortion correction determination, system calibration, a change in system units of accuracy, a change in system configuration, etc.) that are recommended or required for compliance with one or more corporate, industry, or regulatory guidelines, best practices, or standards.
The determined at least one distortion value along with the respective associated identification data may be at least partially stored in the at least one non-transitory storage media 130. Maintaining a history of the determined at least one distortion value may advantageously provide the ability for the one or more processors 126 to predict expected future distortion values and to detect sudden or unexpected changes in the level or magnitude of the determined at least one distortion value. Sudden changes in the at least one distortion value may, for example, indicate an unexpected change in performance of the volume dimensioning system 100. The ability to predict future expected distortion values may, for example, be useful in providing a predicted replacement interval or an expected remaining service life for the volume dimensioning system 100.
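As a non-limiting sketch of such history keeping (the storage layout, the linear-trend extrapolation, and the jump test are illustrative assumptions), timestamped distortion values might be retained and analyzed as follows:

```python
# Sketch: retain timestamped distortion values, extrapolate the trend, and
# flag sudden changes. The linear-trend model is an illustrative assumption.
import time
import numpy as np

history: list[tuple[float, float]] = []   # (unix timestamp, distortion value)

def record(distortion: float) -> None:
    history.append((time.time(), distortion))

def predicted_threshold_crossing(threshold: float) -> float | None:
    """Estimate when a rising linear trend reaches `threshold` (unix time)."""
    if len(history) < 2:
        return None
    t, d = np.array(history).T
    slope, intercept = np.polyfit(t, d, 1)   # least-squares linear fit
    if slope <= 0:
        return None                          # no rising trend to extrapolate
    return (threshold - intercept) / slope

def sudden_change(window: int = 5, factor: float = 3.0) -> bool:
    """Flag the newest value if it jumps well beyond the recent average."""
    if len(history) <= window:
        return False
    recent = [d for _, d in history[-window - 1:-1]]
    return history[-1][1] > factor * (sum(recent) / len(recent))
```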
In at least some instances, the volume dimensioning system 100 can generate an output that includes both the identification data and the associated at least one distortion value, either as a visible output on the display 132 or as a data output transmitted via the communications subsystem 108 to an external device, such as a non-transitory data storage location on a local network or in the cloud, or an external output device such as a printer or similar data output device.
In at least some instances, the at least one processor 126 can calculate or otherwise determine one or more sets of distortion correction factors (e.g., one or more sets of optical distortion correction factors or one or more sets of dimensional distortion correction factors) based in whole or in part on the determined at least one distortion value. When the determined at least one distortion value falls within a first set of distortion threshold values, the volume dimensioning system 100 may use the distortion correction factors to reduce or even eliminate the effects of distortion, improving the dimensional, volumetric, and resultant shipping cost calculation capabilities of the volume dimensioning system 100.
The at least one processor 126 may determine the distortion correction factors using one or more numerical distortion correction methods. Numerical distortion correction methods may, for example, include Brown's distortion model or other similar mathematical distortion correction methods or schemes. One or more graphical distortion correction methods may also be used alone or in cooperation with one or more numerical distortion correction methods.
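For reference, Brown's (Brown-Conrady) model expresses the distorted image coordinates as a polynomial function of the radial distance from the distortion center. A minimal sketch of the forward model follows; the coefficient names k1, k2, p1, and p2 follow the usual convention, and the model order is truncated for brevity:

```python
# Sketch: forward Brown-Conrady distortion model on normalized image
# coordinates, truncated to two radial (k1, k2) and two tangential (p1, p2)
# coefficients. Coefficient values in practice come from calibration.
import numpy as np

def brown_distort(x: np.ndarray, y: np.ndarray,
                  k1: float, k2: float, p1: float, p2: float):
    """Map undistorted normalized coordinates to distorted positions."""
    r2 = x * x + y * y                       # squared radial distance
    radial = 1.0 + k1 * r2 + k2 * r2 * r2    # radial distortion term
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d
```

Correcting an image requires inverting this forward mapping, which is typically done iteratively or by resampling through a precomputed lookup of distorted source coordinates.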
In some instances, the at least one processor 126 may use the entire electromagnetic spectrum of the image provided by the sensor subsystem 118 to determine all or a portion of the one or more distortion correction factors. In other instances, the at least one processor may use a portion of the electromagnetic spectrum of the image provided by the sensor subsystem 118 to determine the one or more distortion correction factors. The use of sets of distortion correction factors in one or more portions of the electromagnetic spectrum may in some instances advantageously provide the ability to partially or completely correct at least a portion of the chromatic aberration present in the image data provided by the sensor subsystem 118.
Dimensional distortion, such as that shown in the pattern image data discussed above, may also be present in the image data received from the sensor subsystem 118.
In at least some instances the distortion present in the image data may include both optical and dimensional distortion. In such instances the one or more processors 126 may calculate multiple distortion values including at least one optical distortion value indicative of the level of optical distortion present in the image data and at least one dimensional distortion value indicative of the level of dimensional distortion present in the image data. The at least one optical distortion value and the at least one dimensional distortion value may be stored or otherwise retained individually within the non-transitory storage media 130 or alternatively may be combined to provide at least one combined distortion value reflective of both the optical and dimensional distortion present in the image data.
Using one or more calibration parameters of the reference pattern 202 and based on the determined at least one distortion value, the one or more processors 126 can determine or otherwise generate one or more sets of distortion correction factors. Such sets of distortion correction factors can include one or more sets of optical distortion correction factors, one or more sets of dimensional distortion correction factors, or combinations thereof. The one or more sets of distortion correction factors can be wholly or partially stored or otherwise retained in the at least one non-transitory storage media 130. The one or more sets of distortion correction factors are used by the at least one processor 126 to reduce or eliminate the effects of distortion present in the image data on the determined dimensional, volumetric, volume dimensional, or cost data provided by the volume dimensioning system 100. Additionally, the one or more sets of distortion correction factors may be used by the at least one processor 126 to correct the image data prior to using the image data to provide an output on the display 132.
The volume dimensioning system 100 can determine the at least one distortion value and one or more sets of distortion correction factors on a regular or irregular basis. For example, in some instances, the volume dimensioning system 100 can determine the at least one distortion value when the reference pattern 202 falls within the field of view of the at least one sensor 124 and the system 100 is not actively volume dimensioning an object. Such may occur when the volume dimensioning system 100 is placed in a defined location, for example returned to a cradle or stand. In other instances, the routine range of motion of the volume dimensioning system 100 may bring the reference pattern 202 within the field of view of the at least one sensor 124 as the volume dimensioning system is moved or displaced. For example, the reference pattern 202 may appear in the field of view of the at least one image sensor 124 when the volume dimensioning system 100 is moved from a “storage” position or location to a “ready” position or location, or from a “ready” position or location to a “storage” position or location. In yet other instances, the volume dimensioning system 100 may provide one or more human perceptible indicators or signals that prompt a user to at least partially align the volume dimensioning system 100 with the reference pattern 202 to permit the system to perform a distortion correction or calibration.
In other instances, determination of the at least one distortion value and optionally the determination of the at least one set of distortion correction factors may occur as a portion of the volume dimensioning system 100 calibration routine. For example, in some instances, the at least one distortion value may be determined prior to the performance of a volume dimensioning system calibration to improve or otherwise enhance the level of accuracy of the calibration. In some instances, such distortion correction or calibration routines may be time-based and conducted at regular or irregular intervals. In other instances, such distortion correction or calibration routines may be performance related and conducted based upon one or more measured system performance parameters. In yet other instances, such distortion correction or calibration routines may be time and performance based to comply with one or more corporate, industry, or regulatory standards, best practices, or guidelines.
Advantageously, the ability to detect distortion present in the image data, to quantify the distortion using at least one distortion value, to optionally determine one or more sets of distortion correction factors, and to optionally incorporate both into a volume dimensioning system calibration procedure reduces the likelihood of the volume dimensioning system 100 providing erroneous linear, volumetric, or shipping cost information. Such periodic detection and quantification of distortion present in the image data may be conducted on an automatic (i.e., system generated) or manual (i.e., at user discretion) basis at regular or irregular intervals.
The volume dimensioning system 100 may also include one or more sensors (not visible) that detect, for example, when the volume dimensioning system 100 is received by a support structure, such as a support member 302 coupled to a base member 304.
In at least some instances the reference pattern 202 can be formed on the base member 304 or on a rigid or flexible member that is operably coupled or otherwise attached to the base member 304. The reference pattern 202 may be formed using different colors, materials, embossings, debossings, textures, engravings, or the like. In some instances, the reference pattern 202 may include one or more inscriptions, logos, designs, trademarked images, or the like. In at least some instances all or a portion of the reference pattern 202 and the base member 304 may be detached and mounted remotely from the support member 302. For example, in at least some instances the reference pattern 202 may be mounted on a vertical surface such as a wall or similar partition.
In at least some situations, when the volume dimensioning system 100 is received by the support member 302 the sensor subsystem 118 may autonomously provide pattern image data including at least the portion of the reference pattern 202 to the at least one processor 126. Autonomous provision of image data by the sensor subsystem 118 to the at least one processor 126 may occur at regular or irregular intervals. Autonomous collection of pattern image data may permit a more frequent updating of the at least one distortion value or the one or more sets of distortion correction factors than a manually initiated collection of pattern image data since such autonomous collection may occur at times when the volume dimensioning system 100 is not in active use. The pattern image data so acquired allows the at least one processor 126 to determine the at least one distortion value using the known reference pattern 202 calibration parameters. Access to pattern image data also optionally permits the at least one processor 126 to determine the one or more sets of distortion correction factors. Providing the at least one processor 126 with the ability to determine the at least one distortion value and the sets of distortion correction factors while the volume dimensioning system 100 is not in active use may advantageously increase the overall accuracy of the dimensional, volumetric, and cost information provided by the system 100.
Although the object 402a is depicted as a cubic solid for simplicity and ease of illustration, it should be understood that similar principles as described below will apply to any object placed within the field of view of the volume dimensioning system 100. Object 402a is illustrated as having actual dimensions of 11.1 cm in length, 6.3 cm in width, and 15.6 cm in height. Such an object may be representative of a commonly encountered shipping container such as a cardboard box. Prior to placement of the object 402a in the field of view 210 of the imaging sensor 124, the volume dimensioning system 100a has determined through the use of a reference pattern 202 (not shown) that the at least one distortion value falls within the first distortion threshold, and the system 100a therefore determines and displays dimensional, volumetric, and cost information using a fine unit of accuracy.
Object 402b has dimensions identical to object 402a: 11.1 cm in length, 6.3 cm in width, and 15.6 cm in height. However, prior to placement of the object 402b in the field of view 210 of the imaging sensor 124, the volume dimensioning system 100b has determined through the use of a reference pattern 202 (not shown) that the at least one distortion value exceeds the first distortion threshold while remaining below the second distortion threshold, and the system 100b therefore determines and displays dimensional, volumetric, and cost information using a coarse unit of accuracy.
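By way of a non-limiting numerical illustration (the unit sizes of 0.1 cm and 0.5 cm are hypothetical examples of fine and coarse units of accuracy, respectively), the dimensions measured by systems 100a and 100b might be reported as follows:

```python
# Sketch: report measured dimensions in a fine or coarse unit of accuracy.
# The unit sizes (0.1 cm fine, 0.5 cm coarse) are hypothetical examples.

def quantize(value_cm: float, unit_cm: float) -> float:
    """Round a measured dimension to the nearest reportable unit."""
    return round(round(value_cm / unit_cm) * unit_cm, 4)

dims = (11.1, 6.3, 15.6)   # measured length, width, height of object 402

fine   = tuple(quantize(d, 0.1) for d in dims)   # low distortion:    (11.1, 6.3, 15.6)
coarse = tuple(quantize(d, 0.5) for d in dims)   # higher distortion: (11.0, 6.5, 15.5)
```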
In at least some instances, the volume dimensioning system 100 can correct distortion present in only a portion of the overall image. For example, the volume dimensioning system 100 may correct only the portion of the image containing the object 402. Such local correction can proceed using one or more correction factors determined based at least in part on any distortion present in the portion of the image containing and/or proximate the object 402. Such local distortion correction factors can be used in a manner similar to image wide distortion correction factors, for example to determine the dimensional accuracy achievable with regard to the object 402 and to determine whether a fine unit of accuracy or a coarse unit of accuracy should be used in assessing dimensional and cost information for the object 402.
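As a non-limiting sketch of such local correction (OpenCV, the function name, and the ROI convention are assumptions of the illustration), distortion might be corrected only within the portion of the image containing the object 402:

```python
# Sketch: undistort only the region of interest (ROI) containing the object,
# leaving the remainder of the image untouched. OpenCV is an assumed choice.
import cv2
import numpy as np

def correct_roi(image: np.ndarray, camera_matrix: np.ndarray,
                dist_coeffs: np.ndarray, roi: tuple[int, int, int, int]):
    """Undistort only the sub-image bounded by roi = (x, y, w, h)."""
    x, y, w, h = roi
    patch = image[y:y + h, x:x + w]
    # Shift the principal point so the camera model is expressed in the
    # patch's own coordinate frame before undistorting the patch alone.
    local_K = camera_matrix.copy()
    local_K[0, 2] -= x
    local_K[1, 2] -= y
    corrected = image.copy()
    corrected[y:y + h, x:x + w] = cv2.undistort(patch, local_K, dist_coeffs)
    return corrected
```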
At 506 the at least one processor 126 determines at least one distortion value using the pattern image data received from the sensor subsystem 118 at 504. The at least one processor 126 can determine any number of distortion values, including at least one of: an optical distortion value, a dimensional distortion value, a chromatic aberration or distortion value, or combinations thereof. The distortion values so determined provide a quantitative measure or assessment of the overall quality of the image data provided by the sensor subsystem 118. In some instances, all or a portion of the at least one distortion values determined by the at least one processor 126 at 506 can be stored or otherwise retained within the at least one non-transitory storage media 130.
At 508 the at least one processor 126 compares the determined at least one distortion value from 506 with a first distortion threshold. A determined at least one distortion value falling within the first distortion threshold indicates the distortion present in the image data provided by the sensor subsystem 118 is sufficiently small that a fine unit of accuracy may be used in determining and calculating dimensional, volumetric, and cost information. Conversely, a determined at least one distortion value exceeding the first distortion threshold may indicate the level of distortion present in the image data provided by the sensor subsystem 118 is sufficiently large that the use of the fine unit of accuracy is inappropriate and a coarse unit of accuracy should instead be used to determine and calculate dimensional, volumetric, and cost information. Such distortion thresholds may be provided as one or more factory settings or one or more periodically updated thresholds that are stored or otherwise retained in the at least one non-transitory storage media 130.
Advantageously, such adjustments are made autonomously by the volume dimensioning system 100 without user intervention using the determined at least one distortion value and a plurality of distortion thresholds stored or otherwise retained within the non-transitory storage media 130. For illustrative purposes, Table 1 lists one set of example values that may be associated with “fine” and “coarse” units of accuracy:
At 510, if the at least one processor 126 finds the distortion value determined at 506 is within or less than the first distortion threshold, the at least one processor 126 can adjust one or more volume dimensioning system parameters at 512. In at least some instances, at 512 the one or more processors 126 may calculate one or more sets of distortion correction factors to reduce or eliminate the distortion present in the image data provided by the sensor subsystem 118 using the one or more distortion values determined at 506. In some instances, adjusting the one or more volume dimensioning system parameters at 512 may also include confirming the fine units of accuracy are being used, performing one or more calibration routines, or combinations thereof.
At 514, if the at least one processor 126 found at 510 that the at least one distortion value determined at 506 exceeded the first distortion threshold, the at least one processor 126 compares the determined at least one distortion value with a second distortion threshold. In at least some instances, distortion values exceeding the second distortion threshold may indicate the presence of distortion in the image data provided by the sensor subsystem 118 that is of a magnitude or severity sufficient to render the system 100 unusable based on one or more corporate, industry, or regulatory guidelines, best practices, or standards.
If at 516 the at least one processor 126 finds the at least one distortion value determined at 506 exceeds the second distortion threshold, the at least one processor 126 can generate one or more human-perceptible outputs indicative of a “service required” condition at 518. In some instances at 518, one or more functions or features of the volume dimensioning system 100, for example the costing functionality, may also be inhibited.
At 520, if the at least one processor 126 found at 516 that the distortion value determined at 506 fell between the first and the second distortion thresholds, the at least one processor 126 can adjust the units of accuracy of the information presented by the volume dimensioning system 100. In at least some instances, the at least one processor 126 can adjust dimensional, volumetric, or cost information provided by the volume dimensioning system 100 to one or more coarse units of accuracy. In at least some instances, at 520 the one or more processors 126 may calculate one or more sets of distortion correction factors to reduce or eliminate the distortion present in the image data provided by the sensor subsystem 118 using the one or more distortion values determined at 506. The one or more coarse units of accuracy cause the system 100 to determine, calculate, and display dimensional, volumetric, and cost data in units of accuracy that are based at least in part on the capability of the system 100 to resolve such dimensions and volumes given the distortion values determined at 506. In at least some instances, some or all of the units of accuracy may be based on one or more corporate, industry, or regulatory guidelines, best practices, or standards. In some instances, for example, the units of accuracy used by the volume dimensioning system may be based on NIST Handbook 44 (2012 edition), Section 5.58. The method 500 terminates at 522.
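Taken together, 510 through 520 amount to a two-threshold decision cascade. The sketch below mirrors that logic with placeholder threshold and unit values; the numbers are illustrative assumptions, not values from Table 1 or the disclosure.

    def report_dimension(raw_cm, distortion_value,
                         first_threshold=0.5, second_threshold=2.0,
                         fine_unit_cm=0.1, coarse_unit_cm=0.5):
        """Mirror the 510/516 decision cascade: report to the fine unit,
        fall back to the coarse unit, or raise a service-required condition."""
        if distortion_value <= first_threshold:        # 510 -> 512
            unit = fine_unit_cm
        elif distortion_value <= second_threshold:     # 516 -> 520
            unit = coarse_unit_cm
        else:                                          # 516 -> 518
            raise RuntimeError("service required: second threshold exceeded")
        # Quantize the reported dimension to the selected unit of accuracy.
        return round(raw_cm / unit) * unit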
At 606 the at least one processor 126 assesses the image data supplied by the sensor subsystem 118 for dimensional distortion. From this assessment, the at least one processor 126 determines, at least in part, at least one dimensional distortion value. The at least one dimensional distortion value determined at 606 can provide a quantitative measure of the degree or magnitude of the dimensional distortion present in the image data provided by the sensor subsystem 118. Such a quantitative measure may be obtained by the at least one processor 126 using one or more numerical distortion analysis techniques, graphical distortion analysis techniques, or combinations thereof. After the at least one processor 126 has determined at least one distortion value attributable to either or both optical and dimensional distortion present in the image data provided by the sensor subsystem 118, the method 600 concludes at 608.
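As an assumed example of one simple numerical distortion analysis technique, the dimensional distortion value could be expressed as the relative error between a dimension recovered from the image data and the known size of a reference target placed in the field of view (the function and target are hypotheticals of this sketch):

    def dimensional_distortion_value(measured_mm, reference_mm):
        """Quantify dimensional distortion as the relative error between a
        dimension measured from the image data and the known size of a
        reference target."""
        return abs(measured_mm - reference_mm) / reference_mm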
At 704, the at least one processor 126 can associate one or more logical identifiers with the at least one distortion value determined at 506 or the one or more sets of distortion correction factors determined at 512 or 520. Any type of logical identifier, including one or more sequential or chronological identifiers, may be so associated. The association of one or more logical identifiers with the at least one distortion value or the one or more sets of distortion correction factors permits the retrieval and presentation of such data in an organized and logical manner. Storage of such historical data may also assist in compliance with one or more corporate, industry, or regulatory guidelines, best practices, or standards.
In particular, at 704 the at least one processor 126 can associate one or more logical identifiers with all or a portion of the distortion values (i.e., determined at 506) or all or a portion of the calculated sets of distortion correction factors (i.e., calculated at 512 or 520). In at least some instances, the one or more logical identifiers can include one or more chronological identifiers, such as the date and time of determination of the at least one distortion value or of calculation of the set of distortion correction factors by the at least one processor 126. In some instances, the one or more logical identifiers can include one or more serialized identifiers sequentially assigned by the at least one processor 126 upon determining the at least one distortion value or calculating the set of distortion correction factors. Any similar logical identifier that provides the ability to retrieve, sort, organize, or display the associated distortion values or distortion correction factors in a logical manner may be so assigned by the at least one processor 126.
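A minimal sketch of such record-keeping follows, using Python dataclasses; the serialized identifier is drawn from a monotonically increasing counter and the chronological identifier is a UTC timestamp (all names are illustrative assumptions):

    import itertools
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    _serial = itertools.count(1)  # serialized identifier, assigned in sequence

    @dataclass
    class DistortionRecord:
        value: float  # a distortion value or a correction-factor summary
        serial: int = field(default_factory=lambda: next(_serial))
        timestamp: str = field(
            default_factory=lambda: datetime.now(timezone.utc).isoformat())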
At 706, the at least one distortion value or the set of distortion correction factors and the associated logical identifier are at least partially stored within the at least one non-transitory storage media 130. In at least some instances, at least a portion of the non-transitory storage media 130 can include one or more types of removable media, for example secure digital (SD) storage media, compact flash (CF) storage media, universal serial bus (USB) storage media, memory sticks, or the like. The use of such removable storage media may advantageously permit the transfer of data, such as the stored distortion values and distortion correction factors, to one or more external computing devices equipped with a comparable removable storage media reader.
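Continuing the sketch above, each record might be appended to a log file on the removable media; the path and field names are assumptions of this example:

    import json

    def store_record(record, path="/media/sd/distortion_log.jsonl"):
        """Append one record to a log kept on, for example, removable SD media."""
        with open(path, "a") as f:
            f.write(json.dumps({"serial": record.serial,
                                "timestamp": record.timestamp,
                                "value": record.value}) + "\n")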
At 708, the stored distortion values or distortion correction factors are displayed, sorted or otherwise arranged or organized by the associated logical identifier, either on the internal display device 132 of the volume dimensioning system 100 or on an external display device wiredly or wirelessly accessed by the system 100 via the communications subsystem 108.
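The retrieval and sorted display at 708 could then be as simple as reading the log back and sorting on the chosen identifier, again as a sketch under the same assumptions:

    import json

    def load_sorted(path="/media/sd/distortion_log.jsonl", key="serial"):
        """Read the stored records back, sorted by the associated logical
        identifier (serial number or timestamp) for display."""
        with open(path) as f:
            records = [json.loads(line) for line in f]
        return sorted(records, key=lambda r: r[key])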
At 710, the stored distortion values or distortion correction factors are displayed as at 708, sorted by the associated logical identifier on the internal display device 132 or an external display device. Additionally, one or more trend lines may be fitted to the displayed data to provide an indication of the overall rate of degradation or change in the distortion of the image data provided by the sensor subsystem 118. Such trend data may be useful in detecting sudden or unexpected changes in the overall level of image data quality provided by the sensor subsystem 118 and may advantageously provide an indication of the overall condition of the sensor subsystem 118.
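A first-order trend line of the kind described can be fitted with an ordinary least-squares polynomial fit; the sketch below uses the stored reading index as a proxy for time, which is an assumption of this example:

    import numpy as np

    def distortion_trend(records):
        """Fit a first-order trend line to the stored distortion values; the
        slope estimates the rate of degradation per stored reading."""
        y = np.array([r["value"] for r in records], dtype=float)
        x = np.arange(len(y), dtype=float)  # reading index as a time proxy
        slope, intercept = np.polyfit(x, y, 1)
        return slope, intercept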
At 712, the stored distortion values or distortion correction factors are again displayed sorted by the associated logical identifier, and one or more trend lines or similar data analysis techniques are used to provide a performance forecast. Such performance forecasts may identify an expected date or timeframe at which the image data provided by the sensor subsystem 118 will no longer fall within an acceptable distortion threshold. Such data may advantageously indicate or predict an expected date at which the sensor subsystem 118 or the volume dimensioning system 100 may require service or replacement. The method 700 terminates at 714.
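Under the same assumptions, the performance forecast at 712 can be sketched by extrapolating the fitted trend line to the reading index at which the acceptable threshold would be crossed:

    import math

    def forecast_threshold_crossing(slope, intercept, threshold):
        """Extrapolate the fitted trend line to estimate the reading index at
        which the distortion value is expected to exceed the threshold."""
        if slope <= 0:
            return None  # no upward trend; no crossing is forecast
        return math.ceil((threshold - intercept) / slope)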
The above description of illustrated embodiments, including what is described in the Abstract, is not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Although specific embodiments and examples are described herein for illustrative purposes, various equivalent modifications can be made without departing from the spirit and scope of the disclosure, as will be recognized by those skilled in the relevant art. The teachings provided herein of the various embodiments can be applied to other automated systems, not necessarily the exemplary volume dimensioning system generally described above.
For instance, the foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, schematics, and examples. Insofar as such block diagrams, schematics, and examples contain one or more functions and/or operations, it will be understood by those skilled in the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, the present subject matter may be implemented via Application Specific Integrated Circuits (ASICs). However, those skilled in the art will recognize that the embodiments disclosed herein, in whole or in part, can be equivalently implemented in standard integrated circuits, as one or more computer programs executed by one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs executed by one or more controllers (e.g., microcontrollers), as one or more programs executed by one or more processors (e.g., microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one of ordinary skill in the art in light of the teachings of this disclosure.
When logic is implemented as software and stored in memory, logic or information can be stored on any computer-readable medium for use by or in connection with any processor-related system or method. In the context of this disclosure, a memory is a computer-readable medium that is an electronic, magnetic, optical, or other physical device or means that contains or stores a computer and/or processor program. Logic and/or information can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions associated with the logic and/or information.
In the context of this specification, a “computer-readable medium” can be any element that can store the program associated with logic and/or information for use by or in connection with the instruction execution system, apparatus, and/or device. The computer-readable medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: a portable computer diskette (magnetic, compact flash card, secure digital, or the like), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), a portable compact disc read-only memory (CDROM), digital tape, and other nontransitory media.
Many of the methods described herein can be performed with one or more variations. For example, many of the methods may include additional acts, omit some acts, and/or perform or execute acts in a different order than as illustrated or described.
The various embodiments described above can be combined to provide further embodiments. All of the commonly assigned US patent application publications, US patent applications, foreign patents, foreign patent applications, and non-patent publications referred to in this specification and/or listed in the Application Data Sheet, including but not limited to U.S. provisional patent application Ser. No. 61/691,093, filed Aug. 20, 2012, are incorporated herein by reference, in their entirety. Aspects of the embodiments can be modified, if necessary, to employ systems, circuits and concepts of the various patents, applications and publications to provide yet further embodiments.
These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.
This application is a continuation of U.S. application Ser. No. 13/786,131, filed Mar. 5, 2013, which claims the benefit of U.S. Provisional Application No. 61/691,093, filed Aug. 20, 2012, each of which is incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
3971065 | Bayer | Jul 1976 | A |
4279328 | Ahlbom | Jul 1981 | A |
4398811 | Nishioka et al. | Aug 1983 | A |
4495559 | Gelatt et al. | Jan 1985 | A |
4730190 | Win et al. | Mar 1988 | A |
4803639 | Steele et al. | Feb 1989 | A |
4914460 | Caimi et al. | Apr 1990 | A |
4974919 | Muraki et al. | Dec 1990 | A |
5111325 | Dejager | May 1992 | A |
5175601 | Fitts | Dec 1992 | A |
5184733 | Arnarson et al. | Feb 1993 | A |
5220536 | Stringer et al. | Jun 1993 | A |
5243619 | Albers et al. | Sep 1993 | A |
5331118 | Jensen | Jul 1994 | A |
5359185 | Hanson | Oct 1994 | A |
5384901 | Glassner et al. | Jan 1995 | A |
5477622 | Skalnik | Dec 1995 | A |
5548707 | Lonegro et al. | Aug 1996 | A |
5555090 | Schmutz | Sep 1996 | A |
5561526 | Huber et al. | Oct 1996 | A |
5590060 | Granville et al. | Dec 1996 | A |
5592333 | Lewis | Jan 1997 | A |
5606534 | Stringer et al. | Feb 1997 | A |
5619245 | Kessler et al. | Apr 1997 | A |
5655095 | Lonegro et al. | Aug 1997 | A |
5661561 | Wurz et al. | Aug 1997 | A |
5699161 | Woodworth | Dec 1997 | A |
5729750 | Ishida | Mar 1998 | A |
5730252 | Herbinet | Mar 1998 | A |
5732147 | Tao | Mar 1998 | A |
5734476 | Dlugos | Mar 1998 | A |
5737074 | Haga et al. | Apr 1998 | A |
5748199 | Palm | May 1998 | A |
5767962 | Suzuki et al. | Jun 1998 | A |
5802092 | Endriz | Sep 1998 | A |
5808657 | Kurtz et al. | Sep 1998 | A |
5831737 | Stringer et al. | Nov 1998 | A |
5850370 | Stringer et al. | Dec 1998 | A |
5850490 | Johnson | Dec 1998 | A |
5869827 | Rando | Feb 1999 | A |
5870220 | Migdal et al. | Feb 1999 | A |
5900611 | Hecht | May 1999 | A |
5923428 | Woodworth | Jul 1999 | A |
5929856 | Lonegro et al. | Jul 1999 | A |
5938710 | Lanza et al. | Aug 1999 | A |
5959568 | Woolley | Sep 1999 | A |
5960098 | Tao | Sep 1999 | A |
5969823 | Wurz et al. | Oct 1999 | A |
5978512 | Kim | Nov 1999 | A |
5979760 | Freyman et al. | Nov 1999 | A |
5988862 | Kacyra et al. | Nov 1999 | A |
5991041 | Woodworth | Nov 1999 | A |
6009189 | Schaack | Dec 1999 | A |
6025847 | Marks | Feb 2000 | A |
6049386 | Stringer et al. | Apr 2000 | A |
6053409 | Brobst et al. | Apr 2000 | A |
6064759 | Buckley et al. | May 2000 | A |
6067110 | Nonaka et al. | May 2000 | A |
6069696 | McQueen et al. | May 2000 | A |
6115114 | Berg et al. | Sep 2000 | A |
6137577 | Woodworth | Oct 2000 | A |
6177999 | Wurz et al. | Jan 2001 | B1 |
6189223 | Haug | Feb 2001 | B1 |
6232597 | Kley | May 2001 | B1 |
6236403 | Chaki et al. | May 2001 | B1 |
6246468 | Dimsdale | Jun 2001 | B1 |
6333749 | Reinhardt et al. | Dec 2001 | B1 |
6336587 | He et al. | Jan 2002 | B1 |
6369401 | Lee | Apr 2002 | B1 |
6373579 | Ober et al. | Apr 2002 | B1 |
6429803 | Kumar | Aug 2002 | B1 |
6457642 | Good et al. | Oct 2002 | B1 |
6507406 | Yagi et al. | Jan 2003 | B1 |
6517004 | Good et al. | Feb 2003 | B2 |
6519550 | D et al. | Feb 2003 | B1 |
6535776 | Tobin et al. | Mar 2003 | B1 |
6641042 | Pierenkemper et al. | Nov 2003 | B1 |
6661521 | Stern | Dec 2003 | B1 |
6674904 | McQueen | Jan 2004 | B1 |
6689998 | Bremer | Feb 2004 | B1 |
6705526 | Zhu et al. | Mar 2004 | B1 |
6750769 | Smith | Jun 2004 | B1 |
6773142 | Rekow | Aug 2004 | B2 |
6781621 | Gobush et al. | Aug 2004 | B1 |
6804269 | Lizotte et al. | Oct 2004 | B2 |
6824058 | Patel et al. | Nov 2004 | B2 |
6832725 | Gardiner et al. | Dec 2004 | B2 |
6858857 | Pease et al. | Feb 2005 | B2 |
6912293 | Korobkin | Jun 2005 | B1 |
6922632 | Foxlin | Jul 2005 | B2 |
6971580 | Zhu et al. | Dec 2005 | B2 |
6995762 | Pavlidis et al. | Feb 2006 | B1 |
7057632 | Yamawaki et al. | Jun 2006 | B2 |
7085409 | Sawhney et al. | Aug 2006 | B2 |
7086162 | Tyroler | Aug 2006 | B2 |
7104453 | Zhu et al. | Sep 2006 | B1 |
7128266 | Zhu et al. | Oct 2006 | B2 |
7137556 | Bonner et al. | Nov 2006 | B1 |
7159783 | Walczyk et al. | Jan 2007 | B2 |
7161688 | Bonner et al. | Jan 2007 | B1 |
7205529 | Andersen et al. | Apr 2007 | B2 |
7214954 | Schopp | May 2007 | B2 |
7233682 | Levine | Jun 2007 | B2 |
7277187 | Smith et al. | Oct 2007 | B2 |
7307653 | Dutta | Dec 2007 | B2 |
7310431 | Gokturk et al. | Dec 2007 | B2 |
7313264 | Crampton | Dec 2007 | B2 |
7413127 | Ehrhart et al. | Aug 2008 | B2 |
7527205 | Zhu et al. | May 2009 | B2 |
7551149 | Higham | Jun 2009 | B2 |
7586049 | Wurz | Sep 2009 | B2 |
7602404 | Reinhardt et al. | Oct 2009 | B1 |
7614563 | Nunnink et al. | Nov 2009 | B1 |
7639722 | Paxton et al. | Dec 2009 | B1 |
7726206 | Terrafranca et al. | Jun 2010 | B2 |
7726575 | Wang et al. | Jun 2010 | B2 |
7780084 | Zhang et al. | Aug 2010 | B2 |
7788883 | Buckley et al. | Sep 2010 | B2 |
7912320 | Minor | Mar 2011 | B1 |
7974025 | Topliss | Jul 2011 | B2 |
8009358 | Zalevsky et al. | Aug 2011 | B2 |
8027096 | Feng et al. | Sep 2011 | B2 |
8028501 | Buckley et al. | Oct 2011 | B2 |
8050461 | Shpunt et al. | Nov 2011 | B2 |
8055061 | Katano | Nov 2011 | B2 |
8061610 | Nunnink | Nov 2011 | B2 |
8072581 | Breiholz | Dec 2011 | B1 |
8102395 | Kondo et al. | Jan 2012 | B2 |
8132728 | Dwinell et al. | Mar 2012 | B2 |
8134717 | Pangrazio et al. | Mar 2012 | B2 |
8149224 | Kuo et al. | Apr 2012 | B1 |
8194097 | Xiao et al. | Jun 2012 | B2 |
8212889 | Chanas et al. | Jul 2012 | B2 |
8224133 | Popovich et al. | Jul 2012 | B2 |
8228510 | Pangrazio et al. | Jul 2012 | B2 |
8230367 | Bell et al. | Jul 2012 | B2 |
8294969 | Plesko | Oct 2012 | B2 |
8301027 | Shaw et al. | Oct 2012 | B2 |
8305458 | Hara | Nov 2012 | B2 |
8310656 | Zalewski | Nov 2012 | B2 |
8313380 | Zalewski et al. | Nov 2012 | B2 |
8317105 | Kotlarsky et al. | Nov 2012 | B2 |
8320621 | McEldowney | Nov 2012 | B2 |
8322622 | Liu | Dec 2012 | B2 |
8339462 | Stec et al. | Dec 2012 | B2 |
8350959 | Topliss et al. | Jan 2013 | B2 |
8351670 | Ijiri et al. | Jan 2013 | B2 |
8366005 | Kotlarsky et al. | Feb 2013 | B2 |
8368762 | Chen et al. | Feb 2013 | B1 |
8371507 | Haggerty et al. | Feb 2013 | B2 |
8374498 | Pastore | Feb 2013 | B2 |
8376233 | Horn et al. | Feb 2013 | B2 |
8381976 | Mohideen et al. | Feb 2013 | B2 |
8381979 | Franz | Feb 2013 | B2 |
8390909 | Plesko | Mar 2013 | B2 |
8408464 | Zhu et al. | Apr 2013 | B2 |
8408468 | Van et al. | Apr 2013 | B2 |
8408469 | Good | Apr 2013 | B2 |
8424768 | Rueblinger et al. | Apr 2013 | B2 |
8437539 | Komatsu et al. | May 2013 | B2 |
8441749 | Brown et al. | May 2013 | B2 |
8448863 | Xian et al. | May 2013 | B2 |
8457013 | Essinger et al. | Jun 2013 | B2 |
8459557 | Havens et al. | Jun 2013 | B2 |
8463079 | Ackley et al. | Jun 2013 | B2 |
8469272 | Kearney | Jun 2013 | B2 |
8474712 | Kearney et al. | Jul 2013 | B2 |
8479992 | Kotlarsky et al. | Jul 2013 | B2 |
8490877 | Kearney | Jul 2013 | B2 |
8517271 | Kotlarsky et al. | Aug 2013 | B2 |
8523076 | Good | Sep 2013 | B2 |
8528818 | Ehrhart et al. | Sep 2013 | B2 |
8544737 | Gomez et al. | Oct 2013 | B2 |
8548420 | Grunow et al. | Oct 2013 | B2 |
8550335 | Samek et al. | Oct 2013 | B2 |
8550354 | Gannon et al. | Oct 2013 | B2 |
8550357 | Kearney | Oct 2013 | B2 |
8556174 | Kosecki et al. | Oct 2013 | B2 |
8556176 | Van et al. | Oct 2013 | B2 |
8556177 | Hussey et al. | Oct 2013 | B2 |
8559767 | Barber et al. | Oct 2013 | B2 |
8561895 | Gomez et al. | Oct 2013 | B2 |
8561903 | Sauerwein, Jr. | Oct 2013 | B2 |
8561905 | Edmonds et al. | Oct 2013 | B2 |
8565107 | Pease et al. | Oct 2013 | B2 |
8570343 | Halstead | Oct 2013 | B2 |
8571307 | Li et al. | Oct 2013 | B2 |
8576390 | Nunnink | Nov 2013 | B1 |
8579200 | Samek et al. | Nov 2013 | B2 |
8583924 | Caballero et al. | Nov 2013 | B2 |
8584945 | Wang et al. | Nov 2013 | B2 |
8587595 | Wang | Nov 2013 | B2 |
8587697 | Hussey et al. | Nov 2013 | B2 |
8588869 | Sauerwein et al. | Nov 2013 | B2 |
8590789 | Nahill et al. | Nov 2013 | B2 |
8596539 | Havens et al. | Dec 2013 | B2 |
8596542 | Havens et al. | Dec 2013 | B2 |
8596543 | Havens et al. | Dec 2013 | B2 |
8599271 | Havens et al. | Dec 2013 | B2 |
8599957 | Peake et al. | Dec 2013 | B2 |
8600158 | Li et al. | Dec 2013 | B2 |
8600167 | Showering | Dec 2013 | B2 |
8602309 | Longacre et al. | Dec 2013 | B2 |
8608053 | Meier et al. | Dec 2013 | B2 |
8608071 | Liu et al. | Dec 2013 | B2 |
8611309 | Wang et al. | Dec 2013 | B2 |
8615487 | Gomez et al. | Dec 2013 | B2 |
8621123 | Caballero | Dec 2013 | B2 |
8622303 | Meier et al. | Jan 2014 | B2 |
8628013 | Ding | Jan 2014 | B2 |
8628015 | Wang et al. | Jan 2014 | B2 |
8628016 | Winegar | Jan 2014 | B2 |
8629926 | Wang | Jan 2014 | B2 |
8630491 | Longacre et al. | Jan 2014 | B2 |
8635309 | Berthiaume et al. | Jan 2014 | B2 |
8636200 | Kearney | Jan 2014 | B2 |
8636212 | Nahill et al. | Jan 2014 | B2 |
8636215 | Ding et al. | Jan 2014 | B2 |
8636224 | Wang | Jan 2014 | B2 |
8638806 | Wang et al. | Jan 2014 | B2 |
8640958 | Lu et al. | Feb 2014 | B2 |
8640960 | Wang et al. | Feb 2014 | B2 |
8643717 | Li et al. | Feb 2014 | B2 |
8646692 | Meier et al. | Feb 2014 | B2 |
8646694 | Wang et al. | Feb 2014 | B2 |
8657200 | Ren et al. | Feb 2014 | B2 |
8659397 | Vargo et al. | Feb 2014 | B2 |
8668149 | Good | Mar 2014 | B2 |
8678285 | Kearney | Mar 2014 | B2 |
8678286 | Smith et al. | Mar 2014 | B2 |
8682077 | Longacre, Jr. | Mar 2014 | B1 |
D702237 | Oberpriller et al. | Apr 2014 | S |
8687282 | Feng et al. | Apr 2014 | B2 |
8692927 | Pease et al. | Apr 2014 | B2 |
8695880 | Bremer et al. | Apr 2014 | B2 |
8698949 | Grunow et al. | Apr 2014 | B2 |
8702000 | Barber et al. | Apr 2014 | B2 |
8717494 | Gannon | May 2014 | B2 |
8720783 | Biss et al. | May 2014 | B2 |
8723804 | Fletcher et al. | May 2014 | B2 |
8723904 | Marty et al. | May 2014 | B2 |
8727223 | Wang | May 2014 | B2 |
8736909 | Sato et al. | May 2014 | B2 |
8740082 | Wilz, Sr. | Jun 2014 | B2 |
8740085 | Furlong et al. | Jun 2014 | B2 |
8746563 | Hennick et al. | Jun 2014 | B2 |
8750445 | Peake et al. | Jun 2014 | B2 |
8752766 | Xian et al. | Jun 2014 | B2 |
8756059 | Braho et al. | Jun 2014 | B2 |
8757495 | Qu et al. | Jun 2014 | B2 |
8760563 | Koziol et al. | Jun 2014 | B2 |
8777108 | Coyle | Jul 2014 | B2 |
8777109 | Oberpriller et al. | Jul 2014 | B2 |
8779898 | Havens et al. | Jul 2014 | B2 |
8781520 | Payne et al. | Jul 2014 | B2 |
8783573 | Havens et al. | Jul 2014 | B2 |
8789757 | Barten | Jul 2014 | B2 |
8789758 | Hawley et al. | Jul 2014 | B2 |
8789759 | Xian et al. | Jul 2014 | B2 |
8792688 | Unsworth | Jul 2014 | B2 |
8794520 | Wang et al. | Aug 2014 | B2 |
8794522 | Ehrhart | Aug 2014 | B2 |
8794525 | Amundsen et al. | Aug 2014 | B2 |
8794526 | Wang et al. | Aug 2014 | B2 |
8798367 | Ellis | Aug 2014 | B2 |
8807431 | Wang et al. | Aug 2014 | B2 |
8807432 | Van et al. | Aug 2014 | B2 |
8810779 | Hilde | Aug 2014 | B1 |
8820630 | Qu et al. | Sep 2014 | B2 |
8822806 | Cockerell et al. | Sep 2014 | B2 |
8822848 | Meagher | Sep 2014 | B2 |
8824692 | Sheerin et al. | Sep 2014 | B2 |
8824696 | Braho | Sep 2014 | B2 |
8842849 | Wahl et al. | Sep 2014 | B2 |
8844822 | Kotlarsky et al. | Sep 2014 | B2 |
8844823 | Fritz et al. | Sep 2014 | B2 |
8849019 | Li et al. | Sep 2014 | B2 |
D716285 | Chaney et al. | Oct 2014 | S |
8851383 | Yeakley et al. | Oct 2014 | B2 |
8854633 | Laffargue et al. | Oct 2014 | B2 |
8866963 | Grunow et al. | Oct 2014 | B2 |
8868421 | Braho et al. | Oct 2014 | B2 |
8868519 | Maloy et al. | Oct 2014 | B2 |
8868802 | Barten | Oct 2014 | B2 |
8868803 | Caballero | Oct 2014 | B2 |
8870074 | Gannon | Oct 2014 | B1 |
8879639 | Sauerwein, Jr. | Nov 2014 | B2 |
8880426 | Smith | Nov 2014 | B2 |
8881983 | Havens et al. | Nov 2014 | B2 |
8881987 | Wang | Nov 2014 | B2 |
8897596 | Passmore et al. | Nov 2014 | B1 |
8903172 | Smith | Dec 2014 | B2 |
8908277 | Pesach et al. | Dec 2014 | B2 |
8908995 | Benos et al. | Dec 2014 | B2 |
8910870 | Li et al. | Dec 2014 | B2 |
8910875 | Ren et al. | Dec 2014 | B2 |
8914290 | Hendrickson et al. | Dec 2014 | B2 |
8914788 | Pettinelli et al. | Dec 2014 | B2 |
8915439 | Feng et al. | Dec 2014 | B2 |
8915444 | Havens et al. | Dec 2014 | B2 |
8916789 | Woodburn | Dec 2014 | B2 |
8918250 | Hollifield | Dec 2014 | B2 |
8918564 | Caballero | Dec 2014 | B2 |
8925818 | Kosecki et al. | Jan 2015 | B2 |
8939374 | Jovanovski et al. | Jan 2015 | B2 |
8942480 | Ellis | Jan 2015 | B2 |
8944313 | Williams et al. | Feb 2015 | B2 |
8944327 | Meier et al. | Feb 2015 | B2 |
8944332 | Harding et al. | Feb 2015 | B2 |
8950678 | Germaine et al. | Feb 2015 | B2 |
D723560 | Zhou et al. | Mar 2015 | S |
8967468 | Gomez et al. | Mar 2015 | B2 |
8971346 | Sevier | Mar 2015 | B2 |
8976030 | Cunningham et al. | Mar 2015 | B2 |
8976368 | El et al. | Mar 2015 | B2 |
8978981 | Guan | Mar 2015 | B2 |
8978983 | Bremer et al. | Mar 2015 | B2 |
8978984 | Hennick et al. | Mar 2015 | B2 |
8985456 | Zhu et al. | Mar 2015 | B2 |
8985457 | Soule et al. | Mar 2015 | B2 |
8985459 | Kearney et al. | Mar 2015 | B2 |
8985461 | Gelay et al. | Mar 2015 | B2 |
8988578 | Showering | Mar 2015 | B2 |
8988590 | Gillet et al. | Mar 2015 | B2 |
8991704 | Hopper et al. | Mar 2015 | B2 |
8993974 | Goodwin | Mar 2015 | B2 |
8996194 | Davis et al. | Mar 2015 | B2 |
8996384 | Funyak et al. | Mar 2015 | B2 |
8998091 | Edmonds et al. | Apr 2015 | B2 |
9002641 | Showering | Apr 2015 | B2 |
9007368 | Laffargue et al. | Apr 2015 | B2 |
9010641 | Qu et al. | Apr 2015 | B2 |
9014441 | Truyen et al. | Apr 2015 | B2 |
9015513 | Murawski et al. | Apr 2015 | B2 |
9016576 | Brady et al. | Apr 2015 | B2 |
D730357 | Fitch et al. | May 2015 | S |
9022288 | Nahill et al. | May 2015 | B2 |
9030964 | Essinger et al. | May 2015 | B2 |
9033240 | Smith et al. | May 2015 | B2 |
9033242 | Gillet et al. | May 2015 | B2 |
9036054 | Koziol et al. | May 2015 | B2 |
9037344 | Chamberlin | May 2015 | B2 |
9038911 | Xian et al. | May 2015 | B2 |
9038915 | Smith | May 2015 | B2 |
D730901 | Oberpriller et al. | Jun 2015 | S |
D730902 | Fitch et al. | Jun 2015 | S |
D733112 | Chaney et al. | Jun 2015 | S |
9047098 | Barten | Jun 2015 | B2 |
9047359 | Caballero et al. | Jun 2015 | B2 |
9047420 | Caballero | Jun 2015 | B2 |
9047525 | Barber et al. | Jun 2015 | B2 |
9047531 | Showering et al. | Jun 2015 | B2 |
9049640 | Wang et al. | Jun 2015 | B2 |
9053055 | Caballero | Jun 2015 | B2 |
9053378 | Hou et al. | Jun 2015 | B1 |
9053380 | Xian et al. | Jun 2015 | B2 |
9057641 | Amundsen et al. | Jun 2015 | B2 |
9058526 | Powilleit | Jun 2015 | B2 |
9064165 | Havens et al. | Jun 2015 | B2 |
9064167 | Xian et al. | Jun 2015 | B2 |
9064168 | Todeschini et al. | Jun 2015 | B2 |
9064254 | Todeschini et al. | Jun 2015 | B2 |
9066032 | Wang | Jun 2015 | B2 |
9066087 | Shpunt | Jun 2015 | B2 |
9070032 | Corcoran | Jun 2015 | B2 |
D734339 | Zhou et al. | Jul 2015 | S |
D734751 | Oberpriller et al. | Jul 2015 | S |
9082023 | Feng et al. | Jul 2015 | B2 |
9082195 | Holeva et al. | Jul 2015 | B2 |
9142035 | Rotman et al. | Sep 2015 | B1 |
9233470 | Bradski et al. | Jan 2016 | B1 |
9273846 | Rossi et al. | Mar 2016 | B1 |
9299013 | Curlander et al. | Mar 2016 | B1 |
9366861 | Johnson | Jun 2016 | B1 |
9399557 | Mishra et al. | Jul 2016 | B1 |
9424749 | Reed et al. | Aug 2016 | B1 |
9470511 | Maynard et al. | Oct 2016 | B2 |
9486921 | Straszheim et al. | Nov 2016 | B1 |
9709387 | Fujita et al. | Jul 2017 | B2 |
9736459 | Mor et al. | Aug 2017 | B2 |
9741136 | Holz | Aug 2017 | B2 |
9828223 | Svensson et al. | Nov 2017 | B2 |
20010027995 | Patel et al. | Oct 2001 | A1 |
20010032879 | He et al. | Oct 2001 | A1 |
20020036765 | McCaffrey et al. | Mar 2002 | A1 |
20020067855 | Chiu et al. | Jun 2002 | A1 |
20020105639 | Roelke | Aug 2002 | A1 |
20020109835 | Goetz | Aug 2002 | A1 |
20020113946 | Kitaguchi et al. | Aug 2002 | A1 |
20020118874 | Chung et al. | Aug 2002 | A1 |
20020158873 | Williamson | Oct 2002 | A1 |
20020167677 | Okada et al. | Nov 2002 | A1 |
20020179708 | Zhu et al. | Dec 2002 | A1 |
20020186897 | Kim et al. | Dec 2002 | A1 |
20020196534 | Lizotte et al. | Dec 2002 | A1 |
20030038179 | Tsikos et al. | Feb 2003 | A1 |
20030053513 | Vatan et al. | Mar 2003 | A1 |
20030063086 | Baumberg | Apr 2003 | A1 |
20030071118 | Gershman et al. | Apr 2003 | A1 |
20030091227 | Chang et al. | May 2003 | A1 |
20030156756 | Gokturk et al. | Aug 2003 | A1 |
20030163287 | Vock et al. | Aug 2003 | A1 |
20030197138 | Pease et al. | Oct 2003 | A1 |
20030225712 | Cooper et al. | Dec 2003 | A1 |
20030235331 | Kawaike et al. | Dec 2003 | A1 |
20040008259 | Gokturk et al. | Jan 2004 | A1 |
20040019274 | Galloway et al. | Jan 2004 | A1 |
20040024754 | Mane et al. | Feb 2004 | A1 |
20040066329 | Zeitfuss et al. | Apr 2004 | A1 |
20040073359 | Ichijo et al. | Apr 2004 | A1 |
20040083025 | Yamanouchi et al. | Apr 2004 | A1 |
20040089482 | Ramsden et al. | May 2004 | A1 |
20040098146 | Katae et al. | May 2004 | A1 |
20040105580 | Hager et al. | Jun 2004 | A1 |
20040118928 | Patel et al. | Jun 2004 | A1 |
20040122779 | Stickler et al. | Jun 2004 | A1 |
20040132297 | Baba et al. | Jul 2004 | A1 |
20040155975 | Hart et al. | Aug 2004 | A1 |
20040165090 | Ning | Aug 2004 | A1 |
20040184041 | Schopp | Sep 2004 | A1 |
20040211836 | Patel et al. | Oct 2004 | A1 |
20040214623 | Takahashi et al. | Oct 2004 | A1 |
20040233461 | Armstrong et al. | Nov 2004 | A1 |
20040258353 | Gluckstad et al. | Dec 2004 | A1 |
20050006477 | Patel | Jan 2005 | A1 |
20050117215 | Lange | Jun 2005 | A1 |
20050128193 | Lueder | Jun 2005 | A1 |
20050128196 | Popescu et al. | Jun 2005 | A1 |
20050168488 | Montague | Aug 2005 | A1 |
20050190098 | Bridgelall et al. | Sep 2005 | A1 |
20050211782 | Martin et al. | Sep 2005 | A1 |
20050240317 | Kienzle-Lietl | Oct 2005 | A1 |
20050257748 | Kriesel et al. | Nov 2005 | A1 |
20050264867 | Cho et al. | Dec 2005 | A1 |
20060047704 | Gopalakrishnan | Mar 2006 | A1 |
20060078226 | Zhou | Apr 2006 | A1 |
20060108266 | Bowers et al. | May 2006 | A1 |
20060109105 | Varner et al. | May 2006 | A1 |
20060112023 | Horhann et al. | May 2006 | A1 |
20060122497 | Glossop | Jun 2006 | A1 |
20060151604 | Zhu et al. | Jul 2006 | A1 |
20060159307 | Anderson et al. | Jul 2006 | A1 |
20060159344 | Shao et al. | Jul 2006 | A1 |
20060197652 | Hild et al. | Sep 2006 | A1 |
20060232681 | Okada | Oct 2006 | A1 |
20060255150 | Longacre, Jr. | Nov 2006 | A1 |
20060269165 | Viswanathan | Nov 2006 | A1 |
20060276709 | Khamene et al. | Dec 2006 | A1 |
20060291719 | Ikeda et al. | Dec 2006 | A1 |
20070003154 | Sun et al. | Jan 2007 | A1 |
20070025612 | Iwasaki et al. | Feb 2007 | A1 |
20070031064 | Zhao et al. | Feb 2007 | A1 |
20070063048 | Havens et al. | Mar 2007 | A1 |
20070116357 | Dewaele | May 2007 | A1 |
20070127022 | Cohen et al. | Jun 2007 | A1 |
20070143082 | Degnan | Jun 2007 | A1 |
20070153293 | Gruhlke et al. | Jul 2007 | A1 |
20070162190 | Choubey | Jul 2007 | A1 |
20070165013 | Goulanian et al. | Jul 2007 | A1 |
20070171220 | Kriveshko | Jul 2007 | A1 |
20070177011 | Lewin et al. | Aug 2007 | A1 |
20070181685 | Zhu et al. | Aug 2007 | A1 |
20070184898 | Miller et al. | Aug 2007 | A1 |
20070229665 | Tobiason et al. | Oct 2007 | A1 |
20070237356 | Dwinell et al. | Oct 2007 | A1 |
20070291031 | Konev et al. | Dec 2007 | A1 |
20070299338 | Stevick et al. | Dec 2007 | A1 |
20080013793 | Hillis et al. | Jan 2008 | A1 |
20080035390 | Wurz | Feb 2008 | A1 |
20080047760 | Georgitsis | Feb 2008 | A1 |
20080050042 | Zhang et al. | Feb 2008 | A1 |
20080054062 | Gunning et al. | Mar 2008 | A1 |
20080056536 | Hildreth et al. | Mar 2008 | A1 |
20080062164 | Bassi et al. | Mar 2008 | A1 |
20080065509 | Williams | Mar 2008 | A1 |
20080077265 | Boyden et al. | Mar 2008 | A1 |
20080079955 | Storm | Apr 2008 | A1 |
20080156619 | Patel et al. | Jul 2008 | A1 |
20080164074 | Wurz | Jul 2008 | A1 |
20080185432 | Caballero et al. | Aug 2008 | A1 |
20080204476 | Montague | Aug 2008 | A1 |
20080212168 | Olmstead et al. | Sep 2008 | A1 |
20080247635 | Davis et al. | Oct 2008 | A1 |
20080273191 | Kim et al. | Nov 2008 | A1 |
20080273210 | Hilde | Nov 2008 | A1 |
20080278790 | Boesser et al. | Nov 2008 | A1 |
20090046296 | Kilpatrick et al. | Feb 2009 | A1 |
20090059004 | Bochicchio | Mar 2009 | A1 |
20090095047 | Patel et al. | Apr 2009 | A1 |
20090114818 | Casares et al. | May 2009 | A1 |
20090134221 | Zhu et al. | May 2009 | A1 |
20090161090 | Campbell et al. | Jun 2009 | A1 |
20090189858 | Lev et al. | Jul 2009 | A1 |
20090195790 | Zhu et al. | Aug 2009 | A1 |
20090225333 | Bendall et al. | Sep 2009 | A1 |
20090237411 | Gossweiler et al. | Sep 2009 | A1 |
20090268023 | Hsieh | Oct 2009 | A1 |
20090272724 | Gubler et al. | Nov 2009 | A1 |
20090273770 | Bauhahn et al. | Nov 2009 | A1 |
20090313948 | Buckley et al. | Dec 2009 | A1 |
20090318815 | Barnes et al. | Dec 2009 | A1 |
20090323084 | Dunn et al. | Dec 2009 | A1 |
20090323121 | Valkenburg et al. | Dec 2009 | A1 |
20100035637 | Varanasi et al. | Feb 2010 | A1 |
20100060604 | Zwart et al. | Mar 2010 | A1 |
20100091104 | Sprigle et al. | Apr 2010 | A1 |
20100113153 | Yen et al. | May 2010 | A1 |
20100118200 | Gelman et al. | May 2010 | A1 |
20100128109 | Banks | May 2010 | A1 |
20100161170 | Siris | Jun 2010 | A1 |
20100171740 | Andersen et al. | Jul 2010 | A1 |
20100172567 | Prokoski | Jul 2010 | A1 |
20100177076 | Essinger et al. | Jul 2010 | A1 |
20100177080 | Essinger et al. | Jul 2010 | A1 |
20100177707 | Essinger et al. | Jul 2010 | A1 |
20100177749 | Essinger et al. | Jul 2010 | A1 |
20100202702 | Benos et al. | Aug 2010 | A1 |
20100208039 | Stettner | Aug 2010 | A1 |
20100211355 | Horst et al. | Aug 2010 | A1 |
20100217678 | Goncalves | Aug 2010 | A1 |
20100220894 | Ackley et al. | Sep 2010 | A1 |
20100223276 | Al-Shameri et al. | Sep 2010 | A1 |
20100245850 | Lee et al. | Sep 2010 | A1 |
20100254611 | Arnz | Oct 2010 | A1 |
20100274728 | Kugelman | Oct 2010 | A1 |
20100284041 | Warnes | Nov 2010 | A1 |
20100290665 | Sones et al. | Nov 2010 | A1 |
20100303336 | Abraham et al. | Dec 2010 | A1 |
20100315413 | Izadi et al. | Dec 2010 | A1 |
20100321482 | Cleveland | Dec 2010 | A1 |
20110019155 | Daniel et al. | Jan 2011 | A1 |
20110040192 | Brenner et al. | Feb 2011 | A1 |
20110040407 | Lim et al. | Feb 2011 | A1 |
20110043609 | Choi et al. | Feb 2011 | A1 |
20110075936 | Deaver | Mar 2011 | A1 |
20110081044 | Peeper et al. | Apr 2011 | A1 |
20110099474 | Grossman et al. | Apr 2011 | A1 |
20110169999 | Grunow et al. | Jul 2011 | A1 |
20110180695 | Li et al. | Jul 2011 | A1 |
20110188054 | Petronius et al. | Aug 2011 | A1 |
20110188741 | Sones et al. | Aug 2011 | A1 |
20110202554 | Powilleit et al. | Aug 2011 | A1 |
20110234389 | Mellin | Sep 2011 | A1 |
20110235854 | Berger et al. | Sep 2011 | A1 |
20110243432 | Hirsch, Jr. | Oct 2011 | A1 |
20110249864 | Venkatesan et al. | Oct 2011 | A1 |
20110254840 | Halstead | Oct 2011 | A1 |
20110260965 | Kim et al. | Oct 2011 | A1 |
20110279916 | Brown et al. | Nov 2011 | A1 |
20110286007 | Pangrazio et al. | Nov 2011 | A1 |
20110288818 | Thierman et al. | Nov 2011 | A1 |
20110297590 | Ackley et al. | Dec 2011 | A1 |
20110301994 | Tieman | Dec 2011 | A1 |
20110303748 | Lemma et al. | Dec 2011 | A1 |
20110310227 | Konertz et al. | Dec 2011 | A1 |
20110310256 | Shishido | Dec 2011 | A1 |
20120014572 | Wong et al. | Jan 2012 | A1 |
20120024952 | Chen | Feb 2012 | A1 |
20120056982 | Katz et al. | Mar 2012 | A1 |
20120057345 | Kuchibhotla | Mar 2012 | A1 |
20120067955 | Rowe | Mar 2012 | A1 |
20120074227 | Ferren et al. | Mar 2012 | A1 |
20120081714 | Pangrazio et al. | Apr 2012 | A1 |
20120082383 | Kruglick | Apr 2012 | A1 |
20120111946 | Golant | May 2012 | A1 |
20120113223 | Hilliges et al. | May 2012 | A1 |
20120113250 | Farlotti et al. | May 2012 | A1 |
20120126000 | Kunzig et al. | May 2012 | A1 |
20120138685 | Qu et al. | Jun 2012 | A1 |
20120140300 | Freeman | Jun 2012 | A1 |
20120168509 | Nunnink et al. | Jul 2012 | A1 |
20120168512 | Kotlarsky et al. | Jul 2012 | A1 |
20120179665 | Baarman et al. | Jul 2012 | A1 |
20120185094 | Rosenstein et al. | Jul 2012 | A1 |
20120190386 | Anderson | Jul 2012 | A1 |
20120193423 | Samek | Aug 2012 | A1 |
20120197464 | Wang et al. | Aug 2012 | A1 |
20120203647 | Smith | Aug 2012 | A1 |
20120218436 | Rhoads et al. | Aug 2012 | A1 |
20120223141 | Good et al. | Sep 2012 | A1 |
20120224026 | Bayer et al. | Sep 2012 | A1 |
20120224060 | Gurevich et al. | Sep 2012 | A1 |
20120228382 | Havens et al. | Sep 2012 | A1 |
20120236212 | Itoh et al. | Sep 2012 | A1 |
20120236288 | Stanley | Sep 2012 | A1 |
20120236326 | Lee | Sep 2012 | A1 |
20120242852 | Hayward et al. | Sep 2012 | A1 |
20120248188 | Kearney | Oct 2012 | A1 |
20120256901 | Bendall | Oct 2012 | A1 |
20120262558 | Boger et al. | Oct 2012 | A1 |
20120280908 | Rhoads et al. | Nov 2012 | A1 |
20120282905 | Owen | Nov 2012 | A1 |
20120282911 | Davis et al. | Nov 2012 | A1 |
20120284012 | Rodriguez et al. | Nov 2012 | A1 |
20120284122 | Brandis | Nov 2012 | A1 |
20120284339 | Rodriguez | Nov 2012 | A1 |
20120284593 | Rodriguez | Nov 2012 | A1 |
20120293610 | Doepke et al. | Nov 2012 | A1 |
20120293625 | Schneider et al. | Nov 2012 | A1 |
20120294478 | Publicover et al. | Nov 2012 | A1 |
20120294549 | Doepke | Nov 2012 | A1 |
20120299961 | Ramkumar et al. | Nov 2012 | A1 |
20120300991 | Free | Nov 2012 | A1 |
20120313848 | Galor et al. | Dec 2012 | A1 |
20120314030 | Datta et al. | Dec 2012 | A1 |
20120314058 | Bendall et al. | Dec 2012 | A1 |
20120314258 | Moriya | Dec 2012 | A1 |
20120316820 | Nakazato et al. | Dec 2012 | A1 |
20130019278 | Sun et al. | Jan 2013 | A1 |
20130038881 | Pesach et al. | Feb 2013 | A1 |
20130043312 | Van Horn | Feb 2013 | A1 |
20130050426 | Sarmast et al. | Feb 2013 | A1 |
20130056285 | Meagher | Mar 2013 | A1 |
20130070322 | Fritz et al. | Mar 2013 | A1 |
20130075168 | Amundsen et al. | Mar 2013 | A1 |
20130076857 | Kurashige et al. | Mar 2013 | A1 |
20130093895 | Palmer et al. | Apr 2013 | A1 |
20130094069 | Lee et al. | Apr 2013 | A1 |
20130101158 | Lloyd et al. | Apr 2013 | A1 |
20130156267 | Muraoka et al. | Jun 2013 | A1 |
20130175341 | Kearney et al. | Jul 2013 | A1 |
20130175343 | Good | Jul 2013 | A1 |
20130200158 | Feng et al. | Aug 2013 | A1 |
20130208164 | Cazier et al. | Aug 2013 | A1 |
20130211790 | Loveland et al. | Aug 2013 | A1 |
20130222592 | Gieseke | Aug 2013 | A1 |
20130223673 | Davis et al. | Aug 2013 | A1 |
20130256418 | Havens et al. | Oct 2013 | A1 |
20130257744 | Daghigh et al. | Oct 2013 | A1 |
20130257759 | Daghigh | Oct 2013 | A1 |
20130270346 | Xian et al. | Oct 2013 | A1 |
20130278425 | Cunningham et al. | Oct 2013 | A1 |
20130287258 | Kearney | Oct 2013 | A1 |
20130291998 | Konnerth | Nov 2013 | A1 |
20130292475 | Kotlarsky et al. | Nov 2013 | A1 |
20130292477 | Hennick et al. | Nov 2013 | A1 |
20130293539 | Hunt et al. | Nov 2013 | A1 |
20130293540 | Laffargue et al. | Nov 2013 | A1 |
20130306728 | Thuries et al. | Nov 2013 | A1 |
20130306730 | Brady et al. | Nov 2013 | A1 |
20130306731 | Pedrao | Nov 2013 | A1 |
20130307964 | Bremer et al. | Nov 2013 | A1 |
20130308625 | Park et al. | Nov 2013 | A1 |
20130313324 | Koziol et al. | Nov 2013 | A1 |
20130313325 | Wilz et al. | Nov 2013 | A1 |
20130317642 | Asaria et al. | Nov 2013 | A1 |
20130329012 | Bartos et al. | Dec 2013 | A1 |
20130329013 | Metois et al. | Dec 2013 | A1 |
20130341399 | Xian et al. | Dec 2013 | A1 |
20130342343 | Harring et al. | Dec 2013 | A1 |
20130342717 | Havens et al. | Dec 2013 | A1 |
20140001258 | Chan et al. | Jan 2014 | A1 |
20140001267 | Giordano et al. | Jan 2014 | A1 |
20140002828 | Laffargue et al. | Jan 2014 | A1 |
20140008430 | Soule et al. | Jan 2014 | A1 |
20140008439 | Wang | Jan 2014 | A1 |
20140009586 | McNamer et al. | Jan 2014 | A1 |
20140019005 | Lee et al. | Jan 2014 | A1 |
20140021259 | Moed et al. | Jan 2014 | A1 |
20140025584 | Liu et al. | Jan 2014 | A1 |
20140027518 | Edmonds et al. | Jan 2014 | A1 |
20140031665 | Pinto et al. | Jan 2014 | A1 |
20140034731 | Gao et al. | Feb 2014 | A1 |
20140034734 | Sauerwein, Jr. | Feb 2014 | A1 |
20140036848 | Pease et al. | Feb 2014 | A1 |
20140039674 | Motoyama et al. | Feb 2014 | A1 |
20140039693 | Havens et al. | Feb 2014 | A1 |
20140042814 | Kather et al. | Feb 2014 | A1 |
20140049120 | Kohtz et al. | Feb 2014 | A1 |
20140049635 | Laffargue et al. | Feb 2014 | A1 |
20140058612 | Wong et al. | Feb 2014 | A1 |
20140061305 | Nahill et al. | Mar 2014 | A1 |
20140061306 | Wu et al. | Mar 2014 | A1 |
20140062709 | Hyer et al. | Mar 2014 | A1 |
20140063289 | Hussey et al. | Mar 2014 | A1 |
20140066136 | Sauerwein et al. | Mar 2014 | A1 |
20140067104 | Osterhout | Mar 2014 | A1 |
20140067692 | Ye et al. | Mar 2014 | A1 |
20140070005 | Nahill et al. | Mar 2014 | A1 |
20140071430 | Hansen et al. | Mar 2014 | A1 |
20140071840 | Venancio | Mar 2014 | A1 |
20140074746 | Wang | Mar 2014 | A1 |
20140075846 | Woodburn | Mar 2014 | A1 |
20140076974 | Havens et al. | Mar 2014 | A1 |
20140078341 | Havens et al. | Mar 2014 | A1 |
20140078342 | Li et al. | Mar 2014 | A1 |
20140078345 | Showering | Mar 2014 | A1 |
20140079297 | Tadayon et al. | Mar 2014 | A1 |
20140084068 | Gillet et al. | Mar 2014 | A1 |
20140097238 | Ghazizadeh | Apr 2014 | A1 |
20140097249 | Gomez et al. | Apr 2014 | A1 |
20140097252 | He et al. | Apr 2014 | A1 |
20140098091 | Hori | Apr 2014 | A1 |
20140098243 | Ghazizadeh | Apr 2014 | A1 |
20140098244 | Ghazizadeh | Apr 2014 | A1 |
20140098792 | Wang et al. | Apr 2014 | A1 |
20140100774 | Showering | Apr 2014 | A1 |
20140100813 | Showering | Apr 2014 | A1 |
20140103115 | Meier et al. | Apr 2014 | A1 |
20140104413 | McCloskey et al. | Apr 2014 | A1 |
20140104414 | McCloskey et al. | Apr 2014 | A1 |
20140104416 | Giordano et al. | Apr 2014 | A1 |
20140104451 | Todeschini et al. | Apr 2014 | A1 |
20140104664 | Lee et al. | Apr 2014 | A1 |
20140106594 | Skvoretz | Apr 2014 | A1 |
20140106725 | Sauerwein, Jr. | Apr 2014 | A1 |
20140108010 | Maltseff et al. | Apr 2014 | A1 |
20140108402 | Gomez et al. | Apr 2014 | A1 |
20140108682 | Caballero | Apr 2014 | A1 |
20140110485 | Toa et al. | Apr 2014 | A1 |
20140114530 | Fitch et al. | Apr 2014 | A1 |
20140121438 | Long et al. | May 2014 | A1 |
20140121445 | Fontenot et al. | May 2014 | A1 |
20140124577 | Wang et al. | May 2014 | A1 |
20140124579 | Ding | May 2014 | A1 |
20140125577 | Hoang | May 2014 | A1 |
20140125842 | Winegar | May 2014 | A1 |
20140125853 | Wang | May 2014 | A1 |
20140125999 | Longacre et al. | May 2014 | A1 |
20140129378 | Richardson | May 2014 | A1 |
20140131441 | Nahill et al. | May 2014 | A1 |
20140131443 | Smith | May 2014 | A1 |
20140131444 | Wang | May 2014 | A1 |
20140131448 | Xian et al. | May 2014 | A1 |
20140133379 | Wang et al. | May 2014 | A1 |
20140136208 | Maltseff et al. | May 2014 | A1 |
20140139654 | Takahashi | May 2014 | A1 |
20140140585 | Wang | May 2014 | A1 |
20140142398 | Patil et al. | May 2014 | A1 |
20140151453 | Meier et al. | Jun 2014 | A1 |
20140152882 | Samek et al. | Jun 2014 | A1 |
20140152975 | Ko | Jun 2014 | A1 |
20140158468 | Adami | Jun 2014 | A1 |
20140158770 | Sevier et al. | Jun 2014 | A1 |
20140159869 | Zumsteg et al. | Jun 2014 | A1 |
20140160329 | Ren et al. | Jun 2014 | A1 |
20140166755 | Liu et al. | Jun 2014 | A1 |
20140166757 | Smith | Jun 2014 | A1 |
20140166759 | Liu et al. | Jun 2014 | A1 |
20140166760 | Meier et al. | Jun 2014 | A1 |
20140166761 | Todeschini et al. | Jun 2014 | A1 |
20140168380 | Heidemann et al. | Jun 2014 | A1 |
20140168787 | Wang et al. | Jun 2014 | A1 |
20140175165 | Havens et al. | Jun 2014 | A1 |
20140175169 | Kosecki et al. | Jun 2014 | A1 |
20140175172 | Jovanovski et al. | Jun 2014 | A1 |
20140175174 | Barber et al. | Jun 2014 | A1 |
20140177931 | Kocherscheidt et al. | Jun 2014 | A1 |
20140191644 | Chaney | Jul 2014 | A1 |
20140191913 | Ge et al. | Jul 2014 | A1 |
20140192187 | Atwell et al. | Jul 2014 | A1 |
20140192551 | Masaki | Jul 2014 | A1 |
20140197238 | Liu et al. | Jul 2014 | A1 |
20140197239 | Havens et al. | Jul 2014 | A1 |
20140197304 | Feng et al. | Jul 2014 | A1 |
20140201126 | Zadeh et al. | Jul 2014 | A1 |
20140203087 | Smith et al. | Jul 2014 | A1 |
20140204268 | Grunow et al. | Jul 2014 | A1 |
20140205150 | Ogawa | Jul 2014 | A1 |
20140214631 | Hansen | Jul 2014 | A1 |
20140217166 | Berthiaume et al. | Aug 2014 | A1 |
20140217180 | Liu | Aug 2014 | A1 |
20140225918 | Mittal et al. | Aug 2014 | A1 |
20140225985 | Klusza et al. | Aug 2014 | A1 |
20140231500 | Ehrhart et al. | Aug 2014 | A1 |
20140232930 | Anderson | Aug 2014 | A1 |
20140240454 | Hirata et al. | Aug 2014 | A1 |
20140247279 | Nicholas et al. | Sep 2014 | A1 |
20140247280 | Nicholas et al. | Sep 2014 | A1 |
20140247315 | Marty et al. | Sep 2014 | A1 |
20140263493 | Amurgis et al. | Sep 2014 | A1 |
20140263645 | Smith et al. | Sep 2014 | A1 |
20140267609 | Laffargue | Sep 2014 | A1 |
20140268093 | Tohme et al. | Sep 2014 | A1 |
20140270196 | Braho et al. | Sep 2014 | A1 |
20140270229 | Braho | Sep 2014 | A1 |
20140270361 | Amma et al. | Sep 2014 | A1 |
20140278387 | Digregorio | Sep 2014 | A1 |
20140282210 | Bianconi | Sep 2014 | A1 |
20140284384 | Lu et al. | Sep 2014 | A1 |
20140288933 | Braho et al. | Sep 2014 | A1 |
20140297058 | Barker et al. | Oct 2014 | A1 |
20140299665 | Barber et al. | Oct 2014 | A1 |
20140306833 | Ricci | Oct 2014 | A1 |
20140307855 | Withagen et al. | Oct 2014 | A1 |
20140312121 | Lu et al. | Oct 2014 | A1 |
20140313527 | Askan | Oct 2014 | A1 |
20140319219 | Liu et al. | Oct 2014 | A1 |
20140319220 | Coyle | Oct 2014 | A1 |
20140319221 | Oberpriller et al. | Oct 2014 | A1 |
20140320408 | Zagorsek et al. | Oct 2014 | A1 |
20140320605 | Johnson | Oct 2014 | A1 |
20140326787 | Barten | Nov 2014 | A1 |
20140332590 | Wang et al. | Nov 2014 | A1 |
20140344943 | Todeschini et al. | Nov 2014 | A1 |
20140346233 | Liu et al. | Nov 2014 | A1 |
20140350710 | Gopalakrishnan et al. | Nov 2014 | A1 |
20140351317 | Smith et al. | Nov 2014 | A1 |
20140353373 | Van et al. | Dec 2014 | A1 |
20140361073 | Qu et al. | Dec 2014 | A1 |
20140361082 | Xian et al. | Dec 2014 | A1 |
20140362184 | Jovanovski et al. | Dec 2014 | A1 |
20140363015 | Braho | Dec 2014 | A1 |
20140369511 | Sheerin et al. | Dec 2014 | A1 |
20140374483 | Lu | Dec 2014 | A1 |
20140374485 | Xian et al. | Dec 2014 | A1 |
20140379613 | Nishitani et al. | Dec 2014 | A1 |
20150001301 | Ouyang | Jan 2015 | A1 |
20150001304 | Todeschini | Jan 2015 | A1 |
20150003673 | Fletcher | Jan 2015 | A1 |
20150009100 | Haneda et al. | Jan 2015 | A1 |
20150009301 | Ribnick et al. | Jan 2015 | A1 |
20150009338 | Laffargue et al. | Jan 2015 | A1 |
20150009610 | London et al. | Jan 2015 | A1 |
20150014416 | Kotlarsky et al. | Jan 2015 | A1 |
20150016712 | Rhoads et al. | Jan 2015 | A1 |
20150021397 | Rueblinger et al. | Jan 2015 | A1 |
20150028102 | Ren et al. | Jan 2015 | A1 |
20150028103 | Jiang | Jan 2015 | A1 |
20150028104 | Ma et al. | Jan 2015 | A1 |
20150029002 | Yeakley et al. | Jan 2015 | A1 |
20150032709 | Maloy et al. | Jan 2015 | A1 |
20150039309 | Braho et al. | Feb 2015 | A1 |
20150040378 | Saber et al. | Feb 2015 | A1 |
20150048168 | Fritz et al. | Feb 2015 | A1 |
20150049347 | Laffargue et al. | Feb 2015 | A1 |
20150051992 | Smith | Feb 2015 | A1 |
20150053766 | Havens et al. | Feb 2015 | A1 |
20150053768 | Wang et al. | Feb 2015 | A1 |
20150053769 | Thuries et al. | Feb 2015 | A1 |
20150062366 | Liu et al. | Mar 2015 | A1 |
20150062369 | Gehring et al. | Mar 2015 | A1 |
20150063215 | Wang | Mar 2015 | A1 |
20150063676 | Lloyd et al. | Mar 2015 | A1 |
20150069130 | Gannon | Mar 2015 | A1 |
20150070158 | Hayasaka | Mar 2015 | A1 |
20150070489 | Hudman et al. | Mar 2015 | A1 |
20150071818 | Scheuren et al. | Mar 2015 | A1 |
20150083800 | Li et al. | Mar 2015 | A1 |
20150086114 | Todeschini | Mar 2015 | A1 |
20150088522 | Hendrickson et al. | Mar 2015 | A1 |
20150096872 | Woodburn | Apr 2015 | A1 |
20150099557 | Pettinelli et al. | Apr 2015 | A1 |
20150100196 | Hollifield | Apr 2015 | A1 |
20150102109 | Huck | Apr 2015 | A1 |
20150115035 | Meier et al. | Apr 2015 | A1 |
20150116498 | Vartiainen et al. | Apr 2015 | A1 |
20150117749 | Smith et al. | Apr 2015 | A1 |
20150127791 | Kosecki et al. | May 2015 | A1 |
20150128116 | Chen et al. | May 2015 | A1 |
20150129659 | Feng et al. | May 2015 | A1 |
20150130928 | Maynard et al. | May 2015 | A1 |
20150133047 | Smith et al. | May 2015 | A1 |
20150134470 | Hejl et al. | May 2015 | A1 |
20150136851 | Harding et al. | May 2015 | A1 |
20150136854 | Lu et al. | May 2015 | A1 |
20150142492 | Kumar | May 2015 | A1 |
20150144692 | Hejl | May 2015 | A1 |
20150144698 | Teng et al. | May 2015 | A1 |
20150144701 | Xian et al. | May 2015 | A1 |
20150149946 | Benos et al. | May 2015 | A1 |
20150161429 | Xian | Jun 2015 | A1 |
20150163474 | You et al. | Jun 2015 | A1 |
20150169925 | Chen et al. | Jun 2015 | A1 |
20150169929 | Williams et al. | Jun 2015 | A1 |
20150186703 | Chen et al. | Jul 2015 | A1 |
20150193644 | Kearney et al. | Jul 2015 | A1 |
20150193645 | Colavito et al. | Jul 2015 | A1 |
20150199957 | Funyak et al. | Jul 2015 | A1 |
20150201181 | Moore et al. | Jul 2015 | A1 |
20150204662 | Kobayashi et al. | Jul 2015 | A1 |
20150204671 | Showering | Jul 2015 | A1 |
20150213590 | Brown et al. | Jul 2015 | A1 |
20150213647 | Laffargue et al. | Jul 2015 | A1 |
20150219748 | Hyatt et al. | Aug 2015 | A1 |
20150229838 | Hakim et al. | Aug 2015 | A1 |
20150243030 | Pfeiffer | Aug 2015 | A1 |
20150248578 | Utsumi | Sep 2015 | A1 |
20150253469 | Le et al. | Sep 2015 | A1 |
20150260830 | Ghosh et al. | Sep 2015 | A1 |
20150276379 | Ni et al. | Oct 2015 | A1 |
20150308816 | Laffargue | Oct 2015 | A1 |
20150316368 | Moench et al. | Nov 2015 | A1 |
20150325036 | Lee | Nov 2015 | A1 |
20150332075 | Burch | Nov 2015 | A1 |
20150332463 | Galera et al. | Nov 2015 | A1 |
20150355470 | Herschbach | Dec 2015 | A1 |
20160048725 | Holz et al. | Feb 2016 | A1 |
20160063429 | Varley et al. | Mar 2016 | A1 |
20160065912 | Peterson | Mar 2016 | A1 |
20160090283 | Svensson et al. | Mar 2016 | A1 |
20160090284 | Svensson et al. | Mar 2016 | A1 |
20160094016 | Beach et al. | Mar 2016 | A1 |
20160138247 | Conway et al. | May 2016 | A1 |
20160138248 | Conway et al. | May 2016 | A1 |
20160138249 | Conway et al. | May 2016 | A1 |
20160147408 | Bevis et al. | May 2016 | A1 |
20160164261 | Warren | Jun 2016 | A1 |
20160169665 | Deschenes et al. | Jun 2016 | A1 |
20160178915 | Mor et al. | Jun 2016 | A1 |
20160187186 | Coleman et al. | Jun 2016 | A1 |
20160187187 | Coleman et al. | Jun 2016 | A1 |
20160187210 | Coleman et al. | Jun 2016 | A1 |
20160191801 | Sivan | Jun 2016 | A1 |
20160202478 | Masson et al. | Jul 2016 | A1 |
20160203641 | Bostick et al. | Jul 2016 | A1 |
20160210780 | Paulovich et al. | Jul 2016 | A1 |
20160223474 | Tang et al. | Aug 2016 | A1 |
20160328854 | Kimura | Nov 2016 | A1 |
20170103545 | Holz | Apr 2017 | A1 |
20170115490 | Hsieh et al. | Apr 2017 | A1 |
20170115497 | Chen et al. | Apr 2017 | A1 |
20170116462 | Ogasawara | Apr 2017 | A1 |
20170121158 | Wong et al. | May 2017 | A1 |
20170132806 | Balachandreswaran | May 2017 | A1 |
20170139213 | Schmidtlin | May 2017 | A1 |
20170148250 | Angermayer et al. | May 2017 | A1 |
20170182942 | Hardy et al. | Jun 2017 | A1 |
20170200296 | Jones et al. | Jul 2017 | A1 |
20170309108 | Sadovsky et al. | Oct 2017 | A1 |
20170336870 | Everett et al. | Nov 2017 | A1 |
Number | Date | Country |
---|---|---|
2004212587 | Apr 2005 | AU |
3335760 | Apr 1985 | DE |
10210813 | Oct 2003 | DE |
102007037282 | Mar 2008 | DE |
1443312 | Aug 2004 | EP |
1112483 | May 2006 | EP |
1232480 | May 2006 | EP |
2216634 | Aug 2010 | EP |
2286932 | Feb 2011 | EP |
2372648 | Oct 2011 | EP |
2381421 | Oct 2011 | EP |
2533009 | Dec 2012 | EP |
2722656 | Apr 2014 | EP |
2779027 | Sep 2014 | EP |
2833323 | Feb 2015 | EP |
2843590 | Mar 2015 | EP |
2845170 | Mar 2015 | EP |
2966595 | Jan 2016 | EP |
3006893 | Apr 2016 | EP |
3007096 | Apr 2016 | EP |
3012601 | Apr 2016 | EP |
2525053 | Oct 2015 | GB |
2531928 | May 2016 | GB |
04-129902 | Apr 1992 | JP |
2006-096457 | Apr 2006 | JP |
2007-084162 | Apr 2007 | JP |
2008-210276 | Sep 2008 | JP |
2014-210646 | Nov 2014 | JP |
2015-174705 | Oct 2015 | JP |
10-2011-0013200 | Feb 2011 | KR |
10-2011-0117020 | Oct 2011 | KR |
10-2012-0028109 | Mar 2012 | KR |
9640452 | Dec 1996 | WO |
0077726 | Dec 2000 | WO |
2006095110 | Sep 2006 | WO |
2007015059 | Feb 2007 | WO |
2011017241 | Feb 2011 | WO |
2012175731 | Dec 2012 | WO |
2013021157 | Feb 2013 | WO |
2013033442 | Mar 2013 | WO |
2013163789 | Nov 2013 | WO |
2013166368 | Nov 2013 | WO |
2013173985 | Nov 2013 | WO |
2013184340 | Dec 2013 | WO |
2014019130 | Feb 2014 | WO |
2014110495 | Jul 2014 | WO |
2014149702 | Sep 2014 | WO |
2014151746 | Sep 2014 | WO |
20151006865 | Jan 2015 | WO |
20161020038 | Feb 2016 | WO |
20161061699 | Apr 2016 | WO |
20161085682 | Jun 2016 | WO |
Entry |
---|
Hetzel, Gunter et al.; “3D Object Recognition from Range Images using Local Feature Histograms,”, Proceedings 2001 IEEE Conference on Computer Vision and Pattern Recognition. CVPR 2001. Kauai, Hawaii, Dec. 8-14, 2001; pp. 394-399, XP010584149, ISBN: 978-0-7695-1272-3. |
Hahnel et al., “Mapping and Localization with RFID Technology,” IEEE International Conference on Robotics and Automation, vol. 1, Apr. 26-May 1, 2004, pp. 1015-1020. |
Gupta, Alok; Range Image Segmentation for 3-D Objects Recognition, May 1988, Technical Reports (CIS), Paper 736, University of Pennsylvania Department of Computer and Information Science, retrieved from http://repository.upenn.edu/cis_reports/736, Accessed May 31, 2015, 157 pages. |
Great Britain Search Report for related Application On. GB1517843.7, dated Feb. 23, 2016; 8 pages. |
Great Britain Combined Search and Examination Report in related Application GB1517842.9, dated Apr. 8, 2016, 8 ages [References previously cited]. |
Grabowski, Ralph; “New Commands in AutoCADS 2010: Part 11 Smoothing 3D Mesh Objects” Dated 2011 (per examiner who cited reference), 6 pages, [Examiner Cited Art in Office Action dated Jan. 20, 2017 in related Application.]. |
Fukaya et al., “Characteristics of Speckle Random Pattern and Its Applications”, pp. 317-327, Nouv. Rev. Optique, t.6, n.6. (1975) {in Feb. 9, 2017 Final Office Action in related matter: downloaded Mar. 2, 2017 from http://iopscience.iop.org}. |
First Office Action in related CN Application No. 201510860188.1 dated Jan. 18, 2019, pp. 1-14 [All references previously cited]. |
Extended European Search report in related EP Application No. 17190323.0 dated Jan. 19, 2018; 6 pages [Only new art cited herein]. |
Extended European Search report in related EP Application No. 17189496.7 dated Dec. 5, 2017; 9 pages. |
Extended European Search Report in related EP Application No. 16175410.0, dated Dec. 13, 2016, 5 pages. |
Extended European search report in related EP Application 16199707.7, dated Apr. 10, 2017, 15 pages. |
Extended European Search Report in counterpart European Application No. 15182675.7 dated Dec. 4, 2015, pp. 1-10 references previously cited. |
Examination Report in related UK Application No. GB1517842.9 dated Mar. 8, 2019, pp. 1-4. |
Examination Report in related UK Application No. GB1517842.9 dated Dec. 21, 2018, pp. 1-7 [All references previously cited.]. |
Examination Report in related GB Application No. GB1517843.7, dated Jan. 19, 2018, 4 pages [Only new art cited herein]. |
Examination Report in related EP Application No. 15190315, dated Jan. 26, 2018, 6 pages [Only new art cited herein]. |
Examination Report in related EP Application No. 13785171.3 dated Apr. 2, 2019, pp. 1-5. |
Examination Report in related EP Application No. 13193181.8 dated Mar. 20, 2019, pp. 1-4. |
European Search Report in related EP Application No. 17175357.7, dated Aug. 17, 2017, pp. 1-7 [No new art to be cited]. |
European Search Report in related EP Application No. 15190315.0, dated Apr. 1, 2016, 7 pages [Commonly owned Reference 2014/0104416 has been previously cited]. |
European Search Report from related EP Application No. 16168216.6, dated Oct. 20, 2016, 8 pages [U.S. Publication 2014/0104413 has been previously cited]. |
European Search Report for Related EP Application No. 15189214.8, dated Mar. 3, 2016, 9 pages. |
European Search Report for related EP Application No. 15188440.0, dated Mar. 8, 2016, 8 pages. |
European Search Report for related Application EP 15190249.1, dated Mar. 22, 2016, 7 pages. |
European Search Report for Application No. EP13186043 (now EP2722656, dated Apr. 23, 2014); 7 pages. |
European Patent Search Report for Application No. 14157971.4-1906, dated Jun. 30, 2014, 6 pages. |
European Patent Office Action for Application No. 14157971.4-1906, dated Jul. 16, 2014, 5 pages. |
European Office Action for Application EP13186043, dated Jun. 12, 2014 (now EP2722656, dated Apr. 23, 2014); 6 pages. |
European Extended Search Report in related EP Application No. 17201794.9, dated Mar. 16, 2018, 10 pages [Only new art cited herein]. |
European Extended Search Report in related EP Application No. 16190017.0, dated Jan. 4, 2017, 6 pages. |
European Extended Search Report in related EP Application No. 15190306.9, dated Sep. 9, 2016, 15 pages [only new references are cited; remaining references were cited with partial search report in same application dated May 6, 2016]. |
European Extended Search Report in related EP Application 18184864.9, dated Oct. 30, 2018, 7 pages. |
European Extended Search Report in related EP Application 17205030.4, dated Mar. 22, 2018, 8 pages. |
European Extended Search Report in related EP Application 16190833.0, dated Mar. 9, 2017, 8 pages [only new art has been cited; US Publication 2014/0034731 was previously cited]. |
European Extended Search Report in related EP Application 13785171.3, dated Sep. 19, 2016, 8 pages. |
European Extended Search Report in related Application No. 17207882.6 dated Apr. 26, 2018, 10 pages. |
European Examination report in related EP Application No. 14181437.6, dated Feb. 8, 2017, 5 pages [References have been previously cited]. |
European Exam Report in related EP Application No. 16168216.6, dated Feb. 27, 2017, 5 pages, [References have been previously cited; WO2011/017241 and US 2014/0104413]. |
European Exam Report in related EP Application No. 16152477.2, dated Jun. 20, 2017, 4 pages [No art to be cited]. |
European Exam Report in related EP Application No. 15188440.0, dated Apr. 21, 2017, 4 pages [No new art to cite]. |
European Exam Report in related EP Application No. 15176943.7, dated Apr. 12, 2017, 6 pages [Art previously cited in this matter]. |
European Exam Report in related EP Application 16172995.9, dated Mar. 15, 2018, 7 pages (Only new art cited herein). |
European Exam Report in related EP Application 16172995.9, dated Jul. 6, 2017, 9 pages [No new art to be cited]. |
EP Search and Written Opinion Report in related matter EP Application No. 14181437.6, dated Mar. 26, 2015, 7 pages. |
El-Hakim et al., “Multicamera vision-based approach to flexible feature measurement for inspection and reverse engineering”, published in Optical Engineering, Society of Photo-Optical Instrumentation Engineers, vol. 32, No. 9, Sep. 1, 1993, 15 pages. |
El-Hakim et al., “A Knowledge-based Edge/Object Measurement Technique”, Retrieved from the Internet: URL: https://www.researchgate.net/profile/Sabry_El-Hakim/publication/44075058_A_Knowledge_Based_EdgeObject_Measurement_Technique/links/00b4953b5faa7d3304000000.pdf [retrieved on Jul. 15, 2016] dated Jan. 1, 1993, 9 pages. |
EKSMA Optics, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, downloaded from http://eksmaoptics.com/optical-systems/f-theta-lenses/f-theta-lens-for-1064-nm/, 2 pages. |
Drummond, Tom; Roberto Cipolla, Real-Time Visual Tracking of Complex Structures, Jul. 2002, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, No. 7; 15 pages. |
Dimensioning—Wikipedia, the Free Encyclopedia, URL=http://en.wikipedia.org/wiki/Dimensioning, download date Aug. 1, 2008, 1 page. |
U.S. Patent Application for High Dynamic-Range Indicia Reading System, filed Dec. 10, 2013 (Xian); 28 pages, U.S. Appl. No. 14/101,965. |
U.S. Patent Application for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages, U.S. Appl. No. 14/705,012. |
U.S. Patent Application for Handheld Dimensioning System With Measurement-Conformance Feedback filed Oct. 21, 2014 (Ackley et al.); 36 pages, U.S. Appl. No. 14/519,249. |
U.S. Patent Application for Handheld Dimensioning System With Feedback filed Oct. 21, 2014 (Laffargue et al.); 39 pages, U.S. Appl. No. 14/519,195. |
U.S. Patent Application for Handheld Dimensioner With Data-Quality Indication filed Oct. 21, 2014 (Laffargue et al.); 36 pages, U.S. Appl. No. 14/519,233. |
U.S. Patent Application for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.); 36 pages, U.S. Appl. No. 14/231,898. |
U.S. Patent Application for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages, U.S. Appl. No. 14/715,916. |
U.S. Patent Application for Encoded Information Reading Terminal With Wireless Path Selection Capability, filed Aug. 15, 2014 (Wang et al.); 40 pages, U.S. Appl. No. 14/460,829. |
U.S. Patent Application for Dynamic Diagnostic Indicator Generation filed Dec. 17, 2014 (Goldsmith); 43 pages, U.S. Appl. No. 14/573,022. |
U.S. Patent Application for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages, U.S. Appl. No. 14/747,490. |
U.S. Patent Application for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering); 31 pages, U.S. Appl. No. 14/257,364. |
U.S. Patent Application for Directing an Inspector Through an Inspection filed Nov. 3, 2014 (Miller et al.); 53 pages, U.S. Appl. No. 14/531,154. |
U.S. Patent Application for Dimensioning System, filed Oct. 16, 2013 (Fletcher); 26 pages, U.S. Appl. No. 14/055,234. |
U.S. Patent Application for Dimensioning System With Multipath Interference Mitigation filed Oct. 21, 2014 (Thuries et al.); 30 pages, U.S. Appl. No. 14/519,179. |
U.S. Patent Application for Dimensioning System With Guided Alignment, filed Aug. 6, 2014 (Li et al.); 31 pages, U.S. Appl. No. 14/453,019. |
U.S. Patent Application for Device, System, and Method for Determining the Status of Checkout Lanes filed Feb. 23, 2015 (Todeschini); 37 pages, U.S. Appl. No. 14/628,708. |
U.S. Patent Application for Device Management Proxy for Secure Devices filed Apr. 1, 2015 (Yeakley et al.); 50 pages, U.S. Appl. No. 14/676,327. |
U.S. Patent Application for Device for Supporting an Electronic Tool on a User's Hand filed Feb. 5, 2015 (Oberpriller et al.); 33 pages, U.S. Appl. No. 14/614,706. |
U.S. Patent Application for Design Pattern for Secure Store filed Mar. 9, 2015 (Zhu et al.); 23 pages, U.S. Appl. No. 14/405,278. |
U.S. Patent Application for Customer Facing Imaging Systems and Methods for Obtaining Images filed Jul. 10, 2014 (Oberpriller et al.); 39 pages, U.S. Appl. No. 14/327,722. |
U.S. Patent Application for Cordless Indicia Reader With a Multifunction Coil for Wireless Charging and EAS Deactivation, filed Jun. 24, 2015 (Xie et al.); 34 pages, U.S. Appl. No. 14/748,446. |
U.S. Patent Application for Concatenated Expected Responses for Speech Recognition filed Nov. 7, 2014 (Braho et al.); 51 pages, U.S. Appl. No. 14/535,764. |
U.S. Patent Application for Cell Phone Reading Mode Using Image Timer filed Jul. 11, 2014 (Coyle); 22 pages, U.S. Appl. No. 14/329,303. |
U.S. Patent Application for Cargo Apportionment Techniques filed Feb. 5, 2015 (Morton et al.); 56 pages, U.S. Appl. No. 14/614,796. |
U.S. Patent Application for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages, U.S. Appl. No. 14/740,373. |
U.S. Patent Application for Barcode Scanning System Using Wearable Device With Embedded Camera filed Nov. 5, 2014 (Todeschini); 29 pages, U.S. Appl. No. 14/533,319. |
U.S. Patent Application for Barcode Reader With Security Features filed Oct. 31, 2014 (Todeschini et al.); 32 pages, U.S. Appl. No. 14/529,857. |
U.S. Patent Application for Autofocusing Optical Imaging Device filed Jun. 20, 2014 (Koziol et al.); 28 pages, U.S. Appl. No. 14/310,226. |
U.S. Patent Application for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (Ackley et al.); 39 pages, U.S. Appl. No. 14/264,173. |
U.S. Patent Application for Auto-Contrast Viewfinder for an Indicia Reader filed Dec. 12, 2014 (Todeschini); 29 pages, U.S. Appl. No. 14/568,305. |
U.S. Patent Application for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages, U.S. Appl. No. 14/715,672. |
U.S. Patent Application for Augmented-Reality Signature Capture, filed Sep. 24, 2013 (Todeschini); 33 pages, U.S. Appl. No. 14/035,474. |
U.S. Patent Application for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages, U.S. Appl. No. 14/707,123. |
U.S. Patent Application for Apparatus for Displaying Bar Codes From Light Emitting Display Surfaces filed Aug. 15, 2014 (Van Horn et al.); 40 pages, U.S. Appl. No. 14/460,387. |
U.S. Patent Application for an Optical Redirection Adapter, filed Feb. 20, 2013 (Anderson); 26 pages, U.S. Appl. No. 13/771,508. |
U.S. Patent Application for an Optical Imager and Method for Correlating a Medication Package With a Patient, filed Jul. 25, 2014 (Ellis); 26 pages, U.S. Appl. No. 14/340,716. |
U.S. Patent Application for an Encoded Information Reading Terminal Including HTTP Server filed Aug. 4, 2014, (Lu); 30 pages, U.S. Appl. No. 14/376,472. |
U.S. Patent Application for an Electronic Device Case, filed Jul. 2, 2013 (London et al.); 47 pages, U.S. Appl. No. 13/933,415. |
U.S. Patent Application for an Axially Reinforced Flexible Scan Element, filed Jul. 25, 2014 (Reublinger et al.); 41 pages, U.S. Appl. No. 14/340,627. |
U.S. Patent Application for Aimer for Barcode Scanning filed Mar. 31, 2015 (Bidwell); 36 pages, U.S. Appl. No. 14/674,329. |
U.S. Patent Application for Adaptable Interface for a Mobile Computing Device filed Oct. 31, 2014 (Schoon et al.); 36 pages, U.S. Appl. No. 14/529,563. |
U.S. Patent Application for a System for Providing a Continuous Communication Link With a Symbol Reading Device, filed May 24, 2013 (Smith et al.); 24 pages, U.S. Appl. No. 13/902,242. |
U.S. Patent Application for a System and Method for Selectively Reading Code Symbols, filed Jul. 22, 2013 (Rueblinger et al.); 29 pages, U.S. Appl. No. 13/947,296. |
U.S. Patent Application for a System and Method for Reading Code Symbols Using a Variable Field of View, filed Jun. 20, 2013 (Xian et al.); 23 pages, U.S. Appl. No. 13/922,339. |
U.S. Patent Application for a System and Method for Reading Code Symbols at Long Range Using Source Power Control, filed Jun. 7, 2013 (Xian et al.); 24 pages, U.S. Appl. No. 13/912,702. |
U.S. Patent Application for a System and Method for Indicia Verification, filed Jul. 18, 2014 (Hejl); 38 pages, U.S. Appl. No. 14/334,934. |
U.S. Patent Application for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Hollifield); 29 pages, U.S. Appl. No. 13/902,110. |
U.S. Patent Application for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Chamberlin); 23 pages, U.S. Appl. No. 13/902,144. |
U.S. Patent Application for a System and Method for Capturing and Preserving Vehicle Event Data, filed Mar. 28, 2013 (Barker et al.); 20 pages, U.S. Appl. No. 13/852,097. |
U.S. Patent Application for a Mobile-Phone Adapter for Electronic Transactions, filed Jul. 10, 2014 (Hejl); 25 pages, U.S. Appl. No. 14/327,827. |
Zhang, Zhaoxiang; Tieniu Tan, Kaiqi Huang, Yunhong Wang; Three-Dimensional Deformable-Model-based Localization and Recognition of Road Vehicles; IEEE Transactions on Image Processing, vol. 21, No. 1, Jan. 2012, 13 pages. |
YUV to RGB Conversion, downloaded from http://www.fourcc.org/fccyvrgb.php on Jun. 29, 2012; 5 pages. |
YUV Pixel Format, downloaded from http://www.fourcc.org/yuv.php on Jun. 29, 2012; 13 pages. |
Wikipedia, YUV description and definition, downloaded from http://www.wikipedia.org/wiki/YUV on Jun. 29, 2012, 10 pages. |
Wikipedia, “Microlens”, Downloaded from https://en.wikipedia.org/wiki/Microlens, pp. 3, (in Feb. 9, 2017 Final Office Action in related matter). |
Wikipedia, “3D projection” Downloaded on Nov. 25, 2015 from www.wikipedia.com, 4 pages. |
Ward, Benjamin, Interactive 3D Reconstruction from Video, Aug. 2012, Doctoral Thesis, University of Adelaide, Adelaide, South Australia, 157 pages. |
Vogt, H., “Multiple Object Identification with Passive RFID Tags,” IEEE International Conference on Systems, Man, and Cybernetics, vol. 3, Oct. 6-9, 2002, 6 pages. |
United Kingdom Search Report in related Application No. GB1700338.5, dated Jun. 30, 2017, 5 pages. |
United Kingdom Search Report in related application GB1517842.9, dated Apr. 8, 2016, 8 pages. |
United Kingdom Further Examination Report in related GB Patent Application No. 1620676.5 dated Jul. 17, 2018; 4 pages [No art cited]. |
United Kingdom Further Examination Report in related GB Patent Application No. 1517842.9 dated Jul. 26, 2018; 5 pages [Cited art has been previously cited in this matter]. |
United Kingdom Further Examination Report in related GB Patent Application No. 1517112.7 dated Jul. 17, 2018; 4 pages [No art cited]. |
United Kingdom Further Exam Report in related application GB1607394.2 dated Oct. 5, 2018; 5 pages [Only new art cited herein]. |
United Kingdom Combined Search and Examination Report in related GB Application No. 1607394.2, dated Oct. 19, 2016, 7 pages. |
United Kingdom Combined Search and Examination Report in related Application No. GB1620676.5, dated Mar. 8, 2017, 6 pages [References have been previously cited; WO2014/151746, WO2012/175731, US 2014/0313527, GB2503978]. |
United Kingdom Combined Search and Examination Report dated Mar. 21, 2018, 5 pages (Art has been previously cited). |
Ulusoy, Ali Osman et al.; “One-Shot Scanning using De Bruijn Spaced Grids”, Brown University; 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops, pp. 1786-1792 [Cited in EPO Search Report dated Dec. 5, 2017]. |
Ulusoy et al., One-Shot Scanning using De Bruijn Spaced Grids, 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops, 7 pages [Cited in EP Extended search report dated Apr. 10, 2017]. |
U.S. Patent Application Tyler Doornenbal et al., filed Jul. 16, 2015, not published yet, Adjusting Dimensioning Results Using Augmented Reality, 39 pages, U.S. Appl. No. 14/801,023. |
U.S. Patent Application Serge Thuries et al., filed Oct. 21, 2014, not published yet, 40 pages, U.S. Appl. No. 14/519,179. |
U.S. Patent Application not yet published, Hand Held Products, Inc., filed Oct. 16, 2013; 26 pages, U.S. Appl. No. 14/055,234. |
U.S. Patent Application not yet published, filed Sep. 19, 2014, Intermec IP Corporation, Volume Dimensioning System Calibration Systems and Methods; U.S. Appl. No. 14/490,989. |
U.S. Patent Application not yet published, filed Aug. 18, 2014, Hand Held Products Inc., System and Method for Package Dimensioning; 21 pages, U.S. Appl. No. 14/461,524. |
U.S. Patent Application not yet published, filed Aug. 6, 2014, Hand Held Products Inc., Dimensioning System With Guided Alignment; 31 pages, U.S. Appl. No. 14/453,019. |
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al.); 16 pages. |
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages. |
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages. |
U.S. Appl. No. 29/528,590 for Electronic Device filed May 29, 2015 (Fitch et al.); 9 pages. |
U.S. Appl. No. 29/528,165 for In-Counter Barcode Scanner filed May 27, 2015 (Oberpriller et al.); 13 pages. |
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages. |
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages. |
U.S. Appl. No. 29/524,186 for Scanner filed Apr. 17, 2015 (Zhou et al.); 17 pages. |
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages. |
U.S. Appl. No. 29/519,017 for Scanner filed Mar. 2, 2015 (Zhou et al.); 11 pages. |
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages. |
U.S. Appl. No. 29/494,725 for an In-Counter Barcode Scanner, filed Jun. 24, 2014 (Oberpriller et al.); 23 pages. |
U.S. Appl. No. 29/492,903 for an Indicia Scanner, filed Jun. 4, 2014 (Zhou et al.); 23 pages. |
U.S. Appl. No. 29/486,759 for an Imaging Terminal, filed Apr. 2, 2014 (Oberpriller et al.); 8 pages. |
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages. |
U.S. Appl. No. 29/459,823 for a Scanner, filed Jul. 3, 2013 (Zhou et al.); 13 pages. |
U.S. Appl. No. 29/459,785 for a Scanner and Charging Base, filed Jul. 3, 2013 (Fitch et al.); 21 pages. |
U.S. Appl. No. 29/459,681 for an Electronic Device Enclosure, filed Jul. 2, 2013 (Chaney et al.); 14 pages. |
U.S. Appl. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.); 21 pages. |
U.S. Appl. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.); 22 pages. |
U.S. Appl. No. 29/436,337 for an Electronic Device, filed Nov. 5, 2012 (Fitch et al.); 19 pages. |
U.S. Patent Application H. Sprague Ackley, filed Jul. 7, 2015, not published yet, Mobile Dimensioner Apparatus for Use in Commerce; 57 pages, U.S. Appl. No. 14/793,149. |
U.S. Patent Application H. Sprague Ackley et al., filed Oct. 21, 2014, not published yet, 36 pages, U.S. Appl. No. 14/519,249. |
U.S. Patent Application H. Sprague Ackley et al., filed Oct. 21, 2014, not published yet, 33 pages, U.S. Appl. No. 14/519,211. |
U.S. Patent Application Franck Laffargue et al., filed Oct. 21, 2014, not published yet, 35 pages, U.S. Appl. No. 14/519,195. |
U.S. Appl. No. 13/786,131, filed Mar. 5, 2013, U.S. Pat. No. 10,321,127, Patented. |
Dimensional Weight—Wikipedia, the Free Encyclopedia, URL=http://en.wikipedia.org/wiki/Dimensional_weight, download date Aug. 1, 2008, 2 pages. |
Decision to Grant in counterpart European Application No. 14157971.4 dated Aug. 6, 2015, pp. 1-2. |
Combined Search and Examination Report in related UK Application No. GB1900752.5 dated Feb. 1, 2019, pp. 1-5. |
Combined Search and Examination Report in related UK Application No. GB1817189.2 dated Nov. 14, 2018, pp. 1-4 [Reference previously cited.]. |
Collings et al., “The Applications and Technology of Phase-Only Liquid Crystal on Silicon Devices”, Journal of Display Technology, IEEE Service Center, New York, NY, US, vol. 7, No. 3, Mar. 1, 2011, pp. 112-119. |
Chinese Notice of Reexamination in related Chinese Application 201520810313.3, dated Mar. 14, 2017, English Computer Translation provided, 7 pages [No new art cited]. |
Caulier, Yannick et al., “A New Type of Color-Coded Light Structures for an Adapted and Rapid Determination of Point Correspondences for 3D Reconstruction.” Proc. of SPIE, vol. 8082 808232-3; 2011; 8 pages. |
Butcher et al. (eds.), NIST Handbook 44, 2012 Edition, Section 5.58, “Multiple Dimension Measuring Devices,” Oct. 2011, pp. 5-71 to 5-82, 15 pages. |
Boukraa et al., “Tag-Based Vision: Assisting 3D Scene Analysis with Radio-Frequency Tags,” IEEE International Conference on Image Processing, 2002, pp. 269-272. |
Boavida et al., “Dam monitoring using combined terrestrial imaging systems”, 2009 Civil Engineering Survey Dec./Jan. 2009, pp. 33-38 {Cited in Notice of Allowance dated Sep. 15, 2017 in related matter}. |
Benos et al., “Semi-Automatic Dimensioning with Imager of a Portable Device,” U.S. Appl. No. 61/149,912; filed Feb. 4, 2009 (now expired), 56 pages. |
U.S. Patent Application for a Mobile Device Having an Improved User Interface for Reading Code Symbols, filed Jun. 28, 2013 (Gelay et al.); 24 pages, U.S. Appl. No. 13/930,913. |
U.S. Patent Application for a Method of Error Correction for 3D Imaging Device, filed Jun. 7, 2013 (Jovanovski et al.); 33 pages, U.S. Appl. No. 13/912,262. |
U.S. Patent Application for a Method for Operating a Laser Scanner, filed Sep. 5, 2013 (Feng et al.); 24 pages, U.S. Appl. No. 14/018,729. |
U.S. Patent Application for a Method for Manufacturing Laser Scanners, filed Aug. 7, 2013 (Saber et al.); 26 pages, U.S. Appl. No. 13/961,408. |
U.S. Patent Application for a Handheld Indicia Reader Having Locking Endcap, filed Sep. 11, 2013 (Gannon); 31 pages, U.S. Appl. No. 14/023,762. |
U.S. Patent Application for a Device Having Light Source to Reduce Surface Pathogens, filed Sep. 6, 2013 (Todeschini); 23 pages, U.S. Appl. No. 14/019,616. |
U.S. Patent Application for a Code Symbol Reading System Having Adjustable Object Detection, filed Jul. 25, 2013 (Jiang); 28 pages, U.S. Appl. No. 13/950,544. |
U.S. Patent Application for a Code Symbol Reading System Having Adaptive Autofocus, filed Jun. 26, 2013 (Todeschini); 24 pages, U.S. Appl. No. 13/927,398. |
U.S. Patent Application Eric Todeschini, filed Jul. 16, 2015, not published yet, Dimensioning and Imaging Items, 80 pages, U.S. Appl. No. 14/800,757. |
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012 (Feng et al.); now abandoned. |
Trucco, Emanuele, and Alessandro Verri, Introductory Techniques for 3-D Computer Vision, Prentice Hall, New Jersey, 1998, Chapter 7, “Stereopsis,” pp. 139-175., 39 pages. |
Todeschini et al.; “Depth Sensor Based Auto-Focus System for an Indicia Scanner,” U.S. Patent Application filed Oct. 1, 2015, 44 pages, not yet published, U.S. Appl. No. 14/872,176. |
Thorlabs, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, downloaded from https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=6430, 4 pages. |
Theodoropoulos, Gabriel; “Using Gesture Recognizers to Handle Pinch, Rotate, Pan, Swipe, and Tap Gestures” dated Aug. 25, 2014, 34 pages, [Examiner Cited Art in Office Action dated Jan. 20, 2017 in related Application.]. |
Spiller, Jonathan; Object Localization Using Deformable Templates, Master's Dissertation, University of the Witwatersrand, Johannesburg, South Africa, 2007; 74 pages. |
Sill Optics, NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, http://www.silloptics.de/1/products/sill-encyclopedia/laser-optics/f-theta-lenses/, 4 pages. |
Second Chinese Office Action in related CN Application No. 201520810562.2, dated Mar. 22, 2016, 5 pages. English Translation provided [No references]. |
Second Chinese Office Action in related CN Application No. 201520810685.6, dated Mar. 22, 2016, 5 pages, no references. |
Second Chinese Office Action in related CN Application No. 201520810313.3, dated Mar. 22, 2016, 5 pages. English Translation provided [No references]. |
Search Report and Opinion in related GB Application No. 1517112.7, dated Feb. 19, 2016, 6 Pages (GB2503978 is a commonly owned now abandoned application and not cited above). |
Search Report and Opinion in Related EP Application 15176943.7, dated Jan. 8, 2016, 8 pages, (US Application 2014/0049635 has been previously cited). |
Salvi, Joaquim et al. “Pattern Codification Strategies in Structured Light Systems” published in Pattern Recognition; The Journal of the Pattern Recognition Society, Accepted Oct. 2, 2003; 23 pages. |
Reisner-Kollmann, Irene; Anton L. Fuhrmann, Werner Purgathofer, Interactive Reconstruction of Industrial Sites Using Parametric Models, May 2010, Proceedings of the 26th Spring Conference on Computer Graphics SCCG '10, 8 pages. |
Ralph Grabowski, “Smoothing 3D Mesh Objects,” New Commands in AutoCAD 2010: Part 11, Examiner Cited art in related matter Non-Final Office Action dated May 19, 2017; 6 pages. |
Proesmans, Marc et al. “Active Acquisition of 3D Shape for Moving Objects” 0-7803-3258-X/96 1996 IEEE; 4 pages. |
Peter Clarke, Actuator Developer Claims Anti-Shake Breakthrough for Smartphone Cams, Electronic Engineering Times, p. 24, May 16, 2011. |
Padzensky, Ron; “Augmera; Gesture Control”, Dated Apr. 18, 2015, 15 pages [Examiner Cited Art in Office Action dated Jan. 20, 2017 in related Application]. |
Office Action in counterpart European Application No. 13186043.9 dated Sep. 30, 2015, pp. 1-7. |
Office Action dated Oct. 23, 2017, for U.S. Appl. No. 13/786,131, 28 pages. |
Office Action dated May 23, 2018, for U.S. Appl. No. 13/786,131, 30 pages. |
Office Action dated Mar. 23, 2016, for U.S. Appl. No. 14/490,989, 7 pages. |
Office Action dated Jun. 17, 2016, for U.S. Appl. No. 13/786,131, 31 pages. |
Office Action dated Jan. 21, 2016, for U.S. Appl. No. 13/786,131, 28 pages. |
Office Action dated Apr. 12, 2017, for U.S. Appl. No. 13/786,131, 32 pages. |
Notice of Allowance dated Jan. 22, 2019, for U.S. Appl. No. 13/786,131, 7 pages. |
Mouaddib E. et al. “Recent Progress in Structured Light in order to Solve the Correspondence Problem in Stereo Vision” Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Apr. 1997; 7 pages. |
McCloskey et al., “Methods for Improving the Accuracy of Dimensioning-System Measurements,” U.S. Patent Application filed Sep. 2, 2015, 47 pages, not yet published, U.S. Appl. No. 14/873,613. |
McCloskey et al., “Image Transformation for Indicia Reading,” U.S. Patent Application filed Oct. 30, 2015, 48 pages, not yet published, U.S. Appl. No. 14/982,032. |
M. Zahid Gurbuz, Selim Akyokus, Ibrahim Emiroglu, Aysun Guran, An Efficient Algorithm for 3D Rectangular Box Packing, 2009, Applied Automatic Systems: Proceedings of Selected AAS 2009 Papers, pp. 131-134 [Examiner cited art in related US matter with Notice of Allowance dated Aug. 11, 2016]. |
Lowe, David G., “Fitting Parameterized Three-Dimensional Models to Images”, IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Computer Society, USA, vol. 13, No. 5, May 1, 1991, pp. 441-450. |
Lloyd et al., “System for Monitoring the Condition of Packages Throughout Transit”, U.S. Patent Application filed Sep. 25, 2015, 59 pages, not yet published, U.S. Appl. No. 14/865,575. |
Leotta, Matthew, Generic, Deformable Models for 3-D Vehicle Surveillance, May 2010, Doctoral Dissertation, Brown University, Providence RI, 248 pages. |
Leotta, Matthew J.; Joseph L. Mundy; Predicting High Resolution Image Edges with a Generic, Adaptive, 3-D Vehicle Model; IEEE Conference on Computer Vision and Pattern Recognition, 2009; 8 pages. |
Kazantsev, Aleksei et al. “Robust Pseudo-Random Coded Colored Structured Light Techniques for 3D Object Model Recovery”; ROSE 2008 IEEE International Workshop on Robotic and Sensors Environments (Oct. 17-18, 2008), 6 pages. |
Jovanovski et al., “Image-Stitching for Dimensioning”, U.S. Patent Application filed Sep. 30, 2015, 45 pages, not yet published, U.S. Appl. No. 14/870,488. |
James Chamberlin, “System and Method for Picking Validation”, U.S. Patent Application filed Sep. 25, 2015, 44 pages, not yet published, U.S. Appl. No. 14/865,797. |
International Search Report for PCT/US2013/039438 (WO2013166368), dated Oct. 1, 2013, 7 pages. |
Intention to Grant in counterpart European Application No. 14157971.4 dated Apr. 14, 2015, pp. 1-8. |
Hood, Frederick W.; William A. Hoff, Robert King, Evaluation of an Interactive Technique for Creating Site Models from Range Data, Apr. 27-May 1, 1997 Proceedings of the ANS 7th Topical Meeting on Robotics & Remote Systems, Augusta GA, 9 pages. |
Hinske, Steve, “Determining the Position and Orientation of Multi-Tagged Objects Using RFID Technology,” Fifth IEEE International Conference on Pervasive Computing and Communications Workshops, Mar. 19-23, 2007, pp. 377-381. |
Advisory Action (PTOL-303) dated Feb. 1, 2018 for U.S. Appl. No. 13/786,131. |
Advisory Action (PTOL-303) dated Sep. 29, 2016 for U.S. Appl. No. 13/786,131. |
Applicant Initiated Interview Summary (PTOL-413) dated Aug. 29, 2018 for U.S. Appl. No. 13/786,131. |
Notice of Allowance and Fees Due (PTOL-85) dated Mar. 5, 2019 for U.S. Appl. No. 13/786,131. |
Notice of Allowance and Fees Due (PTOL-85) dated Mar. 14, 2019 for U.S. Appl. No. 13/786,131. |
Notice of Allowance and Fees Due (PTOL-85) dated May 13, 2019 for U.S. Appl. No. 13/786,131. |
Supplemental Notice of Allowability for U.S. Appl. No. 13/786,131 dated Mar. 14, 2019, 2 pages. |
Supplemental Notice of Allowability for U.S. Appl. No. 13/786,131 dated Mar. 5, 2019, 2 pages. |
Supplemental Notice of Allowability for U.S. Appl. No. 13/786,131 dated May 13, 2019, 2 pages. |
U.S. Patent Application Franck Laffargue et al., filed Oct. 21, 2014, not published yet, 34 pages, U.S. Appl. No. 14/519,233. |
U.S. Patent Application for Web-Based Scan-Task Enabled System and Method of and Apparatus for Developing and Deploying the Same on a Client-Server Network filed Jul. 2, 2014 (Chen et al.); 65 pages, U.S. Appl. No. 14/370,237. |
U.S. Patent Application for Variable Depth of Field Barcode Scanner filed Sep. 10, 2014 (McCloskey et al.); 29 pages, U.S. Appl. No. 14/483,056. |
U.S. Patent Application for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages, U.S. Appl. No. 14/702,979. |
U.S. Patent Application for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages, U.S. Appl. No. 14/283,282. |
U.S. Patent Application for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Barndringa); 38 pages, U.S. Appl. No. 14/740,320. |
U.S. Patent Application for System for Communication via a Peripheral Hub filed Apr. 15, 2015 (Kohtz et al.); 37 pages, U.S. Appl. No. 14/687,289. |
U.S. Patent Application for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages, U.S. Appl. No. 14/702,110. |
U.S. Patent Application for System and Method for Dimensioning filed Oct. 21, 2014 (Ackley et al.); 33 pages, U.S. Appl. No. 14/519,211. |
U.S. Patent Application for System and Method for Detecting Barcode Printing Errors filed Jan. 14, 2015 (Ackley); 41 pages, U.S. Appl. No. 14/596,757. |
U.S. Patent Application for Symbol Reading System With Integrated Scale Base filed Jul. 17, 2014 (Barten); 59 pages, U.S. Appl. No. 14/333,588. |
U.S. Patent Application for Shelving and Package Locating Systems for Delivery Vehicles filed Jan. 6, 2015 (Payne); 31 pages, U.S. Appl. No. 14/590,024. |
U.S. Patent Application for Self-Checkout Shopping System, filed Nov. 8, 2013 (Hejl et al.); 26 pages, U.S. Appl. No. 14/074,746. |
U.S. Patent Application for Secure Unattended Network Authentication filed Apr. 24, 2015 (Kubler et al.); 52 pages, U.S. Appl. No. 14/695,923. |
U.S. Patent Application for Safety System and Method filed Dec. 22, 2014 (Ackley et al.); 32 pages, U.S. Appl. No. 14/578,627. |
U.S. Patent Application for Reading Apparatus Having Partial Frame Operating Mode filed Apr. 21, 2014, (Barber et al.), 67 pages, U.S. Appl. No. 14/257,174. |
U.S. Patent Application for Portable Electronic Devices Having a Separate Location Trigger Unit for Use in Controlling an Application Unit filed Nov. 3, 2014 (Bian et al.); 22 pages, U.S. Appl. No. 14/398,542. |
U.S. Patent Application for Optimal Range Indicators for Bar Code Validation, filed Nov. 22, 2013 (Hejl); 27 pages, U.S. Appl. No. 14/087,190. |
U.S. Patent Application for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages, U.S. Appl. No. 14/747,197. |
U.S. Patent Application for Optical Indicia Reading Terminal with Combined Illumination filed Mar. 19, 2014 (Ouyang); 19 pages, U.S. Appl. No. 14/345,735. |
U.S. Patent Application for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages, U.S. Appl. No. 14/277,337. |
U.S. Patent Application for Multifunction Point of Sale System filed Mar. 19, 2015 (Van Horn et al.); 41 pages, U.S. Appl. No. 14/662,922. |
U.S. Patent Application for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages, U.S. Appl. No. 14/446,391. |
U.S. Patent Application for Mobile Printer With Optional Battery Accessory, filed May 12, 2014 (Marty et al.); 26 pages, U.S. Appl. No. 14/274,858. |
U.S. Patent Application for Mobile Computing Device With Data Cognition Software, filed on Aug. 19, 2014 (Todeschini et al.); 38 pages, U.S. Appl. No. 14/462,801. |
U.S. Patent Application for Methods for Training a Speech Recognition System filed Feb. 11, 2015 (Pecorari); 35 pages, U.S. Appl. No. 14/619,093. |
U.S. Patent Application for Method of Using Camera Sensor Interface to Transfer Multiple Channels of Scan Data Using an Image Format filed Aug. 15, 2014 (Wang et al.); 28 pages, U.S. Appl. No. 14/379,057. |
U.S. Patent Application for Method of and System for Detecting Object Weighing Interferences, filed Jul. 21, 2014 (Amundsen et al.); 34 pages, U.S. Appl. No. 14/336,188. |
U.S. Patent Application for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages, U.S. Appl. No. 14/705,407. |
U.S. Patent Application for Method and System for Recognizing Speech Using Wildcards in an Expected Response filed Oct. 29, 2014 (Braho et al.); 45 pages, U.S. Appl. No. 14/527,191. |
U.S. Patent Application for Method and System for Considering Information About an Expected Response When Performing Speech Recognition, filed Jun. 10, 2014 (Braho et al.); 31 pages, U.S. Appl. No. 14/300,276. |
U.S. Patent Application for Method and System for Configuring Mobile Devices via NFC Technology, filed Nov. 8, 2013 (Smith et al.); 28 pages, U.S. Appl. No. 14/074,787. |
U.S. Patent Application for Method and Application for Scanning a Barcode With a Smart Device While Continuously Running and Displaying an Application on the Smart Device Display filed Mar. 20, 2015 (Todeschini); 37 pages, U.S. Appl. No. 14/664,063. |
U.S. Patent Application for Medication Management System filed Apr. 24, 2015 (Sewell et al.); 44 pages, U.S. Appl. No. 14/695,364. |
U.S. Patent Application for Media Gate for Thermal Transfer Printers filed Dec. 23, 2014 (Bowles); 36 pages, U.S. Appl. No. 14/580,262. |
U.S. Patent Application for Indicia Reading Terminal Including Optical Filter filed Jan. 28, 2014 (Lu et al.); 29 pages, U.S. Appl. No. 14/166,103. |
U.S. Patent Application for Indicia Reader, filed Oct. 14, 2013 (Huck); 29 pages, U.S. Appl. No. 14/053,314. |
U.S. Patent Application for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.); 42 pages, U.S. Appl. No. 14/200,405. |
U.S. Patent Application for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.); 28 pages, U.S. Appl. No. 14/150,393. |
U.S. Patent Application for Laser Scanning Code Symbol Reading System, filed Jul. 24, 2014 (Xian et al.); 39 pages, U.S. Appl. No. 14/339,708. |
U.S. Patent Application for Laser Barcode Scanner, filed Jan. 14, 2014 (Hou et al.); 26 pages, U.S. Appl. No. 14/154,207. |
U.S. Patent Application for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages, U.S. Appl. No. 14/704,050. |
U.S. Patent Application for Interactive Indicia Reader, filed Aug. 6, 2014, (Todeschini); 32 pages, U.S. Appl. No. 14/452,697. |
U.S. Patent Application for Industrial Design for Consumer Device Based Scanning and Mobility, filed Jul. 2, 2014 (Ma et al.); 45 pages, U.S. Appl. No. 14/370,267. |
U.S. Patent Application for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages, U.S. Appl. No. 14/735,717. |
U.S. Patent Application for Indicia Reading Terminal Processing Plurality of Frames of Image Data Responsively to Trigger Signal Activation filed Jul. 30, 2014 (Wang et al.); 76 pages, U.S. Appl. No. 14/446,387. |
U.S. Patent Application for Indicia Reading System Employing Digital Gain Control filed Jun. 16, 2014 (Xian et al.); 53 pages, U.S. Appl. No. 14/305,153. |
U.S. Patent Application for Imaging Based Barcode Scanner Engine with Multiple Elements Supported on a Common Printed Circuit Board filed Mar. 4, 2014 (Liu et al.); 27 pages, U.S. Appl. No. 14/342,544. |
U.S. Patent Application for Identifying Inventory Items in a Storage Facility filed Oct. 14, 2014 (Singel et al.); 51 pages, U.S. Appl. No. 14/513,808. |
U.S. Patent Application for Hybrid System and Method for Reading Indicia, filed Oct. 29, 2013 (Meier et al.); 22 pages, U.S. Appl. No. 14/065,768. |
Number | Date | Country | |
---|---|---|---|
20190335168 A1 | Oct 2019 | US |
Number | Date | Country | |
---|---|---|---|
61691093 | Aug 2012 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13786131 | Mar 2013 | US |
Child | 16404189 | US |