Autofocus lens system

Information

  • Patent Grant
  • Patent Number
    10,222,514
  • Date Filed
    Monday, April 16, 2018
  • Date Issued
    Tuesday, March 5, 2019
Abstract
An autofocus lens system includes no conventional moving parts and has excellent speed and low power consumption. The system includes a small electronically-controlled focusing-module lens. The focusing-module lens includes two adjustable polymeric surfaces (e.g., two adjustable-surface lenses in a back-to-back configuration). The curvature of the surfaces can be adjusted to change focus. The performance of the autofocus lens system is extended by adding a conventional first and second lens, or lens group, on either side of the focusing-module lens. The result is an autofocus lens system with excellent near-field and far-field performance.
Description
FIELD OF THE INVENTION

The present invention relates to indicia readers and, more particularly, to an autofocus optical system that extends the range of distances at which indicia may be read.


BACKGROUND

Indicia readers (e.g., barcode scanners, OCR scanners) fall into two main classes based on their barcode-reading technology, namely (i) linear scanners (e.g., laser scanners, 1D imagers) and (ii) 2D scanners (e.g., 2D imagers, page scanners).


Laser scanners use fast moving mirrors to sweep a laser beam across a linear barcode. The bars and spaces of the barcode are recognized based on their respective reflectivity. In other words, the light areas and dark areas of the barcode reflect light back toward the scanner differently. This difference can be sensed by the scanner's photo-detector (e.g., photodiode) and converted into an electronic signal suitable for decoding.


Imaging scanners were developed to read advanced codes by adapting technology used in digital cameras. Imaging scanners take a picture of the entire barcode, and a processor running image processing algorithms recognizes and decodes the barcode. This digital approach overcomes many of the laser scanner's limitations.


Imaging scanners are more reliable than laser scanners, which use fast-moving parts. Imaging scanners can be configured to process all barcodes within a field of view and do not require separate scans for each barcode. Sophisticated decoding algorithms eliminate the need to align the imaging scanner with the barcode. Imaging scanners can also scan poor quality or damaged barcodes faster and more reliably than laser scanners. Further, the imaging scanner is more versatile and can be configured to address new codes or new modes of operation, such as document-capture. In view of these advantages, many users prefer the imaging scanner. The imaging scanner, however, lacks the extended scan range associated with laser scanners.


Extended scan ranges are important in warehouse environments, where barcoded containers may be stacked on high shelves. Operators may be limited in their access to barcodes and must scan over a range of distances. In these situations, scanning ranges can be 10 centimeters to 10 meters. This multi-order-of-magnitude range requirement places stringent demands on the imaging scanner.


The range of imaging scanners is limited by the scanner's imaging optics (e.g., lens). The quality of a barcode image is crucial for a proper scan. An unfocused image can render a barcode unreadable.


The range of distances over which a barcode can be decoded is known as the working-distance range. In fixed-lens systems (i.e., no moving parts), this working-distance range is the distance between the nearest focused objects and the farthest focused objects within the field of view (i.e., depth of field). The depth of field is related to the lens's f-number. A lens with a high f-number has a large depth of field. High f-number lenses, however, collect less light. Imaging scanners must collect sufficient light to prevent noisy images. These scanners, therefore, need a lens with both a low f-number and the ability to produce sharp images over a wide range of working distances. Fixed lenses, therefore, are not used for imaging scanners intended for extended range applications (e.g., warehouses).
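The f-number/depth-of-field trade-off described above can be made concrete with the standard hyperfocal-distance approximation (a simplified sketch; the focal length and circle-of-confusion values below are illustrative assumptions, not figures from this disclosure):

```python
def hyperfocal_mm(focal_mm, f_number, coc_mm):
    """Hyperfocal distance: focusing here keeps everything
    from H/2 to infinity acceptably sharp."""
    return focal_mm**2 / (f_number * coc_mm) + focal_mm

def dof_mm(focal_mm, f_number, coc_mm, subject_mm):
    """Near/far limits of acceptable focus for a subject distance."""
    H = hyperfocal_mm(focal_mm, f_number, coc_mm)
    f = focal_mm
    near = subject_mm * (H - f) / (H + subject_mm - 2 * f)
    far = (subject_mm * (H - f) / (H - subject_mm)
           if subject_mm < H else float("inf"))
    return near, far

# Illustrative 16 mm lens, 5-micron circle of confusion:
# a higher f-number yields a much deeper field.
for N in (2.8, 8):
    near, far = dof_mm(16, N, 0.005, 1000)  # subject at 1 m
    print(f"f/{N}: near {near:.0f} mm, far {far:.0f} mm")
```

For a subject at 1 meter, moving from f/2.8 to f/8 roughly triples the depth of field in this sketch, while the aperture area (and hence the collected light) shrinks by the square of the f-number ratio.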


Autofocus (i.e., AF) lenses may be used in imaging scanners that need both near and far scanning capabilities. Typically, focus is achieved in an autofocus lens by mechanically moving the lens. These mechanically-tuned autofocus lenses provide range to imaging scanners but also have some limitations.


The moving parts in mechanical autofocus lens systems may have reliability issues. The mechanical autofocus lens systems can be bulky because of the extra components required for motion (e.g., actuators, tracks, and linkages). These motion components also consume power at a rate that may limit their compatibility with battery-powered scanners. The mechanical motion of the lens or lenses can be slow and may hinder their use in applications that require fast focus (e.g., scanning in moving environments). Finally, the cost of these mechanical autofocus lens systems can be high because of the number and precision of the required mechanical parts.


Therefore, a need exists for an imaging-scanner autofocus lens system that has (i) a large focus range, (ii) a small size, (iii) low power consumption, and (iv) reduced mechanical complexity.


SUMMARY

Accordingly, in one aspect, the present invention embraces an autofocus lens system for an imaging scanner. The autofocus lens system uses a first lens (or lens group including a plurality of lenses), a second lens (or lens group including a plurality of lenses), and a focusing-module lens to focus a barcode onto an image sensor. The first lens is fixedly positioned along an optical axis. The second lens is fixedly positioned along the optical axis. The focusing-module lens is fixedly positioned along the optical axis between the first and second lenses. The lenses together create a real image of indicia. The focusing-module lens is used to change the focus of the autofocus lens system. Focus is adjusted by adjusting the optical power of the focusing-module lens. The optical power of the focusing-module lens is controlled by electronically adjusting the curvature of two adjustable surfaces.


In an exemplary embodiment, the autofocus lens system has a focusing-module lens with a clear aperture diameter that is smaller than the diameter of either the first lens or the second lens. The focusing-module lens defines the aperture stop for the autofocus lens system.


In another exemplary embodiment, this focusing-module lens has a diameter between 1.3 and 1.7 millimeters (e.g., about 1.5 millimeters).


In another exemplary embodiment, the autofocus lens system has a working distance of 10 centimeters or greater.


In another exemplary embodiment, the autofocus lens system has a response time of 2 milliseconds or less (e.g., less than about 1 millisecond).


In another exemplary embodiment, the autofocus lens system consumes 20 milliwatts of power or less.


In another exemplary embodiment, the autofocus lens system has an f-number of 7 or less.


In another exemplary embodiment, the focusing-module lens includes two adjustable-surface lenses positioned in close proximity to one another.


In still another exemplary embodiment, the focusing-module lens may include two contiguous adjustable surfaces. Here, the focusing-module lens includes (i) a first transparent deformable membrane having a ring-shaped piezoelectric film contiguously positioned on the first transparent deformable membrane's outer surface, (ii) a second transparent deformable membrane having a ring-shaped piezoelectric film contiguously positioned on the second transparent deformable membrane's outer surface, and (iii) a flexible polymer contiguously positioned between the first transparent deformable membrane and the second transparent deformable membrane. In this way, the flexible polymer is in contact with the inner surfaces of both the first transparent deformable membrane and the second transparent deformable membrane. The transparent deformable membrane can be fabricated from glass, quartz, sapphire or other semi-rigid transparent material. The two adjustable surfaces of the focusing-module lens may, in some embodiments, be electronically controlled independently.


In another aspect, the present invention embraces an active autofocus system for an imaging scanner including the foregoing autofocus lens system. In this active autofocus system, a range finder senses the range of a barcode through transmitted and received radiation and creates a range signal representing the sensed range. A processor generates a control signal based on the comparison of the range signal with a lookup table stored in memory. The lookup table contains focus settings associated with various range-signal values. A controller responds to the control signal by creating electronic autofocus signals to adjust the autofocus lens system to achieve focus of a real image of a 1D or 2D barcode.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 schematically depicts a block diagram of an active autofocus system.



FIG. 2 graphically depicts a perspective view of a cutaway layout of an adjustable-surface lens.



FIG. 3a graphically depicts a side-view cross-section of an adjustable-surface lens with no voltage applied (i.e., the “off” state).



FIG. 3b graphically depicts a side-view cross-section of an adjustable-surface lens with a voltage applied (i.e., the “on” state).



FIG. 4 schematically depicts an embodiment of an autofocus lens system.



FIG. 5 graphically depicts a first embodiment of a focusing-module lens with two adjustable-surface lenses in close proximity.



FIG. 6 graphically depicts a second embodiment of a focusing-module lens.



FIG. 7 schematically depicts an exemplary embodiment of an autofocus lens system.





DETAILED DESCRIPTION

The present invention embraces an autofocus lens system for an imaging scanner that extends the range of distances over which barcodes may be read. In this regard, an exemplary autofocus lens system for an imaging scanner includes (i) a first lens (e.g., first positive lens), fixedly positioned along an optical axis, (ii) a second lens (e.g., second positive lens) for creating a real image of a barcode, the second lens fixedly positioned along the optical axis, and (iii) a focusing-module lens fixedly positioned along the optical axis between the first lens and the second lens, the focusing-module lens formed from two adjustable surfaces, wherein the optical power of the adjustable surfaces is controlled electronically to achieve focus.


Imaging scanners require good images to properly decode barcodes. The image sensors used in these devices deliver high-quality images only when the light impinging on the sensor (i) is focused and (ii) has an intensity above the sensor's noise level (i.e., a high signal-to-noise ratio).


To achieve high signal-to-noise ratio (SNR) images, the lens of an imaging scanner should gather light efficiently (i.e., have a high throughput). The entrance pupil is the image of the lens system's aperture stop as seen through the front lens and is an indicator of the throughput. A large entrance pupil implies that the lens system will have high throughput. Long-range scans are especially susceptible to low-SNR images because of path loss: light reflected from the barcode spreads out over long distances, so less of it is captured by the scanner's imaging lens. Illumination sources may help to improve the SNR of images, but a high-throughput imaging lens is still extremely important.


Imaging scanners used in diverse environments (e.g., warehouses) should read barcodes at various ranges in both the near field and far field (e.g., 10 centimeters to 10 meters). In other words, the imaging scanner's lens must be able to create sharp barcode images over a wide range of working distances.


Fixed-focus lenses, with no focusing motion, are not used in imaging scanners requiring wide scanning ranges. These lenses typically have low throughput and may not be able to focus on a barcode that is close to the scanner.


Lenses with focusing motion can extend the scanner's working-distance range. The focusing motion moves the lens to a point where the light rays from a barcode converge onto the image sensor and produce a sharp real image. While this focusing movement can be accomplished manually, it is more practical for scanners to use an automatic-focus (i.e., autofocus) system.


Autofocus systems use (i) optimal focus information and (ii) a positioner (e.g., an actuator or piezoelectric element) to position the real image. A passive autofocus system might use a processor running image-processing algorithms to determine the focus quality. The processor uses this information to send signals to actuators that position the lens. Alternatively, an active autofocus system uses a range finder to ascertain the distance between the object and the front lens of the system (i.e., the working distance). This range information can then be used to adjust the lens position for optimal focus. Because of its simplicity, the active autofocus scheme is well suited for imaging scanners.


The range finder in an active autofocus system can use one or more sensors to create a range signal. A processor running an algorithm can compare the range signal with a stored lookup table to generate a corresponding control signal. The control signal can be interpreted by control electronics (e.g., a controller) to drive the lens system's positioning devices.


A block diagram of an active autofocus system is shown in FIG. 1. Here, a range finder 20 senses the range (i.e., working distance) of a barcode through transmitted radiation 5 and received radiation 15 (e.g., optical signals). The range finder 20 creates a range signal 25 and sends this range signal 25 to a processor 10. The processor 10 runs an algorithm to compare the value of the range signal 25 with a lookup table 32 stored in memory 30. The lookup table 32 contains focus settings for various range signals 25. Once the focus settings corresponding to the measured range are determined, the processor 10 sends a control signal 35 to the autofocus controller 40. Based on this signal, the autofocus controller 40 sends electronic autofocus signals 45 to the autofocus lens system 50. The autofocus signals 45 cause the autofocus lens system 50 to change the imaging system's focus. When the adjustment of the autofocus lens system 50 is complete, the light from the barcode 55 is well focused onto the imaging scanner's image sensor.
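The FIG. 1 flow (range signal, lookup table, control signal, autofocus signal) can be sketched as a simple lookup-driven loop. This is a minimal illustration only; the range bins, voltage values, and function names are invented for the sketch and do not come from the disclosure:

```python
import bisect

# Hypothetical lookup table: measured range (mm) -> focus setting
# (e.g., a drive voltage for the piezoelectric films).
RANGE_BREAKS_MM = [100, 250, 500, 1000, 2500, 5000, 10000]
FOCUS_SETTINGS = [40.0, 30.0, 22.0, 15.0, 9.0, 4.0, 1.0, 0.0]  # volts

def focus_setting_for_range(range_mm):
    """Map a range-finder reading to a stored focus setting,
    as the processor in FIG. 1 does with its lookup table."""
    i = bisect.bisect_left(RANGE_BREAKS_MM, range_mm)
    return FOCUS_SETTINGS[i]

def autofocus_step(range_finder, lens_controller):
    """One pass of the active autofocus loop: sense range,
    look up the focus setting, and drive the lens."""
    range_mm = range_finder()                    # range signal
    setting = focus_setting_for_range(range_mm)  # control signal
    lens_controller(setting)                     # autofocus signal
    return setting

# Example: a barcode sensed at 0.4 m selects the 22 V setting.
print(autofocus_step(lambda: 400, lambda v: None))
```

Because the mapping is a precomputed table rather than iterative image analysis, a single range measurement suffices to focus, which is what makes the active scheme fast and simple.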


Autofocus functionality relies on an adjustable lens parameter to focus the barcode. In traditional autofocus systems, focus is achieved by changing the position of a lens (or lenses forming a lens group) in relation to the image sensor. The autofocus signals 45 drive motors or actuators that move the lens (or lenses). In other words, the focus is controlled mechanically by changing the position of a lens or a lens group.


Mechanical autofocus systems can be bulky and slow for imaging scanner applications. A typical mechanical autofocus system can take 60 milliseconds to reach focus. Actuators in these systems can also draw a relatively large amount of power. Typical systems may draw around 450 milliwatts, which reduces battery life.


Focus can also be adjusted non-mechanically. Lens curvature (i.e., lens power) may be changed to adjust focus. Lenses made from an adjustable surface (i.e., adjustable-surface lenses) can be used in autofocus lens systems for imaging scanners. Adjustable-surface lenses are compact, fast, reliable, cost-effective, and energy efficient.


A perspective half-section view of a lens made from a single adjustable surface is shown in FIG. 2. The glass support 105 is a support element made of a transparent rigid material such as glass. The top element is a thin glass membrane 110, including an actuating element, such as a ring-shaped piezoelectric film 120. The glass membrane is supported on its edges by a silicon support 115 made using MEMS (i.e., micro-electro-mechanical systems) manufacturing technology. Sandwiched between the glass support 105 and the glass membrane 110 is a flexible transparent polymeric material 130.


The adjustable-surface lens 100 relies on a change in the polymeric surface's curvature as a result of an applied voltage. A side-view cross-section of the adjustable-surface lens 100 and its off/on operation are shown in FIG. 3a and FIG. 3b, respectively. As depicted in FIG. 3a, when no voltage is applied to the ring-shaped piezoelectric film 120, the light beam 101 passes through the clear polymer 130 with no alteration (i.e., zero optical power). On the other hand, as shown in FIG. 3b, when a voltage is applied to the piezoelectric film 120, the glass membrane 110 and the contiguous polymer 130 curve (e.g., spherically or near spherically), and the light beam 101 is focused to a point behind the lens.


Focusing the small adjustable-surface lens 100 is achieved by changing the shape of the adjustable surface. This change is caused by a mechanical strain exerted by the ring-shaped piezoelectric film (i.e., piezo-ring) 120 because of an applied voltage. This strain alters the shape of the glass membrane 110, and, more importantly, also changes the shape of the flexible polymer 130 that is contiguous to this layer. In this way, the adjustable surface's optical power is controlled, and the position of the focus is adjusted.
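The connection between surface curvature and optical power follows the single-refracting-surface relation P = (n - 1)/R. A brief sketch, assuming an illustrative polymer refractive index of 1.5 (the actual materials and indices are not specified here):

```python
def surface_power_diopters(n_polymer, radius_mm):
    """Optical power of a single curved polymer surface in air:
    P = (n - 1) / R, with R converted to meters."""
    if radius_mm == float("inf"):
        return 0.0  # flat "off" state: zero optical power
    return (n_polymer - 1) / (radius_mm / 1000.0)

# Illustrative index of 1.5: a flat membrane has no power,
# while a 50 mm radius of curvature gives about +10 diopters.
print(surface_power_diopters(1.5, float("inf")))
print(surface_power_diopters(1.5, 50.0))
```

Under these assumed values, a modest membrane deflection (tens of millimeters of curvature radius) is enough to span a 0 to +10 diopter tuning range, which is consistent in order of magnitude with the figures cited later in this description.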


The adjustable-surface lens 100 is well suited for imaging scanners. The adjustable-surface lens 100 can be fabricated using advanced semiconductor manufacturing techniques (i.e., MEMS technology) and therefore can be very cost effective. The adjustable-surface lens 100 is small and can be conveniently integrated within an imaging scanner. Adjusting the optical surface is very fast (e.g., 2 milliseconds) and draws very little power (e.g., 20 milliwatts), allowing for fast acquisition and long battery life.


The adjustable-surface lens 100 has some limitations that must be accommodated in order to use this component for imaging scanner applications. The adjustable-surface lens 100 has a very small clear aperture (e.g., 1.55 millimeters), which, at practical focal lengths, leads to a high f-number (e.g., f/10). The range of optical powers is also limited (e.g., 0 to +10 diopters), resulting in a limited working-distance range (e.g., 10 centimeters to infinity).
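The quoted f/10 figure is consistent with the f-number definition N = f/D. A quick illustrative check (the 15.5 mm focal length is inferred from these two numbers, not stated in the text):

```python
def f_number(focal_mm, aperture_mm):
    """f-number N = focal length / clear-aperture diameter."""
    return focal_mm / aperture_mm

def implied_focal_mm(n, aperture_mm):
    """Focal length implied by a target f-number and aperture."""
    return n * aperture_mm

# A 1.55 mm clear aperture at f/10 implies a ~15.5 mm focal length;
# conversely, a 15.5 mm focal length over 1.55 mm gives ~f/10.
print(round(implied_focal_mm(10, 1.55), 2))
print(round(f_number(15.5, 1.55), 2))
```

The point of the check: with the aperture fixed at about 1.5 mm, the only way to lower the f-number is to shorten the focal length, which is exactly the role the surrounding lens groups play in the sections that follow.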


To overcome the limitation of the adjustable-surface lens's small aperture, other lenses may be added along the optical axis 140. An embodiment of this autofocus lens system is shown in FIG. 4. As depicted, the light from a barcode 55 is focused onto the image sensor 155 by three ideal lenses that help to form the autofocus lens system 50. A first lens 142 and a second lens 144, both with positive power, are positioned on either side of the adjustable-surface lens 100, which has negative power. Both the first and second lenses have apertures larger than the aperture stop of the system (i.e., the adjustable-surface lens 100). The first lens 142 forms a large entrance pupil by magnifying the aperture stop. The larger entrance pupil improves the lens system's throughput. The second lens 144 forms a real image of the object (e.g., barcode) onto the image sensor 155. A focusing-module lens 150 uses adjustable optical power to adjust the focus position along the optical axis 140 so that, regardless of the barcode distance, the focus position coincides with the image sensor 155.


To achieve focus for all ranges, the focusing-module lens 150 must be able to adjust its optical power sufficiently. The three-lens configuration shown in FIG. 4 has excellent throughput but lacks performance when a single adjustable-surface lens 100 is used. In other words, when a single adjustable-surface lens 100 is used, the autofocus lens system 50 cannot accommodate close scans (e.g., 10-centimeter scans). The pupil magnification, while needed for throughput, reduces the adjustable-surface lens's ability to focus on objects in the near field. To extend the focus range requires optical power beyond what a single adjustable-surface lens 100 can provide.
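A rough thin-lens estimate shows why the near field dominates the focusing budget: refocusing from infinity to an object at distance d requires on the order of 1/d diopters of added power. The sketch below ignores the front group's pupil magnification, which increases the power actually demanded of the focusing module and is why a single 0-to-10-diopter surface falls short in this architecture:

```python
def added_power_diopters(object_dist_m):
    """Thin-lens approximation: extra optical power needed to
    refocus from infinity to an object at distance d is ~1/d."""
    return 1.0 / object_dist_m

# Near-field scans dominate the power budget:
for d in (10.0, 1.0, 0.1):  # 10 m, 1 m, 10 cm
    print(f"{d:5.1f} m -> ~{added_power_diopters(d):.1f} D")
```

A 10-meter scan needs only ~0.1 diopter of adjustment, while a 10-centimeter scan needs ~10 diopters, two orders of magnitude more, before any magnification penalty is applied.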


To increase the effective optical power, a focusing-module lens 150 may be formed using two adjustable optical surfaces placed in close proximity. The optical powers of the two adjustable surfaces are additive, so the overall power of the focusing-module lens 150 may be doubled.
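The additivity claim follows from the thin-lens combination formula P = P1 + P2 - d*P1*P2, which reduces to a simple sum when the separation d is negligible (illustrative values below):

```python
def combined_power(p1_diopters, p2_diopters, separation_m=0.0):
    """Combined power of two thin lenses separated by d meters:
    P = P1 + P2 - d * P1 * P2. At d ~ 0 the powers simply add."""
    return (p1_diopters + p2_diopters
            - separation_m * p1_diopters * p2_diopters)

# Two surfaces each tunable 0..+10 D, placed back-to-back:
print(combined_power(10, 10))         # negligible separation
print(combined_power(10, 10, 0.001))  # 1 mm apart: slightly less
```

This is why the "close proximity" of the two surfaces matters: even a 1 mm gap shaves a fraction of a diopter off the combined maximum, and larger gaps would erode the doubling further.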


One exemplary embodiment of the two-adjustable-surface focusing-module lens 150 uses two adjustable-surface lenses 100 placed in close proximity (e.g., back-to-back) as shown in FIG. 5. The back-to-back adjustable-surface lenses form an effective lens with two adjustable polymeric surfaces to control the optical power (i.e., instead of just one). An advantage of this embodiment is that the adjustable-surface lenses 100 have already been reduced to practice and are commercially available, such as from poLight AS. In this regard, this application incorporates entirely by reference U.S. Pat. No. 8,045,280 (poLight AS).


Alternatively, a focusing-module lens 150 with two adjustable surfaces integrated into a single device can be utilized. As shown in FIG. 6, this alternative embodiment includes a flexible polymer 130 sandwiched between two transparent deformable membranes 110, each with its own ring-shaped piezoelectric film 120. Each transparent deformable membrane 110 can be fabricated from glass, quartz, and/or sapphire. This embodiment would allow for the same focusing power as the first embodiment while offering greater simplicity and compactness.


In both embodiments, the electronically controlled optical powers of the adjustable polymeric surfaces sum, thereby forming a focusing-module lens 150 with a larger available optical-power range (e.g., 0-20 diopters). When this two-surface focusing-module lens 150 is used in the three-lens configuration shown in FIG. 4, the higher optical-power range and the pupil magnification combine to form an autofocus lens system 50 with an excellent focus range and a small f-number. Such an autofocus lens system is well suited for imaging scanner applications.


A practical embodiment of the autofocus lens system 50 is shown in FIG. 7. Here, two positive lens groups 142, 144, including a plurality of lenses, are fixedly positioned along the optical axis 140. A focusing-module lens 150, which includes two adjustable-surface lenses 100, is fixed between the two lens groups. No moving parts along the optical axis are required for focusing. A voltage applied to each adjustable surface's ring-shaped piezoelectric film 120 is enough to focus the barcode onto the image sensor 155.


The resulting autofocus lens system 50 is smaller, faster, more power-efficient, and more cost-effective than other mechanically-tuned autofocus lenses for imaging scanners. The autofocus lens system 50, based on the architecture described here, can focus on indicia in both the near field (e.g., 10 centimeters) and the far field (e.g., 10 meters or greater).


The possible applications for this autofocus lens system 50 need not be limited to imaging scanners. Any application requiring a narrow field of view (e.g., about 10 degrees to 15 degrees) and a wide focus range (e.g., 10 centimeters to infinity) could benefit from this lens configuration. For example, applications like license-plate imaging or long-range facial recognition would be suitable for this type of lens.


This application incorporates entirely by reference the commonly assigned U.S. Pat. No. 7,296,749 for Autofocus Barcode Scanner and the Like Employing Micro-Fluidic Lens; U.S. Pat. No. 8,328,099 for Auto-focusing Method for an Automatic Data Collection Device, such as an Image Acquisition Device; U.S. Pat. No. 8,245,936 for Dynamic Focus Calibration, such as Dynamic Focus Calibration using an Open-Loop System in a Bar Code Scanner; and U.S. Pat. No. 8,531,790 for Linear Actuator Assemblies and Methods of Making the Same.


To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:

  • U.S. Pat. No. 6,832,725; U.S. Pat. No. 7,128,266;
  • U.S. Pat. No. 7,159,783; U.S. Pat. No. 7,413,127;
  • U.S. Pat. No. 7,726,575; U.S. Pat. No. 8,294,969;
  • U.S. Pat. No. 8,317,105; U.S. Pat. No. 8,322,622;
  • U.S. Pat. No. 8,366,005; U.S. Pat. No. 8,371,507;
  • U.S. Pat. No. 8,376,233; U.S. Pat. No. 8,381,979;
  • U.S. Pat. No. 8,390,909; U.S. Pat. No. 8,408,464;
  • U.S. Pat. No. 8,408,468; U.S. Pat. No. 8,408,469;
  • U.S. Pat. No. 8,424,768; U.S. Pat. No. 8,448,863;
  • U.S. Pat. No. 8,457,013; U.S. Pat. No. 8,459,557;
  • U.S. Pat. No. 8,469,272; U.S. Pat. No. 8,474,712;
  • U.S. Pat. No. 8,479,992; U.S. Pat. No. 8,490,877;
  • U.S. Pat. No. 8,517,271; U.S. Pat. No. 8,523,076;
  • U.S. Pat. No. 8,528,819; U.S. Pat. No. 8,544,737;
  • U.S. Pat. No. 8,548,242; U.S. Pat. No. 8,548,420;
  • U.S. Pat. No. 8,550,335; U.S. Pat. No. 8,550,354;
  • U.S. Pat. No. 8,550,357; U.S. Pat. No. 8,556,174;
  • U.S. Pat. No. 8,556,176; U.S. Pat. No. 8,556,177;
  • U.S. Pat. No. 8,559,767; U.S. Pat. No. 8,559,957;
  • U.S. Pat. No. 8,561,895; U.S. Pat. No. 8,561,903;
  • U.S. Pat. No. 8,561,905; U.S. Pat. No. 8,565,107;
  • U.S. Pat. No. 8,571,307; U.S. Pat. No. 8,579,200;
  • U.S. Pat. No. 8,583,924; U.S. Pat. No. 8,584,945;
  • U.S. Pat. No. 8,587,595; U.S. Pat. No. 8,587,697;
  • U.S. Pat. No. 8,588,869; U.S. Pat. No. 8,590,789;
  • U.S. Pat. No. 8,593,539; U.S. Pat. No. 8,596,542;
  • U.S. Pat. No. 8,596,543; U.S. Pat. No. 8,599,271;
  • U.S. Pat. No. 8,599,957; U.S. Pat. No. 8,600,158;
  • U.S. Pat. No. 8,600,167; U.S. Pat. No. 8,602,309;
  • U.S. Pat. No. 8,608,053; U.S. Pat. No. 8,608,071;
  • U.S. Pat. No. 8,611,309; U.S. Pat. No. 8,615,487;
  • U.S. Pat. No. 8,616,454; U.S. Pat. No. 8,621,123;
  • U.S. Pat. No. 8,622,303; U.S. Pat. No. 8,628,013;
  • U.S. Pat. No. 8,628,015; U.S. Pat. No. 8,628,016;
  • U.S. Pat. No. 8,629,926; U.S. Pat. No. 8,630,491;
  • U.S. Pat. No. 8,635,309; U.S. Pat. No. 8,636,200;
  • U.S. Pat. No. 8,636,212; U.S. Pat. No. 8,636,215;
  • U.S. Pat. No. 8,636,224; U.S. Pat. No. 8,638,806;
  • U.S. Pat. No. 8,640,958; U.S. Pat. No. 8,640,960;
  • U.S. Pat. No. 8,643,717; U.S. Pat. No. 8,646,692;
  • U.S. Pat. No. 8,646,694; U.S. Pat. No. 8,657,200;
  • U.S. Pat. No. 8,659,397; U.S. Pat. No. 8,668,149;
  • U.S. Pat. No. 8,678,285; U.S. Pat. No. 8,678,286;
  • U.S. Pat. No. 8,682,077; U.S. Pat. No. 8,687,282;
  • International Publication No. 2013/163789;
  • International Publication No. 2013/173985;
  • International Publication No. 2014/019130;
  • U.S. Patent Application Publication No. 2008/0185432;
  • U.S. Patent Application Publication No. 2009/0134221;
  • U.S. Patent Application Publication No. 2010/0177080;
  • U.S. Patent Application Publication No. 2010/0177076;
  • U.S. Patent Application Publication No. 2010/0177707;
  • U.S. Patent Application Publication No. 2010/0177749;
  • U.S. Patent Application Publication No. 2011/0169999;
  • U.S. Patent Application Publication No. 2011/0202554;
  • U.S. Patent Application Publication No. 2012/0111946;
  • U.S. Patent Application Publication No. 2012/0138685;
  • U.S. Patent Application Publication No. 2012/0168511;
  • U.S. Patent Application Publication No. 2012/0168512;
  • U.S. Patent Application Publication No. 2012/0193407;
  • U.S. Patent Application Publication No. 2012/0193423;
  • U.S. Patent Application Publication No. 2012/0203647;
  • U.S. Patent Application Publication No. 2012/0223141;
  • U.S. Patent Application Publication No. 2012/0228382;
  • U.S. Patent Application Publication No. 2012/0248188;
  • U.S. Patent Application Publication No. 2013/0043312;
  • U.S. Patent Application Publication No. 2013/0056285;
  • U.S. Patent Application Publication No. 2013/0070322;
  • U.S. Patent Application Publication No. 2013/0075168;
  • U.S. Patent Application Publication No. 2013/0082104;
  • U.S. Patent Application Publication No. 2013/0175341;
  • U.S. Patent Application Publication No. 2013/0175343;
  • U.S. Patent Application Publication No. 2013/0200158;
  • U.S. Patent Application Publication No. 2013/0256418;
  • U.S. Patent Application Publication No. 2013/0257744;
  • U.S. Patent Application Publication No. 2013/0257759;
  • U.S. Patent Application Publication No. 2013/0270346;
  • U.S. Patent Application Publication No. 2013/0278425;
  • U.S. Patent Application Publication No. 2013/0287258;
  • U.S. Patent Application Publication No. 2013/0292474;
  • U.S. Patent Application Publication No. 2013/0292475;
  • U.S. Patent Application Publication No. 2013/0292477;
  • U.S. Patent Application Publication No. 2013/0293539;
  • U.S. Patent Application Publication No. 2013/0293540;
  • U.S. Patent Application Publication No. 2013/0306728;
  • U.S. Patent Application Publication No. 2013/0306730;
  • U.S. Patent Application Publication No. 2013/0306731;
  • U.S. Patent Application Publication No. 2013/0306734;
  • U.S. Patent Application Publication No. 2013/0307964;
  • U.S. Patent Application Publication No. 2013/0313324;
  • U.S. Patent Application Publication No. 2013/0313325;
  • U.S. Patent Application Publication No. 2013/0313326;
  • U.S. Patent Application Publication No. 2013/0327834;
  • U.S. Patent Application Publication No. 2013/0341399;
  • U.S. Patent Application Publication No. 2013/0342717;
  • U.S. Patent Application Publication No. 2014/0001267;
  • U.S. Patent Application Publication No. 2014/0002828;
  • U.S. Patent Application Publication No. 2014/0008430;
  • U.S. Patent Application Publication No. 2014/0008439;
  • U.S. Patent Application Publication No. 2014/0021256;
  • U.S. Patent Application Publication No. 2014/0025584;
  • U.S. Patent Application Publication No. 2014/0027518;
  • U.S. Patent Application Publication No. 2014/0034723;
  • U.S. Patent Application Publication No. 2014/0034734;
  • U.S. Patent Application Publication No. 2014/0036848;
  • U.S. Patent Application Publication No. 2014/0039693;
  • U.S. Patent Application Publication No. 2014/0042814;
  • U.S. Patent Application Publication No. 2014/0049120;
  • U.S. Patent Application Publication No. 2014/0049635;
  • U.S. Patent Application Publication No. 2014/0061305;
  • U.S. Patent Application Publication No. 2014/0061306;
  • U.S. Patent Application Publication No. 2014/0061307;
  • U.S. Patent Application Publication No. 2014/0063289;
  • U.S. Patent Application Publication No. 2014/0066136;
  • U.S. Patent Application Publication No. 2014/0067692;
  • U.S. Patent Application Publication No. 2014/0070005;
  • U.S. Patent Application Publication No. 2014/0071840;
  • U.S. Patent Application Publication No. 2014/0074746;
  • U.S. patent application Ser. No. 13/367,978 for a Laser Scanning Module Employing An Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.);
  • U.S. patent application Ser. No. 13/400,748 for a Laser Scanning Bar Code Symbol Reading System Having Intelligent Scan Sweep Angle Adjustment Capabilities Over The Working Range Of The System For Optimized Bar Code Symbol Reading Performance, filed Feb. 21, 2012 (Wilz);
  • U.S. patent application Ser. No. 13/736,139 for an Electronic Device Enclosure, filed Jan. 8, 2013 (Chaney);
  • U.S. patent application Ser. No. 13/750,304 for Measuring Object Dimensions Using Mobile Computer, filed Jan. 25, 2013;
  • U.S. patent application Ser. No. 13/771,508 for an Optical Redirection Adapter, filed Feb. 20, 2013 (Anderson);
  • U.S. patent application Ser. No. 13/780,158 for a Distraction Avoidance System, filed Feb. 28, 2013 (Sauerwein);
  • U.S. patent application Ser. No. 13/780,196 for Android Bound Service Camera Initialization, filed Feb. 28, 2013 (Todeschini et al.);
  • U.S. patent application Ser. No. 13/780,271 for a Vehicle Computer System with Transparent Display, filed Feb. 28, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 13/780,356 for a Mobile Device Having Object-Identification Interface, filed Feb. 28, 2013 (Samek et al.);
  • U.S. patent application Ser. No. 13/784,933 for an Integrated Dimensioning and Weighing System, filed Mar. 5, 2013 (McCloskey et al.);
  • U.S. patent application Ser. No. 13/785,177 for a Dimensioning System, filed Mar. 5, 2013 (McCloskey et al.);
  • U.S. patent application Ser. No. 13/792,322 for a Replaceable Connector, filed Mar. 11, 2013 (Skvoretz);
  • U.S. patent application Ser. No. 13/852,097 for a System and Method for Capturing and Preserving Vehicle Event Data, filed Mar. 28, 2013 (Barker et al.);
  • U.S. patent application Ser. No. 13/895,846 for a Method of Programming a Symbol Reading System, filed Apr. 10, 2013 (Corcoran);
  • U.S. patent application Ser. No. 13/902,110 for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Hollifield);
  • U.S. patent application Ser. No. 13/902,144, for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Chamberlin);
  • U.S. patent application Ser. No. 13/902,242 for a System For Providing A Continuous Communication Link With A Symbol Reading Device, filed May 24, 2013 (Smith et al.);
  • U.S. patent application Ser. No. 13/912,262 for a Method of Error Correction for 3D Imaging Device, filed Jun. 7, 2013 (Jovanovski et al.);
  • U.S. patent application Ser. No. 13/912,702 for a System and Method for Reading Code Symbols at Long Range Using Source Power Control, filed Jun. 7, 2013 (Xian et al.);
  • U.S. patent application Ser. No. 13/922,339 for a System and Method for Reading Code Symbols Using a Variable Field of View, filed Jun. 20, 2013 (Xian et al.);
  • U.S. patent application Ser. No. 13/927,398 for a Code Symbol Reading System Having Adaptive Autofocus, filed Jun. 26, 2013 (Todeschini);
  • U.S. patent application Ser. No. 13/930,913 for a Mobile Device Having an Improved User Interface for Reading Code Symbols, filed Jun. 28, 2013 (Gelay et al.);
  • U.S. patent application Ser. No. 13/933,415 for an Electronic Device Case, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 13/947,296 for a System and Method for Selectively Reading Code Symbols, filed Jul. 22, 2013 (Rueblinger et al.);
  • U.S. patent application Ser. No. 13/950,544 for a Code Symbol Reading System Having Adjustable Object Detection, filed Jul. 25, 2013 (Jiang);
  • U.S. patent application Ser. No. 13/961,408 for a Method for Manufacturing Laser Scanners, filed Aug. 7, 2013 (Saber et al.);
  • U.S. patent application Ser. No. 13/974,374 for Authenticating Parcel Consignees with Indicia Decoding Devices, filed Aug. 23, 2013 (Ye et al.);
  • U.S. patent application Ser. No. 14/018,729 for a Method for Operating a Laser Scanner, filed Sep. 5, 2013 (Feng et al.);
  • U.S. patent application Ser. No. 14/019,616 for a Device Having Light Source to Reduce Surface Pathogens, filed Sep. 6, 2013 (Todeschini);
  • U.S. patent application Ser. No. 14/023,762 for a Handheld Indicia Reader Having Locking Endcap, filed Sep. 11, 2013 (Gannon);
  • U.S. patent application Ser. No. 14/035,474 for Augmented-Reality Signature Capture, filed Sep. 24, 2013 (Todeschini);
  • U.S. patent application Ser. No. 14/047,896 for Terminal Having Illumination and Exposure Control filed Oct. 7, 2013 (Jovanovski et al.);
  • U.S. patent application Ser. No. 14/050,515 for Hybrid-Type Bioptical, filed Oct. 10, 2013 (Edmonds et al.);
  • U.S. patent application Ser. No. 14/053,175 for Imaging Apparatus Having Imaging Assembly, filed Oct. 14, 2013 (Barber);
  • U.S. patent application Ser. No. 14/055,234 for Dimensioning System, filed Oct. 16, 2013 (Fletcher);
  • U.S. patent application Ser. No. 14/055,353 for Dimensioning System, filed Oct. 16, 2013 (Giordano et al.);
  • U.S. patent application Ser. No. 14/055,383 for Dimensioning System, filed Oct. 16, 2013 (Li et al.);
  • U.S. patent application Ser. No. 14/053,314 for Indicia Reader, filed Oct. 14, 2013 (Huck);
  • U.S. patent application Ser. No. 14/058,762 for Terminal Including Imaging Assembly, filed Oct. 21, 2013 (Gomez et al.);
  • U.S. patent application Ser. No. 14/062,239 for Chip on Board Based Highly Integrated Imager, filed Oct. 24, 2013 (Toa et al.);
  • U.S. patent application Ser. No. 14/065,768 for Hybrid System and Method for Reading Indicia, filed Oct. 29, 2013 (Meier et al.);
  • U.S. patent application Ser. No. 14/074,746 for Self-Checkout Shopping System, filed Nov. 8, 2013 (Hejl et al.);
  • U.S. patent application Ser. No. 14/074,787 for Method and System for Configuring Mobile Devices via NFC Technology, filed Nov. 8, 2013 (Smith et al.);
  • U.S. patent application Ser. No. 14/082,468 for Encoded Information Reading Terminal with Wireless Path Selection Capability, filed Nov. 18, 2013 (Wang et al.);
  • U.S. patent application Ser. No. 14/087,190 for Optimal Range Indicators for Bar Code Validation, filed Nov. 22, 2013 (Hejl);
  • U.S. patent application Ser. No. 14/093,484 for System for Capturing a Document in an Image Signal, filed Dec. 1, 2013 (Showering);
  • U.S. patent application Ser. No. 14/093,487 for Method and System Operative to Process Color Image Data, filed Dec. 1, 2013 (Li et al.);
  • U.S. patent application Ser. No. 14/093,490 for Imaging Terminal Having Image Sensor and Lens Assembly, filed Dec. 1, 2013 (Havens et al.);
  • U.S. patent application Ser. No. 14/093,624 for Apparatus Operative for Capture of Image Data, filed Dec. 2, 2013 (Havens et al.);
  • U.S. patent application Ser. No. 14/094,087 for Method and System for Communicating Information in an Digital Signal, filed Dec. 2, 2013 (Peake et al.);
  • U.S. patent application Ser. No. 14/101,965 for High Dynamic-Range Indicia Reading System, filed Dec. 10, 2013 (Xian);
  • U.S. patent application Ser. No. 14/107,048 for Roaming Encoded Information Reading Terminal, filed Dec. 16, 2013 (Wang et al.);
  • U.S. patent application Ser. No. 14/118,400 for Indicia Decoding Device with Security Lock, filed Nov. 18, 2013 (Liu);
  • U.S. patent application Ser. No. 14/138,206 for System and Method to Store and Retrieve Identifier Associated Information, filed Dec. 23, 2013 (Gomez et al.);
  • U.S. patent application Ser. No. 14/143,399 for Device Management Using Virtual Interfaces, filed Dec. 30, 2013 (Caballero);
  • U.S. patent application Ser. No. 14/147,992 for Decoding Utilizing Image Data, filed Jan. 6, 2014 (Meier et al.);
  • U.S. patent application Ser. No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.);
  • U.S. patent application Ser. No. 14/153,111 for Indicia Reading Terminal Including Frame Quality Evaluation Processing, filed Jan. 13, 2014 (Wang et al.);
  • U.S. patent application Ser. No. 14/153,142 for Imaging Apparatus Comprising Image Sensor Array having Shared Global Shutter Circuitry, filed Jan. 13, 2014 (Wang);
  • U.S. patent application Ser. No. 14/153,182 for System and Method to Manipulate an Image, filed Jan. 13, 2014 (Longacre et al.);
  • U.S. patent application Ser. No. 14/153,213 for Apparatus Comprising Image Sensor Array and Illumination Control, filed Jan. 13, 2014 (Ding);
  • U.S. patent application Ser. No. 14/153,249 for Terminal Operative for Storing Frame of Image Data, filed Jan. 13, 2014 (Winegar);
  • U.S. patent application Ser. No. 14/154,207 for Laser Barcode Scanner, filed Jan. 14, 2014 (Hou et al.);
  • U.S. patent application Ser. No. 14/154,915 for Laser Scanning Module Employing a Laser Scanning Assembly having Elastomeric Wheel Hinges, filed Jan. 14, 2014 (Havens et al.);
  • U.S. patent application Ser. No. 14/158,126 for Methods and Apparatus to Change a Feature Set on Data Collection Devices, filed Jan. 17, 2014 (Berthiaume et al.);
  • U.S. patent application Ser. No. 14/159,074 for Wireless Mesh Point Portable Data Terminal, filed Jan. 20, 2014 (Wang et al.);
  • U.S. patent application Ser. No. 14/159,509 for NMS Text Messaging for Hand Held Indicia Reader, filed Jan. 21, 2014 (Kearney);
  • U.S. patent application Ser. No. 14/159,603 for Decodable Indicia Reading Terminal with Optical Filter, filed Jan. 21, 2014 (Ding et al.);
  • U.S. patent application Ser. No. 14/160,645 for Decodable Indicia Reading Terminal with Indicia Analysis Functionality, filed Jan. 22, 2014 (Nahill et al.);
  • U.S. patent application Ser. No. 14/161,875 for System and Method to Automatically Discriminate Between Different Data Types, filed Jan. 23, 2014 (Wang);
  • U.S. patent application Ser. No. 14/165,980 for System and Method for Measuring Irregular Objects with a Single Camera filed Jan. 28, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/166,103 for Indicia Reading Terminal Including Optical Filter filed Jan. 28, 2014 (Lu et al.);
  • U.S. patent application Ser. No. 14/176,417 for Devices and Methods Employing Dual Target Auto Exposure filed Feb. 10, 2014 (Meier et al.);
  • U.S. patent application Ser. No. 14/187,485 for Indicia Reading Terminal with Color Frame Processing filed Feb. 24, 2014 (Ren et al.);
  • U.S. patent application Ser. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/342,544 for Imaging Based Barcode Scanner Engine with Multiple Elements Supported on a Common Printed Circuit Board filed Mar. 4, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/342,551 for Terminal Having Image Data Format Conversion filed Mar. 4, 2014 (Lui et al.); and
  • U.S. patent application Ser. No. 14/345,735 for Optical Indicia Reading Terminal with Combined Illumination filed Mar. 19, 2014 (Ouyang).


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A method of manufacturing an adjustable-surface lens of an autofocus lens system, the method comprising: providing a first transparent deformable membrane having a ring-shaped piezoelectric film contiguously positioned on the first transparent deformable membrane's outer surface; providing a second transparent deformable membrane having a ring-shaped piezoelectric film contiguously positioned on the second transparent deformable membrane's outer surface; and contiguously positioning a flexible polymer between the first transparent deformable membrane and the second transparent deformable membrane, wherein the flexible polymer is attached to the respective inner surfaces of the first transparent deformable membrane and the second transparent deformable membrane.
  • 2. The method of claim 1, wherein the first transparent deformable membrane and the second transparent deformable membrane each comprises glass, quartz, or sapphire.
  • 3. The method of claim 1, comprising providing a first positive lens fixedly positioned along an optical axis.
  • 4. The method of claim 3, wherein the first lens is a lens group comprising a plurality of lenses.
  • 5. The method of claim 3, comprising providing a second positive lens fixedly positioned along the optical axis.
  • 6. The method of claim 5, wherein the second lens is a lens group comprising a plurality of lenses.
  • 7. The method of claim 1, comprising: providing a first positive lens fixedly positioned along an optical axis; providing a second positive lens fixedly positioned along the optical axis; and positioning the adjustable-surface lens between the first positive lens and the second positive lens along the optical axis.
  • 8. The method of claim 1, comprising providing electrical connection on the ring-shaped piezoelectric films.
  • 9. A method of manufacturing an adjustable-surface lens of an autofocus lens system, the method comprising: attaching contiguously a ring-shaped piezoelectric film to a transparent deformable membrane on the transparent deformable membrane's outer surface; attaching contiguously a flexible transparent polymeric material to an inner surface of the transparent deformable membrane; and providing a rigid support contiguous to the flexible transparent polymeric material and opposite to the transparent deformable membrane.
  • 10. The method of claim 9, wherein the transparent deformable membrane comprises glass, quartz, or sapphire.
  • 11. The method of claim 9, comprising providing a first positive lens fixedly positioned along an optical axis.
  • 12. The method of claim 11, wherein the first lens is a lens group comprising a plurality of lenses.
  • 13. The method of claim 11, comprising providing a second positive lens fixedly positioned along the optical axis.
  • 14. The method of claim 13, wherein the second lens is a lens group comprising a plurality of lenses.
  • 15. The method of claim 9, comprising providing electrical connection on the ring-shaped piezoelectric film.
  • 16. A focusing-module lens, comprising: two adjustable surfaces, wherein the optical power of the adjustable surfaces is controlled electronically to achieve focus; a first transparent deformable membrane having a ring-shaped piezoelectric film contiguously positioned on the first transparent deformable membrane's outer surface; a second transparent deformable membrane having a ring-shaped piezoelectric film contiguously positioned on the second transparent deformable membrane's outer surface; and a flexible polymer contiguously positioned between the first transparent deformable membrane and the second transparent deformable membrane, whereby the flexible polymer is in contact with an inner surface of the first transparent deformable membrane and an inner surface of the second transparent deformable membrane.
  • 17. The focusing-module lens of claim 16, wherein the focusing-module lens has a clear aperture with a diameter of between 1.3 millimeters and 1.7 millimeters.
  • 18. The focusing-module lens of claim 16, wherein the first transparent deformable membrane and the second transparent deformable membrane each comprise glass, quartz, or sapphire.
  • 19. The focusing-module lens of claim 16, wherein the focusing-module lens has a focusing response time of 2 milliseconds or less.
  • 20. The focusing-module lens of claim 16, wherein the focusing-module lens consumes 20 milliwatts or less.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims the benefit of U.S. patent application Ser. No. 15/440,357 for an Autofocus Lens System filed Feb. 23, 2017 (and published Jun. 8, 2017 as U.S. Patent Application Publication No. 2017/0160441), now U.S. Pat. No. 9,952,356, which claims the benefit of U.S. patent application Ser. No. 14/979,818 for an Autofocus Lens System filed Dec. 28, 2015 (and published May 12, 2016 as U.S. Patent Application Publication No. 2016/0131894), now U.S. Pat. No. 9,581,809, which claims the benefit of U.S. patent application Ser. No. 14/264,173 for an Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (and published Oct. 29, 2015 as U.S. Patent Application Publication No. 2015/0310243), now U.S. Pat. No. 9,224,022. Each of the foregoing patent applications, patent publications, and patents is hereby incorporated by reference in its entirety.

US Referenced Citations (635)
Number Name Date Kind
6832725 Gardiner et al. Dec 2004 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7296749 Massieu Nov 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
7813047 Wang et al. Oct 2010 B2
8045280 Henriksen et al. Oct 2011 B2
8294969 Plesko Oct 2012 B2
8305691 Havens et al. Nov 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8366002 Wang et al. Feb 2013 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8505822 Wang et al. Aug 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8531790 Stang et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9061527 Tobin et al. Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9076459 Braho et al. Jul 2015 B2
9079423 Bouverie et al. Jul 2015 B2
9080856 Laffargue Jul 2015 B2
9082023 Feng et al. Jul 2015 B2
9084032 Rautiola et al. Jul 2015 B2
9087250 Coyle Jul 2015 B2
9092681 Havens et al. Jul 2015 B2
9092682 Wilz et al. Jul 2015 B2
9092683 Koziol et al. Jul 2015 B2
9093141 Liu Jul 2015 B2
9098763 Lu et al. Aug 2015 B2
9104929 Todeschini Aug 2015 B2
9104934 Li et al. Aug 2015 B2
9107484 Chaney Aug 2015 B2
9111159 Liu et al. Aug 2015 B2
9111166 Cunningham Aug 2015 B2
9135483 Liu et al. Sep 2015 B2
9137009 Gardiner Sep 2015 B1
9141839 Xian et al. Sep 2015 B2
9147096 Wang Sep 2015 B2
9148474 Skvoretz Sep 2015 B2
9158000 Sauerwein Oct 2015 B2
9158340 Reed et al. Oct 2015 B2
9158953 Gillet et al. Oct 2015 B2
9159059 Daddabbo et al. Oct 2015 B2
9165174 Huck Oct 2015 B2
9171543 Emerick et al. Oct 2015 B2
9183425 Wang Nov 2015 B2
9189669 Zhu et al. Nov 2015 B2
9195844 Todeschini et al. Nov 2015 B2
9202458 Braho et al. Dec 2015 B2
9208366 Liu Dec 2015 B2
9208367 Wang Dec 2015 B2
9219836 Bouverie et al. Dec 2015 B2
9224022 Ackley Dec 2015 B2
9224024 Bremer et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9235553 Fitch et al. Jan 2016 B2
9239950 Fletcher Jan 2016 B2
9245492 Ackley et al. Jan 2016 B2
9443123 Hejl Jan 2016 B2
9248640 Heng Feb 2016 B2
9250652 London et al. Feb 2016 B2
9250712 Todeschini Feb 2016 B1
9251411 Todeschini Feb 2016 B2
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9262660 Lu et al. Feb 2016 B2
9262662 Chen et al. Feb 2016 B2
9269036 Bremer Feb 2016 B2
9270782 Hala et al. Feb 2016 B2
9274812 Doren et al. Mar 2016 B2
9275388 Havens et al. Mar 2016 B2
9277668 Feng et al. Mar 2016 B2
9280693 Feng et al. Mar 2016 B2
9286496 Smith Mar 2016 B2
9297900 Jiang Mar 2016 B2
9298964 Li et al. Mar 2016 B2
9301427 Feng et al. Mar 2016 B2
9304376 Anderson Apr 2016 B2
9310609 Rueblinger et al. Apr 2016 B2
9313377 Todeschini et al. Apr 2016 B2
9317037 Byford et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342723 Liu et al. May 2016 B2
9342724 McCloskey May 2016 B2
9361882 Ressler et al. Jun 2016 B2
9365381 Colonel et al. Jun 2016 B2
9373018 Colavito et al. Jun 2016 B2
9375945 Bowles Jun 2016 B1
9378403 Wang et al. Jun 2016 B2
D760719 Zhou et al. Jul 2016 S
9360304 Chang et al. Jul 2016 B2
9383848 Daghigh Jul 2016 B2
9384374 Bianconi Jul 2016 B2
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
9411386 Sauerwein Aug 2016 B2
9412242 Van Horn et al. Aug 2016 B2
9418269 Havens et al. Aug 2016 B2
9418270 Van Volkinburg et al. Aug 2016 B2
9423318 Lui et al. Aug 2016 B2
D766244 Zhou et al. Sep 2016 S
9443222 Singel et al. Sep 2016 B2
9454689 McCloskey et al. Sep 2016 B2
9464885 Lloyd et al. Oct 2016 B2
9465967 Xian et al. Oct 2016 B2
9478113 Xie et al. Oct 2016 B2
9478983 Kather et al. Oct 2016 B2
D771631 Fitch et al. Nov 2016 S
9481186 Bouverie et al. Nov 2016 B2
9481809 Carney et al. Nov 2016 B2
9488986 Solanki Nov 2016 B1
9489782 Payne et al. Nov 2016 B2
9490540 Davies et al. Nov 2016 B1
9491729 Rautiola et al. Nov 2016 B2
9497092 Gomez et al. Nov 2016 B2
9507974 Todeschini Nov 2016 B1
9519814 Cudzilo Dec 2016 B2
9521331 Bessettes et al. Dec 2016 B2
9530038 Xian et al. Dec 2016 B2
D777166 Bidwell et al. Jan 2017 S
9558386 Yeakley Jan 2017 B2
9572901 Todeschini Feb 2017 B2
9581809 Ackley Feb 2017 B2
9606581 Howe et al. Mar 2017 B1
D783601 Schulte et al. Apr 2017 S
D785617 Bidwell et al. May 2017 S
D785636 Oberpriller et al. May 2017 S
9646189 Lu et al. May 2017 B2
9646191 Unemyr et al. May 2017 B2
9652648 Ackley et al. May 2017 B2
9652653 Todeschini et al. May 2017 B2
9656487 Ho et al. May 2017 B2
9659198 Giordano et al. May 2017 B2
D790505 Vargo et al. Jun 2017 S
D790546 Zhou et al. Jun 2017 S
9680282 Hanenburg Jun 2017 B2
9697401 Feng et al. Jul 2017 B2
9701140 Alaganchetty et al. Jul 2017 B1
20070063048 Havens et al. Mar 2007 A1
20080144185 Wang et al. Jun 2008 A1
20080144186 Feng et al. Jun 2008 A1
20090072037 Good et al. Mar 2009 A1
20090134221 Zhu et al. May 2009 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100276491 Havens et al. Nov 2010 A1
20100276492 Wang et al. Nov 2010 A1
20100276493 Havens et al. Nov 2010 A1
20110149409 Haugholt et al. Jun 2011 A1
20110157675 Heim et al. Jun 2011 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20110304927 Margolis Dec 2011 A1
20120111946 Golant May 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20120248195 Feng et al. Oct 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130332524 Fiala et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140100813 Showering Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140049683 Guenter et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104696 Moreau et al. Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140194750 Lee et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150310243 Ackley Oct 2015 A1
20150310389 Crimm et al. Oct 2015 A1
20150327012 Bian et al. Nov 2015 A1
20160014251 Hejl Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160062473 Bouchat et al. Mar 2016 A1
20160092805 Geisler et al. Mar 2016 A1
20160101936 Chamberlin Apr 2016 A1
20160102975 McCloskey et al. Apr 2016 A1
20160104019 Todeschini et al. Apr 2016 A1
20160104274 Jovanovski et al. Apr 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160117627 Raj et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160131894 Ackley May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171597 Todeschini Jun 2016 A1
20160171666 McCloskey Jun 2016 A1
20160171720 Todeschini Jun 2016 A1
20160171775 Todeschini et al. Jun 2016 A1
20160171777 Todeschini et al. Jun 2016 A1
20160174674 Oberpriller et al. Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160178685 Young et al. Jun 2016 A1
20160178707 Young et al. Jun 2016 A1
20160179132 Harr et al. Jun 2016 A1
20160179143 Bidwell et al. Jun 2016 A1
20160179368 Roeder Jun 2016 A1
20160179378 Kent et al. Jun 2016 A1
20160180130 Bremer Jun 2016 A1
20160180133 Oberpriller et al. Jun 2016 A1
20160180136 Meier et al. Jun 2016 A1
20160180594 Todeschini Jun 2016 A1
20160180663 McMahan et al. Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160180713 Bernhardt et al. Jun 2016 A1
20160185136 Ng et al. Jun 2016 A1
20160185291 Chamberlin Jun 2016 A1
20160186926 Oberpriller et al. Jun 2016 A1
20160188861 Todeschini Jun 2016 A1
20160188939 Sailors et al. Jun 2016 A1
20160188940 Lu et al. Jun 2016 A1
20160188941 Todeschini et al. Jun 2016 A1
20160188942 Good et al. Jun 2016 A1
20160188943 Linwood Jun 2016 A1
20160188944 Wilz et al. Jun 2016 A1
20160189076 Mellott et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160189088 Pecorari et al. Jun 2016 A1
20160189092 George et al. Jun 2016 A1
20160189284 Mellott et al. Jun 2016 A1
20160189288 Todeschini Jun 2016 A1
20160189366 Chamberlin et al. Jun 2016 A1
20160189443 Smith Jun 2016 A1
20160189447 Valenzuela Jun 2016 A1
20160189489 Au et al. Jun 2016 A1
20160191684 DiPiazza et al. Jun 2016 A1
20160192051 DiPiazza et al. Jun 2016 A1
20160125873 Braho et al. Jul 2016 A1
20160202951 Pike et al. Jul 2016 A1
20160202958 Zabel et al. Jul 2016 A1
20160202959 Doubleday et al. Jul 2016 A1
20160203021 Pike et al. Jul 2016 A1
20160203429 Mellott et al. Jul 2016 A1
20160203797 Pike et al. Jul 2016 A1
20160203820 Zabel et al. Jul 2016 A1
20160204623 Haggert et al. Jul 2016 A1
20160204636 Allen et al. Jul 2016 A1
20160204638 Miraglia et al. Jul 2016 A1
20160316190 McCloskey et al. Jul 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20160323310 Todeschini et al. Nov 2016 A1
20160325677 Fitch et al. Nov 2016 A1
20160327614 Young et al. Nov 2016 A1
20160327930 Charpentier et al. Nov 2016 A1
20160328762 Pape Nov 2016 A1
20160330218 Hussey et al. Nov 2016 A1
20160343163 Venkatesha et al. Nov 2016 A1
20160343176 Ackley Nov 2016 A1
20160364914 Todeschini Dec 2016 A1
20160370220 Ackley et al. Dec 2016 A1
20160372282 Bandringa Dec 2016 A1
20160373847 Vargo et al. Dec 2016 A1
20160377414 Thuries et al. Dec 2016 A1
20160377417 Jovanovski et al. Dec 2016 A1
20170010141 Ackley Jan 2017 A1
20170010328 Mullen et al. Jan 2017 A1
20170010780 Waldron et al. Jan 2017 A1
20170016714 Laffargue et al. Jan 2017 A1
20170018094 Todeschini Jan 2017 A1
20170046603 Lee et al. Feb 2017 A1
20170047864 Stang et al. Feb 2017 A1
20170053146 Liu et al. Feb 2017 A1
20170053147 Geramine et al. Feb 2017 A1
20170053647 Nichols et al. Feb 2017 A1
20170055606 Xu et al. Mar 2017 A1
20170060316 Larson Mar 2017 A1
20170061961 Nichols et al. Mar 2017 A1
20170064634 Van Horn et al. Mar 2017 A1
20170083730 Feng et al. Mar 2017 A1
20170091502 Furlong et al. Mar 2017 A1
20170091706 Lloyd et al. Mar 2017 A1
20170091741 Todeschini Mar 2017 A1
20170091904 Ventress Mar 2017 A1
20170092908 Chaney Mar 2017 A1
20170094238 Germaine et al. Mar 2017 A1
20170098947 Wolski Apr 2017 A1
20170100949 Celinder et al. Apr 2017 A1
20170108838 Todeschini et al. Apr 2017 A1
20170108895 Chamberlin et al. Apr 2017 A1
20170118355 Wong et al. Apr 2017 A1
20170123598 Phan et al. May 2017 A1
20170124369 Rueblinger et al. May 2017 A1
20170124396 Todeschini et al. May 2017 A1
20170124687 McCloskey et al. May 2017 A1
20170126873 McGary et al. May 2017 A1
20170126904 d'Armancourt et al. May 2017 A1
20170139012 Smith May 2017 A1
20170140329 Bernhardt et al. May 2017 A1
20170140731 Smith May 2017 A1
20170147847 Berggren et al. May 2017 A1
20170150124 Thuries May 2017 A1
20170160441 Ackley Jun 2017 A1
20170169198 Nichols Jun 2017 A1
20170171035 Lu et al. Jun 2017 A1
20170171703 Maheswaranathan Jun 2017 A1
20170171803 Maheswaranathan Jun 2017 A1
20170180359 Wolski et al. Jun 2017 A1
20170180577 Nguon et al. Jun 2017 A1
20170181299 Shi et al. Jun 2017 A1
20170190192 Delario et al. Jul 2017 A1
20170193432 Bernhardt Jul 2017 A1
20170193461 Jonas et al. Jul 2017 A1
20170193727 Van Horn et al. Jul 2017 A1
20170200108 Au et al. Jul 2017 A1
20170200275 McCloskey et al. Jul 2017 A1
Foreign Referenced Citations (1)
Number Date Country
2013163789 Nov 2013 WO
Non-Patent Literature Citations (10)
Entry
US 9,952,356, 04/2018, Ackley (withdrawn)
European Exam Report in related EP Application No. 15163293.2, dated Mar. 28, 2018, 7 pages [Only new art cited herein].
Office Action in related European Application No. 15163293.2 dated Jun. 22, 2017, pp. 1-5.
poLight, “poLight; The Optical MEMS Company”, Mar. 2012, 34 pages; Previously submitted in Parent Application.
Intermec, “EX25 Near/Far 2D Imager Engine 3rd Generation” Website Product Profile and Specifications, Copyright 2013 Honeywell International Inc. 61187-D 12/13, 2 pages; Previously submitted in Parent Application.
Intermec, “The 2D Revolution; How evolving business needs and improved technology are driving explosive growth in two-dimensional bar coding”, White Paper, Copyright 2007 Intermec Technologies Corporation 611838-01A 06/07, 7 pages; Previously submitted in Parent Application.
Intermec, “Guide to Scanning Technologies”, White Paper, Copyright 2007 Intermec Technologies Corporation 609107-010 03/07, 8 pages; Previously submitted in Parent Application.
Intermec, “Imaging Moves into the Mainstream; Why 2D Imagers are Surpassing Laser Scanners for Bar Code Application”, White Paper, Copyright 2011 Intermec Technologies Corporation 612138-02A 07/11, 7 pages; Previously submitted in Parent Application.
Jon H. Ulvensoen, poLight AS, “New Micro technology provides the functionality to mobile phone cameras”, All rights reserved poLight AS, www.polight.com, 2011, 20 pages; Previously submitted in Parent Application.
Extended Search report in related EP Application No. 15163293.2, dated Sep. 18, 2015, 7 pages; Previously submitted in Parent Application.
Related Publications (1)
Number Date Country
20180239063 A1 Aug 2018 US
Continuations (3)
Number Date Country
Parent 15440357 Feb 2017 US
Child 15953762 US
Parent 14979818 Dec 2015 US
Child 15440357 US
Parent 14264173 Apr 2014 US
Child 14979818 US