Autofocus lens system

Information

  • Patent Number
    10,073,197
  • Date Filed
    Thursday, February 23, 2017
  • Date Issued
    Tuesday, September 11, 2018
Abstract
An autofocus lens system includes no conventional moving parts and has excellent speed and low power consumption. The system includes a small electronically-controlled focusing-module lens. The focusing-module lens includes two adjustable polymeric surfaces (e.g., two adjustable-surface lenses in a back-to-back configuration). The curvature of the surfaces can be adjusted to change focus. The performance of the autofocus lens system is extended by adding a conventional first and second lens, or lens group, on either side of the focusing-module lens. What results is an autofocus lens system with excellent near field and far field performance.
Description
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims the benefit of U.S. patent application Ser. No. 14/979,818 for an Autofocus Lens System filed Dec. 28, 2015 (and published May 12, 2016 as U.S. Patent Publication No. 2016/0131894), now U.S. Pat. No. 9,581,809, which claims the benefit of U.S. patent application Ser. No. 14/264,173 for an Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (and published Oct. 29, 2015 as U.S. Patent Publication No. 2015/0310243), now U.S. Pat. No. 9,224,022. Each of the foregoing patent applications, patent publications, and patents is hereby incorporated by reference in its entirety.


FIELD OF THE INVENTION

The present invention relates to indicia readers and, more particularly, to an autofocus optical system that extends the range of distances at which indicia may be read.


BACKGROUND

Indicia readers (e.g., barcode scanners, OCR scanners) fall into two main classes based on their barcode-reading technology, namely (i) linear scanners (e.g., laser scanners, 1D imagers) and (ii) 2D scanners (e.g., 2D imagers, page scanners).


Laser scanners use fast moving mirrors to sweep a laser beam across a linear barcode. The bars and spaces of the barcode are recognized based on their respective reflectivity. In other words, the light areas and dark areas of the barcode reflect light back toward the scanner differently. This difference can be sensed by the scanner's photo-detector (e.g., photodiode) and converted into an electronic signal suitable for decoding.


Imaging scanners were developed to read advanced codes by adapting technology used in digital cameras. Imaging scanners take a picture of the entire barcode, and a processor running image processing algorithms recognizes and decodes the barcode. This digital approach overcomes many of the laser scanner's limitations.


Imaging scanners are more reliable than laser scanners, which use fast-moving parts. Imaging scanners can be configured to process all barcodes within a field of view and do not require separate scans for each barcode. Sophisticated decoding algorithms eliminate the need to align the imaging scanner with the barcode. Imaging scanners can also scan poor quality or damaged barcodes faster and more reliably than laser scanners. Further, the imaging scanner is more versatile and can be configured to address new codes or new modes of operation, such as document-capture. In view of these advantages, many users prefer the imaging scanner. The imaging scanner, however, lacks the extended scan range associated with laser scanners.


Extended scan ranges are important in warehouse environments, where barcoded containers may be stacked on high shelves. Operators may be limited in their access to barcodes and must scan over a range of distances. In these situations, scanning ranges can be 10 centimeters to 10 meters. This multi-order-of-magnitude range requirement places stringent demands on the imaging scanner.


The range of imaging scanners is limited by the scanner's imaging optics (e.g., lens). The quality of a barcode image is crucial for proper scans. An unfocused image can render a barcode unreadable.


The range of distances over which a barcode can be decoded is known as the working-distance range. In fixed-lens systems (i.e., no moving parts), this working-distance range is the distance between the nearest focused objects and the farthest focused objects within the field of view (i.e., depth of field). The depth of field is related to the lens's f-number. A lens with a high f-number has a large depth of field. High f-number lenses, however, collect less light. Imaging scanners must collect sufficient light to prevent noisy images. These scanners, therefore, need a lens with both a low f-number and the ability to produce sharp images over a wide range of working distances. Fixed lenses, therefore, are not used for imaging scanners intended for extended range applications (e.g., warehouses).
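The f-number/depth-of-field tradeoff described above can be made concrete with the standard thin-lens depth-of-field formulas. The focal length, subject distance, and circle-of-confusion values below are illustrative assumptions, not figures from this patent:

```python
def hyperfocal(f_mm, f_number, coc_mm=0.005):
    """Hyperfocal distance (mm) for a thin lens: H = f^2/(N*c) + f."""
    return f_mm ** 2 / (f_number * coc_mm) + f_mm

def depth_of_field(f_mm, f_number, subject_mm, coc_mm=0.005):
    """Near and far limits (mm) of acceptable focus around subject_mm."""
    H = hyperfocal(f_mm, f_number, coc_mm)
    near = subject_mm * (H - f_mm) / (H + subject_mm - 2 * f_mm)
    far = (subject_mm * (H - f_mm) / (H - subject_mm)
           if subject_mm < H else float("inf"))
    return near, far

# A higher f-number widens the depth of field for the same lens,
# at the cost of collecting less light:
for N in (2.8, 7.0):
    near, far = depth_of_field(f_mm=6.0, f_number=N, subject_mm=500.0)
    print(f"f/{N}: sharp from {near:.0f} mm to {far:.0f} mm")
```

Running the sketch shows the f/7 setting keeping a substantially deeper zone in focus than f/2.8, which is exactly the tension the text describes: the high-f-number lens has depth of field but starves the sensor of light.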


Autofocus (i.e., AF) lenses may be used in imaging scanners that need both near and far scanning capabilities. Typically, focus is achieved in an autofocus lens by mechanically moving the lens. These mechanically-tuned autofocus lenses provide range to imaging scanners but also have some limitations.


The moving parts in mechanical autofocus lens systems may have reliability issues. The mechanical autofocus lens systems can be bulky because of the extra components required for motion (e.g., actuators, tracks, and linkages). These motion components also consume power at a rate that may limit their compatibility with battery-powered scanners. The mechanical motion of the lens or lenses can be slow and may hinder their use in applications that require fast focus (e.g., scanning in moving environments). Finally, the cost of these mechanical autofocus lens systems can be high because of the number and precision of the required mechanical parts.


Therefore, a need exists for an imaging-scanner autofocus lens system that has (i) a large focus range, (ii) a small size, (iii) low power consumption, and (iv) reduced mechanical complexity.


SUMMARY

Accordingly, in one aspect, the present invention embraces an autofocus lens system for an imaging scanner. The autofocus lens system uses a first lens (or lens group including a plurality of lenses), a second lens (or lens group including a plurality of lenses), and a focusing-module lens to focus a barcode onto an image sensor. The first lens is fixedly positioned along an optical axis. The second lens is fixedly positioned along the optical axis. The focusing-module lens is fixedly positioned along the optical axis between the first and second lenses. The lenses together create a real image of indicia. The focusing-module lens is used to change the focus of the autofocus lens system. Focus is adjusted by adjusting the optical power of the focusing-module lens. The optical power of the focusing-module lens is controlled by electronically adjusting the curvature of two adjustable surfaces.


In an exemplary embodiment, the autofocus lens system has a focusing-module lens with a clear aperture diameter that is smaller than the diameter of either the first lens or the second lens. The focusing-module lens defines the aperture stop for the autofocus lens system.


In another exemplary embodiment, this focusing-module lens has a diameter between 1.3 and 1.7 millimeters (e.g., about 1.5 millimeters).


In another exemplary embodiment, the autofocus lens system has a working distance of 10 centimeters or greater.


In another exemplary embodiment, the autofocus lens system has a response time of 2 milliseconds or less (e.g., less than about 1 millisecond).


In another exemplary embodiment, the autofocus lens system consumes 20 milliwatts of power or less.


In another exemplary embodiment, the autofocus lens system has an f-number of 7 or less.


In another exemplary embodiment, the focusing-module lens includes two adjustable-surface lenses positioned in close proximity to one another.


In still another exemplary embodiment, the focusing-module lens may include two contiguous adjustable surfaces. Here, the focusing-module lens includes (i) a first transparent deformable membrane having a ring-shaped piezoelectric film contiguously positioned on the first transparent deformable membrane's outer surface, (ii) a second transparent deformable membrane having a ring-shaped piezoelectric film contiguously positioned on the second transparent deformable membrane's outer surface, and (iii) a flexible polymer contiguously positioned between the first transparent deformable membrane and the second transparent deformable membrane. In this way, the flexible polymer is in contact with the inner surfaces of both the first transparent deformable membrane and the second transparent deformable membrane. The transparent deformable membrane can be fabricated from glass, quartz, sapphire or other semi-rigid transparent material. The two adjustable surfaces of the focusing-module lens may, in some embodiments, be electronically controlled independently.


In another aspect, the present invention embraces an active autofocus system for an imaging scanner including the foregoing autofocus lens system. In this active autofocus system, a range finder senses the range of a barcode through transmitted and received radiation and creates a range signal representing the sensed range. A processor generates a control signal based on the comparison of the range signal with a lookup table stored in memory. The lookup table contains focus settings associated with various range-signal values. A controller responds to the control signal by creating electronic autofocus signals to adjust the autofocus lens system to achieve focus of a real image of a 1D or 2D barcode.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 schematically depicts a block diagram of an active autofocus system.



FIG. 2 graphically depicts a perspective view of a cutaway layout of an adjustable-surface lens.



FIG. 3a graphically depicts a side-view cross-section of an adjustable-surface lens with no voltage applied (i.e., the “off” state).



FIG. 3b graphically depicts a side-view cross-section of an adjustable-surface lens with a voltage applied (i.e., the “on” state).



FIG. 4 schematically depicts an embodiment of an autofocus lens system.



FIG. 5 graphically depicts a first embodiment of a focusing-module lens with two adjustable-surface lenses in close proximity.



FIG. 6 graphically depicts a second embodiment of a focusing-module lens.



FIG. 7 schematically depicts an exemplary embodiment of an autofocus lens system.





DETAILED DESCRIPTION

The present invention embraces an autofocus lens system for an imaging scanner that extends the range of distances over which barcodes may be read. In this regard, an exemplary autofocus lens system for an imaging scanner includes (i) a first lens (e.g., first positive lens), fixedly positioned along an optical axis, (ii) a second lens (e.g., second positive lens) for creating a real image of a barcode, the second lens fixedly positioned along the optical axis, and (iii) a focusing-module lens fixedly positioned along the optical axis between the first lens and the second lens, the focusing-module lens formed from two adjustable surfaces, wherein the optical power of the adjustable surfaces is controlled electronically to achieve focus.


Imaging scanners require good images to properly decode barcodes. The image sensors used in these devices deliver high-quality images only when the light impinging on the sensor (i) is focused and (ii) has an intensity above the sensor's noise level (i.e., a high signal-to-noise ratio).


To achieve high signal-to-noise ratio (SNR) images, the lens of an imaging scanner should gather light efficiently (i.e., have a high throughput). The entrance pupil is the image of the lens system's aperture stop as seen through the front lens and is an indicator of the throughput. A large entrance pupil implies that the lens system will have high throughput. Long-range scans are especially susceptible to low-SNR images because of path loss: the light reflected from the barcode spreads out over long ranges, and less of it is captured by the scanner's imaging lens. Illumination sources may help to improve the SNR of images, but a high-throughput imaging lens is still extremely important.


Imaging scanners used in diverse environments (e.g., warehouses) should read barcodes at various ranges in both the near field and far field (e.g., 10 centimeters to 10 meters). In other words, the imaging scanner's lens must be able to create sharp barcode images over a wide range of working distances.


Fixed-focus lenses, with no focusing motion, are not used in imaging scanners requiring wide scanning ranges. These lenses typically have low throughput and may not be able to focus on a barcode that is close to the scanner.


Lenses with focusing motion can extend the scanner's working-distance range. The focusing motion moves the lens to a point where the light rays from a barcode converge onto the image sensor and produce a sharp real image. While this focusing movement can be accomplished manually, it is more practical for scanners to use an automatic-focus (i.e., autofocus) system.


Autofocus systems use (i) optimal focus information and (ii) a positioner (e.g., an actuator or piezoelectric element) to position the real image. A passive autofocus system might use a processor running image-processing algorithms to determine the focus quality. The processor uses this information to send signals to actuators that position the lens. Alternatively, an active autofocus system uses a range finder to ascertain the distance between the object and the front lens of the system (i.e., the working distance). This range information can then be used to adjust the lens position for optimal focus. Because of its simplicity, the active autofocus scheme is well suited for imaging scanners.


The range finder in an active autofocus system can use one or more sensors to create a range signal. A processor running an algorithm can compare the range signal with a stored lookup table to generate a corresponding control signal. The control signal can be interpreted by control electronics (e.g., a controller) to drive the lens system's positioning devices.


A block diagram of an active autofocus system is shown in FIG. 1. Here, a range-finder 20 senses the range (i.e., working distance) of a barcode through some transmitted radiation 5 and received radiation 15 (e.g., optical signals). The range finder 20 creates a range signal 25 and then sends this range signal 25 to a processor 10. The processor 10 runs an algorithm to compare the value of the range signal 25 with a lookup table 32 stored in memory 30. The lookup table 32 contains focus settings for various range signals 25. Once the focus settings corresponding to the measured range are determined, the processor 10 sends a control signal 35 to the autofocus controller 40. Based on this signal, the autofocus controller 40 sends electronic autofocus signals 45 to the autofocus lens system 50. The autofocus signals 45 cause the autofocus lens system 50 to change the imaging system's focus. When the adjustment of the autofocus lens system 50 is complete, the light from the barcode 55 is well focused onto the imaging scanner's image sensor.
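The loop of FIG. 1 can be sketched in Python. The table values, the nearest-entry matching rule, and the `apply_voltage` controller interface below are hypothetical placeholders; the patent does not specify any of them:

```python
import bisect

# Illustrative lookup table 32: measured range (mm) -> focus setting
# (here, a drive voltage for the piezoelectric rings). Hypothetical
# values, not taken from the patent.
RANGE_TABLE_MM = [100, 250, 500, 1000, 2500, 5000, 10000]
FOCUS_SETTINGS = [40.0, 28.0, 18.0, 11.0, 6.0, 3.0, 1.5]

def focus_setting_for_range(range_mm):
    """Pick the focus setting for the nearest tabulated range,
    mirroring the lookup-table comparison done by processor 10."""
    i = bisect.bisect_left(RANGE_TABLE_MM, range_mm)
    if i == 0:
        return FOCUS_SETTINGS[0]
    if i == len(RANGE_TABLE_MM):
        return FOCUS_SETTINGS[-1]
    # Choose the closer of the two neighboring table entries.
    before, after = RANGE_TABLE_MM[i - 1], RANGE_TABLE_MM[i]
    if after - range_mm < range_mm - before:
        return FOCUS_SETTINGS[i]
    return FOCUS_SETTINGS[i - 1]

def autofocus_step(measured_range_mm, controller):
    """One pass of the FIG. 1 loop: range signal 25 -> lookup -> control
    signal 35 -> electronic autofocus signal 45 to the lens system."""
    setting = focus_setting_for_range(measured_range_mm)
    controller.apply_voltage(setting)  # hypothetical controller API
```

The design choice worth noting is that the table lookup replaces any iterative focus search: one range measurement maps directly to one focus setting, which is why the active scheme is fast enough for scanning applications.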


Autofocus functionality relies on an adjustable lens parameter to focus the barcode. In traditional autofocus systems, focus is achieved by changing the position of a lens (or lenses forming a lens group) in relation to the image sensor. The autofocus signals 45 drive motors or actuators that move the lens (or lenses). In other words, the focus is controlled mechanically by changing the position of a lens or a lens group.


Mechanical autofocus systems can be bulky and slow for imaging scanner applications. A typical mechanical autofocus system can take 60 milliseconds to reach focus. Actuators in these systems can also draw a relatively large amount of power. Typical systems may draw around 450 milliwatts, which reduces battery life.


Focus can also be adjusted non-mechanically. Lens curvature (i.e., lens power) may be changed to adjust focus. Lenses made from an adjustable surface (i.e., adjustable-surface lenses) can be used in autofocus lens systems for imaging scanners. Adjustable-surface lenses are compact, fast, reliable, cost-effective, and energy efficient.


A perspective half-section view of a lens made from a single adjustable surface is shown in FIG. 2. The glass support 105 is a support element made of a transparent rigid material such as glass. The top element is a thin glass membrane 110, including an actuating element, such as a ring-shaped piezoelectric film 120. The glass membrane is supported on its edges by a silicon support 115 made using MEMS (i.e., micro-electro-mechanical systems) manufacturing technology. Sandwiched between the glass support 105 and the glass membrane 110 is a flexible transparent polymeric material 130.


The adjustable-surface lens 100 relies on a change in the polymeric surface's curvature as a result of an applied voltage. A side-view cross-section of the adjustable-surface lens 100 and its off/on operation are shown in FIG. 3a and FIG. 3b, respectively. As depicted in FIG. 3a, when no voltage is applied to the ring-shaped piezoelectric film 120, the light beam 101 passes through the clear polymer 130 with no alteration (i.e., zero optical power). On the other hand, as shown in FIG. 3b, when a voltage is applied to the piezoelectric film 120, the glass membrane 110 and the contiguous polymer 130 curve (e.g., spherically or near-spherically), and the light beam 101 is focused to a point behind the lens.


Focusing the small adjustable-surface lens 100 is achieved by changing the shape of the adjustable surface. This change is caused by a mechanical strain exerted by the ring-shaped piezoelectric film (i.e., piezo-ring) 120 because of an applied voltage. This strain alters the shape of the glass membrane 110, and, more importantly, also changes the shape of the flexible polymer 130 that is contiguous to this layer. In this way, the adjustable surface's optical power is controlled, and the position of the focus is adjusted.


The adjustable-surface lens 100 is well suited for imaging scanners. The adjustable-surface lens 100 can be fabricated using advanced semiconductor manufacturing techniques (i.e., MEMS technology), and therefore can be very cost effective. The adjustable-surface lens 100 is small and can be conveniently integrated within an imaging scanner. Adjusting the optical surface is very fast (e.g., 2 milliseconds) and draws very little power (e.g., 20 milliwatts), allowing for fast acquisition and long-life battery operation.


The adjustable-surface lens 100 has some limitations that must be accommodated in order to use this component for imaging scanner applications. The adjustable-surface lens 100 has a very small clear aperture (e.g., 1.55 millimeters), which leads to a high f-number and a long focal length (e.g., f/10). The range of optical powers is also limited (e.g., 0 to +10 diopters), resulting in a limited working-distance range (e.g., 10 centimeters to infinity).
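The quoted power range maps onto the quoted working-distance range under the thin-lens approximation: a system focused at infinity refocuses to roughly 1/P meters when P diopters of power are added near its entrance pupil. This is a simplified sketch, not the patent's design math, which also involves the pupil magnification discussed below:

```python
def near_working_distance_m(added_power_diopters):
    """Thin-lens approximation: adding P diopters to a system focused
    at infinity pulls the focus in to roughly 1/P meters."""
    if added_power_diopters <= 0:
        return float("inf")  # zero added power leaves focus at infinity
    return 1.0 / added_power_diopters

print(near_working_distance_m(0))   # inf: the far end of the range
print(near_working_distance_m(10))  # 0.1: the 10-centimeter near limit
```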


To overcome the limitation of the adjustable-surface lens's small aperture, other lenses may be added along the optical axis 140. An embodiment of this autofocus lens system is shown in FIG. 4. As depicted, the light from a barcode 55 is focused onto the image sensor 155 by three ideal lenses that help to form the autofocus lens system 50. A first lens 142 and a second lens 144, both with positive power, are positioned on either side of the adjustable-surface lens 100, which has negative power. Both the first and second lenses have apertures larger than the aperture stop of the system (i.e., the adjustable-surface lens 100). The first lens 142 forms a large entrance pupil by magnifying the aperture stop. The larger entrance pupil improves the lens system's throughput. The second lens 144 forms a real image of the object (e.g., barcode) onto the image sensor 155. A focusing-module lens 150 uses adjustable optical power to adjust the focus position along the optical axis 140 so that, regardless of the barcode distance, the focus position coincides with the image sensor 155.
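The effect of magnifying the aperture stop can be quantified through the working f-number, f/# = effective focal length / entrance-pupil diameter. The 1.55-millimeter stop and the ~f/10 figure come from the description above; the effective focal length and the 2x pupil magnification below are hypothetical values chosen only to make the arithmetic concrete:

```python
def f_number(efl_mm, entrance_pupil_mm):
    """Working f-number: effective focal length over entrance-pupil
    diameter. A larger entrance pupil gives a faster (lower) f-number."""
    return efl_mm / entrance_pupil_mm

stop_mm = 1.55  # clear aperture of the adjustable-surface lens 100
efl_mm = 15.5   # hypothetical EFL consistent with the quoted ~f/10

print(f_number(efl_mm, stop_mm))        # 10.0: bare adjustable lens
print(f_number(efl_mm, 2.0 * stop_mm))  # 5.0: stop magnified 2x by lens 142
```

This is why the first lens 142 matters: the tiny stop itself is unchanged, but imaging it through a front lens enlarges the entrance pupil and thus the light-gathering throughput.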


To achieve focus for all ranges, the focusing-module lens 150 must be able to adjust its optical power sufficiently. The three-lens configuration shown in FIG. 4 has excellent throughput but lacks performance when a single adjustable-surface lens 100 is used. In other words, when a single adjustable-surface lens 100 is used, the autofocus lens system 50 cannot accommodate close scans (e.g., 10-centimeter scans). The pupil magnification, while needed for throughput, reduces the adjustable-surface lens's ability to focus on objects in the near field. To extend the focus range requires optical power beyond what a single adjustable-surface lens 100 can provide.


To increase the effective optical power, a focusing-module lens 150 may be formed using two adjustable optical surfaces placed in close proximity. The optical powers of the two adjustable surfaces are additive, so the overall power of the focusing-module lens 150 may be doubled.
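The additivity claim follows from the standard formula for two thin lenses separated by a distance d: P = P1 + P2 − d·P1·P2. As the separation approaches zero, the powers simply add, which is why placing the two adjustable surfaces in close proximity roughly doubles the available power. The 10-diopter values below are illustrative:

```python
def combined_power(p1, p2, d_m=0.0):
    """Combined power (diopters) of two thin lenses separated by d_m
    meters: P = P1 + P2 - d*P1*P2. With d -> 0 the powers simply add."""
    return p1 + p2 - d_m * p1 * p2

print(combined_power(10, 10))         # 20.0 diopters, surfaces in contact
print(combined_power(10, 10, 0.001))  # 19.9 diopters, 1 mm apart
```

The correction term is small at millimeter separations, which is consistent with either embodiment (back-to-back lenses or a single integrated device) achieving close to the full doubled range.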


One exemplary embodiment of the two-adjustable-surface focusing-module lens 150 uses two adjustable-surface lenses 100 placed in close proximity (e.g., back-to-back) as shown in FIG. 5. The back-to-back adjustable-surface lenses form an effective lens with two adjustable polymeric surfaces to control the optical power (i.e., instead of just one). An advantage of this embodiment is that the adjustable-surface lenses 100 have already been reduced to practice and are commercially available, such as from poLight AS. In this regard, this application incorporates entirely by reference U.S. Pat. No. 8,045,280 (poLight AS).


Alternatively, a focusing-module lens 150 with two adjustable surfaces integrated into a single device can be utilized. As shown in FIG. 6, this alternative embodiment includes a flexible polymer 130 sandwiched between two transparent deformable membranes 110, each with its own ring-shaped piezoelectric film 120. Each transparent deformable membrane 110 can be fabricated from glass, quartz, or sapphire. This embodiment would allow for the same focusing power as the first embodiment and would also offer more simplicity and compactness.


In both embodiments, the electronically controlled optical powers of the adjustable polymeric surfaces sum, thereby forming a focusing-module lens 150 with a larger available optical power range (e.g., 0 to 20 diopters). When this two-surface focusing-module lens 150 is used in the three-lens configuration shown in FIG. 4, the higher optical power range and the pupil magnification combine to form an autofocus lens system 50 with an excellent focus range and a small f-number. Such an autofocus lens system is well suited for imaging scanner applications.


A practical embodiment of the autofocus lens system 50 is shown in FIG. 7. Here, two positive lens groups 142, 144, including a plurality of lenses, are fixedly positioned along the optical axis 140. A focusing-module lens 150, which includes two adjustable-surface lenses 100, is fixed between the two lens groups. No moving parts along the optical axis are required for focusing. A voltage applied to each adjustable surface's ring-shaped piezoelectric film 120 is enough to focus the barcode onto the image sensor 155.


The resulting autofocus lens system 50 is smaller, faster, more power-efficient, and more cost-effective than mechanically tuned autofocus lenses for imaging scanners. The autofocus lens system 50, based on the architecture described here, can focus on indicia in both the near field (e.g., 10 centimeters) and the far field (e.g., 10 meters or greater).


The possible applications for this autofocus lens system 50 need not be limited to imaging scanners. Any application requiring a narrow field of view (e.g., about 10 to 15 degrees) and a wide focus range (e.g., 10 centimeters to infinity) could benefit from this lens configuration. For example, applications like license-plate imagers or long-range facial recognition would be suitable for this type of lens.


This application incorporates entirely by reference the commonly assigned U.S. Pat. No. 7,296,749 for Autofocus Barcode Scanner and the Like Employing Micro-Fluidic Lens; U.S. Pat. No. 8,328,099 for Auto-focusing Method for an Automatic Data Collection Device, such as an Image Acquisition Device; U.S. Pat. No. 8,245,936 for Dynamic Focus Calibration, such as Dynamic Focus Calibration using an Open-Loop System in a Bar Code Scanner; and U.S. Pat. No. 8,531,790 for Linear Actuator Assemblies and Methods of Making the Same.


To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:

  • U.S. Pat. Nos. 6,832,725; 7,128,266; 7,159,783; 7,413,127; 7,726,575; 8,294,969; 8,317,105; 8,322,622; 8,366,005; 8,371,507; 8,376,233; 8,381,979; 8,390,909; 8,408,464; 8,408,468; 8,408,469; 8,424,768; 8,448,863; 8,457,013; 8,459,557; 8,469,272; 8,474,712; 8,479,992; 8,490,877; 8,517,271; 8,523,076; 8,528,819; 8,544,737; 8,548,242; 8,548,420; 8,550,335; 8,550,354; 8,550,357; 8,556,174; 8,556,176; 8,556,177; 8,559,767; 8,559,957; 8,561,895; 8,561,903; 8,561,905; 8,565,107; 8,571,307; 8,579,200; 8,583,924; 8,584,945; 8,587,595; 8,587,697; 8,588,869; 8,590,789; 8,593,539; 8,596,542; 8,596,543; 8,599,271; 8,599,957; 8,600,158; 8,600,167; 8,602,309; 8,608,053; 8,608,071; 8,611,309; 8,615,487; 8,616,454; 8,621,123; 8,622,303; 8,628,013; 8,628,015; 8,628,016; 8,629,926; 8,630,491; 8,635,309; 8,636,200; 8,636,212; 8,636,215; 8,636,224; 8,638,806; 8,640,958; 8,640,960; 8,643,717; 8,646,692; 8,646,694; 8,657,200; 8,659,397; 8,668,149; 8,678,285; 8,678,286; 8,682,077; 8,687,282;
  • International Publication No. 2013/163789;
  • International Publication No. 2013/173985;
  • International Publication No. 2014/019130;
  • U.S. Patent Application Publication No. 2008/0185432;
  • U.S. Patent Application Publication No. 2009/0134221;
  • U.S. Patent Application Publication No. 2010/0177080;
  • U.S. Patent Application Publication No. 2010/0177076;
  • U.S. Patent Application Publication No. 2010/0177707;
  • U.S. Patent Application Publication No. 2010/0177749;
  • U.S. Patent Application Publication No. 2011/0169999;
  • U.S. Patent Application Publication No. 2011/0202554;
  • U.S. Patent Application Publication No. 2012/0111946;
  • U.S. Patent Application Publication No. 2012/0138685;
  • U.S. Patent Application Publication No. 2012/0168511;
  • U.S. Patent Application Publication No. 2012/0168512;
  • U.S. Patent Application Publication No. 2012/0193407;
  • U.S. Patent Application Publication No. 2012/0193423;
  • U.S. Patent Application Publication No. 2012/0203647;
  • U.S. Patent Application Publication No. 2012/0223141;
  • U.S. Patent Application Publication No. 2012/0228382;
  • U.S. Patent Application Publication No. 2012/0248188;
  • U.S. Patent Application Publication No. 2013/0043312;
  • U.S. Patent Application Publication No. 2013/0056285;
  • U.S. Patent Application Publication No. 2013/0070322;
  • U.S. Patent Application Publication No. 2013/0075168;
  • U.S. Patent Application Publication No. 2013/0082104;
  • U.S. Patent Application Publication No. 2013/0175341;
  • U.S. Patent Application Publication No. 2013/0175343;
  • U.S. Patent Application Publication No. 2013/0200158;
  • U.S. Patent Application Publication No. 2013/0256418;
  • U.S. Patent Application Publication No. 2013/0257744;
  • U.S. Patent Application Publication No. 2013/0257759;
  • U.S. Patent Application Publication No. 2013/0270346;
  • U.S. Patent Application Publication No. 2013/0278425;
  • U.S. Patent Application Publication No. 2013/0287258;
  • U.S. Patent Application Publication No. 2013/0292474;
  • U.S. Patent Application Publication No. 2013/0292475;
  • U.S. Patent Application Publication No. 2013/0292477;
  • U.S. Patent Application Publication No. 2013/0293539;
  • U.S. Patent Application Publication No. 2013/0293540;
  • U.S. Patent Application Publication No. 2013/0306728;
  • U.S. Patent Application Publication No. 2013/0306730;
  • U.S. Patent Application Publication No. 2013/0306731;
  • U.S. Patent Application Publication No. 2013/0306734;
  • U.S. Patent Application Publication No. 2013/0307964;
  • U.S. Patent Application Publication No. 2013/0313324;
  • U.S. Patent Application Publication No. 2013/0313325;
  • U.S. Patent Application Publication No. 2013/0313326;
  • U.S. Patent Application Publication No. 2013/0327834;
  • U.S. Patent Application Publication No. 2013/0341399;
  • U.S. Patent Application Publication No. 2013/0342717;
  • U.S. Patent Application Publication No. 2014/0001267;
  • U.S. Patent Application Publication No. 2014/0002828;
  • U.S. Patent Application Publication No. 2014/0008430;
  • U.S. Patent Application Publication No. 2014/0008439;
  • U.S. Patent Application Publication No. 2014/0021256;
  • U.S. Patent Application Publication No. 2014/0025584;
  • U.S. Patent Application Publication No. 2014/0027518;
  • U.S. Patent Application Publication No. 2014/0034734;
  • U.S. Patent Application Publication No. 2014/0036848;
  • U.S. Patent Application Publication No. 2014/0039693;
  • U.S. Patent Application Publication No. 2014/0042814;
  • U.S. Patent Application Publication No. 2014/0049120;
  • U.S. Patent Application Publication No. 2014/0049635;
  • U.S. Patent Application Publication No. 2014/0061305;
  • U.S. Patent Application Publication No. 2014/0061306;
  • U.S. Patent Application Publication No. 2014/0061307;
  • U.S. Patent Application Publication No. 2014/0063289;
  • U.S. Patent Application Publication No. 2014/0066136;
  • U.S. Patent Application Publication No. 2014/0067692;
  • U.S. Patent Application Publication No. 2014/0070005;
  • U.S. Patent Application Publication No. 2014/0071840;
  • U.S. Patent Application Publication No. 2014/0074746;
  • U.S. patent application Ser. No. 13/367,978 for a Laser Scanning Module Employing An Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.);
  • U.S. patent application Ser. No. 13/400,748 for a Laser Scanning Bar Code Symbol Reading System Having Intelligent Scan Sweep Angle Adjustment Capabilities Over The Working Range Of The System For Optimized Bar Code Symbol Reading Performance, filed Feb. 21, 2012 (Wilz);
  • U.S. patent application Ser. No. 13/736,139 for an Electronic Device Enclosure, filed Jan. 8, 2013 (Chaney);
  • U.S. patent application Ser. No. 13/750,304 for Measuring Object Dimensions Using Mobile Computer, filed Jan. 25, 2013;
  • U.S. patent application Ser. No. 13/771,508 for an Optical Redirection Adapter, filed Feb. 20, 2013 (Anderson);
  • U.S. patent application Ser. No. 13/780,158 for a Distraction Avoidance System, filed Feb. 28, 2013 (Sauerwein);
  • U.S. patent application Ser. No. 13/780,196 for Android Bound Service Camera Initialization, filed Feb. 28, 2013 (Todeschini et al.);
  • U.S. patent application Ser. No. 13/780,271 for a Vehicle Computer System with Transparent Display, filed Feb. 28, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 13/780,356 for a Mobile Device Having Object-Identification Interface, filed Feb. 28, 2013 (Samek et al.);
  • U.S. patent application Ser. No. 13/784,933 for an Integrated Dimensioning and Weighing System, filed Mar. 5, 2013 (McCloskey et al.);
  • U.S. patent application Ser. No. 13/785,177 for a Dimensioning System, filed Mar. 5, 2013 (McCloskey et al.);
  • U.S. patent application Ser. No. 13/792,322 for a Replaceable Connector, filed Mar. 11, 2013 (Skvoretz);
  • U.S. patent application Ser. No. 13/852,097 for a System and Method for Capturing and Preserving Vehicle Event Data, filed Mar. 28, 2013 (Barker et al.);
  • U.S. patent application Ser. No. 13/895,846 for a Method of Programming a Symbol Reading System, filed Apr. 10, 2013 (Corcoran);
  • U.S. patent application Ser. No. 13/902,110 for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Hollifield);
  • U.S. patent application Ser. No. 13/902,144, for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Chamberlin);
  • U.S. patent application Ser. No. 13/902,242 for a System For Providing A Continuous Communication Link With A Symbol Reading Device, filed May 24, 2013 (Smith et al.);
  • U.S. patent application Ser. No. 13/912,262 for a Method of Error Correction for 3D Imaging Device, filed Jun. 7, 2013 (Jovanovski et al.);
  • U.S. patent application Ser. No. 13/912,702 for a System and Method for Reading Code Symbols at Long Range Using Source Power Control, filed Jun. 7, 2013 (Xian et al.);
  • U.S. patent application Ser. No. 13/922,339 for a System and Method for Reading Code Symbols Using a Variable Field of View, filed Jun. 20, 2013 (Xian et al.);
  • U.S. patent application Ser. No. 13/927,398 for a Code Symbol Reading System Having Adaptive Autofocus, filed Jun. 26, 2013 (Todeschini);
  • U.S. patent application Ser. No. 13/930,913 for a Mobile Device Having an Improved User Interface for Reading Code Symbols, filed Jun. 28, 2013 (Gelay et al.);
  • U.S. patent application Ser. No. 13/933,415 for an Electronic Device Case, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 13/947,296 for a System and Method for Selectively Reading Code Symbols, filed Jul. 22, 2013 (Rueblinger et al.);
  • U.S. patent application Ser. No. 13/950,544 for a Code Symbol Reading System Having Adjustable Object Detection, filed Jul. 25, 2013 (Jiang);
  • U.S. patent application Ser. No. 13/961,408 for a Method for Manufacturing Laser Scanners, filed Aug. 7, 2013 (Saber et al.);
  • U.S. patent application Ser. No. 13/974,374 for Authenticating Parcel Consignees with Indicia Decoding Devices, filed Aug. 23, 2013 (Ye et al.);
  • U.S. patent application Ser. No. 14/018,729 for a Method for Operating a Laser Scanner, filed Sep. 5, 2013 (Feng et al.);
  • U.S. patent application Ser. No. 14/019,616 for a Device Having Light Source to Reduce Surface Pathogens, filed Sep. 6, 2013 (Todeschini);
  • U.S. patent application Ser. No. 14/023,762 for a Handheld Indicia Reader Having Locking Endcap, filed Sep. 11, 2013 (Gannon);
  • U.S. patent application Ser. No. 14/035,474 for Augmented-Reality Signature Capture, filed Sep. 24, 2013 (Todeschini);
  • U.S. patent application Ser. No. 14/047,896 for Terminal Having Illumination and Exposure Control filed Oct. 7, 2013 (Jovanovski et al.);
  • U.S. patent application Ser. No. 14/050,515 for Hybrid-Type Bioptical, filed Oct. 10, 2013 (Edmonds et al.);
  • U.S. patent application Ser. No. 14/053,175 for Imaging Apparatus Having Imaging Assembly, filed Oct. 14, 2013 (Barber);
  • U.S. patent application Ser. No. 14/055,234 for Dimensioning System, filed Oct. 16, 2013 (Fletcher);
  • U.S. patent application Ser. No. 14/055,353 for Dimensioning System, filed Oct. 16, 2013 (Giordano et al.);
  • U.S. patent application Ser. No. 14/055,383 for Dimensioning System, filed Oct. 16, 2013 (Li et al.);
  • U.S. patent application Ser. No. 14/053,314 for Indicia Reader, filed Oct. 14, 2013 (Huck);
  • U.S. patent application Ser. No. 14/058,762 for Terminal Including Imaging Assembly, filed Oct. 21, 2013 (Gomez et al.);
  • U.S. patent application Ser. No. 14/062,239 for Chip on Board Based Highly Integrated Imager, filed Oct. 24, 2013 (Toa et al.);
  • U.S. patent application Ser. No. 14/065,768 for Hybrid System and Method for Reading Indicia, filed Oct. 29, 2013 (Meier et al.);
  • U.S. patent application Ser. No. 14/074,746 for Self-Checkout Shopping System, filed Nov. 8, 2013 (Hejl et al.);
  • U.S. patent application Ser. No. 14/074,787 for Method and System for Configuring Mobile Devices via NFC Technology, filed Nov. 8, 2013 (Smith et al.);
  • U.S. patent application Ser. No. 14/082,468 for Encoded Information Reading Terminal with Wireless Path Selection Capability, filed Nov. 18, 2013 (Wang et al.);
  • U.S. patent application Ser. No. 14/087,190 for Optimal Range Indicators for Bar Code Validation, filed Nov. 22, 2013 (Hejl);
  • U.S. patent application Ser. No. 14/093,484 for System for Capturing a Document in an Image Signal, filed Dec. 1, 2013 (Showering);
  • U.S. patent application Ser. No. 14/093,487 for Method and System Operative to Process Color Image Data, filed Dec. 1, 2013 (Li et al.);
  • U.S. patent application Ser. No. 14/093,490 for Imaging Terminal Having Image Sensor and Lens Assembly, filed Dec. 1, 2013 (Havens et al.);
  • U.S. patent application Ser. No. 14/093,624 for Apparatus Operative for Capture of Image Data, filed Dec. 2, 2013 (Havens et al.);
  • U.S. patent application Ser. No. 14/094,087 for Method and System for Communicating Information in an Digital Signal, filed Dec. 2, 2013 (Peake et al.);
  • U.S. patent application Ser. No. 14/101,965 for High Dynamic-Range Indicia Reading System, filed Dec. 10, 2013 (Xian);
  • U.S. patent application Ser. No. 14/107,048 for Roaming Encoded Information Reading Terminal, filed Dec. 16, 2013 (Wang et al.);
  • U.S. patent application Ser. No. 14/118,400 for Indicia Decoding Device with Security Lock, filed Nov. 18, 2013 (Liu);
  • U.S. patent application Ser. No. 14/138,206 for System and Method to Store and Retrieve Identifier Associated Information, filed Dec. 23, 2013 (Gomez et al.);
  • U.S. patent application Ser. No. 14/143,399 for Device Management Using Virtual Interfaces, filed Dec. 30, 2013 (Caballero);
  • U.S. patent application Ser. No. 14/147,992 for Decoding Utilizing Image Data, filed Jan. 6, 2014 (Meier et al.);
  • U.S. patent application Ser. No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.);
  • U.S. patent application Ser. No. 14/153,111 for Indicia Reading Terminal Including Frame Quality Evaluation Processing, filed Jan. 13, 2014 (Wang et al.);
  • U.S. patent application Ser. No. 14/153,142 for Imaging Apparatus Comprising Image Sensor Array having Shared Global Shutter Circuitry, filed Jan. 13, 2014 (Wang);
  • U.S. patent application Ser. No. 14/153,182 for System and Method to Manipulate an Image, filed Jan. 13, 2014 (Longacre et al.);
  • U.S. patent application Ser. No. 14/153,213 for Apparatus Comprising Image Sensor Array and Illumination Control, filed Jan. 13, 2014 (Ding);
  • U.S. patent application Ser. No. 14/153,249 for Terminal Operative for Storing Frame of Image Data, filed Jan. 13, 2014 (Winegar);
  • U.S. patent application Ser. No. 14/154,207 for Laser Barcode Scanner, filed Jan. 14, 2014 (Hou et al.);
  • U.S. patent application Ser. No. 14/154,915 for Laser Scanning Module Employing a Laser Scanning Assembly having Elastomeric Wheel Hinges, filed Jan. 14, 2014 (Havens et al.);
  • U.S. patent application Ser. No. 14/158,126 for Methods and Apparatus to Change a Feature Set on Data Collection Devices, filed Jan. 17, 2014 (Berthiaume et al.);
  • U.S. patent application Ser. No. 14/159,074 for Wireless Mesh Point Portable Data Terminal, filed Jan. 20, 2014 (Wang et al.);
  • U.S. patent application Ser. No. 14/159,509 for MMS Text Messaging for Hand Held Indicia Reader, filed Jan. 21, 2014 (Kearney);
  • U.S. patent application Ser. No. 14/159,603 for Decodable Indicia Reading Terminal with Optical Filter, filed Jan. 21, 2014 (Ding et al.);
  • U.S. patent application Ser. No. 14/160,645 for Decodable Indicia Reading Terminal with Indicia Analysis Functionality, filed Jan. 22, 2014 (Nahill et al.);
  • U.S. patent application Ser. No. 14/161,875 for System and Method to Automatically Discriminate Between Different Data Types, filed Jan. 23, 2014 (Wang);
  • U.S. patent application Ser. No. 14/165,980 for System and Method for Measuring Irregular Objects with a Single Camera filed Jan. 28, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/166,103 for Indicia Reading Terminal Including Optical Filter filed Jan. 28, 2014 (Lu et al.);
  • U.S. patent application Ser. No. 14/176,417 for Devices and Methods Employing Dual Target Auto Exposure filed Feb. 10, 2014 (Meier et al.);
  • U.S. patent application Ser. No. 14/187,485 for Indicia Reading Terminal with Color Frame Processing filed Feb. 24, 2014 (Ren et al.);
  • U.S. patent application Ser. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/342,544 for Imaging Based Barcode Scanner Engine with Multiple Elements Supported on a Common Printed Circuit Board filed Mar. 4, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/342,551 for Terminal Having Image Data Format Conversion filed Mar. 4, 2014 (Lui et al.); and
  • U.S. patent application Ser. No. 14/345,735 for Optical Indicia Reading Terminal with Combined Illumination filed Mar. 19, 2014 (Ouyang).


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A focusing-module lens, comprising: an aperture stop; two adjustable surfaces, wherein the optical power of the adjustable surfaces is controlled electronically to achieve focus; a first transparent deformable membrane having a ring-shaped piezoelectric film contiguously positioned on the first transparent deformable membrane's outer surface; a second transparent deformable membrane having a ring-shaped piezoelectric film contiguously positioned on the second transparent deformable membrane's outer surface; and a flexible polymer contiguously positioned between the first transparent deformable membrane and the second transparent deformable membrane, whereby the flexible polymer is in contact with the respective inner surfaces of the first transparent deformable membrane and the second transparent deformable membrane.
  • 2. The focusing-module lens according to claim 1, wherein the focusing-module lens has a clear aperture with a diameter of between 1.3 millimeters and 1.7 millimeters.
  • 3. The focusing-module lens according to claim 1, wherein the first transparent deformable membrane and the second transparent deformable membrane each comprise glass, quartz, or sapphire.
  • 4. The focusing-module lens according to claim 1, wherein the focusing-module lens has a focusing response time of 2 milliseconds or less.
  • 5. The focusing-module lens according to claim 1, wherein the focusing-module lens consumes 20 milliwatts or less.
  • 6. A lens system, comprising: a first positive lens group including a plurality of lenses fixedly positioned along an optical axis; a second positive lens group including a plurality of lenses fixedly positioned along the optical axis; and a focusing-module lens fixedly positioned along the optical axis between the first positive lens group and the second positive lens group and formed from at least one adjustable surface whose optical power is electronically controlled to achieve focus, wherein the focusing-module lens (i) defines an aperture stop for the lens system and (ii) has a clear aperture whose diameter is smaller than the diameter of either the first lens group or the second lens group.
  • 7. The lens system according to claim 6, wherein the focusing-module lens comprises exactly two adjustable surfaces electronically controlled independently of one another.
  • 8. The lens system according to claim 6, wherein the focusing-module lens consists of two adjustable-surface lenses positioned back-to-back.
  • 9. The lens system according to claim 6, wherein the lens system has an f-number of 6 or less.
  • 10. The lens system according to claim 6, wherein the lens system has a working distance of 10 centimeters or greater.
  • 11. The lens system according to claim 6, wherein the focusing-module lens has a focusing response time of 2 milliseconds or less.
  • 12. A system, comprising: a lens system, comprising (i) a first positive lens fixedly positioned along an optical axis, (ii) a second positive lens fixedly positioned along the optical axis, and (iii) a focusing-module lens fixedly positioned along the optical axis between the first positive lens and the second positive lens and formed from at least one adjustable surface, wherein the optical power of the at least one adjustable surface is controlled electronically to achieve focus of a real image of an indicia; a range finder for sensing the range of an indicia through transmitted and received radiation and for creating a range signal representing the sensed range; a lookup table stored in memory, the lookup table containing focus settings associated with range-signal values; a processor for running a process that compares the range signal with the lookup table in order to generate a corresponding control signal; and a controller for reading the control signal and for outputting electronic autofocus signals to adjust the lens system.
  • 13. The system according to claim 12, wherein the focusing-module lens has a clear aperture whose diameter is smaller than the diameter of either the first positive lens or the second positive lens, the focusing-module lens defining an aperture stop for the lens system.
  • 14. The system according to claim 12, wherein the first lens is a lens group comprising a plurality of lenses.
  • 15. The system according to claim 12, wherein the second lens is a lens group comprising a plurality of lenses.
  • 16. The system according to claim 12, wherein the first positive lens and the second positive lens each comprise a lens group comprising a plurality of lenses.
  • 17. The system according to claim 12, wherein the focusing-module lens consists of two adjustable-surface lenses positioned in close proximity to one another.
  • 18. The system according to claim 12, wherein the focusing-module lens comprises (i) a first glass membrane having a ring-shaped piezoelectric film contiguously positioned on the first glass membrane's outer surface, (ii) a second glass membrane having a ring-shaped piezoelectric film contiguously positioned on the second glass membrane's outer surface, and (iii) a flexible polymer contiguously positioned between the first glass membrane and the second glass membrane, whereby the flexible polymer is in contact with the respective inner surfaces of the first glass membrane and the second glass membrane.
  • 19. The system according to claim 12, wherein the lens system has a working distance of 10 centimeters or greater.
  • 20. The system according to claim 12, wherein the focusing-module lens consumes 20 milliwatts or less.
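The range-finding autofocus flow recited in claim 12 can be illustrated with a short sketch. This is not the claimed implementation; the table values, units, and function names below are hypothetical, and nearest-entry selection is only one plausible way a processor might compare a range signal against stored focus settings.

```python
# Illustrative sketch of the claim-12 flow: a range finder yields a range
# signal, a lookup table stored in memory associates range-signal values
# with focus settings, and the nearest entry's setting is output to the
# controller. All names and values are hypothetical.
from bisect import bisect_left

# Hypothetical calibration table: (range_signal, focus_setting) pairs,
# sorted by range-signal value.
FOCUS_LOOKUP = [
    (100, 48.0),   # near field -> stronger optical power
    (400, 36.0),
    (900, 24.0),
    (1600, 12.0),  # far field -> weaker optical power
]

def focus_setting_for(range_signal: float) -> float:
    """Return the focus setting of the table entry nearest the range signal."""
    keys = [k for k, _ in FOCUS_LOOKUP]
    i = bisect_left(keys, range_signal)
    if i == 0:                      # below the table: clamp to nearest entry
        return FOCUS_LOOKUP[0][1]
    if i == len(keys):              # above the table: clamp to farthest entry
        return FOCUS_LOOKUP[-1][1]
    below_key, below_val = FOCUS_LOOKUP[i - 1]
    above_key, above_val = FOCUS_LOOKUP[i]
    # Pick whichever bracketing entry is closer to the measured signal.
    if (range_signal - below_key) <= (above_key - range_signal):
        return below_val
    return above_val
```

In a real device the returned setting would be translated by the controller into the electronic autofocus signals that drive the adjustable surfaces; a denser table, or interpolation between entries, would trade memory for focus resolution.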
US Referenced Citations (470)
Number Name Date Kind
6832725 Gardiner et al. Dec 2004 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7296749 Massieu Nov 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
7813047 Wang et al. Oct 2010 B2
8045280 Henriksen et al. Oct 2011 B2
8294969 Plesko Oct 2012 B2
8305691 Havens et al. Nov 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8366002 Wang et al. Feb 2013 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8505822 Wang et al. Aug 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8531790 Stang et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber et al. Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9082023 Feng et al. Jul 2015 B2
9224022 Ackley Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9250712 Todeschini Feb 2016 B1
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9310609 Rueblinger et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342724 McCloskey May 2016 B2
9375945 Bowles Jun 2016 B1
D760719 Zhou et al. Jul 2016 S
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
D762647 Fitch et al. Aug 2016 S
9412242 Van Horn et al. Aug 2016 B2
D766244 Zhou et al. Sep 2016 S
9443123 Hejl Sep 2016 B2
9443222 Singel et al. Sep 2016 B2
9478113 Xie et al. Oct 2016 B2
9481809 Carney Nov 2016 B2
9581809 Ackley Feb 2017 B2
20070063048 Havens et al. Mar 2007 A1
20080144185 Wang et al. Jun 2008 A1
20080144186 Feng et al. Jun 2008 A1
20090072037 Good et al. Mar 2009 A1
20090134221 Zhu et al. May 2009 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100276491 Havens et al. Nov 2010 A1
20100276492 Wang et al. Nov 2010 A1
20100276493 Havens et al. Nov 2010 A1
20110149409 Haugholt et al. Jun 2011 A1
20110157675 Heim et al. Jun 2011 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20110304927 Margolis Dec 2011 A1
20120111946 Golant May 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20120248195 Feng et al. Oct 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140100813 Showering Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140042814 Kather et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140049683 Guenter et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104451 Todeschini et al. Apr 2014 A1
20140104696 Moreau et al. Apr 2014 A1
20140106594 Skvoretz Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140131448 Xian et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140175172 Jovanovski et al. Jun 2014 A1
20140191644 Chaney Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140194750 Lee et al. Jul 2014 A1
20140197238 Lui et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140203087 Smith et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140232930 Anderson Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140284384 Lu et al. Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140312121 Lu et al. Oct 2014 A1
20140319220 Coyle Oct 2014 A1
20140319221 Oberpriller et al. Oct 2014 A1
20140326787 Barten Nov 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140344943 Todeschini et al. Nov 2014 A1
20140346233 Liu et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140353373 Van Horn et al. Dec 2014 A1
20140361073 Qu et al. Dec 2014 A1
20140361082 Xian et al. Dec 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150009610 London et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028102 Ren et al. Jan 2015 A1
20150028103 Jiang Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150048168 Fritz et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053766 Havens et al. Feb 2015 A1
20150053768 Wang et al. Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150069130 Gannon Mar 2015 A1
20150071819 Todeschini Mar 2015 A1
20150083800 Li et al. Mar 2015 A1
20150086114 Todeschini Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150099557 Pettinelli et al. Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150129659 Feng et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150136854 Lu et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150144701 Xian et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150169925 Chen et al. Jun 2015 A1
20150169929 Williams et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150193644 Kearney et al. Jul 2015 A1
20150193645 Colavito et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150204671 Showering Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150327012 Bian et al. Nov 2015 A1
20160014251 Hejl Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160131894 Ackley May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171720 Todeschini Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160125873 Braho et al. Jul 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
Foreign Referenced Citations (4)
Number Date Country
2013163789 Nov 2013 WO
2013173985 Nov 2013 WO
2014019130 Feb 2014 WO
2014110495 Jul 2014 WO
Non-Patent Literature Citations (34)
Entry
poLight, “poLight; The Optical MEMS Company”, Mar. 2012, 34 pages.
Intermec, “EX25 Near/Far 2D Imager Engine 3rd Generation” Website Product Profile and Specifications, Copyright 2013 Honeywell International Inc. 61187-D Dec. 2013, 2 pages.
Intermec, “The 2D Revolution; How evolving business needs and improved technology are driving explosive growth in two-dimensional bar coding”, White Paper, Copyright 2007 Intermec Technologies Corporation 611838-01A Jun. 2007, 7 pages.
Intermec, “Guide to Scanning Technologies”, White Paper, Copyright 2007 Intermec Technologies Corporation 609107-010 Mar. 2007, 8 pages.
Intermec, “Imaging Moves into the Mainstream; Why 2D Imagers are Surpassing Laser Scanners for Bar Code Application”, White Paper, Copyright 2011 Intermec Technologies Corporation 612138-02A Jul. 2011, 7 pages.
Jon H. Ulvensoen, poLight AS, “New Micro technology provides the functionality to mobile phone cameras”, All rights reserved poLight AS, www.polight.com, 2011, 20 pages.
Extended Search report in related EP Application No. 15163293.2, dated Sep. 18, 2015, 7 pages.
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned.
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned.
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned.
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages.
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages.
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages.
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages.
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages.
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages.
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages.
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al.); 16 pages.
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages.
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages; now abandoned.
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages.
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages.
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages.
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Vekatesha et al.); 35 pages.
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages.
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages.
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages.
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages.
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages.
U.S. Appl. No. 14/740,320 for Tactile Switch For a Mobile Electronic Device filed Jun. 16, 2015 (Barndringa); 38 pages.
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages.
Office Action in related European Application No. 15163293.2 dated Jun. 22, 2017, pp. 1-5 [all references previously cited].
European Exam Report in related EP Application No. 15163293.2, dated Mar. 28, 2018, 7 pages.
Related Publications (1)
Number Date Country
20170160441 A1 Jun 2017 US
Continuations (2)
Number Date Country
Parent 14979818 Dec 2015 US
Child 15440357 US
Parent 14264173 Apr 2014 US
Child 14979818 US