Variable focus lenses generally, and liquid lenses in particular, have improved the ability of lens systems to achieve a wide range of optical properties and to switch between settings quickly. As a result, a nearly infinite set of optical outcomes can be achieved by combining various variable focus lenses in various arrangements of optics and adjusting the settings of those variable focus lenses appropriately. However, determining precisely which variable focus lens settings are necessary to achieve a desired optical outcome can be a computationally complex process.
One area where variable focus lenses are deployed with increasing regularity is in cell phone cameras. The computational complexity of pairing variable focus lens settings to desired optical outcomes is not particularly problematic in the context of cell phone cameras, or other cameras where merely acquiring an image is the primary outcome, because: a) when merely acquiring an image, the computing power of a cell phone (or other camera) can be more readily monopolized for image acquisition purposes; and b) a cell phone has enough computing power to perform these computations.
In cases where computational resources are more scarce, such as code reading, lens systems are needed that can take advantage of the abilities of variable focus lenses without overwhelming system computational resources with the determination of variable focus lens settings.
In one aspect, the present disclosure provides an optical system for reading a code at a predefined short distance and a predefined long distance. The code has known size dimensions. The optical system includes an image sensor, a fixed focus lens, a first variable focus lens, a second variable focus lens, and a variable focus lens controller. The image sensor has an optical axis. The fixed focus lens is positioned along the optical axis at a first distance from the image sensor. The first variable focus lens is positioned along the optical axis at a second distance from the image sensor. The second distance is greater than the first distance. The second variable focus lens is positioned along the optical axis at a third distance from the image sensor. The third distance is greater than the second distance. The variable focus lens controller is programmed to selectively control the first variable focus lens and the second variable focus lens to project a short field of view and a long field of view onto the image sensor. The short field of view has a short focal plane located at the predefined short distance and a short viewing angle. The long field of view has a long focal plane located at the predefined long distance and a long viewing angle. The short viewing angle is wider than the long viewing angle.
In another aspect, the present disclosure provides an optical system for reading a code at a predefined short distance and a predefined long distance. The code has known size dimensions. The optical system includes an image sensor, a fixed focus lens, a first variable focus lens, a second variable focus lens, a variable focus lens controller, a processor, and a memory. The image sensor has an optical axis. The fixed focus lens is positioned along the optical axis at a first distance from the image sensor. The first variable focus lens is positioned along the optical axis at a second distance from the image sensor. The second distance is greater than the first distance. The second variable focus lens is positioned along the optical axis at a third distance from the image sensor. The third distance is greater than the second distance. The variable focus lens controller is configured to control a first focal length of the first variable focus lens and a second focal length of the second variable focus lens. The processor is configured to communicate signals to the variable focus lens controller. The memory has stored thereon a predefined short first focal length, a predefined long first focal length, a predefined short second focal length, a predefined long second focal length, and instructions that, when executed by the processor, cause the processor to: receive an input specifying the predefined short distance or the predefined long distance; in response to receiving the input specifying the predefined short distance: retrieve from the memory the predefined short first focal length and the predefined short second focal length; and send one or more short signals to the variable focus lens controller specifying the predefined short first focal length and the predefined short second focal length, thereby causing the variable focus lens controller to set a first focal length of the first variable focus lens to the predefined short first focal length and to set a second focal length of the second variable focus lens to the predefined short second focal length; and in response to receiving the input specifying the predefined long distance: retrieve from the memory the predefined long first focal length and the predefined long second focal length; and send one or more long signals to the variable focus lens controller specifying the predefined long first focal length and the predefined long second focal length, thereby causing the variable focus lens controller to set the first focal length of the first variable focus lens to the predefined long first focal length and to set the second focal length of the second variable focus lens to the predefined long second focal length. The predefined short first focal length and the predefined short second focal length are selected to project a short field of view onto the image sensor. The short field of view has a short focal plane located at the predefined short distance and a short viewing angle. The predefined long first focal length and the predefined long second focal length are selected to project a long field of view onto the image sensor. The long field of view has a long focal plane located at the predefined long distance and a long viewing angle. The short viewing angle is wider than the long viewing angle.
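By way of a non-limiting illustration, the lookup-and-set behavior described above can be sketched in Python. The preset values, dictionary keys, and controller methods below are hypothetical placeholders rather than part of the disclosure; the sketch simply shows a memory lookup followed by signaling the variable focus lens controller.

```python
# Minimal sketch (hypothetical names and values) of the two-distance logic:
# the predefined focal lengths are retrieved from memory and sent to the
# variable focus lens controller.

PRESETS = {
    "short": {"first_focal_length_mm": 12.0, "second_focal_length_mm": -15.0},
    "long":  {"first_focal_length_mm": 18.0, "second_focal_length_mm": -9.0},
}

class LensController:  # stand-in for the real variable focus lens controller
    def set_first_focal_length(self, mm: float) -> None:
        print(f"first variable focus lens -> {mm} mm")

    def set_second_focal_length(self, mm: float) -> None:
        print(f"second variable focus lens -> {mm} mm")

def handle_distance_input(distance_mode: str, lens_controller: LensController) -> None:
    """Retrieve the predefined focal lengths for the requested distance mode
    ("short" or "long") and command the variable focus lens controller."""
    preset = PRESETS[distance_mode]
    lens_controller.set_first_focal_length(preset["first_focal_length_mm"])
    lens_controller.set_second_focal_length(preset["second_focal_length_mm"])

handle_distance_input("short", LensController())
```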
In yet another aspect, the present disclosure provides an optical system for reading a code at a predefined short distance and a predefined long distance. The code has known size dimensions. The optical system includes an image sensor, a fixed focus lens, a first variable focus lens, a second variable focus lens, a variable focus lens controller, a processor, and a memory. The image sensor has an optical axis. The fixed focus lens is positioned along the optical axis at a first distance from the image sensor. The first variable focus lens is positioned along the optical axis at a second distance from the image sensor. The second distance is greater than the first distance. The second variable focus lens is positioned along the optical axis at a third distance from the image sensor. The third distance is greater than the second distance. The variable focus lens controller is configured to control a first focal length of the first variable focus lens and a second focal length of the second variable focus lens. The processor is configured to operate in a short distance mode and a long distance mode. Operating in the short distance mode causes the variable focus lens controller to adjust the first focal length to the predefined short first focal length and to adjust the second focal length to the predefined short second focal length, thereby projecting a short field of view onto the image sensor. The short field of view has a short focal plane located at the predefined short distance and a short viewing angle. Operating in the long distance mode causes the variable focus lens controller to adjust the first focal length to the predefined long first focal length and to adjust the second focal length to the predefined long second focal length, thereby projecting a long field of view onto the image sensor. The long field of view has a long focal plane located at the predefined long distance and a long viewing angle. The short viewing angle is wider than the long viewing angle.
In a further aspect, the present disclosure provides an optical system for reading a code at a predefined short distance, a predefined medium distance, and a predefined long distance. The code has known size dimensions. The optical system includes an image sensor, a fixed focus lens, a first variable focus lens, a second variable focus lens, and a variable focus lens controller. The image sensor has an optical axis. The fixed focus lens is positioned along the optical axis at a first distance from the image sensor. The first variable focus lens is positioned along the optical axis at a second distance from the image sensor. The second distance is greater than the first distance. The second variable focus lens is positioned along the optical axis at a third distance from the image sensor. The third distance is greater than the second distance. The variable focus lens controller is programmed to selectively control the first variable focus lens and the second variable focus lens to project a short field of view, a medium field of view, and a long field of view onto the image sensor. The short field of view has a short focal plane located at the predefined short distance and a short viewing angle. The medium field of view has a medium focal plane located at the predefined medium distance and a medium viewing angle. The long field of view has a long focal plane located at the predefined long distance and a long viewing angle. The short viewing angle is wider than the medium viewing angle. The medium viewing angle is wider than the long viewing angle.
In another further aspect, the present disclosure provides an optical system for reading a code at a predefined short distance, a predefined medium distance, and a predefined long distance. The code has known size dimensions. The optical system includes an image sensor, a fixed focus lens, a first variable focus lens, a second variable focus lens, a variable focus lens controller, a processor, and a memory. The image sensor has an optical axis. The fixed focus lens is positioned along the optical axis at a first distance from the image sensor. The first variable focus lens is positioned along the optical axis at a second distance from the image sensor. The second distance is greater than the first distance. The second variable focus lens is positioned along the optical axis at a third distance from the image sensor. The third distance is greater than the second distance. The variable focus lens controller is configured to control a first focal length of the first variable focus lens and a second focal length of the second variable focus lens. The processor is configured to communicate signals to the variable focus lens controller. The memory has stored thereon a predefined short first focal length, a predefined medium first focal length, a predefined medium second focal length, a predefined long first focal length, a predefined short second focal length, a predefined long second focal length, and instructions that, when executed by the processor, cause the processor to: receive an input specifying the predefined short distance, the predefined medium distance, or the predefined long distance; in response to receiving the input specifying the predefined short distance: retrieve from the memory the predefined short first focal length and the predefined short second focal length; and send one or more short signals to the variable focus lens controller specifying the predefined short first focal length and the predefined short second focal length, thereby causing the variable focus lens controller to set a first focal length of the first variable focus lens to the predefined short first focal length and to set a second focal length of the second variable focus lens to the predefined short second focal length; in response to receiving the input specifying the predefined medium distance: retrieve from the memory the predefined medium first focal length and the predefined medium second focal length; and send one or more medium signals to the variable focus lens controller specifying the predefined medium first focal length and the predefined medium second focal length, thereby causing the variable focus lens controller to set a first focal length of the first variable focus lens to the predefined medium first focal length and to set a second focal length of the second variable focus lens to the predefined medium second focal length; and in response to receiving the input specifying the predefined long distance: retrieve from the memory the predefined long first focal length and the predefined long second focal length; and send one or more long signals to the variable focus lens controller specifying the predefined long first focal length and the predefined long second focal length, thereby causing the variable focus lens controller to set the first focal length of the first variable focus lens to the predefined long first focal length and to set the second focal length of the second variable focus lens to the predefined long second focal length.
The predefined short first focal length and the predefined short second focal length are selected to project a short field of view onto the image sensor. The short field of view has a short focal plane located at the predefined short distance and a short viewing angle. The predefined medium first focal length and the predefined medium second focal length are selected to project a medium field of view onto the image sensor. The medium field of view has a medium focal plane located at the predefined medium distance and a medium viewing angle. The predefined long first focal length and the predefined long second focal length are selected to project a long field of view onto the image sensor. The long field of view has a long focal plane located at the predefined long distance and a long viewing angle. The short viewing angle is wider than the medium viewing angle. The medium viewing angle is wider than the long viewing angle.
In yet another aspect, the present disclosure provides an optical system for reading a code at a predefined short distance, a predefined medium distance, and a predefined long distance. The code has known size dimensions. The optical system includes an image sensor, a fixed focus lens, a first variable focus lens, a second variable focus lens, a variable focus lens controller, a processor, and a memory. The image sensor has an optical axis. The fixed focus lens is positioned along the optical axis at a first distance from the image sensor. The first variable focus lens is positioned along the optical axis at a second distance from the image sensor. The second distance is greater than the first distance. The second variable focus lens is positioned along the optical axis at a third distance from the image sensor. The third distance is greater than the second distance. The variable focus lens controller is configured to control a first focal length of the first variable focus lens and a second focal length of the second variable focus lens. The processor is configured to operate in a short distance mode, a medium distance mode, and a long distance mode. Operating in the short distance mode causes the variable focus lens controller to adjust the first focal length to the predefined short first focal length and to adjust the second focal length to the predefined short second focal length, thereby projecting a short field of view onto the image sensor. The short field of view has a short focal plane located at the predefined short distance and a short viewing angle. Operating in the medium distance mode causes the variable focus lens controller to adjust the first focal length to the predefined medium first focal length and to adjust the second focal length to the predefined medium second focal length, thereby projecting a medium field of view onto the image sensor. The medium field of view has a medium focal plane located at the predefined medium distance and a medium viewing angle. Operating in the long distance mode causes the variable focus lens controller to adjust the first focal length to the predefined long first focal length and to adjust the second focal length to the predefined long second focal length, thereby projecting a long field of view onto the image sensor. The long field of view has a long focal plane located at the predefined long distance and a long viewing angle. The short viewing angle is wider than the medium viewing angle. The medium viewing angle is wider than the long viewing angle.
In still another aspect, the present disclosure provides a code reader including the optical system as described in any aspect herein.
In another aspect, the present disclosure provides a method of making an optical system for reading a code at a predefined short distance and a predefined long distance. The method includes: a) positioning an image sensor having an optical axis, a fixed focus lens, a first variable focus lens, and a second variable focus lens such that the fixed focus lens is on the optical axis at a first distance from the image sensor, the first variable focus lens is on the optical axis at a second distance from the image sensor, and the second variable focus lens is on the optical axis at a third distance from the image sensor, wherein the second distance is greater than the first distance and the third distance is greater than the second distance, with the first variable focus lens and the second variable focus lens in electronic communication with a variable focus lens controller, the variable focus lens controller in electronic communication with a processor, and the processor in electronic communication with a memory; b) determining a short first focal length and a short second focal length to provide desired short optical properties for acquiring a first image of the code at the predefined short distance and a long first focal length and a long second focal length to provide desired long optical properties for acquiring a second image of the code at the predefined long distance; and c) storing on the memory a digital representation of the short first focal length, the long first focal length, the short second focal length, and the long second focal length.
In yet another aspect, the present disclosure provides a method of making an optical system for reading a code at a predefined short distance, a predefined medium distance, and a predefined long distance. The method includes: a) positioning an image sensor having an optical axis, a fixed focus lens, a first variable focus lens, and a second variable focus lens such that the fixed focus lens is on the optical axis at a first distance from the image sensor, the first variable focus lens is on the optical axis at a second distance from the image sensor, and the second variable focus lens is on the optical axis at a third distance from the image sensor, wherein the second distance is greater than the first distance and the third distance is greater than the second distance, with the first variable focus lens and the second variable focus lens in electronic communication with a variable focus lens controller, the variable focus lens controller in electronic communication with a processor, and the processor in electronic communication with a memory; b) determining a short first focal length and a short second focal length to provide desired short optical properties for acquiring a first image of the code at the predefined short distance, a medium first focal length and a medium second focal length to provide desired medium optical properties for acquiring a second image of the code at the predefined medium distance, and a long first focal length and a long second focal length to provide desired long optical properties for acquiring a third image of the code at the predefined long distance; and c) storing on the memory a digital representation of the short first focal length, the medium first focal length, the long first focal length, the short second focal length, the medium second focal length, and the long second focal length.
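As a non-limiting sketch of step c) of the methods above, the determined focal lengths can be stored as a simple digital representation. The file-based store, key names, and numeric values below are assumptions chosen only for illustration.

```python
import json

# Hypothetical sketch of storing the determined focal lengths on the memory as
# a digital representation (step c). The JSON file and values are placeholders.

def store_focal_lengths(path: str, focal_lengths_mm: dict) -> None:
    """Persist the determined first/second focal lengths for each predefined distance."""
    with open(path, "w") as f:
        json.dump(focal_lengths_mm, f, indent=2)

store_focal_lengths("focal_length_presets.json", {
    "short":  {"first_mm": 12.0, "second_mm": -15.0},
    "medium": {"first_mm": 15.0, "second_mm": -12.0},
    "long":   {"first_mm": 18.0, "second_mm": -9.0},
})
```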
In another aspect, the present disclosure provides an imaging device. The imaging device includes an image sensor, a rear fixed focus lens group, a first liquid lens, a second liquid lens, an aperture, a front fixed focus lens group, and a controller. The rear fixed focus lens group is positioned in front of the image sensor along an optical axis. The first liquid lens is positioned in front of the image sensor along the optical axis. The second liquid lens is positioned in front of the image sensor along the optical axis. The aperture is positioned between the first and second liquid lenses along the optical axis of the imaging device. The optical axis passes through the first and second liquid lenses. The front fixed focus lens group is positioned in front of the image sensor along the optical axis. The rear fixed focus lens group or the front fixed focus lens group includes at least one aspherical lens. The controller restrains the first liquid lens and the second liquid lens to settings where the F # of the imaging device is constant independent of zoom level. The controller simultaneously adjusts the zoom level and focus. The controller adjusts the first liquid lens and the second liquid lens to project a zoom-independent area of a focal plane projected onto the image sensor.
In another aspect, the present disclosure provides an imaging device. The imaging device includes an image sensor, a rear fixed focus lens group, a first liquid lens, a second liquid lens, an aperture, a front fixed focus lens group, and a controller. The rear fixed focus lens group is positioned in front of the image sensor along an optical axis of the imaging device. The first liquid lens is positioned in front of the image sensor along the optical axis. The first liquid lens includes a first solid portion and a first flexible membrane. The first solid portion and the first flexible membrane define a first interior space that is filled by a first liquid. The second liquid lens is positioned in front of the image sensor along the optical axis. The second liquid lens includes a second solid portion and a second flexible membrane. The second solid portion and the second flexible membrane define a second interior space that is filled by a second liquid. The aperture is positioned between the first liquid lens and the second liquid lens along the optical axis. The optical axis passes through the first liquid lens and the second liquid lens. The front fixed focus lens group is positioned in front of the image sensor along the optical axis. The controller is operatively coupled to the first liquid lens and the second liquid lens. The controller is configured to: receive a desired focal distance for a symbol candidate for attempted decoding; retrieve from a database or compute focal length settings for the first liquid lens and the second liquid lens that are capable of achieving the desired focal distance; and send, using the focal length settings, signals to the first liquid lens and the second liquid lens to cause the first liquid lens and the second liquid lens to be at the desired focal distance. The controller restrains the first liquid lens and the second liquid lens to settings where the F # of the imaging device is constant independent of zoom level. The controller simultaneously adjusts the zoom level and focus.
In another aspect, the present disclosure provides an imaging device. The imaging device includes an image sensor, a rear fixed focus lens group, a first liquid lens, a second liquid lens, an aperture, a front fixed focus lens group, and a controller. The rear fixed focus lens group is positioned in front of the image sensor along an optical axis of the imaging device. The first liquid lens is positioned in front of the image sensor along the optical axis. The first liquid lens includes a first solid portion and a first flexible membrane. The first solid portion and the first flexible membrane define a first interior space that is filled by a first liquid. The second liquid lens is positioned in front of the image sensor along the optical axis. The second liquid lens includes a second solid portion and a second flexible membrane. The second solid portion and the second flexible membrane define a second interior space that is filled by a second liquid. The aperture is positioned between the first liquid lens and the second liquid lens along the optical axis. The optical axis passes through the first liquid lens and the second liquid lens. The front fixed focus lens group is positioned in front of the image sensor along the optical axis. The controller is operatively coupled to the first liquid lens and the second liquid lens. The first liquid lens is oriented such that the first flexible membrane faces toward the image sensor. The second liquid lens is oriented such that the second flexible membrane faces toward the image sensor. The image sensor includes an indicia that indicates a preferred operating orientation. The first flexible membrane has a first inflection point positioned further away from the image sensor relative to a first neutral position of the first flexible membrane and the second flexible membrane has a second inflection point positioned further away from the image sensor relative to a second neutral position of the second flexible membrane when the imaging device is positioned in the preferred operating orientation. The controller restrains the first liquid lens and the second liquid lens to settings where the F # of the imaging device is constant independent of zoom level. The controller simultaneously adjusts the zoom level and focus. The controller adjusts the first liquid lens and the second liquid lens to project a zoom-independent area of a focal plane projected onto the image sensor.
In another aspect, the present disclosure provides an imaging device. The imaging device includes an image sensor, a rear fixed focus lens group, a first liquid lens, a second liquid lens, a front fixed focus lens group, a distance sensor, and a controller. The rear fixed focus lens group is positioned in front of the image sensor along an optical axis of the imaging device. The first liquid lens is positioned in front of the image sensor along the optical axis. The second liquid lens is positioned in front of the image sensor along the optical axis. The front fixed focus lens group is positioned in front of the image sensor along the optical axis. The controller is operatively coupled to the first liquid lens, the second liquid lens, and the distance sensor. The controller is configured to: determine, using the distance sensor, a distance between an object including a symbol candidate and the imaging device; receive a desired focal distance for a symbol candidate for attempted decoding, based on the distance between the object and the imaging device; retrieve from a database or compute focal length settings for the first liquid lens and the second liquid lens that are capable of achieving the desired focal distance; and send, using the focal length settings, signals to the first liquid lens and the second liquid lens to cause the first liquid lens and the second liquid lens to be at the desired focal distance. The controller restrains the first liquid lens and the second liquid lens to settings where the F # of the imaging device is constant independent of zoom level. The controller simultaneously adjusts the zoom level and focus.
In another aspect, the present disclosure provides an imaging device. The imaging device includes an image sensor, a rear fixed focus lens group, a first liquid lens, a second liquid lens, a front fixed focus lens group, and a controller. The rear fixed focus lens group is positioned in front of the image sensor along an optical axis of the imaging device. The first liquid lens is positioned in front of the image sensor along the optical axis. The second liquid lens is positioned in front of the image sensor along the optical axis. The front fixed focus lens group is positioned in front of the image sensor along the optical axis. The controller is operatively coupled to the first liquid lens and the second liquid lens. The image sensor, the rear fixed focus lens group, the first liquid lens, the second liquid lens, and the front fixed focus lens group are positioned in order along the optical axis. The controller is configured to: cause the first liquid lens to be at a first focal length that has an optical power of substantially 0 diopters; and with the first liquid lens at the first focal length, calibrate the imaging device.
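A minimal sketch of the calibration behavior described in this aspect is shown below; the lens and device interfaces are hypothetical stand-ins for whatever control interface a given implementation exposes, and the sketch is illustrative only.

```python
# Hypothetical sketch: set the first liquid lens to substantially 0 diopters
# (i.e., effectively no optical power), then run the device calibration routine.

class LiquidLens:  # stand-in for the first liquid lens interface
    def set_optical_power_diopters(self, diopters: float) -> None:
        print(f"liquid lens optical power -> {diopters} diopters")

class ImagingDevice:  # stand-in for the imaging device interface
    def run_calibration(self) -> None:
        print("calibrating imaging device")  # e.g., image a target, fit parameters

def calibrate_with_neutral_first_lens(first_liquid_lens: LiquidLens,
                                      device: ImagingDevice) -> None:
    first_liquid_lens.set_optical_power_diopters(0.0)
    device.run_calibration()

calibrate_with_neutral_first_lens(LiquidLens(), ImagingDevice())
```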
In another aspect, a sorting system is disclosed. The sorting system includes an imaging device as described herein, a conveyor, and a distance sensor. The distance sensor is upstream of the imaging device and configured to measure a height of objects traveling down the conveyor.
Before the present invention is described in further detail, it is to be understood that the invention is not limited to the particular embodiments described. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting. The scope of the present invention will be limited only by the claims. As used herein, the singular forms “a”, “an”, and “the” include plural embodiments unless the context clearly dictates otherwise.
It should be apparent to those skilled in the art that many additional modifications beside those already described are possible without departing from the inventive concepts. In interpreting this disclosure, all terms should be interpreted in the broadest possible manner consistent with the context. Variations of the term “comprising”, “including”, or “having” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, so the referenced elements, components, or steps may be combined with other elements, components, or steps that are not expressly referenced. Embodiments referenced as “comprising”, “including”, or “having” certain elements are also contemplated as “consisting essentially of” and “consisting of” those elements, unless the context clearly dictates otherwise. It should be appreciated that aspects of the disclosure that are described with respect to a system are applicable to the methods, and vice versa, unless the context explicitly dictates otherwise.
Numeric ranges disclosed herein are inclusive of their endpoints. For example, a numeric range of between 1 and 10 includes the values 1 and 10. When a series of numeric ranges are disclosed for a given value, the present disclosure expressly contemplates ranges including all combinations of the upper and lower bounds of those ranges. For example, a numeric range of between 1 and 10 or between 2 and 9 is intended to include the numeric ranges of between 1 and 9 and between 2 and 10.
As used herein, the terms “component,” “system,” “device” and the like are intended to refer to either hardware, a combination of hardware and software, software, or software in execution. The word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs.
Furthermore, the disclosed subject matter may be implemented as a system, method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce hardware, firmware, software, or any combination thereof to control an electronic-based device to implement aspects detailed herein.
Unless specified or limited otherwise, the terms “connected,” and “coupled” and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings. As used herein, unless expressly stated otherwise, “connected” means that one element/feature is directly or indirectly connected to another element/feature, and not necessarily electrically or mechanically. Likewise, unless expressly stated otherwise, “coupled” means that one element/feature is directly or indirectly coupled to another element/feature, and not necessarily electrically or mechanically.
As used herein, the term “processor” may include one or more processors and memories and/or one or more programmable hardware elements. As used herein, the term “processor” is intended to include any types of processors, CPUs, microcontrollers, digital signal processors, or other devices capable of executing software instructions.
As used herein, the term “memory” includes a non-volatile medium, e.g., a magnetic media or hard disk, optical storage, or flash memory; a volatile medium, such as system memory, e.g., random access memory (RAM) such as DRAM, SRAM, EDO RAM, RAMBUS RAM, DR DRAM, etc.; or an installation medium, such as software media, e.g., a CD-ROM, or floppy disks, on which programs may be stored and/or data communications may be buffered. The term “memory” may also include other types of memory or combinations thereof.
A “field of view” (“FOV”) as used herein shall refer to a three-dimensional space in which an object that has no obstructions between the object and the image sensor is visible at the image sensor.
A “two-dimensional field of view” as used herein shall refer to a projection of the field of view onto the image sensor. The two-dimensional field of view can be limited by two factors: 1) the size of the image sensor—if light is projected around the image sensor, that light is outside the two-dimensional field of view; and 2) any apertures located between the image scene and the image sensor—if no light is projected onto pixels at the periphery of the image sensor, then those pixels are outside the two-dimensional field of view.
A “focal plane” as used herein shall refer to a plane that is perpendicular to an optical axis and which is positioned at the focus of the camera along the optical axis.
A “lens” as used herein shall refer to an optic or multiple optics that perform the function of a lens. Examples of lenses include, but are not limited to, single lenses, lens objectives, curved mirror pairs, and the like.
A “variable focus lens” as used herein shall refer to an optic or multiple optics that perform the function of a lens, where the optical power of the variable focus lens can be changed in some fashion. Examples of variable focus lenses include, but are not limited to, liquid lenses and the like.
When distances are discussed herein, the distance refers to a distance along an optical path and not an absolute distance between two objects, unless the context clearly dictates otherwise. As an example, if a first object, a point on a reflective surface, and a second object are positioned so as to form an equilateral triangle having sides of a given length, and the optical path runs from the first object to the point on the reflective surface to the second object, then the distance between the first object and the second object is equal to two times the given length.
The present disclosure relates to optical systems and methods of making and using the same. The optical systems can be deployed in a variety of code reading contexts, including a code reader, such as a hand held code reader.
As discussed above, in imaging applications where computational resources are more scarce, such as code reading, optical systems are needed that can take advantage of the abilities of variable focus lenses without overwhelming system computational resources with the determination of variable focus lens settings. As one example, the speed with which an optical system can implement real-time adjustments to parameters (e.g., changing focus, zoom, etc.) is increasingly relevant when imaging moving objects. In a shipping/logistics setting where objects must be efficiently moved via conveyors, for example, decisions about packages of varying sizes, with varying labels (e.g., position, size, type), and with varying positions relative to the optical system often rely on processed image data. An image of a barcode may be “decoded” to make significant decisions such as where to route the package, what the package contents are, etc. It is appreciated that the quality of the captured image corresponds to the reliability and quality of the resulting data. Potential issues from the optical system (e.g., blurry images, partial barcodes) can impact the broader system, particularly output and efficiency.
Continuing with the shipping/logistics example, there is a common emphasis on conveyor speed and on enabling systems to operate at ever higher rates of speed. Current conveyor systems can operate in excess of 1.5 m/s (e.g., 2.0 m/s, 2.5 m/s, and higher), often with only a small gap between packages. When considering system operating parameters, a major question is the speed with which an optical system can consistently capture images of suitable quality. For example, if a large package (i.e., one whose surface to be imaged is relatively close to the imaging device) is immediately preceded on a conveyor by a small package (i.e., one whose surface to be imaged is relatively far from the imaging device), how quickly can the optical system adjust such that suitable quality images are captured for both packages?
The present disclosure provides optical systems and methods capable of high-speed image acquisition across a range of focal distances, while preserving the image resolution commonly associated with slower speeds. In particular, the present disclosure provides an optical system configured to update both the focus and the zoom substantially simultaneously. In contrast, a conventional “zoom lens” updates the zoom and the focus separately and sequentially. By updating the focus and the zoom substantially simultaneously, the optical system can respond quickly to changes within the imaging scene (e.g., as packages are conveyed through the FOV).
As will be described, the present disclosure includes optical systems configured to provide a constant F #. Embodiments of the present disclosure can also include two liquid lenses. Conventional systems involving two liquid lenses are limited in terms of the imager size, which limits the image quality. Additionally, conventional optical systems are reliant on a compensator, which is a mechanical component used to perform fine-tuning. Notably, the present disclosure enables the removal of such a compensator through the use of the described two liquid lens configurations.
Speed and resolution data were measured during testing, confirming the high-speed capabilities of the described embodiments. In summary, the simultaneous adjustment of the zoom level and focus is shown to occur within 15-25 milliseconds. Even at an adjustment speed of 15 milliseconds, the incidence of “no-reads” (i.e., failures to decode an imaged barcode) remained negligible at 0.015%. Thus, the disclosed systems and methods can still perform adequately even with an adjustment speed of less than 15 milliseconds.
Referring now to
A code can have known size dimensions. In some cases, the code can have known size dimensions including a largest dimension of between 0.1 cm and 50 cm. In some cases, the code can have known size dimensions including a code width of between 0.1 cm and 25 cm and a code length of between 0.1 cm and 25 cm. In certain cases, a one-dimensional code can have a known length of between 0.1 cm and 50 cm. In certain cases, a two-dimensional code can have a known length of between 0.5 cm and 25 cm and/or a known width of between 0.5 cm and 25 cm. In certain cases, the code can have known size dimensions of approximately 2.5 cm×2.5 cm. In certain cases, the code can have known size dimensions of approximately 7.5 cm×7.5 cm.
The optical system 10 can further include a variable focus lens controller 20 in electronic communication with the first variable focus lens 16 and/or the second variable focus lens 18. The variable focus lens controller 20 can be a single controller configured to control the first variable focus lens 16 and the second variable focus lens 18. The variable focus lens controller 20 can be two distinct controllers, each configured to control one of the first variable focus lens 16 and the second variable focus lens 18. The variable focus lens controller 20 can be mounted within the housing 30 (not illustrated) or can be remote from the housing 30. The variable focus lens controller 20 can be a driver circuit configured to receive a small voltage that is representative of a desired variable focus lens focal length and to deliver a larger and/or different voltage to a variable focus lens that causes the variable focus lens to be set to the desired variable focus lens focal length.
The optical system 10 can further include a processor 21 in electronic communication with the image sensor 12 and/or the variable focus lens controller 20. The processor 21 can be configured to acquire an image using the image sensor 12. The processor 21 can be mounted within the housing 30 (not illustrated) or can be remote from the housing 30. The processor 21 can be configured to receive an input. In some cases, the input can be a user input. In some cases, the input can be an automatically-generated input. The input can specify a predefined short distance or a predefined long distance. In some cases, the input can specify a predefined short distance, a predefined medium distance, or a predefined long distance. The processor 21 can be configured to access a memory (not illustrated) having stored thereon instructions that, when executed by the processor, cause the processor to execute one or more of the steps of any methods described herein, including methods 200, 300.
The image sensor 12 can be an image sensor known to those having ordinary skill in the art. Examples of suitable image sensors include, but are not limited to, a complementary metal-oxide-semiconductor (CMOS) camera sensor, a charge-coupled device (CCD) sensor, an N-type metal-oxide-semiconductor (NMOS) camera sensor, and the like.
The image sensor 12 can be configured to acquire a monochromatic image. The image sensor 12 can be a monochromatic image sensor. In certain cases, the image sensor 12 can be a color image sensor. In cases with a color image sensor, the image sensor 12 can be operable to acquire an image with only a single wavelength of light (i.e., the image sensor 12 can be a color image sensor that is operable to function as a monochromatic image sensor).
The image sensor 12 can define an optical axis 22. The optical axis 22 is perpendicular to the image sensor 12. In some cases, the optical axis 22 emerges from a central point of the image sensor 12.
The fixed focus lens 14, the first variable focus lens 16, and the second variable focus lens 18 can be positioned along the optical axis 22. The fixed focus lens 14 can be positioned at a first distance 24 from the image sensor 12. The first variable focus lens 16 can be positioned at a second distance 26 from the image sensor 12. The second variable focus lens 18 can be positioned at a third distance 28 from the image sensor 12. The third distance 28 is longer than the second distance 26, and both the third distance 28 and the second distance 26 are longer than the first distance 24. These distances can also be defined as the distances between the given optical components, as would be understood by a person having ordinary skill in the art. The fixed focus lens 14 is positioned between the image sensor 12 and the first variable focus lens 16. The first variable focus lens 16 is positioned between the fixed focus lens 14 and the second variable focus lens 18.
The first distance 24, which is the distance between the image sensor 12 and the fixed focus lens 14, can be between 5 mm and 20 mm.
The second distance 26, which is the distance between the image sensor 12 and the first variable focus lens 16, can be between 5 mm and 25 mm.
The third distance 28, which is the distance between the image sensor 12 and the second variable focus lens 18, can be between 10 mm and 40 mm.
The fixed focus lens 14 can be made of materials known to those having ordinary skill in the optical arts to be suitable for use as materials for a lens having properties that do not vary over time. Examples of suitable materials for the fixed focus lens 14 include, but are not limited to, glass, quartz, fluorite, fused silica, plastic, and the like.
The first variable focus lens 16 and/or the second variable focus lens 18 can be a liquid lens. Non-limiting examples of a liquid lens include Liquid Lens Technology by Cognex Corporation headquartered in Natick, MA, a focus tunable lens by Optotune Switzerland AG headquartered in Dietikon, Switzerland, or variable focus liquid lenses by Varioptic headquartered in Lyon, France. The liquid lens (or liquid lenses) can be configured to have adjustable optical power that corresponds to an applied voltage that is provided to the liquid lens.
The first variable focus lens 16 can have an adjustable first focal length. The first focal length can be adjustable over a range including, but not limited to, between −20 mm and +20 mm, between −50 mm and +50 mm, or between −100 mm and +100 mm. The optical power of the first variable focus lens 16, defined as the inverse of the focal length (1/f), can be adjusted between −50 and +50 diopters, including but not limited to between −20 and +20 diopters, or between −10 and +10 diopters.
The second variable focus lens 18 can have an adjustable second focal length. The second focal length can be adjustable over a range including, but not limited to, between −20 mm and +20 mm, between −50 mm and +50 mm, or between −100 mm and +100 mm. The optical power of the second variable focus lens 18, defined as the inverse of the focal length (1/f), can be adjusted between −50 and +50 diopters, including but not limited to between −20 and +20 diopters, or between −10 and +10 diopters.
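Because the ranges above are expressed interchangeably as focal lengths and optical powers, a small conversion helper is illustrated below; the helper and its example values are not part of the disclosure and simply apply the stated relationship that optical power is the inverse of the focal length (1/f).

```python
# Hypothetical helper applying the stated relationship: optical power (diopters)
# is the inverse of the focal length expressed in meters.

def focal_length_mm_to_diopters(focal_length_mm: float) -> float:
    return 1000.0 / focal_length_mm   # 1 / (focal length in meters)

def diopters_to_focal_length_mm(diopters: float) -> float:
    return 1000.0 / diopters

print(focal_length_mm_to_diopters(100.0))   # +100 mm  -> +10 diopters
print(diopters_to_focal_length_mm(-20.0))   # -20 diopters -> -50 mm
```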
Referring to
Referring to
The predefined short distance 32, the predefined medium distance 36, and the predefined long distance 34 can be measured from the second variable focus lens 18 or some other fixed point within the optical system 10, such as a front-facing window of a code reader, so long as the distances remain consistently defined for the purposes of the mathematical calculations discussed herein. A person having ordinary skill in the art will appreciate how to compensate for differences in the definitions of these distances.
In the two- and three-distance configuration, the processor and/or the variable focus lens controller can be configured to selectively control the first variable focus lens and the second variable focus lens to project a short field of view 38 onto the image sensor. In the two- and three-distance configuration, the processor and/or the variable focus lens controller can be configured to set the first focal length of the first variable focus lens and the second focal length of the second variable focus lens, thereby projecting a short field of view onto the image sensor. In the two- and three-distance configuration, the processor and/or the variable focus lens controller can be configured to selectively control the first variable focus lens and the second variable focus lens to project a long field of view onto the image sensor. In the two- and three-distance configuration, the processor and/or the variable focus lens controller can be configured to set the first focal length of the first variable focus lens and the second focal length of the second variable focus lens, thereby projecting a long field of view 40 onto the image sensor. In the two- and three-distance configuration, the processor and/or the variable focus lens controller can be configured to operate in a short distance mode and a long distance mode.
In the three-distance configuration, the processor and/or the variable focus lens controller can be configured to selectively control the first variable focus lens and the second variable focus lens to project a medium field of view 42 onto the image sensor. In the three-distance configuration, the processor and/or the variable focus lens controller can be configured to set the first focal length of the first variable focus lens and the second focal length of the second variable focus lens, thereby projecting a medium field of view onto the image sensor. In the three-distance configuration, the processor and/or the variable focus lens controller can be configured to operate in a medium distance mode.
It should be appreciated that the optical system 10 will be configured to project one field of view onto the image sensor at a given time. Thus, in aspects described as being configured to project multiple different fields of view onto the image sensor, these aspects should be interpreted as being alternately selectable. For example, an optical system 10 having a variable focus lens controller programmed to selectively control the first variable focus lens and the second variable focus lens to project a short field of view and a long field of view onto the image sensor refers to the variable focus lens controller being able to project the short field of view with one pair of functional settings for the first variable focus lens and the second variable focus lens and able to project the long field of view with a different pair of functional settings for the first variable focus lens and the second variable focus lens.
The short field of view 38 includes a short focal plane 44 and a short viewing angle 46.
The long field of view 40 includes a long focal plane 48 and a long viewing angle 50.
The medium field of view 42 includes a medium focal plane 52 and a medium viewing angle 54.
The short field of view 38 can have a wider viewing angle relative to the long field of view 40. The short field of view 38 can have a shorter depth of field relative to the long field of view 40. The short field of view 38 can generate a lower resolution relative to the long field of view 40. The short field of view 38 can have a larger aperture relative to the long field of view 40. The short field of view 38 can generate a projection of the code onto the image sensor 12 that is within 10% of the size of the projection generated by the long field of view 40.
The short field of view 38 can have a wider viewing angle relative to the medium field of view 42. The short field of view 38 can have a shorter depth of field relative to the medium field of view 42. The short field of view 38 can generate a lower resolution relative to the medium field of view 42. The short field of view 38 can have a larger aperture relative to the medium field of view 42. The short field of view 38 can generate a projection of the code onto the image sensor 12 that is within 10% of the size of the projection generated by the medium field of view 42.
The medium field of view 42 can have a wider viewing angle relative to the long field of view 40. The medium field of view 42 can have a shorter depth of field relative to the long field of view 40. The medium field of view 42 can generate a lower resolution relative to the long field of view 40. The medium field of view 42 can have a larger aperture relative to the long field of view 40. The medium field of view 42 can generate a projection of the code onto the image sensor 12 that is within 10% of the size of the projection generated by the long field of view 40.
In the two-distance configuration, the processor 21 and/or the variable focus lens controller 20 can be configured to prevent the first focal length from being adjusted to a value that is outside a range defined by the focal length values between the predefined short first focal length minus 25 mm, 10 mm, 5 mm, or 1 mm and the predefined short first focal length plus 25 mm, 10 mm, 5 mm, or 1 mm and between the predefined long first focal length minus 25 mm, 10 mm, 5 mm, or 1 mm and the predefined long first focal length plus 25 mm, 10 mm, 5 mm, or 1 mm. In the two-distance configuration, the processor 21 and/or the variable focus lens controller 20 can be configured to prevent the second focal length from being adjusted to a value that is outside a range defined by the focal length values between the predefined short second focal length minus 25 mm, 10 mm, 5 mm, or 1 mm and the predefined short second focal length plus 25 mm, 10 mm, 5 mm, or 1 mm and between the predefined long second focal length minus 25 mm, 10 mm, 5 mm, or 1 mm and the predefined long second focal length plus 25 mm, 10 mm, 5 mm, or 1 mm.
In the two-distance configuration, the processor 21 and/or the variable focus lens controller 20 can be configured to prevent the first optical power from being adjusted to a value that is outside a range defined by the predefined short first optical power plus or minus 0.1 diopter, 0.2 diopter, 0.5 diopter, or 1 diopter and the range defined by the predefined long first optical power plus or minus 0.1 diopter, 0.2 diopter, 0.5 diopter, or 1 diopter. In the two-distance configuration, the processor 21 and/or the variable focus lens controller 20 can be configured to prevent the second optical power from being adjusted to a value that is outside a range defined by the predefined short second optical power plus or minus 0.1 diopter, 0.2 diopter, 0.5 diopter, or 1 diopter and the range defined by the predefined long second optical power plus or minus 0.1 diopter, 0.2 diopter, 0.5 diopter, or 1 diopter.
In the three-distance configuration, the processor 21 and/or the variable focus lens controller 20 can be configured to prevent the first optical power from being adjusted to a value that is outside a range defined by the predefined short first optical power plus or minus 0.1 diopter, 0.2 diopter, 0.5 diopter, or 1 diopter and the range defined by the predefined medium first optical power plus or minus 0.1 diopter, 0.2 diopter, 0.5 diopter, or 1 diopter and the range defined by the predefined long first optical power plus or minus 0.1 diopter, 0.2 diopter, 0.5 diopter, or 1 diopter. In the three-distance configuration, the processor 21 and/or the variable focus lens controller 20 can be configured to prevent the second optical power from being adjusted to a value that is outside a range defined by the predefined short second optical power plus or minus 0.1 diopter, 0.2 diopter, 0.5 diopter, or 1 diopter and the range defined by the predefined medium second optical power plus or minus 0.1 diopter, 0.2 diopter, 0.5 diopter, or 1 diopter and the range defined by the predefined long second optical power plus or minus 0.1 diopter, 0.2 diopter, 0.5 diopter, or 1 diopter.
In the three-distance configuration, the processor 21 and/or the variable focus lens controller 20 can be configured to prevent the first focal length from being adjusted to a value that is outside a range defined by the focal length values between the predefined short first focal length minus 25 mm, 10 mm, 5 mm, or 1 mm and the predefined short first focal length plus 25 mm, 10 mm, 5 mm, or 1 mm, between the predefined medium first focal length minus 25 mm, 10 mm, 5 mm, or 1 mm and the predefined medium first focal length plus 25 mm, 10 mm, 5 mm, or 1 mm, and between the predefined long first focal length minus 25 mm, 10 mm, 5 mm, or 1 mm and the predefined long first focal length plus 25 mm, 10 mm, 5 mm, or 1 mm. In the three-distance configuration, the processor 21 and/or the variable focus lens controller 20 can be configured to prevent the second focal length from being adjusted to a value that is outside a range defined by the focal length values between the predefined short second focal length minus 25 mm, 10 mm, 5 mm, or 1 mm and the predefined short second focal length plus 25 mm, 10 mm, 5 mm, or 1 mm, between the predefined medium second focal length minus 25 mm, 10 mm, 5 mm, or 1 mm and the predefined medium second focal length plus 25 mm, 10 mm, 5 mm, or 1 mm, and between the predefined long second focal length minus 25 mm, 10 mm, 5 mm, or 1 mm and the predefined long second focal length plus 25 mm, 10 mm, 5 mm, or 1 mm.
The processor 21 can be configured to fine tune the first focal length and/or the second focal length. This fine tuning can involve acquiring an image using the processor 21 and the image sensor 12. The processor 21 can be configured to fine tune the first focal length and the second focal length within the focal length ranges or optical power ranges described in the four immediately preceding paragraphs. The processor 21 can be configured to fine tune the first focal length and the second focal length by an amount of less than 25 mm, less than 10 mm, less than 5 mm, or less than 1 mm. The image can then be analyzed for one or more desired performance characteristics, including but not limited to, sharpness, brightness, illumination, distance measurements, combinations thereof, and the like.
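One way to picture this fine tuning is a bounded search: the focal length is stepped only within the allowed window around its predefined value (consistent with the range limits described above), an image is acquired at each step, and a simple sharpness score selects the best setting. The camera and controller interfaces, window size, and sharpness metric below are assumptions for illustration, not the disclosure's specific method.

```python
import numpy as np

# Hypothetical sketch of fine tuning the first focal length within its allowed
# window by maximizing a simple image sharpness score.

def sharpness(image: np.ndarray) -> float:
    """Simple focus metric: variance of the horizontal intensity gradient."""
    return float(np.var(np.diff(image.astype(float), axis=1)))

def fine_tune_first_focal_length(camera, lens_controller, nominal_mm: float,
                                 window_mm: float = 5.0, step_mm: float = 0.5) -> float:
    """Step the first focal length across [nominal - window, nominal + window],
    score each acquired image, and leave the lens at the sharpest setting."""
    best_mm, best_score = nominal_mm, float("-inf")
    candidates = np.arange(nominal_mm - window_mm, nominal_mm + window_mm + step_mm, step_mm)
    for f_mm in candidates:
        lens_controller.set_first_focal_length(float(f_mm))
        score = sharpness(camera.acquire_image())
        if score > best_score:
            best_mm, best_score = float(f_mm), score
    lens_controller.set_first_focal_length(best_mm)
    return best_mm
```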
While the optical system 10 is illustrated in a linear fashion, it should be appreciated that the optical system 10 can include various reflective configurations without departing from the spirit of the disclosure. The optical system 10 can include additional optics that are suitable for use with the present disclosure, including but not limited to, additional lenses that do not impact the described function of the optical system 10, filters, apertures, shutters, polarizers, mirrors, waveplates, prisms, combinations thereof, and the like.
Optics in the optical system 10, including the fixed focus lens 14, the first variable focus lens 16, and the second variable focus lens 18, can include various coatings, such as anti-reflection coatings, filter coatings (i.e., coatings that optically filter out certain wavelengths, such as ultraviolet light), anti-lens flare coatings, infrared filter coatings, short band pass filter coatings, broadband anti-reflection (BBAR) coatings, combinations thereof, and the like.
It should be appreciated that other types of variable focus lenses are contemplated without departing from the scope of the present disclosure, including liquid lenses that have optical power adjusted by way of changes in fluid pressure provided by a high speed pump, and the like.
In certain aspects, the variable focus lens controller 20 can be a driver circuit that is configured to provide a known voltage to the first variable focus lens 16 and/or the second variable focus lens 18, such as when the first variable focus lens 16 and/or the second variable focus lens 18 is a liquid lens. In certain cases, the variable focus lens controller 20 can have firmware that restricts control of the first variable focus lens 16 and the second variable focus lens 18 to the specific settings described herein (i.e., settings for a predefined short distance, a predefined long distance, and optionally a predefined medium distance) and prevents the variable focus lens controller 20 from accessing other settings and the different optical outcomes that would result.
In certain cases, the optical system 10 can include a processor 21 that is separate and distinct from the variable focus lens controller 20. In these separate and distinct cases, the processor 21 can be located proximate the variable focus lens controller 20 or remote from the variable focus lens controller 20. In certain cases, the optical system 10 can include a processor 21 that is integrated with the variable focus lens controller 20.
In certain cases, the optical system 10 described herein can be programmed specifically to capture images of a code of a given size at two distances, a short distance and a long distance. When designing the optical system 10 for the given code size and the two given distances, pairs of optical powers for the first variable focus lens 16 and the second variable focus lens 18 are calculated, each pair corresponding to a different viewing angle and working distance (i.e., the short distance and the long distance). The following equations can be used for these calculations:
where f1′ is the focal length of the first variable focus lens 16, f2′ is the focal length of the second variable focus lens 18, fT′ is the focal length of the entire optical system 10, e is the distance between the first variable focus lens 16 and the second variable focus lens 18 (i.e., the third distance minus the second distance), a1 is the distance from the second variable focus lens 18 to the object (i.e., the code), and a2′ is the back focal length (i.e., the distance from the image sensor 12 to the fixed focus lens 14).
Equations (1) and (2) can be solved for both the short and long distances (i.e., the two anticipated distances between the second variable focus lens 18 and the code for a given code reading application). This solving of the equations can take into account any constraints on optical power for the variable focus lenses 16, 18. A discrete pair (or set) of working distances can be calculated with magnification and field of view being optimized for a given usage.
In certain cases, the optical power of both variable focus lenses 16, 18 can be adjusted to achieve a certain magnification at a certain object distance. In certain aspects, the optical system 10 can be designed to provide a small field of view (or a narrow viewing angle) at the long distance and/or a large field of view (or a wide viewing angle) at the short distance.
As one exemplary calculation, assuming a system that has a distance between the two variable focus lenses 16, 18 of 24 mm, a back focal length of 20 mm, a short object distance of 100 mm, and a long object distance of 10,000 mm, the focal length settings shown in Table 1 are computed. The settings in Table 1 provide a large viewing angle at the short object distance and a higher resolution at the long object distance.
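By way of illustration only, the following sketch shows one way such pairs of focal lengths might be computed numerically. It assumes standard paraxial thin-lens relations in place of equations (1) and (2), which are not reproduced in this text, folds the fixed focus lens 14 into the 20 mm back-focus target, and uses hypothetical candidate focal length values; it is a sketch of the approach, not the disclosed design procedure.

```python
# Illustrative paraxial sketch only; standard thin-lens relations stand in for
# equations (1) and (2), and the candidate values below are hypothetical.

def trace(f2_mm, f1_mm, e_mm, a1_mm):
    """Object at a1_mm in front of the second variable focus lens 18 (f2'),
    then the first variable focus lens 16 (f1') at separation e_mm; returns
    (back_focus_mm, efl_mm, magnification) using 1/s' = 1/f - 1/s with
    real-is-positive distances."""
    s2p = 1.0 / (1.0 / f2_mm - 1.0 / a1_mm)        # intermediate image from lens 18
    s1 = e_mm - s2p                                 # signed object distance for lens 16
    s1p = 1.0 / (1.0 / f1_mm - 1.0 / s1)            # final image distance behind lens 16
    efl = f1_mm * f2_mm / (f1_mm + f2_mm - e_mm)    # standard two-thin-lens combination
    mag = (s2p / a1_mm) * (s1p / s1)                # approximate overall magnification
    return s1p, efl, mag

def solve_f1(f2_mm, e_mm, a1_mm, target_bfl_mm, lo=5.0, hi=200.0):
    """Bisection on f1' so the image lands at the fixed back focal length."""
    g = lambda f1: trace(f2_mm, f1, e_mm, a1_mm)[0] - target_bfl_mm
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    e, bfl = 24.0, 20.0                             # example spacing and back focus
    for a1 in (100.0, 10_000.0):                    # short and long object distances
        for f2 in (50.0, 80.0, 120.0):              # hypothetical front-lens candidates
            f1 = solve_f1(f2, e, a1, bfl)
            _, efl, mag = trace(f2, f1, e, a1)
            print(f"a1={a1:7.0f} mm  f2'={f2:5.1f} mm  f1'={f1:6.1f} mm  "
                  f"EFL={efl:6.1f} mm  mag={mag:+.4f}")
```

In this sketch, the front-lens focal length f2′ is treated as the free parameter that sets the field of view, and the rear-lens focal length f1′ is then solved so that the image falls at the fixed back focal length; scanning f2′ at each working distance illustrates how magnification and field of view can be traded off for a given usage.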
It should be appreciated that the exemplary calculations described above are not the only way to compute the pairs of optical powers and the calculations are not intended to be limiting.
In certain cases, the system 10 can be configured to provide focal lengths and fields of view at a continuous plurality of distances. Referring to
In certain cases, the processor 21 and/or the variable focus lens controller 20 can be configured to prevent the variable focus lenses 16, 18 from having their focal length set to any values other than the predefined short focal length and predefined long focal length values (and optionally, the predefined medium focal length values).
Instructions can be stored (e.g., in a memory accessible to the processor 21) that, when executed by the processor 21, cause the processor 21 to execute any of the methods of using the optical system that are described herein.
Referring to
The optical system 10 and code reader 100 described herein can include features described in the methods 200, 300, 400 described below, unless the context clearly dictates otherwise.
Referring to
At process block 212, the method 200 includes acquiring an image with the optical system. At optional process block 214, the method 200 can include analyzing the image to determine if the image is of sufficient quality for decoding the code. At optional process block 216, the method 200 can include fine tuning the first focal length and/or the second focal length based on the analysis of process block 214. The fine tuning of process block 216 includes the fine tuning described elsewhere herein. At optional process block 218, the method can include acquiring a fine-tuned image. Process blocks 214, 216, and 218 can be repeated as necessary to provide an image of sufficient quality for decoding the code. At process block 220, the method 200 can include decoding the code in either the image or the fine-tuned image.
Referring to
At process block 316, the method 300 includes acquiring an image with the optical system. At optional process block 318, the method 300 can include analyzing the image to determine if the image is of sufficient quality for decoding the code. At optional process block 320, the method 300 can include fine tuning the first focal length and/or the second focal length based on the analysis of process block 318. The fine tuning of process block 320 includes the fine tuning described elsewhere herein. At optional process block 322, the method can include acquiring a fine-tuned image. Process blocks 318, 320, and 322 can be repeated as necessary to provide an image of sufficient quality for decoding the code. At process block 324, the method 300 can include decoding the code in either the image or the fine-tuned image.
Receiving the input of process blocks 202 and 302 can be achieved in a variety of ways. In some cases, receiving the input can include a user providing a manual input defining the desired predefined distance and the manual input can be electronically communicated to the processor. In some cases, the input can be automatically provided as a result of monitoring a property of the optical system, or optionally a code reader or handheld code reader. In some cases, the property of the optical system can include the orientation of the optical system. In some cases, a distance to the code can be measured, for example by using a laser range-finder or other known distance measuring technique, and the input can be deduced from the distance to the code.
The setting of the focal lengths of process blocks 206, 210, 306, 310, and 314 can be achieved in a variety of ways. In some cases, the processor can communicate directly with the first variable focus lens and the second variable focus lens to set the respective focal lengths. In some cases, the processor can send one or more signals (in the case of a predefined short distance, the one or more signals can be one or more short signals, in the case of a predefined medium distance, the one or more signals can be one or more medium signals, and in the case of a predefined long distance, the one or more signals can be one or more long signals) to the variable focus lens controller, which subsequently communicates with the first variable focus lens and the second variable focus lens to set the respective focal lengths. In some cases, the setting of the focal lengths can be achieved by methods known to those having ordinary skill in the electronic communications arts.
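By way of illustration only, the sketch below shows one possible way of mapping a received input (whether provided manually or deduced from a measured distance) to the predefined focal lengths and sending the corresponding signals to the variable focus lens controller. The preset values, the selection thresholds, and the controller interface (send_focal_length_signal) are hypothetical placeholders.

```python
# Illustrative sketch only: preset values, thresholds, and the controller
# interface are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class Preset:
    f1_mm: float  # predefined first focal length (first variable focus lens)
    f2_mm: float  # predefined second focal length (second variable focus lens)

# Hypothetical predefined focal length pairs stored in memory.
PRESETS = {
    "short": Preset(f1_mm=25.0, f2_mm=50.0),
    "medium": Preset(f1_mm=40.0, f2_mm=55.0),
    "long": Preset(f1_mm=85.0, f2_mm=50.0),
}

def input_from_measured_distance(distance_mm, short_max_mm=500.0, medium_max_mm=3000.0):
    """Deduce the input (short/medium/long) from a measured distance to the code,
    e.g., as reported by a laser range finder."""
    if distance_mm <= short_max_mm:
        return "short"
    if distance_mm <= medium_max_mm:
        return "medium"
    return "long"

def apply_input(controller, selected):
    """Retrieve the predefined focal lengths for the selected distance and send
    the corresponding signal(s) to the variable focus lens controller."""
    preset = PRESETS[selected]
    controller.send_focal_length_signal(lens=1, focal_length_mm=preset.f1_mm)
    controller.send_focal_length_signal(lens=2, focal_length_mm=preset.f2_mm)
```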
In methods 200, 300, the predefined short first focal length and the predefined short second focal length are selected to define the short field of view to have the properties described above. In methods 200, 300, the predefined long first focal length and the predefined long second focal length are selected to define the long field of view to have the properties described above. In method 300, the predefined medium first focal length and the predefined medium second focal length are selected to define the medium field of view to have the properties described above.
The methods 200, 300 can include features described elsewhere herein with respect to the optical system 10, the code reader 100, or the method 400, unless the context clearly dictates otherwise.
Referring to
At optional process block 409, the method 400 can also optionally include determining the predetermined focal lengths. The determining of optional process block 409 can be achieved as described above.
The method 400 can include features described elsewhere herein with respect to the optical system 10, the code reader 100, or the methods 200, 300, unless the context clearly dictates otherwise.
In some embodiments, the groups of lenses 502, 504, 506 can have various optical powers and can be formed out of various materials. For example, the group of lenses 502 can collectively have a negative optical power, and each lens of the group of lenses 502 can be formed out of flint (e.g., each negative lens can be formed out of flint). As another example, the group of lenses 504 can collectively have a negative optical power and can also be formed out of flint (e.g., each negative lens can be formed out of flint). As yet another example, some of the lenses of the group of lenses 506 can have a positive optical power, or a negative optical power. In some cases, a lens with a positive optical power can be formed out of glass (e.g., crown glass), whereas a lens with a negative optical power can be formed out of flint. In some cases, the group of lenses 506 can include at least one lens with a positive optical power and at least one lens with a negative optical power.
In some embodiments, the group of lenses 502 can be configured to reshape the light angles coming from the object. For example, the group of lenses 502 can have relatively large curvatures to steer the steep and relatively large angles of the light from the image scene towards the optical axis of the imaging device 500. Thus, in some cases, each lens of the group of lenses 502 can have a larger curvature than each of the lenses of the other groups of lenses 504, 506, 508 and the variable focus lenses 510, 512. In some cases, the group of lenses 504, which is positioned behind the group of lenses 502, can facilitate transitioning of the light between the groups of lenses 502, 506. For example, the group of lenses 504 can focus the light already focused by the group of lenses 502 onto the group of lenses 506. Thus, similarly to the group of lenses 502, the group of lenses 504 can also have relatively large curvatures. Accordingly, each lens of the group of lenses 504 can have a curvature that is larger than a curvature of each lens of the groups of lenses 506, 508. In some embodiments, the group of lenses 506 (e.g., which can be positioned behind the group of lenses 504) can be configured to minimize the vignetting effect by distributing the light from the group of lenses 504 so that the light is less saturated at the periphery of the image (e.g., image of the FOV of the imaging device 500). In some embodiments, the optical paths of the light that passes between the groups of lenses 504, 506 can be substantially parallel, which can be advantageous for changing the optical properties of the light that passes along this region (e.g., because the light paths that are parallel are less likely to suffer from undesirable aberrations as compared to more angled optical paths). In this way, the imaging device 500 can include an optical filter between the lens groups 504, 506 (and along the optical axis 520) which can filter the light with fewer aberrations introduced. In some cases, one lens (e.g., a glass lens or glass element) of one of the groups of lenses 502, 504, 506, 508 can be aspherical. In some cases, a front lens group (e.g., further from the image sensor) includes a lens that is aspherical. In some cases, a rear lens group (e.g., closer to the image sensor) includes a lens that is aspherical. Referring to
In some embodiments, the variable focus lenses 510, 512 can each be positioned in front of the image sensor 514, in front of the group of lenses 508, and behind each of the group of lenses 502, 504, 506. In some cases, the variable focus lens 510 can be positioned farther away from the image sensor 514 than the variable focus lens 512. Each variable focus lens 510, 512 can be a liquid lens as described herein. For example, the liquid lens can include a solid portion and a flexible membrane that defines an interior space filled with a liquid. The liquid lens can also include a pair of electrodes that when electrically excited (e.g., by receiving a voltage) can change the curvature of the flexible membrane. Thus, a computing device (e.g., the controller 516) can cause each of the variable focus lenses 510, 512 (e.g., that are liquid lenses) to change their respective focal length by changing the voltage applied to the electrodes. In some embodiments, the variable focus lenses 510, 512 can be configured to adjust the angle of the FOV of the imaging device 500 relative to the optical axis, and to control the focal point of the entire imaging device 500. In addition, and as described in more detail below, the variable focus lenses 510, 512 can control the F # of the imaging device 500.
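Because the focal length of a membrane-type liquid lens is set indirectly through the voltage (or current) applied to its electrodes, a controller typically maps a requested optical power to a drive value. The sketch below shows one hedged way such a mapping could be implemented; the calibration table, its values, and the linear interpolation are hypothetical, and actual liquid lens drivers expose vendor-specific commands.

```python
# Illustrative sketch only: the calibration table and drive interface are hypothetical.

import bisect

# Hypothetical per-lens calibration: (optical power in diopters, drive voltage in volts).
CALIBRATION = [(-5.0, 24.0), (0.0, 40.0), (5.0, 56.0), (10.0, 72.0)]

def drive_voltage_for_power(power_diopters):
    """Linearly interpolate the drive voltage for a requested optical power."""
    powers = [p for p, _ in CALIBRATION]
    if not powers[0] <= power_diopters <= powers[-1]:
        raise ValueError("requested optical power outside calibrated range")
    i = bisect.bisect_left(powers, power_diopters)
    if powers[i] == power_diopters:
        return CALIBRATION[i][1]
    (p0, v0), (p1, v1) = CALIBRATION[i - 1], CALIBRATION[i]
    return v0 + (v1 - v0) * (power_diopters - p0) / (p1 - p0)
```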
As shown in
The aperture 522 can be located at a distance from the variable focus lenses 510, 512 that is tailored to achieve a desired goal, as would be appreciated by a skilled artisan. In some cases, the distance of the aperture 522 from either of the variable focus lenses 510, 512 can be represented as a percentage of an inter-lens distance between the two variable focus lenses 510, 512.
The aperture 522 can be located at a distance from the first variable focus lens 510 (i.e., the variable focus lens closer to the image sensor 514) that is between 1% and 95% of the distance between the first variable focus lens 510 and the second variable focus lens 512, including but not limited to, between 5% and 25%, between 1% and 10%, between 1% and 25%, or between 5% and 29%.
In some cases, it can be preferable for the aperture 522 to be positioned more closely to the first variable focus lens 510 than to the second variable focus lens 512, so the distance is between 1% and 49%, or within any of the ranges identified above that fall entirely within this range. In such cases, the inventors surprisingly discovered that, if the aperture 522 is positioned appropriately, different fields of view can be acquired while retaining a substantially constant F # (see equations below).
In cases where the aperture 522 is movable between positions, the positions that the aperture 522 is capable of occupying may be limited to the disclosed ranges.
In cases where the aperture 522 is in a fixed position or is prevented from moving, the aperture 522 is positioned at a single distance selected from one of the disclosed ranges.
In some embodiments, the group of lenses 508, which can be positioned behind the groups of lenses 502, 504, 506 and the variable focus lenses 510, 512, can be configured to correct the light from the variable focus lenses 510, 512. For example, the group of lenses 508 adjusts the chief ray angle (“CRA”) of the light that is directed at the image sensor 514. In other words, the group of lenses 508 can adjust the ray angle of the light from the variable focus lenses 510, 512 to be closer to the optical axis 520, which can prevent image artifacts. In addition, the group of lenses 508 can control the vignetting effect (as described above), and can control the field curvature to minimize aberrations in the image acquired at the image sensor 514 (e.g., by providing some diverging lenses). The group of lenses 508 can include at least one lens with a negative optical power and at least one lens with a positive optical power. The negative optical power lenses can be formed out of flint, and the positive optical power lenses can be formed out of glass (e.g., crown glass).
As shown in
In some embodiments, each component of the imaging device 500 can be coupled to the housing 518. Thus, each component of the imaging device 500 can be in fixed spatial relationship to one another, which can be the configuration shown in
In some embodiments, when each variable focus lens 510, 512 is a liquid lens, the liquid lenses can have different curvatures (e.g., one liquid lens having a membrane with a convex curvature and the other liquid lens having a membrane with a concave curvature), or the liquid lenses can have the same curvatures (e.g., with both liquid lenses having a membrane with a concave curvature, or both liquid lenses having a membrane with a convex curvature). Stated a different way, the membranes of the liquid lenses can be inverted relative to one another, or not inverted relative to one another. In some embodiments, and as described in more detail below, the orientation of the membranes of the liquid lenses can be important for mitigating gravitationally induced artifacts.
In some embodiments, the configuration of the variable focus lenses 510, 512 and the aperture 522 can be advantageous. For example, as shown in
While
In some cases, the two variable focus lenses 510, 512 are liquid lenses and the controller 516 is adapted to provide control where one of the liquid lenses 510, 512 is required to be in a convex orientation, while the other lens is allowed to adopt an orientation that is not constrained to being convex.
In some embodiments, the imaging device 500 can allow the variable focus lenses 510, 512 to operate only with the inflection points 542, 544 of the respective membranes 532, 538 farther away from the image sensor 514 than a neutral position of the respective membrane 532, 538 when an inflection point is present in a membrane (e.g., the imaging device 500 can allow the membranes 532, 538 to be in a neutral position, which can include when no voltage is applied to the variable focus lenses 510, 512). In other words, the controller 516 can prevent the variable focus lenses 510, 512 from having the inflection points 542, 544 of the respective membranes 532, 538 closer to the image sensor 514 than a neutral position of the respective membranes 532, 538 when an inflection point is present in a membrane. In this way, the imaging device 500 (e.g., the controller 516) can mitigate gravitationally induced artifacts by preventing undesirable configurations of the membranes 532, 538 relative to a gravity vector.
In some embodiments, while the gravity vector 546 has been illustrated as extending from the image sensor 514 to the variable focus lenses 510, 512, the gravity vector 546 can be reversed. In this case, the discussion above for desirable configurations can be reversed. For example, in this opposing configuration, the imaging device 500 can allow the variable focus lenses 510, 512 to operate only with the inflection points 542, 544 of the respective membranes 532, 538 closer to the image sensor 514 than a neutral position of the respective membrane 532, 538 when an inflection point is present in a membrane (e.g., the imaging device 500 can allow the membranes 532, 538 to be in a neutral position, which can include when no voltage is applied to the variable focus lenses 510, 512). In other words, the controller 516 can prevent the variable focus lenses 510, 512 from having the inflection points 542, 544 of the respective membranes 532, 538 farther away from the image sensor 514 than a neutral position of the respective membranes 532, 538 when an inflection point is present in a membrane. In this way, the imaging device 500 (e.g., the controller 516) can mitigate gravitationally induced artifacts by preventing undesirable configurations of the membranes 532, 538 relative to a gravity vector.
In some embodiments, the imaging device 500 can include indicia 548 that can indicate a preferred orientation of the imaging device 500, which, in this case, can indicate that the gravity vector is aligned with the imaging device 500. The indicia 548 can be a visual identifier and can be coupled to an exterior surface of the housing 518 (e.g., so that the indicia 548 can be viewed by a user that installs the imaging device 500). The indicia 548 can include a gravity indicia 550 and an orientation indicia 552, which can prompt the user to orient the imaging device 500 so that the gravity indicia 550 is aligned with the actual gravity vector and so that the orientation of the imaging device 500 (e.g., a longitudinal axis of the imaging device 500) is aligned with the actual gravity vector. In other words, the indicia 548 can indicate that the imaging device 500 should be mounted so that a lens of the imaging device 500 faces away from the actual gravity vector. In some cases, the gravity indicia 550 can be a first arrow, and the orientation indicia 552 can be a second arrow, with the first arrow pointing in the same direction as the second arrow. In some configurations, when the imaging device 500 includes the indicia 548, the imaging device 500 can allow the variable focus lenses 510, 512 to operate only with the inflection points 542, 544 of the respective membranes 532, 538 farther away from the image sensor 514 than a neutral position of the respective membrane 532, 538 when an inflection point is present in a membrane (e.g., the imaging device 500 can allow the membranes 532, 538 to be in a neutral position, which can include when no voltage is applied to the variable focus lenses 510, 512).
In some embodiments, the imaging device 500 can include indicia 554 that can indicate a preferred orientation of the imaging device 500 (e.g., different than the preferred orientation indicated by the indicia 548), which, in this case, can indicate that the gravity vector is in an opposite direction to the imaging device 500. The indicia 554 can also be a visual identifier and can be coupled to an exterior surface of the housing 518 (e.g., so that the indicia 554 can be viewed by a user that installs the imaging device 500). The indicia 554 can include a gravity indicia 556 and an orientation indicia 558, which can prompt the user to orient the imaging device 500 so that the gravity indicia 556 is aligned with the actual gravity vector and so that the orientation of the imaging device 500 is opposite the actual gravity vector. In other words, the indicia 554 can indicate that the imaging device 500 should be mounted so that a lens of the imaging device 500 faces towards the actual gravity vector. In some cases, the gravity indicia 556 can be a first arrow, and the orientation indicia 558 can be a second arrow, with the first arrow pointing in the opposite direction from the second arrow. In some configurations, when the imaging device 500 includes the indicia 554, the imaging device 500 can allow the variable focus lenses 510, 512 to operate only with the inflection points 542, 544 of the respective membranes 532, 538 closer to the image sensor 514 than a neutral position of the respective membrane 532, 538 when an inflection point is present in a membrane (e.g., the imaging device 500 can allow the membranes 532, 538 to be in a neutral position, which can include when no voltage is applied to the variable focus lenses 510, 512).
In some embodiments, the imaging device 500 can include an orientation sensor 560, which can be coupled to and positioned within the housing 518. The orientation sensor 560 can be implemented in different ways. For example, the orientation sensor 560 can be an accelerometer, a gyroscope, an inertial measurement unit, etc. The orientation sensor 560 can sense the orientation of the imaging device 500 relative to the gravity vector 546. In this way, the sensor 560, which can be in communication with the controller 516, can cause the imaging device 500 to operate in different operational modes based on the orientation of the imaging device 500 relative to the gravity vector 546, which can mitigate undesirable orientations of the variable focus lenses 510, 512 with respect to certain orientations of the imaging device 500. For example, the controller 516 can cause the imaging device 500 to operate according to a first mode of operation or a second mode of operation, based on the orientation of the imaging device 500. In some cases, this can include the controller 516 switching from the first mode of operation to the second mode of operation (or vice versa). In the first mode of operation, the controller 516 can cause the variable focus lenses 510, 512 to only have inflection points 542, 544 that are farther away from the image sensor 514 relative to a neutral position of the membranes 532, 538, when an inflection point is present. Correspondingly, in the second mode of operation, the controller 516 can cause the variable focus lenses 510, 512 to only have inflection points 542, 544 that are closer to the image sensor 514 relative to a neutral position of the membranes 532, 538, when an inflection point is present.
In some embodiments, the particular orientation from the orientation sensor 560 can be used to indicate which mode of operation should be used (e.g., the first mode or the second mode). For example, the controller 516 can receive a first orientation, from the orientation sensor 560, that can indicate that the orientation of the imaging device 500 is in substantially the same direction as the gravity vector 546. Then, the controller 516 can cause the imaging device 500 to operate in the first mode of operation, which can include switching from the second mode of operation to the first mode of operation. As another example, the controller 516 can receive a second orientation, from the orientation sensor 560, that can indicate that the orientation of the imaging device 500 is in a substantially opposite direction to the gravity vector 546. Then, the controller 516 can cause the imaging device 500 to operate in the second mode of operation, which can include switching from the first mode of operation to the second mode of operation. In this way, if, for example, the imaging device 500 is mounted in a different location, the controller 516 can still cause the imaging device 500 to operate in a more suitable configuration, based on the current orientation of the imaging device 500.
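By way of illustration only, the mode-switching logic described above can be summarized in a short control sketch. The sign convention that maps the membrane inflection-point position to the sign of the commanded optical power is an assumption made here for illustration, as are the function names; the actual relationship depends on the liquid lens construction.

```python
# Illustrative sketch only: sensor and lens driver interfaces are hypothetical,
# as is the sign convention relating optical power to membrane shape.

FIRST_MODE = "inflection_points_away_from_sensor"
SECOND_MODE = "inflection_points_toward_sensor"

def select_mode(gravity_component_along_optical_axis):
    """Choose the operational mode from the orientation reading: a positive
    component (device oriented substantially with the gravity vector) selects
    the first mode; a negative component selects the second mode."""
    return FIRST_MODE if gravity_component_along_optical_axis >= 0.0 else SECOND_MODE

def clamp_optical_power(power_diopters, mode):
    """Restrict lens settings so membrane inflection points stay on the allowed
    side of the neutral position for the current mode (hypothetical convention:
    positive powers bulge away from the image sensor)."""
    if mode == FIRST_MODE:
        return max(power_diopters, 0.0)
    return min(power_diopters, 0.0)
```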
As shown in
In some embodiments, each object 604, 606 can be a box, a package, etc. For example, each object 604, 606 can have dimensions that are substantially the same. In particular, two or more of the width, the length, or the base of the object 604 can be the same. In some embodiments, a dimension (a length, a width, a height, etc.) of each object 604, 606 is less than 800 millimeters. For example, when the imaging device 608 is positioned above the conveyor 602 and an object is too tall (e.g., greater than 800 millimeters in height), it may be too difficult to acquire an image of a symbol on that object that is of a sufficient quality to decode. In some embodiments, each symbol 612, 614 can have a minimum size. For example, in some cases, a dimension (e.g., a width, a length, etc.) of each symbol 612, 614 can be greater than or equal to 10 millimeters. Thus, in some cases, the dimension of each symbol 612, 614 is not less than 10 millimeters.
In some embodiments, the imaging device 608 can be implemented in a similar manner as the imaging devices, the optical systems, the optical devices, etc., described herein. Thus, the previous descriptions of the imaging devices, the optical systems, the optical devices, etc., described herein pertain to the imaging device 608 (and vice versa). The imaging device 608 can include variable focus lenses 618, 620 positioned along an optical axis 622 of the imaging device 608, a distance sensor 624, and an illumination source 626. In some embodiments, the distance sensor 624 and the illumination source 626 can be coupled to a housing of the imaging device 608 and can be positioned on opposing sides of the housing of the imaging device 608. In other configurations, the illumination source 626 and the distance sensor 624 can be positioned on the same side of the housing of the imaging device 608. In addition, while the distance sensor 624, and the illumination source 626 are illustrated as being part of the imaging device 608, in other configurations the distance sensor 624 and the illumination source 626 can each be separate from the imaging device 608. In this case, the distance sensor 624 and the illumination source 626 can each still be in communication with the imaging device 608 (e.g., including the controller of the imaging device 608).
In some embodiments, the distance sensor 624 and the illumination source 626 can be implemented in different ways. For example, the distance sensor 624 can be a dimensioner, an image sensor, a time of flight (“ToF”) sensor, etc. Regardless, however, the distance sensor 624 can be configured to sense a distance between the imaging device 608 and a surface of an object (e.g., an upper surface of the object 604). In this case, the distance sensor 624 can directly sense the distance between the imaging device 608 and the surface of the object (e.g., when the distance sensor 624 is coupled to or otherwise integrated within the imaging device 608), or the distance sensor 624 can indirectly sense the distance between the imaging device 608 and the surface of the object, based on a reference dimension between the distance sensor 624 and the imaging device 608 (e.g., when the distance sensor 624 is separate from the imaging device 608). In some embodiments, the distance between the imaging device 608 and the surface of the object can be a height of the object (e.g., which advantageously can be used to determine a focus distance of the imaging device 608). The illumination source 626 can include one or more light sources (e.g., a light emitting diode (“LED”)), and can be configured to illuminate at least a region of a FOV 628 of the imaging device 608. For example, the illumination source 626 can emit light at the conveyor 602, the object 604, etc., to illuminate the FOV 628 for better image acquisition.
As shown in
In some embodiments, a height of each of the objects 604, 606 (e.g., and any other object supported and moved by the conveyor 602) can be less than or equal to half the distance between the imaging device 608 and the conveyor 602 (e.g., the height of the imaging device 608 above the conveyor 602). In this way, an upper surface of the object 604 is not positioned too close to the imaging device 608 to be imaged, even with the zooming capability of the imaging device 608. In some configurations, the distance between the imaging device 608 and the conveyor 602 (e.g., the height of the imaging device 608 above the conveyor 602) can be less than or equal to 1600 millimeters.
As shown in
In some embodiments, the imaging device 608 can acquire one or more images of each of the objects 604, 606 including a symbol on the respective object, as the objects 604, 606 move along the conveyor 602. In addition, the imaging device 608 (e.g., the controller of the imaging device 608) can also determine, using the distance sensor 624, a distance between the imaging device 608 and a surface of the object 604 (e.g., a height of the object), as the object 604 travels along the conveyor 602. In this way, the imaging device 608 can adjust a total focal length of the imaging device 608 to acquire an image of the symbol 612 of the object 604 that is of a sufficient quality to be able to decode the symbol 612. For example, the imaging device 608 can set the focal length of the imaging device 608, including a first focal length of the variable focus lens 618 and a second focal length of the variable focus lens 620 (e.g., each of which at least partially define the total focal length of the imaging device 608), based on the distance between the imaging device 608 and the object 604 (e.g., the height of the object 604). In this way, the imaging device 608 can focus so that the focal plane of the FOV of the imaging device 608 is substantially at a surface of the object 604 that includes a symbol (e.g., an upper surface of the object 604).
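By way of illustration only, the sketch below shows how the sensed distance to the object's upper surface could be turned into focal settings. The imaging device interface and the calibration_curve callable are hypothetical placeholders; the curve itself would be produced by a calibration routine such as the one described later herein.

```python
# Illustrative sketch only: the device interface and calibration_curve are hypothetical.

def focus_on_object(imaging_device, calibration_curve, mount_height_mm):
    """Sense the distance to the object's upper surface, derive the object
    height for reference, and apply the focal length pair for that distance."""
    distance_mm = imaging_device.measure_distance_mm()   # imaging device to upper surface
    object_height_mm = mount_height_mm - distance_mm     # height of the object on the conveyor
    f1_mm, f2_mm = calibration_curve(distance_mm)        # focal pair for this working distance
    imaging_device.set_focal_lengths(f1_mm, f2_mm)
    return object_height_mm
```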
As shown in
At 652, the process 650 can include a computing device causing a first variable focus lens of an imaging device to have an optical power of substantially zero diopters. In some cases, including when the first variable focus lens is a liquid lens, this can include a computing device causing substantially no current (e.g., substantially 0 mA is provided), substantially no voltage, etc., to be delivered to the one or more electrodes of the liquid lens. In some configurations, this can include a computing device assuming that the first variable focus lens of the imaging device has an optical power of substantially zero diopters. For example, when the first variable focus lens is a liquid lens, this can include a computing device assuming that the first variable focus lens is substantially flat, substantially planar, etc.
At 654, the process 650 can include a computing device causing a second variable focus lens of the imaging device to have an optical power of substantially zero diopters. In some cases, similarly to the block 652, the process 650 can include a computing device causing substantially no current (e.g., substantially 0 mA is provided), substantially no voltage, etc., to be delivered to the one or more electrodes of the liquid lens. In some configurations, this can include a computing device assuming that the second variable focus lens of the imaging device has an optical power of substantially zero diopters. In some configurations, as described above, the first variable focus lens can be separated from the second variable focus lens along an optical axis of the imaging device (e.g., and with the optical axis intersecting the first variable focus lens and the second variable focus lens).
At 656, the process 650 can include a computing device determining a calibration point (e.g., an origin of a calibration curve), based on the configuration of the first variable focus lens and the configuration of the second variable focus lens. For example, a computing device can set an entire optical power of the imaging device (or a combined optical power of the first variable focus lens and the second variable focus lens) to be at a maximum possible optical power (of the imaging device or of the combined optical power) for a distance between the imaging device and the object of 0 mm (e.g., the working distance of the imaging device, including the distance the focal plane of the FOV of the imaging device is from the imaging device, such as a front lens of the imaging device). Correspondingly, the computing device can set a minimum focal distance of the imaging device, or of the combined focal distance of the first variable focus lens and the second variable focus lens, to 0 mm for the distance between the imaging device and the object being 0 mm. As described above, by considering inherent curvatures of the membranes of the liquid lenses at these positions, a more robust and accurate calibration curve can be created and used to calibrate the imaging device.
At 658, the process 650 can include moving an object to a known distance away from the imaging device. In some cases, the object can include a symbol of standard dimensions (e.g., a standard barcode) on a surface of the object (e.g., an upper surface of the object). In some cases, this can include a computing device determining a distance between the imaging device and the surface of the object (e.g., the surface of the object including a symbol to be decoded) to, for example, ensure that the measured distance and the known distance are within a particular threshold (e.g., and that the measured distance “seen” by the imaging device corresponds closely to the known distance). In some cases, this distance can be the desired working distance for the imaging device. In some cases, this can include a computing device determining the distance between the imaging device and the surface of the object, using a distance sensor.
At 660, the process 650 can include a computing device determining a first focal length of the first variable focus lens and a second focal length of the second variable focus lens. In some configurations, the computing device can determine the first focal length and the second focal length using at least one of the distance of the object (e.g., the known distance at the block 658), the distance between one or more optical components of the imaging device, the size of the aperture, the position of the aperture relative to the first variable focus lens and the second variable focus lens, etc. In some embodiments, the computing device can determine the first focal length and the second focal length using one or more of the following equations [3]-[12] below.
The equations [3]-[12] above can be used to determine focal values for the first variable focus lens and the second variable focus lens that maintain a substantially constant F # (i.e., f-number) of the imaging device across multiple known distances (e.g., multiple known working distances of the imaging device), which can be important to ensure good image quality (e.g., to decode a symbol). For example, equations [3]-[12] show how to derive the condition for keeping the F # constant over the whole range of the focal values available for the imaging device.
Eq 3 (shorthand for equation [3], which is used below for this and the other numbered equations) gives the definition of the relative aperture for an optical system. Eq 4 shows how the total focal value of the system can be calculated, taking into account the focal values of the two different groups of an optical system and the optical distance between them. For example, f1′ can include the groups of lenses 502, 504, 506 and the variable focus lens 510, while f2′ can include the variable focus lens 512, the group of lenses 508, and the image sensor 514. In some cases, f1′ and f2′ can be separated from each other at the aperture 522. In other words, the aperture 522 can separate f1′ from f2′. The optical distance between the two parts is measured between their principal planes. Eq 5 governs how the magnification of the entrance pupil is calculated and taken into account, together with the distances of the aperture stop with respect to the f1′ system. Eq 6 solves eq 5 for the diameter of the entrance pupil, so that this term can be introduced into eq 3, as can be seen in eq 7. Eq 8 also takes eq 4 into account and expresses the dependency of the f-number (“FNr”) on the position of the aperture stop in the system. With the help of eq 9, the image distance of the aperture stop is substituted into eq 8, resulting in eq 10, where the FNr depends only on the object distance of the aperture stop and not on the image distance. Differentiating eq 10 according to the basic differentiation rules gives eq 11, and finally eq 12 shows the condition for the FNr of this system to remain constant. A skilled artisan will recognize how these equations can be combined with the positioning of the aperture described above.
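Since equations [3]-[12] themselves are not reproduced in this text, the sketch below evaluates the quantities they describe using standard paraxial stand-ins: a two-group combination for the total focal value, a thin-lens image of the aperture stop through the front group for the entrance pupil, and the ratio of the two for the relative aperture. The formulas and numeric values are assumptions for illustration and may differ from the actual equations [3]-[12].

```python
# Illustrative paraxial stand-ins for equations [3]-[12]; all values are hypothetical.

def total_focal_length(f_front_mm, f_rear_mm, e_mm):
    """Combined focal value of two groups (f1', f2') separated by e_mm (eq 4-style)."""
    return f_front_mm * f_rear_mm / (f_front_mm + f_rear_mm - e_mm)

def entrance_pupil_diameter(stop_diameter_mm, f_front_mm, stop_distance_mm):
    """Image the aperture stop back through the front group to get the entrance
    pupil; for a thin front group the pupil magnification is f / (f - d), where
    d is the distance of the stop behind the front group (eq 5/6-style)."""
    magnification = f_front_mm / (f_front_mm - stop_distance_mm)
    return abs(magnification) * stop_diameter_mm

def f_number(f_front_mm, f_rear_mm, e_mm, stop_diameter_mm, stop_distance_mm):
    """Relative aperture: total focal value over entrance pupil diameter (eq 3-style)."""
    ft = total_focal_length(f_front_mm, f_rear_mm, e_mm)
    return abs(ft) / entrance_pupil_diameter(stop_diameter_mm, f_front_mm, stop_distance_mm)

if __name__ == "__main__":
    e_mm, stop_diameter_mm, stop_distance_mm = 24.0, 4.0, 6.0   # hypothetical geometry
    for f_front_mm, f_rear_mm in ((30.0, 40.0), (45.0, 32.0), (60.0, 28.0)):
        fnr = f_number(f_front_mm, f_rear_mm, e_mm, stop_diameter_mm, stop_distance_mm)
        print(f"f1'={f_front_mm:5.1f} mm  f2'={f_rear_mm:5.1f} mm  FNr={fnr:5.2f}")
```

Evaluating the printed FNr values over a range of focus states provides a quick numerical check of whether a chosen stop position approximately satisfies the constancy condition of eq 12.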
In some embodiments, the block 660 can include a computing device causing the first variable focus lens to be at the first focal length, and the second variable focus lens to be at the second focal length.
At 662, the process 650 can include a computing device acquiring an image of a symbol of the object using the imaging device with the first variable focus lens being at the first focal length and the second variable focus lens being at the second focal length. In some embodiments, the block 662 can include a computing device determining whether or not the symbol in the image can be decoded. If, at the block 662, the computing device cannot decode the symbol (or the computing device determines that the symbol cannot be decoded), the process 650 can proceed back to the block 660 to determine different focal lengths for the first variable focus lens and the second variable focus lens (e.g., which can include adjusting the first focal length and the second focal length). In this way, the focal lengths of the variable focus lenses can be tweaked slightly to ensure that the focal plane of the imaging device is substantially at the surface of the object that includes the symbol. If, however, at the block 662, the computing device can decode the symbol (or the computing device determines that the symbol can be decoded), the process 650 can proceed to the block 664.
At 664, the process 650 can include a computing device associating the first focal length and the second focal length with the known distance (e.g., which can be a desired working distance of the imaging device). In some embodiments, including after the block 664 has been completed, the process 650 can proceed back to the block 658 to move the object to another known distance (e.g., different from the known distance) away from the imaging device. Then, the process 650 can proceed through the same blocks (e.g., blocks 660-664) until a total number of desired data points have been acquired.
At 668, the process 650 can include, after one or more iterations have been completed (e.g., with each iteration creating one data point), a computing device creating (or adjusting) a calibration curve using the data points (e.g., with each data point including a known distance from the target and a combined focal length or a combined optical power associated therewith). In some embodiments, creating a calibration curve can include a computing device fitting a curve to the data points. In this regard,
In some embodiments, the block 668 can include a computing device calibrating the imaging device using one or more calibration points. In some cases, the one or more calibration points can include the calibration point at the block 656, other calibration points, a calibration curve using multiple calibration points, etc.
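By way of illustration only, one hedged way to turn the collected data points into a calibration curve is a simple polynomial fit, as sketched below; the data values, the choice of a combined optical power as the fitted quantity, and the polynomial order are hypothetical.

```python
# Illustrative sketch only: the data points and polynomial model are hypothetical.

import numpy as np

# Hypothetical calibration data: (known working distance in mm, combined optical
# power in diopters that produced a decodable image), with the zero-power-derived
# origin from blocks 652-656 included as the first point.
data_points = [
    (0.0, 12.0),      # calibration origin (maximum combined power at 0 mm)
    (100.0, 9.5),
    (300.0, 6.0),
    (600.0, 3.5),
    (1200.0, 1.5),
]

distances_mm = np.array([d for d, _ in data_points])
powers_diopters = np.array([p for _, p in data_points])

# Fit a low-order polynomial as the calibration curve.
coefficients = np.polyfit(distances_mm, powers_diopters, deg=2)
calibration_curve = np.poly1d(coefficients)

def combined_power_for_distance(distance_mm):
    """Look up the combined optical power to use at a given working distance."""
    return float(calibration_curve(distance_mm))
```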
At 702, the process 700 can include a computing device determining a distance between an object and an imaging device. In some cases, this can include a computing device determining the distance using a distance sensor (e.g., of the imaging device).
At 704, the process 700 can include a computing device determining a first focal length of a first variable focus lens of an imaging device and a second focal length for a second variable focus lens. In some cases, including at the first iteration of the process 700, the computing device can determine that the first variable focus lens and the second variable focus lens should each be set to an optical power of substantially zero diopters. In this way, the computing device can quickly attempt to acquire an image that is of a sufficient quality by doing little to no computation, which can be beneficial in high-speed settings in which objects move along conveyors quickly.
At 706, the process 700 can include a computing device causing the first variable focus lens to be at the first focal length and the second variable focus lens to be at the second focal length.
At 708, the process 700 can include a computing device acquiring an image of a symbol of the object (e.g., using the imaging device) with the first variable focus lens at the first focal length and the second variable focus lens at the second focal length.
At 710, the process 700 can include a computing device determining whether or not the symbol is able to be decoded. In some cases, this can include a computing device attempting to decode the image that includes the symbol. If at the block 710, the computing device cannot decode the symbol (or the computing device determines that the symbol cannot be decoded), the process 700 can proceed back to the block 704 to determine different focal lengths for the first variable focus lens and the second variable focus lens (e.g., which can include adjusting the first focal length and the second focal length). For example, this can include a computing device determining another first focal length of the first variable focus lens and another second focal length of the second variable focus lens. In some cases, the another first focal length and the another second focal length can each be determined using the distance between the object and the imaging device (e.g., determined at the block 702). In addition, the another first focal length and the another second focal length can each be determined using one or more equations described herein, a calibration curve, a calibration point, etc. In some embodiments, the process 700 can proceed through the blocks 706, 708 to attempt to acquire an image of a sufficient quality to decode the symbol. If, however, at the block 710, the computing device can decode the symbol (or the computing device determines that the symbol can be decoded), the process 700 can proceed to the block 712.
At 712, the process 700 can include a computing device extracting symbol information from the symbol (e.g., of the image that was deemed decodable). In some cases, this can include a computing device decoding the symbol to determine the symbol information of the symbol. For example, this can include decoding a barcode to determine a barcode string of the barcode.
In some embodiments, some or all blocks of the process 700, including multiple iterations of portions of the process 700, can occur while the object is positioned within a FOV of the imaging device. In this way, the imaging device can acquire multiple different images using different settings (e.g., different focal lengths) to improve the image quality of previously acquired images until an image has been acquired of a sufficient quality to decode the symbol.
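By way of illustration only, a compact sketch of the acquire-and-decode loop of the process 700 follows; the device and decoder interfaces, the fine-tuning offsets, and the use of a calibration curve after a first failed attempt are hypothetical placeholders consistent with the blocks described above.

```python
# Illustrative sketch only: interfaces and numeric offsets are hypothetical.

def read_symbol(imaging_device, decoder, calibration_curve,
                fine_steps_diopters=(0.0, 0.25, -0.25, 0.5)):
    """Acquire-and-decode loop: start at substantially zero optical power, then
    refine the settings from the measured distance until the symbol decodes."""
    distance_mm = imaging_device.measure_distance_mm()        # block 702
    # First attempt: substantially zero diopters on both lenses (block 704).
    attempts = [(0.0, 0.0)]
    # Subsequent attempts: calibration-curve estimate plus small fine-tuning offsets.
    p1_est, p2_est = calibration_curve(distance_mm)
    attempts += [(p1_est + d, p2_est + d) for d in fine_steps_diopters]
    for p1, p2 in attempts:
        imaging_device.set_optical_powers(p1, p2)             # block 706
        image = imaging_device.acquire_image()                # block 708
        result = decoder.try_decode(image)                    # block 710
        if result is not None:
            return result                                     # block 712: symbol information
    return None
```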
Several features of the present disclosure may be present in any of the aspects described herein. For instance, in any of these aspects, the controller can be adapted to restrain the first liquid lens and the second liquid lens to settings where the F # of the imaging device is constant independent of zoom level. As another example, in any of these aspects, the controller can be adapted to simultaneously adjust the zoom level and the focus. In one particular implementation, the controller can restrain the first liquid lens and the second liquid lens to settings where the F # of the imaging device changes with respect to the zoom level. As yet another example, the controller can be adapted to control the liquid lenses to project a zoom-independent area of a focal plane onto the image sensor. In other words, regardless of the distance from the imaging device or the corresponding zoom level, a feature of a given area (e.g., a barcode) will take up the same area on the image sensor (e.g., within 10%).
Unless otherwise defined or specified, as used herein with respect to a reference value, the term “substantially” indicates a variation from the reference value of ±10% or less, inclusive (e.g., ±5%, ±2%, ±1%, ±0.5%). For example, a first component that is positioned substantially equidistant between second and third components is positioned at a distance from the second component that is within 10%, inclusive, of its distance from the third component. Similarly, a substantially constant value deviates (over a relevant operation) from a reference value by 10% or less. In particular, “substantially parallel” indicates a direction that is within ±10 degrees of a reference direction (e.g., within ±5 degrees or ±3 degrees), inclusive. Correspondingly, “substantially perpendicular” indicates a direction that is within ±10 degrees of perpendicular to a reference direction (e.g., within ±5 degrees or ±3 degrees), inclusive. Also in particular, “substantially zero diopters” indicates an optical power of 0.1 diopter or less, 0.01 diopter or less, or 0.001 diopter or less, and “substantially zero current” indicates a current of 0.1 mA or less, 0.01 mA or less, or 0.001 mA or less.
The present disclosure may take any one or more (including combinations) of the following example configurations.
The particular aspects disclosed above are illustrative only, as the technology may be modified and practiced in different but equivalent manners apparent to those skilled in the art having the benefit of the teachings herein. Furthermore, no limitations are intended to the details of construction or design herein shown, other than as described in the claims below. It is therefore evident that the particular aspects disclosed above may be altered or modified and all such variations are considered within the scope and spirit of the technology. Accordingly, the protection sought herein is as set forth in the claims below.
This application claims priority to and the benefit of U.S. Provisional Application No. 63/441,525, filed on Jan. 27, 2023, the entire contents of which are herein incorporated by reference for all purposes.