An autofocus camera includes a motor that automatically adjusts the focus of the camera optics based on signals received from a controller. In an active autofocus camera, the controller drives the motor to adjust the camera optics to an optimal focus based on a measurement of the distance to a target object. In a passive autofocus camera, the controller drives the motor to adjust the camera optics to an optimal focus based on measurements of a parameter (e.g., sharpness or contrast) of images produced by the camera optics as the focus of the camera optics is incrementally adjusted by the motor.
A manual focus camera does not include an autofocus mechanism. Instead, a manual focus camera has either fixed optics or manually adjustable optics. The focus of a manual focus camera having fixed optics is adjusted by physically moving the camera toward and away from the target object. The focus of a camera having manually adjustable optics is adjusted by manually manipulating a focus adjustment mechanism, such as a focus control knob, a lens, or a lens bezel, which controls the focus of the camera optics.
Many factors contribute to the difficulty of capturing well-focused pictures using a manual focus camera. For example, to be well-focused, a fixed optics camera must be positioned at a distance from the target object that falls within a narrow range, especially when the fixed optics have a shallow depth of field. In the case of a camera with manually adjustable optics, the user must subjectively determine when the optics are focused based on a view of the target object through a viewfinder or a low-resolution display, which makes the sharpness of focus difficult to ascertain visually.
For these reasons, many approaches have been proposed for guiding users to an optimal focus during manual adjustment of a manual focus camera. Some of these approaches involve expensive active range finding mechanisms. Other approaches involve presenting a user with a visual or audible indication of a relative focus measure (e.g., sharpness or contrast) that is computed while the user manually adjusts the focus with respect to the target object. With these approaches, the user can infer that the optimal focus has been achieved when the visual or audible indication is maximized.
Mechanisms for guiding users to an optimal focus of a manual focus camera go a long way toward reducing the difficulty of capturing well-focused pictures using a manual focus camera. Such mechanisms, however, still require subjective determinations by the user. In addition, such mechanisms may be cumbersome to implement in some application environments, including small form factor devices and low-cost devices that may not include viewfinders or displays for viewing the target object. What are needed are systems and methods of automatically capturing focused images obtained through unguided manual focus adjustment.
In one aspect, the invention features an image capture method. In accordance with this inventive method, light from a scene is received through an optical system free of automatic focusing components in an image capture period. A sequence of focal values is produced from image data produced during the image capture period. One or more target focus criteria are determined from one or more of the focal values produced during a calibration portion of the image capture period. An image of the scene is automatically captured in response to a determination that one or more of the focal values that are produced after the calibration portion of the image capture period satisfies the target focus criteria.
The invention also features an image capture system that includes an optical system, an image sensor, and a processing system. The optical system is free of automatic focusing components and is operable to receive light from a scene. The image sensor is operable to generate image data in response to light received from the optical system. The processing system is operable to produce a sequence of focal values from the image data generated by the image sensor during an image capture period. The processing system also is operable to determine one or more target focus criteria from one or more of the focal values that are produced during a calibration portion of the image capture period. The processing system additionally is operable to cause an image of the scene to be captured in response to a determination that one or more of the focal values produced after the calibration portion of the image capture period satisfies the one or more target focus criteria.
Other features and advantages of the invention will become apparent from the following description, including the drawings and the claims.
In the following description, like reference numbers are used to identify like elements. Furthermore, the drawings are intended to illustrate major features of exemplary embodiments in a diagrammatic manner. The drawings are not intended to depict every feature of actual embodiments nor relative dimensions of the depicted elements, and are not drawn to scale.
The image capture system 10 includes an optical system 12, an image sensor 14, a processing system 16, and a memory 18. In some embodiments, the optical system 12, the image sensor 14, and the processing system 16 are incorporated in a camera module, which may be incorporated in a larger system or device.
The optical system 12 includes at least one lens 20 that focuses light from a scene 22 onto the active portion of the image sensor 14. The optical system 12 is characterized by a focus 24, which is the point at which light rays from the scene 22 converge after passing through the optical system 12. The focal length (LFocal) is the distance along the optical axis 26 of the optical system 12 from the optical system 12 to the focus 24. The optical system 12 is either a fixed-lens optical system (i.e., LFocal is fixed) or a manually adjustable optical system (i.e., LFocal is manually adjustable). In all implementations, however, the optical system 12 is free of any automatic focusing components, such as lens drive motors or other automatic focus adjustment mechanisms.
The image sensor 14 may be any type of image sensor, including a charge coupled device (CCD) image sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor.
The processing system 16 may be implemented by one or more discrete modules that are not limited to any particular hardware or software configuration and may be implemented in any computing or processing environment, including in digital electronic circuitry or in computer hardware, firmware, a device driver, or software. Computer process instructions for implementing the methods performed by the processing system 16 and the data generated by the processing system 16 typically are stored in one or more machine-readable media. Storage devices suitable for tangibly embodying these instructions and data include all forms of non-volatile memory, including, for example, semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices, and storage disks, such as internal hard disks and removable disks.
The memory 18 may be implemented by any type of image storage technology, including a compact flash memory card and a digital video tape cassette. The image data that is stored in the memory 18 may be transferred to a storage device (e.g., a hard disk drive, a floppy disk drive, a CD-ROM drive, or a non-volatile data storage device) of an external processing system (e.g., a computer or workstation) via an external wired or wireless communications port or antenna.
In accordance with the method 50, the optical system 12 receives light from the scene 22 in an image capture period during which the focus 24 of the optical system is adjusted manually in relation to the scene 22.
The image sensor 14 generates image data 54 from the light that is received by the optical system 12 and focused onto the active portion of the image sensor 14 during the image capture period.
The processing system 16 produces a sequence of focal values 58 from the image data 54 during the image capture period.
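The specification leaves the focal value metric open beyond examples such as sharpness or contrast. The following is a minimal sketch of one common passive sharpness measure (a gradient-energy, Tenengrad-style score), assuming grayscale frames arrive as 2-D NumPy arrays; the function names are illustrative and not taken from the specification:

```python
import numpy as np

def focal_value(frame: np.ndarray) -> float:
    # Gradient-energy sharpness score: a sharply focused frame has
    # stronger local intensity gradients, so the score is larger.
    gy, gx = np.gradient(frame.astype(np.float64))
    return float(np.sum(gx * gx + gy * gy))

def focal_value_sequence(frames) -> list:
    # One focal value per frame sampled during the image capture period.
    return [focal_value(frame) for frame in frames]
```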
In this example, the optical system 12 initially is out-of-focus with respect to the scene 22. The user then adjusts the focus 24 of the optical system 12 in relation to the scene 22 either by moving the image capture system 10 toward or away from the scene 22 or by manually adjusting the relative positions of one or more lenses of the optical system 12. During the initial part of the calibration period (TCalibration), the correlation between the focus 24 and the scene 22 increases with each manual adjustment of focus from the beginning time (t0) to a transition time (ttrans), as indicated by the increasing focal values 62. After the transition time (ttrans), the correlation between the focus 24 and the scene 22 decreases with each manual adjustment of focus, as indicated by the decreasing focal values 62. The peak correlation between the focus 24 and the scene 22 (i.e., the “maximal focus” of the optical system) occurs near the transition time (ttrans).
In some implementations, the user designates the beginning time (t0) of the capture period and the end point of the calibration period, and the processing system 16 identifies at least one of the largest focal values 62 produced during the calibration period. In other implementations, the user designates only the beginning time (t0) of the capture period, and the processing system 16 automatically determines the end point of the calibration period (tCal. End) and identifies at least one of the largest focal values 62 produced during the calibration period. In these implementations, the processing system 16 determines the end point (tCal. End) of the calibration period by analyzing changes in the slope of the focal values 62 plotted over time. In some implementations, the processing system 16 identifies the transition time (ttrans) as the time before which successive ones of the focal values predominantly vary in accordance with respective gradients of a first polarity (e.g., positive in the illustrated example) and after which successive ones of the focal values predominantly vary in accordance with respective gradients of a second polarity (e.g., negative in the illustrated example) that is opposite the first polarity. The processing system 16 identifies the end of the calibration period (tCal. End) using an empirically determined heuristic that identifies when a sufficient number of focal values have been analyzed for a change in gradient to be identified reliably.
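The specification describes this heuristic only qualitatively. The sketch below assumes a simple majority-vote test over a sliding window of successive gradients; the window size is a hypothetical parameter, not a value given in the specification:

```python
def find_transition_index(focal_values, window=3):
    # Successive differences approximate the gradient of the focal
    # value sequence over time.
    diffs = [b - a for a, b in zip(focal_values, focal_values[1:])]
    for i in range(window, len(diffs) - window + 1):
        before = diffs[i - window:i]   # gradients just before the candidate
        after = diffs[i:i + window]    # gradients just after the candidate
        # Predominantly positive before and predominantly negative after:
        # this polarity change marks the transition time t_trans.
        if (sum(d > 0 for d in before) > window // 2 and
                sum(d < 0 for d in after) > window // 2):
            return i
    return None  # not enough evidence yet; keep calibrating
```

In this reading, the end of the calibration period (tCal. End) would be the first point at which the polarity change can be identified reliably, i.e., the first call that returns a non-None index.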
The processing system 16 determines one or more target focus criteria from one or more of the focal values 62 that are produced during the calibration portion of the image capture period.
In some implementations, the processing system 16 determines the focal value 58 (FVMAX) having the largest magnitude among the focal values 62 produced during the calibration period and sets a target focal value (FVT) in accordance with:

FVT = α·FVMAX

where 0 < α ≤ 1.
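Expressed in code, with α as a tuning constant (the value 0.9 below is an illustrative choice; the specification only requires 0 < α ≤ 1):

```python
def target_focal_value(calibration_values, alpha=0.9):
    # FVT = alpha * FVMAX, where FVMAX is the largest focal value
    # observed during the calibration period.
    fv_max = max(calibration_values)
    return alpha * fv_max
```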
After the one or more target focus criteria have been determined, the processing system 16 causes the image capture system 10 to automatically capture an image of the scene 22 in response to a determination that one or more of the focal values produced after the calibration portion (TCalibration) of the image capture period (TCapture) satisfies the one or more target focus criteria.
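A minimal sketch of this post-calibration capture step, assuming the hypothetical focal_value helper above and treating frames as an iterator over frames produced after tCal. End; "satisfies" is read here as meeting or exceeding the target focal value FVT:

```python
def capture_when_focused(frames, fv_target):
    # Monitor post-calibration frames and automatically capture the
    # first one whose focal value satisfies the target focus criterion.
    for frame in frames:
        if focal_value(frame) >= fv_target:
            return frame  # the automatically captured image of the scene
    return None  # capture period ended without satisfying the criterion
```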
The telephone 70 additionally includes an antenna 76, a receiver 78, the speaker 74, a processing system 80, a frequency synthesizer 82, a transmitter 84, the microphone 72, a keypad 86, and a memory 88. The processing system 80 choreographs the operation of the receiver 78, the transmitter 84, and the frequency synthesizer 82. The frequency synthesizer 82 sets the operating frequencies of the receiver 78 and the transmitter 84 in response to control signals received from the processing system 80.
The image capture system 10 is incorporated in the housing 112. In particular, the optical system 12 receives light through an optical port that is formed through a wall of the housing 112. The image sensor 14 is adjacent the optical system 12. The processing system 16 is electrically coupled to the image sensor 14 and the memory 18. The memory 18 stores data generated by the processing system 16, including temporary data, intermediate data, and data sampled from the image sensor 14. In some implementations, the memory 18 is an erasable, rewritable memory chip that holds its content without power, such as a flash RAM or a flash ROM memory chip. Other implementations may use a different type of memory.
The electronic writing device 110 additionally includes an input/output (I/O) interface 118, a battery 120, and a power button 122.
The I/O interface 118 provides a hardware interface for communications between the electronic writing device 110 and a remote system. The I/O interface 118 may be configured for wired or wireless communication with the remote system. In some implementations, the I/O interface 118 provides a bi-directional serial communication interface. The remote system may be any type of electronic device or system, including a workstation, a desktop computer, a portable computing device (e.g., a notebook computer, a laptop computer, a tablet computer, and a handheld computer), and a cash register or point-of-sale terminal. A docking station may be used to connect the I/O interface 118 to the remote system. In some implementations, the remote system may be located at a location remote from the user. For example, the remote system may be a central server computer located at a remote node of a computer network, and data from the electronic writing device 110 may be uploaded to the central server computer from any network node connected to the central server computer.
The battery 120 may be any type of battery that provides a source of direct current (DC), including a rechargeable type of battery (e.g., a nickel metal hydride rechargeable battery or a lithium polymer rechargeable battery) and a non-rechargeable type of battery. The battery 120 supplies DC power to the electrical components of the electronic writing device 110.
The power button 122 may be depressed by a user to activate and deactivate the image capture system 10.
Other embodiments are within the scope of the claims.