This application claims the priority benefit of Korean Patent Application No. 2012-0081488, filed on Jul. 25, 2012 in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference.
1. Field
Embodiments of the invention relate to an image capturing apparatus, and more particularly, to an image capturing apparatus on which an ocular sensor configured to detect the approach of an object is mounted.
2. Related Art
A camera, which is an image capturing apparatus, may be configured to record an image of a subject input through a lens on film or on an imaging device (e.g., a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) device). In the camera, the capturing of images may be performed in response to manipulation of a shutter-release button or manipulation of a remote controller configured to communicate with the camera in a wired or wireless manner. That is, a user may confirm a subject via a viewfinder or monitor disposed in a main body of the camera, and then manipulate the shutter-release button or the remote controller to capture images of the subject.
The manipulation of the shutter-release button may include applying a predetermined force, using, for example, a finger, to the shutter-release button disposed in the main body of the camera. During the application of the predetermined force, the main body of the camera may shake. As a result, blur may occur in the images, and the shape of the subject may become unclear. This phenomenon may be more pronounced when the shutter speed is reduced to increase the quantity of incident light, as in night photography or low-illumination indoor photography.
The remote controller may be configured to enable a user to capture images without direct contact with the shutter-release button. The remote controller may include a receiving unit contained in the main body of the camera or additionally combined with the main body of the camera, and a transmission unit disposed in a position apart from the main body of the camera and configured to generate a shutter-release signal. The user may manipulate the transmission unit to generate the shutter-release signal, and the receiving unit of the main body of the camera may receive the shutter-release signal and provide the shutter-release signal to a controller of the main body of the camera so that the controller can perform control operations to drive a shutter. When the remote controller is used, images may be captured without direct contact with the main body of the camera, so shake of the camera may be prevented. However, since an additional remote controller should be purchased, additional costs may be incurred. Also, it is troublesome to always carry the purchased remote controller together with the camera.
Therefore, an embodiment provides an image capturing apparatus and a method of controlling the same, with which images may be captured without direct contact with a main body of the image capturing apparatus and without purchasing and carrying an additional remote controller.
Additional embodiments will be set forth in part in the description which follows and, in part, will become apparent from the description, or may be learned by practice of the embodiments of the invention.
In accordance with an embodiment, a method of controlling an image capturing apparatus includes: generating a first detection signal and a second detection signal in response to motion of an object within a detection region of an ocular sensor, controlling a focusing module to bring a subject into focus in response to the generation of the first detection signal, and controlling a shutter and an imaging device to capture an image of the subject in response to the generation of the second detection signal.
The ocular sensor may generate the first detection signal when the object enters the detection region, and generate the second detection signal when the object exits the detection region.
When a time interval between a time point at which the first detection signal is generated and a time point at which the second detection signal is generated is within a predetermined range, the image of the subject may be captured in response to the second detection signal.
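Although the disclosure does not prescribe an implementation, the enter/exit gesture described above can be sketched in code. In the sketch below, the `camera` interface, the method names, and the 2-second window are all illustrative assumptions, not elements of the disclosed apparatus:

```python
import time

class OcularCaptureController:
    """Illustrative sketch of the enter/exit gesture: the first detection
    signal (entry) triggers focusing, and the second detection signal
    (exit) triggers capture only if it follows within a predetermined
    time interval."""

    def __init__(self, camera, max_interval_s=2.0):
        self.camera = camera                  # hypothetical camera interface
        self.max_interval_s = max_interval_s  # predetermined range (assumed)
        self.entry_time = None

    def on_object_entered(self):
        # First detection signal: start the focusing control operation.
        self.entry_time = time.monotonic()
        self.camera.autofocus()

    def on_object_exited(self):
        # Second detection signal: capture only within the time window.
        if self.entry_time is None:
            return
        if time.monotonic() - self.entry_time <= self.max_interval_s:
            self.camera.capture()             # drive shutter + imaging device
        self.entry_time = None                # re-initialize for the next gesture
```

A caller would wire the two methods to the rising and falling edges of the ocular sensor's detection signal.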
The ocular sensor may generate the first detection signal when the object enters the detection region, and generate the second detection signal when the object re-enters the detection region after exiting the detection region.
When a time interval between a time point at which the first detection signal is generated and a time point at which the second detection signal is generated is within a predetermined range, the image of the subject may be captured in response to the second detection signal.
The ocular sensor may generate the first detection signal when the object enters the detection region, and generate the second detection signal when the object repeatedly re-enters the detection region at least twice after exiting the detection region.
When a time interval between a time point at which the first detection signal is generated and a time point at which the second detection signal is generated is more than a predetermined time duration, the image of the subject may be captured in response to the second detection signal.
The second detection signal may be generated when the object repeatedly re-enters the detection region at least twice, and a time interval between time points at which the object repeatedly re-enters the detection region at least twice is within a predetermined range.
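The repeated re-entry condition above combines two timing checks: the interval since the first entry must exceed a predetermined duration, and consecutive re-entries must fall within a predetermined range of each other. A minimal sketch of that condition, with assumed parameter names and threshold values:

```python
def double_reentry_capture(rising_edge_times, min_gap_s=1.0, max_tap_interval_s=0.5):
    """Sketch of the at-least-two-re-entries condition. The first rising
    edge is the focusing entry; capture is allowed only when (a) the last
    re-entry occurs more than min_gap_s after the first entry and (b) the
    re-entries are spaced within max_tap_interval_s of one another.
    All names and values here are illustrative assumptions."""
    if len(rising_edge_times) < 3:            # first entry + at least 2 re-entries
        return False
    first, *reentries = rising_edge_times
    if reentries[-1] - first <= min_gap_s:    # too soon after the focusing entry
        return False
    gaps = [b - a for a, b in zip(reentries, reentries[1:])]
    return all(g <= max_tap_interval_s for g in gaps)
```

For example, entries at 0.0 s, 1.5 s, and 1.8 s satisfy both checks, while a single re-entry or widely spaced re-entries do not.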
The method may further include generating a first control signal for controlling the focusing module to bring the subject into focus in response to the generation of the first detection signal, and generating a second control signal for controlling the shutter and the imaging device to capture the image of the subject in response to the generation of the second detection signal.
The ocular sensor may be a single ocular sensor.
In accordance with another embodiment, a method of controlling an image capturing apparatus includes: generating a first detection signal and a second detection signal in response to motion of an object in a detection region of an ocular sensor, wherein the first detection signal is generated when the object enters the detection region, and the second detection signal is generated when the object exits the detection region, controlling a focusing module to bring a subject into focus in response to the generation of the first detection signal, and controlling a shutter and an imaging device to capture an image of the subject in response to the generation of the second detection signal.
When a time interval between a time point at which the first detection signal is generated and a time point at which the second detection signal is generated is within a predetermined range, the image of the subject may be captured in response to the second detection signal.
In accordance with another embodiment, a method of controlling an image capturing apparatus includes: generating a first detection signal and a second detection signal in response to motion of an object in a detection region of an ocular sensor, wherein the first detection signal is generated when the object enters the detection region, and the second detection signal is generated when the object re-enters the detection region after exiting the detection region, controlling a focusing module to bring a subject into focus in response to the generation of the first detection signal, and controlling a shutter and an imaging device to capture an image of the subject in response to the generation of the second detection signal.
When a time interval between a time point at which the first detection signal is generated and a time point at which the second detection signal is generated is within a predetermined range, the image of the subject may be captured in response to the second detection signal.
In accordance with another embodiment, a method of controlling an image capturing apparatus includes: generating a first detection signal and a second detection signal in response to motion of an object in a detection region of an ocular sensor, wherein the first detection signal is generated when the object enters the detection region, and the second detection signal is generated when the object repeatedly re-enters the detection region at least twice after exiting the detection region, controlling a focusing module to bring a subject into focus in response to the generation of the first detection signal, and controlling a shutter and an imaging device to capture an image of the subject in response to the generation of the second detection signal.
When a time interval between a time point at which the first detection signal is generated and a time point at which the second detection signal is generated is more than a predetermined time duration, the image of the subject may be captured in response to the second detection signal.
The second detection signal may be generated when the object repeatedly re-enters the detection region at least twice, and a time interval between time points at which the object repeatedly re-enters the detection region at least twice is within a predetermined range.
In accordance with another embodiment, an image capturing apparatus includes: a lens configured to receive an image of a subject, a focusing module configured to drive the lens and bring the subject into focus, an imaging device configured to capture the image of the subject, a shutter configured to expose the image of the subject for a predetermined amount of time, an ocular sensor configured to generate a first detection signal and a second detection signal in response to motion of an object within a detection region of the ocular sensor, and a controller configured to control the focusing module to bring the subject into focus in response to the generation of the first detection signal, and control the shutter and the imaging device to capture the image of the subject in response to the generation of the second detection signal.
The image capturing apparatus may further include a viewfinder and a display unit, and may provide a first mode in which the viewfinder and the display unit are alternately enabled, and a second mode in which an image capturing function using the ocular sensor is enabled. The detection region of the ocular sensor may be larger in the second mode than in the first mode.
The ocular sensor may be a single ocular sensor.
In various embodiments, images may be captured without direct contact with a main body of an image capturing apparatus to prevent shake (vibration) of the image capturing apparatus. Also, images may be captured without purchasing and carrying an additional remote controller, so that the cost of purchasing the remote controller and the burden of carrying it can be eliminated.
These and/or other aspects will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
Reference will now be made in detail to the embodiments of the invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.
An embodiment proposes another method of capturing an image using an ocular sensor 106 or 156. In the image capturing apparatus 100 shown in
A controller 250 may execute a program recorded in an electrically erasable and programmable read-only memory (EEPROM) 255 and control the overall operation of the image capturing apparatus 150. The controller 250 may include a digital signal processor (DSP) or image processor configured to convert an analog image signal into a digital image signal and perform various conversion operations for obtaining good images.
An optical system 210 may include a plurality of optical lenses and may be used to form an image of a subject on an imaging surface of an imaging device 220. The optical system 210 may include a lens 212 disposed along an optical-axis direction and configured to receive the image of the subject, a shutter 214 and an iris diaphragm 216 configured to control the exposure time and quantity of light incident on the imaging device 220, and a focusing module 218 configured to control the focus of the image of the subject that is formed on the imaging surface of the imaging device 220. The lens 212, the shutter 214, the iris diaphragm 216, and the focusing module 218 may be driven by a driver 211. The focusing module 218 may include at least one focus lens. The shutter 214 of
The imaging device 220 may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) device and may convert the image of the subject, which is received via the optical system 210, into an electric image signal (photoelectric conversion). An operation of the imaging device 220 may be controlled by the controller 250 via a timing generator 221. In a film camera, the film corresponds to the imaging device 220; in that case, the image is recorded by photosensitization of the film rather than by photoelectric conversion.
An analog front-end (AFE) circuit 230 may be used to process an output signal of the imaging device 220 and may convert the output signal of the imaging device 220 into a quantized digital image signal. The AFE circuit 230 may perform sample-hold operations on the output signal of the imaging device 220 in a correlated double sampling manner to maintain a high signal-to-noise ratio (SNR), control a gain of an image signal using an auto-gain control, and perform an analog-to-digital conversion (ADC) operation so that the AFE circuit 230 can convert an analog image signal into a digital image signal.
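The three AFE stages described above (correlated double sampling, gain control, and analog-to-digital conversion) can be modeled numerically. The function below is an illustrative sketch of the signal chain, not the circuit itself; all parameter names and values are assumptions:

```python
def afe_pipeline(reset_samples, signal_samples, gain=1.0, bits=10, full_scale=1.0):
    """Illustrative model of the AFE stages: correlated double sampling
    (signal level minus reset level, suppressing reset noise), gain
    amplification, and quantization to a digital code."""
    levels = (1 << bits) - 1
    out = []
    for reset, sig in zip(reset_samples, signal_samples):
        v = (sig - reset) * gain                    # CDS, then gain control
        v = min(max(v, 0.0), full_scale)            # clip to the ADC input range
        out.append(round(v / full_scale * levels))  # ADC quantization
    return out
```

For a 10-bit converter, a half-scale pixel difference maps to roughly the middle of the 0-1023 code range, and out-of-range inputs clip to the maximum code.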
The digital image signal converted by the AFE circuit 230 may selectively undergo an image quality correction operation and luminance/chromaticity conversion operations, be transmitted to an encoder/decoder 260, be converted into encoded data according to a prescribed compression encoding method, such as Joint Photographic Experts Group (JPEG) or Moving Picture Experts Group (MPEG), and be transmitted to and stored in a recording medium 270. The recording medium 270 may be used to store image data of a subject in the form of a still image or moving image file. The encoder/decoder 260 may access the image file stored in the recording medium 270 and extension-decode the image file to construct a reproduction screen of the still image or moving image. Meanwhile, a dynamic random access memory (DRAM) 240 may temporarily store the image data, providing a working region in which the encoder/decoder 260 may perform a compression operation and the controller 250 may perform various signal processing operations.
A display unit 152 and an ocular sensor 156 may be the same as described above with reference to
According to the embodiment, in the image capturing apparatus 100 on which the viewfinder 104 is mounted as shown in
In the image capturing apparatus 150 of
In
A time interval between the rising edge and falling edge of the detection signal 402 shown in
When the entrance of an object into the detection region of the ocular sensor 156 of the image capturing apparatus 150 is detected, a rising edge of the detection signal 402 of the ocular sensor 156 may be formed (operation 504), and a rising edge of the control signal 406 may be formed in response to the rising edge of the detection signal 402 to enable a focusing control operation (506). Before a predetermined time elapses (refer to “NO” of operation 508) from a time point at which the rising edge of the detection signal 402 is formed, when the exit of the object from the detection region of the ocular sensor 156 is detected and the falling edge of the detection signal 402 of the ocular sensor 156 is formed (refer to “YES” of operation 510), the falling edge of the control signal 406 may be formed in response to the falling edge of the detection signal 402, and an image capturing control operation may be performed in response to the formation of the falling edge of the control signal 406 (operation 512). Here, the predetermined time refers to the predetermined range of the time interval described above with reference to
The controller 250 may store data regarding the image captured in operation 512 in the recording medium 270, and display the captured image via the display unit 152 when an image display function using the display unit 152 is enabled (operation 516). Afterwards, when the user powers off the image capturing apparatus 150, the operation of the image capturing apparatus 150 may end (refer to “YES” of operation 518). When the image capturing apparatus 150 remains powered on (refer to “NO” of operation 518), the count of a time required to control the ocular sensor 156 may be initialized (operation 514), and the process may return to the operation following operation 502 of powering on the image capturing apparatus 150 and determining the image capturing condition.
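The flow of operations 504 through 518 can be sketched as an event loop. In the sketch below, the `sensor` and `camera` interfaces and the timeout value are assumptions introduced for illustration; they are not elements disclosed by the embodiment:

```python
import time

def run_capture_loop(sensor, camera, timeout_s=2.0):
    """Event-loop sketch of operations 504-518: a rising edge of the
    detection signal starts focusing and a timer; a falling edge seen
    before the timeout triggers capture and display; otherwise the timer
    is re-initialized and the loop waits for the next entry."""
    while camera.powered_on():                   # operation 518
        sensor.wait_for_rising_edge()            # object enters region (504)
        camera.autofocus()                       # focusing control (506)
        start = time.monotonic()
        while time.monotonic() - start < timeout_s:   # operation 508
            if sensor.falling_edge_seen():       # object exits region (510)
                camera.capture()                 # operation 512
                camera.display_last_image()      # operation 516
                break
        # On timeout or after capture, timer state is re-initialized (514)
```

The real controller 250 would drive the shutter 214 and imaging device 220 in `capture()` and the display unit 152 in `display_last_image()`.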
In
A time interval between the first rising edge and second rising edge of the detection signal 602 shown in
When a first entrance of an object into the detection region of the ocular sensor 156 of the image capturing apparatus 150 is detected, a first rising edge of the detection signal 602 of the ocular sensor 156 may be formed (operation 704), and a rising edge of the control signal 604 may be formed in response to the first rising edge of the detection signal 602 to enable a focusing control operation (706). Before a predetermined time elapses (refer to “NO” of operation 708) from a time point at which the first rising edge of the detection signal 602 is formed, when a second entrance of the object into the detection region of the ocular sensor 156 is detected and the second rising edge of the detection signal 602 of the ocular sensor 156 is formed (refer to “YES” of operation 710), the falling edge of the control signal 604 may be formed in response to the second rising edge of the detection signal 602, and an image capturing control operation may be performed in response to the formation of the falling edge of the control signal 604 (operation 712). Here, the predetermined time refers to the predetermined range of the time interval described above with reference to
The controller 250 may store data regarding the image captured in operation 712 in the recording medium 270, and display the captured image via the display unit 152 when an image display function using the display unit 152 is enabled (operation 716). Afterwards, when the user powers off the image capturing apparatus 150, the operation of the image capturing apparatus 150 may end (refer to “YES” of operation 718). When the image capturing apparatus 150 remains powered on (refer to “NO” of operation 718), the count of a time required to control the ocular sensor 156 may be initialized (operation 714), and the process may return to the operation following operation 702 of powering on the image capturing apparatus 150 and determining the image capturing condition.
In
In
When bi-directional movement of an object is detected by the ocular sensor 156 in front of the ocular sensor 156 of the image capturing apparatus 150, the first rising edge of the detection signal 802 of the ocular sensor 156 may be formed at a time point t5 (operation 904), and the rising edge of the control signal 804 may be formed in response to the first rising edge of the detection signal 802 to perform a focusing control operation (operation 906). After a predetermined time has elapsed from the time point t5 at which the first rising edge of the detection signal 802 is formed (refer to “YES” of operation 908), when another motion of the object in an opposite direction is detected by the ocular sensor 156, the second rising edge of the detection signal 802 of the ocular sensor 156 is formed (refer to “YES” of operation 910). The second rising edge of the control signal 804 may be formed in response to the second rising edge of the detection signal 802, and an image capturing control operation may be performed in response to the formation of the second rising edge of the control signal 804 (operation 912). Here, the predetermined time refers to the predetermined range of the time interval described above with reference to
The controller 250 may store data regarding the image captured in operation 912 in the recording medium 270, and display the captured image via the display unit 152 when an image display function using the display unit 152 is enabled (operation 916). Afterwards, when the user powers off the image capturing apparatus 150, the operation of the image capturing apparatus 150 may end (refer to “YES” of operation 918). When the image capturing apparatus 150 remains powered on (refer to “NO” of operation 918), the count of a time required to control the ocular sensor 156 may be initialized (operation 914), and the process may return to the operation following operation 902 of powering on the image capturing apparatus 150 and determining the image capturing condition.
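The timing condition of operations 908 through 912 differs from the earlier embodiments in that the second rising edge must arrive *after* the predetermined time has elapsed. A minimal sketch of how the two rising edges of detection signal 802 could be mapped to control actions, with an assumed threshold value:

```python
def classify_edges(edge_times, min_delay_s=1.0):
    """Illustrative sketch of operations 904-912: the first rising edge
    (time point t5) always triggers the focusing control operation; a
    later rising edge triggers the image capturing control operation only
    when it occurs more than min_delay_s after the first edge. The
    threshold and interface are assumptions, not from the disclosure."""
    actions = []
    for i, t in enumerate(edge_times):
        if i == 0:
            actions.append(("focus", t))          # first rising edge
        elif t - edge_times[0] > min_delay_s:
            actions.append(("capture", t))        # second rising edge, delayed
    return actions
```

An edge arriving too soon after the first entry produces no capture action, which distinguishes this gesture from the within-interval gestures of the preceding embodiments.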
Although a few embodiments have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.
All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.
For the purposes of promoting an understanding of the principles of the invention, reference has been made to the embodiments illustrated in the drawings, and specific language has been used to describe these embodiments. However, no limitation of the scope of the invention is intended by this specific language, and the invention should be construed to encompass all embodiments that would normally occur to one of ordinary skill in the art. The terminology used herein is for the purpose of describing the particular embodiments and is not intended to be limiting of exemplary embodiments of the invention. In the description of the embodiments, certain detailed explanations of related art are omitted when it is deemed that they may unnecessarily obscure the essence of the invention.
The apparatus described herein may comprise a processor, a memory for storing program data to be executed by the processor, a permanent storage such as a disk drive, a communications port for handling communications with external devices, and user interface devices, including a display, touch panel, keys, buttons, etc. When software modules are involved, these software modules may be stored as program instructions or computer readable code executable by the processor on a non-transitory computer-readable media such as magnetic storage media (e.g., magnetic tapes, hard disks, floppy disks), optical recording media (e.g., CD-ROMs, Digital Versatile Discs (DVDs), etc.), and solid state memory (e.g., random-access memory (RAM), read-only memory (ROM), static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), flash memory, thumb drives, etc.). The computer readable recording media may also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion. This computer readable recording media may be read by the computer, stored in the memory, and executed by the processor.
Also, using the disclosure herein, programmers of ordinary skill in the art to which the invention pertains may easily implement functional programs, codes, and code segments for making and using the invention.
The invention may be described in terms of functional block components and various processing steps. Such functional blocks may be realized by any number of hardware and/or software components configured to perform the specified functions. For example, the invention may employ various integrated circuit components, e.g., memory elements, processing elements, logic elements, look-up tables, and the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices. Similarly, where the elements of the invention are implemented using software programming or software elements, the invention may be implemented with any programming or scripting language such as C, C++, JAVA®, assembler, or the like, with the various algorithms being implemented with any combination of data structures, objects, processes, routines or other programming elements. Functional aspects may be implemented in algorithms that execute on one or more processors. Furthermore, the invention may employ any number of conventional techniques for electronics configuration, signal processing and/or control, data processing and the like. Finally, the steps of all methods described herein may be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context.
For the sake of brevity, conventional electronics, control systems, software development and other functional aspects of the systems (and components of the individual operating components of the systems) may not be described in detail. Furthermore, the connecting lines, or connectors shown in the various figures presented are intended to represent exemplary functional relationships and/or physical or logical couplings between the various elements. It should be noted that many alternative or additional functional relationships, physical connections or logical connections may be present in a practical device. The words “mechanism”, “element”, “unit”, “structure”, “means”, and “construction” are used broadly and are not limited to mechanical or physical embodiments, but may include software routines in conjunction with processors, etc.
The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. Numerous modifications and adaptations will be readily apparent to those of ordinary skill in this art without departing from the spirit and scope of the invention as defined by the following claims. Therefore, the scope of the invention is defined not by the detailed description of the invention but by the following claims, and all differences within the scope will be construed as being included in the invention.
No item or component is essential to the practice of the invention unless the element is specifically described as “essential” or “critical”. It will also be recognized that the terms “comprises,” “comprising,” “includes,” “including,” “has,” and “having,” as used herein, are specifically intended to be read as open-ended terms of art. The use of the terms “a” and “an” and “the” and similar referents in the context of describing the invention (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless the context clearly indicates otherwise. In addition, it should be understood that although the terms “first,” “second,” etc. may be used herein to describe various elements, these elements should not be limited by these terms, which are only used to distinguish one element from another. Furthermore, recitation of ranges of values herein are merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein.
Number | Date | Country | Kind |
---|---|---|---|
10-2012-0081488 | Jul 2012 | KR | national |