One aspect of the embodiments relates to an electronic apparatus, a control method thereof, and a storage medium.
Japanese Patent Laid-Open No. 2004-312477 discloses a power control apparatus that detects, based on a change in the attitude of an electronic apparatus, that the electronic apparatus has been removed from an accommodation location such as a bag or clothing pocket, and automatically turns on the power. Thereby, imaging can be performed immediately at an intended timing without any operation for turning on the power. Japanese Patent Laid-Open No. 2004-312477 also discloses a method for reducing power consumption by automatically turning off the power in a case where the image does not change and no operation is performed within a predetermined time after the power is turned on.
However, the electronic apparatus disclosed in Japanese Patent Laid-Open No. 2004-312477 cannot recognize the user's state during imaging. Thus, the power may remain on even though it is actually unnecessary, and the power consumption of the electronic apparatus may not be reduced.
An electronic apparatus according to one aspect of the disclosure includes a display unit configured to detect contact with an object using a detector, and a processor configured to switch between a first power mode and a second power mode that consumes less power than the first power mode, based on the size of a contact area of an object on the display unit detected by the detector. A control method of the above electronic apparatus also constitutes another aspect of the disclosure. A storage medium storing a program that causes a computer to execute the above control method also constitutes another aspect of the disclosure.
Further features of various embodiments of the disclosure will become apparent from the following description of embodiments with reference to the attached drawings.
In the following, the term “unit” may refer to a software context, a hardware context, or a combination of software and hardware contexts. In the software context, the term “unit” refers to a functionality, an application, a software module, a function, a routine, a set of instructions, or a program that can be executed by a programmable processor such as a microprocessor, a central processing unit (CPU), or a specially designed programmable device or controller. A memory contains instructions or programs that, when executed by the CPU, cause the CPU to perform operations corresponding to units or functions. In the hardware context, the term “unit” refers to a hardware element, a circuit, an assembly, a physical structure, a system, a module, or a subsystem. Depending on the specific embodiment, the term “unit” may include mechanical, optical, or electrical components, or any combination of them. The term “unit” may include active (e.g., transistors) or passive (e.g., capacitor) components. The term “unit” may include semiconductor devices having a substrate and other layers of materials having various concentrations of conductivity. It may include a CPU or a programmable processor that can execute a program stored in a memory to perform specified functions. The term “unit” may include logic elements (e.g., AND, OR) implemented by transistor circuits or any other switching circuits. In the combination of software and hardware contexts, the term “unit” or “circuit” refers to any combination of the software and hardware contexts as described above. In addition, the term “element,” “assembly,” “component,” or “device” may also refer to “circuit” with or without integration with packaging materials.
Referring now to the accompanying drawings, a detailed description will be given of embodiments according to the disclosure.
Referring now to
A shutter control unit 104 controls a shutter 103 based on exposure information from an image processing unit 106 in cooperation with a lens control unit 204 that controls an aperture stop 203. A first image sensor 105 converts an optical image of an unillustrated object formed through a lens (optical system) 205, the aperture stop 203, lens mounts 202 and 102, and the shutter 103 into an electrical signal. The image processing unit 106 performs predetermined calculation processing on the image signal output from the first image sensor 105, and performs image processing such as pixel interpolation processing, color conversion processing, or white balance processing based on the result of the calculation processing. The processed image data is output to a liquid crystal panel display unit 112 of a display apparatus (display unit, rear monitor) 110 or a display unit 121 of the viewfinder 120. The image processing unit 106 also has an image compression function such as Joint Photographic Experts Group (JPEG) compression.
A second image sensor (in-camera) 107 is provided on the back side of the image pickup apparatus 100, and acquires an image in the back direction of the image pickup apparatus 100, such as an image of the user. A face detector 108 detects whether or not the user's face is included in the image acquired by the second image sensor 107. A memory 109 is, for example, a RAM, and stores captured still images, moving images, image data for playback display, and the like. The memory 109 has a storage capacity sufficient to store a predetermined number of still images or moving images. The memory 109 has a program stack area, a status storage area, a calculation area, a work area, and an image display data area for a system control unit 150. Various calculations are executed by the system control unit 150 using the calculation area of the memory 109.
The display apparatus 110 is a liquid crystal display type display apparatus with a touch panel function, and includes a touch panel unit 111, a liquid crystal panel display unit 112, and a backlight illumination unit 113. The liquid crystal panel display unit 112 can display a menu screen stored in the image display data area of the memory 109 or an image file stored in a recording medium 180 according to instructions from the system control unit 150. The liquid crystal panel display unit 112 can also provide a live-view display by sequentially displaying, in real time, the imaging data obtained from the first image sensor 105 as a through image. The backlight illumination unit 113 performs rear illumination (backlight illumination) for the liquid crystal panel display unit 112. The light source element for backlight illumination may be a light-emitting diode (LED), an organic electroluminescence (EL) element, a fluorescent tube, or the like, but is not limited to these examples. The backlight illumination unit 113 can turn the illumination on or off arbitrarily according to instructions from the system control unit 150. The touch panel unit 111 serves as a detector that can detect the contact of an object, and uses a capacitance method as its touch detecting method. The touch panel unit 111 enables intuitive device operations such as touch autofocus (AF), which focuses on a position touched by the user.
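For illustration only, the following minimal Python sketch shows one way a touch on the panel could be mapped to a touch AF target; the coordinate mapping, object interface, and method names are assumptions introduced here and are not part of the disclosed apparatus.

```python
# Hypothetical sketch of touch AF: map a touch position on the touch panel
# unit 111 to coordinates on the first image sensor 105 and focus there.
# All names and the mapping assumption are illustrative only.

def on_touch(panel_x: int, panel_y: int,
             panel_size: tuple[int, int],
             sensor_size: tuple[int, int],
             camera) -> None:
    panel_w, panel_h = panel_size
    sensor_w, sensor_h = sensor_size
    # Assumes the live-view image fills the panel without letterboxing.
    sensor_x = panel_x * sensor_w // panel_w
    sensor_y = panel_y * sensor_h // panel_h
    camera.set_af_point(sensor_x, sensor_y)  # focus on the touched position
    camera.start_af()
```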
Referring now to
The viewfinder 120 includes an unillustrated optical system (finder optical system), a display unit (intra-finder display unit) 121 different from the liquid crystal panel display unit 112, and an eye proximity detector 122. The display unit 121 is a liquid crystal panel, an organic EL, or the like disposed inside the viewfinder 120. Similarly to the liquid crystal panel display unit 112, the display unit 121 displays a menu screen stored in the image display data area of the memory 109 or the image file stored on the recording medium 180 according to instructions from the system control unit 150.
The eye proximity detector 122 is, for example, a proximity sensor having an infrared LED light emitter, a photodiode light receiver, and the like. The eye proximity detector 122 is disposed outside the optical system of the viewfinder 120 and detects that the user is using the viewfinder 120. In a case where the eye proximity detector 122 detects that the user's face is close to the viewfinder 120, the system control unit 150 changes the display destination from the display apparatus 110 as the rear monitor to the display unit 121 in the viewfinder 120. Conversely, in a case where the eye proximity detector 122 does not detect that the user's face is close to the viewfinder 120, the system control unit 150 changes the display destination from the display unit 121 in the viewfinder 120 to the display apparatus 110.
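A minimal Python sketch of this display-destination switching is shown below; the class structure and method names (e.g., is_eye_near) are assumptions introduced for illustration, not the disclosed implementation.

```python
# Hypothetical sketch of switching the display destination between the rear
# monitor (display apparatus 110) and the intra-finder display unit 121.

class DisplayRouter:
    REAR_MONITOR = "display_apparatus_110"
    FINDER_DISPLAY = "display_unit_121"

    def __init__(self, eye_proximity_detector):
        self.detector = eye_proximity_detector
        self.destination = self.REAR_MONITOR

    def update(self) -> str:
        # Face close to the viewfinder 120 -> in-finder display unit 121;
        # otherwise fall back to the rear monitor.
        if self.detector.is_eye_near():
            self.destination = self.FINDER_DISPLAY
        else:
            self.destination = self.REAR_MONITOR
        return self.destination
```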
A shutter button 130 is an operation unit for imaging instruction, and includes a first shutter switch 131 and a second shutter switch 132. The first shutter switch 131 is turned on in a case where the shutter button 130 is half-pressed (imaging preparation instruction) during operation of the shutter button 130, and generates a first shutter switch signal SW1. The first shutter switch signal SW1 starts operations such as AF processing, auto-exposure (AE) processing, auto white balance (AWB) processing, flash pre-emission (FE) processing, and imaging with the first image sensor 105. The second shutter switch 132 is turned on in a case where the operation of the shutter button 130 is completed, that is, in a case where the shutter button 130 is fully pressed (imaging instruction), and generates a second shutter switch signal SW2. The system control unit 150 performs a series of imaging processing operations from signal readout from the first image sensor 105 to writing image data to the recording medium 180 using the second shutter switch signal SW2.
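The two-stage behavior of the shutter button 130 could be sketched as follows; the camera interface and method names are assumptions introduced here for illustration.

```python
# Hypothetical sketch of the half-press (SW1) and full-press (SW2) handling.

def on_shutter_button(half_pressed: bool, fully_pressed: bool, camera) -> None:
    if half_pressed:
        # First shutter switch 131 -> signal SW1: imaging preparation.
        camera.start_af()   # autofocus
        camera.start_ae()   # auto-exposure
        camera.start_awb()  # auto white balance
        camera.start_fe()   # flash pre-emission
    if fully_pressed:
        # Second shutter switch 132 -> signal SW2: the imaging sequence from
        # sensor readout to writing image data to the recording medium 180.
        image = camera.read_first_image_sensor()
        camera.write_to_recording_medium(image)
```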
The operation unit 133 is used to input various predetermined operation instructions to the system control unit 150, and includes any one of a switch, a dial, a touch panel, a pointing device based on line-of-sight detection, a voice recognition apparatus, and the like, or a combination thereof. A power switch 134 can switch between powering on and powering off the image pickup apparatus 100. A nonvolatile memory 140 is an electrically erasable and recordable memory, such as a flash memory or an electrically erasable programmable read-only memory (EEPROM). The nonvolatile memory 140 stores an imaging state, a program for controlling the image pickup apparatus 100, and the like.
The system control unit 150 is a control unit that has at least one processor and controls the operation of the entire image pickup apparatus 100. A power supply control unit 160 includes a battery detecting circuit, a protection circuit, a DC-DC converter, an LDO regulator, and the like. The power supply control unit 160 has a function of protecting a load circuit connected to the power supply circuit by cutting off the power supply upon detecting the presence or absence of a battery, the type of battery, the remaining battery capacity, or an overcurrent. The power supply control unit 160 controls a power supply unit 170 based on instructions from the system control unit 150, and supplies a desired power supply voltage to each component in the image pickup apparatus 100 for a desired period. The power supply unit 170 is a primary battery such as an alkaline battery or a lithium battery, a secondary battery such as a NiCd battery, a NiMH battery, or a Li battery, or an AC adapter. The recording medium 180 is a semiconductor memory or the like removably attached to the image pickup apparatus 100 and configured to record or read image data. An attitude detector 190 is an acceleration sensor or the like, and detects at least a roll angle and a pitch angle. Thereby, the horizontal or vertical position and the tilt of the image pickup apparatus 100 can be detected.
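As an aside, roll and pitch angles can conventionally be derived from a static three-axis accelerometer reading as in the sketch below; this is one common formulation shown only for illustration, and the disclosure does not specify how the attitude detector 190 computes them.

```python
import math

def roll_pitch_from_accel(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Return (roll, pitch) in degrees from a static accelerometer sample.

    One conventional formulation, shown only to illustrate what an attitude
    detector such as 190 could compute; not taken from the disclosure.
    """
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch
```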
The lens apparatus 200 can guide an optical image of an unillustrated object through the lens 205, aperture stop 203, lens mounts 202 and 102, and shutter 103, and form it on the first image sensor 105. The lens control unit 204 controls the entire lens apparatus 200. The lens control unit 204 transmits information to the image pickup apparatus 100 via the connectors 101 and 201 that electrically connect the image pickup apparatus 100 and the lens apparatus 200, or performs various lens controls, such as focusing and zooming, according to instructions from the system control unit 150.
Referring now to
First, in step S101, the system control unit 150 determines whether a predetermined period has elapsed while the user does not operate the image pickup apparatus 100 (non-operating state). Here, the predetermined period is a period, such as 10 seconds or 20 seconds, long enough to suggest that the user is unlikely to intend to capture an image, and can be set as appropriate. In a case where it is determined that the predetermined period has not elapsed in the non-operating state, step S101 is repeated until it is determined that the predetermined period has elapsed. On the other hand, in a case where it is determined that the predetermined period has elapsed, the flow proceeds to step S102.
In step S102, the system control unit 150 starts the second image sensor (in-camera) 107. Next, in step S103, the system control unit 150 determines whether the user's face has been detected using the second image sensor 107. For example, the face detector 108 can detect whether the user's face is included in the image by analyzing the image on the user side acquired by the second image sensor 107. In a case where it is determined that the user's face has been detected, the flow proceeds to step S109. On the other hand, in a case where it is determined that no face has been detected, the flow proceeds to step S104.
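Steps S101 through S103 could be sketched in Python as follows; the period value and every object interface are assumptions introduced for illustration, not the disclosed implementation.

```python
import time

NON_OPERATING_PERIOD_S = 10.0  # e.g., 10 or 20 seconds; set as appropriate

def check_user_presence(camera, face_detector) -> bool:
    # S101: repeat until the predetermined non-operating period has elapsed.
    while time.monotonic() - camera.last_operation_time < NON_OPERATING_PERIOD_S:
        time.sleep(0.1)
    # S102: start the second image sensor (in-camera) 107.
    camera.start_in_camera()
    # S103: detect whether the user's face appears in the in-camera image.
    frame = camera.read_in_camera()
    return face_detector.detect_face(frame)  # True -> S109, False -> S104
```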
Referring now to
In step S104, the system control unit 150 determines whether the proximity of an object to the viewfinder 120 is detected using the eye proximity detector 122. In a case where the proximity of the object to the viewfinder 120 is detected, the flow proceeds to step S105. On the other hand, in a case where the proximity of the object to the viewfinder 120 is not detected, the flow proceeds to step S106.
In step S105, the system control unit 150 determines whether surface contact with the touch panel unit 111 has occurred. In a case where it is determined that surface contact has occurred with the touch panel unit 111, the flow proceeds to step S106. On the other hand, in a case where it is determined that surface contact has not occurred, the flow proceeds to step S109. Here, the system control unit 150 determines whether the state illustrated in
In a case where the user is in an imaging state using the viewfinder 120 as illustrated in
The determination threshold in step S105 is the second threshold. Providing two thresholds having such a relationship makes it possible to detect contact with the torso without misrecognizing nose contact during use of the viewfinder 120 as a finger touch operation.
The torso contact in the neck hanging state may be determined using the detection area on the touch panel unit 111 as an additional condition.
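Assuming the first threshold is on the scale of an ordinary finger touch and the second, larger threshold is exceeded only by broad surface contact such as the torso, the classification could be sketched as follows; the numerical values are invented for illustration and do not come from the disclosure.

```python
# Hypothetical two-threshold classification of a detected contact area.
FIRST_THRESHOLD_MM2 = 100.0    # assumed finger-touch scale
SECOND_THRESHOLD_MM2 = 1000.0  # assumed torso-contact scale

def classify_contact(contact_area_mm2: float) -> str:
    if contact_area_mm2 >= SECOND_THRESHOLD_MM2:
        return "surface_contact"  # e.g., torso in the neck hanging state
    if contact_area_mm2 >= FIRST_THRESHOLD_MM2:
        return "large_touch"      # e.g., nose contact while using the finder
    return "finger_touch"         # ordinary touch operation such as touch AF
```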
In step S106 in
In step S107, since it is determined that the user has no intention of imaging, the system control unit 150 sets the image pickup apparatus 100 to a low power consumption mode (a second power mode with lower power consumption than the first power mode). As specific control for reducing power consumption in the low power consumption mode, the system control unit 150 stops the imaging operation of the first image sensor 105, turns off the display apparatus 110, and the like. Next, in step S108, the system control unit 150 determines whether the face detector 108 has detected the user's face in the image acquired by the second image sensor 107, or the user has operated the shutter button 130 or the operation unit 133 (camera operation). In a case where it is determined that the user's face has been detected or that the camera has been operated, the flow proceeds to step S109. On the other hand, in a case where the state in which the user's face is not detected and no camera operation is performed continues, the flow returns to step S107.
In step S109, the system control unit 150 sets a normal power mode (first power mode) for operating all functions of the image pickup apparatus 100. In a case where the user's face is detected in step S103, or in a case where the user's face is detected or the user operation is performed in the low power consumption mode in step S108, it can be determined that the user has the intention of imaging. Therefore, the image pickup apparatus 100 is operated in the normal power mode in which all functions can be operated.
Next, in step S110, the system control unit 150 determines whether the predetermined period has elapsed in a state in which the user's face has been detected by the face detector 108 during operation in the normal power mode. In a case where it is determined that the predetermined period has not elapsed, step S110 is repeated. On the other hand, in a case where it is determined that the predetermined period has elapsed, the flow proceeds to step S111. In step S111, the system control unit 150 stops the imaging operation of the second image sensor 107 because it is determined that the imaging operation is being performed continuously and that determining the imaging intention based on face detection is unnecessary.
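Tying the above steps together, a simplified Python sketch of the flow from step S104 to step S111 might read as follows; it treats the branch through step S106 as proceeding to step S107, and every interface is an assumption introduced here rather than the disclosed implementation.

```python
import time

# Simplified, hypothetical sketch of steps S104-S111.

def run_power_control(camera, face_detector, eye_detector, touch_panel) -> None:
    # S104/S105: an eye near the viewfinder 120 without surface contact on
    # the touch panel unit 111 indicates an intention of imaging.
    if eye_detector.is_eye_near() and not touch_panel.has_surface_contact():
        enter_normal_power_mode(camera, face_detector)  # S109
        return
    # S107: no imaging intent; enter the low power consumption mode.
    camera.stop_first_image_sensor()
    camera.turn_off_display()
    # S108: remain in the low power mode until a face is detected or the
    # shutter button 130 or operation unit 133 is operated.
    while not (face_detector.detect_face(camera.read_in_camera())
               or camera.was_operated()):
        time.sleep(0.1)
    enter_normal_power_mode(camera, face_detector)  # S109

def enter_normal_power_mode(camera, face_detector) -> None:
    camera.enable_all_functions()  # S109: first (normal) power mode
    # S110: wait until the face has been detected for the predetermined period.
    camera.wait_face_detected_for_period(face_detector)
    camera.stop_in_camera()        # S111: face-based checking is unnecessary
```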
In this embodiment, the system control unit 150 switches between a first power mode and a second power mode that consumes less power than the first power mode, based on the size of the contact area of an object on the display apparatus 110 detected by the touch panel unit 111. Thus, the user's intention of imaging can be determined more accurately by using the surface contact detection result of the touch panel unit 111, and power saving control can thereby be performed more accurately. In this embodiment, the touch panel unit 111 uses a capacitive touch panel, but another method such as a resistive film method may be used as long as the surface contact can be detected.
In this embodiment, in a case where a large and heavy lens apparatus 200 such as a telephoto lens is attached, the image pickup apparatus 100 tilts significantly as illustrated in
To avoid this problem, the system control unit 150 may use the result of the attitude detector 190. In a case where a face is detected in step S103 in
Embodiment(s) of the disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer-executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer-executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer-executable instructions. The computer-executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disc (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the disclosure has described example embodiments, it is to be understood that some embodiments are not limited to the disclosed embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This embodiment can provide an electronic apparatus that can more effectively reduce power consumption.
This application claims the benefit of Japanese Patent Application No. 2023-031864, which was filed on Mar. 2, 2023, and which is hereby incorporated by reference herein in its entirety.