The present invention relates to an electronic apparatus having a function of detecting a move operation, and particularly, relates to a calibration method of the function.
An electronic apparatus on which various operating members (pointing devices) for designating positions are mounted is known. For example, an electronic apparatus in which selection or movement of an object is controlled according to a move operation (a slide operation) of touching and sliding a touch operating member is known as an electronic apparatus having a touch operating member which is one kind of a pointing device. An electronic apparatus in which selection or movement of an object is controlled by a mouse drag or the like is also known.
Japanese Patent Application Publication No. 2018-128738 proposes a method of selecting a position based on movement of a position to be touched subsequently from among a plurality of selection candidate positions located in a moving direction obtained by two points of an initial slide operation so that a position located in a direction intended by a user can be easily selected by the slide operation.
In a move operation of moving an operating body (a finger or the like touching a touch operating member) or an operating member (a mouse or the like), the habit of moving a moving target generally differs from user to user. Due to factors such as a difference in the habit of each user, the direction of a move operation intended by a user may differ from an operating direction detected by the device, and a move operation in the direction intended by the user may not be performed. In the method disclosed in Japanese Patent Application Publication No. 2018-128738, since a subsequent moving direction (an operating direction) is limited by the two points of an initial slide operation, in a case where the user changes the moving direction intentionally during the slide operation, the position located in the direction intended by the user may not be selected.
Therefore, the present invention provides an electronic apparatus in which processing corresponding to a direction closer to a direction intended by a user is executed as processing that corresponds to the direction of a move operation.
The present invention in its first aspect provides an electronic apparatus including: at least one processor and/or at least one circuit to perform the operations of the following units: an operation-detecting unit configured to detect a move operation; a notification unit configured to send a notification prompting a user to perform a move operation of drawing a linear trajectory in a specific direction; and an execution unit configured to (1) acquire reference curve information indicating information on a curve based on a trajectory of a first move operation corresponding to the notification, which has been detected by the operation-detecting unit, wherein the reference curve information is information for determining whether the move operation is in the specific direction, and (2) execute processing based on a direction determined by comparison between the reference curve information and a trajectory of a third move operation, in a case where the third move operation is performed later than the first move operation.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The arrangement of the AF-ON buttons 1 and 2 will be described.
The AF-ON buttons 1 and 2 are operating members different from the touch panel 70a and do not have a display function. In an example described later, an example of moving an indicator (an AF frame) indicating a range-finding position selected by an operation on the AF-ON buttons 1 and 2 is described. However, the function executed according to an operation on the AF-ON buttons 1 and 2 is not particularly limited. For example, an arbitrary indicator that is moved by a slide operation on the AF-ON buttons 1 and 2 may be used as long as the indicator is displayed on the display unit 28 and can be moved. For example, the indicator may be a pointing cursor such as a mouse cursor and may be a cursor indicating an option selected among a plurality of options (a plurality of items displayed on a menu screen). An indicator moved by a slide operation on the AF-ON button 1 may be different from an indicator moved by a slide operation on the AF-ON button 2. The function executed by a push operation on the AF-ON buttons 1 and 2 may be another function related to the function executed by the slide operation on the AF-ON buttons 1 and 2.
A mode changeover switch 60 is an operating member for switching various modes. A power supply switch 72 is an operating member for switching the power of the camera 100 on and off. A sub-electronic dial 73 is a rotary operating member for moving a selection frame and feeding images. Eight-direction keys 74a and 74b are operating members that can each be pressed down in the up, down, left, right, upper left, lower left, upper right, and lower right directions, and processes corresponding to the directions in which the eight-direction keys 74a and 74b are pressed down can be performed. The eight-direction key 74a is mainly used in a horizontal capturing mode, and the eight-direction key 74b is mainly used in a vertical capturing mode. A SET button 75 is an operating member mainly used for confirming a selection item. A still image/video selection switch 77 is an operating member for switching between a still image capturing mode and a video capturing mode. An LV button 78 is an operating member for switching a live view (hereinafter, LV) on and off. When the LV is on, a mirror 12 described later moves (mirror-up) to a retraction position retracted from an optical axis, subject light is guided to an imaging unit 22 described later, and an LV mode in which an LV image is captured is set. In the LV mode, a subject image can be viewed in the LV image. When the LV is off, the mirror 12 moves (mirror-down) onto the optical axis, the subject light is reflected and guided to the finder 16, and an OVF mode in which an optical image (an optical subject image) of the subject can be visually recognized through the finder 16 is set. A playback button 79 is an operating member for switching between a capturing mode (a photographing screen) and a playback mode (a playback screen). When a user presses the playback button 79 during a capturing mode, the camera enters the playback mode, and the latest image among the images recorded on a recording medium 200 (described later) can be displayed on the display unit 28.
A lens unit 150 is a lens unit on which exchangeable photographing lenses are mounted. A lens 155 generally includes a plurality of lenses such as a focus lens group and a zoom lens group, but only one lens is illustrated here for simplicity.
An AE sensor 17 measures the luminance of a subject (a subject light) having passed through the lens unit 150 and the quick-return mirror 12 and formed on a focusing screen 13.
A focus detecting unit 11 is a phase difference detection-type AF sensor that captures an image (a subject light) incident through the quick-return mirror 12 and outputs defocus amount information to the system control unit 50. The system control unit 50 controls the lens unit 150 on the basis of the defocus amount information and performs phase difference AF. The AF method may not be phase difference AF but may be contrast AF. Moreover, the phase difference AF may not use the focus detecting unit 11 but may be performed on the basis of a defocus amount detected on an imaging plane of the imaging unit 22 (imaging plane phase difference AF).
The quick-return mirror 12 (hereinafter, a mirror 12) is moved up and down by an actuator (not illustrated) according to an instruction from the system control unit 50 during exposure, live-view photographing, and moving-image photographing. The mirror 12 is a mirror for switching a light flux incident from the lens 155 toward the finder 16 or the imaging unit 22. The mirror 12 is usually disposed so as to guide (reflect) a light flux toward the finder 16 (mirror-down). In a capturing mode or a live-view mode, the mirror 12 pops up to retract from a light flux so as to guide a light flux toward the imaging unit 22 (mirror-up). Moreover, a central part of the mirror 12 is configured as a half-mirror so that a portion of light can pass through the mirror 12, and the mirror 12 transmits a portion of a light flux so as to be incident on the focus detecting unit 11 for detecting the focus.
A user can confirm the focusing state and the composition of an optical image of a subject obtained through the lens unit 150 by observing an image formed on the focusing screen 13 through a pentaprism 14 and the finder 16.
A focal plane shutter 21 (shutter 21) controls an exposure time of the imaging unit 22 under the control of the system control unit 50.
The imaging unit 22 is an imaging device (an imaging sensor) composed of CCD or CMOS devices that convert an optical image to an electrical signal. An A/D converter 23 is used for converting an analog signal output from the imaging unit 22 to a digital signal.
An image processing unit 24 performs predetermined processing (pixel interpolation, resizing processing such as reduction, and color conversion processing) with respect to data from the A/D converter 23 or data from a memory control unit 15. The image processing unit 24 performs predetermined calculation processing using captured image data, and the system control unit 50 performs exposure control and ranging control on the basis of the obtained calculation results. In this way, TTL (through the lens)-type AF (auto focus) processing, AE (auto exposure) processing, and EF (flash pre-emission) processing are performed. The image processing unit 24 also performs predetermined calculation processing using captured image data and performs TTL-type AWB (auto white balance) processing on the basis of the obtained calculation results.
A memory 32 stores image data obtained by the imaging unit 22 and converted to digital data by the A/D converter 23 and image data for displaying on the display unit 28. The memory 32 has a storage capacity sufficient for storing a predetermined number of still images and a predetermined period of videos and audio. The memory 32 may be a removable recording medium such as a memory card and may be an internal memory.
The display unit 28 is a backside monitor for displaying an image and is provided on the back surface of the camera 100.
An in-finder display portion 41 displays a frame (an AF frame) indicating a range-finding point being auto-focused presently and an icon indicating the setting state of the camera with the aid of a finder internal display unit drive circuit 42. A finder outer display unit 43 displays various setting values of the camera 100 such as a shutter speed and an aperture with the aid of a finder outer display unit drive circuit 44.
An orientation detecting unit 55 is a sensor for detecting an attitude according to the angle of the camera 100. On the basis of the attitude detected by the orientation detecting unit 55, it is possible to determine whether the image captured by the imaging unit 22 is an image captured with the camera 100 held horizontally or vertically. The system control unit 50 can add orientation information corresponding to the attitude detected by the orientation detecting unit 55 to an image file of the image captured by the imaging unit 22 and rotate and record the image. An acceleration sensor, a gyro sensor, or the like can be used as the orientation detecting unit 55. Using an acceleration sensor or a gyro sensor serving as the orientation detecting unit 55, it is also possible to detect the movement (panning, tilting, lifting, and whether it is stationary) of the camera 100.
A nonvolatile memory 56 is a memory that is electrically erasable and rewritable by the system control unit 50, and an EEPROM, for example, is used. The nonvolatile memory 56 stores constants for operation of the system control unit 50, programs, and the like. The programs mentioned herein are programs for executing various flowcharts described later in the present embodiment.
The system control unit 50 includes at least one processor (including circuits) and controls the entire camera 100. The system control unit 50 realizes respective steps of processing of the present embodiment by executing the programs recorded on the nonvolatile memory 56. Constants and variables for operation of the system control unit 50, programs read from the nonvolatile memory 56, and the like are loaded into a system memory 52. Moreover, the system control unit 50 performs display control by controlling the memory 32, the D/A converter 19, the display unit 28, and the like.
A system timer 53 is a time measuring unit that measures the time used for various controls and the time of a built-in clock. The mode changeover switch 60 switches an operation mode of the system control unit 50 to a still image capturing mode or a video capturing mode. The still image capturing mode includes a P-mode (program AE), an M-mode (manual), and the like. Alternatively, after switching to a menu screen once with the mode changeover switch 60, the mode may be switched to any one of these modes included in the menu screen using another operating member. Similarly, the video capturing mode may include a plurality of modes. In the M-mode, a user can set an aperture value, a shutter speed, and an ISO sensitivity and can perform photographing with the exposure intended by the user.
A first shutter switch 62 is turned on by so-called half-pressing (photographing preparation instruction) in the middle of operation of the shutter buttons 103 and 105 provided in the camera 100 and generates a first shutter switch signal SW1. The system control unit 50 starts operations such as AF (autofocus) processing, AE (auto exposure) processing, AWB (auto white balance) processing, EF (flash pre-emission) processing, and the like according to the first shutter switch signal SW1. Luminance is also measured by the AE sensor 17.
A second shutter switch 64 is turned on by full-pressing (photographing instruction) upon completion of operation of the shutter buttons 103 and 105 and generates a second shutter switch signal SW2. The system control unit 50 starts operations of a series of photographing processing from reading of signals from the imaging unit 22 to recording of images in the recording medium 200 as image files according to the second shutter switch signal SW2.
A power supply control unit 83 includes a battery detection circuit, a DC-DC converter, a switch circuit for switching blocks to be energized, and the like and detects attachment of a battery, the type of a battery, and a remaining battery level. Moreover, the power supply control unit 83 controls the DC-DC converter on the basis of the detection results and the instruction from the system control unit 50 and supplies a necessary voltage to each unit including the recording medium 200 for a necessary period. The power supply switch 72 is a switch for switching the power of the camera 100 on and off.
A power supply unit 30 includes a primary battery such as an alkaline battery or a lithium battery, a secondary battery such as a NiCd battery, a NiMH battery, or a Li battery, an AC adapter, and the like. A recording medium I/F 18 is an interface to the recording medium 200 such as a memory card or a hard disk. The recording medium 200 is a recording medium such as a memory card for recording captured images and is composed of a semiconductor memory, a magnetic disk, or the like.
As described above, the camera 100 has the touch panel 70a capable of detecting a touch on the display unit 28 (the touch panel 70a) as one kind of the operating unit 70. The touch panel 70a and the display unit 28 may be configured integrally. For example, the touch panel 70a has light transmittance such that the display of the display unit 28 is not disturbed and is attached to an upper layer of the display surface of the display unit 28. The input coordinates on the touch panel 70a are correlated with the display coordinates on the display unit 28. In this way, it is possible to configure a GUI (graphical user interface) as if a user can directly operate the screen displayed on the display unit 28. The system control unit 50 can detect the following touch operations or states on the touch panel 70a.
When a touch-down is detected, a touch-on is also detected at the same time. After a touch-down is detected, a touch-on is usually detected continuously unless a touch-up is detected. A touch-on is also detected when a touch-move is detected. Even if a touch-on is detected, a touch-move is not detected unless a touch position is moved. A touch-off is detected after a touch-up of all fingers or pens being in touch with the touch panel is detected.
These operations and states and the positional coordinates at which a finger or a pen is in touch with the touch panel 70a are notified to the system control unit 50 via an internal bus. The system control unit 50 determines which operation has been performed on the touch panel 70a on the basis of the notified information. As for a touch-move, the moving direction of the finger or pen moving on the touch panel 70a can be determined for each of the vertical and horizontal components on the touch panel 70a on the basis of changes in the positional coordinates. When a touch-down, a certain amount of a touch-move, and a touch-up are sequentially performed on the touch panel 70a, it is assumed that a "stroke" is drawn. An operation of quickly drawing a stroke is referred to as a "flick". A flick is an operation of quickly moving a finger over a certain distance while touching the touch panel 70a and then separating the finger as it is. In other words, a flick is an operation of quickly tracing on the touch panel 70a as if a finger flicks on the touch panel 70a. When a touch-move at a speed of at least a predetermined speed over at least a predetermined distance is detected and a touch-up is subsequently detected as it is, it can be determined that a flick has been performed. When a touch-move at a speed lower than the predetermined speed over at least the predetermined distance is detected, it can be determined that a drag has been performed. The touch panel 70a may be any one of various types of touch panels such as a resistance film type, a capacitance type, a surface acoustic wave type, an infrared type, an electromagnetic induction type, an image recognition type, or an optical sensor type. Although there are a type in which a touch is detected when a finger or a pen comes into contact with the touch panel and a type in which a touch is detected when a finger or a pen comes close to the touch panel, either type may be used.
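As an illustration of the flick/drag distinction described above, the following is a minimal sketch (not taken from the actual apparatus): a completed stroke is classified from its total distance and its average speed. The threshold values and the function name classify_stroke are assumptions introduced here, since the text does not specify concrete thresholds.

```python
# Hypothetical sketch of the flick/drag distinction described above: a stroke
# is classified from its total distance and its average speed against assumed
# thresholds (the concrete threshold values are not given in the text).
def classify_stroke(distance_px: float, duration_s: float,
                    min_distance_px: float = 30.0,
                    flick_speed_px_s: float = 300.0) -> str:
    """Classify a completed stroke as a flick, a drag, or neither."""
    if distance_px < min_distance_px:
        return "too short to be a stroke"
    speed = distance_px / max(duration_s, 1e-6)   # avoid division by zero
    return "flick" if speed >= flick_speed_px_s else "drag"

print(classify_stroke(120.0, 0.15))  # fast, long movement  -> "flick"
print(classify_stroke(120.0, 1.20))  # slow, long movement  -> "drag"
```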
The system control unit 50 can detect a touch operation or a push operation on the AF-ON buttons 1 and 2 according to a notification (output information) from the AF-ON buttons 1 and 2. The system control unit 50 calculates the direction of movement of a finger or the like on the AF-ON buttons 1 and 2 in eight directions of up, down, left, right, upper left, lower left, upper right, and lower right on the basis of the output information of the AF-ON buttons 1 and 2. Furthermore, the system control unit 50 calculates the amount (hereinafter referred to as a movement amount (x,y)) of the movement of a finger or the like on the AF-ON buttons 1 and 2 in two-dimensional directions of an x-axis direction and a y-axis direction on the basis of the output information of the AF-ON buttons 1 and 2. The system control unit 50 can further detect the following operations or states on the AF-ON buttons 1 and 2. The system control unit 50 calculates the moving direction or the movement amount (x,y) and detects the following operations and states individually for the AF-ON buttons 1 and 2.
When a touch-down is detected, a touch-on is also detected at the same time. After a touch-down is detected, a touch-on is usually detected continuously unless a touch-up is detected. A touch-on is also detected when a touch-move is detected. Even if a touch-on is detected, a touch-move is not detected if the movement amount (x,y) is 0. A touch-off is detected after a touch-up of all fingers or the like being in touch with the AF-ON button is detected.
The system control unit 50 determines which operation (touch operation) has been performed on the AF-ON buttons 1 and 2 on the basis of these operations and states, the moving direction, and the movement amount (x,y). As for a touch-move, movement in the eight directions of up, down, left, right, upper left, lower left, upper right, and lower right or in the two-dimensional directions of the x-axis direction and the y-axis direction is detected as the movement of a finger or the like on the AF-ON buttons 1 and 2. The system control unit 50 determines that a slide operation has been performed when movement in any one of the eight directions or movement in one or both of the two-dimensional directions of the x-axis direction and the y-axis direction is detected. In the present embodiment, it is assumed that the AF-ON buttons 1 and 2 are infrared touch sensors. However, the AF-ON button may be another type of touch sensor such as a surface acoustic wave type, a capacitance type, an electromagnetic induction type, an image recognition type, or an optical sensor type.
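The following is a minimal sketch of how a movement amount (x,y) could be quantized into the eight directions mentioned above. The dead-zone threshold, the screen-style coordinate convention (y growing downward), and the function name classify_eight_direction are assumptions made for this illustration and are not taken from the embodiment.

```python
import math

# Hypothetical sketch: quantize a movement amount (dx, dy) reported for the
# AF-ON button into one of the eight directions described above. Screen-style
# coordinates are assumed (x grows rightward, y grows downward).
DIRECTIONS = ["right", "lower right", "down", "lower left",
              "left", "upper left", "up", "upper right"]

def classify_eight_direction(dx: float, dy: float, dead_zone: float = 1.0):
    """Return one of the eight direction names, or None if the movement is
    too small to be treated as a touch-move."""
    if math.hypot(dx, dy) < dead_zone:
        return None                                   # movement amount ~ 0
    angle = math.degrees(math.atan2(dy, dx)) % 360.0  # 0 degrees = right
    sector = int(((angle + 22.5) % 360.0) // 45.0)    # 45-degree sectors
    return DIRECTIONS[sector]

print(classify_eight_direction(5.0, -4.8))  # -> "upper right"
print(classify_eight_direction(0.2, 0.1))   # -> None
```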
The structure of the AF-ON button 1 will be described below.
A cover 310 is an outer cover of the AF-ON button 1. A window 311 is a part of the outer cover of the AF-ON button 1 and transmits light projected from a light-projecting unit 312. The cover 310 projects further outward than the outer cover 301 of the camera 100 and can be pushed in. The light-projecting unit 312 is a light-emitting device such as a light-emitting diode that emits light toward the window 311. The light emitted from the light-projecting unit 312 is preferably light (infrared light) other than visible light. When a finger 300 touches the surface of the window 311 (an operation surface of the AF-ON button 1), the light emitted from the light-projecting unit 312 is reflected from the surface of the touching finger 300, and the reflected light is received (captured) by a light-receiving unit 313. The light-receiving unit 313 is an imaging sensor. On the basis of the image captured by the light-receiving unit 313, it is possible to detect whether an operating body (the finger 300) is not in contact with the operation surface of the AF-ON button 1, whether the operating body is touching the operation surface, and whether the touching operating body is moving while remaining in touch (that is, whether a slide operation is being performed). The cover 310 is supported by an elastic member 314 provided on a ground surface 316. When the finger 300 pushes the surface of the window 311 and the cover 310 is pushed in, the cover 310 touches a switch 315 for detecting a push. In this way, it is detected that the AF-ON button 1 is pushed.
A face detection function will be described. The system control unit 50 transmits a face detection target image to the image processing unit 24. Under the control of the system control unit 50, the image processing unit 24 applies a horizontal band-pass filter to the image data. Moreover, under the control of the system control unit 50, the image processing unit 24 applies a vertical band-pass filter to the processed image data. Edge components are detected from the image data by the horizontal and vertical band-pass filters.
After that, the system control unit 50 performs pattern matching on the detected edge components to extract candidate groups for eyes, a nose, a mouth, and ears. The system control unit 50 determines candidates that satisfy preset conditions (for example, the distance between two eyes, their inclination, and the like) among the extracted candidate group for eyes as an eye pair and narrows down the candidates to those having the eye pair as the eye candidate group. The system control unit 50 correlates the narrowed-down eye candidate group with the other parts (nose, mouth, and ears) forming a face corresponding thereto and passes the correlation results through a preset non-face condition filter to detect a face. The system control unit 50 outputs face information according to the face detection result and ends the processing. In this case, a character value such as the number of faces is stored in the system memory 52.
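As a rough illustration of the filtering step described above (and only an illustration: the actual band-pass filters used by the image processing unit 24 are not specified here), the following sketch applies a simple [-1, 0, 1] kernel along rows and along columns and combines the two responses into an edge-strength map; the kernel and the way the responses are combined are assumptions.

```python
# Illustrative sketch only (not the filters actually used by the image
# processing unit 24): a simple [-1, 0, 1] kernel applied along rows and
# along columns stands in for the horizontal and vertical band-pass filters,
# and the two responses are combined into an edge-strength map.
def filter_rows(img, kernel=(-1, 0, 1)):
    h, w, k = len(img), len(img[0]), len(kernel)
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w - k + 1):
            out[y][x + k // 2] = sum(kernel[i] * img[y][x + i] for i in range(k))
    return out

def filter_cols(img, kernel=(-1, 0, 1)):
    transposed = [list(col) for col in zip(*img)]
    return [list(col) for col in zip(*filter_rows(transposed, kernel))]

def edge_strength(img):
    hx, vy = filter_rows(img), filter_cols(img)
    return [[abs(a) + abs(b) for a, b in zip(r1, r2)] for r1, r2 in zip(hx, vy)]

image = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 0, 0]]
print(edge_strength(image))   # large values appear along the 0/9 boundaries
```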
In this way, it is possible to analyze a LV image or an image being played to extract a character value of the image and detect subject information (detect a specific subject). In the present embodiment, the face is taken as an example of a specific subject. However, other subjects such as eyes, hands, torso, a specific individual, a moving object, or a character may be detected and be selected as a target of AF or the like.
An AF frame selectable in an OVF mode will be described. In an OVF mode, a user can select and set in advance any one of a plurality of select modes including at least the following select modes from a setting menu as an AF frame select mode (a range-finding area select mode).
Here, in a move operation of moving an operating body (a finger or the like touching a touch operating member) or an operating member (a mouse or the like), the habit of moving the moving target generally differs from user to user. Due to factors such as the difference in the habit of each user, the direction of a move operation intended by a user may differ from the operating direction detected by the device, and a move operation in the direction intended by the user may not be performed. Such a problem also occurs in a slide operation on the AF-ON buttons 1 and 2. Therefore, in the present embodiment, calibration of the AF-ON buttons 1 and 2 is performed so that the habit of moving the thumb 403 in a slide operation on the AF-ON buttons 1 and 2 is taken into consideration. In this way, even when a user who wants to move the range-finding frame in the direction of the arrows 420 and 440 moves the touch position in a direction different from the direction of the arrow 402 due to his or her habit, it is possible to move the range-finding frame in the direction of the arrows 420 and 440.
In S501, the system control unit 50 determines whether a calibration mode for calibrating the AF-ON button 1 is set. The system control unit 50 waits until the calibration mode is set, and the flow proceeds to S502 when the calibration mode is set.
In S502, the system control unit 50 displays an operation instruction screen on at least one of the display unit 28 and the in-finder display portion 41 in order to prompt the user to perform a slide operation of drawing a linear trajectory in a specific direction.
In S503, the system control unit 50 determines whether a touch-down on the AF-ON button 1 is detected. The system control unit 50 waits until a touch-down is detected, and the flow proceeds to S504 when a touch-down is detected.
In S504, the system control unit 50 detects a character value of the finger performing a touch operation on the AF-ON button 1 using optical information (an image captured by the light-receiving unit 313) from the AF-ON button 1 and records the character value in the system memory 52. The character value is, for example, a fingerprint or the like, and is used for user authentication, that is, for identifying (determining) the user who performs a slide operation after calibration (a slide operation in a mode different from the calibration mode). The detection position of the character value is used as a touch position.
In S505, the system control unit 50 detects a moving distance and a moving direction of the finger (the touch position) in the slide operation from a change in the detection position of the character value detected in S504 and records the same in the system memory 52.
In S506, the system control unit 50 determines whether a touch-up from the AF-ON button 1 is detected. The processing of S504 and S505 is repeated until a touch-up is detected, and the flow proceeds to S507 when a touch-up is detected.
In S507, the system control unit 50 calculates an approximate line that approximates the trajectory of the performed slide operation from the moving direction detected in S505.
In S508, the system control unit 50 compares a designated line (a specific direction designated in the operation instruction screen displayed in S502) and the approximate line calculated in S507 to determine an inclination coefficient. The inclination coefficient is a value indicating an inclination, such as the coefficient a of the equation of a straight line (Y = aX + b) or the angular difference between the approximate line and the designated line (the angle formed between the approximate line and the designated line).
In S509, the system control unit 50 records the inclination coefficient calculated in S508 in the nonvolatile memory 56 as reference coordinate axis information for determining the direction of the slide operation and reference straight line information for determining whether the slide operation is in the specific direction designated in S502. Furthermore, the system control unit 50 records the character value detected in S504 in the nonvolatile memory 56 so that the user authentication can be performed. The character value recorded may be one or a plurality of items, and for example, a character value during a touch-on and a character value during a touch-up are recorded.
In S510, the system control unit 50 determines whether the inclination coefficient has been recorded for all directions (the four directions designated in the operation instruction screens 601 to 604). When the inclination coefficient has been recorded for all directions, the first calibration process ends; otherwise, the flow proceeds to S502. That is, the processing of S502 to S510 is repeated so that the inclination coefficient is determined while the operation instruction screen to be displayed is switched, and the first calibration process ends when the inclination coefficient has been recorded for all directions. The processing of S509 may be controlled so that only the character value acquired in the processing for some directions (for example, one direction) is recorded as the character value for user authentication, or it may not be so controlled.
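As a concrete illustration of S507 and S508, the following sketch fits a straight line to recorded touch positions by least squares and expresses the inclination coefficient as the angular difference from the designated direction. The coordinate convention, the helper names, and the example trajectory are assumptions made for this illustration, not the actual implementation.

```python
import math

# Minimal sketch (not the actual firmware) of S507-S508: fit a straight line
# to the recorded touch positions by least squares and express the inclination
# coefficient as the angle between that approximate line and the designated
# direction. The point data and the designated angle are illustrative.
def approximate_line(points):
    """Return slope a and intercept b of Y = aX + b fitted to (x, y) points."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    a = sxy / sxx if sxx else float("inf")   # vertical trajectory
    return a, my - a * mx

def inclination_coefficient(points, designated_angle_deg):
    """Angular difference between the approximate line and the designated line."""
    a, _ = approximate_line(points)
    return math.degrees(math.atan(a)) - designated_angle_deg

# A nominally "horizontal right" slide (designated angle 0 degrees) that the
# user habitually draws rising slightly: the coefficient comes out non-zero.
trajectory = [(0, 0.0), (10, 2.0), (20, 4.1), (30, 5.9)]
print(inclination_coefficient(trajectory, 0.0))   # roughly 11 degrees
```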
Here, it is assumed that the user performs, on the operation instruction screen 602, a slide operation of drawing a trajectory that is parallel to the horizontal left direction designated in the operation instruction screen 602. In this case, the angle (=0°) with respect to the horizontal left direction is recorded. As a result, when a slide operation of drawing a trajectory in the horizontal left direction (a direction opposite to the horizontal right direction 712) is performed in a first slide response process described later, movement (processing based on the horizontal left direction) of the range-finding frame in the horizontal left direction is executed. On the other hand, since the angle (α≠0°) of the approximate line 711 with respect to the horizontal right direction 712 is recorded, even when a slide operation of drawing a trajectory in the horizontal right direction 712 is performed, movement (processing based on the horizontal right direction 712) of the range-finding frame in the horizontal right direction 712 is not executed. For example, the range-finding frame is moved in a direction inclined at the angle α with respect to the horizontal right direction 712. The same applies when the user performs, on the display of the operation instruction screen 604, a slide operation of drawing a trajectory that is parallel to the vertical downward direction designated in the operation instruction screen 604. In the first slide response process, when a slide operation of drawing a trajectory in the vertical downward direction (a direction opposite to the vertical upward direction 722) is performed, the range-finding frame is moved in the vertical downward direction. On the other hand, since the angle (β≠0°) of the approximate line 721 with respect to the vertical upward direction 722 is recorded, even when a slide operation of drawing a trajectory in the vertical upward direction 722 is performed, the range-finding frame is not moved in the vertical upward direction 722.
Furthermore, it is assumed that the trajectory 710 (the approximate line 711) and the trajectory 720 (the approximate line 721) are not in a perpendicular relationship (that is, α≠β). In this case, reference coordinate axis information including a first axis (angle α) based on the trajectory 710 and a second axis (angle β) based on the trajectory 720, which are not perpendicular to each other, is recorded. Here, in the first slide response process described later, it is assumed that when a slide operation of drawing a trajectory inclined at a specific angle with respect to the first axis is performed, processing based on the direction inclined at the specific angle with respect to the first axis is executed. However, when a slide operation of drawing a trajectory inclined at the specific angle with respect to the second axis is performed, processing based on the direction inclined at the specific angle with respect to the second axis is not executed.
As an example, a case where α=40° and β=100° will be considered. Moreover, a case of performing processing based on the direction closest to the trajectory of a slide operation, among the axial directions parallel to the first and second axes and the non-axial directions that bisect the quadrants defined by the first and second axes, will be considered. In this case, the width (angle) of a first quadrant and a third quadrant defined by the first and second axes is 50°, and the width (angle) of a second quadrant and a fourth quadrant defined by the first and second axes is 130°. Moreover, the angle of the non-axial direction passing through the first and third quadrants with respect to the first axis is 25°, and the angle of the non-axial direction passing through the second and fourth quadrants with respect to the second axis is 65°. When a slide operation of drawing a trajectory inclined at 25° with respect to the first axis is performed, processing based on the non-axial direction (a direction inclined at 25° with respect to the first axis) passing through the first and third quadrants is executed. On the other hand, when a slide operation of drawing a trajectory inclined at 25° with respect to the second axis is performed, processing based on a direction inclined at 25° with respect to the first axis is not executed, but processing based on an axial direction parallel to the second axis is executed.
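The selection logic of this example can be sketched as follows, under the assumption (made here for concreteness, since the sign convention for β is not spelled out in this text) that the first axis lies at 40° and the second axis at -10° from the horizontal right direction, which reproduces the 50°/130° quadrant widths above: the candidate directions are the two senses of each axis plus the bisectors of the four quadrants, and the candidate closest to the trajectory angle is chosen.

```python
# Sketch of the nearest-direction selection in the example above. The axis
# angles (first axis at 40 degrees, second axis at -10 degrees from the
# horizontal right direction) are assumed here so that the 50/130 degree
# quadrant widths of the example are reproduced.
def nearest_direction(traj_angle, axis1, axis2):
    """Return, among the axial directions (both senses of each axis) and the
    bisectors of the four quadrants, the candidate angle (degrees) closest
    to the trajectory angle."""
    boundaries = sorted(a % 360 for a in (axis1, axis1 + 180, axis2, axis2 + 180))
    bisectors = []
    for i in range(4):
        lo = boundaries[i]
        hi = boundaries[(i + 1) % 4] + (360 if i == 3 else 0)
        bisectors.append(((lo + hi) / 2) % 360)
    candidates = boundaries + bisectors

    def circular_distance(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)

    return min(candidates, key=lambda c: circular_distance(c, traj_angle % 360))

print(nearest_direction(15, 40, -10))   # -> 15.0 (non-axial bisector direction)
print(nearest_direction(145, 40, -10))  # -> 170  (axial direction of the second axis)
```

In this sketch a trajectory at 15° lies 25° from the first axis and coincides with the bisector of the narrow quadrants, while a trajectory at 145° lies 25° from the second axis and snaps to the axial direction parallel to the second axis, matching the behavior described in the example.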
In S801, the system control unit 50 determines whether a touch-down on the AF-ON button 1 is detected. The system control unit 50 waits until a touch-down is detected, and the flow proceeds to S802 when a touch-down is detected.
In S802, the system control unit 50 detects a character value of a finger performing a touch operation on the AF-ON button 1 using optical information (an image captured by the light-receiving unit 313) from the AF-ON button 1 and records the character value in the system memory 52.
In S803, the system control unit 50 detects a character value that matches the character value detected in S802 from among the plurality of character values recorded in the nonvolatile memory 56 in the first calibration process, thereby identifying the user performing the slide operation, and acquires the reference straight line information (reference coordinate axis information) associated with the identified user.
In S804, the system control unit 50 detects a change in the detection position of the character value detected in S802 (that is, a trajectory drawn by the slide operation) as a user input.
In S805, the system control unit 50 reads the user input detected in S804 into a user-specific coordinate system (a coordinate system corresponding to the information (reference straight line information; reference coordinate axis information) acquired in S803) obtained in the first calibration process.
In S806, the system control unit 50 corrects the user input detected in S804 by correcting the coordinate system so that an inclination of the axis of the user-specific coordinate system (the coordinate system corresponding to the information (reference straight line information; reference coordinate axis information) acquired in S803) is eliminated.
In S807, the system control unit 50 moves the range-finding frame in the direction of the user input corrected in S806.
The processing of S805 may be regarded as comparison between the reference coordinate axis information (reference straight line information) and the trajectory of the slide operation performed. The processing of S806 may be regarded as processing of determining the direction of the slide operation performed on the basis of the comparison in S805. The processing of S807 may also be regarded as execution of processing based on the direction determined in S806.
In S808, the system control unit 50 determines whether a touch-up from the AF-ON button 1 is detected. The processing of S802 to S808 is repeated until a touch-up is detected, and the first slide response process ends when a touch-up is detected.
As described above, according to the first slide response process subsequent to the first calibration process, it is possible to move the range-finding frame in a direction closer to the user's intention by taking into consideration the habit of the user drawing a trajectory inclined with respect to the intended direction. It is possible to eliminate the dissatisfaction of the user resulting from movement of the range-finding frame in a direction inclined with respect to the direction intended by the user.
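A minimal sketch of the correction in S805 and S806 follows, assuming that the correction amounts to rotating each displacement of the detected user input by the negative of the recorded inclination coefficient; the function name and the coordinate convention are illustrative assumptions, not the patented implementation.

```python
import math

# Minimal sketch of S805-S806 (an assumption, not the patented implementation):
# the recorded inclination coefficient alpha describes how far this user's
# "straight" stroke is habitually tilted, so each displacement of a new stroke
# is rotated by -alpha before the range-finding frame is moved.
def correct_inclination(dx, dy, alpha_deg):
    """Rotate the displacement (dx, dy) by -alpha_deg degrees."""
    a = math.radians(-alpha_deg)
    return (dx * math.cos(a) - dy * math.sin(a),
            dx * math.sin(a) + dy * math.cos(a))

# A user whose "horizontal" strokes rise by about 11 degrees: the corrected
# displacement becomes almost purely horizontal.
print(correct_inclination(10.0, 2.0, 11.3))   # -> approximately (10.2, 0.0)
```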
In S1001, the system control unit 50 determines whether a calibration mode for calibrating the AF-ON button 1 is set. The system control unit 50 waits until the calibration mode is set, and the flow proceeds to S1002 when the calibration mode is set.
In S1002, the system control unit 50 displays an operation instruction screen on at least one of the display unit 28 and the in-finder display portion 41 in order to prompt the user to perform a slide operation of drawing a linear trajectory in a specific direction, similarly to S502 of the first calibration process.
In S1003, the system control unit 50 determines whether a touch-down on the AF-ON button 1 is detected. The system control unit 50 waits until a touch-down is detected, and the flow proceeds to S1004 when a touch-down is detected.
In S1004, the system control unit 50 detects a character value (a fingerprint or the like) of a finger performing a touch operation on the AF-ON button 1 using optical information (an image captured by the light-receiving unit 313) from the AF-ON button 1 and records the character value in the system memory 52.
In S1005, the system control unit 50 detects a moving distance and a moving direction of the finger (the touch position) in the slide operation from a change in the detection position of the character value detected in S1004 and records the same in the system memory 52.
In S1006, the system control unit 50 determines whether a touch-up from the AF-ON button 1 is detected. The processing of S1004 and S1005 is repeated until a touch-up is detected, and the flow proceeds to S1007 when a touch-up is detected.
In S1007, the system control unit 50 calculates an approximate parabola that approximates the trajectory of the performed slide operation from the moving direction detected in S1005. A curve different from the parabola may be calculated as a curve that approximates the trajectory of the slide operation.
In S1008, the system control unit 50 compares a designated line (a specific direction designated in the operation instruction screen displayed in S1002) and the approximate parabola calculated in S1007 to determine an unevenness coefficient. The unevenness coefficient is a value indicating the degree of curvature (distortion) of the approximate parabola with respect to the designated line, such as the coefficient a of the equation of a parabola (Y = aX² + b).
In S1009, the system control unit 50 records the unevenness coefficient calculated in S1008 in the nonvolatile memory 56 as reference curve information for determining whether the slide operation is in the specific direction designated in S1002. Furthermore, the system control unit 50 records the character value detected in S1004 in the nonvolatile memory 56 so that the user authentication can be performed. The character value recorded may be one or a plurality of items, and for example, a character value during a touch-on and a character value during a touch-up are recorded.
In S1010, the system control unit 50 determines whether the unevenness coefficient has been recorded for all directions (the four directions designated in the operation instruction screens 601 to 604). When the unevenness coefficient has been recorded for all directions, the second calibration process ends; otherwise, the flow proceeds to S1002. That is, the processing of S1002 to S1010 is repeated so that the unevenness coefficient is determined while the operation instruction screen to be displayed is switched, and the second calibration process ends when the unevenness coefficient has been recorded for all directions. The processing of S1009 may be controlled so that only the character value acquired in the processing for some directions (for example, one direction) is recorded as the character value for user authentication, or it may not be so controlled.
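As a concrete illustration of S1007 and S1008 under the stated model Y = aX² + b, the following sketch fits the curvature (unevenness) coefficient to recorded touch positions by least squares; taking X along the designated direction and Y as the deviation from it, as well as the example data, are assumptions made for this illustration.

```python
# Minimal sketch of S1007-S1008 under the stated model Y = a*X**2 + b: fit the
# curvature (unevenness) coefficient a to the recorded touch positions by least
# squares. Taking X along the designated direction and Y as the deviation from
# it is an assumption made for this illustration.
def approximate_parabola(points):
    """Return a and b of Y = a*X**2 + b fitted to (x, y) points."""
    n = len(points)
    mu = sum(x * x for x, _ in points) / n                  # mean of X**2
    my = sum(y for _, y in points) / n
    suu = sum((x * x - mu) ** 2 for x, _ in points)
    suy = sum((x * x - mu) * (y - my) for x, y in points)
    a = suy / suu if suu else 0.0
    return a, my - a * mu

# A nominally horizontal stroke whose deviation grows roughly quadratically.
trajectory = [(0, 0.0), (10, 1.9), (20, 4.2), (30, 8.8)]
print(approximate_parabola(trajectory))   # a is about 0.009, i.e. curved
```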
Here, it is assumed that the user performs, on the operation instruction screen 602, a slide operation of drawing a trajectory that is parallel to the horizontal left direction designated in the operation instruction screen 602. In this case, an unevenness coefficient indicating that the trajectory is not curved is recorded as the unevenness coefficient corresponding to the horizontal left direction. As a result, when a slide operation of drawing a trajectory in the horizontal left direction (a direction opposite to the horizontal right direction 1112) is performed in a second slide response process described later, movement (processing based on the horizontal left direction) of the range-finding frame in the horizontal left direction is executed. On the other hand, the unevenness coefficient λ indicating that the trajectory is curved is recorded as the unevenness coefficient corresponding to the horizontal right direction 1112. Therefore, even when a slide operation of drawing a trajectory in the horizontal right direction 1112 is performed, movement (processing based on the horizontal right direction 1112) of the range-finding frame in the horizontal right direction 1112 is not executed. For example, the range-finding frame is moved so as to be curved with respect to the horizontal right direction 1112. The same applies when the user performs, on the display of the operation instruction screen 604, a slide operation of drawing a trajectory that is parallel to the vertical downward direction designated in the operation instruction screen 604. In the second slide response process, when a slide operation of drawing a trajectory in the vertical downward direction (a direction opposite to the vertical upward direction 1122) is performed, the range-finding frame is moved in the vertical downward direction. On the other hand, since the unevenness coefficient Ω (with curvature) corresponding to the vertical upward direction 1122 is recorded, even when a slide operation of drawing a trajectory in the vertical upward direction 1122 is performed, the range-finding frame is not moved in the vertical upward direction 1122.
In S1201, the system control unit 50 determines whether a touch-down on the AF-ON button 1 is detected. The system control unit 50 waits until a touch-down is detected, and the flow proceeds to S1202 when a touch-down is detected.
In S1202, the system control unit 50 detects a character value of a finger performing a touch operation on the AF-ON button 1 using optical information (an image captured by the light-receiving unit 313) from the AF-ON button 1 and records the character value in the system memory 52.
In S1203, the system control unit 50 detects a character value that matches the character value detected in S1202 from among the plurality of character values recorded in the nonvolatile memory 56 in the second calibration process, thereby identifying the user performing the slide operation, and acquires the reference curve information associated with the identified user.
In S1204, the system control unit 50 detects a change in the detection position of the character value detected in S1202 (that is, a trajectory drawn by the slide operation) as a user input.
In S1205, the system control unit 50 reads the user input detected in S1204 into a user-specific coordinate system (a coordinate system corresponding to the information (reference curve information) acquired in S1203) obtained in the second calibration process.
In S1206, the system control unit 50 corrects the user input detected in S1204 by correcting the coordinate system so that distortion of the axis of the user-specific coordinate system (the coordinate system corresponding to the information (reference curve information) acquired in S1203) is eliminated.
In S1207, the system control unit 50 moves the range-finding frame in the direction of the user input corrected in S1206.
The processing of S1205 may be regarded as comparison between the reference curve information and the trajectory of the slide operation performed. The processing of S1206 may be regarded as processing of determining the direction of the slide operation performed on the basis of the comparison in S1205. The processing of S1207 may also be regarded as execution of processing based on the direction determined in S1206.
In S1208, the system control unit 50 determines whether a touch-up from the AF-ON button 1 is detected. The processing of S1202 to S1208 is repeated until a touch-up is detected, and the second slide response process ends when a touch-up is detected.
As described above, according to the second slide response process subsequent to the second calibration process, it is possible to move the range-finding frame in a direction closer to the user's intention by taking the habit of the user drawing a trajectory curved (distorted) with respect to the intended direction into consideration. It is possible to eliminate the dissatisfaction of the user resulting from the movement of the range-finding frame to be curved with respect to the direction intended by the user.
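A minimal sketch of the correction in S1205 and S1206 follows, assuming that eliminating the distortion amounts to subtracting the recorded curve aX² + b from each sampled point of the trajectory; the helper name, the offset b, and the example values are illustrative assumptions continuing the earlier sketch.

```python
# Minimal sketch of S1205-S1206 (an assumption, not the patented
# implementation), continuing the earlier example: with the recorded unevenness
# coefficient a (and offset b) for this user and direction, the curve
# a*x**2 + b is subtracted from each sampled point, so the habitually curved
# stroke is handled as a straight one.
def remove_distortion(points, a, b=0.0):
    """Subtract the reference curve from every (x, y) sample."""
    return [(x, y - (a * x * x + b)) for x, y in points]

trajectory = [(0, 0.0), (10, 1.9), (20, 4.2), (30, 8.8)]
print(remove_distortion(trajectory, a=0.0093, b=0.46))
# -> the remaining deviations stay within roughly +/-0.5: an almost straight stroke
```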
Although the correction (conversion) is applied to the trajectory of the slide operation in order to determine the direction of the slide operation, the movement amount of the slide operation is preferably determined without performing the correction (conversion). For example, it is preferable that the length of the trajectory of the slide operation is used as the movement amount of the slide operation as it is. By doing so, it is possible to suppress a change in the moving speed of the range-finding frame resulting from the correction (elimination of distortion).
A calibration process in which the first and second calibration processes are combined may be executed. For example, after a touch-up is detected, an approximate line and an approximate parabola that approximate the trajectory of a slide operation may be determined, and an inclination coefficient based on the approximate line and an unevenness coefficient based on the approximate parabola may be recorded. Similarly, a slide response process in which the first and second slide response processes are combined may be executed. For example, after the distortion of the trajectory of the slide operation is eliminated (reduced) on the basis of the unevenness coefficient, the inclination may be corrected on the basis of the inclination coefficient.
The present invention is not limited to a slide operation on an AF-ON button but may be applied to an operation on another operating member. For example, a slide operation on a touch panel or a touch pad may be detected, and processing similar to the above-described processing (control) may be performed with respect to the slide operation. Movement of an operating member itself such as a mouse or a motion controller may be detected in a wired or wireless manner, and the processing may be performed with respect to the movement. Movement of the hand (an operating body) of a user such as a spatial gesture may be detected in a non-contact manner, and the processing may be performed with respect to the movement. The present invention can be applied to an arbitrary electronic apparatus as long as the electronic apparatus detects movement of an operating body (a finger, a pen, a hand, or the like) or movement of an operating member (a mouse, a motion controller, or the like) and executes processing accordingly.
While the present invention has been described in detail on the basis of preferred embodiments thereof, the present invention is not limited to these specific embodiments and various modes without departing from the scope of the invention are also included in the present invention. Furthermore, the embodiments described above simply represent an exemplary embodiment of the present invention and the embodiments may also be combined with each other.
Various controls described to be performed by the system control unit 50 may be performed by one piece of hardware or a plurality of pieces of hardware (for example, a plurality of processors or circuits) may control the entire apparatus by sharing the processing.
While an example in which the present invention is applied to an imaging apparatus has been described in the above-described embodiment, the present invention is not limited to this example and can be applied to any electronic apparatus having an operation detection function of detecting a move operation. For example, the present invention can be applied to a personal computer, a PDA, a mobile phone terminal, a mobile image viewer, a printer apparatus, a digital photo frame, a music player, a game device, an electronic book reader, and the like. Moreover, the present invention can be applied to a video player, a display apparatus (including a projection apparatus), a tablet terminal, a smartphone, an AI speaker, a home electrical appliance, a vehicle-mounted apparatus, and the like.
The present invention is not limited to an imaging apparatus body but can be applied to a control device that communicates with an imaging apparatus (including a network camera) via cable or wireless communication and controls the imaging apparatus remotely. Examples of the device that controls the imaging apparatus remotely include a smartphone, a tablet PC, a desktop PC, and the like. The imaging apparatus can be controlled remotely by notifying a command for making various operations and settings from the control device to the imaging apparatus on the basis of an operation performed on the control device or processing performed on the control device. Moreover, a live-view image captured by the imaging apparatus may be received via cable or wireless communication so that the image can be displayed on the control device.
According to the present invention, it is possible to cause an electronic apparatus that executes processing corresponding to a direction of a move operation to execute processing closer to the user's intention.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2019-220335, filed on Dec. 5, 2019, which is hereby incorporated by reference herein in its entirety.