The disclosure of Japanese Patent Application No. 2012-1114 is incorporated herein by reference.
1. Field of the Invention
The present invention relates to electronic equipment, and more specifically to electronic equipment provided with a display, for example.
2. Description of the Related Art
An example of a related art is disclosed in Japanese Patent Application Laying-Open No. 2003-196017 [G06F 3/033, G06F 3/00, G06T 1/00, G06T 7/60] (Document 1) laid-open on Jul. 11, 2003. A data input device of this Document 1 displays an input data group of a menu or a keyboard on a display, images an eye portion of a user of the device with a camera, determines a direction of a line of sight of the user from the imaged image, determines input data located in the direction of the line of sight, and outputs the determined input data to external equipment, etc.
Another example of a related art is disclosed in Japanese Patent Application Laying-Open No. H9-212287 [G06F 3/033] (Document 2) laid-open on Aug. 15, 1997. An eye point input device of this Document 2 makes an inquiry about a sign such as a character, numeral or symbol based on positional data of an eye of an operator sent from a camera, detects the sign onto which the operator puts his/her eye point, and when it is determined that the detected sign is fixed for a predetermined time period set in advance, outputs the sign to an input circuit.
A further example of a related art is disclosed in Japanese Patent Application Laying-Open No. 2003-150306 [G06F 3/033] (Document 3) laid-open on May 23, 2003. An information display device of this Document 3 presumes a gaze point based on a direction of a line of sight when a user performs a selection by his/her line of sight, estimates predetermined information, a commodity, etc. based on the presumed direction of the line of sight, and displays the information, commodity, etc. that is the selection target.
A still further example of a related art is disclosed in Japanese Patent Application Laying-Open No. 2000-20196 [G06F 3/00, G06F 3/033] (Document 4) laid-open on Jan. 21, 2000. In an eye-controlled input device of this document 4, a part of a plurality of kinds of character groups is displayed in a character area, and a character is selected by an eye cursor indicating a position of a line of sight of an observer and the character is input.
Yet another example of a related art is disclosed in Japanese Patent Application Laying-Open No. H9-204260 [G06F 3/033] (Document 5) laid-open on Aug. 5, 1997. A data input device of this Document 5 detects a position of a pupil viewing a part of a display, calculates coordinates on the display corresponding to the detected position, and displays a cursor at the position of the coordinates on the display.
In the above-described eye-controlled input devices, there is a tendency for a device to become larger in proportion to the distance between a sensor and an eyeball. Accordingly, on the assumption that such an eye-controlled input device is incorporated in relatively small electronic equipment such as a mobile terminal, the related arts respectively described in Documents 1 to 4 are not adequate because the device is relatively large. Furthermore, in the related art described in Document 5, a cursor displayed on a display is moved based on an imaged image of a pupil of a user who brings his/her eye close to a window such as a finder, and therefore, a line of sight can be detected only in a restricted usage situation in which the user watches the display through the window. That is, in a case that an eye and a device are separated from each other, there is a possibility that a line of sight cannot be correctly detected.
Therefore, it is a primary object of the present invention to provide novel electronic equipment.
Another object of the present invention is to provide electronic equipment capable of increasing a recognition rate of an eye-controlled input.
An aspect according to the present invention is electronic equipment provided with a display portion, comprising: an infrared light detecting portion which is arranged above the display portion and detects an infrared light; and an infrared light output portion which is arranged below the display portion.
The above described objects and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
Referring to
For example, the user can input a telephone number by making a touch operation on the touch panel 16 with respect to a dial key (not shown) displayed on the display 14, and start a telephone conversation by operating the call key 22. If and when the end key 24 is operated, the telephone conversation can be ended. In addition, by long-depressing the end key 24, it is possible to turn on/off the power of the mobile phone 10.
If the menu key 26 is operated, a menu screen is displayed on the display 14, and in such a state, by making a touch operation on the touch panel 16 with respect to a software key, a menu icon (both not shown) or the like displayed on the display 14, it is possible to select a menu and to determine such a selection.
In addition, it is pointed out in advance that in this embodiment shown, a description is made of a mobile phone such as a smartphone as an example of electronic equipment, but the present invention is applicable to various kinds of electronic equipment provided with a display device. Arbitrary mobile terminals such as a feature phone, a tablet terminal, a PDA, etc. are examples of other electronic equipment.
Referring to
The processor 40, which may be called a computer or a CPU, is in charge of overall control of the mobile phone 10. An RTC 40a is included in the processor 40, by which a time (including year, month and day) is measured. All or a part of a program set in advance in the flash memory 54 is, in use, developed or loaded into the RAM 56, and the processor 40 performs various kinds of processing in accordance with the program developed in the RAM 56. In addition, the RAM 56 is further used as a working area or buffer area for the processor 40.
The input device 50 includes the hardware keys (22, 24, 26) shown in
The wireless communication circuit 42 is a circuit for transmitting and receiving a radio wave for a telephone conversation, a mail, etc. via an antenna 44. In this embodiment, the wireless communication circuit 42 is a circuit for performing a wireless communication with a CDMA system. For example, if the user designates an outgoing call (telephone call) using the input device 50, the wireless communication circuit 42 performs telephone call processing under instructions from the processor 40 and outputs a telephone call signal via the antenna 44. The telephone call signal is transmitted to a telephone at the other end of the line through a base station and a communication network. Then, incoming call processing is performed in the telephone at the other end of the line, and when a communication-capable state is established, the processor 40 performs telephone conversation processing.
Specifically describing a normal telephonic communication processing, a modulated sound signal sent from a telephone at the other end of the line is received by the antenna 44. The modulated sound signal received is subjected to demodulation processing and decode processing by the wireless communication circuit 42. A received sound signal obtained through such processing is converted into a sound signal by the D/A converter 48 to be output from the speaker 18. On the other hand, a sending sound signal taken-in through the microphone 20 is converted into sound data by the A/D converter 46 to be applied to the processor 40. The sound data is subjected to an encode processing and a modulation processing by the wireless communication circuit 42 under instructions by the processor 40 to be output via the antenna 44. Therefore, the modulated sound signal is transmitted to the telephone at the other end of the line via the base station and the communication network.
When a telephone call signal from a telephone at the other end of the line is received by the antenna 44, the wireless communication circuit 42 notifies the processor 40 of the incoming call. In response thereto, the processor 40 displays on the display 14 sender information (telephone number and so on) described in the incoming call notification by controlling the display driver 52. In addition, the processor 40 outputs a ringtone (which may also be called a ringtone melody or ringtone voice) from the speaker 18. In other words, incoming call operations are performed.
Then, if the user performs an answering operation by using the call key 22 (
If the telephone communication ending operation is performed by the end key 24 (
In addition, the processor 40 adjusts, in response to an operation for adjusting a volume by the user, a sound volume of the sound output from the speaker 18 by controlling an amplification factor of the amplifier connected to the D/A converter 48.
The display driver 52 controls displaying on the display 14, which is connected to the display driver 52, under instructions from the processor 40. In addition, the display driver 52 includes a video memory temporarily storing image data to be displayed. The display 14 is provided with a backlight which includes a light source of an LED or the like, for example, and the display driver 52 controls brightness and light-on/-off of the backlight according to the instructions from the processor 40.
The touch panel 16 shown in
In the embodiment, the touch panel 16 is of an electrostatic capacitance system that detects a change of an electrostatic capacitance between electrodes, which occurs when an object such as a finger comes close to a surface of the touch panel 16, and it is detected that one or more fingers are brought into contact with the touch panel 16, for example. The touch panel control circuit 58 functions as a detecting portion for detecting a touch operation, and, more specifically, detects a touch operation within a touch-effective range of the touch panel 16, and outputs touch coordinates data indicative of a position of the touch operation to the processor 40.
In addition, as a detection system of the touch panel 16, a surface-type electrostatic capacitance system may be adopted, or a resistance film system, an ultrasonic system, an infrared ray system, an electromagnetic induction system or the like may be adopted. Furthermore, a touch operation is not limited to an operation by a finger, and may be performed by a touch pen.
An LED driver 60 is connected with an infrared LED 32 shown in
To an imaged image processing circuit 62, an infrared camera 30 shown in
In addition, the above-described wireless communication circuit 42, A/D converter 46 and D/A converter 48 may be included in the processor 40.
In the mobile phone 10 having such a structure, instead of a key operation or a touch operation, it is possible to perform an input or operation by a line of sight (hereinafter sometimes called an “eye-controlled operation” or “eye-controlled input”). In the following, examples of the eye-controlled operation will be described with reference to the drawings. Although a method of detecting a gaze area for the eye-controlled operation will be described in detail later, by the eye-controlled operation, predetermined processing that is set in correspondence to a predetermined region (hereinafter sometimes called an “operating region”) designated by a point (a gaze point) at which the line of sight and the displaying plane of the display 14 intersect is performed.
As the predetermined processing, predetermined information is input, a predetermined action (operation) is performed, or a predetermined application is activated, for example. A button image capable of being designated or turned-on by the eye-controlled operation, or a displaying region of a reduced image such as an icon or thumbnail, corresponds to the operating region; however, there is also a case in which an operating region is set in an area where no such image is displayed. Furthermore, in this embodiment shown, an area including the gaze point (a “divided area” described later) is determined as a gaze area, and it is determined that an operating region overlapping with the gaze area or included in the gaze area is designated by the eye-controlled operation. Therefore, the position and size at which a reduced image such as a button image, icon or thumbnail designated or turned-on by the eye-controlled operation is displayed, and the position and size of an operating region that is set without relationship to such an image, are determined by taking the divided areas into account. For example, it is configured not to display a plurality of reduced images in the same divided area, or not to set a plurality of operating regions in the same divided area.
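The correspondence between a gaze area and an operating region can be pictured with a short sketch such as the following. This is a minimal illustration only, not the actual implementation of the embodiment; the class names, the rectangle fields and the example sizes are assumptions.

```python
# Minimal sketch: an operating region is associated with a rectangle on the
# display, and a gaze area (one divided area) designates every operating
# region it overlaps with.  All names and sizes here are illustrative.
from dataclasses import dataclass

@dataclass
class Rect:
    x: int   # left edge in display coordinates (pixels)
    y: int   # top edge
    w: int   # width
    h: int   # height

    def overlaps(self, other: "Rect") -> bool:
        # Axis-aligned rectangle overlap test.
        return (self.x < other.x + other.w and other.x < self.x + self.w and
                self.y < other.y + other.h and other.y < self.y + self.h)

@dataclass
class OperatingRegion:
    rect: Rect      # position and size of the region on the screen
    action: str     # identifier of the operation/application assigned to it

def regions_designated(gaze_area: Rect, regions: list[OperatingRegion]) -> list[OperatingRegion]:
    """Return the operating regions overlapped with (or contained in) the gaze area."""
    return [r for r in regions if gaze_area.overlaps(r.rect)]

# Usage example: one divided area of an assumed size and two operating regions.
if __name__ == "__main__":
    gaze = Rect(0, 0, 120, 160)
    regions = [OperatingRegion(Rect(10, 10, 100, 100), "icon_schedule"),
               OperatingRegion(Rect(200, 0, 100, 100), "icon_mail")]
    print([r.action for r in regions_designated(gaze, regions)])  # -> ['icon_schedule']
```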
As shown in
In the lock screen 100 shown in
In a case that a 4-digit numeral “1460” is set as the secret code number, for example, if the line of sight is moved as shown by an arrow mark of a dotted line, it is determined that the button images 110 arranged on the moving path of the line of sight are operated in the order in which the line of sight moves. Therefore, in the example shown in
In a case of the eye-controlled operation, since the position on the screen designated by the line of sight changes continuously, a button image arranged between two button images is also operated (turned-on). Therefore, in this embodiment shown, even if a numeral not included in the secret code number is input by the eye-controlled operation, it is determined that the correct secret code number is input if the time period during which the secret code number is input by the eye-controlled operation is within a first predetermined time period (30 seconds, for example) and the alignment order of the numerals is identical.
Therefore, in a case that the numeral “145690” is input by the eye-controlled operation within the first predetermined time period, since the input numeral “145690” includes the numerals “1460” of the secret code number in the same order, it is determined that the correct secret code number is input. Then, the lock screen 100 is put out (non-displayed) and an arbitrary screen such as a standby screen comes to be displayed.
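The determination described above amounts to checking that the secret code number appears, in order, as a subsequence of the digits entered within the first predetermined time period. The following is a minimal sketch of such a check; the function names and the 30-second limit are illustrative assumptions, not the embodiment's actual code.

```python
import time

def contains_in_order(entered: str, secret: str) -> bool:
    """True if the digits of `secret` appear in `entered` in the same order;
    extra digits picked up while the line of sight passes over other keys
    are allowed, e.g. contains_in_order("145690", "1460") -> True."""
    remaining = iter(entered)
    # Each membership test consumes the iterator up to the matched digit,
    # so later digits must appear after earlier ones.
    return all(digit in remaining for digit in secret)

def lock_cancelled(entered: str, secret: str, start_time: float,
                   limit_s: float = 30.0) -> bool:
    """Succeeds only if the input was completed within the first predetermined
    time period and the digits of the secret code appear in order."""
    within_time = (time.monotonic() - start_time) <= limit_s
    return within_time and contains_in_order(entered, secret)

# Usage example.
print(contains_in_order("145690", "1460"))   # -> True  (same order, extra digits allowed)
print(contains_in_order("145690", "1406"))   # -> False (order differs)
```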
Furthermore, in the lock screen 100 shown in
Since a lock cancel is thus implemented by an eye-controlled operation, it is possible to perform the lock cancel by the eye-controlled operation even in a situation in which the mobile phone 10 can be held by one hand but both hands cannot be used. Furthermore, since the lock cancel can be performed by a line of sight, the operated button images and the order of operation cannot be known by other persons, and therefore, it is possible to increase security.
Furthermore, the user can select (execute) an application, select a menu or select an image through an eye-controlled operation.
In the application selecting screen 150 as shown in
At that time, in order to notify the user of the icon 160 that the user gazes at and of its gaze time, the processor 40 linearly or stepwise changes a background color of the icon 160 determined to be gazed at, in accordance with the length of the gaze time. For example, in a case that an icon 160 for a schedule function is gazed at as shown in
Thus, by changing the background color of an icon 160 according to the gaze time, it is possible to notify the user, through the displaying manner (image), of the gaze target and the gaze time (or the remaining time to be gazed), or the time until an application or function is started.
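As a sketch of this gradual change, the background color can be linearly interpolated so that it finishes changing exactly when the gaze time reaches the second predetermined time period. The concrete colors and the 2-second dwell below are assumed values for illustration only.

```python
def icon_background_color(gaze_time_s: float, dwell_s: float = 2.0,
                          start=(255, 255, 255), end=(0, 120, 255)):
    """Linearly interpolate the icon background color so that it reaches `end`
    exactly when the gaze time reaches the second predetermined time period."""
    t = min(max(gaze_time_s / dwell_s, 0.0), 1.0)   # progress 0.0 .. 1.0
    return tuple(round(s + (e - s) * t) for s, e in zip(start, end))

# Example: half of the dwell time has elapsed -> a color roughly halfway between.
print(icon_background_color(1.0))   # -> (128, 188, 255)
```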
Likewise, in a case that a plurality of button images (thumbnails) are displayed, if a desired button image is gazed at, a background color of the button image is changed, and if the gaze time reaches the second predetermined time period, an operation (action) set to the button image is performed.
In this embodiment shown, a background color is changed, but the change is not necessarily limited thereto. That is, various methods for changing the displaying manner of an icon can be adopted. For example, an icon being gazed at may be made larger and an icon not gazed at may be made smaller. Furthermore, an icon being gazed at may be displayed rotating. In addition, in a case that the size of an icon is changed, a maximum (largest) size of the icon is determined in advance in accordance with the second predetermined time period and stored in the RAM 56 so that the user can recognize the lapse of the second predetermined time period from the displaying manner (image). Likewise, in a case that an icon is rotated, the number of rotations of the icon is determined in advance in accordance with the second predetermined time period and stored in the RAM 56.
Furthermore, as a method for changing a color of an icon, a further method may be adopted. For example, an entire background color may be changed to a further color gradually, or a luminance of the background color may be changed gradually.
In addition, instead of changing the displaying manner of an icon, processing may be performed in which, outside the area where the gazed icon is displayed, the gaze time is indicated by a numeral, or an indicator having a bar whose length changes according to the gaze time is displayed.
As shown in
In this embodiment shown, in a case that the user reads the e-book, the user can turn pages by an eye-controlled operation. For example, as shown in
In the displaying area 206, a gaze time of the operating region 210 or the operating region 212 is indicated by displaying a bar having a color different from a background color. In the e-book displaying screen 200, if and when the gaze time of the operating region 210 or the operating region 212 reaches a third predetermined time period (1-3 seconds, for example), the page advancing or the page returning is performed. In addition, a length of the bar being displayed in the indicator (displaying area 206) is linearly or gradually changed according to the gaze time, and when the gaze time becomes coincident with the third predetermined time period, the bar reaches the right end of the displaying area 206.
Since the indicator is thus provided, the user can know the gaze time of the operating region 210 or the operating region 212 (or a remaining time that the user has to gaze at the operating region until an operation the user intends is performed), or a time until a page is turned, through a change in displaying manner (image).
In the above-described embodiment, the pages of the e-book are advanced or returned on a page-by-page basis, but the operation is not limited thereto. For example, further operating regions may be formed at an upper right portion and an upper left portion of the displaying area 204; if the operating region of the upper right portion is continuously gazed at for more than the third predetermined time period, the e-book is advanced to the last page or the next chapter, and if the operating region of the upper left portion is continuously gazed at for more than the third predetermined time period, the e-book is returned to the first page of the e-book, the first page of the current chapter or the first page of the previous chapter.
In such a case, when it is detected that the operating region is gazed at, or that the operating region is continuously gazed at for a predetermined time period, the page number of the destination page of the advancing or returning operation may be displayed on the display 14. The user can know the destination page or its page number from such a display.
Accordingly, in a case that the alarm screen 250 is being displayed, according to an eye-controlled operation by the user, if a time (gaze time) that a user gazes at the button image 260 reaches a fourth predetermined time period (1-3 seconds, for example), the button image 260 is turned-on. Then, the snooze function is turned-on, and therefore, an alarm is stopped once and as shown in
In a case that the alarm screen 250 is being displayed, through an eye-controlled operation by the user, if the gaze time of the button image 262 reaches the fourth predetermined time period, the button image 262 is turned-on. Accordingly, the alarm is stopped and as shown in
Since an operation such as stopping the alarm is thus performed by an eye-controlled operation, in a case that the alarm function of the mobile phone 10 is used as an alarm clock, the user must open his/her eyes, and therefore it is possible to suitably carry out the purpose of an alarm clock.
Hence, when a time (gaze time) that a user gazes at the button image 360 reaches a fifth predetermined time period (1-3 seconds, for example), the button image 360 is turned-on, and the mobile phone 10 answers the incoming call. That is, as described above, incoming call processing is performed to start normal telephone conversation processing. If the gaze time of the button image 362 exceeds the fifth predetermined time period, the button image 362 is turned-on, and the incoming call is stopped.
Since an operation for an incoming call can thus be performed through an eye-controlled operation, even in a situation in which the mobile phone 10 is held by one hand and the other hand is not usable, it is possible to answer the incoming call or stop the incoming call.
Furthermore, in a case that the browsing function is being performed, as shown in
Accordingly, if a time (gaze time) that the user gazes at a left end of the screen reaches a sixth predetermined time period (1-3 seconds, for example), the screen is scrolled in the rightward direction by a predetermined amount. If a time that the user gazes at a right end of the screen reaches the sixth predetermined time period, the screen is scrolled by a predetermined amount in the leftward direction. Furthermore, if a time that the user gazes at an upper end of the screen reaches the sixth predetermined time period, the screen is scrolled in the downward direction by a predetermined amount. If a time that the user gazes at a lower end of the screen reaches the sixth predetermined time period, the screen is scrolled by a predetermined amount in the upward direction.
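The mapping from the gazed edge to the scroll direction can be summarized in a short sketch such as the following. The generic edge names, the 50-pixel scroll step and the 2-second dwell are assumptions introduced only for illustration.

```python
# Sketch of the edge-gaze-to-scroll mapping.  A positive dx is taken here to
# mean that the displayed content moves rightward (an assumed sign convention).
SCROLL_STEP = 50  # pixels per trigger (assumed)

EDGE_TO_SCROLL = {
    "left_end":   ( SCROLL_STEP, 0),   # gaze at left end  -> scroll rightward
    "right_end":  (-SCROLL_STEP, 0),   # gaze at right end -> scroll leftward
    "top_end":    (0,  SCROLL_STEP),   # gaze at top end   -> scroll downward
    "bottom_end": (0, -SCROLL_STEP),   # gaze at bottom end-> scroll upward
}

def scroll_offset(gazed_edge: str, gaze_time_s: float, dwell_s: float = 2.0):
    """Return the scroll offset once the gaze time reaches the sixth
    predetermined time period; otherwise no scrolling is performed."""
    if gaze_time_s >= dwell_s and gazed_edge in EDGE_TO_SCROLL:
        return EDGE_TO_SCROLL[gazed_edge]
    return (0, 0)

# Usage example: gazing at the left end long enough scrolls the screen rightward.
print(scroll_offset("left_end", 2.5))   # -> (50, 0)
print(scroll_offset("left_end", 0.5))   # -> (0, 0)
```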
In addition, in an example shown in
Since a screen can be thus scrolled through an eye-controlled operation, even in a situation that a mobile phone 10 is held by one hand but another hand is unusable, a displaying content such as a map larger than a size of a screen of the display 14 can be confirmed.
In addition, as far as a situation that the scroll is performed by an eye-controlled operation is concerned, not limited to the browsing function, in a case that other applications or functions are to be performed, by setting the operating regions (410L, 410R, 410T and 410B) as shown in
Next, a detecting method of a gaze area by a line of sight according to the embodiment will be described. As shown in
As shown at an upper side of
Therefore, as also shown at an upper side of
In addition, a distance between the infrared camera 30 and the infrared LED 32 is determined based on a distance between the face of the user and the mobile phone 10 (a surface of the housing or the displaying plane of the display 14) at a time that the user uses the mobile phone 10, the size of the mobile phone 10 and so on.
In a case that the gaze area is to be detected, in an imaged image that the infrared camera 30 images, a pupil and a reflecting light of the infrared light are detected by the processor 40. A method for detecting a pupil and a reflecting light of the infrared light in the imaged image is well-known, and not essential for this embodiment shown, and therefore, a description thereof is omitted here.
When the processor 40 detects the pupil and the reflecting light in the imaged image, then, the processor 40 detects a direction of a line of sight (eye vector). Specifically, a vector from a position of the reflecting light to a position of the pupil in a two dimensional image imaged by the infrared camera 30 is detected. That is, a vector from a center A to a center B is an eye vector as shown in
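As a minimal sketch of this step, the eye vector can be obtained by subtracting the coordinates of the reflection center A from those of the pupil center B in the camera image. The coordinates used in the example are made-up values; detecting the two centers themselves is outside the scope of the sketch.

```python
def eye_vector(reflection_center, pupil_center):
    """Eye vector: from the center A of the corneal reflection of the infrared
    light to the center B of the pupil, both given as (x, y) pixel positions."""
    ax, ay = reflection_center
    bx, by = pupil_center
    return (bx - ax, by - ay)

# Example with made-up pixel coordinates.
print(eye_vector((102.0, 88.0), (110.0, 80.0)))   # -> (8.0, -8.0)
```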
As shown in
In addition, in a case that an eye-controlled operation is to be performed, a calibration, that is, calibrating processing, is first executed when the eye-controlled operation is started; however, it is not necessary to perform the calibration every time the eye-controlled operation is started. The calibration may be executed at a time that the use of the mobile phone 10 is started, may be performed in response to a designation by the user, or may be performed at every predetermined time.
An eye vector obtained in a case that the user gazes at each divided area is detected in advance through the calibration, and the detected eye vectors are stored, in correspondence to identification information of the divided areas, as reference eye vectors (reference vectors N (N=1, 2, - - -, 20)). In the calibration, for example, eye vectors are detected sequentially from the divided areas in the uppermost row, and within each row, sequentially from the left end divided area. Therefore, by detecting the reference vector N most closely related to the eye vector of the user that is detected when an eye-controlled operation is actually performed, the divided area stored in correspondence to that most approximate reference vector N is determined as the gaze area.
For example, when a calibration is started, first, a divided area (1) is set as a gaze area as shown in
In addition, in
In the calibration, the line of sight of the user is guided in the order shown by the identification information (numbers) of the divided areas (1)-(20), and, for example, a divided area to be gazed at is indicated by a predetermined color.
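A minimal sketch of this calibration step is given below, assuming 20 divided areas identified by the numbers 1-20 and a hypothetical helper detect_eye_vector() that returns the eye vector while the user gazes at the highlighted area; neither name is taken from the embodiment itself.

```python
def calibrate(divided_area_ids=range(1, 21), detect_eye_vector=None):
    """Guide the line of sight over every divided area and store the detected
    eye vector as the reference vector N for that area."""
    reference_vectors = {}
    for area_id in divided_area_ids:
        # In the device, the area to be gazed at is indicated by a predetermined
        # color; here we simply assume the user is gazing at `area_id` when the
        # (hypothetical) detector is called.
        reference_vectors[area_id] = detect_eye_vector(area_id)
    return reference_vectors

# Usage example with a stub detector that returns dummy vectors.
refs = calibrate(detect_eye_vector=lambda area_id: (area_id * 1.0, area_id * 0.5))
print(refs[1], refs[20])   # -> (1.0, 0.5) (20.0, 10.0)
```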
Then, when an eye-controlled operation is actually performed, the eye vector detected based on the imaged image (for convenience of description, called a “current vector” W) is compared with each of the reference vectors N, and the divided area stored in correspondence to the most approximate reference vector N is determined as the area that the user gazes at (the gaze area).
In addition, since a distance between the mobile phone 10 (infrared camera 30) and a face of the user (eye) is different in most cases at a time of the calibration and at a time that the eye-controlled operation is actually performed, the current vector W is scaled (enlarged or reduced).
In this embodiment shown, the current vector W is scaled based on a distance L0 between the left and right eyes at a time that the reference vector N is detected and a distance L1 between the left and right eyes at a time that the current vector W is detected. In addition, a distance L between both eyes is determined by a distance (horizontal distance) between a center position of the reflecting light of the infrared light on the left eye and a center position of the reflecting light of the infrared light on the right eye as shown in
As shown in
Specifically, the current vector W is scaled according to the following equation (1), where the X axis component of the current vector W is Wx, the Y axis component is Wy, the X axis component of the scaled current vector W is Wx1, and the Y axis component is Wy1.
(Wx1,Wy1)=(Wx*L1/L0,Wy*L1/L0) (1)
A length rN of the differential vector between each reference vector N and the scaled current vector W is calculated in accordance with the following equation (2). The reference vector N for which the length of the differential vector is shortest is determined to be most closely related to the scaled current vector W. Based on this determination result, the divided area corresponding to the reference vector N for which the length of the differential vector is shortest is determined as the current gaze area. Here, the reference vector N (N=1, 2, - - -, 20) is indicated by (XvN, YvN).
rN=√{(XvN−Wx1)²+(YvN−Wy1)²} (2)
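Equations (1) and (2) can be combined into a short sketch such as the following; the reference vectors and the distances L0 and L1 used in the example are made-up values, and the function names are illustrative only.

```python
import math

def scale_current_vector(w, l0, l1):
    """Equation (1): scale the current vector W = (Wx, Wy) by the ratio of the
    inter-eye distance L1 at detection time to the distance L0 at calibration."""
    wx, wy = w
    return (wx * l1 / l0, wy * l1 / l0)

def gaze_area(w_scaled, reference_vectors):
    """Equation (2): return the divided area whose reference vector N = (XvN, YvN)
    gives the shortest differential-vector length rN to the scaled current vector."""
    wx1, wy1 = w_scaled
    best_area, best_r = None, float("inf")
    for area_id, (xv, yv) in reference_vectors.items():
        r = math.hypot(xv - wx1, yv - wy1)   # rN = sqrt((XvN-Wx1)^2 + (YvN-Wy1)^2)
        if r < best_r:
            best_area, best_r = area_id, r
    return best_area

# Usage example with dummy reference vectors and made-up distances L0 and L1.
refs = {n: (n * 1.0, n * 0.5) for n in range(1, 21)}
w_scaled = scale_current_vector((7.2, 3.4), l0=60.0, l1=55.0)
print(gaze_area(w_scaled, refs))   # -> identifier of the nearest divided area
```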
The main processing program 502a is a program for processing a main routine of the mobile phone 10. The communicating program 502b is a program for performing telephone conversation processing with other telephones or for communicating with other telephones or computers via a communication network (a telephone network, internet). The gaze area detecting program 502c is a program for detecting a divided area on a displaying surface of the display 14, which is gazed at by a user of the mobile phone 10 as a gaze area.
The lock canceling program 502d is a program for canceling a lock state in accordance with an operation of the user in a case that the lock function is turned-on. In this embodiment shown, a case that the lock is canceled by the eye-controlled operation is described, but it is needless to say that the lock can also be canceled by a key operation or a touch operation. Likewise, in the application selecting program 502e, the e-book displaying program 502f and the browsing program 502g, not only the eye-controlled operation but also the key operation or the touch operation can be used.
The application selecting program 502e is a program for selecting (executing) an application or function installed in the mobile phone 10. The e-book displaying program 502f is a program for executing a processing related to an operation for an e-book (turning over of pages and so on). The browsing program 502g is a program for performing processing related to an operation for a browser (a displaying of a page of an internet site, a scrolling of a screen, a page movement and so on).
Although not shown, the program storage area 502 is further stored with an image producing processing program, an image displaying program, a sound outputting program, and a program for other application or function such as a memo pad, an address book, etc.
The data storage area 504 is provided with an input data buffer 504a. Furthermore, the data storage area 504 is stored with image data 504b, gaze area data 504c, operating region data 504d, reference vector data 504e and current vector data 504f. The data storage area 504 is further provided with a restriction timer 504g and a gaze timer 504h.
The input data buffer 504a is an area for temporarily storing key data and touch coordinates data according to a time series. The key data and the touch coordinates data are erased after use for processing by the processor 40.
The image data 504b is data for displaying various kinds of screens (100, 150, 200, 250, 300, 350, 400 and so on). The gaze area data 504c is data for identifying a divided area that the user currently gazes at, i.e., a gaze area.
The operating region data 504d is data of positions (coordinates) for defining operating regions for a current displayed screen and data indicative of a content for an operation (action) or function (application) being set in correspondence to the operating regions.
The reference vector data 504e is data for the eye vectors each corresponding to each of the divided areas, acquired by the calibration, i.e., the reference vectors N. The current vector data 504f is data for the eye vector currently detected, i.e., the aforementioned current vector W.
The restriction timer 504g is a timer for counting a restricted time during which the eye-controlled operation for lock canceling is performed. The gaze timer 504h is a timer for counting a time that the user gazes at the same divided area.
Although not shown, the data storage area 504 is further stored with other data and provided with other timers (counters), and provided with flags, which are all necessary for executing respective programs stored in the program storage area 502.
In a next step S3, a detection of a gaze area is started. That is, the processor 40 executes the gaze area detecting process (
In a succeeding step S7, the processor 40 acquires a gaze area detected by the gaze area detecting process with reference to the gaze area data 504c. In a next step S9, it is determined whether or not the acquired gaze area overlaps with the operating region. Here, the operating region data 504d is referred and it is determined whether or not the gaze area previously acquired overlaps with the operating region. If “NO” is determined in the step S9, that is, if the gaze area acquired does not overlap with the operating region, the process proceeds to a step S13. On the other hand, if “YES” is determined in the step S9, that is, if the acquired gaze area overlaps with the operating region, in a step S11, the button image corresponding to the operating region is stored, and then, the process proceeds to the step S13. That is, an input secret code number and so on are stored.
In the step S13, it is determined whether or not the security lock is to be canceled. That is, it is determined whether or not the input secret code number or operation procedure is correct. In addition, the secret code number or operation procedure set in advance is stored in the flash memory 54 and is referred to at that time. If “NO” is determined in the step S13, that is, if the lock is not to be canceled, in a step S15, it is determined whether or not a count value of the restriction timer 504g reaches or exceeds the first predetermined time period (10 seconds, for example). If “NO” is determined in the step S15, that is, if the first predetermined time period does not elapse, the process returns to the step S7. If “YES” is determined in the step S15, that is, if the first predetermined time period elapses, in a step S17, a failure of the lock canceling is notified, and then, the process returns to the step S1. Specifically, in the step S17, the processor 40 displays on the display 14 a message indicating that the lock canceling failed, or outputs from a speaker (the speaker 18 or another speaker) a sound (music, melody) indicating that the lock canceling failed, or performs both of them.
If “YES” is determined in the step S13, that is, if the lock is to be canceled, in a step S19, the lock screen 100 is put out (non-displayed) and the lock canceling process is terminated.
In a next step S33, the pupil is detected in the imaged image, and in a step S35, a center position of the pupil is determined. In a step S37, a reflecting light of the infrared ray (infrared light) in the imaged image is detected, and in a step S39, a center position of the reflecting light is determined. Then, in a step S41, a current vector W having a start point at the center position of the reflecting light and an end point at a center position of the pupil is calculated.
Subsequently, a distance L between both eyes is determined in a step S43. Here, a distance L1 between the center position of the reflecting light of the infrared light on the left eye and the center position of the reflecting light of the infrared light on the right eye is evaluated. In a next step S45, the current vector W is scaled (enlarged or reduced) in accordance with the aforementioned equation (1). Furthermore, in a step S47, a differential vector between the scaled current vector W and the reference vector N for each divided area is calculated in accordance with the equation (2). Then, in a step S49, the divided area corresponding to the reference vector N for which the length of the differential vector is minimum (shortest) is determined as the gaze area, and the gaze area detecting process is terminated. The identification information of the gaze area (divided area) determined in the step S49 is stored (renewed) as the gaze area data 504c.
In addition, once the gaze area detecting process is started, it is repeatedly executed until the processing of the predetermined function being performed is ended; however, the gaze area detecting process may be terminated by performing a predetermined key operation or touch operation. The same applies to the other cases described later in which the gaze area detecting process is executed.
If the lock function is set, the above-described lock canceling process is executed, and after the lock is canceled, the performing function determining process is started. If the lock function is not set, the above-described lock canceling process is not performed, and the performing function determining process is started when the user starts the use of the mobile phone 10.
In a next step S63, the processor 40 determines whether or not a current time is an alarm set time (an alarm time). That is, the processor 40 determines, by referring to a current time measured by the RTC 40a, whether or not the current time reaches the alarm time. In a case that the alarm is not set, the processor 40 determines that the current time is not the alarm time.
If “YES” is determined in the step S63, that is, if the current time is the alarm time, in a step S65, an alarming process (see
If “YES” is determined in the step S67, that is, if an input for performing the application selection exists, in a step S69, an application selecting process (see
If “YES” is determined in the step S71, that is, if the e-book processing is to be executed, in a step S73, the e-book processing (see
If “YES” is determined in the step S75, that is, if the browser is to be executed, in a step S77, the browsing processing (see
If “YES” is determined in the step S79, that is, if an incoming call exists, in a step S81, incoming call processing (see
If “YES” is determined in the step S83, that is, if a further operation exists, it is determined whether or not an operation of the power button is performed in a step S85. If “YES” is determined in the step S85, that is, if the operation for the power button is performed, the process proceeds to the step S91. If “NO” is determined in the step S85, that is, if not an operation for the power button, in a step S87, the further processing is performed, and then, the process returns to the step S61 shown in
If “NO” is determined in the step S83, that is, if no further operation exists, in a step S89, it is determined whether or not a seventh predetermined time period (10 seconds, for example) elapses in a no-operation state. For example, a time during which neither a key operation nor a touch operation exists is counted by a timer (no-operation timer) different from the restriction timer 504g and the gaze timer 504h. Such a no-operation timer is reset and started when a key operation or a touch operation is ended. For example, the seventh predetermined time period is settable between 5 seconds and 30 seconds.
If “NO” is determined in the step S89, that is, if the seventh predetermined time period does not elapse in the no operation state, the process returns to the step S61. If “YES” is determined in the step S89, that is, if the seventh predetermined time period elapses in the no operation state, in a step S91, the screen is put out (the display 14 is turned-off), and the performing function determining process is terminated.
In a step S113, an alarm screen 250 as shown in
Next, in a step S119, it is determined whether or not the gaze area overlaps with the operating region (here, a displaying area of the button image 260 or 262) set in the alarm screen 250. If “NO” is determined in the step S119, that is, if the gaze area does not overlap with the operating region, in a step S121, it is determined whether or not the alarm is to be automatically stopped. Here, it is determined whether or not a time (30 seconds to 5 minutes, for example) from the start of the ringing of the alarm to the automatic stopping has elapsed. A timer for such a determination may be provided, or whether or not the automatic stopping is to be performed may be determined by referring to a time counted by the RTC 40a.
If “NO” is determined in the step S121, that is, in a case that the alarm is not to be automatically stopped, the process returns to the step S117. If “YES” is determined in the step S121, that is, if the alarm is to be automatically stopped, in a step S123, the ringing of the alarm is stopped, and in a step S125, it is determined whether or not a setting of a snooze is present.
If “YES” is determined in the step S125, that is, if the setting of a snooze exists, in a step S127, the alarm time is changed by adding the snooze time to the current alarm time, and then, the process returns to the performing function determining process. If “NO” is determined in the step S125, that is, if no setting of a snooze is present, in a step S129, a next alarm time is set, and then, the process returns to the performing function determining process. In addition, if a next alarm is not set, the processor 40 does not perform the processing in the step S129, and returns to the performing function determining process. This is also true for a step S149 described later.
If “YES” is determined in the step S119, that is, if the gaze area overlaps with the operating region, in a step S131, it is determined whether or not the operating region with which the gaze area overlaps is changed. That is, the processor 40 determines whether or not the operating region with which the gaze area overlaps differs between the preceding time and the current time. If “NO” is determined in the step S131, that is, if the operating region is not changed, the process proceeds to a step S135 shown in
As shown in
If “NO” is determined in the step S135, that is, if the fourth predetermined time period does not elapse, the process returns to the step S117 shown in
If “YES” is determined in the step S137, that is, if the gaze area is the snooze button, the snooze button, i.e., the button image 260 is turned-on in a step S139, the ringing of the alarm is stopped in a step S141, the alarm time is changed by adding the snooze time to the alarm time in a step S143, and then, the process returns to the performing function determining process.
If “NO” is determined in the step S137, that is, if the gaze area is the stop button, in a step S145, the stop button, i.e., the button image 262 is turned-on, the ringing of the alarm is stopped in a step S147, a next alarm time is set in a step S149, and then, the process returns to the performing function determining process.
When the application selecting processing is started, the processor 40 displays an application selecting screen 150 as shown in
If “NO” is determined in the step S167, the process returns to the step S165. If “YES” is determined in the step S167, it is determined whether or not the operating region with which the gaze area overlaps is changed in a step S169. If “NO” is determined in the step S169, the process proceeds to a step S173. If “YES” is determined in the step S169, in a step S171, the gaze timer 504h is reset and started, and then, the process proceeds to the step S173.
In the step S173, a background color of an icon 160 being gazed at is changed at a predetermined amount. In a next step S175, it is determined whether or not a second predetermined time period (1-3 seconds, for example) elapses. That is, the processor 40 determines whether or not a time that the user watches the same icon 160 reaches the second predetermined time period with referring to a count value of the gaze timer 504h.
If “NO” is determined in the step S175, that is, if the second predetermined time period does not elapse, the process returns to the step S165. If “YES” is determined in the step S175, that is, if the second predetermined time period elapses, in a step S177, an application or function corresponding to the gazed icon 160 is activated, and then, the process returns to the performing function determining process.
In addition, if the activated application or function is an e-book or a browser, as described later, the e-book processing and the browsing processing are executed. Furthermore, as described above, when the second predetermined time period elapses, the background color of the gazed icon 160 is entirely changed.
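The dwell behavior that appears repeatedly in the above flows (resetting the gaze timer whenever the gazed operating region changes, and triggering the assigned action once the gaze time reaches the predetermined time period) can be sketched as follows. The class name, the update() interface and the 2-second dwell are assumptions introduced for illustration, not the actual implementation of the processor 40.

```python
import time

class GazeDwellSelector:
    """Sketch of the gaze-dwell logic: the gaze timer restarts when the gazed
    operating region changes, and the region is activated once the gaze time
    reaches the predetermined time period (e.g. the second predetermined time
    period for icons)."""

    def __init__(self, dwell_s: float = 2.0):
        self.dwell_s = dwell_s
        self.current_region = None
        self.start_time = None

    def update(self, gazed_region):
        """Call periodically with the operating region the gaze area overlaps
        (or None); returns the region to activate, or None."""
        if gazed_region is None:
            self.current_region, self.start_time = None, None
            return None
        if gazed_region != self.current_region:
            # Operating region changed: reset and restart the gaze timer.
            self.current_region = gazed_region
            self.start_time = time.monotonic()
            return None
        if time.monotonic() - self.start_time >= self.dwell_s:
            activated = self.current_region
            self.current_region, self.start_time = None, None
            return activated
        return None

# Usage example: feed the selector with the currently gazed region each cycle.
selector = GazeDwellSelector(dwell_s=2.0)
print(selector.update("icon_schedule"))   # -> None (timer just started)
```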
In a next step S193, a detection of a gaze area is started. It is determined whether or not the e-book processing is to be terminated in a next step S195. That is, the processor 40 determines whether or not a termination of the e-book processing is instructed by the user. If “YES” is determined in the step S195, that is, if the e-book processing is to be terminated, as shown in
If “NO” is determined in the step S195, that is, if the e-book processing is not to be terminated, in a step S197, a gaze area is acquired. In a succeeding step S199, it is determined whether or not the gaze area overlaps with the operating region (210 or 212). If “NO” is determined in the step S199, the process returns to the step S195, but if “YES” is determined in the step S199, it is determined in a step S201 whether or not the operating region with which the gaze area overlaps is changed.
If “NO” is determined in the step S201, the process proceeds to a step S205. On the other hand, if “YES” is determined in the step S201, the gaze timer 504h is reset and started in a step S203, and then, the process proceeds to the step S205, wherein a color of the indicator 206 is changed by a predetermined amount. That is, a blank portion of the indicator 206 is filled with a predetermined color by a predetermined amount.
In a step S207, it is determined whether or not a third predetermined time period (1-3 seconds, for example) elapses. That is, the processor 40 determines whether or not a time that the user watches the predetermined region (210 or 212) reaches the third predetermined time period with referring to a count value of the gaze timer 504h. If “NO” is determined in the step S207, that is, if the third predetermined time period does not elapse, the process returns to the step S195. If “YES” is determined in the step S207, that is, if the third predetermined time period elapses, it is determined whether or not the eye-controlled operation is the page advancing in a step S209 shown in
If “NO” is determined in the step S209, that is, in a case that the user gazes at the operating region 212, it is determined that the eye-controlled operation is the page returning, and thus, a preceding page is displayed in a step S211, and then, the process returns to the step S195 shown in
If “NO” is determined in the step S213, that is, if the current page is not the last page, in a step S215, a succeeding page is displayed, and then, the process returns to the step S195. If “YES” is determined in the step S213, that is, if the current page is the last page, the e-book processing is terminated, and then, the process returns to the performing function determination process.
As shown in
In a next step S233, the processor 40 starts a detection of a gaze area, and in a step S235, it is determined whether or not the browser is to be terminated. Here, the processor 40 performs such a determination based on whether or not a termination of the browsing processing is instructed by the user. If “YES” is determined in the step S235, that is, if the browser is to be terminated, the process returns to the performing function determination process. If “NO” is determined in the step S235, that is, if the browsing processing is not to be terminated, a gaze area is acquired in a step S237.
In a next step S239, it is determined whether or not the gaze area overlaps with the operating region (410L, 410R, 410T, 410B). If “NO” is determined in the step S239, the process returns to the step S235. If “YES” is determined in the step S239, it is determined, in a step S241, whether or not the operating region with which the gaze area overlaps is changed. If “NO” is determined in the step S241, the process proceeds to a step S245. If “YES” is determined in the step S241, in a step S243, the gaze timer 504h is reset and started, and then, the process proceeds to the step S245.
In the step S245, it is determined whether or not a sixth predetermined time period (1-3 seconds, for example) elapses. Here, the processor 40 determines, by referring to the count value of the gaze timer 504h, whether or not the time that the user gazes at the operating region (410L, 410R, 410T, 410B) reaches the sixth predetermined time period.
If “NO” is determined in the step S245, that is, if the sixth predetermined time period does not elapse, the process returns to the step S235. If “YES” is determined in the step S245, that is, if the sixth predetermined time period elapses, in a step S247 shown in
If “YES” is determined in the step S247, that is, if the gaze area is the left end, the screen is scrolled in the rightward direction by a predetermined amount in a step S249, and then, the process returns to the step S235 shown in
If “YES” is determined in the step S251, that is, if the gaze area is the right end, the screen is scrolled in the leftward direction by a predetermined amount in a step S253, and then, the process returns to the step S235. If “NO” is determined in the step S251, that is, if the gaze area is not the right end, in a step S255, it is determined whether or not the gaze area is the top end. Here, the processor 40 determines whether or not the user gazes at the operating region 410T.
If “YES” is determined in the step S255, that is, if the gaze area is the top end, the process returns to the step S235 after the screen is scrolled in the downward direction by a predetermined amount in a step S257. If “NO” is determined in the step S255, that is, in a case that the operating region 410B is gazed at, it is determined that the gaze area is the bottom end, and in a step S259, the screen is scrolled in the upward direction by a predetermined amount, and then, the process returns to the step S235.
In addition, the above description assumes that the screen can always be scrolled; however, if an end of the displayed content is displayed or the last page is displayed, the screen cannot be scrolled, and in such a case, even if an instruction for scrolling is input, the instruction is ignored.
As shown in
In a next step S273, an incoming call screen 350 shown in
If “YES” is determined in the step S277, that is, if the incoming call processing is to be terminated, the process proceeds to a step S291 shown in
If “NO” is determined in the step S281, the process returns to the step S277. If “YES” is determined in the step S281, it is determined, in a step S283, whether or not the operating region with which the gaze area overlaps is changed. If “NO” is determined in the step S283, the process proceeds to a step S287. If “YES” is determined in the step S283, in a step S285, the gaze timer 504h is reset and started, and then, the process proceeds to the step S287.
In the step S287, it is determined whether or not a fifth predetermined time period (1-3 seconds, for example) elapses. Here, the processor 40 determines, with referring to the count value of the gaze timer 504h, whether or not a time that the user gazes at the operating region (the displaying region of the button image 360 or 362) reaches the fifth predetermined time period.
If “NO” is determined in the step S287, that is, if the fifth predetermined time period does not elapse, the process returns to the step S277. If “YES” is determined in the step S287, that is, if the fifth predetermined time period elapses, in a step S289 shown in
If “NO” is determined in the step S289, that is, in a case that the user gazes at the button image 362, it is determined that the incoming call is to be stopped, and in the step S291, the incoming call notification is stopped, and then, the process returns to the performing function determination process. In the step S291 (the same as in the step S293), the processor 40 stops the ringtone, or stops the vibration motor, or performs both of them. If “YES” is determined in the step S289, that is, if the incoming call is to be answered, in a step S293, the incoming call notification is stopped, and in a step S295, the above-described telephone conversation processing is performed.
Subsequently, in a step S297, it is determined whether or not the telephone conversation is to be ended. Here, the processor 40 determines whether or not the end key 24 is operated by the user, or an end signal is received from the other end of the line. If “NO” is determined in the step S297, that is, if the conversation is not to be ended, the process returns to the step S295 to continue the telephone conversation processing. If “YES” is determined in the step S297, that is, if the conversation is to be ended, in a step S299, the line is disconnected and the process returns to the performing function determination process.
According to this embodiment, since the infrared camera is arranged above the display and the infrared LED is arranged below the display, even in a case that the user slightly closes the eyelid, it is possible to reliably image the reflecting light of the infrared light, and thus, it is possible to increase the recognition rate of the eye-controlled input.
In addition, in this embodiment, only the security lock function is described as the lock function, but not limited thereto. As the lock function, a lock (key lock) function for preventing an erroneous operation of the touch panel may be introduced. One of the security lock function and the key lock function may be settable, or both of them are settable. In addition, in a case that both of the security lock function and the key lock function are set, when the power of the display is turned-on, the security lock is canceled after the key lock is canceled.
In a case that the key lock function is set, when the use of the mobile phone 10 is started, that is, when the power for the display 14 is turned-on, a lock screen 450 (key lock) as shown in
In the lock screen 450 shown in
In addition, a movement of the cancel object 460 is performed by an eye-controlled operation. Specifically, when the lock screen 450 is being displayed, if the gaze area and the operating region for the cancel object 460 are overlapped with each other, in accordance with a position change of the gaze area (a line of sight) thereafter, the cancel object 460 is continuously moved.
Then, if the cancel object 460 is moved equal to or more than the predetermined distance d, the lock screen 450 is put out, and the key lock is canceled. For example, if the center 460a of the cancel object 460 is moved onto the contour line of the circle 470 or over the contour line, it is determined that the cancel object 460 is moved equal to or more than the predetermined distance d.
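As a sketch of this determination, the movement of the cancel object can simply be compared with the predetermined distance d. The coordinates in the example, and the assumption that the radius of the circle 470 equals d, are illustrative only.

```python
import math

def key_lock_cancelled(start_center, current_center, d: float) -> bool:
    """True when the center 460a of the cancel object has been moved by the
    eye-controlled operation a distance equal to or more than the predetermined
    distance d (i.e. onto or beyond the contour line of the circle, whose
    radius is assumed here to equal d)."""
    dx = current_center[0] - start_center[0]
    dy = current_center[1] - start_center[1]
    return math.hypot(dx, dy) >= d

# Example: the object started near the screen center and was dragged by gaze.
print(key_lock_cancelled((240, 400), (240, 250), d=120))   # -> True (moved 150 pixels)
```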
Here, the displaying manner is changed by moving the cancel object 460, and when the cancel object 460 is moved equal to or more than the predetermined distance d, it is determined that the displaying manner has become a predetermined manner and the key lock is canceled; however, the embodiment is not limited thereto. For example, an arrangement may be employed in which the displaying manner is changed by changing a size or a color of the cancel object 460 while the user gazes at the cancel object 460, and when the size or the color of the cancel object 460 is changed to a predetermined size or a predetermined color, it is determined that the cancel object 460 has reached the predetermined manner, and the key lock is canceled. In such a case, when the time that the gaze area overlaps the displaying region (operating region) of the cancel object 460 reaches an eighth predetermined time period (3-5 seconds, for example), the size or the color of the cancel object 460 reaches the predetermined size or the predetermined color. The size of the cancel object 460 is made larger (or smaller) by a predetermined amount (a predetermined length of the radius) at every unit time (0.5-1 seconds, for example). That is, the cancel object 460 is continuously changed according to the gaze time. Then, when the cancel object 460 becomes the same size as the circle 470, for example, it is determined that the cancel object 460 has reached the predetermined size. Therefore, the predetermined amount (predetermined dot width) by which the size of the cancel object 460 is changed linearly or stepwise is set such that the change of the cancel object 460 ends at the timing at which the gaze time coincides with the eighth predetermined time period. The same setting is also employed in a case that the color of the cancel object 460 is changed.
The internal color of the cancel object 460 is changed by a predetermined amount at every unit time. Then, when the color of the cancel object 460 is entirely changed, it is determined that the cancel object 460 is changed to the predetermined color. Here, instead of the color of the cancel object 460, a luminance may be changed.
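The per-unit-time change amount mentioned above can be derived by dividing the total change by the number of unit-time steps that fit into the eighth predetermined time period, as in the sketch below; the 4-second dwell, the 0.5-second unit time and the radius values are assumed figures.

```python
def per_step_change(total_change: float, dwell_s: float = 4.0,
                    unit_s: float = 0.5) -> float:
    """Amount by which the size (radius) or color of the cancel object is
    changed at every unit time so that the change completes exactly when the
    gaze time coincides with the eighth predetermined time period."""
    steps = dwell_s / unit_s
    return total_change / steps

# Example: grow the radius from 40 to 120 pixels over an assumed 4-second dwell
# with 0.5-second updates -> 10 pixels per update.
print(per_step_change(total_change=120 - 40))   # -> 10.0
```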
A specific lock canceling process (key lock) is shown in
When the lock canceling process is started as shown in
In a next step S313, a detection of a gaze area is started. That is, the processor 40 executes a gaze area detecting process (
On the other hand, if “YES” is determined in the step S317, that is, if the acquired gaze area overlaps with the operating region, in a step S319, the gaze area is acquired, and in a step S321, it is determined whether or not the gaze area is changed. The processor 40 determines whether or not the gaze area detected at this time is different from the gaze area indicated by the gaze area data 504c.
If “NO” is determined in the step S321, that is, if the gaze area is not changed, it is determined that the line of sight is not moved, and the process returns to the step S319. If “YES” is determined in the step S321, that is, if the gaze area is changed, it is determined that the line of sight is moved, and then, in a step S323, the cancel object 460 is moved to the current gaze area. For example, the processor 40 displays the cancel object 460 in a manner that the center of the gaze area and the center of the cancel object 460 become coincident with each other.
In a next step S325, it is determined whether or not the key lock is to be canceled. That is, the processor 40 determines whether or not the cancel object 460 is moved by a distance equal to or greater than the predetermined distance d. If “NO” is determined in the step S325, that is, if the key lock is not to be canceled, the process returns to the step S319. If “YES” is determined in the step S325, that is, if the key lock is to be canceled, the lock screen 450 is put out (non-displayed) in a step S327, and then, the lock canceling process is terminated.
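For illustration only, the flow of the steps S315-S327 may be sketched in Python as follows; the helper callables are hypothetical stand-ins for processing of the processor 40 and the display 14, and the embodiment is not limited to this structure:

    import math

    def center_of(area):
        # area = (left, top, right, bottom) in screen coordinates
        return ((area[0] + area[2]) / 2.0, (area[1] + area[3]) / 2.0)

    def overlaps(a, b):
        # Axis-aligned rectangle overlap test between two areas.
        return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

    def cancel_key_lock_by_movement(get_gaze_area, operating_region, d,
                                    move_cancel_object, hide_lock_screen):
        gaze = get_gaze_area()
        while not overlaps(gaze, operating_region):     # S315-S317
            gaze = get_gaze_area()
        start = center_of(gaze)
        while True:
            new_gaze = get_gaze_area()                  # S319
            if new_gaze == gaze:                        # S321: gaze area not changed
                continue
            gaze = new_gaze
            current = center_of(gaze)
            move_cancel_object(current)                 # S323
            if math.hypot(current[0] - start[0],
                          current[1] - start[1]) >= d:  # S325
                hide_lock_screen()                      # S327: lock screen put out
                return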
Next, a lock canceling process (key lock) shown in
On the other hand, if “YES” is determined in the step S347, the gaze timer 504h is reset and started in a step S349. Subsequently, in a step S351, the gaze area is acquired, and in a step S353, it is determined whether or not the acquired gaze area overlaps with the operating region.
If “NO” is determined in the step S353, the process returns to the step S349. If “YES” is determined in the step S353, in a step S355, a displaying area (size) of the cancel object 460, that is, the length of the radius of the cancel object 460 is made larger (or smaller) by a predetermined amount. Then, in a step S357, it is determined whether or not an eighth predetermined time period (3-5 seconds, for example) elapses. Here, the processor 40 determines whether or not the user gazes at the cancel object 460 for more than the eighth predetermined time period by determining whether or not a count value of the gaze timer 504h reaches the eighth predetermined time period.
If “NO” is determined in the step S357, that is, if the eighth predetermined time period does not elapse, it is determined that the key lock is not to be canceled, and the process returns to the step S351. In addition, in the steps S351-S357, the displaying area of the cancel object 460 is enlarged (or reduced) by the predetermined amount in accordance with the gaze time.
On the other hand, if “YES” is determined in the step S357, that is, if the eighth predetermined time period elapses, it is determined that the key lock is to be canceled, and the lock screen 450 is put out (non-displayed) in a step S359, and the lock canceling process is terminated.
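For illustration only, the loop of the steps S349-S359 may be sketched in Python as follows; the callables are hypothetical stand-ins and the timer values are merely examples:

    import time

    def cancel_key_lock_by_gazing(gaze_overlaps_object, grow_cancel_object,
                                  hide_lock_screen,
                                  eighth_period_s=4.0, unit_time_s=0.5):
        while True:
            gaze_start = time.monotonic()      # S349: gaze timer reset and started
            while gaze_overlaps_object():      # S351-S353
                grow_cancel_object()           # S355: enlarge (or reduce) the object
                if time.monotonic() - gaze_start >= eighth_period_s:  # S357
                    hide_lock_screen()         # S359: lock screen put out
                    return
                time.sleep(unit_time_s)
            # The gaze left the operating region, so the timer is reset (S349).
            time.sleep(unit_time_s)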
In addition, in the above-described embodiment, the displaying area of the cancel object 460 is changed by gazing at the cancel object 460, but, as described above, the color of the cancel object 460 may be changed.
Furthermore, in the above-described embodiment, the displaying area or the color of the cancel object 460 is changed during a time that the cancel object 460 is gazed at; however, it is not necessary to change the displaying manner of the cancel object 460, and the key lock may be canceled at the timing at which the eighth predetermined time period elapses. In such a case, the processing in the step S355 may be omitted.
The key lock is thus canceled by an eye-controlled operation. If another person intends to cancel the key lock through an eye-controlled operation, the eye-controlled operation by that person cannot be correctly recognized because, for example, the distance L between both of the eyes differs, and therefore, it is possible to prevent the mobile phone 10 from being used by another person against the user's intention. The same is true for canceling the security lock.
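For illustration only, one way such a discrepancy might manifest is sketched below in Python; the stored calibration value, the tolerance, and the check itself are assumptions of this sketch and are not specified by the embodiment:

    def calibration_matches(measured_L, calibrated_L, tolerance_ratio=0.1):
        # Returns False when the measured distance between both eyes deviates
        # too far from the distance L obtained at calibration time, in which
        # case the eye-controlled operation would not be recognized correctly.
        return abs(measured_L - calibrated_L) <= tolerance_ratio * calibrated_L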
In addition, the embodiment is described that the lock canceling process (key lock) shown in
Furthermore, in
Furthermore, in the above-described embodiments, a case that the alarm function of the mobile phone 10 is used as an alarm clock has been described, but the alarm function can also be used as an alarm for a schedule. In a case that the alarm function is used as the alarm for a schedule, if a content of the schedule is displayed on the display 14 at a time that the alarm is rung or the alarm is stopped, it is possible to make the user surely confirm the content of the schedule.
As shown in
Therefore, in a case that the alarm screen 600 is displayed, when the user performs an eye-controlled operation and a time that the button image 610 is gazed at, that is, a gaze time, reaches a ninth predetermined time period (1-3 seconds, for example), the button image 610 is turned-on, whereby the alarm is stopped. As described above, the content of the schedule is displayed when the alarm screen 600 is displayed or when the button image 610 is turned-on.
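For illustration only, this dwell-based stopping of the alarm may be sketched in Python as follows; the callables and the polling interval are assumptions, and restarting the gaze time when the gaze leaves the button image 610 is a design choice of this sketch:

    import time

    def stop_alarm_by_gaze(gaze_on_button, turn_on_button, stop_alarm,
                           show_schedule, ninth_period_s=2.0, poll_s=0.1):
        gazed = 0.0
        while gazed < ninth_period_s:
            if gaze_on_button():        # gaze area overlaps the button image 610
                gazed += poll_s
            else:
                gazed = 0.0             # gaze interrupted: dwell time starts over
            time.sleep(poll_s)
        turn_on_button()                # the button image 610 is turned-on
        stop_alarm()
        show_schedule()                 # the content of the schedule is displayed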
Furthermore, in the alarm screen 600 shown in
In addition, in this embodiment shown, the infrared camera and the infrared LED are arranged apart from each other in a vertical direction, but not limited thereto. For example, electronic equipment such as a smartphone may be used in the horizontal direction, and therefore, the structure capable of performing an eye-controlled operation in such a case may be adopted.
For example, as shown in
Furthermore, as shown in
In addition, in the embodiment shown, a case that the processing by the processor is performed by the eye-controlled operation has been described, but it is needless to say that such processing may be performed through a key operation or a touch operation. In addition, when the processing by the eye-controlled operation is performed, a setting may be made so as not to accept the key operation and the touch operation.
Furthermore, in the embodiment shown, a case that the eye-controlled operation can be performed has been described; however, there are situations in which the eye-controlled operation (eye-controlled input) can be performed and situations in which it cannot, and therefore, in a case that the eye-controlled operation can be performed, a message or an image (icon) indicative of such a situation may be displayed. Furthermore, while the eye-controlled operation is being performed, a message or an image indicating that the eye-controlled input is accepted (that the eye-controlled operation is being executed) may be displayed. Thus, the user can recognize that the eye-controlled operation can be performed and that the eye-controlled input is accepted.
Furthermore, in the above-described embodiments, when the alarm processing, the application selecting processing, the e-book processing, the browsing processing or the incoming call processing is started, the eye-controlled operation is automatically detected, but the embodiments are not limited thereto. For example, the eye-controlled operation may be started in response to a predetermined key operation or touch operation. Likewise, the end of the eye-controlled operation may be instructed by a predetermined key operation or touch operation.
Furthermore, in the performing function determination process shown in
Therefore, as described above, in a case that the start or the end of the eye-controlled operation is instructed, or in a case that applications or functions capable of being performed with the eye-controlled operation and applications or functions not capable of being performed with the eye-controlled operation coexist, when the incoming call processing is started as an interruption, whether the eye-controlled operation can be utilized in the incoming call processing may be set depending on whether the eye-controlled operation was being performed for the application or function executed just before.
For example, in a case that the eye-controlled operation is performed in the application or function executed just before, if an incoming call is input, it is possible to instruct answering or stopping of the incoming call based on the eye-controlled operation. Conversely, if an incoming call is input, the eye-controlled operation may be disabled and only the key operation or the touch operation may be accepted, so that answering or stopping of the incoming call is instructed by the key operation or the touch operation. In such a case, since no time is required for processing such as detecting the gaze area, the incoming call can be promptly answered or stopped. Furthermore, in a case that the key operation or the touch operation is performed for the application or function executed just before, if an incoming call is input, answering or stopping of the incoming call may be instructed based on the key operation or the touch operation as it is. That is, since the operating method is maintained before and after the incoming call, the burden of changing the operating method is removed from the user.
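For illustration only, the policy of maintaining (or restricting) the operating method across the incoming call may be sketched in Python as follows; the method names and the flag are hypothetical, and either variant described above may be adopted:

    def input_methods_for_incoming_call(previous_method, prefer_prompt_handling=False):
        # previous_method: "eye" or "key_touch", the operating method used for
        # the application or function executed just before the incoming call.
        if previous_method == "eye" and not prefer_prompt_handling:
            # Keep the eye-controlled operation available for answering or
            # stopping the incoming call.
            return {"eye": True, "key_touch": True}
        # Otherwise accept only the key operation or the touch operation, so
        # that no gaze-area detection delays the handling of the call.
        return {"eye": False, "key_touch": True}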
Programs utilized in the above-described embodiments may be stored in an HDD of a server for data distribution, and distributed to the mobile phone 10 via a network. The plurality of programs may be stored in a storage medium such as an optical disk (CD, DVD, BD (Blu-ray Disc) or the like), a USB memory, a memory card, etc., and then such a storage medium may be sold or distributed. In a case that the plurality of programs downloaded via the above-described server or storage medium are installed in a mobile terminal having a structure equal to the structure of the embodiment, it is possible to obtain advantages equal to those according to the embodiment.
The specific numerical values mentioned in this specification are only examples, and changeable properly in accordance with the change of product specifications.
An embodiment is electronic equipment provided with a display portion, comprising: an infrared light detecting portion which is arranged above the display portion and detects an infrared light; and a first infrared light output portion which is arranged below the display portion.
In the embodiment, the electronic equipment (10) is provided with a display portion (14). The electronic equipment comprises an infrared light detecting portion (30) which is arranged above the display portion and detects an infrared light, and a first infrared light output portion (32) which is arranged below the display portion. Accordingly, the infrared light is irradiated to a portion lower than the center of the pupil of an eye of the user squarely facing the display portion of the electronic equipment. Therefore, even in a state that an eyelid of the user is slightly closed, a reflecting light of the infrared light can be imaged by the infrared light detecting portion.
According to the embodiment, since the reflecting light of the infrared light can be surely imaged, a recognition rate of an eye-controlled input can be increased. Therefore, in a case that the electronic equipment is operated by the eye-controlled input, it is possible to surely receive such an operation.
Another embodiment is the electronic equipment wherein the infrared light detecting portion and the first infrared light output portion are arranged on a first line which is in parallel with a vertical direction of the display portion.
In this embodiment, the infrared light detecting portion and the first infrared light output portion are arranged on a first line being in parallel with the vertical direction of the display portion. For example, the infrared light detecting portion and the first infrared light output portion are arranged in a manner that the center position of a detecting surface of the infrared light detecting portion and the center position of a light-emitting surface of the first infrared light output portion are laid on the same line.
According to this embodiment, since the infrared light detecting portion and the first infrared light output portion are arranged on a line, it is unnecessary to perform correcting processing due to a positional deviation between the two. That is, it is unnecessary to perform complicated calculation.
A further embodiment is the electronic equipment further comprising a second infrared light output portion, wherein the second infrared light output portion is arranged on a second line which is in parallel with a horizontal direction of the display portion at an opposite side of the infrared light detecting portion in the horizontal direction of the display portion, the infrared light detecting portion being arranged on the second line.
In the further embodiment, a second infrared light output portion (34) is provided, which is arranged on a second line being in parallel with a horizontal direction of the display portion at an opposite side of the infrared light detecting portion in the horizontal direction of the display portion, the infrared light detecting portion being arranged on the second line. For example, the second infrared light output portion is arranged in a manner that the center position of a detecting surface of the infrared light detecting portion and the center position of a light-emitting surface of the second infrared light output portion are laid on the same line. Therefore, if the electronic equipment is used in a transverse direction, a direction of a line of sight is detected by using the infrared light detecting portion and the second infrared light output portion.
According to the further embodiment, irrespective of a direction of the electronic equipment, a recognition rate of an eye-controlled input can be increased.
A still further embodiment is the electronic equipment wherein the infrared light detecting portion and the first infrared light output portion are arranged at diagonal positions sandwiching the display portion.
In the still further embodiment, the infrared light detecting portion and the first infrared light output portion are arranged at diagonal positions sandwiching the display portion. In a case of a display portion having a rectangular displaying surface, the infrared light detecting portion and the first infrared light output portion are arranged on a line in parallel with a diagonal line. Accordingly, even if the electronic equipment is used vertically or even if the electronic equipment is used horizontally, by using the infrared light detecting portion and the first infrared light output portion, a line of sight can be detected.
According to the still further embodiment, it is possible to detect a line of sight in both the vertical direction and the horizontal direction without increasing the number of parts.
A yet still further embodiment is the electronic equipment further comprising a gaze area detecting portion which detects a gaze area on a screen of the display portion at which a user is gazing, based on a pupil of the user detected by the infrared light detecting portion and a reflecting light of the first infrared light output portion; and a performing portion which performs predetermined processing based on the gaze area detected by the gaze area detecting portion.
In the yet still further embodiment, the electronic equipment further comprises the gaze area detecting portion (40, 62, S49) and the performing portion (40, S139-S149, S177, S211, S215, S249, S253, S257, S259, S291, S293, S295). The gaze area detecting portion detects a gaze area on a screen of the display portion at which a user is gazing, based on the pupil of the user detected by the infrared light detecting portion and a reflecting light of the first infrared light output portion. For example, in a two-dimensional imaged image, an eye vector having a start point at the center position of the reflecting light and an end point at the center position of the pupil is detected, and in accordance with the eye vector, one of areas into which the screen is divided in advance is determined as the gaze area. The performing portion performs predetermined processing based on the gaze area detected by the gaze area detecting portion. For example, a button image, an icon or a thumbnail displayed at a position or region overlapping with the gaze area is operated (turned-on), or an operation or action (turning pages, scrolling the screen, etc.) assigned to a predetermined area (the operating region, in the embodiments) set at a position or area overlapping with the gaze area is performed.
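For illustration only, the mapping from the eye vector to one of the divided screen areas may be sketched in Python as follows; the calibration data (eye vectors observed while gazing at opposite corners of the screen) and the linear mapping are assumptions of this sketch, not limitations of the embodiment:

    def detect_gaze_area(pupil_center, reflection_center, calibration,
                         columns, rows):
        # Eye vector: start point at the center of the reflecting light,
        # end point at the center of the pupil, in the imaged image.
        vx = pupil_center[0] - reflection_center[0]
        vy = pupil_center[1] - reflection_center[1]
        # Map the eye vector to normalized screen coordinates using assumed
        # calibration extremes (vectors at opposite corners of the screen).
        (vx_min, vy_min), (vx_max, vy_max) = calibration
        nx = min(max((vx - vx_min) / (vx_max - vx_min), 0.0), 1.0)
        ny = min(max((vy - vy_min) / (vy_max - vy_min), 0.0), 1.0)
        # The screen is divided in advance into columns x rows areas; return
        # the index of the area containing the mapped point as the gaze area.
        col = min(int(nx * columns), columns - 1)
        row = min(int(ny * rows), rows - 1)
        return row * columns + col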
According to the yet still further embodiment, since the predetermined processing is performed according to an area to which a line of sight of the user is directed, the electronic equipment can be operated by an eye-controlled input.
Another embodiment is the electronic equipment wherein the display portion displays one or more images, and further comprising a displaying manner changing portion which changes, according to a time lapse, a displaying manner of the one or more images with which the gaze area detected by the gaze area detecting portion overlaps.
In the embodiment, the display portion displays one or more images. The image is a button image, an icon, a thumbnail or the like, for example. The displaying manner changing portion changes, according to a time lapse, a displaying manner of the one or more images with which the gaze area detected by the gaze area detecting portion overlaps. For example, a color of a background of the image is changed; a size of a background of the image is changed; or the image is displayed in a predetermined animation (in rotation).
According to this embodiment, since the displaying manner of the image with which the gaze area overlaps is changed, the user can be notified of which image is recognized as being gazed at, and a passage of the gazing time can be notified as a change of the displaying manner.
Another further embodiment is the electronic equipment wherein when the image is changed to a predetermined displaying manner by the displaying manner changing portion, the performing portion performs predetermined processing assigned to the concerned image.
In this further embodiment, the performing portion performs the predetermined processing assigned to the concerned image when the image is changed to the predetermined displaying manner. The predetermined displaying manner means, for example, a state that a background color of the image is entirely changed, a state that the image is changed up to a predetermined size or a state that the image is rotated by a predetermined number of rotations.
According to this further embodiment, since the performing portion performs the predetermined processing assigned to the concerned image if and when the image is changed to the predetermined manner, it is necessary to continue to gaze at the image to some extent, and therefore, it is possible to prevent an erroneous operation.
Another embodiment is the electronic equipment wherein the display portion is set with one or more predetermined regions, and when the gaze area detected by the gaze area detecting portion overlaps any one of the one or more predetermined regions, the performing portion performs predetermined processing assigned to the concerned predetermined region.
In this embodiment, one or more predetermined regions (210, 212, 410L, 410R, 410T, 410B, etc.) are set on the display portion. The performing portion performs the predetermined processing assigned to the concerned predetermined region when the gaze area detected by the gaze area detecting portion overlaps any one of the one or more predetermined regions.
According to this embodiment, even in a case that the image is not displayed, it is possible to set the predetermined region(s) and perform the predetermined processing by gazing at any one of the predetermined region(s).
A further embodiment is the electronic equipment wherein the predetermined processing includes a turning of a page.
In the further embodiment, the predetermined processing includes a turning of the page, and the page is advanced or returned on a page-by-page basis. The predetermined processing may be the turning to the last page or the first page.
According to the further embodiment, the turning of page(s) can be designated by the eye-controlled operation.
A still further embodiment is the electronic equipment wherein the predetermined processing includes a scroll of a screen.
In the still further embodiment, the predetermined processing is a scroll of the screen, and the screen is scrolled in the leftward or rightward direction or the upward or downward direction, or in the oblique direction.
According to the still further embodiment, the scroll of the screen can be designated by the eye-controlled operation.
A yet still further embodiment is the electronic equipment wherein the display portion displays a lock screen which includes a character or image, further comprising: an arrangement detecting portion which detects according to a time series an arrangement of the character or image with which the gaze area detected by the gaze area detecting portion overlaps; and a lock canceling portion which puts out the lock screen when a predetermined arrangement is included in the arrangement of the character or image detected by the arrangement detecting portion.
In the yet still further embodiment, the lock screen (100) including a character or image is displayed on the display portion. In a case that the security lock function is turned-on, for example, in starting the use of the electronic equipment or in performing (starting) a predetermined application or function, the lock screen is displayed. The arrangement detecting portion (40, S13) detects according to a time series an arrangement of the character or image with which the gaze area detected by the gaze area detecting portion overlaps. That is, the characters or images designated by the eye-controlled input are detected according to an order of the eye-controlled input. The lock canceling portion (40, S19) puts out the lock screen when a predetermined arrangement is included in the arrangement of the character or image detected by the arrangement detecting portion, at a time that “YES” is determined in the step S13, for example.
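For illustration only, the determination of whether the predetermined arrangement is included in the detected arrangement may be sketched in Python as follows; a contiguous match is assumed here, and the form of the registered arrangement is an assumption of this sketch:

    def lock_should_be_canceled(gazed_sequence, registered_arrangement):
        # gazed_sequence: characters or images detected in the order of the
        # eye-controlled input (time series).
        # registered_arrangement: the predetermined arrangement set in advance
        # (a secret code number, for example).
        n, m = len(gazed_sequence), len(registered_arrangement)
        return any(gazed_sequence[i:i + m] == registered_arrangement
                   for i in range(n - m + 1))

    # Example: lock_should_be_canceled(list("31415"), list("141")) -> True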
According to the yet still further embodiment, since the lock canceling can be performed by the eye-controlled operation, even if a situation in which the secret code number or the like is input is seen by another person, that person cannot easily know the secret code number. That is, it is possible to increase the security.
A still further embodiment is the electronic equipment wherein the display portion displays a lock screen which includes a predetermined object, further comprising: a displaying manner changing portion which changes a displaying manner of the predetermined object when the gaze area detected by the gaze area detecting portion overlaps with the predetermined object; and a lock canceling portion which puts out the lock screen when a displaying manner which is changed by the displaying manner changing portion is a predetermined displaying manner.
In the still further embodiment, the lock screen (450) including a predetermined object (460) is displayed on the display portion. In a case that the lock function for the key (touch panel) is turned-on, for example, when the power for the display portion is turned-on, the lock screen is displayed. The displaying manner changing portion (40, S323, S355) changes a displaying manner of the predetermined object when the gaze area detected by the gaze area detecting portion overlaps with the predetermined object. For example, according to the eye-controlled input, the predetermined object is moved, or changed in its size and/or color. The lock canceling portion (40, S327, S359) puts out the lock screen when the displaying manner which is changed by the displaying manner changing portion becomes the predetermined displaying manner (“YES” in S325, S357).
According to the still further embodiment, since the lock canceling can be performed by the eye-controlled operation, it is possible to cancel the lock state even in a situation that the user cannot use his/her hand therefor.
Another embodiment is the electronic equipment wherein the display portion displays a lock screen which includes a predetermined object, and further comprising a lock canceling portion which puts out the lock screen when a time that the gaze area detected by the gaze area detecting portion is overlapping with the predetermined object reaches a predetermined time period.
In this embodiment, the lock screen (450) including the predetermined object (460) is displayed on the display portion. In a case that the lock function for the key (touch panel) is turned-on, for example, when the power for the display portion is turned-on, the lock screen is displayed. The lock canceling portion (40, S359) puts out the lock screen when a time that the gaze area detected by the gaze area detecting portion is overlapping with the predetermined object reaches a predetermined time period (“YES” in S357).
According to this embodiment, it is possible to cancel the lock state even in a situation that the user cannot use his/her hand therefor.
A further embodiment is the electronic equipment wherein the display portion displays at least an alarm screen for stopping an alarm at a time of alarm, and the performing portion stops the alarm when the gaze area detected by the gaze area detecting portion overlaps with a predetermined area continuously for more than a predetermined time period.
In the further embodiment, at least the alarm screen (250, 600) for stopping an alarm is displayed on the display portion at a time of an alarm. The performing portion stops the alarm when the gaze area detected by the gaze area detecting portion overlaps a predetermined area (260, 262, 610) continuously for more than a predetermined time period.
According to the further embodiment, since the alarm can be stopped by the eye-controlled operation, in a case that the electronic equipment is used as an alarm clock, the user necessarily opens his/her eyes, so that it is possible to suitably play the role of the alarm clock. Furthermore, in a case that the electronic equipment functions as an alarm for a schedule, by displaying the content of the schedule on the display, it is possible to make the user surely confirm the content of the schedule.
The other embodiment is the electronic equipment further comprising a telephone function, wherein the display portion displays, at a time of an incoming call, a selection screen which includes at least two predetermined regions to answer the incoming call and to stop the incoming call, and when the gaze area detected by the gaze area detecting portion overlaps with either one of the two predetermined regions continuously for more than a predetermined time period, the performing portion answers the incoming call or stops the incoming call in accordance with the concerned predetermined region.
In the other embodiment, the electronic equipment comprises the telephone function. The electronic equipment is a mobile phone, for example. At a time of an incoming call, the selection screen (350) which includes at least two predetermined regions to answer the incoming call or to stop the incoming call is displayed. When the gaze area detected by the gaze area detecting portion overlaps with either one of the two predetermined regions continuously for more than a predetermined time period, the performing portion answers the incoming call or stops the incoming call (refuses the incoming call) in accordance with the concerned predetermined region.
According to the other embodiment, it is possible to answer or stop the incoming call by the eye-controlled operation.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.
Number | Date | Country | Kind |
---|---|---|---
2012-001114 | Jan 2012 | JP | national |