The present disclosure relates to an electronic mirror device.
A so-called “electronic mirror,” which displays on a display screen an image captured with a camera so that the screen can be used as a mirror, has been developed, and it has been proposed that such a mirror be provided as an application for a mobile telecommunications device (see Patent Document No. 1, for example). Also, as disclosed in Patent Document No. 2, if the position of the camera can be changed, the user can also use the device as a rear-view mirror. Thus, the electronic mirror can perform a different function from a normal mirror, which just reflects light.
However, the electronic mirror cannot be used in exactly the same way as a normal mirror. In particular, the user feels somewhat uncomfortable when the camera is not level with his or her eyes (i.e., when his or her line of sight is not aligned with the camera), which is one of the problems with the electronic mirror. To overcome such a problem, Patent Document No. 3 proposes a method for synthesizing images which have been shot with a plurality of cameras. Patent Document No. 4 proposes a method in which an image capturing section is arranged right behind a display section and the display and image capturing sections are turned ON and OFF synchronously with each other. And Patent Document No. 5 proposes a mechanism for changing the position of the camera. Meanwhile, although it does not relate to the field of electronic mirrors, Patent Document No. 6 proposes a technique for aligning the line of sight with the position of an image by detecting the line of sight of a car driver and shifting the location where an image is displayed.
Patent Document No. 1: Japanese Utility Model Publication No. 3154529
Patent Document No. 2: Japanese Laid-Open Patent Publication No. 2000-138926
Patent Document No. 3: Japanese Laid-Open Patent Publication No. 2002-290964
Patent Document No. 4: Japanese Laid-Open Patent Publication No. 2004-297733
Patent Document No. 5: Japanese Laid-Open Patent Publication No. 2012-60547
Patent Document No. 6: Japanese Laid-Open Patent Publication No. 2011-240813
The present disclosure provides an electronic mirror device which allows the user to view image data on the screen more easily by taking advantage of the fact that an electronic mirror works differently from, and cannot be used in the same way as, a normal mirror.
An electronic mirror device according to an aspect of the present disclosure displays a user's facial mirror image, and includes: an image capturing section which captures the user's facial image; an image data generating section which outputs image data representing the user's facial mirror image based on image data representing the user's facial image that has been captured; a display section which displays the image data representing the user's facial mirror image; and a display location shifting section which shifts a location where the mirror image is displayed on the display section.
An electronic mirror device according to an aspect of the present disclosure can shift the image display location to a location where the user can view the image easily, thus allowing the user to easily view even a facial part (such as his or her profile or the area under his or her chin) which is usually hard to see in a normal mirror.
An aspect of the present disclosure can be outlined as follows:
An electronic mirror device according to an aspect of the present disclosure displays a user's facial mirror image, and includes: an image capturing section which captures the user's facial image; an image data generating section which outputs image data representing the user's facial mirror image based on image data representing the user's facial image that has been captured; a display section which displays the image data representing the user's facial mirror image; and a display location shifting section which shifts a location where the mirror image is displayed on the display section.
The electronic mirror device may further include a user interface which accepts the user's instruction to shift the location where the mirror image is displayed. The display location shifting section may shift the location where the mirror image is displayed in accordance with the user's instruction.
For example, the electronic mirror device may further include an orientation estimating section which estimates the user's facial orientation. The display location shifting section may shift the location where the mirror image is displayed according to the facial orientation that has been estimated.
For example, the electronic mirror device may further include a facial feature detecting section which detects the location and distribution of a predetermined facial part in the image data representing the user's facial image. The orientation estimating section may estimate the user's facial orientation based on the location and distribution of the predetermined facial part that has been detected.
For example, the electronic mirror device may further include a distance estimating section which estimates a distance from the display section to the user's face. The display location shifting section may shift the location where the mirror image is displayed according to the facial orientation and distance that have been estimated.
For example, the electronic mirror device may further include a line of sight detecting section which detects the direction of the user's line of sight. The display location shifting section may shift the location where the mirror image is displayed to a location to which the user directs his or her line of sight.
For example, the image capturing section and the display section may form integral parts of the device.
For example, before the display location shifting section shifts the location where the mirror image is displayed, the image data may be displayed closer to the image capturing section than to a central area of the display section.
For example, the electronic mirror device may include a plurality of the display sections. The display location shifting section may shift the location where the mirror image is displayed from one of the plurality of display sections to another.
A computer program according to an aspect of the present disclosure is defined to make an electronic device perform an operation of displaying a user's facial mirror image, the operation including the steps of: capturing the user's facial image; outputting image data representing the user's facial mirror image based on image data representing the user's facial image that has been captured; displaying the image data representing the user's facial mirror image; and shifting a location where the mirror image is displayed.
Embodiments of an electronic mirror device will now be described with reference to the accompanying drawings.
(Embodiment 1)
Next, it will be described with reference to
The location where the facial image 210 is displayed may be shifted by making the user 200 manipulate the touchscreen panel 104 and slide his or her own facial image 210 to any location he or she likes. Alternatively, face recognition ability may be added to the electronic mirror device 100 to change the location where the facial image 210 is displayed automatically.
The image capturing section 102 may be a CCD image sensor, for example, and captures a subject image and generates image data. In this example, the image capturing section 102 captures the user's (200) facial image and generates image data representing his or her facial image.
The image data generating section 120 generates and outputs image data representing the user's (200) facial mirror image based on the image data representing the user's (200) facial image that has been captured. The display section 103 displays the image data 210 representing the user's (200) facial mirror image. In this example, the image to be displayed on the display section 103 is supposed to be a moving picture. However, the image may also be a still picture. The image retouching section 170 allows the user to retouch the image data representing his or her face. The angle detecting section 105 detects the angle (i.e., the degree of tilt) of the electronic mirror device 100.
The user interface 104 accepts the user's (200) instruction to shift the location where the mirror image is displayed. The user interface 104 may be a touchscreen panel, for example.
The facial feature detecting section 150 detects the location and distribution of a predetermined facial part (such as the eyes, nose or mouth) in the image data representing the user's (200) facial image. The orientation estimating section 140 estimates the user's (200) facial orientation based on the location and distribution of the predetermined facial part that has been detected. The distance estimating section 160 estimates the distance from the display section 103 to the user's (200) face. The line of sight detecting section 180 detects the direction of the user's (200) line of sight.
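As a rough illustration of how the orientation estimating section 140 could use the output of the facial feature detecting section 150 (the heuristic and the names below are assumptions for this sketch, not the method defined by this disclosure), the facial yaw can be approximated from how far the nose lies from the midpoint between the eyes:

```python
import math

def estimate_yaw_deg(left_eye_x, right_eye_x, nose_x):
    """Rough yaw estimate (in degrees) from 2-D landmark x-coordinates.

    Heuristic: when the face turns, the nose shifts away from the
    midpoint between the eyes.  A zero offset means the user is facing
    straight forward; the sign indicates the turning direction.
    """
    eye_mid_x = (left_eye_x + right_eye_x) / 2.0
    half_span = abs(right_eye_x - left_eye_x) / 2.0
    if half_span == 0:
        return 0.0  # degenerate input; treat as facing forward
    offset_ratio = (nose_x - eye_mid_x) / half_span
    offset_ratio = max(-1.0, min(1.0, offset_ratio))  # clamp for asin()
    return math.degrees(math.asin(offset_ratio))

# Example: the nose sits well toward the right eye, i.e. the face is turned.
print(round(estimate_yaw_deg(100, 200, 180), 1))  # -> 36.9
```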
The display location shifting section 130 shifts the location where the mirror image is displayed on the display section 103 in accordance with the user's (200) instruction which has been entered through the user interface 104. Optionally, the display location shifting section 130 may shift the mirror image display location according to the facial orientation that has been estimated. Alternatively, the display location shifting section 130 may shift the mirror image display location according to the facial orientation and distance that have been estimated. Still alternatively, the display location shifting section 130 may shift the mirror image display location to a location to which the user 200 directs his or her line of sight.
The microcomputer 110 is a control section which controls the overall operations of the respective components of the electronic mirror device 100 described above.
It should be noted that the set of components shown in
Likewise, the angle detecting section 105 needs to be provided only if the electronic mirror device 100 is rotated during use, as in the smartphone or tablet terminal to be described later, and is not indispensable in an embodiment in which the electronic mirror device 100 is used at a fixed angle.
Next, exemplary processing of changing the location where image data representing a facial mirror image is displayed will be described.
The processing steps S21, S22 and S23 to be carried out until image data representing a mirror image is displayed are the same as the processing steps S11, S12 and S13 shown in
Next, it will be described with reference to
If the distance needs to be estimated more accurately, the distance d is determined from the parallax, and the actual radius r of the user's (200) head is calculated based on the angle of view of the image capturing section 102 and the width of the head in the captured image. To measure the distance d from the display section 103 to the user 200 based on the parallax, either a method using a plurality of cameras or a method using a pupil division technique may be adopted. Up to this point, the distance d from the display section 103 to the user 200, the radius r of the user's (200) head, and the value representing the facial orientation θ have been obtained.
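The following minimal sketch shows this geometry under a simple pinhole-camera assumption; the parameter names and the example numbers are illustrative only and are not specified by this disclosure:

```python
import math

def head_radius_mm(distance_mm, fov_deg, image_width_px, head_width_px):
    """Estimate the real head radius r from the captured image.

    Pinhole-camera approximation: the head subtends roughly the same
    fraction of the horizontal angle of view as its pixel width is of
    the image width.  distance_mm is assumed to be already known, e.g.
    from the parallax between two cameras or from a pupil division
    technique.
    """
    head_angle = math.radians(fov_deg) * head_width_px / image_width_px
    head_width_mm = 2.0 * distance_mm * math.tan(head_angle / 2.0)
    return head_width_mm / 2.0

# Example: 60-degree lens, 1920-px frame, head spanning 900 px at 300 mm.
print(round(head_radius_mm(300, 60.0, 1920, 900), 1))  # -> about 75 mm
```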
If the display location shifting section 130 has decided to shift the display location of the facial image based on the facial orientation and distance that have been estimated, then the location where the facial image is displayed is shifted (in Steps S26 and S27). On the other hand, if the display location shifting section 130 has found it unnecessary to shift the facial image (e.g., in a situation where the user is facing forward), then the processing of estimating the facial orientation and distance is continued without changing the facial image display location (in Steps S26 and S28). After that, either the processing of estimating the facial orientation and distance or the processing of shifting the display location will be continued until the user finishes using this electronic mirror.
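This paragraph does not spell out how the magnitude of the shift is computed. The inequality tan θ·(75+200)≤100 given for the second embodiment below suggests a shift of roughly tan θ·(r+d); the sketch below assumes that formula together with an assumed dead zone for a forward-facing user:

```python
import math

FORWARD_DEAD_ZONE_DEG = 5.0  # assumed threshold below which no shift is made

def display_shift_mm(yaw_deg, head_radius_mm, distance_mm):
    """Signed display-location shift, assuming shift = tan(theta) * (r + d).

    The formula matches the inequality tan(theta)*(75+200) <= 100 used in
    Embodiment 2 (75 mm head radius, 200 mm viewing distance); a negative
    result simply means a shift in the opposite direction.
    """
    if abs(yaw_deg) <= FORWARD_DEAD_ZONE_DEG:
        return 0.0  # user is facing roughly forward: keep the display location
    return math.tan(math.radians(yaw_deg)) * (head_radius_mm + distance_mm)

# Example: a 20-degree turn at 200 mm with a 75 mm head radius -> about 100 mm.
print(round(display_shift_mm(20.0, 75.0, 200.0), 1))  # -> 100.1
```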
In the example described above, calculations are made on the assumption that the head has a radius r, for the sake of simplicity. However, the magnitude of shift of the display location may also be determined from the interval between the pupils or the width between the lateral corners of the eyes, which can be easily read from the captured image. Optionally, the image display location may also be adjusted by carrying out detection in the same way not only laterally across the face but also vertically or obliquely. Furthermore, it is not always necessary to provide the distance estimating section 160. Alternatively, the relation between the facial orientation and the display location may be stored in advance in a built-in memory, and the display location may be determined according to the user's (200) estimated facial orientation.
The line of sight detecting section 180 detects the user's (200) line of sight. The method of detecting the line of sight is not particularly limited; for example, the line of sight may be detected by a method which uses the difference in luminance between the cornea (the dark part of the eye) and the sclera (the white part of the eye). If the decision has been made that the user 200 is facing right or left, then the display location is shifted to the right or the left. On the other hand, if the decision has been made that the user 200 is facing straight forward, then the display location is not shifted (in Steps S35, S36 and S37). After that, either the line of sight detection processing or the processing of shifting the display location will be continued until the user finishes using this electronic mirror (in Steps S37 and S38).
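One way such a luminance-based decision could be sketched is shown below; the threshold, the crop format and the left/right convention are assumptions, since the disclosure only states that the dark and white parts of the eye differ in luminance:

```python
def gaze_direction(eye_region, dark_threshold=80):
    """Classify gaze as 'left', 'right' or 'forward' from a grayscale eye crop.

    Sketch of the luminance idea: the dark iris/pupil area is found by
    thresholding, and its horizontal centroid relative to the centre of
    the eye region indicates where the user is looking.  eye_region is a
    list of rows of 0-255 grayscale values; 'left'/'right' are in image
    coordinates and may need mirroring for the on-screen mirror image.
    """
    dark_xs = [x for row in eye_region
               for x, value in enumerate(row) if value < dark_threshold]
    if not dark_xs:
        return "forward"  # nothing detected; fall back to no shift
    ratio = (sum(dark_xs) / len(dark_xs)) / (len(eye_region[0]) - 1)
    if ratio < 0.4:
        return "left"
    if ratio > 0.6:
        return "right"
    return "forward"

# Example: a 1 x 10 "eye" whose dark pixels lie on the right-hand side.
print(gaze_direction([[255, 255, 255, 255, 255, 255, 255, 40, 40, 40]]))  # right
```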
In the embodiments shown in
This electronic mirror device 100 can display the user's (200) facial image at a location where he or she can view his or her own facial image comfortably. Consequently, he or she can view that image more easily with his or her eyes panned much less than in a normal mirror.
(Embodiment 2)
Now it will be described with reference to
Next, if the electronic mirror device 100 has been rotated 90 degrees to the right as shown in
Just like the electronic mirror device 100 of the first embodiment described above, the electronic mirror device 100 of this second embodiment can also shift the location where the facial image 210 is displayed. In addition, the electronic mirror device 100 of the second embodiment is a portable one of so small a size that the relative positions of the user 200, the image capturing section 102 and display section 103 can be changed easily.
Next, it will be described in further detail with reference to
It should be noted that not the entire facial image has to be displayed on the display section 103; only a particular part of the face the user wants to view may be displayed there. The location where the facial image 210 is displayed may be shifted either manually or by an automatic face detection function as in the first embodiment described above. Alternatively, after the location where the facial image 210 is displayed has been shifted automatically, the facial image 210 may be slid on the touchscreen panel 104 so as to be fine-tuned to a location where the user 200 can view the image comfortably. However, in the small and portable electronic mirror device 100 of this second embodiment, the magnitude of shift of the image location is limited by the size of the display section 103. For example, if the display section 103 is that of a smartphone with a longitudinal size of 100 mm, the display section 103 is so small that the user 200 is expected to view this electronic mirror device 100 from a distance of approximately 200 mm in order to view the image comfortably. As a grownup male or female usually has a head width of about 150 mm on average, the facial orientation at which the magnitude of shift of the image stays smaller than the 100 mm size of the display section is 20 degrees or less, which satisfies tan θ·(75+200)≤100. If the user wants to view a particular part of his or her face, such as around his or her ears or under his or her chin, with his or her facial orientation changed by 20 degrees or more, then the magnitude of shift will be too large for the size of the electronic mirror device 100 and the image would fall out of the intended display area. For that reason, the operation may be defined so that the facial region image will be displayed within a predetermined range of the display section 103.
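To make the numbers in the preceding paragraph concrete, the following sketch applies the same tan θ·(r+d) shift as in the earlier sketch and caps it at the display size; the capping rule itself is an assumption, since the paragraph only requires that the image stay within a predetermined range of the display section 103:

```python
import math

def clamped_shift_mm(yaw_deg, head_radius_mm=75.0, distance_mm=200.0,
                     display_size_mm=100.0):
    """Requested shift tan(theta)*(r+d), limited to the display size.

    Default values follow the example in the text: a 75 mm head radius,
    a 200 mm viewing distance and a 100 mm display, for which the shift
    stays within the display up to roughly a 20-degree facial rotation.
    """
    requested = math.tan(math.radians(yaw_deg)) * (head_radius_mm + distance_mm)
    return max(-display_size_mm, min(display_size_mm, requested))

print(round(clamped_shift_mm(19.0), 1))  # ~94.7 mm, still within the display
print(round(clamped_shift_mm(35.0), 1))  # ~192.6 mm requested -> capped at 100.0
```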
According to this second embodiment, when the user 200 who is using the small and portable electronic mirror device 100 arranges the image capturing section 102 close to a facial part he or she wants to view, his or her facial orientation will be detected and his or her facial image captured will be shifted to, and displayed at, a location which is away from the image capturing section 102. As a result, the facial image can be displayed at a location where the user 200 can view the image comfortably. That is to say, the user 200 can view the facial image more easily with his or her eyes panned much less than in a normal mirror.
(Embodiment 3)
An electronic mirror device 100 as a third embodiment will be described with reference to
If the user 200 is facing straight forward and directing his or her line of sight toward the display section 103C as shown in
To determine the display location, the angle formed between the display sections 103L and 103C and the angle formed between the display sections 103C and 103R need to be detected in advance. That is why if the angle can be adjusted in multiple stages, the electronic mirror device 100 may include a sensor which senses the angles formed between those pairs of display sections. Alternatively, the electronic mirror device 100 may include a user interface which allows the user 200 to adjust the display location of the facial image to any location he or she likes.
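As an illustration of how the display location could be chosen once those angles are known (the selection rule and the names below are assumptions for the sketch, not a rule stated in this disclosure):

```python
def choose_display(gaze_yaw_deg, left_panel_deg, right_panel_deg):
    """Pick the display section (103L, 103C or 103R) nearest to the gaze.

    Assumes each side panel's hinge angle relative to the centre panel is
    known (from an angle sensor or from user input) and that the best
    panel is the one whose direction, as seen by the user, differs least
    from the line-of-sight direction.  0 degrees means the user is
    looking straight at the centre panel 103C.
    """
    panel_directions = {
        "103L": -abs(left_panel_deg),
        "103C": 0.0,
        "103R": abs(right_panel_deg),
    }
    return min(panel_directions,
               key=lambda name: abs(panel_directions[name] - gaze_yaw_deg))

# Example: side panels hinged at 45 degrees; the user glances 30 degrees right.
print(choose_display(30.0, 45.0, 45.0))  # -> 103R
```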
In the third embodiment described above, the electronic mirror device 100 is supposed to include three flat display planes. However, the display sections 103 do not have to be flat; a flexible display with a curved surface may also be used. If such a curved display screen is adopted, the image to be displayed needs to be retouched according to the curvature of the display screen so as to look as natural to the viewer's eye as possible.
According to the third embodiment, by displaying the user's (200) facial image at the end of his or her line of sight, he or she can easily view his or her facial image, with his or her eyes panned only a little, even from an angle at which he or she could not look at himself or herself in a normal mirror.
In the foregoing description of the first to third embodiments, a touchscreen panel which senses a touch manipulation by the user 200 is supposed to be used as an exemplary user interface 104. However, this is only an example of the present disclosure. Alternatively, the user interface 104 may also receive the user's (200) instruction by detecting his or her gesture or sign. Still alternatively, the user interface 104 may also receive the user's (200) audio instruction by recognizing his or her voice. Yet alternatively, the user interface 104 may even be buttons such as cursor keys.
The processing by the respective components of the electronic mirror device 100 described above may be carried out at least partially by the microcomputer 110. For example, the microcomputer 110 may have the functions of the image data generating section 120, display location shifting section 130, orientation estimating section 140, facial feature detecting section 150, distance estimating section 160, image retouching section 170, and line of sight detecting section 180 and may perform their operations.
Also, the operation of the electronic mirror device 100 described above may be carried out either by hardware circuits or by executing a software program. A computer program which is designed to get such an operation done may be stored in a memory built in the microcomputer 110. Also, such a computer program may be installed from a storage medium (such as an optical disc or a semiconductor memory) in which the program is stored into the electronic mirror device 100 or downloaded through a telecommunications line such as the Internet. Alternatively, the electronic mirror operation described above may also be performed by installing such a computer program in an electronic device such as a smartphone or a tablet terminal.
The technique of the present disclosure can be used particularly effectively in the field of electronic mirrors.
Foreign patent documents cited:
GB 2 281 838 (March 1995)
JP 2000-138926 (May 2000)
JP 2002-290964 (October 2002)
JP 2004-297733 (October 2004)
JP 2004-297734 (October 2004)
JP 2005-005791 (January 2005)
JP 2008-277983 (November 2008)
JP 3154529 (September 2009)
JP 2011-240813 (December 2011)
JP 2012-060547 (March 2012)
WO 2009/153975 (December 2009)