The present disclosure pertains to a display device that changes a display orientation of a display object displayed thereon, and in particular to an improved user interface therefor.
Conventionally, a display device such as a smartphone or tablet terminal is able to detect a change in tilt with respect to the force of gravity using an acceleration sensor or the like, and to display a display object in accordance with the current vertical orientation (e.g., Patent Literature 1). Further, Patent Literature 2 describes a display device that captures a facial image of a user and recognizes the vertical orientation of the facial image in order to adjust the orientation of the display object to match that of the user's face.
[Patent Literature 1]
Japanese Patent Application Publication No. 2008-131616
[Patent Literature 2]
Japanese Patent Application Publication No. 2009-130816
However, the display device described in Patent Literature 1 changes the vertical orientation of the display object to match the display orientation of the display device only when the orientation of the display device is changed. This is a potential problem for the user in that the vertical orientations of the user's face and the display device may not match, such as when the user uses the display device while lying down. Also, the display device described in Patent Literature 2 changes the display such that there is a match between the vertical orientations of the user's face and the display object. However, this is problematic in that the image capture and face recognition processes are performed continuously and consume a large amount of electric power, thus reducing the effective operation time of the battery.
In consideration of these problems, the present disclosure aims to provide a display device capable of displaying the display object with an orientation that matches the orientation of the user's face while reducing electric power consumption.
In order to solve the aforementioned problem, the present disclosure provides a portable display device having a display, operable to switch a display orientation of a display object on the display among a plurality of available display orientations, the display device comprising: a tilt detection unit detecting an angle of rotation as being a display tilt, the angle being measured between a vertical plane component of a tilt vector of a predetermined reference line in the display and a gravitational vector; a facial orientation specification unit recognizing a face of a person facing the display and specifying an upright orientation of the face relative to a current position of the display; a terminal orientation determination unit establishing a plurality of angle ranges within each of which the display orientation remains unchanged depending on the available display orientations, and determining whether or not the display tilt newly detected by the tilt detection unit exceeds a given angle range and enters another angle range among the established angle ranges, the given angle range including the angle detected by the tilt detection unit upon updating a previous display orientation; and a control unit causing the facial orientation specification unit to begin the specification when the terminal orientation determination unit makes an affirmative determination, and updating the display orientation according to the upright orientation of the face so specified.
According to the above-described configuration, the display device of the present disclosure is able to provide a display orientation that matches the vertical orientation of a user's face, while constraining electric power consumption.
A display device 100 is described below as an Embodiment of the present disclosure.
(Outline)
As shown, the display device 100 includes a touch panel 101, a camera 102, and a speaker 103. A user is able to view a display object displayed on the touch panel 101 and to listen to audio from the speaker 103. The display object may be composed of icons, text, images, and any other items displayed on the touch panel 101.
The touch panel 101 has four edges 201, 202, 203, and 204. The display object is displayed with an orientation that is one of A-facing, B-facing, C-facing, and D-facing. A-facing is an orientation where edge 201 is the top edge and edge 203 is the bottom edge, B-facing is an orientation where edge 202 is the top edge and edge 204 is the bottom edge, C-facing is an orientation where edge 203 is the top edge and edge 201 is the bottom edge, and D-facing is an orientation where edge 204 is the top edge and edge 202 is the bottom edge.
The camera 102 is provided in the same plane as a display surface of the touch panel 101 and is able to capture a facial image of the user using the display device 100.
As shown, when the user using the display device 100 goes from a seated position (see
Accordingly, the number of power-consuming facial recognition process iterations is constrained, and the user is able to view the display object displayed in accordance with the user's facial orientation, regardless of whether the user is using the display device 100 while lying down or is otherwise positioned such that the vertical upward orientation of the display device 100 and the facial orientation of the user do not match.
(Configuration)
The following describes the display device 100 configuration.
As shown, the display device 100 includes a terminal orientation determiner 110, a tilt detector 111, a facial orientation specifier 120, a controller 130, a display unit 140, an input unit 150, and a memory 160. The display device 100 includes a processor and memory, and thus realizes the functions of the terminal orientation determiner 110, the facial orientation specifier 120, the controller 130, the display unit 140, and the input unit 150 by having the processor execute a program stored in memory.
The tilt detector 111 is an acceleration sensor provided at the upper left of the display device 100, measuring acceleration with respect to three orthogonal axes. The three axes are as given in
The terminal orientation determiner 110 determines whether or not a tilt currently detected by the tilt detector 111 has changed by a predetermined value or more since an update of the current display orientation for the display object. The tilt is defined as the angle between the x-y component of the pull of gravity on the touch panel 101 (also termed a gravitational vector) and the y-axis (also termed a reference line). The tilt also signifies the tilt of the touch panel 101 itself. In the present disclosure, the touch panel 101 and the display device 100 are indivisible. As such, the following description refers to the tilt of the display device 100 for convenience. Four angle ranges are defined to correspond to the A-facing, B-facing, C-facing, and D-facing orientations, namely a range of 315° to 45°, a range of 45° to 135°, a range of 135° to 225°, and a range of 225° to 315°, respectively. When the current orientation is estimated to have become difficult to view while the user's posture is assumed to remain unchanged, the terminal orientation determiner 110 beneficially makes an affirmative determination (i.e., a determination that the tilt has changed by the predetermined value or more since the update) to the effect that the tilt detected by the tilt detector 111 has changed from one angle range to another. Also, once the terminal orientation determiner 110 has made the affirmative determination, the later-described facial orientation specifier 120 performs a facial orientation specification process that is high in energy consumption. As such, the affirmative determination is beneficially not made frequently, even when the tilt is unstable and approaches the limit of an angle range due to slight movements.
The predetermined value is set to 70°. The midpoint of the angle range that includes the tilt detected by the tilt detector 111 upon updating the previous display object orientation is taken as a reference value for the tilt prior to changing, and the terminal orientation determiner 110 makes the affirmative determination when the change is 70° or more from the reference value.
The facial orientation specifier 120 includes a capture unit 121 and a face recognition unit 122, and specifies the upright facial orientation of the user using the display device 100.
The capture unit 121 includes the camera 102 and operates the camera 102 by supplying electric power thereto in order to capture a facial image of the user. The capture unit 121 captures the facial image of the user for facial recognition purposes only when the terminal orientation determiner 110 determines that the tilt has changed by the predetermined value or more.
The face recognition unit 122 recognizes the eyes and nose in the facial image of the user captured by the capture unit 121 with reference to a face recognition template stored in advance. The vertical axis of the template matches the vertical axis of the face in the template. Elements of the face cannot be recognized when the vertical axis of the face in the image captured by the capture unit 121 greatly differs from the vertical axis of the template. Therefore, the face recognition unit 122 rotates the template orientation to be one of A-facing, B-facing, C-facing, and D-facing when performing the face recognition. The vertical axis of the captured image corresponds to the A-facing orientation of the touch panel 101.
The facial orientation specifier 120 specifies the upright orientation of the face according to the orientation of the template when the face recognition unit 122 recognizes the nose and eyes. For example, when the face recognition unit 122 recognizes the nose and eyes while the template orientation is A-facing, the facial orientation of the user is specified as being A-facing. The facial orientation specifier 120 notifies the controller 130 of the specified orientation. When the face recognition unit 122 fails to recognize the eyes and nose, the facial orientation specifier 120 notifies the controller 130 to such effect.
The controller 130 controls all operations pertaining to display by the display device 100.
Specifically, the controller 130 rotates the display object such that the vertical axis of the display object matches the vertical axis specified by the facial orientation specifier 120, and sends instructions to the display unit 140 so as to shrink or magnify the display objects on the touch panel 101 to an appropriate size. When the facial orientation specifier 120 fails to specify the upright facial orientation, the controller 130 also controls the display in response to the tilt detected by the tilt detector 111.
The display unit 140 includes the touch panel 101, receives instructions from the controller 130, and displays the display object on the touch panel 101. The input unit 150 detects contact made on the touch panel 101 and receives input from the user.
The memory 160 is an area storing later-described terminal orientation information 10 and facial orientation information 20.
(Data)
The terminal orientation information 10 is described first.
The terminal orientation information 10 is information indicating a midpoint of the angle range that includes the tilt detected by the tilt detector 111, and is stored as one of 0°, 90°, 180°, and 270°. The terminal orientation information 10 is updated by the controller 130 once the terminal orientation determiner 110 makes the affirmative determination. Specifically, the value of the terminal orientation information 10 is updated according to the tilt detected by the tilt detector 111 to be 0° for an angle of 315° to 45°, 90° for an angle of 45° to 135°, 180° for an angle of 135° to 225°, and 270° for an angle of 225° to 315°.
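The mapping from a detected tilt to the stored midpoint value described above can be sketched as follows (a Python sketch for illustration only; the function name `terminal_orientation` is not part of the disclosure):

```python
def terminal_orientation(tilt_deg: float) -> int:
    """Map a detected tilt (in degrees) to the midpoint of the angle
    range containing it: 0, 90, 180, or 270, corresponding to the
    A-facing, B-facing, C-facing, and D-facing orientations."""
    t = tilt_deg % 360
    if t >= 315 or t < 45:
        return 0    # A-facing range: 315 deg to 45 deg
    if t < 135:
        return 90   # B-facing range: 45 deg to 135 deg
    if t < 225:
        return 180  # C-facing range: 135 deg to 225 deg
    return 270      # D-facing range: 225 deg to 315 deg
```

For example, a detected tilt of 92° would be stored as 90°, and a tilt of 320° as 0°.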
The determination process by the terminal orientation determiner 110 uses a pre-change tilt as the reference value, takes a tilt value detected by the tilt detector 111 for the previous display object orientation update as n, and takes the terminal orientation information 10 as f(n), expressed as follows.

f(n)=0° (315°≤n<360°, 0°≤n<45°), f(n)=90° (45°≤n<135°), f(n)=180° (135°≤n<225°), f(n)=270° (225°≤n<315°) (Math. 1)
The facial orientation information 20 is described next.
The facial orientation information 20 is information indicating the current display orientation of the display object, and is stored as one of A-facing, B-facing, C-facing, and D-facing. The facial orientation information 20 is updated by the controller 130 after the display is updated through the facial orientation specification process, to reflect the display orientation at the current time.
(Operations)
The following describes the operations of the display device 100.
As shown, once the display device 100 is activated, a background image, icons, and so on are first displayed in accordance with a predetermined initial display used after start-up (step S1). Afterward, when an input operation is received from the user (Yes in step S2), processing corresponding to the input operation is performed, and the display is updated (step S3). For example, when the user inputs a text string by touching the touch panel 101, the text string is displayed. Next, the terminal orientation determiner 110 acquires the terminal orientation information 10 from the memory 160 (step S4) and uses the tilt indicated in the terminal orientation information 10 to determine whether or not the tilt currently detected by the tilt detector 111 has changed by the predetermined value or more (step S5). The terminal orientation determiner 110 makes the determination by first subtracting the tilt detected by the tilt detector 111 from the tilt indicated in the terminal orientation information 10 stored in the memory 160 and then taking the absolute value P of the result. The absolute value P is then compared to a value obtained by subtracting P from 360, and the change in tilt S is defined as the smaller of the two values.
The absolute value P is expressed as follows, where the tilt indicated by the terminal orientation information 10 is f(n) and the tilt currently detected by the tilt detector 111 is m.
P=|f(n)−m| (Math. 2)
Also, the change in tilt S is expressed as follows, where the Min function returns the smaller of its two arguments.

S=Min(P, 360−P) (Math. 3)
The terminal orientation determiner 110 then determines whether or not the change in tilt S is equal to or greater than the predetermined value.
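The determination of steps S4 and S5 can be sketched as follows (a Python sketch; the names `tilt_change` and `needs_update` are illustrative, with the 70° threshold of the Embodiment):

```python
def tilt_change(reference_deg: float, current_deg: float) -> float:
    """Change in tilt S: the smaller of P = |f(n) - m| and 360 - P
    (Math. 2 and Math. 3), i.e. the shortest angular distance."""
    p = abs(reference_deg - current_deg)
    return min(p, 360 - p)

def needs_update(reference_deg: float, current_deg: float,
                 threshold: float = 70) -> bool:
    """Affirmative determination: the change in tilt is equal to or
    greater than the predetermined value (70 degrees)."""
    return tilt_change(reference_deg, current_deg) >= threshold
```

With the terminal orientation information reading 90° and a detected tilt of 3°, `tilt_change` yields 87, so the determination is affirmative; with a reference of 180° and a tilt hovering near 135°, it yields 45, which stays below the threshold.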
According to the above, the display orientation of the display object is updated from A-facing to C-facing when, for example, the calculation of the terminal orientation information 10 and the change in tilt S reveals that the tilt has changed from 0° to 135°. Once the terminal orientation information 10 has been updated to read 180°, the terminal orientation determiner 110 determines that the tilt has not changed by the predetermined value or more even when the tilt of the display device 100 hovers unstably around 135°, because the change in tilt relative to the 180° indicated in the terminal orientation information 10 remains less than the predetermined value of 70°. As a result, frequent instances of the power-draining processes performed by the facial orientation specifier 120, namely capturing the facial image of the user and performing facial recognition, are avoidable.
When the terminal orientation determiner 110 determines that the tilt has changed by the predetermined value or more (Yes in step S5), the controller 130 updates the terminal orientation information 10 stored in the memory 160 (step S6), and makes a notification to the facial orientation specifier 120 indicating that the terminal orientation determiner 110 has made the affirmative determination. Upon receiving the notification, the facial orientation specifier 120 performs the facial orientation specification process (step S7). The details of the facial orientation specification process are given later.
The controller 130 rotates the display object in accordance with the upright facial orientation specified in the facial orientation specification process, shrinks or magnifies the display object as appropriate for the touch panel 101, and transmits an instruction to the display unit 140. Upon receiving the instruction, the display unit 140 displays the display object on the touch panel 101 (step S8). When the facial orientation specifier 120 fails to specify the facial orientation, the facial orientation is made to match the vertical orientation of the display device 100, such that the display object faces upward with the top edge of the display device 100 being defined as upward.
Then, the controller 130 updates the facial orientation information 20 with the display orientation of the display object as updated in step S8 (step S9).
The following describes the details of the facial orientation specification process of step S7, with reference to
First, the facial orientation specifier 120 activates the camera 102 and captures a facial image of the user (step S31). Next, face recognition is performed sequentially by rotating the template to be A-facing, B-facing, C-facing, and D-facing, until the eyes and nose are recognized in the captured facial image. The template is first set to face upward in the orientation indicated by the facial orientation information 20, i.e., the orientation in which the top of the display object faced upward prior to the affirmative determination by the terminal orientation determiner 110 (step S32), and face recognition is performed (step S33). When the nose and eyes cannot be detected (No in step S33), the edge of the display device 100 currently serving as the top edge is detected using the tilt in the terminal orientation information 10, the template is rotated to match the detected top edge, i.e., so as to be oriented toward the current top edge of the display device 100 (step S34), and face recognition is performed (step S35).
When the eyes and nose still cannot be detected, the template is rotated sequentially into the remaining two orientations (steps S36 and S38) and face recognition is performed after each rotation (steps S37 and S39), until face recognition succeeds.
In step S34, when there is no match between the orientation of the current top edge of the display device 100 and the orientation in which the display object faced upward prior to the determination by the terminal orientation determiner 110 that the tilt has changed by the predetermined value or more, the template is rotated sequentially into the remaining three orientations (steps S34, S36, and S38) and face recognition is performed after each rotation (steps S35, S37, and S39), until face recognition succeeds.
When the face recognition unit 122 succeeds at face recognition (Yes in one of step S33, S35, S37, and S39), the facial orientation specifier 120 specifies the vertical orientation of the template in which the eyes and nose are detected as being the upright orientation of the user's face (step S40), and notifies the controller 130 accordingly. When the face recognition unit 122 fails to recognize the face (No in step S39), the controller 130 is notified that recognition has failed.
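The order in which the template is rotated across steps S32 through S39 can be sketched as follows (a Python sketch; the function name is illustrative): the previous display facing is tried first, then the facing of the current top edge if it differs, then the remaining orientations.

```python
def recognition_order(previous_facing: str, top_edge_facing: str) -> list:
    """Template rotation order for face recognition:
    1) the orientation indicated by the facial orientation information
       (the display facing prior to the affirmative determination),
    2) the orientation of the current top edge, if different,
    3) the remaining orientations, in A/B/C/D order."""
    facings = ["A", "B", "C", "D"]
    order = [previous_facing]
    if top_edge_facing != previous_facing:
        order.append(top_edge_facing)
    order += [f for f in facings if f not in order]
    return order
```

For example, with a previous facing of C and a current top edge corresponding to A, the template would be tried in the order C, A, B, D.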
The following describes specific operations of the display device 100 with reference to
In this example, the user first uses the display device 100 as shown in
First, when the usage condition changes from that of
P=|90−3|=87
Change in tilt S=Min(87,273)=87≧70
Accordingly, the terminal orientation determiner 110 determines that the tilt has changed by more than the predetermined value. The terminal orientation determiner 110 notifies the controller 130 of the determination results. Upon receiving the notification, the controller 130 updates the terminal orientation information 10 to 0°, and notifies the facial orientation specifier 120 that the terminal orientation determiner 110 has detected a change in tilt greater than the predetermined value. Upon receiving the notification, the facial orientation specifier 120 captures a facial image of the user. The facial image of the user is A-facing, as shown in
Next, when the usage condition changes from that of
P=|0−92|=92
Change in tilt S=Min(92,268)=92≧70
The terminal orientation determiner 110 determines that the tilt has changed by more than the predetermined value. Subsequently, the controller 130 updates the terminal orientation information 10 to 90°, and notifies the facial orientation specifier 120 to the effect that the terminal orientation determiner 110 has made the affirmative determination. Upon receiving the notification, the facial orientation specifier 120 captures a facial image of the user, rotates the template to the A-facing orientation indicated by the facial orientation information 20, and performs face recognition. At this time, the captured image of the user's face is A-facing relative to the display device 100. Thus, the face is recognized and the facial orientation specifier 120 specifies the facial orientation as being A-facing. The controller 130 rotates the display object so as to be A-facing as specified by the facial orientation specifier 120, shrinks or magnifies the display object to fit a display range of the touch panel 101, and transmits an instruction to the display unit 140. Upon receiving the instruction, the display unit 140 makes the display on the touch panel 101.
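The arithmetic in the two scenarios above can be checked with a short sketch (Python; the values are taken directly from the example):

```python
# Scenario 1: terminal orientation information reads 90 deg,
# currently detected tilt is 3 deg.
p1 = abs(90 - 3)        # P = 87
s1 = min(p1, 360 - p1)  # S = Min(87, 273) = 87; 87 >= 70, affirmative

# Scenario 2: terminal orientation information reads 0 deg,
# currently detected tilt is 92 deg.
p2 = abs(0 - 92)        # P = 92
s2 = min(p2, 360 - p2)  # S = Min(92, 268) = 92; 92 >= 70, affirmative
```

In both scenarios the change in tilt S meets the 70° threshold, so the facial orientation specification process is triggered.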
(Supplement)
Although the present disclosure is described above using the Embodiment of the display device as an example, no limitation is intended. The following variations are of course possible.
Also, in the above-described Embodiment, when the face recognition unit fails to recognize the face, the controller displays the display object such that the upright orientation of the face and the upward orientation of the display device match. However, no limitation is intended. Provided that the display object is displayed despite the absence of face orientation specification, the facial orientation specifier may specify the upright orientation of the display device as the vertical orientation of the face when the face recognition unit fails to recognize the face.
The following describes a further Embodiment of the present disclosure in terms of the display device configuration and the effects thereof.
Further, the facial orientation specification unit includes a face recognition unit recognizing facial elements in a captured facial image by referencing a facial element recognition template stored in advance and having a designated orientation, and the face recognition unit detects a current top edge of the display from the angle detected by the tilt detection unit, initially designates the detected edge as the upward orientation of the template, and then references the template. According to this configuration, the number of face recognition iterations is reduced to cases where the user changes the tilt of the display device and the like.
The display device pertaining to the present disclosure constrains electric power consumption while displaying a display object in an orientation corresponding to the upright orientation of a user's face relative to the display. Thus, the display device is applicable to display orientation switching functionality and similar.
Number | Date | Country | Kind |
---|---|---|---|
2011-047948 | Mar 2011 | JP | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/JP2012/001138 | 2/21/2012 | WO | 00 | 11/20/2012 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2012/120799 | 9/13/2012 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
20050104848 | Yamaguchi et al. | May 2005 | A1 |
20060265442 | Palayur | Nov 2006 | A1 |
20100125816 | Bezos | May 2010 | A1 |
20100171693 | Tamura et al. | Jul 2010 | A1 |
20110032220 | Shih et al. | Feb 2011 | A1 |
20110193985 | Inoue | Aug 2011 | A1 |
20120029389 | Amiot et al. | Feb 2012 | A1 |
20120057064 | Gardiner et al. | Mar 2012 | A1 |
Number | Date | Country |
---|---|---|
1860433 | Nov 2006 | CN |
101465116 | Jun 2009 | CN |
101783133 | Jul 2010 | CN |
2005-100084 | Apr 2005 | JP |
2008-131616 | Jun 2008 | JP |
2009-130816 | Jun 2009 | JP |
2011-34029 | Feb 2011 | JP |
2011-138449 | Jul 2011 | JP |
2011-203860 | Oct 2011 | JP |
2011-221094 | Nov 2011 | JP |
2011104837 | Sep 2011 | WO |
Entry |
---|
Kimberly Tuck, “Embedded Orientation Detection Using the MMA8450Q”, published in Sep. 2010. pp. 3-7. |
International Search Report issued May 22, 2012 in corresponding International Application No. PCT/JP2012/001138. |
Chinese Office Action issued Nov. 15, 2014, in Chinese Application No. 201280001501.0 (with partial English translation). |
Number | Date | Country | |
---|---|---|---|
20130069988 A1 | Mar 2013 | US |