The present invention relates to a head-mounted display.
In head-mounted displays worn on the head of a user for viewing video or the like, as disclosed in PTL 1 to 4, techniques are known for detecting the position of the eyes or the line-of-sight direction of the user from an image of the user's eyes photographed by a camera.
[PTL 1] JP 2015-508182T; [PTL 2] JP 2002-301030A; [PTL 3] JP 1997-127459A; [PTL 4] JP 2012-515579T
When photographing the eyes of a user wearing a head-mounted display, a camera is installed in a location (for example, above or below the optical system) that lies in the space between the user's eyes and the optical system disposed between the display unit and the eyes, and that does not obstruct the user's view of the display unit.
In a head-mounted display worn on the head, however, the distance between the optical system and the eyes of the wearing user is small. Moreover, in practice the optical system is preferably made large in order to broaden the user's field of view. As a result, the photographing angle of the camera with respect to the user's eyes becomes shallow, and it becomes difficult to photograph the eyes because eyelids, eyelashes, or the like get in the way.
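The geometric constraint described above can be illustrated with a small calculation. The sketch below is not part of the invention; the right-triangle model and all dimensions are assumptions made purely for illustration.

```python
import math

def photographing_angle_deg(eye_relief_mm: float, lens_half_height_mm: float) -> float:
    """Angle (in degrees) at which a camera mounted at the lens edge views
    the eye, measured from the plane of the eye opening. A smaller value
    means a shallower, more grazing view that eyelids and eyelashes can
    block. Simple right-triangle model; dimensions are illustrative."""
    return math.degrees(math.atan2(eye_relief_mm, lens_half_height_mm))

# Shrinking the eye relief or enlarging the lens makes the angle shallower.
wide_lens = photographing_angle_deg(eye_relief_mm=15.0, lens_half_height_mm=25.0)
small_lens = photographing_angle_deg(eye_relief_mm=15.0, lens_half_height_mm=10.0)
```

Under this model, moving the optics closer to the eye or enlarging them always reduces the photographing angle, which is exactly the trade-off between eye relief and field of view described above.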
In view of the foregoing, it is an object of the present invention to provide a head-mounted display in which the eyes of the wearing user can be photographed easily.
In order to solve the above-described problem, a head-mounted display worn by a user according to the present invention includes a display unit disposed in front of the eyes of the user, an optical system disposed between the display unit and the eyes of the user, and an imaging unit disposed between the display unit and the optical system, the imaging unit imaging the display unit in which an image of the eyes of the user is reflected.
In a mode of the present invention, the head-mounted display may further include a light-irradiation unit that irradiates light onto the eyes of the user, and the imaging unit may image the image of the eyes of the user that is reflected in the display unit when the light irradiated by the light-irradiation unit is reflected by the eyes of the user.
In this mode, the light-irradiation unit may be disposed between the optical system and the eyes of the user and may directly irradiate light onto the eyes of the user.
Further, the light-irradiation unit may be disposed between the display unit and the optical system and may irradiate, onto the eyes of the user, reflected light produced when the light it emits is reflected by the display unit.
Further, the light-irradiation unit may irradiate infrared light, and the imaging unit may be an infrared camera capable of imaging the infrared light.
Further, in a mode of the present invention, the head-mounted display may further include a line-of-sight detection unit that detects a line-of-sight direction of the user on the basis of an image including the image of the eyes of the user imaged by the imaging unit.
Hereinafter, an embodiment of the present invention will be described with reference to the accompanying drawings.
As illustrated in
The control unit 11 includes a program control device such as a central processing unit (CPU), and executes various types of information processing in accordance with programs stored in the storage unit 12.
The storage unit 12 includes a memory device such as a random access memory (RAM) or read-only memory (ROM), and stores programs or the like executed by the control unit 11. Further, the storage unit 12 also functions as a work memory of the control unit 11.
The input/output unit 13 is an input/output interface such as a high-definition multimedia interface (HDMI) (registered trademark) port or a universal serial bus (USB) port, for example.
The display unit 14 is a display such as a liquid crystal display or an organic electroluminescence (EL) display, for example. The display unit 14 according to the present embodiment displays, for example, a video or the like expressed by video signals received from an entertainment apparatus, such as a home game machine, a digital versatile disc (DVD) player, or a Blu-ray (registered trademark) player, connected via the input/output unit 13. In addition, the display unit 14 according to the present embodiment may be capable of displaying a three-dimensional video.
The light-irradiation unit 15 is an optical device such as a light-emitting diode (LED). The light-irradiation unit 15 according to the present embodiment is assumed to irradiate infrared light, that is, light in a wavelength band outside the visible band; however, it is not limited to this example and may irradiate light in the visible band.
The imaging unit 16 is a camera such as a digital camera for generating an image in which an object is imaged, for example. The imaging unit 16 according to the present embodiment is assumed to be an infrared camera capable of imaging infrared light; however, the imaging unit 16 is not limited to this example, and may be a camera capable of imaging visible light. Further, as illustrated in
The control unit 11 further detects the position of the eyes of the user or the line-of-sight direction of the user from an image including the image 40 of the eyes of the user imaged by the imaging unit 16. The control unit 11 is assumed to detect the position of the eyes or the line-of-sight direction by using a known line-of-sight detection technique. For example, the control unit 11 detects the position of the eyes of the user or the line-of-sight direction of the user on the basis of the positional relationship between the pupils of the user and a reference point obtained from infrared light reflected by the corneas of the user. Alternatively, the control unit 11 may detect the position of the eyes or the line-of-sight direction of the user on the basis of the position of the irises or pupils. In addition, the imaging unit 16 may itself include a function of detecting the position of the eyes of the user or the line-of-sight direction of the user.
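As one concrete illustration of the known pupil-center/corneal-reflection approach mentioned above, the sketch below estimates a gaze offset from the difference between the pupil center and the corneal glint of an infrared light source. The linear-gain model and all coordinate values are assumptions for illustration, not details taken from the embodiment.

```python
def estimate_gaze_offset(pupil_center, glint_center, gain=(1.0, 1.0)):
    """Pupil-center / corneal-reflection (PCCR) sketch: the vector from the
    corneal glint (the reference point) to the pupil center varies roughly
    linearly with gaze angle. `gain` is a per-axis scale that a real system
    would obtain by calibration; the identity gain here is an assumption."""
    px, py = pupil_center
    gx, gy = glint_center
    return ((px - gx) * gain[0], (py - gy) * gain[1])

# Looking straight at the light source puts the glint at the pupil center.
centered = estimate_gaze_offset((320.0, 240.0), (320.0, 240.0))
# A pupil shifted right of the glint yields a positive horizontal offset.
right = estimate_gaze_offset((326.0, 240.0), (320.0, 240.0))
```

A real implementation would first locate the pupil and glint in the infrared image (for example, by thresholding the dark pupil and the bright glint) before applying such a mapping.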
As described above, the position of the eyes of the user or the line-of-sight direction of the user detected from an image including the image 40 of the eyes of the user imaged by the imaging unit 16 is applicable to calibration of the HMD 10 or to a line-of-sight tracking function. The calibration of the HMD 10 adjusts the display position of a video on the display unit 14 at the time of starting up the HMD 10, at the time of starting up an application, or the like; for example, the display position can be adjusted by using the detected position of the eyes of the user so that the video is displayed on the display unit 14 at the position of the eyes of the user. In the line-of-sight tracking function of the HMD 10, for example, line-of-sight tracking can be performed by using the detected line-of-sight direction of the user.
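A minimal sketch of the calibration step described above, assuming pixel coordinates and a simple translation of the rendered image; lens distortion and per-eye rendering, which a real HMD would have to handle, are deliberately omitted.

```python
def display_offset(detected_eye_px, display_center_px):
    """Translation (in display pixels) that re-centers the rendered video
    on the detected eye position. The coordinate conventions here are
    assumptions made for illustration."""
    ex, ey = detected_eye_px
    cx, cy = display_center_px
    return (ex - cx, ey - cy)

# Eye detected 12 px right of and 4 px above the display center:
offset = display_offset((652.0, 356.0), (640.0, 360.0))
```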
In addition, the image including the image 40 of the eyes of the user imaged by the imaging unit 16 may be transmitted to an external information processing apparatus via the input/output unit 13. In this case, the position of the eyes of the user or the line-of-sight direction of the user is detected by the external information processing apparatus using the received image including the image 40 of the eyes of the user.
In the first example of the configuration of the HMD 10 according to the present embodiment illustrated in
In the HMD 10 according to the present embodiment illustrated in
Subsequently, a contrast example with the HMD 10 according to the present embodiment illustrated in
In the HMD 110 according to the contrast example illustrated in
Herein, the HMD 10 according to the present embodiment illustrated in
First, in the HMD 10 according to the present embodiment, the imaging unit 16 is preferably disposed in a location from which it is easy to image the image 40 of the eyes of the user reflected in the display unit 14 and which does not prevent the user from viewing a video displayed on the display unit 14. In the HMD 10 according to the present embodiment as illustrated in
Subsequently, in the HMD 110 according to the contrast example, the imaging unit 116 is preferably disposed in a location in which it is easy to image the eyes 130 of the user and a video displayed on the display unit 114 is not prevented from being viewed by the user. In the HMD 110 according to the contrast example as illustrated in
As described above, in
In the above-described example, the vertical distance H1 in the HMD 10 according to the present embodiment is approximately the same as the vertical distance H2 in the HMD 110 according to the contrast example, and the vertical distance L1 in the HMD 10 according to the present embodiment is longer than the vertical distance L2 in the HMD 110 according to the contrast example; however, the present embodiment is not limited to this example. For example, the vertical distance H1 in the HMD 10 according to the present embodiment may be larger than half of the height of the optical system 17. Further, the vertical distance L1 in the HMD 10 according to the present embodiment may be shorter than the distance X or may be shorter than the distance Y. Even in this case, since the imaging unit 16 according to the present embodiment images the display unit 14 in which the image 40 of the eyes of the user is reflected, the eyelashes or eyelids of the user do not obstruct the imaging as they do in the HMD 110 according to the contrast example, and therefore it is easy to image the image 40 of the eyes of the user.
The imaging unit 16 further images the display unit 14 in which the image 40 of the eyes of the user formed by the infrared light is reflected; the image 40 of the eyes of the user can therefore be imaged even while a video or the like received from the entertainment apparatus is displayed on the display unit 14. That is, by irradiating infrared light onto the eyes of the user, the image 40 of the eyes of the user can be reflected in the display unit 14 without obstructing the video displayed on the display unit 14. Accordingly, even while the user views the video or the like received from the entertainment apparatus, the control unit 11 can detect the position of the eyes of the user or the line-of-sight direction of the user and perform the calibration or the line-of-sight tracking. For example, a line-of-sight input operation can be accepted as a user input operation while the video or the like received from the entertainment apparatus is displayed on the display unit 14, and the entertainment apparatus can perform processing relating to the input operation according to the detected position of the eyes or line-of-sight direction of the user.
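The line-of-sight input operation mentioned above can be sketched as a simple hit test that maps the detected gaze point to an on-screen control. The region names and rectangle layout below are hypothetical, chosen only to make the idea concrete.

```python
def gaze_hit_test(gaze_px, regions):
    """Return the name of the first region containing the gaze point, or
    None if no region contains it. `regions` maps a control name to an
    (x, y, width, height) rectangle in display pixels; all names and
    geometry are assumptions for illustration."""
    gx, gy = gaze_px
    for name, (x, y, w, h) in regions.items():
        if x <= gx < x + w and y <= gy < y + h:
            return name
    return None

menu = {"play": (0, 0, 100, 40), "pause": (110, 0, 100, 40)}
selected = gaze_hit_test((130, 20), menu)
```

A practical system would additionally require the gaze to dwell on a region for some time, or combine the gaze with a button press, to avoid triggering controls the user merely glances at.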
Further, in the HMD 10 according to the present embodiment illustrated in
Further, in the HMD 10 according to the present embodiment illustrated in
In the HMD 10 according to the present embodiment illustrated in
In addition, the configuration of the HMD 10 illustrated in
In the HMD 10 according to the modification example illustrated in
In the HMD 10 according to the modification example illustrated in
Number | Date | Country | Kind |
---|---|---|---|
2015-158976 | Aug 2015 | JP | national |
This is a continuation application of U.S. patent application Ser. No. 15/747,269, accorded a filing date of Jan. 24, 2018, which is a national stage application of International Application No. PCT/JP2016/072967, filed Aug. 4, 2016, which claims priority to JP Application No. 2015-158976, filed Aug. 11, 2015, the entire disclosures of which are hereby incorporated by reference.
Number | Date | Country |
---|---|---|
102928979 | Feb 2013 | CN |
103809687 | May 2014 | CN |
104793337 | Jul 2015 | CN |
106797422 | May 2017 | CN |
1131663 | Sep 2001 | EP |
089205 | Jan 1996 | JP |
09127459 | May 1997 | JP |
11202256 | Jul 1999 | JP |
2002301030 | Oct 2002 | JP |
2002328330 | Nov 2002 | JP |
2006053321 | Feb 2006 | JP |
2008241822 | Oct 2008 | JP |
2012515579 | Jul 2012 | JP |
2015508182 | Mar 2015 | JP |
2015530205 | Oct 2015 | JP |
5824697 | Nov 2015 | JP |
0026713 | May 2000 | WO |
2015198477 | Dec 2015 | WO |
2016103525 | Jun 2016 | WO |
Number | Date | Country | |
---|---|---|---|
20200218891 A1 | Jul 2020 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15747269 | US | |
Child | 16821096 | US |