The present invention relates to a display control technology, and more particularly, to a display control apparatus and a display control method for controlling display on a head-mounted display.
Games are played by wearing a head-mounted display connected to a game console on the head, watching a screen displayed on the head-mounted display, and manipulating a controller or other device. With an ordinary stationary display, the user's field-of-view range spreads outside the display screen, which may make it impossible to focus one's attention on the display screen and may result in an insufficient sense of immersion. In that respect, when a head-mounted display is worn, the user cannot see anything other than the image appearing on the head-mounted display, which increases the sense of immersion in the image world and further enhances the entertaining nature of the game.
The inventor recognized the need for a more convenient display control technology to ensure that games using a head-mounted display can be enjoyed by more user segments.
In order to solve the above problem, a display control apparatus according to a mode of the present invention includes a display control section that generates a virtual space image by specifying a viewpoint position and a direction of line of sight and displays the image on a head-mounted display. The display control section can specify a plurality of positions in the virtual space as viewpoint positions and can change the viewpoint position to a position determined from among the plurality of positions in accordance with an attitude of the head-mounted display. When the viewpoint position is changed, the display control section specifies, as the direction of line of sight, the direction in which a first position in the virtual space is seen from the changed viewpoint position.
Another mode of the present invention is a display control apparatus. This apparatus includes a display control section and a viewpoint position control section. The display control section generates a virtual space image by specifying a viewpoint position and a direction of line of sight and displays the image on a head-mounted display. The viewpoint position control section moves the viewpoint position in accordance with a position of the head-mounted display. The viewpoint position control section moves the viewpoint position to a greater extent when the head-mounted display is moved horizontally than when the head-mounted display is moved vertically.
It should be noted that arbitrary combinations of the above components and conversions of expressions of the present invention between method, apparatus, system, program, and so on are also effective as modes of the present invention.
According to the present invention, it is possible to improve convenience of head-mounted display users.
In the present embodiment, a description will be given of a display technology using a head-mounted display (HMD). A head-mounted display is a display apparatus worn on a user's head in such a manner as to cover his or her eyes so that the user can view still images and videos appearing on a display screen provided in front of the user's eyes. What appears on the head-mounted display may be content such as movies and television (TV) programs. In the present embodiment, however, a description will be given of an example in which a head-mounted display is used as a display apparatus for displaying game images.
The gaming apparatus 10 executes a game program based on an instruction input supplied from the input apparatus 20 or the head-mounted display 100, a position or attitude of the input apparatus 20 or the head-mounted display 100, and so on, generates a first game image and transmits the image to the head-mounted display 100, and generates a second game image and transmits the image to the display apparatus 12.
The head-mounted display 100 displays the first game image generated by the gaming apparatus 10. The head-mounted display 100 also transmits, to the gaming apparatus 10, information related to user input to the input apparatus provided on the head-mounted display 100. The head-mounted display 100 may be connected to the gaming apparatus 10 with a wired cable. Alternatively, the head-mounted display 100 may be connected wirelessly through a wireless local area network (LAN) or other means.
The display apparatus 12 displays the second game image generated by the gaming apparatus 10. The display apparatus 12 may be a TV having a display and a speaker. Alternatively, the display apparatus 12 may be a computer display or other apparatus.
The input apparatus 20 has a function to transmit user instruction input to the gaming apparatus 10 and is configured as a wireless controller capable of wirelessly communicating with the gaming apparatus 10 in the present embodiment. The input apparatus 20 and the gaming apparatus 10 may establish wireless connection using the Bluetooth (registered trademark) protocol. It should be noted that the input apparatus 20 is not limited to a wireless controller and may be a wired controller connected to the gaming apparatus 10 via a cable.
The input apparatus 20 is driven by batteries and is configured to have a plurality of buttons for making instruction input so as to progress the game. When the user operates a button on the input apparatus 20, instruction input resulting from the operation is sent to the gaming apparatus 10 through wireless communication.
The imaging apparatus 14 is a video camera that includes, for example, a charge-coupled device (CCD) imaging device or a complementary metal-oxide semiconductor (CMOS) imaging device and generates, by imaging a real space at a given interval, a frame image for each interval. The imaging apparatus 14 is connected to the gaming apparatus 10 via a universal serial bus (USB) or other interface. An image captured by the imaging apparatus 14 is used by the gaming apparatus 10 to derive the positions and attitudes of the input apparatus 20 and the head-mounted display 100. The imaging apparatus 14 may be a ranging camera or a stereo camera capable of acquiring a distance. In this case, the imaging apparatus 14 makes it possible to acquire the distance between the imaging apparatus 14 and the input apparatus 20 or the head-mounted display 100.
In the game system 1 of the present embodiment, the input apparatus 20 and the head-mounted display 100 have a light-emitting section configured to emit light in a plurality of colors. During a game, the light-emitting section emits light in the color specified by the gaming apparatus 10 and is imaged by the imaging apparatus 14. The imaging apparatus 14 images the input apparatus 20, generates a frame image, and supplies the image to the gaming apparatus 10. The gaming apparatus 10 acquires the frame image and derives position information of the light-emitting section in the real space from the position and size of the image of the light-emitting section in the frame image. The gaming apparatus 10 treats the position information as a game operation instruction and reflects it in game processing, including controlling the action of a player's character.
Also, the input apparatus 20 and the head-mounted display 100 each have an acceleration sensor and a gyrosensor. Sensor detection values are sent to the gaming apparatus 10 at a given interval, and the gaming apparatus 10 acquires the detection values and derives attitude information of the input apparatus 20 and the head-mounted display 100 in the real space. The gaming apparatus 10 treats the attitude information as a game operation instruction and reflects it in game processing.
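The embodiment does not specify how the two sensors' detection values are combined into attitude information; the following is a minimal sketch of one common approach, a complementary filter that blends short-term gyrosensor integration with the drift-free gravity direction from the acceleration sensor. The function name, axis conventions, and blending factor are illustrative assumptions, not details taken from the embodiment.

```python
import numpy as np

def update_attitude(pitch, roll, gyro_rates, accel, dt, alpha=0.98):
    """Fuse gyrosensor and acceleration sensor readings into pitch/roll.

    gyro_rates: (pitch_rate, roll_rate) in rad/s from the gyrosensor.
    accel:      (ax, ay, az) in m/s^2 from the acceleration sensor.
    dt:         sampling interval in seconds.
    """
    # Integrate angular velocity for a responsive short-term estimate.
    pitch_gyro = pitch + gyro_rates[0] * dt
    roll_gyro = roll + gyro_rates[1] * dt

    # The gravity direction gives a drift-free long-term reference.
    ax, ay, az = accel
    pitch_acc = np.arctan2(-ax, np.hypot(ay, az))
    roll_acc = np.arctan2(ay, az)

    # Complementary filter: trust the gyro short-term, gravity long-term.
    pitch = alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
    roll = alpha * roll_gyro + (1.0 - alpha) * roll_acc
    return pitch, roll
```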
The main body section 110 includes a display, a global positioning system (GPS) unit for acquiring position information, an attitude sensor, a communication apparatus, and so on. The head contact section 112 may include a biological information acquisition sensor capable of measuring the user's biological information such as temperature, pulse, blood components, perspiration, brain waves, and cerebral blood flow. As described above, the light-emitting section 114 emits light in the color specified by the gaming apparatus 10 and functions as a criterion for calculating the position of the head-mounted display 100 in the image captured by the imaging apparatus 14.
A camera for capturing the user's eyes may be further provided on the head-mounted display 100. The camera mounted to the head-mounted display 100 permits detection of the user's line of sight, movement of the pupils, blinking, and so on.
Although a description will be given of the head-mounted display 100 in the present embodiment, the display control technology of the present embodiment is applicable not only to a case in which the head-mounted display 100 in a narrow sense is worn but also to a case in which eyeglasses, an eyeglass-type display, an eyeglass-type camera, a headphone, a headset (a microphone-equipped headphone), an earphone, an earring, an ear-mounted camera, a hat, a camera-equipped hat, or a hair band is worn.
The control section 160 is a main processor that processes and outputs signals such as image signals and sensor signals, instructions, and data. The input interface 122 accepts an operation signal and a setup signal from input buttons and so on and supplies these signals to the control section 160. The output interface 130 receives an image signal from the control section 160 and displays the signal on the display apparatus 190. The backlight 132 supplies backlight to a liquid crystal display making up the display apparatus 190.
The communication control section 140 sends, to external equipment, data input from the control section 160 in a wired or wireless communication manner via the network adapter 142 or the antenna 144. The communication control section 140 receives data from external equipment in a wired or wireless manner via the network adapter 142 or the antenna 144 and outputs the data to the control section 160.
The storage section 150 temporarily stores data and parameters processed by the control section 160, operation signals, and so on.
The GPS unit 161 receives position information from a GPS satellite in accordance with an operation signal from the control section 160 and supplies position information to the control section 160. The wireless unit 162 receives position information from a wireless base station in accordance with an operation signal from the control section 160 and supplies position information to the control section 160.
The attitude sensor 164 detects attitude information such as orientation and tilt of the main body section 110 of the head-mounted display 100. The attitude sensor 164 is realized by combining a gyrosensor, an acceleration sensor, an angular acceleration sensor, and so on as appropriate.
The external I/O terminal interface 170 is an interface for connecting peripheral equipment such as a USB controller. The external memory 172 is an external memory such as flash memory.
The clock section 180 specifies time information using a setup signal from the control section 160 and supplies time information to the control section 160.
The user plays a game while watching a game screen displayed on the display apparatus 12. The imaging apparatus 14 needs to image the light-emitting body 22 during execution of a game application. Therefore, an imaging range thereof is preferably arranged to face the same direction as the display apparatus 12. In general, the user often plays games in front of the display apparatus 12. Therefore, the imaging apparatus 14 is arranged such that an optical axis thereof matches a front direction of the display apparatus 12. Specifically, the imaging apparatus 14 is preferably arranged near the display apparatus 12 such that the imaging range thereof includes a position where the user can visually recognize the display screen of the display apparatus 12. This allows the imaging apparatus 14 to image the input apparatus 20.
The processing section 50 includes a main control section 52, an input acceptance section 54, a triaxial acceleration sensor 56, a triaxial gyrosensor 58, and a light emission control section 60. The main control section 52 sends and receives necessary data to and from the wireless communication module 48.
The input acceptance section 54 accepts input information from the operating buttons 30, 32, 34, 36, 38, and 40 and sends the input information to the main control section 52. The triaxial acceleration sensor 56 detects acceleration components in the three axial directions of X, Y, and Z. The triaxial gyrosensor 58 detects angular velocities in the XZ, ZY, and YX planes. It should be noted that, here, the width, height, and length directions of the input apparatus 20 are specified as the X, Y, and Z axes. The triaxial acceleration sensor 56 and the triaxial gyrosensor 58 are preferably arranged near the center inside the handle 24. The wireless communication module 48 sends, together with input information from the operating buttons, detection value information obtained by the triaxial acceleration sensor 56 and detection value information obtained by the triaxial gyrosensor 58, to the wireless communication module of the gaming apparatus 10 at a given interval. This transmission interval is set, for example, at 11.25 milliseconds.
The light emission control section 60 controls light emission of the light-emitting section 62. The light-emitting section 62 has a red light-emitting diode (LED) 64a, a green LED 64b, and a blue LED 64c, thereby allowing it to emit light in a plurality of colors. The light emission control section 60 causes the light-emitting section 62 to emit light in a desired color by controlling light emission of the red LED 64a, the green LED 64b, and the blue LED 64c.
When a light emission instruction is received from the gaming apparatus 10, the wireless communication module 48 supplies the light emission instruction to the main control section 52. The main control section 52 supplies the light emission instruction to the light emission control section 60. The light emission control section 60 controls light emission of the red LED 64a, the green LED 64b, and the blue LED 64c such that the light-emitting section 62 emits light in the color specified by the light emission instruction. For example, the light emission control section 60 may control lighting of each LED through pulse width modulation (PWM) control.
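As a rough illustration of the PWM approach mentioned above, the sketch below maps an 8-bit RGB color from a hypothetical light emission instruction to duty cycles for the three LEDs. The function and the linear mapping are assumptions; real hardware might apply gamma correction or per-LED calibration.

```python
def color_to_duty_cycles(r, g, b, max_duty=1.0):
    """Convert an 8-bit RGB color from a light emission instruction
    into PWM duty cycles for the red, green, and blue LEDs."""
    return tuple(max_duty * channel / 255.0 for channel in (r, g, b))

# Example: the gaming apparatus instructs the controller to light up magenta.
red_duty, green_duty, blue_duty = color_to_duty_cycles(255, 0, 255)
```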
The wireless communication module 86 establishes wireless communication with the wireless communication module 48 of the input apparatus 20. This allows the input apparatus 20 to send operating button state information and detection value information of the triaxial acceleration sensor 56 and the triaxial gyrosensor 58 to the gaming apparatus 10 at a given interval.
The wireless communication module 86 receives operating button state information and sensor detection value information sent from the input apparatus 20 and supplies them to the input acceptance section 88. The input acceptance section 88 separates button state information and sensor detection value information and hands them over to the application processing section 300. The application processing section 300 receives button state information and sensor detection value information as a game operation instruction. The application processing section 300 treats sensor detection value information as attitude information of the input apparatus 20.
The frame image acquisition section 80 is configured as a USB interface and acquires frame images at a given imaging speed (e.g., 30 frames/second) from the imaging apparatus 14. The image processing section 82 extracts a light-emitting body image from a frame image and identifies the position and size of the light-emitting body in the frame image. For example, as the light-emitting body 22 of the input apparatus 20 emits light in a color that is unlikely to be used in the user's environment, the image processing section 82 can extract the light-emitting body image from the frame image with high accuracy. The image processing section 82 may generate a binarized image by binarizing frame image data using a given threshold. This binarization encodes the pixel value of a pixel having luminance higher than the given threshold as “1” and the pixel value of a pixel having luminance equal to or lower than the given threshold as “0.” By causing the light-emitting body 22 to light up at luminance beyond this given threshold, the image processing section 82 can identify the position and size of the light-emitting body image from the binarized image. For example, the image processing section 82 identifies the coordinates of the center of gravity and the radius of the light-emitting body image in the frame image.
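A minimal sketch of the binarization and identification steps described above, assuming a grayscale frame held in a NumPy array; the function name and the area-based radius estimate are illustrative choices rather than details from the embodiment.

```python
import numpy as np

def find_light_emitter(frame, threshold):
    """Binarize a grayscale frame and locate the light-emitting body.

    Pixels with luminance above `threshold` become 1, all others 0,
    matching the binarization described above.  Returns the center of
    gravity (cx, cy) and a radius estimated from the lit area.
    """
    binary = (frame > threshold).astype(np.uint8)
    ys, xs = np.nonzero(binary)
    if xs.size == 0:
        return None  # light-emitting body not visible in this frame
    cx, cy = xs.mean(), ys.mean()       # center of gravity
    radius = np.sqrt(xs.size / np.pi)   # area of a disc = pi * r^2
    return (cx, cy), radius
```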
The device information deriving section 84 derives position information of the input apparatus 20 and the head-mounted display 100 as seen from the imaging apparatus 14 from the position and size of the light-emitting body image identified by the image processing section 82. The device information deriving section 84 derives position coordinates in the camera coordinate system from the center of gravity of the light-emitting body image and derives distance information from the imaging apparatus 14 from the radius of the light-emitting body image. The position coordinates and the distance information make up the position information of the input apparatus 20 and the head-mounted display 100. The device information deriving section 84 derives position information of the input apparatus 20 and the head-mounted display 100 for each frame image and hands over the position information to the application processing section 300. The application processing section 300 receives the position information of the input apparatus 20 and the head-mounted display 100 as a game operation instruction.
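Under a standard pinhole camera model, which the embodiment does not spell out, distance can be recovered from the apparent radius because the two are inversely proportional. The sketch below assumes a known focal length in pixels and a known physical radius of the light-emitting body, both hypothetical calibration inputs not given in the text.

```python
def derive_position(center, radius_px, focal_px, emitter_radius_m, image_center):
    """Estimate the emitter position in camera coordinates.

    Pinhole model: apparent radius is inversely proportional to distance,
    i.e. radius_px = focal_px * emitter_radius_m / z.
    """
    z = focal_px * emitter_radius_m / radius_px        # distance from camera
    x = (center[0] - image_center[0]) * z / focal_px   # right, camera frame
    y = (center[1] - image_center[1]) * z / focal_px   # down, image convention
    return x, y, z
```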
The application processing section 300 progresses the game based on position information and attitude information of the input apparatus 20 and on button state information, and generates an image signal indicating processing results of the game application. The image signal is sent to the display apparatus 12 from the output section 90 and output as a display image.
The data holding section 360 holds program data of games executed in the gaming apparatus 10, various data used by the game programs, and so on.
The instruction input acquisition section 312 acquires information related to user instruction input accepted by the input apparatus 20 or the head-mounted display 100 from the input apparatus 20 or the head-mounted display 100.
The HMD information acquisition section 314 acquires information related to the attitude of the head-mounted display from the head-mounted display 100. Also, the HMD information acquisition section 314 acquires information related to the position of the head-mounted display 100 from the device information deriving section 84. These pieces of information are conveyed to the game control section 311. Information related to the attitude of the head-mounted display 100 may be acquired by the device information deriving section 84 analyzing a captured image of the head-mounted display 100.
The input apparatus information acquisition section 315 acquires information related to the attitude of the input apparatus 20. Also, the input apparatus information acquisition section 315 acquires information related to the position of the input apparatus 20 from the device information deriving section 84. These pieces of information are conveyed to the game control section 311. Information related to the attitude of the input apparatus 20 may be acquired by the device information deriving section 84 analyzing a captured image of the input apparatus 20.
If the input apparatus 20 moves out of the imaging range of the imaging apparatus 14, or if the input apparatus 20 is hidden behind the user's body or an obstacle and fails to be imaged by the imaging apparatus 14, the input apparatus information acquisition section 315 calculates the position of the input apparatus 20 based on the previously acquired position of the input apparatus 20 and information related to the attitude of the input apparatus 20 acquired after that point in time. For example, the current position of the input apparatus 20 may be calculated by deriving a deviation from the previously acquired position based on translational acceleration data acquired from the acceleration sensor of the input apparatus 20. While the input apparatus 20 is not imaged by the imaging apparatus 14, its position is successively calculated in a similar manner. When the input apparatus 20 is imaged again by the imaging apparatus 14, the position successively calculated from acceleration data may not indicate the correct position due to cumulative drift error. Therefore, the position of the input apparatus 20 newly calculated by the device information deriving section 84 may be used as the current position of the input apparatus 20. The same is true for the head-mounted display 100.
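A minimal sketch of this fallback, assuming world-frame translational acceleration with gravity already removed; the class name and the choice to zero the velocity on reacquisition are illustrative, and a production tracker would more likely use a Kalman-style filter.

```python
import numpy as np

class DeadReckoner:
    """Tracks device position while the camera cannot see it, as in the
    fallback described above.  The estimate is reset to the camera-derived
    position whenever the device is imaged again, discarding drift."""

    def __init__(self, position):
        self.position = np.array(position, dtype=float)
        self.velocity = np.zeros(3)

    def on_acceleration(self, accel_world, dt):
        # Double-integrate translational acceleration into a position delta.
        self.velocity += np.asarray(accel_world, dtype=float) * dt
        self.position += self.velocity * dt
        return self.position

    def on_camera_fix(self, position):
        # A fresh camera-derived position supersedes the accumulated estimate.
        self.position = np.array(position, dtype=float)
        self.velocity[:] = 0.0  # simplest choice; a filter could keep velocity
```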
The game control section 311 executes the game program and progresses the game based on user instruction input acquired by the instruction input acquisition section 312 and information related to the position or attitude of the input apparatus 20 or the head-mounted display 100. The game control section 311 changes the position of a player's character, an operation target, in a game field made up of a virtual three-dimensional (3D) space, based on input made by directional keys or an analog stick of the input apparatus 20 and on changes in position of the input apparatus 20 or the head-mounted display 100.
The first image generation section 316 generates an image to be displayed on the head-mounted display 100. The first image generation section 316 generates a game field image by specifying a viewpoint position based on the position of the operation target controlled by the game control section 311, specifying a direction of line of sight based on the attitude of the head-mounted display 100, and rendering a virtual 3D space. The first image generation section 316 associates the attitude of the head-mounted display 100 and the direction of line of sight in the game field at a given time and changes, thereafter, the direction of line of sight with change in the attitude of the head-mounted display 100. As a result, the user can look over the game field by actually moving his or her head, allowing the user to feel as if he or she were really in the game field. The first image generation section 316 generates a first image by adding information related to the game, an image to be displayed on the head-mounted display 100, and so on to the generated game field image. The first image generated by the first image generation section 316 is sent to the head-mounted display 100 via a wireless communication module or a wired communication module.
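One way to realize the association between attitude and direction of line of sight is sketched below, under the assumption that the attitude is available as yaw and pitch angles relative to the attitude captured at the association time; the embodiment does not commit to a particular parameterization, so the function and axis conventions are illustrative.

```python
import numpy as np

def line_of_sight(reference_forward, yaw, pitch):
    """Rotate the forward vector stored at association time by the
    head-mounted display's yaw and pitch to obtain the current
    direction of line of sight in the game field."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    yaw_m = np.array([[cy, 0, sy],     # rotation about the world up axis (y)
                      [0, 1, 0],
                      [-sy, 0, cy]])
    pitch_m = np.array([[1, 0, 0],     # rotation about the camera's x axis
                        [0, cp, -sp],
                        [0, sp, cp]])
    return yaw_m @ pitch_m @ np.asarray(reference_forward, dtype=float)
```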
The second image generation section 317 generates an image to be displayed on the display apparatus 12. When the same image as displayed on the head-mounted display 100 is displayed on the display apparatus 12, the first image generated by the first image generation section 316 is also sent to the display apparatus 12. When an image different from the image displayed on the head-mounted display 100 is displayed on the display apparatus 12, an example of which is when the user wearing the head-mounted display 100 and the user watching the display apparatus 12 execute a head-to-head game, the second image generation section 317 generates a game field image by specifying a viewpoint position and a direction of line of sight different from those specified by the first image generation section 316. The second image generation section 317 generates a second image by adding information related to the game, an image to be displayed on the display apparatus 12, and so on to the generated game field image. The second image generated by the second image generation section 317 is sent to the display apparatus 12 via a wireless communication module or a wired communication module.
When the marker enters a given range specified near the center of the display screen as the user points his or her face or line of sight toward the marker direction, the game control section 311 changes the viewpoint position to the position corresponding to that marker.
When an attempt is made to move the viewpoint position to above the left or right hole, it is necessary to move the head-mounted display 100 upward while keeping the head-mounted display 100 tilted to the left or right. However, it is not easy for the user to move his or her head straight upward while keeping the body tilted to the left or right. In the present embodiment, therefore, when the head-mounted display 100 is moved up or down while tilted to the left or right, the game control section 311 moves the viewpoint position vertically but not horizontally, even if the direction of movement is tilted diagonally. The game control section 311 may move the viewpoint position vertically by the amount of travel equivalent to the vertical component of the movement vector of the head-mounted display 100, ignoring the horizontal component, or may move the viewpoint position vertically by the amount of travel equivalent to the magnitude of the movement vector. Thus, when the viewpoint position is changed in response to movement of the head-mounted display 100, converting the movement vector of the head-mounted display 100 into a vector in a given direction makes it possible to restrict the movement direction of the viewpoint position to that direction and to prevent the viewpoint position from moving in an unnecessary direction. Also, it is possible to provide a user interface that permits movement of the viewpoint position only in a necessary direction, thereby ensuring improved user convenience.
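The conversion of the movement vector into a vector in a given direction can be sketched as a simple projection. The flag selecting between the two alternatives described above (ignoring the horizontal component versus applying the full magnitude vertically) is an illustrative assumption.

```python
import numpy as np

UP = np.array([0.0, 1.0, 0.0])  # world vertical axis

def restrict_to_vertical(movement, use_magnitude=False):
    """Convert an HMD movement vector into purely vertical viewpoint travel.

    use_magnitude=False: ignore the horizontal component entirely.
    use_magnitude=True:  apply the full length of the movement vector
    vertically, signed by the vertical direction of the movement.
    """
    movement = np.asarray(movement, dtype=float)
    vertical = float(movement @ UP)  # signed vertical component
    if use_magnitude:
        vertical = float(np.copysign(np.linalg.norm(movement), vertical))
    return vertical * UP
```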
Such a technology is applicable, for example, to a game in which a player's character hides behind an obstacle such as a wall to ward off oncoming bullets.
The present invention has been described above based on an embodiment. The present embodiment is illustrative, and it is to be understood by those skilled in the art that the combination of components and processes thereof can be modified in various ways and that these modification examples also fall within the scope of the present invention.
Although an image for binocular stereopsis was displayed on the display apparatus 190 of the head-mounted display 100 in the above example, an image for monocular stereopsis may be displayed in a different example.
Although the head-mounted display 100 was used in a game system in the above example, the technology described in the embodiment can also be used to display content other than games.
The present invention is applicable to a display control apparatus for controlling display on a head-mounted display.
This is a continuation application of U.S. patent application Ser. No. 17/225,203, accorded a filing date of Apr. 8, 2021, allowed; which is a continuation application of U.S. patent application Ser. No. 15/773,409, accorded a filing date of May 3, 2018 (U.S. Pat. No. 11,042,038, issued Jun. 22, 2021); which is a national stage application of International Application No. PCT/JP2016/084937, filed Nov. 25, 2016; and which claims priority to JP Application No. 2015-235897, filed Dec. 2, 2015, the entire disclosures of which are hereby incorporated by reference.