The present invention relates to a vibration device used for presenting a pseudo force sensation to a user, a vibration control apparatus for controlling the vibration device, a vibration device control method, and a control program.
In some cases, a device used while attached to or held by the body of a user, such as an operating device connected to a home gaming machine, includes a vibration mechanism for vibrating a part or the whole of the device. Such a vibration device is used to present a tactile sensation to the user, that is, a sensation that makes the user feel as if the user has touched a real object when, for example, the user comes into contact with a virtual object in a virtual space.
When a vibration device based on the above-mentioned conventional technology is vibrated according to a particular vibration waveform, the user is given a sensation (hereinafter referred to as the pseudo force sensation) that a pulling force (traction force) is seemingly applied to pull the vibration device in a particular direction. When the user is viewing certain content, an enhanced realistic sensation can be given to the user by presenting a pseudo force sensation matching the viewed content. However, a pseudo force sensation matching the content is not always prepared in advance for every piece of content. Further, in some cases, it is difficult for a content supplier to predefine the pseudo force sensation to be presented.
The present invention has been made in view of the above circumstances. An object of the present invention is to provide a vibration control apparatus, a vibration device, a vibration device control method, and a control program that are capable of presenting to a user a pseudo force sensation matching the content to be viewed.
A vibration control apparatus according to the present invention is a vibration control apparatus for vibrating a vibration mechanism, and includes a content information acquisition section and a vibration control section. The content information acquisition section acquires content information about content including at least either one of video and audio to be presented to a user. The vibration control section vibrates the vibration mechanism in such a manner that the user feels a pseudo force sensation defined based on the content information while the content is being presented.
A vibration device according to the present invention includes a vibration mechanism, a reception section, and a control section. The reception section receives a control command that specifies generation of a vibration causing a user to feel a pseudo force sensation defined based on content information about content while the content is being presented. The content includes at least either one of video and audio to be presented to the user. The control section causes the vibration mechanism to generate a vibration based on the control command.
A method of controlling a vibration mechanism according to the present invention includes the steps of: acquiring content information about content including at least either one of video and audio to be presented to a user; and vibrating the vibration mechanism in such a manner that the user feels a pseudo force sensation defined based on the content information while the content is being presented.
A program according to the present invention is a program for controlling a vibration mechanism. The program causes a computer to function as a content information acquisition section and a vibration control section. The content information acquisition section acquires content information about content including at least either one of video and audio to be presented to a user. The vibration control section vibrates the vibration mechanism in such a manner that the user feels a pseudo force sensation defined based on the content information while the content is being presented. The program may be supplied on a computer-readable, non-transitory information storage medium.
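As a minimal sketch of this two-section program structure, the following Python code shows one possible shape such a program could take; the class and method names are illustrative assumptions, not the claimed implementation.

```python
class ContentInformationAcquisitionSection:
    """Acquires content information about content including video and/or audio."""

    def acquire(self, content):
        # Read content information output by the content player, or analyze
        # the content's video/audio and return the result (details omitted).
        raise NotImplementedError


class VibrationControlSection:
    """Vibrates the vibration mechanism while the content is being presented."""

    def vibrate(self, content_info):
        # Drive the vibration mechanism with a waveform chosen so that the
        # user feels the pseudo force sensation defined from content_info.
        raise NotImplementedError
```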
An embodiment of the present invention will now be described with reference to the accompanying drawings.
A vibration control system 1 according to an exemplary embodiment of the present invention includes a vibration control apparatus 10 and a vibration device 20. The vibration device 20 is connected to the vibration control apparatus 10.
The vibration device 20 is used while it is held by a hand of a user or attached to the body of the user. The vibration device 20 has a built-in vibration mechanism 21. The vibration mechanism 21 is operated to present a vibration to the user. The vibration mechanism 21 may be a linear resonance actuator, a voice coil motor, an eccentric motor, or other vibration generation element. The vibration device 20 may include various operating members to be operated by the user, such as an operating button and a lever. Here, it is assumed as a concrete example that the vibration device 20 includes only one vibration mechanism 21 for generating a vibration along one axis and is able to present a pseudo force sensation along such a vibration direction.
The vibration control apparatus 10 is an information processing apparatus communicatively connected to the vibration device 20, and may be, for example, a home gaming machine or a personal computer. In the present embodiment, the vibration control apparatus 10 is further communicatively connected to a display apparatus 14 and an audio reproduction apparatus 15. The vibration control apparatus 10 includes a control section 11, a storage section 12, and a communication section 13.
The control section 11 includes a program control device such as a central processing unit (CPU), and performs various information processes in accordance with a program stored in the storage section 12. Concrete examples of processes performed by the control section 11 will be described in detail later.
The storage section 12 is, for example, a memory device, and stores the program to be executed by the control section 11. The program may be stored on a computer-readable, non-transitory storage medium, supplied, and copied into the storage section 12. The storage section 12 further functions as a work memory for the control section 11.
The communication section 13 includes a universal serial bus (USB) or other serial interface, or a Bluetooth (registered trademark) or other wireless communication interface. The vibration control apparatus 10 is communicatively connected to the vibration device 20 through the communication section 13. In the present embodiment, particularly, the communication section 13 transmits a control signal for operating the vibration mechanism 21 to the vibration device 20 in accordance with an instruction from the control section 11. Further, the communication section 13 includes communication interfaces for communicating, by wire or wirelessly, with the display apparatus 14 and the audio reproduction apparatus 15, respectively. The vibration control apparatus 10 transmits a video signal, which is to be displayed on the display apparatus 14, to the display apparatus 14 through the communication section 13. Further, the vibration control apparatus 10 transmits an audio signal, which is to be reproduced by the audio reproduction apparatus 15, to the audio reproduction apparatus 15.
The display apparatus 14 displays video based on a video signal transmitted from the vibration control apparatus 10. The display apparatus 14 may be, for example, a head-mounted display or other similar device worn on the head of the user. The audio reproduction apparatus 15 reproduces audio based on the audio signal transmitted from the vibration control apparatus 10. The audio reproduction apparatus 15 may include a plurality of speakers installed at a distance from each other or may be a set of headphones worn on the ears of a user.
Operations of the control section 11 in the vibration control apparatus 10 will now be described. Functionally, the control section 11 includes a content reproduction section 31, a content information acquisition section 32, a pseudo force sensation determination section 33, and a vibration control section 34.
The content reproduction section 31 is implemented when the control section 11 executes an application program such as a content player. The application program for implementing the content reproduction section 31 may be, for example, a game for reproducing content that varies based on a user input. The content reproduction section 31 reproduces content C that includes at least either one of video and audio. More specifically, in a case where the content C includes video, the content reproduction section 31 draws the video and outputs it as a video signal. The video is then displayed on the display apparatus 14 and viewed by the user. In a case where the content C includes audio, the content reproduction section 31 outputs the audio as an audio signal. The outputted audio signal is reproduced by the audio reproduction apparatus 15.
The content information acquisition section 32 acquires information about the content C to be reproduced by the content reproduction section 31 (this information is hereinafter referred to as the content information). More specifically, the content information acquisition section 32 may read the content information outputted from the content reproduction section 31, or may analyze the video and audio included in the content C reproduced by the content reproduction section 31 and acquire the result of analysis as the content information. Concrete examples of the content information acquired by the content information acquisition section 32 will be described later.
The pseudo force sensation determination section 33 defines, based on the content information acquired by the content information acquisition section 32, the pseudo force sensation to be presented to the user. More specifically, the pseudo force sensation determination section 33 determines the time point, direction, and strength of pseudo force sensation presentation during the reproduction of the content C by the content reproduction section 31. Concrete examples of the pseudo force sensation defined by the pseudo force sensation determination section 33 will be described later.
In order to present to the user the pseudo force sensation defined by the pseudo force sensation determination section 33, the vibration control section 34 defines the vibration to be generated by the vibration device 20 (this vibration is hereinafter referred to as the pseudo force sensory vibration), and outputs a control command for generating the defined vibration to the vibration device 20. When generating the pseudo force sensation defined by the pseudo force sensation determination section 33, the vibration control section 34 may display relevant information on the screen of the display apparatus 14 in order to indicate that pseudo force sensation generation is in progress.
Upon receiving the above control command, the vibration device 20 causes the vibration mechanism 21 to generate the pseudo force sensory vibration specified by the command. As a result, the pseudo force sensation defined by the pseudo force sensation determination section 33 is presented to the user. The functions of the above-described sections enable the vibration control apparatus 10 to present to the user a pseudo force sensation based on the content even when the pseudo force sensation to be presented is not predefined for the content C.
The following description deals with concrete examples of content information acquired by the content information acquisition section 32 and of pseudo force sensation defined by the pseudo force sensation determination section 33 in accordance with the acquired content information.
As a first example, an example of control based on audio will now be described. This example assumes that a plurality of channels of audio are included in the content C. The content information acquisition section 32 references the audio signal outputted from the content reproduction section 31, and acquires information indicative of the volumes of the channels as the content information.
The pseudo force sensation determination section 33 defines the pseudo force sensation in accordance with the volumes of the channels. More specifically, in a case where the content C includes two audio channels, left and right, the pseudo force sensation determination section 33 presents to the user a pseudo force sensation oriented in the direction of the channel with the higher volume. The magnitude of the pseudo force sensation may be varied in accordance with the volume in the direction of pseudo force sensation presentation or with the difference between the left and right volumes. When control is exercised in this manner while the user is listening, through headphones for example, to audio whose left and right volumes differ from each other, the user can be made to feel as if being pulled in the direction of the louder sound.
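As a rough sketch of this volume-based policy (the stereo buffer layout and the strength formula below are example assumptions), per-channel root-mean-square volumes can be turned into a presentation direction and strength as follows:

```python
import numpy as np

def pseudo_force_from_stereo(samples):
    """samples: float array of shape (n, 2), columns = (left, right)."""
    x = np.asarray(samples, dtype=float)
    left_vol = np.sqrt(np.mean(x[:, 0] ** 2))   # RMS volume, left channel
    right_vol = np.sqrt(np.mean(x[:, 1] ** 2))  # RMS volume, right channel
    direction = "right" if right_vol > left_vol else "left"
    # Strength from the left/right volume difference (an example policy; the
    # strength could equally track the volume on the louder side instead).
    strength = min(1.0, abs(right_vol - left_vol) / max(right_vol, left_vol, 1e-9))
    return direction, strength
```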
Further, in some cases, the content reproduction section 31 reproduces the position and direction of a sound generation source (sound source) by using stereophonic technology. In that case, the content information acquisition section 32 may analyze, for example, the volume difference and time lag between the left and right channels and changes in phase and frequency response, identify the direction of sound generation (the direction of the sound source) relative to a listener, and acquire the identified direction as the content information. Even in a case where the content reproduction section 31 reproduces audio of three or more channels, the content information acquisition section 32 is able to identify the direction of the sound source by analyzing the audio information about each channel. Based on the information about the direction of the sound source acquired by the content information acquisition section 32, the pseudo force sensation determination section 33 may present to the user a pseudo force sensation oriented in that direction. Alternatively, the pseudo force sensation determination section 33 may identify temporal changes in the position of the sound source (i.e., its direction of movement) and present to the user a pseudo force sensation oriented in the direction of movement. Further, contrary to the foregoing description, the pseudo force sensation determination section 33 may present to the user a pseudo force sensation oriented away from the direction of the louder sound.
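One conceivable way to estimate such a direction, sketched below under simplified free-field assumptions (the receiver spacing and the sign conventions are assumptions of the example), is to cross-correlate the two channels to find the inter-channel time lag of arrival:

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
RECEIVER_SPACING = 0.17  # m; assumed spacing between the left/right channels

def estimate_source_azimuth(left, right, sample_rate):
    """Approximate azimuth of the sound source in radians
    (0 = front, positive = toward the right)."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)  # samples; > 0 when left lags
    tau = lag / sample_rate                   # arrival-time difference (s)
    # Clip to [-1, 1] before arcsin to guard against noisy estimates.
    return float(np.arcsin(np.clip(SPEED_OF_SOUND * tau / RECEIVER_SPACING,
                                   -1.0, 1.0)))
```

In practice the level difference analyzed above would be combined with this time-lag estimate rather than either being used alone.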
Moreover, even in a case where only one channel of audio is included in the content C, the pseudo force sensation determination section 33 may present the pseudo force sensation based on that audio. In that case, the direction of pseudo force sensation presentation may be a predetermined direction, the gravity direction at the time, or a direction close to the gravity direction. A direction close to the gravity direction may be used because a pseudo force sensation oriented in such a direction is felt strongly by the user. In this case, it is assumed that the vibration device 20 includes an acceleration sensor or other means for detecting the gravity direction, and the pseudo force sensation determination section 33 uses the result of such detection to determine the direction of pseudo force sensation presentation.
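If the vibration device 20 reports accelerometer readings, the gravity direction can be approximated as in the sketch below (the reading format and sign convention are assumptions; a real implementation would also low-pass filter the readings):

```python
import numpy as np

def gravity_direction(accel_xyz):
    """Unit vector along gravity in device coordinates.

    accel_xyz: an (x, y, z) accelerometer reading taken while the device is
    roughly at rest. Depending on the sensor convention, the measured vector
    may point opposite to gravity and need negating.
    """
    a = np.asarray(accel_xyz, dtype=float)
    return a / np.linalg.norm(a)
```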
The pseudo force sensation determination section 33 may also present the pseudo force sensation only when predetermined conditions regarding, for example, the volume and frequency of the audio included in the content C are satisfied. For example, the pseudo force sensation determination section 33 may present the pseudo force sensation when audio having a frequency within a predetermined numerical value range is to be reproduced. In this case, it is assumed that the content information includes information about not only the volume but also the frequency of the audio.
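A condition of this kind might be checked as in the following sketch, where the volume threshold and frequency band are placeholder example values:

```python
import numpy as np

def should_present(samples, sample_rate, vol_threshold=0.1, band=(40.0, 120.0)):
    """True when the audio is loud enough and its dominant frequency
    falls within `band`; both thresholds are example values."""
    x = np.asarray(samples, dtype=float)
    volume = np.sqrt(np.mean(x ** 2))                  # RMS volume
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sample_rate)
    dominant = freqs[np.argmax(spectrum)]              # dominant frequency (Hz)
    return volume >= vol_threshold and band[0] <= dominant <= band[1]
```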
As a second example, control based on changes in video will now be described. The following description assumes, as a concrete example, that a target object O appearing in the video is predefined. In a case where the content reproduction section 31 reproduces, as the content C, video that draws the inside of a virtual space in real time, the target object O may be a predetermined object disposed in the virtual space. In this case, the content information may be information about the position within the video at which the target object O is depicted, which is identified when the content reproduction section 31 draws the video.
Further, even in a case where the content reproduction section 31 reproduces pre-rendered video, the content information acquisition section 32 may analyze the video in real time, determine whether or not a predetermined target object O is depicted in the video, and acquire the position of the target object O in the video as the content information. The movement of such an object depicted in video can be identified by using, for example, a publicly known optical flow technology.
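As a sketch of this kind of analysis (using OpenCV's Farneback dense optical flow; averaging the flow over a whole frame or a region is a simplification for the example), the dominant motion between two frames can be estimated as follows:

```python
import cv2

def mean_motion(prev_frame, frame):
    """Average optical-flow vector (dx, dy) between two consecutive BGR
    frames; positive dx corresponds to rightward motion in the image."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return float(flow[..., 0].mean()), float(flow[..., 1].mean())
```

Averaged over a region depicting the target object O, the flow approximates the object's movement; averaged over the background, its negation approximates the movement of the point of view, which is used later in this description.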
The pseudo force sensation determination section 33 determines the direction of the pseudo force sensation in accordance with the position within the video at which the target object O is depicted. For example, the pseudo force sensation determination section 33 presents to the user a pseudo force sensation oriented in the leftward direction at a time point when the target object O is depicted on the left side of the video, and presents a pseudo force sensation oriented in the rightward direction at a time point when the target object O is depicted on the right side of the video.
The above description assumes that the pseudo force sensation determination section 33 defines the pseudo force sensation to be presented at each time point in accordance with the absolute position of the target object O at that time point. Alternatively, however, the pseudo force sensation determination section 33 may determine the direction of the pseudo force sensation in accordance with the direction of movement of the target object O within the video. More specifically, in a case where the target object O is moving in the rightward direction within the video, the pseudo force sensation determination section 33 presents to the user a pseudo force sensation oriented in the rightward direction.
Here, it is assumed that the direction of pseudo force sensation is determined in accordance with the direction of movement of the target object O. Alternatively, however, the pseudo force sensation determination section 33 may determine the direction of pseudo force sensation in accordance with the movement of a background in the video. For example, in a case where the background in the video is moving in the rightward direction, the point of view is moving in the leftward direction. In such a case, the pseudo force sensation determination section 33 may present to the user a pseudo force sensation oriented in the leftward direction.
In the concrete examples of processing performed by the pseudo force sensation determination section 33 described above, the direction of pseudo force sensation tentatively determined by the pseudo force sensation determination section 33 in accordance with the content information may in some cases differ from the direction of pseudo force sensation that can be presented by the vibration mechanism 21. For example, in a case where the vibration mechanism 21 is able to generate a vibration only in the left-right direction as viewed from the user holding the vibration device 20, the vibration device 20 can present only a pseudo force sensation oriented in the left-right direction. If, in that case, the direction of pseudo force sensation is determined based on, for example, the direction of movement of the target object O, and the target object O moves in the up-down direction within the screen, a pseudo force sensation based on that direction of movement cannot be presented.
In the above case, the pseudo force sensation determination section 33 may instead present a pseudo force sensation oriented in a presentable direction that is relatively close to the target direction. Alternatively, it may generate an ordinary vibration other than the pseudo force sensory vibration, either by itself or while presenting a pseudo force sensation in a presentable direction.
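One simple policy for choosing "a presentable direction relatively close to the target direction", assumed here to mean maximizing the dot product with the target over the presentable directions, is sketched below:

```python
import numpy as np

def closest_presentable(target, presentable):
    """Pick the presentable unit vector closest in direction to `target`.

    target: 2-D unit vector; presentable: iterable of 2-D unit vectors.
    """
    t = np.asarray(target, dtype=float)
    return max(presentable, key=lambda d: float(np.dot(np.asarray(d, float), t)))
```

For a mechanism limited to the left-right axis, for example, `presentable` would be `[(1, 0), (-1, 0)]`.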
Further, in a case where the vibration device 20 includes a plurality of built-in vibration mechanisms 21, the pseudo force sensation determination section 33 may cause the vibration mechanism 21 disposed toward the direction of intended pseudo force sensation presentation to generate the pseudo force sensory vibration, or to generate a normal vibration instead of the pseudo force sensation. As a concrete example, suppose that two vibration mechanisms 21 disposed on the left and right of the vibration device 20 are able to generate a vibration only in the up-down direction. When a rightward pseudo force sensation is to be presented correspondingly to the rightward movement of the target object O, the pseudo force sensation determination section 33 presents a downward pseudo force sensation by using the vibration mechanism 21 disposed on the right side as viewed from the user. Conversely, when a leftward pseudo force sensation is to be presented, it presents the downward pseudo force sensation by using the vibration mechanism 21 disposed on the left side. A downward rather than an upward pseudo force sensation is used because, as mentioned earlier, a pseudo force sensation oriented in a direction close to the gravity direction can be presented to the user more easily.
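Expressed as code, the selection rule above might look like this sketch, where the x axis points rightward as viewed from the user and the device calls are hypothetical:

```python
def present_lateral_sensation(direction_x, left_mech, right_mech):
    """direction_x: positive for a rightward, negative for a leftward
    pseudo force sensation.

    A downward rather than upward pseudo force is generated because a
    sensation close to the gravity direction is felt more strongly.
    """
    mech = right_mech if direction_x > 0 else left_mech
    mech.present_pseudo_force(direction="down")  # hypothetical device API
```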
Moreover, as another example, suppose that a pseudo force sensation oriented in the upper right direction is to be presented correspondingly to the upper right movement of the target object O. In that case, the pseudo force sensation determination section 33 may cause both vibration mechanisms 21 to present an upward pseudo force sensation while causing the right-hand vibration mechanism 21 to superimpose the waveform of an ordinary vibration on the waveform of the upward pseudo force sensation, so that the vibration generated on the right is stronger than that on the left. Using this method makes it possible to approximately present a pseudo force sensation that follows the movement of the target object O.
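Very roughly, the superimposition described here amounts to adding an ordinary vibration waveform on one side only, as in this sketch (both waveforms are placeholders):

```python
import numpy as np

def biased_waveforms(pseudo_wave, extra_wave):
    """Upward pseudo force sensory waveform on both sides, with an ordinary
    vibration superimposed on the right side only, so that the right-hand
    vibration is felt more strongly overall."""
    left = np.asarray(pseudo_wave, dtype=float)
    right = left + np.asarray(extra_wave, dtype=float)
    return left, right
```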
The foregoing description assumes that the vibration mechanism 21 is disposed at a fixed position in the housing of the vibration device 20 and is thus able to present a pseudo force sensation oriented only in a limited direction. However, the vibration device 20 according to the embodiment of the present invention is not limited to this configuration. The vibration device 20 may alternatively include a drive mechanism 22 that varies the orientation of the vibration mechanism 21 in the housing in order to present the pseudo force sensation over a wider range of directions. When the vibration device 20 includes such a drive mechanism 22, the pseudo force sensation can be presented in various directions defined based on the content information by causing the drive mechanism 22 to vary the orientation of the vibration mechanism 21 and then causing the vibration mechanism 21 to generate the pseudo force sensory vibration.
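With such a drive mechanism 22, control reduces to orienting the actuator before vibrating it; the sketch below assumes a hypothetical interface that accepts a single rotation angle in the vibration plane:

```python
import math

def aim_and_vibrate(direction_xy, drive_mech, vib_mech):
    """Rotate the vibration mechanism toward direction_xy, then vibrate.

    direction_xy: desired pseudo-force direction in the device plane.
    drive_mech.set_angle / vib_mech.present_pseudo_force are hypothetical.
    """
    angle = math.atan2(direction_xy[1], direction_xy[0])  # radians
    drive_mech.set_angle(angle)      # orient the actuator first
    vib_mech.present_pseudo_force()  # then generate the vibration
```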
Furthermore, in a case where the vibration device 20 includes two vibration mechanisms 21 disposed at a distance from each other, each of the vibration mechanisms 21 may be provided with its own drive mechanism 22 so that the orientations of the two vibration mechanisms 21 can be varied independently of each other.
Moreover, when the drive mechanisms 22 are controlled to vary the orientations of the vibration mechanisms 21, the user can be made to feel as if a force is generated to turn the vibration device 20 in various directions. As an example, when the two vibration mechanisms 21 are oriented in mutually opposite directions and each is caused to generate the pseudo force sensory vibration, the user can be made to feel as if a force is applied to rotate the vibration device 20.
The vibration control system 1 according to the present embodiment, which has been described above, is able to define the pseudo force sensation in accordance with the content information and thus present to the user the pseudo force sensation based on the content even if the pseudo force sensation to be generated is not predefined.
The present invention is not limited to the above-described embodiment. For example, the foregoing description assumes that the vibration device 20 is an operating device for receiving a user's operation input. However, the vibration device 20 is not limited to such an operating device. Alternatively, the vibration device 20 may be a device that is merely used to present a tactile sensation or a pseudo force sensation to the user or used for other purposes.
Further, the foregoing description assumes that the vibration control apparatus 10, which is separate from the vibration device 20, defines the pseudo force sensory vibration to be generated. However, the present invention is not limited to such a configuration. An alternative is to allow the vibration device 20 to define the pseudo force sensation to be presented based on the content information received from the vibration control apparatus 10. In this case, the vibration device 20 functions as the vibration control apparatus according to the embodiment of the present invention.