The present invention relates to a display system that displays a virtual object, a display device and a method of controlling the same, and a storage medium.
There is known a technique for effectively displaying an augmented reality image on a display section of a head mounted display. For example, Japanese Laid-Open Patent Publication (Kokai) No. 2015-32131 describes a technique for displaying an augmented reality image on a display section of a head mounted display, in a state added to a real object, to thereby make it possible to obtain high realistic feeling.
However, although the technique described in Japanese Laid-Open Patent Publication (Kokai) No. 2015-32131 makes it possible to display a virtual object in a state added to a real object, the technique suffers from a problem that the visibility is lowered and a problem that the operability of the virtual object is lowered due to the lowering of the visibility.
The present invention provides a display system that makes it possible to obtain high operability while maintaining visibility when a virtual object is displayed.
In a first aspect of the present invention, there is provided a display system including an information processing apparatus, and a head mounted display that is capable of communicating with the information processing apparatus and has an image capturing unit, the display system including a detection unit configured to detect information on the information processing apparatus from an image captured by the image capturing unit, wherein the head mounted display includes a display section that is capable of displaying a virtual object, and a display control unit configured to display the virtual object on the display section in a position adjacent to the information processing apparatus or in a manner partially superimposing the virtual object on the information processing apparatus, based on the information.
In a second aspect of the present invention, there is provided a head mounted display that is capable of communicating with an information processing apparatus by wireless communication, including an image capturing unit, a display section that is capable of displaying a virtual object, and a display control unit configured to display the virtual object on the display section in a position adjacent to the information processing apparatus or in a manner partially superimposing the virtual object on the information processing apparatus, based on information on the information processing apparatus, which is detected from an image captured by the image capturing unit.
In a third aspect of the present invention, there is provided a method of controlling a display system including an information processing apparatus, and a head mounted display that is capable of communicating with the information processing apparatus and has an image capturing unit, including detecting information on the information processing apparatus from an image captured by the image capturing unit, and displaying a virtual object on a display section of the head mounted display in a position adjacent to the information processing apparatus or in a manner partially superimposing the virtual object on the information processing apparatus, based on the information.
In a fourth aspect of the present invention, there is provided a method of controlling a head mounted display including an image capturing unit, including establishing communication with an information processing apparatus, and displaying a virtual object in a position adjacent to the information processing apparatus or in a manner partially superimposing the virtual object on the information processing apparatus, based on information on the information processing apparatus, which is detected from an image captured by the image capturing unit.
According to the present invention, it is possible to realize a display system that makes it possible to obtain high operability while maintaining visibility when a virtual object is displayed.
Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
The present invention will now be described in detail below with reference to the accompanying drawings showing embodiments thereof.
The smartphone 100 is capable of displaying a variety of icons, photograph information, and so forth, and has a display section 101 on which a touch panel is superposed so that touch operations can be performed. The smartphone 100 will be described hereinafter in more detail.
Note that a well-known smartphone can be used as the smartphone 100, and hence detailed description thereof is omitted. Further, the information processing apparatus as a component of the display system according to the present invention is not limited to the smartphone 100, and may be another electronic device, such as a tablet PC or a portable game machine. In the present embodiment, the information processing apparatus is assumed to be of a portable type which is held and operated by a user's hand.
The wireless communication path 120 communicably connects the smartphone 100 and the smart glasses 110. As the wireless communication path 120, there may be mentioned direct communication using Bluetooth (registered trademark) or Wi-Fi, wireless LAN communication using an access point, not shown, and so forth, but the communication method is not particularly limited. Communication performed between the smartphone 100 and the smart glasses 110 includes a pairing request sent from the smart glasses 110 to the smartphone 100 and a pairing permission response sent from the smartphone 100 to the smart glasses 110 in response to this pairing request.
Next, the smart glasses 110 will be described with reference to
The smart glasses 110 are configured as a glasses-type head mounted display (HMD). The smart glasses 110 are also configured as an optical transmission type HMD on which a user can visually recognize an image (display image) displayed on the display section 204 and directly view the outside scenery at the same time. Note that the head mounted display as a component of the display system according to the present embodiment is not limited to the glasses type, but may be of any other type, such as a goggles type which encloses an area surrounding the eyes or a helmet type which covers the entire head.
The wireless communication section 201 bidirectionally communicates with the smartphone 100 via the wireless communication path 120. Note that in
The smart glasses 110 are configured as the optical transmission type HMD as mentioned above. That is, the display section 204 corresponding to lenses of general glasses is formed of a material which is high in translucency (high in transmittance) and forms a transmission type display that does not block a visual field of a user when the display section 204 is in a state not displaying an image. Further, the smart glasses 110 are configured such that the smart glasses 110 display a virtual object 600 (see e.g.
The control unit 202 is implemented by a so-called microcomputer comprised of a CPU, a ROM, a RAM, and a flash memory. The CPU loads predetermined programs stored in the ROM into the RAM and executes the loaded programs, whereby the control unit 202 functions as a detection section and a display controller 304. The detection section includes a distance detection section 301, a position detection section 302, and a tilt detection section 303, and detects information on the smartphone 100 from an image captured by the image capturing section 203. Note that the CPU controls a variety of operations which can be executed by the smart glasses 110 by loading an OS program stored in the ROM into the RAM.
The distance detection section 301 detects information on a distance from the smart glasses 110 (display section 204) to the smartphone 100 from an image captured by the image capturing section 203, as an item of information on the smartphone 100. Here, in the description of the first embodiment, it is assumed that wireless connection for executing wireless communication via the wireless communication path 120 has been established between a wireless communication section 311 (see
The position detection section 302 detects position information of the smartphone 100 from an image of the smartphone 100 captured by the image capturing section 203 by calculation (image processing), as an item of information on the smartphone 100. The position information of the smartphone 100 refers to information indicating in which position (which direction) the smartphone 100 exists within the captured image with respect to the display section 204. Note that the method of detecting the position information will be described hereinafter with reference to
The tilt detection section 303 detects tilts of the smartphone 100 of which an image is captured, from the image of the smartphone 100 captured by the image capturing section 203 by calculation (image processing) as an item of information on the smartphone 100. More specifically, the tilt detection section 303 detects a tilt in a vertical rotational direction, a tilt in a horizontal rotational direction, and a tilt in a rotational axis direction. Note that the method of detecting the tilt information of the smartphone 100 will be described hereinafter with reference to
The display controller 304 performs control for displaying a virtual object on the display section 204 based on the information on a distance, a position, and tilts detected by the distance detection section 301, the position detection section 302, and the tilt detection section 303, respectively.
Next, the method of detecting the smartphone 100 by the smart glasses 110 will be described. First, the method of detecting a position of the smartphone 100 by the position detection section 302 of the smart glasses 110 will be described.
Next, the method of detecting tilts of the smartphone 100 by the tilt detection section 303 will be described. The tilt detection section 303 detects a tilt in the vertical rotational direction, a tilt in the horizontal rotational direction, and a tilt in the rotational axis direction.
In the first embodiment, a mode of the virtual object displayed on the right display section 204R is controlled based on information on the distance, position, and tilts of the smartphone 100 with respect to the smart glasses 110, detected from an image of the smartphone 100 captured by the image capturing section 203.
The distance detection section 301, the position detection section 302, and the tilt detection section 303 sequentially acquire images of the smartphone 100 captured by the image capturing section 203 and continue to detect therefrom the information on the distance, position, and tilts of the smartphone 100 with respect to the smart glasses 110. Then, as shown in
For example, in a case where the distance information of the smartphone 100 indicates 50 cm, the display controller 304 controls the display to arrange the virtual object 600 such that the virtual object 600 appears in a position at 50 cm from a viewpoint of the smart glasses 110. Then, the display controller 304 performs affine transformation on the virtual object 600 such that the respective tilts of the virtual object 600 in the vertical rotational direction and the horizontal rotational direction become the same as the respective tilts of the smartphone 100 in the vertical rotational direction and the horizontal rotational direction. For example, if the tilt of the smartphone 100 in the vertical rotational direction is represented by “M:N=5:4”, the display controller 304 performs affine transformation on the virtual object 600 such that the ratio between the lower side and upper side of the virtual object 600 becomes “5:4”. Further, if the tilt of the smartphone 100 in the horizontal rotational direction is represented by “K:L=7:6”, the display controller 304 performs affine transformation on the virtual object 600 such that the ratio between the left side and right side of the virtual object 600 becomes “7:6”. Further, the display controller 304 performs rotation processing on the virtual object 600 having been subjected to affine transformation, such that the tilt in the rotational axis direction becomes the same as the tilt of the smartphone 100 in the rotational axis direction. For example, if the angle of the smartphone 100 in the rotational axis direction is 30 degrees, the display controller 304 rotates the vector of the virtual object 600 having been subjected to affine transformation through 30 degrees.
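The two tilt scalings and the rotation described above can be illustrated with a short sketch. This is not the patent's implementation; the function names are hypothetical, and the sketch models the tilts as centered trapezoidal scalings of the virtual object's corner coordinates, with "M:N" as the lower:upper side ratio, "K:L" as the left:right side ratio, and the roll applied last as a plane rotation.

```python
import math

def rotate(point, angle_deg):
    # Rotation about the viewing axis (the "rotational axis direction").
    a = math.radians(angle_deg)
    x, y = point
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

def tilt_corners(width, height, lower_upper=(5, 4), left_right=(7, 6),
                 roll_deg=0.0):
    """Return the virtual object's four corners after the two tilt
    scalings and the roll.  lower_upper is the lower:upper side ratio
    (tilt in the vertical rotational direction, e.g. M:N = 5:4);
    left_right is the left:right side ratio (tilt in the horizontal
    rotational direction, e.g. K:L = 7:6)."""
    m, n = lower_upper
    k, l = left_right
    half_lower = width / 2.0          # half-width of the lower side
    half_upper = half_lower * n / m   # upper side shrunk to ratio M:N
    half_left = height / 2.0          # half-height of the left side
    half_right = half_left * l / k    # right side shrunk to ratio K:L
    corners = [
        (-half_lower, -half_left),    # lower left
        ( half_lower, -half_right),   # lower right
        ( half_upper,  half_right),   # upper right
        (-half_upper,  half_left),    # upper left
    ]
    # Finally, apply the rotational-axis tilt (e.g. 30 degrees).
    return [rotate(c, roll_deg) for c in corners]
```

With the example ratios from the text, a 100-wide object keeps a 100-unit lower side and an 80-unit upper side (5:4), matching the affine transformation described above.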
The display controller 304 controls the virtual object 600 to be disposed in the right display section 204R such that the coordinates of the upper left corner of the virtual object 600 are made to coincide with or close to the coordinates 402 of the smartphone 100. At this time, the virtual object 600 is displayed on the right display section 204R such that the upper side and lower side thereof each form substantially one straight line with the upper side and lower side of the smartphone 100, respectively. In other words, the length of the left side of the virtual object 600 is set to be substantially equal to the length of the right side of the smartphone 100. Thus, the display mode shown in
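The placement rule above can be sketched as follows. This is an illustrative assumption, simplified to the untilted case: the virtual object's upper-left corner is placed at the smartphone's upper-right corner (the coordinates 402), and its left side takes the length of the smartphone's right side so that the upper and lower sides continue as straight lines.

```python
def adjacent_object_corners(phone_upper_right, phone_lower_right, obj_width):
    """Place the virtual object to the right of the smartphone (untilted
    case): upper-left corner at the phone's upper-right corner, left side
    equal to the phone's right side."""
    ux, uy = phone_upper_right
    lx, ly = phone_lower_right
    return [
        (ux, uy),               # upper left = coordinates 402
        (ux + obj_width, uy),   # upper right (continues the upper side)
        (lx + obj_width, ly),   # lower right (continues the lower side)
        (lx, ly),               # lower left; left side = phone right side
    ]
```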
The present process is started when the smart glasses 110 are powered on by a user's operation performed on the operation section provided on the smart glasses 110, thereby enabling the display of a virtual object on the display section 204. It is assumed that when the process is started for the smart glasses 110, the smartphone 100 has been powered on.
In a step S801, the CPU sends a pairing request to the smartphone 100 according to a user's operation. The user's operation performed in this step may be an operation performed on the operation section or a user's gesture an image of which is captured by the image capturing section 203. In a step S811, the smartphone 100 is assumed to permit the pairing in response to the pairing request received in the step S801. Although not shown in the block diagram of the smartphone 100, the process performed in the smartphone 100 is realized by the CPU of the smartphone 100 that executes a predetermined program to control the components of the smartphone 100. Note that the smartphone 100 and the smart glasses 110 are paired because, for example, in a case where a plurality of smartphones appear within an image captured by the image capturing section 203, a smartphone that has been paired can be identified.
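The exchange of the steps S801 and S811, and the reason pairing matters, can be sketched as follows. The message format and field names here are purely hypothetical; the point is that the permission response identifies the paired smartphone so that it can later be singled out among multiple smartphones appearing in a captured image.

```python
class PairingState:
    """Minimal sketch of the S801/S811 exchange with hypothetical
    message names."""

    def __init__(self):
        self.paired_id = None

    def handle_response(self, message):
        # Expected permission response to the pairing request of step S801.
        if message.get("type") == "pairing_permitted":
            self.paired_id = message["device_id"]
        return self.paired_id

    def is_target(self, detected_id):
        # Filter detections down to the paired smartphone when several
        # smartphones appear within the captured image.
        return detected_id == self.paired_id
```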
In a step S802, the distance detection section 301 detects distance information of the smartphone 100 from an image of the smartphone 100 captured by the image capturing section 203. Note that the first embodiment aims to operate the virtual object 600 from the smartphone 100, and hence it is assumed that the user is directly viewing the smartphone 100 through the display section 204. Further, it is assumed that the smartphone 100 appears in the image captured by the image capturing section 203. Further, in a case where the control unit 202 analyzes (performs image processing of) the captured image before execution of the step S802, and it is detected as a result of the analysis that the smartphone 100 (display section 101) does not appear in the captured image, the present process may be terminated.
In a step S803, the position detection section 302 detects the position information of the smartphone 100, i.e. the coordinates 402 of the upper right corner of the smartphone 100, from the image of the smartphone 100 captured by the image capturing section 203. In a step S804, the tilt detection section 303 detects from the image of the smartphone 100 captured by the image capturing section 203, a tilt in the vertical rotational direction, a tilt in the horizontal rotational direction, and a tilt in the rotational axis direction, as the tilt information of the smartphone 100. In a step S805, the display controller 304 displays the virtual object 600 on the display section 204 (right display section 204R) based on the information on the distance, position, and tilts of the smartphone 100, obtained in the steps S802 to S804.
The display of the virtual object 600 is continued until the smart glasses 110 are powered off e.g. by a user's operation. Until the smart glasses 110 are powered off, the processing for acquiring the information on the position and posture of the smartphone 100 and the processing for displaying the virtual object 600 (steps S802 to S805) are repeatedly executed. That is, the display controller 304 continues to display the virtual object in a position adjacent to the smartphone 100 while changing the shape of the virtual object according to changes in position and posture of the smartphone 100.
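The repetition of the steps S802 to S805 can be sketched as a simple control loop. The four callables are injected stand-ins for the hardware interfaces (capture, detection, rendering); their names are assumptions for illustration, not the patent's actual API.

```python
def display_loop(powered_on, capture, detect, draw):
    """One possible control flow for the steps S802 to S805: while the
    glasses stay powered on, re-detect the smartphone in each captured
    frame and redraw the virtual object from the fresh distance,
    position, and tilt information."""
    frames_drawn = 0
    while powered_on():               # loop ends at power-off
        frame = capture()             # image from the image capturing section
        info = detect(frame)          # distance / position / tilts, or None
        if info is None:              # smartphone not in the captured image
            continue
        draw(info)                    # step S805: render adjacent to the phone
        frames_drawn += 1
    return frames_drawn
```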
Thus, in the first embodiment, the control unit 202 continues to detect the information on the distance, position, and tilts of the smartphone 100, and display the virtual object on the display section 204 based on these items of the information. With this, the user can appreciate the virtual object while changing the display mode of the virtual object by changing the posture and position of the smartphone 100 in a state in which the user can directly view the smartphone 100.
Note that although in the first embodiment, the information on the distance, position, and tilts of the smartphone 100 is detected by the smart glasses 110, this is not limitative. For example, in a case where the smartphone 100 has a variety of tilt information items obtained using e.g. a gyro sensor, the smartphone 100 may provide these tilt information items to the smart glasses 110. Further, as for the distance information and position information of the smartphone 100, in a case where the smartphone 100 has a ranging function, such as a LIDAR, the smartphone 100 may also provide the distance information and position information to the smart glasses 110.
Further, although in the first embodiment, the virtual object is displayed in a position adjacent to the right side of the smartphone 100 which can be directly viewed through the display section 204, the display position of the virtual object is not limited to this. For example, the virtual object may be displayed in a position adjacent to the left side of the smartphone 100. Further, the virtual object 600 may be displayed such that part or whole of the virtual object 600 is superimposed on the display section 101 of the smartphone 100.
Next, a second embodiment of the present invention will be described. A display system according to the second embodiment has the same basic configuration as the display system according to the first embodiment but is different in the configuration of the control unit 202 included in the smart glasses 110. Therefore, the configuration and functions of the control unit 202 will be mainly described, for describing the configuration and the control of the display system according to the second embodiment. Note that out of the components of the display system according to the second embodiment, components which are substantially the same as those of the display system according to the first embodiment are denoted by the same names and reference numerals, and the redundant description is omitted.
The icon displaying section 901 performs control for disposing an icon on the smartphone 100, which is used for operating the virtual object displayed on the display section 204. That is, the icon displaying section 901 requests the smartphone 100 to display the icon for operating the virtual object on the display section 101. The smartphone 100 having received this request sends a notification for permitting the display of the requested icon to the icon displaying section 901 as a response and displays the icon for operating the virtual object on the screen of the smartphone 100.
Note that an application (software) for displaying the icon for operating the virtual object on the display section 101 has been stored in the smartphone 100 in advance or acquired from an external device via the wireless communication path 120. For example, the application makes it possible to move a cursor displayed in the virtual object 600 by a user's flick operation on the icon and partially enlarge the virtual object 600 by a user's double-tap operation on the icon.
The operation performed on the icon disposed on the display section 101 of the smartphone 100 is converted to electrical signals and the electrical signals are transmitted to the display controller 902 of the smart glasses 110. The display controller 902 receives the electrical signals indicating the user's operation performed on the icon displayed on the display section 101 of the smartphone 100, and controls movement of the cursor displayed in the virtual object and the enlargement/reduction and so forth of the virtual object according to the received electrical signals.
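The mapping from received icon operations to virtual-object operations can be sketched as a small dispatcher. The event dictionary and gesture names below are hypothetical; the text specifies only that a flick moves the cursor displayed in the virtual object and a double tap partially enlarges it.

```python
def handle_icon_event(event, view):
    """Dispatch a touch event forwarded from the smartphone's icon to an
    operation on the virtual object's view state (hypothetical names)."""
    if event["gesture"] == "flick":
        # Flick on the icon moves the cursor displayed in the virtual object.
        cx, cy = view["cursor"]
        view["cursor"] = (cx + event["dx"], cy + event["dy"])
    elif event["gesture"] == "double_tap":
        # Double tap partially enlarges the virtual object.
        view["zoom"] *= 2.0
    return view
```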
Referring to
In the virtual object 600 appearing in
Referring to
The present process is started when the smart glasses 110 are powered on by a user's operation performed on the operation section provided on the smart glasses 110, thereby enabling the display of a virtual object on the display section 204. Note that it is assumed that when the process is started for the smart glasses 110, the smartphone 100 has already been powered on. Further, steps S1101 to S1105 and S1111 are the same as the steps S801 to S805 and S811 in
In a step S1106 following the step S1105, the control unit 202 (CPU) determines whether or not an icon display request by a user's operation has been received. The user's operation performed in this step may be an operation performed on the operation section or a user's gesture of which an image is captured by the image capturing section 203. If it is determined that an icon display request has not been received (NO to the step S1106), the control unit 202 continues, similar to the first embodiment, to display the virtual object 600 until the smart glasses 110 are powered off e.g. by a user's operation. Until the smart glasses 110 are powered off, the processing for acquiring the information on the position and posture of the smartphone 100 and the processing for displaying the virtual object 600 (steps S1102 to S1105) are repeatedly executed.
If it is determined that an icon display request has been received (YES to the step S1106), the control unit 202 proceeds to a step S1107. In the step S1107, the icon displaying section 901 sends an icon display request to the smartphone 100. In a step S1112, the CPU of the smartphone 100 permits the display of the icon in response to the icon display request and sends a response indicating the permission of display of the icon, to the icon displaying section 901. Then, in a step S1113, the CPU of the smartphone 100 displays the icon 1002 on the display section 101. This enables the user to operate the icon 1002.
In the smart glasses 110, in a step S1108, the display controller 902 displays the virtual object 600 on the display section 204 according to the response of the permission of the icon display to the icon display request sent in the step S1107. At this time, as shown in
Thus, in the second embodiment, the icon for operating the virtual object displayed on the display section 204 is displayed on the display section 101 of the smartphone 100 to make it possible to operate the virtual object by a user's operation performed on the icon. This enables the user to intuitively operate the virtual object on the smartphone 100 directly viewed via the display section 204 without touching the display section 204 on which the virtual object is displayed. That is, it is possible to easily operate the virtual object while ensuring visibility of the smartphone 100 and the virtual object.
The display position of the icon 1002 is not limited to the lower left of the display section 101, and the size of the icon 1002 is not limited insofar as the icon can be easily operated by a user. Further, the number of icons displayed on the display section 101 is not limited to one, but a plurality of icons (such as icons dedicated to performing predetermined respective operations) may be displayed. The operations which can be executed on the icon 1002 are not limited to the flick and double-tap operations; these operations are not necessarily required, and the operations may be any of single-tap, swipe, pinch-in, pinch-out, and other like operations.
Next, a third embodiment of the present invention will be described. A display system according to the third embodiment has the same basic configuration as the display system according to the first embodiment but is different in the configuration of the control unit 202 included in the smart glasses 110. Therefore, the configuration and functions of the control unit 202 will be mainly described, for describing the configuration and the control of the display system according to the third embodiment. Note that out of the components of the display system according to the third embodiment, components which are substantially the same as those of the display system according to the first embodiment are denoted by the same names and reference numerals, and the redundant description is omitted.
In the third embodiment, the smartphone 100 is in a power-off state and pairing with the smart glasses 110 has not been performed yet. On the other hand, the object recognition section 1201 recognizes the smartphone 100 by detecting a user's specific operation, such as a double-tap operation, performed on the smartphone 100, from an image acquired via the image capturing section 203.
Here, a situation may arise in which the tilt of the smartphone 100 in the vertical rotational direction, the tilt in the horizontal rotational direction, and the tilt in the rotational axis direction, detected by the tilt detection section 303, become larger than respective tilt threshold values set in advance. In this case, the display controller 1202 invalidates the tilt information detected by the tilt detection section 303 and normally displays the virtual object on the display section 204 without requiring the user to touch the smartphone 100.
The operation of normally displaying the virtual object refers to an operation of displaying the virtual object on the display section 204 in a positional relationship in which a line connecting the right eye and left eye of the user wearing the smart glasses 110 and the upper side or lower side of the virtual object are substantially parallel to each other. For example, in a case where the angle α of the smartphone 100 in the rotational axis direction, detected by the tilt detection section 303, is larger than the tilt threshold value set in advance, the virtual object is displayed on the display section 204 without rotating the vector of the virtual object in the rotational axis direction. Note that the tilt threshold value set in advance may be a value set for the smart glasses 110 as a default value or may be a value set by a user.
Let it be assumed, for example, that in a case where the tilt threshold value is set to 30 degrees, the smartphone 100 is tilted in the rotational axis direction from a state tilted by 20 degrees to a state tilted by 60 degrees. In this case, while the tilt angle in the rotational axis direction is changed from 20 degrees up to 30 degrees, the virtual object 600 is subjected to vector rotation in the rotational axis direction according to the tilt of the smartphone 100 in the rotational axis direction. Then, when the tilt angle in the rotational axis direction exceeds 30 degrees, the display state in which the virtual object is tilted in the rotational axis direction through 30 degrees is maintained without further subjecting the virtual object 600 to vector rotation. Therefore, the display of the virtual object 600 is not changed after the tilt angle of the smartphone 100 in the rotational axis direction exceeds 30 degrees and until it reaches 60 degrees.
This makes it possible to prevent the virtual object 600 from being displayed on the display section 204 in a state in which it is difficult for the user to recognize the contents of the virtual object 600. Note that threshold values (a threshold value of a ratio between the upper side and the lower side, and a threshold value of a ratio between the left side and the right side) are also set for the tilt in the vertical rotational direction and the tilt in the horizontal rotational direction, respectively, and the display control is similarly performed.
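The threshold rule of the third embodiment amounts to clamping the displayed tilt, which can be sketched as follows. The symmetric handling of negative angles is an assumption added for illustration; the text describes only the positive case (following the smartphone up to 30 degrees and holding there as it tilts on toward 60 degrees).

```python
def clamp_tilt(detected_deg, threshold_deg=30.0):
    """Follow the smartphone's detected tilt up to the preset threshold,
    then hold the virtual object's displayed tilt at the threshold so
    the object stays readable."""
    if detected_deg > threshold_deg:
        return threshold_deg        # e.g. detected 60 deg -> display 30 deg
    if detected_deg < -threshold_deg:
        return -threshold_deg       # assumed symmetric for negative tilts
    return detected_deg             # within the threshold: track the phone
```

With the worked example above, the displayed tilt tracks the smartphone from 20 up to 30 degrees and then stays at 30 degrees while the smartphone continues on to 60 degrees.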
The present process is started when the smart glasses 110 are powered on by a user's operation performed on the operation section provided on the smart glasses 110, thereby enabling the display of a virtual object on the display section 204. In a step S1401, the object recognition section 1201 recognizes the smartphone 100 as an object. Steps S1402 to S1404 are the same as the steps S802 to S804 in
In a step S1405, the display controller 1202 determines whether or not each of the tilts in the vertical rotational direction, the horizontal rotational direction, and the rotational axis direction, detected with respect to the smartphone 100 recognized in the step S1401, is larger than the associated tilt threshold value set to each tilt in advance.
If the detected tilts are not larger than the associated tilt threshold values, respectively (NO to the step S1405), the display controller 1202 proceeds to a step S1406. The step S1406 et seq. are the same as the step S805 et seq. in
Thus, in the third embodiment, the smartphone 100 which has not been paired is recognized, and if a tilt detected with respect to the recognized smartphone 100 is larger than the associated tilt threshold value, the detected tilt is invalidated. This makes it possible to control the display mode of the virtual object according to the operation of the smartphone 100 visually recognized through the display section 204, and to display the virtual object on the display section 204 in a state in which the user can always easily recognize the contents of the virtual object.
Note that restriction of change of the shape of the virtual object in the third embodiment can be also applied to the display control of the virtual object in a state in which the smartphone 100 and the smart glasses 110 have been paired as in the first embodiment.
The present invention has been described heretofore based on the embodiments thereof. However, the present invention is not limited to these embodiments, but it is to be understood that the invention includes a variety of forms within the scope of the gist of the present invention. Further, the embodiments of the present invention are described only by way of example, and it is possible to combine the embodiments on an as-needed basis.
For example, although in the above-described embodiments, the smart glasses 110, as an example of the optical transmission type (optical see-through type) HMD, have been described as the display device, a video see-through type HMD may be used in place of this. A display section of the video see-through type HMD, which corresponds to lenses of general glasses, is not transparent. Therefore, in a case where the video see-through type HMD is used, a display image generated by superimposing an image of a virtual object on a real image captured by an image capturing section is displayed on the display section in a manner visually recognizable by a user.
Further, although in the above-described embodiments, the smartphone 100 is detected from an image captured by the image capturing section 203 by the control unit 202 of the smart glasses 110, this detection may be performed by the control unit 312 of the smartphone 100. That is, the smartphone 100 may receive (image data of) an image captured by the image capturing section 203, using the wireless communication section 311, and the control unit 312 may perform detection of the smartphone 100 from the received image, whereafter the detection result may be transmitted from the wireless communication section 311 to the smart glasses 110. This method is effective in a case where the processing capability of the control unit 312 of the smartphone 100 is higher than the processing capability of the control unit 202 of the smart glasses 110.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2021-015070, filed Feb. 2, 2021, which is hereby incorporated by reference herein in its entirety.
Number | Date | Country | Kind
---|---|---|---
2021-015070 | Feb 2021 | JP | national

Number | Name | Date | Kind
---|---|---|---
9886796 | Kobayashi | Feb 2018 | B2
10334076 | McKenzie et al. | Jun 2019 | B2
20130031511 | Adachi | Jan 2013 | A1
20150022444 | Ooi | Jan 2015 | A1
20150062164 | Kobayashi | Mar 2015 | A1
20160163109 | Kobayashi | Jun 2016 | A1
20180164404 | Koga | Jun 2018 | A1
20180321798 | Kawamura | Nov 2018 | A1
20220327646 | Suzuki | Oct 2022 | A1

Number | Date | Country
---|---|---
2015-032131 | Feb 2015 | JP
2017-146651 | Aug 2017 | JP

Number | Date | Country
---|---|---
20220244899 A1 | Aug 2022 | US