The present application claims priority of Japanese Patent Application No. 2020-97112 filed on Jun. 3, 2020, the contents of which are incorporated herein by reference.
The present invention relates to an image diagnosis method, device, and system for detecting a specific tissue or cell using an image.
In recent years, “pathological diagnosis”, that is, microscopic observation of a lesion tissue specimen, has come to play an important role in disease diagnosis. In pathological diagnosis, most of the work from specimen preparation to diagnosis is performed manually and is difficult to automate. In particular, ability and experience are important factors for accurate diagnosis, and the accuracy of a diagnosis depends on the ability of the pathologist. Meanwhile, there is a shortage of pathologists in the medical field due to factors such as the increase in cancer patients that accompanies the aging of the population. Therefore, there is a growing need for image processing techniques that support pathological diagnosis, remote diagnosis, and the like.
For example, the technique disclosed in PTL 1 is known as a technique for supporting pathological diagnosis as described above. PTL 1 discloses that “an image diagnosis support device includes an image data acquisition unit that acquires high-magnification image data of a specimen tissue, and the image diagnosis support device includes: an image sorting unit that generates low-magnification image data from the high-magnification image data acquired by the image data acquisition unit, and sorts the generated low-magnification image data into a group of each image data pattern of a plurality of pathological tissues; and an image determination unit that determines whether the high-magnification image data serving as a source of the low-magnification image data sorted by the image sorting unit is a pathological tissue in the sorted group.”
In the related art, whether a tissue/cell is an abnormal tissue/cell (for example, a tissue/cell indicating malignancy) or a benign tissue/cell is determined based on an image including the tissue/cell. In practice, however, it can be difficult to determine whether a tissue/cell is abnormal or benign, and in some cases it is better to carry out follow-up observation. In such cases, forcing a determination of abnormal or benign may cause detection omission of an abnormal tissue/cell (overlooking of a malignant tissue/cell) or erroneous detection of an abnormal tissue/cell. Detection omission of an abnormal tissue/cell may lead to aggravation of symptoms, and erroneous detection of an abnormal tissue/cell may result in excision of the wrong site.
The invention has been made in view of such circumstances, and an object thereof is to provide a technique for presenting not only malignant and benign but also follow-up as a diagnosis result based on an image including a tissue/cell.
A representative example of the invention disclosed in the present application is outlined as follows. That is, there is provided an image diagnosis method executed by an image diagnosis support device. The image diagnosis support device includes an arithmetic device, a storage device, and a connection interface configured to connect to an external device. The image diagnosis method includes: a step of, by the arithmetic device, acquiring an image including at least one of a tissue and a cell as an element via the connection interface, and storing the image in the storage device; a step of, by the arithmetic device, classifying, for each partial image that is a part of the image, a property of the element included in the partial image, and storing a classification result in the storage device; a step of, by the arithmetic device, sorting the image into any one of benign indicating that no lesion element is present, malignant indicating that a lesion element is present, and follow-up based on classification results of the plurality of partial images, and storing a sorting result as a diagnosis result in the storage device; and a step of outputting, by the arithmetic device, the diagnosis result via the connection interface.
According to the invention, the image diagnosis support device can present, as a diagnosis result, any one of malignant, benign, and follow-up based on an image. Problems, configurations, and effects other than those described above will be apparent from the following description of embodiments.
Each embodiment of the invention provides an image diagnosis support device that prevents detection omission and erroneous detection of an abnormal tissue/cell, that is, a lesion tissue/cell such as a cancer, by outputting any one of malignant, benign, and follow-up as a diagnosis result, and a method for the same.
Hereinafter, embodiments of the invention will be described with reference to the accompanying drawings. In the accompanying drawings, functionally identical elements may be denoted by the same reference numerals. The accompanying drawings show specific examples of implementation according to the principle of the invention, but these are for the purpose of understanding the invention and are not used to limit the invention.
While the present embodiment is described in sufficient detail for a person skilled in the art to carry out the invention, it should be understood that other implementations and forms are possible, and that changes in configuration and structure and replacement of various elements are possible without departing from the scope and spirit of the technical idea of the invention. Therefore, the following description should not be construed as limiting.
As will be described later, the embodiments of the invention may be implemented by software running on a general-purpose computer, or may be implemented by dedicated hardware or a combination of software and hardware.
Hereinafter, each process in the embodiments of the invention will be described with each processing unit (for example, the feature extraction unit) as the subject (operation subject) of the description. However, since a program performs its defined processing only by being executed by an arithmetic device such as a processor or CPU while using a memory and a communication port (communication control device), the description may instead be made with the arithmetic device as the subject.
The image diagnosis support device 1 includes an input unit 100, a feature extraction unit 101, a classification unit 102, a follow-up determination unit 103, a drawing unit 104, a recording unit 105, and a control unit 106.
The input unit 100, the feature extraction unit 101, the classification unit 102, the follow-up determination unit 103, the drawing unit 104, the recording unit 105, and the control unit 106 may be implemented by programs or by modularization.
The input unit 100 receives an input of an image. For example, the input unit 100 may receive, as input images, still images encoded in JPEG, JPEG 2000, PNG, BMP, or similar formats, captured at predetermined intervals by an imaging unit such as a camera built into a microscope. The input unit 100 may extract still images of frames at predetermined intervals from a moving image in Motion JPEG, MPEG, H.264, HD/SDI, or similar formats, and receive the still images as input images. The input unit 100 may receive, as an input image, an image acquired by an imaging unit via a bus, a network, or the like. The input unit 100 may also receive, as an input image, an image stored in a detachable recording medium.
The feature extraction unit 101 acquires definition information on an algorithm (model) stored in a sub-storage device 203, and extracts, using the algorithm, features related to a tissue/cell from a partial image that is a part of an image (input image) including the tissue/cell.
The classification unit 102 acquires definition information on an algorithm (model) stored in the sub-storage device 203, and calculates, using the algorithm and the features, a classification intensity representing likelihood of a normal tissue/cell (for example, benign tissue/cell) and a classification intensity representing likelihood of an abnormal tissue/cell (for example, malignant tissue/cell) for each partial image of the input image. That is, a value indicating a degree corresponding to a certain property of the tissue/cell is calculated. The classification unit 102 outputs a classification result of benign or malignant of each partial image based on the classification intensity.
The follow-up determination unit 103 calculates an area (classification area) of each classification result in the input image based on the classification result of each partial image. The follow-up determination unit 103 sorts the input image into any one of malignant, benign, and follow-up based on the classification intensity and the classification area of each partial image.
The drawing unit 104 displays an image representing a distribution of the classification intensities of the classification results calculated by the classification unit 102. The drawing unit 104 displays, for example, an input image that is color-coded corresponding to each classification result. The drawing unit 104 displays a sorting result (diagnosis result) obtained by the follow-up determination unit 103.
The recording unit 105 stores the image and the diagnosis result displayed by the drawing unit 104 in the sub-storage device 203 (see
The control unit 106 controls the entire image diagnosis support device 1. The control unit 106 is connected to the input unit 100, the feature extraction unit 101, the classification unit 102, the follow-up determination unit 103, the drawing unit 104, and the recording unit 105. Each functional unit of the image diagnosis support device 1 operates autonomously or according to an instruction from the control unit 106.
The image diagnosis support device 1 according to the first embodiment extracts, using the feature extraction unit 101, features indicating abnormality likelihood of a tissue/cell included in an input image, and calculates, using the classification unit 102, classification intensities indicating degrees of normality likelihood and abnormality likelihood of the tissue/cell for the input image. Further, the image diagnosis support device 1 calculates, using the follow-up determination unit 103, a classification area for each classification result (normal and abnormal), and sorts the input image into one of the classes malignant, benign, and follow-up using the classification intensity and the classification area of the classification result of each partial image.
For the functional units in the image diagnosis support device 1, a plurality of functional units may be integrated into one functional unit, or one functional unit may be divided into a plurality of functional units for each function. For example, the feature extraction unit 101 may be provided in the classification unit 102.
The image diagnosis support device 1 may be implemented not as a device but as a function. In this case, the function may be implemented in a tissue/cell image acquisition device such as a virtual slide, or may be implemented in a server connected to a tissue/cell image acquisition device via a network as described in second and third embodiments.
The image diagnosis support device 1 includes a CPU 201, a main storage device 202, the sub-storage device 203, an output device 204, an input device 205, and a communication device 206. The pieces of hardware described above are interconnected via a bus 207.
The CPU 201 is an example of an arithmetic device, reads a program from the main storage device 202 as necessary, and executes the program.
The main storage device 202 is a storage device such as a memory, and stores programs for implementing the input unit 100, the feature extraction unit 101, the classification unit 102, the follow-up determination unit 103, the drawing unit 104, the recording unit 105, and the control unit 106. The main storage device 202 includes a work area to be temporarily used by the program.
The sub-storage device 203 is a storage device such as a hard disk drive (HDD) or a solid state drive (SSD), and permanently stores data. The sub-storage device 203 according to the first embodiment stores the input image, the classification result and the classification intensity of each partial image of the input image output by the classification unit 102, the classification area and the diagnosis result of each classification result output by the follow-up determination unit 103, and the image generated by the drawing unit 104 (a numerical value of the classification intensity for drawing the classification intensity, and position information). The sub-storage device 203 stores information on algorithms to be used by the feature extraction unit 101 and the classification unit 102, and the like.
The output device 204 includes devices such as a display, a printer, and a speaker. For example, the output device 204 displays display data generated by the drawing unit 104 on the display.
The input device 205 includes devices such as a keyboard, a mouse, and a microphone. A user inputs various instructions such as determination of an input image and setting of parameters to the image diagnosis support device 1 using the input device 205.
The communication device 206 communicates with other devices via a network. For example, the communication device 206 receives data such as an image transmitted from a device connected via a network, such as a server, and stores the data in the sub-storage device 203. The communication device 206 is not an essential component of the image diagnosis support device 1. For example, when a communication device is provided in a computer or the like connected to an image acquisition device, the image diagnosis support device 1 may not include the communication device 206.
Next, the functional units in the image diagnosis support device 1 will be described in detail.
First, the feature extraction unit 101 will be described.
The feature extraction unit 101 extracts features of an input image. As shown in
For example, as shown in
In Equation (1), wj represents a filter coefficient, pj represents a pixel value, bi represents an offset value, m represents the number of filter coefficients, and h represents a nonlinear function. The filter coefficient wj is a coefficient of the classifier, and is calculated using a known machine learning technique such that a normal tissue/cell and an abnormal tissue/cell can be classified. The sub-storage device 203 stores parameters such as the filter coefficient wj and the offset value bi.
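From the terms defined above, Equation (1) can be read as FA_i = h(Σ_{j=1..m} w_j·p_j + b_i). The following Python fragment is a minimal sketch of that calculation; the coefficients, offset, and choice of tanh as the nonlinear function h are illustrative stand-ins, not the parameters actually learned by machine learning:

```python
import numpy as np

def extract_feature(p, w, b, h=np.tanh):
    """Sketch of Equation (1): FA_i = h(sum_j w_j * p_j + b_i).

    p : pixel values p_j of a partial image (flattened), length m
    w : filter coefficients w_j (learned in the real device), length m
    b : offset value b_i
    h : nonlinear function (tanh chosen here purely for illustration)
    """
    return h(np.dot(w, p) + b)

# Illustrative values only -- real coefficients come from training.
p = np.array([0.2, 0.8, 0.5])
w = np.array([0.1, -0.4, 0.3])
b = 0.05
fa = extract_feature(p, w, b)  # a single feature FA_i for this partial image
```

In the actual device, many such filters are applied per partial image, yielding the matrix f of features used by the classification unit 102.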
Next, the classification unit 102 will be described.
The classification unit 102 executes logistic regression processing using a matrix f including the features FAi calculated by the feature extraction unit 101 to calculate a value (classification intensity) indicating lesion likelihood.
The classification unit 102 calculates a classification intensity y of each classification result based on, for example, Equation (2). That is, as shown in
[Math. 2]
y=g(w×f+b) (2)
In Equation (2), w represents a weight matrix, b represents an offset value, g represents a nonlinear function, and y represents a classification intensity of a classification result. The weight matrix w and the offset value b are calculated using a known machine learning technique. For example, CNN may be used as a machine learning technique. The sub-storage device 203 stores parameters such as the weight matrix w and the offset value b.
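A minimal sketch of Equation (2) follows, assuming g is the logistic (sigmoid) function, as is conventional for logistic regression; the weight matrix and offset values are illustrative, not learned parameters:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def classification_intensity(f, W, b, g=sigmoid):
    """Sketch of Equation (2): y = g(W f + b).

    f : features FA_i from the feature extraction unit (here a vector)
    W : weight matrix w (learned in the real device)
    b : offset values b
    g : nonlinear function; with sigmoid each intensity lies in (0, 1)
    """
    return g(W @ f + b)

# Illustrative parameters: row 0 scores "normal" likelihood, row 1 "abnormal".
f = np.array([0.3, -0.1, 0.7])
W = np.array([[0.5, -0.2, 0.1],
              [-0.3, 0.4, 0.8]])
b = np.array([0.0, -0.1])
y = classification_intensity(f, W, b)  # one intensity per classification result
```

Each component of y is then compared against a threshold to decide the per-partial-image classification result.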
Next, the follow-up determination unit 103 will be described.
The follow-up determination unit 103 calculates a classification area of each classification result in an input image as shown in
For example, the follow-up determination unit 103 calculates each classification area based on the total for each color (for example, the number of pixels of that color) in the input image. Alternatively, the follow-up determination unit 103 detects a tissue region from the input image, and calculates each classification area based on the corresponding totals (for example, the numbers of pixels) within the tissue region.
The follow-up determination unit 103 calculates a value of lesion likelihood for the input image using the classification intensities y of the classification results of the plurality of partial images. The value of the lesion likelihood is a real number between 0 and 1.
The follow-up determination unit 103 determines whether the input image corresponds to any one of the classes malignant, benign, and follow-up based on determination conditions related to the classification area of each classification result.
For example, the following determination processing may be considered.
(Step 0) The follow-up determination unit 103 determines whether there is a region in which the classification intensity of the classification result “abnormal” is A1 or larger. When this determination condition is satisfied, the follow-up determination unit 103 determines that the class of the input image is “malignant”. When the condition is not satisfied, the follow-up determination unit 103 proceeds to Step 1.
(Step 1) The follow-up determination unit 103 determines whether the classification area in which the classification intensity of the classification result “abnormal” is less than A1 and equal to or greater than A2 is less than B1%. When this determination condition is satisfied, the follow-up determination unit 103 determines that the class of the input image is “benign”. When the condition is not satisfied, the follow-up determination unit 103 proceeds to Step 2.
(Step 2) The follow-up determination unit 103 determines whether the classification area in which the classification intensity of the classification result “abnormal” is less than A1 and equal to or greater than A2 is equal to or greater than B1% and less than B2%. When this determination condition is satisfied, the follow-up determination unit 103 determines that the class of the input image is “follow-up”. When the condition is not satisfied, the follow-up determination unit 103 proceeds to Step 3.
(Step 3) When the classification area in which the classification intensity of the classification result “abnormal” is less than A1 and equal to or greater than A2 is equal to or greater than B2%, the follow-up determination unit 103 determines that the class of the input image is “malignant”.
A1, A2, B1, and B2 are preset thresholds. A1 and A2 are values such as 0.8 and 0.5, respectively, and B1 and B2 are values such as 5 and 10 (in %). In an example shown in
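The four determination steps above can be sketched as follows. This is a sketch under assumptions: the thresholds default to the example values given in the text, and each partial image is treated as one unit of classification area.

```python
def sort_image(intensities, A1=0.8, A2=0.5, B1=5.0, B2=10.0):
    """Sketch of Steps 0-3 of the follow-up determination.

    intensities : classification intensity y of the "abnormal" result,
                  one value per partial image of the input image
    Returns "malignant", "benign", or "follow-up".
    """
    # Step 0: any region with abnormal intensity >= A1 -> malignant.
    if any(y >= A1 for y in intensities):
        return "malignant"
    # Classification area (% of partial images) with A2 <= y < A1.
    area = 100.0 * sum(1 for y in intensities if A2 <= y < A1) / len(intensities)
    # Step 1: area below B1% -> benign.
    if area < B1:
        return "benign"
    # Step 2: B1% <= area < B2% -> follow-up.
    if area < B2:
        return "follow-up"
    # Step 3: area >= B2% -> malignant.
    return "malignant"
```

For instance, an image in which 7% of partial images fall in the intermediate band (and none reaches A1) would be sorted as follow-up under these example thresholds.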
In this way, the image diagnosis support device 1 according to the first embodiment executes determination in consideration of the classification intensity of the classification result and the area of the classification result, thereby enabling sorting of “benign” and “malignant” as well as “follow-up”.
Next, the drawing unit 104 will be described.
The drawing unit 104 presents a diagnosis result together with a basis. For example, the drawing unit 104 presents the diagnosis result using a graphical user interface (GUI) 600 as shown in
The GUI 600 includes a diagnosis result sorting field 601, a display field 602, an image button 603, and a diagnosis result presentation field 604. The diagnosis result sorting field 601 is a field for displaying a class output as the diagnosis result. The corresponding class is highlighted among icons of classes in the diagnosis result sorting field 601. The display field 602 is a field for displaying a value of lesion likelihood calculated by the classification unit 102. The image button 603 is an operation button for displaying an image. The diagnosis result presentation field 604 is a field for displaying the diagnosis result output by the follow-up determination unit 103.
As shown in
When the image button 603 is operated, the drawing unit 104 presents a GUI 610 as shown in
Next, the recording unit 105 will be described.
The recording unit 105 stores, in the sub-storage device 203, the input image, the classification intensity of each classification result, the diagnosis result, coordinate information for drawing the input image that is color-coded by the drawing unit 104 according to each classification result, and the like.
The functional units in the image diagnosis support device 1 have been described above. Next, processing executed by the image diagnosis support device 1 will be described.
When receiving an input image, the image diagnosis support device 1 starts processing described below.
The input unit 100 inputs the input image to the feature extraction unit 101 (step S701).
The feature extraction unit 101 acquires definition information on a classifier stored in the sub-storage device 203, and calculates the features FAi of a partial image of the input image using the definition information of the classifier (step S702).
Specifically, the feature extraction unit 101 acquires parameters such as the filter coefficient wj and the offset value bi from the sub-storage device 203, and calculates the features FAi using Equation (1).
The classification unit 102 acquires the definition information on the classifier stored in the sub-storage device 203, and calculates the classification intensity y of a classification result of the partial image using the matrix f including the definition information of the classifier and the features FAi (step S703).
Specifically, the classification unit 102 acquires parameters such as the weight matrix w and the offset value b from the sub-storage device 203, and calculates the classification intensity y of the classification result of the partial image using Equation (2).
The classification unit 102 determines whether a tissue/cell included in the partial image is normal or abnormal based on a comparison result between the classification intensity y and a threshold Th1 (step S704).
Specifically, the classification unit 102 determines whether the classification intensity y is equal to or larger than the threshold Th1.
When the classification intensity y is equal to or larger than the threshold Th1, the classification unit 102 sets a value (for example, 1) indicating an abnormal tissue/cell as a classification result res (step S705), and then proceeds to step S707. When the classification intensity y is smaller than the threshold Th1, the classification unit 102 sets a value (for example, 0) indicating a normal tissue/cell as the classification result res (step S706), and then proceeds to step S707.
The classification unit 102 determines whether classification of all the partial images of the input image is completed (step S707).
When the classification of all the partial images of the input image is not completed, the classification unit 102 calls the feature extraction unit 101, and the processing proceeds to step S702. When the classification of all the partial images of the input image is completed, the classification unit 102 calls the follow-up determination unit 103, and the processing proceeds to step S708.
The follow-up determination unit 103 calculates a classification area of each classification result based on the classification result of each of the plurality of partial images, and sorts the input image using the classification intensity and the classification area of each classification result (step S708).
Specifically, the follow-up determination unit 103 sorts the input image into any class of benign, malignant, and follow-up by executing the determination processing described above. The follow-up determination unit 103 calculates a value of lesion likelihood based on the classification intensity of each classification result.
The drawing unit 104 presents a diagnosis result (step S709).
Specifically, the drawing unit 104 displays, together with the diagnosis result as shown in
The recording unit 105 stores, in the sub-storage device 203, the input image, the classification intensity of each classification result, the diagnosis result, the coordinate information for drawing the input image that is color-coded according to each classification result, and the like (step S710).
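Putting steps S702 through S708 together, the per-partial-image loop and the final sorting can be sketched as follows. The thresholds Th1, A1, A2, B1, and B2 default to the illustrative example values, and `extract_feature` and `classify` are hypothetical stand-ins for the classifier defined by the parameters stored in the sub-storage device 203:

```python
def diagnose(partial_images, extract_feature, classify,
             Th1=0.5, A1=0.8, A2=0.5, B1=5.0, B2=10.0):
    """Sketch of steps S702-S708; all thresholds are illustrative.

    extract_feature : computes features FA_i of a partial image (step S702)
    classify        : computes classification intensity y from features (step S703)
    Returns (diagnosis, per-partial-image classification results res).
    """
    ys = [classify(extract_feature(p)) for p in partial_images]   # S702-S703
    res = [1 if y >= Th1 else 0 for y in ys]   # S704-S706: 1 = abnormal, 0 = normal
    # Step S708: sort the whole image using intensities and classification area.
    if any(y >= A1 for y in ys):
        diagnosis = "malignant"
    else:
        area = 100.0 * sum(1 for y in ys if A2 <= y < A1) / len(ys)
        if area < B1:
            diagnosis = "benign"
        elif area < B2:
            diagnosis = "follow-up"
        else:
            diagnosis = "malignant"
    return diagnosis, res

# Stub classifier for illustration: the intensity equals the (scalar) "image".
cls, res = diagnose([0.9, 0.1, 0.3], lambda p: p, lambda f: f)
```

With the stub above, the first partial image exceeds A1, so the whole image is sorted as malignant while the per-partial-image results record which regions crossed Th1.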
As described above, according to the first embodiment, the image diagnosis support device 1 can sort the input image into any one of benign, malignant, and follow-up based on the classification intensity and the classification area of each classification result. This can prevent detection omission and erroneous detection of a lesion caused by forcing every image to be sorted as either benign or malignant.
A system using the image diagnosis support device 1 described in the first embodiment will be described in a second embodiment.
The remote diagnosis support system 800 includes a server 801 including an image acquisition device and a server 802 having a function of the image diagnosis support device 1. The two servers 801, 802 are installed at physically separate locations, and are connected to each other via a network 803. The network 803 is a local area network (LAN), a wide area network (WAN), or the like, and a connection method may be wired or wireless.
The server 801 is a computer on which the image acquisition device such as a virtual slide device and a camera is mounted. The server 801 includes an imaging unit 811 that captures an image, and a display unit 812 that displays a diagnosis result transmitted from the server 802.
The server 801 includes an arithmetic device, a storage device, and a communication device that transmits the image to the server 802 and receives data from the server 802, which are not shown.
The server 802 is a computer that executes image processing similar to that of the image diagnosis support device 1 according to the first embodiment. The server 802 includes an image diagnosis support unit 821 that executes image processing according to the first embodiment on the image transmitted by the server 801, a storage unit 822 that stores a diagnosis result output from the image diagnosis support unit 821, and a display unit 823 that displays the diagnosis result output from the image diagnosis support unit 821.
The server 802 includes an arithmetic device, a storage device, and a communication device that transmits the diagnosis result to the server 801 and receives the image from the server 801, which are not shown. The server 802 may include the image diagnosis support unit 821 as hardware equivalent to the image diagnosis support device 1, or may include the image diagnosis support unit 821 as a program.
The image diagnosis support unit 821 classifies presence or absence of an abnormal tissue/cell such as a cancer in the image captured by the imaging unit 811. The image diagnosis support unit 821 sorts lesion likelihood according to a degree of progress (classification intensity) of the abnormal tissue/cell based on a classification result using features of the input image. The image diagnosis support unit 821 sorts the input image into any one of benign, malignant, and follow-up using the classification intensity and a classification area of each classification result, and outputs the sorted result as the diagnosis result.
The display units 812, 823 display the classification result of the input image, the diagnosis result, and the like on screens connected to the servers 801, 802, respectively.
The image acquisition device may be a regenerative medicine device including an imaging unit, an iPS cell culture device, or an MRI or ultrasonic imaging device.
According to the second embodiment, a computer (or system) installed at a certain location can sort an image transmitted from a facility or the like at a different location into any one of benign, malignant, and follow-up, and transmit a diagnosis result to the facility or the like at the different location. Accordingly, a computer in the facility or the like that has transmitted the image can display the diagnosis result. In this way, the remote diagnosis support system can be provided according to the second embodiment.
A system using the image diagnosis support device 1 described in the first embodiment will be described in a third embodiment.
The network consignment service providing system 900 includes a server 901 including an image acquisition device and a server 902 having a function of the image diagnosis support device 1. The two servers 901, 902 are installed at physically separate locations, and are connected to each other via a network 903. The network 903 is a local area network (LAN), a wide area network (WAN), or the like, and a connection method may be wired or wireless.
The server 901 is a computer on which the image acquisition device such as a virtual slide device and a camera is mounted. The server 901 includes an imaging unit 911 that captures an image, a storage unit 912 that stores classifier information transmitted from the server 902, and an image diagnosis support unit 913 that executes image processing according to the first embodiment on the image newly captured by the imaging unit 911 using the classifier information.
The server 901 includes an arithmetic device, a storage device, and a communication device that transmits the image to the server 902 and receives the classifier information from the server 902, which are not shown. The server 901 may include the image diagnosis support unit 913 as hardware equivalent to the image diagnosis support device 1, or may include the image diagnosis support unit 913 as a program.
The server 902 is a computer that executes image processing similar to that of the image diagnosis support device 1 according to the first embodiment. The server 902 includes an image diagnosis support unit 921 that executes the image processing according to the first embodiment, and a storage unit 922 that stores the classifier information output from the image diagnosis support unit 921.
The server 902 includes an arithmetic device, a storage device, and a communication device that transmits the classifier information to the server 901 and receives the image from the server 901, which are not shown. The server 902 may include the image diagnosis support unit 921 as hardware equivalent to the image diagnosis support device 1, or may include the image diagnosis support unit 921 as a program.
The server 902 executes the image processing according to the first embodiment on the image captured by the imaging unit 911, and executes machine learning for generating a classifier (model) for correctly classifying a normal tissue/cell and an abnormal tissue/cell. Here, the classifier includes the feature extraction unit 101 and the classification unit 102. The server 902 transmits the classifier information, which is definition information on the classifier, to the server 901 installed in a facility or the like at a different location.
When performing actual diagnosis, the server 901 reads the classifier information from the storage unit 912, and executes classification processing and sorting processing on the image captured by the imaging unit 911 using the classifier defined by the classifier information.
The image acquisition device may be a regenerative medicine device including an imaging unit, an iPS cell culture device, or an MRI or ultrasonic imaging device.
The server 902 may execute learning processing a plurality of times in advance using images having different characteristics with the image diagnosis support unit 921, and store a plurality of pieces of classifier information in the storage unit 922. When receiving an image from the server 901, the server 902 executes classification processing and sorting processing using the image diagnosis support unit 921 with each classifier set, and transmits the classifier information on the classifier with the highest accuracy to the server 901.
According to the third embodiment, a computer (or a system) installed at a certain location can generate a classifier using an image transmitted from a facility or the like at a different location and transmit information on the classifier to the facility or the like at the different location. Accordingly, a computer in the facility or the like that has transmitted the image can output a diagnosis result of any one of benign, malignant, and follow-up for a new image. In this way, the network consignment service providing system can be provided according to the third embodiment.
The embodiments described above can be modified as follows.
The feature extraction unit 101 calculates the features using a filter (a CNN or the like) generated by machine learning, but may instead use other features such as HOG to achieve the same effects.
The feature extraction unit 101 calculates the features using one classifier for an input image, but may calculate the features using two or more classifiers to achieve the same effects.
The classification unit 102 executes logistic regression processing to classify a tissue/cell, but may use linear regression, Poisson regression, or the like to achieve the same effects.
The invention is not limited to the embodiments described above, and includes various modifications. For example, the embodiments described above have been described in detail for easy understanding of the invention, and the invention is not necessarily limited to those including all the configurations described. Part of the configuration of one embodiment can be added to, deleted from, or replaced with another configuration.
In addition, some or all of the above configurations, functions, processing units, processing methods, and the like may be implemented by hardware through, for example, design using an integrated circuit. Further, the invention can also be implemented by program code of software that implements the functions of the embodiments. In this case, a storage medium recording the program code is provided to a computer, and a processor in the computer reads the program code stored in the storage medium. In this case, the program code itself read from the storage medium implements the functions of the above embodiments, and the program code itself and the storage medium storing the program code constitute the invention. Examples of the storage medium for supplying such program code include a flexible disk, a CD-ROM, a DVD-ROM, a hard disk, a solid state drive (SSD), an optical disk, a magneto-optical disk, a CD-R, a magnetic tape, a nonvolatile memory card, and a ROM.
The program code that implements the functions described in the present embodiment can be implemented in a wide range of programming or scripting languages such as assembler, C/C++, Perl, shell script, PHP, Python, and Java (registered trademark).
Further, the program code of the software that implements the functions of the embodiments may be distributed via a network to be stored in a storage unit such as a hard disk or a memory of a computer or a storage medium such as a CD-RW or a CD-R, and a processor in the computer may read and execute the program code stored in the storage unit or the storage medium.
In the embodiments described above, control lines and information lines considered necessary for description are shown, and not all control lines and information lines on a product are necessarily shown. In practice, almost all configurations may be considered to be interconnected.
Number | Date | Country | Kind
--- | --- | --- | ---
2020-097112 | Jun 2020 | JP | national
Filing Document | Filing Date | Country | Kind
--- | --- | --- | ---
PCT/JP2021/009063 | 3/8/2021 | WO |