The present disclosure relates to a non-transitory computer-readable recording medium storing an information processing program and the like.
A list that pairs a question sentence frequently asked of a company with a response sentence answering the question sentence has been disclosed, and a user can obtain a response sentence to a question sentence by consulting such a list. This list is called frequently asked questions (FAQ).
As a conventional technique using FAQ, there is a technique that, when a question sentence is accepted from a user, searches the FAQ for a question sentence whose features are similar to those of the accepted question sentence and notifies the user of the response sentence to the retrieved question sentence.
The company is responsible for the response sentences of the FAQ, and it is unacceptable to leave confirmation of the response sentences to artificial intelligence (AI). Therefore, the work of examining the wide variety of question sentences that may be transmitted from users and appropriate response sentences to those question sentences, and of editing the FAQ, has to be performed, and such work is performed by an expert having specialized knowledge.
Examples of the related art include: [Patent Document 1] Japanese Laid-open Patent Publication No. 2002-41573; and [Patent Document 2] Japanese Laid-open Patent Publication No. 2005-196296.
According to an aspect of the embodiments, there is provided a non-transitory computer-readable recording medium storing an information processing program for causing a computer to perform processing including: executing preprocessing processing that includes calculating vectors for a plurality of subtexts of text information included in a plurality of pieces of history information in which information on a plurality of question sentences and a plurality of response sentences is recorded; executing training processing that includes training a training model based on training data that defines relationships between the vectors of some subtexts and the vectors of other subtexts among the plurality of subtexts; and executing generation processing that includes calculating, when accepting a new question sentence, the vectors of the subtexts by inputting the vectors of the new question sentence to the training model, and generating a response that corresponds to the new question sentence, based on the calculated vectors.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention.
However, the above-described conventional technique has a disadvantage in that the FAQ may not be edited efficiently.
Creating an FAQ depends largely on the specialized knowledge of experts, and a worker who does not have specialized knowledge is not able to create an FAQ, so it takes time until the FAQ is created. In recent years, since diverse new services are provided every day, it is desired to create FAQs efficiently.
In one aspect, an object of the present invention is to provide an information processing program, an information processing method, and an information processing device capable of efficiently editing FAQ.
Hereinafter, embodiments of an information processing program, an information processing method, and an information processing device disclosed in the present application will be described in detail with reference to the drawings. Note that these embodiments do not limit the present invention.
The information processing device generates a fault report table 40, based on the troubleshooting history table 30. A plurality of fault reports is registered in the fault report table 40. In the example illustrated in
The fault report 40a includes a subtext 41a relating to question content, a subtext 41b relating to a common phenomenon, a subtext 41c relating to a specific phenomenon, a subtext 41d relating to a cause, and a subtext 41e relating to coping.
The question content indicates the content of the question sentence. For example, the text included in the area 31a of the troubleshooting history table 30 forms the subtext 41a of the question content.
The common phenomenon indicates a phenomenon of a fault common to a plurality of question sentences. For example, the text included in the area 32a of the troubleshooting history 30a forms the subtext 41b of the common phenomenon. The information processing device compares the text of the question sentence with the texts of the question sentences of other fault reports and specifies the text that represents the common phenomenon.
The specific phenomenon indicates a phenomenon of a fault specific to the question sentence of interest. For example, the text included in the area 32b of the troubleshooting history 30a forms the subtext 41c of the specific phenomenon. The information processing device compares the text of the question sentence with the texts of the question sentences of other fault reports and specifies the text that represents the specific phenomenon.
The cause indicates a cause of occurrence of the fault. For example, the text included in an area 32c of the troubleshooting history 30a forms the subtext 41d of the cause.
The coping indicates a coping method for the fault. For example, the text included in an area 32d of the troubleshooting history 30a forms the subtext 41e of the coping.
The information processing device separately calculates vectors of the respective subtexts 41a to 41e included in the fault report 40a. The information processing device calculates a vector Vq1 of the question content from the subtext 41a. The information processing device calculates a vector V1-1 of the common phenomenon from the subtext 41b. The information processing device calculates a vector V1-2 of the specific phenomenon from the subtext 41c. The information processing device calculates a vector V1-3 of the cause from the subtext 41d. The information processing device calculates a vector V1-4 of the coping from the subtext 41e. A specific example in which the information processing device calculates a vector from text information such as a subtext will be described later.
The information processing device also calculates vectors of the subtexts of the question content, the common phenomenon, the specific phenomenon, the cause, and the coping for each of the plurality of fault reports included in the fault report table 40. In the following description, a vector calculated from the subtext of the question content will be expressed as a “question content vector”. A vector calculated from the subtext of the common phenomenon will be expressed as a “common phenomenon vector”. A vector calculated from the subtext of the specific phenomenon will be expressed as a “specific phenomenon vector”. A vector calculated from the subtext of the cause will be expressed as a “cause vector”. A vector calculated from the subtext of the coping will be expressed as a “coping vector”.
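As a reference point for the sketches that follow, a fault report with its five subtexts and their vectors could be held in a small structure such as the following Python sketch. The class and field names are hypothetical illustrations and are not taken from the embodiment.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FaultReport:
    """One fault report built from a troubleshooting history (field names are illustrative)."""
    report_id: str
    question_content: str      # subtext 41a
    common_phenomenon: str     # subtext 41b
    specific_phenomenon: str   # subtext 41c
    cause: str                 # subtext 41d
    coping: str                # subtext 41e
    # One vector per subtext; question_vec corresponds to Vq1 in the description above.
    question_vec: List[float] = field(default_factory=list)
    common_vec: List[float] = field(default_factory=list)
    specific_vec: List[float] = field(default_factory=list)
    cause_vec: List[float] = field(default_factory=list)
    coping_vec: List[float] = field(default_factory=list)
```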
The information processing device executes training of the training model 70, using the training table 65. The training model 70 corresponds to a convolutional neural network (CNN), a recurrent neural network (RNN), or the like. In the training model 70 of the present embodiment, it is assumed that the common phenomenon vector, the specific phenomenon vector, the cause vector, and the coping vector are separately output from different nodes in an output layer.
The information processing device executes training by back propagation such that the output when the question content vector is input to the training model 70 approaches the common phenomenon vector, the specific phenomenon vector, the cause vector, and the coping vector. The information processing device adjusts parameters of the training model 70 (executes machine learning) by repeatedly executing the above processing, based on the relationship between the “question content vector” and the “common phenomenon vector, specific phenomenon vector, cause vector, and coping vector”.
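The embodiment assumes a CNN or an RNN for the training model 70; as a minimal sketch of training a model whose output layer has separate nodes for the four target vectors, the following Python (PyTorch) example uses a plain feed-forward network and a mean squared error objective. The network shape, vector dimensionality, optimizer, and loss are assumptions made only for illustration, not the embodiment's actual configuration.

```python
import torch
import torch.nn as nn

DIM = 128  # assumed vector dimensionality (not specified in the embodiment)

class SubtextModel(nn.Module):
    """Stand-in for the training model 70: one input (question content vector),
    four separate output nodes (common phenomenon, specific phenomenon, cause, coping)."""
    def __init__(self, dim: int = DIM):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(dim, 256), nn.ReLU())
        self.common = nn.Linear(256, dim)
        self.specific = nn.Linear(256, dim)
        self.cause = nn.Linear(256, dim)
        self.coping = nn.Linear(256, dim)

    def forward(self, question_vec):
        h = self.shared(question_vec)
        return self.common(h), self.specific(h), self.cause(h), self.coping(h)

def train_step(model, optimizer, question_vec, targets):
    """One back-propagation step: push each output toward its target vector."""
    optimizer.zero_grad()
    outputs = model(question_vec)
    loss = sum(nn.functional.mse_loss(o, t) for o, t in zip(outputs, targets))
    loss.backward()
    optimizer.step()
    return loss.item()

# Illustrative usage with random data in place of the training table 65.
model = SubtextModel()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
question_vec = torch.randn(1, DIM)
targets = tuple(torch.randn(1, DIM) for _ in range(4))
print(train_step(model, optimizer, question_vec, targets))
```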
The information processing device separately compares the common phenomenon vector v10-1 and the specific phenomenon vector v10-2 with the common phenomenon vector and the specific phenomenon vector of each fault report stored in the fault report table 40. As a result of the comparison, the information processing device specifies a fault report 40-1 whose common phenomenon vector and specific phenomenon vector are similar to them.
For example, a fault report having a common phenomenon vector v40-11 and a specific phenomenon vector v40-12 similar to the common phenomenon vector v10-1 and the specific phenomenon vector v10-2 is assumed as the fault report 40-1. In these circumstances, the information processing device generates a response sentence 20 in which a cause α1 and coping α2 set in the fault report 40-1 are set and transmits the generated response sentence 20 to the terminal device of the customer.
The information processing device separately compares the common phenomenon vector v11-1 with the common phenomenon vector of each fault report stored in the fault report table 40. As a result of the comparison, the information processing device specifies fault reports 40-2 and 40-3 having similar common phenomenon vectors. In this manner, when a plurality of fault reports has been specified, the information processing device generates a response sentence 21 and notifies the terminal device of the customer of the generated response sentence 21. The response sentence 21 includes a specific phenomenon 21a of the fault report 40-2 and a specific phenomenon 21b of the fault report 40-3.
The customer operates the terminal device to check the response sentence 21 and selects, from among the specific phenomena 21a and 21b, the specific phenomenon similar to the phenomenon occurring in the service, to generate a question sentence 12. In the example illustrated in
When accepting the question sentence 12, the information processing device specifies the fault report 40-3 corresponding to the specific phenomenon 21b included in the question sentence 12. The information processing device generates a response sentence 22 in which a cause γ1 and coping γ2 set in the fault report 40-3 are set and transmits the generated response sentence 22 to the terminal device of the customer.
As described above, the information processing device acquires a plurality of subtexts of the text data of the question content and the text data of the response content included in the fault report table 40, generates the training table 65 using the vector of each subtext, and executes training of the training model 70. When accepting a question sentence from a customer, the information processing device uses the vectors calculated by inputting the vectors of the question sentence to the training model 70 to specify a fault report having relevant vectors and generates a response sentence. This may make it possible to create the FAQ efficiently.
Next, a system according to the present embodiment will be described.
The terminal devices 5a to 5c are terminal devices used by customers. In the following description, the terminal devices 5a to 5c will be collectively expressed as terminal devices 5. The customer operates the terminal device 5 to generate data of the question sentence and transmits the generated data to the information processing device 100. When receiving the data of the response sentence from the information processing device, the terminal device 5 displays information on the received response sentence.
The information processing device 100 generates the training model 70 by executing the processing described with reference to
The communication unit 110 is coupled to an external device or the like in a wired or wireless manner and transmits and receives information to and from the terminal devices 5, the external device, or the like. For example, the communication unit 110 is implemented by a network interface card (NIC) or the like.
The input unit 120 is an input device that inputs various types of information to the information processing device 100. The input unit 120 corresponds to a keyboard, a mouse, a touch panel, or the like.
The display unit 130 is a display device that displays information output from the control unit 150. The display unit 130 corresponds to a liquid crystal display, an organic electro luminescence (EL) display, a touch panel, or the like.
The storage unit 140 includes a troubleshooting history table 30, a fault report table 40, an encoded file table 50, dictionary information D1, a vector table T1, an inverted index table In1, a training table 65, and a training model 70. The storage unit 140 is implemented by, for example, a semiconductor memory element such as a random access memory (RAM) or a flash memory, or a storage device such as a hard disk or an optical disc.
The troubleshooting history table 30 is a table in which an administrator or the like registers a plurality of troubleshooting histories performed in the past. The troubleshooting history table 30 corresponds to the troubleshooting history table 30 described with reference to
The fault report table 40 includes a plurality of fault reports generated based on the troubleshooting history table 30. The fault report table 40 corresponds to the fault report table 40 described with reference to
In the encoded file table 50, information obtained by encoding each fault report included in the fault report table 40 is registered.
The identification information is information that uniquely identifies the fault report that has been subjected to encoding. The compression coding sequence indicates a fault report encoded in units of words. The compression coding sequence includes a compression coding sequence obtained by encoding the subtext of the question content, a compression coding sequence obtained by encoding the subtext of the common phenomenon, and a compression coding sequence obtained by encoding the subtext of the specific phenomenon. In addition, the compression coding sequence includes a compression coding sequence obtained by encoding the subtext of the cause and a compression coding sequence obtained by encoding the subtext of the coping.
The dictionary information D1 is dictionary information that defines a compression code corresponding to a word.
For the Poincare embeddings, for example, the technique described in the non-patent document "Valentin Khrulkov et al., "Hyperbolic Image Embeddings", Cornell University, Apr. 3, 2019" or the like can be used. In the Poincare embeddings, a vector is allocated according to the embedded position in a Poincare space, and more similar pieces of information are embedded at closer positions. The information processing device 100 embeds each static code in the Poincare space in advance and calculates the vector for the static code beforehand.
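Embedding in the Poincare space means placing points inside the unit ball, where similarity is governed by the hyperbolic distance. The following sketch shows only that standard distance function (it is not code from the cited document) so that the property "more similar items lie at closer positions" can be checked numerically; the sample points are arbitrary.

```python
import numpy as np

def poincare_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Hyperbolic distance between two points inside the unit (Poincare) ball."""
    uu = np.dot(u, u)
    vv = np.dot(v, v)
    duv = np.dot(u - v, u - v)
    return float(np.arccosh(1.0 + 2.0 * duv / ((1.0 - uu) * (1.0 - vv))))

# Points embedded closer in the Poincare ball represent more similar static codes.
a = np.array([0.1, 0.2])
b = np.array([0.12, 0.22])
c = np.array([-0.7, 0.5])
print(poincare_distance(a, b) < poincare_distance(a, c))  # True
```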
In the vector table T1, each of vectors calculated from the compression coding sequence of each subtext is registered.
The question content vector table T1-1 associates the compression coding sequence of the subtext of the question content with the vector. The common phenomenon vector table T1-2 associates the compression coding sequence of the subtext of the common phenomenon with the vector. The specific phenomenon vector table T1-3 associates the compression coding sequence of the subtext of the specific phenomenon with the vector. The cause vector table T1-4 associates the compression coding sequence of the subtext of the cause with the vector. The coping vector table T1-5 associates the compression coding sequence of the subtext of the coping with the vector. Illustration of each vector table is omitted.
The inverted index table In1 defines an offset (the distance from the beginning of the encoded file table 50) of the compression coding sequence of each subtext.
The question content inverted index In1-1 defines the compression coding sequence of the subtext of the question content and the offset. The common phenomenon inverted index In1-2 defines the compression coding sequence of the subtext of the common phenomenon and the offset. The specific phenomenon inverted index In1-3 defines the compression coding sequence of the subtext of the specific phenomenon and the offset. The cause inverted index In1-4 defines the compression coding sequence of the subtext of the cause and the offset. The coping inverted index In1-5 defines the compression coding sequence of the subtext of the coping and the offset.
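For illustration, an offset-based inverted index of this kind could be built as in the following sketch, which records, for each kind of subtext, the distance of its compression coding sequence from the beginning of a simplified encoded file. The data layout, function name, and byte representation are assumptions made for the example.

```python
from typing import Dict, List, Tuple

def build_inverted_index(encoded_file: List[Tuple[str, bytes]]) -> Dict[str, Dict[bytes, int]]:
    """Map each subtext's compression coding sequence to its byte offset
    measured from the beginning of the encoded file (illustrative layout)."""
    index: Dict[str, Dict[bytes, int]] = {}
    offset = 0
    for subtext_kind, coding_sequence in encoded_file:
        index.setdefault(subtext_kind, {})[coding_sequence] = offset
        offset += len(coding_sequence)
    return index

encoded_file = [
    ("question_content", b"\x01\x02\x03"),
    ("common_phenomenon", b"\x04\x05"),
    ("specific_phenomenon", b"\x06"),
]
index = build_inverted_index(encoded_file)
print(index["common_phenomenon"][b"\x04\x05"])  # 3
```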
The training table 65 stores data for training the training model 70. The description of the training table 65 corresponds to the description of the training table 65 made with reference to
The training model 70 corresponds to a CNN, an RNN, or the like. In the training model 70 of the present embodiment, it is assumed that the common phenomenon vector, the specific phenomenon vector, the cause vector, and the coping vector are separately output from different nodes in an output layer.
The preprocessing unit 151 executes processing of generating the fault report table 40, processing of calculating the vector, and processing of generating the training table 65.
Processing in which the preprocessing unit 151 generates the fault report table 40 will be described. This processing corresponds to the processing described with reference to
The preprocessing unit 151 extracts the text in the area in which the data of the question sentence is recorded, as the subtext of the question content. The preprocessing unit 151 compares the texts of the question sentences of the respective troubleshooting histories with one another, and specifies and extracts the subtext of the common phenomenon and the subtext of the specific phenomenon.
The preprocessing unit 151 extracts the subtext of the cause and the subtext of the coping from the text in the area in which the data of the response sentence is recorded.
Note that the administrator may refer to the troubleshooting histories and operate the input unit 120 to designate the subtext of the question content, the subtext of the common phenomenon, the subtext of the specific phenomenon, the subtext of the cause, and the subtext of the coping. The preprocessing unit 151 extracts the subtext of the question content, the subtext of the common phenomenon, the subtext of the specific phenomenon, the subtext of the cause, and the subtext of the coping, based on the designated information.
Processing in which the preprocessing unit 151 calculates the vector will be described. The preprocessing unit 151 extracts the subtext of the question content from the fault report table 40. The preprocessing unit 151 executes morphological analysis on the subtext of the question content and divides the subtext into a plurality of words. By comparing each divided word with the dictionary information D1 to allocate compression codes to the words, the preprocessing unit 151 generates the compression coding sequence of the question content. The preprocessing unit 151 registers the compression coding sequence of the question content in the encoded file table 50.
In addition, by comparing each divided word with the dictionary information D1 to allocate the vector of each word (compression code) and integrating the vectors of the words included in the subtext of the question content, the preprocessing unit 151 calculates the question content vector. The preprocessing unit 151 registers the compression coding sequence of the question content and the question content vector in the question content vector table T1-1 in association with each other. The preprocessing unit 151 registers the question content vector and the offset in the question content inverted index In1-1 in association with each other.
The preprocessing unit 151 also similarly generates the compression code of the common phenomenon and the common phenomenon vector for the subtext of the common phenomenon. The preprocessing unit 151 registers the compression coding sequence of the common phenomenon in the encoded file table 50. The preprocessing unit 151 registers the compression coding sequence of the common phenomenon and the common phenomenon vector in the common phenomenon vector table T1-2 in association with each other. The preprocessing unit 151 registers the common phenomenon vector and the offset in the common phenomenon inverted index In1-2 in association with each other.
The preprocessing unit 151 also similarly generates the compression code of the specific phenomenon and the specific phenomenon vector for the subtext of the specific phenomenon. The preprocessing unit 151 registers the compression coding sequence of the specific phenomenon in the encoded file table 50. The preprocessing unit 151 registers the compression coding sequence of the specific phenomenon and the specific phenomenon vector in the specific phenomenon vector table T1-3 in association with each other. The preprocessing unit 151 registers the specific phenomenon vector and the offset in the specific phenomenon inverted index In1-3 in association with each other.
The preprocessing unit 151 also similarly generates the compression code of the cause and the cause vector for the subtext of the cause. The preprocessing unit 151 registers the compression coding sequence of the cause in the encoded file table 50. The preprocessing unit 151 registers the compression coding sequence of the cause and the cause vector in the cause vector table T1-4 in association with each other. The preprocessing unit 151 registers the cause vector and the offset in the cause inverted index In1-4 in association with each other.
The preprocessing unit 151 also similarly generates the compression code of the coping and the coping vector for the subtext of the coping. The preprocessing unit 151 registers the compression coding sequence of the coping in the encoded file table 50. The preprocessing unit 151 registers the compression coding sequence of the coping and the coping vector in the coping vector table T1-5 in association with each other. The preprocessing unit 151 registers the coping vector and the offset in the coping inverted index In1-5 in association with each other.
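A minimal sketch of the per-subtext encoding and vector calculation described above is shown below. Whitespace splitting stands in for morphological analysis, and the word vectors are integrated by summation; both are simplifying assumptions rather than the embodiment's actual procedure, and the dictionary entries are invented for the example.

```python
import numpy as np
from typing import Dict, List, Tuple

def encode_and_vectorize(
    subtext: str,
    dictionary: Dict[str, Tuple[int, np.ndarray]],
) -> Tuple[List[int], np.ndarray]:
    """Split a subtext into words, allocate a compression code and a word vector
    to each word from the dictionary, and integrate (here: sum) the word vectors."""
    # A real implementation would use morphological analysis; whitespace splitting
    # is only a stand-in to keep the sketch short.
    words = subtext.split()
    codes = [dictionary[w][0] for w in words if w in dictionary]
    vectors = [dictionary[w][1] for w in words if w in dictionary]
    subtext_vector = np.sum(vectors, axis=0) if vectors else np.zeros(4)
    return codes, subtext_vector

dictionary = {
    "server": (0x11, np.array([0.1, 0.0, 0.2, 0.0])),
    "error": (0x12, np.array([0.0, 0.3, 0.0, 0.1])),
}
codes, vec = encode_and_vectorize("server error", dictionary)
print(codes, vec)  # [17, 18] [0.1 0.3 0.2 0.1]
```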
Meanwhile, in the above description, the relationships between each subtext of the fault report included in the fault report table 40 and the vectors are defined using the encoded file table 50, the vector table T1, and the inverted index table In1, but the definition is not limited to this. For example, the information processing device 100 may directly associate each subtext included in the fault report with its vector and set the associated subtext and vector in the fault report table 40.
Processing in which the preprocessing unit 151 generates the training table 65 will be described. For each fault report, the preprocessing unit 151 registers the relationship between the "question content vector" and the "common phenomenon vector, specific phenomenon vector, cause vector, and coping vector" in the training table 65.
The preprocessing unit 151 generates the training table 65 by repeatedly executing the above processing for each fault report.
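A sketch of how the training table could be assembled is shown below; the dictionary keys simply mirror the five vectors described above and are hypothetical names, not the embodiment's actual fields.

```python
from typing import Dict, List, Tuple

Vector = List[float]

def build_training_table(fault_reports: List[Dict[str, Vector]]) -> List[Tuple[Vector, Tuple[Vector, ...]]]:
    """One training record per fault report: the question content vector is the
    input and the four remaining subtext vectors are the targets."""
    table = []
    for r in fault_reports:
        table.append((r["question_vec"],
                      (r["common_vec"], r["specific_vec"], r["cause_vec"], r["coping_vec"])))
    return table

reports = [{
    "question_vec": [0.1, 0.2],
    "common_vec": [0.3, 0.1],
    "specific_vec": [0.0, 0.4],
    "cause_vec": [0.2, 0.2],
    "coping_vec": [0.5, 0.0],
}]
print(build_training_table(reports))
```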
The training unit 152 executes training of the training model 70, using the training table 65. The training unit 152 executes training by back propagation such that the output when the question content vector is input to the training model 70 approaches the common phenomenon vector, the specific phenomenon vector, the cause vector, and the coping vector. The training unit 152 adjusts parameters of the training model 70 (executes machine learning) by repeatedly executing the above processing, based on the relationship between the “question content vector” and the “common phenomenon vector, specific phenomenon vector, cause vector, and coping vector”.
The generation unit 153 is a processing unit that generates data of the response sentence corresponding to the question sentence and transmits the generated data to the terminal device 5 when accepting the data of the question sentence from the terminal device 5. Processing of the generation unit 153 corresponds to the processing described with reference to
An example of processing of the generation unit 153 will be described with reference to
The generation unit 153 calculates the common phenomenon vector v10-1 and the specific phenomenon vector v10-2 by inputting the vectors of the question sentence to the training model 70. Note that the generation unit 153 does not use the cause vector and the coping vector that can be output from the training model 70.
The generation unit 153 separately compares the common phenomenon vector v10-1 and the specific phenomenon vector v10-2 with the common phenomenon vector and the specific phenomenon vector of each fault report stored in the fault report table 40. As a result of the comparison, the generation unit 153 specifies the fault report 40-1 whose common phenomenon vector and specific phenomenon vector are similar to them. For example, when the cos similarity between the compared vectors is equal to or higher than a threshold value, the generation unit 153 determines that the compared vectors are similar to each other.
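A minimal sketch of this similarity search is shown below, assuming cos similarity with an arbitrary threshold of 0.8 and a simplified fault report representation; the names and sample values are illustrative only.

```python
import numpy as np

def cos_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_similar_reports(common_vec, specific_vec, fault_reports, threshold=0.8):
    """Return fault reports whose common and specific phenomenon vectors are both
    similar (cos similarity at or above the threshold) to the predicted vectors."""
    hits = []
    for report in fault_reports:
        if (cos_similarity(common_vec, report["common_vec"]) >= threshold and
                cos_similarity(specific_vec, report["specific_vec"]) >= threshold):
            hits.append(report)
    return hits

reports = [{"id": "40-1",
            "common_vec": np.array([0.9, 0.1]),
            "specific_vec": np.array([0.2, 0.8]),
            "cause": "cause α1", "coping": "coping α2"}]
hits = find_similar_reports(np.array([0.88, 0.12]), np.array([0.25, 0.75]), reports)
print([h["id"] for h in hits])  # ['40-1']
```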
For example, the fault report having the common phenomenon vector v40-11 and the specific phenomenon vector v40-12 similar to the common phenomenon vector v10-1 and the specific phenomenon vector v10-2 is assumed as the fault report 40-1. In these circumstances, the generation unit 153 generates the response sentence 20 in which the cause α1 and the coping α2 set in the fault report 40-1 are set and transmits the generated response sentence 20 to the terminal device 5 that is the transmission source of the question sentence 10.
An example of processing of the generation unit 153 will be described with reference to
The generation unit 153 calculates the common phenomenon vector v11-1 by inputting the vectors of the question sentence 11 to the training model 70. Note that the generation unit 153 does not specify the specific phenomenon vector when the value of the specific phenomenon vector output from the training model 70 does not fall within a predetermined range. In addition, the generation unit 153 does not use the cause vector and the coping vector that can be output from the training model 70.
The generation unit 153 separately compares the common phenomenon vector v11-1 with the common phenomenon vector of each fault report stored in the fault report table 40. As a result of the comparison, the generation unit 153 specifies the fault reports 40-2 and 40-3 having similar common phenomenon vectors. In this manner, when a plurality of fault reports has been specified, the generation unit 153 generates the response sentence 21 and notifies the terminal device 5 that is the transmission source of the question sentence 11 of the generated response sentence 21. The response sentence 21 includes the specific phenomenon 21a of the fault report 40-2 and the specific phenomenon 21b of the fault report 40-3.
The customer operates the terminal device 5 to check the response sentence 21 and selects, from among the specific phenomena 21a and 21b, the specific phenomenon similar to the phenomenon occurring in the service, to generate the question sentence 12. In the example illustrated in
When accepting the question sentence 12, the generation unit 153 specifies the fault report 40-3 corresponding to the specific phenomenon 21b included in the question sentence 12. The generation unit 153 generates the response sentence 22 in which the cause γ1 and the coping γ2 set in the fault report 40-3 are set and transmits the generated response sentence 22 to the terminal device 5 that is the transmission source of the question sentence 12.
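The two-step exchange described above could look like the following sketch, in which a first response lists the candidate specific phenomena and a follow-up question selects one of them. The function names, dictionary keys, response format, and sample data are illustrative assumptions.

```python
def respond(common_hits):
    """Illustrative flow: when several fault reports match only on the common
    phenomenon, list their specific phenomena and let the customer narrow down."""
    if len(common_hits) == 1:
        report = common_hits[0]
        return f"Cause: {report['cause']} / Coping: {report['coping']}"
    # Multiple candidates: ask the customer which specific phenomenon applies.
    choices = {r["specific_phenomenon"]: r for r in common_hits}
    return "Which applies? " + ", ".join(choices)

def follow_up(common_hits, chosen_specific):
    """After the customer selects a specific phenomenon, return the final answer."""
    for r in common_hits:
        if r["specific_phenomenon"] == chosen_specific:
            return f"Cause: {r['cause']} / Coping: {r['coping']}"
    return "No matching fault report."

reports = [
    {"specific_phenomenon": "specific phenomenon 21a", "cause": "cause β1", "coping": "coping β2"},
    {"specific_phenomenon": "specific phenomenon 21b", "cause": "cause γ1", "coping": "coping γ2"},
]
print(respond(reports))                                 # lists 21a and 21b
print(follow_up(reports, "specific phenomenon 21b"))    # cause γ1 / coping γ2
```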
Next, an example of a processing procedure of the information processing device 100 according to the present embodiment will be described.
The preprocessing unit 151 generates the question sentence vector, the common phenomenon vector, the specific phenomenon vector, the cause vector, and the coping vector, based on the dictionary information D1 (step S102).
The preprocessing unit 151 generates the training table 65 (step S103). The training unit 152 of the information processing device 100 generates the training model 70, based on the training table 65 (step S104).
The generation unit 153 calculates vectors of the question sentence (step S202). The generation unit 153 inputs the vectors of the question sentence to the training model 70 (step S203). When the generation unit 153 has calculated the common phenomenon vector and the specific phenomenon vector (step S204, Yes), the generation unit 153 proceeds to step S205. On the other hand, when the generation unit 153 has calculated only the common phenomenon vector (step S204, No), the generation unit 153 proceeds to step S206.
The processing in step S205 will be described. The generation unit 153 detects a fault report having a similar common phenomenon vector and a similar specific phenomenon vector, based on the calculated common phenomenon vector and specific phenomenon vector (step S205), and proceeds to step S210.
The processing in step S206 will be described. The generation unit 153 detects a plurality of fault reports having similar common phenomenon vectors, based on the calculated common phenomenon vector (step S206). The generation unit 153 sets each of the specific phenomena included in the detected fault reports in a response draft and transmits the response draft to the terminal device 5 (step S207).
The generation unit 153 receives the data of a question sentence from the terminal device 5 (step S208). The generation unit 153 detects the fault report corresponding to the specific phenomenon included in the question sentence (step S209). The generation unit 153 generates a response draft, based on the cause and coping of the detected fault report, and transmits the generated response draft to the terminal device 5 (step S210).
Next, an effect of the information processing device 100 according to the present embodiment will be described. The information processing device 100 calculates the vector of each subtext of the text data of the question content and the text data of the response content included in the fault report table 40 to generate the training table 65 and executes training of the training model 70. In addition, when accepting a question sentence from a customer, the information processing device 100 uses the vectors calculated by inputting the vectors of the question sentence to the training model 70 to specify a fault report having relevant vectors and generates a response sentence. This may make it possible to create the FAQ efficiently.
The information processing device 100 generates the training table 65 from the fault report, based on each of the vectors of the subtext of the question content, the subtext of the common phenomenon, the subtext of the specific phenomenon, the subtext of the cause, and the subtext of the coping, and executes training of the training model 70. By using this training model 70, the vectors of the common phenomenon, the specific phenomenon, the cause, and the coping can be calculated from the vectors of the question content.
The information processing device 100 inputs the vectors of the question sentence to the training model to calculate the common phenomenon vector and the specific phenomenon vector, detects the fault report whose common phenomenon vector and specific phenomenon vector are similar to the calculated vectors, and generates the response sentence, based on the detected fault report. This may make it possible to efficiently generate a response sentence corresponding to the question sentence.
Note that an example in which the fault reports are created from the troubleshooting histories regarding a software product has been indicated as an embodiment. However, history information corresponding to the fault reports may be generated from troubleshooting histories relating to a hardware product such as a keyboard, a printer, or a hard disk, to a medicine, or the like, and processing may be performed in a similar manner to the processing for the fault reports. For example, an operation manual, a package insert, or the like indicates a standard use method, but has no description about a trouble caused by a plurality of errors in operations (dosages), an operating environment (complication), or the like. Therefore, a subtext relating to a trouble caused by a plurality of errors in operations, an operating environment, or the like may be extracted from the history information, vectors may be calculated in a similar manner to the above-described subtexts of the common phenomenon, the specific phenomenon, the cause, and the coping, and training of a training model may be executed.
Next, an example of a hardware configuration of a computer that implements functions similar to the functions of the information processing device 100 indicated in the above embodiments will be described.
The hard disk device 207 includes a preprocessing program 207a, a training program 207b, and a generation program 207c. In addition, the CPU 201 reads each of the programs 207a to 207c and loads the read programs 207a to 207c into the RAM 206.
The preprocessing program 207a functions as a preprocessing process 206a. The training program 207b functions as a training process 206b. The generation program 207c functions as a generation process 206c.
Processing of the preprocessing process 206a corresponds to the processing of the preprocessing unit 151. Processing of the training process 206b corresponds to the processing of the training unit 152. Processing of the generation process 206c corresponds to the processing of the generation unit 153.
Note that each of the programs 207a to 207c does not necessarily have to be previously stored in the hard disk device 207. For example, each of the programs may be stored in a “portable physical medium” to be inserted into the computer 200, such as a flexible disk (FD), a compact disc read only memory (CD-ROM), a digital versatile disc (DVD), a magneto-optical disk, or an integrated circuit (IC) card. Then, the computer 200 may read and execute each of the programs 207a to 207c.
All examples and conditional language provided herein are intended for the pedagogical purposes of aiding the reader in understanding the invention and the concepts contributed by the inventor to further the art, and are not to be construed as limitations to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although one or more embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
This application is a continuation application of International Application PCT/JP2021/016551 filed on Apr. 23, 2021 and designated the U.S., the entire contents of which are incorporated herein by reference.
Related application data: Parent — PCT/JP2021/016551, Apr. 2021, US; Child — 18479910, US.