The present disclosure relates to an image processing apparatus and an image processing method.
Japanese Laid-open Patent Publication No. 2004-24559 discloses a technology of extracting a display image from the periphery of an image instructed by a user, using image quality and operation information as indices. This may eliminate the bother of repeatedly performing capturing so as to obtain a high-quality image, because a freeze manipulation in an ultrasonograph deteriorates image quality due to blurring, unsharpness, and the like that are attributed to posture changes of the ultrasound probe caused by the diagnostician holding the probe by hand, by respiration, by changes in body posture, and the like. Specifically, after a plurality of chronological ultrasound images are stored, a freeze image is set according to an instruction of the user, a plurality of candidate images temporally close to the freeze image are selected, and a display image is selected using reference information, such as image quality and an operation that accompanies the plurality of candidate images, as feature data (an index).
An image processing apparatus according to one aspect of the present disclosure includes a processor comprising hardware, wherein the processor is configured to execute: analyzing a characteristic of a pathologic region included in individual endoscopic images of an endoscopic image group arranged in chronological order; setting, based on the characteristic of the pathologic region, an extraction condition for extracting one or more endoscopic images appropriate for diagnosis from the endoscopic image group; and extracting, based on the extraction condition, one or more endoscopic images having image quality appropriate for diagnosis from the endoscopic image group, wherein, when performing the analysis of the characteristic of the pathologic region, the processor acquires pathologic region information representing coordinate information of a pathologic region in each endoscopic image of the endoscopic image group, the pathologic region information being generated by detecting a pathologic region by a pathologic region detection device from each endoscopic image of the endoscopic image group, acquires, based on the pathologic region information, pathologic region presence information representing whether a pathologic region having an area equal to or larger than a preset predetermined value is included in each endoscopic image, calculates, based on the pathologic region information, pathology characteristic information representing a characteristic of the pathologic region, and classifies the pathologic region into a preset class of malignant degree.
The above and other features, advantages and technical and industrial significance of this disclosure will be better understood by reading the following detailed description of presently preferred embodiments of the disclosure, when considered in connection with the accompanying drawings.
Hereinafter, an image processing apparatus, an image processing method, and a program according to embodiments of the present disclosure will be described with reference to the drawings. In addition, the present disclosure is not limited by these embodiments. In addition, in the description of the drawings, the same parts are assigned the same signs.
The image processing apparatus 1 illustrated in
The image acquisition unit 2 is appropriately formed according to the mode of a system including an endoscope. For example, when a portable recording medium is used for transferring an endoscopic image group (moving image data, image data) and pathologic region information from an endoscope, the image acquisition unit 2 is formed as a reader device that has the recording medium detachably attached thereto and reads the recorded endoscopic image group and pathologic region information. In addition, when a server that records an endoscopic image group captured by an endoscope and pathologic region information is used, the image acquisition unit 2 is formed by a communication device or the like that can bi-directionally communicate with the server, and acquires the endoscopic image group and the pathologic region information by performing data communication with the server. Furthermore, the image acquisition unit 2 may be formed by an interface device or the like to which an endoscopic image group and pathologic region information are input from an endoscope via a cable.
The input unit 3 is implemented by an input device such as a keyboard, a mouse, a touch panel, and various switches, for example, and outputs an input signal received according to a manipulation from the outside, to the control unit 6.
Under the control of the control unit 6, the output unit 4 outputs a diagnosis target image extracted by the calculation of the arithmetic unit 7 to an external display device or the like. In addition, the output unit 4 may be formed using a display panel such as a liquid crystal display or an organic electroluminescence (EL) display, and may display various images, including the diagnosis target image obtained by the calculation of the arithmetic unit 7.
The recording unit 5 is implemented by various IC memories such as a flash memory, a read only memory (ROM), and a random access memory (RAM), a hard disk that is incorporated or connected via a data communication terminal, or the like. Aside from the endoscopic image group acquired by the image acquisition unit 2, the recording unit 5 records programs for operating the image processing apparatus 1 and causing the image processing apparatus 1 to execute various functions, data used in the execution of the programs, and the like. For example, the recording unit 5 records an image processing program 51 for extracting one or more endoscopic images optimum for diagnosis from an endoscopic image group, various types of information used in the execution of the program, and the like.
The control unit 6 is formed using a general-purpose processor such as a central processing unit (CPU), or a dedicated processor such as various arithmetic circuits that execute specific functions such as an application specific integrated circuit (ASIC) and a field programmable gate array (FPGA). When the control unit 6 is a general-purpose processor, the control unit 6 performs instructions, data transfer, and the like to units constituting the image processing apparatus 1, by reading various programs stored in the recording unit 5, and comprehensively controls operations of the entire image processing apparatus 1. In addition, when the control unit 6 is a dedicated processor, the processor may independently execute various types of processing, or the processor and the recording unit 5 may execute various types of processing in cooperation or in combination, by using various data stored in the recording unit 5, and the like.
The arithmetic unit 7 is formed using a general-purpose processor such as a CPU, or a dedicated processor such as various arithmetic circuits that execute specific functions, such as an ASIC or an FPGA. When the arithmetic unit 7 is a general-purpose processor, the arithmetic unit 7 executes image processing of extracting an endoscopic image optimum for diagnosis from the acquired endoscopic image group arranged in chronological order, by reading the image processing program 51 from the recording unit 5. In addition, when the arithmetic unit 7 is a dedicated processor, the processor may independently execute various types of processing, or the processor and the recording unit 5 may execute image processing in cooperation or in combination, by using various data stored in the recording unit 5, and the like.
Next, a detailed configuration of the arithmetic unit 7 will be described.
The arithmetic unit 7 includes a pathologic region analysis unit 71, an extraction condition setting unit 72, and an image extraction unit 73.
The pathologic region analysis unit 71 receives input of an endoscopic image group acquired by the image acquisition unit 2 via the control unit 6 or the recording unit 5, and pathologic region information representing coordinate information of a pathologic region in each endoscopic image, and analyzes features and characteristics of a pathologic region included in individual endoscopic images. The pathologic region analysis unit 71 includes a pathologic region acquisition unit 711, a pathologic region presence information acquisition unit 712, a pathology characteristic information calculation unit 713, and a gazing operation determination unit 714.
The pathologic region acquisition unit 711 acquires an endoscopic image group acquired by the image acquisition unit 2 via the control unit 6 or the recording unit 5, and pathologic region information representing coordinate information of a pathologic region in each endoscopic image.
Based on pathologic region information of each endoscopic image, the pathologic region presence information acquisition unit 712 acquires pathologic region presence information as to whether a pathologic region having an area equal to or larger than a preset predetermined value is included.
Based on the pathologic region information, the pathology characteristic information calculation unit 713 calculates pathology characteristic information representing a characteristic of a pathologic region. In addition, the pathology characteristic information calculation unit 713 includes a size acquisition unit 7131 that acquires size information of a pathologic region based on pathologic region information when pathologic region presence information includes information representing that a pathologic region having an area equal to or larger than a preset predetermined value is included (hereinafter, referred to as a “case of present determination”).
The gazing operation determination unit 714 determines a gazing operation on a pathologic region based on the pathology characteristic information. In addition, the gazing operation determination unit 714 includes a near view capturing operation determination unit 7141 that determines that gazing and near view imaging are being performed, when pathologic region presence information represents present determination and size information in pathology characteristic information represents a preset predetermined value or more.
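By way of a non-limiting illustration, the following Python sketch shows one way the chain described above (pathologic region presence information, size information, and the near view capturing determination) might be realized. The bounding-box form of the coordinate information, the two thresholds, and all identifiers are assumptions introduced only for illustration.

```python
# Assumed form of the pathologic region information: one bounding box
# (x0, y0, x1, y1) per endoscopic image, or None when nothing was detected.
PRESENCE_AREA_THRESHOLD = 1000.0    # assumed "preset predetermined value" (pixels)
NEAR_VIEW_SIZE_THRESHOLD = 20000.0  # assumed size threshold for near view imaging

def region_area(box):
    """Area of a pathologic region given as a bounding box (an assumption)."""
    x0, y0, x1, y1 = box
    return max(0.0, x1 - x0) * max(0.0, y1 - y0)

def analyze_frame(box):
    """Return (presence, size, near_view) for a single endoscopic image."""
    if box is None:
        return False, 0.0, False
    size = region_area(box)
    presence = size >= PRESENCE_AREA_THRESHOLD                   # presence information
    near_view = presence and size >= NEAR_VIEW_SIZE_THRESHOLD    # gazing / near view imaging
    return presence, size, near_view

# Usage example on a toy image group (three frames, one detection result each).
if __name__ == "__main__":
    boxes = [(10, 10, 60, 60), None, (0, 0, 200, 200)]
    for i, b in enumerate(boxes):
        print(i, analyze_frame(b))
```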
The extraction condition setting unit 72 sets an extraction condition based on the characteristic (feature) of a pathologic region. The extraction condition setting unit 72 includes an extraction target range setting unit 721.
Based on the characteristic (feature) of the pathologic region, the extraction target range setting unit 721 sets a range between a base point and edge points decided based on the base point, as an extraction target range. In addition, the extraction target range setting unit 721 includes a base point image setting unit 7211 that sets an endoscopic image at a specific operation position as a reference image based on operation information in the characteristic (feature) of the pathologic region, and an edge point section setting unit 7212 that sets endoscopic images at specific operation positions preceding and following the base point image as edge point images, and sets a section from the base point image to the edge point images, based on operation information in the characteristic (feature) of the pathologic region.
The base point image setting unit 7211 includes an operation change point extraction unit 7211a that sets, as a base point image, an endoscopic image near a position at which a specific operation switches to another operation after the specific operation has continued for a preset predetermined section.
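The following sketch illustrates, under assumptions, how such an operation change point might be located in a per-frame sequence of operation labels; the label encoding, the minimum run length, and the function name are illustrative and are not taken from the disclosure.

```python
def find_operation_change_points(operations, target_op, min_run=10):
    """Indices at which `target_op` switches to another operation after having
    continued for at least `min_run` consecutive frames (assumed encoding:
    one operation label per endoscopic image)."""
    change_points = []
    run = 0
    for i, op in enumerate(operations):
        if op == target_op:
            run += 1
        else:
            if run >= min_run:
                # Frame i - 1 is the last frame of the continued operation; an
                # endoscopic image near this position becomes the base point image.
                change_points.append(i - 1)
            run = 0
    if run >= min_run:  # the operation continued to the end of the image group
        change_points.append(len(operations) - 1)
    return change_points
```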
The edge point section setting unit 7212 includes an operation occurrence section position setting unit 7212a that sets a section up to an image where a specific operation occurs. Furthermore, the operation occurrence section position setting unit 7212a includes a base point setting unit 7212b that sets, as edge point images, base point images preceding and following the base point image, in a section in which pathologic region presence information represents present determination.
Based on an extraction condition, the image extraction unit 73 extracts one or more endoscopic images, each having image quality appropriate for diagnosis (image quality satisfying a predetermined condition). The image extraction unit 73 includes an image quality evaluation value calculation unit 731 that calculates an evaluation value corresponding to the image quality of a pathologic region.
Next, an image processing method executed by the image processing apparatus 1 will be described.
As illustrated in
Subsequently, based on the pathologic region information, the pathology characteristic information calculation unit 713 calculates pathology characteristic information representing a characteristic of a pathologic region (Step S11). Specifically, when pathologic region presence information represents present determination, the size acquisition unit 7131 acquires size information of a pathologic region based on pathologic region information.
After that, the gazing operation determination unit 714 determines a gazing operation on a pathologic region based on the pathology characteristic information (Step S12). Specifically, the near view capturing operation determination unit 7141 determines that gazing is being performed, when pathologic region presence information represents present determination, and determines that near view imaging is being performed, when size information in pathology characteristic information is a preset predetermined value or more. After Step S12, the image processing apparatus 1 returns to a main routine in
Referring back to
In Step S2, the extraction condition setting unit 72 executes, on the endoscopic image group, extraction condition setting processing of setting, based on the characteristic (feature) of the pathologic region, an extraction target range defined by a base point and edge points decided based on the base point.
Subsequently, based on operation information in the characteristic (feature) of a pathologic region, the edge point section setting unit 7212 sets endoscopic images at specific operation positions preceding and following the base point image as edge point images, and sets a section from the base point image to the edge point images (Step S21). Specifically, the operation occurrence section position setting unit 7212a sets a section up to an image where a specific operation occurs. More specifically, the base point setting unit 7212b sets, as an edge point image, an endoscopic image at a timing at which a diagnosis operation switches after a preset specific diagnosis operation has continued, and sets a section from the base point image to the edge point image. After Step S21, the image processing apparatus 1 returns to the aforementioned main routine in
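As a hedged sketch of this section setting, the snippet below searches, before and after a given base point image, for the nearest frames at which a specific operation occurs and returns the resulting section; the per-frame operation labels and all identifiers are assumptions.

```python
def set_extraction_section(operations, base_index, edge_op):
    """Return (start, end) frame indices of the extraction target range around
    `base_index`: the nearest frames, preceding and following the base point
    image, at which the specific operation `edge_op` occurs (assumed encoding)."""
    start = 0
    for i in range(base_index - 1, -1, -1):            # preceding edge point image
        if operations[i] == edge_op:
            start = i
            break
    end = len(operations) - 1
    for i in range(base_index + 1, len(operations)):   # following edge point image
        if operations[i] == edge_op:
            end = i
            break
    return start, end
```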
Referring back to
In Step S3, the image extraction unit 73 extracts an endoscopic image having image quality satisfying a predetermined condition, based on the extraction condition. Specifically, the image quality evaluation value calculation unit 731 evaluates each endoscopic image using, as an index, at least any one of a color shift amount, sharpness, and an effective region area of a surface structure. Here, regarding the color shift amount, the image quality evaluation value calculation unit 731 calculates a representative value (an average value, etc.) of saturation information over the entire base point image, regards an endoscopic image having a smaller value than the representative value of saturation information of the base point image as an endoscopic image having a smaller color shift amount, and assigns it a higher image quality evaluation value. In addition, regarding sharpness, the image quality evaluation value calculation unit 731 regards an endoscopic image having a larger value than the sharpness information of the base point image as an endoscopic image having stronger sharpness, and assigns it a higher image quality evaluation value. In addition, the image quality evaluation value calculation unit 731 assigns a higher image quality evaluation value as the effective region area becomes larger. Subsequently, the image extraction unit 73 extracts a high-quality image by extracting, based on the calculated evaluation values, an image falling within a predetermined range in the feature data space of the image quality evaluation value.
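Purely as an illustrative sketch of these three indices, the snippet below computes a saturation-based color shift score relative to the base point image, a simple Laplacian-variance sharpness score, and an effective region area score, and combines them into one evaluation value. The specific measures, the equal weighting, and all identifiers are assumptions and are not taken from the disclosure.

```python
import numpy as np

def saturation_mean(rgb):
    """Representative value (average) of HSV-style saturation over the whole image."""
    rgb = rgb.astype(np.float32) / 255.0
    cmax, cmin = rgb.max(axis=2), rgb.min(axis=2)
    sat = np.where(cmax > 0, (cmax - cmin) / np.maximum(cmax, 1e-6), 0.0)
    return float(sat.mean())

def sharpness(gray):
    """Simple sharpness measure: variance of a Laplacian response (an assumption)."""
    lap = (-4.0 * gray[1:-1, 1:-1] + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

def quality_evaluation_value(img, base_img, effective_mask):
    """Higher value = smaller color shift than the base point image, stronger
    sharpness than the base point image, and larger effective region area."""
    gray, base_gray = img.mean(axis=2), base_img.mean(axis=2)
    color_score = saturation_mean(base_img) - saturation_mean(img)  # smaller shift -> higher
    sharp_score = sharpness(gray) - sharpness(base_gray)            # sharper -> higher
    area_score = float(effective_mask.mean())                       # larger area -> higher
    # Assumed aggregation: an unweighted sum of the three indices.
    return color_score + sharp_score + area_score
```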
Here, in JP 2004-24559 A described above, image quality or an operation described in reference information is applied as the feature data used when an image is selected from a reference range instructed by the user, and no solution is mentioned for reducing the burden on the user in instructing an extraction target range and the number of images to be extracted. For example, in an intraluminal image captured by an endoscope, the motion of the subject is large and a pathologic region frequently goes into and out of the captured range; in such a scene with large fluctuation of the subject, the user may fail to instruct a freeze timing or to set an appropriate neighborhood range of the freeze image, so a high-quality image has not always been extracted. In contrast to this, according to the first embodiment, analysis of the characteristic (feature) of a pathologic region that has been obtained as input information is performed, an extraction condition is set based on the characteristic (feature) of the pathologic region, and a high-quality image is extracted from a reference range image based on the extraction condition, whereby an endoscopic image appropriate for diagnosis can be extracted from an endoscopic image group arranged in chronological order.
Next, a first modified example of the first embodiment will be described. The first modified example of the first embodiment differs from the aforementioned first embodiment in the configurations of the pathology characteristic information calculation unit 713, the gazing operation determination unit 714, the base point image setting unit 7211, and the edge point section setting unit 7212. Hereinafter, a pathology characteristic information calculation unit, a gazing operation determination unit, a base point image setting unit, and an edge point section setting unit according to the first modified example of the first embodiment will be described. In addition, the same configurations as those in the image processing apparatus 1 according to the aforementioned first embodiment are assigned the same signs, and the description thereof will be omitted.
According to the first modified example of the first embodiment described above, analysis of the characteristic (feature) of a pathologic region that has been obtained as input information is performed, an extraction condition is set based on the characteristic (feature) of the pathologic region, and a high-quality image is extracted from a reference range image based on the extraction condition, whereby an endoscopic image appropriate for diagnosis can be extracted from an endoscopic image group arranged in chronological order.
Next, a second modified example of the first embodiment will be described. The second modified example of the first embodiment differs from the aforementioned first embodiment in the configurations of the pathology characteristic information calculation unit 713 and the gazing operation determination unit 714. Hereinafter, a pathology characteristic information calculation unit and a gazing operation determination unit according to the second modified example of the first embodiment will be described. In addition, the same configurations as those in the image processing apparatus 1 according to the aforementioned first embodiment are assigned the same signs, and the description thereof will be omitted.
According to the second modified example of the first embodiment described above, analysis of the characteristic (feature) of a pathologic region that has been obtained as input information is performed, an extraction condition is set based on the characteristic (feature) of the pathologic region, and a high-quality image is extracted from a reference range image based on the extraction condition, whereby an endoscopic image appropriate for diagnosis can be extracted from an endoscopic image group arranged in chronological order.
Next, a third modified example of the first embodiment will be described. The third modified example of the first embodiment differs from the aforementioned first embodiment in the configuration of the pathologic region analysis unit 71 and in the pathologic region characteristic analysis processing performed by the pathologic region analysis unit 71. Hereinafter, after a pathologic region analysis unit according to the third modified example of the first embodiment is described, the pathologic region characteristic analysis processing executed by the pathologic region analysis unit will be described. In addition, the same configurations as those in the image processing apparatus 1 according to the aforementioned first embodiment are assigned the same signs, and the description thereof will be omitted.
Next, the pathologic region characteristic analysis processing executed by the pathologic region analysis unit 71a will be described.
In Step S13, the manipulation operation determination unit 715 determines manipulation operation of the endoscope based on signal information of the endoscope. Specifically, the signal information of the endoscope includes image magnification ratio change information for changing a magnification ratio of an image, thumbnail acquisition information for instructing acquisition of a thumbnail (freeze image, still image), angle operation information for instructing a change of an angle, and manipulation information of other button manipulations.
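One hedged way to picture this determination is the small sketch below, which maps the per-frame signal information to a manipulation operation label; the dictionary keys and label strings are assumptions, since the actual signal interface of the endoscope is not specified here.

```python
def determine_manipulation(signal):
    """Map endoscope signal information for one frame to a manipulation label.
    The keys of `signal` are illustrative assumptions, not an actual interface."""
    if signal.get("zoom_changed"):      # image magnification ratio change information
        return "magnification_change"
    if signal.get("freeze_pressed"):    # thumbnail (freeze image, still image) acquisition
        return "thumbnail_acquisition"
    if signal.get("angle_changed"):     # angle operation information
        return "angle_operation"
    if signal.get("button_pressed"):    # other button manipulations
        return "button_manipulation"
    return "none"
```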
According to the third modified example of the first embodiment described above, analysis of the characteristic (feature) of a pathologic region that has been obtained as input information is performed, an extraction condition is set based on the characteristic (feature) of the pathologic region, and a high-quality image is extracted from a reference range image based on the extraction condition, whereby an endoscopic image appropriate for diagnosis can be extracted from an endoscopic image group arranged in chronological order.
Next, a fourth modified example of the first embodiment will be described. The fourth modified example of the first embodiment differs from the aforementioned first embodiment in the configuration of the pathologic region analysis unit 71 and in the pathologic region characteristic analysis processing. Hereinafter, after a pathologic region analysis unit according to the fourth modified example of the first embodiment is described, the pathologic region characteristic analysis processing executed by the pathologic region analysis unit will be described. In addition, the same configurations as those in the image processing apparatus 1 according to the aforementioned first embodiment are assigned the same signs, and the description thereof will be omitted.
Next, the pathologic region characteristic analysis processing executed by the pathologic region analysis unit 71b will be described.
According to the fourth modified example of the first embodiment described above, analysis of the characteristic (feature) of a pathologic region that has been obtained as input information is performed, an extraction condition is set based on the characteristic (feature) of the pathologic region, and a high-quality image is extracted from a reference range image based on the extraction condition, whereby an endoscopic image appropriate for diagnosis can be extracted from an endoscopic image group arranged in chronological order.
Next, a second embodiment will be described. The second embodiment differs from the aforementioned first embodiment in the configuration of the arithmetic unit 7 of the image processing apparatus 1 and in the processing to be executed. Hereinafter, after the configuration of an arithmetic unit according to the second embodiment is described, the processing executed by an image processing apparatus according to the second embodiment will be described. In addition, the same configurations as those in the image processing apparatus 1 according to the aforementioned first embodiment are assigned the same signs, and the description thereof will be omitted.
The pathologic region analysis unit 71c includes a pathology characteristic information calculation unit 713c in place of the pathology characteristic information calculation unit 713 according to the aforementioned first embodiment.
Based on the pathologic region information, the pathology characteristic information calculation unit 713c calculates pathology characteristic information representing a characteristic of a pathologic region. In addition, the pathology characteristic information calculation unit 713c includes a malignant degree determination unit 7134 that classifies a pathologic region according to a preset class of malignant degree.
The extraction condition setting unit 72c sets an extraction condition based on the characteristic (feature) of a pathologic region. In addition, the extraction condition setting unit 72c includes an extraction number decision unit 723 that sets, based on malignant degree information of the pathologic region, a larger number of images to be extracted for a higher malignant degree.
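As a non-limiting illustration of this rule, the sketch below maps a malignant degree class to the number of images to be extracted; the class names follow the examples given later in this embodiment (hyperplastic polyp, adenoma, invasive cancer), while the specific counts and identifiers are assumptions.

```python
# Assumed mapping from malignant degree class to the number of images to be
# extracted; a higher malignant degree keeps more images for diagnosis.
EXTRACTION_COUNT_BY_CLASS = {
    "hyperplastic_polyp": 1,
    "adenoma": 3,
    "invasive_cancer": 5,
}

def decide_extraction_number(malignancy_class):
    """Number of extraction for a classified pathologic region (defaults to 1)."""
    return EXTRACTION_COUNT_BY_CLASS.get(malignancy_class, 1)
```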
Next, an image processing method executed by the image processing apparatus 1 will be described.
As illustrated in
In Step S312, the malignant degree determination unit 7134 classifies the pathologic region according to a preset class of malignant degree. Specifically, in the malignant degree class classification processing, a rectangular region is set in the pathologic region, texture feature data in the rectangular region is calculated, and class classification is performed by machine learning. Here, the texture feature data is calculated using a known technique such as SIFT feature data, LBP feature data, or CoHoG, and is then vector-quantized using BoF, BoVW, or the like. In the machine learning, classification is performed using a strong classifier such as an SVM. For example, the pathology is classified into hyperplastic polyp, adenoma, invasive cancer, and the like. After Step S312, the image processing apparatus 1 returns to a main routine in
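The following sketch is one hedged illustration of this classification step: a simple LBP-style texture histogram stands in for the SIFT/LBP/CoHoG features and the BoF/BoVW quantization named above, and a scikit-learn support vector machine plays the role of the strong classifier. All function names, parameters, and the synthetic training interface are assumptions, not the actual implementation.

```python
import numpy as np
from sklearn.svm import SVC

def lbp_histogram(gray_patch, bins=256):
    """Tiny LBP-style texture descriptor for the rectangular region set on the
    pathologic region (a stand-in for SIFT/LBP/CoHoG followed by BoF/BoVW)."""
    g = gray_patch.astype(np.float32)
    center = g[1:-1, 1:-1]
    neighbors = [g[:-2, :-2], g[:-2, 1:-1], g[:-2, 2:], g[1:-1, 2:],
                 g[2:, 2:], g[2:, 1:-1], g[2:, :-2], g[1:-1, :-2]]
    codes = np.zeros(center.shape, dtype=np.int32)
    for bit, n in enumerate(neighbors):
        codes |= (n >= center).astype(np.int32) << bit
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / max(hist.sum(), 1)

def train_malignancy_classifier(patches, labels):
    """Fit the 'strong classifier' (an SVM) on texture descriptors. `labels`
    holds class names such as 'hyperplastic_polyp', 'adenoma', 'invasive_cancer'."""
    feats = np.stack([lbp_histogram(p) for p in patches])
    clf = SVC(kernel="rbf")
    clf.fit(feats, labels)
    return clf

def classify_malignancy(clf, patch):
    """Classify one rectangular pathologic region patch."""
    return clf.predict(lbp_histogram(patch)[None, :])[0]
```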
Referring back to
In Step S32, the extraction condition setting unit 72c executes, on the endoscopic image group, extraction condition setting processing of setting, based on the characteristic (feature) of the pathologic region, an extraction target range defined by a base point and edge points decided based on the base point.
Referring back to
In Step S33, the image extraction unit 73 executes endoscopic image extraction processing of extracting, based on an extraction condition, an endoscopic image having image quality appropriate for diagnosis (image quality satisfying a predetermined condition).
Subsequently, the image extraction unit 73 extracts, in ascending order of distance from a predetermined range in the feature data space of the image quality evaluation value, images corresponding in number to the number of extraction set by the extraction condition setting unit 72c (Step S332). After Step S332, the image processing apparatus 1 returns to a main routine in
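A minimal sketch of this top-k extraction, assuming the feature data space is one-dimensional and the predetermined range is an interval (low, high), might look as follows; the interval form and the identifiers are illustrative assumptions.

```python
import numpy as np

def extract_top_k(evaluation_values, target_range, k):
    """Indices of the `k` images in ascending order of distance between their
    image quality evaluation value and the predetermined range (low, high)."""
    low, high = target_range
    values = np.asarray(evaluation_values, dtype=np.float32)
    # Distance is zero inside the range; otherwise, distance to the nearest bound.
    dist = np.where(values < low, low - values,
                    np.where(values > high, values - high, 0.0))
    return np.argsort(dist)[:k].tolist()

# Usage example: keep the 3 images closest to the target range [0.8, 1.0].
print(extract_top_k([0.2, 0.95, 0.7, 1.3, 0.85], (0.8, 1.0), 3))
```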
According to the second embodiment described above, analysis of the characteristic (feature) of a pathologic region that has been obtained as input information is performed, an extraction condition is set based on the characteristic (feature) of the pathologic region, and a high-quality image is extracted from a reference range image based on the extraction condition, whereby an endoscopic image appropriate for diagnosis can be extracted from an endoscopic image group arranged in chronological order.
Next, a third embodiment will be described. The third embodiment differs from the aforementioned first embodiment in the configuration of the arithmetic unit 7 of the image processing apparatus 1 and in the processing to be executed. Hereinafter, after the configuration of an arithmetic unit according to the third embodiment is described, the processing executed by an image processing apparatus according to the third embodiment will be described. In addition, the same configurations as those in the image processing apparatus 1 according to the aforementioned first embodiment are assigned the same signs, and the description thereof will be omitted.
The pathologic region analysis unit 71d includes a pathology characteristic information calculation unit 713d and a gazing operation determination unit 714d in place of the pathology characteristic information calculation unit 713 and the gazing operation determination unit 714 of the pathologic region analysis unit 71 according to the aforementioned first embodiment.
Based on the pathologic region information, the pathology characteristic information calculation unit 713d calculates pathology characteristic information representing a characteristic of a pathologic region. In addition, the pathology characteristic information calculation unit 713d includes a change amount calculation unit 7135 that calculates a change amount of pathologic regions between an endoscopic image of interest and an endoscopic image adjacent to the endoscopic image of interest in chronological order, when pathologic region presence information represents present determination.
The gazing operation determination unit 714d determines a gazing operation on a pathologic region based on the pathology characteristic information. In addition, the gazing operation determination unit 714d includes a stop operation determination unit 7145 that determines that a stop operation is being performed when the change amount is less than a preset predetermined value.
The extraction condition setting unit 72d has the same configuration as the extraction condition setting unit 72c according to the aforementioned second embodiment, and sets an extraction condition based on the characteristic (feature) of the pathologic region. In addition, the extraction condition setting unit 72d includes an extraction number decision unit 723 that sets, based on malignant degree information of the pathologic region, a larger number of images to be extracted for a higher malignant degree.
The image extraction unit 73d extracts an endoscopic image having predetermined condition image quality, based on the extraction condition. In addition, the image extraction unit 73d includes an image quality evaluation value calculation unit 731d that calculates an evaluation value corresponding to the image quality of a pathologic region. In addition, the image quality evaluation value calculation unit 731d includes a viewpoint evaluation value calculation unit 7311 that calculates an evaluation value corresponding to viewpoint information for a pathologic region.
Next, an image processing method executed by the image processing apparatus 1 will be described.
As illustrated in
Subsequently, the change amount calculation unit 7135 calculates a change amount of pathologic regions between an endoscopic image of interest and an endoscopic image adjacent to the endoscopic image of interest in chronological order, when pathologic region presence information represents present determination (Step S412).
After that, the stop operation determination unit 7145 determines whether the diagnosis operation is a stop operation (Step S413). Specifically, the stop operation determination unit 7145 determines that a stop operation is being performed when the change amount is less than a preset predetermined value. After Step S413, the image processing apparatus 1 returns to a main routine in
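As a hedged illustration of Steps S412 and S413, the snippet below expresses the change amount of pathologic regions between adjacent frames as one minus the intersection over union (IoU) of their bounding boxes and determines a stop operation when this change amount is below a threshold. The IoU-based measure and the threshold value are assumptions, since the disclosure does not specify how the change amount is computed.

```python
def bbox_iou(a, b):
    """Intersection over union of two bounding boxes (x0, y0, x1, y1)."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix1 - ix0) * max(0, iy1 - iy0)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

def is_stop_operation(box_t, box_prev, change_threshold=0.2):
    """Change amount between the image of interest and the adjacent image,
    here 1 - IoU of their pathologic regions; a stop operation is determined
    when the change amount is less than the preset predetermined value."""
    if box_t is None or box_prev is None:
        return False
    change = 1.0 - bbox_iou(box_t, box_prev)
    return change < change_threshold
```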
Referring back to
In Step S42, the extraction condition setting unit 72d executes, on the endoscopic image group, extraction condition setting processing of setting, based on the characteristic (feature) of the pathologic region, an extraction target range defined by a base point and edge points decided based on the base point.
Referring back to
In Step S43, the image extraction unit 73 executes endoscopic image extraction processing of extracting, based on an extraction condition, an endoscopic image having image quality appropriate for diagnosis (image quality satisfying a predetermined condition).
In Step S432, the viewpoint evaluation value calculation unit 7311 calculates an evaluation value corresponding to viewpoint information for the pathologic region. Specifically, the viewpoint evaluation value calculation unit 7311 calculates a higher evaluation value for an image in which an important region appears large, such as an image from an upper viewpoint in which the top portion of the pathology can be checked, or an image from a side viewpoint in which the rising of the pathology can be checked. Here, the viewpoint information is defined according to the inclination of the mucosal surface around the pathologic region. For example, when the viewpoint is an upper viewpoint, the inclination intensity and direction in the region neighboring the pathology vary widely, and the viewpoint evaluation value calculation unit 7311 calculates the evaluation value accordingly. After Step S432, the image processing apparatus 1 advances the processing to Step S433.
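The following sketch is only one speculative reading of this viewpoint evaluation: it treats a wide spread of inclination directions around the pathologic region as evidence of an upper (top-down) viewpoint and scores it higher. The per-pixel surface estimate, the circular-variance criterion, and all identifiers are assumptions introduced for illustration.

```python
import numpy as np

def viewpoint_evaluation(surface_estimate, neighbor_mask):
    """Score how 'top-down' the viewpoint on a pathologic region appears.
    `surface_estimate` stands in for any per-pixel inclination/height estimate
    of the mucosal surface, and `neighbor_mask` marks the region neighboring
    the pathology; both inputs are assumptions."""
    gy, gx = np.gradient(surface_estimate.astype(np.float32))
    angles = np.arctan2(gy, gx)[neighbor_mask]        # inclination directions
    if angles.size == 0:
        return 0.0
    # Direction spread: 1 - length of the mean unit vector (circular variance).
    spread = 1.0 - float(np.hypot(np.cos(angles).mean(), np.sin(angles).mean()))
    # Under an upper viewpoint the surrounding inclinations point in many
    # directions, so a larger spread yields a higher evaluation value.
    return spread
```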
According to the third embodiment described above, analysis of the characteristic (feature) of a pathologic region that has been obtained as input information is performed, an extraction condition is set based on the characteristic (feature) of the pathologic region, and a high-quality image is extracted from a reference range image based on the extraction condition, whereby an endoscopic image appropriate for diagnosis can be extracted from an endoscopic image group arranged in chronological order.
In the present disclosure, an image processing program recorded in a recording device can be implemented by being executed in a computer system such as a personal computer or a workstation. In addition, such a computer system may be used by being connected to a device such as another computer system or a server via a local area network (LAN), a wide area network (WAN), or a public line such as the Internet. In this case, the image processing apparatus according to the first to third embodiments and the modified examples thereof may acquire image data of an intraluminal image via these networks, output an image processing result to various types of output devices such as a viewer and a printer connected via these networks, and store an image processing result in a recording device connected via these networks, such as a recording medium readable by a reading device connected to a network, for example.
In addition, in the description of the flowcharts in this specification, the anteroposterior relationship of processing between steps is clearly indicated using wordings such as "first", "after that", and "subsequently", but the order of processes necessary for implementing the present disclosure is not uniquely defined by these wordings. In other words, the order of processes in the flowcharts described in this specification can be changed without causing any contradiction.
According to the present disclosure, an effect is obtained in which a high-quality endoscopic image appropriate for diagnosis can be extracted from an endoscopic image group.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the disclosure in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
This application is a continuation of International Application No. PCT/JP2016/071770, filed on Jul. 25, 2016, the entire contents of which are incorporated herein by reference.
Relation | Application Number | Filing Date | Country
Parent | PCT/JP2016/071770 | Jul. 2016 | US
Child | 16256425 | | US