Recording device, image observation device, observation system, control method of observation system, and computer-readable recording medium

Information

  • Patent Grant
  • Patent Number
    11,653,815
  • Date Filed
    Thursday, February 11, 2021
  • Date Issued
    Tuesday, May 23, 2023
Abstract
A recording device includes: a memory; and a processor including hardware. The processor is configured to generate, based on temporal change in plural sets of image data that have been generated by an endoscope and arranged chronologically, biological information on a subject, associate the plural sets of image data with the biological information to record the plural sets of image data with the biological information into the memory, and select, based on the biological information, image data from the plural sets of image data that have been recorded in the memory to generate three-dimensional image data.
Description
BACKGROUND
1. Technical Field

The present disclosure relates to a recording device, an image observation device, an observation system, a control method of the observation system, and a computer-readable recording medium.


2. Related Art

In the medical field, the inside of the body of a subject, which is a living body, has been observed using endoscopes (see, for example, Japanese Patent Application Laid-open No. 2016-062488). Images captured by an endoscope are used in diagnosis performed by a first medical doctor operating the endoscope.


Studies have been made on storing the images captured by the endoscope into a recording device and on using the stored images for diagnosis (a second opinion) by a second medical doctor who is spatially or temporally remote from the first medical doctor.


However, biological information on the subject, such as pulse and respiration, is not available when the second medical doctor observes the stored images, which makes diagnosis more difficult than when the subject is observed directly by inserting the endoscope into the body.


SUMMARY

The disclosure addresses the above-described issue, and a general purpose thereof is to provide a recording device, an image observation device, an observation system, a control method of the observation system, and an operating program for the observation system that facilitate the observation using the recorded images captured by an endoscope.


To address the above issue, in some embodiments, a recording device includes: a memory; and a processor including hardware. The processor is configured to generate, based on temporal change in plural sets of image data that have been generated by an endoscope and arranged chronologically, biological information on a subject, associate the plural sets of image data with the biological information to record the plural sets of image data with the biological information into the memory, and select, based on the biological information, image data from the plural sets of image data that have been recorded in the memory to generate three-dimensional image data.


In some embodiments, an image observation device includes: a processor comprising hardware. The processor is configured to acquire data on a three-dimensional image, the data being generated by selection of a set of image data from plural sets of image data that have been generated by an endoscope and arranged chronologically, the selection being based on biological information on a subject, the biological information being generated based on temporal change in the plural sets of image data, generate angle-of-view information, based on information of a position and an angle at which the subject is observed, and generate two-dimensional image data acquired by observation of the three-dimensional image, the observation being based on the biological information and the angle-of-view information.


In some embodiments, an observation system includes: a memory; and a processor including hardware. The processor is configured to generate, based on temporal change in plural sets of image data that have been generated by an endoscope and arranged chronologically, biological information on a subject, associate the plural sets of image data with the biological information to record the plural sets of image data with the biological information into the memory, select, based on the biological information, image data from the plural sets of image data that have been recorded in the memory to generate data on a three-dimensional image, generate angle-of-view information, based on information of a position and an angle at which the subject is observed, and generate two-dimensional image data acquired by observation of the three-dimensional image, the observation being based on the biological information and the angle-of-view information.


In some embodiments, a control method of an observation system includes: generating, based on temporal change in plural sets of image data that have been generated by an endoscope and arranged chronologically, biological information on a subject; selecting, based on the biological information, image data from the plural sets of image data to generate data on a three-dimensional image; generating angle-of-view information, based on information of a position and an angle at which the subject is observed; and generating two-dimensional image data acquired by observation of the three-dimensional image, the observation being based on the biological information and the angle-of-view information.


In some embodiments, provided is a non-transitory computer-readable recording medium with an executable program stored thereon. The program causes an observation system to execute: generating, based on temporal change in plural sets of image data that have been generated by an endoscope and arranged chronologically, biological information on a subject; selecting, based on the biological information, image data from the plural sets of image data to generate data on a three-dimensional image; generating angle-of-view information, based on information of a position and an angle at which the subject is observed; and generating two-dimensional image data acquired by observation of the three-dimensional image, the observation being based on the biological information and the angle-of-view information.


The above and other features, advantages and technical and industrial significance of this disclosure will be better understood by reading the following detailed description of presently preferred embodiments of the disclosure, when considered in connection with the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram illustrating a configuration of an observation system according to an embodiment of the disclosure;



FIG. 2 is a flow chart for observation using the observation system;



FIG. 3 is a diagram illustrating how a recording device generates three-dimensional image data;



FIG. 4 is a diagram illustrating how an image observation device generates two-dimensional image data;



FIG. 5 is a block diagram illustrating a configuration of an observation system according to a first modified example;



FIG. 6 is a block diagram illustrating a configuration of an observation system according to a second modified example;



FIG. 7 is a block diagram illustrating a configuration of an observation system according to a third modified example;



FIG. 8 is a diagram illustrating how biological information is generated from signal values of feature points; and



FIG. 9 is a diagram illustrating how biological information is generated from coordinate values of feature points.





DETAILED DESCRIPTION

Embodiments of a recording device, an image observation device, an observation system, a control method of the observation system, and an operating program for the observation system will be described below with reference to the drawings. The disclosure is not limited by these embodiments. The disclosure is generally applicable to recording devices using medical endoscopes, image observation devices using medical endoscopes, observation systems using medical endoscopes, control methods for the observation systems, and operating programs for the observation systems.


Elements that are the same as or correspond to each other are assigned the same reference sign throughout the drawings, as appropriate. It should be noted that the drawings are schematic: the dimensions of each element and the proportions between elements may differ from the actual ones, and these dimensional relations and proportions may also differ between the drawings.


EMBODIMENTS


FIG. 1 is a block diagram illustrating a configuration of an observation system according to an embodiment of the disclosure. As illustrated in FIG. 1, an observation system 1 according to this embodiment includes an endoscope observation system 10, a recording device 20, and an image observation device 30, which are connected to one another via a network 2, such as the Internet.


The endoscope observation system 10 includes: an endoscope 11 that captures an image of the inside of the body of a subject that is a living body; a sensor unit 12 that detects biological information on the subject; an image processing device 13 that performs image processing on image data captured by the endoscope 11; and a display device 14 that displays an image according to image data generated by the image processing device 13.


The endoscope 11 has an insertion unit to be inserted into the subject, and an operating unit connected to a proximal end of the insertion unit. According to input to the operating unit, an imaging device arranged at a distal end of the insertion unit captures an image of the inside of the body of the subject. The imaging device is an image sensor, such as a complementary metal oxide semiconductor (CMOS) sensor or a charge coupled device (CCD) sensor.


The sensor unit 12 acquires biological information, that is, information on, for example, the pulse or respiration. Specifically, the sensor unit 12 is a bloodstream sensor fixed so as to hold a fingertip of the subject. Alternatively, the sensor unit 12 may be a respiration sensor attached to, for example, the mouth, nose, or chest of the subject.


The image processing device 13 is configured using a workstation or personal computer including a general-purpose processor or a dedicated processor. The general-purpose processor may be, for example, a central processing unit (CPU), and the dedicated processor may be any of various arithmetic circuits that execute specific functions, such as application specific integrated circuits (ASICs) and field programmable gate arrays (FPGAs).


The display device 14 is configured using liquid crystal or organic electroluminescence, for example.


The recording device 20 is a server connected to plural devices, such as an endoscope observation system and an image observation device, including the endoscope observation system 10 and the image observation device 30, via the network 2. The recording device 20 is implemented by, for example: a general-purpose processor, such as a CPU, or a dedicated processor, such as an arithmetic circuit that executes a specific function; a recording medium (a memory device), such as a semiconductor memory, an HDD, an MO, a CD-R, or a DVD-R; and a drive device that drives the recording medium. The arithmetic circuit may be an ASIC or FPGA, and the semiconductor memory may be a flash memory, a random access memory (RAM), or a read only memory (ROM).


The recording device 20 includes: a biological information generating unit 21 that generates biological information on the subject; a recording unit 22 that records therein plural sets of image data in association with the biological information, the plural sets of image data having been generated by the endoscope 11 and chronologically arranged; and a three-dimensional image generating unit 23 that generates three-dimensional image data by selecting, based on the biological information, image data from the plural sets of image data that have been recorded in the recording unit 22.


The image observation device 30 includes: an input device 31 that receives input of a position and an angle at which the subject is observed; an image processing device 32 that reads three-dimensional image data recorded in the recording device 20 and generates image data; and a display device 33 that displays an image according to the image data generated by the image processing device 32.


The input device 31 has a configuration similar to that of the operating unit of the endoscope 11 and receives input of a position and an angle at which the subject is observed. Alternatively, the input device 31 may be formed of, for example, at least one selected from the group of a mouse, a keyboard, a touch pad, and a touch panel.


The image processing device 32 is configured using a workstation or personal computer including, for example, a general-purpose processor, such as a CPU, or a dedicated processor, such as an arithmetic circuit that executes a specific function. The arithmetic circuit may be an ASIC or FPGA.


The image processing device 32 has: an acquiring unit 321 that acquires the three-dimensional image data generated by the three-dimensional image generating unit 23; an angle-of-view information generating unit 322 that generates angle-of-view information, based on information received by the input device 31; and a two-dimensional image generating unit 323 that generates two-dimensional image data acquired by observation of a three-dimensional image, the observation being based on the biological information and the angle-of-view information.


The display device 33 is configured using liquid crystal or organic electroluminescence, for example.


Next, operation in which a second medical doctor observes an image using the image observation device 30 will be described, the image having been captured by a first medical doctor using the endoscope observation system 10. FIG. 2 is a flow chart for observation using the observation system 1. As illustrated in FIG. 2, by using the endoscope 11, the first medical doctor captures images of the inside of the body of a subject (Step S1).


Simultaneously, the sensor unit 12 detects bloodstream information for a time period in which the endoscope 11 is capturing the images (Step S2).


Subsequently, the image processing device 13 performs predetermined image processing on plural sets of image data generated by the endoscope 11 (Step S3).


Thereafter, the endoscope observation system 10 transmits the plural sets of image data and the bloodstream information via the network 2 to the recording device 20, and records the plural sets of image data and the bloodstream information into the recording device 20 (Step S4).


Subsequently, the biological information generating unit 21 generates biological information, that is, pulse information, using the bloodstream information (Step S5: biological information generating step). FIG. 3 is a diagram illustrating how the recording device generates three-dimensional image data. As illustrated in FIG. 3, based on the bloodstream information detected by the sensor unit 12, the biological information generating unit 21 determines whether each image captured by the endoscope 11 was captured in the systole or in the diastole. The result of this determination is the pulse information on each image. Systole is the phase in which the heart contracts and blood is being sent out from the heart. Conversely, diastole is the phase in which the heart dilates and the arterial blood pressure is low according to the systemic vascular resistance. The blood flow volumes in the organs inside a body differ between the systole and the diastole, and the color tones of the organs thus also differ between the two phases.
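As a concrete illustration of this determination, the minimal sketch below labels each frame-aligned bloodstream sample as systolic or diastolic by comparing it against the midline of the signal. The thresholding rule and the function name are assumptions for illustration, not the patented method itself.

```python
import numpy as np

def label_pulse_phases(bloodstream: np.ndarray) -> np.ndarray:
    """Label each frame-aligned bloodstream sample as 'systole' or 'diastole'.

    Assumption: while the signal sits above its midline, blood is being
    ejected (systole); below the midline, the heart is refilling (diastole).
    """
    midline = (bloodstream.max() + bloodstream.min()) / 2.0
    return np.where(bloodstream > midline, "systole", "diastole")

# One bloodstream sample per captured frame (hypothetical values).
signal = np.array([0.2, 0.6, 0.9, 0.7, 0.3, 0.1, 0.5, 0.8])
print(label_pulse_phases(signal))
```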


The recording unit 22 then records therein the plural sets of image data and the biological information in association with each other (Step S6). Specifically, the recording unit 22 records therein each image included in the plural sets of image data in association with pulse information on that image.


Furthermore, the three-dimensional image generating unit 23 generates three-dimensional image data by selecting image data, based on the biological information, from the plural sets of image data (Step S7: three-dimensional image generating step). Specifically, as illustrated in FIG. 3, the three-dimensional image generating unit 23 selects, from the plural sets of image data, the images of systolic frames captured in the systole and generates a systolic three-dimensional image; likewise, it selects the images of diastolic frames captured in the diastole and generates a diastolic three-dimensional image. The three-dimensional image generating unit 23 generates a three-dimensional image by extracting feature points from each image and joining the feature points of consecutive images in chronological order. The systolic three-dimensional image and the diastolic three-dimensional image generated by the three-dimensional image generating unit 23 are recorded into the recording unit 22.
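The frame selection described above can be sketched as follows: frames are partitioned by their pulse label, and feature points are extracted from each frame so that consecutive frames can be joined. The reconstruction step itself is omitted; the function names and the use of ORB features are illustrative assumptions, as the patent does not prescribe a particular feature extractor.

```python
import cv2  # OpenCV; ORB stands in for the unspecified feature extractor

def split_frames_by_phase(frames, labels):
    """Partition chronologically ordered frames into systolic and diastolic sets."""
    systolic = [f for f, lab in zip(frames, labels) if lab == "systole"]
    diastolic = [f for f, lab in zip(frames, labels) if lab == "diastole"]
    return systolic, diastolic

def extract_feature_points(frame):
    """Detect feature points in one frame; joining matched points across
    consecutive frames yields the three-dimensional surface (omitted here)."""
    orb = cv2.ORB_create()
    keypoints, descriptors = orb.detectAndCompute(frame, None)
    return keypoints, descriptors
```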


Subsequently, the acquiring unit 321 of the image observation device 30 acquires the three-dimensional image data and the biological information (Step S8). Specifically, the acquiring unit 321 acquires the systolic three-dimensional image and the diastolic three-dimensional image from the recording unit 22 of the recording device 20.


Based on information received by the input device 31, the angle-of-view information generating unit 322 generates angle-of-view information (Step S9: angle-of-view information generating step). Specifically, based on operation on the input device 31 by the second medical doctor, the angle-of-view information generating unit 322 generates angle-of-view information that temporally changes.


Furthermore, the two-dimensional image generating unit 323 generates two-dimensional image data acquired by observation of a three-dimensional image, the observation being based on the biological information and the angle-of-view information (Step S10: two-dimensional image generating step). FIG. 4 is a diagram illustrating how the image observation device generates two-dimensional image data. As illustrated in FIG. 4, based on the pulse information, the two-dimensional image generating unit 323 generates a virtual pulse phase. This phase may be acquired by using the pulse information directly, or by calculating, based on the pulse information, an average period or amplitude, for example. Based on the phase and the angle-of-view information, the two-dimensional image generating unit 323 generates, from the three-dimensional image corresponding to the phase, that is, the systolic three-dimensional image or the diastolic three-dimensional image, two-dimensional image data for the position and the angle indicated by the second medical doctor.
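One way to realize the virtual pulse phase is sketched below: the average pulse period derived from the recorded pulse information maps playback time onto a repeating cycle, and the systolic or diastolic three-dimensional model is selected accordingly. The 40% systolic fraction and the stub renderer are assumptions for illustration only.

```python
def virtual_pulse_phase(t: float, average_period: float) -> str:
    """Map playback time t to a virtual pulse phase using the average
    period computed from the recorded pulse information."""
    frac = (t % average_period) / average_period
    return "systole" if frac < 0.4 else "diastole"  # ~40% systole: an assumption

def project_to_2d(model, position, angle):
    """Stub standing in for the actual renderer: produces the
    two-dimensional view of the selected model for the given viewpoint."""
    return {"model": model, "position": position, "angle": angle}

def render_view(models: dict, t: float, period: float, position, angle):
    """Pick the model matching the current virtual phase and render it for
    the position and angle the second medical doctor has indicated."""
    model = models[virtual_pulse_phase(t, period)]
    return project_to_2d(model, position, angle)
```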


As described above, the embodiment enables the second medical doctor to observe the inside of the body of a subject that virtually pulsates and whose bloodstream changes. That is, observation is facilitated because the second medical doctor is able to perform observation as if directly observing the inside of the body of the subject using the endoscope 11.


Furthermore, when respiration information is used as the biological information, the second medical doctor is able to observe the inside of the body of a subject that virtually moves due to respiration. That is, observation is facilitated because the second medical doctor is able to perform observation as if directly observing the inside of the body of the subject using the endoscope 11.


According to the above description of the embodiment, a virtual pulse is generated for observation of the inside of the body of a subject, but the systolic three-dimensional image or the diastolic three-dimensional image may be observed as is.


First Modified Example


FIG. 5 is a block diagram illustrating a configuration of an observation system according to a first modified example. As illustrated in FIG. 5, a recording device 20A in an observation system 1A further includes an image correcting unit 24A that performs correction including: reducing change in shape that differs according to biological information; and enhancing change in color tones that differ according to the biological information. Specifically, the image correcting unit 24A corrects shape information and color tone information on the systolic three-dimensional image and the diastolic three-dimensional image illustrated in FIG. 3. Shape information is, for example, three-dimensional coordinate values of feature points extracted when a three-dimensional image is generated, and color tone information is, for example, RGB signal values of the feature points.


When the shape information on the systolic three-dimensional image is Ss, the shape information on the diastolic three-dimensional image is Se, the corrected shape information on the systolic three-dimensional image is Ss′, the corrected shape information on the diastolic three-dimensional image is Se′, and the correction coefficient is ks, the correction for reducing the change in shape can be performed on a three-dimensional image by using the following Equations (1) and (2).

Ss′=Ss+ks(Se−Ss), where 0<ks<0.5  (1)
Se′=Se−ks(Se−Ss), where 0<ks<0.5  (2)


Furthermore, when the color tone information on the systolic three-dimensional image is Ts, the color tone information on the diastolic three-dimensional image is Te, the corrected color tone information on the systolic three-dimensional image is Ts′, the corrected color tone information on the diastolic three-dimensional image is Te′, and the correction coefficient is kt, the correction for enhancing the change in color tones in a three-dimensional image can be performed by using the following Equations (3) and (4).

Ts′=Ts+kt(Te−Ts), where kt<0  (3)
Te′=Te−kt(Te−Ts), where kt<0  (4)
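Expressed in code, the four corrections read as below. Ss, Se, Ts, and Te are arrays of feature-point coordinates and RGB signal values as defined above; the default coefficient values are illustrative assumptions within the stated ranges.

```python
import numpy as np

def correct_shape(Ss: np.ndarray, Se: np.ndarray, ks: float = 0.25):
    """Equations (1) and (2): pull the systolic and diastolic shapes toward
    each other, reducing pulse-induced shape change (requires 0 < ks < 0.5)."""
    return Ss + ks * (Se - Ss), Se - ks * (Se - Ss)

def correct_color(Ts: np.ndarray, Te: np.ndarray, kt: float = -0.25):
    """Equations (3) and (4): push the systolic and diastolic color tones
    apart, enhancing the change in color tones (requires kt < 0)."""
    return Ts + kt * (Te - Ts), Te - kt * (Te - Ts)
```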


As described above, when the image correcting unit 24A performs this correction, the change in shape between the systolic three-dimensional image and the diastolic three-dimensional image generated by the three-dimensional image generating unit 23 is reduced; unnecessary movement due to the pulse, for example, is thus eliminated from the two-dimensional image data, which facilitates observation by the second medical doctor. Furthermore, the correction increases the change in color tones between the systolic three-dimensional image and the diastolic three-dimensional image, so the second medical doctor is able to easily observe a portion having a different color tone in the two-dimensional image data, the portion being, for example, reddening or a tumor.


According to the above description of the first modified example, the image correcting unit 24A corrects each of the shapes and the color tones of the systolic three-dimensional image and the diastolic three-dimensional image, but the first modified example is not limited to this example. The three-dimensional image generating unit 23 may generate a shape of a three-dimensional image using all of images, generate a systolic three-dimensional image using systolic color tone information for the generated shape of the three-dimensional image, and generate a diastolic three-dimensional image using diastolic color tone information for the generated shape of the three-dimensional image. The image correcting unit 24A may then correct only the color tones of the systolic three-dimensional image and the diastolic three-dimensional image by using Equations (3) and (4) above. Similarly, the image correcting unit 24A may correct only the shapes of three-dimensional images generated by the three-dimensional image generating unit 23.


According to the above description of the first modified example, the image correcting unit 24A corrects the systolic three-dimensional image and the diastolic three-dimensional image, but the first modified example is not limited to this example. The image correcting unit 24A may, for example, correct the image data generated by the endoscope 11, and the three-dimensional image generating unit 23 may generate three-dimensional image data using the corrected image data.


Second Modified Example


FIG. 6 is a block diagram illustrating a configuration of an observation system according to a second modified example. As illustrated in FIG. 6, an image processing device 13B of an endoscope observation system 10B in an observation system 1B further includes an imaging control unit 131B that controls the timing of imaging by the endoscope 11 according to the biological information. As a result, imaging is enabled at times when the bloodstream is at its maximum and at its minimum, for example. The second medical doctor is then able to perform observation in a state where the differences between the systole and the diastole are large, which facilitates the observation.
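One simple way to trigger capture at the bloodstream extrema is a three-sample turning-point test on the live sensor signal, sketched below. The test and the function name are assumptions for illustration, not the patented control logic.

```python
def should_capture(signal_history: list) -> bool:
    """Return True when the middle of the last three bloodstream samples is
    a local maximum or minimum, i.e., the signal has just turned around."""
    if len(signal_history) < 3:
        return False
    a, b, c = signal_history[-3:]
    return (b > a and b > c) or (b < a and b < c)
```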


Third Modified Example


FIG. 7 is a block diagram illustrating a configuration of an observation system according to a third modified example. As illustrated in FIG. 7, an endoscope observation system 10C in an observation system 1C does not have a sensor unit.


Based on temporal change in image data that have been recorded in the recording unit 22, the biological information generating unit 21 generates biological information.



FIG. 8 is a diagram illustrating how biological information is generated from signal values of feature points. As illustrated in FIG. 8, color tones of the inside of the body of a subject temporally change depending on the bloodstream. The biological information generating unit 21 generates biological information, based on temporal change in RGB signal values of feature points extracted upon generation of a three-dimensional image. Specifically, the biological information generating unit 21 generates, based on temporal change in signal values of the R component and B component of feature points, biological information that is pulse information like the one illustrated in FIG. 3. The biological information generating unit 21 may generate pulse information, based on, for example, temporal change in color tones of the whole image or temporal change in color tones of an area extracted by predetermined image processing. The biological information generating unit 21 may generate pulse information, based on information on any one of color components, such as the R component or the B component, or may generate pulse information by performing predetermined calculation with signal values of each color component.
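The color-based estimation can be sketched as follows: the mean R and B signal values per frame form a waveform that rises and falls with the blood volume, and thresholding that waveform yields pulse information of the kind illustrated in FIG. 3. The R/B ratio here is just one possible instance of the "predetermined calculation", chosen as an assumption.

```python
import numpy as np

def pulse_from_color(frames_rgb: list) -> np.ndarray:
    """Estimate per-frame pulse labels from temporal change in color tones.

    frames_rgb: list of H x W x 3 RGB arrays, chronologically ordered.
    """
    r = np.array([f[..., 0].mean() for f in frames_rgb])
    b = np.array([f[..., 2].mean() for f in frames_rgb])
    waveform = r / (b + 1e-6)  # one possible 'predetermined calculation'
    midline = (waveform.max() + waveform.min()) / 2.0
    return np.where(waveform > midline, "systole", "diastole")
```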



FIG. 9 is a diagram illustrating how biological information is generated from coordinate values of feature points. As illustrated in FIG. 9, the position of an organ inside a subject temporally changes depending on the pulse. The biological information generating unit 21 generates biological information, based on temporal change in coordinate values of feature points extracted upon generation of a three-dimensional image. Specifically, the biological information generating unit 21 generates, based on temporal change in coordinates of feature points, biological information that is pulse information like the one illustrated in FIG. 3. The biological information generating unit 21 may generate pulse information, based on temporal change in coordinates of, for example, a contour detected from an image, without being limited to feature points.
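Analogously, pulse information can be recovered from motion alone: the sketch below measures how far the tracked feature points have moved from their positions in the first frame and thresholds that displacement waveform. The array layout is an assumption.

```python
import numpy as np

def pulse_from_motion(tracks: np.ndarray) -> np.ndarray:
    """Estimate per-frame pulse labels from temporal change in coordinates.

    tracks: array of shape (num_frames, num_points, 2) holding the image
    coordinates of the same feature points in every frame.
    """
    disp = np.linalg.norm(tracks - tracks[0], axis=2).mean(axis=1)
    midline = (disp.max() + disp.min()) / 2.0
    return np.where(disp > midline, "systole", "diastole")
```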


According to the disclosure, a recording device, an image observation device, an observation system, a control method of the observation system, and an operating program for the observation system that further facilitate observation are able to be provided, the observation involving acquisition of images from the recording device, the images having been captured using an endoscope.


Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the disclosure in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.

Claims
  • 1. A recording device comprising: a processor comprising hardware, the processor being configured to: generate, based on temporal change in plural sets of image data that have been generated by an endoscope and arranged chronologically, pulse information on a subject; associate the plural sets of image data with the pulse information to record the plural sets of image data with the pulse information into a memory; and select, based on the pulse information, image data from the plural sets of image data that have been recorded in the memory to generate three-dimensional image data, wherein the processor is further configured to perform correction to reduce change in shape that differs according to the pulse information and to enhance change in a color tone that differs according to the pulse information, and wherein the processor is further configured to: select image data in systole from the plural sets of image data that have been recorded in the memory to generate systolic three-dimensional image data, the systole being a phase in which a heart contracts and blood is being sent out from the heart; select image data in diastole from the plural sets of image data that have been recorded in the memory to generate diastolic three-dimensional image data, the diastole being a phase in which the heart dilates and an arterial blood pressure is low according to a systemic vascular resistance; and perform the correction to reduce change in shape between a systolic three-dimensional image and a diastolic three-dimensional image and to enhance change in color tone between the systolic three-dimensional image and the diastolic three-dimensional image.
  • 2. The recording device according to claim 1, wherein the processor is further configured to: extract feature points in images from the plural sets of image data that have been arranged chronologically; and generate, based on temporal change in the extracted feature points, the pulse information on the subject.
  • 3. The recording device according to claim 1, wherein the processor is further configured to: extract feature points in images from the plural sets of image data that have been arranged chronologically; and generate, based on temporal change in colors of the extracted feature points, the pulse information on the subject.
  • 4. The recording device according to claim 1, wherein the processor is further configured to: extract feature points in images from the plural sets of image data that have been arranged chronologically; and generate, based on temporal change in coordinates of the extracted feature points, the pulse information on the subject.
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation of International Application No. PCT/JP2018/032248, filed on Aug. 30, 2018, the entire contents of which are incorporated herein by reference.

Foreign Referenced Citations (7)
Number Date Country
2002-282213 Oct 2002 JP
2010-279500 Dec 2010 JP
2016-062488 Apr 2016 JP
2016-179121 Oct 2016 JP
6095879 Mar 2017 JP
2017-153953 Sep 2017 JP
WO 2017073181 May 2017 WO
Non-Patent Literature Citations (1)
Entry
International Search Report dated Nov. 27, 2018 issued in PCT/JP2018/032248.
Related Publications (1)
Number Date Country
20210192836 A1 Jun 2021 US
Continuations (1)
Number Date Country
Parent PCT/JP2018/032248 Aug 2018 US
Child 17173602 US