The disclosure relates to surgical imaging systems, and more particularly, to systems and methods for assisting a clinician performing surgery on lymphatic and other luminal structures.
As technology has advanced, surgeons have largely replaced classical open surgical techniques with minimally invasive techniques such as laparoscopic and thoracoscopic surgery in an effort to minimize trauma to surrounding tissue, reduce pain, reduce scarring, and shorten the length of time a patient is required to stay in the hospital. Minimally invasive surgery, such as the thoracoscopic approach pioneered in the early 20th century, involves the use of small incisions (from one to several), typically no larger than 3-10 mm. Originally performed using a cystoscope, the approach benefited from advances in medical technology that led to the development of specialized instruments, such as a thoracoscope, for viewing anatomy within the thoracic cavity while performing the surgical procedure. In the late 20th century, Video Assisted Thoracic Surgery (VATS) was developed, utilizing a fiber-optic endoscope to further reduce the size of the incisions required and to provide clearer, more defined images of the thoracic cavity.
In parallel with the advances in minimally invasive surgeries have come advances in in-situ imaging techniques. Employing these in-situ techniques, dyes, such as indocyanine green or methylene blue, can be injected into the body. These dyes are typically injected into blood vessels and other luminal networks so that the blood vessel or other luminal network pathway can be observed when the dyes are excited by various wavelengths of infrared and near-infrared light.
While these technologies have led to improvements in surgical outcomes, improvements to the technology are always desirable.
The disclosure is directed to a system and method that enables real-time visual examination of in vivo tissues and selective display of luminal networks lying beneath the surface of the tissues being examined.
In one aspect, this disclosure features a method of imaging tissue. The method of imaging tissue includes receiving white light images, receiving near infrared (NIR) images, storing the NIR images in memory, and detecting fluorescence in the NIR images. The method of imaging tissue also includes generating composite images including the white light images and NIR images in which the fluorescence is detected. The method of imaging tissue also includes displaying the composite images in a user interface.
In aspects, implementations of this disclosure may include one or more of the following features. The composite images may be formed of white light images captured at a time after the NIR images are received. The method may also include registering the white light images and the NIR images. The registration may be an electromagnetic-based registration. The registration may be an image-based registration. Detecting fluorescence may include determining which pixels in the NIR images change brightness at a rate faster than a threshold.
In another aspect, this disclosure features a system for imaging a patient. The system includes an endoscope including a white light source, a near infrared (NIR) light source, and at least one camera capable of capturing reflected white and NIR light. The system also includes a processor in communication with the at least one camera and configured to generate a white light video and an NIR video from the captured reflected white and NIR light. The system also includes a display in communication with the processor to selectively present a user interface including the white light video or the NIR video. The system also includes a memory having stored thereon an application which, when executed by the processor, causes the processor to detect fluorescence in the NIR video, generate a composite video including the white light video and the fluorescence detected in the NIR video, and display the composite video on the display.
In aspects, implementations of this disclosure may include one or more of the following features. The NIR video may be stored in memory. The application, when executed by the processor, may further cause the processor to register the NIR video to the white light video. The white light video registered to the NIR video may be captured after the detection of the fluorescence in the NIR video. The system may also include an electromagnetic (EM) field generator. The endoscope may include an EM sensor. The application, when executed by the processor, may further cause the processor to determine a position of the EM sensor in a field generated by the EM field generator. The application, when executed by the processor, may further cause the processor to perform image-based registration of the NIR video and the white light video. The displayed composite video may depict the white light video correlated to fluorescing pixels in the NIR video with an altered color. The application, when executed by the processor, may further cause the processor to correlate fluorescing pixels in the NIR video to pixels in the white light video. Detecting fluorescence in the NIR video may include determining which pixels in the NIR video change brightness at a rate faster than a threshold.
In another aspect, this disclosure features a method of identifying an area of interest in an endoscopic image. The method includes illuminating tissue with white light. The method also includes capturing reflected white light. The method also includes illuminating tissue with near infrared (NIR) light. The method also includes capturing fluorescence emitted by tissue infused with a fluorescent dye. The method also includes displaying the captured reflected white light as a white light video on a display. The method also includes storing the captured fluorescence as a video in a memory. The method also includes displaying the fluorescence video and the white light video on the display.
In aspects, implementations of this disclosure may include one or more of the following features. The fluorescence video may be registered to the white light video. The registration may be an image-based registration. A composite video including the white light video and fluorescence video may be generated and displayed such that the fluorescence may be observable in the white light video. The method may also include detecting pixels in the fluorescence video corresponding to pixels in the white light video. The method may also include displaying the white light video with the corresponding pixels having a changed color.
A further aspect of the disclosure is directed to a method of imaging tissue including steps of illuminating tissue with white light, capturing white light images, and illuminating tissue with near infrared (NIR) light. The method further includes detecting fluorescence emitted by tissue infused with a fluorescent dye, converting the detected fluorescence into a centerline of perfusion, and overlaying the centerline of perfusion onto the white light image. The method may include displaying a composite image of the white light image and the centerline of perfusion on a user interface. The centerline of perfusion may be determined from the rate of change of a detected position of detected fluorescence. The centerline of perfusion may be determined as a median of a plurality of vectors determined via image processing of multiple images in which fluorescence is detected. Further, the centerline of perfusion may be determined prior to a current white light video image on which the centerline of perfusion is overlaid. Still further, a current white light image and a prior white light image may be used to register a location of the centerline of perfusion in the current white light image.
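By way of illustration only, the sketch below shows one way the centerline of perfusion described above could be approximated from a stack of NIR frames; the function name, the frame-difference threshold, and the use of per-frame front centroids with a median displacement vector are assumptions made for this example rather than the method required by the disclosure.

```python
import numpy as np

def perfusion_centerline(nir_frames, rate_threshold=15.0):
    """Estimate a centerline of perfusion from a time-ordered stack of NIR frames.

    nir_frames: iterable of 2-D arrays ordered in time.
    Returns an (N, 2) array of (row, col) points tracing the advancing
    fluorescence front, plus the median advance vector between points.
    """
    frames = [np.asarray(f, dtype=np.float32) for f in nir_frames]
    points = []
    for prev, curr in zip(frames, frames[1:]):
        # Pixels whose brightness rises faster than the threshold between
        # consecutive frames are treated as the newly perfused front.
        front = (curr - prev) > rate_threshold
        if front.any():
            rows, cols = np.nonzero(front)
            points.append((rows.mean(), cols.mean()))  # centroid of the front
    points = np.asarray(points)
    if len(points) < 2:
        return points, None
    # The median of the frame-to-frame displacement vectors gives a robust
    # estimate of the dominant direction of flow along the vessel.
    vectors = np.diff(points, axis=0)
    median_vector = np.median(vectors, axis=0)
    return points, median_vector
```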
Various aspects and features of the disclosure are described hereinbelow with reference to the drawings, wherein:
The disclosure is directed to a system and method that enables real-time visual examination of in vivo tissues and selective display of luminal networks lying beneath the surface of the tissues being examined.
Details of the endoscope 100 can be seen in
The first camera 106 may be a white light optical camera such as a charge-coupled device (CCD) camera, a complementary metal-oxide-semiconductor (CMOS) camera, an N-type metal-oxide-semiconductor (NMOS) camera, or any other suitable white light camera known in the art. Similarly, the first light source 104 may be or may include a light emitting diode (LED) emitting white light, although any suitable light emitting device known in the art may be utilized (e.g., the first light source 104 may be the end of an optical fiber connected to a light source external to the patient). The second light source 108 may be a laser or another emitter of infrared (IR) or near infrared (NIR) light. Finally, the second camera 110 may be a CCD camera capable of detecting IR or NIR light. Other cameras capable of capturing IR or NIR light, either with or without filtering, are also contemplated in connection with the disclosure as are multi-spectral cameras such as those capable of capturing white light and NIR light using a single camera.
Following insertion, as shown in
To identify structures that are either below the surface of the tissue, or whose structure is not clearly distinguishable from a perspective above the surface of the tissue, various dyes may be employed. One such dye is indocyanine green (ICG) dye. When ICG dye is illuminated with certain wavelengths of light in the IR or NIR range, the ICG fluoresces, and this fluorescence can be captured by a camera such as the second camera 110.
The capture of the NIR images can be triggered automatically or as directed by the clinician. The clinician can instruct or control the system to capture the endoscopic image using any suitable method of control available (e.g., footswitch, voice, assistant, or hand motion). Alternatively, the system can perform this task automatically by detecting sudden changes in the image. Automated detection accuracy can be improved through comparison with the template images described below. When dyes such as ICG, which generate visible changes, are used, the results are seen in the UI 200. The process of the dye diffusing through the tissue reveals anatomic detail, such as vasculature and parts of the lymphatic system. The pattern of diffusion provides information on the direction of flow within that structure. By capturing a video of the diffusion in the lymphatic or another luminal system, it is possible to show the clinician the network of connections (e.g., lymph nodes and lymphatic ducts) and to determine which connections are the sentinel nodes, thus allowing for more complete harvesting of such structures during cancer surgery.
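As a hedged illustration of the automatic trigger mentioned above, a simple frame-differencing check could be used to detect a sudden change in the image; the function name and its threshold value below are illustrative assumptions only, not the detection logic of the system itself.

```python
import numpy as np

def should_start_capture(prev_frame, curr_frame, change_threshold=8.0):
    """Return True when the mean per-pixel change between consecutive frames
    exceeds a threshold, e.g. when injected dye suddenly begins to fluoresce
    in the field of view."""
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    return float(diff.mean()) > change_threshold
```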
With reference back to
As shown in
While the systems and methods described above are useful, dyes such as ICG have certain issues that impact the way they are used. First, some dyes tend to perfuse through the entirety of the tissues 202-206 as the dyes are passed through the circulatory system. Substantially all tissues receive blood from the circulatory system. As a result, after a period, the entirety of the tissue proximate to the tissue into which a dye has been injected will be perfused, thus saturating the captured images with fluorescence and rendering the fluorescence captured by the second camera 110 useless. Secondly, dyes dissipate relatively quickly and are only visible for a couple of minutes after injection. Many of these dissipating dyes also have a level of toxicity that makes it difficult or less than desirable to perform multiple injections of the dyes. These limitations force the clinician to act immediately or at least very quickly on the information gleaned from the use of the dyes and imaging the tissues.
A network interface 408 allows the workstation 400 to communicate with a variety of other devices and systems via the Internet. The network interface 408 may connect the workstation 400 via a wired or wireless connection. Additionally or alternatively, the communication may be via an ad-hoc Bluetooth or wireless network enabling communication with a wide-area network (WAN) and/or a local-area network (LAN). The network interface 408 may connect to the Internet via one or more gateways, routers, and network address translation (NAT) devices. The network interface 408 may communicate with a cloud storage system 410, in which further image data and videos may be stored. The cloud storage system 410 may be remote from or on the premises of the hospital, such as in a control or hospital information technology room. An input module 412 receives inputs from an input device such as a keyboard, mouse, voice commands, etc. An output module 414 connects the processor 402 and memory 404 to a variety of output devices such as the display 220. Finally, the workstation 400 may include its own display 416, which may be a touchscreen display.
In at least one aspect, the endoscope 100 includes a location sensor such as an electromagnetic (EM) sensor 112, which receives electromagnetic signals from a field generator 114 generating three or more electromagnetic fields. One of the applications 406 stored in the memory 404 and executed by the processor 402 may determine the position of the EM sensor 112 in the EM field generated by the field generator 114. Determination of the position of the endoscope and the first and second cameras 106, 110 enables the registration of images. For example, as will be explained in greater detail below, a live white light image may be registered with an NIR image stored in memory. Though EM sensors are described above, other position sensors, such as ultrasound sensors, flex sensors, and robotic position detection sensors, are contemplated within the scope of the disclosure.
In accordance with the disclosure, a method of utilizing a video of the dye diffusion is described herein. Rather than simply imaging the tissue so that it can be displayed while the clinician is performing the surgery, the injection of the ICG dye and its diffusion through the tissue is captured as a video. The captured video of the diffusion of the ICG dye through the tissue can be played back at any time by the clinician. This allows the clinician to observe the direction of the diffusion as well as the relative speeds of diffusion through different vessels. This video of the diffusion of the ICG dye can be played back at a suitable speed (e.g., either at normal speeds or at high speeds) so that the clinician can observe the locations of the blood vessels or other tissues such as the lymph nodes 212, 214 before the tissue becomes entirely perfused. This video may then be overlaid on the real-time white light images (
In one aspect, the white light images are captured by the first camera 106. The lymph node 212 or a sentinel node upstream from the lymph node 212 may be injected with the ICG dye. The second light source 108 and the second camera 110 may be selected and, by observing fluorescence of the ICG dye when under NIR illumination, the location of the lymph nodes 212, 214 may be determined. Importantly, the direction of travel from the lymph node 212 to the lymph node 214 can be observed. The direction of travel through the lymphatic system can be an important factor in determining the likely spread of cancer and other diseases. Thus, by observing the direction of travel, if the lymph node 214 is downstream from the lymph node 212, then a clinical determination may be made as to whether the lymph node 214 should be removed. This determination is usually based on the proximity of the lymph node 214 to a cancerous lymph node or a sentinel node. In addition, a more complete map of the lymph node tree can be developed and stored in the memory 404.
Because a video of the lymphatic system or at least a portion of the lymphatic system is stored in the memory 404, the clinician can reference the video at any time. In accordance with one aspect of the disclosure, one or more registration points are identified either automatically or by the user in the white light video, which may be captured simultaneously with the NIR video. Since the NIR video and the white light video may be captured simultaneously (or at alternating times which occur at a high speed), by identifying a location of a structure in one video, that same structure can be located in the other video. This allows the data from one, e.g., the position of the lymph nodes 212, 214 and the lymphatic duct 210 connecting them, to be accurately displayed as an overlay on the white light video.
Further, by identifying registration points, image recognition techniques can be employed to allow the system to display the lymph nodes 212, 214, even upon movement of the endoscope 100 where one of the lymph nodes 212, 214 may no longer be observable using the white light camera 106. In such a scenario, a representation of the lymph node behind the intervening tissue may still be displayed. This may be very useful when, for example, navigating the endoscope 100 within the patient. Once the registration points are in the view of the endoscope 100, the location of the lymph nodes 212, 214 will be known and can be displayed in the white light images on display 220.
Alternatively, image recognition techniques make it possible to determine which pixels in the image of the NIR camera are detecting fluorescence. Because the fields of view of the white light first camera 106 and the NIR camera 110 are known, the locations of the lymph nodes 212, 214 and the lymphatic duct 210 connecting them can be observed. The pixel data can then be transferred to, overlaid on, or otherwise applied to the white light images, so that the corresponding pixels of the white light images in the substantially same field of view are highlighted to display the locations of the lymph nodes 212, 214. In practice, this may only be performed for those pixels that are changing at a rate greater than a set threshold. This limits the growth of the pixelized areas in the white light images by applying, to the white light images, just the pixel data of the lymph nodes and lymphatic vessels, and not the pixel data of the surrounding tissue into which the ICG dye will ultimately, but more slowly, diffuse. Though described with respect to the lymphatic system, the same could be done with the blood vessels in the area. These and other methods of utilizing the data received from the first camera 106 and the second camera 110 are described in greater detail below.
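For illustration only, the sketch below shows one simple way fluorescing pixel coordinates from the NIR camera image could be mapped to corresponding pixels of the white light image when the two cameras share substantially the same field of view and differ only in resolution; the function name and the pure-scaling assumption are hypothetical simplifications, not the correlation scheme itself.

```python
import numpy as np

def nir_to_white_pixels(nir_mask, nir_shape, white_shape):
    """Map fluorescing pixel coordinates from the NIR camera image into the
    white light camera image, assuming the two cameras view substantially
    the same field and differ only in resolution.

    nir_mask: HxW boolean array of fluorescing NIR pixels.
    Returns (rows, cols) index arrays into the white light image.
    """
    rows, cols = np.nonzero(nir_mask)
    scale_r = white_shape[0] / nir_shape[0]
    scale_c = white_shape[1] / nir_shape[1]
    white_rows = np.clip((rows * scale_r).astype(int), 0, white_shape[0] - 1)
    white_cols = np.clip((cols * scale_c).astype(int), 0, white_shape[1] - 1)
    return white_rows, white_cols
```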
The acquiring of both NIR and white light images may occur in an alternating fashion or simultaneously. The user may determine which of the images to display on the display 220. This may be the live white light images, the live NIR images, or the NIR diffusion video stored in the memory 404. If it is a diffusion video stored in memory 404 that is desired to be displayed, the appropriate diffusion video may be selected by a clinician and the diffusion video may be loaded into an application 406 at step 518 for display. In addition, at step 520, a speed of the video may be adjusted in response to a request received from a user. Increasing the speed of the video can be useful to the clinician in clearly and quickly outlining where areas of interest (e.g., blood vessels, lymph nodes, lymphatic ducts, etc.) are located. This can allow a clinician to play the video forward and backward, and at varying speeds, to ensure that the locations of the areas of interest are well understood. Knowledge of these areas can assist the clinician when determining which structures to cut, avoid, biopsy, or remove as necessary.
The video, as loaded at step 518 and adjusted at step 520, may be displayed on the display 220. The presentation of the video may be in a picture-in-picture format allowing the clinician to observe both the video and the real-time white light images received from the endoscope. Alternatively, the video and the real-time white light images may be displayed side-by-side and/or at approximately the same size.
In a further aspect, where the field of view of the white light images captured by the endoscope at least partially aligns with the field of view of the video stored in memory, a registration of the video and the live images may be undertaken at step 522. Following registration, the diffusion video (or an individual NIR image) may be displayed as an overlay on the real-time white light video (or an individual image in the white light video) at step 524. This overlay of the NIR video, in which only those portions of the field of view (FOV) which fluoresce can be readily observed, results in a composite video of the NIR video and the white light video viewable on the display 220. In this way, the NIR video can be used to identify structures in the FOV of the white light video. As the FOV of the endoscope changes with its movement, those portions of the white light video which are no longer aligned with the FOV of the NIR video may no longer be depicted as a composite video, but rather only the white light video is displayed. This may be considered an indicator to the clinician to move the endoscope back to a location where the NIR video clearly reveals the locations of the areas of interest.
Additionally or alternatively, if any portion of the structures of the NIR video is still in the FOV of the white light video, the white light video may be warped in order to fit the portion of the NIR video in the FOV of the white light video. In embodiments where the EM sensor 112 collects position data for the cameras 106, 110 defining the FOV, a calculation can be made to determine the amount of foreshortening, rotation, and other warping to perform.
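A minimal sketch of such a warping calculation is shown below, assuming the tissue surface is approximately planar so that a standard planar homography can be built from the relative camera pose reported between the two EM-tracked positions; the plane normal, the distance value, and the helper function itself are illustrative assumptions rather than the disclosure's implementation.

```python
import numpy as np
import cv2

def warp_from_em_pose(nir_frame, K, R, t, n=np.array([0.0, 0.0, 1.0]), d=100.0):
    """Warp a stored NIR frame into the current camera view using the relative
    pose (R, t) between the two EM-tracked camera positions, assuming the
    tissue surface is roughly a plane with normal n at distance d (same units
    as t) from the first camera.

    K is the 3x3 camera intrinsic matrix. The planar-homography formula
    H = K (R - t n^T / d) K^-1 is standard; n and d here are stand-in values.
    """
    H = K @ (R - np.outer(t, n) / d) @ np.linalg.inv(K)
    h, w = nir_frame.shape[:2]
    # Resample the stored frame so it aligns with the current field of view.
    return cv2.warpPerspective(nir_frame, H, (w, h))
```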
If a 3D endoscope is used, the coordinates of the area of interest may be extracted directly from the scanned model, annotated with any additional information extracted from the images, such as luminal network connection data and fluid flow direction, as described for the lymph nodes above. This additional data may be stored in the memory 404 and available for use by the clinician as the endoscope is returned to one or more stored coordinate positions.
In accordance with another aspect of the disclosure, an application 406 stored in memory 404 can be employed to analyze the NIR images captured by the endoscope 100 at second camera 110. This may occur in real time as the images are captured or may involve processing the video stored in memory 404. The images are analyzed at step 526 to determine which pixels in the NIR images of the diffusion video are experiencing a change in brightness or illuminance (typically measured in lux or foot-candle). The change in brightness is a result of the NIR light from the second light source 108 illuminating the ICG dye that is diffusing through the area of interest.
Because the NIR images may be captured substantially simultaneously with the white light images, the images are necessarily registered with one another. Because of this registration through simultaneous capture, by determining which pixels in the NIR images are changing from not fluorescing to fluorescing, an indication of the location of areas of interest can be made. The rate of the change is relevant to the identification of the larger vessels carrying the dye. Blood vessels and other luminal structures greater than a certain diameter allow the dye to pass through relatively quickly and thus change from not fluorescing to fluorescing quickly. By detecting this rate of change, and only identifying those pixels that change at a rate greater than a threshold, those pixels that are slower to change can be eliminated from further processing, thus preventing the saturation described above.
Once the pixels that are changing at a rate greater than a predetermined threshold are identified, these pixels can be correlated to the same pixels in the FOV of the white light images at step 528. Once the pixels are identified in the white light images, these pixels can have their brightness or color changed such that the area of interest is revealed in the white light images at step 530, and this altered white light image can be displayed at step 532.
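The following sketch illustrates, with a hypothetical helper function and an assumed threshold value, how pixels brightening faster than a threshold in a registered NIR frame might be recolored in the corresponding white light frame, in the spirit of steps 526-532 above; it is an example under those assumptions, not the disclosed implementation.

```python
import numpy as np

def highlight_fluorescence(white_frame, nir_frame, nir_prev, rate_threshold=15.0):
    """Tint pixels of a registered white light frame wherever the NIR frame is
    brightening faster than the threshold (i.e. where dye is arriving through
    larger vessels rather than slowly perfusing background tissue).

    white_frame: HxWx3 uint8 image; nir_frame/nir_prev: HxW arrays of the same
    size, already registered to the white light field of view.
    """
    rate = nir_frame.astype(np.float32) - nir_prev.astype(np.float32)
    mask = rate > rate_threshold
    out = white_frame.copy()
    # Shift masked pixels toward green so the luminal structure stands out.
    out[mask, 1] = 255
    out[mask, 0] = out[mask, 0] // 2
    out[mask, 2] = out[mask, 2] // 2
    return out
```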
As noted hereinabove, registration of the NIR images and the white light images is a useful feature in enabling the data derived from the NIR images to be observed in the white light images. One form of registration requires the use of the EM sensor 112 and the field generator 114. A detected position of the EM sensor 112 in the EM field can be correlated to each frame of both the white light videos and the NIR videos. Once the position data for each frame of the NIR video depicting the perfusion of the dye through the area of interest is determined, the NIR video data can be made available and displayed in the white light images, as described above, by correlating the positions of the NIR images with those of the white light images.
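One possible, purely illustrative, way to associate EM position data with captured frames is sketched below; the data structure, field names, and nearest-position lookup are assumptions made for this example rather than the registration scheme of the disclosure.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class PoseTaggedFrame:
    """A captured frame tagged with the EM sensor pose at capture time."""
    image: np.ndarray          # HxW (NIR) or HxWx3 (white light) pixels
    position: np.ndarray       # (x, y, z) of the EM sensor in the field
    orientation: np.ndarray    # e.g. a quaternion (w, x, y, z)

def closest_stored_frame(live_position, stored_frames):
    """Return the stored NIR frame whose recorded EM position is closest to
    the live camera position, as a crude pose-based lookup before any finer
    image-based alignment."""
    return min(stored_frames,
               key=lambda f: float(np.linalg.norm(f.position - live_position)))
```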
In addition to EM field position detection and registration, other forms of registration are also possible. For example, an application 406 stored in memory 404 may engage in image detection that identifies structures or other hard points in every image of the white light video at step 534. These hard points necessarily also appear in the NIR images that are captured simultaneously, as described above. As a result, the white light video images and the NIR video images are necessarily registered with one another. However, if use is to be made of the diffusion video at a later time (e.g., later in a long procedure, or even in a subsequent procedure), some form of registration is required. By identifying registration points in the white light video that was captured substantially simultaneously with the NIR video images, when these registration points are observed in subsequent white light video images, a correlation or registration can be performed between these later captured video images and the original white light video images.
The registration may include a step 536 of determining a FOV of the endoscope 100 and the white light images captured by the endoscope. Registration of the real-time white light images with the previously captured white light images necessarily also registers the previously captured NIR images to the current white light images at step 538, and thus the diffusion video with the real-time white light images. In this way, either selectively or automatically, as new white light images are captured via the endoscope 100, the fluorescence observed in the NIR images of the diffusion video from the same FOV as the real-time white light video can be displayed on the display 220 at step 540.
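As an illustrative sketch of such image-based registration, a feature-matching approach (here ORB features with a RANSAC homography, an assumed stand-in for whatever hard-point detection the application 406 actually uses) could map a previously captured white light frame, and therefore the simultaneously captured NIR data, into the current white light view.

```python
import cv2
import numpy as np

def register_to_prior_frame(current_gray, prior_gray, min_matches=10):
    """Estimate a homography mapping a prior white light frame (captured
    together with the NIR diffusion video) into the current white light frame.
    Because the NIR video was captured simultaneously with the prior frame,
    the same homography places the stored fluorescence data into the current view."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(prior_gray, None)
    kp2, des2 = orb.detectAndCompute(current_gray, None)
    if des1 is None or des2 is None:
        return None
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    if len(matches) < min_matches:
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H  # apply with cv2.warpPerspective to overlay stored NIR data
```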
Yet a further aspect of the disclosure is described with reference to
In
As a result of the processes above, an area of interest can be tracked continually during the remainder of the surgery. For example, image recognition techniques may be used to follow the anatomy during the dissection and resection process as parts of the anatomy are split into smaller sections or removed entirely. At times, the area of interest may become obscured or covered by other anatomy or instruments. Whether using EM registration and position detection or image-based registration, the system may recover the tracking process when the common landmarks described earlier return into view.
In one aspect, the clinician may at any time toggle on an anatomy tracking feature to see where the highlighted anatomy has moved or whether it still exists. In the case of lymph node resection, the clinician may have marked an area on the display that is intended to be removed. The system of the disclosure can track the area of interest and can notify the surgeon when the tracked area has been entirely removed. When sentinel lymph nodes are removed, the surgeon can be alerted as to the presence of connected lymph nodes that should be considered for removal as well.
While several aspects of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular aspects.