Device and method for tracking the position of an endoscope within a patient's body

Information

  • Patent Grant
  • Patent Number
    11,529,197
  • Date Filed
    Thursday, October 17, 2019
  • Date Issued
    Tuesday, December 20, 2022
Abstract
Systems and methods of tracking the position of an endoscope within a patient's body during an endoscopic procedure are disclosed. The devices and methods include determining a position of the endoscope within the patient in the endoscope's coordinate system, capturing, with an external optical tracker, an image of fiducial markers attached to the endoscope, transforming the captured fiducial markers from the endoscope's coordinate system to the optical tracker's coordinate system, projecting a virtual image of the endoscope on a model of the patient's organ, and projecting or displaying the combined image.
Description
FIELD

The present specification relates generally to endoscopes, and more specifically, to a device and method for displaying an image of a position of an endoscope within a patient's body, superimposed on the patient's body or overlaid on an image of the patient's body.


BACKGROUND

An endoscope is a medical instrument used for examining and treating internal body cavities such as the alimentary canal, airways, the gastrointestinal system, and other organ systems. Conventional endoscopes usually comprise an elongated tubular shaft, rigid or flexible, having a video camera and a fiber optic light guide for directing light from an external light source, situated at a proximal end of the tube, to a distal tip. Also, most endoscopes are provided with one or more channels, through which medical devices, such as forceps, probes, and other tools, may be passed. Further, during an endoscopic procedure, fluids, such as water, saline, drugs, contrast material, dyes, or emulsifiers are often introduced or evacuated via the shaft. A plurality of channels, one each for introduction and suctioning of liquids, may be provided within the shaft.


Endoscopes have attained great acceptance within the medical community, since they provide a means for performing procedures with minimal patient trauma, while enabling the physician to view the internal anatomy of the patient. Over the years, numerous endoscopes have been developed and categorized according to specific applications, such as cystoscopy, colonoscopy, laparoscopy, and upper gastrointestinal (GI) endoscopy, among others. Endoscopes may be inserted into the body's natural orifices or through an incision in the skin.


Some endoscopes have viewing elements for viewing an internal organ, such as the colon, and an illuminator for illuminating the field of view of the viewing elements. The viewing elements and illuminators are located in a tip of the endoscope and are used to capture images of the internal walls of the body cavity being endoscopically scanned. The captured images are sent to a control unit coupled with the endoscope via one of the channels present in the scope shaft, to be displayed on a screen coupled with the control unit.


During an endoscopic procedure, an operating physician guides the endoscope within a patient's body by using the captured images displayed on the screen coupled with the control unit. However, the physician does not know the exact position of the endoscope within the body with respect to the internal organs of the patient. The physician maneuvers the endoscope largely based on his/her knowledge of the patient's anatomy, experience, and the displayed images of internal organs.


Conventional endoscope guiding systems allow an operating physician to view the scope's position within a patient's body by displaying a representation of the endoscope within the body during an endoscopic procedure. However, such representations depict the scope's position with respect to the endoscope's coordinate system and not with respect to the patient's coordinates. Thus, the operator is not provided with an accurate sense of the scope's position relative to the patient's body. This may cause the operator to maneuver the scope in a manner that causes discomfort or even pain to the patient.


Hence, there is a need for a device and method that displays an accurate position of an endoscope within a patient's body by combining the scope's coordinates with the patient's coordinates. There is also a need for a method of combining patient and endoscope information to provide an augmented reality environment clearly highlighting the endoscope's position with respect to a patient's internal organs.


SUMMARY

The following embodiments and aspects thereof are described and illustrated in conjunction with systems, tools and methods, which are meant to be exemplary and illustrative, not limiting in scope.


The present specification discloses an endoscope system having an endoscope handle and an endoscope body adapted to be inserted into a gastrointestinal tract of a patient, the system comprising: a plurality of orientation markers positioned on said endoscope handle, wherein said orientation markers are distributed around a circumference of said endoscope handle; a plurality of sensors positioned at different locations longitudinally along an external surface of the endoscope body, wherein each of said plurality of sensors is adapted to generate first orientation data; one or more cameras positioned external to said patient and adapted to detect one or more of said plurality of orientation markers and generate second orientation data; and a controller adapted to receive said first orientation data and second orientation data and generate data indicative of a position of said endoscope body within said gastrointestinal tract of the patient.


Optionally, the data indicative of a position of said endoscope body includes positions of all portions of the endoscope body that have entered into the gastrointestinal tract of the patient.


Optionally, the sensors are distributed longitudinally along a length of the endoscope body and are separated by a predefined distance of at least 0.1 mm.


Optionally, the controller is adapted to generate an image of said endoscope body based upon data indicative of a position of said endoscope body.


Optionally, the controller is further adapted to orient said image by performing a translation of an orientation of the endoscope to a coordinate system defining a position of said patient and applying said translation to said image.


Optionally, the controller is further adapted to overlay said oriented image onto an image of a portion of said patient's gastrointestinal tract to generate an oriented, overlaid image of said endoscope body.


Optionally, the first orientation data is indicative of a position of the endoscope body relative to an endoscope coordinate system.


Optionally, the second orientation data is indicative of a position of the endoscope handle relative to a patient coordinate system.


Optionally, the endoscope system further comprises a projector adapted to receive the oriented, overlaid image of said endoscope body from the controller and project it onto the patient.


Optionally, the plurality of orientation markers comprise spheres placed on the handle of the endoscope, each sphere having a diameter ranging from 0.5 to 2 cm.


Optionally, the plurality of orientation markers comprise pinpoint-sized laser beams.


Optionally, the plurality of orientation markers are made of a material that reflects or emits infra-red light.


Optionally, the plurality of sensors comprise one or more of accelerometers, gyroscopes, magnetometers and stripes that measure the bending and twisting of an insertion tube of the endoscope by one or more of electro-optic and mechanical methods.


Optionally, the plurality of sensors comprise one or more of inductive sensors, capacitive sensors, capacitive displacement sensors, photoelectric sensors, magnetic sensors, and infrared sensors placed along one of an elongated shaft and an insertion tube of the endoscope, wherein each of the sensors corresponds to a unique identifier, based on the location of the sensor along the insertion tube.


Optionally, the endoscope system further comprises a distance sensor adapted to detect distance markers positioned at different locations longitudinally along an external surface of the endoscope body and generate distance data, wherein the distance sensor comprises one or more of a depth sensor and a touch sensor for providing a distance the insertion tube has travelled inside the gastrointestinal tract of a patient.


Optionally, the endoscope system comprises two stereo-calibrated cameras adapted to generate second orientation data comprising 3D location of the fiducials in the cameras' own coordinate system by triangulation.


The present specification also discloses a method of tracking the position of an endoscope within a patient's organ during an endoscopic procedure, the method comprising: determining a position of the endoscope within the organ in the endoscope's coordinate system; capturing in an image a plurality of fiducial markers by an external optical tracker; transforming the captured fiducial markers from the endoscope's coordinate system to the optical tracker's coordinate system; detecting the captured fiducial markers on a model of the patient's organ; and projecting the image of the endoscope with the fiducial markers upon an image of the patient's organ with the fiducial markers.


Optionally, the external optical tracker is a camera placed above the endoscope performing the endoscopic procedure.


Optionally, the captured fiducial markers are detected on a model of the patient's organ by using an object detection algorithm. Still optionally, the captured fiducial markers are detected on a model of the patient's organ by using the Hough Transform. Optionally, the method further comprises casting the position of the endoscope directly on the patient's body by using a calibrated projector.


The present specification also discloses an endoscopy system for tracking the position of an endoscope within a patient's organ during an endoscopic procedure, the system comprising at least an endoscope coupled with a plurality of fiducial markers; an optical tracker placed external to the endoscope; and a computing unit for at least processing the images captured by the optical tracker, the optical tracker capturing in an image the plurality of fiducial markers and the endoscope during the endoscopic procedure, and the computing unit transforming the captured fiducial markers from the endoscope's coordinate system to the optical tracker's coordinate system and projecting the image of the endoscope with the fiducial markers upon an image of the patient's organ with the fiducial markers.


Optionally, the external optical tracker is a camera placed above the endoscope performing the endoscopic procedure.


Optionally, the captured fiducial markers are transformed from the endoscope's coordinate system to the optical tracker's coordinate system by using an object detection algorithm. Still optionally, the captured fiducial markers are transformed from the endoscope's coordinate system to the optical tracker's coordinate system by using the Hough Transform.


Optionally, the system further comprises a calibrated projector for casting the position of the endoscope directly on the patient's body.


The aforementioned and other embodiments of the present specification shall be described in greater depth in the drawings and detailed description provided below.





BRIEF DESCRIPTION OF THE DRAWINGS

These and other features and advantages of the present specification will be appreciated, as they become better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:



FIG. 1A illustrates a multiple viewing elements endoscopy system in which the methods of the present specification may be implemented;



FIG. 1B is a schematic depiction of a layout of a multiple viewing elements endoscopy system and an associated interface unit deployed in an operating room in which the methods of the present specification may be implemented;



FIG. 1C is a schematic diagram of another layout of a multiple viewing elements endoscopy system and an associated interface unit deployed in an operating room in which the methods of the present specification may be implemented;



FIG. 1D is a schematic diagram of yet another layout of a multiple viewing elements endoscopy system and an associated interface unit deployed in an operating room in which the methods of the present specification may be implemented;



FIG. 2 illustrates a measurement of the depth, distance or location of an endoscopic tip using a multiple viewing elements endoscope whose elongated shaft has a plurality of sensors attached thereto, in accordance with an embodiment of the present specification;



FIG. 3 is a flowchart illustrating a method of tracking a position of an endoscope within a patient's internal organ during an endoscopic procedure, in accordance with an embodiment of the present specification;



FIG. 4A illustrates a three-dimensional (3D) model of an endoscope with fiducial markers, in accordance with an embodiment of the present specification;



FIG. 4B illustrates the 3D endoscope model shown in FIG. 4A projected onto a two-dimensional (2D) plane by an optical tracker/camera; and



FIG. 4C illustrates the fiducial markers shown in FIG. 4A projected on top of an image of an endoscope captured by the same camera, in accordance with an embodiment of the present specification.





DETAILED DESCRIPTION

The present specification provides a method for displaying the position of an endoscope within a patient's body. In an embodiment, an image of the endoscope being used in an endoscopic procedure is projected directly over the patient's body, allowing an operating physician to clearly ascertain the position of the endoscope within the body. In another embodiment, the endoscope's position within a patient's body is displayed as an image along with an image of the patient's internal organs on a monitor, allowing the operator to maneuver the endoscope easily within the body.


In various embodiments, the method of the present specification allows an operating physician to accurately determine the position of an endoscope within a patient's body during an endoscopic procedure, thereby reducing endoscope navigation time significantly. The method also allows the operator to correctly ascertain the distance of the endoscopic tip from a patient's cecum during a GI procedure.


In an embodiment, the method of the present specification allows for a three-dimensional reconstruction of a patient's colon. The images captured during an endoscopic procedure may be displayed on top of the colon model enabling better maneuverability of the endoscope and an improved diagnosis of the colon.


It is noted that the term “endoscope” as mentioned herein may refer particularly to a colonoscope, according to some embodiments, but is not limited only to colonoscopes. The term “endoscope” may refer to any instrument used to examine the interior of a hollow organ or cavity of the body.


It should also be noted that a plurality of terms, as follows, appearing in this specification are used interchangeably to apply or refer to similar components and should in no way be construed as limiting:

    • “Utility tube/cable” may also be referred to as an “umbilical tube/cable”.
    • A “main control unit” may also be referred to as a “controller unit” or “main controller”.
    • A “viewing element” may also be referred to as an image capturing device/component, viewing components, camera, TV camera or video camera.
    • A “working channel” may also be referred to as a “service channel”.
    • An “illuminator” may also be referred to as an “illumination source”, and in some embodiments, an LED.
    • A “flexible shaft” may also be referred to as a bending section or vertebra mechanism.
    • “Fiducial”, used herein and throughout, may be used to refer to a standard or reference, for example, a fiducial marker.


As used in the specification, the term “optical assembly” is used to describe a set of components that allows the endoscopic device to capture light and transform that light into at least one image. In some embodiments, lenses/optical elements are employed to capture light and image capturing devices, such as sensors, are employed to transform that light into at least one image.


Image capturing devices may be Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) image sensors, or other suitable devices having a light sensitive surface usable for capturing an image. In some embodiments, a sensor such as a CCD or CMOS image sensor, for detecting the reflected light received by an optical element, is employed.


In some embodiments, an optical element comprises a plurality of optics such as lens assemblies, lenses and protective glass, and is configured to receive reflected light from target objects.


In accordance with an embodiment of the present specification, a tip cover may house the tip section of an endoscope. The tip section, with the tip cover, may be turned or maneuvered by way of a flexible shaft, which may also be referred to as a bending section, for example, a vertebra mechanism. The tip cover may be configured to fit over the inner parts of the tip section, including an electronic circuit board assembly and a fluid channeling component, and to provide protection to the internal components when the tip section is inserted into a body cavity. The endoscope can then perform diagnostic or surgical procedures inside the body cavity. The tip section carries one or more viewing elements, such as cameras, to view areas inside body cavities that are the target of these procedures.


Tip cover may include panels having a transparent surface, window or opening for optical lens assemblies of viewing elements. The panels and viewing elements may be located at the front and sides of the tip section. Optical lens assemblies may include a plurality of lenses, static or movable, providing different fields of view.


An electronic circuit board assembly may be configured to carry the viewing elements, which may view through openings on the panels. Viewing elements may include an image sensor, such as but not limited to a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) image sensor.


The electronic circuit board assembly may be configured to carry illuminators that are able to provide illumination through illuminator optical windows. The illuminators may be associated with viewing elements, and may be positioned to illuminate the viewing elements' fields of view.


The present specification provides a method of determining the position of an endoscope within a patient's body by determining the endoscope's coordinates within the body. In an embodiment, the present specification discloses the use of fiducial markers or points to track the position of an endoscope within a patient's body. In an embodiment, fiducial markers are distributed around the entirety of the endoscope's handle and in conjunction with a sensing system provide data describing how the handle has been turned or oriented in 3D space. In an embodiment, a camera system is used to capture the position of the fiducial markers and, using known algorithms, the orientation of the endoscope is translated relative to the camera system. In an embodiment, fixing the position of the patient and the camera system enables the camera system to scale and translate the orientation of the endoscope into an image which is projected onto the body of the patient.


In embodiments, the position of an endoscope within a patient's body is tracked by sensors that measure the bending, turning, or orientation of the endoscope body within the patient's body. In other embodiments, sensors are integrated along the endoscope's insertion tube to provide real-time information on the distance being travelled by the endoscope inside the patient's body. In an embodiment, the orientation of the endoscope obtained from the fiducial markers and the bending, turning, or orientation information obtained via the sensors together provide a precise orientation of the entire endoscope within the patient. In an embodiment, a first position of the endoscope is determined by using the fiducial markers, and the position of each point along the endoscope body is determined by using a turn or orientation change relative to the endoscope's handle and the distance that point has traveled in the patient's body.


The present specification is directed towards multiple embodiments. The following disclosure is provided in order to enable a person having ordinary skill in the art to practice the specification. Language used in this specification should not be interpreted as a general disavowal of any one specific embodiment or used to limit the claims beyond the meaning of the terms used therein. The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the specification. Also, the terminology and phraseology used is for the purpose of describing exemplary embodiments and should not be considered limiting. Thus, the present specification is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed. For purpose of clarity, details relating to technical material that is known in the technical fields related to the specification have not been described in detail so as not to unnecessarily obscure the present specification.


It should be noted herein that any feature or component described in association with a specific embodiment may be used and implemented with any other embodiment unless clearly indicated otherwise.


Reference is now made to FIG. 1A, which shows a multi-viewing elements endoscopy system 100. System 100 may include a multi-viewing elements endoscope 102. Multi-viewing elements endoscope 102 may include a handle 104, from which an elongated shaft 106 emerges. Elongated shaft 106 terminates with a tip section 108 which is turnable by way of a bending section 110. Handle 104 may be used for maneuvering elongated shaft 106 within a body cavity. The handle may include one or more buttons and/or knobs and/or switches 105 which control bending section 110 as well as functions such as fluid injection and suction. Handle 104 may further include at least one, and in some embodiments, one or more working channel openings 112 through which surgical tools may be inserted, as well as one or more side service channel openings.


A utility cable 114, also referred to as an umbilical tube, may connect between handle 104 and a Main Control Unit 199. Utility cable 114 may include therein one or more fluid channels and one or more electrical channels. The electrical channel(s) may include at least one data cable for receiving video signals from the front and side-pointing viewing elements, as well as at least one power cable for providing electrical power to the viewing elements and to the discrete illuminators.


The main control unit 199 contains the controls required for displaying the images of internal organs captured by the endoscope 102. The main control unit 199 may govern power transmission to the endoscope's 102 tip section 108, such as for the tip section's viewing elements and illuminators. The main control unit 199 may further control one or more fluid, liquid and/or suction pump(s) which supply corresponding functionalities to the endoscope 102. One or more input devices 118, such as a keyboard, a touch screen and the like may be connected to the main control unit 199 for the purpose of human interaction with the main control unit 199. In the embodiment shown in FIG. 1A, the main control unit 199 comprises a screen/display 120 for displaying operation information concerning an endoscopy procedure when the endoscope 102 is in use. The screen 120 may be configured to display images and/or video streams received from the viewing elements of the multi-viewing element endoscope 102. The screen 120 may further be operative to display a user interface for allowing a human operator to set various features of the endoscopy system.


In various embodiments, the position of an endoscope within a patient's body is calculated by determining the endoscope's coordinates within the body. Various methods may be used for determining the location and pose of the endoscope within the scope's coordinate system or relative to an exterior coordinate system.


In an embodiment, the present specification discloses the use of fiducial markers or points to track the position of an endoscope within a patient's lumen. As is commonly known in the art, a fiducial marker or fiducial point is an object placed in the field of view of an imaging system which appears in the image produced, for use as a point of reference or a measure. It may be either something placed into or on the imaging subject, or a mark or set of marks in the reticle of an optical instrument. The location of the fiducial markers is dependent upon the method used to compute the scope's pose within the patient's body.


In an embodiment, the fiducial markers or points are small spheres (balls) placed on an endoscope's handle that enable measurement of the endoscope's position relative to the endoscope's coordinates (internal coordinate system). These spheres are easily recognizable in an image. Other recognizable shapes, such as but not limited to crosses, for example, may also be used as fiducial markers. In an embodiment, the fiducial markers employed in the present specification are spheres having a diameter in the range of 0.5 to 2 cm. In another embodiment, pinpoint-sized laser beams may also be used as fiducial markers as they are capable of being uniquely detected by an optical tracker/camera, when in the field of view of the optical tracker/camera. Fiducial markers may be made of any material that enables easy detection by optical trackers. In an embodiment, the fiducial markers may be made of a material that reflects or emits any form of light, particularly infrared light. A plurality of optical trackers, such as cameras, may be employed for detecting fiducial points on the handle portion of the endoscope. The fiducial points appear in the image captured by the optical trackers, serving as points of reference for correlating the endoscope's coordinates with the coordinates of the optical tracker.


In embodiments, the optical tracker, which may be an external camera, is placed above a patient undergoing an endoscopic procedure, so that the camera captures both the endoscope and the patient's body in the same image. Fiducial markers are placed on at least a portion of the endoscope. Hence, the external camera produces an image that displays the fiducial markers on the (partially unseen) endoscope along with the patient's body.


In an embodiment of the present specification, electromagnetic tracking techniques are used to detect the position of an endoscope within a patient's body. As is known in the art, for electromagnetic tracking of an endoscope, a plurality of electromagnetic coils is wound around one or more portions of the endoscope. The coils emit an electromagnetic signal that can be detected by an electromagnetic tracking device placed external to the endoscope. In an embodiment, fiducial markers are attached to the electromagnetic tracking device which is used to measure the position of the endoscope with respect to the coordinates of the tracking device.


Sensing Systems:


In another embodiment of the present specification, the position of an endoscope within a patient's body is tracked by sensors that measure the orientation of the insertion tube of the endoscope at several positions. In various embodiments, sensors such as accelerometers, gyroscopes, magnetometers (i.e., electronic compasses) and stripes that measure the bending and twisting of the insertion tube by electro-optic or mechanical methods may be used. In accordance with another embodiment, the endoscope comprises sensors integrated along its insertion tube to provide real-time information on the distance being travelled by the endoscope inside the patient's lumen.


In one embodiment, a plurality of sensors are placed along the elongated shaft or insertion tube of the endoscope. Further, each sensor has a unique identifier, code, signature, or other identification according to its location (such as distance from the distal tip) along the insertion tube. In another embodiment, each identifier is not only unique to the sensor but also indicative of the particular position, or distance, occupied by the sensor. Several different types of sensors may be employed, including, but not limited to, inductive sensors, capacitive sensors, capacitive displacement sensors, photoelectric sensors, magnetic sensors, and infrared sensors. In an embodiment, a depth sensor is placed at the entrance of the body where the endoscope is inserted and is in communication with the main control unit that is used with the endoscope. In some embodiments, a matrix of sensors is employed, so that continuity in the reading of distances is achieved. In some embodiments, touch sensors may be used. Thus, for example, with touch sensors placed at regular intervals on the insertion tube, the number of touch sensors showing an output would indicate the depth the insertion tube has travelled inside the lumen of the body.
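By way of illustration only, the following is a minimal sketch, in Python, of how an insertion depth might be derived from such uniquely identified sensors; the sensor identifiers, spacing, and mapping used here are hypothetical assumptions rather than part of the disclosed system.

```python
# Illustrative sketch: estimating insertion depth from uniquely identified
# touch sensors. The identifier-to-distance mapping and 10 cm spacing are
# assumed values for demonstration only.

def insertion_depth_cm(active_sensor_ids, id_to_distance_cm):
    """Return the depth implied by the deepest sensor currently showing an output.

    active_sensor_ids: identifiers of sensors reporting contact inside the lumen.
    id_to_distance_cm: mapping of each unique identifier to its known distance
                       from the distal tip of the insertion tube.
    """
    if not active_sensor_ids:
        return 0
    # The insertion tube has travelled at least as far as the largest
    # tip-to-sensor distance among the sensors showing an output.
    return max(id_to_distance_cm[s] for s in active_sensor_ids)

# Hypothetical sensors placed at 10 cm increments from the distal tip.
id_to_distance = {"S10": 10, "S20": 20, "S30": 30}
print(insertion_depth_cm({"S10", "S20"}, id_to_distance))  # prints 20
```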


It is known in the art that the insertion tube has numbers or marks on it to indicate to the physician the distance the insertion tube has travelled within the patient's body. Thus, in another embodiment, an imaging device, such as a CCD, a CMOS sensor, and the like, is placed outside the patient's body, close to the entrance point of the insertion tube of the endoscope, to capture images of the marks on the insertion tube visible outside the body, thereby providing the distance of the insertion tube within the patient's body.


In yet another embodiment, depth is measured by using sensors that respond to the physician's grip on the tube. Sensors are placed over substantially the entire length of the insertion tube, and each sensor has a unique identifier, code, signature, or other identification according to its location along the elongated axis of the insertion tube. Methods and systems of determining the location or distance of an endoscopic tip within a patient's body are described in co-pending United States Patent Application Publication Number US 2015/0099925 A1, entitled “Endoscope with Integrated Sensors” and published on Apr. 9, 2015, which is herein incorporated by reference in its entirety. Hence, as described above, various methods may be used for determining the location and pose of the endoscope within the scope's coordinate system.


Orientation Determination:


The 3D coordinates of the endoscope can be reconstructed from its computed orientations by integration by means of a bending matrix. A bending matrix provides a measurement of the extent of bending of an endoscope. The 3D coordinates of the endoscope can be reconstructed from the bending information provided by the bending matrix. In an embodiment, fiducial markers with known coordinates in both the endoscope's and the optical tracker's coordinate systems are used to obtain a match between the two coordinate systems.
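As an illustration of the reconstruction step described above, the following minimal Python sketch integrates per-segment orientation (tangent) measurements into a 3D scope centreline; the assumption of straight segments of fixed length, and the sample data, are illustrative choices rather than the disclosed bending-matrix formulation itself.

```python
# Illustrative sketch: integrating per-segment orientation measurements into
# 3D coordinates of the scope. Assumes each sensor reports a unit tangent
# direction and that the scope is straight between consecutive sensors.
import numpy as np

def reconstruct_centreline(tangents, segment_length_cm, origin=(0.0, 0.0, 0.0)):
    """Accumulate segment vectors to obtain 3D points along the scope."""
    points = [np.asarray(origin, dtype=float)]
    for t in tangents:
        t = np.asarray(t, dtype=float)
        t = t / np.linalg.norm(t)   # normalise, in case the input is not unit length
        points.append(points[-1] + segment_length_cm * t)
    return np.vstack(points)

# Hypothetical example: three 10 cm segments bending gradually from +z toward +x.
tangents = [(0.0, 0.0, 1.0), (0.3, 0.0, 0.95), (0.6, 0.0, 0.8)]
print(reconstruct_centreline(tangents, segment_length_cm=10.0))
```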


In an embodiment, a 3D scope model is projected onto a two-dimensional (2D) plane by an optical tracker/camera. The optical tracker captures fiducial markers with respect to the camera's coordinates and matches the fiducial markers up to a model of a human organ being endoscopically scanned, such as a colon, in accordance with an embodiment of the present specification. An object detection algorithm is used to detect the captured fiducial markers within the frame/model of the patient's organ. In an embodiment, detected fiducial markers are represented with a circle or demarcations (in this case, orange) around the fiducial markers.


The transformation method for transforming the fiducial markers from the endoscope's coordinates to the optical tracker's coordinates is dependent on the optical tracker's characteristics. If the optical tracker is composed of two stereo-calibrated cameras, then these cameras compute the 3D location of the fiducials in their own coordinate system by triangulation. The transformation of these 3D points with the known 3D structure of the fiducial markers can be computed by any point-cloud registration algorithm, such as Horn's algorithm, which computes a transformation that minimizes the average distance between corresponding points. However, if the optical tracker is composed of a single camera, then the transformation can be computed by any algorithm that solves the PnP problem, such as EPnP, DLT, and POSIT. As is known in the art, the PnP problem aims to determine the location of a camera based on a comparison between a set of 3D points with known locations in some arbitrary coordinate system and their 2D projections by the camera.
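The following minimal Python sketch illustrates the two cases just described. The SVD-based rigid registration shown is in the spirit of Horn's closed-form method (it minimizes the mean squared distance between corresponding points), and the single-camera case simply delegates to OpenCV's PnP solver; the function and variable names are illustrative assumptions.

```python
# Illustrative sketch of the fiducial transformation step.
import numpy as np
import cv2

def register_point_clouds(scope_pts, tracker_pts):
    """Rigid transform (R, t) mapping scope-frame fiducials onto tracker-frame fiducials."""
    scope_pts = np.asarray(scope_pts, dtype=float)
    tracker_pts = np.asarray(tracker_pts, dtype=float)
    mu_s, mu_t = scope_pts.mean(axis=0), tracker_pts.mean(axis=0)
    H = (scope_pts - mu_s).T @ (tracker_pts - mu_t)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                          # correct an improper rotation
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = mu_t - R @ mu_s
    return R, t

def pose_from_single_camera(fiducials_3d, fiducials_2d, camera_matrix, dist_coeffs):
    """Single-camera case: recover the pose by solving the PnP problem."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(fiducials_3d, dtype=np.float32),
        np.asarray(fiducials_2d, dtype=np.float32),
        camera_matrix, dist_coeffs)
    return ok, rvec, tvec
```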


In an embodiment, an object detection algorithm such as the ‘Hough Transform’ is used to detect the captured fiducial markers within a frame/model of the patient's internal organ(s). In some embodiments, various other object detection algorithms may also be used. The Hough transform is an algorithm that is commonly used to detect parametric shapes in images. For example, it can be used to detect spheres, which project as circles in an image. The algorithm first computes, for each parametric combination describing the shape, the number of pixels consistent with that combination, and then applies a threshold to the resulting accumulator matrix to select detections.
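As an illustration of this detection step, the following Python sketch uses OpenCV's Hough circle transform to find the circular projections of spherical fiducials in a tracker image; the file name and parameter values are hypothetical and would need tuning for an actual system.

```python
# Illustrative sketch: detecting circular fiducial projections with the Hough transform.
import cv2
import numpy as np

frame = cv2.imread("tracker_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical tracker image
if frame is not None:
    frame = cv2.medianBlur(frame, 5)                           # suppress sensor noise
    circles = cv2.HoughCircles(
        frame, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
        param1=100, param2=30, minRadius=5, maxRadius=40)
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            # Each detection is a candidate fiducial centre in camera pixel coordinates.
            print(f"fiducial candidate at ({x}, {y}), radius {r} px")
```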


In other embodiments, data obtained from sensors such as accelerometers, gyroscopes, magnetometers (i.e., electronic compasses) and stripes placed along the insertion tube of the endoscope, which measure the bending and twisting of the insertion tube by electro-optic or mechanical methods, may be used to obtain the distance travelled by the endoscope inside the patient's lumen. In an embodiment, the distance travelled by the endoscope inside the patient's lumen is obtained by using data from sensors such as inductive sensors, capacitive sensors, capacitive displacement sensors, photoelectric sensors, magnetic sensors, depth sensors, infrared sensors and touch sensors placed along the elongated shaft or insertion tube of the endoscope. A unique identifier of each of these sensors provides information about the particular position (location with respect to a distal tip of the insertion tube), or distance, occupied by the sensor, thereby providing an orientation of the scope within the patient's organ.


Image Projection


In embodiments, the orientation of the endoscope determined by using the fiducial markers and the sensors is translated to the camera's coordinate system, scaled based on the patient position and size and the relative camera position, and projected onto the patient.
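A minimal Python sketch of this mapping, assuming the rotation R and translation t have already been obtained from the registration step and that a single scalar scale factor is sufficient, is given below; the sample values are illustrative only.

```python
# Illustrative sketch: mapping scope-frame points into the camera coordinate
# system with a patient-dependent scale before display or projection.
import numpy as np

def map_scope_to_camera(scope_points, R, t, patient_scale=1.0):
    """Apply scale, rotation and translation to scope-frame points."""
    pts = np.asarray(scope_points, dtype=float)
    return (patient_scale * (R @ pts.T)).T + t

# Hypothetical usage with an identity pose and a 1.2x patient scale.
R, t = np.eye(3), np.zeros(3)
print(map_scope_to_camera([[0.0, 0.0, 10.0]], R, t, patient_scale=1.2))
```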


In an embodiment, the captured fiducial markers are projected on an image of the patient's internal organ. A 3D image of the internal organ displaying the captured fiducial markers is obtained by using computer software. In another embodiment, the captured fiducial markers corresponding to the endoscope are projected on an image of the endoscope. A 3D image of the endoscope displaying the captured fiducial markers is obtained by using computer software. In yet another embodiment, the image of the internal organ along with the fiducial markers and the image of the endoscope with the fiducial markers are displayed together to enable an operating physician to clearly determine the position of the endoscope within the organ.


In an embodiment, a projector connected to the control unit of the endoscope is used to project a virtual model of the patient's organ being scanned, showing a position of the endoscope therein, directly on the patient's body. In an embodiment, the projector is calibrated to convey its position in relation to the positions of the endoscope and the patient in the coordinate system of the endoscope and the patient. Calibration also provides the internal parameters of the projector, such as the direction of rays extending from the projector toward the patient. By using the internal parameters, the exact illumination pattern of the projector may be computed, which in turn enables real time projection of a virtual (holographic) model of the endoscope's location within the patient's body on top of the patient.
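By way of illustration, the following Python sketch computes where a calibrated projector must illuminate in order to mark a given 3D point on the patient, using a standard pinhole model with intrinsics K and extrinsics (R_p, t_p); the numeric values are assumptions for demonstration and are not calibration results.

```python
# Illustrative sketch: computing projector pixel coordinates for 3D points,
# i.e. the illumination pattern needed to overlay the virtual model on the patient.
import numpy as np

def project_to_projector_pixels(points_3d, K, R_p, t_p):
    """Project world points into projector pixel coordinates (pinhole model)."""
    pts_proj = (R_p @ np.asarray(points_3d, dtype=float).T).T + t_p
    pix = (K @ pts_proj.T).T
    return pix[:, :2] / pix[:, 2:3]      # perspective divide

K = np.array([[1400.0, 0.0, 640.0],      # assumed projector intrinsics
              [0.0, 1400.0, 360.0],
              [0.0, 0.0, 1.0]])
R_p, t_p = np.eye(3), np.zeros(3)        # assumed projector pose
print(project_to_projector_pixels([[0.1, 0.0, 1.0]], K, R_p, t_p))
```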



FIG. 1B schematically depicts a layout of an endoscope system 130 and an associated interface unit 132 deployed in an operating room, in which the optical trackers and fiducial markers as described above may be employed. A patient 134 is supported on a bed 136 while a physician 138 is using an endoscope portion 140 of endoscope system 130 in an endoscopic procedure. Endoscope 140 is connected to a main controller 142 by a utility cable 144. A plurality of fiducial markers 146 are provided on the endoscope 140. An optical tracker, or external camera 148, is placed above the patient 134 so that the camera 148 captures the endoscope 140, the fiducial markers 146 and the patient's body 134 in the same image 150 displayed on an external display unit 152. In another embodiment, more than one optical tracker, or external camera 148, may be positioned above patient 134.


Thus, the optical tracker, or in this case the external camera, is placed above a patient undergoing an endoscopic procedure, so that the camera captures both the endoscope and the patient's body in the same image. Fiducial markers are placed on at least a portion of the endoscope. Hence, the external camera produces an image that displays the fiducial markers on the (partially unseen) endoscope along with the patient's body. Next, an object detection algorithm such as the ‘Hough Transform’ is used to detect the captured fiducial markers (described in detail with respect to FIG. 3) within a frame/model of the patient's internal organ. In various embodiments, after determining the external camera's parameters (by calibration), a virtual model of the patient's organ (e.g. colon) is rendered on top of the patient image taken by the external camera. In an embodiment, the virtual model of the patient's organ showing the position of the endoscope is cast directly on the patient's body by using a calibrated projector.


Endoscope 140 provides one or more endoscopic views (which may be simultaneous) using one, two, three or more cameras housed in the tip of endoscope 140. Main controller 142 is connected to at least one display screen 154 (not shown) or a plurality of display screens, for example three display screens 154a, 154b, and 154c, wherein each display screen is configured to display a corresponding view of the three endoscopic views provided by endoscope system 130. Display screen(s) 154 are positioned facing physician 138 and possibly elevated so that physician 138 may conduct the endoscopic procedure by looking at the screen displays and having an undisturbed line of sight thereto. In an embodiment, the scope location and pose are determined relative to an exterior coordinate system. In this embodiment, the fiducial markers are attached to the exterior coordinate tracking system.



FIG. 1C illustrates an electromagnetic field generator with fiducial markers placed in proximity to the patient undergoing an endoscopy procedure as shown in FIG. 1B. An electromagnetic field generator with fiducial markers 156 is placed in close proximity to the patient 134, for example on/under patient bed 136 or on a stand (not shown) placed in proximity to patient 134.



FIG. 1D illustrates the position of the endoscope within the body of the patient undergoing an endoscopic procedure as shown in FIG. 1B being cast directly on the patient's body by using a projector, in accordance with an embodiment of the present specification. As shown, a projector 158 connected to the controller 142 is used to project a virtual model 160 of the patient's colon, showing a position of the endoscope therein, directly on the patient's body 134. In an embodiment, the projector 158 is calibrated, whereby calibrating the projector 158 conveys its position in relation to the positions of the endoscope 140 and the patient 134 in the coordinate system of the endoscope and the patient. Calibration also provides the internal parameters of the projector 158, such as the direction of rays extending from the projector toward the patient 134. By using the internal parameters, the exact illumination pattern of the projector 158 may be computed, which in turn enables real time projection of a virtual (holographic) model 160 of the endoscope's 140 location within the patient's body on top of the patient 134.


In accordance with another embodiment, the endoscope comprises sensors integrated along its insertion tube to provide real-time information on the distance being travelled by the endoscope inside the patient's lumen. In one embodiment, as shown in FIG. 2, a plurality of sensors 2015 are placed along the elongated shaft or insertion tube 2306 of the endoscope. Further, each sensor has a unique identifier, code, signature, or other identification according to its location (such as distance from the distal tip) along the insertion tube 2306. Thus for example, and not limited to such example, a sensor would be placed at a distance of 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, or 20 centimeters, or any increment therein, from the distal end of the tube 2306. The next sensor may be placed at a similar, or different, distance and would have an identifier that is different than the identifier programmed into the first sensor. In another embodiment, each identifier is not only unique to the sensor but also indicative of the particular position, or distance, occupied by the sensor. Thus, in one embodiment, a plurality of sensors are placed at 10 centimeter increments along the length of the insertion tube 2306 where each sensor 2015 has a different identifier and where each identifier is indicative of the distance increment occupied by the sensor.


Additionally, a depth sensor is placed at the entrance of the body where the endoscope is inserted and is in communication with the main control unit that is used with the endoscope. As a non-limiting example, consider an endoscopic procedure being performed on a patient's colon 2022. The depth sensor 2020 is placed outside the body 2024, close to the rectum 2026, which is the entry point for an endoscope into the colon 2022. In operation, the depth sensor 2020 detects alignment to the sensor 2016 closest to the entrance site, outside the body. In one embodiment, each sensor 2015, 2016 is pre-programmed to be read according to its location, such that the 10 cm sensor would transmit a different output than the 20 cm sensor. In one embodiment, the output of the depth sensor 2020 is conveyed to the controller or main control unit, which records and provides a display of the distance travelled by the distal end of the scope.



FIG. 3 is a flowchart illustrating a first method of tracking the position of an endoscope within a patient's internal organ during an endoscopic procedure, in accordance with an embodiment of the present specification. At step 302, a reference position of the endoscope within a patient's body is determined either in the scope's coordinate system (internal) or an external coordinate system.


In an embodiment, to determine a position of the scope within a patient's body using the scope's coordinate system, fiducial markers are placed on the handle of the endoscope.


In another embodiment, the scope's position within a patient's body may be determined by an external coordinate system, such as by using a bending matrix, electromagnetic tracking, or by using one or more sensors as described above. In such an embodiment, fiducial markers are placed on the external reference coordinate tracking system.


At step 304, fiducial markers, either located on the scope's handle or the external coordinate tracking system, are captured by an optical tracker. In an embodiment, the optical tracker is a camera that captures fiducial markers with respect to the camera's coordinates. In the case where the fiducial markers are rigidly attached to the endoscope, the markers can also be described with respect to the endoscope's coordinates.



FIG. 4A illustrates a three-dimensional (3D) model of a scope with fiducial markers, in accordance with an embodiment of the present specification. FIG. 4B illustrates the 3D scope model shown in FIG. 4A projected onto a two-dimensional (2D) plane by an optical tracker/camera. As shown, fiducial markers 402 are captured in an image by an optical tracker (not shown in the figure) with respect to an endoscope 404. In an embodiment, the optical tracker is a camera that captures fiducial markers 402 with respect to the camera's coordinates. Hence, FIG. 4B illustrates the fiducial markers shown in FIG. 4A matching up to a model of a human colon, in accordance with an embodiment of the present specification. An object detection algorithm is used to detect the captured fiducial markers 402 within the frame/model 406 of the patient's colon. The circles or demarcations (in this case, orange) around the fiducial markers 402 are indications that the fiducial markers 402 are detected (and marked) by a fiducial detection algorithm as explained in step 306 of FIG. 3 below.


At step 306, a transformation of the fiducial markers from the endoscope's coordinates to the optical tracker's coordinates is obtained. The transformation method is dependent on the optical tracker's characteristics. If the optical tracker is composed of two stereo-calibrated cameras, then these cameras compute the 3D location of the fiducials in their own coordinate system by triangulation. However, if the optical tracker is composed of a single camera, then the transformation can be computed by any algorithm that solves the PnP problem, such as EPnP, DLT, and POSIT.


At step 308, the captured fiducial markers are projected on an image of the patient's internal organ. A 3D image of the internal organ displaying the captured fiducial markers may be obtained by using computer software. At step 310, the captured fiducial markers corresponding to the endoscope are projected on an image of the endoscope. A 3D image of the endoscope displaying the captured fiducial markers may be obtained by using computer software as illustrated in FIG. 4C. FIG. 4C illustrates the fiducial markers shown in FIG. 4A projected on top of the image captured by the same optical tracker/camera, in accordance with an embodiment of the present specification. The fiducial markers 402 shown in FIG. 4A are projected on an image of the endoscope 404, as illustrated. In an embodiment, the accuracy of the position of the camera (not shown in the figure) may be estimated by computation of the camera's re-projection error.
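As an illustration of the re-projection error mentioned above, the following Python sketch computes the mean pixel distance between the detected fiducial centres and the fiducial model re-projected with the estimated pose; the inputs are assumed to come from the earlier detection and pose-estimation steps.

```python
# Illustrative sketch: mean re-projection error as a pose-accuracy estimate.
import numpy as np
import cv2

def mean_reprojection_error(fiducials_3d, detected_2d, rvec, tvec, K, dist_coeffs):
    projected, _ = cv2.projectPoints(
        np.asarray(fiducials_3d, dtype=np.float32), rvec, tvec, K, dist_coeffs)
    projected = projected.reshape(-1, 2)
    detected = np.asarray(detected_2d, dtype=np.float32).reshape(-1, 2)
    # Average Euclidean distance, in pixels, between measurement and re-projection.
    return float(np.mean(np.linalg.norm(projected - detected, axis=1)))
```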


At step 312, the image of the internal organ along with the fiducial markers and the image of the endoscope with the fiducial markers are displayed together to enable an operating physician to clearly determine the position of the endoscope within the organ. In an embodiment, after determining the external camera's parameters (by calibration) a virtual model of the patient's colon showing the position of the endoscope is augmented to the patient image taken by the external camera.


In various embodiments, the patient image may be augmented with the virtual model of the patient's colon showing the position of the endoscope by using a computer monitor display. In other embodiments, display methods such as a see-through glass may be used. In another embodiment, the augmented image may also be displayed on a viewer's retina.


The above examples are merely illustrative of the many applications of the system of present specification. Although only a few embodiments of the present specification have been described herein, it should be understood that the present specification might be embodied in many other specific forms without departing from the spirit or scope of the specification. Therefore, the present examples and embodiments are to be considered as illustrative and not restrictive, and the specification may be modified within the scope of the appended claims.

Claims
  • 1. An endoscope system having an endoscope, wherein the endoscope includes an endoscope handle and an endoscope body adapted to be inserted into a patient, the system comprising: a plurality of orientation markers positioned on the endoscope handle; a plurality of first sensors positioned at different locations longitudinally along the endoscope body, wherein each of the plurality of first sensors is adapted to generate first orientation data; one or more cameras positioned external to the patient and adapted to detect one or more of the plurality of orientation markers and generate second orientation data; a controller adapted to: (i) receive the first orientation data and the second orientation data, and (ii) generate, using the first orientation data and the second orientation data, a virtual model of the endoscope body in a coordinate system of the endoscope and the patient showing a position of the endoscope within the patient; and a projector adapted to: (i) receive data indicative of the virtual model from the controller, and (ii) project the virtual model onto the patient to show the position of the endoscope body relative to the patient.
  • 2. The endoscope system of claim 1, wherein the first sensors are pressure sensors.
  • 3. The endoscope system of claim 1, further comprising at least one electromagnetic coil wound around one or more portions of the endoscope body, wherein the electromagnetic coil is configured to emit an electromagnetic signal.
  • 4. The endoscope system of claim 3, further comprising an electromagnetic tracking device external to the endoscope, wherein the electromagnetic tracking device is configured to receive the electromagnetic signal from the electromagnetic coil to track a position of the endoscope.
  • 5. The endoscope system of claim 1, wherein the one or more cameras include two stereo-calibrated cameras adapted to generate, by triangulation, at least a portion of the second orientation data comprising three-dimensional location of fiducials in a coordinate system of the camera.
  • 6. The endoscope system of claim 1, wherein the virtual model is a holographic virtual model.
  • 7. The endoscope system of claim 1, further comprising a plurality of additional sensors including one or more of accelerometers, gyroscopes, magnetometers, and stripes that measure the bending and twisting of an insertion tube of the endoscope by one or more of electro-optic and mechanical methods.
  • 8. An endoscope system having an endoscope, wherein the endoscope includes an endoscope handle and an endoscope body adapted to be inserted into a patient, the system comprising: a plurality of orientation markers positioned on the endoscope handle; a plurality of pressure sensors positioned at different locations longitudinally along the endoscope body, wherein each of the plurality of pressure sensors is adapted to generate first orientation data and first pressure data; one or more cameras positioned external to the patient and adapted to: (i) detect one or more of the plurality of orientation markers and (ii) generate second orientation data; a controller adapted to: (i) receive the first orientation data, the first pressure data, and the second orientation data, and (ii) generate, using the first orientation data and the second orientation data, a virtual model of the endoscope body in a coordinate system of the endoscope body and the patient showing a position of the endoscope body within the patient; a projector adapted to: (i) receive data indicative of the virtual model from the controller, and (ii) project the virtual model of the endoscope body in a coordinate system of the endoscope body and the patient showing a position of the endoscope body within the patient.
  • 9. The endoscope system of claim 8, wherein the virtual model is a holographic model.
  • 10. The endoscope system of claim 8, wherein the projector is adapted to project the virtual model on the patient, and wherein the one or more cameras include two stereo-calibrated cameras adapted to generate, by triangulation, at least a portion of the second orientation data comprising three-dimensional location of fiducials in a coordinate system of the camera.
  • 11. The endoscope system of claim 8, further comprising: at least one electromagnetic coil wound around one or more portions of the endoscope body, wherein the electromagnetic coil is configured to emit an electromagnetic signal, and an electromagnetic tracking device external to the endoscope, wherein the electromagnetic tracking device is configured to receive the electromagnetic signal from the electromagnetic coil to track a position of the endoscope.
  • 12. The endoscope system of claim 8, further comprising a plurality of additional sensors including one or more of accelerometers, gyroscopes, magnetometers, and stripes that measure the bending and twisting of an insertion tube of the endoscope by one or more of electro-optic and mechanical methods.
  • 13. An endoscope system having an endoscope, wherein the endoscope includes an endoscope handle and an endoscope body adapted to be inserted into a patient, the system comprising: a plurality of orientation markers positioned on the endoscope handle; a plurality of first sensors positioned at different locations longitudinally along the endoscope body, wherein each of the plurality of first sensors is adapted to generate first orientation data; one or more cameras positioned external to the patient and adapted to: (i) detect one or more of the plurality of orientation markers and (ii) generate second orientation data; a controller adapted to: (i) receive the first orientation data and the second orientation data, and (ii) generate, using the first orientation data and the second orientation data, a virtual model of a position of the endoscope body in a coordinate system of the endoscope body and the patient showing a position of the endoscope body within the patient; and a projector adapted to: (i) receive data indicative of the virtual model from the controller, and (ii) project the virtual model of the endoscope body in a coordinate system of the endoscope body and the patient showing a position of the endoscope body within the patient.
  • 14. The endoscope system of claim 13, further comprising a plurality of additional sensors including one or more of accelerometers, gyroscopes, magnetometers, and stripes that measure the bending and twisting of an insertion tube of the endoscope by one or more of electro-optic and mechanical methods.
  • 15. The endoscope system of claim 13, further comprising: at least one electromagnetic coil wound around one or more portions of the endoscope body, wherein the electromagnetic coil is configured to emit an electromagnetic signal; and an electromagnetic tracking device external to the endoscope, wherein the electromagnetic tracking device is configured to receive the electromagnetic signal from the electromagnetic coil to track a position of the endoscope.
  • 16. The endoscope system of claim 13, wherein the projector is adapted to project the virtual model on the patient.
  • 17. The endoscope system of claim 13, wherein the virtual model is a holographic model.
  • 18. The endoscope system of claim 13, wherein the controller is adapted to generate an image of the patient positioned on the patient bed.
  • 19. The endoscope system of claim 13, wherein the virtual model includes a portion of the patient's colon.
  • 20. The endoscope system of claim 13, further comprising at least one orientation marker positioned on a patient bed.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. Nonprovisional patent application Ser. No. 15/335,249 filed Oct. 26, 2016, which claims the benefit of U.S. Provisional Application No. 62/247,232, filed on Oct. 28, 2015. The present specification relates to United States Patent Application Publication Number US 2015/0099925 A1, entitled “Endoscope With Integrated Sensors”, and published on Apr. 9, 2015; and to United States Patent Application Publication Number US 2015/0313445 A1, entitled “System And Method Of Scanning A Body Cavity Using a Multiple Viewing Elements Endoscope”, and published on Nov. 5, 2015. The above-mentioned applications are herein incorporated by reference in their entirety.

US Referenced Citations (421)
Number Name Date Kind
3639714 Fujimoto Feb 1972 A
3955064 Demetrio May 1976 A
4027697 Bonney Jun 1977 A
4037588 Heckele Jul 1977 A
4084401 Belardi Apr 1978 A
4402313 Yabe Sep 1983 A
4461282 Ouchi Jul 1984 A
4494549 Namba Jan 1985 A
4532918 Wheeler Aug 1985 A
4588294 Siegmund May 1986 A
4641635 Yabe Feb 1987 A
4727859 Lia Mar 1988 A
4764001 Yokota Aug 1988 A
4801792 Yamasita Jan 1989 A
4825850 Opie May 1989 A
4877314 Kanamori Oct 1989 A
4902115 Takahashi Feb 1990 A
4976522 Igarashi Dec 1990 A
4984878 Miyano Jan 1991 A
5007406 Takahashi Apr 1991 A
5014685 Takahashi May 1991 A
5193525 Silverstein Mar 1993 A
5224929 Remiszewski Jul 1993 A
5296971 Mori Mar 1994 A
5359456 Kikuchi Oct 1994 A
5395329 Fleischhacker Mar 1995 A
5447148 Oneda Sep 1995 A
5460167 Yabe Oct 1995 A
5464007 Krauter Nov 1995 A
5475420 Buchin Dec 1995 A
5489256 Adair Feb 1996 A
5518501 Oneda May 1996 A
5518502 Kaplan May 1996 A
5547455 McKenna Aug 1996 A
5547467 Tsuyuki Aug 1996 A
5571114 Devanaboyina Nov 1996 A
5575755 Krauter Nov 1996 A
5587839 Miyano Dec 1996 A
5630782 Adair May 1997 A
5630798 Beiser May 1997 A
5651043 Tsuyuki Jul 1997 A
5662588 Iida Sep 1997 A
5674182 Suzuki Oct 1997 A
5676673 Ferre Oct 1997 A
5685821 Pike Nov 1997 A
5685823 Ito Nov 1997 A
5702347 Yabe Dec 1997 A
5707344 Nakazawa Jan 1998 A
5725474 Yasui Mar 1998 A
5725476 Yasui Mar 1998 A
5725477 Yasui Mar 1998 A
5725478 Saad Mar 1998 A
5777797 Miyano Jul 1998 A
5782751 Matsuno Jul 1998 A
5800341 McKenna Sep 1998 A
5810715 Moriyama Sep 1998 A
5810717 Maeda Sep 1998 A
5810770 Chin Sep 1998 A
5830121 Enomoto Nov 1998 A
5836894 Sarvazyan Nov 1998 A
5860913 Yamaya Jan 1999 A
5870234 Ebbesmeier nee Schitthof Feb 1999 A
5916148 Tsuyuki Jun 1999 A
5940126 Kimura Aug 1999 A
6058109 Lechleider May 2000 A
6095970 Hidaka Aug 2000 A
6095971 Takahashi Aug 2000 A
6117068 Gourley Sep 2000 A
6181481 Yamamoto Jan 2001 B1
6196967 Lim Mar 2001 B1
6261226 McKenna Jul 2001 B1
6277064 Yoon Aug 2001 B1
6359674 Horiuchi Mar 2002 B1
6375610 Verschuur Apr 2002 B2
6402738 Ouchi Jun 2002 B1
6419626 Yoon Jul 2002 B1
6476851 Nakamura Nov 2002 B1
6520908 Ikeda Feb 2003 B1
6636254 Onishi Oct 2003 B1
6638214 Akiba Oct 2003 B2
6673012 Fujii Jan 2004 B2
6690337 Mayer, III Feb 2004 B1
6712760 Sano Mar 2004 B2
6832984 Stelzer Dec 2004 B2
6888119 Iizuka May 2005 B2
6895268 Rahn et al. May 2005 B1
6997871 Sonnenschein Feb 2006 B2
7154378 Ertas Dec 2006 B1
7435218 Krattiger Oct 2008 B2
7621869 Ratnakar Nov 2009 B2
7630148 Yang Dec 2009 B1
7701650 Lin Apr 2010 B2
7713246 Shia May 2010 B2
7746572 Asami Jun 2010 B2
7813047 Wang Oct 2010 B2
7828725 Maruyama Nov 2010 B2
7918788 Lin Apr 2011 B2
7927272 Bayer Apr 2011 B2
7967745 Gilad Jun 2011 B2
7976462 Wright Jul 2011 B2
8064666 Bayer Nov 2011 B2
8182422 Bayer May 2012 B2
8197399 Bayer Jun 2012 B2
8235887 Bayer Aug 2012 B2
8262558 Sato Sep 2012 B2
8287446 Bayer Oct 2012 B2
8289381 Bayer Oct 2012 B2
8300325 Katahira Oct 2012 B2
8310530 Bayer Nov 2012 B2
8353860 Boulais Jan 2013 B2
8447132 Galil May 2013 B1
8449457 Aizenfeld May 2013 B2
8460182 Ouyang Jun 2013 B2
8504136 Sun Aug 2013 B1
8585584 Ratnakar Nov 2013 B2
8587645 Bayer Nov 2013 B2
8672836 Higgins Mar 2014 B2
8715168 Ratnakar May 2014 B2
8797392 Bayer Aug 2014 B2
8872906 Bayer Oct 2014 B2
8926502 Levy Jan 2015 B2
9044185 Bayer Jun 2015 B2
9101266 Levi Aug 2015 B2
9101268 Levy Aug 2015 B2
9101287 Levy Aug 2015 B2
9144664 Jacobsen Sep 2015 B2
9289110 Woolford Mar 2016 B2
9314147 Levy Apr 2016 B2
9320419 Kirma Apr 2016 B2
20010036322 Bloomfield Nov 2001 A1
20020017515 Obata Feb 2002 A1
20020047897 Sugimoto Apr 2002 A1
20020077533 Bieger et al. Jun 2002 A1
20020077543 Grzeszczuk Jun 2002 A1
20020087047 Remijan Jul 2002 A1
20020109771 Ledbetter Aug 2002 A1
20020109774 Meron Aug 2002 A1
20020161279 Luloh Oct 2002 A1
20020161281 Jaffe Oct 2002 A1
20020172498 Esenyan Nov 2002 A1
20020183591 Matsuura Dec 2002 A1
20030030918 Murayama Feb 2003 A1
20030063398 Abe Apr 2003 A1
20030076411 Iida Apr 2003 A1
20030083552 Remijan May 2003 A1
20030128893 Castorina Jul 2003 A1
20030139650 Homma Jul 2003 A1
20030153897 Russo Aug 2003 A1
20030158503 Matsumoto Aug 2003 A1
20030163029 Sonnenschein Aug 2003 A1
20040015054 Hino Jan 2004 A1
20040046865 Ueno Mar 2004 A1
20040061780 Huffman Apr 2004 A1
20040064019 Chang Apr 2004 A1
20040077927 Ouchi Apr 2004 A1
20040106850 Yamaya Jun 2004 A1
20040133072 Kennedy Jul 2004 A1
20040138532 Glukhovsky Jul 2004 A1
20040138556 Cosman Jul 2004 A1
20040158129 Okada Aug 2004 A1
20040160682 Miyano Aug 2004 A1
20040176663 Whitin Sep 2004 A1
20040176683 Whitin et al. Sep 2004 A1
20040190159 Hasegawa Sep 2004 A1
20040249247 Iddan Dec 2004 A1
20040260151 Akiba Dec 2004 A1
20050018042 Rovegno Jan 2005 A1
20050020876 Shioda Jan 2005 A1
20050038317 Ratnakar Feb 2005 A1
20050047134 Mueller Mar 2005 A1
20050057687 Irani Mar 2005 A1
20050090709 Okada Apr 2005 A1
20050096501 Stelzer May 2005 A1
20050119527 Banik Jun 2005 A1
20050124858 Matsuzawa Jun 2005 A1
20050222499 Banik Oct 2005 A1
20050234296 Saadat Oct 2005 A1
20050234347 Yamataka Oct 2005 A1
20050251127 Broach Nov 2005 A1
20050272975 McWeeney Dec 2005 A1
20050277808 Sonnenschein Dec 2005 A1
20050283048 Gill Dec 2005 A1
20060004257 Gilad Jan 2006 A1
20060047184 Banik Mar 2006 A1
20060063976 Aizenfeld Mar 2006 A1
20060069314 Farr Mar 2006 A1
20060111613 Boutillette May 2006 A1
20060114986 Knapp Jun 2006 A1
20060149129 Watts Jul 2006 A1
20060169845 Maahs Aug 2006 A1
20060171693 Todd Aug 2006 A1
20060173245 Todd Aug 2006 A1
20060183975 Saadat Aug 2006 A1
20060184037 Ince Aug 2006 A1
20060215406 Thrailkill Sep 2006 A1
20060235306 Cotter Oct 2006 A1
20060252994 Ratnakar Nov 2006 A1
20060264704 Fujimori Nov 2006 A1
20060293556 Garner Dec 2006 A1
20070015989 Desai Jan 2007 A1
20070162095 Kimmel Jan 2007 A1
20070049803 Moriyama Mar 2007 A1
20070055100 Kato Mar 2007 A1
20070055128 Glossop Mar 2007 A1
20070079029 Carlson Apr 2007 A1
20070088193 Omori Apr 2007 A1
20070100206 Lin May 2007 A1
20070106119 Hirata May 2007 A1
20070118015 Wendlandt May 2007 A1
20070142711 Bayer Jun 2007 A1
20070167681 Gill Jul 2007 A1
20070177008 Bayer Aug 2007 A1
20070177009 Bayer Aug 2007 A1
20070185384 Bayer Aug 2007 A1
20070188427 Lys Aug 2007 A1
20070197875 Osaka Aug 2007 A1
20070203396 McCutcheon Aug 2007 A1
20070206945 DeLorme Sep 2007 A1
20070213591 Aizenfeld Sep 2007 A1
20070225553 Shahidi Sep 2007 A1
20070229656 Khait Oct 2007 A1
20070241895 Morgan Oct 2007 A1
20070244353 Larsen Oct 2007 A1
20070244354 Bayer Oct 2007 A1
20070247867 Hunter Oct 2007 A1
20070249907 Boulais Oct 2007 A1
20070249932 Shahinian Oct 2007 A1
20070265492 Sonnenschein Nov 2007 A1
20070270642 Bayer Nov 2007 A1
20070279486 Bayer Dec 2007 A1
20070286764 Noguchi Dec 2007 A1
20070293720 Bayer Dec 2007 A1
20080009673 Khachi Jan 2008 A1
20080021274 Bayer Jan 2008 A1
20080025413 Apostolopoulos Jan 2008 A1
20080036864 McCubbrey Feb 2008 A1
20080045797 Yasushi Feb 2008 A1
20080058601 Fujimori Mar 2008 A1
20080071141 Gattani et al. Mar 2008 A1
20080071290 Larkin Mar 2008 A1
20080091065 Oshima Apr 2008 A1
20080130108 Bayer Jun 2008 A1
20080151070 Shiozawa Jun 2008 A1
20080161646 Gomez Jul 2008 A1
20080163652 Shatskin Jul 2008 A1
20080167529 Otawara Jul 2008 A1
20080177139 Courtney Jul 2008 A1
20080183034 Henkin Jul 2008 A1
20080183043 Spinnler Jul 2008 A1
20080221388 Courtney Jul 2008 A1
20080243142 Gildenberg Oct 2008 A1
20080246771 O'Neal Oct 2008 A1
20080253686 Bayer Oct 2008 A1
20080262312 Carroll Oct 2008 A1
20080269606 Matsumura Oct 2008 A1
20080275298 Ratnakar Nov 2008 A1
20080303898 Nishimura Dec 2008 A1
20090005643 Smith Jan 2009 A1
20090023998 Ratnakar Jan 2009 A1
20090030275 Nicolaou Jan 2009 A1
20090054790 Czaniera Feb 2009 A1
20090062615 Yamaya Mar 2009 A1
20090076327 Ohki Mar 2009 A1
20090082624 Joko Mar 2009 A1
20090086017 Miyano Apr 2009 A1
20090135245 Luo May 2009 A1
20090137875 Kitagawa May 2009 A1
20090143647 Banju Jun 2009 A1
20090147076 Ertas Jun 2009 A1
20090182917 Kim Jul 2009 A1
20090213211 Bayer Aug 2009 A1
20090216084 Yamane Aug 2009 A1
20090225159 Schneider Sep 2009 A1
20090231419 Bayer Sep 2009 A1
20090234183 Abe Sep 2009 A1
20090253966 Ichimura Oct 2009 A1
20090287188 Golden Nov 2009 A1
20090287192 Vivenzio Nov 2009 A1
20090299144 Shigemori Dec 2009 A1
20100010309 Kitagawa Jan 2010 A1
20100016673 Bandy Jan 2010 A1
20100053312 Watanabe Mar 2010 A1
20100069713 Endo Mar 2010 A1
20100073470 Takasaki Mar 2010 A1
20100073948 Stein Mar 2010 A1
20100076268 Takasugi Mar 2010 A1
20100123950 Fujiwara May 2010 A1
20100130822 Katayama May 2010 A1
20100141763 Itoh Jun 2010 A1
20100160729 Smith Jun 2010 A1
20100174144 Hsu Jul 2010 A1
20100231702 Tsujimura Sep 2010 A1
20100245653 Bodor Sep 2010 A1
20100249513 Tydlaska Sep 2010 A1
20100260322 Mizuyoshi Nov 2010 A1
20100296178 Genet Nov 2010 A1
20100326703 Gilad Dec 2010 A1
20110004058 Oneda Jan 2011 A1
20110004059 Arneson Jan 2011 A1
20110034769 Adair Feb 2011 A1
20110063427 Fengler Mar 2011 A1
20110069159 Soler et al. Mar 2011 A1
20110084835 Whitehouse Apr 2011 A1
20110105895 Kornblau May 2011 A1
20110140003 Beck Jun 2011 A1
20110160530 Ratnakar Jun 2011 A1
20110160535 Bayer Jun 2011 A1
20110169931 Pascal Jul 2011 A1
20110184243 Wright Jul 2011 A1
20110211267 Takato Sep 2011 A1
20110254937 Yoshino Oct 2011 A1
20110263938 Levy Oct 2011 A1
20110282144 Gettman Nov 2011 A1
20110292258 Adler Dec 2011 A1
20110301414 Hotto Dec 2011 A1
20120040305 Karazivan Feb 2012 A1
20120050606 Debevec Mar 2012 A1
20120053407 Levy Mar 2012 A1
20120057251 Takato Mar 2012 A1
20120065468 Levy Mar 2012 A1
20120076425 Brandt Mar 2012 A1
20120162402 Amano Jun 2012 A1
20120200683 Oshima Aug 2012 A1
20120209071 Bayer Aug 2012 A1
20120209289 Duque Aug 2012 A1
20120212630 Pryor Aug 2012 A1
20120220832 Nakade Aug 2012 A1
20120224026 Bayer Sep 2012 A1
20120229615 Kirma Sep 2012 A1
20120232340 Levy Sep 2012 A1
20120232343 Levy Sep 2012 A1
20120253121 Kitano Oct 2012 A1
20120277535 Hoshino Nov 2012 A1
20120281536 Gell Nov 2012 A1
20120289858 Ouyang Nov 2012 A1
20120300999 Bayer Nov 2012 A1
20130053646 Yamamoto Feb 2013 A1
20130057724 Miyahara Mar 2013 A1
20130060086 Talbert Mar 2013 A1
20130066297 Shtul Mar 2013 A1
20130077257 Tsai Mar 2013 A1
20130085329 Morrissette Apr 2013 A1
20130109916 Levy May 2013 A1
20130116506 Bayer May 2013 A1
20130131447 Benning May 2013 A1
20130137930 Menabde May 2013 A1
20130141557 Kawata Jun 2013 A1
20130150671 Levy Jun 2013 A1
20130158344 Taniguchi Jun 2013 A1
20130169843 Ono Jul 2013 A1
20130172670 Levy Jul 2013 A1
20130172676 Levy Jul 2013 A1
20130197309 Sakata Aug 2013 A1
20130197556 Shelton Aug 2013 A1
20130218024 Boctor Aug 2013 A1
20130222640 Baek Aug 2013 A1
20130253268 Okada Sep 2013 A1
20130257778 Rehe Oct 2013 A1
20130264465 Dai Oct 2013 A1
20130271588 Kirma Oct 2013 A1
20130274551 Kirma Oct 2013 A1
20130281925 Benscoter Oct 2013 A1
20130296649 Kirma Nov 2013 A1
20130303979 Stieglitz Nov 2013 A1
20130317295 Morse Nov 2013 A1
20140018624 Bayer Jan 2014 A1
20140031627 Jacobs Jan 2014 A1
20140046136 Bayer Feb 2014 A1
20140107418 Ratnakar Apr 2014 A1
20140148644 Levi May 2014 A1
20140184766 Amling Jul 2014 A1
20140194732 Nakaguchi Jul 2014 A1
20140212025 Thienphrapa Jul 2014 A1
20140213850 Levy Jul 2014 A1
20140225998 Dai Aug 2014 A1
20140243658 Breisacher et al. Aug 2014 A1
20140276207 Ouyang Sep 2014 A1
20140296628 Kirma Oct 2014 A1
20140296643 Levy Oct 2014 A1
20140296866 Salman Oct 2014 A1
20140298932 Okamoto Oct 2014 A1
20140309495 Kirma Oct 2014 A1
20140316198 Krivopisk Oct 2014 A1
20140316204 Ofir Oct 2014 A1
20140320617 Parks Oct 2014 A1
20140333742 Salman Nov 2014 A1
20140333743 Gilreath Nov 2014 A1
20140336459 Bayer Nov 2014 A1
20140343358 Hameed Nov 2014 A1
20140343361 Salman Nov 2014 A1
20140343489 Lang Nov 2014 A1
20140364691 Krivopisk Dec 2014 A1
20140364692 Salman Dec 2014 A1
20140364694 Avron Dec 2014 A1
20150005581 Salman Jan 2015 A1
20150045614 Krivopisk Feb 2015 A1
20150057500 Salman Feb 2015 A1
20150094538 Wieth Apr 2015 A1
20150099925 Davidson et al. Apr 2015 A1
20150099926 Davidson Apr 2015 A1
20150105618 Levy Apr 2015 A1
20150164308 Ratnakar Jun 2015 A1
20150182105 Salman Jul 2015 A1
20150196190 Levy Jul 2015 A1
20150201827 Sidar Jul 2015 A1
20150208900 Vidas Jul 2015 A1
20150208909 Davidson Jul 2015 A1
20150223676 Bayer Aug 2015 A1
20150230698 Cline Aug 2015 A1
20150305601 Levi Oct 2015 A1
20150313445 Davidson Nov 2015 A1
20150313450 Wieth Nov 2015 A1
20150313451 Salman Nov 2015 A1
20150320300 Gershov Nov 2015 A1
20150342446 Levy Dec 2015 A1
20150359415 Lang Dec 2015 A1
20150374206 Shimony Dec 2015 A1
20160015257 Levy Jan 2016 A1
20160015258 Levin Jan 2016 A1
20160058268 Salman Mar 2016 A1
20160191887 Casas Jun 2016 A1
Foreign Referenced Citations (135)
Number Date Country
2297986 Mar 1999 CA
2765559 Dec 2010 CA
2812097 Mar 2012 CA
2798716 Jun 2013 CA
2798729 Jun 2013 CA
103348470 Oct 2013 CN
103403605 Nov 2013 CN
103491854 Jan 2014 CN
103702604 Apr 2014 CN
103732120 Apr 2014 CN
104717916 Jun 2015 CN
105246393 Jan 2016 CN
105324065 Feb 2016 CN
105324066 Feb 2016 CN
105338875 Feb 2016 CN
105358042 Feb 2016 CN
105358043 Feb 2016 CN
105377106 Mar 2016 CN
105407788 Mar 2016 CN
202010016900 May 2011 DE
1690497 Aug 2006 EP
1835844 Sep 2007 EP
1968425 Sep 2008 EP
1986541 Nov 2008 EP
1988613 Nov 2008 EP
2023794 Feb 2009 EP
2023795 Feb 2009 EP
2190341 Aug 2010 EP
2211683 Aug 2010 EP
2457492 May 2012 EP
2457493 May 2012 EP
1988812 Nov 2012 EP
2520218 Nov 2012 EP
2550908 Jan 2013 EP
2604175 Jun 2013 EP
2618718 Jul 2013 EP
2635932 Sep 2013 EP
2648602 Oct 2013 EP
2649648 Oct 2013 EP
2672878 Dec 2013 EP
2736400 Jun 2014 EP
2744390 Jun 2014 EP
2442706 Nov 2014 EP
2865322 Apr 2015 EP
2908714 Aug 2015 EP
2979123 Feb 2016 EP
2994032 Mar 2016 EP
2994033 Mar 2016 EP
2994034 Mar 2016 EP
2996536 Mar 2016 EP
2996541 Mar 2016 EP
2996542 Mar 2016 EP
2996621 Mar 2016 EP
9991537 Mar 2016 EP
12196628 Mar 2015 GB
H1043129 Feb 1998 JP
H10239740 Sep 1998 JP
H11137512 May 1999 JP
200161861 Mar 2001 JP
2005253543 Sep 2005 JP
2006025888 Feb 2006 JP
2006068109 Mar 2006 JP
2010178766 Aug 2010 JP
2012135432 Jul 2012 JP
2013116277 Jun 2013 JP
2013123647 Jun 2013 JP
2013123648 Jun 2013 JP
2013208459 Oct 2013 JP
2013215582 Oct 2013 JP
2013230383 Nov 2013 JP
2013542467 Nov 2013 JP
2013544617 Dec 2013 JP
2014524303 Sep 2014 JP
2014524819 Sep 2014 JP
2015533300 Nov 2015 JP
2006073676 Jul 2006 WO
2006073725 Jul 2006 WO
2007025081 Mar 2007 WO
2007070644 Jun 2007 WO
2007092533 Aug 2007 WO
2007092636 Aug 2007 WO
2007087421 Nov 2007 WO
2007136859 Nov 2007 WO
2007136879 Nov 2007 WO
2008015164 Feb 2008 WO
2009014895 Jan 2009 WO
2009015396 Jan 2009 WO
2009049322 Apr 2009 WO
2009049324 Apr 2009 WO
2009062179 May 2009 WO
2010146587 Dec 2010 WO
2012038958 Mar 2012 WO
2012056453 May 2012 WO
2012075153 Jun 2012 WO
2012077116 Jun 2012 WO
2012077117 Jun 2012 WO
2012096102 Jul 2012 WO
2012120507 Sep 2012 WO
2012149548 Nov 2012 WO
2013014673 Jan 2013 WO
2013024476 Feb 2013 WO
2013165380 Nov 2013 WO
2014061023 Apr 2014 WO
2014160983 Oct 2014 WO
2014179236 Nov 2014 WO
2014182723 Nov 2014 WO
2014182728 Nov 2014 WO
2014183012 Nov 2014 WO
2014186230 Nov 2014 WO
2014186519 Nov 2014 WO
2014186521 Nov 2014 WO
2014186525 Nov 2014 WO
2014186775 Nov 2014 WO
2014210516 Dec 2014 WO
2015002847 Jan 2015 WO
2015031877 Mar 2015 WO
2015050829 Apr 2015 WO
2015084442 Jun 2015 WO
2015095481 Jun 2015 WO
2015112747 Jul 2015 WO
2015112899 Jul 2015 WO
2015119573 Aug 2015 WO
2015134060 Sep 2015 WO
2015168066 Nov 2015 WO
2015168664 Nov 2015 WO
2015171732 Nov 2015 WO
2015175246 Nov 2015 WO
2016014581 Jan 2016 WO
2016033403 Mar 2016 WO
2015047631 Apr 2016 WO
Non-Patent Literature Citations (78)
Entry
Notice of Allowance dated Apr. 12, 2017 for U.S. Appl. No. 14/603,137.
Notice of Allowance dated Apr. 16, 2017 for U.S. Appl. No. 13/713,449.
Office Action dated Apr. 19, 2017 for U.S. Appl. No. 14/988,551.
Notice of Allowability dated Apr. 21, 2017 for U.S. Appl. No. 14/549,265.
Office Action dated May 11, 2017 for U.S. Appl. No. 14/278,293.
Office Action dated May 10, 2017 for U.S. Appl. No. 14/986,551.
Office Action dated May 5, 2017 for U.S. Appl. No. 15/077,513.
Notice of Allowance dated May 15, 2017 for U.S. Appl. No. 14/271,270.
Office Action dated May 15, 2017 for U.S. Appl. No. 14/278,293.
Office Action dated May 18, 2017 for U.S. Appl. No. 14/278,338.
Notice of Allowance dated May 16, 2017 for U.S. Appl. No. 14/746,986.
Office Action dated May 23, 2017 for U.S. Appl. No. 13/655,120.
Notice of Allowance dated May 25, 2017 for U.S. Appl. No. 14/318,189.
Office Action dated May 23, 2017 for U.S. Appl. No. 14/500,975.
International Search Report for PCT/US14/37004, dated Sep. 25, 2014.
International Search Report for PCT/US14/38094, dated Nov. 6, 2014.
International Search Report for PCT/US2014/037526, dated Oct. 16, 2014.
International Search Report for PCT/US2014/071085, dated Mar. 27, 2015.
International Search Report for PCT/US2014/58143, dated Jan. 21, 2015.
International Search Report for PCT/US2015/012596, dated Dec. 11, 2015.
International Search Report for PCT/US2015/012751, dated Jun. 26, 2015.
International Search Report for PCT/US2015/027902, dated Jul. 23, 2015.
International Search Report for PCT/US2015/28962, dated Jul. 28, 2015.
International Search Report for PCT/US2015/29421, dated Aug. 7, 2015.
International Search Report for PCT/US2015/41396, dated Sep. 29, 2015.
International Search Report for PCT/US2015/47334, dated Dec. 28, 2015.
International Search Report for PCT/US2015/6548, dated Feb. 26, 2016.
International Search Report for PCT/US2015/66486, dated Dec. 17, 2015.
International Search Report for PCT/US2016/058915, dated Feb. 15, 2017.
Corrected Notice of Allowance dated Apr. 13, 2016 for U.S. Appl. No. 13/680,646.
Notice of Allowance dated Mar. 28, 2016 for U.S. Appl. No. 13/413,059.
Notice of Allowance dated Mar. 29, 2016 for U.S. Appl. No. 13/680,646.
Office Action dated Feb. 26, 2016 for U.S. Appl. No. 14/274,323.
Office Action dated Feb. 4, 2016 for U.S. Appl. No. 14/271,234.
Office Action dated Mar. 23, 2016 for U.S. Appl. No. 13/713,449.
Office Action dated Mar. 24, 2016 for U.S. Appl. No. 13/212,627.
Office Action dated Mar. 28, 2016 for U.S. Appl. No. 13/119,032.
Office Action dated May 25, 2016 for U.S. Appl. No. 14/271,234.
Office Action dated May 5, 2016 for U.S. Appl. No. 14/278,338.
Office Action dated May 6, 2016 for U.S. Appl. No. 14/263,896.
Office Action dated Jun. 30, 2016 for U.S. Appl. No. 13/655,120.
Office Action dated Jun. 28, 2016 for U.S. Appl. No. 14/278,293.
Office Action dated Jul. 1, 2016 for U.S. Appl. No. 14/229,699.
Office Action dated Jul. 15, 2016 for U.S. Appl. No. 14/273,923.
Notice of Allowance dated Jul. 15, 2016 for U.S. Appl. No. 14/274,323.
Office Action dated Jul. 22, 2016 for U.S. Appl. No. 14/549,265.
Sherman L.M., Plastics That Conduct Heat; Plastics Technology, Jun. 2001—article obtained online from http://www.ptonline.com/articles/plastics-that-conduct-heat.
Office Action dated Aug. 11, 2016 for U.S. Appl. No. 14/318,249.
Office Action dated Apr. 28, 2016 for U.S. Appl. No. 13/992,014.
Notice of Allowance dated Aug. 26, 2016 for U.S. Appl. No. 13/212,627.
Office Action dated Sep. 2, 2016 for U.S. Appl. No. 14/278,338.
Office Action dated Sep. 16, 2016 for U.S. Appl. No. 13/992,014.
Notice of Allowance dated Oct. 12, 2016 for U.S. Appl. No. 13/119,032.
Office Action dated Oct. 7, 2016 for U.S. Appl. No. 13/713,449.
Office Action dated Oct. 5, 2016 for U.S. Appl. No. 14/271,270.
Notice of Allowance dated Oct. 13, 2016 for U.S. Appl. No. 14/273,923.
Notice of Allowance dated Nov. 9, 2016 for U.S. Appl. No. 13/557,114.
Office Action dated Dec. 1, 2016 for U.S. Appl. No. 14/278,293.
Office Action dated Dec. 9, 2016 for U.S. Appl. No. 14/549,265.
Office Action dated Dec. 16, 2016 for U.S. Appl. No. 14/263,896.
Notice of Allowance dated Dec. 28, 2016 for U.S. Appl. No. 14/229,699.
Notice of Allowance dated Dec. 27, 2016 for U.S. Appl. No. 14/317,883.
Office Action dated Dec. 27, 2016 for U.S. Appl. No. 14/603,137.
Office Action dated Dec. 29, 2016 for U.S. Appl. No. 15/077,513.
Office Action dated Dec. 30, 2016 for U.S. Appl. No. 14/457,268.
Office Action dated Jan. 17, 2017 for U.S. Appl. No. 14/318,189.
Notice of Allowance dated Jan. 31, 2017 for U.S. Appl. No. 14/271,234.
Office Action dated Feb. 2, 2017 for U.S. Appl. No. 14/278,336.
Office Action dated Feb. 9, 2017 for U.S. Appl. No. 14/746,986.
Office Action dated Feb. 6, 2017 for U.S. Appl. No. 14/751,835.
Office Action dated Feb. 14, 2017 for U.S. Appl. No. 14/271,270.
Office Action dated Feb. 23, 2017 for U.S. Appl. No. 14/318,249.
Office Action dated Mar. 9, 2017 for U.S. Appl. No. 14/791,316.
Office Action dated Mar. 21, 2017 for U.S. Appl. No. 13/992,014.
Office Action dated Mar. 20, 2017 for U.S. Appl. No. 14/276,293.
Notice of Allowance dated Mar. 21, 2017 for U.S. Appl. No. 14/549,265.
Office Action dated Mar. 22, 2017 for U.S. Appl. No. 14/705,355.
Office Action dated Mar. 24, 2017 for U.S. Appl. No. 14/838,509.
Related Publications (1)
Number Date Country
20200046437 A1 Feb 2020 US
Provisional Applications (1)
Number Date Country
62247232 Oct 2015 US
Continuations (1)
Number Date Country
Parent 15335249 Oct 2016 US
Child 16656000 US