This disclosure is directed to advanced visualization and digital imaging systems and methods and, more particularly but not entirely, to adjusting a frame rate of an image sensor on a per-frame basis.
Endoscopic surgical instruments are often preferred over traditional open surgical devices because the small incision tends to reduce post-operative recovery time and associated complications. In some instances of endoscopic visualization, it is desirable to view a space with high-definition color imaging and further with one or more advanced visualization techniques providing additional information that cannot be discerned with the human eye. However, the space-constrained environment of an endoscope introduces numerous technical challenges when seeking to capture advanced visualization data in a light deficient environment.
There are numerous endoscopic visualization systems that seek to capture advanced visualization data, such as multispectral data, fluorescence data, and laser mapping data, while working within the space-constrained environment of an endoscope. However, these endoscopic visualization systems do not address the inherent qualities of an image sensor that lead to a pixel array being more or less efficient at accumulating electromagnetic radiation at different wavebands across the electromagnetic spectrum.
Traditional video cameras and image sensors implement a fixed frame duration for the standard operating mode. With a fixed frame duration, the frame rate remains constant, and the data transmission timing from the image sensor remains constant. For example, for a high-definition video stream, data from approximately two million pixels is offloaded from the image sensor every 16.67 milliseconds (ms) (or 60 frames per second (fps)). The transmission of the data through the electronic circuitry and software algorithms occurs at a fixed timing based on this constant frame rate.
A color video stream requires at least 24-30 fps of white light video, which leaves finite time for capturing data for advanced visualization modalities. In traditional visualization systems, all advanced visualization modalities must be captured within the 60 fps stream, with each frame duration lasting 16.67 ms. This frame duration may be too short for the pixel array to accumulate sufficient electromagnetic radiation at certain wavebands of the electromagnetic spectrum where the pixel array is inherently inefficient. Further, this same frame duration may be more than enough time for the pixel array to accumulate sufficient electromagnetic radiation at certain wavebands where the pixel array is inherently efficient.
For example, commonly owned U.S. Patent Application Publication No. 2020/0404131, entitled “HYPERSPECTRAL AND FLUORESCENCE IMAGING WITH TOPOLOGY LASER SCANNING IN A LIGHT DEFICIENT ENVIRONMENT,” filed on Oct. 24, 2019, which is incorporated by reference in its entirety, describes an endoscopic visualization system for color and “specialty” imaging. In that disclosure, an emitter is configured to emit red, green, blue, and specialty emissions of light, wherein the specialty emissions may include hyperspectral, fluorescence, or laser mapping emissions. Further in that disclosure, the emitter may emit light pulses for different durations from frame to frame. However, that disclosure does not address how a frame rate may be optimized on a frame-by-frame basis to compensate for a pixel array having varying efficiencies in accumulating electromagnetic radiation at different wavebands of the electromagnetic spectrum.
Consequently, a significant need exists for an endoscopic visualization system that can capture color (including high-definition color), multispectral, fluorescence, and laser mapping imaging data with consistent exposure from frame to frame regardless of the waveband of electromagnetic radiation. Specifically, there is a need to compensate for the inherent characteristics of an image sensor relating to a pixel array's varying sensitivities to different wavebands of electromagnetic radiation.
In view of the foregoing, described herein are systems, methods, and devices for fluorescence, multispectral, and laser mapping imaging in a light deficient environment, wherein the frame rate of an image sensor is adjustable on a frame-to-frame basis.
Non-limiting and non-exhaustive implementations of the disclosure are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified. Advantages of the disclosure will become better understood with regard to the following description and accompanying drawings.
Disclosed herein are systems, methods, and devices for digital visualization that may be primarily suited to medical applications such as medical endoscopic imaging. An embodiment of the disclosure is an endoscopic system for color visualization and “advanced visualization” of a scene. The advanced visualization includes one or more of multispectral imaging, fluorescence imaging, or topographical mapping. Data retrieved from the advanced visualization may be processed by one or more algorithms configured to determine characteristics of the scene. The advanced visualization data may specifically be used to identify tissue structures within a scene, generate a three-dimensional topographical map of the scene, calculate dimensions of objects within the scene, identify margins and boundaries of different tissue types, and so forth.
An embodiment of the disclosure is an endoscopic visualization system that includes an emitter, an image sensor, and a controller. The emitter includes a plurality of separate and independently actuatable sources of electromagnetic radiation (EMR) that may be separately cycled on and off to illuminate a scene with pulses of EMR. The image sensor accumulates EMR and reads out data for generating a plurality of data frames. The controller synchronizes operations of the emitter and the image sensor to output a desired visualization scheme based on user input. The visualization scheme may include a selection of one or more of color imaging, multispectral imaging, fluorescence imaging, topographical mapping, or anatomical measurement.
The controller instructs the emitter and the image sensor to operate in a synchronized sequence to output a video stream that includes one or more types of visualization (i.e., color imaging, multispectral imaging, fluorescence imaging, topographical mapping, or anatomical measurement). The controller instructs the emitter to actuate one or more of the plurality of EMR sources to pulse according to a variable pulse cycle. The controller instructs the image sensor to accumulate EMR and read out data according to a variable sensor cycle that is synchronized in time with the variable pulse cycle. The synchronized sequence of the emitter and the image sensor enables the image sensor to read out data corresponding with a plurality of different visualization types. For example, the image sensor may read out a color frame in response to the emitter pulsing white light or other visible EMR, the image sensor may read out a multispectral frame in response to the emitter pulsing a multispectral waveband of EMR, the image sensor may read out data for calculating a three-dimensional topographical map in response to the emitter pulsing EMR in a mapping pattern, and so forth.
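By way of illustration only, the synchronized sequence may be sketched in Python. The PulseType enumeration and the emitter and sensor objects below are hypothetical stand-ins for the emitter and image sensor and are not part of this disclosure:

    from enum import Enum, auto

    class PulseType(Enum):
        WHITE = auto()          # visible EMR; yields a color frame
        MULTISPECTRAL = auto()  # tuned waveband; yields a multispectral frame
        FLUORESCENCE = auto()   # excitation waveband; yields a fluorescence frame
        MAPPING = auto()        # patterned EMR; yields a mapping frame

    def run_synchronized_sequence(emitter, sensor, pulse_cycle):
        """Pulse during each blanking period, then read out a data frame
        whose contents correspond to the pulse that preceded it."""
        for pulse in pulse_cycle:
            emitter.pulse(pulse)       # fires while the pixel array accumulates EMR
            frame = sensor.read_out()  # frame type matches the preceding pulse
            yield pulse, frame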
The controller optimizes and adjusts the variable pulse cycle in real-time based on user input, sufficient exposure of resultant data frames, and inherent properties of a corresponding pixel array. In some cases, a pixel array has varying sensitivities to different wavebands of EMR. In these cases, the pixel array is irradiated with EMR for shorter or longer durations of time depending on the type of illumination pulse to ensure the pixel array outputs data frames with consistent exposure levels. The controller adjusts the irradiation time of the pixel array and the pulsing duration of the emitter in real-time to compensate for the pixel array's varying efficiencies in detecting different types of illumination.
The systems, methods, and devices described herein are implemented for color visualization and advanced visualization. The advanced visualization techniques described herein can be used to identify certain tissues, see through tissues in the foreground, calculate a three-dimensional topography of a scene, and calculate dimensions and distances for objects within the scene. The advanced visualization techniques described herein specifically include multispectral visualization, fluorescence visualization, and laser mapping visualization.
Spectral imaging captures image data across multiple bands of the electromagnetic spectrum. This differs from conventional cameras, which capture light only across the three visible wavelength bands discernable by the human eye (red, green, and blue) to generate an RGB image. Spectral imaging may use any wavelength bands in the electromagnetic spectrum, including infrared wavelengths, the visible spectrum, the ultraviolet spectrum, x-ray wavelengths, or any suitable combination of various wavelength bands. Spectral imaging may overlay imaging generated based on non-visible bands (e.g., infrared) on top of imaging based on visible bands (e.g., a standard RGB image) to provide additional information that is easily discernable by a person or computer algorithm.
The multispectral imaging techniques discussed herein can be used to “see through” layers of tissue in the foreground of a scene to identify specific types of tissue and/or specific biological or chemical processes. Multispectral imaging can be used in the medical context to quantitatively track the progression of a disease and to determine tissue pathology. Additionally, multispectral imaging can be used to identify critical structures such as nervous tissue, muscle tissue, cancerous cells, blood vessels, and so forth. In an embodiment, multispectral partitions of EMR are pulsed and data is gathered regarding the spectral responses of different types of tissue in response to the partitions of EMR. A datastore of spectral responses can be generated and analyzed to assess a scene and predict which tissues are present within the scene based on the sensed spectral responses.
Multispectral imaging enables numerous advantages over conventional imaging. The information obtained by multispectral imaging enables medical practitioners and/or computer-implemented programs to precisely identify certain tissues or conditions that may not be possible to identify with RGB imaging. Additionally, multispectral imaging may be used during medical procedures to provide image-guided surgery that enables a medical practitioner to, for example, view tissues located behind certain tissues or fluids, identify atypical cancerous cells in contrast with typical healthy cells, identify certain tissues or conditions, identify critical structures, and so forth. Multispectral imaging provides specialized diagnostic information about tissue physiology, morphology, and composition that cannot be generated with conventional imaging.
Fluorescence occurs when an orbital electron of a molecule, atom, or nanostructure is excited by light or other EMR, and then relaxes to its ground state by emitting a photon from the excited state. The specific frequencies of EMR that excite the orbital electron, or that are emitted during relaxation, depend on the particular atom, molecule, or nanostructure. In most cases, the light emitted by the substance has a longer wavelength, and therefore lower energy, than the radiation that was absorbed by the substance.
Fluorescence imaging is particularly useful in biochemistry and medicine as a non-destructive means for tracking or analyzing biological molecules. The biological molecules, including certain tissues or structures, are tracked by analyzing the fluorescent emission of the biological molecules after being excited by a certain wavelength of EMR. However, relatively few cellular components are naturally fluorescent. In certain implementations, it may be desirable to visualize a certain tissue, structure, chemical process, or biological process that is not intrinsically fluorescent. In such an implementation, the body may be administered a dye or reagent that may include a molecule, protein, or quantum dot having fluorescent properties. The reagent or dye may then fluoresce after being excited by a certain wavelength of EMR. Different reagents or dyes may include different molecules, proteins, and/or quantum dots that will fluoresce at particular wavelengths of EMR. Thus, it may be necessary to excite the reagent or dye with a specialized band of EMR to achieve fluorescence and identify the desired tissue, structure, or process in the body.
The fluorescence imaging techniques described herein may be used to identify certain materials, tissues, components, or processes within a body cavity or other light deficient environment. Fluorescence imaging data may be provided to a medical practitioner or computer-implemented algorithm to enable the identification of certain structures or tissues within a body. Such fluorescence imaging data may be overlaid on black-and-white or RGB images to provide additional information and context.
The fluorescence imaging techniques described herein may be implemented in coordination with fluorescent reagents or dyes. Some reagents or dyes are known to attach to certain types of tissues and fluoresce at specific wavelengths of the electromagnetic spectrum. In an implementation, a reagent or dye is administered to a patient that is configured to fluoresce when activated by certain wavelengths of light. The visualization system disclosed herein is used to excite and fluoresce the reagent or dye. The fluorescence of the reagent or dye is detected by an image sensor to aid in the identification of tissues or structures in the body cavity. In an implementation, a patient is administered a plurality of reagents or dyes that are each configured to fluoresce at different wavelengths and/or provide an indication of different structures, tissues, chemical reactions, biological processes, and so forth. In such an implementation, the visualization system described herein emits each of the applicable wavelengths to fluoresce each of the applicable reagents or dyes. This may negate the need to perform individual imaging procedures for each of the plurality of reagents or dyes.
Laser mapping generally includes the controlled deflection of laser beams. Laser mapping can be implemented to generate one or more of a three-dimensional topographical map of a scene, calculate distances between objects within the scene, calculate dimensions of objects within the scene, track the relative locations of tools within the scene, and so forth.
Laser mapping combines controlled steering of laser beams with a laser rangefinder. By taking a distance measurement at every direction, the laser rangefinder can rapidly capture the surface shape of objects, tools, and landscapes. Construction of a full three-dimensional topography may include combining multiple surface models that are obtained from different viewing angles. Various measurement systems and methods exist in the art for applications in archaeology, geography, atmospheric physics, autonomous vehicles, and others. One such system includes light detection and ranging (LIDAR), which is a three-dimensional mapping system. LIDAR has been applied in navigation systems such as airplanes or satellites to determine position and orientation of a sensor in combination with other systems and sensors. LIDAR uses active sensors to illuminate an object and detect energy that is reflected off the object and back to a sensor.
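The rangefinding principle underlying such systems can be stated compactly: the distance to a surface is half the round-trip travel time of the pulse multiplied by the speed of light. A minimal illustration, not drawn from this disclosure:

    SPEED_OF_LIGHT_M_PER_S = 299_792_458

    def time_of_flight_distance(round_trip_seconds: float) -> float:
        """The pulse travels out and back, so the one-way distance is
        half the round-trip travel distance."""
        return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

    print(time_of_flight_distance(10e-9))  # a 10 ns round trip is ~1.5 m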
As discussed herein, the term “laser mapping” includes laser tracking. Laser tracking, or the use of lasers for tool tracking, measures objects by determining the positions of optical targets held against those objects. Laser trackers can be accurate to the order of 0.025 mm over a distance of several meters. The visualization system described herein pulses EMR for use in conjunction with a laser tracking system such that the position of tools within a scene can be tracked and measured.
The endoscopic visualization system described herein implements laser mapping imaging to determine precise measurements and topographical outlines of a scene. In one implementation, mapping data is used to determine precise measurements between, for example, structures or organs in a body cavity, devices or tools in the body cavity, and/or critical structures in the body cavity. As discussed herein, the term “mapping” encompasses technologies referred to as laser mapping, laser scanning, topographical scanning, three-dimensional scanning, laser tracking, tool tracking, and others. A mapping data frame as discussed herein includes data for calculating one or more of a topographical map of a scene, dimensions of objects or structures within a scene, distances between objects or structures within the scene, relative locations of tools or other objects within the scene, and so forth.
For the purposes of promoting an understanding of the principles in accordance with the disclosure, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Any alterations and further modifications of the inventive features illustrated herein, and any additional applications of the principles of the disclosure as illustrated herein, which would normally occur to one skilled in the relevant art and having possession of this disclosure, are to be considered within the scope of the disclosure claimed.
Before the structure, systems, and methods are disclosed and described, it is to be understood that this disclosure is not limited to the particular structures, configurations, process steps, and materials disclosed herein as such structures, configurations, process steps, and materials may vary somewhat. It is also to be understood that the terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting since the scope of the disclosure will be limited only by the appended claims and equivalents thereof.
In describing and claiming the subject matter of the disclosure, the following terminology will be used in accordance with the definitions set out below.
It must be noted that, as used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.
As used herein, the terms “comprising,” “including,” “containing,” “characterized by,” and grammatical equivalents thereof are inclusive or open-ended terms that do not exclude additional, unrecited elements or method steps.
As used herein, the phrase “consisting of” and grammatical equivalents thereof exclude any element or step not specified in the claim.
As used herein, the phrase “consisting essentially of” and grammatical equivalents thereof limit the scope of a claim to the specified materials or steps and those that do not materially affect the basic and novel characteristic or characteristics of the claimed disclosure.
As used herein, the term “proximal” shall refer broadly to the concept of a portion nearest an origin.
As used herein, the term “distal” shall generally refer to the opposite of proximal, and thus to the concept of a portion farther from an origin, or a farthest portion, depending upon the context.
As used herein, color sensors or multi-spectrum sensors are those sensors known to have a color filter array (CFA) thereon to filter incoming EMR into its separate components. In the visual range of the electromagnetic spectrum, such a CFA may be built on a Bayer pattern or a modification thereof to separate the green, red, and blue spectrum components of visible EMR.
As used herein, a monochromatic sensor refers to an unfiltered imaging sensor comprising color-agnostic pixels.
Referring now to the figures, the optical visualization system 106 may be disposed at a distal end of a lumen of an endoscope 110. Alternatively, one or more components of the optical visualization system 106 may be disposed at a proximal end of the lumen of the endoscope 110 or in another region of the endoscope 110. The optical visualization system 106 may include one or more image sensors 124 that each include a pixel array 125.
The optical visualization system 106 may specifically include two lenses 126 dedicated to each image sensor 124 to focus EMR onto a rotated image sensor 124 and enable a depth view. The filter 128 may include a notch filter configured to block unwanted reflected EMR. In a particular use-case, the unwanted reflected EMR may include a fluorescence excitation wavelength that was pulsed by the emitter 102, wherein the system 100 is intended to detect only a fluorescence relaxation wavelength emitted by a fluorescent reagent or tissue.
The image sensor 124 includes one or more image sensors.
The emitter 102 includes one or more EMR sources, which may include, for example, lasers, laser bundles, light emitting diodes (LEDs), electric discharge sources, incandescence sources, electroluminescence sources, and so forth. In some implementations, the emitter 102 includes at least one white EMR source 134 (may be referred to herein as a white light source). The emitter 102 may additionally include one or more EMR sources 138 that are tuned to emit a certain waveband of EMR. The EMR sources 138 may specifically be tuned to emit a waveband of EMR that is selected for multispectral or fluorescence visualization. The emitter 102 may additionally include one or more mapping sources 142 that are configured to emit EMR in a mapping pattern such as a grid array or dot array selected for capturing data for topographical mapping or anatomical measurement.
The one or more white EMR sources 134 emit EMR into a dichroic mirror 136 that feeds the white EMR into a waveguide 130. The white EMR source 134 may specifically feed into a first waveguide 130a dedicated to white EMR. The EMR sources 138 emit EMR into independent dichroic mirrors 140 that each feed EMR into the waveguide 130 and may specifically feed into a second waveguide 130b. The first waveguide 130a and the second waveguide 130b later merge into a waveguide 130 that transmits EMR to a distal end of the endoscope 110 to illuminate a scene with an emission of EMR 144.
The one or more EMR sources 138 that are tuned to emit a waveband of EMR may specifically be tuned to emit EMR that is selected for multispectral or fluorescence visualization. In some cases, the EMR sources 138 are finely tuned to emit a central wavelength of EMR with a tolerance threshold not exceeding ±5 nm, ±4 nm, ±3 nm, ±2 nm, or ±1 nm. The EMR sources 138 may include lasers or laser bundles that are separately cycled on and off by the emitter 102 to pulse the emission of EMR 144 and illuminate a scene with a finely tuned waveband of EMR.
The one or more mapping sources 142 are configured to pulse EMR in a mapping pattern, which may include a dot array, grid array, vertical hashing, horizontal hashing, pin grid array, and so forth. The mapping pattern is selected for laser mapping imaging to determine one or more of a three-dimensional topographical map of a scene, a distance between two or more objects within a scene, a dimension of an object within a scene, a location of a tool 108 within the scene, and so forth. The EMR pulsed by the mapping source 142 is diffracted to spread the energy waves according to the desired mapping pattern. The mapping source 142 may specifically include a device that splits the EMR beam with quantum-dot-array diffraction grating. The mapping source 142 may be configured to emit low-mode laser light.
The controller 104 (may be referred to herein as a camera control unit or CCU) may include a field programmable gate array (FPGA) 112 and a computer 113. The FPGA 112 may be configured to perform overlay processing 114 and image processing 116. The computer 113 may be configured to generate a pulse cycle 118 for the emitter 102 and to perform further image processing 120. The FPGA 112 receives data from the image sensor 124 and may combine data from two or more data frames by way of overlay processing 114 to output an overlay image frame. The computer 113 may provide data to the emitter 102 and the image sensor 124. Specifically, the computer 113 may calculate and adjust a variable pulse cycle to be emitted by the emitter 102 in real-time based on user input. Additionally, the computer 113 may receive data frames from the image sensor 124 and perform further image processing 120 on those data frames.
The controller 104 may be in communication with a network, such as the Internet, and automatically upload data to the network for remote storage. The MCU 122 and image sensors 124 may be exchanged and updated and continue to communicate with an established controller 104. In some cases, the controller 104 is “out of date” with respect to the MCU 122 but will still successfully communicate with the MCU 122. This may increase data security for a hospital or other healthcare facility because the existing controller 104 may be configured to undergo extensive security protocols to protect patient data.
The controller 104 may reprogram the image sensor 124 for each data frame to set a required blanking period duration and/or readout period duration for a subsequent frame period. One frame period includes a blanking period and a readout period. Generally speaking, the pixel array 125 accumulates EMR during the blanking period and reads out pixel data during the readout period. It will be understood that a blanking period corresponds to a time between a readout of a last row of active pixels in the pixel array of the image sensor and a beginning of a next subsequent readout of active pixels in the pixel array. Additionally, the readout period corresponds to a duration of time when active pixels in the pixel array are being read. Further, the controller 104 may write correct registers to the image sensor 124 to adjust the duration of one or more of the blanking period or the readout period for each frame period on a frame-by-frame basis within the sensor cycle as needed.
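By way of illustration only, the per-frame register reprogramming may be sketched as follows; the register addresses, units, and write interface are hypothetical and will vary by sensor:

    # Hypothetical register map; actual addresses and units are sensor-specific.
    REG_BLANKING_DURATION = 0x3010
    REG_READOUT_DURATION = 0x3012

    def program_next_frame(sensor, blanking_us: int, readout_us: int) -> None:
        """Rewrite the timing registers so the next frame period uses the
        requested durations; one frame period is the sum of the two."""
        sensor.write_register(REG_BLANKING_DURATION, blanking_us)
        sensor.write_register(REG_READOUT_DURATION, readout_us)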
The endoscope 110 includes a microcontroller unit (MCU) 122 disposed therein. The MCU 122 may specifically be disposed within a handpiece portion of the endoscope 110 and communicate with electronic circuitry (such as the image sensor 124) disposed within a distal end of a lumen of the endoscope 110. The MCU 122 receives instructions from the controller 104, including an indication of the pulse cycle 118 provided to the emitter 102 and the corresponding sensor cycle timing for the image sensor 124. The MCU 122 executes a common Application Program Interface (API). The controller 104 communicates with the MCU 122, and the MCU 122 executes a translation function that translates instructions received from the controller 104 into the correct format for each type of image sensor 124. In some cases, the system 100 may include multiple different image sensors that each operate according to a different “language” or formatting, and the MCU 122 is configured to translate instructions from the controller 104 into each of the appropriate data formatting languages. The common API on the MCU 122 passes information about the scene, including, for example, parameters pertaining to gain, exposure, white balance, setpoint, and so forth. The MCU 122 runs a feedback algorithm to the controller 104 for any number of parameters depending on the type of visualization.
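By way of illustration only, the translation function described above may be sketched as follows. The sensor model names, register maps, and byte formats are invented for illustration and do not correspond to any particular image sensor:

    # Hypothetical per-sensor register maps; real layouts are vendor-specific.
    ADDRESSES_A = {"gain": 0x2000, "exposure": 0x2002, "white_balance": 0x2004}
    OPCODES_B = {"gain": 0x10, "exposure": 0x11, "white_balance": 0x12}

    def translate_instruction(sensor_model: str, params: dict) -> bytes:
        """Translate common-API parameters (gain, exposure, white balance,
        and so forth) into the wire format each image sensor type expects."""
        if sensor_model == "sensor_a":
            # Format A: 16-bit big-endian register/value pairs.
            return b"".join(ADDRESSES_A[k].to_bytes(2, "big") + v.to_bytes(2, "big")
                            for k, v in params.items())
        if sensor_model == "sensor_b":
            # Format B: an 8-bit opcode followed by a little-endian value.
            return b"".join(bytes([OPCODES_B[k]]) + v.to_bytes(2, "little")
                            for k, v in params.items())
        raise ValueError(f"unknown sensor model: {sensor_model}")

    print(translate_instruction("sensor_a", {"gain": 4, "exposure": 1000}).hex())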
The MCU 122 stores operational data and images captured by the image sensors 124. In some cases, the MCU 122 does not need to continuously push data up the data chain to the controller 104. The data may be set once on the microcontroller 122, and then only critical information may be pushed through a feedback loop to the controller 104. The MCU 122 may be set up in multiple modes, including a primary mode (may be referred to as a “master” mode when referring to a master/slave communication protocol). The MCU 122 ensures that all downstream components (i.e., distal components including the image sensors 124, which may be referred to as “slaves” in the master/slave communication protocol) are apprised of the configurations for upcoming data frames. The upcoming configurations may include, for example, gain, exposure duration, readout duration, pixel binning configuration, and so forth.
The MCU 122 includes internal logic for executing triggers to coordinate different devices, including, for example, multiple image sensors 124. The MCU 122 provides instructions for upcoming frames and executes triggers to ensure that each image sensor 124 begins to capture data at the same time. In some cases, the image sensors 124 may automatically advance to a subsequent data frame without receiving a unique trigger from the MCU 122.
In some cases, the endoscope 110 includes two or more image sensors 124 that detect EMR and output data frames simultaneously. The simultaneous data frames may be used to output a three-dimensional image and/or output imagery with increased definition and dynamic range. The pixel array of the image sensor 124 may include active pixels and optical black (“OB”) or optically blind pixels. The optical black pixels may be read during a blanking period of the pixel array when the pixel array is “reset” or calibrated. After the optical black pixels have been read, the active pixels are read during a readout period of the pixel array. The active pixels accumulate EMR that is pulsed by the emitter 102 during the blanking period of the image sensor 124. The pixel array 125 may include monochromatic or “color agnostic” pixels that do not comprise any filter for selectively receiving certain wavebands of EMR. The pixel array may include a color filter array (CFA), such as a Bayer pattern CFA, that selectively allows certain wavebands of EMR to pass through the filters and be accumulated by the pixel array.
The image sensor 124 is instructed by a combination of the MCU 122 and the controller 104 working in a coordinated effort. Ultimately, the MCU 122 provides the image sensor 124 with instructions on how to capture the upcoming data frame. These instructions include, for example, an indication of the gain, exposure, white balance, exposure duration, readout duration, pixel binning configuration, and so forth for the upcoming data frame. When the image sensor 124 is reading out data for a current data frame, the MCU 122 is rewriting the correct registers for the next data frame. The MCU 122 and the image sensor 124 operate in a back-and-forth data flow, wherein the image sensor 124 provides data to the MCU 122 and the MCU 122 rewrites correct registers to the image sensor 124 for each upcoming data frame. The MCU 122 and the image sensor 124 may operate according to a “ping pong buffer” in some configurations.
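By way of illustration, a ping pong buffer for sensor configuration may be sketched as two register banks: the sensor reads the active bank for the current frame while the MCU rewrites the shadow bank for the next frame, and the banks swap at each frame boundary. The class below is a hypothetical sketch, not the register layout of any particular sensor:

    class PingPongRegisters:
        def __init__(self):
            self.banks = [{}, {}]
            self.active = 0  # bank the sensor is currently reading

        def write_next_frame(self, settings: dict) -> None:
            # MCU writes land in the shadow bank, never the active one.
            self.banks[1 - self.active].update(settings)

        def swap(self) -> dict:
            # Called at the frame boundary; the shadow bank becomes active.
            self.active = 1 - self.active
            return self.banks[self.active]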
The image sensor 124, MCU 122, and controller 104 engage in a feedback loop to continuously adjust and optimize configurations for upcoming data frames based on output data. The MCU 122 continually rewrites correct registers to the image sensor 124 depending on the type of upcoming data frame (i.e., color data frame, multispectral data frame, fluorescence data frame, topographical mapping data frame, and so forth), configurations for previously output data frames, and user input. In an example implementation, the image sensor 124 outputs a multispectral data frame in response to the emitter 102 pulsing a multispectral waveband of EMR. The MCU 122 and/or controller 104 determines that the multispectral data frame is underexposed and cannot successfully be analyzed by a corresponding machine learning algorithm. The MCU 122 and/or controller 104 then adjusts configurations for upcoming multispectral data frames to ensure that future multispectral data frames are properly exposed. The MCU 122 and/or controller 104 may indicate that the gain, exposure duration, pixel binning configuration, etc. must be adjusted for future multispectral data frames to ensure proper exposure. All image sensor 124 configurations may be adjusted in real-time based on previously output data processed through the feedback loop, and further based on user input.
The waveguides 130, 131 include one or more optical fibers. The optical fibers may be made of a low-cost material, such as plastic, to allow for disposal of one or more of the waveguides 130, 131. In some implementations, one or more of the waveguides 130, 131 include a single glass fiber having a diameter of 500 microns. In some implementations, one or more of the waveguides 130, 131 include a plurality of glass fibers.
The data flow 200 includes an emitter 102, a pixel array 125 of an image sensor 124 (not shown), and an image signal processor 140. The image signal processor 140 may include one or more of the image processing 116, 120 modules described above.
The controller 104 instructs the emitter 102 to cycle the plurality of EMR sources according to a variable pulse cycle. The controller 104 calculates the variable pulse cycle based at least in part upon a user input indicating the desired visualization scheme. For example, the desired visualization scheme may indicate the user wishes to view a scene with only color imaging. In this case, the variable pulse cycle may include only pulses of white EMR. In an alternative example, the desired visualization scheme may indicate the user wishes to be notified when nerve tissue can be identified in the scene and/or when a tool within the scene is within a threshold distance from the nerve tissue. In this example, the variable pulse cycle may include pulses of white EMR and may further include pulses of one or more multispectral wavebands of EMR that elicit a spectral response from the nerve tissue and/or “see through” non-nerve tissues by penetrating those non-nerve tissues. Additionally, the variable pulse cycle may include pulses of EMR in a mapping pattern configured for laser mapping imaging to determine when the tool is within the threshold distance from the nerve tissue. The controller 104 may reconfigure the variable pulse cycle in real-time in response to receiving a revised desired visualization scheme from the user.
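By way of illustration only, the mapping from a desired visualization scheme to a variable pulse cycle may be sketched as follows; the scheme names and pulse labels are hypothetical:

    def build_pulse_cycle(scheme: set) -> list:
        """Assemble a variable pulse cycle from the requested visualization
        scheme; actual scheduling also weighs the frame budget and the
        pixel array's sensitivities."""
        cycle = []
        if "color" in scheme:
            cycle.append("white")
        if "nerve_detection" in scheme:
            cycle.append("multispectral_nerve_waveband")
        if "tool_proximity" in scheme:
            cycle.append("mapping_pattern")
        return cycle or ["white"]  # default to color-only visualization

    print(build_pulse_cycle({"color", "nerve_detection", "tool_proximity"}))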
Specifically, the pixel array 125 accumulates EMR during the T1 blanking period and reads out the T1 data frame during the T1 readout period, which follows the T1 blanking period. Similarly, the pixel array 125 accumulates EMR during the T2 blanking period and reads out the T2 data frame during the T2 readout period, which follows the T2 blanking period. The pixel array 125 accumulates EMR during the T3 blanking period and reads out the T3 data frame during the T3 readout period, which follows the T3 blanking period. The pixel array 125 accumulates EMR during the T4 blanking period and reads out the T4 data frame during the T4 readout period, which follows the T4 blanking period. Each of the T1 data frame, the T2 data frame, the T3 data frame, and the T4 data frame is provided to the image signal processor 140.
The contents of each of the T1-T4 data frames is dependent on the type of EMR that was pulsed by the emitter 102 during the preceding blanking period. For example, if the emitter 102 pulses white light during the preceding blanking period, then the resultant data frame may include a color data frame (if the pixel array 125 includes a color filter array for outputting red, green, and blue image data). Further for example, if the emitter 102 pulses a multispectral waveband of EMR during the preceding blanking period, then the resultant data frame is a multispectral data frame comprising information for identifying a spectral response by one or more objects within the scene and/or information for “seeing through” one or more structures within the scene. Further for example, if the emitter 102 pulses a fluorescence excitation waveband of EMR during the preceding blanking period, then the resultant data frame is a fluorescence data frame comprising information for identifying a fluorescent reagent or autofluorescence response by a tissue within the scene. Further for example, if the emitter 102 pulses EMR in a mapping pattern during the preceding blanking period, then the resultant data frame is a mapping data frame comprising information for calculating one or more of a three-dimensional topographical map of the scene, a dimension of one or more objects within the scene, a distance between two or more objects within the scene, and so forth.
Some “machine vision” data frames, including multispectral data frames, fluorescence data frames, and mapping data frames may be provided to a corresponding algorithm or neural network configured to evaluate the information therein. A multispectral algorithm may be configured to identify one or more tissue structures within a scene based on how those tissue structures respond to one or more different wavebands of EMR selected for multispectral imaging. A fluorescence algorithm may be configured to identify a location of a fluorescent reagent or auto-fluorescing tissue structure within a scene. A mapping algorithm may be configured to calculate one or more of a three-dimensional topographical map of a scene, a depth map, a dimension of one or more objects within the scene, and/or a distance between two or more objects within the scene based on the mapping data frame.
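By way of illustration, routing each machine vision data frame to its corresponding algorithm may be sketched as a simple dispatch table; the handler functions below are placeholders for the spectral, fluorescence, and mapping algorithms, not implementations of them:

    def dispatch_frame(handlers: dict, frame_type: str, frame):
        """Route a data frame to the processing algorithm registered for
        its type; unknown frame types are ignored."""
        handler = handlers.get(frame_type)
        return handler(frame) if handler else None

    handlers = {
        "multispectral": lambda f: f"tissue structures identified in {f}",
        "fluorescence": lambda f: f"reagent intensity map for {f}",
        "mapping": lambda f: f"topographical map computed from {f}",
    }
    print(dispatch_frame(handlers, "mapping", "T4 data frame"))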
The pixel array 125 reads out a color data frame 205 in response to the emitter 102 pulsing the pulsed visible 204 EMR. The pulsed visible 204 EMR may specifically include a pulse of white light. The pixel array 125 reads out a multispectral data frame 207 in response to the emitter 102 pulsing the multispectral 206 waveband of EMR. The pulsed multispectral 206 waveband of EMR may specifically include one or more of EMR within a waveband from about 513-545 nanometers (nm), 565-585 nm, and/or 900-1000 nm. It will be appreciated that the pulsed multispectral 206 waveband of EMR may include various other wavebands used to elicit a spectral response. The pixel array 125 reads out a fluorescence data frame 209 in response to the emitter 102 pulsing the fluorescence 208 waveband of EMR. The pulsed fluorescence 208 waveband of EMR may specifically include one or more of EMR within a waveband from about 770-795 nm and/or 790-815 nm. The pixel array 125 reads out a mapping data frame 211 in response to the emitter 102 pulsing EMR in a mapping pattern 210. The pulsed mapping pattern 210 may include one or more of vertical hashing, horizontal hashing, a pin grid array, a dot array, a raster grid of discrete points, and so forth. Each of the color data frame 205, the multispectral data frame 207, the fluorescence data frame 209, and the mapping data frame 211 is provided to the image signal processor 140.
In an implementation, the emitter 102 separately pulses red, green, and blue visible EMR. In this implementation, the pixel array 125 may include a monochromatic (color agnostic) array of pixels. The pixel array 125 may separately read out a red data frame, a green data frame, and a blue data frame in response to the separate pulses of red, green, and blue visible EMR.
In an implementation, the emitter 102 separately pulses wavebands of visible EMR that are selected for capturing luminance (“Y”) imaging data, red chrominance (“Cr”) imaging data, and blue chrominance (“Cb”) imaging data. In this implementation, the pixel array 125 may separately read out a luminance data frame (comprising only luminance imaging information), a red chrominance data frame, and a blue chrominance data frame.
The emitter 102 pulses according to a variable pulse cycle that includes one or more types of EMR. The variable pulse cycle may include visible EMR, which may include a white light emission, red light emission, green light emission, blue light emission, or some other waveband of visible EMR. The white light emission may be pulsed with a white light emitting diode (LED) or other light source and may alternatively be pulsed with a combination of red, green, and blue light sources pulsing in concert. The variable pulse cycle may include one or more wavebands of EMR that are selected for multispectral imaging or fluorescence imaging. The variable pulse cycle may include one or more emissions of EMR in a mapping pattern selected for three-dimensional topographical mapping or calculating dimensions within a scene. In some cases, different types of EMR are represented in the variable pulse cycle with different regularity than other types of EMR. This may be implemented to emphasize and de-emphasize aspects of the recorded scene as desired by the user.
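By way of illustration only, representing different types of EMR with different regularity may be sketched by interleaving pulse types according to per-cycle weights; the weights shown are hypothetical:

    from itertools import zip_longest

    def weighted_pulse_cycle(weights: dict) -> list:
        """Interleave pulse types so that emphasized modalities recur more
        often within the cycle; each weight counts appearances per cycle."""
        groups = [[pulse] * count for pulse, count in weights.items()]
        return [p for slot in zip_longest(*groups) for p in slot if p is not None]

    # Three color frames for every multispectral and mapping frame.
    print(weighted_pulse_cycle({"white": 3, "multispectral": 1, "mapping": 1}))
    # ['white', 'multispectral', 'mapping', 'white', 'white']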
The controller 104 adjusts the variable pulse cycle in real-time based on the visualization objectives. The system enables a user to input one or more visualization objectives and to change those objectives while using the system. For example, the visualization objective may indicate the user wishes to view only color imaging data, and in this case, the variable pulse cycle may include pulsed or constant emissions of white light (or other visible EMR). The visualization objective may indicate the user wishes to be notified when a scene includes one or more types of tissue or conditions that may be identified using one or more of color imaging, multispectral imaging, or fluorescence imaging. The visualization objective may indicate that a patient has been administered a certain fluorescent reagent or dye, and that fluorescence imaging should continue while the reagent or dye remains active. The visualization objective may indicate the user wishes to view a three-dimensional topographical map of a scene, receive information regarding distances or dimensions within the scene, receive an alert when a tool comes within critical distance from a certain tissue structure, and so forth.
The variable pulse cycle may include one or more finely tuned partitions of the electromagnetic spectrum that are selected to elicit a fluorescence response from a reagent, dye, or auto-fluorescing tissue. The fluorescence excitation wavebands of EMR include one or more of the following: 700±50 nm, 710±50 nm, 720±50 nm, 730±50 nm, 740±50 nm, 750±50 nm, 760±50 nm, 770±50 nm, 780±50 nm, 790±50 nm, 800±50 nm, 810±50 nm, 820±50 nm, 830±50 nm, 840±50 nm, 850±50 nm, 860±50 nm, 870±50 nm, 880±50 nm, 890±50 nm, or 900±50 nm. The aforementioned wavebands may be finely tuned such that the emitter pulses the central wavelength with a tolerance threshold of ±100 nm, ±90 nm, ±80 nm, ±70 nm, ±60 nm, ±50 nm, ±40 nm, ±30 nm, ±20 nm, ±10 nm, ±8 nm, ±6 nm, ±5 nm, ±4 nm, ±3 nm, ±2 nm, ±1 nm, and so forth. In some cases, the emitter includes a plurality of laser bundles that are each configured to pulse a particular wavelength of EMR with a tolerance threshold not greater than ±5 nm, ±4 nm, ±3 nm, or ±2 nm.
The variable pulse cycle may include one or more wavebands of EMR that are tuned for multispectral imaging. These wavebands of EMR are selected to elicit a spectral response from a certain tissue or penetrate through a certain tissue (such that substances disposed behind that tissue may be visualized). The multispectral wavebands of EMR include one or more of the following: 400±50 nm, 410±50 nm, 420±50 nm, 430±50 nm, 440±50 nm, 450±50 nm, 460±50 nm, 470±50 nm, 480±50 nm, 490±50 nm, 500±50 nm, 510±50 nm, 520±50 nm, 530±50 nm, 540±50 nm, 550±50 nm, 560±50 nm, 570±50 nm, 580±50 nm, 590±50 nm, 600±50 nm, 610±50 nm, 620±50 nm, 630±50 nm, 640±50 nm, 650±50 nm, 660±50 nm, 670±50 nm, 680±50 nm, 690±50 nm, 700±50 nm, 710±50 nm, 720±50 nm, 730±50 nm, 740±50 nm, 750±50 nm, 760±50 nm, 770±50 nm, 780±50 nm, 790±50 nm, 800±50 nm, 810±50 nm, 820±50 nm, 830±50 nm, 840±50 nm, 850±50 nm, 860±50 nm, 870±50 nm, 880±50 nm, 890±50 nm, 900±50 nm, 910±50 nm, 920±50 nm, 930±50 nm, 940±50 nm, 950±50 nm, 960±50 nm, 970±50 nm, 980±50 nm, 990±50 nm, 1000±50 nm, 900±100 nm, 950±100 nm, or 1000±100 nm. The aforementioned wavebands may be finely tuned such that the emitter pulses the central wavelength with a tolerance threshold of ±100 nm, ±90 nm, ±80 nm, ±70 nm, ±60 nm, ±50 nm, ±40 nm, ±30 nm, ±20 nm, ±10 nm, ±8 nm, ±6 nm, ±5 nm, ±4 nm, ±3 nm, ±2 nm, ±1 nm, and so forth. In some cases, the emitter includes a plurality of laser bundles that are each configured to pulse a particular wavelength of EMR with a tolerance threshold not greater than ±5 nm, ±4 nm, ±3 nm, or ±2 nm.
Certain multispectral wavelengths pierce through tissue and enable a medical practitioner to “see through” tissues in the foreground to identify chemical processes, structures, compounds, biological processes, and so forth that are located behind the foreground tissues. The multispectral wavelengths may be specifically selected to identify a specific disease, tissue condition, biological process, chemical process, type of tissue, and so forth that is known to have a certain spectral response.
The variable pulse cycle may include one or more emissions of EMR that are optimized for mapping imaging, which includes, for example, three-dimensional topographical mapping, depth map generation, calculating distances between objects within a scene, calculating dimensions of objects within a scene, determining whether a tool or other object approaches a threshold distance from another object, and so forth. The pulses for laser mapping imaging include EMR formed in a mapping pattern, which may include one or more of vertical hashing, horizontal hashing, a dot array, and so forth.
The controller 104 optimizes the variable pulse cycle to accommodate various imaging and video standards. In most use-cases, the system outputs a video stream comprising at least 30 frames per second (fps). The controller 104 synchronizes operations of the emitter and the image sensor to output data at a sufficient frame rate for visualizing the scene and further for processing the scene with one or more advanced visualization techniques. A user may request a real-time color video stream of the scene and may further request information based on one or more of multispectral imaging, fluorescence imaging, or laser mapping imaging (which may include topographical mapping, calculating dimensions and distances, and so forth). The controller 104 causes the image sensor to separately sense color data frames, multispectral data frames, fluorescence data frames, and mapping data frames based on the variable pulse cycle of the emitter.
In some cases, a user requests more data types than the system can accommodate while maintaining a smooth video frame rate. The system is constrained by the image sensor's ability to accumulate a sufficient amount of electromagnetic energy during each blanking period to output a data frame with sufficient exposure. In some cases, the image sensor outputs data at a rate of 60-120 fps and may specifically output data at a rate of 60 fps. In these cases, the controller 104 may devote 30 fps to color visualization and may devote the other frames per second to one or more advanced visualization techniques.
The controller 104 calculates and adjusts the variable pulse cycle of the emitter 102 in real-time based at least in part on the known capabilities of the pixel array 125. The controller 104 may access data stored in memory indicating how long the pixel array 125 must be exposed to a certain waveband of EMR for the pixel array 125 to accumulate a sufficient amount of EMR to output a data frame with sufficient exposure. In most cases, the pixel array 125 is inherently more or less sensitive to different wavebands of EMR. Thus, the pixel array 125 may require a longer or shorter blanking period duration for some wavebands of EMR to ensure that all data frames output by the image sensor 124 comprise sufficient exposure levels.
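By way of illustration, the stored sensitivity data may be used to scale each blanking period inversely with the pixel array's efficiency at the pulsed waveband. The sensitivity values and baseline below are hypothetical, chosen to match the 25 ms and 5 ms worked example later in this disclosure:

    # Hypothetical relative sensitivities of the pixel array per waveband.
    RELATIVE_SENSITIVITY = {"650nm": 1.0, "950nm": 0.2}

    BASELINE_BLANKING_MS = 5.0  # exposure time needed at sensitivity 1.0

    def blanking_duration_ms(waveband: str) -> float:
        """Scale the blanking period inversely with sensitivity so every
        data frame reaches a consistent exposure level."""
        return BASELINE_BLANKING_MS / RELATIVE_SENSITIVITY[waveband]

    print(blanking_duration_ms("950nm"))  # 25.0 ms: five times the baseline
    print(blanking_duration_ms("650nm"))  # 5.0 ms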
The controller 104 determines the data input requirements for various advanced visualization algorithms (see, e.g., the spectral processing 346, fluorescence processing 348, and topographical processing 350 algorithms described below).
The system 100 may include a plurality of image sensors 124 that may have different or identical pixel array configurations. For example, one image sensor 124 may include a monochromatic or “color agnostic” pixel array with no filters, another image sensor 124 may include a pixel array with a Bayer pattern CFA, and another image sensor 124 may include a pixel array with a different CFA. The multiple image sensors 124 may be assigned to detect EMR for a certain imaging modality, such as color imaging, multispectral imaging, fluorescence imaging, or laser mapping imaging. Further, each of the image sensors 124 may be configured to simultaneously accumulate EMR and output a data frame, such that all image sensors are capable of sensing data for all imaging modalities.
The controller 104 prioritizes certain advanced visualization techniques based on the user's ultimate goals. In some cases, the controller 104 prioritizes outputting a smooth and high-definition color video stream to the user above other advanced visualization techniques. In other cases, the controller 104 prioritizes one or more advanced visualization techniques over color visualization, and in these cases, the output color video stream may appear choppy to a human eye because the system outputs fewer than 30 fps of color imaging data.
For example, a user may indicate that a fluorescent reagent has been administered to a patient. If the fluorescent reagent is time sensitive, then the controller 104 may ensure that a sufficient ratio of frames is devoted to fluorescence imaging to ensure the user receives adequate fluorescence imaging data while the reagent remains active. In another example, a user requests a notification whenever the user's tool comes within a threshold distance of a certain tissue, such as a blood vessel, nerve fiber, cancer tissue, and so forth. In this example, the controller 104 may prioritize laser mapping visualization to constantly determine the distance between the user's tool and the surrounding structures and may further prioritize multispectral or fluorescence imaging that enables the system to identify the certain tissue. The controller 104 may further prioritize color visualization to ensure the user continues to view a color video stream of the scene.
The multispectral data frame 207 may undergo spectral processing 346 that is executed by the image signal processor 140 and/or another processor that is external to the system 300. The spectral processing 346 may include a machine learning algorithm and may be executed by a neural network configured to process the multispectral data frame 207 to identify one or more tissue structures within a scene based on whether those tissue structures emitted a spectral response.
The fluorescence data frame 209 may undergo fluorescence processing 348 that is executed by the image signal processor 140 and/or another processor that is external to the system 300. The fluorescence processing 348 may include a machine learning algorithm and may be executed by a neural network configured to process the fluorescence data frame 209 and identify an intensity map indicating where a fluorescence relaxation wavelength was detected by the pixel array.
The mapping data frame 211 may undergo topographical processing 350 that is executed by the image signal processor 140 and/or another processor that is external to the system 300. The topographical processing 350 may include a machine learning algorithm and may be executed by a neural network configured to assess time-of-flight information to calculate a depth map representative of the scene. The topographical processing 350 includes calculating one or more of a three-dimensional topographical map of the scene, a dimension of one or more objects within the scene, a distance between two or more objects within the scene, a distance between a tool and a certain tissue structure within the scene, and so forth.
Each frame period in the sensor cycle is adjustable on a frame-by-frame basis to optimize the output of the image sensor and compensate for the pixel array 125 having varying degrees of sensitivity to different wavebands of EMR. The duration of each blanking period may be shortened or lengthened to customize the amount of EMR the pixel array 125 can accumulate. Thus, the image sensor 124 may output data frames at an irregular rate due to the sensor cycle comprising a variable frame rate. The system 300 includes a memory buffer 352 that receives data frames from the image sensor 124. The memory buffer 352 stores the data frames and then outputs each data frame to the image signal processor 140 at a regular rate. This enables the image signal processor 140 to process each data frame in sequence at a regular rate.
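By way of illustration, the memory buffer 352 may behave like a first-in, first-out queue that absorbs the sensor's irregular output and releases frames at a fixed cadence. The sketch below is simplified; a practical implementation would track timestamps rather than sleeping:

    import queue
    import time

    def isp_consumer(frame_buffer: queue.Queue, process, interval_s: float = 1 / 60):
        """Drain buffered frames at a fixed cadence so the image signal
        processor sees a regular rate despite the variable sensor rate."""
        while True:
            frame = frame_buffer.get()  # blocks until the sensor enqueues a frame
            process(frame)              # hand off to the ISP pipeline
            time.sleep(interval_s)      # emit downstream at a fixed period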
The variable pulsing cycle is customizable and adjustable in real-time based on user input. The emitter 102 may instruct the individual EMR sources to pulse in any order. Additionally, the emitter 102 may adjust one or more of a duration or an intensity of each pulse of EMR. The variable pulse cycle may be optimized to sufficiently illuminate the light deficient environment 406 such that the resultant data frames read out by the pixel array 125 are within a desired exposure range (i.e., the frames are neither underexposed nor overexposed). The desired exposure range may be determined based on user input, requirements of the image signal processor 140, and/or requirements of a certain image processing algorithm (see 344, 346, 348, and 350 in
The pixel array 125 experiences varying degrees of sensitivity to different wavebands of EMR. In some cases, the pixel array 125 is tuned to be highly sensitive to visible bands of EMR (i.e., the pixel array 125 can efficiently detect visible bands of EMR). The pixel array 125 may have reduced sensitivity to other wavebands of EMR (i.e., the pixel array 125 is less efficient when detecting the other wavebands of EMR). The controller 104 optimizes the variable pulse cycle of the emitter 102 and the variable sensor cycle of the image sensor 124 to compensate for the pixel array's 125 varying degrees of sensitivity to different wavebands of the electromagnetic spectrum.
The controller 104 may adjust each frame period by shortening or lengthening the durations of the blanking periods of the image sensor 124 and the pulsing periods of the emitter 102 on a frame-by-frame basis to ensure each output data frame (see for example 205, 207, 209, and 211 in
The pulse cycle may be adjusted on a frame-by-frame basis based on user input and proper exposure of the resultant data frames output by the image sensor 124. For example, the controller 104 may determine that the data frames output by the image sensor 124 in response to the second pulsing period 510 are underexposed, and therefore, the duration of the second pulsing period 510 should be lengthened to ensure the pixel array 125 can accumulate a sufficient amount of energy to output a data frame with proper exposure.
The controller 104 may instruct the emitter 102 to pulse certain wavebands of EMR at an increased intensity to overcome inherent inefficiencies of the pixel array 125. For example, if the pixel array 125 is inherently inefficient at detecting EMR at a wavelength of 950 nm, then the controller 104 may instruct the emitter 102 to pulse EMR at 950 nm with an increased intensity relative to other pulses of EMR.
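A minimal sketch of this intensity compensation, with illustrative values:

    def pulse_intensity(base_intensity: float, sensitivity: float) -> float:
        """Boost the emitter's pulse intensity inversely with the pixel
        array's sensitivity at the pulsed waveband."""
        return base_intensity / sensitivity

    # A waveband detected at 20% efficiency is pulsed at five times the
    # baseline intensity, subject to the emitter's output limits.
    print(pulse_intensity(1.0, 0.2))  # 5.0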
Traditional video cameras implement a fixed frame duration for the standard operating mode. With a fixed frame duration, the frame rate remains constant, and the data transmission timing from the image sensor remains constant. For example, for a high-definition video stream, data from approximately 2 million pixels is offloaded from the image sensor every 16.67 ms (or 60 fps). The transmission of the data through the electronic circuitry and software algorithms occurs at a fixed timing based on this constant frame rate.
The advanced visualization systems described herein are capable of outputting a color video stream that is interlaced with one or more different advanced visualization data streams, such as multispectral, fluorescence, or laser mapping data streams. This data output may be captured by a single image sensor or multiple image sensors working simultaneously. The image sensor is designed to process images at a constant and repetitive frame rate (e.g., 60 fps, or one data frame every 16.67 ms). The color video stream requires at least 30 fps of white light video, which leaves finite time for capturing the remaining advanced visualization modalities. In traditional cases, the system must capture all advanced visualization modalities within the remaining 30 fps, with each frame duration lasting 16.67 ms. This frame duration may be too short for the pixel array to accumulate sufficient EMR at certain wavebands of the electromagnetic spectrum where the pixel array is inherently inefficient. Further, this same frame duration may be more than enough time for the pixel array to accumulate sufficient EMR at certain wavebands where the pixel array is inherently efficient.
As described herein, the controller 104 optimizes and adjusts the variable pulse cycle of the emitter and the sensor cycle of the image sensor in real-time to ensure proper exposure of the image sensor. The controller may dynamically adjust each frame period on a frame-by-frame basis by adjusting the blanking period duration based on efficiencies of the pixel array and use-case requirements received from a user. This improves the overall system performance and ensures the image sensor can output a plurality of different advanced visualization data frames with proper exposure.
In an example implementation, the image sensor requires 25 ms to accumulate a sufficient amount of EMR at a 950 nm wavelength because the emitter 102 is inefficient at illuminating at this wavelength and/or the pixel array 125 is inefficient at detecting this wavelength. However, the image sensor may only require 5 ms to accumulate a sufficient amount of EMR at 650 nm because the emitter 102 is efficient at illuminating at this wavelength and/or the pixel array 125 is efficient at detecting this wavelength. The controller 104 may lengthen the blanking period for the 950 nm wavelength and shorten the blanking period for the 650 nm wavelength to compensate for these relative inefficiencies of the system 100 and to ensure the image sensor 124 outputs a data frame with optimal exposure at each of the 950 nm wavelength and the 650 nm wavelength.
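This per-waveband compensation may be pictured as a lookup of blanking durations keyed by waveband. The following Python sketch is illustrative; the readout duration and table structure are assumptions, with only the 25 ms and 5 ms values taken from the example above.

    # Per-waveband blanking durations from the example implementation above.
    BLANKING_MS = {
        950: 25.0,  # emitter and/or pixel array inefficient at 950 nm
        650: 5.0,   # emitter and/or pixel array efficient at 650 nm
    }
    READOUT_MS = 4.0  # assumed fixed readout duration (hypothetical value)

    def frame_period_ms(waveband_nm: int) -> float:
        """Frame period = blanking (accumulation) period + readout period."""
        return BLANKING_MS[waveband_nm] + READOUT_MS

    for nm in (950, 650):
        print(f"{nm} nm frame period: {frame_period_ms(nm)} ms")

Under these assumptions, the 950 nm frame period (29 ms) is roughly three times the 650 nm frame period (9 ms), which is precisely the frame-to-frame variability the variable frame cycle accommodates.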
The example mapping pattern 210 illustrated in
As discussed in connection with
The emitter 102 may pulse the mapping pattern 210 at any suitable wavelength of EMR, including, for example, ultraviolet light, visible light, and/or infrared or near infrared light. The surface and/or objects within the environment may be mapped and tracked at very high resolution and with very high accuracy and precision.
The mapping pattern 210 is selected for the desired anatomical measurement scheme, such as three-dimensional topographical mapping, measuring distances and dimensions within a scene, tracking a relative position of a tool 108 within a scene, and so forth. The image sensor 124 detects reflected EMR and outputs a mapping data frame 211 in response to the emitter 102 pulsing the mapping pattern 210. The resultant mapping data frame 211 is provided to a topographical processing algorithm 350 that is trained to calculate one or more of a three-dimensional topographical map of a scene, a distance between two or more objects within the scene, a dimension of an object within the scene, a relative distance between a tool and another object within the scene, and so forth.
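This disclosure does not prescribe a particular reconstruction algorithm, but one common structured-light approach recovers depth by triangulating the observed displacement of each projected dot against its expected position. The following sketch illustrates that general approach only; the baseline and focal length values are hypothetical.

    # Structured-light depth by triangulation (illustrative only).
    BASELINE_MM = 5.0    # assumed emitter-to-sensor baseline at the endoscope tip
    FOCAL_PX = 800.0     # assumed sensor focal length, in pixels

    def depth_mm(disparity_px: float) -> float:
        """Depth of one dot of the mapping pattern from its observed disparity."""
        return BASELINE_MM * FOCAL_PX / disparity_px

    # A dot displaced by 40 pixels corresponds to a depth of 100 mm.
    print(depth_mm(40.0))

Repeating this calculation for every dot in the mapping pattern 210 yields a point cloud from which a topographical map, distances, and dimensions can be derived.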
The emitter 102 may pulse the mapping pattern 210 at a sufficient speed such that the mapping pattern 210 is not visible to a user, as it may be distracting to a user to see the mapping pattern 210 during an endoscopic visualization procedure. In some cases, a rendering of the mapping pattern 210 may be overlaid on a color video stream to provide further context to a user visualizing the scene. The user may further request to view real-time measurements of objects within the scene and real-time proximity alerts when a tool approaches a critical structure such as a blood vessel, nerve fiber, cancerous tissue, and so forth. The measurements may be accurate to within less than one millimeter.
Multispectral imaging includes imaging information from across the electromagnetic spectrum 800. A multispectral pulse of EMR may include a plurality of sub-pulses spanning one or more portions of the electromagnetic spectrum 800 or the entirety of the electromagnetic spectrum 800. A multispectral pulse of EMR may include a single partition of wavelengths of EMR. A resulting multispectral data frame includes information sensed by the pixel array subsequent to a multispectral pulse of EMR. Therefore, a multispectral data frame may include data for any suitable partition of the electromagnetic spectrum 800 and may include multiple data frames for multiple partitions of the electromagnetic spectrum 800.
The emitter 102 may include any number of multispectral EMR sources as needed depending on the implementation. In one embodiment, each multispectral EMR source covers a waveband spanning 40 nanometers. For example, one multispectral EMR source may emit EMR within a waveband from 500 nm to 540 nm, while another multispectral EMR source may emit EMR within a waveband from 540 nm to 580 nm. In another embodiment, multispectral EMR sources may cover other sizes of wavebands, depending on the types of EMR sources available or the imaging needs. Each multispectral EMR source may cover a different slice of the electromagnetic spectrum 800 ranging from far infrared, mid infrared, near infrared, visible light, near ultraviolet, and/or extreme ultraviolet. In some cases, a plurality of multispectral EMR sources of the same type or wavelength may be included to provide sufficient output power for imaging. The number of multispectral EMR sources needed for a specific waveband may depend on the sensitivity of the pixel array 125 to the waveband and/or the power output capability of EMR sources in that waveband.
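One way to represent such a bank of sources is a simple table keyed by waveband, as in the illustrative sketch below; the source names and the specific third waveband are hypothetical.

    # Illustrative bank of multispectral EMR sources, each spanning 40 nm.
    MULTISPECTRAL_SOURCES = [
        {"name": "MS1", "band_nm": (500, 540)},
        {"name": "MS2", "band_nm": (540, 580)},
        {"name": "MS3", "band_nm": (900, 940)},  # a near infrared slice (assumed)
    ]

    def sources_covering(wavelength_nm: float):
        """Select the source(s) whose waveband contains the target wavelength."""
        return [s for s in MULTISPECTRAL_SOURCES
                if s["band_nm"][0] <= wavelength_nm < s["band_nm"][1]]

    print(sources_covering(520))  # selects MS1

Where a single source cannot supply sufficient output power, the selection step may instead return several same-waveband sources to be pulsed together.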
The waveband widths and coverage provided by the EMR sources may be selected to provide any desired combination of spectrums. For example, contiguous coverage of the electromagnetic spectrum 800 using very small waveband widths (e.g., 10 nm or less) may allow for highly selective multispectral and/or fluorescence imaging. The waveband widths allow for selectively emitting the excitation wavelength(s) for one or more particular fluorescent reagents. Additionally, the waveband widths may allow for selectively emitting certain partitions of multispectral EMR for identifying specific structures, chemical processes, tissues, biological processes, and so forth. Because the wavelengths come from EMR sources that can be selectively activated, extreme flexibility for fluorescing one or more specific fluorescent reagents during an examination can be achieved. Additionally, extreme flexibility for identifying one or more objects or processes by way of multispectral imaging can be achieved. Thus, much more fluorescence and/or multispectral information may be obtained in less time and within a single examination, where conventional approaches would have required multiple examinations, delays for the administration of dyes or stains, or the like.
In one embodiment, each data frame is generated based on at least one pulse of EMR. The pulse of EMR is reflected and detected by the pixel array 125 and then read out in a subsequent readout (902). Thus, each blanking period and readout results in a data frame for a specific waveband of EMR. For example, the first data frame 904 may be generated based on a waveband of a first one or more pulses 916, a second data frame 906 may be generated based on a waveband of a second one or more pulses 918, a third data frame 908 may be generated based on a waveband of a third one or more pulses 920, a fourth data frame 910 may be generated based on a waveband of a fourth one or more pulses 922, a fifth data frame 912 may be generated based on a waveband of a fifth one or more pulses 924, and an Nth data frame 914 may be generated based on a waveband of an Nth one or more pulses 926.
The pulses 916-926 may include energy from a single EMR source or from a combination of two or more EMR sources. For example, the waveband included in a single readout period or within the plurality of data frames 904-914 may be selected for a desired examination or detection of a specific tissue or condition. According to one embodiment, one or more pulses may include visible spectrum light for generating an RGB or black and white image while one or more additional pulses are emitted to sense a spectral response to a multispectral wavelength of EMR.
The pulses 916-926 are emitted according to a variable pulse cycle determined by the controller 104. For example, pulse 916 may include white light, pulse 918 may include a multispectral waveband, pulse 920 may include white light, pulse 922 may include a fluorescence waveband, pulse 924 may include white light, and so forth.
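Such a cycle may be represented as an ordered list of pulse types that the controller repeats and adjusts; the blanking durations in the following sketch are hypothetical, while the reference numerals follow the pulses described above.

    # Illustrative variable pulse cycle: white light frames interleaved with
    # multispectral and fluorescence frames.
    from itertools import cycle, islice

    PULSE_CYCLE = [
        ("white", 8.0),           # pulse 916
        ("multispectral", 20.0),  # pulse 918
        ("white", 8.0),           # pulse 920
        ("fluorescence", 25.0),   # pulse 922
        ("white", 8.0),           # pulse 924
    ]

    for i, (pulse_type, blanking_ms) in enumerate(islice(cycle(PULSE_CYCLE), 10)):
        print(f"frame {i}: {pulse_type} pulse, blanking {blanking_ms} ms")

In operation, the controller 104 would regenerate or reorder such a list in response to user input and exposure feedback rather than repeat it verbatim.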
The plurality of data frames 904-914 are shown having readout periods of varying lengths and pulses of different durations or intensities. The blanking period, pulse duration or intensity, and the like may be selected based on the sensitivity of a monochromatic sensor to the specific wavelength, the power output capability of the EMR source(s), and/or the carrying capacity of the waveguide.
In one embodiment, dual image sensors may be used to obtain three-dimensional images or video feeds. A three-dimensional examination may allow for improved understanding of a three-dimensional structure of the examined region as well as a mapping of the different tissue or material types within the region.
In an example implementation, a patient is imaged with an endoscopic imaging system to identify quantitative diagnostic information about the patient's tissue pathology. In the example, the patient is suspected or known to suffer from a disease that can be tracked with multispectral imaging to observe the progression of the disease in the patient's tissue. The endoscopic imaging system pulses white light to generate an RGB video stream of the interior of the patient's body. Additionally, the endoscopic imaging system pulses one or more multispectral wavebands of light that permit the system to “see through” some tissues and generate imaging of the tissue affected by the disease. The endoscopic imaging system senses the reflected multispectral EMR to generate multispectral imaging data of the diseased tissue, and thereby identifies the location of the diseased tissue within the patient's body. The endoscopic imaging system may further emit a mapping pulsing scheme for generating a three-dimensional topographical map of the scene and calculating dimensions of objects within the scene. The location of the diseased tissue (as identified by the multispectral imaging data) may be combined with the topographical map and dimensions information that is calculated with the mapping data. Therefore, the precise location, size, dimensions, and topology of the diseased tissue can be identified. This information may be provided to a medical practitioner to aid in excising, imaging, or studying the diseased tissue. Additionally, this information may be provided to a robotic surgical system to enable the surgical system to excise the diseased tissue.
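The fusion of the multispectral and mapping results may be illustrated with a short sketch; the function name and the simple centroid-and-extent rule below are hypothetical and stand in for whatever fusion logic a given implementation employs.

    # Illustrative fusion of a multispectral tissue mask with a depth map.
    import numpy as np

    def locate_diseased_tissue(spectral_mask: np.ndarray, depth_map_mm: np.ndarray):
        """spectral_mask: boolean pixels flagged by multispectral imaging.
        depth_map_mm: per-pixel depth recovered from the mapping data."""
        ys, xs = np.nonzero(spectral_mask)
        if xs.size == 0:
            return None  # no diseased tissue detected in this frame
        return {
            "centroid_px": (float(xs.mean()), float(ys.mean())),
            "mean_depth_mm": float(depth_map_mm[ys, xs].mean()),
            "extent_px": (int(xs.max() - xs.min()), int(ys.max() - ys.min())),
        }

The returned location and size estimates could then be rendered as an overlay for a medical practitioner or passed to a robotic surgical system.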
The computing device 1000 includes one or more processor(s) 1002, one or more memory device(s) 1004, one or more interface(s) 1006, one or more mass storage device(s) 1008, one or more input/output (I/O) device(s) 1010, and a display device 1030, all of which are coupled to a bus 1012. Processor(s) 1002 include one or more processors or controllers that execute instructions stored in memory device(s) 1004 and/or mass storage device(s) 1008. Processor(s) 1002 may also include various types of computer-readable media, such as cache memory.
Memory device(s) 1004 include various computer-readable media, such as volatile memory (e.g., random access memory (RAM) 1014) and/or nonvolatile memory (e.g., read-only memory (ROM) 1016). Memory device(s) 1004 may also include rewritable ROM, such as Flash memory.
Mass storage device(s) 1008 include various computer-readable media, such as magnetic tapes, magnetic disks, optical disks, solid-state memory (e.g., Flash memory), and so forth. As shown in
I/O device(s) 1010 include various devices that allow data and/or other information to be input to or retrieved from computing device 1000. Example I/O device(s) 1010 include cursor control devices, keyboards, keypads, microphones, monitors or other display devices, speakers, printers, network interface cards, modems, and the like.
Display device 1030 includes any type of device capable of displaying information to one or more users of computing device 1000. Examples of display device 1030 include a monitor, display terminal, video projection device, and the like.
Interface(s) 1006 include various interfaces that allow computing device 1000 to interact with other systems, devices, or computing environments. Example interface(s) 1006 may include any number of different network interfaces 1020, such as interfaces to local area networks (LANs), wide area networks (WANs), wireless networks, and the Internet. Other interface(s) include a user interface 1018 and a peripheral device interface 1022. The interface(s) 1006 may also include one or more user interface elements 1018 and one or more peripheral interfaces, such as interfaces for printers, pointing devices (mice, track pads, or any suitable user interface now known to those of ordinary skill in the field, or later discovered), keyboards, and the like.
Bus 1012 allows processor(s) 1002, memory device(s) 1004, interface(s) 1006, mass storage device(s) 1008, and I/O device(s) 1010 to communicate with one another, as well as with other devices or components coupled to bus 1012. Bus 1012 represents one or more of several types of bus structures, such as a system bus, PCI bus, IEEE bus, USB bus, and so forth.
For purposes of illustration, programs and other executable program components are shown herein as discrete blocks, such as block 302 for example, although it is understood that such programs and components may reside at various times in different storage components of computing device 1000 and are executed by processor(s) 1002. Alternatively, the systems and procedures described herein, including programs or other executable program components, can be implemented in hardware, or a combination of hardware, software, and/or firmware. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein.
The following examples pertain to preferred features of further embodiments:
Example 1 is a system. The system includes an emitter comprising a plurality of electromagnetic sources. The system includes an image sensor comprising a pixel array that detects EMR. The system includes a controller that synchronizes operations of the emitter and the image sensor. The controller instructs the image sensor to accumulate EMR and read out data according to a variable frame cycle comprising a plurality of frame periods. The system is such that a duration of each of the plurality of frame periods is dynamically adjustable on a per-frame basis.
Example 2 is a system as in Example 1, wherein each of the plurality of frame periods comprises: a blanking period wherein the pixel array accumulates the EMR; and a readout period wherein the pixel array reads out data for generating a data frame.
Example 3 is a system as in any of Examples 1-2, wherein the controller instructs the emitter to actuate the plurality of electromagnetic sources to pulse according to a variable illumination cycle comprising independent pulses of two or more different wavebands of EMR; wherein each pulse within the variable illumination cycle of the emitter corresponds with a frame period of the plurality of frame periods of the image sensor; and wherein the controller determines the pixel array's efficiency in detecting the two or more different wavebands of EMR that are pulsed in the variable illumination cycle.
Example 4 is a system as in any of Examples 1-3, wherein the controller calculates an optimized blanking period duration for each of the plurality of frame periods based on the corresponding pulse within the variable illumination cycle and the pixel array's efficiency in detecting a waveband of the corresponding pulse.
Example 5 is a system as in any of Examples 1-4, wherein the controller is configured to synchronize the operations of the emitter and the image sensor by: instructing the emitter to pulse a first waveband of EMR during a first blanking period of the pixel array; and optimizing the duration of the first blanking period of the pixel array based on the pixel array's efficiency in detecting the first waveband of EMR.
Example 6 is a system as in any of Examples 1-5, wherein the controller optimizes the duration of the first blanking period to enable the pixel array to accumulate a sufficient amount of energy to output data for generating a first data frame that corresponds with the first waveband of EMR such that the first data frame comprises a threshold degree of exposure for visualizing a scene.
Example 7 is a system as in any of Examples 1-6, wherein the controller instructs the emitter to pulse a first waveband of EMR during a first blanking period of the pixel array, and wherein: the controller lengthens a duration of the first blanking period if the pixel array's sensitivity to the first waveband of EMR falls below a threshold efficiency; and the controller shortens a duration of the first blanking period if the pixel array's sensitivity to the first waveband of EMR exceeds the threshold efficiency.
Example 8 is a system as in any of Examples 1-7, wherein the controller adjusts the duration of at least a portion of the plurality of frame periods to compensate for the pixel array comprising varying sensitivity to detecting different wavelengths of EMR emitted by the emitter.
Example 9 is a system as in any of Examples 1-8, wherein the plurality of electromagnetic sources comprises a visible source that pulses a visible wavelength of EMR, and wherein the visible source comprises one or more of: a white light source; or a red light source, a green light source, and a blue light source that are pulsed simultaneously or sequentially.
Example 10 is a system as in any of Examples 1-9, wherein the plurality of electromagnetic sources comprises a multispectral source that pulses a spectral waveband of EMR, and wherein the multispectral source comprises one or more of: a first multispectral source that pulses EMR within a waveband from about 513 nm to about 545 nm; a second multispectral source that pulses EMR within a waveband from about 545 nm to about 565 nm; or a third multispectral source that pulses EMR within a waveband from about 900 nm to about 1000 nm.
Example 11 is a system as in any of Examples 1-10, wherein the plurality of electromagnetic sources comprises a fluorescence source that pulses a fluorescence excitation wavelength of EMR, and wherein the fluorescence source comprises one or more of: a first fluorescence source that pulses EMR within a waveband from about 770 nm to about 795 nm; or a second fluorescence source that pulses EMR within a waveband from about 790 nm to about 815 nm.
Example 12 is a system as in any of Examples 1-11, wherein the plurality of electromagnetic sources comprises a mapping source that pulses EMR in a mapping pattern.
Example 13 is a system as in any of Examples 1-12, wherein the pixel array detects reflected EMR and outputs mapping data in response to the emitter pulsing the mapping pattern, and wherein the mapping data comprises information for determining one or more of a topographical map of a scene, a dimension of one or more objects within the scene, or a distance.
Example 14 is a system as in any of Examples 1-13, wherein the controller instructs the emitter to selectively cycle the plurality of electromagnetic sources according to a variable illumination cycle, and wherein the variable illumination cycle is adjustable based on user input.
Example 15 is a system as in any of Examples 1-14, wherein the user input comprises an indication the image sensor should output color imaging and machine vision imaging, and wherein the controller adjusts the variable illumination cycle to comprise a pattern comprising: a plurality of pulses of a visible wavelength of EMR, wherein the image sensor outputs color imaging data in response to the emitter pulsing the visible wavelength of EMR; and a plurality of pulses of a machine vision emission, wherein the image sensor outputs machine vision imaging data in response to the emitter pulsing the machine vision emission.
Example 16 is a system as in any of Examples 1-15, wherein the machine vision imaging comprises one or more of multispectral imaging, fluorescence imaging, or topographical mapping, and wherein the machine vision emission comprises a wavelength or pattern selected for the one or more of the multispectral imaging, the fluorescence imaging, or the topographical mapping.
Example 17 is a system as in any of Examples 1-16, wherein the controller reprograms the image sensor prior to each frame period of the plurality of frame periods to set a blanking period duration for an upcoming frame period.
Example 18 is a system as in any of Examples 1-17, further comprising a memory buffer, and wherein: the image sensor outputs a plurality of datasets corresponding with the plurality of frame periods, and wherein each of the plurality of datasets comprises information for generating a data frame; two or more of the plurality of frame periods of the frame cycle comprise different blanking period durations; and the image sensor outputs the plurality of datasets to the memory buffer at irregular intervals due to the frame cycle comprising the different blanking period durations.
Example 19 is a system as in any of Examples 1-18, further comprising an image processing pipeline that receives the plurality of datasets stored in the memory buffer and processes the plurality of datasets at regular intervals.
Example 20 is a system as in any of Examples 1-19, wherein the controller writes to a sensor register for the image sensor to set a blanking period duration for each of the plurality of frame periods, and wherein two or more of the plurality of frame periods comprise a different blanking period duration.
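By way of non-limiting illustration, the register-programming and buffering behavior recited in Examples 17 through 20 may be sketched as follows; the register address, the sensor interface, and all names are hypothetical.

    # Controller writes the next blanking duration to a sensor register before
    # each frame; irregularly timed datasets are buffered so the image
    # processing pipeline can consume them at regular intervals.
    from collections import deque

    BLANKING_REG = 0x3012      # hypothetical sensor register address
    frame_buffer = deque()     # memory buffer between sensor and pipeline

    def program_next_frame(sensor, blanking_ms: float) -> None:
        sensor.write_register(BLANKING_REG, int(blanking_ms * 1000))  # microseconds

    def on_frame_readout(dataset) -> None:
        frame_buffer.append(dataset)       # arrives at irregular intervals

    def pipeline_tick() -> None:
        if frame_buffer:                   # invoked at a fixed cadence
            dataset = frame_buffer.popleft()
            ...                            # process the dataset into a data frame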
Example 21 is a method. The method includes actuating an emitter to cycle a plurality of electromagnetic sources according to a variable illumination cycle comprising: a first emission comprising EMR within a first waveband; and a second emission comprising EMR within a second waveband. The method includes instructing an image sensor comprising a pixel array to accumulate EMR and read out data according to a frame cycle. The method is such that the frame cycle comprises a plurality of frame periods, and wherein a duration of each of the plurality of frame periods is variable on a per-frame basis.
Example 22 is a method as in Example 21, wherein the frame cycle comprises: a first frame period having a duration that is optimized for accumulating the EMR within the first waveband; and a second frame period that is different from the first frame period having a duration that is optimized for accumulating the EMR within the second waveband.
Example 23 is a method as in any of Examples 21-22, wherein the durations of the first frame period and the second frame period are optimized for enabling the pixel array to accumulate sufficient energy to output data for generating a data frame corresponding to each of the first frame period and the second frame period, wherein each data frame comprises a threshold degree of exposure for visualizing a scene.
Example 24 is a method as in any of Examples 21-23, wherein the plurality of frame periods of the frame cycle comprises: a first frame period comprising: a first blanking period wherein the pixel array accumulates the EMR within the first waveband; and a first readout period wherein the pixel array reads out data for generating a first data frame corresponding with the EMR within the first waveband; and a second frame period comprising: a second blanking period wherein the pixel array accumulates the EMR within the second waveband; and a second readout period wherein the pixel array reads out data for generating a second data frame corresponding with the EMR within the second waveband.
Example 25 is a method as in any of Examples 21-24, further comprising: calculating an optimized duration for the first blanking period such that the first data frame comprises a threshold degree of exposure for visualizing a scene; and calculating an optimized duration for the second blanking period such that the second data frame comprises at least the threshold degree of exposure for visualizing the scene; wherein the optimized duration for the first blanking period is different from the optimized duration for the second blanking period.
Example 26 is a method as in any of Examples 21-25, wherein calculating the optimized duration for the first blanking period comprises calculating based on the pixel array's efficiency in accumulating the EMR within the first waveband; and wherein calculating the optimized duration for the second blanking period comprises calculating based on the pixel array's efficiency in accumulating the EMR within the second waveband.
Example 27 is a method as in any of Examples 21-26, wherein the first waveband is different from the second waveband, and wherein the pixel array comprises varying degrees of sensitivity to detecting EMR within the first waveband and the second waveband.
Example 28 is a method as in any of Examples 21-27, wherein the threshold degree of exposure for visualizing the scene is consistent for each of the first data frame and the second data frame, and wherein the optimized durations for each of the first blanking period and the second blanking period are configured to compensate for the pixel array comprising the varying degrees of sensitivity to detecting EMR within the first waveband and the second waveband.
Example 29 is a method as in any of Examples 21-28, wherein the plurality of electromagnetic sources of the emitter comprises a visible source that pulses a visible wavelength of EMR and further comprises one or more of: a multispectral source that pulses a spectral wavelength of EMR; a fluorescence source that pulses a fluorescence excitation wavelength of EMR; or a mapping source that pulses a mapping pattern.
Example 30 is a method as in any of Examples 21-29, wherein the variable illumination cycle comprises an emission by the visible source and further comprises an emission by one or more of the multispectral source, the fluorescence source, or the mapping source, and wherein the variable illumination cycle is adjustable based on user input.
Example 31 is a method as in any of Examples 21-30, wherein the multispectral source comprises one or more of: a first multispectral source that pulses EMR within a waveband from about 513 nm to about 545 nm; a second multispectral source that pulses EMR within a waveband from about 545 nm to about 565 nm; or a third multispectral source that pulses EMR within a waveband from about 900 nm to about 1000 nm.
Example 32 is a method as in any of Examples 21-31, wherein the fluorescence source comprises one or more of: a first fluorescence source that pulses EMR within a waveband from about 770 nm to about 795 nm; or a second fluorescence source that pulses EMR within a waveband from about 790 nm to about 815 nm.
Example 33 is a method as in any of Examples 21-32, wherein the mapping pattern emitted by the mapping source comprises one or more of vertical hashing, horizontal hashing, a raster grid of discrete points, an occupancy grid map, or a dot array.
Example 34 is a method as in any of Examples 21-33, wherein the variable illumination cycle comprises a plurality of independent pulses of EMR, and wherein the method comprises: calculating an optimized duration for the first emission based at least in part on a sensitivity of the pixel array for detecting the EMR within the first waveband; and calculating an optimized duration for the second emission based at least in part on a sensitivity of the pixel array for detecting the EMR within the second waveband.
Example 35 is a method as in any of Examples 21-34, wherein the frame cycle is optimized for outputting a video stream comprising color image data and machine vision image data, and wherein the duration of each of the plurality of frame periods is optimized to compensate for the pixel array having varying sensitivity to illumination corresponding with the color image data and illumination corresponding with the machine vision image data.
Example 36 is a method as in any of Examples 21-35, wherein: the pixel array outputs the color image data in response to the emitter pulsing a visible wavelength of EMR; and the pixel array outputs the machine vision image data in response to the emitter pulsing one or more of a multispectral wavelength of EMR, a fluorescence excitation wavelength of EMR, or a mapping pattern.
Example 37 is a method as in any of Examples 21-36, wherein instructing the image sensor comprises reprogramming the image sensor prior to each frame period of the plurality of frame periods to set a blanking period duration for an upcoming frame period.
Example 38 is a method as in any of Examples 21-37, wherein: the image sensor outputs a plurality of datasets corresponding with the plurality of frame periods, and wherein each of the plurality of datasets comprises information for generating a data frame; two or more of the plurality of frame periods of the frame cycle comprise different blanking period durations; and the image sensor outputs the plurality of datasets to a memory buffer at irregular intervals due to the frame cycle comprising the different blanking period durations.
Example 39 is a method as in any of Examples 21-38, wherein an image processing pipeline receives the plurality of datasets stored in the memory buffer and processes the plurality of datasets at regular intervals.
Example 40 is a method as in any of Examples 21-39, wherein instructing the image sensor comprises writing to a sensor register for the image sensor to set a blanking period duration for each of the plurality of frame periods, and wherein two or more of the plurality of frame periods comprise a different blanking period duration.
Example 41 is a system as in any of Examples 1-20, wherein the plurality of electromagnetic sources comprises a multispectral source that pulses a spectral waveband of EMR, and wherein the multispectral source comprises one or more of: a first multispectral source that pulses EMR within a near infrared waveband of the electromagnetic spectrum; a second multispectral source that pulses a first narrowband of visible electromagnetic radiation; or a third multispectral source that pulses a second narrowband of visible electromagnetic radiation.
Example 42 is a system as in any of Examples 1-20, wherein the plurality of electromagnetic sources comprises a multispectral source that pulses a spectral waveband of EMR selected to elicit a spectral response from a tissue, and wherein the multispectral source comprises: a first multispectral source that pulses EMR within a first narrowband of a visible waveband of the electromagnetic spectrum, wherein the first narrowband is 20 nm wide or less; a second multispectral source that pulses EMR within a second narrowband of the visible waveband of the electromagnetic spectrum, wherein the second narrowband is 20 nm wide or less; and a third multispectral source that pulses EMR within a near infrared waveband of the electromagnetic spectrum.
It will be appreciated that various features disclosed herein provide significant advantages and advancements in the art. The following claims are exemplary of some of those features.
In the foregoing Detailed Description of the Disclosure, various features of the disclosure are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed disclosure requires more features than are expressly recited in each claim. Rather, inventive aspects lie in less than all features of a single foregoing disclosed embodiment.
It is to be understood that any features of the above-described arrangements, examples, and embodiments may be combined in a single embodiment comprising a combination of features taken from any of the disclosed arrangements, examples, and embodiments.
It is to be understood that the above-described arrangements are only illustrative of the application of the principles of the disclosure. Numerous modifications and alternative arrangements may be devised by those skilled in the art without departing from the spirit and scope of the disclosure and the appended claims are intended to cover such modifications and arrangements.
Thus, while the disclosure has been shown in the drawings and described above with particularity and detail, it will be apparent to those of ordinary skill in the art that numerous modifications, including, but not limited to, variations in size, materials, shape, form, function and manner of operation, assembly and use may be made without departing from the principles and concepts set forth herein.
Further, where appropriate, functions described herein can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the following description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Further, it should be noted that any or all the aforementioned alternate implementations may be used in any combination desired to form additional hybrid implementations of the disclosure.
Further, although specific implementations of the disclosure have been described and illustrated, the disclosure is not to be limited to the specific forms or arrangements of parts so described and illustrated. The scope of the disclosure is to be defined by the claims appended hereto, any future claims submitted here and in different applications, and their equivalents.