As less invasive medical techniques and procedures become more widespread, medical professionals such as surgeons may require articulating surgical tools, such as endoscopes, to perform procedures that access interior regions of the body via a body orifice such as the mouth.
In an aspect, a tool positioning system for performing a medical procedure on a patient includes an articulating probe and a stereoscopic imaging assembly for providing an image of a target location. The stereoscopic imaging assembly comprises: a first camera assembly comprising a first lens and a first sensor, wherein the first camera assembly is constructed and arranged to provide a first magnification of the target location; and a second camera assembly comprising a second lens and a second sensor, wherein the second camera assembly is constructed and arranged to provide a second magnification of the target location. In some embodiments, the second magnification is greater than the first magnification.
In some embodiments, the articulating probe comprises an inner probe comprising multiple articulating inner links and an outer probe surrounding the inner probe and comprising multiple articulating outer links.
In some embodiments, one of the inner probe or the outer probe is configured to transition between a rigid mode and a flexible mode, and the other of the inner probe or the outer probe is configured to transition between a rigid mode and a flexible mode and to be steered.
In some embodiments, the outer probe is configured to be steered.
In some embodiments, the tool positioning system further comprises a feeder assembly to apply forces to the inner and outer probes.
In some embodiments, the forces cause the inner and outer probes to independently advance or retract.
In some embodiments, the forces cause the inner and outer probes to independently transition between the rigid mode and the flexible mode.
In some embodiments, the forces cause the other of the inner or outer probes to be steered.
In some embodiments, the feeder assembly is positioned on a feeder cart.
In some embodiments, the tool positioning system further comprises a user interface.
In some embodiments, the user interface is configured to transmit commands to the feeder assembly to apply the forces to the inner and outer probes.
In some embodiments, the user interface comprises a component selected from the group consisting of: joystick; keyboard; mouse; switch; monitor; touchscreen; touch pad; trackball; display; audio element; speaker; buzzer; light; LED; and combinations thereof.
In some embodiments, the tool positioning system further comprises a working channel positioned between the multiple inner links and the multiple outer links and wherein the stereoscopic imaging assembly further comprises a cable positioned in the working channel. In some embodiments, at least one of the outer links comprises a side lobe positioned at an outer portion thereof, the side lobe including a side lobe channel, wherein the stereoscopic imaging assembly further comprises a cable positioned in the side lobe channel.
In some embodiments, the articulating probe is constructed and arranged to be inserted into a natural orifice of the patient.
In some embodiments, the articulating probe is constructed and arranged to be inserted through an incision in the patient.
In some embodiments, the articulating probe is constructed and arranged to provide subxiphoid entry into the patient.
In some embodiments, the tool positioning system further comprises an image processing assembly configured to receive a first image captured by the first camera assembly at the first magnification and a second image captured by the second camera assembly at the second magnification.
In some embodiments, the image processing assembly is configured to generate a two-dimensional image from the first image and the second image, the two-dimensional image having a magnification that is variable between the first magnification and the second magnification.
In some embodiments, the two-dimensional image is generated by merging at least a portion of the first image with at least a portion of the second image.
In some embodiments, as the magnification of the two-dimensional image increases from the first magnification to the second magnification, a greater percentage of the two-dimensional image is formed from the second image.
In some embodiments, at the first magnification, approximately fifty percent of the two-dimensional image is formed from the first image and approximately fifty percent of the two-dimensional image is formed from the second image.
In some embodiments, at the second magnification, approximately zero percent of the two-dimensional image is formed from the first image and approximately 100 percent of the two-dimensional image is formed from the second image.
In some embodiments, at a magnification between the first magnification and the second magnification, a lower percentage of the two-dimensional image is formed from the first image than from the second image.
In some embodiments, the magnification of the two-dimensional image is continuously variable between the first magnification and the second magnification.
In some embodiments, the first sensor and the second sensor are selected from the group consisting of charge-coupled devices (CCD), complementary metal oxide semiconductor (CMOS) devices and fiber optic-bundled sensor devices.
In some embodiments, the first camera assembly and the second camera assembly are mounted within a housing.
In some embodiments, the tool positioning system further comprises at least one LED mounted in the housing.
In some embodiments, the tool positioning system further comprises a plurality of LEDs mounted in the housing, each capable of providing differing levels of light to the target location.
In some embodiments, each of the plurality of LEDs is configured to be adjustable to provide greater light output to darker areas detected in the target image and lesser light output to lighter areas detected in the target location.
In some embodiments, the stereoscopic imaging assembly is rotatably mounted within a housing at the distal portion of the articulating probe, the housing further comprising a biasing mechanism mounted between the housing and the stereoscopic imaging assembly for applying a biasing force to the stereoscopic imaging assembly and an actuation mechanism mounted between the housing and the stereoscopic imaging assembly for rotating the stereoscopic imaging assembly within the housing in conjunction with the biasing force.
In some embodiments, the biasing mechanism comprises a spring.
In some embodiments, the actuation mechanism comprises a linear actuator.
In some embodiments, the tool positioning system further comprises an image processing assembly comprising an algorithm configured to digitally enhance the image.
In some embodiments, the algorithm is configured to adjust an image parameter selected from the group consisting of: size; color; contrast; hue; sharpness; pixel size; and combinations thereof.
In some embodiments, the stereoscopic imaging assembly is configured to provide a 3D image of the target location.
In some embodiments, a first image of the target location is captured by the first camera assembly and a second image of the target location is captured by the second camera assembly; the system being configured to manipulate a characteristic of the first image to substantially correspond to a characteristic of the second image and to combine the manipulated first image with the second image to generate a three-dimensional image of the target location.
In some embodiments, a first image of the target location is captured by the first camera assembly having a first field of view and a second image of the target location is captured by the second camera assembly having a second field of view, the second field of view being narrower than the first field of view; the system being configured to manipulate the first field of view of the first image to substantially correspond to the second field of view of the second image and to combine the manipulated first image with the second image to generate a three-dimensional image of the target location.
In some embodiments, the stereoscopic imaging assembly comprises a functional element.
In some embodiments, the functional element comprises a transducer.
In some embodiments, the transducer comprises a component selected from the group consisting of: solenoid; heat delivery transducer; heat extraction transducer; vibrational element; and combinations thereof.
In some embodiments, the functional element comprises a sensor.
In some embodiments, the sensor comprises a component selected from the group consisting of: temperature sensor; pressure sensor; voltage sensor; current sensor; electromagnetic field sensor; optical sensor; and combinations thereof.
In some embodiments, the sensor is configured to detect an undesired state of the stereoscopic imaging assembly.
In some embodiments, the tool positioning system further comprises: a third lens, constructed and arranged to provide a third magnification of the target location; and a fourth lens constructed and arranged to provide a fourth magnification of the target location; wherein a relationship between the third and fourth magnifications is different than a relationship between the first and second magnifications.
In some embodiments, the first and second sensors are in fixed positions within the stereoscopic imaging assembly and the first, second, third and fourth lenses are mounted within a rotatable bezel within the stereoscopic imaging assembly; and in a first configuration, the first and second lenses are positioned to direct light to the first and second sensors and, in a second configuration, the third and fourth lenses are positioned to direct light to the first and second sensors.
In some embodiments, the first camera assembly comprises a first value for a camera parameter, and the second camera assembly comprises a second value for the camera parameter, and wherein the camera parameter is selected from the group consisting of: field of view; f-stop; depth of focus; and combinations thereof.
In some embodiments, a ratio of the first value to the second value is approximately equal to a magnification ratio of the first camera assembly to the second camera assembly.
In some embodiments, the first lens of the first camera assembly and the second lens of the second camera assembly are each positioned in the distal portion of the articulating probe.
In some embodiments, the first sensor of the first camera assembly and the second sensor of the second camera assembly are both positioned in the distal portion of the articulating probe.
In some embodiments, the first sensor of the first camera assembly and the second sensor of the second camera assembly are both positioned proximal to the articulating probe.
In some embodiments, the tool positioning system further comprises an optical conduit optically connecting the first lens to the first sensor and the second lens to the second sensor.
In some embodiments, the second magnification is an integer value greater than the first magnification.
In some embodiments, the second magnification is twice the first magnification.
In some embodiments, the first magnification is 5× and the second magnification is 10×.
In some embodiments, the first magnification is less than 7.5× and the second magnification is at least 7.5×.
In some embodiments, the target location comprises a location selected from the group consisting of: esophageal tissue; vocal cords; colon tissue; vaginal tissue; uterine tissue; nasal tissue; spinal tissue such as tissue on the anterior side of the spine; cardiac tissue such as tissue on the posterior side of the heart; tissue to be removed from a body; tissue to be treated within a body; cancerous tissue; tissue; and combinations thereof.
In some embodiments, the tool positioning system further comprises an image processing assembly.
In some embodiments, the image processing assembly further comprises a display.
In some embodiments, the image processing assembly further comprises an algorithm.
In some embodiments, the tool positioning system further comprises an error detection process for notifying a user of the system of one or more failures in the operation of the first and second camera assemblies during a procedure.
In some embodiments, the error detection process is configured to monitor operation of the first and second camera assemblies and, upon detecting a failure of one of the first and second camera assemblies, to enable the user to continue the procedure using the other of the first and second camera assemblies.
In some embodiments, the error detection process is further configured to monitor operation of the other of the first and second camera assemblies and to cease the procedure upon detecting a failure of the other of the first and second camera assemblies.
In some embodiments, the error detection process comprises an override function.
In some embodiments, the tool positioning system further comprises a diagnostic function for determining a calibration diagnostic of the first and second camera assemblies.
In some embodiments, the diagnostic function is configured to: receive a first diagnostic image of a calibration target from the first camera assembly and a second diagnostic image of the calibration target from the second camera assembly; process the first and second diagnostic images to identify corresponding features; perform a comparison of the first and second diagnostic images based on the corresponding features; and, if the first and second diagnostic images differ by more than a predetermined amount, determine that the calibration diagnostic has failed.
In some embodiments, the tool positioning system further comprises a depth map generation assembly.
In some embodiments, the depth map generation assembly is configured to: receive a first depth map image of the target location from the first camera assembly and a second depth map image of the target location from the second camera assembly, the first and second camera assemblies being a known distance away from each other; and generate a depth map corresponding to the target location such that, the greater a disparity between a location in the first depth map image and a corresponding location in the second depth map image, the greater the depth associated with the location.
In some embodiments, the depth map generation assembly comprises a time of flight sensor aligned with an image sensor, the time of flight sensor configured to provide a depth of each pixel of an image corresponding to a portion of the target location to generate a depth map of the target location.
In some embodiments, the depth map generation assembly comprises a light-emitting device emitting a predetermined light pattern on the target location and an image sensor for detecting the light pattern on the target location; the depth map generation assembly configured to calculate a difference between the predetermined light pattern and the detected light pattern to generate the depth map.
In some embodiments, the system is further configured to generate a three-dimensional image of the target location using the depth map.
In some embodiments, the system is further configured to: rotate a first image captured by the first camera assembly to a desired position; rotate the depth map to align with the first image in the desired position; generate a second rotated image by applying the rotated depth map to the rotated first image; and generate a three-dimensional image from the rotated first image and the second rotated image.
In some embodiments, at least one of the first and second sensors is configured to capture image data at a first exposure amount in a first set of pixel lines of the at least one of the first and second sensors and image data at a second exposure amount in a second set of pixel lines of the at least one of the first and second sensors.
In some embodiments, the first set of pixel lines are odd-numbered pixel lines of the at least one of the first and second sensors and the second set of pixel lines are even-numbered pixel lines of the at least one of the first and second sensors.
In some embodiments, the first exposure amount is a high exposure amount and the second exposure amount is a low exposure amount.
In some embodiments, the first exposure amount is utilized in darker areas of an image and the second exposure amount is utilized in lighter areas of the image.
In some embodiments, the imaging assembly requires power, and the system further comprises a power source remote from the imaging assembly, wherein the power is transmitted to the imaging assembly via a power conduit.
In some embodiments, the tool positioning system further comprises an image processing assembly, wherein image data is recorded by the imaging assembly and transmitted to the image processing assembly via the power conduit.
In some embodiments, the tool positioning system further comprises a differential signal driver configured to AC couple the image data to the power conduit.
In another aspect, a stereoscopic imaging assembly for providing an image of a target location, comprises: a first sensor mounted within a housing; a second sensor mounted within the housing; and a variable lens assembly rotatably mounted within the housing, wherein, at various positions of the variable lens assembly, image data at different levels of magnification is provided to each of the first and second sensors by the variable lens assembly.
In some embodiments, the variable lens assembly comprises an Alvarez lens.
In another aspect, a method for capturing an image of a target location comprises providing an articulating probe comprising a distal portion, and providing a stereoscopic imaging assembly, a portion of which is positioned at the distal portion of the articulating probe, for providing an image of a target location. The stereoscopic imaging assembly may comprise: a first camera assembly comprising a first lens and a first sensor, wherein the first camera assembly is constructed and arranged to provide a first magnification of the target location; and a second camera assembly comprising a second lens and a second sensor, wherein the second camera assembly is constructed and arranged to provide a second magnification of the target location, wherein the second magnification is greater than the first magnification. The distal portion of the articulating probe is positioned at the target location, and the image of the target location is captured using the stereoscopic imaging assembly.
In some embodiments, the method further comprises providing the captured image at a user interface.
The foregoing and other objects, features and advantages of embodiments of the present inventive concepts will be apparent from the more particular description of preferred embodiments, as illustrated in the accompanying drawings in which like reference characters refer to the same elements throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the preferred embodiments.
The terminology used herein is for the purpose of describing particular embodiments and is not intended to be limiting of the inventive concepts. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It will be understood that, although the terms first, second, third etc. may be used herein to describe various limitations, elements, components, regions, layers and/or sections, these limitations, elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one limitation, element, component, region, layer or section from another limitation, element, component, region, layer or section. Thus, a first limitation, element, component, region, layer or section discussed below could be termed a second limitation, element, component, region, layer or section without departing from the teachings of the present application.
It will be further understood that when an element is referred to as being “on” or “connected” or “coupled” to another element, it can be directly on or above, or connected or coupled to, the other element or intervening elements can be present. In contrast, when an element is referred to as being “directly on” or “directly connected” or “directly coupled” to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., “between” versus “directly between,” “adjacent” versus “directly adjacent,” etc.). When an element is referred to herein as being “over” another element, it can be over or under the other element, and either directly coupled to the other element, or intervening elements may be present, or the elements may be spaced apart by a void or gap.
It will be further understood that when a first element is referred to as being “in”, “on”, “at” and/or “within” a second element, the first element can be positioned: within an internal space of the second element; within a portion of the second element (e.g. within a wall of the second element); positioned on an external and/or internal surface of the second element; and combinations of one or more of these, but is not limited thereto.
To the extent that functional features, operations, and/or steps are described herein, or otherwise understood to be included within various embodiments of the inventive concept, such functional features, operations, and/or steps can be embodied in functional blocks, units, modules, operations and/or methods. And to the extent that such functional blocks, units, modules, operations and/or methods include computer program code, such computer program code can be stored in a computer readable medium, e.g., such as non-transitory memory and media, that is executable by at least one computer processor.
In the following description, references are made to the capturing, manipulation and processing of images. It will be understood that this may refer to single, still images and may also refer to an image as a single frame in a video stream. In the latter case, the video stream may be comprised of many images as frames in the stream.
Interface unit 200 may include a processor 210, including software 225. Software 225 can include one or more algorithms, routines, and/or other processes (“algorithms” herein), for execution by processor 210, which enable the operation of the articulating probe system 10 described herein. User interface 230 of interface unit 200 may correspond to human interface device HID 202 for receiving tactile commands from a surgeon, technician and/or other operator of system 10, and display 201 for providing visual and/or auditory feedback, as shown in
When performing investigative or surgical procedures, it is imperative that the operator of the articulating probe 100 have a clear and, at certain points in a procedure, magnified view of the environment through which the articulating probe is guided and of the inspection or surgical site itself during a procedure. Typical environments, also referred to as “target locations,” include anatomical locations with tissue types selected from the group consisting of: esophageal tissue; vocal cords; colon tissue; vaginal tissue; uterine tissue; nasal tissue; spinal tissue such as tissue on the anterior side of the spine; cardiac tissue such as tissue on the posterior side of the heart; tissue to be removed from a body; tissue to be treated within a body; cancerous tissue; tissue; and combinations thereof. It is important that the operator be able to zoom in on, or magnify, the site to ensure precision and to facilitate better intra-operative decisions. A challenge arises from the difficulty of providing a true optical zoom, which provides higher magnification while also providing the same or better optical detail to the user. Movable zoom lenses, which include multiple lenses that are moved relative to one another to change the magnification of the system, are commonly used to enable a user of a camera system to zoom in on or magnify an object. However, such lens systems, even in miniaturized form, may be too bulky to be used in procedures such as the type of procedures that the articulating probe 100 is used to perform. Such systems may also be very expensive and, in a case where the feeder top assembly 330 (of
Distal portion 108 of articulating probe 100 may include a stereoscopic imaging assembly 130 coupled to distal outer link 112, including a first camera assembly 135a and a second camera assembly 135b. According to aspects of the inventive concept, camera assemblies 135a, 135b may each include a fixed-magnification lens 132a, 132b and an optical assembly 133a, 133b. Optical assemblies 133a, 133b may be charge-coupled devices (CCD), complementary metal oxide semiconductor (CMOS) devices, fiber optic-bundled systems, or any other technology suitable for this application.
According to an embodiment of the inventive concept, lenses 132a and 132b may have different levels of magnification. For example, lens 132a may have a first magnification that provides a first field of view, FOV1, and lens 132b may have a second magnification that provides a second field of view, FOV2. As shown in
According to an aspect of the inventive concept, LEDs 138a-138d may be controlled individually to optimize the view provided to the operator and to the stereographic image assembly 130. Upon receiving images from the optical assemblies 133a, 133b, the processor 210, based on an image analysis performed by the image processing assembly 220, may vary the intensity of light provided by each LED 138, to enable uniform exposure across the image. In another embodiment, pixel illumination in each quadrant of the optical assembly may be analyzed and the output of corresponding LEDs controlled to optimize the resulting images.
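As a rough illustration of this per-quadrant control, the sketch below derives drive levels for four LEDs from the mean brightness of each image quadrant; the quadrant-to-LED mapping, the target brightness, and the scaling constants are illustrative assumptions rather than the control law executed by processor 210.

```python
import numpy as np

def led_levels_from_frame(gray_frame, target_mean=128, max_level=255):
    """Estimate per-quadrant LED drive levels: darker quadrants get more light.

    gray_frame: 2-D array of pixel intensities (0-255).
    Returns a dict mapping quadrant name to an LED drive level (0-max_level).
    """
    h, w = gray_frame.shape
    quadrants = {
        "top_left": gray_frame[: h // 2, : w // 2],
        "top_right": gray_frame[: h // 2, w // 2:],
        "bottom_left": gray_frame[h // 2:, : w // 2],
        "bottom_right": gray_frame[h // 2:, w // 2:],
    }
    levels = {}
    for name, quad in quadrants.items():
        mean = float(np.mean(quad))
        # Drive the LED harder the further the quadrant falls below the
        # target brightness; clamp to the valid drive range.
        gain = target_mean / max(mean, 1.0)
        levels[name] = int(np.clip(gain * (max_level / 2), 0, max_level))
    return levels
```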
The two-dimensional images captured by each of the camera assemblies 135a and 135b are transmitted to image processing assembly 220 via optical conduits 134a and 134b, respectively, and optical receiver 221. According to an aspect of the inventive concept, the received 2D image frames may be processed by the image processing assembly 220 to produce corresponding 3D image frames. This process is generally shown in flowchart 1000 of
The multiple-camera system also enables the generation of an image having a range of simulated continuous magnification levels between the magnification levels of the camera assemblies 135a, 135b, by combining image data from each of the camera assemblies. The configuration of images of various magnification levels is described with reference to
Typically, the user performing a surgical procedure is concerned mostly with the middle of the visible workspace displayed on display 201. Inserting the higher resolution image of
With the two images overlaid, as shown in
Likewise, at a relative magnification factor of 2, which in the example described above is 10×, the image output from the image processing assembly 220 consists of approximately 100% of the FOV1 image captured by camera assembly 135a, with approximately 0% contribution by the FOV2 image captured by camera assembly 135b. This is shown at 182 in
At magnification levels in between 5× and 10×, the images captured by camera assemblies 135a and 135b contribute to the output-magnified image based on the proportion of the magnification level. For example, for an output image at 7.5× (or a relative magnification factor of 1.5, shown at 184 in
For an output image lower than 7.5× (or a relative magnification factor of 1.5), the FOV1 image captured by narrow field camera assembly 135a makes up a lower percentage of the resulting output image and the FOV2 image captured by wide field camera assembly 135b makes up a higher percentage of the resulting output image. Likewise, for an output image higher than 7.5× (or a relative magnification factor of 1.5), the FOV1 image captured by narrow field camera assembly 135a makes up a higher percentage of the resulting output image and the FOV2 image captured by wide field camera assembly 135b makes up a lower percentage of the resulting output image.
Generally, an image output by image processing assembly 220 may comprise the entire FOV1 image captured by camera assembly 135a, which may make up between approximately 50% and 100% of the output image, depending on the magnification factor applied to the output image. Further, depending on the magnification factor applied to the output image, between approximately 0% and 50% of the output image may comprise at least a portion of the FOV2 image captured by camera assembly 135b. Magnifications closer to a magnification factor of 1 will comprise a greater portion of the FOV2 image, while magnifications closer to a magnification factor of 2 will comprise a smaller portion of the FOV2 image. In each instance, the resulting image may be scaled up or down in size by processing software 225 to a size accommodated by the display 201.
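A minimal sketch of this blending is shown below, assuming the narrow-field assembly provides exactly twice the magnification of the wide-field assembly and that the narrow image is optically centered within the wide image; the crop-and-inset geometry and the OpenCV resize calls are illustrative stand-ins for the processing performed by image processing assembly 220 and software 225.

```python
import cv2  # illustrative dependency; any crop/resize library would serve
import numpy as np

def blend_zoom(wide_img, narrow_img, zoom):
    """Produce a digitally zoomed frame from a wide-field (low magnification)
    image and a narrow-field (high magnification) image of the same scene.

    zoom: relative magnification factor, 1.0 (wide limit) to 2.0 (narrow limit),
          assuming the narrow camera has twice the magnification of the wide one.
    """
    h, w = wide_img.shape[:2]

    # Crop the wide image to the region the requested zoom would show, centered
    # on the optical axis, then scale it back up to the display size.
    crop_w, crop_h = int(w / zoom), int(h / zoom)
    x0, y0 = (w - crop_w) // 2, (h - crop_h) // 2
    base = cv2.resize(wide_img[y0:y0 + crop_h, x0:x0 + crop_w], (w, h))

    # At zoom = 2 the narrow image fills the frame; at lower zoom it occupies a
    # proportionally smaller central inset of the displayed region.
    inset_scale = zoom / 2.0
    inset_w, inset_h = int(w * inset_scale), int(h * inset_scale)
    inset = cv2.resize(narrow_img, (inset_w, inset_h))

    # Replace the center of the upscaled wide image with the higher-detail inset.
    ix, iy = (w - inset_w) // 2, (h - inset_h) // 2
    out = base.copy()
    out[iy:iy + inset_h, ix:ix + inset_w] = inset
    return out
```

In practice, the proportions of the inset and the surrounding wide-field data would be tuned to match the approximately 50%-to-100% FOV1 contribution described above, and the seam between the two sources would be blended rather than hard-edged.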
In an embodiment having more than two camera assemblies, more image data may be utilized to provide the generated zoom images at magnifications between those provided by each camera assembly.
To provide further granularity during the continuous zoom phase, the output image may be further improved with a number of image processing features that provide a digital enhancement of the image. Examples may include sizing, detail, color and other parameters.
According to another aspect of the present inventive concept, the stereoscopic image assembly 130 may include an error detection process that may provide a redundancy feature for the articulating probe system 10. Accordingly, in the event that one of camera assemblies 135a, 135b fails during a procedure, the operator would be given the option to continue the procedure using the single operating camera assembly, for example by using an override function provided by the error detection process. This process is depicted in flowchart 1400 of
However, if, in Step 1406, a failure of one of the camera assemblies 135a, 135b is detected, the operator may be notified of the failure through the user interface 230 and queried about continuing the procedure using only the remaining operable camera assembly, Step 1408. If the operator chooses not to continue in Step 1412, the procedure is terminated for replacement of the faulty camera assembly, Step 1416. If, in Step 1412, the operator chooses to continue the procedure, which choice may be communicated to the processor 210 via the user interface 230, the procedure is continued in a “single camera mode,” Step 1414. Processor 210 continues to monitor the functionality of the remaining camera assembly, Step 1418. As long as a second failure is not detected, Step 1420, the procedure is continued, Step 1422. If a second failure is detected in Step 1420, the procedure is terminated, Step 1416. In connection with the foregoing, a failure could be any type of degradation of the ability of a camera assembly to provide optimal quality images, for example, complete mechanical or electrical failure or even the associated lens being fouled with debris that prevents it from operating properly.
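The branching of Steps 1406 through 1422 can be summarized as a simple state update, sketched below; the notification and operator-confirmation callbacks are hypothetical placeholders for the interactions with user interface 230.

```python
def handle_camera_status(status_a, status_b, mode, notify, query_continue):
    """One pass of the failure-handling flow.

    status_a / status_b: True while the corresponding camera assembly is healthy.
    mode: "stereo", "single_a", "single_b", or "terminated".
    Returns the updated mode.
    """
    if mode == "stereo":
        if status_a and status_b:
            return "stereo"                       # no failure detected
        if not status_a and not status_b:
            notify("Both camera assemblies failed; terminating procedure.")
            return "terminated"
        failed = "first" if not status_a else "second"
        notify(f"The {failed} camera assembly has failed.")
        if not query_continue():                  # operator declines single-camera mode
            return "terminated"
        return "single_b" if not status_a else "single_a"

    if (mode == "single_a" and not status_a) or (mode == "single_b" and not status_b):
        notify("Remaining camera assembly failed; terminating procedure.")
        return "terminated"
    return mode
```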
In order to ensure that both camera assemblies 135a, 135b are operating properly, a system diagnostic procedure may be undertaken. An example calibration procedure will now be described with reference to flowchart 1500 of
This diagnostic procedure may be undertaken at the beginning of each surgical procedure, and also periodically or continuously throughout the procedure. The data acquired through the diagnostic procedure may be utilized in the functionality monitoring procedure described with reference to
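As one illustration of the comparison step, the sketch below matches features between the two diagnostic images of the calibration target and fails the diagnostic when their positions differ by more than a threshold; the ORB detector, the brute-force matcher, and the pixel threshold are assumed for illustration, and a real check would also account for the expected disparity and magnification difference between the two camera assemblies.

```python
import cv2  # feature detection/matching used only for illustration
import numpy as np

def calibration_diagnostic(img_a, img_b, max_mean_offset_px=2.0):
    """Return True if the two diagnostic images agree within the threshold."""
    orb = cv2.ORB_create()
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return False  # no detectable features: treat as a failed diagnostic

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    if not matches:
        return False

    # Compare positions of corresponding features; a large average offset
    # beyond the expected geometry indicates the assemblies are out of calibration.
    offsets = [
        np.linalg.norm(np.array(kp_a[m.queryIdx].pt) - np.array(kp_b[m.trainIdx].pt))
        for m in matches
    ]
    return float(np.mean(offsets)) <= max_mean_offset_px
```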
Outer link 150a further may include a motor (not shown) for driving a gear 151, which is mated to outer teeth configuration 156 of rotating lens housing 155a. As described above, lenses 135a-135d may have different magnification levels. Therefore, to change the zoom range of images captured by the optical assemblies 133a and 133b, rotating lens housing 155a may be rotated 90 degrees about an axis 152 by driving gear 151, to position lenses 135c and 135d over optical assemblies 133b and 133a, respectively. This may provide the stereoscopic image assembly 130 with a different range of magnification than that provided by lenses 135a and 135b.
As shown in
Stereographic image assembly 130 may be rotatable within a rotatable housing 165, within housing 164 of distal link 160, about central axis 162. A biasing spring 161 may be attached at one end to housing 164 and at the other end to stereographic image assembly 130 to provide a biasing force between the two components. Countering the biasing force is a linear actuator 163, also coupled between the housing 164 and the stereographic image assembly 130. Linear actuator 163 may comprise a device having a length that is electrically or mechanically controllable to enable it to exert a force against the biasing force provided by the spring 161, which enables the stereographic image assembly 130 to be controllably rotated within the housing 164. Examples of such linear actuators include a solenoid device, a nitinol wire, or another device having similar properties. The biasing spring 161 is configured to allow a known amount of positive and negative offset from a position of the camera in which the camera axis 170, bisecting the camera assemblies 135a and 135b, is aligned with the surgical horizon. Such a position is shown in
Referring back to
During surgical procedures, the lighting requirements can change drastically and quickly. In some cases, the amount of light required to fully illuminate the surgical field may be beyond the capability of a lighting system associated with the stereographic image assembly 130. To compensate for low-light or high-light conditions, the exposure parameters of the optical assemblies 133a, 133b may be altered to allow the pixels in the sensors in the optical assemblies 133a, 133b more or less time to integrate the photons that are received into a signal that is relayed to the image processing assembly 220. For example, if the surgical site is very dark, the exposure of a sensor may be increased to allow more photons to reach the sensor and produce a brighter image. Conversely, if the surgical site is very bright, the exposure may be shortened to allow less light to reach the sensor, resulting in lower probabilities of sensor saturation.
While increasing or decreasing the exposure may account for one lighting condition at a time, in the case of positioning the articulating probe 100 and during a surgical procedure, lighting conditions can change rapidly, or can vary across the target area within a single frame. Therefore, high dynamic range processing may be used to enable the operator to capture images with different exposures and combine them into an optimized image, compensated for the lighting variations across the optical assembly. To accomplish this, images having multiple exposure settings may be taken by alternating horizontal rows of pixels within the sensor of the optical assembly between different exposure settings.
An aspect of the inventive concept is to improve the performance of the camera assemblies 135a, 135b in high dynamic range situations while meeting the low-latency requirements for robotic surgery. This would be, for example, when certain regions of the image are very well exposed with sufficient lighting while other regions of the image are underexposed and darker. In an embodiment, a mode of each camera assembly 135a, 135b may be activated that provides alternating lines of different exposure. The odd pixel lines may be configured for a higher exposure time in order to capture greater image detail in darker regions. The even pixel lines may be configured for a lower exposure time in order to capture image detail in highly illuminated regions. It will be understood that any configuration of pixel lines and varying amounts of exposure may be utilized according to various aspects of the inventive concept. For example, every third pixel line may be configured for a higher exposure time relative to a lower exposure time for the two pixel lines therebetween. Any combination of high or low exposure times corresponding to any combination or configuration of pixel lines is considered within the scope of the inventive concept.
In an embodiment, the output of the camera sensor 133′ may be processed as follows: the captured image or video stream may be input to a custom image processing apparatus, for example, an FPGA designed to perform exposure fusion (the combination of the high and low exposure data) into a single processed image. Any saturated regions of the image may be better represented due to the apparatus applying a higher weighting to the short exposure data from the even pixel lines. Any dark regions may be better represented due to the apparatus applying a higher weighting to the long exposure data from the odd pixel lines. The processing may then allow for additional tone mapping of the resulting image to enhance or reduce contrast. The apparatus may use frame buffers and/or line buffers to store data for processing within the processing apparatus. The apparatus may process video in real-time with just a small additional latency due to the data buffering.
This process is outlined in flowchart 1800 of
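A simplified software analogue of that exposure fusion is sketched below; the line-indexing convention, the weighting ramp, and the exposure-ratio constant are assumptions standing in for the FPGA pipeline described above, and the result is produced at half the vertical resolution for simplicity.

```python
import numpy as np

def fuse_alternating_exposures(raw, exposure_ratio=4.0):
    """Fuse a frame whose alternating pixel lines were captured at long and
    short exposure times into a single high dynamic range image.

    raw: 2-D array of 8-bit sensor intensities.
    exposure_ratio: assumed ratio of long to short exposure time.
    """
    long_lines = raw[1::2].astype(np.float32)   # long-exposure lines
    short_lines = raw[0::2].astype(np.float32)  # short-exposure lines
    n = min(len(long_lines), len(short_lines))
    long_lines, short_lines = long_lines[:n], short_lines[:n]

    # Weight the short-exposure data more heavily where the long exposure is
    # near saturation, and the long-exposure data more heavily in dark regions.
    w_short = np.clip((long_lines - 200.0) / 55.0, 0.0, 1.0)
    fused = (1.0 - w_short) * long_lines + w_short * short_lines * exposure_ratio
    return np.clip(fused, 0, 255).astype(np.uint8)
```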
As described above with reference to
In standard 2D image rotation, the image is rotated about the center of the native image, as shown in
According to an aspect of the inventive concept, the above issues may be rectified by generating a depth map of the scene that provides a pixel-by-pixel depth representation of the captured image. In an embodiment, camera assemblies 135a, 135b each capture an image of a target area. Since the camera assemblies have a known distance between them, the view from each camera assembly, relative to a reference point, will be different from each other. A difference between the two images may be calculated relative to the reference point to generate a depth map, which in combination with one of the two images, may be used to regenerate the second of the two images (e.g. after a rotation has been performed, as described herein). The depth map along with the one image can be individually rotated, such that the regenerated image is also rotated, and the pair can be displayed as a digitally rotated stereoscopic pair.
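A minimal sketch of that disparity-based approach is shown below, assuming rectified, 8-bit grayscale frames from the two camera assemblies and using an off-the-shelf block matcher as a stand-in for the depth map generation assembly; the pixel-shift regeneration ignores occlusions and is intended only to illustrate how one image plus the depth map can reproduce, and therefore rotate, the second view.

```python
import cv2  # StereoBM used only as an illustrative block matcher
import numpy as np

def disparity_map(left_gray, right_gray, num_disparities=64, block_size=15):
    """Estimate a per-pixel disparity (depth) map from the two camera views,
    which sit a known, fixed distance apart."""
    matcher = cv2.StereoBM_create(numDisparities=num_disparities, blockSize=block_size)
    # StereoBM returns fixed-point disparities scaled by 16.
    return matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

def regenerate_second_view(first_img, disparity):
    """Re-render the second view by shifting each pixel of the first image
    horizontally by its disparity, so a rotated (image, depth map) pair can be
    displayed as a digitally rotated stereoscopic pair."""
    h, w = disparity.shape
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    map_x = xs + np.clip(disparity, 0, None)  # occlusions/holes are ignored here
    return cv2.remap(first_img, map_x, ys, cv2.INTER_LINEAR)
```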
Referring now to
The greater the positional disparity of the tools from the center of the left eye and right eye images (2D), the greater the depth associated with that object (or the pixels that make up that object) from the imaging system. Therefore, as shown in
Alternatively, a depth map may be created using an image sensor to capture a 2D image and a “time of flight” sensor that has been aligned to the image sensor. The “time of flight” sensor could provide the depth of each pixel, and software could align the 2D image to the data received from the time of flight sensor to generate a depth map. Another system could include a light-emitting device for emitting a known light pattern and an image sensor for detecting the pattern on the target area. The system could then calculate the difference between the pattern detected by the image sensor and the known pattern that was emitted, and generate a depth map from that difference.
The top assembly 330 includes an articulating probe 100, for example comprising a link assembly including an inner link mechanism comprising a plurality of inner links and an outer link mechanism comprising a plurality of outer links, as described in connection with various embodiments herein and as described herebelow in reference to
In some embodiments, the base assembly 320 is operably connected to the interface unit 200, such connection typically including electrical wires, optical fibers, or wireless communications for transmission of power and/or data, or mechanical transmission conduits such as mechanical linkages or pneumatic/hydraulic delivery tubes (conduit 301 shown). The interface unit 200 includes a user interface 230, comprising a human interface device HID 202 for receiving tactile commands from a surgeon, technician and/or other operator of system 10, and a display 201 for providing visual and/or auditory feedback. The interface unit 200 can likewise be positioned on an interface cart 205, which is mounted on wheels 205a (e.g. lockable wheels) to allow for manual manipulation of its position. Base assembly 320 can comprise a processor 210, including an image processing unit 220 and software 225, as described hereabove in reference to
In some embodiments, one mechanism starts limp and the other starts rigid. For the sake of explanation, assume the outer link mechanism 110 is rigid and the inner link mechanism 120 is limp, as seen in step 1 in
In medical applications, operations, procedures, and so on, once the probe 100 arrives at a desired location, the operator, such as a surgeon, can slide one or more tools through one or more working channels of outer link mechanism 110, inner link mechanism 120, or one or more working channels formed between outer link mechanism 110 and inner link mechanism 120, such as to perform various diagnostic and/or therapeutic procedures. In some embodiments, the channel is referred to as a working channel that can, for example, extend between first recesses formed in a system of outer links and second recesses formed in a system of inner links. Working channels may be included on the periphery of articulating probe 100, such as working channels comprising one or more radial projections extending from outer link mechanism 110, these projections including one or more holes sized to slidingly receive one or more tools. As described with reference to other embodiments, working channels may be positioned at an outer location of the articulating probe 100.
In addition to clinical procedures such as surgery, articulating probe 100 can be used in numerous applications including but not limited to: engine inspection, repair or retrofitting; tank inspection and repair; surveillance applications; bomb disarming; inspection or repair in tightly confined spaces such as submarine compartments or nuclear weapons; structural inspections such as building inspections; hazardous waste remediation; biological sample and toxin recovery; and combinations of these. Clearly, the device of the present disclosure has a wide variety of applications and should not be taken as being limited to any particular application.
Inner link mechanism 120 and/or outer link mechanism 110 are steerable, and inner link mechanism 120 and outer link mechanism 110 can each be made both rigid and limp, allowing articulating probe 100 to drive anywhere in three dimensions while being self-supporting. Articulating probe 100 can “remember” each of its previous configurations and, for this reason, articulating probe 100 can retract from and/or retrace to anywhere in a three-dimensional volume such as the intracavity spaces in the body of a patient such as a human patient.
The inner link mechanism 120 and outer link mechanism 110 each include a series of links, i.e. inner links 121 and outer links 111 respectively, that articulate relative to each other. In some embodiments, the outer links are used to steer and lock the probe, while the inner links are used to lock the articulating probe 100. In “follow the leader” fashion, while the inner links 121 are locked, the outer links 111 are advanced beyond a distal-most inner link 122. The outer links 111 are steered into position by the system steering cables, and then locked by locking the steering cables. The cable of the inner links 121 is then released and the inner links 121 are advanced to follow the outer links. The procedure progresses in this manner until a desired position and orientation are achieved. The combined inner links 121 and outer links 111 may include working channels for temporary or permanent insertion of tools at the surgery site. In some embodiments, the tools can advance with the links during positioning of the probe. In some embodiments, the tools can be inserted through the links following positioning of the probe.
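The advance-and-steer cycle described above can be summarized in pseudocode; the probe-mechanism objects and their lock, advance, and steer methods below are hypothetical, intended only to make the sequence of operations explicit, and do not correspond to a specific software interface of system 10.

```python
def follow_the_leader_step(inner, outer, steer_command, links_to_advance=1):
    """One cycle of follow-the-leader advancement, using hypothetical
    mechanism objects that expose lock/unlock, advance, and steer operations."""
    inner.lock()                           # inner links hold the current shape
    outer.unlock()
    outer.advance(links=links_to_advance)  # outer links pass the distal-most inner link
    outer.steer(steer_command)             # steering cables position the advanced links
    outer.lock()                           # lock steering cables to hold the new shape
    inner.unlock()
    inner.advance(links=links_to_advance)  # inner links follow the outer links
    inner.lock()
```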
One or more outer links 111 can be advanced beyond the distal-most inner link prior to the initiation of an operator controlled steering maneuver, such that the quantity extending beyond the distal-most inner link will collectively articulate based on steering commands. Multiple link steering can be used to reduce procedure time, such as when the specificity of single link steering is not required. In some embodiments, between 2 and 20 outer links can be selected for simultaneous steering, such as between 2 and 10 outer links or between 2 and 7 outer links. The number of links used to steer corresponds to achievable steering paths, with smaller numbers enabling more specificity of curvature of probe 100. In some embodiments, an operator can select the number of links used for steering (e.g. to select between 1 and 10 links to be advanced prior to each steering maneuver).
While the inventive concept has been described for use in connection with a surgical probe device, it will be understood that it is equally suitable for use in connection with any type of device where stereoscopic imaging may be advantageous or desired, such as a line-of-sight robot 500, including tools 520a, 520b and camera assembly 530, as shown in
Circuit 140 comprises a voltage regulator 141 and inductor 144. Voltage regulator 141 is configured to receive power from transmit assembly 250 and provide power to circuit 140. Voltage regulator 141 may comprise a low-dropout (LDO) voltage regulator configured to step down the voltage provided to circuit 140. Regulator 141 is configured to provide clean, stable voltage rails for optical assembly 133. Inductor 144 may be selected to limit 300-400 MHz signal noise on conduit 134′. Circuit 140 further comprises a differential signal driver 142 that receives optical data from optical assembly 133. Differential signal driver 142 transmits the received optical data to differential signal receiver 242 by AC coupling the data to conduit 134′. Differential signal receiver 242 may decouple the optical data from conduit 134′, and transmit the data to image processing assembly 220 of processor 210.
While the preferred embodiments of the devices and methods have been described in reference to the environment in which they were developed, they are merely illustrative of the principles of the present inventive concepts. Modifications or combinations of the above-described assemblies, other embodiments, configurations, and methods for carrying out the invention, and variations of aspects of the invention that are obvious to those of skill in the art are intended to be within the scope of the claims. In addition, where this application has listed the steps of a method or procedure in a specific order, it may be possible, or even expedient in certain circumstances, to change the order in which some steps are performed, and it is intended that the particular steps of the method or procedure claim set forth herebelow not be construed as being order-specific unless such order specificity is expressly stated in the claim.
This application claims the benefit of U.S. Provisional Application No. 62/401,390, filed Sep. 29, 2016, the content of which is incorporated herein by reference in its entirety. This application claims the benefit of U.S. Provisional Application No. 62/504,175, filed May 10, 2017, the content of which is incorporated herein by reference in its entirety. This application claims the benefit of U.S. Provisional Application No. 62/517,433, filed Jun. 9, 2017, the content of which is incorporated herein by reference in its entirety. This application claims the benefit of U.S. Provisional Application No. 62/481,309, filed Apr. 4, 2017, the content of which is incorporated herein by reference in its entirety. This application claims the benefit of U.S. Provisional Application No. 62/533,644, filed Jul. 17, 2017, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/921,858, filed Dec. 30, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No PCT/US2014/071400, filed Dec. 19, 2014, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/892,750, filed Nov. 20, 2015, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/406,032, filed Oct. 22, 2010, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No PCT/US2011/057282, filed Oct. 21, 2011, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 13/880,525, filed Apr. 19, 2013, now U.S. Pat. No. 8,992,421, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/587,166, filed Dec. 31, 2014, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/492,578, filed Jun. 2, 2011, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US12/40414, filed Jun. 1, 2012, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/119,316, filed Nov. 21, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/412,733, filed Nov. 11, 2010, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No PCT/US2011/060214, filed Nov. 10, 2011, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 13/884,407, filed May 9, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 15/587,832, filed May 5, 2017, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/472,344, filed Apr. 6, 2011, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US12/32279, filed Apr. 
5, 2012, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/008,775, filed Sep. 30, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/944,665, filed Nov. 18, 2015, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/945,685, filed Nov. 19, 2015, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/534,032 filed Sep. 13, 2011, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US12/54802, filed Sep. 12, 2012, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/343,915, filed Mar. 10, 2014, now U.S. Pat. No. 9,757,856, issued Sep. 12, 2017, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 15/064,043, filed Mar. 8, 2016, now U.S. Pat. No. 9,572,628, issued Feb. 21, 2017, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 15/684,268, filed Aug. 23, 2017, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/368,257, filed Jul. 28, 2010, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No PCT/US2011/044811, filed Jul. 21, 2011, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 13/812,324, filed Jan. 25, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/578,582, filed Dec. 21, 2011, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US12/70924, filed Dec. 20, 2012, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/364,195, filed Jun. 10, 2014, now U.S. Pat. No. 9,364,955 issued Jun. 14, 2016, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 15/180,503, filed Jun. 13, 2016, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/681,340, filed Aug. 9, 2012, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US13/54326, filed Aug. 9, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/418,993, filed Feb. 2, 2015, now U.S. Pat. No. 9,675,380 issued Jun. 13, 2017, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 15/619,875, filed Jun. 12, 2017, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. 
Provisional Application No. 61/751,498, filed Jan. 11, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US14/10808, filed Jan. 9, 2014, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/759,020, filed Jan. 9, 2014, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/656,600, filed Jun. 7, 2012, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US13/43858, filed Jun. 3, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/402,224, filed Nov. 19, 2014, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/825,297, filed May 20, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US13/38701, filed May 20, 2014, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/888,541, filed Nov. 2, 2015, now U.S. Pat. No. 9,517,059, issued Dec. 13, 2016, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 15/350,549, filed Nov. 14, 2016, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/818,878, filed May 2, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US14/36571, filed May 2, 2014, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/888,189, filed Oct. 30, 2015, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 61/909,605, filed Nov. 27, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 62/052,736, filed Sep. 19, 2014, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US14/67091, filed Nov. 24, 2014, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 15/038,531, filed May 23, 2016, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 62/008,453 filed Jun. 5, 2014, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US15/34424, filed Jun. 5, 2015, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 15/315,868, filed Dec. 2, 2016, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. Provisional Application No. 62/150,223, filed Apr. 20, 2015, the content of which is incorporated herein by reference in its entirety. 
This application is related to U.S. Provisional Application No. 62/299,249, filed Feb. 24, 2016, the content of which is incorporated herein by reference in its entirety. This application is related to PCT Application No. PCT/US16/28374, filed Apr. 20, 2016, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 11/630,279, filed Dec. 20, 2006, published as U.S. Patent Application Publication No. 2009/0171151, the content of which is incorporated herein by reference in its entirety.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2017/054297 | 9/29/2017 | WO | 00 |
Number | Date | Country |
---|---|---|
62401390 | Sep 2016 | US |
62481309 | Apr 2017 | US |
62504175 | May 2017 | US |
62517433 | Jun 2017 | US |
62533644 | Jul 2017 | US |