For the practitioner, the field of diagnostic imaging, for example endoscopy, has allowed for the viewing of objects, internal mechanisms and the like with minimal disruption to the subject being examined. Such imaging tools have been used in a wide variety of settings for detailed inspection, including but not limited to use and application in the field of medicine.
Of particular challenge in using imaging, for example in the medical field, is the vast amount of equipment typically required, the maintenance of such equipment, and the cabling required for connection to other systems. The array of equipment required to accomplish an imaging application in the prior art includes monitor systems, lighting systems and power systems. In addition, these systems may be permanently or semi-permanently installed in small offices or operating rooms, for example, which requires those offices and rooms to be adapted in a potentially less than ideal fashion so as to accommodate the cumbersome imaging equipment. Further, the need to install imaging system components may require the duplication of such imaging systems in other offices and rooms.
Compounding the above-mentioned problem is the requirement that many of these imaging system components must utilize cabling to function. Cables that transfer electrical, optical or mechanical signals, for example, may physically interfere with objects and persons in the room, such as a patient. In some cases, cables for light transmission, for example relatively inflexible fiber optic cables, may break if over-flexed and thus compromise the outcome of the imaging application.
An additional challenge for imaging technology found in the prior art is the use of an external monitor that may be located some distance from the practitioner. In such cases, the practitioner is required to view the monitor in one direction while physically introducing or manipulating the imaging device in a different direction, potentially compromising the detail and accuracy of the use of the imaging tool.
Another problem with such imaging systems is that they may require external power. The system must therefore be located relatively close to power outlets supplying the required voltage. Since various countries do not share a common power connector or the same voltage output, additional adapters must be utilized for these systems to function.
Another challenge faced by imaging systems is satisfaction of the goals of sterility and reusability. Imaging systems must be sterile in order to be employed for their intended applications. While sterility can be accomplished by using a device only once, such approaches are wasteful. However, reusing a device poses significant challenges with respect to maintaining sterility.
Hand-held minimally dimensioned diagnostic devices having integrated distal end visualization are provided. Also provided are systems that include the devices, as well as methods of using the devices, e.g., to visualize internal tissue of a subject.
Before the present invention is described in greater detail, it is to be understood that this invention is not limited to particular embodiments described, as such may, of course, vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting, since the scope of the present invention will be limited only by the appended claims.
Where a range of values is provided, it is understood that each intervening value, to the tenth of the unit of the lower limit unless the context clearly dictates otherwise, between the upper and lower limit of that range and any other stated or intervening value in that stated range, is encompassed within the invention. The upper and lower limits of these smaller ranges may independently be included in the smaller ranges and are also encompassed within the invention, subject to any specifically excluded limit in the stated range. Where the stated range includes one or both of the limits, ranges excluding either or both of those included limits are also included in the invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Although any methods and materials similar or equivalent to those described herein can also be used in the practice or testing of the present invention, representative illustrative methods and materials are now described.
All publications and patents cited in this specification are herein incorporated by reference as if each individual publication or patent were specifically and individually indicated to be incorporated by reference and are incorporated herein by reference to disclose and describe the methods and/or materials in connection with which the publications are cited. The citation of any publication is for its disclosure prior to the filing date and should not be construed as an admission that the present invention is not entitled to antedate such publication by virtue of prior invention. Further, the dates of publication provided may be different from the actual publication dates which may need to be independently confirmed.
It is noted that, as used herein and in the appended claims, the singular forms “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise. It is further noted that the claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as “solely,” “only” and the like in connection with the recitation of claim elements, or use of a “negative” limitation.
As will be apparent to those of skill in the art upon reading this disclosure, each of the individual embodiments described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several embodiments without departing from the scope or spirit of the present invention. Any recited method can be carried out in the order of events recited or in any other order which is logically possible.
In further describing various aspects of the invention, aspects of embodiments of the subject tissue visualization devices and systems are described first in greater detail. Next, embodiments of methods of visualizing an internal target tissue of a subject in which the subject tissue visualization systems may find use are reviewed in greater detail.
As summarized above, aspects of the invention include internal tissue visualization systems. The internal tissue visualization systems are visualization systems that are configured to visualize an internal tissue site of a subject. As such, the systems are structured or designed to provide images of a tissue site inside of a body, such as a living body, to a user. Aspects of systems of the invention include internal tissue visualization devices that are useful for visualizing an internal target tissue site, e.g., a spinal location that is near or inside of an intervertebral disc (IVD). The internal tissue visualization devices of embodiments of systems of the invention are dimensioned such that at least the distal end of the devices can pass through a minimally invasive body opening. As such, at least the distal end of the devices of these embodiments may be introduced to an internal target site of a patient, e.g., a spinal location that is near or inside of an intervertebral disc, through a minimal incision, e.g., one that is less than the size of an incision employed for an access device having an outer diameter of 20 mm or smaller, e.g., less than 75% of the size of such an incision, such as less than 50% of the size of such an incision, or smaller.
As summarized above, internal tissue visualization devices of the systems of the invention include an elongated member and a hand-held control unit (such as a probe piece and hand piece as described further below). With respect to the elongated member, this component of the devices has a length that is 1.5 times or longer than its width, such as 2 times or longer than its width, including 5 or even 10 times or longer than its width, e.g., 20 times longer than its width, 30 times longer than its width, or longer. The length of the elongated member may vary, and in some instances ranges from 5 cm to 20 cm, such as 7.5 cm to 15 cm and including 10 to 12 cm. The elongated member may have the same outer cross-sectional dimensions (e.g., diameter) along its entire length. Alternatively, the cross-sectional diameter may vary along the length of the elongated member.
In some instances, at least the distal end region of the elongated member of the devices is dimensioned to pass through a Cambin's triangle. By distal end region is meant a length of the elongated member starting at the distal end of 1 cm or longer, such as 3 cm or longer, including 5 cm or longer, where the elongated member may have the same outer diameter along its entire length. The Cambin's triangle (also known in the art as Kambin's triangle) is an anatomical spinal structure bounded by an exiting nerve root, a traversing nerve root and a disc. The exiting root is the root that leaves the spinal canal just cephalad (above) the disc, and the traversing root is the root that leaves the spinal canal just caudad (below) the disc. Where the distal end of the elongated member is dimensioned to pass through a Cambin's triangle, at least the distal end of the device has a longest cross-sectional dimension that is 10 mm or less, such as 8 mm or less and including 7 mm or less. In some instances, the devices include an elongated member that has an outer diameter at least in its distal end region that is 5.0 mm or less, such as 4.0 mm or less, including 3.0 mm or less.
The elongated members of the subject tissue visualization devices have a proximal end and a distal end. The term “proximal end”, as used herein, refers to the end of the elongated member that is nearer the user (such as a physician operating the device in a tissue modification procedure), and the term “distal end”, as used herein, refers to the end of the elongated member that is nearer the internal target tissue of the subject during use. The proximal end is also the end that is operatively coupled to the hand-held control unit of the device (described in greater detail below). The elongated member is, in some instances, a structure of sufficient rigidity to allow the distal end to be pushed through tissue when sufficient force is applied to the proximal end of the elongate member. As such, in these embodiments the elongated member is not pliant or flexible, at least not to any significant extent.
As summarized above, the visualization devices include a visualization sensor integrated at the distal end of the elongated member, such that the visualization sensor is integrated with the elongated member. As the visualization sensor is integrated with the elongated member, it cannot be removed from the remainder of the elongated member without significantly compromising the structure and functionality of the elongated member. Accordingly, the devices of the present invention are distinguished from devices which include a “working channel” through which a separate autonomous device is passed. In contrast to such devices, since the visualization sensor of the present device is integrated with the elongated member, it is not a separate device that is merely present in a working channel of the elongated member and which can be removed from the working channel of such an elongated member without structurally compromising the elongated member in any way. The visualization sensor may be integrated with the elongated member in a variety of different configurations. Integrated configurations include configurations where the visualization sensor is fixed relative to the distal end of the elongated member, as well as configurations where the visualization sensor is movable to some extent relative to the distal end of the elongated member. The visualization sensor may also be movable relative to the distal end of the elongated member while being fixed with respect to another component present at the distal end, such as a distal end integrated illuminator. Specific configurations of interest are further described below in connection with the figures.
Visualization sensors of interest include miniature imaging sensors that have a cross-sectional area which is sufficiently small for the intended use and yet retain a sufficiently high matrix resolution. Imaging sensors of interest are those that include a photosensitive component, e.g., an array of photosensitive elements that convert light into electrons, coupled to a circuitry component, such as an integrated circuit. The integrated circuit may be configured to obtain and integrate the signals from the photosensitive array and output image data, which image data may in turn be conveyed to an extra-corporeal device configured to receive the data and display it to a user. The image sensors of these embodiments may be viewed as integrated circuit image sensors. The integrated circuit component of these sensors may include a variety of different types of functionalities, including but not limited to: image signal processing, memory, data transmission circuitry to transmit data from the visualization sensor to an extra-corporeal location, etc. The miniature imaging sensors may be present in a module which further includes one or more of a housing, a lens component made up of one or more lenses positioned relative to the photosensitive component so as to focus images on the photosensitive component, one or more filters, polarized members, etc. Specific types of miniature imaging sensors of interest include complementary metal-oxide-semiconductor (CMOS) sensors and charge-coupled device (CCD) sensors. The sensors may have any convenient configuration, including circular, square, rectangular, etc. Visualization sensors of interest may have a longest cross-sectional dimension that varies depending on the particular embodiment, where in some instances the longest cross-sectional dimension (e.g., diameter) is 4.0 mm or less, such as 3.5 mm or less, including 3.0 mm or less, such as 2.5 mm or less, including 2.0 mm or less, including 1.5 mm or less, including 1.0 mm or less. Within a given imaging module, the sensor component may be located some distance from the lens or lenses of the module, where this distance may vary, such as 10 mm or less, including 7 mm or less, e.g., 6 mm or less.
Imaging sensors of interest may be either frontside or backside illumination sensors, and have sufficiently small dimensions while maintaining sufficient functionality to be integrated at the distal end of the elongated members of the devices of the invention. Aspects of these sensors are further described in one or more of the following U.S. patents, the disclosures of which are herein incorporated by reference: U.S. Pat. Nos. 7,388,242; 7,368,772; 7,355,228; 7,345,330; 7,344,910; 7,268,335; 7,209,601; 7,196,314; 7,193,198; 7,161,130; and 7,154,137.
As summarized above, the visualization sensor is located at the distal end of the elongated member, such that the visualization sensor is a distal end visualization sensor. In these instances, the visualization sensor is located at or near the distal end of the elongated member, e.g., positioned at 3 mm or closer to the distal end, such as at 2 mm or closer to the distal end, including at 1 mm or closer to the distal end, and in some instances at the distal end itself. The visualization sensor may provide for front viewing and/or side viewing, as desired. Accordingly, the visualization sensor may be configured to provide image data as seen in the forward direction from the distal end of the elongated member. Alternatively, the visualization sensor may be configured to provide image data as seen from the side of the elongated member. In yet other embodiments, a visualization sensor may be configured to provide image data from both the front and the side, e.g., where the image sensor faces at an angle that is less than 90° relative to the longitudinal axis of the elongated member.
Components of the visualization sensor, e.g., the integrated circuit, one or more lenses, etc., may be present in a housing. The housing may have any convenient configuration, where the particular configuration may be chosen based on location of the sensor, direction of view of the sensor, etc. The housing may be fabricated from any convenient material. In some instances, non-conductive materials, e.g., polymeric materials, are employed.
Visualization sensors may further include functionality for conveying image data to an extra-corporeal device, such as an image display device, of a system. In some instances, a wired connection, e.g., in the form of a signal cable (or other type of signal conveyance element), may be present to connect the visualization sensor at the distal end to a device at the proximal end of the elongate member, e.g., in the form of one or more wires running along the length of the elongate member from the distal to the proximal end. In some instances, the visualization sensor is coupled to a conductive member (e.g., cable or analogous structure) that conductively connects the visualization sensor to a proximal end location of the elongated member. Alternatively, wireless communication protocols may be employed, e.g., where the visualization sensor is operatively coupled to a wireless data transmitter, which may be positioned at the distal end of the elongated member (including integrated into the visualization sensor, at some position along the elongated member or at the proximal end of the device, e.g., at a location of the proximal end of the elongated member or associated with the handle of the device).
Where desired, the devices may include one or more illumination elements configured to illuminate a target tissue location so that the location can be visualized with a visualization sensor, e.g., as described above. A variety of different types of light sources may be employed as illumination elements (also referred to herein as illuminators), so long as their dimensions are such that they can be positioned at the distal end of the elongated member. The light sources may be integrated with a given component (e.g., the elongated member), i.e., configured relative to the component such that the light source element cannot be removed from the remainder of the component without significantly compromising the structure of the component. As such, the integrated illuminators of these embodiments are not readily removable from the remainder of the component, such that the illuminator and remainder of the component form an inter-related whole. The light sources may be light emitting diodes (LEDs) configured to emit light of the desired wavelength range, or optical conveyance elements, e.g., optical fibers, configured to convey light of the desired wavelength range from a location other than the distal end of the elongate member, e.g., a location at the proximal end of the elongate member, to the distal end of the elongate member. The physical location of the light source, e.g., LED, may vary, such as any location in the elongated member, in the hand-held control unit, etc.
As with the image sensors, the light sources may include a conductive element, e.g., wire, or an optical fiber, which runs the length of the elongate member to provide for power and control of the light sources from a location outside the body, e.g., an extracorporeal control device.
Where desired, the light sources may include a diffusion element to provide for uniform illumination of the target tissue site. Any convenient diffusion element may be employed, including but not limited to a translucent cover or layer (fabricated from any convenient translucent material) through which light from the light source passes and is thus diffused. In those embodiments of the invention where the system includes two or more illumination elements, the illumination elements may emit light of the same wavelength or they may be spectrally distinct light sources, where by “spectrally distinct” is meant that the light sources emit light at wavelengths that do not substantially overlap, such as white light and infra-red light. In certain embodiments, an illumination configuration as described in copending U.S. application Ser. Nos. 12/269,770 and 12/269,772 (the disclosures of which are herein incorporated by reference) is present in the device.
Distal end integrated illuminators may have any convenient configuration. Configurations of interest have various cross-sectional shapes, including but not limited to circular, ovoid, rectangular (including square), irregular, etc. In some instances, the integrated illuminator is configured to conform to the configuration of the integrated visualization sensor such that the combined cross-sectional area of the two components is maximized within the minimal overall cross-sectional area available at the distal end of the elongated member. For example, the configurations of the integrated visualization sensor and illuminators may be such that the integrated visualization sensor may occupy a first portion of the available cross-sectional area of the distal end of the elongated member (such as 40% or more, including 50% or 60% or more of the total available cross-sectional area of the distal end of the elongated member) and the integrated illuminator may occupy a substantial portion of the remainder of the cross-sectional area, such as 60% or more, 70% or more, or 80% or more of the remainder of the cross-sectional area.
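By way of a purely illustrative calculation (the 4 mm available bore and the 50% and 70% allocations below are assumed values taken from the ranges recited above, not dimensions of any particular embodiment), such an area budget may be apportioned as follows:

```latex
\begin{aligned}
A_{\mathrm{total}} &= \pi r^{2} = \pi\,(2\ \mathrm{mm})^{2} \approx 12.6\ \mathrm{mm}^{2} \\
A_{\mathrm{sensor}} &= 0.50\,A_{\mathrm{total}} \approx 6.3\ \mathrm{mm}^{2} \\
A_{\mathrm{illuminator}} &= 0.70\,(A_{\mathrm{total}} - A_{\mathrm{sensor}}) \approx 4.4\ \mathrm{mm}^{2}
\end{aligned}
```

In this illustrative example the sensor and illuminator together occupy roughly 85% of the available distal end cross-section, leaving the remainder for walls, lumens or other components.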
In one configuration of interest, the integrated illuminator has a crescent configuration. The crescent configuration may have dimensions configured to conform to the walls of the elongated member and a circular visualization sensor. In another configuration of interest, the integrated illuminator has an annular configuration, e.g., where it conforms to the inner walls of the elongated member or makes up the walls of the elongated member, e.g., as described in greater detail below. This configuration may be of interest where the visualization sensor is positioned at the center of the distal end of the elongated member.
In some instances, the elongated member comprises an annular wall configured to conduct light to the elongated member distal end from a proximal end source. The distal end of this annular wall may be viewed as an integrated illuminator, as described above. In these instances, the walls of the elongated structure which collectively make up the annular wall are fabricated from a translucent material which conducts light from a source apart from the distal end, e.g., from the proximal end, to the distal end. Where desired, a reflective coating may be provided on the outside of the translucent elongated member to internally reflect light provided from a remote source, e.g., such as an LED at the proximal end, to the distal end of the device. Any convenient reflective coating material may be employed.
Also of interest are integrated illuminators that include a fluid filled structure that is configured to conduct light to the elongated member distal end from a proximal end source. Such a structure may be a lumen that extends along a length of the elongated structure from a proximal end light source to the distal end of the elongated structure. When present, such lumens may have a longest cross section that varies, ranging in some instances from 0.5 to 4.0 mm, such as 0.5 to 3.5 mm, including 0.5 to 3.0 mm. The lumens may have any convenient cross-sectional shape, including but not limited to circular, square, rectangular, triangular, semi-circular, trapezoidal, irregular, etc., as desired. The fluid filled structure may be filled with any convenient translucent fluid, where fluids of interest include aqueous fluids, e.g., water, saline, etc., organic fluids, such as heavy mineral oil (e.g., mineral oil having a specific gravity greater than or equal to about 0.86 and preferably between about 0.86 and 0.905), and the like.
As indicated above, certain instances of the integrated illuminators are made up of an elongated member integrated light conveyance structure, e.g., optical fiber, light conductive annular wall, light conducting fluid filled structure, etc., which is coupled to a proximal end light source. In some instances, the proximal end light source is a forward focused LED. Of interest in such embodiments are bright LEDs, e.g., LEDs having a brightness of 100 mcd or more, such as 300 mcd or more, and in some instances 500 mcd or more, 1000 mcd or more, 1500 mcd or more. In some instances, the brightness ranges from 100 to 2000 mcd, such as 300 to 1500 mcd. The LED may be coupled with a forward focusing lens that is, in turn, coupled to the light conveyance structure.
In some instances, the proximal end LED may be coupled to the light conveyance structure in a manner such that substantially all, if not all, light emitted by the LED is input into the light conveyance structure. Alternatively, the LED and focusing lens may be configured such that at least a portion of the light emitted by the LED is directed along the outer surface of the elongated member. In these instances, the forward focused light emitting diode is configured to direct light along the outer surface of the elongated member. As such, light from the proximal end LED travels along the outer surface of the elongated member to the distal end of the elongated member.
In some instances, the tissue visualization devices of the invention are configured to reduce coupling of light directly from the integrated illuminator to the visualization sensor. In other words, the devices are structured so that substantially all, if not all, of the light emitted by the integrated illuminator at the distal end of the elongated structure is prevented from directly reaching the visualization sensor. In this manner, the majority, if not all, of the light that reaches the visualization sensor is reflected light, which reflected light is converted to image data by the visualization sensor. In order to substantially inhibit, if not prevent, light from the integrated illuminator from directly reaching the integrated visualization sensor, the device may include a distal end polarized member. By distal end polarized member is meant a structure or combination of structures that have been polarized in some manner sufficient to achieve the desired purpose of reducing, if not eliminating, light from the integrated illuminator directly reaching the integrated visualization sensor. In one embodiment, the light from an LED is polarized by a first polarizer (linearly or circularly) as it enters a lens or prism at the distal tip of the elongated member. A visualization sensor, such as a CMOS sensor, also has a polarizer directly in front of it, with this second polarizer being complementary to the first polarizer so that any light reflected by the outer prism surface into the visualization sensor will be blocked by this polarizer. Light passing through the first polarizer and reflected by the surrounding tissue will have random polarization, so roughly half of this light will pass through the second polarizer to reach the visualization sensor and be converted to image data. The distal end polarized member may be a cover lens, e.g., for forward viewing elongated members, or a prism, e.g., for off-axis viewing elongated members, such as described in greater detail below.
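The roughly one-half transmission of tissue-scattered light, and the blocking of light reflected directly from the outer prism or lens surface, follow from Malus's law; the relations below are a simplified sketch that assumes ideal linear polarizers oriented 90° apart (crossed), rather than any particular polarizer arrangement of the devices described herein:

```latex
\begin{aligned}
&I_{\mathrm{out}} = I_{\mathrm{in}}\cos^{2}\theta && \text{(Malus's law)}\\
&\theta = 90^{\circ} \;\Rightarrow\; I_{\mathrm{out}} = 0 && \text{(surface reflection, polarization preserved)}\\
&\langle\cos^{2}\theta\rangle = \tfrac{1}{2} \;\Rightarrow\; I_{\mathrm{out}} \approx \tfrac{1}{2}\,I_{\mathrm{in}} && \text{(tissue scatter, random polarization)}
\end{aligned}
```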
In some instances, the distal end of the elongated member includes an off-axis visualization module that is configured so that the visualization sensor obtains data from a field of view that is not parallel to the longitudinal axis of the elongated member. With an off-axis visualization module, the field of view of the visualization sensor is at an angle relative to the longitudinal axis of the elongated member, where this angle may range in some instances from 5 to 90°, such as 45 to 75°, e.g., 30°. The off-axis visualization module may include any convenient light guide which collects light from an off-axis field of view and conveys the collected light to the visualization sensor. In some instances, the off-axis visualization module is a prism.
Depending on the particular device embodiment, the elongated member may or may not include one or more lumens that extend at least partially along its length. When present, the lumens may vary in diameter and may be employed for a variety of different purposes, such as irrigation, aspiration, electrical isolation (for example of conductive members, such as wires), as a mechanical guide, etc., as reviewed in greater detail below. When present, such lumens may have a longest cross section that varies, ranging in some instances from 0.5 to 5.0 mm, such as 1.0 to 4.5 mm, including 1.0 to 4.0 mm. The lumens may have any convenient cross-sectional shape, including but not limited to circular, square, rectangular, triangular, semi-circular, trapezoidal, irregular, etc., as desired. These lumens may be provided for a variety of different functions, including as conveyance structures for providing access of devices, compositions, etc. to the distal end of the elongated member, as described in greater detail below. Such lumens may be employed as a “working channel”.
In some embodiments, an integrated articulation mechanism that imparts steerability to at least the distal end of the elongated member or a component thereof is also present in the device, such that the elongated member is configured for distal end articulation. By “steerability” is meant the ability to maneuver or orient the distal end of the elongated member or component thereof as desired during a procedure, e.g., by using controls positioned at the proximal end of the device, e.g., on the hand-held control unit. In these embodiments, the devices include a steerability mechanism (or one or more elements located at the distal end of the elongated member) which renders the desired elongated member distal end or component thereof maneuverable as desired through proximal end control. As such, the term “steerability”, as used herein, refers to the provision of steering functionality to a user, such as the ability to change direction in a desired manner, such as by moving left, right, up or down relative to the initial direction. The steering functionality can be provided by a variety of different mechanisms. Examples of suitable mechanisms include, but are not limited to, one or more wires, tubes, plates, meshes or combinations thereof, made from appropriate materials, such as shape memory materials, music wire, etc.
In some instances, the distal end of the elongated member is provided with a distinct, additional capability that allows it to be independently rotated about its longitudinal axis when a significant portion of the operating handle is maintained in a fixed position, as discussed in greater detail below. The extent of distal component articulations of the invention may vary, such as from −180 to +180°; e.g., −90 to +90°. Alternatively, the distal probe tip articulations may range from 0 to 360°, such as 0 to +180°, and including 0 to +90°, with provisions for rotating the entire probe about its axis so that the full range of angles is accessible on either side of the axis of the probe, e.g., as described in greater detail below. Rotation of the elongated member may be accomplished via any convenient approach, e.g., through the use of motors, such as described in greater detail below. Articulation mechanisms of interest are further described in published PCT Application Publication Nos. WO 2009029639; WO 2008/094444; WO 2008/094439 and WO 2008/094436; the disclosures of which are herein incorporated by reference. Specific articulation configurations of interest are further described in connection with the figures, below, as well as in U.S. application Ser. No. 12/422,176; the disclosure of which is herein incorporated by reference.
As summarized above, the internal tissue visualization devices of the invention further include a hand-held control unit to which the elongated member is operably connected. By “operably connected” is meant that one structure is in communication (for example, mechanical, electrical, optical connection, or the like) with another structure. The hand-held control unit is located at the proximal end of the elongated structure, and therefore at the proximal end of the device. As the control unit is hand-held, it is configured to be held easily in the hand of an adult human. Accordingly, the hand-held control unit may have a configuration that is amenable to gripping by the human adult hand. The weight of the hand-held control unit may vary, but in some instances ranges from 0.5 to 5 lbs, such as 0.5 to 3 lbs. The hand-held control unit may have any convenient configuration, such as a hand-held wand with one or more control buttons, as a hand-held gun with a trigger, etc., where examples of suitable handle configurations are further provided below.
In some instances, the hand-held control unit may include a monitor. By monitor is meant a visual display unit, which includes a screen that displays visual data in the form of images and/or text to a user. The screen may vary, where a screen type of interest is an LCD screen. The monitor, when present, may be integrated with or detachable from the remainder of the hand-held control unit. As such, in some instances the monitor may be an integrated structure with the hand-held control unit, such that it cannot be separated from the hand-held control unit without damaging the monitor in some manner. In yet other embodiments, the monitor may be a detachable monitor, where the monitor can be attached to and separated from the hand-held control unit, as desired, without damaging the function of the monitor. In such embodiments, the monitor and hand-held control unit may have a variety of different mating configurations, such as where the hand-held control unit includes a hole configured to receive a post of the monitor, where the monitor has a structure that is configured to snap onto a receiving structure of the hand-held control unit, etc. The monitor, when present, will have dimensions sufficient for use with the hand-held control unit, where screen sizes of interest include 10 inches or smaller, such as 5 inches or smaller, e.g., 3.5 inches, etc.
Data communication between the monitor and the remainder of the hand-held control unit may be accomplished according to any convenient configuration. For example, the monitor and remaining components of the hand-held control unit may be connected by one or more wires. Alternatively, the two components may be configured to communicate with each other via a wireless communication protocol. In these embodiments, the monitor will include a wireless communication module.
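As one non-limiting illustration of the wireless alternative (the devices described herein do not mandate any particular protocol), image frames could be pushed from the control unit to a detached monitor over a hypothetical length-prefixed link; the socket setup, frame format and function names below are assumptions made for this sketch only:

```python
import socket
import struct

def send_frame(sock: socket.socket, frame_bytes: bytes) -> None:
    """Control-unit side: send one length-prefixed, pre-encoded image frame."""
    sock.sendall(struct.pack(">I", len(frame_bytes)) + frame_bytes)

def recv_frame(sock: socket.socket) -> bytes:
    """Monitor side: receive one length-prefixed image frame."""
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    # Read exactly n bytes or raise if the link to the monitor is dropped.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("link to monitor closed")
        buf += chunk
    return buf
```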
In some embodiments, the distal end of the elongated member is rotatable about its longitudinal axis when a significant portion of the hand-held control unit is maintained in a fixed position. As such, at least the distal end of the elongated member can turn by some degree while the hand-held control unit attached to the proximal end of the elongated member stays in a fixed position. The degree of rotation in a given device may vary, and may range from 0 to 360°, such as 0 to 270°, including 0 to 180°. Rotation, when present, may be provided by any convenient approach, e.g., through use of motors.
Devices of the invention may be disposable or reusable. As such, devices of the invention may be entirely reusable (e.g., be multi-use devices) or be entirely disposable (e.g., where all components of the device are single-use). In some instances, the device can be entirely reposable (e.g., where all components can be reused a limited number of times). Each of the components of the device may individually be single-use, of limited reusability, or indefinitely reusable, resulting in an overall device or system comprised of components having differing usability parameters.
Of interest are devices in which the hand-held control unit is reusable. In such devices, the elongated member is configured to be detachable from the hand-held control unit. The elongated member is configured to be readily separable from the hand-held control unit without in any way damaging the functionality of the hand-held control unit, such that the hand-held control unit may be attached to another elongated member. As such, the devices are configured so that the hand-held control unit can be sequentially operably attached to multiple different elongated members. Of interest are configurations in which the elongated member can be manually operably attached to a hand-held control unit without the use of any tools. A variety of different configurations may be employed, e.g., where the proximal end of the elongated member engages the hand-held control unit to provide an operable connection between the two, such as by a snap-fit configuration, an insertion and twist configuration, etc. In certain configurations, the hand-held control unit has a structure configured to receive the proximal end of the elongated member.
In some instances, the hand-held control unit may be re-used simply by wiping down the hand-held control unit following a given procedure and then attaching a new elongated member to the hand-held control unit. In other instances, to provide for desired sterility of the hand-held control unit, the device may include a removable sterile covering attached to the proximal end of the elongated member that is configured to seal the hand-held control unit from the environment. This sterile covering (e.g., in the form of a sheath as described in greater detail below) may be a disposable sterile handle cover that uses a flexible bag, a portion of which is affixed and sealed to the proximal end of the disposable elongated member. Where desired, the sterile covering may include an integrated clear monitor cover, which may be rigid and configured to conform to the monitor screen. In some instances, the cover may be configured to provide for touch screen interaction with the monitor. As indicated above, the hand-held control unit may include a manual controller. In such instances, the sterile covering may include a flexible rubber boot for mechanical controller sealing, i.e., a boot portion configured to associate with the manual controller. In addition, the sterile covering may include a seal at a region associated with the proximal end of the hand-held control unit. In these instances, the open side of the sterile cover prior to use may be conveniently located at the proximal end. Following positioning of the cover around the hand-held control unit, the open side may be mechanically attached to the handle and closed by a validated sealing method. The sterile cover of these embodiments is configured such that, when employed, it does not inhibit the handle controls or the actuation of the elongated structure and monitor.
In addition to the distal end integrated visualization sensor, e.g., as described in greater detail above, devices of the invention may include a distal end integrated non-visualization sensor. In other words, the devices may include one or more non-visualization sensors that are integrated at the distal end of the elongated member. The one or more non-visualization sensors are sensors that are configured to obtain non-visual data from a target location. Non-visual data of interest includes, but is not limited to: temperature, pressure, pH, elasticity, impedance, conductivity, distance, size, etc. Non-visualization sensors of interest include those configured to obtain one or more types of the non-visual data of interest. Examples of sensors that may be integrated at the distal end include, but are not limited to: temperature sensors, pressure sensors, pH sensors, impedance sensors, conductivity sensors, elasticity sensors, etc. Specific types of sensors include, but are not limited to: thermistors, strain gauges, membrane-containing sensors, MEMS sensors, electrodes, light sensors, etc. The choice of a specific type of sensor will depend on the nature of the non-visual data of interest. For example, a pressure sensor can detect the force applied to a target tissue as it is deformed to determine the elastic modulus of the target tissue. A temperature sensor can be employed to detect locally elevated temperatures (which can be used to differentiate different types of tissue, such as to differentiate normal and tumor tissue, where tumors exhibit increased blood flow and therefore a higher temperature). A properly collimated laser beam could be used to determine the distance to objects in the device field of view or the length scale of objects in the device field of view. When present, the integrated non-visualization sensor or sensors may be configured to complement other distal end components of the devices, so as to minimize any impact on the outer dimension of the distal end, e.g., in ways analogous to those described above in connection with integrated illumination elements.
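By way of a non-limiting sketch of the collimated laser use mentioned above, the functions below estimate distance by triangulating the position of the laser spot in the visualization sensor's image, and then convert a measured pixel extent into a physical length at that distance; the focal length, laser offset and example numbers are hypothetical values chosen for illustration, not parameters of any actual device:

```python
def distance_from_laser_spot(spot_offset_px: float,
                             focal_length_px: float,
                             laser_baseline_mm: float) -> float:
    """Estimate distance to tissue from the image offset of a collimated laser spot.

    Assumes the laser is parallel to the optical axis and displaced from it by
    laser_baseline_mm; by similar triangles, distance = f * b / offset.
    """
    if spot_offset_px <= 0:
        raise ValueError("laser spot must be offset from the principal point")
    return focal_length_px * laser_baseline_mm / spot_offset_px

def object_size_mm(extent_px: float, distance_mm: float, focal_length_px: float) -> float:
    """Convert a measured pixel extent into a physical length at a known distance."""
    return extent_px * distance_mm / focal_length_px

# Hypothetical example: 500 px focal length, 2 mm laser offset, spot 25 px off-axis.
d = distance_from_laser_spot(spot_offset_px=25, focal_length_px=500, laser_baseline_mm=2.0)
print(f"estimated distance: {d:.1f} mm")                       # 40.0 mm
print(f"30 px feature is about {object_size_mm(30, d, 500):.1f} mm")  # 2.4 mm
```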
In some instances, the devices include a tissue modifier. Tissue modifiers are components that interact with tissue in some manner to modify the tissue in a desired way. The term modify is used broadly to refer to changing in some way, including cutting the tissue, ablating the tissue, delivering an agent(s) to the tissue, freezing the tissue, etc. As such, of interest as tissue modifiers are tissue cutters, tissue ablators, tissue freezing/heating elements, agent delivery devices, etc. Tissue cutters of interest include, but are not limited to: blades, liquid jet devices, lasers and the like. Tissue ablators of interest include, but are not limited to, ablation devices, such as devices for delivering ultrasonic energy (e.g., as employed in ultrasonic ablation), devices for delivering plasma energy, devices for delivering radiofrequency (RF) energy, devices for delivering microwave energy, etc. Energy transfer devices of interest include, but are not limited to: devices for modulating the temperature of tissue, e.g., freezing or heating devices, etc. In some embodiments, the tissue modifier is not a tissue modifier that achieves tissue modification by clamping, clasping or grasping of tissue such as may be accomplished by devices that trap tissue between opposing surfaces (e.g., jaw-like devices). In these embodiments, the tissue modification device is not an element that is configured to apply mechanical force to tear tissue, e.g., by trapping tissue between opposing surfaces.
In some instances, the tissue modifier is a low-profile tissue modifier, such as a low-profile biopsy tool or a low-profile cutter. Such low-profile tissue modifiers include a tissue cutting structure positioned at the distal end of the elongated member. Because the biopsy or cutting tool is low-profile, its presence at the distal end of the elongated member does not substantially increase the outer diameter of the elongated member. In some instances, the presence of the low-profile biopsy tool increases the outer diameter of the elongated member by 2 mm or less, such as 1.5 mm or less, including 1 mm or less. The configuration of the low-profile biopsy tool may vary. In some instances, the low-profile biopsy tool comprises an annular cutting member concentrically disposed about the distal end of the elongated member and configured to be moved relative to the distal end of the elongated member in a manner sufficient to engage tissue. The annular cutting member may or may not be configured as a complete ring structure, where the ring structure is movable in a longitudinal manner relative to the distal end of the elongated member (such that it may be moved along the elongated member towards and away from the proximal end of the elongated member). The distal edge of the ring structure may be movable some distance beyond the distal end of the elongated member, where this distance may vary and in some instances is 10 mm or less, such as 5 mm or less, including 3 mm or less. The distal edge of the ring structure may be sharp in order to penetrate tissue, and may include one or more tissue retaining structures, such as barbs, hooks, lips, etc., which are configured to engage the tissue and stably associate the engaged tissue with the ring structure, e.g., when the ring structure is moved longitudinally along the elongated member towards the proximal end. Also of interest are cutting tools, e.g., as described
In some instances, the distal end integrated visualization sensor is present as an RF-shielded visualization module. As the visualization sensor module of these embodiments is RF-shielded, the visualization sensor module includes an RF shield that substantially inhibits, if not completely prevents, an ambient RF field from reaching and interacting with circuitry of the visualization sensor. As such, the RF shield is a structure which substantially inhibits, if not completely prevents, ambient RF energy (e.g., as provided by a distal end RF electrode, as described in greater detail below) from impacting the circuitry function of the visualization sensor.
Visualization sensor modules of devices of the invention include at least a visualization sensor. In certain embodiments, the devices may further include a conductive member that conductively connects the visualization sensor with another location of the device, such as a proximal end location. Additional components may also be present in the visualization sensor module, where these components are described in greater detail below.
The RF shield of the visualization sensor module may have a variety of different configurations. The RF shield may include an enclosure element or elements which serve to shield the circuitry of the visualization sensor from an ambient RF field. In some instances, the RF shield is a grounded conductive enclosure component or components which are associated with the visualization sensor, conductive member and other components of the visualization sensor module. In some instances, the visualization sensor of the visualization sensor module is present in a housing, where the housing may include a grounded outer conductive layer which serves as an RF shield component. In these instances, the RF shield is an outer grounded conductive layer. The conductive enclosure or enclosures of the RF-shielded visualization sensor module may be fabricated from a variety of different conductive materials, such as metals, metal alloys, etc., where specific conductive materials of interest include, but are not limited to: copper foils and the like. In certain instances, the RF shield is a metallic layer. This layer, when present, may vary in thickness, but in some instances has a thickness ranging from 0.2 mm to 0.7 mm, such as 0.3 mm to 0.6 mm and including 0.4 mm to 0.5 mm. Additional details regarding RF-shielded visualization modules may be found in U.S. application Ser. No. 12/437,865; the disclosure of which is herein incorporated by reference.
In some instances, the devices may include a collimated laser configured to emit collimated laser light from a distal region of the elongated member, such as the distal end of the elongated member. The collimated laser components of these embodiments may be configured for use for a variety of purposes, such as but not limited to: anatomical feature identification, anatomical feature assessment of sizes and distances within the field of view of the visualization sensor, etc.
The devices of the invention may be fabricated using any convenient materials or combination thereof, including but not limited to: metallic materials such as tungsten, stainless steel alloys, platinum or its alloys, titanium or its alloys, molybdenum or its alloys, and nickel or its alloys, etc; polymeric materials, such as polytetrafluoroethylene, polyimide, PEEK, and the like; ceramics, such as alumina (e.g., STEATITE™ alumina, MAECOR™ alumina), etc.
In some instances, the devices may include a stereoscopic image module. By stereoscopic image module is meant a functional module that provides a stereoscopic image from image data obtained by the device. As such, the module provides a user via the monitor with the perception of a three-dimensional view of an image produced from the image data obtained by the device. The module is described in terms of “images”, and it should be understood that the description applies equally to still images and video.
Where the device includes a stereoscopic image module, the device may include two or more distinct visualization sensors (e.g., CMOS cameras as reviewed above) or a single visualization sensor via which the image data is collected and employed by the stereoscopic image module to provide the stereoscopic image. Where the elongated member includes first and second visualization sensors, the stereoscopic imaging module is configured to process imaged data provided by the first and second visualization sensors to produce the stereoscopic image. In such embodiments, any convenient stereoscopic image processing program may be employed.
Stereo algorithms compute range information to objects seen by the visualization sensors by using triangulation. An object viewed from different viewpoints appears at different locations in the image data for the first and second visualization sensors. The disparity, or image difference, is used in determining the depth and range of objects. Corresponding pixel points within the image data for the first and second visualization sensors may be identified and used in the determination of disparity, as represented by block 1024. Because the first and second visualization sensors are at different locations and hence have different perspectives, the same object present in the image data for the first and second visualization sensors may be at different pixel coordinate locations. Triangulation may then be implemented, as represented by block 1026, based on the geometry associated with the locations of the first and second visualization sensors to determine the depth and range of objects seen by the sensors. Triangulation computations are applied to derive range data, and the resultant range (or depth) map can be overlaid on the image display, as desired, as represented by block 1028.
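A minimal sketch of the triangulation step described above is given below; it assumes rectified image data from the first and second visualization sensors and a pinhole-camera model, with the focal length and sensor baseline treated as hypothetical calibration values rather than parameters of any particular device:

```python
import numpy as np

def depth_from_disparity(disparity_px: np.ndarray,
                         focal_length_px: float,
                         baseline_mm: float) -> np.ndarray:
    """Convert a per-pixel disparity map into a depth (range) map.

    For rectified views, depth Z = f * B / d, where d is the disparity in
    pixels between corresponding points seen by the two sensors.
    """
    depth = np.full(disparity_px.shape, np.inf, dtype=float)
    valid = disparity_px > 0
    depth[valid] = focal_length_px * baseline_mm / disparity_px[valid]
    return depth

# Hypothetical calibration: 400 px focal length, 1.5 mm between the two sensors.
disparity = np.array([[20.0, 10.0], [5.0, 0.0]])
print(depth_from_disparity(disparity, focal_length_px=400, baseline_mm=1.5))
# [[ 30.  60.]
#  [120.  inf]]
```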
Further details regarding aspects of stereoscopic image modules that employ image data obtained by two or more distinct visualization sensors may be found in U.S. application Ser. No. 12/269,770; the disclosure of which is herein incorporated by reference.
Also of interest are stereoscopic image modules that are configured to provide a stereoscopic image from data obtained by a single image sensor. In such embodiments, the image sensor is configured to provide to the stereoscopic image module consecutive offset image data of the target tissue location, which consecutive offset image data are then employed by the stereoscopic image module to provide the desired stereoscopic image. By consecutive offset image data is meant image data that includes at least data from a first view of a target tissue location and data from a second view of the same target location, where the second view is offset from the first view. The second view may be offset from the first view by any convenient distance, for example 1 mm or less, including 0.5 mm or less. The first and second offset views may be obtained using any convenient approach. In one approach, the single visualization sensor is moved from a first position to a second position in order to obtain the desired offset image data. The single visualization sensor may be moved from the first to the second positions using any convenient manner, e.g., by a mechanical element that physically moves the sensor from the first to the second position. In yet other embodiments, the desired offset views may be obtained with a single visualization sensor operatively coupled to an optical guide system (which may include one or more of lenses, mirrors, filters, etc.) configured to provide the desired first and second offset views. For example, the first and second offset views may be provided to the single visualization sensor by including a first and second lens systems which alternately convey image data to the visualization sensor. The offset views may also be provided, for example, by including a single lens system with mirrors configured to provide the lens with two or more different views. The frequency with which the first and second offset views are obtained may vary, where in some instances the frequency may range from 1 to 30 frames/sec, such as 1 to 15 frames/sec. Various systems may be implemented to provide multiple views with a single camera. Systems of interest include, but are not limited to, those described in: “Scalable Multi-view Stereo Camera Array for Real World Real-Time Image Capture and Three Dimensional Displays” by S. Hill (Massachusetts Institute of Technology, Program in Media Arts and Sciences School of Architecture and Planning; May 7, 2004; see also, http://web.media.mit.edu/˜vmb/papers/hillms.pdf); “Single Camera Stereo Using Planar Parallel Plate” by Chunyu Gao, et al. (Beckman Institute, University of Illinois at Urbana-Champaign; see also, http://vision.ai.uiuc.edu/newpubs/Stereo13PPP13Gao.pdf); and, “3-D Reconstruction Using Mirror Images Based on a Plane Symmetry Recovering Method” by Mitsumoto, H., et al. (IEEE Transaction on Pattern Analysis and Machine Intelligence; Vol. 14; Issue No. 9, September 1992, pp. 941-946).
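Pairing of the consecutive offset frames into first-view/second-view pairs for the stereoscopic image module could be organized as sketched below; the generator simply interleaves alternating frames from the single visualization sensor, and the frame labels in the example are placeholders rather than an actual device interface:

```python
from typing import Iterable, Iterator, Tuple

def offset_frame_pairs(frames: Iterable) -> Iterator[Tuple[object, object]]:
    """Group a stream of alternating first-view / second-view frames into
    (first_view, second_view) pairs for stereoscopic processing.

    Assumes the single visualization sensor (or its optical guide system)
    alternates between the two offset views on successive frames.
    """
    it = iter(frames)
    for first_view in it:
        second_view = next(it, None)
        if second_view is None:
            return
        yield first_view, second_view

# Illustrative use with placeholder frame labels:
for left, right in offset_frame_pairs(["view1_t0", "view2_t0", "view1_t1", "view2_t1"]):
    print(left, right)
```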
The stereoscopic image module may be implemented in a video processor module configured to receive image data obtained by the one or more visualization sensors. The stereoscopic image module processes the image data to provide stereoscopic image data for display on a display.
In certain embodiments, devices of the invention include an image recognition module. Image recognition modules of interest are those that are configured to receive image data and compare the received image data with a reference that includes at least one of color descriptor data and anatomical descriptor data to make a determination as to whether an alert signal should be generated. The term “reference” is used herein to refer to data in any format, e.g., saved as one or more image files, etc., that is for one or more reference images, e.g., where the data can be used by an appropriate processor to produce one or more reference images. As such, a reference includes at least a first set of reference image data for a first reference image. In some instances a reference also includes a second set of reference image data for a second reference image. In such embodiments, a reference may include sets of reference image data for multiple reference images, e.g., 2 or more, 5 or more, 10 or more, 25 or more, 50 or more, 100 or more, 1000 or more, 1500 or more, 2000 or more, 5000 or more, 10,000 or more etc., reference images.
Reference images are predetermined images of a region of interest. As the reference images are predetermined, they are images that have been produced independently of the image data that is received by the image processing module. In some instances, the reference images are images that exist prior to obtainment of the image data that is received by the image processing module. The reference images may be images that are obtained from the same subject (e.g., person) that is being visualized during a given procedure (e.g., where the reference images were obtained from the subject prior to a given procedure) or from a different subject (e.g., person). Alternatively, the reference images may be produced de novo, such that they are not produced from image data obtained from any actual subject but instead are designed, e.g., by using manual or computer assisted graphic protocols.
Reference images that make up the reference may differ from each other in a number of ways. For example, any two given reference images may be images of regions of interest of different internal tissue locations. In such instances, the reference may include first and second predetermined images that differ from each other with respect to the internal tissue location imaged. For example, the reference may include images of at least a first tissue location and a second tissue location. The first and second tissue locations may be locations that a given device may be expected to image during a given procedure, such as during a surgical procedure. In some instances, the reference includes multiple images of different locations that a given visualization sensor should image during a given procedure if the procedure is performed correctly. The reference may also include images of different tissue locations that a visualization sensor should not see during a given procedure, e.g., images of tissue locations that should not be viewed by the sensor if the given procedure of interest is being performed correctly. Accordingly, some references may include multiple images that track the location of a device when correctly and incorrectly positioned during an entire procedure, such as an entire surgical procedure.
The sets of image data in the reference may include one or more of color descriptor data and anatomical descriptor data. By color descriptor data is meant data which is based on the particular color of a given internal tissue site and components thereof. For example, an internal tissue site may include one or more tissues, each of which has a distinct color. For example, different tissues such as muscle, nerve, bone, etc., may have different colors. This distinct color may be present in the reference image as color descriptor data, and employed by the image recognition module. By anatomical descriptor data is meant data which is based on the particular shape of one or more tissue structures at the internal tissue site. For example, different tissues such as muscle, nerve, bone, etc., have different shapes. These different shapes are present in the image data as anatomical descriptor data.
As summarized above, the image recognition module compares received image data of an internal tissue site (e.g., obtained during a given procedure of interest) with the reference. The comparison performed by the image recognition module may be achieved using any convenient data processing protocol. Data processing protocols that may be employed in this comparison step may compare the received image data and reference based on color descriptor data and/or anatomical descriptor data. Data comparison protocols of interest include, but are not limited to: the mean absolute difference between descriptors of the received data and stored values, such as mean color intensity; and the degree of correlation between the principal axis of a structure and stored values.
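As a non-limiting sketch of the two comparison protocols named above, and assuming the color descriptor is a per-channel mean intensity and the anatomical descriptor is the principal axis of a segmented structure (both extracted elsewhere), the comparisons might be computed as follows; the function names are illustrative only.

```python
import numpy as np

def color_difference(observed_mean_color: np.ndarray, reference_mean_color: np.ndarray) -> float:
    """Mean absolute difference between observed and stored mean color intensities."""
    return float(np.mean(np.abs(observed_mean_color - reference_mean_color)))

def principal_axis(points: np.ndarray) -> np.ndarray:
    """Principal axis (unit vector) of a structure, given its pixel coordinates (N x 2)."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

def axis_correlation(observed_axis: np.ndarray, reference_axis: np.ndarray) -> float:
    """Degree of correlation (absolute cosine) between observed and stored principal axes."""
    return float(abs(np.dot(observed_axis, reference_axis)))
```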
In performing this comparison step, the image recognition module may be configured to automatically select the appropriate images from a reference to compare against the received image data. In some instances, the image recognition module is configured to compare the received image data with the reference by selecting an appropriate set of reference image data based on a determined positional location of the device. For example, the image recognition module may obtain positional information about the device (e.g., as may be obtained from sensors on the device or manually input and associated with a given image) and then select reference images that are for the same positional location as the device when the device obtained the image data being received. Alternatively, the image recognition module may automatically select appropriate sets of image data based on similarity parameters. For example, the image recognition module may automatically select the most similar sets of image data from the reference to use in the comparison step.
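A minimal, non-limiting sketch of such reference selection is given below, assuming each stored reference entry carries an optional positional tag and a numeric descriptor vector; the ReferenceEntry and select_reference names are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional
import numpy as np

@dataclass
class ReferenceEntry:
    image_data: np.ndarray          # stored reference image
    descriptor: np.ndarray          # descriptor vector (e.g., mean color, shape features)
    position: Optional[str] = None  # positional tag, when available

def select_reference(entries: List[ReferenceEntry],
                     received_descriptor: np.ndarray,
                     device_position: Optional[str] = None) -> List[ReferenceEntry]:
    """Select reference entries by the device's positional location when known,
    otherwise fall back to the most similar entry by descriptor distance."""
    if device_position is not None:
        matches = [e for e in entries if e.position == device_position]
        if matches:
            return matches
    best = min(entries, key=lambda e: float(np.linalg.norm(e.descriptor - received_descriptor)))
    return [best]
```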
The image recognition module compares the received image data with the reference in order to determine whether an alert signal should be generated. In other words, the output of the image recognition module is a decision as to whether an alert signal should be generated. If an image recognition module determines that an alert signal should be generated, it may generate the alert signal or instruct a separate module of the system to produce an alert signal.
The alert signal, when generated, may vary depending on the nature of the system. An alert signal may be a warning signal about a given system parameter or a signal that confirms to an operator of the system that a given system parameter of interest is acceptable. In some embodiments, an alert signal may include functional information about a device. For example, in these embodiments an alert signal may include information that a given device is functioning properly. In some embodiments, an alert signal may include positional information about a device. For example, an alert signal may include information as to whether or not a given device is correctly spatially positioned. In these embodiments, the alert signal may contain information that a tissue modifier of the device is contacting non-target tissue, such that the tissue modifier is not correctly spatially positioned.
The system may be configured to employ an alert signal in a variety of different ways. The system may be configured to provide the alert signal to a user of the system, e.g., via an alert signal output of the system. In addition or alternatively, the system may be configured to automatically modulate one or more operational parameters of the system based on the generation of an alert signal. For example, where the image recognition module determines that a tissue modifier is contacting non-target tissue and therefore generates an alert signal, the alert signal may automatically modulate operation of the tissue modifier, e.g., by turning it off. In some instances, the alert signal may automatically shut the system down.
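The following non-limiting sketch illustrates how a generated alert signal might both be provided to a user and automatically modulate an operational parameter (here, turning off a tissue modifier); the TissueModifier and handle_alert names, and the alert dictionary format, are assumptions of the sketch.

```python
class TissueModifier:
    """Stand-in for a controllable tissue modifier of the device."""
    def __init__(self):
        self.enabled = True

    def turn_off(self):
        self.enabled = False

def handle_alert(alert: dict, tissue_modifier: TissueModifier, notify) -> None:
    """Route an alert: always notify the operator; disable the modifier if mispositioned."""
    notify(alert["message"])                        # alert signal output to the user
    if alert.get("kind") == "non_target_contact":   # tissue modifier touching non-target tissue
        tissue_modifier.turn_off()                  # automatic modulation of operation

# Example use:
# handle_alert({"kind": "non_target_contact", "message": "Non-target tissue contact"},
#              TissueModifier(), print)
```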
Further details regarding image recognition modules are provided in U.S. application Ser. No. 12/437,186, the disclosure of which is herein incorporated by reference.
The stereoscopic image module and image recognition module, e.g., as described above, may each be implemented as software, e.g., digital signal processing software; hardware, e.g., a circuit; or combinations thereof, as desired.
In some embodiments, the devices may include a conveyance structure configured to convey an item between the distal end of the elongated member and an entry port positioned at a proximal end of the device, e.g., associated with the proximal end of the elongated member or associated with the hand-held control unit. This conveyance structure may have any convenient configuration, where in some instances it is a “working channel” disposed within the elongated member. When present as a working channel, the channel may have an outer diameter that varies, and in some instances has an outer diameter of 3 mm or less, such as 2 mm or less and including 1 mm or less. The conveyance structure may be configured to transport items, e.g., fluids, medicines, devices, to an internal target site or from an internal target site. As such, the proximal end entry port of the conveyance structure may vary, and may be configured to be operably coupled to a variety of different types of components, such as but not limited to: aspiration units, fluid reservoirs, device actuators, etc.
As indicated elsewhere, devices of the invention may be configured for wireless data transmission, e.g., to provide for one or more of: transmission of data between various components of the device, transmission of data between components of the device and another device, such as a hospital information system, a separate monitor, etc. Any convenient wireless communication protocol may be employed, where in some instances wireless communication is implemented as one or more wireless communication modules.
A video processor module may be present and be configured to control the one or more distinct visualization sensors by sending camera control data to a camera module including the visualization sensor(s). The video processor may also be configured to receive sensor data from one or more sensors and/or tools; and further, may be configured to control the sensors and/or tools by sending sensor control data to a sensor module including the one or more sensors and/or tools. The various sensors may include, but are not limited to, sensors relating to pressure, temperature, elasticity, ultrasound acoustic impedance, a laser pointer to identify and/or measure distances, etc. The various tools may include, but are not limited to, a measurement scale, teardrop probe, biopsy probe, forceps, scissors, implant device, IR lighting, ultrasound measurement device, cutting tool, etc. Depending on the specific application and sensor/tool implemented, sensor data may also be included with the image data for processing by the stereoscopic image module, in order to provide the stereoscopic images.
In certain instances, the devices of the invention include an updatable control module, by which is meant that the devices are configured so that one or more control algorithms of the device may be updated. Updating may be achieved using any convenient protocol, such as transmitting updated algorithm data to the control module using a wire connection (e.g., via a USB port on the device) or a wireless communication protocol. The content of the update may vary. In some instances, a hand-held control unit is updated to configure the unit to be used with a particular elongated member. In this fashion, the same hand-held control units may be employed with two or more different elongated members that may differ by function and have different components. In some instances, the update information may be transmitted from the particular elongated member itself, such that upon operable connection of the elongated member to the hand-held control unit, update information is transferred from the elongated member to the hand-held control unit that updates the control module of the hand-held control unit such that it can operate with that particular elongated member. The update information may also include general functional updates, such that the hand-held control unit can be updated at any desired time to include one or more additional software features and/or modify one or more existing programs of the device. The update information can be provided from any source, e.g., a particular elongated member, the internet, etc.
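By way of non-limiting illustration, one possible shape of the update handshake described above is sketched below: upon connection, the elongated member reports an identifier and supplies update data that the hand-held control unit installs before operating with that member. All class and field names are hypothetical.

```python
class ElongatedMember:
    """Stand-in for an elongated member that carries its own update information."""
    def __init__(self, member_id: str, update_data: dict):
        self.member_id = member_id
        self._update_data = update_data

    def read_update_data(self) -> dict:
        return dict(self._update_data)

class HandheldControlUnit:
    """Stand-in for a hand-held control unit with an updatable control module."""
    def __init__(self):
        self.profiles = {}          # member identifier -> installed control parameters
        self.active_profile = None

    def connect_member(self, member: ElongatedMember) -> None:
        """Install the member's update data (if not already present) and activate it."""
        if member.member_id not in self.profiles:
            self.profiles[member.member_id] = member.read_update_data()
        self.active_profile = self.profiles[member.member_id]
```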
Turning now to the figures,
From an external view, the probe piece, as shown in
Internally viewed, the device additionally contains a battery 195 that may be rechargeable, an electronic control board 190, and connectors 199 for all electrical and optical components of the device, to and from the electronic control board 190, as shown in
Within the distal tip 120 of the probe piece, as shown in
Also within the hand piece, in the present embodiment of the invention as shown in
In another embodiment, as shown in
Lastly, referring to
In another embodiment of the invention, the portable diagnostic imaging system 100 may include an element to transport material, medicine and implants to and from a point external to the hand piece and external to the distal tip 120 of the probe piece, e.g., a lumen configured as a working channel. As shown in
As mentioned above, devices of the invention may include an electronic board 190.
Another item that is electrically attached to the electronic board 190 is an electrical connector 170 for transmitting data to and from the electronic board 190 to an external transmitting or receiving means. In one embodiment of the present invention, the electrical connector 170 may be used to program a chip that may be located in the electronic component area or areas 146 and/or 147 of the electronic board 190, for example with a computer. In another embodiment, the electrical connector 170 may be used for downloading video or still images that are captured by the camera located at the distal tip 120 of the probe piece and stored in a memory chip that may be located in the electronic component area or areas 146 and/or 147 of the electronic board 190. Additionally, this memory chip may be removable from, or reattachable to, the device. In another embodiment of the present invention, the electrical connector 170 may be used to send a video signal to an external monitor. In yet another embodiment, should a wireless system not already be included within the device (as it may be in one embodiment), an external device such as a wireless adapter may be attached to the electrical connector 170 to wirelessly send data from the device to an external receiving device, for example a monitor, or to send and receive data wirelessly to and/or from, for example, a computer or other computing device.
As mentioned previously, there is also attached to the electronic board 190 a switch 180 for turning on and off the present device. In some embodiments, the switch 180 would allow for power from the battery 195, shown in
There is also attached to the electronic board 190, such as to electronic components located at either or both of electronic component areas 146 and 147, a series of switches 145 for control of the present invention, as shown in
Additionally attached to the electronic board 190, such as to electronic components located at either or both of electronic component areas 146 and 147, is a switch 140 for controlling the rotation of the motor, which in turn controls the rotation of the catheter piece. In one embodiment, the switch 140 may be configured to have one of three positions, whereby there is a neutral position in the middle, for example, and a position on either side of the neutral position for rotating the motor either clockwise or counter-clockwise as determined by the user's input.
Another attachment to the electronic board 190, and where desired to electronic components located at either or both of electronic component areas 146 and 147, is a series of connectors 199. These connectors 199 may serve a variety of functions, including control of the motors 157 or 156, the camera 122, the lighting 126, and the monitor 130. In another embodiment, the connectors are linked to a sensor located at the distal tip 120 of the catheter.
As shown in
In another embodiment of the current device 100, the monitor 113 may also be detachable from the hand piece 114 as shown in
In another embodiment of the components within the distal tip 120, as shown in
One challenge with an integrated illuminator 128 and a camera mechanically located behind a prism 122 is that stray or unintended light from the integrated illuminator 128 or another source may interfere with the camera, thereby producing a sub-optimal image. To address this issue, a visualization module may include a filtering system.
In another embodiment of the filtering means, as shown in
With reference now to
Another embodiment, for the mechanical arrangement of components located at the distal end 300 of the device, is shown in
Data transfer from the sensor to a control module in the hand piece of the device may be accomplished using any convenient approach. In certain embodiments, transferring information from sensor 390 to the electronics within the hand piece is accomplished by a connection to the electronic board 190 at a point 392 via wires 394 that are passed through the probe piece from the sensor 390 into the hand piece, as shown in
With reference now to
As mentioned above, and as shown in
In another embodiment of the sheath 404, as shown in
In another embodiment of the sheath 404, as shown in
In one or more embodiments of the current invention, it may be desirable to have the camera view in one or more directions, for example at an angle from the axis of the catheter piece, other than those directions that may be attained through rotation of the catheter piece. The direction in which the camera views may be controllable or fixed. With reference now to
The flexible portion 500 of the probe piece may be constructed in such a way as to allow for flexion of this portion of the probe piece, in one or more directions. The embodiment as shown in
Another embodiment for the viewing of the camera at an angle, for example 30 degrees from the central axis of the catheter piece, is shown in
In cases where the practitioner of the present invention is required to diagnose, for example a tissue, it may be required of the practitioner to retrieve a portion of the material under diagnosis. With reference now to
There may be cases where the practitioner of the present invention is required to scrape or cut material, for example a tissue. With reference now to
There may be cases where the practitioner of the present invention is required to deploy one or more sensors in, near, or around a material, for example a tissue. Such may be the case in a diagnosis of a material, for example a tissue, where monitoring the material in question requires continuous sensing and also requires removal of the visualization means of the present invention from, for example, a patient under diagnosis. With reference now to
As mentioned previously, there may be a wireless deployment of the sensor 812. In this case, the sensor 812 would wirelessly connect to the electronics board within the handle where its output would be processed. Any convenient wireless communication protocol may be employed. This processed output may be displayed on a monitoring means and/or recorded to a memory chip on the electronics board, for example.
Camera module 950 may include a single visualization sensor, or two or more distinct visualization sensors, which provide image data. Front end module 915 includes circuitry for receiving the image data from the camera module 950. The image data received from camera module 950 is processed by the stereoscopic image module (i.e., by image coprocessing module 940) to provide stereoscopic image data. For example, as previously described, the image data from each distinct visualization sensor may be warped to correct image distortion, and fused to construct a single stereo image taking into account three-dimensional depth information. Back end module 920 includes circuitry for sending the stereoscopic image data to display 980. Display 980 displays a three-dimensional view of the image data for the user to see.
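As a non-limiting sketch of the warp-and-fuse step, and assuming per-sensor calibration data (camera matrix and distortion coefficients) is available, OpenCV's undistortion can stand in for whatever distortion correction the video processor applies, with a simple side-by-side packing standing in for stereo fusion.

```python
import cv2
import numpy as np

def warp_and_fuse(left: np.ndarray, right: np.ndarray,
                  camera_matrix: np.ndarray, dist_coeffs: np.ndarray) -> np.ndarray:
    """Correct lens distortion on each view, then pack both views into one stereo frame."""
    left_corrected = cv2.undistort(left, camera_matrix, dist_coeffs)
    right_corrected = cv2.undistort(right, camera_matrix, dist_coeffs)
    return np.hstack((left_corrected, right_corrected))   # side-by-side stereo image
```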
Video processor module 905 may be electrically coupled with camera module 950 via an I2C bus, for example, with camera module 950 configured as the slave and microcontroller 930 as the master. Microcontroller 930 may be configured to send camera control data to the camera module 950. The camera control data may comprise information requests (e.g., for information relating to testing/debugging, for calibration data, etc.) or provide commands for controlling the camera module 950 (e.g., controlling the two or more distinct visualization sensors, etc.).
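A non-limiting sketch of the master-side control path over I2C is shown below using the smbus2 library; the bus number, slave address, and register values are placeholders rather than the camera module's actual register map.

```python
from smbus2 import SMBus

CAMERA_I2C_ADDRESS = 0x3C          # hypothetical 7-bit slave address of the camera module
REG_REQUEST_CALIBRATION = 0x10     # hypothetical register: request calibration data
REG_SENSOR_SELECT = 0x20           # hypothetical register: select the active visualization sensor

def send_camera_control(bus_number: int, register: int, value: int) -> None:
    """Write one byte of camera control data to the camera module (microcontroller as master)."""
    with SMBus(bus_number) as bus:
        bus.write_byte_data(CAMERA_I2C_ADDRESS, register, value)

def read_camera_status(bus_number: int, register: int) -> int:
    """Read one status/information byte back from the camera module."""
    with SMBus(bus_number) as bus:
        return bus.read_byte_data(CAMERA_I2C_ADDRESS, register)
```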
Sensor module 960 may include one or more sensors and/or tools previously described. The one or more sensors and/or tools implemented may provide sensor data related to their specific function and application. The sensor data is received by processor/controller module 910 and may be used in a variety of ways depending on the specific function of the sensor(s) and/or tool(s) and their application. For instance, sensor data may be used by processor/controller module 910 to provide information to a user (e.g. parameter data, calibration data, measurement readings, warnings, etc., to be displayed on display 980 or to illuminate one or more LEDs), to account for feedback signals for more accurate control of a specific sensor(s) and/or tool(s), to store in memory, to further process into additional related information, etc. Microcontroller 930 may also control the sensor module 960 via the I2C bus or General Purpose Input/Output (GPIO) interface by sending sensor control data (e.g., to control and/or calibrate the specific sensors and/or tools implemented).
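A non-limiting sketch of routing an incoming sensor reading to the uses listed above (user display, feedback control, and storage) follows; the handler objects and their method names are assumptions of the sketch.

```python
def route_sensor_data(reading: dict, display, control_loop, log: list) -> None:
    """Dispatch one sensor reading to the display, the feedback control loop, and memory."""
    display.show(f"{reading['name']}: {reading['value']} {reading.get('units', '')}")
    control_loop.update(reading)    # feedback for more accurate control of the sensor/tool
    log.append(reading)             # store in memory for later processing
```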
Processor/controller module 910 further comprises various modules for interfacing with external devices and peripherals. For example, as shown in
Aspects of the subject invention also include methods of imaging (and in some embodiments modifying) an internal target tissue of a subject. Accordingly, aspects of the invention further include methods of imaging an internal tissue site with tissue visualization devices of the invention. A variety of internal tissue sites can be imaged with devices of the invention. In certain embodiments, the methods are methods of imaging an intervertebral disc in a minimally invasive manner. For ease of description, the methods are now primarily described further in terms of imaging IVD target tissue sites. However, the invention is not so limited, as the devices may be used to image a variety of distinct target tissue sites.
With respect to imaging an intervertebral disc or portion thereof, e.g., exterior of the disc, nucleus pulposus, etc., embodiments of such methods include positioning a distal end of a minimally invasive intervertebral disc imaging device of the invention in viewing relationship to an intervertebral disc or portion thereof, e.g., nucleus pulposus, internal site of nucleus pulposus, etc. By viewing relationship is meant that the distal end is positioned within 40 mm, such as within 10 mm, including within 5 mm of the target tissue site of interest. Positioning the distal end of the imaging device in viewing relationship to the desired target tissue may be accomplished using any convenient approach, including through use of an access device, such as a cannula or retractor tube, which may or may not be fitted with a trocar, as desired. Following positioning of the distal end of the imaging device in viewing relationship to the target tissue, the target tissue, e.g., intervertebral disc or portion thereof, is imaged through use of the illumination and visualization elements to obtain image data. Image data obtained according to the methods of the invention is output to a user in the form of an image, e.g., using a monitor or other convenient medium as a display means. In certain embodiments, the image is a still image, while in other embodiments the image may be a video.
The internal target tissue site may vary widely. Internal target tissue sites of interest include, but are not limited to, cardiac locations, vascular locations, orthopedic joints, central nervous system locations, etc. In certain cases, the internal target tissue site comprises spinal tissue.
In some instances, the methods may include obtaining a tissue biopsy with a low-profile biopsy tool. For example, the methods may include advancing an annular cutting member concentrically disposed about the distal end of the elongated member beyond the distal end of the elongated member in a manner sufficient to penetrate and engage target tissue. Following tissue engagement, the annular member may be retracted in the direction of the proximal end of the elongate member in a manner sufficient to secure an amount of tissue with the device which can then be removed from the body to obtain the tissue biopsy.
The subject methods are suitable for use with a variety of mammals. Mammals of interest include, but are not limited to: race animals, e.g. horses, dogs, etc., work animals, e.g. horses, oxen etc., and humans. In some embodiments, the mammals on which the subject methods are practiced are humans.
Aspects of the invention further include methods of assembling an internal tissue visualization device. In these embodiments, the methods include operatively coupling a proximal end of an elongated member to a hand-held control unit, e.g., as described above. Depending on the particular configuration, this step of operatively coupling may include a variety of different actions, such as snapping the elongated member into a receiving structure of the hand-held control unit, twist locking the elongated member into a receiving structure of the hand-held control unit, and the like. In some instances, methods of assembling may further include sealing the hand-held control unit inside of a removable sterile covering, where the sterile covering is attached to the proximal end of the elongated member and configured to seal the hand-held control unit from the environment, e.g., as described above. In such instances, the methods may further include sealing a proximal end of the sterile covering.
The subject tissue visualization devices and methods find use in a variety of different applications where it is desirable to image an internal target tissue site of a subject while minimizing damage to the surrounding tissue. The subject devices and methods find use in many applications, such as but not limited to surgical procedures, where a variety of different types of tissues may be visualized, including but not limited to: soft tissue, cartilage, bone, ligament, etc. Specific procedures of interest include, but are not limited to, spinal fusion (such as Transforaminal Lumbar Interbody Fusion (TLIF)), total disc replacement (TDR), partial disc replacement (PDR), procedures in which all or part of the nucleus pulposus is removed from the intervertebral disc (IVD) space, arthroplasty, and the like. As such, methods of the invention also include treatment methods, e.g., where a disc is modified in some manner to treat an existing medical condition. Treatment methods of interest include, but are not limited to: annulotomy, nucleotomy, discectomy, annulus replacement, nucleus replacement, and decompression due to a bulging or extruded disc. Additional methods in which the imaging devices find use include those described in United States Published Application No. 20080255563.
Also provided are kits for use in practicing the subject methods, where the kits may include one or more of the above devices, and/or components thereof, e.g., elongated members, hand-held control units, sterile coverings, etc., as described above. The kits may further include other components, e.g., guidewires, access devices, fluid sources, etc., which may find use in practicing the subject methods. Various components may be packaged as desired, e.g., together or separately.
In addition to above mentioned components, the subject kits may further include instructions for using the components of the kit to practice the subject methods. The instructions for practicing the subject methods are generally recorded on a suitable recording medium. For example, the instructions may be printed on a substrate, such as paper or plastic, etc. As such, the instructions may be present in the kits as a package insert, in the labeling of the container of the kit or components thereof (i.e., associated with the packaging or subpackaging) etc. In other embodiments, the instructions are present as an electronic storage data file present on a suitable computer readable storage medium, e.g. CD-ROM, diskette, etc. In yet other embodiments, the actual instructions are not present in the kit, but means for obtaining the instructions from a remote source, e.g. via the internet, are provided. An example of this embodiment is a kit that includes a web address where the instructions can be viewed and/or from which the instructions can be downloaded. As with the instructions, this means for obtaining the instructions is recorded on a suitable substrate.
The following examples are offered by way of illustration and not by way of limitation.
A hand-held minimally dimensioned diagnostic device having integrated distal end visualization was constructed as follows. The device consisted of an outer SLA shell in the form of a hand-held unit housing batteries, a 3.5″ monitor, a control board, and wires that connect to two LEDs and a visualization module at the distal tip of a steel 4 mm hypodermic tube that was connected to the handle. The tubing was bent about an inch back from the distal tip to about 30 degrees. A manual wheel was provided on the hand-piece connected to the tube, and when actuated, rotated the tube 180 degrees in each direction. Considering a field of view for the camera of roughly 120 degrees (diagonal), the rotation of the tube allowed the camera to view at least a full hemisphere of space. The visualization module at the 4 mm outer diameter distal tip of the hypodermic tube included an Omnivision 6920 QVGA imaging chip (Santa Clara, Calif.), a series of lenses, an aperture, an IR filter, and a cover-glass within a small steel housing. In addition, LEDs were placed behind the flat cover-glass, but distal to the aperture. Thus, due to the configuration of the camera lens and lighting, there is little incidence of stray light affecting the image. In the constructed device, the signal from the powered camera goes through a series of electronic components where it is processed in a manner useful for the control board, and wires send the data to the control board where it is then displayed on the monitor. The monitor also rotates. QVGA resolution was observed for the image displayed on the 3.5 inch monitor.
Although the foregoing invention has been described in some detail by way of illustration and example for purposes of clarity of understanding, it is readily apparent to those of ordinary skill in the art in light of the teachings of this invention that certain changes and modifications may be made thereto without departing from the spirit or scope of the appended claims. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting, since the scope of the present invention will be limited only by the appended claims.
Accordingly, the preceding merely illustrates the principles of the invention. It will be appreciated that those skilled in the art will be able to devise various arrangements which, although not explicitly described or shown herein, embody the principles of the invention and are included within its spirit and scope. Furthermore, all examples and conditional language recited herein are principally intended to aid the reader in understanding the principles of the invention and the concepts contributed by the inventors to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents and equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure. The scope of the present invention, therefore, is not intended to be limited to the exemplary embodiments shown and described herein. Rather, the scope and spirit of the present invention are embodied by the appended claims.