The medial meniscus and lateral meniscus are crescent-shaped bands of thick, pliant cartilage attached to the shinbone (tibia). Meniscectomy is the surgical removal of all or part of a torn meniscus. The lateral meniscus is on the outside of the knee, is generally shaped like a circle, and covers 70% of the tibial plateau. The medial meniscus is on the inner side of the knee joint, has a C shape, and is thicker posteriorly. Because the inner portion of the meniscus does not have good vascular flow, tears there are less likely to heal. The current surgical procedure for treating damaged meniscus cartilage typically involves partial meniscectomy by arthroscopic removal of the unstable portion of the meniscus and balancing of the residual meniscal rim. Postoperative therapy typically involves treatment for swelling and pain, strengthening exercises, and limits on weight-bearing movement depending on the extent of tissue removal.
Existing arthroscopic techniques utilize a first percutaneous entry to insert an arthroscope that is 4-5 mm in diameter in order to inspect the condition of the meniscus. After visual confirmation of the nature of the injury, the surgeon can elect to proceed with insertion of surgical tools to remove a portion of the meniscus.
A hip joint is essentially a ball and socket joint. It includes the head of the femur (the ball) and the acetabulum (the socket). Both the ball and socket are congruous and covered with hyaline cartilage (hyaline cartilage on the articular surfaces of bones is also commonly referred to as articular cartilage), which enables smooth, almost frictionless gliding between the two surfaces. The edge of the acetabulum is surrounded by the acetabular labrum, a fibrous structure that envelops the femoral head and forms a seal to the hip joint. The acetabular labrum includes a nerve supply and as such may cause pain if damaged. The underside of the labrum is continuous with the acetabular articular cartilage so any compressive forces that affect the labrum may also cause articular cartilage damage, particularly at the junction between the two (the chondrolabral junction).
The acetabular labrum may be damaged or torn as part of an underlying process, such as femoroacetabular impingement (FAI) or dysplasia, or may be injured directly by a traumatic event. Depending on the type of tear, the labrum may be either trimmed (debrided) or repaired. Various techniques are available for labral repair, mainly using anchors, which may be used to re-stabilize the labrum against the underlying bone to allow it to heal in position.
Similarly, articular cartilage on the head of the femur and the acetabulum may be damaged or torn, for example, as a result of a trauma, a congenital condition, or just constant wear and tear. When articular cartilage is damaged, a torn fragment may often protrude into the hip joint, causing pain when the hip is flexed. Moreover, the bone material beneath the surface may suffer from increased joint friction, which may eventually result in arthritis if left untreated. Articular cartilage injuries in the hip often occur in conjunction with other hip injuries, such as labral tears.
Removal of loose bodies is a common reason physicians perform hip surgery. Loose bodies may often be the result of trauma, such as a fall, an automobile accident, or a sports-related injury, or they may result from degenerative disease. When a torn labrum rubs continuously against cartilage in the joint, this may also cause fragments to break free and enter the joint. Loose bodies can cause a “catching” in the joint and cause both discomfort and pain. As with all arthroscopic procedures, hip arthroscopy is undertaken with fluid in the joint, and there is a risk that some of this fluid can escape into the surrounding tissues during surgery and cause local swelling. Moreover, the distension of the joint can result in a prolonged recovery time. Thus, there exists a need for improved systems and methods for performing minimally invasive procedures on the hip joint.
The present disclosure relates to systems and methods utilizing a small diameter imaging probe (e.g., endoscope) and a small diameter surgical tool for simultaneously imaging and performing a minimally invasive procedure on an internal structure within a body. More particularly, a small diameter imaging probe and a small diameter arthroscopic tool can each include distal ends operatively configured for insertion into a narrow access space, for example, an access space less than 4 mm across at the narrowest region, more preferably less than 3 mm across at the narrowest region, and for many embodiments preferably less than 2 mm across at the narrowest region. Thus, for example, the imaging probe and arthroscopic tool each have a distal end characterized by a diameter of less than 4 mm across at the largest region, more preferably less than 3 mm across at the largest region, and most preferably less than 2 mm across at the largest region of each device.
In some embodiments, the region may be accessed, for example, through a joint cavity characterized by a narrow access space. Example procedures which may require access via a joint cavity characterized by a narrow access space may include procedures for repairing damage to the meniscus in the knee joint and procedures for repairing damage to the labrum in the hip and shoulder joints, for example. Advantageously, the systems and methods described herein enable accessing, visualizing and performing a procedure on a damaged region accessed via a joint cavity without the need for distension or other expansion of the joint cavity, for example, by injection of fluids under pressure or dislocation of the joint. Thus, the systems and methods of the present disclosure enable significant improvements in speeding up recovery time and preventing and/or mitigating complications. It will be appreciated that the arthroscopic tool may be any arthroscopic tool for performing a procedure on a damaged region that meets the dimensional requirements and that enables alignment with the visualization system described herein.
In exemplary embodiments, the imaging probe may enable visualization of both the target region and the arthroscopic tool thereby providing real-time visual feedback on a procedure being performed by the arthroscopic tool, for example a surgical procedure. It will be appreciated that the arthroscopic tool may be any arthroscopic tool for performing a procedure on a target region.
In some embodiments, the imaging probe may be characterized by an offset field of view, for example, offset from an insertion axis, wherein the distal end of the imaging probe enables viewing at a non-zero angle relative to the insertion axis. In example embodiments, the field of view may include an offset axis having an angle relative to the insertion axis in a range of 5-45 degrees. Advantageously, the offset field of view may enable improved visualization of the target region and/or of the arthroscopic tool.
In some embodiments, the distal ends of the imaging probe and/or arthroscopic tool may be operatively configured for insertion into an access space having a predefined geometry, for example a curved geometry. Thus, for example, the distal ends of the imaging probe or endoscope and/or arthroscopic tool may include one or more regions shaped to substantially match a predefined geometry, for example, shaped to include a particular curvature to improve access to the region of interest. Example predefined geometries may include the curved space between the femoral head and the acetabulum in the hip joint or the curved space between the head of the humerus and the glenoid fossa of the scapula in the shoulder joint. In some embodiments, the predefined geometry may be selected based on patient demographics, for example, based on age, gender, or build (i.e., height and weight).
In exemplary embodiments, the systems and methods may utilize one or more cannulas in conjunction with the imaging probe and/or arthroscopic tool described herein. In some embodiments, the cannula may be a single port cannula defining a single guide channel for receiving the imaging probe or arthroscopic tool therethrough. Alternatively, the cannula may be a dual port cannula, defining a pair of guide channels for receiving, respectively, the imaging probe and arthroscopic tool. In the dual port configuration, the cannula may be used to advantageously define a relative spatial positioning and/or orientation along one or more axes between the imaging probe and arthroscopic tool. For example, in some embodiments, the cannula may constrain the relative positioning of the imaging probe and arthroscopic tool to movement along each of the insertion axes defined by the guide channels. In yet further embodiments, the cannula may fix the orientation of the imaging probe and/or arthroscopic tool within its guide channel, for example to fix the orientation relative to the position of the other port. Thus, the cannula may advantageously be used to position and/or orientate the imaging probe and arthroscopic tool relative to one another, for example, in vivo, thereby enabling alignment of the field of view of the imaging probe with an operative portion or region of the body being treated with the arthroscopic tool.
Advantageously, a cannula as described herein may be operatively configured for insertion along an entry path between an entry point (for example, an incision) and an access space of a region of interest. In some embodiments, the cannula may be configured for insertion into the access space of the target region, for example, at least part of the way to the treatment site. Alternatively, the cannula may be configured for insertion along an entry path up until the access space with only the imaging probe and/or arthroscopic tool entering the access space. In some embodiments, the cannula may be configured for insertion via an entry path having a predefined geometry and may therefore be shaped to substantially match the predefined geometry. In some embodiments, the predefined geometry of the entry path and the predefined geometry of the access space may be different. Thus, in exemplary embodiments, the cannula may be used to define a predefined geometry along the entry path up until the access space while the distal end(s) of the imaging probe and/or arthroscopic tool protruding from a distal end of the cannula may be used to define the predefined geometry along the access space. For example, the cannula may be used to define a relatively straight entry path up until the access space, and the distal ends of the imaging probe and/or arthroscopic tool may be used to define a curved path through the access space. In some embodiments, the distal end(s) of the imaging probe and/or arthroscopic tool may include a resilient bias with respect to a predetermined geometry of the access space. Thus, the cannula may be used to rigidly constrain the shape of the distal end(s) up until the point of protrusion. Thus the positioning of the distal end of the cannula may, for example, determine a point at which the insertion path changes, for example from a straight entry path to a curved path through the access space.
In some embodiments, the cannula(s) or the visualization device or the arthroscopic tool may include a port for delivering medication or another therapeutic agent to the joint in question. For example, the arthroscopic tool may include an injection/delivery port for injecting/delivering a stem cell material into a joint cavity, and more particularly, with respect to a cartilage area of the target region, e.g., to facilitate repair thereof.
In accordance with the arthroscopic surgical method described herein, a patient was prepped and draped for a lateral meniscectomy. No leg holder or post was employed, to allow for limb flexibility. The patient was draped and sterile technique applied as is standard. No forced insufflation of the joint via pump or gravity flow was employed as would traditionally occur. The injection port was employed for any aspiration or delivery of saline required to clear the surgical field. Empty syringes were used to clear the view when occluded by either synovial fluid or injected saline. No tourniquet was employed in the case. A modified insertion port (relative to traditional arthroscopy ports) was chosen for insertion of the cannula and trocar. The position (lower) was modified given the overall size and viewing angle of the scope (1.4 mm diameter, 0 degree), which allows the user to migrate through the joint without distension. Following insertion of the endoscopic system and visual confirmation of the lateral meniscus tear, a surgical access port was established with the use of a simple blade. Under direct visualization and via the access port, traditional arthroscopic punches were employed (straight, left/right, and up/down) to trim the meniscus. Visualization was aided during these periods by the injection of sterile saline 40 via a tubing extension set in short bursts of 2 to 4 cc at a time. Leg positioning via figure four and flexion/extension was employed throughout the procedure to open the joint and allow for optimal access to the site. Alternatively, a standard shaver hand piece was inserted into the surgical site to act as a suction wand to clear the site of any residual saline or synovial fluid. Multiple cycles of punches, irrigation, and suctioning of the site were employed throughout the procedure to remove the offending meniscal tissue. Following final confirmation of correction and the absence of any loose bodies, the surgical site was sutured closed while the endoscope's entry site was covered with a band-aid. Preferably, both arthroscopic ports are closed without suturing due to their small size.
In a preferred embodiment, a wireless endoscopy system is configured to broadcast low-latency video that is received by a receiver and displayed on an electronic video display. The system operates at a video rate such that the user, such as a surgeon, can observe his or her movement of the distal end of the endoscope with minimal delay. This minimal configuration lacks the storage of patient data and procedure imagery, but compared to existing endoscopy systems it provides the benefits of a low number of components, low cost, and manufacturing simplicity. In a second embodiment, the wireless endoscopy system is configured to broadcast low-latency video to an electronic video display and also to a computer or tablet that executes application software that provides one or more of: patient data capture, procedure image and video storage, image enhancement, report generation, and other functions of medical endoscopy systems.
Preferred embodiments relate to a high-definition camera hand-piece that is connected to a control unit via a multi-protocol wireless link. In addition to the image sensor, the high definition camera unit contains a power source and associated circuitry, one or more wireless radios, a light source, a processing unit, control buttons, and other peripheral sensors. The control unit contains a system on chip (SOC) processing unit, a power supply, one or more wireless radios, a touchscreen enabled display, and a charging cradle for charging the camera hand-piece. By connecting the camera unit to the control unit in this way, this invention provides a real-time high definition imaging system that is far less cumbersome than traditional hard-wired systems.
The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular description of preferred embodiments of the invention as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
Preferred embodiments of the invention are directed to devices and methods for minimally invasive arthroscopic procedures. A first percutaneous entry position is used to insert a small diameter endoscope such as that described in U.S. Pat. No. 7,942,814 and U.S. application Ser. No. 12/439,116 filed on Aug. 30, 2007, and also in U.S. application Ser. No. 12/625,847 filed on Nov. 25, 2009, the entire contents of these patents and applications being incorporated herein by reference.
The present invention enables the performance of surgical procedures without the use of distension of the joint. Without the application of fluid under pressure to expand the volume accessible, a much smaller volume is available for surgical access. Existing techniques employ a pump pressure of 50-70 mmHg to achieve fluid distension of knee joints suitable for arthroscopic surgery. A tourniquet is used for an extended period to restrict blood flow to the knee. The present invention provides for the performance of arthroscopic procedures without fluid distension and without the use of a tourniquet. Low pressure flushing of the joint can be done using, for example, a manual syringe to remove particulate debris and fluid.
A preferred embodiment of the invention utilizes positioning of the knee in a “figure four” position to achieve separation of the femur from the tibia to provide access. This orientation provides a small aperture in which to insert devices into the joint cavity to visualize and surgically treat conditions previously inaccessible to larger-sized instruments.
As depicted in
The handle 22 is configured to operate with one or more imaging sensors that are optically coupled to a fiber optic imaging bundle that extends within an imaging tube 25 as depicted in
The handle 22 is attachable to an endoscope 23 which comprises the tubular body 25 and a base 37. The housing 37 includes one or more lens elements to expand an image from the fiber optic imaging bundle within the tubular body 25. The base also attaches the endoscope 23 to the handle 22.
The handle 22 can include control elements 21 to operate the handle. A sheath 24 includes a tubular section 34, a base or optical barrier 32 that optically encloses the housing 37, and a sleeve or camera barrier 36 that unfolds from the proximal end of the base 32 to enclose the handle and a portion of the cable 18. The user can either slide their operating hand within the barrier to grasp and operate the handle, or can grasp the handle with a gloved hand that is external to barrier 36.
During a procedure, the user first inserts the cannula 27 through the skin of the patient and into the joint cavity. The endoscope tube 25 is inserted into a lumen within the sheath tube 34 which is enclosed at the distal end by a window or lens. The sleeve is extended over the handle 22, and the sheath and endoscope are inserted into the cannula.
The assembled components are illustrated in
Shown in
As seen in the view of
Shown in
A single port system 200 for arthroscopic repair is shown in
A single cannula 206 can be used having a first channel to receive the flexible sheath and endoscope body. In this embodiment, the rigid tool 42 can be inserted straight through a second channel of the cannula 206. Note that the proximal end of the cannula 206 shown in
A further embodiment of a system 300 is shown in
In the alternative embodiments illustrating cannula insertion,
Shown in
The cross-sectional view of the cannula 402 seen in
With reference to
Advantageously, the distal ends 1110 and 1210 of the imaging probe assembly 1100 and surgical tool assembly 1200 may be dimensioned and shaped so as to enable access and visualization of the damaged region 1310 of the hip joint 1300 and performance of a surgical process on the damaged region 1310, all while minimizing the need for distension or other expansion of the joint cavity such as by injection of fluids and/or distraction of the hip joint 1300. Thus, the distal ends 1110 and 1210 of the imaging probe assembly 1100 and surgical tool assembly 1200 may be less than 4 mm in diameter, more preferably, less than 3 mm in diameter and most preferably less than 2 mm in diameter. Moreover, as depicted, the distal ends 1110 and 1210 of the imaging probe assembly 1100 and surgical tool assembly 1200 may be shaped to substantially match the curved access space 1320 between the femoral head 1302 and the acetabulum 1304 in the hip joint 1300. The various exemplary embodiments depicted in
With reference to
As depicted, the imaging probe assembly 1100 may comprise an endoscopic system 20 similar to the endoscopic system 20 described with respect to
Similar to the setup in
With reference still to
In an exemplary arthroscopic hip procedure, the cannulas 27 and 40 for the endoscopic system 20 and a surgical tool 42 may be inserted into a patient along entry paths defined between an entry point (for example, an incision) and an access space of a damaged region of the hip joint, for example, the curved access space 1320 between the femoral head 1302 and the acetabulum 1304 in the hip joint 1300, such as the chondrolabral junction. In some embodiments (see, e.g.,
With reference now to
With reference now to
With reference now to
With reference now to
As depicted, the imaging probe assembly 1100 may comprise an endoscopic system 20 similar to the endoscopic system 20 described with respect to
Similar to the setup in
With reference still to
In contrast with the embodiment of
With reference now to
In some embodiments, the distal end(s) of the imaging probe and/or surgical tool may include a resilient bias with respect to a predetermined geometry of the access space. Thus, the imaging probe and/or surgical tool may advantageously bend in a predetermined manner upon protrusion from a cannula, e.g., to facilitate insertion into a curved access space. In such embodiments, the cannula may be used to rigidly constrain the shape of the distal end until the point of protrusion. Thus the positioning of the distal end of the cannula may, for example, determine a point at which the insertion path changes, for example from a straight entry path to a curved path through the curved access space. With reference to
It will be appreciated by one of ordinary skill in the art that any number of mechanisms may be used to cause a bend in a distal end of an imaging probe, surgical tool and/or cannula. For example, shape memory material (for example, heat sensitive shape memory materials), articulating segments, and other mechanisms may be utilized. In some embodiments, a cannula may include one or more telescopic distal portions. In exemplary embodiments, such telescopic distal portions may exhibit a resilient bias with respect to a predetermined geometry of the access space. In other embodiments, a cannula may include articulating segments which may be used to shape and steer the path of the cannula.
With reference now to
Advantageously, the distal ends 1110 and 1210 of the imaging probe assembly 1100 and surgical tool assembly 1200 may be dimensioned and shaped so as to enable access and visualization of a damaged region 1410 of the shoulder joint 1400 and performance of a surgical process on the damaged region 1410, all while minimizing the need for distension or other expansion of the joint cavity such as by injection of fluids and/or distraction of the shoulder joint 1400. Thus, the distal ends 1110 and 1210 of the imaging probe assembly 1100 and surgical tool assembly 1200 may be less than 4 mm in diameter, more preferably less than 3 mm in diameter and most preferably less than 2 mm in diameter. Moreover, as depicted, the distal ends 1110 and 1210 of the imaging probe assembly 1100 and surgical tool assembly 1200 may be shaped to substantially match the curved access space 1420 between the head of the humerus 1402 and the glenoid fossa 1404 of the scapula in the shoulder joint.
Shown in
The illumination fiber bundle 1644 can comprise an annular array of optical fibers within a polymer or plastic matrix that is attached to the outer surface of tube 1646. At the distal end of the illumination fiber assembly 1644 is a light transmitting sleeve 1648 that is shaped to direct light emitted from the distal ends of the fiber assembly 1644 at the correct angle of view. The sleeve 1648 operates to shape the light to uniformly illuminate the field of view at the selected angle. Thus, the sleeve's illumination distribution pattern will vary as a function of the angle of view 1670.
Illustrated in
Turning more particularly to the drawings relating to a wireless endoscope handle, an embodiment of the wireless endoscopy system embodying the present invention is depicted generally at 1800 in
The first embodiment of the endoscopy system 1800 includes the camera handpiece 1810, the endoscope 1815, the receiver 1825 and display 1830, and a sterile barrier 1845 in the form of an illumination sheath 1850 that is discussed herein.
In some applications, it is permissible to sterilize the endoscope 1815 prior to each endoscopic imaging session. In other applications it is preferable to sheath the endoscope with a sterile barrier 1845. One type of sterile barrier 1845 is an illumination sheath 1850, similar to those described in U.S. Pat. No. 6,863,651 and U.S. Pat. App. Pub. 2010/0217080, the entire contents of this patent and patent application being incorporated herein by reference. The sheath carries light from illumination source 1835 such that it exits the distal tip of the illumination sheath 1850.
Another type of sterile barrier 1845 does not require the handpiece 1810 to contain a source of illumination in that the sterile barrier 1845 can contain a source of illumination, for example an embedded illuminator 1836 in the proximal base, or a distal tip illuminator 1837 such as a millimeter-scale white light emitting diode (LED). In these cases, power can be coupled from the means of powering a source of illumination 1840. In all cases, the sterile barrier 1845 may or may not be disposable. The camera handpiece 1810 may perform other functions and has a variety of clinically and economically advantageous properties.
It will be understood in the field of endoscopy that elements of the endoscopy system 1800 can also be in communication with a wired or wireless network. This has utility, for example, for transmitting patient reports or diagnostic image and video data via electronic mail, to a picture archiving and communication system (PACS), or to a printer.
In a preferred embodiment, the camera handpiece 1810 receives optical energy corresponding to clinical imagery at an image capture electro-optical module, such as a digital image sensor module, model number STC-HD203DV, manufactured by Sensor Technologies America, Inc. (Carrollton, Tex.), having an HD active pixel resolution of at least 1920×1080 (i.e., at least 2 million pixels) and a physical enclosure measuring at least 40 mm×40 mm×45.8 mm (i.e., between 60,000 mm3 and 200,000 mm3), and provides HDMI-formatted image data to a wireless video transmitter module 1880, such as the Nyrius ARIES Prime Digital Wireless HDMI Transmitter or the Amimon Ltd. AMN 2120 or 3110 (Herzlia, Israel).
The wireless video transmitter module 1880 broadcasts the radio frequency signals 1820 indicative of the clinical imagery described in an earlier illustration. A power source 1882, for example a rechargeable battery 1884 or a single-use battery, and power electronics 1886, may receive electrical energy from a charger port 1888. The power electronics 1890 is of a configuration well-known to electrical engineers and may provide one or more current or voltage sources to one or more elements of the endoscopy system 1800. The power source 1882 generates one or more voltages or currents as required by the components of the camera handpiece 1810 and is connected to the wireless video transmitter module 1880, the image capture electro-optical module 1881, and the illumination source 1835 such as a white light-emitting diode (LED). For illustrative purposes, an LED power controller 1892 and a power controller for external coupling 1894 are also depicted, which can optionally be included in the handle.
It will be appreciated that the first embodiment incorporates a greatly reduced component count compared to existing endoscopy systems and intentionally provides sufficient functionality to yield an endoscopy system when paired with a suitable wireless video receiver 1825, such as one using the Amimon AMI 2220 or 3210 chipsets, and the electronic display 1830, such as the LCD display described earlier, which preferably operates at HD resolution.
In other embodiments, as illustrated in
The camera handpiece 1810 may include a camera controller and additional electronics 1898 in unidirectional or bidirectional communication with the image capture electro-optics module 1881. The camera controller and additional electronics 1898 may contain processing and embedded memory 1885 and may perform any of the following functions:
1. Sets imaging parameters by sending commands to the image capture electro-optics module, such as parameters corresponding to white balance, image enhancement, gamma correction, and exposure.
2. For an associated control panel 1896 having buttons or other physical user interface devices, interprets button presses corresponding, for example, to: “take snapshot,” “start/stop video capture,” or “perform white balance.”
3. Controls battery charging and power/sleep modes.
4. Performs boot process for imaging parameter settings.
5. Interprets the data generated by an auxiliary sensor 1897, for example “non-imaging” sensors such as an RFID or Hall Effect sensor, or a mechanical, thermal, fluidic, or acoustic sensor, or imaging sensors such as photodetectors of a variety of visible or non-visible wavelengths. For example, the electronics 1898 can generate various imager settings or broadcast identifier information based on whether the auxiliary sensor 1897 detects that the endoscope 1815 is or is not made by a particular manufacturer, detects that the sterile barrier 1845 is or is not made by a particular manufacturer, or performs other useful functions. As an illustrative example, if the camera handpiece 1810 is paired with an endoscope 1815 that is made by a different manufacturer than that of the camera handpiece, and lacks an identifier such as an RFID tag, then the system does not detect that endoscope's model number or manufacturer and thus can be commanded to operate in a “default imaging” mode. If an endoscope of a commercially-approved manufacturer is used and does include a detectable visual, magnetic, RFID, or other identifier, then the system can be commanded to operate in an “optimized imaging” mode. These “default” and “optimized” imaging modes can be associated with particular settings for gamma, white balance, or other parameters; this mode selection is illustrated in the sketch following this list. Likewise, other elements of the endoscopy system 1800 can have identifiers that are able to be sensed or are absent. Such other elements include the sterile barrier 1845.
6. Includes a memory for recording snapshots and video.
7. Includes a MEMS sensor and interpretive algorithms to enable the camera handpiece to enter a mode of decreased power consumption if it is not moved within a specified period of time.
8. Includes a Bluetooth Low Energy (BLE) module, WiFi module, or other wireless means that transmits and/or receives the procedure data 1855.
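The identifier-based selection between the “default imaging” and “optimized imaging” modes in item 5 can be summarized as follows. This is a minimal sketch in Python, offered for illustration only: the parameter values, approved identifiers, and read_tag() helper are assumptions, not the actual firmware interface of the handpiece 1810.

```python
# Minimal sketch of the identifier-based imaging-mode selection described in
# item 5 above. The parameter names and values, the approved identifiers, and
# the read_tag() helper are hypothetical illustrations, not the actual
# firmware interface.

DEFAULT_IMAGING = {"gamma": 1.0, "white_balance": "auto", "exposure": "auto"}
OPTIMIZED_IMAGING = {"gamma": 0.8, "white_balance": "preset", "exposure": "fixed"}

APPROVED_IDS = {"SCOPE-A100", "SCOPE-B200"}  # assumed identifiers of approved endoscopes


def select_imaging_mode(read_tag):
    """Return imager settings based on what the auxiliary sensor detects.

    read_tag stands in for an RFID/Hall-effect/optical identifier read; it
    returns an identifier string, or None when no identifier is present.
    """
    tag = read_tag()
    if tag is not None and tag in APPROVED_IDS:
        # A recognized identifier was detected on the endoscope (or sterile
        # barrier): apply the settings tuned for that model.
        return OPTIMIZED_IMAGING
    # Unrecognized or absent identifier: fall back to the default settings.
    return DEFAULT_IMAGING


if __name__ == "__main__":
    print(select_imaging_mode(lambda: "SCOPE-A100"))  # "optimized imaging" settings
    print(select_imaging_mode(lambda: None))          # "default imaging" settings
```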
The camera controller and additional electronics 1894 may optionally be in communication with an electronic connector 1898 that transmits or receives one or more of: power, imagery, procedure settings, or other signals that may be useful for the endoscopy system 1800. The electronic connector 1898 can be associated with a physical seal or barrier, such as a rubber cap, so as to enable sterilization of the camera handpiece 1810.
Illustrated in
A camera module 2015, contained in the camera hand-piece 2010, receives optical energy from an illuminated scene that is focused onto the camera module's active elements in whole or in part by an endoscope 2013. The camera module 2015 translates the optical energy into electrical signals, and exports the electrical signals in a known format, such as the high definition multimedia interface (HDMI) video format. An example of this module is the STC-HD203DV from Sensor Technologies America, Inc.
The handheld camera device 2010 wirelessly transmits the HDMI video signal with low latency, preferably in real time, to a wireless video receiver 2003 via a wireless video transmitter 2006. The wireless video receiver 2003 is a component within the camera control unit 2002. An example of this wireless chipset is the AMN2120 or 3110 from Amimon Ltd.
In addition to the wireless video link described, a wireless control transceiver 2007 is used for relaying control signals between the camera device 2010 and the camera control unit 2002, for example control signals indicative of user inputs such as button-presses for snapshots or video recording. The wireless control transceiver 2007 is implemented using a protocol such as the Bluetooth Low Energy (BLE) protocol, for example, and is paired with a matching control transceiver 2012 in the camera control unit 2002. An example of a chipset that performs the functionality of the wireless control transceiver 2007 is the CC2541 from Texas Instruments, or the nRF51822 from Nordic Semiconductor. The wireless control transceiver 2007 sends and receives commands from a processing unit 2004, which can include a microcontroller such as those from the ARM family of microcontrollers.
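One way to picture the control-signal relay between the transceivers 2007 and 2012 is the following minimal sketch; the one-byte opcodes and three-byte frame are hypothetical placeholders for whatever BLE characteristic and command set an implementation actually uses.

```python
# Minimal sketch of relaying button-press control signals between the camera
# hand-piece 2010 and the camera control unit 2002. The one-byte opcodes and
# the three-byte frame are hypothetical placeholders for the BLE link (e.g.,
# a GATT characteristic write); they are not the actual protocol.

CMD_SNAPSHOT = 0x01
CMD_VIDEO_START_STOP = 0x02
CMD_WHITE_BALANCE = 0x03


def encode_button_press(command: int, sequence: int) -> bytes:
    """Pack a command and a rolling sequence number into a small frame."""
    sequence &= 0xFF
    checksum = (command + sequence) & 0xFF
    return bytes([command & 0xFF, sequence, checksum])


def decode_button_press(frame: bytes):
    """Validate and unpack a frame received on the other side of the link."""
    if len(frame) != 3:
        raise ValueError("malformed frame")
    command, sequence, checksum = frame
    if checksum != (command + sequence) & 0xFF:
        raise ValueError("checksum mismatch")
    return command, sequence


if __name__ == "__main__":
    frame = encode_button_press(CMD_SNAPSHOT, sequence=7)
    assert decode_button_press(frame) == (CMD_SNAPSHOT, 7)
```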
In the first embodiment, the processing unit 2004 is in communication with, and processes signals from, several peripheral devices. The peripheral devices include one or more of: user control buttons 2014, an identification sensor 2103, an activity sensor 2005, a light source controller 2112, a battery charger 2109, and a power distribution unit 2008.
The identification sensor 2103 determines the type of endoscope 2013 or light guide that is attached to the camera hand-piece 2010. The processing unit 2004 sends the endoscope parameters to the camera control unit 2002 via the wireless control transceiver 2007. The camera control unit 2002 is then able to send camera module setup data, corresponding to the endoscope type, to the processing unit 2004 via the wireless control transceiver 2007. The camera module setup data is then sent to the camera module 2015 by the processing unit 2004. The camera module setup data is stored in a non-volatile memory 2102. The processing unit 2004 controls the power management in the camera hand-piece 2010 by enabling or disabling power circuits in the power distribution unit 2008.
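The setup-data exchange keyed to the detected endoscope type can be sketched as below. The scope identifiers, parameter tables, and the dictionary standing in for the non-volatile memory 2102 are assumptions for illustration, not the actual data exchanged over the wireless control transceiver 2007.

```python
# Minimal sketch of the endoscope-identification handshake: the hand-piece
# reports the detected scope type, the control unit returns camera-module
# setup data for that type, and the result is cached in non-volatile memory.
# The scope type names, parameter values, and the nvm dict are hypothetical.

SETUP_TABLE = {  # held by the camera control unit
    "2mm-0deg":  {"exposure": "auto", "gain": 2, "gamma": 0.9},
    "2mm-30deg": {"exposure": "auto", "gain": 3, "gamma": 0.8},
}
DEFAULT_SETUP = {"exposure": "auto", "gain": 1, "gamma": 1.0}


def control_unit_lookup(scope_type: str) -> dict:
    """Control-unit side: map the reported scope type to setup data."""
    return SETUP_TABLE.get(scope_type, DEFAULT_SETUP)


def configure_camera_module(scope_type: str, nvm: dict) -> dict:
    """Hand-piece side: request setup data and cache it in non-volatile memory."""
    setup = control_unit_lookup(scope_type)  # stands in for the wireless round trip
    nvm[scope_type] = setup                  # persist for subsequent power-ups
    return setup                             # would then be written to the camera module


if __name__ == "__main__":
    nvm = {}
    print(configure_camera_module("2mm-30deg", nvm))
```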
The processing unit 2004 puts the camera hand-piece 2010 into a low power mode when activity has not been detected by an activity sensor 2005 for some period of time. The activity sensor 2005 can be any device from which product use can be inferred, such as a MEMS-based accelerometer. The low power mode can alternatively be entered when a power gauge 2114, such as one manufactured by Maxim Integrated, detects that a battery 2110 is at a critically low level. The power gauge 2114 is connected to the processing unit 2004 and sends the status of the battery to the camera control unit 2002 via the wireless control transceiver 2007. The processing unit 2004 can also completely disable all power to the camera hand-piece 2010 when it has detected that the camera hand-piece 2010 has been placed into a charging cradle 2210 of the camera control unit 2002. In an embodiment in which the camera hand-piece is capable of being sterilized, the charging cradle 2210 and corresponding battery charger input 2111 contain a primary coil for the purpose of inductively charging the battery 2110 in the camera hand-piece 2010. In another embodiment where sterilization is not required, the charging cradle 2210 and corresponding battery charger input 2111 contain metal contacts for charging the battery 2110 in the camera hand-piece 2010. The touchscreen operates in response to a touch processor that is programmed to respond to a plurality of touch icons and touch gestures associated with specific operational features described herein.
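The power-management behavior just described can be condensed into the following minimal sketch, assuming a hypothetical inactivity timeout and battery threshold; the actual firmware values are not specified in this description.

```python
# Minimal sketch of the hand-piece power management described above. The
# timeout, battery threshold, and sensor readings are hypothetical
# placeholders, not the actual firmware values.
import time

INACTIVITY_TIMEOUT_S = 120   # assumed idle period before entering low-power mode
CRITICAL_BATTERY_PCT = 5     # assumed critically-low battery level


def choose_power_state(last_motion_time, battery_pct, in_charging_cradle, now=None):
    """Return 'off', 'low_power', or 'active' from the current readings."""
    now = time.monotonic() if now is None else now
    if in_charging_cradle:
        # Hand-piece detected in the charging cradle: all power can be disabled.
        return "off"
    if battery_pct <= CRITICAL_BATTERY_PCT:
        # Power gauge reports a critically low battery.
        return "low_power"
    if now - last_motion_time >= INACTIVITY_TIMEOUT_S:
        # No activity detected by the accelerometer within the timeout.
        return "low_power"
    return "active"


if __name__ == "__main__":
    t0 = time.monotonic()
    print(choose_power_state(t0, 80, False, now=t0 + 10))   # active
    print(choose_power_state(t0, 80, False, now=t0 + 300))  # low_power
    print(choose_power_state(t0, 80, True, now=t0 + 10))    # off
```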
Referring still to
With reference to
Preferred embodiments can utilize different coupling from the handle to the endoscope to enable illumination: one or more LEDs coupled to fiber optics; one or more LEDs coupled to a thin light guide film; one or more LEDs mounted in the tip of the endoscope; fiber optics or thin light guide film or HOE/DOE arranged on the outside diameter of the elongated tube; or the elongated tube itself can be a hollow tube with one end closed. The tube is made of light pipe material and the closed end is optically clear. The clear closed end and the light pipe tube can be extruded as one piece so it provides a barrier for the endoscope inside. This light source can be used for imaging through turbid media. In this case, the camera uses a polarizing filter as well.
The illumination can employ time-varying properties, such as one light source whose direction is modulated by a time-varying optical shutter or scanner (MEMS or diffractive) or multiple light sources with time-varying illumination.
To provide video with low latency, preferred embodiments can employ parallel-to-serial conversion of camera module data (in the case where the module output is raw RGB and a cable is used to connect the camera to the camera control unit); direct HDMI from the camera module (can be used with or without a cable); a cable harness for transmission of video data to a post-processing unit in the absence of wireless; or Orthogonal Frequency Division Multiplexing (OFDM) with multiple-input multiple-output wireless transmission of video (Amimon chip). In the latter case, the module data must be in HDMI format. If a camera module is used that has raw RGB output, there is an additional conversion from RGB to HDMI.
The display can comprise a small display integrated into the camera hand piece; a direct CCU connection to a wireless external monitor; a display integrated into the camera control unit (CCU); video streaming to an iPad or Android device; a head mounted display (such as Google Glass); or a specialized dock in the CCU capable of supporting an iPad or other tablet (optionally with an adapter insert).
To provide systems for identification, control, and patient data management, systems can use Bluetooth Low Energy (BLE) for wireless button controls and for unit identification, where BLE can also control power management; a secure BLE dongle on a PC for upload/download of patient data; a touchscreen on the camera control unit for entering patient data and controlling the user interface; a keyboard for entering patient data and controlling the user interface; a WiFi-enabled camera control unit to connect to a network for upload/download of patient data; integrated buttons for pump/insufflation control; ultrasound or optical time-of-flight distance measurement; a camera unit that can detect a compatible endoscope (or the lack thereof) and set image parameters accordingly; a sterile/cleanable cradle for holding a prepped camera; a charging cradle for one or more cameras; or inventory management: the ability to track/record/communicate the usage of the disposables associated with the endoscopy system, and to make this accessible to the manufacturer in order to learn of usage rates and trigger manual or automated re-orders. Enabling technologies include QR (or similar) codes, or RFID tags utilizing near field communication (NFC) technology, such as the integrated circuits available from NXP Semiconductor NV, placed on or in the disposables or their packaging, which can be sensed or imaged by an NFC scanner or other machine reader in the camera handpiece or the CCU.
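As an illustration of the inventory-management option, the following minimal sketch records each scanned disposable and flags when a re-order might be triggered; the tag fields, threshold, and in-memory log are assumptions for illustration.

```python
# Minimal sketch of tracking disposable usage from scanned NFC/RFID/QR
# identifiers and flagging when a re-order might be triggered. The tag
# fields, threshold, and in-memory log are hypothetical.
from datetime import datetime, timezone

REORDER_THRESHOLD = 3   # assumed remaining-stock level that suggests a re-order
usage_log = []          # stands in for a record accessible to the manufacturer


def record_disposable_use(tag_id: str, product_code: str, remaining_stock: int) -> bool:
    """Log one scanned disposable and return True if a re-order is warranted."""
    usage_log.append({
        "tag_id": tag_id,
        "product_code": product_code,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return remaining_stock <= REORDER_THRESHOLD


if __name__ == "__main__":
    if record_disposable_use("04:A2:19:7F", "SHEATH-150", remaining_stock=2):
        print("re-order suggested for SHEATH-150")
```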
Image processing can employ software modules for image distortion correction; 2D/3D object measurement regardless of object distance; or utilization of computational photography techniques to provide enhanced diagnostic capabilities to the clinician, for example, H.-Y. Wu et al., "Eulerian Video Magnification for Revealing Subtle Changes in the World" (SIGGRAPH 2012), and coded aperture imaging (a patterned occluder within the aperture of the camera lens) for recording all-focus images. With the proper image processing, this can provide the ability to autofocus or selectively focus without a varifocal lens; see, e.g., A. Levin et al., "Image and Depth from a Conventional Camera with a Coded Aperture" (SIGGRAPH 2007). A digital zoom function can also be utilized. Optical systems can include a varifocal lens operated by ultrasound or a varifocal lens driven by a miniature motor.
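Two of the software image-processing functions mentioned above, distortion correction and digital zoom, can be sketched with standard OpenCV calls as follows; the camera matrix and distortion coefficients shown are placeholders that would, in practice, come from a calibration of the particular endoscope optics.

```python
# Minimal sketch of two of the software image-processing modules mentioned
# above: lens-distortion correction and digital zoom (center crop plus
# resize). The camera matrix K and distortion coefficients are placeholder
# values; real values would come from calibrating the endoscope optics.
import cv2
import numpy as np

K = np.array([[800.0, 0.0, 640.0],    # assumed focal lengths and principal point (pixels)
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
DIST = np.array([-0.30, 0.10, 0.0, 0.0, 0.0])  # assumed radial/tangential coefficients


def correct_distortion(frame):
    """Undistort one video frame using the assumed calibration."""
    return cv2.undistort(frame, K, DIST)


def digital_zoom(frame, factor=2.0):
    """Crop the frame center by `factor` and resize back to full resolution."""
    h, w = frame.shape[:2]
    ch, cw = int(h / factor), int(w / factor)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = frame[y0:y0 + ch, x0:x0 + cw]
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)


if __name__ == "__main__":
    test = np.zeros((720, 1280, 3), dtype=np.uint8)
    out = digital_zoom(correct_distortion(test), factor=1.5)
    print(out.shape)  # (720, 1280, 3)
```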
With certain details and embodiments of the present invention for the wireless endoscopy systems disclosed, it will be appreciated by one skilled in the art that changes and additions could be made thereto without deviating from the spirit or scope of the invention.
The attached claims shall be deemed to include equivalent constructions insofar as they do not depart from the spirit and scope of the invention. It must be further noted that a plurality of the following claims may express certain elements as means for performing a specific function, at times without the recital of structure or material and any such claims should be construed to cover not only the corresponding structure and material expressly described in this specification but also all equivalents thereof.
This application claims priority to U.S. Provisional Application No. 61/974,427 filed Apr. 2, 2014, U.S. Provisional Application No. 61/979,476 filed Apr. 14, 2014, U.S. Provisional Application No. 62/003,287 filed May 27, 2014, and U.S. Provisional Application No. 62/045,490 filed Sep. 3, 2014, the entire contents of these applications being incorporated herein by reference.
Publication: US 2016/0066770 A1, published March 2016 (United States).