The present disclosure relates generally to a system including a visualization instrument comprising a camera to view an internal space and, more particularly, to a visualization instrument comprising a camera to examine the interior of a patient.
Visualization instruments include endoscopes, laryngoscopes, borescopes and other medical instruments designed to look inside the body of a patient. Medical visualization instruments are used in a multitude of medical procedures including laryngoscopy, rhinoscopy, bronchoscopy, cystoscopy, hysteroscopy, laparoscopy, arthroscopy, etc. Visualization instruments are also used in non-medical applications such as to investigate the internal structures of machines, buildings, and explosive devices. Laryngoscopes are used to obtain a view of the vocal folds and the glottis to perform noninvasive tracheal intubations. A conventional rigid laryngoscope consists of a handle with a light source and a blade. Direct laryngoscopy is usually carried out with the patient lying on his or her back. The laryngoscope is inserted into the mouth, typically on the right side, and pushed towards the left side to move the tongue out of the line of sight and to create a pathway for insertion of an endotracheal tube. The blade may be lifted with an upward and forward motion to move the epiglottis and make a view of the glottis possible. Once the laryngoscope is in place, the endotracheal tube may be inserted into the pathway. The blade may be provided with guide surfaces to guide the insertion of the endotracheal tube. Laryngoscopes may be outfitted with illumination devices and optical devices to provide views of the vocal cords externally of the patient's body. Optical devices include lenses, mirrors, prisms and optical fibers, all adapted to transfer an optical image. Imaging devices may also be provided to capture the optical images and display the optical images on high definition display monitors.
Stylets and other visualization instruments have also been developed. Each instrument has its own limitations such as fogging, insufficient lighting to produce a good optical image, inability to project images remotely, additional procedural steps to insert the endotracheal tube, and cost. As difficult intubations may be performed remotely from a hospital, such as at the scene of an accident or military battle, it would be desirable to provide emergency responders and others with affordable equipment necessary to perform field intubations. It would be desirable to provide visualization instruments which may be discarded after a single or a limited number of uses.
A visualization instrument and a method of using the visualization instrument are disclosed herein. The visualization instrument is insertable into a space to capture images representing internal views of the space. The visualization instrument comprises an insertable portion supporting an imaging sensor and a video device configured to display images corresponding to views captured by the imaging sensor.
In one exemplary embodiment of the present disclosure, a visualization instrument is provided. The visualization instrument comprising a display device; an imaging assembly including a camera and a lens; the camera including an imaging sensor, an imaging support having a distal surface and an optical cavity, the optical cavity defining a cavity opening in the distal surface, the lens and the camera sealed within the optical cavity to keep the optical cavity dry, the camera outputting a digital image stream corresponding to a plurality of views obtained through the lens; a handle portion detachably coupled to the display device; a self-contained energy source supported by one of the handle portion and the display device; and an insertable portion coupled to the handle portion and insertable into the patient, the insertable portion having a distal cavity with a distal opening at a distal end thereof, the imaging assembly received by the distal cavity through the distal opening, the imaging assembly electronically coupled to the display device when the insertable portion is coupled to the handle portion and the handle portion is coupled to the display device to present images corresponding to the plurality of views with the display device.
In a further example, the insertable portion further comprises an anterior wall and a medial wall, the anterior wall and the medial wall defining a guide pathway adapted for guiding a tube into a patient, the guide pathway adjacent a side of the medial wall and the distal cavity adjacent an opposite side of the medial wall, the anterior wall having a tip portion extending distally beyond the medial wall, the tip portion including a textured surface adapted to engage a tissue of the patient. In another variation thereof, the textured surface includes a plurality of ridges arranged in a regulated pattern. In a further variation thereof, the plurality of ridges are longitudinally aligned. In yet another variation thereof, the textured surface has a first coefficient of friction measured in a first direction and a second coefficient of friction measured in a second direction different from the first direction. In a further variation thereof, the tip portion includes a longitudinally aligned wall portion.
In yet a further example, the insertable portion comprises an elongate tubular member. In a variation thereof, the elongate tubular member is malleable. In another variation thereof, the elongate tubular member is steerable, further comprising a steering mechanism supported by the handle portion.
In another example, the visualization instrument further comprises a translucent cover attached to the distal surface, the translucent cover including an anti-fog coating.
In a further example, the visualization instrument further comprises a second lens and a camera barrel having a barrel cavity, the lens positioned between the distal surface and the camera barrel when the camera barrel is received by the optical cavity, and the second lens received by the camera barrel and positioned between the lens and the camera.
In a yet further example, the visualization instrument further comprises a motion sensor detecting motion of the display device and disabling presentation of the images when motion is not detected during a predetermined amount of time.
In another example, the camera forms the digital image stream using radiation having wavelengths ranging between 10 nanometers and 14,000 nanometers. In a variation thereof, the elongate tubular member is malleable. In another variation thereof, the camera forms the digital image stream using radiation having wavelengths in the visible light spectrum.
In yet another example thereof, the visualization instrument further comprises a distal tip extending distally beyond the lens, the distal tip having a textured surface operable to displace the glottis of the patient.
In a further example thereof, the visualization instrument further comprises a distal tip extending distally beyond the lens and a processing device, the distal tip including a use indicia positioned within the field of view of the lens and operable to determine a use state of the insertable portion, wherein the processing device disables presentation of the images when the use state indicates prior uses exceed a permitted number of uses.
In a further example thereof, the visualization instrument further comprises a distal tip extending distally beyond the lens, the distal tip including flexural strengthening features to reduce flexure of the distal tip by at least 5% when the distal tip engages the patient's tissue, the flexural strengthening features including a longitudinal wall.
In another exemplary embodiment of the present disclosure, a visualization instrument is provided. The visualization instrument comprising a display device; a lens; a camera including an imaging sensor, the camera outputting a digital image stream corresponding to a plurality of views obtained through the lens; a handle portion detachably coupled to the display device; a self-contained energy source supported by one of the handle portion and the display device; an insertable portion coupled to the handle portion and insertable into a patient, the insertable portion having a distal cavity at a distal end thereof receiving the lens and the camera, the camera electronically coupled to the display device when the insertable portion is coupled to the handle portion and the handle portion is coupled to the display device to present images corresponding to the plurality of views with the display device; and a use indicia located in one of the handle portion and the insertable portion, the use indicia operable to determine prior uses of the insertable portion and to disable presentation of the images when the prior uses exceed a permitted number of uses.
In one example thereof, the permitted number of uses is one. In another example thereof, the use indicia provides information regarding environmental variables including at least one of temperature and humidity. In a further example thereof, the use indicia comprises a single-use fuse.
In yet another example, the visualization instrument further comprises a processing device cooperating with the use indicia to determine the prior uses. In an example thereof, the instrument further comprises a sensing device electronically coupled to the processing device and sensing the use indicia to determine the prior uses. In another example thereof, the instrument further comprises an image sensor identifier, wherein the processing device determines the prior uses based on the image sensor identifier. In a variation thereof, the image sensor identifier is stored in the camera. In another variation, the instrument further comprises an electronic device storing the image sensor identifier, the electronic device supported by one of the handle portion and the insertable portion and electronically coupled to the processing device when the insertable portion is coupled to the display device.
In a further example, the insertable portion comprises an elongate tubular member. In a variation thereof, the elongate tubular member is malleable. In another variation, the elongate tubular member is steerable, further comprising a steering mechanism supported by the handle portion.
In yet another example, the visualization instrument further comprises a processing device, a camera identifier, a data storage device, and a plurality of camera identifiers stored in the data storage device, wherein the processing device compares the camera identifier to the plurality of camera identifiers to find a match and disables presentation of the images if the match is not found.
In a further exemplary embodiment of the present disclosure, a visualization instrument partially insertable into a patient is provided, the visualization instrument comprising an insertable portion having guiding means for guiding insertion of a tube into a patient; attachment means for detachably coupling a display device to the insertable portion; imaging means for capturing a plurality of images corresponding to a field of view of the imaging means and outputting a digital image stream operable to present corresponding images with the display device; and use tracking means for disabling presentation of the corresponding images when the insertable portion has been used more than a permitted number of uses.
In another exemplary embodiment of the present disclosure, a visualization kit is provided. The visualization kit comprising a first component insertable into an oral cavity of a patient, the first component including a first camera operable to transmit first images of the oral cavity; a second component different from and interchangeable with the first component, the second component including a second camera operable to transmit second images; a third component detachably attachable to the first component and the second component and sized to be held by a hand of a user, the third component including a viewable screen and being communicatively coupled to the first camera when the third component is attached to the first component and to the second camera when the third component is attached to the second component; wherein the viewable screen presents images corresponding to one of the first images and the second images. In one example thereof, the first component comprises a guide pathway adapted to guide insertion of a tube into the oral cavity and the second component comprises a stylet.
In yet another exemplary embodiment of the present disclosure, a visualization method is provided. The visualization method comprising the steps of providing an insertable component having a camera; detachably coupling a display support component to the insertable component, the display support component sized to be held by a hand of a user and including a display device, the display support component being communicatively coupled to the camera when the display support component is coupled to the insertable component; inserting the insertable component into a target space; capturing with the camera a plurality of views corresponding to a field of view of the camera; presenting with the display device a plurality of images corresponding to the plurality of views; aligning the field of view with a target within the target space; removing the insertable component from the target space; detaching the display support component from the insertable component; tracking uses of the insertable component; and disabling presentation of the plurality of images when the insertable portion has been used more than a permitted number of uses. In an example thereof, the method further comprises the step of discarding the insertable component. In a variation thereof, the step of tracking uses comprises sensing a use indicia. In a further variation thereof, the step of tracking uses comprises storing a use indicia after use of the insertable component.
In a further example thereof, the display device includes a display side and an opposite side opposite the display side, the display support component further comprising a rest surface and a switch, the rest surface and the switch disposed on the opposite side, further comprising the step of laying the display support component to rest on the rest surface without actuating the switch.
In another example thereof, the method further comprises the steps of comparing with a processing device a camera identifier to a plurality of camera identifiers stored in a memory device to find a match, and disabling presentation of the plurality of images if the match is not found.
In a further exemplary embodiment of the present disclosure, a visualization instrument configured to intubate a patient is provided. The visualization instrument comprising a display device including a display driver and a display; an imaging assembly having an image sensor, a transparent cover, a plurality of lenses between the image sensor and the transparent cover, and an illumination device illuminating a cavity of the patient, the imaging assembly configured to transfer a first image stream representing views of the cavity to the display device; a control component including a processor, a memory, and a program embedded in the memory, the processor receiving the first image stream from the imaging assembly, transforming the first image stream into a second data stream, and providing the second data stream to the display driver to show the views of the cavity on the display; a housing coupled to the display device and having a first connector configured to receive the second data stream from the control component; and an insertable portion having a proximal cavity configured to receive the housing and a distal cavity configured to receive the imaging assembly, the insertable portion also including a second connector configured to transfer the first image stream from the imaging assembly through the first connector to the control component, the insertable portion further including a passageway configured to guide insertion of an elongate tubular component into the cavity, wherein the imaging assembly is configured to capture in the first image stream images of a distal end of the tubular component as the tubular component slides through the passageway; an identification source located in the insertable portion; and a sensor communicatively coupled with the control component and configured to sense an identification signal from the identification source, the identification signal operable to ascertain a prior use of the insertable portion, the control component being configured to detect the prior use based on the identification signal and to prevent operation of the imaging assembly upon detection of the prior use.
In another exemplary embodiment of the present disclosure, a visualization instrument is provided. The visualization instrument comprising a display device including a display driver and a display; an imaging assembly having an image sensor, a transparent cover, a lens between the image sensor and the transparent cover, and an illumination device illuminating a cavity of a patient, the imaging assembly configured to transfer a first image stream representing views of the cavity to the display device; a control component including a processor, a memory, and a program embedded in the memory, the processor receiving the first image stream from the imaging assembly, transforming the first image stream into a second data stream, and providing the second data stream to the display driver to show the views of the cavity on the display; a housing coupled to the display device and having a first connector configured to receive the second data stream from the control component; and an insertable portion having a proximal cavity configured to receive the housing and a distal cavity configured to receive the imaging assembly, the insertable portion also including a second connector, a passageway, and a distal tip, the second connector configured to transfer the first image stream from the imaging assembly through the first connector to the control component, the passageway configured to guide insertion of an elongate tubular component into the cavity, and the distal tip engaging the glottis of the patient, the distal tip having a lateral wall extending distally beyond the distal cavity and a textured surface configured to engage the glottis, wherein the imaging assembly is configured to capture in the first image stream images of a distal end of the tubular component as the tubular component slides through the passageway.
In yet another exemplary embodiment of the present disclosure, a visualization instrument is provided. The visualization instrument comprising a display device including a display driver and a display; an imaging assembly having an image sensor, a transparent cover, a lens between the image sensor and the transparent cover, and an illumination device illuminating a cavity of a patient, the imaging assembly configured to transfer a first image stream representing views of the cavity to the display device; a control component including a processor, a memory, and a program embedded in the memory, the processor receiving the first image stream from the imaging assembly, transforming the first image stream into a second data stream, and providing the second data stream to the display driver to show the views of the cavity on the display; a housing coupled to the display device and having a first connector configured to receive the second data stream from the control component; and an insertable portion having a proximal cavity configured to receive the housing and a distal end having a distal cavity configured to receive the imaging assembly, the insertable portion also including a second connector, a passageway, and a distal tip, the second connector configured to transfer the first image stream from the imaging assembly through the first connector to the control component, the passageway configured to guide insertion of an elongate tubular component into the cavity, and the distal tip engaging the glottis of the patient, the distal tip having a lateral wall extending distally beyond the distal cavity and a textured surface configured to engage the glottis, wherein the distal tip exhibits a curvature perpendicular to a length of the insertable portion and includes at least a portion of a ridge parallel to the length of the insertable portion, the curvature and the ridge enhancing the flexural strength of the distal tip by at least 5%.
In a further exemplary embodiment of the present disclosure, a visualization instrument configured to intubate a patient is provided. The visualization instrument comprising an insertable component including a camera, at least two lenses, and an illumination device to illuminate the oral cavity of the patient when the insertable component is inserted, at least partially, into the oral cavity, the insertable component being configured to guide insertion of a tube through the vocal cords of the patient, and the camera being mounted on the insertable component so as to capture images of a distal end of the tube as the tube enters the vocal cords; a reusable component including a display device and a video processing portion, the reusable component being removably attachable to the insertable component; an identification insignia on the insertable component; and a sensor supported by the reusable component and operable to sense the identification insignia, wherein the reusable component determines an identity data of the insertable component based on the identification insignia, and determines a status of the insertable component by comparing the identity data to a plurality of identity and status data corresponding to a plurality of insertable components.
The features of this invention, and the manner of attaining them, will become more apparent and the invention itself will be better understood by reference to the following description of embodiments of the invention taken in conjunction with the accompanying drawings.
Corresponding reference characters indicate corresponding parts throughout the several views. Although the drawings represent embodiments of the present invention, the drawings are not necessarily to scale and certain features may be exaggerated to better illustrate and explain the embodiments. The exemplifications set out herein illustrate embodiments of the invention in several forms and such exemplification is not to be construed as limiting the scope of the invention in any manner.
The embodiments of the disclosure discussed below are not intended to be exhaustive or to limit the invention to the precise forms disclosed in the following detailed description. Rather, the embodiments are chosen and described so that others skilled in the art may utilize their teachings.
A visualization instrument, and a method of using the instrument, are disclosed herein. In one embodiment of the visualization instrument, the visualization instrument comprises a display screen and a display screen support portion removably and electrically coupled to an insertable portion including an imaging system to acquire images of an internal space. Exemplary visualization instruments include endoscopes, laryngoscopes, and stylets. The display screen support portion and the display screen may be integrally constructed and may be reusable or disposable. In various embodiments described below, a unitary component comprising the display screen and the display screen support portion is referred to as a reusable portion denoting that in many instances it is possible, although not necessary, and perhaps desirable for economic reasons, to reuse the display screen and electronic components relating thereto. In one variation thereof, the visualization instrument transfers images to a remote display. In one example thereof, the reusable portion includes a housing received in a proximal cavity of a handle coupled to the insertable portion. The display device is supported by the housing. An anti-glare coating or layer may be provided on the display surface.
In another embodiment of the visualization instrument, the insertable portion comprises a passageway or guide pathway configured to guide insertion of an elongate tubular component, e.g., an airway device, endotracheal tube and the like, and an imaging assembly disposed on or in the distal end of the insertable portion. The imaging assembly captures images of the patient which are shown with the display device. A distal end of the tubular component may also be visible in the images as the tubular component slides through the guide pathway towards the vocal cords. Illustrative embodiments of the reusable and insertable portions are described with reference to
In yet another embodiment of the visualization instrument, the insertable portion comprises an elongate arm having an imaging assembly disposed in the distal end of the arm. In one example thereof, the elongate arm is coupled to a handle adapted to receive the reusable portion. In one variation of the previous example, the elongate arm forms part of a stylet. Illustrative embodiments of stylets are described with reference to
In a further embodiment of the visualization instrument, an imaging cap is provided. The imaging cap comprises a handle adapted to removably receive the reusable portion and a camera to enable a user to capture external images. Additional data acquisition sensors may be coupled to the reusable portion or the imaging cap. An illustrative embodiment of the imaging cap and the sensors is described below with reference to
In an embodiment of a visualization system disclosed herein, the visualization instruments described above are adapted to transmit images to a remote device. Exemplary embodiments of systems adapted to transmit images from the reusable portion to the remote device are described below with reference to
Advantageously, the imaging assembly may be configured to be produced at a low cost to enable the insertable portion to function as a single-use disposable device. In one embodiment, the imaging assembly comprises a support structure or camera barrel supporting a camera integrated circuit (IC), camera, or camera-chip, an illumination device, and lenses. The imaging assembly may be inserted into a cavity located in the distal end of the insertable portion. The imaging assembly may comprise a retention device, e.g., a pin, detent, resilient elastomeric filler, screw or any other fixation device configured to securely couple the imaging assembly to the distal cavity. Exemplary embodiments of imaging assemblies are described below with reference to
Several defogging features may be provided to prevent fogging of the imaging assembly. In one embodiment, the distal surface of the most distally located lens is coated to reduce or eliminate fogging. In one example thereof, an anti-fog coating is applied to one side of a substrate and an adhesive coating is applied to the other side of the substrate. The adhesively coated side is then adhered to the distal lens surface to attach the anti-fog coating to the lens. The substrate may comprise any known combination of polymers extruded in clear thin film form. Exemplary polymers include polycarbonate, polyester-based polymers, polystyrene, polyethylene, polypropylene, and other transparent polymers. A removable backing may be applied to the adhesively coated thin film to facilitate processing. The backing is then removed to expose the adhesive before application of the substrate to the lens surface. In another example, a cover plate seals the cavity and prevents lens fogging. In one variation thereof, the cover plate includes an anti-fog layer or coating on its external surface. The insertable portion may be packaged with a swab comprising H2O2 or other anti-fog coating agents, such that the swab wipes the lens when the insertable portion is withdrawn from the packaging. For example, the packaging may comprise a polymeric strip with a swab attached thereto. Alternatively, the adhesively and anti-fog coated substrate may be adhered to the cover plate. In a further example, defogging is achieved by coupling a heating element to the cover plate and to the power leads of an illumination device, in one embodiment a white light emitting diode (LED), which is driven above its nominal illumination power level to generate heat with the excess power. In another variation, an LED conducting 150 milliamps coupled to a thermal element heats the distal lens to 45 degrees Celsius in about one minute.
A commercially available camera, such as a camera used in cellular phones and personal digital assistants (PDAs), comprises an image sensor and electronic components configured to convert pixel data captured by the image sensor to image data, e.g., digital images, and to output streams of digital images in a standard format. Image sensors may comprise CCD sensors, CMOS sensors with active or passive pixels, or other photosensors well known in the art. Operational signals are provided to the image sensor to control its operation. Advantageously, the cost of the disposable portion is reduced further by locating the components for providing the operational signals in the reusable portion. In one embodiment, a display driver, configured to receive the standard image stream and drive the display device accordingly, also comprises the components necessary to control the camera. In one example thereof, the input/output signals are provided by signal conductors, e.g., a multi-conductor flexible ribbon. In another example thereof, a control component is provided intermediate the camera and the display driver to transform the standard image stream into a differently structured image stream conforming to the size of the display device and/or to transform the standard image stream to a different format corresponding to the format required by the display driver. In a further example thereof, the operational circuits are integrated with the camera, which is configured to output a preconfigured image stream upon the application of power, and which is usable directly by the display device. In yet another example, control components supported by the reusable portion housing provide control signals to the camera to define the size of the images output by the camera. In a further example, the image stream output by the camera is transmitted wirelessly by a wireless transmitter located in the insertable portion. In yet a further example, the wireless transmitter is integrated with the camera. In a variation thereof, the wireless transmitter is positioned in the proximal end of the insertable portion or in the distal cavity. In one example, the camera forms a digital image stream using radiation having wavelengths ranging between 10 nanometers and 14,000 nanometers. The wavelengths include the visible light, ultraviolet, and infrared spectrums. In one variation, the camera is an infrared camera. In another variation, the camera is an ultraviolet light camera. In another variation, the camera is a visible light camera.
While the embodiments of the disclosure are applicable in medical and non-medical applications, exemplary features of visualization instruments will be described below with reference to medical instruments such as laryngoscopes and stylets although the invention is not limited to medical applications and instruments.
Generally, an intubation instrument comprises a reusable portion having a display device coupled to a housing and a blade. The blade comprises a handle at a proximal end thereof spaced apart from an insertable portion located at a distal end. An imaging assembly is located at the distal end of the insertable portion. The term blade denotes a single part integrally combining a handle and an insertable portion defined by a plurality of walls as described below. In a variation thereof, the handle and the insertable portion are distinct parts that are removably attachable. The display device includes a viewing screen. The handle comprises a proximal cavity for receiving the housing and coupling the reusable portion to the blade. The insertable portion of the blade comprises an elongate passageway or pathway designed to guide insertion of a catheter, intubation tube and the like (not shown) into the larynx of a patient. The housing includes batteries and electronic circuits to receive image signals from the imaging assembly via a conductor which comprises a plurality of signal conductors and may comprise power and control conductors as well. In an alternative embodiment, the conductor is at least partially replaced with a wireless transmitter and receiver coupling the imaging assembly and the housing. The housing may comprise a control component and a connector adapted to couple with a connector of the blade to transfer images thereto. Exemplary imaging assemblies are disclosed in
An exemplary pathway is defined by the interior surfaces of a medial wall, an anterior wall, a posterior wall, and a lateral wall. Each wall has an interior surface which is the surface adjacent to the pathway. In other embodiments, the lateral wall may extend uninterrupted from the proximal to the distal end of the blade or may be configured with more or fewer wall portions. A distal tip extends from the anterior wall beyond the end of the medial wall and comprises a surface which is configured to contact the patient to move the epiglottis and expose the vocal cords.
An imaging assembly comprises a plurality of lenses supported by a camera barrel.
In a further embodiment of a visualization instrument, imaging features are provided on a surface of the insertion portion to indicate its orientation relative to the space viewed by the camera as observed in the images. In one example thereof, the imaging feature is a landmark, illustratively a ridge 48 (shown in
In a further embodiment of a camera assembly, the prongs extend from the proximal end of camera holder 384 rather than from the proximal end of camera barrel 376. Camera barrel 376 slides into a cavity in the camera holder from the proximal end of the camera holder. Then, the circuit board supporting the camera is attached. The pressure plate is attached last. The pressure plate engages the prongs of the camera holder thereby holding the camera barrel and the camera in place. The camera can be mounted onto the camera barrel in any other manner. Advantageously, in this embodiment the size of the circuit board holding the camera can be reduced since it no longer has to engage the prongs. Of course, the camera can be supported by any other means alternative to a circuit board.
In one example of the present embodiment, the camera supplies a first image stream which is 8 bits wide. The resolution of the camera is 640×480 (VGA) pixels per frame. There are 30 frames per second. The data format is 2 bytes per pixel (i.e., the so-called YUV (4:2:2) format). Intensity Y is specified at every pixel, and color information U or V at every second pixel. An FPGA is programmed to convert the data stream to a second image stream with a format compatible with the display device, which comprises an OLED display. In an alternative embodiment, the camera data is provided to the video processing chip, and the video processing chip, after adding information such as colors, symbols or other information, outputs a video stream to the FPGA for the FPGA to convert to the VGA format. The display resolution is 320×240 (QVGA) pixels per frame, 30 frames per second. The data format, however, is RGB (6, 6, 6). This format uses a 6-bit value for red, a 6-bit value for green, and a 6-bit value for blue. There are specific well known equations for conversion from the YUV color space to the RGB color space. The FPGA implements this conversion. It also performs the conversion (e.g., dropping every second pixel) to convert from VGA to QVGA resolution. The FPGA also provides signals for writing the converted data stream into the OLED display's memory/buffer. The FPGA also sends the camera data to the NTSC/S-video conversion chip. The video chip, which includes the video processor, is capable of accepting the VGA YUV format almost directly. The FPGA provides the necessary operational signals to load the video chip's memory. In a variation thereof, the FPGA also verifies the identity of the camera against a database of approved cameras. The FPGA extracts camera information from the camera, for example a built-in camera ID or a programmable camera ID, and checks the identity against an approved list, which is periodically updated. If the camera identification is not on the approved list, the FPGA does not convert the first image stream or, optionally, inserts a warning into the second image stream to alert a practitioner that the insertable portion is not an approved device. Approval may be desirable to ensure the insertable portion meets quality specifications.
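For illustration only, the following Python sketch performs the same kind of conversion in software; it is not the FPGA implementation. It assumes packed YUYV byte ordering for the YUV 4:2:2 stream and standard BT.601 conversion coefficients, neither of which is specified above, and it halves both dimensions by keeping every second pixel and every second row.

```python
# Illustrative software sketch only, not the FPGA implementation described
# above. Assumptions: packed YUYV byte order for the YUV 4:2:2 stream and
# BT.601 conversion coefficients, neither of which is specified in the text.
def yuv422_vga_to_rgb666_qvga(frame: bytes, width: int = 640, height: int = 480):
    def clamp(v):
        return 0 if v < 0 else 255 if v > 255 else int(v)

    out = []                                 # 240 rows of 320 (r6, g6, b6) pixels
    for row in range(0, height, 2):          # keep every second row: 480 -> 240
        out_row = []
        base = row * width * 2               # 2 bytes per pixel in YUV 4:2:2
        for pair in range(0, width, 2):      # each YUYV group covers 2 pixels
            y0, u, _y1, v = frame[base + 2 * pair: base + 2 * pair + 4]
            # keep only the first pixel of each pair: 640 -> 320 columns
            r = clamp(y0 + 1.402 * (v - 128))
            g = clamp(y0 - 0.344 * (u - 128) - 0.714 * (v - 128))
            b = clamp(y0 + 1.772 * (u - 128))
            out_row.append((r >> 2, g >> 2, b >> 2))   # 8-bit -> 6-bit channels
        out.append(out_row)
    return out
```

A hardware implementation would pipeline the same arithmetic, but the mapping from YUV 4:2:2 at VGA resolution to 6-bit-per-channel RGB at QVGA resolution is the same.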
A program and data structures are embedded in the memory. The program comprises a plurality of processing sequences operable by the processor to interact with data structures containing data. Data may include parameters such as video instructions, security feature instructions, landmark patterns and the like. The reusable portion may comprise temperature and humidity sensors, and the data may thus include status information, e.g., battery charge level and number of uses, and environmental information, e.g. temperature and humidity levels. Such data may be displayed by the display device or transmitted to a remote device to assist the practitioner. Suitable alarm functions may be implemented if the environmental or battery information falls outside predetermined ranges.
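As a sketch of how such alarm functions might be organized, the following Python fragment checks readings against predetermined ranges. The reading names and threshold values are illustrative assumptions, not values taken from the disclosure.

```python
# Illustrative sketch of the alarm check described above. The reading names
# and threshold values are assumptions, not values from the disclosure.
ALARM_LIMITS = {
    "battery_pct":  (20.0, 100.0),   # alarm below 20% charge
    "temp_c":       (10.0, 40.0),    # assumed operating temperature window
    "humidity_pct": (0.0, 85.0),     # alarm above 85% relative humidity
}

def check_alarms(readings: dict) -> list:
    """Return the names of readings falling outside their permitted range."""
    alarms = []
    for name, value in readings.items():
        low, high = ALARM_LIMITS.get(name, (float("-inf"), float("inf")))
        if not low <= value <= high:
            alarms.append(name)
    return alarms

# check_alarms({"battery_pct": 12, "temp_c": 22, "humidity_pct": 60})
# returns ["battery_pct"], which could be shown on the display or transmitted.
```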
In one embodiment of a visualization instrument, a first processing sequence examines the first image stream and identifies a plurality of landmarks corresponding to features of the internal space and orientation features on the insertable portion. Another processing sequence transforms the first image stream by coloring the space landmarks. A third processing sequence transforms the first image stream by coloring the orientation features. In one example, the orientation feature is a viewable marking in the distal surface of distal tip 46 or an internal surface of wall 34 (both shown in
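A minimal software sketch of these processing sequences is shown below. Real landmark identification would rely on pattern recognition, so a caller-supplied predicate stands in for it here; the frame layout, highlight color, and helper names are assumptions for illustration.

```python
# Greatly simplified sketch of the coloring sequences; real landmark
# identification would use pattern recognition, so a caller-supplied predicate
# stands in for it here. The frame layout and highlight color are assumptions.
def color_landmarks(frame, is_landmark_pixel, highlight=(255, 0, 0)):
    """frame: list of rows, each row a list of (r, g, b) tuples; recolors
    every pixel the predicate matches."""
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if is_landmark_pixel(x, y, pixel):
                row[x] = highlight
    return frame

def centroid(frame, is_landmark_pixel):
    """Average position of matching pixels; the offset between the orientation
    marking's centroid and an anatomical landmark's centroid can serve as an
    orientation cue."""
    xs = ys = n = 0
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if is_landmark_pixel(x, y, pixel):
                xs, ys, n = xs + x, ys + y, n + 1
    return (xs / n, ys / n) if n else None
```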
In another embodiment of a visualization instrument, power saving features are provided to extend the battery life of the reusable portion of the visualization instrument. Power is consumed by illumination, image display, image stream generation, and conversion of the image stream from the image sensor to the display device. In one example thereof, the reusable portion disables the display device if it detects the absence of the camera (a disengaged period). Enablement of the display device during disengaged periods may cause video display noise and frozen images which are prevented if the display is disabled during those periods.
In another example, the display device is also disabled during monitoring periods and automatically enabled if monitoring generates an alert, e.g., low battery, defective connection, high humidity and the like. In a monitoring period a practitioner may also manually enable the display device to request information. Alternatively, an inactive mode may be set which disables monitoring and thereby also disables the display device. In a variation thereof, the monitoring or the inactive mode may be determined based on the engagement or disengagement of the imaging cap or the insertable portion. The camera may be disabled during the monitoring and inactive periods. Advantageously, enabling the camera only under predetermined conditions, including engagement, not only saves power, but also minimizes the damage that may be caused by hot-swapping the reusable and insertable portions. Table 1 summarizes a multiplicity of operating modes of the viewing instrument based upon the state of its components as described above. However, the modes described herein are exemplary, and additional or alternative criteria may be used to determine the same or more operating modes.
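Although Table 1 is not reproduced here, the kind of mode selection it summarizes can be sketched as follows; the mode names and the exact rules are illustrative assumptions, not the table itself.

```python
# Sketch of the kind of mode selection Table 1 could summarize. The mode names
# and exact rules here are illustrative assumptions, not the table itself.
def select_mode(insertable_engaged: bool, inactive_requested: bool,
                alert_active: bool, manual_display_request: bool) -> dict:
    if inactive_requested:
        # Inactive mode: monitoring, camera, and display all disabled.
        return {"mode": "inactive", "monitoring": False,
                "camera": False, "display": False}
    if not insertable_engaged:
        # Disengaged/monitoring period: display only on alert or manual request,
        # camera off to avoid hot-swap damage and to save power.
        return {"mode": "monitoring", "monitoring": True, "camera": False,
                "display": alert_active or manual_display_request}
    # Engaged: full operation.
    return {"mode": "operating", "monitoring": True,
            "camera": True, "display": True}
```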
In another example, the visualization instrument, either or both of the reusable and insertable portions, comprises a motion sensor. Exemplary motion sensors include micro-electromechanical system (MEMS) sensors, e.g., inertial sensors, gyroscopes, accelerometers, rate sensors, and inclinometers, configured to detect absence of motion. If absence of motion during a predetermined time period is detected, all functions except motion detection may be shut down to save power, thus placing the instrument in sleep mode. Once motion is detected during sleep mode, all functions may be re-established without performing start-up routines to quickly enable full functionality.
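A minimal sketch of this sleep-mode bookkeeping follows; the 60-second timeout and the class structure are assumptions for illustration, not values from the disclosure.

```python
import time

# Minimal sketch of the sleep-mode bookkeeping; the 60-second timeout and the
# class structure are assumptions for illustration.
SLEEP_TIMEOUT_S = 60.0

class MotionSleepController:
    def __init__(self):
        self.last_motion = time.monotonic()
        self.asleep = False

    def on_sample(self, motion_detected: bool) -> bool:
        """Feed one motion-sensor sample; returns True while in sleep mode."""
        now = time.monotonic()
        if motion_detected:
            self.last_motion = now
            # Wake immediately; no start-up routines are repeated.
            self.asleep = False
        elif not self.asleep and now - self.last_motion > SLEEP_TIMEOUT_S:
            # No motion for the predetermined period: shut down everything
            # except motion detection to save power.
            self.asleep = True
        return self.asleep
```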
When the insertable portion is intended to be a single-use disposable unit, potential re-usability of the insertable portion may be of concern to practitioners, hospital administrators and others responsible for patient safety. Advantageously, in one embodiment the reusable portion may disable or not enable the insertable portion if the insertable portion has been previously used, thereby alleviating or eliminating the concern. One exemplary feature for preventing repeated uses is described herein as a single-use fuse. Generally, a single-use fuse feature detects an irreversible change to the insertable portion or the handle. Another exemplary feature is status tracking. Status tracking enables an insertable portion to be used once and then discarded, e.g., a disposable insertable portion, and also enables a permitted number of uses. If the insertable portion is intended to be used a limited number of times, such portion is defined herein as "reposable." Tracking features are used to count the number of uses and to disable the reposable unit after the limit has been reached. Described below are examples of such features. Generally, in a status tracking embodiment, the insertable portion comprises an identification feature to track the number of uses. The reusable portion or the blade can be configured to detect the identification feature. The reusable portion or an associated database and processing system can track uses. In a further example, reposable blades and insertable portions can be used with multiple reusable portions so long as the use limit has not been reached. The program may indicate the status of the insertable portion or the blade with the display device. The identification information may be encrypted to prevent tampering. An anti-tampering integrated circuit may be coupled to the conductor in the insertable portion.
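The status-tracking logic can be sketched as a small registry of prior uses keyed by an insertable-portion identifier; the identifier format, field names, and permitted-use limit below are illustrative assumptions, not details from the disclosure.

```python
# Sketch of status tracking against a small local registry. The identifier
# format, field names, and permitted-use limit are illustrative assumptions.
class UseTracker:
    def __init__(self, permitted_uses: int = 1):   # 1 -> single-use disposable
        self.permitted_uses = permitted_uses
        self.prior_uses = {}                        # insertable ID -> use count

    def authorize(self, insertable_id: str) -> bool:
        """Record a use and return True if the insertable portion may be used;
        return False (disable image presentation) once the limit is reached."""
        used = self.prior_uses.get(insertable_id, 0)
        if used >= self.permitted_uses:
            return False
        self.prior_uses[insertable_id] = used + 1
        return True

# tracker = UseTracker(permitted_uses=3)   # models a "reposable" blade
# tracker.authorize("BLADE-0001") is True for the first three uses, then False.
```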
Additional variations of single-use fuses are described below. In one variation thereof, a tab is provided which is deformed, e.g., broken, when the insertable portion is coupled to the reusable portion or when it is disengaged. The reusable portion detects the broken tab when an attempt is made to re-use the insertable portion. For example, the housing may contain an angled protrusion which enables a tab in the proximal cavity of the handle to pass over it. When the insertable portion is disengaged, the angled protrusion tears the tab. Upon re-engagement, the reusable portion detects the deformed tab. Exemplary detectors include limit switches, optical sensors, pressure sensors, and the like. An alterable mechanical key/slot may be used as well.
In another variation thereof, a film or coating that changes color after being exposed to the environment is provided in or on the insertable portion or the blade. If the color change is irreversible, for example by an irreversible chemical reaction, UV-activated cross-linking of polymers, and the like, then the feature is a single-use feature. However, the feature may be a status tracking feature if the color change is reversible. Upon detection of the color change to a predetermined color, or absence of a predetermined color, software may disable the insertable portion or change its status. The color may be detected with a detector in the housing or in the first image stream. Environmental variables include, without limitation, air, moisture, e.g., saliva, pressure, e.g., touch or heat, and other suitable variables. Sensors may be provided in the insertable portion to detect the environmental variables. For example, MEMS ICs may be provided on the external surfaces of the insertable portion. The environmental variable may have to be maintained in the changed state for a predetermined amount of time. For example, temperature may have to be greater than 75 degrees for one minute to trigger the status change.
Additional variations of status tracking features are described below. In one variation, the insertable portion is encoded by an identification component such as an electronic identifier (ID) or a unique feature detectable in the first image stream. Electronic ID features may comprise, without limitation, an RFID passive or active transmitter, a camera ID, a programmable ID located in an IC in the insertable portion, and the like. Upon engagement, the reusable portion detects the identification component, determines the status, and activates the insertable portion if the status indicates first use or reposable use less than the prescribed limit.
In another variation, the distally-facing surface of the glottis-engaging protrusion located at the distal end of the insertable portion is encoded with a pattern viewable by the image sensor. The software detects the pattern in the image stream. The pattern may comprise holographic keys molded or engraved in the distally-facing surface and may be designed to change during use so that a subsequent use may be detected.
In a further variation, the identification component comprises a physical mark in the insertable portion which is sensed by the reusable portion to determine first-use or re-use. Exemplary identification components include barcodes, luminescence marks, color keys, holographic keys, magnetic keys, and the like. Sensors adapted to sense corresponding physical marks include microbarcode readers having high magnification objectives to enable minimization of the size of the physical mark, optical sensors and/or detectors, optical sensors or detectors sensitive to holographic diffraction patterns, Hall effect sensors, pressure sensors or detectors, contact switches, and other suitable sensors. Combinations of physical marks are also envisioned, such as a key/slot combined with magnetic or optical marks. Advantageously, the identification component may also identify the type, make, and model of the insertable portion, and that information, including a date and GPS stamp, may be displayed and/or recorded to a second image stream produced for forensic use. In one example, the control component adds the forensic information to the first image stream to generate the second image stream. In another example, the forensic data is stored and transmitted separately from the image stream.
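As an illustration of assembling the forensic information, the sketch below bundles device identity with a date and GPS stamp. The field names, the JSON serialization, and the way the record is appended to a frame are assumptions; as described above, the record could equally be overlaid on the image or stored and transmitted separately.

```python
import datetime
import json

# Sketch of assembling the forensic information mentioned above. The field
# names, JSON serialization, and the appended-record framing are assumptions.
def forensic_record(insertable_info: dict, gps_fix) -> dict:
    return {
        "device": insertable_info,   # e.g. type, make, model, identifier
        "timestamp_utc": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "gps": gps_fix,              # e.g. (latitude, longitude) or None
    }

def tag_frame(frame_bytes: bytes, record: dict) -> bytes:
    """Append the forensic record to a frame of the second image stream."""
    return frame_bytes + b"\x00FORENSIC" + json.dumps(record).encode("utf-8")
```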
In another embodiment of the visualization instrument, fluid management lumens are provided. In one example thereof, the insertable portion includes a lumen for providing or withdrawing fluids to or from the patient. In one variation thereof, the lumen is molded opposite the guide pathway. In another variation, a plurality of small channels are included in the molded parts of the insertable portion with distal apertures located around the imaging assembly so as to not increase the size of the insertable portion. The lumen or the channels are connected to external tubes to transfer fluids, e.g. medications or bodily fluids, therethrough to and/or from an external fluid reservoir. Exemplary fluids provided to the patient include liquids, air, and gases.
In yet another embodiment of the visualization instrument, comfort features are provided. In one example thereof, a blade comprises a first material which imparts structure and rigidity to the insertable portion and a second material coupled to the first material to provide a soft and resilient feel. The second material extends, at least partially, over the surface of the handle and is textured to increase grasping comfort. In another example, sensors are placed beneath the second material to detect pressure and trigger status changes.
Exemplary visualization systems are described with reference to
The systems depicted in
Remote feedback enables a practitioner observing remotely to provide suggestions and other information to the local practitioner. For example, a medical technician may perform the intubation in a battlefield or accident scene as directed by a physician at a hospital. The remote feedback may be text, image, audio or any other type of feedback. Visual feedback may be provided in the display device through the electronic communication link between the visualization device and the local computer. The local computer or the reusable portion may also include speakers to aurally communicate the remote feedback to the practitioner. In one example of the present embodiment, the reusable portion or the local computer provides feedback to the practitioner, the feedback being generated with the remote processing system. Images generated with the visualization device may be viewed by a practitioner in the display device of the reusable portion, and in the local and remote processing systems simultaneously. The images displayed by each device may be the same or different. Local computer 462 may incorporate display features suitable for local use while remote computer 476 or portable device 488 may incorporate features suitable for remote use or compatible with their processing capabilities.
In another example, the communication systems depicted in
In a further example of a visualization system, the local computer collects patient information and transmits the information to the reusable portion. The reusable portion displays on-screen indicators in the display device to alert the practitioner without requiring the practitioner to look away to receive the same information. On-screen information may include vital signs such as CO2 levels in the air exhaled by the patient, temperature, oxygen saturation, pulse, blood pressure, and any other patient vital signs. On-screen information may also include corresponding indicators such as alarms, color-coded thresholds indicating that the vital signs are approaching concerning levels, and alarms/indicators corresponding to the performance of equipment such as ventilators. In one variation thereof, the reusable portion displays on-screen information and indicators generated by the reusable portion. Such information may include parameters extracted from the first image stream, indicators from comparisons of landmarks in the first image stream to the expected location of the landmarks relative to the insertable portion, and other data which the reusable portion may collect with sensors such as those attached to the communications port.
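The color-coded thresholds could be organized as in the following sketch; the vital-sign names and numeric bands are placeholders for illustration, not clinical values taken from the disclosure.

```python
# Illustrative color coding for the on-screen indicators. The vital-sign names
# and numeric bands are placeholders, not clinical values from the disclosure.
VITAL_BANDS = {
    # name: (low_alarm, low_warn, high_warn, high_alarm)
    "spo2_pct":   (90, 94, 100, 100),
    "pulse_bpm":  (40, 50, 110, 130),
    "etco2_mmhg": (25, 30, 45, 55),
}

def indicator_color(name: str, value: float) -> str:
    low_alarm, low_warn, high_warn, high_alarm = VITAL_BANDS[name]
    if value < low_alarm or value > high_alarm:
        return "red"      # alarm condition
    if value < low_warn or value > high_warn:
        return "yellow"   # approaching a concerning level
    return "green"
```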
The visualization system is well suited for emergency, rescue and military operations. In a further embodiment of the visualization system, communications gear typically used in such operations is provided with a cradle in which the reusable portion is stored. The cradle comprises a charging housing to recharge batteries in the reusable portion. The cradle may comprise UV lights to sterilize the reusable portion, since the reusable portion may be used several times before the rescue team or military unit returns to base. Due to the availability of power and telecommunications gear, the reusable portion can be designed to communicate locally only and thereby its size and weight may be minimized. The cradle may also sterilize a reposable portion. Of course, the use of such cradles is not limited to rescue and military operations. Cradles may be used in any environment in which the reusable portion can be used.
Another embodiment of a visualization instrument is described below with reference to
In a further embodiment of an intubation instrument, passageway 536 is partially constrained at the distal end of the insertable portion by an extension portion of a posterior wall such that an internal surface of the extension portion of the posterior wall faces interior surface 538. The extension portion may be provided integrally with the posterior wall, for example as a single extruded part, or may be attached to the insertable portion, for example by providing a layer that can be adhesively bonded to surface 40 of the posterior wall. The extension portion may comprise an elastomeric composition, as described above, which resiliently allows removal of the insertable portion after an endotracheal tube is inserted through the passageway into the larynx of the patient. The extension portion, medial wall 44, and anterior wall 38 form a C-channel coextensive with the distal portion of passageway 536.
A number of configurations are described herein which may be provided to facilitate removal of the intubation tube.
In an alternative embodiment, a resilient tab is positioned in the interior surface of the posterior wall and/or on the interior surface of the lateral wall, at the distal end of the insertion portion. The resilient tab is designed to push the endotracheal tube passing through passageway 36 towards distal tip 46 regardless of the tube diameter. Thus, even when the tube diameter is substantially smaller than the cross-sectional area of the passageway, the tab(s) push(es) the endotracheal tube into the proper position for insertion through the vocal cords.
Handle 1030 may comprise a textured external surface to enhance grip. Handle 1030 includes connector 1060 adapted to communicatively couple the camera to body portion 1008. A similar connector may be provided in a cradle to charge the reusable portion when not in use. Alternatively, or additionally, a cradle may comprise an inductive charger and either of the reusable portion or the insertable portion may comprise a matching induction coil. When the intubation instrument is placed in the cradle, the inductive charger charges the induction coil to recharge the intubation instrument. At least a portion of a wall of handle 1030 may be sufficiently thin to enable the electromagnetic waves emitted by the inductive charger to efficiently pass through the wall.
In one example of the present embodiment, stylet 1004 is steerable.
Stylet arm 1100 may be permanently or removably attached to stylet handle 1030.
A further embodiment of a visualization instrument is illustrated in
Blade 1250 includes a plurality of guide walls forming a pathway for an endotracheal tube. The guide pathway is defined, at least in part, by an anterior guide surface and a medial guide surface. In one variation thereof, the anterior guide surface, e.g. anterior guide surface 1269, is substantially orthogonal to the medial guide surface e.g. the surface of medial wall 1272 shown in
In one example, the medial guide surface includes a transition portion extending through the proximal portion of the guide pathway and a longitudinally aligned portion extending through the distal portion of the guide pathway. In a variation thereof, the transition portion extends from a side of the insertable portion to the longitudinally aligned portion. In another variation thereof, exemplified in
Blade 1250 supports the imaging sensor and electronic components to electrically couple the imaging sensor to video display 1202. The imaging sensor may be electronically coupled wirelessly or by electrical conductors embedded in the insertable portion of the blade. In the exemplary embodiment shown in
Resilient materials may be provided to add functionality to the blade. The exemplary embodiment described with reference to
In yet another embodiment of a visualization instrument, alignment features are provided to facilitate engagement of the reusable portion and the handle. Exemplary mating alignment features were described with referring to
In another embodiment of the medical visualization instrument, guide pathway biasing features are provided to facilitate use of multiple endotracheal tube sizes. Generally, the biasing features exert anteriorly directed force on an endotracheal tube as it translates through the guide pathway. As shown in
In another embodiment of the disclosure, a visualization instrument is provided. The visualization instrument comprises a reusable portion, a handle portion, an insertable portion, and an imaging assembly. The insertable portion comprises a distal cavity at a distal end thereof and a connector accessible through the distal cavity to electrically and detachably couple the imaging assembly to the insertable portion. In one example thereof, the handle portion and the insertable portion are integrally coupled. In another example thereof, the imaging assembly is connected to the connector prior to use and subsequently disconnected. The insertable portion is then discarded while the imaging assembly may be cleaned and re-used. Advantageously, a reusable imaging assembly reduces the cost of the insertable portion which may be discarded after a single or a limited number of uses.
In another embodiment of the visualization instrument, image alignment features are provided to facilitate visualization of the endotracheal tube. An example of image alignment features is shown in
In another embodiment of the visualization instrument, a blade without posterior and lateral guide walls is provided. An example of such a blade is shown in
In a further embodiment of the visualization instrument, rest features are provided which support the reusable portion when the reusable portion rests on a surface. The rest features comprise rest surfaces adapted to stabilize the medical instrument in a rest position. In one example, the rest surface has a coefficient of friction higher than the coefficient of friction of the distal surface of the display device. In one variation of the previous example, the rest surface comprises rubber. In another variation, the rest surface comprises a polymeric material with a coefficient of friction that is higher than the coefficient of friction of the material from which the display device frame is made. In another example, a rest surface extends from the distal surface of the display device. In one variation thereof, a rest surface is parallel to the supporting surface when the reusable portion is decoupled from the handle. In another variation thereof, a rest surface is parallel to the supporting surface when the reusable portion is coupled to the handle. In a further variation thereof, the display device comprises a rest feature having two rest surfaces. One rest surface supports the display device when the handle is coupled to the reusable portion and the other rest surface supports the display device when the handle is not coupled to the reusable portion. In a further example, a switch cover is disposed between the rest surface and the screen, and the rest surface prevents accidental activation of the switch. An example of a rest surface and switch cover will now be described with reference to
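For illustration only, the choice between the two rest surfaces and the friction relationship described above can be expressed as a small check; the coefficient values below are arbitrary assumptions, not measured properties of any disclosed material.

```python
# Assumed, illustrative coefficients of static friction (not measured values).
MU = {
    "coupled_rest_surface": 0.9,    # e.g. a rubber pad
    "decoupled_rest_surface": 0.8,  # e.g. a high-friction polymer
    "display_housing": 0.3,         # bare housing / frame material
}


def select_rest_surface(handle_coupled: bool) -> str:
    """Pick which of the two hypothetical rest surfaces faces the supporting
    surface, depending on whether the handle is coupled to the reusable portion."""
    return "coupled_rest_surface" if handle_coupled else "decoupled_rest_surface"


surface = select_rest_surface(handle_coupled=True)
# The rest surface should grip the supporting surface better than the bare
# display housing would, i.e. its friction coefficient should be higher.
assert MU[surface] > MU["display_housing"]
print(surface)  # coupled_rest_surface
```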
In yet a further embodiment of the visualization instrument, external communication features are provided. Referring now to
Referring now to
Referring now to
Examples of visualization instruments comprising a reusable portion and a handle coupled to an insertable portion in a single piece construction were described above. In a further example of a visualization instrument, the insertable portion and the handle are detachably coupled. Any of the alignment and state features described above with reference to coupling of the handle and the reusable portion may also be applied to coupling of the handle and the insertable portion. In one example, the handle is integrally formed with the housing supporting the video display, and the insertable portion is detachably coupled to the handle. In one variation thereof, the insertable portion comprises walls defining a guide pathway. In another variation thereof, the insertable portion comprises an elongate tubular member.
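Purely as a sketch, and assuming hypothetical status signals not described in this disclosure, the combined coupling checks for a detachable handle and insertable portion might read as follows.

```python
from typing import NamedTuple


class CouplingState(NamedTuple):
    handle_to_display: bool  # handle seated on the reusable display portion
    blade_to_handle: bool    # insertable portion seated on the handle
    connector_mated: bool    # electrical connector fully engaged


def ready_to_image(state: CouplingState) -> bool:
    """Hypothetical check: the imaging sensor can present views on the display
    only when the whole chain -- blade, handle, display -- is coupled."""
    return all(state)


print(ready_to_image(CouplingState(True, True, True)))   # True
print(ready_to_image(CouplingState(True, False, True)))  # False
```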
While the invention has been described as having exemplary designs, the present disclosure may be further modified within the spirit and scope of this disclosure. This application is therefore intended to cover any variations, uses, or adaptations of the disclosure using its general principles. Further, this application is intended to cover such departures from the present disclosure as come within known or customary practice in the art to which this invention pertains.
This application claims the benefit of priority from U.S. patent application Ser. No. 61/314,058 entitled INTUBATION INSTRUMENT WITH VISUALIZATION FEATURES filed on Mar. 15, 2010 and U.S. patent application Ser. No. 61/265,330 entitled INTUBATION SYSTEM WITH ELASTOMERIC FEATURES filed on Nov. 30, 2009, the disclosures of which are expressly incorporated by reference herein in their entirety.
This invention was made with government support under Award W81XWH-06-1-0019 awarded by the U.S. Army. The government has certain rights in the invention.
Provisional applications:

Number | Date | Country
---|---|---
61/314,058 | Mar 2010 | US
61/265,330 | Nov 2009 | US