The present disclosure relates generally to a system including a visualization instrument comprising a camera to view an internal space and, more particularly, to a visualization instrument comprising a camera to examine the interior of a patient.
Visualization instruments include endoscopes, laryngoscopes, borescopes and other medical instruments designed to look inside the body of a patient. Medical visualization instruments are used in a multitude of medical procedures including laryngoscopy, rhinoscopy, bronchoscopy, cystoscopy, hysteroscopy, laparoscopy, arthroscopy, etc. Visualization instruments are also used in non-medical applications such as to investigate the internal structures of machines, buildings, and explosive devices.

Laryngoscopes are used to obtain a view of the vocal folds and the glottis to perform noninvasive tracheal intubations. A conventional rigid laryngoscope consists of a handle with a light source and a blade. Direct laryngoscopy is usually carried out with the patient lying on his or her back. The laryngoscope is inserted into the mouth, typically on the right side, and pushed towards the left side to move the tongue out of the line of sight and to create a pathway for insertion of an endotracheal tube. The blade may be lifted with an upward and forward motion to move the epiglottis and make a view of the glottis possible. Once the laryngoscope is in place, the endotracheal tube may be inserted into the pathway. The blade may be provided with guide surfaces to guide the insertion of the endotracheal tube.

Laryngoscopes may be outfitted with illumination devices and optical devices to provide views of the vocal cords from outside the patient's body. Optical devices include lenses, mirrors, prisms and optical fibers, all adapted to transfer an optical image. Imaging devices may also be provided to capture the optical images and display them on high-definition display monitors.
Stylets and other visualization instruments have also been developed. Each instrument has its own limitations such as, for example, fogging, insufficient lighting to produce a good optical image, inability to project images remotely, additional procedural steps to insert the endotracheal tube, and cost. Because difficult intubations may be performed remotely from a hospital, such as at the scene of an accident or on a military battlefield, it would be desirable to provide emergency responders and others with affordable equipment for performing field intubations. It would also be desirable to provide visualization instruments which may be discarded after a single use or a limited number of uses.
A visualization instrument and a method of using the visualization instrument are disclosed herein. The visualization instrument is insertable into a space to capture images representing internal views of the space. The visualization instrument comprises an insertable portion supporting an imaging sensor and a video device configured to display images corresponding to views captured by the imaging sensor.
In one exemplary embodiment of the present disclosure, a visualization instrument is provided. The visualization instrument comprises a display device; an imaging assembly including a camera and a lens, the camera including an imaging sensor, an imaging support having a distal surface and an optical cavity, the optical cavity defining a cavity opening in the distal surface, the lens and the camera sealed within the optical cavity to keep the optical cavity dry, the camera outputting a digital image stream corresponding to a plurality of views obtained through the lens; a handle portion detachably coupled to the display device; a self-contained energy source supported by one of the handle portion and the display device; and an insertable portion coupled to the handle portion and insertable into a patient, the insertable portion having a distal cavity with a distal opening at a distal end thereof, the imaging assembly received by the distal cavity and electronically coupled to the display device when the insertable portion is coupled to the handle portion and the handle portion is coupled to the display device to present images corresponding to the plurality of views with the display device.
In one example thereof, the insertable portion further comprises a guide pathway adapted for guiding a tube into a patient, the distal cavity and the guide pathway arranged laterally to each other to reduce an anterior/posterior height of the insertable portion.
In another example thereof, the handle portion and the insertable portion are integrally formed as a single piece blade.
In yet another example thereof, the insertable portion further comprises an anterior guide surface and a medial guide surface, the anterior guide surface and the medial guide surface defining a guide pathway adapted for guiding a tube into a patient. In a variation thereof, the anterior guide surface and the medial guide surface are substantially orthogonal to each other. In a further variation thereof, the tube is distinguishable in the digital image stream as the tube passes through a field of view of the lens. In another variation thereof, the guide pathway comprises a proximal portion and a distal portion, the insertable portion further comprising a posterior guide surface opposite the anterior guide surface and a lateral guide surface opposite the medial guide surface, the distal portion of the guide pathway defined by the anterior guide surface, the posterior guide surface, the medial guide surface and the lateral guide surface. In a further variation thereof, the proximal portion of the guide pathway is shorter than the distal portion.
In a further example, the insertable portion further comprises an anterior wall and a medial wall, the anterior wall and the medial wall defining a guide pathway adapted for guiding a tube into a patient, the guide pathway adjacent a side of the medial wall and the distal cavity adjacent an opposite side of the medial wall, the anterior wall having a tip portion extending distally beyond the medial wall. In a further variation thereof, the tip portion includes one or more flexural support features. In another variation thereof, the one or more flexural support features increase a flexural strength of the tip portion by at least 5%. In another variation thereof, the flexural support features comprise at least one of a longitudinally aligned ridge and a transverse curvature of the tip portion.
In another example, the imaging assembly is permanently attached to the insertable portion.
In yet another example, the visualization instrument further comprises an electronic connector affixed to the insertable portion and accessible from the distal cavity, the imaging assembly operable to removably connect to the connector when the imaging assembly is received by the distal cavity.
In yet another example, the visualization instrument further comprises a protrusion and a recess configured to receive the protrusion, the recess and the protrusion generating an audible sound when the handle portion couples to the display device. In a variation thereof, the visualization instrument further comprises a display device support portion supporting the display device, the handle portion including a handle cavity adapted to receive the display device support portion thereby coupling the display device to the insertable portion, one of the protrusion and the recess positioned on the display device support portion and the other of the protrusion and the recess positioned inside the handle cavity.
In another exemplary embodiment of the present disclosure, a visualization instrument partially insertable into a patient is provided. The visualization instrument comprises a display device; a lens; a camera including an imaging sensor, the camera outputting a digital image stream corresponding to a plurality of views obtained through the lens; a handle portion detachably coupled to the display device; a self-contained energy source supported by one of the handle portion and the display device; and an insertable portion coupled to the handle portion and insertable into the patient, the insertable portion having a distal cavity at a distal end thereof receiving the lens and the camera, the camera electronically coupled to the display device when the insertable portion is coupled to the handle portion and the handle portion is coupled to the display device to present images corresponding to the plurality of views with the display device, the insertable portion further comprising at least two substantially non-resilient walls and at least one resilient wall, the at least two non-resilient walls and the at least one resilient wall forming a guide pathway operable to guide insertion of a tube into the patient and defining an elongate opening, the at least one resilient wall deforming when at least a portion of the tube is removed through the elongate opening.
In one example thereof, the handle portion and the insertable portion are integrally formed as a single piece blade. In one variation thereof, the blade is configured to be discarded after a single use.
In another example thereof, the guide pathway defines a proximal anterior/posterior height at one end thereof and a distal anterior/posterior height at a distal end thereof, the proximal anterior/posterior height being greater than the distal anterior/posterior height.
In a further example thereof, the visualization instrument further comprises a distal tip extending distally beyond the lens, the distal tip including flexural strengthening features to reduce flexure of the distal tip by at least 5% when the distal tip engages the patient's tissue. In a variation thereof, the flexural strengthening features comprise at least one of a curved profile of the distal tip along its width and a longitudinal ridge extending from a surface of the distal tip.
In yet another exemplary embodiment of the present disclosure, a visualization instrument partially insertable into a patient is provided. The visualization instrument comprises an insertable portion having guiding means for guiding insertion of a tube into a patient, the guiding means resiliently deforming when at least a portion of the tube is removed through the guiding means; attachment means for detachably coupling a display device to the insertable portion; and imaging means for capturing a plurality of images corresponding to a field of view of the imaging means and outputting a digital image stream operable to present corresponding images with the display device.
In yet another exemplary embodiment of the present disclosure, a visualization method is provided. The visualization method comprises the steps of providing an insertable component having a camera; detachably coupling a display support component to the insertable component, the display support component sized to be held by a hand of a user and including a display device, the display support component being communicatively coupled to the camera when the display support component is coupled to the insertable component; inserting the insertable component into a target space; capturing with the camera a plurality of views corresponding to a field of view of the camera; presenting with the display device a plurality of images corresponding to the plurality of views; aligning the field of view with a target within the target space; removing the insertable component from the target space; and detaching the display support component from the insertable component. In an example thereof, the method further comprises the step of discarding the insertable component.
In yet another example thereof, the target space is an interior of a patient and the target comprises the vocal cords of the patient, and the method further comprises the step of intubating the patient using the insertable component before removing the insertable component from the target space. In a variation thereof, the insertable component comprises a resilient portion, and the removing step includes the step of resiliently deforming the resilient portion.
The features of this invention, and the manner of attaining them, will become more apparent and the invention itself will be better understood by reference to the following description of embodiments of the invention taken in conjunction with the accompanying drawings.
Corresponding reference characters indicate corresponding parts throughout the several views. Although the drawings represent embodiments of the present invention, the drawings are not necessarily to scale and certain features may be exaggerated to better illustrate and explain the embodiments. The exemplifications set out herein illustrate embodiments of the invention in several forms and such exemplification is not to be construed as limiting the scope of the invention in any manner.
The embodiments of the disclosure discussed below are not intended to be exhaustive or limit the invention to the precise forms disclosed in the following detailed description. Rather, the embodiments are chosen and described so that others skilled in the art may utilize their teachings.
A visualization instrument, and a method of using the instrument, are disclosed herein. In one embodiment, the visualization instrument comprises a display screen and a display screen support portion removably and electrically coupled to an insertable portion including an imaging system to acquire images of an internal space. Exemplary visualization instruments include endoscopes, laryngoscopes, and stylets. The display screen support portion and the display screen may be integrally constructed and may be reusable or disposable. In various embodiments described below, a unitary component comprising the display screen and the display screen support portion is referred to as a reusable portion, denoting that in many instances it is possible, although not required, and perhaps desirable for economic reasons, to reuse the display screen and its related electronic components. In one example thereof, the reusable portion includes a housing received in a proximal cavity of a handle coupled to the insertable portion. The display device is supported by the housing. In one variation thereof, the display device is supported by the housing at a fixed angle, preferably between 10 degrees and 30 degrees, and even more preferably between 12.5 degrees and 25 degrees, measured from a plane parallel to the posterior surface of the proximal end of the insertable portion. In another variation thereof, the display device is hinged to enable a practitioner to adjust the display angle as the visualization instrument is inserted into the patient.
In another embodiment of the visualization instrument, the insertable portion comprises a passageway or guide pathway configured to guide insertion of an elongate tubular component, e.g., an airway device, endotracheal tube and the like, and an imaging assembly disposed on or in the distal end of the insertable portion. The imaging assembly captures images of the patient which are shown with the display device. A distal end of the tubular component may also be visible in the images as the tubular component slides through the guide pathway towards the vocal cords.
Advantageously, the imaging assembly may be configured to be produced at a low cost to enable the insertable portion to function as a single-use disposable device. In one embodiment, the imaging assembly comprises a support structure, or camera barrel, supporting a camera integrated circuit (IC), also referred to as a camera or camera-chip, an illumination device, and lenses. The imaging assembly may be inserted into a cavity located in the distal end of the insertable portion. The imaging assembly may comprise a retention device, e.g., a pin, detent, resilient elastomeric filler, screw, or any other fixation device configured to securely couple the imaging assembly to the distal cavity.
A commercially available camera, such as a camera used in cellular phones and personal digital assistants (PDAs), comprises an image sensor and electronic components configured to convert pixel data captured by the image sensor to image data, e.g., digital images, and to output streams of digital images in a standard format. Image sensors may comprise CCD sensors, CMOS sensors with active or passive pixels, or other photosensors well known in the art. Operational signals are provided to the image sensor to control its operation. Advantageously, the cost of the disposable portion is further reduced by locating the components that provide the operational signals in the reusable portion. In one example thereof, the input/output signals are carried by signal conductors, e.g., a multi-conductor flexible ribbon. In another example thereof, a control component is provided intermediate the camera and the display driver to transform the standard image stream into a differently structured image stream conforming to the size of the display device and/or to transform the standard image stream to the format required by the display driver. In yet another example, control components supported by the reusable portion housing provide control signals to the camera to define the size of the images output by the camera.
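As a concrete illustration of this arrangement, the following C sketch models a control component that accepts a frame in the camera's standard output format and restructures it to the smaller size expected by the display driver. The type and function names are hypothetical and not taken from the disclosure, and the sketch subsamples only the luma bytes; the color-format conversion performed by the actual component is discussed further below.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical descriptors (names not from the disclosure) for the two ends
 * of the pipeline: the camera's standard-format output and the buffer the
 * display driver expects. */
typedef struct {
    uint16_t width, height;   /* e.g., 640 x 480                    */
    uint8_t  bytes_per_px;    /* e.g., 2 for packed YUV 4:2:2       */
    const uint8_t *pixels;    /* one captured frame                 */
} camera_frame_t;

typedef struct {
    uint16_t width, height;   /* e.g., 320 x 240                    */
    uint8_t *pixels;          /* buffer owned by the display driver */
} display_frame_t;

/* Control component placed between the camera and the display driver:
 * restructure one standard-format camera frame to the display's size.
 * For brevity this keeps only the first byte of each sampled pixel
 * (the luma byte in a Y-first packing); a full implementation would
 * also convert the color format. */
void control_reformat(const camera_frame_t *in, display_frame_t *out)
{
    uint16_t sx = in->width  / out->width;   /* horizontal decimation */
    uint16_t sy = in->height / out->height;  /* vertical decimation   */

    for (uint16_t y = 0; y < out->height; ++y) {
        for (uint16_t x = 0; x < out->width; ++x) {
            size_t src_px = (size_t)y * sy * in->width + (size_t)x * sx;
            out->pixels[(size_t)y * out->width + x] =
                in->pixels[src_px * in->bytes_per_px];
        }
    }
}
```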
While the embodiments of the disclosure are applicable in medical and non-medical applications, exemplary features of visualization instruments will be described below with reference to medical instruments such as laryngoscopes and stylets, although the invention is not limited to medical applications and instruments.
An embodiment of a visualization instrument is described below with reference to
Passageway 36 is defined by the interior surfaces of a medial wall 44, an anterior wall 34, a posterior wall 24, and a lateral wall 50 which in this embodiment comprises a wall portion 54. Each wall has an interior surface which is the surface adjacent to passageway 36. A surface 42 is the interior surface of medial wall 44. Surfaces 38 and 40 are the external surfaces of anterior wall 34 and posterior wall 24, respectively. In other embodiments, wall 50 may extend uninterrupted from the proximal to the distal end of blade 14 or may be configured with more or fewer wall portions. Passageway 36 may have a cross-section designed to be operable with endotracheal tubes having internal diameters ranging from 2.0 to 10.0 mm, and more preferably between 5.0 and 8.0 mm. Surfaces 38 and 40 define the anterior and posterior surfaces, respectively, of blade 14. Wall 50 may also include a wall portion 56 configured to confine the volume of passageway 36 further than it is confined by wall portion 54. A distal tip 46 extends wall 34 beyond the end of medial wall 44 and comprises a surface 70 which is configured to contact the patient to move the epiglottis and expose the vocal cords.
The cross-sectional area of passageway 36 may be uniform or may vary. In one embodiment, the cross-sectional area of passageway 36 is smaller at the distal end of the insertable portion than at its proximal end. One or both of walls 24 and 54, or portions thereof, may be formed at least in part of a composition comprising resilient material, e.g., thermoset or thermoplastic elastomeric material, Buna-N (nitrile rubber, NBR), EPDM, silicone, neoprene, block copolymers (SIS, SBS, SEBS, SEPS), etc., configured to enable the smaller cross-sectional area to expand when a tube having a diameter larger than the cross-sectional area is introduced through passageway 36. Advantageously, a resilient distal cross-sectional area enables the insertable portion to snugly receive tubes of different diameters, which are pressed against the anterior wall by the resilient material and thereby positioned adjacent to distal tip 46.
Still referring to
In one example, the medial guide surface includes a transition portion extending through the proximal portion of the guide pathway and a longitudinally aligned portion extending through the distal portion of the guide pathway. In a variation thereof, the transition portion extends from a lateral side of the insertable portion to the longitudinally aligned portion.
In another embodiment of a visualization instrument, the visualization instrument comprises audible engagement features. In one example thereof, a protrusion makes an audible sound when it engages a notch to indicate to a user that the handle and the display device have been properly engaged. In another example, ridges or channels comprise an interruption adapted to receive a protrusion and to make an audible sound when the protrusion is received by the interruption. In a further example, a protrusion supported by a support element supporting a display device, and a matching recess in the handle are configured to generate an audible sound, such as a “click” sound, when the handle and the support element are properly engaged.
In one example of the present embodiment, the camera supplies a first image stream which is 8 bits wide. The resolution of the camera is 640×480 (VGA) pixels per frame at 30 frames per second. The data format is 2 bytes per pixel, i.e., the so-called YUV 4:2:2 format: intensity (Y) is specified for every pixel, while color information (U or V) is specified for every second pixel. An FPGA is programmed to convert the data stream to a second image stream with a format compatible with the display device 110, which comprises an OLED display. In an alternative embodiment, the camera data is provided to the video processing chip, and the video processing chip, after adding information such as colors, symbols or other information, outputs a video stream to the FPGA for the FPGA to convert to the VGA format. The display resolution is 320×240 (QVGA) pixels per frame at 30 frames per second. The data format, however, is RGB (6, 6, 6), which uses a 6-bit value for red, a 6-bit value for green, and a 6-bit value for blue. Well-known equations exist for conversion from the YUV color space to the RGB color space, and the FPGA implements this conversion. The FPGA also converts from VGA to QVGA resolution, e.g., by dropping every second pixel in each direction, and provides signals for writing the converted data stream into the OLED display's memory/buffer. The FPGA also sends the camera data to the NTSC/S-video conversion chip. The video chip, which contains the video processor, is capable of accepting the VGA YUV format almost directly; the FPGA provides the necessary operational signals to load the video chip's memory.
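The pixel-level conversion described above can be modeled in software as follows. This is a minimal sketch only: it assumes YUYV byte packing and BT.601 full-range coefficients for the well-known YUV-to-RGB equations, neither of which is specified in the disclosure, and the function and macro names are illustrative. On the device itself this logic is implemented in the FPGA fabric rather than on a processor.

```c
#include <stdint.h>
#include <stddef.h>

#define SRC_W 640           /* camera: VGA          */
#define SRC_H 480
#define DST_W (SRC_W / 2)   /* display: QVGA        */
#define DST_H (SRC_H / 2)

static uint8_t clamp8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v); }

/* src: packed YUYV (YUV 4:2:2), 2 bytes per pixel on average, 640x480.
 * dst: one RGB(6,6,6) value per pixel packed into the low 18 bits of a
 *      uint32_t as r:g:b, 320x240. */
void vga_yuv422_to_qvga_rgb666(const uint8_t *src, uint32_t *dst)
{
    for (int y = 0; y < DST_H; ++y) {
        const uint8_t *line = src + (size_t)(2 * y) * SRC_W * 2; /* keep every 2nd line */
        for (int x = 0; x < DST_W; ++x) {
            /* keep every 2nd pixel: the even pixel of each 4-byte YUYV
             * pair carries Y at byte 0, U at byte 1, V at byte 3 */
            const uint8_t *p = line + (size_t)x * 4;
            int Y = p[0], U = p[1] - 128, V = p[3] - 128;

            /* BT.601 full-range YUV -> RGB (assumed coefficient set) */
            int r = clamp8(Y + (1402 * V) / 1000);
            int g = clamp8(Y - (344 * U + 714 * V) / 1000);
            int b = clamp8(Y + (1772 * U) / 1000);

            /* truncate the 8-bit channels to the display's 6-bit depth */
            dst[(size_t)y * DST_W + x] =
                ((uint32_t)(r >> 2) << 12) |
                ((uint32_t)(g >> 2) << 6)  |
                 (uint32_t)(b >> 2);
        }
    }
}
```

Because each output pixel is taken from the even pixel of a source pair on an even source line, the U and V bytes it needs are already adjacent to its Y byte in the packed stream, which keeps the decimation and color conversion in a single pass.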
A program and data structures are embedded in the memory. The program comprises a plurality of processing sequences executable by the processor to interact with the data structures, and the data may include parameters such as video instructions and the like.
In yet another embodiment of the visualization instrument, comfort features are provided. In one example thereof, the handle comprises soft material to enhance grasping comfort. In another example, the insertable portion comprises a resilient component to reduce pressure on the teeth of the patient. In a further example thereof, a blade comprises a first material which imparts structure and rigidity to the insertable portion and a second material coupled to the first material to provide a soft and resilient feel. In one variation, the second material extends, at least partially, over the surface of the handle. For example, a thin layer of elastomeric material, e.g., about 1 mm thick, may be provided over surface 40 and extend to the posterior side of handle 30. The second material may also extend over the surface of wall 50. The second material may be adhesively secured to the first material. The first material has a first modulus and the second material has a second modulus which is lower than the first modulus. In a further example, walls 34 and 44 comprise the first material and wall portion 54 comprises the second material. Advantageously, this embodiment provides flexibility to wall portion 54, which facilitates removal of the endotracheal tube from passageway 36.
Another embodiment of a visualization instrument, denoted by numeral 500, is described below with reference to
While the invention has been described as having exemplary designs, the present disclosure may be further modified within the spirit and scope of this disclosure. This application is therefore intended to cover any variations, uses, or adaptations of the disclosure using its general principles. Further, this application is intended to cover such departures from the present disclosure as come within known or customary practice in the art to which this invention pertains.
This application claims the benefit of priority from U.S. Patent Application Ser. No. 61/314,058 entitled INTUBATION INSTRUMENT WITH VISUALIZATION FEATURES filed on Mar. 15, 2010 and U.S. Patent Application Ser. No. 61/265,330 entitled INTUBATION SYSTEM WITH ELASTOMERIC FEATURES filed on Nov. 30, 2009, the disclosures of which are expressly incorporated by reference herein in their entirety.
Number | Date | Country
---|---|---
61/314,058 | Mar 2010 | US
61/265,330 | Nov 2009 | US