Endoscopic device and system

Information

  • Patent Grant
  • Patent Number
    10,064,683
  • Date Filed
    Friday, April 17, 2015
  • Date Issued
    Tuesday, September 4, 2018
Abstract
An endoscopic system may include a catheter or tubing and at least one optical sensor disposed along the catheter or tubing and configured to capture image information from a body lumen when disposed within the body lumen and activated. The system may further include an untethered module operatively arranged with the at least one optical sensor and configured to store or transmit image information captured by the at least one optical sensor. The catheter or tubing, at least one optical sensor and module may be portable during use.
Description
BACKGROUND

Sleep disordered breathing, including snoring and obstructive sleep apnea, affects tens of millions of adults in the United States. It is associated with substantial cardiovascular morbidity and mortality, endocrine disturbances, excessive daytime sleepiness, deficits in quality of life and performance, and motor vehicle crashes.


Treatment options include behavioral measures such as weight loss, positive airway pressure therapy, surgery, and oral appliances. All treatments have strengths and weaknesses, and in particular surgical treatment has outcomes that vary widely among patients and procedures.


The evaluation of patients with sleep disordered breathing may improve outcomes of surgical treatment. The goals of such evaluation include characterizing (1) the pattern of airway obstruction (involving primarily the palate/tonsils region, tongue base, epiglottis, and/or lateral pharyngeal walls) and (2) the site of sound production. Existing upper airway examination techniques, however, may not provide an accurate evaluation of the pharynx during natural sleep as explained below.


A flexible endoscope such as the Olympus fiberscope or the Olympus video scope may be utilized to examine a patient's upper airway during wakefulness, natural sleep, or sedation. Examination during natural sleep may provide the best results, but attempts to perform traditional natural sleep endoscopy have been largely abandoned for multiple reasons, including the fact that an operator must be present to hold the endoscope in place during the often prolonged period needed for patients to fall asleep with the endoscope in place. The behavior of the upper airway during wakefulness differs dramatically from its behavior during natural sleep, which makes examinations during wakefulness insufficient. Sedation is costly because it requires a controlled environment and the involvement of highly trained personnel and specialized equipment. In addition, sedation may alter the pattern of upper airway obstruction.


Current examinations during wakefulness, sedation, and natural sleep are also limited because their duration is typically no more than 10-15 minutes due to personnel and financial constraints. It is unclear whether this abbreviated examination adequately describes pharyngeal behavior through an entire night of sleep.


There is enthusiasm among clinicians and patients alike for improved surgical evaluation techniques, particularly techniques that provide an accurate, dynamic assessment of the upper airway during natural sleep without the need for sedation, the presence of a clinician, or the associated costs.


SUMMARY

An endoscopic device may include a catheter or tubing and at least two optical sensors spaced apart from each other along the catheter or tubing such that, for example, image information from different regions of a body lumen partially or completely separated by an obstruction is captured when each of the at least two optical sensors is disposed within one of the regions and activated.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic diagram of an endoscopic system.



FIG. 2 is a schematic diagram of another endoscopic system.



FIG. 3 is a schematic diagram of a data and/or power module of an endoscopic system.



FIG. 4 is a schematic diagram of the sensor arrangement of the endoscopic system of FIG. 1.



FIG. 5 is a schematic diagram of another sensor arrangement for an endoscopic system.



FIG. 6 is a schematic diagram of yet another sensor arrangement for an endoscopic system.



FIG. 7A is a perspective view of one of the optical sensors of the endoscopic system of FIG. 1.



FIG. 7B is a side view, in cross-section, of the optical sensor of FIG. 7A.



FIG. 8A is a perspective view of another optical sensor of an endoscopic system.



FIG. 8B is a side view, in cross-section, of the optical sensor of FIG. 8A.



FIGS. 9A and 9B are front and rear schematic views, respectively, of the endoscopic system of FIG. 2 fitted to a patient's head in the in-use position.



FIGS. 10A and 10B are side views, in sagittal cross-section, of the patient's head and optical sensor arrangement of FIGS. 9A and 9B.



FIGS. 11A and 11B are front and rear schematic views, respectively, of another embodiment of an endoscopic system fitted to a patient's head in the in-use position.



FIGS. 12A and 12B are front and rear schematic views, respectively, of yet another endoscopic system fitted to a patient's head in the in-use position.





DETAILED DESCRIPTION

As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Moreover, similarly numbered elements (e.g., elements 20 and 120, elements 21, 321, and 621, elements 26 and 326, elements 170 and 670) may have similar descriptions. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention.


Referring to FIG. 1, an embodiment of an endoscopic system 10 may be used to collect visual and/or audio information from a patient's airway. This information may be displayed and/or played by the display system 12. The endoscopic system 10 may include an optical sensor arrangement 14, a data and/or power module 16, a sensor cord 18 connecting the sensor arrangement 14 and module 16, and a skullcap 19. Other arrangements are also possible and will be discussed in further detail below.


The module 16 may be removably attached with the skullcap 19 via VELCRO, a snap feature or other suitable attachment mechanism. The skullcap 19 may include an elastic band 20 configured to secure the skullcap 19 and sensor cord 18 to a patient's head.


The sensor arrangement 14, in the embodiment of FIG. 1, includes a body portion 21 and a pair of legs 22 of differing lengths projecting therefrom. Each of the legs 22 has an optical sensor 24 disposed at an end thereof. The sensor arrangement 14 also includes a connector 26 (e.g., a USB port, a fiber optic connector, etc.) disposed at an end of the body portion 21 opposite the legs 22 and capable of, for example, connecting the optical sensors 24 with a mating connector 28 of the sensor cord 18 and/or a mating connector 30 of the display system 12.


When the sensor arrangement 14 is connected with the module 16, the module 16 may store information received from the sensor arrangement 14 for later access. For example, the module 16 may store collected information while a patient is asleep. An investigator may then access the stored information at a later time. When the sensor arrangement 14 is connected with the display system 12, information captured by the sensor arrangement 14 may be viewed in real-time.


The display system 12 may include a display screen 32 and any suitable/known decoder device 34 (e.g., a non-wearable version of module 16, a video monitor, a smart phone, a personal computer with suitable hardware/software, etc.). The decoder device 34 may be configured to interpret information from the sensor arrangement 14 and generate appropriate signals for (i) display and/or play by the display screen 32 and/or (ii) data storage in a storage device (not shown).
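As a rough illustration only, the following Python sketch shows how a decoder device might route interpreted frames to a live display and/or a storage device, mirroring options (i) and (ii) above. The Frame type and the show/write interfaces are hypothetical stand-ins, since the patent does not specify any software interfaces.

    from dataclasses import dataclass
    from typing import BinaryIO, Iterable, Optional

    @dataclass
    class Frame:
        sensor_id: int     # which optical sensor captured the frame
        timestamp_ms: int  # capture time, for synchronized later review
        pixels: bytes      # raw image payload

    def route_frames(frames: Iterable[Frame],
                     display=None,
                     archive: Optional[BinaryIO] = None) -> None:
        # Send each interpreted frame to a live display, a storage
        # device, or both.
        for frame in frames:
            if display is not None:
                display.show(frame)          # real-time viewing
            if archive is not None:
                archive.write(frame.pixels)  # storage for later access

Passing the display and archive independently reflects the point made above: the same captured information may be viewed in real time, stored for later access by an investigator, or both.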


Referring to FIG. 2, the sensor arrangement 114 is directly connected with the module 116. The module 116 may wirelessly transmit (and/or store) information from the sensor arrangement 114 to the decoder device 134 for real-time display and/or play. Other arrangements are, of course, also possible. For example, the module 116 may wirelessly transmit information from the sensor arrangement 114 to a remote storage device (not shown). Alternatively, the module 116 may store the information from the sensor arrangement 114 for later access, etc.


Referring to FIG. 3, an embodiment of a data and/or power module 216 may include a bundle coupling 236, a microphone input 238, a microprocessor 240, a memory 242 (e.g., a card reader, etc.), and a radiofrequency transmitter 244 all mounted on a printed circuit board 246. The module 216 may also include a rechargeable power source 250 (e.g., rechargeable batteries) disposed, in this embodiment, on a side of the circuit board 246 opposite the microprocessor 240, and various switches 252 (e.g., a power switch, a microphone switch, etc.). The coupling 236 may include a recharging port 248, image sensors 254, and associated LEDs 256. Hence, the module 216 is self-contained and may operate untethered. That is, it may operate without being attached via cording to a separate power source, processor, memory, etc. (unlike certain conventional arrangements).


In other embodiments, the image sensors 254, for example, may be replaced with suitable connectors for cameras; the transmitter 244 may be replaced with a suitable wired data output, etc.


Information received via the coupling 236 and/or the microphone input 238 may be processed by the microprocessor 240 in any suitable/known fashion and stored in the memory 242 and/or wirelessly transmitted via the radiofrequency transmitter 244.
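As a rough illustration of this store-and/or-transmit behavior, the Python sketch below shows the kind of acquisition loop such a module's microprocessor might run. Here read_video, read_audio, memory_write, and rf_send are hypothetical stand-ins for hardware interfaces the patent does not define.

    import time

    def acquisition_loop(read_video, read_audio,
                         memory_write=None, rf_send=None,
                         frame_interval_s=1 / 30):
        # Poll the coupling (video) and microphone input (audio), then
        # store each sample in memory and/or hand it to the transmitter.
        while True:
            sample = {
                "t": time.monotonic(),  # timestamp for synchronized playback
                "video": read_video(),  # image data via the coupling 236
                "audio": read_audio(),  # sound data via the microphone input 238
            }
            if memory_write is not None:
                memory_write(sample)    # store in memory 242 for later access
            if rf_send is not None:
                rf_send(sample)         # wireless transmission via transmitter 244
            time.sleep(frame_interval_s)  # pace acquisition at the frame rate

Supplying memory_write and rf_send independently mirrors the description above: the module may store the information, transmit it wirelessly, or do both.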


Referring to FIG. 4, the sensor arrangement 14 of FIG. 1 is shown in further detail. The legs 22 are of different lengths such that the optical sensor 24 disposed at the end of the longer leg 22 may capture images from an area of a patient's airway different from the area imaged by the optical sensor 24 disposed at the end of the shorter leg 22. Given this arrangement, different regions within the airway may be observed at the same time, even if there is an intervening structure that obstructs the simultaneous visualization of both (or all) areas with a single optical sensor. For snoring and sleep apnea, as an example, the soft palate can isolate two areas of the airway (above and below) if it completely or partially obstructs the airway. The use of two cameras or other methods of image acquisition from two sites (as opposed to one) enables the simultaneous visualization of the two areas.


In the embodiment of FIG. 4, the optical sensors 24 are spaced apart at a distance that may be greater than or equal to 5 mm. That is, the length of the longer leg 22 may be at least 5 mm greater than that of the shorter leg 22. In other embodiments, the optical sensors 24 may be spaced at any distance suitable for capturing images of different regions of a patient's airway (e.g., 5 mm, 10 mm, 50 mm, etc.).


Some embodiments may have more or fewer than two legs 22 and two associated optical sensors 24. For example, a sensor arrangement may have three legs, each with an optical sensor disposed at an end thereof. The number of legs (and thus the number of optical sensors) may, however, be limited by the effective diameter of a patient's airway.


Referring to FIG. 5, the legs 322 of the sensor arrangement 314 have the same or different lengths, but a clamp 358 (band, locking mechanism, etc.) may be used to effectively change the length of one of the legs 322 relative to the other. This arrangement may be used to tailor the distance between the optical sensors 324 for a particular patient while using a standard sensor arrangement.


Referring to FIG. 6, the optical sensors 424 are disposed (and spaced apart) along the body portion 421 of the sensor arrangement 414. This arrangement, relative to the embodiments of FIGS. 4 and 5, may allow the packaging of several of the optical sensors 424 without substantially altering the diameter of the sensor arrangement 414.


Referring to FIGS. 7A and 7B, one of the optical sensors 24 of FIG. 1 is shown in further detail. In this embodiment, the optical sensor 24 includes a camera 60 and associated lens 62. The camera 60 may receive power from and transmit image information to the module 16 of FIG. 1 via electrical/communication lines 64. Illumination fibers 66 in optical communication with, for example, LEDs within the module 16 of FIG. 1 are disposed within the leg 22. The illumination fibers 66 provide light for the camera 60.


The optical sensor 24 may have a diameter ranging from approximately 6 mm to 10 mm. The leg 22 may have a diameter ranging from approximately 2 mm to 5 mm. Other diameters and sizes, however, are also possible depending on, for example, camera size, fiber diameter, etc.


Referring to FIGS. 8A and 8B, another embodiment of an optical sensor 524 is shown in further detail. The optical sensor 524 includes a video fiber optic element 568 and associated lens 562. Because the optical sensor 524, unlike the optical sensor 24 of FIGS. 7A and 7B, does not include an internal camera, its diameter may range from approximately 3 mm to 8 mm. Other diameters and sizes, however, are also possible depending on, for example, fiber diameter, etc. In other embodiments, any suitable/known optical sensor may be used.


Referring to FIGS. 9A, 9B, 10A and 10B, the endoscopic system 110 of FIG. 2 is fitted to a patient's head. The sensor arrangement 114 is positioned such that the optical sensors 124 capture image information from different regions within the patient's airway. More specifically, the legs 122 are arranged and the optical sensors 124 are positioned, in this example, such that the optical sensors 124 capture image information from the retropalatal and retrolingual regions of the patient's pharynx regardless of whether there is an obstruction between the two regions (e.g., the soft palate obstructing the airway, as in FIG. 10B). The sensor arrangement 114 may also be arranged, of course, to capture, for example, image information from a single region or multiple overlapping or non-overlapping regions of any body lumen.


The skullcap 119 is placed over the patient's head and the module 116 is attached thereto with, for example, a snap feature (not shown). The body portion 121 is attached to the patient's cheek and nose with adhesive 170. This arrangement may permit the patient to wear (attended or unattended) the endoscopic system 110 for extended periods of time while image information from their airway is collected and, for example, stored within the module 116.


Referring to FIGS. 11A and 11B, another embodiment of an endoscopic system 610 is fitted to a patient's head. In this embodiment, the headband 620 is larger than in the embodiments of, for example, FIGS. 1 and 2. Hence, the module 616 may be attached directly to the front of the headband 620 (or the rear/side as desired).


Referring to FIGS. 12A and 12B, yet another embodiment of an endoscopic system 710 is fitted to a patient's head. The headband 720, in this embodiment, encompasses the nose of the patient and provides a harness in the rear where the module 716 may be attached. In this embodiment, the body portion 721 need not be attached directly to the patient's face. Rather, the body portion 721 is attached to the headband 720 via snaps 772 or other suitable attachment features. Other arrangements are also possible.


While embodiments of the invention have been illustrated and described, it is not intended that these embodiments illustrate and describe all possible forms of the invention. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the invention. As an example, certain embodiments may be used to collect images over a period of time from any body lumen (e.g., nasal cavity, esophagus, stomach, small intestine, large intestine, etc.). As another example, while endoscopic systems including either an internal (to the body) camera with an external light source and battery or an external camera, light source and battery have been explicitly disclosed, other embodiments of endoscopic systems may include an internal camera, battery and light source, an internal camera and light source with an external battery, an internal camera and battery with an external light source, or any other internal/external combination of components. As yet another example, certain embodiments may be fitted to any part of a patient's body via suitable cuffs/adhesives/bands/pockets/etc.

Claims
  • 1. A method for examining an airway of a human patient comprising: inserting into a nasal cavity of the patient an endoscopic device including a catheter or tubing and at least two optical sensors attached to the catheter or tubing; and removably but fixably securing the catheter or tubing to position at least one of the at least two optical sensors in a retropalatal region of an upper airway of the patient and at least one other of the at least two optical sensors in a retrolingual region of the upper airway but not in an esophagus of the patient such that the at least two optical sensors are spaced away from each other along the catheter or tubing and that image information from the retropalatal region is captured by the one of the at least two optical sensors and image information from the retrolingual region is captured by the at least one other of the at least two optical sensors when the at least two optical sensors are activated.
  • 2. The method of claim 1 further comprising: operatively arranging the endoscopic device with an untethered module, that includes at least one of a power source, memory, transmitter, microphone and illumination source, configured to store or transmit wirelessly image information captured by the at least two optical sensors, wherein the catheter or tubing, at least two optical sensors and module are portable during use.
  • 3. The method of claim 1 further comprising: operatively arranging the endoscopic device with an untethered module including at least one of a power source, memory, transmitter, microphone and illumination source.
  • 4. The method of claim 1 further comprising: operatively arranging the endoscopic device with an untethered module and a band or cuff attachable with the module and configured to secure the module to a body of the patient.
  • 5. The method of claim 1 further comprising: removably but fixably securing the catheter or tubing to position the at least two optical sensors such that image information from the retropalatal region is captured by at least one of the at least two optical sensors and image information from the retrolingual region is captured by at least one other of the at least two optical sensors when the at least two optical sensors are activated regardless of whether a soft palate of a patient completely obstructs the airway between the retropalatal region and the retrolingual region.
  • 6. The method of claim 1 further comprising: the catheter or tubing having at least two branches of unequal length extending from a common joint, each of the branches having an optical sensor attached thereto.
  • 7. A method for examining an airway of a human patient comprising: inserting into a nasal cavity of the patient an endoscopic device including a catheter or tubing and at least one optical sensor attached to the catheter or tubing; and removably but fixably securing the catheter or tubing in a retropalatal region of an upper airway of the patient but not in an esophagus of the patient such that image information from the retropalatal region is captured when the at least one optical sensor is activated.
  • 8. The method of claim 7 further comprising: operatively arranging the endoscopic device with an untethered module, that includes at least one of a power source, memory, transmitter, microphone and illumination source, configured to store or transmit wirelessly image information captured by the at least one optical sensor, wherein the catheter or tubing, at least one optical sensor and module are portable during use.
  • 9. The method of claim 7 further comprising: operatively arranging the endoscopic device with an untethered module including at least one of a power source, memory, transmitter, microphone and illumination source.
  • 10. The method of claim 7 further comprising: operatively arranging the endoscopic device with an untethered module and a band or cuff attachable with the module and configured to secure the module to a body of the patient.
  • 11. A method for examining a human patient comprising: inserting into a hollow body cavity of the patient an endoscopic device including a catheter or tubing having at least two branches of unequal length extending from a common joint, each of the branches having an optical sensor attached thereto; and positioning the branches within the cavity such that the at least two optical sensors are spaced away from each other and that image information from different regions of the cavity is captured when the at least two optical sensors are activated during natural sleep; and removably but fixably securing a portion of the catheter or tubing to a body of the patient.
  • 12. The method of claim 11 further comprising: operatively arranging the endoscopic device with an untethered module, that includes at least one of a power source, memory, transmitter, microphone and illumination source, configured to store or transmit wirelessly image information captured by the at least two optical sensors, wherein the catheter or tubing, at least two optical sensors and module are portable during use.
  • 13. The method of claim 11 further comprising: operatively arranging the endoscopic device with an untethered module including at least one of a power source, memory, transmitter, microphone and illumination source.
  • 14. The method of claim 11 further comprising: operatively arranging the endoscopic device with an untethered module and a band or cuff attachable with the module and configured to secure the module to the body.
  • 15. A method for examining a human patient comprising: inserting into a hollow body cavity of the patient an endoscopic device including a catheter or tubing and at least one optical sensor attached to the catheter or tubing; positioning the catheter or tubing such that image information from the at least one optical sensor is captured during natural sleep when the at least one optical sensor is activated; and removably but fixably securing a portion of the catheter or tubing to a body of the patient.
  • 16. The method of claim 15 further comprising: operatively arranging the endoscopic device with an untethered module, that includes at least one of a power source, memory, transmitter, microphone and illumination source, configured to store or transmit wirelessly image information captured by the at least one optical sensor, wherein the catheter or tubing, at least one optical sensor and module are portable during use.
  • 17. The method of claim 15 further comprising: operatively arranging the endoscopic device with an untethered module including at least one of a power source, memory, transmitter, microphone and illumination source.
  • 18. The method of claim 15 further comprising: operatively arranging the endoscopic device with an untethered module and a band or cuff attachable with the module and configured to secure the module to the body.
  • 19. The method of claim 15 further comprising at least two optical sensors attached to the catheter or tubing.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of application Ser. No. 14/293,994, filed Jun. 2, 2014, which is a continuation of application Ser. No. 13/071,937, filed Mar. 25, 2011, which claims the benefit of U.S. Provisional Application No. 61/321,911, filed Apr. 8, 2010, each of which is hereby incorporated by reference in its entirety.

Related Publications (1)
Number Date Country
20150216607 A1 Aug 2015 US
Provisional Applications (1)
Number Date Country
61321911 Apr 2010 US
Continuations (2)
Number Date Country
Parent 14293994 Jun 2014 US
Child 14689406 US
Parent 13071937 Mar 2011 US
Child 14293994 US