The present invention relates to determining the shape of surfaces of soft tissue and, more specifically, to determining such shapes using optical technology.
Hearing aids, hearing protection, and custom headphones often require silicone impressions to be made of a patient's ear canal. Audiologists pour the silicone material into an ear and wait for it to harden, and manufacturers then use the resulting silicone impression to create a custom-fitting in-ear device. The process is slow, expensive, inconsistent, and unpleasant for the patient, and can even be dangerous.
Also, a range of other medical needs benefit from determining the shape of body surfaces, including surfaces defining body orifices, such as the size or shape of an ear canal, throat, mouth, or nostrils of a patient. For example, surgery may be guided by knowing such shapes, or medical devices may be fashioned to have a custom fit for such shapes.
There is a need, therefore, for improvements in the determination of body surface shapes, including the shapes and sizes of surfaces associated with body orifices.
According to one embodiment of the present invention, a device for scanning a body orifice of a body includes a light source and a wide angle lens wherein the light from the light source is projected in a pattern distal to the wide angle lens.
Another embodiment of the present invention includes a method of determining the geometry of a body orifice. The method includes projecting, with a light source, a pattern of light to a location in a coordinate system. At least a partial lateral portion of the pattern of light illuminates a surface of the body orifice. A position of the lateral portion in the coordinate system is determined using a camera with a focal surface, wherein the focal surface includes the location.
With reference now to
The term “known” as used herein refers to being known approximately, within normal tolerances sufficient to achieve the desired resolution. Thus, the known focal surface has some thickness and variation to it that corresponds to the result of normal manufacturing tolerances.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Referring again to
Notably, some or all of the elements of the optical hardware assembly 66, including the light sources for the laser and/or the fiberscope, the tracking system 28 and the computer 68 can be contained within the body of the handheld probe 34.
The body 30 of the patient defines any of a plurality of orifices or body surfaces that can be investigated by embodiments of the present invention for medical purposes. Body markers 38 or fiducials adorn the portion of the body 30 defining the surface or orifice of interest. For example, a head band 72 extends around the head near the ear canal 70 and supports a plurality of retro-reflective spheres.
It should also be noted that non-medical uses are possible, such as measuring tortuous openings or surfaces in an industrial setting. However, embodiments of the present invention are particularly well suited to measuring surfaces of an ear canal 70, which has a small diameter (approximately 6 mm) and optionally at least one bend along its length.
The probe 34 includes a handle 74, a cable 76, a probe shaft 40, and a plurality of probe markers 36. The cable 76 includes a light conductor 42 and a plurality of image conductors 44 and connects the probe 34 to the optical hardware assembly 66. The image conductors may conduct the optical images, such as through a fiber optic line, or by communicating an electrical signal containing the image data. The term “conductor” therefore is used in its broadest sense herein to include the conducting of any signal, analog or digital, whether power, information, or data. A conductor may also represent wireless communication, such as by an RF signal.
The optical hardware assembly includes a fiberscope body, the light source 12 and a camera. The fiberscope body is connected via one of the image conductors 44 to the probe 34. The camera is connected to the fiberscope body and receives images therefrom for navigation of the probe 34 within the body orifice. Similarly, the light source, in this case a laser light source, connects to the probe 34 via the light conductor 42.
The tracking system 28 includes a pair of cameras 78 spaced apart and pointed toward the probe markers 36 and the body markers 38. Optionally, there are at least three probe markers 36. The tracking system 28 may be an integrated system that is configured to track and report the position of objects within its coordinate system, or of one marker relative to another, using on-board hardware and software. Alternatively, the processing functions may be distributed, such as within the computer system 68 of the embodiment illustrated in
The computer system 68 is connected to the optical hardware assembly 66 and to the tracking system 28. Within the computer are the processor 26 and additional components described in more detail in
Referring to
Referring to
Extending under and past the wide angle lens 14 are the light conductor 42 and the distal end of the fiberscope 54, which includes a conductor(s) (such as a fiber optic bundle) for diffuse light and a return conductor(s) for returning navigation images. At the distalmost tip of the light source 12 is positioned the mirror 52, which has a conical shape and is configured to redirect the laser light into the pattern 16. If the conical surface makes more or less than a 45 degree angle with respect to the axis of the laser light, the shape of the pattern 16 is a conical surface. At 45 degrees, the shape is the planar surface 60 shown in
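This reflection geometry may be verified with a short numerical sketch (illustrative only, and not part of the disclosed device), assuming an ideal conical mirror and an incident laser beam traveling along the probe axis:

```python
import numpy as np

def reflect_off_cone_mirror(mirror_angle_deg):
    """Reflect an axial laser ray off a conical mirror.

    mirror_angle_deg is the angle of the mirror surface with respect to the
    axis of the incoming laser light (45 degrees in the planar case).
    Returns the angle of the reflected ray measured from the probe axis.
    The calculation is done in the radial (r) / axial (z) plane; the full
    pattern is the surface of revolution of this ray about the axis.
    """
    a = np.radians(mirror_angle_deg)
    d = np.array([0.0, 1.0])               # incident ray traveling along +z
    n = np.array([np.cos(a), -np.sin(a)])  # outward normal of the mirror facet
    r = d - 2.0 * np.dot(d, n) * n         # specular reflection
    return np.degrees(np.arctan2(r[0], r[1]))  # angle from the +z axis

for angle in (45.0, 40.0, 50.0):
    off_axis = reflect_off_cone_mirror(angle)
    kind = "planar disc" if np.isclose(off_axis, 90.0) else "conical surface"
    print(f"mirror at {angle:.0f} deg -> ray {off_axis:.0f} deg from axis ({kind})")
```

As the sketch shows, a 45 degree mirror sends the reflected rays exactly 90 degrees from the axis, producing the planar pattern, while any other mirror angle tilts the reflected rays into a conical surface.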
The mask 50 is a planar sheet with a pair of holes, as shown in
The transparent side walls 58 and the cap 56 are configured to enclose and protect the distal portions of the probe 34 while still allowing passage of the laser light pattern 16, the diffuse navigation light from the fiberscope 54, and the images resulting and returning therefrom. The cap 56 may be, but does not need to be, transparent for the fiberscope. Optionally, as shown in
Because the light conductor 42 and the image conductor extend distal to the wide angle lens 14, the images of the projected pattern 16 of light as it strikes a surface are not completely detected and returned through the full 360 degree field. Instead, in a roughly cylindrical opening such as the ear canal, the returned partial lateral portion 20 may only be a “C shape” that leaves out a portion blocked from visibility by the light conductor 42 and image conductor 44.
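In software, the shadowed portion of the ring can simply be discarded. The following non-limiting sketch assumes the ring detections have already been expressed as angles about the optical axis and uses an illustrative 60 degree shadow; the actual occluded sector depends on the placement of the conductors 42, 44:

```python
import numpy as np

def keep_visible_sector(angles_rad, occluded_center_rad, occluded_halfwidth_rad):
    """Return only the ring samples outside the sector shadowed by the
    light and image conductors, leaving the 'C'-shaped lateral portion."""
    # Signed angular distance from the center of the occluded sector.
    delta = np.angle(np.exp(1j * (angles_rad - occluded_center_rad)))
    return angles_rad[np.abs(delta) > occluded_halfwidth_rad]

# Illustrative values only: samples every 2 degrees, a 60 degree shadow at +90 degrees.
samples = np.radians(np.arange(0, 360, 2))
visible = keep_visible_sector(samples, np.radians(90.0), np.radians(30.0))
print(f"{visible.size} of {samples.size} samples retained")
```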
The fiberscope 54 has its distal end near the cone mirror 52 and extends proximally in a path adjacent to the light conductor 42 within the shaft 40. Both the fiberscope 54 and the light conductor 42 bend around a CCD camera chip 80 and into the body of the probe 34 to pass through the cable 76 to the computer system 68.
Also within the shaft 40 of the probe 34, the wide angle lens 14 and its image conductor 44 extend back from the cylindrical window 58 in a generally parallel relationship to the conductors 42, 44. The relative positioning of the optical components of the wide angle lens 14 is maintained in part by a pair of spacers 82. The wide angle lens 14 is a plurality of optical lens elements that include the image conductor 44 returning the image of the lateral portions 20 to the CCD camera chip 80 mounted in the body of the probe as shown in
Supporting the wide angle lens 14 is a focusing screw 84 that, when turned, adjusts the focus of the wide angle lens 14, thereby changing the position of its focal surface to compensate for manufacturing tolerances and to improve accuracy within a variety of body orifices. Proximal to the focusing screw 84 is the CCD camera chip 80, which receives the images of the lateral portions 20 and converts those images into pixel data for return to the computer 68 for processing.
The term “wide angle lens” as used herein means any lens configured for a relatively wide field of view that will work in tortuous openings such as the ear canal 70. For example, for an ear canal, a 63 degree field of view results in a lens-to-focal-surface offset about equal to the maximum diameter of the ear canal that can be scanned with a centered probe 34. Notably, the forward focal surface offset of a 60 degree lens (a fairly standard wide angle lens) is about equal to that diameter, roughly 6 mm, which is short enough to survive the second bend in an ear canal, which occurs at about a 6 mm diameter. Therefore, for the purpose of ear canals, wide angle lenses are 60 degrees or greater. A 90 degree field of view works even better: its 2:1 ratio allows a forward focal surface distance of about 3 mm, allowing the probe 34 to be fairly short. Lenses greater than 90 degrees are possible, as are lenses that include complex optical elements with sideways-only views and no forward field of view.
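As a non-limiting numerical illustration of this field-of-view geometry, the sketch below assumes a probe centered in a cylindrical canal and a simplified thin-lens model, so the resulting numbers are approximations of the figures quoted above rather than exact design values:

```python
import math

def forward_focal_offset_mm(field_of_view_deg, canal_radius_mm=3.0):
    """Approximate axial distance from the lens at which its field of view
    first reaches the canal wall, for a probe centered in a cylindrical
    canal.  Simplified geometry; real lenses and off-center probes differ."""
    half_angle = math.radians(field_of_view_deg / 2.0)
    return canal_radius_mm / math.tan(half_angle)

for fov in (60, 63, 90, 120):
    print(f"{fov:3d} deg field of view -> ~{forward_focal_offset_mm(fov):.1f} mm forward offset")
```

Under these assumptions, the 90 degree lens reaches the wall of a 6 mm canal about 3 mm ahead of the lens, consistent with the 2:1 ratio noted above, while narrower fields of view push the working region farther forward.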
In another embodiment, illustrated
Referring to
An advantage of the present invention is that the wide angle lens 14 can view relatively proximate lateral portions of the body surface with high precision due to the overlap of its focal surface with the pattern 16 of laser light. The term “focal surface” as used herein refers to a thickness within the range of focus of the wide angle lens 14 that is capable of achieving a certain baseline resolution, such as being able to discern a 50 micrometer feature or smaller. For example, lateral positioning of the pattern 16 within the focal surface allows one pixel to be equivalent to about 50 micrometers. The focal surface itself has a bell-curve distribution of resolution that allows variations in the overlap or thickness of the focal surface and the width of the lateral portion 20, which, as shown above, has its own curved distribution across its thickness.
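As a simple worked example of this pixel-scale relationship, the span and pixel count below are assumed, illustrative values rather than measured parameters of the device:

```python
def micrometers_per_pixel(lateral_span_mm, pixels_across):
    """Nominal object-space size of one pixel when the laser ring lies in
    the focal surface; the inputs here are illustrative assumptions."""
    return 1000.0 * lateral_span_mm / pixels_across

# For example, a 12 mm lateral span imaged across 240 pixels gives 50 um per pixel.
print(f"{micrometers_per_pixel(12.0, 240):.0f} micrometers per pixel")
```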
Generally, the wide angle lens 14 should have reasonably low distortion to meet the resolution goals. Some wide angle lenses exhibit as much as −80 percent or −60 percent distortion, which would need to be compensated by improved accuracy in other areas, such as placement of the focal surface and lateral portion 20. Therefore, there is no set threshold, although collectively the various components are preferably tuned to allow a 50 micrometer or better resolution for lateral distances from the optical axis of the wide angle lens 14. The inventors have found that a distortion better than −40 percent works well with the preferred fields of view mentioned herein for ear canal applications.
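The disclosure does not prescribe a particular distortion model. As a non-limiting illustration, a common polynomial radial model can be inverted in software to compensate residual lens distortion of detected ring points; the coefficients below are placeholders, not characterized values:

```python
import numpy as np

def undistort_radial(distorted_norm, k1, k2, iterations=10):
    """Iteratively invert a polynomial radial distortion model
    x_d = x_u * (1 + k1*r^2 + k2*r^4) applied to normalized image
    coordinates.  This model is an assumption used only for illustration."""
    undistorted = distorted_norm.copy()
    for _ in range(iterations):
        r2 = np.sum(undistorted**2, axis=1, keepdims=True)
        factor = 1.0 + k1 * r2 + k2 * r2**2
        undistorted = distorted_norm / factor   # fixed-point update
    return undistorted

# Illustrative barrel distortion applied to a few normalized ring detections.
detections = np.array([[0.30, 0.00], [0.00, 0.45], [-0.25, -0.25]])
print(undistort_radial(detections, k1=-0.35, k2=0.10))
```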
The tracker or tracking system 28 is configured to determine a position of the probe 34 in the coordinate system and a position of the body 30 of the patient in the coordinate system. The processor 26 is configured to use this information to determine the position of the probe 34 and its measurements relative to the body 30. The tracking system 28 may include elements of a commercially available tracking system such as the POLARIS SPECTRA from NDI of Waterloo, Ontario, Canada. The system is a two-camera system that allows three-dimensional position determination of objects in its field of view, including the patient and the probe 34, through the probe markers 36 and the body markers 38.
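By way of a non-limiting illustration, the following sketch shows how measurements reported in the probe frame may be expressed relative to the body 30 using the tracked poses of the probe markers 36 and the body markers 38; the 4x4 homogeneous pose matrices are an assumed representation of the tracker output, not a disclosed interface:

```python
import numpy as np

def pose_in_body_frame(T_tracker_probe, T_tracker_body):
    """Compose tracker-reported poses so that the probe pose is expressed
    in the coordinate frame attached to the body markers 38."""
    return np.linalg.inv(T_tracker_body) @ T_tracker_probe

def transform_points(T, points_xyz):
    """Apply a 4x4 homogeneous transform to an (N, 3) array of points."""
    homogeneous = np.hstack([points_xyz, np.ones((points_xyz.shape[0], 1))])
    return (T @ homogeneous.T).T[:, :3]

# Example: a point measured 2 mm laterally from the probe tip, mapped into
# the body frame (identity poses used here only as placeholders).
T_probe = np.eye(4)
T_body = np.eye(4)
print(transform_points(pose_in_body_frame(T_probe, T_body),
                       np.array([[2.0, 0.0, 0.0]])))
```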
Once the field of view is calibrated to establish the coordinate system, the probe 34 and its laser pattern 16 are calibrated using a target placed in the field of view. For such calibration, it is assumed that the laser pattern 16 and the optics, including the wide angle lens 14, are perfect and that the probe 34 is rigid. This enables direct referencing of the laser pattern 16 to the coordinate system.
As shown in
With the checkerboard 86 in place, a tracking session is performed with the tracker 28 to establish the position of the checkerboard with the markers 88 and the position of the probe with the probe markers 36. Then, while maintaining the relative relationship of the probe 34 and the checkerboard 86, a lamp or light is shone on the checkerboard 86 and an image of it is collected through the wide angle lens 14. Preferably, the directions of the y-axis and z-axis relative to the tracker 28 are also noted to avoid axial direction errors.
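One conventional way to recover the checkerboard pose from such an image uses standard corner detection and a perspective-n-point solution; the sketch below (using OpenCV) is a non-limiting illustration and is not necessarily the procedure used by the inventors, and the pattern size and square size are assumed values:

```python
import cv2
import numpy as np

def checkerboard_pose(gray_image, camera_matrix, dist_coeffs,
                      pattern_size=(7, 7), square_mm=2.0):
    """Recover the checkerboard pose in the camera frame from one image.
    pattern_size and square_mm are illustrative values only."""
    found, corners = cv2.findChessboardCorners(gray_image, pattern_size)
    if not found:
        return None
    corners = cv2.cornerSubPix(
        gray_image, corners, (5, 5), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    # Board corner coordinates in the checkerboard's own frame (millimeters).
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0],
                           0:pattern_size[1]].T.reshape(-1, 2) * square_mm
    ok, rvec, tvec = cv2.solvePnP(objp, corners, camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```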
Calibration may also include non-planar light patterns, wherein a checkerboard is exposed to the light pattern in several different orientations. The intersection of the lateral portion of the light pattern with the checkerboard lines allows reconstruction of the shape of the non-planar light pattern with respect to the wide angle lens. Using a target similar to that illustrated in
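For the planar pattern 60, one straightforward, non-limiting way to reconstruct the light pattern from such intersection points is a least-squares plane fit; for a non-planar pattern, the same sample points would instead be fitted to a conical or interpolated surface. A minimal sketch, assuming the intersection points have already been expressed in the lens coordinate system:

```python
import numpy as np

def fit_light_plane(points_xyz):
    """Least-squares plane through 3-D points sampled where the lateral
    portion 20 crosses the checkerboard lines.  Returns the centroid and
    the unit normal of the fitted plane."""
    centroid = points_xyz.mean(axis=0)
    _, _, vt = np.linalg.svd(points_xyz - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

# Noisy samples near the plane z = 5 mm, illustrative only.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-3, 3, 50), rng.uniform(-3, 3, 50),
                       5.0 + 0.01 * rng.standard_normal(50)])
centroid, normal = fit_light_plane(pts)
print(centroid, normal)
```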
In another embodiment, the device 10 includes a processor 26 that is connected in communication with the wide angle lens 14 and is configured to perform several functions, including: determining a position of the lateral portion 20 in the coordinate system; determining the position of the lateral portion 20 using a known focal surface; determining the positions of a plurality of the lateral portions 20 in the coordinate system and a corresponding location of the coordinate system relative to the body 30; and combining the lateral portions 20 together into a three-dimensional shape 32 of a body orifice (such as an ear canal) using the positions and the corresponding locations.
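The combining function may be illustrated, again only as a non-limiting sketch with assumed data structures rather than a disclosed interface, by accumulating the measured lateral portions 20 into a single point cloud in the body coordinate system using the tracked poses recorded for each frame:

```python
import numpy as np

def accumulate_shape(frames):
    """Combine per-frame lateral portions into one point cloud in the body
    coordinate system.  Each frame supplies the ring points measured in
    probe coordinates plus the 4x4 tracker poses of the probe and body
    markers for that instant (illustrative structures only)."""
    cloud = []
    for ring_points_probe, T_tracker_probe, T_tracker_body in frames:
        T_body_probe = np.linalg.inv(T_tracker_body) @ T_tracker_probe
        homogeneous = np.hstack([ring_points_probe,
                                 np.ones((ring_points_probe.shape[0], 1))])
        cloud.append((T_body_probe @ homogeneous.T).T[:, :3])
    return np.vstack(cloud)
```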
In the example of
The image sensor may be implemented as a complementary metal-oxide-semiconductor (‘CMOS’) sensor, as a charge-coupled device (‘CCD’), or with other sensing technology as may occur to those of skill in the art. A CMOS sensor can be operated in a snapshot readout mode, or with a rolling shutter when the scan along the Z-axis is incremented or stepped synchronously to effect a readout of a complete frame. Similar incrementing or stepping may be used for a CCD operated with interlaced scans of image frames.
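The synchronization between stepping the scan and reading a complete frame may be sketched as follows; the camera and stage objects are hypothetical interfaces invented purely for illustration and do not correspond to any disclosed or real API:

```python
def acquire_synchronized_frames(camera, stage, num_steps, step_mm):
    """Step the scan along the Z axis and read one complete frame per
    step (snapshot-style readout).  'camera' and 'stage' are hypothetical
    objects used only to illustrate the synchronization."""
    frames = []
    for i in range(num_steps):
        stage.move_to_mm(i * step_mm)        # increment the Z position
        camera.trigger()                     # expose with the laser pattern on
        frames.append(camera.read_frame())   # complete frame per increment
    return frames
```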
In another embodiment of the present invention, as shown in
Referring now to
In addition, the central server 500 may include at least one storage device 515, such as a hard disk drive, a floppy disk drive, a CD-ROM drive, or an optical disk drive, for storing information on various computer-readable media, such as a hard disk, a removable magnetic disk, or a CD-ROM disk. As will be appreciated by one of ordinary skill in the art, each of these storage devices 515 may be connected to the system bus 545 by an appropriate interface. The storage devices 515 and their associated computer-readable media may provide nonvolatile storage for the central server 500. It is important to note that the computer-readable media described above could be replaced by any other type of computer-readable media known in the art. Such media include, for example, magnetic cassettes, flash memory cards, and digital video disks.
A number of program modules may be stored by the various storage devices and within RAM 530. Such program modules may include an operating system 550 and one or more (N) modules 560. The modules 560 may control certain aspects of the operation of the central server 500, with the assistance of the processor 510 and the operating system 550. For example, the modules may perform the functions described above and illustrated by the figures and other materials disclosed herein.
The schematics, flowcharts, and block diagrams in the
Advantages of the embodiments of the invention described herein include the relatively short distance (3 mm, 2 mm, 1 mm, or less) by which the pattern 16 and focal surface 18 extend past the probe 34. This short distance allows the probe to image laterally in orifices with tortuous geometry, such as small-diameter ear canals where it is useful to scan 3 mm past a bend, and also to image larger-diameter ear canals and spaces without having to take multiple passes over a given section of the canal. Also, the low distortion of the wide angle lens 14 leads to high resolution when the laser pattern 16 is coincident with the focal surface 18. This allows a resolution of 50 micrometers for a single pixel, whereas prior art systems have neighboring pixels a millimeter or more apart.
Advantages particular to the creation of hearing aids include a solution that allows direct scanning of the ear instead of making a silicone mold. Quality, performance, and fit are improved, while cost is reduced and production speed is increased, by capturing the shape and size of the ear canal for submission directly to the hearing aid manufacturer. Other applications include endoscopic surgery, dental impressions, and the aforementioned industrial applications, such as inspection of various pipes, channels, tubing, or other openings.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. For example, a camera may be any kind of image sensor. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
This patent application claims the benefit and priority of U.S. Provisional Patent Application No. 61/466,863, filed Mar. 23, 2011, and entitled “Optical Scanning Device.”