Patient eye level touch control

Information

  • Patent Grant
    9,089,367
  • Date Filed
    Thursday, April 7, 2011
  • Date Issued
    Tuesday, July 28, 2015
Abstract
In various embodiments, a surgical console may include a sensor strip with sensor strip sensors (e.g., field effect or capacitive sensors) offset vertically and configured to receive an input from a user corresponding to a vertical height of a patient's eyes relative to the surgical console. The surgical console may use the input from the sensor strip to determine a patient eye level (PEL) relative to at least one surgical console component and then use the PEL and the at least one component in controlling operation of at least one of a source of irrigation or a source of aspiration during an ophthalmic procedure. The surgical console may further include a plurality of visual indicators positioned relative to at least two of the plurality of sensor strip sensors and configured to be illuminated to correspond to a sensor detecting a touch input.
Description
FIELD OF THE INVENTION

The present invention generally pertains to surgical consoles. More particularly, but not by way of limitation, the present invention pertains to patient eye level determination for surgical consoles.


DESCRIPTION OF THE RELATED ART

The human eye in its simplest terms functions to provide vision by transmitting light through a clear outer portion called the cornea, and focusing the image by way of the lens onto the retina. The quality of the focused image depends on many factors including the size and shape of the eye, and the transparency of the cornea and lens. Different surgical procedures performed on the eye may require precise control of fluid pressure being delivered to the eye. The height of a sensor or fluid source above (or below) a patient's eye may affect pressure measurements and/or the pressure of fluid being delivered from the fluid source to the eye. Current surgical systems may require a user to estimate the distance between, for example, an aspiration sensor and the patient's eyes, and type that data into the console.


SUMMARY

In various embodiments, a surgical console may include a sensor strip with sensor strip sensors (e.g., field effect or capacitive sensors) offset vertically and configured to receive an input from a user corresponding to a vertical height of a patient's eyes relative to the surgical console. The surgical console may further include at least one component (e.g., an aspiration sensor) configured to be used during an ophthalmic procedure. In some embodiments, the surgical console may use the input from the sensor strip to determine a patient eye level (PEL) relative to the at least one component and then use the PEL and the at least one component in controlling, for example, irrigation or aspiration during the ophthalmic procedure. In some embodiments, the PEL may be a perpendicular distance between the patient's eyes and a line, parallel to the ground/floor, that intersects the at least one component of the surgical console. The surgical console may further include visual indicators positioned relative to the sensor strip sensors to be illuminated in response to detected touch input. In some embodiments, the sensor strip sensors and/or visual indicators may be arranged along a curved line on the surgical console. In some embodiments, the surgical console may further include a light source configured to project a horizontal light ray at the vertical height corresponding to the sensor strip input received from the user. In some embodiments, the PEL may be used by the surgical console to control an aspiration pump speed to increase/decrease an operating aspiration pressure to be within a desired range. As another example, the PEL may be used by the surgical console to raise/lower an irrigation bottle to increase/decrease the irrigation pressure to be within a desired range.





BRIEF DESCRIPTION OF THE DRAWINGS

For a more complete understanding of the present invention, reference is made to the following description taken in conjunction with the accompanying drawings in which:



FIG. 1 illustrates patient eye level (PEL) relative to a surgical component, according to an embodiment;



FIG. 2a illustrates a block diagram of a PEL relative to a sensor strip, according to an embodiment;



FIG. 2b illustrates a block diagram of a sensor strip and visible line projecting device, according to an embodiment;



FIG. 3 illustrates a block diagram of a sensor strip with embedded light emitting diodes (LEDs), according to an embodiment;



FIGS. 4a-b illustrate a user inputting a vertical height using the sensor strip, according to an embodiment;



FIG. 5 illustrates a sensor strip on a surgical console, according to an embodiment;



FIG. 6 illustrates a flowchart for inputting a vertical height using the sensor strip, according to an embodiment; and



FIG. 7 illustrates a console with a sensor strip and processor, according to an embodiment.





It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are intended to provide a further explanation of the present invention as claimed.


DETAILED DESCRIPTION OF THE EMBODIMENTS


FIG. 1 illustrates an embodiment of a patient eye level (PEL) 103 relative to a surgical component (e.g., an aspiration sensor 105). The PEL 103 may be a vertical height between the aspiration sensor 105 and a patient's eyes 123 when the patient 109 is lying on a surgical table 201 (see FIGS. 2a-b). As seen in FIG. 1, “vertical height” may include a perpendicular distance (e.g., in centimeters) between the patient's eyes 123 and a line, parallel to the ground/floor, that intersects the aspiration sensor 105. While several embodiments presented herein describe the PEL as a vertical height between the patient's eyes 123 and the aspiration sensor 105, it is to be understood that the PEL may also be a vertical height between the patient's eyes 123 and a different surgical component of the surgical console 107 (e.g., an irrigation sensor, irrigation bottle 111, etc.) or another reference point (e.g., the ground). In some embodiments, the PEL 103 may be used by the surgical console 107 in determining, for example, an aspiration or irrigation pressure at a patient's eyes 123. For example, an aspiration pump 121 may provide aspiration to hand piece 119 through a fluid line coupling the hand piece 119 to the console 107. In some embodiments, the surgical console 107 may determine an approximate aspiration pressure at the patient's eyes 123 (e.g., at the tip of the hand piece 119) using sensor readings at the aspiration sensor 105 (on that fluid line) in combination with the PEL. The PEL may also be used in determining an irrigation pressure. For example, irrigation pressure may increase with increasing PEL (e.g., the greater a bottle height (relative to a patient's eyes 123), the greater the pressure of irrigation fluid entering the patient's eyes 123 from the bottle 111). The surgical console 107 may use the pressure information to control a source of irrigation or aspiration (e.g., control an aspiration pump speed, an irrigation bottle height 101, etc).
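As a rough illustration of how a PEL could feed into the pressure determination described above, the sketch below applies a hydrostatic-head correction (rho·g·h) to a console-mounted sensor reading. The function name, the saline density, the units, and the sign convention are illustrative assumptions, not details taken from the patent.

```python
# Illustrative sketch (not the patent's implementation): correct a
# console-level pressure reading to eye level using the PEL.

MMHG_PER_PASCAL = 1.0 / 133.322  # pascals to millimeters of mercury
RHO_SALINE = 1006.0              # kg/m^3, approx. balanced salt solution (assumed)
G = 9.81                         # m/s^2, standard gravity

def pressure_at_eye_mmhg(sensor_pressure_mmhg: float, pel_cm: float) -> float:
    """Estimate pressure at the patient's eye from a sensor reading.

    pel_cm > 0 is taken to mean the sensor sits above the patient's eyes,
    so the fluid column adds rho*g*h of pressure at eye level.
    """
    head_pa = RHO_SALINE * G * (pel_cm / 100.0)  # hydrostatic head in pascals
    return sensor_pressure_mmhg + head_pa * MMHG_PER_PASCAL
```

A PEL of about 13.5 cm corresponds to roughly 10 mmHg of hydrostatic head, which is why entering an accurate eye level matters for irrigation and aspiration control.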



FIG. 2a illustrates a block diagram of an embodiment of PEL 103 relative to sensor strip 207. In some embodiments, a user (e.g., a surgeon, nurse, etc.) may touch (e.g., with a finger 213 or stylus) a sensor strip 207 on the surgical console 107 at a vertical height corresponding to the height of the patient's eyes 123. For example, the user may touch the sensor strip 207 at a point such that a straight line running through the point and the patient's eyes 123 is parallel to the ground/floor. The sensor strip 207 may include multiple sensors (e.g., vertically spaced) such that when a sensor (such as sensor 205) of the sensor strip 207 detects a touch, the location of that sensor may be used to determine the PEL 103. For example, the vertical height between each sensor and the aspiration sensor 105 may be stored in a table in memory 1003 (see FIG. 7) to enable look-up of the vertical height based on the touched sensor. As another example, an equation may be used that relates the PEL to the sensor location (the sensor location (e.g., sensor height) may be an input to the equation, with the PEL being the output). In some embodiments, the PEL value may be stored with the sensor strip such that when a sensor detects a touch, the corresponding PEL is automatically sent by the sensor strip to, for example, a controller on the console 107. Other PEL determination methods are also possible based on input from the sensors. Further, while the sensor strip 207 is shown on the front of the surgical console 107, other locations of the sensor strip 207 (e.g., on the side of the surgical console, etc.) are also contemplated.
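The two PEL-determination approaches just described (a stored table keyed by the touched sensor, and an equation relating sensor location to PEL) might be sketched as follows. The sensor count, spacing, and offset are hypothetical values chosen for illustration only.

```python
# Sketch of table-based and equation-based PEL determination.
# All numeric values are illustrative assumptions, not from the patent.

SENSOR_SPACING_CM = 1.0    # assumed vertical gap between adjacent sensors
BOTTOM_OFFSET_CM = -20.0   # assumed height of sensor 0 relative to the aspiration sensor

# Approach 1: a table in memory mapping each sensor index to its
# vertical height relative to the aspiration sensor.
PEL_TABLE = {i: BOTTOM_OFFSET_CM + i * SENSOR_SPACING_CM for i in range(41)}

def pel_from_table(sensor_index: int) -> float:
    """Look up the PEL for the touched sensor."""
    return PEL_TABLE[sensor_index]

# Approach 2: an equation with the sensor location as input and the
# PEL as output.
def pel_from_equation(sensor_index: int) -> float:
    return BOTTOM_OFFSET_CM + sensor_index * SENSOR_SPACING_CM
```

For a uniformly spaced strip the two approaches agree exactly; the table becomes useful when the strip curves or the spacing varies, since arbitrary per-sensor heights can be stored.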


In some embodiments, a light emitting diode (LED) 203 or other visual indicator may be used to provide the user with a visual indication of the received touch input (e.g., LED 203 may illuminate as the user touches sensor 205). The visual indication may assist the user in determining if the correct sensor strip sensor (corresponding to the intended patient eye vertical height) has been touched. In some embodiments, as seen in FIG. 2b, a visible line projecting device 209 with, for example, a row of light sources (such as a low intensity laser) or a movable light source may project a horizontal light ray 211 corresponding to the height of the touched sensor 205 toward the operating table 201. In some embodiments, the surgical console 107 may illuminate a light source (or move the movable light source) corresponding to the height of the touched sensor strip sensor. The user may determine, based on the location of the projected light ray 211, whether the correct sensor strip sensor has been touched. For example, the light ray 211 may align with the patient's eyes 123, indicating that the correct height has been indicated on the sensor strip 207. In some embodiments, the light source may be a low intensity light source (e.g., a low intensity laser) to prevent damaging the patient's eyes 123. As another example, the light source may include an LED (e.g., a high-powered LED) surrounded by a reflector.



FIG. 3 illustrates a block diagram of an embodiment of a sensor strip 207 with embedded sensors 305 and LEDs 303. In some embodiments, the sensors 305 may be touch-sensitive sensors (e.g., field effect or capacitive sensors). The sensors 305 may be field effect switch sensors such as Touchcell™ sensors (e.g., which use low impedance electrodynamic field-effect technology) that produce digital logic-level switching output. The field effect switch sensors may produce an electric field and detect a change in the electric field when a conductive mass (such as a human finger) enters the field. Other sensor types are also contemplated. For example, resistive sensors, buttons (e.g., a vertical array of buttons), or infrared motion detectors/cameras (to detect a relative location of the user's finger) may be used. In some embodiments, a mechanical slider (with a vertical sliding element that can be moved by the user and aligned with the patient's eyes 123) may be used. Other sensor strip sensors may also be used to receive user input indicating a vertical height of the patient's eyes 123 relative to, for example, the aspiration sensor 105. As seen in FIG. 3, the LEDs 303 may overlap the sensors 305 and may be illuminated as their respective sensor strip sensor is touched. Other locations for the LEDs are also contemplated (e.g., next to the sensor strip as shown in FIGS. 2a-b).



FIGS. 4a-b illustrate an embodiment of inputting a vertical height using the sensor strip 207. As seen in FIGS. 4a-b, the sensor strip 207 may include configurations that are slanted, curved, etc. In some embodiments, distances (e.g., relative vertical heights) between the individual sensors on the sensor strip 207 and the aspiration sensor 105 may be stored (e.g., in memory 1003) such that the sensor strip 207 may take on various configurations (e.g., the sensor strip 207 may curve to follow the contour of the surgical console 107). The "vertical height" for an individual sensor strip sensor may include a perpendicular distance between the individual sensor strip sensor and a line, parallel to the ground/floor, that intersects, for example, the aspiration sensor 105. Other configurations of "height" are also contemplated. For example, the PEL may be a point along a slanted line (e.g., a line running through the aspiration sensor 105 and the patient's eyes), and calculations based on the slanted-line PEL may be configured to account for the location of the PEL on the slanted line (as opposed to a vertical PEL).


In some embodiments, the sensor strip 207 may include a continuous slide feature (such as the continuous indentation shown in FIG. 4a) over the sensor strip sensors to assist a user in sliding their finger over the sensor strip sensors during height selection. As the user slides their finger (e.g., between FIGS. 4a and 4b), the LED associated with the touched sensor strip sensor may illuminate to indicate to the user the current sensor strip sensor location detecting the touch. FIG. 5 illustrates another embodiment of a sensor strip 207 on a surgical console. The number and placement of the sensor strip sensors may be arranged according to the desired resolution. For example, distances between sensor strip sensors may include 1 millimeter, 5 millimeters, 1 centimeter, 2 centimeters, etc. Smaller or greater distances may be used according to the desired resolution. In some embodiments, distances between sensor strip sensors may not be consistent (e.g., smaller distances may be used between sensor strip sensors located at normal patient eye levels and greater distances may be used for sensor strip sensors outside the range of normal patient eye levels).
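The non-uniform spacing idea above could be laid out as follows: finer sensor placement in the band of typical patient eye levels, coarser placement outside it. All ranges and step sizes here are hypothetical.

```python
# Sketch (hypothetical layout, not from the patent): generate sensor
# heights relative to the aspiration sensor with fine spacing in the
# band of normal patient eye levels and coarse spacing elsewhere.

def sensor_heights_cm(lo=-30.0, hi=30.0, fine_lo=-10.0, fine_hi=10.0,
                      fine_step=0.5, coarse_step=2.0):
    """Return a list of sensor heights (cm) from lo to hi.

    Heights inside [fine_lo, fine_hi) advance by fine_step; heights
    outside advance by coarse_step, giving higher resolution where
    patient eyes usually fall.
    """
    heights, h = [], lo
    while h <= hi:
        heights.append(round(h, 2))
        h += fine_step if fine_lo <= h < fine_hi else coarse_step
    return heights
```

Such a layout keeps the sensor count (and cost) down while preserving millimeter-scale resolution where it actually affects pressure control.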



FIG. 6 illustrates a flowchart of an embodiment for inputting a vertical height using the sensor strip 207. The elements provided in the flowchart are illustrative only. Various provided elements may be omitted, additional elements may be added, and/or various elements may be performed in a different order than provided below.


At 601, a user may indicate to the surgical console 107 that they are about to enter a vertical height corresponding to a PEL 103. In some embodiments, the user may select an option to enter the vertical height by pressing a visual option (such as an icon) presented on a graphical user interface (GUI) 117 on the surgical console's touchscreen. Other selection mechanisms are also contemplated (e.g., keyboard, computer mouse, etc.). In some embodiments, the icon may need to be selected each time a new vertical height will be indicated (i.e., the surgical console 107 may stop considering further sensor strip inputs after a vertical height is received or after a predetermined amount of time (e.g., 10 seconds) has passed since the icon was selected, to prevent changes to the vertical height due to inadvertent touches). In some embodiments, the user may not need to indicate in advance that a vertical height will be entered (e.g., the surgical console 107 may accept a new touch input from the sensor strip 207 at any time).


At 603, a user may touch the sensor strip 207 at a vertical height approximately level with the patient's eyes 123. In some embodiments, the user may tap the sensor strip 207 or slide their finger (or, for example, a stylus or mechanical slider) along the sensor strip 207 to the vertical height (as visually determined by the user eyeing the patient on the surgical table 201). In some embodiments, the surgical console 107 may accept the new vertical height only if the user slides their finger along multiple sensor strip sensors first to prevent the vertical location from being changed due to an inadvertent touch of the sensor strip 207. Other input indications are also contemplated (e.g., the user may be required to double tap on a sensor strip sensor corresponding to the vertical height for the input to be acknowledged by the surgical console 107).
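The slide-to-confirm guard described above (accepting a height only after the touch has crossed several sensor strip sensors) might look like the following sketch; the class name and the threshold of three sensors are illustrative assumptions.

```python
# Sketch of the inadvertent-touch guard: a new vertical height is
# accepted only after the touch has crossed several distinct sensors,
# so a single stray tap cannot change the stored PEL. The threshold
# value is an assumption for illustration.

class SlideGuard:
    def __init__(self, min_sensors_crossed: int = 3):
        self.min_sensors_crossed = min_sensors_crossed
        self._touched = []  # sequence of distinct sensors seen this slide

    def touch(self, sensor_index: int):
        """Record a touch event; return the accepted sensor index once
        enough distinct sensors have been crossed, else None."""
        if not self._touched or self._touched[-1] != sensor_index:
            self._touched.append(sensor_index)
        if len(self._touched) >= self.min_sensors_crossed:
            return self._touched[-1]  # last sensor under the finger wins
        return None

    def reset(self):
        """Clear state, e.g., when the finger lifts off the strip."""
        self._touched = []
```

A double-tap requirement, the other confirmation style mentioned in the text, could be handled the same way by counting repeated touches of one sensor instead of distinct sensors.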


At 605, the surgical console 107 may provide a visual indication of the received sensor strip input. For example, the surgical console 107 may illuminate an LED 203 or project a horizontal line (e.g., laser) toward the patient 109 corresponding to the touched sensor strip sensor (or last touched sensor strip sensor if a user slides their finger along the sensor strip). Other visual indicators are also contemplated.


At 607, the surgical console 107 may use the sensor strip input to determine a PEL 103. For example, a table 1005 with sensor strip sensor identifications and the relative vertical heights between the sensor strip sensors and the aspiration sensor 105 may be accessed to determine a vertical height between the sensor detecting a touch and the aspiration sensor 105. As another example, the vertical heights (between the sensor strip sensors and the aspiration sensor 105) corresponding to each sensor strip sensor may be stored in a one-to-one correlation (e.g., stored with the sensor strip sensors) that is not necessarily in table format. In some embodiments, multiple PELs (e.g., relative to multiple console components) may be determined. For example, a table with vertical heights between each individual sensor strip sensor and components such as the irrigation bottle, irrigation sensor, aspiration sensor, etc. may be used to determine PELs relative to other system components based on a single sensor strip input.


At 609, the determined vertical height may be used as the PEL during system operation (e.g., in determining irrigation and aspiration pressure). For example, the PEL may be used along with input from the aspiration sensor 105 to determine a relative aspiration pressure at the patient's eyes 123. As another example, a PEL (relative to an irrigation bottle) may be used to determine an irrigation pressure at the patient's eyes 123. The respective aspiration and/or irrigation pressures may be used to control an aspiration pump speed (to increase/decrease the aspiration pressure to be within a desired range) or raise/lower the irrigation bottle (to increase/decrease the irrigation pressure to be within a desired range). Other PEL uses are also possible.
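The control step at 609 amounts to nudging an actuator until the PEL-corrected pressure falls inside the desired range. A minimal sketch for the aspiration-pump case follows; the function name, step size, and the assumption that higher pump speed raises the pressure reading are all illustrative.

```python
# Sketch of the adjustment at 609 (aspiration-pump case): step the pump
# speed toward a desired eye-level pressure range. Names, units, and the
# speed/pressure relationship are illustrative assumptions.

def adjust_pump_speed(current_speed: float, eye_pressure: float,
                      target_lo: float, target_hi: float,
                      step: float = 5.0) -> float:
    """Return a new pump speed moving eye_pressure toward [target_lo, target_hi].

    eye_pressure is assumed to be the PEL-corrected pressure at the
    patient's eyes; higher pump speed is assumed to raise it.
    """
    if eye_pressure < target_lo:
        return current_speed + step              # below range: speed up
    if eye_pressure > target_hi:
        return max(0.0, current_speed - step)    # above range: slow down
    return current_speed                         # in range: hold
```

The irrigation-bottle case is analogous: raise the bottle (increasing the PEL relative to the bottle) to increase irrigation pressure, lower it to decrease.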


In some embodiments, as seen in FIG. 7, the surgical console may include one or more processors (e.g., processor 1001). The processor 1001 may include a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, controller (which may be a micro-controller), digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, control circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on operational instructions. The memory 1003 coupled to and/or embedded in the processor 1001 may be a single memory device or a plurality of memory devices. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. Note that when the processor 1001 implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory 1003 storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. The memory 1003 may store, and the processor 1001 may execute, operational instructions corresponding to at least some of the elements illustrated and described in association with the figures. For example, the processor 1001 may process touch inputs (e.g., relayed as digital values) from the sensor strip sensors to determine a PEL 103 for use with pressure measured by the aspiration pressure sensor 105.


Various modifications may be made to the presented embodiments by a person of ordinary skill in the art. Other embodiments of the present invention will be apparent to those skilled in the art from consideration of the present specification and practice of the present invention disclosed herein. It is intended that the present specification and examples be considered as exemplary only with a true scope and spirit of the invention being indicated by the following claims and equivalents thereof.

Claims
  • 1. A surgical console, comprising: at least one component configured to be used during an ophthalmic surgical procedure;a sensor strip comprising a plurality of sensor strip sensors, wherein at least two of the plurality of sensor strip sensors are offset vertically and wherein the sensor strip is configured to receive an input from a user, through the sensor strip, at an input point on the sensor strip that aligns with a vertical height of a patient's eyes relative to the surgical console such that a line, parallel to a ground/floor supporting the surgical console, intersects the input point on the sensor strip and the patient's eyes;wherein the surgical console is configured to use the input to determine a patient eye level (PEL) relative to the at least one component;wherein the surgical console is configured to use the PEL and the at least one component in controlling operation of at least one of a source of irrigation or a source of aspiration during the ophthalmic surgical procedure.
  • 2. The surgical console of claim 1, wherein the plurality of sensor strip sensors comprise field effect switch or capacitive sensors.
  • 3. The surgical console of claim 1, wherein the PEL is a perpendicular distance between the patient's eyes and the line, parallel to the ground/floor, that intersects the at least one component of the surgical console.
  • 4. The surgical console of claim 1, further comprising a plurality of visual indicators, wherein at least two of the plurality of visual indicators are positioned relative to at least two of the plurality of sensor strip sensors.
  • 5. The surgical console of claim 4, wherein the surgical console is configured to illuminate at least one of the plurality of visual indicators that corresponds to a sensor detecting the touch input.
  • 6. The surgical console of claim 1, wherein the at least one component of the surgical console is an aspiration pressure sensor.
  • 7. The surgical console of claim 1, wherein the plurality of sensor strip sensors are arranged along a curved line on the surgical console.
  • 8. The surgical console of claim 1, wherein the PEL is used by the surgical console to control an aspiration pump speed to increase/decrease an operating aspiration pressure to be within a desired range.
  • 9. The surgical console of claim 1, wherein the PEL is used by the surgical console to raise/lower the source of irrigation to increase/decrease an irrigation pressure provided through a hand piece coupled to the surgical console to be within a desired range.
  • 10. The surgical console of claim 1, further comprising a light source configured to project a horizontal light ray at the vertical height corresponding to the sensor strip input received from the user.
  • 11. A surgical console, comprising: an aspiration pump;an aspiration sensor configured to detect an aspiration pressure in a line coupled to the aspiration pump;a sensor strip comprising a plurality of field effect switch or capacitive sensors, wherein at least two of the plurality of sensor strip sensors are offset vertically and wherein the sensor strip is configured to receive an input from a user, through the sensor strip, at an input point on the sensor strip that aligns with a vertical height of a patient's eyes relative to the surgical console such that a line, parallel to a ground/floor supporting the surgical console, intersects the input point on the sensor strip and the patient's eyes;a plurality of visual indicators, wherein at least two of the plurality of visual indicators are positioned relative to the at least two of the plurality of sensor strip sensors and wherein the surgical console is configured to illuminate at least one of the plurality of visual indicators that corresponds to a sensor strip sensor detecting the user input;wherein the surgical console is configured to use the input to determine a patient eye level (PEL) relative to the aspiration sensor of the surgical console;wherein the surgical console is configured to use the PEL and information from the aspiration sensor to control operation of the aspiration pump to obtain a desired aspiration pressure at the patient's eyes through a hand piece coupled to the surgical console.
  • 12. The surgical console of claim 6, wherein the sensor strip and the aspiration pressure sensor are fixed relative to each other on a main body of the surgical console.
  • 13. The surgical console of claim 11, wherein the PEL is a perpendicular distance between the patient's eyes and the line, parallel to the ground/floor, that intersects the aspiration sensor of the surgical console.
  • 14. The surgical console of claim 11, wherein the plurality of visual indicators overlap the plurality of sensor strip sensors.
  • 15. The surgical console of claim 11, wherein the plurality of sensor strip sensors and the plurality of visual indicators are arranged along a curved line on the surgical console.
  • 16. The surgical console of claim 11, wherein the PEL is further used by the surgical console to raise/lower the irrigation bottle to increase/decrease an irrigation pressure provided through a hand piece coupled to the surgical console to be within a desired range.
  • 17. The surgical console of claim 11, further comprising a light source configured to project a horizontal light ray at the vertical height corresponding to the sensor strip input received from the user.
  • 18. The surgical console of claim 11, wherein the sensor strip and the aspiration sensor are fixed relative to each other on a main body of the surgical console.
PRIORITY CLAIM

This application claims the benefit of priority of U.S. Provisional Patent Application Ser. No. 61/321,998 titled “Patient Eye Level Touch Control”, filed on Apr. 8, 2010, whose inventors are John Koontz, David Thoe, and Mikhail Boukhny, which is hereby incorporated by reference in its entirety as though fully and completely set forth herein.

US Referenced Citations (118)
Number Name Date Kind
2450062 Voss et al. Sep 1948 A
3239872 Kitrell Mar 1966 A
3652103 Higgs Mar 1972 A
3818542 Jones Jun 1974 A
3890668 Stosberg et al. Jun 1975 A
3920014 Banko Nov 1975 A
4025099 Virden May 1977 A
4110866 Ishii Sep 1978 A
4143442 Harlang Mar 1979 A
4550221 Mabusth Oct 1985 A
4550808 Folson Nov 1985 A
4616888 Peterman Oct 1986 A
4633544 Hicks Jan 1987 A
4669580 Neville Jun 1987 A
4675485 Paukert Jun 1987 A
4677706 Screen Jul 1987 A
4744536 Bancalari May 1988 A
4811966 Singleton Mar 1989 A
4941552 Screen Jul 1990 A
D325086 Charles et al. Mar 1992 S
5112019 Metzler et al. May 1992 A
5242035 Lange Sep 1993 A
5249121 Baum et al. Sep 1993 A
5273043 Ruike Dec 1993 A
5280789 Potts Jan 1994 A
5315290 Moreno et al. May 1994 A
D352106 Fanney et al. Nov 1994 S
5456336 Bopp Oct 1995 A
5642392 Nakano et al. Jun 1997 A
5650597 Redmayne Jul 1997 A
5702117 Geelhoed Dec 1997 A
5752520 Bisnaire et al. May 1998 A
5766146 Barwick, Jr. Jun 1998 A
5788688 Bauer et al. Aug 1998 A
5800383 Chandler et al. Sep 1998 A
5810765 Oda Sep 1998 A
5823302 Schweninger Oct 1998 A
5827149 Sponable Oct 1998 A
5830180 Chandler et al. Nov 1998 A
5836081 Orosz, Jr. Nov 1998 A
5857685 Phillips et al. Jan 1999 A
5859629 Tognazzini Jan 1999 A
5876016 Urban et al. Mar 1999 A
5880538 Schulz Mar 1999 A
5964313 Guy Oct 1999 A
5988323 Chu Nov 1999 A
6024720 Chandler et al. Feb 2000 A
6034449 Sakai et al. Mar 2000 A
6047634 Futsuhara et al. Apr 2000 A
6055458 Cochran et al. Apr 2000 A
6109572 Urban et al. Aug 2000 A
6232758 Konda et al. May 2001 B1
6251113 Appelbaum et al. Jun 2001 B1
6267503 McBride Jul 2001 B1
6276485 Eriksson et al. Aug 2001 B1
D447567 Murphy et al. Sep 2001 S
6357765 Heien Mar 2002 B1
6409187 Crow, Jr. Jun 2002 B1
6429782 Pavatich et al. Aug 2002 B2
D467001 Buczek et al. Dec 2002 S
6501198 Taylor et al. Dec 2002 B2
6503208 Skovlund Jan 2003 B1
6530598 Kirby Mar 2003 B1
6532624 Yang Mar 2003 B1
6587750 Gerbi et al. Jul 2003 B2
6590171 Wolf et al. Jul 2003 B1
6619438 Yang Sep 2003 B1
6626445 Murphy et al. Sep 2003 B2
6662404 Stroh et al. Dec 2003 B1
6678917 Winters et al. Jan 2004 B1
6749538 Slawinski et al. Jun 2004 B2
6824539 Novak Nov 2004 B2
6854568 Kun-Tsai Feb 2005 B2
6899694 Kadziauskas et al. May 2005 B2
6944910 Pauls Sep 2005 B2
6969032 Olivera et al. Nov 2005 B2
6971617 Nguyen Dec 2005 B2
7065812 Newkirk et al. Jun 2006 B2
7100716 Engels et al. Sep 2006 B2
D550362 Olivera et al. Sep 2007 S
7454839 Della Bona et al. Nov 2008 B2
7509747 Sudou et al. Mar 2009 B2
7685660 Chinn Mar 2010 B2
7878289 Standke Feb 2011 B2
8542203 Serban et al. Sep 2013 B2
20010023331 Kanda et al. Sep 2001 A1
20040119484 Basir et al. Jun 2004 A1
20040226187 Bruntz et al. Nov 2004 A1
20050004559 Quick et al. Jan 2005 A1
20050068417 Kreiner et al. Mar 2005 A1
20050088417 Mulligan Apr 2005 A1
20050230575 Zelenski et al. Oct 2005 A1
20050234441 Bisch et al. Oct 2005 A1
20060031989 Graham et al. Feb 2006 A1
20060113733 Kazaoka Jun 2006 A1
20060149426 Unkrich et al. Jul 2006 A1
20060267295 You Nov 2006 A1
20070051566 Marlow Mar 2007 A1
20070124858 Ahlman Jun 2007 A1
20080033361 Evans et al. Feb 2008 A1
20080114290 King et al. May 2008 A1
20080126969 Blomquist May 2008 A1
20080147023 Hopkins Jun 2008 A1
20080189173 Bakar et al. Aug 2008 A1
20080223650 Standke Sep 2008 A1
20090013780 Gao Jan 2009 A1
20090036271 Brand et al. Feb 2009 A1
20090040181 Darnell et al. Feb 2009 A1
20090045021 Einbinder Feb 2009 A1
20090069799 Daw et al. Mar 2009 A1
20090090434 Brand et al. Apr 2009 A1
20090143734 Humayun et al. Jun 2009 A1
20090231095 Gray Sep 2009 A1
20090289431 Geeslin Nov 2009 A1
20100019200 Chi et al. Jan 2010 A1
20100049119 Norman et al. Feb 2010 A1
20110247173 Nguyen et al. Oct 2011 A1
20110251548 Koontz et al. Oct 2011 A1
Foreign Referenced Citations (87)
Number Date Country
8504205 Apr 1987 BR
2273269 Jan 1998 CN
2102508 Jun 1973 DE
3016615 Nov 1981 DE
3039611 Apr 1982 DE
3203886 Sep 1983 DE
8910606 Oct 1989 DE
4344187 Jun 1995 DE
19504073 Aug 1996 DE
10047006 Apr 2002 DE
20308670 Feb 2004 DE
10332823 Feb 2005 DE
202005016310 Jan 2006 DE
202007008797 Aug 2007 DE
102006049071 Nov 2007 DE
102008015505 Feb 2009 DE
102007053444 May 2009 DE
102009058919 Jun 2011 DE
0701917 Mar 1996 EP
0979741 Feb 2000 EP
1016580 Jul 2000 EP
1024071 Aug 2000 EP
1180473 Feb 2002 EP
0901388 Jan 2003 EP
1964750 Sep 2008 EP
2106986 Oct 2009 EP
1016578 Mar 2010 EP
2173154 Jan 2011 EP
2292202 Mar 2011 EP
2285964 Nov 2007 ES
2648101 Dec 1990 FR
2799410 Apr 2001 FR
2880700 Jul 2006 FR
210851 Feb 1924 GB
767159 Jan 1957 GB
2061105 May 1981 GB
2132478 Jul 1984 GB
2260195 Apr 1993 GB
2260622 Apr 1993 GB
2441303 Mar 2008 GB
02-107245 Apr 1990 JP
03-062902 Mar 1991 JP
03-190919 Aug 1991 JP
03-252266 Nov 1991 JP
04-063328 Feb 1992 JP
09-058203 Mar 1997 JP
09-113071 May 1997 JP
10-297206 Nov 1998 JP
11-169411 Jun 1999 JP
11-244339 Sep 1999 JP
2001-001703 Jan 2001 JP
2001-058503 Mar 2001 JP
2002-312116 Oct 2002 JP
2003-220803 Aug 2003 JP
2005-162113 Jun 2005 JP
2005-296606 Oct 2005 JP
2006-341670 Dec 2006 JP
2007-137305 Jun 2007 JP
2009-512971 Mar 2009 JP
2010-508104 Mar 2010 JP
2010-088490 Apr 2010 JP
WO9825556 Jun 1998 WO
WO 0012150 Mar 2000 WO
WO 0018012 Mar 2000 WO
WO 02043571 Jun 2002 WO
WO 02043571 Apr 2003 WO
WO 03093408 Nov 2003 WO
WO 2004017521 Feb 2004 WO
WO 2004082554 Sep 2004 WO
WO 2004082554 Mar 2005 WO
WO 2006073400 Jul 2006 WO
WO 2008052752 May 2008 WO
WO 2008053485 May 2008 WO
2009021836 Feb 2009 WO
WO 2009073691 Jun 2009 WO
WO 2009073769 Jun 2009 WO
WO 2009073691 Jul 2009 WO
WO 2009073769 Jul 2009 WO
WO 2010020200 Feb 2010 WO
WO 2010027255 Mar 2010 WO
WO 2011126596 Oct 2011 WO
WO 2011126597 Oct 2011 WO
WO 2011127231 Oct 2011 WO
Non-Patent Literature Citations (10)
Entry
European Searching Authority, Extended Supplementary European Search Report, European Patent Application No. 11766694.1, Publication No. 2555725, Published Feb. 13, 2013, 5 pages.
International Searching Authority, Written Opinion of the International Searching Authority, International Application No. PCT/US2011/023107, Mar. 31, 2011, 7 pages.
International Searching Authority, International Search Report, International Application No. PCT/US2011/023107, Mar. 31, 2011, 2 pages.
International Searching Authority, International Search Report, International Application No. PCT/US11/23103, Mar. 30, 2011, 2 pages.
International Searching Authority, Written Opinion of the International Searching Authority, International Application No. PCT/US11/23103, Mar. 30, 2011, 4 pages.
International Searching Authority, International Search Report, PCT/US2011/031500, Jun. 16, 2011, 2 pages.
International Searching Authority, Written Opinion of the International Searching Authority, PCT/US2011/031500, Jun. 16, 2011, 6 pages.
Steinco brochure accessed through http://web.archive.org/web/20080731183316/http://www.steinco.de/service/downloads.aspx?id=6901 & accessed from page http://web.archive.org/web/20080731188316/http://www.steinco.de/en/Castors—Hospital.aspx—original date believed to be Jul. 31, 2008, 2 pages.
http://web.archive.org/web/20090526022232/http://www.touchsensor.com/engineers.html?bcsi—scan—0A8B7FA59D377CC3=w1 . . . (web archive dated May 26, 2009) (2 pages).
http://web.archive.org/web/20090322184659/http://www.touchsensor.com/technology—switch.html (web archive dated Mar. 22, 2009) (2 pages).
Related Publications (1)
Number Date Country
20110251548 A1 Oct 2011 US
Provisional Applications (1)
Number Date Country
61321998 Apr 2010 US