This invention pertains to optical proximity detection and, more particularly, control of a touch screen input using optical proximity detection.
Conventional systems exist that perform control of touch input devices, such as touch screens. Typically, touch screen based systems collect user inputs using a touch screen that monitors changes in capacitance on the touch screen to identify the position of user input from a stylus or finger in contact with the touch screen. Changes in the capacitance in the touch screen device are monitored and interpreted to determine the user's input.
In one embodiment, a touch input and optical proximity sensing system is shown having first and second light sources and a first optical receiver configured to receive a first reflected light signal from an object when the first light source is activated and output a first measured reflectance value corresponding to an amplitude of the first reflected light signal, and to receive a second reflected light signal from the object when the second light source is activated and output a second measured reflectance value corresponding to an amplitude of the second reflected light signal. The system includes a first touch input device, where the first and second light sources and the first optical receiver are in predetermined positions relative to the first touch input device. A controller is in communication with and in control of the first and second light sources, the first optical receiver and the first touch input device. The controller is configured to independently activate the first and second light sources to produce the first and second reflected light signals and capture the first and second measured reflectance values from the first optical receiver. The controller is further configured to determine a first approximate position of the object based on the first and second measured reflectance values and determine whether the first approximate position of the object is within a predetermined proximity area to the first touch input device and, if the object is within the predetermined proximity area, the controller is configured to activate at least a portion of the first touch input device.
In a further refinement of this embodiment, the first optical receiver is configured to receive a third reflected light signal from the object when the first light source is activated and output a third measured reflectance value corresponding to an amplitude of the third reflected light signal, and to receive a fourth reflected light signal from the object when the second light source is activated and output a fourth measured reflectance value corresponding to an amplitude of the fourth reflected light signal. The controller is further configured to independently activate the first and second light sources to produce the third and fourth reflected light signals and capture the third and fourth measured reflectance values from the first optical receiver, and the controller is further configured to determine a second approximate position of the object based on the third and fourth measured reflectance values, compare the first and second approximate positions in order to determine whether the object is approaching the first touch input device and activate the portion of the first touch input device when the object is approaching the first touch input device.
In another refinement of this embodiment, the system includes a third light source and the first optical receiver is configured to receive a fifth reflected light signal from the object when the third light source is activated and output a fifth measured reflectance value corresponding to an amplitude of the fifth reflected light signal. The controller is in communication with and in control of the third light source and is configured to independently activate the third light source to produce the fifth reflected light signal and capture the fifth measured reflectance value from the first optical receiver. The controller is further configured to determine the first approximate position of the object based on the first, second and fifth measured reflectance values and selectively activate the portion of the first touch input device based on the first approximate position derived from the first, second and fifth measured reflectance values.
In a different refinement of this embodiment, the system includes a second touch input device and the controller is in communication with and in control of the second touch input device, where the controller is configured to selectively activate at least one of the first and second touch input devices based on the first approximate position of the object. In still another refinement, the controller is configured to activate the portion of the first touch input device by scanning the portion of the first touch input device.
An embodiment of a method for optical proximity sensing and touch input control calls for mounting a plurality of light sources and a first optical receiver in predetermined positions relative to a first touch input device. The method involves activating at least a first one of the light sources and measuring an amplitude of a first reflected light signal from an object using the first optical receiver to obtain a first measured reflectance value, activating at least a second one of the light sources and measuring an amplitude of a second reflected light signal from the object using the first optical receiver to obtain a second measured reflectance value, and determining a first approximate position of the object based on the first and second measured reflectance values. The method also involves determining whether the first approximate position of the object is within a predetermined proximity area to the first touch input device and activating at least a portion of the first touch input device if the object is within the predetermined proximity area.
In a refinement of this embodiment of a method, the method includes the steps of activating the first one of the light sources and measuring an amplitude of a third reflected light signal from the object using the first optical receiver to obtain a third measured reflectance value, activating the second one of the light sources and measuring an amplitude of a fourth reflected light signal from the object using the first optical receiver to obtain a fourth measured reflectance value, determining a second approximate position of the object based on the third and fourth measured reflectance values, and comparing the first and second approximate positions in order to determine whether the object is approaching the first touch input device. In this refinement, the step of activating at least a portion of the first touch input device further includes activating the portion of the first touch input device when the object is approaching the first touch input device.
In another refinement of the embodiment of a method, the method includes the steps of activating a third one of the light sources and measuring an amplitude of a fifth reflected light signal from the object using the first optical receiver to obtain a fifth measured reflectance value. In this refinement, the step of determining a first approximate position of the object further comprises determining the first approximate position of the object based on the first, second and fifth measured reflectance values and the step of selectively activating the portion of the first touch input device based on the first approximate position further includes selectively activating the portion of the first touch input device based on the first approximate position derived from the first, second and fifth measured reflectance values.
In still another refinement of the method, the method includes mounting a second touch input device in predetermined position relative to the plurality of light sources and the first optical receiver and the step of determining whether the first approximate position of the object is within a predetermined proximity area to the first touch input device further includes determining whether the first approximate position of the object is within another predetermined proximity area to the second touch input device. In this refinement, the method includes the step of activating at least a portion of the second touch input device if the object is within the another predetermined proximity area.
In another refinement of the method, the step of activating at least a portion of the first touch input device further comprises scanning the portion of the first touch input device.
In yet another refinement of the method, the method includes mounting a second optical receiver in a predetermined position with respect to the first and second light sources and the first touch input device, where the second optical receiver is configured to receive reflected light signals from the object when at least one of the first and second light sources is activated and output a first measured reflectance value corresponding to an amplitude of the received reflected light signals and the step of determining the first approximate position of the object further comprises determining the first approximate position of the object based on the first and second measured reflectance values and the measured reflectance values from the second optical receiver in order to determine whether the first approximate position of the object is within the predetermined proximity area to the first touch input device.
Certain exemplary embodiments of the present invention are discussed below with reference to the following figures, wherein:
Described below are several exemplary embodiments of systems and methods for touch screen control and optical proximity sensing that may include motion detection or gesture recognition based on approximate position determinations using relatively simple optical receivers, such as proximity sensors or infrared data transceivers, to perform reflectance measurements. Motion detection may be used to activate a touch input device, such as a capacitive touch screen, that may be deactivated to save power when no motion activity is detected. Alternatively, motion detection and position sensing may be used to selectively activate or scan a portion of a touch input device to save power or reduce spurious input. In still another alternative, gesture recognition based on the sensed motion is combined with touch input to interpret user input.
In general terms, motion detection or gesture recognition is based on repeatedly measuring reflectance from an object to determine approximate position for the object, comparing the measured reflectances to identify changes in the approximate position of the object over time, and interpreting the change in approximate position of the object as motion correlating to a particular gesture, which may be interpreted as a user movement or as a motion vector of the object.
The positions are generally rough approximations because the reflectance measurements are highly dependent upon the reflectance of the object surface as well as the orientation of the object surface. Use of the reflectance measurement values from simple optical systems to obtain an absolute measure of distance is typically not highly accurate. Even a system that is calibrated to a particular object will encounter changes in ambient light and object orientation, e.g. where the object has facets or other characteristics that affect reflectance independent of distance, that degrade the accuracy of a distance measurement based on measured reflectance.
Because of the variations in reflectance, distance measurements are not reliable, but relative motion can be usefully measured. The present systems and methods for gesture recognition, therefore, rely on relative changes in position. The measure of relative motion assumes that the variations in the reflectance of the object are due to motion and not to other factors, such as orientation. Using a single reflectance measurement repeated over time, e.g. a system based on a single LED and receiver, motion of an object toward or away from the system can be identified on a Z axis. This may be useful for a simple implementation, such as a light switch or door opener, or, in machine vision and control applications, the approach of the end of a robot arm to an object, for example. Using two reflectance measurements, e.g. two LEDs and a receiver or two receivers and an LED, reasonable accuracy for position along an X axis may be obtained along with some relative sense of motion in the Z axis. This may be useful for a relatively simple touchless mobile phone interface or slider light dimmer or, in machine vision and control applications, movement of an object along a conveyor belt, for example. Using three or more reflectance measurements, e.g. three LEDs and a receiver or three receivers and an LED, the system can obtain reasonable accuracy for position along X and Y axes, and relative motion in the Z axis. This may be useful for more complex applications, such as a touchless interface for a personal digital assistant device or vision based control of automated equipment. A higher number of reflectance measurements can be realized by using multiple receivers and/or LEDs to increase resolution for improved gesture recognition.
In one preferred embodiment, for reflectance measurements, multiple light sources, such as LEDs, are activated and the resulting photodiode current is measured. In this embodiment, each LED is selectively activated and the receiver measures the resulting photodiode current for each LED when activated. The photodiode current is converted to a digital value and stored by a controller, such as a microprocessor. The measurements are repeated under the control of the processor at time intervals, fixed or variable. The measurements at each time are compared to obtain an approximate determination of position in X and Y axes. The measurements between time intervals are compared by the processor to determine relative motion, i.e. vector motion, of the object.
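The measurement and position-estimation cycle described above may be sketched in outline as follows, with the hardware access stubbed out; the helper names, LED coordinates, and simulated ADC readings are illustrative assumptions rather than details of any particular embodiment:

```python
# Sketch of one reflectance measurement cycle: activate each LED in
# turn, read the receiver's digitized reflectance value, then estimate
# an approximate position. All values here are simulated stand-ins.

LED_POSITIONS = [-1.0, 0.0, 1.0]  # hypothetical LED x-coordinates about the receiver

def measure_reflectance(led_index):
    """Stub: activate one LED, read the photodiode ADC, deactivate the LED."""
    simulated = [10.0, 40.0, 30.0]  # stand-in REFL values from the ADC
    return simulated[led_index]

def measurement_cycle():
    """Activate each LED selectively and collect its measured reflectance."""
    return [measure_reflectance(i) for i in range(len(LED_POSITIONS))]

def approximate_x(reflectances):
    """Reflectance-weighted centroid as a rough X-position estimate."""
    total = sum(reflectances)
    if total == 0:
        return None  # no reflected signal detected
    return sum(p * r for p, r in zip(LED_POSITIONS, reflectances)) / total

refl = measurement_cycle()
x = approximate_x(refl)  # leans toward the LED with the strongest return
```

Repeating the cycle at fixed or variable intervals and differencing successive estimates yields the relative motion described in the text.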
The relative position of an object in motion can be detected. The detection of the motion in proximity to a touch input device may be used to activate the touch input, which otherwise may remain inactive to conserve power. Alternatively, the relative position of the object in motion can be utilized to selectively activate or scan a portion of the touch input device or to control the scan rate in a portion of the touch input device relative to other portions of the touch input device. In still another alternative, the relative position of the object in motion can be utilized to selectively activate or scan a subset of multiple touch input devices to collect user input information.
In yet another example, the motion of the object may be interpreted or recognized as a gesture and the gesture combined with the touch input information to interpret user input. The relative motion of the object can be interpreted as gestures. For example, positive motion primarily in the X axis can be interpreted as a right scroll and negative motion as a left scroll. Positive motion in the Y axis is a down scroll and negative motion is an up scroll. Positive motion in the Z axis can be interpreted as a selection or click (or a sequence of two positive motions as a double click). Relative X and Y axis motion can be used to move a cursor. The gesture may also be a motion vector for the object or for the receiver system mounted on a piece of equipment, e.g. a robot arm. For example, in automated equipment applications, motion of an object along an axis may be tracked to detect an object moving along a conveyor belt. By way of another example, the motion vector may be tracked to confirm proper motion of a robot arm or computer numerically controlled (CNC) machinery components with respect to workpiece objects or to detect unexpected objects in the path of the machinery, e.g. a worker's limbs or a build-up of waste material.
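The axis-to-gesture mapping described above can be illustrated with a minimal sketch; the gesture labels, axis sign conventions, and motion threshold are assumptions for illustration only:

```python
def interpret_gesture(dx, dy, dz, threshold=0.5):
    """Map a relative-motion vector to a gesture label, dominant axis
    first. The threshold filters out sub-gesture jitter; its value is
    an illustrative assumption."""
    ax, ay, az = abs(dx), abs(dy), abs(dz)
    if max(ax, ay, az) < threshold:
        return None  # motion too small to qualify as a gesture
    if az >= ax and az >= ay:
        # Z-dominant motion: selection/click per the convention above
        return "select" if dz > 0 else "release"
    if ax >= ay:
        # X-dominant motion: right/left scroll
        return "scroll_right" if dx > 0 else "scroll_left"
    # Y-dominant motion: down/up scroll
    return "scroll_down" if dy > 0 else "scroll_up"
```

A sequence of two "select" results within a short window could then be treated as a double click, consistent with the parenthetical above.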
In this embodiment, receiver 22 and LEDs 24, 26 are mounted along an axis, e.g. an X axis, below a touch input device 20, such as a capacitive touch screen. LEDs 24 and 26 are independently activated and a photodiode in receiver 22 detects reflected light R1 and R2, respectively, from a target object 12. The strengths of the reflected light signals R1 and R2 are measured by optical receiver 22. It is assumed that the strength of the reflected light signal roughly represents the distance of object 12 from the system 10.
The touch input and optical proximity sensing system 10 of
In the example of
In addition, as demonstrated using
Also note that the distance of object 12 from receiver 22 may also be determined on a relative basis. For example, if the ratio of R1 to R2 remains substantially the same over a sequence of measurements, but the absolute values measured for R1 and R2 increase or decrease, this may represent motion of object 12 towards receiver 22 or away from receiver 22, respectively. This motion of object 12 may, for example, be interpreted as a gesture selecting or activating a graphical object on a display, e.g. clicking or double clicking.
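The ratio test described above may be sketched as follows; the tolerance value and function names are illustrative assumptions:

```python
def z_motion(r1_a, r2_a, r1_b, r2_b, ratio_tol=0.1):
    """Classify motion between two measurement times A and B: if the
    R1:R2 ratio is stable but the absolute amplitudes change, infer
    approach or retreat along the Z axis. Tolerance is illustrative."""
    ratio_a = r1_a / r2_a
    ratio_b = r1_b / r2_b
    if abs(ratio_a - ratio_b) / ratio_a > ratio_tol:
        return "lateral_or_unknown"  # X position changed; not pure Z motion
    total_a = r1_a + r2_a
    total_b = r1_b + r2_b
    if total_b > total_a:
        return "approaching"  # amplitudes rising at constant ratio
    if total_b < total_a:
        return "receding"
    return "stationary"
```

An "approaching" result followed by a second one could, for example, be mapped to the click or double-click gestures mentioned above.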
The principles for two dimensional optical proximity detection, object position determination and gesture recognition described above with respect to
Motions involving changes in distance from receiver 22 can also be identified by monitoring changes in amplitude of the measurements from LEDs 24, 26, 32, 34 when the relative ratios of the measured reflections follow the same relationship, i.e. the relation of reflectance to distance is not linear, but tends to be the same or similar for each LED and receiver in the system. To measure distance from the receiver, all LEDs can be activated simultaneously and the resulting measured reflectance will be proportional to a sum of all the individual reflectance contributions. This simple method may be used, for example, to detect if the object is in the proximity of a touch input device 31 in order to activate device 31 or initiate a gesture recognition algorithm.
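This simple summed-reflectance proximity check might be sketched as follows; the threshold value is an illustrative assumption that in practice would be set during calibration:

```python
PROXIMITY_THRESHOLD = 50.0  # ADC counts; illustrative assumption

def object_in_proximity(reflectances):
    """With all LEDs lit (or by summing per-LED readings), the total
    measured reflectance serves as a crude proximity metric: above the
    threshold, wake the touch input device or start gesture
    recognition."""
    return sum(reflectances) >= PROXIMITY_THRESHOLD
```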
The present invention may be implemented using multiple receivers and/or light sources.
The principles for two dimensional object position determination and gesture recognition described above with respect to
In the embodiments of
The number of elements used for reflectance measurement and gesture recognition may be varied as desired for a given application. The manner for determining the relative position of an object and the algorithm employed for gesture recognition need merely be adapted for the number and position of the elements. For example,
There may be two or more LEDs such as LEDs 103A-C installed in sensor circuit 100 without departing from the spirit and scope of the present invention. Three LEDs are shown for the purpose of explaining the invention. In other embodiments there may be more than three LEDs chained in parallel, multiplexed, or independently wired and there may be multiple photodetectors 105 and associated optical receiver circuitry 101 and multiple touch input devices 150. Alternatively, other architectures may be suitable for touch input and optical proximity sensing and control, such as multiple sensor circuits 100 chained together, or multiple optical receiver circuitry 101 interfaced to a common controller 108. LEDs 103A-C in this example may be compact devices capable of emitting continuous light (always on) or they may be configured to emit light under modulation control. Likewise, they may be powered off during a sleep mode between proximity measurement cycles. The actual light emitted from the LEDs may be visible to the human eye, such as red light, or invisible, such as infrared light. In one embodiment, visible-light LEDs may be provided for optical reflectance measuring.
In this logical block diagram, the exact placement of components and the trace connections between components of sensor system 100 are meant to be logical only and do not reflect any specific designed trace configuration.
Optical receiver circuitry 101 includes a DC ambient correction circuit 107, which may be referred to hereinafter as DCACC 107. DCACC 107 is a first order, wide loop correction circuit that has a connection to a DC ambient zero (DCA-0) switch 106 that is connected inline to PD 105 through a gate, such as a PMOS gate described below. Optical receiver circuitry 101 may therefore be first calibrated, where the DC ambient light coming from any sources other than optical reflectance is measured and then cancelled, to determine the presence of any reflectance signal, which may be qualified against a pre-set threshold value that may, in one example, be determined during calibration of optical receiver circuitry 101.
Reflectance is determined, in one embodiment of the present invention, by measuring the amplified pulse width of an output voltage signal. Correction for DC ambient light is accomplished by providing optical receiver circuitry 101 with the capability of producing an amplified pulse width that is proportional to the measured DC ambient light entering PD 105. DCACC 107 and switch 106 are provided and adapted for that purpose along with a voltage output comparator circuit 111. More particularly, during calibration for DC ambient light, correction is accomplished by setting the DC-ambient correction to zero using switch 106 at the beginning of the calibration cycle and then measuring the width of the detected pulse during the calibration cycle. The width of the output pulse is proportional to the background DC ambient. Of course, during calibration, the transmitter LEDs 103A-C are disabled.
Sensor circuit 100, in this example, includes a power source and a controller 108. Controller 108 may be integrated on a circuit with optical receiver circuitry 101 or may be a separate device, such as a microprocessor mounted on a printed circuit board or chip carrier, or may be a host processor. Controller 108 may be part of an interfacing piece of equipment or another optical receiver depending on the application. The power source for sensor circuit 100 may be a battery power source, a re-chargeable source or some other current source. In this example, the transmitter LEDs 103A-C are connected to and are controlled by controller 108 and may receive power through controller 108 as well. Optical receiver circuit 101 also has a connection to the power source for sensor circuit 100. More than one power source may be used to operate different parts of sensor circuit 100 without departing from the spirit and scope of the present invention. Controller 108, optical receiver circuitry 101, and touch pad device 150 are illustrated logically in this example to show that a processing device may be used to control the optical proximity sensor and touch input functions.
DC ambient circuit 107 produces a voltage from the input signal received from photodiode 105. Optical receiver circuitry 101 includes an analog to digital converter circuit (ADC) 111 that, in this example, converts an input voltage signal produced by photodiode 105 to a digital reflectance measurement value REFL that is output to controller 108. In this example, controller 108 is configured to perform the motion and position detection steps of the process 250 of
In the operation of optical sensor circuit 100, calibration is first performed to measure the average DC ambient light conditions using DCACC 107 and ADC 111 with LEDs 103A-C switched off. When the DC ambient loop has settled and a valid threshold has been determined, LEDs 103A-C are independently switched on, in this example, by controller 108 for reflectance measurement. Reflectance received at PD 105 from object 102, in this example, produces a voltage above DC ambient. The resulting input voltage from PD 105 reaches ADC 111, which converts the voltage to a digital value REFL that is output to controller 108. Controller 108 activates one LED at a time and measures the resulting reflectance value REFL produced for each LED 103A-C. Controller 108 may then calculate an approximate position for object 102 based on the measured reflectance values and the relative positions of LEDs 103A-C and photodiode 105 with respect to one another.
In one embodiment, controller 108 activates or scans touch input device 150 when it senses object 102 in the physical proximity of touch input device 150, as discussed further below with respect to
In one embodiment, optical isolation is provided, such as by a partition, to isolate photodiode 105 from receiving any crosstalk from LEDs 103A-C, as illustrated in
Circuitry 200 includes DCACC 107 and ADC 111. The circuitry making up DCACC 107 is illustrated as enclosed by a broken perimeter labeled 107. DCACC 107 includes a trans-impedance amplifier (TIA) A1 (201), a transconductance amplifier (TCA) A2 (202), resistors R1 and R2, and a charge capacitor (C1). These components represent a low-cost and efficient implementation of DCACC 107.
DCA-0 switch (S2) 106 is illustrated as connected to a first PMOS gate (P2), which is in turn connected to a PMOS gate (P1). Gate P1 is connected inline with the output terminal of amplifier A2 (202). A2 receives its input from trans-impedance amplifier A1 (201). For purposes of simplification in description, amplifier A2 will be referenced as TCA 202 and amplifier A1 will be referenced as TIA 201. TCA 202 removes DC and low frequency signals. It is important to note that for proximity sensing, TCA 202 takes its error input from the amplifier chain, more particularly from TIA 201. In this respect, TIA 201 includes amplifier A1 and resistor R1.
Controller 108 is not illustrated in
When measuring reflectance, PD 105 receives reflected light from whichever LED 103A-C is activated by controller 108, where the reflected light is illustrated as a reflectance arrow emanating from object 102 and entering PD 105. The resulting current proceeds to TIA 201 formed by operational amplifier A1 and feedback resistor R1. Amplified output from TIA 201 proceeds through FBS 109 (S1) as signal VO (voltage out) to ADC 111.
Output from TIA 201 also proceeds through R2 to the input of TCA 202 (A2). Here, the input is limited by a diode (D1) or an equivalent limiting circuit. In this way, the output of TCA 202 (A2) has a fixed maximum current to charge capacitance C1. This causes the current proceeding through PMOS 204 (P1) to ramp at a maximum linear rate. When the current through PMOS 204 (P1) equals the current produced by PD 105, the input error of TIA 201 goes to zero. This causes the output of TIA 201 to fall, thereby reducing the error input to TCA 202 (A2), which slows and then prevents further charging of C1. Because DCACC 107 can only slew at a fixed rate for large signals and at a proportionally smaller rate for signals below the clipping level, the time it takes for DCACC 107 to correct the input signal change is a measure of the amplitude of the input signal change. In one embodiment, the reflectance value REFL output by ADC 111 is proportional to the total change of optical signal coupled into the photodiode generated by the LED. In other embodiments, the value REFL may be logarithmically compressed or inverted, for example, as required for the particular implementation.
In one embodiment, conversion of the input current to output pulse width includes converting both DC ambient and reflection signals to VO pulse width changes. DCA-0 switch 106 (S2) is closed during calibration and measurement of DC ambient light. Closing switch S2 causes the current through PMOS 204 (P1) to fall near zero while still maintaining voltage on C1 very close to the gate threshold of P1. A period of time is allowed for the DC ambient correction loop to settle. DCA-0 switch 106 (S2) is opened after the correction loop has settled, re-enabling the DC-ambient correction loop. The voltage at C1 then increases until the current through PMOS 204 (P1) equals the DC ambient photocurrent resulting from PD 105. Therefore, the time it takes for VO from amplifier A1 to return to its normal state after changing due to proximity detection is proportional to the DC-ambient input current output by PD 105 with the LEDs switched off.
Conversely, for measuring reflectance, S2 is held open while sufficient time is allowed for DC ambient background calibration including letting the DC ambient loop settle or cancel the average DC background ambient. After calibration is complete, TX LEDs 103A-C are enabled to transmit light. The subsequent increase in photocurrent put out by PD 105 as the result of reflectance from object 102 is amplified by A1 causing a change in VO output to ADC 111 only if the amplified change exceeds the proximity detection threshold set by Vref. After detecting reflectance (sensing proximity) the DC-ambient loop causes the voltage on C1 to increase until it cancels the increase in photocurrent due to reflectance. At this point in the process, VO (the amplified signal output from TIA 201) returns to its normal value, thus ending the detection cycle. The period of time between TX of the LEDs and when VO returns to its previous value is proportional to the magnitude of the reflectance signal.
One of skill in the art will recognize that within the sensor circuitry 200 presented in this example, DCACC 107 continuously operates to remove normal changes in the background ambient light. Only transient changes produce an output. Output only occurs when there is a difference between the DC correction signal and the input signal. An advantage of this method of reflectance measurement is that resolution is limited by the “shot noise” of PD 105, provided a low noise photo amplifier is used. Circuitry 200 exhibits low noise for the DC ambient correction current source if a moderately large PMOS is used for P1 and an appropriate degeneration resistor is used at its Vdd source. The integrator capacitor on the gate of P1 removes most of the noise components of TCA 202.
In this embodiment, feedback blanking is implemented by switch 109 (S1), which is driven by one-shot circuit (OS1) 110. OS1 110 produces a blanking pulse when the TX LED function is enabled, i.e. in response to the LED transmit control signals from the controller. This blanking pulse is wider in this example than the settling time for transients within TIA 201 (A1). As discussed further above, introducing a blanking pulse into the process increases the sensitivity of the receiver. Otherwise, the sensitivity of the receiver is reduced due to feedback noise from the leading edge of the transmitted pulse from LEDs 103A-C.
At step 260, controller 108 makes a determination of the approximate position of object 102 based on measured reflectance from multiple LEDs. If the approximate position of the object is near the touch input device, then control flow proceeds at step 262 to step 264, where the touch input device is activated. Otherwise, control branches to the beginning of process 250. In this embodiment, the touch input device remains inactive until activated by controller 108, at step 264, to receive touch input. This may be a power saving feature or may prevent spurious input at the touch input device. At step 266, controller 108 interacts with the touch input device that has been activated to capture the user's touch input. Optionally, controller 108 may continue to track the motion of object 102 to interpret a gesture. This option may be used to verify the user's touch input, e.g. confirm that a double-tap touch input from a user's finger to select an application for launch, for example, is consistent with the optically observed motion. By way of another example, a user's sliding motion, e.g. scrolling, may be interpreted both optically and by capacitive touch input detection. Note that some of the steps of process 250 may be omitted or combined or taken in different order without departing from the scope of the present invention and that other forms of processing may be utilized with the present invention as will be understood by one of skill in the art.
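One pass of the wake-on-proximity flow of process 250 might be sketched as follows; the rectangular proximity area and the callable stand-ins for hardware operations are illustrative assumptions:

```python
def touch_control_cycle(position, proximity_area, activate, capture_input):
    """One pass of the wake-on-proximity loop: activate the touch device
    only when the sensed position falls inside the proximity area, then
    capture touch input. `activate` and `capture_input` are callables
    standing in for hardware operations."""
    x, y = position
    (x0, y0), (x1, y1) = proximity_area  # opposite corners of the area
    if not (x0 <= x <= x1 and y0 <= y <= y1):
        return False  # device stays inactive; caller loops back to sensing
    activate()       # corresponds to step 264
    capture_input()  # corresponds to step 266
    return True
```

Keeping the touch device inactive on the False path is what yields the power saving and spurious-input suppression described above.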
The present invention may be applied to systems where portions of a touch input device may be selectively activated or scanned.
In the embodiment of
When an object's position is determined by controller 370, the controller 370 may be configured to scan only the cross points of the capacitive array of touch screen 310 corresponding to the position of the object. In this manner, the present invention may be utilized to activate or scan only the portion of the touch screen 310 in the proximity of the object 102. This approach may be utilized to save power or to avoid spurious input from other regions of the touch screen 310. Alternatively, SCAN CONTROL may be used to control the rate of scanning, where, for example, the region of the touch screen 310 in proximity to the object 102 is scanned at a higher rate than other regions of the touch screen.
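The selection of cross points to scan can be illustrated with a short sketch. The geometry below (a uniform grid pitch and a circular proximity radius) is an assumption for illustration; the specification does not prescribe how the scanned region is bounded.

```python
def cross_points_near(position, pitch, radius, rows, cols):
    """Return the (row, col) cross points of a capacitive array that lie
    within `radius` of the estimated object position. Units, grid pitch,
    and the circular region are illustrative assumptions."""
    px, py = position
    points = []
    for r in range(rows):
        for c in range(cols):
            x, y = c * pitch, r * pitch      # physical location of cross point
            if (x - px) ** 2 + (y - py) ** 2 <= radius ** 2:
                points.append((r, c))
    return points
```

A controller could scan only the returned cross points, or schedule them at a higher rate than the rest of the array, mirroring the two SCAN CONTROL alternatives described above.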
Control circuit 410 is also interfaced with multiple capacitive array devices 420A-C, such as touch screens, which may together make up a single larger touch input device or serve as separate devices.
At step 458, the area of the touch screen, e.g. area 400, corresponding to the position of the object is selectively activated or scanned. For the embodiment of
In order to determine motion in a controller of a system according to the present invention, for example, an approximate position P1 for an object is determined from reflectance measurement values REFL obtained at a first point in time, T1, using the optical sensor circuitry. An approximate position P2 for the object is determined from reflectance measurement values REFL obtained using sensor 100 at a second point in time, T2. The approximate positions P1 and P2 are then compared in order to identify motion or a corresponding gesture. For example, the relative motion from P1 to P2 is determined and normalized to a value RELATIVE MOTION that may be used to index a look-up or symbol table to obtain a GESTURE ID value identifying the gesture corresponding to the motion vector from P1 to P2, or a value indicating that no gesture could be identified for the motion. Likewise, the normalized approximate positions may be utilized to index a look-up table to identify a touch screen device responsive to the user's motion, or the normalized positions may be utilized to calculate the boundaries of a touch screen to be activated or scanned responsive to the user's motion.
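The motion-to-gesture mapping just described can be sketched as a normalization step followed by a table lookup. The table contents and the GESTURE ID strings below are illustrative assumptions; the specification only requires that the normalized motion index a look-up or symbol table.

```python
# Illustrative gesture table; keys are normalized (dx, dy) motion vectors
# and the GESTURE ID values are assumed names, not from the specification.
GESTURE_TABLE = {
    (1, 0): "SWIPE_RIGHT",
    (-1, 0): "SWIPE_LEFT",
    (0, 1): "SWIPE_UP",
    (0, -1): "SWIPE_DOWN",
}

def _sign(v):
    """Normalize a displacement component to -1, 0, or 1."""
    return 0 if v == 0 else (1 if v > 0 else -1)

def gesture_from_positions(p1, p2):
    """Normalize the motion vector from P1 to P2 and look up a GESTURE ID,
    returning a no-gesture value when the motion matches no table entry."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return GESTURE_TABLE.get((_sign(dx), _sign(dy)), "NO_GESTURE")
```

The same normalized positions could index a second table mapping regions to touch screen devices, or feed a routine that computes the boundaries of the area to activate or scan.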
All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.
The use of the terms “a” and “an” and “the” and similar referents in the context of describing the invention (especially in the context of the following claims) is to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
Preferred embodiments of this invention are described herein, including the best mode known to the inventors for carrying out the invention. It should be understood that the illustrated embodiments are exemplary only, and should not be taken as limiting the scope of the invention.
This patent application is related to co-pending U.S. patent application Ser. No. 12/334,296, filed Dec. 12, 2008, herein incorporated by reference in its entirety for all purposes.