1. Field of Invention
The present invention relates in general to the field of education and more specifically to systems and methods for conducting test assessment.
2. Description of the Background Art
Conventional assessment provides for administering a variety of different individualized tests, each of which is designed to assess a particular subset of the various aspects of student learning. While final scores may be compared, each test is configured in a distinct and encapsulated manner for separately assessing the particular learning aspects of a particular student.
Formative testing, for example, provides for relatively frequent, less formalized testing of ongoing student progress in one or more particular aspects of a particular learning area. Formative testing may, for example, include a weekly testing of recently covered topics in mathematics, other separately formulated periodic testing of recently covered topics in science, and so on. Each formative test is typically highly encapsulated with regard to the topic and any sub-topics to be covered, as well as with regard to the construction and goal (or “call”) of included test items. Assessment of each formative test is also highly encapsulated: each item response is separately assessed, and the results are accumulated to produce a separately derived test score. While so-called cumulative testing may also be administered (e.g., finals), such testing is typically provided, administered and assessed in a similar manner as other formative testing.
Conventional summative testing is nearly entirely distinct from current formative testing in both substantive and procedural respects. Current summative testing provides for very infrequent, highly formalized and more extensive testing of the accumulated learning of each student, which may cover a particular learning area or collection of learning areas. Summative testing, further, need not be limited to recent learning and may instead include less recent learning, or learning that may not yet have been achieved (e.g., for testing the extent of student learning, as a result of syllabus variations, and so on). Summative testing items, portions thereof, presentation or goals (e.g., implemented as item response assessment criteria) may also differ extensively from those of formative testing. For example, items may be required to meet increased reliability and validity criteria, minimization-of-bias criteria, security or exposure criteria, and so on. Summative testing may, for example, include achievement tests, professional certification tests, college admissions tests, or other standardized tests that are typically administered following some period of education, such as the end of a professional program, school year, semester or quarter.
As with formative testing, however, conventional summative testing is typically highly encapsulated. Each summative test is entirely separately evaluated and assessed to produce a summative test score. The separately produced summative test score may then be compared with that of another (typically the immediately preceding) summative test to determine whether a student learning change has occurred (e.g., student knowledge has or has not improved in a particular learning area—typically an area that has been newly presented since the preceding summative test).
Unfortunately, because the formality and comprehensiveness of conventional summative testing often require testing very near the end of a school term, and because the present approach results in very extensive testing, the lengthy assessment process may not be completed until after the school term has ended, and may further take months to complete. Such factors, as well as the different nature and increased importance of a particular summative testing session, also render summative testing a disruptive and stressful addition to formative testing for all involved. For example, poor summative testing results may well adversely affect student placement, faculty/institutional evaluation or ranking, financing and/or other factors. The present inventors have also determined that the accuracy and reliability of summative testing, for example, as a probabilistic assessment of student learning, may be substantially increased. For example, aspects of the present invention enable substantially greater resistance to accuracy concerns, such as a student guessing incorrectly on a first summative test and correctly on a second summative test being misinterpreted as an indicator of increased learning. Thus, among other problems of conventional testing, the advances promised by the present invention may well draw into question the sufficiency of present summative testing accuracy and reliability.
Accordingly, there is a need for improved cumulative assessment systems and methods that enable one or more of the above and/or other problems of conventional testing to be avoided.
Aspects of the invention are embodied in systems, methodologies, software, etc., for computing an improved likelihood ability estimate for an assessment respondent or a group of assessment respondents. Assessments are administered to respondents a first time and at least one subsequent time, and responses to items in the assessments are scored each time. Two or more assessments are selected based on selection criteria, and from the selected assessments a number of items are selected, also based on selection criteria, for inclusion in an improved likelihood ability estimate. An improved likelihood ability estimate for each respondent, or for the group of respondents, can then be computed based on the selected (included) assessments and the selected (included) items.
Accordingly, an improved ability estimate computed in accordance with the cumulative assessment scheme described herein becomes a more integrated assessment based on the respondent's cumulative performance on multiple assessments, as opposed to being merely a snapshot ability estimate based on a single point-in-time assessment.
FIG. 1a is a flow diagram illustrating a cumulative assessment system according to an embodiment of the invention;
FIG. 1b is a flow diagram illustrating a further cumulative assessment system according to an embodiment of the invention;
FIG. 2a illustrates a mechanism for performing related item selection in conjunction with cumulative assessment according to an embodiment of the invention;
FIG. 2b illustrates another mechanism for performing related item selection in conjunction with cumulative assessment according to an embodiment of the invention;
FIG. 2c illustrates a further mechanism for performing related item selection in conjunction with cumulative assessment according to an embodiment of the invention;
FIG. 3a illustrates utilization of a learning map in performing cumulative assessment according to an embodiment of the invention;
In the description herein for embodiments of the present invention, numerous specific details are provided, such as examples of components and/or methods, to provide a thorough understanding of embodiments of the present invention. One skilled in the relevant art will recognize, however, that an embodiment of the invention can be practiced without one or more of the specific details, or with other apparatus, systems, assemblies, methods, components, materials, parts, and/or the like. In other instances, well-known structures, materials, or operations are not specifically shown or described in detail to avoid obscuring aspects of embodiments of the present invention.
A “computer” for purposes of embodiments of the present invention may include any processor-containing device, such as a mainframe computer, personal computer, laptop, notebook, microcomputer, server, personal data manager (also referred to as a personal information manager or “PIM”), smart cellular or other phone, so-called smart card, set-top box, or the like. A “computer program” may include any suitable locally or remotely executable program or sequence of coded instructions that are to be loaded into a computer, as is well known to those skilled in the art. Stated more specifically, a computer program includes an organized list of instructions that, when executed, causes the computer to behave in a predetermined manner. A computer program contains a list of ingredients (called variables) and a list of directions (called statements) that tell the computer what to do with the variables. The variables may represent numeric data, text, audio or graphical images. If a computer is employed for synchronously presenting multiple video program ID streams, such as on a display screen of the computer, the computer would have suitable instructions (e.g., source code) for allowing a user to synchronously display multiple video program ID streams in accordance with the embodiments of the present invention. Similarly, if a computer is employed for presenting other media via a suitable directly or indirectly coupled input/output (I/O) device, the computer would have suitable instructions for allowing a user to input or output (e.g., present) program code and/or data information, respectively, in accordance with the embodiments of the present invention.
A “computer-readable medium” for purposes of embodiments of the present invention may be any medium that can contain, store, communicate, propagate, or transport the computer program for use by or in connection with the instruction execution system, apparatus or device. The computer-readable medium can be, by way of example only and not by limitation, an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus, device, propagation medium, or computer memory. The computer-readable medium may have suitable instructions for synchronously presenting multiple video program ID streams, such as on a display screen, or for providing for input or presenting in accordance with various embodiments of the present invention.
Referring now to FIG. 1a, a cumulative assessment system 100a according to an embodiment of the invention is illustrated.
For clarity's sake, however, the more specific assessment example of separately administered testing will be used as a consistent example according to which testing (or other assessment) embodiments of the invention may be better understood. It will be appreciated, however, that other assessment mechanisms may be utilized in a substantially similar manner as separately administered testing.
In separately administered testing, for example, assessment materials (hereinafter, “testing materials”) that may include one or more questions, other response requests or portions thereof (“items”) are presented to one or more students, who are charged with producing responses to the items (“item responses”). The items or item portions may, for example, include selected response item portions, in which the students may choose from predetermined presented answers and indicate their answer selection (e.g., in a response grid, in a provided form, and so on). The items or item portions may also include constrained constructed response (“CCR”) items in which the students may modify or construct a presented graph (“graph item”); circle, cross out, annotate, connect, erase, modify or otherwise mark up portions of a presented drawing, text, audio/visual clip(s), other multimedia or combined test materials (“markup item”); delineate a correspondence (“matching item response”) between or among presented images, text, other multimedia or combined test materials (“matching item”); provide missing text, numbers or other information, or some combination thereof (“short answer response”); and so on. Other item types, portions thereof or some combination may also comprise items.
Note that the term “or” as used herein is intended to include “and/or” unless otherwise indicated or unless the context clearly dictates otherwise. The term “portion” as used herein is further intended to include “in whole or contiguous or non-contiguous part” which part can include zero or more portion members, unless otherwise indicated or unless the context clearly dictates otherwise. The term “multiple” as used herein is intended to include “two or more” unless otherwise indicated or the context clearly indicates otherwise. The term “multimedia” as used herein may include one or more media types unless otherwise indicated or the context clearly indicates otherwise.
In the more specific embodiment of FIG. 1a, testing materials 121 generated by assessment provider 101 may be delivered in hard-copy form to one or more test sites 102, 102a.
Testing may be administered in an otherwise conventional manner at various locations 122a, 122b within each test site 102, 102a using the received test materials 121. Testing materials including student responses (hereinafter collectively referred to as “student answer sheets” regardless of the type actually used) may then be collected and delivered to subject assessment system 111 of assessment provider 101 for assessment. Other testing materials provided to students, including but not limited to test booklets, scratch paper, and so on, or some combination, may also be collected, for example, in an associated manner with a corresponding student answer sheet, or further delivered to subject assessment system 111, and may also be assessed. (Student markings that may exist on such materials or the lack thereof may, for example, be included in an assessment.)
Assessment provider 101 portion of assessment system 100 in one embodiment comprises a subject assessment system 111 including at least one test material receiving device 110 and a cumulative assessment engine 116. (It will become apparent that assessment of the tests may also be conducted by one or more other subject assessment authorities using one or more assessment engines, and selected assessment results or assessments of selected items may be provided to one or more cumulative assessment providing components, or some combination may be used.) Test material receiving device 110 in a more specific embodiment includes a high-speed scanner, braille reader or other mechanism for receiving one or more response portions (e.g., of an answer book) and providing included item responses in an electronic format to other subject assessment system components.
Assessment (i.e., Test) generation system 113 in one embodiment includes item/assessment producing device 114 (e.g., printer, audio/video renderer, and so on, or some combination). Assessment generation system 113 may be further coupled, e.g., via a local area network (LAN) or other network 112, to a server 115. Assessment generation system 113 is also coupled (via network 112) to subject assessment system 111 and item response receiving device 110 (e.g., a scanner, renderer, other data entry device or means, or some combination).
Subject assessment system 111 also includes an assessment/item selection engine (“selection engine”) 116b. Selection engine 116b provides for selecting two or more assessment portions including related items (“included assessments”) or for further selecting assessments of two or more related items (included items) corresponding to two or more assessments based on selection criteria and selection indicators as discussed below. Selection engine 116b may in one embodiment receive predetermined included assessments or included assessment items from a coupled storage storing such information, other subject assessment system 111 component, some other assessment source, or some combination. In another embodiment, selection engine 116b may receive selected assessments from one or more predetermined or otherwise determinable assessment sources to be used in their totality or from which selection engine 116b may select items that are or are not to be further processed in accordance with cumulative assessment (“included items” or “excludable items” respectively). Related items for purposes of the present embodiment may include those items for which an ability assessment may be conducted with respect to a common goal (e.g., measuring mathematical ability, measuring science ability, measuring nursing ability).
FIGS. 2a through 2c illustrate embodiments of mechanisms according to which selection engine 116b may select related items. In accordance with the illustrated embodiments, selection engine 116b may receive item selection criteria from a coupled storage storing such information, other subject assessment system 111 component, some other assessment source, or some combination. The selection criteria source may, for example, be a predetermined source, an association of such source(s) with one or more assessment information, a source otherwise determinable by selection engine 116b, e.g., in an otherwise conventional manner for selecting a coupled component, or some combination. The selection criteria may further include selection indicators, e.g., for selecting particular items, item groups or portions thereof, selection algorithms, weighted selection, AI, application of learning maps, cluster analysis, and so on, or some combination. One or more similar mechanisms may also be used for selection of one or more assessments or portions thereof. Other selection mechanisms or some combination of selection mechanisms may also be used for conducting selection, selection refinement or both.
Beginning with FIG. 2a, selection engine 116b may, for example, receive two or more assessments and select from among their items those related items that are to be included in cumulative assessment.
FIG. 2a also illustrates how embodiments of the present invention enable cumulative assessment to be conducted using a series of assessments otherwise provided as formative assessments with respect to substance, procedure or both. (Formative testing, for purposes of the present invention, may include any ongoing assessment regardless of form. Summative testing may further be defined in a conventional sense, while cumulative testing may provide for producing assessment information otherwise attributable to conventional summative assessment using formative testing, summative testing or both.)
More specifically, cumulative testing may include ongoing testing in which the items of any given assessment are prepared either in a standardized manner (e.g., with extensive accuracy in identifying a likelihood that a student has acquired an ability or ability level corresponding to a goal) or by a lesser-skilled teacher or other item preparer. As will become more apparent, embodiments of the present invention enable substantial improvement in estimation accuracy that may be applicable to either mode of preparation. Additionally, because embodiments of the present invention enable an accumulation of related items that may be distributed over the course of multiple assessments (e.g., at least two), the number of items included in any particular assessment may be decreased.
Cumulative assessment may still further be conducted at various points in time utilizing all or some of available assessments. Thus, for example, assuming that assessments A through D are conducted at successive points in time, cumulative assessment may be conducted following assessment B and in conjunction with assessments A and B to provide a more accurate estimation of a corresponding student's ability with respect to the assessed goals at the time of assessment B as well as at the time of assessment A (e.g., see below). Cumulative assessment may also be conducted following assessment C and in conjunction with one or more of assessment A and assessment B to provide a more accurate estimation of a corresponding student's ability with respect to the goals of included items of included assessments, and so on.
FIG. 2a also illustrates how cumulative assessment according to the present invention enables summative-like testing to be conducted in an expeditious manner. As was noted earlier, any one or more of assessments A through D may be administered, in a more conventional sense, as a formative or summative assessment. However, because cumulative assessment provides for aggregation of related items, accuracy improvement may be achieved in an ongoing manner for summative assessment, formative assessment or both. Therefore, a comprehensive final summative assessment is not required and, in addition to response scoring automation or other techniques that may be used, a less comprehensive or extensive final test may be administered and scored in a more expeditious manner. Nevertheless, it is likely that a final assessment including items covering a greater spread of goals may provide even further accuracy benefits (e.g., by providing an ability estimate for a student that covers a broader range of goals, or goals presented over a broader time period). Thus, for example, assessment D may include items relating to goals 1 and 2 (e.g., for which learning may have been presented first and second, or otherwise during an earlier time period) and items relating to goals 5 and 6 (e.g., for which learning may have been presented last, or otherwise during a later time period).
FIG. 2b illustrates a further item selection mechanism that utilizes a learning map or other diagnostic criteria. A more detailed example of a learning map is illustrated by FIG. 3a.
In one embodiment, each learning target represents or is associated with a smallest targeted or teachable concept (“TC”) at a defined level of expertise or depth of knowledge (“DOK”). A TC may include a concept, knowledge state, proposition, conceptual relationship, definition, process, procedure, cognitive state, content, function, anything anyone can do or know, or some combination. A DOK may indicate a degree or range of degrees of progress in a continuum over which something increases in cognitive demand, complexity, difficulty, novelty, distance of transfer of learning, or any other concepts relating to a progression along a novice-expert continuum, or any combination of these.
For example, learning target 311 (LT1) represents a particular TC (i.e., TC-A) at a particular depth of knowledge (i.e., DOK-1). Learning target 312 (LT2), represents the same TC as learning target 311, but at a different depth of knowledge. That is, learning target 312, represents TC-A at a depth of knowledge of DOK-2. Arc 351, which connects target 311 to 312, represents the relationship between target 311 and 312. Because arc 351 points from target 311 to target 312, target 311 is a precursor to target 312, and target 312 is a postcursor of target 311.
Examples of learning maps and methods of developing them and using them to guide assessment and instructions are described in U.S. patent application Ser. No. 10/777,212, corresponding to application publication no. US 2004-0202987, the contents of which are hereby incorporated by reference.
Returning now to FIG. 1a, cumulative assessment engine 116 in one embodiment includes an assessment engine 116a that provides for scoring received item responses to produce an ability estimate for a corresponding assessment.
Continuing with FIG. 1a, cumulative assessment engine 116 also includes a likelihood engine 116c. Likelihood engine 116c provides for scoring two or more included assessments, or the included items thereof, to produce an improved likelihood ability estimate, as discussed below.
Turning again to the operation of cumulative assessment engine 116, production of an improved likelihood ability estimate may be better understood with reference to the following example.
For example, let us assume that an assessment A that includes items a1, a2 . . . aN is administered at a time T1 and scored (e.g., by assessment engine 116a) to produce an ability estimate (θ1) given by equation 1, in which:
θ1 = f(Assessment A) at T1   (Equation 1)
The function f of equation 1 may, for example, represent a standardized ability estimate measure which, in the implementation of the invention described herein, comprises a first or greater order probabilistic model that predicts an unobserved state (i.e., an ability estimate) based on observed evidence (e.g., item response results), often referred to in the literature as “reasoning over time.” Typical examples of such models include unidimensional item response theory models (e.g., the 3-parameter logistic model (3PL IRT), 2-parameter logistic model (2PL IRT), 1-parameter logistic model (1PL IRT) and Rasch model), multidimensional IRT models (MIRT), Learning Map Analytics (LMA), and Bayesian networks. Let us further assume that an assessment B that includes items b1, b2 . . . bM is administered at a time T2 and scored (e.g., by assessment engine 116a) to produce an ability estimate (θ2) given by equation 2, in which:
θ2 = f(Assessment B) at T2   (Equation 2)
Again, for equation 2, the function f is a probabilistic model for predicting, or estimating, ability based on assessment results.
If selection engine 116b further selects related items included in included assessments A and B, then likelihood engine 116c may score the included assessments in accordance with a union of the ability estimates, representing the greater number of items in the union of the assessments as compared with either included assessment alone. Moreover, likelihood engine 116c may score the included assessments to produce a maximum likelihood, or further, a simultaneous maximum likelihood ability estimate for the included assessments, given by Equation 3 for theta 2 prime (θ2′) and theta 1 prime (θ1′), in which:
θ2′ = f(Assessment B in view of Assessment A), and
θ1′ = f(Assessment A in view of Assessment B).   (Equation 3)
Stated alternatively, a standard measurement model, such as 3PL IRT, which is given by equation 4 below, may be modified by the union of included ability estimates at a point of maximum likelihood for each (here, θ2′ and θ1′) to produce a more accurate ability estimate at the time of each of the included assessments. For clarity's sake, Equation 4 is expressed in a more conventional manner, according to the probability of a correct response to item j by student i, wherein:
Pij(Xj = 1 | θi) = cj + (1 − cj) / (1 + e^(−aj(θi − bj)))   (Equation 4)
Where
Xj=1 indicates a correct response to item j,
θi is the ability estimate for student i, and
aj, bj, and cj are the discrimination, difficulty, and pseudo-guessing parameters for the 3PL model, respectively.
Graphs 400a and 400b of FIG. 4 illustrate such a union of included ability estimates and the resulting more accurate ability estimates at the times of the included assessments.
Returning now to FIG. 1b, a further cumulative assessment system 100b is illustrated according to an embodiment of the invention, in which testing materials, results or both may be generated, administered, collected or assessed at least in part in electronic form.
System 100b includes assessment provider system 101 and test site system 102, which systems are at least intermittently communicatingly couplable via network 103. As with system 100a, test materials may be generated by test generation system 113a, e.g., via a learning map or other diagnostic criteria, by hand, using other mechanisms or some combination, and delivered to test site 102a1 or other test sites in hard-copy form, for example, via conventional delivery. The test may further be administered in hard-copy form at various locations within one or more test sites, and the responses or other materials may be delivered, for example, via conventional delivery to performance evaluation system 111a of assessment provider system 101. In other embodiments, test materials, results or both may be deliverable in hard-copy, electronic, mixed or combined forms respectively via delivery service 104, network 103 or both. (It will be appreciated that administering of the assessment may also be conducted with respect to remotely located students, in accordance with the requirements of a particular implementation.)
Assessment (i.e., test) generation system 113a in the embodiment of FIG. 1b may otherwise operate in a substantially similar manner as test generation system 113 of FIG. 1a.
Substantially any devices that are capable of presenting testing materials and receiving student responses (e.g., devices 124, 125) may be used by students (or officiators) as testing devices for administering an assessment in electronic form. Devices 124, 125 are connected at test site 102a1 via site network 123 (e.g., a LAN) to test site server computer 126. Network 103 may, for example, include a static or reconfigurable wired/wireless local area network (LAN), wide area network (WAN), such as the Internet, private network, and so on, or some combination. Firewall 118 is illustrative of a wide variety of security mechanisms, such as firewalls, encryption, fire zone, compression, secure connections, and so on, one or more of which may be used in conjunction with various system 100b components. Many such mechanisms are well known in the computer and networking arts and may be utilized in accordance with the requirements of a particular implementation.
As with system 100a, the assessment provider 101a portion of assessment system 100b in one embodiment comprises performance evaluation system 111a including a test material receiving device 110a and a cumulative assessment engine 116. Test material receiving device 110a may also again include a high-speed scanner, braille reader or other mechanism for receiving one or more response portions (e.g., of an answer book or mixed item-and-response format assessment sheet) and providing included item responses in an electronic format to other subject assessment system components. (It will be appreciated, however, that no conversion to electronic form may be required for responses or other utilized test materials that are received in electronic form.)
Performance evaluation system 111a of the illustrated embodiment includes a cumulative assessment engine 116 that provides for performing cumulative assessment in a substantially similar manner as discussed for cumulative assessment engine 116 of FIG. 1a.
The exemplary computing system of FIG. 5 may be used to implement one or more of the system components discussed above.
Computing system 500 comprises components coupled via one or more communication channels (e.g., bus 501) including one or more general or special purpose processors 502, such as a Pentium®, Centrino®, Power PC®, digital signal processor (“DSP”), and so on. System 500 components also include one or more input devices 503 (such as a mouse, keyboard, microphone, pen, and so on), and one or more output devices 504, such as a suitable display, speakers, actuators, and so on, in accordance with a particular application.
System 500 also includes a computer readable storage media reader 505 coupled to a computer readable storage medium 506, such as a storage/memory device or hard or removable storage/memory media; such devices or media are further indicated separately as storage 508 and memory 509, which may include hard disk variants, floppy/compact disk variants, digital versatile disk (“DVD”) variants, smart cards, partially or fully hardened removable media, read only memory, random access memory, cache memory, and so on, in accordance with the requirements of a particular implementation. One or more suitable communication interfaces 507 may also be included, such as a modem, DSL, infrared, RF or other suitable transceiver, and so on for providing inter-device communication directly or via one or more suitable private or public networks or other components that can include but are not limited to those already discussed.
Working memory 510 further includes operating system (“OS”) 511, and may include one or more of the remaining illustrated components in accordance with one or more of a particular device, examples provided herein for illustrative purposes, or the requirements of a particular application. Assessment engine 512, selection engine 513 and likelihood engine 514 may, for example, be operable in substantially the same manner as was already discussed. Working memory of one or more devices may also include other program(s) 515, which may similarly be stored or loaded therein during use.
The particular OS may vary in accordance with a particular device, features or other aspects in accordance with a particular application, e.g., using Windows, WindowsCE, Mac, Linux, Unix, a proprietary OS, and so on. Various programming languages or other tools may also be utilized, such as those compatible with C variants (e.g., C++, C#), the Java 2 Platform, Enterprise Edition (“J2EE”) or other programming languages. Such working memory components may, for example, include one or more of applications, add-ons, applets, servlets, custom software and so on for conducting cumulative assessments including, but not limited to, the examples discussed elsewhere herein. Other programs 515 may, for example, include one or more of security, compression, synchronization, backup systems, groupware, networking, or browsing code, and so on, including but not limited to those discussed elsewhere herein.
When implemented in software, one or more of system 100a and 100b or other components may be communicated transitionally or more persistently from local or remote storage to memory (SRAM, cache memory, etc.) for execution, or another suitable mechanism may be utilized, and one or more component portions may be implemented in compiled or interpretive form. Input, intermediate or resulting data or functional elements may further reside more transitionally or more persistently in a storage media, cache or other volatile or non-volatile memory, (e.g., storage device 508 or memory 509) in accordance with the requirements of a particular application.
Turning now to FIG. 6, a flow diagram illustrates a cumulative assessment method according to an embodiment of the invention, which method may be conducted, for example, by cumulative assessment engine 116 of FIGS. 1a and 1b.
In block 610, the cumulative assessment engine determines included assessments, and in block 612, determines included items (e.g., directly or via determination of excluded items). In block 614, the cumulative assessment engine scores the included assessments (or included items of the included assessments) to produce a maximum likelihood ability estimate for the included assessments.
Reference throughout this specification to “one embodiment”, “an embodiment”, or “a specific embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention and not necessarily in all embodiments. Thus, respective appearances of the phrases “in one embodiment”, “in an embodiment”, or “in a specific embodiment” in various places throughout this specification are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, or characteristics of any specific embodiment of the present invention may be combined in any suitable manner with one or more other embodiments. It is to be understood that other variations and modifications of the embodiments of the present invention described and illustrated herein are possible in light of the teachings herein and are to be considered as part of the spirit and scope of the present invention.
Further, at least some of the components of an embodiment of the invention may be implemented by using a programmed general purpose digital computer, by using application specific integrated circuits, programmable logic devices, or field programmable gate arrays, or by using a network of interconnected components and circuits. Connections may be wired, wireless, by modem, and the like.
It will also be appreciated that one or more of the elements depicted in the drawings/figures can also be implemented in a more separated or integrated manner, or even removed or rendered as inoperable in certain cases, as is useful in accordance with a particular application. It is also within the spirit and scope of the present invention to implement a program or code that can be stored in a machine-readable medium to permit a computer to perform any of the methods described above.
Additionally, any signal arrows in the drawings/figures should be considered only as exemplary, and not limiting, unless otherwise specifically noted. Furthermore, the term “or” as used herein is generally intended to mean “and/or” unless otherwise indicated. Combinations of components or steps will also be considered as being noted, where terminology is foreseen as rendering the ability to separate or combine unclear.
As used in the description herein and throughout the claims that follow, “a”, “an”, and “the” includes plural references unless the context clearly dictates otherwise. Also, as used in the description herein and throughout the claims that follow, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
The foregoing description of illustrated embodiments of the present invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed herein. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes only, various equivalent modifications are possible within the spirit and scope of the present invention, as those skilled in the relevant art will recognize and appreciate. As indicated, these modifications may be made to the present invention in light of the foregoing description of illustrated embodiments of the present invention and are to be included within the spirit and scope of the present invention.
Thus, while the present invention has been described herein with reference to particular embodiments thereof, a latitude of modification, various changes and substitutions are intended in the foregoing disclosures, and it will be appreciated that in some instances some features of embodiments of the invention will be employed without a corresponding use of other features without departing from the scope and spirit of the invention as set forth. Therefore, many modifications may be made to adapt a particular situation or material to the essential scope and spirit of the present invention. It is intended that the invention not be limited to the particular terms used in following claims and/or to the particular embodiment disclosed as the best mode contemplated for carrying out this invention, but that the invention will include any and all embodiments and equivalents falling within the scope of the appended claims.
This application claims the benefit of U.S. Provisional Application Ser. No. 60/689,978 filed May 28, 2005, the contents of which are hereby incorporated by reference.