1. Field of the Invention
The present invention relates to the field of educational reporting systems and, more specifically, provides systems and methods for reporting student progress and/or knowledge using a learning map, which is a device for expressing dependency relationships between and among learning targets.
2. Discussion of the Background
A teacher (or other educator) responsible for teaching a subject area to a student would benefit by knowing the student's strengths and weaknesses in the subject area. For example, if the teacher knows the student's strengths and weaknesses in the subject area, then the teacher can spend more time teaching the concepts that the student doesn't know and less time teaching the concepts that the student already knows.
Accordingly, what is desired are systems and methods to enable a student, teacher or other interested party (e.g., parent or tutor) (hereafter “user”) to quickly and easily determine the concepts the student knows and/or the concepts the student doesn't know so that the educator can better teach the student.
In one aspect, the present invention provides systems and methods for using a learning map to enable a user to visualize a student's or group of students' progress (or non-progress) in one or more subject areas. A student can use the systems described herein to inform his or her own learning and track his or her own progress as well.
A learning map includes a network of nodes, with each node representing a particular learning target (i.e., a skill or concept at a particular depth of knowledge) in a well-defined strand of learning in an academic content area or any other domain of learning. Preferably, the nodes are linked together in an ordered way as pre-cursors and post-cursors of each other in an empirically validated learning sequence. Pre-cursor indicators are related to the knowledge that the student should have prior to being taught the learning target. Post-cursor indicators relate to knowledge that the student should be able to acquire more readily after learning the learning target. There can be more than one pre-cursor and/or post-cursor to any given targeted skill, and nodes from one academic area (such as reading language arts) may serve as pre-cursors and/or post-cursors to another academic area (such as mathematical computation). All academic areas may be interconnected into one large learning map.
A method according to some embodiments of the invention includes: (a) administering an assessment to a student, the assessment having one or more questions, at least one question of the assessment being associated with a first learning target; (b) providing a report comprising (b1) a first node associated with the first learning target, (b2) a second node directly connected to the first node, the second node being associated with a second learning target that is a pre-cursor of the first learning target, and (b3) a third node directly connected to the first node, the third node being associated with a third learning target that is a post-cursor of the first learning target; (c) coding the first node, based at least in part on the student's response to the question associated with the first learning target, to indicate (1) whether the student has mastered the first learning target, (2) whether the student has not yet learned the first learning target, (3) whether there is insufficient information to determine the knowledge state of the student with respect to the first learning target, or (4) whether the student has not yet been assessed on the first learning target; and (d) providing means enabling a viewer to view a question associated with the first learning target.
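The four-way coding in step (c) can be sketched as a small classification routine. This is illustrative only; the `KnowledgeState` names, the mastery threshold, and the minimum-item rule below are assumptions for the sketch, not details recited by the method.

```python
from enum import Enum

class KnowledgeState(Enum):
    MASTERED = "mastered"
    NOT_YET_LEARNED = "not_yet_learned"
    INSUFFICIENT_INFO = "insufficient_information"
    NOT_YET_ASSESSED = "not_yet_assessed"

def code_node(responses, mastery_threshold=0.8, min_items=2):
    """Classify a student's knowledge state for one learning target.

    `responses` is a list of booleans (True = correct answer) for the
    assessment items associated with that learning target.
    """
    if not responses:
        return KnowledgeState.NOT_YET_ASSESSED
    if len(responses) < min_items:
        return KnowledgeState.INSUFFICIENT_INFO
    fraction_correct = sum(responses) / len(responses)
    if fraction_correct >= mastery_threshold:
        return KnowledgeState.MASTERED
    return KnowledgeState.NOT_YET_LEARNED
```

For example, a student with no recorded responses for the target is coded as not yet assessed, while a student answering both associated items correctly is coded as mastered.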
The method may also include the steps of: providing means directing a user to instructional resources related to one of the learning targets; providing professional development materials for an instructor to use for further instruction on one of the learning targets; and providing views into other related information, such as the text of a given state's standards for assessment on one of the learning targets.
Questions (a.k.a., “items”) may broadly include traditional questions, such as, for example, those used in paper and online testing programs, and non-traditional questions, such as, for example, interactions with multimedia and interactive media such as games. In short, a question can be any “device” that is used in determining a student's knowledge of, for example, a subject or concept.
Assessments may broadly include traditional forms of assessment, such as collections of questions provided online or on paper and individually administered performance assessments, as well as performance measurements based on interactions with a computer game or videogame in which the student's interactions or collections of interactions are correlated to nodes on the learning map. This correlation could be applied to existing assessments or designed into new assessments.
Instructional resources and professional development materials may include traditional materials such as textbooks, videos, reference materials, etc., and may also include nontraditional instructional materials such as electronic games, board and card games, and all manner of interactive media.
The above and other aspects, features and advantages of the present invention, as well as the structure and operation of preferred embodiments of the present invention, are described in detail below with reference to the accompanying drawings.
The accompanying drawings, which are incorporated herein and form part of the specification, help illustrate various embodiments of the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use embodiments of the invention. In the drawings, like reference numbers indicate identical or functionally similar elements.
In one aspect, the present invention provides a reporting system that enables a user to view student assessment results in graphical format.
Referring now to
Data processing system 104 may include one or more general and/or special purpose computers. If more than one computer is used to implement processing system 104, the two or more computers need not be co-located. If they are not co-located, then, in one embodiment, a network (e.g., the Internet or other network) is used to enable the two or more computers to communicate with each other.
Coupled to processing system 104 is a user interface system 110. User interface system 110 may be directly connected to processing system 104 or indirectly coupled to the processing system 104 through, for example, a local or wide area network and zero or more other processing systems. User interface system 110 may include one or more information input and/or output devices, such as, for example, a monitor, keyboard, mouse, microphone, speaker or other information input/output device. The user interface may provide for local presentation, remote presentation, or some combination of these.
In some embodiments, reporting software 106 may include one or more executable application programs (e.g., web servers, web browsers and other computer programs), scripts, markup language documents (e.g., HTML, XML, etc.), and/or other software that function together to provide the functionality described herein.
Referring now to
Learning map table 210 captures the relationships among learning targets and may also include pre/post-cursor inference values. A postcursor inference value is a value that represents the probability that a student knows the precursor learning target if it can be shown that the student knows the postcursor learning target. A precursor inference value is a value that represents the probability that a student does not know the postcursor learning target if it can be shown that the student does not know the precursor learning target. As shown in table 210, we can determine that at least five learning targets (LT1, LT2, . . . , LT5) have been specified because there are five rows in table 210. Each row in table 210 corresponds to one of the five learning targets. The data in a given row specifies the post-cursor relationships between the learning target corresponding to the given row and the other learning targets.
For example, consider the first row of table 210. This row corresponds to learning target LT1. The data in this row indicates that LT2 is the only learning target that is a post-cursor of LT1 because cell 250, which corresponds to LT2, includes the pre-cursor and post-cursor inference values, whereas all the other cells in the row do not contain inference values. The inference values included in cell 250 indicate that, if a student doesn't know LT1, then there is a probability of 0.86 that the student also does not know LT2, and if a student knows LT2, then there is a probability of 0.97 that the student also knows LT1.
The second row in table 210, which corresponds to LT2, indicates that LT3 is the only learning target that is a post-cursor of LT2. This row also indicates that, if a student doesn't know LT2, then there is a probability of 0.82 that the student also does not know LT3, and if a student knows LT3, then there is a probability of 0.95 that the student also knows LT2.
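As a sketch, the two rows discussed above can be held in a small lookup table keyed by (pre-cursor, post-cursor) pairs. The dictionary layout and function names here are illustrative assumptions, not the patent's storage format.

```python
# Maps (precursor, postcursor) -> (precursor inference value,
#                                  postcursor inference value).
INFERENCE = {
    ("LT1", "LT2"): (0.86, 0.97),  # cell 250 of table 210
    ("LT2", "LT3"): (0.82, 0.95),
}

def p_not_know_post(pre, post):
    """P(student does not know `post` | student does not know `pre`)."""
    return INFERENCE[(pre, post)][0]

def p_know_pre(pre, post):
    """P(student knows `pre` | student knows `post`)."""
    return INFERENCE[(pre, post)][1]
```

With these values, showing that a student does not know LT1 implies a 0.86 probability that LT2 is also unknown, matching the first row of table 210.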
Table 210 can be used to generate a network diagram (i.e., learning map) that corresponds to table 210. The network diagram has nodes and arcs, wherein the nodes represent the specified learning targets and the arcs represent the specified post-cursor relationships between learning targets. This network diagram forms a learning map. As further described in patent application Ser. No. 10/777,212, learning maps are advantageous for many reasons, including that they can be used to generate efficient tests (i.e., knowledge assessments) that assess one's knowledge of a particular academic content area or across multiple academic areas. Additionally, as described herein, learning maps can be used to generate reports that enable an educator, student or other interested party to visualize a student's performance on an assessment as well as visualize the student's progress over time.
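A minimal sketch of deriving the node-and-arc network from such a table (the function name and table layout are assumed for illustration):

```python
def build_learning_map(table):
    """Derive nodes and arcs from (precursor, postcursor) table entries.

    Each key of `table` names an arc; the nodes are whatever learning
    targets appear at either end of an arc.
    """
    nodes, arcs = set(), []
    for pre, post in table:
        nodes.update((pre, post))
        arcs.append((pre, post))  # each arc runs precursor -> postcursor
    return nodes, arcs

table_210 = {("LT1", "LT2"): (0.86, 0.97), ("LT2", "LT3"): (0.82, 0.95)}
nodes, arcs = build_learning_map(table_210)
# nodes -> {'LT1', 'LT2', 'LT3'}; arcs -> [('LT1', 'LT2'), ('LT2', 'LT3')]
```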
Referring now to
Referring back to
The student/item table 206 is used for storing the students' responses to items and information indicating whether the response is a correct response. For each response to an item, table 206 may also record the date the student provided the response. For example, at the beginning of a semester a student may be given an assessment having ten items (e.g., ten multiple choice questions) and may be given the same or other assessments later in the semester to gauge whether the student is making progress. In such a scenario, table 206 may record not only the student's responses to the items from the first assessment, but also the student's responses to the items from all of the other assessments.
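One way to sketch such a student/item table is as a list of dated response records (the record layout is assumed for illustration):

```python
from datetime import date

# Rows mirror table 206: (student, item, response, is_correct, date_answered).
student_item = []

def record_response(student, item, response, correct_answer, when):
    """Store one dated response, flagging whether it was correct."""
    student_item.append(
        (student, item, response, response == correct_answer, when)
    )

# The same item answered at the start of the semester and again later:
record_response("Jamie Jones", "item1", "C", "B", date(2004, 9, 1))
record_response("Jamie Jones", "item1", "B", "B", date(2004, 11, 15))
```

Keeping every dated response, rather than only the latest one, is what lets the longitudinal reports described later replay a student's progress over time.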
The item/learning target table 208 is used to associate items with learning targets. In some embodiments, each item of an assessment may be associated with at least one learning target. Typically, an item is associated with a learning target if a correct response to the item provides evidence that the student comprehends the learning target.
Referring now to
Process 400 enables a user of system 100 to view and interact with various reports pertaining to a student's or a group of students' mastery of one or more subject areas. Process 400 is, in one embodiment, performed after one or more assessments have been administered to a group of one or more students (e.g., Ms. Jones's third grade class) and scored. That is, for example, process 400 is preferably performed after data has been stored in the tables shown in
Process 400 may begin at step 402, where reporting system 100 displays a graphical user interface screen on a user's display (e.g., a display of user interface system 110).
Individual Student Report (ISR)
In step 404, system 100 may prompt the user to select a particular student from a list of students and a particular assessment from a list of assessments that were given to the student. After the user selects a particular student (e.g., Jamie Jones) and selects a particular assessment (e.g., assessment #384 administered on Sep. 1, 2004), process 400 may proceed to step 406.
In step 406, reporting system 100 displays on a user's display a graphical user interface screen that includes at least a segment of a learning map. For example, in one embodiment, each assessment is associated with at least a segment of a learning map and reporting system 100 displays the learning map segment (which may correspond to a strand of learning) that is associated with the selected assessment.
In one embodiment, the learning map segment displayed in step 406 includes, for each item on the assessment, a node corresponding to a learning target that is associated with the item. For example, if the selected assessment includes three items (item1, item2, and item3), and item1 is associated with learning target LT1, item2 is associated with learning target LT2, and item3 is associated with learning target LT3, then the segment of the learning map that gets displayed in step 406 includes a node corresponding to LT1, a node corresponding to LT2 and a node corresponding to LT3.
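A sketch of selecting the segment to display, using the item-to-learning-target association of table 208 (the names here are illustrative assumptions):

```python
# Mirrors table 208: each assessment item -> its associated learning target.
ITEM_TO_TARGET = {"item1": "LT1", "item2": "LT2", "item3": "LT3"}

def segment_for(assessment_items, item_to_target=ITEM_TO_TARGET):
    """Return the set of learning-target nodes to display in step 406:
    one node for each learning target tied to an item on the assessment."""
    return {item_to_target[item] for item in assessment_items}
```

For the three-item assessment described above, `segment_for(["item1", "item2", "item3"])` yields the nodes for LT1, LT2, and LT3.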
To illustrate the above feature,
As shown in
The above mentioned color scheme is merely exemplary, and any other color scheme or other scheme for differentiating nodes can be used to differentiate between mastered, not yet learned, further instruction required, or not yet assessed nodes. For example, different types of nodes can have different shapes or background patterns (e.g., stippling or vertical, horizontal, or diagonal crosshatching). Alternatively, distinct sounds may be provided that play when a user “mouses over” each node. Still alternatively, individual nodes can simply have labels indicating whether the node is mastered, not yet learned, requiring further instruction, or not yet assessed.
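One possible coding scheme, combining the color and pattern alternatives mentioned above, can be sketched as a lookup table. The color choices follow the examples given elsewhere in this description (green for mastered, red for not yet learned, yellow for inconclusive, blue for not yet assessed); the table itself and the pattern names are illustrative.

```python
# Visual coding per knowledge state; patterns serve as a color-free fallback.
NODE_STYLE = {
    "mastered":          {"color": "green",  "pattern": "solid"},
    "not_yet_learned":   {"color": "red",    "pattern": "diagonal"},
    "needs_instruction": {"color": "yellow", "pattern": "stipple"},
    "not_yet_assessed":  {"color": "blue",   "pattern": "none"},
}

def style_for(state):
    """Return the display attributes for a node in the given state."""
    return NODE_STYLE[state]
```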
Selection options (e.g., buttons, menu items or other selection options) may be provided on screen 600 for enabling a user to select either a report showing results from the most recent assessment taken or a report summarizing the results for all tests taken to date. A user can use the buttons 691 and 692 to zoom the map 602 in and out, respectively, in a manner similar to a user of a digital geographical map. Additionally, elements 693, 694, 695 and 696 are provided to enable the user to scroll up, down, left, and right to see adjacent and related nodes, including more post-cursor and pre-cursor nodes, in order to gain a greater understanding of the student's progress and of what lies ahead.
In some embodiments, a user is able to query the shortest route on the learning map from his or her present knowledge status, as revealed by the green nodes, to the nodes representing a goal, such as a state standard or indicator represented by one or more nodes in the learning map. Additionally, a user can jump to regions of the map (e.g., short division) by inputting a natural language or curriculum language instruction, such as: “mathematics, grade 3, short division without remainders.” In response to a user entering such an instruction, system 100 will display the corresponding region of the map on the user's display. The zoom level of the map can be altered by selecting an increment of time (day/week/month/quarter year/half year/year/multiple years).
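The shortest-route query can be sketched as a breadth-first search over the map's arcs, starting from every mastered (green) node at once. This is a sketch under assumed names; the patent does not specify the search algorithm.

```python
from collections import deque

def shortest_route(arcs, mastered, goal):
    """Fewest-arcs path from any mastered node to `goal`, following
    precursor -> postcursor arcs; returns None if the goal is unreachable."""
    adjacency = {}
    for pre, post in arcs:
        adjacency.setdefault(pre, []).append(post)
    queue = deque([node] for node in mastered)   # one single-node path each
    visited = set(mastered)
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adjacency.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

# shortest_route([("LT1", "LT2"), ("LT2", "LT3")], {"LT1"}, "LT3")
# -> ['LT1', 'LT2', 'LT3']
```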
Nodes that are displayed within screen 600 that are associated with learning concepts that fall within a selected or predetermined state's grade-level expectations for learning in that strand may be heavily outlined for emphasis. For example, node 650 is heavily outlined. These nodes may also have “S” icons (or other identifying element) associated therewith. Selecting the “S” icon causes system 100 to display relevant wording from state standards. System 100 may display the wording in a pop-up window or the like. An exemplary state standard pop-up window 702—activated when the “S” icon is selected—is shown in
Nodes that are correlated with specific instructional resources may feature an “I” icon or other indicator. Selecting an “I” icon associated with a particular node causes system 100 to display a pop-up window or the like containing information that directs the user to exact locations within selected instructional resources where relevant lessons targeting the skills and concepts represented by the particular node can be found.
Nodes corresponding to learning targets that were included on the selected assessment may feature “Q” icons (or other indicators), one for each item on the assessment. The “Q” icons may be coded (e.g., color coded) to indicate whether the student correctly responded to the item with which the Q icon is associated. As shown in
The ISR screen 600 (or a report similar to the ISR screen) can be printed and sent home with students for their parents to review. In addition, online versions of the report can be provided for access over a distributed network, for example, the Internet. For online versions, appropriate security features, for example restricted authorizations and password access, are desirable.
Educators will find ISR screen 600 to be a useful tool in evaluating a student. Simply by glancing at the screen 600, a teacher can quickly determine the learning targets that the student knows and doesn't know. The teacher can then help focus the student in those areas where the student's skills appear to be lacking. Students will find the screens a useful tool for self-evaluation and assistance in learning.
Pre-cursor and post-cursor relationships that appear in these reports allow teachers and students to identify learning targets that may need to be learned in order to acquire a targeted skill. They may also use the reports to identify learning targets that may be learned in the future.
It is expected that a teacher using system 100 will use the system 100 to display an ISR screen for each student in the teacher's class. This will enable the teacher to give more individualized instruction to each student, because, simply by reviewing each student's ISR screen, the teacher can quickly determine the areas that need to be focused on for each student. For example, an ISR screen for one student may indicate that the student should focus on three learning targets: (D) multiplication regrouping; (F) subtraction regrouping; and (H2) long division, whereas an ISR screen for a different student may indicate that this other student need only focus on learning division. In this way, the ISR screens provide a powerful tool to educators.
Individual Longitudinal Report (IDL)
Referring back to
In step 422, reporting system 100 displays on the user's display a graphical user interface screen that includes at least a segment of a learning map. For example, in one embodiment, each assessment is associated with at least a segment of a learning map, and reporting system 100 displays a learning map segment that encompasses all the segments associated with the selected assessments.
That is, in one embodiment, the learning map segment displayed in step 422 includes, for each item on each selected assessment, a node corresponding to a learning target that is associated with the item. For example, if one of the selected assessments includes an item (item1), and item1 is associated with learning target LT1, and another of the selected assessments includes an item (e.g., item77), and item77 is associated with learning target LT77, then the segment of the learning map that gets displayed in step 422 includes a node corresponding to LT1 and a node corresponding to LT77.
To illustrate the above feature,
The IDL screen 1000 is used to display the results from the selected assessments. The results are presented within the paradigm of a network of nodes in a learning map, visibly coded (e.g., color coded) as described above and/or sound coded to represent a student's achievement status in a particular strand of learning during the course of a school year. The default display of nodes may be “zoomed out” to a greater extent than the view on the ISR screen 600 in order to give the user a better overall look at the concepts of the learning strand that have been covered over time. It will be appreciated that the granularity of the data displayed in the learning map may differ based on the zoom level selected by the user. Greater detail of learning targets may be displayed (see
A user can again zoom in and out and navigate through the map segment 1002 to choose other views. The user may also select a “Play” button 1090 on IDL screen 1000 in order to display the test results in sequence and observe the student's progress as nodes change from red to yellow to green.
More specifically, in response to activation of the play button 1090, system 100 initially codes the nodes on map 1002 according to the results of an earlier assessment (e.g., a node on the map may initially be colored red to indicate that the results of the first assessment indicates that the student has not learned the concept associated with the node). As discussed above, the results of assessments may be stored in a database, thus, system 100 may first retrieve information from the database prior to coding the nodes on the map. After initially coding the nodes, system 100 may pause for a predetermined delay and then recode the nodes on map 1002 according to the results of the next selected assessment. For example, the node that was initially colored red may change to the color green because the results of the second assessment may indicate that the student has learned the concepts associated with the node.
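The replay triggered by the play button can be sketched as follows; the callback shape and the one-second default delay are assumptions for the sketch.

```python
import time

def play(assessment_results, recode, delay=1.0):
    """Code the map for the earliest assessment, pause, then recode the
    nodes for each later assessment in turn, as described above.
    `recode` is a callback that repaints the map for one set of results."""
    for results in sorted(assessment_results, key=lambda r: r["date"]):
        recode(results)       # repaint the nodes of map 1002
        time.sleep(delay)

frames = []
play(
    [{"date": "2004-11-15", "LT1": "mastered"},
     {"date": "2004-09-01", "LT1": "not_yet_learned"}],
    recode=frames.append,
    delay=0,
)
# frames[0] holds the September results; frames[1] the November results.
```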
The user can also select any one of the particular assessments and view the ISR screen for that given assessment. For example, in one embodiment, a “timeline” 1054 is displayed on the IDL screen 1000. Indicia (e.g., the letter “A”) are displayed on the timeline 1054, and each is associated with one of the assessments selected in step 420. Accordingly, to select any one of the particular assessments, the user may use a mouse or the like to select an “A,” by, for example, “clicking on” the “A” or merely hovering the cursor over the “A.” In response to the user selecting an “A” on the timeline, system 100 codes map 1002 according to the assessment associated with the selected “A.”
As shown in
Typically, for example, the IDL screen 1000 permits the student/viewer to browse backward from the selected assessment to view the results of all tests taken in this strand. In addition, the IDL screen 1000 is constructed and arranged such that the user can browse backward from nodes (blue in color) in the learning map that are prior to the nodes for which assessment occurred, or forward to preview the lessons and learning targets in nodes not yet assessed (blue in color), leading up to and including the nodes representing the state's grade-level expectations for learning in that strand, or any set of pre-defined learning expectations established for any purpose. Because tests can be delivered on a daily or weekly basis, or any other increment/schedule desired, the assessments are spaced out on a timeline that reflects the relative difference in timing between assessments.
The IDL screen 1000 features the same “S,” “I,” and “Q” icons and navigational methodologies that are described above with respect to the ISR screen 600 for the viewer's reference.
Referring now to
The Group Report (GR)
Referring back to
In step 442, reporting system 100 displays on the user's display a Group Report (GR) screen that includes at least a segment of a learning map. In one embodiment, the learning map segment displayed in step 442 includes, for each item on the selected assessment, a node corresponding to a learning target that is associated with the item. For example, if the selected assessment includes three items (item1, item2, and item3), and item1 is associated with learning target LT1, item2 is associated with learning target LT2, and item3 is associated with learning target LT3, then the segment of the learning map that gets displayed in step 442 may include a node corresponding to LT1, a node corresponding to LT2 and a node corresponding to LT3.
To illustrate the above feature,
A GR screen, such as GR screen 1200, represents the assessment results of any selected group on any segment of a learning map (i.e., “learning strand,” such as mathematics computation, for example) by displaying color-coded horizontal bands within the nodes of the learning map segment.
In the example GR screen 1200 shown, the horizontal bands, in order from the bottom of the node to the top, are green, yellow, and red. The widths of the green and red bands are proportional to the number of students who have mastered (green) or who have not yet achieved (red) the learning target of the node in question. An intermediate yellow band represents the percentage of students having assessment results for which data is inconclusive or for which further instruction may be required.
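The band widths can be sketched as simple proportions of the node's width; the function name, state labels, and pixel width below are illustrative assumptions.

```python
def band_widths(student_states, node_width=100):
    """Split a node of `node_width` pixels into green/yellow/red bands
    proportional to the number of students in each knowledge state."""
    total = len(student_states)
    def width(state):
        count = sum(1 for s in student_states if s == state)
        return round(node_width * count / total)
    return {
        "green":  width("mastered"),
        "yellow": width("inconclusive"),
        "red":    width("not_yet_learned"),
    }

# A class of 20 students: 10 mastered, 5 inconclusive, 5 not yet learned.
states = ["mastered"] * 10 + ["inconclusive"] * 5 + ["not_yet_learned"] * 5
# band_widths(states) -> {'green': 50, 'yellow': 25, 'red': 25}
```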
As a specific example, consider the node 1220, which corresponds to learning target LT1. As shown in
As shown in
Each node in map 1202 may also include a fourth band of a different color (e.g., blue) which represents the fraction of the class not yet assessed relative to that learning target. Furthermore, the GR screen 1200 may be constructed and arranged so as to automatically zoom and navigate the view so that the GR screen will be primarily displaying any nodes that were most recently assessed for that group. The display may also automatically zoom and navigate to display the most useful information for the viewer (e.g., the point on the map at which more than 25% of the students have not mastered nodes that are pre-cursors to nodes aligned to state standards).
The GR screen 1200, in one embodiment, features the same “S,” “I,” and “Q” icons and navigational methodologies that are found in the ISR screen 600.
Longitudinal Group Report (LGR)
Referring back to
In step 462, reporting system 100 displays on the user's display a graphical user interface screen that includes at least a segment of a learning map. In one embodiment, the learning map segment displayed in step 462 includes, for each item on each selected assessment, a node corresponding to a learning target that is associated with the item. For example, if one of the selected assessments includes an item (item1), and item1 is associated with learning target LT1, and another of the selected assessments includes an item (e.g., item77), and item77 is associated with learning target LT77, then the segment of the learning map that gets displayed in step 462 includes a node corresponding to LT1 and a node corresponding to LT77.
To illustrate the above feature,
Like the GR screen 1200, the LGR screen 1400 represents the assessment results of any target group (Mrs. Johnson's third grade class, for example) on any learning strand (mathematics computation, for example) by displaying the color-coded horizontal bands (referred to and explained above) in nodes of the learning map for the strand. The LGR screen 1400, in one embodiment, provides substantially the same viewing and navigational options as the IDL screen 1000. For example, a user can again zoom in and out and navigate through the map to choose other views. The user can also select a “Play” button 1490 on the LGR screen 1400 in order to display group test results in a timed sequence and observe the group's progress as nodal bands change color and width.
In one embodiment, if desired, the navigation may automatically shift up, down, left, or right while the report is playing. The zoom level may be automatically controlled during the play with or without the automatic navigation control.
The user can also select any one of the particular assessments and view the GR screen for that given assessment. For example, in one embodiment, a “timeline” 1454 is displayed on the LGR screen 1400. Indicia are displayed on the timeline 1454, and each is associated with one of the assessments selected in step 460. Accordingly, to select any one of the particular assessments, the user may select one of the indicia. In response, system 100 displays a GR screen corresponding to the assessment associated with the selected indicium and corresponding to the group of students selected in step 460.
In some embodiments, the heavy outline around nodes associated with a standard or goal may change in appearance when all learners have achieved the standard or goal. The LGR screen 1400 features the same “S,” “I,” and “Q” icons and navigational methodologies that are found in the ISR screen 600 for the viewer's reference. As with the GR screen 1200, the teacher can select a colored band in a node to bring up the associated student lists.
Conclusion
It will be readily apparent that the various processes and algorithms described herein may be implemented by, e.g., appropriately programmed general purpose computers and computing devices. Typically a processor (e.g., a microprocessor) will receive instructions from a memory or like device, and execute those instructions, thereby performing a process defined by those instructions. Further, programs that implement such methods and algorithms may be stored and transmitted using a variety of known media.
While various embodiments/variations of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
Additionally, while the processes described above and illustrated in the drawings are shown as a sequence of steps, this was done solely for the sake of illustration. Accordingly, it is contemplated that some steps may be added and other steps omitted, and the order of the steps may be re-arranged. In other words, any sequence or order of steps that may be described does not necessarily indicate a requirement that the steps be performed in that order. The steps of processes described herein may be performed in any order practical. Further, some steps may be performed simultaneously.
This application claims the benefit of U.S. Provisional Patent Application No. 60/572,970, filed on May 21, 2004, the contents of which are incorporated herein. This application is related to co-pending U.S. patent application Ser. Nos. 10/777,212, filed on Feb. 13, 2004, and 10/644,061, filed on Aug. 20, 2003. The contents of the above identified applications are incorporated herein.
Number | Date | Country
---|---|---
60572970 | May 2004 | US |