The present invention relates to a system that supports (1) the development and administration of instructional materials and/or programs, (2) the development, administration, scoring and reporting of testing materials and/or programs, and/or (3) the integration of instruction materials with test materials and/or programs.
Aspects of the present invention provide a system that supports (1) the development and administration of instructional materials and/or programs, (2) the development, administration, scoring, and reporting of assessment materials (e.g., formative, diagnostic, summative, etc.) and/or programs, and/or (3) the integration of instruction with assessment materials and/or programs. The system achieves this by organizing the content (the knowledge taught and assessed) in these materials and programs into learning targets and ordering the learning targets to reflect the pre-cursor/post-cursor relationships among them. Learning targets are the correct conceptions or misconceptions that are part of any learning path, decomposed into the smallest units that are useful for educational purposes, and further decomposed and defined by the level of expertise with which these units are understood and applied.
As used herein, a “target indicator” corresponds to a learning target to be taught or assessed. Pre-cursor indicators relate to the knowledge that the student should have prior to being taught the target (target indicator). Post-cursor indicators relate to knowledge that the student should be able to acquire more readily after learning the target (target indicator).
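The pre-cursor/post-cursor ordering among learning targets described above can be modeled as a directed graph in which an edge runs from each pre-cursor to its target. The following Python sketch is purely illustrative; the class, method, and target names are hypothetical and are not part of the claimed system:

```python
# Illustrative sketch: learning targets ordered by pre-cursor/post-cursor
# relationships, modeled as a directed graph. All names are invented
# examples, not taken from the specification.

class LearningTargetGraph:
    def __init__(self):
        # maps each learning target id to the ids of its direct pre-cursors
        self.precursors = {}

    def add_edge(self, precursor_id, target_id):
        """Record that precursor_id should be learned before target_id."""
        self.precursors.setdefault(target_id, set()).add(precursor_id)
        self.precursors.setdefault(precursor_id, set())

    def precursors_of(self, target_id):
        """Knowledge the student should have before the target is taught."""
        return sorted(self.precursors.get(target_id, set()))

    def postcursors_of(self, target_id):
        """Targets learned more readily after this target: any target
        that lists target_id among its pre-cursors."""
        return sorted(t for t, pres in self.precursors.items()
                      if target_id in pres)

g = LearningTargetGraph()
g.add_edge("LT1", "LT2")   # LT1 must precede LT2
g.add_edge("LT2", "LT3")   # LT2 must precede LT3
print(g.precursors_of("LT2"))    # ['LT1']
print(g.postcursors_of("LT2"))   # ['LT3']
```

With such a structure, selecting a target indicator immediately yields its pre-cursor and post-cursor indicators for instruction or assessment.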
Advantageously, in some embodiments, the system links each defined learning target with other entities associated with the learning target (these entities may include but are not limited to any or all of the following: items or parts of items, item statistics, instructional materials, research on misconceptions, teaching strategies, time-to-learn data, data associated with special populations, matching content descriptions and/or location in any other curriculum, instruction, assessment taxonomy or framework, etc.).
Further, the system may employ a data model and methods that permit access to the learning targets, learning target ordering, linked entities, and learning sequence information for efficient use in: the development and administration of instructional materials and/or programs; and/or the development, administration, scoring, and reporting of assessment materials and/or programs; and/or the integration of instructional and assessment materials.
Additionally, the system may provide recommendations as to collections of learning targets and/or collections of linked entities for specific purposes based on user preferences or circumstances.
In one particular aspect, the present invention provides a system for designing academic achievement tests using a database of test items and software that (i) allows a user to select a major academic area (e.g., science) and a topic or growth strand (e.g., laws of motion) within that academic area, (ii) retrieves test items from the database that relate to that growth strand, and (iii) displays the test items within a matrix having rows representing learning targets (e.g., pre-cursor indicators, target indicators, and post-cursor indicators) and columns representing depth of knowledge (e.g., routine, comprehension, application, exploration). One or more test items can be displayed within a cell of the matrix depending upon the learning target and depth of knowledge they demonstrate.
The system can be set to display items from a recommended test, a previously defined test, or all relevant test items within the database that relate to the selected target and its pre-cursors and post-cursors. The user can add or remove test items by clicking on individual test items. The user can also view test items by double clicking on individual test items. The test definition can be saved once the user has completed the process of selecting test items for the test.
Another aspect of the system allows the user to see how a state's or textbook's performance indicators defined within a selected content area (e.g., science) at a selected education level (e.g., elementary) map onto the system's performance indicators defined within the selected content area at the selected education level, and vice-versa.
Advantageously, the system further may be configured to enable the user to view performance reports, test items, and students' responses to a set of test items.
The above and other features and advantages of the present invention, as well as the structure and operation of preferred embodiments of the present invention, are described in detail below with reference to the accompanying drawings.
The accompanying drawings, which are incorporated herein and form part of the specification, illustrate various embodiments of the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
In one aspect, the present invention provides a system 100 (see
I. System Overview
As shown in
System 100 also includes an information processing system 104 having software 106 stored therein and/or accessible thereto. Information processing system 104 may include one or more general and/or special purpose computers. If more than one computer is used to implement processing system 104, the two or more computers need not be co-located. If they are not co-located, then, preferably, a network (e.g., the Internet or other network) is used to enable the two or more computers to communicate with each other.
Software 106 may include one or more computer programs (e.g., web servers, web browsers and other computer programs), scripts, markup language documents (e.g., HTML, XML, etc.), routines, and/or other mechanisms for controlling processing system 104 and/or user interface 110.
Coupled to processing system 104 is a user interface system 110. User interface system 110 may be directly connected to processing system 104 or indirectly coupled to processing system 104 through, for example, a local or wide area network. User interface system 110 may include one or more information input and/or output devices, such as, for example, a monitor, keyboard, mouse, microphone, speaker, or other information input/output device.
II. Organization of the Item Bank
Referring back to item bank 101, the items within item bank 101 are subdivided into major academic content areas, for example, science, math, and geography. This is illustrated in
As shown in
Each learning target (LT) 301 has one or more depths of knowledge (DOK) associated therewith.
For example, learning target 301(a), which is labeled “earth is shaped like a ball,” has three depths of knowledge: DOK1 (a.k.a., “routine”), DOK2 (a.k.a., “comprehension”) and DOK3 (a.k.a., “application”). Similarly, learning target 301(b) (LT2) also has three depths of knowledge. Advantageously, each DOK associated with a learning target (LT) is also associated with one or more items within item bank 101. This is illustrated in
As shown in
Additionally, information relating to the pre-cursor and post-cursor relationships between LTs may also be stored in a database. As shown in
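The organization described in this section (content areas containing growth strands, growth strands containing learning targets, each learning target having one or more depths of knowledge, each LT/DOK pair linked to items, plus a table of pre-cursor relationships) lends itself to a relational data model. The following minimal sketch uses Python's built-in sqlite3 module; all table and column names are hypothetical, invented for illustration only:

```python
import sqlite3

# Illustrative relational schema for the item bank; table and column
# names are invented, not taken from the specification.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE learning_target (
    lt_id TEXT PRIMARY KEY,
    strand TEXT,          -- growth strand, e.g. 'laws of motion'
    content_area TEXT     -- major academic area, e.g. 'science'
);
CREATE TABLE item (
    item_id TEXT PRIMARY KEY,
    lt_id TEXT REFERENCES learning_target(lt_id),
    dok TEXT              -- depth of knowledge: routine, comprehension, ...
);
CREATE TABLE precursor (  -- pre-cursor/post-cursor ordering between LTs
    pre_id TEXT REFERENCES learning_target(lt_id),
    post_id TEXT REFERENCES learning_target(lt_id)
);
""")
con.execute("INSERT INTO learning_target VALUES ('LT1', 'earth in space', 'science')")
con.execute("INSERT INTO item VALUES ('Q1', 'LT1', 'routine')")

# All items associated with a given LT/DOK pair:
rows = con.execute(
    "SELECT item_id FROM item WHERE lt_id = ? AND dok = ?",
    ("LT1", "routine"),
).fetchall()
print(rows)  # [('Q1',)]
```

Querying the `precursor` table in both directions would then yield the pre-cursor and post-cursor indicators of any selected target.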
III. System Recommended Tests
As discussed above, each content area (e.g., science) includes a set of pre-defined learning targets, which may be grouped into growth strands, as shown in
In addition to associating items and tests with a learning target, the system 100 can associate other entities with a learning target by storing entity information in a database and associating that information with the learning target. Such other entities may include: item parts, instructional material, item statistics, research on misconceptions, teaching strategies, time-to-learn data, data associated with special populations, matching content descriptions and/or locations in any other curriculum, instruction, or assessment taxonomy or framework, etc.
IV. Selecting, Modifying and Creating Tests
Software 106 of system 100 enables a user to select, modify and create academic achievement tests.
Process 600 may begin in step 602. In step 602, software 106 enables the user to select a content area (e.g., science). For example, software 106 may display a list of the available content areas and allow the user to select one of the content areas from the list. In step 604, software 106 enables the user to select a growth strand within the selected content area. In step 606, after the user selects a growth strand, software 106 requests the user to select one of the learning targets from the selected growth strand. The selected learning target is referred to as the target indicator.
After the user selects a target indicator, software 106 determines pre-cursors and post-cursors of the selected target indicator (step 608). As discussed above, this information may be contained in a database, in which case, software 106 may access the database to determine pre-cursors and post-cursors of the selected target.
Next (step 610), software 106 determines the set of test items that are included in the default recommended test associated with the target indicator. This information may also be contained in a database, in which case, software 106 may access the database to determine which of the recommended tests is the default and the items that are included in the default recommended test.
Any one of the recommended tests associated with the target indicator may be the default recommended test. In one embodiment, the recommended pre-test is the default recommended test. Thus, in this embodiment, in step 610, software 106 determines all of the test items that are included in the recommended pre-test associated with the target indicator.
Next (step 612), software 106 displays to the user a user interface screen that displays an item identifier for each item determined in step 610, as well as displaying the target indicator and pre/post-cursors of the target indicator. Preferably, the item identifiers are presented to the user in an organized fashion that makes it easy for the user to quickly determine the LT/DOK pair with which any particular item from the recommended test is associated.
In one embodiment, software 106 displays the item identifiers in table form.
Each row of table 702 corresponds to one of the target indicator, a pre-cursor, or a post-cursor. In the example shown in
Each column of table 702 corresponds to a different depth of knowledge. In the example shown in
For example, if one of the items determined in step 610 is associated with one particular LT/DOK pair, then the identifier for the item will be included in the cell of table 702 that is in the row corresponding to the LT and the column corresponding to the DOK. Thus, a user of system 100 can quickly and easily determine the LT/DOK pair with which an item is associated. For example, simply by reviewing table 702, it is clear that item Q25 is associated with the target indicator at the “application” depth of knowledge (DOK). As another example, it should be clear that items Q1 and Q2 are associated with the “earth is shaped like a ball” LT and the “Routine” DOK.
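The placement rule just described, in which each item identifier lands in the cell indexed by its learning target (row) and depth of knowledge (column), can be sketched as follows. The item data below is invented to mirror the examples in the text (Q25 at target/application; Q1 and Q2 at the pre-cursor's routine level):

```python
# Illustrative sketch of populating the LT x DOK matrix (table 702).
# Row/column labels and item assignments are invented examples.
ROWS = ["earth is shaped like a ball", "target indicator", "post-cursor"]
COLS = ["Routine", "Comprehension", "Application"]

# each item determined in step 610: (identifier, learning target, DOK)
items = [
    ("Q1",  "earth is shaped like a ball", "Routine"),
    ("Q2",  "earth is shaped like a ball", "Routine"),
    ("Q25", "target indicator",            "Application"),
]

# cell (LT, DOK) -> list of item identifiers displayed in that cell
matrix = {(lt, dok): [] for lt in ROWS for dok in COLS}
for item_id, lt, dok in items:
    matrix[(lt, dok)].append(item_id)

print(matrix[("target indicator", "Application")])        # ['Q25']
print(matrix[("earth is shaped like a ball", "Routine")]) # ['Q1', 'Q2']
```

Rendering each cell's list in a table then reproduces the at-a-glance association between items and LT/DOK pairs.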
Preferably, the row of the matrix containing the target indicator has a different background color than the other rows in the matrix. Similarly, it is preferable that the rows associated with the pre-cursors and the rows associated with the post-cursors have unique colors. For example, the post-cursor row(s) could be colored blue, the target indicator row could be colored yellow, and the pre-cursor row(s) could be colored red. In this manner, a user can more easily distinguish among the target indicator, post-cursors, and pre-cursors.
In addition to including table 702, user interface screen 700 may include buttons, checkboxes and the like for enabling the user to change what is displayed in table 702. For example, screen 700 includes radio-buttons 704 that enable the user to select a different test type than the one currently being displayed. If, for example, the user selects the button associated with “Unit Test,” then software 106 will display in table 702 the item identifiers that identify the items included in the recommended Unit Test. Similarly, if the user selects the button associated with “Pre-Test,” then software 106 will display in table 702 the items that are included in the recommended Pre-Test. In this manner, the user can review the items that make up the recommended Unit Test, recommended Pre-Test and recommended Post-Test.
Screen 700 may also enable the user to view all items associated with the currently displayed target indicator, pre-cursor indicator and post-cursor indicator. Additionally, screen 700 may enable the user to view any previously defined test.
To view all items associated with the currently displayed target indicator, pre-cursor indicators and post-cursor indicators, the user need only click on the “All items” button 710. In response to the user clicking on button 710, software 106 determines all of the items associated with the target indicator and each pre-cursor and post-cursor indicator displayed on screen 700. As discussed above, a database may be used to store information that enables software 106 to make this determination. After determining the items associated with the target indicator and each pre-cursor and post-cursor indicator, software 106 displays in the appropriate cells of table 702 the identifiers that identify the determined items. Thus, an identifier that identifies an item associated with a particular LT/DOK pair will be positioned in the cell of table 702 that corresponds to that particular LT/DOK pair.
To view the items associated with a previously defined test, the user need only click on the “Test Set” button 712 and select the previously defined test using pull-down menu 716. In response to the user clicking on button 712 and selecting a previously defined test, software 106 determines all of the items associated with the previously defined test. A database may be used to store information that enables software 106 to make this determination. After determining the items associated with the selected previously defined test, software 106 displays in the appropriate cells of table 702 the identifiers that identify the determined items.
Advantageously, in at least one embodiment, software 106 enables the user to modify an existing test and save the modified test. To modify an existing test (e.g., the recommended Unit Test or a previously defined test), the user first selects the test and then adds and/or removes item identifiers from table 702. Once the user is finished modifying the test, the user can click on the “Save Test” button 714 to save the modified test. Clicking on the “Reset” button 716 causes software 106 to put table 702 back into the state it was in before the user began modifying the test. The user can also create a test from scratch by clicking on the “Create Test” button 718 and then adding item identifiers to table 702.
Additionally, software 106 provides a user with easy access to view individual items in the selected test by simply clicking on the identifier of the item the user desires to view.
In response to the user clicking on an item identifier, software 106 displays to the user the item associated with that identifier. For example, if the user double-clicks on item identifier “Q21,” then software 106 displays an item view screen 3100 (see
As shown in
The indicator and depth of knowledge for each item may be shown as a tooltip on mouse-over of each item number in navigation bar 3102. Similarly, the items listed in navigation bar 3102 can be color-coded to identify the LT/DOK pair with which each item is associated. For example, the buttons in navigation bar 3102 associated with items 1-20 may have one color (e.g., blue), thereby identifying items 1-20 as pre-cursor items, the buttons in navigation bar 3102 associated with items 21-25 may have another color (e.g., gold), thereby identifying items 21-25 as target items, and the buttons in navigation bar 3102 associated with items 26-30 may be still another color (e.g., purple), thereby identifying items 26-30 as post-cursor items.
V. The Standards Alignment Matrix
Another feature of system 100 is that it enables the user to analyze an alignment between any set of curriculum standards or instructional materials and tests designed to assess achievement of those standards or measure learning progress in those instructional materials. This capability is based on prior cross-coding of all the materials/publications/frameworks involved in the alignment study.
More specifically, in one embodiment, a feature of system 100 is that it enables the user to see how a state's or textbook's performance indicators within a selected content area (e.g., science) at a selected education level (e.g., elementary) map onto system 100's performance indicators within the selected content area at the selected education level, and vice-versa.
In one embodiment, information that provides a mapping between system 100's performance indicators and at least one state's and/or textbook's performance indicators is stored in system 100. Preferably, the information is stored in a relational database.
As shown in
System 100 provides a user interface screen that the user can interact with to view and analyze an alignment between system 100's performance indicators and a state's and/or textbook's performance indicators. User interface screen 900 (see
Referring now to
Referring now to
In step 1006, system 100 determines system 100's performance indicators for the selected content area at the selected education level. This information is preferably stored in a database. Next (step 1008), system 100 displays the determined system performance indicators in a first column 921 of an alignment matrix table 920.
Next (step 1010), system 100 determines whether the user selected a state from pull-down menu 906. If the user selected a state, control passes to step 1012, otherwise control passes to step 1020.
In step 1012, for each system 100 performance indicator included in column 921 of table 920, system 100 uses information stored in the database (e.g., the information from a table like table 810) to determine the selected state's performance indicators that correspond to the system 100 performance indicator (if any) and displays the determined state performance indicators in a column of table 920 (e.g., column 922) in the row corresponding to the system 100 performance indicator. After step 1012, the process continues to step 1020.
In step 1020, system 100 determines whether the user selected a textbook from pull-down menu 908. If the user selected a textbook, control passes to step 1022, otherwise control passes to step 1090, where system 100 waits for the user to change a selection or exit the screen.
In step 1022, for each system 100 performance indicator included in column 921 of table 920, system 100 uses information stored in the database to determine the selected textbook's performance indicators that correspond to the system 100 performance indicator (if any) and displays the determined textbook performance indicators in a column of table 920 (e.g., column 922) in the row corresponding to the system 100 performance indicator. After step 1022, the process continues to step 1090.
In step 1046, system 100 determines the selected state's performance indicators for the selected content area at the selected education level. Next (step 1048), system 100 displays the determined state performance indicators in a first column 921 of an alignment matrix table 920.
Next (step 1050), for each state performance indicator included in column 921 of table 920, system 100 uses information stored in the database to determine the system 100 performance indicators that correspond to the state performance indicator (if any) and displays the determined system performance indicators in a column of table 920 (e.g., column 922) in the row corresponding to the state performance indicator. After step 1050, the process continues to step 1060.
In step 1060, system 100 determines whether the user selected a textbook from pull-down menu 908. If the user selected a textbook, control passes to step 1062, otherwise control passes to step 1090.
In step 1062, for each state performance indicator included in column 921 of table 920, system 100 uses information stored in the database to determine the selected textbook's performance indicators that correspond to the state performance indicator (if any) and displays the determined textbook performance indicators in a column of table 920 (e.g., column 922) in the row corresponding to the state performance indicator. After step 1062, the process continues to step 1090.
In step 1076, system 100 determines the selected textbook's performance indicators for the selected content area at the selected education level. Next (step 1078), system 100 displays the determined textbook performance indicators in a first column 921 of an alignment matrix table 920.
Next (step 1080), for each textbook performance indicator included in column 921 of table 920, system 100 uses information stored in the database to determine the system 100 performance indicators that correspond to the textbook performance indicator (if any) and displays the determined system performance indicators in a column of table 920 (e.g., column 922) in the row corresponding to the textbook performance indicator. After step 1080, the process continues to step 1082.
In step 1082, system 100 determines whether the user selected a state from pull-down menu 906. If the user selected a state, control passes to step 1084, otherwise control passes to step 1090.
In step 1084, for each textbook performance indicator included in column 921 of table 920, system 100 uses information stored in the database to determine the selected state's performance indicators that correspond to the textbook performance indicator (if any) and displays the determined state performance indicators in a column of table 920 (e.g., column 922) in the row corresponding to the textbook performance indicator. After step 1084, the process continues to step 1090.
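Steps 1012, 1022, 1050, 1062, 1080, and 1084 all repeat one core operation: for each indicator in the first column of table 920, look up the corresponding indicators (if any) from another framework and display them in the same row. A condensed sketch of that operation follows; the mapping data and indicator names are invented for illustration:

```python
# Illustrative sketch of populating one column of the alignment matrix
# (the common operation behind steps 1012/1022 and their counterparts).
# Indicator names and the mapping itself are invented examples.

# correspondence between system indicators and another framework's indicators
state_map = {
    "SYS-3.1": ["CA-ES-6.a"],
    "SYS-3.2": ["CA-ES-6.b", "CA-ES-6.c"],
    "SYS-3.3": [],   # no corresponding state indicator
}

def align_column(first_column, mapping):
    """For each indicator in the first column, return the row of
    corresponding indicators (an empty list when none exist)."""
    return [(ind, mapping.get(ind, [])) for ind in first_column]

rows = align_column(["SYS-3.1", "SYS-3.2", "SYS-3.3"], state_map)
for ind, matches in rows:
    print(ind, "->", matches)
# SYS-3.1 -> ['CA-ES-6.a']
# SYS-3.2 -> ['CA-ES-6.b', 'CA-ES-6.c']
# SYS-3.3 -> []
```

Repeating `align_column` once per additional framework (state, textbook) fills the remaining columns of the matrix, whichever framework occupies the first column.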
To give one example of how the above-described alignment feature of system 100 can be used, assume a testing company (e.g., CTB McGraw-Hill, the assignee of the present invention) wants to build a middle school science test for the state of California. By selecting from the appropriate menus and buttons at the left side of screen 900, the user can display in an alignment matrix the performance indicators in the testing company's framework that correspond to the performance indicators associated with each California standard (in this case Focus on Earth Science and Physical Science, Standard Various).
The alignment matrix 920 supports: (1) the selection of secure items (from system 100's item bank), aligned with a state's indicators, for use in a test that is being developed; (2) the selection of sample items (from system 100's item bank) that accurately target the selected state's indicators at the intended depths of knowledge, for use by item writers and test content editors as models for item writing and editing to ensure that newly developed test items are closely aligned with the state's standards; (3) clarification of the item specifications to guide writers in developing items that accurately assess achievement of the state's standards; and (4) discussion about the scope and relative importance of each selected state performance indicator in support of the development of a test blueprint.
VI. Performance Reporting
System 100 may be programmed to provide performance reports.
As represented in
As shown in
More specifically, a locator question is typically a short question that takes the student a short time to complete and gives system 100 an idea of where to start testing the student in terms of stages of learning and depths of knowledge for a given section of content. The locator question is used to locate a starting item in a set of items, to select from a set of predefined item sets, or to increase/decrease the number of items presented to the student from one or more item sets. Use of locator questions may reduce the amount of testing time for each particular student, adapt the assessment to one or more subsets of the content for each particular student, or both.
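The routing role of a locator question can be sketched as follows: the student's locator result selects which pre-defined item set to administer first, so that levels the locator suggests are unnecessary can be skipped. The thresholds, item sets, and function name below are invented for illustration and are not specified in the text:

```python
# Illustrative sketch of locator-question routing. The score thresholds
# and item-set contents are invented examples, not part of the system.

item_sets = {
    "pre-cursor":  ["Q1", "Q2", "Q3", "Q4"],
    "target":      ["Q21", "Q22", "Q23"],
    "post-cursor": ["Q26", "Q27"],
}

def route_from_locator(locator_score):
    """Map a locator score (0.0-1.0) to a starting item set, reducing
    testing time by skipping levels the locator suggests are unneeded."""
    if locator_score < 0.5:
        return item_sets["pre-cursor"]    # start at the pre-cursor level
    elif locator_score < 0.8:
        return item_sets["target"]        # start at the target indicator
    return item_sets["post-cursor"]       # skip ahead to post-cursors

print(route_from_locator(0.9))  # ['Q26', 'Q27']
```

The same idea extends to increasing or decreasing the number of items drawn from each set, rather than choosing among whole sets.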
Although locator questions are used for adaptive testing, the comprehensive, detailed framework using stages of learning and depths of knowledge may be used to characterize content for any assessment type, including formative, summative, benchmark, diagnostic, high-stakes, low-stakes, homework, etc. (any or all of which may, or may not, be adaptive). This framework can be used to create a profile of the student, which can be updated based on any or all of the assessments given to an individual student, so long as the content of those assessments is developed with respect to the same framework, or developed independently of it and then matched to it.
The user has access to the student's responses to each question (except for “high-stakes” questions) by clicking on the desired question number displayed in table 1300. Color coding also indicates either that the student was not tested or that testing of the indicator at that depth of knowledge is not relevant. The user can print the report or publish it to the web (the web site may be an intranet site or a protected Internet site).
System 100 may provide a feature that enables the user to view a student's response to each item in an assessment previously given to the student. Additionally, system 100 enables the user to not only see the student's response to the items, but also the items themselves. This feature is illustrated in
As shown in
Viewing of responses to high-stakes items may be blocked except for an indication of correct, incorrect, or partially correct (see
While various embodiments/variations of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
This application claims the benefit of U.S. Provisional Patent Application No. 60/404,394, filed on Aug. 20, 2002, the contents of which are hereby incorporated herein by reference. This application is related to U.S. Provisional Patent Application No. 60/449,827, filed on Feb. 26, 2003 and U.S. Provisional Patent Application No. 60/447,300, filed on Feb. 14, 2003. The contents of both of these provisional patent applications are hereby incorporated herein by reference.