This disclosure relates generally to method and apparatus for defining fields for scanned documents, and more particularly to method and apparatus for defining fields in standardized forms.
One method for evaluating the knowledge or skill of a person includes the use of standardized tests. For example, standardized tests are used to monitor the academic progress of students. Some standardized tests incorporate objective test questions that are answered by filling in an appropriate oval in an associated bubble-type answer sheet using a pencil. Some standardized tests include open-ended questions or tests. Examples of open-ended questions in these tests include essay questions, fill-in-the-blank questions, fill-in-the-map questions, math questions, and motor skill tests in which a test taker draws geometric shapes. Open-ended questions or tests, also referred to herein as scoring tasks or test items, are evaluated and scored by a human reviewer, also referred to herein as a reader.
Standardized tests are distributed from test processing centers to test-taking centers where the tests are administered to test-takers using standardized forms. One example of a test-taker is a student. Another example of a test-taker is a student who is taking home-based assessments.
In order to promote the integrity and accuracy of the administered tests and test results, it is desirable to properly define sections of the test when scanning the completed test, so that objective test items can be properly scored by machine and open-ended test items can be properly scored by readers.
Another practice by which information is obtained is a survey. Surveys are used to obtain information from a person, such as a consumer of goods or services. Surveys also use forms having standardized questions. Large amounts of data can be compiled from surveys using standardized forms. Thus, there is a need for accurate collection of data from the standardized forms.
Improved method and apparatus for defining fields in standardized forms and for imaging are needed.
The above-mentioned problems and others not expressly discussed herein are addressed by the present subject matter and will be understood by reading and studying this specification.
Disclosed herein, among other things, are method and apparatus for identifying fields for scanned documents. According to an embodiment of a method, at least one page of a standardized exam is converted into a raster image. One or more bubble response fields are automatically identified on the raster image. A location of one or more bubble response fields is stored in a database. According to various embodiments, a portion of the raster image is defined that includes one or more bubble response fields. A size of the bubbles of the one or more bubble response fields is selected, and one or more bubbles of the size selected are automatically detected. A position of each bubble detected is determined and stored, according to various embodiments.
Another aspect of this disclosure relates to a method for image field definition. According to an embodiment, the method includes converting a scanned image from a first format to a second format and automatically searching the image for one or more bubble response fields on the image. The one or more bubble response fields are identified and a position of each bubble is stored in a database.
Another aspect of this disclosure relates to a system for image field definition. According to an embodiment, the system includes means for automatically converting at least one page of a standardized exam into a raster image. The system also includes means for identifying one or more bubble response fields on the raster image. According to an embodiment, the system further includes means for storing a location of one or more bubble response fields in a database.
This Summary is an overview of some of the teachings of the present application and not intended to be an exclusive or exhaustive treatment of the present subject matter. Further details about the present subject matter are found in the detailed description and appended claims. The scope of the present invention is defined by the appended claims and their legal equivalents.
The following detailed description of the present subject matter refers to subject matter in the accompanying drawings which show, by way of illustration, specific aspects and embodiments in which the present subject matter may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the present subject matter. References to “an”, “one”, or “various” embodiments in this disclosure are not necessarily to the same embodiment, and such references contemplate more than one embodiment. The following detailed description is demonstrative and not to be taken in a limiting sense. The scope of the present subject matter is defined by the appended claims, along with the full scope of legal equivalents to which such claims are entitled.
The purpose of the image operations system is to convert educational assessments and/or completed survey forms from a paper format to an electronic format. After response documents (responding to the educational assessment or survey) are scanned and proofs are generated in PDF format, the imaging system's image field definition application receives the PDF files and (1) converts them to a format for use in the imaging system, (2) identifies bubble response fields, and (3) identifies constructed response fields.
The present subject matter generally relates to apparatus and method for defining fields for scanned documents, and more particularly to apparatus and method for defining fields in standardized test imaging. The image field definition (IFD) application is used to identify the location of bubbles (for objective test question responses) and constructed (or open-ended) response items in test booklets. This defines the fields that the scorers (both human and machine) will receive. This application is also used to set editing rules for each project and to convert vector images (such as PDF) into a raster image format (such as TIFF) for use in an imaging system. A bubble refers to any two-dimensionally closed form adapted to be filled in with a writing utensil, such as but not limited to: a circle, an ellipse, a square, and a rectangle. Bubbles with other shapes and sizes can be used without departing from the scope of this disclosure.
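By way of illustration only, the vector-to-raster conversion step could be sketched as follows. This is a minimal sketch assuming the open-source pdf2image library (a Poppler wrapper); the file paths, resolution, and output naming are illustrative and are not taken from the disclosure.

```python
# Minimal sketch of the PDF-to-TIFF rasterization step, assuming the
# third-party pdf2image library (a Poppler wrapper) is available.
from pdf2image import convert_from_path

def rasterize_booklet(pdf_path: str, dpi: int = 300) -> list[str]:
    """Convert each page of a vector PDF proof into a raster TIFF master image."""
    pages = convert_from_path(pdf_path, dpi=dpi)       # one PIL image per page
    tiff_paths = []
    for number, page in enumerate(pages, start=1):
        out_path = f"{pdf_path}.page{number:03d}.tif"  # assumed naming scheme
        page.save(out_path, format="TIFF")
        tiff_paths.append(out_path)
    return tiff_paths

# Illustrative use: master_images = rasterize_booklet("grade5_math_proof.pdf")
```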
In order to successfully scan documents via the imaging operations system, a method of defining fields for scanner reading and target interpretation is provided. Both ‘bubble’ targets (multiple choice or demographic fields) and constructed response items must be defined. The imaging system locates page reference marks (T-marks) within a defined search area, processes gray-scale images containing bubbled fields, applies mark levels and mark discrimination parameters, establishes editing rules, associates images with pages, scans and defines constructed response fields for importation into the hand scoring client, and establishes constructed response scoring rules. Documents scanned by the system comply with standard layout specifications. A master image is generated in a usable format for each page of a scanned document and made accessible for defining the bubbled and/or constructed response fields that reside on the page. The conversion from PDF to TIFF uses reference marks on the image, such as “T” marks, to define relative locations on the image, in an embodiment. In the event that a PDF file is not provided for the material, the actual documents can be scanned on a scanner and imported into this application using the scanned image file as the imported image.
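One way to locate a “T” reference mark within a defined search area is ordinary template matching; the following sketch assumes OpenCV, a grayscale page image, and a small template image of the mark, none of which are mandated by the disclosure.

```python
# Sketch: locate a "T" reference mark inside a defined search area of a
# grayscale page image; assumes OpenCV and a small template image of the mark.
import cv2

def find_t_mark(page_path: str, template_path: str,
                search_box: tuple[int, int, int, int],
                min_score: float = 0.8) -> tuple[int, int] | None:
    page = cv2.imread(page_path, cv2.IMREAD_GRAYSCALE)
    mark = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
    x, y, w, h = search_box                        # defined search area on the page
    window = page[y:y + h, x:x + w]
    scores = cv2.matchTemplate(window, mark, cv2.TM_CCOEFF_NORMED)
    _, best, _, best_loc = cv2.minMaxLoc(scores)
    if best < min_score:                           # no acceptable match in the window
        return None
    return (x + best_loc[0], y + best_loc[1])      # mark position in page coordinates

# Illustrative use:
# origin = find_t_mark("page001.tif", "t_mark.png", search_box=(0, 0, 200, 200))
```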
Bubble response fields are automatically identified using a search algorithm. A search is performed to identify a pre-defined shape (bubble). A first location is searched and, if unsuccessful, another location adjacent the first location (offset by a predefined number of pixels) is searched.
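The stepping behavior of this search can be sketched as a loop that tests a starting position and, on failure, advances by a fixed pixel offset to adjacent candidate positions; the step size, bounds, and shape-matching predicate below are assumptions for illustration.

```python
# Sketch of the adjacent-location search: test a first position for the
# pre-defined bubble shape and, if unsuccessful, step a predefined number of
# pixels to neighboring positions. The shape test itself is supplied by the caller.
from typing import Callable

def step_search(start: tuple[int, int],
                shape_at: Callable[[int, int], bool],
                step_pixels: int = 4,
                max_steps: int = 50) -> tuple[int, int] | None:
    x0, y0 = start
    for step in range(max_steps + 1):
        offset = step * step_pixels
        # Try positions offset to the right, downward, and diagonally (offset 0 is the start).
        for x, y in ((x0 + offset, y0), (x0, y0 + offset), (x0 + offset, y0 + offset)):
            if shape_at(x, y):          # predicate: does the bubble shape match here?
                return (x, y)
    return None
```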
Defining a portion of the image includes allowing a user to draw a box around the field where the bubbles are to be identified, in various embodiments.
According to various embodiments, automatically searching the image includes using a pattern-matching search. In an embodiment, tolerances are defined for the pattern-matching search. Automatically searching the image includes searching the image for patterns of a shape and a size, in various embodiments. User input can be provided for defining the shape and the size. In one embodiment, converting a scanned image from a first format to a second format includes converting a scanned image from a portable document format (PDF). Converting a scanned image from a first format to a second format includes converting a scanned image from a tagged image file format (TIFF), in various embodiments. In various embodiments, the method further includes providing for a user input to define a number of bubbles within a bubble response field, automatically detecting one or more bubbles within the bubble response field, and verifying that the number of bubbles detected is equal to the number of bubbles defined.
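As one concrete illustration of the count check described in this paragraph, circular bubbles of roughly a user-selected size could be detected within the user-defined portion of the image and the detected count compared against the expected number. The OpenCV Hough-circle call and its tuning values below are illustrative assumptions rather than parameters of the disclosed application.

```python
# Sketch: detect circular bubbles of approximately a user-selected radius inside
# a user-defined region and verify the count; assumes OpenCV, illustrative tuning.
import cv2

def verify_bubble_count(page_path: str, region: tuple[int, int, int, int],
                        radius_px: int, expected_count: int,
                        tolerance_px: int = 2) -> bool:
    gray = cv2.imread(page_path, cv2.IMREAD_GRAYSCALE)
    x, y, w, h = region                             # the box the user drew around the field
    field = gray[y:y + h, x:x + w]
    circles = cv2.HoughCircles(field, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=radius_px * 2,
                               param1=100, param2=20,           # illustrative tuning
                               minRadius=radius_px - tolerance_px,
                               maxRadius=radius_px + tolerance_px)
    found = 0 if circles is None else circles.shape[1]
    return found == expected_count                  # mismatch flags the field for review

# Illustrative use:
# ok = verify_bubble_count("page001.tif", (100, 400, 600, 60), radius_px=8, expected_count=5)
```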
According to various embodiments, scoring information is stored for use in scoring a response to each of the one or more bubble response fields. The method further includes storing instructions as to how the response to each of the one or more bubble response fields is scored. A number of bubbles associated with each bubble response field can also be stored. Directions are provided to locate one or more bubbles associated with a response for a particular question. In one embodiment, a second section of an image is defined for constructed responses. The first section and the second section are reviewed to confirm proper definition, including converting both fields and storing images in a database.
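The per-field record described here (location, bubble count, answer key, and scoring instructions) could be held in a small relational table; the schema, column names, and example values below are a hedged sketch, not the database layout of the disclosed system.

```python
# Sketch of storing per-field scoring information; the table layout, column
# names, and example values are illustrative assumptions.
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS bubble_field (
    field_id     TEXT PRIMARY KEY,                 -- e.g. a question identifier
    page         INTEGER,                          -- page on which the field appears
    x INTEGER, y INTEGER, w INTEGER, h INTEGER,    -- field location on the raster image
    bubble_count INTEGER,                          -- number of bubbles in the field
    answer_key   TEXT,                             -- which bubble scores as correct
    scoring_rule TEXT                              -- e.g. 'single-response only'
)
"""

def save_field(db_path: str, record: tuple) -> None:
    with sqlite3.connect(db_path) as con:
        con.execute(SCHEMA)
        con.execute("INSERT OR REPLACE INTO bubble_field VALUES (?,?,?,?,?,?,?,?,?)",
                    record)

# Illustrative record: ("Q12", 3, 110, 420, 600, 40, 5, "C", "single-response only")
```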
Various embodiments include a method for defining constructed response locations. A constructed response is defined as a response that is not a bubble response. An example is a written or typed response to an essay or short-answer question on an examination. The system allows a user to define the area on the answer form that includes the constructed response. The defined area is then provided to a reader or grader for evaluating and scoring the content of the response. According to various embodiments, a user defines the field name and edit rules as with bubble responses.
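The hand-off described in this paragraph, in which a user-defined area is clipped out of the page and routed to a reader, can be sketched as a simple crop of the master image; the Pillow call and file naming below are illustrative assumptions.

```python
# Sketch: clip a user-defined constructed-response area out of a page image so
# it can be routed to a human reader; assumes Pillow, with illustrative naming.
from PIL import Image

def clip_constructed_response(page_path: str,
                              region: tuple[int, int, int, int],
                              item_id: str) -> str:
    x, y, w, h = region                          # the area the user drew around the response
    page = Image.open(page_path)
    clip = page.crop((x, y, x + w, y + h))       # (left, upper, right, lower)
    out_path = f"{item_id}_response.tif"         # assumed naming for the reader queue
    clip.save(out_path, format="TIFF")
    return out_path

# Illustrative use: clip_constructed_response("page004.tif", (80, 900, 1400, 600), "ESSAY_2")
```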
To view the Constructed Response screen, the user must first select a valid page or item from the IFD common navigation screen. Then, the user chooses a task, in various embodiments. The selected task dictates which Constructed Response screen is shown: Add Item—displays the Define Items tab with default field values; Edit Item—displays the Define Items tab with saved field values; View Item—displays the Define Items tab with saved field values, read only; Delete Item—deletes the item and does not show the Constructed Response screen; Setup Validity Items—displays the Setup Validity Items tab. Once the Constructed Response screen is displayed, closing the application directly or clicking another tab will prompt the user to save or cancel their changes (if the data is valid and has changed). The user may choose the Save command from the menu/toolbar at any time to save changes to the database. Changes will only be saved if the data is valid.
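The task-to-screen behavior listed above amounts to a small dispatch table; the sketch below models it as a dictionary, with the screen descriptors invented for illustration.

```python
# Sketch of the task-to-screen dispatch described above; the descriptors are
# illustrative and do not reflect the application's actual interface.
TASK_SCREENS = {
    "Add Item":             ("Define Items", {"values": "defaults", "read_only": False}),
    "Edit Item":            ("Define Items", {"values": "saved",    "read_only": False}),
    "View Item":            ("Define Items", {"values": "saved",    "read_only": True}),
    "Setup Validity Items": ("Setup Validity Items", {}),
}

def screen_for_task(task: str):
    if task == "Delete Item":        # deletes the item; no screen is shown
        return None
    return TASK_SCREENS.get(task)
```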
Completed test booklets are boxed, illustrated at 208, for shipping to a test-processing center 210. The boxes include an identifier 212, such as a bar code for example. Upon arriving at the test-processing center 210, the boxes of test booklets are unloaded at 214. The test booklets are removed from the boxes and sorted at 216. At 220, the test booklets are cut into loose pages. These loose pages are reconciled to ensure that all of the pages for each test booklet are accounted for. Reading devices 222, 224, and 226, such as bar code scanners for example, are used to read the identifiers 223 and identify the boxes, read the identifiers 225 and identify the test booklets, and read the identifiers and identify the pages. In one embodiment, the image field definition system identifies the identifying markings for the pages.
The test pages are graded or scored at 228. In one embodiment, objective scoring tasks, such as multiple choice questions for example, are scored using scoring of tests from images 230. In one embodiment, open-ended scoring tasks are scanned at scanning stations 232, are stored in a queue, and are distributed by a dealer 234 to human readers 235 who evaluate the open-ended scoring tasks. Reports 236 of the score results are provided at 237.
A server in the test-processing center is used to perform a variety of tasks with the scanned data, as discussed herein. In one embodiment, the server includes priority information; as illustrated via lines 238, 240, 242, 244 and 246, the priority information is available at various places along the process. In one embodiment, for example, the reading device(s) 222 determine which of the boxes should proceed for further processing before other boxes. In one embodiment, the reading device(s) 224 determine which of the test booklets should proceed for further processing before other test booklets. In one embodiment, the reading device(s) 226 determine which of the pages (or test items on the pages) should proceed for further processing before other pages (or test items on the pages). In one embodiment, for example, the priority information is used in the scoring system 228 to determine which test items should be scored before other test items. In one embodiment, for example, the priority information is used to determine which reports should be provided before other reports 236.
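The priority handling described here can be modeled as a priority queue over the work units (boxes, booklets, pages, or test items); the heapq sketch and the priority values in the example below are assumptions for illustration.

```python
# Sketch: process scanned work units (boxes, booklets, pages, or test items) in
# priority order; unit labels and priority values are illustrative.
import heapq

def process_in_priority_order(units: list[tuple[int, str]]) -> list[str]:
    """units holds (priority, unit_id) pairs; lower numbers are processed first."""
    heap = list(units)
    heapq.heapify(heap)
    order = []
    while heap:
        _, unit_id = heapq.heappop(heap)
        order.append(unit_id)        # here the unit would be scored or routed onward
    return order

# Illustrative use:
# process_in_priority_order([(2, "box-0012"), (1, "box-0007"), (3, "box-0020")])
# -> ["box-0007", "box-0012", "box-0020"]
```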
Specific embodiments have been illustrated and described herein; however, it is to be understood that the above description is intended to be illustrative, and not restrictive. The scope of the present subject matter should be determined with reference to the appended claims, along with the full scope of legal equivalents to which such claims are entitled.
This application claims the benefit of provisional U.S. patent application Ser. No. 60/981,739, filed on Oct. 22, 2007, which is hereby incorporated by reference in its entirety.