The present invention relates to the field of information retrieval and evaluation. In particular, the invention provides a web-based method and system for obtaining information about an organization and evaluating the organization against one or more standards.
A question often faced by those who provide financial support to charitable organizations is whether a particular organization is legitimate and/or is operating in an ethical and well-managed manner. The number of charities has more than tripled during the past 25 years, from roughly 300,000 in 1980 to over 1,000,000 organizations today. Increasingly, charities have also been in the public spotlight due to concerns raised about how they spend contributed funds, the integrity of their fund raising, and how well they are governed. Thus, when individual, corporate, or other donors contemplate contributing to an organization, they seek assurance that the organization conducts its operations in accordance with their expectations. Because it can be difficult for many donors to make this determination on their own, especially in light of the growing number of charities, they often seek help. An easily accessible evaluation of a charity's accountability would make it easier for donors to make informed giving decisions and contribute with confidence.
Prior methods of evaluating organizations included making personal contact with a staff member of the charity to request information and materials in order to complete an evaluative report based on a set of comprehensive charity accountability standards addressing charity governance, finances, and fund raising. Typically, an analyst would request a great deal of general documentation from an organization, including incorporation documents, bylaws, tax filings, budgets, fund raising appeals, public service announcement scripts, board roster, annual reports, and audited or unaudited financial data about the organization. An organization would then have to spend time and effort collecting the requested documentation, making copies and forwarding the materials. Once received, the analyst had to review the documentation to determine if the subject charity met specified charity accountability standards. Since document retention and maintenance differ from organization to organization, compiling the information necessary for the evaluation was often time-consuming for the subject charity.
This conventional method of evaluating organizations was inefficient, requiring the analyst to find the answers to open questions based on material included in the documentation. This method also limited the number of organizations that could be evaluated due to the amount of time each evaluation took to complete. The benefit of the evaluation was also limited because some organizations did not want to participate due to the amount of effort and resources that would have to be expended by the organization during the evaluation process. Another problem with the conventional method of evaluating organizations was the amount of storage space necessary to retain the documentation requested from the organization.
In order to overcome some of the problems of the conventional method of evaluating organizations, other evaluation methods were developed. One method used by some charity monitoring groups was to focus solely on a few financial ratios, which were then converted into a grade or star rating that could be used to compare one organization to another. This method limited the evaluation burden on organizations because the information needed for the evaluation was publicly available in tax forms, so the organization did not have to provide it. Further, by limiting the scope of the evaluation, a greater number of organizations could be evaluated by the same number of analysts. In addition, since less documentation was needed, less space was required to store it. However, this evaluation method had its drawbacks. For example, such evaluations are not as thorough: they provide a narrow view of charity accountability by restricting the evaluation to certain financial aspects of the organization. An organization may have excellent financial ratios but may be deficient in other areas of accountability such as self-dealing or misleading appeals.
Another issue that has been raised with charity monitoring organizations is how they can ensure thorough and consistent application of their standards, especially if they seek to significantly increase the volume of their reporting. This concern is magnified if charity evaluations are conducted at more than one office (for example, national and local affiliate offices). Reporting manuals and training have been used, but their effectiveness depends on the staff that makes use of such tools.
In view of the foregoing, there is a need in the art for a method to allow an organization to quickly and efficiently provide information about itself for evaluation purposes. There is also a need in the art for a method to produce a greater number of organizational evaluations with increased efficiency by automatically evaluating an organization against a set of standards based on information provided by the organization. Additionally, there is a need in the art for the ability to generate reports detailing the results of the evaluation. Furthermore, there is a need in the art for the ability to provide these reports and evaluation data quickly and efficiently for the public at large to use.
An information retrieval and evaluation system provides methods and architecture for receiving information about an organization, evaluating the received information against a set of predetermined standards, generating a report summarizing the evaluation results and the information provided by an organization, and making the report available to individuals and corporations via online access.
In support of an evaluation of an organization, the organization prepares a response to a questionnaire. This response typically includes one or more answers to questions contained in the questionnaire. The response can also include documentation or embedded links to information requested within the questionnaire. The questionnaire typically includes multiple questions designed to elicit information about the organization. Any type of question can be included in the questionnaire, and typically the questionnaire includes multiple types of questions. The questionnaire can be designed in such a way that for some questions an organization can choose whether it wishes to provide an answer, while for other questions, an answer is required for proper completion of the questionnaire. For example, a question having a mandatory response would require the completing party to provide a response before the next page of questions is displayed or before the organization is allowed to complete the questionnaire.
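By way of illustration only, the short Python sketch below shows one way such a questionnaire could be modeled, with typed questions and a required/optional flag that gates completion of a page. The class and field names are hypothetical assumptions; the invention is not limited to any particular data representation.

```python
from dataclasses import dataclass
from enum import Enum

class AnswerType(Enum):
    """Illustrative answer types; a questionnaire typically mixes several."""
    YES_NO = "yes/no"
    NUMERIC = "numeric"
    TEXT = "text"

@dataclass
class Question:
    qid: str
    text: str
    answer_type: AnswerType
    required: bool = False  # a mandatory question blocks page completion

def page_complete(questions: list, answers: dict) -> bool:
    """A page is complete only when every required question is answered."""
    return all(answers.get(q.qid) not in (None, "")
               for q in questions if q.required)
```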
A validation check typically includes an evaluation of the answers provided by an organization to determine if the organization answered all of the questions requiring a response and if the answers are consistent. Consistency of answers can be evaluated by inserting one or more consistency evaluations into the code of the question. Answers to questions that contain a consistency evaluation can then be parsed and evaluated against one another. An automated evaluation can include an evaluation of the answers provided by the organization against a series of standards. Standards typically include business practices and financial situations that are considered beneficial in an organization to ensure legitimate operations. Each standard typically includes one or more evaluation points. The evaluation points can correspond to questions provided in the questionnaire. The answers to the corresponding questions can be compared to the evaluation points to determine if the answer satisfies the evaluation points. Typically, if all of the answers to the corresponding questions satisfy all of the evaluation points, the standard is met by the organization. There is no limit to the breadth and scope of the standards, and the system provides a mechanism for modifying the standards over time.
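Continuing the sketch, each standard can be represented as a set of evaluation points, where each point names the questionnaire answer it inspects and a predicate that answer must satisfy; the standard is met only if every point is satisfied. Again, all names here are illustrative assumptions rather than the patent's required structure.

```python
from dataclasses import dataclass
from typing import Any, Callable, Mapping

@dataclass
class EvaluationPoint:
    question_id: str                     # which questionnaire answer to inspect
    satisfied_by: Callable[[Any], bool]  # predicate applied to that answer

@dataclass
class Standard:
    name: str
    points: list

def standard_met(standard: Standard, answers: Mapping[str, Any]) -> bool:
    """A standard is met only if every one of its evaluation points is
    satisfied by the answer to the corresponding question."""
    return all(p.satisfied_by(answers.get(p.question_id))
               for p in standard.points)

# Example: a hypothetical five-voting-member board standard.
board = Standard("adequate board size", [
    EvaluationPoint("q_voting_board_members",
                    lambda n: isinstance(n, int) and n >= 5)])
print(standard_met(board, {"q_voting_board_members": 7}))  # True
```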
For one aspect of the present invention, the evaluation system can receive a response from an organization containing answers to a questionnaire. The answers in the response can be checked for errors and inconsistencies in a validation check. The evaluation system can then conduct an automated evaluation of the response against a series of standards to determine the financial health or legitimacy of the organization. A report can be generated describing the organization and the results of the automated evaluation.
For another aspect of the present invention, data previously received or purchased and relevant to an organization can be retrieved from a database. The information can include information about organizations that is capable of being evaluated. The data can be checked for errors and inconsistencies in a validation check. The evaluation system can conduct an automated evaluation of the data against multiple standards having multiple evaluation points. The evaluation system can then generate a report that includes a summary of the evaluation and the retrieved data.
For a further aspect of the present invention, a request can be received by the system for information about organizations. The request typically includes one or more parameters associated with one or many organizations. A search of the database is conducted based on the provided parameters and a list is generated. The list typically includes all of the organizations that satisfy the search parameters. A request for a particular organization can then be received, and the system can retrieve one or more reports for the selected organization.
For a more complete understanding of exemplary embodiments of the present invention and the advantages thereof, reference is now made to the following description in conjunction with the accompanying drawings in which:
The present invention supports a computer-implemented method and system for online reporting of financial and operational information by organizations, evaluating the information provided against one or more standards, and generating a report based on the evaluation. Exemplary embodiments of the invention can be more readily understood by reference to the accompanying figures.
Although exemplary embodiments of the present invention will be generally described in the context of a software module and an operating system running on a personal computer, those skilled in the art will recognize that the present invention can also be implemented in conjunction with other program modules for other types of computers. Furthermore, those skilled in the art will recognize that the present invention may be implemented in a stand-alone or in a distributed computing environment. In a distributed computing environment, program modules may be physically located in different local and remote memory storage devices. Execution of the program modules may occur locally in a stand-alone manner or remotely in a client/server manner. Examples of such distributed computing environments include local area networks of an office, enterprise-wide computer networks, and the global Internet.
The detailed description that follows is represented largely in terms of processes and symbolic representations of operations by conventional computer components, including processing units, memory storage devices, display devices, and input devices. These processes and operations may utilize conventional computer components in a distributed computing environment.
The processes and operations performed by the computer include the manipulation of signals by a processing unit or remote computer and the maintenance of these signals within data structures resident in one or more of the local or remote memory storage devices. Such data structures impose a physical organization upon the collection of data stored within a memory storage device and represent specific electrical or magnetic elements. These symbolic representations are the means used by those skilled in the art of computer programming and computer construction to most effectively convey teachings and discoveries to others skilled in the art.
Exemplary embodiments of the present invention include a computer program that embodies the functions described herein and illustrated in the appended flowcharts. However, it should be apparent that there could be many different ways of implementing the invention in computer programming, and the invention should not be construed as limited to any one set of computer program instructions. Further, a skilled programmer would be able to write such a computer program to implement a disclosed embodiment of the present invention without difficulty based, for example, on the flowcharts and associated description in the application text. Therefore, disclosure of a particular set of program code instructions is not considered necessary for an adequate understanding of how to make and use the present invention. The inventive functionality of the computer program will be explained in more detail in the following description and is disclosed in conjunction with the remaining figures illustrating the program flow.
Referring now to the drawings, in which like numerals represent like elements throughout the several figures, aspects of the present invention and an exemplary operating environment for the implementation of the present invention will be described.
The evaluation workstation 170 is communicably attached via a distributed computer network to the evaluation system 105. In one exemplary embodiment, the evaluation workstation 170 is a personal computer. The evaluation system 105 is communicably attached via a distributed computer network to the workstation 175, evaluation workstation 170, e-mail engine 185, and data storage system 150. The exemplary evaluation system 105 comprises a questionnaire design system (“QDS”) 110, an organization registration system (“CRS”) 115, an organization questionnaire (“OQ”) 120, an analyst evaluation publishing system (“AEPS”) 125, a WGA public website (“WPWS”) 130, a questionnaire auto-validator (“QAV”) 135, an auto-evaluation processor (“AEP”) 140, and an auto-report generator (“ARP”) 145.
The data storage system 150 is communicably attached via a distributed computer network to the evaluation system 105 and the OLAP engine 187. The exemplary data storage system 150 includes a questionnaire and standards data store (“QSDS”) 155, an organization data store (“ODS”) 160, and a reports data store (“RDS”) 165. In one exemplary embodiment, the data storage system 150 is a database comprising the data stored in the QSDS 155, ODS 160, and RDS 165.
The QDS 110 is communicably attached via a distributed computer network to the QSDS 155, the ODS 160, and the evaluation workstation 170. In one exemplary embodiment, the QDS is a web-based computer application that allows an analyst or network administrator to generate or modify a questionnaire or generate or modify one or more standards used to evaluate the questionnaire and store the questionnaire or standard in the QSDS 155. In one exemplary embodiment, the QDS 110 transmits questions, validation conditions, standards, evaluation points, and basic language to be inserted into a report to the QSDS 155.
The CRS 115 is communicably attached via a distributed computer network to the workstation 175, the ODS 160, and the e-mail engine 185. The CRS is a COM object capable of receiving registration information from an organization through the workstation 175 and storing the registration information in the ODS 160. The CRS is also capable of passing registration information to the e-mail engine 185, which can generate and send an email to an organization at the workstation 175. In one exemplary embodiment, the CRS 115 publishes a user interface on a website that is accessible via the workstation 175 through the Internet 180. This user interface is useful for receiving registration information for an organization. The registration information can include the name of the organization, its address, phone number, e-mail address, and a password for logging into the evaluation system 105 at a subsequent point in time.
The OQ 120 can be communicably attached via a distributed computer network to the workstation 175, the QAV 135, the QSDS 155, the ODS 160, and the e-mail engine 185. The OQ 120 is a COM object capable of receiving a questionnaire from the QSDS 155, receiving responses to the questionnaire from the workstation 175 via the Internet 180, passing the responses to the QAV 135 for a validation check, and storing the responses in the ODS 160. The AEPS 125 is communicably attached via a distributed computer network to the evaluation workstation 170, the e-mail engine 185, the AEP 140, the ARP 145, the QSDS 155, and the ODS 160. The AEPS 125 is a COM object capable of retrieving data from the QSDS 155 and the ODS 160 and displaying the data on the evaluation workstation 170. The AEPS can also transmit changes made to an evaluation to the AEP 140 and the ARP 145. In one exemplary embodiment, the AEPS 125 generates and displays a web page on the evaluation workstation 170 for receiving changes to an evaluation. In another exemplary embodiment, the AEPS 125 can transmit changes to responses to the questionnaire to the ODS 160. Furthermore, in the exemplary embodiment, the data received by the AEPS 125 from the QSDS 155 includes questions, standards, evaluation points, and relationships of questions, while the data received from the ODS 160 includes responses to the questionnaire and automatic evaluations. The AEPS 125 is also capable of sending an e-mail to the workstation 175 using the e-mail engine 185.
The WPWS 130 is communicably attached via a distributed computer network to the workstation 175 and the RDS 165. The WPWS 130 is a COM object capable of generating and displaying a web page on the workstation through the Internet 180 to allow a user to request information regarding an organization. The WPWS 130 can retrieve information about the organization, including a report from the RDS 165, and display it on the workstation 175. The QAV 135 is communicably attached via a distributed computer network to the OQ 120, the QSDS 155, and the ODS 160. The QAV 135 is a COM object capable of receiving validation logic from the QSDS 155 and responses from the ODS 160, reviewing the responses to determine if they are valid, and then passing the results of the validation check to the OQ 120.
The AEP 140 is communicably attached via a distributed computer network to the AEPS 125, the QSDS 155, and the ODS 160. The AEP 140 is a COM object capable of receiving a set of standards and evaluation points from the QSDS 155, receiving responses from the ODS 160, and conducting an automated evaluation of these responses to determine if they meet the standards. The AEP can then store the results of the evaluation in the ODS 160. The ARP 145 is communicably attached via a distributed computer network to the AEPS 125, the QSDS 155, the ODS 160, and the RDS 165. The ARP 145 is a COM object capable of receiving standards and basic text from the QSDS 155 and evaluation results and responses from the ODS 160, and generating a report on an organization, which can be stored in the RDS 165.
The QSDS 155 is communicably attached via a distributed computer network to the QDS 110, OQ 120, AEPS 125, QAV 135, AEP 140, and the ARP 145. The QSDS 155 typically contains questions, answer logic, standards, evaluation points, basic “does not meet” language, and documentation types. In one exemplary embodiment, the QSDS 155 is a SQL server database. The ODS 160 is communicably attached via a distributed computer network to the QDS 110, CRS 115, OQ 120, AEPS 125, QAV 135, AEP 140, ARP 145, and OLAP engine 187. The ODS 160 can contain responses to the questionnaire, e-mail correspondence with the organization, supplemental documentation provided by the organization, and results of the evaluation of the responses. In one exemplary embodiment, the ODS 160 is a SQL server database. The RDS 165 is communicably attached via a distributed computer network to the WPWS 130 and ARP 145. The RDS 165 typically contains reports on organizations generated by the ARP 145. In one exemplary embodiment, the RDS 165 is a SQL server database.
An evaluation workstation 170 is communicably attached via a distributed computer network to the QDS 110 and AEPS 125. The evaluation workstation 170 typically allows an analyst or administrator to create questionnaires and standards and evaluate responses to registrations and questionnaires. In one exemplary embodiment, the evaluation workstation 170 is a personal computer.
An OLAP engine 187 is communicably attached via a distributed computer network to the ODS 160 and the analytical database 189. The OLAP engine 187 typically provides a mechanism for manipulating data from a variety of sources that has been stored in a database, such as the ODS 160. The OLAP engine 187 can allow a user at the workstation 175 to conduct statistical evaluations of the responses stored in the ODS 160. The user can access the OLAP engine 187 through a web page generated by the statistical reporting system 191. Results of the statistical analysis can be stored in the analytical database 189. In one exemplary embodiment, the statistical reporting system 191 is a COM object and the analytical database 189 is a SQL server database.
Now referring to FIG. 2, an exemplary process for registering and evaluating an organization with the system 100 will be described.
In step 210, a system administrator can create one or more evaluation standards that can be input into the system 100 from the evaluation workstation 170 through the QDS 110 and stored in the QSDS 155. In one exemplary embodiment, evaluation standards can include the following: a board of directors that provides adequate oversight of the charity's operation; a board of directors with a minimum of five voting members; a minimum of three board meetings per year that include the full governing body having a majority in attendance and meeting face-to-face; no more than 10 percent of the board can be compensated by the charity; assessing the charity's performance at least every two years; submitting a report to the governing body describing the charity's performance and providing recommendations for the future; at least 65 percent of expenses go towards program activities; less than 35 percent of contributions can be used for fundraising; and financial statements prepared according to GAAP and available upon request.
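The standards listed above translate naturally into machine-checkable rules. The following sketch encodes a few of them as predicates over a charity's submitted figures; the field names and the flat dictionary format are assumptions made for illustration, and a production system would also guard against missing or zero-valued data.

```python
# Thresholds below come from the standards listed above; field names are
# illustrative only.
STANDARD_RULES = {
    "board with a minimum of five voting members":
        lambda d: d["voting_board_members"] >= 5,
    "minimum of three full board meetings per year":
        lambda d: d["board_meetings_per_year"] >= 3,
    "no more than 10 percent of the board compensated":
        lambda d: d["compensated_board_members"] / d["voting_board_members"] <= 0.10,
    "at least 65 percent of expenses toward program activities":
        lambda d: d["program_expenses"] / d["total_expenses"] >= 0.65,
    "less than 35 percent of contributions spent on fundraising":
        lambda d: d["fundraising_expenses"] / d["total_contributions"] < 0.35,
}

submission = {  # hypothetical questionnaire figures
    "voting_board_members": 9, "board_meetings_per_year": 4,
    "compensated_board_members": 0, "program_expenses": 720000,
    "total_expenses": 1000000, "fundraising_expenses": 150000,
    "total_contributions": 800000,
}
for name, rule in STANDARD_RULES.items():
    print(f"{name}: {'meets' if rule(submission) else 'does not meet'}")
```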
A charity seeking to be evaluated can register on the system 100 in step 215. Registration on the system 100 can be initiated from the workstation 175 through the Internet 180 and CRS 115. Registration information is typically stored in the ODS 160 and can include the name of the charity that is registering, an e-mail address or other contact information, and a password for subsequent entry into the system 100. In step 220, the validity of the registering charity is verified. The verification of a charity typically includes determining if the organization is a legitimate business entity or organization and if the organization has previously registered for an evaluation. Validation of a charity can be completed by matching information maintained in databases or by manual review. In one exemplary embodiment, the CRS 115 passes registration information from the ODS 160 to the evaluation workstation 170, where validation of a charity is determined by an analyst who manually determines if the charity submitting the request is a soliciting organization.
In step 225, the automated evaluation system 100 receives a response to the questionnaire from the workstation 175 at the OQ 120. The QAV 135 in step 230 validates information contained in the response. In step 235, the response is reviewed to determine if the proper answer types have been provided by the responding charity. In one exemplary embodiment, the AEPS passes the response from the ODS 160 to the evaluation workstation 170 where an analyst determines if proper answer types have been provided.
In step 240, the AEP 140 conducts an automated evaluation of the response to determine if the questionnaire responses meet one or more of the standards stored in the QSDS 155. The ARP 145 in step 250 generates a report. The report typically includes the responses provided by the charity, each of the standards used for comparison to the responses, and whether the charity met, failed to meet, or did not provide enough information to determine if the charity met the standards. In step 255, the report can be updated or modified. In one exemplary embodiment, the report is modified by an analyst through the evaluation workstation 170 and the AEPS 125. The modified report is displayed on the evaluation workstation 170 in step 260. In step 265, the report can be stored in the RDS 165 and can be viewed by the workstation 175 by making a request through the WPWS 130. The exemplary process terminates at the END step. The tasks completed in steps 205, 215, 225, 230, 240, 245, and 250 are described in more detail below in connection with FIGS. 3-10.
An inquiry is conducted in step 325 to determine if the question contains a consistency evaluation. In one exemplary embodiment, a question contains a consistency evaluation if the question includes conditions that must be met in order for the answer to be considered consistent. The conditions are typically embedded as SQL code. If the question contains a consistency check, the “Yes” branch is followed to step 330, where the consistency conditions are inserted. Otherwise, the “No” branch is followed to step 335.
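As a self-contained illustration of a consistency condition embedded as SQL, the snippet below uses an in-memory SQLite database in place of the SQL Server store described elsewhere; the table layout, question identifiers, and condition are all hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE answers (question_id TEXT PRIMARY KEY, value REAL)")
conn.executemany("INSERT INTO answers VALUES (?, ?)",
                 [("q_total_expenses", 1000000.0),
                  ("q_program_expenses", 1200000.0)])

# Consistency condition stored with the question: reported program expenses
# cannot exceed reported total expenses. The SELECT yields 1 when consistent.
consistency_sql = """
    SELECT (SELECT value FROM answers WHERE question_id = 'q_program_expenses')
        <= (SELECT value FROM answers WHERE question_id = 'q_total_expenses')
"""
consistent = bool(conn.execute(consistency_sql).fetchone()[0])
print("consistent" if consistent else "consistency error")  # consistency error
```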
In step 335, an inquiry is conducted to determine if answers to the question may require the charity to supply additional documentation. Additional documentation may be required to better explain the basis for an answer to a question or to provide supplemental proof that the answer is correct. In one exemplary embodiment, a question seeking information related to income tax filings can also ask the charity to supply a copy of income tax forms filed with the IRS or state agencies. If an answer to the question could require documentation, the “Yes” branch is followed to step 340, where the question is flagged at the QDS 110 by the administrator, via the evaluation workstation 170. The process continues to step 345, where the administrator inserts instructions through the evaluation workstation 170 into the QDS 110 regarding the required documentation so that the request for documentation will display when the OQ 120 receives a particular type of answer.
An inquiry is conducted in step 350 to determine if this question is a follow-up question to another question and will only be displayed if particular types of answers are provided in response to the question. Instead of requesting supporting documentation, when particular answers are provided to specific questions, one or more additional questions can be retrieved from the QSDS 155 by the OQ 120 and displayed on the workstation 175. If the current question is a follow-up question, the “Yes” branch is followed to step 355, where one or more questions that the current question is a follow-up to are linked by the administrator at the QDS 110 via the evaluation workstation 170. In step 360, answers that will cause the current question to be displayed are linked to the current question in the QDS 110 by an administrator through the evaluation workstation 170. The process continues to step 365.
If the current question is not a follow-up question in step 350, the “No” branch is followed to step 365. The questions are typically stored in the QSDS 155. An inquiry is conducted in step 365 to determine if another question is being input into the QDS 110 from the evaluation workstation 170. If another question is being input, the “Yes” branch returns to step 305. Otherwise, the “No” branch is followed to step 210 of FIG. 2.
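One simple way to realize the follow-up linkage described above is a lookup keyed by the parent question and the triggering answer. The question identifiers in this sketch are hypothetical.

```python
# (parent question, triggering answer) -> follow-up question ids to display.
FOLLOW_UPS = {
    ("q_uses_paid_fundraiser", "yes"):
        ["q_fundraiser_name", "q_fundraiser_contract"],
}

def follow_up_questions(question_id: str, answer) -> list:
    """Return any follow-up questions triggered by this answer."""
    return FOLLOW_UPS.get((question_id, str(answer).lower()), [])

print(follow_up_questions("q_uses_paid_fundraiser", "Yes"))
# ['q_fundraiser_name', 'q_fundraiser_contract']
```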
The CRS 115 in step 415 receives an e-mail address and password for the charity. In one exemplary embodiment, the e-mail address can be inserted into a request box on a web page and transmitted to the CRS 115 from the workstation 175. An inquiry is conducted in step 420 to determine if the e-mail address received by the CRS 115 is associated with a different charity. The determination can be made by the CRS 115 evaluating the ODS 160 for the e-mail address received in step 415. If the address is already located in the ODS 160, the CRS 115 can determine if the same charity previously provided that e-mail address. If the e-mail address is associated with a different charity, the “Yes” branch is followed to step 425, where the CRS 115 generates a message to be displayed on the workstation 175 that the charity must insert a different e-mail address. The process then returns to step 415.
If the e-mail address is not associated with a different charity, the “No” branch is followed to step 430, where the registration information is stored in the ODS 160. In step 435, the e-mail engine 185 generates an e-mail message notifying an analyst that a new charity has registered for the system. In one exemplary embodiment, the message is sent from the e-mail engine 185 to the evaluation workstation 170, where it is displayed. The process continues to step 220 of FIG. 2.
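The e-mail uniqueness test of steps 415 through 430 reduces to a lookup against previously stored registrations, sketched below with a plain dictionary standing in for the ODS; the names are illustrative.

```python
def email_conflicts(email: str, charity: str, registrations: dict) -> bool:
    """True if the address is already registered to a different charity."""
    owner = registrations.get(email.lower())
    return owner is not None and owner != charity

registrations = {"info@examplecharity.org": "Example Charity"}
print(email_conflicts("info@examplecharity.org", "Another Org", registrations))      # True
print(email_conflicts("info@examplecharity.org", "Example Charity", registrations))  # False
```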
In step 510, the OQ 120 retrieves the first page of questions in the questionnaire from the QSDS 155 and displays the page on the workstation 175.
An inquiry is conducted in step 520 to determine if the form of the answer is in error. In one exemplary embodiment, the OQ 120 can determine whether the form of the answer received matches the form of answer that should be received. For example, an error would be generated if the anticipated answer was numerical but the answer provided was a word or phrase. If the answer contains an error, the “Yes” branch is followed to step 522, where the OQ 120 generates an error message and displays it on the workstation 175, requesting the charity to revise the answer. The process returns to step 518. If there is no error, the “No” branch is followed to step 524, where an inquiry is conducted to determine if additional questions are associated with the current question. The OQ 120 determines if additional questions are associated with the current question by evaluating the QSDS 155 to see if questions were linked together as discussed in step 355 of FIG. 3.
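The form-of-answer test in step 520 can be as simple as attempting to parse the raw answer into the expected type, as in this hypothetical helper:

```python
from typing import Optional

def answer_form_error(expected_type: str, raw: str) -> Optional[str]:
    """Return an error message if the raw answer does not match the expected
    form, or None if it is acceptable. Type names are illustrative."""
    if expected_type == "numeric":
        try:
            float(raw)
        except ValueError:
            return "A numeric answer is required; please revise your answer."
    return None

print(answer_form_error("numeric", "about fifty"))  # error message
print(answer_form_error("numeric", "50"))           # None
```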
Returning to FIG. 5, in step 530, an inquiry is conducted to determine if the charity has asked to go on to the next or another page. In one exemplary embodiment, the user can select a “Next” button on the website that comprises a link allowing the charity to move to the next page of the questionnaire, or the charity can select a specific page to view next. The exemplary questionnaire user interface can provide such navigation links.
An inquiry is conducted in step 536 to determine if the charity has reached the last page of the questionnaire and then requested the next page. If not, the “No” branch is followed to step 540, where the OQ 120 retrieves the next page of questions from the QSDS 155 and displays them on the workstation 175. The process then returns to step 512. Otherwise, the “Yes” branch is followed to step 538, where the OQ 120 retrieves a summary of the answers provided by the charity from the ODS 160 and displays it on the workstation 175. In step 542, the charity submits the answers for review. In one exemplary embodiment, the answers can be submitted for review by selecting a link on the website at the workstation 175. In another exemplary embodiment, the charity cannot submit its answers for review unless it agrees to a click-wrap license that is displayed when the charity tries to submit its answers for review. The charity can typically agree to the click-wrap license agreement by selecting a link designated “Agree” and simultaneously submitting the answers for review. The process continues to step 230 of FIG. 2.
An inquiry is conducted in step 615 to determine if there is a validation error. If so, the “Yes” branch is followed to step 620, where the QAV 135 generates an error message and displays it on the workstation 175. Otherwise, the “No” branch is followed to step 625, where an inquiry is conducted to determine if there is another question on the current page. If so, the “Yes” branch is followed to step 630, where the counter variable X is incremented by one. The process then returns to step 610. If no additional questions remain on the page, the “No” branch is followed to step 631.
In step 631, an inquiry is conducted to determine if there is another consistency check to conduct on the questions on this page. If so, the “Yes” branch is followed to step 632, where the counter variable Y is incremented by one. The process then returns to step 605. If there are no additional consistency checks for this page, the “No” branch is followed to step 635. In step 635, an inquiry is conducted to determine if the QAV 135 displayed any error messages on the workstation 175. If so, the “Yes” branch is followed to step 640, where the QAV 135 generates a message that the charity cannot continue and displays the message on the workstation 175. The process continues to step 518 of FIG. 5.
An inquiry is conducted in step 730 to determine if there is another question to evaluate. Typically, the QAV 135 retrieves the questionnaire from the QSDS 155 to determine if there is another question to evaluate. If there is another question to evaluate, the “Yes” branch is followed to step 735, where the variable X is incremented by one. The process then returns to step 715. However, if there are no other questions to evaluate, the “No” branch is followed to step 740, where the counter variable Y is set equal to one. In step 745, the QAV 135 performs a first consistency check. In performing the consistency check, the QAV 135 typically retrieves the answers for a charity from the ODS 160 and reviews which questions contain a consistency evaluation in the QSDS 155. The QAV 135 then determines if the answers to the questions containing the consistency evaluation are consistent.
In step 750, an inquiry is conducted to determine if the answers are consistent for the first consistency check. If not, the “No” branch is followed to step 755, where the OQ 120 generates an error message stating that a consistency error exists for that particular consistency check. Otherwise, the “Yes” branch is followed to step 760, where an inquiry is conducted to determine if there is another consistency check to complete. If so, the “Yes” branch is followed to step 765, where the counter variable Y is incremented by one. The process then returns to step 745. If the last consistency check has been completed, then the “No” branch is followed to step 770.
In step 770, an inquiry is conducted to determine if the QAV 135 has generated any error messages for the submitted questionnaire. In one exemplary embodiment, error messages generated by the OQ 120 in steps 725 and 755 can be stored in a queue of the OQ 120. If error messages have been generated by the QAV 135, the “Yes” branch is followed to step 775, where the OQ 120 displays a web page listing the error messages on the workstation 175. The process then continues to step 225 of FIG. 2.
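Taken together, the validation pass over each question (counter X) and each consistency check (counter Y) can be summarized in a sketch like the one below, where an empty error list means the submission may proceed. The question records and checks are hypothetical stand-ins for data held in the QSDS and ODS.

```python
def validate_submission(questions, consistency_checks, answers):
    """Queue an error message for each unanswered required question and
    each failed consistency check; an empty result allows submission."""
    errors = []
    for q in questions:  # one pass per question (counter X)
        if q["required"] and answers.get(q["qid"]) in (None, ""):
            errors.append(f"question {q['qid']}: a response is required")
    for name, check in consistency_checks:  # one pass per check (counter Y)
        if not check(answers):
            errors.append(f"consistency error: {name}")
    return errors

questions = [{"qid": "q_total_expenses", "required": True},
             {"qid": "q_program_expenses", "required": True}]
checks = [("program expenses cannot exceed total expenses",
           lambda a: a["q_program_expenses"] <= a["q_total_expenses"])]
print(validate_submission(questions, checks,
                          {"q_total_expenses": 100.0, "q_program_expenses": 120.0}))
```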
The AEP 140 analyzes the first evaluation point for the first standard in step 812 by comparing the first evaluation point in the standard to a corresponding answer in the submitted response. In step 814, the AEP 140 determines whether the evaluation point applies. An evaluation point does not apply if it is for a standard that is no longer in effect or has not yet gone into effect. For example, suppose standard one is used for evaluation purposes only for submissions made in the 2004 calendar year. If a submission is made in the 2005 calendar year, then the evaluation points for standard one would not apply in evaluating that submission. If the first evaluation point for the first standard does not apply, the “No” branch is followed to step 832. Otherwise, the “Yes” branch is followed to step 818.
In step 818, an inquiry is conducted by the AEP 140 to determine if the first evaluation point in the submitted response is incomplete. An evaluation point is incomplete if the responses submitted and stored in the ODS 160 do not provide enough information to determine if the charity meets the evaluation point for a standard. If the first evaluation point for the first standard is incomplete, the “Yes” branch is followed to step 820, where the AEP 140 records the first evaluation point as incomplete in the ODS 160. The process then continues to step 832. If, on the other hand, the first evaluation point is complete, the “No” branch is followed to step 822.
An inquiry is conducted to determine if the AEP 140 should mark the first evaluation point for review in step 822. An evaluation point that is marked for review can typically be manually reviewed at a later time by an administrator or evaluator via the evaluation workstation 170. In one exemplary embodiment, the exemplary system 100 marks evaluation points for review when the system 100 is not able to verify if the charity meets the evaluation point because of insufficient information, internal inconsistency, or because human judgment is needed for the determination. If the evaluation point should be marked for review, the “Yes” branch is followed to step 824, where the AEP 140 marks the first evaluation point for review in the submitted response. The process then continues to step 832. However, if the evaluation point should not be marked for review, the “No” branch is followed to step 826, where an inquiry is conducted to determine if the charity satisfies the first evaluation point. If the charity satisfies the first evaluation point, the “Yes” branch is followed to step 828, where the AEP 140 records the evaluation point as satisfying the standard in the ODS 160. Otherwise, the “No” branch is followed to step 830, where the AEP 140 records the evaluation point as not satisfying the standard in the ODS 160.
In step 832, an inquiry is conducted to determine if there is another evaluation point for the first standard. If so, the counter variable N is incremented by one and the process returns to step 812 so that the AEP 140 can evaluate the next evaluation point. Otherwise, the “No” branch is followed to step 836 of FIG. 8B.
In step 840, an inquiry is conducted by the AEP 140 to determine if at least one evaluation point for the first standard did not meet the standard. If so, the “Yes” branch is followed to step 842, where the AEP 140 generates a message that the submitted response does not meet the standard and records the message in the ODS 160. The process then continues to step 854. However, if none of the evaluation points were determined to not meet the standard, the “No” branch is followed to step 844, where the AEP 140 determines if any of the evaluation points for the first standard were marked for review. If so, the “Yes” branch is followed to step 846, where the AEP 140 generates a message that the standard has been flagged for manual review and stores the message in the ODS 160. The process then continues to step 854. If, on the other hand, no evaluation points were marked for review, the “No” branch is followed to step 848.
In step 848, the AEP 140 conducts an inquiry to determine if all of the evaluation points for the first standard did not apply. If so, the “Yes” branch is followed to step 850, where the AEP 140 generates a message that the first standard does not apply and records the message in the ODS 160. The process then continues to step 854. If one or more of the evaluation points did apply, the “No” branch is followed to step 852, where the AEP 140 generates a message that the submission meets the requirements for the first standard and stores the message in the ODS 160. An inquiry is conducted in step 854 to determine if there are additional standards to review. The AEP 140 typically makes this determination by reviewing the standards stored in the QSDS 155. If there are additional standards to evaluate, the “Yes” branch is followed to step 856, where the counter variable M is incremented by one. The process then continues to step 808 of FIG. 8A.
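The per-point outcomes and the standard-level rollup of steps 812 through 852 can be summarized as follows. The ordering mirrors the flow above: any unmet point fails the standard, any point marked for review flags it, and a standard whose points all do not apply is itself inapplicable. The flow does not spell out a standard-level branch for incomplete points, so, as an assumption, this sketch surfaces them with the "not enough information" wording used in the report.

```python
from enum import Enum

class PointResult(Enum):
    NOT_APPLICABLE = "does not apply"
    INCOMPLETE = "incomplete"
    REVIEW = "marked for review"
    MET = "meets"
    NOT_MET = "does not meet"

def standard_outcome(point_results):
    """Roll per-point results up into a standard-level outcome."""
    if PointResult.NOT_MET in point_results:
        return "does not meet the standard"
    if PointResult.REVIEW in point_results:
        return "flagged for manual review"
    if PointResult.INCOMPLETE in point_results:
        return "did not provide enough information"  # assumed handling
    if all(r is PointResult.NOT_APPLICABLE for r in point_results):
        return "standard does not apply"
    return "meets the standard"

print(standard_outcome([PointResult.MET, PointResult.REVIEW]))
# flagged for manual review
```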
In step 904 a standard is selected. A counter variable N representing an evaluation point for the standard M is set equal to one in step 906. In step 908, the AEPS 125 displays the automatic evaluation for standard M at the evaluation workstation 170. The AEPS 125 displays the effective evaluation for standard M at the evaluation workstation 170 in step 910. In step 912, the AEPS 125 displays the evaluation point record for the first evaluation point, retrieved from the ODS 160. In step 914, an inquiry is conducted to determine if standard M has another evaluation point record in the ODS 160. The AEPS 125 typically reviews the ODS 160 to determine if additional evaluation point records exist. If so, the “Yes” branch is followed to step 916, where the counter variable N is incremented by one. The process then returns to step 912. Otherwise, the “No” branch is followed to step 918, where the OQ 120 retrieves all of the questions related to the standard M from the QSDS 155 and their answers from ODS 160 and displays them at the evaluation workstation 170. Questions are typically related to one another if they are each evaluated in order to determine if a specific standard has been met. By designating questions as being related to one another, answers to the related questions can be quickly retrieved and displayed at the evaluation workstation 170.
In step 920, an inquiry is conducted to determine if the analyst or administrator wants to modify the effective evaluation in the ODS 160 for standard M. If so, the “Yes” branch is followed to step 922, where a modified effective evaluation is received from the evaluation workstation 170 at the AEPS 125. The AEPS 125 stores the modified effective evaluation in the ODS 160 in step 924. The process then returns to step 908. In one exemplary embodiment, an analyst might want to modify the effective evaluation when evaluation points for the standard have been marked for review. Once the analyst has had an opportunity to review the evaluation points and the charity's responses to questions related to the standard, the analyst could manually input a different record as to whether the charity satisfied the standard.
Returning to step 920, if no modifications are made to the effective evaluation, the “No” branch is followed to step 926, where the AEPS 125 conducts an evaluation to determine if the effective evaluation meets standard M. If the effective evaluation is recorded as meeting the standard in the ODS 160, the “Yes” branch is followed to step 940. Otherwise, the “No” branch is followed to step 928, where the AEPS 125 conducts an inquiry to determine if the ODS 160 contains custom language explaining why the charity did not meet the standard. If it does not have custom language, the “No” branch is followed to step 930, where the ARP 145 generates the “does not meet” language for the evaluation report. Otherwise, the “Yes” branch is followed to step 932, where the AEPS 125 displays the custom “does not meet” language at the evaluation workstation 170.
In step 934, an inquiry is conducted to determine if an analyst or administrator wants to modify the custom “does not meet” language. If so, the “Yes” branch is followed to step 936, where the modified language is received at the ARP 145 from the evaluation workstation 170. The modified “does not meet” language can then be stored by the ARP 145 in the ODS 160. If there is no change to the custom “does not meet” language, the “No” branch is followed to step 940, where an inquiry is conducted to determine if another standard will be selected. If so, the “Yes” branch is followed to step 904, where another standard is selected. Otherwise, the “No” branch is followed to step 250 of FIG. 2.
The ARP 145 evaluates the effective evaluation stored in the ODS 160 to determine if any of the standards are flagged for review in step 1006. In step 1008, an inquiry is conducted to determine if any standards in the effective evaluation are flagged for review. If so, the “Yes” branch is followed to step 1010, where the ARP 145 generates a message that an evaluation report cannot be generated and displays the message on the evaluation workstation 170. Otherwise, the “No” branch is followed to step 1012, where the ARP 145 retrieves basic information about the charity being evaluated from the ODS 160 and inserts the basic information into a report template. In one exemplary embodiment, basic information about the charity can include the name of the charity, its address, the state the charity is incorporated in, and any affiliates of the corporation. In another exemplary embodiment, the basic information about the charity can include governance and financial information about the charity and custom information inserted into the report by an analyst or administrator.
In step 1014, a counter variable M, representing the standards, is set equal to one. In step 1016, an inquiry is conducted to determine if the first standard is met in the effective evaluation. In one exemplary embodiment, the ARP 145 retrieves the effective evaluation from the ODS 160 to determine the evaluation as compared to the standards. If the standard is met in the effective evaluation, the “Yes” branch is followed to step 1034. Otherwise, the “No” branch is followed to step 1018, where the ARP 145 conducts an inquiry to determine if the first standard does not apply in the effective evaluation. In one exemplary embodiment, a standard does not apply if all of the evaluation points related to the standard do not apply. If the first standard does not apply in the effective evaluation, the “Yes” branch is followed to step 1034, where the counter variable X is incremented by one. The process continues to step 1038. If the first standard does apply in the effective evaluation, the “No” branch is followed to step 1020, where the ARP 145 generates language that the charity does not meet the first standard and adds the language into the report template.
In step 1022, the ARP 145 conducts an inquiry to determine if basic “does not meet” language should be used for the charity's failure to meet the first standard (basic “does not meet” language should be used if no custom “does not meet” language has been provided for the charity for that standard). Each evaluation point contains a template for basic “does not meet” language that should be used if the charity does not meet that evaluation point; this template is typically stored in the QSDS 155. If basic language is used, the “Yes” branch is followed to step 1024, where the ARP 145 retrieves from the QSDS 155 the templates for the one or more evaluation points that the charity did not meet within the first standard, and from the ODS 160 the responses to the questionnaire that are relevant to the standard. In step 1026, the ARP 145 generates an explanation of how the charity failed to meet the standard by combining the retrieved questionnaire responses with the template “does not meet” language for the retrieved evaluation point(s) that the charity did not satisfy in the first standard. In step 1028, the ARP 145 inserts the generated language into the report template. Returning to step 1022, if basic “does not meet” language is not used, the “No” branch is followed to step 1030.
In step 1030, the ARP 145 retrieves the custom “does not meet” language for the first standard from the ODS 160 and inserts it into the report template in step 1032. In step 1036, the counter variable Y is incremented by one. An inquiry is conducted by the ARP 145 to determine if another standard was evaluated for this charity in step 1038. If so, the “Yes” branch is followed to step 1040, where the counter variable M is incremented by one. The process then returns to step 1016. Otherwise, the “No” branch is followed to step 1042.
In step 1042, the ARP 145 conducts an inquiry to determine if all standards were either met or did not apply. If so, the “Yes” branch is followed to step 1044, where the ARP 145 generates a statement that the charity meets all standards and inserts it into the report template. Otherwise, the “No” branch is followed to step 1046, where the ARP adds the counter variables X and Y into the report template to designate the number of standards a charity did and did not meet.
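Steps 1014 through 1046 amount to walking the per-standard outcomes, inserting custom "does not meet" language where an analyst supplied it and template language otherwise, and tallying the met and unmet counts. A compact sketch, with all names and record formats assumed for illustration:

```python
def build_report(charity, outcomes, custom_language, basic_templates):
    """Assemble report lines from per-standard outcomes.

    `outcomes` maps standard name -> "met", "not_met", or "n/a";
    `custom_language` holds analyst-written explanations and
    `basic_templates` the default "does not meet" wording per standard.
    """
    lines = [f"Report for {charity}"]
    met_or_na, not_met = 0, 0  # counters X and Y in the flow above
    for standard, outcome in outcomes.items():
        if outcome in ("met", "n/a"):
            met_or_na += 1
            continue
        not_met += 1
        explanation = custom_language.get(standard) or basic_templates[standard]
        lines.append(f"Does not meet '{standard}': {explanation}")
    if not_met == 0:
        lines.append("The charity meets all applicable standards.")
    else:
        lines.append(f"Standards met or not applicable: {met_or_na}; "
                     f"standards not met: {not_met}")
    return "\n".join(lines)

print(build_report("Example Charity",
                   {"board oversight": "met", "program spending": "not_met"},
                   {},
                   {"program spending":
                    "less than 65 percent of expenses went to program activities."}))
```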
In step 1115, the automated evaluation system 100 receives a response to the questionnaire from an analyst or administrator inputting information from the evaluation workstation 170 at the OQ 120. The QAV 135 in step 1120 validates information contained in the response. In step 1125, the AEP 140 conducts an automatic evaluation of the response to determine if the response meets the standards stored in the QSDS 155. A backup review and revision of submitted responses can be received from the evaluation workstation 170 through AEPS 125 in step 1130. The ARP 145 in step 1135 can generate a report. The report typically includes the responses provided by the analyst or administrator in step 1115, the standards the responses were compared to and whether the charity met, failed to meet, or did not provide enough information to determine if the charity met the standard. In step 1140, the report can be updated or modified. In one exemplary embodiment, the report is modified by an analyst through the evaluation workstation 170 and the AEPS 125. The modified report is displayed on the evaluation workstation 170 in step 1145. In step 1150, the report can be stored in the RDS 165 and can be viewed by the workstation 175 by making a request through the WPWS 130. The process continues to the END step.
Charities having information that matches the inquiry are displayed on the workstation 175 by the WPWS 130 in step 1815. In step 1820, a selection is received at the WPWS 130 from the workstation 175. The selection typically consists of one particular charity that the inquirer wants information about. The WPWS 130 retrieves the report for the selected charity from the RDS 165 in step 1825. In step 1830, the WPWS 130 transmits the report to the workstation 175 to be displayed. The process continues to the END step.
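The public inquiry of steps 1815 through 1830 is essentially a parameterized filter followed by a report lookup, sketched here with an in-memory list standing in for the report data store; the record fields are hypothetical.

```python
def search_charities(records, **params):
    """Return the charities whose fields match every supplied parameter."""
    return [r for r in records
            if all(str(r.get(k, "")).lower() == str(v).lower()
                   for k, v in params.items())]

records = [
    {"name": "Example Charity", "state": "NY", "report": "meets all standards"},
    {"name": "Another Org", "state": "VA", "report": "does not meet standard 2"},
]
matches = search_charities(records, state="NY")
print([m["name"] for m in matches])  # list shown to the inquirer
print(matches[0]["report"])          # report for the selected charity
```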
In conclusion, the present invention supports a computer-implemented method for receiving information about an organization and automatically evaluating the organization against one or more standards. It will be appreciated that the present invention fulfills the needs of the prior art described herein and meets the above-stated objectives. While there have been shown and described several exemplary embodiments of the present invention, it will be evident to those skilled in the art that various modifications and changes may be made thereto without departing from the spirit and the scope of the present invention as set forth in the appended claims and the equivalents thereof.
This non-provisional patent application claims priority under 35 U.S.C. § 119 to U.S. Provisional Patent Application No. 60/592,826, entitled “Online Charity Reporting and Evaluation System,” filed Jul. 30, 2004. This provisional application and the contents thereof are hereby fully incorporated by reference.