1. Field of the Invention
The present invention relates generally to the field of certification and accreditation (C&A) and, more particularly, to a computer-implemented system, method and medium for certification and accreditation that assesses the risk of, and/or determines the suitability of, a target system to comply with at least one predefined standard, regulation and/or requirement.
2. Background Description
The general purpose of C&A is to certify that automated information systems adequately protect information in accordance with data sensitivity and/or classification levels. In accordance with Department of Defense (DoD) Instruction 5200.40, dated Dec. 30, 1997, entitled DoD Information Technology Security Certification and Accreditation Process (DITSCAP), which is incorporated herein by reference in its entirety, certification can be defined as the comprehensive evaluation of the technical and non-technical features of an information technology (IT) system and other safeguards, made in support of the accreditation process, to establish the extent to which a particular design and implementation meets a set of specified security requirements. Similarly, as used herein, accreditation can be defined as a formal declaration by a designated approving authority that an IT system is approved to operate in a particular security mode using a prescribed set of safeguards at an acceptable level of risk. In general, DITSCAP is utilized by the DoD for identifying and documenting threats and vulnerabilities that pose risk to critical information systems. DITSCAP compliance generally means that the security risk posture is considered acceptable and that potential liability for system “owners” is mitigated.
The C&A process typically involves a number of policies, regulations, guidelines, best practices, etc. that serve as C&A criteria. Conventionally, the C&A process is a labor-intensive exercise that can require multiple skill sets over a period typically spanning 6-12 months. There can be, for example, several organizations and/or individuals involved in the processes of selecting applicable standards, regulations and/or test procedures, and assembling test results and other information into a DITSCAP-compliant package. There is therefore a need to substantially streamline and expedite the security C&A process in a computer-based application that substantially automates the process of performing security risk assessments, certification test procedure development, system configuration guidance, and residual risk acceptance.
To address the deficiencies of prior schemes as indicated above, the present invention provides a system, method and medium that substantially automates the security C&A process in a manner that enhances and facilitates security risk assessments, certification test procedure development, system configuration guidance, and/or residual risk acceptance.
In an exemplary embodiment, the C&A process is automated in accordance with DoD's DITSCAP requirements. The present invention is not, however, limited to a DoD environment, and may also be used in non-DoD government as well as civilian/private sector organizations requiring risk management and guidance. For example, the system and method according to the present invention can also be used to automate the National Information Assurance Certification and Accreditation Process (NIACAP).
An exemplary embodiment according to the present invention contemplates a browser based solution that automates the DITSCAP process. The browser is preferably directed to five primary elements: 1) gathering information, 2) analyzing requirements, 3) testing requirements, 4) performing risk assessment, and 5) generating certification documentation based on an assessment of the first four elements.
The information gathered primarily relates to a description of the system to be certified, and its respective components and operating environment (e.g., workstation manufacturer and model, operating system and version, secret or top secret operating environment, etc.). The requirements analysis generally involves the user selecting a list of standards and/or regulations with which the system must or should comply. The user may optionally input his own standards/regulations and/or additional requirements. Once information is gathered and the requirements analysis is provided, the system intelligently selects a set of test procedures against which the system is tested. Upon completion of testing, the risk assessment provides as output an estimate of the risk level for each individual test failed. The failed tests are also collectively considered and used to evaluate the risk level of the target system as a whole. Then, documentation can be printed that includes information pertaining to the first four elements, enabling an accreditation decision to be made based on the inputs and outputs respectively provided and generated in the first four elements.
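Purely by way of illustration, the following sketch (in Python, with hypothetical function and field names not taken from the specification) shows how the five primary elements can flow from one to the next.

```python
# Illustrative only: hypothetical end-to-end flow of the five primary elements.
def run_certification_workflow():
    # 1) Gather a description of the target system and its operating environment.
    system_info = {
        "hardware": [{"manufacturer": "ExampleCo", "model": "WS-1"}],
        "operating_environment": "secret",
    }
    # 2) Requirements analysis: select the standards/regulations to comply with.
    selected_regulations = ["DITSCAP", "Example local security policy"]
    # 3) Testing: execute the test procedures selected for those requirements.
    test_results = {"TP-001": "pass", "TP-002": "fail"}
    # 4) Risk assessment: estimate risk per failed test and for the system overall.
    failed = [name for name, outcome in test_results.items() if outcome == "fail"]
    risk_report = {"per_test": {name: "medium" for name in failed}, "overall": "medium"}
    # 5) Generate certification documentation from the four elements above.
    return {
        "system_info": system_info,
        "regulations": selected_regulations,
        "test_results": test_results,
        "risk": risk_report,
    }

print(run_certification_workflow())
```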
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways.
The Detailed Description, including the description of a preferred structure embodying features of the invention, will be best understood when read in reference to the accompanying figures.
As indicated above, aspects of at least some embodiments of the present invention are described in accordance with DoD's DITSCAP requirements. However, it should be understood that such description is only by way of example, and that the present invention contemplates use with regard to any number of types of requirements or environments. In addition, within its use with regard to DITSCAP requirements, it should be understood that many of the various aspects and selection options are also exemplary, as is the fact that information is shown as being entered via a web browser.
The requirements analysis generally involves selecting (by a human and/or some automated procedure) a list of standards and/or regulations that the system must, or should, comply with. This is indicated by a block 102. Optionally, selection of additional standards/regulations and/or requirements by a user is also contemplated. At least some embodiments of the present invention then contemplate automatically displaying/listing each requirement that comprises the current security requirements traceability matrix (SRTM), which is derived from the selected set of standards and/or regulations that the system must comply with. Additionally, the user will be able to customize the current SRTM by either adding, editing and/or deleting requirements. As known to those skilled in the art, a SRTM can be a table used to trace project lifecycle activities (e.g., testing requirements) and/or work products to the project requirements. The SRTM can be used to establish a thread that traces, for example, testing and/or compliance requirements from identification through implementation. A SRTM can thus be used to ensure that project objectives and/or requirements are satisfied and/or completed.
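By way of non-limiting example, an SRTM can be represented as a simple mapping from each requirement to the items that trace to it; the sketch below uses hypothetical identifiers and a hypothetical storage format.

```python
# Illustrative SRTM: each requirement maps to its source and to the test
# procedures that trace to it. Identifiers and fields are hypothetical.
srtm = {
    "REQ-001": {"source": "DoD Instruction 5200.40", "tests": ["TP-001", "TP-014"]},
    "REQ-002": {"source": "Example password policy", "tests": ["TP-002"]},
}

# Customizing the current SRTM by adding, editing, and deleting requirements.
srtm["REQ-003"] = {"source": "User-entered requirement", "tests": []}  # add
srtm["REQ-002"]["tests"].append("TP-027")                              # edit
del srtm["REQ-001"]                                                    # delete
```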
Once information is gathered 100 and the requirements analysis 102 is provided, the system intelligently selects a set of test procedures against which the system is tested, as indicated by a block 104. The test procedures are selected in a manner so that successful completion of the test procedures will render the system undergoing C&A to satisfy the SRTM requirements.
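One straightforward way to choose such a set, shown here purely as an illustrative sketch using the hypothetical SRTM structure above, is to collect every test procedure that traces to at least one requirement, so that successful completion of the selected procedures addresses the entire SRTM.

```python
# Illustrative selection of test procedures covering every SRTM requirement.
def select_test_procedures(srtm):
    selected = set()
    for requirement, entry in srtm.items():
        if not entry["tests"]:
            # A requirement with no traced procedure cannot be verified by testing.
            raise ValueError(f"No test procedure traces to {requirement}")
        selected.update(entry["tests"])
    return sorted(selected)

example_srtm = {
    "REQ-002": {"source": "Example password policy", "tests": ["TP-002", "TP-027"]},
    "REQ-003": {"source": "User-entered requirement", "tests": ["TP-031"]},
}
print(select_test_procedures(example_srtm))  # ['TP-002', 'TP-027', 'TP-031']
```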
Upon completion of testing 104, the risk assessment step (as indicated by a block 106) then involves assessing, for each test failure (should any exist), the vulnerability of the system, as well as the level of the threat as determined by the information gathered. The risk assessment 106 provides as output an estimate of the risk level for each individual test failed. The failed tests are also collectively considered and used to evaluate the risk level of the system as a whole. Then, documentation can be optionally printed 108 that includes information pertaining to the first four elements, enabling an accreditation decision to be made based on the inputs and outputs respectively provided and generated in the first four blocks (i.e., 100, 102, 104, 106).
When tab 406 is activated, a project security information screen, such as shown in the referenced figure, is displayed.
When the user selects information category tab 442, a pull down menu listing the security levels (e.g., secret, unclassified, sensitive, etc.) appears.
Also in accordance with DITSCAP requirements, formal access category 462 is a designator indicating the level of formal approval for accessing the system and is related to the clearance levels of users and the maximum data classification processed by the system. Formal access category 462 is, in at least some embodiments contemplated by the present invention, only applicable to a system operating in the compartmented mode or multi-level security mode. There are different definitions for each mode. In a compartmented mode system, exemplary available options are: 1) No user lacks formal access approval for more than one category being processed; and 2) At least one user does not have formal access approval for more than one category being processed. In a Multi-level Security system, the options can be: 1) All users have formal access approval for all categories of data processed by the system; 2) No user lacks formal access approval for more than one category being processed; and 3) At least one user lacks formal access approval for more than one category being processed.
Certification Level 464 can be a read-only display of the calculated Certification Analysis Level. Finally, Total Value 466 can be a read-only display of the total weighted values of the ITSEC parameters. These can be used to determine the Certification Analysis Level.
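The specification does not set out the particular weights or level thresholds here, so the following sketch assumes purely illustrative ITSEC parameter weights and cut-off values simply to show how a total weighted value could map to a Certification Analysis Level.

```python
# Illustrative only: assumed ITSEC parameter weights and level thresholds.
ITSEC_WEIGHTS = {
    "interfacing_mode": {"benign": 1, "passive": 2, "active": 3},
    "processing_mode": {"dedicated": 1, "system_high": 2, "compartmented": 4, "multilevel": 6},
    "attribution_mode": {"none": 0, "rudimentary": 1, "basic": 2, "comprehensive": 3},
}

def certification_analysis_level(selections):
    # Total Value: sum of the weighted values of the selected ITSEC parameters.
    total = sum(ITSEC_WEIGHTS[parameter][choice] for parameter, choice in selections.items())
    # Map the total to a level; the thresholds below are assumptions, not DITSCAP values.
    if total <= 4:
        return total, 1
    if total <= 8:
        return total, 2
    if total <= 12:
        return total, 3
    return total, 4

total_value, level = certification_analysis_level(
    {"interfacing_mode": "active", "processing_mode": "compartmented", "attribution_mode": "basic"}
)
print(total_value, level)  # 9 3
```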
When project personnel tab 408 shown in the referenced figure is activated, a project personnel menu can be provided. For example, fields identifying the personnel associated with the C&A project can be provided in a menu (not shown) subsequent to clicking the personnel tab 408.
When the project schedule tab 412 of the referenced figure is activated, a menu (not shown) can be provided for entering and storing project schedule information.
When project hardware tab 414 is activated, a menu as shown in the referenced figure is provided that enables a user to describe and store the hardware associated with the system being accredited.
When project operating system 416 is activated, a menu (not shown) that enables a user to describe and store operating systems associated with the system hardware is provided. The ability to enter information pertaining to multiple operating systems (OS) on each hardware platform can be provided. Fields are provided to enable a user to enter information pertaining to the OS Name (e.g., Windows NT, AIX, HP UX, etc.), OS Type (e.g., NT, UNIX, etc.), OS Manufacturer (e.g., Microsoft, Hewlett Packard, IBM, etc.), OS Version (the numeric value of the operating system version), OS Options (a list of all OS options (if any) obtained for this platform), OS Patches (a list of OS patches (if any) that have been installed on the platform), OS Description (a detailed description of the operating system, possibly including the basic features, and any functions unique to the system being accredited).
When project application tab 418 is activated, a project application screen appears (not shown) that can provide the analyst with the ability to describe and store applications associated with the system hardware/OS combinations. The following exemplary fields can be provided: Application Name (the name of the application), Application Type (the type of application on the system being accredited—e.g., database, office automation, e-mail server, etc.), Application Manufacturer (the name of the application manufacturer), Application Version (the numeric version of the application), Application Options (a list of the options associated with the application (if any)), Application Patches (a list of the patches associated with the application), and Application Description (a detailed description of the application).
When system interfaces tab 420 is activated, a menu (not shown) is provided that provides the user the ability to describe and store the flow of information into and out of the accredited system. The system interfaces entries can describe each of the internal and external interfaces identified for the system. The following exemplary fields can be provided: Interface Name (an internal or external name associated with the system interface), and Interface Description (a detailed description of the internal or external system interface, which preferably includes a statement of the significant features of the interface as it applies to the entire system, as well as a high level diagram of the communications links and encryption techniques connecting the components of the information system, associated data communications, and networks).
When system data flow tab 422 is activated, a menu (not shown) is provided that can provide the user the ability to describe and store the flow of information within the accredited system. System data flow entries can describe the flow of information to each of the external interfaces identified for the system. The following exemplary fields can be provided: Data Flow Short Name (a brief user-defined name associated with the system data flow), and Data Flow Description (a detailed description of the data flow associated with the external interface, which preferably includes a statement of the purpose of the external interface and the relationship between the interface and the system, as well as the type of data and the general method for data transmission, if applicable).
When accreditation boundary tab 424 is activated, a menu (not shown) that provides the user with the ability to describe and store the identification of components that are associated with the system being accredited, but are outside of the accreditation boundary (i.e., not included in the accreditation). This category might include such equipment/services as, for example, a domain naming service (DNS) used to translate the host names to IP addresses. The DNS might not be part of the atomic system being accredited, but is required for most communication activities. The following exemplary fields can be provided: Accreditation Boundary Name (a name associated with the external system component), and Accreditation Boundary Description (a detailed description of the external system component, which preferably includes the function that this component/service provides the system being accredited and its relationship to the system).
When project threat tab 426 is activated, a menu (not shown) appears that provides the user the ability to quantify the threat environment where the system is intended to operate. If the system is targeted to operate in multiple locations, the environmental condition that results in the higher or highest level of risk can be selected. The following exemplary fields can be provided: Location (CONUS (CONtinental US) or OCONUS (Outside CONtinental US) as the primary operating location for the system), System Communications (the primary means of information transfer to external systems, such as No LAN, Local LAN Only, SIPRNET (SECRET Internet Protocol Router Network), NIPRNET (Unclassified but Sensitive Internet Protocol Router Network), Internet, etc.), Connection (the types of connection—e.g., wireless, dial-up, or protected distribution system (PDS), etc.), Training Competency Level (e.g., administrator, maintenance personnel, user, etc.), Installation Facility (the operating environment of the system at its intended end site), Natural Disaster Susceptibility (e.g., fire, flood, lightning, volcano, earthquake, tornado, etc.), and Custom Components.
When project appendices tab 428 is activated, a menu (not shown) that provides the user the ability to identify external documents that are associated with the C&A is provided. These appendices can optionally include references to other documents, or consist of the contents of other documents that are accessible via a computer-implemented embodiment of the present invention. Representative appendices that may be derived are: System Concept of Operations, Information Security Policy, System Rules of Behavior, Incident Response Plan, Contingency Plans, Personnel/Technical Security Controls, Memoranda of Agreement, Security Education, Training and Awareness Plan, and Certification and Accreditation Statement.
Tabs 402-428 can be activated in any order, and do not need to be activated sequentially. Also, each tab can be optionally customized to contain different, fewer, or additional fields relative to the fields discussed above. Further, the tabs (402-428) can be arranged differently. Fewer or additional tabs can also be provided to suit a particular application or need.
The system configuration captured in the step of block 100 is then used in the requirements analysis step described below.
In an exemplary embodiment, a general purpose computer on which the present invention operates will have stored thereon or have access to a repository of security regulations and test procedures from various government and/or civilian departments, agencies, organizations, etc. (e.g., such as those from DITSCAP). In step 1102 (FIG. 11), and based at least in part on the information entered in step 100, pertinent regulations will be selected from this repository, upon which to build a security requirements traceability matrix (SRTM) for the C&A. The SRTM, as discussed above, can be a mapping of one or more test procedures to each individual requirement within a requirements document. Satisfactory completion of the respective one or more test procedures that can be mapped to each requirement is generally considered to render the requirement satisfied. However, the user has the flexibility to view and modify 1104 the SRTM as desired to meet the specific needs of the systems being accredited by, for example, adding and/or deleting one or more tests to/from the SRTM, and/or editing one or more of the test procedures to, for example, include additional testing requirements. If the user decides to modify a test procedure, the specified test procedure is displayed 1106. The user can then modify and save the revised test procedure 1108. The user can then either end the editing process or continue to modify another security document 1110.
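As a purely illustrative sketch of step 1102, the repository entries and matching criteria below are hypothetical; they merely show how pertinent regulations might be selected automatically from the information entered in step 100.

```python
# Illustrative only: hypothetical repository and selection of pertinent regulations.
REGULATION_REPOSITORY = [
    {"id": "REG-DOD-1", "applies_to": {"agency": "DoD"}, "title": "Example DoD directive"},
    {"id": "REG-NET-1", "applies_to": {"has_network": True}, "title": "Example network security standard"},
    {"id": "REG-GEN-1", "applies_to": {}, "title": "Example baseline security regulation"},
]

def select_regulations(system_info):
    pertinent = []
    for regulation in REGULATION_REPOSITORY:
        criteria = regulation["applies_to"]
        # A regulation is pertinent when every applicability criterion matches.
        if all(system_info.get(key) == value for key, value in criteria.items()):
            pertinent.append(regulation["id"])
    return pertinent

print(select_regulations({"agency": "DoD", "has_network": True}))
# ['REG-DOD-1', 'REG-NET-1', 'REG-GEN-1']
```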
After selections have been made, either by the user by, for example, clicking the appropriate boxes associated with documents (e.g., 1204, 1208, 1220 and 1224), and/or by the system, the application will provide a Display SRTM screen as shown in FIG. 13.
With the security requirements traceability matrix in place (a portion of which is illustratively shown in FIG. 13), the user proceeds to the testing step 104. In at least some embodiments of the present invention, user interfaces will be provided, in accordance with the steps shown in the referenced figure.
An Edit Test Plan Information screen, corresponding to step 1402, is shown in FIG. 15. The exemplary input fields on the screen are Expected Date of Test 1502, Planned Location of Procedure 1504, Test Resources 1506, Test Personnel 1508, and Remarks 1510.
Once the testing step 104 has been completed and the results recorded, the risk assessment step 106 commences, as indicated by sub-headings a-d below.
a) Generate Project Threat Profile (Step 2102)
As shown in the referenced figure, a project threat profile is generated at step 2102 from the information gathered about the system and its operating environment. For example, generic threat elements 1-29, as defined in the referenced figure, may each be assigned a threat rating, the characters of the project threat profile corresponding, respectively, to elements 1-29. For this project threat profile, the threat of a flood is thus considered high.
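Purely as an illustration of such a profile, the sketch below represents the 29 per-element ratings as a character string; the element ordering, names, and ratings shown are hypothetical rather than taken from the specification.

```python
# Illustrative 29-character project threat profile: one rating character per
# generic threat element, in a fixed order. Names, order, and ratings are hypothetical.
THREAT_ELEMENTS = ["Fire", "Flood", "Lightning"] + [f"Element {i}" for i in range(4, 30)]
project_threat_profile = "NHM" + "N" * 26  # 29 characters; the Flood position holds "H"

def threat_rating(profile, element_name):
    return profile[THREAT_ELEMENTS.index(element_name)]

print(threat_rating(project_threat_profile, "Flood"))  # 'H' -> flood threat considered high
```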
b) Threat Correlation String (Step 2104)
In step 2104, a threat correlation for each failed test procedure is accessed. Specifically, each test procedure used in the C&A for the system being evaluated is, in at least some embodiments of the present invention, coded with a threat correlation string, with each character in the string representing one of the generic threat elements in the same order as they exist in the project threat profile as shown, for example, in FIG. 22. The test procedure database preferably contains these codes. Each character in the threat correlation string contains a score that indicates the relative potential of a given threat to exploit a vulnerability caused by failure of this particular test; for example, a score of “H” indicates a high potential and a score of “M” indicates a medium potential.
Thus, for example, failure of a particular test may mean that the system being tested is highly vulnerable to Floods. To indicate this, the character in the threat correlation string corresponding to Floods would contain a score of “H”.
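The sketch below, offered only as an illustration, shows one hypothetical way the threat correlation string for a test procedure might be stored and read; the string length, ordering, and flood position mirror the hypothetical profile above.

```python
# Illustrative storage of per-test threat correlation strings; identifiers are hypothetical.
threat_correlation_strings = {
    # Character i scores how readily generic threat element i could exploit a
    # vulnerability caused by failure of this test (same ordering as the profile).
    "TP-002": "NHN" + "N" * 26,
}

FLOOD_POSITION = 1  # zero-based index of the Floods element (illustrative)
print(threat_correlation_strings["TP-002"][FLOOD_POSITION])  # 'H' -> highly vulnerable to floods
```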
c) Determine Risk Profile for Each Failed Test Procedure (Step 2106)
As indicated at step 2106, the risk profile for each test procedure is determined. Specifically, for each test failure, the threat correlation string contained within each test procedure, as determined at step 2104, is applied against the project threat profile as determined at step 2102.
For example, the project threat profile above may be paired with the threat correlation string of a failed test procedure. In this case, in accordance with an exemplary process according to at least some embodiments of the present invention, a combined risk profile string is determined by combining, position by position, each character of the project threat profile with the corresponding character of the threat correlation string (for example, by way of a combination table such as that shown in the referenced figure).
The highest risk level in the combined string for a given test procedure is preferably used as the risk level for the failure of that test procedure. Thus, for the combined string above, the risk level for a failure of the test procedure is high, since there is an H in the second position. Similarly, if M were the highest risk level that appears in a combined string, then the risk level for a failure of that test procedure would be medium, etc.
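The following sketch is illustrative only: the per-position combination rule (taking the lower of the two ratings on an assumed scale N < L < M < H) stands in for the combination table referenced above, while the rule that the highest rating in the combined string becomes the risk level for the failed test follows the description above.

```python
# Illustrative combination of a project threat profile with a failed test's
# threat correlation string; the per-position rule is an assumption.
ORDER = {"N": 0, "L": 1, "M": 2, "H": 3}
LETTER = {value: key for key, value in ORDER.items()}

def combined_risk_profile(project_profile, correlation_string):
    # Assumed rule: risk at a position requires both a threat and a vulnerability,
    # so take the lower of the two ratings at each position.
    return "".join(
        LETTER[min(ORDER[threat], ORDER[correlation])]
        for threat, correlation in zip(project_profile, correlation_string)
    )

def failed_test_risk_level(project_profile, correlation_string):
    combined = combined_risk_profile(project_profile, correlation_string)
    # The highest risk level appearing in the combined string is used for the failure.
    return max(combined, key=lambda rating: ORDER[rating])

profile = "NHM" + "N" * 26      # illustrative project threat profile
correlation = "LHN" + "N" * 26  # illustrative threat correlation string
print(failed_test_risk_level(profile, correlation))  # 'H' -> the failure is high risk
```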
d) Determine Overall System Level Risk (Step 2108)
In addition to the individual risk level scores for each test failure as determined in step 2106, an overall risk level for the project is also determined as indicated by step 2108.
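The specification determines the overall level by the procedure shown in the referenced figure; as that figure is not reproduced here, the sketch below assumes a simple stand-in rule in which the overall project risk is the highest risk level among the individual test failures.

```python
# Illustrative only: assumed roll-up of individual failed-test risk levels.
ORDER = {"N": 0, "L": 1, "M": 2, "H": 3}

def overall_system_risk(failed_test_risk_levels):
    if not failed_test_risk_levels:
        return "N"  # no failed tests -> negligible residual risk in this sketch
    return max(failed_test_risk_levels, key=lambda level: ORDER[level])

print(overall_system_risk(["L", "M", "M"]))  # 'M'
```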
In the publishing step 108, the present invention collates the results of the certification process and optionally generates the documents needed for accreditation. The present invention takes the information gathered during the steps corresponding to blocks 100, 102, 104 and 106, and reformats the information by, for example, organizing it into appropriate documents, document subsections or subparagraphs, sections and/or appendices, etc.
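As a purely illustrative sketch of this collation, the section titles and structure below are hypothetical and simply show the gathered information being organized into named parts of an accreditation package.

```python
# Illustrative assembly of an accreditation package from the earlier steps.
def assemble_accreditation_package(system_info, srtm, test_results, risk_report):
    return {
        "System Description": system_info,          # from block 100
        "Requirements (SRTM)": srtm,                # from block 102
        "Certification Test Results": test_results, # from block 104
        "Risk Assessment": risk_report,             # from block 106
        "Appendices": ["System Rules of Behavior", "Contingency Plans"],
    }

package = assemble_accreditation_package({}, {}, {}, {})
print(list(package))
```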
The techniques of the present invention may be implemented on a computing unit such as that depicted in FIG. 31.
The computer system 3100 also has an optional display 3108 upon which information, such as the screens illustrated in the figures discussed above, can be displayed.
Although computer system 3100 is illustrated having a single processor, a single hard disk drive and a single local memory, the system 3100 is optionally suitably equipped with any multitude or combination of processors or storage devices. Computer system 3100 is, in point of fact, able to be replaced by, or combined with, any suitable processing system operative in accordance with the principles of the present invention, including hand-held, laptop/notebook, mini, mainframe and super computers, as well as processing system network combinations of the same.
A display interface 3218 interfaces display 3208 and permits information from the bus 3202 to be displayed on the display 3108. Again as indicated, display 3108 is also an optional accessory. For example, display 3108 could be substituted or omitted. Communications with external devices, for example, the other components of the system described herein, occur utilizing communication port 3216. For example, optical fibers and/or electrical cables and/or conductors and/or optical communication (e.g., infrared, and the like) and/or wireless communication (e.g., radio frequency (RF), and the like) can be used as the transport medium between the external devices and communication port 3216. Peripheral interface 3220 interfaces the keyboard 3110 and the mouse 3112, permitting input data to be transmitted to the bus 3202.
In alternate embodiments, the above-identified CPU 3204 may be replaced by or combined with any other suitable processing circuits, including programmable logic devices, such as PALs (programmable array logic) and PLAs (programmable logic arrays), DSPs (digital signal processors), FPGAs (field programmable gate arrays), ASICs (application specific integrated circuits), VLSIs (very large scale integrated circuits), or the like.
One of the implementations of the invention is as sets of instructions resident in the random access memory 3208 of one or more computer systems 3100 configured generally as described above. Until required by the computer system, the set of instructions may be stored in another computer readable memory, for example, in the hard disk drive 3214, or in a removable memory such as an optical disk for eventual use in the CD-ROM 3212 or in a floppy disk (e.g., floppy disk 3302 of the referenced figure).
At least some embodiments of the present invention can utilize a relational database to store and organize all information such as, for example, test procedures, standards/regulations, and user entered information. The design of an embodiment of the database is provided in the ERD shown in FIG. 34. The database is initially populated with security requirements, test procedures and related information to facilitate the operation of the system. As information is entered by the user and calculated by the system, it is also recorded in the database. At least some embodiments of the present invention produce output documentation that can be formatted in accordance with, for example, DITSCAP and/or NIACAP standard(s).
The ERD shown in FIG. 34 illustrates the following types of relationships among the entities.
A one-to-many (1:M) relationship indicates that each occurrence of entity A is related to one or more occurrences of entity B, but each occurrence of entity B is related to only one occurrence of entity A. Two vertical lines (as shown, for example, in the referenced figure) indicate one and only one occurrence of the corresponding entity.
A many-to-many (N:M) relationship shows that each occurrence of entity A is related to one or more occurrences of entity B, and each occurrence of entity B is related to one or more occurrences of entity A. A many-to-many relationship is indicated by an arrow at each end of a solid line.
If there can be occurrences of one entity that are not related to at least one occurrence of the other entity, then the relationship is optional, and this is shown by the use of a dashed line in FIG. 34.
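As an illustration only, the relationship types described above can be realized in a relational schema as in the sketch below; the entity and column names are hypothetical and are not taken from the ERD or the data dictionary.

```python
# Illustrative relational rendering of a 1:M relationship (project -> hardware,
# via a foreign key) and an N:M relationship (requirement <-> test procedure,
# via a junction table). Table and column names are hypothetical.
import sqlite3

connection = sqlite3.connect(":memory:")
connection.executescript("""
CREATE TABLE project        (project_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE hardware       (hardware_id INTEGER PRIMARY KEY,
                             project_id INTEGER NOT NULL REFERENCES project(project_id),
                             manufacturer TEXT, model TEXT);              -- 1:M
CREATE TABLE requirement    (requirement_id INTEGER PRIMARY KEY, text TEXT NOT NULL);
CREATE TABLE test_procedure (test_id INTEGER PRIMARY KEY, title TEXT NOT NULL);
CREATE TABLE requirement_test (                                            -- N:M junction
    requirement_id INTEGER REFERENCES requirement(requirement_id),
    test_id        INTEGER REFERENCES test_procedure(test_id),
    PRIMARY KEY (requirement_id, test_id));
""")
```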
As known to those skilled in the art, a data dictionary, as provided below, defines and specifies the data elements in the system. The data dictionary shown below can be used either as a stand-alone system or as an integral part of the database. Data integrity and accuracy are better ensured in the latter case.
An instance of an entity shown in FIG. 34 corresponds to a record (e.g., a row) stored in the database.
The table below provides an exemplary data dictionary that can be used with the ERD of FIG. 34.
The many features and advantages of the invention are apparent from the detailed specification, and thus, it is intended by the appended claims to cover all such features and advantages of the invention which fall within the true spirit and scope of the invention. Further, since numerous modifications and variations will readily occur to those skilled in the art, it is not desired to limit the invention to the exact construction and operation illustrated and described, and accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope of the invention. While the foregoing invention has been described in detail by way of illustration and example of preferred embodiments, numerous modifications, substitutions, and alterations are possible without departing from the scope of the invention defined in the following claims.
This application claims priority to application Ser. No. 60/223,982, filed Aug. 9, 2000, entitled “Web Certification and Accreditation System, Method and Medium”, which is assigned to the assignee of this application. The disclosure of application Ser. No. 60/223,982 is incorporated herein by reference.
Number | Date | Country
---|---|---
60223982 | Aug 2000 | US |