NIST Special Publication 800-55 Revision 1

Performance Measurement Guide
for Information Security
Elizabeth Chew, Marianne Swanson, Kevin Stine,
Nadya Bartol, Anthony Brown, and Will Robinson
I N F O R M A T I O N S E C U R I T Y
Computer Security Division
Information Technology Laboratory
National Institute of Standards and Technology
Gaithersburg, MD 20899-8930
July 2008
U.S. Department of Commerce
Carlos M. Gutierrez, Secretary
National Institute of Standards and Technology
James M. Turner, Deputy Director
Reports on Computer Systems Technology
The Information Technology Laboratory (ITL) at the National Institute of Standards and Technology (NIST) promotes the U.S. economy and public welfare by providing technical leadership for the Nation's measurement and standards infrastructure. ITL develops tests, test methods, reference data, proof of concept implementations, and technical analyses to advance the development and productive use of information technology. ITL's responsibilities include the development of management, administrative, technical, and physical standards and guidelines for the cost-effective security and privacy of sensitive unclassified information in federal computer systems. This Special Publication 800-series reports on ITL's research, guidelines, and outreach efforts in information security, and its collaborative activities with industry, government, and academic organizations.
Authority
This document has been developed by the National Institute of Standards and Technology (NIST) in furtherance of its statutory responsibilities under the Federal Information Security Management Act (FISMA) of 2002, Public Law 107-347.

NIST is responsible for developing standards and guidelines, including minimum requirements, and for providing adequate information security for all agency operations and assets, but such standards and guidelines shall not apply to national security systems. This guideline is consistent with the requirements of the Office of Management and Budget (OMB) Circular A-130, Section 8b(3), Securing Agency Information Systems, as analyzed in A-130, Appendix IV: Analysis of Key Sections. Supplemental information is provided in A-130, Appendix III.

This guideline has been prepared for use by federal agencies. It may also be used by nongovernmental organizations on a voluntary basis and is not subject to copyright regulations. (Attribution would be appreciated by NIST.)

Nothing in this document should be taken to contradict standards and guidelines made mandatory and binding on federal agencies by the Secretary of Commerce under statutory authority. Nor should these guidelines be interpreted as altering or superseding the existing authorities of the Secretary of Commerce, Director of the OMB, or any other federal official.

Certain commercial entities, equipment, or materials may be identified in this document in order to describe an experimental procedure or concept adequately. Such identification is not intended to imply recommendation or endorsement by NIST, nor is it intended to imply that the entities, materials, or equipment are necessarily the best available for the purpose.
Acknowledgements
The authors wish to thank Joan Hash (NIST), Arnold Johnson (NIST), Elizabeth Lennon (NIST), Karen Scarfone (NIST), Kelley Dempsey (NIST), and Karen Quigg (MITRE) who reviewed drafts of this document and/or contributed to its development. The authors also gratefully acknowledge and appreciate the many contributions from individuals and organizations in the public and private sectors whose thoughtful and constructive comments improved the quality and usefulness of this publication.
TABLE OF CONTENTS
EXECUTIVE SUMMARY
1. INTRODUCTION
   1.1 Purpose and Scope
   1.2 Audience
   1.3 History
   1.4 Critical Success Factors
   1.5 Relationship to Other NIST Documents
   1.6 Document Organization
2. ROLES AND RESPONSIBILITIES
   2.1 Agency Head
   2.2 Chief Information Officer
   2.3 Senior Agency Information Security Officer
   2.4 Program Manager/Information System Owner
   2.5 Information System Security Officer
   2.6 Other Related Roles
3. INFORMATION SECURITY MEASURES BACKGROUND
   3.1 Definition
   3.2 Benefits of Using Measures
   3.3 Types of Measures
      3.3.1 Implementation Measures
      3.3.2 Effectiveness/Efficiency Measures
      3.3.3 Impact Measures
   3.4 Measurement Considerations
      3.4.1 Organizational Considerations
      3.4.2 Manageability
      3.4.3 Data Management Concerns
      3.4.4 Automation of Measurement Data Collection
   3.5 Information Security Measurement Program Scope
      3.5.1 Individual Information Systems
      3.5.2 System Development Life Cycle
      3.5.3 Enterprise-Wide Programs
4. LEGISLATIVE AND STRATEGIC DRIVERS
   4.1 Legislative Considerations
      4.1.1 Government Performance Results Act
      4.1.2 Federal Information Security Management Act
   4.2 Federal Enterprise Architecture
   4.3 Linkage Between Enterprise Strategic Planning and Information Security
5. MEASURES DEVELOPMENT PROCESS
   5.1 Stakeholder Interest Identification
   5.2 Goals and Objectives Definition
   5.3 Information Security Policies, Guidelines, and Procedures Review
   5.4 Information Security Program Implementation Review
   5.5 Measures Development and Selection
      5.5.1 Measures Development Approach
      5.5.2 Measures Prioritization and Selection
      5.5.3 Establishing Performance Targets
   5.6 Measures Development Template
   5.7 Feedback Within the Measures Development Process
6. INFORMATION SECURITY MEASUREMENT IMPLEMENTATION
   6.1 Prepare for Data Collection
   6.2 Collect Data and Analyze Results
   6.3 Identify Corrective Actions
   6.4 Develop Business Case and Obtain Resources
   6.5 Apply Corrective Actions
APPENDIX A: CANDIDATE MEASURES
APPENDIX B: ACRONYMS
APPENDIX C: REFERENCES
APPENDIX D: SPECIFICATIONS FOR MINIMUM SECURITY REQUIREMENTS
LIST OF FIGURES
Figure 1-1. Information Security Measurement Program Structure
Figure 3-1. Information Security Program Maturity and Types of Measurement
Figure 5-1. Information Security Measures Development Process
Figure 5-2. Information Security Measures Trend Example
Figure 6-1. Information Security Measurement Program Implementation Process

LIST OF TABLES

Table 1. Measurement During System Development
Table 2. Measures Template and Instructions
EXECUTIVE SUMMARY
This document is a guide to assist in the development, selection, and implementation of measures to be used at the information system and program levels. These measures indicate the effectiveness of security controls applied to information systems and supporting information security programs. Such measures are used to facilitate decision making, improve performance, and increase accountability through the collection, analysis, and reporting of relevant performance-related data—providing a way to tie the implementation, efficiency, and effectiveness of information system and program security controls to an agency's success in achieving its mission. The performance measures development process described in this guide will assist agency information security practitioners in establishing a relationship between information system and program security activities under their purview and the agency mission, helping to demonstrate the value of information security to their organization.

A number of existing laws, rules, and regulations—including the Clinger-Cohen Act, the Government Performance and Results Act (GPRA), the Government Paperwork Elimination Act (GPEA), and the Federal Information Security Management Act (FISMA)—cite information performance measurement in general, and information security performance measurement in particular, as a requirement. In addition to legislative compliance, agencies can use performance measures as management tools in their internal improvement efforts and link implementation of their information security programs to agency-level strategic planning efforts.

The following factors must be considered during development and implementation of an information security measurement program:

• Measures must yield quantifiable information (percentages, averages, and numbers);
• Data that supports the measures needs to be readily obtainable;
• Only repeatable information security processes should be considered for measurement; and
• Measures must be useful for tracking performance and directing resources.

The measures development process described in this document ensures that measures are developed with the purpose of identifying causes of poor performance and pointing to appropriate corrective actions.

This document focuses on the development and collection of three types of measures:

• Implementation measures to measure execution of security policy;
• Effectiveness/efficiency measures to measure results of security services delivery; and
• Impact measures to measure business or mission consequences of security events.

The types of measures that can realistically be obtained, and that can also be useful for performance improvement, depend on the maturity of the agency's information security program and the information system's security control implementation. Although different types of measures can be used simultaneously, the primary focus of information security measures shifts as the implementation of security controls matures.
1. INTRODUCTION
The requirement to measure information security performance is driven by regulatory, financial, and organizational reasons. A number of existing laws, rules, and regulations cite information performance measurement in general, and information security performance measurement in particular, as a requirement. These laws include the Clinger-Cohen Act, the Government Performance and Results Act (GPRA), the Government Paperwork Elimination Act (GPEA), and the Federal Information Security Management Act (FISMA).

While these laws, rules, and regulations are important drivers for information security measurement, equally compelling are the benefits that information security performance measurement can yield for organizations. Agencies can use performance measures as management tools in their internal improvement efforts and link implementation of their information security programs to agency-level strategic planning efforts. Information security measures are used to facilitate decision making and improve performance and accountability through collection, analysis, and reporting of relevant performance-related data. They provide the means for tying the implementation, efficiency, and effectiveness of security controls to an agency's success in its mission-critical activities. The performance measures development process described in this document will assist agency information security practitioners in establishing a relationship between information system and program security activities under their purview and the agency mission, helping to demonstrate the value of information security to their organization.

NIST Special Publication (SP) 800-55, Revision 1, expands upon NIST's previous work in the field of information security measures to provide additional program-level guidelines for quantifying information security performance in support of organizational strategic goals. The processes and methodologies described in this document link information system security performance to agency performance by leveraging agency-level strategic planning processes. By doing so, the processes and methodologies help demonstrate how information security contributes to accomplishing agency strategic goals and objectives. Performance measures developed according to this guide will enhance the ability of agencies to respond to a variety of federal government mandates and initiatives, including FISMA.

1.1 Purpose and Scope

This document is a guide for the specific development, selection, and implementation of information system-level and program-level measures to indicate the implementation, efficiency/effectiveness, and impact of security controls, and other security-related activities. It provides guidelines on how an organization, through the use of measures, identifies the adequacy of in-place security controls, policies, and procedures. It provides an approach to help management decide where to invest in additional information security resources, identify and evaluate nonproductive security controls, and prioritize security controls for continuous monitoring. It explains the measurement development and implementation processes and how measures can be used to adequately justify information security investments and support risk-based decisions. The results of an effective information security measurement program can provide useful data for directing the allocation of information security resources and should simplify the preparation of performance-related reports. Successful implementation of such a program assists agencies in meeting the annual requirements of the Office of Management and Budget (OMB) to report the status of agency information security programs.

This publication uses the security controls identified in NIST SP 800-53, Recommended Security Controls for Federal Information Systems, as a basis for developing measures that support the evaluation of information security programs. In addition to providing guidelines on developing measures, the guide lists a number of candidate measures that agencies can tailor, expand, or use as models for developing other measures.1 While focused on NIST SP 800-53 security controls, the process described in this guide can be applied to develop agency-specific measures related to security controls that are not included in NIST SP 800-53.

1 Candidate measures offered by this guide do not constitute mandatory requirements. Rather, they provide a sampling of measures to be considered for use by the readers of this guide.

The information security measurement program described in this document can be helpful in fulfilling regulatory requirements. The program provides an underlying data collection, analysis, and reporting infrastructure that can be tailored to support FISMA performance measures, Federal Enterprise Architecture's (FEA) Performance Reference Model (PRM) requirements, and any other enterprise-specific requirements for reporting quantifiable information about information security performance.

1.2 Audience

This guide is written primarily for Chief Information Officers (CIOs), Senior Agency Information Security Officers (SAISOs)—often referred to as Chief Information Security Officers (CISOs)—and Information System Security Officers (ISSOs). It targets individuals who are familiar with security controls as described in NIST SP 800-53. The concepts, processes, and candidate measures presented in this guide can be used within government and industry contexts.

1.3 History

The approach for measuring security control effectiveness has been under development for several years. NIST SP 800-55, Security Metrics Guide for Information Technology Systems, and NIST Draft SP 800-80, Guide to Developing Performance Metrics for Information Security, both addressed information security measurement. This document supersedes these publications by building upon them to align this approach with security controls provided in NIST SP 800-53, Recommended Security Controls for Federal Information Systems. The document also expands on concepts and processes introduced in the original version of NIST SP 800-55 to assist with the assessment of information security program implementation.

Security control implementation for information systems and information security programs is reviewed and reported annually to OMB in accordance with the Electronic Government Act of 2002, which includes FISMA. The Act requires departments and agencies to demonstrate that they are meeting applicable information security requirements, and to document the level of performance based on results of annual program reviews.
1.4 Critical Success Factors

An information security measurement program within an organization should include four interdependent components (see Figure 1-1).

Figure 1-1. Information Security Measurement Program Structure
The foundation of strong upper-level management support is critical, not only for the success of the information security program, but also for the program's implementation. This support establishes a focus on information security within the highest levels of the organization. Without a solid foundation (i.e., proactive support of personnel in positions that control information security resources), the information security measurement program can fail when pressured by organizational dynamics and budget limitations.

The second component of an effective information security measurement program is the existence of information security policies and procedures backed by the authority necessary to enforce compliance. Information security policies delineate the information security management structure, clearly assign information security responsibilities, and lay the foundation needed to reliably measure progress and compliance. Procedures document management's position on the implementation of an information security control and the rigor with which it is applied. Measures are not easily obtainable if no procedures are in place that supply data to be used for measurement.
The third component is developing and establishing quantifiable performance measures that are designed to capture and provide meaningful performance data. To provide meaningful data, quantifiable information security measures must be based on information security performance goals and objectives, and be easily obtainable and feasible to measure. They must also be repeatable, provide relevant performance trends over time, and be useful for tracking performance and directing resources.

Finally, the information security measurement program itself must emphasize consistent periodic analysis of the measures data. Results of this analysis are used to apply lessons learned, improve effectiveness of existing security controls, and plan for the implementation of future security controls to meet new information security requirements as they occur. Accurate data collection must be a priority with stakeholders and users if the collected data is to be meaningful and useful in improving the overall information security program.

The success of an information security program implementation should be judged by the degree to which meaningful results are produced. A comprehensive information security measurement program should provide substantive justification for decisions that directly affect the information security posture of an organization. These decisions include budget and personnel requests and allocation of available resources. An information security measurement program should assist in the preparation of required reports relating to information security performance.

1.5 Relationship to Other NIST Documents

This document is a continuation in a series of NIST special publications intended to assist information management and information security personnel in the establishment, implementation, and maintenance of an information security program. It focuses on quantifying information security performance based on the results of a variety of information security activities. This approach draws upon many sources of data, including:

• Information security assessment and testing efforts such as those described in NIST SP 800-53A, Guide for Assessing the Security Controls in Federal Information Systems;
• Information security risk assessments efforts, such as those described in NIST SP 800-30, Risk Management Guide for Information Technology Systems; and
• Minimum security controls recommended in NIST SP 800-53, Recommended Security Controls for Federal Information Systems.

NIST SP 800-55, Revision 1, differs from NIST SP 800-53A in that it provides a quantitative approach to measuring and analyzing security controls implementation and effectiveness at the information system and program levels, aggregated across multiple individual efforts. It also provides an approach for aggregating information from multiple information systems to measure and analyze information security from an enterprise-level perspective. NIST SP 800-53A provides procedures for assessing if the security controls are implemented and operating as intended according to the information system security plan for the system. The assessment data produced as a result of applying NIST SP 800-53A assessment procedures can serve as a data source for information security measurement.

Information security measurement results described in this guide will provide inputs into the information security program activities described in a number of NIST publications, including:

• NIST SP 800-100, Information Security Handbook: A Guide for Managers; and
• NIST SP 800-65, Integrating IT Security into the Capital Planning and Investment Control Process.

These measures can also be used to assist with prioritization for the continuous monitoring of security controls, as described in NIST SP 800-37, Guide for the Security Certification and Accreditation of Federal Information Systems.

1.6 Document Organization

The remaining sections of this guide discuss the following:

• Section 2, Roles and Responsibilities, describes the roles and responsibilities of agency staff that have a direct interest in the success of the information security program, and in the establishment of an information security measurement program.
• Section 3, Information Security Measures Background, provides guidelines on the background and definition of information security measures, the benefits of implementation, various types of information security measures, and the factors that directly affect success of an information security measurement program.
• Section 4, Legislative and Strategic Drivers, links information security to strategic planning through relevant legislation and guidelines.
• Section 5, Measures Development Process, presents the approach and process used for development of information security measures.
• Section 6, Information Security Measurement Implementation, discusses those factors that can affect the implementation of an information security measurement program.

This guide contains four appendices. Appendix A, Candidate Measures, provides practical examples of information security measures that can be used or modified to meet specific agency requirements. Appendix B provides a list of acronyms used in this document. Appendix C lists references. Appendix D lists specifications for minimum security requirements taken from Federal Information Processing Standard (FIPS) 200, Minimum Security Requirements for Federal Information and Information Systems.
2. ROLES AND RESPONSIBILITIES
This section outlines the key roles and responsibilities for developing and implementing information security measures. While information security is the responsibility of all members of the organization, the positions described in Sections 2.1 through 2.6 are key information security stakeholders that should work to instill a culture of information security awareness across the organization.

2.1 Agency Head

The specific Agency Head responsibilities related to information security measurement are as follows:

• Ensuring that information security measures are used in support of agency strategic and operational planning processes to secure the organization's mission;
• Ensuring that information security measures are integrated into annual reporting on the effectiveness of the agency information security program by the Chief Information Officer (CIO);
• Demonstrating support for information security measures development and implementation, and communicating official support to the agency;
• Ensuring that information security measurement activities have adequate financial and human resources for success;
• Actively promoting information security measurement as an essential facilitator of information security performance improvement throughout the agency; and
• Approving policy to officially institute measures collection.

2.2 Chief Information Officer

The Chief Information Officer (CIO)2 has the following responsibilities related to information security measurement:

• Using information security measures to assist in monitoring compliance with applicable information security requirements;
• Using information security measures in annually reporting on effectiveness of the agency information security program to the agency head;
• Demonstrating management's commitment to information security measures development and implementation through formal leadership;
• Formally communicating the importance of using information security measures to monitor the overall health of the information security program and to comply with applicable regulations;
• Ensuring information security measurement program development and implementation;
• Allocating adequate financial and human resources to the information security measurement program;
• Reviewing information security measures regularly and using information security measures data to support policy, resource allocation, budget decisions, and assessment of the information security program posture and operational risks to agency information systems;
• Ensuring that a process is in place to address issues discovered through measures analysis and taking corrective actions such as revising information security procedures and providing additional information security training to staff; and
• Issuing policy, procedures, and guidelines to officially develop, implement, and institute measures.

2 When an agency has not designated a formal Chief Information Officer position, FISMA requires the associated responsibilities to be handled by a comparable agency official.

2.3 Senior Agency Information Security Officer

Depending upon the agency, the Senior Agency Information Security Officer (SAISO) may sometimes be referred to as the Chief Information Security Officer (CISO). Within this document, the term SAISO is used to represent both the SAISO and the CISO. The SAISO has the following responsibilities related to information security measurement:

• Integrating information security measurement into the process for planning, implementing, evaluating, and documenting remedial actions to address any deficiencies in the information security policies, procedures, and practices of the agency;
• Obtaining adequate financial and human resources to support information security measurement program development and implementation;
• Leading the development of any internal guidelines or policy related to information security measures;
• Using information security measures in support of the agency CIO's annual reporting to the agency head on the effectiveness of the agency's information security program, including progress of remedial actions;
• Conducting information security measures development and implementation;
• Ensuring that a standard process is used throughout the agency for information security measures development, creation, analysis, and reporting; and
• Using information security measures for policy, resource allocation, and budget decisions.
2.4 Program Manager/Information System Owner
Program managers, as well as information system owners, are responsible for ensuring that proper security controls are in place to address the confidentiality, integrity, and availability of information and information systems. The program manager/information system owner has the following responsibilities related to information security measurement:

• Participating in information security measurement program development and implementation by providing feedback on the feasibility of data collection and identifying data sources and repositories;
• Educating staff on the development, collection, analysis, and reporting of information security measures and how it will affect information security policy, requirements, resource allocation, and budget decisions;
• Ensuring that measurement data is collected consistently and accurately and is provided to designated staff who are analyzing and reporting the data;
• Directing full participation and cooperation of staff, when required;
• Reviewing information security measures data regularly and using it for policy, resource allocation, and budget decisions; and
• Supporting implementation of corrective actions, identified through measuring information security performance.

2.5 Information System Security Officer

The Information System Security Officer (ISSO) has the following responsibilities related to information security measurement:

• Participating in information security measurement program development and implementation by providing feedback on the feasibility of data collection and identifying data sources and repositories; and
• Collecting data or providing measurement data to designated staff that are collecting, analyzing, and reporting the data.

2.6 Other Related Roles

Information security measurement may require inputs from a variety of organizational components or stakeholders, including incident response, information technology operations, privacy, enterprise architecture, human resources, physical security, and others. Section 5.1 lists additional stakeholders.
3. INFORMATION SECURITY MEASURES BACKGROUND
This section provides basic information on what information security measures are and why information security performance should be measured. Additionally, this section defines types of measures that can be used; discusses the key aspects of making an information security measurement program successful; and identifies the uses of measures for management, reporting, and decision making.

3.1 Definition

Information security measures are used to facilitate decision making and improve performance and accountability through the collection, analysis, and reporting of relevant performance-related data. The purpose of measuring performance is to monitor the status of measured activities and facilitate improvement in those activities by applying corrective actions based on observed measurements.

Information security measures can be obtained at different levels within an organization. Detailed measures, collected at the information system level, can be aggregated and rolled up to progressively higher levels, depending on the size and complexity of an organization. While a case can be made for using different terms for more detailed and aggregated items, such as "metrics" and "measures," this document standardizes on "measures" to mean the results of data collection, analysis, and reporting. This document refers to the process of data collection, analysis, and reporting as "measurement."

Information security measures are based on information security performance goals and objectives. Information security performance goals state the desired results of an information security program implementation, such as, "All employees should receive adequate information security awareness training." Information security performance objectives enable accomplishment of goals by identifying practices defined by information security policies and procedures that direct consistent implementation of security controls across the organization. Examples of information security performance objectives, corresponding to the example goal cited above, are: All new employees receive new employee training. Employee training includes a summary of the Rules of Behavior. Employee training includes a summary of, and a reference to, the organization's information security policies and procedures.

Information security measures monitor the accomplishment of goals and objectives by quantifying the implementation, efficiency, and effectiveness of security controls; analyzing the adequacy of information security program activities; and identifying possible improvement actions. During measures development, goals and objectives from federal guidelines, legislation, regulations, and enterprise-level guidance are identified and prioritized to ensure that the measurable aspects of information security performance correspond to the operational priorities of the organization.

Information security measures must yield quantifiable information for comparison purposes, apply formulas for analysis, and track changes using the same points of reference. Percentages or averages are most common. Absolute numbers are sometimes useful, depending on the activity that is being measured.
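Purely as an illustration of this point (the systems, counts, and target below are hypothetical and do not come from this guide), a percentage-based measure might be computed from raw counts collected for each information system and then rolled up to the program level:

    # Hypothetical sketch: computing a percentage-based measure per system and
    # rolling the raw counts up to a program-level value.

    def percentage(numerator, denominator):
        """Return a measure value in percent, or None when no data is available."""
        if denominator == 0:
            return None
        return round(100.0 * numerator / denominator, 1)

    # Assumed data: (information system, controls implemented, controls required)
    systems = [
        ("System A", 45, 50),
        ("System B", 30, 30),
        ("System C", 12, 20),
    ]

    target = 90.0  # hypothetical performance target, in percent

    for name, implemented, required in systems:
        print(f"{name}: {percentage(implemented, required)}% of required controls implemented")

    # Program-level roll-up: aggregate the raw counts, then apply the same formula.
    total_implemented = sum(implemented for _, implemented, _ in systems)
    total_required = sum(required for _, _, required in systems)
    print(f"Program level: {percentage(total_implemented, total_required)}% (target {target}%)")

Aggregating the underlying counts, rather than averaging the per-system percentages, keeps the program-level result tied to the same points of reference used at the system level.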
Data required for calculating measures must be readily obtainable, and the process that is under consideration needs to be measurable. Only processes that can be consistent and repeatable should be considered for measurement. Even though the processes may be repeatable and stable, measurable data may be difficult to obtain if the processes and their performance have not been documented. Measures must use easily obtainable data to ensure that the burden of measurement on the organization does not defeat the purpose of measurement by absorbing resources that may be needed elsewhere. Examples of information security activities that can provide data for measurement include risk assessments, penetration testing, security assessments, and continuous monitoring. Other assessment activities (such as the effectiveness of a training and awareness program) can also be quantified and used as data sources for measures.

To be useful in tracking performance and directing resources, measures need to provide relevant performance trends over time and point to improvement actions that can be applied to problem areas. Management should use measures to review performance by observing trends, identifying and prioritizing corrective actions, and directing the application of those corrective actions based on risk mitigation factors and available resources. The measures development process, described in Section 5, ensures that measures are developed with the purpose of identifying causes of poor performance and point to appropriate corrective actions.

3.2 Benefits of Using Measures

An information security measurement program provides a number of organizational and financial benefits. Major benefits include increasing accountability for information security performance; improving effectiveness of information security activities; demonstrating compliance with laws, rules and regulations; and providing quantifiable inputs for resource allocation decisions.

Increase Accountability: Information security measures can increase accountability for information security by helping to identify specific security controls that are implemented incorrectly, are not implemented, or are ineffective. Data collection and analysis processes can facilitate identification of the personnel responsible for security controls implementation within specific organizational components or for specific information systems.

Improve Information Security Effectiveness: An information security measurement program will enable organizations to quantify improvements in securing information systems and demonstrate quantifiable progress in accomplishing agency strategic goals and objectives. Information security measures can assist with determining the effectiveness of implemented information security processes, procedures, and security controls by relating results of information security activities and events (e.g., incident data, revenue lost to cyber attacks) to security controls and information security investments.
Demonstrate Compliance: Organizations can demonstrate compliance with applicable laws, rules, and regulations by implementing and maintaining an information security measurement program. Information security measures will assist in satisfying the annual FISMA reporting requirement to state performance measures for past and current fiscal years. Additionally, information security measures can be used as input into the Government Accountability Office (GAO) and Inspectors General (IG) audits. Implementation of an information security measurement program will demonstrate agency commitment to proactive information security. It will also greatly reduce time spent by agencies in collecting data, which is routinely requested by the GAO and IG during audits and for subsequent status updates.

Provide Quantifiable Inputs for Resource Allocation Decisions: Fiscal constraints and market conditions compel government and industry to operate on reduced budgets. In such an environment, it is difficult to justify broad investments in the information security infrastructure. Information security investments should be allocated in accordance with a comprehensive risk management program. Use of information security measures will support risk-based decision making by contributing quantifiable information to the risk management process. It will allow organizations to measure successes and failures of past and current information security investments, and should provide quantifiable data that will support resource allocation for future investments. Using the results of the measures analysis, program managers and system owners can isolate problems, use collected data to justify investment requests, and then target investments specifically to the areas in need of improvement. By using measures to target security investments, these measures can aid organizations in obtaining the best value from available resources.

3.3 Types of Measures

The maturity of an organization's information security program determines the type of measures that can be gathered successfully. A program's maturity is defined by the existence and institutionalization of processes and procedures. As an information security program matures, its policies become more detailed and better documented, the processes it uses become more standardized and repeatable, and the program produces a greater quantity of data that can be used for performance measurement.

Figure 3-1 depicts this continuum by illustrating measurement considerations for information security programs. As Figure 3-1 illustrates, less mature information security programs need to develop their goals and objectives before being able to implement effective measurement. More mature programs use implementation measures to evaluate performance, while the most mature programs use effectiveness/efficiency and business impact measures to determine the effect of their information security processes and procedures.

An information security program is dependent upon upper-level management support to define its goals and objectives. These goals and objectives may be expressed through information security policies and processes at the program's inception, or in a variety of other sources. (Goals and objectives are addressed in more detail in Sections 4.1 and 5.2.) Information security policies are documented, and information security procedures begin to stabilize, as the program is implemented and begins to mature. To be useful, information security measurement requires existence of documented procedures and some available data on the implementation of security controls.
Figure 3-1. Information Security Program Maturity and Types of Measurement

A mature program normally uses multiple tracking mechanisms to document and quantify various aspects of its performance. As more data becomes available, the difficulty of measurement decreases and the ability to automate data collection increases. Data collection automation depends on the availability of data from automated sources versus the availability of data input by personnel. Manual data collection involves developing questionnaires and conducting interviews and surveys with the organization's staff. More usable data is available from semi-automated and automated data sources—such as self-assessment tools, certification and accreditation (C&A) databases, and incident reporting/response databases—as an information security program matures. Measures data collection is considered to be fully automated when all data is gathered by automated data sources without human involvement or intervention.
Types of measures (implementation, effectiveness/efficiency, and impact) that can realistically be obtained and are useful for performance improvement depend on the maturity of the security control implementation. Although different types of measures can be used simultaneously, the primary focus of information security measures shifts as implementation of the information security program matures. As information security program goals and strategic plans are documented and implemented, the ability to reliably collect the outcome of their implementation improves. As an organization's information security program evolves and performance data becomes more readily available, measures will focus on program effectiveness/efficiency and the operational results of security control implementation. Once information security is integrated into an organization's processes, the processes become repeatable, measurement data collection becomes fully automated, and the mission or business impact of information security-related actions and events can be determined by analyzing and correlating the measurement data.

Appendix A contains examples of implementation, effectiveness/efficiency, and impact measures.

3.3.1 Implementation Measures

Implementation measures are used to demonstrate progress in implementing information security programs, specific security controls, and associated policies and procedures. Examples of implementation measures related to information security programs include the percentage of information systems with approved system security plans and the percentage of information systems with password policies configured as required. At first, the results of these measures might be less than 100 percent. However, as the information security program and its associated policies and procedures mature, results should reach and remain at 100 percent. At this point, the organization should begin to focus its measurement efforts on effectiveness/efficiency and impact measures.

Implementation measures can also examine system-level areas—for example, the percentage of servers within a system with a standard configuration. At first, the results of this system-level measure will likely be less than 100 percent. When the implementation measure results reach and remain at 100 percent, it can be concluded that the information systems have fully implemented the security controls addressed by this measure, and measurement activities can refocus on other controls in need of improvement. After most implementation measures reach and remain at 100 percent, the organization should begin to focus its measurement efforts on effectiveness/efficiency and impact measures. Organizations should never fully retire implementation measures because they are effective at pointing out specific security controls that are in need of improvement; however, as an organization matures, the emphasis and resources of the measurement program should shift away from implementation and towards effectiveness/efficiency and impact measures.

Implementation measures require data that can be easily obtained from information security assessment reports, quarterly and annual FISMA reports, plans of action and milestones (POA&M), and other commonly used means of documenting and tracking information security program activities.
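As a purely hypothetical illustration of tracking an implementation measure against its target over time (the quarterly values below are invented), a short script can report the trend and flag when the measure has reached and remained at 100 percent, the point at which the text above suggests shifting emphasis toward effectiveness/efficiency and impact measures:

    # Hypothetical trend data for an implementation measure: percentage of servers
    # within a system that have the standard configuration, by reporting period.
    quarterly_results = {
        "FY08 Q1": 72.0,
        "FY08 Q2": 85.0,
        "FY08 Q3": 100.0,
        "FY08 Q4": 100.0,
    }
    target = 100.0

    for quarter, value in quarterly_results.items():
        print(f"{quarter}: {value:.0f}% of servers with standard configuration")

    # Reached and remained at target for the two most recent reporting periods.
    recent = list(quarterly_results.values())[-2:]
    if len(recent) == 2 and all(value >= target for value in recent):
        print("Target sustained; consider refocusing measurement on "
              "effectiveness/efficiency and impact measures.")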
3.3.2 Effectiveness/Efficiency Measures

Effectiveness/efficiency measures are used to monitor if program-level processes and system-level security controls are implemented correctly, operating as intended, and meeting the desired outcome. These measures concentrate on the evidence and results of assessments and may require multiple data points quantifying the degree to which information security controls are implemented and the resulting effect(s) on the organization's information security posture. For example, the percentage of enterprise operating system vulnerabilities for which patches have been applied or that have been otherwise mitigated is both an implementation and effectiveness measure. It measures the implementation of the security control Flaw Remediation (SI-2) in SP 800-53 because the result of the measure demonstrates whether or not vulnerabilities are mitigated through patches or other means. At the same time, the result indicates the effectiveness of the Security Alerts and Advisories (SI-5) security control because any result less than the target indicates a lack of ability to receive alerts and use them to successfully mitigate vulnerabilities.

Effectiveness/efficiency measures address two aspects of security control implementation results: the robustness of the result itself, referred to as effectiveness, and the timeliness of the result, referred to as efficiency. For example, the effectiveness/efficiency measure—percentage of information security incidents caused by improperly configured access controls—relies on information regarding the implementation and effectiveness of the following security controls: Incident Monitoring (IR-5); Audit Monitoring, Analysis, and Reporting (AU-6); and Monitoring Configuration Changes (CM-4).

Additionally, the effectiveness/efficiency measure—the percentage of system components that undergo maintenance on schedule—relies on information regarding the efficiency of the following security controls: Periodic Maintenance (MA-2) and Life Cycle Support (SA-3).

Effectiveness/efficiency measures provide key information for information security decision makers about the results of previous policy and acquisition decisions. These measures can offer insight for improving performance of information security programs. Furthermore, effectiveness/efficiency measures can be used as a data source for continuous monitoring efforts because they help determine the effectiveness of security controls. The results of effectiveness/efficiency measures can be used to ascertain whether selected security controls are functioning properly and are helping facilitate corrective action prioritization.

Effectiveness/efficiency measures may require fusing information security program activities data with the data obtained from automated monitoring and evaluation tools in a manner that can be directly tied to security controls implementation.
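The following sketch is illustrative only and is not part of the guide; it shows how the flaw remediation example above might be expressed as two data points per system, one for effectiveness (the share of known vulnerabilities mitigated) and one for efficiency (the average time to apply a patch). All names and values are assumptions.

    # Hypothetical effectiveness/efficiency calculation for the flaw remediation example.
    records = [
        # (system, vulnerabilities identified, vulnerabilities mitigated, days-to-patch samples)
        ("System A", 120, 108, [5, 9, 14]),
        ("System B", 60, 60, [3, 4]),
    ]

    for system, identified, mitigated, days_to_patch in records:
        effectiveness = 100.0 * mitigated / identified        # percent of known vulnerabilities mitigated
        efficiency = sum(days_to_patch) / len(days_to_patch)  # mean days from release to installation
        print(f"{system}: {effectiveness:.1f}% mitigated; "
              f"average {efficiency:.1f} days to apply a patch")

Under these assumptions, the first value speaks to the effectiveness aspect of the controls named above, and the second to the timeliness (efficiency) aspect.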
3.3.3 Impact Measures
Impact measures are used to articulate the impact of information security on an organization's mission. These measures are inherently organization-specific since each organization has a unique mission. Depending upon the organization's mission, impact measures can be used to quantify:

• Cost savings produced by the information security program or through costs incurred from addressing information security events;
• The degree of public trust gained/maintained by the information security program; or
• Other mission-related impacts of information security.
These measures combine information about the results of security controls implementation with a variety of information about resources. They can provide the most direct insight into the value of information security to the organization and are the ones that are sought out by executives. For example, the percentage of the agency's information system budget devoted to information security relies on information regarding the implementation, effectiveness, and outcome of the following NIST SP 800-53 security controls: Allocation of Resources (SA-2) and Acquisitions (SA-4). Another, more generalized budget-related impact measure would be the number of information security investments reported to OMB in an Exhibit 300. Rather than examining the impact of a security control or controls, this measure evaluates the relationship between the portfolio of information security investments and the budget process.

Impact measures require tracking a variety of resource information across the organization in a manner that can be directly tied to information security activities and events.
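As a minimal, hypothetical illustration (the budget figures are invented and the calculation is not prescribed by this guide), a budget-related impact measure of the kind described above could be computed as follows:

    # Hypothetical impact-measure calculation: information security spending as a
    # share of the information system budget, plus cost incurred from security events.
    information_system_budget = 12_500_000     # assumed total, in dollars
    information_security_spending = 1_100_000  # assumed, in dollars
    incident_costs = [42_000, 8_500]            # assumed cost of each security event

    budget_share = 100.0 * information_security_spending / information_system_budget
    print(f"Information security spending: {budget_share:.1f}% of the information system budget")
    print(f"Costs incurred from information security events: ${sum(incident_costs):,}")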
3.4 Measurement Considerations
Organizations embarking on information security performance measurement should be aware of several considerations that can help make their program a success. These include specific organizational structure and processes as well as an understanding of required budget, personnel, and time resources.
3.4.1 Organizational Considerations
Appropriate stakeholders must be included in the development of information security measures and program implementation. Organizational elements that do not have information security as their primary responsibility but interact with information security on a regular basis (e.g., training, resource management, legal department) may need to be included in this process. (See Section 5.1 for more information on stakeholders.) If an organizational element exists that is responsible for performance measurement in general, the development and implementation of an information security measurement program should be coordinated with that organization. If a process exists for approving organization-wide data calls and actions, development and implementation of the information security measurement program should comply with the existing process.
3.4.2 Manageability
Any information security measurement program must be manageable for the implementing organization. Results of many information security activities can be quantified and used for performance measurement; however, since resources are limited and the majority of resources should be applied to correcting performance gaps, organizations should prioritize measurement requirements to ensure that a limited number of measures are gathered. Each stakeholder should be responsible for as few measures as possible—usually two to three measures per stakeholder. This helps ensure that the measures that are collected are meaningful, yield impact and outcome findings, and provide stakeholders with the time necessary to use the results to address performance gaps. As the program matures and target levels of measurement are reached, obsolete measures should be phased out and new ones that measure completion and effectiveness of more current items should be used. New measures will also be required if the organization's mission is redefined or if changes are made to information security policies and guidelines.
3.4.3 Data Management Concerns

To ascertain the quality and validity of data, data collection methods and data repositories used for measures data collection and reporting, either directly or as data sources, should be standardized. The validity of data is suspect if the primary data source is an incident-reporting database that stores only the information reported by a few organizational elements, or if reporting processes between organizations are inconsistent. The importance of standardizing reporting processes cannot be overemphasized. When organizations are developing and implementing processes that may serve as inputs into an information security measurement program, they must ensure that data gathering and reporting are clearly defined to facilitate the collection of valid data.

Organizations must understand that although they may collect substantial amounts of information security data, not all data will be useful for their information security measurement program at any given point in time. Any data collection specifically for the purpose of information security measures must be as nonintrusive as possible—and of maximum usefulness—to ensure that available resources are used primarily to correct problems rather than collect data.

Establishment of an information security measurement program will require a substantial investment to ensure that the program is implemented in a way that will maximize its benefits. Benefits of the program are expected to outweigh the costs of investing resources to maintain the program.

Finally, the information contained in information security data repositories represents a significant collection of operational and vulnerability data. Due to the sensitivity of this data, information security performance measurement data repositories need to be protected accordingly.

3.4.4 Automation of Measurement Data Collection

Efficient data management is facilitated by automating measurement data collection. Automating measurement data collection standardizes data collection and reporting, and helps institutionalize measurement activity by integrating it into business processes. In addition, automated data collection minimizes opportunities for human error, leading to greater accuracy of available data. Standardized collection and reporting can also increase data availability, as collections are likely to be housed in a centralized database or similar data repository.
16
compliance with regulations such as FISMA, they can also be used to map s
control settings to the corresponding NIST SP 800-53 security controls, which c
verification of compliance more consistent and efficient. For example, a check
the password strength settings on a system and report whether or not those sett
requirements specified in NIST SP 800-53. The results of such automated dat
provide dynamic updates t
pecific technical
an make the
list could examine
ings meet
a collection could
o an agency’s automated information security performance measures
to indicate if information security targets are being achieved and where corrective actions and
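As a purely illustrative sketch (not part of this guideline), the following Python fragment shows how output from such a checklist tool might be turned into an automated measure. The XML file name, element names, and control identifiers are hypothetical assumptions; an organization's actual checklist format and its mapping to NIST SP 800-53 controls would differ.

    # Illustrative sketch only: derive a per-control compliance percentage from a
    # hypothetical XML checklist result file such as:
    #   <checklist>
    #     <setting id="password-min-length" control="IA-5" result="pass"/>
    #     <setting id="session-lock-timeout" control="AC-11" result="fail"/>
    #   </checklist>
    import xml.etree.ElementTree as ET
    from collections import defaultdict

    def control_compliance(xml_path):
        """Return the percentage of checked settings that pass, per security control."""
        totals = defaultdict(int)
        passes = defaultdict(int)
        for setting in ET.parse(xml_path).getroot().iter("setting"):
            control = setting.get("control")          # e.g., "IA-5"
            totals[control] += 1
            if setting.get("result") == "pass":
                passes[control] += 1
        return {c: 100.0 * passes[c] / totals[c] for c in totals}

    if __name__ == "__main__":
        for control, pct in sorted(control_compliance("checklist_results.xml").items()):
            print(f"{control}: {pct:.1f}% of checked settings meet requirements")

Output of this kind could be loaded into a central measures repository on each collection cycle, consistent with the data management practices described in Section 3.4.3.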
3.5 Information Security Measurement Program Scope

An information security measurement program can be scoped to a variety of environments and needs:

Quantifying information system-level security performance for an operational information system;

Quantifying the integration of information security into the system development life cycle (SDLC) during information system and software development processes; and

Quantifying enterprise-wide information security performance.

Information security measures can be applied to organizational units, sites, or other organizational constructs. Organizations should carefully define the scope of their information security measurement program based on specific stakeholder needs, strategic goals and objectives, operating environments, risk priorities, and information security program maturity.

3.5.1 Individual Information Systems

Information security measurement can be applied at the information system level to provide quantifiable data regarding the implementation, effectiveness/efficiency, or impact of required or desired security controls. Information system owners can use measures to support the determination of the information system's security posture, demonstrate compliance with organizational requirements, and identify areas in need of improvement. Information security measurement can support certification and accreditation activities (e.g., risk assessments, information system security plans, and continuous monitoring), FISMA reporting activities, or capital planning activities.

3.5.2 System Development Life Cycle

Information security measurement should be used throughout the SDLC to monitor implementation of appropriate security controls. Formalized measurement of information security during the SDLC provides information to the project manager that is essential to understanding how well information security is integrated into the SDLC and to what degree vulnerabilities are being introduced into the information system. Different measures may be useful for different project activities. The following table provides examples of information security measures that can be used during the SDLC for a variety of project activities.

Table 1. Measurement During System Development3
SDLC Phase: Acquisition/Development
Relevant Measures: Percentage of product defects that negatively impact the security posture of the system
Purpose: Identify software defects that may be exploited in the future
Value: Provides insight into the effectiveness of life cycle processes and information security training for developers; indicates need for additional security controls in operations

SDLC Phase: Acquisition/Development
Relevant Measures: Percentage of information security requirements (i.e., security controls implemented) that are mapped to design
Purpose: Determine if security requirements are being planned and implemented
Value: Provides insight into inclusion of information security requirements in early releases; provides insight into complexity of information security implementation; indicates short- and long-term need for additional security controls in operations

SDLC Phase: Acquisition/Development
Relevant Measures: Number of entry points for a module (should be the minimum necessary)
Purpose: Fewer entry points reduces the amount of monitoring required
Value: Provides insight into possibility of inherent vulnerabilities and increased enterprise risk

SDLC Phase: Acquisition/Development
Relevant Measures: Number of discovered defects that are known as software vulnerabilities (e.g., buffer overflows and cross-site scripting); number of deviations between design, code, and requirements; number of defects and the area of the code in which they were found (it is a higher risk to have the defects between components, unit seams, or other interfaces); percent of discovered vulnerabilities that have been mitigated
Purpose: Proactively address security defects prior to testing and implementation
Value: Helps minimize development and maintenance rework costs

SDLC Phase: Acquisition/Development
Relevant Measures: Cost/schedule variance in information security activities
Purpose: Monitor planning and implementation of information security activities
Value: Provides insight into cost and schedule risks to project success; increases accuracy in planning of future projects

SDLC Phase: Implementation/Assessment
Relevant Measures: Percentage of modules that contain vulnerabilities; percentage of failed security control requirements
Purpose: Identify software defects that may be exploited in the future
Value: Provides insight into risk of the system being exploited when implemented; indicates need for additional security controls in operations

3 These measures were developed in collaboration with Department of Homeland Security Software Assurance Program.
Collecting and analyzing these types of measures will help the project manager in the following manner:

Determine if software defects that may impact information security are being identified early in the life cycle where they are more cost-effective to fix;

Identify and remove potential vulnerabilities in software and develop more secure design practices;

Identify and investigate trends that require corrective actions, such as training or revising poorly written and confusing procedures;

Determine if the information system will comply with required security controls; and

Track trends in information security risk throughout the SDLC.

Collecting, analyzing, and reporting appropriate security measures during the SDLC can be used to improve integration of information security into the information system development effort to increase the overall assurance that system security requirements are built in rather than added later.

3.5.3 Enterprise-Wide Programs

Information security measurement can be implemented on an enterprise-wide level to monitor the implementation, effectiveness/efficiency, and impact on the organization's information security activities. Enterprise-level measures may be derived by aggregating multiple information system-level measures or developed by using the entire enterprise as the scope.

For an enterprise-wide measurement to be effective, the organization must operate at a certain level of maturity to ensure that processes the measures depend upon are consistent, repeatable, and can ensure availability of data across the enterprise.
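To illustrate the aggregation idea in rough terms, the following Python sketch rolls hypothetical system-level results up into a single enterprise-level figure. The systems, counts, and the specific measure (percentage of identified vulnerabilities mitigated) are invented for the example and are not prescribed by this guideline.

    # Illustrative sketch only: aggregate system-level measures into an
    # enterprise-level measure by pooling the underlying counts.
    system_results = [
        {"system": "General support system", "identified": 40, "mitigated": 36},
        {"system": "Payroll application",    "identified": 25, "mitigated": 20},
        {"system": "Public web application", "identified": 60, "mitigated": 45},
    ]

    def enterprise_mitigation_rate(results):
        """Pool raw counts rather than averaging percentages so that systems
        with more findings carry proportionally more weight."""
        identified = sum(r["identified"] for r in results)
        mitigated = sum(r["mitigated"] for r in results)
        return 100.0 * mitigated / identified if identified else None

    for r in system_results:
        rate = 100.0 * r["mitigated"] / r["identified"]
        print(f"{r['system']}: {rate:.0f}% of identified vulnerabilities mitigated")
    print(f"Enterprise-level measure: {enterprise_mitigation_rate(system_results):.1f}%")

Pooling the underlying counts, rather than averaging per-system percentages, is one design choice; organizations may instead weight systems by criticality or impact level.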
4. LEGISLATIVE AND STRATEGIC DRIVERS
This section explains the relationship between overall agency performance measures reporting and information security performance measures reporting, and provides agencies with guidelines on how to link these two activities to ensure that their information security program contributes to overall accomplishment of the agency mission, goals, and objectives. Sections 4.1 and 4.2 provide an overview of the Government Performance Results Act (GPRA), the Federal Information Security Management Act (FISMA), and the Federal Enterprise Architecture from a performance measurement point of view and describe their associated performance management requirements, while Section 4.3 discusses the linkage between enterprise strategic planning and information security.

4.1 Legislative Considerations

Legislation such as GPRA and FISMA, along with executive regulations, is driving an increased emphasis on managing, quantifying, and reporting agency performance. The purpose of these efforts is to facilitate the streamlining of U.S. government operations, improve efficiencies in delivering services, and demonstrate the value of these services to the public. Agencies are required to strategically plan their initiatives and make these plans and corresponding performance measures available to the public. The Executive Branch also develops initiatives that may require organizations to collect and report performance measures.

4.1.1 Government Performance Results Act

GPRA focuses on improving program effectiveness and efficiency by adequately articulating program goals and providing information on program performance. To structure and facilitate program improvement, it requires agencies to develop multiyear strategic plans and annually report their performance against these plans.

The purpose of GPRA is to:

Improve the confidence of the American people in the capability of the federal government by systematically holding federal agencies accountable for achieving program results;

Initiate program performance reform with a series of pilot projects in setting program goals, measuring program performance against those goals, and reporting publicly on their progress;

Improve federal program effectiveness and public accountability by promoting a new focus on results, service quality, and customer satisfaction;

Help federal managers improve service delivery by requiring that they plan for meeting program objectives, and by providing them with information about program results and service quality;
Improve congressional decision making by providing more objective information on achieving statutory objectives and by reporting on the relative effectiveness and efficiency of federal programs and spending; and

Improve internal management of the federal government.4

4 Public Law 103-62, Government Performance and Results Act of 1993.

GPRA mandates agencies to conduct strategic and performance planning that culminates in annual submissions of strategic plans and performance measures reports. GPRA puts this planning in the context of the overall agency Capital Planning and Investment Control (CPIC) process by emphasizing "managing for results—what the program accomplishes and how well the accomplishments match with the program's purpose and objectives."5

5 OMB Circular A-11, Preparation, Submission, and Execution of the Budget, 2005, Section 15, clause 15.5.

As a part of their annual strategic and performance planning processes, agencies should:

Define their long-term and annual goals and objectives;

Set measurable targets of performance; and

Report their performance against goals and objectives to the Office of Management and Budget (OMB) on a quarterly basis.

This performance measures reporting directly supports GPRA by providing a means to track performance against agency goals and objectives and measurable performance targets. Agencies can demonstrate the impact of information security on their missions by aligning information security performance measures with their information security goals and objectives.

GPRA is implemented by OMB Circular A-11, Preparation, Submission, and Execution of the Budget, Part 6.

4.1.2 Federal Information Security Management Act

FISMA requires federal agencies to provide appropriate protection of their resources through implementing a comprehensive information security program that is commensurate with the sensitivity of the information being processed, transmitted, and stored by agency information systems. It also requires agencies to assess and report their performance in implementing and managing their information security programs.

The purpose of FISMA is to:

Provide a comprehensive framework for ensuring the effectiveness of security controls over information resources that support federal operations and assets;

Recognize the highly networked nature of the current federal computing environment and provide effective government wide management and oversight of related information
security risks, including coordination of information security efforts throughout the civilian, national security, and law enforcement communities;

Provide for the development and maintenance of minimum security controls required to protect federal information and information systems;

Provide a mechanism for improved oversight of federal agency information security programs;

Acknowledge that commercially developed information security products offer advanced, dynamic, robust, and effective information security solutions for the protection of critical information infrastructures important to national defense and economic security that are designed, built, and operated by the private sector; and

Recognize that the selection of specific technical hardware and software information security solutions should be made by individual agencies from among commercially developed products.6

6 Public Law 107-347, E-Government Act of 2002, Title III.

FISMA also mandated NIST to develop and promulgate standards and guidelines pertaining to federal information systems.

FISMA requires agencies to identify and assess risks to their information systems and define and implement appropriate security controls to protect their information resources. It also requires agencies to report quarterly and annually on the status of their information security programs.

An institutionalized information security performance measurement program enables agencies to collect and report on relevant FISMA performance indicators. For example, information security performance measures enable agencies to quickly determine the percentage of their systems that are certified and accredited, the percentage of their personnel that have taken required information security training, and their compliance with other FISMA reporting requirements. A mature information security measurement program also enables agencies to satisfy any new information security performance measures reporting requirements required internally or externally by providing a basis for information security data collection, analysis, quantification, and reporting.

OMB publishes annual guidelines on the process and elements of annual and quarterly FISMA reporting.

4.2 Federal Enterprise Architecture

In addition to legislative information security performance measurement requirements, the Executive Branch periodically implements initiatives designed to monitor and improve the effectiveness of federal organizations. One such Executive Branch initiative that relies on information security measures is the Federal Enterprise Architecture (FEA). One of FEA's reference models is the Performance Reference Model (PRM). The PRM is a standardized
framework to measure the performance of major IT investments and their contribution to program performance.

Organizations should consider tying information security measures development and implementation into FEA efforts to reduce duplication of data collection and facilitate integration of information security into their enterprise architectures.

4.3 Linkage Between Enterprise Strategic Planning and Information Security

Federal agencies develop their long-term strategic goals as part of their strategic planning process—a requirement of GPRA. Five to six strategic goals are usually established, each with several performance objectives that describe how the goal will be accomplished. As a part of this process, agencies develop performance measures to quantify the accomplishment of their goals and objectives with quarterly and annual targets for each performance measure.

Information security performance measures provide a means to monitor and report on an agency's implementation of its information security program and associated performance measures as mandated by FISMA. These measures can also help assess the effectiveness of security controls in protecting agency information resources in support of the agency's mission. Ultimately, all efforts must support the agency's overall goals and objectives, which are defined and reassessed annually during its strategic planning activities. Information security must be explicitly tied to at least one goal or objective in the strategic planning process to demonstrate its importance in accomplishing the agency's mission. This connection can be established by identifying goals and objectives that would articulate agency information security requirements within the context of the overall agency mission. Progress toward accomplishing these goals and objectives may be monitored by implementing appropriate information security performance measures.

Information security performance measures can be developed and used at multiple levels within an organization—including the overall agency information security program, operating bureau information security programs, or individual agency programs. They can also be scoped to different types of efforts, as discussed in Section 3.5. Measures developed at different levels of an organization should be used for internal management and process improvement purposes. They may also be aggregated to agency-level information security program performance measures. Agency-level measures will either be reported to the organization's upper management or used for external reporting—such as GPRA and FISMA.
5. MEASURES DEVELOPMENT PROCESS
The benefit of devoting the time to set up an information security performance measures program in advance is similar to that of allowing time for requirements definition during information system development—investing time early in the process is more effective than retrofitting requirements once the effort is under way. Important considerations for setting up an information security performance measures program include:

Selecting the measures most appropriate for the organization's strategy and business environment, including mission and information security priorities, environment, and requirements;

Taking time to collect input and get buy-in from, and provide education to, all relevant stakeholders; and

Ensuring that appropriate technical and process infrastructure is in place, including creation/modification of data collection, analysis, and reporting tools.

Two processes—measures development and measures implementation—guide the establishment and operation of an information security measurement program. The measures development process establishes the initial set of measures as well as selection of the measures subset that is appropriate for an organization at a given time. The information security measurement program implementation process is iterative by nature and ensures that appropriate aspects of information security are measured for a specific time period. The remainder of this section describes the measures development process. (Section 6 describes the information security measurement program implementation process.)

Figure 5-1 illustrates the place of information security measures within a larger organizational context and demonstrates that they can be used to progressively measure the implementation, effectiveness/efficiency, and business impact of information security activities within organizations or for specific information systems.

The information security measures development process consists of two major activities:

Identification and definition of the current information security program; and

Development and selection of specific measures to gauge the implementation, effectiveness, efficiency, and impact of the security controls.

The activities outlined in Figure 5-1 need not be done sequentially. The process is provided as a way to think about measures and facilitate the identification of measures tailored to a specific organization and its different stakeholder groups.
Figure 5-1. Information Security Measures Development Process

5.1 Stakeholder Interest Identification

Phase 1 of the measures development process (see Figure 5-1) identifies relevant stakeholders and their interests in information security measurement. Anyone within an organization can be an information security stakeholder, although some individuals or groups have a greater stake than others. The primary information security stakeholders are:

Agency Head;

CIO;

SAISO/CISO;

ISSO;

Program manager/information system owner;

Information system administrator/network administrator;

Security engineers; and

Information system support personnel.

Secondary information security stakeholders are members of groups within an organization that do not have information security as their primary mission but are involved with information security in some aspects of their operations. Examples of secondary information security stakeholders may include:

Chief Financial Officer (CFO);

Training organization;

Human resources/personnel organization;

Inspectors General (IG); and

Chief Privacy Officer or other designated official with privacy responsibilities.

Stakeholder interests will differ, depending on the information security aspects of their particular role and their position within the organizational hierarchy. Each stakeholder may require an additional set of customized measures that provides a view of the organization's information security performance within their area of responsibility. Interests may be determined through multiple venues, such as interviews, brainstorming sessions, and mission statement reviews. In many cases, stakeholder interests are driven by laws and regulations. As referenced in Section 3.4.2, each stakeholder should initially be responsible for two to three measures. It is recommended that fewer measures per stakeholder be used when an organization is establishing an information security program; the number of measures per stakeholder should gradually increase as the information security program and information security measurement program mature.

Stakeholders should be involved in each step of information security measures development to ensure organizational buy-in to the concept of measuring information security performance. This involvement will also ensure that a sense of ownership of the information system security measures exists at multiple levels of the organization to encourage the program's overall success.

The three measurable aspects of information security—business impact, efficiency/effectiveness, and implementation—speak to different stakeholders. For example, an executive will be interested in the business and mission impact of information security activities (e.g., What is the monetary and public trust cost of the latest incident? Is there an article about us in a major newspaper?), information security and program managers will be interested in the effectiveness/efficiency of information security programs (e.g., Could we have prevented the incident? How fast did we respond to it?), and information systems or network administrators will want to know what went wrong (e.g., Have we performed all necessary steps to avoid or minimize the impact of the incident?).

5.2 Goals and Objectives Definition

Phase 2 of the measures development process (see Figure 5-1) is to identify and document information system security performance goals and objectives that would guide security control implementation for the information security program of a specific information system. For federal information systems, these goals and objectives may be expressed in the form of high-level policies and requirements, laws, regulations, guidelines, and guidance.7

7 See Section 4 for additional information on requirements, laws, regulations, guidelines, and guidance.
Information security program goals and objectives can also be derived from enterprise-level goals and objectives in support of the overall organization's mission, which are usually articulated in agency strategic and performance plans. Applicable documents should be reviewed to identify and extract applicable information security performance goals and objectives. Extracted goals and objectives should be validated with the organizational stakeholders to ensure their acceptance of, and participation in, the measures development process.

Federal Information Processing Standard (FIPS) 200, Minimum Security Requirements for Federal Information and Information Systems, provides specifications for minimum security requirements. NIST SP 800-53 provides minimum security controls corresponding to low-impact, moderate-impact, and high-impact categories as defined in FIPS 199, Standards for Security Categorization of Federal Information and Information Systems. Agencies must define and implement minimum security controls based on the sensitivity of data processed, stored, and transmitted on their information systems. As such, agency information security programs must include planning, implementing, monitoring, and reporting on the implementation and effectiveness of these information system security controls. To facilitate explicit linkage of information security activities with agency-level strategic planning, agencies can use specifications for minimum security requirements, stated in FIPS 200, as an input into objectives for developing information security performance measures. (These specifications, which correspond to the 17 security control families in NIST SP 800-53, are provided in Appendix D. Appendix A provides candidate information security measures from both programmatic and system-level perspectives, with corresponding goals and objectives.)

5.3 Information Security Policies, Guidelines, and Procedures Review

Phase 3 of the measures development process (see Figure 5-1) focuses on organization-specific information security practices. Details of how security controls should be implemented are usually set forth in organization-specific policies and procedures that define a baseline of information security practices for the information system. Specifically, they describe how implementing security controls, requirements, and techniques lead to accomplishing information security performance goals and objectives. These documents should be examined not only during initial measures development, but in future measures development activities when the initial list of measures is exhausted and needs to be replaced. Applicable documents should be reviewed to identify information security controls, applicable processes, and targets of performance.

5.4 Information Security Program Implementation Review

In Phase 4 of the measures development process (see Figure 5-1), any existing measures and data repositories that can be used to derive measures data should be reviewed. Following the review, applicable information should be extracted and used to identify appropriate implementation
evidence to support measures development and data collection.8 Implementation evidence points to aspects of security controls that would be indicative of the information security performance objective being met, or at least that actions leading to the accomplishment of the performance objective in the future are performed. The information system security requirements, processes, and procedures that have been implemented can be extracted by consulting multiple sources, including documents, interviews, and observation.

8 Implementation evidence refers to the data collected to support an information security performance measure. Implementation evidence is discussed in greater detail in Table 2 contained in Section 5.6.

The following sources may contain information from which measures data can be generated:

System Security Plans;9

Plan of Action and Milestones (POA&M) reports;

Latest GAO and IG findings;

Tracking of information security-related activities, such as incident handling and reporting, testing, network management, audit logs, and network and information system billing;

Risk assessments and penetration testing results;

C&A documentation (e.g., security assessment reports);

Continuous monitoring results;

Contingency plans;

Configuration management plans; and

Training results and statistics.

9 NIST SP 800-18 provides guidelines on System Security Plan development.

As information system security practices evolve and the documents that describe them change, existing measures will be retired and new measures will be developed. To ensure that the newly developed measures are appropriate, these and similar documents will need to be examined to identify new areas that should be captured in measures.

5.5 Measures Development and Selection

Phases 5, 6, and 7 of the measures development process, depicted in Figure 5-1, involve developing measures that track process implementation, efficiency/effectiveness, and mission impact. The performance measures development process presented in this section describes how to develop measures in these three areas for information security. (Appendix A provides candidate measures, some of which correspond to selected security control families in NIST SP 800-53.) To support continuous improvement of security for information systems and programs, the process explicitly connects information security activities to the organization's strategic goals
through development and use of performance measures. This approach assumes that organizations have multiple strategic goals, and that a single goal may require inputs from multiple measures.

Organizations manage what they measure. It is important to select two to three high-priority measures per stakeholder, determined by using a risk-based approach.

5.5.1 Measures Development Approach

Depending on the scope of the measurement effort, development of information security measures should focus on gauging the security performance of a specific security control, a group of security controls, or a security program. Such an approach will result in measures that help determine where a given organization stands in support of the corresponding strategic objective—and, when multiple controls or the entire program are being measured, provide a broad view of information security performance.

Measures corresponding to security control families or individual security controls should:

Be mapped directly to the individual security control(s);

Use data describing the security control's implementation to generate required measures, such as POA&M, testing, and project tracking; and

Characterize the measure as applicable to low, moderate, or high information system categorization.

Measures dealing with overall information security program performance should:

Be mapped to information security goals and objectives that may encompass performance of information security across the spectrum of security controls; and

Use the data describing the information security program performance to generate required measures.

5.5.2 Measures Prioritization and Selection

The universe of possible measures, based on existing policies and procedures, will be quite large. Measures must be prioritized to ensure that the set selected for initial implementation has the following qualities:

Facilitates improvement of high-priority security control implementation as defined using a risk-based approach. "High priority" may be defined by the latest GAO or IG reports, results of a risk assessment, through continuous monitoring, or based on an internal organizational goal.

Uses data that can realistically be obtained from existing sources and data repositories (e.g., system inventories, training databases, POA&Ms).
Measures processes that already exist and are established. Measuring inconsistent processes will not provide meaningful data about information security performance and will not be useful for targeting specific aspects of performance. However, attempting such measurement may still be useful to attain a baseline to be closely monitored through continuous assessment and further measurement to improve the information security posture.

Organizations may decide to use a weighting scale to differentiate the importance of selected measures and ensure that results accurately reflect existing information security program priorities. This would involve assigning values to each measure based on its importance in the context of the overall information security program. Weight should be based on the overall risk mitigation goals and would likely reflect higher criticality of enterprise-level initiatives versus smaller-scale initiatives. This scale is a useful tool that facilitates the integration of information security measures into the departmental capital planning process.
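As a simplified sketch of how such a weighting scale might be applied, the Python fragment below scores hypothetical candidate measures by multiplying an assigned weight by a priority rating; the measures, values, and scoring rule are assumptions made for illustration, not a method defined by this guideline.

    # Illustrative sketch only: rank candidate measures using assigned weights.
    # "weight" reflects importance to overall risk mitigation goals (for example,
    # enterprise-level initiatives rated higher); "priority" reflects how directly
    # the measure addresses a current high-priority performance gap.
    candidates = [
        {"measure": "Percentage of approved system security plans",        "weight": 3, "priority": 2},
        {"measure": "Percentage of systems with tested contingency plans", "weight": 2, "priority": 3},
        {"measure": "Average elapsed time to apply critical patches",      "weight": 3, "priority": 3},
    ]

    def score(candidate):
        return candidate["weight"] * candidate["priority"]

    for c in sorted(candidates, key=score, reverse=True):
        print(f"{score(c):>2}  {c['measure']}")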
5.5.3 Establishing Performance Targets

Establishing performance targets is an important component of defining information security measures. Performance targets establish a benchmark by which success is measured. The degree of success is based on the proximity of the measure result to the stated performance target. The mechanics of establishing performance targets differ for implementation measures and the other two types of measures (effectiveness/efficiency and impact). For implementation measures, targets are set to 100 percent completion of specific tasks.

Setting performance targets for effectiveness/efficiency and impact measures is complex because management will need to apply qualitative and subjective reasoning to determine appropriate levels of security effectiveness and efficiency, and use these levels as targets of performance for applicable measures. Although every organization desires effective implementation of security controls, efficient delivery of security services, and minimal impact of security events on its mission, the associated measurements will be different for different systems. An organization can attempt to establish performance targets for these measures and should be ready to adjust these targets, based on actual measurements, once they are obtained. The organization may also decide not to set targets for these measures until the first measurement is collected that can be used as a performance baseline. Once the baseline is obtained and corrective actions identified, appropriate measurement targets and implementation milestones that are realistic for a specific system environment can be defined. If performance targets cannot be established after the baseline has been obtained, management should evaluate whether the measured activities and corresponding measures are providing the expected value for the organization.
Establishment of effectiveness/efficiency and impact measures baselines and targets of performance can be facilitated if historic data that pertains to these measures is available. Trends observed in the past will provide insight into ranges of performance that have existed previously, and guide the creation of realistic targets for the future. In the future, expert recommendations and standards within the industry, when published, may provide a means of setting targets.

Figure 5-2 provides an example of an implementation measure that is based on the percentage of approved system security plans.

[Figure 5-2 is a chart plotting the percentage of approved system security plans, from 0% to 100%, at six-month intervals from October 2005 through October 2007.]

Figure 5-2. Information Security Measures Trend Example

5.6 Measures Development Template

Organizations should document their performance measures in a standard format to ensure repeatability of measures development, tailoring, collection, and reporting activities. A standard format will provide the detail required to guide measures collection, analysis, and reporting activities. The measures template, provided in Table 2, is an example of such a format.

This template and the candidate measures provided in Appendix A are examples, and are meant to be tailored to fit the needs of the organization. While the measures template provides a suggested approach for measurement, depending upon internal practices and procedures, organizations may tailor their own performance measurement templates by using a subset of the provided fields or adding more fields based on their environment and requirements.
Table 2. Measures Template and Instructions
Each field of the template is listed below, along with instructions for the data it should contain.

Measure ID: State the unique identifier used for measure tracking and sorting. The unique identifier can be from an organization-specific naming convention or can directly reference another source.

Goal: Statement of strategic goal and/or information security goal. For system-level security control measures, the goal would guide security control implementation for that information system. For program-level measures, both strategic goals and information security goals can be included. For example, information security goals can be derived from enterprise-level goals in support of the organization's mission. These goals are usually articulated in strategic and performance plans. When possible, include both the enterprise-level goal and the specific information security goal extracted from agency documentation, or identify an information security program goal that would contribute to the accomplishment of the selected strategic goal.

Measure: Statement of measurement. Use a numeric statement that begins with the word "percentage," "number," "frequency," "average," or a similar term. If applicable, list the NIST SP 800-53 security control(s) being measured. Security controls that provide supporting data should be stated in Implementation Evidence. If the measure is applicable to a specific FIPS 199 impact level (high, moderate, or low), state this level within the measure.

Type: Statement of whether the measure is implementation, effectiveness/efficiency, or impact.

Formula: Calculation to be performed that results in a numeric expression of a measure. The information gathered through listing implementation evidence serves as an input into the formula for calculating the measure.

Target: Threshold for a satisfactory rating for the measure, such as milestone completion or a statistical measure. Target can be expressed in percentages, time, dollars, or other appropriate units of measure. Target may be tied to a required completion time frame. Select final and interim target to enable tracking of progress toward stated goal.

Implementation Evidence: Implementation evidence is used to compute the measure, validate that the activity is performed, and identify probable causes of unsatisfactory results for a specific measure.
For manual data collection, identify questions and data elements that would provide the data inputs necessary to calculate the measure's formula, qualify the measure for acceptance, and validate provided information.
For each question or query, state the security control number from NIST SP 800-53 that provides information, if applicable.
If the measure is applicable to a specific FIPS 199 impact level, questions should state the impact level.
For automated data collection, identify data elements that would be required for the formula, qualify the measure for acceptance, and validate the information provided.

Frequency: Indication of how often the data is collected and analyzed, and how often the data is reported. Select the frequency of data collection based on a rate of change in a particular security control that is being evaluated. Select the frequency of data reporting based on external reporting requirements and internal customer preferences.

Responsible Parties: Indicate the following key stakeholders:
Information Owner: Identify organizational component and individual who owns required pieces of information;
Information Collector: Identify the organizational component and individual responsible for collecting the data. (Note: If possible, Information Collector should be a different individual or even a representative of a different organizational unit than the Information Owner, to avoid the possibility of conflict of interest and ensure separation of duties. Smaller organizations will need to determine whether it is feasible to separate these two responsibilities.); and
Information Customer: Identify the organizational component and individual who will receive the data.

Data Source: Location of the data to be used in calculating the measure. Include databases, tracking tools, organizations, or specific roles within organizations that can provide required information.

Reporting Format: Indication of how the measure will be reported, such as a pie chart, line chart, bar graph, or other format. State the type of format or provide a sample.
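For illustration, the sketch below records one hypothetical measure using the template fields and evaluates its Formula entry from implementation evidence. The identifier, goal, counts, and data sources are invented examples rather than required content.

    # Illustrative sketch only: one measure documented per the template,
    # with its formula evaluated from hypothetical implementation evidence.
    measure = {
        "measure_id": "SSP-1",
        "goal": "All information systems are covered by an approved system security plan",
        "measure": "Percentage of approved system security plans",
        "type": "implementation",
        "formula": "approved_plans / total_systems * 100",
        "target": 100.0,              # percent; implementation measures target 100 percent
        "frequency": "collected and reported semiannually",
        "data_source": "system inventory; security plan repository",
    }

    evidence = {"approved_plans": 38, "total_systems": 50}   # implementation evidence

    result = 100.0 * evidence["approved_plans"] / evidence["total_systems"]
    print(f"{measure['measure_id']}: {result:.0f}% (target {measure['target']:.0f}%)")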
Candidate measures provided in Appendix A are examples of information security measures and may or may not be required for regulatory or organizational reporting at any point in time (e.g., FISMA). The purpose of listing these measures is to demonstrate examples of measures that can be:

Used as stated;

Modified and tailored to a specific organization's requirement; or

Used as a template for other information security measures.

Organizations are encouraged, but not required, to use these measures as a starting point for their information security measurement efforts.

5.7 Feedback Within the Measures Development Process

Measures that are ultimately selected for implementation will be useful not only for measuring performance, identifying causes of unsatisfactory performance, and pinpointing improvement areas, but also for facilitating consistent policy implementation, effecting information security policy changes, redefining goals and objectives, and supporting continuous improvement. This relationship is depicted by the feedback arrows in Figure 5-1, which are marked as Goal/Objective Redefinition, Policy Update, and Continuous Improvement. Once measurement of security control implementation begins, subsequent measures can be used to identify performance trends and determine whether the implementation rate is appropriate. A specific frequency of each measure collection will depend on the life cycle of a measured event. For example, a measure that pertains to the percentage of completed or updated system security plans should not be collected more often than semiannually, while a measure that pertains to crackable passwords should be collected more frequently. Over time, measurements will point to continuous implementation of applicable security controls. Once effectiveness/efficiency measures are implemented, they will facilitate an understanding of whether the security control performance goals, identified in the information security policies and procedures, are realistic and appropriate.

For example, if an information security policy defines a specific password configuration, compliance with this policy could be determined by measuring the percentage of passwords that are configured according to the policy. This measure addresses the level of security control implementation. It is assumed that configuring all passwords according to the policy will significantly reduce, if not eliminate, information system compromises through broken passwords. To measure effectiveness of the existing password policy implementation, the percentage of passwords crackable by common password-breaking tools could be identified. This measure addresses the effectiveness of the security control as implemented. If a significant percentage of crackable passwords remains after the required password policy has been implemented, the logical conclusion is that the underlying policy may be ineffective in thwarting password compromises. If this is the case, an organization will need to consider strengthening the policy or implementing other mitigating measures. Costs and benefits of keeping the password policy as is, tightening it, or replacing password authentication with other techniques must also be determined. Conducting cost-benefit analyses will generate business impact measures to address the issue of redefining information system identification and authentication objectives and appropriately realign these objectives with the information system mission.
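The arithmetic behind this example is straightforward; the short Python sketch below computes both measures from hypothetical counts. The numbers are invented, and producing real values would require the organization's own configuration checks and an authorized password-cracking exercise.

    # Illustrative sketch only: implementation vs. effectiveness measures for a
    # password policy, computed from hypothetical counts.
    total_accounts = 1200
    policy_compliant = 1140   # accounts whose password settings match the policy
    crackable = 95            # accounts cracked by a common tool in an authorized test

    implementation_measure = 100.0 * policy_compliant / total_accounts
    crackable_measure = 100.0 * crackable / total_accounts

    print(f"Passwords configured according to policy: {implementation_measure:.1f}%")
    print(f"Passwords crackable by common tools:      {crackable_measure:.1f}%")

    # High compliance combined with a persistently high crackable percentage would
    # suggest the policy itself, rather than its implementation, needs to be revisited.
    if implementation_measure > 90 and crackable_measure > 5:
        print("Consider strengthening the policy or applying other mitigating measures.")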
6. INFORMATION SECURITY MEASUREMENT IMPLEMENTATION
Information security measurement implementation involves applying measures for monitoring information security control performance and using the results to initiate performance improvement actions. The information security measurement program implementation process consists of six phases, which, when fully executed, will ensure continuous use of these measures for security control performance monitoring and improvement. The process is shown in Figure 6-1.

Figure 6-1. Information Security Measurement Program Implementation Process

6.1 Prepare for Data Collection

Phase 1 of the information security measurement program implementation process, Prepare for Data Collection, involves activities that are essential for establishing a comprehensive information security measurement program—including information security measures identification, definition, development, and selection. The next step is to develop an information security measurement program implementation plan.10

10 The information security measurement program implementation plan can be formal or informal, depending upon the organization's needs.

Specific implementation steps should be defined based on how data for the measures should be collected, analyzed, and reported. These steps should be documented in the measurement program implementation plan. The following items may be included in the plan:
Audience for the plan;
Measurement roles and responsibilities, including responsibilities for data collection (both soliciting and submitting), analysis, and reporting;

Process of measures collection, analysis, and reporting, as tailored to the specific organizational structure, processes, policies, and procedures;

Details of coordination within the Office of the CIO, relating to areas such as risk assessment, C&A, and FISMA reporting activities;

Details of coordination between the SAISO and other functions within the agency (e.g., physical security, personnel security, and privacy) to ensure that measures data collection is streamlined and non-intrusive;

Creation or selection of data collection and tracking tools;

Modifications of data collection and tracking tools; and

Measures summary reporting formats.

Additionally, the information security measurement implementation plan should contain provisions for continuous monitoring of the information security program. Continuous monitoring activities include configuration management, information security impact analyses of changes to the information system, assessment of a subset of security controls, and status reporting. Sound continuous monitoring practices dictate that the organization establishes selection criteria for a subset of the security controls employed within the information system for purposes of continuous monitoring. NIST SP 800-37 provides guidelines on the continuous monitoring process. NIST SP 800-53A provides guidelines on the assessment of security controls. Results generated from continuous monitoring provide data necessary to support and supplement the data collected in Phase 2, and help facilitate corrective action prioritization in Phase 3.

6.2 Collect Data and Analyze Results

Phase 2 of the information security measurement program implementation process, Collect Data and Analyze Results, involves activities essential for ensuring that the collected measures are used to gain an understanding of information system security and identify appropriate improvement actions. This phase includes the following activities:

Collect measures data according to the processes defined in the Measurement Program Implementation Plan;

Aggregate measures as appropriate to derive higher-level measures (e.g., "rolling up" information system-level measures to derive program-level measures);
Consolidate collected data and store in a format conducive to data analysis and reporting—for example, in a database or spreadsheet;

Conduct gap analysis to compare collected measurements with targets (if defined) and identify gaps between actual and desired performance (illustrated in the sketch following this list);

Identify causes of poor performance; and

Identify areas that require improvement.
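A minimal Python sketch of the consolidation and gap-analysis steps follows; the measure names, collected values, and targets are hypothetical and serve only to show the comparison of actual against desired performance.

    # Illustrative sketch only: compare consolidated measurement results against
    # defined targets and flag gaps between actual and desired performance.
    collected = {
        "Percentage of approved system security plans": 76.0,
        "Percentage of personnel with required security training": 91.0,
        "Percentage of incidents reported within required time frames": 64.0,
    }
    targets = {
        "Percentage of approved system security plans": 100.0,
        "Percentage of personnel with required security training": 95.0,
        "Percentage of incidents reported within required time frames": 90.0,
    }

    for name, actual in collected.items():
        target = targets.get(name)
        if target is None:
            print(f"{name}: {actual:.0f}% (no target defined yet; use as baseline)")
            continue
        gap = target - actual
        status = "meets target" if gap <= 0 else f"gap of {gap:.0f} percentage points"
        print(f"{name}: {actual:.0f}% versus target {target:.0f}% ({status})")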
Causes of poor performance can often be identified by using the data from more than one measure. For example, determining that the percentage of approved system security plans is unacceptably low would not be helpful for determining how to correct the problem. To determine the cause of low compliance, information will need to be obtained regarding the reasons for the low percentages (e.g., lack of guidelines, insufficient expertise, or conflicting priorities). This can be collected as separate measures or as implementation evidence for the percentage of approved system security plans. Once this information is collected and compiled, corrective actions could be directed at the cause of the problem.

The following are examples of factors contributing to poor security implementation and effectiveness:

Resources—Insufficient human, monetary, or other resources;

Training—Lack of appropriate training for personnel installing, administering, maintaining, or using the information systems;

Information system upgrades—Information security patches that have been removed but not replaced during information system upgrades;

Configuration management practices—New or upgraded information systems that are not configured with required information security settings and patches;

Software compatibility—Information security patches or upgrades that are incompatible with software applications supported by the information system;

Awareness and commitment—Lack of management awareness and/or commitment to information security;

Policies and procedures—Lack of policies and procedures required to ensure existence, use, and audit of required information security functions;

Architectures—Poor information system and information security architectures that render information systems vulnerable; and
Inefficient processes—Inefficient planning and implementation processes that influence measures, including the communication processes necessary to direct organizational actions.

6.3 Identify Corrective Actions

Phase 3 of the information security measurement program implementation process, Identify Corrective Actions, involves development of a plan to serve as the roadmap for closing the implementation gap identified in Phase 2. It includes the following activities:

Determine range of corrective actions—Based on results and causation factors, identify potential corrective actions for each performance issue. These may include changing information system configurations; training information security staff, information system administrator staff, or regular users; purchasing information security tools; changing information system architecture; establishing new processes and procedures; and updating information security policies.

Prioritize corrective actions based on overall risk mitigation goals—Several corrective actions may apply to a single performance issue; however, some may be inappropriate if they are too costly or are inconsistent with the magnitude of the problem. Applicable corrective actions should be prioritized for each performance issue in ascending order of cost and descending order of impact (a simplified example follows this list). The risk management process described in NIST SP 800-30, Risk Management Guide for Information Technology Systems, or the corrective action prioritization process described in NIST SP 800-65, Integrating IT Security into the Capital Planning and Investment Control Process, should be used to prioritize corrective actions. If weights were assigned to measures in the Prepare for Data Collection phase, they should be used to prioritize corrective actions. Alternatively, priorities may be assigned in the Identify Corrective Actions phase based on the criticality of implementing specific corrective actions, cost of the actions, and the magnitude of their impact on the organization's information security posture. Corrective actions should be documented in the POA&M for the corresponding information system or organization and tracked as a part of the continuous monitoring process.

Select most appropriate corrective actions—Viable corrective actions from the top of the prioritized list should be selected for use in a full cost-benefit analysis.
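A simplified Python sketch of that ordering follows; the corrective actions, cost figures, and impact ratings are hypothetical, and an organization would substitute weights carried over from the Prepare for Data Collection phase or its own prioritization criteria.

    # Illustrative sketch only: order candidate corrective actions for a single
    # performance issue by impact (descending) and cost (ascending).
    corrective_actions = [
        {"action": "Update password policy and user guidance", "cost": 5000,  "impact": 2},
        {"action": "Deploy centralized patch management",      "cost": 80000, "impact": 5},
        {"action": "Targeted training for administrators",     "cost": 15000, "impact": 4},
    ]

    # Impact is used as the primary key here; treating cost as primary is an
    # equally defensible reading of "ascending cost and descending impact."
    prioritized = sorted(corrective_actions, key=lambda a: (-a["impact"], a["cost"]))

    for rank, action in enumerate(prioritized, start=1):
        print(f"{rank}. {action['action']} (impact {action['impact']}, cost ${action['cost']:,})")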
6.4 Develop Business Case and Obtain Resources

Phase 4 of the information security measurement program implementation process, Develop Business Case, and Phase 5, Obtain Resources, address the budgeting cycle for acquiring resources needed to implement remediation actions identified in Phase 3. The steps involved in developing a business case are based on industry practices and mandatory guidelines, including OMB Circular A-11, the Clinger-Cohen Act, and GPRA. Results of the prior three phases will be included in the business case as supporting evidence.
The following activities are generally performed as a part of business case analysis. They are
pursued within the bounds of agency-specific processes to obtain the resources needed to
implement corrective actions, and include:

• Document mission and objectives (identified during Phase 2 of the measures development process);
• Determine the cost and risks of maintaining status quo to use as a baseline for comparing investment alternatives;
• Document the information security performance gaps between target performance and current performance, as evidenced by the current measures collected during Phase 2 of the information security measurement program implementation process;
• Estimate the life cycle costs of each corrective action or investment alternative, as identified in Phase 3 of the information security measurement program implementation process;
• Perform sensitivity analysis to determine which variables have the greatest effect on the cost;11
• Characterize benefits that are quantifiable and non-quantifiable returns delivered through improved performance, based on the prioritization of corrective actions performed in Phase 3 of the information security measurement program implementation process;
• Perform risk analysis to assess the likelihood of obstacles and programmatic risks inherent to a particular alternative; and
• Prepare budget submission by summarizing key aspects of the business case to accurately illustrate its merits.12

Each agency should follow its specific business case guidelines during this phase of the process.
Agencies typically have unique business case processes and life cycle spending thresholds that
determine which investments and budget requests require a formal business case. In general, the
level of effort to develop the business case should correspond with the size and scope of the
funding request. For example, the business case to build and maintain a disaster recovery site
would be more thorough than a business case to establish an account review process.

Regardless of the scope and complexity of the business case, its underlying components and
analysis enable easier completion of internal and external budget requests. A thorough
examination of the business case will support and facilitate the Obtain Resources phase, which
involves the following activities:

• Respond to budget evaluation inquiries;
• Receive allocated budget;
• Prioritize available resources (if all requested resources are not allocated); and
• Assign resources to perform corrective actions.

11 If a small change in the value of a variable causes a large change in the calculation result, the result is said to be sensitive to that parameter or assumption.
12 See NIST SP 800-65, Integrating IT Security Into the Capital Planning and Investment Control Process, for more information on how to prepare appropriate budget request information for corrective actions.
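As a rough illustration only, the following Python sketch shows a one-at-a-time sensitivity check of the kind described in the business case activities above (see footnote 11): each input to a life cycle cost estimate is varied and the resulting swing in total cost is recorded. The cost model, variable names, and figures are invented for illustration and are not prescribed by this guide or by OMB Circular A-11.

```python
# Illustrative sketch only: a simple one-at-a-time sensitivity check on a
# hypothetical life cycle cost model for a corrective action. The model and
# all values are invented for illustration.
def life_cycle_cost(licenses, admin_hours, hourly_rate, annual_maintenance, years=5):
    """Total estimated cost over the analysis period for one investment alternative."""
    return licenses + admin_hours * hourly_rate + annual_maintenance * years

baseline = {"licenses": 120_000, "admin_hours": 800, "hourly_rate": 95,
            "annual_maintenance": 30_000}
base_cost = life_cycle_cost(**baseline)

# Vary each input by +/-20 percent and record the swing in total cost; the
# variables with the largest swing are the assumptions the business case
# should treat as sensitive.
for name in baseline:
    swings = []
    for factor in (0.8, 1.2):
        varied = dict(baseline, **{name: baseline[name] * factor})
        swings.append(life_cycle_cost(**varied) - base_cost)
    print(f"{name}: cost swing {min(swings):+,.0f} to {max(swings):+,.0f}")
```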
6.5 Apply Corrective Actions

Phase 6 of the information security measurement program implementation process, Apply
Corrective Actions, involves implementing corrective actions in the security program, or in the
technical, management, and operational areas of security controls. The POA&M process is used
to document and monitor the corrective action status.

Iterative data collection, analysis, and reporting will track the progress of corrective actions,
measure improvement, and identify areas where further improvement is needed. The nature of
the cycle monitors progress and ensures that corrective actions are influencing information
system security control implementation in the intended way. Frequent performance
measurements will flag actions that are not implemented as planned or do not have the desired
effect, enabling quick course corrections within the organization to avoid problems that could be
uncovered during external audits, C&A efforts, or related activities.
Appendix A: CANDIDATE MEASURES
Devoting sufficient time to establishing information security performance measures is critical
to deriving the maximum value from measuring information security performance.

This section offers a sampling of program-level and system-level measures. The sample
measures include information security programmatic measures, and measures that align with the
minimum security requirements in Federal Information Processing Standard (FIPS) 200,
Minimum Security Requirements for Federal Information and Information Systems, which
correspond to the 17 security control families in NIST SP 800-53. They are not intended for
adoption as a complete set, but are provided as examples that organizations can tailor and adapt
to measure the performance of their information security programs. Examples of tailoring
include specific time frames, implementation evidence, data sources, formulas, reporting
formats, frequency, responsible parties, or adding further fields to the template.

These candidate measures offer examples of specific security controls implemented at the
program level or at the system level and include all measure types—implementation,
effectiveness/efficiency, and impact. It should be noted that these measures do not completely
address the minimum security requirements from FIPS 200, but will address one or more
important aspects of the requirements. Organizations should look into developing additional
measures to complement or replace those provided in this section if the samples are not
appropriate for their needs.
Measure 1: Security Budget (program-level)

Measure ID: Security Budget Measure 1
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: Provide resources necessary to properly secure agency information and information systems.
Measure: Percentage (%) of the agency’s information system budget devoted to information security
NIST SP 800-53 Controls: SA-2: Allocation of Resources
Measure Type: Impact
Formula: (Information security budget/total agency information technology budget) *100
Target: This should be an organizationally defined percentage.
Implementation Evidence:
  1. What is the total information security budget across all agency systems (SA-2)? _____
  2. What is the total information technology budget across all agency systems (SA-2)? _____
Frequency:
  Collection Frequency: Organization-defined (example: annually)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Chief Information Officer (CIO), Chief Financial Officer (CFO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
  Information Collector: System Administrator or Information System Security Officer (ISSO), budget personnel
  Information Customer: Chief Information Officer (CIO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO]), external audiences (e.g., Office of Management and Budget)
Data Source: Exhibit 300s, Exhibit 53s, agency budget documentation
Reporting Format: Pie chart illustrating the total agency information technology budget and the portion of that budget devoted to information security
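The formula for Measure 1 is a single ratio; as a rough illustration only, the following Python sketch computes it from the two implementation evidence answers. The dollar amounts are hypothetical and are not drawn from this guide.

```python
# Illustrative sketch only: computing Measure 1 (Security Budget) from the two
# implementation evidence questions. The dollar figures are hypothetical.
information_security_budget = 4_200_000   # answer to question 1 (SA-2)
total_it_budget = 52_000_000              # answer to question 2 (SA-2)

security_budget_pct = (information_security_budget / total_it_budget) * 100
print(f"Measure 1 (Security Budget): {security_budget_pct:.1f}%")
```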
Measure 2: Vulnerability Management (program-level)

Measure ID: Vulnerability Measure 1
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: Ensure all vulnerabilities are identified and mitigated.
Measure: Percentage (%) of high13 vulnerabilities mitigated within organizationally defined time periods after discovery
NIST SP 800-53 Controls: RA-5: Vulnerability Scanning
Measure Type: Effectiveness/Efficiency
Formula: (Number of high vulnerabilities identified and mitigated within targeted time frame during the time period/number of high vulnerabilities identified within the time period) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. Number of high vulnerabilities identified across the enterprise during the time period (RA-5)? _____
  2. Number of high vulnerabilities mitigated across the enterprise during the time period (RA-5)? _____
Frequency:
  Collection Frequency: Organization-defined (example: quarterly)
  Reporting Frequency: Organization-defined (example: quarterly)
Responsible Parties:
  Information Owner: Chief Information Officer (CIO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO]), System Owner
  Information Collector: System Administrator or Information System Security Officer (ISSO)
  Information Customer: Chief Information Officer (CIO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Vulnerability scanning software, audit logs, vulnerability management systems, patch management systems, change management records
Reporting Format: Stacked bar chart illustrating the percentage of high vulnerabilities closed within targeted time frames after discovery over several reporting periods

13 The National Vulnerability Database (NVD) provides severity rankings of “Low,” “Medium,” and “High” for all Common Vulnerabilities and Exposures (CVE) in the database. The NVD is accessible at http://nvd.nist.gov.
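As a rough illustration only, the following Python sketch computes Measure 2 from per-vulnerability records. The record layout and the 30-day target are hypothetical examples of an "organizationally defined time period"; the computation mirrors the formula in the table above.

```python
# Illustrative sketch only: computing Measure 2 (Vulnerability Management).
# The records and the 30-day mitigation window are invented sample data.
from datetime import date

target_days = 30  # organizationally defined mitigation window for high vulnerabilities

# Each record: (date discovered, date mitigated or None) for a high-severity vulnerability.
high_vulnerabilities = [
    (date(2008, 4, 2), date(2008, 4, 20)),
    (date(2008, 4, 11), date(2008, 6, 1)),
    (date(2008, 5, 3), None),              # still open at the end of the period
]

identified = len(high_vulnerabilities)
mitigated_on_time = sum(
    1 for discovered, mitigated in high_vulnerabilities
    if mitigated is not None and (mitigated - discovered).days <= target_days
)

print(f"Measure 2: {mitigated_on_time / identified * 100:.1f}% "
      f"of high vulnerabilities mitigated within {target_days} days")
```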
Measure 3: Access Control (AC) (system-level)

Measure ID: Remote Access Control Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: Restrict information, system, and component access to individuals or machines that are identifiable, known, credible, and authorized.
Measure: Percentage (%) of remote access points used to gain unauthorized access
NIST SP 800-53 Controls: AC-17: Remote Access
Measure Type: Effectiveness/Efficiency
Formula: (Number of remote access points used to gain unauthorized access/total number of remote access points) *100
Target: This should be a low percentage defined by the organization.
Implementation Evidence:
  1. Does the organization use automated tools to maintain an up-to-date network diagram that identifies all remote access points (CM-2)? Yes / No
  2. How many remote access points exist in the organization’s network? _____
  3. Does the organization employ Intrusion Detection Systems (IDS) to monitor traffic traversing remote access points (SI-4)? Yes / No
  4. Does the organization collect and review audit logs associated with all remote access points (AU-6)? Yes / No
  5. Does the organization maintain a security incident database that identifies standardized incident categories for each incident (IR-5)? Yes / No
  6. Based on reviews of the incident database, IDS logs and alerts, and/or appropriate remote access point log files, how many access points have been used to gain unauthorized access within the reporting period? ______
Frequency:
  Collection Frequency: Organization-defined (example: monthly)
  Reporting Frequency: Organization-defined (example: quarterly)
Responsible Parties:
  Information Owner: Computer Security Incident Response Team (CSIRT)
  Information Collector: System Administrator or Information System Security Officer (ISSO)
  Information Customer: Chief Information Officer (CIO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Incident database, audit logs, network diagrams, IDS logs and alerts
Reporting Format: Stacked bar chart, by month, which illustrates the percentage of remote access points used for unauthorized access versus the total number of remote access points
Measure 4: Awareness and Training (AT) (program-level)

Measure ID: Security Training Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Ensure a high-quality work force supported by modern and secure infrastructure and operational capabilities.
  Information Security Goal: Ensure that organization personnel are adequately trained to carry out their assigned information security-related duties and responsibilities.
Measure: Percentage (%) of information system security personnel that have received security training
NIST SP 800-53 Controls: AT-3: Security Training
Measure Type: Implementation
Formula: (Number of information system security personnel that have completed security training within the past year/total number of information system security personnel) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. Are significant security responsibilities defined with qualifications criteria and documented in policy (AT-1 and PS-2)? Yes / No
  2. Are records kept of which employees have significant security responsibilities (AT-3)? Yes / No
  3. How many employees in your agency (or agency component, as applicable) have significant security responsibilities (AT-3)? _____
  4. Are training records maintained (AT-4)? (Training records indicate the training that specific employees have received.) Yes / No
  5. How many of those with significant security responsibilities have received the required training (AT-4)? _____
  6. If all personnel have not received training, state all reasons that apply (AT-4):
     Insufficient funding
     Insufficient time
     Courses unavailable
     Employee has not registered
     Other (specify) ______________
Frequency:
  Collection Frequency: Organization-defined (example: quarterly)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Organization-defined (example: Training Manager)
  Information Collector: Organization-defined (example: Information System Security Officer [ISSO], Training Manager)
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Training and awareness tracking records
Reporting Format: Pie chart illustrating the percentage of security personnel that have received training versus those who have not received training. If performance is below target, pie chart illustrating causes of performance falling short of targets
Measure 5: Audit and Accountability (AU) (system-level)

Measure ID: Audit Record Review Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: Create, protect, and retain information system audit records to the extent needed to enable the monitoring, analysis, investigation, and reporting of unlawful, unauthorized, or inappropriate activity.
Measure: Average frequency of audit records review and analysis for inappropriate activity
NIST SP 800-53 Controls: AU-6: Audit Monitoring, Analysis, and Reporting
Measure Type: Effectiveness/Efficiency
Formula: Average frequency during reporting period
Target: This should be a high frequency defined by the organization.
Implementation Evidence:
  For each system:
  1. Is logging activated on the system (AU-2)? Yes / No
  2. Does the organization have clearly defined criteria for what constitutes evidence of “inappropriate” activity within system audit logs? Yes / No
  3. For the reporting period, how many system audit logs have been reviewed within the following time frames for inappropriate activity (choose the nearest time period for each system) (AU-3 and AU-6):
     Within the past day _____
     Within the past week _____
     2 weeks to 1 month _____
     1 month to 6 months _____
     Over 6 months _____
Frequency:
  Collection Frequency: Organization-defined (example: daily)
  Reporting Frequency: Organization-defined (example: quarterly)
Responsible Parties:
  Information Owner: Organization-defined (example: System Owner)
  Information Collector: Organization-defined (example: System Administrator)
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Audit log reports
Reporting Format: Bar chart showing the number of systems with average audit log reviews in each of the five categories within the Implementation Evidence field
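As a rough illustration only, the following Python sketch bins systems into the five audit log review recency categories listed under Implementation Evidence for Measure 5. The per-system "days since last review" values are invented sample data, and the exact boundary handling between categories is a local choice.

```python
# Illustrative sketch only: binning systems into the five review recency
# categories from Measure 5. System names and values are invented sample data.
days_since_last_review = {"System A": 1, "System B": 6, "System C": 25,
                          "System D": 90, "System E": 400}

bins = {"Within the past day": 0, "Within the past week": 0,
        "2 weeks to 1 month": 0, "1 month to 6 months": 0, "Over 6 months": 0}

for system, days in days_since_last_review.items():
    if days <= 1:
        bins["Within the past day"] += 1
    elif days <= 7:
        bins["Within the past week"] += 1
    elif days <= 31:
        bins["2 weeks to 1 month"] += 1
    elif days <= 183:
        bins["1 month to 6 months"] += 1
    else:
        bins["Over 6 months"] += 1

for category, count in bins.items():
    print(f"{category}: {count} system(s)")
```

The resulting counts correspond directly to the bar chart described in the Reporting Format field.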
Measure 6: Certification, Accreditation, and Security Assessments (CA) (program-level)

Measure ID: C&A Completion Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: Ensure all information systems have been certified and accredited as required.
Measure: Percentage (%) of new systems that have completed certification and accreditation (C&A) prior to their implementation
NIST SP 800-53 Control: CA-6: Security Accreditation
Measure Type: Effectiveness/Efficiency
Formula: (Number of new systems with complete C&A packages with Authorizing Official [AO] approval prior to implementation)/(total number of new systems) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. Does your agency (or agency component, if applicable) maintain a complete and up-to-date system inventory? Yes / No
  2. Is there a formal C&A process within your agency (CA-1)? Yes / No
  3. If the answer to Question 2 is yes, are system development projects required to complete C&A prior to implementation (CA-1)? Yes / No
  4. How many new systems have been implemented during the reporting period? _____
  5. How many systems indicated in Question 4 have received an authority to operate prior to implementation (CA-6)? _____
Frequency:
  Collection Frequency: Organization-defined (example: quarterly)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Organization-defined (example: Authorizing Official [AO])
  Information Collector: Organization-defined (example: System Owners)
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: System inventory, system C&A documentation
Reporting Format: Pie chart comparing the percentage of new systems with AO-approved C&A packages versus new systems without AO-approved C&A packages
Measure 7: Configuration Management (CM) (program-level)

Measure ID: Configuration Changes Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Accelerate the development and use of an electronic information infrastructure.
  Information Security Goal: Establish and maintain baseline configurations and inventories of organizational information systems (including hardware, software, firmware, and documentation) throughout the respective system development life cycles.
Measure: Percentage (%) of approved and implemented configuration changes identified in the latest automated baseline configuration
NIST SP 800-53 Controls: CM-2: Baseline Configuration and CM-3: Configuration Change Control
Measure Type: Implementation
Formula: (Number of approved and implemented configuration changes identified in the latest automated baseline configuration/total number of configuration changes identified through automated scans) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. Does the organization manage configuration changes to information systems using an organizationally approved process (CM-3)? Yes / No
  2. Does the organization use automated scanning to identify configuration changes that were implemented on its systems and networks (CM-2, Enhancement 2)? Yes / No
  3. If yes, how many configuration changes were identified through automated scanning over the last reporting period (CM-3)? _____
  4. How many change control requests were approved and implemented over the last reporting period (CM-3)? _____
Frequency:
  Collection Frequency: Organization-defined (example: quarterly)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Organization-defined (example: Configuration Manager)
  Information Collector: Organization-defined (example: Information System Security Officer (ISSO), System Owner, System Administrator)
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO]), Authorizing Official [AO], Configuration Control Board
Data Source: System security plans, configuration management database, security tool logs
Reporting Format: Pie chart comparing the percentage of approved and implemented changes documented in the latest baseline configuration versus the percentage of changes not documented in the latest baseline configuration
Measure 8: Contingency Planning (CP) (program-level)

Measure ID: Contingency Plan Testing Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: Establish, maintain, and effectively implement plans for emergency response, backup operations, and post-disaster recovery for organizational information systems to ensure the availability of critical information resources and continuity of operations in emergency situations.
Measure: Percentage (%) of information systems that have conducted annual contingency plan testing
NIST SP 800-53 Controls: CP-4: Contingency Plan Testing and Exercises
Measure Type: Effectiveness/Efficiency
Formula: (Number of information systems that have conducted annual contingency plan testing/number of information systems in the system inventory) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. How many information systems are in the system inventory? _____
  2. How many information systems have an approved contingency plan (CP-2)? _____
  3. How many contingency plans were successfully tested within the past year (CP-4)? _____
Frequency:
  Collection Frequency: Organization-defined (example: annually)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Organization-defined (example: Contingency Plan Manager)
  Information Collector: Organization-defined (example: System Owner, System Administrator)
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Contingency Plan testing results
Reporting Format: Pie chart comparing the percentage of systems that conducted annual contingency plan testing versus the percentage of systems that have not conducted annual contingency plan testing
Measure 9: Identification and Authentication (IA) (system-level)

Measure ID: User Accounts Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: All system users are identified and authenticated in accordance with information security policy.
Measure: Percentage (%) of users with access to shared accounts
NIST SP 800-53 Controls: AC-2: Account Management, AC-3: Access Enforcement, and IA-2: User Identification and Authentication
Measure Type: Effectiveness/Efficiency
Formula: (Number of users with access to shared accounts/total number of users) *100
Target: This should be a low percentage defined by the organization.
Implementation Evidence:
  1. How many users have access to the system (IA-2)? _____
  2. How many users have access to shared accounts (AC-2)? _____
Frequency:
  Collection Frequency: Organization-defined (example: monthly)
  Reporting Frequency: Organization-defined (example: monthly)
Responsible Parties:
  Information Owner: Organization-defined (example: System Owner, System Administrator)
  Information Collector: Organization-defined (example: System Administrator)
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Configuration Management Database, Access Control List, System-Produced User ID Lists
Reporting Format: Pie chart comparing the percentage of users with access to shared accounts versus the percentage of users without access to shared accounts
Measure 10: Incident Response (IR) (program-level and system-level)

Measure ID: Incident Response Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Make accurate, timely information on the organization’s programs and services readily available.
  Information Security Goal: Track, document, and report incidents to appropriate organizational officials and/or authorities.
Measure: Percentage (%) of incidents reported within required time frame per applicable incident category (the measure will be computed for each incident category described in Implementation Evidence)
NIST SP 800-53 Controls: IR-6: Incident Reporting
Measure Type: Effectiveness/Efficiency
Formula: For each incident category (number of incidents reported on time/total number of reported incidents) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. How many incidents were reported during the period (IR-6)?
     Category 1 – Unauthorized Access? _____
     Category 2 – Denial of Service? _____
     Category 3 – Malicious Code? _____
     Category 4 – Improper Usage? _____
     Category 5 – Scans/Probes/Attempted Access? _____
     Category 6 – Investigation? _____
  2. How many incidents involving personally identifiable information (PII) were reported during the period (IR-6)? _____
  3. Of the incidents reported, how many were reported within the prescribed time frame for their category, according to the time frames established by US-CERT (IR-6)?
     Category 1 – Unauthorized Access? _____
     Category 2 – Denial of Service? _____
     Category 3 – Malicious Code? _____
     Category 4 – Improper Usage? _____
     Category 5 – Scans/Probes/Attempted Access? _____
     Category 6 – Investigation? _____
  4. Of the PII incidents reported, how many were reported within the prescribed time frame for their category, according to the time frames established by US-CERT and/or OMB Memorandum(s) (IR-6)? _____
Frequency:
  Collection Frequency: Organization-defined (example: monthly)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Organization-defined (example: Computer Security Incident Response Team [CSIRT])
  Information Collector: Organization-defined (example: System Owner, Information System Security Officer [ISSO], CSIRT)
  Information Customer: Chief Information Officer (CIO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Incident logs, incident tracking database (if available)
Reporting Format:
  For one-time snapshot: stacked bar chart illustrating the proportion of reported incidents per category that were reported on time
  For trends: line chart where each line represents an individual category plus a line representing 100 percent
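Because Measure 10 is computed separately for each US-CERT incident category, as a rough illustration only the following Python sketch groups incident records by category and reports the on-time percentage for each. The incident tuples are invented sample data and are not drawn from this guide.

```python
# Illustrative sketch only: computing Measure 10 per incident category.
# Each tuple is (category, reported_on_time); the data are invented.
from collections import defaultdict

incidents = [
    ("Category 1 - Unauthorized Access", True),
    ("Category 1 - Unauthorized Access", False),
    ("Category 3 - Malicious Code", True),
    ("Category 4 - Improper Usage", True),
    ("Category 4 - Improper Usage", True),
]

totals = defaultdict(int)
on_time = defaultdict(int)
for category, reported_on_time in incidents:
    totals[category] += 1
    if reported_on_time:
        on_time[category] += 1

for category in sorted(totals):
    pct = on_time[category] / totals[category] * 100
    print(f"{category}: {pct:.0f}% reported on time ({on_time[category]}/{totals[category]})")
```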
Measure 11: Maintenance (MA) (system-level)

Measure ID: Maintenance Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Accelerate the development and use of an electronic information infrastructure.
  Information Security Goal: Perform periodic and timely maintenance on organizational information systems and provide effective controls on the tools, techniques, mechanisms, and personnel used to conduct information system maintenance.
Measure: Percentage (%) of system components that undergo maintenance in accordance with formal maintenance schedules
NIST SP 800-53 Controls: MA-2: Controlled Maintenance and MA-6: Timely Maintenance
Measure Type: Effectiveness/Efficiency
Formula: (Number of system components that undergo maintenance according to formal maintenance schedules/total number of system components) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. Does the system have a formal maintenance schedule (MA-2)? Yes / No
  2. How many components are contained within the system (CM-8)? _____
  3. How many components underwent maintenance in accordance with the formal maintenance schedule (MA-6)? _____
Frequency:
  Collection Frequency: Organization-defined (example: quarterly)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Organization-defined (example: System Owner)
  Information Collector: Organization-defined (example: System Administrator)
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Maintenance schedule, maintenance logs
Reporting Format: Pie chart comparing the percentage of system components receiving maintenance in accordance with the formal maintenance schedule versus the percentage of system components not receiving maintenance in accordance with the formal maintenance schedule over the specified period
Measure 12: Media Protection (MP) (program-level and system-level)

Measure ID: Media Sanitization Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: Sanitize or destroy information system media before disposal or release for reuse.
Measure: Percentage (%) of media that passes sanitization procedures testing for FIPS 199 high-impact systems
NIST SP 800-53 Controls: MP-6: Media Sanitization and Disposal
Measure Type: Effectiveness/Efficiency
Formula: (Number of media that passes sanitization procedures testing/total number of media tested) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. Is there a policy for sanitizing media before it is discarded or reused (MP-1)? Yes / No
  2. Does the organization test media sanitization procedures for FIPS 199 high-impact systems (MP-6, Enhancement 2)? Yes / No
  3. Number of media that successfully passed sanitization testing for FIPS 199 high-impact systems (MP-6, Enhancement 2)? _____
  4. Total number of media tested for FIPS 199 high-impact systems (MP-6, Enhancement 2)? _____
Frequency:
  Collection Frequency: Organization-defined (example: quarterly)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Organization-defined (example: Facility Security Officer)
  Information Collector: Organization-defined (example: System Owner, Information System Security Officer [ISSO])
  Information Customer: Chief Information Officer (CIO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Sanitization testing results
Reporting Format: Pie chart comparing the percentage of media passing sanitization procedures testing versus the percentage of media not passing sanitization procedures testing over the specified period
Measure 13: Physical and Environmental (PE) (program-level)

Measure ID: Physical Security Incidents Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: Integrate physical and information security protection mechanisms to ensure appropriate protection of the organization’s information resources.
Measure: Percentage (%) of physical security incidents allowing unauthorized entry into facilities containing information systems
NIST SP 800-53 Control: PE-6: Monitoring Physical Access
Measure Type: Effectiveness/Efficiency
Formula: (Number of physical security incidents allowing unauthorized entry into facilities containing information systems/total number of physical security incidents) *100
Target: This should be a low percentage defined by the organization.
Implementation Evidence:
  1. How many physical security incidents occurred during the specified period (PE-6)? _____
  2. How many of the physical security incidents allowed unauthorized entry into facilities containing information systems (PE-6)? _____
Frequency:
  Collection Frequency: Organization-defined (example: quarterly)
  Reporting Frequency: Organization-defined (example: quarterly)
Responsible Parties:
  Information Owner: Organization-defined (example: Physical Security Officer)
  Information Collector: Organization-defined (example: Computer Security Incident Response Team [CSIRT])
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Physical security incident reports, physical access control logs
Reporting Format: Pie chart comparing the physical security incidents allowing unauthorized entry into facilities containing information systems versus the total number of physical security incidents
Measure 14: Planning (PL) (program-level and system-level)

Measure ID: Planning Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: Develop, document, periodically update, and implement security plans for organizational information systems that describe the security controls in place or planned for information systems, and the rules of behavior for individuals accessing these systems.
Measure: Percentage of employees who are authorized access to information systems only after they sign an acknowledgement that they have read and understood rules of behavior
NIST SP 800-53 Controls: PL-4: Rules of Behavior and AC-2: Account Management
Measure Type: Implementation
Formula: (Number of users who are granted system access after signing rules of behavior/total number of users with system access) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. How many users access the system (AC-2)? _____
  2. How many users signed rules of behavior acknowledgements (PL-4)? _____
  3. How many users have been granted access to the information system only after signing rules of behavior acknowledgements? _____
Frequency:
  Collection Frequency: Organization-defined (example: quarterly)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Organization-defined (example: System Owner, Information System Security Officer [ISSO])
  Information Collector: Organization-defined (example: System Administrator, System Owner)
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Repositories containing rules of behavior records
Reporting Format: Pie chart comparing the percentage of users who have signed rules of behavior acknowledgement forms prior to being granted information system access to those users who have accessed the system without signed rules of behavior acknowledgement forms
Measure 15: Personnel Security (PS) (program-level and system-level)

Measure ID: Personnel Security Screening Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: Ensure that individuals occupying positions of responsibility within organizations are trustworthy and meet established security criteria for those positions.
Measure: Percentage (%) of individuals screened before being granted access to organizational information and information systems
NIST SP 800-53 Controls: AC-2: Account Management and PS-3: Personnel Screening
Measure Type: Implementation
Formula: (Number of individuals screened/total number of individuals with access) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. How many individuals have been granted access to organizational information and information systems (AC-2)? _____
  2. What is the number of individuals who have completed personnel screening (PS-3)? _____
Frequency:
  Collection Frequency: Organization-defined (example: quarterly)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Organization-defined (example: Human Resources)
  Information Collector: Organization-defined (example: System Administrators, System Owners, Information System Security Officer [ISSO])
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Clearance records, access control lists
Reporting Format: Pie chart comparing the percentage of individuals screened versus the total number of individuals
Measure 16: Risk Assessment (RA) (system-level)

Measure ID: Risk Assessment Vulnerability Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Ensure an environment of comprehensive security and accountability for personnel, facilities, and products.
  Information Security Goal: Periodically assess the risk to organizational operations (including mission, functions, image, or reputation), organizational assets, and individuals resulting from the operation of organizational information systems.
Measure: Percentage (%) of vulnerabilities remediated within organization-specified time frames
NIST SP 800-53 Controls: RA-5: Vulnerability Scanning and CA-5: Plan of Actions and Milestones
Measure Type: Effectiveness/Efficiency
Formula: (Number of vulnerabilities remediated according to POA&M schedule/total number of POA&M-documented vulnerabilities identified through vulnerability scans) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. Does the organization conduct periodic vulnerability scans (RA-5)? Yes / No
  2. What is the periodicity of vulnerability scans (RA-5)?
     Weekly
     Monthly
     Quarterly
     Other ____________
  3. Does the organization’s POA&M process require vulnerabilities identified through vulnerability scanning to be documented in appropriate system POA&Ms (CA-5)? Yes / No
  4. How many vulnerabilities were identified through vulnerability scanning and entered into applicable POA&Ms (CA-5)? _____
  5. How many of the vulnerabilities from Question 4 were remediated on schedule according to their POA&Ms (CA-5)? _____
Frequency:
  Collection Frequency: Organization-defined (example: monthly)
  Reporting Frequency: Organization-defined (example: monthly)
Responsible Parties:
  Information Owner: Organization-defined (example: System Owners, Information System Security Officer [ISSO])
  Information Collector: Organization-defined (example: System Administrators, System Owners, Information System Security Officer [ISSO])
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: POA&Ms, vulnerability scanning reports
Reporting Format: Pie chart comparing the percentage of vulnerabilities remediated on schedule versus the percentage of vulnerabilities not remediated on schedule
Measure 17: System and Services Acquisition (SA) (program-level and system-level)

Measure ID: Service Acquisition Contract Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Accelerate the development and use of an electronic information infrastructure.
  Information Security Goal: Ensure third-party providers employ adequate security measures to protect information, applications, and/or services outsourced from the organization.
Measure: Percentage (%) of system and service acquisition contracts that include security requirements and/or specifications
NIST SP 800-53 Control: SA-4: Acquisitions
Measure Type: Implementation
Formula: (Number of system and service acquisition contracts that include security requirements and specifications/total number of system and service acquisition contracts) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. How many active service acquisition contracts does the organization have? _____
  2. How many active service acquisition contracts include security requirements and specifications (SA-4)? _____
Frequency:
  Collection Frequency: Organization-defined (example: quarterly)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Organization-defined (example: Contracting Officer)
  Information Collector: Organization-defined (example: Contracting Officer’s Technical Representative, System Owner)
  Information Customer: Contracting Officer’s Technical Representative, System Owner, Procurement Officer, Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Service acquisition contracts
Reporting Format: Pie chart comparing the percentage of system and service acquisition contracts that include security requirements and/or specifications versus the percentage of system and service acquisition contracts that do not include security requirements and/or specifications
Measure 18: System and Communications Protection (SC) (program-level)

Measure ID: System and Communication Protection Measure 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Accelerate the development and use of an electronic information infrastructure.
  Information Security Goal: Allocate sufficient resources to adequately protect electronic information infrastructure.
Measure: Percentage of mobile computers and devices that perform all cryptographic operations using FIPS 140-2 validated cryptographic modules operating in approved modes of operation
NIST SP 800-53 Control: SC-13: Use of Validated Cryptography
Measure Type: Implementation
Formula: (Number of mobile computers and devices that perform all cryptographic operations using FIPS 140-2 validated cryptographic modules operating in approved modes of operation/total number of mobile computers and devices) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. How many mobile computers and devices are used in the organization (CM-8)? _____
  2. How many mobile computers and devices employ cryptography (CM-8)? _____
     a. How many mobile computers and devices employ FIPS 140-2 validated encryption modules (SC-13)? _____
     b. How many of those mobile computers and devices perform all cryptographic operations using FIPS 140-2 validated cryptographic modules operating in approved modes of operation (SC-13)? _____
  3. How many mobile computers and devices have cryptography implementation waivers (CM-8)? _____
Frequency:
  Collection Frequency: Organization-defined (example: quarterly)
  Reporting Frequency: Organization-defined (example: annually)
Responsible Parties:
  Information Owner: Organization-defined (example: System Owners, Information System Security Officer [ISSO])
  Information Collector: Organization-defined (example: System Administrators, System Owners, Information System Security Officer [ISSO])
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: System security plans
Reporting Format: Pie chart illustrating the number of mobile computers and devices that perform all cryptographic operations (including key generation) using FIPS 140-2 validated cryptographic modules operating in approved modes of operation as a percentage of the total number of mobile computers and devices
Measure 19: System and Information Integrity (SI) (program-level and system-level)

Measure ID: System and Information Integrity 1 (or a unique identifier to be filled out by the organization)
Goal:
  Strategic Goal: Accelerate the development and use of an electronic information infrastructure.
  Information Security Goal: Provide protection from malicious code at appropriate locations within organizational information systems, monitor information systems security alerts and advisories, and take appropriate actions in response.
Measure: Percentage (%) of operating system vulnerabilities for which patches have been applied or that have been otherwise mitigated
NIST SP 800-53 Controls: SI-2: Flaw Remediation
Measure Type: Implementation and Effectiveness/Efficiency
Formula: (Number of vulnerabilities addressed in distributed alerts and advisories for which patches have been implemented, determined as non-applicable, or granted a waiver/total number of applicable vulnerabilities identified through alerts and advisories and through vulnerability scans) *100
Target: This should be a high percentage defined by the organization.
Implementation Evidence:
  1. Does the organization distribute alerts and advisories (SI-5)? Yes / No
  2. How many vulnerabilities were identified by analyzing distributed alerts and advisories (SI-5)? _____
  3. How many vulnerabilities were identified through vulnerability scans (RA-5)? _____
  4. How many patches or work-arounds were implemented to address identified vulnerabilities (SI-2)? _____
  5. How many vulnerabilities were determined to be non-applicable (SI-2)? _____
  6. How many waivers have been granted for weaknesses that could not be remediated by implementing patches or work-arounds? _____
Frequency:
  Collection Frequency: Organization-defined (example: weekly)
  Reporting Frequency: Organization-defined (example: monthly)
Responsible Parties:
  Information Owner: Organization-defined (example: Computer Security Incident Response Team [CSIRT])
  Information Collector: Organization-defined (example: Information System Security Officer [ISSO], System Owners)
  Information Customer: Chief Information Officer (CIO), Information System Security Officer (ISSO), Senior Agency Information Security Officer (SAISO) (e.g., Chief Information Security Officer [CISO])
Data Source: Vulnerability scans, POA&Ms, repositories of alerts and advisories, risk assessments
Reporting Format: Stacked bar chart with total number of applicable vulnerabilities composed of percentages of number of vulnerabilities addressed in distributed alerts and advisories for which patches have been determined as non-applicable, have been implemented, have had a waiver granted, or other
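As a rough illustration only, the following Python sketch computes Measure 19 from the six implementation evidence answers, counting patched, non-applicable, and waived vulnerabilities as addressed, in line with the formula in the table above. The counts are invented sample values, not data from this guide.

```python
# Illustrative sketch only: computing Measure 19 (flaw remediation) from the
# implementation evidence answers. All counts are invented sample values.
vulns_from_alerts_and_advisories = 40   # question 2 (SI-5)
vulns_from_scans = 65                   # question 3 (RA-5)
patched_or_worked_around = 70           # question 4 (SI-2)
determined_non_applicable = 12          # question 5 (SI-2)
waivers_granted = 5                     # question 6

applicable_total = vulns_from_alerts_and_advisories + vulns_from_scans
addressed = patched_or_worked_around + determined_non_applicable + waivers_granted

print(f"Measure 19: {addressed / applicable_total * 100:.1f}% of identified "
      f"vulnerabilities patched, determined non-applicable, or waived")
```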
Appendix B: ACRONYMS
AC  Access Control
AO  Authorizing Official
AT  Awareness and Training
AU  Audit and Accountability
C&A  Certification and Accreditation
CFO  Chief Financial Officer
CIO  Chief Information Officer
CISO  Chief Information Security Officer
CM  Configuration Management
COTS  Commercial Off-The-Shelf
CP  Contingency Planning
CPIC  Capital Planning and Investment Control
CSIRT  Computer Security Incident Response Team
FEA  Federal Enterprise Architecture
FIPS  Federal Information Processing Standards
FISCAM  Federal Information System Controls Audit Manual
FISMA  Federal Information Security Management Act
FY  Fiscal Year
GAO  Government Accountability Office
GOTS  Government Off-The-Shelf
GPEA  Government Paperwork Elimination Act
GPRA  Government Performance and Results Act
ID  Identification
IG  Inspector General
IR  Incident Response
ISSEA  International Systems Security Engineering Association
ISSO  Information System Security Officer
ITL  Information Technology Laboratory
MP  Media Protection
NIST  National Institute of Standards and Technology
OMB  Office of Management and Budget
PE  Physical and Environmental
PL  Planning
POA&M  Plan of Action and Milestones
PRM  Performance Reference Model
PS  Personnel Security
RA  Risk Assessment
SA  System and Services Acquisition
SAISO  Senior Agency Information Security Officer
SC  System and Communications Protection
SDLC  System Development Life Cycle
SI  System and Information Integrity
SP  Special Publication
USC  United States Code
US-CERT  United States Computer Emergency Readiness Team
XML  Extensible Markup Language
Appendix C: REFERENCES
Bartol N., Givans N., Measuring the “Goodness” of Security, 2nd International Systems Security Engineering Association (ISSEA) Conference Proceedings, February 2001.

Bartol N., Information Security Performance Measurement: Live, 3rd ISSEA Conference Proceedings, March 2002.

Clinger-Cohen Act of 1996 (formerly known as the Information Technology Management Reform Act), February 10, 1996.

E-Government Act, Title III—Federal Information Security Management Act (P.L. 107-347), December 2002.

Federal Information Processing Standards (FIPS) 199, Standards for Security Categorization of Federal Information and Information Systems, February 2004.

Federal Information Processing Standards (FIPS) 200, Minimum Security Requirements for Federal Information and Information Systems, March 2006.

Floyd D. Spence National Defense Authorization Act for Fiscal Year 2001 (P.L. 106-398).

General Accounting Office, Federal Information System Controls Audit Manual (FISCAM), GAO/AIMD-12.19.6, January 1996.

Government Performance and Results Act of 1993 (P.L. 103-62).

National Institute of Standards and Technology Interagency Report 7298, Glossary of Key Information Security Terms, April 2006.

National Institute of Standards and Technology Special Publication 800-18, Guide for Developing Security Plans for Information Technology Systems, February 2006.

National Institute of Standards and Technology Special Publication 800-30, Risk Management Guide for Information Technology Systems, June 2001.

National Institute of Standards and Technology Special Publication 800-37, Guide for the Security Certification and Accreditation of Federal Information Systems, May 2004.

National Institute of Standards and Technology Special Publication 800-53, Recommended Security Controls for Federal Information Systems, December 2007.

National Institute of Standards and Technology Special Publication 800-53A, Guide for Assessing the Security Controls in Federal Information Systems, June 2008.

National Institute of Standards and Technology Special Publication 800-65, Integrating Security into the Capital Planning and Investment Control Process, January 2005.

National Institute of Standards and Technology Special Publication 800-100, Information Security Handbook: A Guide for Managers, October 2006.

Office of Management and Budget, “Security of Federal Automated Information Resources,” Appendix III to OMB Circular A-130, Management of Federal Information Resources, February 8, 1996.

Office of Management and Budget Circular A-11, Preparation, Submission, and Execution of the Budget, Part 6, Preparation and Submission of Strategic Plans, Annual Performance Plans, and Annual Program Performance Reports (updated annually).
Appendix D: SPECIFICATIONS FOR MINIMUM SECURITY REQUIREMENTS14
• Access Control (AC): Organizations must limit information system access to authorized users, processes acting on behalf of authorized users, or devices (including other information systems), and to the types of transactions and functions that authorized users are permitted to exercise.

• Awareness and Training (AT): Organizations must: (i) ensure that managers and users of organizational information systems are made aware of the information security risks associated with their activities and of the applicable laws, Executive orders, directives, policies, standards, instructions, regulations, or procedures related to the information security of organizational information systems; and (ii) ensure that organizational personnel are adequately trained to carry out their assigned information security-related duties and responsibilities.

• Audit and Accountability (AU): Organizations must: (i) create, protect, and retain information system audit records to the extent needed to enable the monitoring, analysis, investigation, and reporting of unlawful, unauthorized, or inappropriate information system activity; and (ii) ensure that the actions of individual information system users can be uniquely traced to those users so that they can be held accountable for their actions.

• Certification, Accreditation, and Security Assessments (CA): Organizations must: (i) periodically assess the security controls in organizational information systems to determine if the controls are effective in their application; (ii) develop and implement plans of action designed to correct deficiencies and reduce or eliminate vulnerabilities in organizational information systems; (iii) authorize the operation of organizational information systems and any associated information system connections; and (iv) monitor information system security controls on an ongoing basis to ensure the continued effectiveness of the controls.

• Configuration Management (CM): Organizations must: (i) establish and maintain baseline configurations and inventories of organizational information systems (including hardware, software, firmware, and documentation) throughout the respective system development life cycles; and (ii) establish and enforce information security configuration settings for information technology products employed in organizational information systems.

• Contingency Planning (CP): Organizations must establish, maintain, and effectively implement plans for emergency response, backup operations, and post-disaster recovery for organizational information systems to ensure the availability of critical information resources and continuity of operations in emergency situations.

• Identification and Authentication (IA): Organizations must identify information system users, processes acting on behalf of users, or devices and authenticate (or verify) the identities of those users, processes, or devices, as a prerequisite to allowing access to organizational information systems.

• Incident Response (IR): Organizations must: (i) establish an operational incident handling capability for organizational information systems that includes adequate preparation, detection, analysis, containment, recovery, and user response activities; and (ii) track, document, and report incidents to appropriate organizational officials and/or authorities.

• Maintenance (MA): Organizations must: (i) perform periodic and timely maintenance on organizational information systems; and (ii) provide effective controls on the tools, techniques, mechanisms, and personnel used to conduct information system maintenance.

• Media Protection (MP): Organizations must: (i) protect information system media, both paper and digital; (ii) limit access to information on information system media to authorized users; and (iii) sanitize or destroy information system media before disposal or release for reuse.

• Physical and Environmental Protection (PE): Organizations must: (i) limit physical access to information systems, equipment, and the respective operating environments to authorized individuals; (ii) protect the physical plant and support infrastructure for information systems; (iii) provide supporting utilities for information systems; (iv) protect information systems against environmental hazards; and (v) provide appropriate environmental controls in facilities containing information systems.

• Planning (PL): Organizations must develop, document, periodically update, and implement system security plans for organizational information systems that describe the security controls in place or planned for the information systems and the rules of behavior for individuals accessing the information systems.

• Personnel Security (PS): Organizations must: (i) ensure that individuals occupying positions of responsibility within organizations (including third-party service providers) are trustworthy and meet established information security criteria for those positions; (ii) ensure that organizational information and information systems are protected during personnel actions such as terminations and transfers; and (iii) employ formal sanctions for personnel failing to comply with organizational information security policies and procedures.

• Risk Assessment (RA): Organizations must periodically assess the risk to organizational operations (including mission, functions, image, or reputation), organizational assets, and individuals resulting from the operation of organizational information systems and the associated processing, storage, or transmission of organizational information.

• System and Services Acquisition (SA): Organizations must: (i) allocate sufficient resources to adequately protect organizational information systems; (ii) employ information system development life cycle processes that incorporate information security considerations; (iii) employ software usage and installation restrictions; and (iv) ensure that third-party providers employ adequate information security measures to protect information, applications, and/or services outsourced from the organization.

• System and Communications Protection (SC): Organizations must: (i) monitor, control, and protect organizational communications (i.e., information transmitted or received by organizational information systems) at the external boundaries and key internal boundaries of the information systems; and (ii) employ architectural designs, software development techniques, and information systems engineering principles that promote effective information security within organizational information systems.

• System and Information Integrity (SI): Organizations must: (i) identify, report, and correct information and information system flaws in a timely manner; (ii) provide protection from malicious code at appropriate locations within organizational information systems; and (iii) monitor information system security alerts and advisories and take appropriate actions in response.

14 FIPS 200, Minimum Security Requirements for Federal Information and Information Systems, March 2006.