Considering Operational Security Risks During System Development

Christopher Alberts
Audrey Dorofee
Carol Woody
Sponsored by the U.S. Department of Defense
© 2003 by Carnegie Mellon University
OCTAVE®
Operationally Critical Threat, Asset, and Vulnerability Evaluation(SM)

® OCTAVE is registered with the U.S. Patent and Trademark Office by Carnegie Mellon University.
SM Operationally Critical Threat, Asset, and Vulnerability Evaluation is a service mark of Carnegie Mellon University.
Operational Focus
[Figure: system lifecycle spanning pre-development activities (concept, RFP), development activities, and post-development activities, with stages including requirements, design, build, integration, testing, acceptance, and operation.]
OCTAVE was developed for managing security risks in
the operational environment.
A Complex Domain
Threats
• people inside your organization
• people outside your organization
• system problems
• other problems

Security Practices
• organizational practices
• technical practices

People Involved
• information technology (IT) staff
• general staff
• managers
• contractors
• service providers
• partners and collaborators
Organizational Gap
OCTAVE Approach - 1
OCTAVE Approach - 2
OCTAVE is at the center of a risk management approach
to information security.
What Is OCTAVE?
OCTAVE is a risk-based strategic assessment and
planning technique for security.
• It leverages people’s knowledge of their organization’s
security-related practices and processes to capture the
current state of security practice within the organization.
• Risks to the most critical assets are used to prioritize
areas of improvement and set the security strategy for
the organization.
OCTAVE Phases
OCTAVE is divided into the following three phases:
• Phase 1: Build Asset-Based Threat Profiles
• Phase 2: Identify Infrastructure Vulnerabilities
• Phase 3: Develop Security Strategy and Plans
OCTAVE Analysis Team
• An interdisciplinary team consisting of
- business or mission-related staff
- information technology staff
Key Questions
Can some operational risks be reduced by addressing
them during development?
Can the OCTAVE approach be expanded to address
operational risks during development?
The Value in Addressing Risks Earlier
• Lower cost to mitigate security risks
• More stable solutions can be implemented
• Solutions carry forward with future upgrades
• Builds on existing operational planning activities
• Builds a consistent vocabulary among customers, developers, users, and maintainers
Challenges in Addressing Risks Earlier
Only a subset of operational security risks can be
addressed early in the lifecycle.
Solutions implemented early in the lifecycle may affect
security-related activities during operation and
maintenance.
Addressing Security Issues Earlier in the Lifecycle
[Figure: the same system lifecycle as on the Operational Focus slide.]
The focus for managing some operational security risks
can be shifted earlier in the lifecycle.
Security Risk Management Activities
[Figure: security risk management activity flow. Define target system → determine security attributes → identify threats → identify risks → develop protection plan, which covers component protection and procedural protection.]
Define Target System
[Figure: security risk management activity flow, repeated for this step.]
Target System
What components are part of the target system?
• software applications (interface modules, database
modules, etc.)
• hardware
• operating environment
• COTS
What critical information is stored, processed, and
transmitted by the target system?
Determine Security Attributes
[Figure: security risk management activity flow, repeated for this step.]
Security Attributes
What is important about the critical information on the
target system?
• Who will be authorized to view it? (confidentiality)
• To what extent should the information be accurate and
complete? Who will be authorized to modify it?
(integrity)
• How often must the information be available?
(availability)
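
Where it helps during development, these answers can be written down in a structured form. A minimal sketch in Python; the asset name, roles, and field layout are illustrative assumptions, not part of OCTAVE:

    from dataclasses import dataclass

    @dataclass
    class SecurityAttributes:
        """Confidentiality, integrity, and availability needs for one critical information asset."""
        asset: str
        authorized_viewers: list[str]       # confidentiality: who may view it
        authorized_modifiers: list[str]     # integrity: who may modify it
        accuracy_requirement: str           # integrity: how accurate and complete it must be
        availability_requirement: str       # availability: when it must be available

    # Hypothetical example for illustration only.
    patient_records = SecurityAttributes(
        asset="patient records",
        authorized_viewers=["clinicians", "billing staff"],
        authorized_modifiers=["clinicians"],
        accuracy_requirement="complete and current at each visit",
        availability_requirement="available around the clock to on-call clinicians",
    )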
Identify Threats - 1
[Figure: security risk management activity flow, repeated for this step.]
Threats
Who or what could threaten the target system? How could
security attributes be affected?
Are system components interfaced such that a
compromise of one puts others at risk?
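
The interface question can be explored with a simple reachability check over the planned component interfaces. A hedged sketch in Python; the component names and interface list are hypothetical, and a real analysis would also consider trust boundaries and the strength of each interface:

    from collections import deque

    # Hypothetical interfaces: an entry "A": ["B"] means a compromise of A can reach B.
    interfaces = {
        "web front end": ["application server"],
        "application server": ["database", "reporting service"],
        "reporting service": ["database"],
        "database": [],
    }

    def exposed_by(compromised: str) -> set[str]:
        """Return every component reachable from a compromised component."""
        seen, queue = set(), deque([compromised])
        while queue:
            for neighbor in interfaces.get(queue.popleft(), []):
                if neighbor not in seen:
                    seen.add(neighbor)
                    queue.append(neighbor)
        return seen

    # A compromised front end puts the application server, reporting service,
    # and database at risk.
    print(exposed_by("web front end"))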
Network Attacks - 1
Consider how deliberate/malicious network-based attacks
could affect the target system.
How could system components be compromised by these
attacks?
How could security attributes be affected as a result?
• Can information be disclosed to unauthorized people?
• Can information be modified by unauthorized people?
• Can information be destroyed?
• Can access to information be interrupted?
Network Attacks - 2
Consider how accidental/non-malicious network-based
events could affect the target system.
How could system components be compromised by these
events?
How could security attributes be affected as a result?
• Can information be disclosed to unauthorized people?
• Can information be modified by unauthorized people?
• Can information be destroyed?
• Can access to information be interrupted?
Network Attacks - 3
Consider how viruses and other malicious code could
affect the target system.
How could system components be compromised by these
events?
How could security attributes be affected as a result?
• Can information be disclosed to unauthorized people?
• Can information be modified by unauthorized people?
• Can information be destroyed?
• Can access to information be interrupted?
Identify Risks
[Figure: security risk management activity flow, repeated for this step.]
Risk Identification
What is the potential impact on the organization resulting
from each threat?
Consider what might happen to the operational
environment (organization, system users, etc.) if the target
system is compromised and critical information is
• disclosed
• modified
• destroyed
• inaccessible
Risk = Threat + Impact
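
As a small illustration of "Risk = Threat + Impact", each risk can be recorded as a threat paired with its potential impact on the operational environment, then used to prioritize. The entries and impact levels below are hypothetical:

    from dataclasses import dataclass

    @dataclass
    class Risk:
        """A risk pairs a threat to the target system with an impact on the operational environment."""
        threat: str           # who or what, acting how
        outcome: str          # disclosed, modified, destroyed, or inaccessible
        impact: str           # e.g., lost revenue, fines, reputation damage
        impact_level: str     # high, medium, or low, used for prioritization

    risks = [
        Risk("outsider using a network-based attack", "disclosed",
             "fines and reputation damage", "high"),
        Risk("accidental misconfiguration by IT staff", "inaccessible",
             "increased operational costs", "medium"),
    ]

    # Prioritize areas of improvement by impact level.
    order = {"high": 0, "medium": 1, "low": 2}
    for risk in sorted(risks, key=lambda r: order[r.impact_level]):
        print(f"{risk.impact_level}: {risk.threat} -> {risk.outcome} ({risk.impact})")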
Potential Impacts on the Operational Environment
• Lost revenue
• Increased operational costs
• Reputation damage
• Fines and penalties
• Safety and health of users
• Others
Develop Protection Plan - 1
[Figure: security risk management activity flow, repeated for this step.]
What Is a Protection Plan?
A plan to reduce operational security risks
Actions in the plan primarily mitigate threats to system
components.
Protection Plan - 1
Who needs to be involved in developing the protection plan?
• users
• operational staff
• third parties
• people responsible for other components
• developers
• customers
• others
Protection Plan - 2
What is the overall protection plan for the target system?
Component Protection
• software applications
• hardware
• operating environment (operating system, I/O
management, networks)
• COTS
Procedural Protection
• transition from development to operational environment
• operational contingency plans
• access control management
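
One way to keep the overall plan reviewable is to record it as structured data keyed by the same component and procedural areas. A sketch only; the specific actions listed are illustrative, not prescribed by OCTAVE:

    # Hypothetical protection plan skeleton for the target system.
    protection_plan = {
        "component protection": {
            "software applications": ["role-based access control", "encrypt stored records"],
            "hardware": ["verify vendor hardening checklist at acceptance"],
            "operating environment": ["apply operating system and network patches monthly"],
            "COTS": ["track vendor security advisories"],
        },
        "procedural protection": {
            "transition to operations": ["lock down components", "validate software integrity"],
            "operational contingency plans": ["update backup and recovery procedures"],
            "access control management": ["review contractor and vendor accounts quarterly"],
        },
    }

    for area, components in protection_plan.items():
        for component, actions in components.items():
            print(f"{area} / {component}: {'; '.join(actions)}")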
Component Protection - 1
Software Applications
What technological measures should be used to restrict
user access to software applications?
What technological measures should be used to
authenticate users to software applications? What degree
of authentication is required?
Should encryption be employed to protect information
while in storage and during transmission?
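
For the authentication question, one common technological measure is to store salted password hashes rather than passwords themselves. A minimal sketch using Python's standard library; the iteration count and example passwords are assumptions for illustration, and encryption of stored and transmitted information would normally rely on vetted libraries and protocols such as TLS rather than custom code:

    import hashlib
    import hmac
    import os

    ITERATIONS = 600_000   # assumed work factor for illustration

    def hash_password(password: str) -> tuple[bytes, bytes]:
        """Return (salt, PBKDF2-SHA256 digest) for storage instead of the password itself."""
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return salt, digest

    def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
        """Recompute the digest and compare in constant time."""
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return hmac.compare_digest(digest, stored)

    salt, stored = hash_password("correct horse battery staple")
    assert verify_password("correct horse battery staple", salt, stored)
    assert not verify_password("wrong guess", salt, stored)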
Component Protection - 2
Software Applications (cont.)
How can the software architecture be designed to
recognize, resist, and recover from network-based
threats? How might these options affect other quality
attributes?
What monitoring capabilities are required to recognize
threats to the software applications as they are occurring?
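
For the monitoring question, a minimal sketch of recognizing one class of threat as it occurs: counting failed logins per account within a time window and warning past a threshold. The threshold, window, and logger name are assumptions for illustration:

    import logging
    import time
    from collections import defaultdict, deque

    logging.basicConfig(level=logging.WARNING)
    log = logging.getLogger("app.security")

    FAILURE_THRESHOLD = 5    # assumed values for illustration
    WINDOW_SECONDS = 300

    _recent_failures = defaultdict(deque)

    def record_failed_login(account: str, now: float | None = None) -> None:
        """Record a failed login and warn if the account exceeds the threshold within the window."""
        now = time.time() if now is None else now
        failures = _recent_failures[account]
        failures.append(now)
        while failures and now - failures[0] > WINDOW_SECONDS:
            failures.popleft()
        if len(failures) >= FAILURE_THRESHOLD:
            log.warning("possible password-guessing attack on account %r: %d failures in %d s",
                        account, len(failures), WINDOW_SECONDS)

    for second in range(6):
        record_failed_login("operator1", now=1000.0 + second)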
Component Protection - 3
Hardware
What expectations do you have for hardware security?
Who is responsible for implementing them?
Are these expectations sufficient?
How will they be verified? By whom?
Component Protection - 4
Operating Environment
What expectations do you have for operating environment
security?
Who is responsible for implementing them?
Are these expectations sufficient?
How will they be verified? By whom?
Component Protection - 5
COTS
What expectations do you have for COTS security?
Who is responsible for implementing them?
Are these expectations sufficient?
How will they be verified? By whom?
Procedural Protection - 1
Transition from Development to Operational Environment
How should hardware and software components be locked
down during the transition?
What procedures should be in place for maintaining the
target system (especially validating the integrity of
production software) during the transition?
What helpdesk support procedures should be in place
during the transition?
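
For validating the integrity of production software during the transition, one common procedure is to record cryptographic hashes of the released files at hand-off and re-check them in production. A minimal sketch; the paths and manifest format are illustrative assumptions:

    import hashlib
    import json
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        """Hash a file in chunks so large binaries need not fit in memory."""
        digest = hashlib.sha256()
        with path.open("rb") as handle:
            for chunk in iter(lambda: handle.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def write_manifest(release_dir: Path, manifest: Path) -> None:
        """Record the hash of every file in the release at hand-off time."""
        hashes = {str(p.relative_to(release_dir)): sha256_of(p)
                  for p in sorted(release_dir.rglob("*")) if p.is_file()}
        manifest.write_text(json.dumps(hashes, indent=2))

    def verify_manifest(release_dir: Path, manifest: Path) -> list[str]:
        """Return the files whose current hashes no longer match the manifest."""
        expected = json.loads(manifest.read_text())
        return [name for name, digest in expected.items()
                if sha256_of(release_dir / name) != digest]

    # Usage (hypothetical paths): write_manifest(Path("release"), Path("release.sha256.json"))
    # at hand-off, then verify_manifest(...) in production; a non-empty result means the
    # production software has drifted from what was delivered.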
Procedural Protection - 2
Operational Contingency Plans
How should operational backup procedures be modified or
updated to accommodate the security attributes of the target
system?
How should operational contingency plans be modified or
updated to accommodate the security attributes of the target
system?
Procedural Protection - 3
Access Control Management
Who is monitoring and managing access control to the
target system? What are the requirements for access
control?
How should the access of contractors and vendors with
support contracts be controlled?
What training and support will users need for accessing
and using the target system?
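
For controlling the access of contractors and vendors with support contracts, one simple control is to tie each account to its contract end date and check that date on every access decision. A sketch with hypothetical roles, permissions, and dates:

    from dataclasses import dataclass
    from datetime import date

    PERMISSIONS = {                        # assumed role-to-permission mapping
        "operator": {"view records"},
        "vendor support": {"view logs", "apply patches"},
    }

    @dataclass
    class Account:
        user: str
        roles: set[str]
        contract_end: date | None = None   # None for permanent staff

    def may(account: Account, permission: str, today: date) -> bool:
        """Allow the action only if a role grants it and any support contract is still active."""
        if account.contract_end is not None and today > account.contract_end:
            return False
        return any(permission in PERMISSIONS.get(role, set()) for role in account.roles)

    vendor = Account("acme-support", {"vendor support"}, contract_end=date(2004, 6, 30))
    print(may(vendor, "apply patches", date(2004, 1, 15)))   # True while under contract
    print(may(vendor, "apply patches", date(2004, 7, 1)))    # False once the contract ends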
Complicating Factors
• Incomplete and/or poorly defined operational security standards and requirements
• Lack of consideration of security within contracts for outsourced development, COTS software, and/or operational support
• Lack of operational expertise within the development life cycle
• Significant unknowns with respect to the operational environment during development
Summary - 1
[Figure: uncertainty decreasing over development activities from start to finish, but never reaching the certainty of a perfect system; the remaining gap at the finish is the residual risk.]
At the end of development, some degree of uncertainty remains
with respect to the target operational environment.
This uncertainty translates into the residual risk that must be
managed during operations and maintenance.
Summary - 2
[Figure: the uncertainty curve with a new start and a new finish, ending closer to the certainty of a perfect system and leaving a reduced residual risk.]
By addressing some operational security risks during development, the residual risk that must be managed during operations and maintenance is reduced.
Questions?
For Additional Information
Telephone: 412/268-5800
Fax: 412/268-5758
Internet: [email protected] (for general information)
          [email protected] (for licensing inquiries)
WWW: http://www.cert.org/octave
U.S. mail:
  Customer Relations
  Software Engineering Institute
  Carnegie Mellon University
  Pittsburgh, PA 15213-3890