NOT
the Orange Book

A Guide to the Definition, Specification, Tasking, and
Documentation for the Development of Secure
Computer Systems
Including Condensations of the Members of
the National Computer Security Center's
Rainbow Series and Related Documents




By Paul H. Merrill


Merlyn Press
Wright-Patterson Air Force Base

1992



Works by the Author from Merlyn Press

Graphics User's Manual for the NCR Worksaver System
Appendix I: TRIGS COMPUSEC Requirements
TRIGS Computer Lifecycle Management Plan
TRIGS Software Risk Management Plan
Test Report: Sigma II Silver Harvester
SoftWare Evaluation Team Report
Software Quality Assurance
PHIP Segment Specification
Rainbow Readers' Digest
NOT The Orange Book



Copyright 1992 Paul H. Merrill

All rights reserved with the exception of unrestricted use by the government.


Dedication and Acknowledgement

To Hugh Lynn, without whom this never would have gotten started. To Terry Tucker, without whose inspiration this would not be what it is. To Larry Miner, without whose indulgence this would never have been finished. To Jim Plaisted, Laura Picon, Mel Hoeferlin, Dave Hensley, and Dan O'Connor, without whose kind words this would not have been as well-aimed. To Larry Yelowitz and Doctor Dave Gobuty, without whose pressure the diamond might not be so pure. To Steve Jobs, for making Apple what it was. And, of course, to Mom and PoP.

Foreword

This book was started to serve as a text from which to teach an introductory course in Computer Security. As the writing progressed, it became apparent that more depth was needed than a simple introductory course would entail ... and ... that the work would need to serve those people not taking the course. I sincerely hope that this helps you as much as it was intended to.


Introduction

This book was written to help reduce the onslaught of information available in the field of operational computer security to manageable proportions and to help make sense of the process needed to correctly determine the level of computer security needed for a specific program. There are four major groups of intended readers who can gain from the use of this text.

Active COMPUSEC Professionals: Those who regularly work with COMPUSEC issues will find that this book works quite nicely to bring together, in a manageable size, the gems from a shelf full of reference publications. After all, who can remember it all, all of the time? Paper was invented so people could forget. Also, this book should help to locate that fine nuance which is "in one of those books over there" but eludes capture on the tip of the tongue at the moment.

Novice COMPUSEC Professionals: Those who have only recently begun to work the COMPUSEC issues will find that this book helps to point out what they don't know and then take a good stab at filling the void, to working levels at least.

COMPUSEC-Associated Professionals: For the mass of people who work in association with COMPUSEC professionals, especially on a program which has COMPUSEC as a significant portion of the effort, this book serves to enlighten without wasting an inordinate amount of time.

COMPUSEC Unknowns: For those who aren't sure whether their program requires active computer security measures, this book serves to point out what there is to be gained by COMPUSEC in an integrated security engineering effort.

NOT The Orange Book is not intended to replace the Rainbow Series or the associated publications produced by other governmental bodies. Rather it seeks to point the way in a shorter and simpler format. But, just as Cliff Notes are not the same as the classics covered by them, there is still a need, at times, to read the full text of the standard documents. Neither is this book intended to replace the engineering process (or the thought process) in determining the computer security requirements. The use of this book has been envisioned in the following ways.

Table of Contents: The table of contents is written in the Old Style, with major topics within each section included in the table itself. If a particular topic is in question, between the titles and the topics, the appropriate section can be found with little difficulty. If it is certain that the full text is going to be needed for close scrutiny, the entry includes the title, number, and color (if a Rainbow Member) of the unDigested document.

Defining the System: A guided tour of the thought processes and basic algorithms involved seeks to help the Reader to determine the level of computer security needed for the particular system and its environment. As a part of this process the tradeoffs which need to be considered between computer security and the other securities are discussed.

Spare Parts Bin: In the various portions of this section are the spare parts needed to specify the requirements and task the work effort and documentation for the various levels of computer security.

Digests: Each of the digests uses the information in the Table of Contents, with amplification, as a header and follows with the highlights from the particular book. If the digest leaves questions unanswered, going to the full text is probably in order.

Case Studies: Through the use of several case studies (at high levels of abstraction) the Reader may be able to focus the other information and tools discussed and provided by this book.

Glossary: There are two "Glossaries" in this book. The first is the digest of TG-004; the second is a glossary of the terms which need definition in this book. While it would have been nice to have only one glossary, that would have required deletion of TG-004 (not good for completeness) or the inclusion of terms in the TG-004 digest which are not in TG-004 (not good for purity).



Table of Contents

Dedication and Acknowledgement

Foreword

Introduction

Table of Contents

Section I: Building A Secure System

System Security Engineering Management
Systems Security Engineering as a Part of Systems Engineering
CompuSec Engineering as a Part of Systems Security Engineering
The "Securities"
System Accreditation
Security After Deployment
The Fundamental Laws of Computer Security

Determining the System's Security Needs

Threats and Vulnerabilities => Risks
Sharing the Security Needs ... or ... "CompuSec and the Securities"
Yellow Books (I & II)
Lowering the CompuSec Needs

Statement of Work Issues

Format Considerations
Separate WBS Item / Scattered Throughout
Cost
Quality
Oversight
Perceived Importance

System Security Engineering Management Program

The Securities
Oversight of Program
Industrial Security and the Rest

Documentation

For Itself
For Oversight Effort
Pertinent Documents

Section II: Spare Parts

C1 Class
Discretionary Access Control (DAC)
Identification And Authentication
SOW Tasking

C2 Class

Discretionary Access Control (DAC)
Identification And Authentication
Audit
System Security Architecture
SOW Tasking

B1 Class

Discretionary Access Control (DAC)
Identification And Authentication
Audit
System Security Architecture
Labels
Mandatory Access Control (MAC)
SOW Tasking

B2 Class

Discretionary Access Control (DAC)
Identification And Authentication
Audit
System Security Architecture
Labels & Mandatory Access Control (MAC)
SOW Tasking

B3 Class

Discretionary Access Control (DAC)
Identification And Authentication
Audit
System Security Architecture
Labels & Mandatory Access Control (MAC)
SOW Tasking

A1 Class

Discretionary Access Control (DAC)
Identification And Authentication
Audit
System Security Architecture
Labels & Mandatory Access Control (MAC)
SOW Tasking

CDRL Inputs

Data Item Descriptions (DIDs)

Subsystem Design Analysis Report
System Security Management Plan
Security Vulnerability Analysis
System/Subsystem Specification Document for AISs
Adversary Mission Analysis
Logistical Support Analysis

Section III: Rainbow Readers' Digest

DOD 5200.28-STD Orange (Formerly: CSC-STD-001-83)
DoD Trusted Computer System Evaluation Criteria
Fundamental Computer Security Requirements
Divisions and Classes of Systems
Testing Guidelines
Commercial Product Evaluation
Requirements

CSC-STD-002-85 Green
DoD Password Management Guideline

Individual Password
Password Change
Password Protection
Password Length

CSC-STD-003-85 Yellow I
Computer Security Requirements

Minimum User Clearance (RMin)
Maximum Data Sensitivity (RMax)
Risk Index
Computer Security Requirements

CSC-STD-004-85 Yellow II
Rationale Behind Computer Security Requirements

Risk Index
Security Risk Index Matrix
Security Index Matrix For Open Security Environments
Security Index Matrix For Closed Security Environments

CSC-STD-005-85 Dark Blue
DoD Magnetic Remanence Security Guideline

Declassification and Clearing
Declassification Permission
Properly Functioning Media
Non-functional Media
Destruction

NCSC-TG-001 Tan
A Guide to Understanding Audit

Audit Requirements Overview
Audited Events
Selective Audit

NCSC-TG-002 Blue
Trusted Product Evaluations A Guide for Vendors

Phases of the Trusted Product Evaluation Program
Technical Description of the Product
Legal Agreement

NCSC-TG-003 Tangerine I
A Guide to Understanding Discretionary Access Control

Implementation Methodologies
Control Permission and Access Modes
Requirements

NCSC-TG-004 Aqua
Glossary of Computer Security Terms

NCSC-TG-005 Red I
Trusted Network Interpretation

Security Policy
New Evaluation Areas
Network Components
Cascading

NCSC-TG-006 Tangerine II
A Guide to Understanding Configuration Management

Configuration Management (CM) Use
CM Requirements
CM Tasks

NCSC-TG-007 Burgundy
A Guide to Understanding Design Documentation

C2 Design Documentation Requirements
B1 Design Documentation Requirements
B2 Design Documentation Requirements
B3 Design Documentation Requirements
A1 Design Documentation Requirements

NCSC-TG-008 Lavender
A Guide to Understanding Trusted Distribution

Trusted Distribution (TD) Assurances
Post-Production Protection
Transit Protection

NCSC-TG-009 Venice Blue
Computer Security Subsystem Interpretation

Required Features
Assurance Requirements
Documentation Requirements

NCSC-TG-011 Red II
Trusted Network Interpretation Environments Guideline

Network Security Architecture and Design
Risk Assessment
Network Security Services

NCSC-TG-013 Hot Pink
Rating Maintenance Phase Program Document

Preevaluation Phase
Vendor Assistance Phase/Design Analysis Phase
Evaluation Phase
Rating Maintenance Phase

NCSC-TG-014 Purple
Guidelines for Formal Verification Systems

Endorsement And ETL Listing
Technical factors
Features
Assurance
Required Documentation

NCSC-TG-015 Brown
A Guide to Understanding Trusted Facility Management

Security Administrator
Secure Operator
Account Administrator
Auditor

NCSC-TG-017 Light Blue
Identification and Authentication

Authentication by Knowledge
Authentication by Ownership
Authentication by Characteristic
Implementation
I&A Requirements by Class

NCSC-TG-019 Blue
Trusted Product Evaluation Questionnaire


NCSC-TG-020 Gray
Access Control List (ACL) Features for UNIX

NCSC-TG-021 Lilac
Trusted Database Management System Interpretation

Conditions for Evaluating by Parts
Local/Global Requirements
Interpretation of the Orange Book Requirements

NCSC-TG-025 Green
Data Remanence in Automated Information Systems

Standard Clearing / Purging Methods
Considerations
Approved Procedures for Various Media

NCSC-TG-026 Fluorescent Orange
Security Features User's Guide

Audience
Format
Presentation
Example SFUGs

C Technical Report 79-91 Yellow
Integrity in Automated Information Systems

Integrity Goals
Integrity Principles
Integrity Policies and Mechanisms
Integrity Separation Policies
General Integrity Policies

NTISSAM COMPUSEC/1-87
Advisory Memorandum on Office Automation Security

User Responsibilities
Security Officer Responsibilities

MIL-STD 1785
System Security Engineering Program Management Requirements

Concept Exploration Phase
Demonstration and Validation Phase
Full-Scale Development Phase
Production and Deployment Phase

DRS-2600-5502-86
Security Requirements for System High
and Compartmented Mode Workstations

System High Requirements
Compartmented Mode Requirements

Section IV: Case Studies

NDG: New Development Gonculator
Threats
Data / Classified material
Implementation

UG: Upgrade Gonculator

Threats
Data / Classified material
Implementation

CGG: Communications Group Gonculator

Threats
Data / Classified material
Implementation

GIGS: Ground Intelligence Gonculator System

Threats
Data / Classified material
Implementation

Yellow Books

NDG Assessment
UG Assessment
CGG Assessment
GIGS Assessment

Section V: Notes

Abbreviations and Acronyms

Glossary


Section I:

Building A Secure System

I. Building a Secure Computer System

As the Information Age progresses, the reliance on computers is increasing at a steady pace. Along with this reliance comes an ever increasing need to apply the standard management practices, which are a part of the rest of our lives, to computers. Security has long been a standard management practice which now needs to make its way into the management of computer systems too. People who would never leave their office unlocked will leave their computer modem activated without a second thought. People who would never hand over their checkbook will leave their financial data on a diskette unprotected. The same people who would never hand out the combination to their safe will resist the use of password protection on a computer with the same information stored. Security must be part and parcel of a computer system, and it must be built into the system with minimum negative impact and maximum return on the investment.

System Security Engineering Management

The security aspects of a program are maintained and nourished through Systems Security Engineering Management as set forth in MIL-STD-1785. Systems Security Engineering covers the entire spectrum of security activities with an integrated approach that allows for methodical and comprehensive coverage of the security effort.

Systems Security Engineering as a Part of Systems Engineering

Through the Systems Engineering discipline, Systems Security Engineering in general, and Computer Security (CompuSec) in particular, can be accomplished in an efficient and orderly manner. The iterative analytical techniques which are so much a part of the Systems Engineering process are totally applicable to the requirements definition for the Systems Security Engineering process as well. Top level functions still decompose into lower level functionality. The design tradeoffs and requirements analysis are still studied, interpreted, and selected in the same manner. MIL-STD-1785 gives the tasks to be performed and their sequencing; Systems Engineering gives the methodology to accomplish the tasking.

Computer Security Engineering as a Part of Systems Security Engineering

Though only lightly touched upon in MIL-STD-1785, computer security (CompuSec) is an integral part of system security engineering for programs which incorporate computers processing sensitive data. Due to the ever increasing storage capacity, processing power, and transmission rates, damages and losses from computers have become a very real concern across the whole of society. An entire briefcaseful of classified data will fit on a single diskette. The average computer heist is orders of magnitude greater than the average bank robbery. Because computers operate so much faster than humans, the computer must handle a significant portion of the security.

The "Securities"

Systems Security Engineering is concerned with a group of security disciplines which are interrelated and have significant overlap in the protection given against various vulnerabilities. Each must be considered, along with its strengths and weaknesses, during system requirements definition and development.

Information Security (INFOSEC): Concerned with access to information without regard to the form of the information. (Closely related to CompuSec with a large area of overlap.)

Physical Security: Concerned with physical access to the protected resources. (Locks, doors, fences, and guards.)

Computer Security (CompuSec): Concerned with regulating and recording access to computer resources and the data residing in the computer.

Personnel Security: Concerned with the verified "Goodness" of the personnel involved with a system (Clearances).

Product Security: Concerned with the security of the "product" during the manufacturing process. (A CompuSec concern for Configuration Management and Trusted Distribution)

Industrial Security: Concerned with the developing contractors' security aspects and activities. (Higher Industrial Security is one way to lower CompuSec requirements.)

Operations Security (OPSEC): Concerned with operational information and the restricting of data to the appropriate personnel. (Loose Lips.)

Communications Security (COMSEC): Concerned with the secure communications and cryptographical aspects. (Often lumped with CompuSec and called INFOSEC.)

Emanations Security (EMSEC, Tempest): Concerned with stray electronic impulses and the ability to use them to derive useful information.

Administrative Security (Procedural Security): Concerned with the procedures used to provide for security. (Hand-written audit trails, Administrative restriction to a given security level on a given machine, Two-Man rule, etc.)

System Accreditation

No matter what kind of computer system is being developed, there will be someone (or some agency), responsible for allowing the system to operate with the data needed. This someone, the accreditor {or Designated Approving Authority (DAA), or any number of other names depending on the system} looks at the entire system and all of the securities and evaluates the interactions to determine the overall security suitability of the total system. Early and active involvement of the DAA in the development of the requirements and the system design is necessary to the development of a secure system which will be allowed to operate, as planned, when finished.

Security After Deployment

Whatever security is designed and built into the system during development must take into account the security measures which will be implemented after system deployment. Computer security measures require action from a security person at least on a part-time basis and, for larger systems, teams of security personnel are needed full time. In addition, the various implementation levels for physical security and personnel security can either raise or lower the necessary security measures which need to be incorporated into the computer system itself. (Obviously the security measures for an Automatic Teller Machine (ATM) are radically different than the measures for the bank's mainframe computer.) Security measures considered during design and development must, therefore, take into account usability and personnel availability as well as other security measures which may modify the security needs of the computer system itself.

The Fundamental Laws of Computer Security

The First Law of Computer Security

P(B_any) = 1.0

The probability of a Bust on any secure system is 1.0
If someone really wants in.

The Second Law of Computer Security

The best security system ever built will not function
If not used.

The Third Law of Computer Security

To err is Human
To really screw things up, use a Computer.

The Fourth Law of Computer Security

The audit trail is there to tell you who to shoot.


Determining the System's Security Needs

In order to levy appropriate CompuSec requirements and task the developing contractor properly, it is first necessary to determine the types of security needed and the levels of security for each type needed. This book does not seek to fully define the levels of security for all of the "Securities" - just identify the tradeoffs which can be made between the "Securities" and then fully define the level of CompuSec needed for a given set of assumptions about the system, its environment, and its capabilities.

Threats and Vulnerabilities => Risks

The purpose of the CompuSec requirements and tasking levied on a system is to lower the risks of the system. Risks occur when a threat finds a vulnerability to attack. By reducing either the threat or the system's vulnerability to the threat, the risk is reduced.

Vulnerabilities: Vulnerabilities can be grouped into two main categories for CompuSec purposes: the system itself, and the data which is stored, processed, and/or transmitted by the system.

The computer portion of the system itself is relatively fragile by nature and is susceptible to system shutdown or slowdown without very much effort on the part of the threat. Even a relatively small degradation in performance is sufficient to cause the system to become effectively worthless. In addition, the typical non-embedded system is a high dollar value piece of equipment which is always a favorite target.

The data used within the system is obviously a target for the threats to the system. After all, the manipulation of the data is why the computer system is being developed. Whether classified or not, the data is needed by the user of the system or it should not be on the system. Any data that is needed by the user is also wanted by those Not So Friendly people in the world, and/or the Not So Friendlies would like to see the data made unavailable to the user.

Threats: Threats to a system come in a wide variety of shapes and sizes. An airborne system has the threat that it might fall to Earth. A ground based system has the threat of power surges during a thunderstorm. Either of these threats can effectively deny the user the use of the system and its data without any malicious intent on the threat's part. In addition, there are indeed Not So Friendly people out there who will want a system and its data, either for their own use or to deny the use of the system to the user. Each threat must be considered for each system, though the nature of the system will limit the viable threats somewhat. (Computer Centers rarely suffer from bird strikes and crash from the sky.)

Risks: When the Vulnerable assets of a system are subject to targeting by the Threats to the system, Risk of Bad Things happening is present. The point behind the CompuSec engineering effort is to reduce the risks to an acceptable level so that the Bad Things won't happen beyond the system's ability to cope with them. (Because the Bad Things for one system are different than the Bad Things for another system, the approach to determining the needs for a system is the purpose of this section.)

Sharing the Security Needs ... or ... "CompuSec and the Securities"

Theoretically, a computer system could be built which would allow anyone, even Not So Friendly folk, to sit down at the terminal with an acceptable level of risk. In fact, this is essentially the case for the network of ATMs rearing its head all over the place. This is not, however, the optimum situation for most systems. Through a well-reasoned engineering trade-off process, a suitable mix of the securities can be found which will minimize performance degradation and cost while remaining in the realm of well understood techniques and well within the state-of-the-art (and probably within the state-of-the-technology).

Performance Considerations: Whatever security measures are levied on the system will have some effect on the performance of the system. In the case of the ATMs, ease of use is reduced by the need to have the Card and by the limited functionality of the available options. If human intervention is required for each access to data, the queue will build to intolerable levels for an active system. Requiring the system to maintain security labels on each subject and object within the system will add unnecessary overhead to a system which operates at a single security level. A facility entry system which backs up the personnel around the block at shift change is most probably counter-productive. Some of the basic trade-offs between CompuSec and the other Securities are:

Layered Physical Security will allow for more ready access to the terminals for a limited set of users, provided that the Physical Security measures are sized properly for the actual traffic flow. Physical Security tends to be manpower-intensive and/or to use emerging technology, which will add another set of CompuSec concerns regarding the computer that runs the physical security. In most instances, some level of Physical Security will be required.

When control over the users is possible, Personnel Security can reduce the level of CompuSec required. Unfortunately, operational considerations do not always allow for sufficient controls to be placed on the personnel. The ultimate case in point is the ATM Network, but, in this day and age of multinational cooperation, the limiting of personnel to US nationals goes out the window when the system is sold through Foreign Military Sales (FMS). The limiting of personnel to relatively high clearance levels will add to the lifecycle cost of the system unless the users are, by the nature of their duties, already cleared to the required level.

While labor-intensive, Procedural Security can solve many "unsolvable" situations. The system operators must have the ability to circumvent the normal operations of the system in order to perform some of their tasks - Procedural Security is the answer. Overuse of this technique will limit the ability of the "watchers" to watch effectively, and the greater the use, the greater the problem of watching the watchers.

Intelligent use of these securities will allow for a greater degree of trust to be granted the system's software through development by cleared personnel and configuration management throughout the life of the system, including upgrades. All of those little devils like Trojan Horses, Viruses, Logic Bombs, Worms, etc. will be greatly reduced, and possibly stopped, through careful application of Administrative, Product, and Industrial Securities.

Costs: Additional requirements, of any kind, add to the development cost of a system. In some cases, though, the lifecycle cost will be lower because of the increased requirements. The following are some of the aspects of the cost of the requirements that need to be considered.

Because the majority of the CompuSec requirements are realized in software, the development cost of CompuSec requirements is substantial. To some degree these costs can be lowered through the purchase of COTS systems from the National Computer Security Center's (NCSC) Evaluated Products List (EPL) for the appropriate level of security. The other securities are typically realized as hardware subsystems and as operational procedure manuals. Depending on the level of security required, these can be COTS turnkey systems and procedures which are simple and straightforward. In the extreme case, the requirements can represent a major development effort in their own right.

The payoff for the expense in development is that if the computer performs a function, a person does not need to. CompuSec requirements which are executed properly reduce the manpower needed for the life of the system. The personnel needs linked to Physical Security are high and the Administrative Security methodologies also are labor intensive. Personnel Security can become prohibitively expensive in more resources than mere money if the required clearance levels grow too far beyond the minimum necessary in an effort to simplify the security requirements in other security disciplines.

Level of Understanding: Of all of the security disciplines which are managed by System Security Engineering Management, CompuSec is probably the least well understood. While there is nothing especially difficult about CompuSec, its relative newness leads to misunderstandings concerning its requirements which would not happen in another discipline. Because misunderstanding comes so easily, any requirements which are levied in the CompuSec arena must be stated in the clearest form possible and monitored closely to ensure proper execution.

Do-Ability: In conjunction with the generally low level of understanding of the CompuSec discipline, the do-ability of CompuSec functionality bottoms out in much shallower water than with functionality that is better understood. The state-of-the-art for CompuSec is not on a par with the state-of-the-technology for other aspects of systems development. Each time a CompuSec requirement is levied it must be carefully examined to ensure the do-ability.

Yellow Books (I & II)

The Yellow Books are members of the Rainbow Series put out by the NCSC. (Actually they are CSC-STD-003-85 and CSC-STD-004-85. Digests of these two standards can be found in Section III: Rainbow Readers' Digest.) The sole purpose of these two documents is to give a standard method for determining the needed Orange Book Class for a given system with its given set of circumstances. The approach taken by the Yellow Books is conceptually simple: take the level of data on the system and compare it to the clearances of the people with access to the system. The greater the difference between the lowest cleared person and the highest level of data, the higher the risk. The following steps are taken to quantify this risk.

Minimum User Clearance Rating (RMin): Through the use of the table in the Yellow Books, a numeric rating is given to the clearance of the lowest cleared personnel who will have access to the system. This number will be between 0 (Uncleared, without access to Sensitive Unclassified Information) and 7 (TS/SBI with access to Multiple Categories of TS Data).

Maximum Data Sensitivity Rating (RMax): Through the use of the appropriate table in the Yellow Books, a numeric rating is given to the sensitivity of the most sensitive data operated on by the system. The number will be between 0 (Unclassified) and 7 (TS with two or more categories of Secret or TS data).

Risk Index: The Risk Index is computed by subtracting RMin from RMax. When a non-positive (zero or negative) value is the result {all users cleared equal to or higher than all of the data}, the Risk Index is 1 if there are any categories to which any user does not have access and 0 otherwise. When a positive value is the result, the Risk Index is that result.

(Risk Index = RMax - RMin)
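
Expressed as a minimal sketch (Python is used for illustration here and throughout; the two ratings are assumed to have already been read from the Yellow Book tables):

    def risk_index(r_min, r_max, excluded_categories):
        # r_min -- Minimum User Clearance Rating (0-7)
        # r_max -- Maximum Data Sensitivity Rating (0-7)
        # excluded_categories -- True if there are categories on the
        #   system to which some user does not have access
        if r_max - r_min > 0:
            return r_max - r_min
        # Non-positive result: every user is cleared to or above all data.
        return 1 if excluded_categories else 0

For example, a user base whose lowest clearance rates 3, on a system whose most sensitive data rates 5, yields a Risk Index of 2.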

Modes of Operation: The CompuSec requirements for a system vary with the mode of operation and the controls that are necessary to allow for operation in that mode. There are four recognized modes of operation for secure computer systems. Each of these modes has its own environment in which to operate and conditions that indicate its use.

The Dedicated Mode of operation is indicated for a system when each user has clearance for all of the data on the system, formal access to all of the data, and a valid need-to-know for all of the data.

The system high mode of operation is indicated for a system when each user has clearance for all of the data on the system and formal access to all of the data on the system, but not all of the users have a valid need-to-know for all of the data on the system.

The compartmented mode of operation is indicated for a system when each user has clearance for all of the data on the system but not all users have formal access or a valid need-to-know for all of the data on the system.

The multilevel mode of operation is indicated for systems where not every user has clearance for all of the data on the system, formal access to all of the data on the system, or a valid need-to-know for all of the data on the system.
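
The four mode definitions reduce to three per-user questions. As a minimal sketch, with each user characterized by the three answers (names hypothetical):

    def mode_of_operation(users):
        # users -- iterable of (cleared_for_all_data,
        #   formal_access_to_all_data, need_to_know_for_all_data)
        if not all(cleared for cleared, _, _ in users):
            return "multilevel"
        if not all(access for _, access, _ in users):
            return "compartmented"
        if not all(ntk for _, _, ntk in users):
            return "system high"
        return "dedicated"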

Development Environment: Malicious logic planted by the developers during development is almost impossible to find until it is too late. When a system is developed by personnel who are verified to not be Not So Friendly People the system can be trusted to act in the manner in which it was designed (at least more so than a system which may have been developed by malicious programmers.) Because of this, the Yellow Books allow for less stringent CompuSec requirements when the system is developed in a closed environment than if it is developed in an open environment.

Two conditions are required for a Closed Development Environment, and both must be met to qualify a development environment as Closed.

Developers: The developers must have clearances and authorizations equal to the data which will be processed by the system, or at least Secret clearance for systems that will process data at or above Secret. (This is to reduce the risk of malicious logic being inserted by the developers.)

Configuration Control: Configuration control must be sufficient to ensure no malicious logic is inserted after development.

If either of the above conditions is not met, the development environment is Open. It should be noted that most COTS systems are developed in an Open Environment. (In fact, most commercial vendors fail both conditions.)

Orange Book Class: Now that the Risk Index, Mode of Operation, and Development Environment type have been selected/defined for the system, the Orange Book Criteria Class needed for the system can be determined. The actual CompuSec requirements and tasking are based upon the Orange Book Class indicated.

C1: Provides for nominal Discretionary Access Control (DAC).

C2: Provides for more finely grained DAC, auditing, and accountability through individual login procedures.

B1: Additionally provides for Mandatory Access Control (MAC), an informal policy model, and export labeling.

B2: Additionally provides for a formal model, covert channel analysis, a structured Trusted Computing Base (TCB), and stringent Configuration Management.

B3: Additionally provides for a small TCB which contains only security policy enforcement functions, a separate security administrator function, and mediation of all accesses to objects.

A1: Functional requirements are the same as B3. Additionally provides for formal design specification and verification techniques and more stringent documentation and Configuration Management requirements.
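
Putting the three inputs together, the final lookup can be sketched as below. The table values are this editor's reading of the open-environment table in CSC-STD-003-85 and are illustrative only; the authoritative entries, and the closed-environment table (which permits less stringent classes), must be taken from the Yellow Books themselves.

    # Assumed rendering of the CSC-STD-003-85 minimum-class table for
    # OPEN security environments -- verify every entry against the
    # Yellow Books before use.
    OPEN_ENVIRONMENT_MINIMUM = {
        0: "C2",          # C1 may suffice in the dedicated mode
        1: "B1",
        2: "B2",
        3: "B3",
        4: "A1",
        5: "beyond A1",   # beyond the current state of the art
        6: "beyond A1",
        7: "beyond A1",
    }

    def minimum_class(risk_index):
        return OPEN_ENVIRONMENT_MINIMUM[min(risk_index, 7)]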

Lowering the CompuSec Needs

Now that the various trade-off options have been discussed and an initial Orange Book class has been identified with the Yellow Books, the iterative process of tuning the system operational concept to lower the CompuSec needs begins. The following areas are the primary areas where progress can be made in this effort.

Classified Needs: The assumed classified needs of the system should be reviewed to ensure the validity of each level of classified or sensitive data. In some instances, the data requirements of the system could be satisfied with a lower level of data which can materially lower the level of CompuSec requirements. Nice-To-Haves aren't so nice when they get to be expensive.

Operational Personnel: If the system concept calls for most of the operational personnel to be cleared to Level X and a few to be cleared only to the lower Level Y, it should be investigated whether the few can be reasonably cleared to Level X as well. If, on the other hand, only a few of the personnel are cleared (conceptually) to Level X, it should be investigated whether the Level X personnel (and the Level X data) could be segregated into a subsystem of the total system which is not accessible to the Level Y personnel.

Environmental Considerations: The net effects of the other securities should be reviewed to tune the trade-offs on iteration of the security requirements definition. In some instances, slightly raising the other Securities' requirements will greatly lower the CompuSec requirements. In other cases, it may become clear that relieving the other Securities of some of their requirements will have no negative impact on the system because CompuSec is going to need to cover that aspect of security anyway.

Statement of Work (SOW) Issues

As the Class of requirements increases, the documentation requirements and other tasks also increase. By the time that the jump from B3 to A1 comes along, the only changes are to the tasking and documentation; the functional requirements are identical. This part of the book will supply the Spare Parts and "installation instructions" for the SOW.

There are some decisions to be made for the specific program in question; these decisions will be discussed in the Format Considerations portion. The impacts of the decisions are subtle, but, like much subtlety, the effects can be truly insidious.

System Security Engineering Management (SSEM), as set forth in MIL-STD-1785, covers a broad range of security disciplines which includes computer security. The SSEM Program portion shows the paths which will help the program arrive at the appropriate destination for the system. It is especially important that the other securities neither wash out the computer security nor allow the computer security to wash them out.

The Documentation portion gives a guided tour of the various documents which are peculiar to, or measurably altered by, computer security.

The Tasks portion breaks the tasks into digestible chunks and presents them by Orange Book Class. The final arrangement in the SOW will be based on the decisions which need to be made with the help of the Format Considerations portion presented earlier in the SOW part and as modified by the SSEM program portion.

Format Considerations

The formatting of the tasking within the SOW can have impacts beyond the surface impression. The continuity of the security effort and the cost of that effort are both affected by the layout of the tasking as well as the reporting of the effort and its costs. The decision must be made early in order to have the security built into the program instead of a Johnny-Come-Lately effort tacked on to the main effort after it is too late to achieve much more than expensive Lip Service.

Separate WBS Item / Scattered Throughout

At the lower extreme, the C1 Class systems require very little in the way of special requirements or extra documentation. For such systems there is no need to have a special niche set in the Work Breakdown Structure (WBS) for the CompuSec effort.

On the other hand, the A1 Class systems have such intensive requirement sets, software proofs, and documentation that the set of activities represents a significant portion of the overall effort for the system. For such systems there is a definite need to elevate security in general, or CompuSec in particular, to a relatively high niche in the WBS for reporting purposes and to give a unified front for the related, and interdependent, activities involved in the design and development of an A1 Class system.

Now that the decisions for the extremes are taken care of, consideration must be given to the middle ground. The relative scale and scope of the CompuSec effort as a portion of the total effort will be the primary determining factor for a given system. Secondary considerations are the relative awareness of the CompuSec arena by the contractor personnel involved and the relative newness of the techniques which are planned to be used. A relatively high level WBS element dedicated to the CompuSec effort will generally lead to a CompuSec "office" within the contractor with the charge of monitoring the implementation of the requirements and handling the documentation requirements.

Cost

The higher the level at which the WBS element for CompuSec is defined, the finer the detail of the cost accounting for the CompuSec effort will be. This is, of course, the way the WBS works. In conjunction with the finer detail of the cost accounting, the actual cost will also tend to rise with the level of the WBS element. More management emphasis and a higher level manager to manage the effort equate to more effort expended, and more effort expended equates to higher cost for the CompuSec effort.

Quality

With the higher level of management attention, the quality of the work performed tends to improve somewhat. If the tasking is scattered about the system effort, the tendency exists for the CompuSec portion of the effort to get slighted in favor of the "Real Effort". As the effort is drawn together, the "Real Effort" becomes the CompuSec effort. This results in a more concerted effort and in trade-offs being made which favor the CompuSec viewpoint (at least part of the time).

Oversight

Along with the cost visibility comes overall program oversight of the effort, both on the part of the contractor and the government. This improved oversight of the effort can result in great savings because the CompuSec arena is not well understood for the most part. As time marches on and more contractors (and government personnel) become experienced with CompuSec, the great need for special oversight will wane, but, for now, the greater the oversight the better. In addition to the direct government oversight, if the contractor has a cadre of experienced personnel, the CompuSec "office" at the contractor can serve as an Internal Independent Verification and Validation (IIV&V) team. Of course, the usual problems related to IIV&V will still exist: the reporting chain will not reach high enough, the IIV&V personnel will be pulled to perform Software Engineering at the drop of a schedule, etc. For a time, though, the visibility and oversight will be improved greatly.

Perceived Importance

The single greatest reason for elevating the level of the WBS element for the CompuSec effort is the perceived importance of the effort by the contractor. The contractor is resource limited. The areas which are perceived as being important to the government will receive the higher priorities in the allocation of resources. If the CompuSec effort is perceived as being of low importance, the resources allocated will be low priority resources.

System Security Engineering Management Program

The System Security Engineering Management (SSEM) Program is called for by MIL-STD-1785. While the CompuSec portion of the SSEM Program is the primary emphasis of this book, the Program as a whole needs to be examined to determine the appropriate way to task the portions.

The Securities

Each of the Securities mentioned earlier in this Section is covered by SSEM. For any given program, some, all, or none of the securities will have minimal impact or import. Each must be examined and the appropriate measures, studies, etc. must be taken to counter the threats. For the most part, the measures to be taken will take the form of support subsystems and/or procedural and administrative measures. By the start of Engineering and Manufacturing Development (EMD), and the writing of the EMD SOW, the appropriate tasking should be clear for each of the securities. The SSEM Program should cover the group as a whole with separate sections as needed for the actual design and implementation.

Oversight of Program

In order to maintain an appropriate level of oversight by the government, it is a good idea to levy the SSEM Program Management tasks as one of the direct tasks under System Engineering. This allows the tasks to receive the proper attention by the proper personnel. Within the SSEM PM section, the various securities can be broken out as needed.

Industrial Security and the Rest

One care that should be taken is to ensure that none of the securities rides roughshod over the others. (Yes, that includes CompuSec.) Another security which tends to grow a life of its own is Industrial Security. While definitely needed, Industrial Security should be kept well separated from the rest of the securities (except perhaps Product Security). A good model for a program which includes a fairly stiff CompuSec effort would be: System Engineering => SSEM => Industrial Security, CompuSec, The Other Securities.

Documentation

Along with the functional requirements for each of the Classes of secure computer systems come documentation requirements. In some cases, the documentation needs closely parallel the standard documentation (users guides, etc.) and these needs can be filled by replacement or modifications to the standard documents. In other cases, the CompuSec documents have no equivalent document (Formal Top Level Specification, Computer Security Policy Model, etc.).

For Itself

Some of the CompuSec documentation is needed because the document produced is needed by the operators of the system. For instance, the user's manuals, especially the Security Features Users Guide, are needed to allow the full and efficient use of the secure system as designed and built. On the other hand, the security test documentation and the covert channel analysis are needed by the security personnel to help counter penetration attempts during the operational phase of the system lifecycle. While these documents are not part of the standard documentation package for most systems, they are essential to the ultimate security of the system.

For Oversight Effort

Other documents in the package are for program oversight purposes. For instance, the Computer Security Policy Model is used as an intermediate document to ensure the correct interpretation and implementation of the Computer Security Policy. While these documents serve no innate long term purpose, the long term effects of not having the documents are massive. Because the CompuSec arena is not well understood and the vocabulary is basically unstable, there is an increased need for the contracting agency to maintain good oversight of the contractor's effort (and for the contractor to do the same).

Pertinent Documents

Because the documents are not those typically used for most programs, here is a list of the basic documents and their descriptions in order to help familiarize the reader with the terrain when the SOW tasking, CDRL, and tailored DIDs portions are reached.

System Security Engineering Management Plan: This is the cornerstone document for the security engineering effort. In it the contractor lays out the plan for managing the entire security engineering effort. Along with the other securities comes CompuSec and the plans to interpret, allocate, and implement CompuSec requirements and taskings. The document usually works best when detailed plans are given for the next phase of the program and broad plans are given for the remainder of the program. Updates at the beginning of each phase will then allow each layer of detailed planning to be based on a firm understanding of the approaches being taken. This is the document which will allow the government to gain confidence in the contractor's understanding of the tasking and requirements in a timely manner.

Computer Security Management Plan: This Plan is a CompuSec specific offshoot of the SSEM Plan. Here the detailed plans for CompuSec are laid out in excruciating detail. This allows the contractor and the government to ensure the understanding of the effort by the team which will actually be performing the tasking. On a large program with an intensive security engineering effort both documents are needed to allow for the varying levels of abstraction involved. On a smaller program, or a program with less intensive security needs, one or the other of the documents should be sufficient by itself.

Security Vulnerability Analysis (SVA): The SVA is generated during Concept Exploration and updated during Dem Val. By the time that FSD rolls around the SVA is complete and is used by the designers to determine the appropriate implementations to counter the vulnerabilities.

Computer Security Policy: The Policy contains the stated rules and assumptions under which the system will be built and function. Some of the topics covered would be the assumed level of personnel that will operate the system, the mode of operation for the system, whether Mandatory Access Control will be used, and the assumed operational environment for the system. The Policy really should be written by the government but typically it is left to the contractor to determine the Policy for the system, which works well if the government takes an active role in the approval process.

Computer Security Policy Model: This is the bridging document between the Policy and the requirements. Depending on the Class of the system, the model is either a formal mathematical model or an informal English language descriptive model. While the Model tends to be a good-sized document, it should be given close attention to ensure complete understanding of the Policy by the contractor.

Security Concept of Operations: This is the "Artist's Rendition" of the CompuSec arena. This is where the conceptual implementation of the Policy is laid out and the interplay of the various Securities is discussed. This document should be written for ease of understanding by the non-Security professional. This is the probable source (along with the Policy) for the inputs to the Yellow Books in the determination of the needed Class of security requirements and tasking for the system.

Security Architecture Study: This study covers the finer details of the security architecture after the initial determination of the needed Class has been made. The results can lead to changes in the needed Class through reallocation to the Other Securities.

NOTE: The Policy, SVA, Model, Security Con Ops, and Security Architecture Study are finalized around the same time period and are developed iteratively as the interdependencies and repercussions of the decision process are felt.

Covert Channel Analysis Report: This report is mandated for systems at or above Class B2. The methods of passing data outside normal, controlled channels are analyzed and any which are of sufficient bandwidth are studied for elimination/minimization purposes.

Computer Security Audit Analysis: This analysis is performed on a recurring basis to trace the implementation of the audit trail requirements. Understanding of the contractor approach and realization of the requirements can be gained, as well as the impacts of the audit process on processor throughput, network bandwidth availability, and storage space required. The periodic release is needed because the actual system performance costs continue to become more finely defined as the implementation approaches completion.

Security Features Users Guide (SFUG): This is the guide to understanding the functionality of the security features, their interactions, and guidelines for their use. For the lower Classes of systems, the SFUG can be a section of the users' manual but as the classes grow, so does the SFUG. Whenever possible, the SFUG should be a free-standing document for ease of use (and finding it).

Trusted Facility Manual: This manual covers the duties and roles of the security related positions on the system. The manual is sometimes realized as Positional Handbooks for each of the positions, with the pertinent details needed for each position in its own Handbook. If realized as a single volume, the details for each position related to the security implementation need to be present.

Descriptive Top Level Specification (DTLS): Starting at the B2 Class, the DTLS describes the Trusted Computing Base (TCB) in intimate detail, especially the TCB interface.

Formal Top Level Specification (FTLS): This is an A1 Class required document that is the formal (mathematical) brother of the DTLS. The FTLS must be produced with an endorsed formal specification system and must be mapped to the code of the TCB.

Security Test Documentation: Security test documentation serves two purposes. During development and test, the documentation is used for the separate testing of the security features to ensure the security of the system. The documentation is then maintained to allow for retesting after modifications and updates to the system and for the use of the security personnel in the evaluation of the risks related to proposed changes to the system and its environment.

Security Test Plan: The test plan, like most test plans, covers the Big Picture without going into great detail. It is the best view of the testing effort, though, and can be quite useful for gaining insight into the security implementation.

Security Test Procedures: The procedures, like most procedures, are excruciatingly detailed directions for the test itself. Where these test procedures differ is in pointing away from simple requirement satisfaction and toward security satisfaction.

Security Test Report: The test report documents the results of the testing and should be maintained through the life of the system to ensure continued protection and to document flaws which may exist.


Section II:

Spare Parts

II. Spare Parts

In Section I: Building a Secure System the determination of the appropriate Class of secure system was made. In addition, the pros and cons of SOW and WBS options were discussed and decisions were to be made based on the particular system being developed. Also, a list of pertinent documents for the CompuSec arena was discussed.

In Section II: Spare Parts the pieces are listed which can be put together to form the CompuSec portions of the specification, SOW, and CDRL for the system in question. The section is broken into parts based on the Class of system determined in Section I. For each of the Classes from C1 through A1, the requirements, SOW tasking, and CDRL inputs are listed.

Requirements: The requirements are laid out as a separate appendix to the specification. While this is not strictly needed for the lower Classes, by the time that the B-level Classes are reached, the requirements are involved enough to necessitate the creation of a separate appendix.

SOW Tasking: The SOW portion is composed of paragraphs which will need to be arranged in the SOW in accordance with the decisions, made in Section I, concerning SOW formatting.

CDRL Inputs: The CDRL input section gives essentially all of the document-specific information except for the distribution section, which will need to be determined for each system individually.

NOTE -- The Tasking and CDRL inputs are not separated by phase. If the program has clear phases, some of the tasks and documentation will need to be tailored accordingly.

Following the portions concerned with the various Classes of systems, there is a portion containing the major Data Item Descriptions (DID). In some cases there will be a need to tailor the DIDs. Such tailoring should be performed only with the greatest of care.

Two things to remember when using these Spare Parts are:

These are the basic minimums for a generic system. For your specific system there may be portions which just do not apply and there may be portions which need to be added.

Nothing replaces careful thought and consideration.


C1 Class

The C1 class secure computer system is the lowest level of CompuSec requirements defined by the Orange Book. C1 class systems are appropriate for operation in the dedicated mode of operation only. The protections are not sufficient to protect against an internal attack.

Only a limited number of high level subjects and objects are protected at all, and the limits on the sharing of objects are not restrictive.

The users are required to login but the system does not need to be able to identify individual users uniquely. This implies that group logins (or project logins) are allowed.

The Trusted Computing Base (TCB) is required to protect itself (software) from outside interference and tampering and to be able to validate the operations of the TCB hardware and firmware.

C1 Class Requirements

10.1 Discretionary Access Control (DAC)
10.1.1 The TCB shall control access between named users and named objects.

10.1.2 The TCB shall allow users to specify and control sharing of objects by named individuals or groups or both.

10.2 Identification and Authentication

10.2.1 The TCB shall require users to identify themselves before performing any other actions on behalf of that user.

10.2.2 The TCB shall authenticate the user's identity before performing any other actions on behalf of that user.

10.2.3 The TCB shall protect authentication data from unauthorized access.

10.3 The TCB shall protect itself from external interference and tampering.

10.4 The TCB shall have the capability to validate the correct operations of the TCB's hardware and firmware.
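
The flavor of requirements 10.1.1 and 10.1.2 can be shown with a short sketch; the names and structures below are hypothetical illustrations, not the interface of any evaluated product:

    class ProtectedObject:
        # A named object under C1-style Discretionary Access Control.
        def __init__(self, name, owner):
            self.name = name
            self.owner = owner
            self.allowed_users = set()
            self.allowed_groups = set()

        def share_with(self, requester, users=(), groups=()):
            # 10.1.2: users specify and control sharing of their own
            # objects by named individuals, groups, or both.
            if requester != self.owner:
                raise PermissionError("only the owner may grant access")
            self.allowed_users.update(users)
            self.allowed_groups.update(groups)

        def may_access(self, user, user_groups=()):
            # 10.1.1: the TCB mediates access between named users and
            # named objects.
            return (user == self.owner
                    or user in self.allowed_users
                    or not self.allowed_groups.isdisjoint(user_groups))

Note that nothing in the sketch requires the system to tell individual users apart beyond the login name; that distinction arrives with the C2 class.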

C1 Class SOW Tasking

A. Data Accession List. The contractor shall maintain a complete list of all data generated as a result of this contract. The contractor shall list the data by title, date, and subject. The list shall include all memos, letters, meeting minutes, phone logs, etc. All data not required by the CDRL shall be considered contractor format. The document shall be titled Gonculator Data Accession List. (DI-A-3027A/T)

B. System Description. The contractor shall prepare separately published common appendices which describe the system for use with the system documentation. These appendices shall be prepared in an Unclassified version, if possible, and such classified versions as are required for a full description of the Gonculator System. The contractor shall use the common appendices in place of the system description within the bodies of the documentation. The descriptions shall be titled Gonculator System Description Appendix, (Appropriate Classification) Version. (DI-GDRQ-80567/T)

C. System Security Management Program. The contractor shall conduct a System Security Management Program in accordance with the approved System Security Management Plan.

D. System Security Management Plan. The contractor shall develop a system security management plan describing the contractor's security engineering and management approach. The plan should include all aspects of system security, including computer security. The document shall be titled Gonculator System Security Management Plan. (DI-MISC-80839/T)

E. Security Vulnerability Analysis. The contractor shall conduct a security vulnerability analysis and document the results. The study shall include identification of logical security vulnerabilities of the system, defining functional requirements which may secure the system from exploitation, and choosing safeguards to reduce identified vulnerabilities. The document shall be titled Gonculator Security Vulnerability Analysis. (DI-MISC-80841/T)

F. Computer Security Policy. The contractor shall prepare a document that defines the security policy enforced by the computer system. The document shall be titled Gonculator Computer Security Policy. (DI-GDRQ-80567/T)

G. Security Features Users Guide. The contractor shall generate a Users' Guide that documents the protection mechanisms provided by the system, guidelines for their use, and how the protection mechanisms interact with each other. The users guide may be published as either a common appendix to the positional handbooks or a stand-alone document titled Gonculator Security Features Users' Guide. (DI-MCCR-80019A/T)

H. Trusted Facility Manual. The contractor shall generate a manual that documents cautions about functions and privileges that should be controlled when running the secure facility in accordance with DOD-5200.28-STD. The document shall be titled Gonculator Trusted Facility Manual. (DI-MCCR-80019A/T)


C2 Class

The C2 class secure computer system enforces a more finely grained DAC than C1 class systems do. C2 class systems are appropriate for operation in the dedicated or system high modes of operation only. The protections include individual accountability.

Only a limited number of high-level subjects and objects are protected. Propagation of access rights is controlled. Access is controlled at the granularity of named individual and/or group of named individuals.

The users are required to log in and the system needs to be able to identify individual users uniquely. While individuals are individuals, group access rights are still allowed.

Reuse of storage objects (memory, disk space, buffers, etc.) is only allowed after the object is cleared to disallow any access to the old data.
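
A purely illustrative sketch of one way to meet this object reuse rule (the buffer pool is hypothetical, not drawn from the standard): the allocator clears every storage object before handing it to a new subject.

    # Illustrative sketch only: clear a storage object before reuse so no
    # residue of the previous owner's data reaches the next subject.
    class BufferPool:
        def __init__(self, count, size):
            self.free = [bytearray(size) for _ in range(count)]

        def allocate(self):
            buf = self.free.pop()
            buf[:] = bytes(len(buf))      # zero the object before reallocation
            return buf

        def release(self, buf):
            self.free.append(buf)         # clearing is deferred to allocate()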

Audit is required to make users responsible for their actions. Half of the theory behind audits is to allow security personnel to find wrongful acts and identify the perpetrator; the other half is that an advertised audit process will keep Honest people Honest.
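
The sketch below (illustrative only; the record layout and encoding are assumptions) shows one way to capture the audit record fields required in 10.4.5 below: date and time, user identity, event type, success or failure, and object name.

    # Illustrative sketch only: writing one audit trail record per event.
    import json, time

    def audit(log_file, user, event, success, obj=None):
        record = {
            "time": time.strftime("%Y-%m-%d %H:%M:%S"),
            "user": user,                 # unique identity
            "event": event,               # e.g. "login", "object_access"
            "success": success,
            "object": obj,                # name of object, if any
        }
        log_file.write(json.dumps(record) + "\n")   # append-only trail

    # usage: audit(open("audit.log", "a"), "alice", "login", True)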

The Trusted Computing Base (TCB) is required to protect itself (software) from outside interference and tampering and to be able to validate the operations of the TCB hardware and firmware.

C2 Class Requirements

10.1 Discretionary Access Control (DAC)
10.1.1 The TCB shall control access between named users and named objects.

10.1.2 The TCB shall protect all named objects from unauthorized access.

10.1.3 The TCB shall allow for the inclusion or exclusion of access to named objects at the level of the single user.

10.1.4 The TCB shall allow authorized users to specify and control sharing of objects by named individuals or groups of individuals or both.

10.1.5 The TCB shall provide controls to limit propagation of access rights.

10.2 The TCB shall clear all memory objects before reallocation.

10.3 Identification and Authentication

10.3.1 The TCB shall require users to uniquely identify themselves before performing any other actions on behalf of that user.

10.3.2 The TCB shall authenticate the user's unique identity before performing any other actions on behalf of that user.

10.3.3 The TCB shall protect authentication data from unauthorized access.

10.3.4 The TCB shall associate the user's unique identity with all auditable actions taken by the user.

10.4 Audit

10.4.1 The TCB shall be able to create an audit trail.

10.4.2 The TCB shall maintain the audit trail on-line for a minimum of thirty days.

10.4.3 The TCB shall protect the audit trail from modification or unauthorized access.

10.4.4 The TCB shall audit the following event types:

a. Login.

b. Logout.

c. Access to objects.

d. Deletion of objects.

e. Actions taken by system operators, administrators, and security officers.

10.4.5 The TCB shall include the following information in each audit trail record:

a. Date and time of event.

b. User identity.

c. Event type.

d. Success or failure of action.

e. Name of object, if any.

10.4.6 The TCB shall selectively audit the actions of one or more users based on individual identity.

10.5 System Security Architecture

10.5.1 The TCB shall protect itself from external interference and tampering.

10.5.2 The TCB shall isolate the resources to be protected by access controls and auditing.

10.6 The TCB shall have the capability to validate the correct operations of the TCB's hardware and firmware.

C2 Class SOW Tasking

A. Data Accession List. The contractor shall maintain a complete list of all data generated as a result of this contract. The contractor shall list the data by title, date, and subject. The list shall include all memos, letters, meeting minutes, phone logs, etc. All data not required by the CDRL shall be considered contractor format. The document shall be titled Gonculator Data Accession List. (DI-A-3027A/T)

B. System Description. The contractor shall prepare separately published common appendices which describe the system for use with the system documentation. These appendices shall be prepared in an Unclassified version, if possible, and such classified versions as are required for a full description of the Gonculator System. The contractor shall use the common appendices in place of the system description within the bodies of the documentation. The descriptions shall be titled Gonculator System Description Appendix, (Appropriate Classification) Version. (DI-GDRQ-80567/T)

C. System Security Management Program. The contractor shall conduct a System Security Management Program in accordance with the approved System Security Management Plan.

D. System Security Management Plan. The contractor shall develop a system security management plan describing the contractor's security engineering and management approach. The plan should include all aspects of system security, including computer security. The document shall be titled Gonculator System Security Management Plan. (DI-MISC-80839/T)

E. Security Vulnerability Analysis. The contractor shall conduct a security vulnerability analysis and document the results. The study shall include identification of logical security vulnerabilities of the system, defining functional requirements which may secure the system from exploitation, and choosing safeguards to reduce identified vulnerabilities. The document shall be titled Gonculator Security Vulnerability Analysis. (DI-MISC-80841/T)

F. Computer Security Policy. The contractor shall prepare a document that defines the security policy enforced by the computer system. The document shall be titled Gonculator Computer Security Policy. (DI-GDRQ-80567/T)

G. Computer Security Audit Analysis. The contractor shall analyze the audit schema for the system including events to be audited, audit record structures, throughput requirements, storage needs, and archival storage techniques. The document shall be titled Gonculator Computer Security Audit Analysis. (DI-GDRQ-80567/T)

H. Security Features Users Guide. The contractor shall generate a Users' Guide that documents the protection mechanisms provided by the system, guidelines for their use, and how the protection mechanisms interact with each other. The users guide may be published as either a common appendix to the positional handbooks or a stand-alone document titled Gonculator Security Features Users' Guide. (DI-MCCR-80019A/T)

I. Trusted Facility Manual. The contractor shall generate a manual that documents cautions about functions and privileges that should be controlled when running the secure facility in accordance with DOD-5200.28-STD. The document shall be titled Gonculator Trusted Facility Manual. (DI-MCCR-80019A/T)


B1 Class

The B1 class secure computer system enforces a Mandatory Access Control (MAC) policy and the associated labeling. B1 class systems are appropriate for operation in the compartmented and multilevel modes of operation. Multilevel mode operations should be carefully considered before use because there is no protection from covert channel attacks.

Only a limited number of high-level subjects and objects are protected by DAC. Propagation of access rights is controlled. Access is controlled at the granularity of named individual and/or group of named individuals.

The users are required to log in and the system needs to be able to identify individual users uniquely. While individuals are individuals, group access rights are still allowed.

Reuse of storage objects (memory, disk space, buffers, etc.) is only allowed after the object is cleared to disallow any access to the old data.

Audit is required to make users responsible for their actions. Half of the theory behind audits is to allow security personnel to find wrongful acts and identify the perpetrator; the other half is that an advertised audit process will keep Honest people Honest. With labeling and MAC, the security level is also recorded for objects being audited.

Labels are required for all subjects and storage objects controlled by the TCB. These labels are used for MAC decisions which are also mandated for all subjects and storage objects under TCB control. The basic requirements are a simple translation of the standard "Paper-style" requirements onto the computer so the computer will mark and allow access properly.
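
The dominance tests spelled out in requirements 10.8.2 and 10.8.3 below reduce to two simple comparisons, sketched here for illustration (the level numbers and category names are hypothetical).

    # Illustrative sketch only: read-down and write-up MAC checks.
    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def dominates(a, b):
        """Label a dominates label b: higher-or-equal class, superset categories."""
        return (LEVELS[a["class"]] >= LEVELS[b["class"]]
                and a["categories"] >= b["categories"])

    def may_read(subject, obj):           # 10.8.2: subject dominates object
        return dominates(subject, obj)

    def may_write(subject, obj):          # 10.8.3: object dominates subject
        return dominates(obj, subject)

    user = {"class": "SECRET", "categories": {"NATO"}}
    doc = {"class": "CONFIDENTIAL", "categories": set()}
    assert may_read(user, doc) and not may_write(user, doc)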

The Trusted Computing Base (TCB) is required to protect itself (software) from outside interference and tampering and to be able to validate the operations of the TCB hardware and firmware.

B1 Class Requirements

10.1 Discretionary Access Control (DAC)
10.1.1 The TCB shall control access between named users and named objects.

10.1.2 The TCB shall protect all named objects from unauthorized access.

10.1.3 The TCB shall allow for the inclusion or exclusion of access to named objects at the level of the single user.

10.1.4 The TCB shall allow authorized users to specify and control sharing of objects by named individuals or groups of individuals or both.

10.1.5 The TCB shall provide controls to limit propagation of access rights.

10.2 The TCB shall clear all memory objects before allocation to a new subject.

10.3 Identification and Authentication

10.3.1 The TCB shall require users to uniquely identify themselves before performing any other actions on behalf of that user.

10.3.2 The TCB shall authenticate the user's unique identity before performing any other actions on behalf of that user.

10.3.3 The TCB shall ensure that the user's login security level and authorizations are dominated by the user's clearance and authorizations.

10.3.4 The TCB shall ensure that the security level and authorizations of subjects external to the TCB which are created on behalf of the user are dominated by the user's clearance and authorizations.

10.3.5 The TCB shall protect authentication data from unauthorized access.

10.3.6 The TCB shall associate the user's unique identity with all auditable actions taken by the user.

10.4 Audit

10.4.1 The TCB shall be able to create an audit trail.

10.4.2 The TCB shall maintain the audit trail on-line for a minimum of thirty days.

10.4.3 The TCB shall protect the audit trail from modification or unauthorized access.

10.4.4 The TCB shall audit the following event types:

a. Login.

b. Logout.

c. Access to objects.

d. Deletion of objects.

e. Override of human-readable output markings.

f. Labeling of imported unlabeled data.

g. Any change in the designation of single level and multilevel devices.

h. Any change in security level or levels associated with a subject or object.

i. Actions taken by system operators, administrators, and security officers.

10.4.5 The TCB shall include the following information in each audit trail record:

a. Date and time of event.

b. User identity.

c. Event type.

d. Success or failure of action.

e. Security level of object, if any.

f. Name of object, if any.

10.4.6 The TCB shall selectively audit the actions of one or more users based on individual identity and/or object security level.

10.5 System Security Architecture

10.5.1 The TCB shall protect itself from external interference and tampering.

10.5.2 The TCB shall isolate the resources to be protected by access controls and auditing.

10.5.3 The TCB shall maintain distinct address spaces for process isolation.

10.6 The TCB shall have the capability to validate the correct operations of the TCB's hardware and firmware.

10.7 Labels

10.7.1 The TCB shall maintain the sensitivity label associated with each subject and object under TCB control.

10.7.2 The sensitivity label associated with a subject or an object shall correctly represent the security level of the subject or object.

10.7.3 The TCB shall use the sensitivity labels as the basis for mandatory access control decisions.

10.7.4 The TCB shall request a label from an authorized user before importing unlabeled data.

10.7.5 The TCB shall associate an accurate and unambiguous sensitivity label with all exported information.

10.7.6 The TCB shall designate each I/O device and communications channel as a single level or multilevel device.

10.7.7 Any change in the designation of a device as single level or multilevel shall be performed manually.

10.7.8 The TCB shall associate with each object exported over a multilevel device the sensitivity level in the same form as the data and physically residing with the data.

10.7.9 The communications protocol used for each multilevel communications port shall provide for the unambiguous pairing of the data and its associated sensitivity label.

10.7.10 The TCB shall allow an authorized user to designate the single security level of information imported or exported via a single level communications port or I/O device.

10.7.11 The TCB shall allow the System Administrator to specify the printable label names associated with exported sensitivity labels.

10.7.12 The TCB shall mark the top and bottom of all human readable output with human readable sensitivity labels that properly represent the sensitivity of the output.

10.8 Mandatory Access Control (MAC)

10.8.1 The TCB shall enforce mandatory access control for all subjects and storage objects under its control.

10.8.2 A subject shall be allowed read access only if the hierarchical classification in the subject's security level is greater than or equal to the hierarchical classification in the object's security level and the non-hierarchical categories in the subject's security level include all of the non-hierarchical categories in the object's security level.

10.8.3 A subject shall be allowed write access only if the hierarchical classification in the subject's security level is less than or equal to the hierarchical classification in the object's security level and all of the non-hierarchical categories in the subject's security level are included in the non-hierarchical categories in the object's security level.

B1 Class SOW Tasking

A. Gonculator Accreditation Working Group. The contractor shall support the Gonculator Accreditation Working Group (GAWG) through attendance at meetings as needed, presenting current program data as requested, acting upon assigned and accepted Action Items, and preparing minutes of the GAWG meetings. At a minimum, the GAWG will meet twice a year. At a minimum, support at the meetings shall include representatives from program management, configuration management, system engineering, and security engineering. (DI-A-7089/T)

B. Data Accession List. The contractor shall maintain a complete list of all data generated as a result of this contract. The contractor shall list the data by title, date, and subject. The list shall include all memos, letters, meeting minutes, phone logs, etc. All data not required by the CDRL shall be considered contractor format. The document shall be titled Gonculator Data Accession List. (DI-A-3027A/T)

C. System Description. The contractor shall prepare separately published common appendices which describe the system for use with the system documentation. These appendices shall be prepared in an Unclassified version, if possible, and such classified versions as are required for a full description of the Gonculator System. The contractor shall use the common appendices in place of the system description within the bodies of the documentation. The descriptions shall be titled Gonculator System Description Appendix, (Appropriate Classification) Version. (DI-GDRQ-80567/T)

D. System Security Management Program. The contractor shall conduct a System Security Management Program in accordance with the approved System Security Management Plan.

E. System Security Management Plan. The contractor shall develop a system security management plan describing the contractor's security engineering and management approach. The plan should include all aspects of system security, including computer security. The document shall be titled Gonculator System Security Management Plan. (DI-MISC-80839/T)

F. Computer Security Management Program. The contractor shall conduct a Computer Security Management Program in accordance with the approved Computer Security Management Plan.

G. Computer Security Management Plan. The contractor shall generate a computer security management plan. The Plan shall include the methods used to manage the computer security engineering program, program schedule, and the steps to be taken to ensure the proper incorporation of the computer security requirements into the system. The document shall be titled Gonculator Computer Security Management Plan. (DI-MISC-80839/T)

H. Security Vulnerability Analysis. The contractor shall conduct a security vulnerability analysis and document the results. The study shall include identification of logical security vulnerabilities of the system, defining functional requirements which may secure the system from exploitation, and choosing safeguards to reduce identified vulnerabilities. The document shall be titled Gonculator Security Vulnerability Analysis. (DI-MISC-80841/T)

I. Computer Security Policy. The contractor shall prepare a document that defines the security policy enforced by the computer system. The document shall be titled Gonculator Computer Security Policy. (DI-GDRQ-80567/T)

J. Computer Security Policy Model. The contractor shall develop and document a model of the security policy enforced by the system. The model description shall include the specific protection mechanisms and an explanation showing that they satisfy the model and will enforce the security policy. The document shall be titled Gonculator Computer Security Policy Model. (DI-GDRQ-80567/T)

K. Security Architecture Study. The contractor shall conduct a study of the security architecture of the system and document the results of the study. The study shall include partitioning of the system, cost/benefit analysis for the architectural alternatives, and the required Class of system for each alternative and for each partitioned subsystem. The report shall detail the recommended architecture for the system. The document shall be titled Gonculator Security Architecture Study. (DI-GDRQ-80567/T)

L. System Security Concept of Operations. The contractor shall generate a system security concept of operations that documents the security concept for the system including the incorporation of the protection philosophy into the system. The document shall be titled Gonculator System Security Concept Of Operations. (DI-MISC-80840/T)

M. Computer Security Audit Analysis. The contractor shall analyze the audit schema for the system including events to be audited, audit record structures, throughput requirements, storage needs, and archival storage techniques. The document shall be titled Gonculator Computer Security Audit Analysis. (DI-GDRQ-80567/T)

N. Security Features Users Guide. The contractor shall generate a Users' Guide that documents the protection mechanisms provided by the system, guidelines for their use, and how the protection mechanisms interact with each other. The users guide may be published as either a common appendix to the positional handbooks or a stand-alone document titled Gonculator Security Features Users' Guide. (DI-MCCR-80019A/T)

O. Trusted Facility Manual. The contractor shall generate a manual that documents cautions about functions and privileges that should be controlled when running the secure facility in accordance with DOD-5200.28-STD. The document shall be titled Gonculator Trusted Facility Manual. (DI-MCCR-80019A/T)

P. Security Test. The contractor shall conduct a separate security test in conjunction with the system acceptance test. The security test shall show that all security mechanisms function as defined in the system documentation and that the system is resistant to penetration. The security test shall include an ad hoc, loosely structured test by the government test team. The contractor shall train the government test team in the operation of the system before the start of the test. The government test team will consist of six people drawn from the developing, supporting, using, and accrediting communities.

Q. Security Test Plan. The contractor shall develop a test plan for the security testing to be performed. The document shall be titled Gonculator Security Test Plan. (DI-MCCR-80014A/T)

R. Security Test Description. The contractor shall develop test procedures to implement the approved Gonculator Security Test Plan. The document shall be titled Gonculator Security Test Description. (DI-MCCR-80015A/T)

S. Security Test Report. The contractor shall document the results of the security test. The document shall be titled Gonculator Security Test Report. (DI-MCCR-80017A/T)


B2 Class

The B2 class secure computer system is based on a clearly defined security policy and a model of that policy. B2 class systems are appropriate for operation in the compartmented and multilevel modes of operation. Multilevel mode operations should be carefully considered before use because there is limited protection from covert channel attacks.
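
For a sense of what the covert channel analysis at this level must measure, the following sketch (illustrative only; every rate in it is a hypothetical figure, not a value from the standard) estimates the maximum bandwidth of a covert storage channel in which a sender signals one bit per toggle of a shared resource attribute, such as a file lock.

    # Illustrative sketch only: rough maximum-bandwidth estimate for a
    # hypothetical covert storage channel.
    toggles_per_second = 10               # assumed attribute-change rate
    bits_per_toggle = 1                   # one bit signaled per toggle
    bandwidth = toggles_per_second * bits_per_toggle
    print("Estimated channel bandwidth: %d bits/second" % bandwidth)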

Only a limited number of high-level subjects and objects are protected by DAC. Propagation of access rights is controlled. Access is controlled at the granularity of named individual and/or group of named individuals.

The users are required to log in and the system needs to be able to identify individual users uniquely. While individuals are individuals, group access rights are still allowed.

Reuse of storage objects (memory, disk space, buffers, etc.) is only allowed after the object is cleared to disallow any access to the old data.

Audit is required to make users responsible for their actions. Half of the theory behind audits is to allow security personnel to find wrongful acts and identify the perpetrator; the other half is that an advertised audit process will keep Honest people Honest. With labeling and MAC, the security level is also recorded for objects being audited.

Labels are required for all system resources. These labels are used for MAC decisions which are also mandated for all subjects and storage objects under TCB control. The basic requirements are a simple translation of the standard "Paper-style" requirements onto the computer so the computer will mark and allow access properly.

The Trusted Computing Base (TCB) is required to protect itself (software) from outside interference and tampering and to be able to validate the operations of the TCB hardware and firmware.

B2 Class Requirements

10.1 Discretionary Access Control (DAC)
10.1.1 The TCB shall control access between named users and named objects.

10.1.2 The TCB shall protect all named objects from unauthorized access.

10.1.3 The TCB shall allow for the inclusion or exclusion of access to named objects at the level of the single user.

10.1.4 The TCB shall allow authorized users to specify and control sharing of objects by named individuals or groups of individuals or both.

10.1.5 The TCB shall provide controls to limit propagation of access rights.

10.2 The TCB shall clear all memory objects before allocation to a new subject.

10.3 Identification and Authentication

10.3.1 The TCB shall require users to uniquely identify themselves before performing any other actions on behalf of that user.

10.3.2 The TCB shall authenticate the user's unique identity before performing any other actions on behalf of that user.

10.3.3 The TCB shall ensure that the user's login security level and authorizations are dominated by the user's clearance and authorizations.

10.3.4 The TCB shall ensure that the security level and authorizations of subjects external to the TCB which are created on behalf of the user are dominated by the user's clearance and authorizations.

10.3.5 The TCB shall protect authentication data from unauthorized access.

10.3.6 The TCB shall associate the user's unique identity with all auditable actions taken by the user.

10.3.7 The TCB shall support a trusted communication path between the TCB and a user, initiated exclusively by the user, for initial login and authentication.

10.4 Audit

10.4.1 The TCB shall be able to create an audit trail.

10.4.2 The TCB shall maintain the audit trail on-line for a minimum of thirty days.

10.4.3 The TCB shall protect the audit trail from modification or unauthorized access.

10.4.4 The TCB shall audit the following event types:

a. Login.

b. Logout.

c. Access to objects.

d. Deletion of objects.

e. Override of human-readable output markings.

f. Labeling of imported unlabeled data.

g. Any change in the designation of single level and multilevel devices.

h. Any change in security level or levels associated with a subject or object.

i. Identified covert storage channel exploitation events.

j. Actions taken by system operators, administrators, and security officers.

10.4.5 The TCB shall include the following information in each audit trail record:

a. Date and time of event.

b. User identity.

c. Event type.

d. Success or failure of action.

e. Security level of object, if any.

f. Name of object, if any.

10.4.6 The TCB shall selectively audit the actions of one or more users based on individual identity and/or object security level.

10.5 System Security Architecture

10.5.1 The TCB shall maintain a domain of its own execution that protects it from external interference and tampering.

10.5.2 The TCB shall isolate the resources to be protected by access controls and auditing.

10.5.3 The TCB shall maintain distinct address spaces for process isolation.

10.5.4 The TCB shall be structured modularly.

10.5.5 The user interface to the TCB and all TCB elements shall be completely defined.

10.6 The TCB shall have the capability to validate the correct operations of the TCB's hardware and firmware.

10.7 Labels

10.7.1 The TCB shall maintain the sensitivity label associated with each system resource accessible by subjects external to the TCB.

10.7.2 The sensitivity label associated with a subject or an object shall correctly represent the security level of the subject or object.

10.7.3 The TCB shall notify a terminal user of each change in the user's security level during a session.

10.7.4 The TCB shall support the assignment of minimum and maximum security levels for all attached devices.

10.7.5 The TCB shall use the sensitivity labels as the basis for mandatory access control decisions.

10.7.6 The TCB shall request a label from an authorized user before importing unlabeled data.

10.7.7 The TCB shall associate an accurate and unambiguous sensitivity label with all exported information.

10.7.8 The TCB shall designate each I/O device and communications channel as a single level or multilevel device.

10.7.9 Any change in the designation of a device as single level or multilevel shall be performed manually.

10.7.10 The TCB shall associate with each object exported over a multilevel device the sensitivity level in the same form as the data and physically residing with the data.

10.7.11 The communications protocol used for each multilevel communications port shall provide for the unambiguous pairing of the data and its associated sensitivity label.

10.7.12 The TCB shall allow an authorized user to designate the single security level of information imported or exported via a single level communications port or I/O device.

10.7.13 The TCB shall allow the System Administrator to specify the printable label names associated with exported sensitivity labels.

10.7.14 The TCB shall mark the top and bottom of all human readable output with human readable sensitivity labels that properly represent the sensitivity of the output.

10.8 Mandatory Access Control (MAC)

10.8.1 The TCB shall enforce mandatory access control over all subjects and objects accessible to subjects external to the TCB.

10.8.2 A subject shall be allowed read access only if the hierarchical classification in the subject's security level is greater than or equal to the hierarchical classification in the object's security level and the non-hierarchical categories in the subject's security level include all of the non-hierarchical categories in the object's security level.

10.8.3 A subject shall be allowed write access only if the hierarchical classification in the subject's security level is less than or equal to the hierarchical classification in the object's security level and all of the non-hierarchical categories in the subject's security level are included in the non-hierarchical categories in the object's security level.

B2 Class SOW Tasking

A. Gonculator Accreditation Working Group. The contractor shall support the Gonculator Accreditation Working Group (GAWG) through attendance at meetings as needed, presenting current program data as requested, acting upon assigned and accepted Action Items, and preparing minutes of the GAWG meetings. At a minimum, the GAWG will meet twice a year. At a minimum, support at the meetings shall include representatives from program management, configuration management, system engineering, and security engineering. (DI-A-7089/T)

B. Data Accession List. The contractor shall maintain a complete list of all data generated as a result of this contract. The contractor shall list the data by title, date, and subject. The list shall include all memos, letters, meeting minutes, phone logs, etc. All data not required by the CDRL shall be considered contractor format. The document shall be titled Gonculator Data Accession List. (DI-A-3027A/T)

C. System Description. The contractor shall prepare separately published common appendices which describe the system for use with the system documentation. These appendices shall be prepared in an Unclassified version, if possible, and such classified versions as are required for a full description of the Gonculator System. The contractor shall use the common appendices in place of the system description within the bodies of the documentation. The descriptions shall be titled Gonculator System Description Appendix, (Appropriate Classification) Version. (DI-GDRQ-80567/T)

D. System Security Management Program. The contractor shall conduct a System Security Management Program in accordance with the approved System Security Management Plan.

E. System Security Management Plan. The contractor shall develop a system security management plan describing the contractor's security engineering and management approach. The plan should include all aspects of system security, including computer security. The document shall be titled Gonculator System Security Management Plan. (DI-MISC-80839/T)

F. Computer Security Management Program. The contractor shall conduct a Computer Security Management Program in accordance with the approved Computer Security Management Plan.

G. Computer Security Management Plan. The contractor shall generate a computer security management plan. The Plan shall include the methods used to manage the computer security engineering program, program schedule, and the steps to be taken to ensure the proper incorporation of the computer security requirements into the system. The document shall be titled Gonculator Computer Security Management Plan. (DI-MISC-80839/T)

H. Security Vulnerability Analysis. The contractor shall conduct a security vulnerability analysis and document the results. The study shall include identification of logical security vulnerabilities of the system, defining functional requirements which may secure the system from exploitation, and choosing safeguards to reduce identified vulnerabilities. The document shall be titled Gonculator Security Vulnerability Analysis. (DI-MISC-80841/T)

I. Security Architecture Study. The contractor shall conduct a study of the security architecture of the system and document the results of the study. The study shall include partitioning of the system, cost/benefit analysis for the architectural alternatives, and the required Class of system for each alternative and for each partitioned subsystem. The report shall detail the recommended architecture for the system. The document shall be titled Gonculator Security Architecture Study. (DI-GDRQ-80567/T)

J. System Security Concept of Operations. The contractor shall generate a system security concept of operations that documents the security concept for the system including the incorporation of the protection philosophy into the system. The document shall be titled Gonculator System Security Concept Of Operations. (DI-MISC-80840/T)

K. Computer Security Policy. The contractor shall prepare a document that defines the security policy enforced by the computer system. The document shall be titled Gonculator Computer Security Policy. (DI-GDRQ-80567/T)

L. Computer Security Policy Model. The contractor shall develop and document a formal model of the security policy enforced by the system. The model description shall include the specific protection mechanisms and an explanation showing that they satisfy the model and will enforce the security policy. The document shall be titled Gonculator Computer Security Policy Model. (DI-GDRQ-80567/T)

M. Computer Security Audit Analysis. The contractor shall analyze the audit schema for the system including events to be audited, audit record structures, throughput requirements, storage needs, and archival storage techniques. The document shall be titled Gonculator Computer Security Audit Analysis. (DI-GDRQ-80567/T)

N. Covert Channel Analysis. The contractor shall conduct a thorough search for covert storage channels and make a determination of the maximum bandwidth of each identified channel. The contractor shall generate a covert channel analysis report that documents the results of the covert channel analysis. The document shall be titled Gonculator Covert Storage Channel Analysis Report. (DI-GDRQ-80567/T)

O. Descriptive Top Level Specification. The contractor shall generate a descriptive top level specification that completely and accurately describes the TCB in terms of exceptions, error messages, effects, and interfaces. The document shall be titled Gonculator Trusted Computing Base Descriptive Top Level Specification. (DI-GDRQ-80567/T)

P. Security Features Users Guide. The contractor shall generate a Users' Guide that documents the protection mechanisms provided by the system, guidelines for their use, and how the protection mechanisms interact with each other. The users guide may be published as either a common appendix to the positional handbooks or a stand-alone document titled Gonculator Security Features Users' Guide. (DI-MCCR-80019A/T)

Q. Trusted Facility Manual. The contractor shall generate a manual that documents cautions about functions and privileges that should be controlled when running the secure facility in accordance with DOD-5200.28-STD. The document shall be titled Gonculator Trusted Facility Manual. (DI-MCCR-80019A/T)

R. Security Test. The contractor shall conduct a separate security test in conjunction with the system acceptance test. The security test shall show that all security mechanisms function as defined in the system documentation and that the system is resistant to penetration. The security test shall include an ad hoc, loosely structured test by the government test team. The contractor shall train the government test team in the operation of the system before the start of the test. The government test team will consist of six people drawn from the developing, supporting, using, and accrediting communities.

S. Security Test Plan. The contractor shall develop a test plan for the security testing to be performed. The document shall be titled Gonculator Security Test Plan. (DI-MCCR-80014A/T)

T. Security Test Description. The contractor shall develop test procedures to implement the approved Gonculator Security Test Plan. The document shall be titled Gonculator Security Test Description. (DI-MCCR-80015A/T)

U. Security Test Report. The contractor shall document the results of the security test. The document shall be titled Gonculator Security Test Report. (DI-MCCR-80017A/T)


B3 Class

The B3 class secure computer system is based on a clearly defined security policy and a model of that policy. B3 class systems are appropriate for operation in the compartmented and multilevel modes of operation.

Subjects and objects are protected by DAC. Propagation of access rights is controlled. Access is controlled at the granularity of named individual and/or group of named individuals. The TCB is able to list the groups and individuals with access to given objects, including the modes of access allowed, and those individuals and groups with no access allowed to an object. (This implies, but does not explicitly require, Access Control Lists (ACL).)
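
The sketch below (illustrative only; the names and access modes are hypothetical) shows an access control list of the kind this listing capability implies, including an explicit no-access entry.

    # Illustrative sketch only: an ACL with grant and explicit deny entries.
    acl = [
        ("deny",  "user:mallory", set()),          # no access allowed at all
        ("grant", "group:ops",    {"read"}),
        ("grant", "user:alice",   {"read", "write"}),
    ]

    def allowed(principal_ids, mode):
        for kind, who, modes in acl:               # first matching entry wins
            if who in principal_ids:
                return kind == "grant" and mode in modes
        return False                               # default is no access

    assert allowed({"user:alice"}, "write")
    assert not allowed({"user:mallory", "group:ops"}, "read")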

The users are required to log in and the system needs to be able to identify individual users uniquely. While individuals are individuals, group access rights are still allowed.

Reuse of storage objects (memory, disk space, buffers, etc.) is only allowed after the object is cleared to disallow any access to the old data.

Audit is required to make users responsible for their actions. Half of the theory behind audits is to allow security personnel to find wrongful acts and identify the perpetrator; the other half is that an advertised audit process will keep Honest people Honest. With labeling and MAC, the security level is also recorded for objects being audited.

Labels are required for all system resources. These labels are used for MAC decisions which are also mandated for all subjects and storage objects under TCB control. The basic requirements are a simple translation of the standard "Paper-style" requirements onto the computer so the computer will mark and allow access properly.

The Trusted Computing Base (TCB) is required to protect itself (software) from outside interference and tampering and to be able to validate the operations of the TCB hardware and firmware. In addition, the TCB monitors the audit events to determine if the activity indicates imminent security violations and alerts the security officer upon occurrence.
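
The sketch below (illustrative only; the threshold and window are hypothetical tuning values) shows the kind of accumulation monitoring this implies: count security-relevant events in a sliding window and alert the security officer when the count suggests an imminent violation.

    # Illustrative sketch only: sliding-window monitor for audit events.
    from collections import deque
    import time

    WINDOW_SECONDS = 60                   # assumed observation window
    THRESHOLD = 5                         # assumed failures per window
    failures = deque()

    def record_failed_login(alert):       # alert: callback to security officer
        now = time.time()
        failures.append(now)
        while failures and now - failures[0] > WINDOW_SECONDS:
            failures.popleft()            # discard events outside the window
        if len(failures) >= THRESHOLD:
            alert("imminent security violation: repeated login failures")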

B3 Class Requirements

10.1 Discretionary Access Control (DAC)
10.1.1 The TCB shall control access between named users and named objects.

10.1.2 The TCB shall protect all named objects from unauthorized access.

10.1.3 The TCB shall allow for the inclusion or exclusion of access to named objects at the level of the single user.

10.1.4 The TCB shall allow authorized users to specify and control sharing of objects by named individuals or groups of individuals or both.

10.1.5 The TCB shall provide controls to limit propagation of access rights.

10.1.6 The TCB shall be capable of specifying a list of individuals and groups with access to the object including allowed access modes.

10.2 The TCB shall clear all memory objects before allocation to a new subject.

10.3 Identification and Authentication

10.3.1 The TCB shall require users to uniquely identify themselves before performing any other actions on behalf of that user.

10.3.2 The TCB shall authenticate the user's unique identity before performing any other actions on behalf of that user.

10.3.3 The TCB shall ensure that the user's login security level and authorizations are dominated by the user's clearance and authorizations.

10.3.4 The TCB shall ensure that the security level and authorizations of subjects external to the TCB which are created on behalf of the user are dominated by the user's clearance and authorizations.

10.3.5 The TCB shall protect authentication data from unauthorized access.

10.3.6 The TCB shall associate the user's unique identity with all auditable actions taken by the user.

10.3.7 The TCB shall support a trusted communication path between the TCB and a user, initiated exclusively by the user or the TCB, for use when positive TCB-to-user connection is needed.

10.4 Audit

10.4.1 The TCB shall be able to create an audit trail.

10.4.2 The TCB shall maintain the audit trail on-line for a minimum of thirty days.

10.4.3 The TCB shall protect the audit trail from modification or unauthorized access.

10.4.4 The TCB shall audit the following event types:

a. Login.

b. Logout.

c. Access to objects.

d. Deletion of objects.

e. Override of human-readable output markings.

f. Labeling of imported unlabeled data.

g. Any change in the designation of single level and multilevel devices.

h. Any change in security level or levels associated with a subject or object.

i. Identified covert channel exploitation events.

j. Actions taken by system operators, administrators, and security officers.

10.4.5 The TCB shall include the following information in each audit trail record:

a. Date and time of event.

b. User identity.

c. Event type.

d. Success or failure of action.

e. Security level of object, if any.

f. Name of object, if any.

10.4.6 The TCB shall selectively audit the actions of one or more users based on individual identity and/or object security level.

10.4.7 The TCB shall monitor the occurrence and accumulation of security audit events and identify imminent security violations.

10.4.8 The TCB shall alert the security officer of identified imminent security violations and take action to terminate continuing violations.

10.5 System Security Architecture

10.5.1 The TCB shall maintain a domain of its own execution that protects it from external interference and tampering.

10.5.2 The TCB shall isolate the resources to be protected by access controls and auditing.

10.5.3 The TCB shall maintain distinct address spaces for process isolation.

10.5.4 The TCB shall be structured modularly.

10.5.5 The user interface to the TCB and all TCB elements shall be completely defined.

10.6 The TCB shall have the capability to validate the correct operations of the TCB's hardware and firmware.

10.7 Labels

10.7.1 The TCB shall maintain the sensitivity label associated with each system resource accessible by subjects external to the TCB.

10.7.2 The sensitivity label associated with a subject or an object shall correctly represent the security level of the subject or object.

10.7.3 The TCB shall notify a terminal user of each change in the user's security level during a session.

10.7.4 The TCB shall support the assignment of minimum and maximum security levels for all attached devices.

10.7.5 The TCB shall use the sensitivity labels as the basis for mandatory access control decisions.

10.7.6 The TCB shall request a label from an authorized user before importing unlabeled data.

10.7.7 The TCB shall associate an accurate and unambiguous sensitivity label with all exported information.

10.7.8 The TCB shall designate each I/O device and communications channel as a single level or multilevel device.

10.7.9 Any change in the designation of a device as single level or multilevel shall be performed manually.

10.7.10 The TCB shall associate with each object exported over a multilevel device the sensitivity level in the same form as the data and physically residing with the data.

10.7.11 The communications protocol used for each multilevel communications port shall provide for the unambiguous pairing of the data and its associated sensitivity label.

10.7.12 The TCB shall allow an authorized user to designate the single security level of information imported or exported via a single level communications port or I/O device.

10.7.13 The TCB shall allow the System Administrator to specify the printable label names associated with exported sensitivity labels.

10.7.14 The TCB shall mark the top and bottom of all human readable output with human readable sensitivity labels that properly represent the sensitivity of the output.

10.8 Mandatory Access Control (MAC)

10.8.1 The TCB shall enforce mandatory access control over all subjects and objects accessible to subjects external to the TCB.

10.8.2 A subject shall be allowed read access only if the hierarchical classification in the subject's security level is greater than or equal to the hierarchical classification in the object's security level and the non-hierarchical categories in the subject's security level include all of the non-hierarchical categories in the object's security level.

10.8.3 A subject shall be allowed write access only if the hierarchical classification in the subject's security level is less than or equal to the hierarchical classification in the object's security level and all of the non-hierarchical categories in the subject's security level are included in the non-hierarchical categories in the object's security level.

B3 Class SOW Tasking

A. Gonculator Accreditation Working Group. The contractor shall support the Gonculator Accreditation Working Group (GAWG) through attendance at meetings as needed, presenting current program data as requested, acting upon assigned and accepted Action Items, and preparing minutes of the GAWG meetings. At a minimum, the GAWG will meet twice a year. At a minimum, support at the meetings shall include representatives from program management, configuration management, system engineering, and security engineering. (DI-A-7089/T)

B. Data Accession List. The contractor shall maintain a complete list of all data generated as a result of this contract. The contractor shall list the data by title, date, and subject. The list shall include all memos, letters, meeting minutes, phone logs, etc. All data not required by the CDRL shall be considered contractor format. The document shall be titled Gonculator Data Accession List. (DI-A-3027A/T)

C. System Description. The contractor shall prepare separately published common appendices which describe the system for use with the system documentation. These appendices shall be prepared in an Unclassified version, if possible, and such classified versions as are required for a full description of the Gonculator System. The contractor shall use the common appendices in place of the system description within the bodies of the documentation. The descriptions shall be titled Gonculator System Description Appendix, (Appropriate Classification) Version. (DI-GDRQ-80567/T)

D. System Security Management Program. The contractor shall conduct a System Security Management Program in accordance with the approved System Security Management Plan.

E. System Security Management Plan. The contractor shall develop a system security management plan describing the contractor's security engineering and management approach. The plan should include all aspects of system security, including computer security. The document shall be titled Gonculator System Security Management Plan. (DI-MISC-80839/T)

F. Computer Security Management Program. The contractor shall conduct a Computer Security Management Program in accordance with the approved Computer Security Management Plan.

G. Computer Security Management Plan. The contractor shall generate a computer security management plan. The Plan shall include the methods used to manage the computer security engineering program, program schedule, and the steps to be taken to ensure the proper incorporation of the computer security requirements into the system. The document shall be titled Gonculator Computer Security Management Plan. (DI-MISC-80839/T)

H. Security Vulnerability Analysis. The contractor shall conduct a security vulnerability analysis and document the results. The study shall include identification of logical security vulnerabilities of the system, defining functional requirements which may secure the system from exploitation, and choosing safeguards to reduce identified vulnerabilities. The document shall be titled Gonculator Security Vulnerability Analysis. (DI-MISC-80841/T)

I. Security Architecture Study. The contractor shall conduct a study of the security architecture of the system and document the results of the study. The study shall include partitioning of the system, cost/benefit analysis for the architectural alternatives, and the required Class of system for each alternative and for each partitioned subsystem. The report shall detail the recommended architecture for the system. The document shall be titled Gonculator Security Architecture Study. (DI-GDRQ-80567/T)

J. System Security Concept of Operations. The contractor shall generate a system security concept of operations that documents the security concept for the system including the incorporation of the protection philosophy into the system. The document shall be titled Gonculator System Security Concept Of Operations. (DI-MISC-80840/T)

K. Computer Security Policy. The contractor shall prepare a document that defines the security policy enforced by the computer system. The document shall be titled Gonculator Computer Security Policy. (DI-GDRQ-80567/T)

L. Computer Security Policy Model. The contractor shall develop and document a formal model of the security policy enforced by the system. The model description shall include the specific protection mechanisms and an explanation showing that they satisfy the model and will enforce the security policy. The document shall be titled Gonculator Computer Security Policy Model. (DI-GDRQ-80567/T)

M. Computer Security Audit Analysis. The contractor shall analyze the audit schema for the system including events to be audited, audit record structures, throughput requirements, storage needs, and archival storage techniques. The document shall be titled Gonculator Computer Security Audit Analysis. (DI-GDRQ-80567/T)

N. Covert Channel Analysis. The contractor shall conduct a thorough search for covert channels and make a determination of the maximum bandwidth of each identified channel. The contractor shall generate a covert channel analysis report that documents the results of the covert channel analysis. The document shall be titled Gonculator Covert Channel Analysis Report. (DI-GDRQ-80567/T)

O. Descriptive Top Level Specification. The contractor shall generate a descriptive top level specification that completely and accurately describes the TCB in terms of exceptions, error messages, effects, and interfaces. The document shall be titled Gonculator Trusted Computing Base Descriptive Top Level Specification. (DI-GDRQ-80567/T)

P. Security Features Users Guide. The contractor shall generate a Users' Guide that documents the protection mechanisms provided by the system, guidelines for their use, and how the protection mechanisms interact with each other. The users guide may be published as either a common appendix to the positional handbooks or a stand-alone document titled Gonculator Security Features Users' Guide. (DI-MCCR-80019A/T)

Q. Trusted Facility Manual. The contractor shall generate a manual that documents cautions about functions and privileges that should be controlled when running the secure facility in accordance with DOD-5200.28-STD. The document shall be titled Gonculator Trusted Facility Manual. (DI-MCCR-80019A/T)

R. Security Test. The contractor shall conduct a separate security test in conjunction with the system acceptance test. The security test shall show that all security mechanisms function as defined in the system documentation and that the system is resistant to penetration. The security test shall include an ad hoc, loosely structured test by the government test team. The contractor shall train the government test team in the operation of the system before the start of the test. The government test team will consist of six people drawn from the developing, supporting, using, and accrediting communities.

S. Security Test Plan. The contractor shall develop a test plan for the security testing to be performed. The document shall be titled Gonculator Security Test Plan. (DI-MCCR-80014A/T)

T. Security Test Description. The contractor shall develop test procedures to implement the approved Gonculator Security Test Plan. The document shall be titled Gonculator Security Test Description. (DI-MCCR-80015A/T)

U. Security Test Report. The contractor shall document the results of the security test. The document shall be titled Gonculator Security Test Report. (DI-MCCR-80017A/T)


A1 Class

The A1 class secure computer system is based on a clearly defined security policy and a model of that policy. A1 class systems are appropriate for operation in the multilevel mode of operation. The functional requirements for A1 are identical to the requirements for the B3 class system. The increased protection offered by A1 systems is through increased assurance measures.

Subjects and objects are protected by DAC. Propagation of access rights is controlled. Access is controlled at the granularity of named individual and/or group of named individuals. The TCB is able to list the groups and individuals with access to given objects, including the modes of access allowed, and those individuals and groups with no access allowed to an object. (This implies, but does not explicitly require, Access Control Lists (ACL).)

The users are required to log in and the system needs to be able to identify individual users uniquely. While individuals are individuals, group access rights are still allowed.

Reuse of storage objects (memory, disk space, buffers, etc.) is only allowed after the object is cleared to disallow any access to the old data.

Audit is required to make users responsible for their actions. Half of the theory behind audits is to allow security personnel to find wrongful acts and identify the perpetrator; the other half is that an advertised audit process will keep Honest people Honest. With labeling and MAC, the security level is also recorded for objects being audited.

Labels are required for all system resources. These labels are used for MAC decisions which are also mandated for all subjects and storage objects under TCB control. The basic requirements are a simple translation of the standard "Paper-style" requirements onto the computer so the computer will mark and allow access properly.

The Trusted Computing Base (TCB) is required to protect itself (software) from outside interference and tampering and to be able to validate the operations of the TCB hardware and firmware. In addition, the TCB monitors the audit events to determine if the activity indicates imminent security violations and alerts the security officer upon occurrence.

A1 Class Requirements

10.1 Discretionary Access Control (DAC)
10.1.1 The TCB shall control access between named users and named objects.

10.1.2 The TCB shall protect all named objects from unauthorized access.

10.1.3 The TCB shall allow for the inclusion or exclusion of access to named objects at the level of the single user.

10.1.4 The TCB shall allow authorized users to specify and control sharing of objects by named individuals or groups of individuals or both.

10.1.5 The TCB shall provide controls to limit propagation of access rights.

10.1.6 The TCB shall be capable of specifying a list of individuals and groups with access to each object, including allowed access modes.

10.2 The TCB shall clear all storage objects before allocation to a new subject.

10.3 Identification and Authentication

10.3.1 The TCB shall require each user to be uniquely identified before performing any other actions on that user's behalf.

10.3.2 The TCB shall authenticate the user's unique identity before performing any other actions on behalf of that user.

10.3.3 The TCB shall ensure that the user's login security level and authorizations are dominated by the user's clearance and authorizations.

10.3.4 The TCB shall ensure that the security level and authorizations of subjects external to the TCB which are created on behalf of the user are dominated by the user's clearance and authorizations.

10.3.5 The TCB shall protect authentication data from unauthorized access.

10.3.6 The TCB shall associate the user's unique identity with all auditable actions taken by the user.

10.3.7 The TCB shall support a trusted communication path between the TCB and a user, initiated exclusively by the user or the TCB, for use when positive TCB-to-user connection is needed.

10.4 Audit

10.4.1 The TCB shall be able to create an audit trail.

10.4.2 The TCB shall maintain the audit trail on-line for a minimum of thirty days.

10.4.3 The TCB shall protect the audit trail from modification or unauthorized access.

10.4.4 The TCB shall audit the following event types:

a. Login.

b. Logout.

c. Access to objects.

d. Deletion of objects.

e. Override of human-readable output markings.

f. Labeling of imported unlabeled data.

g. Any change in the designation of single level and multilevel devices.

h. Any change in security level or levels associated with a subject or object.

i. Identified covert channel exploitation events.

j. Actions taken by system operators, administrators, and security officers.

10.4.5 The TCB shall include the following information in each audit trail record:

a. Date and time of event.

b. User identity.

c. Event type.

d. Success or failure of action.

e. Security level of object, if any.

f. Name of object, if any.

10.4.6 The TCB shall selectively audit the actions of one or more users based on individual identity and/or object security level.

10.4.7 The TCB shall monitor the occurrence and accumulation of security audit events and identify imminent security violations.

10.4.8 The TCB shall alert the security officer of identified imminent security violations and take action to terminate continuing violations.

10.5 System Security Architecture

10.5.1 The TCB shall maintain a domain of its own execution that protects it from external interference and tampering.

10.5.2 The TCB shall isolate the resources to be protected by access controls and auditing.

10.5.3 The TCB shall maintain distinct address spaces for process isolation.

10.5.4 The TCB shall be structured modularly.

10.5.5 The user interface to the TCB and all TCB elements shall be completely defined.

10.6 The TCB shall have the capability to validate the correct operations of the TCB's hardware and firmware.

10.7 Labels

10.7.1 The TCB shall maintain the sensitivity label associated with each system resource accessible by subjects external to the TCB.

10.7.2 The sensitivity label associated with a subject or an object shall correctly represent the security level of the subject or object.

10.7.3 The TCB shall notify a terminal user of each change in the user's security level during a session.

10.7.4 The TCB shall support the assignment of minimum and maximum security levels for all attached devices.

10.7.5 The TCB shall use the sensitivity labels as the basis for mandatory access control decisions.

10.7.6 The TCB shall request a label from an authorized user before importing unlabeled data.

10.7.7 The TCB shall associate an accurate and unambiguous sensitivity label with all exported information.

10.7.8 The TCB shall designate each I/O device and communications channel as a single level or multilevel device.

10.7.9 Any change in the designation of a device as single level or multilevel shall be performed manually.

10.7.10 The TCB shall associate with each object exported over a multilevel device a sensitivity label in the same form as the data and physically residing with the data.

10.7.11 The communications protocol used for each multilevel communications port shall provide for the unambiguous pairing of the data and its associated sensitivity label.

10.7.12 The TCB shall allow an authorized user to designate the single security level of information imported or exported via a single level communications port or I/O device.

10.7.13 The TCB shall allow the System Administrator to specify the printable label names associated with exported sensitivity labels.

10.7.14 The TCB shall mark the top and bottom of all human readable output with human readable sensitivity labels that properly represent the sensitivity of the output.

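Requirement 10.7.14 is mechanically simple; as an illustration (the page format and label placement here are assumptions), marking output might look like:

    # Sketch for 10.7.14: mark the top and bottom of each page of
    # human-readable output with a human-readable sensitivity label.
    def print_labeled_page(lines, label):
        banner = (" %s " % label).center(64, "*")
        print(banner)
        for line in lines:
            print(line)
        print(banner)

    print_labeled_page(["Gonculator status report"], "SECRET")
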
10.8 Mandatory Access Control (MAC)

10.8.1 The TCB shall enforce mandatory access control over all subjects and objects accessible to subjects external to the TCB.

10.8.2 A subject shall be allowed read access only if the hierarchical classification in the subject's security level is greater than or equal to the hierarchical classification in the object's security level and the non-hierarchical categories in the subject's security level include all of the non-hierarchical categories in the object's security level.

10.8.3 A subject shall be allowed write access only if the hierarchical classification in the subject's security level is less than or equal to the hierarchical classification in the object's security level and all of the non-hierarchical categories in the subject's security level are included in the non-hierarchical categories in the object's security level.

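Requirements 10.8.2 and 10.8.3 are the classic "read down, write up" rules stated in terms of dominance: a level dominates another when its hierarchical classification is greater than or equal and its non-hierarchical categories are a superset. A worked sketch follows; the classification ranks and category names are assumptions for illustration only.

    # Dominance check behind 10.8.2 (read) and 10.8.3 (write). A level is
    # (classification rank, set of non-hierarchical categories).
    UNCLASSIFIED, CONFIDENTIAL, SECRET, TOP_SECRET = 0, 1, 2, 3

    def dominates(a, b):
        (rank_a, cats_a), (rank_b, cats_b) = a, b
        return rank_a >= rank_b and cats_a >= cats_b  # set >= is superset

    def may_read(subject, obj):   # 10.8.2: subject must dominate object
        return dominates(subject, obj)

    def may_write(subject, obj):  # 10.8.3: object must dominate subject
        return dominates(obj, subject)

    user = (SECRET, {"NATO"})
    assert may_read(user, (CONFIDENTIAL, {"NATO"}))          # read down
    assert not may_read(user, (SECRET, {"NATO", "CRYPTO"}))  # missing category
    assert may_write(user, (TOP_SECRET, {"NATO"}))           # write up
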
A1 Class SOW Tasking

A. Gonculator Accreditation Working Group. The contractor shall support the Gonculator Accreditation Working Group (GAWG) through attendance at meetings as needed, presentation of current program data as requested, action upon assigned and accepted Action Items, and preparation of minutes of the GAWG meetings. At a minimum, the GAWG will meet twice a year. At a minimum, support at the meetings shall include representatives from program management, configuration management, system engineering, and security engineering. (DI-A-7089/T)

B. Data Accession List. The contractor shall maintain a complete list of all data generated as a result of this contract. The contractor shall list the data by title, date, and subject. The list shall include all memos, letters, meeting minutes, phone logs, etc. All data not required by the CDRL shall be considered contractor format. The document shall be titled Gonculator Data Accession List. (DI-A-3027A/T)

C. System Description. The contractor shall prepare separately published common appendices which describe the system for use with the system documentation. These appendices shall be prepared in an Unclassified version, if possible, and such classified versions as are required for a full description of the Gonculator System. The contractor shall use the common appendices in place of the system description within the bodies of the documentation. The descriptions shall be titled Gonculator System Description Appendix, (Appropriate Classification) Version. (DI-GDRQ-80567/T)

D. System Security Management Program. The contractor shall conduct a System Security Management Program in accordance with the approved System Security Management Plan.

E. System Security Management Plan. The contractor shall develop a system security management plan describing the contractor's security engineering and management approach. The plan should include all aspects of system security, including computer security. The document shall be titled Gonculator System Security Management Plan. (DI-MISC-80839/T)

F. Computer Security Management Program. The contractor shall conduct a Computer Security Management Program in accordance with the approved Computer Security Management Plan.

G. Computer Security Management Plan. The contractor shall generate a computer security management plan. The Plan shall include the methods used to manage the computer security engineering program, program schedule, and the steps to be taken to ensure the proper incorporation of the computer security requirements into the system. The document shall be titled Gonculator Computer Security Management Plan. (DI-MISC-80839/T)

H. Security Vulnerability Analysis. The contractor shall conduct a security vulnerability analysis and document the results. The study shall include identification of logical security vulnerabilities of the system, definition of functional requirements which may secure the system from exploitation, and selection of safeguards to reduce identified vulnerabilities. The document shall be titled Gonculator Security Vulnerability Analysis. (DI-MISC-80841/T)

I. Security Architecture Study. The contractor shall conduct a study of the security architecture of the system and document the results of the study. The study shall include partitioning of the system, cost/benefit analysis for the architectural alternatives, and the required Class of system for each alternative and for each partitioned subsystem. The report shall detail the recommended architecture for the system. The document shall be titled Gonculator Security Architecture Study. (DI-GDRQ-80567/T)

J. System Security Concept of Operations. The contractor shall generate a system security concept of operations that documents the security concept for the system including the incorporation of the protection philosophy into the system. The document shall be titled Gonculator System Security Concept Of Operations. (DI-MISC-80840/T)

K. Computer Security Policy. The contractor shall prepare a document that defines the security policy enforced by the computer system. The document shall be titled Gonculator Computer Security Policy. (DI-GDRQ-80567/T)

L. Computer Security Policy Model. The contractor shall develop and document a formal model of the security policy enforced by the system. The model description shall include the specific protection mechanisms and an explanation showing that they satisfy the model and will enforce the security policy. The document shall be titled Gonculator Computer Security Policy Model. (DI-GDRQ-80567/T)

M. Computer Security Audit Analysis. The contractor shall analyze the audit schema for the system including events to be audited, audit record structures, throughput requirements, storage needs, and archival storage techniques. The document shall be titled Gonculator Computer Security Audit Analysis. (DI-GDRQ-80567/T)

N. Covert Channel Analysis. The contractor shall use formal methods to conduct a thorough search for covert channels and make a determination of the maximum bandwidth of each identified channel. The contractor shall generate a covert channel analysis report that documents the results of the covert channel analysis. The document shall be titled Gonculator Covert Channel Analysis Report. (DI-GDRQ-80567/T)

O. Descriptive Top Level Specification. The contractor shall generate a descriptive top level specification that completely and accurately describes the TCB in terms of exceptions, error messages, effects, and interfaces. The document shall be titled Gonculator Trusted Computing Base Descriptive Top Level Specification. (DI-GDRQ-80567/T)

P. Formal Top Level Specification. The contractor shall develop a Formal Top Level Specification that accurately describes the TCB in terms of exceptions, error messages, and effects. The FTLS shall be developed with the use of an NCSC-endorsed formal specification and verification system. The document shall be titled Gonculator Trusted Computing Base Formal Top Level Specification. (DI-GDRQ-80567/T)

Q. Security Features Users' Guide. The contractor shall generate a Users' Guide that documents the protection mechanisms provided by the system, guidelines for their use, and how the protection mechanisms interact with each other. The users' guide may be published as either a common appendix to the positional handbooks or a stand-alone document titled Gonculator Security Features Users' Guide. (DI-MCCR-80019A/T)

R. Trusted Facility Manual. The contractor shall generate a manual that documents cautions about functions and privileges that should be controlled when running the secure facility in accordance with DOD-5200.28-STD. The document shall be titled Gonculator Trusted Facility Manual. (DI-MCCR-80019A/T)

S. Security Test. The contractor shall conduct a separate security test in conjunction with the system acceptance test. The security test shall show that all security mechanisms function as defined in the system documentation and that the system is resistant to penetration. The security test shall include an ad hoc, loosely structured test by the government test team. The contractor shall train the government test team in the operation of the system before the start of the test. The government test team will consist of six people drawn from the developing, supporting, using, and accrediting communities.

T. Security Test Plan. The contractor shall develop a test plan for the security testing to be performed. The document shall be titled Gonculator Security Test Plan. (DI-MCCR-80014A/T)

U. Security Test Description. The contractor shall develop test procedures to implement the approved Gonculator Security Test Plan. The document shall be titled Gonculator Security Test Description. (DI-MCCR-80015A/T)

V. Security Test Report. The contractor shall document the results of the security test. The document shall be titled Gonculator Security Test Report. (DI-MCCR-80017A/T)


Contract Data Requirements List Inputs

1. Gonculator Accreditation Working Group Meeting Minutes

DI-A-7089 Meeting Minutes.

Due 5 Days after each GAWG Meeting.

Distribution to all GAWG Members and Attenders.

Justification: Data is needed to document the results of the GAWG.

Contractor format acceptable.

2. Data Accession List

DI-A-3027A Data Accession List

Due monthly.

Justification: Data is needed to monitor contractor work in progress.

Contractor format acceptable.

NOTE: This is a DID on contract for most programs. The reason it is included in this list is the generally insufficient level of detail found on a DAL. Since the DAL is based almost entirely on the tasking, a SOW paragraph calling for the proper level of detail is included in the SOW Tasking for each Class.

3. System Description Appendix

DI-GDRQ-80567 Subsystem Design Analysis Report

Due 90DAC with Updates as the design of the system develops sufficiently to invalidate the previous version, but not later than each major review.

Direct distribution is limited to the Program Office, Program Engineering, and Configuration Management. The document will also be distributed as an appendix to all other documents which need a system description.

Justification: This data is needed to form a unified view of the system as it progresses.

Contractor format acceptable.

4. System Security Management Plan

DI-MISC-80839 System Security Management Plan

Due 90DAC with revisions for each new major phase of the program.

Distribution to the Program Office, Program Engineering, and Accreditor.

Justification: This data is needed to establish a viable System Security Management Program (SSMP). The approved plan is the direction to the contractor for the implementation of the SSMP.

Contractor format acceptable. Delete 10.1.6.2 through 10.1.6.7.

5. Computer Security Management Plan

DI-MISC-80839 System Security Management Plan

Due 90DAC with revisions for each new major phase of the program.

Distribution to the Program Office, Program Engineering, and Accreditor.

Justification: This data is needed to establish a viable Computer Security Management Program (CSMP). The approved plan is the direction to the contractor for the implementation of the CSMP.

Contractor format acceptable. Delete 10.1.6.2 through 10.1.6.7.

6. Security Vulnerability Analysis

DI-MISC-80841 Security Vulnerability Analysis

Preliminary version due 30DAC. Final due SRR-30 (or 90DAC if no SRR).

Distribution to the Program Office, Program Engineering, and Accreditor.

Justification: This data is needed to validate the sufficiency of the security requirements levied on the system to avoid designed-in security breaches.

Contractor format acceptable.

7. Security Architecture Study

DI-GDRQ-80567 Subsystem Design Analysis Report

Preliminary version due 90DAC. Final due SDR-30.

Distribution to the Program Office, Program Engineering, and Accreditor.

Justification: This data is needed to validate the sufficiency of the security requirements levied on the system to avoid designed-in security breaches.

Contractor format acceptable.

8. System Security Concept of Operations

DI-MISC-80840 Preliminary System Security Concept

Preliminary version due 90DAC. Final due SDR-30.

Distribution to the Program Office, Program Engineering, Users, and Accreditor.

Justification: This data is needed to validate the sufficiency of the security requirements levied on the system to avoid designed-in security breaches and to ensure the operational workability of the design concept.

Contractor format acceptable.

9. Computer Security Policy

DI-GDRQ-80567 Subsystem Design Analysis Report

Preliminary version due 30DAC. Final due SRR-30. Revisions as needed by changes to system security policy.

Distribution to the Program Office, Program Engineering, Users, and Accreditor.

Justification: This data is needed to establish the baseline for the security policy to be enforced by the system.

Contractor format acceptable.

10. Computer Security Policy Model

DI-GDRQ-80567 Subsystem Design Analysis Report

Preliminary version due SDR-30. Final due PDR-30. Revisions as needed.

Distribution to the Program Office, Program Engineering, and Accreditor.

Justification: This data is needed to validate the model of the policy for sufficiency and completeness.

Contractor format acceptable.

11. Computer Security Audit Analysis

DI-GDRQ-80567 Subsystem Design Analysis Report

Due SDR-30. Quarterly submittals after approval.

Distribution to the Program Office and Program Engineering.

Justification: This data is needed to evaluate the contractor design and implementation of the audit requirements and related performance issues.

Contractor format acceptable.

12. Covert Channel Analysis

DI-GDRQ-80567 Subsystem Design Analysis Report

Preliminary version due SDR-30. Final due PDR-30. Revisions as needed.

Distribution to the Program Office, Program Engineering, and Accreditor.

Justification: This data is needed to evaluate the security of the contractor's design and implementation with regard to covert channels.

Contractor format acceptable.

13. Descriptive Top Level Specification

DI-GDRQ-80567 Subsystem Design Analysis Report

Preliminary version due SDR-30. Final due PDR-30. Revisions as needed.

Distribution to the Program Office, Program Engineering, and Accreditor.

Justification: This data is needed to validate the contractor's design for the Trusted Computing Base (TCB).

Contractor format acceptable.

14. Formal Top Level Specification

DI-GDRQ-80567 Subsystem Design Analysis Report

Preliminary version due SDR-30. Final due PDR-30. Revisions as needed.

Distribution to the Program Office, Program Engineering, and Accreditor.

Justification: This data is needed to formally validate the contractor's design for the Trusted Computing Base (TCB).

Contractor format acceptable.

15. Security Features Users Guide

DI-MCCR-80019A Software User's Manual

Preliminary due CDR-30. Final due 30 days before test.

Distribution to the Program Office, Program Engineering, Users, and Accreditor.

Justification: This data is needed to guide the users in the use of the security features.

Contractor format acceptable.

16. Trusted Facility Manual

DI-MCCR-80019A Software User's Manual

Preliminary due CDR-30. Final due 30 days before test.

Distribution to the Program Office, Program Engineering, Users, and Accreditor.

Justification: This data is needed to guide the security personnel in the use of the security safeguards.

Contractor format acceptable.

17. Security Test Plan

DI-MCCR-80014A Software Test Plan

Preliminary due CDR-30. Final due CDR+90.

Distribution to the Program Office, Program Engineering, Users, and Accreditor.

Justification: This data is needed to determine the sufficiency of the security safeguards in the completed system.

Contractor format acceptable.

18. Security Test Description

DI-MCCR-80015A Software Test Description

Preliminary version due 90 days before test. Final due 30 days before test.

Distribution to the Program Office and Program Engineering.

Justification: This data is needed to establish the procedures to be used during the system security test.

Contractor format acceptable.

19. Security Test Report

DI-MCCR-80017A Software Test Report

Due 30 days after security test completion.

Distribution to the Program Office, Program Engineering, Users, and Accreditor.

Justification: This data is needed to determine the sufficiency of the security safeguards in the completed system.

Contractor format acceptable.

Gonculator Accreditation Working Group        B1 B2 B3 A1
Data Accession List                     C1 C2 B1 B2 B3 A1
System Description                      C1 C2 B1 B2 B3 A1
System Security Management Program      C1 C2 B1 B2 B3 A1
System Security Management Plan         C1 C2 B1 B2 B3 A1
Computer Security Management Program          B1 B2 B3 A1
Computer Security Management Plan             B1 B2 B3 A1
Security Vulnerability Analysis         C1 C2 B1 B2 B3 A1
Security Architecture Study                   B1 B2 B3 A1
System Security Concept of Operations         B1 B2 B3 A1
Computer Security Policy                C1 C2 B1 B2 B3 A1
Computer Security Policy Model                B1 B2 B3 A1
Computer Security Audit Analysis           C2 B1 B2 B3 A1
Covert Channel Analysis                          B2 B3 A1
Descriptive Top Level Specification              B2 B3 A1
Formal Top Level Specification                         A1
Security Features Users Guide           C1 C2 B1 B2 B3 A1
Trusted Facility Manual                 C1 C2 B1 B2 B3 A1
Security Test                                 B1 B2 B3 A1
Security Test Plan                            B1 B2 B3 A1
Security Test Description                     B1 B2 B3 A1
Security Test Report                          B1 B2 B3 A1


Data Item Descriptions (DIDs)

Because some of the DIDs mentioned in the SOW Tasking and CDRL Inputs portions of this section may be unfamiliar to some, and difficult to find for all, this portion of the section is dedicated to the DIDs themselves. Each of the major DIDs is included in its entirety. The DIDs contained here are:

DI-GDRQ-80567 Subsystem Design Analysis Report: This DID is a relatively generic DID which is useful for the production of the out-of-the-norm documents. In particular, the following documents use this DID:
Covert Channel Analysis Report

Descriptive Top Level Specification

Formal Top Level Specification

Computer Security Audit Analysis

Computer Security Policy

Computer Security Policy Model

Security Architecture Study

System Description Appendix

DI-MISC-80839 System Security Management Plan: This DID is useful for both the System Security Management Plan and the Computer Security Management Plan.

DI-MISC-80840 Preliminary System Security Concept: This DID is useful for the System Security Concept of Operations.

DI-MISC-80841 Security Vulnerability Analysis: This DID is useful for the Security Vulnerability Analysis. (Oddly enough.)

DI-IPSC-80690 System/Subsystem Specification (SS) Document for Automated Information Systems (AIS): As the system's security functionality increases as a proportion of the total system functionality, this DID becomes increasingly useful. In particular, if the system has security operations as its primary function (as in a security add-on subsystem), this is the DID which should be used.

DI-MISC-80842 Adversary Mission Analysis: This DID is a non-CompuSec portion of the standard System Security Engineering Management Program.

DI-S-1817 Logistical Support Plan: This DID is a non-CompuSec portion of the standard System Security Engineering Management Program.


[Note: The following HTML versions of DIDs may be replaced with correctly formatted DIDs 1817, 80016, 80017, 80567, 80690, 80839, 80840, 80841, and 80842 available in Word 7.0 files in a single Zip-compressed file: http://jya.com/ntob-did.zip  (51K) ]


Data Item Description

1. TITLE Subsystem Design Analysis Report

2. Identification Number DI-GDRQ-80567

3. Description/Purpose

3.1 This report is used to evaluate the design approach for the configuration item or subsystem and to provide visibility to the government. The data may also be used to formulate additional technical direction to the design activity.

4. Approval Date 880415

5. OPR F/AD/ALX

6a. DTIC Applicable X

6b. GDEP Applicable

7. Application/Interrelationship

7.1 This data item description (DID) contains the format and content preparation instructions for the data product generated by the specific and discrete task requirements as delineated in the contract.

7.2 This report is normally prepared during the analysis effort for each configuration item or subsystem during system acquisition. It may also be applicable to other development efforts.

7.3 Specific requirements of MIL-STD-847 should be stated in the contract data requirements list.

7.4 This report should be considered for submission to the Defense Technical Information Center, Cameron Station, Alexandria VA under the provisions of AFR 80-40.

7.5 This DID supersedes DI-S-3581.

8. Approval Limitation

9a. Applicable Forms

9b. AMSC Number F4380

10. Preparation Instructions (Continued on Page 2)

11. Distribution Statement DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited.

Page 1 of 2 Pages


Block 10. Preparation Instructions (Continued)

10.1 Reference Documents. The applicable issue of the documents cited herein, including their approval dates and dates of any applicable amendments, notices, and revisions, shall be specified in the contract.

10.2 Format. The report shall be structured to separately cover each of the major subsections of the design analysis task. The analysis report shall correlate the design requirements with the system requirement and any specified requirement for the subsystem or configuration item. The report shall include or reference all related data (sketches, preliminary drawings, schematics, functional diagrams) necessary for portrayal of the analysis or to aid in an understanding of the analysis.

10.2.1 The report shall be generally written to MIL-STD-847 format. Specific requirements of MIL-STD-847 will be stated in the contract.

10.3 Content. The report shall include the following:

10.3.1 Objective of the analysis.

10.3.2 Description of the items involved, including adequate drawings, schematics, and computer print-outs to support the analysis.

10.3.3 Specification of design constraints and assumptions imposed on the analysis.

10.3.4 Discussion of the evaluation and analysis procedure, method, or technique used, and its probable accuracy, explained by sample calculations.

10.3.5 Identification of source material used in the analysis.

10.3.6 Results of the analysis to include such aspects as:

a. Predicted performance related to requirements.

b. Design impact and any constraints which influence other subsystems or configuration items.

c. Producibility considerations.

d. Problems encountered or revealed and suggested solutions.

10.3.7 Conclusions.

Page 2 of 2 Pages


Data Item Description

1. TITLE System Security Management Plan

2. Identification Number DI-MISC-80839

3. Description/Purpose

3.1 Outlines and defines the contractor System Security Management Program. The SSMP describes the methods used to (1) identify security requirements, (2) synthesize and evaluate proposed solutions, and (3) provide security inputs to the system acquisition process.

4. Approval Date 890605

5. OPR AF 10

6a. DTIC Applicable x

6b. GDEP Applicable

7. Application/Interrelationship

7.1 This Data Item Description contains the format and content preparation instructions for data resulting from the work task described by 5.3.1.1 of MIL-STD-1785.

7.2 Security Vulnerability Analysis, DI-MISC-80841, is used with this Data Item Description when paragraphs 10.1.6.2 through 10.1.6.7 are cited.

7.3 This Data Item Description supersedes DI-R-3527/S-112-1.

7.4 Defense Technical Information Center, Cameron Station, Alexandria VA 22314-6100.

8. Approval Limitation

9a. Applicable Forms

9b. AMSC Number F4730

10. Preparation Instructions (Continued on Page 2)

11. Distribution Statement DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited.

Page 1 of 3 Pages


Block 10. Preparation Instructions (Continued)

10.1 The System Security Management Plan (SSMP) shall include the following:

10.1.1 Applicable Documents. A list of documents which apply as directives or guidance during the execution of the SSMP. The list shall include pertinent legal, regulatory, and other published or draft contract requirements applicable to the system under development. System security requirements and objectives are drawn from these documents.

10.1.2 Purpose. State the approach to the System Security Engineering Program (SSEP) and the principles which will be applied.

10.1.3 Organization. Describe the organizational placement and manning of the contractor's security engineering management organization. Use charts and diagrams to show organizational and functional relationships.

10.1.4 System Security Engineering Management Program. Describe the activities planned to satisfy System Security engineering program objectives. Use charts or diagrams to illustrate the program's functional interfaces, engineering and design requirements, activity milestones, management process, and levels of effort for each program phase.

10.1.5 Program Data Flow. Illustrate the manner in which basic program data flows, indicating how the system security engineering organization will monitor all program efforts and make inputs to decision processes.

10.1.6 System Security Engineering Functions. Describe the principal functions and specific tasks to be performed as given in the Statement of Work. Describe the assignment of these functions and tasks within the system security engineering organization. Include the following:

10.1.6.1 Establishing the Security Requirements and Objectives Baseline. Describe how the SSMP will address the following security concerns: Personnel, industrial, operations, product, communications, and physical security; survivability, antiterrorism and counterintelligence aspects. Describe how the security regulations and other program guidance will be identified and synthesized into a set of security requirements and objectives. Illustrate how these requirements and objectives will be used to measure the effectiveness of security system arrangements and how required policy revisions to government security programs will be processed.

10.1.6.2 Threat Analysis. Describe how the threat analysis will be evaluated and integrated along with adversary mission objectives.

10.1.6.3 Conducting the Adversary Mission Analysis and Constructing the Preliminary Threat Logic Tree. Describe the technical and analytical methods used to identify criteria for success in adversary mission objectives and to synthesize threat models. Delineate the system security technology research tasks and explain how this research will be documented.

10.1.6.4 Applying Threat Rejection Logic and Documenting the Results. Describe how qualitative and quantitative values will be established for threats and countermeasures and the method to document threat rejection logic.

(Continued on Page 3)

Page 2 of 3 Pages


Block 10. Preparation Instructions (Continued)

10.1.6.5 Synthesizing Countermeasures. Describe the process by which countermeasures will be synthesized. Explain how this activity and the security system synthesis and evaluation tasks will be coordinated.

10.1.6.6 Adversary Vulnerability Measurement. Describe fully the method used to identify and conduct quantitative and qualitative analysis for risks associated with each adversary mission objective. Include the application of candidate countermeasures and the manner in which preferred countermeasures will be selected and documented.

10.1.6.7 Computing and Constructing a Summary Threat Matrix. Describe how the completed Threat Logic Tree will be analyzed and system security effectiveness will be computed. Include the method used to document the results.

10.1.6.8 Integrating Security Functions with the System Engineering Process. Describe the process by which security inputs will be applied to system functional design, requirements allocation, trade-off study, and communications, electronic, and interface (CEI) design specification processes.

10.1.6.9 Security System Synthesis and Evaluation. Describe the method by which security system hardware, facilities, procedures, and personnel subsystems will be synthesized and evaluated. Specify the scope and type of research to be conducted of existing material. Include techniques to evaluate their applicability to security requirements.

10.1.6.10 Test and Evaluation. Describe the process used to identify security test requirements and proposed test methods.

10.1.6.11 Configuration Control. Describe the manner in which system security engineering efforts will be integrated with system configuration control activities. Explain how the proposed changes to the system will affect security efforts.

10.1.6.12 Relationships with Other Contractors. Outline the methods by which system security engineering efforts of associate system contractors, subcontractors, and vendors will be integrated within the System Security Engineering Program.

10.1.6.13 System Installation and Checkout. Describe how the System Security Engineering and Industrial and Product Security efforts will be coordinated to ensure no security vulnerability is created during system installation and checkout.

10.1.6.14 Product Security. Describe how the major system components/products will be secured at the contractor's assembly plants. Explain the security manpower, facilities, equipment, and procedures to be used. Include the security interface with associate contractors, subcontractors, and vendors.

Page 3 of 3 Pages


Data Item Description

1. TITLE Preliminary System Security Concept

2. Identification Number DI-MISC-80840

3. Description/Purpose

3.1 This Data Item Description outlines the format and content of the PSSC. The PSSC is a document which cites security concepts and requirements relative to a particular system. It is prepared by the contractor to provide the Government with a preliminary description of security requirements and resources.

4. Approval Date 890605

5. OPR AF 10

6a. DTIC Applicable x

6b. GDEP Applicable

7. Application/Interrelationship

7.1 This Data Item Description contains the format and content preparation instructions for data resulting from the work task described by 5.3.1.3 of MIL-STD-1785.

7.2 Security Vulnerability Analysis, DID DI-MISC-80841, is used with this Data Item Description when paragraphs 10.1.5.1 through 10.1.5.5 are cited.

7.3 Defense Technical Information Center (DTIC), Cameron Station, Alexandria VA 22314-6100.

8. Approval Limitation

9a. Applicable Forms

9b. AMSC Number F4371

10. Preparation Instructions (Continued on Page 2)

11. Distribution Statement DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited.

Page 1 of 5 Pages


Block 10. Preparation Instructions (Continued)

10.1 The Preliminary System Security Concept (PSSC) shall include the following:

10.1.1 Program Data.

10.1.1.1 Title. Include the complete PSSC title.

10.1.1.2 Submitting Agency. List the name and address of the contract agency submitting the report and the name and telephone number of a project officer or point of contact.

10.1.1.3 Contract Citation. Identify the contract number and date as listed by the government.

10.1.1.4 Security Tasks. Briefly describe major security tasks cited in the statement of work and related contract documents.

10.1.1.5 Distribution. List the names and addresses of Government and contract agencies receiving copies of this concept. If necessary, list them in an appendix and make reference to it here.

10.1.2 System Concept.

10.1.2.1 Description. Briefly describe the system and its major components. Cite separate configurations for Initial Operational Capability (IOC) and Full Operational Capability (FOC), if different.

10.1.2.2 Performance requirements. Cite the major performance and deployment criteria in the applicable statements of work and other related contract documents which affect security.

10.1.2.3 Reliability and maintainability. Identify security issues affecting system reliability, logistics reliability, availability, and maintainability.

10.1.2.4 System Survivability. Show self-protection capabilities or subsystem designs which may enhance security. (examples include devices against tampering and spoofing, chemical or biological radiation hardness, nuclear hardness, nuclear or non-nuclear electromagnetic pulse hardness, and the use of passive detection technology.)

10.1.2.5 Preplanned Product Improvements (P3I). Describe provisions or security implications for subsystem growth or improvements such as modifications and upgrades.

10.1.3 Security Subsystem Employment Data.

10.1.3.1 General employment description. Describe how, where, when, and what security subsystems will be used and how they will be integrated with the system(s) they support.

10.1.3.2 Command and control structure. Describe the command and control data that must be exchanged. Explain how security subsystems will be integrated into the command and control structure projected to exist when the system is deployed.

(Continued on Page 3)

Page 2 of 5 Pages


Block 10. Preparation Instructions (Continued)

10.1.3.3 Information systems. Identify the information that must be exchanged between this subsystem and other systems, subsystems, or components. Cite the expected length of each communications link, anticipated flow rate across each link, required availability of each link, etc.

10.1.3.4 Security subsystem standardization, interoperability, and commonality. Describe requirements for joint service interface, NATO cross-servicing, and interoperability with existing systems and subsystems. Identify procedural and technical interface standards incorporated in subsystem design.

10.1.3.5 Operational environment. Describe climatic and atmospheric environmental effects and considerations. If applicable, define the chemical and biological environment in which equipment must function.

10.1.4 Security Subsystem Support.

10.1.4.1 Maintenance planning. Outline the actions, support, and documentation necessary to establish maintenance concepts and requirements. Include maintenance tasks to be accomplished for on- and off-equipment maintenance; interservice, organic, and contractor mix, workloads, and time phasing for depot maintenance. Explain the management strategies for selecting and integrating contractor and government furnished equipment.

10.1.4.2 Manpower and personnel. Outline the projected manpower requirements envisioned to support this subsystem(s). Include type of specialty codes and skill levels required, time phased reporting, etc.

10.1.4.3 Supply Support. Show the proposed approach for provisioning initial support and acquiring, distributing, and replenishing inventory spares and repair parts.

10.1.4.4 Support equipment. Identify equipment required to support this subsystem(s). Include ground handling and maintenance equipment, tools, metrology and calibration equipment, and related computer hardware and software.

10.1.4.5 Training and training devices. Describe the training support concept from security subsystem design through deployment. Identify the command or agency responsible for developing and conducting each phase of training. Show inventory items and training devices by projected type, number, use, and locations required. Outline initial and recurring training requirements by location, type, specialty, and fiscal year.

10.1.4.6 Computer resources support. Define special computer program documentation, related software, source data, facilities, hardware, etc. required for subsystem support.

10.1.4.7 Facilities. Specify facility, shelter, and housing external to system-designed survivability features.

10.1.4.8 Packaging, handling, storage, and transportation. Describe the requirements, resources, processes, procedures, design considerations, and methods to ensure security subsystems are properly preserved, packaged, handled, and transported.

(Continued on Page 4)

Page 3 of 5 Pages


Block 10. Preparation Instructions (Continued)

10.1.4.9 Related support factors. Describe the pertinent support factors, considerations, or requirements not covered elsewhere, but deemed important to the effectiveness of the security subsystem.

10.1.5 General Provisions for System Security. Address the following security issues relative to overall system deployment and operation.

10.1.5.1 Threat Assessment. Address security threats to the system for design, development, production, at IOC, and throughout its projected life. Include foreign government capabilities, peacetime and wartime ground threats, and system-unique vulnerabilities. Make reference to government threat documents. In addition, cite requirements for threat analysis and security vulnerability assessments.

10.1.5.2 Procedural requirements. Cite security force and procedural requirements which apply to pre-, trans-, and post-attack operation in support of the Air Force Physical Security Program.

10.1.5.3 Security resources. Cite security manpower, facility, and equipment requirements in the quantities, type and configuration necessary to support the system when deployed.

10.1.5.4 Security response planning. Address emergency security response planning, which reflects the general design of the security forces posture calculated to produce the greatest invulnerability to terrorism, sabotage, and overt and covert attack. It is supported by the threat and vulnerability assessments cited in 10.1.5.1. In addition, briefly describe how a security reporting and alerting system will be implemented.

10.1.5.5 Security priorities for all applicable systems and components. Include security priorities for all operational phases, including maintenance. For example, aircraft system priorities would include nuclear alert, nonnuclear alert, mission capable and non-alert. In addition, explain how waivers, exceptions, and variance to security criteria will be identified, submitted, approved, and corrected.

10.1.5.6 Security requirements from related security disciplines. Include applicable information security, physical security, computer security, personnel security, product security, industrial security, operations security, communications security, electronic security, survivability, antiterrorism, and counter-intelligence aspects.

(Continued on Page 5)

Page 4 of 5 Pages


Block 10. Preparation Instructions (Continued)

10.1.5.7 Facility and equipment requirements. Describe the facility and equipment requirements which are incorporated into the system to support system security requirements. These requirements include:
a. The Central Security Control Facility, Master Surveillance and Control Facility, Security Force Response Facility, entry control facilities, etc.

b. Barrier systems and warning signs.

c. Alarm annunciation and display equipment.

d. Security force armament and duty equipment.

e. Security force communications. Include fixed, portable, and landline requirements by type and number.

f. Interior and exterior intrusion detection systems.

10.1.5.8 Manpower standards. Identify security force post and patrol requirements for normal operations.

10.1.5.9 Security force logistics. Cite security force logistics and materiel requirements including vehicles and associated equipment, training aids, tool kits, new armament, etc.

10.1.5.10 Entry access controls. Include system entry control requirements for all restricted areas including:

a. General criteria and unique requirements for entry control. Include the rate at which individuals must be processed during normal operations, alert operations, and periods of advanced readiness.

b. Qualification requirements for the various categories of people who must enter.

c. Personnel clearance and investigative requirements.

d. Special training or briefing and debriefing requirements.

e. Authentication and duress code techniques and procedures.

f. Dispatch control procedures for unattended or minimally staffed sites.

g. Description of the badge system, emergency procedures, and personnel escort requirements, including the number of individual names maintained in the entry data files.

Page 5 of 5 Pages


Data Item Description

1. TITLE Security Vulnerability Analysis

2. Identification Number DI-MISC-80841

3. Description/Purpose

3.1 The Security Vulnerability Analysis provides the result of the contractor's efforts to quantitatively and qualitatively define system security functional requirements and residual clandestine vulnerabilities. It will be classified no lower than SECRET NOFORN or SECRET Restricted Data, as applicable. This analysis contributes to the security vulnerability analysis.

4. Approval Date 890605

5. OPR AF 10

6a. DTIC Applicable x

6b. GDEP Applicable

7. Application/Interrelationship

7.1 This Data Item Description contains the format and content preparation instructions for data resulting from the work task described in 5.3.1.9 of MIL-STD-1785.

7.2 This Data Item Description supersedes DI-R-3528/S-113-1.

7.3 This Data Item Description is used in conjunction with the System Security Management Plan (SSMP), DI-MISC-80839.

7.4 Defense Technical Information Center, Cameron Station, Alexandria VA 22314-6100.

8. Approval Limitation

9a. Applicable Forms

9b. AMSC Number F4371

10. Preparation Instructions

10.1 The Security Vulnerability Analysis shall include:
a. A preface with a narrative description of the system and information concerning each form of external overt or covert method of attack against the system considered during system development.

b. Threat models in Threat Logic Tree format showing their transition from preliminary to initial Threat Logic Trees and thence into Summary Threat Logic Trees.

c. Rationale used for threat rejection in developing the initial Threat Logic Trees.

d. An evaluation of the conditional probabilities for achieving each adversary mission objective.

e. An assessment of security vulnerabilities related to information security, personnel security, industrial security, operations security, communications security, physical security, computer security, product security, and TEMPEST.

11. Distribution Statement DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited.

Page 1 of 1 Pages


Data Item Description

1. TITLE System/Subsystem Specification (SS) Document for Automated Information Systems (AIS)

2. Identification Number DI-IPSC-80690

3. Description/Purpose The SS provides a detailed definition of the system/subsystem functions. It documents details of the on-going analysis between the user's operational personnel and the appropriate development personnel. It defines in detail the interfaces with other systems and subsystems and the facilities to be utilized for accomplishing the interfaces.

4. Approval Date 881031

5. OPR SC

6a. DTIC Applicable

6b. GDEP Applicable

7. Application/Interrelationship

7.1 This DID contains the format and content preparation instructions for the data resulting from the work task described in 5.2 of DOD-STD-7935A.

7.2 This DID supersedes DI-S-30551B.

8. Approval Limitation

9a. Applicable Forms

9b. AMSC Number A4551

10. Preparation Instructions

10.1 Reference documents. The applicable issue of the documents cited herein, including their approval dates and dates of any applicable amendments, notices, and revisions, shall be specified in the contract.

10.2 Format. This document shall be prepared on 8 1/2 by 11 inch white paper (hardcopy) or a form of electronic media. Margins of hardcopy shall be sufficiently large to permit looseleaf binding even if some other form of binding is specified in the contract for hardcopy items.

10.2.1 Tailoring instructions. All paragraph and subparagraph numbers and titles identified in the standard shall be included in the document. In the event that a paragraph or subparagraph is tailored out, an indication to that effect should be added directly following the title.

(Continued on Page 2)

11. Distribution Statement DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited.

Page 1 of 2 Pages


Block 10. Preparation Instructions (Continued)

10.2.2 Use of alternate presentation styles. When the information required by the paragraphs and subparagraphs of this DID can be made more readable by the use of charts, tables, matrices, and other presentation styles, the contractor is encouraged to incorporate such styles. Such presentation styles shall not incorporate continuous tone photographs, multiple color printing, oversize foldout sheets, or other techniques that cannot readily be reimaged from the original using office copiers or microforms.

10.2.3 Multiple paragraphs and subparagraphs. Any paragraph or subparagraph in this DID starting with the phrase "This paragraph shall" or "This subparagraph shall" may be written as multiple paragraphs or subparagraphs to enhance readability.

10.2.4 Document reference numbers. All pages of this document shall contain the document reference number near the top of the page. If the document is classified, the classification indicator shall not obscure the document reference number. Document reference numbers will be assigned by the government issuing organization. If the government does not specify the document reference number to be used, the contractor shall assign an identification code which is unique for each document (and for each part or volume of a document published in multiple parts) and which is unique among versions or revisions of the document to ensure that unambiguous reference can be made to the proper version.

10.2.5 Document structure. The document shall contain the components discussed in the following paragraphs.

10.2.5.1 Title page. This page shall contain the document control number centered at the top of the page. Also included shall be the name and any abbreviation or acronym of the system, name of the sponsoring or issuing organization, document type name, contract number, name of contractor, and publication date, as well as any necessary security markings or other restrictions on the handling of the document. If only a portion of the system is being documented, the name and any abbreviation or acronym of that portion of the system will also be included. If a front cover or a report documentation page is provided, information on the cover or report document must be consistent with that shown on this page. If a document is published in multiple parts or volumes, each part or volume shall have its own title page that shall include a subtitle describing the contents of the particular part or volume.

10.2.5.2 Table of contents. The document shall contain a table of contents listing the title and starting page number of all paragraphs and subparagraphs that have numbers and titles. The table of contents shall also list the numbers, titles, and starting page numbers of all figures, tables, and appendices.

10.3 Content. The content of the document shall conform with the requirements of 5.2 of DOD-STD-7935A.

Page 2 of 2 Pages


Data Item Description

1. TITLE Adversary Mission Analysis

2. Identification Number DI-MISC-80842

3. Description/Purpose

3.1 This Data Item Description is used by the contractor to quantitatively describe how potential adversaries may attack the system. It will be classified no lower than SECRET NOFORN or SECRET Restricted Data, as applicable.

4. Approval Date 890605

5. OPR AF 10

6a. DTIC Applicable x

6b. GDEP Applicable

7. Application/Interrelationship

7.1 This Data Item Description contains the format and content preparation instructions for the data resulting from the work task described in 5.3.2.1 of MIL-STD-1785.

7.2 Defense Technical Information Center (DTIC), Cameron Station, Alexandria VA 22314-6100.

8. Approval Limitation

9a. Applicable Forms

9b. AMSC Number F4373

10. Preparation Instructions

10.1 The Adversary Mission Analysis shall include:
a. Descriptions of the adversary mission scenarios developed. Information resulting from the threat analysis (conceptual phase) will be used as the basis for the scenarios.

b. Estimates of adversary success criteria developed. These estimates will be prerequisites for assessing system vulnerabilities.

11. Distribution Statement DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited.

Page 1 of 1 Pages


Data Item Description

1. TITLE Logistical Support Plan

2. Identification Number DI-S-1817

3. Description/Purpose

3.1 The Logistical Support Plan provides information and data to major field commanders for support of complex items or systems.

4. Approval Date

5. OPR USAMC

6a. DTIC Applicable x

6b. GDEP Applicable

7. Application/Interrelationship

7.1 This Data Item Description contains the format and content preparation instructions for data resulting from the work task described by 5.3.1.6 of MIL-STD-1785.

7.2 This data item is related to DI-S-1813, Maintenance Support Plan, and DI-S-1819, Contractor Recommended Support Plan.

7.3 Defense Technical Information Center (DTIC), Cameron Station, Alexandria VA 22314-6100.

8. Approval Limitation

9a. Applicable Forms

9b. AMSC Number

10. Preparation Instructions

10.1 The Logistical Support Plan shall include logistical planning data and information as specified in AR 750-6, Appendix III.

10.2 The requiring activity will provide the following input in preparation of the Logistical Support Plan:

a. Modification or expansion of the support plan outline as required to meet special requirements.

b. Data to be inserted in the support plan.

10.3 The Logistical Support Plan will summarize information contained in the Maintenance Support Plan.

11. Distribution Statement DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited.

Page 1 of 1 Pages


Section III:
Rainbow Readers' Digest

The National Computer Security Center (NCSC) has produced a series of documents dealing with various aspects of computer security development and operations. Because the covers of these documents are color-coded, the documents have come to be known as the Rainbow Series. This section contains digests of each of the documents which have been published so far. In addition, a few related documents are digested which are not members of the Rainbow Series but are of the same genre. The documents in this digest are as follows.

DOD 5200.28-STD DoD Trusted Computer System Evaluation Criteria: This is the keynote document of the series. It covers Fundamental Computer Security Requirements, Divisions and Classes of Systems, Testing Guidelines, Commercial Product Evaluation, and the Requirements by Class. (Also known as the Orange Book.)

CSC-STD-002-85 DoD Password Management Guideline: Topics covered include Individual Password, Password Change, Password Protection, and Password Length.

CSC-STD-003-85 Computer Security Requirements and CSC-STD-004-85 Rationale Behind Computer Security Requirements: These two documents help you calculate the appropriate level of Computer Security Requirements for your system. The treatment given in each is slightly different, but the answers given are essentially the same. Topics covered include Minimum User Clearance (Rmin), Maximum Data Sensitivity (Rmax), Risk Index, and Computer Security Requirements Class.

CSC-STD-005-85 DoD Magnetic Remanence Security Guideline: (Superseded by NCSC-TG-025) The topics covered include Declassification and Clearing, Declassification Permission, Properly Functioning Media, Non-functional Media, and Destruction. This standard appears to have fallen on hard times as the media industry has made such great advances. The basic concepts involved are still valid.

NCSC-TG-001 A Guide to Understanding Audit: Topics covered include Audit Requirements Overview, Audited Events, and Selective Audit. (TG stands for Tech Guide.)

NCSC-TG-002 Trusted Product Evaluations A Guide for Vendors: This document acclimates the vendor to the evaluation procedures of the NCSC. Specific topics include phasing of the evaluation process, technical product description, and the legal agreement between NSA and the vendor.

NCSC-TG-003 A Guide to Understanding Discretionary Access Control: This document covers the ticklish topic of DAC. Specific sub-topics include Implementation Methodologies, Control Permission and Access Modes, and Requirements.

NCSC-TG-004 Glossary of Computer Security Terms

NCSC-TG-005 Trusted Network Interpretation: This document takes a look at the Orange Book (which was written in terms of a monolithic computer) and "Interprets" the requirements for the network type environment. Topics include Security Policy, New Evaluation Areas, Network Components, and Cascading.

NCSC-TG-006 A Guide to Understanding Configuration Management: In this day and age of viruses and worms there is an even greater need for CM than before. The actual CM requirements are not much different from those of the typical DoD development. The primary emphasis of this guide is on the commercial vendor, whose CM practices are often non-existent. Topics include Configuration Management (CM) Use, CM Requirements, and CM Tasks.

NCSC-TG-007 A Guide to Understanding Design Documentation: This document covers the design documentation requirements for each of the classes from C1 through A1.

NCSC-TG-008 A Guide to Understanding Trusted Distribution: CM covers the protection of the system when in the hands of the developers and the users. This document covers the protection of the system between them. Topics include Trusted Distribution (TD) Assurances, Post-Production Protection, and Transit Protection.

NCSC-TG-009 Computer Security Subsystem Interpretation: This interpretation is for the add-on systems which lend security to a basically insecure system. Topics include Required Features, Assurance Requirements, and Documentation Requirements.

NCSC-TG-011 Trusted Network Interpretation Environments Guideline: It is the first of a series of off-shoots of NCSC-TG-005. Topics include Network Security Architecture and Design (NSAD), Risk Assessment, and Network Security Services.

NCSC-TG-013 Rating Maintenance Phase Program (RAMP) Document: The current setup for evaluation lists a particular hardware configuration with a particular operating system as a particular class. RAMP seeks to extend the evaluation listing to future versions of the system. Topics include Preevaluation Phase, Vendor Assistance Phase/Design Analysis Phase, Evaluation Phase, and Rating Maintenance Phase.

NCSC-TG-014 Guidelines for Formal Verification Systems: Topics include Endorsement and ETL Listing, Technical Factors, Features, Assurance, and Required Documentation. Of little interest for most systems.

NCSC-TG-015 A Guide to Understanding Trusted Facility Management: This is a good background document for gaining an understanding of the basic concept for setting up a system. Roles discussed include Security Administrator, Secure Operator, Account Administrator, and Auditor.

NCSC-TG-017 A Guide to Understanding Identification and Authentication in Trusted Systems: This document covers the requirements for I&A by class and discusses methods and implementations to meet the requirements.

NCSC-TG-019 Trusted Product Evaluation Questionnaire: This document serves as a good guide to help the developer ensure that all of the "bases are covered". The questionnaire covers the entire gamut of CompuSec issues for a system.

NCSC-TG-020 Access Control List (ACL) Features for UNIX: The requirement to be able to list users with access or not for an object tends to point toward ACLs for the DAC implementation. This Guide was written specifically for UNIX system developers implementing ACLs, but the thought processes and rationales involved are of wider validity.

NCSC-TG-021 Trusted Database Management System Interpretation of the Trusted Computer System Evaluation Criteria: This document covers a broad range of topics which have some relation to trusted databases. The total effect is close to being a rewrite of the Orange Book while not officially attempting to do so.

NCSC-TG-025 A Guide to Understanding Data Remanence in Automated Information Systems: This document explores each method of clearing and/or purging data from media and lists DoD approved methods for each major media type.

NCSC-TG-026 A Guide to Writing the Security Features User's Guide for Trusted Systems: This document covers presentation, packaging, and content for the SFUG and gives two outline examples of acceptable SFUG styles.

C Technical Report 79-91 Integrity in Automated Information Systems: This document addresses the "other half" of COMPUSEC. Along with removing the chance for compromise of sensitive data we must also protect the data from damage. This is not a full member of the Rainbow Series but instead is a Tech Report from the NCSC. Integrity itself plus integrity models, implementations, and principles are discussed.

NTISSAM COMPUSEC/1-87 Advisory Memorandum on Office Automation Security: This document covers User Responsibilities and Security Officer Responsibilities in the office PC-type environment.

MIL-STD-1785 System Security Engineering Program Management: This MIL-STD covers the entirety of security engineering and is helpful for the phasing aspects and the associated DIDs. The Phases covered include Concept Exploration, Demonstration and Validation, Full-Scale Development, and Production and Deployment.

DRS-2600-5502-86 System High and Compartmented Mode Workstations: This is a DIA document from their secure workstation contract. It takes a few turns not taken by NCSC. It contains separate sections for the System High Requirements and the Compartmented Mode Requirements.


Document Number: DOD 5200.28-STD (Formerly: CSC-STD-001-83)

Title: DoD Trusted Computer System Evaluation Criteria

Color: Orange Book

Date: December 1985

Highlights: "The provisions of this document apply to DoD components. This document is mandatory for 'all DoD components in activities applicable to the processing and storage of classified information'."

1. Fundamental Computer Security Requirements

a. Policy
1) Security Policy: There must be an explicit and well-defined security policy enforced by the system.
a) Mandatory Security Control Objective: The security policy must include provisions for enforcing mandatory access control rules based on the classification of the information and the user's clearance.

b) Discretionary Security Control Objective: The security policy must include provisions for the enforcement of discretionary access control rules based on the user's need-to-know for the information.

2) Marking: Access control labels must be associated with objects.

c) Marking Control Objective: The system must store and preserve the integrity of labels for all information. Labels exported from the system must be an accurate representation of the internal labels being exported.

b. Accountability

d) Accountability Control Objective: The system must ensure individual accountability with data usable by a competent agent in a reasonable time without undue difficulty.
3) Identification: Individual subjects must be identified.

4) Accountability: Audit information must be selectively kept and protected so that actions affecting security can be traced to the responsible party.

c. Assurance

e) Assurance Control Objective: The system must accurately implement the policy, without distorting the policy's intent, throughout the system's life.
5) Assurance: The computer system must contain hardware/software mechanisms that can be independently evaluated to provide sufficient assurance that the system enforces requirements 1 through 4 above.

6) Continuous Protection: The trusted mechanisms that enforce these basic requirements must be continuously protected against tampering and/or unauthorized changes.

2. Divisions and Classes of Systems

a. Division D: Minimal Protection - Reserved for those systems that have been evaluated but fail to meet the requirements of a higher evaluation class.

b. Division C: Discretionary Protection - Provide for discretionary (need-to-know) protection and for accountability of subjects and the actions they initiate.

1) Class C1: Discretionary Security Protection - Incorporate discretionary access controls capable of enforcing access limitations on an individual basis.

2) Class C2: Controlled Access Protection - More finely grained discretionary access control with individual user accountability through login, audit, and isolation.

c. Division B: Mandatory Protection - Enforce mandatory access control rules, carry sensitivity labels with major data structures, implement the reference monitor concept, and have a Trusted Computing Base (TCB) which implements the security policy model.

1) Class B1: Labeled Security Protection - In addition to C2 features, provide an informal security policy model, data labeling, mandatory access control, and accurate export labeling.

2) Class B2: Structured Protection - Provide formal security policy model that requires access controls for all subjects and all objects, address covert channels, structured into critical and non-critical elements, provide for trusted facility management and configuration management.

3) Class B3: Security Domains - Mediate all accesses of subjects to objects, have tamperproof TCB which is small enough for analysis and test, provide security administrator support, signal security relevant events, securely recover the system.

d. Division A: Verified Protection - Use formal security verification methods to assure that the mandatory and discretionary access controls can effectively protect the information stored and/or processed. Extensive documentation is required to demonstrate the meeting of security requirements in design, development, and implementation.

1) Class A1: Verified Design - {No additional features (over B3).} The system has formal design specification, formal verification techniques, formal model of the security policy, Formal Top Level Specification, more stringent configuration management, and secure distribution of the system to sites.

3. Testing Guidelines

a. Division C Personnel:
1) Two members with BSCS or equivalent.

2) Able to follow developer test plans and suggest additions.

3) Have knowledge of the system and have completed the developer's internals course.

b. Division C Testing:

1) Independently run the developer's test suite.

2) Independently design at least five system-specific tests.

3) At least one month and not exceeding three months.

4) No fewer than twenty hands-on hours for testing.

c. Division B Personnel:

1) Two members with BSCS or equivalent and at least one MSCS.

2) Able to follow developer test plans and suggest additions.

3) Be fluent in the TCB implementation language(s).

4) Have knowledge of the system and have completed the developer's internals course.

5) One member has previously completed a security test on a system.

d. Division B Testing:

1) Independently run the developer's test suite.

2) Independently design at least fifteen system-specific tests.

3) At least two months and not exceeding four months.

4) No fewer than thirty hands-on hours for testing per team member.

e. Division A Personnel:

1) One member with BSCS or equivalent and at least two MSCSs.

2) Able to follow developer test plans and suggest additions.

3) Be fluent in the TCB implementation language(s).

4) Have knowledge of the system and have completed the developer's internals course.

5) One member understands the hardware diagnostics and documentation.

6) Two members have previously completed a security test on a system.

7) One member has system programming competence for the system.

f. Division A Testing:

1) Independently run the developer's test suite.

2) Independently design at least twenty-five system-specific tests.

3) At least three months and not exceeding six months.

4) No fewer than fifty hands-on hours for testing per team member.

4. Covert Channels:

a. Any communications channel that can transfer information in violation of the system security policy.

b. Covert storage channels include all methods that allow direct or indirect writing of a location by one process and direct or indirect reading of the location by another process.

c. Covert timing channels allow one process to signal another process by modulating its use of system resources.

d. Covert Channel Bandwidth:

1) 100 bits per sec (bps) is unacceptably high.

2) ~1 bps is acceptable.

3) ~0.1 bps should be audited.

5. MAC Features: To support transportability and commonality it is recommended that:

a. Sixteen (16) or more hierarchical classifications be supported.

b. Sixty-four (64) or more non-hierarchical categories be supported.

6. Commercial Product Evaluation:

a. This evaluation process is keyed to Commercial Off The Shelf (COTS) systems. It serves as an input to computer system security approval/accreditation. A complete approval/accreditation study considers:
1) Mode of Operation.

2) Specific users.

3) Applications.

4) Data sensitivity.

5) Physical security.

6) Personnel security.

7) Administrative security.

8) Procedural security.

9) TEMPEST.

10) Communications security.

b. The NCSC evaluation has three distinct elements:

1) Preliminary Product Evaluation - Informal dialogue to gain common understanding of vendor's target evaluation class and NCSC expectations for the class.

2) Formal Product Evaluation - A formal evaluation by NCSC, culminating in the product and its rating being placed on the Evaluated Product List.

3) Evaluated Product List (EPL) - A list of products that have been formally evaluated and their assigned ratings.

7. Requirements

a. Audit
C2: TCB must protect, perform, and maintain an audit trail with the system administrator able to selectively audit by UserID.

B1: Audit must include security level, actions upon the security level, and bannering overrides, and must allow for selective audit by security level.

B2: Audit must include covert channels.

B3: Audit must include threshold values for imminent security violations and alert the security administrator when the thresholds are passed.

b. Configuration Management

B2: Use configuration management on all design and implementation documentation, code, and test fixtures during development and maintenance.

A1: Use configuration management on all documentation, code, security-relevant hardware, and test fixtures during the entire lifecycle. Provide for trusted distribution.

c. Covert Channel Analysis

B2: Search for covert storage channels and determine maximum bandwidth of each.

B3: Search for covert channels and determine maximum bandwidth of each.

A1: Formal methods must be used for covert channel analysis.

d. Design Documentation

C1: Documentation must discuss the philosophy of protection and its translation into the TCB along with the interfaces between TCB modules.

B1: Documentation must include an informal/formal security policy model and identify the specific mechanisms which implement the model.

B2: Documentation must include a formal security policy model proven to satisfy the policy, a Descriptive Top-Level Specification (DTLS) which describes the TCB interfaces, discussions of the TCB structure, and data on the covert channel analysis including indicative auditable events.

B3: Documentation must show that the TCB implementation is consistent with the DTLS.

A1: Documentation must show that the TCB implementation is consistent with the Formal Top-Level Specification (FTLS). Mechanisms not dealt with by the FTLS must be fully described.

e. Design Specification And Verification

B1: Internally consistent formal/informal security policy model.

B2: Proven internally consistent formal security policy model and a DTLS.

B3: Convincing argument that the DTLS and model agree.

A1: FTLS (shown to be an accurate description of all interfaces) and a combination of formal and informal techniques to show that the FTLS and the model agree.

f. Device Labels

B2: All attached physical devices must be assigned minimum and maximum security levels and these labels must be used to enforce constraints imposed upon the devices.

g. Discretionary Access Control

C1: Access control must be in place between named users and named objects with sharing by defined groups and/or individual users.

C2: Access control must be in place between named users and named objects with sharing by defined groups of individuals and/or individual users. Propagation of access rights must be controlled.

B3: For each named object, there must be a method for specifying a list of named individuals and their modes of access including NULL access.

h. Exportation of Labeled Information

B1: Each communications channel and I/O device must be designated either single level or multilevel. Any change to the designation or the security level must be audited.

i. Exportation to Multilevel Devices

B1: Objects exported over a multilevel port must have the sensitivity label exported in the same form as the object. All objects imported or exported over a multilevel port must allow for unambiguous pairing between the object and its sensitivity label.

j. Exportation to Single Level Devices

B1: Single level ports must have a mechanism to set the single security level at which they operate.

k. Identification and Authentication

C1: Login with password is required along with protection of the authentication data.

C2: Login with password and unique UserID is required.

B1: Authentication data must include clearance and authorization data for the user.

l. Label Integrity

B1: Sensitivity labels must accurately represent the security levels of the associated subjects and objects in the system and when exported.

m. Labeling Human Readable Output

B1: Administrator must be able to specify the printable label names, which must be printed at the top and bottom of all output. Labeling changes and overrides must be audited.

n. Labels

B1: TCB must maintain labels for each subject and object and use them for MAC decisions. Unlabeled imported data requires human intervention for labeling.

B2: TCB must maintain labels for each system resource accessible by subjects external to the TCB.

o. Mandatory Access Control

B1: TCB must enforce MAC over all subjects and objects under its control.

B2: TCB must enforce MAC over each system resource accessible by subjects external to the TCB.

p. Object Reuse

C2: No information from the previous subject can be available in a storage object allocated to the next subject.

q. Security Features User's Guide

C1: A single source must be made available to describe the protection mechanisms of the TCB, their use, and interactions.

r. Security Testing

C1: System must work as claimed and have no obvious way to bypass security.

C2: Search for obvious flaws which would allow violation of resource isolation or access to audit and authentication data.

B1: Testing must include design documentation, source code, and object code. All flaws must be removed or neutralized and the system retested.

B2: All flaws must be corrected and the system retested. The system must be relatively resistant to penetration and must be consistent with the DTLS.

B3: There must be no design flaws and few implementation flaws. The system must be resistant to penetration.

A1: TCB implementation must be consistent with the FTLS.

s. Subject Sensitivity Labels

B2: TCB must notify a user of changes to his security level during a session or on request.

t. System Architecture

C1: TCB must protect itself from interference and tampering.

C2: Resource isolation must be used for access control and auditing.

B1: Process isolation must be provided through separate address spaces.

B2: TCB must be modular and use the least privilege principle, with clearly defined user interfaces and all portions of the TCB identified.

B3: TCB must be built around a simple protection mechanism and use sound engineering principles in its implementation. Keep It Simple Stupid (KISS).

u. System Integrity

C1: Features must be provided to revalidate correct operation of the TCB.

v. Test Documentation

C1: Developer must provide test plan, procedures, and report.

B2: Documentation must include effectiveness testing of the covert channel bandwidth reduction measures.

A1: Documentation must include the results of the FTLS to Source Code mapping.

w. Trusted Distribution

A1: There must be a trusted control and distribution facility with procedures for trusted distribution.

x. Trusted Facility Management

B2: TCB must support separate operator and administrator functions.

B3: The security administrator must take auditable actions to "Put On" his security administrator hat.

y. Trusted Facility Manual

C1: TFM shall cover cautions regarding privileges and functions that should be controlled.

C2: TFM shall cover procedures for using the audit trail, including record structure for event types.

B1: TFM shall cover operator and administrator functions and the interactions within the TCB.

B2: TFM shall cover reference validation mechanisms with their locations and the procedures for generating a new TCB from source code.

B3: TFM shall cover starting/resuming operations in a secure state.

z. Trusted Path

B2: Must be a trusted communication path between the TCB and the user for initial login and authentication initiated by the user.

B3: Must be a trusted communication path between the TCB and the user for positive TCB-to-user connection activated by the user and distinct from other paths.

aa. Trusted Recovery

B3: Provisions for system recovery without compromise must be provided.


Document Number: CSC-STD-002-85

Title: DoD Password Management Guideline

Color: Green

Date: 12 APR 85

Highlights

1. Passwords must be used in conjunction with the UserID for Authenticated Personal Identification for use in access control and auditing.

2. GroupIDs are not allowed for authentication purposes (login).

3. Passwords should be computer generated.

4. Initial passwords should be dispersed without exposure to the ISSO or as "expired passwords." (Select from computer real-time-generated list, sealed carbon paper envelopes, etc.)

5. User password change should include entering old password to authenticate the User.

6. SSO must be able to change anyone's password.

7. Passwords should be classified at the level of exposure allowed by release of the password.

8. Access across networks should be authenticated by 1) Trusted Identification Forwarding by the "Home" host or 2) UserID and Password to the remote host.

9. Passwords should be encrypted or DACed or both.

10. Encryption should include the UserID with the password.

11. Password guess rate should be limited to approximately one per minute after a failure on an access port and/or UserID basis.

12. Failed passwords should trigger an alert to the SSO at an adjustable threshold (about 5).

13. On successful login the User should be told date and time of last login, location of last login, and number of unsuccessful attempts since last login.

14. Password length should be at least:

   Log( Password Life x Available Try Rate / Probability of Guess During Lifetime )
   ---------------------------------------------------------------------------------
                               Log( "Alphabet" Size )

a. Probability of Guess should be .000001 or lower. More restrictive values should be selected for more restrictive environments.

b. Typical Alphabet sizes

1) English letters = 26

2) English letters and numbers = 36

3) English 4, 5, and 6 letter words = 23300
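Worked in code, item 14 is a one-liner. The sketch below is illustrative Python; the function and parameter names are mine, not the guideline's, and the six-month password life in the example is an assumption (the try rate comes from item 11, the probability from item 14a, the alphabet from item 14b).

  import math

  def min_password_length(lifetime_tries: float,
                          guess_probability: float,
                          alphabet_size: int) -> int:
      """Minimum password length per item 14.

      lifetime_tries    = Password Life x Available Try Rate
      guess_probability = acceptable Probability of Guess During Lifetime
      alphabet_size     = size of the "alphabet" passwords are drawn from
      """
      # Number of passwords the alphabet must be able to express:
      search_space = lifetime_tries / guess_probability
      return math.ceil(math.log(search_space) / math.log(alphabet_size))

  # Assumed example: 6-month life at one try per minute (item 11),
  # P = .000001 (item 14a), letters and digits (item 14b).
  tries = 6 * 30 * 24 * 60                     # ~259,200 tries over the life
  print(min_password_length(tries, 1e-6, 36))  # -> 8 characters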


Document Number: CSC-STD-003-85

Title: Computer Security Requirements

Color: Yellow I

Date: 25 JUN 85

Highlights

1. Minimum User Clearance Rating (RMin)

  Minimum User Clearance Rating                               (RMin)
  ___________________________________________________________________

  Uncleared (U)                                                 0
  Authorized Sensitive but Unclassified Information (N)         1
  Confidential (C)                                              2
  Secret (S)                                                    3
  Top Secret/Current Background Investigation (TS BI)           4
  Top Secret/Current Special Background Investigation (TS SBI)  5
  One Category (1C)                                             6
  Multiple Categories (MC)                                      7
  ___________________________________________________________________


2. Maximum Data Sensitivity Rating (RMax)

  Maximum Data Sensitivity Rating                             (RMax)
  ___________________________________________________________________

  U                                                             0
  N                                                             1
  N with one or more categories                                 2
  C                                                             2
  C with one or more categories                                 3
  S                                                             3
  S with one or more categories                                 4
  S with two or more categories containing Secret data          5
  TS                                                            5
  TS with one or more categories                                6
  TS with two or more categories containing S or TS data        7
  ___________________________________________________________________

  ** A category is only to be counted if a user does not have access
     to the category **


3. The Risk Index is computed by one of the following algorithms:

a. If RMin < RMax then Risk Index = RMax - RMin

** If RMin = (TS BI) and RMax = TS without categories then use algorithm b. **

b. If RMin >= RMax then

If there are categories on the system to which any user is not authorized access then

Risk Index = 1

Else

Risk Index = 0
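For illustration, the two algorithms condense to a few lines of Python (a sketch; the names and boolean parameters are mine, not the standard's):

  # Rating scale from paragraph 1, abbreviated.
  R_MIN = {"U": 0, "N": 1, "C": 2, "S": 3,
           "TS_BI": 4, "TS_SBI": 5, "1C": 6, "MC": 7}

  def risk_index(r_min: int, r_max: int,
                 unauthorized_categories: bool,
                 ts_bi_vs_plain_ts: bool = False) -> int:
      """Risk Index per paragraph 3.

      unauthorized_categories: True if the system carries categories to
      which at least one user is not authorized access.
      ts_bi_vs_plain_ts: True for the special case RMin = TS(BI) and
      RMax = TS without categories, which uses algorithm b even though
      RMin < RMax.
      """
      if r_min < r_max and not ts_bi_vs_plain_ts:
          return r_max - r_min                        # algorithm a
      return 1 if unauthorized_categories else 0      # algorithm b

  # Example: Secret-cleared users (3) on a system holding TS data (5):
  print(risk_index(R_MIN["S"], 5, unauthorized_categories=True))  # -> 2

Paragraph 4 then maps the index, mode, and environment to a minimum class (here, B2 in either environment).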

4. Computer Security Requirements

Risk Index  Security Operating Mode                                 Open Environments  Closed Environments
___________________________________________________________________________________________________________

0           Dedicated                                               C1 or less         C1 or less
0           System High                                             C2                 C2
1           Limited Access, Controlled, Compartmented, Multilevel   B1                 B1
2           Limited Access, Controlled, Compartmented, Multilevel   B2                 B2
3           Controlled, Multilevel                                  B3                 B2
4           Multilevel                                              A1                 B3
5           Multilevel                                              *                  A1
6           Multilevel                                              *                  *
7           Multilevel                                              *                  *
___________________________________________________________________________________________________________



Document Number: CSC-STD-004-85

Title: Rationale Behind Computer Security Requirements

Color: Yellow II

Date: 25 JUN 85

Highlights

1. Minimum User Clearance Rating (RMin)

  Minimum User Clearance Rating                                (RMin)
  ___________________________________________________________________

  Uncleared (U)                                                  0
  Authorized Sensitive but Unclassified Information (N)          1
  Confidential (C)                                               2
  Secret (S)                                                     3
  Top Secret/Current Background Investigation (TS BI)            4
  Top Secret/Current Special Background Investigation (TS SBI)   5
  One Category (1C)                                              6
  Multiple Categories (MC)                                       7
  ___________________________________________________________________


2. Maximum Data Sensitivity Rating (RMax)

  Maximum Data Sensitivity Rating                              (RMax)
  ___________________________________________________________________

  U                                                              0
  N                                                              1
  N with one or more categories                                  2
  C                                                              2
  C with one or more categories                                  3
  S                                                              3
  S with one or more categories                                  4
  S with two or more categories containing Secret data           5
  TS                                                             5
  TS with one or more categories                                 6
  TS with two or more categories containing S or TS data         7
  ___________________________________________________________________

  ** A category is only to be counted if a user does not have access to the
     category **

3. The Risk Index is computed by one of the following algorithms:

a. If RMin < RMax then Risk Index = RMax - RMin

** If RMin = (TS BI) and RMax = TS without categories then use algorithm b. **

b. If RMin >= RMax then

If there are categories on the system to which any user is not authorized access then

Risk Index = 1

Else Risk Index = 0

4. Computer Security Requirements

Risk Index  Security Operating Mode  Open Environments  Closed Environments
___________________________________________________________________________

  0         Dedicated                   C1 or less        C1 or less
  0         System High                 C2                C2
  1         Compartmented, Multilevel   B1                B1
  2         Compartmented, Multilevel   B2                B2
  3         Multilevel                  B3                B2
  4         Multilevel                  A1                B3
  5         Multilevel                  *                 A1
  6         Multilevel                  *                 *
  7         Multilevel                  *                 *
___________________________________________________________________________


5. Security Risk Index Matrix

                               Maximum Data Sensitivity
  ___________________________________________________________

                             U    N    C    S    TS   1C   MC
  ___________________________________________________________

  Minimum            U       0    1    2    3    5    6    7
  Clearance          N       0    0    1    2    4    5    6
  Or                 C       0    0    0    1    3    4    5
  Authorization      S       0    0    0    0    2    3    4
  of                 TS(BI)  0    0    0    0    0    2    3
  System             TS(SBI) 0    0    0    0    0    1    2
  Users              1C      0    0    0    0    0    0    1
                     MC      0    0    0    0    0    0    0 
  ___________________________________________________________


6. Security Index Matrix For Open Security Environments

                               Maximum Data Sensitivity
  ___________________________________________________________

                             U    N    C    S    TS   1C   MC
  ___________________________________________________________

  Minimum            U       C1   B1   B2   B3   *    *    *
  Clearance          N       C1   C2   B2   B2   A1   *    *
  Or                 C       C1   C2   C2   B1   B3   A1   *
  Authorization      S       C1   C2   C2   C2   B2   B3   A1
  of                 TS(BI)  C1   C2   C2   C2   C2   B2   B3
  System             TS(SBI) C1   C2   C2   C2   C2   B1   B2
  Users              1C      C2   C2   C2   C2   C2   C2   B1
                     MC      C2   C2   C2   C2   C2   C2   C2
  ___________________________________________________________


7. Security Index Matrix For Closed Security Environments

                               Maximum Data Sensitivity
  ___________________________________________________________

                             U    N    C    S    TS   1C   MC
  ___________________________________________________________

  Minimum            U       C1   B1   B2   B2   A1   *    *
  Clearance          N       C1   C2   B1   B2   B3   A1   *
  Or                 C       C1   C2   C2   B1   B2   B3   A1
  Authorization      S       C1   C2   C2   C2   B2   B2   B3
  of                 TS(BI)  C1   C2   C2   C2   C2   B2   B2
  System             TS(SBI) C1   C2   C2   C2   C2   B1   B2
  Users              1C      C2   C2   C2   C2   C2   C2   B1
                     MC      C2   C2   C2   C2   C2   C2   C2
  ___________________________________________________________



Document Number: CSC-STD-005-85

Title: DoD Magnetic Remanence Security Guideline

Color: Dark Blue

Date: 15 NOV 85

Highlights: No longer in distribution, this document has been superseded by NCSC-TG-025, A Guide to Understanding Data Remanence.

1. Declassification of magnetic media is the procedure which renders the media unclassified and suitable for release. Clearing is the procedure which removes the data from the media before the media is stored or re-released within a secure environment.

2. Declassification requires permission from the ISSO. This decision will be based on:

a. The physical media.

b. The criticality of the data previously stored on the media.

c. The procedures performed.

d. A review of the audit of the declassification.

e. An analysis of the risk of incomplete declassification.

3. For clearing of properly functioning media, one overwrite of each addressable location with a bit pattern, followed by the pattern's complement, is sufficient.

4. For declassification of properly functioning media, three overwrite cycles are required. The risk posed by partially malfunctioning equipment must be factored into the declassification decision.
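The overwrite cycle of items 3 and 4 is easy to picture in code. The sketch below is illustrative Python only: it overwrites an ordinary file, whereas real clearing and declassification must reach every addressable location at the device level, spare and defective sectors included, which file-level I/O cannot do.

  import os

  def overwrite_cycles(path: str, pattern: int = 0b01010101, cycles: int = 1) -> None:
      """Write a bit pattern, then its complement, over the file 'cycles'
      times: cycles=1 illustrates clearing (item 3), cycles=3 illustrates
      declassification (item 4). Illustrative only; see the caveat above."""
      size = os.path.getsize(path)
      with open(path, "r+b") as f:
          for _ in range(cycles):
              for fill in (pattern, pattern ^ 0xFF):  # pattern, then complement
                  f.seek(0)
                  f.write(bytes([fill]) * size)
                  f.flush()
                  os.fsync(f.fileno())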

5. Non-Destructive Declassification and Clearing of non-functional media (including malfunctioning) must be accomplished by degaussing with an NCSC-Approved Degausser.

6. Malfunctioning Winchester disk drives cannot be declassified for return to the vendor for repair in a non-destructive manner. An (expensive) alternative is to courier the drive to the vendor and have the platters removed, in the courier's view, and returned.

7. Destruction of soft media can be accomplished through incineration or disintegration.

8. Destruction of drums and platters can be accomplished through removing the magnetic surface with an emery wheel or a disk sander.

9. Before release for destruction, the media should be cleared, when possible.

10. NCSC evaluates and approves degaussers. The latest list of approved degaussers should be reviewed before acquisition of new degaussers.


Document Number: NCSC-TG-001 Version-2

Title: A Guide to Understanding Audit

Color: Tan

Date: 1 JUN 88

Highlights

1. Audit must:

a. Provide data for individual accountability.

b. Allow review of user access patterns.

c. Allow discovery of attempts to bypass security.

d. Act as a deterrent against habitual attempts to bypass security. (Advertise the Audit.)

e. Supply assurance that attempts (successful or not) to bypass security will be discovered.

f. Be protected as a part of the TCB and at the sensitivity of the data on the system.

2. C2 Audited Events:

a. Login.

b. Begin Access to Object.

c. End Access to Object.

d. Actions taken by the system operators, administrators, and/or security administrators.

e. Printed Output.

f. System specific security relevant events.

3. Additional B1 Audited Events:

g. Override of hardcopy output markings.

h. Changes of single/multilevel port designators.

i. Changes of single level port sensitivity levels.

j. Changes of multilevel port sensitivity level ranges.

4. Additional B2 Audited Events:

k. Covert storage channel events.

5. Additional B3 Audited Events:

l. Events that may indicate imminent violation of the security policy.

6. C2 Audited Information:

a. Date and Time.

b. UserID.

c. Event Type.

d. Success or Failure.

e. TerminalID.

f. ObjectID.

g. Description of modifications to security databases. (Security Administrator.)

7. B1 Additional Audited Information:

h. Object SL.

i. Subject SL.
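Gathered into one record layout, the C2 and B1 information of items 6 and 7 looks roughly like the sketch below (illustrative Python; the field names are mine, not the guide's):

  from dataclasses import dataclass
  from datetime import datetime
  from typing import Optional

  @dataclass
  class AuditRecord:
      # C2 audited information (item 6)
      timestamp: datetime         # a. Date and Time
      user_id: str                # b. UserID
      event_type: str             # c. Event Type
      success: bool               # d. Success or Failure
      terminal_id: str            # e. TerminalID
      object_id: str              # f. ObjectID
      description: str = ""       # g. modifications to security databases
      # Additional B1 audited information (item 7); absent on a C2 system
      object_sl: Optional[str] = None   # h. Object SL
      subject_sl: Optional[str] = None  # i. Subject SL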

8. B3 auditing includes notification of event thresholds which show "funny business".

9. Selective collection of audit events saves time, space, and processing, but possibly doesn't collect all the data that will be needed.

10. Selective reduction collects all of the data, but performance suffers, and when storage is filled nothing is collected until corrective action is taken.

11. Compression of the audit data to save space and processing is allowed (in conjunction with "decompression").

12. Separate audit trails for separate formats (or audit event types) are allowed with consistent Date-Time entries.

13. Archival storage capability for audit trail data is required.

14. Provisions must be made for the overflow conditions in physical storage.

15. Dedicated processor storage or write-once devices are good protection for the data.


Document Number: NCSC-TG-002 Version-1

Title: Trusted Product Evaluations A Guide for Vendors

Color: Blue

Date: 22 JUN 90

Highlights: This document gives the trusted product vendor the background and requirements for evaluation by NCSC for inclusion on the EPL. The concepts of "Shared Responsibility" and "Partnership with the Vendor" are stressed.

1. Phases of the Trusted Product Evaluation Program

a. Proposal Review Phase
1) Initial Contact

2) Certificate Pertaining to Foreign Interests

3) Proposal Package Review

4) Preliminary Technical Review

5) Program Decision

6) Legal Agreement

b. Vendor Assistance Phase

1) Design

2) Documentation

3) Test Plan

4) Plan for Rating Maintenance

c. Design Analysis Phase

d. Evaluation Phase

e. Rating Maintenance Phase

2. Technical Description of the Product: The Technical Description is discussed in the document but a better organized and more concise listing of the required information is found in NCSC-TG-019, Trusted Product Evaluation Questionnaire.

3. Legal Agreement: After acceptance for entry into the Vendor Assistance Phase, a Legal Agreement is entered into which covers the following points:

a. NSA provides needed computer security information.

b. NSA protects proprietary information.

c. Vendor supplies information needed for the assessment.

d. Vendor follows the procedures.

e. NSA performs an evaluation with a rating and a final evaluation report.

f. Vendor provides ads and brochures before release.

g. Vendor prepares a report after each RAMP approval.


Document Number: NCSC-TG-003 Version-1

Title: A Guide to Understanding Discretionary Access Control

Color: Tangerine I

Date: 30 SEP 87

Highlights

1. Discretionary Access Control (DAC) is a means of restricting access to objects based on the identity of subjects and/or their group memberships.

2. DAC is intended to restrict (and allow) access based on need-to-know.

3. DAC is generally implemented using one (or more) of five methodologies:

a. Capabilities: Each user has (or has not) a "capability" to access a specific object in a specific mode (Read, Write, etc.). Capabilities can be passed to another subject but cannot be altered without TCB intervention. With capabilities, it is not possible to determine which users have access to specific objects. Group access and revocation of access are major drawbacks. [User Possessed Token Scheme]

b. Profiles: Each user has a list of allowed objects associated with it. This can become a very large and cumbersome table. It is difficult to determine which users have access to specific objects. Group access and revocation of access are major drawbacks. [List of User's Objects]

c. Access Control Lists: Each object to be protected has a list of allowed (and disallowed) users associated with it. This can also be implemented with groups. If implemented properly, ACLs can meet all classes' requirements (see the sketch following this list). [List of Object's Users]

d. Protection Bits: Access is limited based on the access modes allowed for the Owner, Group, and World. Protection Bits do not readily support Multiple Groups and individual access control.

e. Passwords: The use of a password to gain access to an object can work in conjunction with other protections, but the proliferation of the passwords leads to unwieldy systems. Passwords are not sufficient protection, but as supplemental protection against Trojan Horses, passwords have merit.
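As a point of reference for item c above, here is a minimal ACL sketch (illustrative Python; the object, user, and group names are invented). Each protected object carries its own list of users and groups with their allowed modes, including an explicit NULL entry:

  # Each object maps users/groups to allowed access modes ("List of
  # Object's Users"); an empty set is an explicit NULL-access entry.
  acl = {
      "payroll.dat": {
          "alice":         {"read", "write"},
          "payroll-group": {"read"},
          "bob":           set(),   # NULL access: explicitly excluded
      },
  }

  def access_allowed(user: str, groups: set, obj: str, mode: str) -> bool:
      entries = acl.get(obj, {})         # no entry means no access
      if user in entries:                # an individual entry overrides
          return mode in entries[user]   # group entries, so bob stays out
      return any(mode in entries.get(g, set()) for g in groups)

  print(access_allowed("alice", set(), "payroll.dat", "write"))           # True
  print(access_allowed("bob", {"payroll-group"}, "payroll.dat", "read"))  # False

The individual-entry-wins rule is one common way of honoring NULL access; the guide does not prescribe a particular precedence order.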

4. The heart of DAC rests with two distinct and preferably separate entities, control permission and access modes:

a. Control permissions define which subjects have the ability to grant and revoke access permissions and change access modes.
1) Control permissions can limit the control to grant/revoke permission on access modes for subjects (Control) or

2) Control permissions can include the passing of the control permission to another subject (Control With Passing).

b. Implementations of control models for DAC are based on the control permissions.

c. The four basic control models for DAC are:

1) Hierarchical: For a simple example, the system administrator would have control with passing to all objects within the system. The system administrator would pass control of subsets to the applicable "department heads" who would then pass control of subsets to "project heads", etc. Controllers may not be the titular heads. Controllers also control their own access modes. (Can satisfy all classes.)

2) Ownership: Each object has an "Owner" who controls access to the object. Protection bits implement ownership. Administratively, ownership can be implemented by the system administrator not passing control permission to any subject but the Owner. (Can satisfy all classes.)

3) Laissez-faire: Each subject can pass whatever control permissions and accesses the subject has to any other subject, who can then pass them on to whomever they desire. Control permissions and access mode control should be separate. (Can satisfy all classes. If not separate, does not meet C2.)

4) Centralized: All control permissions rest with one user. Control permissions cannot be passed. Control is definite, but could lead to overwork and slowdown with large request quantities. (Can satisfy all classes.)

d. Access Modes control if and how a subject can use an object. Access modes include:

1) Read: No change allowed.

2) Write-Append: Additions allowed, but no changes and no viewing.

3) Write: Allows modify, add, or delete, but no view.

4) Delete: Deletion, nothing else.

5) Execute: Run an executable object.

e. Access controls can be levied on directories and/or files.

5. Requirements for C1:

a. Define and control access between named users (or named groups) and named objects.

b. Must recognize and control individual objects.

6. Requirements for C2 through B2:

a. Include/Exclude access to each object on a per user (or per group) basis.

b. Protection for all objects. (Protection by default for new objects.)

c. Control permissions separate from other access types.

d. Group members uniquely identified.

e. Groups are optional.

7. Requirements for B3 through A1:

a. Different access modes are required. (Read, Write, etc.)

b. For each named object, be able to specify named users with access modes allowed.

c. For each named object, be able to specify groups with their membership and accesses.

d. For each named object, be able to specify named users and groups with no access.


[See also Section V]

Document Number: NCSC-TG-004 Version 1

Title: Glossary of Computer Security Terms

Color: Aqua

Date: 21 OCT 88

AIS Automated Information System

COMPUSEC Computer Security

COMSEC Communications Security

DAA Designated Approving Authority

DAC Discretionary Access Control

EPL Evaluated Product List

ISSO Information System Security Officer

MAC Mandatory Access Control

NCSC National Computer Security Center

OPSEC Operations Security

SSO System Security Officer

TCB Trusted Computing Base

TCSEC DoD Trusted Computer System Evaluation Criteria

Access - A specific type of interaction between a subject and an object that results in the flow of information from one to the other (e.g., Read, Write, Execute, Append, Modify, Delete, or Create).

Access Control Mechanism - Hardware or software features, operating procedures, and/or management procedures designed to detect and prevent unauthorized access and to permit authorized access in an automated system.

Access Level - The hierarchical portion of the security level used to identify the sensitivity of data and the clearance or authorization of users.

Access List - A list of users, programs, and/or processes and the specifications of access categories to which each is assigned.

Accountability - The property that enables activities on a system to be traced to individuals who may then be held responsible for their actions.

Accreditation - A formal declaration by the DAA that the AIS is approved to operate in a particular security mode using a prescribed set of safeguards.

Administrative Security (Procedural Security) - The management constraints and supplemental controls established to provide an acceptable level of protection for data.

Assurance - The measure of confidence that the security features and architecture of an AIS accurately mediate and enforce the security policy.

Attack - The act of trying to bypass controls on a system.

Audit Trail - A chronological record of system activities that is sufficient to enable the reviewing of the sequence of environments and activities surrounding or leading to an operation, a procedure, or an event in a transaction from its inception to final results.

Authenticate - To verify the identity of a user, device, or other entity in a computer system, often as a prerequisite to allowing access to resources in a system.

Authorization - The granting of access rights to a user, program, or process.

Automated Information System - An assembly of computer hardware, software, and/or firmware configured to collect, create, communicate, compute, disseminate, process, store, and/or control data or information.

Capability - A protected identifier that both identifies the object and specifies the access rights to be allowed to the accessor who possesses the capability.

Category - A restrictive label that has been applied to classified or unclassified data as a means of increasing the protection of the data and further restricting access to the data.

Certification - The comprehensive evaluation of the technical and nontechnical security features of an AIS and other safeguards, made in support of the accreditation process, that establishes the extent to which a design and implementation meet a specified set of security requirements.

Closed Security Environment - Environment in which both of these conditions occur:

a. Application developers (including maintainers) have sufficient clearances and authorizations to provide an acceptable presumption that they have not introduced malicious logic.

b. Configuration control procedures provide sufficient assurance that applications and the equipment are protected against the introduction of malicious logic prior to and during operations.

Communications Security - Measures taken to deny unauthorized persons information derived from telecommunications of the U.S. Government concerning national security, and to ensure the authenticity of such communications. COMSEC includes cryptosecurity, transmission security, emission security and physical security of communications security material and information.

Compartment - A class of information that has need-to-know access controls beyond those normally provided for access to Confidential, Secret, or Top Secret information.

Compartmented Mode - The mode of operation where each user has a valid personnel clearance for the most restrictive data and formal access and need-to-know for the data to which he has access.

Compromise - A violation of the security policy of a system such that unauthorized disclosure of sensitive information may have occurred.

Computer Security - Measures and controls that protect an AIS against denial of service and unauthorized (accidental or intentional) disclosure, modification, or destruction of AISs and data.

Configuration Control - The process of controlling modifications to the system's hardware, firmware, software, and documentation that provides sufficient assurance that the system is protected against the introduction of improper modifications prior to, during, or after system implementation.

Configuration Management - The management of security features and assurances through the control of changes made to a system's hardware, software, firmware, documentation, test, test fixtures, and test documentation throughout the development and operational life of the system.

Confinement Property (*-Property) - Rule allowing a subject write access to an object only if the security level of the object dominates the security level of the subject. (Bell-LaPadula)

Cost-Risk Analysis - The assessment of the costs of providing data protection for a system versus the cost of losing or compromising the data.

Covert Channel - A communications channel that allows two cooperating processes to transfer information in a manner that violates the system's security policy.

Cryptosecurity - The security or protection resulting from the proper use of technically sound cryptosystems.

Data Security - The protection of data from unauthorized (accidental or intentional) modification, destruction, or disclosure.

Dedicated Mode - The mode of operation where each user has a valid personnel clearance, formal access, and need-to-know for all of the information in the system.

Denial of Service - Any action that prevents any part of a system from functioning in accordance with its intended purpose.

Designated Approving Authority - The official who has the authority to decide on accepting the security safeguards prescribed for an AIS or that official who may be responsible for issuing an accreditation statement that records the decision to accept those safeguards.

Discretionary Access Control - A means of restricting access to objects based on the identity and need-to-know of the user, process, and/or groups to which they belong.

Domain - The unique context (e.g., access control parameters) in which a program is operating; in effect, the set of objects that a subject has the ability to access.

Dominate - Security level S1 is said to dominate security level S2 if the hierarchical classification of S1 is greater than or equal to that of S2 and the nonhierarchical categories of S1 include all those of S2 as a subset.
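The dominance relation is mechanical enough to state in code; a short sketch may help (illustrative Python; representing a security level as a (classification, category-set) pair is my choice, not the glossary's):

  # Hierarchical classifications; a higher number dominates a lower one.
  U, C, S, TS = 0, 1, 2, 3

  def dominates(s1, s2) -> bool:
      """S1 dominates S2: the classification of S1 is >= that of S2 and
      the categories of S2 are a subset of those of S1."""
      class1, cats1 = s1
      class2, cats2 = s2
      return class1 >= class2 and cats2 <= cats1  # <= is subset on sets

  print(dominates((S, {"NATO"}), (C, {"NATO"})))            # True
  print(dominates((S, {"NATO"}), (C, {"NATO", "CRYPTO"})))  # False

The Simple Security Property and Confinement Property entries elsewhere in this glossary are both stated in terms of this relation.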

Embedded System - A system that performs or controls a function, either in whole or in part, as an integral element of a larger system or subsystem.

Emission Security (EMSEC) - The protection resulting from all measures taken to deny unauthorized persons information of value that might be derived from intercept and from an analysis of compromising emanations from systems.

Environment - The aggregate of external procedures, conditions, and objects that affect the development, operation and maintenance of a system.

Formal Access Approval - Documented approval by a data owner to allow access to a particular category of information.

Granularity - An expression of the relative size of a data object; e.g., protection at the file level is considered coarse granularity, whereas protection at the field level is considered to be of a finer granularity.

Identification - The process that enables recognition of an entity by a system, generally by the use of unique machine readable user names.

Information System Security Officer - The person responsible to the DAA for ensuring that security is provided for and implemented throughout the life cycle of an AIS from the beginning of the concept development plan through its design, development, operation, maintenance, and secure disposal.

Isolation - The containment of subjects and objects in a system in such a way that they are separated from one another, as well as from the protection controls of the operating system.

Least Privilege - The principle that requires that each subject be granted the most restrictive set of privileges needed for the performance of authorized tasks.

Least Use - The principle that requires that each subject use the least privilege needed for the performance of authorized tasks.

List-Oriented - A computer protection system in which each protected object has a list of all subjects authorized to access it.

Lock-and-Key Protection System - A protection system that involves matching a key or password with a specific access requirement.

Mandatory Access Control - A means of restricting access to objects based on the sensitivity (as represented by a label) of the information contained in the objects and the formal authorization (i.e., clearance) of subjects to access information of such sensitivity.

Modes Of Operation - A description of the conditions under which an AIS functions, based on the sensitivity of data processed and the clearance levels and authorizations of the users. The four authorized modes of operation are:

a. Dedicated Mode

b. System-High Mode

c. Compartmented Mode

d. Multilevel Mode

Multilevel Mode - The mode of operation where some users do not have a valid personnel clearance for the most restrictive information in the system, and all have clearance, formal access, and need-to-know for the data to which they have access.

Multilevel Secure - A class of system containing information with different sensitivities that simultaneously permits access by users with different security clearances and need-to-know, but prevents users from obtaining access to information for which they lack authorization.

Multiuser Mode - A mode of operation designed for systems that process sensitive unclassified information in which users may not have a need-to-know for all information processed.

Need-To-Know - The necessity for access to, knowledge of, or possession of specific information required to carry out official duties.

Object - A passive entity that contains or receives information. Examples: records, blocks, pages, segments, files, bytes, fields, printers, network nodes, etc.

Object Reuse - The reassignment and reuse of a storage medium that once contained one or more objects. To be securely reused and assigned to a new subject, storage media must contain no residual data from the object(s) previously contained in the media.

Open Security Environment - An environment in which at least one of the following conditions occurs:

a. Application developers (including maintainers) do not have sufficient clearances or authorizations to provide an acceptable presumption that they have not introduced malicious logic.

b. Configuration control procedures do not provide sufficient assurance that applications and the equipment are protected against the introduction of malicious logic prior to and during the operation of the system.

Operations Security - An analytical process by which the U.S. Government and its supporting contractors can deny to potential adversaries information about capabilities and intentions by identifying, controlling, and protecting evidence of the planning and execution of sensitive activities and operations.

Partitioned Security Mode - A mode of operation wherein all personnel have the clearance but not necessarily formal access approval and need-to-know for all information in the system.

Password - A protected/private character string used to authenticate an identity.

Penetration - The successful act of bypassing the security mechanisms of a system.

Periods Processing - The processing of various levels of sensitive information at distinctly different times.

Personnel Security - The procedures established to ensure that all personnel who have access to sensitive information have the required authority as well as the appropriate clearances.

Physical Security - The application of physical barriers and control procedures as preventative measures or countermeasures against threats to resources and sensitive information.

Privileged Instructions - A set of instructions to control features that are generally executable only when the automated system is operating in the executive state.

Process - A program in execution.

Protection Philosophy - An informal description of the overall design of a system that delineates each of the protection mechanisms employed. A combination of formal and informal techniques is used to show that the mechanisms are adequate to enforce the security policy.

Purge - The removal of sensitive data from an AIS, AIS storage device, or peripheral device with storage capacity, at the end of a processing session.

Read - A fundamental operation that results in the flow of information from an object to a subject.

Reference Monitor Concept - An access-control concept that refers to an abstract machine that mediates all access to objects by subjects.

Residual Risk - The portion of risk that remains after security measures have been applied.

Residue - Data left in storage after processing operations are complete.

Risk - The probability that a particular threat will exploit a particular vulnerability of the system.

Secure State - A condition in which no subject can access any object in an unauthorized manner.

Security Filter - A trusted subsystem that enforces a security policy on the data that pass through it.

Security Label - A piece of information that represents the security level of an object.

Security Level - The combination of a hierarchical classification and a set of nonhierarchical categories that represents the sensitivity of information.

Security Policy - The set of laws, rules, and practices that regulates how an organization manages, protects, and distributes sensitive information.

Security Policy Model - A formal representation of the security policy enforced by the system. It must identify the set of rules and practices that regulate how a system manages, protects, and distributes sensitive information.

Security Range - The highest and lowest security levels that are permitted in or on a system, system component, subsystem, or network.

Security Requirements Baseline - A description of minimum requirements necessary for a system to maintain an acceptable level of security.

Security Safeguards - The protective measures and controls that are prescribed to meet the security requirements specified for a system. Those safeguards may include but are not limited to: hardware and software security features, operating procedures, accountability procedures, access and distribution controls, management constraints, personnel security, and physical structures, areas, and devices.

Security Specifications - A detailed description of the safeguards required to protect a system.

Sensitivity Label - A piece of information that represents the security level of an object. Sensitivity labels are used by the TCB as the basis for mandatory access control decisions.

Simple Security Property - A security model rule allowing a subject read access to an object only if the security level of the subject dominates the security level of the object. (Bell-LaPadula)
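
The rule is easy to sketch in C. The sketch below is illustrative only; the numeric classification scale and the category bit mask are conveniences for the example, not anything prescribed by the standards.

    #include <stdio.h>

    /* Illustrative only.  A security level is a hierarchical
       classification plus a set of nonhierarchical categories,
       here represented as a bit mask. */
    typedef struct {
        int classification;      /* e.g., 0=U, 1=C, 2=S, 3=TS */
        unsigned int categories; /* bit mask of category membership */
    } SecLevel;

    /* A dominates B when A's classification is at least B's and A's
       categories are a superset of B's. */
    static int dominates(SecLevel a, SecLevel b)
    {
        return a.classification >= b.classification &&
               (a.categories & b.categories) == b.categories;
    }

    /* Simple Security Property: read is allowed only if the subject's
       level dominates the object's level. */
    static int may_read(SecLevel subject, SecLevel object)
    {
        return dominates(subject, object);
    }

    int main(void)
    {
        SecLevel subject = { 2, 0x3 };  /* Secret, categories {A,B} */
        SecLevel object  = { 1, 0x1 };  /* Confidential, category {A} */
        printf("read %s\n", may_read(subject, object) ? "granted" : "denied");
        return 0;
    }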

Stand-alone, Shared System - A system that is physically and electrically isolated from other systems and is intended to be used by more than one person with data belonging to one user remaining available to the system while another user is using the system.

Stand-Alone, Single-User System - A system that is physically and electrically isolated from other systems and is intended to be used by one person at a time with no data belonging to other users remaining in the system.

Storage Object - An object that supports both read and write accesses.

Subject - An active entity, generally in the form of a person, process, or device, that causes information to flow among objects or changes the system state.

Subject Security Level - A subject's security level is equal to the security level of objects to which it has both read and write access. A subject's security level must always be dominated by the clearance of the user with which the subject is associated.

System High - The highest security level supported by a system at a particular time or in a particular environment.

System High Mode - The mode of operation where each user has a valid personnel clearance and formal access for the most restrictive information in the system and need-to-know for some of the information in the system.

System Low - The lowest security level supported by a system at a particular time or in a particular environment.

Technical Attack - An attack that can be perpetrated by circumventing or nullifying hardware and software protection mechanisms, rather than by subverting system personnel or other users.

Terminal Identification - The means to uniquely identify a terminal to a system.

Threat - Any circumstance or event with the potential to cause harm to a system in the form of destruction, disclosure, modification of data, and/or denial of service.

Threat Analysis - The examination of all actions that might adversely affect a system.

Threat Monitoring - The analysis, assessment, and review of audit trails and other data collected for the purpose of searching out system events that may constitute violations or attempted violations of system security.

Ticket-Oriented - A computer protection system in which each subject maintains a list of unforgeable bit patterns, called tickets, one for each object the subject is authorized to access.

Trusted Computing Base - The totality of protection mechanisms within a computer system, including hardware, software, and firmware, the combination of which is responsible for enforcing a security policy.

Trusted Distribution - A trusted method for distributing the TCB hardware, software, and firmware components, both originals and updates, that provides methods for protecting the TCB from modification during distribution and for detection of any changes to the TCB that may occur.

Trusted Identification Forwarding - An identification method used in networks whereby the sending host can verify that an authorized user on its system is attempting a connection to another host.

Trusted Path - A mechanism by which a person at a terminal can communicate directly with the TCB. This mechanism can only be activated by the person or the TCB and cannot be imitated by untrusted software.

Trusted Process - A process whose incorrect or malicious execution is capable of violating system security policy. Such a process is examined and evaluated to ensure proper performance of the desired functionality.

Untrusted Process - A process that has not been evaluated or examined for adherence to the security policy. Such a process is not allowed the ability to violate the security policy.

User - A person or process accessing an AIS either by direct connection (i.e., via terminals) or indirect connection (i.e., preparing input data or receiving output that is not reviewed for content or classification by a responsible individual).

User ID - A unique symbol or character string that is used by a system to identify a specific user.

User Profile - Patterns of a user's activity that can be used to detect changes in normal routines.

Vulnerability - A weakness in the system security procedures, system design, implementation, internal controls, etc., that could be exploited to violate system security policy.

Write - A fundamental operation that results in the flow of information from a subject to an object.


Document Number: NCSC-TG-005 Version-1

Title: Trusted Network Interpretation

Color: Red I

Date: 31 JUL 87

Highlights: This book is a rehash of the Orange Book for networks (in 278 pages). There are several new concepts brought out by the Red Book; these new concepts are the heart of this Digest.

1. Security Policy: The network security policy needs two sections, in addition to the sections needed by a system:

a. Secrecy Policy: The Secrecy Policy defines the policy enforced by the network to protect the data on the network from improper exposure to unauthorized users.

b. Integrity Policy: The Integrity Policy defines the policy enforced to ensure the delivery of unmodified and uncorrupted data from the source to the destination.

2. For networks, several new evaluation areas are brought forth. The criteria for the evaluations fall into three main areas: Functionality, Strength of Mechanism, and Assurance. The following is a listing of the new areas to assess when determining the "Goodness" of a network which were not present in the assessment of the stand-alone system:

a. Support Primitives:
1) Encryption Mechanism.

2) Protocols.

b. Documentation (needing special attention for networks):

1) Security Features User's Guide.

2) Trusted Facility Manual.

3) Test Documentation.

4) Design Documentation.

c. Communications Integrity:

1) Authentication.

2) Communications Field Integrity.

3) Non-Repudiation (receipted, guaranteed delivery).

d. Denial of Service:

1) Continuity of Operations.

2) Protocol Based Protection.

3) Network Management.

e. Compromise Protection:

1) Data Confidentiality.

2) Traffic Confidentiality.

3) Selective Routing.

3. Networks are composed of components. While each component need not possess full security "Goodness", the whole must be able to be judged for overall "Goodness". The method chosen for systematic evaluation of the components was to judge each piece by the criteria of one or more slices of the total security framework. The following is a listing of Component Types and their Minimum and Maximum Classes:

M = MAC
D = DAC
I = Identification and Authentication
A = Audit
Component Type	Min Class	Max Class
_________________________________________

M		B1		A1
D		C1		C2+
I		C1		C2
A 		C2		C2+
DI		C1		C2+
DA		C2		C2+
IA		C2		C2+
IAD		C2		C2+
MD		B1		A1
MA		B1		A1
MI		B1		A1
MDA		B1		A1
MDI		B1		A1
MIA		B1		A1
MIAD		B1		A1
_________________________________________


4. Cascading is the situation where multilevel systems which operate at different ranges are connected on a multilevel network. While each of the systems in question has an allowably small range of operation (Confidential through Secret, Secret through Top Secret, etc.), the network has a larger (and possibly unacceptable) range of users (Confidential through Top Secret, in the example). Even when the network only allows Secret transfers, there is the increased chance that a Confidential user who can worm up one level (to Secret) can then run the net to the Secret-Top Secret system and worm up one more level, as the sketch below illustrates.
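
The arithmetic behind the cascading condition can be sketched as follows (illustrative only; the numeric scale C=1, S=2, TS=3 is an assumption made for the example).

    #include <stdio.h>

    /* Illustrative only: two accredited multilevel systems joined by a
       network.  If the union of their ranges exceeds the largest range
       either system was accredited for, the cascading condition exists. */
    struct sys { int low, high; };      /* accredited operating range */

    int main(void)
    {
        struct sys a = { 1, 2 };        /* Confidential through Secret */
        struct sys b = { 2, 3 };        /* Secret through Top Secret */

        int net_low   = a.low  < b.low  ? a.low  : b.low;
        int net_high  = a.high > b.high ? a.high : b.high;
        int worst_sys = (a.high - a.low > b.high - b.low)
                      ? (a.high - a.low) : (b.high - b.low);

        if (net_high - net_low > worst_sys)
            printf("cascading: network range %d exceeds accredited range %d\n",
                   net_high - net_low, worst_sys);
        return 0;
    }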


Document Number: NCSC-TG-006 Version-1

Title: A Guide to Understanding Configuration Management

Color: Tangerine II

Date: 28 MAR 88

Highlights

1. Configuration Management (CM) is a requirement for B2 through A1. (Multilevel Mode)

2. CM should be used for all systems, regardless of level and mode.

3. CM has the duty of ensuring that the inevitable changes to the system are performed properly and do not adversely affect the security and operations of the system.

4. B2 CM Requirements:

1) CM system must be in place during the development and maintenance of the TCB to maintain control of changes to:

2) - Descriptive Top Level Specification (DTLS).

3) - Other Design Data.

4) - Implementation Documentation (User's Manuals, etc.).

5) - Source Code.

6) - Running Version of Object Code.

7) - Test Fixtures.

8) - Test Documentation.

9) CM System must assure consistent mapping across all documentation and code.

10) CM System must provide tools to generate a new version of the TCB.

11) CM system must provide tools to ensure the inclusion of only the desired changes to the TCB upon generation of a new version of the system.

5. A1 CM Requirements:

12) CM System must be in place throughout the entire lifecycle (design, development, and maintenance). The CM System must maintain control of changes to the:

13) - TCB hardware.

14) - TCB software.

15) - TCB firmware.

16) - Formal Model.

17) - Formal Top Level Specification.

18) - CM Tools.

19) Prevent unauthorized modification or destruction of the masters of all material used to generate a new TCB.

6. CM consists of four separate tasks:

a. Identification.

b. Control.

c. Status Accounting.

d. Auditing.

7. CM Identification: Procedures to enable a person to identify the configuration of the system at discrete points in time for systematic control of the configuration and to maintain integrity and traceability throughout the system life cycle.

8. CM Control: The systematic evaluation, coordination, approval or disapproval of proposed changes to the design and construction of a configuration item whose configuration has been formally approved.

9. CM Status Accounting: Records and reports which enable proper logistics support to be established.

10. CM Audit: Checking CM Accounting information to ascertain that only the authorized changes have been made in the code that will actually be used as the new version of the TCB.

11. Classical Baselining is the recommended method for implementing CM.

12. Use of a Configuration Control Board is also recommended. Membership should include the entire community.

13. UNIX SCCS (Source Code Control System) and VAX DEC/CMS (Code Management System) are recommended tools for tracking code histories.


Document Number: NCSC-TG-007 Version-1

Title: A Guide to Understanding Design Documentation

Color: Burgundy

Date: 2 OCT 88

Highlights

1. Design documentation should be written in an iterative manner with the total lifecycle of the system in mind at all times.

2. C1 Design Documentation Requirements:

1) Describe the Philosophy of Protection (PoP).

2) Describe how the PoP is translated into the TCB.

3) Describe how the TCB is modularized (if modular).

4) Describe all interfaces between TCB modules (if modular).

5) Describe how the TCB protects itself.

6) Provide a statement of the system security policy.

3. B1 Design Documentation Requirements:

7) Provide a description of the security policy model enforced by the TCB.

8) Explain the sufficiency of the security policy model to enforce the security policy.

9) Identify and describe the TCB protection mechanisms. [See 5)]

10) Explain how the TCB mechanisms satisfy the security policy model.

4. B2 Design Documentation Requirements:

11) Describe how the TCB is modularized. [See 3)]

12) Describe all of the interfaces between TCB modules. [See 4)]

13) Provide a formal description of the security policy model.

14) Prove the sufficiency of the security policy model to enforce the security policy.

15) Show that the Descriptive Top-Level Specification (DTLS) is an accurate description of the TCB interface.

16) Describe how the TCB implements the Reference Monitor Concept.

17) Describe why the reference monitor is tamper resistant.

18) Describe why the reference monitor cannot be bypassed.

19) Describe why the reference monitor is correctly implemented.

20) Describe how the TCB is structured to facilitate testing.

21) Describe how the TCB is structured to enforce least privilege.

22) Present the results and methodology of the covert channel analysis.

23) Describe the tradeoffs involved in restricting covert channels.

24) Identify all auditable events that may be used in exploitation of known covert storage channels.

25) Provide the bandwidth of known covert storage channels whose use is not detectable by auditing mechanisms.

5. B3 Design Documentation Requirements:

26) Identify all auditable events that may be used in exploitation of known covert timing channels.

27) Provide the bandwidth of known covert storage channels whose use is not detectable by auditing mechanisms.

28) Describe how the TCB complies with additional B3 system architecture requirements, e.g. minimal TCB and layering.

29) Informally show consistency of the TCB implementation with the DTLS.

30) Informally show correspondence between elements of the DTLS and elements of the TCB.

31) Informally show consistency of the DTLS with the model.

6. A1 Design Documentation Requirements:

32) Informally show consistency of the TCB implementation with the Formal Top-Level Specification (FTLS).

33) Informally show correspondence between elements of the FTLS and elements of the TCB.

34) Clearly describe hardware, software, and firmware internal to the TCB that is not dealt with in the FTLS.

35) Informally or formally show consistency of the FTLS with the model.

36) Informally show correspondence between the DTLS and the FTLS.


Document Number: NCSC-TG-008 Version-1

Title: A Guide to Understanding Trusted Distribution

Color: Lavender

Date: 15 DEC 88

Highlights

1. Trusted Distribution (TD) gives the following assurances:

a. Assurance that the product evaluated is the one the manufacturer built.

b. Assurance that the product built is the one sent.

c. Assurance that the product sent is the one the customer site received.

2. TD is required at the A1 level and higher.

3. TD should be used at lower levels as cost-effective.

4. TD should include the documentation.

5. TD protects against two threats (either of which can introduce Trojan Horses, trapdoors, viruses, etc.):

a. Tampering during transit.

b. Counterfeit TCB updates.

6. TD requires protection during three phases:

a. Post-Production.

b. Transit.

c. Receipt.

7. Post-production protection and assurance come from the use of packing, storage, and loading procedures which ensure that the product shipped is the product intended. This protection and assurance can come from a combination of one or more of the following:

a. Configuration Management procedures.

b. Shrink-Wrap packaging (assuming heat sensitivity allows this).

c. Bonded storage.

8. In-transit protection ensures that what is shipped arrives in an unaltered state and that what arrives was sent by the purported sender. This assurance can come from a combination of one or more of the following:

a. Shrink-Wrap packaging.

b. Tamper resistant seals.

c. Numbered container seals (with the number sent by separate secure means).

d. Active alarm systems.

e. Couriers.

f. Registered mail.

g. Message authentication codes.

h. Encryption.

9. Receipt verification lends assurance through three main methods:

a. Confirmation that the methods used previously do not indicate tampering, such as Authentication code checks, untriggered alarms, and unbroken seals.

b. Confirmation that the proper parties sent the shipment which has arrived.

c. Independent confirmation that the received TCB is the right TCB. This assurance can come from a combination of one or more of the following:

1) Check-Sum confirmation. (A short sketch follows this list.)

2) Inventory.

3) Technical inspection.
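
One way to picture the check-sum confirmation in 9.c.1) is the sketch below (illustrative only). The 16-bit additive sum is a stand-in for whatever algorithm the vendor and receiving site actually agree upon, and the expected value would arrive by separate secure means.

    #include <stdio.h>
    #include <stdlib.h>

    /* Illustrative check-sum confirmation of a received TCB image.
       Usage: confirm <file> <expected-sum-in-hex> */
    int main(int argc, char *argv[])
    {
        FILE *fp;
        int c;
        unsigned int sum = 0;

        if (argc != 3) {
            fprintf(stderr, "usage: %s file expected-sum-hex\n", argv[0]);
            return 2;
        }
        if ((fp = fopen(argv[1], "rb")) == NULL) {
            perror(argv[1]);
            return 2;
        }
        while ((c = getc(fp)) != EOF)       /* 16-bit additive sum */
            sum = (sum + (unsigned int)c) & 0xFFFF;
        fclose(fp);

        if (sum == (unsigned int)strtoul(argv[2], NULL, 16)) {
            printf("check-sum confirmed (%04X)\n", sum);
            return 0;
        }
        printf("check-sum MISMATCH: computed %04X\n", sum);
        return 1;
    }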

10. Each case is unique and must be viewed individually to determine the appropriate TD methods. The active threats against the customer site, the active threats against the vendor site, and the state-of-the-threat-art must each be taken into account in determining the cost-effective methods to be used.


Document Number: NCSC-TG-009 Version-1

Title: Computer Security Subsystem Interpretation

Color: Venice Blue

Date: 16 SEP 88

Highlights

A. Introduction

1. For systems which do not have sufficient security there are four recognized areas which can be taken care of with "add-on" products:
a. Discretionary Access Control (DAC).

b. Object Reuse (OR).

c. Identification and Authentication (I&A).

d. Audit (AUD).

2. These products, by definition, do not meet all of the requirements for a system. The ratings which they are given are based on meeting the subset of the requirements pertaining to the area of protection. The ratings given are all "D" because of the incompleteness, but the ratings show area compliance in the following manner:

a. D => Does not meet C1 (DAC, OR, I&A, AUD).

b. D1 => Meets C1 for the area (DAC, I&A).

c. D2 => Meets C2 for the area (DAC, OR, I&A, AUD).

d. D3 => Meets B3 for the area (DAC, AUD).

3. For the subsystems to have meaning in the total system, the necessary support from the other areas must be present. The basic dependencies are:

a. DAC requires I&A for ID purposes and AUD (at D2) to track control permissions.

b. OR has no dependencies.

c. I&A requires DAC (or domain isolation) and AUD (at D2).

d. AUD requires I&A and DAC (or domain isolation).

4. The integration of the total system, with its subsystems, needs to be viewed to determine the total level of trust which can be granted.

B. Required Features

1. DAC D1 Requirements:
a. Access must be controlled by group or individual.

b. Means must be available to specify authorizations for all users and groups with access.

c. Mediation prior to access must allow (or disallow) the access.

2. DAC D2 Requirements:

d. Access must be able to include/exclude at the individual user level.

e. Control permission must be limited to authorized users.

f. Passing of control permission must be limited to authorized users.

g. All access to objects must be denied unless explicit authorizing action is taken.

3. DAC D3 Requirements:

h. DAC must allow users to specify the list of individuals or groups with access to each object and what access modes are allowed. (Implies ACLs.)

4. OR D2 Requirement:

a. Before allowing a new user to have read access to a previously used storage device, the storage area must be overwritten with meaningless bit patterns.

5. I&A D1 Requirements:

a. Users must identify themselves prior to use of the system.

b. Identification must be authenticated by a protected means (e.g. password).

c. Authentication data must be safeguarded from unauthorized users.

d. Authenticated identities must be passed to the protected system.

6. I&A D2 Requirements:

e. Identification must be to the individual level.

f. Audit records must be generated for successful and failed logins.

7. AUD D2 Requirements:

a. The system must be able to call the AUD subsystem and hand off audit parameters.

b. The AUD subsystem must format the data and route to storage or an audit logger.

c. Audit data must be protected from unauthorized modification.

d. Access by authorized individuals must be through the AUD subsystem.

e. All security relevant events must be recorded.

f. For each event, UserID, date/time, event type, and success/failure must be recorded.

g. The AUD subsystem must be able to perform selection of audit data based on UserID. (A record-layout sketch follows this listing.)

8. AUD D3 Requirement:

h. The AUD subsystem must be able to accept threshold values and immediately notify the security administrator of an imminent security violation.
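
A record layout and UserID selection meeting 7.f and 7.g might look like the following sketch (illustrative only; the field names and sizes are assumptions, not taken from the Interpretation).

    #include <stdio.h>
    #include <string.h>
    #include <time.h>

    /* Illustrative AUD D2 audit record: who, when, what, and outcome. */
    struct audit_record {
        char   user_id[16];   /* UserID */
        time_t stamp;         /* date/time */
        char   event[24];     /* event type, e.g. "login", "open" */
        int    success;       /* 1 = success, 0 = failure */
    };

    /* Selection of audit data based on UserID (requirement 7.g). */
    static void select_by_user(const struct audit_record *log, int n,
                               const char *user)
    {
        int i;
        for (i = 0; i < n; i++)
            if (strcmp(log[i].user_id, user) == 0)
                printf("%-8s %-8s %-8s %s",
                       log[i].user_id, log[i].event,
                       log[i].success ? "success" : "failure",
                       ctime(&log[i].stamp));
    }

    int main(void)
    {
        struct audit_record log[2] = {
            { "smith", 0, "login", 1 },
            { "jones", 0, "open",  0 },
        };
        log[0].stamp = log[1].stamp = time(NULL);
        select_by_user(log, 2, "smith");
        return 0;
    }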

C. Assurance Requirements

1. Architecture D1 Requirement (to be met by all D1 subsystems):
a. The subsystem's mechanisms and data must be protected from external interference and tampering.

2. Architecture D2 Requirement (to be met by all D2 subsystems):

b. Operations of the subsystem must be nonbypassable and must be performed on all subjects and objects controlled by the subsystem.

3. Integrity D1 Requirement (to be met by all D1 subsystems):

a. The capability must exist to validate the correct operation of the hardware and firmware elements of the subsystem even if they are part of the protected system.

4. Test D1 Requirements (to be met by all D1 subsystems):

a. Subsystem must perform as documented.

b. Subsystem must not introduce new flaws to the system.

c. There must be no obvious ways to bypass or defeat the security features.

5. Test D2 Requirement (to be met by all D2 subsystems):

d. Testing must include a search for obvious flaws which affect the security features.

D. Documentation Requirements

1. Security Features User's Guide D1 Requirement (to be met by all D1 subsystems):
a. There must be clear, comprehensive, and centralized documentation describing the protection mechanisms of the subsystem and their use.

2. Trusted Facility Manual (TFM) D1 Requirements (to be met by all D1 subsystems):

a. TFM must give cautions about functions and privileges provided by the subsystem.

b. TFM must give precise directions for integration of the subsystem into the system.

3. Trusted Facility Manual (TFM) D2 Requirements (to be met by all D2 subsystems):

c. TFM must give full details about using the audit trail and the full formats and structure of the audit records.

4. Test Documentation D1 Requirements (to be met by all D1 subsystems):

a. Test documentation must include the test plan, test procedures and test report for the security mechanisms.

5. Design Documentation D1 Requirements (to be met by all D1 subsystems):

a. Describe the threats protected against.

b. Describe Philosophy of Protection (PoP).

c. Explain implementation of the PoP.

d. Modular designs must describe the interfaces.

e. Specify the interactions with the system and other subsystems.


Document Number: NCSC-TG-011 Version-1

Title: Trusted Network Interpretation Environments Guideline

Color: Red II

Date: 1 AUG 90

Highlights

1. This document is in support of the Trusted Network Interpretation [TNI] (NCSC-TG-005). Its purpose is to show how to use the interpretations and concepts found in the TNI and to mend the flaws which have surfaced in the TNI. It is the first of a series (within the Rainbow Series) of trusted network related documents, the Trusted Network Technology (TNT) set.

2. The areas where the TNI fell down revolve around the following set of revised definitions which were not used consistently in, or were not properly brought forth by, the TNI:

a. Network Trusted Computing Base (NTCB): The totality of protection mechanisms within a network system.

b. NTCB Partition: The totality of protection mechanisms within a single network subsystem.

c. Component: An individual physical unit that does not provide a complete set of end-user services.

d. System/Subsystem: A collection of hardware, software, and firmware configured to collect, create, communicate, disseminate, process, store, and/or control data and information.

e. NCSC-evaluation: The process in which the NCSC determines whether a COTS product satisfies the Orange Book.

f. Certification: The technical evaluation of a system's security features that establishes the extent to which a particular system's design and implementation meet a set of specified security requirements.

g. Accreditation: The managerial authorization and approval granted to a system or network to process sensitive data in an operational environment.

3. Network Security Architecture and Design (NSAD):

a. An NSAD is required for each system. Each subordinate subsystem must have an NSAD which conforms to the system NSAD.

b. If the network is composed of previously accredited systems, the NSAD must include Memorandum of Agreement (MOA[s]) between the systems' accreditors, or a Memorandum of Record (MOR) if each of the systems had the same accreditor, covering the connection requirements for continued accreditation.

c. NSAD documents the security services for the network including the network configuration and allocation of security services to network components.

d. Suggested contents include:

1) General description of information transmitted, by subsystem.

2) Summary description of trusted behavior, by subsystem.

3) Details of the system security plan and responsibilities.

4) Overall network security policy.

5) Additional security training and responsibilities.

6) Specification of security parameters transmitted.

7) Security details relevant to information exchange.

8) Description of the user community, including lowest clearance.

9) Considerations for dial-up connections, including safeguards.

10) Security protections provided by data communications, local and network.

11) Audited data and audit trail division of labor.

12) Information security procedures provided by subsystems, including:

a) Types of processing: file query, individual user, general processing, etc.

b) Modes of operation.

c) Sensitivity levels processed.

4. Risk Assessment:

a. Determine the system security mode of operations.

b. Determine minimum user clearance or authorization rating (RMIN).

c. Determine maximum data sensitivity rating (RMAX).

d. Determine Risk Index (RMAX - RMIN). (A worked sketch follows the table in item 7.)

e. Determine security evaluation class for computer-based controls; C1, B3, etc.

f. Determine adjustments to the evaluation class required.

5. Section 5.4.3 contains a step-by-step questionnaire for identifying needed security services functionality. (It has not been repeated here.)

6. Network Security Services

   Network Security Service         Criterion       Evaluation Range
____________________________________________________________________

Communications Integrity
   Authentication                   Functionality   None - Good
                                    Strength        None - Good
                                    Assurance       None - Good
   Communications Field Integrity   Functionality   None - Good
                                    Strength        None - Good
                                    Assurance       None - Good
   Non-Repudiation                  Functionality   None | Present
                                    Strength        None - Good
                                    Assurance       None - Good
Denial of Service
   Continuity of Operations         Functionality   None | Present
                                    Strength        None - Good
                                    Assurance       None - Good
   Protocol Based Protection        Functionality   None - Good
                                    Strength        None - Good
                                    Assurance       None - Good
   Network Management               Functionality   None | Present
                                    Strength        None - Good
                                    Assurance       None - Good

Compromise Protection
   Data Confidentiality             Functionality   None | Present
                                    Strength        Sensitivity Level
                                    Assurance       None - Good
   Traffic Flow Confidentiality     Functionality   None | Present
                                    Strength        Sensitivity Level
                                    Assurance       None - Good
   Selective Routing                Functionality   None | Present
                                    Strength        None - Good
                                    Assurance       None - Good
____________________________________________________________________

7. Risk Index, Assurance Rating, and Minimum Evaluation Class

Risk Index    Strength of Mechanism   Assurance Rating   Evaluation Class
_________________________________________________________________________

  0                  None                   None                 D
  1                  Minimum                Minimum              C1
  2                  Fair                   Fair                 C2
  >2                 Good                   Good                 B2
_________________________________________________________________________
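
Tying steps 4.b through 4.e to the table above, the computation is simple enough to sketch (illustrative only; the numeric clearance/sensitivity scale is an assumption made for the example).

    #include <stdio.h>

    /* Illustrative Risk Index computation.  Scale assumed:
       U=0, C=1, S=2, TS=3 for both clearance and sensitivity. */
    static const char *min_class(int rmin, int rmax)
    {
        int risk = rmax - rmin;          /* Risk Index = RMAX - RMIN */
        if (risk <= 0) return "D";
        if (risk == 1) return "C1";
        if (risk == 2) return "C2";
        return "B2";                     /* Risk Index > 2 */
    }

    int main(void)
    {
        int rmin = 1;   /* lowest user clearance: Confidential */
        int rmax = 3;   /* highest data sensitivity: Top Secret */
        printf("Risk Index %d -> minimum evaluation class %s\n",
               rmax - rmin, min_class(rmin, rmax));
        return 0;
    }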

 

Document Number: NCSC-TG-013 Version-1

Title: Rating Maintenance Phase Program Document

Color: Hot Pink

Date: 23 JUN 89

Highlights: The Rating Maintenance Phase (RAMP) Program is intended to simplify the continuance of ratings for a vendor's products after evaluation. This would allow ongoing versions of the Operating System software and hardware suites to be listed on the Evaluated Product List (EPL) without the need for reevaluation. The following is a summary of RAMP requirements which must be met by the vendor in order to participate in the RAMP Program.

1. Preevaluation Phase

a. Vendor establishes an intent to participate in RAMP in the product's evaluation package.

2. Vendor Assistance Phase/Design Analysis Phase

a. The vendor must identify and maintain a responsible corporate officer.

b. The vendor must complete training of one or more Vendor Security Analysts (VSAs) and provide Dockmaster access. A lead VSA must be identified, if more than one VSA.

c. NCSC will provide training for the VSAs.

d. The vendor must develop an NCSC approved Rating Maintenance Plan (RM-Plan) prior to its implementation and implement the RM-Plan prior to start of development on the version to supersede the evaluated product.

e. NCSC will review the RM-Plan for purposes of approval.

3. Evaluation Phase

a. The vendor must maintain a responsible corporate officer.

b. The vendor must maintain one or more Vendor Security Analysts (VSAs) and provide Dockmaster access. A lead VSA must be identified, if more than one VSA.

c. NCSC will provide training for the VSAs.

d. The vendor must complete his own RAMP audit to ensure that security feature functionality and assurances are being maintained by adherence to the approved RM-Plan.

e. The NCSC evaluation team will review the results of the vendor audit to ensure the vendor's RAMP process follows the approved RM-Plan procedures.

f. NCSC assigns a Technical Point of Contact (TPOC) and a Business Point of Contact (BPOC) before completion of the evaluation phase.

4. Rating Maintenance Phase

a. The vendor must maintain a responsible corporate officer.

b. The vendor must maintain one or more Vendor Security Analysts (VSAs) and provide Dockmaster access. A lead VSA must be identified, if more than one VSA.

c. NCSC will provide training for the VSAs.

d. NCSC maintains a BPOC and a TPOC.

e. The vendor must provide product instruction to the TPOC as needed throughout RAMP.

f. The vendor will provide informal quarterly status reports via Dockmaster.

g. The vendor must conduct at least one RAMP audit per RAMP cycle.

h. TPOC will review the vendor's RAMP audit results.

i. The vendor will submit the following documents for each version of the evaluated product for which the vendor desires to have the rating maintained via RAMP:

1) Rating Maintenance Report (RMR)

2) Rating Maintenance Plan (RM-Plan) with change bars

3) Final Evaluation Report (FER) with change bars

4) FER with integrated changes

5) Proposed Product description for EPL

j. NCSC will review the vendor's documents for the purpose of extending the rating to the specific release and placement on the EPL.

k. The vendor's RAMP process is subject to Interim Reviews and Aperiodic Reviews by the NCSC to ensure the adherence to the vendor's approved RM-Plan procedures.


Document Number: NCSC-TG-014 Version-1

Title: Guidelines for Formal Verification Systems

Color: Purple

Date: 1 APR 89

Highlights: Formal specification and verification systems used in the development of systems for submittal to the NCSC for evaluation at the A1 level are to be selected from the Endorsed Tools List (ETL). These are the procedures and requirements for candidates for the ETL.

1. Major Steps Leading To Endorsement And ETL Listing:

a. Developer submits a request for evaluation to the NCSC Committee Chairperson.

b. The Committee meets to see if the verification system has something new to offer.

c. On favorable result, an evaluation team is formed and evaluation starts.

d. On completion of evaluation, a Technical Assessment Report (TAR) is written.

e. The Committee reviews the TAR and makes recommendations on endorsement.

f. The Committee Chairperson approves/disapproves endorsement.

g. If approved, an ETL entry is issued for the verification system.

h. A TAR is issued on the verification system.

2. Major Steps Leading To Endorsement And ETL Listing For A Revised System:

a. Vendor submits the Vendor Report (VR) and other materials to the Chairperson.

b. An evaluation team is formed to review the VR.

c. The team adds comments and submits them to the Committee.

d. The vendor defends the VR to the Committee.

e. As Above.

3. Major Steps Leading To The Removal Of A Verification System From The ETL:

a. The Committee questions the endorsement of a verification system on the ETL.

b. As Above.

4. The components of the verification system include:

a. A mathematical specification language that allows the use of correctness conditions.

b. A specification processor that interprets the specification and generates conjectures.

c. A reasoning mechanism that interprets the conjectures and checks the proof.

5. The four technical factors for the verification system are:

a. Methodology.

b. Features.

c. Assurance.

d. Documentation.

6. The methodology must consist of a set of propositions used as rules for performing formal verification in the system. The methodology must have a sound logical basis.

7. Features include:

a. Specification Language.

b. Specification Processing.

c. Reasoning Mechanism.

d. User Interface.

e. Hardware Support.

8. Assurance is provided by:

a. Sound Basis.

b. Correctness.

c. Predictability.

d. Previous Use.

e. Error Recovery.

f. Software Engineering.

g. Logical Theory.

h. Machine Independence.

i. Configuration Management.

j. Support and Maintenance.

k. Testing.

9. Required Documentation:

a. Informal Justification.

b. Formal Definition.

c. Explanation of Methodology.

d. CM Plan.

e. CM Evidence.

f. Source Code.

g. Test Documentation.

h. User's Guide.

i. Reference Manuals.

j. Facilities Manuals.

k. Vendor Report.

l. Worked Examples.


Document Number: NCSC-TG-015 Version 1

Title: A Guide to Understanding Trusted Facility Management

Color: Brown

Date: 18 OCT 89

Highlights

1. At the B2 and B3 levels, the ISSO functions must be differentiated to allow for compartmentalization of functionality, to limit the damage which can be caused by error or malicious intent. The required separations are:

a. For B2, the Administrator and Operator functions.

b. For B3, the security-relevant and non-security-relevant Administrator functions.

c. For all levels, the separations should be in place where possible (even if not required).

2. The duties of the ISSO can be differentiated into the following roles:

a. Security Administrator:
1) Setting the parameters for the Login/Logout mechanisms.

2) Setting the authentication parameters.

3) Defining the user account and registration profile.

4) Defining the group accounts and registration profile.

5) Defining and maintaining the Security Label (SL) map.

6) Setting SL limits and default SLs.

7) Labeling of imported unlabeled data and media.

8) Reclassifying objects.

9) Initializing DAC privileges.

10) Defining and maintaining group memberships.

11) Checking consistency of tables, installed TCB, allowed configuration, etc.

12) Terminating and deleting accounts.

13) Responding to real-time alarms.

14) Destroying errant processes.

b. Secure Operator:

1) Booting and shutting down the system.

2) Locating damaged user files and volumes.

3) Performing routine maintenance of TCB databases.

4) Performing on-line tests.

5) Mounting/unmounting Labelled removable media on user request.

6) Importing/exporting Labelled data on user request.

c. Account Administrator:

1) Installing and maintaining accounting files.

2) Turning system accounting on/off.

3) Running accounting/billing tools.

4) Enabling/disabling accounts at user request.

5) Establishing rates, prices, and policies.

6) Collecting system statistics on availability, configuration, and disk/CPU/memory.

7) Publishing revenue/costs reports.

d. Auditor:

1) Selecting/deselecting audit features.

2) Managing the audit files.

3) Setting delays and randomization factors for covert channel handling.

4) Performing postprocessing on the audit data collected.

e. Operator:

1) Performing user volume backups.

2) Performing system performance metering.

3) Adjusting resource quotas.

4) Responding to various non-security-relevant user requests.

f. System Programmer:

1) Trusted system distribution.

2) Setting of system configuration parameters.

3) Analyzing dumps.

4) Installing patches.

5) Recovering from system crashes.

6) Repairing damaged SLs.


Document Number: NCSC-TG-017 Version-1

Title: Identification and Authentication

Color: Light Blue

Date: SEP 91

Highlights

1. Identification and Authentication (I&A) are often referred to as login. The valid identification of the user is critical to the effective execution of access controls.

2. There are three types of Authentication:

a. Authentication by Knowledge: (Something you know)
1) The classic password login.

2) Also pass-phrases and entry quizzes.

3) Cannot be stolen or lost but can be readily copied.

4) Implementation and data storage are relatively straightforward.

b. Authentication by Ownership: (Something you have)

1) Physical object based.

2) Relatively vulnerable to theft.

3) Difficult to copy (duplicate).

c. Authentication by Characteristic: (Something you are)

1) Physical characteristics like retinal pattern, fingerprint, DNA, etc.

2) Relatively expensive and non-standard computer equipment.

3) Not foolproof.

3. Authentication data must be protected from access by all but the System Security Officer (SSO) and the SSO must not be allowed to see the plain-text version of the data.
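
One common way to meet this is to store only a salted one-way transform of the password, so that not even the SSO ever sees plain text. The sketch below is illustrative only; the toy hash merely keeps the example self-contained, and a real TCB would use a vetted one-way function such as crypt(3).

    #include <stdio.h>

    /* Illustrative only: the stored value cannot be reversed to obtain
       the password, and the plain text is never written anywhere. */
    static unsigned long one_way(const char *salt, const char *password)
    {
        unsigned long h = 5381;          /* toy hash, NOT for real use */
        const char *p;
        for (p = salt; *p; p++)     h = h * 33 + (unsigned char)*p;
        for (p = password; *p; p++) h = h * 33 + (unsigned char)*p;
        return h;
    }

    int main(void)
    {
        const char *salt = "x7";                            /* stored with the record */
        unsigned long stored = one_way(salt, "opensesame"); /* set at enrollment */

        /* At login: transform the offered password and compare. */
        const char *offered = "opensesame";
        printf("login %s\n",
               one_way(salt, offered) == stored ? "accepted" : "rejected");
        return 0;
    }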

4. Implementation

a. No method is totally foolproof.

b. Lower levels - one method will suffice. (Passwords are well-understood.)

c. Higher levels - may require combinations of methods.

5. I&A Requirements by Class

a. C1 Requirements
1) Users identify themselves.

2) TCB authenticates via protected means.

3) TCB protects authentication data.

4) Group identities allowed.

b. C2 Requirements (Additional)

1) Uniquely identify each user.

2) Audit the I&A action.

c. B1 Requirements (Additional)

1) Authenticate clearance and authorizations.

2) Run MAC based on I&A results.

d. B2 Requirements (Additional)

1) Trusted path is required for I&A.

2) Operator and administrator functions separate.

e. B3 Requirements (Additional)

1) Trusted path expanded beyond just I&A.

2) Lists for access and non-access by named individual and group for each object.


Document Number: NCSC-TG-019 Version-1

Title: Trusted Product Evaluation Questionnaire

Color: Blue

Date: 16 OCT 89

Highlights

1. The questionnaire (185 questions) is intended to point developers toward the proper lines of thought while developing the system.

2. The questionnaire may be completed in an intelligent manner (not answering meaningless questions, putting off answers until the appropriate phase, etc.).

3. The primary groupings for the questions are:

a. Subjects.

b. Objects.

c. Hardware Architecture.

d. Software.

e. Identification & Authentication (I&A).

f. Object Reuse.

g. Discretionary Access Control (DAC) Policy.

h. Labels.

i. Mandatory Access Control (MAC) Policy.

j. Integrity.

k. Audit.

l. Modeling and Analysis.

m. Testing.

n. Other Assurances.

o. Other Documentation.

4. The questionnaire is only intended for use with operating systems, not networks or subsystems.


Document Number: NCSC-TG-020-A Version 1

Title: Access Control List (ACL) Features for UNIX

Color: Gray

Date: 18 AUG 89

Highlights: These are the recommendations for ACL features for use with UNIX from the Trusted UNIX Working Group (TRUSIX) set up by the NCSC.

1. ACLs are required for files, IPC objects, and UNIX system domain sockets.

2. Access modes required are Read, Write, and Execute.

3. Each ACL entry should specify permissions for a User XOR a Group.

4. ACL entry permissions are masked by the group class file protection bits.

5. Multiple concurrent groups, with group subsetting, should be supported.

6. Evaluation of ACLs should be (system-defined) from most specific to least specific. Multiple group permissions should be ORed. (A sketch follows this list.)

7. Existing mechanisms should remain as is and be used as much as possible.

8. For new mechanisms which must be added, get/set operations should be used.

9. Named ACLs are not needed.

10. Default ACLs should be used along with a user-specifiable Use/No Use mechanism.

11. Default ACLs for IPC objects are not recommended.

12. Default ACLs should be set by directory.

13. Protection bits are not affected when a new object with default ACL is created unless an explicit mechanism is provided.
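
The evaluation rules in items 3 through 6 can be sketched as follows (illustrative only; the entry layout is an assumption). A user entry is most specific and decides by itself; otherwise the permissions of every matching group entry are ORed together, and the result is masked by the file's group class protection bits.

    #include <stdio.h>
    #include <string.h>

    #define PERM_R 4
    #define PERM_W 2
    #define PERM_X 1

    /* Each entry names a user XOR a group (item 3). */
    struct acl_entry {
        int  is_user;        /* 1 = user entry, 0 = group entry */
        char name[16];
        int  perms;          /* PERM_R | PERM_W | PERM_X */
    };

    static int acl_perms(const struct acl_entry *acl, int n,
                         const char *user, const char *const groups[],
                         int ngroups, int group_mask)
    {
        int i, g, perms = 0;

        for (i = 0; i < n; i++)          /* most specific first (item 6) */
            if (acl[i].is_user && strcmp(acl[i].name, user) == 0)
                return acl[i].perms & group_mask;

        for (i = 0; i < n; i++)          /* OR all matching groups (item 6) */
            for (g = 0; g < ngroups; g++)
                if (!acl[i].is_user && strcmp(acl[i].name, groups[g]) == 0)
                    perms |= acl[i].perms;

        return perms & group_mask;       /* masking (item 4) */
    }

    int main(void)
    {
        struct acl_entry acl[] = {
            { 1, "smith", PERM_R },
            { 0, "staff", PERM_R | PERM_W },
            { 0, "proj",  PERM_X },
        };
        const char *groups[] = { "staff", "proj" };
        printf("effective permissions: %o\n",
               acl_perms(acl, 3, "jones", groups, 2,
                         PERM_R | PERM_W | PERM_X));
        return 0;
    }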


Document Number: NCSC-TG-021 Version 1

Title: Trusted Database Management System Interpretation

Color: Lilac

Date: APR 91

Highlights: Because the Orange Book was developed with an atomic system in mind, the advances in computer architectures and designs have led to discontinuities in the interpretation of the Orange Book requirements. This document covers evaluation by parts and how to deal with the evaluation of trusted systems which are composed of multiple pieces which individually enforce policy.

1. Conditions for Evaluating by Parts

a. TCB subsets are identified.

b. System policy allocated to TCB subsets.

c. Each TCB subset includes all trusted subjects with respect to its policy.

d. Each TCB subset's structure or architecture is explicitly defined.

e. Each TCB subset occupies distinct subset-domains.

f. More primitive TCB subsets support the less primitive TCB subsets.

2. Local/Global Requirements: Some requirements can be evaluated completely within the TCB subset in question (Local), while others deal with interrelations between TCB subsets (Global). The differentiated requirements follow:

a. Local Requirements
1) Discretionary Access Control (DAC).

2) Object reuse.

3) Labels (except Subject Sensitivity Labels).

4) Mandatory Access Control (MAC).

5) System Architecture (except domains and address spaces).

6) System Integrity.

7) Configuration Management.

8) Security Features User's Guide (SFUG).

9) Design Documentation:

a) Models.

b) DTLSs.

c) FTLSs.

d) Non-FTLS internals.

b. Global Requirements:

1) Subject Sensitivity Labels.

2) Identification and Authentication.

3) Trusted Path.

4) Audit.

5) System Architecture:

a) Domains of Execution.

b) Distinct Address Spaces.

6) Covert Channel Analysis.

7) Trusted Facility Management.

8) Security Testing.

9) Design Specification and Verification:

a) Correspondence between System Policy and the set of TCB Subset Models.

b) Consistency of the TCB Interface between TCB Subset DTLSs.

c) Consistency of the TCB Interface between TCB Subset FTLSs.

10) Trusted Distribution.

11) Trusted Facility Manual.

12) Test Documentation.

13) Design Documentation (Other than the Local Requirements.)

3. Interpretation of the Orange Book Requirements: Some of the requirements from the Orange Book are interpreted for the context of this document. Those that are reinterpreted are listed below. (Others which did not require reinterpretation are not listed here.)

a. Security Policy - Labels
General: DBMS objects which require labels include:
a) Stored View Definitions

b) Files

c) Records

d) Relations

e) Tuples

f) Directories

g) Schemas

h) Indices

i) Data Dictionaries

j) Discretionary Authorization Tables

k) Recovery Logs

l) Transaction Logs

B2: Internal variables within the TCB which are not accessible to untrusted subjects do not need to be labelled, but care should be taken to ensure that internal variables are not usable as covert channels through their visible effects on system behavior.

b. Accountability - Audit

General: The emphasis in audit is to provide individual accountability of a user's actions. Access (or attempted access) to the protected objects should be audited. Internal actions within the TCB subset (DBMS) need not be audited. If multiple audit logs are built, correlation methodologies must also be present.

C2: All mediated DAC accesses which are visible to the user must be auditable.

B1: All mediated DAC and MAC accesses which are visible to the user must be auditable.

c. Assurance - System Architecture: Each subset of the TCB must satisfy these system architecture requirements.

C2: Separate execution domain that protects it from interference and tampering.

B2: User interface to be fully defined and all elements of the TCB be identified.

B2: Effectively use available hardware to separate the protection-critical portions.

d. Life Cycle Assurance - Design Specification and Verification

B1: An informal argument must be given that the set of formal or informal policy models represents the security policy supported by the composite TCB.

B2: An informal argument must be given that the set of formal policy models represents the security policy supported by the composite TCB.

B2: Descriptive Top Level Specifications (DTLSs) must be maintained for each TCB subset.

B2: DTLS interface descriptions must cover the interfaces between TCB subsets as well as the external user interface.

B3: For a convincing argument that the DTLSs are consistent with the Model, MAC and DAC access checks at state transitions and visual checking for exceptions must be covered for each subset.

A1: Formal Top Level Specifications must be maintained for each TCB subset and must cover all facets of the policy (MAC, DAC, etc.).

A1: Each TCB subset's FTLS must cover the user interface as well as the interfaces with other subsets. In addition, there should be a description of how the FTLSs together accurately describe the total TCB.

A1: The FTLSs must be shown by formal and informal methods to be consistent with the Model; MAC and DAC access checks at state transitions and visual checking for exceptions must be covered for each subset.

e. Documentation - Design Documentation

C1: Intermodular interfaces within and between TCB subsets must be described.

B1: The protection mechanisms for each TCB subset must be described and be shown to satisfy the model. The protection mechanisms must include the mechanisms which support the subset structure and separate subset-domains.

B2: Intermodular interfaces within and between TCB subsets must be described.

B2: DTLS interface descriptions must cover the interfaces between TCB subsets as well as the external user interface. In addition, there should be a description of how the DTLSs together accurately describe the total TCB.

B2: Each TCB subset must describe how it implements the reference monitor concept within its own technical policy. In addition, there must be a documented informal argument that the cooperative actions of the TCB subsets make the TCB tamper resistant, non-bypassable, and correct.

B2: The documentation for each TCB subset must describe how the subset is structured to facilitate testing and enforce least privilege.

B3: Each TCB subset must be informally shown to be consistent with the DTLS.

A1: Each TCB subset must be informally shown to be consistent with the FTLS.


Document Number: NCSC-TG-025 Version-2

(Supersedes: CSC-STD-005-85)

Title: Data Remanence in Automated Information Systems

Color: Green

Date: SEP 91

Highlights: This is an important topic but not covered by the evaluation process. The document is guidance only and is not meant to replace policy on the issue. The Degausser Product List (DPL) is included in the NSA's Information Systems Security Products and Services Catalogue available through the U.S. Government Printing Office.

1. Standard Clearing / Purging Methods

a. Overwrite: The media is rewritten one or more times with unclassified data and/or the bit patterns 0011 0101, 1100 1010, 1001 0111 (HEX: 35, CA, 97) followed by unclassified data. This must be written to all memory locations, and the number of repetitions required is based upon the various considerations listed. Disk exercisers have many advantages for overwrite, including ignoring "bad" sectors and variable write frequencies. (A sketch of a single overwrite pass follows this list.)

b. Degaussing: The media is subjected to a magnetic field to randomly align the memory locations. The DPL lists tested degaussing products. Type I Degaussers are generally sufficient for Type I Media and Type II Degaussers are generally sufficient for Types I & II Media. No Degaussers have been found to be sufficient for Type III Media. (If the degausser is malfunctioning or not used properly, the results will be unsatisfactory.)

c. Destruction: The recommended destruction methods will remove the risk of compromise of sensitive data but they do tend to be a bit hard on the media and can be a bit expensive.

1) Smelting, Disintegration, or Pulverization

2) Incineration

3) Emery Wheel or Disk Sander

4) Concentrated Hydroiodic Acid (HI) for Gamma Ferric Oxide Disk Surfaces

5) Dubais Race A and Dubais Race B followed by Acetone
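
The overwrite method in 1.a can be sketched as below (illustrative only). A real purge must follow policy for the number of repetitions and must reach spared or "bad" sectors, which ordinary file I/O cannot guarantee; hence the note above about disk exercisers.

    #include <stdio.h>
    #include <string.h>

    /* Illustrative overwrite pass: the pattern 0x35, its complement
       0xCA, then 0x97 are each written across every location of the
       file, followed (in a real purge) by unclassified data. */
    static int overwrite(const char *path, long size)
    {
        static const unsigned char patterns[] = { 0x35, 0xCA, 0x97 };
        unsigned char buf[512];
        size_t i;
        FILE *fp;

        for (i = 0; i < sizeof patterns; i++) {
            long done, n;
            if ((fp = fopen(path, "r+b")) == NULL)
                return -1;
            memset(buf, patterns[i], sizeof buf);
            for (done = 0; done < size; done += n) {
                n = size - done;
                if (n > (long)sizeof buf)
                    n = (long)sizeof buf;
                if (fwrite(buf, 1, (size_t)n, fp) != (size_t)n) {
                    fclose(fp);
                    return -1;
                }
            }
            fflush(fp);
            fclose(fp);
        }
        return 0;
    }

    int main(void)
    {
        /* Path and size are illustrative; a scratch file stands in for
           the storage object being cleared. */
        return overwrite("scratch.dat", 4096L) == 0 ? 0 : 1;
    }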

2. Considerations which should be observed with regards to data remanence include:

a. Destination of Released Media

b. Effects of Heat and Age

c. Equipment Failures

d. Storage Device Segments not Receptive to Overwrite

e. Overwrite Software Suitability for Clearing and Purging

f. Contractual Obligations

g. Maintenance

h. Data Sensitivity

i. Degaussing Failures

3. Approved Procedures for Various Media:

a. Magnetic tape: Clear - Type I Degausser; Purge - appropriate Degausser.

b. Hard Disks: Overwriting and degaussing for Clear and Purge.

c. Magnetic Drums: Overwriting and degaussing for Clear and Purge. Degaussing by approved hand held magnet.

d. Floppy Disks/Cards: Overwriting for Clearing. Type I Degausser for Purge.

e. Magnetic Core: Overwriting and Type I Degaussing for Clear and Purge.

f. Plated Wire Memory: Overwriting and Type I Degaussing for Clear and Purge.

g. Thin Film Memory: Overwriting and Type I Degaussing for Clear and Purge.

h. Bubble Memory: Overwriting and Type I Degaussing for Clear and Purge.

i. RAM: Overwrite and Removal of Power for Clear and Purge.

j. ROM: Destruction.

k. UVPROM: Ultraviolet erasure.

l. EEPROM: Overwriting for Clear or Purge.

m. Optical Disks: Destruction.

n. Ferromagnetic RAM: No Standards yet set. Overwrite should do for Clear.


Document Number: NCSC-TG-026 Version 1

Title: Security Features User's Guide

Color: Fluorescent Orange

Date: SEP 91

Highlights: "A single summary, chapter, or manual in user documentation shall describe the protection mechanisms provided by the TCB, guidelines on their use, and how they interact with one another." Orange Book.

1. The user is the audience (who may or may not be a computer person).

2. A separate manual is usually the best choice, especially in the higher classes.

3. Presentation:

a. Information needed by the user to securely operate the system.

b. Explain the user's role in the security of the system.

4. Content: Organized either feature-oriented or task-based. (Feature-oriented preferred.)

a. Example Task-Oriented Security Features User's Guide (SFUG)

1. Introduction to the SFUG

2. System Security Overview

2.1 System Philosophy of Protection

2.2 Definition of Terms and Conditions

2.3 The Information System Security Officer (ISSO)

2.4 User Security Responsibilities

3. Security-Related Commands for Users

3.1 System Access

3.1.1 Session Initiation

3.1.2 Changing the Session Profile

3.1.3 Changing the User Profile

3.1.4 Potential Access Problems and Solutions

3.2 Access Control facilities

3.3 Protecting Removable Objects

3.4 Logging Security-Relevant Events

b. Example Feature-Oriented Security Features User's Guide (SFUG)
1. Introduction to the SFUG

2. System Security Overview

2.1 System Philosophy of Protection

2.2 Definition of Terms and Conditions

2.3 The Information System Security Officer (ISSO)

2.4 User Security Responsibilities

3. Security Related Commands for Users

3.1 User Identification and Authentication
3.1.1 Trusted Path

3.1.2 Logging On to the System

3.1.3 Password Considerations

3.1.4 Changing Group Memberships

3.1.5 Changing Current MAC Authorizations

3.1.6 Logging Off the System

3.1.7 I&A Errors and Their Causes

3.2 Discretionary Access Control Facilities

3.2.1 Setting DAC on Named Objects

3.2.2 Default DAC Protection

3.2.3 DAC Groups

3.2.4 DAC Errors and Their Causes

3.3 Mandatory Access Control Facilities

3.3.1 Printing Labelled Objects

3.3.2 Accessing Single-Level Devices

3.3.3 Accessing Multilevel Devices

3.3.4 Downgrading Labelled Objects

3.3.5 MAC Errors and Their Causes

3.4 Object Manipulation Facilities

3.4.1 Object Creation, Reuse, and Deletion

3.4.2 Importing Machine-Readable Objects

3.4.3 Exporting Machine-Readable Objects

3.4.4 Determining the Properties of Objects

3.4.5 Object Manipulation Errors and Their Causes


Document Number: C Technical Report 79-91

Title: Integrity in Automated Information Systems

Color: Yellow

Date: SEP 91

Highlights: Integrity is the "other half" of COMPUSEC. The primary thrust of the Orange Book and the rest of the Rainbow Series is confidentiality of sensitive data. Integrity is concerned with protecting the data and the system from improper modification.

1. Integrity Goals

a. Preventing unauthorized users from making modifications.

b. Maintaining internal and external consistency.

c. Preventing authorized users from making improper modifications.

2. Integrity Principles

a. Identity

b. Constraints

c. Obligation

d. Accountability

e. Authorization

f. Least Privilege

g. Separation

h. Monitoring

i. Alarms

j. Non-Reversible Actions

k. Reversible Actions

l. Redundancy

m. Minimization

1) Variable Minimization

2) Data Minimization

3) Target Value Minimization

4) Access Time Minimization

n. Routine Variation

o. Elimination of Concealment

p. Access Deterrence

3. Integrity Policies and Mechanisms

a. Policy of Identification and Authentication
1) Policy of User Identification and Authentication

2) Policy of Originating Device Authentication

3) Policy of Object Identification and Authentication

b. Policy of Authorized Actions

1) Policy of Conditional Authorization
a) Conditional Enabling

b) Value Checks

2) Policy of Separation of Duties

a) Rotation of Duties

b) Supervisory Control

c) N-Person Control

d) Process Sequencing

c. Policy of Separation of Resources

1) Policy of Address Space Separation
a) Descriptors

b) Separation of Name Spaces

2) Policy of Encapsulation

a) Abstract Data Types

b) Strong Typing

c) Domains

d) Actors

e) Message Passing

f) Data Movement Primitives

g) Gates

3) Policy of Access Control

a) Capabilities

b) Access Control Lists

c) Access Control Triples

d) Labels

d. Policy of Fault Tolerance

1) Policy of Summary Integrity Checks (a checksum sketch follows this list)

a) Transmittal Lists

b) Checksums

c) Cryptographic Checksums

d) Chained Checksums

2) Policy of Error Correction

a) Duplication Protocols

b) Handshaking Protocols

c) Error-Correcting Codes
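
To make the summary integrity check policies above concrete, here is a minimal Python sketch, illustrative only and not from the report; a modern hash function stands in for whatever checksum algorithm a real system would use:

import hashlib

def cryptographic_checksum(record: bytes) -> str:
    # A one-way hash stands in for a keyed cryptographic checksum.
    return hashlib.sha256(record).hexdigest()

def chained_checksums(records: list[bytes]) -> list[str]:
    # Each checksum covers the record plus the previous checksum, so any
    # modification, insertion, deletion, or reordering of records breaks
    # every checksum from that point forward.
    chain = []
    previous = ""
    for record in records:
        digest = hashlib.sha256(previous.encode() + record).hexdigest()
        chain.append(digest)
        previous = digest
    return chain

# Example: the receiver recomputes the chain over a transmittal list and
# its items and compares it against the transmitted chain.
batch = [b"transmittal list: 2 items", b"item 1", b"item 2"]
print(chained_checksums(batch))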

4. Separation Policies

a. Hierarchical Levels

b. Non-Hierarchical Categories

c. Access Control Triples

d. Protected Subsystems

e. Digital Signatures / Encryption

f. Combination of Capabilities and ACLs

5. General Integrity Principles

a. Traditional Design Principles
1) Economy of Mechanism

2) Fail-Safe Defaults

3) Complete Mediation

4) Open Design

5) Separation of Privilege

6) Least Privilege

7) Least Common Mechanism

8) Psychological Acceptability

b. Additional Design Principles

1) Work Factor

2) Compromise Recording

c. Functional Control Levels

1) Unprotected Systems

2) All-Or-Nothing Systems

3) Controlled Sharing

4) User-Programmed Sharing Controls

5) Labelling Information


Document Number: NTISSAM COMPUSEC/1-87

Title: Advisory Memorandum on Office Automation Security

Date: 16 JAN 87

Highlights

User Responsibilities

1. Magnetic media must be protected at the level of the most restrictive data which has been written to the media.

2. Removable magnetic media should be used when data at several levels is to be processed on a single system.

3. Each user of the system should:

a. Know who the security officer for the system is.

b. Be aware of, and follow, the security guidelines for the system.

c. Report compromise and/or theft of data and equipment.

d. Only use authorized software.

4. Physical access to the system should be limited to those individuals who have need-to-know and authorization for all data stored and/or being processed on the system.

5. Screens, printers, etc. should be oriented away from doors and windows.

6. Running systems should not be left unattended, especially with output visible.

7. Electronic labels should not be trusted except in NCSC-evaluated B1 (or higher) systems.

8. Physical media should be physically marked.

9. Printers should:

a. Not be left unattended when printing.

b. Have output removed at the earliest possible time.

c. Have outputs marked immediately.

d. Have ribbons removed and protected at the level of the data printed.

10. Remember TEMPEST.

11. Protect the hardware and media from food, drink, smoke, etc.

12. Manually review all output (intended for below the maximum level) to ensure proper classification.

13. To operate the system at a level lower than currently being used:

a. Remove all media.

b. Power off the system for at least one minute.

c. Reboot using a copy of the operating system at the proper level.

d. Reload applications software at the proper level.

14. Checklists for startup, shutdown, and downgrade should be kept near the system and used for the operations.

15. Manual audit trails should be fully filled out if used for the system.

16. Downgrade of fixed magnetic media must be in accordance with Department of Defense Magnetic Remanence Security Guideline, CSC-STD-005-85, Dark Blue Book.

17. Passwords, especially network access passwords, must not be shared, written, or stored in "Hot-Keys".

Security Officer Responsibilities

1. One person should be responsible for the security of the system. This person should:

a. Ensure system certification and accreditation.

b. Ensure User awareness of and compliance with security requirements.

c. Investigate violations and determine what happened.

d. Report violations to proper authorities.

e. Ensure configuration control is properly carried out.

f. Review audit logs for anomalies, if used.

g. Enforce downgrade procedures.


Document Number: MIL-STD 1785

Title: System Security Engineering Program Management Requirements

Date: 1 SEP 89

Highlights:

1. This standard does not bear on Computer Security directly; its primary focus is Physical Security. The basic plan over the course of the program lifecycle is sound, however, and if the specifics are not worried over, the standard can give good guidance for the development of a COMPUSEC Program or a total System Security Engineering Program. In particular, the DIDs which were produced in association with the standard are fairly good models from which to work the appropriate COMPUSEC DIDs.

2. The System Security Engineering (SSE) Management Program should:

a. Enhance the operational readiness and mission success of the defense resource.

b. Identify and reduce potential vulnerabilities to security threats.

c. Provide management information essential to system security planning.

d. Minimize its own impact on program cost and schedule.

3. The SSE requirements carry through each program lifecycle phase in the following way:

a. Concept Exploration Phase:
1) Develop the System Security Management Plan (SSMP). [DI-MISC-80839]

2) Threat definition and analysis.

3) Develop Preliminary System Security Concept (PSSC). [DI-MISC-80840]

4) Define security requirements.

5) Assess technology and perform cost studies.

6) Prepare a Logistics Support Plan (LSP). [DI-S-1817]

7) Identify security training requirements.

8) Develop a Reliability and Maintainability (R&M) program for the system.

9) Conduct a Preliminary Security Vulnerability Analysis (SVA). [DI-MISC-80841]

10) Prepare Security Classification Guide (SCG).

b. Demonstration and Validation Phase:

1) Conduct Adversary Mission Analysis (AMA). [DI-MISC-80842]

2) Update and expand PSSC.

3) Review security regulatory requirements.

4) Update SVA.

5) Conduct security systems trade-off analysis.

6) Prepare preliminary specification inputs. [DI-S-30551B]

7) Identify manpower requirements associated with security systems deployment.

c. Full-Scale Development Phase:

1) Define System Security Requirements.

2) Update SSMP.

3) Develop subsystem and interface specifications.

4) Perform preliminary design of the major subsystems in the security system.

5) Perform subsystem verification analysis.

6) Perform subsystem and system response analysis.

d. Production and Deployment Phase:

1) Monitor and support acceptance testing.

2) Monitor and analyze initial security systems training for adequacy.

3) Support Program Management Responsibility Transfer (PMRT).

4) Provide product security.


Document Number: DRS-2600-5502-86

Title: System High and Compartmented Mode Workstations

Date: MAY 86

Highlights

1. This document gives the DIA's views on the topics covered for NSA by the Orange Book. The two documents are not in complete agreement, but the differences are more in the approach taken than in the goal sought. The largest difference is in the grouping of requirements: rather than the six classes (in three divisions) used in the Orange Book, only two groupings of requirements are made, and they are based on two of the allowed modes of operation, System High and Compartmented. The requirements are well defined in this document, with the level of requirement and the alternatives identified in the requirement identifier.

2. System High Workstation Requirements:

SH1 Discretionary Access Control (DAC)

SH1.1a Access Control Lists (ACL)

SH1.1b Self/Group/Public Permissions (Protection Bits)

SH2 Object Reuse

SH2.1 Object Reuse

SH3 Labels

SH3.1a Floating Labels
SH3.1a.1 Label Contents

SH3.1a.2 Process Data Sensitivity Level (PDSL)

SH3.1a.3 Window Labels

SH3.1a.4 Interwindow Data Moves

SH3.1a.5 Input Labels

SH3.1a.6 File Labels

SH3.1a.7 Printed Output Labeling

SH3.1a.8 Network Output Labeling

SH3.1a.9 Imported Data Labeling

SH3.1b Stored Labels

SH3.1b.1 Label Contents

SH3.1b.2 Window Labels

SH3.1b.3 Interwindow Data Moves

SH3.1b.4 Input Labels

SH3.1b.5 File Labels

SH3.1b.6 Printed Output Labeling

SH3.1b.7 Network Output Labeling

SH3.1b.8 Imported Data Labeling

SH3.1c Export Labels

SH3.1c.1 Label Contents

SH3.1c.2 Window Labels

SH3.1c.3 Interwindow Data Moves

SH3.1c.4 Input Labels

SH3.1c.5 Printed Output Labeling

SH3.1c.6 Network Output Labeling

SH3.1c.7 Imported Data Labeling

SH4 Mandatory Access Control (MAC)

SH5 User Identification and Authentication

SH5.1 Password Authentication

SH5.2a Local Authentication Data
SH5.2a.1a Protected Passwords

SH5.2a.1b Encrypted Passwords

SH5.2a.1c Protected Encrypted Passwords

SH5.2b External Authentication Data

SH5.3a Password Generation

SH5.3b Password Selection

SH6 Identification of User Terminal

SH7 Trusted Path

SH7.1 Trusted Path

SH8 Audit

SH8.1 Audit Data

SH8.2a Selective Collection

SH8.2b Selective Reduction

SH8.3a Local Data Storage

SH8.3b External Data Storage

SH9 System Architecture

SH9.1 System Architecture

SH10 System Integrity

SH10.1 System Integrity

SH11 Trusted Facility Management

SH11.1 Trusted Facility Management

SH12 Trusted Recovery - No Requirement

SH13 Security Testing

SH14 Design Specification and Verification - No Requirement

SH15 Configuration Management

SH16 Trusted Distribution

SH17 System Security Statement/Plan

SH18 Security Features Users Guide/Briefing

SH19 Trusted Facility Manual

SH20 Test Documentation

SH21 Design Documentation

SH22 Communications Security

SH23 Physical Security

SH24 TEMPEST

SH25 Personnel Security

SH26 Annual Accreditation

SH27 Protection Software

SH28 (No) Dial-up Lines

SH29 Access Authentication

3. Compartmented Mode Workstation Requirements:

CM1 Discretionary Access Control (DAC)
CM1.1a Access Control Lists (ACL)

CM1.1b Self/Group/Public Permissions (Protection Bits)

CM2 Object Reuse -SH-

CM2.1 Object Reuse -SH-

CM3 Labels

CM3.1 Label Contents

CM3.2 Process Data Sensitivity Level (PDSL) -SH-

CM3.3 Window Labels -SH-

CM3.4 Interwindow Data Moves

CM3.5 Input Labels -SH-

CM3.6 File Labels -SH-

CM3.7 Printed Output Labeling

CM3.8 Network Output Labeling

CM3.9 Imported Data Labeling -SH-

CM4 Mandatory Access Control (MAC)

CM4.1 Mandatory Access Control Levels
CM4.1.1 Process Levels

CM4.1.2 Window Levels

CM4.1.3 File Levels

CM4.2 Mandatory Access Control Policy

CM5 User Identification and Authentication

CM5.1 Password Authentication -SH-

CM5.2a Local Authentication Data

CM5.2a.1a Protected Passwords -SH-

CM5.2a.1b Encrypted Passwords -SH-

CM5.2a.1c Protected Encrypted Passwords -SH-

CM5.2b External Authentication Data

CM5.3a Password Generation -SH-

CM5.3b Password Selection -SH-

CM5.4 Maximum Security Level Determination

CM6 Identification of User Terminal -SH-

CM7 Trusted Path

CM7.1 Trusted Path

CM8 Audit -SH-

CM8.1 Audit Data -SH-

CM8.2a Selective Collection -SH-

CM8.2b Selective Reduction -SH-

CM8.3a Local Data Storage -SH-

CM8.3b External Data Storage -SH-

CM9 System Architecture

CM9.1 System Architecture

CM10 System Integrity -SH-

CM10.1 System Integrity -SH-

CM11 Trusted Facility Management -SH-

CM11.1 Trusted Facility Management -SH-

CM12 Trusted Recovery

CM13 Security Testing -SH-

CM14 Design Specification and Verification

CM15 Configuration Management -SH-

CM16 Trusted Distribution -SH-

CM17 System Security Statement/Plan -SH-

CM18 Security Features Users Guide/Briefing -SH-

CM19 Trusted Facility Manual -SH-

CM20 Test Documentation -SH-

CM21 Design Documentation

CM22 Communications Security -SH-

CM23 Physical Security -SH-

CM24 TEMPEST -SH-

CM25 Personnel Security

CM26 Annual Accreditation -SH-

CM27 Protection Software -SH-

CM28 (No) Dial-up Lines -SH-

CM29 Access Authentication -SH-

4. In the naming of the requirements (LLxx.ym.zn; a parsing sketch follows this list):

a. LL gives the Mode of Operation CM or SH.

b. xx is the general requirement number.

c. y is the specific requirement.

d. m is the option of the specific requirement.

e. z is the subrequirement.

f. n is the option of the subrequirement.
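
As an illustration of this scheme, the following Python sketch parses identifiers such as SH5.2a.1b into their parts. The regular expression is an assumption built from the description above, not something taken from the DIA document:

import re

# LLxx.ym.zn, e.g. SH5.2a.1b: mode, general requirement, specific
# requirement with option, subrequirement with option.
PATTERN = re.compile(
    r"^(?P<mode>SH|CM)"          # LL: mode of operation
    r"(?P<general>\d+)"          # xx: general requirement number
    r"(?:\.(?P<specific>\d+)(?P<spec_option>[a-z])?"   # y and m
    r"(?:\.(?P<sub>\d+)(?P<sub_option>[a-z])?)?)?$"    # z and n
)

def parse(identifier: str) -> dict:
    match = PATTERN.match(identifier)
    if match is None:
        raise ValueError(f"not a valid requirement identifier: {identifier}")
    return {k: v for k, v in match.groupdict().items() if v is not None}

print(parse("SH5.2a.1b"))
# {'mode': 'SH', 'general': '5', 'specific': '2',
#  'spec_option': 'a', 'sub': '1', 'sub_option': 'b'}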


Section IV:

Case Studies

IV. Case Studies

To help bring the other sections of this book into clear focus, this section gives four Case Studies of actual systems and their encounters with Computer Security. In order for the studies to cover as much ground as possible, the programs included span a wide variety of equipment types and levels of security. (The names have been changed to protect the innocent and guilty alike.)

Note: Gonculator translates roughly as Potrezebie.

A. The New Development Gonculator (NDG) is a current development with massive integration of the on-board computer systems and uses data with significant restrictions on its dissemination. The security features concentrate on limiting data access for personnel with other access to the aircraft.

B. Upgrade Gonculator (UG) CompuSec is for upgrades to an aircraft which is also an FMS favorite and thus is not always in the hands of US Nationals. The security measures concentrate on protecting US-only data and control code.

C. Communications Gonculator Group (CGG) is a highly portable communications network which processes and transmits data at SCI levels while operating in the System High mode of operations. The security features concentrate on blocking communications eavesdropping and sudden hardware (involuntary) turnover.

D. Ground Intelligence Gonculator System (GIGS) is a bunkered ground station for a reconnaissance program which processes and transmits intelligence data at various levels and flavors. The security functionality permeates the system; it tracks and mediates all access to data and equipment.

E. Yellow Book Evaluations: Each of the four Gonculator systems is evaluated per the Yellow Books' (CSC-STD-003-85 and CSC-STD-004-85) algorithms to determine the "required" Orange Book level for each system, both before and after changes to allow looser requirements.


A. NDG: New Development Gonculator

1. Threats: The threats to the NDG can be grouped into three main categories: threats related to the aircraft sitting on the flight line unattended, threats related to the aircraft coming down in an unscheduled fashion (crashes), and the personnel with some level of access to the NDG.

a. Flight Line: On the flight line the NDG is protected to some degree from overt attack by fences and guards, but these measures are not sufficient to protect an unattended aircraft from covert attempts to gain access to sensitive data held on the aircraft.

b. Fall Down: When the NDG is operating, all possible data must be assumed to be available at the time of a crash. The threat inherent in this situation is that the aircraft may retain the data after the event. If the event occurs in an area populated, or controlled, by Not So Friendly people, this will give the Not So Friendlies virtually unlimited time and access to derive from the aircraft any data which survived the event.

c. Personnel: Any time that people are involved with data (or data containers) there is some level of risk involved. This is fundamental to security in general and CompuSec as well (clearances, Need-To-Know, least privilege, etc.). In the case of the NDG, there are three levels of personnel with differing needs and differing treatments.

1) Pilot: In order to fly the aircraft to its full capabilities the pilot must have access to the sensitive data residing on the aircraft. However, the access which the pilot requires is not necessarily personal access; most of it is access by the computers on board the aircraft, with the pilot seeing only the results.

2) Special Maintenance: The maintenance personnel who are actively involved in the maintenance of the flight computers, sensor systems, data busses, etc. have a need for direct access to the data in order to diagnose problems, effect repairs on these systems, and load upgrades to the software and data stores.

3) Flight Line Maintenance: The other maintenance personnel (servicing the engine, fueling the NDG, etc.) don't need the data - just access to the NDG.

2. Data / Classified Material: Data and code stored and processed on the NDG run the gamut from unclassified through plain Vanilla Secret to Special Access Required (SAR) data. Not all of the data is needed by all of the personnel.

3. Implementation:

a. Layered Encryption: Data stored on-board the NDG is encrypted by level of data. The mission data cartridge contains the decryption data which allows the system to come to full capabilities while loading the mission data. In the event of failure of the cartridge, the pilot can key in the decryption data to allow the NDG to become fully functional. With the separate encryption by level, the pilot can generally gain access to what is needed without having the data accessible to those who have no need for the data. The maintenance personnel have decryption capabilities commensurate with their Need-To-Know and clearances.

b. Working System Without Decryption: In the event of total decryption failure, the NDG is still able to fly the mission. Some of the functionality will be impaired by the inaccessibility of portions of the data, but the ability to fly and fight will be present.

c. Security As a Compromise With the User: While the security measures taken for the NDG do not cover all of the threats fully, the security does cover the major problems relatively well and still allows the User to have a system which will fill the need. The compromise between mission need and security need is always a difficult call to make, but it should be remembered that a useless system (through over-securing) is not going to help the security situation either.

1) While on the flight line, the NDG has no data available for gathering by covert means which is not encrypted.

2) In the event that the NDG falls down, the RAM is purged by powering down the craft, either purposefully or by failure on collision; thus the only classified data remaining is in encrypted ROM.

3) The pilots are cleared for the data which is needed to operate the craft in a fully functional manner, and Identification and Authentication (I&A) is accomplished through either Ownership of the mission data cartridge or Knowledge of the code to key in for decryption. The special maintenance personnel accomplish I&A through various keying procedures and by duplicating, for test and diagnostic purposes, the pilots' I&A procedures. The other flight line personnel have neither the need nor the means to decrypt the data.

B. UG: Upgrade Gonculator

1. Threats: The threats to the UG can be grouped into three main categories: threats related to the aircraft sitting on the flight line unattended, threats related to the aircraft coming down in an unscheduled fashion (crashes), and the personnel with some level of authorized access to the UG. Each of these threat categories is discussed in the following paragraphs.

a. Flight Line: On the flight line the UG is protected to some degree from overt attack by fences and guards, but these measures are not sufficient to protect an unattended aircraft from covert attempts to gain access to sensitive data held on the aircraft. In addition to the usual problems associated with the flight lines on US bases, the UG is scheduled for sale through the FMS program to various foreign countries. The level of flight line physical security is subject to the policies and standards of the acquiring country - which may or may not always meet the US standards.

b. Fall Down: When the UG is operating, all possible data must be assumed to be available at the time of a crash. The threat inherent in this situation is that the aircraft may retain the data after the event. If the event occurs in an area populated, or controlled, by Not So Friendly people, this will give the Not So Friendlies virtually unlimited time and access to derive from the aircraft any data which survived the event.

c. Personnel: Any time that people are involved with data (or data containers) there is some level of risk involved. This is fundamental to security in general and CompuSec as well (clearances, Need-To-Know, least privilege, etc.). In the case of the UG, there are three levels of personnel with differing needs and differing treatments. The US personnel, the foreign personnel, and the contractor field engineers are discussed below.

1) US Personnel: The UG is used as a US fighter with US personnel operating and maintaining the system. With the data and code stored by the UG, clearing all of these personnel is not a problem.

2) Foreign Personnel: Oddly enough, the foreign nationals from the nations which buy the UG will also want access to the craft and its data. Unfortunately, some of the data is currently not releasable to non-US personnel.

3) Contractor Field Engineers: Along with the UG come the contractor field engineers who, among other things, load new versions of the Operational Flight Program (OFP) and other control code. These field engineers are cleared US citizens.

2. Data / Classified Material: The data and classified material related to the UG basically fall into the sets which are discussed in the following paragraphs:

a. Unclassified: The majority of the OFP, other code, and stored data is unclassified and there is no need to protect the data any more stringently than the aircraft will already be protected as the high-value item that it is.

b. US - only Weapons and Control Code: Some of the weapons which are available for use on the UG system are currently limited to US-only use. There is, however, great pressure to sell the FMS customers the "same" system which we use - including the US-only weapons and control code.

3. Implementation: Because this is an UpGrade to an existing system, the options are not as open as they would be in a new development. The methods previously employed, such as distributing just the object code (not the source code), have not stood the test of reverse-engineering. The contractor field engineers will continue to load the new versions of the OFP, etc., and additionally the following steps are being explored and taken as possible:

a. Downgrading: Where possible, the dissemination-limited data is being downgraded to allow for release to the FMS customers. This entails careful study and evaluation of the subject material and determining actual risk involved with the downgrade.

b. Encryption: Classified data is being encrypted for storage on-board the aircraft. This allows for shortened pre-flight preparation while limiting the risk of compromise of the data.

c. Don't Disperse Weapons: Those weapons subsystems which are not available for the downgrading effort above will not be a part of the FMS purchase packages. This does not, however, directly address the question of the control code for the weapons - just the weapons themselves.

d. Modularize and Excise: The control code for non-distributed weapons subsystems is to be "non-distributed" as well. As a part of the UpGrade, a new mission computer system is being installed with an accompanying software rewrite. During the rewrite, the code will be modularized, with permanent "stub" replacements written for those portions of the code which cannot be released to non-US personnel. The stubbed version will be compiled for FMS distribution and the un-stubbed version will be used for US distribution.

C. CGG: Communications Gonculator Group

1. Threats: The threats to the data residing on CGG and transmitted by CGG fall into three main categories: physical overrun by Not So Friendly troops, Eavesdropping on the communications transmitted by CGG, and the personnel actually using CGG.

a. OverRun: A goodly portion of the CGG terminals are used by tactical ground forces in relatively forward areas. In addition, the field terminals are portable and thus even more susceptible to hostile actions. In the event that a live terminal were to be overrun (without opportunity to clear/destroy the equipment), the terminal would continue to send and receive, encrypt and decrypt, etc. until the network is informed, via other channels, of the overrun. This could lead to serious compromise of sensitive data and compromise the integrity of the CGG network through spurious messages sent from the captured CGG unit.

b. Eavesdrop: Being a communications system, the CGG is vulnerable to interception of transmissions. The communications are via radio, leaving no physical signs of the interception having been made. Additionally, the far forward field terminals are vulnerable to discovery when transmitting.

c. Personnel: All personnel with authorized access to the CGG network have clearances and formal authorizations for all data on the net. [This simplifies the personnel aspects of the system because partial screening of portions of the data from portions of the personnel (like NDG and UG required) is not needed.]

2. Data / Classified Material: All data on and in CGG is treated at the level of the highest data allowed. The data falls into two main groups which are discussed below.

a. Messages: Messages, both received and to be sent, are stored on the system as space allows. These messages must be protected at the highest level allowed on the system (even if they are just setting up dinner on Friday) because the system cannot differentiate between the (possibly) various levels of data within the system.

b. Stored Crypto Material: In order to facilitate orderly key change operations, the CGG stores keying material for the next change and further into the future as needed by operational requirements in the field. If this crypto material is not protected, the CGG network security is compromised for an extended time period (until all of the stored keying material has been replaced over the entire net, which could take a while with units in the field and net security compromised.)

3. Implementation:

a. Separate Encryption of Header and Message: In order to ensure that the messages only go to the intended recipients, the header and body of the messages are encrypted separately. The message header is decrypted (transparent to the operator) for all messages, and only those messages which are intended for that particular terminal are made visible to the operator of the terminal. With both header and message encrypted and all messages going to each terminal, there is no unencrypted data available for analysis, and analysis of the transmission patterns is limited.
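
A minimal Python sketch of the scheme just described, assuming the cryptography package's Fernet as a stand-in for the actual cryptographic equipment; the key handling, terminal names, and header format here are invented for illustration:

from cryptography.fernet import Fernet

NET_KEY = Fernet.generate_key()  # one net-wide key, purely for illustration
header_crypt = Fernet(NET_KEY)
body_crypt = Fernet(NET_KEY)

def send(recipient: str, body: str):
    # Header and body are encrypted separately and travel together.
    return (header_crypt.encrypt(recipient.encode()),
            body_crypt.encrypt(body.encode()))

def receive(terminal_id: str, message):
    enc_header, enc_body = message
    # Every terminal decrypts the header, transparently to the operator...
    recipient = header_crypt.decrypt(enc_header).decode()
    if recipient != terminal_id:
        return None  # ...but only the addressed terminal shows the body.
    return body_crypt.decrypt(enc_body).decode()

msg = send("TERMINAL-7", "setting up dinner on Friday")
print(receive("TERMINAL-7", msg))  # body visible here
print(receive("TERMINAL-3", msg))  # None: not addressed to this terminal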

b. Security at Net Control: Primary security is performed at the network level by the addition of a MicroVAX workstation dedicated to security. It is within this MicroVAX that auditing of events, journalling of messages, and Identification and Authentication occur. All actions which take place over the net are audited by Terminal, User, and event type. Each message which is sent is journalled.

c. Security Down ≠ Net Down: In the event that the security work station becomes inoperable, the cabling can bypass the security processing and allow mission essential work to continue. During this time, the field terminals will still require their users to login, but the field terminal will accept the non-response from the (off-line) security workstation as a non-negative response and allow access even if passwords and such are incorrect. In addition, no event auditing or message journalling will take place until the workstation is back on line.

d. Security Operator ≠ Operator: Because the security is handled at the network level within an add-on MicroVAX workstation, the security personnel are separate from the user/operator personnel. While cleared at the same level as the users, the security personnel are given access to the MicroVAX and its functionality. The user password sets and the Security password sets are separate, and if one person is both a user and a security operator, that person will have two distinct IDs and passwords for the two distinct roles.

e. Clear Field Unit: The field unit has built-in clearing circuitry to clear all memory and all crypto keys. This can be operated either by the operator (in the event of imminent overrun) or by the security operator at the MicroVAX (in the event of unannounced overrun, etc.). These clearing actions are non-destructive and will only require rekeying and reentry of operational data (like frequencies) to become operational again. For those circumstances where the changing of hands is more certain and there is sufficient lead time, provisions have been made for the physical destruction of the field terminal.

D. GIGS: Ground Intelligence Gonculator System

1. Threats: Because GIGS is a bunkered system, the threats from without are greatly lessened by the physical security of the locale. The threats to GIGS revolve around the personnel and functionality of the system itself. These threat groupings are discussed in the paragraphs below.

a. Mixed Intel Types: While all users within GIGS are cleared to the same (high) level and all have formal access permissions for all of the data resident within the system, not all of the users have a verified Need-To-Know for all of the data. In particular, the two sets of intelligence analysts have little or no Need-To-Know for the data analyzed by the other set, with the exception of that data which is used synergistically to feed the collection schemes and build combined reports.

b. Bad Message Distribution: With the massive message processing capabilities of GIGS the possibility of mis-labelled, mis-routed, and mis-distributed messages looms large as a threat. With high volume message traffic, both incoming and outgoing, even a small probability flaw in the system can lead to substantial loss. With communications channels at varying classification levels there is an imperative need for the data to be well labelled so the computer can mediate the access to the communications port for each message sent.

c. Personnel: As with any situation involving people, a major threat source (of low probability) is the personnel. In the case of GIGS, with hundreds of personnel possessing high levels of access to the data, the risk is tangible. The personnel can be grouped as discussed below. (All personnel in both groupings are cleared to the level of, and have formal access permission to, all of the data.)

1) Two Sides of the Intelligence Community: The intelligence analysts on both sides of the house need to deal with the data itself. Both sides of the house guard their data closely from prying eyes outside their side of the house as well as the official Not So Friendly persons.

2) Comm and Maintenance: The communications and maintenance personnel do not need close access to the data itself, but the nature of their duties leads to close contact with the data as it flows along their comm channels and as it is diagnosed and tested during maintenance actions. In fact, the maintenance personnel have closer access than anyone because their actions naturally bypass the safeguards.

2. Classified Data / Material: The sensitive (classified) material in GIGS is at varying levels but the primary concern is the types of material. The primary data categories are discussed below. Each category contains data which may be at various levels of classification.

a. Messages: Incoming and outgoing messages resident in GIGS are classified at various levels from Unclassified But Sensitive through Top Secret compartmentalized data. In addition, the incoming messages must be sorted and distributed based on classification and data set. The volume of incoming precludes hand sorting of the messages in a timely manner and the volume of the outgoing messages precludes a Man-In-The-Loop implementation.

b. Keying Material: With many network connectivities and dedicated lines, GIGS has numerous crypto devices of various types. The keys for these devices must be protected to ensure integrity of the communications channels and availability of the system to the Users.

c. Intel Databases: Massive databases are stored and updated by GIGS to support the analysts' efforts. In addition, pertinent messages (both incoming and outgoing) are parsed, bent, folded, spindled, and mutilated to fit into the database as reference material. Bulk uploads from several agencies are fed to GIGS databases as available. These bulk uploads use various classification schemas and must be converted into GIGS format for classification and storage. The scale of the update effort is sufficient to require computer-driven reformatting and translation.

d. Security Data: The scope of the computer security functionality dictated massive data stores for the functionality. This data runs the gamut from Identification and Authentication (I&A) data, to system security configuration files, to audit trails and message journals. This data must be protected not only from prying eyes within the system (and without) but must also be protected from unauthorized modification, particularly the audit and journal data. This data is classified at the level of the source of the data, so pure Mandatory Access Controls (MAC) will not protect the data from within; for protection of the security data, the Discretionary Access Controls (DAC) must be the primary driver.

3. Implementation: The implementation of GIGS CompuSec is an integral part of the GIGS system as a whole. The DAC, MAC, and audit capabilities handle the primary burden, with ancillary Information System Security Officer (ISSO) functionality rounding out the package. Because the security functionality primarily operates in a newly developed segment of GIGS, the implementation was able to be built in, not added on. MAC, DAC, and audit are discussed below.

a. DAC: DAC is one of the least understood and defined areas in CompuSec. This lack of understanding and definition of the DAC arena gives more opportunity to adapt the security features to the conditions at hand. This also gives more opportunity for the DAC features to fail to meet the need. Care was taken with GIGS to ensure that the DAC functionality fit the situation and was sufficient.
1) Team Concept: Operationally, GIGS is manned by many people working in shifts of teams for a particular analysis type. The need for truly private data is essentially eliminated by the teaming and shift work. While there is still a need for personal data stores for preferences in setup and such, the data itself must be accessible to members of the team on all shifts.
a) User Groups: Access is granted by the DAC mechanisms based on the user group(s) in which the user has membership. The user groups determine access to sets of data, terminals, and peripherals. Each of the teams mentioned above is divided into one or more user groups for DAC purposes and individual users are granted membership to the appropriate user groups.

b) Personal Subdirectory: Personal data (work in progress, etc.) and personal preferences (screen layouts, open windows, standing queries, etc.) are stored in a personal subdirectory, with a name based on the UserID, which no one else has access to through ordinary channels.

2) Menu Environment: GIGS has a menu-driven DECWindows implementation with no access to the bare system. This allows the system to limit access to functionality simply by not offering the non-allowed functionality in the menus. GIGS operates on function groups which are made available to the user groups. At login, the I&A background data is checked for the user's function group memberships, based on the user's user group memberships, and menus are built which offer all of the user's allowed functions and no other functionality.

3) Bit Map Decision: The actual DAC decision mechanism is a simple ANDing of the bit pattern for a user's DAC user groups and the bit pattern of the object in question. If a non-zero answer results from the AND, access is granted, because some user group in the user's DAC label matched a user group in the object's DAC label.
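
A Python sketch of that decision, with hypothetical group-to-bit assignments (the actual GIGS groups and bit layout are not given in the text):

# Each DAC user group owns one bit; a DAC label is the OR of its groups.
GROUPS = {"imagery_team": 0b0001, "sigint_team": 0b0010,
          "comm": 0b0100, "isso": 0b1000}

def dac_label(*group_names: str) -> int:
    label = 0
    for name in group_names:
        label |= GROUPS[name]
    return label

def access_permitted(user_label: int, object_label: int) -> bool:
    # Non-zero AND means at least one group appears in both labels.
    return (user_label & object_label) != 0

analyst = dac_label("imagery_team")
report = dac_label("imagery_team", "isso")
print(access_permitted(analyst, report))             # True: shared group
print(access_permitted(analyst, dac_label("comm")))  # False: no overlap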

b. MAC: MAC is well understood in and of itself. It comprises, after all, the security requirements which govern the classification and dissemination of paper data as well. The problems come with implementing the requirements in a computer. The classification levels themselves work well because they are strictly hierarchical (U<C<S<TS). The compartments and code words, releasability, handling caveats, etc. add to the task of an orderly implementation because they are not always hierarchical. This leads to a lattice-based view of MAC, with partially ordered dominance being the basis of the MAC access decisions.

1) Analysis of Levels: For GIGS, the set of all meaningful components of the security lattice needed to be identified and the relations between the combined elements determined. While this sounds straightforward, the relations (especially between groups in the intel community) were sometimes not well-defined. To simplify matters, the lattice was divided into sublattices of like markings. These sublattices were then well defined and melded into the complete lattice as the relations were completed.

2) Master Lattice List: The physical implementation of the lattice took the form of a series of arrays which contained the representations of the lattice elements suitable to the various needs. These representations include the bit patterns for each element (grouped by sublattice), the standard English version of the label in full and short forms, and aliases for the standard version.

a) Bit Patterns: The combined lattice in bit form allows for simple access decisions because the patterns were allocated to express dominance as numeric superiority for the hierarchical portion, while a simple subset check handles the non-hierarchical portions.

Note: This was more than a simple bit setting, though; for instance, NATO Releasability is shown by turning bits off, not on, because no releasability commentary (positive or negative) is the same as NOFORN.
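
A simplified dominance check in Python, with hypothetical levels and compartments; as the note warns, the real bit encoding had wrinkles (such as the releasability bits-off convention) which this sketch ignores:

LEVELS = {"U": 0, "C": 1, "S": 2, "TS": 3}  # strictly hierarchical portion

def dominates(subject, obj):
    # Subject dominates object when its level is numerically superior (or
    # equal) and the object's compartments are a subset of the subject's.
    s_level, s_compartments = subject
    o_level, o_compartments = obj
    return (LEVELS[s_level] >= LEVELS[o_level]
            and o_compartments <= s_compartments)

analyst = ("TS", frozenset({"ALPHA", "BRAVO"}))
message = ("S", frozenset({"ALPHA"}))
print(dominates(analyst, message))  # True: TS >= S and {ALPHA} is a subset
print(dominates(message, analyst))  # False: partial order, not symmetric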

b) Aliases: Aliases include the forms of the label which the system is allowed to accept in incoming messages (abbreviations, typical misspellings, etc.), and the ISSO-defined form of the label to be used in various ways throughout the system (screen, window, icon, printed banner, message classification line, etc.). The standard English version of each lattice component is stored in long and short forms as FORTRAN Include files compiled into the code. The aliases are stored in tables readily alterable by the ISSO as needed.

c. Audit: The menu environment of GIGS makes auditing somewhat simpler than on a bare system. Because the actions available to the users are well defined and substantially limited, via the windowing process, the need to audit the lower-level actions is reduced and the actions can be more clearly identified in the audit trail.

1) Menu Selections and Errors: Essentially all of the audited actions are based on the menu selection made or on errors/non-allowed access attempts. Each of these event types can be selected to be audited or not, as the ISSO deems necessary, with usability of the audit trail, operational demands, and available storage and processing power taken into account. The TCB has table-driven functionality for this auditing feature and the ISSO can readily modify the tables.
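
A Python sketch of such table-driven selection, with hypothetical event names and classes:

# The ISSO can toggle auditing by event class or by individual event type;
# a per-event entry overrides its class setting.
AUDIT_CLASSES = {"menu_selection": True, "access_error": True}
AUDIT_EVENTS = {"open_window": False}   # per-event overrides

EVENT_CLASS = {"open_window": "menu_selection",
               "print_report": "menu_selection",
               "mac_denial": "access_error"}

def should_audit(event: str) -> bool:
    if event in AUDIT_EVENTS:
        return AUDIT_EVENTS[event]
    return AUDIT_CLASSES.get(EVENT_CLASS.get(event, ""), False)

print(should_audit("open_window"))  # False: individually switched off
print(should_audit("mac_denial"))   # True: its class is switched on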

2) Message Journal: Each message sent from GIGS or received by GIGS is stored in its entirety in the message journal. (This is in addition to the parsed versions which may go to the database and the Read File version which users access to read their incoming mail or review outgoing mail.) The only user group with Read access to the journal is the ISSO Group. The only write access available to the ISSO is a message delete function (well audited) for the removal of messages classified outside the GIGS lattice inadvertently received by GIGS.

3) Size: The sheer magnitude of the operation of GIGS leads to sizing problems for the audit trail - approximately 24 gigabytes a month for 24 hour operation with all possible auditable events being audited. The message journal is over and above this figure. Off-line storage is only a partial answer, because as the data set grows, it becomes increasingly difficult to access the needed data in the event of an incident. This leads to the ISSO auditing only those events which are most meaningful. This is made possible through ISSO tables, which allow each event type to be turned on and off at both the individual event level and as classes of events. The risk in doing business like this is that the incident may well make use of a not particularly security-relevant function in a manner not previously considered.


E. Yellow Books

Calculations for the "required" Orange Book class are performed per the algorithms set forth in CSC-STD-003-85 Computer Security Requirements and CSC-STD-004-85 Rationale Behind Computer Security Requirements. The digests of these two documents are on pages 112 and 114 of this book. (For simplicity, the algorithms used for this subsection are found in the digest of CSC-STD-003-85, pages 112 and 113 of this book.)
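
The arithmetic behind the subsections below can be sketched in a few lines of Python. The class tables are abridged to just the entries these case studies exercise, and the special cases (for example, the mode-dependent treatment when the Risk Index is 0, where C2 may be the prescribed minimum) are glossed over; see the digest of CSC-STD-003-85 for the full rules:

# Risk Index = Rmax - Rmin (floored at zero); the required class then comes
# from the environment tables of CSC-STD-003-85, abridged here.
OPEN_ENVIRONMENT = {0: "C1 or less", 2: "B2", 4: "A1"}
CLOSED_ENVIRONMENT = {0: "C1 or less", 2: "B2", 4: "B3"}

def risk_index(rmin: int, rmax: int) -> int:
    return max(rmax - rmin, 0)

def required_class(rmin: int, rmax: int, table: dict) -> str:
    return table[risk_index(rmin, rmax)]

# GIGS before the changes: Rmin = 3, Rmax = 7 gives Risk Index 4.
print(required_class(3, 7, OPEN_ENVIRONMENT))    # A1
print(required_class(3, 7, CLOSED_ENVIRONMENT))  # B3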

1. NDG: New Development Gonculator: Dedicated Mode of Operations

a. Initial Calculated Level
Rmin = 3 (Pilot, Special Maintenance, Flight Line Personnel)

Rmax = 5 (SAR data)

Risk Index = 2

Required Class = B2 (Open or Closed Development Environment)

b. I&A for Computer Access: This removes the flight line personnel from data access. (No longer users.)

c. Final Calculated Level

Rmin = 3 (Pilot and Special Maintenance)

Rmax = 3 (Categories aren't counted because all users have access.)

Risk Index = 0

Required Class = C1 or Less

2. UG: Upgrade Gonculator: Dedicated Mode of Operations

a. Initial Calculated Level
Rmin = 3 (Foreign Nationals)

Rmax = 5 (Various)

Risk Index = 2

Required Class = B2 (Open or Closed Development Environment)

b. Downgrade, Don't Disperse Weapons, Modularize and Excise

c. Final Calculated Level

Rmin = 3 (All Personnel)

Rmax = 3 (No Categories)

Risk Index = 0

Required Class = C1 or Less

3. CGG: Communication Gonculator Group

a. Calculated Level: System High Mode of Operations
Rmin = 6 (All Personnel TS, 1 Category)

Rmax = 6 (1 Category)

Risk Index = 0

Required Class = C1 or Less

4. GIGS

a. Initial Calculated Level: System High Mode of Operations
Rmin = 3 (Maintenance Personnel)

Rmax = 7 (2 or More Categories at TS)

Risk Index = 4

Required Class = A1 (for Open Development Environment)

B3 (for Closed Development Environment)

b. Clear all Personnel and Grant formal Access Permission for All Data

c. Final Calculated Level: System High Mode of Operations

Rmin = 7 (All Personnel TS, Multiple Categories)

Rmax = 7 (2 Categories at TS)

Risk Index = 0

Required Class = C2

Note: Agreement with the accrediting agencies resulted in B1ish requirements.


Section V:

Notes

Abbreviations and Acronyms

1C One Category

30DAC 30 Days After Contract Award

90DAC 90 Days After Contract Award

ACL Access Control List

AIS Automated Information Systems

AKA Also Known As

AMA Adversary Mission Analysis

ATM Automatic Teller Machine

AUD Audit

BI Background Investigation

BPOC Business Point of Contact

bps Bits Per Second

BSCS Bachelor of Science in Computer Science

C Confidential

CCB Configuration Control Board

CDR Critical Design Review

CDR+90 90 Days After CDR

CDRL Contract Data Requirements List

CGG Communication Gonculator Group

CM Configuration Management

CMS Code Management System

COMPUSEC Computer Security

COMSEC Communications Security

Con Ops Concept of Operations

COTS Commercial Off the Shelf

CPU Central Processing Unit

CSC Computer Security Center

CSMP Computer Security Management Plan

D&UG Dissemination and Use Group

DAA Designated Approving Authority

DAC Discretionary Access Control

DAL Data Accession List

DBMS Database Management System

DIA Defense Intelligence Agency

DID Data Item Description

DNA Deoxyribonucleic Acid

DoD Department of Defense

DTLS Descriptive Top-Level Specification

EEPROM Electrically Erasable Programmable Read Only Memory

EMD Engineering and Manufacturing Development

EMSEC Emanations Security

EPL Evaluated Product List

ETL Endorsed Tools List

FER Final Evaluation Report

FOC Full Operational Capability

FORTRAN Formula Translation

FSD Full Scale Development

FTLS Formal Top-Level Specification

GAWG Gonculator Accreditation Working Group

GIGS Ground Intelligence Gonculator System

HI Hydroiodic Acid

I&A Identification and Authentication

I/O Input / Output

ID Identification

IIV&V Internal IV&V

INFOSEC Information Security

IOC Initial Operating Capability

IPC Inter-Process Communication

ISSO Information System Security Officer

IV&V Independent Verification and Validation

KISS Keep It Simple, Stupid

LSP Logistics Support Plan

MAC Mandatory Access Control

MC Multiple Categories

MOA Memorandum of Agreement

MOR Memorandum of Record

MSCS Master of Science in Computer Science

NATO North Atlantic Treaty Organization

NCR National Cash Register

NCSC National Computer Security Center

NDG New Development Gonculator

NIST National Institute of Standards and Technology

NOFORN Not Releasable to Foreign Nationals

NSA National Security Agency

NSAD Network Security Architecture Design

NTCB Network TCB

NTISSAM National Telecommunications and Information Systems Security Advisory Memorandum

ObjectID Object Identification

OFP Operational Flight Program

OPSEC Operations Security

OR Object Reuse

P3I Pre-Planned Product Improvement

PC Personal Computer

PDR Preliminary Design Review

PDR-30 Thirty Days Before PDR

PDSL Process Data Sensitivity Level

PHIP PLATFORM Host Interface Processor

PM Program Management

PMRT Program Management Responsibility Transfer

PoP Philosophy of Protection

PSSC Preliminary System Security Concept

R&M Reliability and Maintainability

RAM Random Access Memory

RAMP RAting Maintenance Phase

RM-Plan Rating Maintenance Plan

RMax Maximum Data Sensitivity Rating

RMin Minimum User Clearance Rating

RMP Rating Maintenance Report

S Secret

SAR Special Access Required

SBI Special Background Investigation

SCCS Source Code Control System

SCG Security Classification Guide

SCI Sensitive Compartmented Information

SDR System Design Review

SDR-30 30 Days Before SDR

SFUG Security Features User's Guide

SH System High

SL Security Label

SOW Statement of Work

SRR System Requirements Review

SRR-30 30 Days Before SRR

SS System/Subsystem Specification

SSE System Security Engineering

SSEM SSE Management

SSEP SSE Program

SSMP System Security Management Plan

SSO System Security Officer

STD Standard

SVA Security Vulnerability Analysis

TAR Technical Assessment Report

TCB Trusted Computing Base

TCSEC Trusted Computer System Evaluation Criteria

TD Trusted Distribution

TerminalID Terminal Identification

TFM Trusted Facility Manual

TG Technical Guidance

TNI Trusted Network Interpretation

TNT Trusted Network Technology

TPOC Technical Point of Contact

TRIGS TR-1 Ground Station

TRUSIX Trusted UNIX Working Group

TS Top Secret

U Unclassified

UG UpGrade Gonculator

US United States

UserID User Identification

UV Ultraviolet

VR Vendor Report

VSA Vendor Security Analyst

WBS Work Breakdown Structure

XOR Exclusive Logical Or


Glossary

access modes: Types of interactions between subjects and objects which result in the flow of information between them. (Read, Write, etc.)

accountability: The ability to hold an individual responsible for the activities of that individual.

ACL: access control list: A list of users and/or groups associated with an object.

alphabet size: The number of unique components available for use in a "Password". (English letters -> 26, English letters and numbers -> 36, English 4, 5, and 6 letter words -> 23300)

archival storage: Off-line long term storage, often in the form of magnetic tape, often stored at a site other than the site generating the stored information. A requirement for old audit trail data.

assurance: The ability to sufficiently evaluate the system to ensure proper implementation of necessary safeguards to enforce the security policy.

audit trails: The stored audit records. (So you'll know who to take out and shoot.)

audit: Records kept of all security relevant activities on a system and the ability to draw from the records a clear picture of the actions taken and the user who took the actions.

authentication: The part of login concerned with the validity of the identity. (Are you - you?)

authorization: Formal permission to gain access to data. (SCI billets, SAR access, etc.)

baseline: In CM, the classic approach where at any given time there is a "version" of the system which is held pure until the CCB changes to the next baseline.

BPOC: business point of contact: The person at the NCSC responsible for dealing with the business aspects of the RAMP program of a particular vendor.

capabilities: DAC, access control based upon an unforgeable permission for access which comes with the subject.

cascading: In controlled mode networks where two networks connect with disparate ranges, the possibility that the lower-cleared users on the lower-ranged net will gain access to the higher-ranged data on the higher-ranged net.

centralized: DAC, access control where all control permissions rest with a centralized person.

clearing: The rendering of media not readily able to be read; single overwrite, etc. Sufficient for reuse of the media in a secure facility but not sufficient to leave the facility.

closed environment: Lifecycle environment for a system where the developers are cleared and authorized for the data to be processed (or at least to the Secret level) and CM is invoked to protect against malicious logic insertion throughout the lifecycle.

CM: configuration management: The management of the system configuration to ensure that the necessary changes are made without adversely affecting the system security.

Compartmented Mode: The mode of operation suited to the conditions prevalent when all users have clearances for all of the data but are not formally authorized access to all of the data.

compression/decompression: In audit trails, when standard conditions are given standard codes to relieve the massiveness of data flows and storage, with related decoding on retrieval. (Numeric codes for audit events vs. textual accounts, etc.)

computer security subsystem: Hardware/software add-on packages to handle (or expand upon) Audit, Identification and Authentication, Object Reuse, or Discretionary Access Control.

CCB: configuration control board: Group set up to determine the need and appropriateness of changes to the system configuration.

Configuration control: The CM task relating to evaluation and approval of changes to the CM baseline.

control objective: The points behind the security requirements.

control permission: In DAC, the ability to change the access permissions.

Controlled Mode: A limited multilevel mode of operation where limits are placed on the levels of data which can be processed.

COTS: commercial off the shelf: The development and production of COTS secure systems is the goal of the NCSC.

covert channel analysis: The systematic investigation for possible covert channels.

covert channels: The use of unintended data paths which are not under TCB control to pass data in a manner not allowed by the security policy.

covert storage channels: Covert channels which make use of system data stores.

covert timing channels: Covert channels which make use of modulation of system timing and resource utilization.

DAC: Discretionary Access Control: The control of subjects' accesses to objects based on the subjects' need-to-know.

declassification: The removal of all data from magnetic media to make the media unclassified.

Dedicated Mode: The mode of operations appropriate when all users have clearances, formal authorization, and need-to-know for all of the data in the system.

delete: DAC, access mode which allows the subject to delete data from the object.

denial of service: The undesirable condition where the authorized users are not able to use the system effectively.

design documentation: The compilation of documents which define the requirements and design for the secure system. Documentation requirements vary with the class.

device labels: Internal labels on peripheral device to determine the allowable levels of data to be processed.

Dockmaster: An NCSC sponsored and managed E-Mail and Bulletin Board system.

downgrade: The act of writing data to an object of lower classification than the subject.

DTLS: Descriptive Top-Level Specification: Top level specification for a secure system written in English or design notation.

electronic labels: Data sensitivity labels which are stored in the same (electronic) media as the data. The labels are required at B1 and above and should not be trusted below that level.

EPL: Evaluated Product List: NCSC list of secure systems which have been evaluated and the class assigned.

ETL: Endorsed Tools List: NCSC list of endorsed formal verification systems.

execute: DAC, access mode which allows subjects to run an executable object (program).

fixed magnetic media: Magnetic media which cannot be readily removed from the hardware.

FTLS: Formal Top-Level Specification: Top level specification for a secure system written in formal mathematical specification language.

group: DAC, a set of users with intersecting needs-to-know. AKA: User Group, Dissemination and Use Group, D&UG, etc.

guess rate: How often a failed login attempt is allowed to be reattempted.

hierarchical classifications: Classifications upon which greater than works. (U, C, S, TS)

human readable output: Computer Output which a person can read. (Screen, printer, etc.)

I&A: identification and authentication: The mechanism which allows the system to know who is on the system and who should not be on the system. (Login)

identification: The passing of the identity of the subject to the system as a part of I&A.

integrity policy: The portion of the security policy which deals with data corruption.

label integrity: The assurance that the data sensitivity label associated with (exported) data is correct, uncorrupted and the right one.

laissez-faire: DAC, control permission methodology where the allowed users of the data allow the data to be accessed by other users, based on their own determination of need-to-know.

least privilege: Principle which calls for the least available level of power to be used to perform needed actions. (Read not Read/Write, etc.)

limited access: A mode of operation roughly akin to Compartmented. Falling into disuse.

login: The sequence of events which allows a user to identify himself to the system and the system to verify the identity and right to access.

MAC: Mandatory Access Control: Access based on formal clearances and classifications.

magnetic media: Data storage based on magnetic field manipulation. (Tapes, floppies, etc.)

magnetic remanence: Traces of data left after clearing operations have been executed (and not present after declassification) on magnetic media.

marking: The labeling of data within the system for access control purposes.

maximum data sensitivity: In risk quantification, the maximum sensitivity of the data allowed to be processed, stored, or transmitted by the system.

minimum user clearance: In risk quantification, the maximum clearance level of the lowest cleared user allowed on the system.

mode of operation: The allowable conditions for the operation of a trusted system, including data sensitivities and user clearances.

multilevel devices: Computer devices which are allowed to operate on data of various classifications at the same time.

Multilevel Mode: The mode of operations used when not all of the users have formal clearances, authorizations, and needs-to-know for all of the data on the system.

NCSC: National Computer Security Center: Organization set up to foster computer security availability, especially in COTS systems. (Formerly DoD [NSA], presently NIST. [Sorta])

non-hierarchical classifications: Classifications without true rankings. (NOFORN, etc.)

non-repudiation: In networks, the receipted and guaranteed delivery of data.

open environment: Development environment with no Guarantees regarding malicious logic.

Operator: The person who runs the computer for system starts, shutdown, tape mounting, etc. (May or may not be a system user.)

OR: object reuse: The mechanisms which ensure that the new user of a storage object (buffer, disk, etc.) gains no knowledge of the data stored by the previous user.
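
A minimal sketch of the idea, using a toy buffer allocator; scrubbing on allocation is one illustrative way to satisfy the requirement.

    class BufferPool:
        """Toy allocator illustrating object reuse: every buffer is zeroed
        before being reissued, so the new holder learns nothing about what
        the previous holder stored there."""

        def __init__(self, count: int, size: int):
            self.free = [bytearray(size) for _ in range(count)]

        def allocate(self) -> bytearray:
            buf = self.free.pop()
            buf[:] = bytes(len(buf))    # scrub before handing it out
            return buf

        def release(self, buf: bytearray) -> None:
            self.free.append(buf)       # contents scrubbed on next allocate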

ownership: DAC, control permission methodology where the owners of the data allow the data to be accessed by users based on the owners' determination of need-to-know.

password: A character string or phrase used to authenticate the validity of a login identity.

PoP: Philosophy of Protection: Statement of the protection scheme for a system, in English.

profiles: DAC, access control based on a listing of objects available to the subject in question.

protection bits: DAC, access control based on Self, Group, and World associations.
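
This is the scheme familiar from UNIX file modes. A minimal sketch, with the usual read/write/execute bit values assumed for illustration:

    # Illustrative permission bits, Self/Group/World, as in UNIX file modes.
    R, W, X = 4, 2, 1

    def permitted(mode: int, want: int, is_self: bool, in_group: bool) -> bool:
        """mode packs three octal digits, Self/Group/World (e.g. 0o640).
        The check uses the most specific association that applies."""
        if is_self:
            bits = (mode >> 6) & 7
        elif in_group:
            bits = (mode >> 3) & 7
        else:
            bits = mode & 7
        return bits & want == want

    # With mode 0o640 the owner may read and write, group members may
    # only read, and the world gets nothing.
    print(permitted(0o640, R | W, True, False))    # True
    print(permitted(0o640, W, False, True))        # False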

RAMP: RAting Maintenance Phase: An NCSC program to allow future versions of evaluated products to be placed on the EPL without full new evaluation.

read: DAC, access mode which allows information to flow from the object to the subject.

reboot: Cycling the power of the system, especially a small system, to wipe all volatile memory before operating at a new classification level.

reference monitor: A mechanism which monitors all subject/object access attempts and allows those attempts which are proper.
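
A minimal sketch of the concept; the policy function and audit log are placeholders, the point being that every access attempt funnels through the one check.

    def reference_monitor(subject, obj, mode, policy, audit_log):
        """Mediate one access attempt: consult the policy, record the
        decision, and permit only those attempts which are proper."""
        allowed = policy(subject, obj, mode)
        audit_log.append((subject, obj, mode, allowed))
        if not allowed:
            raise PermissionError(f"{subject} may not {mode} {obj}")
        return True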

Risk Index: A quantitative measure of the risk on a system, based on the disparity between the user clearances and the data sensitivity.
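
A sketch of the computation, with illustrative clearance and sensitivity ratings; the official rating tables differ in their details.

    # Illustrative ratings, Uncleared/Unclassified through Top Secret.
    RATING = {"U": 0, "N": 1, "C": 2, "S": 3, "TS": 4}

    def risk_index(max_data_sensitivity: str, min_user_clearance: str) -> int:
        """The disparity between what the system holds and what its least
        cleared user is trusted with; never less than zero."""
        return max(0, RATING[max_data_sensitivity] - RATING[min_user_clearance])

    # A system holding TS data whose least cleared user holds only a C
    # clearance carries a Risk Index of 2 under these illustrative ratings.
    print(risk_index("TS", "C"))    # 2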

secrecy policy: The part of the security policy which deals with proper dissemination of data.

Secure Operator: Special operator who runs the security relevant portions of a secure system.

Security Administrator: The role in a secure facility concerned with overall computer security. In a lower class system (B1 or lower) this may be the only security role.

Security Features User's Guide: Single reference point for the description of the security features of the system and their use.

security level: The combination of hierarchical and non-hierarchical classifications which represents the sensitivity of the information.

security policy: A statement of the security rules to be enforced by a secure system.

security policy model: A model of the security policy which shows how the policy will be implemented for a secure system.

selective collection: In audit, the collection of only specified events and event types, to reduce the volume of data collected and so render its analysis easier. (See the sketch below.)

selective reduction: In audit, the selection of specified events and event types from those already collected, to render the analysis of the data easier.
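
The difference between selective collection and selective reduction is when the filtering happens, not how. A minimal sketch, assuming a simple stream of typed event records:

    # Selective collection: decide, as each event occurs, whether to record it.
    def selective_collection(event_stream, wanted_types, trail):
        for event in event_stream:
            if event["type"] in wanted_types:
                trail.append(event)    # unwanted events are never written

    # Selective reduction: everything was recorded; pull out only the
    # specified types afterwards, when analyzing the trail.
    def selective_reduction(trail, wanted_types):
        return [e for e in trail if e["type"] in wanted_types]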

sensitivity label: A label which represents the security level of the information.

single level devices: Peripheral devices which are allowed to handle only one level of data.

SL: Ill-defined acronym, used variously as 'Sensitivity Label', 'Security Level', 'Sensitivity Level', and 'Security Label'.

System High Mode: The mode of operations suitable for use when all users have clearances and formal authorizations for all data, but not all users have need-to-know for all of the data.

System Programmer: The role in a secure facility which is concerned with the programming aspects of the facility, including bit-level repairs.

TAR: Technical Assessment Report: The name for the report covering the evaluation of a formal verification system.

TCB: Trusted Computing Base: The mechanisms which enforce the security policy.

TD: Trusted Distribution: The distribution of the TCB, especially updates to the software, in such a way that the product delivered is the product which was sent.

threat: Anything which can cause harm to the system, including destruction, disclosure, or modification of the data or denial of service.

TPOC: Technical Point of Contact: The person at the NCSC responsible for dealing with the technical aspects of the RAMP program of a particular vendor.

Trusted Facility Manual: The manual which tells the security administrator about the trusted facility including cautionary remarks.

trusted identification forwarding: The method of inter-network I&A where the Home system tells the Visited system that the user is who he claims to be.

trusted path: A communications path between the user and the TCB which can be used for login, etc., and which cannot be faked.

trusted recovery: The ability to bring the system back from a crash or degraded mode with the secure state untarnished.

user: Person accessing the computer system.

UserID: The unique 'name' for a user recognized by the TCB for I&A purposes.

VR: Vendor Report: The vendor's input to the evaluation of formal verification systems.

VSA: Vendor Security Analyst: NCSC trained and recognized security person working for the vendor on a particular system in the RAMP program.

write access: DAC, the access mode which allows information flow in both directions between subject and object. Essentially, Read Access and Write-Append Access.

write-append access: DAC, the access mode which allows information to flow from the subject to the object but not from the object to the subject. (Write without Read.)
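
The read, write, and write-append entries differ only in the direction of information flow they permit. A minimal sketch of that distinction, taking the modes as this glossary defines them:

    # Direction of information flow permitted by each DAC access mode:
    # (object -> subject, subject -> object)
    FLOW = {
        "read":         (True,  False),
        "write-append": (False, True),    # write without read
        "write":        (True,  True),    # read plus write-append
        "execute":      (False, False),   # run the object; no direct flow
    }

    def may_observe(mode: str) -> bool:
        return FLOW[mode][0]

    def may_alter(mode: str) -> bool:
        return FLOW[mode][1]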

