STUDY TO RESEARCH EVALUATION EFFORTS OR TECHNIQUES RELATED AND/OR APPLICABLE TO HUMINT
Document Type:
Collection:
Document Number (FOIA) /ESDN (CREST):
CIA-RDP83M00171R001500070010-4
Release Decision:
RIFPUB
Original Classification:
K
Document Page Count:
54
Document Creation Date:
December 16, 2016
Document Release Date:
December 21, 2004
Sequence Number:
10
Case Number:
Publication Date:
March 25, 1980
Content Type:
REPORT
File:
CIA-RDP83M00171R001500070010-4.pdf (2.6 MB)
Body:
FINAL REPORT
Study to Research Evaluation Efforts or
Techniques Related and/or Applicable to HUMINT.
Prepared For: Central Tasking Staff/
HUMINT Tasking Office
Prepared By: Madden, Madden and Associates, Inc.
Management Systems Consultants
Bulsontown Road
Stony Point, New York 10980
(914) 786-2749
Contents
I. Executive Summary
A. General
B. Specific
II. Objective of Study - Introduction and Background
A. Research of evaluation efforts and techniques
B. Evaluation of HUMINT evaluation efforts and techniques
III. Contractor Qualifications
A. General
B. Related Experience
C. Education and Experience
IV. Conceptual Framework For Evaluation
A. Concepts and Terms
B. Quality
C. Efficiency
D. Effectiveness
E. Evaluation - A Management System
V. Research of Evaluation Efforts and Techniques
A. Intelligence Community (IC)
B. HUMINT Assessment - HTO(AB)
C. Non-Community Activities
D. Federal Policy on Evaluation
VI. Conclusions
A. Intelligence Community (IC)
B. HUMINT - HTO
C. Non-Intelligence Community
D. Federal Evaluation Policy
VII. Recommendations
A. HUMINT Assessment System (HAS)
B. Assessment Methodology (DDO Model).
C. HUMINT [HTO-(AB)] Evaluation - General
Appendices
A - HUMINT Evaluation Survey Form
B - Federal Policy on Evaluation (Extracts of Materials)
C - Evaluability Assessment (EA)
D - National Tasking and Response/HUMINT - System
Structure and Flow
E - HUMINT Response Assessment Objectives
Reference List - Materials used in study
I. Executive Summary
The Conclusions and Recommendations resulting from this study are
summarized in the following two sections.
A. General
1. The Intelligence Community must Task - as required by Executive Order.
2. Tasking without assessment is not workable. There would be no rational
grounds upon which management could base the necessary decisions regard-
ing the direction of resources to respond to Tasking received or to
undertake later retasking in view of results obtained or not obtained.
3. A HUMINT Assessment System (HAS) is necessary in order to support a
fully operational HUMINT Tasking System. There is, at present, no
assessment system that could support the level of tasking and retasking
expected within the next year.
4. However, the experimentation conducted during this special study has
established that the basics for a HUMINT Assessment System do exist
or could be quickly set in place. These basics are detailed in the
following section.
B. Specific
1. The Intelligence Community has in-hand the following with respect to a
system for assessment of HUMINT reporting in response to tasking (Tasking
Plans):
(a) A workable approach to assessment based upon the concept of providing
semi-centralized linkages and/or supplementation to existing
assessment activities and, where such activities do not exist,
providing the entire assessment effort necessary. This semi-
centralized approach makes use of existing assessment activities,
minimizes duplication of efforts and impacts on customer/consumer
time, coordinates and supplements on-going assessment activities,
and provides the necessary level of assessment where none currently
exists.
(b) A tested approach to assessment resulting from the experience
obtained by actual, successful use of the semi-centralized
approach in experimentation conducted over the past six (6) months
related to this study.
(c) An acceptable assessment methodology - the customer/consumer
interview model. The customer interview model has been tested and
refined through several years of use within the DDO. It is an
acceptable evaluation/assessment methodology which can still be
improved (see Recommendations). The methodology has been further
refined and successfully utilized during this special study on an
experimental basis.
2. The Intelligence Community needs to:
(a) Establish a system for the assessment of HUMINT reporting responding
to Tasking. In establishing this HUMINT Assessment System the
Community should build upon the tested approach (semi-centralized,
linkage) and methodology (customer interview) which are in-hand.
(b) Refine and further define the HUMINT Assessment System concepts
by completing a formal Evaluability Assessment of the system
(Appendix C). In this regard it should be noted that the concepts
upon which the HUMINT Assessment System should be developed have
already been partially explored during this study:
(i) existing HUMINT assessment activities representing potential
entities for linkage into the assessment system have been
identified and their level of evaluation activity determined
(Figure V-1).
(ii) the system structure and flow of National Tasking and Response/
HUMINT was outlined and measurement points, assessment
elements, and assessment actions were determined (Appendix D).
(iii) the objectives of HUMINT Response Assessment were refined and
set down in writing (Appendix E).
(c) Provide further evaluation/assessment support for:
(i) management decision making related to trade-offs, resource
reallocation, "doing more with less".
(ii) budgetary preparation and defense with quantification of
impacts and results related to resource levels.
(d) Continue refinement of the Customer Interview Model with specific
attention to establishing statistically supportable claims related
to the survey results (through random sampling techniques).
The Intelligence Community has a great deal of work ahead in developing the
role and contribution that assessment/evaluation can play in enhancing management
decision making and supporting Tasking. However, the basic approaches and
methodologies do exist and have been tested. The next step is to bring these
basic elements together into an assessment system. A good portion of the pre-
liminary work to establish such a system has been conducted during the course of
this experimental study. The remaining and recommended activities are seen as
plausible, workable, and positive contributions.
II. Objective of Study - Introduction and Background
The objectives of the study undertaken by Madden, Madden, and Associates,
Inc. (MMA) in conjunction with staff of the HUMINT Tasking Office - Assessment
Branch (HTO-AB) were: (a) to research evaluation efforts or techniques related
to or applicable to intelligence reporting evaluation; (b) to prepare a study
that would document and describe such efforts; and (c) if previously or currently
utilized, explore the reasons for their respective successes or failures.
The research effort would include non-Intelligence Community (IC)
departments, agencies, and/or units within its scope in an attempt to
discover and consider any evaluation efforts adaptable to IC use. Recom-
mendations regarding on-going and possible IC evaluation techniques, tools,
methods, and approaches would be made. The evaluation method(s) for a
specific, IC selected area of concern (HUMINT related) would be used as a
"demonstration" of the recommendations.
A. Research of evaluation efforts and techniques:
The study would identify and review current evaluation efforts
related to HUMINT in the Intelligence Community (IC). Included in the study
would be State, Commerce, and other appropriate agencies or departments outside
the Intelligence Community. The scope of this "search" would include Federal,
non-Federal, governmental, non-governmental, and/or other evaluation efforts
that may be applicable to IC evaluation. The search process would include
on-site visits and discussions as appropriate.
(1) A copy of existing documentation or guides for each evaluation
effort, if available, would be obtained and filed in a central
repository for access by IC evaluation (or interested) staff
as appropriate.
(2) Sample copies of evaluation reports and/or other outputs from
evaluation efforts would also be filed.
Note - An appropriate organizational location for such a
central file of information on evaluation would be
selected.
(3) The flow of current IC evaluation reporting [source to user(s)
of such reporting] would be identified and detailed in appro-
priate charts.
(4) Previous IC evaluation efforts would be searched for and, if
found, identified as to type of evaluation and pertinent
related data (who, where, when, extent of effort, use, users,
etc.). Existing documentation and samples, if available,
would be obtained and centrally stored.
The Contractor (MMA) would provide during and at the conclusion of
the special study project the following tangible products in written
and/or briefing or seminar format as indicated:
(1) Recommendation as to central file and repository for data
and documentation concerning evaluation (written, briefing
as warranted).
(2) Listing and identifying current, on-going evaluation efforts
(IC and relevant non-IC) - who, where, when, what, extent,
contact person, users, etc. The list would refer to
available documentation (description, examples, etc.) in
the central repository if such was on file (written).
(3) Listing of previous IC evaluation efforts identified during
this research.
(4) Description (narrative) and chart(s) of source through user(s)
of IC evaluation effort reporting.
B. Evaluation of HUMINT evaluation efforts and techniques:
The study would review and evaluate the previous and on-going
HUMINT evaluation efforts and techniques.
(1) This review would utilize the framework and concepts regarding
evaluation presented in the:
Seminar on Evaluating Human Source Intelligence,
11-12 December 1978, Center for the Study of
Intelligence, CIA.
Seminar on Tools and Techniques for Evaluating
Human Source Intelligence, 31 May 1979, Center
for the Study of Intelligence, CIA.
The conceptual framework for this review is presented in Part
IV (Conceptual Framework for Evaluation) of this report.
(2) The review would be a positive, constructive critique of the
previous and on-going evaluation efforts. It would provide:
(a) a description of the relationship to the framework
for evaluation developed in the above seminars
including comments as to variations from the frame-
work and evaluation as to the implications (+ or -)
of such variation.
(b) an analysis of the reasons for the respective success
and/or failure of the efforts; including "lessons to
be learned" and concepts that could be fruitfully
adapted from others. This would include IC and non-IC
evaluation efforts, techniques, and tools.
(c) specific recommendations for on-going efforts and
suggested organizational adoption of "others'" strengths.
The "other" organizational evaluation efforts considered
for adoption would be as wide ranging as possible - not
necessarily limited to IC, Federal, or governmental
evaluation approaches.
(d) recommendations for evaluation would be made for a specific
HUMINT reporting area (chosen by the IC) as a "demonstration"
of the recommended evaluation efforts.
III. Contractor Qualifications
A. General
Mr. Lynn P. Madden is Senior Consultant and President of Madden,
Madden, and Associates, Inc., a management consulting and training firm.
Madden, Madden and Associates, Inc. (MMA) has an extensive background in
providing training, evaluation, consultation, and organizational and
management development services to the public sector. Since its founding
in 1971, MMA has:
1. met all objectives of projects undertaken.
2. met all project schedules and deadlines.
3. never exceeded budgeted contract costs.
This is due to the continuous internal application of the management
concepts and techniques MMA makes available in services to clients.
Since 1971, Madden, Madden and Associates, Inc. has conducted
between 85 and 125 seminars per year for public sector organizations.
The seminars range from two (2) days to two (2) weeks in duration.
MMA specializes in consultation and training of upper and middle manage-
ment in government organizations, especially at the state and Federal
level. MMA conducts seminars and provides consulting services for the
states along the east coast, Federal Governmental Agencies, and the
United Nations both in New York and Geneva, Switzerland.
B. Related Experience
Assignments undertaken by Mr. Madden that are of direct relevance to
this project include:
1. Activity (Effectiveness) Assessment in Government -
a 2 day program for middle and upper level management
in the concepts and application of evaluation to
governmental activities. The program was developed
by Mr. Madden and has been conducted for:
(a) U.S. Civil Service Commission (OPM) -
Washington, D.C., Boston, St. Louis since
1974. Approximately 10-15 seminars per year
have been conducted - 6 are currently
scheduled for 1980.
(b) United Nations - Geneva, Switzerland since
1977, once per year for upper level UN
family (UNDP, ILO, ITU, UN-Secretariat, etc.)
management.
(c) New York State - Public Service Commission,
Department of Health, Department of Social
Services.
(d) State of Maryland - Department of Human
Resources, Management Development Center.
(e) Pennsylvania Department of Public Welfare -
Boards of Local Assistance.
2. Project Management - 2 to 5 day seminars for project/
program managers developed by MMA and conducted for:
(a) US Department of Transportation, Transportation
System Research Center, Cambridge, Massachusetts.
(b) US Naval Weapons Laboratory, Dahlgren, Virginia.
(c) US Air Force - MAC, Dover AFB.
(d) Department of Transportation State of Maryland.
(e) US Civil Service Commission (OPM) - Philadelphia,
Boston.
3. Operations Research Techniques - 2 to 4 day seminars
developed and conducted for:
(a) OPM - Philadelphia, Boston.
(b) New York State Civil Service Department.
(c) United Nations - Geneva, Switzerland.
4. Evaluation of Human Source Intelligence - a 2 day seminar
conducted at the Center for the Study of Intelligence
(11-12 December 1978). The seminar explored concepts of
evaluation applicable to HUMINT reporting and the
current status of reporting in the Intelligence Com-
munity. Participant reactions and recommendations
were set forth in a report on the seminar prepared by
the Center.
5. Tools and Techniques for Evaluating Human Source
Intelligence - a 1 day workshop conducted at the Center
for the Study of Intelligence (31 May 1979). The
workshop explored concepts, methods, and tools of
evaluation and their application to HUMINT reporting.
As a direct result of the above consulting and training seminars Mr.
Madden comes in contact with hundreds of management and professional staff
(from different agencies, organizations, and levels of government, private
sector firms, and academia) who have a concern with evaluation. Their concern
ranges from conceptual and theoretical interests to practical, day-to-day
applications as operating managers. In addition to this on-going contact
and interaction, Mr. Madden is actively involved in reading the literature
and maintaining a current professional working knowledge in the field of
evaluation, assessment of efficiency, and workload standards.
C. Education and Experience
Mr. Madden's formal education and previous experience are also relevant
to consultation on evaluation of intelligence reporting:
Education -
Bachelor of Electrical Engineering -
Syracuse University
Master of Science, Industrial Administration -
Union College, Schenectady, New York
Master of Business Administration, Public Systems -
Union College
Related
Experience - Engineer - International Business Machines with
project management responsibilities.
Analyst - US Army Reserves, 454 Military
Intelligence Detachment (6 years) -
working at the Pentagon, Arlington Hall,
and CIA Headquarters.
IV. Conceptual Framework For Evaluation
A. Concepts and Terms
Analysis = future oriented - which alternative/option to choose
and why?
Evaluation = present to past orientation - what have we got? How
have we done?
Evaluation is concerned with three related, but distinct aspects that
"build" upward in a hierarchical fashion: Quality, Efficiency, and
Effectiveness.
How these aspects relate can be shown in a general "model" of any activity
(field reporting, for example) as a flow processing system that "produces"
certain goods or services which are then used by end "users/consumers". The
consumer use results in certain impacts and/or end effects.
Inputs ($, people, resources) -> Process -> Outputs (goods/services)
-> Transformed (T) into -> Results/Effects (impacts of various types -
i.e., lower costs, avoided expenses, etc.)
B. Quality
Evaluation of Quality would concern itself with the Output(s):
definition of output - what is the unit "produced" (goods,
service, report)? How will I know an output unit? How will I
know a unit of reporting?
attributes of the output (implicit in the definition of the
unit is the concept of "quality" - what perceivable attributes
will be taken as making up an acceptable unit of output?). What
attributes define an item of reporting?
The attributes that represent "quality" of the output unit should be, but often
are not, as specific, measurable, and quantifiable as possible.
evaluation of quality requires that:
1. The output units (goods and/or services, reporting) be identified
and defined: "The manager of the decision unit should
include in a statement of objectives information relating
to both the services or products to be provided by the
decision unit and the contribution of that output to
achievement of the principal missions toward which the
unit's efforts are directed." [OMB A-115, 5/5/78]
2. Quality measures (attributes) be stated and defined -
this is implicit in the definition of the output unit.
The attributes should be as quantifiable and objective
as possible; however, subjective attributes will often be
necessary or unavoidable. Note that the quality measures
(attributes) have similar characteristics to objectives
and, in fact, may be stated as objectives regarding the
unit of output.
Example - It is the agency objective(s) that all field reporting be:
- timely
- reliable information
- valuable
- of useful nature [Ref. DD Form 1480]
Note - To make the above attributes operational as measures of
quality they would require:
- further definition or
- selection of (reference to) a "judge".
The "measurement" of the attribute existence can be either:
Objective - by some "instrument" independent of
human judgment.
Subjective - the measurement involves human
judgment, i.e., the "instrument"
is a person (judge) or persons (panel).
Taken together, the set of attributes defines the output unit. Anything
(good/service, report) which meets the minimal level of attributes is by defini-
tion a unit of output - if it does not, then it is not an output, it is not a report.
The set of attributes used to evaluate quality consists, in most cases,
of separate attributes of varying importance to the evaluator or decision
maker. There are tools available to determine the relative importance of the
attributes to the evaluator.
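The paragraph above notes that tools exist for weighting attributes. As a minimal, hypothetical sketch (in Python) of one such tool - a weighted-sum score over subjectively rated attributes - the example below echoes the DD Form 1480 attribute names; the weights and the 1-5 rating scale are assumptions for illustration, not features of any existing IC system:

    # Illustrative sketch: weighted-sum "quality" score for one item of reporting.
    # The weights and the 1-5 rating scale are hypothetical assumptions.
    ATTRIBUTE_WEIGHTS = {        # relative importance to the evaluator; sums to 1.0
        "timeliness": 0.30,
        "reliability": 0.30,
        "value": 0.25,
        "usefulness": 0.15,
    }

    def quality_score(ratings):
        """Weighted sum of subjective 1-5 ratings over the defined attributes."""
        return sum(ATTRIBUTE_WEIGHTS[a] * ratings[a] for a in ATTRIBUTE_WEIGHTS)

    # One consumer's subjective ratings of a single report:
    report_ratings = {"timeliness": 4, "reliability": 5, "value": 3, "usefulness": 4}
    print(round(quality_score(report_ratings), 2))   # -> 4.05

Explicit weights make the evaluator's relative priorities visible and auditable, which is the point of the tools referred to above.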
C. Efficiency
Evaluation of Efficiency concerns itself with the relationship between the
Output(s) and Input(s). There are three general categories of efficiency evalua-
tion or assessment:
1. Work Measurement (WM) - relates the human resources (hours,
months, years) "consumed" in the production of a unit of output.
2. Unit Cost (UC) - relates the cost of all accountable and allocat-
able resources to the production of a unit of output.
3. Productivity Index (PI) - relates any "meaningful" measure of out-
put to any measure of input.
Although these three forms of efficiency evaluation may appear in different ways
there are only the three basic forms given above.
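A minimal worked sketch of the three basic forms, using hypothetical figures for a single reporting unit (all numbers below are invented for illustration):

    # Illustrative sketch: the three basic forms of efficiency evaluation.
    reports_produced = 240       # output units for the period
    staff_hours = 1920           # human resources "consumed"
    total_cost = 96000.0         # all accountable and allocatable resources ($)

    wm = staff_hours / reports_produced            # Work Measurement: staff hours per output unit
    uc = total_cost / reports_produced             # Unit Cost: $ per output unit
    pi = reports_produced / (total_cost / 1000.0)  # Productivity Index: output per $1,000 of input

    print(f"WM = {wm:.1f} staff-hours/report")     # 8.0
    print(f"UC = ${uc:.2f} per report")            # $400.00
    print(f"PI = {pi:.2f} reports per $1,000")     # 2.50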
D. Effectiveness
Evaluation of Effectiveness requires that the "results" and/or "effects" of
the outputs be considered. There are, perhaps surprisingly, only three approaches
that are available to evaluate effectiveness:
(a) Numeric ratio - the ratio of measurable results obtained over
stated objectives.
(b) Cost-effectiveness - the ratio of measurable results (in some
unit) over the costs of the inputs.
(c) Benefit-Cost - the ratio of the benefits ("value") of the re-
sults over the cost of the inputs.
In cost-effectiveness analysis the stated objectives should be result-related -
not output-oriented (often this is not the case). Benefit-cost analysis requires
the "valuation" (via shadow prices) of both resources (inputs) and the results/
effects "produced" by the outputs. There are two major issues in the evaluation
of effectiveness:
- do your outputs "produce" a result/effect?
- does your effect have "value"?
Approach                Effect?   Value?
Effectiveness Ratio     Yes       No
Cost-Effectiveness      Yes       No
Benefit-Cost            Yes       Yes
Note - In Unit Cost, Cost-Effectiveness, and Benefit-Cost ratios the cost
portion does not have to be the all-inclusive cost of all of the
inputs. In fact, in most such ratios the costs used are knowingly
limited to partial costs (or accepted as not all-inclusive, due to
unknown, unallocatable, unmeasurable costs recognized as existing).
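The following sketch works the three approaches through with hypothetical figures; the objective level, measured results, input cost, and shadow-priced benefit value are all invented for illustration (and, per the Note above, the cost used may knowingly be a partial cost):

    # Illustrative sketch: the three approaches to effectiveness evaluation.
    objective_results = 100      # measurable results the stated objectives called for
    measured_results = 85        # results actually obtained
    input_cost = 50000.0         # cost of inputs ($); may be a knowingly partial cost
    benefit_value = 120000.0     # "valuation" (shadow price) of the results/effects ($)

    er = measured_results / objective_results      # (a) numeric ratio: results/objectives
    ce = measured_results / (input_cost / 1000.0)  # (b) cost-effectiveness: results per $1,000
    bc = benefit_value / input_cost                # (c) benefit-cost: benefits/$

    print(f"ER = {er:.0%}")                        # 85%
    print(f"CE = {ce:.2f} results per $1,000")     # 1.70
    print(f"BC = {bc:.2f}")                        # 2.40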
E. Evaluation - A Management System
The three aspects of evaluation provide management with different items of
information at different organizational levels. Different management levels
within the intelligence community would require different evaluation informa-
tion. However, the provision of such information should be approached through
the development of a hierarchically structured, integrated system where each
level of evaluation utilizes the evaluation systems, reports, and basic data of
the previous level.
[Diagram: pyramid of evaluation levels - "Executive, policy, upper
management" above "Operating, line management".]
V. Research of Evaluation Efforts and Techniques
The study objectives required that evaluation efforts, methods, techniques
and tools relevant to HUMINT be explored - both within the IC and in other
agencies, departments, governmental, and non-governmental organizations. On-
site visits for briefings and interviews were conducted at sixteen (16) IC
related entities and five (5) non-IC agencies or departments. On-site visits
were supplemented by numerous telephone calls and background readings of related
reports, articles, memos, and books.
A. Intelligence Community (IC)
A questionnaire was prepared and distributed as a further supplement to
the sixteen (16) on-site interviews conducted during this study (see Appendix A).
Written responses to the questionnaires were received from six (6) activities.
A total of nineteen (19) separate entities related to HUMINT evaluation within
the IC were identified as currently existing (18) or previously existing (1).
The 19 entities and related descriptive data are shown in Figure V-1. The key
to the descriptive data is included in the Figure (refer to section IV of this
report for background and definition of terms). HTO(AB) staff have available
specific contact person information (name, telephone, address) for the 18 active
groups listed.
B. HUMINT Assessment - HTO(AB)
The evaluation or assessment of HUMINT conducted by the HTO during this
study was done through an interface, semi-centralized approach as shown in
Figure V-2. The HTO(AB) utilized, to the extent possible, existing assessment
efforts. HTO(AB) provided assessment through ad-hoc, supplementary means in
those cases where there were no existing efforts. The evaluation was of the
quality of the reporting as "measured" subjectively by the user/consumer of
Figure V-1: Survey of HUMINT Assessment Activities

[Table: for each of the 19 surveyed entities - including elements of the
CIA-DDO, NFAC/RES, the CIA Comptroller, DIA, FBIS, Navy (NIC), State (INR),
and the CIA-IC Staff, plus one State activity discontinued in 12/78 - the
level of evaluation effort (Quality, Efficiency, Effectiveness), the
"market" for the evaluation reporting, the nature and sources of that
reporting, and the data base used.]

SYMBOL KEY (Reference Figure V-1)

I. Level of Evaluation:

Quality: S - subjectively measured against stated, defined attributes;
the measures can be (depending on the instance) obtained from (1) the
users/consumers - via interviews (I) or questionnaires (Q); (2) a panel
of experts; (3) a single expert; or (4) the evaluations done by others.
O - objectively measured against stated, defined attributes.

Efficiency: WM - work measurement (output/staff hours); UC - unit cost
(output/$); PI - productivity index (output measure/input measure).

Effectiveness: ER - effectiveness ratio (results/objectives); CE -
cost-effectiveness ratio (results/$); BC - benefit-cost ratio
(benefits/$); CTI - citation of tangible (measurable) impacts.

II. "Market" for Evaluation Reporting - an indication of the role or use
the evaluation reports are put to, i.e., what type of efforts are
supported by the evaluation: TM - top (director, deputy director level)
management having agency level (directorate) responsibility; POL -
policy making - the set of upper level executives involved in debating,
resolving, and setting policy; PM - program management - the set of
managers responsible for a specific program (implementation, operations,
direction/redirection, etc.), including special studies requested by a
unit, agency, or department; B$ - budget - the set of managers and staff
involved in budgetary preparation and defense, resource allocation, and
fund redirections; I - individual consumer - a single individual who
uses the evaluation reporting to carry out his/her mission, including
special studies on a request basis for a single person.

III. Nature of Evaluation Reporting: N - narrative reports, little or no
statistical data included; SS - statistical summarization -
tabular/summary data, scales, scalar values, charts, etc.; P - periodic
in nature, written/produced at regular scheduled intervals; OTO -
one-time-only, written for a specific need in response to a particular
request for evaluation data.

IV. Sources of Data for Evaluation: O - others, outside the evaluation
unit - i.e., consumers, users, other evaluation units, etc. (the number
indicated is the "entity" assigned on the chart); I - internal - the
data is obtained through studies, measures, etc. conducted inside the
evaluation unit.

V. Data Base.
Figure V-2: Interface Approach

[Diagram: HUMINT origins (foreign sources reaching the DDO/CS, Federal
agencies, the private sector via FBIS, and Armed Services foreign
sources) feed decentralized assessment activities (e.g., DIA/DC-4) and
the centralized assessment element, HTO(AB). Assessment flows on to the
evaluation markets: program (collection) managers, consumer/program
officers, National (HTO/TB), and field collectors. Together these
elements constitute the HUMINT Assessment Working Group.]
reporting being interviewed. Citations of tangible impacts were recorded
as mentioned.
C. Non-Community Activities
On-site visits were undertaken at five (5) non-IC entities. The
visits were supplemented by telephone calls and numerous readings of
written materials. Relevant articles, books, and documents were
obtained for this study's files (HTO-AB) - see Reference List attached.
D. Federal Policy on Evaluation
Relevant Federal policy and guidance related to evaluation was sought.
Specific inquiry into such policy was made during visits to OMB and GAO.
The following relevant documents were obtained for the study files (see
extracts in Appendix B):
1. OMB Circular A-11
2. OMB Circular A-115
3. OMB Circular A-117
4. GAO - "Comprehensive Approach for Planning and
Conducting A Program Results Review (Exposure Draft)" -
6/78.
5. GAO - "Guidelines for Economy & Efficiency Audits of Federally
Assisted Programs (Exposure Draft)" 6/79.
VI. Conclusions
A. Intelligence Community (IC)
1. Current evaluation efforts in the IC related to HUMINT have been identified and
reviewed (19 "entities" and 3 "data bases"):
(a) The evaluation efforts are distinct activities conducted by
separated organizational entities having little coordination and/
or interconnection of efforts and reporting.
(b) Most efforts are of the special, "one-time-only-study" nature
produced for a single decision maker/user.
(c) Most efforts deal with "quality" of reporting only, with little
or no "efficiency" or "effectiveness" evaluation.
(d) The survey included individuals responsible for budget preparation
and defense. A number of them expressed a pressing need for
evaluation information of a type and amount beyond that now coming
to them.
(e) A need for increased quantification in evaluation is felt
throughout the IC - both in budget (resource) and program
management areas.
2. Previous IC evaluation efforts have been identified. They are mainly
"one-time-only", special studies conducted for specific needs and focus on
evaluation of reporting "quality".
3. There is no Intelligence Community (IC) coordinated, systematic reporting/
response assessment system in existence to support the HUMINT Tasking System
as it is currently constituted. Assessment to support HUMINT Tasking re-
quires supplementary efforts (such as conducted in this special study, using
the Customer Interview Model) building upon such in-place efforts as do exist.
4. The previous and existing evaluation activities (mainly focused on evalua-
tion of "Quality" for specific, "local" management use) are not, at this
time, adequate for the provision of responses for:
(a) Community level tasking assessment.
(b) Resource allocation concerns.
(c) Budgetary defense and justification.
To respond to such concerns the evaluation activities will need to "raise"
the level of efforts to "Efficiency" and "Effectiveness" evaluation (see
section IV).
B. HUMINT - HTO
1. HUMINT reporting evaluation methods (based on Customer Interview Model) as
currently conducted and as to be modified in the near term (see Recommendations)
are adequate to support decision making for Tasking/Retasking.
(a) Evaluation for purposes of Tasking/Retasking need be, and is,
evaluation of the "Quality" of reporting.
(b) The attributes of the reporting required to evaluate its "Quality"
are expressed or implied in the:
(i) HUMINT Tasking Plan(s)
(ii) NIT(s)
(iii) DCID 1/2
(iv) PRC (I) Directive(s)
(v) NITC Collection Studies
2. The HTO(AB) assessment efforts conducted during this study were based on a
strategy of providing an "interface" to existing evaluation/assessment efforts
within the IC. This semi-centralized, interface approach is capable (by
supplementing existing assessment efforts or providing such effort when it
does not exist) of supporting Tasking/Retasking (HTO/TB).
3. The assessment methodology (based on DDO model) as used during this study
(two specific topic areas) is based on acceptable evaluation techniques:
(a) objective listing of reporting used by interviewee.
(b) interview process obtains:
(i) subjective rating of reporting quality (8 attributes).
(ii) statements of significant short-falls, gaps, and/or overlaps.
(iii) citations of instances of tangible impacts (policy positions,
redirected efforts or resources, savings due to redirections,
"tip-offs", etc.).
4. Modifications and improvements in the assessment methodology (Customer
Interview Model) are possible (see Recommendations). Several suggested
improvements have been tested during this study.
C. Non-Intelligence Community
1. Non-IC (other agencies/departments) and non-governmental evaluation efforts
have been reviewed and several relevant concepts and methodologies have
been identified ["market" concept, evaluability assessment (EA) methods,
survey concepts, etc.].
2. The state-of-the-art in evaluation relevant to the subject area of this
study (intelligence reporting and utilization) is "primitive" and evolving.
(a) Further search for evaluation concepts, tools, and techniques
should continue, but is unlikely to lead to significant advances.
Most returns would be repetitions of previously seen concepts,
tools, and techniques.
(b) Useful concepts, tools, and techniques do exist which can be
adopted/modified for application to IC evaluation (refer to
Recommendations following).
(c) The IC should adopt a strategy of modification and adaptation of
existing concepts/techniques and minimal innovation.
(d) Elaborate, complex, and highly sophisticated evaluation tools/
techniques do exist, but are not warranted at this stage of
evaluation of HUMINT reporting in the IC.
D. Federal Evaluation Policy
Federal policy is that evaluation of programs be done and included in
budgetary presentations [OMB Circulars A-11, A-115, A-117; GAO-"Comprehensive
Approach for Planning & Conducting A Program Results Review (Exposure Draft)"
6/78, "Guidelines for Economy & Efficiency Audits of Federally Assisted
Programs (Exposure Draft)" 6/79]. In addition, such evaluation data is
expected for use by Congressional review/oversight groups.
No specific Federal policy/guidance on evaluation techniques exists - the
individual agency/department develops its own techniques within the above
"general" guidance. Refer to Appendix B for extracts and copies of
relevant materials.
VII. Recommendations
A. HUMINT Assessment System (HAS)
1. A HUMINT Intelligence Community-wide (IC) reporting/response assessment
system (HAS) is recommended. The ad-hoc, supplementary assessment as
utilized during this special study does not appear capable of supporting
the HUMINT Tasking System (HTO) when fully operational.
2. The HAS should be developed on an interface, semi-centralized basis
providing the linkage between existing agency, department evaluation
activities and the assessment required to support HTO. The semi-centralized
concept (which has proven successful during this study) would utilize
existing assessment efforts, and would supplement or provide such effort
where required.
3. The first step in establishment of the recommended HAS should be a formal
Evaluability Assessment (EA) of the IC HUMINT Tasking and Reporting System
(partial steps of an EA have been done as a part of this Study) - refer
to Appendix C for detail on the EA methodology. The recommended EA would:
(a) refine objectives for the HAS-(What should be expected from it?
How will its success/failure be established? What will HTO(AB)
be responsible for?).
(b) establish whether or not a HAS is plausible (workable) in addressing
the existing National Tasking and Response/HUMINT System -
see Appendix D.
(c) determine whether the intended uses of HAS assessment reporting
are plausible:
(i) can/cannot support tasking, retasking - HTO/TB.
(ii) identify and assess other expected uses for HAS
reporting - budgetary inputs, program management,
policy, etc.
4. The EA and following implementation of a HAS for HUMINT HTO (AB) should be
viewed as a "test-bed" for evaluation concepts and assessment in the entire
National Tasking System.
(a) The concepts being developed and implemented are general and pro-
bably transferable to other source reporting assessment systems.
(b) The EA effort is required to address currently voiced concerns
about the impact of assessment activities on analyst/user/consumer
time. This question cannot be addressed until a HAS is first
defined by an EA.
B. Assessment Methodology - The Customer Interview Model
1. Review survey methodologies related to current HUMINT Assessment survey
techniques (Customer Interview Model) with a view to limiting the impact
on analyst/user/consumer time:
(a) Random sampling and/or rotation of user/consumer community for
assessment surveys.
(b) Classification of user/consumer community by "customer type".
(c) Retrospective interviews - referring to previous (6-12 months prior)
responses of interviewee.
(d) Interviewing via telephone and/or data terminal as a supplement or
alternative to personal contact.
2. Standardize aspects of the interviewer questions among the sampled inter-
viewees so that statistically supportable claims can be made for the sample
responses (tools and techniques exist which appear directly applicable in
this regard).
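As a hedged illustration of what such statistically supportable claims might rest on, the sketch below draws a simple random sample of the user/consumer community and computes an approximate 95% confidence interval on the mean quality rating. The population size, sample size, and ratings are assumptions for the sketch; a real design would also weigh the stratification and rotation options of recommendation 1 above, and a finite-population correction:

    import random
    import statistics

    # Hypothetical user/consumer community and sample size.
    population = [f"consumer-{i:03d}" for i in range(1, 201)]
    sample = random.sample(population, k=30)        # simple random sample, no replacement

    # Each sampled consumer supplies a 1-5 quality rating through the
    # standardized interview questions (ratings invented here).
    ratings = [random.randint(2, 5) for _ in sample]

    m = statistics.mean(ratings)
    se = statistics.stdev(ratings) / len(ratings) ** 0.5   # standard error of the mean
    low, high = m - 1.96 * se, m + 1.96 * se               # approximate 95% interval

    print(f"n={len(ratings)}  mean={m:.2f}  95% CI=({low:.2f}, {high:.2f})")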
C. HUMINT [HTO-(AB)] Evaluation - General:
1. Develop a means to provide an "audit trail" from general guidance (NITs,
DCID 1/2) to Tasking Plans to specific/field-level reporting requirements to
reporting to analysis/production to user/consumer and policy/program support.
2. Define and refine specific attributes of "quality" to be used in evaluation
of HUMINT reporting responding to Tasking Plans.
(a) Use the interview process to obtain subjective (user/consumer)
quantification of the reporting's performance vis-a-vis the
attributes.
(b) Develop a data base of these "quality" measures for analysis and
decision making (Retasking) use.
(c) Analysis of this data should be performed (a minimal sketch
follows item 3 below) to seek differences in ratings due to:
(i) "customer" type
(ii) time
(iii) topic, subject
3. Formalize the surveying of "leads" to tangible impacts and entry of such
data (on leads and/or impacts) to a data base for possible further use in
the evaluation of "effectiveness".
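A minimal sketch of the analysis suggested in item 2(c) above: grouping stored "quality" ratings by one dimension at a time and comparing mean ratings. The record layout and the values are hypothetical stand-ins for whatever the recommended data base ultimately holds:

    from collections import defaultdict
    from statistics import mean

    # Hypothetical data base records: (customer_type, topic, period, rating 1-5).
    records = [
        ("policy",  "topic-A", "1980Q1", 4), ("policy",  "topic-B", "1980Q1", 5),
        ("analyst", "topic-A", "1980Q1", 3), ("analyst", "topic-A", "1980Q2", 4),
        ("program", "topic-B", "1980Q2", 2), ("policy",  "topic-A", "1980Q2", 4),
    ]

    def mean_rating_by(records, dim):
        """Mean rating grouped on one dimension: 0=customer type, 1=topic, 2=time."""
        groups = defaultdict(list)
        for rec in records:
            groups[rec[dim]].append(rec[3])
        return {key: round(mean(vals), 2) for key, vals in groups.items()}

    print(mean_rating_by(records, 0))   # differences by "customer" type
    print(mean_rating_by(records, 2))   # differences over time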
BIBLIOGRAPHY
1. Agency for International Development, Evaluation Handbook, GPO,
Washington, D.C., 1972.
2. American Statistical Association, Proceedings of the 19th Annual
Conference on Statistics, Measuring Productivity in Government,
Albany, New York, May 18, 1972.
3. "An Analytic Procedure for Examining the Relation Between WWMCCS
Performance Capabilities and Funding Levels". Report prepared
for the IBM Corporation in support of U.S.A.F. Contract
F19628-74-C-0158. McLean, Va.: Decisions and Designs, Inc.,
March, 1975.
4. Barnes, R.M., Motion and Time Study, Design and Measurement of Work,
John Wiley and Sons, Inc., 1968.
5. Barnes, R.M., Work Sampling, John Wiley and Sons, Inc., 1966.
6. Bassett, "Manpower Forecasting and Planning: Problems and Solutions",
Personnel, September, October 1970.
7. Baumol, William J., Economic Theory and Operations Analysis,
Englewood Cliffs, New Jersey, Prentice-Hall, Inc., 1972.
8. Bell, C., Cost-effectiveness Analysis As a Management Tool,
Clearinghouse for Federal Scientific and Technical Information,
Springfield, Virginia, Accession No. AD 607 134, October 1964.
9. Blumstein, A., "The Choice of Analytical Techniques", In T.A.
Goldman (editor), Cost-effectiveness Analysis; New Approaches
in Decision-Making. F.A. Praeger, New York, 1969.
10. Bower, J.L., "Effective Public Management", Harvard Business Review,
March-April 1977.
11. Brady, R. H., "MBO Goes to Work in the Public Sector", Harvard
Business Review, March-April 1973.
12. Buede and C.R. Peterson, An Application of Cost-Benefit Analysis to the USMC
Program Objectives Memorandum (POM), Technical Report TR 77/8/72, Defense
Advanced Research Projects Agency (ARPA), ARPA Order 3052, Arlington,
Virginia, November 1977.
13. Budd, J.M., "Employee Motivation Through Job Enrichment", Journal of
Systems Management, August 1974.
14. Center for Productive Public Management at John Jay College, Public
Productivity Review, Vol. I (1975), Vol. II (1976), Vol. III (1977),
Vol. IV (1978), Vol. V (1979).
15. Cheek, L.M., Zero-Base Budgeting Comes of Age, American Management
Association, New York, 1977.
16. Cheek, L.M., "Cost Effectiveness Comes to the Personnel Function",
Harvard Business Review, May-June 1973.
17. Civil Service Commission, General Accounting Office, and Office of Manage-
ment and Budget. Measuring and Enhancing Productivity in the Federal Sector,
Washington, D.C., US Government Printing Office, June 1972.
18. Committee for Economic Development, Improving Productivity in State and
Local Government, Lathan Press, March 1976.
19. Community Planning and Evaluation Institute, A Manual on Conducting
Interview Studies, Washington, D. C., 1972.
20. Dasgupta, P. et al, Guidelines for Project Evaluation, UNIDO-United Nations
Publication, E.72.II.8.11, 1972.
21. de Berge, Earl and Conrad Joyner, "Opinion Polls or . . . How to Ask
Right Questions of Right People at the Right Time", Public Management,
Vol. XLVIII, No. 11, November 1966.
22. Department of the Navy, Economic Analysis Handbook, Virginia, June 1975.
23. Donnely, B.P., "Cheap Shots and Costly Pay-offs: A plea for Purpose in
Public Programs", Public Administrative Review, March/April 1977.
24. Dorfman, R. (editor), Measuring Benefits of Government Investments,
Brookings Institution, Washington, D.C., 1965.
25. English, J.M. (editor), Cost-effectiveness: The Economic Evaluation of
Engineered Systems, John Wiley and Sons, Inc., New York, 1968.
26. Fisher, Gene H., Cost Considerations in Systems Analysis, New York,
American Elsevier Publishing Co., 1971.
27. Galambos, E.C. and Schreiber, A.F., Economic Analysis for Local Government,
National League of Cities, Washington, D.C., Nov. 1978.
28. Grillo, E.V. and Berg, C.J., Work Measurement in the Office, A Guide to
Office Cost Control, McGraw-Hill Book Co., Inc., 1959.
29. Hansen, B.L., Work Sampling for Modern Management, Prentice-Hall, Englewood
Cliffs, New Jersey, 1960.
30. Hatry, Harry P., Louis Blair, Donald Fisk and Wayne Kimmel, Program
Analysis for State and Local Governments, Washington, D.C., The
Urban Institute, 1976.
31. Hatry, Harry P., Weiss, Carol H., An Introduction to Sample Surveys
for Government Managers, Washington, D.C., The Urban Institute, 1971.
32. Hatry, Harry P., Winnie, R.E., Fisk, D.M., Practical Program Evaluation
for State and Local Government Officials, Washington, D.C., The Urban
Institute, 1973.
33. Hatry, Harry P., and Webb, K., Obtaining Citizen Feedback, Washington, D.C.,
The Urban Institute, 1973.
34. Heiland, R.E. and Richardson, W.J., Work Sampling, McGraw-Hill Book Co.,
Inc., New York, 1957.
35. Heuston, M.C. and Ogawa, G., "Observations on the Theoretical Basis of
Cost-effectiveness", Operations Research, 1966.
36. Hinrichs, H.H. and Taylor, G.M., Program Budgeting and Benefit-Cost
Analysis, Goodyear Publishing Co., Pacific Palisades, California, 1969.
37. Hinrichs, Harley H., Taylor, Graeme M., Systematic Analysis: A Primer on B-C
Analysis and Program Evaluation, Goodyear Co., 1972.
38. Hopeman, R.J., Production: Concepts, Analysis, Control, Charles E. Merrill;
National Commission on Productivity and Work Quality, Washington, D.C.,
November, 1975.
39. International Labour Office, Geneva, Introduction to Work Study, Impression
Conleurs Weber, Switzerland, 1974.
40. Keelin, T.W., III. "A Protocol and Procedure for Assessing Multi-
Attribute Preference Functions". Ph.D. Dissertation, Stanford
University, September 1976.
41. Keeney, R.L., and Raiffa, H. Decisions with Multiple Objectives:
Preferences and Value Tradeoffs. New York: John Wiley, 1976.
42. Krick, E.V., Methods Engineering, John Wiley and Sons, Inc., 1965.
43. Kuper, G.H., "Productivity Improvement: Now It's the Government's Turn",
Management Review, November 1974.
44. Lambrou, F.H., Guide to Work Sampling, J.F. Rider Publisher, Inc., 1962.
45. Lyden, F.J. and Miller, E.G. (editor), Planning, Programming, Budgeting -
A System Approach to Management, Markham Publishing, Chicago, Illinois, 1968.
46. Maciarello, J.A., Dynamic Benefit-Cost Analysis, Lexington Books, Mass., 1975.
47. Mishan, E.J., Cost-Benefit Analysis, New York, Praeger Publishers, 1976.
48. National Center for Productivity and Quality of Working Life, Improving
Productivity: A Self-Audit and Guide for Federal Executives and Managers,
US Government Printing Office, Washington, D.C., Fall, 1978.
49. O'Connor, M.F., and Edwards, W. "On Using Scenarios in the
Evaluation of Complex Alternatives". Technical Report TR 76-17,
McLean, Va.: Decisions and Designs, Inc., December, 1976.
50. Pearson, R.H., An Instructional Guide to Cost-Effective Decision Making,
Masterco Press, Michigan, 1974.
51. Productivity and Program Evaluation, Challenge for the Public Service,
Midwest Intergovernmental Training Committee, Bloomington, Indiana, 1975.
52. Poland, O.F. (editor), "Program Evaluation", Public Administration Review,
July, 1974.
53. Quade, E.S., Analysis for Public Decisions, New York, American Elsevier
Publishing Co., 1975.
54. Raj, Des., The Design of Sample Surveys, New York, McGraw-Hill Book Co., 1972.
55. United Nations, Guidelines for Project Evaluation, 1972.
56. United Nations - Joint Inspection Unit (JIU), "Evaluation in the United
Nations System: Report of the JIU (E.D. Sohm)", JIU/REP/77/1-E/6003, 1977.
57. United Nations Joint Inspection Unit (JIU), "Report on Programming and
Evaluation in the United Nations", (M. Bertrand), JIU/REP/78/1-E/1978/41.
58. United Nations - Joint Inspection Unit (JIU), "United Nations Public Admin-
istration and Finance Programme, 1972-1976 Volumes I and II", (M. Bertrand),
JIU/REP/78/2/-E/1978/42.
59. United Nations - Joint Inspection Unit (JIU), "Programme Evaluation for the
Period 1974-1977 and Annexes I (Ocean Economics and Technology) and II
(Social Development and Humanitarian Affairs)", AC/51/91.
60. U.S. Civil Service Commission, Evaluation Management: A Source Book of
Readings, Charlottesville, Virginia, July, 1977.
61. U.S. Congress, Senate, Committee on Rules and Administration, Government
Economy and Spending Reform Act of 1976. Report to Accompany S.2925,
Washington, D.C., USGPO, 1976.
62. U.S. Department of HEW, Work Measurements and Workload Standards as Management
Tools for Public Welfare, Washington, D.C., July, 1974.
63. U.S. Department of HEW, Workload Standards for Public Assistance Agencies,
Consideration for Administration, Washington, D.C., January, 1973.
64. U.S. Department of HEW, Work Measurement and Work Simplification,
Washington, D.C., HEW Publication No. (SSA) 78-21201.
65. U.S. Department of Labor, Handbook for Analyzing Jobs, Washington,
D.C., US Government Printing Office, 1972.
66. U.S. Department of Labor, Handbook on Work Measurement Systems - For Use in
Measuring Office Occupation, Washington, D.C., April 1972.
67. U.S. Department of Labor, Bureau of Labor Statistics, Productivity: A
Selected Annotated Bibliography 1965-71, Washington, D.C., U.S. Government
Printing Office, Bulletin 1776, 1973.
68. U.S. General Accounting Office, Evaluation and Analysis to Support
Decision-Making, (OPA76-9), Washington, D.C., 1975.
69. Weiss, Carol H., Interviewing in Evaluation Research, Handbook of Evaluation
Research, Vol. 1, Cal., Sage Publication, 1975.
70. Weiss, Carol, H., Evaluation Research, Prentice Hall, New Jersey, 1972.
71. Weiss, C., Evaluation Research - Methods for Assessing Program Effectiveness,
Prentice-Hall, Englewood Cliffs, New Jersey, 1972.
72. Weiss, C., "Alternative Models of Program Evaluation", Social Work,
November, 1974.
73. Wise, C.R. and Norton, O., Productivity and Program Evaluation in the
Public Sector, Midwest Intergovernmental Training Committee, Bloomington,
Indiana, 1978.
74. Wise, L.R., Evaluating The Impact of Public Programs, A Guide to
Evaluative Research, Midwest Intergovernmental Training Committee, Bloomington,
Indiana, 1978.
75. Wholey, Joseph S., Evaluation: Promise and Performance, The Urban
Institute, Washington, D.C., 1979.
76. Worthley, J.A., "PPB: Dead or Alive?", Public Administration Review,
July, 1974.
77. Wright, C. and Tate, M.D., Economics and Systems Analysis: Introduction
for Public Managers, Reading, Mass, Addison Wesley, 1973.
APPENDIX A - Survey of Assessment Activities (Questionnaire)
You have been identified as an individual-or group involved in
assessment activities related to intelligence reporting. The Director
of Central Intelligence (DCI) has approved a special study of such
activities as they may relate to human source reporting (HUMINT).
Please provide the following information about your assessment
activities:
1. Name of organizational unit -
2. Location/address and contact person(s), with phone numbers -
3. Number of persons/staff involved -
4. Nature of assessment activities, reporting (please describe) -
5. Where do assessment reports go -
Primary user(s) -
Secondary user(s) -
6. Diagram/sketch flow of assessment reporting (source to user(s)).
7. Do you have any written documentation, guides, procedures,
references, directives, or other materials related to your
assessment efforts?
If yes, please list title and reference data, and indicate
if copies are available.
Are "representative" copies (samples) of assessment reports
available?
8. Please provide leads to other pertinent groups/individuals
or activities.
Do not hesitate to include previous, no-longer-active
assessment activities and their products.
Please key your answers on a separate sheet to the above numbers.
Thank you for your assistance.
APPENDIX B - Federal Policy Related to
Evaluation (Extracts)
The following materials:
(1) Analysis and Justification of Programs
(Section 24.1 - 24.6) - OMB A-11.
(2) Zero-Base Budgeting - OMB A-115.
(3) Management Improvement and the Use
of Evaluation - OMB A-117.
provide references to guidance and requirements
for evaluation that relate to this study report.
PREPARATION AND SUBMISSION OF BUDGET ESTIMATES - Sec. 24.1
[Extracted from OMB Circular No. A-11, Preparation and Submission
of Budget Estimates - May 25, 1978]
Analysis and Justification of Programs
24.1. Justification of programs and financing.
All estimates reflected in agency budget submis-
sions will be supported by zero-base budget
justifications.
(a) Material required.-The justification materials
will consist of decision unit overviews, decision pack-
ages, and a ranking sheet (see exhibits 24A, 24B, and
24C). A decision unit overview and set of decision
packages will be prepared and submitted to OMB
for each decision unit identified by the agency in
consultation with OMB (see section 11.4).
(1) Decision unit overview.-The overview pro-
vides information necessary to evaluate and
make budget decisions on each of the decision
packages in a set without repeating the same
information in each package. If appropriate,
summary data may be presented in the
overview.
The decision unit overview will be limited
to three pages and prepared in the format of
Exhibit 24A. The overview will provide the
following information:
(A) Heading.-Include sufficient informa-
tion to identify the decision unit.
Agencies will assign a unique four-
digit numerical code to each decision
unit. At a minimum, specify the de-
partment or agency title, bureau or
other organizational subunit, title of
the appropriation or fund account that
finances the decision unit, the ac-
count identification code (see section
21.4), the title of the decision unit and
the four-digit decision unit code.
(B) Long-range goal.-Identify the long-
range goal of the decision unit. Goals
will be directed toward general needs
and will serve as the basis for deter-
mining the major objective(s).
(C) Major objective(s).-Describe con-
cisely the major objectives of the deci-
sion unit, including those set forth in
the basic authorizing legislation, and
the requirements these objectives are
intended to satisfy.
(D) Current method of accomplishing ob-
jective(s).-Briefly describe the meth-
od currently used to accomplish the
major objectives of the decision unit.
(E) Alternative(s).-Briefly describe fea-
sible alternatives to the current meth-
od of accomplishing major objectives.
Compare the alternatives in terms of
anticipated effectiveness and efficiency
in accomplishing the major objectives
and cite the sources from which the
evaluative information is taken. In-
clude, where appropriate, a brief dis-
cussion of desirable or undesirable side
effects, organizational structures and
delivery systems, longer-range costs, or
other factors relevant to the selection.
Specify which, if any, alternative rep-
resents the method proposed for the
budget year. When any enlarged or
new activity is proposed, state why the
need cannot be filled by State or local
governments or the private sector.
(F) Accomplishments.-To the extent pos-
sible, using specific measures of ac-
complishment, workload, effectiveness,
and efficiency, describe the recent
progress of the decision unit in achiev-
ing its objectives. The key indicators
by which outputs and achievements
will be measured should be obtainable
from existing evaluation systems, in-
cluding periodic in-depth program
evaluation and work measurement
systems (see subsection 24.1(b) below).
Indirect or proxy measures should be
used only while evaluation and work
measurement systems are being de-
veloped. Progress toward short-term
objectives may also be indicated. In ad-
dition, a brief narrative may be in-
cluded to describe accomplishments
that are not indicated by any of the
measures.
(G) Resource summary (optional).-Sum-
marize the financial and personnel re-
sources required in the budget year to
support each package within the deci-
sion package set. Compare these
amounts to the resources required for
the past and current year.
(H) Performance, activity, and/or work-
load summary (optional).-Summarize
the marginal impact that each decision
package will have in the budget year
on key performance, activity, or work-
load measurements. Compare their im-
pact to the levels for the past and cur-
rent years.
(2) Decision packages.-Decision package sets will
contain packages representing: (1) a mini-
mum level; (2) the current level (unless the
total requested for the decision unit is below
the current level) ; and (3) an enhancement
level, if there is a clear need. Packages rep-
resenting intermediate levels should be pre-
pared when significantly different perform-
ance levels between the minimum and cur-
rent level can be clearly identified. In addi-
tion, packages corresponding to certain speci-
fied levels will be included in the set when re-
quired by OMB representatives. Only one de-
cision package need be prepared in situations
where there clearly is no discretion in the
amount of the funds to be spent or in the
method or level at which the activity is to be
conducted (i.e., uncontrollable or mandatory
activity).
Enactment of proposed supplementals and
rescissions will be assumed. Amounts
proposed for supplementals and rescissions
will be separately identified for the current
year and merged into the amounts for the
budget year in the respective decision pack-
ages. However, for purposes of justification
and review, separate decision packages will be
required for each proposed program supple-
mental. New programs or activities (e.g., those
resulting from proposed legislation or a new
major objective) will form the basis for a
separate decision unit and will be presented
in a separate decision package set. Proposals
for termination of current programs or activi-
ties will also be reflected in a separate deci-
sion package and will be discussed in the
summary and highlight memorandum (see
section 22.1).
The minimum level decision packages
should not exceed three pages. The other
packages (intermediate, current, and en-
hancement) should be limited to two pages.
However, supplementary material deemed
necessary to support the decision packages
may accompany the decision packages. Each
package within a decision package set will
clearly describe the increase in activity, fund-
ing, and performance that would result from
its inclusion in the budget. Packages will be
prepared in the format of exhibit 24B and
provide the following information:
(A) Heading.-Identify the organizational
unit (e.g., agency and bureau title), ap-
propriation or fund account title and
identification code, decision unit title,
the agency four-digit decision unit
code, and the package number (e.g.,
1 of 3).
(B) Activity.-Describe the additional work
to be performed or services to be pro-
vided with the resources specified in
the package. For packages below the
current level, the activity description
should provide an explanation of what
will be performed, rather than what
will be eliminated from the current
level.
(C) Resource requirements.-Include ap-
propriate information, such as obliga-
tions, offsetting collections, budget au-
thority and outlays, and employment
(full-time permanent and total) for
the three years covered by the budget.
For each measure used, the budget year
increase associated with the package
should be listed along with the cumula-
tive amount for this and all preceding
packages for the decision unit. Budget
authority and outlay estimates will also
be provided for 4 years beyond the
budget year consistent with the cri-
teria set forth in section 23.
(D) Short-term objective(s).-State the
short-term objectives (usually achiev-
able within one year) that will be ac-
complished and the benefits that will
result from the cumulative resources
described in the package. The expected
results of work performed or services
provided should be identified to the
maximum extent possible through the
use of quantitative measures. All pack-
ages in a decision package set should
use the same set of short-term objec-
tives and measures, so that the incre-
mental benefits from one package to
the next can be readily determined. For
packages below the current level, the
short-term objective should provide a
description of what will be accomp-
lished, rather than what will be elimi-
nated from the current level.
(E) Impact on major objective(s).-De-
scribe the impact on the major objec-
tive(s) or goals of both the additional
and the cumulative resources shown in
the package using, to the extent pos-
sible, the same quantitative and quali-
tative measures for all packages in the
set.
(F) Basis (for minimum packages only).-
Describe the basis for the minimum
level, explaining why this level was
determined to be the minimum level.
(G) Other information.-Include other in-
formation that aids in evaluating the
decision package. This should include:
(i) Explanations of any legislation
needed in connection with the
package;
(ii) The results of benefit/cost and
other analyses and evaluations
that contribute to the justifi-
cation;
(iii) For the minimum level package,
the effect of zero-funding for
any program in the decision
unit;
(iv) For packages below the current
level, an explanation of what
now is being accomplished that
will not be accomplished at the
lower level; and
(v) Any special relationship of this
decision package to other deci-
sion packages, including special
coordination that may be re-
quired. If adoption of a decision
package will affect the output
of another decision unit, de-
scribe the expected impact in
terms of the measures used for
the affected unit.
(3) Ranking sheet.-Each agency will prepare and
submit a ranking sheet in the format of ex-
hibit 24C that indicates in a single list the
priority of all decision packages comprising the
budget request. Additional decision packages
prepared to examine alternative methods or for
other purposes that are not part of the agency's
budget request will not be included in the
ranking. The ranking sheet will contain the
following information:
(A) priority rank number;
(B) decision unit title and package number;
(C) budget outlay and total employment
amounts for the decision package; and
(D) cumulative budget outlay and total em-
ployment amounts for the decision
package and all higher ranked packages.
In instances (e.g., revolving funds) where
outlays and total employment are not factors
in determining the appropriate or priority
level of performance, agencies should indicate
a measure such as budget authority, total obli-
gations or costs.
(b) Derivation of amounts requested.-Agencies
should base justifications on or be prepared to sub-
mit additional information covering the following:
(1) Detailed analyses of workload, unit costs, pro-
ductivity trends, the impact of capital invest-
ment proposals on productivity, changes in
quality of output, and demonstrated results
of past program activities.
(2) Basis for distribution of funds (i.e., formulas
or principles for allocation, matching, the
awarding of loans, grants, or contracts, etc.)
and data on resulting geographic distribution
(e.g., by State, etc.) with identification of any
issues.
Work measurement, unit costs, and productivity
indexes should be used to the maximum extent prac-
ticable in justifying staffing requirements.
Properly developed work measurement procedures
should be used to produce estimates of the staff-
hours per unit of workload, such as the staff-hours
per claim adjudicated, staff-hours per staff main-
tained in the field, staff-hours per infested acre of
pest control, etc., depending on the nature of the
agency programs. These estimates should represent
an acceptable level of performance based on current
realistic time standards. If the agency does not have
a work measurement system that provides this type
of information, statistical techniques based on his-
torical employment input and work output may be
used, while an appropriate system is under develop-
ment.
Unit costs relate the volume of work to the funds
required to produce the work. Unit costs may in-
clude, in addition to personnel costs, the costs of
supplies, travel, equipment, etc. Thus, unit costs re-
flect the ratio of personnel, materials, travel and
other costs to the output produced, and will be
stated in the dollars (or cents) required to produce
a unit of work. When unit costs include personnel
costs, work measurement should be used to support
the acceptability of this component.
Productivity indexes are based on the ratio of
total output to resource input. Output measures are
based on the volume of products or services produced
for use outside the organization, with due allowance
for differences in the nature of individual products
or services. Measures of input may be based on the
amount of personnel alone, on personnel costs, or
on a more comprehensive measure of resource inputs
that includes nonlabor costs.
Agencies will extend the use of work measure-
ment and unit cost analysis to both common service
activities and program activities. Usually, produc-
tivity indexes are based on organization-wide totals
of both outputs and inputs, thus already covering
both direct and indirect costs. OMB will, to the
extent possible, assist agencies in the establishment
or improvement of work measurement and produc-
tivity analysis systems.
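A minimal numerical sketch of the three measures just described - staff-
hours per unit, unit cost, and a productivity index - may help fix the
definitions; all figures are hypothetical.

    # Hypothetical figures illustrating the three measures discussed above.
    claims_adjudicated = 12000        # output: units of work produced
    staff_hours = 48000               # labor input
    personnel_cost = 960000.0         # dollars
    nonlabor_cost = 240000.0          # supplies, travel, equipment, etc.

    # Work measurement: staff-hours per unit of workload.
    hours_per_claim = staff_hours / claims_adjudicated                  # 4.0

    # Unit cost: personnel plus nonlabor costs per unit of output.
    unit_cost = (personnel_cost + nonlabor_cost) / claims_adjudicated   # 100.0

    # Productivity index: ratio of total output to resource input, here
    # expressed relative to an assumed base-period rate so the index is
    # dimensionless (1.0 = no change from the base period).
    base_output_per_hour = 0.20
    productivity_index = (claims_adjudicated / staff_hours) / base_output_per_hour

    print(hours_per_claim, unit_cost, productivity_index)   # 4.0 100.0 1.25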
24.2. Analysis of economic impact.
Agencies are reminded of their obligation to pre-
pare analyses of economic impact whenever they are
required by criteria developed in accordance with
OMB Circular No. A-107, and Executive Order
No. 11949. In addition, agencies should respond to re-
quests from the Council on Wage and Price Sta-
bility pursuant to their responsibilities for reviewing
the economic effects of programs and activities.
Also, whenever agencies are required by statute to
prepare economic impact analyses, agencies should
be prepared to provide these analyses when re-
quested by OMB representatives.
24.3. Analysis of paperwork burden.
Paperwork generated in connection with on-
going programs will be reviewed and, whenever pos-
sible, reduced. For every new program initiative or
expansion of an existing program, an analysis of
the paperwork burden created for the private sec-
tor, State and local governments, as well as within
the Federal establishment, will be submitted with the
justification.
24.4. Explanations relating to supplemental estimates.
When the need for a program supplemental appro-
priation is forecast (see sections 11.6 and 13.2), a
decision package set should be prepared in accord-
ance with section 24.1. The "Other information" sec-
tion on the decision package should set forth the
reasons for the omission of the request from the reg-
ular estimates for the period concerned, and the rea-
sons why it is considered essential that the additional
appropriation be granted during the year instead of
obtaining the money in a regular appropriation the
following year. Whenever possible, requests for sup-
plementals, whether under existing or proposed legis-
lation, should be accompanied by recommendations
as to where corresponding offsets can be made else-
where in the agency. If the estimate is approved for
later transmittal (rather than in the budget), further
justification of the supplemental estimate will be
required when it is formally submitted (see section
39). In every case, the effect of requested supple-
mentals will also be shown in the appropriate deci-
sion packages (see section 24.1(a)(2)).
For anticipated supplementals in the current year
to meet the cost of pay increases, decision packages
need not be prepared. However, information should
be provided identifying, for each appropriation or
fund, the total cost of the pay increases and the
amount that is being absorbed in accordance with
related policy guidance. Any difference from in-
formation submitted with the apportionment request
for the current year should also be explained.
24.5. Rental payments to GSA.
Each agency making rental payments in excess
of $1 million to GSA for space (leased and Govern-
ment-owned) and building services will provide a
summary report in the form of exhibit 24D. The pay-
ments reported will include both the Standard Level
User Charges (SLUC) and payments for reimburs-
able special services-e.g., extra guarding or clean-
ing of buildings-that agencies have requested in
addition to the normal services provided.
While the report submitted to OMB will be for the
agency as a whole, space rental charges should be
distributed for budget presentation purposes among
the appropriations and funds that finance the ac-
tivities utilizing the space. These charges must be
consistent with amounts reported under object
classification 23.1 for the affected accounts. Agencies
should be prepared to provide this information at the
appropriation and fund account level, if requested.
Explanations for the information requested are as
follows:
(a) SLUC and reimbursable payments.-Separate
estimates should be provided for:
(1) Standard level user charges (SLUC) to GSA
for space and related services; and
(2) Reimbursable payment for special services
beyond the standard level of services-e.g.,
extra guarding or cleaning.
For the current and budget years, SLUC estimates
should separately identify base costs and expansion
costs. Current year expansion costs represent SLUC
payments associated with expansion space to be ac-
quired during the current year. Budget year base
costs represent SLUC payments for current year
space inventory, plus current year expansion space,
less planned space reductions. Budget year expansion
costs represent SLUC payments associated with
expansion space to be acquired during the budget
year.
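The base/expansion relationships just stated reduce to simple arithmetic;
a sketch with hypothetical amounts:

    # Hypothetical SLUC estimates (thousands of dollars) following the
    # definitions above: budget year base cost = current year inventory
    # cost + current year expansion cost - planned space reductions.
    current_year_base = 5000
    current_year_expansion = 400
    planned_reductions = 150

    budget_year_base = current_year_base + current_year_expansion - planned_reductions
    budget_year_expansion = 300        # space to be acquired in the budget year
    total_budget_year_sluc = budget_year_base + budget_year_expansion

    print(budget_year_base, total_budget_year_sluc)   # 5250 5550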
(b) Personnel estimates.-Estimates will also be
provided for full-time permanent and total employ-
ment for the 3 years covered by the budget. In cases
where current and/or budget year personnel expan-
sion is not likely to require additional GSA furnished
office space-e.g., an increase in air traffic controllers,
park rangers, or overseas employees-a special nota-
tion should be made, with past, current, and budget
year employment estimates separately identified for
that activity. Also, if there have been significant cut-
backs in areas not requiring GSA space that would
distort the relationship between net personnel ex-
pansion and new GSA space required, a separate
notation should be included. DOD estimates should
break out military and nonmilitary functions.
(c) Space estimates.-Estimates will also be pro-
vided for the square foot inventory for which SLUC
payments are made to GSA. These estimates should
break out the base inventory and expansion space for
the current and budget years in the same manner
that such a break out is provided for SLUC payments.
If subsequent changes made in budget allowances
result in revised space or personnel requirements,
the Public Buildings Service, GSA, will be notified in
order to compute a corresponding revision in rental
payments. An updated report will be prepared and
sent to OMB immediately upon receipt of such
changes in budget allowances.
24.6. Analysis of ADP and telecommunications re-
sources.
The materials required by section 24.1 should
include Justification for ADP and/or telecommunica-
tions requirements in those decision package sets
where either the estimates for ADP or telecommuni-
cations to support the decision unit exceed $1 million
in the budget year, or where the required ADP or
telecommunications resources are more than 10 per
cent of the total estimate for the decision unit (see
section 43).
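Read as a decision rule, the section 24.6 test is a simple either/or
threshold; a sketch, with hypothetical amounts:

    # Justification is required when the ADP/telecommunications estimate
    # for a decision unit exceeds $1 million in the budget year, OR exceeds
    # 10 percent of the unit's total estimate (section 24.6).
    def adp_justification_required(adp_estimate, unit_total):
        return adp_estimate > 1000000 or adp_estimate > 0.10 * unit_total

    print(adp_justification_required(500000, 6000000))     # False
    print(adp_justification_required(500000, 4000000))     # True (over 10 percent)
    print(adp_justification_required(1200000, 50000000))   # True (over $1 million)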
Extracted from OMB Circular No. A-11
Preparation and Submission of Budget Estimates - May 25, 1978
Extracted from OMB Circular No. A-115
Zero-Base Budgeting - May 5, 1978 (pages 6-9)
c. Identification of objectives for each decision unit. Meaningful
objectives should be established for each decision unit. Both
major objectives (the main results ultimately expected to be
accomplished) and short-term objectives (outputs expected for a
given level of funding during the budget year) should be identi-
fied. The manager of the decision unit should include in a state-
ment of objectives information relating to both the services or
products to be provided by the decision unit and the contribution
of that output to achievement of the principal missions toward
which the unit's efforts are directed.
Major objectives should:
-- be explicit statements of intended output clearly related to the
mission of the program or organization;
-- be developed by participation of managers at all levels;
-- be subject to realistic measures of performance;
-- form the basis for evaluating the accomplishments of the program
or organization; and
-- normally be of a continuing nature, or take relatively long
periods to accomplish.
Top level agency management should be involved in setting objectives for lower-
level agency members to:
(1) ensure that appropriate policy guidance is furnished to managers
throughout the agency;
(2) aid managers who prepare decision packages to define, explain, and
justify their activities and the required resources at different
levels of performance; and
(3) aid higher level managers in understanding and evaluating the
budget requests.
Some factors to be considered in identifying objectives are:
-- Number of objectives. The number of objectives for a decision
unit may have a direct bearing on the effectiveness of the ZBB
process. Too many may make it difficult for higher management
levels to assess the relative importance of each decision package
and its respective ranking in comparison with other packages.
Moreover, too many objectives for a decision unit may indicate
that it was established initially at too aggregated a level.
-- Measures of performance. As objectives are identified, managers
should simultaneously determine the key indicators by which per-
formance is to be measured. Agencies should specify measures of
accomplishment, workload, effectiveness, and efficiency for each
decision unit. The key indicators by which outputs and achieve-
ments will be measured should be obtainable from existing evalua-
tion systems, including periodic in-depth program evaluations,
and work measurement systems (see OMB Circular No. A-11, regard-
ing the justification of programs and financing). Indirect or
proxy measures should be used only while evaluation and work
measurement systems are being developed.
-- Organization plans and goals. Objectives should evolve from
stated plans and goals for the organization. If such plans and
goals have not already been communicated, this step should pro-
vide for better understanding between organizational units on the
purpose and direction of the respective units.
d. Identification and evaluation of alternative methods of accom-
plishing objectives. Following the establishment of objectives,
the decision unit manager will examine alternative methods of
accomplishing each objective. The study of methods of operation
that differ from existing practices necessitates a rethinking by
managers of whether the current way of doing business is still the
most appropriate way. This often requires a reexamination of the
program, including a review of existing legislation, organization
structure, and existing managerial practices. It also requires
managers to search for innovative ways to achieve program objec-
tives at lower costs.
Assessments of alternatives should be based on the relative
effectiveness and efficiency of each alternative in accomplishing
major objectives, taking into account desirable and undesirable
consequences of each alternative. If available information is not
adequate to assess all reasonable alternatives, evaluation efforts
should be undertaken so that meaningful comparisons can be made
for future budget cycles.
In some instances, feasible alternatives may require additional
legislation, the need for which may have been identified during
the course of a major reexamination of the program or activity.
In other instances, it may not be possible to study or to examine
fully alternative methods within the budget cycle, in which case
adjustments can be made subsequent to the submission of the
budget. In still other instances, the alternatives identified may
represent the first steps toward more significant program changes
that will take longer than one year to accomplish.
The alternatives that are identified must be both feasible and
realistic. They must indicate to reviewers that management is
making a conscientious and continuing effort to improve the effec-
tiveness and efficiency of the unit. Each higher level of review
should thoroughly examine the alternatives to ensure that
sufficient attention has been given to ways to improve operations
or reduce resource requirements. To ensure adequate coverage, it
may be advisable for central budget, evaluation, and planning
staffs to participate in the evaluation of alternatives and the
establishment of implementation schedules for those that are
selected.
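By way of illustration only, the comparison of alternatives on the
effectiveness and efficiency dimensions named above might be tabulated as
follows; the alternative names and figures are invented.

    # Hypothetical sketch: alternatives scored on effectiveness (expected
    # output) and efficiency (output per dollar of estimated cost).
    alternatives = {
        "current method":   (10000, 2000),   # (expected output units, cost $K)
        "contract support": (11000, 1800),
        "regional offices": (12500, 2600),
    }
    for name, (output, cost) in alternatives.items():
        print(name, "output:", output, "units per $K:", round(output / cost, 2))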
EXECUTIVE OFFICE OF THE PRESIDENT
OFFICE OF MANAGEMENT AND BUDGET
WASHINGTON, D.C. 20503
MAR 23 1979
CIRCULAR NO. A-117
TO THE HEADS OF EXECUTIVE DEPARTMENTS AND ESTABLISHMENTS
SUBJECT: Management Improvement and the Use of Evaluation
in the Executive Branch
1. Purpose. This Circular provides guidance on management
improvement initiatives designed to increase the efficiency
and effectiveness of program operations in the Executive
Branch. It also emphasizes the importance of the role of
evaluation in overall management improvement and the budget
process.
2. Rescissions. This Circular supersedes and rescinds
Circular No. A-44 (Revised), dated May 24, 1972, subject:
"Management review and improvement program"; and Circular
No. A-113, dated November 17, 1976, subject: "Preparation
and submission of management plans."
3. Policy. All agencies of the Executive Branch of the
Federal Government will assess the effectiveness of their
programs and the efficiency with which they are conducted
and seek improvement on a continuing basis so that Federal
management will reflect the most progressive practices of
both public and business management and result in improved
service to the public.
4. Definitions. For the purposes of this Circular, the
following definitions apply:
a. Management improvement is any action taken to
improve the quality and timeliness of program performance,
increase productivity, control costs, or mitigate adverse
aspects of agency operations.
b. Management evaluation is a formal assessment of
the efficiency of agency operations. It includes assessing
the effectiveness of organizational structures and relation-
ships, operating procedures and systems, and work force
requirements and utilization.
Approved For Release 2005/01/10 : CIA-RDP83MOO171ROO1500070010-4
Approved For Release 2005/01/10 : CIA-RDP83M00171R001500070010-4
c. Program evaluation is a formal assessment,
through objective measurements and systematic analyses,
of the manner and extent to which Federal programs (or
their components) achieve their objectives or produce
other significant effects, used to assist management and
policy decisionmaking.
d. Performance measures are reliable and objective
indicators of the results of a Federal activity. They
include measures of efficiency, effectiveness, program
impact, and program output or workload.
e. Productivity is a ratio between the output of an
organizational unit and the input it utilizes during a
specified period of time. Output is expressed in terms
of items produced, services provided, or some other measure
of achievement. Input is expressed by the most relevant
measure of resources consumed by the unit, such as
personnel effort, equipment, or total dollars.
f. Evaluation resources in a department or agency
are the funds and personnel used to conduct management
evaluations and audits, program evaluations, and
productivity measurement.
5. General guidelines. The heads of all executive
departments and agencies are responsible for developing
and pursuing comprehensive management improvement efforts.
The objective of such efforts should be discernible
improvement in Federal programs--in the efficiency of
administration or management and in the effectiveness of
results. The basis for identifying management improvement
needs is a sound evaluation system. While agency evaluation
systems may serve multiple purposes, they should have the
following characteristics to be effective in contributing
to management improvement.
a. They should focus on program operations and
results. They should include procedures to assure that
evaluation efforts result in specific management improve-
ments that can be validated.
b. They should assist management in the identifica-
tion of program objectives, in providing explicit
statements of intended outputs related to the objectives,
and in developing realistic performance measures to be
used in conducting evaluations.
Approved For Release 2005/01/10 : CIA-RDP83M00171R001500070010-4
Approved For Release 2005/01/10 : CIA-RDP83M00171R001500070010-4
c. They should be relevant to the budget process
in that evaluation results should be a major input to
resource allocation decisions.
d. Agency evaluation activities should be reviewed
periodically to assure that:
-- evaluation efforts that do not contribute
directly to improving program operations or
facilitating their improvement are minimized;
-- there is a balanced emphasis on both evaluation
and prospective analyses, such as planning and
policy analysis; and
-- they use available evaluation resources in an
economic and efficient manner.
Continuing attention should be paid to management improve-
ment and cost reduction opportunities in activities such
as accounting, ADP operations, cash management, communi-
cations, data collection, grants management, paperwork,
printing and reproduction, regulations improvement,
travel, and other administrative activities.
6. Reporting requirements. Every department and agency
whose budget is subject to review by OMB will submit an
annual report to OMB, in accordance with the attachment
to this Circular, summarizing the resources devoted to
management improvement and evaluation activities and
naming the principal officials responsible for those
activities.
7. Responsibilities of the Office of Management and
Budget. In addition to the actions taken as part of the
budget process, OMB, as part of its management responsi-
bilities, will:
a. Seek to identify areas or operations of the
Federal Government in which significant management
improvements can be achieved and the steps necessary to
accomplish those improvements;
Approved For Release 2005/01/10 : CIA-RDP83M00171R001500070010-4
Approved For Release 2005/01/10 : CIA-RDP83M00171R001500070010-4
b. Assess the effectiveness of agency management
improvement and evaluation activities and conduct follow-
up consultations with departments and agencies as
appropriate;
c. Disseminate information about highly successful
management improvement projects in areas of government-
wide interest;
d. Promote the development and use of valid per-
formance measures; and
e. Conduct or sponsor a limited number of management
improvement projects during the year. The projects will
be assigned by the Director, OMB, and generally will be
of Presidential interest.
8. Presidential Management Improvement Awards. Awards
will be made by the President to a limited number of
individual Federal employees, teams or organizational
units who have made exceptional and outstanding contri-
butions to the improvement of management in the Executive
Branch. Details of this program are contained in Chapter
451 of the Federal Personnel Manual.
9. Implementing Instructions. Copies of any instructions
implementing this Circular should be forwarded to the
Director, Office of Management and Budget, Attention:
Assistant Director for Management Improvement and Evaluation.
10. Inquiries. Inquiries relative to this Circular should
be directed to the Office of Management and Budget,
Management Improvement and Evaluation Division, Room 10235,
New Executive Office Building, Washington, D.C. 20503,
telephone (202) 395-4960 or 395-5642.
James T. McIntyre, Jr.
Director
ATTACHMENT
Circular No. A-117
INSTRUCTIONS FOR ANNUAL REPORT TO OMB
On or before June 30, 1979, and by May 15 of each subsequent
year, every department or agency whose budget is subject to
review by OMB will submit a report of resources for evalua-
tion and management improvement, using the format of the
attached exhibit.
Fiscal Year - Enter the current fiscal year.
Department or agency - Enter the name of the department or
independent agency submitting the report.
Bureau - Cabinet departments are to submit a separate sheet
for each bureau or comparable organizational unit, plus a
separate sheet giving department totals. Independent agencies
submit a single sheet giving agency totals only.
Resources (Obligations) - For each line enter the total obli-
gations, in thousands of dollars, expected to be devoted to
that function during the current fiscal year. Include in the
totals only obligations for personnel compensation, personnel
benefits, contracts, and grants.
Resources (Staff Years) - For each line enter the total esti-
mated full-time equivalent staff years devoted by all agency
employees (both full-time and part-time) to that function
during the current fiscal year. Include professional, admin-
istrative, and clerical staff, but do not include any employees
who will spend less than 25 percent of their time on the func-
tion during the fiscal year.
LINE ENTRIES
For each function listed, enter obligations and staff years
as indicated. Do not include any resources in more than one
category. Prorate among categories if necessary. For example,
if a particular study is 70% program evaluation and 30% man-
agement evaluation, include 70% of the study resources under
program evaluation and 30% under management evaluation.
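The proration rule is plain arithmetic; restating the 70/30 example with
a hypothetical study cost:

    # A study that is 70 percent program evaluation and 30 percent
    # management evaluation is split between the two line entries.
    # The obligations figure is hypothetical.
    study_obligations = 250                                # thousands of dollars
    program_evaluation = study_obligations * 70 // 100     # 175
    management_evaluation = study_obligations * 30 // 100  # 75
    print(program_evaluation, management_evaluation)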
For each line entry, supply the name, title, address, and
telephone. number of the principal agency official directly
responsible for and familiar with the details of the activi-
ties included.
1. Management Evaluation - Enter total obligations and staff
years devoted to management evaluation as defined in Circu-
lar A-117.
For the purposes of this report, management evaluation acti-
vities consist of the following:
(1) Planning, developing, assessing, and modifying:
-- Organizational structures and relationships;
-- Operating procedures;
-- Internal regulations, instructions, and
delegations of authority; and
-- Management information and control systems
(but not including normal operation and
maintenance of such systems).
(2) Conducting or guiding
-- Assessments of operating efficiency or
effectiveness; and
-- Analyses of specific administrative needs.
(3) Assessing worker productivity (but not including
routine collection and processing of productivity
data), achievement of performance objectives, and
other quantitative measures of operational effi-
ciency and effectiveness.
2. Program Evaluation - Enter total obligations and staff
years devoted to program evaluation as defined in Circular A-117.
In the spaces provided, give a breakdown of obligations into
personnel costs (compensation and benefits) and contracts and
grants.
For the purposes of this report, the following are program
evaluation activities:
(1) Formal studies, surveys, and data analyses for the
purpose of determining the effectiveness, efficiency,
or impact of a national or regional program.
(2) Systematic assessment of demonstration programs or
projects which are expected to have major implica-
tions for programs of national or regional scope;
except that evaluation activities which are an in-
trinsic part of the program operations or management
should not be reported.
(3) Assessment and development of program designs to
assure that programs, once operational, can be
successfully evaluated.
(4) Design, development, and field testing of new
program evaluation methodologies.
(5) Synthesis and further analysis of results
obtained by several previous program evaluation
efforts.
(6) Collection of initial data to help in evaluation
design and provide a baseline for subsequent
evaluations.
The following are not to be included in program evaluation
activities for this report:
(1) Design, development, and operation of general data
systems or management information systems.
(2) Continuing collection of routine data and general
purpose statistics.
(3) Analysis of existing or proposed policies where no
programs yet exist (even though authorized) for
purposes of appraising the likely costs and effects
of feasible alternatives. Although such analyses
are often called "evaluations," they are prospective
in character; whereas program evaluation is retro-
spective, aimed at determining what has actually
occurred as the result of past program actions.
(4) Basic research and studies intended to increase
or foster general knowledge development, but which
are not expected to be used specifically and pri-
marily in policy and management decisions.
(5) Routine, day-to-day monitoring of program operations
which is an intrinsic part of program administration.
3. Productivity Measurement - Enter total obligations and
staff years devoted to work measurement, determination
of unit costs, and the collection and processing of other
data whose primary use is to measure the productivity of
the agency's own operations. Include routine measurement
activities as well as special studies.
4. Other Management Improvement - Enter total obligations
and staff years devoted to management improvement, as
defined in Circular A- 117, which are not included in
the other line entries.
Total Department/Agency/Bureau - Enter the total obli-
gations and staff years devoted to the management and
evaluation functions detailed in the line entries above.
The line entries should add to the total shown.
PREVIOUS YEAR UPDATE
Beginning with the May 15, 1980 report, the resources
actually devoted to evaluation and management improvement
activities in the department or agency for the previous
fiscal year should be reported. However, no update is
necessary if the actual department or agency totals are
within 10% of the previously reported estimates.
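The update rule amounts to a 10-percent tolerance test; a sketch with
hypothetical figures:

    # Actuals are resubmitted only when they differ from the earlier
    # estimate by more than 10 percent (amounts hypothetical).
    def update_required(estimated, actual):
        return abs(actual - estimated) > 0.10 * estimated

    print(update_required(1000, 1080))   # False: within 10 percent
    print(update_required(1000, 1150))   # True: resubmit actuals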
SUBMISSION
Reports for all departments and agencies covered by this
requirement shall be submitted by May 15 each year to OMB,
attention: Management Improvement and Evaluation Division.
Inquiries concerning this report should be addressed to
the Division (395-5193).
EXHIBIT
Circular A-117
RESOURCES FOR EVALUATION AND MANAGEMENT IMPROVEMENT
Date of submission:
Check one: Current year estimate
Previous year update**
MANAGEMENT IMPROVEMENT                    RESOURCES
AND EVALUATION                 Obligations              Staff Years
FUNCTION                 (Thousands of dollars)            (FTE)
1. Management
Evaluation
2. Program Evaluation
(Personnel)
(Contracts & Grants)
3. Productivity
Measurement
4. Other Management
Improvement
RESPONSIBLE OFFICIAL
Name, Title, Address,
Telephone Number
* Cabinet Departments submit a separate sheet for each bureau or comparable subunit, plus one sheet giving
Department totals. Independent agencies submit a single sheet giving agency totals only.
** Actual resources for the previous year should be submitted only if they differ by more than 10 percent
from the previously reported estimates for that year.
APPENDIX C - Evaluability Assessment (EA)
"Evaluability Assessment (EA) is a descriptive and analytic process
intended to produce a reasoned basis for proceeding with an evaluation that
is likely to prove useful to both management and policy makers." [Schmidt, 1978]
EA considers and documents the following:
(a) the system objectives, expectations, and causal
assumptions of policy-makers and managers in
charge of the program.
(b) what political groups (Congress, Executive Branch
policy-makers, and interest groups) say the system
objectives are.
(c) the extent to which management's system objectives
and information needs have been defined in
measurable terms.
(d) the system activities actually under way.
(e) the likelihood that system activities will achieve
measurable progress toward system objectives.
(f) likely uses of information on system performance.
(g) options for changes in system activities, objectives,
and use of information that could enhance system
performance.
The final products of an EA would be:
(1) a written list of agreed-on system objectives, important
side-effects, and measures (indicators of performance) to
which the system managers could realistically be held
accountable.
(2) a set of evaluation/management options of ways that manage-
ment can change system activities, objectives, or uses of
information (measures of performance) so as to improve
system performance.
A. EA Origins
The EA methodology grew out of efforts of the program evaluation
group at the Urban Institute during the last ten (10) years. Many of these
efforts were Federally funded studies - in particular through DHEW.
B. EA Assumptions
The assumptions related to EA are:
(1) the environment of governmental systems (programs) is complex.
(2) objectives of such systems are unclear, cloudy, poorly
defined, ambiguous, and vague.
(3) results of system activities are often uncertain.
(4) demand for evaluation is growing.
(5) problems in response to demands for evaluation:
- purpose of evaluation unclear.
- few examples of evaluation resulting in
more effective performance.
- usefulness of evaluation products questioned
by both officials and evaluators.
Research and experience in the field of evaluation have shown that
evaluation is likely to aid in improving system (program) performance if the
following three criteria are met:
(1) system (program) objectives are well defined -
specific measures of performance and data on those
measures exist and are obtainable at a reasonable cost.
(2) system assumptions/objectives are plausible -
evidence exists that system activities have some
probability of causing the results/effects stated
in the system objectives.
(3) intended uses of evaluation (assessment) information
are well defined and plausible -
the system managers have defined the uses of
evaluation information and such uses are plausible.
EA is a preliminary assessment of a system (program) to determine if the
above three criteria are satisfied. The major assumption is that if these
criteria are not met, then further efforts to evaluate the system or activity
would be fruitless.
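The screening logic implied by these criteria is a simple conjunction; a
sketch (the function name below is ours, not part of the EA literature):

    # Proceed to full evaluation only when all three EA criteria are met:
    # well-defined objectives, plausible assumptions, and well-defined,
    # plausible intended uses of the evaluation information.
    def evaluation_worthwhile(objectives_well_defined,
                              assumptions_plausible,
                              intended_uses_plausible):
        return (objectives_well_defined and assumptions_plausible
                and intended_uses_plausible)

    print(evaluation_worthwhile(True, True, False))   # False: do not proceed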
C. EA Experience/Applications
EA studies have been conducted in the following Federal or government
agencies:
(1) National Institute of Mental Health (NIMH) - 1974.
(2) National Institute of Law Enforcement and Criminal Justice - 1977.
(3) Appalachian Regional Commission - 1976.
(4) Bureau of Health Planning and Resources Development - 1977.
The above four (4) cases are discussed, with further references, in
J.S. Wholey, Evaluation: Promise and Performance, Urban Institute, 1979.
D. Applicability of EA to Intelligence Community
1. A formal Evaluability Assessment (EA) of the HUMINT Tasking and Reporting
System is recommended. An EA can be conducted at two (2) levels, and both
levels should be covered:
(a) Tasking Plan level - the system supporting the development
and production of Tasking Plans and their later retasking.
This is the set of activities that make up the HTO(TB) and
HTO(AB) portion of the HUMINT Assessment System (HAS).
(b) National HUMINT Tasking System (NHTS) or System level -
the overall system (which includes level (a) above) related to
Intelligence Community (IC) tasking of various agencies/
departments, their tasking of collection resources, collection
activities, reporting, analysis (consumer), final users
(policy-makers), assessment/evaluation, and retasking.
This is the total HUMINT Assessment System (HAS) supporting
HTO.
2. The EA would accomplish the following for both system levels above -
Plan level and System level:
(a) provide well defined system objectives.
(b) establish plausibility of system assumptions/objectives
(system activities have probability of causing results/
effects stated in objectives).
(c) determine plausibility of intended uses of evaluation and
assessment reporting.
It is recommended that the Plan level EA be conducted and completed
first as a test/experimental case for the System level EA. Progress has been
made on the Plan level EA.
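The recommended sequencing, Plan level first as a test case and System level second, could be captured as simple configuration data. The sketch below is an assumed encoding of the two scopes; the names PLAN_LEVEL, SYSTEM_LEVEL, and EA_SEQUENCE are illustrative only.

```python
# Hypothetical encoding of the two recommended EA scopes and their order.
PLAN_LEVEL = {
    "name": "Tasking Plan level",
    "scope": "HTO(TB) and HTO(AB) portion of the HUMINT Assessment System (HAS)",
    "role": "test/experimental case, conducted and completed first",
}

SYSTEM_LEVEL = {
    "name": "National HUMINT Tasking System (NHTS) level",
    "scope": "IC tasking, collection, reporting, analysis, final use, "
             "assessment/evaluation, and retasking (includes the Plan level)",
    "role": "full-system EA, informed by the Plan level results",
}

EA_SEQUENCE = [PLAN_LEVEL, SYSTEM_LEVEL]

for phase in EA_SEQUENCE:
    print(f'{phase["name"]}: {phase["role"]}')
```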
APPENDIX D - National Tasking and Response/HUMINT - System Structure and Flow

[Full-page flow chart; the reproduction is largely illegible. Recoverable
labels include: National Intelligence Needs/Policy Support; DCID 1/2;
National Intelligence Objectives; National HUMINT Tasking Plans; Collection
Objectives and Directives; Collection Guidance; Field Collection and
Reporting; Quality Control; Response (NFAC, DIA, INR, etc.); Policy/Program
Users; geographic area, region, country; reporting priority; correlation;
HTO/TB and HTO/AB observation; Collector Observation; Consumer/User
Observation.]
APPENDIX E - HUMINT Response Assessment Objectives
1. To ensure the subject and geographic coverage by HUMINT
that national need requires of that type of intelligence,
as specified in HUMINT Tasking Plan objectives and
associated amplifying requirements.
2. To ensure HUMINT responsiveness adequate to the priority
dependence placed upon it in meeting national need, as
specified in Tasking Plans in connection with certain
objectives.
3. To ensure an amount and quality of HUMINT that constitute
sufficient response to the amplified objectives in
Tasking Plans, in accordance with the collection
responsibilities assigned, as revealed in the uses and
impacts of the reporting.
4. To identify significant shortfall in timely, quality
HUMINT response to amplified objectives in the Tasking
Plans, as revealed in the observations of key consumers
and users of HUMINT.
5. To obtain the consumer/user feedback required in the
above in such a way as to ensure its reliability and
availability.
6. To obtain this feedback through a system that is
cost-effective and that helps spread new emphasis upon
effectiveness measurement with respect to HUMINT
collection and its assessment.
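Objectives 1 through 6 above describe, in effect, what a single assessment record would have to capture: coverage against an amplified Tasking Plan objective, responsiveness to priority, amount and quality of reporting, identified shortfalls, and consumer/user feedback. A minimal sketch of such a record follows; all field names are hypothetical and offered only as an illustration of the objectives, not as the system's design.

```python
from dataclasses import dataclass, field

@dataclass
class ResponseAssessment:
    """Hypothetical record tracking HUMINT response against one Tasking Plan objective."""
    objective_id: str            # amplified Tasking Plan objective being assessed
    subject_coverage: bool       # required subject coverage achieved? (objective 1)
    geographic_coverage: bool    # required geographic coverage achieved? (objective 1)
    priority_met: bool           # responsiveness matched priority dependence? (objective 2)
    report_count: int = 0        # amount of reporting received (objective 3)
    quality_rating: float = 0.0  # quality of reporting per its uses and impacts (objective 3)
    shortfalls: list[str] = field(default_factory=list)        # significant shortfalls (objective 4)
    consumer_feedback: list[str] = field(default_factory=list) # reliable user feedback (objectives 5-6)
```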