COMPUTER PROBLEMS IN GOVERNMENT
Document Type:
Collection:
Document Number (FOIA) /ESDN (CREST):
CIA-RDP79-00498A000300110009-7
Release Decision:
RIPPUB
Original Classification:
K
Document Page Count:
92
Document Creation Date:
December 12, 2016
Document Release Date:
April 12, 2002
Sequence Number:
9
Case Number:
Publication Date:
May 10, 1976
Content Type:
OPEN
File:
Attachment | Size |
---|---|
CIA-RDP79-00498A000300110009-7.pdf | 6.02 MB |
Body:
Congressional Record
PROCEEDINGS AND DEBATES OF THE 94th CONGRESS, SECOND SESSION
United States of America
Vol. 122    WASHINGTON, MONDAY, MAY 10, 1976    No. 68
Senate
COMPUTER PROBLEMS IN GOVERNMENT
Mr. RIBICOFF. Mr. President, the General Accounting Office, examining computer-related crimes in Federal programs, studied 69 individual cases that together totaled more than $2 million in losses to the Government.
The GAO inquiry revealed that computer fraud is a growing problem in both the Government and the private sector and that, in many instances--no one knows how many--it is almost impossible to detect.
The name of the GAO study is "Computer-Related Crimes in Federal Programs." The study is dated April 29, 1976.
GAO obtained its information from the investigative files of the Criminal Investigations Division--CID--Command of the Army; the Navy Investigative Service--NIS; the Office of Special Investigations--OSI--of the Air Force; the Justice Department's Executive Office for U.S. Attorneys and the FBI; the Office of Investigation of the Agriculture Department; the Internal Revenue Service in Treasury; HEW's Social Security Administration; the Division of Investigation of the Interior Department; and the Investigation and Security Services of the Veterans Administration.
In the preponderance of the 69 cases,
criminal prosecutions resulted.
GAO auditors cited these instances of computer crimes in Government:
A Defense Department fuel supply employee who had helped automate an accounting system introduced fraudulent payment vouchers into the system. The computer could not recognize that the transactions were fraudulent and issued checks payable to fictitious companies set up by the employee and his accomplices. These checks were sent directly to banks where the conspirators had opened accounts for the companies. The criminals then withdrew the funds from the accounts. Officials estimated the Government paid this employee and his accomplices $100,000 for goods and services that were never delivered.
A supervisory clerk responsible for entering claims transactions to a computer-based social welfare system found she could introduce fictitious food stamp claims on behalf of accomplices and they would receive the benefits. She processed more than $90,000 in claims before she was discovered through an anonymous tip.
An engineer who was no longer employed at a computer installation managed to continue using the equipment for his own purposes. Before he was discovered, he had used more than $1,000 worth of computer time. At another installation a programmer used a self-initiated training program to obtain the use of his agency's computer system. But instead of working on the training exercise, he was developing his own computer programs which he hoped to sell, GAO auditors said.
The manager of a computer center processing personal information stole some of this data and sold it to private firms. The private firms, none of which were authorized to have such data, used the information to promote their products. GAO said that although the Government did not lose money in this case, the privacy of individuals whose data records were involved was violated.
At one large Army installation officers
estimated that 80 percent of all thefts
may have been computer related.
In transmitting their report to the Congress, GAO auditors said they were precluded from being more specific about individual instances of computer fraud because, first, in many instances information came from raw investigative files; second, several of the cases reviewed were still open or were about to be prosecuted at the time GAO completed its inquiry; and third, persons who had perpetrated computer frauds cooperated with GAO but with the understanding that they would not be identified.
GAO auditors said most of the cases they studied did not involve sophisticated attempts to use computer technology for fraudulent purposes. Instead, GAO said, these were uncomplicated acts which were made easier because management controls over the systems involved were inadequate.
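The kind of basic management control GAO found missing in cases like the fuel voucher fraud can be pictured with a short sketch. The vendor list, voucher fields, and dollar threshold below are hypothetical illustrations, not details from the GAO report; the point is only that a payment system can refuse to issue a check to a payee that is not on an approved vendor master file and can route unusual vouchers to a human reviewer.

```python
# Illustrative sketch only; the vendor file, voucher fields, and limit are hypothetical.
from dataclasses import dataclass

@dataclass
class Voucher:
    voucher_id: str
    payee: str
    amount: float

# Hypothetical "vendor master file" of payees authorized to receive checks.
AUTHORIZED_VENDORS = {"Acme Fuel Co.", "Eastern Supply Inc."}
REVIEW_THRESHOLD = 5_000.00   # route larger payments to a human reviewer

def disposition(voucher: Voucher) -> str:
    """Decide whether a payment voucher may be paid automatically."""
    if voucher.payee not in AUTHORIZED_VENDORS:
        return "REJECT: payee not in vendor master file"
    if voucher.amount > REVIEW_THRESHOLD:
        return "HOLD: route to manual review"
    return "PAY: issue check"

if __name__ == "__main__":
    for v in (Voucher("V-001", "Acme Fuel Co.", 1_200.00),
              Voucher("V-002", "Fictitious Co.", 9_800.00)):
        print(v.voucher_id, disposition(v))
```

In the fuel supply case GAO describes, neither check existed: the system issued checks to any payee a voucher named.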
Forty-three of the 69 cases of computer-related crimes were classified by GAO as being "fraudulent record initiation." Under this category, GAO included cases in which Federal employees, or
persons employed by Government contractors, deliberately falsified information from records and documents to be fed into computers. Also included in this category was the act of falsifying claims by reuse of supporting documents previously processed.
The second category of computer-related crimes is termed "unauthorized or inappropriate use of facilities and supplies." This category includes developing salable programs on Government computers, doing commercial work for outsiders on Government computers, and duplicating files and selling them.
"Processing alteration or destruction"
is the third category of computer-related
crimes studied by GAO. This offense in-
cludes such crimes as sabotage or alter-
ing information in the files affecting pay,
promotion or assignment, and bypassing
existing controls to enter unalithorizecl
changes.
The final category examined by GAO is "misappropriation of output." Included under this section is the misappropriation of returned checks.
In connection with its review of computer-related crime in the Government, GAO commissioned the Stanford Research Institute--SRI--of Menlo Park, Calif., to study similar crimes in the private sector.
GAO said the SRI report indicates the same types of crimes occur in both the public and private sectors. GAO said in both the public and private areas the majority of crimes were committed by systems users--that is, persons working with the computers being abused--but the proportion of user crimes is larger in Government.
GAO auditors said the size of the average loss in private sector crimes is higher than in the Government cases studied. In a review of 144 cases, SRI found the average loss in private business to be $450,000. GAO said the average loss in those Government cases in which a dollar figure was apparent was $44,000.
GAO said the Government should improve its management controls over computers. GAO also pointed out that auditors of Government computer programs should be educated about the prevalence and types of computer crimes. GAO said that several Government computer auditors did not know about crimes which had been committed in their own programs until GAO informed them.
Another General Accounting Office study found that Navy auditors identified a computer as being incorrectly programed in 1969 but the computer was not fixed for 5 years, during which time the machine initiated unnecessary actions that cost the Navy $3 million a year.
GAO said one of the reasons the Navy gave for the 5-year delay was that Navy officials were concerned that by correcting the computer program they might cause budget reductions.
Another instance of computer shortcomings, GAO said, could be seen in a situation in which Army computers directed the shipment of radioactive equipment without requiring the stipulated safeguards for proper handling.
These examples were cited by GAO to demonstrate problems in the Federal Government's "automated decisionmaking computers." These computers, operating without human supervision, annually initiate payments, purchases and other expenditures involving many billions of dollars in Government funds and resources, and people are not required to review these actions to determine whether they are correct or not.
In its report, dated April 26, 1976, entitled "Improvements Needed in Managing Automated Decisionmaking by Computers Throughout the Federal Government," GAO concluded:
Computers in Federal departments and agencies annually issue unreviewed payments and other actions involving billions of dollars in Government assets. These actions are often wrong. They can cost the Government huge sums of money; exactly how much no one knows.
It is troubling to note the extent to which these decisionmaking computers are able to decide things on their own. Computer technology is progress, of course. But people should monitor closely what these machines are up to. For all their heralded memory banks and fantastic recall, computers are still basically beasts of burden. They have no intelligence, except for what information people insert in them.
"Automated decisionmaking by computers" occurs when computers are programmed to make payments, purchase material, and otherwise spend money and take actions without the assistance of or review by people.
In their study of automated decisionmaking computers, GAO auditors concluded that these kinds of computers initiate more than 1.7 billion payments and other actions by Government a year without any person evaluating whether they are correct.
Government automated decisionmaking computers issue each year a minimum of unreviewed authorizations for payment or checks (excluding payroll) totaling $26 billion, the GAO report said.
Unreviewed bills totaling at least $10 billion are issued annually by automated decisionmaking computers, the GAO auditors said.
In addition, the GAO said, these same computers issue annually unreviewed requisitions, shipping orders, repair schedules and property disposal orders for material valued at $8 billion.
GAO obtained information on 128 automated decisionmaking computer programs at the Army, Navy, Air Force, Defense Supply Agency, General Services Administration, Railroad Retirement Board, Veterans Administration and the Departments of Agriculture, Commerce, Housing and Urban Development, Interior, Treasury and Health, Education and Welfare.
The GAO auditors cited examples in which automated decisionmaking computers had resulted in millions of dollars of waste and, in one instance, the unauthorized handling
of radioactive components for military equipment.
In 1969, the GAO report said, the Navy's own auditors found that a computer program serving the Navy Aviation Supply Office in Philadelphia was inadequately designed regarding the ability to correctly reflect demand for the purchase and repair of naval aircraft and spare parts.
The Aviation Supply Office in Philadelphia is the central manager for all the purchases and repair of aircraft and spare parts for the entire Navy. The Aviation Supply Office is under the Naval Supply Systems Command of the Department of the Navy.
The inadequacy in the automated decisionmaking computer program at the Aviation Supply Office was not corrected. The problem was noted in a GAO study issued May 21, 1974, entitled "Better Methods Needed For Cancelling Orders For Material No Longer Required."
Again, however, the inadequacy was not corrected and the decisionmaking computer continued to inaccurately reflect demand for new equipment and for repairs on naval aircraft. Five years went by before the needed correction was made. "At least $3 million in annual unnecessary cost" were initiated by automated decisionmaking applications using this overstated demand data, GAO auditors said.
Design of the automated decisionmaking computers at the Aviation Supply Office was developed at the Fleet Materiel Support Command, Mechanicsburg, Pa., which also reports to the Naval Supply Systems Command in Washington.
GAO asked Navy officials why it had taken so long to correct the computer inadequacy. The GAO report said:
The reasons cited by Navy officials for the 5-year delay in initiating the modifications included:
Disagreements within the Navy on whether all canceled requisitions should result in reducing recorded demands,
High-priority workload at the design activity mandated by higher headquarters levels in both the Navy and the Department of Defense, and
Lack of pressure placed on the Navy command design activity by the inventory control points since reduced demands could result in budget reductions. [Emphasis added.]
The Veterans' Administration--VA--uses automated decisionmaking computers to make monthly payments to more than 185,000 veterans in apprenticeship and other on-the-job training programs. The VA computers are supposed to be programed to make payments at a rate that decreases every 6 months, under the assumption that an individual veteran's pay from his employer will increase as he learns his trade.
Annually, the VA computers process about 1.4 million unreviewed checks for more than $225 million in apprenticeship and other on-the-job training benefits. However, the data submitted to the computers was incomplete and, GAO auditors said, checks went out at the highest levels to the veterans and no progressively declining payment system was implemented. The result, GAO said, was potential overpayments of $700,000.
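The declining-rate rule GAO describes can be stated as a simple check. The step-down schedule below is a hypothetical assumption (the report does not give the actual rates); the sketch only shows how a check could be compared against the declining amount the VA computers were supposed to apply every 6 months.

```python
# Hypothetical step-down schedule: fraction of the initial benefit payable in each
# successive 6-month period of on-the-job training. Rates are illustrative only.
SCHEDULE = [1.00, 0.75, 0.55, 0.40]

def expected_payment(initial_benefit: float, months_in_training: int) -> float:
    """Benefit the schedule would allow after a given number of months."""
    period = min(months_in_training // 6, len(SCHEDULE) - 1)
    return round(initial_benefit * SCHEDULE[period], 2)

def flag_overpayment(initial_benefit: float, months: int, paid: float) -> bool:
    """True when the check issued exceeds the scheduled declining amount."""
    return paid > expected_payment(initial_benefit, months)

if __name__ == "__main__":
    # A veteran 14 months into training falls in the third period of the schedule.
    print(expected_payment(292.00, 14))          # 160.60
    print(flag_overpayment(292.00, 14, 292.00))  # True: still paid at the highest rate
```

GAO's finding was, in effect, that the comparison never happened: incomplete data left the checks at the first-period rate indefinitely.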
Code 8 is the designation the Army gives to equipment and spare parts which have radioactive components and which, therefore, are required to be handled by authorized personnel in a stipulated manner.
GAO said it obtained from the Army Audit Agency data concerning the Army Electronics Command, Fort Monmouth, N.J., which processes each year at least 250,000 requisitions for material valued at a minimum of $250 million. About 35 percent of the requisitions are reviewed by people, GAO said, and the remaining 65 percent are processed by automated decisionmaking computers without review by people.
The Army Audit Agency examined 86 radioactive commodities handled by this Command's automated decisionmaking computers and found that 18 of the commodities were processed not with the radioactive designation of code 8 but instead carried a code 0 rating. Code 0 means that no special controls or handling are required, GAO said.
In addition, the GAO auditors said, another 11 radioactive commodities were categorized as code 1, the code that indicates that the item is scarce, costly or highly technical--but not that it is radioactive.
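A control of the kind the Army Audit Agency's finding implies is easy to state: before an automated decisionmaking program releases a requisition, cross-check the item against an authoritative list of radioactive components and refuse to process it under any code other than code 8. The item identifiers and data layout below are hypothetical; only the cross-check idea comes from the report.

```python
# Hypothetical item identifiers; only the cross-check logic is the point.
RADIOACTIVE_ITEMS = {"NSN-6665-0001", "NSN-5820-0042"}  # items known to contain
                                                        # radioactive components
CODE_RADIOACTIVE = 8   # Army code requiring stipulated handling safeguards

def validate_requisition(item_id: str, handling_code: int) -> list[str]:
    """Return the problems that should stop automatic processing."""
    problems = []
    if item_id in RADIOACTIVE_ITEMS and handling_code != CODE_RADIOACTIVE:
        problems.append(f"{item_id}: radioactive item carries code {handling_code}, "
                        f"must be code {CODE_RADIOACTIVE}")
    return problems

if __name__ == "__main__":
    # A radioactive item miscoded as 0 (no special controls) is caught here
    # instead of being shipped without the stipulated safeguards.
    print(validate_requisition("NSN-6665-0001", 0))
```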
GAO said the Army Audit Agency also studied the application of automated decisionmaking computer technology at five Army inventory control points. The Army auditors found the computers were often in error in deciding where material should be shipped. The result, the Army auditors showed, was an annual loss of $900,000 in unnecessary transportation costs. In addition, a total of $1.3 million was incurred by the Army in the early 1970's due to unnecessary inventory increases caused by errors in these same computers.
The GAO report said that a major cause of inaccurate computer tabulations in the Government is the massive amounts of information fed into the machines, which lead "input preparers"--that is, computer personnel--to make mistakes.
GAO noted, for example, that the Navy Aviation Supply Office in Philadelphia receives about 10 million "transaction reports" each year, all of which are then fed into computers. Transaction reports are mainly prepared by Navy facilities that receive, store and issue aeronautical equipment.
In addition, GAO auditors estimated that during a 12-month period the VA Center in Philadelphia prepared more than 4 million documents for insertion into computers.
To insure more accurate automatic computer calculations, GAO proposed that the Government require selective or cyclical monitoring of actions directed by automated decisionmaking computers. The GAO also recommended that outside auditors or independent design teams from elsewhere in a given agency be called in to study the design of a computer program before it is allowed to begin making automated decisions.
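"Selective or cyclical monitoring" can be pictured as routine sampling of the actions a computer has already issued. The sketch below is one possible reading of that recommendation, not GAO's own procedure: every action above a dollar threshold is pulled for review, plus a small random fraction of the rest. The threshold and sampling rate are illustrative assumptions.

```python
# One possible reading of "selective or cyclical monitoring"; the threshold and
# sampling rate are illustrative assumptions, not figures from the GAO report.
import random

DOLLAR_THRESHOLD = 5_000.00   # always review large actions
SAMPLE_RATE = 0.02            # randomly review 2 percent of the remainder

def select_for_review(actions, rng=random.Random(1976)):
    """Pick which computer-issued actions a person should examine."""
    selected = []
    for action_id, amount in actions:
        if amount >= DOLLAR_THRESHOLD or rng.random() < SAMPLE_RATE:
            selected.append(action_id)
    return selected

if __name__ == "__main__":
    issued = [(f"PAY-{i:04d}", amt) for i, amt in
              enumerate([120.0, 8_400.0, 95.5, 12_000.0, 310.0])]
    print(select_for_review(issued))   # the two large payments, plus any sampled
```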
A third General Accounting Office study found that the Federal Government's 9,000 computers, which are involved in billions of dollars in transactions and contain vast amounts of information, are inadequately protected against terrorism, vandalism, program alteration, and natural disasters.
We can see the potential harm in Government's failure to adequately protect computer facilities when we consider what enormous personal tragedies would result from serious damage to the social security computerized system. Social security could not function without its computers. It is impossible to estimate the effects on millions of our elderly citizens whose livelihood depends on social security should the computers be destroyed.
But the potential threat is not limited
to social security. In terms of Federal
revenues, for instance, imagine the havoc
that would result from the destruction of
Federal tax records.
In addition, the number of veterans in this country is larger than ever before. Each of these men and women who served in the Armed Forces may be receiving, or may be entitled to receive, benefits from their military service. Valuable data and records pertaining to their military service--and the benefits that accrue from that service--are on computer tapes and, in the event of catastrophe, could be lost forever.
Since 1965, responsibility for control of computer applications in the Federal Government has been shared by the General Services Administration, the Office of Management and Budget, and the Department of Commerce.
The GAO report is named "Managers Need To Provide Better Protection for Federal Automatic Data Processing Centers." It is dated May 10, 1976.
The GAO report said the total value of Government's 9,000 computers "is many billions" of dollars.
GAO said the value of some of the data which is processed on these computers, such as social security records, is immeasurable.
GAO auditors said:
Consequently, protecting equipment and data from unauthorized or inadvertent acts of destruction, alteration or misuse is a matter of inestimable importance.
GAO said, for example, that the National Aeronautics and Space Administration could not carry out space programs without computer applications; nor could the Federal Aviation Administration control aircraft effectively.
Computers are used to manage the more than half-billion transactions processed by the Social Security Administration and the 4 billion facts relating to the national population compiled and managed by the Bureau of the Census, GAO auditors said, adding that many other Federal agencies rely heavily on computer technology.
Catastrophic losses to Government-sponsored data processing installations, such as the loss of human life, irreplaceable data and equipment, have occurred, GAO said. In many of these losses, GAO said, additional security measures were implemented after the event.
GAO said information on the physical security measures employed at 28 Federal data processing facilities led its auditors to conclude that Federal data processing assets and valuable data are not properly protected.
GAO recommended that to provide more security over Government automatic data processing operations, the Office of Management and Budget--OMB--should direct that management officials be appointed at Federal installations having data processing systems and that they be assigned responsibility for automatic data processing physical security and risk management.
Reflective of the amount of money Federal agencies spend on computers, GAO said, is the fact that more than $10 billion is expended each year to buy and operate Federal data processing systems.
In concluding that security safeguards are inadequate regarding computers, GAO studied security techniques at 28 data processing installations of the Departments of the Army, Navy, Air Force, Agriculture, Transportation, State and Health, Education and Welfare and the Veterans' Administration.
Besides the 28 Federal data processing sites, GAO auditors also studied security problems identified at 23 additional Government computer installations.
In addition, GAO examined data processing security systems used at Government contractor sites, universities, private companies, a bank, and a local government.
GAO said major areas of security covered in its investigation of data processing facilities included steps taken by management to guard against threats of modification or destruction to the physical plant, personnel, computer hardware and software, and to the data being processed or stored by the computerized systems.
Eighteen of the 28 data processing installations were in the continental United States. The remaining 10 were abroad.
Among its findings that computer installations are not properly protected, GAO noted that:
Fourteen installations had combustible paper supplies or magnetic tape files which were stored in computer rooms, which exposed systems to losses from fire.
Three installations had computers
which were in use in areas where only
portable fire extinguishers were avail-
able.
One installation's computers were in
operation where no portable fire extin-
guishers were available.
Twelve installations had computers which were in use above raised flooring without periodically cleaning below such flooring, constituting a fire hazard.
Six installations had computers which were in operation where master electrical power shutdown controls were not easily accessible at exit points.
Ten installations had computers in operation in areas where overhead water or steam pipes--excluding sprinkler systems--existed with inadequate provision for drainage.
Two installations had computers which were used in basements, below ground level, exposing systems to potential flooding conditions.
Seven installations allowed vendor service personnel near computer banks without supervision.
Five installations allowed in-house
service personnel to move about without
supervision in computer areas.
Three installations located computers
in quarters that were vulnerable to van-
dals.
Five installations managed their com-
puters in ways susceptible to theft or
misuse. Remotely located computer sys-
tems were in operation without controls
to detect improper or erroneous attempts
to use computers or data files.
Fourteen installations lacked contin-
gency planning. Computerized systems
were in operation without formal con-
tingency plans to insure continuity of
operations if an event occurred that
threatened security.
GAO studied instances in which major
data processing facilities had been hit
by terrorism, vandalism, fire or natural
disaster.
GAO said attempts at sabotage of computer activities have been made by employees within data processing centers. GAO said four attempts had been made to sabotage computer operations at Wright-Patterson Air Force Base near Dayton, Ohio, during a 6-month period ending November 15, 1974, by using magnets, loosening wires on the computer mainframe and gouging equipment with a sharp tool.
On August 24, 1970, a bomb exploded outside the Sterling Hall Building at the University of Wisconsin. This building housed the Army Mathematics Research Center and other federally funded research activities. One employee was killed and three others were injured. The explosion damaged 25 buildings at the university and resulted in a total loss of $2.4 million for buildings and equipment. Computers at the Army Mathematics Research Center were damaged and some programing efforts and 20 years' accumulated data were destroyed. It has been estimated that this research data represented more than 1.3 million staff hours of effort. GAO calculated this effort to represent an investment of $16 million.
In May of 1972, a bomb exploded on the fourth floor of the Pentagon above the computer facility and caused extensive damage. The computer facility was flooded from broken water pipes and parts of it were inoperable for about 29 hours.
The computer center at the National Institutes of Health, Bethesda, Md., has experienced many computer system failures due to electrical power failures. GAO said officials of the computer center estimated that they lost a minimum of 000,000 annually from electrical power fluctuations. During a 15-week period, the NIH computer center experienced 6 major electrical power fluctuations which caused 15 computer system failures. These failures resulted in destruction of data for 375 batch processing jobs and for 2,250 remote terminal users.
GAO said these power fluctuations caused replacement of electronics costing more than $94,000 in various components of the computer systems.
On June 24, 1972, water from the Susquehanna River flooded all of downtown Wilkes-Barre, Pa., and filled the basement of the post office building. Water continued rising until about 6 inches of it were on the computer room floor. About $7.5 million worth of Government computer equipment was located on raised flooring on the first floor. Had the water risen about an inch more it would have ruined virtually all of the computer equipment, GAO said.
GAO described a 1959 fire at the Pentagon which destroyed three complete computer systems valued at $6.5 million. The fire started in a vault containing stored paper and magnetic tape and spread throughout the computer center. When the fire occurred, employees were unable to reach the switch to turn off electrical power for the computer system. This created a hazardous situation for firefighting efforts.
GAO cited another example of catastrophic loss caused by fire to a Government facility, although computer records were not directly involved. In July of 1973, fire broke out in the Military Personnel Records Center in St. Louis, Mo. Sections of the building housing these records were not equipped with sprinkler systems, smoke detectors or fire walls. Although the fire did major damage to papers and not computerized records, GAO said, it nevertheless illustrated how devastating the loss of irreplaceable documents and records can be.
GAO said that since such records are being put on computers more and more, the problem increasingly becomes a computer security problem.
GAO said the St. Louis records center has been the repository for about 52 million records on military personnel actions since 1912. The sixth floor, where the fire started, contained about 22 million military personnel files or jackets. About 16.8 million of these records were lost.
Of the St. Louis fire, GAO auditors said:
This installation's mission is to maintain these official government records and to respond to inquiries made by the Congress, other government agencies and the taxpayer. This mission will now be hampered for some time because the lost records--some of which may be irreplaceable--must be reconstructed to satisfy inquiries, which is a costly and time-consuming process.
While it is unreasonable to expect that there would be backup for every original record in the manual files, it is reasonable to assume that some sort of contingency planning should have been done to insure continuity of operations when a loss has occurred. Agency officials told us that a contingency plan was formulated after the fire happened.
GAO cited an instance at Kelly Air Force Base in San Antonio, Tex., in which someone altered a computer program that resulted in a $100,000 theft of Government money. Due to the computer alteration, the Air Force paid $100,000 to bogus companies for aircraft fuel never delivered. The bogus companies were established by a Government employee working at the base. The employee had in-depth knowledge of the computerized fuel accounting system, which he helped develop and install. An investigation was begun when a bank contacted the Air Force regarding suspicious banking transactions involving Government checks. The employee was arrested, convicted and sentenced to 10 years in prison.
Among the agency comments to the GAO report were these:
James T. Lynn, Director of the Office of Management and Budget, said the GAO report was correct in citing a need for greater awareness of "threats to physical security" in automated data processing. However, Lynn said OMB did not support GAO's recommendation that an official in each agency be assigned responsibility for computer security. Instead, Lynn said, the head of each agency should decide how computer safeguards should be provided and who should be in charge.
Terence E. McClary, Assistant Secretary of Defense, Comptroller, said of the GAO report that in general, "the importance of the subject, the general substance of the report, and the thrust of the recommendations are wholeheartedly endorsed * * *"
John D. Young, Assistant Secretary of HEW, Comptroller, said, "We fully concur with the recommendations contained in the report . . ."
William S. Heffelfinger, Assistant Sec-
retary for Administration in the Depart-
ment of Transportation, endorsed the
GAO study.
The GAO report did not identify any of the specific installations where it discovered inadequate safeguards against computer damage. GAO auditors felt that to identify these sites would be to run the risk that persons might wish to exploit these security weaknesses.
Mr. President, as chairman of the Senate Committee on Government Operations, I have directed the staff to conduct a preliminary inquiry into the problems associated with computer-related crimes in Federal programs, automated decisionmaking computers in Federal programs and computer security in Federal programs.
Also in connection with computer problems in the Federal Government, Rebecca Leet in the Washington Star of May 10, 1976, has written an informative article. Printed on page 1 of the Star, the headline of the article is "Two GAO Studies Criticize Lack of Controls on Computers." Mr. President, I ask unanimous consent that the Washington Star article by Ms. Leet be printed in the RECORD.
There being no objection, the article was ordered to be printed in the RECORD, as follows:
TWO GAO STUDIES CRITICIZE LACK OF CONTROLS ON COMPUTERS
(By Rebecca Leet)
The rapid movement of the federal government to greater and greater reliance on computers has not been accompanied by controls to assure that computer orders are appropriate or necessary, according to two General Accounting Office reports.
The result is a government highly susceptible to being defrauded, even by unsophisticated workers, and to losing millions of dollars annually in overpayments, unnecessary repairs and the like.
Probably every federal agency which uses computers lacks the controls necessary to prevent the kind of computer fraud and mistakes which led to the $622 million overpayment in federal welfare benefits the Social Security Administration has made since 1974.
The Washington Star reported on Friday that audits by the Department of Health, Education, and Welfare had found that lax management of the Social Security computer system left large amounts of money exposed to errors and fraud.
The GAO investigations, undertaken after numerous reports of computer fraud in private industry, are the first time the federal government has looked at what steps it has taken to protect itself against computer fraud and mistakes since the government began using them in 1952. Currently, 9,000 computers are used by the federal government.
What GAO found, in most cases, is that government agencies have been more concerned with getting a computer program under way by the date promised than they have in seeing it function properly, according to the reports.
Ken Pollock, a GAO official who deals with computer policy, said the agency was "appalled" by the lack of controls it found. "In the old manual systems, everyone was very conscious of controls. . . . In the hurry to get automated, these things were ignored," he said.
The report on computer fraud noted that the control weaknesses which criminals were taking advantage of "are mostly basic management controls long recognized as being necessary to insure proper operations."
The rush to get the Social Security Administration's Supplemental Security Income (SSI) program instituted by its target date of Jan. 1, 1974, has been given as the main reason why the program has so many bugs in it. The Star previously disclosed that $622 million has been overpaid to the country's aged, blind and disabled welfare recipients under SSI.
Once a government computer system is in
operation, Pollock noted, "there's always
something else (for programmers) to do
other than go back" and review the system
to see if it is functioning properly.
Pollock said that GAO had difficulty in making its two studies because fraud by computers had never before been isolated from regular fraud and because no one had ever isolated the process which GAO called "automated decision-making by computers."
Automated decisionmaking by computers occurs when a computer is programmed to issue checks or bills or orders for an agency under certain circumstances and the actions are taken without humans ever reviewing them to see if they are correct.
GAO discovered that such computer programs annually issue payments or checks--excluding payrolls--totaling $26 billion. They issue bills totaling $10 billion and issue requisitions, shipping orders, repairs and disposal orders for materials valued at $8 billion. Humans never review any of those actions.
GAO said that while it believes most of the automatic decisions the computers make are correct, "we know from audit reports we reviewed that they also make bad decisions that cost the government many millions of dollars annually. Additionally, bad decisions may result in harm to people."
"There Is no federal-wide ;policy, guidance
or other instructions on how.computers is-
suing unreviewed actions should be managed
by federal agencies," the report said. "There
Is little checking or monitoring of output on
an ongoing .or short-term periodic basis.
"Internal audit reviews of these computer
actions are made sporadically or not at all,"
the report said.
Examples of the mistakes of such automated decision-making by computers, as noted in the GAO report, include:
The Navy yearly spent $3 million to perform unnecessary airplane overhauls from 1969 to 1974 because incorrect information was given to computers about the frequency of required repairs.
(As the GAO report notes, once computer mistakes are discovered, they must be corrected to change the faulty outcome. The Navy resisted correcting this program error for two years after it was discovered, at least partly, GAO was told, because it feared its budget would be reduced if a lower use level was shown.)
A faulty computer program led to the unneeded cross-country shipment of $1 million worth of Army supplies one year.
In a study of 89 such shipments, mistakes in a computer code resulted in 29 improper shipments of radioactive material. The material was shipped without proper safeguards and "there was doubt," the report noted, that customers in some cases should have received the material and in others that they knew it was radioactive.
Regarding computer fraud, the GAO said that contrary to the general assumption, those defrauding the government by computers are mainly the untrained, relatively unsophisticated computer users and not the highly trained computer programmers.
In 50 of 69 instances, computer fraud was committed by the computer user who knew which keys to press on a computer terminal to get a check issued to someone.
Since no records are kept of computer fraud separate from regular fraud, the 69 instances which the GAO investigated, Pollock said, were all cases which government investigators recalled as having involved computers.
These instances, which occurred since 1970, resulted in the federal government being bilked out of $2 million. It was relatively easy to accomplish, GAO found, because of inadequate controls over the computer systems.
Of the 69 cases, investigators reviewed 12 in depth. Their conclusions went largely unchallenged by the agencies which were victims of the fraud, according to the report.
"In every case we reviewed in detail, the incidents were directly traceable to weaknesses in the system controls. . . . The primary reason weaknesses in system controls existed was that management failed to recognize the importance of controlling systems," the report said.
"Managers . . . primarily emphasized getting their systems operational; control was not emphasized."
"Computer criminals were typically not
'professional* criminals, but persons who
encountered difficulties on a. short-term
basis and who commit their crimes to help
them solve their problems. They experience
great personal. suffering when, their acts are
discovered. Therefore, a highly visible and
active audit function could dissuade them
from attempting crime," the report noted.
The instances of fraud included amounts
ranging from $320 to $250,000. Some of the
fraud was not in bilking the government of
money but in using government computers
to design programs which were then sold to
commercial firms.
Conclusions of the GAO reports, which were sent to all government agencies using computers, are not binding on agencies. However, agencies must inform Congress within 60 days what actions they have taken as a result of the reports' recommendations on implementing controls. It is then up to Congress to decide whether legislation is needed to correct the deficiencies noted.
REPORT TO THE CONGRESS
BY THE COMPTROLLER GENERAL OF THE UNITED STATES
Improvements Needed In Managing Automated Decisionmaking By Computers Throughout The Federal Government
Computers in Federal departments and agencies annually issue unreviewed payments and other actions involving billions of dollars in Government assets. These actions are often wrong. They can cost the Government huge sums of money; exactly how much no one knows.
This report describes the ways computers issue unreviewed actions and the causes for incorrect actions. It suggests remedies to correct the situation Government-wide.
FGMSD-76-5
CHAPTER 1
INTRODUCTION
Many early business applications on computers involved entering, manipulating, and summarizing data and generating reports. Most output produced by these computers was manually reviewed (1) for correctness and/or (2) to decide what actions should be taken on the basis of the output report.
As more complex computer processing developed, the applications became more innovative. Computers were assigned certain repetitive decisionmaking work which duplicated steps people had taken to do the job previously. The output of these computers is frequently not reviewed by people (that is, no manual review).
These types of applications have no established name. We are calling them automated decisionmaking applications.
AUTOMATED DECISIONMAKING APPLICATIONS
Automated decisionmaking applications are computer programs that initiate actions (through output) on the basis of programable decisionmaking criteria established by management and incorporated in computer instructions. The distinguishing characteristic of these applications, as compared to other computer application programs, is that many of the computer's actions take place without manual review and evaluation.
An inventory application is an example of a computer application program. If the computer processing of a requisition for material reduces the on-hand quantity below the reorder point and if the computer issues a purchase order without anyone reviewing the proposed procurement quantity, then the application is an automated decisionmaking application. Some of the computer output of these applications is reviewed. In the foregoing example, the application may call for manual reviews of quantities on all purchase orders over $5,000, with all purchase orders under that amount being released without review.
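The report's own inventory example translates almost directly into a few lines of code. The stock levels, reorder point, and order quantity below are assumptions added for illustration; only the $5,000 review threshold and the reorder rule come from the text.

```python
# Sketch of the report's inventory example. Stock levels, reorder point and order
# quantity are illustrative; the $5,000 review threshold is from the report's text.
REVIEW_THRESHOLD = 5_000.00

def process_requisition(on_hand: int, issued: int, reorder_point: int,
                        order_qty: int, unit_price: float):
    """Automated decisionmaking step run after a requisition is filled."""
    on_hand -= issued
    if on_hand > reorder_point:
        return on_hand, None                           # no action initiated
    order_value = order_qty * unit_price
    if order_value > REVIEW_THRESHOLD:
        action = ("MANUAL REVIEW", order_qty)          # management-by-exception output
    else:
        action = ("ISSUE PURCHASE ORDER", order_qty)   # released without review
    return on_hand, action

if __name__ == "__main__":
    print(process_requisition(on_hand=60, issued=25, reorder_point=40,
                              order_qty=100, unit_price=12.50))
    # order value $1,250 -> purchase order issued with no one reviewing it
```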
We reviewed these applications because (1) billions of dollars are involved in the unreviewed actions that they initiate and (2) of indications that funds were being wasted because of incorrect actions.
CHARACTERISTICS
One objective of using computers operating under automated decisionmaking applications is to take advantage of their speed, accuracy, storage capabilities, and capacity to obey predetermined instructions. These applications are needed, in part, because of the tremendous volumes of information to be obtained, manipulated (processed), analyzed, and acted on in carrying out agency missions and goals.
Automated decisionmaking applications process large volumes of transactions put into the computer system from various sources. They make repetitive decisions that, in many cases, previously have been made by people. The decision instructions, built into the program, ask questions about the transactions and then initiate many actions through output. The actions depend solely on the criteria (logic) and data inside the computer system.
Computer program
The computer program (software), written by people, instructs the computer (1) to examine the input data and/or data already in automatic data processing (ADP) files, (2) to perform logical decisionmaking steps and computations in processing the data, and (3) to initiate actions in the form of output as a result of this process.
Input
Data is usually obtained by people from various sources and is put into the computer in machine-readable form (including punched cards, optical character recognition documents, paper tape, magnetic ink character recognition documents, and direct keyboard entry). The data can be entered directly for processing or can be recorded on ADP files for processing at a later time.
Output
The application outputs are such things as (1) directives to act (such as orders to ship material), (2) payment authorizations or checks, (3) bills, and (4) notices. A large percentage of the output of these applications is not manually reviewed and evaluated by people.
The following illustration shows a computer operating under automated decisionmaking applications.
[Illustration: a computer operating under automated decisionmaking applications. Unreviewed output from the computer includes material requisitions or purchase orders, payment authorizations or checks, bills, equipment repair schedules, shipping orders, and directives to dispose of material.]
The form of output varies (including listings, magnetic tapes, preprinted forms, and punched cards). These outputs indicate the decisions resulting from computer processing directed by the software.
The outputs that are not reviewed or evaluated are usually issued to the organizations and people which take the action being directed or which are being paid, billed, or notified.
Some of the output of many automated decisionmaking applications is manually reviewed. Under "management by exception" principles, some output, the nature and extent of which is determined by management, is sent to people in the organization for manual review and evaluation. This technique allows people to consider criteria, factors, and information not contained in the computer system in deciding whether the computer-directed action should be taken. For these applications manual intervention takes place only for the actions output for review.
The criteria for directing manual review of the output are contained in the decisionmaking part of the program. In the inventory application example, the program would direct that purchases over $5,000 be output for manual review. The applications can be programed so that none of the output will be manually reviewed or evaluated before actions are taken.
CONTRAST WITH OTHER COMPUTER APPLICATIONS
Application programs designed to provide output to people for information and analysis are not automated decisionmaking applications. Many types of these application programs are used in Government, and the outputs are sent to people for review before actions are taken.
Typical application programs that are not automated decisionmaking applications include:
--Systems that make recommendations, all of which are manually reviewed before actions are taken.
--Management and other information systems which provide data to various levels of managers to assist them in making policy, management, and operating decisions.
--Most mathematical models.
REPORT TO THE CONGRESS
BY THE COMPTROLLER GENERAL OF THE UNITED STATES
Improvements Needed In Managing Automated Decisionmaking By Computers Throughout The Federal Government
Computers in Federal departments and agencies annually issue unreviewed payments and other actions involving billions of dollars in Government assets. These actions are often wrong. They can cost the Government huge sums of money; exactly how much no one knows.
This report describes the ways computers issue unreviewed actions and the causes for incorrect actions. It suggests remedies to correct the situation Government-wide.
FGMSD-76-5
WALTER L. ANDERSON
ASSOCIATE DIRECTOR
U.S. GENERAL ACCOUNTING OFFICE
441 G STREET, NW.
WASHINGTON, D.C. 20548
TELEPHONE: 202-275-5044
APRIL 23, 1976
B-415369
COMPTROLLER GENERAL OF THE UNITED STATES
WASHINGTON, D.C. 20548
To the President of the Senate and the Speaker of the House of Representatives:
Many Federal agencies use computers to initiate actions that are not reviewed by people. This report describes the many problems that have been experienced by agencies that use computers this way and offers some suggestions on how to solve them.
We made our study pursuant to the Budget and Accounting Act, 1921 (31 U.S.C. 53), and the Accounting and Auditing Act of 1950 (31 U.S.C. 67).
We are sending copies of this report to the Director, Office of Management and Budget; the Secretary of Commerce; the Administrator of General Services; and the heads of Federal departments and independent agencies.
Comptroller General of the United States
Contents

Page
DIGEST

CHAPTER
1  INTRODUCTION 1
     Automated decisionmaking applications 1
     Characteristics 2
     Contrast with other computer applications 4
2  USE OF AUTOMATED DECISIONMAKING APPLICATIONS BY FEDERAL AGENCIES 5
     Information about automated decisionmaking applications used by Federal agencies 5
     Functions supported by automated decisionmaking applications 6
     Number of automated decisionmaking applications and their impact on Federal agencies 6
     Reasons for output of actions for manual review and evaluation 8
     An example of what automated decisionmaking applications do 8
3  AUTOMATED DECISIONMAKING APPLICATIONS CAN MAKE BAD DECISIONS 10
     Conditions leading to bad decisions 10
     Software problems reported 13
     Data problems reported 16
     Internal audits of automated decisionmaking applications 18
4  CAUSES OF BAD AUTOMATED DECISIONS 20
     Software problems 20
     Data problems 27
5  FEDERAL MANAGEMENT OF AUTOMATED DECISIONMAKING APPLICATIONS 33
     Responsibilities for ADP management in the Government 33
     Policy actions by Federal agencies to manage automated decisionmaking applications 34
     What agencies do 35
6  AUTOMATED DECISIONMAKING APPLICATIONS CONTINUE TO MAKE BAD DECISIONS UNTIL PROBLEMS ARE CORRECTED 44
     Error detection 44
     Error correction 44
     Agency procedures for timely correction of software design problems 45
7  OPINIONS ON WAYS TO PREVENT OR REDUCE THE IMPACT OF PROBLEMS IN AUTOMATED DECISIONMAKING APPLICATIONS 47
     Possible solutions--software problems 48
     Possible solutions--data problems 49
8  CONCLUSIONS, RECOMMENDATIONS, AND AGENCY COMMENTS 50
     Recommendations 52
     Agency comments 53

APPENDIX
I  Letter dated November 17, 1975, from the Assistant Secretary of the Department of Health, Education, and Welfare 56
II  Letter dated November 28, 1975, from the Associate Deputy Administrator, Veterans Administration 62
III  Letter dated December 29, 1975, from the Acting Administrator of General Services 63
IV  Letter dated January 2, 1976, from the Assistant Secretary of Defense (Comptroller)
V  Internal audit reports on automated decisionmaking applications 68

ABBREVIATIONS
ADP  Automatic Data Processing
ASO  Aviation Supply Office
DSA  Defense Supply Agency
FMSO  Fleet Material Support Office
GAO  General Accounting Office
GSA  General Services Administration
HEW  Department of Health, Education, and Welfare
OMB  Office of Management and Budget
SSA  Social Security Administration
VA  Veterans Administration
COMPTROLLER GENERAL'S REPORT TO THE CONGRESS
DIGEST
IMPROVEMENTS NEEDED IN MANAGING AUTOMATED DECISIONMAKING BY COMPUTERS THROUGHOUT THE FEDERAL GOVERNMENT
Federal agency computers cause more than 1.7 billion payments and other actions a year without anybody reviewing or evaluating whether they are correct. Many agencies use computers in this way. At a minimum, Government computers issue annually:
--Unreviewed authorizations for payments or checks (excluding payroll) totaling $26 billion.
--Unreviewed bills totaling $10 billion.
--Unreviewed requisitions, shipping orders, repair schedules, and disposal orders for material valued at $8 billion.
COMPUTERS CAN ISSUE INCORRECT ACTIONS
Computers are complex data processing machines which are indispensable to the day-to-day operations of most Federal agencies. They can process data quickly and are especially useful in business-type applications which involve repetitive processing of large volumes of data. However, computer actions are only as good as the computer programs (or software) that make the computers operate and the data within the system. Computers can cause incorrect actions if these factors are wrong. The result is overpayments and unnecessary or premature costs.
Some agencies' internal audit reports show that unreviewed incorrect actions have been issued by several Government computers, incurring overpayments and unnecessary or premature costs of tens of millions of dollars annually. For example:
--Computers of one military department incurred increased inventory pipeline and transportation costs of $2.2 million because of erroneous software. (See p. 13.)
--One military agency's computer caused millions of dollars in unnecessary and/or premature overhaul of equipment because of software and data problems. (See p. 14.)
Computers issuing incorrect actions over an extended period of time increase the impact of overpayments, unnecessary costs, and so on. It is important to detect incorrect actions. It is equally important to correct them as early as possible.
In this report, software that instructs computers to issue unreviewed actions is being called automated decisionmaking applications.
CAUSES FOR INCORRECT COMPUTER ACTIONS
Incorrect computer actions occur because of software problems and/or data problems. The causes of these problems are numerous.
Software problems, for example, can be caused by inadequate communications between people involved in software development. (See pp. 20 to 27.)
Data problems, for example, can be caused by the use of input forms that are too complex. (See pp. 29 to 32.)
FEDERAL POLICY AND AGENCY MANAGEMENT
There is no Federal-wide policy, guidance, or other instructions on how computers issuing unreviewed actions should be managed by Federal agencies. There is little checking or monitoring of output on an ongoing or short-term periodic basis. Internal audit reviews of these computer actions are made sporadically or not at all.
Several things can be done that will disclose some of the problems before they occur and/or before computers make incorrect decisions for an extended period. These practices should be considered for Government-wide use. (See pp. 47 to 49.)
RECOMMENDATIONS ?
GAO believes that,.since-automated decision-
making applications. have not. .been
recognized as a. separate problem atea requir-
ing Management attention. and. sthce millions?
Of dollars are presently being wasted as the
result oLactions? generated by such systems-,
the Office of Management and Budget should -
act immediately to improve the situation.
Specifically, GAO recommends that the.Direc-
tor', Office of Management and Budget, in his'
oversight capacity, require that:
--Each agency determine whether any of its
computer operations involve automated de-
cisionmaking applications.
-The agencies review each .operation to .de
termine whether incorrect actions are being
taken as a result .of these applications,
(Pending issuance of technical guidelines
by the National Bureau. ot Standards for
making such reviews, the agencies should. ex-
amine enough automatically generated deci-
sions to provide a basis for deciding.. -.
whether incorrect decisions are occurring
and, if -so, should' take the necessary steps
to correct' the Situation causing :the
in-
accurate. decisions:)
--Before any new automated decisionmaking applications are initiated by an agency, the proper steps are taken to insure correct decisions. This would include, pending issuance of National Bureau of Standards guidelines, a carefully chosen combination of independent review of systems design, adequate testing before implementation, and periodic testing of decisions after implementation, as discussed in this report.
--Agencies make reports on the actions taken and establish an appropriate mechanism for monitoring reports.
GAO also recommends that, since the National Bureau of Standards has responsibilities for technical aspects of automatic data processing, the Secretary of Commerce direct the Bureau to issue technical guidelines for developing, using, technically evaluating,
documenting, and modifying these applications in the Federal Government. When issued, these guidelines should contain certain criteria for independent technical reviews and for monitoring of these applications to insure problems are detected and corrected promptly. The General Services Administration should incorporate the Bureau guidelines in its agency directives.
In addition, GAO recommends that:
--As the General Services Administration suggested, the Civil Service Commission develop and add to its automated data processing training curriculum courses in automated decisionmaking applications so that managers, technical personnel, and auditors will become better equipped to deal with them in an appropriate manner.
--Internal audit groups in agencies having automated decisionmaking applications participate actively in design, test, and reviews of such systems to carry out their responsibilities.
Finally, GAO suggests that the Joint Financial Management Improvement Program consider this area for ongoing attention.
GAO is sending copies of this report to all departments and independent agencies for their information, use, and guidance pending issuance of the Office of Management and Budget and the National Bureau of Standards material. GAO received comments from several agencies. They agreed in principle to the need for increased management attention to automated decisionmaking applications. (See pp. 53 to 55.)
CHAPTER 1
INTRODUCTION
Many early business applications on computers involved entering, manipulating, and summarizing data and generating reports. Most output produced by these computers was manually reviewed (1) for correctness and/or (2) to decide what actions should be taken on the basis of the output report.
As more complex computer processing developed, the applications became more innovative. Computers were assigned certain repetitive decisionmaking work which duplicated steps people had taken to do the job previously. The output of these computers is frequently not reviewed by people (that is, no manual review).

These types of applications have no established name. We are calling them automated decisionmaking applications.
AUTOMATED DECISIONMAKING APPLICATIONS
Automated decisionmaking applications are computer programs that initiate action (through output) on the basis of programable decisionmaking criteria established by management and incorporated in computer instructions. The distinguishing characteristic of these applications, as compared to other computer application programs, is that many of the computer's actions take place without manual review and evaluation.
An inventory application is an example of a computer application program. If the computer processing of a requisition for material reduces the on-hand quantity below the reorder point and if the computer issues a purchase order without anyone reviewing the proposed procurement quantity, then the application is an automated decisionmaking application. Some of the computer output of these applications is reviewed. In the foregoing example, the application may call for manual reviews of quantities on all purchase orders over $5,000, with all purchase orders under that amount being released without review.
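A minimal sketch, in modern Python and for illustration only (nothing in this report suggests the systems described were written this way), of the reorder decision just described; the $5,000 review threshold comes from the example above, while the function and field names are assumptions:

    REORDER_REVIEW_THRESHOLD = 5000   # dollars; orders above this are output for review

    def process_requisition(on_hand, requisition_qty, reorder_point,
                            reorder_qty, unit_price):
        """Return the action an automated decisionmaking application might take."""
        remaining = on_hand - requisition_qty
        if remaining >= reorder_point:
            return ("no_order", 0)                      # stock still adequate
        order_value = reorder_qty * unit_price
        if order_value > REORDER_REVIEW_THRESHOLD:
            return ("manual_review", reorder_qty)       # exception: reviewed by a person
        return ("issue_purchase_order", reorder_qty)    # unreviewed automatic action

    # A small order is released automatically; a large one is held for review.
    print(process_requisition(120, 100, 50, 200, 3.00))
    print(process_requisition(120, 100, 50, 200, 40.00))

In an actual application the threshold itself would be one of the programable decisionmaking criteria established by management.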
We reviewed these applications because (1) billions of dollars are involved in the unreviewed actions that they initiate and (2) of indications that funds were being wasted because of incorrect actions.
CHARACTERISTICS
One objective of using computers operating under automated decisionmaking applications is to take advantage of their speed, accuracy, storage capabilities, and capacity to obey predetermined instructions. These applications are needed, in part, because of the tremendous volumes of information to be obtained, manipulated (processed), analyzed, and acted on in carrying out agency missions and goals.
Automated decisionmaking applications process large volumes of transactions put into the computer system from various sources. They make repetitive decisions that, in many cases, previously have been made by people. The decision instructions, built into the program, ask questions about the transactions and then initiate many actions through output. The actions depend solely on the criteria (logic) and data inside the computer system.
Computer program
The computer program (software), written by people, instructs the computer (1) to examine the input data and/or data already in automatic data processing (ADP) files, (2) to perform logical decisionmaking steps and computations in processing the data, and (3) to initiate actions in the form of output as a result of this process.

Data is usually obtained by people from various sources and is put into the computer in machine-readable form (including punched cards, optical character recognition documents, paper tape, magnetic ink character recognition documents, and direct keyboard entry). The data can be entered directly for processing or can be recorded on ADP files for processing at a later time.
Output
The application outputs are such things as (1) directives to act (such as orders to ship material), (2) payment authorizations or checks, (3) bills, and (4) notices. A large percentage of the output of these applications is not manually reviewed and evaluated by people.
The following illustration shows a computer operating under automated decisionmaking applications.
[Illustration: a computer operating under automated decisionmaking applications issues unreviewed output such as payment checks ("pay this amount"), material requisitions or purchase orders ("send us this material"), equipment repair schedules, and directives to dispose of material.]
The form of output varies (including listings, magnetic tapes, preprinted forms, and punched cards). These outputs indicate the decisions resulting from computer processing directed by the software.

The outputs that are not reviewed or evaluated are usually issued to the organizations and people which take the action being directed or which are being paid, billed, or notified.
Some of the output of many automated decisionmaking applications is manually reviewed. Under "management-by-exception" principles, some output, the nature and extent of which is determined by management, is sent to people in the organization for manual review and evaluation. This technique allows people to consider criteria, factors, and information not contained in the computer system in deciding whether the computer-directed action should be taken. For these applications manual intervention takes place only for the actions output for review.

The criteria for directing manual review of the output are contained in the decisionmaking part of the program. In the inventory application example, the program would direct that purchases over $5,000 be output for manual review. The applications can be programed so that none of the output will be manually reviewed or evaluated before actions are taken.
CONTRAST WITH OTHER COMPUTER APPLICATIONS
Application programs designed to provide output to people for information and analysis are not automated decisionmaking applications. Many types of these application programs are used in Government, and the outputs are sent to people for review before actions are taken.

Typical application programs that are not automated decisionmaking applications include:

--Systems that make recommendations, all of which are manually reviewed before actions are taken.

--Management and other information systems which provide data to various levels of managers to assist them in making policy, management, and operating decisions.

--Most mathematical models.
CHAPTER 2

USE OF AUTOMATED DECISIONMAKING APPLICATIONS
BY FEDERAL AGENCIES
Many Federal agencies use automated decisionmaking applications to support their functions. Annually, more than a billion actions, involving billions of dollars, in directives to act, to make payments, to issue orders for material, and to bill for amounts owed are initiated. They also issue millions of notifications to people outside the Government.
INFORMATION ABOUT AUTOMATED DECISIONMAKING
APPLICATIONS USED BY FEDERAL AGENCIES
We wanted to learn how these applications were used and to obtain data on their characteristics and monetary impact on Federal operations, but we found no central inventory. We therefore developed a questionnaire to gather information about Federal automated decisionmaking applications and distributed it to 15 agencies that use computers extensively. The information wanted included:
--Functions supported by these applications.

--Numbers of these applications and their impact on operations (including output produced and annual volume and monetary impact).

--Whether certain parts of the decisions were being manually reviewed.
We obtained more detailed information about selected automated decisionmaking applications to understand and illustrate typical uses.

Almost all the agencies we contacted gave us examples of their automated decisionmaking applications. The information is summarized below.
    Defense departments and agencies        Number of examples
      Air Force                                     14
      Army                                          14
      Defense Supply Agency                          9
      Navy                                          18
          Total                                     55

    Civil departments and agencies          Number of examples
      Agriculture                                    6
      Commerce                                       4
      General Services Administration                5
      Health, Education, and Welfare                 8
      Housing and Urban Development                  6
      Interior                                      10
      Transportation                                18
      Treasury                                       4
      Railroad Retirement Board                      3
      Veterans Administration                        9
          Total                                     73

    Total number of examples obtained--128
FUNCTIONS SUPPORTED BY AUTOMATED
DECISIONMAKING APPLICATIONS
The questionnaires showed that automated decisionmaking applications supported many functions. A compilation of responses is presented below.
                   Number of times                        Number of times
    Function       function was cited    Function         function was cited

    Controlling           48             Maintenance             30
    Notification          48             Procurement             30
    Fiscal                46             Diagnostic              23
    Payment               46             Scheduling              20
    Supply                44             Disposal                17
    Billing               41             Cataloging              13
    Distribution          38             Personnel               11
    Eligibility           31             Safety                   9
NUMBER OF AUTOMATED DECISIONMAKING
APPLICATIONS AND THEIR IMPACT ON
FEDERAL AGENCIES
No one collects statistics on these applications for the Federal Government as a whole, so we could not determine the total number. Some of the agencies responding to our questionnaire said their responses consisted of representative applications. Therefore, our report about automated decisionmaking applications and their impact represents only a part of the Federal-wide total.

The responses identified 128 applications which issued several different types of unreviewed output. The nature of the output and its estimated annual impact on Federal operations, both in volumes and dollars, are summarized below.
                                                                      Total
                                            Number      Total        monetary
    Nature of output                        cited       actions      impact
                                                        (000         (000,000
                                                        omitted)     omitted)

    Payment authorizations or checks to:
      Contractors or grantees                 10          8,700       $ 7,221
      Members of the public                   23        715,000        18,589
      Government employees
        (other than payroll)                   3            200             8
    Bills sent to:
      Contractors                              3            100            15
      Government organizations                17         17,300         6,549
      Members of the public                   18         19,100         3,298
    Purchase orders or supply
      requisitions                            24         28,000         4,456
    Directives to ship material               22        260,200       a/2,500
    Directives to dispose of material         11          8,000          a/56
    Production, repair, or rework
      schedules or instructions               12        191,300       a/1,150
    Notifications to members of
      the public                              21         22,200           N/A
    Other                                     48        447,300           N/A

        Total                                212      1,717,400

    a/Represents the value of material on which these actions were taken.
      Information collected indicates that the transportation costs represent
      about 5 percent of the value of material shipped; the disposal costs
      about 3 percent of the material disposed of; and production, repair, or
      rework cost about 23 percent of the value of the material.
The actions and monetary impact in the preceding schedule are for only a portion of the 212 output types. Many responses indicated that this data was not readily available. Our followup confirmed this.
REASONS FOR OUTPUT OF ACTIONS
FOR MANUAL REVIEW AND EVALUATION
Some of the applications initiate all actions without review. Most are designed, however, under the management-by-exception principle, which results in some of the output being reviewed by employees before the actions are implemented.

Several reasons given by agencies for reviewing some of the output are shown below.
                                                              Times cited

    Monetary value of indicated action exceeds
      prescribed dollar limitations                                43
    Criticality of the action to be taken                          28
    Eligibility factors related to the action                      21
    Geographic considerations of various types                     11
    Health and safety considerations related
      to the action                                                10
The percentage of actions initiated automatically varies from one application to another and can be adjusted by changing the processing criteria. The percentage of unreviewed actions identified by agencies participating in this study is shown below.
    Percent of actions unreviewed          Number of applications

    100                                              35
    90 to 99                                         42
    80 to 89                                         13
    70 to 79                                         14
    60 to 69                                          5
    50 to 59                                          3
    Below 50                                         14
    No data provided                                  2
        Total                                       128
AN EXAMPLE OF WHAT AUTOMATED
DECISIONMAKING APPLICATIONS DO
Automated decisionmaking applications are designed to
make internal decisions of varying degrees of complexity and
to generate output containing the action to be taken. An
example of one of these applications is shown in this section.
Other examples are presented in chapter 3.
Customer returns program
The Defense Supply Agency (DSA) uses an automated decisionmaking application--credit returns--to evaluate inquiries from military activities on what to do with surplus DSA-managed material. The options are to (1) return the material for credit, (2) return it without credit, or (3) dispose of it.

DSA's computers receive the requests in machine-readable form. The application identifies the commodity and refers to pertinent data about it from the ADP files (such as information on the quantities of the material already stored in DSA's inventory and expected future requirements). Using this and still other data, the application tells the activity what to do with the material. Usually these directives are sent without manual review.
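A minimal sketch, in Python, of a three-way directive of the kind just described; the three possible directives come from the report, but the comparison against on-hand stock and forecast requirements is a hypothetical criterion used only for illustration (the actual DSA criteria are not described here):

    def customer_return_directive(on_hand, forecast_requirement, offered_qty):
        """Tell a military activity what to do with surplus DSA-managed material."""
        projected_need = max(forecast_requirement - on_hand, 0)
        if projected_need >= offered_qty:
            return "return for credit"         # all of the material is still needed
        if projected_need > 0:
            return "return without credit"     # only part of it is needed
        return "dispose of the material"       # no foreseeable need

    print(customer_return_directive(on_hand=100, forecast_requirement=500, offered_qty=50))
    print(customer_return_directive(on_hand=400, forecast_requirement=500, offered_qty=300))
    print(customer_return_directive(on_hand=800, forecast_requirement=500, offered_qty=50))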
During a recent 1-year period, two of the six DSA supply centers issued the following unreviewed directives using this application.

                                           Estimated volumes of
                                        unreviewed directives issued
    Nature of directives                Number of         Value of
                                        directives         material

    Ship the material (with or
      without credit) to the DSA
      supply system                       174,000      $ 76,000,000
    Dispose of the material                62,000        24,000,000
    Total unreviewed advices              236,000      $100,000,000
CHAPTER 3

AUTOMATED DECISIONMAKING APPLICATIONS
CAN MAKE BAD DECISIONS
Whether actions initiated automatically by the computer are correct or not largely depends on (1) the internal logic of the program and (2) the data that is fed into the system.

Computers will produce bad decisions (1) if programers and analysts make misjudgments or errors in establishing the decisionmaking criteria or (2) if the application is not designed and/or coded in a manner that properly implements the decisionmaking criteria. Changing circumstances can make adequate decisionmaking criteria in the software obsolete, and bad decisions will occur if the software is not changed. Failure to design appropriate checks on input data, such as edit checks, can contribute to bad decisions. These applications can also make bad decisions if the data supplied to them is incomplete or incorrect or if the data is not obtained or processed quickly.
Some internal audit groups have reported on bad decisions made by Government automated decisionmaking applications. The computer-initiated actions caused the agencies to incur tens of millions of dollars of unnecessary costs, premature costs, and overpayments.

Such bad decisions may also harm individuals and impair an agency's ability to carry out its mission effectively.
CONDITIONS LEADING TO BAD DECISIONS
Adverse conditions common to several agencies have been reported. These conditions, resulting in the applications automatically initiating uneconomical or otherwise incorrect actions, can be broadly categorized as (1) software problems and (2) data problems.
Software problems
Several software problems that can cause bad decisions by automated decisionmaking applications include:

--Designing software with incomplete or erroneous decisionmaking criteria. Actions have been incorrect because the decisionmaking logic omitted factors which should have been included. In other cases decisionmaking criteria included in the software were inappropriate, either at the time of design or later, because of changed circumstances.
--Failing to program the software as intended by the customer (user) or designer, resulting in logic errors often referred to as programing errors.

--Omitting needed edit checks for determining completeness of input data. Critical data elements have been left blank on many input documents, and because no checks were included, the applications processed the transactions with incomplete data. (A minimal illustration of such a check follows this list.)
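A minimal sketch, in Python and with assumed field names, of the kind of completeness edit check whose omission is described in the last item above; a transaction missing a critical data element is rejected for manual correction instead of being processed:

    CRITICAL_ELEMENTS = ("stock_number", "quantity", "change_reason")   # assumed fields

    def completeness_edit_check(transaction):
        """Reject a transaction whose critical data elements are blank."""
        missing = [name for name in CRITICAL_ELEMENTS if not transaction.get(name)]
        if missing:
            return ("reject", missing)    # returned for correction instead of processing
        return ("accept", [])             # passed on to the decisionmaking logic

    print(completeness_edit_check(
        {"stock_number": "item-123", "quantity": 12, "change_reason": "77"}))
    print(completeness_edit_check(
        {"stock_number": "item-123", "quantity": 12, "change_reason": ""}))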
Data problems
Input data quality is frequently a problem. Since much of this data is an integral part of the decisionmaking process, its poor quality can adversely affect the computer-directed actions. Problems include:

--Incomplete data used by automated decisionmaking applications. Some input documents prepared by people omitted entries in data elements which were critical to the application but which were processed anyway. The documents were not rejected when incomplete data was being used. In other instances data which the application needed and which should have become part of ADP files was not put into the system.
--Incorrect data used in automated decisionmaking ap-
plication processing. People have often uninten-
tionally introduced incorrect data. into the ADP sys-
tem. This incorrect data affected application deci-
sions.
--Obsolete data used in automated decisionmaking ap-
plication processing. Data in the ADP files became
obsolete due to new circumstances. The new data
may have been available but was not put into the
computer.
Conditions that have been
reported by internal audit
Unfavorable conditions were identified by 32 internal audit reports of 7 agencies. These reports, issued during a 23-month period, demonstrated that the same conditions occurred in different agencies and were therefore common problems. The audit reports, however, did not show the total occurrences and dollar impact of these conditions, past or present, in Federal automated decisionmaking applications.
The results of our analysis of these audit reports are
summarized in the following table. (For further details,
see app. V.)
                                                     Number of      Number of times
                                      Number of      internal       condition was
    Category and condition           agencies        audit reports  reported (note a)

    Software problems:
      Incomplete, erroneous, or
        obsolete decisionmaking
        criteria                          7               14               30
      Programing errors                   5               10               10
      Criteria or programing
        (note b)                          5                                14
      Absence of needed edit
        checks                            4                5               11

    Data problems:
      Data elements incomplete            6               10               16
      Data elements incorrect             5               17               30
      Data elements obsolete              3                3                3

    a/Each condition can occur more than once. Software problems, such as
      programing errors, may have occurred in more than one portion of the
      program, or the condition may have been observed at more than one
      location, each designing its own program. The data conditions were
      based on the number of different data elements that were either
      incomplete, incorrect, or obsolete at least once.

    b/Internal audit reports were not sufficiently detailed to arrive at an
      opinion as to whether the problem was in criteria or programing.
Only 13 of the 32 reports had estimates of the monetary impact of bad decisions, but these estimates total tens of millions of dollars a year in unnecessary and premature costs and in potential overpayments. Some reports cited specific cases but provided no estimates of the total monetary impact. Other reports cited potential mission impairment and possible harm to individuals.
The following sections are based on internal audit reports selected from the 32 reports obtained.
SOFTWARE PROBLEMS REPORTED
Examples of software problems are presented to demonstrate the problems frequently experienced with automated decisionmaking. The examples are not intended to be a criticism of the agencies involved, because these problems can occur wherever these applications are used.
Army processing of requisitions
for shipment to overseas locations
Several Army inventory control points provide material support to overseas customers which submit requisitions for materials to the control points. Automated decisionmaking applications are used to screen material availability at U.S. depots. The computer produces a directive which is automatically issued to a depot to ship material to the overseas customer. These applications process over 106,000 overseas requisitions annually.

Early in the 1970s the Army implemented a system designed to improve supply support to overseas customers from U.S. depots. The control points were instructed to design their ADP applications so that material would be issued from east coast depots to satisfy European customers and from west coast depots to satisfy Pacific customers. Controls were required to prevent the software from releasing cross-country shipments without manual review.
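A minimal sketch, in Python, of the depot-selection rule just described; the depot identifiers and theater names are assumptions, and a release that could only be filled from the wrong coast is routed to manual review rather than issued automatically:

    EAST_COAST_DEPOTS = {"Depot E1", "Depot E2"}   # assumed depot identifiers
    WEST_COAST_DEPOTS = {"Depot W1", "Depot W2"}

    def select_depot(customer_theater, depots_with_stock):
        """Pick a depot on the correct coast, or hold the release for manual review."""
        preferred = EAST_COAST_DEPOTS if customer_theater == "Europe" else WEST_COAST_DEPOTS
        for depot in depots_with_stock:
            if depot in preferred:
                return ("release", depot)       # automatic shipment from the correct coast
        # Stock is available only on the wrong coast: no automatic cross-country release.
        return ("manual_review", None)

    print(select_depot("Europe", ["Depot W1", "Depot E2"]))
    print(select_depot("Pacific", ["Depot E1"]))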
The Army Audit Agency examined the applications in effect at five control points. At four activities it found that the applications were not adequate to insure maximum filling of requisitions from the appropriate depots. For instance, in the initial requisition processing for overseas customers, the software used by one of the high-volume control points screened stock availability at eight depots before finding the appropriate depot. For releasing back-ordered-stock requisitions, depots on the opposite coast were often selected for material availability. The auditors reported that, at three control points, controls to prevent the automatic release of material from the wrong depots were not implemented and material was automatically released for cross-country shipments. At least two control points used software that existed before the criteria for supporting overseas activities were developed.

The audit agency estimated that, because of the use of this erroneous criteria, unnecessary transportation costs of $900,000 a year were incurred. In addition, $1.3 million was incurred in increased inventory investment (pipeline) costs.
The Army Materiel Command agreed with the audit agency's assessment of the problem and promised to revise the criteria contained in Army control point applications.
Navy scheduling of aircraft
equipment for overhaul
The Navy's central manager for aircraft spare equipment and parts uses a computer to identify and schedule overhaul for reparable components needed for future use. Until April 1974 the application used was called the Navy integrated comprehensive reparable item scheduling program. 1/

This application considered inventory on hand, requirements, and other data in ADP files to determine

--which components should be scheduled for overhaul,

--what quantities should be overhauled,

--which depots should do the work, and

--what priorities depots should give in deciding which items should be overhauled first.

Depots used punched card output as the basis for scheduling components for induction into their overhaul facilities. Priority levels shown on the output affected the depots' decisions regarding which items and quantities would be overhauled first. (Not all the quantities the program indicated for overhaul were processed because of limited depot overhaul capacity.)

The priority levels shown in the output ranged from level 0 (zero)--highest priority--to level 3--lowest priority.

During a 1-year period, 2/ Navy facilities spent about $145 million to overhaul aircraft components valued at about $797 million--mostly on the basis of the program's output.

1/In April 1974 the Navy integrated comprehensive reparable item scheduling program was replaced by another automated decisionmaking application called cyclical repair management. We believe that the problems that occurred in the first program could affect cyclical repair management in a similar way, but GAO's review did not evaluate the new program.

2/The figures presented are for an overlapping but not identical period. The overlap is 6 months.

The Naval Audit Service, reviewing the operation, identified several major software problems, all of which resulted in overstating overhaul requirements.

--A data element used in computing priority level 1 contained data that resulted in duplications in computing levels 2 and 3. Gross overhaul requirements scheduled by the program were therefore overstated. When the program was designed, this duplication was overlooked.
--Data elements showing recurring material usage, used to compute levels 2 and 3, were greatly overstated because of two software problems.

1. Required reductions to the material usage quantities were not made automatically, because certain Navy activities were leaving a data element blank on input documents sent to the central manager. Our followup determined that because of the designer's oversight or judgment error, no edit check was placed in the software to detect this missing data.

2. There were no software procedures for automatically reducing recorded material usage quantities when customers canceled back orders and requisitions. Our followup disclosed that when this application was designed, the designer believed that canceled back orders and requisitions would rarely occur. (A minimal sketch of such a reduction procedure follows.)
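A minimal sketch, in Python and with an assumed file layout, of the kind of usage-reduction procedure whose absence is described in item 2 above; each cancellation reduces the recorded recurring usage that feeds the overhaul computation:

    # Recorded recurring usage by stock number (assumed ADP file contents).
    recurring_usage = {"component-A": 120, "component-B": 45}

    def apply_cancellation(usage_file, stock_number, canceled_qty):
        """Reduce recorded usage when a customer cancels a back order or requisition."""
        current = usage_file.get(stock_number, 0)
        usage_file[stock_number] = max(current - canceled_qty, 0)   # never below zero
        return usage_file[stock_number]

    print(apply_cancellation(recurring_usage, "component-A", 30))   # 120 -> 90
    print(apply_cancellation(recurring_usage, "component-B", 60))   # 45 -> 0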
The Naval Audit Service estimated the effect of these incorrect actions was millions of dollars in unnecessary and premature overhaul costs. Although the Navy command officials did not agree with the auditors' reported figures, they agreed that the problems identified were valid. Corrective actions have been taken or initiated.
A GAO report (B-162152, May 21, 1974), "Better Methods Needed for Canceling Orders for Materiel No Longer Required," discussed the Navy's practice of not automatically reducing recorded material usage when unfilled customer orders were canceled. The report stated that "we estimate that this overstatement resulted in annual unnecessary materiel buys and repairs totaling about $10 million." Of that amount, more than $3 million was for repairs initiated by this automated decisionmaking application.
DATA PROBLEMS REPORTED
The following examples of data problems show how bad data can adversely affect the actions directed by automated decisionmaking applications.
Veterans Administration payments for
apprenticeship and other on-job training
The Veterans Administration (VA) uses a computer application to make monthly payments to more than 185,000 veterans in apprenticeship or other on-job training. This application is designed to make payments at a rate that decreases every 6 months, under the assumption that the veteran's pay will increase as he learns his trade.

Data put into the computer is the basis for automatically determining the rates at which the veteran will be paid. Each month, additional data is put in regarding the veteran's continuing eligibility to receive the payments.

This application is programed to read input documents and distinguish apprenticeship and other on-job training awards from other types of education awards. When the application recognizes these on-job training awards, it refers to appropriate rate tables to determine the proper payment. The application refers to a new lower rate every 6 months and automatically initiates payments at the reduced rate. Annually, this application initiates about 1.4 million unreviewed checks for more than $225 million in apprenticeship and other on-job training awards.
Two types of input documents initiate payments for these awards. An original award document is designed to initiate payments to a veteran not previously receiving them. If the veteran has already received benefits and there is a need for (1) reentrance, (2) a supplemental award, or (3) new key data such as dependency changes, a different input document (supplemental award code sheet) is prepared. Both documents contain data elements that allow the computer to determine that it is an apprenticeship and other on-job training award and that the reducing rate table should be used.

The data entry on the supplemental award document that causes the program to build the scheduled rate reduction is code 77 in a data element called change reason.
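A minimal sketch, in Python, of the rate-selection logic just described; the dollar figures in the rate table are invented for illustration, but the role of change reason code 77 follows the description above--when the code is present a reducing rate schedule is built, and when it is blank a single rate persists for the whole training period:

    # Assumed reducing rate table: monthly payment by 6-month training step
    # (illustrative values only, not VA's actual rates).
    ON_JOB_TRAINING_RATES = [196, 147, 98, 49]

    def monthly_rate(months_in_training, change_reason_code):
        """Return the payment rate the application would use for a given month."""
        step = min(months_in_training // 6, len(ON_JOB_TRAINING_RATES) - 1)
        if change_reason_code == "77":
            return ON_JOB_TRAINING_RATES[step]    # scheduled reduction applied
        # Without code 77 the reduction schedule is never built, so payments stay
        # at the highest step indicated for the entire period (potential overpayment).
        return ON_JOB_TRAINING_RATES[0]

    print(monthly_rate(months_in_training=14, change_reason_code="77"))   # reduced rate
    print(monthly_rate(months_in_training=14, change_reason_code=""))     # stays at top rate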
VA internal auditors reported that 22 of 121 tested supplemental award documents for these benefits did not contain change reason code 77 on the input documents (the data problem). These documents were received from 10 different VA locations. The application accepted and processed the documents because the software did not contain an edit check to disclose and reject documents with incomplete entries in this data element (a related software problem).

Because the data was incomplete, the computer used a single rate for the entire period of training at the highest step indicated. This problem caused potential overpayments of $700,000.
Possible causes cited for processing incomplete input documents included new personnel--requiring additional training--and fatigue. The designer overlooked the needed edit check, a software problem, in preparing the detailed and complex software.
Army processing of requisitions
for radioactive material
The Army uses a computer to automatically process customer requisitions for commodities. One Army agency uses an application to process at least 250,000 requisitions annually for material valued at a minimum of $20 million. About 35 percent of the customer requisitions are output for manual review and evaluation for any of several reasons. The remaining 65 percent are processed without manual review.
Some commodities the agency manages contain radioactive material. The Army master data ADP file is supposed to contain a special control code (code A) in a specific data element for commodities containing radioactive material. This code, which should be put in by item managers, prevents automatic issues. The item managers receive commodity requisitions for review and evaluation. This manual intervention is required to insure that the requisitioners are (1) authorized to receive the material, (2) aware of the radioactive content, and (3) aware of the safeguards that must be used.
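A minimal sketch, in Python and with assumed record contents, of the special-control check described above; the code value follows the paragraph above, and a requisition for an item whose master data record carries it is referred to the item manager instead of being filled automatically:

    # Assumed master data records; "A" is the special control code described
    # above for commodities containing radioactive material.
    MASTER_DATA = {
        "commodity-1": {"special_control_code": "A"},   # radioactive
        "commodity-2": {"special_control_code": "0"},   # no special controls
    }

    def route_requisition(stock_number):
        """Fill automatically unless the item requires item-manager review."""
        record = MASTER_DATA.get(stock_number, {})
        if record.get("special_control_code") == "A":
            return "refer to item manager"    # authorization and safety must be checked
        return "fill automatically"

    print(route_requisition("commodity-1"))
    print(route_requisition("commodity-2"))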
The Army Audit Agency reviewed 86 radioactive commodities which the agency managed to determine if the proper special control item codes were contained in ADP files. The review showed that 29 of the commodities were incorrectly coded.

--Eleven commodities were coded as a regulated item (code 1) but not as radioactive. (A regulated item is one that is scarce, costly, or highly technical.)
--Eighteen commodities contained an 0 code in the ADP files. An 0 code indicates that no special controls or handling are required. Many requisitions for these commodities are processed automatically.

Most of the incorrectly coded commodities had been in the supply system 4 to 11 years.
During the Audit Agency's review of 1 year's transactions, at least 38 customer requisitions were automatically filled for 18 incorrectly coded commodities. Army customers and foreign governments under military assistance programs were issued 423 units on these 38 requisitions.

Since the commodities were incorrectly coded, the item managers did not coordinate the issue of the units with the 38 customers. Consequently, there was doubt that the customers should have been issued the material or that they were aware of the radioactivity in the commodities.
Army officials cited the following possible reasons for the incorrect codes contained in ADP files.

--The item managers who prepared the input to ADP files may not have been fully aware of the requirements and procedures for coding radioactive material.

--The agency's health physicist may not have notified the item managers of the radioactivity contained in these commodities.

--The item managers may have been notified but failed to input the correct data codes.
Army officials agreed with the Audit Agency's findings and said they would (1) correct the ADP files for all radioactive commodities, (2) reemphasize to item managers the need for assigning the proper special control item code to commodities, and (3) have a health physicist study the commodities to insure that the items could be used safely by the customers that received them automatically. A special study determined that the commodities involved could be safely used by the recipients.
INTERNAL AUDITS OF AUTOMATED
DECISIONMAKING APPLICATIONS
Since published internal audit reports were the sources of our information on bad decisions, we asked nine internal audit groups about the nature, approaches, and frequency of scheduling audits of these applications.

We learned that certain internal audit groups rarely became involved in the applications' logic because they lacked the expertise to effectively make such studies.

No internal audit group has prepared lists of agency automated decisionmaking applications and scheduled reviews of their decisions, either routinely or when the system is modified. However, several audit groups schedule specific agency functions for audit, and if the functions are supported by these applications, auditors will get involved in the internal decisionmaking logic to evaluate the agency's performance.
Agency functions are generally audited on a cyclical basis, but the cycle may be anywhere from 2 to 8 years. Ordinarily, the frequency of review is not dependent on whether the function is supported by an automated decisionmaking application. In addition, auditors may review functions and related automated decisionmaking if there is (1) a special request or (2) an indication of a problem based on complaints. On the basis of approaches taken by internal audit groups, it appears that many of these applications go unaudited for long periods of time or may never be audited.

Although many of the audit reports adequately show many of the common problems that exist, they do not show the overall impact of the problems for all automated decisionmaking applications. In fact, there is no basis for estimating the total impact of bad decisions currently being made by these applications.
CHAPTER 4

CAUSES OF BAD AUTOMATED DECISIONS
The two basic automated decisionmaking application problems, software and data, are often interdependent. For example, automated decisionmaking applications making bad decisions because of incomplete data elements on input documents illustrate both a data problem and a software problem, because (1) input documents have not been properly prepared (data) and (2) edit checks for completeness have not been properly designed (software). Other problems, such as when incomplete or erroneous decisionmaking criteria are used (software) and incorrect data is put into the application (data), can occur independently.
The problems in each of these two areas are caused by a variety of factors. We identified many causes of these problems by (1) corresponding with people experienced in software design and data management, (2) discussing them with officials of selected Federal agencies, and (3) analyzing published internal audit reports.
SOFTWARE PROBLEMS
Computer programs are usually developed and modified by a combination of people: the user (or customer), who requires the computer assistance; the designer (or analyst), who translates the requirements of the user into a logical structure; and the programer, who translates the logic into program instructions which can be recognized and used by the computer.
The software development and modification process was similar at each Federal agency we visited. Variations are not related to the process itself but rather involve such factors as

--organizational setup and physical locations;

--titles of people performing various aspects of the work; and

--nature of the documentation that will be prepared, such as use of program flow charts.
Causes of software problems
Agency officials said that the design or modification and programing of software could not be guided by specific instructions on how best to do the work. Instead, agencies rely on people who know (1) the function supported by the computer and (2) the art of design and coding so that the computer can perform the desired tasks. Some agencies provide broad guidelines on the process, the documents to be used in the process (documentation), and, at one agency, instructions on what designers and programers should consider when doing the work.
The user initiating the work sets forth many of the specifics regarding the internal decisionmaking criteria to be used. Often the designer makes some decisions. Both act on the basis of their knowledge of the function, available guidelines in terms of management instruction or legislation, their perceptions of the transactions to be processed, and communications with each other. Sometimes they will call on operations research experts to help them design new criteria, while sometimes they will use existing criteria to process similar transactions.
The designer takes the established criteria and prepares more specific documentation which is used for programing. The design and programing documents developed become very detailed and complex, because the computer is instructed to operate in a logical step-by-step manner on a large number of different conditions. Even less complex applications can consist of thousands of individual instructions that must be designed and programed to do what the user and designer perceive to be correct.
The designer and user are usually responsible for designing edit checks into the program. This includes checks for the completeness of data elements on input documents. According to agency officials, edit checks are placed in the software for data that is critical to the decisionmaking, such as when incomplete or erroneous data can affect the determinations made by the computer. Some officials said that edit checks are placed for almost every data element. One agency is making an overt effort to limit edit checks, to reduce the number of documents rejected by the computer.
In developing software, it is generally accepted that the lines of communication between the user and designer and between the designer and programer must be effective.
To identify some of the causes for the software problems presented in chapter 3, we

--discussed them with Federal officials at several agencies;
--received responses to questionnaires from 257
individuals who are experienced in the areas of ADP
software design, modification, and programing; and
--analyzed causes cited by internal auditors.
A schedule summarizing some of the causes of software problems is followed by a discussion of each.

                    Summary of Causes of Software Problems

                                                  Opinions of people answering
                                                  the questionnaire--degree of
                                                  cause (note a)
                                                  Moderate to     Somewhat small
    Cause                                         very large      or none

    Inadequate communications between the
      parties to software design                      251
    Incorrect perceptions of the nature of
      actual transactions to be processed             233              22
    Inadequate documentation preventing
      adequate reviews of software                    229              28
    Time constraints hampering the effec-
      tiveness of the design process                  216              40
    Absence of written criteria or guide-
      lines for designers to follow                   234              49
    Detail and complexity involved in
      designing, coding, and reviewing
      software                                        177
    Reliance on the expertise and experience
      of people doing the work (state of
      the art)                                        171              83
    Undetected changes in circumstances
      making the application obsolete                 157              90
    State of the art in software testing
      which prevents testing all possible
      conditions                                      164              91

    The table also indicated, for each cause, whether it was identified from
    contacts with officials of Federal agencies (note b) and whether it was
    cited as a cause by internal auditors.

    a/The questionnaire presented "some possible causes of the design
      conditions (problems) * * *" and asked that, "based on your software
      design experience * * * indicate the degree to which you believe each
      of these causes contributes to the design condition (problems) in
      general." The responses allowed were to a: very large degree, somewhat
      large degree, moderate degree, somewhat small degree, very small
      degree, or not at all.

    b/Our contacts were made with various organizational elements, excluding
      internal audit, within five agencies: Department of the Navy;
      Department of the Air Force; Department of Health, Education, and
      Welfare; Veterans Administration; and National Bureau of Standards.
The problems identified are caused at various phases of the software design process, including

--user determinations,

--designer actions, and

--program coding.

Many problems are not detected through the review and test phases of the process and are therefore continued through implementation and operation of the automated decisionmaking application. Officials at the National Bureau of Standards and the Air Force believe that it is impossible to insure the design of completely error-free software under the current state of the art.
Inadequate communication between
the parties to software design
At least three groups of people must adequately communicate to develop or modify the applications successfully. Assuming that the user knows what he wants the computer to do and that his criteria are correct, inadequate communications of this information can result in developing software that is not exactly what the user wants.

Much has been written about the communication problem in software development, and it is generally recognized as a human problem.
Incorrect perceptions of the nature
of actual transactions to be processed
Decisionmaking criteria used in these applications have sometimes been erroneous because people developing them made wrong assumptions about the nature of the transactions that were to be processed. They may have relied on limited data about the transactions and established the criteria on their judgment.

Officials of one agency believed that a large percentage of automated decisionmaking application software problems were caused at the very beginning of the design process by people involved in defining requirements and establishing decisionmaking criteria.

In other cases, the designer may have used criteria contained in existing software to process transactions in a similar, but not identical, environment. Sometimes this is done to shorten design and programing time, but it can and has caused problems.
Inadequate documentation preventing
adequate reviews of software
In our October 8, 1974, report .(B,-115369) "Improvement
Needed in Dodumenting Computer Systems," we noted that some
agencies had not developed adequate guidelines for prepar-
ing good documentation. Several. Federal officials said
that this was still 'a problem and that, documentation for
'many computer applications (including automated decision-
maKing) was inadequate.
The report stated:

"In one case documentation explaining the objectives of the computer system was not prepared by the systems analyst. Without this information, management could not adequately monitor the system's development. * * * the system did not accomplish the results originally intended by management.

"In another case, inadequate documentation was cited as causing management to spend over a year to determine how the various programs in a complex system operated."
Adequate design documentation is needed to allow for

--reviewing the work done during application design and modification,

--making the necessary modification,

--correcting errors when they are detected, and

--insuring the application is operating as intended.
Time constraints hampering the
effectiveness of the design process
Many systems containing these applications are designed or modified because of legislation or other high-priority requirements imposed by top management. Often this calls for implementation by a specific date. Developing and/or modifying software within the required time frames can hamper efforts to insure its adequacy. Agencies that must make changes to these applications on the basis of legislation include VA and the Department of Health, Education, and Welfare (HEW). The Department of Defense often must make software modifications based on high-priority requirements imposed by top management.
Absence of written criteria or
guidelines for designers to follow
Federal officials had many opinions about the need for and nature of written guidelines that should be provided to designers of software. The agencies we visited had varying degrees of formal guidelines, but none provided instructions on how to do design work.

Some officials who believe that written criteria and guidelines on how to design software are not desirable refer to the process as an art that cannot be guided or improved by written instructions. However, the consensus of responses to our questionnaires indicates that the absence of criteria or guidelines can be a major cause for some automated decisionmaking application problems.
Detail and complexity involved in
designing, coding, and reviewing software
Even smaller applications can be extremely complex and detailed when designing and coding the processing logic and edit checks. The complexities and detail involved may also hamper the review process that may exist.
An illustration of the problem is VA's automated decisionmaking application for supplemental education benefit awards--which is a small part of VA's total education applications. This program consists of more than 1,100 lines of code covering about 420 decision points. One Navy automated disposal application--also a relatively minor program compared to others--contains about 7,300 lines of code with more than 290 decision points. More complex software, such as the Navy cyclical repair management program, has more than 64,800 lines of code with at least 630 decision points.
The sheer detail and complexity of the process can cause design and programing errors and omissions which are not caught in review and testing. Therefore, bad decisions occur.
Reliance on the expertise and
experience of people doing the work
The nature of the design process causes agencies to rely on designers who must be experienced in both the software design and the function to be supported by these applications.

Federal designers are

--schooled in the art of software design and learn the function to be supported,

--experienced in an operating function and learn the art of software design, or

--former programers and are promoted to the design function. Programers are generally schooled in writing code in specific computer languages.
Much reliance is placed on the individual designer's ability to convert user requirements to the type of detailed logic needed for programer coding. Reliance is also placed on the programer's ability to write code according to the logic given him. Because of the detail and complexity involved, it is difficult for management to review and assess every aspect of the designers' and programers' work.
Undetected changes in circumstances
making the application obsolete
A cause for erroneous decisionmaking criteria includes the failure to identify and/or to relate changes in processing circumstances to the operation of the application. Once the application is operational, it will make decisions--good or bad--on the same basis until it is modified.

Not recognizing changed circumstances so that applications could be modified could result in bad decisions based on criteria that were correct when designed but which no longer applied.
State of the art of program
testing which prevents testing
all possible conditions
The current state of the art makes it difficult for agencies to test for all conditions that may occur during the transaction processing. Most agencies cannot even be sure that the tests have exercised every line of code. As a result, accepted software can contain design and/or coding errors not identified during the test phase. Some of these errors may not be detected until long after the application becomes operational.

The inability to test for all conditions also precludes a full evaluation of user and designer criteria built into the program (if and when such evaluation is attempted).
DATA PROBLEMS
Data. used by the, computer in making-deciSions,comes
from A variety of sources, both internal and'exterrrad 'to
the .agency that has the Computer. A-tabulation. of'the-
various sources of data input, for the 128 automated deci-
sionmakin:, applicaticns identified is presented below.
                                                    Number of applications
    Source of input document                        in which the originator
                                                    was cited

    People within the agency operating
      the application                                         49
    People located outside the agency
      operating the application but within
      the same Federal department or
      independent agency                                      23
    People located in non-Government
      activities                                              12
    People located in other Federal
      departments or independent agencies                      7
Control over the completeness, accuracy, and currency of data largely depends on the source. Obviously, the correctness of an application operated at an agency where all the data comes from outside sources largely depends on the quality of data submitted. Some controls can be applied to incoming input, but they cannot guarantee completely error-free data.

According to some Federal officials, the largest single data problem is validating input data. However, data quality must be controlled from the moment data enters the system until the automatic processing is complete.
Types of controls for data
. There are two basic types of controls for -insuring
the Completeness, accuracy, and currency:Of data used by a
computer in makino decisions..
? 1. External controls are procedures developed outside
the, computer system. The objective is to check the
quality of data to. be put into 'and contained in the
computer system, The controls.. includesuch things
as manual procedures designed to determine if data
is recorded completely and accurately on input doc-
uments and whether documentS'are, being- received
and/or processed on time.'
2. Internal controls generally do not involve human
intervention. Many of these controls aro puilt
into the software. They include edit checks for
-completeness, logical relationship tests (does the
data make sense?) and reasonableness checks (to
isolate predetermined out-of-bounds conditibns).
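The internal controls described in item 2 can be illustrated with a minimal sketch in Python. The record layout, field names, and quantity limit below are hypothetical and are used only to show how completeness, logical relationship, and reasonableness checks might be programmed into application software; they do not describe any agency's actual edits.

    # Minimal illustrative sketch of internal (programmed) data controls.
    # Field names and the quantity limit are hypothetical.
    REQUIRED_FIELDS = ["item_number", "quantity", "unit_price"]
    MAX_REASONABLE_QUANTITY = 10000      # predetermined out-of-bounds limit

    def edit_check(record):
        """Return a list of error messages for one input record."""
        errors = []
        # Edit check for completeness: every required data element is present.
        for field in REQUIRED_FIELDS:
            if not record.get(field):
                errors.append("missing data element: " + field)
        # Logical relationship test: does the data make sense?
        if record.get("quantity", 0) < 0:
            errors.append("impossible condition: negative quantity")
        # Reasonableness check: isolate predetermined out-of-bounds values.
        if record.get("quantity", 0) > MAX_REASONABLE_QUANTITY:
            errors.append("quantity exceeds reasonableness limit")
        return errors

    # Example: a record missing its unit price and out of bounds on quantity.
    print(edit_check({"item_number": "A100", "quantity": 25000}))

Records that produce error messages would be rejected or suspended for correction rather than acted on automatically.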
According to the National Archives and Records Service, General Services Administration (GSA), both types of controls are necessary since no automated decisionmaking application can be reliable if either type of control is deficient.
These applications use data originally prepared by people. The data input process often consists of people

--filling out hard copy documents, 1/ often on predesigned standard forms, and

--converting the data to a form that can be read by the computer--machine-readable form.

1/Under some circumstances, such as source data automation and direct input devices, hard copy documents are not prepared.

As part of the external controls that should exist, the people doing the work should be qualified and adequately trained. Adequate guidelines should be given to these people on a timely basis instructing them how to fill out the documents involved, including what entries should be made under varying circumstances. The forms (hard copy and input) should be designed to be as simple as possible
to allow for easy reading by people. Procedures should exist for reviewing (i.e., statistical sampling) input documents to test their completeness and accuracy. Controls should also provide for timely processing of the data.

If incomplete or inaccurate data enters the computer system undetected, automatic actions can be incorrect. The actions will continue to be incorrect if that data is stored in ADP files and reused. These applications can also make incorrect decisions if current data is not put into the system.
Causes of data problems
To identify some of the causes of the data problems, we

--contacted Federal officials at several agencies,

--received responses to questionnaires from 205 individuals who are experienced in the area of data management in computers, and

--analyzed causes cited by internal auditors.

A schedule summarizing some of the causes of data problems is followed by a discussion of each.
Summary of Causes of Data Problems

                                          Opinions of people answering
                                          the questionnaire--degree of
                                          cause (note a)                  Identified from
                                          ----------------------------    contacts with      Cited as a
                                          Moderate to     Somewhat small  officials of       cause by
Cause                                     very large      or none         Federal agencies   internal
                                                                          (note b)           auditors

Forms designed and used for input
  preparation are too complex.                183               21               x
ADP files are not always adequately
  reviewed to assure that good data
  is being used.                              178               26
Instructions to people preparing data
  input are not always provided, are
  provided late, or are not adequate.         175               30
Preparers of data input are not always
  adequately trained.                         159               46
Manual reviews of input documents are
  not always adequate.                        144               61
High volumes of transactions cause
  input preparers to make errors
  (workload pressures).                       131

a/The questionnaire presented "some possible causes of the data conditions
  (problems) * * *" and asked that "based on your data management experience
  * * * indicate the degree to which you believe each of these causes
  contributes to the data condition (problems) in general."  The responses
  allowed were: to a very large degree, somewhat large degree, moderate
  degree, somewhat small degree, very small degree, or not at all.

b/Our contacts were made with various organizational elements, excluding
  internal audit, within six agencies: the Department of the Navy; Department
  of Health, Education, and Welfare; Veterans Administration; National Bureau
  of Standards; National Archives and Records Service; and Civil Service
  Commission.
The errors occur at the source of data preparation. They are not detected by the various internal controls in the software because controls for the specific error (1) are not designed or (2) cannot be designed.

Forms designed and used for
input preparation are too complex

Using simple forms to record, collect, transmit, and process information for input to computers improves the completeness and accuracy of the data eventually used by all computer application programs. The more complex the forms are, the more prone they are to data errors, which can affect the correctness of actions initiated by automated decisionmaking applications.
ADP files are not always
adequately reviewed to assure
that good data is being used

A recognized external control technique is to output and review data contained in ADP files. Failure to do this can result in obsolete or otherwise incorrect data used in automated decisionmaking applications. Incorrect decisions are therefore initiated. Without reviews, it is possible for some data errors to remain undetected for years and to allow for an accumulation of errors compounding the problem.
Instructions to people preparing data
input are not always provided, are
provided late, or are not adequate

It is important to provide clear instructions to people preparing input documents. Timely updating of these instructions when changes occur is also important. The failure to issue clear and timely instructions can cause data errors that may not be detected by internal controls.
Preparers of data input are not
always adequately trained
Most training in the input data preparation area is done by individual agencies, because it must be geared toward the individual application, each with its own special forms, data content, and related input media. Inadequate training of persons involved in processing data to the computer (such as filling out forms and punching cards) can lead to high error rates which result in bad decisions made by these applications.
Manual reviews of input documents
are not always adequate

External controls include selective manual reviews of input documents to determine completeness and accuracy. These reviews, made by supervisors or quality control groups, should be geared toward measuring the quality of data entering the system, including determining trends, significance, and sources of errors.

When there are different types and sources of input, review procedures should cover them all. Developing and monitoring statistical error rates is important. The review procedure, however, should also include determining the errors' potential materiality so that management can make judgments on where corrective actions should be taken.

Manual reviews supplement internal controls by (1) disclosing needed software data validation (such as edit checks) that is missed because of software problems or (2) identifying trends of material data errors which are not detected by software data validation.
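As an illustration only, the sketch below shows how error rates developed from sampled manual reviews might be tallied by input source so that management can judge where errors are material. The sources, counts, and threshold are made up for the example.

    # Hypothetical sketch: statistical error rates from manual reviews of
    # sampled input documents, tallied by source of input.
    samples = {
        # source of input: (documents reviewed, documents found in error)
        "station A":   (400, 12),
        "station B":   (250, 30),
        "contractors": (100, 2),
    }
    MATERIALITY_THRESHOLD = 0.05   # flag sources whose error rate exceeds 5 percent

    for source, (reviewed, in_error) in samples.items():
        rate = in_error / reviewed
        flag = "  <-- candidate for corrective action" if rate > MATERIALITY_THRESHOLD else ""
        print("%-12s error rate %5.1f%%%s" % (source, 100 * rate, flag))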
High volumes of transactions
cause input preparers to make
errors (workload pressures)

Automated decisionmaking applications are designed, in part, to help organizations cope with the high volumes of transactions that have to be processed. Although the computer processes the data once it is entered, the volumes of documents (hard copy and machine readable) that must be prepared are tremendous. For example, we estimated that during a 12-month period, the VA Center, Philadelphia, Pennsylvania, prepared more than 4 million documents for input to computers. Other VA activities throughout the United States also prepare such input documents. The Navy Aviation Supply Office (ASO), also in Philadelphia, annually receives about 10 million transaction reports for input to computers. The transaction reports are mainly prepared by Navy facilities that receive, store, and issue aeronautical equipment.

The volumes of data that must be processed by people recording material on original documents and preparing machine-readable documents can lead to workload pressures that result in data errors.
CHAPTER 5
FEDERAL MANAGEMENT OF
AUTOMATED DECISIONMAKING APPLICATIONS
Although we believe that most decisions made by these applications are correct, we know from audit reports we reviewed that they also make bad decisions that cost the Government many millions of dollars annually. Additionally, bad decisions can impede agency mission achievement and may result in harm to people.

To a large degree, software design and data quality control are an art. Much of the process is imperfect because people instruct the computer and supply data to it.

The fact that computers will act only as instructed by people, and on data prepared by people, makes them particularly susceptible to incorrect output, which in an automated decisionmaking application causes incorrect actions.

Undetected errors in preparing the software--whether caused by the user, the designer, or the programer--can cause the computer to repeat bad decisions. These errors will continue until the problem is detected and corrected.

Data problems may be random or repetitive. The repetitive problems resulting from such items as inadequate instructions and complexity of forms will also continue until corrective actions are taken.
RESPONSIBILITIES FOR ADP MANAGEMENT
IN THE GOVERNMENT
Public Law 89-306, the Brooks Act, specifies the major
ADP management responsibilities of the Office of Management
and Budget (OMB), the General Services Administration (GSA),
and the Department of Commerce.
Under this act, the Administrator of General Services is charged with economic and efficient purchase, lease, and maintenance of ADP equipment by Federal agencies. The Administrator also has some control over using ADP equipment.
The Department of Commerce is authorized to provide scien-
tific and technological services for ADP systems and to
make recommendations concerning ADP standards. This is
carried out through the National Bureau of Standards' Insti-
tute for Computer Sciences and Technology. The act states
that the authority granted to the Administrator of General
Services and to the Secretary of Commerce is subject to
policy and fiscal control by OMB. This constitutes
oversight responsibility for the area.
In response to Government needs for training and education in ADP, the Civil Service Commission's Bureau of Training operates an ADP Management Training Center. This center offers a variety of courses to Federal civilian and military personnel. Certain portions of the curriculum address the controls area in automated systems. The material presented should assist in alerting managers who take these courses to possible control weaknesses in their agency's operations.
No Federal-wide guidelines on automated
decisionmaking applications

Neither GSA nor the Secretary of Commerce has considered these applications as a separate subject matter for management consideration. There are, therefore, no established Federal guidelines for identifying, developing, operating, or monitoring these applications to insure that they are operating effectively and economically.
POLICY ACTIONS BY FEDERAL AGENCIES TO MANAGE
AUTOMATED DECISIONMAKING APPLICATIONS

No Federal agencies we contacted had considered these applications separately from other types of computer application programs in issuing management instructions. When instructions on software design had been issued, they were general and dealt with such things as

--levels of approval required to initiate and process a design project;

--concepts of project management--including setting priorities, establishing target dates, and requiring cost-benefit studies;

--the phases of software design and the documentation required; and

--testing and certification requirements.

Considering the current state of the art and the human problems that exist, we agree with those Federal officials who contend that issuing detailed instructions on how to design these applications (or other computer application programs) will not in itself materially reduce many of the errors that are made in them.
Inventories of automated decisionmaking
applications

Agencies have done little to establish centralized information on computer application programs that identifies these applications and shows their characteristics. Such characteristics include the (1) nature of actions initiated, (2) monetary and other impact on operations, and (3) nature and sources of input. Information is sometimes available within an agency but must be pulled together from different sources. This is done mainly when requested by higher level sources, such as headquarters, a budget committee, or an agency such as GAO. It is not normally done.
WHAT AGENCIES DO
We studied what Federal agencies do in designing, modifying, testing, and operating these applications. We also studied how these agencies manage data entered and contained in their computers. The studies were made at selected agencies of the Department of Defense (Navy), HEW (Social Security Administration), and VA (education and insurance applications). We also visited a responsible headquarters agency in the Department of the Air Force to discuss these subjects on a limited basis.

We examined policy and existing procedures and practices for managing computer application programs but did not verify that they were being employed as described to us.

Despite the apparent variances in the nature and types of policies and instructions issued, the same types of problems exist at these and other agencies.
Design and modification

VA had no written instructions for designing or modifying computer application programs. VA told us that it relied on written text material as a guide. VA has issued instructions on establishing and controlling software design projects, establishing approval levels, and establishing priorities and target dates.

The Social Security Administration (SSA) has issued a guide that describes the various phases of the design and modification processes, establishes review and approval steps, and describes who is responsible for doing the work.

Neither agency has issued instructions on how to do the design work or what to consider when doing such work. VA officials do not believe that it is necessary or even
feasible to issue such instructions. SSA assumes that designers and programers are adequately trained and experienced since courses are continually offered so that skills can be maintained at a satisfactory level.

The Navy Fleet Material Support Office (FMSO) is the central design activity for Naval Supply Systems Command activities. It has issued instructions to designers and programers in the form of information processing standards. The instructions provide guidance on what designers and programers are supposed to consider when doing the work, including
--customer and mandated requirements;

--logical sequencing of ADP actions;

--types of input and output;

--data formats and uses;

--data accuracy, completeness, and currency requirements;

--error and exception conditions (edit checks); and

--data volumes and frequencies.
Independent reviews of designed
and modified product
The reviews of the detailed designed product 1/ are generally made by the user and/or the people doing the work. According to agency officials the extent of these reviews varies from

--a page-by-page analysis made by ASO of products designed by FMSO to

--a less formalized cursory review made by supervisors or management.

1/Usually consisting of a narrative or flow-chart description of the processing to be followed by the computer during operation.
We observed no requirements for making independent reviews of the detailed designed product. Essentially, the people doing the work are responsible for doing the detailed reviews.

The Air Force Audit Agency independently reviews selected data processing systems before they are implemented (preimplementation reviews). These reviews, made at four Air Force design activities, include evaluating the designed computer application programs and related edit checks.

This approach requires the auditor to become familiar with functions supported by applications, as well as learning basic software design and data control concepts. It includes reviewing and evaluating (1) the decisionmaking criteria, (2) the program coding, (3) the edit checks, and (4) other potential data problems.

The Audit Agency had never calculated the cost savings that resulted from identifying and correcting potential problems before the applications were placed into operation. A major reason cited was that since corrective actions were often taken on the spot, there was no need for estimating unnecessary costs that would otherwise have resulted during operation.
Preimplementation audit reports of the Air Force Audit Agency showed that many of the problems that had been reported in operational automated decisionmaking applications were identified during preimplementation reviews, and Air Force design officials agreed that the problems existed. For instance, reports showed examples of

--erroneous decisionmaking criteria,

--programing errors, and

--inadequate data controls.

We discussed the concept of independent preimplementation reviews with the Deputy Director of the Air Force Office of Data Automation. He agreed with the concept of such independent reviews but preferred that the reviews be made by independent teams within the design activity. He believes that auditors should become involved in evaluating designed or modified applications as soon as possible after the applications are placed into operation.

Despite not making a savings analysis on preimplementation changes, the Air Force Audit Agency believes that preimplementation reviews should continue because:
--The quality of data systems is improved as a result of Air Force Audit Agency reviews.

--The dollar impact of resources managed by many automated systems is a proper subject for special audit.

--Systems audits during the development stage help increase the auditors' knowledge of the systems.

--The ability to make effective and efficient follow-on audits of operations is enhanced by the preimplementation reviews.
Testing
After the designed or modified application program is coded, agencies test the logic to determine whether the program will run and will perform the processing desired by the user. A description of the nature of testing by each agency follows.

--Programers at the Navy FMSO prepare predetermined test cases and files to test the logic of the program. If the results are satisfactory, the user operates the program with a duplicate ADP file and a selected number of actual transactions, which varies with each application. Some of the selected transactions are traced to determine if the program is operating as intended and whether the decisions being made are the same as operating personnel would make under the circumstances. The user advises FMSO if there is a problem.

--Programers and designers at SSA test with both test cases and actual transactions. The number of selected transactions will vary depending on the complexity of the program. The user is required to certify that the program is operating according to the user's requirements.
--VA primary testing is done by independent system auditors assigned to the Department of Data Management. The system auditors are independent of the programers and designers, although they also work for the same department. The system auditors use a large number of test cases that have been developed and reused during the years. An automated comparison of the processing is made before and after the logic changes, and the differences are printed out. Unless there are many differences, all are reviewed for correctness. The cases that are not printed are not reviewed. The auditors must certify that the logic conforms to the user requirements or issue exception reports when it does not. (A sketch of this kind of before-and-after comparison appears after this list.)
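The kind of automated before-and-after comparison described for VA can be sketched as follows. The two functions standing in for the old and new program logic, the benefit figures, and the test cases are hypothetical; the sketch only shows the idea of printing the differing results for review.

    # Sketch of an automated comparison of processing made before and
    # after a logic change, printing only the cases whose results differ.
    def old_program(case):                # logic before the change (hypothetical)
        return case["months"] * 270

    def new_program(case):                # logic after the change (hypothetical)
        rate = 292 if case["dependents"] else 270
        return case["months"] * rate

    test_cases = [
        {"id": 1, "months": 9, "dependents": 0},
        {"id": 2, "months": 9, "dependents": 2},
    ]

    for case in test_cases:
        before, after = old_program(case), new_program(case)
        if before != after:               # cases that are not printed are not reviewed
            print("case %d: before=%d after=%d" % (case["id"], before, after))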
Federal officials recognize that the current state of the art in program testing is imperfect. According to officials of the Institute for Computer Sciences and Technology of the National Bureau of Standards, most test procedures currently used do not insure that all lines of code have been exercised. Officials at the agencies visited agree that it is virtually impossible to test for every condition, but say they do the best they can by:

--testing as many conditions as considered feasible and necessary and

--adding to test case material conditions which caused problems during operations but had not been identified during the original test phase.

The Institute and the Air Force consider the test phase an area where the current state of the art must be advanced.

The Institute was aware of numerous examples of computer application programs which were considered to be adequately tested but which, during operation, ran into serious problems and caused incorrect actions. As a result, the Institute, in cooperation with the National Science Foundation, worked on methods to improve the state of the art.
One recently developed procedure is a software program that will monitor tests of computer application programs written in FORTRAN (a programing language). This program counts the number of times each line of code has been exercised by test cases. Even though there is no insurance that every conceivable condition will be tested, there is insurance that each line of code has been tested at least once. Until recently, this capability was not generally available.
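The FORTRAN monitor itself is not reproduced in the report; the sketch below only illustrates the underlying idea--counting how many times each line of a program under test is exercised by the test cases--using the standard tracing hook of a modern language. The program under test and its test cases are hypothetical.

    # Sketch: count how many times each line of a program under test is
    # exercised by test cases.  Any executable line that never appears in
    # the resulting counts was not exercised by the tests.
    import sys
    from collections import Counter

    line_counts = Counter()

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code.co_name == "program_under_test":
            line_counts[frame.f_lineno] += 1
        return tracer

    def program_under_test(x):
        if x > 0:
            return x * 2
        return -x                 # this branch is never exercised by the cases below

    sys.settrace(tracer)
    for case in (5, 12):          # the test cases
        program_under_test(case)
    sys.settrace(None)

    print(sorted(line_counts.items()))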
In a February 1972 report, the Air Force said that software design and testing were the two most critical problems in ADP requiring further research and development. In July 1973 the Air Force entered into a contract for the development of the type of software device that the Institute had developed but for a different programing language.

Monitoring of program operation

VA and the Navy largely rely on (1) internal auditors' reviews and (2) feedback from people affected by bad
decisions or operating personnel to identify automated decisionmaking application problems. No formal systematic monitoring of the applications' output is made, with one exception: VA audits education payments to veterans in excess of a predetermined amount. We believe that this is of limited value in identifying many costly systematic problems in automated decisionmaking applications because some types of transactions will never be reviewed.

SSA has a formal monitoring group continuously taking random samples of automated decisionmaking application output. According to SSA officials, this sampling has identified design and programing errors and repetitive data errors causing erroneous payments in operating automated decisionmaking applications. Examples of the kind of errors identified by this monitoring function include:
--Design, coding, or data problems in the automatic computation or recomputation of initial or subsequent benefits.

--Data problems in processing notices which affect payments.

--Design or coding problems in the updating of master data records (ADP files).

--Inadequate preparation of data.
SSA told us that system design and coding errors, as well as systematic repetitive data errors, were corrected as a result of this procedure. However, it could not give us statistics on numbers of errors found or their potential monetary impact, because SSA did not have this kind of information. 1/

1/Monitoring procedures are not always carried out as soon as new programs are placed into operation. The supplemental security income program, an automated decisionmaking application, did not have full-scale monitoring during its initial operational periods.

SSA requires categorizing, in addition to monitoring, the reasons for required program modifications. The categories include:

--Incomplete or incorrect performance requirements or program specifications.

--Logic errors or program omissions.
--Incomplete validation of input data.

--System-produced data not in accordance with specifications.

--Incomplete testing.

HEW headquarters said that a consulting firm noted a need for continuing reviews and evaluations of, among other things, applications software. The firm suggested that a four-member team, including an auditor, be responsible for reviewing selected applications on a short-term cyclic basis, including (1) reviewing the application against the original specification to determine that the software was performing as intended and (2) determining whether application programs had been adequately modified when the processing circumstances changed. HEW did not accept the firm's report.
Data control

The sources of data input vary for the following locations.

--Navy ASO, Philadelphia, receives much of its data from external sources including (1) contractors for new aeronautical equipment entering the supply system and (2) other Navy activities that receive, store, and issue aeronautical equipment.

--SSA, Baltimore, Maryland, receives most of its data from about 1,300 offices and centers throughout the United States.

--The VA data processing centers in Hines, Illinois, and Philadelphia, receive data from several VA stations throughout the country.
Internal Controls
Our review shows that, even though written procedures may not exist, agencies develop and program extensive edit checks in software to help insure the validity of data coming into the system. Agency officials admit that, although extensive work is done to analyze potential data errors during the design process, edit checks cannot be designed to identify all types of data errors.

In many cases erroneous but acceptable data may be placed on input documents. Because such data can represent a valid situation, there may be no way to design an edit check to insure that it is correct. Also, edit checks will
not catch errors not conceived of--and therefore not considered--in designing edits.

Agency officials agreed that, because of the detail and complexity involved in the design process, potential edit checks may be missed.
Examples of the types of checks observed at the three agencies visited included

--edit checks for incomplete data elements;

--reasonableness checks; for example, rejecting documents containing numerical values above or below a predetermined amount in a given data element;

--logical checks; for example, checks for impossible conditions, such as negative inventory balances or alphabetic characters contained in data elements that were designed to contain only numeric characters; and

--data relationship checks; for example, comparing data elements with other data on the same input document and/or contained in ADP files (a sketch of such a cross-file comparison follows this list).
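A data relationship check against an ADP file can be sketched as follows. The stock numbers, units of issue, and the use of a simple dictionary in place of a master ADP file are hypothetical; the point is only that a data element on the input document is compared with data already held on file.

    # Hypothetical sketch of a data relationship check: a data element on
    # the input document is compared with data already held in an ADP file.
    master_file = {
        # stock number: unit of issue recorded in the ADP (master) file
        "1560-00-123-4567": "EA",
        "1560-00-765-4321": "BX",
    }

    def relationship_check(document):
        stock = document["stock_number"]
        if stock not in master_file:
            return "reject: stock number not on master file"
        if document["unit_of_issue"] != master_file[stock]:
            return "reject: unit of issue disagrees with master file"
        return "accept"

    print(relationship_check({"stock_number": "1560-00-123-4567",
                              "unit_of_issue": "BX"}))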
External controls
Because agencies receive input from numerous sources, we limited our study of external controls to the controls at the agencies actually visited (VA Center, ASO, and SSA).

--VA has written procedures for several external control functions which include (1) random sampling of input documents to identify and develop statistics which are used for identifying error rates and error sources, (2) selected verification of eligibility data contained in ADP files, (3) date stamping and sampling of documents to control the timeliness of documents processed, and (4) controls over unprocessed (pending) documents.

--ASO makes no manual reviews of supply-related data received from Navy activities and therefore primarily relies on (1) controls at the data preparing site and (2) internal controls designed in the ASO software. ASO makes selected manual reviews of data received from contractors on new aeronautical components before the data is allowed to enter the system.
--SSA basically relies on (1) internal controls designed into the software, (2) end-of-line monitoring procedures, and (3) manual reviews at the vast numbers of offices and centers preparing the data.
CHAPTER 6
AUTOMATED DECISIONMAKING APPLICATIONS CONTINUE TO
MAKE BAD DECISIONS UNTIL PROBLEMS ARE CORRECTED
Errors made by users, designers, and programers of automated decisionmaking applications, if not identified and corrected in the review and testing phases of the design process, can cause bad decisions which will continue until the errors are detected and corrected. When an insignificant error for a given action is multiplied by thousands or millions of the same type of actions over a period of time, the error is compounded. Unnecessary costs will grow and become large. An error allowed to exist for 3 years will cost the Government more than if the error is detected and corrected within, for example, 3 months after the automated decisionmaking application is in operation.
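To illustrate the arithmetic with figures that are entirely hypothetical: an error causing a $5 overpayment on each of 10,000 transactions a month produces roughly $150,000 in unnecessary costs if it is corrected after 3 months, but roughly $1.8 million if it is allowed to run for 3 years.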
ERROR DETECTION
In previous chapters we discussed what agencies do to detect design and data problems. Because errors get through design and test processes and because data errors are made, early detection of them is important in reducing the cumulative effects of bad decisions.

ERROR CORRECTION

Detecting errors occurring in automated decisionmaking application software and/or data will not, by itself, stop the unnecessary costs being incurred. When detected, action must be taken to correct the errors by modifying the software, or improving the data quality, or both.

We have noted some instances in which problems were identified but corrective actions were not taken for a long time. An example follows.
Navy use of overstated demands in
automated decisionmaking applications

A GAO report, B-162152, May 21, 1974, noted that in 1969 Navy auditors saw a need to design a routine in the standard computerized supply management system used by Navy inventory control points for removing from ADP files past material usage quantities (demands) associated with canceled requisitions. The demands recorded in these ADP files were used by several automated decisionmaking applications.
The report noted that in 1969 Navy command officials agreed with the need to properly adjust demand forecasts for invalid orders but said that it would not be able to correct the problems before 1971 because of other priority work. The report said that, at the time of the GAO review in 1972, the Navy was still not eliminating from ADP files demands related to invalid orders.

We estimated that about $34 million in invalid demands were in Navy ADP files and that these overstated demands resulted in unnecessary materiel buys and repairs totaling about $10 million a year. At least $3 million in annual unnecessary costs were initiated by automated decisionmaking applications using this overstated demand data.
The design change to correct the condition had not been made at the time of this review, so we discussed the reasons for the delay with appropriate Navy officials.

We were told that, because of the GAO report and direction received from the Department of Defense, a high-priority project was established on June 14, 1974, to make the needed design modification.

The reasons cited by Navy officials for the 5-year delay in initiating the modification included

--disagreements within the Navy on whether all canceled requisitions should result in reducing recorded demands,

--high-priority workload at the design activity mandated by higher headquarters levels in both the Navy and the Department of Defense, and

--lack of pressure placed on the Navy command and design activity by the inventory control points since reduced demands could result in budget reductions.
AGENCY PROCEDURES FOR TIMELY CORRECTION
OF SOFTWARE DESIGN PROBLEMS

Agencies establish priorities and target dates for software design and modification projects. Agency guidelines also require cost-benefit studies to justify establishing and committing resources to a large design effort.
According to some Federal officials, however, little attention is given to doing cost-benefit studies which demonstrate either (1) how much will be saved by eliminating an automated decisionmaking application problem that exists or (2) how much the continuing automatic decisions will cost the Government if the problem is allowed to go unchanged.
CHAPTER 7

OPINIONS ON WAYS TO PREVENT OR REDUCE
THE IMPACT OF PROBLEMS IN
AUTOMATED DECISIONMAKING APPLICATIONS
We believe that, despite the imperfect state of the art in application design and the widespread problems of getting quality data to the computer, every Federal agency using these applications should consider doing certain things to prevent or reduce the impact of the problems identified in this report.

We issued a questionnaire to 204 members of each of the following professional associations that are dedicated to furthering the quality of ADP-produced products:

--The Association for Computing Machinery's Special Interest Group for Business Data Processing.

--The Association for Computing Machinery's Special Interest Group for Management of Data.

--The Society for Management Information Systems.
The questionnaire described the various problems that we had observed in both the software design and data areas and requested the members to rate possible solutions presented in terms of their effectiveness and cost benefit. The ratings were designed to determine the validity of each solution, assuming each application involved spending millions of dollars or had an impact on people.

Some of the solutions can be applied before the application becomes operational to prevent problem conditions. Some of the solutions were to be applied after the automated decisionmaking application became operational to detect problem conditions early. If timely correction is made, the impact will be reduced.
A total of 263 people responded to the questionnaire.
Summary of People
Answering the GAO Questionnaire

                                       Portion of questionnaire
                                         qualified to answer
Affiliation of data             Design        Design       Data
processing professional         and data      only         only       Total

Commercial concern                 136           45            4        185
Academic                            34           10            1         45
Government                          25            1            1         27
Not indicated                        4            2            0          6
                                   ---           --           --        ---
    Total                          199           58            6        263
POSSIBLE SOLUTIONS--SOFTWARE PROBLEMS

Some of the highly rated solutions to the various design conditions are:

--Documentation should be prepared that highlights (1) key portions of the automated decisionmaking criteria, (2) data elements that are critical to the decisionmaking, and (3) the edit checks placed (or justifications for omitting them) in the software. A formalized synopsis of these items should be prepared for review and approval by top management.

--Qualified auditors or others who are independent of designers and users should review the designed application before it is placed into operation. Others could include a design team independent of the original designer and user. They would be responsible for evaluating the (1) adequacy of the decisionmaking criteria, (2) logic in the coded application, and (3) needs and uses of edit checks to detect incomplete data elements put into the application.
--Similar independent teams should review the operation of these applications shortly after they are implemented. The objectives would be to evaluate the adequacy of the decisionmaking criteria in an operational environment and to provide for early detection of any bad decisions. This would allow for early correction of problems.

--Some form of cyclical system monitoring of actions initiated by operational automated decisionmaking applications should exist. Teams composed of (but not restricted to) designers, users, and auditors could
analyze application-initiated actions to (1) see if desired results were achieved the best way, (2) identify unforeseen circumstances that would require modifying the application, (3) determine that the actions were as the user and designer intended, and (4) insure that decisionmaking was not adversely affected by incomplete data not being screened by an edit check.

--The designer and user should be physically located in the same place during design phases to allow for constant communication. In effect, the design would be a joint effort and would help to insure that adequate decisionmaking criteria were contained in the application.

--Priorities should be established for software modification (changes) which are at least partially based on the cost of continuing incorrect automatic actions if no changes are made within a short time.

--The initiator of the needed software modification (for example, headquarters, user, audit team, and/or others) should be informed about the status of the change and be provided with confirmation that the changes have been made.
POSSIBLE SOLUTIONS--DATA PROBLEMS

Some of the highly rated solutions to the various data conditions are to:

--Establish followup procedures for insuring the (1) timely receipt of data preparation instructions and (2) use of instructions by data preparers.

--Emphasize in training the importance of complete and correct data on computer input documents.

--Make selective manual verification of key data on input documents and in ADP files with hard copy documents and with the data originator.

--Establish a single organization (data base administrator) that could be responsible for the above steps as well as evaluating and testing internal and external data controls employed and input documents designed and used.
CHAPTER 8
CONCLUSIONS, RECOMMENDATIONS, AND
AGENCY COMMENTS
Automated decisionmaking applications initiate the spending of billions of dollars a year without anyone reviewing and evaluating the individual actions. They are also used to support a multitude of functions that, although not directly related to money expenditures, can affect mission achievement and make decisions regarding individuals.

Many of these applications make bad decisions because of various software and data problems. The causes of the problems are numerous. Bad decisions may result in unnecessary costs and overpayments of hundreds of millions of dollars a year--exactly how much is unknown. Such bad decisions can also impair mission performance and harm individuals.

In the current imperfect environment, the chances of continuing bad decisions and unnecessary costs are great. Actions are needed. We believe that it is necessary, therefore, to develop and issue Federal-wide guidelines to foster uniform cost-effective practices that will (1) minimize the chances of problems occurring, (2) detect as soon as possible the problems that do occur in operating automated decisionmaking applications, (3) correct problems as early as possible to reduce their adverse impact, and (4) insure that the practices are being effectively applied.
Some practices we consider necessary to meet these objectives already exist at some agencies. For instance, we observed testing, joint design, and inclusion of internal data controls. We also observed some established data management practices which could identify data input problems.

Several practices considered by us and by data processing professionals to be cost effective in reducing the chances or impact of bad decisions were not being applied to all crucial automated decisionmaking applications. This indicates a need for central guidelines in such areas as:

--Preparing documentation and/or a formalized synopsis that highlights, for example, key decisionmaking criteria, data elements critical to the decisionmaking, and edit check placement to facilitate thorough reviews by others.
--Making preimplementation reviews of the designed or modified applications and internal data controls. The reviews should be made by groups that are independent of the designer or user. The groups should consider evaluating, among other things, the (1) adequacy of the decisionmaking criteria, (2) logic in the coded application, and (3) needs and uses of edit checks contained in these applications.

--Analyzing actions initiated by these applications as soon as possible after they are placed into operation to insure that (1) they are operating as intended, (2) the intended operation is the most economical and effective method, and (3) circumstances that were not considered during design have not arisen.

--Cyclical or ongoing monitoring of automated decisionmaking application output to insure that (1) desired results are achieved most economically and effectively, (2) new circumstances have not arisen that will require changes to the decisionmaking or other processing criteria, (3) the logic is correct, and (4) decisionmaking is not adversely affected by incomplete data not being caught by an internal edit check.
--Establishing priorities and target dates for software modification which are at least partially based on the unnecessary costs of continuing incorrect automatic actions and keeping the initiator of modifications informed of the status of the changes.

--Establishing a single point in each organization that would have prime responsibility for insuring that these applications are making decisions based on the best data available by (1) evaluating and testing the data and data controls (internal and external), (2) adequately training data preparers, (3) reviewing the adequacy and currency of instructions given data preparers and insuring they are complied with, and (4) insuring that forms designed for data processing minimize the chances of data errors.

To begin focusing on what should be managed, top management in each agency should be aware of the automated decisionmaking applications that exist (operational and under development), the functions they support, their monetary and other impacts, nature and sources of input, the output-initiated actions, the programed reasons for any manual intervention, and other important characteristics.
Agencies should be required to take stock of their automated decisionmaking applications. This action should include ascertaining whether their current practices for developing, modifying, and operating such applications, together with related data controls, are adequate to surface problems of the types discussed. Guidelines should be issued to indicate cost-effective corrective procedures, and agency management should insure that automated decisionmaking applications are under control.
RECOMMENDATIONS
We believe that, since automated decisionmaking applications have not previously been recognized as a separate problem area requiring management attention, and since millions of dollars are presently being wasted as the result of actions generated by such systems, the Office of Management and Budget (OMB) should act immediately to improve the situation. Specifically, we recommend that OMB, in its oversight capacity, require that:

--Each agency determine whether any of its computer operations involve automated decisionmaking applications.

--The agencies review each operation to determine whether incorrect actions are being taken as a result of these applications. (Pending issuance of technical guidelines by the National Bureau of Standards for making such reviews, the agencies should examine enough automatically generated decisions to provide a basis for deciding whether incorrect decisions are occurring and, if so, should take the necessary steps to correct the situation causing the incorrect decisions.)
--Before any new automated decisionmaking applications are initiated by an agency, the proper steps are taken to insure correct decisions. This would include, pending issuance of the National Bureau of Standards guidelines, a carefully chosen combination of independent review of systems design, adequate testing before implementation, and periodic testing of decisions after implementation, as discussed earlier in this report.

--Agencies report on the actions taken and establish an appropriate mechanism for monitoring such reports.
We recommend that, because the National Bureau of Standards has responsibilities for technical aspects of ADP, the
Secretary of Commerce direct the Bureau to issue technical guidelines for developing, using, technically evaluating, documenting, and modifying these applications in the Federal Government. When issued, these guidelines should contain certain criteria for independent technical reviews and for monitoring of these applications to insure problems are detected and corrected promptly. The General Services Administration should incorporate Bureau guidelines in its agency directives.
In addition, we recommend that:

--As GSA suggested, the Civil Service Commission develop and add to its ADP training curriculum courses in automated decisionmaking applications so that managers, technical personnel, and auditors will become better equipped to deal with them in an appropriate manner.

--Internal audit groups in agencies having automated decisionmaking applications participate actively in design, test, and reviews of such systems to carry out their responsibilities.
Finally, we suggest that the Joint Financial Management Improvement Program consider this area for ongoing attention.

We are sending copies of this report to all departments and independent agencies for their information, use, and guidance pending issuance of the OMB and National Bureau of Standards material.
AGENCY COMMENTS

We issued the proposed report to several agencies for comment. Their replies indicate general agreement as to the problems reported and varying opinions on the recommendations.

With respect to the problems, the Associate Deputy Administrator, VA, agreed that there was a need for sound management of current and sophisticated data processing systems. He said the report was useful in identifying and consolidating the problems associated with automated decisionmaking applications. He believes that the formulation of standards relating to these applications is imperative.
The Assistant Secretary, Comptroller, HEW, said that no one would disagree that software and data problems exist and that such problems could result in automated decisionmaking applications that made erroneous decisions in some
cases. He believed that as much emphasis should be placed on preventing software errors as on detecting and correcting them. He agreed that the current state of the art in software development could not assure error-free software.

The Assistant Secretary of Defense, Comptroller, said that most of DOD's automated systems fit the definition of automated decisionmaking applications, although damage resulting from errors in some systems was less direct and less measurable than in disbursing and supply systems. He added that our statements of possible solutions to software and data problems are logical and constructive and that while they are similar to many DOD practices, their documentation will assist system developers, auditors, and operators.

The Acting Administrator, GSA, said that the report performed a valuable service in identifying automated decisionmaking applications as an area of data processing concern and, as such, warrants wide circulation to ADP software managers in the Federal Government. He strongly agrees with our solutions for software and data problems, including

--preimplementation and postimplementation system reviews by independent groups and

--cyclical system monitoring.
The agencies had varying opinions on the tentative proposals contained in our proposed report. We have weighed their comments and considered them in formulating the proposals in this report. For example, we proposed that the agencies involved report to GSA on actions taken in response to our recommendations. Upon consideration of the responses to our proposed report, we have modified our recommendation to provide for OMB to determine an appropriate reporting mechanism.

Also in response to our proposed report, the Acting Administrator, GSA, suggested that the National Bureau of Standards could develop Government-wide guidelines for information systems development which could specifically include automated decisionmaking applications.
On January 12, 1976, we discussed the suggestion with the Director, Institute for Computer Sciences and Technology, National Bureau of Standards, who agreed to the need for Government-wide technical guidelines that would include developing, using, modifying, reviewing, and monitoring automated decisionmaking applications and said that budgetary resources would be solicited for the National Bureau of Standards to perform this task. The guidelines, when
completed, would be issued as part of the Federal information processing standards series for use by Federal agencies.

We informally discussed the recommendations with OMB officials who have responsibilities in the ADP area. They believe that the report points out important problems in this area and agree that issuing policy guidance is appropriate.

We discussed our recommendations to the Civil Service Commission with officials of the ADP Management Training Center who agreed to further emphasize controls in their ADP training.
APPENDIX I                                                    APPENDIX I

DEPARTMENT OF HEALTH, EDUCATION, AND WELFARE
OFFICE OF THE SECRETARY
WASHINGTON, D.C. 20201
Mr. Gregory J. Ahart
Director, Manpower and
Welfare Division
U.S. General Accounting Office
Washington, D.C. 20548
Dear Mr. Ahart:
The Secretary asked that I respond to your request for our comments on your draft report to the Congress entitled, "Improvements Needed in Managing Computer-Based Automated Decisionmaking Applications in the Federal Government." They are enclosed.

We appreciate the opportunity to comment on this draft report before its publication.
Sincerely yours,

John D. Young
Assistant Secretary, Comptroller

Enclosure
COMMENTS ON GAO'S DRAFT REPORT ENTITLED
"IMPROVEMENTS NEEDED IN MANAGING
COMPUTER-BASED AUTOMATED DECISION MAKING
APPLICATIONS IN THE FEDERAL GOVERNMENT"
OVERVIEW
The draft report identifies a certain type of EDP application which GAO calls an Automated Decision Making Application (ADMA), and notes that ADMAs are widely used by Federal agencies. The report points out that the distinguishing characteristic of ADMAs, as compared to other computer application programs, is that many of the actions initiated by the computer take place without review and evaluation by people. According to GAO, there are indications that funds are being wasted because of incorrect, unreviewed actions.

The report discusses, at some length, the use of ADMAs by Federal agencies, and points out that ADMAs can make bad decisions. It categorizes the causes of these bad decisions as being software problems or data problems, then goes on to identify and discuss the reasons for these problems.
No one will disagree that software and data problems do exist, and that such problems can result in ADMAs that make erroneous decisions in some cases. It is of utmost importance, therefore, that such problems be prevented during the design and implementation of the system. While we are of the opinion that the current state of the art in software development techniques and test techniques cannot assure that error-free software can be designed, techniques are available that can contribute significantly to the reduction of software errors. Furthermore, practice has suggested that the method of organization of a development effort can have a favorable impact on the error level as well as the development cost.

Since the state of the art of development and testing techniques cannot assure error-free software, it is of equal importance that reviews of systems take place before operation, shortly after implementation, and on a continuing or cyclical basis for operational systems. The extent of review of an ADMA should be a function of the probability and impact of errors.
The report discusses various ways to prevent or reduce the impact of problem conditions in ADMAs. In our opinion, the possible solutions mentioned in the report are, for the most part, reasonable. We would, however, place a greater emphasis than made in the GAO report on (1) involvement of the user in the development of an ADMA and (2) approaches to reducing the probability of errors at the design and implementation stages rather than emphasizing error detection and correction in the operational stage.
GAO concludes that the development and use of ADMAs is necessary but, because of the current imperfect environment, chances of continuing bad decisions and unnecessary costs are great. Consequently, GAO believes that it is necessary to develop Federal-wide policy to foster uniform cost-effective practices that will minimize the chances of problems occurring, detect the problems as early as possible, and assure that the practices are being effectively applied.
GAO RECOMMENDATIONS AND HEW COMMENTS
RECOMMENDATIONS
Because GSA is responsible for developing Government-wide policy on ADP management and for seeing that the policy is carried out by the departments and agencies, GAO recommends that the Administrator, GSA:
Require the identification and characterization of ADMAs used by Federal agencies. (A starting point for material to be included can be the types of data GAO obtained during its study of ADMAs -- volume of transactions, impact of decisions, etc.) This will provide agency management and auditors with basic information on where their resources could best be applied.
Issue policy requirements and guidelines for the management of ADMAs in the Federal Government. Most importantly, the policy and guidelines should establish criteria for independent reviews and monitoring of ADMAs to assure that problems are detected and corrected in a timely manner. The policy should also include criteria for cost-effective development, modification, documentation, review, and testing of ADMAs.
Require agency reporting concerning (1) actions taken based on the criteria and (2) problems identified and corrected as a result of independent reviews and monitoring of ADMAs. Justification of cost-effective ways of managing ADMAs should be included.
HEW COMMENTS
With respect to GAO's first recommendation, we do not believe that it would be useful to have all agencies identify and characterize their ADMAs. To do so would result in the preparation of an enormous volume of reports covering hundreds of ADMAs. Since it is unlikely that GSA would be
able to effectively utilize these reports, their development and preparation would be a waste of agency time and resources.
For similar reasons we do not favor GAO's third recommendation, which would require agencies to submit reports concerning the actions taken pursuant to GSA policy directives.
We agree in principle with the second recommendation -- that GSA establish guidelines for the management of ADMAs in the Federal Government. The establishment of guidelines would encourage agencies to utilize acceptable practices for developing, modifying, reviewing, and monitoring their ADMA systems.
We are of the opinion that such guidelines as GSA might develop must be flexible to recognize that ADMA systems are of varying complexity and of varying impact in terms of probability and cost of errors. Thus, practices employed for the development, modification, review, and monitoring of a particular ADMA should be oriented towards overall cost reduction, i.e., expected cost of errors plus cost of development, modification, etc. In light of the diversity of ADMAs, we do not believe that it is practical to establish policy requirements at this time. We believe that a more effective procedure would be for GSA to issue guidelines and then to periodically conduct on-site reviews and audits of various agency ADMAs. The objective of such review would be twofold: (1) determination of the extent to which guidelines were being followed by agencies and (2) determination of the effectiveness and efficiency of the recommended practices so that they could be developed and refined based on actual experience.
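For illustration, the cost tradeoff described above can be sketched in a few lines of Python; the transaction volume, error rates, cost per error, and review costs below are hypothetical figures, not drawn from the report:

    # Hypothetical comparison of two review regimes for one ADMA, using the
    # "expected cost of errors plus cost of development/review" idea above.
    # All figures are invented for illustration only.

    def total_cost(transactions, error_rate, cost_per_error, review_cost):
        """Expected cost of erroneous automated decisions plus the cost of the controls."""
        return transactions * error_rate * cost_per_error + review_cost

    # A high-volume payment application, hypothetical figures.
    light = total_cost(1_000_000, error_rate=0.002,  cost_per_error=40.0, review_cost=50_000)
    heavy = total_cost(1_000_000, error_rate=0.0005, cost_per_error=40.0, review_cost=120_000)

    print(f"light review: ${light:,.0f}")   # $130,000
    print(f"heavy review: ${heavy:,.0f}")   # $140,000
    # Which regime is cheaper depends on the probability and impact of errors,
    # which is why the extent of review should vary from one ADMA to another.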
Furthermore, as we indicate in the Overview to these comments, we believe that efforts to eliminate errors during development are of equal importance to the review and monitoring efforts. Therefore, we suggest modifying the second recommendation to read:
"Issue guidelines for the management. of ADMAs in the
Federal Government. These guidelines should include.
recommended practices and criteria for cost effective:
I. development, modification ani testing Of ADMAs ?
to reduce error levels in software and data
collection,
2. documentation of ADMAs for internal and external
uses,
3. review and rohitorina of ADYA6 both as continuing
activities by sy-tems ang user porsonnel and by
independent groups."
.OTHER "COMMENTS AND SUGGESTIONS ON THE REPORT
1. In the third paragraph on page 5 of the draft report, a statement is made that "SSA officials said that they assume that designers and programmers are adequately trained and experienced and that such instructions are not necessary." This is not an accurate statement. We suggest that GAO change the sentence to read:
"An SSA staff member said that designers and programmers are adequately trained and experienced, since there are continuing courses offered in system design so that skills can be maintained at a satisfactory level."
2. In the last paragraph on page 65, the second sentence reads "According to SSA officials, this sampling has identified many design and programming errors and repetitive data errors causing erroneous payments in operating ADMAs." The word "many" is misleading in that this is an end-of-the-line operation and most errors are discovered in validations, etc., long before these operations are performed. The sentence should read:
"According to SSA staff members this sampling has identified design and programming errors and repetitive data errors causing erroneous payments in operating ADMAs."
3. The first paragraph on page 66 begins "SSA advised us that many system design and coding errors, as well as systematic repetitive data errors, are corrected as a result of this procedure." For the same reasons given in the preceding paragraph of our comments, the word "many" is misleading and should be deleted.
[See GAO note, p. 61.]
5. There is considerable overlap and duplication in several chapters of the report. In particular, we suggest that Chapters 3 and 4 be combined to improve readability.
6. We believe, in general, that the report tends to underplay the importance of the user in the development of an ADMA. We note with interest that in the opinion of "people answering the questionnaire" (page 36 of the report) the most often cited problem is "inadequate communications between the parties to software design." The second-ranked problem in this list is "incorrect perceptions of the nature of the actual transactions to be processed."
Furthermore, in Chapter 7, "Opinions on ways to prevent or reduce the impact of problem conditions in ADMAs," respondents from several professional organizations suggest that "Physical collocation of the designer and user should be accomplished during the design phases to facilitate constant communication. In effect, the design would be a joint effort and would help to insure adequate decision-making criteria contained in the ADMA."
Despite the importance of these causes of errors and of this recommendation of professionals to overcome them, the policies advocated by GAO in Chapter 8 fail to address the necessity of user involvement.
Therefore, we suggest that the GAO report place greater emphasis on the participation and responsibility of the user in an ADMA system. In commenting on the draft GAO report "GAO Guidelines for Management Information Processing Systems," May 1974, HEW stated: "The Guidelines include the user in the system development from the standpoint of user education as opposed to user participation. While user education is important, it is not enough. The success or failure of a system is critically dependent on user involvement and participation." We believe that this dependency is even more critical in an ADMA system.
7. The importance of personnel selection and training for ADMA development, operation, monitoring and review should be given greater emphasis in the GAO report. Designers and programmers should be familiar with design tools and techniques, e.g., structured and modular flowcharting and programming, decision tables, data base design tools, data element management, data collection alternatives. Management should be aware of alternative organizations for system development, e.g., chief programmer teams. Designers should also be aware of techniques for testing and monitoring systems, including statistical sampling approaches; a brief sketch of such a sampling check follows these comments. Knowledge can be obtained via government or private sector training courses.
GAO note: Material no longer related to report has been deleted.
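For illustration, the statistical sampling approach referred to in comments 2, 3, and 7 can be sketched as follows (Python; the record layout, verification rule, and all figures are hypothetical, not drawn from any SSA system):

    import random

    def verify(payment):
        """Independent re-check of one automated payment decision (stand-in rule)."""
        return payment["amount"] == payment["entitled_amount"]

    def sample_error_rate(payments, sample_size, seed=1976):
        """Draw a random sample of processed payments and project an error rate."""
        rng = random.Random(seed)
        sample = rng.sample(payments, min(sample_size, len(payments)))
        errors = sum(1 for p in sample if not verify(p))
        return errors / len(sample)

    # Illustrative run: 10,000 processed payments, about 25 of them deliberately wrong.
    payments = [{"amount": 100.0, "entitled_amount": 100.0} for _ in range(10_000)]
    for i in range(0, 10_000, 400):
        payments[i]["amount"] = 120.0

    print(f"estimated error rate: {sample_error_rate(payments, sample_size=500):.2%}")

Such an end-of-the-line check does not replace validation during design and implementation; it only estimates how many erroneous decisions slipped through to payment.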
APPENDIX II

VETERANS ADMINISTRATION
OFFICE OF THE ADMINISTRATOR OF VETERANS AFFAIRS
WASHINGTON, D.C. 20420
NOVEMBER 25, 1975
Mr. Gregory J. Ahart
Director
Manpower and Welfare Division
U.S. General Accounting Office
Washington, D.C. 20548
Dear Mr. Ahart:
We appreciate the opportunity to review and comment on your draft report relating to the management of automated decision-making applications and are in agreement that there is a need for sound management of the large, sophisticated data processing systems in existence today.
[See GAO note.]
Your report has proved useful in identifying and consolidating, in one place, many of the problems associated with automated decision-making applications in clear, straightforward language. We believe that the formulation of standards relating to these applications is imperative, and have already begun to draft our own general requirements and guidelines.
RICHARD L. ROUDEBUSH
Administrator

GAO note: Deleted comments refer to material discussed in our draft report but not included in this final report.
APPENDIX III

DEC 29, 1975
UNITED STATES OF AMERICA
GENERAL SERVICES ADMINISTRATION
WASHINGTON, D.C.
Honorable Elmer B. Staats
Comptroller General of the United States
General Accounting Office
Washington, D. C. 20548
Dear Mr. Staats:
We appreciate the opportunity to review your draft report "Improvements Needed in Managing Computer-Based Automated Decisionmaking Applications in the Federal Government."
The report performs a valuable service in identifying automated decisionmaking applications (ADMAs) as a discrete area of data processing concern and, as such, warrants wide circulation to ADP/software managers in the Federal Government.
We strongly agree with the following GAO recommended solutions for software and data problems:
. Pre-implementation and post-implementation systems audits by independent groups.
. Cyclical system monitoring.
. Joint system design by users and ADP systems analysts.
In addition to the management solutions mentioned in the report, there are modern computer programming techniques which can aid in increasing the integrity of any system. Developing detail logic with decision tables, rather than flow charts, is particularly effective in data editing applications. The use of "top down" programming and "chief programmer teams" is proving successful in minimizing errors. Employment of a data base administrator throughout both the developmental and operational stages of a system will help assure that valid data is being processed.
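For illustration, the decision-table approach to data editing mentioned above can be sketched as a simple rule table (Python; the fields, limits, and actions are hypothetical, not taken from any agency system):

    # Hypothetical edit rules for one claim record, held in a table rather than
    # in nested branching logic. Each entry pairs a condition with the action
    # taken when that condition fails.

    EDIT_TABLE = [
        # (rule name,           condition on the record,                  action on failure)
        ("amount present",      lambda r: r.get("amount") is not None,    "reject"),
        ("amount positive",     lambda r: (r.get("amount") or 0) > 0,     "reject"),
        ("amount within limit", lambda r: (r.get("amount") or 0) <= 5000, "refer for review"),
        ("payee code on file",  lambda r: r.get("payee") in {"A1", "B2"}, "reject"),
    ]

    def edit(record):
        """Apply each rule in order; return the first failing action, or 'accept'."""
        for name, condition, action in EDIT_TABLE:
            if not condition(record):
                return action, name
        return "accept", None

    print(edit({"amount": 120.0, "payee": "A1"}))    # ('accept', None)
    print(edit({"amount": 9000.0, "payee": "A1"}))   # ('refer for review', 'amount within limit')
    print(edit({"amount": 50.0, "payee": "ZZ"}))     # ('reject', 'payee code on file')

Keeping the conditions and actions in one table keeps the editing rules visible and easy to review, which is part of why the letter singles the technique out for data editing.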
While we generally agree with the conclusions and recommended solutions for software and data problems contained in the report, we do not agree with the recommendations that GSA issue policy and guidelines for the management of ADMAs nor that GSA require agency reporting to allow monitoring of agency performance. Rather we would suggest, since ADMAs are part of the broader universe of information systems development, that:
. The National Bureau of Standards, with GSA cooperation, develop government-wide guidelines relating to information systems development which should specifically include automated decisionmaking.
. Agencies report to their own agency head regarding decisionmaking criteria, ADMA problem identification and corrective actions taken, and that these reports be made available for review by OMB and GSA, in line with review provisions in Federal Management Circular 7-1-5.
. The Civil Service Commission include in its management training programs a course on automated decisionmaking stressing the need for cost effective development, joint systems design by users and ADP systems analysts, systems monitoring and auditing of ADMAs.
Because of the significance of this report, we had the opportunity to have the Ad Hoc Committee for Implementation of P.L. 89-306 briefed by a representative from GAO prior to issuance of this draft. This Committee is chaired by the Commissioner, ADTS, and representatives from ADP-intensive agencies are committee members. At a later meeting, our comments were discussed and the Committee generally concurred in the approach GSA is proposing.
If you have any questions, please let us know.
Sincerely,
1.)7$ ght
Acting Administrator
APPENDIX IV
Because of the unique problems of automated systems, we have and will continue to develop and apply special measures to their quality control. However, they are in no way exempt from standard Federal accounting system certification, management reviews, and internal audit controls. The net effect then is to increase management control of automated systems in comparison to manual systems.
Your statements of possible solutions to software and data problems are logical and constructive. While they are similar to many DoD practices, their documentation in a compact set will assist our system developers, auditors, and operators.
With respect to the recommendations included in the draft report, we interpret GSA's charter in the ADP field to address procurement of ADP equipment, supplies, and services. Your report is aimed at a different arena, that of functional procedures and accounting controls. Accordingly, we recommend that:
1. The subject be proposed as a matter of continuing interest by the Federal Financial Management Improvement Program. The inter-agency effort of senior financial managers is an appropriate forum for exchange of new procedures and techniques.
2. Pertinent and documented studies, research reports, methods and techniques be provided by developing agencies to the National Technical Information Service (NTIS) of the Department of Commerce for dissemination at cost to other potential users, in accordance with the NTIS charter.
3. The report be issued as a study, retaining the findings and conclusions but deleting the recommendations and substituting the following (a brief illustrative sketch follows the quoted language):
"Each Federal Agency -Should review its internal regulations
and procedures for management of ADMA systems. to assure ?
protection of mission effectiveness and government resources
from system errors. Each agency should establish specific
internal procedures to assure that internal controls and
audit trails for error detection and correction are made
a part of system desigm Specificatiens, tested prior to
system implementation, 'and included in routine and
special audits throughout their operational life,"
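For illustration, the audit trail called for in the quoted recommendation might take a form like the following sketch (Python; the field names, file name, and example values are hypothetical and serve only to show the idea of logging every automated decision so it can be audited later):

    import json
    import datetime

    AUDIT_LOG = "adma_audit_trail.jsonl"   # hypothetical destination for the trail

    def record_decision(transaction_id, input_record, decision, program_version):
        """Append one audit-trail entry for an automated decision."""
        entry = {
            "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
            "transaction_id": transaction_id,
            "input": input_record,               # what the program saw
            "decision": decision,                # what it decided (e.g., pay, reject, refer)
            "program_version": program_version,  # which version of the code decided
        }
        with open(AUDIT_LOG, "a") as log:
            log.write(json.dumps(entry) + "\n")
        return entry

    # Example: one automated payment decision is recorded before the check is issued.
    record_decision("1976-000123", {"amount": 210.0, "payee": "A1"},
                    decision="pay", program_version="1.4")

Because each entry names the inputs and the program version, routine and special audits can later reconstruct why a given payment was made and trace an error back to the code or data that caused it.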
Thank you for an informative and valuable research effort. The opportunity to comment on the draft report is appreciated.
Sincerely,
APPENDIX V
INTERNAL AUDIT REPORTS ON
AUTOMATED DECISIONMAKING APPLICATIONS
Title of report
Army Audit Agency:
  Coordinated Audits of Depots (Maintenance Operations)
  U.S. Army Training Center, Infantry and Fort Polk
  Direct Support System
  Materiel Obligation Validation Procedures
  Catalog Function
Naval Audit Service:
  Servicewide Audit of the Aeronautical Repairable Components Program
  Headquarters, Pacific Missile Range, Point Mugu, California
  Navy Aviation Supply Office, Philadelphia, Pennsylvania
Date / Type of application involved
  3/4/74    Maintenance workload acceptance
  12/21/73  Requisitioning
  10/16/73
  2/8/74
  8/21/73
  12/6/73
  Requisition processing
  Procurement cancellation
  Automated procurement and requisition processing
  Overhaul scheduling
  11/1/73   Requisitioning
Problem identified: Software / Data
  X
  10/16/72  Redistribution  X
  X
  X
  X
Title of report
Naval Audit Service (continued):
  Aviation Supply Office, Philadelphia, Pennsylvania
  Navy Aviation Supply Office, Philadelphia, Pennsylvania
Date / Type of application involved
  115/73    Requisition processing and redistribution
  12/10/74  Overhaul scheduling and redistribution
Auditor General, Defense Supply Agency:
  Physical Inventory Procedures and Practices  11/24/72
  Medical Supply Functions  9/5/73
  Mobilization Reserve Requirements
  Defense Supply Centers
Veterans Administration, Fiscal Audit:
  Audit of On-Job and Apprenticeship Training Awards Processed by OCR
  Processing Dependency Changes from Supplemental Award Code Sheets
Problem identified: Software / Data
  Physical inventory requests  X
  Customer returns, requisition processing and stock attrition  X
  5/18/73   Customer returns  X
  5/8/74    Payments
  9/17/7
  Payments
  X
  X
Title of report / Date / Type of application involved / Problem identified: Software / Data
Veterans Administration, Fiscal Audit (continued):
  Processing Awards after Entitlement is Exhausted  8/2/73  Payments  X
  Nonrecovery of Accounts Receivable from Resumed BCL Account Payments
  Retroactive Payment Adjustments
  Updating Accounts Receivable Deduction Amount from Amended Awards
  Duplicate Chapter 34 Education Payments
Interior, Office of Survey and Review, Audit Operations:
  Review of Contract No. N00C14205253 with the Navajo Tribe, Window Rock, Arizona, Bureau of Indian Affairs
Agriculture, Office of Inspector General:
  Programs Option B Provisions of the 1972 Feed Grain Set-aside Program
Date / Type of application involved
  4/20/73   Payments
  8/31/73   Payments
  X
  4/12/73   Payments
  X
  3/2/74    Payments
  10/29/73  Payments
  10/25/73  Payments
  X
Title of report / Date / Type of application involved
Agriculture, Office of Inspector General (continued):
  Loading Order Issuance Processing and Settlement  8/ /73  Loading order settlement
Agriculture, Office of Audit:
  Automated Accounting Service  2/15/74  Payments and billings
Problem identified: Software / Data
  X
  X X
HEW Audit Agency:
  Administrative Costs Incurred and Benefit Payments Made Under the Health Insurance for the Aged Act  1/9/74  Payments  X
  Administrative Costs Incurred and Benefit Payments Made Under the Health Insurance for the Aged Program  12/28/73  Payments  X
  Administrative Costs Proposed and Operations Relating to Benefit Payments Under Medicare  6/18/74  Payments  X
  Administrative Costs Claimed and Benefit Payments Made Under the Health Insurance for the Aged Program  5/2 /74  Payments  X
Title of report / Date / Type of application involved / Problem identified: Software / Data
HEW Audit Agency (continued):
  Administrative Costs Claimed and Supplementary Medical Insurance Benefit Payments Made Under Health Insurance for the Aged Program  11/9/73  Payments  X
  Administrative Costs and Benefit Payments Under the Health Insurance for the Aged Program  11/12/73  Payments  X
  Administrative Costs Claimed and Benefit Payments Made Under the Health Insurance for the Aged Program  5/ /73  Payments
  Administrative Costs and Benefit Payments Made Under the Health Insurance for the Aged Act  4/12/74  Payments