computers and people formerly Computers and Automation

June, 1976
Vol. 25, No. 6

Electronic Funds Transfer Systems and the Consumer - Paul Armer
The Relevance of Mathematics - Felix E. Browder
Computerized Adaptive Ability Measurement - Part 1 - David J. Weiss
A Skeptical View of Structured Programming - Part 2 - Tom Gilb
PTERODACTYL by Barbara Dwyer

The Notebook on COMMON SENSE, ELEMENTARY AND ADVANCED
is devoted to development, exposition, and illustration of what may be the most important of all fields of knowledge:
WHAT IS GENERALLY TRUE AND IMPORTANT

[Diagram: WISDOM, JUDGEMENT, AND MATURITY + SCIENCE IN GENERAL + TECHNIQUES FOR SOLVING PROBLEMS + TECHNIQUES FOR AVOIDING MISTAKES + SOME PARTS OF OPERATIONS RESEARCH + SOME PARTS OF SYSTEMS ANALYSIS + AVOIDANCE OF LOGICAL FALLACIES]

Topic: THE SYSTEMATIC PREVENTION OF MISTAKES
Topic: SYSTEMATIC EXAMINATION OF GENERAL CONCEPTS

PURPOSES:
to prevent mistakes before they happen
to display new paths around old obstacles
to point out new solutions to old problems
to stimulate your resourcefulness
to increase your accomplishments
to improve your capacities
to help you avoid pitfalls
to help you solve problems
to give you more tools to think with

Preventing Mistakes from -
  Already Published: Failure to Understand, Forgetting, Unforeseen Hazards, Placidity, Camouflage and Deception, Laxity, Bias and Prejudice, Ignorance
  To Come: Interpretation, Distraction, Gullibility, Failure to Observe, Failure to Inspect

The Concept of -
  Already Published: Expert, Rationalizing, Feedback, Model, Black Box, Evolution, Niche, Understanding, Idea, Abstraction
  To Come: Strategy, Teachable Moment, Indeterminacy, System, Operational Definition

REASONS TO BE INTERESTED IN THE FIELD OF COMMON SENSE, WISDOM, AND GENERAL SCIENCE

COMPUTERS are important -
But the computer field is over 25 years old. Here is a new field where you can get in on the ground floor to make your mark.

MATHEMATICS is important -
But this field is more important than mathematics, because common sense, wisdom, and general science have more applications.

LOGIC is important -
But this field is more important than logic, because common sense plus wisdom plus science in general is much broader than logic.

WISDOM is important -
This field can be reasonably called "the engineering of wisdom".

COMMON SENSE is important -
This field includes the systematic study and development of common sense.

SCIENCE is important -
This field includes what is common to all the sciences, what is generally true and important in the sciences.

MISTAKES are costly and to be AVOIDED -
This field includes the systematic study of the prevention of mistakes.

MONEY is important -
The systematic prevention of mistakes in your organization might save 10 to 20% of its expenses per year.

- - - - - - - - - - (may be copied on any piece of paper) - - - - - - - - - -

To: Berkeley Enterprises, Inc.
    815 Washington St., Newtonville, MA 02160

( ) Yes, please enter my subscription to The Notebook on Common Sense, Elementary and Advanced at $12 a year (24 issues), plus extras. I understand that you always begin at the beginning and so I shall not miss any issues.

( ) Please send me as free premiums for subscribing:
    1. Right Answers - A Short Guide to Obtaining Them
    2. The Empty Column
    3. The Golden Trumpets of Yap Yap
    4. Strategy in Chess
    5. The Barrels and the Elephant
    6. The Argument of the Beard

( ) I enclose $________    ( ) Please bill my organization

RETURNABLE IN 7 DAYS FOR FULL REFUND IF NOT SATISFACTORY - HOW CAN YOU LOSE?

Title ____________________
Organization ____________________
Address (including zip) ____________________
Signature ____________________    Purchase Order No. ________


WOULD YOU LIKE TO HAVE SYSTEMS ANALYSIS AND COMPUTER PROGRAMMING USING PLAIN ORDINARY NATURAL LANGUAGE - AND THEN HAVE A COBOL PROGRAM PRODUCED AUTOMATICALLY 60 TIMES AS FAST AS NOW?

We can, for a class of problems in business*:
1. Take in:
   - systems analysis expressed in ordinary natural language
   - procedures and rules expressed in ordinary natural language
   - worked examples of calculations written on paper the way clerks write them out
   - sample report forms typed the way good typists type them
2. Feed all this into a computer
3. Put out automatically a program in COBOL (or FORTRAN or Business BASIC, etc.)
- all this at a cost of less than half your existing cost.

Would you like evidence?
Would you like a demonstration?
Write us on your letterhead for more information.
Berkeley Enterprises, Inc. Attention: Steve Emmerich
815 Washington St. Newtonville, Mass. 02160
(*Examples of a class of problems in business are: invoicing, inventory control, order entry, payroll, accounts receivable, moving average, etc.)


Vol. 25, No. 6    June, 1976

Editor and Publisher: Edmund C. Berkeley
Assistant to the Publisher: Judith P. Callahan
Assistant Editors: Barbara J. Wohner, Neil D. Macdonald, K. Kaufmann
Art Editor: Grace C. Hertlein
Software Editor: Stewart B. Nelson
Contributing Editors: John Bennett, John W. Carr III, Grace C. Hertlein, Linda Ladd Lovett, Ted Schoeters, Richard E. Sprague, Edward A. Tomeski
London Correspondent: Thomas Land
Advisory Committee: Ed Burnett, James J. Cryan
Editorial Offices: Berkeley Enterprises, Inc., 815 Washington St., Newtonville, MA 02160, 617-332-5453

Advertising Contact: The Publisher, Berkeley Enterprises, Inc., 815 Washington St., Newtonville, MA 02160, 617-332-5453

"Computers and People", formerly "Computers and Automation", is published monthly, 12 issues per year, at 815 Washington St., Newtonville, MA 02160 by Berkeley Enterprises, Inc. Printed in U.S.A. Second Class Postage paid at Boston, MA, and additional mailing points.
Subscription rates: United States, $11.50 for one year, $22.00 for two years. Canada: add $1 a year; elsewhere, add $6 a year.
NOTE: The above rates do not include our publication "The Computer Directory and Buyers' Guide". If you elect to receive "The Computer Directory and Buyers' Guide", please add $12.00 per year to your subscription rate in U.S. and Canada, and $15.00 elsewhere.
NOTE: No organization in Switzerland or Morocco is authorized or permitted by us to solicit subscriptions for or receive payments for "Computers and People" or "The Computer Directory and Buyers' Guide". Such subscriptions and payments should be sent directly to us.
Please address mail to: Berkeley Enterprises, Inc., 815 Washington St., Newtonville, MA 02160
Postmaster: Please send all forms 3579 to Berkeley Enterprises, Inc., 815 Washington St., Newtonville, MA 02160
©Copyright 1976, by Berkeley Enterprises, Inc.
Change of address: If your address changes, please send us both your new address and your old address (as it appears on the magazine address imprint), and allow three weeks for the change to be made.

computers and people formerly Computers and Automation

Computers in Society

 8  Electronic Funds Transfer Systems and the Consumer  [A]
    by Paul Armer, Center for Advanced Study in the Behavioral Sciences, Stanford, CA
    What are the disadvantages and problems of the Electronic Funds Transfer Systems for the consumer? Because of them, should we slow down the pace at which Electronic Funds Transfer Systems are being implemented?

 7  The Universal Product Code and the Pursuit of Truth  [F]
    by David M. Carlson, V.P., Allied Supermarkets, Livonia, MI

 7  Do Supermarkets Manipulate Profit Statistics for Their Benefit?  [F]
    by Thomas V. Sobczak, Waldes Kohinoor, Inc., Long Island City, NY

Education

10  The Relevance of Mathematics  [A]
    by Felix E. Browder, University of Chicago, Chicago, IL
    Why is mathematics mysterious to the nonspecialist? Aside from its use in practical, technical, and research disciplines, mathematics has been identified by philosophers, like Plato and Alfred North Whitehead, with the ultimate and transcendent form of all human knowledge.

 7  "The Valor of Ignorance": Required Reading for All Thinking People  [F]
    by Richard C. Herlihy, Melrose, MA

Computer Programming

 6  Ambiguity and Computers  [E]
    by Edmund C. Berkeley, Editor
    Can a computer program deal with ambiguity in translating from one natural language to another? Although a computer must deal with unambiguous meanings, interactive communication with human beings opens many lines of attack to the problem of ambiguity in programming.

20  A Skeptical View of Structured Programming and Some Alternatives - Part 2 (Conclusion)  [A]
    by Tom Gilb, Kolbotn, Norway
    Although programmers who use structured programming like it, and it is favored by the academic computer science community, is it really a good procedure? It has never been fully compared with alternative techniques. There exist many alternative and supplementary techniques to structured programming, which might be able to correct random program errors within two seconds.

 7  "How Do Those Crazy Rumors Get Started?"  [F]
    by Thomas E. Kurtz, Dartmouth College, Hanover, NH


The magazine of the design, applications, and implications of information processing systems - and the pursuit of truth in input, output, and processing, for the benefit of people.

Computer Applications

14  Computerized Adaptive Ability Measurement  [A]
    by Dr. David J. Weiss, University of Minnesota, Minneapolis, MN
    During World War I, the paper-and-pencil, multiple-choice, group-administered ability test was developed to classify large numbers of people quickly. Although it was efficient, it was not adapted to the individual testee's ability. A computerized adaptive test combines the efficiency of group-administered tests and the accuracy of an individually administered, adaptive test.

23  New York Times to Install the World's Largest Electronic Newsroom  [N]
    by Fred Baker, Harris Corporation, Cleveland, OH

25  Computer Model of Mars Will Interpret Data from Viking Spacecraft  [N]
    by Charles H. Ball, Mass. Inst. of Technology, Cambridge, MA

26  Computer Helps Resolve Mexican-U.S. Water Problems  [N]
    by R. W. Sheehy, Control Data Corp., Rockville, MD

Computers and Crime Fighting

23  Police Department Uses Computerized System to Fight Crime  [N]
    by Joe Francis, The Boeing Company, Wichita, KS

25  FBI Will Use Automatic "Matcher" for Fingerprint Identification  [N]
    by Ralph Wallenhorst, Calspan Corp., Buffalo, NY

Computers, Puzzles and Games

27  Games and Puzzles for Nimble Minds - and Computers  [C]
    by Neil Macdonald, Assistant Editor
    MAXIMDIJ - Guessing a maxim expressed in digits
    NAYMANDIJ - A systematic pattern among randomness?
    NUMBLES - Deciphering unknown digits from arithmetical relations

Announcements

 2  The Notebook on COMMON SENSE, ELEMENTARY AND ADVANCED  [R]

 3  RIDE THE EAST WIND: Parables of Yesterday and Today  [R]

 5  14th Annual Computer Art Exposition  [R]
    A special feature of the August 1976 issue of Computers and People.

19  14th Annual Computer Art Exposition  [R]
    Guidelines for entry.

26  101 MAXIMDIJES  [R]
    A new Berkeley Enterprises, Inc. publication

28  COMPUTER GRAPHICS and ART  [R]
    A new quarterly magazine

Front Cover Picture
The front cover computer artwork, "Pterodactyl," was submitted to us by Barbara Dwyer for the 13th Annual Computer Art Exposition.
IF YOU ARE INTERESTED IN COMPUTER ART,
WE INVITE YOUR ENTRIES IN OUR
14TH ANNUAL COMPUTER ART EXPOSITION
A special feature of the August 1976 issue of Computers and People. One of the entries we receive will be selected to appear on the cover of our August issue. More entries will be published inside. Other entries will be published later. See our announcement and guidelines for entry on page 19 of this issue.

Key
[A] Article    [C] Monthly Column    [E] Editorial    [F] Forum    [N] Newsletter    [R] Reference

NOTICE
*D ON YOUR ADDRESS IMPRINT MEANS THAT YOUR SUBSCRIPTION INCLUDES THE COMPUTER DIRECTORY. *N MEANS THAT YOUR PRESENT SUBSCRIPTION DOES NOT INCLUDE THE COMPUTER DIRECTORY.


EDITORIAL

Ambiguity and Computers

One of the reasons given for failure of a computer program to give adequate translation from one natural language to another is ambiguity. The argument goes something like this:
1. Ordinary natural language is ambiguous.
2. A computer program must deal with unambiguous meanings.
3. Therefore, a computer program cannot deal with ordinary natural language.
A neat example is the following one. The sentence:
"The spirit is willing but the flesh is weak"
was translated by a computer program into Russian, and then translated back by a computer program from Russian into English, producing:
"The wine is agreeable but the meat is spoiled."
But a great many kinds of ambiguity can be dealt with by a computer in the case where it is communicating interactively with a human being. In the same way, a person in a conversation often uses common phrases to deal with ambiguity when he does not understand someone else. Here are some examples:
That's not clear to me.
I did not make out what you said.
What do you mean?
I am having trouble understanding you.
That doesn't click in my mind.
What is an . . . . . (as in "What is an aardvark?")
You're over my head.
Can you say that in other words?
I don't understand you.
I don't see that.
Not one of these phrases offers any serious problem to the person spoken to. He or she can promptly say something again, or say it more slowly, or say it in different words, or ask some questions to clarify which stages of meaning have been grasped and which stages of meaning have not been grasped, and proceed accordingly. Such a strategy is entirely open to use by an adequately designed computer program interacting with a human being. Since ambiguity is a common problem in a vast set of human conversations, it is only to be expected when a computer program and a human being are conversing.
A computer in interactive mode can ask questions, and the human being can reply to them. So when a computer program has tentatively determined the meaning of statements in a conversation, it can offer the meaning to the human being, and ask for the human being's "yes" or "no," or even a much wider choice of answers.
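Here, as an aside for readers who program, is a minimal sketch in Python of the confirm-or-ask-back strategy just described. The function, the candidate readings, and the prompts are illustrative assumptions for this note only; they are not part of any actual translation program discussed in this editorial.

    # Illustrative sketch: offer each tentative reading to the human and ask for
    # "yes" or "no"; if none is confirmed, ask the human to restate the sentence,
    # just as a person would do in conversation.

    def resolve_ambiguity(sentence, candidate_meanings):
        print(f'You said: "{sentence}"')
        for meaning in candidate_meanings:
            reply = input(f'Did you mean: "{meaning}"? (yes/no) ').strip().lower()
            if reply.startswith("y"):
                return meaning
        return input("Can you say that in other words? ")

    # Example, using the sentence quoted above and two hand-written readings:
    # resolve_ambiguity("The spirit is willing but the flesh is weak",
    #                   ["human resolve is strong but the body is frail",
    #                    "the wine is agreeable but the meat is spoiled"])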
In computer-assisted instruction, for example, the computer goes farther still and can regularly ask many kinds of questions of the human being and deal with the answers:
Arithmetic: What is 6 times 9? Answer: 54
History: Who was a famous Carthaginian general who crossed the Alps to invade Rome? Answer: Hannibal.
Spelling: If any of the following words is misspelled, type the correct spelling: THERMOPILAE, THIRATRON, THERMOPLASTICK, THREATNING
Vocabulary: Improve the following sentence by substituting a better word or phrase for "thing": John went to the amusement park and found a misbehaving thing. He put six nickels into the thing; for each nickel he got twice the usual amount of candy. But the last nickel stuck inside. So he kicked the thing as hard as he could, and out came his nickel and four pieces of candy!
Ambiguity is not all of the problem of translation from one natural language to another, of course. But the subproblem of ambiguity is open to many lines of attack, and it is not true that a computer program cannot deal with the problem of ambiguity.
Edmund C. Berkeley Editor


MULTI-ACCESS FORUM

THE UNIVERSAL PRODUCT CODE AND THE PURSUIT OF TRUTH
From: David M. Carlson, V.P., Allied Supermarkets, 12425 Merriman Road, Livonia, MI 48150
Your masthead includes the phrase "and the pursuit of truth in input, output and processing, for the benefit of people."
The Sobczak article on UPC ["The Universal Product Code: An Introduction to What It Means for Consumers," Computers and People, December 1975, p. 7] contains the introductory quote, "With the Universal Product Code, you pay more and know less - while supermarket profits change from 1% to 35%." This is such an incredible falsehood that I suggest you suspend your "pursuit of truth" for a while and concentrate on just being honest. The facts are that supermarket profits are less than 1% of sales and product costs are about 80% of sales. How can one make a profit of 35% of sales when 80% goes to pay for the product itself?
Unfortunately, the rest of the article is also so riddled with inaccuracies and absurd conclusions that the article is irreparable.
It is crystal clear that you and Sobczak either have no concept whatsoever of basic food retailing economics or that you are purposefully distorting the facts. Either way, it is totally inexcusable and I am embarrassed for you both.
DO SUPERMARKETS MANIPULATE PROFIT STATISTICS FOR THEIR BENEFIT?
From: Thomas V. Sobczak, Waldes Kohinoor, Inc., 47-16 Austel Place, Long Island City, NY 11101
I apologize to Dr. Carlson if the numbers are quoted incorrectly for his organization. They were quoted from the UPC Hearing held by Nassau County. They have also appeared in data published by the Retail Clerks International Association and the Consumer Federation of America as derived from McKinsey and Neilsen Projections.
I suggest that Dr. Carlson refer to the Washington Post's Tuesday, February 4, 1975 article "Food Chain's Method of Figuring Profits is Disputed" (p. A7). Additionally, Senator William Proxmire, when chairman of the Congress's Joint Economic Committee, contended that supermarkets manipulate profit statistics for their benefit.
The data I presented was the best available to me on my subject. I used it as carefully as I could. I suggest that in free forum he state his objections rather than project innuendo. We are all interested in the pursuit of truth.
"HOW DO THOSE CRAZY RUMORS GET STARTED?"
From the Editor to Thomas E. Kurtz
Somebody said to me a while ago that Dartmouth College had stopped using BASIC and had started to use APL in place of it. Is this true?

From: Thomas E. Kurtz, Director, Kiewit Center, Dartmouth College, Hanover, NH 03755

The reports of BASIC's demise have been greatly exaggerated. For January 1976, the number of runs of various languages in foreground (background or batch not included) was as follows:

BASIC                    137,505  (88%)
ALGOL                      1,862
FORTRAN                    2,457
LISP                         181
APL                          427
COBOL                        721
SNOBOL                         8
PL/I (syntax checker)         57
CPS                        1,404
DXPL                       9,302
GMAP                       1,165
DYNAMO                     1,068

As you can see, we tolerate almost any language, including APL, which is more than APL'ers will do. How do these crazy rumors get started?
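As a quick arithmetic check of the table above, the counts printed there (paired with the languages in the order shown) do give BASIC roughly an 88 per cent share of the foreground runs; a minimal Python sketch of the check:

    # Check BASIC's share of the January 1976 foreground runs, using the counts
    # printed in the table above (names are paired with counts in the printed order).
    runs = {
        "BASIC": 137505, "ALGOL": 1862, "FORTRAN": 2457, "LISP": 181,
        "APL": 427, "COBOL": 721, "SNOBOL": 8, "PL/I (syntax checker)": 57,
        "CPS": 1404, "DXPL": 9302, "GMAP": 1165, "DYNAMO": 1068,
    }
    total = sum(runs.values())
    print(f"BASIC share of foreground runs: {runs['BASIC'] / total:.0%}")   # -> 88%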

"THE VALOR OF IGNORANCE": REQUIRED READING FOR ALL THINKING PEOPLE
From: Richard C. Herlihy, 60 Nowell Road, Melrose, MA 02176

Regarding your editorial "Military Deterrence in History" (Computers and People, June 1975), I would like to call your attention to Mr. Homer Lea, who stated that war is inevitable and the determination of combatants is a pure science in the true sense. He authored two books between 1910 and 1916: "The Valor of Ignorance," in which he specifically details how Japan can defeat the United States in a war that is inevitable, and "The Day of the Saxon," where he details the decline of England to a second-rate power.

"The Valor of Ign orance" was republished in the

early 1940s and has a short pref ace that should be

required reading for all thinking people.


Electronic Funds Transfer Systems and the Consumer

Paul Armer
Center for Advanced Study in the Behavioral Sciences
Stanford, CA 94305

(Reprinted with permission from "Proceedings of the NSF Software Auditing Workshop," CONF 760-116, Lawrence Livermore Laboratory, 1976.)

"An EFTS would make it technologically and economically feasible to abuse civil liberties on a grand scale. ... any instance in which a great deal of information about an individual is concentrated in one place represents a threat to his privacy."

There are many pushers of Electronic Funds Transfer Systems (EFTS) these days who are eloquently proclaiming its advantages, but there has been little discussion of the disadvantages and problems. Consequently, in this article, I will focus on the latter, from the standpoint of the consumer. As a result, this will be a biased presentation. I should further point out that I am aware that everything we do has dangers and disadvantages, as well as opportunities and advantages, associated with it. We bring electric power into our houses despite the fact that it can start fires and cause injury or death via electric shock. The purpose of this article is to stimulate thought and discussion of the safety and desirability of EFTS by pointing out some of its dangers, problems, and disadvantages.
EFTS is not a single entity but a class of related practices and technologies. The phrase "cashless/checkless society" has been around for decades. The initials EFTS are a bit newer - I believe I first heard them about 1970. The American Bankers Association's history of EFTS has 1970 for its first significant date. The report of the association's Monetary and Payments Systems (MAPS) Planning Committee was published that year.
Three Major Thrusts
There are three major thrusts in EFTS. The first is the Automated Clearing House (ACH); a number of these are already in operation. The idea here encompasses such things as direct payroll deposits. The Social Security Administration or a large firm delivers a magnetic tape to the ACH with the payroll information on it, including the bank and account number of each payee. An ACH may also encompass preauthorized payments like insurance premiums or utility bills. From a data processing standpoint, although an ACH is electronic, it is batch, off-line, and has a low time criticality.
A second thrust is the Automated Teller Machine (ATM). The ATM, which might be at the local airport or supermarket, permits you to deal with your bank at any time of the day. At the present time, no bank operates in more than one state, and in some states, banks can have only one branch. When the Comptroller of the Currency ruled some months ago that an ATM did not constitute a branch, some of the small banks went right through the roof. What was to keep Bank of America or Citibank from installing ATM's any place they wished? A lawsuit ensued, and the courts decided that an ATM was indeed a branch. The decision has been appealed to a higher court.
On the other hand, thrift institutions, not being subject to federal and state laws which restrict branch banking, have, in several instances involving a number of installations, installed terminals in places like supermarkets. Some are ATM's; others are true point-of-sale (POS) terminals, which are the third thrust of EFTS. With a point-of-sale terminal, when you check out of a hotel or buy something at a store, the money is instantaneously debited to your account and credited to the account of the hotel or of the store. In other words, the funds are transferred electronically in real time. Such systems are sometimes referred to as debit card systems. There is no reason that a point-of-sale system could not utilize both credit cards and debit cards.
90 Per Cent of All Transactions Involve Cash
Let me try to give you a little context. First, money. In 1972, there were roughly $60 billion in cash in circulation, about 11 per cent of that in coins. After subtracting the cash held by banks, businesses, and foreign owners, about $200 is left per capita. If you realize that this statistic means that a family of four would have on hand an average of $800 in cash, you get the notion that the distribution must be incredibly skewed. You might also suspect that cash is probably used to hide a number of questionable transactions of considerable size. The satchels full of $100 bills with which the Committee to Reelect the President dealt tend to affirm such suspicions. Experts guess that there are about 220 billion transactions in cash per year; about 75 per cent being for less than $1, and only 5 per cent exceeding $10.
Checks are involved in transactions only about 1/8 as often as cash. The U.S. Treasury writes about 80 million checks per month - almost half are for social security. Checks are seldom written for less than $1; in fact 90 per cent are for greater than $10. And 1 per cent are for more than $10,000, but account for 80 per cent of the dollar value. It costs roughly twenty cents to process a check, making a rounded total cost of our demand deposit accounting (ignoring preparation costs) of the order of $6


billion. Despite all our fine machines, I'm told that 60 per cent of that cost is labor. The number of checks has been growing at about 7 per cent per year. So, if the trend continues, the number per year might be expected to more than double by 1984 to something like 65 billion per year.
Credit card transactions are about 1/5 as frequent as check transactions, though in recent years they've been growing at about 35 per cent per year. Only 60 per cent of the credit card transactions are for more than $10. There are some 400 million cards in existence in the United States. At the present time, a crude estimate of the processing cost of a credit card transaction is between thirteen cents and sixty cents. If we multiply an assumed average cost of thirty cents by roughly 5 billion transactions, the system's cost is about $1.5 billion.
Let me summarize. About 90 per cent of all transactions involve cash, but of the transactions over $10, only about one in four is for cash.*
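For readers who want to see how these rough magnitudes fit together, here is a minimal arithmetic sketch in Python. Every figure is simply one of the article's own approximations quoted above (the 30-cent card cost is the assumed average of the quoted 13-to-60-cent range); nothing here is new data.

    # Rough reproduction of the article's payment-volume estimates.
    cash_txns = 220e9           # "about 220 billion transactions in cash per year"
    check_txns = cash_txns / 8  # checks occur "about 1/8 as often as cash"
    card_txns = 5e9             # "roughly 5 billion" credit card transactions

    total = cash_txns + check_txns + card_txns
    print(f"Cash share of all transactions: {cash_txns / total:.0%}")
    # -> about 87%, which the article rounds to "about 90 per cent"

    print(f"Check processing cost: ${check_txns * 0.20 / 1e9:.1f} billion")
    # -> about $5.5 billion, i.e., "of the order of $6 billion"

    print(f"Credit card system cost: ${card_txns * 0.30 / 1e9:.1f} billion")
    # -> about $1.5 billion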
Changing Financial Structure
Who are the players in the EFTS game? What are their motivations? In 1961, a presidential commission on money and credit recommended that the structure of financial institutions in our country be changed. Recently both the legislative and executive branches of government have been discussing how this should be carried out. Not only would the rules about which service each type of financial institution could offer be changed, but also the authority of the various regulatory agencies would be altered. Thus, the new services of EFTS, made possible by rapid advances in computer technology, arrive at a time when the very structure of the financial community is changing.
I believe that almost all the players believe that when the dust from EFTS settles, just which institutions remain in the financial game and what share of the market each has may bear little resemblance to the situation which exists today. Furthermore, it is not just one commercial bank competing with another commercial bank. Besides the commercial banks, there are the savings and loans and other thrift institutions, the credit unions and stores, particularly the large chains. We don't tend to think of stores as being members of the financial industry, but they have a real stake in the financial game. For example, I'm told that 1/3 of Sears's profits come from the interest they collect on various types of loans to their customers.
Regulatory Agencies Fight for Their Own Ends
The three major regulatory agencies - the Federal Reserve Board for the banks, the Federal Home Loan Bank Board for the thrift institutions, and the National Credit Union Administration - are all fighting for their constituencies and for their own ends. For example, the Federal Reserve Board (FRB) appears to be quite aggressive. Why? For one thing, every organization strives not only to survive but to grow and do a good job. One of the FRB's responsibilities is to regulate the money supply. To do so, they need control over the banks. But banks have been defecting from the Federal Reserve System. Since 1947, the proportion of total deposits under the direct control of the FRB has dropped from 86 per cent to 77 per cent, and the pace of dropouts has been increasing. If the FRB operated a major EFTS network and could offer better service or lower prices to members of the Federal Reserve System, they might recapture some of the defectors.
* Most of these statistics are taken from "The Consequences of Electronic Funds Transfer," prepared for the National Science Foundation by A. D. Little, Inc., U.S. Government Printing Office, Stock Number 038000-00249-0, June 1975.
The Federal Reserve System has been clearing large checks electronically since 1915, using telegraph at first, but now telephone facilities. In January of this year, the FRB proposed changes to Regulation J, pertaining to the clearing of checks, which spell out the manner in which the Federal Reserve System would operate an EFTS. Some have said that this is like the Civil Aeronautics Board operating an airline. There are some aspects of that analogy which don't compare very well, but it does raise some interesting issues.
Among the important players in this game are the credit card associations. There are two major ones - National Bankamericard, Inc. (NBI) and Interbank/Mastercharge. NBI has in place a nationwide, online credit verification system and another system to handle interbank transfers electronically. Banks are charged 2 1/2 cents a transaction for such transfers. Mastercharge is right behind them. Both are busily designing systems which will electronically connect point-of-sale terminals and automated teller machines to your bank's computer.
Of course, the equipment vendors must be included in any enumeration of the EFTS players. Their role is pretty clear.
Grand Scale Abuse of Civil Liberties
The final player is the consumer. What are the potential dangers and disadvantages to the consumer?
An EFTS would make it technologically and economically feasible to abuse civil liberties on a grand scale. For example, I believe that EFTS represents a major threat to privacy. I believe this because I believe that any instance in which a great deal of information about an individual is concentrated in one place represents a threat to his privacy. In our existing systems, privacy is assured under all but the most unusual circumstances by the sheer cost and inconvenience of a search. EFTS would concentrate an enormous amount of information about an individual in one place where it would be machine accessible and easy to process. It would be possible to profile an individual in considerable detail - his consumption habits, for example. It would be possible to sort payment information by payee, rather than payor, and thus assemble a list of contributors to a given cause.
It would be possible, if a large fraction of financial transactions go through the EFTS via POS and other terminals, to not only know what an individual is buying but where he is at the time. Furthermore, the location information is available in real time. Several years ago, I was a member of a team which was given the assignment of assuming that we were data processing advisors to the head of the Russian Secret Police (the KGB) and then designing a system for maintaining surveillance of all Soviet citizens and foreigners within the USSR. After some study we decided that the easiest and cheapest way to do it was to install a real time electronic funds transfer system which would handle all financial transactions.* A system that knows where each individual is would be of great use to would-be tyrants as a surveillance system. Thus, you can't alleviate my
(please turn to page 19)
*The Center for Strategic and International Studies, Georgetown University, October 29-31, 1971.


The Relevance of Mathematics

Felix E. Browder
Department of Mathematics
University of Chicago
Chicago, IL 60637

(Reprinted with permission from the American Mathematical Monthly, vol. 83, no. 4, April 1976.)

" 'Having regard to the immensity of its subject matter, mathematics, even modern mathematics is a science in its babyhood. If civilization continues to advance, in the next two thousand years, the overwhelming novelty in human thought will be the dominance of mathematical understanding.' "

The Mystery of Mathematics
Several years ago, I was asked by one of my colleagues in the Department of Mathematics at the University of Chicago to give a general nonmathematical lecture on mathematics to a nonspecialist audience consisting mainly of undergraduates. For this curiously unorthodox venture into semimathematical rhetoric, I chose the title: "Is mathematics relevant, and if so, to what?"
Even though the title is not facetious, it is mysterious. It is mysterious not because it contains exotic or technical terms far from our common experience of the use of language; rather, it contains two ordinary terms, "mathematics" and "relevant," with which all but a small minority of us are familiar. The mystery lies in their ordinariness and their frequent use in ambiguous and unthoughtful ways. To get a meaningful answer to the question, we must clarify its meaning in a significant way.
Let me begin with the word "relevant." As it has been customarily used in the past few years, relevant refers to a relationship between some institution or mode of action, and a body of values or purposes. The modifying clause, "and if so, to what?" points to the basic vagueness of the customary usage of relevant by asking implicitly: What body of values or purposes? We know that different bodies of values or purposes have been emphasized or pursued within different social or historical contexts, on the basis of different perspectives of the important aspects of the human condition. If we presume an unambiguous meaning for the word "relevance," the way in which the question is posed asks if we have reached a consensus with one another on the nature of the Good. It can also be taken as asking, if we wish, if we are willing to accept definitions of value and purpose arising out of particular aspects of human action as forming a spectrum of diverse human values, which hopefully can be tied together in a coherent view of the human condition.
The "More or Less"
What has all this to do with mathematics, aside from the practice which arose a few years ago of making public demands that all intellectual institutions should prove their relevance? To answer this
question in a convincing way, I shall have to lay the appropriate foundation by clarifying what we mean by the word "mathematics." Before I proceed to this task, let me note an interesting and important historical precedent for any discussion of the relation of mathematics and values.
A little more than 2,300 years ago, a very celebrated lecture was given in Athens on a closely related theme. This was the famous "Lecture on the Good" by Plato, the most influential of the world's philosophers, and it is the only lecture (or set of lectures) by Plato of which there is any objective evidence. Among his listeners was Aristotle, then a student in the Platonic Academy and later a critic of Plato's doctrines. Aristotle put down his testimony on the contents of Plato's lecture (or lectures) in a treatise in three books called On the Good of which no fragment remains, but which was paraphrased in the writings of Aristotle's disciples. "Many attended the lecture under the impression that they would obtain some of the human goods, such as riches, health, power, or above all, a wonderful blissfulness. However, when the exposition began with mathematics, number, geometry, and astronomy and the Thesis 'The class of the Limit taken as One is the Good,' the surprise became general. A part lost interest in the subject, the others criticized him." Plato is reported to have said: "The foundations of all things are the One and Indeterminate Magnitude, or the 'More or less.'" In present day terms, what he seems to have said is that the Archai, the foundations of both the physical and moral orders (which he did not distinguish from one another) are processes by which there is generated the sequence of integers or natural numbers and the continuum. Thus the report by Aristotle on Plato's most fundamental Unwritten Doctrine is that Plato solved the problem of the relationship between the foundations of the Good and of mathematics by identifying the two.
Plato's solution of the problem of value by identifying the Good with mathematics is one that very few of us nowadays would be willing to defend, especially in public. The report on Plato's Unwritten Doctrine by Aristotle and his followers has become a perennial scandal in the history of philosophy, and over the ensuing two thousand years continues to generate scholarly controversy to the present day as to whether such a respectable man as Plato could have held such raffish beliefs. Some of the psychological consequences are apparently so drastic that one extreme wing of the classical fraternity led by


Professor Harold Cherniss of the Institute for Advanced Study in Princeton has tried to get rid of the problem by declaring on principle that Aristotle was a tendentious and misleading reporter on all his philosophical predecessors and contemporaries and therefore his testimony should be ignored. More weight (and the force of the evidence) can be assigned to views like those of W. D. Ross in his book Plato's Theory of Ideas that Plato did indeed put forward the views ascribed to him by Aristotle.
It seems clear that it was in the Academy, which Plato founded in Athens as the prototype of later institutions of higher education and research, that under the stimulus of Plato's emphasis upon the central role of mathematics and the new standards of logical rigor introduced by such logical critics as Zeno of Elea, the basic topics of Greek mathematics (geometry and the natural numbers) were first made the object of a reasoned penetrating logical development from first principles. Plato was himself a great patron of mathematics, but not a mathematician. The Academy, however, sponsored and stimulated the work of some of the greatest mathematicians of its period, such as Theodorus, Theaetetus, and most important of all, Eudoxus who created the basic tools of Greek mathematical astronomy as well as resolving the logical problems of incommensurable magnitudes by creating the analogue of the modern theory of real numbers.
The Platonic Vision of the Cosmos
In one of his late Dialogues, the Timaeus, Plato expressed in writing the concept of a cosmos founded on mathematical principles. This concept was firmly opposed by Aristotle who tended to limit the role of mathematics to the heavenly bodies and, on the terrestrial level, to essentially the counting of individual instances of a phenomenon or type. The grand Platonic vision of the universe organized on mathematical principles had relatively little influence in the classical world or in the Middle Ages, after the death of Plato and his immediate disciples. The Platonists or Neo-Platonists of succeeding ages (Plotinus, Porphyry, Iamblichus, Proclus, and others) had great intellectual influence during the Roman Empire and indeed seem to be the chief fountainhead of every variety of high mysticism which has flourished since their time, but they took over all the mystical elements in Plato's thought and relatively little of his mathematical interest.
It took the great Scientific Revolution of the 16th and 17th centuries (as the great historian of science, Alexandre Koyre, has pointed out) to vindicate Plato's dream and bring it to fruition. The central point of the greatest achievement of the modern scientific world-view was the creation of a completely successful mathematical physics and astronomy by Isaac Newton as the culmination of the beginnings made by Galileo and Kepler to try to justify the Copernican vision. The Newtonian mathematical astronomy was the model of a fulfilled Platonic vision of the cosmos.
Though there are few open believers in the explicit doctrine of Plato that mathematics is identical with the Good, our intellectual world in many of its most active and vital branches continues to be dominated by great Platonic visions of mathematical order. In mathematical physics, the Newtonian cosmos has been replaced by the even more perfected Platonic vision of general relativity theory. Mathematics itself has been consumed over the past hundred years as well as pricked, delighted, and tormented by Cantor's great Platonic vision of the theory of the accomplished infinite, the precise reasoned theory of infinite magnitudes. The main course of modern

physics has moved through the channel of the revolutionary development of quantum mechanics in the late 1920s, yielding a new and precise mathematical formalism whose consequences were inexpressible in terms of classical physical intuitions. One consequence was a fundamental mathematicization of the basic principles of chemistry. In more recent decades, we have seen the development of molecular biology founding the basic machinery of biological inheritance upon the geometry of the DNA molecule and the combinatorics of sequences of amino acids. New cosmological visions of the Platonic order have been formulated for the origin and possibly the vanishing of the whole physical cosmos. On a more speculative note, we have the suggestive Platonic program of the theory of catastrophes formulated by Rene Thom and his disciples for the mathematicization of developmental biology. As a final example, let us note the recent history of linguistics and the movement called structuralism in the humanistic and social disciplines in France which argues for formal mathematical structures underlying all the varied spheres of human action and meaning.
Four Different Meanings for "Mathematics"
We have proceeded in the discussion of the Platonic tradition in Western thought and its relation to the steeping of crucial areas of modern science in their mathematical foundations and visions without further attention to the clarification of what we mean by the word "mathematics." To repair this omission, I shall have to proceed with care and less in the style of Plato than of Aristotle. I believe and propose to show in detail that there are indeed four fundamentally different meanings in basic usage of the term "mathematics." Thus, I propose to speak not about Mathematics simple, but Mathematics I, Mathematics II, Mathematics III, and Mathematics IV.
Mathematics I: Social Utility
Mathematics I refers to the mathematical practice imbedded in the common life of mankind in all civilized societies, and most intensively in the advanced industrial societies of which the U.S. is the leading example. This kind of mathematics includes all the counting, measuring, and calculation which is part of the life process for almost all human beings in our society as well as the systems of calculation and measurement which underlie the organization of every economic system beyond the most primitive stage when money is introduced. In its higher reaches, Mathematics I includes the use of mathematical techniques in such activities as accounting, engineering, and architecture, the collection of statistical data, the counting of votes, not to speak of the tremendous transformation of social practice that has been brought about in recent years by the use of electronic computers. The criterion of relevance or value with respect to Mathematics I is social utility and (even though the concept of social utility is not completely transparent especially in its relation to individual utility) we tend to assign such value in terms of the effectiveness or efficiency of the mathematical techniques or practices involved toward specific goals of social effort. Such effectiveness is often achieved through the use of relatively standardized techniques which are not especially interesting from an intellectual point of view.
Mathematics II: Solving Extramathematical Problems
Mathematics II refers to the use of known mathematical techniques and concepts to formulate and solve problems in other intellectual disciplines.


In terms of day-to-day practice, this is the primary function of mathematics in the physical sciences, and more recently, in the biological and social sciences. The intellectual difficulties which must be resolved and the ingenuity which must be applied are often of a high order of magnitude in these applications, but the standard of relevance within the framework of these applications is the usefulness of the result for the discipline to which it is applied rather than the intrinsic interest and fruitfulness of the processes by which that result is reached. Mathematics II represents an ever expanding area of intellectual activity taking over ever larger portions of many intellectual disciplines, sometimes to the despair of those disciplines' more old-fashioned practitioners. As an example less familiar than some, let me refer to the application of mathematical or statistical techniques in the historical field, known under names like cliometry or prosopography, which proposes to obtain more objective or accurate information about historical trends by a detailed analysis of data concerning life patterns of social or economic groups in a given historical period. Since the relevance of the mathematical activity under Mathematics II is its fruitfulness for the solution of the intellectual problems of the extramathematical discipline to which it is applied, its value is thereby linked to the relevance of that discipline in its own terms.
Mathematics III: A Creative Art
By Mathematics III, I refer to the body of what is usually called mathematical research, to the investigation of the concepts, methods, and problems of the diverse mathematical disciplines. Historically, these have developed over two thousand years from the classical problems of geometry, the theory of numbers, the solution of algebraic equations, the solution of the differential equations of mathematical physics, and the study of the formal methods of mathematical reasoning themselves. Though originating from this basic stock, new and extremely vital mathematical disciplines have emerged, whose power and fruitfulness both in solving the problems of older fields and in generating important insights in newer directions, have made them new generating foci of basic mathematical developments. Four of the most important examples are the theory of groups, analytic functions of one or several complex variables, functional analysis, and algebraic and differential topology. Let me note explicitly that the division between Mathematics III and Mathematics II is not the same as the rough division which is sometimes made between pure and applied mathematics. Much that is done in the best varieties of applied mathematics falls under Mathematics III since the criterion of demarcation is whether the mathematical practitioner is seriously interested in the concepts and methods of the mathematical investigation in their own right or is mainly absorbed in the result of the particular application. Mathematics III denotes what mathematicians themselves refer to as "real mathematics."
The criterion of relevance for Mathematics III for all its components, whether pure or applied, is the intellectual criterion of the effectiveness of the mathematical activities in resolving the unsolved problems of their mathematical subdiscipline, in improving the power and fruitfulness of their concepts and methodological tools, and in clarifying the logical structure of calculations and proofs. As many great mathematicians such as Hadamard and Poincaré have described in detail, the practice of this intellectual discipline on the highest level

takes the form of a creative art which works on the objective material of given problems and concepts by means of inventive jumps and intuitions. These free creations of the mind (to use a phrase of which Einstein was fond) find their judgment in their effectiveness in tangibly transforming the state of the discipline and solving its problems. They are subject to the most stringent analysis and criticism. The practitioners on the highest level judge their path forward, however, not by the prudential analysis of the safest path but by an unarticulated intuitive grasp, sometimes in aesthetic terms, of the undeveloped potentialities of a given state of a mathematical field.
Mathematics IV:
The Ultimate Form of Human Knowledge
So far, so good. But what of Mathematics IV? Indeed, I suspect that many mathematicians of my acquaintance might resent the notion that any category might be put somewhere "above" or "beyond" the concrete practice of mathematical research. Yet, I propose to put forward such a category which I believe essential to a complete description of the nature of mathematics. Mathematics IV differs from the preceding three subdivisions in not being the full-time activity of any sector of the mathematically employed population, and yet represents a major element of the impulse and vitality of the other three categories, and a large part of their unity as well. By Mathematics IV, I refer to the vision of mathematics as the ultimate and transparent form of all human knowledge and practice. We come here to the most radical question about the meaning of the word "mathematics." In classical Greece (and purportedly among the Pythagoreans), the word "mathematics" came into existence with the original meaning, "that which can be taught." From the classical age in Greece through the Renaissance, mathematics came to be identified with the great structure of deductive geometry as embodied in Euclid's Elements and the writings of Archimedes. In the 16th and 17th centuries, it came to be identified with the new analytical methods and the techniques of the differential and integral calculus. From the 17th century on, however, a broader vision of mathematics arose in the minds of such intellectual innovators as Leibniz and Descartes, a vision of mathematics as the total science of intellectual order, as the science of pattern and structure. It was in this form that the vital impulse of the Platonic vision of the world was reborn in its most permanent form.
This new vision of mathematics has had many significant fruits in the centuries that followed. Its most important characteristic has been the ability to detect and analyze significant form in one domain of human experience (often a relatively technical domain in classical mathematics) and then apply the insight so obtained to illuminate apparently unrelated contexts of human thought and action. As a first illustration, consider the development and application of the theory of groups. The basic algebraic concept of symmetry originated in its explicit form as a consequence of the study of the roots of algebraic equations in the work of Lagrange and Galois in the late 18th and early 19th centuries. During the course of the 19th century, group theory became one of the leading themes of mathematical development which was interwoven with many of the other central themes of the developing mathematics of the time: analytic functions of a complex variable, the foundations of geometry, the theory of matrices, and the study of ordinary and partial differential equations. Group theory itself became a central building block and starting point of new mathematical


developments and has remained so till the present day. However, it has also become the fundamental conceptual and formal tool of the mathematical description of the physical world in the 20th century from its earliest uses as the basis of crystallography to its present role as the foundation of the description of the fundamental particles of high energy physics.
As another illustration of a more extreme sort, consider the consequences of the development of mathematical logic and the foundations of mathematics in the first three decades of the 20th century. In the early 1930s, Kurt Gödel, to the astonishment of the mathematical world, proved that the program put forward by the celebrated German mathematician Hilbert for justifying the framework of classical 19th century mathematics by showing its consistency as a formal system could not be carried through. To derive this conclusion, Gödel had to give a precise description of the formal structure of what is meant by a mathematical proof within the restrictions placed by Hilbert. In so doing, he created a theory which shortly afterwards was christened the theory of recursive functions. What is most remarkable as a historical consequence, but was already appreciated at the time by such farsighted mathematicians as Turing, was that the theory thus created was the basic theory of what could be accomplished by machines like the digital computer. Indeed, the theoretical development of the digital computer rested and still rests upon the foundation thus laid by Gödel's work.
"The Importance of Pattern"
This broader concept of mathematics as the science of significant form has furnished the background and world view for the brilliant triumphs of mathematical research in all of its varied concrete forms, but it has also been the ultimate rationale for the belief in the power of mathematical methods in their application to all the varied intellectual disciplines beyond the boundaries of mathematics in its most restricted sense. Within the sweep of this concept, one sees the world as governed by objective laws of form which can be discovered in the last analysis only by the dialectic between the creative insight of the individual or the creative fantasy of the individual discoverer on the one hand and the objective testing of the consequences of the intuitive insight or fantasy on the other. When this dialectic works, and it works surprisingly often and with a surprising consistency, one obtains the experience of a concept of an order that enhances rather than stifles individual creativity and spontaneity. It is this experience that seems to have lain behind the original vision of Plato, and it might well be that in this kind of union of order with spontaneity and creativity, there resides an educational value of mathematics for those who will not be mathematicians that goes considerably beyond the technical utility of mathematics.
It was a thought of this sort that was expressed by the Anglo-American philosopher Alfred North Whitehead who in one of his last essays put forward his own rewriting of Plato's lecture on the Good in an intellectual credo entitled "Mathematics and the Good" written for the volume dedicated to him in the Library of Living Philosophers. Whitehead wrote:
[Beginning of Quotation]
The notion of the importance of pattern is as old as civilization. Every art is founded on the study of pattern. The cohesion of social systems depends on

the maintenance of patterns of behavior, and advances in civilization depend on the fortunate modification of such behavior patterns. Thus the infusion of patterns into natural occurrences and the stability of such patterns, and the modification of such patterns is the necessary condition for the realization of the Good. Mathematics is the most powerful technique for the understanding of pattern, and the analysis of the relation of patterns. Here we reach the fundamental justification for the topic of Plato's lecture. Having regard to the immensity of its subject matter, mathematics, even modern mathematics is a science in its babyhood. If civilization continues to advance, in the next two thousand years, the overwhelming novelty in human thought will be the dominance of mathematical understanding.
[End of Quotation]
It is this new and sophisticated version of the Platonic vision expressed in Whitehead's words that I have called Mathematics IV.
The Relation of Mathematics and Value
All the four modes of existence of mathematics as I have described them above are of great antiquity. Certainly Mathematics I and II both clearly existed in Babylonian civilization where numerical calculation and algebraic manipulation reached remarkable levels of proficiency while they were applied to both mercantile transactions and the religiously-oriented charting of the heavens. In ancient Greece, as already remarked, Mathematics III and IV appeared in a full-fledged mature form. By their nature, the four forms of mathematical activity and consciousness are mutually autonomous since in any realistic sense, no one of the four can absorb any of the others. All such attempts at absorption usually represent an effort to destroy the influence or activity of whatever aspect of mathematics is supposed to be absorbed. On the other hand, when all the aspects of the mathematical enterprise are flourishing, their mutual interaction is usually intensive. In modern times, they have only flourished together. In terms of social influence, mathematics as a tool in every-day life and in the technical aspects of social life tends to stimulate the application of mathematics as a tool in the very varied intellectual disciplines. In turn, the latter tends to stimulate the development of mathematical research and the success of research in generating new concepts and principles yields the transcendental ideal of mathematical knowledge. In terms of intellectual influence, the pattern of forces usually runs in the opposite direction. The transcendent ideal of mathematical knowledge gives meaning and force to the impulse of mathematical research, which in turn yields new tools and impulses for the application of mathematics, first in other intellectual domains and through their successes, in the practical life of mankind.
What can we say on the basis of this picture about the relation of mathematics and value in our contemporary historical context and in our own society? The complete answer is both exhilarating and alarming. On the one hand, we see the tremendous intellectual vitality of mathematics and the mathematicized sciences in the past several decades, their tremendous success in solving their own problems and creating new tools for the solutions of both technical and practical problems. On the other hand, while the mathematicization of society goes on in leaps and bounds, as far as the every-day consciousness of the mass of human beings in our society is
(please turn to page 26)


Computerized Adaptive Ability Measurement - Part 1

Dr. David J. Weiss Professor of Psychology University of Minnesota Minneapolis, MN 55414

"Our research experience has indicated that characteristics of the computer system itself may become an important component of the testing environment, particularly as it has an impact on the testee's psychological state during testing."

The paper-and-pencil, multiple-choice ability test developed during World War I has been widely used. During the last half century, millions of men and women have been classified, assigned, trained, and promoted in many careers and assignments, based on scores from group-administered ability tests similar to the Army Alpha Examination used in World War I. Also during that half century, these tests have been constantly improved through a number of advances in the field of psychological measurement.
Group-Administered Ability Test
The group-administered ability test was a compromise which grew out of the necessity to classify large groups of men to mobilize our personnel for the first world war. Prior to that, ability tests were based on Alfred Binet's intelligence testing model, in which the test was individually administered to each testee by a trained psychologist. Under the psychologist's guidance, the testee was led through a series of prenormed questions until the questions were too difficult for him or her. When the examiner was sure that the testee had reached a set of items that were too difficult for him, testing was terminated.
Alfred Binet's Approach
Binet's approach had three major characteristics: (1) Testing was begun for each individual with a set of questions at his estimated ability level, based on whatever prior information was available to the examiner. (2) The difficulties of the test questions were adapted to the individual's ability level; testing was concentrated in the range of difficulty between the set of questions that were too easy for the individual (i.e., he answered them all correctly) and the set that were too difficult for him (i.e., he answered them all incorrectly). (3) The number of questions administered to each individual was based on how long it took to identify the testee's ability level; testing was continued as long as necessary, resulting in shorter tests for some testees and longer tests for others.
These three characteristics formed the basis of an individualized or adaptive testing procedure. That is, the difficulties of the test items presented to each testee were adapted or tailored to his ability level based on information gained by observing the correctness or incorrectness of his responses to previously administered items.
Reprinted with permission from Naval Research Reviews, November 1975, published by the Office of Naval Research, Arlington, VA.

As World War I changed the face of the globe and our social structures, it also changed the course of psychological measurement. The urgent need to classify millions of men and the lack of trained psychologists to administer individual tests resulted in the rejection of Binet's adaptive model and the development of the group-administered, multiple-choice, paper-and-pencil ability test. While Binet's approach has survived among clinical psychologists, the paper-and-pencil, multiple-choice test is currently used more than 95 per cent of the time for measuring abilities.
The group-administered, multiple-choice, paper-and-pencil test has grown in usage because it is, like Henry Ford's production line, efficient. Just as an auto maker can turn out thousands of new automobiles in a day, the conventional multiple-choice test can be administered to hundreds or thousands of people at one testing session. The efficiency of this type of test derives from the fact that its administration is highly standardized. All individuals start the test at the same item, proceed through the items in the same order, and end the test either at the same item or when the time limit is reached.
Loss of Major Advantages
However, in achieving this high degree of standardization, the conventional test loses the three major advantages of Binet's adaptive test strategy: (1) prior information is not used to determine a starting point for testing, (2) items are not adapted to the testee's ability level, and (3) all individuals answer the same items. This loss of adaptive properties in the conventional group-administered test leads to a loss of accuracy in test scores. It has been demonstrated in psychometric theory (e.g., Hick, 1951; Lord, 1970) that the most accurate measurement for a given individual is obtained when the difficulties of the test items are at or near the testee's ability level. In a conventional test, item difficulties are usually concentrated around the average ability level of the group for which the test is designed. Thus for individuals whose ability levels are near the group average, the conventional test will provide highly accurate test scores. However, for individuals of above-average or below-average ability levels, the test scores will be less accurate; and the further an individual's ability is from the group average, the less accurate his test score will be. (See Figure 1)


[Figure 1 - Illustrative accuracy of conventional and adaptive tests as a function of ability level. Chart: measurement accuracy (low to high) plotted against ability level (low, average, high); the conventional test curve peaks at average ability, while the adaptive test curve (dashed) stays uniformly high.]

Accuracy

In an adaptive test, however, item difficulties are selected specifically to be at or near the ability of each individual tested. Thus, test scores will be highly accurate for individuals of all ability levels. This situation is shown by the dashed line in Figure 1. The almost horizontal nature of
this line indicates a consistently high level of accuracy in adaptive test scores regardless of how high or low an individual's ability level is.

The major implication of a high and constant level of accuracy is that scores derived from adaptive tests are likely to be equally valid for all individuals, regardless of ability level. That is, more accurate or reliable scores more validly reflect the ability the test was intended to measure because there is less error in the test scores. And the more validly the test score reflects the true level of ability of an individual, the more useful and accurate it will be for making other important predictions about him. In contrast, because scores obtained from conventional tests vary in accuracy, they will also differ in their validity and utility for predictive purposes.

Adverse Psychological Effects

Conventional tests, because their construction and administration is appropriate only to examinees of average ability, also tend to have adverse psychological effects on examinees. A low ability examinee, for whom the test items are much too difficult, may become frustrated and anxious, and his test performance may deteriorate. A high ability examinee, for whom the items are much too easy, may
become bored or be insufficiently motivated to perform to his fullest capacity. Both frustration and boredom may result in inappropriate responses to the test items, and the examinee's test score will not be an accurate representation of his ability level. In the Binet testing procedure, however, the process
of adapting item difficulties to the testee's ability level and terminating the test when it becomes
too difficult can reduce the occurrence of frustration and boredom and thus the extent to which the accuracy of test scores is affected by these adverse psychological reactions.

Thus, for both psychometric and psychological reasons, Binet's adaptive testing procedure yields test scores which are highly and uniformly accurate and, therefore, uniformly useful for predictive purposes. However, Binet's approach requires a trained psychologist to administer test items, evaluate their correctness, and choose the next test item to be administered. Modern computer technology now permits these same functions to be performed by interactive computer systems. The computerized adaptive test retains all the advantages of Binet's approach and adds to it a degree of efficiency even beyond that achieved by the paper-and-pencil, multiple-choice test. A history of testing written twenty years from now may well show the same kind of redirection of psychological measurement resulting from the application of computers to the testing process as resulted from the introduction of the paper-and-pencil test during World War I.
Computerized Testing
The research reported below was initiated in order to evaluate the relative merits of a variety of approaches to computerized testing. It has been supported since mid-1972 by a contract with the Personnel and Training Research Programs, Office of Naval Research.
The Testing Environment
The availability of on-line computer systems results in a new environment for ability testing. Rather than using a test booklet and an answer sheet, the testee is now tested at a cathode ray terminal (CRT). The CRT is either directly connected to a central computer or is connected by telephone lines, depending on the nature of the computer system and physical proximity.
Test items are presented on the TV-like screen of the CRT. Following presentation of each item, the testee responds by typing an answer on the CRT keyboard. To insure that lack of familiarity with the CRT terminal does not interfere with the testing process, a series of instructions was developed to teach each testee how to use the CRT and its typewriter keyboard. This instructional sequence is administered by computer prior to the administration of tests to each individual. It introduces each special function key on the CRT, instructs the testee in how to record answers, and then gives him practice in using the CRT keyboard. If, after three tries at any particular instruction, the testee still cannot execute it properly, the CRT calls a proctor who helps the student. Once the testee successfully completes this instructional sequence, some personal data is obtained, he is introduced to the kind of test he will take, and he is given several sample questions. Finally, when all the instructions are completed, actual testing is begun.
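To make the sequence concrete, the following sketch (in Python notation; the routine names and callbacks are my own, not the project's software) captures the rule described above: each instruction is retried up to three times before the proctor is summoned.

    # Illustrative sketch of the terminal-familiarization loop described above
    # (the three-try rule and proctor call are as stated; everything else is invented).
    import random

    def run_instruction(instruction, get_testee_attempt, call_proctor, max_tries=3):
        """Let the testee try an instruction up to max_tries times, then summon help."""
        for attempt in range(max_tries):
            if get_testee_attempt(instruction):      # True if executed properly
                return True
        call_proctor(instruction)                    # proctor assists after three failures
        return False

    def familiarization(instructions, get_testee_attempt, call_proctor):
        """Administer every instruction before actual testing begins."""
        for instruction in instructions:
            run_instruction(instruction, get_testee_attempt, call_proctor)

    # Demo with stand-in callbacks:
    familiarization(["press RETURN", "type an answer"],
                    get_testee_attempt=lambda step: random.random() < 0.9,
                    call_proctor=lambda step: print("Proctor called for:", step))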
Five-Minute Learning
Our experience with this instructional sequence shows that most testees can learn to use the CRT equipment within five minutes. Several thousand college students and several hundred high school students have taken tests under this system, and the proctor has been required to explain the use of the CRT's for only about 2 per cent of the testees. These observations suggest that the man-machine interface problems in computerized testing will likely be minimal although certainly deserving of systematic research.
Our research experience has indicated that characteristics of the computer system itself may become an important component of the testing environment, particularly as it has an impact on the testee's psychological state during testing.


Adverse Effects from Large-Scale Time Sharing
We originally began our on-line testing research using CRT's acoustically connected to a large time-shared computer system. This system was operated by the University of Minnesota to simultaneously serve a variety of time-shared users throughout the state. Our experience in using this system for over two years suggests that a large-scale, multipurpose, time-sharing system is not an ideal vehicle for computerized adaptive testing. One major problem encountered was that, because access to the computer was by telephone lines, the display speed of the CRT's was limited to 30 characters per second. This meant that our instructional screens, some of which contained about 2,000 characters of information, required up to a minute each to display. Because many students could read considerably faster than 30 characters per second, they appeared to be irritated at the slow speed of the display.
A second problem, which appears to be characteristic of large scale, time-shared systems, is that they frequently take a long time to respond to the testee (system response time). Modal system response time was usually between 5 and 10 seconds, but response times of 30 to 40 seconds were not unusual. During this period, the testee had nothing to do but sit and watch the screen, waiting for the next question to appear. Frustration and negative feelings toward computerized testing were inevitable.
A third problem, which frustrated both the research staff and the testees, was frequent computer system "crashes" during which computer processing would stop for periods ranging from minutes to hours. These crashes were computer failures which were not the result of our computerized testing.
In an attempt to insure a standardized testing environment for future research in computerized testing, we began early in the research to investigate alternative computer systems which would be useful for the research and which would ultimately serve as a prototype for operational adaptive testing systems. We were particularly concerned with developing a system which would minimize the extraneous psychological effects on the testee so that we could study the relatively pure effects of adaptive testing itself on such variables as anxiety, frustration, and test-taking motivation.
Minicomputers
The burgeoning field of minicomputers provided us with an answer to our problem. After carefully investigating a number of leading minicomputers, we took delivery of a Hewlett-Packard 9600E Real-Time system in the summer of 1974. This system provides us with CRT's that display at 960 characters per second; a screen is now fully displayed in two seconds. System response time is less than 1/2 second, and the testee no longer has to wait for the next question to appear. Now the computer system "crashes" have virtually ceased. A final advantage is that we can now accurately measure a testee's response latency (the amount of time he spends deciding on an answer to each question) as additional information for potential use in measuring ability or in making predictions.
Good testing practices dictate that tests be administered in a carefully standardized environment. Our experience has shown that computerized testing research and development will be facilitated by a testing environment in which the nature of the man-machine interface and the effects of characteristics of the computer system on each testee's scores are carefully controlled.
Adaptive Testing Strategies
The bulk of our research has been concerned with evaluating the utility and measurement effectiveness of a variety of strategies which have been proposed for adaptive testing. Many of these strategies have been suggested by other investigators, and several other strategies have been developed by our research staff. Each strategy represents a different approach to adapting the difficulty level of the items administered to the ability level of the testee. A testing strategy is defined by a set of rules specifying the procedures by which responses to previously-administered items are used to select the next item or items to be administered. Some of the strategies are relatively simple, mechanical approaches to the problem, while others are based on sophisticated mathematical and statistical convergence models borrowed from other fields of research. A comprehensive review of these strategies can be found in Weiss (1974).
Pool of Stratified Test Items
One computer-administered adaptive testing strategy developed by our staff is an adaptation and extension of Binet's original testing strategy (Weiss, 1973). This strategy requires a pool of test items which is divided, or stratified, by difficulty levels. Each level or stratum of the item pool can be thought of as a conventional test in which items are of approximately the same difficulty level. For example, stratum 1 might consist of very easy items with difficulties between p = .89 and p = .99 (i.e., 89 per cent to 99 per cent of the norming group answered the items correctly) and concentrated around p = .94. Stratum 9, at the other extreme of difficulty, might consist of 20 very difficult items with difficulties concentrated around p = .06 and ranging from p = .01 to p = .11. Between these two strata would be seven other strata, each consisting of about 20 items concentrated around difficulty levels ranging from .83 to .17, in steps of .11.
Stratified Adaptive Test
Given an item pool structured in this way, the procedure for moving an individual through the strata is adaptive, hence the name stratified-adaptive or stradaptive test. Similar to Binet's strategy for individual adaptive testing, the stratum at which a testee begins testing is determined using prior information about the testee. If a testee's ability is expected to be relatively low (based, for example, on the individual's own estimate of his ability), the testing might begin at stratum 2 or stratum 3, which consist of easy items. If the testee's ability is expected to be high, testing might begin at stratum 7 or 8 (the more difficult items). When there is no information on which to base an estimate of ability prior to testing, testing can begin with items of average difficulty.
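As a rough illustration of this branching rule, the sketch below (a Python rendering of my own, not the Minnesota program or its item pool) starts a testee at a chosen stratum, moves up one stratum after each correct answer, and down one stratum after each incorrect answer.

    # Hypothetical sketch of the stradaptive branching rule described above.
    # The item pool, starting stratum, and answer model are illustrative only.
    import random

    NUM_STRATA = 9          # stratum 1 = easiest, stratum 9 = hardest

    def next_stratum(current, answered_correctly):
        """Move up one stratum after a correct answer, down after an incorrect one."""
        if answered_correctly:
            return min(current + 1, NUM_STRATA)
        return max(current - 1, 1)

    def administer(pool, start_stratum, answer_item, max_items=20):
        """Administer items from a stratified pool; 'answer_item' supplies the responses."""
        stratum = start_stratum
        record = []                              # (stratum, correct) for each item given
        for _ in range(max_items):
            item = pool[stratum].pop(0)          # first unused item at this stratum
            correct = answer_item(item)
            record.append((stratum, correct))
            stratum = next_stratum(stratum, correct)
        return record

    # Example: a pool of 20 items per stratum and a testee who answers at random.
    pool = {s: [f"item-{s}-{i}" for i in range(20)] for s in range(1, NUM_STRATA + 1)}
    print(administer(pool, start_stratum=5, answer_item=lambda item: random.random() < 0.5))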
Figure 2 (below) shows the record of an actual stradaptive test administration. The first item administered was the first item available at stratum 5, an item of average difficulty. That item was answered correctly (+). Consequently, the next item administered was a more difficult one, the first item at stratum 6. That item was also answered correctly, and the testee was branched to a more difficult item, the first item at stratum 7,


[Chart: Report on Stradaptive Test, date tested 73/07/12. Item-by-item record of 20 items across strata 1 (easy) through 9 (difficult), each response marked correct (+) or incorrect (-). Proportion correct at strata 5 through 8: 1.00, 1.00, .56, 0.00. Total proportion correct = .550.]

SCORES ON STRADAPTIVE TEST

Ability Level Scores
1. Difficulty of Most Difficult Item Correct = 1.49
2. Difficulty of the N+1 th Item = 1.44
3. Difficulty of Highest Nonchance Item Correct = 1.49
4. Difficulty of Highest Stratum with a Correct Answer = 1.33
5. Difficulty of the N+1 th Stratum = 1.33
6. Difficulty of Highest Nonchance Stratum = 1.33
7. Interpolated Stratum Difficulty = 1.37
8. Mean Difficulty of All Correct Items = .88
9. Mean Difficulty of Correct Items Between Ceiling and Basal Strata = 1.28
10. Mean Difficulty of Items Correct at Highest Nonchance Stratum = 1.28

Consistency Scores
11. SD of Item Difficulties Encountered = .59
12. SD of Difficulties of Items Answered Correctly = .46
13. SD of Difficulties of Items Answered Correctly Between Ceiling and Basal Strata = .18
14. Difference in Difficulties Between Ceiling and Basal Strata = 1.36
15. Number of Strata Between Ceiling and Basal Strata = 1

Figure 2 - Report of Stradaptive Test for a Consistent Testee
which was also answered correctly. The fourth item administered to this testee was at stratum 8. Thus, in four items, the testee moved from an item of average difficulty (stratum 5) to a difficult item (stratum 8). Because the fourth item administered was too difficult for him, he answered it incorrectly (-). As a result, he was moved back down to stratum 7 for his fifth item, where he received the second item available at that stratum. From this point on, the testee generally alternated between items answered correctly and items answered incorrectly. Similar to Binet's test, the stradaptive test is designed to administer only items that are relevant to a testee's ability level.
While testing proceeds, the computer keeps track of the proportion of items that the testee answered correctly at each stratum. This information is used to determine when the testing should be terminated. One termination rule is to cease testing when an individual's "ceiling stratum" has been identified. The ceiling stratum is the least difficult level at which an individual answers all items incorrectly, or answers at a level no higher than chance (when guessing is possible). Given a minimum of five items administered at a stratum and using a 5-alternative, multiple-choice item, testing can be terminated when the testee answers 20 per cent or fewer of those items correctly. For the testee shown in Figure 2, no items in stratum 8 had been answered correctly after five had been administered, and testing was terminated. For that individual, stratum 8 was the ceiling stratum (no items correct) and stratum 6 was the basal stratum (all items correct). Stratum 7 provided almost optimal measurement of the testee's ability level, since he answered 56 per cent of those items correctly. For this testee, ability level was determined using only 20 items.
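Stated compactly, the termination rule can be checked from the running response record. The sketch below is again illustrative Python of my own, assuming 5-alternative items (chance level 20 per cent) and a minimum of five items per stratum, not the authors' program.

    # Illustrative check of the ceiling-stratum termination rule described above.
    def ceiling_stratum(record, min_items=5, chance_level=0.20):
        """record is a list of (stratum, correct) pairs in the order administered.
        Returns the least difficult stratum answered at or below chance once at
        least min_items have been given there, or None if testing should continue."""
        per_stratum = {}
        for stratum, correct in record:
            given, right = per_stratum.get(stratum, (0, 0))
            per_stratum[stratum] = (given + 1, right + (1 if correct else 0))
        candidates = [s for s, (given, right) in per_stratum.items()
                      if given >= min_items and right / given <= chance_level]
        return min(candidates) if candidates else None

    # For the testee of Figure 2: five items at stratum 8, none correct -> ceiling is 8.
    record = ([(8, False)] * 5 + [(7, True)] * 5 + [(7, False)] * 4
              + [(6, True)] * 3 + [(5, True)] * 3)
    print(ceiling_stratum(record))   # prints 8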
A Second Testee
Figure 3 (below) shows the stradaptive test record of a testee who required 41 items before his ability level was identified. This testee's response record began at stratum 8, based on a prior estimate of his ability level. It shows several wide oscillations between strata 4 and 9 before the ceiling stratum is finally determined at stratum 8 (only 20 per cent of those items were answered correctly). In the strata between the ceiling stratum and stratum 4, which was the basal stratum, this testee answered between 54 per cent and 67 per cent of the items correctly. Of the total number of items administered to him, he answered 49 per cent correctly.
Figures 2 and 3 show a number of different scores for the stradaptive tests. In adaptive testing, different items are administered to different individuals and, ideally, everyone would answer about 50 per cent of the items correctly. Consequently, simple number correct or proportion correct scores are not appropriate, and new methods of scoring tests to estimate ability level have been developed and are under investigation in our research.
The stradaptive test, in addition to providing ability level scores, also provides what we have called "consistency scores." These scores reflect the range of difficulty of the items administered, and indicate how consistently a given individual interacts with a given pool of items. A comparison of Figures 2 and 3 shows that the test record in Figure 2 concentrates measurement in only a few strata, while the record in Figure 3 reflects an individual who is responding more inconsistently. We have hypothesized that these consistency indices should be related to the reliability of scores for a given individual, a concept not measurable with paper-and-pencil tests.
Other Adaptive Testing Strategies
In addition to studying the stradaptive testing model, we are evaluating the relative merits of other adaptive testing strategies. Some of these models, such as stochastic process models, Bayesian estimation models, and maximum likelihood models, represent very sophisticated applications of modern mathematics and probability theory. Others, such as the two-stage, pyramidal, and flexilevel models are based more on the logic of the measurement procedures involved. The results of our comparative evaluation will permit us to limit our future research efforts to those strategies which hold the


[Chart: Report on Stradaptive Test, date tested 73/07/02. Item-by-item record of 41 items across strata 1 (easy) through 9 (difficult), each response marked correct (+) or incorrect (-), showing wide oscillations between strata 4 and 9. Proportion correct at strata 4 through 9: 1.00, .60, .67, .54, .20, 0.00. Total proportion correct = .488.]

most promise for providing highly accurate measurement throughout the range of human abilities.

There are many questions to be answered before computerized adaptive testing can be most profitably applied in personnel selection, training, classification, and promotion. Research is further complicated by the newness of the field. Since virtually no live-testing research had been done in computerized testing prior to 1973, we had to develop our own approach to evaluating the effectiveness of various testing strategies.

Testing of Testing Strategies

In addition to developing a research approach which would permit us to compare the relative effectiveness of about ten basic strategies of adaptive testing, we have had to take into account a very large number of within-strategy variations. Consequently, it was not feasible to systematically vary all these characteristics in live-testing studies (i.e., studies in which testees actually complete a computerized test.)

To resolve this dilemma, our research program
uses a systematic combination of live-testing and computer simulation. First, we construct a particular adaptive test. The test is then administered on the computer to a group of testees in conjunction with a conventional (nonadaptive) test and/or another type of adaptive test. These tests are usually administered again several weeks later to obtain estimates of the stability of test scores from various strategies.

SCORES ON STRADAPTIVE TEST

Ability Level Scores
1. Difficulty of Most Difficult Item Correct = 1.89
2. Difficulty of the N+1 th Item = 1.01
3. Difficulty of Highest Nonchance Item Correct = 1.53
4. Difficulty of Highest Stratum with a Correct Answer = 2.01
5. Difficulty of the N+1 th Stratum = 1.33
6. Difficulty of Highest Nonchance Stratum = 1.33
7. Interpolated Stratum Difficulty = 1.36
8. Mean Difficulty of All Correct Items = .72
9. Mean Difficulty of Correct Items Between Ceiling and Basal Strata = .76
10. Mean Difficulty of Items Correct at Highest Nonchance Stratum = 1.24

Consistency Scores
11. SD of Item Difficulties Encountered = .86
12. SD of Difficulties of Items Answered Correctly = .74
13. SD of Difficulties of Items Answered Correctly Between Ceiling and Basal Strata = .50
14. Difference in Difficulties Between Ceiling and Basal Strata = 2.64
15. Number of Strata Between Ceiling and Basal Strata = 3

Figure 3 - Report of Stradaptive Test for an Inconsistent Testee
Given the data from the live-testing studies, and in conjunction with certain assumptions from modern test theory (e.g., Lord and Novick, 1968), we then construct a computer simulation model, using the same tests which were administered to the live subjects, which gives us results similar to those obtained from the live testing. This computer simulation model can then be used to rapidly assess the effects of varying a number of parameters associated with each testing strategy. Using the computer in this way, we "administer" a precisely specified, hypothetical test to a "subject" in a second or so, score the test, and repeat the process for many thousands of "testees." We then change the internal parameters of the testing strategy and "administer" the modified test to another large sample of simulated testees.
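The flavor of such a simulation can be suggested in a few lines. The sketch below is purely illustrative: the normal distribution of simulated abilities, the logistic response model with a guessing floor, and the simple scoring rule are my own assumptions, not the specific models the project used.

    # Illustrative Monte Carlo simulation of testees with known ability.
    import random, math

    def prob_correct(ability, difficulty, guessing=0.20):
        """Chance of a correct answer: guessing floor plus a logistic term."""
        p = 1.0 / (1.0 + math.exp(-(ability - difficulty)))
        return guessing + (1.0 - guessing) * p

    def score(responses):
        """One simple ability estimate: mean difficulty of the items answered correctly."""
        correct = [d for d, ok in responses if ok]
        return sum(correct) / len(correct) if correct else min(d for d, _ in responses)

    def simulate_testee(ability, strategy, item_difficulties):
        """Administer a strategy to one simulated testee and return the score it assigns."""
        responses = []
        for difficulty in strategy(item_difficulties, responses):
            responses.append((difficulty, random.random() < prob_correct(ability, difficulty)))
        return score(responses)

    def fixed_test(item_difficulties, responses, length=30):
        """A conventional (nonadaptive) strategy: the same first 'length' items for everyone."""
        for difficulty in item_difficulties[:length]:
            yield difficulty

    # "Administer" the conventional test to thousands of simulated testees of known ability.
    items = sorted(random.gauss(0, 1) for _ in range(200))
    for true_ability in (-2.0, 0.0, 2.0):
        estimates = [simulate_testee(true_ability, fixed_test, items) for _ in range(2000)]
        print(true_ability, sum(estimates) / len(estimates))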
Simulating Persons Being Tested
Using this approach, we can not only systematically vary the characteristics of the test, but we can also vary the characteristics of the "testees" themselves. In this way, we can determine how a particular adaptive (or conventional) test might work with testees of high ability, low ability, or those whose ability is inappropriate for the test in question. Also, because simulation permits us to use very large groups of "testees" with known simulated ability, we can evaluate the results of the testing using additional criteria which are not available from live-testing studies. After the simulation studies identify a subset of parametric variations within a strategy which appear to be optimal, it is then necessary to verify those results through live testing. This is because there might be an interaction between characteristics of the branching strategy and the simulation model which requires modifications in the model before the results are fully representative of how live testees behave. This process involves administering to live subjects the particular adaptive test which gave the best results in simulation
(please turn to page 25)


Armer - Continued from page 9
misgivings with legislation against using the system for surveillance since, in one of my scenarios, all civil liberties have been suspended in the national interest. Legislation would be meaningless.

Theft

Theft from an EFTS is an obvious concern. Note that there is a change of at least several orders of magnitude in the amount of money at risk in EFTS compared with today's system. A bank robber can't get more cash from a bank than it has on hand; in fact, he is usually limited to what a teller or two has on hand. As a result, the average bank robber nets less than $1,000 a job; but with EFTS, he would have access to very, very large amounts. It's not even clear that a would-be thief has to get cash out the end of the EFTS line. He might, for example, buy goods and services, or he might accumulate the money he is stealing in many different accounts. Conversely, one of the real advantages of EFTS is that by reducing the necessity of carrying cash, the incidence of robbery and violence to the person would be reduced. Further, checks could not be stolen from mail boxes.

Lost Deposits

There is also a change of at least three orders of magnitude in the time constants of the system. That might make a real difference. Suppose that your bank loses a deposit you've made. With today's system, there will come a time when your account doesn't have sufficient funds to cover the checks you've written and the bank begins to bounce your checks. You learn this when the bank mails you a notice that they've bounced a check or two. While you are getting this straightened out, you can continue to write checks, knowing that it's apt to be all straightened out by the time the checks clear. But suppose that in an EFTS environment a deposit is again lost or misplaced, and that you have set out on a weekend trip, planning to rely on EFTS as the medium for financial transactions. If the deposit is lost under these circumstances, you may suddenly find yourself unable to buy gas, food, or lodging or even pay a bridge toll, if you still have gas.

More Questions
The consumer has many more questions about EFTS. Who is responsible for safekeeping and security? How will the consumer be protected against errors, and how will errors be corrected? Who owns the data? Who should be responsible for audits? Who should operate the networks? What happens to the "stop payment" mechanism? What happens to "float"? How will liability be assigned as transactions move through the system? If a thief removes funds from an account, who is responsible for the loss?

Professor C. V. Ramamoorthy of the University of California at Berkeley has characterized the state of the art of our ability to audit the security and reliability of computer systems by saying that we're all no better than "witch doctors." If that's the case (and I believe it), then it would seem prudent to me to slow down the pace at which EFTS are being implemented. A bit more time might enable us to erect some safeguards. You may recall my analogy with electric power; it did cause a lot of fires before appropriate building codes were adopted. Given a bit more time, we might even think about whether we want a full blown EFTS at all.

0

IF YOU ARE INTERESTED IN COMPUTER ART,
YOU ARE INVITED TO ENTER OUR
14th Annual Computer Art Exposition
GUIDELINES FOR ENTRY:
1. Any interesting and artistic drawing, design, or sketch made by a computer (analog or digital) may be entered.
2. Entries should be submitted on opaque white paper in black ink for best reproduction. Color entries are acceptable; however, they will be published in black and white.
3. The preferred size of entry is 8½ x 11 inches (or smaller); the maximum acceptable size is 12½ x 17 inches.
4. Each entry should be accompanied by an explanation in three to five sentences of how the drawing was programmed for a computer, the type of computer used, and how the art was produced by the computer.
5. There are no formal entry blanks; any letter submitting and describing the entry is acceptable.
6. We cannot undertake to return artwork; and we urge that you NOT send us originals.
7. Entries should be addressed to: Computer Art Editor Computers and People Berkeley Enterprises, Inc., Chico Branch 555 Vallombrosa, No. 35 Chico, Calif. 95926
DEADLINE FOR RECEIPT OF ENTRIES: FRIDAY, JULY 2, 1976


A Skeptical View of Structured Programming and Some Alternatives - Part 2

Tom Gilb Iver Holtersvei 2 N-1410 Kolbotn, Norway

"Experts who are selling a technique, while ignorant or silent about comparable techniques,

are not very 'expert.' The least we can expect of a real expert is that he knows what he

f'

doesn't know and tells you without being asked."

Dual Code and Parallel Programming
The subject of parallel programming was covered briefly in my Datamation Forum "Parallel Programming." /15/ As a result of the public exposure of this technique, a number of computer professionals contacted me regarding their work in the field. Their work serves to confirm the growing recognition of this technique. /13, 9, 16, 20/
I realize that most computer specialists assume the whole technique of producing two independent sets of code for the same function must be an April Fools' joke. This is simply a demonstration of the unfortunate fact that we are populated with too many individuals who don't think very deeply and who try to live in this Future Shock world using comfortable mythodology.
There is nothing at all new or radical about the concept of dual code. It is identical to the technique widely used in high-reliability hardware systems to get a 99 per cent availability using two independently failing, 90 per cent availability, hardware modules.
If you are satisfied with 90 per cent program reliability, perhaps you will only want to code a single module, in structured code. If, however, your requirements are for a 99 per cent reliable program, you may discover that it is far cheaper to have two 90 per cent programs constructed than to try to build a single 99 per cent program.
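To spell out the arithmetic behind that claim (my own restatement, assuming the two versions fail independently): if each version is unavailable 10 per cent of the time, the chance that both are unavailable at once is 0.10 x 0.10 = 0.01, so the pair delivers 1 - 0.01 = 0.99, or 99 per cent, availability.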
I think it is worth noting that Dijkstra, who believes strongly in structured programming, has told me that he considers dual coding with parallel programmers "the only natural way to program," and that he uses it together with SP on his most important projects.
Now, perhaps you are getting an idea of how to achieve 98 per cent probability of repairing random program bugs within two seconds. Dual code. It is exactly the same thing as is done with failing hardware components. The electronic technician is not expected to start removing a failed monolithic circuit and repair it. He, or the machine, simply replaces the failed component with another one, which doesn't have the same bug.
Copyright© 1975 by Tom Gilb. Part 1 of this article was published in Computers and People, May 1976.

That is something you just can't do with structured programming. Lockheed Missiles and Space Research Labs have produced a really exciting and thorough exploration of this technique in their paper "Distinct Software." /13/ They indicate a productivity increase for programmers, compared to single programs.
There are a large number of positive attributes which seem to result from dual coding, which most of us also associate with SP:
increased programmer productivity, shortened testing time, increased program reliability.
Ignorant "Experts"
One of the main capabilities of having dual code is that, for programmed systems which require a large quantity of test cases for reasonable checkout, the extra set of code provides an automatic test-set-for-comparison generator (previously a tedious human task), and the output, which is by definition machine readable, can be compared by computer to the output of the other program. Thus comparison of detailed test results, which is bad enough during initial development, but often prohibitive during maintenance, is also an automated process. Spotting random bugs becomes a matter of identifying test output differences by computer. The speed, precision, and low cost of this method make it obviously superior to manual methods. Finding the source of the bug is a different matter, once the failure itself is identified. Here is where the other techniques, SP, comments, and modularity, can help us. Unless of course, we get to the point where we find it cheaper to throw away the failing modules and rewrite new ones entirely. Don't laugh at the idea. You don't throw away a large program or a large computer at the first fault, but you do throw away integrated circuits and perhaps smaller modules, as Weinberg advocates in his teachings (after the third attempt to get it working has failed) and demonstrates in his SP text. /17/
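As a concrete, purely illustrative sketch of that automated comparison (assuming both versions can be driven as ordinary routines over machine-readable test cases; the payroll example and its deliberate bug are invented):

    # Illustrative dual-code comparison harness (my own sketch, not from the article):
    # run two independently written versions over the same inputs and report any
    # test case on which their outputs differ.
    def compare_versions(version_a, version_b, test_cases):
        """Return the list of (case, output_a, output_b) on which the two versions disagree."""
        disagreements = []
        for case in test_cases:
            out_a, out_b = version_a(case), version_b(case)
            if out_a != out_b:
                disagreements.append((case, out_a, out_b))
        return disagreements

    # Example with two "independently coded" routines for the same function,
    # one of which harbours a bug near zero.
    def payroll_a(hours):
        return round(hours * 4.50, 2)

    def payroll_b(hours):
        return round(hours * 4.50, 2) if hours > 0 else 1.0   # deliberate bug

    print(compare_versions(payroll_a, payroll_b, [0, 8, 40, 37.5]))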
I have found that most "experts" on SP are entirely ignorant of the technique of parallel programming and dual code. Weinberg is an exception, because he is a scientist and long ago started testing his ideas by experiments using multiple coders producing multiple code. Dijkstra is an exception too, but he admits he never published the fact that


he used the method because "it isn't very exciting for my academic colleagues; it just happens to work." The reader is left to his own reflections.
Experts who are selling a technique, while ignorant or silent about comparable techniques, are not very "expert." The least we can expect of a real expert is that he knows what he doesn't know and tells you without being asked. Myers in "Composite Design" /3/ is one of the few professional paper writers who has taken the trouble to list the areas which he did not understand in connection with his modularization method. Hopefully, some will follow.
Automated Test Path Analysis
Miller of General Research Corporation /18/ and TRW Systems Group /11, pp. 6-52, 19/ have developed and made available (RXVP, from GRC, see Datamation, Feb. 1975, p. 103) a new software aid for increasing programmer productivity and effectiveness. It is a program (or set of programs) which analyzes the different test path combinations which should be exercised in order to do minimal program logic path verification. Some of these programs relate the necessary test paths to the test data in hand, and attempts are being made to use these programs as a basis for automatic generation of the necessary test cases, although this last point is more difficult.
Obviously, such tools (costing between one and several thousand dollars a month to acquire, for example) are performing much of the work of "reading" source code logic and relating it to test cases, a task with which SP is supposed to help us efficiently. But now that machines have taken over the task, some of the value, at least from readable SP, is gone. We should note that SP might, however, simplify the task of analyzing the program logic by computer and will tend to reduce the number of test cases needed. This might be a new virtue of SP.
Miller reported orally at Eurocomp-74 in London that 23 of 26 real programs examined, where the responsible authors were convinced that they had test cases for all test paths, were proven to only have test cases for 80 per cent of the paths. In other words, machines are simply better than humans at some tasks.
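The kind of bookkeeping such a tool performs can be suggested with a toy example. The sketch below (my own illustration, not RXVP) enumerates the logic paths of a two-decision program and reports which ones the test data in hand fail to exercise.

    # Illustrative path-coverage check: enumerate the branch-outcome combinations
    # of a small program and report which ones the available test cases exercise.
    from itertools import product

    def all_paths(decisions):
        """Every combination of branch outcomes for the listed decisions."""
        return list(product((True, False), repeat=len(decisions)))

    def covered_paths(test_cases, trace):
        """Run each test case through 'trace', which returns the branch outcomes taken."""
        return {trace(case) for case in test_cases}

    # Toy program under test: two independent decisions -> four logic paths.
    def trace(x):
        return (x > 0, x % 2 == 0)

    needed = all_paths(["x > 0", "x % 2 == 0"])
    have = covered_paths([4, 7, -2], trace)
    print(f"{len(have)} of {len(needed)} paths exercised; missing: {set(needed) - have}")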
If I had my choice between automated test path analysis tools and SP, I think I'd be tempted to drop SP. Fortunately, we can have both.
Process Inspections
You may have heard different terms used to describe this technique, such as structured walkthrough or program review committees. The latest term is "design and code inspections" and "Process Control in the Development of Programs" which is also the title of Mike Fagan's excellent research report from IBM's System Development Division. /16/
The programmer productivity problem with which Fagan is concerned is also stressed by TRW /11/ and is the problem of the costly and extensive rewriting of the original program code (up to 95 per cent for large software projects, if they succeed at all, according to TRW). In order to actually satisfy the users' requirements, the cost of rewriting program code and, of course, of retesting the system is from ten to one hundred times as much as the cost of making the corresponding changes at an earlier stage (at detailed design or during original coding), according to Fagan.

Structured programming cannot eliminate the rewrites; at best it can only ease the pain. The interesting problem is, therefore, how to eliminate the need to rewrite programs due to inadequate specification.
The TRW answer is the "Requirements/Properties Matrix for Design Specification" /10: 5-1, 21/ which is a simple yet obviously powerful tool for considering all important properties of each system function. The objective is to force recognition of design requirement completeness and consistency at a very early stage, thus substantially reducing the need for recoding at the later, more expensive stage.
Mike Fagan's answers differ, but are by no means contradictory. Both answers can be applied at the same time to most programming projects, even smaller ones, I believe. Fagan does a careful investigation of the measurable programmer productivity effect of introducing inspections, with the objective of identifying faults, at three possible stages, in any combination: (1) after detailed design, but before source coding; (2) after code is written, but before any computer time is used to compile and test it; (3) after the code is fully tested. The first inspection gives net savings of 94 programmer hours per thousand noncommentary source statements. The second inspection saves 51 hours more (per 1000 statements), and the final inspection? Surprise! It is counterproductive (loses more programmer time than it saves), losing 20 hours per 1000 statements. Conclusion: drop it. I'd prefer early design and code inspections to SP, but perhaps both are justified.
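Worked through from the figures as quoted (my own arithmetic): keeping only the first two inspections saves 94 + 51 = 145 programmer hours per thousand noncommentary source statements, while retaining the post-test inspection would cut the net saving to 145 - 20 = 125 hours, hence the advice to drop it.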
Data Redundancy-Based Error-Detection and Correction Methods
For the purposes of improving programmer productivity, improving program maintenance, and reducing errors in operation, there is a large set of techniques which are based on the observation that, aside from time-dependent errors (stop, interrupt, endless loop), all other errors in program logic corrupt data into some other state than the intended result.
By systematically designing data redundancy into all levels of the programmed system, the data elements, the records, and the data bases, the necessary minimum basis can be laid for automated detection and even automated correction of the effects of program errors.
Much of this technology is documented in my own book, Reliable EDP Application Design /5/, and is extended and deepened in forthcoming revisions. Briefly, the burden of extensive testing (in particular after maintenance changes) is moved to operational stages of program running, where only the cases that actually occur in practice are fed to the program anyway. Many program errors will be detected as a result of one of the cross-mesh of error detection mechanisms which are built into the system, and some will be automatically corrected in the data, although the actual logical bug will continue to promote errors. The important point is that the end user is not harmed, and the price is often only a slight performance degradation to allow for automatic correction.
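One minimal illustration of the idea, using a record layout and check of my own invention rather than the specific mechanisms of /5/: store a redundant control total with each record and verify it whenever the record is read or updated.

    # Minimal illustration of redundancy-based error detection (illustrative record
    # layout and check; not the specific mechanisms described in /5/).
    def with_control_total(record):
        """Attach a redundant control field: the sum of the numeric fields."""
        record = dict(record)
        record["control_total"] = record["debits"] + record["credits"]
        return record

    def check_record(record):
        """Return True if the redundant field still agrees with the data it summarizes."""
        return record["control_total"] == record["debits"] + record["credits"]

    account = with_control_total({"account_no": 1234, "debits": 250.00, "credits": 300.00})
    account["credits"] = 30.00          # simulate a program bug corrupting the data
    print(check_record(account))        # False: the corruption is detected at run time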
This data redundancy technology, which is not widely taught or understood at present, is also the basis for independent data base audit programs which can, among many other purposes, serve as a further


degree of automation of the verification of correct operation of programs. This is particularly important in a data base environment with constant program maintenance, any of which threatens the common data base resource. The volume, program independence, and data structure complexity considered together with the probable importance of the availability and accuracy of the data, make fully automated data base audit programs an absolute requirement in my opinion.
To illustrate the effect of this tool, a bank client of the author added a data base diagnosis program to an eight-year-old, on-line data base of a million accounts. It found three hundred thousand faults in the files, which no user or application program had identified. But most of these faults were data which led to inaccurate top management reports. Top management was shocked; SP efforts hadn't helped.
Conclusion
These alternative techniques are not by any means a complete list. The techniques are not described fully, and perhaps, therefore, not convincingly enough. But the reader has been given some references which are intended to allow him to follow up the practical use of these alternative and supplementary techniques.
I wouldn't want the reader to go away with the misconception that this author is against structured programming. I am only against accepting it without considering alternative and supplementary techniques, without better practical evidence of its merits and disadvantages, and without measuring its effects periodically in each site where it is taken into use. Further, I am disturbed that it is taking attention away from techniques which would seem to be of more fundamental value to programmer productivity and to programmed system quality.
Lastly, there is the nagging recognition that in the final analysis structured programming is just a way of improving the effectiveness of a form of manual labor: program coding and human program reading. The really interesting questions are how to eliminate as much of the work as possible, and then how to automate as much as possible of what remains.
A Prediction
Finally, I shall allow myself the luxury of a prediction about where this technology is going, indeed, where it must go.
Our programming languages themselves will see considerable simplification to the minimum level needed to specify application logic and data without considering efficiency, reliability, portability, or other quality attributes. The higher level, multi-dimensional attributes will be fed into future compilers, and then initial, and continually revised, program and data design will be done by software which relates our optimization and design needs to the actual operating environment. Programs and data bases will be, to a far higher degree, self-designing and self-adapting. It is technologically possible to an interesting degree now, and it may be the only way we can provide computer resources in a Future Shock (see Alvin Toffler's book) environment. This author /22/ and Knuth /2, p. 283/, among others, have remarked on the idea of self-designing and self-adapting software, which after all is the same principle which sophisticated operating systems and optimizing compilers already employ.

References

/1/ Gilb, "When are we going to structure the knowledge about programming techniques?" ACM Forum, Comm. of the ACM, March 1975, pp. 187-188.
/2/ ACM Computing Surveys, December 1974. Special issue on (structured) programming.
/3/ Myers, Glenford J., Composite Design: The Design of Modular Programs. IBM TR 0024 06, January 1973. 80 pages.
/4/ Gilb, Software Maintainability, in Management Datamatics, IAG, February 1975, p. 25.
/5/ Gilb, Reliable EDP Application Design. Petrocelli/NY/USA; Studentlitteratur/S.
/6/ Gould and Drongowski, A Controlled Psychological Study of Computer Program Debugging. IBM Research, RC 4083, October 1972. 39 pages.
/7/ Gilb, Bebugging: Measurement of Program Quality, Project Progress, and Motivation. IAG Communications, February 1975, pp. 8-9; or (IAG) Management Datamatics, April 1975, p. 68.
/8/ Lipow, M., Estimation of Software Package Residual Errors. TRW-SS-72-09. 11 pages. TRW, One Space Park, Redondo Beach, California 90278, USA.
/9/ Girard and Rault, A Programming Technique for Software Reliability, in IEEE 1973 Symposium on Computer Software Reliability. IEEE Cat. 73CH0741-9CSR.
/10/ Boehm et al., Characteristics of Software Quality. TRW-SS-73-09. See /8/ for TRW.
/11/ Boehm (Chairman), Reliable, Cost-Effective, Secure Software. TRW-SS-74-14. See /8/.
/12/ EDP Analyzer, May 1974, The Search for Software Reliability.
/13/ Fischler, Firschein, and Drew, Distinct Software. Lockheed, Palo Alto, CA.
/14/ Hetzel (ed.), Program Test Methods. Prentice-Hall, ISBN 0-13-729624-X, 1973.
/15/ Gilb, Parallel Programming. Datamation, October 1975, pp. 160-161.
/16/ Fagan, M., Design and Code Inspections. IBM TR 21,572, December 1974, SDD Kingston.
/17/ Weinberg, Yasukawa, and Marcus, Structured Programming in PL/C. Wiley, 1973.
/18/ Miller, E. F., Automatic Generation of Software Test Cases. Eurocomp-74 proceedings.
/19/ Krause, Smith, R. W., and Goodwin, Optimal Software Test Planning Through Automated Network Analysis. TRW-SS-73-01. Also in /9/ IEEE, p. 18.
/20/ Kopetz, Software Redundancy in Real-Time Systems. IFIP-74 proceedings. North-Holland.
/21/ Boehm, Some Steps Toward Formal and Automated Aids to Software Requirements Analysis and Design. IFIP-74 proceedings. North-Holland.
/22/ Gilb, The CODASYL DBTG Report: A Counter Proposal. Simplification and Self-Optimization. IAG Communications 3/4, 1972; also Infotech Data Base Report.
/23/ Larson, Rodney R., Test Plan and Test Case Inspection Specification. IBM TR 21.586, April 4, 1975; see /16/.
/24/ Gilb, T., Software Metrics. Winthrop, Cambridge, Mass., 1976; contains extensive details on the matters discussed in this article.

[]


Computing and Data Processing Newsletter

NEW YORK TIMES TO INSTALL THE WORLD'S LARGEST ELECTRONIC NEWSROOM
Fred Baker Harris Corporation 55 Public Square Cleveland, OH 44113
A committee of New York Times executives (representing news, production, communications, and data processing departments) has recommended after several years of study that the paper start the transition to electronic editing and composition. They have also recommended that the paper use an integrated system, including video display terminals, computer equipment, and software developed especially for news handling. Transition to the new equipment, which will write and edit all of the paper's news and drive its typesetting equipment, will begin this summer. It will be the largest installation of computerized news-editing equipment in the world.
Phased conversion of the Times' news and editorial matter from hot-metal typesetting to photocomposition or "cold type" will begin with selected Sunday feature sections and end with the ultimate changeover of the main news sections. Readers of the Times are expected to benefit because the new system will permit late-breaking news to be prepared for press faster than under present procedures. Since articles will be typeset automatically after viewing by editors on the video screens, typographical errors will be almost eliminated.
Commenting on the newsroom program, Walter E. Mattson, executive vice president and general manager of the New York Times Newspaper Division, said, "Paired with our decision of last year to build a satellite printing plant in New Jersey, this move into electronic editing will keep the Times in the forefront of newspaper production technology. The results will be to improve the appearance of the paper, speed its distribution to readers, and to reduce our costs."
The linking of video display terminals, computers, and photographic typesetters has helped publishers increase accuracy as well as reduce "prepress" costs despite inflation. Adopted first by smaller newspapers, electronic editing technology has recently been making rapid strides in metropolitan areas. The American Newspaper Publishers' Association expects more than 300 papers to be using the equipment this year, double last year's figure.

When the new system is installed at the Times, copy from news agency wires such as the Associated Press or United Press International will feed directly into the computer memory for later editing on the video screens. Leased news lines from the Times' Washington news bureau and from Europe will also feed into the computer system. In addition, the Times will use portable "reporter terminals" for remote input to the editing system.
Later this year, when the Times' editors first sit down to their new video typewriters, they will find there is no paper to insert, no carriage to return, and little noise. Striking over an incorrect letter or word will erase it from the screen as the correct character appears. About the size of ordinary electric typewriters, the units will be equipped with standard typewriter keyboards and several special-function "command" keys. As the editor types, his copy is displayed on a five-by-ten-inch cathode ray tube above the keyboard.
A "scroll-back" or recall capability will permit copy editors to review and revise any portion of a story before sending it along to computer memory. From there, senior editors can retrieve it for evaluation and final copy cutting, updating or additions. When satisfied, the editor will press a button that says, "Set It," and the copy will be automatically dispatched for computerized composition.
The news-editing and typesetting equipment will be supplied by the Harris Corporation.
POLICE DEPARTMENT USES COMPUTERIZED SYSTEM TO FIGHT CRIME
Joe Francis The Boeing Company Wichita, KS 67210
At approximately 1 a.m. in St. Louis, Missouri, an alert police officer motioned a truck driver to pull his vehicle over to the side of the road. The officer's suspicions that the truck was stolen were confirmed when, as soon as the officer stepped out of his car, the truck accelerated down the highway.
As the officer chased the truck at speeds up to 65 miles per hour, a dispatcher, located at St. Louis Police Department Headquarters, watched the
progress of the squad car on a color-television-like
display. He also observed blips on the screen, rep-


FLAIR (Fleet Location and Information Reporting) communications are depicted (left) by the code message unit used by police officers to contact headquarters and (right) by the scene of action displayed on a dispatcher's video terminal. The real-time crime fighting system tracks police cars over the entire city of St. Louis with 70-foot accuracy.

resenting other officers in the vicinity of the chase. By directing these other officers to nearby key intersections, an effective road block was set up, which enabled the police to trap the stolen truck within 8 minutes after it was spotted.
The system that is aiding the St. Louis police in fast reaction and apprehension is called FLAIR (Fleet Location and Information Reporting). The patented FLAIR system, which is the first of its kind in the world, uses a computer, combined with proprietary software, to automatically track and display a squad car's immediate location and the status of its officer. This computerized crime-fighting weapon has proved so successful, while logging more than one million miles in the St. Louis pilot system, that the city recently announced that it will soon expand the FLAIR system, which will then consist of 200 FLAIR-equipped cars, 2 computers, and 6 dispatch stations.
Vastly enhanced command and control of forces, and much improved officer safety, are the end products of FLAIR, according to St. Louis police officials. St. Louis Police Chief Eugene Camp believes the quicker response time of FLAIR has a twofold result. "First, more apprehensions will occur, and secondly, there's a consequent deterrent effect on would-be offenders."
With FLAIR color-television monitors, a police dispatcher observes the movements of squad cars superimposed on a computer-stored city map. With options to switch among three magnifications, dispatchers may observe the entire city or areas as small as a few city blocks, including street names. A dispatcher can estimate the speed of a squad car, note when it turns a corner, even track it into a multilevel garage. Furthermore, an officer in distress need only press a red button on his FLAIR message unit, and the nearest reinforcements are immediately dispatched to his aid. Data accumulated from one year of operation of the FLAIR pilot equipment show that a dispatcher's image of a squad car location is within an average of 71 feet of the actual location.
FLAIR digital communications are transmitted over police radio bands. Police cars are equipped with a code message unit, a heading sensor, an odometer, a data processor, and a radio transmitter/receiver. Officers may transmit secure, two-digit messages by entering codes on the 12-key, dashboard-mounted message unit. Base or command control center equipment consists of a data terminal, the computer, a video processor, and a number of color video display devices. Supporting the computer are disc drives and teletypes. Each computer is capable of tracking up to 500 vehicles simultaneously. An unlimited number of cars may be added to the system by interconnecting computers and communications equipment.
FLAIR's tracking is based on a fundamental navigational principle: if the original location of a vehicle is known, any future location may be determined by adding the heading (direction) and the change in distance to that original location. The computer continuously reads, decodes, and updates all vehicle positions and coded messages transmitted to the control center from vehicles. Vehicle locations appear as bright moving symbols over the computer map displayed on monitoring screens. The dispatcher uses push buttons to select various map segments.
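The dead-reckoning arithmetic behind this principle is simple enough to sketch. The short Python fragment below is only an illustration of the idea, not Boeing's proprietary FLAIR software; the coordinate convention (feet east and north of a known starting point, compass headings in degrees) is an assumption made for the example.

```python
import math

def dead_reckon(x, y, reports):
    """Advance a vehicle's position from a known starting point.

    reports is a sequence of (heading_degrees, distance_feet) pairs,
    one per reporting interval.  Headings are compass bearings,
    measured clockwise from north (an assumption for this sketch).
    """
    for heading, distance in reports:
        theta = math.radians(heading)
        x += distance * math.sin(theta)   # east-west component
        y += distance * math.cos(theta)   # north-south component
    return x, y

# A car that drives two intervals north, then one interval east,
# ends up about 50 feet east and 200 feet north of where it started.
print(dead_reckon(0.0, 0.0, [(0, 100), (0, 100), (90, 50)]))
```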
Keying a vehicle number (or officer number) and depressing the "locate" button causes a box symbol to appear adjacent to the appropriate vehicle and simultaneously calls up additional map segments as the car travels. In addition, the call numbers of the six closest cars to the squad car being tracked are listed on a status display beside the computer map display. The closest cars may be selected from any combination of officer classifications: patrol, detective, investigation, laboratory, or vice.
A box symbol tracking a squad car glows steadily if the officer is in service but not on a call, appears with an "L" symbol if the officer is on a low priority call, appears with an "H" symbol if he is on a high priority call, and with an "E" symbol for an officer in an emergency situation. In the last case, indicated by the officer's activating his red button, an audible warning also is sounded on the central communications command video terminal.
With FLAIR, a dispatcher can direct officers to incidents by route as opposed to address, thus circumventing known construction barriers or other obstacles. The efficient data communications - 99 codes may be keyed into the coded message unit - alleviate voice communication congestion.
Speed of computation is a primary concern for police tracking and status applications. FLAIR receives coded messages of each vehicle's heading and distance at 1.2-second intervals. Also, FLAIR's executive routine (the software instructions governing the computer's control) requires the computer to continuously scan all incoming communications to discern priorities.
Programs representing a hierarchy of priorities are stored in the computer's memory. Incoming messages are then scheduled by the computer's executive routine, according to priority. Thus, an officer's red-button alert commands computer attention ahead of normal heading and distance signals.
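A minimal sketch of that kind of priority scheduling is shown below. The message classes and priority numbers are invented for illustration; they are not FLAIR's actual codes or its executive routine.

```python
import heapq
import itertools

# Illustrative priority levels only; the real hierarchy is not published here.
PRIORITY = {"emergency": 0, "high_call": 1, "low_call": 2, "position": 3}

_arrival = itertools.count()   # tie-breaker keeps equal priorities in arrival order
_queue = []

def receive(msg_type, payload):
    """Queue an incoming message according to its priority class."""
    heapq.heappush(_queue, (PRIORITY[msg_type], next(_arrival), payload))

def next_message():
    """Return the most urgent queued message, or None if nothing is waiting."""
    if _queue:
        return heapq.heappop(_queue)[2]
    return None

receive("position", "car 21: heading 090, 45 ft")
receive("emergency", "car 7: red button pressed")
print(next_message())   # the red-button alert is serviced first
```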
The St. Louis program was funded by a grant from the Law Enforcement Assistance Administration of the Justice Department. The FLAIR computer is a Varian Data Machines V73 and uses Boeing software.
COMPUTER MODEL OF MARS WILL INTERPRET DATA FROM VIKING SPACECRAFT
Charles H. Ball News Office Massachusetts Institute of Technology Cambridge, MA 02139
Two Massachusetts Institute of Technology researchers have prepared computer models of the interior of the planet Mars to facilitate the interpretation of data from two Viking spacecraft headed for a landing on the planet in July. The models predict the structure of the interior, the size and composition of the core, and mineral assemblages of the Martian mantle. The researchers have found that Mars is similar to the earth in many respects and differs from the moon.
The models are based primarily on data from the Mariner orbiters that included observations of the planet's gravity field, its shape and topography, and photographs of the surface. It was learned from these missions that Mars has been an active planet with large volcanoes and other tectonic features. Although the bulk of the experiments on the Vikings will be devoted to detecting the presence of life on the planet, each spacecraft will carry a seismometer to detect Marsquakes and to help determine the structure of the Martian interior.
The researchers, Professor M. Nafi Toksoz and David H. Johnston, calculated models of the evolution and temperature history of Mars in order to show the feasibility of large-scale melting and core formation, which would be evident at the surface in the volcanism observed in the Mariner photos and in the weak magnetic field discovered by the Soviet orbiters. The evolution models also predict the present-day temperatures within the planet, one of the most important parameters used by scientists to determine the physical conditions in the interior.
The next step in understanding the internal structure of Mars is to calculate the changes of density with depth in the planet. From these models, Toksoz and Johnston conclude that, like the Earth, Mars has a large, 1250-mile-radius molten core, but that this core is less dense than the Earth's. The next layer, called the mantle, comprises the bulk of the planet and is more dense than the Earth's because it contains more iron. The upper part of the mantle may be partially molten, similar to Earth's asthenosphere, but on Mars this region occurs at a greater depth (more than 125 miles, compared to about 62 miles in the Earth), so plate tectonics, as is known on Earth, does not occur on Mars. Finally, the surface of the planet is covered by a thin crust which is the result of melting in the Martian interior billions of years ago.

The most useful information for the interpretation of the data that will be obtained from the Viking seismometers is found by using the density models to calculate the velocities of seismic waves in Mars that might be generated by Marsquakes. Toksoz and Johnston predict that Marsquakes will occur and will be more prevalent and energetic than quakes detected on the moon. The seismic activity in a planet is a good indicator of its internal temperature, energy, and tectonic rigor. Viking seismometers will give a good measure of this. They will also enable the seismologists to determine the size of the Martian core and the structure of its mantle and crust.
The interpretation of such data in terms of the theoretical models will help to define more closely the internal state of Mars and its comparison with the earth and moon. Ultimately this will enable scientists to place the Earth, Moon, and Mars in proper context in the solar system with regard to the formation and evolution of the planets.

FBI WILL USE AUTOMATIC "MATCHER" FOR FINGERPRINT IDENTIFICATION
Ralph Wallenhorst Calspan Corporation P.O. Box 235 Buffalo, NY 14221

The Federal Bureau of Investigation will take a major step in its program to automate one of the most exacting and detailed tasks in government - establishing the identity of individuals through their fingerprints. Within one year, a prototype "matcher," automatic equipment for matching one fingerprint against another, will be installed at the FBI's Computer Center. Currently the filing, classification, and matching are done manually by approximately 3,300 FBI fingerprint technicians and clerks. Each day the FBI receives about 22,000 fingerprint inquiries, which must be searched against card files representing more than 21 million persons.

The matcher will be capable of matching a set of submitted prints against the prints on file by comparing the minutiae read by an automatic fingerprint reader and scoring the comparison. It will make several thousand comparisons each time the print of a single finger is matched against another print. The matcher will also be required to check the fingerprints being searched against about 100 sets of filed fingerprints per second.
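The Bureau has not published the matcher's scoring algorithm, but the general idea of comparing minutiae and scoring the comparison can be sketched. In the toy example below, a minutia is a ridge feature given by its position and ridge angle, and the score is simply the number of probe minutiae that find a nearby, similarly oriented counterpart in the filed print; the tolerances are invented for illustration.

```python
def match_score(probe, filed, pos_tol=8.0, angle_tol=15.0):
    """Score one print against another by counting agreeing minutiae.

    Each minutia is (x, y, angle_degrees).  Both tolerances are
    illustrative, not the FBI's actual thresholds.
    """
    score = 0
    for px, py, pa in probe:
        for cx, cy, ca in filed:
            near = (px - cx) ** 2 + (py - cy) ** 2 <= pos_tol ** 2
            aligned = abs((pa - ca + 180) % 360 - 180) <= angle_tol
            if near and aligned:
                score += 1
                break
    return score

probe = [(10, 12, 30), (40, 55, 100), (70, 20, 250)]
filed = [(11, 13, 28), (39, 57, 104), (90, 90, 10)]
print(match_score(probe, filed))   # 2 of the 3 probe minutiae agree here
```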

Essentially, the automated system will be used by the FBI as a high-speed selection process. The final determination will still be made by humans. Pairings of fingerprints scored highest by the matcher, the Bureau emphasizes, will be "subjected to a final manual verification process to assure the accuracy of the identification."

The prototype matcher model will be built by Calspan Corporation.

(please turn to page 26)

Weiss - Continued from page 18

studies. The results of the live-testing and simulations are then compared. This empirical verification of the simulation model also serves as a check on the adequacy of the initial simulation model, which might have been based to some extent on the characteristics of a unique set of testees, or on their interaction with some particular adaptive test.
(To be continued in next issue)


Newsletter - Continued from page 25
COMPUTER HELPS RESOLVE MEXICAN-U.S. WATER PROBLEMS
R. W. Sheehy
Control Data Corporation
6003 Executive Blvd.
Rockville, MD 20852
A giant computer at the Bureau of Reclamation's Engineering and Research Center, in Denver, Colorado, has been put to work easing international tensions between the United States and Mexico. The computer is being used to predict, monitor, and help control salinity levels of the lower Colorado River. Excessive salt content in the water reaching Mexico has been the cause of disputes between the two countries for more than a decade.
Salinity increases in the Colorado River have been recorded since the early 1900s. Even before the problem became acute, salinity of the lower Colorado had been steadily increasing; by 1962, saline concentration was nearly double what it had been only ten years earlier. By 1970, salt content at the southern end of the Wellton-Mohawk irrigation canal, which drains into the Colorado River above Mexico's Morelos Dam, sometimes reached 3,000 parts per million, an unacceptable level for most agricultural purposes.
In August 1973, Mexico and the United States signed an agreement to effectively control the salinity of the 1 1/2 million acre-feet of water that the U.S. is obligated to deliver to Mexico from the Colorado River under a 1944 treaty. Under the 1973 agreement, water reaching Mexico has "... an annual average salinity of no more than 115 ppm ± 30 ppm ... over the annual average salinity of the Colorado River waters which arrive at Imperial Dam ...." The requirement became effective with the signing of the Colorado River Basin Salinity Control Act in 1974. That act authorized the construction of the works necessary to achieve the agreed differential in salinity.
The method most acceptable to the United States as a permanent solution for maintaining the differential salinity level of 115 milligrams per liter is to modify the Wellton-Mohawk drainage waters. This will involve irrigated acreage limitations, improved irrigation efficiencies, further regulation of Gila River floodflows, and construction of a large-scale desalting complex to treat the saline drainage waters.
The job of design, construction, operation, and maintenance of the world's largest desalting plant has been assigned to the Bureau of Reclamation. A computer is assisting in comprehensive modeling studies to predict (1) future salinity conditions in the Colorado River, (2) the effects of improved irrigation efficiencies on drain water supplied to the desalting plant, and (3) future flow conditions in the Colorado River. During plant operation, the computer will provide water-quality simulation of the river to help assure that the agreed upon salinity differential is maintained.
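At the heart of such water-quality simulation is flow-weighted mass balance: the salt load carried by each stream is its flow times its concentration, and blending conserves that load. The sketch below illustrates only that arithmetic; the flows and concentrations are made-up figures, not Bureau of Reclamation data.

```python
def blended_salinity(streams):
    """Flow-weighted average salinity of mixed streams.

    streams: list of (flow_acre_feet, salinity_ppm) pairs.
    Salt load is flow * concentration; blending conserves the load.
    """
    total_flow = sum(flow for flow, _ in streams)
    total_salt = sum(flow * ppm for flow, ppm in streams)
    return total_salt / total_flow

# Hypothetical figures: river water at 850 ppm blended with
# desalted drainage at 300 ppm.
mix = blended_salinity([(1_400_000, 850), (100_000, 300)])
print(round(mix), "ppm")   # roughly 813 ppm in this made-up example
```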
In a related study, the Bureau of Reclamation is looking into the feasibility of tapping the huge water reservoirs and geothermal power sources underlying much of California's Imperial Valley as a means of augmenting and further improving water quality in the Colorado River. Water in this tremendous subterranean reservoir, while often too salty in its natural state as it flows freely from the ground, can be desalted by using heat in the fluid itself. The heat energy can be used to generate electrical power, power desalting plants, and return the fresh, usable water to the river.

Computer programs are used to determine the size of the reservoir and its expected useful life. These programs are derived from such information as changes in yield of the wells and various geophysical data. While only on a pilot-test scale, the process offers an additional means to make more good-quality water available to users of the lower Colorado River.

The Bureau of Reclamation's computer is a CDC CYBER 74 made by Control Data Corporation.


Browder - Continued from page 13

concerned, the forms of this development become more and more alien to their outlook. What might have been an increase in human powers and freedom becomes a vaguely defined and somewhat monstrous threat looming in the background of present-day life. The technical as well as the mathematical tools of society take an external and bureaucratic form which depresses human possibilities rather than raising them.

The Failure to Convey the Spirit of Mathematics

It is my suspicion that the discrepancy between the intellectual power of our mathematical and scientific disciplines and their negative impact upon the thinking of nonmathematicians and nonscientists even (or, perhaps, we should say, especially) among the well-educated in our society may be attributed in no small part to a profound defect in our fundamental concept of mathematical and scientific education for nonspecialists. To put the matter in mathematical terms, the defect lies in viewing the teaching of mathematics in practical, technical or research terms exclusively, i.e., in terms of Mathematics I, II, or III. We have failed in large measure to find ways to convey the spirit of Mathematics IV, the transcendent ideal of mathematics as a fundamental and universal form of knowledge. This failure is in turn due to the failure or refusal of many of our contemporaries among the mathematicians to recognize the validity or even the meaningfulness of such an ideal.

It is my hope that this failure represents a challenge that will be overcome in the historical period in which we play a part. It can be overcome only, as Whitehead told us, in the spirit that even modern mathematics is still in its potential infancy and that the overwhelming novelty in human thought for many centuries to come will be the dominant role of mathematical understanding.


101 MAXIMDIJES
Over 100 amusing, easy, cryptographic puzzles ... with maxims, quotations, sayings, etc., as the answers ... with lists, tables, and a guide for solving speedily ... hours of fun and entertainment for you and your friends.
For your copy, send $1.80, plus 20 cents for postage and handling, with your name and address to:
Berkeley Enterprises, Inc. 815 Washington St. Newtonville, Mass. 02160


GAMES AND PUZZLES for Nimble Minds - and Computers

Neil Macdonald Assistant Editor

It is fun to use one's mind, and it is fun to use the artificial mind of a computer. We publish here a variety of puzzles and problems, related in one way or another to computer game playing and computer puzzle solving, or to the programming of a computer to understand and use free and unconstrained natural language.
We hope these puzzles will entertain and challenge the readers of Computers and People.

NAYMANDIJ
In this kind of puzzle an array of random or pseudorandom digits ("produced by Nature") has been subjected to a "definite systematic operation" ("chosen by Nature") and the problem ("which Man is faced with") is to figure out what was Nature's operation.
A "definite systematic operation" meets the following requirements: the operation must be performed on all the digits of a definite class which can be designated ; the result displays some kind of evident, systematic, rational order and completely removes some kind of randomness; the operation must be expressible in not more than four English words. (But Man can use more words to express it and still win.)

NUMBLES
A "numble" is an arithmetical problem in which: digits have been replaced by capital letters; and there are two messages, one which can be read right away and a second one in the digit cipher. The problem is to solve for the digits. Each capital letter in the arithmetical problem stands for just one digit 0 to 9. A digit may be represented by more than one letter. The second message, which is expressed in numerical digits, is to be translated (using the same key) into letters so that it may be read; but the spelling uses puns, or deliberate (but evident) misspellings, or is otherwise irregular, to discourage cryptanalytic methods of deciphering.

NAYMANDIJ 766

[The digit array for Naymandij 766 is not legible in this reproduction.]

NUMBLE 766

  O A K
x   I S
[The remaining rows of Numble 766, and its second message in digits, are not legible in this reproduction.]

MAXIMDIJ
In this kind of puzzle, a maxim (common saying, proverb, some good advice, etc.) using 14 or fewer different letters is enciphered (using a simple substitution cipher) into the 10 decimal digits or equivalent signs for them. To compress any extra letters into the 10 digits, the encipherer may use puns, minor misspellings, equivalents like CS or KS for X or vice versa, etc. But the spaces between words are kept.
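The enciphering itself is a straightforward substitution, as the short sketch below shows for a maxim that happens to use no more than ten different letters; the maxim and the key are invented for the example, and the puns and deliberate misspellings that let a real Maximdij squeeze up to 14 letters into 10 digits are left to the puzzle-maker.

```python
def encipher(maxim, key):
    """Replace each letter with its digit, keeping the spaces between words."""
    return "".join(str(key[ch]) if ch.isalpha() else ch for ch in maxim.upper())

maxim = "LESS IS MORE"    # seven distinct letters: L E S I M O R
key = {"L": 4, "E": 0, "S": 7, "I": 2, "M": 9, "O": 5, "R": 1}   # invented key
print(encipher(maxim, key))    # -> 4077 27 9510
```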

We invite our readers to send us solutions. Usually the (or "a") solution is published in the next issue.
SOLUTIONS
NAYMANDIJ 765: Make row 5 even.
MAXIMDIJ 765: Haste is the father of error.
NUMBLE 765: Still water, deep bottom.

MAXIMDIJ 766

[The cipher for Maximdij 766 is not legible in this reproduction.]

Our thanks to the following individuals for sending us solutions: Leon Davidson, White Plains, N.Y.: Maximdij 763 - Frank E. DeLeo, Brooklyn, N.Y.: Maximdij 763, Numble 763, Maximdij 764, Numble 764 - T. P. Finn, Indianapolis, In.: Maximdij 764, Numble 764 - Jean Robbins, Pasadena, Calif.: Maximdij 764, Numble 764.


COMPUTER GRAPHICS AND ART

COMPUTER GRAPHICS and ART is a new international quarterly of interdisciplinary graphics for graphics people and computer artists. This new periodical is aimed at students, teachers, people from undergraduate and graduate institutions, researchers, and individuals working professionally in graphics. Its topical coverage is broad, embracing a variety of fields. It is useful, informative, entertaining, and current. Our goal is excellence, and to achieve this objective, we invite our readers to participate actively in the magazine, and to advance the state of the art of computer graphics by communication, sharing, and dissemination of ideas. We invite you, your colleagues and students to help us achieve this goal.

Partial Table of Contents - Vol. 1, No. 1

Learning Through Graphics by Dr. Al Bork, University of California, Irvine, California - A ten-year forecast for computers, education, and graphics by a leading authority.

Art of the Technical World by Dr. Herbert Franke, Munich, Germany - Computer art as the bridge between the two realms of art and leisure.

Expanding the Graphics Compatibility System to Three Dimensions by Richard F. Puk, Purdue University, Lafayette, Indiana - Design considerations for a user-oriented 3-D graphics system.

A Personal Philosophy of Ideas, New Hardware, and the Results by Duane Palyka, University of Utah, Salt Lake City, Utah - The frame-buffer from Evans and Sutherland allows the artist to treat the computer as a paint and brush medium.

PLOTMAP - Computer Representation of Geographic Data by Lloyd Onyett, California State University, Chico, California - A computer scientist and geographer reviews a mapping system he has devised for small and medium-sized computers.

How to Build Fuzzy Visual Symbols by Alex Makarovitsch, Honeywell Bull, Paris, France - A new approach to computer art and graphics by a computer scientist.

Coordination of Bibliography-Making for Interdisciplinary Graphics by Grace C. Hertlein, Editor - Proposal for merging, coding, and disseminating interdisciplinary graphics bibliographies, using a tested method.

Computer Art Illustrations by Frieder Nake, Alex Makarovitsch, William Kolomyjec, Ensor Holiday, and others.

List of Coverage for Up-Coming Issues

Applied Arts and Graphics
Architectural Graphics
Cartography Systems
Computer-Aided Design
Computer Assisted and Managed Instruction Utilizing Computer Graphics
Computer Graphics in Physics, Chemistry, Mathematics, etc.
Computer Programs for New Applications
Display Systems and Graphics
Fine Art and Media Explorations
Graphics in Business
Hardware Systems and Graphics
Interactive Graphics Languages and Systems
Languages for Computer Graphics and Graphics Primitives
Software Systems and Graphic Requirements
Statistical Packages and General Graphing
Syllabi

- - - - - - - - - - - - - - - - - - - - - - - - -(may be copied on any piece of paper)- - - - - - - - - - - - - - - - - - - - - - - -

To: COMPUTER GRAPHICS and ART
Berkeley Enterprises, Inc., 815 Washington St., Newtonville, Mass. 02160

( ) Please enter my PERSONAL (U.S. and Canada) subscription to CG&A ($10 per year).
( ) Please enter my FOREIGN PERSONAL subscription to CG&A ($13 per year).
( ) Please enter my LIBRARY/DEPARTMENTAL subscription to CG&A ($15 per year).
( ) Enclosed is my PERSONAL CHECK AND/OR PURCHASE ORDER for CG&A.
( ) Enclosed is $2.50 for a sample copy of CG&A (applicable toward a subscription).

FULL REFUND IN 30 DAYS IF NOT SATISFACTORY

Name ____________________ Title ____________________
Organization ____________________ Address ____________________

HERE IS YOUR OPPORTUNITY FOR FEEDBACK TO US:
( ) I hope to submit for publication in CG&A material on the following topics: ____________________
( ) I am interested in reading materials by the following authors: ____________________
( ) I am particularly interested in coverage of the following subjects: ____________________
( ) I would like to receive materials on other Berkeley Enterprises, Inc. publications: ( ) COMPUTERS and PEOPLE ( ) The COMPUTER DIRECTORY and BUYER'S GUIDE ( ) People and the PURSUIT of Truth ( ) The Notebook on COMMON SENSE and WISDOM ( ) WHO'S WHO in COMPUTERS and DATA PROCESSING ( ) Books
( ) I am interested in: ( ) black and white computer art reprints at low cost (a bonus for subscribing to CG&A) ( ) 77 page FORTRAN IV art manual ( ) 45 page interdisciplinary graphics bibliography by G. Hertlein
( ) Additional Comments (attach another paper if needed): ____________________

