perm filename EIGHT.PUB[BIB,CSR]3 blob sn#518671 filedate 1980-06-27 generic text, type C, neo UTF8
.require "setup.csr[bib,csr]" source file;
.font A "math55";
.once center
%3Stanford University Computer Science Reports%1
%3List Number 8↔July 1980%1

@Listed here are abstracts of the most recent reports published by the
Department of Computer Science at Stanford University.

@%3Request Reports:%1 Complete the enclosed order
form, and return the entire order form page (including mailing label) by
September 8, 1980.  In many cases we can print only a limited number of copies,
and requests will be filled on a first come, first served basis.  If the code
(FREE) is printed on your mailing label, you will not be charged for hardcopy.
This exemption from payment is limited primarily to libraries.  (The costs
shown include all applicable sales taxes.  %2Please send
no money now, wait until you get an invoice%1.)

@%3Alternatively:%1 Copies of most Stanford CS Reports may be obtained by writing
(about 2 months after the "%2Most Recent CS Reports%1" listing) to the
%2National Technical Information Service%1, 5285 Port Royal Road, Springfield, Virginia 22161.
Stanford Ph.D. theses are available from %2University Microfilms%1, 300 North
Zeeb Road, Ann Arbor, Michigan 48106.

%3PVG-16 (STAN-CS-80-789):%1
.once preface 0
@%2Ada Exceptions: Specification and Proof Techniques%1
by D. C. Luckham and W. Polak
(19 pages, February 1980)

@A method of documenting exception propagation and handling in Ada programs
is proposed.  Exception propagation declarations  are introduced as a  new
component of  Ada specifications.   This  permits documentation  of  those
exceptions that can be propagated by a subprogram.  Exception handlers are
documented by entry assertions.  Axioms and proof rules for Ada exceptions
are given.  These rules are simple extensions of previous rules for Pascal
and define an axiomatic semantics of Ada exceptions.  As a result, Ada
programs specified according to the method can be analysed by formal proof
techniques for consistency with their specifications, even if they employ
exception propagation and handling to achieve required results (i.e., non-error
situations).  Example verifications are given.
.begin nofill
↔Hardcopy $2.25
↔Microfiche (Free)

%3STAN-CS-80-790:%1
.once preface 0
@%2Databases in Healthcare%1 by Gio Wiederhold
(76 pages, March 1980)

@This report defines database design and implementation technology as
applicable to healthcare.  The relationship of the technology to various
healthcare settings is explored, and its effect on healthcare
costs, quality, and access is evaluated.  A summary of relevant development
directions is included.

@Detailed examples of 5 typical applications (public health, clinical
trials, clinical research, ambulatory care, and hospitals) are appended.
There is an extended bibliography.
.begin nofill
↔Hardcopy $3.85
↔Microfiche (Free)

%3STAN-CS-80-791 (CSL TR-166):  %1
.once preface 0
@%2MAINSAIL Language Manual%1 
by Clark R. Wilcox, Mary L. Dageforde, and Gregory A. Jirak
(247 pages, March 1980)

@MAINSAIL is a programming system designed for the development of large,
portable software systems.  The compiler and runtime system are themselves
written in MAINSAIL, so that only code generators and a few well-defined 
machine and operating-system dependent  routines need be rewritten to move
MAINSAIL to a new environment.  The language provides single and double
precision integer, floating point and "bits" data types, as well as 
descriptor-based strings, record pointers, and low-level data types "address"
and "charadr" (character address).  All data structures are dynamically
allocated and automatically garbage collected when necessary to reclaim
storage space.  A novel approach to modules and intermodule communication
supports modules that are position-independent, self-loading, and dynamically
linked, thereby avoiding any reliance on machine-dependent loaders and linkers.
A machine-independent file system model is supported across all implementations,
so that portable programs may utilize sequential and random file access
for both text and "binary" data.  An extensive set of compile-time features,
including macros, source file inclusion, symbol table save and restore, and
conditional compilation, supports the development of software systems which
may consist of hundreds of modules.  Low-level features, such as explicit
memory allocation, deallocation and access, and in-line assembly code,
provide the ability to bypass the MAINSAIL data structures when necessary.
Such features have made it possible to write even the machine-dependent
parts of the runtime system in MAINSAIL.  An extensive set of runtime
modules is provided as part of the standard environment, including
mathematical routines, file manipulation, i/o, conversions, string
manipulation, character handling, and low-level memory allocation and access.
.begin nofill
↔Hardcopy $8.65
↔Microfiche (Free)

%3STAN-CS-80-792 (CSL TR-167):%1
.once preface 0
@%2MAINSAIL Implementation Overview%1
by Clark R. Wilcox, Mary L. Dageforde, and Gregory A. Jirak
(70 pages, March 1980)

@The MAINSAIL programming language and the supporting implementations have
been developed over the past five years as an integrated approach to a viable
machine-independent system suitable for the development of large, portable
programs.  Particular emphasis has been placed on minimizing the effort
involved in moving the system to a new machine and/or operating system.
For this reason, almost all of the compiler and runtime support is written
in MAINSAIL, and is utilized in each implementation without alteration.
This use of a high-level language to support its own implementation has
proved to be a significant advantage in terms of documentation and
maintenance, without unduly affecting the execution speed.  This paper gives
an overview of the compiler and runtime implementation strategies, and
indicates what an implementation requires for the machine-dependent and
operating-system-dependent parts.
.begin nofill
↔Hardcopy $3.70
↔Microfiche (Free)

%3HPP-80-3 (STAN-CS-80-793):%1
.once preface 0
@%2Representation of Knowledge%1 by Avron Barr and James Davidson
(90 pages, March 1980)

@This report is the section of the Handbook of Artificial Intelligence about
knowledge representation research.  The Handbook is a compendium of articles about
AI ideas, techniques, and systems intended for non-AI scientists, engineers, and
students.  The material in this report discusses the problems addressed in
knowledge representation research in AI and suggests some ways of comparing
the various representation schemes.  Additional articles describe the AI
representation techniques:  logic, procedural representations, semantic nets,
production systems, direct (analogical) representations, semantic primitives, and
frames.  The Handbook will be published later in 1980.  We have distributed this
manuscript before publication with the hope of obtaining comments and suggestions
from the AI community. 
.begin nofill
↔Available in microfiche only. (Free)

%3STAN-CS-80-794:%1
.once preface 0
@%2Recent Developments in the Complexity of Combinatorial Algorithms%1
by Robert Endre Tarjan
(28 pages, June 1980)

@The last three years have witnessed several major advances in the area of
combinatorial algorithms.  These include improved algorithms for matrix
multiplication and maximum network flow, a polynomial-time algorithm for
linear programming, and steps toward a polynomial-time algorithm for graph
isomorphism.  This paper surveys these results and suggests directions for
future research.  Included is a discussion of recent work by the author and his
students on dynamic dictionaries, network flow problems, and related questions.
.begin nofill
↔Hardcopy $2.50
↔Microfiche (Free)

%3STAN-CS-80-795:%1
.once preface 0
@%2The Letter S%1 by Donald E. Knuth
(33 pages, April 1980)

@This expository paper explains how the problem of drawing the letter S leads to
interesting problems in elementary calculus and analytic geometry. It also
gives a brief introduction to the author's METAFONT language for alphabet design.
.begin nofill
↔Hardcopy $2.65
↔Microfiche (Free)
.next page
%3AIM-335 (STAN-CS-80-796):%1
.once preface 0
@%2Essential E%1 by Arthur Samuel
(33 pages, March 1980)

@This is an introductory manual describing the display-oriented text editor
E that is available on the Stanford A.I. Laboratory PDP-10 computer.  The
present manual is intended to be used as an aid for the beginner as well
as for experienced computer users who either are unfamiliar with the E
editor or use it infrequently.  Reference is made to the two on-line manuals
that help the beginner to get started and that provide a complete description
of the editor for the experienced user.

@E is commonly used for writing computer programs and for preparing reports
and memoranda.  It is not a document editor, although it does provide some
facilities for getting a document into a pleasing format.  The primary emphasis
is that of speed, both in terms of the number of key strokes required of the user
and in terms of the demands made on the computer system.  At the same time, E
is easy to learn and it offers a large range of facilities that are not available
on many editors.
.begin nofill
↔Hardcopy $2.65
↔Microfiche (Free)

%3STAN-CS-80-797:%1
.once preface 0
@%2Read-Only Transactions in a Distributed Database%1
by Hector Garcia-Molina and Gio Wiederhold
(25 pages, April 1980)

@A read-only transaction or query is a transaction which does not modify
any data.  Read-only transactions could be processed with general transaction
processing algorithms, but in many cases it is more efficient to process
read-only transactions with special algorithms which take advantage of the
knowledge that the transaction only reads.  This paper defines the various
consistency and currency requirements that read-only transactions may have.
The processing of the different classes of read-only transactions in a distributed
database is discussed.  The concept of %2R%1-insularity is introduced to
characterize both the read-only and update algorithms.  Several simple update
and read-only transaction processing algorithms are presented to illustrate
how the query requirements and the update algorithms affect the read-only
transaction processing algorithms.
.begin nofill
↔Hardcopy $2.40
↔Microfiche (Free)

%3STAN-CS-80-798:%1
.once preface 0
@%2The Compilation of Regular Expressions into Integrated Circuits%1
by Robert W. Floyd and Jeffrey D. Ullman
(28 pages, April 1980)

@We consider the design of integrated circuits to implement arbitrary regular
expressions.  In general, we may use the McNaughton-Yamada algorithm to convert
a regular expression of length %2n%1 into a nondeterministic finite automaton with
at most %22n%1 states and %24n%1 transitions.  Instead of converting the
nondeterministic device to a deterministic one, we propose two ways of 
implementing the nondeterministic device directly.  First, we could produce
a PLA (programmable logic array) of approximate dimensions %24n %4x%2 4n%1
by representing the states directly by columns, rather than coding the states
in binary.  This approach, while theoretically suboptimal, makes use of
carefully developed technology and, because of the care with which PLA
implementation has been done, may be the preferred technique in many real
situations.  Another approach is to use the hierarchical structure of the
automaton produced from the regular expression to guide a hierarchical
layout of the circuit.  This method produces a circuit %2O(√n)%1 on a side
and is, to within a constant factor, the best that can be done in general.
.begin nofill
↔Hardcopy $2.50
↔Microfiche (Free)
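
The abstract's idea of implementing the nondeterministic device directly,
rather than converting to a deterministic automaton, can be sketched in
modern Python (an illustrative rendering with invented names, not the
paper's construction): build an NFA from a regular expression with a
Thompson-style construction, then simulate it by tracking the set of
active states.

```python
def build_nfa(postfix):
    """Thompson-style construction from a postfix regular expression
    ('.': concatenation, '|': union, '*': star, anything else: literal).
    Returns (start, accept, eps, delta)."""
    eps = {}      # state -> set of epsilon-successors
    delta = {}    # (state, symbol) -> set of successors
    counter = [0]

    def new_state():
        s = counter[0]
        counter[0] += 1
        eps.setdefault(s, set())
        return s

    stack = []
    for tok in postfix:
        if tok == '.':                      # concatenation
            s2, a2 = stack.pop()
            s1, a1 = stack.pop()
            eps[a1].add(s2)
            stack.append((s1, a2))
        elif tok == '|':                    # union
            s2, a2 = stack.pop()
            s1, a1 = stack.pop()
            s, a = new_state(), new_state()
            eps[s] |= {s1, s2}
            eps[a1].add(a)
            eps[a2].add(a)
            stack.append((s, a))
        elif tok == '*':                    # Kleene star
            s1, a1 = stack.pop()
            s, a = new_state(), new_state()
            eps[s] |= {s1, a}
            eps[a1] |= {s1, a}
            stack.append((s, a))
        else:                               # literal symbol
            s, a = new_state(), new_state()
            delta.setdefault((s, tok), set()).add(a)
            stack.append((s, a))
    start, accept = stack.pop()
    return start, accept, eps, delta

def closure(states, eps):
    """Epsilon-closure of a set of states."""
    stack, seen = list(states), set(states)
    while stack:
        for t in eps.get(stack.pop(), ()):
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def accepts(nfa, text):
    """Simulate the NFA directly: keep every active state at once."""
    start, accept, eps, delta = nfa
    current = closure({start}, eps)
    for ch in text:
        nxt = set()
        for s in current:
            nxt |= delta.get((s, ch), set())
        current = closure(nxt, eps)
    return accept in current

# (a|b)*abb in postfix form:
nfa = build_nfa(['a', 'b', '|', '*', 'a', '.', 'b', '.', 'b', '.'])
print(accepts(nfa, "babb"))   # -> True
```

The construction creates a constant number of states per token, mirroring
the linear state bound cited in the abstract; the subset simulation is what
the proposed PLA realizes in hardware, one column per state.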

%3STAN-CS-80-799 (SLAC PUB-2504):%1
.once preface 0
@%2Multidimensional Additive Spline Approximation%1
by Jerome H. Friedman, Eric Grosse, and Werner Stuetzle
(22 pages, May 1980)

@We describe an adaptive procedure that approximates a function of many
variables by a sum of (univariate) spline functions %2s↓m%1 of selected linear
combinations %2a↓m · x%1 of the coordinates:
%2f(x) ≈ Σ↓m s↓m(a↓m · x)%1.
The procedure is nonlinear in that not only the spline coefficients but also
the linear combinations are optimized for the particular problem.  The sample
need not lie on a regular grid, and the approximation is affine invariant,
smooth, and lends itself to graphical interpretation.  Function values,
derivatives, and integrals are cheap to evaluate.
.begin nofill
↔Available in microfiche only. (Free)

%3STAN-CS-80-800:%1
.once preface 0
@%2Deciphering a Linear Congruential Encryption%1 by Donald E. Knuth
(10 pages, April 1980)

@It is shown that the multiplier, increment, and starting value of a linear
congruential random number generator on a binary computer can be deduced
from the leading bits of the "random" numbers generated.
.begin nofill
↔Hardcopy $2.00
↔Microfiche (Free)
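
The report treats the hard case where only leading bits are revealed.  As a
much simplified warm-up (an invented illustration in modern Python, not the
report's method), if an attacker sees %2full%1 consecutive outputs of a
linear congruential generator with known modulus, the multiplier and
increment fall out of two linear congruences.

```python
def lcg(a, c, m, seed):
    """Linear congruential generator x' = (a*x + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

def recover(x0, x1, x2, m):
    """Recover (a, c) from three consecutive full outputs, when x1 - x0
    is invertible mod m; returns None otherwise."""
    d = (x1 - x0) % m
    try:
        inv = pow(d, -1, m)          # modular inverse (Python 3.8+)
    except ValueError:
        return None
    a = ((x2 - x1) * inv) % m        # x2 - x1 = a*(x1 - x0) mod m
    c = (x1 - a * x0) % m
    return a, c

m = 2**31
gen = lcg(1103515245, 12345, m, seed=42)
x0, x1, x2 = next(gen), next(gen), next(gen)
print(recover(x0, x1, x2, m))        # -> (1103515245, 12345)
```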

%3STAN-CS-80-801:%1
.once preface 0
@%2On the Design, Use, and Integration of Data Models%1
by Ramez Aziz El-Masri (Thesis, 228 pages, May 1980)

@In spite of the large number of data models that have been proposed, no
consensus exists about what concepts are really important for data modelling.
Presentations of the various models cite the advantages of their approach,
and discuss what the model can best represent.  The advantages cited vary
widely, and are usually very difficult to measure.  How can we measure the advantages
of different approaches when they include ease of representation, ease of implementation,
naturalness of expression, power of query language, non-procedurality of query language,
expression of structural constraints, efficiency of implementation, formal foundations,
support of existing implemented models, etc.?

@Another problem which we encounter when we attempt to compare models is that
not all models are defined at the same level.  For example, hierarchical and network
models include numerous implementation details.  The relational model does not include
any inter-relation constraints.  Although some real-world constraints are captured by
functional, multi-valued, and other dependencies, no systems represent these constraints
except the functional dependencies that result from the primary keys of relations.  The
entity-relationship model is claimed to be a real-world model, rather than an information
structure model, as are semantic and binary relationship models -- does that imply
these models cannot or should not be used for database implementation?

@In this thesis, we are concerned with data models at the information structure
level, rather than at the real-world semantic level.  However, we want the information
structure data model that we use to be able to capture many of the semantic concepts
of the real-world that it models, as well as to support a query language which is close
to the way users perceive the real-world to be organized (to a certain extent).  Hence,
our first task is to define the real-world semantics that we want to correctly represent
in our model.

@In Chapter 2 of the thesis, we define the real-world concepts we want to model.  The
concepts in this chapter are mostly from the literature of the semantic database models,
and include the majority of concepts discussed therein.  For each concept discussed,
we will specify the origin (as far as we know) of that concept.  The characterization
of structural properties of relationships in Section 2.2, though, is partly our own.  The
real-world concepts presented in Chapter 2 can also be used as a basis for comparing
the representational powers of various models, although we do not attempt to compare
models in this thesis.

@Next, in Chapter 3, we discuss the information structure model that we use
in this thesis, the %2Structural Model%1.  The structural model is based upon the relational
model.  The main extension to the relational model is the concept of %2logical%1
or %2structural connections%1 between relations.  The connections are used to represent
structural properties in relationships as well as subclass concepts, which we will discuss
in Chapter 2.  The ideas behind this model originated from the model in Wiederhold,
Chapter 7.  However, the formal definitions are our own.

@Connections are also used to define a simple, object-oriented language for query
and update specification which is independent from the model structure to some extent.
The language structure is quite simple, and we believe consistent with our desire for a
high-level language that matches user expectations of the real-world (although we cannot
prove this).  We present the Structural Model Query Language (SMQL) in Chapter 4,
as well as the language used for defining a structural model for a user, the Structural
Model Definition Language (SMDL).

@In Chapter 5, we show how the structural model can correctly represent the real-world
concepts discussed in Chapter 2, using the small set of concepts of the model
(domains, attributes, relations, and connections).  The simplicity of the relational model
is maintained to a certain extent, since the only additional concept introduced is that
of connections.  As we shall see, the value of these connections is twofold.  First, they
are used to represent the structural properties of relationships in the model.  Second,
they allow us to define the query language by giving names to the connections in both
directions.  A third advantage, not concerned with modelling, but just as important,
is that system implementors are aware of these logical connections, and can refer to
them during file and access path implementation.  The implementors are also made
aware of the structural constraints specified by the connections, so they can consider
efficient methods of checking these constraints.  This is not possible if these constraints
are defined in a general integrity assertion subsystem, separately from the model as in
a generalized relational system.

@The structural model also allows us to integrate different user data models to form
an integrated database model which correctly supports the different user models.  This
allows the three-level database management system architecture discussed in Section
1.3.  Data model integration, or view integration, has recently been discussed as a
possible improvement on the traditional method for logical database design.
However, integration has not been considered in detail.  This thesis is the first
attempt to examine the way in which data models can differ, and to discover methods
to integrate the data models of different users.  We will discuss data model integration
in Chapter 6.

@Finally, we give our conclusions and directions for future work in Chapter 7.
.begin nofill
↔Hardcopy $5.00
↔Microfiche (Free)

%3PVG-17 (STAN-CS-80-802):%1
.once preface 0
@%2Theory of Compiler Specification and Verification%1
by Wolfgang H. Polak (Thesis, 288 pages, May 1980)

@The formal specification, design, implementation, and verification of a
compiler for a Pascal-like language is described.  All components of the
compilation process such as scanning, parsing, type checking, and code generation
are considered.

@The implemented language contains most control structures of Pascal,
recursive procedures and functions, and jumps.  It provides user-defined data
types including arrays, records, and pointers.  A simple facility for input-output
is provided.

@The target language assumes a stack machine including a display mechanism to
handle procedure and function calls.

@The compiler itself is written in Pascal Plus, a dialect of Pascal accepted
by the Stanford verifier.  The Stanford verifier is used to give a complete
formal machine checked verification of the compiler.

@One of the main problem areas considered is the formal mathematical treatment
of programming languages and compilers suitable as input for automated program
verification systems.

@Several technical and methodological problems of mechanically verifying large
software systems are considered.  Some new verification techniques are
developed, notably methods to reason about pointers, fixed points, and
quantification.  These techniques are of general importance and are not limited
to compiler verification.

@The result of this research demonstrates that construction of large correct
programs is possible with the existing verification technology.  It indicates
that verification will become a useful software engineering tool in the future.
Several problem areas of current verification systems are pointed out and areas
for future research are outlined.
.begin nofill
↔Hardcopy $5.75
↔Microfiche (Free)

%3STAN-CS-80-803:%1
.once preface 0
@%2A Language for Typesetting Graphics%1
by Christopher John Van Wyk (Thesis, 59 pages, May 1980)

@This dissertation presents some new constructs for a
programming language in which two-dimensional figures can be described.
The language is intended to work with existing
text-formatting systems so that text and figures can be
typeset at the same time.
Another application for the language is to describe VLSI
circuit layout.

@The building block for figure language programs is called a
"box"; it shares some features with procedures and some
with records in general-purpose programming languages.
A box includes a system of simultaneous equations that
declares the relative positions of its significant points
and requests for actions to be performed at those points:
for example, a user may ask for lines drawn between, circles
drawn around, or boxes placed at selected points.

@A box is called by adding enough equations to the system in
its definition that its significant points can be solved for.
A box call may also include additional actions to be
performed during the call (say, extra lines or circles to be
drawn).  The notions of drawing a line using a pattern (perhaps
dashed or dotted) and texturing a polygonal area (for
example, by cross-hatching) follow directly from the
mechanism for defining and calling boxes.

@Users may also ask for a box to be opaque so that it blots
out pieces of picture that it covers.
Two commands that correspond to the informal notion of
sketching serve to localize this blotting effect when it
could ruin parts of the picture that are wanted.

@Good algorithms implementing these instructions for boxes
composed of straight lines and circular arcs are presented
along with methods to detect insufficiency or inconsistency
in the simultaneous equations.
Extensions beyond the ruler-and-compass model are explored
enough to suggest both theoretical and practical reasons to
limit the kinds of curves and functions the system should
deal with.
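
The simultaneous-equation mechanism described above can be sketched in modern
Python (which postdates this dissertation; all names here are invented for
illustration): significant points are variables, a box definition and a box
call each contribute linear equations, and Gaussian elimination both solves
the system and detects the insufficiency or inconsistency mentioned above.

```python
from fractions import Fraction

def solve(equations, variables):
    """Gaussian elimination over the rationals.
    equations: list of (coeffs, rhs) pairs, coeffs a {variable: coefficient}
    dict, meaning sum(coeff * var) == rhs.  Raises ValueError if the
    system is underdetermined or inconsistent."""
    n = len(variables)
    rows = [[Fraction(eq.get(v, 0)) for v in variables] + [Fraction(rhs)]
            for eq, rhs in equations]
    row = 0
    for col in range(n):
        piv = next((r for r in range(row, len(rows)) if rows[r][col] != 0), None)
        if piv is None:
            raise ValueError("underdetermined: %s is not pinned down" % variables[col])
        rows[row], rows[piv] = rows[piv], rows[row]
        pv = rows[row][col]
        rows[row] = [x / pv for x in rows[row]]
        for r in range(len(rows)):
            if r != row and rows[r][col] != 0:
                f = rows[r][col]
                rows[r] = [x - f * y for x, y in zip(rows[r], rows[row])]
        row += 1
    if any(r[n] != 0 for r in rows[row:]):
        raise ValueError("inconsistent equations")
    return {v: rows[i][n] for i, v in enumerate(variables)}

# A tiny "box": the north and south points of a vertical segment of
# height 10.  The definition relates the points; the call pins south
# at the origin, which is just enough to solve for everything.
definition = [
    ({'n_x': 1, 's_x': -1}, 0),    # north is directly above south
    ({'n_y': 1, 's_y': -1}, 10),   # ... at height 10
]
call = [({'s_x': 1}, 0), ({'s_y': 1}, 0)]
points = solve(definition + call, ['n_x', 'n_y', 's_x', 's_y'])
print(points['n_y'])   # -> 10
```

Leaving out the call's equations raises the "underdetermined" error, and
pinning the south point at two different places raises "inconsistent" --
the two failure modes the dissertation's algorithms must detect.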

@An appendix describes an implementation of the language that
was used to produce the dissertation.
.begin nofill
↔Hardcopy $3.35
↔Microfiche (Free)

%3STAN-CS-80-804:%1
.once preface 0
@%2Determining Correctness by Testing%1
by Martin Brooks (Thesis, 135 pages, May 1980)

@This report presents a mathematical theory of program testing which
describes how testing can be used to show a program's correctness.
We study the situation where
the programmer
has written a program P which is either correct or which differs
from the correct program by one or more errors in a specified class E.
The theory relates:
.begin indent 8,8,8
The combinatorics of how errors in E could have led to P being written.

The type of program behavior that the programmer will observe when testing; for
example the program's output or a trace.

The characteristics of test data which is guaranteed to reveal
the presence of errors in P.

@The theory is developed at a general level and is then specialized
to the case of recursive programs having only certain types of simple errors, where the
programmer observes a trace of each test run. 
We derive a method for generating test data for recursive programs.
The generated test data is %2complete%1,
meaning that if the only errors in the recursive program are of the specified simple types,
then the test data will
reveal the errors through at least one incorrect trace. Viewed contrapositively,
if one assumes that the program differs from
the correct program at most by errors of the specified simple types,
then if the trace of each
test run is as the programmer intended, then the program must be correct.
The test data generation method has been implemented and examples
of its operation are given.
.begin nofill
↔Hardcopy $5.50
↔Microfiche (Free)
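
The notion of complete test data can be made concrete with a toy sketch in
modern Python (invented here for illustration; it is not from the thesis):
suppose the program is either a correct two-argument maximum or differs from
it by one error from a known class E, here "wrong comparison operator."  A
test set is complete when every erroneous variant disagrees with the correct
program on at least one test.

```python
import operator

def make_max(cmp):
    """A family of two-argument 'max' programs, parameterized by the
    comparison used -- the error class E is 'wrong comparison'."""
    def prog(a, b):
        return a if cmp(a, b) else b
    return prog

correct = make_max(operator.gt)
# Erroneous variants (operator.ge is omitted: it computes the same values
# as operator.gt, so no test data could ever reveal it).
variants = [make_max(op)
            for op in (operator.lt, operator.le, operator.eq, operator.ne)]

def is_complete(tests):
    """True if every erroneous variant disagrees with the correct
    program on at least one test case."""
    return all(any(v(*t) != correct(*t) for t in tests) for v in variants)

print(is_complete([(2, 1), (1, 2)]))   # -> True
print(is_complete([(1, 2)]))           # -> False
```

Viewed contrapositively, as in the abstract: if a program from this family
behaves correctly on a complete test set, it must be the correct program.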
.next page
%3STAN-CS-80-805:%1
.once preface 0
@%2Modelling the Environment of an Exploring Vehicle by Means of Stereo Vision%1
by Donald B. Gennery (Thesis, 151 pages, June 1980)

@This dissertation describes research involving vision techniques which
would be useful in an autonomous exploring vehicle, such as a Mars rover.  These
techniques produce a description of the surroundings of the vehicle in terms of
the position, size, and approximate shape of objects, and can match such scene
descriptions with others previously produced.  The information produced is thus
useful both for navigation and obstacle avoidance.  The techniques operate by
using three-dimensional data which they can produce by means of stereo vision
from stereo picture pairs or which can be obtained from a laser rangefinder.
The research thus divides conveniently into two portions:  stereo mapping
and three-dimensional modelling and matching.

@The stereo mapping techniques are designed to be suitable for the kind
of pictures that a Mars rover might obtain and to produce the kind of data that
the modelling techniques need.  These stereo techniques are based upon area
correlation and produce a depth map of the scene.  Emphasis is placed upon
extraction of useful data from noisy pictures and upon the estimation of the
accuracy of the data produced.  Included are the following:  a self-calibration
method for computing the stereo camera model (the relative position and
orientation of the two camera positions); a high-resolution stereo correlator
for producing accurate matches with accuracy and confidence estimates, which
includes the ability to compensate for brightness and contrast
changes between the pictures; a search technique for using the correlator to
produce a dense sampling of matched points for a pair of pictures; and the
computation of the distances to the matched points, including the propagation of
the accuracy estimates.
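
The area-correlation idea behind the stereo mapper can be sketched as follows
(a minimal pure-Python illustration with synthetic one-dimensional data; the
report's correlator additionally estimates accuracy and compensates for
brightness and contrast changes, none of which is modelled here): slide a
window from the left image along the corresponding scanline of the right
image and keep the disparity with the highest normalized cross-correlation.

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-length windows."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def best_disparity(left_row, right_row, x, half=2, max_disp=8):
    """Match the window centred at x in the left row against windows
    shifted left by d in the right row; return the best-scoring d."""
    window = left_row[x - half : x + half + 1]
    scores = {}
    for d in range(max_disp + 1):
        lo = x - d - half
        if lo < 0:
            break
        scores[d] = ncc(window, right_row[lo : lo + 2 * half + 1])
    return max(scores, key=scores.get)

# Synthetic scanlines: the feature in the right image sits 3 pixels to
# the left of where it appears in the left image.
left  = [0, 0, 0, 0, 0, 1, 5, 9, 5, 1, 0, 0, 0, 0, 0, 0]
right = left[3:] + [0, 0, 0]
print(best_disparity(left, right, x=7))   # -> 3
```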

@The three-dimensional modelling and matching techniques are
designed to be tolerant of the errors that stereo mapping techniques often
produce.  First, a ground surface finder tries to find a set of points that
form a well-defined smooth surface that lies below most of the other points.
Then, by using this knowledge of the ground surface and knowledge
of the camera viewpoint that produced the points in the scene,
an object finder approximates
the objects that are above the ground by ellipsoids.
Finally, a scene matcher can use the descriptions of scenes in terms of
ellipsoidal objects.  Using a search pruned by probabilities obtained
from Bayes' theorem, it determines the probability that
two scene descriptions refer to the same scene and the linear transformation
needed to bring the two scenes into alignment.

@These techniques have been tried on stereo pictures of the Martian
surface taken by the Viking Lander 1.  The object finder was able to
locate rocks fairly successfully, and the scene matcher was able to
match successfully the resulting scene descriptions.  Examples of these
results are shown.
.begin nofill
↔Hardcopy $
↔Microfiche (Free)

.next page

@In the past, we have kept our report-abstracts mailing list
confidential.  When we received requests to borrow it or to give
others outside our Publications Office access to it, we
turned them down.  We are now
contemplating making our mailing list available to other
academic institutions and professional societies.
We are not, repeat %2not%1, selling your name
and address to some company that will in turn send you tons of junk mail.
The list would be given (loaned) only to other academic institutions and
professional societies that are setting up their own 
reports systems and need assistance in getting started.
We feel doing this would better serve the Computer Science community as a whole,
by giving more people access to a wider selection of information.

@We are, however, giving you the opportunity to choose whether you
want to participate in this.  There is a box provided on the enclosed
order form.  If you %2do not%1 wish to have your name and address
given to other academic institutions and professional societies,
we ask that you please check this box and return the form to us.  
We respect your right to privacy, and will not release any information
you don't want us to.

.next page <<CSL listing>>
.once center
%3Stanford University Computer Systems Laboratory Reports%1

@Listed here are abstracts of the most recent reports published by the
Computer Systems Laboratory at Stanford University.  These reports are
available by writing directly to Naomi Schulman, Computer Systems Laboratory,
ERL-234, Stanford University, Stanford, California 94305.

%3CSL TR-192:%1
.once preface 0
@%2Verifying Network Protocols Using Temporal Logic%1 
by Brent T. Hailpern and Susan S. Owicki (June 1980)

@Programs that implement computer communications protocols can exhibit extremely
complicated behavior, and neither informal reasoning nor testing is reliable
enough to establish their correctness.  In this paper we discuss the application
of program verification techniques to protocols.  This approach is more reliable
than informal reasoning, and has the advantage over formal reasoning based on
finite-state models that the complexity of the proof does not grow unmanageably
as the size of the program increases.  Certain tools of concurrent program
verification that are especially useful for protocols are presented:  history
variables that record sequences of input and output values, temporal logic for
expressing properties that must hold in a future system state (such as eventual
receipt of a message), and module specification and composition rules.  The use
of these techniques is illustrated by verifying a simple data transfer protocol
from the literature.
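
One tool mentioned above -- history variables recording sequences of input
and output values, checked against a temporal property such as eventual
receipt of a message -- can be illustrated with a small sketch in modern
Python (invented here; the paper itself works with proofs, not runtime
checks of finite histories).

```python
def eventually_received(history):
    """history is a list of ('send', msg) / ('recv', msg) events, i.e. a
    history variable recorded during one finite execution.  True if every
    message sent is received at some later point in the history."""
    for i, (kind, msg) in enumerate(history):
        if kind == 'send':
            if not any(k == 'recv' and m == msg for k, m in history[i + 1:]):
                return False
    return True

ok  = [('send', 1), ('recv', 1), ('send', 2), ('recv', 2)]
bad = [('send', 1), ('send', 2), ('recv', 1)]
print(eventually_received(ok))    # -> True
print(eventually_received(bad))   # -> False
```

A proof in temporal logic establishes this property for %2all%1 executions;
the checker above can only confirm it for one recorded history.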

.next page <<CSD order form>>
.once center

@To order reports, change your mailing address, or release your name and address,
complete and return this %2entire form including the mailing label%1 (just the
form, not the entire abstract listing) to the Stanford Department of Computer
Science by September 8, 1980.  To return this form to us, simply fold it so that
our address (on the reverse side of this sheet) shows, staple it, affix the
appropriate postage, and mail.
Please do %2not%1 send any money with your order.  Wait until you receive an
invoice (which will be enclosed with the reports when they are sent).
.once center
Check off the reports you want.
.begin bf
.tabs 3,8,28,43,46,50,70
|B\X.\STAN-CS-80-805\$    \|B\Y.\STAN-CS-80-805\FREE
.begin nofill
Please check %2one%1 of the following boxes and return this form to us.  Thank you.
	|B YES.  You may give my name and address to other academic institutions and
			 professional societies.
	|B NO.  I do not want my name and address released to anyone outside of the Stanford
			Department of Computer Science.