@make(PublicationOrderForm,Number 19,AnnouncementDate "November 1982")
@begin(center)
List Number 19
@end(center)
@Entry(Code "STAN-CS-82-919",
        Title "Mechanisms for Broadcast and Selective Broadcast",
        Author "David W. Wall",
        Price "Not available",
        Mprice "$2.00",
        Note "112 pages",
        Date "June 1980")

This thesis deals with a problem in the effective use of a loosely-coupled store-
and-forward network like the ARPANET. Many applications assume the existence of
a mechanism to send a @i[broadcast] message to every node in the network, or a
@i[selective broadcast] message to a specified group of nodes.  The problem of
routing a message to a single destination in such a network has been considered
in detail; the fundamental work is reported in McQuillan's thesis.  In his
dissertation, Dalal describes several ways in which a broadcast facility might
be provided.  The most attention is given to an algorithm for imposing a minimum
spanning tree on the network and maintaining it in the face of failures and
changing costs, so that a broadcast can be routed along the branches of the MST
at a minimum cost to the network as a whole.

@Entry(Code "STAN-CS-82-924",
	Title "Adaptive Mesh Refinement for Hyperbolic
		Partial Differential Equations",
	Author "Marsha J. Berger",
	Price "$5.70",              
	Mprice "$2.00",
	Note "123 pages",
	Date "August 1982")

This thesis presents a method based on the idea of multiple, component 
grids for the solution of hyperbolic partial differential equations (pde)
using explicit finite difference techniques.  Based upon Richardson-type 
estimates of the local truncation error, refined grids are created or    
existing ones removed to attain a given accuracy for a minimum amount of work.  
In addition, this approach is recursive in that fine grids can themselves 
contain even finer subgrids.  Those grids with finer mesh width in space will
also have a smaller mesh width in time, making this a mesh refinement algorithm
in time and space.

@Entry(Code "STAN-CS-82-925",
	Title "Synthesis of Communicating Processes from Temporal   
	       Logic Specifications",          
	Author "Pierre L. Wolper",
	Price "$5.60",
	Mprice "$2.00",
	Note "120 pages",
 	Date "August 1982")

Most concurrent programs can easily be separated into two parts:  a 
@i[synchronization] part that enforces the necessary constraints on the relative 
timing 
of the execution of the different processes and a @i[functional] part that
actually manipulates the data and performs the computation required of the
program.  Writing the synchronization part of a concurrent program requires a 
lot of attention to intricate details but does not require insight into a
variety of underlying mathematical theories.  These characteristics make the
development of tools for specifying and automatically synthesizing synchronization
code a highly desirable and yet manageable task.

In this thesis, we develop a more expressive version of the  
Temporal Logic (TL) introduced by Pnueli and apply it to the problem           
of specifying and synthesizing synchronization code for programs written 
in the language of Communicating Sequential Processes (CSP) developed by 
Hoare.                                                                 
   
@Entry(Code "STAN-CS-82-926", 
	Title "Principles of Rule-Based Systems",
	Author "Bruce G. Buchanan and Richard O. Duda",     
	Price "$3.75",
	Mprice "$2.00",
	Note "58 pages",
	Date "August 1982")

Rule-based expert systems are surveyed.  The most important considerations
are representation and inference.  Rule-based systems make strong assumptions 
about the representation of knowledge as conditional sentences and about the
control of inference in one of three ways.  The problem of reasoning with
incomplete or inexact information is also discussed, as are several other 
issues regarding the design of expert systems.                      


@Entry(Code "STAN-CS-82-927",
	Title "Combining State Machines and Regular Expressions for
               Automatic Synthesis",
        Author "Jeffrey D. Ullman",
	Price "$2.50",
	Mprice "$2.00",
	Note "14 pages",
	Date "September 1982")

We discuss a system for translating regular expressions into logic equations
or PLA's, with particular attention to how we can obtain the benefits of both
regular expressions and state machines as input languages.  An extended
example of the method is given, and the results of our approach are compared
with hand design; in this example we use less than twice the area of a
hand-designed, machine-optimized PLA.


@Entry(Code "STAN-CS-82-928",
	Title "Automated Ambulatory Medical Record Systems in the U. S.",
        Author "Kuhn, Wiederhold, Rodnick, et al",
	Price "$4.10",
	Mprice "$2.00",
	Note "70 pages",
    	Date "August 1982")

This report presents an overview of the developments in Automated Ambulatory
Medical Records Systems (AAMRS) from 1975 to the present.  A summary of findings
from a 1975 state-of-the-art review is presented along with the current findings
of a follow-up study of a selected number of the AAMRS operating today.  The
studies revealed that effective automated medical record systems have been
developed for ambulatory care settings and that they are now in the process of
being transferred to other sites or users, either privately or as a commercial
product.  Since 1975 there have been no significant advances in systems design.
However, progress has been substantial in terms of achieving production goals.
Even though a variety of systems are commercially available, there is a continuing
need for research and development to improve the effectiveness of the systems in
use today.


@Entry(Code "STAN-CS-82-929",
	Title "Fragments of Relations",
	Author "David Maier and Jeffrey D. Ullman",
	Price "$2.50",
	Mprice "$2.00",
	Note "11 pages",
	Date "June 1982")

We develop a theory of relations that are constructed by the union and selection
operations from fragment relations.  Algorithms for inserting and deleting from
relations that are composed of physical fragments are discussed, and we show
when such insertions and deletions are meaningful. We also show how to find 
an access set for a relation, that is, a set of fragments sufficient to produce
the relation, and we apply the test to the question of how the fragmentation of 
relations interacts with a query on the relation, showing that a selection on
the relation can be implemented by retrieving a set of physical fragments that
forms an access set for another particular relation.


@Entry(Code "STAN-CS-82-930",
	Title "Depth from Edge and Intensity Based Stereo",
	Author "Henry Harlan Baker",
	Price "$6.00",
	Mprice "$2.00",
	Note "307 pages",
	Date "July 1982")

The past few years have seen a growing interest in the application of three-
dimensional image processing.  With the increasing demand for 3-D spatial
information for tasks of passive navigation ([Gennery 1980], [Moravec 1980]),
automatic surveillance ([Henderson 1979]), aerial cartography ([Kelly 1977],
[Panton 1978]), and inspection in industrial automation, the importance of 
effective stereo analysis has been made quite clear.  A particular challenge 
in this area is to provide reliable and accurate depth data for input to 
object or terrain modelling systems (such as ACRONYM [Brooks 1981a]).
This report describes an algorithm for such stereo sensing.  It is founded
on an edge-based line-by-line stereo correspondence scheme, one which
provides this three-dimensional analysis in a fast, robust, and
parallel-implementable way.  Its processing consists of extracting edge
descriptions of a pair of images, linking these edges to their nearest
neighbors to obtain the connectivity structure of the images, matching the
edge descriptions on the basis of local edge measures, and cooperatively
removing those edge pairings formed by the correspondence process which
violate the connectivity
structure of the two images.  A further matching process, using a technique
similar to that used for the edges, is done on the image intensity values 
within intervals defined by the edge correspondence.  The result of the 
processing is a full image array depth map of the scene viewed.


@Entry(Code "STAN-CS-82-931",
	Title "PUFF:  An Expert System for Interpretation of Pulmonary
              Function Data",
	Author "Aikins, Kunz, Shortliffe, and Fallat",
	Price "$2.70",
	Mprice "$2.00",
	Note "22 pages",
	Date "August 1982")

The application of Artificial Intelligence techniques to real-world problems
has produced promising research results, but seldom has a system become a 
useful tool in its domain of expertise.  Notable exceptions are the DENDRAL
(1) and MOLGEN (2) systems.  This paper describes PUFF, a program that
interprets lung function test data and has become a working tool in the
pulmonary physiology lab of a large hospital.  Elements of the problem that
paved the way for its success are examined, as are significant limitations
of the solution that warrant further study.


@Entry(Code "STAN-CS-82-932",
   	Title "Expert Systems Research:  Modeling the Medical Decision
              Making Process",
	Author "Edward H. Shortliffe and Lawrence M. Fagan",
	Price "$2.85",
	Mprice "$2.00",
	Note "27 pages",
	Date "August 1982")

During the quarter century since the birth of the branch of computer science
known as artificial intelligence (AI), much of the research has focused on
developing symbolic models of human inference.  In the last decade several 
related AI research themes have come together to form what is now known as
"expert systems research."  In this paper, we review AI and expert systems
to acquaint the reader with the field and to suggest ways in which this 
research will eventually be applied to advanced medical monitoring.


@Entry(Code "STAN-CS-82-933",
	Title "An Algorithmic Method for Studying Percolation Clusters",
	Author "Shmuel T. Klein and Eli Shamir",
	Price "$2.50",
	Mprice "$2.00",
	Note "13 pages",
	Date "September 1982")

In percolation theory one studies configurations, based on some infinite
lattice, where the sites of the lattice are randomly made F (full) with
probability p or E (empty) with probability 1-p.  For p > p @-[c] the set
of configurations which contain an infinite cluster (a connectivity
component) has probability 1.  Using an algorithmic method and a
rearrangement lemma for Bernoulli sequences, we compute the
boundary-to-body quotient of infinite clusters and prove it has the definite
value (1-p)/p with probability 1.

@b[STAN-CS-82-934 THROUGH STAN-CS-82-939]

These reports are chapters of the @i[Handbook of Artificial Intelligence],
edited by Avron Barr, Paul R. Cohen, and Edward A. Feigenbaum.  The 
three-volume @i[Handbook] has been published by William Kaufmann, Inc. of
Los Altos.  These reports are available in MICROFICHE ONLY.

@Entry(Code "STAN-CS-82-934",
	Title "Understanding Spoken Language",
 	Author "Avron Barr, Paul Cohen and Lawrence Fagan",
	Price "Not available",
	Mprice "$2.00",
	Note "90 pages",
	Date "September 1982")

This report discusses issues in the design of programs that understand
spoken language and the current status of this research in AI.  The systems
that emerged during the ARPA Speech Understanding Research projects, including
HEARSAY, HARPY, HWIN, and the SRI/SDC speech system, are described.
 
@Entry(Code "STAN-CS-82-935",
	Title "Programming Language for AI Research",
	Author "Steve Tappel, Stephen Westfold, and Avron Barr",
	Price "Not available",
	Mprice "$2.00",
	Note "90 pages",
	Date "September 1982")

This report describes the programming-language features and environments
developed by AI researchers---the "tools of thought" in which new ideas
and perspectives on the understanding of cognition are first explored.
Of note here is the extended discussion of LISP--by far the most 
important tool of either kind yet invented in AI.

@Entry(Code "STAN-CS-82-936",
        Title "Models of Cognition",
        Author "Paul R. Cohen",
        Price "Not available",
        Mprice "$2.00",
        Note "87 pages",
        Date "September 1982")

This chapter, written by Paul R. Cohen, discusses AI models of human
memory, belief, and planning and problem solving.  These programs
(e.g. EPAM, GPS, and HAM) were among the earliest developed in AI
and give some insight into the powerful influence of the computer
on the development of AI and cognitive psychology.

@Entry(Code "STAN-CS-82-937",
        Title "Automatic Deduction",
        Author "Ballantyne, Bledsoe, Doyle, Moore, et al",
        Price "Not available",
        Mprice "$2.00",
        Note "64 pages",
        Date "September 1982")

Organized by Janice Aikins, this chapter on automatic deduction, also
called automatic theorem proving, describes resolution and natural-
deduction theorem proving, the Boyer-Moore theorem prover, nonmonotonic
logic, and logic programming.

@Entry(Code "STAN-CS-82-938",
        Title "Vision",
        Author "Takeo Kanade",
        Price "Not available",
        Mprice "$2.00",
        Note "220 pages",
        Date "September 1982")

With contributions by David A. Bourne, Rodney Brooks, Nancy H. Cornelius,
James L. Crowley, Hiromichi Fujisawa, Martin Herman, Fuminobu Komura, Bruce
D. Lucas, Steven A. Shafer, David R. Smith, Steven L. Tanimoto, and Charles
E. Thorpe, this chapter describes all aspects of computer vision, from the 
design and calibration of cameras to preprocessing and edge detection, the
extraction of image features, and the inference of scene characteristics.
It also includes several articles on blocks-world vision research, a section
on algorithms for vision systems, and a section on applications of vision
research.

@Entry(Code "STAN-CS-82-939",
        Title "Planning and Problem Solving",
        Author "Paul R. Cohen",
        Price "Not available",
        Mprice "$2.00",
        Note "61 pages",
        Date "September 1982")

With contributions by Stephen Westfold and Peter Friedland, this chapter 
reviews nonhierarchical planning and continues on to discuss hierarchical 
and least-commitment planning and the refinement of skeletal plans.

@Entry(Code "STAN-CS-82-940",
        Title "The Equivalence of Universal Relation Definitions",
        Author "David Maier, Jeffrey Ullman, and Moshe Vardi",
        Price "$2.85",
        Mprice "$2.00",
        Note "27 pages",
        Date "September 1982")

The universal relation model aims at achieving complete access path 
independence by relieving the user of the need for logical navigation
among relations.  It assumes that for every set of attributes there is a 
basic relationship that the user has in mind.  Two fundamentally different
approaches to the universal relation model have been taken.  The first
approach sees the universal relation as a user view against which the user
poses queries.  Specifically, a representative instance is constructed,
and queries are answered based on its non-null part.  The second approach 
sees the model as having query-processing capabilities that relieve the
user of the need to specify the logical access path.  The relationship
between the user's view and the computation answering a query is a central
issue that systems supporting a universal view of data must handle.

@Entry(Code "STAN-CS-82-941",
        Title "Integrating Local Information to Understand Dialog",
        Author "Paul Alan Martin",
        Price "Not available",
        Mprice "$2.00",
        Note "131 pages",
        Date "March 1981")

The design of computer systems capable of understanding natural language is a
long-standing and important goal of Artificial Intelligence, both for its
practical benefits and for the insights it offers into human cognition.  The
task is immense, and the accomplishments in the field to date do not provide
an adequate framework to divide the problem into cleanly separable pieces.
One portion of the problem that has recently received significant attention
is the study of coherence among the individual utterances in an extended 
monolog or in discourse among speakers.  This dissertation describes a 
conceptual approach to the related problems of coherence and focus in dialog,
and the use of a computer system to study concretely some of the issues and
problems raised by this approach.

@Entry(Code "STAN-CS-82-942",
        Title "Verifying Concurrent Processes Using Temporal Logic",
        Author "Brent T. Hailpern",
        Price "Not available",
        Mprice "$2.00",
        Note "114 pages",
        Date "August 1980")

Concurrent processes can exhibit extremely complicated behavior, and neither
informal reasoning nor testing is reliable enough to establish their
correctness.  In this thesis, we develop a new technique for the verification
of parallel programs.  The technique is stated in terms of axioms and inference
rules, and it is used to prove safety and liveness properties of parallel
programs.  Safety properties are assertions that must be satisfied by the
system state at all times; they are analogous to partial correctness.  Liveness
properties refer to events that will occur in the future, such as program
termination or the eventual receipt of a message.  In addition to the formal
proof rules, we present several heuristics to aid in the preparation of
correctness proofs.

@Entry(Code "STAN-CS-82-943",
        Title "Naming Planar Graphs",
        Author "Donald R. Woods",
        Price "Not available",
        Mprice "$2.00",
        Note "58 pages",
        Date "June 1981")

Given a planar graph, we wish to draw it in the plane so that no edges 
cross.  We might be given a particular planarisation (a specification
giving the faces of the desired drawing) instead of merely the graph, but 
this is not required.  Given a planarisation and a choice of outermost face,
all drawings are in a sense equivalent; indeed, when the drawing is performed
on the surface of a sphere instead of on the plane, even the choice of outermost
face is irrelevant.  To make the problem meaningful, we must introduce
further constraints.  Two general forms of constraints are:  (1) absolute
restrictions of the details of the drawing, that is, disallowing or requiring
certain features, and (2) weighted restrictions, that is, additional objectives
to be met to whatever extent is possible.

We first examine the general problem, and look at various constraints that have 
been found to yield useful drawings in some applications.  We then examine in 
more detail one particular set of constraints, developing a fast algorithm for 
producing drawings meeting those constraints, and proving some theorems relating
to the overall complexity of the problem.  Finally, we look at what results are
known regarding other variations on the general problem.


@send(order <@b{@\Sub-totals@\@Ux(@hsp(.5 inch))@\@Ux(@hsp(.5 inch))}>)
@send(order <@b{@\Publications total@\@Ux(@hsp(.5 inch))}>)
@send(order <@b{@\Sales tax@\@Ux(@hsp(.5 inch))}>)
@send(order <@b{@\Total enclosed@\@Ux(@hsp(.5 inch))}>)