Browse Results

Showing 82,226 through 82,250 of 83,302 results

Fehlertolerierende Rechensysteme: 2. GI/NTG/GMR-Fachtagung / Fault-Tolerant Computing Systems 2nd GI/NTG/GMR Conference / Bonn, 19.–21. September 1984 (Informatik-Fachberichte #84)

by K. E. Grosspietsch and M. Dal Cin

In the last decade of Computer Science development, we can observe a growing interest in fault-tolerant computing. This interest is the result of a rising number of applications where reliable operation of computing systems is an essential requirement. Besides basic research in the field of fault-tolerant computing, there is an increasing number of systems especially designed to achieve fault-tolerance. It is the objective of this conference to offer a survey of present research and development activities in these areas. The second GI/NTG/GMR Conference on Fault-Tolerant Computing Systems has had a preparatory time of about two years. In March 1982, the first GI conference concerning fault-tolerant computing systems was held in Munich. One of the results of the conference was to bring an organizational framework to the FTC community in Germany. This led to the founding of the common interest group "Fault-Tolerant Computing Systems" of the Gesellschaft für Informatik (GI), the Nachrichtentechnische Gesellschaft (NTG), and the Gesellschaft für Meß- und Regelungstechnik (VDI/VDE-GMR) in November 1982. At that time, it was also decided to schedule a biennial conference on fault-tolerant computing systems. One of the goals of this second conference is to strengthen the relations with the international FTC community; thus, the call for papers was extended not only to German-speaking countries, but to other countries as well.

First Book on UNIX™ for Executives

by Yukari Shirota and Tosiyasu L. Kunii

A good introduction to a new product or concept is vital. This is particularly true for a versatile software system such as UNIX. UNIX provides the depth and intelligence to make your computer work hard for you. It will help you create software and help you use your office automation equipment to create and edit documents. For your introduction to UNIX, you want a great little book. That is what this work is meant to be. This book is designed for non-computer specialists, especially for executives, administrators and managers who want to make better use of their software specialists and experts. The way this Springer edition has come to be published is itself a story. Back about 1980, the founder and president of one of the more successful microcomputer companies, Mr. Kazue Ishii of CEC, wanted to start something that would be brilliant, sophisticated, innovative, and which would grow steadily. Out of many proposals, the one he accepted happened to be mine. The proposal was to build a family of network workstations for computer-aided design/manufacturing and office automation. UNIX was to be used as a software generator. But he had a hard time understanding UNIX, what good it is and how good it is ... Spending a significant amount of time with a popular computer columnist, Miss Yukari Shirota, I compiled this book for him. I found this book generally useful for top executives, managers, planners and office administrators whose background is outside software engineering. Dr.

Flexible Assembly Systems: Assembly by Robots and Computerized Integrated Systems

by A.E. Owen

It has become clear in recent years from such major forums as the various international conferences on flexible manufacturing systems (FMSs) that the computer-controlled and -integrated "factory of the future" is now being considered as a commercially viable and technically achievable goal. To date, most attention has been given to the design, development, and evaluation of flexible machining systems. Now, with the essential support of increasing numbers of industrial examples, the general concepts, technical requirements, and cost-effectiveness of responsive, computer-integrated, flexible machining systems are fast becoming established knowledge. There is, of course, much still to be done in the development of modular computer hardware and software, and the scope for cost-effective developments in programming systems, workpiece handling, and quality control will ensure that continuing development will occur over the next decade. However, international attention is now increasingly turning toward the flexible computer control of the assembly process as the next logical step in progressive factory automation. It is here at this very early stage that Tony Owen has bravely set out to encompass the future field of flexible assembly systems (FASs) in his own distinctive, wide-ranging style.

Foundations of Logic Programming (Symbolic Computation)

by J. W. Lloyd

This book gives an account of the mathematical foundations of logic programming. I have attempted to make the book self-contained by including proofs of almost all the results needed. The only prerequisites are some familiarity with a logic programming language, such as PROLOG, and a certain mathematical maturity. For example, the reader should be familiar with induction arguments and be comfortable manipulating logical expressions. Also the last chapter assumes some acquaintance with the elementary aspects of metric spaces, especially properties of continuous mappings and compact spaces. Chapter 1 presents the declarative aspects of logic programming. This chapter contains the basic material from first order logic and fixpoint theory which will be required. The main concepts discussed here are those of a logic program, model, correct answer substitution and fixpoint. Also the unification algorithm is discussed in some detail. Chapter 2 is concerned with the procedural semantics of logic programs. The declarative concepts are implemented by means of a specialized form of resolution, called SLD-resolution. The main results of this chapter concern the soundness and completeness of SLD-resolution and the independence of the computation rule. We also discuss the implications of omitting the occur check from PROLOG implementations. Chapter 3 discusses negation. Current PROLOG systems implement a form of negation by means of the negation as failure rule. The main results of this chapter are the soundness and completeness of the negation as failure rule.
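The unification algorithm treated in Chapter 1, and the occur check whose omission Chapter 2 examines, can be sketched in a few lines. The following is a generic Robinson-style unifier over a hypothetical term representation (strings prefixed with `?` as variables, tuples as compound terms), offered as an illustration rather than code from the book:

```python
def walk(t, s):
    """Follow variable bindings in substitution s until an unbound term is reached."""
    while isinstance(t, str) and t.startswith('?') and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    """Occur check: does variable v appear inside term t under substitution s?"""
    t = walk(t, s)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, arg, s) for arg in t[1:])
    return False

def unify(x, y, s=None):
    """Return a most general unifier of x and y extending s, or None on failure."""
    s = {} if s is None else s
    x, y = walk(x, s), walk(y, s)
    if x == y:
        return s
    if isinstance(x, str) and x.startswith('?'):
        if occurs(x, y, s):          # the occur check many PROLOG systems omit for speed
            return None
        return {**s, x: y}
    if isinstance(y, str) and y.startswith('?'):
        return unify(y, x, s)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y) and x[0] == y[0]:
        for a, b in zip(x[1:], y[1:]):
            s = unify(a, b, s)
            if s is None:
                return None
        return s
    return None
```

For example, `unify(('f', '?X', 'b'), ('f', 'a', '?Y'))` binds `?X` to `a` and `?Y` to `b`, while `unify('?X', ('f', '?X'))` fails because of the occur check; with that check removed, the unifier would instead build a circular binding.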

Fundamentals of Operating Systems

by A. M. Lister

An operating system is probably the most important part of the body of software which goes with any modern computer system. Its importance is reflected in the large amount of manpower usually invested in its construction, and in the mystique by which it is often surrounded. To the non-expert the design and construction of operating systems has often appeared an activity impenetrable to those who do not practise it. I hope this book will go some way toward dispelling the mystique, and encourage a greater general understanding of the principles on which operating systems are constructed. The material in the book is based on a course of lectures I have given for the past few years to undergraduate students of computer science. The book is therefore a suitable introduction to operating systems for students who have a basic grounding in computer science, or for people who have worked with computers for some time. Ideally the reader should have a knowledge of programming and be familiar with general machine architecture, common data structures such as lists and trees, and the functions of system software such as compilers, loaders, and editors. It will also be helpful if he has had some experience of using a large operating system, seeing it, as it were, from the outside.

Fundamentals of Operating Systems (Computer Science Series)

by A. M. Lister

Fundamentals of Programming Languages

by E. Horowitz

" .. .1 always worked with programming languages because it seemed to me that until you could understand those, you really couldn't understand computers. Understanding them doesn't really mean only being able to use them. A lot of people can use them without understanding them." Christopher Strachey The development of programming languages is one of the finest intellectual achievements of the new discipline called Computer Science. And yet, there is no other subject that I know of, that has such emotionalism and mystique associated with it. Thus, my attempt to write about this highly charged subject is taken with a good deal of in my role as professor I have felt the need for a caution. Nevertheless, modern treatment of this subject. Traditional books on programming languages are like abbreviated language manuals, but this book takes a fundamentally different point of view. I believe that the best possible way to study and understand today's programming languages is by focusing on a few essential concepts. These concepts form the outline for this book and include such topics as variables, expressions, statements, typing, scope, procedures, data types, exception handling and concurrency. By understanding what these concepts are and how they are realized in different programming languages, one arrives at a level of comprehension far greater than one gets by writing some programs in a xii Preface few languages. Moreover, knowledge of these concepts provides a framework for understanding future language designs.

Hands-on Word Processing, Teacher's bk

by NA NA

Information Technology and the Computer Network (NATO ASI Subseries F: #6)

by K. G. Beauchamp

1.1 Scope This paper deals with the following subjects: 1. Introduction 2. Feasibility study definition in IT 3. Forming a feasibility study team 4. The feasibility study work 5. The feasibility study report 6. Discussion 1.2 Information Technology (IT) Information was defined as anything sensed by at least one of the human senses that may change the level of one's knowledge. The information may be true or false, sent by premeditation or generated by coincidence, needed by the interceptor or intended to create new needs. The creation of the information may be very costly or free of charge. The information may be an essential need or just a luxury. A piece of information may be of a one-shot nature, e.g., announcing a marriage, or one needing constant updating, e.g., news. Information technology as defined herein means all the types of systems needed to deal with the information, transfer it to any place, store it, adapt it, etc. Information technology is usually based on telecommunications. Telecommunications means a large variety of possibilities. Usually, ITs are based on the creation, updating, processing and transmission of information. The information itself is usually alphanumeric and graphic. Gradually, there is a tendency to step over to what is seen as more natural information, audio and visual.

Interactive Graphics in CAD

by Y. Gardan Lucas

In a society in which the use of information technology is becoming commonplace it is natural that pictures and images produced by electronic means should be increasing in importance as a means of communication. Computer graphics have only recently come to the attention of the general public, mainly through animated drawings, advertisements and video games. The quality of the pictures is often such that, unless informed of the fact, people are unaware that they are created with the help of computers. Some simulations, those developed in connection with the space shuttle for example, represent a great and rapid progress. In industry, computer graphic techniques are used not only for the presentation of business data, but also in design and manufacture processes. Such computer-assisted systems are collectively represented by the acronym CAX. In CAD/CAM (computer-assisted design/manufacture), interactive graphic techniques have attained considerable importance. In CAD/CAM systems a dialogue can be established between the user and the machine using a variety of easy-to-operate communication devices. Due to the recent developments in hardware and software (for modelling, visual display, etc), a designer is now able to make decisions based on the information presented (plans, perspective drawings, graphics, etc) with the help of interactive graphic techniques. These constitute the most visible and perhaps most spectacular aspect of CAD/CAM systems.

International Calibration Study of Traffic Conflict Techniques (NATO ASI Subseries F: #5)

by E. Asmussen

The concept of traffic conflict was initiated in the United States in the 60s and raised a lot of interest in many countries: it was an opening towards the development of a new tool for safety evaluation and the diagnosis of local safety problems. The need for such a tool was great, because of the many situations where accident data was either scarce, unsatisfactory or unavailable. Development of Traffic Conflict Techniques (TCT) started simultaneously in the 70s in several European countries and new studies were also undertaken in the United States, Canada and Israel. The need for international cooperation was rapidly felt, in order to exchange data, compare definitions and check progress. An Association for International Cooperation on Traffic Conflict Techniques (ICTCT) was therefore created, grouping researchers and safety administrators, with the aim of promoting and organising exchange of information and common practical work. Three Traffic Conflict Techniques workshops were organised, in Oslo (1977), Paris (1979) and Leidschendam (1982). A small-scale international experiment of calibration of TCTs was also carried out in Rouen, France, in 1979, and five teams took part in it from France, Germany, Sweden, the United Kingdom and the United States; results of this first experiment were used as a basis for the present enterprise. To be acknowledged as a safety measuring tool, traffic conflict techniques had to be validated in relation to traditional safety indicators such as injury-accidents.

Introducing CAL: A practical guide to writing Computer-Assisted Learning programs

by Keith Hudson

It is often the case - perhaps more often than not - that new ideas arrive long before there is the means to clothe and deliver them. We can think of Leonardo da Vinci's drawings of helicopters and submarines among many other examples. Computer-Assisted Learning (CAL) is an example of an idea which has had a particularly long gestation. As I will illustrate early in the book, the principles of CAL were really first discovered by Socrates. As a formal method of teaching, the Socratic method disappeared for over two millennia until the 1950s. It was then revived in the form of Programmed Learning (PL) which resulted from the research of B. F. Skinner at Harvard University. Even then, PL was premature. In the 1950s and 60s, methods were devised, such as teaching machines and various sorts of PL text books, and there was a mushrooming of PL publishing at that time. For a complex of reasons - economic, logistical and technical - PL also largely disappeared from the mid-60s, although it continued in a few specialized areas of teaching and industrial training. However, during the same period, PL quietly transformed itself into CAL. But the computerized form was not capable of mass dissemination until recently because personal microcomputers did not have sufficient internal memory sizes. That situation has now changed very dramatically and 128K microcomputers are becoming cheap and widely available. Cheap memory chips of 256K and 1024K cannot be far away, either.

Lineares Optimieren: Maximierung — Minimierung (Vieweg-Programmbibliothek Mikrocomputer #14)

by Herbert Mai

This volume of the Vieweg program library deals with the application of different variants of the simplex method to the solution of linear systems of inequalities and/or equations, as used in the mathematical treatment of planning and decision-making. By bringing in pocket computers, even larger problems are to be made reliably computable. The volume is addressed primarily to pupils and students, for whose needs the capacity of powerful programmable pocket calculators is sufficient. The programs presented here were developed for the Hewlett-Packard HP-41 equipped with the Quad module and magnetic card reader. To make it easier for the reader to follow the programs, they are written so that the changes from one program to the next are as small as possible. This is also meant to show a path from initially quite simple programs to more elaborate solution methods. For readers who own other pocket calculators or small computers, the descriptions of the computational methods are chosen so that they too can easily write their own programs for the methods presented here. In addition, this volume is meant to encourage readers to adapt the programs to their own needs and to program other methods of linear optimization as well. With the programmed solution of simple applications of linear programming, the author offers an interesting introduction to this increasingly important field. Particular emphasis is placed on understanding the mathematical background. The Editors
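The simplex variants this volume programs for the HP-41 pocket calculator can also be sketched on any modern machine. The following minimal tableau implementation handles problems in standard form (maximize c·x subject to Ax ≤ b, x ≥ 0, with b ≥ 0 so no phase-1 step is needed); it is a generic illustration of the basic method, not a transcription of the book's calculator programs:

```python
def simplex(c, A, b):
    """Maximize c.x subject to A x <= b, x >= 0, assuming b >= 0 (no phase-1 needed)."""
    m, n = len(A), len(c)
    # Build the tableau: slack variables provide an initial identity basis.
    T = [list(A[i]) + [1.0 if j == i else 0.0 for j in range(m)] + [float(b[i])]
         for i in range(m)]
    T.append([-float(ci) for ci in c] + [0.0] * (m + 1))  # cost row stores -c
    basis = list(range(n, n + m))
    while True:
        # Entering variable: column with the most negative reduced cost.
        col = min(range(n + m), key=lambda j: T[-1][j])
        if T[-1][col] >= -1e-9:
            break  # no improving direction: current basis is optimal
        # Leaving variable: minimum-ratio test over rows with a positive pivot entry.
        rows = [(T[i][-1] / T[i][col], i) for i in range(m) if T[i][col] > 1e-9]
        if not rows:
            raise ValueError("problem is unbounded")
        _, row = min(rows)
        # Pivot: normalize the pivot row, then eliminate the column elsewhere.
        p = T[row][col]
        T[row] = [v / p for v in T[row]]
        for i in range(m + 1):
            if i != row and T[i][col] != 0.0:
                f = T[i][col]
                T[i] = [a - f * r for a, r in zip(T[i], T[row])]
        basis[row] = col
    x = [0.0] * n
    for i, v in enumerate(basis):
        if v < n:
            x[v] = T[i][-1]
    return x, T[-1][-1]  # optimal point and objective value

point, value = simplex([3.0, 2.0], [[1.0, 1.0], [1.0, 3.0]], [4.0, 6.0])
```

The example maximizes 3x + 2y subject to x + y ≤ 4 and x + 3y ≤ 6, reaching the optimum 12 at the vertex (4, 0); the same tableau mechanics underlie the step-by-step calculator routines described in the volume.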

Logic Minimization Algorithms for VLSI Synthesis (The Springer International Series in Engineering and Computer Science #2)

by Robert K. Brayton Gary D. Hachtel C. McMullen Alberto L. Sangiovanni-Vincentelli

The roots of the project which culminates with the writing of this book can be traced to the work on logic synthesis started in 1979 at the IBM Watson Research Center and at the University of California, Berkeley. During the preliminary phases of these projects, the importance of logic minimization for the synthesis of area- and performance-effective circuits clearly emerged. In 1980, Richard Newton stirred our interest by pointing out new heuristic algorithms for two-level logic minimization and the potential for improving upon existing approaches. In the summer of 1981, the authors organized and participated in a seminar on logic manipulation at IBM Research. One of the goals of the seminar was to study the literature on logic minimization and to look at heuristic algorithms from a fundamental and comparative point of view. The fruits of this investigation were surprisingly abundant: it was apparent from an initial implementation of recursive logic minimization (ESPRESSO-I) that, if we merged our new results into a two-level minimization program, an important step forward in automatic logic synthesis could result. ESPRESSO-II was born and an APL implementation was created in the summer of 1982. The results of preliminary tests on a fairly large set of industrial examples were good enough to justify the publication of our algorithms. It is hoped that the strength and speed of our minimizer warrant its Italian name, which denotes both express delivery and a specially-brewed black coffee.
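The flavor of two-level minimization can be conveyed with the merging step of the classical exact method, the Quine-McCluskey procedure, which the heuristic ESPRESSO algorithms treated in this book improve upon. The sketch below is a generic illustration of that exact technique, not the ESPRESSO-II code; cubes are strings over '0', '1' and '-' (don't-care):

```python
from itertools import combinations

def combine(a, b):
    """Merge two cubes (strings over '0', '1', '-') differing in exactly one literal."""
    diff = [i for i, (p, q) in enumerate(zip(a, b)) if p != q]
    if len(diff) == 1 and '-' not in (a[diff[0]], b[diff[0]]):
        i = diff[0]
        return a[:i] + '-' + a[i + 1:]
    return None  # cubes are not adjacent, or their don't-care patterns differ

def prime_implicants(minterms, nbits):
    """Repeatedly merge adjacent cubes; cubes that never merge are prime implicants."""
    cubes = {format(m, '0%db' % nbits) for m in minterms}
    primes = set()
    while cubes:
        merged, used = set(), set()
        for a, b in combinations(sorted(cubes), 2):
            c = combine(a, b)
            if c is not None:
                merged.add(c)
                used.update((a, b))
        primes |= cubes - used   # cubes that merged into nothing larger are prime
        cubes = merged
    return primes
```

For the function covering minterms 0-3 of two variables this collapses everything to the single cube '--' (the constant-1 function), while for XOR (minterms 1 and 2) no merge is possible and both minterms remain prime. ESPRESSO avoids the exponential cost of this exact enumeration by iteratively expanding and reducing a cover heuristically.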
