Browse Results

Showing 7,426 through 7,450 of 54,761 results

Handbook of Robust Low-Rank and Sparse Matrix Decomposition: Applications in Image and Video Processing

by Thierry Bouwmans, Necdet Serhat Aybat, and El-Hadi Zahzah

Handbook of Robust Low-Rank and Sparse Matrix Decomposition: Applications in Image and Video Processing shows you how robust subspace learning and tracking by decomposition into low-rank and sparse matrices provide a suitable framework for computer vision applications. Incorporating both existing and new ideas, the book conveniently gives you one-stop access to a number of different decompositions, algorithms, implementations, and benchmarking techniques. Divided into five parts, the book begins with an overall introduction to robust principal component analysis (PCA) via decomposition into low-rank and sparse matrices. The second part addresses robust matrix factorization/completion problems while the third part focuses on robust online subspace estimation, learning, and tracking. Covering applications in image and video processing, the fourth part discusses image analysis, image denoising, motion saliency detection, video coding, key frame extraction, and hyperspectral video processing. The final part presents resources and applications in background/foreground separation for video surveillance. With contributions from leading teams around the world, this handbook provides a complete overview of the concepts, theories, algorithms, and applications related to robust low-rank and sparse matrix decompositions. It is designed for researchers, developers, and graduate students in computer vision, image and video processing, real-time architecture, machine learning, and data mining.
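As a rough, hedged illustration of the low-rank-plus-sparse idea the handbook surveys (a generic sketch, not any specific algorithm from the book), the snippet below splits a data matrix M into a low-rank part L (a static background, say) and a sparse part S (a few foreground entries) using a naive alternating shrinkage scheme; the toy matrix, the shrinkage level tau, and the sparsity weight lam are illustrative assumptions.

```python
# Minimal sketch (not from the handbook): split M into low-rank L plus sparse S,
# i.e. the decomposition M ~= L + S that robust PCA formalises, via a naive
# alternating scheme: singular-value thresholding for L, soft-thresholding for S.
import numpy as np

def rpca_sketch(M, lam=None, tau=None, n_iter=100):
    """Rough robust-PCA-style split of M; lam and tau are illustrative knobs."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))   # common default weight
    tau = tau if tau is not None else 0.1 * np.linalg.norm(M, 2)  # shrinkage level
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # Low-rank update: shrink the singular values of the residual M - S.
        U, sig, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = U @ np.diag(np.maximum(sig - tau, 0.0)) @ Vt
        # Sparse update: entrywise soft-threshold the residual M - L.
        R = M - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam * tau, 0.0)
    return L, S

# Toy example: a rank-one "background" plus a few bright "foreground" entries.
rng = np.random.default_rng(0)
background = np.outer(np.ones(20), rng.random(30))
foreground = np.zeros((20, 30))
foreground[5, 7] = 5.0
foreground[12, 21] = 4.0
L, S = rpca_sketch(background + foreground)
```

In a video-surveillance setting of the kind the handbook's final part covers, each column of M would be one vectorized frame, so L recovers the background and S the moving objects.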

Background Modeling and Foreground Detection for Video Surveillance

by Thierry Bouwmans, Fatih Porikli, Benjamin Höferlin, and Antoine Vacavant

Background modeling and foreground detection are important steps in video processing, used to robustly detect moving objects in challenging environments. This requires effective methods for dealing with dynamic backgrounds and illumination changes, as well as algorithms that meet real-time and low-memory requirements. Incorporating both establish…

Evaluation and Decision Models with Multiple Criteria: Stepping stones for the analyst (International Series in Operations Research & Management Science #86)

by Denis Bouyssou, Thierry Marchant, Marc Pirlot, Alexis Tsoukias, and Philippe Vincke

Formal decision and evaluation models are so widespread that almost no one can pretend not to have used, or suffered the consequences of, one of them. This book is a guide aimed at helping the analyst choose a model and use it consistently. A sound analysis of the techniques is proposed, and the presentation is framed as a "decision aiding methodology" that can be extended to most decision and evaluation models.

Handling Missing Data in Ranked Set Sampling (SpringerBriefs in Statistics)

by Carlos N. Bouza-Herrera

The existence of missing observations is an important aspect to be considered in the application of survey sampling, for example. In human populations they may be caused by a refusal of some interviewees to give the true value of the variable of interest. Traditionally, simple random sampling is used to select samples, and most statistical models are supported by samples selected by means of this design. In recent decades an alternative design, Ranked Set Sampling (RSS), has come into use; in many cases it improves accuracy compared with traditional sampling. A random selection is made with replacement, and the sampled units are then ordered (ranked). The literature on the subject is growing because of the potential of RSS for deriving more effective alternatives to well-established statistical models. In this work, the use of RSS sub-sampling for obtaining information from the non-respondents and different imputation procedures are considered. RSS models are developed as counterparts of well-known simple random sampling (SRS) models. SRS and RSS models for estimation in the presence of missing data are presented and compared both theoretically and using numerical experiments.
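As a hedged sketch of the Ranked Set Sampling selection step that the book builds its missing-data models on (the sub-sampling and imputation procedures themselves are not reproduced here), the snippet below draws m units with replacement for each of m ranks, keeps the i-th order statistic from the i-th set, and compares the resulting mean with a simple-random-sampling mean; the synthetic lognormal population, the set size m=5, and the assumption of perfect ranking are all illustrative.

```python
# Minimal sketch (assumptions: synthetic population, set size m=5, perfect ranking):
# Ranked Set Sampling keeps the i-th smallest value from the i-th randomly drawn set,
# which is the selection scheme the SRS-vs-RSS comparisons start from.
import numpy as np

def ranked_set_sample(population, m, cycles=1, rng=None):
    """Return an RSS sample of size m * cycles drawn from `population`."""
    if rng is None:
        rng = np.random.default_rng()
    sample = []
    for _ in range(cycles):
        for i in range(m):
            # Draw a set of m units with replacement and rank (sort) it.
            candidates = rng.choice(population, size=m, replace=True)
            sample.append(np.sort(candidates)[i])  # keep the i-th order statistic
    return np.array(sample)

# Compare the RSS mean with a plain SRS mean of the same size on a skewed population.
rng = np.random.default_rng(1)
population = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)
rss_mean = ranked_set_sample(population, m=5, cycles=20, rng=rng).mean()
srs_mean = rng.choice(population, size=100, replace=True).mean()
```

Under these toy assumptions the RSS mean typically varies less over repeated draws than the SRS mean of the same size, which is the kind of accuracy gain the book quantifies for its estimators.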

Natural Language Processing and Information Systems: 5th International Conference on Applications of Natural Language to Information Systems, NLDB 2000, Versailles, France, June 28-30, 2000; Revised Papers (Lecture Notes in Computer Science #1959)

by Mokrane Bouzeghoub, Zoubida Kedad, and Elisabeth Metais

This book includes the papers presented at the fifth International Conference on Application of Natural Language to Information Systems (NLDB 2000), which was held in Versailles (France) on June 28-30. Following NLDB95 in Versailles, NLDB96 in Amsterdam, NLDB97 in Vancouver, and NLDB99 in Klagenfurt, NLDB 2000 was a forum for exchanging new research results and trends on the benefits of integrating Natural Language resources in Information System Engineering. Since the first NLDB workshop in 1995 it has become apparent that each aspect of an information system life cycle may be improved by natural language techniques: database design (specification, validation, conflict resolution), database query languages, and application programming that uses new software engineering research (natural language program specifications). As information systems are now evolving into the communication area, the term databases should be considered in the broader sense of information and communication systems. The main new trend in NLDB 2000 is related to the WEB wave: WEB querying, WEB answering, and information retrieval. Among 47 papers submitted from 18 countries, the program committee selected 29 papers to be presented during the conference. Besides these regular papers, two invited talks (given by Prof. Reind P. van de Riet and Prof. Maurice Gross) and a set of posters and demonstrations are also included in these proceedings.

Pricing Interest-Rate Derivatives: A Fourier-Transform Based Approach (Lecture Notes in Economics and Mathematical Systems #607)

by Markus Bouziane

The author derives an efficient and accurate pricing tool for interest-rate derivatives within a Fourier-transform based pricing approach, which is generally applicable to exponential-affine jump-diffusion models.

Phase Space Analysis of Partial Differential Equations (Progress in Nonlinear Differential Equations and Their Applications #69)

by Antonio Bove, Ferruccio Colombini, and Daniele Del Santo

Covers phase space analysis methods, including microlocal analysis, and their applications to physics. Treats the linear and nonlinear aspects of the theory of PDEs. Original articles are self-contained with full proofs; survey articles give a quick and direct introduction to selected topics evolving at a fast pace. An excellent reference and resource for graduate students and researchers in PDEs and related fields.

Advances in Phase Space Analysis of Partial Differential Equations: In Honor of Ferruccio Colombini's 60th Birthday (Progress in Nonlinear Differential Equations and Their Applications #78)

by Antonio Bove, Daniele Del Santo, and M. K. Venkatesha Murthy

This collection of original articles and surveys addresses the recent advances in linear and nonlinear aspects of the theory of partial differential equations. The key topics include operators as "sums of squares" of real and complex vector fields, nonlinear evolution equations, local solvability, and hyperbolic questions.

Methods for the Analysis of Asymmetric Proximity Data (Behaviormetrics: Quantitative Approaches to Human Behavior #7)

by Giuseppe Bove, Akinori Okada, and Donatella Vicari

This book provides an accessible introduction and practical guidelines for applying asymmetric multidimensional scaling, cluster analysis, and related methods to one-mode two-way and three-way asymmetric data. A major objective of this book is to present to applied researchers a set of methods and algorithms for graphical representation and clustering of asymmetric relationships. Data frequently concern measurements of asymmetric relationships between pairs of objects from a given set (e.g., subjects, variables, attributes, …), collected in one or more matrices. Examples abound in many different fields such as psychology, sociology, marketing research, and linguistics, and more recently several applications have appeared in technological areas including cybernetics, air traffic control, robotics, and network analysis. The capabilities of the presented algorithms are illustrated by carefully chosen examples and supported by extensive data analyses. A review of the specialized statistical software available for the applications is also provided. This monograph is highly recommended to readers who need a complete and up-to-date reference on methods for asymmetric proximity data analysis.

Random Walks, Random Fields, and Disordered Systems (Lecture Notes in Mathematics #2144)

by Anton Bovier, David Brydges, Amin Coja-Oghlan, Dmitry Ioffe, and Gregory F. Lawler

Focusing on the mathematics that lies at the intersection of probability theory, statistical physics, combinatorics and computer science, this volume collects together lecture notes on recent developments in the area. The common ground of these subjects is perhaps best described by the three terms in the title: Random Walks, Random Fields and Disordered Systems. The specific topics covered include a study of Branching Brownian Motion from the perspective of disordered (spin-glass) systems, a detailed analysis of weakly self-avoiding random walks in four spatial dimensions via methods of field theory and the renormalization group, a study of phase transitions in disordered discrete structures using a rigorous version of the cavity method, a survey of recent work on interacting polymers in the ballisticity regime and, finally, a treatise on two-dimensional loop-soup models and their connection to conformally invariant systems and the Gaussian Free Field. The notes are aimed at early graduate students with a modest background in probability and mathematical physics, although they could also be enjoyed by seasoned researchers interested in learning about recent advances in the above fields.

Metastability: A Potential-Theoretic Approach (Grundlehren der mathematischen Wissenschaften #351)

by Anton Bovier and Frank den Hollander

This monograph provides a concise presentation of a mathematical approach to metastability, a widespread phenomenon in the dynamics of non-linear systems - physical, chemical, biological or economic - subject to the action of temporal random forces typically referred to as noise, based on potential theory of reversible Markov processes. The authors shed new light on the metastability phenomenon as a sequence of visits of the path of the process to different metastable sets, and focus on the precise analysis of the respective hitting probabilities and hitting times of these sets. The theory is illustrated with many examples, ranging from finite-state Markov chains, finite-dimensional diffusions and stochastic partial differential equations, via mean-field dynamics with and without disorder, to stochastic spin-flip and particle-hop dynamics and probabilistic cellular automata, unveiling the common universal features of these systems with respect to their metastable behaviour. The monograph will serve both as comprehensive introduction and as reference for graduate students and researchers interested in metastability.

Methods of Contemporary Mathematical Statistical Physics (Lecture Notes in Mathematics #1970)

by Anton Bovier, Frank den Hollander, Dima Ioffe, Fabio Martinelli, Karel Netocný, Christina Toninelli, and Marek Biskup

This volume presents a collection of courses introducing the reader to recent progress in the field, with attention paid to laying solid grounds and developing various basic tools. It also presents new results on phase transitions for gradient lattice models.

Fractal Geometry in Architecture and Design (Design Science Collection)

by Carl Bovill

In a broad sense Design Science is the grammar of a language of images rather than of words. Modern communication techniques enable us to transmit and reconstitute images without needing to know a specific verbal sequence language such as the Morse code or Hungarian. International traffic signs use international image symbols which are not specific to any particular verbal language. An image language differs from a verbal one in that the latter uses a linear string of symbols, whereas the former is multidimensional. Architectural renderings commonly show projections onto three mutually perpendicular planes, or consist of cross sections at different altitudes capable of being stacked and representing different floor plans. Such renderings make it difficult to imagine buildings comprising ramps and other features which disguise the separation between floors, and consequently limit the creative process of the architect. Analogously, we tend to analyze natural structures as if nature had used similar stacked renderings, rather than, for instance, a system of packed spheres, with the result that we fail to perceive the system of organization determining the form of such structures. Perception is a complex process. Our senses record; they are analogous to audio or video devices. We cannot, however, claim that such devices perceive.

The Information Theory of Comparisons: With Applications to Statistics and the Social Sciences

by Roger Bowden

This book finds a broad domain of relevance in statistics and the social sciences. Its conceptual development is supported by applications to economics and income distribution, finance, education, demographics and actuarial science, political studies, psychology, and general statistics. Fresh perspectives on directional complexity have generated an informational theory of ‘more versus less’, with representative polar outcomes as good or bad, or rich or poor. New duality metrics for spread and asymmetry have resulted, motivated by internal perspectives on the part of subjects, such as attitudes to their comparative (dis)advantage. This book is a readable review of these developments. Concepts and applications are described in tandem with each other. They consolidate recent contributions to the research literature, augmented with fresh insights and applications. Dynamic extensions include modeling shifting social attitudes, while the broader agenda encompasses topical areas such as subjectivist probability, investment decision making, and income distribution.

Protecting Your Privacy in a Data-Driven World (ASA-CRC Series on Statistical Reasoning in Science and Society)

by Claire McKay Bowen

At what point does the sacrifice of our personal information outweigh the public good? If public policymakers had access to our personal and confidential data, they could make more evidence-based, data-informed decisions that could accelerate economic recovery and improve COVID-19 vaccine distribution. However, access to personal data comes at a steep privacy cost for contributors, especially underrepresented groups. Protecting Your Privacy in a Data-Driven World is a practical, nontechnical guide that explains the importance of balancing these competing needs and calls for careful consideration of how data are collected and disseminated by our government and the private sector. Not addressing these concerns can harm the same communities policymakers are trying to protect through data privacy and confidentiality legislation.

Z User Workshop, London 1992: Proceedings of the Seventh Annual Z User Meeting, London 14–15 December 1992 (Workshops in Computing)

by J. P. Bowen and J. E. Nicholls

The Z notation has been developed at the Programming Research Group at the Oxford University Computing Laboratory and elsewhere for over a decade. It is now used by industry as part of the software (and hardware) development process in both Europe and the USA. It is currently undergoing BSI standardisation in the UK, and has been proposed for ISO standardisation internationally. In recent years researchers have begun to focus increasingly on the development of techniques and tools to encourage the wider application of Z and other formal methods and notations. This volume contains papers from the Seventh Annual Z User Meeting, held in London in December 1992. In contrast to previous years, the meeting concentrated specifically on industrial applications of Z, and a high proportion of the participants came from an industrial background. The theme is well represented by the four invited papers. Three of these discuss ways in which formal methods are being introduced, and the fourth presents an international survey of industrial applications. It also provides a reminder of the improvements which are needed to make these methods an accepted part of software development. In addition the volume contains several submitted papers on the industrial use of Z, two of which discuss the key area of safety-critical applications. There are also a number of papers related to the recently completed ZIP project. The papers cover all the main areas of the project, including methods, tools, and the development of a Z Standard, the first publicly available version of which was made available at the meeting. Finally the volume contains a select Z bibliography, and a section on how to access information on Z through comp.specification.z, the international, computer-based USENET newsgroup. Z User Workshop, London 1992 provides an important overview of current research into industrial applications of Z, and will provide invaluable reading for researchers, postgraduate students and also potential industrial users of Z.

ZB 2000: First International Conference of B and Z Users York, UK, August 29 - September 2, 2000 Proceedings (Lecture Notes in Computer Science #1878)

by Jonathan P. Bowen, Steve Dunne, Andy Galloway, and Steve King

This book constitutes the refereed proceedings of the First International Conference of B and Z Users, ZB 2000, held in York, UK, in August/September 2000. The 25 revised full papers presented together with four invited contributions were carefully reviewed and selected for inclusion in the book. The book documents the recent advances for the Z formal specification notation and for the B method; the full scope, ranging from foundational and theoretical issues to advanced applications, tools, and case studies, is covered.

ZUM '98: 11th International Conference of Z Users, Berlin, Germany, September 24-26, 1998, Proceedings (Lecture Notes in Computer Science #1493)

by Jonathan P. Bowen, Andreas Fett, and Michael G. Hinchey

In a number of recent presentations – most notably at FME'96 – one of the foremost scientists in the field of formal methods, C.A.R. Hoare, has highlighted the fact that formal methods are not the only technique for producing reliable software. This seems to have caused some controversy, not least amongst formal methods practitioners. How can one of the founding fathers of formal methods seemingly denounce the field of research after over a quarter of a century of support? This is a question that has been posed recently by some formal methods skeptics. However, Prof. Hoare has not abandoned formal methods. He is reiterating, albeit more radically, his 1987 view that more than one tool and notation will be required in the practical, industrial development of large-scale complex computer systems; and not all of these tools and notations will be, or even need be, formal in nature. Formal methods are not a solution, but rather one of a selection of techniques that have proven to be useful in the development of reliable complex systems, and to result in hardware and software systems that can be produced on time and within a budget, while satisfying the stated requirements. After almost three decades, the time has come to view formal methods in the context of overall industrial-scale system development, and their relationship to other techniques and methods. We should no longer consider the issue of whether we are "pro-formal" or "anti-formal", but rather the degree of formality (if any) that we need to support in system development. This is a goal of ZUM'98, the 11th International Conference of Z Users, held for the first time within continental Europe in the city of Berlin, Germany.

High-Integrity System Specification and Design (Formal Approaches to Computing and Information Technology (FACIT))

by Jonathan P. Bowen and Michael G. Hinchey

Errata, detected in Taylor's Logarithms. London: 4to, 1792. [sic] 14.18.3 6 Kk Co-sine of 3398 3298 - Nautical Almanac (1832) In the list of ERRATA detected in Taylor's Logarithms, for cos. 4° 18'3", read cos. 14° 18'2". - Nautical Almanac (1833) ERRATUM of the ERRATUM of the ERRATA of TAYLOR'S Logarithms. For cos. 4° 18'3", read cos. 14° 18' 3". - Nautical Almanac (1836) In the 1820s, an Englishman named Charles Babbage designed and partly built a calculating machine originally intended for use in deriving and printing logarithmic and other tables used in the shipping industry. At that time, such tables were often inaccurate, copied carelessly, and had been instrumental in causing a number of maritime disasters. Babbage's machine, called a 'Difference Engine' because it performed its calculations using the principle of partial differences, was intended to substantially reduce the number of errors made by humans calculating the tables. Babbage had also designed (but never built) a forerunner of the modern printer, which would also reduce the number of errors admitted during the transcription of the results. Nowadays, a system implemented to perform the function of Babbage's engine would be classed as safety-critical. That is, the failure of the system to produce correct results could result in the loss of human life, mass destruction of property (in the form of ships and cargo) as well as financial losses and loss of competitive advantage for the shipping firm.

Engineering Trustworthy Software Systems: 4th International School, SETSS 2018, Chongqing, China, April 7–12, 2018, Tutorial Lectures (Lecture Notes in Computer Science #11430)

by Jonathan P. Bowen, Zhiming Liu, and Zili Zhang

This volume contains lectures on leading-edge research in methods and tools for use in computer system engineering, presented at the 4th International School on Engineering Trustworthy Software Systems, SETSS 2018, held in April 2018 at Southwest University in Chongqing, China. The five chapters in this volume provide an overview of research at the frontier of theories, methods, and tools for software modelling, design, and verification. The topics covered in these chapters include Software Verification with Whiley, Learning Büchi Automata and Its Applications, Security in IoT Applications, Programming in Z3, and The Impact of Alan Turing: Formal Methods and Beyond. The volume provides a useful resource for postgraduate students, researchers, academics, and engineers in industry who are interested in theory, methods, and tools for the development of trustworthy software.

Unifying Theories of Programming: 6th International Symposium, UTP 2016, Reykjavik, Iceland, June 4-5, 2016, Revised Selected Papers (Lecture Notes in Computer Science #10134)

by Jonathan P. Bowen and Huibiao Zhu

This book constitutes the refereed proceedings of the 6th International Symposium on Unifying Theories of Programming, UTP 2016, held in Reykjavik, Iceland, in June 2016, in conjunction with the 12th International Conference on Integrated Formal Methods, iFM 2016. The 8 revised full papers presented were carefully reviewed and selected from 10 submissions. They deal with the fundamental problem of combining formal notations and theories of programming that define, in various different ways, many common notions, such as abstraction, refinement, choice, termination, feasibility, locality, concurrency, and communication. They also show that despite many differences, such theories may be unified in a way that greatly facilitates their study and comparison.
