Browse Results

Showing 54,451 through 54,475 of 54,682 results

Territorial Inequalities

by Magali Talandier and Josselin Tallec

Spatial planning has long addressed territorial inequalities by focusing first on infrastructure provision at the national scale, and then on economic development at the local scale. Today, the issue is generating new lines of debate with strong political resonances (e.g. Brexit, the French gilets jaunes movement). Interpretations of these movements are often quick and binary: metropolises versus peripheries, cities versus the countryside, the north versus the south, or the east versus the west of the European Union. Territorial Inequalities sheds light on the social, political and operational implications of these divergences. The chapters treat the subject at different scales of action and observation (from the neighborhood to the world), as well as through their interdependences. To deal with such a vast and ambitious theme, the preferred approach is that of territorial development as public policy, namely spatial planning.

Testing Statistical Assumptions in Research

by J. P. Verma and Abdel-Salam G. Abdel-Salam

Comprehensively teaches the basics of testing statistical assumptions in research and the importance of doing so. This book helps researchers check the assumptions of the statistical tests used in their work by explaining why checking assumptions matters, showing how to check them, and describing what to do when they are not met. Testing Statistical Assumptions in Research discusses the concepts of hypothesis testing and statistical errors in detail, as well as power, sample size, and effect size. It introduces SPSS functionality and shows how to segregate data, draw random samples, split files, and create variables automatically. It then covers the assumptions required in survey studies and the importance of sound survey design for reporting reliable findings. The book presents various parametric tests together with their assumptions and shows the procedures for testing those assumptions using SPSS software. To motivate readers to check assumptions, it includes many situations in which violating an assumption affects the findings. Assumptions required for non-parametric tests such as the chi-square, Mann-Whitney, Kruskal-Wallis, and Wilcoxon signed-rank tests are also discussed. Finally, it looks at assumptions in non-parametric correlations, such as biserial correlation, tetrachoric correlation, and the phi coefficient.
This book:
* Is an excellent reference for graduate students and research scholars of any discipline for testing the assumptions of statistical tests before using them in their research
* Shows readers, by means of various illustrations, the adverse effects of violating assumptions on findings
* Describes the assumptions associated with the statistical tests most commonly used by research scholars
* Contains examples using SPSS, which help readers understand the procedures involved in testing assumptions
* Looks at commonly used statistical tests, such as the z, t and F tests, ANOVA, correlation, and regression analysis
Testing Statistical Assumptions in Research is a valuable resource for graduate students of any discipline who write a thesis or dissertation based on empirical studies, as well as for data analysts.
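The book demonstrates assumption checks in SPSS; as a rough illustration of the same workflow, a minimal sketch in Python with SciPy (an assumption of this example, not the book's tooling, and with made-up data) might test normality and equality of variances before choosing between a t-test and its non-parametric fallback:

```python
# Check assumptions before a two-sample comparison, then pick a test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
group_a = rng.normal(loc=50, scale=5, size=30)  # hypothetical measurements
group_b = rng.normal(loc=53, scale=5, size=30)

# Shapiro-Wilk: null hypothesis is that the sample is normally distributed.
_, p_norm_a = stats.shapiro(group_a)
_, p_norm_b = stats.shapiro(group_b)

# Levene: null hypothesis is that the groups have equal variances.
_, p_var = stats.levene(group_a, group_b)

if min(p_norm_a, p_norm_b) > 0.05 and p_var > 0.05:
    _, p = stats.ttest_ind(group_a, group_b)      # assumptions met
else:
    _, p = stats.mannwhitneyu(group_a, group_b)   # non-parametric fallback
print(f"p = {p:.4f}")
```

The point mirrors the book's advice: the choice of test follows from the assumption checks, rather than being fixed in advance.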

Text Mining: Applications and Theory

by Michael W. Berry and Jacob Kogan

Text Mining: Applications and Theory presents state-of-the-art algorithms for text mining from both academic and industrial perspectives. The contributors, drawn from universities, industrial corporations, and government laboratories across several countries and scientific domains, demonstrate the use of techniques from machine learning, knowledge discovery, natural language processing and information retrieval to design computational models for automated text analysis and mining. This volume demonstrates how advances in applied mathematics, computer science, machine learning, and natural language processing can collectively capture, classify, and interpret words and their contexts. As suggested in the preface, text mining is needed when “words are not enough.” This book:
* Provides state-of-the-art algorithms and techniques for critical tasks in text mining applications, such as clustering, classification, anomaly and trend detection, and stream analysis
* Presents a survey of text visualization techniques and looks at the multilingual text classification problem
* Discusses the issue of cybercrime associated with chatrooms
* Features advances in visual analytics and machine learning along with illustrative examples
* Is accompanied by a supporting website featuring datasets
Applied mathematicians, statisticians, practitioners and students in computer science, bioinformatics and engineering will find this book extremely useful.
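The clustering and classification algorithms surveyed here typically rest on representing documents as term vectors and comparing them. A toy term-frequency vectorization with cosine similarity, a generic illustration rather than code from the book, can be sketched as:

```python
# Represent documents as term-count vectors and compare them with cosine similarity.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity of two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)  # Counter returns 0 for missing terms
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm

docs = ["text mining finds patterns in text",
        "mining text for patterns",
        "the weather is sunny today"]
vecs = [Counter(d.split()) for d in docs]
print(round(cosine(vecs[0], vecs[1]), 2))  # → 0.71  (similar topics)
print(round(cosine(vecs[0], vecs[2]), 2))  # → 0.0   (no shared terms)
```

Real systems add weighting (e.g. TF-IDF) and stemming on top of this skeleton, but the vector-space comparison itself is the common core.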

Text Mining in Practice with R

by Ted Kwartler

A reliable, cost-effective approach to extracting priceless business information from all sources of text. Excavating actionable business insights from data is a complex undertaking, and that complexity is magnified by an order of magnitude when the focus is on documents and other text information. This book takes a practical, hands-on approach to teaching you a reliable, cost-effective way of mining the vast, untold riches buried within all forms of text using R. Author Ted Kwartler clearly describes all of the tools needed to perform text mining and shows you how to use them to identify practical business applications, so you can get your creative text mining efforts started right away. With the help of numerous real-world examples and case studies from industries ranging from healthcare to entertainment to telecommunications, he demonstrates how to execute an array of text mining processes and functions, including sentiment scoring, topic modelling, predictive modelling, extracting clickbait from headlines, and more. You’ll learn how to:
* Identify actionable social media posts to improve customer service
* Use text mining in HR to identify candidate perceptions of an organisation, match job descriptions with resumes, and more
* Extract priceless information from virtually all digital and print sources, including the news media, social media sites, PDFs, and even JPEG and GIF image files
* Make text mining an integral component of marketing in order to identify brand evangelists, impact customer propensity modelling, and much more
Most companies’ data mining efforts focus almost exclusively on numerical and categorical data, while text remains a largely untapped resource. Especially in a global marketplace where being first to identify and respond to customer needs and expectations imparts an unbeatable competitive advantage, text represents a source of immense potential value. Unfortunately, until now there has been no reliable, cost-effective technology for extracting analytical insights from the huge and ever-growing volume of text available online, in other digital sources, and in paper documents.
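The book implements workflows such as sentiment scoring in R; as a language-agnostic illustration of the idea, a minimal lexicon-based scorer (the word lists and posts below are hypothetical, not from the book) can be sketched in Python:

```python
# Score documents by counting positive minus negative lexicon hits.
POSITIVE = {"great", "love", "excellent", "helpful", "fast"}
NEGATIVE = {"bad", "slow", "broken", "terrible", "refund"}

def sentiment_score(text: str) -> int:
    """Positive-word count minus negative-word count for one document."""
    tokens = text.lower().split()
    return sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)

posts = [
    "great support, fast refund",
    "terrible app, slow and broken",
]
scores = [sentiment_score(p) for p in posts]
print(scores)  # → [1, -3]
```

Production scorers use curated lexicons, negation handling, and proper tokenization; this skeleton only shows where the signal comes from.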

Textbook of Clinical Trials

by David Machin, Simon Day, and Sylvan Green

Now published in its Second Edition, the Textbook of Clinical Trials offers detailed coverage of trial methodology in diverse areas of medicine in a single comprehensive volume. Praise for the First Edition: "... very useful as an introduction to clinical research, or for those planning specific studies within therapeutic or disease areas." BRITISH JOURNAL OF SURGERY, Vol. 92, No. 2, February 2005. The book’s main concept is to describe the impact of clinical trials on the practice of medicine. It separates the information by therapeutic area because the impact of clinical trials, the problems encountered, and the numbers of trials in existence vary tremendously from specialty to specialty. The sections provide a background to the disease area and general clinical trial methodology before concentrating on the particular problems experienced in that area. Specific examples are used throughout to address these issues. The Textbook of Clinical Trials, Second Edition:
* Highlights the various ways clinical trials have influenced the practice of medicine in many therapeutic areas
* Describes the challenges posed by those conducting clinical trials over a range of medical specialties and allied fields
* Includes additional therapeutic areas to fill gaps in the First Edition as the number and complexity of trials increase in this rapidly developing field
* Newly covers or updates: general surgery, plastic surgery, aesthetic surgery, palliative care, primary care, anaesthesia and pain, transfusion, wound healing, maternal and perinatal health, early termination, organ transplants, ophthalmology, epilepsy, infectious disease, neuro-oncology, adrenal, thyroid and urological cancers, as well as a chapter on the Cochrane network
An invaluable resource for pharmaceutical companies, the Textbook of Clinical Trials, Second Edition also appeals to those working in contract research organizations, medical departments, public health and health science alike.

Textbook of Epidemiology

by Lex Bouter, Maurice Zeegers, and Tianjing Li

The gold standard in epidemiological texts. In the second edition of Textbook of Epidemiology, a distinguished team of researchers delivers an extensively updated and comprehensive exploration of epidemiological methods, illuminating the tools for studying the distribution and risk factors of health states and events in populations. The book offers:
* An introduction to epidemiological methods with recent and broadly applicable examples
* End-of-chapter self-assessment questions for readers to check their understanding of key concepts, with answer keys and further enrichment materials available on a companion website
* A brand-new chapter covering methods for systematic reviews and meta-analysis
* Accessible material appropriate for clinical practitioners and researchers from around the world
Perfect for professionals working in clinical medicine and public health, Textbook of Epidemiology will also earn a place in the libraries of allied health professionals seeking a one-stop resource or looking to re-immerse themselves in specific methodological topics and practices.
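The study of risk factors mentioned above typically starts from simple measures of association; a minimal sketch (with entirely hypothetical cohort counts, chosen here for illustration) of a risk ratio from a 2x2 table:

```python
# Risk ratio from a hypothetical cohort: exposed vs unexposed, cases vs totals.
exposed_cases, exposed_total = 30, 200
unexposed_cases, unexposed_total = 10, 200

risk_exposed = exposed_cases / exposed_total        # 0.15
risk_unexposed = unexposed_cases / unexposed_total  # 0.05

# A risk ratio > 1 suggests the exposure is associated with the outcome.
risk_ratio = risk_exposed / risk_unexposed
print(round(risk_ratio, 2))  # → 3.0
```

Real analyses add confidence intervals and confounder adjustment on top of this arithmetic; the ratio itself is the starting point.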

Theorems, Corollaries, Lemmas, and Methods of Proof (Pure and Applied Mathematics: A Wiley Series of Texts, Monographs and Tracts #82)

by Richard J. Rossi

A hands-on introduction to the tools needed for rigorous and theoretical mathematical reasoning. Successfully addressing the frustration many students experience as they make the transition from computational mathematics to advanced calculus and algebraic structures, Theorems, Corollaries, Lemmas, and Methods of Proof equips students with the tools needed to succeed while providing a firm foundation in the axiomatic structure of modern mathematics. This essential book:
* Clearly explains the relationship between definitions, conjectures, theorems, corollaries, lemmas, and proofs
* Reinforces the foundations of calculus and algebra
* Explores how to use both a direct and an indirect proof to prove a theorem
* Presents the basic properties of real numbers
* Discusses how to use mathematical induction to prove a theorem
* Identifies the different types of theorems
* Explains how to write a clear and understandable proof
* Covers the basic structure of modern mathematics and its key components
A complete chapter is dedicated to the different methods of proof, such as forward direct proofs, proof by contrapositive, proof by contradiction, mathematical induction, and existence proofs. In addition, the author has supplied many clear and detailed algorithms that outline these proofs. Theorems, Corollaries, Lemmas, and Methods of Proof uniquely introduces scratch work as an indispensable part of the proof process, encouraging students to use scratch work and creative thinking as the first steps in their attempt to prove a theorem. Once their scratch work successfully demonstrates the truth of the theorem, the proof can be written in a clear and concise fashion. The basic structure of modern mathematics is discussed, and each of its key components is defined. Numerous exercises are included in each chapter, covering a wide range of topics with varied levels of difficulty.
Intended as a main text for mathematics courses such as Methods of Proof, Transitions to Advanced Mathematics, and Foundations of Mathematics, the book may also be used as a supplementary textbook in junior- and senior-level courses on advanced calculus, real analysis, and modern algebra.

Theoretical Fluid Dynamics

by Bhimsen K. Shivamoggi

"Although there are many texts and monographs on fluid dynamics, I do not know of any which is as comprehensive as the present book. It surveys nearly the entire field of classical fluid dynamics in an advanced, compact, and clear manner, and discusses the various conceptual and analytical models of fluid flow." - Foundations of Physics on the first edition Theoretical Fluid Dynamics functions equally well as a graduate-level text and a professional reference. Steering a middle course between the empiricism of engineering and the abstractions of pure mathematics, the author focuses on those ideas and formulations that will be of greatest interest to students and researchers in applied mathematics and theoretical physics. Dr. Shivamoggi covers the main branches of fluid dynamics, with particular emphasis on flows of incompressible fluids. Readers well versed in the physical and mathematical prerequisites will find enlightening discussions of many lesser-known areas of study in fluid dynamics. This thoroughly revised, updated, and expanded Second Edition features coverage of recent developments in stability and turbulence, additional chapter-end exercises, relevant experimental information, and an abundance of new material on a wide range of topics, including: * Hamiltonian formulation * Nonlinear water waves and sound waves * Stability of a fluid layer heated from below * Equilibrium statistical mechanics of turbulence * Two-dimensional turbulence

Theoretical Foundations of Functional Data Analysis, with an Introduction to Linear Operators (Wiley Series in Probability and Statistics #997)

by Tailen Hsing and Randall Eubank

Theoretical Foundations of Functional Data Analysis, with an Introduction to Linear Operators provides a uniquely broad compendium of the key mathematical concepts and results that are relevant for the theoretical development of functional data analysis (FDA). The self-contained treatment of selected topics of functional analysis and operator theory includes reproducing kernel Hilbert spaces, singular value decomposition of compact operators on Hilbert spaces, and perturbation theory for both self-adjoint and non-self-adjoint operators. The probabilistic foundation for FDA is described from the perspective of random elements in Hilbert spaces as well as from the viewpoint of continuous-time stochastic processes. Nonparametric estimation approaches, including kernel and regularized smoothing, are also introduced. These tools are then used to investigate the properties of estimators for the mean element, covariance operators, principal components, regression function and canonical correlations. A general treatment of canonical correlations in Hilbert spaces naturally leads to FDA formulations of factor analysis, regression, MANOVA and discriminant analysis. This book will provide a valuable reference for statisticians and other researchers interested in developing or understanding the mathematical aspects of FDA. It is also suitable for a graduate-level special topics course.

Theory and Statistical Applications of Stochastic Processes

by Yuliya Mishura and Georgiy Shevchenko

This book is concerned with the theory of stochastic processes and the theoretical aspects of statistics for stochastic processes. It combines classic topics, such as the construction of stochastic processes, associated filtrations, processes with independent increments, Gaussian processes, martingales, Markov properties, and continuity and related properties of trajectories, with contemporary subjects: integration with respect to Gaussian processes, Itô integration, stochastic analysis, stochastic differential equations, fractional Brownian motion and parameter estimation in diffusion models.
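The construction of processes such as Brownian motion can be illustrated numerically by summing independent Gaussian increments; this is a standard simulation sketch (parameters chosen here for illustration), not code from the book:

```python
# Simulate a standard Brownian motion path on [0, T] from Gaussian increments.
import numpy as np

rng = np.random.default_rng(0)
n_steps, T = 1000, 1.0
dt = T / n_steps

# Independent N(0, dt) increments, cumulatively summed into a path from 0.
increments = rng.normal(0.0, np.sqrt(dt), size=n_steps)
path = np.concatenate([[0.0], np.cumsum(increments)])

print(path.shape)  # (1001,)
print(round(float(path[-1]), 3))  # terminal value, ~N(0, T)
```

The same skeleton, with a drift and diffusion coefficient applied to each increment, yields the Euler scheme for the diffusion models whose parameter estimation the book treats.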

Theory of Computation

by George Tourlakis

Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming. Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can and cannot do, from the most general model, the URM (Unbounded Register Machine), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, guiding readers through the modeling and mathematical analysis of computational phenomena and providing insight into both what makes computational processes work and what restrains their ability. Recognizing the importance of acquired practical experience, the book begins with the metatheory of general-purpose computer programs, using URMs as a straightforward, technology-independent model of modern high-level programming languages while also exploring the restrictions of the URM language. Once readers gain an understanding of computability theory, including the primitive recursive functions, the author presents automata and languages, covering the regular and context-free languages as well as the machines that recognize them. Several advanced topics, such as reducibilities, the recursion theorem, complexity theory, and Cook's theorem, are also discussed. Features of the book include:
* A review of basic discrete mathematics, covering logic and induction while omitting specialized combinatorial topics
* A thorough development of the modeling and mathematical analysis of computational phenomena, providing a solid foundation in un-computability
* The connection between un-computability and un-provability: Gödel's first incompleteness theorem
The book provides numerous examples of specific URMs as well as of other programming languages and machine models, including Loop Programs, FAs (Deterministic Finite Automata), NFAs (Nondeterministic Finite Automata), and PDAs (Pushdown Automata). Exercises at the end of each chapter allow readers to test their comprehension of the presented material, and an extensive bibliography suggests resources for further study. Assuming only a basic understanding of general computer programming and discrete mathematics, Theory of Computation serves as a valuable text for courses on the theory of computation at the upper-undergraduate level. The book is also an excellent resource for programmers and computing professionals wishing to understand the theoretical limitations of their craft.
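The finite automata covered in the later chapters can be sketched as a table-driven simulation; this DFA (the alphabet and target language are chosen here for illustration, not taken from the book) accepts binary strings containing an even number of 1s:

```python
# A deterministic finite automaton as a transition table plus a loop.
TRANSITIONS = {("even", "0"): "even", ("even", "1"): "odd",
               ("odd", "0"): "odd",   ("odd", "1"): "even"}

def accepts(word: str) -> bool:
    """Run the DFA on `word`; accept iff we end in the 'even' state."""
    state = "even"                       # start state
    for symbol in word:
        state = TRANSITIONS[(state, symbol)]
    return state == "even"               # 'even' is the sole accepting state

print([accepts(w) for w in ["", "11", "1", "1101"]])  # → [True, True, False, False]
```

The two states encode exactly the one bit of memory the language requires, which is the sense in which finite automata sit at the restricted end of the hierarchy the book builds from URMs downward.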

Theory of Computational Complexity (Wiley Series in Discrete Mathematics and Optimization #58)

by Ding-Zhu Du and Ker-I Ko

Praise for the First Edition: "...complete, up-to-date coverage of computational complexity theory...the book promises to become the standard reference on computational complexity." - Zentralblatt MATH. A thorough revision based on advances in the field of computational complexity and readers’ feedback, the Second Edition of Theory of Computational Complexity presents updates to the principles and applications essential to understanding modern computational complexity theory. The new edition continues to serve as a comprehensive resource on the use of software and computational approaches for solving algorithmic problems and the related difficulties that can be encountered. Maintaining extensive and detailed coverage, Theory of Computational Complexity, Second Edition, examines the theory and methods behind complexity theory, such as computational models, decision tree complexity, circuit complexity, and probabilistic complexity. The Second Edition also features recent developments in areas such as NP-completeness theory, as well as:
* A new combinatorial proof of the PCP theorem based on the notion of expander graphs, a research area in the field of computer science
* Additional exercises at varying levels of difficulty to further test comprehension of the presented material
* End-of-chapter literature reviews that summarize each topic and offer additional sources for further study
Theory of Computational Complexity, Second Edition, is an excellent textbook for courses on computational theory and complexity at the graduate level. The book is also a useful reference for practitioners in the fields of computer science, engineering, and mathematics who utilize state-of-the-art software and computational methods to conduct research.
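NP-completeness, central to the book, rests on the idea that a proposed solution (a certificate) can be verified in polynomial time even when finding one may be hard. A minimal sketch for CNF satisfiability (the formula and assignment below are made-up examples, not from the book):

```python
# Verify a CNF-SAT certificate in time linear in the formula size.
def check_certificate(cnf, assignment):
    """cnf: list of clauses; each clause is a list of ints, where a
    negative integer denotes a negated variable. assignment maps
    variable number -> bool. Returns True iff every clause is satisfied."""
    return all(any((lit > 0) == assignment[abs(lit)] for lit in clause)
               for clause in cnf)

formula = [[1, -2], [2, 3], [-1, -3]]   # (x1 v ~x2)(x2 v x3)(~x1 v ~x3)
cert = {1: True, 2: True, 3: False}
print(check_certificate(formula, cert))  # → True
```

The asymmetry on display, cheap verification versus potentially expensive search, is exactly what the class NP captures.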

Theory of Computational Complexity (Wiley Series in Discrete Mathematics and Optimization)

by Ding-Zhu Du Ker-I Ko

Praise for the First Edition "...complete, up-to-date coverage of computational complexity theory...the book promises to become the standard reference on computational complexity." -Zentralblatt MATH A thorough revision based on advances in the field of computational complexity and readers’ feedback, the Second Edition of Theory of Computational Complexity presents updates to the principles and applications essential to understanding modern computational complexity theory. The new edition continues to serve as a comprehensive resource on the use of software and computational approaches for solving algorithmic problems and the related difficulties that can be encountered. Maintaining extensive and detailed coverage, Theory of Computational Complexity, Second Edition, examines the theory and methods behind complexity theory, such as computational models, decision tree complexity, circuit complexity, and probabilistic complexity. The Second Edition also features recent developments in areas such as NP-completeness theory, as well as: * A new combinatorial proof of the PCP theorem based on the notion of expander graphs, a research area in the field of computer science * Additional exercises at varying levels of difficulty to further test comprehension of the presented material * End-of-chapter literature reviews that summarize each topic and offer additional sources for further study Theory of Computational Complexity, Second Edition, is an excellent textbook for courses on computational theory and complexity at the graduate level. The book is also a useful reference for practitioners in the fields of computer science, engineering, and mathematics who utilize state-of-the-art software and computational methods to conduct research.

The Theory of Distributions: Introduction

by El Mustapha Hassi

Many physical, chemical, biological and even economic phenomena can be modeled by differential or partial differential equations, and the framework of distribution theory is the most efficient way to study these equations. A solid familiarity with the language of distributions has become almost indispensable in order to treat these questions efficiently. This book presents the theory of distributions in as clear a sense as possible while providing the reader with a background containing the essential and most important results on distributions. Together with a thorough grounding, it also provides a series of exercises and detailed solutions. The Theory of Distributions is intended for master’s students in mathematics and for students preparing for the agrégation certification in mathematics or those studying the physical sciences or engineering.
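As a concrete illustration of the framework this blurb describes (a standard textbook example, not quoted from the book itself), the Dirac delta arises as the distributional derivative of the Heaviside step function H. For any test function φ in C_c^∞(ℝ):

```latex
\langle H', \varphi \rangle
  = -\langle H, \varphi' \rangle
  = -\int_{0}^{\infty} \varphi'(x)\,dx
  = \varphi(0)
  = \langle \delta, \varphi \rangle
```

So H' = δ in the sense of distributions, even though H has no classical derivative at 0; this is the kind of computation that makes the language of distributions useful for differential equations whose solutions are not classically differentiable.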

The Theory of Measures and Integration (Wiley Series in Probability and Statistics #591)

by Eric M. Vestrup

An accessible, clearly organized survey of the basic topics of measure theory for students and researchers in mathematics, statistics, and physics. In order to fully understand and appreciate advanced probability, analysis, and advanced mathematical statistics, a rudimentary knowledge of measure theory and like subjects must first be obtained. The Theory of Measures and Integration illuminates the fundamental ideas of the subject, fascinating in their own right, for both students and researchers, providing a useful theoretical background as well as a solid foundation for further inquiry. Eric Vestrup's patient and measured text presents the major results of classical measure and integration theory in a clear and rigorous fashion. Besides offering the mainstream fare, the author also offers detailed discussions of extensions, the structure of Borel and Lebesgue sets, set-theoretic considerations, the Riesz representation theorem, and the Hardy-Littlewood theorem, among other topics, employing a clear presentation style that is both evenly paced and user-friendly. Chapters include: * Measurable Functions * The Lp Spaces * The Radon-Nikodym Theorem * Products of Two Measure Spaces * Arbitrary Products of Measure Spaces Sections conclude with exercises that range in difficulty between easy "finger exercises" and substantial and independent points of interest. These more difficult exercises are accompanied by detailed hints and outlines. They demonstrate optional side paths in the subject as well as alternative ways of presenting the mainstream topics. In writing his proofs and notation, Vestrup targets the person who wants all of the details shown up front.
Ideal for graduate students in mathematics, statistics, and physics, as well as strong undergraduates in these disciplines and practicing researchers, The Theory of Measures and Integration proves both an able primary text for a real analysis sequence with a focus on measure theory and a helpful background text for advanced courses in probability and statistics.
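To illustrate one of the chapter topics listed above (stated here in its standard form, not quoted from the book), the Radon-Nikodym theorem guarantees a density between suitably related measures:

```latex
\text{If } \nu \ll \mu \text{ are } \sigma\text{-finite measures on } (X, \mathcal{A}),
\text{ then there exists a measurable } f \ge 0 \text{ with}
\quad
\nu(A) = \int_{A} f \, d\mu \quad \text{for all } A \in \mathcal{A},
```

and f, written dν/dμ, is unique μ-almost everywhere. This result underlies the definition of conditional expectation in the probability courses the blurb positions the book as preparation for.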

Theory of Preliminary Test and Stein-Type Estimation with Applications (Wiley Series in Probability and Statistics #517)

by A. K. Saleh

Theory of Preliminary Test and Stein-Type Estimation with Applications provides a comprehensive account of the theory and methods of estimation in a variety of standard models used in applied statistical inference. It is an in-depth introduction to the estimation theory for graduate students, practitioners, and researchers in various fields, such as statistics, engineering, social sciences, and medical sciences. Coverage of the material is designed as a first step in improving the estimates before applying full Bayesian methodology, while problems at the end of each chapter enlarge the scope of the applications. This book contains clear and detailed coverage of basic terminology related to various topics, including: * Simple linear model; ANOVA; parallelism model; multiple regression model with non-stochastic and stochastic constraints; regression with autocorrelated errors; ridge regression; and multivariate and discrete data models * Normal, non-normal, and nonparametric theory of estimation * Bayes and empirical Bayes methods * R-estimation and U-statistics * Confidence set estimation

Theory of Probability: A critical introductory treatment (Wiley Series in Probability and Statistics #6)

by Bruno de Finetti

First issued in translation as a two-volume work in 1975, this classic book provides the first complete development of the theory of probability from a subjectivist viewpoint. It proceeds from a detailed discussion of the philosophical and mathematical aspects to a detailed mathematical treatment of probability and statistics. De Finetti’s theory of probability is one of the foundations of Bayesian theory. De Finetti stated that probability is nothing but a subjective analysis of the likelihood that something will happen, and that probability does not exist outside the mind. It is the rate at which a person is willing to bet on something happening. This view is directly opposed to the classicist/frequentist view of the likelihood of a particular outcome of an event, which assumes that the same event could be identically repeated many times over, and the 'probability' of a particular outcome has to do with the fraction of the time that outcome results from the repeated trials.
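The frequentist notion the blurb contrasts with de Finetti's betting-rate view can be sketched in a few lines of Python (an illustrative simulation, not material from the book; the function name and parameters are chosen here for exposition):

```python
import random

def relative_frequency(p_true, n_trials, seed=0):
    """Estimate a probability the frequentist way: as the fraction of
    successes observed over n_trials identically repeated trials of an
    event whose underlying success rate is p_true."""
    rng = random.Random(seed)  # fixed seed so repeated runs agree
    successes = sum(rng.random() < p_true for _ in range(n_trials))
    return successes / n_trials

# As trials accumulate, the relative frequency settles near the
# underlying rate -- the long-run-fraction sense of 'probability'
# that de Finetti's subjectivist account rejects as a foundation.
for n in (10, 1_000, 100_000):
    print(n, relative_frequency(0.3, n))
```

The contrast is exactly the one drawn in the blurb: the simulation only makes sense because the "same event" is repeated identically many times, whereas de Finetti's probability is an individual's betting rate and needs no such repetition.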
