Browse Results

Showing 5,576 through 5,600 of 100,000 results

Analysis of Integrated and Cointegrated Time Series with R (Use R!)

by Bernhard Pfaff

This book is designed for self-study. The reader can apply the theoretical concepts directly within R by following the examples.

An Analysis of John P. Kotter's Leading Change (The Macat Library)

by Yaamina Salman and Nick Broten

John P. Kotter’s Leading Change: Why Transformation Efforts Fail is a classic of business literature, and an example of high-level analysis and evaluation. In critical thinking, analysis is all about the sequence and features of arguments. When combined with evaluation of the strengths and weaknesses of an argument, it provides the perfect basis for understanding corporate strategies and direction. Kotter applied these skills to his own experiences of coaching large and small businesses through changes aimed at improving their performance. At its heart, Kotter’s conclusion was simple: unsuccessful transformations usually result from poor management decisions. His view was that it was not enough for executives to have management skills. Strong leadership is required, together with a clear process that can be used by all kinds of companies and organizations, no matter what sector they are operating in. Looking at his own successes and failures alike, Kotter used his analytical skills to understand the sequence and features of relevant arguments before evaluating their strengths and distilling them down to identify common mistakes managers make when they try to implement change. This practical application of two core critical thinking skills allowed him to develop an eight-stage model for successful organizational transformation – a model still widely used twenty years on.

Analysis of Large and Complex Data (Studies in Classification, Data Analysis, and Knowledge Organization #0)

by Adalbert F.X. Wilhelm and Hans A. Kestler

This book offers a snapshot of the state-of-the-art in classification at the interface between statistics, computer science and application fields. The contributions span a broad spectrum, from theoretical developments to practical applications; they all share a strong computational component. The topics addressed are from the following fields: Statistics and Data Analysis; Machine Learning and Knowledge Discovery; Data Analysis in Marketing; Data Analysis in Finance and Economics; Data Analysis in Medicine and the Life Sciences; Data Analysis in the Social, Behavioural, and Health Care Sciences; Data Analysis in Interdisciplinary Domains; Classification and Subject Indexing in Library and Information Science. The book presents selected papers from the Second European Conference on Data Analysis, held at Jacobs University Bremen in July 2014. This conference unites diverse researchers in the pursuit of a common topic, creating truly unique synergies in the process.

Analysis of Legal Argumentation Documents: A Computational Argumentation Approach (Translational Systems Sciences #29)

by Hayato Hirata and Katsumi Nitta

This book introduces methods to analyze legal documents such as negotiation records and legal precedents, using computational argumentation theory. First, a method to automatically evaluate argumentation skills from the records of argumentation exercises is proposed. In law school, argumentation exercises are often conducted and many records of them are produced. From each utterance in the record, a pattern of “speech act + factor” is extracted, and argumentation skills are evaluated from the sequences of the patterns, using a scoring prediction model constructed by multiple regression analyses between the appearance patterns and the scoring results. The usefulness of this method is shown by applying it to the example case “the garbage house problem”. Second, a method is proposed of extracting factors (elements that characterize precedents and cases) and legal topoi from individual precedents and using them, as the expression of precedents, to analyze how the pattern of factors and legal topoi appearing in a group of precedents affects the judgment (plaintiff wins/defendant wins). This method has been applied to a group of tax cases. Third, the logical structure of 70 labor cases is described in detail by using factors together with a bipolar argumentation framework (BAF) and an extended argumentation framework (EAF). The BAF describes the logical structure between plaintiff and defendant, and the EAF describes the decision of the judge. By incorporating the legal topoi into the EAF of computational argumentation theory, the combined use of factored BAF and EAF strengthens the analysis of precedents: not only could the argument adopted by the judge be specified, but it was also possible to determine what kind of value judgment was made and to verify its logic. The analysis methods in this book demonstrate the application of logic-based AI methods to the legal domain, and they contribute to the education and training of law school students in logical ways of argumentation.

Analysis of Manufacturing Enterprises: An Approach to Leveraging Value Delivery Processes for Competitive Advantage (The International Series on Discrete Event Dynamic Systems #12)

by N. Viswanadham

Analysis of Manufacturing Enterprises presents a unified and systematic treatment of manufacturing enterprises. These enterprises are networks of companies working in partnership. Such networks are a common occurrence in auto, grocery, apparel, computer and other industries; and competition is among enterprises rather than between individual companies. Thus, for these enterprises (global or local) to succeed, there is a need for systematically designing the enterprise-wide value delivery processes such as the order-to-delivery process, supply chain process, and new product development process. This calls for developing systematic analysis methodologies for evaluating the performance of value delivering processes. Analysis of Manufacturing Enterprises fills this vital need. The first part of the book focuses on foundations of manufacturing enterprises: the generic value delivery process, their performance measures and redesign to meet specifications on lead time and defect levels. The second part provides a clear and comprehensive discussion on new product development, order to delivery, and supply chain processes, which are core processes of a manufacturing enterprise. Analysis of Manufacturing Enterprises is an excellent resource for researchers and professionals in the field of manufacturing engineering.

Analysis of Material Removal Processes (Mechanical Engineering Series)

by Warren R. DeVries

Metal removal processes - cutting and grinding in this book - are an integral part of a large number of manufacturing systems, either as the primary manufacturing process, or as an important part of preparing the tooling for other manufacturing processes. In recent years, industry and educational institutions have concentrated on the metal removal system, perhaps at the expense of the process. This book concentrates on metal removal processes, particularly on the modeling aspects that can either give a direct answer or suggest the general requirements as to how to control, improve or change a metal removal process. This modeling knowledge is more important with automated computer controlled systems than it has ever been before, because quantitative knowledge is needed to design and operate these systems. This senior undergraduate/graduate textbook is aimed at providing the quantitative knowledge, often times at an elementary level, for handling the technological aspects of setting up and operating a metal removal process and interpreting the experience of planning, operating and improving a metal removal process based on rule of thumb approaches.

Analysis of Microdata

by Rainer Winkelmann and Stefan Boes

The availability of microdata has increased rapidly over the last decades, and standard statistical and econometric software packages for data analysis include ever more sophisticated modeling options. The goal of this book is to familiarize readers with a wide range of commonly used models, and thereby to enable them to become critical consumers of current empirical research, and to conduct their own empirical analyses. The focus of the book is on regression-type models in the context of large cross-section samples. In microdata applications, dependent variables often are qualitative and discrete, while in other cases, the sample is not randomly drawn from the population of interest and the dependent variable is censored or truncated. Hence, models and methods are required that go beyond the standard linear regression model and ordinary least squares. Maximum likelihood estimation of conditional probability models and marginal probability effects are introduced here as the unifying principle for modeling, estimating and interpreting microdata relationships. We consider the limitation to maximum likelihood sensible, from a pedagogical point of view if the book is to be used in a semester-long advanced undergraduate or graduate course, and from a practical point of view because maximum likelihood estimation is used in the overwhelming majority of current microdata research. In order to introduce and explain the models and methods, we refer to a number of illustrative applications. The main examples include the determinants of individual fertility, the intergenerational transmission of secondary school choices, and the wage elasticity of female labor supply.
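
A rough sketch of the unifying principle described above, assuming a simple binary-choice setting: the short Python snippet below fits a logit model by maximizing its log-likelihood on simulated data and then computes an average marginal effect. The data, covariate names, and parameter values are invented for illustration and are not taken from the book.

```python
# Illustrative sketch only: a logit model estimated by maximum likelihood on
# simulated microdata, followed by an average marginal effect. The data,
# variable names, and parameter values are invented and not taken from the book.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)                   # hypothetical covariate, e.g. years of schooling (standardized)
X = np.column_stack([np.ones(n), x])     # design matrix: intercept + covariate
true_beta = np.array([-0.5, 0.8])
p = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = rng.binomial(1, p)                   # binary outcome, e.g. labor force participation

def neg_loglik(beta):
    """Negative log-likelihood of the conditional probability model P(y=1|x)."""
    xb = X @ beta
    return -np.sum(y * xb - np.log1p(np.exp(xb)))

beta_hat = minimize(neg_loglik, x0=np.zeros(2), method="BFGS").x

# Average marginal effect of x on P(y=1|x): mean of dP/dx = p*(1-p)*beta_x
p_hat = 1.0 / (1.0 + np.exp(-X @ beta_hat))
ame = float(np.mean(p_hat * (1.0 - p_hat)) * beta_hat[1])
print("estimated coefficients:", beta_hat)
print("average marginal effect of x:", round(ame, 3))
```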

Analysis of Multidimensional Poverty: Theory and Case Studies (Economic Studies in Inequality, Social Exclusion and Well-Being #7)

by Louis-Marie Asselin

Poverty is a paradoxical state. Recognizable in the field for any sensitive observer who travels in remote rural areas and urban slums and meets marginalized people in a given society, poverty still remains a challenge to conceptual formalization and to measurement that is consistent with such formalization. The analysis of poverty is multidisciplinary. It goes from ethics to economics, from political science to human biology, and any type of measurement rests on mathematics. Moreover, poverty is multifaceted according to the types of deprivation, and it is also gender and age specific. A vector of variables is required, which raises a substantial problem for individual and group comparisons necessary to equity analysis. Multidimensionality also complicates the aggregation necessary to perform the efficiency analysis of policies. In the case of income poverty, these two problems, equity and efficiency, have benefited from very significant progress in the field of economics. Similar achievements are still to come in the area of multidimensional poverty. Within this general background, this book has a very modest and narrow-scoped objective. It proposes an operational methodology for measuring multidimensional poverty, independent from the conceptual origin, the size and the qualitative as well as the quantitative nature of the primary indicators used to describe the poverty of an individual, a household or a sociodemographic entity.

Analysis of Queueing Networks with Blocking (International Series in Operations Research & Management Science #31)

by Simonetta Balsamo, Vittoria de Nitto Persone, and Raif Onvural

Queueing network models have been widely applied as a powerful tool for modelling, performance evaluation, and prediction of discrete flow systems, such as computer systems, communication networks, production lines, and manufacturing systems. Queueing network models with finite capacity queues and blocking have been introduced and applied as even more realistic models of systems with finite capacity resources and with population constraints. In recent years, research in this field has grown rapidly. Analysis of Queueing Networks with Blocking introduces queueing network models with finite capacity and various types of blocking mechanisms. It gives a comprehensive definition of the analytical model underlying these blocking queueing networks. It surveys exact and approximate analytical solution methods and algorithms and their relevant properties. It also presents various application examples of queueing networks to model computer systems and communication networks. This book is organized in three parts. Part I introduces queueing networks with blocking and various application examples. Part II deals with exact and approximate analysis of queueing networks with blocking and the condition under which the various techniques can be applied. Part III presents a review of various properties of networks with blocking, describing several equivalence properties both between networks with and without blocking and between different blocking types. Approximate solution methods for the buffer allocation problem are presented.

Analysis of Queues (Operations Research Ser.)

by Natarajan Gautam

Written with students and professors in mind, Analysis of Queues: Methods and Applications combines coverage of classical queueing theory with recent advances in studying stochastic networks. Exploring a broad range of applications, the book contains plenty of solved problems, exercises, case studies, paradoxes, and numerical examples.

Analysis of Repeated Measures Data

by M. Ataharul Islam and Rafiqul I. Chowdhury

This book presents a broad range of statistical techniques to address emerging needs in the field of repeated measures. It also provides a comprehensive overview of extensions of generalized linear models for the bivariate exponential family of distributions, which represent a new development in analysing repeated measures data. The demand for statistical models for correlated outcomes has grown rapidly in recent years, mainly due to the presence of two types of underlying associations: associations between outcomes, and associations between explanatory variables and outcomes. The book systematically addresses key problems arising in the modelling of repeated measures data, bearing in mind those factors that play a major role in estimating the underlying relationships between covariates and outcome variables for correlated outcome data. In addition, it presents new approaches to addressing current challenges in the field of repeated measures and models based on conditional and joint probabilities. Markov models of first and higher orders are used for conditional models in addition to conditional probabilities as a function of covariates. Similarly, joint models are developed using both marginal-conditional probabilities as well as joint probabilities as a function of covariates. In addition to generalized linear models for bivariate outcomes, it highlights extended semi-parametric models for continuous failure time data and their applications in order to include models for a broader range of outcome variables that researchers encounter in various fields. The book further discusses the problem of analysing repeated measures data for failure time in the competing risk framework, which is now taking on an increasingly important role in the field of survival analysis, reliability and actuarial science. Details on how to perform the analyses are included in each chapter and supplemented with newly developed R packages and functions along with SAS code and macro/IML. It is a valuable resource for researchers, graduate students and other users of statistical techniques for analysing repeated measures data.

Analysis of Resource Management in Complex Work Systems: Using the Example of Sterile Goods Management in Hospitals

by Qinglian Lin

This book develops and assesses a decision-making model for resource management in complex work systems in line with the “Systems Engineering” method. It applies the Balanced Scorecard to the development of the criteria system for decision-making, and employs fuzzy linguistics theory to evaluate the alternatives. Further, the book assesses the application of this model in a hospital that has to decide whether or not to outsource its sterile goods. The use of the model opens up a diverse range of fields for decision-making in the area of complex work systems.

Analysis of Science, Technology, and Innovation in Emerging Economies

by Clara Inés Pardo Martínez, Alexander Cotte Poveda, and Sylvia Patricia Fletscher Moreno

This book outlines a number of different perspectives on the relationship between science, technology, and innovation in emerging economies. In it, the authors explore the aforementioned relationship as a pillar of economic development, driving growth in emerging economies. Employing a collaborative and interdisciplinary approach, the authors work to determine the main related factors and outcomes of the relationship between science, technology, and innovation, ultimately seeking to guide public policies to enhance the welfare of the population of an emerging economy.

Analysis of Socio-Economic Conditions: Insights from a Fuzzy Multi-dimensional Approach (Routledge Advances in Social Economics)

by Gianni Betti and Achille Lemmi

Showcasing fuzzy set theory, this book highlights the enormous potential of fuzzy logic in helping to analyse the complexity of a wide range of socio-economic patterns and behaviour. The contributions to this volume explore the most up-to-date fuzzy-set methods for the measurement of socio-economic phenomena in a multidimensional and/or dynamic perspective. Thus far, fuzzy-set theory has primarily been utilised in the social sciences in the field of poverty measurement. These chapters examine the latest work in this area, while also exploring further applications including social exclusion, the labour market, educational mismatch, sustainability, quality of life and violence against women. The authors demonstrate that real-world situations are often characterised by imprecision, uncertainty and vagueness, which cannot be properly described by the classical set theory which uses a simple true–false binary logic. By contrast, fuzzy-set theory has been shown to be a powerful tool for describing the multidimensionality and complexity of social phenomena. This book will be of significant interest to economists, statisticians and sociologists utilising quantitative methods to explore socio-economic phenomena.
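
To make the crisp-versus-fuzzy contrast described above concrete, here is a minimal Python sketch (not drawn from the book): a classical poverty indicator is a true/false test against a single income threshold, while a fuzzy membership function assigns a degree of poverty between 0 and 1. The threshold, the fuzzy band, and the piecewise-linear form are invented purely for illustration.

```python
# Illustrative sketch only: a crisp (true/false) poverty indicator versus a fuzzy
# membership function. The poverty line and the fuzzy band are invented values.
def crisp_poor(income, poverty_line=1000.0):
    """Classical set membership: 1 if income falls below the line, else 0."""
    return 1.0 if income < poverty_line else 0.0

def fuzzy_poor(income, lower=800.0, upper=1400.0):
    """Fuzzy membership: 1 below `lower`, 0 above `upper`, linear in between."""
    if income <= lower:
        return 1.0
    if income >= upper:
        return 0.0
    return (upper - income) / (upper - lower)

for income in (600.0, 950.0, 1100.0, 1500.0):
    print(income, crisp_poor(income), round(fuzzy_poor(income), 2))
```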

The Analysis of Sports Forecasting: Modeling Parallels between Sports Gambling and Financial Markets

by William S. Mallios

Given the magnitude of currency speculation and sports gambling, it is surprising that the literature contains mostly negative forecasting results. Majority opinion still holds that short-term fluctuations in financial markets follow a random walk. In this non-random walk through financial and sports gambling markets, parallels are drawn between modeling short-term currency movements and modeling outcomes of athletic encounters. The forecasting concepts and methodologies are identical; only the variables change names. If, in fact, these markets are driven by mechanisms of non-random walk, there must be some explanation for the negative forecasting results. The Analysis of Sports Forecasting: Modeling Parallels Between Sports Gambling and Financial Markets examines this issue.

The Analysis of Structured Securities: Precise Risk Measurement and Capital Allocation

by Sylvain Raynes and Ann Rutledge

The Analysis of Structured Securities presents the first intellectually defensible framework for systematic assessment of the credit quality of structured securities. It begins with a detailed description and critique of methods used to rate asset-backed securities, collateralized debt obligations and asset-backed commercial paper. The book then proposes a single replacement paradigm capable of granular, dynamic results. It offers extensive guidance on using numerical methods in cash flow modeling, as well as a groundbreaking section on trigger optimization. Casework on applying the method to automobile ABS, CDOs-of-ABS and aircraft-lease securitizations is also presented. This book is essential reading for practitioners who seek higher precision, efficiency and control in managing their structured exposures.

Analysis of Symbolic Data: Exploratory Methods for Extracting Statistical Information from Complex Data (Studies in Classification, Data Analysis, and Knowledge Organization)

by Hans-Hermann Bock and Edwin Diday

This book presents the most recent methods for analyzing and visualizing symbolic data. It generalizes classical methods of exploratory, statistical and graphical data analysis to the case of complex data. Several benchmark examples from National Statistical Offices illustrate the usefulness of the methods. The book contains an extensive bibliography and a subject index.

Analysis of the British Construction Industry

by Patricia M. Hillebrandt

This is a critical and descriptive analysis of the UK construction industry based on up-to-date statistics. The emphasis is on the industry as a whole, including its associated professions, rather than on individual firms or projects. Dr Hillebrandt examines the structure and resources of the industry, the demands made on it and its responses. A concise and comprehensive picture of the industry is given which will enable readers to understand what it does, how it works and how it is likely to develop.

An Analysis of the Determinants of Occupational Upgrading

by Duane E. Leigh

An Analysis of the Determinants of Occupational Upgrading presents a study that focuses on occupational mobility as a proxy measure for job upgrading. This monograph was first prepared in 1975 as a final report to the Manpower Administration of the U.S. Department of Labor. It is the second monograph of the Institute for Research on Poverty to deal with black-white income differentials and is also part of a growing Institute literature on the dual labor market theories and occupational mobility. The book contains seven chapters and begins with an overview of occupational mobility in the United States. The next chapter considers previous attempts to test the dual labor market hypothesis and presents a model of occupational mobility to be used in testing five hypotheses on the determinants of occupational upgrading. Subsequent chapters discuss the Census and NLS samples and outline the empirical variables used to measure the variables specified in the model; the impact on occupational upgrading of formal vocational training, industry structure, and job tenure; and the impact of interfirm and interindustry mobility on occupational progression. The final chapter summarizes the empirical findings with respect to each of the five testable hypotheses and considers some policy conclusions drawn from the analysis.

Analysis of the Development of Beijing (2018)

by Beijing Academy of Social Sciences

This book provides an overview of the rapid development Beijing has seen in a wide range of areas in 2017, both in itself and as an integral part of a larger region, as China’s economic development continues to improve in overall quality and regional coordination. General reports on the progress Beijing made and the problems it faced in 2017 in improving its economy, public services, municipal and community governance, urban planning, and funding for innovations are followed by case studies that look at best practices and how they can be applied towards promoting coordinated development of the Beijing-Tianjin-Hebei region. The strategy features prominently in the outlook contributors present for the greater metropolitan area of Beijing for 2018. This book is a valuable source of reference for anyone trying to gain a better understanding of the what, how, and why in relation to one of the world’s fastest growing mega-cities.

An Analysis of the Economic Torts (PDF)

by Hazel Carty

The economic torts provide the common law rules on liability for the infliction of pure economic loss. They include the torts of conspiracy, inducing breach of contract, intimidation, unlawful interference with trade, deceit and malicious falsehood. These torts represent the common law's attempt to balance the need to protect claimants against those who inflict economic harm and the wider need to allow effective, even aggressive, competition (including competition between employers and their workers). Given their controversial setting it is hardly surprising to discover that this area of tort law is confused and confusing, with the limits of liability imprecise. Hazel Carty's book creates order out of chaos by offering a thorough analysis of all the economic torts in order to provide a framework for their scope and development. Such a comprehensive analysis has not been undertaken since Heydon's work in 1978. What is revealed is an approach to the economic torts that breaks with the traditional analysis to underline the interconnections and differences between these torts that have been overlooked in the past. The overall aim of this book is to provide a text that is both intellectually satisfying to the tort scholar and of practical worth to the practitioner.

An Analysis of Theodore Levitt's Marketing Myopia (The Macat Library)

by Monique Diderich and Elizabeth Mamali

Theodore Levitt’s 1960 article “Marketing Myopia” is a business classic that earned its author the nickname “the father of modern marketing”. It is also a beautiful demonstration of the problem solving skills that are crucial in so many areas of life – in business and beyond. The problem facing Levitt was the same problem that has confronted business after business for hundreds of years: how best to deal with slowing growth and eventual decline. Levitt studied many business empires – the railroads, for instance – that at a certain point simply shrivelled up and shrank to almost nothing. How, he asked, could businesses avoid such failures? His approach and his solution comprise a concise demonstration of high-level problem solving at its best. Good problem solvers first identify what the problem is, then isolate the best methodology for solving it. And, as Levitt showed, a dose of creative thinking also helps. Levitt’s insight was that falling sales are all about marketing, and marketing is about knowing your real business. The railroads misunderstood their real market: they weren’t selling rail, they were selling transport. If they had understood that, they could have successfully taken advantage of new growth areas – truck haulage, for instance – rather than futilely scrabbling to sell rail to a saturated market.
