Download Flexibility and Adjustment to Information in Sequential Decision Problems free in PDF and EPUB format. You can also read Flexibility and Adjustment to Information in Sequential Decision Problems online and write the review.

1 The Importance of Irreversibility and Learning - Familiar Examples Revisited
1.1 Neoclassical Investment Models: A Brief Survey
1.1.1 The Standard Neoclassical Investment Theory Model
1.1.2 The Investment Model with Adjustment Costs
1.1.3 The Irreversibility of Investment
1.1.4 Delivery Lags
1.2 Flexible Manufacturing Systems
1.2.1 Some Basic Facts about Manufacturing
1.2.2 The Determinants of the Flexibility of Manufacturing Systems
1.2.3 Manufacturing as a Multiperiod Choice Problem
1.3 Conclusions
2 The Role of Irreversibility and Learning in Sequential Decision Problems - Basic Concepts
2.1 The Two-Period Model without Uncertainty
2.1.1 The Elements of the Model
2.1.2 Economic Examples
2.1.3 Some Basic Results
2.1.4 Intertemporal Opportunity Costs
2.2 The Two-Period Model with Uncertainty
2.2.1 The Elements of the Model
2.2.2 Special Cases
2.2.3 Flexibility and the Value of Information
2.2.4 An Example: Waiting to Invest
2.3 Switching Costs
2.3.1 The Extended Model
2.3.2 An Example: Money Demand as Demand for Flexibility
2.4 Summary and Outlook
3 Determinants of the Optimal Choice in Sequential Decision Problems - The Two-Period Case
3.1 The Formulation of the Problem
The book investigates a two-person game of litigation and settlement with incomplete information on one side. The experimental design allows investigation of how subjects solve the bargaining problem. A prominence level analysis applied to the data suggests that subjects tend to choose "round" numbers. It is shown that there is a correlation between Machiavellianism and subjects' adjustment behaviour in the game. The learning behaviour is discussed extensively. Plaintiffs' acceptance limits polarize at the beginning of the second play. A model based on learning direction theory is applied to explain subjects' behaviour over the course of the game.
Think of the following situation: a project yielding a gross profit of 100 is offered to two firms. The project can only be conducted through cooperation between the two firms; neither firm is able to conduct the project alone. In order to receive the project, the firms have to agree on the allocation of the gross profit. Each firm has an alternative project it conducts in case the joint project is not realized. The profitability of an allocation of the joint gross profit for a firm depends on the gross profit from its alternative project. The gross profit from an alternative project can be either 0 (low alternative value) or O
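As an illustrative benchmark for the situation above (and only a benchmark: the book's experimental subjects need not follow it), the symmetric Nash bargaining solution splits the joint profit by giving each firm its alternative payoff plus half of the remaining surplus. The function name and the specific disagreement payoffs below are hypothetical, chosen to match the example's gross profit of 100:

```python
def nash_split(total, d1, d2):
    """Symmetric Nash bargaining split of a joint gross profit,
    given disagreement payoffs d1, d2 (each firm's alternative project)."""
    surplus = total - d1 - d2          # gains from cooperation
    return d1 + surplus / 2, d2 + surplus / 2

print(nash_split(100, 0, 0))   # -> (50.0, 50.0): equal alternatives, equal split
print(nash_split(100, 0, 40))  # -> (30.0, 70.0): the better outside option earns more
```

The second call shows why the alternative project's profitability matters: a higher disagreement payoff shifts the agreed allocation in that firm's favor even though the joint profit is unchanged.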
There are two types of term structure models in the literature: equilibrium models and no-arbitrage models. Correspondingly, there are two types of interest rate derivatives pricing formulas based on each type of term structure model. The no-arbitrage models are characterized by the work of Ho and Lee (1986), Heath, Jarrow, and Morton (1992), Hull and White (1990 and 1993), and Black, Derman and Toy (1990). Ho and Lee (1986) invent the no-arbitrage approach to term structure modeling in the sense that the model term structure can fit the initial (observed) term structure of interest rates. There are a number of disadvantages with their model. First, the model describes the whole volatility structure by a single parameter, implying a number of unrealistic features. Furthermore, the model does not incorporate mean reversion. Black, Derman and Toy (1990) develop a model along the lines of Ho and Lee. They eliminate some of the problems of Ho and Lee (1986) but create a new one: for a certain specification of the volatility function, the short rate can be mean-fleeing rather than mean-reverting. Heath, Jarrow and Morton (1992) (HJM) construct a family of continuous models of the term structure consistent with the initial term structure data.
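The two shortcomings attributed to Ho and Lee above (a single volatility parameter and no mean reversion) are easy to see in the model's short-rate dynamics, dr = θ(t) dt + σ dW. The sketch below is a plain Euler discretization, not code from the book; the function name and parameter values are illustrative:

```python
import numpy as np

def simulate_ho_lee(r0, theta, sigma, dt, n_steps, seed=0):
    """Euler simulation of the Ho-Lee short rate: dr = theta(t) dt + sigma dW.
    Note sigma is a single constant (one volatility parameter) and the drift
    theta(t) never pulls the rate back toward a long-run level (no mean reversion)."""
    rng = np.random.default_rng(seed)
    r = np.empty(n_steps + 1)
    r[0] = r0
    for i in range(n_steps):
        r[i + 1] = r[i] + theta(i * dt) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return r

# With theta == 0 the short rate is a driftless random walk over one year of daily steps:
path = simulate_ho_lee(r0=0.03, theta=lambda t: 0.0, sigma=0.01, dt=1 / 252, n_steps=252)
```

In a calibrated model, θ(t) would be chosen to fit the initial observed term structure, which is exactly the sense in which Ho and Lee's approach is "no-arbitrage".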
1.1 Rational Expectations and Learning to Become Rational A characteristic feature of dynamic economic models is that, if future states of the economy are uncertain, the expectations of agents matter. Producers have to decide today which amount of a good they will produce, not knowing what demand will be tomorrow. Consumers have to decide what they spend on consumption today, not knowing what prices will prevail tomorrow. Adopting the neo-classical point of view that economic agents are 'rational' in the sense that they behave in their own best interest given their expectations about future states of the economy, it is usually assumed that agents are Bayesian decision makers. But, as Lucas points out, there remains an element of indeterminacy: Unfortunately, the general hypothesis that economic agents are Bayesian decision makers has, in many applications, little empirical content: without some way of inferring what an agent's subjective view of the future is, this hypothesis is of no help in understanding his behavior. Even psychotic behavior can be (and today, is) understood as "rational", given a sufficiently abnormal view of relevant probabilities. To practice economics, we need some way (short of psychoanalysis, one hopes) of understanding which decision problem agents are solving. (Lucas (1977, p. 15))
Two features are combined in this book: the analysis of bargaining experiments and the development of axiomatic bargaining theories. Further, a new type of the latter is derived from observations in the former. The author describes bargaining experiments with different economic and ethical frames and develops axiomatic approaches to characterize the corresponding bargaining solutions.
This book contributes to the scientific field of optimal control theory applied to dynamic models of the firm. It discusses optimal investment, financing and production policies of firms that have to deal with a variety of aspects, such as financial constraints, start-up costs, business cycles, increasing returns to scale, production life cycles and experience curves. In contrast to many other publications on this subject, here, in combination with an analytical approach, the dynamic optimization problems are solved numerically with the aid of a powerful computer and specific programs for optimizing non-linear functions of a finite number of variables under non-linear constraints.
The global greenhouse effect may be one of the greatest challenges ever to face humankind. If fossil fuel use, and the consequent CO2 emissions, continue to increase at their current trend, there is the possibility that over the next century there will be massive climate change and the flooding of coastal areas. The economics profession is beginning to respond to this challenge by seeking to understand the economic processes which determine the demand for energy, the proportion of this energy supplied by fossil fuels, and the policy instruments available for reducing fossil fuel demand while still supplying appropriate amounts of energy. This study is a contribution to that literature. We examine the impact of structural changes in the German and UK economies upon CO2 emissions over the last two decades, and explore the potential for further structural change to reduce such emissions. This study is different from much of the current literature, in that we do not presuppose that the respective economies consist of only one, or a few, sectors. Instead, we analyse the interrelationships of 47 sectors over about 20 years, using input-output methods. We also deal with the effects of the changing sectoral structure of imports and exports of these two countries on the 'responsibility' for CO2 emissions. On the basis of this extensive evidence we have a solid foundation for developing different scenarios to show how the 'Toronto target' of reducing CO2 emissions by 20% over 20 years can be achieved.
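The input-output approach mentioned above can be sketched with the standard Leontief accounting: given a matrix A of technical coefficients, final demand y, and direct emission intensities e, total (direct plus indirect) emissions embodied in final demand are e·(I − A)⁻¹·y. The toy three-sector numbers below are purely illustrative, not the study's 47-sector German or UK data:

```python
import numpy as np

# Toy 3-sector economy (illustrative numbers only).
A = np.array([[0.1, 0.2, 0.0],     # technical coefficients: input i per unit of output j
              [0.0, 0.1, 0.3],
              [0.2, 0.0, 0.1]])
e = np.array([0.5, 1.2, 0.1])      # direct CO2 emissions per unit of sectoral output
y = np.array([100.0, 50.0, 80.0])  # final demand by sector

L = np.linalg.inv(np.eye(3) - A)   # Leontief inverse: total output per unit of final demand
x = L @ y                          # gross output required to serve final demand
total_co2 = e @ x                  # direct + indirect emissions embodied in y
multipliers = e @ L                # emission content of one unit of each sector's final demand
```

Because the multipliers capture indirect requirements along supply chains, total_co2 exceeds the purely direct figure e·y, which is precisely why a one-sector model understates how structural change shifts 'responsibility' for emissions.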
A bibliography on stochastic orderings. Was there a real need for it? In a time of reference databases such as MathSci, the Science Citation Index, or the Social Science Citation Index, the answer seems to be negative. The reason we think that this bibliography might be of some use stems from the frustration that we, as workers in the field, have often experienced on finding similar results being discovered and proved over and over in different journals of different disciplines, with different levels of mathematical sophistication and accuracy, and most of the time without cross references. Of course it would be very unfair to blame an economist, say, for not knowing a result in mathematical physics, or vice versa, especially when the problems and the languages are so far apart that it is often difficult to recognize the analogies even after further scrutiny. We hope that collecting the references on this topic, regardless of the area of application, will be of some help, at least to pinpoint the problem. We use the term stochastic ordering in a broad sense to denote any ordering relation on a space of probability measures. Questions that can be related to the idea of stochastic orderings are as old as probability itself. Think for instance of the problem of comparing two gambles in order to decide which one is more favorable.
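The gamble comparison at the end of this passage is the classic example of a stochastic ordering: first-order stochastic dominance, where one distribution dominates another if its CDF lies weakly below the other's everywhere. The sketch below checks this for two discrete gambles on a common, ascending payoff support; the function name and the particular probabilities are made up for illustration:

```python
import numpy as np

def fosd_dominates(p, q):
    """True if distribution p first-order stochastically dominates q,
    i.e. p's CDF lies weakly below q's at every point of their common,
    ascending payoff support (p puts more mass on high payoffs)."""
    return bool(np.all(np.cumsum(p) <= np.cumsum(q) + 1e-12))

# Two gambles over payoffs [0, 50, 100]:
gamble_a = np.array([0.1, 0.3, 0.6])  # more mass on high payoffs
gamble_b = np.array([0.3, 0.4, 0.3])

print(fosd_dominates(gamble_a, gamble_b))  # -> True
print(fosd_dominates(gamble_b, gamble_a))  # -> False
```

Every expected-utility maximizer with an increasing utility function prefers the dominating gamble, which is what makes this ordering a natural answer to "which gamble is more favorable" without committing to a specific utility function.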
In recent years, the study of cointegrated time series and the use of error correction models have become extremely popular in the econometric literature. This book provides an analysis of the notion of (weak) exogeneity, which is necessary to sustain valid inference in sub-systems, in the framework of error correction models (ECMs). In many practical situations, the applied econometrician wants to introduce "structure" into his/her model in order to obtain economically meaningful coefficients. For this purpose, ECMs in structural form provide an appealing framework, allowing the researcher to introduce (theoretically motivated) identification restrictions on the long-run relationships. In this case, the validity of the inference will depend on a number of conditions which are investigated here. In particular, we point out that orthogonality tests, often used to test for weak exogeneity or for general misspecification, behave poorly in finite samples and are often not very useful in cointegrated systems.