
Klein

Like Tinbergen, who wrote an excellent introductory book on econometrics, Klein too wrote an elementary but excellent book, An Introduction to Econometrics. Earlier, Klein had written an advanced book titled A Textbook of Econometrics. Both of Klein's books are well received and read by students of econometrics all over the world. Klein is also well known for his doctoral dissertation on the Keynesian revolution; the thesis, with some additions, was published.

Klein is a macro-econometric model builder. An econometrician first specifies the economic model; the next task is to obtain estimates of the parameters of the model from the data. If the predictions of the model are consistent with empirical evidence, the theory is accepted; otherwise, it is reformulated or a new one is proposed. Klein built such a model for the American economy.
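As a minimal sketch of this specify-estimate-evaluate cycle (a made-up consumption function with invented figures, not Klein's actual model), one can specify C = a + bY, estimate a and b by ordinary least squares, and check the fit against the data:

```python
import numpy as np

# Hypothetical data: national income (Y) and consumption (C)
Y = np.array([100.0, 110.0, 125.0, 140.0, 160.0, 185.0])
C = np.array([ 82.0,  90.0, 101.0, 112.0, 127.0, 146.0])

# Specify: C = a + b*Y + error.  Estimate a and b by ordinary least squares.
X = np.column_stack([np.ones_like(Y), Y])            # design matrix with an intercept
a_hat, b_hat = np.linalg.lstsq(X, C, rcond=None)[0]

# Evaluate: compare the model's predictions with the observed data
C_pred = a_hat + b_hat * Y
rmse = np.sqrt(np.mean((C - C_pred) ** 2))
print(f"intercept={a_hat:.2f}, marginal propensity to consume={b_hat:.3f}, RMSE={rmse:.2f}")
```

If the fitted equation tracked the data poorly, the specification would be revised, echoing the reformulation step described above.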

Richard Stone

Stone's work relates to statistical demand analysis. He estimated elasticities of demand for a wide variety of products in the U.K. Price elasticity of demand (own-price) is measured by the percentage change in quantity demanded divided by the percentage change in the price of the same commodity. Cross elasticity of demand refers to the percentage change in quantity demanded of a commodity in response to the percentage change in the price of another, related commodity. The percentage change in quantity demanded of a commodity divided by the percentage change in income is known as the income elasticity of demand.
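A small numerical illustration (with invented figures, not Stone's estimates) shows how the three elasticities are computed from percentage changes:

```python
def elasticity(pct_change_qty, pct_change_driver):
    """Elasticity = % change in quantity demanded / % change in the driving variable."""
    return pct_change_qty / pct_change_driver

# Own-price elasticity: price of tea rises 10%, quantity of tea demanded falls 5%
own_price = elasticity(-5.0, 10.0)     # -0.5 -> demand is price-inelastic

# Cross elasticity: price of coffee rises 10%, quantity of tea demanded rises 4%
cross = elasticity(4.0, 10.0)          # +0.4 -> the two goods are substitutes

# Income elasticity: income rises 10%, quantity of tea demanded rises 2%
income = elasticity(2.0, 10.0)         # +0.2 -> a normal good
print(own_price, cross, income)
```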

Haavelmo

Haavelmo, who earlier worked as a research assistant to Ragnar Frisch, won the Nobel Prize in 1989 for his valuable contributions to the probability approach in econometrics. Another area in which Haavelmo did valuable research relates to systems of simultaneous equations. In the simultaneous-equations approach, the particular equation being studied is treated as part of a system of relationships describing the simultaneous interactions of the relevant variables. Haavelmo devised a statistical method of reduced-form equations to estimate the parameters of a system of simultaneous equations.
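As a minimal sketch of the reduced-form idea (a textbook supply-and-demand system with invented parameters, not Haavelmo's own application), the endogenous price and quantity are expressed in terms of the exogenous variables, the reduced-form coefficients are estimated by least squares, and structural slopes are then recovered from their ratios:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Assumed structural parameters for the simulation
a, b, c = 20.0, 1.5, 0.8      # demand: q = a - b*p + c*y + u1   (y = income)
d, e, f = 2.0, 1.0, -0.5      # supply: q = d + e*p + f*w + u2   (w = cost shifter)

y = rng.normal(10, 2, n)
w = rng.normal(5, 1, n)
u1 = rng.normal(0, 0.5, n)
u2 = rng.normal(0, 0.5, n)

# Equilibrium price and quantity implied by the two structural equations
p = (a - d + c*y - f*w + u1 - u2) / (b + e)
q = d + e*p + f*w + u2

# Reduced form: regress each endogenous variable on all exogenous variables
X = np.column_stack([np.ones(n), y, w])
pi_p = np.linalg.lstsq(X, p, rcond=None)[0]
pi_q = np.linalg.lstsq(X, q, rcond=None)[0]

# Indirect least squares: structural slopes recovered from reduced-form coefficients
e_hat = pi_q[1] / pi_p[1]      # supply slope, identified by the demand shifter y
b_hat = -pi_q[2] / pi_p[2]     # demand slope, identified by the supply shifter w
print(f"supply slope ~ {e_hat:.2f} (true {e}), demand slope ~ {b_hat:.2f} (true {b})")
```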

Heckman and McFadden

James Heckman and Daniel McFadden were awarded the Nobel Prize in Economics in 2000. While awarding the prize, the Swedish Academy said: “The microeconometric methods developed by Heckman and McFadden are now part of the standard tool-kit, not only of economists but also of other social scientists”.

Social scientists make generalizations about whole populations on the basis of samples. If the samples are selected randomly, the generalizations made are likely to be close to the truth. However, Heckman argues that often these samples are not random; people self-select themselves into the samples, and this leads to biased results. For instance, in the case of man-power training programs, Heckman's research shows that often those employees who are keen on improving their performance will join the training program; others will not. Even without training, those who joined the programme would have performed better than those who did not. This leads to biased conclusions. Heckman tackled such selection-bias problems.
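A minimal sketch of the two-step selection correction associated with Heckman (a stylised wage-and-participation example with simulated data, rather than a training program): a probit model of who is in the sample yields an inverse Mills ratio, which is then added as a regressor in the outcome equation to absorb the selection effect.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 5000

educ = rng.normal(12, 2, n)                       # observed regressor
kids = rng.normal(size=n)                         # affects participation only
e = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], n)   # correlated errors

# Simulation: wages are only observed for people who choose to work (self-selection)
works = (1.0 - 0.8*kids + 0.1*educ + e[:, 0] > 0).astype(int)
wage = 5 + 0.8*educ + e[:, 1]

# Step 1: probit for the participation decision, then the inverse Mills ratio
Z = sm.add_constant(np.column_stack([kids, educ]))
probit_res = sm.Probit(works, Z).fit(disp=0)
xb = Z @ probit_res.params
imr = norm.pdf(xb) / norm.cdf(xb)

# Step 2: outcome regression on the selected sample, with the Mills ratio added
sel = works == 1
X = sm.add_constant(np.column_stack([educ[sel], imr[sel]]))
res = sm.OLS(wage[sel], X).fit()
print(res.params)   # the coefficient on educ should be close to the true value 0.8
```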

McFadden also conducted research on micro-units and their decisions. McFadden worked on problems of discrete decisions and developed statistical methods for discrete choice analysis. This method is known as the “conditional logit” method, which is applied in making many discrete policy choices. The method can help calculate how probable it is that a person of a certain age, income and education would choose to travel by bus, subway or car, taking into account costs and journey times. His method is widely used in tackling urban transport problems and in telephone services.
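A minimal sketch of the logit choice probabilities behind such calculations (with invented costs, times and taste coefficients, not McFadden's estimates): each travel mode gets a utility depending on its cost and journey time, and the probability of choosing a mode is proportional to the exponential of that utility.

```python
import numpy as np

# Hypothetical alternatives facing one commuter: cost (Rs.) and journey time (minutes)
modes = ["bus", "subway", "car"]
cost = np.array([15.0, 25.0, 60.0])
time = np.array([50.0, 35.0, 30.0])

# Assumed taste coefficients: higher cost and longer time both lower utility
beta_cost, beta_time = -0.04, -0.06
utility = beta_cost * cost + beta_time * time

# Logit choice probabilities: exp(V_j) / sum_k exp(V_k)
prob = np.exp(utility) / np.exp(utility).sum()
for mode, p in zip(modes, prob):
    print(f"P(choose {mode}) = {p:.2f}")
```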

Engle and Granger

Engle and Granger shared the Nobel Prize in 2003 for their statistical contributions to economic time series. Engle's contributions are in areas such as Auto-Regressive Conditional Heteroscedasticity (ARCH), co-integration and band-spectrum regression. Granger's contributions are mainly to the spectral analysis of time series.
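A minimal sketch of the ARCH idea (an ARCH(1) process simulated with arbitrary parameters): today's error variance depends on the size of yesterday's error, producing the volatility clustering typical of financial time series.

```python
import numpy as np

rng = np.random.default_rng(42)
T = 1000
omega, alpha = 0.2, 0.7          # assumed ARCH(1) parameters (alpha < 1 for stationarity)

eps = np.zeros(T)
sigma2 = np.zeros(T)
sigma2[0] = omega / (1 - alpha)  # start at the unconditional variance

for t in range(1, T):
    sigma2[t] = omega + alpha * eps[t-1]**2             # conditional variance
    eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Large shocks tend to be followed by large shocks, even though the mean is zero
print("sample variance:", round(eps.var(), 3), "theoretical:", round(omega / (1 - alpha), 3))
```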

Granger's research is contained in his book Forecasting Economic Time Series (Academic Press, New York).

Sims

Sims developed the Vector Auto-Regression (VAR) method. He popularized the Granger-Sims causality tests to analyse time-series data; the tests are often used to describe the joint behaviour of variables. A variable X is said to Granger-cause a variable Y if the set of correlations between the current innovations in Y and the lagged innovations in X is significant (Sims 1972, AER 62).
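A minimal sketch of a Granger-causality check on simulated data (not Sims's own application), using the grangercausalitytests helper in statsmodels: x leads y by one period, so lagged x should help predict y.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(7)
T = 300

# Simulate x, and let y depend on last period's x plus its own lag and noise
x = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * x[t-1] + 0.3 * y[t-1] + 0.5 * rng.standard_normal()

# Test whether x Granger-causes y: the first column is the dependent variable,
# the second is the candidate cause
data = np.column_stack([y, x])
grangercausalitytests(data, maxlag=2)
# Small p-values on the reported F-tests indicate that lagged x helps predict y
```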

Sims suggested an alternative style of identification of equations and models to that of the large-scale models of the economy prevalent during the 1970s and 80s. In a path-breaking article, “Macroeconomics and Reality” (1977), Sims discussed the simultaneous-equation identification problem and the issues involved in constructing macroeconomic models of an economy for both descriptive and forecasting purposes. His article can be downloaded from the internet (Discussion Paper No. 77-91, Dec. 1977, University of Minnesota).

Chapter - 4

ECONOMIC HISTORY

(Fogel and North)

Economic history, in the words of Hicks, is just the applied economics of earlier ages. Any discussion of economic history is incomplete without mentioning the works of Karl Marx and W.W. Rostow. Karl Marx is a household name in many nations, and his book Das Kapital revolutionized the thinking of men and influenced working classes everywhere. Marx's analysis is a unique materialistic interpretation of history. Marx applied Hegel's dialectic method to economics. The core of the dialectic lies in the conception of the process by which change takes place. The conception embraces the celebrated triad of thesis, anti-thesis and synthesis. The dialectic pattern is best exemplified by Marx's view of class struggle in capitalist society as the mechanism through which a thesis and an anti-thesis interact to form a synthesis in the form of communism. What generates the contradiction is the thesis, what represents the contradiction is the anti-thesis, and the synthesis represents the negation of the negation, or the reconstruction of aspects of the thesis with aspects of the anti-thesis into a higher composite.

The gist of Marx's arguments is:

The nature of individuals depends on their material conditions of production.

The mode of production in material life determines the general character of the social, political and spiritual process of life.

At a certain stage of their development, the material forces of production in society come into conflict with the existing relations of production, that is, with the property relations within which they had been at work before. From forms of development of the forces of production, those relations turn into fetters. Then comes the period of social revolution. The history of all societies has up to now been the history of class struggles. The bourgeoisie (capitalists) replaced the feudal nobility.

Capitalist industrial societies would create the conditions for their own destruction because of inherent contradictions. Capitalists, in their pursuit of profits, introduce more and more labor-saving machinery, thereby creating a vast army of unemployed, unskilled labor. Increased competition among capitalists leads to the concentration of capital and the increasing misery of labor. Workers receive subsistence wages, and the rate of profit declines. There is a shortage of demand for the supply of goods produced. The conditions of capitalist society become fetters on the productive forces of capitalism; there results a conflict between the capitalists and the working classes. The fall of capitalism and the victory of the proletariat are equally inevitable.

Thus Marx suggested several stages in the evolution of societies:

1. Primitive communism, 2. The ancient slave state, 3. Feudalism, 4. Capitalism, 5. Socialism and finally 6. Communism.

In the early 1960s, W.W. Rostow wrote a book titled The Stages of Economic Growth (A Non-Communist Manifesto). He identifies all societies, in their economic dimensions, as lying within one of five categories or stages of growth: 1. the traditional society, 2. the pre-conditions for take-off, 3. the take-off, 4. the drive to maturity, and 5. the age of high mass consumption.

As against the above broad approaches to the study of the evolution of societies, the new economic history, sometimes called “cliometrics”, applies econometric techniques to historical issues. Robert W. Fogel and Douglass C. North made valuable contributions to the new economic history. They were awarded the Nobel Prize in 1993 for having renewed research in economic history by applying economic theory and quantitative methods in order to explain economic and institutional change.

Fogel's research centered on two themes. The first was to measure the impact of key scientific and technological innovations, key governmental policies, and important environmental and institutional changes on the course of economic growth. The second was to promote wider use of the mathematical models and statistical methods of economics in studying the complex, long-term processes that were the focus of economic historians.

Fogel's approach to historical research is exemplified in works such as Railroads and American Economic Growth and The Escape from Hunger and Premature Death, 1700-2100, among others.

Douglass C. North is another founding member of the new economic history, or cliometrics. He made a comprehensive study of the economic growth of the U.S. (1790-1860). The gist of the argument is that the timing and pace of an economy's development have been determined by the success of its export sector, the characteristics of the export industry, and the disposition of the income received from the export sector.

North realized that a theory of economic history was needed. The existing neo-classical theory was concerned with the operation of markets and assumed the existence of the underlying conditions needed for their efficient operation; it had nothing to say about how markets evolved. The strong points in favour of neo-classical economics are its use of the individual as the unit of analysis and its focus on competitive situations. Marxism was explicitly concerned with institutions, asked good questions, and had an explanation of long-run change, but there were many flaws in the Marxian model: making classes the unit of analysis and failing to incorporate population change as a key source of change were major shortcomings.

Douglass North's initial effort to incorporate institutions into historical economic analysis resulted in two books, one (with L. Davis) Institutional Change and American Economic Growth and the other (with Robert Thomas) The Rise of the Western World. In Structure and Change in Economic History, North abandoned the notion that institutions were efficient and attempted to explain why inefficient rules would tend to exist and be perpetuated. He stressed the need for a political-economic framework to explore long-run institutional change, and that led to the publication of Institutions, Institutional Change and Economic Performance. He attempted to evolve a theory of institutional change.

The first step in the evolution of such a theory was to separate institutions from organizations: the former are the rules of the game, and the latter are the players. In a world of scarcity and competition, organizations compete to survive. That competition leads them to try to modify the institutional framework to improve their competitive position. The direction of institutional change, however, will reflect the perceptions of the actors. North tries to blend cognitive science with the institutional approach to history in his recent book, Understanding the Process of Economic Change. When humans understand their environment, as reflected in their beliefs, and construct an institutional framework that enables them to implement their desired objectives, there is consistency between the objectives of those players in a position to shape their destiny and the desired outcomes. North feels that such consistency is not automatic and, further, that it is an evolving process over a long period; because of human failure, lack of consistency occurs.

The rise and fall of the Soviet Union, culminating in its collapse between 1988 and 1991, is best explained by its process of change: beliefs, institutions, organizations, policies and, finally, outcomes. While admitting that he is no expert on the Soviet Union, North gives highlights of the Soviet experience, drawing on the expertise of others.

Gorbachev introduced Perestroika (reorganization), which gave enterprise directors greater autonomy. Glasnost (openness) was introduced with the aim of undermining the power of the party leaders. The decline and destruction of the stable party structure led to disorder. Government officials lost confidence in Soviet institutions, and Soviet institutions were pulled apart by those very officials; the catalysts of state collapse were the agents of the state itself. Soviet institutions did not have adaptive efficiency.


Chapter - 5

 

Experimental Economics

(M. Allais, Kahneman, V. Smith)

 

Maurice Allais

Maurice Allais, a Ph.D. in engineering and a professor of mechanics at Lyons, turned to economics. Allais's contributions to pure theory and his first book, In Quest of an Economic Discipline, are in French, and many do not know their contents. Allais formally reported experiments in economics in an article in Econometrica as early as 1953, but that article too is in French. What is better known is his work on decision theory, and in particular the so-called Allais Paradox.

 

Allais's Paradox

Utility measurement passed through several phases: cardinal utility, ordinal utility, behaviouristic ordinalism, and neo-classical utility under risk. Von Neumann and Morgenstern (N-M) devised a method of measuring utility under conditions of risk. According to the N-M approach, individuals maximize not expected money but expected utility. By way of criticizing the N-M method, Allais raised a paradoxical decision situation.

Suppose a person is asked to choose between the following alternatives: lottery L1, which offers Rs.2 crores for certain, and lottery L2, which offers a 10% chance of winning Rs.10 crores, an 89% chance of winning Rs.2 crores and a 1% chance of getting nothing. In this case, almost anyone will choose L1. Now consider another choice situation. Lottery L3 offers an 11% chance of winning Rs.2 crores and an 89% chance of winning nothing; lottery L4 offers a 10% chance of winning Rs.10 crores and a 90% chance of winning nothing. Asked to choose between L3 and L4, most people prefer L4 over L3.

Our choices are not consistent with expected utility, given by probability multiplied by utility: if the expected utility of L1 is greater than that of L2, then the expected utility of L3 must be greater than that of L4. Denote the utilities of the outcomes by U(10), U(2) and U(0), where the argument indicates the amount of winnings in crores. The expected utility of L1 appears on the left side and that of L2 on the right side of Eqn. (1), and the choice of L1 over L2 is represented by the greater-than sign:

U(2) > (0.10) U(10) + (0.89) U(2) + (0.01) U(0)     Eqn. (1)

Adding (0.89) U(0) - (0.89) U(2) to both sides, we get

(0.11) U(2) + (0.89) U(0) > (0.10) U(10) + (0.90) U(0)     Eqn. (2)

The expected utility of L3 is given on the left side and that of L4 on the right side of Eqn. (2). As per Eqn. (2), L3 must be preferred to L4 (and not L4 to L3, as indicated by our actual choices).

Thus Allais showed that certain kinds of risky choices cannot be squared with expected utility theory. This and many other anomalies in choice behaviour have been thoroughly explored by both psychologists and economists.
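A small numerical check of the algebra above (a sketch, not part of Allais's paper): searching over many candidate utility values for the three outcomes, no assignment makes expected utility rank L1 above L2 while also ranking L4 above L3.

```python
import numpy as np

rng = np.random.default_rng(0)
matches = 0

for _ in range(100_000):
    # Random candidate utilities with U(0) < U(2) < U(10)
    u0, u2, u10 = np.sort(rng.uniform(0, 1, 3))

    eu_L1 = u2
    eu_L2 = 0.10*u10 + 0.89*u2 + 0.01*u0
    eu_L3 = 0.11*u2 + 0.89*u0
    eu_L4 = 0.10*u10 + 0.90*u0

    # The observed pattern: L1 preferred to L2 AND L4 preferred to L3
    if eu_L1 > eu_L2 and eu_L4 > eu_L3:
        matches += 1

print("utility assignments matching the observed choices:", matches)   # always 0
```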

 

Daniel Kahneman

Daniel Kahneman, a professor of psychology at Princeton University, has used insights from psychology to study human behaviour and to conduct experiments on individual decision making under uncertainty. He argued that in complex decision situations under uncertainty, individuals do not make the rational calculations assumed by traditional theory; instead, they rely on heuristic short-cuts or rules of thumb. Kahneman (and Tversky) developed “prospect theory”, a theory of decision making under uncertainty. In this theory, individuals are assumed to be more sensitive to the way an outcome deviates from the status quo than to the absolute level of the outcome, and individuals are more averse to losses relative to the status quo than they are attracted to gains of the same size.

Suppose you have invested in a start-up company which is making profits (Company P). You have a 90% chance of winning Rs.100 lakhs and a 10% chance of receiving nothing. If someone offers to buy the asset from you for Rs.85 lakhs, most likely you would accept the offer because the latter option carries less risk. You would be exhibiting risk-averse behaviour.

Now consider another situation, involving large losses. Suppose you had invested in a start-up company which is incurring losses (Company L). There is a 90% chance of losing Rs.100 lakhs but a 10% chance of losing nothing. Another investor offers to take over the company if you pay him Rs.85 lakhs (resulting in a certain loss of Rs.85 lakhs). You would most likely reject the offer and choose to retain the loss-making company.

The loss-making case is exactly parallel to the earlier profit-making case. But in the profit-making case you exhibit risk-averse behaviour, while in the loss-making case you do not. This is called a “framing effect”.
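A minimal sketch of how a prospect-theory valuation can rationalise this flip (the curvature, loss-aversion and probability-weighting coefficients below are commonly cited later estimates by Tversky and Kahneman, used here purely for illustration): outcomes are valued relative to the status quo, losses loom larger than gains, and a 90% chance is treated as less than 90%.

```python
def value(x, alpha=0.88, lam=2.25):
    """Value of a gain or loss x relative to the status quo; losses weighted by lam."""
    return x**alpha if x >= 0 else -lam * (-x)**alpha

def weight(p, gamma=0.61):
    """Decision weight attached to probability p (large probabilities are under-weighted)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Company P (amounts in lakhs): keep the 90% chance of gaining 100, or sell for a sure 85
keep_P = weight(0.9) * value(100)
sell_P = value(85)

# Company L: keep the 90% chance of losing 100, or accept a certain loss of 85
keep_L = weight(0.9) * value(-100)
sell_L = value(-85)

print("gains : keep =", round(keep_P, 1), " sell =", round(sell_P, 1))   # sell > keep
print("losses: keep =", round(keep_L, 1), " sell =", round(sell_L, 1))   # keep > sell
```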

Prospect theory can explain why people take out expensive small-scale insurance, why they buy expensive service contracts for appliances that would be cheap to replace, and other such (seemingly irrational) individual decisions.

Vernon L. Smith

Smith established laboratory experiments as a tool in empirical analysis, especially in the study of alternative market mechanisms. Smith (and Knez) tested a “strong market hypothesis”, which states that markets equilibrate as if agents were utility maximisers even if the agents do not themselves behave as utility maximisers. They state this point of view as follows:

“The efficiency and social significance of markets does not depend on the validity of any particular theory of individual demand…. The empirical validity or falsity of efficient markets theory is a proposition that is entirely distinct from the empirical validity or falsity of theories of individual demand in markets”.

Smith (and Knez) conducted experiments to test the market hypothesis, and the results confirmed it: the behaviour of some individuals might be irrational, but the behaviour of the market as a whole is rational and efficient. Smith's book Bargaining and Market Behavior contains his experimental findings.
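As a toy illustration in the spirit of such market experiments (the trading rule and numbers below are invented, not Smith's actual design): buyers with private values and sellers with private costs shout random bids and asks, trading whenever the offers cross, and the resulting prices stay confined between traders' costs and values even though no one knows the market demand or supply.

```python
import numpy as np

rng = np.random.default_rng(3)

# Induced values and costs, as in laboratory markets (invented numbers, Rs. per unit)
values = [40.0, 36.0, 32.0, 28.0, 24.0, 20.0]   # one unit per buyer
costs  = [18.0, 22.0, 26.0, 30.0, 34.0, 38.0]   # one unit per seller
# Demand and supply cross near a price of 28-30, with roughly 3-4 units traded.

buyers, sellers, prices = list(values), list(costs), []
for _ in range(2000):                            # a fixed number of bid/ask attempts
    if not buyers or not sellers:
        break
    v = rng.choice(buyers)                       # a random buyer shouts a bid
    c = rng.choice(sellers)                      # a random seller shouts an ask
    bid = rng.uniform(0, v)                      # never bid above one's own value
    ask = rng.uniform(c, 50)                     # never ask below one's own cost
    if bid >= ask:                               # the offers cross: trade at the midpoint
        prices.append((bid + ask) / 2)
        buyers.remove(v)
        sellers.remove(c)

print("trades:", len(prices))
print("transaction prices:", [round(p, 1) for p in prices])
```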

Smith's latest interest is in neuro-economics. He uses brain scans of experimental subjects playing economic games. The exponents of neuro-economics believe that brain scanning of experimental subjects will allow them to peer directly into the brain to predict behaviour.

The three Nobel economists discussed in this chapter are pioneers and key figures in experimental economics. Nowadays experimental work in economics is done in many areas, such as investigating two-person bargaining problems, the free-rider problem in the provision of public goods, auction markets, and the privatization of public monopolies.

The study of human behaviour based on human psychology and experiments falls into the newly flourishing field of behavioural economics. This field is discussed again in the chapter on markets.

The principal decision-making region of the brain is the prefrontal cortex (PFC). The PFC observes the benefit-valuation activity in various regions of the brain and uses their activity levels as inputs into the decision-making process; in other words, the PFC uses gut feelings as inputs into decisions. The dorsolateral portion of the PFC (DLPFC) is responsible for introducing other factors into the decision-making process, such as the future consequences of an action. Neuroscientists use transcranial magnetic stimulation (TMS) to down-regulate (partially disable) the DLPFC of experimental subjects. Subjects with a disabled DLPFC make impulsive choices based on gut feeling rather than thoughtful choices based on cognition; for example, they grab a small immediate reward rather than waiting a relatively short time for a larger reward. Suppose we place before the subject a donut and an apple: a person with a disabled DLPFC is more likely to choose the donut, because he will ignore the long-term health consequences.

Consumers decide how much income to spend now and how much to save for the future. Experiments by psychologists and neuroscientists have consistently shown that people systematically underestimate the strength of future gut feelings, both positive (the benefits of consumption) and negative (the monetary cost). Our underestimation of future consequences means that humans are subject to present bias: we accurately incorporate the present consequences of an action but ignore or underestimate the future consequences. This present bias can lead to misguided decisions, but the bias can be corrected by cognitive processing. Richard Thaler, the 2017 Nobel economist and a founder of behavioural economics, has observed that Americans save little and are subject to present consumption bias. Thus neuroscience has many insights for behavioural economics.
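Present bias is often formalised with quasi-hyperbolic (“beta-delta”) discounting, a standard device in behavioural economics rather than anything specific to the laureates discussed here; a short sketch with assumed parameter values shows how it produces a preference reversal between a smaller-sooner and a larger-later reward.

```python
def present_value(reward, delay_months, beta=0.6, delta=0.95):
    """Quasi-hyperbolic discounting: every future period carries an extra penalty beta."""
    return reward if delay_months == 0 else beta * (delta ** delay_months) * reward

# Choice made today: Rs.100 now versus Rs.130 in one month
print(present_value(100, 0), ">", round(present_value(130, 1), 1),
      "-> grab the smaller reward now")

# The same trade-off viewed a year in advance: Rs.100 in 12 months vs Rs.130 in 13 months
print(round(present_value(100, 12), 1), "<", round(present_value(130, 13), 1),
      "-> plan to wait for the larger reward")
```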