Financial economics

Financial economics treats the financial system as an open interactive system dealing both in claims upon future goods and services, and in the allocation of the risks that are associated with such claims. It is concerned with the investment choices made by individuals, with the financing choices made by corporations, with the conduct of financial organisations that act as financial intermediaries between individuals and corporations, and with the effects of all of these upon the economy. The consensus theory that was developed in the course of the 1970s and 1980s came under question following the crash of 2008, and "the race is on" to develop an acceptable replacement.


The status of financial economics

The pre-crash consensus

Economists and professional investors gave little attention to financial economics until the adoption in the 1970s of models based upon the efficient market hypothesis. A distinction came to be drawn between the version of the hypothesis in which prices are unforecastable (termed the "weak version") and the wider-ranging version in which prices correctly reflect all of the relevant information (termed the "strong version"). The weak version came to command acceptance as the general tendency - subject to rare exceptions concerning the very few traders who have consistently outperformed the market. The strong version was less widely accepted because it implies the impossibility of arbitrage and profit-making, the absence of speculative bubbles, and the futility of market regulation. The publication in 2004 of Andrew Lo's adaptive markets hypothesis, which he proposed as an amendment to the efficient market hypothesis, further increased the latter's acceptability by taking account of investor activities that it had not allowed for.

The implication of the efficient market hypothesis that stabilising regulatory policy is unnecessary and ineffective was in line with accepted monetarist doctrine[1], and probably contributed to the reluctance of both monetarist and "new Keynesian" economists to make use of regulation.

The implied conclusion of the efficient markets hypothesis - that price variations on the markets for financial assets are random variations that can be represented by established probability distributions - formed the basis of a range of widely accepted risk analysis theorems.

Post-crash speculations

It was not until the crash of 2008 that it was widely recognised that the efficient market hypothesis is no more than a statement of a general tendency, and that major risks can occasionally arise from statistically unpredictable patterns of investor conduct; nor was it generally realised until then that policies that are beneficial to individual investors can be harmful if they are adopted by all of them. Professor Shin of Princeton University accepts that the resulting shortcomings of economic theory played a big rôle in that disaster, and reports that the "race is on" to fill the gaps in economic theory and to add a new perspective to macroeconomics by incorporating into it a new theory of financial economics[2]. A consensus had already formed in the course of 2009 that internationally agreed financial regulation had become necessary, two G20 summit meetings of the leaders of the major developed and developing countries had been held for that purpose, and a variety of concrete proposals had been formulated.

Competing theories

The efficient market hypothesis

Long before economic analysis was applied to the problem, investment analysts had been advising their clients about their stock market investments, and fund managers had been taking decisions on their behalf. Some sought to predict future movements of the price of a share from a study of the pattern of its recent price movements (known as "technical analysis") and some attempted to do so by examining the issuing company's competitive position and the factors affecting the markets in which it operates (known as "fundamental analysis"). But in 1933 the economist Alfred Cowles suggested that both might be wasting their time. Applying the concept of a perfect market to the stock exchange, he asked the question "Can Stock Market Forecasters Forecast?" [3] and gave his answer as "it is doubtful", thereby starting a controversy that has yet to be fully resolved. Cowles argued that in an "efficient market" all of the information upon which a forecast could be based was already embodied in the price of the share in question, subject only to unpredictable fluctuations having the characteristics known to statisticians as a "random walk".

The efficient markets hypothesis stipulates that all of the available information that is relevant to the price of an asset is already embodied in that price. It is based upon the argument that there is a large body of investors who react immediately (and at no cost to themselves) to any fresh information by buying or selling that asset. For example, if prices are expected to rise tomorrow, investors will buy today and, in doing so, cause the price to rise until it is no longer expected to rise further. The existence in the market of "noise traders" need not invalidate the hypothesis, provided that most traders act rationally and that those who do not make only random mistakes.

The formal statement that "in an informationally efficient market, price changes must be unforecastable if they are properly anticipated" was put forward and proved by Paul Samuelson in 1965 [4], and there followed a debate as to whether stock markets do in fact operate as efficient markets. That question was subsequently explored in studies undertaken and summarised by the economist Eugene Fama in 1970 [5] and 1991 [6], and by others. Fama concluded that there is no important evidence to suggest that prices do not adjust to publicly available information, and only limited evidence of privileged access to information about prices.
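Samuelson's statement is often illustrated by simulating a price series whose successive changes are statistically independent. The following sketch (in Python, with invented figures; an illustration of the random-walk idea rather than anything from the papers cited) generates such a series and estimates the lag-1 autocorrelation of its changes, which should be near zero if past changes carry no forecasting power.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Simulate a price as a random walk: each day's change is an
# independent draw, so past changes carry no forecasting power.
changes = rng.normal(loc=0.0, scale=1.0, size=10_000)
prices = 100 + np.cumsum(changes)

# Lag-1 autocorrelation of the changes: close to zero by construction,
# as the hypothesis predicts for an informationally efficient market.
lag1 = np.corrcoef(changes[:-1], changes[1:])[0, 1]
print(f"final price after 10,000 steps: {prices[-1]:.2f}")
print(f"lag-1 autocorrelation of price changes: {lag1:.4f}")
```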

The adaptive markets hypothesis

The adaptive markets hypothesis was intended by its author, Professor Andrew Lo of the Massachusetts Institute of Technology, to serve as an amended version of the efficient markets hypothesis, providing for activities such as arbitrage and profit-making that do not exist in that model [7]. It does so by borrowing concepts from evolutionary theory and cognitive psychology, and envisages a system in which "heuristic" short-cuts replace logical analysis, and selection by trial and error replaces rigorous optimisation. The degree of market efficiency in this model depends upon "environmental" factors such as the number of competitors in the market, the magnitude of profit opportunities available, and the adaptability of the market.

The financial instability hypothesis

The financial instability hypothesis, as formulated in 1992 by Hyman Minsky[8][9], states that the economy has financing regimes under which it is stable and financing regimes under which it is unstable, and that during periods of sustained growth it tends to transform itself from stability to instability. It is based upon the argument that there are three categories of financial transaction (illustrated in the sketch that follows the list):

  • "hedge transactions", that characterise the cautious conduct of the banks during a period of stability following an economic downturn, when credit is made available only to those who demonstrate their capacity to make repayments from well-established cash flows;
  • "speculative transactions" that tend to follow as confidence in the steady growth of the economy grows, and credit is offered in the expectation of repayments from expected cash flows;
  • "Ponzi transactions", that take place in the course of the ensuing boom, during which credit is provided in the hope that repayments would be made from currently unknown sources that may become available as the boom develops.

The leverage cycle hypothesis

Professor Hyun Song Shin of Princeton University attributes the occasional instability of the financial system to a "balance sheet leverage cycle". He uses as an analogy the former instability of London's Millennium Bridge, which was caused by the fact that each pedestrian crossing it adjusted his stance in response to the sway of the bridge in such a way as to amplify that sway. In the financial market, he argues, banks similarly adjust their balance sheets in response to a price increase in such a way as to amplify that price increase. As a result, the financial system accumulates risks during a boom which are realised when the process is reversed in the following downturn[10]. He also argues that, in the same way that the instability of the Millennium Bridge was found not to occur unless more than 56 pedestrians were crossing it, there is a threshold level of disturbance below which financial instability does not occur. That threshold level would be difficult, if not impossible, to estimate, but he notes that a factor that distinguished the crash of 2008 from the bursting of other bubbles had been the introduction of securitisation. Professor Shin maintains that, instead of dispersing risks, as had been intended, securitisation undermined financial stability by concentrating risk upon financial intermediaries. It was securitisation that allowed banks to leverage up in tranquil times while concentrating risks in the banking system, by inducing banks and other financial intermediaries to buy each other's securities with borrowed money[11].
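The amplification mechanism can be sketched numerically. Suppose a bank targets a fixed ratio of assets to equity, so that a rise in asset prices (which raises its equity) prompts it to buy more of the asset, pushing the price up further. All of the figures below, and the simple price-impact rule, are invented for illustration; this is a stylised sketch, not Professor Shin's model.

```python
# Toy leverage-targeting bank: assets are marked to market, and the
# bank buys or sells to restore a fixed target leverage (assets/equity).
# The price-impact coefficient is a made-up illustrative parameter.
TARGET_LEVERAGE = 10.0
PRICE_IMPACT = 0.00002  # price moves with net purchases (assumption)

price = 100.0
units = 100.0          # units of the risky asset held
debt = 9_000.0         # initial balance sheet: 10,000 assets, 1,000 equity

price *= 1.01          # an initial 1% exogenous price shock
for step in range(5):
    assets = units * price
    equity = assets - debt
    desired_assets = TARGET_LEVERAGE * equity
    purchase = desired_assets - assets   # buy when leverage has fallen
    units += purchase / price
    debt += purchase                     # purchases financed by new debt
    price += PRICE_IMPACT * purchase     # buying pushes the price up further
    print(f"step {step}: price {price:.2f}, assets {units * price:,.0f}")
```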

The effects of financial markets on the economy

Joseph Schumpeter had argued in 1911 that the services provided by financial intermediaries - mobilising savings, evaluating projects, managing risk, monitoring managers, and facilitating transactions - stimulate technological innovation and economic development [12]. The World Bank economists Robert King and Ross Levine provided empirical support for that proposition in 1993 [13], and it has become the consensus view among economists. (However, a critical survey of the empirical evidence notes that the proposition is not universally accepted, and suggests that although the evidence is generally supportive, the connection is not fully understood [14].)

The choices facing investors

Risk limitation

The value of any investment is definitionally equal to the present value of its future cash flows when discounted at the investor's discount rate, but that method is of limited usefulness in valuing shares because of the uncertainties surrounding the future of the issuing company. It is conceptually possible to allow for those uncertainties by applying subjective probability weightings to each of what are conceived to be the possible outcomes, in order to produce an estimate of the investment's net present expected value (see the article on net present value). If such a calculation were feasible, a rational choice would be to buy if the net present expected value (net, that is to say, of the purchase price) is greater than zero - or, even better, to buy the asset that has the largest positive net present value of all the assets that are on offer. But it would not be rational to devote all of one's savings to that asset, even if the probable outcome had been correctly estimated. Every investor needs to limit the risk of total loss; and investors differ in their attitudes to less important risks.
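As a concrete illustration of that calculation (with invented figures throughout), the sketch below weights three conceivable payoff paths by subjective probabilities, discounts each at the investor's discount rate, and subtracts the purchase price to give a net present expected value.

```python
# Net present expected value of an asset with an uncertain future.
# Outcomes, probabilities, discount rate and price are invented figures.
DISCOUNT_RATE = 0.08
PURCHASE_PRICE = 700.0

# (subjective probability, cash flow received in each of the next 3 years)
scenarios = [(0.5, 400.0), (0.3, 300.0), (0.2, 50.0)]

npev = -PURCHASE_PRICE
for probability, cash_flow in scenarios:
    present_value = sum(cash_flow / (1 + DISCOUNT_RATE) ** year
                        for year in range(1, 4))
    npev += probability * present_value

print(f"net present expected value: {npev:.2f}")  # rational to buy if > 0
```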

The well-known way of limiting such risks is to buy a diversified share portfolio - a strategy that was analysed in detail in the 1950s by the economist Harry Markowitz [15][16]. Markowitz reasoned that what matters is the riskiness of the portfolio rather than that of its components, and that the riskiness of the portfolio depends not so much upon the riskiness of its components as upon their covariance, or the tendency of their prices to rise and fall in concert. He went on to develop what has come to be known as "portfolio theory", concerning the problem of adjusting a portfolio mix to give the maximum return for a given level of risk. Complex procedures are involved, in which assets are grouped according to their riskiness and their covariance. The risk of holding an equity came to be categorised as consisting of "unsystematic risk", which can be reduced by diversification, and "systematic risk", which results from the rise and fall of the equity market as a whole. Modern portfolio theory now takes account of an extension of the Markowitz analysis, developed by James Tobin [17], to include cash and the possibility of borrowing in order to invest. Tobin demonstrated that the process of finding an optimum portfolio for a given level of risk involves two separate decisions: first finding an optimal mix of equities, and then combining it with the amount of cash necessary to meet the risk requirement - a result known as "Tobin's Separation Theorem". He also argued that in a perfect market with only rational investors, the optimal mix of equities would consist of the entire market.
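Markowitz's point that covariance, rather than individual riskiness, drives portfolio risk can be shown in a few lines. The sketch below computes the standard portfolio variance for two assets of identical volatility under three assumed correlations; the figures are invented, and the calculation is textbook portfolio arithmetic rather than a reconstruction of Markowitz's procedures.

```python
import numpy as np

# Two assets with identical volatility; only their covariance differs.
weights = np.array([0.5, 0.5])
vol = np.array([0.20, 0.20])  # annualised standard deviations (invented)

for correlation in (0.9, 0.0, -0.9):
    # Covariance matrix built from the volatilities and the correlation.
    cov = np.array([[vol[0]**2, correlation * vol[0] * vol[1]],
                    [correlation * vol[0] * vol[1], vol[1]**2]])
    # Portfolio volatility: sqrt(w' * Sigma * w).
    port_vol = np.sqrt(weights @ cov @ weights)
    print(f"correlation {correlation:+.1f}: portfolio volatility {port_vol:.3f}")
```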

Equity pricing

The value of an asset is determined by its expected rate of return which, in turn, is related to its riskiness. Competition may be expected to ensure that equities earn greater returns than government bonds in order to compensate their purchasers for undertaking greater risks. The difference for any given share is termed its risk premium. A theorem developed by the economist William Sharpe [18] proves that, under certain ideal circumstances, a share's risk premium will be equal to the equity market's risk premium multiplied by a factor that he termed Beta, which is related to the covariance of that share's rates of return with the corresponding rates for the equity market as a whole. The result is known as the Capital Asset Pricing Model (CAPM) [19]. Sharpe's proof depends upon the assumption that all investors effectively free themselves of "unsystematic" risk by diversification and receive a risk premium only for the remaining "systematic" risk (he argued that rational investors in a perfect market would arbitrage away any premium gained in return for avoidable risks). Subsequent investigators have tried to establish whether, despite those somewhat unrealistic assumptions, the stock market behaves as predicted by the model. A 1972 study of the New York Stock Exchange during the period 1931-65 broadly confirmed the existence of proportionality between the prices of shares and their Betas [20]; a 1992 study of the New York, American and NASDAQ stock exchanges during the period 1963-90 did not indicate any such proportionality [21]; and the findings of a 1993 paper using a different methodology tended to confirm the CAPM prediction [22]. The controversy continues, but many economists believe that Beta is a significant factor, although not the only factor, that influences share prices.
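Beta can be estimated directly from the covariance of a share's returns with the market's. In the sketch below the return series are simulated, the risk-free rate and market premium are assumed figures, and the final line applies the standard CAPM statement that the expected return equals the risk-free rate plus Beta times the market risk premium.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Simulated monthly returns: the share is constructed with a true Beta
# of about 1.5 against the market, plus diversifiable noise.
market = rng.normal(0.01, 0.04, size=240)
share = 1.5 * market + rng.normal(0.0, 0.03, size=240)

# Beta = covariance of the share's returns with the market's returns,
# divided by the variance of the market's returns.
beta = np.cov(share, market, ddof=1)[0, 1] / np.var(market, ddof=1)

# CAPM: expected return = risk-free rate + Beta x market risk premium.
# The 3% risk-free rate and 7% equity premium are assumed figures.
risk_free, market_premium = 0.03, 0.07
print(f"estimated Beta: {beta:.2f}")
print(f"CAPM expected annual return: {risk_free + beta * market_premium:.1%}")
```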

The possibility that other factors exert a significant influence is allowed for in an extension of the CAPM methodology, termed the "Arbitrage Pricing Theory" (APT) [23][24]. The theory leaves it to its users to identify the factors likely to influence the price of a share and to weight them according to their relative importance. Firm size, price-earnings ratio, and dividend yield have been found to be relevant factors, as well as factors that are relevant to the markets in which the firm operates [25]. In the 1970s, however, two American economists came to realise that the future volatility of the price of an asset is already allowed for in the operation of the options market, and that it should be possible to deduce the market's expectation of its volatility from the prices ruling in that market. Fischer Black and Myron Scholes developed what came to be known as the "Black-Scholes Model" and published their results in a 1973 paper [26] - an achievement that eventually resulted in the award of the Nobel prize for economics [27]. The model can be used either to determine the fair price for an option on an asset from an estimate of its price volatility, or to estimate the market's expectation of the asset's price volatility from the price of an option for it. (The mathematical form of the theorem, and some of the assumptions on which it depends, are set out on the tutorials subpage.) The Black-Scholes theorem was used by Robert Merton as the basis for a technique known as "Contingent Claims Analysis" that can be applied to the pricing of almost any form of financial asset [28].
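The closed-form Black-Scholes price of a European call option is short enough to implement directly. The sketch below uses the standard textbook formula (its mathematical form is set out on the tutorials subpage) with invented inputs; solving it in reverse for the volatility that reproduces an observed option price gives the market's implied volatility referred to above.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def black_scholes_call(spot, strike, rate, vol, maturity):
    """Standard Black-Scholes price of a European call option."""
    n = NormalDist().cdf
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * maturity) \
         / (vol * sqrt(maturity))
    d2 = d1 - vol * sqrt(maturity)
    return spot * n(d1) - strike * exp(-rate * maturity) * n(d2)

# Invented inputs: spot 100, strike 105, 5% rate, 20% volatility, 1 year.
print(f"call price: {black_scholes_call(100, 105, 0.05, 0.20, 1.0):.2f}")
```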

The financing choices facing corporations

The financing choices open to companies are determined by the choices open to investors - and that is true of choices concerning the issue of shares or bonds. A company's shareholders become its true owners only after all its debts have been repaid. In principle, therefore, their view of a company's debts should depend upon the opportunity they have to repay them. If they could do so costlessly, using money borrowed at the same rate as that paid by the company, then shareholders should be indifferent to the existence of debt. It should not matter to them whether they have shares in a company with no debt, or in a similar company with debt that could be costlessly repaid. That was the insight into the economics of company finance that was put forward by Modigliani and Miller in 1958 [29]. On those assumptions the view of a company taken by the finance market would be unaffected, even by unlimited levels of gearing. Reality differs in several respects. Gearing increases the risk that the company's income might fall to a level at which it could not make its contractual interest payments - at which point it would become insolvent. On the other hand, it usually gives the company a tax advantage, because most tax systems treat interest payments as an expense that can be deducted from income before calculating tax liability. According to the "trade-off theory" of corporate finance, the appropriate decision-making procedure under those circumstances is to increase gearing to the point at which the tax advantage offsets the risk-adjusted cost of insolvency [30]. The rival "pecking order" theory [31] suggests that companies prefer the cheapest available form of finance, choosing retained profits, debt and equity in that order of preference. Most of the empirical evidence appears to favour the trade-off theory[32]. Much of the evidence also suggests that high gearing can have a negative effect upon corporate growth[33], but the exceptions in both cases suggest that there are other factors that have to be taken into account. Among possible additional factors are agency costs arising from conflicts of interest between shareholders and managers [34] and asymmetry of information between shareholders and managers [35], but it has been suggested that the threat of a hostile takeover, leading to replacement of an existing management, may mitigate such costs [36].
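In stylised form, the trade-off theory says to choose the debt level that maximises firm value, defined as unlevered value plus the tax shield on debt minus the expected cost of insolvency. The figures and the rising insolvency-probability rule in the sketch below are invented for illustration.

```python
# Stylised trade-off theory: firm value = unlevered value + tax shield
# on debt - expected cost of insolvency. All figures are invented, and
# the rising insolvency-probability rule is an illustrative assumption.
UNLEVERED_VALUE = 1_000.0
TAX_RATE = 0.30
DISTRESS_COST = 400.0  # loss suffered if the firm becomes insolvent

def firm_value(debt: float) -> float:
    # Insolvency becomes more likely as gearing rises (assumed rule).
    insolvency_prob = min(1.0, (debt / UNLEVERED_VALUE) ** 2)
    return UNLEVERED_VALUE + TAX_RATE * debt - insolvency_prob * DISTRESS_COST

# Search a grid of gearing levels for the value-maximising debt.
best_debt = max(range(0, 1001, 10), key=firm_value)
print(f"value-maximising debt: {best_debt}, "
      f"firm value: {firm_value(best_debt):.0f}")
```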

The problems facing the financial intermediaries

(For more detailed information about the operation of the financial intermediaries, see the articles on banking and the financial system.)

The problems facing the financial intermediaries arise mainly from the fact that they make up a tightly-coupled complex interactive system, in which an error of judgement in one of its members can have repercussions in many of the others. The investment banks, which are its largest element, are particularly sensitive to such errors because they borrow and lend vastly greater sums of money than they themselves possess. A mistake that involves a minor proportion of a bank's turnover could consequently have a devastating effect upon its own finances. That sensitivity to relatively minor errors also characterises financial institutions that operate at very high levels of gearing, because of their dependence upon sufficient earnings to meet their interest-paying obligations. Banks in particular have proved vulnerable to falls in the value of their assets, and there have been many bank failures and rescues. In the latter part of the twentieth century there were attempts to reduce that vulnerability by the adoption of sophisticated risk analysis. Schemes of "portfolio insurance" using options priced in accordance with the Black-Scholes model [37] became popular in the 1980s, and they were followed by a variety of tailor-made risk-management products. Widespread use was made of a variety of "value at risk" calculations (which apply standard probability distributions to the observed volatility of market prices) [38]. The fact that the volatility estimates that were used were based upon experience of a period of exceptionally low volatility, known as the great moderation, has been held to have been responsible for the crash of 2008 [39], and has thrown doubt upon risk management systems that depend upon such assumptions [40].
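A typical parametric "value at risk" calculation of the kind described applies a normal quantile to an estimated portfolio volatility. The sketch below uses invented figures; the second calculation illustrates the sensitivity that mattered in 2008, in that halving the volatility estimate (as a calm "great moderation" sample would suggest) halves the reported risk.

```python
from statistics import NormalDist

def parametric_var(portfolio_value: float, daily_vol: float,
                   confidence: float = 0.99) -> float:
    """One-day parametric (normal-distribution) value at risk."""
    z = NormalDist().inv_cdf(confidence)
    return portfolio_value * daily_vol * z

# Invented figures: a 100m portfolio with 2% daily volatility...
print(f"99% one-day VaR: {parametric_var(100e6, 0.02):,.0f}")
# ...versus the same portfolio measured over a calm sample suggesting
# only 1% volatility: the reported risk halves.
print(f"99% one-day VaR (calm sample): {parametric_var(100e6, 0.01):,.0f}")
```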

However, the increased competition that developed between financial intermediaries in the late twentieth century and the early twenty-first century led to reduced profit margins on individual transactions. Profitable trading came to require a large number of transactions, and increased gearing was adopted in order to finance them - with the effect of further increasing vulnerability. Thousands of professionally-managed, computer-operated hedge funds came into existence in the 1990s, some using borrowed money amounting to over twenty times their own capital.

The roles of financial regulators

In England, the need for regulation of the financial intermediaries became evident in 1866, when the collapse of the Gurney-Overend bank caused a panic in which large numbers of people tried to withdraw deposits from their banks, leading to the collapse of over 200 companies [41]. On that occasion the Bank of England had refused to help, but the influential commentator Walter Bagehot urged that in a future panic it should "advance freely and vigorously to the public out of its reserves"[42] in order to avoid another "run on the banks", and in 1890 the Bank rescued the failing Barings bank by guaranteeing loans to it by other banks [43]. In the United States there was similar initial inaction in the face of the much more serious panic of 1893, but following the further panic of 1907 Congress created the Federal Reserve System and granted it powers to assist banks that faced demands that they would otherwise be unable to meet. There was controversy among economists concerning the justification for such intervention. Monetarists such as Anna Schwartz argued that it should only be used to deal with a banking panic that would otherwise cause a substantial fall in the money supply [44]. Others argued that it should be used to deal with a wider range of mishaps, including the failure of a very large financial or non-financial firm [45][46]. The subsequent practice of central banks has generally been to provide short-term loans to solvent banks to tide them over temporary liquidity difficulties [47], and to provide or arrange longer-term loans to avert failures that would be large enough to threaten the stability of the banking system. Concern about the possibility that major bank failures in one country might infect others and destabilise the world's financial systems, as had happened in the Crash of 1929, led to international consultation, as a result of which, in 1988, the Basel Committee on Banking Supervision of the Bank for International Settlements recommended limits upon banks' capital adequacy ratios which took account of the riskiness of their assets. Their recommendations have since been implemented by most central banks.
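Under the Basel approach, assets are weighted by broad risk class before the capital ratio is computed. The weights in the sketch below follow the well-known Basel I categories, but the balance sheet itself is invented.

```python
# Basel I-style risk weighting (the standard published weight classes;
# the balance-sheet figures themselves are invented).
RISK_WEIGHTS = {"government bonds": 0.0, "interbank claims": 0.2,
                "residential mortgages": 0.5, "corporate loans": 1.0}

balance_sheet = {"government bonds": 200.0, "interbank claims": 100.0,
                 "residential mortgages": 400.0, "corporate loans": 300.0}
capital = 45.0

# Each asset contributes its value times its risk weight.
risk_weighted_assets = sum(RISK_WEIGHTS[asset] * value
                           for asset, value in balance_sheet.items())
ratio = capital / risk_weighted_assets
print(f"risk-weighted assets: {risk_weighted_assets:.0f}")
print(f"capital ratio: {ratio:.1%} (the Basel minimum was 8%)")
```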

Until the 1980s, investment banks were not normally permitted to undertake non-financial activities, nor other financial activities such as branch banking, insurance or mortgage lending. In the 1980s, however, there was extensive deregulation of the banks, with the intention of increasing competition and improving efficiency [48]. Reserve requirements were relaxed, and restrictions upon the range of their financial activities were generally relaxed or removed [49]. There followed an extensive restructuring of most of the world's major financial systems, in which investment banking and branch banking organisations were merged, banks became closely involved in a wide range of non-banking activities such as mortgage lending and insurance, and new financial institutions came into being whose activities interacted with the new activities of the banking system. In 1997 an inquiry set up by the Australian Government recommended that, in view of the growing interdependence of the banking system with the remainder of the financial system, all such institutions should be regulated by a single agency [50], and that recommendation has since been followed by Japan and many European countries [51]. In Britain the Bank of England retained its responsibility for the provision of emergency liquidity as "lender of last resort", but its regulatory responsibilities were transferred to the existing Financial Services Authority [52]. Concern about the possible effect of these further developments upon the stability of the international financial system led to the precautionary activities of the international regulatory bodies that are described in the article on international economics. Among them, a revised version of the Basel Committee's recommendations, known as "Basel II" [53], which required central banks to ensure that banks were operating adequate risk-management systems, took effect at the beginning of 2008. Further strengthening of financial regulation has been recommended following the crash of 2008 [54].

(For an account of current proposals for regulatory reform, see the article on financial regulation.)

How it all worked out

Imperfect markets

The importance of the efficient market hypothesis lies not so much in what it says about investment analysts as in the implications of its embodiment in subsequent theories: a risk-assessment procedure that is based upon a hypothesis that only holds true most of the time may be expected to have limited reliability. Questions about its usefulness in such applications arise mainly from the known incidence of irrational behaviour. The existence in the market of "noise traders" need not invalidate the hypothesis, provided that most traders act rationally and that those who do not make only random mistakes. But two events suggest that, even if it nearly always holds true, there can be important exceptions in which those provisos are breached: the stock market crash of 1987 and the internet bubble of the 1990s. In defence of the hypothesis, Burton Malkiel argues that the 1987 crash can be explained mainly (but not entirely) in terms of rational behaviour, notes that professional analysts were very much involved in the creation of the internet bubble, and rests his defence upon the observation that bubbles are exceptional [55]. The findings of behavioural finance studies suggest, however, that occurrences of that sort are to be expected. The innate characteristics of the human mind have been shown to be responsible for habitual and persistent judgmental mistakes [56], some of which, such as "information cascades" [57], might be expected to lead to non-random price movements - and there have been many instances of cascades and herding behaviour in financial markets [58].
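An "information cascade" of the kind referred to can be simulated in a few lines: each trader receives a noisy private signal about an asset but can also observe predecessors' actions, and once the visible majority outweighs any single signal, traders imitate regardless of their own information, so prices can move on almost no new information. The majority-counting rule below is a simplified stand-in for the Bayesian updating used in the formal models.

```python
import random

random.seed(7)
TRUE_VALUE_IS_GOOD = True
SIGNAL_ACCURACY = 0.7   # each private signal is right 70% of the time

actions = []  # True = buy, False = sell
for trader in range(20):
    # Private signal: correct with probability SIGNAL_ACCURACY.
    signal = (random.random() < SIGNAL_ACCURACY) == TRUE_VALUE_IS_GOOD
    buys, sells = actions.count(True), actions.count(False)
    # Simplified cascade rule: if observed actions lean by two or more,
    # imitate the majority and ignore the private signal entirely.
    if buys - sells >= 2:
        action = True
    elif sells - buys >= 2:
        action = False
    else:
        action = signal
    actions.append(action)

print("actions:", "".join("B" if a else "S" for a in actions))
```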

Faulty insurance and capital mismanagement

The massive expansion in the use of portfolio insurance which occurred in the 1980s ended when its shortcomings were exposed by the stock exchange crash of 1987. Used by a privileged few, it had been very effective, but its widespread use eventually violated some of the Black-Scholes assumptions upon which it was based. It was designed to cope with fundamentally directionless price fluctuations, characterised mathematically as a random walk, and was ill-adapted to deal with a persistent downward trend. Moreover, the short selling which it required could not be achieved in the absence of willing buyers, who cease to be available at times of panic. In 1987, massive losses resulted from its use. Expert opinion differed as to whether it was among the causes of the crash, but it is generally accepted that it increased its severity [59][60][61]. In grossly simplified terms, it could be seen as a sophisticated form of the "stop loss" policy under which investors automatically respond to falling prices by selling stock, thereby contributing to the downward trend. The financial fragility of highly-geared hedge funds was subsequently demonstrated by the spectacular failure of "Long Term Capital Management" (LTCM). Launched in 1994, LTCM immediately attracted investment from other financial intermediaries, including many large investment banks, and raised $1.3 billion at launch. It was spectacularly successful for four years, but by 1998 it had lost over $4 billion, and a private sector rescue was organised by the Federal Reserve Bank[62]. There are differing accounts of the complex series of events that led to its failure [63][64], but it is generally agreed that it was finally brought down by a lack of liquidity. LTCM's assets eventually yielded a small profit, suggesting that it might not have needed rescue had it not been temporarily unable to meet its financial obligations because it could not find buyers for its assets when it needed them.
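The pro-cyclical selling can be seen from the hedge ratio of the synthetic put that portfolio insurance replicates: as the market falls towards the insured floor, the strategy dictates holding less stock, which means selling into the decline. The parameters in the sketch below are assumptions, and the hedge ratio is the standard Black-Scholes N(d1) used in the call-pricing example earlier.

```python
from math import log, sqrt
from statistics import NormalDist

def insured_stock_fraction(spot, floor, rate, vol, maturity):
    """Stock held under synthetic-put portfolio insurance: N(d1),
    the Black-Scholes hedge ratio for a put struck at the floor."""
    d1 = (log(spot / floor) + (rate + 0.5 * vol**2) * maturity) \
         / (vol * sqrt(maturity))
    return NormalDist().cdf(d1)

# Assumed parameters: 5% rate, 20% volatility, 1 year, floor at 95.
# As the market falls towards the floor, the rule sells stock.
for spot in (110, 100, 95, 90):
    fraction = insured_stock_fraction(spot, 95, 0.05, 0.20, 1.0)
    print(f"index {spot}: hold {fraction:.0%} in stock")
```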

Financial crises and systemic failure

The fact that the Federal Reserve Bank organised the rescue of LTCM was a recognition that it had become a significant component of a truly complex interactive system - and a recognition of that system's fragility. Walter Bagehot had warned, over a hundred years previously, of the devastating effect upon the economy of a "run on the banks" and the consequent withdrawal of the credit upon which the non-financial sector depends, and the eminent United States economist Ben Bernanke later suggested that banking failures had been responsible for the depth and longevity of the Great Depression of 1929 [65]. The devastating effect of the failures of their recently deregulated banking systems upon the economies of the Asian "tigers" had provided a further warning of what could happen [66][67]. Although LTCM was not a bank, it was feared that its linkages with the banks, through loans and investments, were so strong that its failure could threaten their ability to provide liquidity on request to the rest of the economy[68]. The Bank was criticised for that rescue on the grounds that it created a moral hazard by motivating financial intermediaries to take unwarranted risks in the expectation of rescue if things went wrong, and for that reason regulatory authorities worldwide have since been reluctant to act in the absence of systemic danger. (Policy reasons other than the fear of systemic failure had been behind the United States government's earlier $150 billion rescue of the "Savings and Loans" domestic mortgage-lenders [69][70].) In Britain, no attempt was made to rescue Barings, which was Britain's oldest investment bank, when it failed for the second time in 1995 [71], and moral hazard considerations delayed the rescue of the Northern Rock bank in 2007 [72]. In the United States, there were no more bank rescues until liquidity problems crippled its fifth largest investment bank, Bear Stearns, in 2008.

The troubles at Northern Rock and Bear Stearns were followed by further bank failures and rescues resulting from the 2007 subprime mortgage crisis in the United States [73][74]. In earlier days the consequences of such a crisis would have been confined to those immediately involved, but financial innovations such as securitisation have since spread the consequences of loan defaults over a wide range of banks and other financial intermediaries. Multiple defaults by house-owners caused widespread uncertainty about the value of assets relating to the US housing market, and a general unwillingness to offer cash in exchange for them. During the following year the subprime crisis triggered the crash of 2008, leading in turn to the subsequent world recession. An international working group attributed the crash to inadequate supervision by the regulatory authorities in the United States and elsewhere; to faulty risk-management by firms, accentuated by bonus schemes that encouraged disproportionate risk-taking; and to undue reliance upon inaccurate assessments by credit-rating agencies [75].

A survey conducted during the early stages of the crash by the former chairman of Britain's Financial Services Authority concluded that the international regulatory system was flawed and recommended a strengthening of the Financial Stability Forum [76], and former Federal Reserve chairman Paul Volcker described the crisis as "the culmination... of at least five serious breakdowns of systemic significance in the past 25 years" [77].

References

  1. Milton Friedman: Essays in Positive Economics, page 139, Phoenix Books, 1953
  2. Hyun Song Shin: Interview with Romesh Vaitilingam, 4 September 2009
  3. Alfred Cowles: "Can Stock Market Forecasters Forecast?", Econometrica, July 1933
  4. Paul Samuelson: "Proof that Properly Anticipated Prices Fluctuate Randomly", Industrial Management Review, 6, 1965
  5. Eugene Fama: "Efficient Capital Markets: A Review of Theory and Empirical Work", Journal of Finance Vol 25 No 2, 1970
  6. Eugene Fama: "Efficient Capital Markets II", Journal of Finance, December 1991
  7. Andrew W. Lo: Reconciling Efficient Markets With Behavioral Finance: The Adaptive Markets Hypothesis, The Journal of Investment Consulting, Vol 7 No 2, 2005
  8. Hyman Minsky: The Financial Instability Hypothesis, Working Paper No. 74, The Jerome Levy Economics Institute of Bard College, May 1992
  9. Hyman Minsky: Stabilizing an Unstable Economy, McGraw Hill, 1986
  10. Hyun Song Shin: Risk and Liquidity, Chapter 3, Clarendon Lectures in Finance, Oxford University Press, 2009, forthcoming
  11. Hyun Song Shin: Securitisation and Financial Stability, Vox, 18 March 2009
  12. Joseph Schumpeter: The Theory of Economic Development, Harvard University Press, 1911
  13. Robert King and Ross Levine: Finance and Growth: Schumpeter Might Be Right, WPS 1083, Country Economics Department, The World Bank, February 1993
  14. Paul Wachtel: How Much Do We Really Know about Growth and Finance?, Paper presented at a Federal Reserve Bank of Atlanta conference on finance and growth, November 15, 2002
  15. Harry Markowitz: "Portfolio Selection", The Journal of Finance, Vol. 7, No. 1, March 1952
  16. Harry Markowitz: Portfolio Selection: Efficient Diversification of Investments, Wiley, 1959
  17. James Tobin: "Liquidity Preference as Behavior Towards Risk", The Review of Economic Studies, Vol. 25, No. 2, February 1958
  18. William Sharpe: Portfolio Theory and Capital Markets, McGraw-Hill, 1970
  19. For the mathematical form of the CAPM model, see the Tutorials subpage
  20. Michael Jensen, Fischer Black, and Myron Scholes: "The Capital Asset Pricing Model: Some Empirical Tests", in Michael C. Jensen (ed.), Studies in the Theory of Capital Markets, Praeger Publishers Inc., 1972
  21. Eugene Fama and Kenneth French: "The Cross-Section of Expected Stock Returns", The Journal of Finance, Vol. 47, No. 2, 1992
  22. Ravi Jagannathan and Zhenyu Wang: The CAPM is Alive and Well, Federal Reserve Bank of Minneapolis Staff Report 165, 1993
  23. Stephen Ross: "The Arbitrage Theory of Capital Asset Pricing" in Journal of Economic Theory, December 1976
  24. For a mathematical statement of the arbitrage pricing theory see the Tutorials subpage
  25. Richard Roll and Stephen Ross: "An Empirical Investigation of the Arbitrage Pricing Theory" in Journal of Finance December 1980
  26. Fischer Black and Myron Scholes: "The Pricing of Options and Corporate Liabilities", The Journal of Political Economy, Vol. 81, No. 3, May-June 1973
  27. Press release for the award of the Nobel Prize in Economics to Robert Merton and Myron Scholes, Nobel Committee 1997
  28. Robert Merton: "On the Pricing of Corporate Debt: the Risk Structure of Interest Rates", Journal of Finance, May 1974
  29. Franco Modigliani and Merton Miller: "The Cost of Capital, Corporation Finance and the Theory of Investment", American Economic Review, June 1958
  30. Peter Brierley and Philip Bunn: "The Determination of UK Corporate Capital Gearing", Bank of England Quarterly Bulletin, August 2005
  31. Stewart Myers and Nicholas Majluf: Corporate Financing and Investment Decisions When Firms Have Information That Investors Do Not Have, Sloan School of Management Working Paper No 1523-94, 1983
  32. Eugene Fama and Kenneth French: Financing Decisions: Who Issues Stock, Center for Research in Security Prices Working Paper No 549
  33. Eli Ofek and Rene Stulz: Leverage, Investment, and Firm Growth, NBER Working Paper No 5165, July 1995
  34. Michael Jensen and William Meckling: "Theory of the Firm: Managerial Behavior, Agency Costs and Ownership Structure", Journal of Financial Economics, Vol 3 No 4 1976.
  35. Stewart Myers and Nicholas Majluf: Corporate Financing and Investment Decisions When Firms Have Information That Investors Do Not Have, Sloan School of Management Working Paper No 1523-94, 1983
  36. Michael Jensen and Richard Ruback: "The Market for Corporate Control: The Scientific Evidence", Journal of Financial Economics, April 1983
  37. Dean Furbush: "Program Trading", Concise Encyclopedia of Economics
  38. Thomas Linsmeier and Neil Pearson: Risk Measurement: An Introduction to Value at Risk, University of Illinois at Urbana-Champaign, Department of Finance Working Paper 96-04, July 1996
  39. David Beckworth: Did the Great Moderation Contribute to the Financial Crisis?, Macro and Other Musings, November 12, 2008
  40. Alan Greenspan: "We Will Never Have a Perfect Model of Risk", Financial Times, March 16, 2008
  41. James Taylor: "Limited Liability on Trial: the Commercial Crisis of 1866 and its Aftermath", Economic History Society Conference, 2003
  42. Walter Bagehot: Lombard Street: A Description of the Money Market, Scribner, Armstrong, 1874
  43. A list of subsequent rescues appears in paragraph 3 of Michael Bordo and Anna Schwartz: "Under What Circumstances, Past and Present, Have International Rescues of Countries in Financial Difficulties Been Successful?", Journal of International Money and Finance, 18, 1999
  44. Anna Schwartz: "Real and Pseudo-Financial Crises", in Crises in the Management of the World Banking System, Macmillan, 1986
  45. Hyman Minsky: "Financial Stability Revisited" in Reappraisal of the Federal Reserve Discount Mechanism, Vol 3, Federal Reserve Bank 1972
  46. Charles Kindleberger: Manias, Panics and Crashes, Macmillan, 1978
  47. Xavier Freixas: Lender of Last Resort: a review of the literature, Bank of England Publications, 1999
  48. James Barth and Gerard Caprio: Rethinking Bank Regulation: Till Angels Govern, Cambridge University Press, 2008
  49. Claudio Borio and Renato Filosa: The Changing Borders of Banking: Trends and Implications, Working Paper No 43, Bank for International Settlements, December 1994
  50. Report of the Financial System Inquiry, Australian Government Publishing Service, March 1997
  51. See Fig 5 of Howard Davies: "What Future for the Central Banks", World Economics, Vol 7 No 4, October 2006
  52. The Financial Services Authority
  53. Core Principles for Effective Banking Supervision, Basel Committee on Banking Supervision, Bank for International Settlements, 2006 (Basel II)
  54. Report of the Financial Stability Forum on Enhancing Market and Institutional Resilience, Bank for International Settlements, April 2008
  55. Burton Malkiel: "The Efficient Market Hypothesis and its Critics", Journal of Economic Perspectives, Winter 2003
  56. Nick Gardner: Mistakes: How They Have Happened and How Some Might Be Avoided, Chapter 5, Nick Gardner 2007
  57. Cascades (bibliography).
  58. David Hirshleifer and Siew Hong Teoh: Herd Behavior and Cascading in Capital Markets: A Review and Synthesis, Dice Center Working Paper No. 2001-20, Paul Merage School of Business, University of California, January 2002 (reviewed)
  59. For the 1987 crash as seen by a Presidential task force, see the Report of the Presidential Task Force on Market Mechanisms. Nicholas F. Brady (Chairman), U.S. Government Printing Office. 1988.
  60. For the 1987 crash as seen by a portfolio insurance operator, see Richard Bookstaber: A Demon of Our Own Design, Chapter 2, John Wiley, 2007
  61. For a sociologist's view, see Donald MacKenzie: "The Big, Bad Wolf and the Rational Market: Portfolio Insurance, the 1987 Crash and the Performativity of Economics", Economy and Society, 33, 2004
  62. Hedge Funds, Leverage, and the Lessons of Long-Term Capital Management, Report of the President's Working Group on Financial Markets, April 1999
  63. Roger Lowenstein: When Genius Failed: The Rise and Fall of Long-Term Capital Management, Random House, 2000 (reviewed)
  64. Richard Bookstaber: "Long Term Capital Management Rides the Leverage Cycle to Hell", Chapter 7 of A Demon of Our Own Design, Wiley, 2007
  65. Ben Bernanke: Essays on the Great Depression, Princeton University Press, 2005
  66. Marcus Miller and Pongsak Luangram: Financial Crisis in South East Asia: Bank Runs, Asset Bubbles and Antidotes, CSCR Working Paper No 11/98, July 1998
  67. For an account of some other international financial crises, see the article on International economics
  68. Testimony of Chairman Alan Greenspan concerning the private-sector refinancing of LTCM before the Committee on Banking and Financial Services, U.S. House of Representatives October 1, 1998
  69. Lawrence White: The S&L Debacle: Public Policy Lessons for Bank and Thrift Regulation, Oxford University Press, 1991.
  70. Timothy Curry and Lynn Shibut: The Cost of the Savings and Loan Crisis, Federal Deposit Insurance Corporation Banking Review, 2000
  71. Report Of The Board Of Banking Supervision Inquiry Into The Circumstances Of The Collapse Of Barings, Bank of England, 18 July 1995
  72. The Run on the Rock, Report of the House of Commons Treasury Committee, January 2008
  73. The Subprime Lending Crisis, Report of the Joint Economic Committee of the United States Congress, October 2007
  74. The Sub-prime Crisis in Graphics, BBC News24, 21 November 2007
  75. Report of the Financial Stability Forum on Enhancing Market and Institutional Resilience, International Monetary Fund 5th February 2008
  76. Howard Davies: "The Future of Financial Regulation", World Economics, Vol 9 No 1, Jan-Mar 2008
  77. Paul Volcker: talk to The Economic Club of New York, April 8 2008