Thursday, 30 July 2009

Panic! No wait...don't panic

Robert Peston writes an entertaining, slightly scary, story about BT's pension liabilities.

But is this anything to worry about? Not really. He makes a few points that sound striking, but under closer examination are utterly unsurprising.

In an economic sense, BT's current and future pensioners own this totemic business.
True, but any business is "owned" both by its creditors and its shareholders. The creditors always have to be paid off first before the shareholders have clear title to the company's assets. And in most large, old companies, pensioners are among the biggest creditors. Look at General Motors, which has just handed over a majority stake in itself to its pensioners (via their union, the UAW).

To put this £8bn chasm in its pension-scheme into an appropriate context, the entire market value of the company is less than £10bn.
Also true, but meaningless. The pension deficit - like any other debt that BT has - is already factored into its market value.

If you take out a £300,000 mortgage on a £400,000 house, then the "market value" of your asset is £100,000. You could try to scare yourself by comparing your £300,000 debt to the £100,000 equity, but it wouldn't be very effective unless you're really subject to panic attacks.

The meaningful comparison is to enterprise value, which is much higher than the £10 billion market capitalisation. This is clearly indicated by the fact that the company, in the middle of a recession, generates over £1 billion of cash each year. This money-generating enterprise would be worth at least £20 billion on any sensible valuation.
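The mortgage analogy maps directly onto the numbers in the post. A quick sketch in Python, using the post's rough figures and an assumed 5% discount rate (both illustrative only, not a valuation of BT):

```python
# Illustrative sketch of the market-cap vs enterprise-value point.
# Figures are the rough ones from the post; the 5% discount rate is
# an assumption for illustration.

def perpetuity_value(annual_cash_flow, discount_rate):
    """Value of a level perpetual cash flow: CF / r."""
    return annual_cash_flow / discount_rate

def enterprise_value(market_cap, debt_like_claims):
    """EV = equity value + debt-like claims (including a pension deficit)."""
    return market_cap + debt_like_claims

market_cap = 10e9        # ~£10bn market capitalisation ("the house minus the mortgage")
pension_deficit = 8e9    # ~£8bn pension deficit (a debt-like claim)

ev = enterprise_value(market_cap, pension_deficit)
print(f"Enterprise value: £{ev / 1e9:.0f}bn")   # £18bn

# A business throwing off £1bn of cash a year, valued as a perpetuity
# at a hypothetical 5% discount rate:
pv = perpetuity_value(1e9, 0.05)
print(f"Perpetuity value at 5%: £{pv / 1e9:.0f}bn")  # £20bn
```

The point of the sketch: the £8bn deficit is already inside the £10bn equity number, just as the £300,000 mortgage is already inside the £100,000 of home equity.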

...if BT were forced to value its pension liabilities on the basis of the yield on gilts...well then the deficit would be well over £20bn
...and if my aunt had balls, she'd be my uncle. There's no reason whatsoever that BT should use gilt yields to discount its pension liabilities, as it can earn a much higher return by investing in its trading assets than it could get on gilts.

Having said all that, the story is informative and sheds interesting light on the internal structure of BT's business. It just isn't as big a deal as Robert would like to make us think. But then, news that plays up to the audience's fears and prejudices always gets more eyeballs.

Wednesday, 29 July 2009

Buzz about behavioural finance

Lots of behavioural finance conversations going on on the blogs today and yesterday.
  1. Chris Dillow of Stumbling and Mumbling replies to my proposal for governments to take into account cognitive bias while regulating.
  2. Simon Johnson of Baseline Scenario responds to a debate between Richard Thaler and Richard Posner about financial regulation.
  3. Alex Tabarrok from Marginal Revolution highlights the difficulty of fighting asset bubbles, even if you have overcome the challenge of identifying them.
  4. Kenneth Arrow (via Conor Clarke of The Atlantic) argues that behavioural economics doesn't predict anything.
  5. Update: A friend points out this letter in the FT from John Maule calling for behavioural approaches to be used more in regulation and investment decisions.
I'd love to have time to engage in depth with all of these debates, but let me start with a couple of key points.

Commenters on both Chris's and Simon's posts use a familiar argument to dissent from the idea of behavioural regulation. Surely, this line goes, regulators are just as irrational* as consumers - perhaps more so, since they don't have their own money at stake.

In fact, this argument is wrong in two respects. First, regulators are not as irrational as consumers, for the following key reasons:
  • They are better trained. Just as doctors know more about health than patients, regulators know more about consumer finance than the vast majority of consumers. Experimental results (see Alex's posting for some details) show that more experience and better information make people more rational.
  • They have more time to analyse the issues. Many behavioural phenomena arise because people make decisions in the moment, when careful reflection would identify an alternative course of action.
Second, regulators are not trying to second-guess every action of consumers. They should only act in cases where consumers clearly act against their own interests.

There is a valid philosophical point about what a consumer's interest really is. It may be that the customer's genuine preference, in the moment, is to sign the mortgage with a low introductory interest rate, even though the rate will reset to an excessive level in six months. You could make a case, relying on high discount rates or rational ignorance, that it isn't worth the consumer's while to inform themselves - or care - about impoverishing themselves next year.

Just as, to use Thaler's example, you could make a case that hard-up parents might be better off buying a cheap crib which endangers their baby's life, in order to save £50 which they would instead spend on feeding their child.
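To make the "high discount rates" point concrete: behavioural economists often model present bias with quasi-hyperbolic (beta-delta) discounting, under which the same one-day delay looms large today but is negligible a month out. A toy sketch - the beta and delta values are arbitrary illustrations, not estimates:

```python
def discount(t, beta=0.7, delta=0.999):
    """Quasi-hyperbolic (beta-delta) discount factor for day t.
    Today (t=0) is undiscounted; every future day takes an extra
    one-off 'present bias' penalty of beta."""
    return 1.0 if t == 0 else beta * delta ** t

def prefers_larger_later(small, t_small, large, t_large, beta=0.7, delta=0.999):
    """Does the agent prefer the larger-later reward to the smaller-sooner one?"""
    return large * discount(t_large, beta, delta) > small * discount(t_small, beta, delta)

# Today vs tomorrow: present bias makes the smaller-sooner reward win.
print(prefers_larger_later(100, 0, 110, 1))    # False
# The same one-day gap, viewed from 30 days out: larger-later wins.
print(prefers_larger_later(100, 30, 110, 31))  # True
```

This preference reversal is exactly what makes the teaser-rate mortgage attractive at the moment of signing, even to someone who would reject it when planning six months ahead.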

But most people would agree that there are some cases in which you can clearly see a consumer's best interest being violated.

These are the cases where regulators will act - when sufficient evidence is available - not in situations where the regulator might just have a different opinion to the consumer about the best choice to make.

The broad case is that, given certain assumptions, classical economic rationality is actually the best way for us to act. We demonstrably depart from it in some situations, and within carefully constrained limits, there is a case for pushing us back to it.

Chris is not arguing against my point directly, but makes some entirely valid public choice arguments for being careful in the scope of this regulation. Of course these examples don't prove that governments should never do anything at all. Equally, they don't prove that regulators should never try to combat cognitive biases. Of course it is more fun to try to pick positions that are completely opposed to each other, but the reality is that it's all about drawing boundaries at the best achievable position on a spectrum.

Separately, Arrow's point (ha ha) about behavioural economics. This is a valid criticism of some behavioural economics research. Lots of it aims to describe departures from classic rationality but doesn't help us to build better models. If these papers do have predictive value, it's mainly at the level of the individual and doesn't shed light on group or macroeconomic outcomes.

However, some behavioural work - and I include my own research - is specifically intended to build predictive models. Arrow's own work with Debreu is my inspiration here, so he has every right to set the challenge - which is to create an alternative to the classical general equilibrium theory, taking into account better behavioural models.

Rational choice may be the Enlightenment ideal for our thoughts and behaviour, just as costless transactions, perfect information, Coasean incentives and Pareto optimality are an ideal model for a welfare-maximising society. But until we can change our biology, or supplement it with electronics - and really, would we want to? - then we had better keep looking for improved models that describe and predict our real behaviour. I think that's what Arrow means, and if so, I wholeheartedly agree.

* I must get around to a better definition of rationality, because it's a fundamental term in this kind of debate and the argument can end up being about nothing but the meaning of the word. In this post, I broadly use rationality to mean the discounted-lifetime-utility-optimising behaviour of classically modelled economic agents, and irrationality to describe departures from that model. Tyler Cowen has an excellent essay about this.

Tuesday, 28 July 2009

Twenty years of economics

Robert Peston's article today isn't a bad summary of the last twenty years of economic theory.

Information asymmetry is the term for markets that don't work because proper information is not available to both parties - e.g. the broadband speed example that Robert gives.

Adverse selection is a related problem which causes credit to be mispriced. It works like this: companies know more about their business prospects than banks, thus banks will charge a premium on loans to account for the risk they run. This premium puts off creditworthy businesses, whose owners will prefer to invest their own money or raise it in the markets. This leaves only businesses which are middling or downright bad risks still willing to borrow. As the best businesses have left the market, this affects the average risk profile for the banks, who put the rates up again... discouraging the borderline cases and leaving only the really bad risks. Rates then go up again, and... you can see where this leads. This is classically called "the market for lemons" and can lead to a whole market ceasing to operate properly.
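The spiral described above is easy to simulate. A toy model, with all numbers invented for illustration (good firms have low default risk but also cheap alternatives to bank credit, so they have a low tolerance for high rates; bad firms are the reverse):

```python
# Toy simulation of the adverse-selection spiral: the bank prices for
# the average risk in the pool, the best risks leave, and the rate ratchets up.

def unravel(firms, rounds=10):
    """firms: list of (default_prob, max_rate_willing_to_pay).
    Each round the bank charges the break-even rate for the current
    pool; firms unwilling to pay that rate exit; repeat until stable."""
    pool = list(firms)
    rate = 0.0
    for _ in range(rounds):
        if not pool:
            break
        avg_default = sum(p for p, _ in pool) / len(pool)
        # Break-even gross rate with a zero risk-free rate:
        # (1 + r) * (1 - p_avg) = 1  =>  r = 1 / (1 - p_avg) - 1
        rate = 1 / (1 - avg_default) - 1
        stayers = [f for f in pool if f[1] >= rate]
        if len(stayers) == len(pool):
            break  # nobody else leaves: the market has settled
        pool = stayers
    return rate, pool

firms = [(0.02, 0.04), (0.05, 0.08), (0.10, 0.15), (0.20, 0.40)]
rate, pool = unravel(firms)
print(f"Final rate {rate:.1%}, firms left: {len(pool)}")  # Final rate 25.0%, firms left: 1
```

Starting from four borrowers, the pool unravels to a single firm: the worst risk, paying 25%.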

Cognitive biases are the specific aspects of psychology where we depart from making rational choices in a predictable way.

All of these problems, which are inherent in unregulated markets, lead to market failure. The discovery and analysis of these issues have been among the main outcomes of economic research in the last two decades.

This is a politically significant debate. The reason is that, unlike most of the earlier debates about free markets versus intervention, none of these problems arise from government action. Traditionally, government action would usually reduce the efficiency of markets; and the debate was about whether this was a worthwhile sacrifice.

In the case of the three items above, the market has a built-in inefficiency. Government regulation therefore may be a way of improving how the market works rather than replacing it.

In this respect the Conservatives' proposals about increasing information about financial products are useful; but they do not address the issue of cognitive bias, which can in fact be worsened by the availability of too much unfiltered information.

No party has a concerted programme to understand cognitive bias on a large scale and how it affects economic outcomes, although there are a few individual programmes - for instance Lord Darzi's recent report on health services includes some discussion of behavioural economics. I would encourage all the parties to give this some detailed thought, as it is becoming clearer and clearer that it's a critical aspect of how the economy works.

Update: Chris Dillow makes some good points in dissent.

Monday, 27 July 2009

The economics zeitgeist, 26 July 2009

This is a word cloud from all economics blog postings in the last week. I generate this every Sunday so please subscribe using the links on the right if you'd like to be notified each time it is published.

It has been constructed from a list of economics RSS feeds from the Palgrave Econolog and other sources, and uses Wordle to generate the image, the ROME RSS reader to download the RSS feeds, and Java software from Inon to process the data.

You can also see the Java version in the Wordle gallery.

If anyone would like a copy of the underlying data used to generate these clouds, or if you would like to see a version with consistent colour and typeface to make week-to-week comparison easier, please get in touch.

Saturday, 25 July 2009

De-averaging and behavioural economics

From the Bloggers Circle:

John Copps of New Philanthropy Capital is applying lessons from the record industry to charitable donations. Just as record companies are creating a wider range of distinct products, in an effort to combat illegal copying and generate more revenue from devoted fans, charities should be doing the same to maximise their donations.

An anonymous commenter points out that this is just price discrimination - and many industries have been doing it for decades. True enough, but there are lessons to learn from this.

Price discrimination is less visible in either commodity or high-growth markets.

If you sell commodities, it's harder to use price discrimination - although you will still see it on a smaller scale. The petrol market is fairly competitive, without many proprietary products, and thus there is far less diversity of price than in, say, the retail market for coffee. But because it is high-volume and the margins are small, 5p per litre of extra revenue for "premium super unleaded" makes a big impact on profit.

High growth markets, on the other hand, are all about generating and delivering volume. The advice from Geoffrey Moore's Inside The Tornado, one of two seminal texts about high-tech sales and marketing, is "Just ship". No matter what, your job in a high-growth, high-demand market, is just to get product out the door as fast as possible with as little customisation as you can get away with. In this environment, margins are already high and there is much less to gain from price discrimination. Indeed if you do, you run the risk of confusing your market, diluting critical mass and losing momentum.

Where differential prices and products do emerge is in mature markets, especially those - such as music - where the product can easily be differentiated from your competitors and where growth is slow enough that you need to focus on managing margins instead of getting maximum volume. This is why the charitable sector presents an ideal opportunity for the technique.

We at Inon are working with several charities to help them structure their offering to maximise donations. Some key considerations are:
  • What does the donor get out of giving? Every action is motivated by something - whether material or psychological - and altruism is as subtle and multi-layered as is self-interest.
  • Giving is influenced by standard behavioural economics insights such as signalling, social proof and hyperbolic discounting. These effects have different strengths in different people, at different times and in different social contexts.
  • The disruptive power of financial motivation is especially important in charities. Several experimental results show that people are often demotivated by financial reward rather than being incentivised. It's especially important to keep this in mind when recruiting volunteers or incentivising donors for a charity.
Charities absolutely do need to tailor what they offer, and what they ask for, to each individual donor. To do so, they need a working psychological model of how people decide what to give, and to be able to work with that to fulfil the emotional needs that each donor reveals by interacting with the charity.

I will ask my clients' permission to post some case studies here, but in the meantime I can recommend some research by Steffen Huck of UCL on donations to the Bavarian State Opera House. The results are in some ways counterintuitive: for example the offer of donation matching by a corporate donor actually reduces giving by the public, while the knowledge that a large unconditional corporate donation has been made substantially increases it. There are lots of useful lessons in this research and in other behavioural findings.

Friday, 24 July 2009

The economics zeitgeist, 19 July 2009

This is a word cloud from all economics blog postings in the last week. I generate this every Sunday so please subscribe using the links on the right if you'd like to be notified each time it is published.

This week's posting is a few days late but the data is still based on the period from Sunday 12th to Sunday 19th.

It has been constructed from a list of economics RSS feeds from the Palgrave Econolog and other sources, and uses Wordle to generate the image, the ROME RSS reader to download the RSS feeds, and Java software from Inon to process the data.

You can also see the Java version in the Wordle gallery.

If anyone would like a copy of the underlying data used to generate these clouds, or if you would like to see a version with consistent colour and typeface to make week-to-week comparison easier, please get in touch.

Thursday, 23 July 2009

SuperFreakonomics update

When Stephen Dubner announced a contest to guess the number of Google hits there would be for "SuperFreakonomics" on November 3, my thought process went something like this:
  1. There are about 11,000 results now.
  2. There are 1.3 million for "Freakonomics".
  3. As the SuperFreakonomics launch comes closer, it will be discussed more and more.
  4. But it is unlikely to get close to the total for Freakonomics - partly because many SuperFreakonomics pages will also have the word Freakonomics in them; partly because it's a sequel; partly because Freakonomics is a cultural phenomenon, surely beyond the original expectations, and the chances of Levitt and Dubner hitting two home runs in a row are low (nothing personal, guys, just stats).
  5. So somewhere in the 50,000-300,000 range is probably about right.
  6. I have no accurate way to predict where it will fall within that range.
  7. But - just like The Price is Right - success in this contest does not correlate precisely with accuracy; instead, it depends on being closer than any other entrant. There is a subtle difference. Winning an English auction does not arise from getting closest to the real value of the item; it comes from bidding $1 higher than the second highest bidder.
  8. So I analysed the figures submitted by all other contestants - fortunately this was not a sealed-bid contest - and picked the number - 88,782 - that gave me the highest probability of winning. According to my own little predictive model, of course.
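Step 8 is really a small optimisation problem: given the rivals' guesses and your own belief distribution over the outcome, pick the guess that maximises the probability of being the closest. A rough sketch - the rival guesses and the lognormal belief here are hypothetical, not the actual contest entries:

```python
import random

def win_probability(my_guess, rival_guesses, sample_outcomes):
    """Fraction of sampled outcomes for which my_guess is strictly
    closest (closest-guess-wins rules, like The Price is Right in spirit)."""
    wins = 0
    for x in sample_outcomes:
        mine = abs(x - my_guess)
        if all(mine < abs(x - g) for g in rival_guesses):
            wins += 1
    return wins / len(sample_outcomes)

random.seed(0)
# Assumption: my belief about the final hit count, roughly spanning
# the 50,000-300,000 range from step 5 (median ~110,000).
outcomes = [random.lognormvariate(11.6, 0.5) for _ in range(20000)]
rivals = [50_000, 120_000, 250_000, 1_000_000]  # hypothetical rival entries

candidates = range(60_000, 240_000, 5_000)
best = max(candidates, key=lambda g: win_probability(g, rivals, outcomes))
print(best, round(win_probability(best, rivals, outcomes), 3))
```

The winning strategy crowds in next to rivals' guesses where your belief distribution puts the most mass - the same logic as bidding $1 over the second-highest bidder.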
However, at the time I did have a fear about the consequences of running this contest. My concern was that it is potentially quite easy to game.

Let's say that you bid 10 million. It's 15th October and the deadline is approaching - and Google still only shows 75,000 results. But you're desperate to win that signed copy of the new book.

So you start posting about SuperFreakonomics on your blog. And other people's blogs. And perhaps you hire a linkfarm company to spread the word around their automated content generation systems.

Pretty soon you could probably generate another 10,000 hits or so.

And if there are a hundred other contestants who also bid high, then soon enough there are a million new results.

Your chances to win are thus substantially improved - until the guy who guessed 500 million realises and starts doing the same thing.

This gaming is all one-way - there's no way to reduce the number of pages on the Web, so only those with high guesses are incentivised to participate.

Still, I relied on the likelihood that $50 worth of schwag would not be worth the effort for most people, and that gaming would be minimal. Thus I stuck with my guess, in the expectation that the increase would be slow - perhaps with an acceleration towards the launch date, but not an overwhelming one.

The last thing I expected was that the number of hits would reduce.

So imagine my astonishment™ when I searched a few days ago to find only 9,600 results. A glitch in the Google data refresh? Nope: this afternoon there are only 6,360. It is shrinking by the hour, as a slightly earlier search returned 6,400.

Is SuperFreakonomics dying out? Will there be only ten hits left by November? Or even - using the Buiter/Mankiw/Sumner Taylor rule extrapolation - minus two thousand?

I can only assume Dubner and Levitt's promotional scheme is backfiring and that the gods - or Google, their representative on Earth - somehow resist the attempts of mere mortals to generate searchable publicity through clever non-monetary incentives. Does this mean social media is dead? Is this the final revenge of neoclassical rational economics? Or are all those predictions of deflation just belatedly coming true?

Update 30th July: Down to 5,180 now.

The Onion scoops a major development in monetary policy

And to continue the topic of my last posting... The Onion really is America's Finest News Source. Though I suspect they may not be aware of the full import of their revelation:

Maybe the WSJ will be reporting the same thing soon.

From this video, around 2:30. The rest of the clip is also quite funny.

Wednesday, 22 July 2009

Is the Bank of England targeting nominal GDP?

How interesting. The Bank of England, whose notional target is a 2% inflation rate (CPI), is now looking at cash GDP in deciding its quantitative easing policy.

That's according to Stephanie Flanders, who has an intelligent writeup of the Bank's considerations in deciding whether to print another £25 billion to purchase government and corporate bonds.

Scott Sumner will surely be pleased to hear it: he has been advocating for a while that central banks should target nominal GDP. What's more, they are looking not at the rate of change, but at:
...the Bank's expected path for cash GDP in the next year or two...
implying that the target is an absolute level, so if there's a shortfall this year they may even try to make it up next year.
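The difference between targeting the growth rate and targeting the level path is worth making concrete. A toy calculation (the 5% path and the 3% shortfall are illustrative numbers, not Bank of England figures):

```python
def level_target_path(base, growth, years):
    """Target level path: base * (1 + growth)^t for each year t."""
    return [base * (1 + growth) ** t for t in range(years + 1)]

base = 100.0
path = level_target_path(base, 0.05, 2)   # ~ [100, 105, 110.25]

# Suppose year-1 NGDP comes in 3% below target:
actual_year1 = path[1] * 0.97             # ~ 101.85

# A growth-rate target just asks for 5% from wherever we are...
growth_target_year2 = actual_year1 * 1.05          # ~ 106.94
# ...but a level target asks us to get back onto the original path:
required_catchup = path[2] / actual_year1 - 1
print(f"Catch-up growth needed: {required_catchup:.1%}")  # 8.2%
```

Under a level target, bygones are not bygones: a shortfall this year raises the growth required next year, which is exactly the "make it up next year" behaviour described above.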

Now there's some way to go from using NGDP as one of the considerations in guiding this month's policy, to setting a formal rising path at 5% a year, trading a futures index and giving up inflation targets altogether. But this is an intriguing step.

What with the Bank of Sweden imposing negative interest rates on reserves, and the Bank of England targeting NGDP, Scott Sumner might turn out to be a prophet of 21st century monetary policy. After all, prophets often start out as cranks...

Update: The Onion reports a major new development in monetary policy.

Tuesday, 21 July 2009

More on financial transparency

From a conversation with Richard Thinks yesterday:
Thanks for the link today...looking back through the emails I notice we had another conversation about transparency a few months ago. I think the idea of publishing standardised information about financial products is one of the strongest parts of Osborne's proposals - but the way it's sold as enabling "price comparison websites" is a bit misleading.

One of the oddities about the price comparison market (and I wrote, with a colleague, the first version of so I have a bit of inside knowledge) is that they make their profits precisely because the providers do not offer their products in a standardised way. Instead, everyone (deliberately) distinguishes their products in many different dimensions so that they cannot be directly compared. Ever tried to figure out which mobile phone deal is the cheapest? This is why those sites are so popular.

If everyone did publish their terms and conditions in a common machine-readable format, it's likely that open-source alternatives would spring up and take away the main advantage that the comparison websites have. This would probably be good for consumers as it would allow the comparison sites' commissions to be competed away in favour of cheaper products.
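To see why a common machine-readable format would do this: once terms and conditions are structured data, "which deal is cheapest for me?" collapses into a trivial computation anyone could publish as open source. A hypothetical sketch - the field names and tariffs are invented for illustration:

```python
# Hypothetical standardised tariff records, as providers might publish
# them in a common machine-readable format.
tariffs = [
    {"name": "A", "monthly_fee": 15.0, "included_minutes": 100, "per_extra_minute": 0.20},
    {"name": "B", "monthly_fee": 25.0, "included_minutes": 500, "per_extra_minute": 0.10},
    {"name": "C", "monthly_fee": 35.0, "included_minutes": 2000, "per_extra_minute": 0.05},
]

def monthly_cost(tariff, minutes_used):
    """Total monthly cost for a given usage profile."""
    extra = max(0, minutes_used - tariff["included_minutes"])
    return tariff["monthly_fee"] + extra * tariff["per_extra_minute"]

def cheapest(tariffs, minutes_used):
    """The tariff minimising cost for this user - no commission required."""
    return min(tariffs, key=lambda t: monthly_cost(t, minutes_used))

print(cheapest(tariffs, 80)["name"])     # A
print(cheapest(tariffs, 400)["name"])    # B
print(cheapest(tariffs, 3000)["name"])   # C
```

The comparison sites' edge today is precisely that real tariffs are *not* published like this, so someone has to do the messy normalisation by hand.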

I have argued for a variant of this idea in the financial markets, enabling bond investors to more accurately gauge the risk profile of the companies (especially banks) that they lend money to - essentially so they can do the work themselves, using their own models, instead of relying on the rating agencies, which are very much the bond market equivalents of price comparison sites. I think that between me, George Osborne and Thaler & Sunstein, the concept is gaining momentum - so I expect we'll find out soon enough whether it helps or not.

And today in the Wall Street Journal, Kenneth Scott and John Taylor propose the same thing for mortgage-backed securities (via Justin Fox who points out it's a classic market for lemons).

Monetary versus fiscal policy

The debate between proponents of monetary and fiscal policy remains surprisingly interesting all these months later.

Not so much at Robert Peston's blog, unfortunately, where the question is more or less ignored; instead it is reduced to a question - without an answer - about how expensive it will be for the government to finance its debt.

But have a look at some of the comments on this Worthwhile Canadian Initiative posting. Adam P in particular makes an intriguing point:
Moreover, this really goes to a distinction that people are often not careful about (I was trying to make this point in the discussion of your 'why fiscal policy won't work competition'). We need to decide on what we mean for fiscal stimulus to "work". There are two distinct questions:

1) Can fiscal policy increase output in a liquidity trapped economy? Clearly yes.
2) Can fiscal policy end the recession, break the trap? Only by increasing expected inflation which could also be done by monetary policy (and doing it with monetary policy is clearly the better way).
Now I don't think Adam has quite demonstrated this point. It's a standard assertion among monetary theorists that monetary actions are "clearly" better than fiscal ones. Paul Krugman and others would presumably say that:
  1. Monetary policy has run out now anyway
  2. Fiscal policy can achieve economic adjustment in a different way, by creating investment in public goods and increasing expectations of future aggregate demand
The debate on Peston's article should therefore take into account not just whether the government "can finance its debt" - after all it can, always. It should consider whether financing it by printing money is a good thing. Monetary economists, such as many of those on the right wing who are concerned about public debt, would say yes. And after all there is really no issue about paying the Bank of England "its" money back - the impact of not doing so is only to create a bit of inflation. Which is exactly what we want now.

Two more points on this.

First, Peston's £220 billion figure for new gilt issuance. He claims that this will be the same again next year because the government won't cut spending or raise taxes. Wrong! The reason it got so high this year is not because of increased spending or reduced taxes: it is because of the automatic impact of the recession, mainly on tax receipts. The British public deficit is highly sensitive to recessions for various reasons. In fact, most economists think an economic recovery is either already underway, or is about to be, and so the deficit will automatically shrink. I would expect next year's borrowing figure to be at least a third less than this year's even with no policy changes.

Second, and more interesting from a theoretical point of view. How much does this automatic fiscal stimulus contribute towards recovery? Recovery from a recession can come from various sources:
  1. Adjustment of prices to a new full employment equilibrium (more or less the monetarist viewpoint - inflation makes this adjustment much easier because most individual prices and wages will not adjust downwards, so instead you adjust the general price level upwards to compensate)
  2. Satisfying the demand for excess saving by increasing the money supply (printing money) and thus forestalling the paradox of thrift
  3. Getting out of a 'sunspot' or a low-output equilibrium by raising expectations of future demand, therefore encouraging people to spend more (the Keynesian multiplier effect)
  4. Directly increasing aggregate demand, creating investment opportunities which absorb the desired savings of the private sector (either because the public sector invests directly - the US fiscal stimulus case - or because the private sector finds more opportunities to invest)
Most of these routes are complementary - with a partial exception in the last case, if "crowding out" happens - that is, the demand for government borrowing makes private borrowing for investment more expensive. But crowding out does not appear to be a worry, at least in the short term: the reason for low private investment is not high interest rates, but a perception that there are few profitable investment opportunities available.

Thus we may never know which of the above was the "real" cause of recovery - because the economy will simultaneously adjust in response to all of them.

Monday, 20 July 2009

Andrew Lo's adaptive markets and the Slow EMH

Andrew Lo, writing in the FT, says:
...human behaviour is hardly rational, but is driven by "animal spirits" that generate market bubbles and busts, and regulation is essential for reining in misbehaviour.
Regular readers won't be surprised to see me agreeing with this, and indeed I have a proposal for how that regulation could work.

However I am suspicious about any theory which does not make testable predictions, and I fear that Lo's "adaptive markets hypothesis" may fall into this category.
This "Adaptive Markets Hypothesis" (AMH) - essentially an evolutionary biologist's view of market dynamics - is at odds with economic orthodoxy, which has been heavily influenced by mathematics and physics...The formality of mathematics and physics, in which mainstream economics is routinely dressed, can give outsiders a false sense of precision.
...fixed rules that ignore changing environments will almost always have unintended consequences...The only way to break this vicious cycle is to recognise its origin - adaptive behaviour - and design equally adaptive regulations to counterbalance human nature.
Just like Shiller and Akerlof's Animal Spirits, which has influenced Lo's proposal, and like Nassim Taleb's Black Swan - the AMH appears to be a narrative proposal requiring creative interpretation, instead of a model with specific outcomes that can be checked. That is the domain of literature, not science.

In essence, I agree with the project that Lo (and Shiller and Akerlof) have outlined. But instead of sidelining the mathematical tools of economics, we need to fiercely apply them to make this concept work.

I suggest we start with a smaller step: not to overthrow the EMH with a set of generic, unpredictable exceptions, but to modify it to take into account certain imperfections of behaviour in markets. Specifically a "slow EMH" which would work like this:
  1. In the long run, market prices do reflect all available information.
  2. However, some of the facts they reflect are not exogenous (externally given facts); they are the opinions of market participants. (Others are exogenous, but hard to observe - I will come back to those)
  3. These opinions take time to form and test; the key mechanism for testing them is to attempt to make or take a price, and observe whether people accept it. For example, when selling a house, a seller may set a price 5% higher than that achieved by a neighbour, in order to find out whether buyers are willing to pay.
  4. These tests, in turn, influence the opinions of other observers - those opinions then also become facts which are relevant to the price.
  5. The nature of such price-setting tests is that they rarely jump straight to the "correct" level; buyers and sellers are rarely willing to pay a price wildly out of line with the last price paid, and so the increments are more gradual than the EMH would imply.
  6. Over time, therefore, prices will gradually move towards a stable level which reflects external information; but in the meantime there may be a free lunch, if you are better or braver at interpreting the external information than the market on average.
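Steps 1-6 can be captured in a minimal partial-adjustment model. A sketch, with an arbitrary adjustment speed standing in for the opinion-forming-and-testing mechanism:

```python
def slow_adjust(fundamental_path, adjustment_speed=0.2, p0=100.0):
    """Each period the market price closes only a fraction of the gap
    to the (possibly shifting) fundamental value, instead of jumping
    straight to it as the standard EMH implies."""
    prices = [p0]
    for v in fundamental_path:
        p = prices[-1]
        prices.append(p + adjustment_speed * (v - p))
    return prices

# Fundamental value jumps from 100 to 150 at t=0 and stays there.
fundamentals = [150.0] * 10
prices = slow_adjust(fundamentals)
print([round(p, 1) for p in prices])
# A sustained "bull market" trend toward 150, not an instant jump -
# and a free lunch, for a while, for anyone who spotted the new
# fundamental value early.
```

With an adjustment speed below 1, the model produces exactly the long, gradual trends described in the next paragraph; at speed 1 it collapses back to the standard instant-jump EMH.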
This mechanism may explain why there are relatively long-term bull and bear markets in both housing and equities. The standard EMH implies that, say, house prices should jump directly to their new level as soon as a demographic change becomes apparent. Instead, we saw a continuously rising housing market from around 1992 to 2007, and a steady fall since then.

It may also explain markets which never reach stability: if the adjustment process is not complete by the time new external facts arise, a new process will start, overlapping the previous one, and stability may not be achieved.

About that free lunch: it won't be easy to earn, but consider that there are hard-to-measure exogenous changes and you may be able to predict the market's reaction to them. For instance, demographic changes and restrictive planning laws, which were largely predictable throughout the 1990s and 2000s, could be considered to have led to the house price boom in the UK. You could make a good case that the correct level of prices, based on these facts, was probably around the 2005 level (perhaps 2006 and 2007 were an overshoot). But prices in 1999 did not immediately jump to the level they would be at in 2005. Instead, many people had a general belief that prices should be higher, and tested that belief with gradual price increases. As buyers continued to bite, confidence in this belief grew, and that in turn justified a higher price.

If you can identify a market rationale for higher prices, combined with market mechanisms that slow down price adjustments, you may be able to make money. Even if you can't, you can understand markets better and know better when to rely on their signals.

This I call the Slowly Efficient Market Hypothesis, or Slow EMH. It is less universal than the AMH, but has the strength that, seemingly unlike the AMH, it can be mathematically modelled in a testable way. Look out for a future article which will do just that.

p.s. I am not at all convinced about the 'free lunch' idea in the above. I am certainly not ready to put any money behind it right now. But I do believe the EMH cannot operate instantaneously and that some form of Slow EMH is definitely at play.

p.p.s. The Wikipedia description of the AMH indicates it may actually make some more testable predictions than I assumed from reading the FT article. I have not yet read the 2004 paper but will do so and report back.

Sunday, 19 July 2009

The right way to regulate financial services

Robert Peston reports the Conservatives' new proposal for financial regulation. I'm not a fan of the title: "Proposal for sound banking" sounds like a "proposal for secure borders" or "proposal for safe streets after 11pm": the choice of language prejudices the solution.

But to be fair, every policy document has to be marketed to the voters. So I will try not to make assumptions about the content. Note that I haven't read the document, as it has not been published yet. But the Tories' embargoes are less strict than the government's [I still have an unpublished draft blog posting from two months ago based on Peston's accidental release, seven hours early, of the Treasury Select Committee's report on the banking sector], so he has been able to outline most of the key details of the document in advance of its release.

A simple summary:
  1. Macro-prudential regulation transferred from the FSA to the Bank of England, creating a new Financial Policy Committee
  2. The FSA's remaining consumer regulations supported by Thaler & Sunstein's proposal to electronically publish all details of credit card and other financial contracts
  3. Consider forced demergers of banks that are too large from a consumer point of view
  4. No Glass-Steagall-style separation of investment and regulated banks, but higher capital requirements on "risky" activities which should discourage excessive investment banking by deposit-insured retail banks.
Peston raises a few interesting questions about the medium-term future of the FSA and its staff, but I'm going to go in a different direction in this discussion. My concern is not the details of which of Lord Turner or Mervyn King can claim to be more powerful than the other; it's about the behaviour and incentives of participants in the financial markets.

The behaviour of bank employees is one thing; the behaviour of financial regulators is another; but both are a red herring. Banks and regulators make, on balance, only a marginal impact on market outcomes. Despite Brad DeLong's suggestion, banks are only slightly long on financial assets; the rest of the population have much more to worry about in the value of their savings assets than banks, and the rest of the population are also mostly the ones who borrow. The banking sector as a whole has a small net positive position on financial assets, while everyone else is divided into two groups, one with a huge ($50 trillion maybe) asset and one with a nearly equally vast liability.

(Incidentally, and again apropos of DeLong, this illuminates just why the financial bailout was politically irresistible: never mind the banks, it's 50% of the population whose assets were at stake. More to the point, it's about 75% of the likely voting population; and the loss aversion of savers is far stronger than that of taxpayers.)

The real point of financial regulation is to understand and influence the behaviour of the investors, depositors and borrowers who make up 99% of the population, not the bankers who make up 0.1% of it. How should we do that?

Thaler does have some valuable insights into this; and his proposal for transparency of consumer financial products is a good start. George Osborne has done well to pick this up. But it's only one of many possible directions.

A general theory of consumer finance behaviour is what the regulator really needs to understand and influence the sector appropriately. One simple example: to analyse the property market, add up the total relevant financial investments of all parties who are long on property, net off against those who are short, and compare it to the total market value of all real property assets. I suspect you'll find that - in 2006 at least - the net long position of financial investors in property exceeded the actual value of real estate against which the assets were secured.
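That aggregation exercise can be sketched with wholly invented figures (these are illustrative numbers only, not estimates of the actual 2006 position):

```python
# Hypothetical illustration of the aggregate property-exposure check
# described above. All figures are invented, in billions of pounds.

long_positions = {          # parties long on property via financial claims
    "mortgage lenders": 1200,
    "securitised mortgage holders": 800,
    "geared buy-to-let investors": 300,
}
short_positions = {         # parties effectively short property
    "short property derivatives": 100,
}

net_long = sum(long_positions.values()) - sum(short_positions.values())
property_stock_value = 2000  # total market value of the real estate itself

# The warning sign suggested in the text: financial claims secured on
# property exceeding the value of the property securing them.
overextended = net_long > property_stock_value
```

With these made-up numbers the net long position (2,200) exceeds the value of the underlying property (2,000), which is precisely the imbalance a regulator armed with this kind of accounting could watch for.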

On a smaller scale, examine the risk preferences of individual equity investors - for instance as proposed in this article - and compare to the actual potential financial returns of the equities they hold. If there's a discrepancy, that's the real explanation for financial risks and never mind trying to control the behaviour of the stockbrokers.

The Conservatives might have some valid ideas about the UK's regulatory triangle, but they aren't going to make any difference until the behaviour of consumers is incorporated. And regulating that is a much more subtle question. At least I hope it's subtle: anyone who wants to call out "debt is the problem" had better be willing to explain exactly what the right level of consumer debt is, relative to GDP, and justify their figure.

The electoral arithmetic must surely point the same way: how many voters in swing constituencies are going to choose the Conservatives because of their plans to transfer macro-prudential regulation from the FSA to a newly constituted committee within the Bank of England? If this is going to be a political issue, it's time to engage - intelligently - with the question of how we all lend and borrow economic resources from each other through the medium of finance. It's not an easy political fight to pick, but it's about the most important one there is.

Saturday, 18 July 2009

Some thoughts provoked by a few blogs

Some of these articles have been lingering in my spare browser tabs for weeks, so it's time to link to them and explore a few thoughts that they have stimulated.

Chris Dillow asks if economics is like Feynman's onion. It's a good analogy. People, much more than physical particles, are complex. This isn't to say we can't build models of how they behave: we can and should. And while there are lots of people decrying the state of economics (having been given permission to do so again this week by The Economist), those who think model-building is a bankrupt approach are wrong.

Model-building is the essential activity of economics and perhaps its greatest contribution to the social sciences. Indeed this is why it is so similar to physics: the best physicists know, as Feynman did, that they are not looking for an ultimate answer and even if they found one, they wouldn't know it. All we can ever seek is a model that describes the world well, and keep testing it, trying to make it break.

Economists likewise have a neoclassical model which provides some approximation at a description of human behaviour; it clearly has holes in it and those holes need to be eliminated. They can be patched up with the economic equivalent of epicycles - sticky wages, say, or a bit of endowment effect. But every so often the whole model should be replaced with something new, incorporating the best of the previous one but going beyond it. This is where I believe behavioural economics needs to go: not, as Chris points out, under the illusion that they are discovering the economic theory of everything, but with the ambition to provide the next generation of unifying model. No doubt in another fifty years it will be replaced again, but if we have a few decades of better economic predictions, greater wealth and welfare, and reduced poverty to show for it, it will have been as noble a cause as one could imagine.

And so, if you hear anyone say that economists' obsession with mathematical models caused the financial crisis, this is little better than saying that physicists' obsession with the mathematical laws of motion caused that bullet to hit JFK. The question isn't whether models should be used, it's how to make better ones and apply them in the most effective way.

As an aside, I'm believing more and more that we shouldn't call this field behavioural economics but cognitive economics: behaviour is just the outward manifestation of the phenomena we want to analyse, which are about how people think and how they value things.

One of the things they value highly is cultural products. Razia Iqbal, the BBC's culture editor, asks whether the creative industries can provide new business models which flourish in "the current economic climate"*. The cultural sector is expected to grow at a rate of 4% over the next few years, according to NESTA.

While there are always problems of defining the cultural or creative sector - at one point we applied for a venture capital mentoring programme and were allowed in on the basis that software is creative, while another funding application focused around software and applications of game-playing was turned down for not being creative enough - I have little reason to doubt this projection. The cultural sector should, at the least, gain in importance relative to manufacturing and other industries as a developed economy continues to mature.

But this isn't the most interesting area. The exciting potential is for what the cultural sector can do for the rest of the economy. Different types of business are not independent, mutually exclusive activities: instead they interact in a complex way. The automotive sector is strengthened by the contribution of the creative sector to its marketing and design processes; the music industry is boosted by the availability of new hardware on which to play songs; the property sector was enriched (albeit temporarily) by the evolution of the financial markets; and the aviation and engineering industries grew because of the mining sector's development of cheap oil sources, then returned the favour by providing the technology for miners to explore undersea and other remote sources of minerals.

The development of these links and relationships is at the heart of economic growth and I expect a major driver of the next decade's growth to be the application of cultural ideas and outputs to other types of product.

Few physical products stand alone any more; the value of most of them, at least in the rich economies, is a function of the subjective values that surround them. These include branding, aesthetics, social reinforcement, relative status, the stories that products evoke, the futures they allow us to imagine, and the moral content of the products. All of these factors contribute immensely to how we value the products we use, often as much or more than their material functionality.

The cultural sector, with its insights into human motivation and perception, has the potential to be in the coming decades what finance was for those just gone. It will be the enabler of a transformation in the value and output of every other part of the economy.

Incidentally, this cross-fertilisation between different industries is why arguments like this one from Rob Killick are erroneous. He complains about "innovative investment products" as if investment has no impact on the rest of the economy. But in fact, innovative investment products such as the limited liability company, the life insurance policy and the mortgage have enabled people to create productive companies, provide for themselves and their families and house themselves. While there is less low-hanging fruit now, there are still new investment concepts to be invented which will let the economy work better, use human time and capital more efficiently to make more stuff, and help that stuff to be better personalised - using cultural products - so that it makes consumers happier.

No doubt there are some products which are not so useful - Rob rightly suggests that financial engineering to get around the 50% tax rate is not a very good use of anyone's brain - but this shouldn't be used to condemn the whole finance sector as a parasite.

This is closely linked to the idea that the British economy should be based more on manufacturing and less on services (particularly financial and business services, which are intermediate factors of production and not for direct consumer benefit). But Stephanie Flanders points out that this greater reliance on manufacturing hasn't protected the German economy in the recession: indeed, German GDP has shrunk much more than the UK's this year. There's a respectable argument that services are achieving exactly what they're meant to - and exactly what financial products are meant to do - which is to provide cushioning against volatility. Manufacturing demand is inevitably more volatile because products are more commoditised and relatively fungible, and stocks can easily be run up and down. To the degree that products are embedded in culturally determined services, this volatility is reduced; and the service providers themselves are much more stable than product businesses. I'm quite relieved not to be selling yachts right now.

If any of you have read Tyler Cowen's Create Your Own Economy yet, you may recognise hints of some of these arguments. And if you have read Chris Anderson's Free (which I haven't, so far) I suspect some of the discussions of cultural products are relevant to that too. I have articles coming up on both of these books so more on that soon.

* this phrase is so overworn now that in some circles it can only be spoken sarcastically. Note that there are over 1.1 million results for it on Google. Possibly the best of which is an advert recommending you to "Find the Best Results for Current Economic Climate!" on a different search engine.

Friday, 17 July 2009

A simple solution for labour market flexibility

OK, this solution is simple only to the extent that it's also simplistic.

But it might just make a contribution.

Arnold Kling asks (via Mark Thoma) "Why is the Recovery of Modern Labor Markets So Slow?" Part of the answer - I emphasise it's only a part - is the reluctance of people to accept low paying jobs while they still feel there's a chance to get a better paid opportunity. This means that people burn through their savings, reducing their own economic welfare as well as society-wide GDP, when they may end up having no choice but to take the same low-end job that was available to them when they were first laid off.

Why do people act this way? Undoubtedly part of the cause is a cognitive issue: the hit to pride and self-esteem from accepting a lower-paid and lower-skilled job. But another aspect highlighted by Thoma is that they want to spend their time looking for a better job instead of working at a worse one.

Now how much time do people really spend looking for jobs? Is it a 40-hour week or is it a couple of hours a day plus whatever interviews they can find? Naturally this will vary by person, but I suspect it is rarely a full-time task.

And so this argument boils down to a coordination problem. If I'm working from Monday to Friday and that elusive interview comes up - but they want to see me on a Wednesday - what am I supposed to do? Put at risk the new job I've been working at for only three weeks by taking a day off, or just pass on the opportunity?

Instead, if the default position were to conduct all interviews on Saturday mornings, much of this problem would disappear.

Of course many jobs - particularly the lower-skilled jobs we're talking about - are not Monday to Friday any more. But on balance they are still more likely to take place during the day and during the standard working week. If interviews were, as standard, conducted in evenings and weekends, this would much reduce the disincentive for people to take a temporary job while looking for a better one.

A similar option, on a larger scale, is to structure more temporary jobs around four-day weeks so that employees have a whole extra day to job hunt. Employers of the temporary workers might not be too keen that their people are looking for new positions, but I think they know not to plan on holding onto most of their staff too long. If a subsidy is required to assist this transition it may be a worthwhile way of spending fiscal stimulus money. I mentioned a few months ago that retraining seems to be the best way of achieving stimulus, and this feels like a variation on that theme.

This of course is a small - perhaps almost trivial - influence on what is a major problem for millions of people. But if it makes a difference of just a couple of percent, hundreds of thousands of people could benefit. It may be worth a try.

Thursday, 16 July 2009

Incoming links and arriving packages

  1. Tyler Cowen's book has finally arrived and I'm devouring it. Full review later.
  2. My bubble detection proposal was mentioned by Scott Sumner on TheMoneyIllusion this week, and there's also an interesting discussion of it on Baseline Scenario today.
  3. The Walker Review was published today, in interim form. You can read the report here and our submission to the process here. Some of the points on the moral hazard of limited liability, and the externalities imposed by the financial sector, echo our comments. Some of our other suggestions, for example that banks should use standardised product definitions to enable transparency of their asset mix and better decision-making by their creditors, have not been taken up but we may make another submission during the consultation period.
  4. As part of its "where economics went wrong" feature this week, The Economist has an interesting analysis of the efficient markets hypothesis and some departures from it, including the implications of behavioural economics. I have an article coming soon about this, so if you're interested in the subject do subscribe (top right corner of the page).

George Loewenstein on behavioural economics

I'm reading George Loewenstein's book Exotic Preferences at the moment (still waiting for Tyler Cowen's to arrive) and enjoying it greatly. It's a collection mainly of journal papers and speeches authored by Loewenstein, but it is not too technical, and an interested lay reader should find most of it very accessible.

So far the most important message is this. Traditional microeconomics derives from preferences based on consumption utility; which is certainly an important component of total utility, but by no means all of it. Philosophy, psychology and behavioural experiments all indicate that people gain much, maybe most, of their motivation from things other than consumption.

The first article is about mountaineering and the - definitely exotic - preferences of top mountaineers and polar explorers, who push themselves way beyond the limits of action that could plausibly be rationalised by consumption utility. Loewenstein gives four key examples of non-consumption utility: self-signaling, goal completion, mastery and meaning. A series of revealing quotes from the writings of mountaineers shed some light on their motivations, and while not a rigorous exposition, they are at least suggestive that these four motivations do operate for those people. Loewenstein then suggests that the same drives also apply to "normal" people who are not willing to kill themselves on the top of the Himalayas.

As a minor technical quibble, I would suggest that it might not be useful to use the term utility for these. While they clearly do affect decision-making, it may be unproductive to try to model them with an underlying utility function. One of the key results of classical microeconomic theory is that rational preferences on a consumption set can be generated by finding an appropriate utility function for each available good; it is not a given that the types of choices Loewenstein is talking about can be generated in this way.

But I'm being pedantic - "utility" is being used as shorthand here and not in the sense of classical utility functions. The article ends with a call for economists to "incorporate non-consumption-related motives into formal models of economic behaviour". The implication is that some young, emerging talent might take up the challenge, just as Greg Mankiw suggested the other day (though Brad DeLong and Paul Krugman (not very wonkish this time) stepped in just in time to save any "ambitious grad students" from getting too much mileage out of the question).

Other chapters cover the economics of meaning, curiosity, social utility, negotiation, experimental economics, prediction of future utility, intertemporal considerations, mental accounting and lots of other key aspects of the field. Overcoming my slight disappointment that he has already covered so many phenomena that I want to write about myself, I think this book is setting some excellent challenges for modelling-minded behavioural economists, and I plan to take some of them up.

I believe it will be possible to create a unified cognitive model - even if the mathematics is complicated - which is broadly consistent with empirical results in most of these areas. Even if not, it's a worthwhile project to spend a few years on. Why climb this particular peak? Is "Because it's there?" an acceptable answer?

Wednesday, 15 July 2009

Gold(man) jewellery

I slightly misread the headline of this article yesterday.

It certainly struck me as a good question, though a baffling one. Just how should we interpret Goldman Sachs' Unexpectedly Large Earrings...

Tuesday, 14 July 2009

1. Stimulus 2. Restructuring 3. Growth

Adam Posen's submission to the Treasury Select Committee - in advance of his appointment to the Bank of England's MPC - is reported by Stephanie Flanders today. Some interesting thoughts.

The first is that:
The bottom line, he says, is that economists just don't understand deflation very well:

"I think these facts call for some degree of humility. The Bank of England is right to be engaged in quantitative easing to address our current problem. But I think we should stay away from very mechanistic monetarism that, 'Oh, boy, they've printed a lot of money so at some point that has to turn into inflation.' Or, 'If we do this specific amount of quantitative easing, so it will lead to this result.'

Looking at Japan, it is clear that their quantitative easing measures had the right sign, in the sense of being stimulative, but did not have a predictable or even large short-term result, let alone cause high inflation."
An intriguing comment and one that will no doubt make Scott Sumner happy. Frankly, there is a lot about macroeconomic behaviour that economists clearly don't have a good handle on, and I hope the current crisis will at least help us get better tools in this area.

I wouldn't fully agree with the implication that deflation is not harmful - arguably the Japanese deflation, by the end of the decade, cost the country perhaps 20% of the economic output it could now have been generating. But admittedly it didn't lead to a meltdown either.

And just to balance it out, something for Paul Krugman too - though Stephanie doesn't report it quite like that:
"Second, and more importantly, if you do not fix the banking system by the time your stimulus runs out, then private demand will not pick up when the stimulus runs out. That's what we saw in Japan in 1997, and that is what we saw in Japan in 1999-2000. So we have a clock ticking here in the UK as in the US and the euro area."

So, Posen doesn't think that government spending can create a self-sustaining recovery on its own.
Indeed - and I don't believe anyone else really thinks that either. The point about fiscal stimulus is that it helps bridge a gap in economic output while the economy adjusts to some shock or other. The shock might be exogenous - an interruption in oil supply or some other supply problem - or it might be endogenous - systemic financial problems and a loss of consumer and investor confidence.

Either way, government borrowing can temporarily sustain demand and employment, stop the shock from starting a self-sustaining cycle of economic decline, and in essence "restart the dynamo" of economic growth.

What Posen's comments do critically highlight is that stimulus is not enough on its own. The financial system does need to be fixed; resources do need to be reallocated from declining industries such as carmaking into new ones such as [pick your own winner here]. The economy, in short, has to be restructured. Apparently the gradual restructuring which is meant to happen continuously in a dynamic economy has not worked very well, so a slightly more wrenching reshaping of the system is now called for.

Indeed one reason the financial system is so important is because it is one of the main mechanisms by which this happens: differing returns on capital encourage resources to be shifted between sectors. Financial engineering - as Robert Peston complains about in his posting today - can sometimes arise from straightforward gaming of the system, but equally it can be a competitive response to suboptimal capital allocation. If I think Friends Provident is putting its money in industries where it will get a low return, I should certainly be making a takeover bid and promising to switch that investment into a better place.

The economics of Arsenal

Robert Peston highlights a nice, rather knotty, little economics problem for Arsenal Football Club.

This conundrum highlights a number of areas of economic theory:
  1. Generalised agency problem. The interests of the different stakeholders in the club all, potentially, conflict with each other. The fans want maximum money spent on good players so they have a chance of winning something for the first time in years. The management of the club want (I guess) stability and a profitable business, which probably means accepting a lower probability of sporting success. The different shareholders want different outcomes: Usmanov may want an equity issue because, with more cash available than the other shareholders, it would probably allow him to increase his stake. Other shareholders want to preserve their stake relative to him, so they are less keen on the increase in investment. The players and manager presumably want to be successful on the pitch, well-paid and - in Wenger's case - to have his talent-building strategy and ability recognised. The fact that all these interests differ makes it hard to achieve a stable structure where the interests of all stakeholders are served. A bit like the banking sector really.
  2. Behavioural economics. The article pointedly refers to whether Arsene Wenger feels that he is not being allowed enough money. The implication is that his belief that the club is pursuing the right strategy is in part self-fulfilling.
  3. Capital structures. The balance between debt and equity finance is a problem I've covered before in this blog. In theory they are equivalent, but in reality they bring a whole different set of concerns with them. Equity provides more flexibility - in this case allowing the directors to choose between spending money on players now and paying the shareholders lower dividends - while debt, in a predictable economy, allows the controlling shareholders to increase their leverage and thus their short-term returns, at the cost of giving up some of their options.
  4. Time inconsistency. Would you rather spend the club's money on buying players this year - even if that means selling some assets with good long-term returns - or accept a lower chance of winning the league for the next couple of years and preserve more financial firepower for the future?
  5. Discontinuous or non-marginal consumption and utility. Football clubs are a notable case of a class of problems which do not obey the convenient assumptions of microeconomic utility theory. In general, it's much easier to analyse a situation if consumers have access to as much or as little of a good as they want - for instance you can adjust your annual consumption of Coca-Cola pretty much to whatever level you like, and the utility you gain from it is likely to be a smoothly increasing function of the amount you consume, with the marginal utility diminishing as the quantity increases (a concave utility function). However the utility of winning the Champions League is very different. We have seen how much money Roman Abramovich is willing to spend to win it, but we might reasonably guess that once he has achieved it once, his utility will drop off sharply. Perhaps if he could guarantee to purchase 10% of a win he might do so, but there is no mechanism to permit this. This makes the economic analysis of the behaviour of football club owners - and fans - much more difficult than typical consumer preference analysis.
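The contrast in point 5 can be made concrete with two toy utility functions (my own illustrative choices, not anything from the sports-economics literature): a smooth diminishing-marginal-utility function for a divisible good like Coca-Cola, against a near-step function for league titles.

```python
import math

# Toy utility functions illustrating point 5 (illustrative choices only).

def cola_utility(litres):
    """Smooth utility for a divisible good: each extra litre adds
    less utility than the one before (diminishing marginal utility)."""
    return math.sqrt(litres)

def title_utility(titles_won):
    """Discontinuous utility for winning the league: a big jump at the
    first title, then a sharp drop-off for further wins."""
    if titles_won == 0:
        return 0.0
    return 100.0 + 5.0 * (titles_won - 1)  # later titles worth far less

# Marginal utility of cola declines smoothly...
mu_first = cola_utility(1) - cola_utility(0)
mu_tenth = cola_utility(10) - cola_utility(9)
# ...while almost all the utility of titles sits at the first one.
first_title = title_utility(1) - title_utility(0)
second_title = title_utility(2) - title_utility(1)
```

The first function behaves as standard consumer theory expects; the second does not, which is why marginal analysis gets so little purchase on the behaviour of club owners.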
There is a whole (if small) field of sport economics which touches on some of these issues. But even without venturing into the specialist results of the sports economists, this one little story raises a panoply of exciting economic questions which could keep us busy for weeks.

Tottenham fans are no doubt relieved to be spared most of these dilemmas. The discontinuity of utility is much diminished when the choice is between 11th and 17th place in the league table.

Sunday, 12 July 2009

The economics zeitgeist, 12 July 2009

This is a word cloud from all economics blog postings in the last week. I generate this every Sunday so please subscribe using the links on the right if you'd like to be notified each time it is published.

It has been constructed from a list of economics RSS feeds from the Palgrave Econolog and other sources, and uses Wordle to generate the image, the ROME RSS reader to download the RSS feeds, and Java software from Inon to process the data.
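The processing stage of that pipeline amounts to aggregating word frequencies across the week's posts. A minimal Python sketch of that stage (the real version is the Java software mentioned above; the posts and stopword list here are hypothetical):

```python
import re
from collections import Counter

# Minimal sketch of the frequency-counting stage of the word-cloud
# pipeline. The posts and stopword list below are hypothetical.

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "that", "as"}

def word_frequencies(posts):
    """Count word occurrences across a list of blog-post texts,
    ignoring case, punctuation and common stopwords."""
    counts = Counter()
    for text in posts:
        for word in re.findall(r"[a-z']+", text.lower()):
            if word not in STOPWORDS:
                counts[word] += 1
    return counts

posts = [
    "The stimulus debate continues as economists argue about deflation.",
    "Deflation fears return; stimulus advocates push for more easing.",
]
top_words = word_frequencies(posts).most_common(3)
# Wordle then sizes each word in proportion to its count.
```

The output of this stage - a word-to-count mapping - is exactly the data that Wordle consumes to draw the cloud.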

You can also see the Java version in the Wordle gallery. The millionth Wordle is likely to be published this week, though as we approach it the rate of generation of new Wordles seems to diminish.

If anyone would like a copy of the underlying data used to generate these clouds, or if you would like to see a version with consistent colour and typeface to make week-to-week comparison easier, please get in touch.

Friday, 10 July 2009

SuperFreakonomics contest

Today I wasted 45 minutes on a contest to win a £9.99 book.

Not rational at all - except for what I learned from the experience, and the satisfaction of feeling cleverer than 651 other contestants.

Background information on the contest is here. In short, you need to guess how many Google hits there will be for SuperFreakonomics on November 3rd, two weeks after the eponymous book is published. This is the sequel to Freakonomics, so it should be popular. But how popular?

Now this kind of contest has some interesting idiosyncrasies. Like guessing the price in The Price Is Right, or like guessing the weight of a nun - or whatever it is they do in travelling carnivals - your best strategy is not to try to accurately work out the weight. Instead, you should look at what other people have guessed and pick your number to maximise the chance that you'll be just a tiny bit closer than them. You might also recognise this strategy from the "beach vendor problem".

In the simplest case, if the other contestants have guessed 100, 130 and 135 lbs you might think that the most likely weight is, perhaps, 132 lbs. That may be true, but you would be an idiot to put your bet there.

Instead, you can maximise your chances of winning by bidding 99, 101, 129 or 136 lbs. With a 132 lb bid, the only way you can win is if the weight is exactly 132, or 133, or maybe 131 if there's a tiebreaker. What are the odds of that? Not big unless you are really good at weighing nuns. If that's your fetish, you don't need economic theory to help you.

Whereas if you pick 136 lbs, anything over 135 results in a win for you. Similarly, if you pick 99 lbs, anything under 100 is yours; and at 129, anything from 115 to 129 wins it for you.
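The arithmetic here is mechanical enough to sketch. Here's a toy Python version of the winning-interval idea, assuming (purely for illustration) that the nun's weight is equally likely to be anywhere from 90 to 145 lbs:

```python
others = [100, 130, 135]  # the rivals' guesses from the example above

def winning_range(my_guess, rivals, lo=90, hi=145):
    """Width of the interval of true weights for which my_guess is
    strictly closest to the answer (ties ignored for simplicity)."""
    points = sorted(rivals + [my_guess])
    i = points.index(my_guess)
    # I win everywhere between the midpoints to my nearest neighbours
    left = lo if i == 0 else (my_guess + points[i - 1]) / 2
    right = hi if i == len(points) - 1 else (my_guess + points[i + 1]) / 2
    return right - left

for guess in [99, 129, 132, 136]:
    print(guess, winning_range(guess, others))
```

The "obvious" guess of 132 wins only a 2.5 lb sliver between its neighbours, while 129 sweeps everything down to the midpoint with 100 - six times the territory.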

Of course judgment must come into this somewhere, and you may simply glance at the nun and be certain she's well over 100 lbs. Which eliminates the 99 lb guess. And if she is definitely slimmer than the 136-lb nun who taught you at boarding school, you won't be guessing that either. But the less idea you have of the real answer, the more guidance you should take from the distribution of the existing answers.

How does this apply to the SuperFreakonomics contest? Well, I honestly have little idea how many hits SuperFreakonomics will get. I can be fairly confident it's well under the 1.3 million that Freakonomics has, and well over the 11,000 that it has accumulated so far. But how far over? I don't have much of a clue - it could be anywhere from 50,000 to 500,000.

Other things being equal, it's more likely to be at the low end. I believe that Google hits - like many distributions, especially where network effects apply - are subject to a power law. That is, there are ten times as many words with 1,000 hits as with 10,000; and a hundred times fewer again with a million hits.

So to reflect this combination of strategic targeting and statistics, I downloaded all the existing guesses (651 of them so far), sorted them and projected them onto a logarithmic distribution. I then calculated the gaps between them in order to find the biggest gap in which to place my estimate.

Turns out that there's a nice space from 88,281 to 97,538 - a logarithmic gap of 0.099717 - which is therefore where I wish to place my bet. Of course I'm better off placing it at the low end than the high, because of the same power law effect; so my estimate is 88,282.
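The gap-hunting step itself takes only a few lines. Here's a rough Python sketch, with a handful of invented guesses standing in for the real 651 (only the 88,281 and 97,538 entries are taken from my actual data; the rest are made up for illustration):

```python
import math

# Invented stand-ins for the downloaded guesses, already plausible-looking
# around the region of interest; 88281 and 97538 are the real boundary pair.
guesses = sorted([80000, 85000, 88281, 97538, 100000, 110000,
                  120000, 130000, 137400, 148888, 155000])

# Measure each gap between consecutive guesses on a log scale.
gaps = [(math.log(b / a), a, b) for a, b in zip(guesses, guesses[1:])]
width, lo, hi = max(gaps)           # the widest logarithmic gap
bid = lo + 1                        # bid just above its low end, since the
                                    # power law favours lower values
print(f"gap {width:.6f} between {lo:,} and {hi:,}; bid {bid:,}")
```

On the real data the same procedure produces the 0.099717 gap and the 88,282 bid described above.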

Admittedly there are other factors to consider. The second biggest logarithmic gap is 0.080298, between 137,400 and 148,888. If I felt the number of hits were substantially more likely to be around the 140,000 mark than the 90,000 mark I should put my money up there instead. Perhaps the higher probability would outweigh the tactical disadvantage of being closer to other contestants.

In a more extreme version of this argument, I might have a powerful predictive Google model - let's say I were an SEO professional, and one with an improbable respect for statistics - which predicts that there will be between 200,000 and 230,000 hits. In this case I'd choose the biggest available gap within that range, which is 204,000 to 210,000, and guess 204,001. Bad luck to the 204,000 guy ("James") but all's fair in love and game theory.

Another factor is that the variable we're measuring is not independent. It is subject to influence by the participants in this contest - should any of them care enough. If the number of hits by 28th October or so is (let's say) 80,000, a determined competitor whose bid is 150,000 might find a linkfarm network which will take his money to put "SuperFreakonomics" on seventy thousand websites in a week. Will Google fall for it? That depends on many factors, but given that this is the raw total number of hits and not the duplicates-eliminated count, it might work. So by guessing honestly (for some value of "honest") I am assuming that winning a signed copy of SuperFreakonomics will not incentivise people sufficiently to impact the number of hits meaningfully.

Finally, we are disadvantaged by the New York Times's blog moderation process. Unlike the BBC, who have people on duty 24 hours a day to approve the intemperate comments on Robert Peston's blog (hello there, John_from_Hendon), the NYT seems to keep their interns in front of the PC only during working hours. So, even assuming that my guess is still eligible, there could be a queue of forty people in front of me who have applied exactly the same logic. In which case I should assume that one of them has already picked 88,282 and I should choose 88,283. Indeed they may have applied the same logic themselves, and so I'd better bump it up to around 88,350 just to be sure. Which, of course, they'll have figured out too. Maybe it's safest just to pretend I'm smarter than everyone else after all. Maybe I'll find out on Monday; more likely my illusion will be dispelled on 3rd November when eight million Google hits show up for SuperFreakonomics - I can only hope that this page, which has mentioned it at least seven times, is one of them.

p.s. my 45 minutes did not include 20 minutes of writing this post.

p.p.s. I'm trusting that hardly anyone will see this between my writing it and the closing of the contest, or that you will be too honourable to go to the NYT and pick a number one higher than me. If you do want to make an entry and not mess too much with my attempt, try 204,001, 137,401 or 80,002.

p.p.p.s. I do actually think the real result will be higher than my guess. But I'm more confident in the game strategy than in my ability to estimate Google counts. And the guesses are pretty crowded around the level I would have picked, meaning that my second-best option is probably the preferred strategy. But on reflection, maybe 178,649 would have been a better choice.

Thursday, 9 July 2009

Women of Iceland, my sympathies

According to this article, you wouldn't want to be a woman in Iceland right now:
Iceland, unlike many of the other nations that went mad for credit, has lots of things going for it: an average age of 37, a highly educated work force, a nearly positive birthrate...

Wednesday, 8 July 2009

An open letter to Tyler Cowen (or his publishers)

Dear Tyler

I'm very much looking forward to reading Create Your Own Economy and even more encouraged by your offer on the blog today.

And yet I have a dilemma. It seems that it won't be published in the UK until September (not, as per the original listing, last February). I can order it now from Amazon US, but the estimated delivery time is 18-32 business days - which might take me nearly up to the UK publication date anyway. Unless I pay more for priority courier service than the price of the book. And that option - even if it still generates positive consumer surplus - just doesn't feel right.

On the other hand, I can buy the CD version which is available in the UK and get it delivered next week. Or I can get an audio version online - but that isn't downloadable until Tuesday 14th.

And anyway, either of the audio editions will take longer to consume than the hardcover version. I am pretty sure I can read faster than Patrick Lawlor can speak. Not as long, admittedly, as waiting till the middle of August; but I really would rather consume the book with my eyes than my ears.

I even looked into buying a Kindle so I could download it on that - but Kindles are not available in the UK either.

The only downloadable "Create Your Own Economy" I could find is not the one I want at all.

What is a British Tyler fan to do? I am keen to create my own economy but it seems that there are some real barriers to gathering, slicing or ordering the digital information I need to do so. Maybe someone should write a book about that.

Warm regards,

p.s. (I would normally label this an "Update" but seeing as this is notionally a letter...) Thanks for the tip missmarketcrash - I went back onto Amazon and have now managed to find a Marketplace seller (the_book_depository) who can deliver it to me in about a week. Click here to try them out.

Europe swamped by 0.007% increase in population

The BBC reports that:
Nearly 37,000 immigrants landed on Italian shores last year, an increase of about 75% on the year before.
This increase means that the previous year's invasion - a mere 0.004% of the EU's population - has leapt to an overwhelming 0.007%.

Clearly this army of economic refugees must be stopped. Otherwise Europe's carefully designed demographic timebomb will be accidentally defused. Instead of declining by 9.4% in the next 25 years, Europe's population will fall by only 9.2%. If they're not careful, European countries may actually be able to pay some of their pension commitments.

The question the BBC has to ask itself is not: what should we do about this incoming flood of one immigrant for every 13,480 Europeans? It's not even: why are you publishing ridiculous stories like this? After all, the BBC covers plenty of ridiculous stories.

The question that matters is: why do you bring attention to the laughable opinions of Nick Griffin? Just let the guy shout himself hoarse in the wilderness and in five more years we'll be rid of him.

Bubble-detection technology

Pointing out a speech by William C. Dudley, president of the New York Fed, Simon Johnson says:
Dudley says that the Fed can pop or prevent asset bubbles from developing. This would represent a major change in the nature of American (and G7) central banking. It’s a huge statement - throwing the Greenspan years out of the door, without ceremony.

It’s also an attractive idea. But how will the Fed actually implement? Senior Fed officials in 2007 and 2008 were quite clear that there is no technology that would allow them to "sniff" bubbles accurately - and this was in the face of a housing bubble that, in retrospect, Dudley says was obvious.
But is that true?

If we define a bubble as "overvaluation of assets relative to their future returns" then to spot one, we would need to compare asset prices with future returns. But although asset prices are measurable, future returns are not - and this is why people generally think that bubbles are unspottable. We wouldn't want to put the brakes on, say, a fast run-up in the shares of biotech companies just because we don't know what their profits are going to be in ten years.

But there's another word in the definition, apart from "assets" and "future returns": and that word is "overvaluation". How about if we could directly measure overvaluation? If we could determine directly whether investors are putting an irrationally high value on the things they buy?

As it happens, we can.

Experiments have been designed to measure investors' attitudes to asset valuation - for example lots of Vernon Smith's work, including Caginalp, Porter and Smith (1998) (details of other experiments also available here). This experiment was particularly interesting in understanding recent events, as it showed that the valuations of identical assets with identical returns were strongly correlated to the amount of cash available to investors.

That may be exactly what took place in the 2000s, as a loose monetary policy found no outlet in consumer prices but instead flowed into asset valuations (Scott Sumner has argued that money did become tighter later in the decade, more so than signified by interest rate movements; if true this could be one reason that asset prices are not recovering despite very low interest rates).

Outside of the laboratory, precise knowledge of the returns of some assets does become available at times, and it would be possible to measure investors' behaviour with regard to those assets. If investors, in aggregate, become overconfident about returns it will be possible to spot this from certain types of price change.

If we can't find suitably measurable instruments in the market, it may still be possible to set up the appropriate conditions in a laboratory; but that would be less reliable than using market price signals. So a preferable next step is to determine which kind of assets can be tested and under what circumstances, to reveal the risk attitudes of different classes of investor. These measurements would then be combined with data on the cash balances of those investor classes to detect bubbles in their inflation rather than collapse phase.

What regulators can do about these bubbles is the next question; consumer-level financial regulation is one area to consider, but there are other tools too. More on this later.

(see also my VoxEU papers [1], [2] on this subject)

Tuesday, 7 July 2009

The economics zeitgeist, 5 July 2009

This is a word cloud from all economics blog postings in the last week. I generate this every Sunday so please subscribe using the links on the right if you'd like to be notified each time it is published.

It has been constructed from a list of economics RSS feeds from the Palgrave Econolog and other sources, and uses Wordle to generate the image, the ROME RSS reader to download the RSS feeds, and Java software from Inon to process the data.

You can also see the Java version in the Wordle gallery. The millionth Wordle is coming up in about a week, so look out for that.

If anyone would like a copy of the underlying data used to generate these clouds, or if you would like to see a version with consistent colour and typeface to make week-to-week comparison easier, please get in touch.

Signalling, rationality and blunt levers

What trouble it causes in life when you can't just act directly on your preferences but need to signal indirectly.

Chief example at the moment: tomorrow's vote at Marks and Spencer. The shareholders don't really want to get rid of Stuart Rose - he's done a good job - but they do want to tell the M&S directors to stop messing around with corporate governance. So there's a proposal to get him to step down as chairman (not chief executive); but if it passes, or gets significant votes, it's likely to be interpreted as a vote of no confidence in Rose himself.

What else? A debate over on Worthwhile Canadian Initiative about the bluntness of the interest rate tool as a way of controlling inflation. If inflation is low anyway, for other reasons, but interest rates are kept low to boost economic activity, price pressures can leak over into asset prices - causing a house price bubble for example. A tool, when not designed directly to address the problem it is meant to address, has unintended consequences.

Commerce is full of these effects. Signalling is a common technique used by companies (sometimes without being fully aware of it) to influence customers' perceptions. For example a firm might spend more money than it needs to on advertising, fancy literature, business cards or office decor in order to demonstrate that it is a market-leading firm and worth trusting with your money. Banks and their marble lobbies are a classic example - the sunk cost is supposed to make you feel the executives are less likely to run off with your money. But that just led to prices that customers didn't want to pay - and ultimately to competition from online providers and non-banks.

Nowadays you're more likely to see this behaviour from large law firms or accountants: signalling quality by extra spending on offices or sponsorship of sporting events. The problem is that while it does signal high status, it also signals to savvy customers that their money is being wasted. The additional expense of these signals is borne either by the customer through fees, or by the shareholders through profit margins. Either way the firm becomes less competitive. Firms presumably judge the net effect to be positive - that the benefits of the signals outweigh the drawbacks - and they may be right. But is that an objective judgement? Or do the executives just enjoy the perks and trappings?

Other examples? Please suggest them and I'll update.

Companies and regulators then have two jobs to do. One is to continue to design better levers - so that they can signal quality without wasting money, or control inflation without creating asset bubbles, or get a new chairman without losing their chief executive.

The other is harder, and that is to understand psychology better. At one extreme we can encourage a new attitude of directness. Instead of influencing one thing in order to indirectly achieve another, just do the thing you want to do and be honest to people. If a firm needs to signal that it can afford to spend lots of money, it can just give a parcel of cash back to the client!

At the other extreme we can develop better models of how customers, shareholders or investors make decisions, and then construct a set of interventions which directly act on the relevant factors. This for example is at the heart of my behavioural risk regulation proposal: instead of using universal blunt price signals like interest rates, act directly by providing information to the investors who are unknowingly taking additional risks.

This latter aspect requires the development of much more subtle and personalised cognitive models than those we have now, but if achieved, there will be a transformation both in the successful management of the economy and in consumer welfare.

Saturday, 4 July 2009

"Ant mega-colony takes over world"

Best article title I've seen since Super Cally Go Ballistic, Celtic are Atrocious and Diana fund pays out to Gypsies and Asylum Seekers in the Express.

Friday, 3 July 2009

Bad mathematics and dodgy economics

Robert Peston today attempts to demonstrate that bank executives have ripped off bank shareholders over the last twenty years or so since deregulation of the City.

However he does so with some classic statistical tricks. See which ones you can spot in the article if you wish, before I carry on...

(note that these tricks are actually courtesy of Andrew Haldane, a director of the Bank of England - Robert just communicates them to us)

from 1900 to 1985, the financial sector produced an average annual return of around 2% a year, relative to other stocks and shares...But in the subsequent 20 years, from 1986 to 2006, returns went through the roof: the average annual return soared to more than 16%
OK so far (though that "relative to other stocks and shares" is a bit tricky, because it's already quite a meaningful excess return - banks haven't always been as safe as Robert and Andrew suggest).

But now what's happened?

The collapse of banks' share prices in the past two years has wiped out most of those gains: to March this year, when the low point was touched, the fall in UK bank share prices was more than 80%, an all-time record plunge.

What this means is that in the full period from 1900 to the end of 2008, the annual average return on financial shares was less than 3%, almost identical to the market as a whole.
Isn't that terrible...poor bank shareholders only made the same money as everyone else. What a con.

Except...for three little things that are really quite manipulative.

First, he picks the low point of the shares instead of a more objective place - like, say, today's share prices, which are about twice as high as the nadir. Anyone can find a low point with hindsight, but hardly any investor successfully calls the low or the high of the market in reality. So a fairer benchmark would be to look at the returns up to today, where shares have fallen 60-70% from peak and not 80%.

Second, that 3% is not the same as everyone else in the market: it is 3% more than the rest of the market. Which means that anyone investing fully in bank shares in 1900, as Andrew proposes, would have ended up twenty-two times richer than someone who spread their money across the whole stockmarket.

Third and most egregiously, he has completely redefined the time period for the convenience of his argument. All those 16% returns from 1986 to 2006 (minus the recent fall) have suddenly been spread over 108 years from 1900 to 2008! No wonder it looks a lot less.

In fact, if you take the current share price of banks instead of their low point, and look at the returns from 1986 to 2008, you get an annual return of 9.7%. Not an absolute return of 9.7% but 9.7% over and above the rest of the stockmarket. Not too shabby, eh?

Unlike the conclusions that Robert comes to. Admittedly these conclusions are informed by Andrew Haldane's confusing mathematics - I'd recommend any investment bank to hire him to help sell their derivatives - but Robert should know better.
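For the sceptical, both numbers above are easy to sanity-check with compound interest. A back-of-envelope sketch, with assumed inputs (a 2.9% annual excess return for the "less than 3%" figure, 109 years from 1900 to end-2008, and a 60% fall from the 2006 peak - pick your own numbers and see how much it moves):

```python
# How much richer is the 1900 bank investor than the index investor?
# Assumes a 2.9% annual excess return compounded over 109 years.
excess_109y = 1.029 ** 109
print(round(excess_109y, 1))        # roughly twenty-two-fold

# Reconstructing the 1986-2008 excess return: 16% a year for the 20
# boom years, then an assumed 60% fall, annualised over 22 years.
boom = 1.16 ** 20                   # relative value at the 2006 peak
after_fall = boom * (1 - 0.60)
annualised = after_fall ** (1 / 22) - 1
print(f"{annualised:.1%}")          # in the region of the 9.7% quoted
```

The exact figure depends on precisely how far from the peak you assume shares have fallen, but anywhere in the 60-70% range leaves bank shareholders comfortably ahead of the market.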

The figures do not show that bank executives have been living high at the expense of bank shareholders. They show that executives and shareholders have both been in cahoots at the expense of bank creditors, depositors and taxpayers.

Robert gets the taxpayers bit right but he doesn't spot this: banks have been able to make such high leveraged returns because lots of people lent them money, at low interest rates, without realising that they were exposed to huge risks. The shareholders took little risk because limited liability means their downside was limited to the value of their shares; while the upside was infinite. Governments have now rescued many of these creditors - meaning that the risk wasn't perhaps as huge as all that. But many have still lost out, without any of the compensating profits that shareholders made over the last twenty years.

Robert proposes that shareholders should insist on bank executives being paid less and running the banks less aggressively. Actually shareholders should - if they can get away with it - get the banks to go back to doing business exactly as before.

The disciplines need to come from creditors and taxpayers, because they are the ones really on the hook. One discipline may be reduced leverage; although a fixed leverage ratio is probably too simplistic and too easy to game; another may be higher capital requirements, though that has the same potential problem.

The best single thing the regulator can do is increase transparency in the banking system so that creditors can make more informed decisions about risk, and impose appropriate discipline on the banks they lend money to. After all, a bank would insist on full disclosure of every aspect of my business, and regular up-to-date management accounts, if it lent money to me. Why shouldn't pension fund bondholders or Swiss buyers of mortgage-backed securities require the same?

A more detailed proposal here.