Preventing Deflation: Lessons from Japan’s Experience in the 1990s
Alan G. Ahearne et al.
This paper examines Japan’s experience in the first half of the 1990s to shed some light on several issues that arise as inflation declines toward zero. Is it possible to recognize when an economy is moving into a phase of sustained deflation? How quickly should monetary policy respond to sharp declines in inflation? Are there factors that inhibit the monetary transmission mechanism as interest rates approach zero? What is the role for fiscal policy in warding off a deflationary episode? We conclude that Japan’s sustained deflationary slump was very much unanticipated by Japanese policymakers and observers alike, and that this was a key factor in the authorities’ failure to provide sufficient stimulus to maintain growth and positive inflation. Once inflation turned negative and short-term interest rates approached the zero lower bound, it became much more difficult for monetary policy to reactivate the economy. We found little compelling evidence that in the lead-up to deflation in the first half of the 1990s, the ability of either monetary or fiscal policy to help support the economy fell off significantly. Based on all these considerations, we draw the general lesson from Japan’s experience that when inflation and interest rates have fallen close to zero, and the risk of deflation is high, stimulus, both monetary and fiscal, should go beyond the levels conventionally implied by baseline forecasts of future inflation and economic activity.
Full text: http://ssrn.com/abstract=318700
An interesting computational experiment: [Link to the original page]
Turtle System Hypothetical Performance
- Initial Capital: $1,000,000
- 35 Markets tested: AD, BO, BP, C, CC, CD, CL, CT, ED, EM, FC, GC, HG, HO, HU, JY, KC, LB, LC, LH, MP, NG, O, OJ, PA, PB, PL, S, SB, SF, SI, SM, TY, US, W
- Risk: 2% per trade
- No pyramiding
- Fixed fractional money management
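The sizing rule described above can be sketched in a few lines; this is an illustrative reconstruction, not the Trading Blox implementation, and the entry/stop prices below are hypothetical:

```python
# Fixed fractional money management: risk a constant 2% of current equity
# per trade, with no pyramiding. All numbers here are hypothetical.

def position_size(equity: float, entry: float, stop: float,
                  risk_fraction: float = 0.02) -> int:
    """Units to trade so that a move from entry to stop loses
    risk_fraction of current equity."""
    risk_per_unit = abs(entry - stop)
    if risk_per_unit == 0:
        return 0  # degenerate stop: take no position
    return int((equity * risk_fraction) / risk_per_unit)

equity = 1_000_000            # initial capital from the test above
size = position_size(equity, entry=450.0, stop=440.0)
# 2% of $1,000,000 = $20,000 at risk; $10 risk per unit -> 2000 units
```

Because the risked amount is a fraction of *current* equity, position sizes shrink automatically during drawdowns, which is what keeps an 81.7% drawdown from becoming a wipeout.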
[Table of end-of-year results lost in extraction]
Worst Monthly Drawdown: 36.6%
Worst Cumulative Drawdown, % on Initial Capital: 81.7%
Worst Cumulative Drawdown Period: January 2001 – April 2005
% Return/Year on Original $1,000,000: 2.09%
Trade results were generated by Trading Blox Builder, associated with www.OriginalTurtles.org; see www.tradingblox.com, the only software available for testing the Turtle system.
I redid the calculation to show actual annual returns rather than returns on initial capital and added S&P total return data and some statistics:
[Table lost in extraction: geometric mean return and standard deviation of returns, Turtle system vs. S&P 500]
Dismal… Not to mention that the software package used for this back-test retails for slightly under $3,000… But, then, the correlation is low enough to support the idea that futures (in limited quantities, of course) can be a good addition to equities:
Come to think of it, CSFB/Tremont Managed Futures Index has negative correlations with both S&P 500 and MSCI World… But 2004 is still hard to stomach…
Attention, Demographics, and the Stock Market
Stefano DellaVigna, Joshua M. Pollet
NBER Working Paper No. 11211
Issued in March 2005
Do investors pay enough attention to long-term fundamentals? We consider the case of demographic information. Cohort size fluctuations produce forecastable demand changes for age-sensitive sectors, such as toys, bicycles, beer, life insurance, and nursing homes. These demand changes are predictable once a specific cohort is born. We use lagged consumption and demographic data to forecast future consumption demand growth induced by changes in age structure. We find that demand forecasts predict profitability by industry. Moreover, forecasted demand changes 5 to 10 years in the future predict annual industry returns. One additional percentage point of annualized demand growth due to demographics predicts a 5 to 10 percentage point increase in annual abnormal industry stock returns. However, forecasted demand changes over shorter horizons do not predict stock returns. The predictability results are more substantial for industries with higher barriers to entry and with more pronounced age patterns in consumption. A trading strategy exploiting demographic information earns an annualized risk-adjusted return of 5 to 7 percent. We present a model of underreaction to information about the distant future that is consistent with the findings.
Alexandra Weber Morales writes in her editorial column in Software Development magazine (“Are You Just a Geek?”, July 2005):
[A] recent working paper entitled “Geek Mythology”… points out, “About 44% of our sample of female students contextualize their interest in computers in other arenas such as medicine, space, the arts. Unfortunately, the academic curriculum and the reward system do not always reflect this orientation to computer science.”
I think there is a broader point here. For the longest time, I’ve been thinking that software development may eventually repeat the evolution of typing and driving. Having started as an occupation, it will eventually evolve into a skill. There will always be people earning a living by developing software, but they will be fewer and farther between compared to today. System software (operating systems, drivers, etc.) and shrink-wrap software will probably remain the province of dedicated software professionals. Business applications, however, will most likely become the turf of domain experts.
This process began a long time ago, when Dan Bricklin came up with VisiCalc, the first spreadsheet program. The result? No need to involve a programmer when it comes to what-if scenarios, basic statistics, and all that jazz. It’s only a matter of time before something similar happens in the world of business applications and they are written in domain-specific languages (or simply generated by point-and-click) by domain experts. Science fiction? Nope; in scientific computing, this has been reality for quite some time. Just take a look at Mathematica…
In come the waves
Jun 16th 2005
From The Economist print edition
The worldwide rise in house prices is the biggest bubble in history. Prepare for the economic pain when it pops.
From The Economist (Jun 9, 2005)
Ready for baby-boomers?
America’s Pension Benefit Guaranty Corporation said that companies with underfunded defined-benefit pension plans had filed shortfalls of $353.7 billion in 2004. (Entities that have more than $50m in underfunded liabilities are required to report to the PBGC; the estimate for the total shortfall of pension plans is $450 billion.) This week, United Airlines transferred $6.6 billion of liabilities to the government agency. The PBGC is deeply concerned about the liabilities of other airlines.
Riding the Boom
Fortune, May 30, 2005.
…In 2001, just 1.6% of all new U.S. mortgages were interest-only. But last year, a stunning 31% were. If there’s any sign that a downturn could get loads of folks in trouble, that’s it…
The coming downturn would not be a pretty picture…
Do bankruptcy codes matter?
A study of defaults in France, Germany, and the UK
Sergei A. Davydenko, Julian R. Franks
May 1, 2005
This paper studies how bankruptcy codes and creditors’ rights affect distressed reorganizations in different countries. Using a sample of 2280 small firms that defaulted on their bank debt in France, Germany and the UK, we find that large differences in creditors’ rights across countries lead banks to adjust their lending and reorganization practices to mitigate the expected creditor-unfriendly aspects of bankruptcy law. In particular, French banks respond to a creditor-unfriendly bankruptcy code by requiring more collateral than lenders elsewhere, and by relying on particular collateral forms that minimize the statutory dilution of their claims in bankruptcy. Despite such adjustments, bank recovery rates in default differ substantially across the three countries, with medians of 92% in the UK, 67% in Germany, and 56% in France. Notwithstanding the low level of creditor protection, low recovery rates, and high historical bankruptcy rates in France, we find that pre-distress loan spreads there are similar to those found in the creditor-friendly UK. We conclude that, despite significant adjustments in lending practices, bankruptcy codes still sharply affect default outcomes.
Full text: http://www.moodyskmv.com/conf05/pdf/papers/j_franks.pdf
From Economist’s View:
Open-Source Models in Economics
From Wikipedia, which seemed appropriate for this post, a definition and an English lesson:
Open source denotes that the origins of a product are publicly accessible in part or in whole. When used as an adjective, the term is hyphenated: “Apache is open-source software.” When used as a noun, there is no hyphen: “Netscape released its Navigator source code as open source.”
Will the same model work in economics?
How about some economic model to explain the open-source movement itself? Here are two proposals:
Open-source software development as a signal in the labor market
Back in 1973, Michael Spence showed that under certain conditions, well-informed agents can improve their market outcome by signaling their private information to poorly informed agents (this insight earned him the 2001 Nobel prize in economics). Applied to open-source software development, this means that participation in such a project can serve as a signal to a potential employer that an applicant without an employment record may nevertheless possess valuable development skills, which may include version control, managing a geographically dispersed development team, or whatever else a particular employer is looking for.
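The logic can be written down as a minimal separating-equilibrium sketch in Spence's framework (the wages and costs here are placeholders, not estimates): let $w_H$ and $w_L$ be the wages offered to applicants perceived as high- and low-skilled, and let $c_H < c_L$ be the cost of producing a visible open-source contribution for each type. Separation requires that high types prefer to signal and low types prefer not to:

```latex
w_H - c_H \ge w_L
\qquad\text{and}\qquad
w_L \ge w_H - c_L,
\quad\text{i.e.}\quad
c_H \;\le\; w_H - w_L \;\le\; c_L .
```

The signal only works if contributing is cheap enough for genuinely skilled developers yet costly enough for everyone else; if anyone can cheaply pad a résumé with commits, the wage premium collapses.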
The value of such a signal should be greatest in a tight job market, when employers are cost-conscious and applicants are numerous. Acquiring the experience, on the other hand, is easiest while in school (quite a few open-source pursuits grew out of term projects and dissertations). Hence, a testable prediction: open-source development should be concentrated in places where a tight job market (and especially high youth unemployment) coexists with affordable technical education and training (some of which, in turn, may be youth unemployment in disguise). Anecdotally, this sounds plausible. Finland, for example, is home to both Linus Torvalds and Monty Widenius; neither lives there anymore…
Open-source software development as an option on related business
If you believe that giving away a piece of software can increase your chances of being paid for related services, the cost of developing the software can be thought of as an option on service revenues. Given a high degree of uncertainty surrounding those would-be service revenues, the option can be quite valuable…
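One way to make this concrete is to price the right to pursue service revenues as a European call, with the development cost playing the role of the premium. Black–Scholes is used purely as an illustration here, and every input below is hypothetical:

```python
# Real-options sketch: value an option on uncertain service revenues.
# S = present value of expected service revenues, K = investment needed
# to capture them, sigma = revenue uncertainty. All inputs hypothetical.
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S: float, K: float, T: float, r: float, sigma: float) -> float:
    """Black-Scholes value of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# $2m of expected service revenues, $1.5m to capture them, 3-year horizon:
option_value = bs_call(S=2.0, K=1.5, T=3.0, r=0.04, sigma=0.6)
```

The key property is that the option value rises with sigma: the more uncertain the would-be service revenues, the more it can be worth to give the software away and keep the option alive.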
The problem, of course, is that these hypotheses still need to be tested… 🙂
Some Simple Analytics for a “Hard Landing”
J. Bradford DeLong
April 20, 2005
Let’s start with a situation in which the real exchange rate–the dollar price of foreign currency–is being artificially depressed because of large-scale exchange rate intervention by foreign central banks.
[10 pages of formulas and charts outlining three scenarios that DeLong calls “soft,” “medium,” and “hard” landing]
Which of these scenarios are we in? Are we in the “soft landing” scenario, in which the pace of sectoral shifts and structural changes that are going to be caused by the forthcoming rise in the value of foreign currency will not be large enough to materially depress potential output? Are we in the “medium landing” scenario in which the shock and the consequent rapidly-required sectoral shifts will be large enough to cause a significant but not catastrophic recession? Or are we in the “hard landing” scenario in which the financial fragility and vulnerability of many New York institutions is such that all hell will break loose whenever foreign central banks stop purchasing dollars?
I don’t know. My guesses are still 70% soft, 20% medium, 10% hard. I do know that the longer the U.S. continues to run its massive twin deficits on the current scale, the greater the “medium” and “hard landing” probabilities will become.
Full text: hard_landing.pdf