A famous paper on how to extract business cycles from a time series.
Economic fluctuations are too rapid to be accounted for by slowly changing demographic and technological factors and changes in the stocks of capital that produce secular growth in output per capita.
-> To study the comovements of aggregate economic variables, an efficient, easily replicable technique is needed.
The growth component of aggregate economic time series varies smoothly over time.
The data employed in this paper span 40 years, from 1955 to 1995, across 47 prefectures. The reason for excluding data from 1995 onwards from our sample is that the impact of foreign direct investment (FDI) and foreign outsourcing to Asia was negligible before the mid-1990s. From the mid-1990s onwards the Asian economy affected the Japanese national economy, so regional business cycles could be synchronized with those of other Asian countries rather than with other Japanese regions. An analysis of the current decade would therefore need to pay attention to foreign countries as well as Japanese regions. Before the mid-1990s, by contrast, most manufacturing firms and plants located inside Japan operated without substantial Asian FDI, foreign outsourcing, or hollowing-out.
Traditional business cycle analysis recognizes two types of cycle:
a) ‘classical’ cycle
involves an absolute decline in economic activity from the peak and an absolute rise in activity from the trough.
The NBER for the US and the CEPR for the Euro Area provide chronologies of such cycles. Clearly, such cycles do not exist in rapidly growing economies, and they are relatively rare for European economies and Japan.
b) deviation or growth (occasionally growth rate) cycle
the business cycle is identified as fluctuations relative to a trend.
In our case, where the original data are annual, there is a reasonable presumption that high-frequency noise, such as seasonal effects, has already been filtered out. On this basis we use a Hodrick-Prescott (HP) filter with a ‘lambda’ value (dampening factor) set at 6.25, following the suggestion of Ravn and Uhlig (2002); this corresponds to a maximum cycle periodicity of 10 years, just as the popular ‘lambda’ value of 1600 does for data observed at a quarterly frequency. The filter is applied to the log of the GDP series for each prefecture and for Japan as a whole.
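The Ravn–Uhlig adjustment scales the quarterly benchmark lambda by the fourth power of the ratio of observation frequencies; a minimal arithmetic check:

```python
# Ravn and Uhlig (2002): lambda should scale with the fourth power of
# the observation-frequency ratio. Annual data have 1/4 as many
# observations per year as quarterly data, so:
lambda_quarterly = 1600
frequency_ratio = 1 / 4                       # annual relative to quarterly
lambda_annual = lambda_quarterly * frequency_ratio ** 4
print(lambda_annual)                          # 6.25
```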
Hodrick-Prescott filter: The technique is used to decompose economic data into a trend and a cyclical component.
Stephen Williamson writes that in studying the cyclical behavior of economic time series, one has to take a stand on how to separate the cyclical component of the time series from the trend component.
One approach is to simply fit a linear trend to the time series. The problem with this is that there are typically medium-run changes in growth trends (e.g. real GDP grew at a relatively high rate in the 1960s, and at a relatively low rate from 2000-2012). If we are interested in variation in the time series only at business cycle frequencies, we want to take out some of that medium-run variation. This requires that we somehow allow the growth trend to change over time, which is essentially what the HP filter does. The HP filter takes an economic time series y(t) and fits a trend g(t) to that raw series by solving a minimization problem: the trend g(t) is chosen to minimize the sum of squared deviations of y(t) from g(t), plus the sum of squared second differences of g(t), weighted by a smoothing parameter L (the Greek letter lambda in the paper). The minimization problem penalizes changes in the growth trend, with the penalty increasing in L. The larger L is, the smoother the trend g(t) will be.
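The minimization Williamson describes has a closed-form solution: stacking the second differences of g into a matrix K, the first-order conditions give trend = (I + λK′K)⁻¹y. A minimal NumPy sketch (the function name and example series are illustrative, not from the paper):

```python
import numpy as np

def hp_filter(y, lamb=6.25):
    """Split a series into an HP trend and cycle.

    Minimizes sum((y - g)**2) + lamb * sum(squared second differences of g).
    The first-order conditions give trend = (I + lamb*K'K)^-1 y,
    where K is the (T-2) x T second-difference matrix.
    """
    y = np.asarray(y, dtype=float)
    T = len(y)
    K = np.zeros((T - 2, T))
    for i in range(T - 2):
        K[i, i:i + 3] = (1.0, -2.0, 1.0)
    trend = np.linalg.solve(np.eye(T) + lamb * K.T @ K, y)
    return trend, y - trend

# A perfectly linear series has zero second differences, so it is its
# own trend and the extracted cycle is (numerically) zero:
trend, cycle = hp_filter(2.0 + 0.03 * np.arange(41))  # 41 annual obs, 1955-1995
```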
Paul Krugman writes that what is wrong with this view is that a statistical technique is only appropriate if the underlying assumptions behind that technique reflect economic reality — and that’s almost surely not the case here. The use of the HP filter presumes that deviations from potential output are relatively short-term, and tend to be corrected fairly quickly. In other words, the HP filter methodology basically assumes that prolonged slumps below potential GDP can’t happen. Instead, any protracted slump gets interpreted as a decline in potential output!
Tim Duy argues that these features are what make the HP filter show a period of substantial above-trend growth through the middle of 2008, contrary to what most people would believe.
Tim Duy points out that if you don’t deal with the endpoint problem, you get that actual output is above the HP trend, a proposition that most people would say is nonsensical. By itself, the need to deal with the endpoint problem should raise red flags about using the HP filter to draw policy conclusions about recent economic dynamics. Luís Morais Sarmento explains that the endpoint problem results from the fact that the series smoothed by the HP filter tends to stay close to the observed data at the beginning and at the end of the estimation period. This problem is more important when actual output is far from potential output. At Banco de Portugal, to address the endpoint problem, they extend the GDP series by several years using their own projections of GDP growth. Marianne Baxter and Robert King (1999) recommend dropping at least three data points from each end of the sample when using the Hodrick-Prescott filter on annual data.
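The endpoint problem can be seen by re-running the filter as the sample grows: the trend estimate near the sample boundary is revised substantially once later observations arrive, while mid-sample estimates barely move, which is what motivates the Baxter–King advice to drop observations at the ends. A small simulation sketch (synthetic data; the solver is the standard closed-form HP trend, not code from the paper):

```python
import numpy as np

def hp_trend(y, lamb=6.25):
    # Closed-form HP trend: solve (I + lamb*K'K) g = y,
    # with K the second-difference matrix.
    y = np.asarray(y, dtype=float)
    T = len(y)
    K = np.zeros((T - 2, T))
    for i in range(T - 2):
        K[i, i:i + 3] = (1.0, -2.0, 1.0)
    return np.linalg.solve(np.eye(T) + lamb * K.T @ K, y)

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(0.5, 1.0, 60))  # synthetic log-output-like series

short = hp_trend(y[:50])                 # trend estimated with data through "today"
full = hp_trend(y)                       # trend once ten more observations exist
endpoint_revision = abs(full[49] - short[49])
interior_revision = abs(full[25] - short[25])
# The endpoint estimate is revised far more than the mid-sample one,
# which is why dropping points near the ends (Baxter and King, 1999)
# makes the retained cycle estimates more reliable.
```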
Timothy Cogley and James Nason (1995) investigate the properties of the HP filter. The first problem is that the HP filter can generate spurious cycles in difference-stationary (DS) processes. In other words, the “facts” about business cycles obtained from HP-filtered data are often “artifacts”. In essence, what HP does is decompose a series into two parts, trend and deviation, where the trend includes all movements with a period longer than about 8 years (depending on lambda), while the deviation contains all other movements. The problem is that after such a decomposition the deviation component can exhibit properties that could not be observed in the initial series.
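The Cogley–Nason point can be illustrated by filtering a pure random walk, a DS process whose increments are white noise and which has no cyclical component by construction; the HP “cycle” nevertheless comes out strongly serially correlated. A sketch with synthetic data (standard closed-form HP solver, not code from their paper):

```python
import numpy as np

def hp_cycle(y, lamb=1600):
    # Closed-form HP filter: cycle = y - (I + lamb*K'K)^-1 y,
    # with K the second-difference matrix.
    y = np.asarray(y, dtype=float)
    T = len(y)
    K = np.zeros((T - 2, T))
    for i in range(T - 2):
        K[i, i:i + 3] = (1.0, -2.0, 1.0)
    return y - np.linalg.solve(np.eye(T) + lamb * K.T @ K, y)

rng = np.random.default_rng(1)
walk = np.cumsum(rng.normal(size=400))   # random walk: iid increments, no cycle
cycle = hp_cycle(walk)                   # quarterly-style lambda = 1600

# The walk's increments are white noise, yet the extracted "cycle"
# is highly persistent -- a pattern created by the filter itself.
lag1 = np.corrcoef(cycle[:-1], cycle[1:])[0, 1]
```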
Note: the trend-cycle decompositions are based on the entire post-war period for the US economy. The graph only displays the results for the last decade.
(1) The HP filter produces series with spurious dynamic relations that have no basis in the underlying data-generating process.
(2) A one-sided version of the filter reduces but does not eliminate spurious predictability and moreover produces series that do not have the properties sought by most potential users of the HP filter.
(3) A statistical formalization of the problem typically produces values for the smoothing parameter vastly at odds with common practice, e.g., a value for λ far below 1600 for quarterly data.
(4) There’s a better alternative. A regression of the variable at date t+h on the four most recent values as of date t offers a robust approach to detrending that achieves all the objectives sought by users of the HP filter with none of its drawbacks.
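The alternative in point (4) regresses the variable at date t+h on a constant and its p most recent values as of date t, and takes the residual as the cyclical component. A minimal sketch: p=4 comes from the text (“the four most recent values”), while h=8 is an assumed horizon commonly used for quarterly data; names are illustrative:

```python
import numpy as np

def regression_filter(y, h=8, p=4):
    """Cycle = residual from regressing y[t+h] on a constant and
    y[t], y[t-1], ..., y[t-p+1].

    p=4 follows the text; h=8 is an assumed horizon for quarterly data.
    """
    y = np.asarray(y, dtype=float)
    T = len(y)
    t = np.arange(p - 1, T - h)            # dates with a full set of regressors
    X = np.column_stack([np.ones(len(t))] + [y[t - j] for j in range(p)])
    target = y[t + h]
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return target - X @ beta               # cycle for dates p-1+h .. T-1

# A deterministic linear trend is fully predictable from lagged values,
# so this filter assigns it no cyclical component:
cycle = regression_filter(100 + 0.8 * np.arange(60))
```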