|Exposing The Lie Behind The Nonfarm Payroll Numbers|
|ZeroHedge, 09/07/2013|
While we have already extensively deconstructed the quality component of jobs in the US, showing first that in June 240K full-time jobs were lost even as 360K part-time jobs were "gained," and second that so far in 2013 only 130K full-time jobs have been added, offset by 557K part-time jobs, we had a sneaking suspicion that something was off with the quantity component as well. After all, at an average monthly gain of precisely 201.8K jobs over the past six months (i.e., in 2013), the number seemed just a little too perfect given the Fed's implicit target of averaging just over 200K jobs per month over a half-year period before it begins tapering, a goal made all the more pressing by declining gross issuance and fewer monetizable instruments. Today, courtesy of the monthly JOLTS survey, we got just the confirmation we needed: the official nonfarm payroll number per the Establishment Survey has indeed been substantially off, to the tune of a whopping 40% above what is quantitatively happening in reality.
The JOLTS, or Job Openings and Labor Turnover Survey, gets little respect, mainly because it arrives one month delayed. Indeed, moments ago it reported data referencing the month of May; since we got June's NFP data last week, the headline release is largely stale. Furthermore, since most people simply look at the survey's "Job Openings" figure and compare it to estimates, it provides little actionable data to the HFTs and algos that are all that's left of market traders these days.
However, inside the JOLTS survey also hide two other data sets: Hires and Separations. As the names imply, these show how many new workers businesses hired and how many workers left (whether they quit or were terminated). The delta between the Hires and the Separations foots with the actual reported number of job additions, which intuitively makes sense: the net number of jobs added per month is simply the Hires less the Separations. No rocket surgery there.
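The identity is trivial but worth making explicit. A minimal Python sketch follows; the hires and separations levels below are hypothetical round numbers chosen only for illustration (actual JOLTS flows run in the millions per month):

```python
# Net monthly job change as Hires minus Separations.
# Both figures below are hypothetical, for illustration only.
hires = 4_440_000        # workers hired during the month (hypothetical)
separations = 4_330_000  # quits + layoffs + other separations (hypothetical)

net_change = hires - separations
print(net_change)  # 110000 net jobs added for the month
```

This net-change series is what the chart below stacks up against the Establishment Survey's reported monthly job additions.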
So how does the chart look when comparing NFP monthly data with JOLTS Hires less Separations?
Presenting exhibit A: 13 years of JOLTS vs NFP data.
Starting in mid-2012, one can clearly see the growing gap between the official black nonfarm payrolls curve and the blue bars. The bars also come from an official BLS series, but nobody talks about it because its figures are released with a one-month delay...
What becomes obvious when looking at the highlighted area, is that something is very much afoot in BLS-land.
Consider April: according to JOLTS, there were 108K job additions. According to the NFP data, however, the number was 195K in new jobs created.
Or how about May? According to JOLTS, 118K jobs were added in May, once again roughly 40% below the NFP-reported 195K.
And so on. In fact, if one looks at the data for all of 2013, one can see that JOLTS shows average monthly gains of some 145,200 workers, while the Establishment Survey shows average monthly gains of 201,833.
Or just about a 39% difference!
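The 39% figure follows directly from the two averages just quoted; this quick check uses only the numbers in the text:

```python
# Verify the ~39% gap between the Establishment Survey and JOLTS averages.
nfp_avg = 201_833    # average monthly NFP gain, 2013 (from the article)
jolts_avg = 145_200  # average monthly JOLTS net gain, 2013 (from the article)

overstatement = (nfp_avg - jolts_avg) / jolts_avg
print(f"{overstatement:.1%}")  # → 39.0%
```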
The chart below shows how through May (because remember, it is delayed by a month) JOLTS indicated only 726K new jobs created, while the NFP report indicated over 1 million, or 1,016,000 new jobs to be specific.
Finally, looking at a simple moving average, the "bullish bias" in the NFP-reported data relative to JOLTS is now as big as it was just after the Lehman failure!
There is now a 69K trailing three month average benefit to NFP compared to what JOLTS is implying!
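That trailing average is simple arithmetic over the monthly gaps. In the sketch below, the April and May gaps follow from the figures quoted above (195K minus 108K and 195K minus 118K respectively), while the March value is a hypothetical placeholder chosen only so the illustration lands on the article's stated 69K:

```python
# Trailing three-month moving average of the monthly NFP-minus-JOLTS gap.
# Apr and May derive from figures in the text; Mar is a hypothetical placeholder.
gaps = [43_000, 87_000, 77_000]  # Mar (hypothetical), Apr, May

trailing_avg = sum(gaps[-3:]) / 3
print(int(trailing_avg))  # 69000
```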
In other words, the headline (and algo-moving) NFP data is overblown by some 40% cumulatively when looking at 2013 data. It also means that for the discrepancy to be rectified, there would have to be one month in which the Establishment Survey reports a 290,000 job decline.
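The 290K figure is just the cumulative year-to-date gap between the two series, using the totals quoted above:

```python
# Cumulative 2013 job gains through May, per the article's figures.
nfp_cumulative = 1_016_000    # Establishment Survey (NFP) total
jolts_cumulative = 726_000    # JOLTS Hires-less-Separations total

gap = nfp_cumulative - jolts_cumulative
print(gap)  # 290000 -- the one-month NFP decline needed to close the discrepancy
```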
We urge all readers to recreate the above result on their own: the Hires timeseries can be found here, the Separations timeseries is here, while the matching reported Nonfarm Payroll series is, as always, here.
We hope that, if nothing else, the above is a lesson to the BLS: when manipulating data series across dimensions, make sure the manipulations foot across all dimensions, not just one!