...the fallacy of composition separates micro from macro...
As I struggle with teaching senior majors a bit of macroeconomics, I am trying to think of ways to distinguish the mindset of micro (the majority of our curriculum) from that of macro.
(Aside: for the present, I'm posting primarily on the course site on the publicly-accessible W&L WordPress server, at http://econ398.academic.wlu.edu, with similar sites for Japan at econ272.academic.wlu.edu and Industrial Organization at econ243.academic.wlu.edu.)
One issue is data. If you take the skeptical view that structural change in the US since 1980 has been substantial -- look at changes in "openness" (trade shares, international financial flows), financial-sector reforms (the nature of "money," the rise of multistate banks and "shadow" banks), labor markets (education levels, mobility, less weight in unionized sectors, more in services), and demographics (more, and older, retirees, hence greater transfers) -- then you may be reluctant to think there is much to be garnered, and may indeed believe that much will be muddied, by using data more than 20 years old. Since key macro measures are available only quarterly, you're stuck with roughly 80 observations, which doesn't provide much statistical power, particularly given the infrequency of shocks and major policy changes. Using multi-country panel data requires even stronger assumptions than (say) using US data from 1962 on. Those doing micro work, by contrast, tend to use datasets with hundreds, if not many thousands, of observations.
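To see what 80 observations buys you, here is a minimal Monte Carlo sketch (my own illustrative setup, not from any particular study): it compares the spread of OLS slope estimates at a macro-sized sample of 80 against a micro-sized sample of 8,000, all numbers stylized.

```python
import numpy as np

rng = np.random.default_rng(0)

def slope_se(n, trials=1000, beta=0.5, noise=1.0):
    """Monte Carlo spread of OLS slope estimates from n observations.

    Illustrative only: y = beta*x + e, with x and e standard normal,
    so the sampling std of the slope is roughly noise / sqrt(n)."""
    estimates = []
    for _ in range(trials):
        x = rng.normal(size=n)
        y = beta * x + rng.normal(scale=noise, size=n)
        b_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)
        estimates.append(b_hat)
    return np.std(estimates)

# 80 quarterly observations vs. a micro-style cross-section
print("macro-sized sample (n=80):  ", slope_se(80))
print("micro-sized sample (n=8000):", slope_se(8000))
```

With n = 80 the slope estimate wobbles by about a tenth of its scale; with n = 8,000 it is an order of magnitude tighter, which is why micro empirics can be far more cavalier about sample size.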
Then there are aggregation issues. The more disaggregated the model, the more convincing its microfoundations (though ironically those who invoke that term often use models so aggregate that they are reduced to assuming representative agents with identical and unchanging preferences over labor vs. leisure and today vs. the future). Aggregation issues are present in micro markets as well, but there they are either more obvious or less severe, and typically both.
Most central, in my mind, is that the fallacy of composition separates micro from macro. A nice post on the Vox EU blog, "Micro success does not guarantee macro success," provides an illustration. The authors look at job search assistance programs, for which in the Danish case there are not only good data but a randomized design that helps control for extraneous factors. Such programs do indeed improve the speed at which workers find new jobs, by about 10% over 3 months. Since program participants then stop collecting unemployment benefits and start paying taxes, the program is extremely cost-effective.
However...such results are hard to replicate at scale, because unless the demand for workers is adequate, the primary effect is to change who gets jobs first -- those fortunate enough to be enrolled in the program -- not to create extra jobs.
Basically, the standard statistical design compares those in a city who were chosen (randomly) for the program with those who were not. Most (though not all) of the measured effect disappears when those not chosen for the program are instead compared with the newly unemployed elsewhere. At first glance that comparison is less clean, because it's harder to control for variation in geography and attendant local-industry effects. But the within-city comparison misses the point: the measured difference doesn't preclude that the program simply made those (randomly) not chosen wait longer to find a job. Furthermore, when the program is extended to a wider share of the unemployed, employers are flooded with applications, while the employment office's ability to tailor its help goes down. Indeed, past the point of enrolling about 30% of the unemployed, the spillovers dominate and the program ceases to be cost-effective.
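The displacement mechanism can be sketched in a toy simulation. Assume (my assumption, not the Danish study's design) a fixed pool of vacancies awarded to the most effective searchers, where the program boosts a worker's search effectiveness; all parameters are stylized.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(share_treated, n_workers=10_000, n_vacancies=3_000):
    """Toy job-search market with a fixed number of vacancies.

    Treated workers get a boost to search effectiveness, but the
    n_vacancies jobs simply go to the most effective applicants,
    so treatment reorders the queue rather than adding jobs."""
    treated = rng.random(n_workers) < share_treated
    effectiveness = rng.normal(size=n_workers) + 0.5 * treated
    hired = np.argsort(effectiveness)[-n_vacancies:]
    got_job = np.zeros(n_workers, dtype=bool)
    got_job[hired] = True
    return got_job[treated].mean(), got_job[~treated].mean(), got_job.mean()

for s in (0.05, 0.30, 0.80):
    t, c, total = simulate(s)
    print(f"treated share {s:.0%}: treated hire rate {t:.2f}, "
          f"control {c:.2f}, overall {total:.2f}")
```

In every run the treated group outperforms the control group -- the within-city experiment "works" -- yet the overall hire rate is pinned at vacancies/workers no matter how many people are enrolled. The micro estimate is real; the macro job creation is zero.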
Macroeconomics is full of similar examples. One household can increase its saving to provide for retirement; that won't shift asset prices, and it won't shift aggregate consumption, so the household can effectively transfer resources across time. We have built that reasoning (fallaciously) into our economy in the form of the Social Security Trust Fund. Unfortunately, we can't put doctors in deep freeze: medical services have to come out of contemporaneous production. So for retirees to consume medical services (and, in the aggregate, other consumption goods), we who are working have to consume less. All retirement is fundamentally pay-as-you-go. The Trust Fund is meaningless: when the future comes, the idea is that it sells off assets, and it is not small relative to the economy. For it to do so requires us to save more (to buy those bonds), or to be taxed more (so the government can buy them on our behalf), or (but only in the short term) for the debt to be rolled over into general government debt.
I won't pretend this is simple to understand. But then, I don't pretend that macroeconomics is easy. It requires us to deal with aggregation and spillovers, which is not what we do in our day-to-day decision making. That in turn requires abstraction, and building models to check that we've aggregated consistently; it turns out to be very easy to play with ideas only to discover that they don't add up, that they are internally inconsistent (and not in a small way).
And then there remains the challenge of testing these abstractions against our scanty set of real-world data.