The Official U.S. Poverty Rate Is Based on a Hopelessly Out-of-Date Metric
The poverty rate in the United States fell to 11.8 percent in 2018, according to data released last week by the Census Bureau — the lowest it's been since 2001. But this estimate significantly understates the extent of economic deprivation in the United States today. Our official poverty line hasn't kept up with economic change. Nor has it been modified to take into account widely held views among Americans about what counts as "poor."
A better, more modern measure of poverty would set the threshold at half of median disposable income (that is, median income after taxes and transfers, adjusted for household size), a standard commonly used in other wealthy nations. According to the Organization for Economic Cooperation and Development, a group of 36 mostly wealthy democracies, 17.8 percent of Americans were poor by this standard in 2017, the most recent year available for the United States.
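In practice, "adjusted for household size" means dividing each household's income by an equivalence scale before comparing it with the median. A minimal sketch of how such a check might work, assuming the square-root equivalence scale the OECD commonly uses and purely hypothetical household data:

```python
# Relative poverty check under a half-of-median standard.
# Assumes the square-root equivalence scale (an OECD convention):
# household income is divided by sqrt(household size), and the
# poverty line is half the median of those equivalized incomes.
from math import sqrt
from statistics import median

def equivalized(income: float, household_size: int) -> float:
    return income / sqrt(household_size)

# Hypothetical (income, household size) pairs, not real survey data:
households = [(25_000, 4), (52_000, 2), (31_000, 1), (76_000, 3), (40_000, 2)]

eq_incomes = [equivalized(inc, size) for inc, size in households]
poverty_line = 0.5 * median(eq_incomes)

for (inc, size), eq in zip(households, eq_incomes):
    status = "poor" if eq < poverty_line else "not poor"
    print(f"income=${inc:,}, size={size}: equivalized=${eq:,.0f} ({status})")
```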
To be sure, there is no such thing as a purely scientific measure of poverty. Poverty is a social and political concept, not merely a technical one. At its core, it is about not having enough income to afford what's needed to live at a minimally decent level. But there's no purely scientific way to determine what goods and services are "necessary" or what it means to live at a "minimally decent level." Both depend in part on shared social understandings and evolve over time as mainstream living standards evolve.
At a minimum, we should set the poverty line in a way that is both transparent and roughly consistent with the public's evolving understanding of what is necessary for a minimally decent life. The official poverty line used by the Census Bureau fails that test. It was set in the early 1960s at three times the cost of an "economy food plan" developed by the Agriculture Department.
The plan was meant for "temporary or emergency use when funds are low" and assumed "that the housewife will be a careful shopper, a skillful cook, and a good manager who will prepare all the family's meals at home." The decision to multiply the cost of the economy food plan by three was based on a 1955 food consumption survey showing that families spent about one-third of their income on food at that time. Since then, the measure has stayed the same, adjusted only for inflation.
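The arithmetic behind that construction is simple enough to reproduce. A minimal sketch, using an illustrative food-plan cost rather than the historical figure:

```python
# Sketch of the 1960s construction: the poverty line is the cost of
# the economy food plan divided by the food share of family budgets
# (about one-third in the 1955 survey), i.e., food cost times three.

FOOD_SHARE_OF_BUDGET = 1 / 3  # from the 1955 food consumption survey

def poverty_threshold(annual_food_plan_cost: float) -> float:
    """Threshold = food plan cost scaled up by the food budget share."""
    return annual_food_plan_cost / FOOD_SHARE_OF_BUDGET

# Illustrative (hypothetical) food plan cost, not the historical figure:
print(round(poverty_threshold(1_000)))  # -> 3000
```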
No expert today would argue that multiplying by three the cost of an antiquated government food plan — one that assumes the existence of a frugal "housewife" — is a sensible way to measure poverty in 2019, even adjusted for inflation. However meaningful this was as a measure of poverty in the 1960s (and that is debatable), it makes still less sense applied to today's American population, in which most people were born after 1980.
In 2018, the official poverty threshold for a family of two adults and two children was $25,465, or about $2,100 a month. If it had been set at half of median disposable income, it would have been $38,098, or $3,175 a month. Ask yourself: If you were part of a couple raising two children, could you afford the basics on $25,000 a year without going into debt or being evicted? Do you think other people would view you as no longer poor if your family's income were a bit over $25,000?
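The monthly figures work out as follows, using the two 2018 thresholds just cited:

```python
# Monthly equivalents of the two 2018 thresholds for a family of four.
official_threshold = 25_465   # official poverty line, 2018
relative_threshold = 38_098   # half of median disposable income, 2018

for label, annual in [("official", official_threshold),
                      ("half-of-median", relative_threshold)]:
    print(f"{label}: ${annual:,}/year = ${annual / 12:,.0f}/month")
# official: $25,465/year = $2,122/month
# half-of-median: $38,098/year = $3,175/month
```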
For context, if you were living on $25,000 a year in Baltimore, and paying the Housing and Urban Development Department's "fair market rent" for a two-bedroom apartment in that city, $1,411 in 2018, you'd be spending just over two-thirds of your income on rent and utilities alone. (HUD's fair market rent, used to set the value of benefits such as housing vouchers, is set at the 40th percentile of actual market rent.)
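That rent-burden figure is a quick calculation from the numbers above:

```python
# Share of a $25,000 income consumed by HUD's 2018 fair market rent
# for a two-bedroom apartment in Baltimore ($1,411/month).
annual_income = 25_000
monthly_fair_market_rent = 1_411

rent_share = 12 * monthly_fair_market_rent / annual_income
print(f"{rent_share:.1%}")  # -> 67.7%, just over two-thirds of income
```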
As it happens, when the official poverty line was first developed in the early 1960s, it was equal to roughly half of median disposable income. (Median disposable income back then was roughly $6,200 for a four-person family, and the official poverty threshold was $3,166.) Research using Gallup and other public opinion data, from the 1960s to the present, has found that, even as median income rose, most Americans continued to believe a family was "poor" if its income fell below roughly half of median disposable income. In other words, Americans for decades have instinctively thought of poverty partly as a matter of relative, not just absolute, deprivation.
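The "roughly half" claim checks out against those two figures:

```python
# Checking the early-1960s ratio cited above.
median_disposable_income_1960s = 6_200  # four-person family, approximate
official_threshold_1960s = 3_166

print(f"{official_threshold_1960s / median_disposable_income_1960s:.0%}")
# -> 51%, i.e., roughly half of median disposable income
```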
This common-sense notion is backed up by research documenting that relative deprivation is bad for health, well-being and social participation. And the negative impact of low income on health and well-being isn't limited to those who are most absolutely deprived: It is apparent at every step of the income ladder.
Many of our international peers measure poverty in relative terms as well. In addition to the OECD, which uses half of median disposable income for its comparisons of poverty in member countries, Canada, Ireland and the United Kingdom use similar measures in their domestic statistical reports on poverty. But the United States continues to use an idiosyncratic measure developed during the Kennedy and Johnson administrations. One side effect is that, because median income has outpaced inflation over time, the official poverty line has fallen further and further behind mainstream living standards.
To be sure, some critics, including in the Trump administration, seem to think the real problem with the official poverty line is that it's too generous. They argue, among other things, that the consumer price index often overstates inflation in the prices of the things workers and their families must buy to get by; therefore, they say, the inflation-adjusted poverty threshold has risen higher than it should have.
But when Americans think about what it means to be poor in 2019, they don't start by trying to imagine what it meant to be poor in the early 1960s (or the early 1900s, for that matter) and then updating that for inflation. Instead, they start from today's economy and today's society, and, when they do, most conclude that families need much more income to avoid poverty than the Census Bureau says they do.
There is one other important criticism of the current poverty line: it doesn't take into account taxes, tax credits such as the Earned Income Tax Credit, or in-kind transfers such as food stamps. But adopting a relative poverty measure set to a percentage of disposable income addresses this issue, too.
Finally, if we set a new poverty threshold using a relative approach, how should it be updated each year? In the United Kingdom, which uses 60 percent of median income, the threshold is adjusted each year to remain equal to that amount. But the U.K. also tracks poverty against an "anchored" threshold set to 60 percent of median income in a previous base year (15 years earlier in its most recent report), adjusting that line only for inflation. Tracking poverty over time in these two distinct ways may ease concerns that some have about measuring poverty in a relative fashion.
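A sketch of the two updating rules just described, using hypothetical income and inflation series rather than the U.K.'s actual figures:

```python
# Two ways to update a relative poverty line, per the UK practice above:
# (1) a "floating" threshold recomputed each year as 60% of that year's
#     median income, and (2) an "anchored" threshold fixed at 60% of
#     median income in a base year, then adjusted only for inflation.
# The income and inflation series below are hypothetical.

medians = {2015: 30_000, 2016: 30_900, 2017: 32_100}  # median income by year
inflation = {2016: 0.02, 2017: 0.025}                 # annual CPI change

floating = {yr: 0.60 * m for yr, m in medians.items()}

anchored = {2015: 0.60 * medians[2015]}               # base year: 2015
for yr in (2016, 2017):
    anchored[yr] = anchored[yr - 1] * (1 + inflation[yr])

for yr in sorted(medians):
    print(yr, f"floating=${floating[yr]:,.0f}", f"anchored=${anchored[yr]:,.0f}")
# When median income outpaces inflation, the floating line rises faster;
# that divergence is exactly what tracking both measures is meant to show.
```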
The dominant framework for measuring poverty in the United States is too technocratic and too ideologically conservative. There's never going to be unanimity on what counts as "poor," but we ought to give more weight to the views of ordinary Americans on that subject — which would also mean shifting toward the kind of metric used by our economic peer countries.
Shawn Fremstad is a senior policy fellow at the Center for Economic and Policy Research.