Tuesday, September 22, 2020

CEO/Worker Pay Ratios: Some Snapshots [feedly]

CEO/Worker Pay Ratios: Some Snapshots
https://conversableeconomist.blogspot.com/2020/07/ceoworker-pay-ratios-for-different.html

Each year, US corporations are required to report the pay for their chief executive officers, and also to report the ratio of CEO pay to the pay of the median worker at the company. Lawrence Mishel and Jori Kandra report the results for 2019 pay in "CEO compensation surged 14% in 2019 to $21.3 million: CEOs now earn 320 times as much as a typical worker" (Economic Policy Institute, August 18, 2020). 

Back in the 1970s and 1980s, it was common for CEOs to be paid something like 30-60 times the wage of a typical worker. In 2019, the ratio was 320. 
A result of this shift is that while CEOs used to be paid three times as much as the top 0.1% of the income distribution, now they are paid about six times as much. 
What is driving this higher CEO pay ratio? In an immediate sense, the higher pay seems to reflect changes in the stock market. In the accompanying chart, the left-hand axis shows CEO pay; the right-hand axis shows the stock market as measured by the S&P 500 index. 
This rise in CEO/worker pay ratios has led to a continually simmering argument about the underlying causes. Does the rise reflect the market for talent, in the sense that running a company in a world of globalization and technological change has gotten harder, and the rewards for those who do it well are necessarily greater? Or does it reflect a greater ability of CEOs to take advantage of their position in large companies to grab a bigger share of the economic pie? One's answer to this question will turn, at least in part, on whether you think CEOs have played a major role in the rise of the stock market since about 1990, or whether you think they have just been riding along on a stock market that has risen for other reasons. An example of this dispute appeared a few years ago in the Journal of Economic Perspectives (where I work as Managing Editor). 
Without trying to resolve that dispute here, I'd offer this thought: Notice that pretty much all of the increase in CEO/worker pay ratios happened in the 1990s, and the ratio has been at about the same level since then. Thus, if you think that the market for executive talent was rewarding CEOs appropriately, you need an explanation for why the increase happened all at once in about a decade, without much change since then. If you think the reason is that CEOs are grabbing a bigger share of the pie, you need an explanation for why CEOs became so much more able to do that in the 1990s, but then their ability to grab even-larger shares of the pie seemed to halt at that point. To put it another way, when discussing a change that happened in the 1990s, you need an explanation specific to the 1990s. 

I don't have a complete explanation to offer, but one obvious possible cause dates to 1993, when Congress and the Clinton administration enacted a bill with the goal of holding down the rise in executive pay (visible in the first graph above). Up into the 1980s, most top executives had been paid via an annual salary plus a bonus. However, the new law capped the tax deductibility of salaries for top executives at $1 million, and instead required that other pay be linked to performance--which in practice meant giving stock options to executives. Although this law was intended to hold down executive pay, it had the opposite effect: the stock market more-or-less tripled in value from late 1994 to late 1999, and so those who had stock options did very well indeed. My own belief is that this combination of events reset the common expectations for what top executives would be paid, and how they would be paid, in a way that is a primary driver of the overall rise in income inequality in recent decades. 

 -- via my feedly newsfeed

Africa is Not Five Countries [feedly]

Tim Taylor asks -- economists often treat Africa as if it were one country rather than 54 of them. Is it just a slip, or a racist one: as in, "they all [African nations] look alike"?

Africa is Not Five Countries

https://conversableeconomist.blogspot.com/2020/09/africa-is-not-five-countries.html

Scholars of the continent of Africa sometimes feel moved to expostulate: "Africa is not a country!" In part, they are reacting against a certain habit of speech and writing where someone discusses, say, the United States, China, Germany, and Africa--although only the first three are countries. More broadly, they are offering a reminder that Africa is a vast place, and that generalizations about "Africa" may apply only to some of the 54 countries in Africa.

Economic research on "Africa" apparently runs some risk of falling into this trap. Obie Porteous has published a working paper that looks at published economics research on Africa: "Research Deserts and Oases: Evidence from 27 Thousand Economics Journal Articles" (September 8, 2020). Porteus creates a database of all articles related to African countries published between 2000-2019 in peer-reviewed economics journals. He points out that the number of such articles has been rising sharply: "[T]he number of articles about Africa published in peer-reviewed economics journals in the 2010s was more than double the number in the 2000s, more than five times the number in the 1990s, and more than twenty times the number in the 1970s." His data shows over 19,000 published economics article about Africa from 2010-2019, and another 8,000-plus from 2000-2010. 

But the alert reader will notice how easy it is, as shown in the previous paragraph, to slip into discussing articles "about Africa." Are economists studying a wide range of countries across the continent, or are they studying relatively few? Porteous has some discouraging news here: "45% of all economics journal articles and 65% of articles in the top five economics journals are about five countries accounting for just 16% of the continent's population."

The "frequent 5" five much-studied countries are Kenya, South Africa, Ghana, Uganda, and Malawi. As Porteous points out, it's straightforward to compile the "scarce 7": the seven countries Sudan, D.R. Congo, Angola, Somalia, Guinea, Chad, and South Sudan,with the same population as the frequent 5, but account for only 3.5% of all journal articles and 4.7% of articles in the top 5 journals.

What explains why some countries are common locations for economic research while others are not? Porteous writes: "I show that 91% of the variation in the number of articles across countries can be explained by a peacefulness index, the number of international tourist arrivals, having English as an official language, and population." It's certainly easier for many economists to do research in English-speaking countries that are peaceful and popular tourist destinations--and that's what has been happening. There's also evidence that even within the highly-researched countries, some geographic areas are more often researched than others. 
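
Porteous's exact specification isn't given in the blog post, but a minimal sketch of that kind of cross-country regression might look like the following (Python, with invented placeholder data; the variable scaling and functional form are my assumptions, not his):

```python
# Hypothetical sketch of a cross-country regression in the spirit of Porteous:
# article counts explained by peacefulness, tourist arrivals, English as an
# official language, and population. Every number below is invented.
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "articles":         [1200, 950, 800, 600, 450, 60, 40, 25],    # articles about each country
    "peace_index":      [3.1, 2.9, 3.0, 2.8, 2.7, 1.5, 1.2, 1.0],  # higher = more peaceful
    "tourist_arrivals": [2.0, 10.2, 1.1, 1.5, 0.8, 0.4, 0.3, 0.1], # millions per year
    "english_official": [1, 1, 1, 1, 1, 0, 0, 0],                  # English an official language
    "population":       [48, 59, 31, 44, 19, 43, 87, 11],          # millions
})

X = sm.add_constant(df[["peace_index", "tourist_arrivals",
                        "english_official", "population"]])
fit = sm.OLS(df["articles"], X).fit()

# With Porteous's real data, a regression like this has an R-squared of ~0.91.
print(fit.rsquared)
print(fit.params)
```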

Of course, it's often useful for a research paper to focus on a specific situation. The hope is that as such papers accumulate, broad-based lessons begin to emerge that can apply beyond the context of a specific country (or area within a country). But local and national context is often highly relevant to the findings of an economic study. It seems that a lot of what economic research has learned about "Africa" is actually about a smallish slice of the continent. 


 -- via my feedly newsfeed

Stock Buybacks: Leverage vs. Managerial Self-Dealing [feedly]

Tim Taylor does a great dive into stock buybacks -- and you do not need a master's in economics to get it. :)

Stock Buybacks: Leverage vs. Managerial Self-Dealing

https://conversableeconomist.blogspot.com/2020/09/stock-buybacks-leverage-vs-managerial.html

Consider a company that has been earning profits, and wants to pay some or all of those earnings to its shareholders. There are two practical mechanisms for doing so. Traditionally, the best-known approach was for the firm to pay a dividend to shareholders. But in the last few decades, many US firms instead have used stock buybacks. How substantial has this shift been, and what concerns does it raise? 

Here, I'll draw upon a couple of recent discussions of stock buybacks. Sirio Aramonte writes about "Mind the buybacks, beware of the leverage" in the BIS Quarterly Review (September 2020, pp. 49-59). Kathleen Kahle and René M. Stulz tackle the topic from a different angle in "Why are Corporate Payouts So High in the 2000s?" (NBER Working Paper 26958, April 2020, subscription required). 

Kahle and Stulz present evidence both that overall corporate payouts to shareholders are up in the 21st century and that stock buybacks are the primary vehicle by which this has happened. They calculate that total payouts from corporations to shareholders from 2000-2017 (both dividends and share buybacks) were about $10 trillion. They write: 
In the 2000s, annual aggregate real payouts average roughly three times their pre-2000 level. ... Specifically, in the aggregate, higher earnings explain 38% of the increase in real constant dollar payouts and higher payout rates account for 62% of the increase. ...

In our data, the growth in payout rates, defined as the ratio of net payouts to operating income, comes entirely from repurchases. This finding is consistent with the evidence in Skinner (2008) on the growing importance of repurchases. Dividends average 14.4% of operating income from 1971 to 1999 and 14% from 2000 to 2017. In contrast, net repurchases, defined as stock purchases minus stock issuance, average 4.8% of operating income before 2000 and 18.3% from 2000 to 2017.
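
As a quick illustration of the definitions in that passage, here is a stylized example (all figures hypothetical, not from their data):

```python
# Payout-rate arithmetic using the Kahle-Stulz definitions quoted above.
operating_income = 1000.0   # firm's operating income
dividends        = 140.0    # dividends paid to shareholders
stock_purchases  = 250.0    # gross share repurchases
stock_issuance   = 70.0     # new shares issued (e.g., employee stock plans)

# Their definition: net repurchases = stock purchases minus stock issuance.
net_repurchases = stock_purchases - stock_issuance

# Payout rate = ratio of net payouts (dividends + net repurchases)
# to operating income.
payout_rate = (dividends + net_repurchases) / operating_income

print(net_repurchases)  # 180.0
print(payout_rate)      # 0.32, i.e., 32% of operating income
```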
The tax code offers obvious reasons for share buybacks, rather than dividends, as economists were already discussing back in the 1980s. Dividends are subject to the personal income tax, and thus taxed at the progressive rates of the income tax. However, the gains of an investor who sells stock back to the company are taxed at the lower rate for capital gains. In addition, when a company pays a dividend, all shareholders receive it, but when a company announces a share buyback, not all shareholders need to participate if they do not wish to do so. Thus, share buybacks offer investors more flexibility about when and in what form they wish to receive a payout from the firm. 
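
To make that tax logic concrete, here is a minimal sketch; the 37% and 20% rates and the $60 cost basis are illustrative assumptions, not a statement of current tax law:

```python
# Compare the after-tax value of a $100 payout received as a dividend
# versus via a share buyback. Rates and basis are placeholders.

def after_tax_dividend(payout, ordinary_rate):
    # the full dividend is taxable in the year received
    return payout * (1 - ordinary_rate)

def after_tax_buyback(payout, basis, cap_gains_rate):
    # only the gain above the shareholder's cost basis is taxed,
    # and only when the shareholder chooses to sell
    gain = max(payout - basis, 0.0)
    return payout - gain * cap_gains_rate

payout = 100.0
print(after_tax_dividend(payout, ordinary_rate=0.37))              # 63.0
print(after_tax_buyback(payout, basis=60.0, cap_gains_rate=0.20))  # 92.0
```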

In addition, economists have also recognized for some decades that corporations will sometimes find themselves in a position of "free cash flow," where the company has enough money that it must choose whether to find productive internal investments for the funds, to pay out the money to shareholders, or to use the money to pay bonuses and perquisites to managers. If we agree that lavishing additional benefits on managers is not a socially attractive choice, and if the firm honestly doesn't see how to use the money productively for internal investments, then paying the funds out to shareholders seems the best choice. 

The public response to firms that pay dividends is often rather different from the response when a firm does a share buyback--even when the same payout is flowing from the firm to its shareholders. The concern sometimes expressed is that corporate managers have an unspoken additional agenda with stock buybacks, which is to pump up the price of the company's stock--and in that way to increase the stock-based performance bonus for the managers.

Sirio Aramonte also documents the substantial rise in stock buybacks in recent decades. He points out that a primary cause for stock buybacks is for firms to increase their leverage--that is, to increase the proportion of their financing that happens through debt. He writes: "Corporate stock buybacks have roughly tripled in the last decade, often to attain desired leverage, or debt as a share of assets." This pattern especially holds true if the firm finances the stock buyback with borrowed money, rather than out of previously earned profits. He writes: 
In 2019, US firms repurchased own shares worth $800 billion (Graph 1, first panel; all figures are in 2019 US dollars). Net of equity issuance, the 2019 tally reached $600 billion. Net buybacks can turn negative, and they did during the GFC [global financial crisis of 2007-9], as firms issued equity to shore up their balance sheets. ... Underscoring the structural differences between dividends and buybacks, the former were remarkably smooth, while the latter proved procyclical and co-moved with equity valuations ...
Aramonte crisply summarizes the case for share buybacks: 
In a number of cases, repurchases improve a firm's market value. For instance, if managers perceive equity as undervalued, they can credibly signal their assessment to investors through buybacks. In addition, using repurchases to disburse funds when capital gains are taxed less than dividends increases net distributions, all else equal. Furthermore, by substituting equity with debt, firms can lower funding costs when debt risk premia are relatively low, especially in the presence of search for yield. And, by reducing funds that managers can invest at their discretion, repurchases lessen the risk of wasteful expenditures.
What about the concern that corporate managers are using share buybacks to pump up their stock-based bonuses? Aramonte's discussion suggests that this may have been an issue in the past--say, pre-2005--but that the rules have changed. Companies have been shifting away from bonuses based on short-term stock prices, and toward bonuses based on long-term stock value for executives who stay with the firm. There are increased regulations and disclosure rules to limit this practice. Also, if CEOs were using stock buybacks in a short-term pump-and-dump strategy, then the stock price should first jump after a buyback and then fall back to its earlier level--and we don't see this pattern in the data. Thus, this concern that managers are abusing stock buybacks seems overblown. 

What about the linkages from stock buybacks to rising corporate debt? Aramonte provides some evidence, and also refers to the Kahle/Stulz study: 
[B]uybacks were not the main cause of the post-GFC rise in corporate debt. After 2000, internally generated funds became more important in financing buybacks. For one, economic growth resulted in rising profitability. In addition, firms exhibited a higher propensity to distribute available income. Kahle and Stulz (2020) find that cumulative corporate payouts from 2000 to 2018 were higher than those from 1971 to 1999 and that two thirds of the increase was due to this higher propensity.

In short, the overall level of rising corporate debt in recent years is a legitimate cause for concern (as I've noted here, here, and here). Share buybacks are one of the tools that US firms have used to increase their leverage, but the real issue here is whether the higher levels of debt have made US firms shakier, not the use of share buybacks as part of that strategy. The pandemic recession is likely to provide a harsh test of whether firms with more debt are also more vulnerable. As Aramonte writes: 

There is, however, clear evidence that companies make extensive use of share repurchases to meet leverage targets. The initial phase of the pandemic fallout in March 2020 put the spotlight on leverage: irrespective of past buyback activity, firms with high leverage saw considerably lower returns than their low-leverage peers. Thus, investors and policymakers should be mindful of buybacks as a leverage management tool, but they should particularly beware of leverage, as it ultimately matters for economic activity and financial stability.

 -- via my feedly newsfeed

It’s not Vaccine Nationalism, It’s Vaccine Idiocy [feedly]

Dean Baker at his best: a passive-aggressive assault on stupidity.

It's not Vaccine Nationalism, It's Vaccine Idiocy

http://feedproxy.google.com/~r/beat_the_press/~3/eLbxNYR-4Bc/

Last week an official with China's Center for Disease Control and Prevention (CDC) said that the country may have a vaccine available for widespread distribution by November or December. This would almost certainly be at least a month or two before a vaccine is available for distribution in the United States, and possibly quite a bit longer.

While we may want to treat statements from Chinese government officials with some skepticism, there is reason to believe that this claim is close to the mark. China has reported giving its vaccines to more than 100,000 people. In addition to giving them to tens of thousands of people enrolled in clinical trials, it also has given them to front-line workers, such as medical personnel, through an emergency use authorization. 

This may not have been a good policy, since these workers faced the safety risks associated with a vaccine that has only undergone limited testing, but it does mean that a large number of people have now been exposed to China's leading vaccine candidates. If there were serious side effects, it would be hard for China to bury evidence of large numbers of adverse reactions. If no such evidence surfaces, we can assume that bad reactions to the vaccines were rare, not very serious, or both.

Of course, the evidence to date tells us little about long-term effects. But that would be true even if we had a couple more months of testing. Evidence of long-term effects may not show up for years. Ideally, researchers would have enough understanding of a vaccine so that they would largely be able to rule out problems showing up years down the road, but we know they do sometimes overlook risks. In any case, the possibility of longer-term problems would still be there with a longer initial testing period.

It is possible that the vaccines are not as effective as claimed. Since China has been very successful in controlling the pandemic, even front-line workers would face limited risk of exposure. However, they have been doing Phase 3 testing in Brazil, Bangladesh, and other countries with much more severe outbreaks. 

On this issue, it is worth noting that the United Arab Emirates (UAE), one of the countries in which China is conducting its phase 3 trials, just granted an emergency use authorization for one of its vaccines to be given to frontline workers there. Presumably this reflects positive results of the trial, since it is unlikely that the UAE would grant this authorization simply to please China's government. 

The companies have not yet shared their data, so it's possible that the evidence does not support the claim of this Chinese CDC official. But here too, the value of making an obviously false claim would be limited. If the companies either fail to produce their data, or the data does not show solid evidence of a vaccine's effectiveness, the official and China's government would end up looking rather foolish.

Since they are not just trying to bluff their way through an election, but are rather concerned about China's longer-term standing in the world, it's hard to see why they would make a claim that would soon be shown to be false. In short, the promise of a vaccine being distributed in November or December is quite likely true. 

I suppose this will get those hoping that the United States would win the vaccine "race" very angry. But it should get the rest of us asking why we were having a race. 

Why Is Cooperative Research So Hard to Understand?

In the early days of the pandemic there was a large degree of international cooperation, with scientists around the world quickly sharing new findings. This allowed our understanding of the virus to advance far more rapidly than would otherwise have been the case. 

But we quickly shifted to a path of nationalistic competition. Donald Trump led the way down this path, with his "Operation Warp Speed." Other countries followed a similar route, even as they maintained some commitment to the World Health Organization's efforts to promote sharing with developing countries.

But the issue was not just nationalism; it was also the monopolization of research findings. If Moderna, Pfizer, or one of the other U.S. drug companies ends up developing a safe and effective vaccine, they fully intend to sell it at a considerable profit, and they will be sharing the money with their top executives and their shareholders, not the American people. This outcome makes sense if the point of the policy is to maximize drug company profits. It makes no sense if the policy goal is to produce the best health outcomes at the lowest possible cost.

The United States did not have to take the patent monopoly nationalistic route. Suppose that all the money from Operation Warp Speed went to fully open research. This would not just be an accidental outcome; it would be an explicit condition of the funding. If a drug company received money from this program, all its results would have to be posted on the Internet as quickly as practical, and any findings would be in the public domain.

Since we would not just want to pay for the rest of the world's research, we could have negotiated commitments from other countries to make payments that are proportionate, given their size and per capita income. Of course, there is no guarantee that they would all go along, especially with Donald Trump as president. But in principle, this would be a mutually beneficial agreement for pretty much everyone. 

They would contribute their share of funding to the research pool and they would then have the right to produce any vaccines or treatments that are developed. If it turned out to be the case that a U.S. drug company was the first to come up with an effective vaccine, any company with the necessary manufacturing facilities would be able to freely produce and distribute the vaccine anywhere in the world. They would not need to negotiate over patent rights.

The same would be true if, as now seems to be the case, China turned out to develop the first effective vaccine. Our manufacturers would be free to start producing the vaccine as soon as it received the necessary approvals from the Food and Drug Administration. There would be no issue of people here going without the vaccine just because the developer was a Chinese company.

It's not surprising that Donald Trump did not go the route of cooperative development. His first priority is advancing his own political prospects and if he thinks that means having the U.S. win a vaccine race, that is what he is going to do. And, he certainly has no intention of pursuing a course that could limit drug company profits.

But the big question is: where were the Democrats? If they were objecting to the path of vaccine nationalism and monopoly, it was not easy to hear their complaints. And I'm not talking just about centrist Democrats like Biden, Pelosi, and Schumer; I also didn't hear complaints from the Bernie Sanders or Elizabeth Warren wing of the party. Why were there no objections to the Trump course and advocacy for a cooperative alternative?

Going a cooperative route would not just offer benefits in the context of developing vaccines and treatments for the coronavirus, although these benefits would be incredibly important. It also could have provided a great model of an alternative path for financing the development of prescription drugs. 

We will spend over $500 billion this year on prescription drugs. We would pay less than $100 billion if these drugs were available in a free market without patent monopolies and related protections. The $400 billion in annual savings is more than five times what we spend on food stamps each year. It comes to nearly $3,000 per household. In other words, it is real money.
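
A quick back-of-the-envelope check on those numbers (the household count and food stamp outlays are my approximations, not Baker's):

```python
# Rough arithmetic behind Baker's claim. SNAP outlays and the household
# count are assumptions: ~$70 billion per year and ~128 million households.
drug_spending    = 500e9   # annual US prescription drug spending
free_market_cost = 100e9   # Baker's estimate without patent monopolies
savings = drug_spending - free_market_cost   # $400 billion per year

food_stamps = 70e9
households  = 128e6

print(savings / food_stamps)   # ~5.7, i.e., more than five times SNAP spending
print(savings / households)    # ~3125, i.e., close to $3,000 per household
```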

Patent and copyright monopolies are also a big part of the upward redistribution of income over the last four decades. If we had alternative mechanisms for financing innovation and creative work, people like Bill Gates would be much less rich, and the rest of us would have far more money.

Again, it is easy to understand why Donald Trump would have zero interest in promoting world health and reducing inequality. It's also understandable that politicians who are dependent on campaign contributions from those who have benefitted from upward redistribution would not want to pursue routes that call into question the mechanisms of upward redistribution. 

But where were the progressive voices? The pandemic gave us an extraordinary opportunity to experiment with an alternative mechanism for financing research that could have enormously benefitted public health, both in the United States and elsewhere. The failure to have a visible alternative will cost both lives and money long into the future.  


 -- via my feedly newsfeed

Monday, September 21, 2020

Every Day is a Bad Day, Say a Rising Share of Americans [feedly]

Every Day is a Bad Day, Say a Rising Share of Americans
https://conversableeconomist.blogspot.com/2020/09/every-day-is-bad-day-say-rising-share.html

The Behavioral Risk Factor Surveillance System (BRFSS) is a standardized phone survey about health-related behaviors, carried out by the Centers for Disease Control and Prevention (CDC). One question asks: "Now thinking about your mental health, which includes stress, depression, and problems with emotions, for how many days during the past 30 days was your mental health not good?" 

David G. Blanchflower and Andrew J. Oswald focus on this question in "Trends in Extreme Distress in the United States, 1993–2019" (American Journal of Public Health, October 2020, pp. 1538-1544). In particular, they focus on the share of people who answer that their mental health was not good for all 30 of the previous 30 days, whom they categorize as being in a condition of "extreme distress." Here are some patterns: 

This graph shows the overall and steady rise for men and women from 1993 to 2019. 

Here's a look at a specific age group, those 35-54 years of age, with a breakdown by education and by ethnicity. 
This kind of survey evidence doesn't let a researcher test for causality, but it's possible to look at some correlations. The authors write: "Regression analysis revealed that (1) at the personal level, the strongest statistical predictor of extreme distress was `I am unable to work,' and (2) at the state level, a decline in the share of manufacturing jobs was a predictor of greater distress."

Of course, one doesn't want to overinterpret graphs like this. The measures on the left-hand axis are single-digit percentages, after all. But remember, these people are reporting that their mental health hasn't been good for a single day in the last month. The share has been steadily rising over time, through different economic and political conditions. In those pre-COVID days of 2019, 11% of the white, non-college population--call it one out of every nine in this group--reported this form of extreme distress. The implications for both public health and politics seem worth considering. 

 -- via my feedly newsfeed

Uninsured Rate Rose in 2019; Income and Poverty Data Overtaken by Pandemic Recession [feedly]

Uninsured Rate Rose in 2019; Income and Poverty Data Overtaken by Pandemic Recession
https://www.cbpp.org/research/poverty-and-inequality/uninsured-rate-rose-in-2019-income-and-poverty-data-overtaken-by

Income rose and poverty declined in 2019, as would be expected in the peak year of a decade-long economic recovery, but the number of Americans without health insurance increased for the third consecutive year despite a growing economy.

"THE NUMBER OF AMERICANS WITHOUT HEALTH INSURANCE INCREASED FOR THE THIRD CONSECUTIVE YEAR DESPITE A GROWING ECONOMY."The rise in the share of Americans who were uninsured at the time of the Census Bureau survey — from 8.9 percent (28.6 million people) in 2018 to 9.2 percent (29.6 million people) in 2019 — likely reflects continued Administration actions that have reduced access to health insurance coverage. The data show a 0.7 percentage-point decline in health insurance among Hispanic individuals, driven by a large, 1.4 percentage-point drop in the share of Hispanic individuals with public coverage.

These health coverage data come from Census' American Community Survey (ACS), which was conducted before the pandemic hit. The income and poverty data, by contrast, come from the Current Population Survey (CPS), and their reliability is compromised because Census chiefly surveyed households for the CPS in March, as the pandemic was hitting. As Census noted in a paper released today, the pandemic and associated safety concerns forced it to stop in-person interviews and make other changes in collecting data, which sharply reduced response rates in the survey — especially among lower-income households. Census concluded that the increased non-response artificially boosted the income figures for 2019 and artificially lowered the poverty rate.[1]

Accordingly, Census issued adjusted figures for some limited points, including household median income and the overall poverty rate, to account for the biases the pandemic caused in the survey results. The adjusted Census numbers, which are much more reliable than the unadjusted numbers, show median household income climbing 4.1 percent in 2019 after adjusting for inflation, from $64,180 in 2018 to $66,790 in 2019.

The adjusted poverty numbers show the poverty rate falling from 11.9 percent in 2018 to 11.1 percent in 2019.

These data, however, bear little resemblance to conditions today, as the pandemic and recession have fundamentally altered the economic landscape. Households' circumstances differ sharply now from those reflected in the 2019 data.

Much more current Census Bureau data — from Census' Household Pulse survey, which since April has collected data on a weekly or bi-weekly basis — show that as of late August, more than 22 million adults were in households that lacked sufficient food the previous week and millions of renters had fallen behind on rent. And with millions of workers having lost their jobs and hence their employer-based health coverage, the ranks of the uninsured will likely increase.

These hardship data underscore the critical need for the nation's leaders to work out an agreement on strong new stimulus and relief measures before Congress adjourns later this month. It's particularly urgent to enact measures strengthening food and rental assistance, public health measures, state and local fiscal relief, and unemployment insurance, among other areas. Without such measures, poverty and hardship will likely worsen further in the months ahead.

Health Coverage in 2019

The uninsured rate rose from 8.9 percent in 2018 to 9.2 percent in 2019, according to the ACS. The ACS data were collected before the pandemic and so are unaffected by the March drop in response rates. Census noted in its written documents issued today (and explained on its press call) that these data give a more reliable picture of how health coverage changed in 2019 than the CPS. The ACS is also a larger survey and generally more suitable for measuring recent year-to-year changes in health insurance.

The increases in the uninsured rate in 2017, 2018, and 2019 followed six consecutive years of decline, the ACS data show — from 15.5 percent in 2010 to a historic low of 8.6 percent in 2016. But because of the increases after 2016, 2.3 million more Americans were uninsured in 2019 than in 2016, including more than 720,000 children.

The regression in the last three years came despite steady job growth that should have improved health coverage, and it likely reflects, in large part, the effects of Trump Administration policies that reduce access to coverage. For example, the Administration's policies toward immigrants, particularly its so-called "public charge" rule, have created a climate of fear among immigrants and their family members that discourages eligible people from enrolling in Medicaid or marketplace coverage. The new Census data show that groups disproportionately affected by these policies, including Hispanic adults, Hispanic children, and children who were not born in the United States, all saw much larger-than-average increases in uninsured rates.

To be sure, the uninsured rate in 2019 was still far below its level before enactment of the Affordable Care Act (ACA). But the ACA's remaining coverage gains will be in jeopardy if the Administration succeeds in its efforts to undo the law through the courts or legislation. Just last month the Administration filed its latest brief with the Supreme Court in its effort to overturn all of the ACA. That would add approximately 20 million people to the ranks of the uninsured, the Urban Institute estimated before the pandemic, and it would likely cause even larger coverage losses today since millions of people are turning to ACA coverage programs during the recession.

Income and Poverty in 2019

Median household income stood at $66,790 in 2019, about 4.1 percent above the inflation-adjusted 2018 level of $64,180, and 11.1 percent of Americans lived below the official poverty line, compared to 11.9 percent in 2018 — according to the Census data that correct for the impact of the pandemic on data collection in the CPS.

These improvements shouldn't be surprising, as 2019 was the tenth (and final) full year of a record-long economic recovery. As in most past "peak" recovery years, median income appears to have reached a new high in 2019. Median household income set new highs in 15 of the last 50 years, including more than 40 percent of non-recession years. The poverty rate fell in 2019 for the fifth straight year.

Although incomes clearly grew and poverty declined in 2019, a better assessment of changes in these areas in 2019 will require additional data, including detailed income and poverty findings from the ACS that are due out later this week. The ACS not only has a larger sample size than the CPS but also collected its 2019 data before the pandemic began. We expect its income and poverty data for 2019 to be the more reliable, in particular for the non-elderly population. (The 2019 ACS includes some wording changes that especially affect the elderly.)

2019 Economy Bears Little Resemblance to That of Today

Since the declaration on March 13 of a national coronavirus emergency, the nation has endured the deepest recession since the 1930s, fundamentally altering the economic landscape and making the 2019 findings an inaccurate gauge of current circumstances. The official unemployment rate, 3.5 percent in February, peaked above 14 percent in April and still exceeded 8 percent in August. Some 35 million people were either officially unemployed or lived with an unemployed family member in August, including 9 million children. This figure rises to 61 million workers and family members when one includes those not officially unemployed but nonetheless sidelined from the labor market, including workers with an unpaid absence from their job due to illness, caretaking responsibilities, or other reasons, as well as those who want a job but aren't actively seeking work given the state of the labor market and the risks the pandemic poses.[2]

Other data show sharp increases in various measures of hardship. Multiple studies have found that the number of households having difficulty affording food has soared during the crisis.[3] A July Brookings study found that "young children are experiencing food insecurity to an extent unprecedented in modern times,"[4] and more than 22 million adults — or more than 1 in 10 — reported in August that their household had "not enough to eat" sometimes or often in the last seven days. By comparison, fewer than 4 percent of adults reported experiencing this problem at any point in 2019, according to a CBPP analysis of a separate Census survey.[5]

In addition, millions of renters are behind on rent, and data suggest the number climbed higher over the summer.[6]

Adding to these concerns, the economic crisis and the pandemic appear to be hitting hardest those who already were struggling most to make ends meet. They include workers with no college education, who have lost their jobs at twice the national rate, and workers in low-paid industries, which have shed jobs at nearly three times the rate in high-paid industries. They also include Black and Latino individuals, many of whom work in low-paying jobs and entered the pandemic with limited household resources due to factors such as inadequate educational opportunities, employment discrimination, and other forms of racism.

Today's Census data show that in 2019, before the recession, Black and Hispanic individuals were more than twice as likely to be poor as non-Hispanic whites: 18.8 and 15.7 percent, respectively, versus 7.3 percent. (These data are not adjusted for the reporting bias described above, but similar disparities are also found in prior years.) And during the pandemic, employment levels have fallen more among Black and Latino workers than among non-Hispanic white workers. Data that Census collected in August as part of its Household Pulse survey show Black and Latino adults also are more likely to live in households that lack sufficient food.


 -- via my feedly newsfeed

The World Is Losing the Money Laundering Fight [feedly]

The World Is Losing the Money Laundering Fight
https://www.washingtonpost.com/business/the-world-is-losing-the-money-laundering-fight/2020/09/21/8750087c-fc1a-11ea-b0e4-350e4e60cc91_story.html?utm_source=feedly&utm_medium=referral&utm_campaign=wp_business

We've just had the closest look yet at the global battle against money laundering, and it's deeply troubling: Banks and their regulators are nowhere near restraining the flow of trillions of dollars of illicit funds. 

Both the finance industry and the authorities are to blame. Without an urgent, concerted political effort, criminals — from drug dealers and terrorists to human traffickers — will keep the upper hand.

In a year-long investigation by BuzzFeed and the International Consortium of Investigative Journalists, reporters pored over about 2,100 suspicious activity reports, or SARs, which lenders file to the U.S. Treasury's Financial Crimes Enforcement Network (FinCEN) when they spot potential money laundering and other bad behavior.

While the number of SARs reviewed by the journalists dwarfs any previous access to these confidential documents, they're still just a tiny fraction — 0.02% — of the 12 million or so SARs that were probably filed during the period in question, mostly 2011 through 2017. Also, the sample isn't representative of overall banking activity. Some records stem from the U.S. congressional investigation into interference with the 2016 presidential election. Almost half of the SARs came from Deutsche Bank AG.

Still, the sums and the patterns of failings are staggering. This small number of reports alone flagged up $2 trillion of fund flows, $1.3 trillion from Deutsche Bank, that may have stemmed from criminal activity. And the FinCEN Files are just the tip of the iceberg, as Transparency International put it. (The banks' responses to BuzzFeed are here.) 

The U.K., home to more than 600 companies flagged in the reports, appears to be the biggest hub for dodgy money flows, with the U.S. second. Britain clearly hasn't done enough to tighten laws against money launderers. A huge web of enablers, from lawyers to accountants and bankers, helps oil the wheels of illicit finance through London.

Banks, for their part, are too slow if not outright negligent in submitting SARs. More than a fifth of the documents included in the submissions related to subjects whose addresses weren't known to the banks, including companies with which the lenders were already banking.

SARs, which should be filed within 60 days of detecting potential criminal activity, were sometimes submitted years later. According to the BuzzFeed/ICIJ report, that was allegedly the case with JPMorgan Chase & Co., which processed payments for Paul Manafort, President Donald Trump's former campaign manager, after he resigned from the 2016 campaign amid money-laundering allegations. HSBC Holdings Plc kept moving money for an investment fund that was already under investigation over allegations it was a Ponzi scheme, the report says.

Alarmingly, banks often just rely on internet searches to find out who their clients are and only file suspicious reports after news breaks or formal investigations are launched. In the bundle of SARs reviewed by the reporters, the median filing time since the suspicious activity began was 166 days. Imagine how far those funds would have gone in half a year.

The reviewed SARs pertain to a period when many banks were already being investigated and punished for failing to adhere to money-laundering regulation. But the billions of dollars in fines levied against them haven't changed behavior much.

Having a bigger legal stick with which to whack errant bankers and other enablers would help. So too would a rethink of how money laundering is tackled by policy makers. Police forces and national regulators don't only struggle to cooperate across borders; even within some crime agencies, various units don't always share information.

Tom Kirchmaier, a policing and crime researcher at the London School of Economics, suggests a three-step solution. For starters, the SARs filings must be standardized. Far too much information is submitted in narratives that are impossible to scrutinize. FinCEN has about 270 employees and receives up to 2 million SARs every year. "We're still stuck in the 19th century," Switzerland's former top money-laundering cop Daniel Thelesklaf says of his nation's paper-based efforts.

Second, Kirchmaier calls for far more sharing of data between regulators and enforcers. In Europe, more than 50 authorities supervise money laundering and terror financing. The European Commission will propose an EU-level supervisor next year. That's long overdue.

Lastly, Kirchmaier says humans need to be removed from the process as far as possible. Crime agencies should be able to automate the screening of SARs and report back to banks, providing them with an assessment of the risk associated with a particular client, for example. And lenders ought to be able to stop transactions to flagged entities without so much human intervention.

A radical improvement in the fight against money laundering may not be possible overnight, but — as I've written before — the system isn't working. As the speed of payments accelerates and virtual currencies proliferate, criminals will find new ways to move money. Banks and their supervisors collectively need to do much, much better.

Elisa Martinuzzi is a Bloomberg Opinion columnist covering finance. She is a former managing editor for European finance at Bloomberg News.



 -- via my feedly newsfeed