Saturday, November 21, 2020

Michael Roberts: G20: the debt solution [feedly]

Another impressive post from Mike Roberts on the G20, and related issues.

G20: the debt solution

Michael Roberts 
https://thenextrecession.wordpress.com/2020/11/21/g20-the-debt-solution/

This weekend, the G20 leaders' summit takes place – not physically of course, but by video link.  Proudly hosted by Saudi Arabia, that bastion of democracy and civil rights, the G20 leaders are focusing on the impact on the world economy from the COVID-19 pandemic.

In particular, the leaders are alarmed by the huge increase in government spending that the slump has forced on the major capitalist governments in order to ameliorate the impact on businesses, large and small, and on the wider working population. The IMF estimates that the combined fiscal and monetary stimulus delivered by advanced economies has been equal to 20 per cent of their gross domestic product. Middle income countries in the developing world have been able to do less but they still put together a combined response equal to 6 or 7 per cent of GDP, according to the IMF. For the poorest countries, however, the reaction has been much more modest. Together they injected spending equal to just 2 per cent of their much smaller national output in reaction to the pandemic. That has left their economies much more vulnerable to a prolonged slump, potentially pushing millions of people into poverty.

The situation is getting more urgent as the pain from the pandemic crisis starts to be felt. Zambia this week became the sixth developing country to default or restructure debts in 2020 and more are expected as the economic cost of the virus mounts — even amid the good news about potential vaccines.

The Financial Times commented that: "some observers think that even large developing countries such as Brazil and South Africa, which are both in the G20 group of large nations, could face severe challenges in obtaining finance in the coming 12 to 24 months."

Up to now, very little has been done by the G20 governments to avoid or ameliorate this coming debt disaster.  In April, Kristalina Georgieva, the IMF managing director, said the external financing needs of emerging market and developing countries would be in "the trillions of dollars". The IMF itself has lent $100bn in emergency loans. The World Bank has set aside $160bn to lend over 15 months.  But even the World Bank reckons that "low and middle-income countries will need between $175bn and $700bn a year".

The only co-ordinated innovation has been a debt service suspension initiative (DSSI) unveiled in April by the G20. The DSSI allowed 73 of the world's poorest countries to postpone repayments.  But pausing payments is no solution – the debt remains and even if G20 governments show some further relaxation, private creditors (banks, pension funds, hedge funds and bond 'vigilantes') continue to demand their pound of flesh.

In advanced economies and some emerging market economies, central bank purchases of government debt have helped keep interest rates at historic lows and supported government borrowing. In these economies, the fiscal response to the crisis has been massive. In many highly indebted emerging market and low-income economies, however, governments have had limited space to increase borrowing, which has hampered their ability to scale up support to those most affected by the crisis. These governments face tough choices.  For example, in 2020, government debt-to-revenue will reach over 480% across the 35 Sub-Saharan Africa countries eligible for the DSSI.

Even before the pandemic broke, global debt had reached record levels.  According to the IIF, in 'mature' markets, debt surpassed 432% of GDP in Q3 2020, up over 50 percentage points year-over-year. Global debt in total will have reached $277trn by year end, or 365% of world GDP.

Much of the increase in debt among the so-called developing economies has been in China, where state banks have expanded lending, 'shadow banking' loans have increased, and local governments have carried out more property and infrastructure projects funded by land sales or borrowing.

Many 'Western' pundits reckon that, as a result, China is heading for a major debt default crisis that will seriously damage the Beijing government and the economy.  But such predictions have been made for the last two decades since the minor 'asset readjustment' after 1998.  Despite the increase in debt levels in China, such a crisis is unlikely.

First, China, unlike other large and small emerging economies with high debts, has a massive foreign exchange reserve of $3trn.  Second, less than 10% of its debt is owed to foreigners, unlike countries like Turkey, South Africa and much of Latin America.  Third, the Chinese economy is growing. It has recovered from the pandemic slump much quicker than the other G20 economies, which remain in a slump.

Moreover, if any banks or finance companies go bust (and some have), the state banking system and the state itself stand behind them, ready to pick up the bill or allow 'restructuring'. And the Chinese state has the power to restructure the financial sector – as the recent blockage of the planned launch of Jack Ma's 'finbank' shows.  On any serious sign that the Chinese financial and property sector is getting 'too big to fail', the government can and will act.  There will be no financial meltdown. That's not the picture in the rest of the G20.

And most important, globally the rise in debt was not just in public sector debt but also in the private sector, especially corporate debt.  Companies around the world had built up their debt levels while interest rates were low or even zero.  The large tech companies did so in order to hoard cash, buy back shares to boost their price or to carry through mergers, but the smaller companies, where profitability had been low for a decade or more, did so just to keep their heads above water. This latter group have become more and more zombified (i.e. their profits were not enough even to cover the interest charge on the debt).  That is a recipe for eventual defaults if and when interest rates rise.

What is to be done?  One offered solution is more credit.  At the G20, the IMF officials and others will push not just for an extension of the DSSI, but also for a doubling of the credit firepower of the IMF through Special Drawing Rights (SDRs).  This is a form of international money, like gold in that sense, but instead a fiat currency valued against a basket of major currencies such as the dollar, the euro and the yen, and issued only by the IMF.

The IMF has issued them in past crises and proponents say it should do so now. But the proposal was vetoed by the US last April. "SDRs mean giving unconditional liquidity to developing countries," says Stephanie Blankenburg, head of debt and development finance at Unctad. "If advanced economies can't agree on that, then the whole multilateral system is pretty much bankrupt."

How true that is. But is yet more debt (sorry, 'credit') piled on top of the existing mountain any solution, even in the short term?  Why do the G20 leaders not instead agree to wipe out the debts of the poor countries, and why do they not insist that the private creditors do the same?

Of course, the answer is obvious.  It would mean huge losses globally for bond holders and banks, possibly germinating a financial crisis in the advanced economies. At a time when governments are experiencing massive budget deficits and public debt levels well over 100% of GDP, they would then face a mega bailout of banks and financial institutions as the burden of emerging debt came home to bite.

Recently, the former chief economist of the Bank for International Settlements, William White, was interviewed on what to do.  White is a longstanding member of the Austrian school of economics, which blames crises in capitalism, not on any inherent contradictions within the capitalist mode of production, but on 'excessive' and 'uncontrolled' expansion of credit.  This happens because institutions outside the 'perfect' running of the capitalist money markets interfere with interest rates and money creation, in particular, central banks.

White puts the cause of the impending debt crisis at the door of the central banks.  "They have pursued the wrong policies over the past three decades, which have caused ever-higher debt and ever greater instability in the financial system."  He goes on: "my point is: central banks create the instabilities, then they have to save the system during the crisis, and by that they create even more instabilities. They keep shooting themselves in the foot."

There is some truth in this analysis, as even the Federal Reserve admitted in its latest report on financial stability in the US.  There has been a $7 trillion increase in G7 central bank assets in just eight months, in contrast to the $3 trillion increase in the year following the collapse of Lehman Brothers in 2008. The Fed admitted that the world economy was in trouble before the pandemic and needed more credit injections: "following a long global recovery from the 2008 financial crisis, the outlook for growth and corporate earnings had weakened by early 2020 and become more uncertain."  But while the credit injections meant that a "decline in finance costs reduced debt burdens", they encouraged further debt accumulation which, coupled with declining asset quality and lower credit underwriting standards, "meant that firms became increasingly exposed to the risk of a material economic downturn or an unexpected rise in interest rates. Investors had therefore become more susceptible to sudden shifts in market sentiment and a tightening of financial conditions in response to shocks."

Indeed, central bank injections have kicked the problem can down the road but solved nothing: "The measures taken by central banks were aimed at restoring market functioning, and not at addressing the underlying vulnerabilities that caused markets to amplify the stress. The financial system remains vulnerable to another liquidity strain, as the underlying structures and mechanisms that gave rise to the turmoil are still in place."  So credit has been piled on credit and the only solution is more credit.

White argues for other solutions. He says: "There is no return back to any form of normalcy without dealing with the debt overhang. This is the elephant in the room. If we agree that the policy of the past thirty years has created an ever-growing mountain of debt and ever-rising instabilities in the system, then we need to deal with that."

He offers "four ways to get rid of an overhang of bad debt. One: households, corporations and governments try to save more to repay their debt. But we know that this gets you into the Keynesian Paradox of Thrift, where the economy collapses. So this way leads to disaster."  So don't go for 'austerity'.

The second way: "you can try to grow your way out of a debt overhang, through stronger real economic growth. But we know that a debt overhang impedes real economic growth. Of course, we should try to increase potential growth through structural reforms, but this is unlikely to be the silver bullet that saves us."  White says this second way cannot work if productive investment is too low because the debt burden is too high.

What White leaves out here is the low level of profitability on existing capital that deters capitalists from investing productively with their extra credit.  By 'structural reforms', White means sacking workers and replacing them with technology and destroying what's left of labour rights and conditions. That might work, he says, but he does not think governments will implement it sufficiently.

White goes on: "This leaves the two remaining ways: higher nominal growth—i.e., higher inflation—or try to get rid of the bad debt by restructuring and writing it off."  Higher inflation may well be one option, one that Keynesian/MMT policies would lead to, but in effect it means the debt is paid off in real terms by reducing the living standards of most people and hitting the real value of the loans made by the banks.  The debtors gain at the expense of the creditors and of labour.

White, being a good Austrian, opts for writing off the debts. "That's the one I would strongly advise. Approach the problem, try to identify the bad debts, and restructure them in as orderly a fashion that you can. But we know how extremely difficult it is to get creditors and debtors together to sort this out cooperatively. Our current procedures are completely inadequate." Indeed, apart from the IMF-G20 and the rest not having any 'structures' to do this, these leading institutions do not want to provoke a financial crash and a deeper slump by 'liquidating' the debt, as was proposed by US treasury officials during the Great Depression of the 1930s.

Instead, the G20 will agree to extend the DSSI payment postponement plan, but not write off any debts.  It will probably not even agree to expand the SDR fund.  Instead, it will just hope to muddle through at the expense of the poor countries and their people; and labour globally.


 -- via my feedly newsfeed

No improvement in initial unemployment claims as labor market gains falter [feedly]

No improvement in initial unemployment claims as labor market gains falter
https://www.epi.org/blog/no-improvement-in-initial-unemployment-claims-as-labor-market-gains-falter/

Another 1.1 million people applied for unemployment insurance (UI) benefits last week, including 742,000 people who applied for regular state UI and 320,000 who applied for Pandemic Unemployment Assistance (PUA). The 1.1 million who applied for UI last week was an increase of 55,000 from the prior week's figures. Last week was the 35th straight week total initial claims were greater than the worst week of the Great Recession. (If that comparison is restricted to regular state claims—because we didn't have PUA in the Great Recession—initial claims last week were still 3.3 times where they were a year ago.)

Most states provide 26 weeks of regular benefits, but this crisis has gone on much longer than that. That means many workers are exhausting their regular state UI benefits. In the most recent data, continuing claims for regular state UI dropped by 429,000, from 6.8 million to 6.4 million.

For now, after an individual exhausts regular state benefits, they can move on to Pandemic Emergency Unemployment Compensation (PEUC), which is an additional 13 weeks of regular state UI. However, PEUC is set to expire on December 26 (as is PUA).

In the latest data available for PEUC (the week ending October 31), PEUC rose by 233,000, from 4.1 million to 4.4 million, offsetting only about 60% of the 383,000 decline in continuing claims for regular state benefits for the same week. This is likely due in large part to the fact that many of the roughly 2 million workers who were on UI before the recession began, or who are in states with less than the standard 26 weeks of regular state benefits, are exhausting PEUC benefits at the same time others are taking it up. More than 1.5 million workers have exhausted PEUC so far (see column C43 in form ETA 5159 for PEUC here). Last week, 634,000 workers were on Extended Benefits (EB), which is a program that eligible unemployed workers in some states can get on if they've exhausted PEUC.

Figure A shows continuing claims in all programs over time (the latest data are for October 31). Continuing claims are nearly 19 million above where they were a year ago.

Figure A: Continuing claims in all unemployment programs over time

Senate Republicans allowed the across-the-board $600 increase in weekly UI benefits to expire at the end of July, so last week was the 16th week of unemployment in this pandemic for which recipients did not get the extra $600. And worse, as mentioned above, PUA and PEUC will expire in just over five weeks, on December 26—unless Congress acts. Millions of workers are depending on these programs—DOL reports that a total of 13.1 million workers were on PUA (8.7 million) or PEUC (4.4 million) during the week ending October 31. When these programs expire, millions of these workers and their families will be financially devastated.

An excellent new paper from The Century Foundation finds that 12 million workers will lose PUA or PEUC benefits when they expire on December 26—on top of the 4.4 million who will have exhausted them before then. It also finds that only 2.9 million will then be eligible for Extended Benefits. That means a total of 13.5 million workers (12.0 million + 4.4 million – 2.9 million) will have lost CARES Act unemployment benefits by the end of the year with nothing to fill in the gap.

The House passed a $2.2 trillion relief package that would extend the UI provisions in the CARES Act, but Senate Majority Leader Mitch McConnell has thus far blocked additional COVID relief, knowing full well that millions will see their benefits disappear on December 26 if he doesn't act. The cruelty is beyond belief.

And blocking more COVID relief is not just cruel, it's bad economic policy. UI is great stimulus. The spending made possible by pandemic UI benefits is supporting millions of jobs. Letting these benefits expire means cutting those jobs. There are now 25.7 million workers who are officially unemployed, otherwise out of work because of the virus, or have seen a drop in hours and pay because of the pandemic. And job growth is slowing. Stimulus is desperately needed.

Blocking stimulus is also exacerbating racial inequality. Due to the impact of historic and current systemic racism, Black and Latinx communities have seen more job loss in this recession, and have less wealth to fall back on. The lack of stimulus hits these workers the hardest, which means stimulus is a racial justice issue. Further, workers in this pandemic aren't just losing their jobs—millions of workers and their family members have lost employer-provided health insurance due to the COVID-19 downturn. Senate Republicans are failing struggling families.


 -- via my feedly newsfeed

Racism and the Economy: Focus on Employment [feedly]

Racism and the Economy: Focus on Employment
https://www.epi.org/blog/racism-and-the-economy-fed/

Valerie Wilson, director of the Economic Policy Institute Program on Race, Ethnicity, and the Economy, gave the keynote address at the Federal Reserve Symposium on Racism and the Economy. These are her remarks.

"In my view, these patterns of persistent racial inequality in the labor market are analogous to horrific videos capturing blatant disregard for black lives. Let me be clear. While losing a job comes nowhere close to losing a life, both are symptoms of the kind of racial injustice that sparked national and international protests this past summer. "


According to the Center for Assessment and Policy Development, racial equity is the condition that would be achieved if one's racial identity no longer predicted, in a statistical sense, how one fares.

In reality, statistical analysis often reveals that racial identity is a measurable, significant, and persistent predictor of labor market outcomes.  Let's pause and think about that for a moment.  Why should racial identity—something as arbitrary and superficial as physical appearance (skin color)—be statistically correlated with one's likelihood of being employed or how much they are paid for their labor?

As silly as that sounds, when we consider the origins of race as a social construct, the racial disparities we observe across any number of economic outcomes should come as no surprise.  Since this nation's inception, race has been used to systematically exclude, marginalize, exploit, and generate unequal economic outcomes, while also being used to justify and normalize those unequal outcomes.

I'm going to start with a few graphical examples of what this looks like in the specific context of black-white inequality.  These images and statistics are probably familiar to many of you, so I'm going to run through them quickly in order to highlight a few key points.  Then, I want to spend most of my time talking about the historical context, policy relevance, and contradictory interpretations of racial disparity.  Finally, I'll share what role I think the Federal Reserve and others play in addressing racial disparities in employment and wages.

  • Black workers are far more likely to be unemployed than white workers – typically twice as likely.  Even at the historically low rates of unemployment reached in 2019, this was the case overall and at nearly every level of education.  In practical terms, this means that black workers are not just twice as likely to be unemployed as similarly educated white workers but are often more likely to be unemployed than less-educated white workers.
  • Among the employed, black workers face large pay disparities relative to white workers.  These black-white wage gaps have grown over the last 40 years (1979-2019) and are largely unexplained by factors associated with individual productivity. In 2019, the average hourly wage of black workers was 26.5% less than that of white workers (dark blue line).  Relative to white workers with the same education, of the same age and gender, and living in the same geographic division, the gap was 14.9% (light blue line, regression-adjusted gap). In other words, less than half the observed black-white difference in average hourly wage is explained by the main factors presumed to determine pay. (See the illustrative sketch of this kind of wage regression after this list.)
  • The intersection of race and gender imposes a dual wage penalty on black women. In 2019, black women (dark blue line) were paid 33.7% less than their white male counterparts, which was a much larger gap than that faced by either white women (25.7% – light blue line) or black men (22.2% – red line). This is even after the sharp downward trend in the gender wage gap that occurred throughout the 1980s and essentially stopped in the mid-1990s.
  • There has been insufficient vigilance in fighting unemployment since the late 1970s – the same period of time that we have observed growing wage inequality overall and widening wage gaps between black and white workers.  This graph shows the actual rate of unemployment (dark blue) and estimates of the natural rate of unemployment (also referred to as the non-accelerating inflation rate of unemployment or NAIRU). Between 1979 and 2019, the actual unemployment rate exceeded estimates of the NAIRU by an average of roughly 0.8 percentage points each year.  I'm going to argue that the Fed's monetary policy has been too contractionary in the decades since 1979 and this has had an adverse effect on closing the black-white wage gap. I'll say more about this particular point later.
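
To make the "regression-adjusted gap" cited above concrete: it comes from a wage regression on worker-level data with controls for the characteristics listed. The sketch below is purely illustrative and is not EPI's actual code; the file name and column names (log_wage, black, educ, age, female, division) are hypothetical stand-ins for CPS-style microdata.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical worker-level extract; the file and columns are illustrative stand-ins.
df = pd.read_csv("cps_extract.csv")

# Regress log hourly wage on a black indicator plus the controls named above:
# education, age (and its square), gender, and Census division.
model = smf.ols(
    "log_wage ~ black + C(educ) + age + I(age**2) + female + C(division)",
    data=df,
).fit(cov_type="HC1")

# The coefficient on `black` is the adjusted log-wage gap; converting it to a
# percent gap gives a figure like the 14.9% cited above (vs. ~26.5% unadjusted).
adjusted_gap_pct = 100 * (1 - np.exp(model.params["black"]))
print(round(adjusted_gap_pct, 1))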

By now, we've all heard about the talk Black parents have with their children about how to conduct themselves during interactions with law enforcement.  This talk is based on the painful reality that being black means they are more likely to be judged as dangerous, suspicious or guilty, placing them at greater risk of suffering violence at the hands of police officers.

But there's another talk that black parents have with their children. That talk goes something like this: You're going to have to work twice as hard and be twice as good just to get the same recognition and opportunities available to your white colleagues. This talk is informed by painful first-hand experience with the outcomes shown in the slides.

The persistent racial disparities we observe in unemployment and wages arguably provide the most compelling evidence of structural racism in the labor market.

Unemployment disparities

One of the most defining features of the U.S. labor market is the large and persistent disparity in unemployment that exists between Black and white workers.  This disparity is well-documented in decades of official estimates from the Bureau of Labor Statistics, dating back to 1972 when the agency first began reporting rates of unemployment by race.  More than 45 years' worth of data can be summed up in one simple ratio: 2-to-1.  It means black job seekers are about half as likely to secure employment during a consecutive 4-week period as are white job seekers.  This has been true across multiple periods of economic expansion and recession, and is observed for all age cohorts, at every level of education, and for men and women alike.

Over the last four and a half decades, only the most highly educated and most experienced black workers have approached anything near unemployment rate parity with their white counterparts, and only during periods of exceptionally low unemployment.

These empirical data are consistent with multiple field experiments revealing that black job applicants with equivalent, and sometimes superior, credentials to white applicants are less likely to receive job call backs.

Wage inequality

Another defining feature of racial inequality in the labor market is the large disparity in pay between black and white workers.  I remind you that racial pay gaps persist even after accounting for factors commonly associated with individual productivity (education or skills) and have in fact gotten worse over the last forty years.  One of the most troubling aspects of growing black-white wage gaps is the fact that they have grown most among college-educated workers; those presumed to have done all the "right" things, yet still receive lower returns on their investment in higher education than their white counterparts.

In 1979, black bachelor's degree holders were paid an average hourly wage that was 86.4% of what white bachelor's degree holders were paid.  The ratio between black and white advanced degree holders was 89.9%.  Fast forward to 2019, and the ratio between black and white bachelor's degree holders was down 8.9 percentage points to 77.5%; the ratio for those with advanced degrees was down 7.5 percentage points to 82.4%.

These patterns are consistent with the fact that relative to their white counterparts, college-educated black workers are less likely to be employed in positions and industries that have seen the most wage growth in recent decades.  Only 3 percent of all chief executives are African American, and a disproportionate number of them are employed in the public or private nonprofit sectors, where salaries are lower and more likely to be capped than they are in the private for-profit sector where CEO pay has soared in recent decades.1  Moreover, in 2019, Black workers with a college or advanced degree were more likely than their white counterparts to be underemployed based on their skill level—almost 40% were in a job that typically does not require a college degree, compared with 31% of white college graduates.

Unlike unemployment rates that are characterized by a nearly constant 2-to-1 ratio, there have been 4 distinct periods of change in black-white wage inequality. These periods of change have often been consistent with distinctive shifts in policy.

For example, the narrowing of the gap from the late 1960s through the 1970s can be traced to the passage of important Civil rights legislation, and active enforcement of anti-discrimination and affirmative action policy which also contributed to narrowing of the educational attainment gap between Blacks and whites.

On the other hand, the widening of the gap since the 1980s is associated with retrenchment on anti-discrimination policy, in combination with policies and practices that fueled growing wage inequality, including failure to increase the federal minimum wage as the cost of living rose, a decline in union representation, and weaker macroeconomic conditions.

Since the 1980s, there has been only one very brief period when the gap narrowed, and that occurred during the last five years of the 1990s economic boom.  Notably, the unemployment rate was consistently lower than the natural rate of unemployment for almost 4 years during that time, and without runaway inflation.

Since 2000, the black-white wage gap has widened amid sluggish economic growth, two jobless recoveries, the Great Recession and subsequent recovery that was needlessly hampered by premature fiscal austerity. Fiscal austerity has been especially harmful to black workers who are a disproportionate share of those employed in the public sector where they have historically been drawn by better job opportunities and greater pay equity.

I want to take a moment here to connect black-white wage inequality with the Fed mandate to maximize employment.  Between 1979 and 2019, the ratio of Black to white median wages fell by 8 percentage points (from .83 to .75). Recall that in the last slide I showed, the actual unemployment rate exceeded estimates of the NAIRU by an average of 0.8 percentage points each year during the same time.

In forthcoming work, we use a Phillips wage curve to estimate the relationship between wage growth and the level of unemployment and find that the coefficient for median Black wages is roughly 0.3 percentage points larger (in absolute value) than for median white wages. This indicates that relative wage growth for Black workers would have been roughly 0.25 percent higher each year over that time if, on average, the actual unemployment rate had been equal to the NAIRU instead of above it.  In other words, tighter labor markets (less contractionary monetary policy) could have stopped the deterioration of the Black-white median wage gap. If estimates of the NAIRU are actually too conservative, and unemployment could have averaged 1 to 2 percentage points lower than that over the 1979-2019 period, the Black-white median wage ratio could have actually gone up, extending the progress made in narrowing the gap during the 1960s and 1970s.
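
As a back-of-the-envelope check on that arithmetic, here is the calculation using only the figures quoted above (the roughly 0.3-point coefficient difference, the 0.8-point average unemployment gap, and the decline in the median wage ratio from 0.83 to 0.75); it is illustrative, not a new estimate.

# All inputs are taken from the text above.
coef_gap  = 0.3    # Black-minus-white Phillips-curve coefficient difference (absolute value)
avg_slack = 0.8    # average excess of actual unemployment over the NAIRU, 1979-2019 (pp)
years     = 40     # 1979-2019

# Extra relative wage growth per year had unemployment averaged the NAIRU:
annual_gain_pct = coef_gap * avg_slack    # ~0.24, i.e. "roughly 0.25 percent" per year

# Layering that on top of what actually happened roughly undoes the decline
# in the Black-white median wage ratio from 0.83 to 0.75:
ratio_2019_actual = 0.75
ratio_2019_counterfactual = ratio_2019_actual * (1 + annual_gain_pct / 100) ** years
print(round(annual_gain_pct, 2), round(ratio_2019_counterfactual, 2))   # ~0.24, ~0.83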

We are now on the heels of the longest period of economic expansion in history. Yet the ever-present legacy of racism and economic inequality continues to impose a needlessly heavy burden on the backs of black workers during the COVID-19 crisis.

One of the structures contributing to such racially disparate impacts is occupational segregation, characterized by overrepresentation of black workers, and especially black women workers, in low-wage occupations, and underrepresentation in higher-wage occupations. The COVID-19 crisis has popularized the term essential worker, drawing attention to the fact that black workers occupy a disproportionate share of lower-wage jobs in major frontline industries, often with unpredictable work hours (thus, unpredictable pay) and without paid leave or employer-provided health coverage.

Conclusion

In my view, these patterns of persistent racial inequality in the labor market are analogous to horrific videos capturing blatant disregard for black lives. Let me be clear. While losing a job comes nowhere close to losing a life, both are symptoms of the kind of racial injustice that sparked national and international protests this past summer.

Unfortunately, what should be overwhelming evidence of racial injustice gets filtered through our own experiences, perceptions and biases.  In other words, it is our interpretation of the evidence that determines if and how we respond.

Do we recognize these patterns as racial discrimination, and seek to dismantle the structures that assign a relative advantage to being born white and relative penalty to being born black? Or do we seek some other explanation that points to failure on the part of the individual with unfavorable outcomes?


 -- via my feedly newsfeed

Moderna Vaccine Follows Pfizer with Very Strong Results on Effectiveness: What This Means [feedly]

Dean Baker makes a very important point about state-led versus corporate-led economic investments that produce public well-being not easily captured in GDP valuations.

The vaccine successes of Moderna and Pfizer were in fact state-led and state-funded, essentially no different from China. But in China the public owns the share it put up for the huge investment. Here, the state ends up with nothing, although the funding and development were completely public -- the corporations bore no risk at all, and, while the initial distribution will likely be free (if Biden has anything to say about it), both companies eventually stand to make fortunes having risked nothing.

Moderna Vaccine Follows Pfizer with Very Strong Results on Effectiveness: What This Means

http://feedproxy.google.com/~r/beat_the_press/~3/8fxKJk22itg/


Last week, we got the very good news that preliminary results from its Phase 3 trial showed that Pfizer's vaccine had an effectiveness rate of 92 percent. (We now have final results that put it at 95 percent.) This week, Moderna released results showing that its vaccine also has a 95 percent effectiveness rate. In both cases, the companies reported a relatively small number of instances of adverse effects, which were in almost all cases minor (e.g. headaches or sore arms).

It was great to hear the news about Pfizer last week. It is even better to get the news about Moderna. That is not only because it is good to have a second vaccine available to the United States and the world, and not just because the Moderna vaccine will be easier to distribute, but also because of the mode of funding.[1]

Assuming that the vaccines are equally effective (any difference in effectiveness reported from the Phase 3 trials will be far too small to be statistically significant), the Moderna vaccine will be far easier to distribute and store. To remain stable, the Moderna vaccine must be kept at a temperature just below zero Fahrenheit. By contrast the Pfizer vaccine must be kept at a temperature of almost -100 degrees Fahrenheit. This means that, while the Moderna vaccine can be kept in a normal freezer, the Pfizer vaccine requires special storage facilities. That will create a problem for distribution even in the United States and other rich countries. For developing countries, maintaining this super-cold temperature will be a very serious burden.    

The other important point about the Moderna vaccine is that the U.S. government essentially paid for the full development costs upfront. While both the Moderna and Pfizer vaccines benefitted enormously from a key NIH funded breakthrough, in the case of the Moderna vaccine, we also picked up both its research costs and the cost of its clinical trials. The federal government initially paid Moderna $483 million for its research and Phase 1 and 2 trials. It then coughed up another $472 million for its phase 3 trial. This would cover any expenses that Moderna incurred in the development and testing of the vaccine.

On the one hand, this makes for a strong moral claim that the U.S. government should own any rights to the vaccine. After all, Moderna took no risk. If the vaccine had proved to be a dud, Moderna would not have been out any money, since it had already been paid for its work; the taxpayers would simply have thrown almost $1 billion in the toilet paying for the research. Ideally, we would then be able to buy the vaccine for its cost of production and distribution. We would also make it available to the rest of the world at this price and allow anyone else in the world to produce the vaccine as well, in order to get it distributed as widely as possible as quickly as possible.

But apart from the moral claim, Moderna's success also demonstrates an important economic point. Having the government pay a company to develop a vaccine is not the same thing as throwing money in the toilet.

If that point sounds strange, then you haven't paid attention to our policy on developing drugs. We spend over $40 billion annually on biomedical research through the National Institutes of Health and other government agencies. Most of this is for basic research, but some of it does actually go for developing drugs and clinical trials.

This spending enjoys enormous bipartisan support, including from the pharmaceutical industry. But, the argument against moving downstream and having the government pick up the tab for developing and testing drugs, and bringing them through the FDA approval process, is that we would just be throwing money in the toilet. The idea is that, somehow the government is able to very effectively fund basic research, but our health agencies become incompetent morons when we move to later stage development.

Note, this is not a question of the government actually doing the research. The vast majority of NIH spending is paid out on private contracts, not work done by NIH staff. The argument is that even if the government contracts out for research, the money will just be wasted because the government has touched it. (I describe a system of financing research through long-term contracts in chapter 5 of Rigged [it's free].) 

While it was always hard to treat this argument seriously, it should be even harder now that we have the very visible example of Moderna. The government coughed up a bit less than $1 billion and it looks like less than nine months later, we will have a highly effective vaccine. That certainly would seem to indicate that the government can productively spend money to support the development of vaccines and presumably drugs as well.

Needless to say, the industry will have all sorts of creative arguments as to why we should not think the government can usefully fund the development of drugs, in spite of the Moderna success story. They might tell us that this success was special because it was a crisis, or because vaccines are simple (they aren't), or a thousand other reasons, but the facts on the ground are hard to deny. The government put up the money and quickly got an effective vaccine.

 

Weighing Direct Funding versus Patent Monopolies

If we can get direct funding on the table as a feasible alternative to patent monopoly financing, then we can have a serious debate on the relative merits of the two methods. Many of the advantages of direct funding are obvious. First and foremost, nearly all drugs would be cheap if they were sold in a free market without patent monopolies or related protections. Drugs are rarely expensive to manufacture and deliver, they are expensive because the government has granted a company a monopoly on a drug that is essential for people's lives or health. In many cases the free market price is less than one percent of the patent monopoly price.

Having drugs sold as generics in the free market would mean that paying for drugs would no longer be a major hurdle for most people. We would not see people cutting drugs in half or skipping doses because they couldn't afford refills and wanted to stretch their supply further. We also would not see doctors prescribing inferior drugs because they wanted to save their patients money, or the insurer would not pay for a more expensive drug. Doctors could prescribe the drug they considered best for a particular patient.

We would also take away the incentive for drug companies to lie about the safety and effectiveness of their drugs. This is a common problem, which comes up all the time. The most dramatic recent case is with the opioid crisis, where drug manufacturers have already paid billions of dollars in settlements based on the allegation that they deliberately misled physicians about the addictiveness of the new generation of opioids. If opioids sold at generic prices, there would have been far less incentive to conceal evidence about their addictiveness.

We would also make such deceptions far more difficult with publicly funded research. A condition of getting funding should be that, in addition to all patents being in the public domain, all research findings are fully public and posted on the web as soon as practical. This would make it very difficult to mislead physicians and other researchers about the safety and effectiveness of a drug, since they would all have access to the same information as the company that had developed the drug.

These are some of the advantages of publicly funded research, but we still have the question of how it compares on a per dollar basis to patent monopoly financed research. Now that we know public funding is not the same thing as throwing money in the toilet, we can ask how many public dollars it takes to equal a dollar of patent monopoly supported research.

Given the enormous benefits from having drugs sold in a free market without patent monopolies, we may well be better off with publicly funded research even if we had to pay three or four dollars to replace a dollar of patent monopoly financed research. After all, we would save more than $400 billion annually (twice the size of the Trump tax cut) if drugs were sold in a free market; we would only have to replace around $90 billion in patent financed research.
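
That trade-off is easy to check with the numbers in the paragraph above; the figures are from the text, and the multipliers are the hypothetical "three or four dollars per dollar" replacement costs Baker mentions.

# Figures from the text: ~$400 billion/year saved if drugs were sold at free-market
# prices, ~$90 billion/year of patent-financed research that would need replacing.
annual_savings  = 400e9
patent_research = 90e9

for multiplier in (1, 2, 3, 4):
    net = annual_savings - multiplier * patent_research
    print(f"{multiplier}x replacement cost: net saving ${net / 1e9:.0f} billion per year")
# Even at 4x (i.e. $360 billion of public research spending), the country still
# comes out roughly $40 billion ahead each year.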

However, there is reason for thinking that public funding might be more efficient on a per dollar basis, most importantly because it is open. Open research would allow all researchers to learn quickly from breakthroughs that are done anywhere in the country, and ideally anywhere in the world, if we arranged for worldwide collaboration.   

As I noted before, we had seen much collaboration in the early days of the pandemic, but that largely disappeared as companies raced to develop their own treatments and vaccines. I was under the impression that several of the vaccines developed by Chinese companies, including two old-fashioned dead virus vaccines, were ahead of the U.S.  vaccines in the testing process. It is difficult to know whether this is the case, since we don't have published results from the tests of these companies.

We do know that the United Arab Emirates (UAE) already granted an emergency use authorization for one of these vaccines back in mid-September, based on results from a Phase 3 trial. The Chinese government is also claiming 90 percent effectiveness for one of its vaccines, although it has not publicly released any data.

We would not expect people in the United States to take a vaccine based on the UAE's emergency use authorization or vague claims coming from China's researchers. However, if we had gone a collaborative route, which would have required some agreement with China and other countries, our researchers would have full access, both to China's results and the vaccines themselves.

This means that our researchers could have performed their own tests to determine the safety and effectiveness of the vaccines developed by Chinese companies. If the UAE emergency use authorization in mid-September was based on solid evidence, it is possible that our researchers could have built on these results and obtained sufficient evidence to allow for the vaccine to be used here at a considerably earlier date than we now expect for the Pfizer and Moderna vaccines.

Needless to say, if we could have begun to distribute an effective vaccine two months or even one month sooner, the benefits in terms of lives saved, reduced infections and increased economic activity would have been enormous. Without more information, we can't know at this point, and we may never know, whether the access to the Chinese vaccines would have allowed us to begin to immunize our population sooner, and hopefully the rest of the world as well. But, our pandemic response should have been focused on getting effective vaccines as soon as possible, rather than trying to rack up a victory for the United States in an international vaccine race.

 

Can We Debate Alternatives to Patent Monopoly Financing?

While we can't change the past, we hopefully can learn from it. Moderna shows that direct public funding of the development and testing of a vaccine can be effective. This should open the door on a wider debate about this alternative mechanism for financing pharmaceutical research more generally. We don't have to jump in with both feet and discard the current patent monopoly financing model. We can experiment by designating funds for research and development in specific areas, like cancer or heart disease.

My expectation is, if the experiment were structured right, that we would see important new drugs developed in a reasonable time frame. And these drugs would be available as cheap generics from the day they were approved by the FDA. That should create considerable pressure for expanding financing to more areas.

But that may not prove correct; perhaps, for some unknown reason, companies can't do good work when they are paid only in cash and not with patent monopolies. Even in that case, the spending is not likely to be a complete waste, since the open research should provide at least some insights to other scientists.

Anyhow, patents should not be a holy grail. The cost of patent monopolies to the economy, especially in the case of prescription drugs, has exploded over the last four decades. It should be possible to have a serious discussion about alternative mechanisms. 

[1] Donald Trump has complained that he is not getting credit for the vaccines. The United States, like other wealthy countries, did put up money to develop vaccines and treatment, so he can share in the credit for that reason. However, there is a perverse way in which Trump may deserve credit for an earlier than expected determination of the vaccines' effectiveness.

In a trial, the ability to determine effectiveness depends largely on the number of infections in the control group. Because the pandemic has spread so widely in the last couple of months, the companies saw a larger number of infections in their control group, which allowed them to find large and statistically significant differences with the treatment group sooner than would otherwise have been the case.

If U.S. policy had been more effective in controlling the pandemic, it would have taken longer to determine the vaccines' effectiveness. China has been testing its vaccines in Brazil, Pakistan, and other developing countries because it has been so successful in controlling the pandemic within its borders. Even if a vaccine were 100 percent effective, it would take a very long time to have a control group in China that had enough infections to allow for the vaccine's effectiveness to be determined.  
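
A rough numerical illustration of the footnote's point, that effectiveness is pinned down by how many infections accumulate in the placebo arm: the case counts below are made up for illustration (they are not the actual trial data), and the interval is a crude normal approximation on the log rate ratio, assuming equal-sized trial arms.

import math

def efficacy(cases_vaccine, cases_placebo):
    # Point-estimate vaccine efficacy with 1:1 randomization.
    return 1.0 - cases_vaccine / cases_placebo

def rough_ci(cases_vaccine, cases_placebo, z=1.96):
    # Very rough 95% interval via a normal approximation on the log rate ratio.
    rr = cases_vaccine / cases_placebo
    se = math.sqrt(1.0 / cases_vaccine + 1.0 / cases_placebo)
    return 1.0 - rr * math.exp(z * se), 1.0 - rr * math.exp(-z * se)

# Few infections so far: the same 90% point estimate, but a very wide interval.
print(efficacy(2, 20), rough_ci(2, 20))      # ~0.90, roughly (0.57, 0.98)
# Widespread community infection fills the placebo arm quickly, so the estimate
# is pinned down much sooner -- the perverse sense in which an uncontrolled
# pandemic speeds up a trial read-out.
print(efficacy(8, 160), rough_ci(8, 160))    # ~0.95, roughly (0.90, 0.98)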

The post Moderna Vaccine Follows Pfizer with Very Strong Results on Effectiveness: What This Means appeared first on Center for Economic and Policy Research.

Dean Baker: More Thoughts on the Post-Pandemic Economy

More Thoughts on the Post-Pandemic Economy


Dean Baker

I have written before on the post-pandemic economy and how it should actually provide enormous opportunities, but it is worth clarifying a few points. First, and most importantly, there is a measurement issue with GDP that people will need to appreciate.

It is often said that GDP is not a good measure of well-being; we will see this in a very big way in the post-pandemic period. It is likely that many of the changes in behavior forced by the pandemic, first and foremost telecommuting, will be enduring.

Most immediately, this will show up as a sharp drop in GDP. We will be consuming much less of the goods and services associated with commuting to and from work. This means that we will be driving less. That means we will be buying less gas and needing fewer cars, car parts, and car repair services. We'll also need less auto insurance. In addition, there will be many fewer taxi or Uber trips, as well as trips on busses, trains, and other forms of public transportation.

There is also an economy built up around serving the people working in downtown office buildings. This includes the offices themselves and the people who service and clean them. There are also restaurants, gyms, and other businesses that serve the people who come into the city to work each day. And, there are all the items that people have to spend money on for office work, such as business clothes and shoes and dry-cleaning services.

We will see a huge reduction in demand in all these areas if much of the work now being done online stays online. We will also see less business travel, which means fewer airplane trips, taxi rides, and stays in hotels.

This fall in demand will translate into a large loss of GDP, but very little by way of real loss in well-being. This doesn't mean there will not be some loss. People may miss seeing work colleagues on a daily basis, or the opportunity to meet up with friends for lunch near the office. Some people may actually enjoy business travel. But the drop in GDP will dwarf whatever losses of these sorts people may feel, and in most cases, they will be offset by gains, such as not having to spend two hours a day commuting and having more time to spend with friends and family.

So, let's say that we see GDP drop by 3.0 percent ($660 billion a year). How should we think about this? (This is a very crude guess, not a careful calculation.) On its face, that would look like a very severe downturn. In the Great Recession, GDP only fell by 4.0 percent from peak to trough, so this looks like a very serious hit to the economy.

But that really misses the story. To take an analogous situation, let's say for some reason, such as better diet, more exercise, or an act of God, everyone's health got hugely better. Imagine that we could have the same outcomes in terms of life expectancy and quality of overall health using half as many health care services. This would mean half as many doctors' visits, surgeries, MRIs, prescription drugs, and everything else in health care that costs money.

This reduction in health care consumption would mean a drop in GDP of more than 8.5 percent, yet everyone's health would be just as good as it had been previously. In this story, no one in their right mind would be concerned about the loss of GDP; what we value is health, not the number of times we see a doctor or the amount of drugs we take. The decline in the resources needed to maintain our health is effectively an increase in productivity. We have seen a jump of 8.5 percent in the level of productivity, as we can get the same output as we had previously, with 8.5 percent fewer inputs.[1]  In other words, we are much richer as a result of this remarkable improvement in the public's health.

We should think about my hypothesized savings of 3.0 percent of GDP on work-related expenses the same way. We had been expending a large amount of resources to maintain an office work system that is no longer needed. This is effectively a huge jump in productivity. By comparison, over the last 15 years, productivity growth has averaged just 1.3 percent annually.
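
To put that measurement point in numbers: the sketch below uses only figures from the text (the ~$22 trillion GDP base is implied by "3.0 percent = $660 billion a year") and is just the accounting logic, not a forecast.

gdp = 22e12                   # rough pre-pandemic US GDP, implied by the text
work_related_share = 0.03     # commuting/office-support spending that disappears

lost_gdp = work_related_share * gdp             # ~$660 billion a year of "lost" GDP
# If well-being is unchanged, the same effective output is now produced with
# 3 percent fewer inputs -- a one-time jump in the level of productivity:
productivity_jump = work_related_share / (1 - work_related_share)   # ~3.1%

trend = 0.013                 # average annual productivity growth over the last 15 years
print(round(lost_gdp / 1e9),                    # ~660 ($ billions)
      round(100 * productivity_jump, 1),        # ~3.1 percent
      round(productivity_jump / trend, 1))      # equivalent to ~2.4 years of trend growth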

This matters hugely in how we think about the post-pandemic economy. If we look at the lost GDP associated with fewer work-related expenses, we would think that the economy is really suffering. However, if we think of this as a big jump in productivity, then it effectively means that we have extra resources to address long-neglected social needs.

And, these resources should be readily visible in the form of all the workers who are no longer employed in restaurants, gyms, dry cleaners, or the making, servicing, or driving of cars. These are people who can be instead employed providing child care, senior care, doing energy audits of buildings, installing solar panels and energy-conserving appliances, or other tasks that address neglected needs.

As I have pointed out before, we need not think that every person who lost their job waiting tables will get a job installing solar panels or as a child care provider. That's not the way the labor market works. People in fact switch jobs frequently. In a normal pre-pandemic economy more than 5.5 million people lose or leave their job every month. If we create jobs in installing solar panels, energy audits, and child care, people will leave other jobs to fill these newly created positions, which can leave openings for laid-off restaurant and hotel workers to again get jobs in hotels and restaurants, as well as other sectors.

The fact that we have a large number of idle workers, because of this effective jump in productivity, means that we should not be shy about large amounts of government spending to address these unmet needs, even though it will mean large budget deficits. For the near-term future, we will not have to worry about deficits creating too much demand in the economy and causing inflation. In the longer term, excessive demand and the resulting inflation can be a problem, which will require addressing the factors that redistribute so much money upward (e.g. patent and copyright monopolies, a corrupt corporate governance structure, and a bloated financial sector), but that will not be a problem as we recover from the recession.

If we do let obsessions with government deficits and debt curtail spending, then we can expect to see a long and harsh recession. To set up the analogy, suppose there was a 3.0 percent jump in productivity, but there was no increase in workers' real wages. Assume all the money went to higher corporate profits. Since profits have little relationship to investment, there is no reason to expect any notable increase in investment. Let's assume that consumption spending out of dividends and share buybacks is limited.

In this case, the economy can produce the same output with 3 percent fewer workers, meaning that 4.8 million people will be out of jobs. And, that situation can persist for a long period of time, since there is nothing inherent to the workings of the economy to bring us back to full employment.
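
The 4.8 million figure follows from an implicit employment base of roughly 160 million; that base is my assumption for illustration, since the text states only the result.

employed  = 160e6              # assumed pre-pandemic US employment (not stated in the text)
displaced = 0.03 * employed
print(f"{displaced / 1e6:.1f} million workers")   # ~4.8 million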

That would really be a disaster story, especially if the correct figure for this implicit jump in productivity is something more like 5 percent, or even more. The key to preventing this sort of disaster is to understand that the reduced spending on work-related expenses is effectively an increase in productivity.

And, we also have to recognize that when we have a serious problem of unemployment, the failure to run large deficits is incredibly damaging to the country. Millions of workers will needlessly suffer, as will their families. And the failure is increased when it means not spending in areas that will have long-term benefits for the country, like child care and slowing global warming. It is tragic that deficit hawks are able to do so much harm to our children under the guise of saving our children.

[1] For those being technical, I am not using "productivity" precisely here. A reduction equal to 8.5 percent of GDP in the value of goods or services devoted to health care, does not necessarily mean that the amount of labor used in the health care sector has fallen by an amount equal to 8.5 percent of the economy's annual labor usage. But I'm ignoring this point for now.  

--

Dean Baker: Contra - The Krugman Boom: Don’t Bank on It [feedly]

The Krugman Boom: Don't Bank on It
http://feedproxy.google.com/~r/beat_the_press/~3/nulXARWmDsQ/

I have largely been in agreement with Paul Krugman in his assessment of the economy over the last dozen years or so, but I think in his latest column he let the promise of a post-Trump era get the better of him. Krugman notes that the distribution of effective vaccines should allow people to return to their normal lives.

He argues that this will lead to a spending boom, as consumers have accumulated savings through the slump and will now be in a position to spend lots of money. As a model he points to the boom in 1983 and 1984 after the Fed lowered interest rates.

While I have not been one of the doomsayers predicting economic collapse, I can't be as optimistic as Paul on this one. First, just to be clear, Krugman does not at all question the need for immediate and substantial stimulus. In the next few months, with the pandemic spreading largely unchecked until vaccines become widely available, millions of people will be thrown out of work as restaurants, bars and other businesses in the service sector are either forced to close or see demand collapse even if they remain open.

These people will need unemployment benefits, protection from eviction, and other support until the labor market improves. State and local governments will also need massive aid to avoid a further round of layoffs and to provide essential services, including setting guidelines and rules for safe re-openings.

But getting beyond this period, once the vaccines have allowed us to return to normal, will there be a spending spree? Krugman is right about the state of people's balance sheets. The people who have stayed employed have been doing well. The government gave them $1,200 checks; many have refinanced mortgages, often saving a percentage point or more in annual interest. That's $2,000 a year for someone with a $200,000 mortgage. And, many have been saving money as a result of not commuting to work, eating out at restaurants, or spending on other services.

Still, I don't see this as a 1983-84 type spending boom for the simple reason that the sectors that drove that boom, housing and car buying, have not been depressed. The boom in those years followed large contractions in home buying and construction, as well as car buying, which were the result of the Fed's high interest rate policy.

At the trough of the recession in the third quarter of 1982, residential construction was down more than 40 percent from its peak in 1979. New car sales were down by more than 15 percent. The recovery was driven by the reversal of these drops. By the second quarter of 1984, residential construction was up almost 70 percent from its level of two years earlier. New car sales were almost 50 percent higher than they were in 1981. This was the basis of the 1980s boom.

We can't tell any comparable story today. Both housing and car sales have done very well through this downturn. Residential construction in the most recent quarter was actually more than 5 percent higher than in the fourth quarter of 2019. Car sales were almost 8.0 percent higher. Clearly there is no basis for expecting a boom based on pent-up demand in these sectors.

The question is whether we should expect a huge boom in other consumption spending. That doesn't seem likely to me. We will see people return to restaurants, but do we think they will be eating meals out every night? People will go to gyms and movies again, but this will not create the sort of boom we had in 1983 and 1984. If you look at the various categories of consumption spending, it's very hard to see anything like the booms in housing and car buying we had after the 1981-82 recession.

So what's my story? I see many of the changes forced by the pandemic as being permanent. First and foremost, many of the people who were forced to work from home will continue to work from home. Some may move away from the cities where they are working. (We are already seeing this.) We are also likely to see less business travel as meetings and conferences are held over Zoom. We'll see more telemedicine and, hitting close to home, fewer in-person classes in colleges and universities.

In many ways this is great news. People will save hundreds of hours a year on commuting. We will also see large reductions in greenhouse gas emissions by eliminating unnecessary travel. (This will be recorded as a drop in GDP, which is yet another case where GDP is not a good measure of well-being. In effect, we are eliminating work-related expenses that provide little welfare to anyone. I discuss this issue more here.)

The downside in this story is that many of the jobs that were dependent on supporting this commuting economy will not be coming back. We are talking about millions of lost jobs in restaurants, gyms, and other businesses that serve the people who come into the city to work each day. These people will not easily find new jobs.

It is also likely to be the case that the finances of cities like New York, Boston, and others seeing major reductions in their commuting population will be badly strained. This means that they will lack the resources to help displaced workers get new jobs in areas where they are needed, like health care, child care, and clean energy.

The really bad part of this story is that the displaced workers will be overwhelmingly women, Blacks, Hispanics and others who are disadvantaged in the labor market. These groups had been doing relatively well as the labor market tightened between 2014 and 2020. Those gains evaporated with the pandemic, and these groups may see the pain endure long into the future.

So, this is a story of worsening inequality. Large sectors of the population will be doing just fine, but those at the bottom will be getting kicked in the face. I expect a Biden administration will try to help this displaced workforce, but the Republicans in Congress will be celebrating the pain in "Democrat" cities.

I can't say how this will shape elections in 2022 and 2024. The people who are doing well may be happy and vote for Biden and the Democrats. But if the economy shapes up like I fear, this will not be a story worth celebrating.        

The post The Krugman Boom: Don't Bank on It appeared first on Center for Economic and Policy Research.


 -- via my feedly newsfeed

pk: Making the Most of the Coming Biden Boom [feedly]

Note the article above by Dean Baker questioning the Biden "boom"

Making the Most of the Coming Biden Boom

Paul Krugman
https://www.nytimes.com/2020/11/19/opinion/joe-biden-economy.html

text only 

The next few months are going to be incredibly grim. The pandemic is exploding, but Donald Trump is tweeting while America burns. His officials, unwilling to admit that he lost the election, are refusing even to share coronavirus data with the Biden team.

As a result, many preventable deaths will occur before a vaccine's widespread distribution. And the economy will take a hit, too; travel is declining, an early indicator of a slowdown in job growth and possibly even a return to job losses as virus fears cause consumers to hunker down again.

But a vaccine is coming. Nobody is sure which of the promising candidates will prevail, or when they'll be widely available. But it's a good guess that we'll get this pandemic under control at some point next year.

And it's also a good bet that when we do the economy will come roaring back.

OK, this is not the consensus view. Most economic forecasters appear to be quite pessimistic; they expect a long, sluggish recovery that will take years to bring us back to anything resembling full employment. They worry a lot about long-term "scarring" from unemployment and closed businesses. And they could be right.

But my sense is that many analysts have overlearned the lessons from the 2008 financial crisis, which was indeed followed by years of depressed employment, defying the predictions of economists who expected the kind of "V-shaped" recovery the economy experienced after earlier deep slumps. For what it's worth, I was among those who dissented back then, arguing that this was a different kind of recession, and that recovery would take a long time.

And here's the thing: The same logic that predicted sluggish recovery from the last big slump points to a much faster recovery this time around — again, once the pandemic is under control.

What held recovery back after 2008? Most obviously, the bursting of the housing bubble left households with high levels of debt and greatly weakened balance sheets that took years to recover.

This time, however, households entered the pandemic slump with much lower debt. Net worth took a brief hit but quickly recovered. And there's probably a lot of pent-up demand: Americans who remained employed did a huge amount of saving in quarantine, accumulating a lot of liquid assets.

All of this suggests to me that spending will surge once the pandemic subsides and people feel safe to go out and about, just as spending surged in 1982 when the Federal Reserve slashed interest rates. And this in turn suggests that Joe Biden will eventually preside over a soaring, "morning in America"-type recovery.

Which brings me to the politics. How should Biden play the good economic news if and when it comes?

First of all, he should celebrate it. I don't expect Biden to engage in Trump-like boasting; he's not that kind of guy, and his economics team will be composed of people who care about their professional reputations, not the quacks and hacks who populate the current administration. But he can highlight the good news, and point out how it refutes claims that progressive policies somehow prevent prosperity.

Also, Biden and his surrogates shouldn't hesitate to call out Republicans, both in Washington and in state governments, when they try to sabotage the economy — which, of course, they will. I won't even be surprised if we see G.O.P. efforts to impede the wide distribution of a vaccine.

What, do you think there are some lines a party refusing to cooperate with the incoming administration — and, in fact, still trying to steal the election — won't cross?

Finally, while Biden should make the most of good economic news, he should try to build on success, not rest on his laurels. Short-term booms are no guarantee of longer-term prosperity. Despite the rapid recovery of 1982-1984, the typical American worker earned less, adjusted for inflation, at the end of Reagan's presidency in 1989 than in 1979.

And while I'm optimistic about the immediate outlook for a post-vaccine economy, we'll still need to invest on a large scale to rebuild our crumbling infrastructure, improve the condition of America's families (especially children) and, above all, head off catastrophic climate change.

So even if I'm right about the prospects for a Biden boom, the political benefits of that boom shouldn't be cause for complacency; they should be harnessed in the service of fixing America for the long run.

And the fact that Biden may be able to do that is reason for hope.

Those of us worried about the future were relieved to see Trump defeated (even though it's possible he'll have to be removed forcibly from the White House), but bitterly disappointed by the failure of the expected blue wave to materialize down-ballot.

If I'm right, however, the peculiar nature of the coronavirus slump may give Democrats another big political opportunity. There's a pretty good chance that they'll be able to run in the 2022 midterms as the party that brought the nation and the economy back from the depths of Covid despond. And they should seize that opportunity, not just for their own sake, but for the sake of the nation and the world.


 -- via my feedly newsfeed