Friday, March 2, 2018

Enlighten Radio: The Moose Turd Cafe: Case, Case and the Sacred Harp

Blog: Enlighten Radio
Post: The Moose Turd Cafe: Case, Case and the Sacred Harp
Link: http://www.enlightenradio.org/2018/03/the-moose-turd-cafe-case-case-and.html


Thursday, March 1, 2018

Enlighten Radio: The Moose Turd Cafe -- Will the Ton O Dung Done Do It?

Blog: Enlighten Radio
Post: The Moose Turd Cafe -- Will the Ton O Dung Done Do It?
Link: http://www.enlightenradio.org/2018/03/the-moose-turd-cafe-will-ton-o-dung.html


Computational social science [feedly]

Computational social science
http://understandingsociety.blogspot.com/2018/02/computational-social-science.html

Is it possible to elucidate complex social outcomes using computational tools? Can we overcome some of the issues for social explanation posed by the fact of heterogeneous actors and changing social environments by making use of increasingly powerful computational tools for modeling the social world? Ken Kollman, John Miller, and Scott Page make the affirmative case in their 2003 volume, Computational Models in Political Economy. The book focuses on computational approaches to political economy and social choice. Their introduction provides an excellent overview of the methodological and philosophical issues that arise in computational social science.
The subject of this book, political economy, naturally lends itself to a computational methodology. Much of political economy concerns institutions that aggregate the behavior of multiple actors, such as voters, politicians, organizations, consumers, and firms. Even when the interactions within and rules of a political or economic institution are relatively simple, the aggregate patterns that emerge can be difficult to predict and understand, particularly when there is no equilibrium. It is even more difficult to understand overlapping and interdependent institutions.... Computational methods hold the promise of enabling scholars to integrate aspects of both political and economic institutions without compromising fundamental features of either. (kl 27)
The most interesting of the approaches that they describe is the method of agent-based models. They summarize the approach in these terms:
The models typically have four characteristics, or methodological primitives: agents are diverse, agents interact with each other in a decentralized manner, agents are boundedly rational and adaptive, and the resulting patterns of outcomes often do not settle into equilibria.... The purpose of using computer programs in this second role is to study the aggregate patterns that emerge from the "bottom up." (kl 51)
Here is how the editors summarize the strengths of computational approaches to social science.
First, computational models are flexible in their ability to encode a wide range of behaviors and institutions. Any set of assumptions about agent behavior or institutional constraints that can be encoded can be analyzed. 
Second, as stated, computational models are rigorous in that conclusions follow from computer code that forces researchers to be explicit about assumptions. 
Third, while most mathematical models include assumptions so that an equilibrium exists, a system of interacting political actors need not settle into an equilibrium point. It can also cycle, or it can traverse an unpredictable path of outcomes. 
The great strength of computational models is their ability to uncover dynamic patterns. (kl 116)
And they offer a set of adequacy criteria for agent-based models. The model should explain the results; the researcher should check robustness; the model should build upon the past; the researcher should justify the use of the computer; and the researcher should question assumptions (kl 131).
To summarize, models should be evaluated based on their ability to give insight and understanding into old and new phenomena in the simplest way possible. Good, simple models, such as the Prisoner's Dilemma or Nash bargaining, with their ability to frame and shed light on important questions, outlast any particular tool or technique. (kl 139)
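The four "methodological primitives" quoted above are easy to see in a toy model. The sketch below is my own illustration in Python, not code from the volume; the coordination game, payoffs, inertia parameter, and adaptation rule are all illustrative assumptions chosen to exhibit diverse agents, decentralized interaction, bounded adaptive rationality, and an aggregate pattern that is tracked over time rather than assumed to reach equilibrium.

import random

# Toy agent-based model (illustrative only, not the authors' code).

class Agent:
    def __init__(self, rng):
        self.strategy = rng.choice("AB")        # diversity: heterogeneous initial behavior
        self.inertia = rng.uniform(0.0, 0.5)    # diversity: heterogeneous willingness to switch
        self.payoff = 0.0

def play(a, b):
    # Decentralized pairwise interaction: coordinating on A pays more than on B,
    # miscoordination pays nothing.
    if a.strategy == b.strategy:
        a.payoff = b.payoff = 1.0 if a.strategy == "A" else 0.6
    else:
        a.payoff = b.payoff = 0.0

def run(n_agents=200, steps=200, seed=0):
    rng = random.Random(seed)
    agents = [Agent(rng) for _ in range(n_agents)]
    share_a = []
    for _ in range(steps):
        # Everyone plays one randomly matched partner this period.
        rng.shuffle(agents)
        for a, b in zip(agents[::2], agents[1::2]):
            play(a, b)
        # Bounded rationality: imitate a randomly observed agent who did better,
        # subject to inertia, with occasional experimentation.
        for agent in agents:
            observed = rng.choice(agents)
            if observed.payoff > agent.payoff and rng.random() > agent.inertia:
                agent.strategy = observed.strategy
            if rng.random() < 0.01:
                agent.strategy = rng.choice("AB")
        share_a.append(sum(ag.strategy == "A" for ag in agents) / n_agents)
    return share_a  # the aggregate pattern emerges "from the bottom up"

if __name__ == "__main__":
    path = run()
    print("share playing A, last 5 periods:", [round(x, 2) for x in path[-5:]])

Nothing here depends on an equilibrium existing: the researcher simply runs the process and inspects the path of aggregate outcomes.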
A good illustration of a computational approach to problems of political economy is the editors' own contribution to the volume, "Political institutions and sorting in a Tiebout model". A Tiebout configuration is a construct within public choice theory where citizens are permitted to choose among jurisdictions providing different bundles of goods.
In a Tiebout model, local jurisdictions compete for citizens by offering bundles of public goods. Citizens then sort themselves among jurisdictions according to their preferences. Charles M. Tiebout's (1956) original hypothesis challenged Paul Samuelson's (1954) conjecture that public goods could not be allocated efficiently. The Tiebout hypothesis has since been extended to include additional propositions. (kl 2012)
Using an agent-based model, they compare different jurisdiction-level political institutions through which policy choices are made, and they find unexpected population-level outcomes that derive from differences in the institutions embodied at the jurisdiction level.
Our model departs from previous approaches in several important respects. First, with a few exceptions, our primary interest in comparing the performance of political institutions has been largely neglected in the Tiebout literature. A typical Tiebout model takes the political institution, usually majority rule, as constant. Here we vary institutions and measure performance, an approach more consistent with the literature on mechanism design. Second, aside from an example used to demonstrate the annealing phenomenon, we do not explicitly compare equilibria. (kl 2210)
And they find significant differences in collective behavior in different institutional settings.
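To give a rough sense of what such an experiment involves, here is a toy Tiebout sorting model of my own in Python (not Kollman, Miller, and Page's actual model): citizens with heterogeneous ideal levels of a local public good repeatedly sort across jurisdictions, and each jurisdiction aggregates its residents' preferences either by majority rule (proxied by the median resident's ideal point) or by a hypothetical alternative rule (the mean). The population size, number of jurisdictions, and welfare measure are all illustrative assumptions.

import random
import statistics

# Toy Tiebout sorting model (illustrative sketch, not the chapter's model).

def run(institution, n_citizens=300, n_jurisdictions=5, rounds=30, seed=0):
    rng = random.Random(seed)
    ideals = [rng.uniform(0, 1) for _ in range(n_citizens)]               # heterogeneous preferences
    location = [rng.randrange(n_jurisdictions) for _ in range(n_citizens)]
    policy = [0.5] * n_jurisdictions

    for _ in range(rounds):
        # Each jurisdiction aggregates its residents' preferences with its institution.
        for j in range(n_jurisdictions):
            residents = [ideals[i] for i in range(n_citizens) if location[i] == j]
            if residents:
                policy[j] = (statistics.median(residents) if institution == "majority"
                             else statistics.mean(residents))
        # Citizens sort: each moves to the jurisdiction whose policy best matches its ideal.
        for i in range(n_citizens):
            location[i] = min(range(n_jurisdictions), key=lambda j: abs(policy[j] - ideals[i]))

    # Aggregate outcome: average gap between each citizen's ideal and the local policy.
    return sum(abs(policy[location[i]] - ideals[i]) for i in range(n_citizens)) / n_citizens

if __name__ == "__main__":
    for rule in ("majority", "mean"):
        print(rule, "rule: average preference gap", round(run(rule), 4))

Even this stripped-down version illustrates the research design: hold the citizens fixed, vary the jurisdiction-level institution, and compare the population-level pattern that the sorting process produces.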

ABM methodology is well suited to the kind of research problem the authors have posed here. The computational method permits intuitive illustration of the ways that individual preferences in specific settings aggregate to distinctive collective behaviors at the group level. But the approach is not so suitable to the analysis of social behavior that involves a higher degree of hierarchical coordination of individual behavior -- for example, in an army, a religious institution, or a business firm. Furthermore, the advantage of abstractness in ABM formulations is also a disadvantage, in that it leads researchers to ignore some of the complexity and nuance of local circumstances of action that lead to significant differences in outcome.

 -- via my feedly newsfeed

Free speech after Mill [feedly]

Free speech after Mill
http://stumblingandmumbling.typepad.com/stumbling_and_mumbling/2018/02/free-speech-after-mill.html

We need to reconsider the case for free speech, because the classical liberal case for it is questionable.

Robert Sharp recently echoed John Stuart Mill when he said:

Our own opinions, beliefs and values should always be tested, lest they become yet another kind of dogma... With each iteration, we refine our arguments and can persuade new people to our point of view.

This is true in many cases: it's why we should tolerate, and even welcome, Economists for Free Trade. But in the case Robert discusses – Richard Littlejohn's homophobia – it's not so clear. The benefits to us from decent people like Robert having a more refined opinion of the errors of homophobia are quite possibly outweighed by the threat that gay people feel from seeing homophobia normalized in the mass media.

There's something else. Mill thought that free speech would lead to intellectual progress:

[Mankind's] errors are corrigible. He is capable of rectifying his mistakes by discussion and experience. Not by experience alone. There must be discussion, to show how experience is to be interpreted. Wrong opinions and practices gradually yield to fact and argument: but facts and arguments, to produce any effect on the mind, must be brought before it.

This, though, is deeply doubtful. Facts are dismissed as "fake news" and evidence is interpreted with asymmetric Bayesianism; we are credulous about evidence that supports our prior views but sceptical of that which disconfirms them. Fact and argument lead not to the rectification of error but to stronger polarization (a toy sketch below makes the mechanism concrete). As Lord, Ross and Lepper concluded (pdf) back in 1979:

Social scientists can not expect rationality, enlightenment, and consensus about policy to emerge from their attempts to furnish "objective" data about burning social issues.
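To see how the asymmetry described above can turn shared evidence into polarization, here is a toy calculation of my own (not from the post or the 1979 paper): two observers see the same perfectly mixed stream of evidence, but each discounts observations that cut against its current leaning. A symmetric Bayesian ends exactly where it started; the asymmetric updaters drift to opposite extremes. The likelihood ratio and the discount factor are arbitrary illustrative choices.

import math

# Toy model of "asymmetric Bayesianism": discounting disconfirming evidence
# makes identical evidence push different priors further apart.

def update(logodds, supports_h, discount):
    # Each observation carries a log-likelihood ratio of +/- log(3).
    llr = math.log(3.0) if supports_h else -math.log(3.0)
    # Asymmetry: shrink the weight of evidence against the current leaning.
    if (logodds > 0) != supports_h:
        llr *= discount
    return logodds + llr

def final_belief(prior, discount, evidence):
    logodds = math.log(prior / (1.0 - prior))
    for supports_h in evidence:
        logodds = update(logodds, supports_h, discount)
    return 1.0 / (1.0 + math.exp(-logodds))   # back from log-odds to a probability

if __name__ == "__main__":
    evidence = [True, False] * 100   # perfectly balanced: equal evidence for and against H
    for prior in (0.6, 0.4):
        fair = final_belief(prior, 1.0, evidence)     # symmetric updating
        biased = final_belief(prior, 0.2, evidence)   # heavy discounting of disconfirmation
        print(f"prior {prior}: symmetric -> {fair:.2f}, asymmetric -> {biased:.2f}")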

Our supposedly impartial media is no help here. It consistently failed to call out lies during the EU referendum, and it continues to give insufficiently critical coverage to the peddlers of those falsehoods. The Brexit debate has been top of the news agenda for two years, and yet no intelligent person on either side considers it well-informed. Oliver Norgrove, a Brexiter, surely speaks for many on both sides when he decries "elected officials barely able to understand the basics", "intellectual vacuums" in both main parties and "ignorant headline-grabbers".

The "impartial" media does, however does something effectively. Cathy Newman's now-notorious uncomprehending interview with Jordan Peterson showed that it is incapable of coping with voices that don't conform to a narrow range of views. As Ryan Bourne says:

Certain viewpoints from certain spokespeople are just not heard on the media.

He's speaking from a rightist viewpoint, but I suspect his point generalizes. Many valuable perspectives – Oakeshottian conservatives, left libertarians, free market egalitarians and so on – are largely excluded from public discourse. One reason why Corbyn got such bad coverage, especially early in his leadership, was, I suspect, not so much that the BBC was actively hostile to him, but that it was so in thrall to a narrow Westminster-centric conception of politics that it just could not comprehend something different.

I say this with regret, but I fear Mill is no longer relevant to the debate about free speech. He was writing in an age before the mass media, when discussion was to a large extent among equals – albeit posh white men. He could not have anticipated the truth of Afua Hirsch's point:

The marketplace of ideas, like many other markets, has monopolies, rackets and biases. Long-established "suppliers" of opinions with entrenched positions in "the sector" enjoy huge advantages. 

"Free speech" has become one more means through which the mega-rich exercise power to, as Simon says, seriously distort democracy.

Now, I'm not sure this is a case for repressing speech: laws will be used against the powerless rather than the powerful. What I'm saying, rather, is that we need to move beyond romantic twaddle about actually-existing free speech being a road to truth. We need to ask instead: what would a rational, pluralistic, inclusive Habermasian political discourse look like?

I agree with Paul that we need

a public sphere in which, as far as possible, all citizens have a right to express, and have heard, their views through formal democratic structures and civil society mechanisms.

This of course requires radical changes. But what sort? To what extent are they compatible with currently-existing capitalism?

Underneath those questions, however, I have a nasty one. What if the many irrationalities that distort debate now are due not just to capitalist ideology and power structures, but to innate flaws in human reasoning? What then?



 -- via my feedly newsfeed

Krugman: Bonuses and Bogosity [feedly]

Bonuses and Bogosity


https://www.nytimes.com/2018/02/27/opinion/bonuses-and-bogosity.html

On Feb. 19 Walmart, America's largest employer, announced broad-based pay hikes. In fact, it announced that it would be raising wages for half a million workers. The Trump tax cut is working!

Oh, wait. That announcement was three years ago – it came on Feb. 19, 2015. And as far as I can tell, nobody gave the Obama administration credit for the move.

Why did Walmart raise wages? Partly because the labor market was tightening: unemployment had fallen more than 40 percent from its peak in 2009 (a decline for which the Obama administration's stimulus and other policies actually do deserve some credit). This tightening labor market meant that Walmart was having trouble attracting workers.

Walmart was also in the midst of a change in business strategy. Its policy of always low wages, always, was starting to look problematic: low pay meant high turnover and low morale, and management believed that moving at least partway to the Costco strategy of paying more and getting better performance as a result made sense.

The moral of this story is that unless the economy is deeply depressed, there will always be some companies, somewhere, raising wages and offering bonuses. Which brings me to the Trump tax cut and the very bad, no good job much of the media initially did of covering its effects.

Every card-carrying economist agrees that cutting corporate tax rates will have some trickle-down effect on wages – in the long run. How much of a trickle-down effect depends on a bunch of technical factors: what share of corporate profits represents monopoly rents rather than returns to capital, how responsive inflows of foreign capital are to the U.S. rate of return, what share of the capital stock is even affected by the corporate tax rate. Enthusiasts claim that the tax cut will eventually go 100% to workers; most serious modelers think the number is more like 20 or 25 percent.

Whatever the number, however, it's about the long run. It requires a chain of events: lower taxes -> higher investment -> higher stock of capital -> higher demand for labor -> higher wages. And this chain of events should take a number of years, probably decades, to fully work itself through. Even in the most favorable analyses, there is no reason to expect any wage gains in the first few months after a tax cut.
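
As a back-of-the-envelope illustration of why that chain is slow (my own sketch, not Krugman's model), the following Python snippet uses a textbook Cobb-Douglas production function with purely illustrative parameters: the tax cut raises the target capital stock, investment closes only a small fraction of the remaining gap each year, and the wage rises only as the capital stock actually builds up.

# Back-of-the-envelope sketch of "lower taxes -> higher investment -> higher
# capital -> higher wages". All parameters are illustrative assumptions.

ALPHA = 0.3     # capital share of income in y = k**ALPHA (output per worker)
SPEED = 0.08    # fraction of the remaining capital-stock gap closed each year

def wage(k):
    # Labor is paid its marginal product: w = (1 - ALPHA) * k**ALPHA.
    return (1 - ALPHA) * k ** ALPHA

def wage_path(tax_old=0.35, tax_new=0.21, report_years=(1, 5, 10, 30)):
    k = 1.0  # normalize the pre-cut capital stock
    # If the required after-tax return is pinned down by world capital markets,
    # the target capital stock scales with ((1 - t_new) / (1 - t_old)) ** (1 / (1 - ALPHA)).
    k_target = ((1 - tax_new) / (1 - tax_old)) ** (1 / (1 - ALPHA))
    w0 = wage(k)
    for year in range(1, max(report_years) + 1):
        k += SPEED * (k_target - k)          # gradual investment response
        if year in report_years:
            yield year, 100 * (wage(k) / w0 - 1)

if __name__ == "__main__":
    for year, gain in wage_path():
        print(f"after {year:2d} years: wage up about {gain:.1f}%")

With these assumed numbers the wage response is small in the first year and takes decades to approach its long-run level, which is the sense in which even the optimistic story is a long-run story.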

Yet for a while there the news was full of stories about companies announcing that they were giving their workers bonuses thanks to the Trump tax cut. Why were they doing this?

Well, the tax cut provided no immediate incentive to raise wages. It did, however, give companies that benefited from the tax cut a strong political incentive to claim that the tax break was the reason for bonuses or wage hikes they would have given in any case. Why not? They want the tax cut to look good, since it's good for them. And hyping its benefits is a cheap way to make a de facto campaign contribution to an administration that can do a lot to help companies it likes, say by removing pesky safety or environmental regulations.

In other words, the whole "bonuses thanks to tax cut" story was bogus, and obviously so. Yet much of the media fell for it.

There was also a big element of innumeracy involved. Most people have no idea just how big the U.S. economy is; if you say "company X just hired 1000 workers" or "company Y just gave 5000 workers a $1000 bonus" they imagine that these are big stories, when in reality they're just noise in an economy where around 5 million workers are hired – and an almost equal number quit or are fired – every month.

The numbers we have so far show that the much-hyped bonuses are trivial – less than $6 billion, or 0.03% of GDP – while stock buybacks have been more than $170 billion. And many of those bonuses would probably have happened anyway, whereas stock buybacks are running far above historical levels.
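
A quick check of those magnitudes (the roughly $20 trillion GDP denominator is my approximation, not a figure from the column):

# Rough magnitude check; the GDP figure is an approximate assumption for 2018.
gdp = 20e12        # about $20 trillion
bonuses = 6e9      # "less than $6 billion" in announced bonuses
buybacks = 170e9   # "more than $170 billion" in stock buybacks

print(f"bonuses as a share of GDP: {100 * bonuses / gdp:.2f}%")             # roughly 0.03%
print(f"buyback dollars per bonus dollar: about {buybacks / bonuses:.0f}")  # roughly 28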

Furthermore, the surge in stock buybacks suggests that the long run effect of the tax cut on wages will be small. Remember that the chain that should lead to trickle-down begins with lower taxes -> higher investment. If companies use the cuts to buy back stocks, not add to plant and equipment, the wage-growth story doesn't even get started.

So the real news about the tax cut is that it is – I know you'll be shocked – mainly a giveaway to corporations. Who could have predicted?




 -- via my feedly newsfeed

Books to Reread: John Maynard Keynes: Essays in Persuasion [feedly]

Books to Reread: John Maynard Keynes: Essays in Persuasion
http://www.bradford-delong.com/2018/02/books-to-reread-john-maynard-keynes-essays-in-persuasion.html

Books to Reread: John Maynard Keynes (1931): Essays in Persuasion: "Here are collected the croakings of twelve years—the croakings of a Cassandra who could never influence the course of events in time...

...The volume might have been entitled "Essays in Prophecy and Persuasion," for the Prophecy, unfortunately, has been more successful than the Persuasion. But it was in a spirit of persuasion that most of these essays were written, in an attempt to influence opinion. They were regarded at the time, many of them, as extreme and reckless utterances. But I think that the reader, looking through them to-day, will admit that this was because they often ran directly counter to the overwhelming weight of contemporary sentiment and opinion, and not because of their character in themselves.

On the contrary, I feel—reading them again, though I am a prejudiced witness—that they contain more understatement than overstatement, as judged by after-events. That this should be their tendency, is a natural consequence of the circumstances in which they were written. For I wrote many of these essays painfully conscious that a cloud of witnesses would rise up against me and very few in my support, and that I must, therefore, be at great pains to say nothing which I could not substantiate. I was constantly on my guard—as I well remember, looking back—to be as moderate as my convictions and the argument would permit.

All this applies to the first three of the five books into which these essays naturally group themselves, rather than to the last two; that is to say, to the three great controversies of the past decade, into which I plunged myself without reserve,—the Treaty of Peace and the War Debts, the Policy of Deflation, and the Return to the Gold Standard, of which the last two, and indeed in some respects all three, were closely interconnected. In these essays the author was in a hurry, desperately anxious to convince his audience in time.

But in the last two books time's chariots make a less disturbing noise. The author is looking into the more distant future, and is ruminating matters which need a slow course of evolution to determine them. He is more free to be leisurely and philosophical.

And here emerges more clearly what is in truth his central thesis throughout,—the profound conviction that the Economic Problem, as one may call it for short, the problem of want and poverty and the economic struggle between classes and nations, is nothing but a frightful muddle, a transitory and an unnecessary muddle. For the Western World already has the resources and the technique, if we could create the organisation to use them, capable of reducing the Economic Problem, which now absorbs our moral and material energies, to a position of secondary importance.

Thus the author of these essays, for all his croakings, still hopes and believes that the day is not far off when the Economic Problem will take the back seat where it belongs, and that the arena of the heart and head will be occupied, or re-occupied, by our real problems—the problems of life and of human relations, of creation and behaviour and religion. And it happens that there is a subtle reason drawn from economic analysis why, in this case, faith may work. For if we consistently act on the optimistic hypothesis, this hypothesis will tend to be realised; whilst by acting on the pessimistic hypothesis we can keep ourselves for ever in the pit of want...



 -- via my feedly newsfeed

China Is Raising Up to $31.5 Billion to Fuel Chip Vision [feedly]

China Is Raising Up to $31.5 Billion to Fuel Chip Vision
https://www.bloomberg.com/news/articles/2018-03-01/china-is-said-raising-up-to-31-5-billion-to-fuel-chip-vision

China's government aims to raise as much as 200 billion yuan ($31.5 billion) to invest in homegrown chip companies and accelerate its ambition of building a world-class semiconductor industry, people familiar with the matter said.

The state-backed China Integrated Circuit Industry Investment Fund Co. is in talks with government agencies and corporations to raise at least 150 billion yuan for its second fund vehicle but is angling for up to 200 billion yuan, the people said, asking not to be identified talking about a plan that hasn't been publicized. It intends to begin deploying capital in the second half of the year, they added.

The firm will again invest in a wide range of sectors from processor design and manufacturing to chip testing and packaging, potentially benefiting industry leaders from telecoms gear makers Huawei Technologies Co. and ZTE Corp. to major players such as the Tsinghua Group. The first fund -- about 140 billion yuan -- had gone toward more than 20 listed companies, including ZTE and contract chipmaker Semiconductor Manufacturing International Corp., the people said.

Smaller chip players gained in the afternoon. Integrated circuit manufacturer Jiangsu Changjiang Electronics Technology Co. climbed as much as 6.2 percent in afternoon trading in Shanghai, while chip packager China Wafer Level CSP Co. gained almost 5 percent.

China's trying to reduce a reliance on some $200 billion of annual semiconductor imports, which it fears undermines national security and hampers the development of a thriving technology sector. The country envisions spending about $150 billion over 10 years to achieve a leading position in design and manufacturing, an ambitious plan that U.S. executives and officials have warned could harm American interests.

While officials have suggested their initial vision of attaining pole position in chips may have been unrealistic, the government remains intent on finding ways to reduce imports as the world's largest consumer of semiconductors. Established in 2014, the secretive China IC Fund plays a key role by steering overall investment and strategy. For its second fund, the state-backed outfit will again turn to central and local government agencies as well as the government-backed enterprises that contributed previously, the people said. 

The envisioned ramp-up in domestic investment comes as foreign scrutiny of Chinese-led acquisitions intensifies. The Trump administration blocked a $1.3 billion deal for U.S. chipmaker Lattice Semiconductor Corp. from China-backed Canyon Bridge last year, citing security concerns. The Ministry of Industry and Information Technology didn't respond to a faxed request for comment.

The Tsinghua group, an affiliate of the prestigious Beijing university that spawned Chinese leaders including President Xi Jinping, also plays a crucial role. It groups under its umbrella the largest of the country's chip producers including Tsinghua Unigroup Co., which has outlined a $22 billion plan -- partly from the China IC Fund -- to pursue acquisitions and build capacity.

"We want to be a balancer against the international giants, which means delivering self-made technologies and growing the market share of made-in-China products," Qi Lian, Unigroup joint president and chief executive of its storage arm, told reporters Thursday. "To achieve this goal, we are building up our own R&D and also keeping an eye on global technology collaboration."

— With assistance by Yuan Gao, and Steven Yang



 -- via my feedly newsfeed