Friday, April 30, 2010

"Drill, Baby, Drill", continued

Yesterday I posted an item about the massive and still-growing oil spill in the Gulf of Mexico, noting that this incident should remind us of one possible drawback to "Drill, Baby, Drill". That post generated a lot of e-mailed responses. Here are two.

First, this sardonic comment from my friend Perry Deess:
This spill was orchestrated by the Democrats as a prelude to introducing energy legislation, just as the Goldman Sachs investigation was a sham to move forward with the socialist regulation of the finance industry.
That analysis sounds right to me ... but, of course, what Sarah Palin and her friends call the "lamestream media" have largely ignored this obvious connection. I expect that Fox News (with help from assorted Republican pundits and bloggers) will break through the cover-up.

Another correspondent approached the matter from a slightly different direction:
Ms. Palin would have been well advised to be more discreet. Instead of proclaiming this exuberantly from the dais at the Republican National Convention, it should have remained a matter between her and her spouse.
Well, that brings up the larger question of who's really getting screwed here.

=> On a somewhat more serious note, several correspondents raised the intriguing question of how the Republicans, who have gone out of their way to brand themselves the party of "Drill, Baby, Drill", are going to manage the public-relations challenge posed by this accident--which appears likely to turn into a significant environmental and economic disaster. (The Obama administration's recent decision to move pre-emptively toward an accommodation with the Republicans on this issue--don't some people ever learn?--also looks a bit less well-advised in retrospect, tactically as well as substantively.)

In a more sensible world, people who have been vociferously demanding that we start drilling everywhere without restraint while dismantling existing safeguards--as opposed to, say, finding ways to use oil more efficiently and developing viable alternative sources of energy--should be feeling a bit embarrassed and discredited right now. In the real world, the more likely result is that the policy agendas won't change much, but the propaganda will have to be adjusted. New slogans and talking-points are no doubt being crafted behind the scenes, and I suspect we'll be hearing the phrase "Drill, Baby, Drill" a little less frequently.

On the other hand, I may be wrong about that last point, since by now it is well established that Sarah Palin's sloganeering is largely unaffected by empirical reality, or by the kinds of embarrassments that affect most politicians. Even when she gets caught telling straightforward, easily checkable lies on matters large and small--the kind that even the normal run of political journalists are capable of noticing and reporting--that doesn't prevent her from continuing to repeat them as if nothing had happened, and her fans don't seem to get upset. So we may get a chance to see how fully that applies to the rest of the Republican Party, too.

=> Meanwhile, the oil-spill disaster continues to unfold:
As the vast and growing oil slick spread across the Gulf and approached shore, fishermen in coastal towns feared for their businesses and the White House stepped up its response to the worsening situation.

President Obama ordered a freeze on new offshore drilling leases until a review of the oil rig accident that caused the spill could be concluded, and new safeguards put in place. [....]

For now, residents and workers in towns like Venice, La., are feeling the effects of the oil slick most directly. All fishing and shrimping, the area’s economic mainstay, was halted in Venice on Friday morning as the oil drew near.

The port’s skiffs were all docked. A local seafood company, Sharkco, was selling its last 50 pounds of shrimp, and had already sold out of oysters and fish. [....]

Well aware of the damage done to President Bush by his administration’s slow response to Hurricane Katrina in 2005, the White House has been moving aggressively in the last two days to respond to the oil leak, with Mr. Obama addressing the issue publicly twice in two days. [....]

On Friday morning, the Air Force sent two C-130 planes to Mississippi, where they awaited orders to start spraying chemicals on the spill, The Associated Press reported.

Resources from the United States Navy have been marshaled to supplement an operation that already consisted of more than 1,000 people and scores of vessels and aircraft.

And Attorney General Eric Holder on Friday announced that he was dispatching lawyers from the Justice Department to New Orleans to monitor the oil spill in the Gulf of Mexico from the perspective of environmental regulation.

Calling the disaster “a spill of national significance” that could threaten coastline in several states, Homeland Security Secretary Janet Napolitano announced the creation of a second command post in Mobile, Ala., in addition to the one in Louisiana, to manage potential coastal impact in Alabama, Mississippi and Florida. Interior Secretary Ken Salazar ordered an immediate review of the 30 offshore drilling rigs and 47 production platforms operating in the deepwater Gulf, and is sending teams to conduct on-site inspections.

Gov. Bobby Jindal of Louisiana declared a state of emergency and mobilized the Louisiana National Guard to participate in response efforts.

On Friday, he also requested federal assistance for state fishermen, asking the Secretary of Commerce to declare a commercial fisheries failure. [....]

“This spill isn’t going to be fixed in a day, probably even in a year,” said Chuc Nguyen, 35, who immigrated from Vietnam as a child and has fished his entire life. “What else can I do? I don’t know how to read and write. If you tell me to do something other than fishing, I don’t even know what it would be.”
Hoping for the best,
Jeff Weintraub

Thursday, April 29, 2010

One possible drawback to "Drill, Baby, Drill"

From today's Foreign Policy Morning Brief:

----------------------------------------
Gulf of Mexico oil spill five times larger than previously thought

Top story: U.S. Coast Guard officials say the amount of oil seeping from a sunken rig in the Gulf of Mexico has increased to as much as 5,000 barrels a day, five times more than was originally thought. The 100-mile wide oil slick caused by the leak is now only 16 miles off the coast of Louisiana.

A third leak has also been discovered in the pipeline connecting the sunken rig to the oil well. The first two were discovered a few days after the explosion on the Deepwater Horizon on April 20. The Coast Guard also attempted a controlled burn of part of the oil slick on Wednesday, an operation they say was successful.

Louisiana governor Bobby Jindal has requested emergency help from the federal government as the oil slick nears the coast. If large amounts of oil reach the shore, it could be devastating to the state's coastal wetlands as well as its fishing industry.

The chief operating officer of oil giant BP, which owns the leaking well, says the company would welcome the help of the military in containing the spill.
----------------------------------------

For an interactive map showing the spread of the oil slick, click HERE.

=> UPDATE HERE.

--Jeff Weintraub

How seriously should we take Wall Street whining about the dangers of regulation? (Dan Gross & Adam Smith)

Dan Gross surveys the record (A brief history of alarmist—and wrong—Wall Street predictions about the effect of new regulations) and draws the right conclusion:
My general rule of thumb is that we should generally ignore what Wall Street has to say about financial regulation. Investment banks lack the common sense to know what's good for them. The financial sector opposed all the regulations that were good for it in the 1930s—i.e. the advent of the Securities and Exchange Commission and the creation of the Federal Deposit Insurance Corp. And the regulatory changes it requested and received in the past decade—eroding Glass-Steagall, getting the SEC to permit investment banks to increase their use of leverage—set the stage for the debacle of 2008.

For the past several decades, Wall Street has continually told Washington that if the Street can't do things the way it always has, and if the government changes the rules to mandate greater transparency and customer protection, that the geniuses in Lower Manhattan won't be able to make money, and it would stunt the industry. They've been wrong every time.
For further details, see below.

Of course, it's also true that just because the financial industry and its propagandists oppose a measure, that doesn't necessarily mean it's a good idea. (Even the Wall Street Journal editorial page is sometimes right, or at least not totally off-base.) But it's wise to be skeptical.

Perhaps it's not unfair to quote something that Adam Smith once said (in The Wealth of Nations, Book I, Ch. 11) about the proper attitude to take when capitalists—not just financiers, but capitalists in general—offer advice about laws and regulations that might affect them:
The proposal of any new law or regulation of commerce which comes from this order, ought always to be listened to with great precaution, and ought never to be adopted till after having been long and carefully examined, not only with the most scrupulous, but with the most suspicious attention. It comes from an order of men, whose interest is never exactly the same with that of the publick, who have generally an interest to deceive and even to oppress the publick, and who accordingly have, upon many occasions, both deceived and oppressed it.
Amen. But Smith's warning here assumes that capitalists do have enough "acuteness of understanding" to grasp what's in their own economic interest. As Dan Gross correctly emphasizes, we shouldn't always take that for granted, either.

—Jeff Weintraub

==============================
Slate
Tuesday, April 27, 2010
The Stock Market Who Cried Wolf
A brief history of alarmist—and wrong—Wall Street predictions about the effect of new regulations

By Daniel Gross

Daniel Gross is the Moneybox columnist for Slate and the business columnist for Newsweek. You can e-mail him at moneybox@slate.com and follow him on Twitter. His latest book, Dumb Money: How Our Greatest Financial Minds Bankrupted the Nation, has just been published in paperback.

Last week, the Senate agriculture committee, led by Blanche Lambert Lincoln, sent to the floor a bill that would significantly alter derivatives trading. Should it become law—here are the highlights—the bill would require regulated banks with derivatives-trading units to spin them off. It would also require that derivatives, many of which are traded over-the-counter (i.e., not on an exchange), be traded through a central clearinghouse, with pricing and volume data made available to the public.

Predictably, the industry is opposed to the mandates for greater transparency. As Reuters reported, "Exchange trading has nothing to do with reducing credit risk," said Conrad Voldstad, chief executive officer of the International Swaps and Derivatives Association. "In fact, mandating that all swaps be exchange-traded will increase costs and risks for the manufacturers, technology firms, retailers, energy producers, utilities, service companies and others who use over-the-counter derivatives."

My general rule of thumb is that we should generally ignore what Wall Street has to say about financial regulation. Investment banks lack the common sense to know what's good for them. The financial sector opposed all the regulations that were good for it in the 1930s—i.e. the advent of the Securities and Exchange Commission and the creation of the Federal Deposit Insurance Corp. And the regulatory changes it requested and received in the past decade—eroding Glass-Steagall, getting the SEC to permit investment banks to increase their use of leverage—set the stage for the debacle of 2008.

For the past several decades, Wall Street has continually told Washington that if the Street can't do things the way it always has, and if the government changes the rules to mandate greater transparency and customer protection, that the geniuses in Lower Manhattan won't be able to make money, and it would stunt the industry. They've been wrong every time.

Through the 1970s, NYSE rules required that member firms charge the same fee to execute trades. There wasn't much competition, and there weren't any discounters. On May 1, 1975, over the howls of Wall Street firms, the SEC did away with fixed commissions. (Read more about the process here and here.) As the SEC chairman said at the time: "For the first time in almost 200 years, the rates of commission that brokers charge to public customers … will not be determined by exchange rules. Market forces will operate to set these prices and there may be variances from firm to firm." And, of course, that's what happened. Yes, the incumbent firms found their profits from executing trades were pinched. But the new discounters that formed brought in millions of new investors—who eagerly snapped up the mutual funds and stock offerings of the big Wall Street firms.

In the 1990s, SEC Chairman Arthur Levitt took aim at the archaic method of pricing stocks. Since the 19th century, stocks were priced—and moved—in increments of 1/8. If you were willing to buy IBM at 58 5/8, the market maker (one of those guys on the floor in the funny jackets) would buy it for 58 3/8, give it to you at the higher price, and pocket the difference. By the late 1990s, most stock markets had gone digital, and decimalization—stocks priced and moving in increments of a penny—was becoming the standard. "Currently, the United States securities markets are the only major markets not to price stocks in decimals," Levitt said. "And the overall benefits of decimal pricing are likely to be significant. Investors may benefit from lower transaction costs due to narrower spreads." Over the wails of Wall Street firms, the SEC ordered exchanges to switch to decimalization in 2000 and 2001. If the stock was trading at 50.375, you could offer to pay 50.425. The result: The profits of the market-makers were obliterated, but customers got better treatment. As this 2003 paper suggests: "Quoted spreads decreased substantially after decimalization, on both markets, and for stocks in all market capitalization groups."
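
[An illustrative aside, not from Gross's article: below is a minimal Python sketch of the arithmetic behind tick sizes, assuming a market maker quotes bid and ask exactly one minimum increment apart; the order size is hypothetical.]

    # Minimal sketch (editorial illustration, not from the article):
    # the narrowest possible quoted spread under 1/8-point pricing vs. penny
    # pricing, and what crossing that spread costs on a hypothetical order.

    def min_spread_cost(shares, tick):
        # Assumes bid and ask are quoted one tick apart, so a round trip
        # through the market maker costs the customer one tick per share.
        return shares * tick

    shares = 1_000                # hypothetical order size
    eighth_tick = 1 / 8           # pre-decimalization minimum increment: $0.125
    penny_tick = 0.01             # post-decimalization minimum increment

    print(f"Eighths: ${min_spread_cost(shares, eighth_tick):,.2f}")   # $125.00
    print(f"Pennies: ${min_spread_cost(shares, penny_tick):,.2f}")    # $10.00

[On that hypothetical 1,000-share order, shrinking the minimum tick from 12.5 cents to a penny cuts the narrowest possible spread cost from $125 to $10--the arithmetic behind both the market makers' lost profits and the narrower quoted spreads documented after decimalization.]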

Next, the SEC turned its attention to corporate bond trading—a market much less liquid and transparent than the stock market. With many trades conducted over-the-counter, it was difficult for investors to see the best prices and to see what prices other investors had been paying for identical or similar bonds. In 1998, the SEC began to instruct the National Association of Securities Dealers to set up TRACE, a system through which corporate bond trades would be reported in real time. (Here's some history on the process.) Again, bond dealers were less than thrilled. But since TRACE went into action, costs for investors have come down. A paper that studied trading before and after TRACE found "a reduction of approximately 50 percent in trade execution costs for bonds eligible for TRACE transaction reporting." And the market grew. The TRACE fact book (see Pages 27 and 28) shows the volume and value of bonds traded on the system have increased substantially.

Now the same dynamic is playing out with derivatives. Under the current system—as was the case with stocks and corporate bonds—the large investment banks serve as market makers and keep pricing information close to the vest. They extract fat spreads for doing customers the service of introducing buyers to sellers. But in a range of markets—stocks, antiques, baseball tickets—it has become clear that electronic selling platforms in which buyers and sellers can meet on their own terms are more efficient. That would probably be the case with derivatives as well.

Trading is always a zero-sum game. And in this arena, a gain for customers would be a loss to the Wall Street intermediaries. JPMorgan Chase CEO Jamie Dimon acknowledged as much when he told analysts that if derivatives trading went through clearinghouses it could cost his bank from "$700 million to a couple billion dollars."

Rather than let buyers and sellers meet in exchanges, rather than let investors be able to see the full array of trading in real time, the Wall Street firms effectively want things to work the way they did in the last century, when you had to pick up a phone and call somebody to get a price and execute a trade. The opposition to moving derivative trades to a clearinghouse isn't about protecting customers. It's about protecting the entrenched positions and profits of large banks.

Tuesday, April 27, 2010

How did finance swallow the economy? (George Packer)

Trying to diagnose and reform the pathologies of the financial system that brought on the financial crisis of 2008 and, by doing so, helped trigger the larger economic crash from which we're still just beginning to recover ... is a complicated business, which will require wrestling with a whole range of difficult, complex, interconnected, and in some ways dauntingly intractable issues. It's clear that the beginning of wisdom is to recognize that this is not a simple or mono-causal problem with a single magic-bullet solution.

On the other hand, it's also important not to lose the forest for the trees. As George Packer correctly pointed out last Thursday in his New Yorker blog, the most fundamental questions, which in practice are all too easily ignored or evaded even by serious proponents of financial reform, are "the ones that Paul Krugman, Simon Johnson, and others have been addressing over the past year and a half:"
[D]o we need a financial sector whose share of gross domestic product has doubled over the past several decades? Is it healthy for financial services and investment to dominate our economy as they do, and to consume the talents and advantages of astounding percentages of our élite graduates?
(My emphasis.) For some time now, John Rentoul has been running a nice series of "Questions to Which the Answer is No". It's clear that these questions belong there.

And they get to the heart of the matter. The financial system we have now is not only dysfunctional in many of the ways it operates, too open to frauds and swindles, too prone to instability and crisis, and so on. It's also just too big. It has swallowed up too much of the economy, too much talent, and (Packer could have added) too much political influence.

(For some further elaboration of what people like Paul Krugman, Simon Johnson, and Paul Volcker have been saying on this subject, see Time to restore some sanity to the financial system.)

And here are some more questions to follow those up:
Are long-term growth and shared prosperity ever to be found in an economy that depends so heavily on electronic transactions rather than production? Is social cohesion in a democracy possible when the gap in incomes between investment bankers and doctors, let alone teachers, let alone fast-food workers, is as enormously wide as it is today?
Again, no.

Effective reform of the financial system is essential (which doesn't necessarily mean it will happen). But, by itself, that won't address an even deeper and more fundamental problem--the enormously overgrown role that the financial sector has come to play in the overall economy and, by extension, in society and politics more generally. That role has metastasized over the past three decades or so, and figuring out good ways to shrink it back to reasonable proportions (without generating unpleasant side-effects in the process) is the most difficult challenge.

Of course, as Packer notes:
Investment bankers like to say that what they do makes the rest of the economy work.
And, up to a point, this is certainly true. But only up to a point. Aside from the fact that what they do can also sometimes help blow up the rest of the economy--which ought to be a sobering consideration--we need to qualify this claim in another respect, too. Some of the things that banks and investment firms do help to make the 'real' economy function effectively, and sometimes that happens through indirect mechanisms that may not be immediately obvious. (That's part of the logic of a market economy.) But it's by no means clear that everything they do benefits the larger society. For example,
the synthetic products that helped create Goldman’s record-breaking profits, drove the financial system close to collapse, cost millions of Americans their jobs and houses, and led to a civil suit against the firm have no economically redeeming aspects whatsoever. They have as little to do with productive activity as high-stakes blackjack.
Now, blackjack might be OK as long as you're gambling with your own money. But as Louis Brandeis pointed out almost a century ago, one of the central and inescapable features of the financial system is precisely that it involves some people gambling with other people's money. So all of us have an interest in making sure that this particular casino doesn't run wild--and that it doesn't dominate the rest of the economy.

(You can read the rest of George Packer's post here. Again, for some elaboration on the substantive issues, see Time to restore some sanity to the financial system. And I really do recommend reading that book by Louis Brandeis, published in 1914 but still surprisingly timely: Other People's Money--And How the Bankers Use It.)

--Jeff Weintraub

Tuesday, April 20, 2010

The last legitimate bastion of "separate but equal"? (Andy Markovits)

Among his many other interests and enthusiasms, my friend Andy Markovits, a distinguished student of European politics, culture, and political economy, also happens to be a passionate sports fan (unlike me), and a significant scholar of the subject as well. His preoccupation with the question of why the US, almost uniquely, has failed to get swept up in the world soccer culture motivated him to write the definitive book on the comparative historical sociology of mass-frenzy competitive team sports, Offside: Soccer and American Exceptionalism (with Steven Hellerman; Princeton University Press, 2001).

More recently, he and Lars Rensmann have followed that up with an updated and in some ways even more sweeping analysis of the ongoing globalization and transformation of sports and sports cultures, Gaming the World: How Sports Are Reshaping Global Politics and Culture (due to be released by Princeton in June 2010). These are both fascinating and important books even if you don't happen to be crazy about soccer, cricket, baseball, basketball, and/or American football yourself ... believe it or not.

All that is background to a provocative piece that Andy just did for the Huffington Post, which takes up some issues dealt with at greater length in Gaming the World. He's guest-posting it here as well. I'm not entirely sure how I feel about the substantive issues that Andy raises here, but readers can ponder those for themselves.

--Jeff Weintraub

================================
The Last Legitimate Bastion of "Separate but Equal"
Guest-posted by Andrei Markovits
[Also posted HERE]

Eri Yoshida, an 18-year-old female knuckleball pitcher from Japan, will commence playing minor league ball for the Chico Outlaws this spring. She will thus be the first female to pitch in professional ball since Ila Borders retired more than 10 years ago. But will this be more than a mere gimmick with few, if any, social consequences? Or might it be a harbinger for substantial change in extant gender relations at the top levels of sport?

One of the fundamentally democratic and progressive legacies of the 1960s has been an unmistakable tendency in all advanced industrial democracies to include the hitherto excluded, to empower the formerly disempowered. Barack Obama is as much testimony to this remarkable societal and cultural transformation as is the fact that nearly 50 percent of law and medical students in the United States are female and that the presidents of such fine universities as Harvard, Princeton, Penn, Brown and Michigan are women. And the struggle is far from over, since there are still massive areas in all these democratic societies where the formerly disempowered still constitute little more than tokens. Be it among the tenured professoriate, particularly in subjects belonging to the STEM fields of science, technology, engineering, and mathematics, or among CEOs and CFOs of large and powerful companies, women continue to be underrepresented. But the thrust of the struggle remains crystal clear: full inclusion on equal terms.

And yet, there is one domain in which the modus operandi and ultimate aim have been "separate but equal" from the very beginning: sports, particularly the dominant team sports that are not only performed on a popular basis but also avidly followed. Short of certain religions (an arena in which, too, the struggle for equality has had some remarkable successes), one would be hard put to point to any institution of such importance in our society in which such "sexual apartheid" (to use Paul Hoch's apt terminology though I prefer "gender apartheid") is not only tolerated but actively enforced, perhaps even feted as progress.

To be sure, Title IX's empowering legacy and major contribution to the inclusive and thus democratizing process hailing from the late 1960s are nothing short of transformative, indeed revolutionary. Just think of the national prominence of the University of Connecticut's women's basketball team or that of the United States women's national team in soccer, to mention just two of many relevant examples. And yet, Title IX and its empowering legacy merely aspired to a situation of "separate but equal" in the culturally crucial world of sports.

Why have few, if any, feminists - at least to my knowledge - ever demanded that the quarterback position of the Green Bay Packers, the point guard of the Los Angeles Lakers, and one of the closers for the New York Yankees be occupied by a woman, the way they have successfully asked that university presidents, doctors, lawyers, mathematicians, chess players, even presidents of the United States, be women? Or why have there not been any movements afoot to change the rules to have every football team consist of six men and five women (or vice versa), in effect making them into mixed-gender teams like those in the Dutch game of "korfball", a kind of basketball played by three men and two women on the same team in which, however, only men can guard men and only women can guard women, thus in essence perpetuating the gender apartheid within this game itself?

I am, of course, talking only about sports at the top level, not in amateur leagues, in which we have indeed observed a large degree of integration since the late 1960s and early 1970s. Just think of the thorough gender integration of intramural sports teams on many college campuses. But why do we make such a discriminatory exception for the highest echelons of sports--i.e. the world of the physical--that we would never tolerate in the world of the mental or intellectual or political? The equivalent in education would be for us to foster gender-integrated elementary and secondary schools, but then only allow men to enter and compete in the top universities, with women relegated to lesser institutions even though the value of their effort in terms of degrees or championships attained would be nominally equal; or, to offer an analogy from the world of politics, women would be permitted to run for state and local offices, though not for national ones.

Does the logic of citius, altius, fortius--swifter, higher, stronger--by definition demand the sexual apartheid that is currently practiced, and perceived as legitimate, at the very top level of sports, since the most accomplished men will always run faster, jump higher, and be stronger than the most accomplished women? If we continue to define "the best", which is such an integral part of any sport, by our current criteria, then this separate-but-equal world will never change. But if we construct alternate logics of what constitutes "the best"--include metrics of cooperation and style, for example, in computing winners and losers, or create truly gender-integrated teams in which the women's output would be weighted more heavily (e.g. assign five points to baskets scored by female players as opposed to the two scored by males), thereby creating real incentives to welcome the women as positive additions to these teams, as has been the case in the aforementioned intramural contests--then we might actually arrive at a truly integrated sports world, one congruent with virtually all the important public institutions of our contemporary democratic world.

Andrei S. Markovits teaches the sociology of sports, among other subjects, at the University of Michigan. He is co-author, with Lars Rensmann, of Gaming the World: How Sports Are Reshaping Global Politics and Culture, published by Princeton University Press in June 2010, in which the issue raised in this article--as well as others concerning the sports worlds of Europe and North America--is discussed in detail.

Friday, April 09, 2010

Comparing the economic records of dictatorship and democracy in Chile (Samuel Valenzuela)

As a follow-up to my post on Military dictatorship and free-market economics in Chile - Sorting out propaganda from reality ...

... my friend Samuel Valenzuela, a Chilean comparative/historical political sociologist based at Notre Dame University, e-mailed me the following response, which I pass on with his permission. --Jeff Weintraub

--------------------------------------------------
Dear Jeff,

Chilean economic growth during the Pinochet years was only 1% per year on average. The big spurt in growth with macro-economic stability has occurred in the years after the transition to democracy in 1990. The economy since then has almost tripled in size, and Chile now has the highest per capita income in Latin America. The democracy dividend came in the form of a huge increase in foreign direct investment, which over these years has totaled about 120 billion dollars. Foreign investors did not really believe in the stability of the military regime.

(The growth curve began to edge up in 1985, but until 1988 it was all just recovery of past levels. Moreover, in 1988 Pinochet inflated the economy with fiscal spending in a vain attempt to win the plebiscite, resulting in a projected 35% inflation rate in 1990--which the Concertación government stopped in its tracks by deliberately inducing a slowdown of growth to 2% for the year 1991.)

In the years of Concertación government, Chilean welfare institutions were perfected. There is effective access to health care for everyone; a universal pension system, with minimum pensions for everyone over 65 whether or not they contributed into a pension fund; access to heavily subsidized housing (all of which was sturdily built and survived the earthquake without a scratch); a new unemployment insurance system; and financial assistance for tertiary education (which now has a coverage of about 35% of all 18- to 24-year-olds). All these measures were taken while maintaining fiscal balance (as an average over several years).

There are few governments anywhere that can match this record.

As to the earthquake, yes indeed: building codes are very strict in Chile. But the essential point is that they are enforced because the country has very low levels of corruption, and builders are held accountable legally for any failures. We were there for the earthquake, and near the epicenter of it. The highways need a lot of repairs, and the coastal towns subjected to the tsunami were very hard hit. Many older structures (often where no one was living) were also destroyed. But in general the cities look just about the same as they did before.

Best,
Samuel

Thursday, April 08, 2010

Military dictatorship and free-market economics in Chile – Sorting out propaganda from reality (Paul Krugman)

It should be too obvious to need emphasizing, but apparently it isn't, that although the capitalist market economy and democracy are certainly compatible, and may even be mutually supportive in various ways, they are not the same thing. Indeed, along with their (potential) compatibility, both theoretical analysis and actual history show that there is also an inherent and inescapable tension between them.

Among other reasons: Democratic self-government entails making conscious decisions about collective outcomes, and many of the purposes for which empowered democratic citizens use the political process--ranging from essentially 'nice' purposes like environmental protection, ensuring the safety of food and drugs and drinking water, building codes, unemployment insurance, minimum-wage laws, Social Security, bank deposit insurance, and the like to various forms of more narrowly selfish special-interest measures including unfair and/or dysfunctional protectionism, overt and disguised corporate welfare, rigidifying regulation, and so on--necessarily involve interfering with the pure logic of the self-regulating market. Ditto for unions, of course, which might not be seen as "political" institutions according to some formal definitions but which are a necessary element in any genuinely active and viable democratic political society.

In fact, not only has the capitalist market economy often coexisted in practice with authoritarian or dictatorial political regimes, but policies of radical marketization often require the use of despotic political power, in order to break popular resistance to such policies and to prevent social groups (and some "special interests" that might even include industries lobbying for subsidies and protectionism) from mobilizing to protect themselves against their effects. A classic example, of course, is the Pinochet dictatorship that ruled Chile from 1973-1990, in which the terroristic power of a military dictatorship was used precisely to push through policies of radical marketization (though this marketization was not quite as complete as some of its foreign admirers believe, since the Pinochet regime never considered giving up state control of the crucial mining industry).

One might legitimately agree or disagree about whether, and in what ways, these economic policies were good or bad for Chile in the long run. But what's indisputable is that they were pushed through by a dictatorial regime--which, at the very least, nicely illustrates the analytical distinction between political freedom, in the form of democracy & "human rights," and "free market" economics.

What's also indisputable is the close collaboration between the Pinochet dictatorship and a number of American economic advisers espousing a Chicago-school free-market ideology of the sort that often styles itself "libertarian"--a theoretical and ideological orientation epitomized and celebrated in, for example, Milton Friedman's Capitalism and Freedom. (In the past few decades, this ideological orientation has also been called "neo-liberalism"--a term that sometimes confuses Americans, since in the US a commitment to the self-regulating market is, somewhat peculiarly, called "conservative" economics. But it actually does represent a resurgence of what, in a broader historical perspective, should properly be called economic liberalism. Friedman himself was aware of this, of course.)

If one happens to believe that democracy has anything to do with liberty, then this experience should be enough to make it clear, once again, that there is no necessary or direct connection between free-market-fundamentalist economics and genuine libertarianism. Not only is it mistaken, misleading, and potentially pernicious to simply confuse the two (as Tocqueville, for example, cogently explained almost two centuries ago). Sometimes one even has to choose between them.

=> But let's forget about political liberty for the moment, and focus purely on the Pinochet/Chicago economic policies and their long-term consequences. There, again, the actual story is more complicated than a lot of retrospective right-wing mythology suggests. In a recent piece, Paul Krugman nicely sorted out some of the ways in which this is true.

Yours for reality-based discourse,
Jeff Weintraub

UPDATE: For a follow-up, see Comparing the economic records of dictatorship and democracy in Chile.

==============================
Paul Krugman (The Conscience of a Liberal)
March 3, 2010
Fantasies of the Chicago Boys



Ah, Chile. Remember how, during the Social Security debate, Chile’s retirement system was held up as an ideal — except it turned out that it actually yielded very poor results for many people, and the Chileans themselves hated it? Now we have the usual suspects claiming that Chile’s relatively low death toll in the quake proves that — you guessed it — Milton Friedman was right. You see, the Chicago Boys made Chile rich, and that’s what did it.

As a number of people have pointed out, there’s this little matter of building codes. Friedman wasn’t exactly fond of such codes — see this interview in which he calls such codes a form of government spending, because they “impose costs that you might not privately want to engage in”.
[JW: This is an analytical point worth stressing, even though it should once again be obvious, because its implications seem to get lost in a lot of discussions. The tight building codes in Chile played a key role in reducing the amount of death and destruction from the recent mega-earthquake. But what Friedman understood quite well, although some current Friedmanite propagandists might want to obscure it, is that the whole point of building codes is that they are a form of public-interest regulation that necessarily works by interfering with the pure logic of the market and thus modifying--we might also say "distorting"--the outcomes that pure market processes might otherwise produce.]
But there’s another point: the economics of Chile under Pinochet are a lot more ambiguous than legend has it. The way the story is told now, the free-market guys moved in, liberalized, and then there was a boom.

Actually, as you can see from the chart above, what happened was this: Chile had a huge economic crisis in the early 70s, which was, yes, partly due to Allende and the accompanying turmoil. Then the country experienced a recovery driven in large part by massive capital inflows, which mostly consisted of making up the lost ground. Then there was a huge crisis again in the early 1980s — part of the broader Latin debt crisis, but Chile was hit much worse than other major players.
[JW: By the way, it's true that most retrospective discussions of the Pinochet years do tend to overlook this small detail.]
It wasn’t until the late 1980s, by which time the hard-line free-market policies had been considerably softened, that Chile finally moved definitively ahead of where it had been in the early 70s.

So: free-market policies are applied, and presto! prosperity follows — fifteen years later.

But remember, Obamanomics has definitely failed after 13 months.

Wednesday, April 07, 2010

Recognizing "historic truth" - The Prime Ministers of Russia and Poland jointly commemorate the 1940 Katyn massacre



Brian Brivati, posting on Dissent's new "Arguing the World" blog, just highlighted "a simple news item" that made him do a double-take. He was right to be startled:
Seven decades ago, Soviet secret police executed thousands of Polish military officers in a forest in western Russia. On April 7, Russian and Polish leaders will meet to officially commemorate the massacre's anniversary together for the first time.
Russian Prime Minister Vladimir Putin and his Polish counterpart, Donald Tusk, are due to attend a memorial ceremony in the village of Katyn honoring the more than 20,000 officers, policemen, and intellectuals who were killed on Soviet leader Josef Stalin's orders during World War II.
The two men will also pay tribute to Soviet victims of the Stalinist terror.[....]

On April 2, the Oscar-nominated 2007 film "Katyn," by Polish director Andrzej Wajda -- whose father was a Katyn massacre victim -- premiered on Russia's "Kultura" television channel.
The official Russian government newspaper "Rossiiskaya gazeta," which in the past had published articles casting doubt on Soviet responsibility for the Katyn massacre, opined that the screening "shows our society's serious progress on the path toward restoring historic truth about the tragedy of World War II."
Here is a still from Wajda's film:



=> For some readers, the word Katyn may not immediately ring a bell, so it might be worth adding a few words of historical background.

It is sometimes forgotten that, during the first two years of the Second World War, Nazi Germany and the Soviet Union were allies. Part of the 1939 Hitler-Stalin Pact was a secret agreement to carve up Poland between them. A few weeks after the German invasion of Poland, the Soviet Union invaded from the east and seized roughly half of the country. Between that time and Hitler's invasion of the Soviet Union in 1941, hundreds of thousands of Poles were killed, imprisoned, or deported to Siberian camps.

Katyn figures in the most notorious incident from that period. In 1940 the Soviets decided to execute about 22,000 Polish officers and other prisoners of war under their control. In this operation, the largest single massacre was carried out in the Katyn forest, near Smolensk. It is important to emphasize that a large proportion of these murdered officers--most of them, I believe--were not professional military men. They were reserve officers, largely university graduates, called up for service at the beginning of the war. So they included doctors, lawyers, educators, and other professionals as well as civil servants, police, businessmen, journalists, intellectuals, and so on. (Also the Chief Rabbi of the Polish Army, Baruch Steinberg.) This fit into a larger pattern in which both the Nazis and the Soviets systematically targeted members of the Polish intelligentsia and other leading groups in Polish society in order to help cripple potential resistance.

One reason why the Katyn massacre, in particular, acquired such an iconic place in Poland's historical memory is that this massive crime was followed up by a long-term historical lie--in which Poles themselves were forced to participate, as long as their country was under Communist rule and Soviet domination. As an article in Deutsche Welle explains:
For 70 years, the Katyn massacre has provided Soviet and Russian governments with a political and diplomatic headache. Until the late 1980s, the official version was that German troops had killed the Poles in 1941, in the wake of the German attack on the Soviet Union. But the Germans uncovered the mass graves at Katyn in 1943 and shifted the blame towards the Soviets.

The Communist propaganda machine was swift to reply. "The whole world must know about the monstrous crimes of the fascist German butchers," cried a Soviet wartime movie, a slogan that was repeated time and again in the following decades.

The truth emerged almost half a century later. The then Russian president Boris Yeltsin opened the archives in 1992 and released documents carrying the signature of the Soviet dictator Joseph Stalin. It was the first piece of irrefutable proof that Soviet death squads, not German soldiers, were the perpetrators.

"I did it right away," said Yeltsin. "Every secretary-general of the Communist Party handed these documents to his successor, who put them in his personal safe and kept silent."

Yeltsin offered his apologies to Poland, according to witnesses, with tears in his eyes. [....]
That was a moving gesture, but a relatively isolated gesture. Furthermore, 2010 is not 1992, and Vladimir Putin is certainly not Boris Yeltsin. In today's Russia, Stalin and the Stalinist era are getting increasingly rehabilitated in both public opinion and official discourse.

(And some things never did change. Today's New York Times article reports that the Russian Communist Party, sounding a bit like some US Republicans complaining about Obama, "chastised Mr. Putin on Wednesday for 'going to Katyn to apologize'. In a statement on its Web site, the party said, 'You can apologize as much as you want about the so-called Soviet guilt, but no one can hide the fact of German responsibility for the shootings of Polish soldiers'.”)

Putin, for his part, is a former KGB agent who has repeatedly described the demise of the Soviet Union as "the greatest geopolitical catastrophe of the [twentieth] century"--a truly mind-boggling judgment, considering everything else that happened in the 20th century, even if one forgets for a moment that the history of the Soviet Union was itself a gigantic catastrophe. For Putin to take this step, in his official capacity as Prime Minister of the Russian Federation, is a genuinely big deal.

Quoting the New York Times article again:
Prime Minister Vladimir V. Putin on Wednesday became the first Russian or Soviet leader to join Polish officials in commemorating the anniversary of the murder of thousands of Polish officers by the Soviet Union at the beginning of World War II.

Mr. Putin cast the executions as one tragedy out of many wrought by what he called the Soviet Union’s “totalitarian regime.”
=> As Brian Brivati correctly observes, there may be a lesson here for other countries whose history includes large-scale mass murders, carried out by now-superseded political regimes, which they still cannot honestly acknowledge and confront.
Armenians and Turkey take note: the world is moving towards an understanding and reconciliation on these issues. Putin has made a move. Who will follow?
If even Putin's Russia can stop living this lie (or, at least, begin to stop living it), perhaps a democratic Turkey can begin to face up honestly to the "historic truth" of the Armenian genocide. Yes, it can take time for societies to come to terms with such things, but at this point they've had almost a century. If not now, when?

Yours for reality-based discourse,
Jeff Weintraub

P.S. Many academics in various disciplines, on the basis of conventions that are sometimes well-intentioned and superficially plausible but in fact are substantively fallacious and intellectually and morally misleading, remain allergic to the word "totalitarian." If Vladimir Putin is willing to utter it without embarrassment or circumlocution, perhaps others can follow him in that respect, too.

Monday, April 05, 2010

Environmental policies that work - Reducing air pollution in Mexico City

Following up the earlier post about long-term reductions in air pollution in the US ... I am pleasantly surprised (perhaps I should say pleasantly astonished) to discover that since the 1990s a dramatic improvement in air quality has been achieved in (believe it or not) Mexico City.
This megalopolis once had the world's worst air, with skies so poisonous that birds dropped dead in flight. Today, efforts to clean the smog are showing visible progress, revealing stunning views of snow-capped volcanoes -- and offering a model for the developing world.
"Model" is right. If it can be done there, it can be done anywhere.
"We have seen a lot of improvement. It is very clear," said Luiz Augusto Cassanha Galvao, a senior environmental officer at the Pan-American Health Organization. "On a scale of one to 10, they were at 10, and now they're at five." [....]

In 1992, the United Nations declared Mexico City the most polluted city on the planet. High ozone levels were thought to cause 1,000 deaths and 35,000 hospitalizations a year. Thermal inversions held a toxic blanket of dirty air over a grimy city that seemed to embody the apocalyptic "Makesicko City" of the fiction of Mexican author Carlos Fuentes.

Mexico was forced to act. It replaced the city's soot-belching old cars, removed lead from gasoline, embraced natural gas, expanded public transportation, and relocated refineries and factories.

Change was gradual, but the pace has quickened in recent years.

The presence of lead in the air has dropped by 90 percent since 1990. Suspended particles -- pieces of dust, soot or chemicals that lodge in lungs and cause asthma, emphysema or cancer -- have been cut 70 percent. Carbon monoxide and other pollutants also have been drastically reduced. [....]

Ozone levels have dropped 75 percent since 1992, but they still exceeded international standards for a total of 530 hours last year. [....]

"If the government decides to do something about it, it can be done," said Nobel Prize-winning air quality expert Mario Molina. "There's really no excuse not to do more." [....]

Mexico City's geography adds to the problem; the city of more than 20 million is cradled in a 7,300-foot-high bowl, surrounded by peaks higher than 17,000 feet that trap pollutants.

But experts say many places overcame similar challenges. European cities, for example, halved pollution in recent decades by dramatically reducing coal fuel.

"Simple measures that enormously reduce pollution are feasible, and they are not expensive," said Michal Krzyzanowski, an air quality adviser for the World Health Organization.

"It is not the destiny of mankind to live in polluted cities."
Perhaps not.

The passages quoted above come from a Washington Post story to which I was alerted by Michael O'Hare at The Reality-Based Community. His post celebrating the transformation of Mexico City's atmosphere from deadly and intolerable to merely unsatisfactory and still improving is worth reading (below)--not least for the before-and-after pictures. As he says:
The improvement in every quality indicator of air quality in an enormous city located in one of the worst places for air pollution persistence is an inspiration.
Yours for effective environmental sanity,
Jeff Weintraub
========================================
Michael O'Hare (@ The Reality-Based Community)
March 31, 2010
Environmental policy that works


J. M. Velasco, Valle de Mexico 1892

This is what the Valley of Mexico looked like at the turn of the 20th century. When I came across a couple of paintings of this view by José Maria Velasco, a near-contemporary of the Hudson River School artists of the US, in the Museo Nacional de Arte, I was close to tears, having walked into the building from contemporary Mexico City. Has there ever been a more complete devastation of a natural paradise, short of flooding a valley with a dam, or dumping a West Virginia mountaintop into the river below it to get some coal out?

In less than a hundred years, an idyllic mountain valley surrounded by volcanoes had turned into the eighth largest city in the world, stifling in a pool of toxic, opaque air pollution:


Photo: Alfredo Cottin

Mexico City hasn’t got its lake back, and is still sinking because of pumping groundwater, and it remains one of the most pedestrian-hostile cities in the world, but not having been there for almost a decade, I loved this story: you can see across it again, and breathing isn’t a constant insult to lungs.


Photo: www.imagenesaereasdemexico.com

The improvement in every quality indicator of air quality in an enormous city located in one of the worst places for air pollution persistence is an inspiration. No, the economy didn’t collapse under the crushing weight of brutal regulation: the cleanup wasn’t free but it’s such a bargain, not just in health benefits but quality of life…and what else matters, when you get right down to it?

It takes one to know one — A medico-political illustration

Mark Kleiman, as usual, zeroes right in on the key point:

I’m not sure why people are criticizing Dr. Jack Cassell, a Florida urologist, for telling patients who voted for Obama to seek care elsewhere.

After all, doesn’t it stand to reason that the very best urological treatment would come from a physician who is himself a prick?

Perhaps Dr. Cassell could form a joint practice with a proctologist.



--Jeff Weintraub

Bob Dylan banned in Beijing

Actually, he's been banned from performing in Shanghai, too, or anywhere else in China. But I couldn't resist the alliteration. According to the Guardian report:
Aged 68 and almost half a century past the zenith of his angry, protest-song youth, Bob Dylan must almost have forgotten what it was like to be deemed a threat to society. But it seems at least one place still sees him as a dangerous radical.

Dylan's planned tour of east Asia later this month has been called off after Chinese officials refused permission for him to play in Beijing and Shanghai, his local promoters said. China's ministry of culture, which vets planned concerts by overseas artists, appeared wary of Dylan's past as an icon of the counterculture movement, said Jeffrey Wu, of the Taiwan-based promoters Brokers Brothers Herald. [...]

The verdict scuppers Dylan's plans to play his first dates in mainland China. The singer, who plays around 100 concerts a year on his Never Ending Tour, had hoped to extend a multi-city Japanese leg with concerts in Beijing, Shanghai, Taiwan, South Korea and Hong Kong. All these would now be called off, Wu told the newspaper.

"With Beijing and China ruled out, it was not possible for him just to play concerts in Hong Kong, South Korea and Taiwan," he said. "The chance to play in China was the main attraction for him. When that fell through everything else was called off."
Why is the Chinese government afraid of Bob Dylan? It probably goes back to six words spoken by the Icelandic singer Björk during her 2008 concert in Shanghai: "Tibet, Tibet! Tibet, Tibet! Tibet, Tibet!":
Dylan fans denied the chance to see their hero might also blame Björk, who caused consternation among Chinese officials two years ago by shouting pro-Tibet slogans at a concert in Shanghai, Wu told Hong Kong's South China Morning Post.
Actually, it was closer to an amplified whisper than a shout. You can see it on video here. Apparently, this "hurt the feelings" of the Chinese people.
"What Björk did definitely made life very difficult for other performers. They are very wary of what will be said by performers on stage now," Wu said.

Last year, Oasis were told they were "unsuitable" to play in Beijing and Shanghai as Noel Gallagher had appeared at a Tibet freedom concert 12 years earlier.
This whole affair tells us less about Bob Dylan (and Björk) than about the continuing nervous insecurity of Chinese officialdom in a lot of respects, despite China's increasing economic strength and its increasing bravado on the world stage ... and, let us not forget, the touchy and defensive nationalism the Chinese government shares with much of the Chinese public, easily inflamed by any mention of Tibet.

As Mick Hartley observes, "It's been decades since Bob Dylan could have been described as a protest singer"--though how would a Chinese cultural apparatchik know that? And, alas, he's not as great an artist as he once was, so I'm not sure that seeing him live is necessarily better than listening to his recordings (again and again). But it's still true that "the Chinese need Dylan more than Dylan needs the Chinese."

(As for China's Ministry of Culture ... the right song to send them is probably Positively Fourth Street.)

Stuck Inside of Mobile with the Memphis Blues Again,
Jeff Weintraub

Sunday, April 04, 2010

Environmental policies that work - Air pollution in the US, 1980 vs. 2008

(Via Brad DeLong.) It's sometimes necessary to remind ourselves that sensible policies in the service of environmental sanity can work, if we make a serious effort.

At first glance, I also can't help being struck by the fact that the downward trends in these forms of air pollution carried on through several anti-environmentalist Republican presidential administrations--beginning with Reagan in 1980. Back in 1970, the Environmental Protection Agency was created with the support of then-President Richard Nixon, and of course one of the founding fathers of the environmental cause in US national politics was the conservative reformer Teddy Roosevelt. Historically, the Republican Party was not always or uniformly identified with opposition to environmentalist policies. Since the commencement of the age of Reagan, that has changed. But once these policies are set in motion, they seem to develop their own momentum.

--Jeff Weintraub

Saturday, April 03, 2010

Health care reform vs. judicial activism? – Some speculations

[Also posted, with slight revisions, in the new Dissent blog, "Arguing the World". Whether or not this particular post interests you, I recommend checking out Arguing the World.]

Will right-wing opponents of the new health care reform legislation be able to overturn it by getting the Supreme Court to declare it unconstitutional? I’d like to think the answer is a definite no, and probably it is, but I’m not sure we can rule out the possibility that the answer is actually maybe.

I am definitely not a professional Supreme Court watcher, and I make no pretense to expertise in such matters. My non-expert impression is that such an outcome is unlikely, and the bulk of more expert opinion seems to incline in the same direction. (For some representative treatments of the purely legal issues, see the on-line New York Times round-up of lawyers and legal scholars on the question: “Is the Health Care Law Unconstitutional?”).

Unlikely--but not impossible. After all, back in 2000 the prospect that the Supreme Court would intervene in such a crudely and blatantly unprincipled way to decide the outcome of the Presidential election seemed a little far-fetched to many people until it actually happened. And, more recently, the high-handed manner in which the Supreme Court swept away more than a century of federal and state laws limiting the ability of corporations to spend money in federal elections should have alerted even people who hadn’t been paying attention that we are living through an era of exceptionally unabashed judicial activism.

Furthermore, while most of the legal analysts urging the Supreme Court to declare the bill unconstitutional are, unsurprisingly, on the right, I notice that even some generally left-of-center analysts like Jonathan Turley believe they might have a case:
One of the most contested issues is the so-called individual mandate under which Congress has ordered all citizens to get medical insurance or face fines. Though the federal government has the clear advantage in such litigation, these challenges should not be dismissed as baseless political maneuvering. There is a legitimate concern for many that this mandate constitutes the greatest (and perhaps the most lethal) challenge to states' rights in U.S. history.
So the possibilities for Supreme Court nullification of the Patient Protection and Affordable Care Act may be worth pondering in advance, just in case.

=> The Democratic health care reform package that emerged from more than a year of political drama and legislative trench warfare is certainly flawed, incomplete, and otherwise unsatisfactory in many respects. Nevertheless, it adds up to a crucial first step in the right direction, and its enactment was an enormously important achievement—whereas the ultimate failure of the whole effort would have been a political and public-policy fiasco.

However, the political fight over this bill is far from over. Republicans immediately began threatening to repeal it as soon as they regain control of Congress, and in the meantime they promise to make repeal a central issue for the 2010 mid-term elections. (More cautious exceptions have been few.)

I’m generally persuaded by the arguments that, even if the Republicans win a crushing victory this November, they still won’t actually be able to repeal the bill. As long as Obama is President, they’ll be blocked by a Presidential veto. And even if a Republican President committed to repeal gets elected down the line, the Republicans will run into an obstacle that is close to being an unwritten law of modern American politics—once government benefits have been granted to a significant body of middle-class voters, it is very hard to take them back. Nevertheless, these factors should not lead supporters of the PPACA to feel complacent. Even if a Republican Congress can’t repeal it outright, there are various ways they could undermine, sabotage, and distort its implementation.

Meanwhile, opponents of the bill are also trying to circumvent those political obstacles by taking their case to the federal courts, a route that will probably lead to the Supreme Court. At last count, 14 state Attorneys-General (all but one of them Republican) have joined two lawsuits seeking to have the new health care law declared unconstitutional, describing it as “an unprecedented encroachment on the sovereignty of the states."

The heart of the objection, as noted above, is to the individual mandate that the new law shares with RomneyCare in Massachusetts. That is, in return for promising universal health insurance coverage, at least as a goal, the law requires that everyone obtain coverage. If an individual is not covered through an employer (or by Medicare or other public insurance), then he or she is required to buy health insurance individually (with subsidies to assist individuals for whom that would be an undue financial burden).

This is not a secondary or optional aspect of the plan—and not only because the fundamental moral basis of this health care reform lies in a commitment to social solidarity and mutual responsibility. Universal insurance coverage requires universal participation. The key principle of insurance, after all, is to share and spread risk. Without universal, or close-to-universal, participation, the problems of “adverse selection” and incentives for free-riding would make the rest of the system unworkable.

According to the main lawsuit challenging the bill, filed in Florida:
The Constitution nowhere authorizes the United States to mandate, either directly or under threat of penalty, that all citizens and legal residents have qualifying health care coverage.
At first glance, this might seem like a puzzling basis for a lawsuit. No, the text of the Constitution doesn’t explicitly mention health insurance. But hasn’t it long been accepted that the federal government can require us to pay taxes toward Medicare, which is certainly a system of health insurance? And how about Social Security, which is a system of retirement insurance? No doubt some people still regard both of those as unconstitutional, but that has become a marginal view. And is it really “unprecedented” for the federal government to require individuals to pay directly for their own health insurance? Actually, no. As Paul J. O'Rourke helpfully points out:
In July, 1798, Congress passed, and President John Adams signed into law “An Act for the Relief of Sick and Disabled Seamen,” authorizing the creation of a marine hospital service, and mandating privately employed sailors to purchase healthcare insurance.
Since John Adams was one of the leading figures of the founding generation (though not a delegate to the Constitutional Convention itself), we can presume that he had some understanding of the Constitution’s “original intent.”

QED? Not quite. Here is where the argument gets a bit tricky.

Ironically, the feature of ObamaCare that renders it potentially vulnerable to this legal challenge is precisely one of those features that make it a moderate, centrist, gradualist reform, as opposed to the “radical” or “socialistic” measure claimed by right-wing propaganda. Medicare and Social Security are single-payer systems of public insurance funded directly by federal taxes, and the same was true for the 1798 plan covering merchant seamen. But the current Democratic health care plan did not attempt to institute a single-payer, Medicare-for-all system (let alone a “government take-over of health care”). Instead, for better or worse, it left in place a system in which most non-Medicare health insurance is provided through private profit-seeking corporations. In the end, even the alternative of a public option was knocked out of the bill.

Thus, precisely because the plan is not “socialistic,” opponents now claim that even if we accept the constitutionality of Social Security and Medicare, the insurance mandate in this bill really is “unprecedented.” Randy Barnett, Professor of Legal Theory at Georgetown Law Center and proponent of the so-called "Lost Constitution" (essentially, the pre-New Deal Constitution), puts it this way:
Congress has never before mandated that a citizen enter into an economic transaction with a private company, so there can be no judicial precedent for such a law.
Again, some readers may be perplexed. Aren’t we legally required to buy auto insurance, from “a private company,” if we want to drive a car? The rejoinder seems to be that those requirements are instituted by state governments, not by the federal government. Others add that buying a car and getting a driver’s license are voluntary choices, so citizens can avoid these requirements simply by not buying or driving a car.

Barnett, who clearly feels he has an indisputable knock-down argument here, reaches for a reductio ad absurdum:
Imagine if Congress ordered the majority of American households without a firearm to buy a handgun from a private company, and punished their failure to do so with an escalating monetary fine, which it labeled a “tax.” Would the supporters of the health insurance mandate feel the same about the constitutionality of such a measure?
Well, we don't have to imagine it. As Brad DeLong quickly pointed out, the Militia Act of 1792, passed by Congress and signed into law by President George Washington, "ordered the majority of American households" to do precisely that.
Be it enacted by the Senate and House of Representatives of the United States of America, in Congress assembled, That each and every free able-bodied white male citizen of the respective States, resident therein, who is or shall be of age of eighteen years, and under the age of forty-five years (except as is herein after excepted) shall severally and respectively be enrolled in the militia [….] That every citizen, so enrolled and notified, shall, within six months thereafter, provide himself with a good musket or firelock, a sufficient bayonet and belt, two spare flints, and a knapsack, a pouch, with a box therein, to contain not less than twenty four cartridges, suited to the bore of his musket or firelock, each cartridge to contain a proper quantity of powder and ball; or with a good rifle, knapsack, shot-pouch, and powder-horn, twenty balls suited to the bore of his rifle, and a quarter of a pound of powder; and shall appear so armed [….]
Etc. In short, Barnett’s claim that “Congress has never before mandated that a citizen enter into an economic transaction with a private company” is, again, historically inaccurate. It’s clear that federal laws with individual mandates date back to the earliest years of the republic.

Will these historical precedents, and others that are sure to turn up, be enough to settle the matter? Probably not.

=> I repeat that I find it unlikely that the Supreme Court will actually throw out the health care reform bill, if only because this could lead to a major political and constitutional crisis. But two interconnected factors give me pause.

First, it so happens that the legal basis for much of the national regulatory state that has grown up since the New Deal rests on a surprisingly narrow constitutional foundation—and a potentially shaky foundation, if a hostile Supreme Court were determined to make it so. For the lawyers, the ultimate basis for a wide range of federal laws and other measures whose legitimacy most of us have come to take for granted (including, say, the Civil Rights Act of 1964) is derived from implications of the Commerce Clause of the Constitution (Article I, Section 8, Clause 3), which gives Congress the power “To regulate Commerce with foreign Nations, and among the several States, and with the Indian Tribes.” In many cases, this has involved interpreting the relevance of particular laws to “interstate commerce” in fairly broad and creative ways—and even in ways that, from a non-legalistic perspective, could plausibly be described as strained or problematic. Thus, in terms of its ultimate legal basis, the whole apparatus of the national regulatory state in the US is a bit of a Rube Goldberg contraption.

In itself, that’s not necessarily a problem. All legal systems rely on legal fictions and on interpretive conventions that may look strained or implausible to outsiders. As long as they continue to be accepted as authoritative, and are embedded in a solid structure of taken-for-granted precedents, there’s no reason to assume that they are necessarily vulnerable to challenge—unless, say, an exceptionally activist and ideologically aggressive Supreme Court decided to exploit that potential vulnerability. But this brings me to my next point.

The second disquieting factor is that we are, indeed, dealing with an exceptionally and aggressively activist Supreme Court. This obvious fact has been obscured, to some degree, by the continuing tendency to equate “judicial activism” with “liberal judicial activism.” Even people who should know better often talk as though “judicial activism” necessarily meant using the courts to take the initiative in promoting liberal or progressive ends—challenging legalized racism, extending rights to oppressed or stigmatized minorities, protecting the rights of criminal defendants, and the like—whereas conservatives claim to favor “judicial restraint.” But these presumptions, and the political slogans that go with them, are hangovers from the Warren Court of the 1950s-1960s and its after-effects in the Burger Court during the 1970s. (Warren Burger, not Earl Warren, was Chief Justice in 1973 when the Roe vs. Wade decision established a woman’s right to choose abortion.) These clichés have long since ceased to have much contact with reality.

It shouldn’t be necessary (but sometimes it is) to make the obvious point that, in the context of these political and ideological divisions, judicial activism is a double-edged sword. If we use the most straightforward criterion of Supreme Court “judicial activism”—the frequency with which the Court strikes down federal and state laws—then the Rehnquist and Roberts Courts have been among the most activist in American history. And it looks as though the transition from the Rehnquist Court to the Roberts Court has brought an escalation, not a moderation, of this tendency. During their confirmation hearings, both John Roberts and Samuel Alito piously insisted on their commitment to judicial restraint, their respect for judicial precedent, their disinclination to legislate from the bench, and so on—and then, once appointed, went on to demonstrate that they didn’t mean a bit of it, and that on key issues they have a majority of the Court on their side.

Nor should this seem surprising or paradoxical, because in certain respects the era of the Warren/Burger Court was anomalous. As people who know American history are aware, for most of that history periods of aggressive judicial activism have usually, though not exclusively, involved a willingness by the courts to block laws passed by democratically elected legislatures in ways that served the interests of wealth and power. (At times during the 19th century, this included active support for maintaining white supremacy. Before the Civil War, there was the notorious Dred Scott decision, and after the Civil War, a series of disgraceful Supreme Court decisions helped undermine efforts by the federal government to defend Reconstruction in the South against a violent white-supremacist backlash spearheaded by the Redeemer Democrats and the Ku Klux Klan.)

In 1905 Oliver Wendell Holmes, Jr., a conservative jurist who could hardly be called a crypto-socialist or egalitarian social reformer in his sympathies, felt compelled to remind his Supreme Court colleagues that the contested “economic theory” of doctrinaire free-market fundamentalism that they found so convincing was not, in fact, part of the Constitution. (“The Fourteenth Amendment does not enact Mr. Herbert Spencer's Social Statics.”) But this particular confusion pervaded Supreme Court decisions for a long time. During Franklin Roosevelt’s first term, the Supreme Court struck down so much New Deal legislation that it almost provoked a head-on confrontation between Roosevelt and the Court. Roosevelt did develop a plan to pack the court with less obstructionist Justices, but that gambit generated enough opposition to kill it. For its part, the Supreme Court backed off, and for the next half-century it generally recognized the right of the federal government to pass national regulatory legislation.

And, generally speaking, it still does. But over the past few decades there have been signs of incipient counter-reaction. Rehnquist came to the Court determined to push a “federalism revolution” that involved, essentially, using the Supreme Court (and the rest of the federal court system) to repeal much of the New Deal. The progress of this agenda was limited and gradual rather than revolutionary (some observers sympathetic to the effort even concluded, in disappointment, that “the federalism boomlet has fizzled”), in large part because the Rehnquist Court was divided—even some of the Justices appointed by Republican Presidents turned out to be relatively moderate, to the frequent dismay of the Republican hard right. Nevertheless, it did give rise to some startling decisions. The legal journalist Linda Greenhouse recently recalled some of the most dramatic examples:
In a series of 5-to-4 rulings, the court took a view of Congressional authority that was narrower than at any time since the early New Deal. The court struck down a federal law that barred guns near schools, on the ground that possession of a gun near a school was not the type of activity that the Constitution’s Commerce Clause authorized Congress to regulate. It ruled that Congress could not require states to give their employees the protections of the federal laws against discrimination on the basis of age or disability. It ruled that the federal government couldn’t “commandeer” state officials to perform federal functions like federally mandated background checks of gun purchasers.
One of the more “moderate” Republican appointees on the Court, Sandra Day O’Connor, has now been replaced by Samuel Alito. So wouldn’t the current challenge to the constitutionality of the individual health insurance mandate offer the opportunity for an even more momentous decision along these lines?

As it happens, despite the ever more rightward tilt of the Supreme Court, Greenhouse nevertheless believes that the chances that the Roberts Court will overturn the health care reform law are remote. Greenhouse knows a lot more about such matters than I do, but I am not entirely convinced by her analysis. The main reason she offers is that, although Roberts and Alito are certainly not more centrist than Rehnquist and O’Connor, they are less committed to the “federalism” agenda than their predecessors. Even if that’s true, I’m not sure how much it matters. If states-rights arguments can provide them with a rationale for striking down the individual mandate, thus eviscerating the most important piece of quasi-social-democratic social legislation in decades, I suspect they might find the opportunity very hard to resist.

=> Let me reiterate that all these prognoses are matters of informed speculation—and, in my case, the speculation isn’t even so well informed. To avoid any possible misunderstanding, I also want to make it clear that I’m not suggesting that the current right-wing majority on the Supreme Court (or, more precisely, the right-wing quartet of Roberts, Alito, Scalia, and Thomas plus the swing-voter Anthony Kennedy) are just political operatives in robes who would simply manipulate the law to advance a partisan agenda. On the contrary, whatever one thinks of them—and I don’t think much of some of them—everything I’ve read about them seems to indicate that they are serious jurists committed to the rule of law and to constitutional principles as they understand them. But all the evidence also suggests that they are sincerely committed to an ideologically driven agenda of aggressive judicial activism.

At the same time, as Finley Peter Dunne’s character Mr. Dooley sagely observed back at the start of the twentieth century, it’s also true that "the Supreme Court follows the election returns." Nowadays, like everyone else, they probably follow the polls, too. So my guess, for what it’s worth, is that if the level of public opposition to the health care reform bill stays roughly the same or declines over the rest of 2010, and if the Democrats retain control of Congress in the November elections, then this legal challenge to the constitutionality of the individual mandate will probably go nowhere. On the other hand, if the Republicans win a crushing victory in November—then we’ll see.

That prediction could also turn out to be entirely wrong. Meanwhile, we should probably expect a high-volume propaganda war over the unconstitutionality and iniquity of the individual mandate at least until November.

------------------------------

Of course, if people are genuinely concerned that a federal mandate to buy something from private companies is “unprecedented” and unconstitutional, the fix is obvious. Just move to a government-run single-payer system, like Medicare, and cut out private insurance companies entirely. I’m inclined to think that would produce a better health care system anyway.

--Jeff Weintraub

Friday, April 02, 2010

Moses & the Israelites cross the Red Sea (updated)

I have now seen this updated image in several places. Among others, it was reproduced on the cover of the custom-written Haggadah used for the Passover seder my wife Ageliki & I attended (at the home of Kathy Hirsh-Pasek & Jeffrey Pasek, to whom we are grateful for including us). So why not share it with the world?

=> A FOLLOW-UP: This post generated a lot of e-mailed responses, including this one from Mark Kleiman:
This seems inconsistent with the explanation I have heard for why we dip hard-boiled eggs in salt water during the seder: because when they crossed the Red Sea, the Children of Israel had to wade through water up to their baitzim.
(That is, their balls.) Curiously enough, I've never heard this other explanation, but I like it. It's a classic example of the cheerful vulgarity that runs through a lot of Yiddish humor.

With Jews, of course, there will always be conflicting interpretations.

Next year in Jerusalem (metaphorically speaking),
Jeff Weintraub