Over the weekend Niall Ferguson got himself into intellectual hot water over an off-the-cuff response to a question about Keynes, in which he suggested that Keynes didn't value the future much because he was gay, had no heirs, and therefore didn't care about future generations. Now, Keynes's writings here and here belie the claim that he didn't care about the future. And the whole "someone who's gay must have a reduced shadow of the future" stereotype is hackneyed in the extreme. So Ferguson was doubly wrong -- and to his credit, he offered up a real apology (not an "I'm sorry if this offended anyone" variant) pretty quickly.
Critical wounds run deep, however. In response to a lot of online discourse that noted his prior observations on Keynes's sexual orientation, Ferguson penned an open letter in the Harvard Crimson. Some highlights:
I was duly attacked for my remarks and offered an immediate and unqualified apology. But this did not suffice for some critics, who insisted that I was guilty not just of stupidity but also of homophobia. I have no doubt that at least some students were influenced by these allegations. Nobody would want to study with a bigot. I therefore owe it to students—former and prospective—to make it unambiguously clear that I am no such thing.
To be accused of prejudice is one of the occupational hazards of public life nowadays.…
Not for one moment did I mean to suggest that Keynesian economics as a body of thought was simply a function of Keynes’ sexuality. But nor can it be true—as some of my critics apparently believe—that his sexuality is totally irrelevant to our historical understanding of the man. My very first book dealt with the German hyperinflation of 1923, a historical calamity in which Keynes played a minor but important role. In that particular context, Keynes’ sexual orientation did have historical significance. The strong attraction he felt for the German banker Carl Melchior undoubtedly played a part in shaping Keynes’ views on the Treaty of Versailles and its aftermath.…
What the self-appointed speech police of the blogosphere forget is that to err occasionally is an integral part of the learning process. And one of the things I learnt from my stupidity last week is that those who seek to demonize error, rather than forgive it, are among the most insidious enemies of academic freedom.
Now there are two things going on here. First, to what extent does a person's biography affect his or her role in history? And second, just who are these "self-appointed speech police of the blogosphere"?
Ferguson is correct on the first point in general, though I'm not so sure about this particular instance. I'm in the middle of Jeremy Adelman's magisterial biography Worldly Philosopher: The Odyssey of Albert O. Hirschman, for example. One would be hard-pressed to suggest that Hirschman wrote what he wrote without paying some attention to his life story. So it is entirely appropriate for a historian to talk about Keynes's personal background in trying to suss out why he argued what he argued.
The thing is, Ferguson keeps eliding important details when he talks about the effect of Keynes's sexual preferences on his policy pronouncements. Take the claim that Keynes's attraction to Melchior affected his views on Versailles. Eric Rauchway points out some additional facts not in evidence:
Keynes made early calculations for what Germany should pay in reparations in October, 1918. In “Notes on an Indemnity,” he presented two sets of figures – one “without crushing Germany” and one “with crushing Germany”. He objected to crushing Germany because seeking to extract too much from the enemy would “defeat its object by leading to a condition in which the allies would have to give [Germany] a loan to save her from starvation and general anarchy.” As he put in a revised version of the same memorandum, “If Germany is to be ‘milked’, she must not first of all be ruined.”
Keynes also worried that too large a reparations bill might distort international trade. “An indemnity so high that it can only be paid by means of a great expansion of Germany’s export trade must necessarily interfere with the export trade of other countries.”
The point of mentioning it is that Keynes developed these concerns prior to going to the negotiations and meeting Carl Melchior.
So even if Ferguson is right on general principle, he's misleading on this particular point.
It's the last paragraph of Ferguson's letter that's quite … quite … 2004 in its formulation. Just who are these "self-appointed speech police of the blogosphere" anyway? The most damning indictments of Ferguson's past discussions of Keynes's homosexuality, Ferguson's more contemporary and woefully wrong economic predictions, and Ferguson's recent intellectual dust-ups come from either Business Insider or the Atlantic. Other prominent online critics of Ferguson over the past week have been Justin Wolfers, Paul Krugman, Brad DeLong, and Rauchway. That's three full professors of economics and a full professor of history.
Ferguson's rhetorical trick here is to try to denigrate the content of their criticisms by pointing to the medium. It's a cute gambit in public discourse, and I suspect it will make him and his acolytes feel better. Intellectually, however, that dog won't hunt.
As much fun as it is to dissect Niall Ferguson -- and I won't lie, I've had a lot of fun at his expense -- this sort of thing gets tedious after a spell. So, please, Niall, try to wade into more interesting intellectual waters the next time you make a mistake.
Oh, and stop claiming "academic freedom" as a shield to protect you from public critiques of something you said at an investment conference. That's not how academic freedom works.
Am I missing anything?
Yesterday the New York Times announced a brand new conference called The Next New World. The URL gives the game away, however -- it's the Friedman Forum. The précis:
Pulitzer Prize–winning New York Times columnist Thomas L. Friedman hosts this timely forum, bringing together chief executive officers, tech pioneers, government officials, influential decision-makers and scholars to discuss the new world economy, opportunities and challenges. We will explore the complex dynamics of new-world infrastructure, especially the transformative electronic, digital and mobile environment. Attendees can expect invaluable insights into strategies for success in today’s new world order.
If you act before May 10, you can get the discounted rate of $995.00 to attend!
Why should you shell out that kind of cabbage to go to such a confab? Well, there's the speaker list of course, but even better, the Friedman Forum has a "Why Attend?" page that will answer this very question. The good parts version:
The New York Times Next New World Forum is an invitation-only, highly interactive forum that explains:
How this Next New World is changing your job, your workplace, and your competition...
How cyberattacks and monetary crises are the new national security threats—threats to global businesses as well as nations....
How brands are threatened as never before by new players, and why C-Suite executives are both more constrained and less likely to last....
How robotics and other cutting-edge technologies can increase productivity but also disrupt your office and workforce....
How everything from climate change to fallen infrastructure is threatening global supply chains and how the rise of a new global middle class is disrupting American global dominance—while creating new markets.
After reading this, as well as CUNY's announcement that former CENTCOM commander/CIA Director David Petraeus will lead a seminar on the United States and the global economic crisis, I had two reactions.
1) At what point does one decide, "Why, yes, I should lecture people on the New New Things in the Global Economy! And charge at least a thousand dollars for the privilege"?
This is a serious question. I get asked this a lot at various talks, and I'm always befuddled by the query. I mean, if I had the actual answer, I wouldn't be so low in the international relations speaker ecosystem.
2) Forget Davos, Aspen or TED -- the Friedman Forum suggests a whole new vista of conferences branded around the idiosyncrasies of individual thought leaders. Friedman better nail this down fast, because the coming competition will be fierce. In the spirit of... er... alliteration and Robert Ludlum titles, let me predict some other possible confabs on the horizon:
A) The Gross Gaggle. Organized by PIMCO's Bill Gross, this would be a collection of the world's most florid investment letter-writers, warning about risk and uncertainty.
The Big Finale: Gross doing a spoken-word version of his latest newsletter with Dave Brubeck's "Take Five" playing in the background.
B) The Slaughter Seminar. The new president of the New America Foundation will lead a highly interdisciplinary gathering to focus on the myriad ways that the 21st century is upending our static 20th century mindsets. Topics will include the role of social networks, social media networks, online networks, gendered networks, and networked networks.
The Big Finale: A three-hour break in the middle of the day for participants to bond with their families.
C) The Dowd Doohickey. Join the Red Priestess as she explains how leadership is supposed to be done in the 21st century. After the ritual flaying of a political scientist to appease the Social Science Gods, Dowd will explain exactly how politicians used to Get Things Done back in the day.
The Big Finale: Dowd and Aaron Sorkin will re-enact some of the classic Josh Lyman-Donna Moss scenes from The West Wing.
D) The Taleb Teach-In. Just how fragile is your financial position in this time of massive geopolitical and geoeconomic uncertainty? The author of The Black Swan and Antifragile will unleash his crystal ball and stare deeply into your portfolio to see if you're really and truly prepared for a volatile century.
The Big Finale: Taleb unleashes an army of zombies into the auditorium to sort out the resilient from the posers.
E) The Morozov Mish-Mash. Everything is sh*t -- your beliefs, your ideas, your likes, your dislikes, and particularly your values. If you dare attend, Morozov will explain why Everything You Hope for is a Chimera.
The Big Finale: Morozov will glare out at the audience, grumble, "you all suck," drop the mic, and walk off stage.
[And what about your confab?!--ed. I'll let the commenters decide the contents of... the Drezner Deliberations!]
Your humble blogger gave a talk at the "Sex, Tech and Rock & Roll" TEDx event at Binghamton University last month. My talk was entitled "Metaphor of the Living Dead" and was in part prompted by my prior work on zombies, as well as this blog post from last December.
Here's the TEDx talk:
I look forward to The Onion trying to satirize that talk.
A little more than a year ago I blogged that global policymakers had reached a "focal point" moment on the merits of austerity as a macroeconomic policy during a global recession. Namely, central bank authorities had concluded that the policy doesn't really work well at all. One could argue that the period from the May 2010 Toronto G-20 summit to the end of 2011 was one in which austerity policies were widely touted and occasionally implemented. If that was the wrong policy, and there has since been a shift away from it, that's kind of a big deal.
So where are we now on this?
On the public commentary side, I'd say we're approaching near-consensus on the failures of austerity for large economies. The passing of time has allowed for a comparative look at the data, and the results are not pretty for austerity enthusiasts. Martin Wolf sums up the indictment rather neatly, riffing off of a paper by Paul De Grauwe and Yuemei Li:
[T]he chief determinant of the reduction in spreads over German Bunds since the second quarter of 2012, when OMT [the ECB pledge to open up its monetary taps] was announced, was the initial spread. In brief, "the decline in the spreads was strongest in the countries where the fear factor had been the strongest."
What role did the fundamentals play? After all, nobody doubts that some countries, notably Greece, had and have a dreadful fiscal position. One such fundamental is the change in the ratio of debt to gross domestic product. The paper makes three important observations. First, the ratio of debt to GDP increased in all countries even after the ECB announcement. Second, the change in this ratio turned out to be a poor predictor of declines in spreads. Finally, the spreads determined the austerity borne by countries.
On the policy output side, there's been a demonstrable but partial shift. In the past year, the European Central Bank, Federal Reserve, and Bank of Japan have rejected austerity policies in favor of greater levels of quantitative easing. Furthermore, contrary to the outright hostility developing countries directed at quantitative easing in the fall of 2010, the reaction to the past half-year of quantitative easing has been far more muted. When the latest G-20 communique said:
Monetary policy should be directed toward domestic price stability and continuing to support economic recovery according to the respective mandates. We commit to monitor and minimize the negative spillovers on other countries of policies implemented for domestic purposes.
That was code for "hey, G-7 central banks, you gotta do what you gotta do. We get that." Which is demonstrably different from yelling "currency wars", a meme that seems not to have caught fire this time around.
Top central bank authorities have also been willing to speak truth to power -- in this case, GOP members of Congress. John Cassidy recounts Ben Bernanke's testimony from yesterday:
Departing from his statutory duty of reporting to the Senate Banking Committee on the Fed’s monetary policy, Bernanke devoted much of his testimony to fiscal policy, warning his congressional class that letting the sequester go ahead would endanger the economic recovery and do little or nothing to reduce the country’s debt burden.
"Given the still-moderate underlying pace of economic growth, this additional near-term burden on the recovery is significant," Bernanke told his students, who included a number of right-wing Republican diehards, such as Senator Bob Corker, of Tennessee, and Patrick Toomey, of Pennsylvania. "Moreover, besides having adverse effects on jobs and incomes, a slower recovery would lead to less actual deficit reduction in the short run."
Translated from Fed-speak, that meant that congressional Republicans have got things upside down. Bernanke has warned before about the dangers of excessive short-term spending cuts. But this was his most blunt assertion yet that Mitch McConnell, John Boehner, et al. should change course. "To address both the near- and longer-term issues, the Congress and the Administration should consider replacing the sharp, frontloaded spending cuts required by the sequestration with policies that reduce the federal deficit more gradually in the near term but more substantially in the longer run," Bernanke said. "Such an approach could lessen the near-term fiscal headwinds facing the recovery while more effectively addressing the longer-term imbalances in the federal budget."
So does this mean some additional policy shifts? Alas, probably not. The consensus against austerity seems pretty strong on the monetary policy side of the equation. On the fiscal policy dimension, however, austerity remains the de facto policy for a lot of economies. This includes the United States, which is conventionally depicted as not having embraced austerity. The New York Times' Binyamin Appelbaum outlines the current fiscal austerity in his story today:
The federal government, the nation’s largest consumer and investor, is cutting back at a pace exceeded in the last half-century only by the military demobilizations after the Vietnam War and the cold war.
And the turn toward austerity is set to accelerate on Friday if the mandatory federal spending cuts known as sequestration start to take effect as scheduled. Those cuts would join an earlier round of deficit reduction measures passed in 2011 and the wind-down of wars in Iraq and Afghanistan that already have reduced the federal government’s contribution to the nation’s gross domestic product by almost 7 percent in the last two years.
The cuts may be felt more deeply because state and local governments — which expanded rapidly during earlier rounds of federal reductions in the 1970s and the 1990s, offsetting much of the impact — have also been cutting back.
Federal, state and local governments now employ 500,000 fewer workers than they did on the eve of the recession in 2007, the longest and deepest decline in total government employment since the aftermath of World War II.
Total government spending continues to increase, but those broader figures include benefit programs like Social Security. Government purchases and investments expand the nation’s economy, just as private sector transactions do, while benefit programs move money from one group of people to another without directly expanding economic activity.
The reason for this split does not require rocket science. Monetary policy is a tool of politically insulated central bankers. Fiscal policy is a tool for elected politicians. The public might dislike specific budget cuts, but damn if they don't love austerity in theory.
So, in retrospect, I think early 2012 was a focal point -- but only for central bankers and commentators. As Cassidy notes, there remain elected politicians who are super-keen on austerity:
Corker, a former builder who is a long-time critic of Bernanke’s expansionary policies, called him "the biggest dove since World War Two." Toomey, a former head of the conservative lobbying group Club for Growth, questioned whether the sequester would have any real impact on the economy. Bernanke shrugged off the criticisms, calmly and methodically laying out the realities of the situation.
This month Merriam-Webster highlighted their most looked-up words of 2012, with the rather boring conclusion that "capitalism" and "socialism" were the words of the year. To spice things up a bit for those philologists in the crowd, I would like to suggest that every year, some words get temporarily "retired." Not permanently, just for a year or so. Think of it as a word vacation.
I don't make this suggestion lightly -- no writer wants to constrain their options as they craft their arguments. Some words, however, find themselves abused to the point where, no matter how scintillating they might have been in the past, they need some time in rehab. Think Ryan Lochte after the post-Olympics publicity tour or Lindsay Lohan after making a Lifetime movie.
So the worst word of 2012, the word that desperately needs a break is... bubble.
Since 2008, analysts, commentators, pundits et al. have been on the lookout for the next bubble. To provide one example of how this search for the next bubble abuses the term, let's look at the brouhaha surrounding the "higher ed bubble." It was brewing in 2011, but this year, with the rise of online education, the blogosphere has been positively lousy with it: Megan McArdle, Glenn Reynolds, and Walter Russell Mead have been hammering away at this concept.
It is Mead's latest post on the subject that has me thoroughly annoyed. He links to the lead essay in The American Interest by Nathan Harden that opens as follows:
In fifty years, if not much sooner, half of the roughly 4,500 colleges and universities now operating in the United States will have ceased to exist. The technology driving this change is already at work, and nothing can stop it. The future looks like this: Access to college-level education will be free for everyone; the residential college campus will become largely obsolete; tens of thousands of professors will lose their jobs; the bachelor’s degree will become increasingly irrelevant; and ten years from now Harvard will enroll ten million students.
We’ve all heard plenty about the “college bubble” in recent years. Student loan debt is at an all-time high—an average of more than $23,000 per graduate by some counts—and tuition costs continue to rise at a rate far outpacing inflation, as they have for decades. Credential inflation is devaluing the college degree, making graduate degrees, and the greater debt required to pay for them, increasingly necessary for many people to maintain the standard of living they experienced growing up in their parents’ homes. Students are defaulting on their loans at an unprecedented rate, too, partly a function of an economy short on entry-level professional positions. Yet, as with all bubbles, there’s a persistent public belief in the value of something, and that faith in the college degree has kept demand high.
The figures are alarming, the anecdotes downright depressing. But the real story of the American higher-education bubble has little to do with individual students and their debts or employment problems. The most important part of the college bubble story—the one we will soon be hearing much more about—concerns the impending financial collapse of numerous private colleges and universities and the likely shrinkage of many public ones. And when that bubble bursts, it will end a system of higher education that, for all of its history, has been steeped in a culture of exclusivity. Then we’ll see the birth of something entirely new as we accept one central and unavoidable fact: The college classroom is about to go virtual.
Now, let's stipulate that higher education may well be on the cusp of some interesting changes. Let's also stipulate that some colleges appear to have gone on a borrowing binge (though the linked story fails to note that these debt loads have been declining for the past few years). Let's further stipulate that Harden's prediction might well be correct -- though even he acknowledges later in the essay that traditional classroom instruction is a necessary component to a good higher education.
Here's the thing, and it's worth repeating: This. Is. Not. A. Bubble.
Here's a standard definition: a bubble occurs when the prices of securities or other assets rise so sharply, and at such a sustained rate, that they exceed valuations justified by fundamentals, making a sudden collapse likely - at which point the bubble "bursts".
I think it's possible that the first part of this definition might be happening in higher education -- though I'd wager that what's actually happening is that universities are engaging in greater price discrimination and trying to capture some of the wage premium effects from higher education that have built up over the past three decades.
It's the second part of that definition where things don't match up. Unless and until there is a sudden and dramatic shift in the valuation of a college degree, this is simply not like a bubble. From a knowledge perspective, there are far too many professions in the economy where degrees are still considered a necessary condition. From a sociological perspective, there are also far too many people who got to where they are in their careers because of the social capital built up at universities.
I think it's possible that the American system of higher education might be facing what happened to American manufacturing over the past fifty years, in which structural and technological forces caused a slow, steady reduction in the workforce and dramatic improvements in productivity and output. That's something important -- but it's not a bubble.
[Don't you have some skin in this game? Aren't you just defending your interest group?--ed. I teach at a graduate school in which demand for my courses has spiked rather than slowed over the past decade. I'm also a full professor at an elite school. I would personally benefit from the changes that Mead et al are describing. So if I was arguing my own self-interest, I'd be nodding vigorously at what the higher ed bubble gurus are selling.]
Please, let's give "bubble" a break before the term loses all meaning.
Your humble blogger has, on occasion, waxed poetic about Hirschman's accomplishments as a scholar and a writer. His primary area of expertise was in development economics, particularly in Latin America. He was a true giant in the larger study of political economy -- which is why my best-global-political-economy-of-the-year awards are named The Albies. The Social Science Research Council also named a prestigious award after Hirschman:
The Prize recognizes Albert Hirschman's pioneering role in contemporary social science and public policy as well as his life-long commitment to international economic development. Exploring theory and practice, the history of ideas - economic, social or political - and innovative approaches to fostering growth, Hirschman has seen scholarship both as a tool for social change and as an inherent value in a world in need of better understanding. He has written in ways that help social science effectively inform public affairs. His work stands as an exemplar of the necessary knowledge that the Social Science Research Council seeks to develop and the interdisciplinary and international approach in which it works.
What did Hirschman write to earn such honorifics? Well, Exit, Voice, and Loyalty is one of those books that you have to read if you're earning a Ph.D. in any social science; as I've said before, that book was crucial to some of my thinking behind All Politics is Global. Beyond that book, however, Hirschman wrote must-read works on international economic power (National Power and the Structure of Foreign Trade), economic ideas (The Passions and the Interests), political rhetoric (The Rhetoric of Reaction), and the evolution of the social sciences themselves ("Paradigms as a Hindrance to Understanding").
Hirschman's ideas were important, but I'd argue that his writing style was equally important -- clear, lucid, vivid, never a word wasted. As a grad student, I dozed off a lot reading necessary but abstruse journal articles. One did not fall asleep reading Hirschman -- hell, he was better than any energy drink at boosting one's intellectual energies.
He will be missed -- but not forgotten.
Two data points from my morning reads can highlight -- but not prove -- this trend. Exhibit A is a fascinating column by Gillian Terzis in The New Inquiry on the persistence of superstar economists since the 2008 financial crisis. What caught my attention:
[E]conomists have not only retained their prominence in the years since the global financial crisis; they have expanded it. Media-savvy economists have only grown in number, disseminating nuggets of user-friendly economic theory and technocratic liberalism in newspaper columns, blogs, and econo-centric podcasts. Krugman, along with Joseph Stiglitz, Nouriel Roubini, Nassim Taleb, and Jeffrey Sachs have become household names as swaggering political pundits....
With economists becoming mainstream personalities, their econospeak is worming its way deeper into everyday language. Our money is as easily invested as our time: remember to “calculate” your “opportunity cost.” Emotions are “inefficient”: try not to have any. Choosing a restaurant necessarily invokes a “cost-benefit analysis.” Steering the course of one’s life is necessarily about making the right decisions at the right time. And the time for this linguistic evolution is right. In an age of laissez-faire capitalism and precarious labor, what are individuals and corporations doing, if not constantly “re-establishing themselves” as “market players?”....
Underlying all these examples is the idea that a perfunctory understanding of economics, it seems, is society’s best attempt at a code of justice amid endemic institutional dysfunction in political and legislative frameworks. As such, the quotidian economist presents himself (most often, it is a “he”) to audiences as above and beyond the realm of trifling matters like ideology or politics. The everyday economist goes out of his way to portray economics as a social science untouched by politics and ignorant of historical context. But such an approach is at a deliberate remove from the complexity and the uncertainties of modern life. It suggests that because humans are rational thinkers, then our actions can always be predicted, or at least reduced into theoretical epigrams. And so mainstream economics affirms itself as the discipline with an answer to everything, even when financial crises repeatedly underscore the gap between theory and praxis....
Metaphors may make for a great pull-quote, but too often they perpetuate causal simplification. Everyone is assumed to act in a certain fashion under a specified set of conditions, holding all other variables constant. Oversimplifying economic phenomena ignores possible failures and contingencies: how does one account for empathy, altruism, irrationality? Surely, politics must play a part; surely there are objects — sentimental talismans, or the right to decent shelter — to which no market value can be ascribed. It’s beyond the remit of economics to care....
In the online marketplace of ideas, the influence of a few celebrity economists creates an illusion of scarcity of new, heterodox voices. Yet now more than ever, to prevent costly and irreparable policy errors, economics needs its crowded-out Cassandras.
This is such an extreme mixture of fascinating analysis and total bulls**t that your humble blogger really needs to step back and gaze in awe at it. A big problem with Terzis's analysis is that the very "celebrity economists" she cites -- Roubini, Taleb, Stiglitz -- were precisely the economists who were the Cassandras prior to 2008. One would assume that a public intellectual ecosystem that rewards critics who provided trenchant criticism is a good thing. Lamenting their rise seems... odd.
Except that it isn't for Terzis, because she objects to the very idea of a social science that tries to drain the complexity out of modern life in order to model it. Which is a fancy way of saying she objects to social science in principle -- because without simplifying reality a lot, it's simply impossible to model or explain it. In essence, Terzis's argument is that modern society is sooooo complex that radical uncertainty can't be eliminated -- so don't bother.
Terzis is coming at this from a Karl Polanyi-esque place on the left. Meanwhile, on the right, John Podhoretz looks at yesterday's polling in the 2012 presidential race, throws up his hands, and basically says, "Bah!! Numbers!!"
Mark it down on your calendars: Yesterday — Monday, Oct. 8, 2012 — may go down in the annals of history as the day political polling died.
It was the most ridiculous polling day among many preposterous polling days in the course of this long campaign...
The disparity in these numbers and their trends are so broad that even the cautionary method of adding them all together and averaging them out — best done by the Real Clear Politics “poll of polls” — makes little sense....
Pollsters themselves, when challenged on their stats, say they’re just presenting a snapshot of public opinion. Fine, but these snapshots are wildly distorted.
The key hidden fact is that fewer than one in 10 respond to those who try to poll them.
People who screen their calls, hang up on people they don’t know or end the survey because they don’t have time to take it make up more than 90 percent of those phoned by pollsters.
Then there are issues with cellphone users and those who communicate pretty much solely by texts and e-mail, and the like.
All we can be sure of, in the words of the peerless Internet humorist Iowahawk, “political poll results accurately reflect the opinions of the weirdo 9 percent who agree to participate in political polls.”
What yesterday proved is that all bets are off. We’re judging the state of this contest with junk data, and we need to stop. Until pollsters can figure out how to avoid all these crazy mood swings and white noise, they should be put on political and pundit probation.
Yeah!! Until pollsters learn to avoid... um... statistical variance... um... they shouldn't do statistics. And get off my lawn!!
Podhoretz raises some useful points here -- omitting cell phones does introduce a possible bias into polls, as does the sample bias of low response rates. Podhoretz's core complaint, however, is both deeper and pretty friggin' absurd -- there's too much variance!! Stop the madness!!
The whole basis of statistics is that one is attempting to determine what a population thinks by looking at a small sample of that population. Such an exercise inherently introduces sampling variance into the results. One day of particularly wide variance does not spell doom for the polling enterprise. Indeed, as a poll-watcher, what's been striking this election season is not the variance in the poll numbers but the relative lack of it compared to past elections. Both candidates' post-convention "bounces" were modest by historical standards, and the numbers were pretty constant for a long stretch of the summer.
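To put a rough number on that inherent variance, here's a minimal back-of-the-envelope simulation -- my sketch, not anything from Podhoretz or the pollsters, and `run_poll` is a made-up helper -- of twenty honest, identical polls of the same electorate, showing how far the estimates scatter purely by chance:

```python
import random

def run_poll(true_support=0.50, n=1000):
    """Simulate one poll: sample n voters, return the estimated support."""
    return sum(random.random() < true_support for _ in range(n)) / n

random.seed(42)
# Twenty "polls" of the same electorate on the same day, all done correctly.
estimates = [run_poll() for _ in range(20)]

# The textbook 95% margin of error for a proportion p with sample size n:
p, n = 0.50, 1000
moe = 1.96 * (p * (1 - p) / n) ** 0.5  # roughly +/- 3 points

print(f"poll estimates range from {min(estimates):.3f} to {max(estimates):.3f}")
print(f"expected 95% margin of error: +/- {moe:.3f}")
```

Even with a fixed electorate and zero methodological sins, individual polls routinely land two or three points apart -- which is the "white noise" Podhoretz mistakes for junk data.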
Look, I get that social scientists are easy to mock and ridicule, and Lord knows, we make mistakes. Acknowledging fundamental levels of uncertainty and unknowability is a healthy thing to do. Going from that acknowledgement to rejecting the enterprise of social science entirely -- as both Terzis and Podhoretz do in their essays -- is really, really stupid.
Now get off my lawn.
Now is the time of year when students go to citadels of higher learning and hopefully learn some stuff instead of getting bogged down in weird cheating scandals. Coincidentally enough, this past month there's also been a lot of talk about how impressionable young people often get enamored with Ayn Rand and isn't that awful or something.
These laments miss the point of how 18-year-olds encounter the world of ideas in college. That is the age when they are expected to seriously think about ideas for the first time. They will crave ideas that will bake their noodle -- or at a minimum, that's the time when they should have their worldviews rocked every few weeks or so. If not Rand, then whom?
In your blogger's humble opinion, there's another book that is celebrating its 50th anniversary and remains far more earth-shattering in its intellectual effects. A few weeks ago the Guardian's John Naughton celebrated Thomas Kuhn's The Structure of Scientific Revolutions with an astute essay on its significance. The highlights:
Kuhn's version of how science develops differed dramatically from the Whig version. Where the standard account saw steady, cumulative "progress", he saw discontinuities – a set of alternating "normal" and "revolutionary" phases in which communities of specialists in particular fields are plunged into periods of turmoil, uncertainty and angst. These revolutionary phases – for example the transition from Newtonian mechanics to quantum physics – correspond to great conceptual breakthroughs and lay the basis for a succeeding phase of business as usual. The fact that his version seems unremarkable now is, in a way, the greatest measure of his success. But in 1962 almost everything about it was controversial because of the challenge it posed to powerful, entrenched philosophical assumptions about how science did – and should – work....
Kuhn's central claim is that a careful study of the history of science reveals that development in any scientific field happens via a series of phases. The first he christened "normal science" – business as usual, if you like. In this phase, a community of researchers who share a common intellectual framework – called a paradigm or a "disciplinary matrix" – engage in solving puzzles thrown up by discrepancies (anomalies) between what the paradigm predicts and what is revealed by observation or experiment. Most of the time, the anomalies are resolved either by incremental changes to the paradigm or by uncovering observational or experimental error. As philosopher Ian Hacking puts it in his terrific preface to the new edition: "Normal science does not aim at novelty but at clearing up the status quo. It tends to discover what it expects to discover."
The trouble is that over longer periods unresolved anomalies accumulate and eventually get to the point where some scientists begin to question the paradigm itself. At this point, the discipline enters a period of crisis characterised by, in Kuhn's words, "a proliferation of compelling articulations, the willingness to try anything, the expression of explicit discontent, the recourse to philosophy and to debate over fundamentals". In the end, the crisis is resolved by a revolutionary change in world-view in which the now-deficient paradigm is replaced by a newer one. This is the paradigm shift of modern parlance and after it has happened the scientific field returns to normal science, based on the new framework. And so it goes on.
This brutal summary of the revolutionary process does not do justice to the complexity and subtlety of Kuhn's thinking.
He's right -- read the whole thing. I've blogged before about why Kuhn is equally important to social science here and here. To put this into words that today's millennial generation can comprehend: the effect of reading Thomas Kuhn on an 18-year-old is like the moment when Neo realizes there is no spoon.
One's education about how science works shouldn't stop with Kuhn -- there have been some worthy responses to him -- but it's a great place to start.
In a futile -- er, determined -- effort to expand his public intellectual brand, your humble blogger was on the Melissa Harris-Perry show on MSNBC for about an hour this AM. It was a veeeeery interesting experience. Now, I know that some readers of this blog aspire to poison me and take my place on the FP masthead -- that is, aspire to punditry themselves. So, as a public service, this is what happens when you go on a Sunday morning talk show: imagine the Law & Order chunk-CHUNK! going after each paragraph:
WEDNESDAY: I receive a polite email from one of Melissa's segment/booking producers, who we'll call "M", asking if I want to be on the show this Sunday. Having never done this kind of punditry before, and having my previous obligation for the weekend cancelled, I accept.
Immediately after accepting, I experience two contradictory emotions. First, the fear builds that they will email back the same day and say, "uh, we just re-checked our Rolodex, and.. [laughs] we have no idea who thought you merited being on television." At the same time, I realize I'm going to have to watch the rest of the prime-time Republican National Convention speeches. At which point I let loose a string of profanities [What if it had been the DNC instead?--ed. The same number of profanities would have been released.]
THURSDAY: I talk with M again, who runs down the planned topics with me -- the RNC, the DNC, the state of the election, and gun violence. Immediately I have a slight quandary -- my expertise is in international relations, not electoral politics. Do I dare to think I can play in the sandbox of a Sunday morning talk show? At which point my inner media whore screams, Gollum-like, "YES!! WE WANTS TO BE ON THE TV!!! IT'S OUR PRECIOUS!!"
This is, as near as I can determine, the First Commandment of Television Punditry -- you have to be secure enough in your abilities to chat with authority about issues outside your intellectual wheelhouse.
FRIDAY: To make my inner media whore proud, I start reading up on gun violence statistics. I also read as much commentary as I can about the Republican National Convention. Whatever intelligence I gain from the former I lose by reading the latter.
SATURDAY: Full disclosure: Melissa and I were political science colleagues at the University of Chicago back in the day, so I'm familiar with her style and her politics. That said, I haven't been a regular watcher of her MSNBC show, so I tune in for the first half-hour to get a sense of the roundtable. Clearly MHP leans juuuuust a little to the left. This raises an identity issue -- am I supposed to be wearing the hat of "defender of conservatism"? It's not a role I've been comfortable with as of late. Or am I gonna wear the "dispassionate, snarky observer of the political scene" hat? I feel much more comfortable with that hat on. In all likelihood, it's going to be a little from column A and a little from column B.
LATER ON SATURDAY: If you're going to be on a roundtable for radio/television, there's a 98% chance you will need to do a "pre-interview" with a producer so they have a sense of what you're going to say during the conversation. So I have that conversation with M, during which I learn that the topics have been jiggled around a bit and all those minutes -- er, hours -- of sketchy -- er, detailed -- online research into gun violence stats won't be worth much. Instead, we'll be talking gay marriage. Which is fine, I will rally with yet more Wikipedia-surfing -- er, intense study of primary texts. And, of course, that segment will be moved as well. So, the Second Commandment of Television Punditry is that the schedule will always change.
EVEN LATER ON SATURDAY: The MHP show is classy -- they took care of all the flight/hotel/car service logistics. So by Saturday night I found myself in a comfortable midtown hotel with a lovely view of Central Park. Surfing the web furiously to research for Sunday AM, I see that some fireworks broke out on MHP's Saturday show after I switched off to pack. Melissa was responding to the panel's most conservative participant making a point. Ruh-roh.
To prep, I take notes on each of the "blocks" or segments that I'm supposed to weigh in on. I'm a professor and an academic, and therefore always feel better with notes.
SUNDAY MORNING PRE-SHOW: I get to 30 Rock on time, which means I'm the first panelist there.... or so I think. In actuality, the two female panelists -- NYU's Cristina Beltran and the Center for American Progress' Aisha Moodie-Mills -- are in makeup. As they enter the green room, I quickly comprehend that I am in deep trouble, because I am not nearly as pretty. As the clock ticks towards 10 AM, and no one comes to get me, I am petrified that I have been typecast as the splotchy-faced redneck surrounded by urbane alluring panelists. Before this scenario can play out in even weirder directions inside my head, however, M sends me to makeup. They apply enough powder to my face to fuel at least two Dunkin' Donuts franchises -- but it works. My family later informs me that I looked good, so many, many thanks to those professionals at MSNBC.
SUNDAY MORNING, 10 AM-11 AM: OK, having done my first hour of Sunday morning chat -- which you can watch here, here, here and here -- I have learned the following effects on one's senses when three cameras and numerous klieg lights are pointed in your general direction:
A) Time speeds up. Seriously, that hour flew by. Every time I was feeling like we were getting to a good part of the conversation, we hit a commercial break.
B) Almost all prep work is useless. I knew exactly what I wanted to say in response to the first question asked, and I did that competently. After that, whatever good punchy things were in my notes might as well have been left back at home for all the use they were to me. Part of the issue is visual -- you don't want to be looking down at your notes. Another part of the issue, which I've blogged about before, is the academic weakness of trying to directly answer the question.
C) Really, cameras can make you stupid. If you mangle your words during an ordinary conversation, or even when giving a speech, you can take a moment and regroup. If you mangle your words during a panel show, you become acutely aware that you're screwing up, which compounds the problem. Another panelist will be happy to enter the breach. On at least two occasions, I had written down a better answer than the one that came out of my mouth during the program.
SUNDAY MORNING, 11 AM: I need to get to the airport to come home, but I find two pieces of critical feedback from the experience worthwhile. The first is the thumbs-up that Moodie-Mills and her partner give me as I exit the green room. The second is an anonymous email I soon find in my inbox:
Drezner........Why are you such an idiot? Your appearance on MSLSD was laughable and embarrassing. It's really sad and embarrassing how you liberal bedwetters continue to be brainwashed by your worthless president. Why can't you clowns form a coherent thought on your own? Exactly, because you buffoons are mindless robots programmed by your worthless president. Barnum and Bailey have nothing on you clowns. What do you call a basement full of liberals? A whine cellar. Keep up the good work loser................
Why, it's... it's a troll!! I've made it!! I'M A REAL PUNDIT NOW!!!!!
In the last few weeks, Fareed Zakaria and Niall Ferguson have found themselves at the centers of controversy. As someone who has written a thing or two about public intellectuals, I confess to finding it all very fascinating. What's striking to me is the vehemence on all sides. Brad DeLong is an excitable sort, but calling for Harvard to fire Niall Ferguson for tendentious matters unrelated to his scholarly work seems... a bit much. Last week the Washington Post ran a story falsely accusing Zakaria of another act of plagiarism... without independently checking to see if the charge had any validity.
On the other hand, the defenses that have been mounted also seem a bit over the top. Tunku Varadarajan defended Zakaria in Newsweek with an essay that bordered on the sycophantic, all the while accusing Zakaria's accusers of simple envy:
What one has seen in the past few days can only be described as a hideous manifestation of envy—Fareed Envy. Henry Kissinger’s aphorism about academia (where the “politics are so vicious precisely because the stakes are so small”) applies with delicious tartness to journalism, where media reporters of the kind who hounded Zakaria occupy the lowest rung and exult at the prospect of pulling people down. Zakaria, by contrast, is insanely successful by the standards of his profession: he has a TV show to which few people of any prominence would refuse an invitation, plus columns at Time, CNN.com, and The Washington Post. He also writes academic-lite books that presidents clutch as they clamber aboard planes, and gives speeches at—it is said—$75,000 a pop. He is as much a brand as he is a journalist: he has “inc.” in his veins.
Zakaria himself responded to the Post's bogus second charge of plagiarism in a somewhat curious manner. Here's what he told them:
Zakaria, in an interview Monday, defended the practice of not attributing quotes in a popular book. “As I write explicitly [in the book], this is not an academic work where everything has to be acknowledged and footnoted,” he said. The book contains “hundreds” of comments and quotes that aren’t attributed because doing so, in context, would “interrupt the flow for the reader,” he said.
He compared his technique to other popular non-fiction authors. “Please look at other books in this genre and you will notice that I'm following standard practice,” he said.
“I should not be judged by a standard that's not applied to everyone else,” he added. “People are piling on with every grudge or vendetta. The charge is totally bogus.”
Ferguson responded to his critics in a similar fashion:
The other day, a British friend asked me if there was anything about the United States I disliked. I was happily on vacation and couldn’t think of anything. But now I remember. I really can’t stand America’s liberal bloggers....
My critics have three things in common. First, they wholly fail to respond to the central arguments of the piece. Second, they claim to be engaged in “fact checking,” whereas in nearly all cases they are merely offering alternative (often silly or skewed) interpretations of the facts. Third, they adopt a tone of outrage that would be appropriate only if I had argued that, say, women’s bodies can somehow prevent pregnancies in case of “legitimate rape.”
Their approach is highly effective, and I must remember it if I ever decide to organize an intellectual witch hunt. What makes it so irksome is that it simultaneously dodges the central thesis of my piece and at the same time seeks to brand me as a liar.
I'd feel more sympathy for Ferguson if his term "liberal blogosphere" didn't obfuscate the fact that a Nobel Prize-winning economist is rebutting Ferguson on his use of facts, and if Ferguson hadn't compounded his economic errors in a Bloomberg interview.
So what the hell is going on?
I think there are three interlocking things going on that explain why everyone feels so cranky. The first, as I alluded to in my Zakaria post, is that the economics of superstars has now reached the world of public intellectuals. There's been a lot of talk about "brands" recently, and it gets at how the rewards for intellectual output have expanded at the upper strata:
Not that long ago, getting a column in Time would have been the pinnacle of a journalist’s career. But expectations and opportunities have grown in the last few years. Many writers now market themselves as separate brands, and their journalism works largely as a promotion for more lucrative endeavors like writing books and public speaking.
Replace "journalist" with "intellectual" and that paragraph still works. Credentialed thinkers like Zakaria and Ferguson, once they've reached the top, become brands that can multiply their earning potential far more than was the case fifty years ago. The ways in which the Internet concentrates attention on a Few Big Things means that if you are good and lucky enough to become one of those Big Things, money will rain down on your door. Over at Esquire, Stephen Marche proffered this explanation for what he would call Ferguson's intellectual devolution:
The real issue isn't the substance of Ferguson's argument, though, which is shallow and basically exploded by this point in time. It isn't even the question of how such garbage managed to be written and published. It is, rather, why did Ferguson write it? The answer is simple but has profound implications for American intellectual life generally: public speaking.
Ferguson's critics have simply misunderstood for whom Ferguson was writing that piece. They imagine that he is working as a professor or as a journalist, and that his standards slipped below those of academia or the media. Neither is right. Look at his speaking agent's Web site. The fee: 50 to 75 grand per appearance. That number means that the entire economics of Ferguson's writing career, and many other writing careers, has been permanently altered. Nonfiction writers can and do make vastly more, and more easily, than they could ever make any other way, including by writing bestselling books or being a Harvard professor. Articles and ideas are only as good as the fees you can get for talking about them. They are merely billboards for the messengers.
That number means that Ferguson doesn't have to please his publishers; he doesn't have to please his editors; he sure as hell doesn't have to please scholars. He has to please corporations and high-net-worth individuals, the people who can pay 50 to 75K to hear him talk. That incredibly sloppy article was a way of communicating to them: I am one of you. I can give a great rousing talk about Obama's failures at any event you want to have me at.
Now, railing at the One Percent aside (*cough* Esquire's target demographic *cough*), Marche is really onto something here. I've heard from a few sources that Ferguson resigned his professorship at Harvard Business School (but not Harvard University) because he calculated that if he gave four or five extra talks a year, he could earn his HBS salary without all the tedious teaching obligations.
Zakaria and Ferguson got to where they are by dint of their own efforts, but the thing about the superstar phenomenon is that there's also an element of caprice involved. The gap between Zakaria and Ferguson and their replacement-level deep thinkers is pretty narrow; the gap in the financial and intellectual rewards is pretty vast.
So I suspect that there is a bit of jealousy in some of the criticisms being leveled. These guys earn many multiples of the median intellectual income -- and I guarantee you that the median intellectual doesn't think that either Ferguson or Zakaria is many times smarter. That's gonna stir up some petty and not-so-petty resentments.
The top tier of public intellectuals are doing well in this world, and the best are pretty savvy at marketing their ideas across multiple platforms in a Web 2.0 world. But the same dynamics that push these people to the top also increase their vulnerability to intellectual criticism -- and this is the second thing that's going on here. As I noted a few years ago:
The most useful function of bloggers is when they engage in the quality control of other public intellectuals. [Richard] Posner believed public intellectuals were in decline because there was no market discipline for poor quality. Even if public intellectuals royally screw up, he argued, the mass public is sufficiently disinterested and disengaged for it not to matter. Bloggers are changing this dynamic, however. If Michael Ignatieff, Paul Krugman or William Kristol pen substandard essays, blogs have and will provide a wide spectrum of critical feedback.
One can clearly add Niall Ferguson and Fareed Zakaria to this list. Furthermore, the very act of trying to market ideas across platforms -- and the constant drive to generate new content -- leaves these intellectuals vulnerable to criticism. They can get sloppy, like Zakaria, and commit a near-fatal error. They can be tendentious in their use of facts, like Ferguson, and suffer reputational damage. Or, they can simply debase themselves to the point where Evgeny Morozov goes medieval on them.
For high-flying intellectuals, this kind of public criticism clearly wounds. What the superstar phenomenon gives, it can also threaten to take away (though, to be honest, scandals and bad writing don't seem to actually take away rewards all that often). But in the mind of top-tier public intellectuals, effort and intellect drive their accomplishments, not fortuna. They see online criticism and interpret it as jealousy, pettiness and ideological score-settling. A lot of the time that's exactly what it is -- but the online intellectual ecosystem is also pretty good at fact-checking and substantive criticism. Public intellectuals don't see that these kinds of criticisms are the flip-side of the very phenomenon that is enriching them in the first place. They also don't realize that in a Web 2.0 world, mere bloggers can fact-check them and scorn them for a lack of citation.
Which leads to the last thing that I think is going on: this superstar phenomenon is invading one of the last spheres of life where money is not necessarily the Most Important Thing. Getting a Ph.D. means being socialized into a world where an academic job is considered more respectable than becoming a consultant that earns gazillions more in money. The currency in the academic economy is intellectual respect. Even if public criticism doesn't affect their real-world income, it does affect their intellectual standing. Even if Zakaria has left the academy, and Ferguson can "transcend" it, they were socialized into this value system, and they clearly care what their peers think.
Zakaria's argument that general nonfiction shouldn't be held to the standards of academic discourse rankles academics who know that he should know better -- the first instinct of any person with graduate training is to read the literature and cite, cite, cite. As my friend Delia Lloyd put it: "I find him culpable because Zakaria comes from the world of academia.... Plagiarism may not be a major moral failing... in the university setting in which Zakaria was trained and credentialed, it’s pretty much one of the worst crimes you can commit."
As for Ferguson, Timothy Burke blogs about what it is exactly about Ferguson's career arc that nettles him:
Ferguson would feel more like he was still within the bounds if he either investigated his own distaste for Obama in more reflective, philosophical and recursive ways or if he was willing to lay out a generalized, prescriptive theory of political leadership that didn’t fitfully move the goalposts on intensely granular or particular issues every few seconds. Why? Because I think scholarship requires some measure of self-aware and reflective movement between what you know and what you believe, and the relationship between your own movements and those of your professional peers... A scholar has to believe on some level that things are known or understood only after being investigated, tested, read, interpreted, that there’s something unseemly about robbing the graves and morgues for cast-off “facts” in order to assemble them into a shambling, monstrous conclusion built from a hackish blueprint. Being an intellectual takes some form of thoughtfulness, some respect for evidence and truth, something that goes beyond hollow, sleazy rhetoric that plays dumb every time it gets caught out truncating quotes or doctoring charts. Being an expert means you guide an audience through what is known and said about a subject with some respect for the totality of that knowing and saying before favoring your own interpretation.
Public intellectuals who have PhDs do not want to lose their standing as scholars. Sure, they can gin up psychological defenses against the hidebound ivory tower, but criticism like the one quoted above will leave a permanent mark. They'll have their riches, but they won't have what they were trained to crave more than anything -- respect.
In the end, what I think is going on is that, contra Russell Jacoby, top-tier public intellectuals have acquired greater power than they used to possess. What they resist on occasion is the responsibility that comes with that power.
So that's what I think is going on. What do you think?
P.S. I think one of the best compliments I've ever received is that Justin Fox independently and simultaneously arrived at a very similar intellectual destination on this topic.
Time Magazine columnist and CNN host Fareed Zakaria has apologized "unreservedly" to Jill Lepore for plagiarizing her work in The New Yorker.
"Media reporters have pointed out that paragraphs in my Time column this week bear close similarities to paragraphs in Jill Lepore's essay in the April 22nd issue of The New Yorker. They are right," Zakaria said in a statement to The Atlantic Wire. "I made a terrible mistake. It is a serious lapse and one that is entirely my fault. I apologize unreservedly to her, to my editors at Time, and to my readers."
Zakaria's column about gun laws for Time's August 20 issue includes a paragraph that is remarkably similar to one Jill Lepore wrote in April for a New Yorker article about the National Rifle Association. (The similarities were first flagged by NRANews.com and first reported by Tim Graham of the conservative watchdog group Newsbusters, who leveled the plagiarism charge.)
Time suspended Zakaria for a month, CNN suspended him from his GPS hosting duties pending further review, and the Washington Post is looking into his work there. Rodger Payne has a useful round up of the relevant links.
Once the news broke, there was a whole lotta Twitter speculation about how and why this happened. Many media types assume that this was a mistake made by one of Zakaria's flunkies/assistants/interns, but in some ways that's just the proximate cause. A better question would be: why would Fareed Zakaria outsource any writing under his name to others?
I used to think that doing this kind of thing required willful negligence on the part of a writer. Now my view has changed a bit. It's still negligence, but with only a fraction of Zakaria's writing obligations, I can see all too clearly how this happened. To paraphrase Chris Rock, I'm not saying I approve... but I understand.
The New York Times lists Zakaria's day jobs, and they're formidable: "Mr. Zakaria, 48, balances a demanding schedule, doing work for multiple media properties. He is a CNN host, an editor at large at Time, a Washington Post columnist and an author."
Most people who wind up in this situation don't just snap their fingers and take on all of these jobs at once. It's a slow accretion of opportunities that are hard to say no to until you are overextended. I'm not remotely close to being a member of the League of Extraordinary Pundits like Zakaria. Still, even I've noticed that, as writing & speaking obligations pile up, corners get... well, let's say rounded rather than cut.
I suspect that, as one has more gobs of money tossed at them than they ever expected out of life -- er, approaches League status -- three factors dramatically increase the likelihood of this kind of thing happening. First, since the distribution of punditry assignments likely follows a power law, superstars are asked to write a lot more, and the pressure builds up. Second, to compensate, the pundit has to hire a staff -- and most people who get into the writing/thinking business are lousy at managing subordinates and staff. Third, if small shortcuts aren't caught the first time a writer uses them, they become crutches that pave the way for bigger shortcuts, which then become cheats.
None of this is to excuse Zakaria for what he did. It just makes me very sad. I enjoyed his first book, and I've enjoyed Fareed Zakaria GPS because it's one of the few Sunday morning shows devoted to international affairs. It didn't air this Sunday because of what happened.
I hope the show goes on, with or without Zakaria. And either way, I hope whoever hosts it learns from this mistake.
I read Chris Hayes' Twilight of the Elites last month and will suggest that you read it too -- it's an engaging read that addresses the question of whether a meritocratic elite can really stay meritocratic over extended periods of time. Hayes thinks the answer is no, and puts together a decent brief for that case. It's a good book in no small part because Hayes acknowledges his inner conflict -- as disgusted as he is with Enron, Lehman, Katrina, Penn State, Iraq and other elite catastrophes, he has peered into the maw of the populists who rail against these elites, and they give him a slight shudder as well.
I bring this up because David Brooks pushes back against Hayes' argument in his New York Times column today. One key section:
The corruption that has now crept into the world of finance and the other professions is not endemic to meritocracy but to the specific culture of our meritocracy. The problem is that today’s meritocratic elites cannot admit to themselves that they are elites.
Everybody thinks they are countercultural rebels, insurgents against the true establishment, which is always somewhere else. This attitude prevails in the Ivy League, in the corporate boardrooms and even at television studios where hosts from Harvard, Stanford and Brown rail against the establishment.
As a result, today’s elite lacks the self-conscious leadership ethos that the racist, sexist and anti-Semitic old boys’ network did possess. If you went to Groton a century ago, you knew you were privileged. You were taught how morally precarious privilege was and how much responsibility it entailed. You were housed in a spartan 6-foot-by-9-foot cubicle to prepare you for the rigors of leadership.
The best of the WASP elites had a stewardship mentality, that they were temporary caretakers of institutions that would span generations. They cruelly ostracized people who did not live up to their codes of gentlemanly conduct and scrupulosity. They were insular and struggled with intimacy, but they did believe in restraint, reticence and service.
Kevin Drum pushes back hard, and correctly in my view, against this argument:
Hayes does a good job of describing all the pathologies of today's meritocratic aristocracy, but his book never seriously addresses all the pathologies of past aristocracies, meritocratic or otherwise. You're left thinking that cheating and corruption and nepotism are somehow unique to the 21st century West. But not only is none of that stuff unique, it's not clear that it's even any worse than it used to be....
Brooks, if anything, is worse on this score. He's careful to admit the problem with the elites of the 19th century, but even so he idealizes them. Sure, the best of the old WASP elites were good people in a noblesse oblige sort of way, but the best of any set of elites are good people. Today's meritocracy is loaded with fine, upstanding citizens. The problem is that they're a minority. But the upstanding folks were a minority back in the days of the WASP aristocracy too.
I'd make one further point, which is that, likely since the start of the Industrial Revolution, elites have felt like insurgents. George Kennan, for example, is as much of a paragon of the Eastern Establishment as you can get -- but he always thought of himself as an outsider.
Most of the obituaries for the public intellectual suffer from the cognitive bias that comes with comparing the annals of history to the present day. Over time, lesser intellectual lights tend to fade from view -- only the canon remains. When one looks back at only the great thinkers, it is natural to presume that all of the writers from a bygone era are great. Even when looking at the intellectual giants of the past, current public commentary is more likely to gloss over past intellectual errors and instead focus on their greatest moments. Francis Fukuyama's The End of History and the Last Man might look wrong in retrospect, but it is not more wrong than Daniel Bell's The End of Ideology. Intellectuals like Sontag or Friedman occupy their exalted status in the present only because they survived the crucible of history. As Posner acknowledges, "One of the chief sources of cultural pessimism is the tendency to compare the best of the past with the average of the present, because the passage of time filters out the worst of the past." It is riskier to assess the legacies of current public intellectuals -- their ability to misstep or err remains.
It's always useful to remember that the first thirty centuries of human history were one long slog of poverty, misery and violence. By and large, things have gotten much better. This isn't to excuse the errors of today's elites -- but context matters.
Your humble blogger is not naive in the ways of punditry. He is keenly aware that the only way to move up the punditry food chain is to bemoan the crumbling state of America's infrastructure while pining for better high-speed rail, better schools, and ORDER, dammit!!
In the interest of serving the greater good, your humble blogger has decided to do the crucial pundit fieldwork necessary to adopt this position. I am therefore taking the Acela "hi speed" train from Washington, DC, to New York City, and shall chronicle every moment of import along the way in this blog post. So buckle your seat belts -- it's going to be a bumpy ride:
8:10 AM: Part of the pundit code is getting into a local taxi and getting colorful quotes from the driver. Alas, my cabbie was not the chatty type. Also, despite the morning rush-hour time, there wasn't a lot of sitting around time. Oh, and his cab was clean too. Clearly, Washington DC is receiving favored treatment in its infrastructure.
8:35 AM: I get to Union Station to find much of it being renovated. There are cranes and construction equipment everywhere! What is this, Shanghai?! Of course, in the Far East, they're just building new things, whereas here in the decaying United States, we're trying to preserve our crumbling monuments to modernity [Oh, that is Pulitzer GOLD, baby!!--ed.]
8:40 AM: I want to get coffee from Starbucks, but the Acela line has already started forming. I bypass the coffee to make sure I get a good seat. Anger at stupid American regulations... rising!!
9:00 AM: On the train, I hold my breath as I try to access Acela's wifi. Many an expletive has been tweeted in anger at this unreliable system. In my case, however, it opens with no difficulty. There is a warning page informing me that, for myriad reasons, the wifi might cut in and out and it can't access certain pages. Still, Amtrak's web service has jumped up a notch since the last time I took the Acela... or, again, the NYC-DC corridor gets preferential treatment compared with the Boston trains. Note to self: hire eager-beaver grad student to unearth Amtrak perfidy.
9:10 AM: I can't access YouTube. That's it, this is the worst f***ing WiFi service I've ever encountered. There's no WAY this would happen in China!!!
9:20 AM: Well, the Acela reveals itself to be erratic, as it starts to slow down from its pathetically low "hi speed" -- oh, it's stopping at the BWI station. Never mind.
9:33 AM: Sure, I could have opted for the quiet car, but I wanted to mix with "the people," get a sense of what they're talking about amongst themselves. So far, they're talking about... PowerPoint presentations. There's a column in here somewhere...
10:00 AM: So far, the train has been on time, the WiFi has worked, and even the non-quiet car has been pretty sedate. Friedman's Rage is not building. [Bye-bye Pulitzer!!--ed.] No, wait, the train ride is kinda bumpy. Very bumpy at times. Kind of like... like... the American body politic!! [Atta boy! You're back in the game!--ed.]
10:20 AM: The WiFi cut out for, like, 10 minutes south of Wilmington. How sad and pathetic for America. Why, if this had happened in, say, Chongqing, at least one train bureaucrat would have been executed and one British hedge-fund manager would have been poisoned to set an example for other trains.
10:39 AM: The WiFi is becoming erratic again, causing additional mutterings from other passengers in my car. One of them says "This would never happen in Michael Bloomberg's America!!" #notreally.
11:35 AM: The train has arrived in Newark. I look around. God, I miss China.
11:45 AM: Your pundit's long morning nightmare has come to an end on a gorgeous day in Manhattan. I learned a lot about America on this trip, but even more importantly... I learned a lot about myself. [Stick that in your pipe and smoke it, Aaron Sorkin!!--ed.]
David Brooks' New York Times column this AM is a riff off of a Peter Thiel lecture about the odd relationship between capitalism and competition. Thiel's point is that our meritocratic society conditions individuals to compete -- for admission to good schools, good jobs, and so forth -- when, in fact, the goal of capitalists should be to innovate their way to the capitalist holy grail -- a temporary monopoly.
Brooks takes this argument and runs with it:
students have to jump through ever-more demanding, preassigned academic hoops. Instead of developing a passion for one subject, they’re rewarded for becoming professional students, getting great grades across all subjects, regardless of their intrinsic interests. Instead of wandering across strange domains, they have to prudentially apportion their time, making productive use of each hour.
Then they move into a ranking system in which the most competitive college, program and employment opportunity is deemed to be the best. There is a status funnel pointing to the most competitive colleges and banks and companies, regardless of their appropriateness.
Then they move into businesses in which the main point is to beat the competition, in which the competitive juices take control and gradually obliterate other goals. I see this in politics all the time. Candidates enter politics wanting to be authentic and change things. But once the candidates enter the campaign, they stop focusing on how to be change-agents. They and their staff spend all their time focusing on beating the other guy. They hone the skills of one-upsmanship. They get engulfed in a tit-for-tat competition to win the news cycle. Instead of being new and authentic, they become artificial mirror opposites of their opponents. Instead of providing the value voters want — change — they become canned tacticians, hoping to eke out a slight win over the other side.…
You know somebody has been sucked into the competitive myopia when they start using sports or war metaphors. Sports and war are competitive enterprises. If somebody hits three home runs against you in the top of the inning, your job is to go hit four home runs in the bottom of the inning.
But business, politics, intellectual life and most other realms are not like that. In most realms, if somebody hits three home runs against you in one inning, you have the option of picking up your equipment and inventing a different game. You don’t have to compete; you can invent (emphasis added).
Now, there's definitely a strong element of truth to Brooks' point. True innovators in many fields search for the genuinely new idea and then run with it. I, for one, have a (temporary) monopoly in the international relations theory and zombies market. So I get what Brooks is trying to say here.
And yet, in the end, I think this is a crap argument, for a few reasons:
1) Brooks too neatly divides the innovation from the competition elements of market life. Indeed, the company that made Thiel super-rich -- Facebook -- is exhibit A for this point. Facebook didn't really innovate anything that MySpace or other social networking sites hadn't done already. Rather, because social networking is an arena where greater size means greater profitability, Facebook managed to beat its competitors at gaining market share. It did this through a few bells and whistles, but Facebook did not "create a new market and totally dominate it," as Brooks would put it.
2) Brooks has a tendency to conflate different dimensions of social activity as if they're one and the same, and that bolded sentence is exhibit A of that. In point of fact, a politician usually can't invent a different game -- or, if s/he does, it's often called a coup or a revolution. Brooks is clearly disgusted with the ticky-tack, news-cycle, tweet-length tactical fights of politics, and I can't say I blame him. But Brooks of all people should know that politics is also about Very Big Arguments that cannot be avoided. Say what you will about either Barack Obama or the GOP House of Representatives, but over the past two years they've been having The Big Argument. I don't think either of them can risk abandoning the intellectual competition (though maybe this is another option).
Similarly, for someone schooled in intellectual life, walking away from an argument means you've lost the argument. That's not the end of the world, but given that arguing about ideas is what intellectuals do, it's not great either. I think Thiel's argument works well for business -- but business is manifestly not like other spheres of life.
3) It's worth noting that Brooks and Thiel mistakenly conclude that if the meritocratic system were adjusted, the best and the brightest would become better entrepreneurs. I question that hypothesis. Here's the pithy Larry Summers observation that's worth remembering. Summers was actually making an argument consistent with Brooks and Thiel, pushing back against the Amy Chua "Tiger Mom" silliness:
"Which two freshmen at Harvard have arguably been most transformative of the world in the last 25 years?" he asked. "You can make a reasonable case for Bill Gates and Mark Zuckerberg, neither of whom graduated." If they had been the product of a Tiger Mom upbringing, he added, their mothers would probably have been none too pleased with their performance.
The A, B and C alums at Harvard in fact could be broadly characterized thus, he said: The A students became academics, B students spent their time trying to get their children into the university as legacies, and the C students—the ones who had made the money—sat on the fund-raising committee.
The thing is, these groups are more self-selecting than Summers, Brooks and Thiel believe. Which means altering the meritocratic incentives won't change all that much.
Am I missing anything?
Last Wednesday Thomas Friedman wrote a very silly column in which he called for Michael Bloomberg to enter the presidential race because he had an annoying experience at Union Station -- er, because he thinks the United States needs a real leader:
[W]ith Europe in peril, China and America wobbling, the Arab world in turmoil, energy prices spiraling and the climate changing, we are facing some real storms ahead. We need to weatherproof our American house — and fast — in order to ensure that America remains a rock of stability for the world. To do that, we’ll have to make some big, hard decisions soon — and to do that successfully will require presidential leadership in the next four years of the highest caliber.
This election has to be about those hard choices, smart investments and shared sacrifices — how we set our economy on a clear-cut path of near-term, job-growing improvements in infrastructure and education and on a long-term pathway to serious fiscal, tax and entitlement reform. The next president has to have a mandate to do all of this.
But, today, neither party is generating that mandate — talking seriously enough about the taxes that will have to be raised or the entitlement spending that will have to be cut to put us on sustainable footing, let alone offering an inspired vision of American renewal that might motivate such sacrifice. That’s why I still believe that the national debate would benefit from the entrance of a substantial independent candidate — like the straight-talking, socially moderate and fiscally conservative Bloomberg — who could challenge, and maybe even improve, both major-party presidential candidates by speaking honestly about what is needed to restore the foundations of America’s global leadership before we implode.
The Twitterati and blogosphere reaction to Friedman's argument tended towards the scathing, and now we're beginning to see the responses elaborated to op-ed length. This smart essay, for example, makes the very trenchant point that in a political structure with so many veto points, so much political polarization and so many entrenched interests, it is next to impossible for any one leader to reform the system on the scale that Friedman proposes:
A system with as many checks and balances built into it as ours assumes — indeed requires — a certain minimum level of cooperation on major issues between the two parties, despite ideological differences. Unfortunately, since the end of the cold war, which was a hugely powerful force compelling compromise between the parties, several factors are combining to paralyze our whole system.
For starters, we’ve added more checks and balances to make decision-making even more difficult — such as senatorial holds now being used to block any appointments by the executive branch or the Senate filibuster rule, effectively requiring a 60-vote majority to pass any major piece of legislation, rather than 51 votes. Also, our political divisions have become more venomous than ever....
We can’t be great as long as we remain a vetocracy rather than a democracy. Our deformed political system — with a Congress that’s become a forum for legalized bribery — is now truly holding us back.
Congratulations to present-day Thomas Friedman -- for effectively refuting past Tom Friedman.
I think it's safe to say that the vampire squid -- er, Goldman Sachs -- brand has taken a few hits in recent years. To add to the calumny, Greg Smith, an executive director and head of the firm’s United States equity derivatives business in Europe, the Middle East and Africa, is leaving Goldman today after publishing an op-ed in the New York Times explaining why he's leaving. It's not pretty:
Today is my last day at Goldman Sachs. After almost 12 years at the firm — first as a summer intern while at Stanford, then in New York for 10 years, and now in London — I believe I have worked here long enough to understand the trajectory of its culture, its people and its identity. And I can honestly say that the environment now is as toxic and destructive as I have ever seen it.
Man, that best-of-luck office party is going to be awkward.
The fact that this op-ed has already spawned its own satire suggests that it's not going to have much of an effect on the larger debate on Goldman Sachs. Which is a shame, because such a debate would be pretty useful when thinking about Big Finance (though see Gabriel Sherman's excellent New York essay on this topic from a few weeks back). Indeed, this is a teachable moment for how to compose a memo, or a mission statement, or an op-ed that will provoke a deep debate over corporate culture. Let's see where Smith went wrong:
1) He made it all about himself. The ostensible point of this exercise is to shine a light on a shady corporate culture that values sins over virtues. In these instances, the following paragraph should never appear:
My proudest moments in life — getting a full scholarship to go from South Africa to Stanford University, being selected as a Rhodes Scholar national finalist, winning a bronze medal for table tennis at the Maccabiah Games in Israel, known as the Jewish Olympics — have all come through hard work, with no shortcuts. Goldman Sachs today has become too much about shortcuts and not enough about achievement. It just doesn’t feel right to me anymore.
See how that was ostensibly about Goldman Sachs but was really about the author? Not a good sign.
2) His job apparently required him to burrow out and reside in a large soundproof hole in the ground. Let's take a look at what Smith said about the halcyon, early days of his Goldman Sachs tenure -- i.e., when he started in 2000:
It might sound surprising to a skeptical public, but culture was always a vital part of Goldman Sachs’s success. It revolved around teamwork, integrity, a spirit of humility, and always doing right by our clients....
How did we get here? The firm changed the way it thought about leadership. Leadership used to be about ideas, setting an example and doing the right thing. Today, if you make enough money for the firm (and are not currently an ax murderer) you will be promoted into a position of influence.
Excuse me for a sec, I need to do this for a spell.
Look, Smith should know the Goldman Sachs culture better than I do, but as an outsider, I am fairly certain of two things: A) Before Smith's op-ed, the terms "humility" and "Goldman Sachs" never appeared in the same sentence.... ever; and B) Making money was always how people got promoted at Goldman Sachs.
There's been enough written about Goldman Sachs to know that by 2006, the firm had recognized that it was badly overexposed in the subprime market and decided to dump its holdings onto its clients. We know that in 2007, the firm went so far out of bounds that the SEC actually brought a civil suit against it, securing a $550 million settlement more than 18 months ago. And now Smith notices something is amiss??!! While the Wall Street Journal suggests Smith's role at Goldman wasn't pivotal, this kind of naiveté requires a special kind of willful blindness.
If you're going to be a whistle-blower, you need to acknowledge upfront your complicity in any malfeasance, be it legal or ethical. Smith's op-ed doesn't come close to doing this.
Your humble blogger has, on occasion, opined about the intersection of sports and politics. This topic is both tempting and treacherous. Tempting, because a lot more people pay attention to sports than world politics, and so it's a way for the pundit to A) show how "in touch" s/he is with the mass public; and B) use the sporting moment-du-jour as a metaphor to make a point that was already in the pundit's back pocket. This is why most of my writings on this topic have been either to debunk the notion that sports really affects world politics, or just as another excuse to mock the Very Serious Foreign Policy Community.
Which brings me to New York Knicks point guard Jeremy Lin. In a month Lin has gone from being demoted to the development league to leading the Knicks to a globally televised victory over the defending champion Dallas Mavericks. It's a great story: an undrafted, devout Taiwanese-American Harvard graduate bucking the odds -- as well as numerous outdated stereotypes -- to seize his moment in the sun and turn what had been a lackluster Knicks decade -- er, season -- into something exciting.
This is a narrative that one simply has to enjoy. Professional basketball is, at best, my third-favorite sport, but I tuned in yesterday to watch the Knicks-Mavericks game. Unfortunately, I've noticed that some ink has been spilled and some keyboards have been tapped about him -- and here we get to the treacherous part of this post. Some sportswriters have used the opportunity to wax grandiosely about the Deeper Meaning of Linsanity. Some politics commentators have tried to use Lin to make deeper arguments about the fabric of society and sports.
Let's be blunt -- most of these efforts result in utter crap. Unfortunately, too many sportswriters know too little about the rest of the world to even try to comment on the social or cultural significance of Lin. Numerous idiots have not helped the sportswriting profession by writing things that result in apologies from said idiots for stereotyping Lin and amusing Saturday Night Live skits. We're not seeing the second coming of Red Smith in most of this output. As for the politics writers, well, the lack of actual sports knowledge in some of these efforts makes one almost nostalgic for George F. Will's Sports Machine. Almost.
So I was all set to blog a request for everyone to leave Jeremy Lin and his family alone... but then Gady Epstein wrote something interesting over at the Economist about the whole phenomenon -- China's reaction to Lin and why its own sports programs could never have produced someone like him:
Mr Lin is, put plainly, precisely everything that China’s state sport system cannot possibly produce. If Mr Lin were to have been born and raised in China, his height alone might have denied him entry into China’s sport machine, as Time’s Hannah Beech points out: “Firstly, at a mere 6’3”—relatively short by basketball standards—Lin might not have registered with Chinese basketball scouts, who in their quest for suitable kids to funnel into the state sports system are obsessed with height over any individual passion for hoops.” Even when Mr Lin was still a young boy, one look at his parents, each of unremarkable stature, would have made evaluators sceptical. Ms Beech’s other half happens to be Brook Larmer, the author of the fascinating book “Operation Yao Ming”, which details how Chinese authorities contrived to create China’s most successful basketball star, Mr Yao, the product of tall parents who were themselves Chinese national basketball team players. The machine excels at identifying, processing and churning out physical specimens—and it does so exceedingly well for individual sports, as it will again prove in London this year. But it happens to lack the nuance and creativity necessary for team sport.
What of Mr Lin’s faith? If by chance Mr Lin were to have gained entry into the sport system, he would not have emerged a Christian, at least not openly so. China has tens of millions of Christians, and officially tolerates Christianity; but the Communist Party bars religion from its membership and institutions, and religion has no place in its sports model. One does not see Chinese athletes thanking God for their gifts; their coach and Communist Party leaders, yes, but Jesus Christ the Saviour? No.
Then there is the fact that Mr Lin’s parents probably never would have allowed him anywhere near the Chinese sport system in the first place. This is because to put one’s child (and in China, usually an only child at that) in the sport system is to surrender that child’s upbringing and education to a bureaucracy that cares for little but whether he or she will win medals someday. If Mr Lin were ultimately to be injured or wash out as an athlete, he would have given up his only chance at an elite education, and been separated from his parents for lengthy stretches, for nothing. (One must add to this the problem of endemic corruption in Chinese sport that also scares away parents—Chinese football referee Lu Jun, once heralded as the “golden whistle” for his probity, was sentenced to jail last week as part of a massive match-fixing scandal). Most Chinese parents, understandably, prefer to see their children focus on schooling and exams.
In America, meanwhile, athletic excellence actually can open doors to an elite education, through scholarships and recruitment. Harvard does not provide athletic scholarships, but it does recruit players who also happen to be academic stars. There is no real equivalent in China.
So China almost certainly has other potential Jeremy Lins out there, but there is no path for them to follow. This also helps explain, as we have noted at length, why China fails at another sport it loves, football. Granted, Mr Lin’s own path to stardom is in itself unprecedented, but in America, the unprecedented is possible. Chinese basketball fans have taken note of this. Mr Lin’s story may be a great and inspiring proof of athleticism to the Chinese people, but it is also unavoidably a story of American soft power.
Epstein is overreaching juuuust a bit with that closing -- if Lin is an example of American soft power, then all the galactically stupid puns and stereotypes that the Lin story has propagated are a demerit to that soft power as well. Also, last I checked, the countries that dominate the top of the FIFA rankings are not exactly models of laissez-faire in sports.
Still, Epstein has probably done the best possible job of trying to relate Lin to Deeper Global Meanings. Let's hope the rest of the writing class reads him and gives up their own futile quest to do the same.
Ross Douthat had a great column to start the new year, offering his own interpretation on the Ron Paul phenomenon. His last few paragraphs:
There’s often a fine line between a madman and a prophet. Perhaps Paul has emerged as a teller of some important truths precisely because in many ways he’s still as far out there as ever.
The United States is living through an era of unprecedented elite failure, in which America’s public institutions are understandably distrusted and our leadership class is justifiably despised. Yet politicians of both parties are required, by the demands of partisanship, to embrace the convenient lie that our problem can be pinned exclusively on the other side’s elites — as though both liberals and conservatives hadn’t participated in the decisions that dug our current hole.
In this climate, it sometimes takes a fearless crank to expose realities that neither Republicans nor Democrats are particularly eager to acknowledge.
In both the 2008 and 2012 campaigns, Paul has been the only figure willing to point out the deep continuities in American politics — the way social spending grows and overseas commitments multiply no matter which party is in power, the revolving doors that connect K Street to Congress and Wall Street to the White House, the long list of dubious policies and programs that both sides tacitly support. In both election cycles, his honest extremism has sometimes cut closer to the heart of our national predicament than the calculating partisanship of his more grounded rivals. He sometimes rants, but he rarely spins — and he’s one of the few figures on the national stage who says “a plague on both your houses!” and actually means it.
Obviously it would be better for the country if this message weren’t freighted with Paul’s noxious baggage, and entangled with his many implausible ideas. But would it be better off without his presence entirely? I’m not so sure.
Neither prophets nor madmen should be elected to the presidency. But neither can they safely be ignored (emphases added).
Conor Friedersdorf and Glenn Greenwald take a similar position. Greenwald in particular argues that Paul's positions on foreign policy/national security/civil liberties are so much better than the bipartisan consensus view that Paul's tacit approval of those odious newsletters should be heavily discounted. As Greenwald puts it, progressives who don't support Paul must apparently accept the following preference ordering:
Yes, I’m willing to continue to have Muslim children slaughtered by covert drones and cluster bombs, and America’s minorities imprisoned by the hundreds of thousands for no good reason, and the CIA able to run rampant with no checks or transparency, and privacy eroded further by the unchecked Surveillance State, and American citizens targeted by the President for assassination with no due process, and whistleblowers threatened with life imprisonment for “espionage,” and the Fed able to dole out trillions to bankers in secret, and a substantially higher risk of war with Iran (fought by the U.S. or by Israel with U.S. support) in exchange for less severe cuts to Social Security, Medicare and other entitlement programs, the preservation of the Education and Energy Departments, more stringent environmental regulations, broader health care coverage, defense of reproductive rights for women, stronger enforcement of civil rights for America’s minorities, a President with no associations with racist views in a newsletter, and a more progressive Supreme Court.
I'm of two minds about this line of argument. On the one hand, there is no denying that Paul's worldview has helped him to launch a powerful critique of American foreign policy. This can't just be dismissed as "yes, he was right on Iraq, but..." either. As Douthat, Friedersdorf and Greenwald observe, Paul really is the only candidate not named Gary Johnson or Jon Huntsman to bring up these issues. His hypothesis that the United States has invited some blowback by overly militarizing its foreign policy cannot be easily dismissed.
Think of it this way: Paul is a hedgehog. He knows One Big Thing and uses it to construct his worldview. We know from Philip Tetlock that hedgehogs are less likely to be right when making predictions than foxes -- those people who know a little about a lot of things. Where hedgehogs outperform foxes, however, is in getting big, macro-consequential events correct. We tend to ignore such predictions because hedgehogs usually lack the emotional intelligence necessary to persuade nonbelievers. I want Paul banging on about the dangers of excessive government intrusion and overexpansion. That's not nothing.
Here's the thing, though -- precisely because Paul is a hedgehog, he brings other less-than-desirable qualities to the table. I don't think his intriguing take on foreign policy and civil liberties can be separated from, say, his batshit-insane views about the Federal Reserve. In fact, let me just edit Greenwald's proposed tradeoff so that it's a bit more accurate:
Yes, I’m willing to continue to have some Muslim children inadvertently die by covert drones and cluster bombs, and a disproportionate percentage of America’s minorities imprisoned for no good reason, and the CIA taking action with minimal checks or transparency, and privacy eroded further by the unchecked Surveillance State, and American citizens targeted by the President for assassination with no due process, and whistleblowers threatened with life imprisonment for “espionage,” and the Fed able to dole out trillions to bankers and lots of rhetoric & covert action against Iran that makes Glenn Greenwald hyperventilate in exchange for avoiding a complete and total meltdown of the global economy due to the massive deflation that would naturally follow from a re-constituted gold standard.
I don't like this choice, but it's an easy one to make.
To paraphrase both Douthat and This is Spinal Tap, there's a fine line between prophetic and crazy. I would posit that only someone who fanatically accepted this entire worldview would have been capable of inspiring the Ron Paul movement. Only those leaders with sufficient levels of ideological zeal to never compromise, never bend on principle, until they eventually reach a position of power are able to foment revolution. This kind of zeal requires a singular worldview that might contain some worthwhile elements but is likely also based on some axioms or articles of faith that seem a little nuts and make the person wrong an awful lot of the time. These kinds of leaders, precisely because they were in the political wilderness, will tend to be supremely convinced of their own rightness if they ever win power.
Ron Paul is great at affecting the marketplace of ideas. He would be worse than Newt Gingrich if he actually became president, however. The great presidents -- Washington, Lincoln, FDR -- knew when to compromise and when to stand firm, when to lead public opinion and when to follow it. They were, in other words, great politicians. The presidents who simply knew they were right on everything and resisted compromise -- Jackson, Wilson, Bush 43 -- tended towards the disastrous. Paul would be part of the latter group.
So if Ron Paul wants to influence the debate, that's good. He raises important questions about important issues. He's also wrong about some really important issues and therefore should be kept away from the presidency.
Fortunately, as James Hohmann's Politico story suggests today, Paul and his supporters seem to care about the former more than the latter:
As much as anything else, [Paul's] pitch centers on sending a message.
“This is ideological,” he said here late Friday night at his last campaign stop of 2011. “So it isn’t a numbers game. It has to do with determination.”
He paraphrased a Samuel Adams quote, saying, “It doesn’t take a majority to prevail. It takes an irate, determined minority keen on starting the brushfires of liberty in the minds of men.”
“So in many ways, it’s a political revolution to change these ideas, but it’s an intellectual revolution,” Paul explained, wrapping up a nearly hourlong speech. “It’s a change in ideas about economic policy, understanding our traditions about foreign policy, understanding monetary policy. This is where we’re making progress. This is where we have advanced so much over the last couple decades and even in the last four years.”...
Many of his die-hard supporters see him more as an alarm-sounding Paul Revere than a Founding Father.
“I would say it's 10 percent campaign, 90 percent a movement,” said Quaitemes Williams, a 26-year-old nursing student who drove from Dallas to volunteer for the full week before the caucuses. “Once you’ve seen the light, you can never go back to the dark. Once you learn about the Federal Reserve and foreign policy, you can’t go back to thinking in the right-left dichotomy.” (emphasis added)
That last quotation, by the way, is part of what I find problematic about the Paul movement. The revolutionary leader worries me -- but the Jacobin followers scare the ever-living crap out of me.
I was never formally introduced to either Vaclav Havel or Christopher Hitchens, but I encountered both of them exactly once. I was lucky enough to hear Havel deliver a speech at Stanford University in the fall of 1994. I don't remember much about the speech itself beyond a vaguely metaphysical theme. What I do remember is a specific physical gesture. At one point during the proceedings, at Havel's request, Joan Baez came on stage, played her guitar, and sang a song. After Havel spoke, everyone exited the stage, Havel last. He noticed Baez's guitar, and picked it up. As he left the stage, he looked over his shoulder and raised the guitar over his head. The expression on his face screamed, "can you believe I'm holding Joan Baez's friggin' guitar??!!!"
My encounter with Hitchens was a little more mundane -- we were both participating in an AEI panel in early 2001 on international law. I was on a morning panel, and afterwards, Hitchens gave the lunch keynote. I can recall the standard Hitchens attributes: reeking of cigarettes and alcohol, yet giving a very good speech. What I also remember is talking with one of the AEI assistants who was tasked with "handling" Hitchens for the day. We started chatting, and at one point she said plainly, "the minute he leaves here will not be soon enough for me."
I'd love to be able to divine some deeper meaning from their deaths, but I'm not quite as inspired a writer as either of them. It's funny to think that Hitchens started out politically to the left of Havel, swerving a bit to his right about a decade ago, but that's not a theme. Rather, this being a blog, I have two unrelated thoughts.
First, speaking as someone who has written a thing or two about public intellectuals: Havel really was extraordinary as someone who could be trusted with power. As Mark Lilla noted in his excellent The Reckless Mind, intellectuals don't really have a distinguished track record when they actually acquire power. Havel was a notable exception -- perhaps because he never really thought he should have it. In David Remnick's New Yorker write-up of the end of Havel's (politically successful) presidency, the politics of doubt that I like so much shines through quite clearly:
At times, Havel felt thoroughly insufficient, a fraud. A familiar Prague voice, the voice of Kafka, told him what anyone who has grown up in a police state knows instinctually—that it could all end as easily as it started.
"I am the kind of person who would not be in the least surprised if, in the very middle of my Presidency, I were to be summoned and led off to stand trial before some shadowy tribunal, or taken straight to a quarry to break rocks," he told a startled audience at Hebrew University, in Jerusalem, less than six months after taking office. "Nor would I be surprised if I were to suddenly hear the reveille and wake up in my prison cell, and then, with great bemusement, proceed to tell my fellow-prisoners everything that had happened to me in the past six months. The lower I am, the more proper my place seems; and the higher I am the stronger my suspicion is that there has been some mistake."
In Havel's thirteen years as President—first of Czechoslovakia and then, after the Slovaks and the Czechs divided into two states, in 1993, of the Czech Republic—many of his advisers repeatedly begged him to delete, or at least soften, these public moments of self-doubt. What effect would they have on an exhausted people waiting for the radical transformation of their country? (Imagine Chirac or Blair, Bush or Schröder beginning a national address with an ode to his midnight dread!) Havel, however, would not be edited. The Presidential speech was the only literary genre left to him now, his most direct means of expressing not only his personal feelings but also the spirit of the distinctively human politics he wanted to encourage after so many decades of inhuman ideology. "Some aides tried to stop him, but these speeches had a therapeutic value for him," Havel's closest aide, Vladimír Hanzel, told me.
As Ta-Nehisi Coates observed recently, most people are mediocre and, if they were given power, would likely not exercise it all that benevolently. Havel was about as far away from mediocre as one could be.
Hitchens was not mediocre, but neither was he gentle, and so his passing generated a more variegated response. There was the eruption of fond memories from fellow writers at his ability to consume and produce prodigious amounts of prose and other substances -- this one is my favorite. It's also led Glenn Greenwald to grouse about the hagiography that the death of public figures ostensibly produces:
We are all taught that it is impolite to speak ill of the dead, particularly in the immediate aftermath of someone’s death. For a private person, in a private setting, that makes perfect sense. Most human beings are complex and shaped by conflicting drives, defined by both good and bad acts. That’s more or less what it means to be human. And — when it comes to private individuals — it’s entirely appropriate to emphasize the positives of someone’s life and avoid criticisms upon their death: it comforts their grieving loved ones and honors their memory. In that context, there’s just no reason, no benefit, to highlight their flaws.
But that is completely inapplicable to the death of a public person, especially one who is political. When someone dies who is a public figure by virtue of their political acts — like Ronald Reagan — discussions of them upon death will be inherently politicized. How they are remembered is not strictly a matter of the sensitivities of their loved ones, but has substantial impact on the culture which discusses their lives. To allow significant political figures to be heralded with purely one-sided requiems — enforced by misguided (even if well-intentioned) notions of private etiquette that bar discussions of their bad acts — is not a matter of politeness; it’s deceitful and propagandistic. To exploit the sentiments of sympathy produced by death to enshrine a political figure as Great and Noble is to sanction, or at best minimize, their sins. Misapplying private death etiquette to public figures creates false history and glorifies the ignoble.
Meh. I read a lot of the Hitchens write-ups, and a fair number of them were pretty blunt about his personal and political dark sides. Even critics like Corey Robin acknowledge the "consistent line" of “Yes, he was wrong on Iraq, but…” in the public responses to his death. This suggests that Hitchens has not, in fact, been the subject of one-sided requiems, even by those who liked him.
I suspect two things are going on in the public reaction to Hitchens' death, one unique to him and one that's more general. What was unique about Hitchens was that he was an archetype brought to life. Here was a real, honest-to-goodness heavy drinking, heavy smoking, occasionally rude Brit who could nevertheless dash off excellent writing on a daily basis. Where do you actually see that outside of the movies nowadays?
The more general trend is that in an age of self-publishing, perhaps the personal and the public are more fused than Greenwald realizes. Hitchens hung around with a lot of writers, and as friends it's not shocking that their initial responses would be to talk about the private individual behind the public persona. As time passes, more strangers will push back, there will be more sober reassessments, and eventually some kind of perspective will be achieved. The thing about the internet is that it amplifies these cycles of reaction and counterreaction for all to see.
Your humble blogger is down in Washington DC to attend FP's Global Thinkers gala, in honor of the magazine's annual Top 100 Global Thinker list. I look forward to seeing some of the thinkers I know (Tyler Cowen, Joseph Nye) and meeting the many that I don't know.
This list tends to beget a lot of carping from my friends who erroneously believe that I control the entire FP-verse like a Muppeteer -- er, from the Twitterverse -- about who's on the list, and in some ways, that's kind of the point -- to foster debate. I'd like to ask my readers a slightly different question: who got overlooked? Who are the Big Thinkers that FP missed?
Put your answer in the comments!
Here's an open secret -- most American foreign policy observers loathe domestic politics. To those who seek to define and distill the national interest, the notion that factions or parties can get in the way of the common good is very, very frustrating. This is why, whenever gridlock breaks out in Washington, there is a spasm of caterwauling from prominent foreign policy thinkers that Something. Must. Be Done.
This leads to some silly memes, like claims that a third party will break the logjam. It won't -- one glance at Duverger's Law tells you that the first-past-the-post electoral system in this country means that a two-party system is the only stable long-term equilibrium. A third party in the United States could only achieve electoral viability in one of two ways: either supplanting one of the existing parties, or focusing on success in a particular region. Since neither of these outcomes has occurred since the Civil War, I'm not holding my breath.
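Duverger's logic can be sketched as a toy simulation -- this is entirely my own illustration, with invented candidate positions and a uniform spread of voters, not anything from the political science literature. The mechanism: once a third party's supporters perceive it as non-viable under first-past-the-post, they defect to the nearest major party rather than waste their votes, and the third party's support collapses.

```python
# Toy sketch of Duverger's Law under first-past-the-post voting.
# Candidate positions and the voter distribution are invented for
# illustration; the point is the wasted-vote dynamic, not the numbers.

def tally(voters, candidates, viable):
    """Each voter backs the viable candidate nearest their ideal point."""
    counts = {c: 0 for c in candidates}
    for v in voters:
        choice = min(viable, key=lambda c: abs(candidates[c] - v))
        counts[choice] += 1
    return counts

candidates = {"Left": 0.2, "Third": 0.45, "Right": 0.8}  # ideological positions
voters = [i / 100 for i in range(101)]  # voters spread evenly over the spectrum

# Round 1: everyone votes sincerely (all candidates treated as viable).
sincere = tally(voters, candidates, viable=list(candidates))

# Round 2: strategic voters abandon whoever finished last in round 1
# and back the nearest of the two remaining viable candidates.
last = min(sincere, key=sincere.get)
viable = [c for c in candidates if c != last]
strategic = tally(voters, candidates, viable=viable)

print(sincere)    # the third party has real sincere support...
print(strategic)  # ...but zero votes once it is seen as non-viable
```

The toy model compresses strategic adjustment into a single round; in reality the same logic plays out over successive elections, which is why the two-party equilibrium is stable over the long term.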
Gridlock frustration also leads to proposals of Grand Diagnoses and Remedies for Fixing the System. Fareed Zakaria goes down this road, offering a diagnosis of why partisanship has been rising in the United States and then links to Mickey Edwards' essay in The Atlantic of how to fix things. Zakaria, riffing off of Edwards, lists four reasons why partisanship is so high:
1) Redistricting has created safe seats so that for most House members, their only concern is a challenge from the right for Republicans and the left for Democrats....
2) Party primaries have been taken over by small groups of activists who push even popular senators to extreme positions.
3) Changes in Congressional rules have also made it far more difficult to enact large, compromise legislation.
4) Political polarization has also been fueled by a new media, which is also narrowcast.
These sound compelling, except that A) none of them really explain increased polarization in the Senate; and B) only the fourth trend is in any way recent (the rest of these phenomena can be traced back to the 1970s).
The real problem with Congress is that any proposed institutional reform to correct these problems would require either a dilution of legislative power or a dilution of the minority's power to obstruct. Neither minority nor majority parties in Congress will be interested in moves like that unless and until we're in a crisis that makes 2008 look like a ripple in the pond.
If you are looking to this humble blogger for ways out of this current problem... um... look elsewhere. My training is in international relations, and I've found that people with that kind of training tend to prefer policy reforms that provide political leeway and insulation to the executive branch. These measures are appealing because they tend to minimize the number of interactions with galactically stupid members of Congress. Over the long term, however, even a stupid Congress still serves as a valuable check on executive branch authority.
I'm as frustrated as the next foreign policy observer when it comes to the current policy paralysis. I know my own kind, however, and we suffer from the flawed belief that there was a halcyon era of bipartisanship in the foreign policy days of yore. Be very, very wary when a foreign policy pundit gives advice about how to reform the American system of government. Most of the time they are relying on decades-old Introduction to American Government arguments that are either obsolescent or incentive-incompatible.
The Official Blog Son and I were lucky enough to catch Team USA's thrilling come-from-behind victory over Brazil in the FIFA Women's World Cup. It was a great and controversial game, sure to be replayed on ESPN Classic for years to come. It also got me to thinking about how prominent thinkers and writers about world politics would use the game as a hook for their foreign affairs columns and op-eds this week. Here are their opening paragraphs:
I was quaffing hearty German pilsners with FIFA President Sepp Blatter in a luxury box in Dresden's Glücksgas Stadium (try the bratwurst!!) when he said something that hit me like a thunderbolt: "I can't understand why there's so much demand for video replay in soccer. You know, there is no instant replay in the real world." And really, that's what the global economy is like -- a fast-speed, arcing bullet of a free kick with no time to press the pause button. You have to use every part of your being -- your legs, your head, though admittedly not your arms -- just to keep pace.
Watching the thrilling run of the Americans leading up to Abby Wambach's header, I was struck by the complex, free-flowing sequence of passes that got the ball from the American end to Megan Rapinoe's left foot. It was such a seamless, interlaced network of exchanges -- dare I call it a web of them? -- that moved the ball forward. As the passes moved from one player to another, I bet social networking technologies moved even faster, alerting Americans that a Big Moment was about to happen. In winning, the United States showed the power of webbed networks -- or is it networked webs? -- yet again.
All of the Western media will focus on the "theatrics" of the USA-Brazil game, but it doesn't matter. This was an intramural match between Western Hemisphere teams, which means it was irrelevant. Japan's stunning upset of host Germany in the quarterfinals is the real story of this World Cup, yet another signal of how the one remaining Asian team will leave the three "Western" teams still alive in the dust.
This was an example of American exceptionalism and American will to power at its finest. Battling a set of rules and referees that were clearly anti-American in their effect, the noble U.S. side displayed dogged determination and grit, vanquishing their Brazilian counterparts. The only black mark on the U.S. side was the timidity of the U.S. coach Pia Sundhage in obeying FIFA's absurd and corrupt rules. Sundhage, from that socialist bastion of meek multilateralism that is Sweden, adhered to the letter of FIFA law in pulling Rachel Buehler after she was "red-carded." A true American coach would have instead followed the spirit of the law and sent an 11th player onto the pitch in place of the unjustly accused Buehler.
Americans will thump their chests, display their brassy jingoism, and bray to the heavens about how the refereeing in this game was "unfair" or "ridiculous." They'll claim that the referee's red card of Buehler and mandated do-over of the penalty kick during regular time was "anti-American." They'll overlook the fact that the Australian ref could have sent midfielder Carli Lloyd off the field for a flagrant, deliberate handball but didn't. They'll overlook the granting of a re-kick for U.S. player Shannon Boxx during the penalty kick phase. They'll overlook the aesthetic beauty of Brazilian star Marta's soccer artistry. They'll overlook the arrogance of U.S. goalkeeper Hope Solo -- a perfect American name if there ever was one -- as she had the audacity to question the ref (if the officials weren't so obviously in Corporate America's back pocket, Solo would have been red-carded). They'll overlook the fact that the extra half-hour of play insidiously stacked the deck for the Americans, rewarding their better conditioning against the poorer and put-upon Brazilians. They'll overlook the 158 other things that I will now lay out in excruciating detail. Only when WikiLeaks focuses its might on FIFA will the soccer world be more just.
The sweltering heat in Dresden clearly began to affect the crowd. They booed the Brazilian star Marta on all of her touches. You could sense a growing danger as the boos grew louder. The German fans, upset at seeing their own team get knocked out, had clearly decided to side with their tribal allies. It is likely that only Wambach's header prevented what would have been an unruly German/American riot, breaking down the tenuous social fabric. The riot would have started in the heart of Europe, but I have every confidence that, before long, the unrest would have spread to Halford Mackinder's heartland in the middle of Eurasia.
This match crystallized both the promise and the peril of the rising BRIC powers as they assume more responsibilities in global governance. The game put FIFA's many problems -- bad decision-making, a lack of transparency about the bad decision-making -- on full display. Even after the match, FIFA never explained why Brazil was awarded a re-kick following Solo's block of Cristiane's penalty kick. Instead of constructively seeking reform, however, the Brazilian side tried to free-ride off of FIFA's flaws. Marta constantly whined to the refs about the lack of Brazilian free kicks. Defender Erika flopped onto the pitch in a transparent effort to stall play. Unless and until the BRIC countries learn to play cooperatively with the fading West, global governance will look as effective as FIFA's efforts to block corruption. Which is to say, not effective at all.
Readers are warmly encouraged to offer their own suggestions in the comments.
As the fallout from Dominique Strauss-Kahn and The Chambermaid's Tale continues, the guy from the Dos Equis commercials -- er, French public intellectual Bernard-Henri Lévy -- is taking quite a beating inside the United States. Lévy -- or BHL for those in the know -- is a longtime friend of Strauss-Kahn -- or DSK for, well, you get the idea. After DSK's arrest, BHL penned the following in the Daily Beast:
I do not know what actually happened Saturday, the day before yesterday, in the room of the now famous Hotel Sofitel in New York.
I do not know—no one knows, because there have been no leaks regarding the declarations of the man in question—if Dominique Strauss-Kahn was guilty of the acts he is accused of committing there, or if, at the time, as was stated, he was having lunch with his daughter [we actually know that, given the timeline, DSK's lunch with his daughter is not an alibi, as even his defenders acknowledge --DWD].
I do not know—but, on the other hand, it would be nice to know, and without delay—how a chambermaid could have walked in alone, contrary to the habitual practice of most of New York’s grand hotels of sending a “cleaning brigade” of two people, into the room of one of the most closely watched figures on the planet....
And what I know even more is that the Strauss-Kahn I know, who has been my friend for 20 years and who will remain my friend, bears no resemblance to this monster, this caveman, this insatiable and malevolent beast now being described nearly everywhere. Charming, seductive, yes, certainly; a friend to women and, first of all, to his own woman, naturally, but this brutal and violent individual, this wild animal, this primate, obviously no, it’s absurd.
This morning, I hold it against the American judge who, by delivering him to the crowd of photo hounds, pretended to take him for a subject of justice like any other....
I hold it against all those who complacently accept the account of this other young woman, this one French, who pretends to have been the victim of the same kind of attempted rape, who has shut up for eight years but, sensing the golden opportunity, whips out her old dossier and comes to flog it on television.
I do not know the extent to which BHL fact-checked his column -- for example, the French woman he accuses of opportunism actually went public in 2007, only to be censored on French television.
I do not know the extent to which BHL is aware that DSK's other sexual indiscretions appear to have a greater element of coercion than had been previously realized.
I do not know why BHL's understanding of "cleaning brigades" is so at odds with the reality of how American hotels actually function.
So, this raises an exceptionally uncomfortable question for some foreign policy commentators. BHL might look like a horse's ass right now, but six or seven weeks ago, he was playing a very different role. According to BHL himself -- er, multiple press reports -- Bernard-Henri Lévy was the interlocutor between Libya's rebels and the rest of the world. He therefore played a crucial role in getting French President Nicolas Sarkozy -- and therefore the West more generally -- to intervene in Libya. This caused some consternation at the time. It would obviously set off even louder alarm bells now.
Given this role, Ben Smith tweets a very valid question: "So if the order of DSK-gate and Libya are reversed... do we go into Libya?"
This touches on some very interesting questions about temporality, causation, correlation and counterfactuals. What are the necessary or sufficient conditions for a policy outcome to occur? Do events have to happen in a particular sequence to reach a particular outcome? Was BHL either a necessary or a sufficient condition for the UN/NATO action in Libya?
My answer would be that Bernard-Henri Lévy's intellectual reputation was neither necessary nor sufficient for Operation Odyssey Dawn to take place. Consider the following:
1) French president Nicolas Sarkozy has been more circumspect than BHL in commenting on DSK, reflecting the general muteness of the French political class on the topic. It seems unlikely that BHL's ardent defense of DSK would have caused Sarkozy to listen to him any less on Libya.
2) One of the key aspects of the Libya decision was the compressed time frame in which it had to be made. Qaddafi's forces seemed on the verge of retaking the country within a week. Debating whether BHL was an honest broker or not seemed pretty peripheral to the real-time changes on the ground in Libya. It's worth remembering that the Arab League and the UN Security Council acted very quickly by International Organization Standard Time, and I certainly don't think BHL had much of a role to play. On the scale of things, one would have expected the "flickers" of Al Qaeda presence among the Libyan rebels to have acted as a bigger brake, and yet that fact did not derail the policy either.
3) Without in any way diminishing the allegations and official charges against DSK, there is a difference between the (mostly) venal sins of BHL and the French political class, and the (mostly) mortal sins of Qaddafi and his family. If the Libya decision were happening right now, my hunch is that it would drown out much of the Franco-American contretemps over American puritanism -- er, French misogyny -- er, one person's failings.
What do you think?
Your humble blogger has not been contributing to the Osama-a-thon here at FP -- er, blogging all that much -- because he was busy being a moosehead -- er, attending the 2011 Estoril Conference. Many Important topics were covered at this conference, including:
1) The eurozone crisis;
2) The global governance crisis;
3) The crisis in the Middle East;
4) Other global security challenges;
5) The life and times of Larry King.
It was that kind of conclave.
Actually, that really doesn't do it justice. Here's a link to the opening video. Even that doesn't do it justice -- the opening ceremonies featured a soprano suspended 50 feet in the air, a gospel choir, a drum corps, and what I can only assume are the backup dancers for Lady Gaga's music videos.
For a rundown of what the Big Cheeses said at the conference, check out my Twitter feed. The major substantive takeaway I got from the conference is that Portugal would like to do a serious hurt dance on Fitch, Moody's, and Standard & Poor's. Half of the conference presenters were Portuguese, and most of the audience was as well. Here is a sampling of the questions the Portuguese asked anyone talking about anything remotely related to economics:
"Why do the bond rating agencies still influence markets after they failed so badly in 2008?"
"Shouldn't the bond-rating agencies be punished for their malfeasance last decade?"
"Aren't the bond-rating agencies to blame for everything bad that has happened since 2008?"
"What do you think of the idea of creating a European standard-ratings agency?"
"Say, has anyone thought about taking the heads of the bond-rating agencies and putting them in a duffel bag?"
OK, I made that last one up, but not the others.
Obviously, the Portuguese have very good reasons to be stressed out. And the bond-rating agencies deserve an awful lot of flak. Still, the idea that they -- and they alone -- triggered both the 2008 financial crisis and Europe's sovereign debt crisis is absurd. They are far more the symptom than the cause of the crisis.
More blogging after my eyes adjust to not seeing Lady Gaga's backup dancers everywhere I turn -- er, after the weekend.
As Laura Rozen, Michael Peel, Farah Stockman, Jon Wiener, John Sides, Siddhartha Mahanta & David Corn, and various reporters have observed, an awful lot of high-powered academics and academic institutions have some 'splainin to do about their relationship with Libya's Qaddafi family.
The Monitor Group ferried a number of high-profile international studies scholars, including Joseph Nye, Robert Putnam, Michael Porter, Francis Fukuyama, Nicholas Negroponte, and Benjamin Barber to the shores of Tripoli in an effort to burnish the regime's image. The London School of Economics and some of its faculty were deeply involved with Saif al-Islam al-Qaddafi, as he earned his Ph.D. there in 2007 with a dissertation on -- wait for it -- liberal democracy and civil society. Even FP's own Steve Walt went for a brief visit in 2010.
As the Qaddafi family has morphed from pragmatic strongmen to bloodthirsty killers, the fallout in the academic world has been uneven. On the one hand, Howard Davies resigned as the head of LSE in the wake of the Libyan revelations. The Monitor Group acknowledged in a statement that, "We … believed that these visits could boost global receptivity for Mr. Gaddafi's stated intention to move the country more towards the West and open up to the rest of the world. Sadly, it is now clear that we, along with many others, misjudged that possibility."
On the other hand, Benjamin Barber sounds totally unapologetic in his interview with FP. His basic message is that "second-guessing the past, I mean, it's just 20/20 hindsight." Then there's this response:
I mean, did LSE take Saif's money -- the Gaddafi Foundation money -- improperly? No, they all took it properly. And promised a scholarly center to study the Middle East and North Africa. And offer scholarships to students from the region. Just the way Harvard and Georgetown and Cambridge and Edinburgh have done -- not with Libyan money, but with Saudi money (look at Prince Alwaleed bin Talal). By the way, not just Monitor, but McKinsey, Exxon, Blackstone, the Carlyle Group -- everybody was in it. The only difference for Monitor was that it actually had a project that was aimed at trying to effect some internal change. Everybody else who went in, which is every major consultancy, every major financial group, went in to do nothing more than make big bucks for themselves. But now people are attacking Monitor because they took consulting fees for actually trying to effect reform and change.
Finally, there is an important background controversy here: It is about whether academics should stay in the ivory tower and do research and write books? Or engage in the world on behalf of the principles and theories their research produces? Do you simply shut your mouth and write? Or do you try to engage? This is an old question that goes back to Machiavelli, back to Plato going to Syracuse: Do you engage with power? Sometimes power is devilish and brutal; sometimes it's simply constitutional and democratic; but in every case, it's power, and to touch it is to risk being tainted by it.
My answer is that each person has to make their own decision. I don't condemn those who prefer the solitude of the academy, though they lose the chance to effect change directly; and I don't condemn those who do try to influence power, risking being tainted by it, even when power doesn't really pay much attention to them, whether its legitimate power like in the United States or illegitimate, as in Libya. The notion that there is something wrong with people who choose to intervene and try to engage the practice of democracy -- that they are somehow more morally culpable than people who prefer not to intervene -- is to me untenable.
Rereading his 2007 Washington Post op-ed, I think it's safe to say that Barber embraced sucking up to power juuuuuuuuust a wee bit more fervently than everyone else.
That said, the man has half a point here. As Ben Wildavsky has chronicled in The Great Brain Race, Western universities have been racing across the globe to set up ~~additional revenue streams~~ satellite campuses in authoritarian countries. Those schools that had no dealings with Libya likely do have dealings with the Gulf emirates, or China, or Russia, or … you get the point.
Furthermore, if you believe what Charles Kupchan writes in How Enemies Become Friends, it's precisely this category of interactions that potentially leads to reduced tensions between rival nations. Bear in mind that by 2006 Libya had renounced its WMD program and did seem somewhat interested in integrating itself into the West. Surely that's a moment when these kinds of interactions could have had an appreciable effect on a country's trajectory.
Another ethical question comes down to exactly how a scholar is engaging with a country. Engagement at the elite level, for example, has a greater potential for change, but also a greater potential for "capture" by the authoritarian elite. Engagement with the population might have fewer moral quandaries (if there's a choice between teaching Saudi women* and not teaching Saudi women, for example, is not teaching really the morally correct option?) but fewer opportunities for change.
There's an interesting quote in Farah Stockman's write-up that does stand out, however:
“The really nefarious aspect of [Monitor's parade of academics] is that it reinforced in Khadafy’s mind that he truly was an international intellectual world figure, and that his ideas of democracy were to be taken seriously,’’ said Dirk Vandewalle, associate professor at Dartmouth College and author of “A History of Modern Libya.’’ “It reinforced his reluctance to come to terms with the reality around him, which was that Libya is in many ways an inconsequential country and his ideas are half-baked.’’
In the Libyan case, maybe that is the best criterion for apportioning ethical responsibility. For a scholar, engagement with power should not be automatically rejected, particularly if it means altering policies in a fruitful manner. When the exercise morphs into intellectual kabuki theater, however, then disengagement seems like the best course of action.
Those scholars who stopped participating after it was obvious that Qaddafi wasn't really interested in genuine change don't deserve much opprobrium. By that count, Barber really has a lot to answer for, while some of the others seem to have emerged relatively unscathed.
I'm curious what commenters have to say about this because I guarantee you one thing -- the more that autocratic regimes either buckle or crack down, the more this issue is going to come up for both universities and individual scholars.
[Full disclosure: I taught a short course for Saudi women at Fletcher in the summer of 2009, and have absolutely no regrets about doing so.]
Since I moved to Foreign Policy, the blog post that generated the most feedback was my impressionistic take on the Millennial generation's foreign policy perspectives. I concluded that post on whether generational cohorts would have distinct foreign policy attitudes with the following:
As I think about it, here are the Millennials' foundational foreign policy experiences:
1) An early childhood of peace and prosperity -- a.k.a., the Nineties;
2) The September 11th attacks;
3) Two Very Long Wars in Afghanistan and Iraq;
4) One Financial Panic/Great Recession;
5) The ascent of China under the shadow of U.S. hegemony.
From these experiences, I would have to conclude that this generation should be anti-interventionist to the point of isolationism.
There was a LOT of very thoughtful pushback in the comments and e-mails from Millennials themselves -- enough for me to wonder whether my jaded Gen-Xer eyes were growing too world-weary.
Now, however, we actually have some data. The Brookings Institution has released a new report, "D.C.'s New Guard: What Does the Next Generation of American Leaders Think?" The survey results came from 1,057 respondents (with an average age of 16.4) who attended the National Student Leadership Conference, Americans for Informed Democracy young leaders programs, and other DC internships -- i.e., those young people already predisposed towards a political career.
The results are veeeeery revealing. The headline figure is that 73% of respondents think that "The U.S. is no longer globally respected" -- which actually suggests that the respondents haven't been looking at the data, but that's a side note. No, the really interesting response is as follows:
[A]lmost 58% of the young leaders in this survey agreed with the statement that the U.S. is too involved in global affairs and should do more at home. Alternatively, 32.4% thought the U.S. had "struck the right balance" between issues at home and abroad, while only 10% thought that the United States should be more globally proactive.
This isolationist sentiment among the younger generation stands in stark comparison to the Chicago Council's recent 2010 polling of older Americans, which found that 67% wanted America to have an active role in the world and only 31% thought we should limit our involvement, a near exact reverse. The older generation survey concluded that there was "persisting support for an internationalist foreign policy at levels unchanged from the past," but this perceived persistence is certainly not there among the young leaders (emphasis added).
Now, to be fair, it is possible to reconcile beliefs that the United States is doing too much abroad now while still believing that the U.S. should exert global leadership, but on a more modest scale. Still, I'm counting this as a clear win over the young people who insisted that my impressionistic take on their generation was wrong. Take that, Bieberheads!!!
[Hey, I just noticed this paragraph by P.W. Singer at the start of the report:
In 2011, a “silver tsunami” will hit the United States: the oldest Baby Boomers will reach the United States’ legal retirement age of 65. As the Boomers leave the scene, a new generation will begin to take over. But while the generation that directly follows the Boomers, Generation X, may be “of age”, there is a good chance that it will not actually shape public life and leadership as much the following generation, the Echo Boomers, also known as the “Millennials." (emphasis added)
Say, could that swipe at your generation explain your attitude in this post?--ed.]
No!! Really!! It has nothing to do with that! Now if you'll excuse me, I need to lock myself into a dark room and watch Reality Bites on an endless loop for the next 24 hours.
Erik Gartzke, an associate professor of political science at UCSD and a man whose Google Scholar citation count makes me feel very, very small, sent me the following thoughts on political science and policy relevance. I reprint them, below, without edits or comment:
by Erik Gartzke
Dan Drezner's penchant for zombies may have yet another application. In the policy relevance debate, political scientists are like Renfield, Dracula's sidekick (or possibly like Thomas the Tank Engine if children are present). We really want to be "useful." I know of no other discipline that is so angst-ridden about mattering, even those that don't matter in any concrete, "real world" sense. Obviously, what makes us different from poets, particle physicists, or Professors of Pediatric Oncology is that we study politics and occasionally imagine that this gives us some special salience to that subject. Policy makers, too, want us to be "relevant," though I think what they have in mind differs in important respects.
There are three ways that political science can be relevant to politics. On both sides of the debate, attention seems focused on only one of these roles. Interestingly, each side has chosen a different role to emphasize. First, academics could have expertise that is valuable in connecting policies to outcomes. We have lots of examples of this. Economists invented theories like adverse taxation and tools like GDP to help policy makers more effectively manage the economy. Unfortunately, there are very few insights or tools from political science, and those we do have are either very narrowly relevant (i.e. techniques for gerrymandering congressional districts to achieve affirmative action objectives), or very imprecise (i.e. nuclear balancing). Academic political scientists consciously _want_ this role, but the complaint from policy makers is that they do it poorly, providing policy guidance that is not expert enough, or overly nuanced and complex. This would seem to imply that political science should remain in the ivory tower, developing better tools. Instead, however, the argument appears to be that political science should give up these tools and practice a form of political consultation more comprehensible by the policy community. One then has to ask why, and what this will achieve. Is it the case, as many argue, that non-expert political scientists will be more useful? Why?
Interestingly, one of the critical exceptions to the general trend, and examples where political scientists have prospered in Washington as experts, involves pollsters. Survey methodology got its start in political science and has penetrated deeply into the political process, precisely because pollsters can provide valuable information to politicians and policy makers about cause and effect. Pollsters are now even regulars as pundits, asked to shill for policies and politicians on the basis of their expertise.
The second thing that academics can provide is thus credibility. We can "speak truth to power" or perhaps just generally speak the truth, at least as we see it. This could be valuable if policy makers themselves have become zombies, enslaved to a process that prevents them from stating things, even when obvious, that are unpopular or controversial. We see this happening in processes such as the Base Closure Commission, where outsiders helped to smooth a transition that was politically difficult. This kind of relevance is difficult, however, as politics is not really about the truth. Paul Pillar, one of the protagonists of the debate ("In your face, political science!") found this out, much to his regret. One of the least zombie-like people in the national security bureaucracy, Paul was the perfect foil as author of the national intelligence estimate that legitimated the Bush policy of invading Iraq. In his, and his boss's, moment to speak truth, they propagated a politically expedient myth. This kind of policy relevance really _is_ valuable to policy makers, especially since credibility is such a scarce commodity inside the beltway, and so valued elsewhere. The problem, of course, from an academic perspective is that selling credibility has nothing directly to do with expertise and everything to do with what, for lack of a better phrase, was once called "moral turpitude." The value of academics holding forth in Washington may on occasion have as much to do with their _lack_ of contact with policy making as with their putative expertise, at least in terms of credibility.
A corollary to this is the role of academistic consultants, some with faculty positions, others with beltway connections, that provide "research" that feeds the beast of the Washington policy machine. This can be financially rewarding, but the desire for funding leads to varying degrees of compromise, a zombification by extension.
The third contribution that academics can make to the policy community is one that all seem to agree upon, but which makes the least direct demand on political science as a substantive discipline. The intellectual discipline of first getting a PhD and then practicing as an academic gives one an ordered, logical mind, which can then be applied to tasks in the policy community, as well as to more purely intellectual pursuits. There is nothing wrong with this, but then again, there is nothing particularly unique about how political science does this that prevents scholars in other disciplines from applying themselves to policy making as well. Indeed, this is what we observe. Sociologists, economists, engineers and physicists (even the occasional poet) enter public service.
What makes political science different from most other fields is that we have failed to resolve our conflict with our subject matter. Poets report the human condition. They do not expect to alter it, at least not permanently. Physicians can make you better, so they do intervene, but their detachment is credible in the sense that they do not want to become illnesses. No physicist I know of hankers to _be_ her subject matter, though of course we are all of us made of matter. Political science alone wants to be different but engaged.
Imagine suggesting to a congressional committee that Congress should abandon the forecasting models of the OMB as esoteric and speculative. Try to suggest to someone like Paul Pillar that he should hanker after the "good old days" of pre-GDP census taking and data collection. Economics became policy relevant in the first sense because it developed tools that could help policy makers better connect their actions with outcomes. These are not perfect, as recent events illustrate, but they work better than the old way of doing things (i.e. whatever we did last time, or holding one's thumb up to the wind). The problem is that political science does not yet have "killer apps" like GDP. Optimists would say we are still working on these things. Pessimists would say that they will never come. I will not weigh in on that debate because in some sense it does not matter.
The real point, however, is that the debate does not matter. Either way, the search for policy relevance, as it is pursued by many in the policy community, makes no sense.
If you believe the optimists, then the correct role of political science is to get back in the kitchen (metaphorically) and cook up some good insights and tools so that we can eventually fulfill role number one. If you are instead pessimistic and despair of political science ever achieving much headway in terms of expertise, then you should still prefer us in our academic enclaves, only occasionally venturing down from the mountain, since this is what gives us our credibility as unbiased agents. The largely pessimistic perception of policy practitioners implies that they should treat political scientists like poets, or perhaps adherents of atonal music. Someone gets it, but thank God it is hidden in academic cloisters! This is perhaps what policy makers often do, as suggested by Paul Pillar's example of the debate between academics over perestroika witnessed by James Baker.
Another possibility is that those in the policy community wish academic political scientists were more like them for reason number three. This, however, does not make much sense. There can be no harm in making some political scientists esoteric if after all not everyone can move in policy circles. The training of academic political scientists still provides disciplined minds. Nor does it appear to be the case that there is a shortage of policy-eager political scientists to staff government bureaus and policy-focused beltway agencies and advocacy groups. In this light, academic political science may be accused of leading the youth astray, but no more than poetry or physics departments.
So what is it that makes many in the policy community so uncomfortable with academic political science, and for that matter why are political scientists so anxious about being labeled as not policy relevant? The best I can come up with again involves those zombies. Zombies eat the living. They move slowly, clumsily, if inexorably. People who run away can escape the zombies. So, the problem for zombies is that they cannot really catch unwilling prey. Academic political scientists, for their part, are strangely attracted to these undead creatures. They run, but not vigorously. Having your brains eaten is bad, but still, it is nice to be valued for something in which you have considerable pride....
Academic political scientists keep looking back to see if they can make eye contact with one of those zombies, maybe share a good anecdote, provide some advice, secure funding for the next research project...
There is the hint of the symbiotic relationship between predator and prey, political scientist and policy community. Each needs something from the other, even as both communities see the other as distant, alien. Policy practitioner-political scientists who disdainfully remark that they cannot even read the American Political Science Review would never see the need to make such a comment about a journal like Solid State Physics, or the Journal of Philosophy. Academic political scientists, for their part, should stop pretending that their main value to the policy community at present is in their expertise and fess up, if appropriate, to providing credibility or intellectual discipline (directly or through our students).
Becoming comfortable with this duality as a community also means embracing the differences that follow from that duality. Some of us should be in the ivory tower, just like physicists, chemical engineers, and art historians. In order for political science to fulfill the objective of expertise, it must --- like other fields of expertise --- become "expert", and unfortunately that really means becoming largely incomprehensible to all but those deeply enmeshed in the field or a particular subfield, at least for the purposes of "inhouse" debates. Others will work best in applying, interpreting, or otherwise interacting with the "real world" -- though if this characterization of non-academia were true, we would not need anyone studying it (i.e. how does one know the real world and still hanker after insights that would connect his-or-her actions with the (unknown) implications of policy?). In any case, those of us on the academic side should stop teasing the zombies, just as the zombies should stop pretending that every academic brain is a ready meal. "Policy relevance" is a complex set of social phenomena that both attract and repel political scientists on both sides of the policy divide. Let some of us be more like our poet, mathematician or linguist brethren and become one with our academic-nerd nature. Others can prefer to engage Washington more directly, but they will make themselves, and their sponsors, happier if they are candid about the fact that those within the beltway want your brains (or your soul), not your insights.
New Year's is a time for resolutions, a time for pledging to shed the bad habits of the previous year. Goodness knows, the foreign policy community and public commentators who occasionally foray into international relations have accumulated a lot of bad habits over the past year. Here's a list of nine memes, tropes, rhetorical tics, and baseless arguments that I'd like to see less of in 2011:
1) [Fill in the blank] is an "existential threat". This term of art has been on the rise for decades, but it seemed omnipresent this past year. To be sure, lots of actors face a lot of threats out there in world politics. The bar has to be pretty high, however, for something to be an "existential threat." For my money, it means that the country or its modus operandi could be completely extinguished.
By this criterion, there are no existential threats to the United States in the international system. In 2010, this term was increasingly used by Israelis with respect to Iran. So, let's stipulate that if Iran were ever to acquire/develop, say, a dozen nuclear weapons, then the country would represent an existential threat to Israel. Commentators who do this, however, would also need to stipulate that Israel, in possessing 60-85 warheads, has represented an existential threat to Iran for decades.
2) Iran or North Korea are "irrational" actors in world politics. Look, it's a lot of blog fun to point out the absurdities of the Kims or Ahmadinejad -- I get that. A country's leaders are not "irrational" simply because they don't want the same things that you want, however -- it just means that they have a different set of preferences. The question to ask is whether these governments are pursuing their desired ends in a strategic, utility-maximizing manner. Based on their behavior in 2010, I'd say the answer for both governments is yes. So quit saying either Pyongyang or Tehran might be crazy enough to launch a pre-emptive nuclear strike for no reason. Both regimes have really strong self-preservation instincts -- the "crazy man launching the bomb" contingency ain't gonna happen.
Note that this doesn't make it any easier to bargain with these countries -- these might be zero-sum bargaining situations. Eliminating the "irrationality" dimension, however, might make public debates about what to do a little more grounded.
3) Wikileaks is just like a newspaper: No it isn't. There's a lot of loose talk about how Wikileaks is really just like Bob Woodward or the New York Times in what it's doing. And sure, Wikileaks' Israel Shamir bears more than a passing resemblance to the Times' Walter Duranty. To my knowledge, however, neither Woodward nor the Times has threatened to publish documents in its possession as a blackmail device, or warned people to get their money out of a financial institution before something damaging is released.
Wikileaks is an NGO with a quixotic leader who really doesn't like the U.S. government... which makes Wikileaks like a lot of other NGOs. I think a lot of the invective directed against the organization has been misplaced, and I agree with Gideon Rachman that Wikileaks has actually done the U.S. a favor. That said, it's not an ethical journalistic entity.
4) Barack Obama does not really believe in American exceptionalism. I've heard this from a lot of Republican wonks who should know better. The meme emerged from an answer that Obama gave at a news conference in April 2009. John Dickerson at Slate addressed this a few weeks ago:
In the complete answer of nearly 300 words... it's clear that Obama is saying something more complex. "We have a core set of values that are enshrined in our Constitution, in our body of law, in our democratic practices, in our belief in free speech and equality, that, though imperfect, are exceptional," he says. What makes this moment notable is not that the president nails it but that, in real time, without a carefully crafted set of talking points to guide him, he is trying to find the balance between singing the song of America and demonstrating before a foreign audience that he understands that America is not the only country in the world.
6) Networks will end hierarchy as we know it. Arguments like this one tend to underestimate the ability of hierarchical organizations to evolve over time. More importantly, networks can have plenty of hierarchy embedded within them.
7) Sarah Palin's views correspond to either American public opinion or the official policy of the U.S. government. Um... no. Palin has perfected the art of using a tweet or Facebook post to capture attention. She's also perfected the art of declining poll numbers and increasing unfavorability ratings.
Some commentators have blurred the distinction between Palin and the United States as a whole. Let's be clear -- Sarah Palin is a private citizen with absolutely no official authority. Using her as an example of how "America" or the "United States government" feels about an issue is disingenuous in the extreme. When the Wall Street Journal editorial page sides with Michelle Obama over Sarah Palin, I think it's safe to say that the former Alaska governor has jumped the shark.
8) Because some international relations scholarship uses numbers, it's useless to foreign policymakers. Click here and here -- really, there was a surprising amount of crap written about this topic this year, and it would be just peachy if it could stop.
9) 2011 will be all about zombies. Oh, wait... that one is true.
OK, there's a lot of smack talk in this post. So, in the interest of karma, readers are warily encouraged to suggest what my 2011 resolutions should be for the blog.
[You meant warmly, right?--ed. Uh.. sure.]
Over at The Atlantic, Ta-Nehisi Coates talks about why he largely abstains from cable news appearances and why this is in and of itself a problem:
The outlines of the problem are becoming clear--I'm a snob. More seriously, it's my impression that much of cable news is rigged. Complicated questions are forced into small spaces of time, and guests frequently dissemble in order to score debate points and avoid being intellectually honest. Finally, many of the guests don't seem to be actual experts in the field of which they're addressing, so much as they're "strategists" or "analysts." I strongly suspect that part of the reason this is the case is talking on TV is, itself, a craft and one that requires a skill-set very different than what is required of academics. I'm sure many academics themselves share the disdain for the format that I've outlined. Finally, the handful of scholars who regularly appear on the talk shows, generally aren't of the sort that hold my interests.
With that said, it's very difficult to inveigh against these shows when you refuse to participate. The discomfiting fact is that cable news reaches a ton of people, many of whom--presuming they're interested--could use the information (emphasis added).
As an academic who is occasionally asked to be on TV/radio (after the producer has gone through their top ten options), I have similarly mixed feelings about the skill mismatch. Speaking from my own experience, I find that my biggest weakness in these venues is that I genuinely want to answer the question asked of me.
You'd think this would be a good thing, but it's not, because it means that you're a hostage to the interviewer's ability to ask good questions. Usually if you're asked to be on a program, you know what the news hook is, and you should (obviously) know your overarching take on the issue. The problem, for me at least, is that no interviewer asks, "So what do you think?" Instead, they'll ask a more specific question -- which I then try to answer specifically. I've rarely been able to integrate a specific answer with the larger theme I want to stress in the appearance.
I suppose I could just admit my failings and abstain from these kinds of media appearances. One of my 2011 resolutions, however, is to try and get better at doing this sort of thing.
I'll have my list of proposed resolutions for the rest of the foreign-policy community tomorrow.
As the rest of the Foreign Policy gang hobnobs with the foreign policy glitterati tonight, I'm stuck in Boston mulling over the fact that Tom Friedman managed to earn a Bullock.
What is a Bullock? You might recall that earlier this year Sandra Bullock managed to win both an Academy Award for Best Actress (for The Blind Side) and a Golden Raspberry for Worst Actress (for All About Steve) -- the first time that has ever happened. So a Bullock is when one manages to earn both a "best of" and "worst of" in the span of a single year.
Lo and behold, this week Friedman's name appears on both Foreign Policy's Top 100 Global Thinkers as well as Salon's Hack Thirty -- which is definitely the first time that's ever happened. What can we infer from Friedman earning the Bullock? I suppose this depends on who you ask and which mention you think is the more unjustified. Friedman is certainly the most prominent international relations columnist working today. Your humble blogger has had his occasional issues with Friedman's columns. That said, even Friedman's harsher critics tend to acknowledge that he makes an interesting point every once in a while. And I've had to write enough 700-word columns in my life to know that it's a much harder task than most people realize.
In a perfect world, foreign affairs columnists would rotate in and out of the op-ed pages after 18 months or so. In the branding world in which we live, I can think of better options than Friedman, but man, I can think of a lot more aspirants who would be worse.
This goes back to my point about the opportunity cost of stupid ideas. Friedman is frequently wrong (as are we all), but he's usually wrong in a way that tends to require serious engagement rather than a backhanded wrist-slap or easy put-down.
For comparison in terms of stupidity, consider Dan Shaughnessy's latest Boston Globe column in which he suggests that the Boston Red Sox sign Derek Jeter:
Suppose the Red Sox step up and shock the world? There is simply no downside to making Jeter a massive offer. In the worst-case scenario he calls your bluff and you get the Yankees captain.
I don't care if Jeter is way past his prime or if the Sox would have to wildly overpay a player of his diminished skills.
I say offer him the world. Forget about Jayson Werth. Blow Jeter away with dollars and years. At worst this would just mean the Sox would jack up the final price the Yankees must pay. It could be sort of like Mark Teixeira-in-reverse…
What's the harm in offering Jeter $20 million a year over three years? If you can pay J.D. Drew $14 million per year… if you can pay a Japanese team $50 million just for the right to speak with Daisuke Matsuzaka… if you can buy a futbol club for $476 million, why not spend $60 million to bust pinstripe chops for all the ages?
Jeter is closing in on 3,000 hits. Imagine if he gets his 3,000th hit as a Red Sox… at Fenway… against Mariano Rivera?
Since we are pretty certain Adrian Beltre is gone, the Red Sox have a big hole at third base. Jeter could play third. Or you could trade Marco Scutaro and put Jeter at short.
This certainly would make the Sox less boring.
This is bad even when grading on a Shaughnessy curve, which already sets the bar ridiculously low.
First, it's horribly written: in the span of three paragraphs, Shaughnessy manages to give two very different worst-case scenarios. Which is it, exactly?
Second, it's horribly argued. If Jeter is not going to move off of shortstop for the Yankees, why would he do it for the Red Sox? Smart baseball people will tell you that Jeter's recent numbers don't justify anyone paying him $20 million a year -- and no one but the Yankees should even pay him $15 million. If I'm the Red Sox, I would make a play for closer Mariano Rivera -- but why sign an aging shortstop when the Red Sox already have one decent veteran (Marco Scutaro) and two pretty promising younger shortstops (Jed Lowrie and Jose Iglesias)?
Shaughnessy thinks the merit of this option is to force the Yankees payroll up. OK, except that a few paragraphs down, he implies that the Red Sox budget is essentially unlimited. There's no world in which a) the sky is blue; and b) the Yankees have a more constrained budget than the Red Sox. Either there are opportunity costs in paying Jeter a lot of money (in which case the cost for the Sox is greater) or both franchises are so rich that money doesn't matter (in which case there's no point to starting a bidding war in the first place).
I've just wasted untold minutes and several neurons of brainpower to explain why Shaughnessy's column might be the stupidest sports column I've read this year. It's not even stupid in an interesting way -- it's just a brainless rant. Arguing when and why Tom Friedman is wrong doesn't feel like the same waste of time to me.
In other words, he deserves his Bullock.
Question to readers: if not Tom Friedman, who would you want to read on world politics on the New York Times op-ed page?
Daniel W. Drezner is professor of international politics at the Fletcher School of Law and Diplomacy at Tufts University.