Thursday, December 23, 2010

Why Civil Marriage Ought to End

The debate about who should be allowed to marry is, first and foremost, a debate about what the word 'marriage' actually means. Is it a descriptor of the loving union of two people? Or is it an institution upon which families (and specifically the optimal raising of children) are based? A series of court decisions have cast doubt on whether marriage is really about procreation: one of the foremost arguments against the procreative view of marriage is that if the point of marriage were to beget children, then we wouldn't let old people or the infertile marry at all. Therefore, the logic goes, the institution must be about two people loving each other and having an exclusive sexual relationship, and so it is discriminatory not to let gay couples participate in marriage.

Princeton professor Robert George attempts to bolster the procreative argument in a recent paper. His argument is that, even though some heterosexual couples can't have children, they are oriented towards that goal through the nature of the act. He makes the analogy to a baseball team: all baseball teams are structured to win baseball games, but some win and others lose. Yet the losers are still baseball teams, and in a similar way non-procreative heterosexual couples are still oriented towards procreation.

Over at Slate.com, Kenji Yoshino responds:

I suspect it will be cold comfort to many infertile opposite-sex couples to hear that while their marriage is still "real," it is a "losing" marriage as opposed to a "winning" one. Ideally, most of them view their marriages as something more than honorable defeats and would despise the contention that they had not fulfilled the central purpose of the institution. Moreover, the article says nothing of straight people who choose not to procreate. It is unclear why they would have "true marriages," as they are not even trying to win.


This argument tells you everything you need to know about the pro-gay marriage side of the debate. The first half of the paragraph hints at the skepticism, made more explicit elsewhere in the piece, that marriage has to do with anything other than the shared love and sexual intimacy of two people. (I'll just say, in passing, that the notion that marriage is only about love strikes me as historically, anthropologically and psychologically clueless.) But the second half of the paragraph gets at why, despite my much greater sympathy for Robert George's point of view, I believe Yoshino ultimately has the better of the argument. You cannot argue that marriage is about an orientation towards procreation when a large number of married heterosexuals explicitly reject that belief, not by accident of biology, but by choice. And, more important, there are no legal or social repercussions when a married couple chooses not to have children.

However, while Yoshino is right that George's procreative argument does not make a sufficient case for preserving marriage as a heterosexual-only institution, he is wrong that his counterargument makes an effective case for gay marriage. Consider this sentence: "Closely examined, the common-procreation argument denigrates not only same-sex couples but several kinds of married opposite-sex couples." He is speaking about the State, our government, smiling upon certain types of loving relationships and 'denigrating' others. Does our government have any interest, at all, in licensing personal, loving relationships? Especially if no-fault divorce means they can be dissolved at any time, for any reason? Of course not. Gay marriage advocates are arguing for inclusion in an institution that, from a civil point of view, is completely hollowed out. If they were intellectually honest, they'd be arguing for an end to marriage as a legal concept, and its replacement with a system of contracts: we could have a basic civil union that covers things like joint tax filing and hospital visits, and a child-rearing contract that expresses the intention of two people to commit to raising a child (whether conceived by the couple or adopted) to adulthood.

To go back to the question of how you define marriage, it isn't an 'either/or' issue but a 'both/and' issue. Marriage is about love, and procreation, and raising children well, and (at least for religious folks) making a commitment to God and Church about how you intend to conduct the rest of your life in partnership with your spouse. But that is very explicitly a religious vision of marriage. In our society, I would argue, the only consistent, valuable definitions of marriage are religious ones.

Our society is deeply influenced by its Christian heritage. However, once that faith ceases to be vital in the public square, that influence is not sustained, which means basic assumptions about our society enshrined in law, like marriage, come under scrutiny. I believe marriage as a civil institution was fundamentally hollowed by no-fault divorce, which strips a sense of duty and commitment out of marriage. That shift in the law was largely embraced by the public, including practicing Christians, who get divorced with basically the same frequency as the public at large. If we wish to have a religiously informed vision of marriage enshrined in civil law, Christians must prove through their actions the superiority of that vision. Until then, the best (and most fair) thing to do is to end civil marriage and return it to the religious sphere where it finds its truest expression.

Thursday, December 16, 2010

The Wall Street Two-Step

I've long harbored the suspicion that our modern financial sector is something of a parasite, sucking wealth out of the American bloodstream. This suspicion intensified when I moved to New York and started meeting wealthy people who weren't obviously doing anything valuable to justify their riches. (I know, a nervy statement coming from someone in advertising.)

But this article, from Tyler Cowen, brought these fuzzy thoughts into sharp relief. Cowen starts with the mission of explaining income inequality, but transitions to focusing on how Wall Street, in his words, "has learned how to game the American (and UK-based) system of state capitalism." How? By "going short on volatility," or by making financial bets that assume what is likely to happen will always happen. Cowen explains it well here:
To understand how this strategy works, consider an example from sports betting. The NBA’s Washington Wizards are a perennially hapless team that rarely gets beyond the first round of the playoffs, if they make the playoffs at all. This year the odds of the Wizards winning the NBA title will likely clock in at longer than a hundred to one. I could, as a gambling strategy, bet against the Wizards and other low-quality teams each year. Most years I would earn a decent profit, and it would feel like I was earning money for virtually nothing. The Los Angeles Lakers or Boston Celtics or some other quality team would win the title again and I would collect some surplus from my bets. For many years I would earn excess returns relative to the market as a whole.

Yet such bets are not wise over the long run. Every now and then a surprise team does win the title and in those years I would lose a huge amount of money. Even the Washington Wizards (under their previous name, the Capital Bullets) won the title in 1977–78 despite compiling a so-so 44–38 record during the regular season, by marching through the playoffs in spectacular fashion. So if you bet against unlikely events, most of the time you will look smart and have the money to validate the appearance. Periodically, however, you will look very bad.


This is essentially the same as Nassim Taleb's Black Swan argument, but Cowen explains why it can in fact make sense for Wall Street to ignore Black Swan possibilities. Taleb argues that there is opportunity in betting against the herd (and also that the herd is too stupid to see these risks), but what if the Wall Street herd is actually smart enough to know that the government will have to step in when these unlikely events happen, to 'save the system'?

So far, Cowen is simplifying and clarifying arguments I have come across before: that, essentially, we are held hostage to Wall Street because it is the lynchpin of the economy. But one could still argue that it is in the best interests of individuals not to fail, because they will lose money, prestige, and opportunity. That is where Cowen makes his most devastating observation:

Another root cause of growing inequality is that the modern world, by so limiting our downside risk, makes extreme risk-taking all too comfortable and easy. More risk-taking will mean more inequality, sooner or later, because winners always emerge from risk-taking. Yet bankers who take bad risks (provided those risks are legal) simply do not end up with bad outcomes in any absolute sense. They still have millions in the bank, lots of human capital and plenty of social status. We’re not going to bring back torture, trial by ordeal or debtors’ prisons, nor should we. Yet the threat of impoverishment and disgrace no longer looms the way it once did, so we no longer can constrain excess financial risk-taking. It’s too soft and cushy a world.


If, to pick one example, Wall Street traders who lost billions of dollars lost every dollar they had, or spent years in prison, or were exiled to Zimbabwe, individuals would have incentive to resist following the investing herd. But they end up only slightly less rich and successful if they fail than if they succeed. This allure of big money without big risk, as Cowen and others observe, draws smart, driven people away from other fields (creative endeavors, entrepreneurialism, scientific exploration) where success is a prerequisite of financial reward.

The only problem is that this whole no-lose system depends on governments being able to bail out the banks when the periodic crashes happen. But that dependence will encourage financial firms to take bigger and bigger risks until they overwhelm the government's ability to intervene. Whether what follows is another depression or societal collapse is unclear, but we can be sure it will be ugly.

Monday, December 6, 2010

The Limits of Facebook

Let me first start by saying I have a not-entirely-rational animus towards Facebook. It's mostly because I'm a contrarian: when all my basketball-loving friends were touting LeBron James as the greatest player of the 21st Century, I predicted he would fail because he was too immature to handle the fame and money. (In my defense, I underestimated his talent, but he clearly has some maturing to do, even now.) Basically, I get annoyed when someone or something is anointed 'transformative' before they've earned it. So when I read Mark Zuckerberg's recent comments that Facebook would 'reform' the world of entertainment within five years, I felt my teeth grinding.

Of course, that's hardly the most outrageous thing anyone has said about Facebook: one Russian tech investor with a stake in the company said it could help create artificial intelligence within the next decade. All of these claims are based on one central contention: that the massive amount of data, and the underlying 'social graph', that Facebook controls will enable it to transform industry and technology in unprecedented ways. In the interests of brevity, here are three reasons why I don't think it will happen, at least not to the extent Zuckerberg and his investors would like:

1) Who owns the data?
Facebook has courted controversy multiple times when it has changed its privacy settings or let third parties access its data. So far, none of this has seemed to lead to any mass exodus from Facebook. But the business needs of Facebook fundamentally clash with the desires of its users. Most people I know use Facebook to keep a virtual finger on the pulse of distant friends, and as a way of sharing lower-level personal news (photos of your vacation, where you went for dinner on Friday, the celebrity you saw at Saks Fifth Avenue) with a bunch of people easily and unobtrusively. Then there are the people who are on the site obsessively, curating their network and their social presence to some personally defined level of perfection. The former group likes the convenience of Facebook, but doesn't need it if push comes to shove. And the latter group is mostly showing off for each other and for the majority that fits into the first bucket. If Facebook starts monetizing its networks too aggressively, people will get turned off and leave. So the power of Facebook's data only exists so long as it can avoid too obviously using the data. When Zuckerberg talks about "changing standards of privacy," what he is saying is that his business ultimately depends on his users becoming comfortable with Facebook selling their data (aka the digital version of their personal lives) to marketers.

2) The Perils of Overexposure
Unsurprising news: many divorces, including celebrity divorces, are being abetted in part by discoveries on Facebook. This isn't a failing on the part of Facebook any more than it is Visa's fault if a wife finds charges for expensive 'massages' on her husband's bill. But it is the inherent nature of Facebook to expose the ragged edges of human life, and more and more people are going to get burned by it as time goes by. Soon everyone will have the story of a coworker who got fired because of an ill-conceived post, or family members who aren't talking because one sibling saw the other blew off her baby's christening to go to the movies. Once people start aggressively self-censoring, the site becomes less fun, and the data less valuable.

3) Our Network is not Our Brain
One of the fundamental assumptions of Facebook is that we'd rather learn things from our friends than from an algorithm. Zuckerberg and his boosters think we will soon be deciding what to buy, where to eat, and what to believe thanks to the power of our network. And to some degree, that will happen: there is a fairly compelling body of research that we are subtly influenced by the decisions and beliefs of the people around us. But where Facebook goes wrong is in assuming the subtle interactions that drive so much human behavior are easily replicated online. If I see a friend wearing a really nice jacket, I might be envious and go get a similar one. But if I see a friend of mine 'likes' Burberry on Facebook, that is not going to arouse my desire for Burberry coats: if anything, the display of sycophantic passion for a consumer brand is going to be a bit of a turnoff. I'll think less of my friend, and maybe a bit less of the brand. The reason social effects drive so much real world behavior is because we aren't really aware they're happening. When we become conscious that someone is trying to influence us, we react very differently.

In the movie The Social Network, the Eduardo Saverin character wants to sell advertising on the site. Zuckerberg, under the influence of Sean Parker, is hostile to the notion, because "ads aren't cool." This is a basic insight into Facebook's enormous success: people like Facebook because it seems like a safe space to connect with people they know. But Facebook exists to make money, and the desires of its users and owners are in tension, if not conflict.

I am not predicting Facebook's demise: I learned better after talking down LeBron. But like LeBron, Facebook may not fulfill all of the lofty expectations people have for it, and when the hype has gotten this out of control, moderate success can feel an awful lot like failure.

Monday, November 29, 2010

Dreams for the 21st Century

In church on Sunday, our priest mentioned in passing Dr. Martin Luther King, and posed a question (I'm paraphrasing): "Who would have thought, when Doctor King gave his famous speech on the National Mall, that his dream would be realized in the lifetime of those hearing it?" The thought, which I have heard expressed before, set me thinking about how a clear vision of the future can actually change the trajectory of society.

And so, while admitting that my dreams aren't as eloquent or as powerful as MLK's, I humbly submit three dreams that I hope to see realized in my lifetime:

I have a dream that those involved in politics will assume their opponents are arguing their positions in good faith. Having become a political junkie in my adulthood, I am disheartened by how readily most politicians and pundits cast doubt on their opponents' motives. While both sides have their share of opportunists and dealmakers, working from the assumption that a person genuinely believes his or her policy will solve the problem in question makes it much more likely that we find a middle ground, or a new solution that actually can work for everyone. This seems like asking for human nature to change, but then again, so did (does?) asking people to act without thought of race.

I have a dream that we will cherish the value of human life.
I am pro-life, but I think a major failing of those leading the pro-life movement is that they act as if the lives of the women and doctors involved are less important to them than the lives of the babies they may abort. But on the flip side, pro-choice rhetoric seeks to deny the obvious point that a human fetus is a human life. Abortions will happen in a sinful world, but appreciating the common humanity that unites everyone from the beginning until the end of life could change the tenor of the debate, and lead to a solution where abortion, at the very least, is seen as a genuine loss. I could make the same point about euthanasia, the death penalty, and medical cloning, among other topics.

I have a dream that religious people will really love the sinner. It is a difficult balancing act, in public debate, to condemn what you think is a sin without making the sinner feel rejected, shunned and unprotected by the law. We have to separate sin (which is a matter of the soul) and crime (which is when one person does harm to another). A powerful article I read recently pointed out that if Christians had done a better job of reaching out to homosexuals by, for example, caring for those with AIDS, people might be a bit more willing to listen to arguments they might make about gay marriage, because it would be clear they weren't making the argument out of hatred for gays. But the reality is that many believers can't separate the sinner from the sin, leaving the faithful open to (sometimes well-deserved) charges of hypocrisy.

At any rate, my list may seem utopian. But I hold out hope that as a society, we'll get there. (My dream for myself is that I get off the couch and find ways to make these dreams happen.) If we want to achieve the vision, we first need to clearly express the vision, which is what I've made a rough first effort to do here.

Tuesday, November 16, 2010

Which Way, America?

I think I have been hearing some version of the phrase "this is America's decisive moment" since I was old enough to care about politics, certainly since I was old enough to vote (1998, for the record). So all of the people going on about how we need to make some huge changes right now, OR ELSE are probably feeding the hype monster just a little bit.

But it is hard not to agree that the country is trending the wrong way, and has been for about a decade. It seems most people are content to assign blame (which in and of itself is a symptom of the disease). And in this column, Peggy Noonan does a bit of that, pointing at Obama's seeming disconnect from the country both in terms of policy and tone. But I don't think she really blames him; rather, she sees his ascent to power as a symptom of the disease. And that disease is one that the Tea Party succumbed to this year: looking for the quick fix, voting for passing celebrities rather than proven leaders. Noonan takes Sarah Palin to task for her fundamental unseriousness, then goes on to say:
Reagan's career is a guide, not only for the tea party but for all in politics. He brought his fully mature, fully seasoned self into politics with him. He wasn't in search of a life when he ran for office, and he wasn't in search of fame; he'd already lived a life, he was already well known, he'd accomplished things in the world.

Here is an old tradition badly in need of return: You have to earn your way into politics. You should go have a life, build a string of accomplishments, then enter public service. And you need actual talent: You have to be able to bring people in and along. You can't just bully them, you can't just assert and taunt, you have to be able to persuade.

I completely agree that we have been following the political equivalent of false prophets for a while, but I think Noonan's invocation of those who "earn their way into politics" is another potentially dangerous shortcut to an answer. If, in 2012, we elect someone with a history of service, of accomplishment, does that guarantee anything? Couldn't that, for example, refer to Richard Nixon about as well as Ronald Reagan? Or Lyndon Johnson? The problem is not that the experienced leaders aren't popular; it is that they are correctly seen by many Americans as having put us in our current bind. So rather than think coherently about the best way out of this mess, voters look for fresh faces. And unfortunately, some of those faces are essentially reality TV stars who have learned to espouse some catchphrases that fire up a good chunk of one political faction or another.

Which is why, despite its lack of concrete policy proposals or political suggestions, I really liked this David Brooks column on America's potential to be "The Crossroads Nation". Brooks manages to avoid the tired terms of our current debate about whether we need more stimulus, or what the tax rates should be, or if ObamaCare should be repealed. He looks to principles, and tries to express a vision for what our nation should stand for. I personally find it a moving vision:
In fact, the U.S. is well situated to be the crossroads nation. It is well situated to be the center of global networks and to nurture the right kinds of networks. Building that America means doing everything possible to thicken connections: finance research to attract scientists; improve infrastructure to ease travel; fix immigration to funnel talent; reform taxes to attract superstars; make study abroad a rite of passage for college students; take advantage of the millions of veterans who have served overseas.

In other words, he wants the nation to serve as a welcoming hub for entrepreneurs, thinkers and inventors from around the world. That vision both identifies a key strength of our country that is hard for others to match, and suggests a series of policies that should be pursued to make it happen. Some would appeal more to the right, some to the left. That might make the agenda harder (or impossible) to pass, but it also offers a way to escape the dull political pendulum swings that are marking time in our nation's current decline.

Thursday, November 11, 2010

On Condemnation

Two hundred years ago, the idea of slavery had many defenders: it was seen as biblically endorsed, blacks were assumed to be inferior, and it was the basis for an entrenched economic system. A combination of high and low rationales made it seem like a permanent feature of our society. Today, anyone who would endorse anything that bears the faintest resemblance to slavery would be hounded out of public life.

This evident truth occurred to me when reading a recent post by Ta-Nehisi Coates, titled, "On Improvement". He talks about some of the flaws inherent in humanity, and whether we can ever wash away the dark spots on our collective soul. He starts, innocently enough, by speculating on why we so enjoy football:

Have we created institutions which look unseemly, but actually are addressing some deeply-felt need? In relation to football, what if we--as humans--have a need to vent aggression, even if only vicariously? And what if we do this through other people who will be richly rewarded for their sacrifice, but will also suffer tremendously?


He goes on to speculate why some people persist in holding seemingly ludicrous beliefs and seeking out sources of information that reinforce them, for example, those who cling to the 'Obama-is-a-Muslim' theory. He then concludes:

I confess that I have not fully worked this out, yet. I guess I'm just wondering the extent to which we've crafted our own chains. How much of this is just who we are? How much of it can be improved and reformed?


Coates doesn't touch on what I would consider the most obvious explanation: that our sinful human nature constantly tempts us down crooked paths that make us feel good, or make our lives easier, or help us avoid standing out from the crowd. To return to slavery, think about the ways that institution rewarded southern whites 200 years ago:

1) It enriched them. (we could probably stop right there, but wait, there's more!)
2) It gave them absolute power over someone else.
3) It helped them fit in, by not condemning something the richest, most important members of their community were all doing.
4) It gave them a clearly defined sense of the enemy (both the slaves and, perhaps more importantly, northern whites who were butting in.)
5) It provided a clear social hierarchy, and ensured they would never be at the bottom of it. (This is why, I'd guess, so many poor whites who owned no or few slaves were so adamantly against abolition, and explains some of the nostalgia for the Confederacy that still exists.)

I'm sure I could go on, but the point is that slavery was deeply embedded in the psyches and the relationships of the people in the southern states, and thus required the application of powerful forces to tear it out.

Now, the interesting thing is that I could build similar lists for social causes championed today by both the left and the right. Does not abortion give the woman absolute power over the unborn child? Does not condemnation of gay marriage clearly define the enemy for conservatives? The question is whether as a society we will ever come to see abortion-on-demand or denying gay couples the right to marry in the same way we now regard slavery: that is, as a stain, something abhorrent.

One final thought. If it is true that societal ills have their root in human sinfulness, then we can expect changes in social structures, technology, and government to change what evils we tolerate, but not to prevent some evil from being tolerated by large swaths of what we generally call decent people. An example: in the 1960s, television brought images of the civil rights struggle into every American home, making it impossible for people to ignore the plight of blacks the way they had for a century. At the same time, the contraceptive pill and improved surgical techniques made it possible, for the first time, for women to separate sex from childbirth. Both technological changes created the impetus for major social changes: I would argue one for the better and one for the worse. One could imagine, say, a technology that allowed a fetus to develop in an artificial uterus after a few months of pregnancy, along with increased concerns about population declines in western nations, changing the societal calculus again.

A society's morals are not static, nor should we blindly assume that all changes are for the better. But we should hold out hope that we can learn and improve. Often, the sign of our progress is marked by those old practices which are now so widely condemned that (we hope) they will never curse us again.

Monday, November 1, 2010

Retreating to Psychology

My daily reading tends to cycle rapidly between political commentary and sports writing. Today, I read two pieces that demonstrated to me just how much of my life I am wasting on reading other people's dime-store psychologizing about public figures of all sorts. First, here is Maureen Dowd on President Obama:

His arrogance led him to assume: If I build it, they will understand. He can’t get the gratitude he feels he deserves for his achievements if no one knows what he achieved and why those achievements are so vital.

Once it seemed impressive that he was so comfortable in his own skin. Now that comfort comes across as an unwillingness to be wrong.


And here is Bill Simmons on Dwyane Wade:

The overthinking-it-but-maybe-I'm-right explanation: Maybe everyone slowly realized during the preseason, "Good God, LeBron is MUCH better than Dwyane. What do we do? How do we handle this? Do we wait for Dwyane to admit it? Do we ... wait, what do we do???"

Maybe Wade can feel it. Maybe his competitive juices are kicking in. No, no, we're equals. He's not better than me. We're equally good. Look, I'll show you. Maybe it's just been the elephant in the room for six weeks. Maybe deep down, everyone knows the Heat can't take off until Wade has his "You can be chairman and CEO, I'll be president and COO" moment. It goes beyond who gets to take the last shot. It's about the dynamics of basketball. It's about someone emerging as the emotional leader, the spine of the team, the guy who says over and over again, "I got this." And you can't keep saying that if you're looking over your shoulder worrying that someone else is saying the same thing. It's like a fly ball in the outfield. Eventually, someone has to call it.


In case you don't feel like clicking through to the full pieces, I'll give you a summary. Dowd's point is: Obama was too cocky and believed his own hype, and now he's paying for it. Simmons' point is: Both LeBron and Wade are used to being the star, and one of them is going to have to defer to the other if they are going to succeed. Both of these themes are so obvious and well-worn that if they had written their pieces without resorting to pop psychology, they wouldn't have been much longer than those sentences.

Why is psychological speculation so compelling to readers? Sadly, I can only answer the question by resorting to it myself: readers today are overwhelmed by how complex and challenging things have become, so they take comfort in speculation about behavioral drivers that they can easily understand. We may not be able to figure out an agenda that will both help the country and appeal to Obama's political base, and we may not be able to envision an offensive system that will maximize the combined talents of LeBron James and Dwyane Wade, but we can presume that we understand what's going on in the heads of famous people, and that makes us feel smugly superior to them. At least I'm not as cocky as the President (or Dwyane Wade).

There are writers who make more substantive arguments: Walter Russell Mead is a great example of someone using historical and strategic insights to help his readers understand what is happening in the world right now, and what might happen in the future if certain trends continue. His recent post about the sorry state of our politics, and the structural weaknesses of both parties that keep them from addressing our major problems, was compelling and avoided cheap point scoring. But as long as we continue to indulge our collective intellectual laziness by analyzing why our public figures don't behave exactly as we'd like, we will remain hopelessly far from finding new solutions to our tired problems.

Tuesday, October 19, 2010

What Soccer Can Teach Us About the Value of "The Best"

Word came recently that my beloved Red Sox are close to buying Liverpool, one of the soccer (I should say football, especially since I'm in Europe) clubs in the English Premier League (EPL). While I would prefer that the Sox owners put the money into acquiring a few stronger players, I understand the business motivation: after the last World Cup, there is (again) talk that soccer is finally catching on in America, and investing in a premium franchise, and in an English-speaking country, makes sense as a long-term play.

Well, so I thought, to the extent I thought about it at all. The essential logic is that the EPL represents the best in soccer, and that those franchises will only continue to increase in prominence and importance. Numerous books on business and branding have noted the phenomenon that the rich get richer, that there are many industries where the winner increasingly takes all: the most successful, the most powerful, the most well-known get ever more money and fame while second-best fades.

However, this column by Theodore Dalrymple, ostensibly about the World Cup, caused me to reconsider. The key point:

There is also an interesting contrast between the way the professional sport is practiced in Germany and in England.

The English football league generates far more money than the German, and most observers deem it the best in the world, at least in terms of attracting hundreds of millions of television viewers. Players in the English league are much better paid than those in the German league (though the Germans are hardly impoverished). However, most of the players in the top echelon of the English league are foreign. Many of the best clubs have only two or three English players, and some have none. English clubs import players; German clubs foster and train German players. English clubs are largely owned by foreigners, such as Russian oligarchs of the most dubious reputation; German clubs are owned by Germans. English clubs lose money and are highly indebted; German clubs make a profit and have monetary reserves. And as I have already mentioned, the German national team plays incomparably better than the English national team.


As a marketer, I have to think that the German teams (and German soccer), as brands, are in the stronger long-term position. They are authentic, local, and sustainable, all modern branding buzzwords for a reason. One could imagine a future in which EPL fans grow disenchanted with their crazy owners (who aren't English) and pampered players (who aren't English) and come to see their league as fatally compromised. One could also see a future in which the unsustainable financial practices of these teams force a paring back, with the best talent going to other leagues, including the German ones, and cutting into the EPL's talent advantage.

If John Henry and the other Red Sox owners wanted my advice (which of course they don't), I would tell them not to invest in a bubble inflated by foreign money chasing "the best". They should try to be a part of building something (which is what they've done with the Sox) rather than buying at the peak of a sports bubble. They might even think about using some of that money to build soccer in the US. But I suspect they'll find the value of owning the Liverpool 'brand' to be much less than they expect.

Wednesday, October 13, 2010

The Incentive to Mediocrity

In my last post, I argued that greatness is increasingly unlikely to occur when society rewards talent so lavishly in its embryonic state. I would like to examine the issue from another perspective: is there, in fact, an incentive in modern life to be mediocre?

Throughout most of history, luxury was exceedingly rare, and only those at the pinnacle of power (think kings, dukes and the like) could harness sufficient economic resources to have it. This changed to some degree with the industrial revolution, but the real shift came with the advent of electricity and broadcast technology, which for the first time allowed most people to consume entertainment regularly. The advent of commercial air travel then allowed the non-elite to consume experiences (going to a beach in the winter, seeing exotic places or great art) with some regularity. After this, you no longer had to be a king to receive the best society has to offer, just moderately wealthy: the top 1% of US households bring in more than $350,000 a year, which will pay for a very nice home, luxurious travel, and a damned big television.

So the incentives, perhaps, start to change for people. Is it worth it to kill yourself for your art, your science, your political platform, if you can achieve recognition and wealth without going quite that far? For those without genius-level talent, a great effort is necessary just to get to that elite level, and they will work extremely hard if they want all the perks of modern life. But the most gifted can achieve great success without supreme exertion: these are the people who 'make it look easy.'

I admit there is no proof to back up my idle theorizing, but the nature of modern luxury seems to offer at least a partial explanation for why society has produced so few truly great individuals in the post-WWII era.

Sunday, October 10, 2010

The End of Greatness

Would Genghis Khan have conquered most of Asia and a big chunk of Europe if he had been able to vacation in the South of France? Would Tolstoy or Dostoevsky have toiled over thousand-page novels if they could have gained fame and fortune from an Oprah's Book Club sticker? Would the leaders of the American Revolution have stuck with their hard path if the UN had existed to intervene and 'talk it out'?

These are, perhaps, stupid questions. At least, on their face, they are unanswerable. But they point to something I think is important: the great reduction of greatness in all walks of life in our modern times. Whatever the realm you choose to inspect, whether statecraft or art or science, we seem to be suffering, in the last 70 years or so, from a distinct absence of greatness. One could argue that we might recognize greatness in some of our contemporaries only once time has passed, but I think we'll find, even decades from now, that this time will be marked by a dearth of the extraordinary. Why? Because the luxuries and temptations of modern society sap the most talented of their will to punish themselves to scale the mountains which they might be capable of ascending.

Society has become so good at recognizing and celebrating talent that it rewards the gifted before their talents have matured. The intelligent, the creative, the visionary: for the most part they are absorbed into the upper levels of privilege before they have had the chance to achieve true greatness. If a talented 20- or 30-something is whisked off to Cannes or Miami or given the funds to afford a lavish lifestyle in the great cities of the world, chances are they are going to be diverted from whatever greatness they still had to achieve.

I can't imagine an easy solution to this problem. We just have to hope that later generations will be better able to resist the lures of a well-developed material culture, and will once again be willing to walk the hard road to high achievement.

Saturday, September 18, 2010

Is Evil a Heritable Trait?

As an American businessman attending meetings in Germany, I have had to bite my tongue frequently. For example, over dinner in Berlin with some wonderful German beers, I had to suppress the strong urge to blurt out, "This is a beautiful city, especially considering the pounding we gave it seventy years ago." I wonder if my counterparts also ever stop and realize that two generations ago, our relatives would have been doing their best to kill each other. So far, at any rate, I've managed to keep Basil Fawlty's directive, "Whatever you do, don't mention the war!"

On the plane home the other day, these thoughts led me to what might be a commonplace observation: I could never imagine the Germans I've met throwing their support behind anything as vile as the Nazis. Now, there are some traits that are stereotypically German which I see in abundance: they are serious, hard-working, comfortable with hierarchy. You could see how they would make good soldiers: they seem like they would take to discipline very easily. But for the evil of Nazism to take hold, you would expect there to be some hint of that darkness in the character of the people, and I at least haven't seen it.

This is important because many materialists will tell you that free will doesn't exist in any meaningful sense. History and individual behavior are essentially pre-ordained. But if this is true, our personal and group behaviors must come from our DNA, and if so, they must be relatively immutable, certainly over the span of a few generations. A people capable of aggression and cruelty on an industrial scale, especially a relatively homogeneous people, should show at least the potential for it fairly constantly.

I'm sure some clever materialist has come up with an explanation for this: the most plausible is that these cultures mask their traits as they recover from a defeat. But I think it much more likely that choice makes all the difference. The free will of parents to raise their children to hate violence changes the way they will feel as adults about war. Teaching ethnic and social tolerance will reduce fear of the other.

The good news is that our DNA does not determine our behavior, and whether we embrace good or evil in our lives. The bad news is that our DNA does not make us immune to the kind of evil that overtook Germany in the middle of the 20th century. If society makes bad choices, it could happen anywhere, including here.

Monday, August 30, 2010

Lazy or Sick?

As a thought experiment, imagine that we discovered a virus that inhibited certain higher brain functions. Most subjects infected with this virus would exhibit a greater tendency towards short-temperedness, and would be seen as 'difficult' or 'moody' by others. A few, though, become criminally anti-social. When a treatment for this virus is developed, many formerly hardened criminals become model citizens, exhibiting none of the destructive tendencies that had seemed so hard-wired.

How would this change the way we viewed criminals? (And, for that matter, the cranky uncle who drives everyone nuts at Thanksgiving?) My hypothetical above does not claim that the criminals and the cranky were powerless to resist these urges, just that resisting was harder for them than for the non-infected. I would contend that society would split into two camps: those who felt that people with the virus had been dealt a bad hand but that their behavior was still their fault, and those who would argue that people cannot really be held responsible for behavior driven to a large extent by an outside influence.

I bring this up because of a brief article in New Scientist that outlines the link between a mouse virus and Chronic Fatigue Syndrome (CFS). The science is clearly at an early stage, and previous attempts to link CFS to a virus have not borne out. But here is the key finding:

Shyh-Ching Lo of the Food and Drug Administration in Bethesda, Maryland, and colleagues found that blood samples from 32 of 37 people with chronic fatigue syndrome contained "polytropic" murine leukaemia virus-related fragments, compared with only three of 44 healthy blood donors.


Now, having done some work on this topic, I can say that many physicians, especially older male physicians, put CFS in a bucket of "women's conditions", along with related syndromes like Fibromyalgia and Restless Leg Syndrome. They tend to believe that the women in their care have underlying psychological problems that are manifesting themselves in these syndromes. That some anti-depressants have proven helpful in alleviating several of these syndromes reinforces their view. To say these physicians are dismissive of these problems and these patients is an understatement.

Patients would widely embrace the identification of a 'real' cause, and would undoubtedly demand a level of care and support for their condition far beyond what they receive today. But notice that not every CFS patient has the virus, and not everyone with the virus has CFS. That implies either that the virus may contribute to the syndrome without fully causing it, or that the reaction to infection might vary enough that a significant number of the infected aren't noticeably sick. (And would this be so surprising? After all, people react very differently to infection by the same cold viruses.) But if, for example, 25% of the people infected with this mouse virus are not noticeably fatigued, and another 50% are fatigued or lethargic to a degree, but are still able to function, many people are going to dismiss the 25% who are most affected as lazy, as milking their diagnosis. And it will be hard to prove the truth either way.

As the science of health continues its amazing advance, we're going to learn more and more about the environmental influences (viruses, bacteria, chemicals, etc.) that impact human performance. If pre-natal pollution exposure lowers IQ, should the less intelligent from dirtier environments be compensated, or get preferential treatment at schools? If certain gut flora lead to obesity, do we give their hosts health coverage for bariatric surgery? With each new finding or theory, we move farther from the notion that people should be held accountable for their choices, and closer to a world where every personal failure is attributed to an outside force. It may be hard work to preserve the notion that we are masters of our own lives.

Monday, August 16, 2010

The Untalented and the Fearful

About a week ago, I saw some truly lousy modern dance. It went as these things often do, with odd costumes, occasionally amusing visual gags, brief moments of coordinated physical movement, and an audience that often tittered at what one could only guess were inside jokes. And so my mind wandered, a bit.

I found myself trying to determine if the dancers had talent in any discernible way. Did they, through a combination of native ability and hard work, demonstrate high aptitude, if I took what they were doing at face value? I thought back to other modern dance performances I have seen, and in subtle ways, I thought these dancers were probably a bit less skilled than many of their peers. To put it another way, I thought other dancers I have seen would have performed the same routine with more grace, more flair, and more humor. But those differences are minimized by the art form itself, which has sacrificed everything but novelty.

I don't mean to bitch about modern art, or not only to do so. Other writers, including one of my favorites, have done that much better than I can. But I mean to question whether the modern art forms now in such a sad state (I'd include painting, poetry, dance, classical music composition, and much of theater and architecture in this bucket) have evolved so as to coddle the talented and shelter the talentless.

If judgments about art's quality are subjective, then it is awfully mean to tell anyone their art stinks. So you don't, and the poor artists don't get weeded out. But the talented have no incentive to hone their skills, either, because that's not how they will be judged. They will be judged by novelty, their flair for self-promotion, and whether they master the language and the symbols of the in-crowd. Showing too much raw skill just might turn everyone else off.

And so the state of art is perfect for our self-esteem culture. The talentless are given a fair chance to beat the gifted. If someone rejects your work, it is not because you stink, it is because they don't get it. You don't need to master what your predecessors knew because it is old hat, and all anyone in your crowd cares about is what's next. Jackson Pollock is relevant as a cultural marker, but a young painter cannot learn what he needs by studying his predecessor's technique, because he can never fling paint at a canvas better than Pollock. The attempt would just be derivative. Maybe splatter a canvas with your own blood or desecrate a religious icon.

Great art, even mediocre art, requires deeply understanding and at least partially mastering the public standard for quality, and then either building on it or turning it on its ear. Modern art has built a Tower of Babel where the insiders only pretend to understand each other's gibberish, and then pat each other on the back in celebration of the wonderful new language each has created.

Sunday, August 1, 2010

On Rejection

If I can disagree a bit with the image at left, increasingly, we don't learn early on how to deal with rejection. It seems the self-esteem factories of childhood are dead set against letting young people experience anything that could be seen as rejection or a repudiation of their innate wonderfulness.

I am writing this from the very un-objective point of view of someone who just got a rejection letter for a short story he thought was pretty good. But despite my anger (and there's no other word for it) that some hack assistant editor decided my story wasn't worth the time of day, rejection is clarifying. And I don't think we have enough of it in our modern world, especially for young people.

In the worlds of academia and youth activities, we have worked hard to ensure that relatively few people are told their work is inadequate. There's a paint-by-numbers way to get at least decent grades, and there's always an activity willing to celebrate your effort. (Sports remain something of an exception, but only because putting lousy players on the team only delays rejection until the actual game, when you get the ultimate rejection of losing.)

Grade inflation is perhaps one of the more insidious ways that a culture of non-rejection ruins people. A smart student who gets B's and C's because their work isn't quite up to the level of the top achievers might be motivated to improve. But when everyone who is adequate gets an "A", the underachiever has no motivation to improve and the overachiever feels cheated and stops working as hard.

Even in dating, perhaps the area most likely to create rejection, the challenges have been lessened. When most people dated with some degree of commitment, the decision to be with a person or not was serious, and a lot of feelings got hurt. Now, young people increasingly have to live with a hook-up culture that discourages serious commitment, but opens the door to casual flings where no one's feelings are hurt, but no one leaves quite satisfied, either.

In adulthood, this translates into no one being willing to tell you that you could be better. I had a long conversation recently with some managers at my company, who said that they were trained to treat anyone under 30 with kid gloves and to overwhelm them with positive reinforcement, lest they become discouraged and quit. Unsurprisingly, those 20-somethings don't seem to be learning how to get better.

That's the value of rejection: it either clarifies your failings or makes you that much more determined to prove the rejector wrong. If we are losing our ability as a society to reject that which we feel is inadequate, we will end up with steadily less excellence, as too many talented people will feel their first, mediocre efforts in their chosen field are good enough. Far better to maintain a culture of high standards, where a little rejection goes a long way to motivating people.

Wednesday, July 21, 2010

The Expert Problem

I was listening to NPR the other day and heard some pundit complain about something I've heard in many venues: namely, that so little of the stimulus money has been spent. This article lays out some of the details, as do many others if you care to look. Why, you might ask, is it so hard to actually spend this money and get some stuff built? After all, there is no shortage of rundown or overcrowded highways, congested rail corridors, or outdated airports. Let's get to work!

Not so fast. The reason you don't see new bridges and tunnels being built, as happened frequently in the Great Depression, is in large part because we now have experts to clutter up these projects. Nothing gets done by the people who make things without the people who think about things having their say first. In many ways, the white collar worker is the enemy of his blue collar brother, as the reports, analyses and meetings of the former slow up the building, creating and manufacturing of the latter.

Just take a look at this page tracking progress on the Second Avenue Subway line here in NYC: you see study after study, recommendation after recommendation, but words like "dig" are barely to be found. A century ago, this city was criss-crossed with subway lines, built largely by private industry. But now it seems to take decades just to study the possibility of a project. I work a half block away from where the proposed line will run, and strongly doubt I will ever ride it.

Obviously, there is merit in looking before you leap, and in considering the best way to execute a project, as well as its possible impact, before starting. But projects like the Big Dig are beset by cost overruns and engineering failures even after the best minds study the problem endlessly, so how much value do all these reports really provide? The country aches for renewal, and improving our infrastructure is part of that. So let's stop debating and get to work.

Thursday, July 1, 2010

Faith and Knowledge

I just finished reading an interesting piece in Slate by Ron Rosenbaum on agnosticism. It tapped into something that I think is a logical challenge of being a religious believer: namely, that the only way you can be 100% sure your faith is right is to be in some way delusional. Faith is almost meaningless if it is guaranteed.

Does that make me a religious agnostic? (Or a Catholic agnostic, to be more specific?) I do think the medieval Church would have condemned this line of thinking, but I am not sure the Church of John Paul II, and now Pope Benedict, would. How can one say that religious belief cannot be compelled, acknowledge the legitimacy (if not the truth) of other traditions, and then state that a member of the Church must profess 100% certainty in all of its teachings? To believe is not to know.

And to go further, I think there is a good, theological reason we cannot be absolutely sure of our faith claims, one that has been said many times: to make God undeniable is to remove the free will which is His greatest gift to us. Even within an individual heart, an unearned certainty in one's faith turns the believer from a free soul to an automaton. And history shows that those who are absolutely certain in a belief, whether in the Catholic Church, or Islam, or Communism, or Atheism, or the superiority of a certain race, are likely to do horrible things in the furtherance of their beliefs.

There are few humans so incurious about the world that the question of why we are here will never come to them. And that sliver of a non-materialistic question opens the door to belief. As the compulsion to believe fades from human society, I think religion will get more confused and less hierarchical, but the need people have to 'grow in faith' together (a process more halting and unsteady than the phrase implies) will remain. There will be purges and renewals in faith, as has happened periodically throughout the history of all the great religions, but the existence of doubt does not destroy, and may even increase, the need for organized churches.

Tuesday, June 22, 2010

Personality is Destiny

I am going to start this post by praising Keith Olbermann, a beginning I find so shocking that I had to compensate by including the goofy picture at left. As pointed out in this WSJ article and elsewhere, Olbermann (and other liberals) have been extremely critical of Obama's handling of the Gulf oil spill, and of his Oval Office speech on the topic in particular. What I find so worthy about this is that many liberals are expressing discontent without regard for what it might do to their side's political fortunes. In contrast, conservatives generally held fire on George W. Bush until his second term in office, figuring he had to be better than John Kerry. It is, I think, good for the country when people, regardless of ideology, say what they honestly feel about political decisions, rather than playing the us-versus-them game. It helps us find common ground, even if that common ground is on the other side of the river from the President.

That said, many conservatives hear the complaints liberals have just started making about Obama and respond, "Oh, you're just noticing that now?" For them, Obama's flaws have been evident from early on: he's aloof, he has no experience running anything, he is over-dependent on experts. All of which I would argue is true to some extent, if often overstated.

Perhaps the issue is not what Obama does but, quite simply, who he is. And what he is, at his core, is a man who values success as a goal in and of itself, without caring too much about wielding the power he achieves. The presidency was just a mountain to climb, and now that he's reached the summit, he doesn't know what to do next. A Spengler essay from 2009 put it well:

I have never met the man, but I have interviewed a fair sampling of his supporters, and conclude that Obama learned the power to cloud men's minds, like the Shadow on the old radio show. Apart from ambition, there is no "there" there. There are as many Obamas as there are interlocutors.


Was this really so surprising? At one point, he cited his ability to run a presidential campaign as proof of executive experience. People have pointed out how laughable this is on its face, but I have not heard many say how frightening it is that he would equate the task of promoting himself with the task of running the most powerful nation on Earth.

Rather than continue to pile on an already beleaguered politician, though, I'd like to point out that a relatively basic analysis of a candidate's personality seems a much better predictor of their performance as President than their policy positions. Moving past Obama, George W. Bush seemed obviously cocksure, privileged and unreflective even on the campaign trail. Shouldn't we have anticipated he would not adjust well to changing circumstances, that he would retreat, sulking, into a protective bubble rather than spar with a hostile media? (I rather liked the man, all things considered, but I don't think even his most steadfast supporters could deny those two crippling flaws.)

Going back one more, wasn't Bill Clinton a textbook case of a man who would rather be popular than be right? Perhaps a bit narcissistic? I'd argue those traits explain everything from his sexual dalliances to his political strategy of triangulation. His desire to do big, important things like reform healthcare could not compete with the fundamental yearning to be loved and validated by as many people as possible.

One last point, if I may, because I anticipate an objection: If Obama, as I suggest, values personal achievement more than governing, then why did he push so hard for and ultimately get his healthcare bill passed? I would argue that it had little to do with a belief that the policy was best for the country (in the past, he advocated single payer, but abandoned that approach once he announced his candidacy). Rather, he wanted to do something that many Democratic presidents and politicians before him had tried and failed to do, something that would make him better than them. Obama was willing to take a hit to his approval ratings because he wanted to succeed where others had failed.

Why do we as a country continue to vote for candidates with massive (and fairly glaring) personality flaws? I believe it is the nature of our political system. In the primaries, the discontented stalwarts of the opposition party look for a candidate that seems the exact opposite of what they hate in the current president. They overlook (or even embrace) the flaws of this opposing personality, because he seems like a corrective to the despised enemy. In the general election, those flaws are muted, and the independent voter generally embraces someone new and different. It is only in the (increasingly rare) cases where a president finishes two terms with their popularity intact (like Reagan) that this dynamic is less pronounced.

If the model holds, we will in two or six years elect a leader whose personality is diametrically opposed to Obama's. And then feel a pang of regret when we realize we've made the same mistake all over again.

Tuesday, June 15, 2010

On Being First

I am reading World Without End, the sequel to Pillars of the Earth: both books are about medieval England, and the assortment of craftsmen, priests, nobles and visionaries who either impede or advance the efforts to build, respectively, a bridge and a cathedral in the fictional town of Kingsbridge. Along the way, the characters kill, make love, go on journeys, try to ruin each other, and do all the other things characters do to fill out two 800+ page popular novels.

But what really interests me is the way the author, Ken Follett, captures the nature of how progress is made. Both books are driven by characters who solve seemingly intractable problems through intellect and force of will, doing things no one else in their community could have imagined.

But the interesting thing is that many of the solutions they come up with seem, to the modern reader, clever but not particularly ingenious. One example: in World Without End, the main character builds the foundation of a bridge by driving stakes around the area where the pillars will go, then filling the gaps between rows of stakes with clay to block the water. Clever? Absolutely. But not revolutionary, at least not to modern minds, even though few if any of Follett's readers have done anything similar.

What you realize upon reflection is that it doesn't seem so revolutionary because you know similar things are done all the time. You might have to reason out how to do it, but you know it can be done. It is much harder to ask yourself, about something that has never been imagined before (at least, as far as you know), "Can this be done?" Follett's characters win us over because they are always asking that question, and overcoming the obstacles between them and their goals.

Being first to do anything is hard, especially in hard times like these when the easiest thing to do is play it safe and wait for things to improve. But human progress is the accumulation of knowledge gained as people try something first, and then a lot of other people applying that hard-earned knowledge to their situations. In any walk of life, there is the possibility to try the untried, and perhaps to prosper by doing something totally new.

Wednesday, June 2, 2010

Declining Opportunities for "Earned Success"?


I am quite intrigued by Arthur Brooks' arguments about "earned success", which I have seen in several places but which are summarized nicely here. In brief, Brooks suggests that it is not ultimately income that correlates with happiness, but the feeling that our success (which, yes, is often reflected in earnings) is the result of our drive, and that we have contributed in some way to improving humanity's condition. Whether that is by solving a scientific problem, making something useful for someone else, or creating beauty doesn't matter.

I agree with the notion, but I find it mildly depressing, because I believe that the future will offer fewer professional opportunities for earned success. Most of the people I know (and admittedly, New Yorkers and other northeast urbanites/suburbanites are an unrepresentative sample) do not feel their jobs are contributing towards anything bigger, at least not in any way they can observe.

And the future seems likely to create more of the same. Most of the jobs created in the western world are based on various types of information manipulation: packaging facts and ideas for consumption. It may be lucrative, it may even be challenging, but it quite often feels detached and utterly meaningless to the people in these positions. Our corporations, our systems of production and distribution, are so large that few people really have a view of how their contribution fits in. In other words, they feel they are part of a machine and not in control of their own success. This is even happening in healthcare, where doctors and nurses frequently complain that they spend much more time on administrative tasks and less time tending to patients.

This is not cause for despair, exactly. There will always be jobs in research, in the arts, and in engineering that offer what corporate positions increasingly do not: a sense that your effort makes an impact. And for the rest of us, we may have to find earned success in our families or our avocations as we are denied it in the working world.

Wednesday, May 26, 2010

It's the Energy Source, Stupid

For my entire life, the energy of human innovation has been focused on the individual: we have seen the advent of the computer, the cellular phone, and then the infinite improvements, iterations, and combinations of those technologies. When you read a "BLANK of the Future" article, whether that blank is filled in by "Car", or "House", or "Hammer", you can bet it has something to do with the microchip or wireless communication.

In fact, the only other trend that has had a major impact on the things we use has been incremental improvements in materials, allowing us to build things stronger and lighter. (Note, I'm sure someone could find a technology to prove me wrong on this, and I'm purposefully ignoring healthcare innovation because--well, because I want to.)

Now, as I've written before, what we have not innovated on is what I'll call 'collective technologies': things that affect a lot of people, but that individuals rarely 'own'. (Think railroads and other forms of mass transportation, or other infrastructure.) These have been improved in recent years, but there hasn't been anything like a major breakthrough: just incremental gains, caused mostly by improvements in materials and the introduction of the microchip into older technologies.

When I read things like this, then, about the need to establish an economic reason for further space exploration, I'm in complete agreement. But the author is stealing a base: there is simply no way, with the technology we have, that we are going to see space mining or any other sort of economic activity pay off until we have an energy source that makes space transit much, much more efficient.

Think of it this way: the rise of oil as the dominant energy source (and coal to a lesser extent) made possible every single transportation improvement over the last 150 or so years. Before that we had very inefficient, unreliable wind, water and steam power. Fossil fuels represented an order of magnitude improvement, and it took about a century (until Apollo, arguably) for us to push the limits of where that energy source could take us. Since then, no major breakthroughs on collective technologies. Why? We're waiting for the next energy source that can offer an order-of-magnitude gain on the current standard.
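To put rough numbers on that order-of-magnitude claim, here is a back-of-envelope sketch. The specific-energy figures below are commonly cited textbook values, not from the original post, and are meant only to illustrate the scale of the gaps:

```python
# Back-of-envelope comparison of specific energies (MJ/kg).
# These are rough, commonly cited values; treat them as illustrative
# of the order-of-magnitude argument, not as precise data.
specific_energy_mj_per_kg = {
    "dry wood": 16,
    "coal": 24,
    "crude oil": 42,
    "D-T fusion fuel": 340_000_000,  # theoretical yield per kg of fuel
}

baseline = specific_energy_mj_per_kg["dry wood"]
for source, energy in specific_energy_mj_per_kg.items():
    print(f"{source}: {energy / baseline:.1e}x wood")
```

On a per-kilogram basis the fossil-fuel jump over wood is a factor of a few, so the post's "order of magnitude" is better read as practical transport efficiency (energy density plus portability plus reliability). Fusion fuel, by contrast, sits roughly seven orders of magnitude above oil, which is the kind of discontinuity that could actually reprice space transit.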

And just so we can skip to the point, that ain't going to be fuel cells, and it ain't going to be wind or geothermal. Maybe solar can step up, if we create much, much, MUCH more efficient panels. Otherwise, it has to be fusion, or something the vast majority of us haven't heard of. But if you want to get humanity off this rock and into a solar-system wide economy, don't worry about what NASA is doing, invest in energy research.

Thursday, May 13, 2010

How the Supreme Court Threatens National Cohesion

Every time a new nominee is put forth for the Supreme Court, there is a lot of opining about what it means for the country, whether the nominee will tilt the balance of the court, and, as in this post by Ross Douthat, whether there should be term limits or other mechanisms to make these nominations a little less momentous.

In the course of discussing term limits, Douthat quotes blogger Matt Yglesias as saying, "The most important consideration for the future of American law is not whether Justice Kagan turns out to be more like Breyer or more like Stevens, it’s whether the seventy-four year-old Antonin Scalia can stay in good health until there’s a Republican in the White House." This hits at the heart of the problem with the court, but it is not one I think can be solved by term limits.

The problem is this: the only reason that the decisions of the Supreme Court have not caused major problems in this country so far is that justices have had the courtesy to leave or get ill at a time when the current President would replace them with someone who would carry on their ideological torch. This happened somewhat accidentally at first, with Republican presidents nominating justices who would turn out to be liberal. But since Jimmy Carter, we've had a fairly consistent back-and-forth between Democratic and Republican presidents, who have (with the exception of Souter) always nominated someone more or less on their 'team'. There are decisions that are unpopular with one side or the other, but the Court seems within the mainstream of American politics, more or less.

But what happens if, as an example, John Roberts and Samuel Alito are in a fatal car accident next week? Suddenly, the two Bush justices, who were supposed to be reliably conservative votes for decades, are wiped out, and Obama is going to pick their replacements. The Republicans can filibuster a hard-left liberal, but they have to let someone through. Now the court is six liberal votes (four of whom Obama picked, and will serve for a long time in all likelihood), two conservative votes (and one of them 74 years old) and Kennedy, whose 'swing vote' power is now meaningless. Even if the Republicans defeat Obama in 2012 and hold the presidency for the next 20 years, they might not get a chance to do more than replace Ginsburg and Breyer. And who knows, one or both of them may retire soon to ensure Obama replaces them, too.

A court of that composition would certainly never consider adjustments to Roe v. Wade, would likely strike down restrictions on gay marriage, may well curtail gun ownership rights, and would be inclined to extend and expand racial preferences. No matter how you feel about these issues, it is impossible to deny that not giving conservatives any recourse on a number of issues that matter greatly to them would strain the political system immensely. I once drafted a novel with the premise that assassinations on the Supreme Court precipitated the breakup of the United States. Hopefully that never comes to pass, but it is undeniable that the Supreme Court is not only our least democratic branch of government, it is the most unstable and unpredictable. It is thus a likely source of future instability in the country.

Monday, May 3, 2010

Too Complex to Fail? Debt and Modern Government

You are, I've often been told, supposed to write about things you understand. Which is why I should not be writing about the ongoing financial crisis: I can't understand what's happening. From an outsider's perspective, it certainly seems like the numbers don't add up. Our national debt is building at an unfathomable rate. Greece just required a bailout of over a hundred billion dollars, and most experts seem skeptical that it was enough. The nations of the developed world all have massive amounts of public and private debt. (Check out this page on 'external debt' if you want your mind blown.)

I have generally taken it as a given that the bankers who lend money know what they are doing, and despite an amount of debt that frankly seems impossible to pay back, there must be enough corresponding assets that it would all work out. After all, we generally compare debt to annual GDP output, but there are many past years of wealth built up that could pay off debt in a crisis.

However, these two posts by David Goldman have forced me to start thinking that our elected officials, rather than confront the crisis of debt, have made a deal with the devil to keep government going the same way it always has without making anyone too uncomfortable. What we should be doing, in a common-sense world, is dramatically cutting spending and raising taxes to pay for our profligate ways. But that would raise unemployment, make the recession worse, and get all the politicians thrown out of office. So they are instead letting bankers make a lot of money by borrowing from one branch of government and using the money to buy debt from another branch of government. This sounds an awful lot like money laundering: an illegitimate enterprise passes its money through a legitimate business, which hands it back to the criminals as clean money. Only in this case what's being hidden is not that the money was made illegally, but that it was made out of nothing: there's no wealth behind it, nothing but the guarantee of a country rapidly losing control of its finances.

At the risk of piling on too many metaphors, the whole thing feels like a NASCAR race where the cars keep going faster and faster. Eventually they're speeding so fast that it is obvious to everyone that the race has become unsafe. They need to slow down. But the cars are packed so close together, if anyone so much as taps the brakes, there will be a massive pileup. So everyone keeps accelerating, knowing they're just putting off the crash, hoping for some miracle to save them. We have so much debt, so much money moving around so fast, I don't think anyone really understands exactly what's going on. But I keep bracing myself for the moment when some big lender or country finally flinches, and taps on the brakes.

Tuesday, April 27, 2010

The Future in Assassinations

Every once in a while, when I'm looking to learn something completely new and random, I head over to DARPA's website, to browse around and see the possible technologies of the future. And today I found a little gem: the Extreme Accuracy Tasked Ordnance (EXACTO) program.

By way of a brief summary, DARPA paints the picture of an extremely accurate .50-caliber rifle with maneuverable bullets, which it claims will radically increase the range and accuracy of two-man sniper teams. It vaguely mentions a 'real-time guidance system', but understandably doesn't say more. There's no mention, of course, of how close this technology is to full development. (I'm not the only one to find this fascinating; it also received a write-up at WIRED.)

My initial guess is that DARPA hopes to create something like the laser-guided smart bombs that have been used in our recent wars. In this case, I can imagine a spotter who gets much closer to the target and lights him up with a laser (the range of the laser obviously becomes important here). Then the bullet would have some sort of fins that allow slight adjustments to home in on the spot. However, it's also possible the technology would allow the target to be marked in advance (imagine a James Bond type sticking something to the back of a jacket) so the spotter can get away before the shot is fired.

According to a quick web search, the farthest shot ever to kill a human target was about a mile and a half. If this technology can double that and make such long-distance shots more accurate, assassinations would become easier by an order of magnitude. (Imagine being in charge of securing all the possible firing positions within over 20 square miles.) But the interesting question is whether and how we would use this technology if we had it available. Right now, we're using Predator drones to kill a lot of Al Qaeda types. In the future, will we once again opt for the bullet over the bomb? Or is ordering a person shot, instead of a target bombed, a little too real for leaders detached from the fight?
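That 20-square-mile figure follows from simple geometry. A quick sketch of the arithmetic (the mile-and-a-half range is the post's own number; the rest is just the area of a circle):

```python
import math

# If the longest shot to date is ~1.5 miles and guided rounds double
# that, security must sweep every possible firing position within a
# 3-mile radius of the target.
current_range_miles = 1.5
doubled_range_miles = 2 * current_range_miles  # 3 miles

area_sq_miles = math.pi * doubled_range_miles ** 2
print(f"{area_sq_miles:.1f} square miles")  # about 28 square miles
```

So "over 20 square miles" is actually conservative: a 3-mile radius sweeps out roughly 28.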

Wednesday, April 21, 2010

Madness? THIS. IS. PARIS!

The picture to the left depicts some modernist architect's vision of an updated Paris. And while it might be hard to imagine a sophisticated (read: wimpy) Parisian parroting Leonidas' words from 300, I'd argue Sarkozy, the French leader trying to implement a massive modernization plan in the City of Light, deserves to be dragged to the top of the Eiffel Tower and thrown off for his treachery. The thought of turning the world's most harmonious and beautiful city into the set of Blade Runner 2 is more than I can tolerate.

David Brussat, the tireless architecture critic writing for the Providence Journal, routinely highlights the many modernist horrors inflicted upon our great cities. And I am right with him in believing that the current style of architecture is a destructive force when applied to existing cityscapes.

Yet, the part of me that thinks we have abandoned the search for the future to better enjoy the familiar comforts of the past knows that without experimentation, a city risks becoming a museum. So what to do? I think you can find a hint of the answer in New York. While the city has its classical structures, it embraced the latest techniques and trends of the time of its great expansion, the 19th and early 20th centuries. The skyscraper works in New York in a way it simply cannot in Paris, because the development happened organically and the cityscape was built around this new form.

The new styles and building techniques are not going to originate in New York, or Paris, nor should they. The growing cities of the world (for example São Paulo, or Shanghai) should look for approaches that solve their unique problems, and these solutions will diffuse to other cities. And even more importantly, when we find ourselves building under the ocean, or on Mars (assuming we ever do), those environments will reflect new needs and new thinking about the best way to live.

The other option is to destroy our built heritage and model our urban cores after Dubai. That is to say, a series of monumental, impressive buildings that fail to congeal into anything like a liveable, cohesive cityscape. But if that comes to pass, I think we'll have a lot of architects and leaders that we want to throw off the top of their monstrosities.

Friday, April 16, 2010

The World in 2200: Part Three

In the two previous posts in this series, I examined some social trends that I believe will be significant drivers of our future, and then hypothesized about the type of events that would happen in a world where those forces were shaping history. In this post, we'll take a look at how this world might appear to a typical citizen of New York City, a physician named Zeke.

At 8:30, Zeke bought a soy bagel and walked to his customary bench in Battery Park, where he would scarf down breakfast. It was Wednesday, so he'd get to see the weekly Mars Cruiser launch from the Jamaica Bay spaceport. Not for the first time, he wished he was on it: the biggest city on Mars, Anconia, was a model libertarian community, and he was increasingly feeling that life in old Manhattan was a little oppressive. He looked at the solid old scrapers of the Wall Street historic district: nothing new had been built there in about 150 years. The streets were closed to vehicle traffic both to preserve its tourist appeal and so no terrorist could bomb the old buildings into rubble. In Brooklyn and Queens, where more new development took place, the buildings were engineered to take a lot more. Just last month, a Bolivarian freedom fighter had driven an old electric van into the base of the Twitter Tower in Long Island City, and the blast had done little more than crack a few lower windows.

Zeke pulled out his thin-screen reader and unfolded it. Its smart cells snapped it rigid after it was fully expanded, and he quickly downloaded the print and video feeds from the Huffington Times. He glanced up as the Mars Cruiser fired its big fusion-powered jet, then went back to reading about a plan for Indian Buddhists to set up "New Tibet" in the asteroid belt. The Chinese were objecting, as they claimed there was no country named Tibet that it could be named after. Zeke chuckled: he remembered the settlement between India and China after their last scuffle, 20 years earlier, and how China had to keep Tibet because neither side wanted to attract the wrath of the Warriors of the Lama, nor did India want the challenge of an independent base for its dissident Buddhists to rally around. Now, as it had for so many nations with unhappy minorities, it seemed that space would solve the problem.

He then watched yet another debate about the legality of genetic enhancements in children. It seemed beside the point to him. Like most physicians, he saw many patients who had obviously been 'upgraded' in some way or another, and despite the laws he never reported his suspicions or gave evidence to investigators, even when the mods were obvious. (One of his patients, an avid diver, always wore turtlenecks to hide his gills.)

Before he shuffled off to work, he noted that the Southern Autonomous Region was threatening to declare total independence again. The new wrinkle was that now Washigon (he still hated that name) was claiming that if the South went, they would form a new state with British Columbia, which was already independent of Canada. Zeke wondered how long the US was going to hold together: with all the independent space colonies, it seemed like every region with a few million people thought it could and should run itself. "I might be old-fashioned," he thought, "but I still rather like the idea of one big country, even if I don't much care for some of the decisions the government makes."

He folded up his paper-thin screen, threw his trash into a disintegrator (The motto of Waste Repurposing: 'The building blocks of tomorrow's...Whatever!' was cheerfully printed across its side) and headed to his office. His local patients would start arriving at 11:00; for the first two hours, he was required by law to practice vid-medicine for patients in Africa, as part of the Emerging World Grievance Resolution Treaty of 2182. The compromise was that American doctors didn't have to move there. He consulted with a few patients in Great Congo, who all had easily resolved complaints, then finally saw his first live human promptly at 11:00. The poor woman had been suffering from Grant's Syndrome for weeks, and had not been able to get an appointment before now. It doesn't make sense, thought Zeke, that the 'disadvantaged' countries could get better care than the 'modern' ones. He imagined things were better on Mars.

He walked home after work, happy he wasn't one of the people screaming all over the northeast on one of the new MagLev Commuter Trains. Part of the recent "Renewing America" program, they could get a person from Ohio to New York in 70 minutes. But the damned things were still unreliable, and they were packed for almost every run. He supposed they were good for the economy, but they had made New York more crowded than it had been in quite a long time.

In his apartment, his wife had left a holo-message to say she'd be out late with friends, and his son had left for his weekend trip to Moscow. So, with a free evening for the first time he could recall, he played his favorite Virtual Reality war game (relive the epic battles of the great South American wars of the 2080's!) and downed a bit of his illicit scotch. He would be damned if he gave up the occasional drink because he was over 50, and the bureaucrats thought it would increase his medical costs. He was a doctor, and could take care of himself. Eventually, his wife still not home, he drifted off, thinking about how nice Mars sounded.