Hiding a Mountain Of Debt By David S. Broder
Sunday, March 29, 2009; A15
With a bit of bookkeeping legerdemain borrowed from the Bush administration, the Democratic Congress is about to perform a cover-up on the most serious threat to America's economic future.
That threat is not the severe recession, tough as that is for the families and businesses struggling to make ends meet. In time, the recession will end, and last week's stock market performance hinted that we may not have to wait years for the recovery to begin.
The real threat is the monstrous debt resulting from the slump in revenue and the staggering sums being committed by Washington to rescuing embattled banks and homeowners -- and the absence of any serious strategy for paying it all back.
The Congressional Budget Office sketched the dimensions of the problem on March 20, and Congress reacted with shock. The CBO said that over the next 10 years, current policies would add a staggering $9.3 trillion to the national debt -- one-third more than President Obama had estimated by using much more optimistic assumptions about future economic growth.
As far as the eye could see, the CBO said, the debt would continue to grow by about $1 trillion a year because of a structural deficit between the spending rate, averaging 23 percent of gross domestic product, and federal revenue at 19 percent.
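A back-of-the-envelope illustration of that arithmetic, in Python (a sketch of my own, not the CBO's model; the GDP path is an assumption chosen only to show the scale):

    # Rough arithmetic behind the structural-deficit claim above:
    # spending near 23 percent of GDP, revenue near 19 percent, a 4-point gap.
    spending_share = 0.23
    revenue_share = 0.19
    gap = spending_share - revenue_share          # 0.04 of GDP each year

    # Illustrative nominal GDP path over the ten-year window (roughly $14
    # trillion growing toward $20 trillion); assumed figures, not CBO numbers.
    gdp_start, gdp_end, years = 14e12, 20e12, 10
    deficits = [gap * (gdp_start + (gdp_end - gdp_start) * y / (years - 1))
                for y in range(years)]

    print(f"${min(deficits) / 1e12:.2f}T to ${max(deficits) / 1e12:.2f}T per year, before interest")
    # Interest on the accumulating debt is what pushes these annual totals
    # toward the roughly $1 trillion a year the CBO projected.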
The ever-growing national debt will require ever-larger annual interest payments, with much of that money going overseas to China, Japan and other countries that have been buying our bonds.
Reacting to this scary prospect, the House and Senate budget committees took the paring knife to some of Obama's spending proposals and tax cuts last week. But many of the proposed savings look more like bookkeeping gimmicks than realistic cutbacks. The budget resolutions assume, for example, that no more money will be needed this year to bail out foundering businesses or pump up consumer demand, even though estimates of those needs start at $250 billion and go up by giant steps.
Republicans on the budget committees offered cuts that were larger and, in some but not all instances, more realistic.
But the main device the Democratic budgeteers employed was simply to shrink the budget "window" from 10 years to five. Instantly, $5 trillion in debt disappeared from view, along with the worry that long after the recession is past, the structural deficit would continue to blight the future of young working families.
The Democrats did not invent this gimmick. They borrowed it from George W. Bush, who turned to it as soon as his inherited budget surpluses withered with the tax cuts and recession of 2001-02. But Obama had promised a more honest budget and said that this meant looking at the long-term consequences of today's tax and spending decisions.
There are plenty of people in Congress for whom the CBO report was no surprise, and some of them have proposed a solution that would confront this reality. Kent Conrad, the chairman of the Senate Budget Committee, and Judd Gregg, its ranking Republican, have offered a bill to create a bipartisan commission to examine every aspect of the budget -- taxes, defense and domestic spending, and, especially, Medicare, Medicaid and Social Security. Congress would be required to vote promptly, up or down, on its recommendations, or come up with an alternative that would achieve at least as much in savings.
In the House, Democrat Jim Cooper of Tennessee and Republican Frank Wolf of Virginia have been pressing a similar proposal but have been regularly thwarted.
The roadblock in chief is Nancy Pelosi, the speaker of the House. She has made it clear that her main goal is to protect Social Security and Medicare from any significant reforms. Pelosi has not forgotten how Democrats benefited from the 2005-06 fight against Bush's effort to change Social Security. Her party, which had lost elections in 2000, 2002 and 2004, found its voice and its rallying cry to "Save Social Security," and Pelosi is not about to allow any bipartisan commission to take that issue away from her control.
The price for her obduracy is being paid in the rigging of the budget process. The larger price will be paid by your children and grandchildren, who will inherit a future-blighting mountain of debt.
Sunday, March 22, 2009
When ‘Deficit’ Isn’t a Dirty Word By ROBERT H. FRANK
March 22, 2009
Economic View
When ‘Deficit’ Isn’t a Dirty Word By ROBERT H. FRANK
ARE you confused about whether large federal budget deficits matter?
No wonder, when disagreement about deficits is popping up everywhere. Even among Republicans, there is no unity on this basic issue. Defending his recent proposal to freeze government spending, Representative John A. Boehner, the House minority leader, said, “We simply cannot afford to mortgage our children and grandchildren’s future to pay for this big government spending spree.” But Martin Feldstein, the Harvard economist, disagrees. An adviser to the past three Republican presidents, Professor Feldstein warns that failure to run large deficits would prolong the current economic downturn.
Because important policy decisions hinge on whether deficits matter, this is an opportune moment to take stock of what we know. The good news is that there is little disagreement among economists who have studied the issue. The consensus is that short-run deficits help end recessions, and that whether long-run deficits matter depends entirely on how government spends the borrowed money. If failure to borrow meant forgoing productive investments, bigger long-run deficits would actually be better than smaller ones.
In 1929, President Herbert Hoover thought that the best response to a collapsing economy was to balance the federal budget. With incomes and tax receipts falling sharply, that meant cutting federal spending. But as almost all economists now recognize, President Hoover was profoundly mistaken.
When a downturn throws people out of work, they spend less, causing still others to be thrown out of work, and so on, in a downward spiral. Failure to use short-run deficits to stimulate spending amplifies that spiral, causing further declines in tax receipts and even bigger deficits. That this path makes no sense is a settled issue.
But what about long-run deficits? To think more clearly about them, we must recognize that carrying debt is costly. The government can pay just the interest on its debt each year, or it can pay interest plus some additional amount to reduce the principal. The yearly payment is clearly greater in the second case, just as a homeowner’s monthly payment is larger with a 10-year mortgage than with a 30-year one. But the total burden of the various repayment options (in technical terms, their “present value”) is exactly the same. It’s a simple trade-off between intensity of burden and duration of burden.
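A small numerical sketch of that equivalence, in Python (my illustration, assuming a constant interest rate and discounting future payments at that same rate):

    # Present value of two ways to carry $1 of debt at a constant rate r,
    # discounting future payments at that same rate.
    r = 0.04          # assumed annual interest rate (4 percent)
    years = 30        # horizon for the pay-it-down option

    # Option 1: pay interest only, forever: a perpetuity of r per year.
    pv_interest_only = r / r                      # = 1.0, the face value of the debt

    # Option 2: retire the debt with a level annual payment, like a mortgage.
    payment = r / (1 - (1 + r) ** -years)         # standard annuity formula
    pv_amortized = sum(payment / (1 + r) ** t for t in range(1, years + 1))

    print(pv_interest_only, round(pv_amortized, 10))   # both 1.0: same total burden

The payments differ in size and duration, but discounted back to today they impose the same total burden, which is Frank's point.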
No matter which option we choose, money spent to service debt can’t be spent for other things we value. But that doesn’t mean we should always borrow less. The main issue is what we do with the borrowed money.
If we simply use the money to buy bigger houses and cars, deficits make us unambiguously worse off in the long run. That’s why the explosive increase in the national debt during the Bush administration was a grave misstep.
Trillions of dollars, many of them borrowed from China, financed tax cuts for the wealthy, who spent much of their added wealth on things like bigger mansions. But beyond a certain point, when everyone builds bigger, the primary effect is merely to raise the bar that defines the size of home that people feel they need. Much of the interest we’ll pay on debt incurred during the Bush years is thus money down the drain.
In contrast, borrowing for well-chosen investments doesn’t make us poorer. Road maintenance is a case in point. Failure to repair roads in a timely way could mean eventually spending two to four times as much for the work. Even ignoring the fact that timely repairs would reduce the substantial vehicle damage from potholes, it would be much cheaper to borrow the money and do maintenance on schedule.
It’s also useful to put the nation’s debt burden into perspective. Over the last eight years, Bush administration deficits raised the national debt by almost $5 trillion. Given the current crisis, it’s easy to imagine a similar increase during the next four years. At recent interest rates, servicing $10 trillion of extra debt costs about $400 billion annually — a big amount, to be sure, but less than 3 percent of the economy’s full-employment output. We’ll still be the richest country on the planet even after paying all that interest.
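Those figures can be checked directly (a quick sketch; the 4 percent rate and the roughly $14 trillion of full-employment output are my assumptions, chosen to be consistent with the numbers Frank quotes):

    # Rough check of the debt-service numbers in the paragraph above.
    extra_debt = 10e12      # $10 trillion of additional debt
    rate = 0.04             # assumed interest rate, consistent with "$400 billion annually"
    gdp = 14e12             # assumed full-employment output, roughly $14 trillion (2009-era)

    annual_interest = extra_debt * rate           # about $400 billion per year
    share_of_output = annual_interest / gdp       # about 0.029, just under 3 percent

    print(f"${annual_interest / 1e9:.0f} billion per year, {share_of_output:.1%} of output")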
Once the downturn ends, there should be no need to incur additional debt. Indeed, there are many ways to pay down debt without requiring painful sacrifices. A $2 tax on each gallon of gasoline, for example, would generate more than $100 billion in additional revenue a year. Europeans, who pay more than $2 a gallon in gasoline taxes, have adapted by choosing more efficient cars — and they appear no less satisfied with them.
We could also levy a progressive consumption surtax, which would not only generate additional revenue to pay down debt or finance additional public investment, but would also stimulate private savings by diverting money from those over-the-top coming-of-age parties that the wealthy stage for their children.
Notwithstanding the neo-Hooverite talk from stimulus-program opponents, the current deficit isn’t too large. If anything, it may need to be even larger to revive the economy. In the long run, new sources of tax revenue could keep deficits from growing and could even pay down existing debt. But if the political system cannot figure out how to pay for productive investments with tax revenue, we’d still end up richer, on balance, by making those investments with borrowed money.
Robert H. Frank, an economist at Cornell, is a visiting faculty member at the Stern School of Business at New York University.
Saturday, March 21, 2009
The Problem With Flogging A.I.G. By JOE NOCERA
March 21, 2009
Talking Business
The Problem With Flogging A.I.G. By JOE NOCERA
Can we all just calm down a little?
Yes, the $165 million in bonuses handed out to executives in the financial products division of American International Group was infuriating. Truly, it was. As many others have noted, this is the same unit whose shenanigans came perilously close to bringing the world's financial system to its knees. When the Federal Reserve chairman, Ben Bernanke, said recently that A.I.G.'s "irresponsible bets" had made him "more angry" than anything else about the financial crisis, he could have been speaking for most Americans.
But death threats? "All the executives and their families should be executed with piano wire — my greatest hope," wrote one person in an e-mail message to the company. Another suggested publishing a list of the "Yankee" bankers "so some good old southern boys can take care of them."
Or how about those efforts to publicize names of individual executives who received bonuses — efforts championed by Attorney General Andrew Cuomo of New York and Barney Frank, chairman of the House Financial Services Committee? To what end?
How does outing these executives fix skewed compensation incentives, which have created that unjustified sense of entitlement that pervades Wall Street? No, it's mostly about using subpoena power to satisfy the public's thirst for blood. (In light of the death threats, when Mr. Cuomo received the list of A.I.G. bonus recipients on Thursday, he promised to consider "individual security" and "privacy rights" in deciding whether to publicize the names.)
Then there was that awful Congressional hearing on Wednesday, in which A.I.G.'s newly installed chief executive, Edward Liddy, was forced to listen to one outraged member of Congress after another rail about bonuses — and obsess about when Treasury Secretary Timothy Geithner learned about them — while ignoring far more troubling problems surrounding the A.I.G. rescue.
Oh, and let's not forget the bill that was passed on Thursday by the House of Representatives. It would tax at a 90 percent rate bonus payments made to anyone who earned over $250,000 at any financial institution receiving significant bailout funds. Should it become law, it will affect tens of thousands of employees who had absolutely nothing to do with creating the crisis, and who are trying to help fix their companies.
Meanwhile, the real culprits — like Joseph J. Cassano, the former head of A.I.G.'s financial products division — are counting their money in "retirement." Nobody on Capitol Hill seems much interested in getting that money back. (And the bill does nothing about bonuses that were paid before 2009, meaning that most of those egregious Merrill Lynch bonuses, paid at the end of last year, will not be touched.)
By week's end, I was more depressed about the financial crisis than I've been since last September. Back then, the issue was the disintegration of the financial system, as the Lehman bankruptcy set off a terrible chain reaction. Now I'm worried that the political response is making the crisis worse. The Obama administration appears to have lost its grip on Congress, while the Treasury Department always seems caught off guard by bad news.
And Congress, with its howls of rage, its chaotic, episodic reaction to the crisis, and its shameless playing to the crowds, is out of control. This week, the body politic ran off the rails.
There are times when anger is cathartic. There are other times when anger makes a bad situation worse. "We need to stop committing economic arson," Bert Ely, a banking consultant, said to me this week. That is what Congress committed: economic arson.
How is the political reaction to the crisis making it worse? Let us count the ways.
IT IS DESTROYING VALUE During his testimony on Wednesday, Mr. Liddy pointed out that much of the money the government turned over to A.I.G. was a loan, not a gift. The company's goal, he kept saying, was to pay that money back. But how? Mr. Liddy's plan is to sell off the healthy insurance units — or, failing that, give them to the government to sell when they can muster a good price.
In other words, it is in the taxpayers' best interest to position A.I.G. as a company with many profitable units, worth potentially billions, and one bad unit that needs to be unwound. Which, by the way, is the truth. But as Mr. Ely puts it, "the indiscriminate pounding that A.I.G. is taking is destroying the value of the company." Potential buyers are wary. Customers are going elsewhere. Employees are looking to leave. Treating all of A.I.G. like Public Enemy No. 1 is a pretty dumb way for a majority shareholder to act when he hopes to sell the company for top dollar.
IT IS, UNFORTUNATELY, BESIDE THE POINT Even on Wall Street this week, I didn't hear anyone condoning the A.I.G. bonuses. They should never have been granted, and Mr. Liddy should have been tougher about renegotiating them. (A rich irony here is that any nonfinancial company in A.I.G.'s straits would be in bankruptcy, and contracts would have to be renegotiated. The fact that the government is afraid to force A.I.G. into bankruptcy, despite its crippled state, is the main reason Mr. Liddy felt he couldn't try to redo the contracts.)
But there is a much bigger issue that has barely been touched upon by Congress: the way tens of billions of dollars of taxpayers' money has been funneled to A.I.G.'s counterparties — at 100 cents on the dollar. How can it possibly make sense that Goldman Sachs, Bank of America, Citigroup and every other company that bought credit-default swaps from A.I.G. should be made whole by the government? Why isn't it forcing them to take a haircut?
What's worse, some of those companies are foreign banks that used credit-default swaps to exploit a regulatory loophole. Should the United States taxpayer really be responsible for ensuring the safety of European banks that were taking advantage of European regulations?
The person who has made this point most forcefully is Eliot Spitzer, of all people. In his column for Slate.com, he wrote: "Why did Goldman have to get back 100 cents on the dollar? Didn't we already give Goldman a $25 billion cash infusion, and aren't they sitting on more than $100 billion in cash?" Mr. Spitzer told me that while "there is a legitimate sense of outrage over the bonuses, the larger outrage should be the use of A.I.G. funding as a second bailout for the large investment houses." Precisely.
IT IS DESTABILIZING How can you run a company when the rules keep changing, when you have to worry about being second-guessed by Congress? Who can do business under those circumstances?
Take, for instance, that new securitization program the government is trying to get off the ground, called the Term Asset-Backed Securities Loan Facility — or TALF. Although it is backed by large government loans, it requires people in the marketplace — Wall Street bankers! — to participate.
This program could help revive the consumer credit market. But at this point, most Wall Street bankers would rather be attacked by wild dogs than take part. They fear that they'll do something — make money perhaps? — that will arouse Congressional ire. Or that the rules will change. "The constant flip-flopping is terrible," said Simon Johnson, a banking expert who teaches at the M.I.T. Sloan School of Management.
A.I.G. offers another good example. Not all the employees who face the possibility of having their bonuses taxed out from under them work for the evil financial products division. Many of them work in insurance divisions. Very few of them pull down million-dollar bonuses, and none of them brought A.I.G. to its knees. (And employees who bought the company's stock are already hurting financially, having seen its value virtually wiped out.) They are the ones the company badly needs to keep if it hopes to sell those units at a healthy price. Taking away their bonuses — after they've already put the money in their bank accounts — hardly seems like the right way to motivate them. And demonizing them in Congressional hearings doesn't help either.
In previous columns, I have been an advocate of nationalizing big banks like Citigroup. But after watching Congress this week, I'm having second thoughts. If this is how Congress treats A.I.G., what would it do if it had a bank in its paws?
What the country really needs right now from Congress is facts instead of rhetoric. Instead of these "raise your hand if you took a private jet to get here" exercises of outraged populism, we need hearings that educate and illuminate. Hearings like the old Watergate hearings. Hearings in which knowledge is accumulated over time, and a record is established. Hearings that might actually help us get out of this crisis.
It's happened before. In 1932, Congress established the Pecora committee, named for its chief counsel, Ferdinand Pecora. It was an intense, two-year inquiry, and its findings — executives shorting their own company's stock, for instance — shocked the country. It also led to the establishment of the Securities and Exchange Commission and other investor protections.
One person who has been calling for a new Pecora committee is Senator Richard Shelby of Alabama, a Republican and key member of the Senate Banking Committee.
"As we restructure our regulatory system, we need to be thorough," he told me. "We need to understand what caused it. We shouldn't rush it."
Meanwhile, the House Financial Services Committee has scheduled a hearing on Tuesday featuring Mr. Bernanke and Mr. Geithner. The hearing has been called to find out only one thing: what did the two men know about the A.I.G. bonuses, and when did they know it?
Is that Nero I hear fiddling?
http://www.nytimes.com/2009/03/21/business/21nocera.html?sq=Joe%20Nocera%20Floggin%20A.I.G&st=cse&scp=1&pagewanted=print
http://snipurl.com/eakqp
Friday, March 20, 2009
Perverse Cosmic Myopia By DAVID BROOKS
March 20, 2009
Op-Ed Columnist
Perverse Cosmic Myopia By DAVID BROOKS
You'd think if some tiger were lunging at your neck, your attention would be riveted on the tiger. But that's apparently not how it works in the age of global A.D.D. As a tiger sinks its teeth into the world's neck, we focus on the dust bunnies under the bed and the floorboards that need replacing on the deck. We live in the world of Perverse Cosmic Myopia, an inability to focus attention on the most perilous matter at hand.
The tiger, of course, is the collapsing world financial system. Americans actually have a falsely mild view of this crisis because the economy is worse abroad. The U.N.'s International Labor Organization projects between 30 million and 50 million job losses worldwide. Central European countries are teetering; Japan's economy is horrifying; and the Chinese job creation machine is losing the race against its demographic pressures.
There have been riots in Greece and China as well as huge protest rallies in Dublin, Paris, London and beyond. So far, the protesters express anger without an agenda, but if the global economy continues to slide through 2010, they'll discover one. A predictable result is a series of beggar-thy-neighbor exchange-rate policies, followed by rising trade barriers and the degradation of the entire global system.
In times like these, you'd expect prudent leaders to prepare for the worst. After all, the pessimists have recently been vindicated by events. But that's apparently too painful to think about. In normal times, leaders like to focus on the short term at the expense of the long term. But now the short term is really confusing, so leaders take refuge in projects that are years or decades away.
The president of the United States has decided to address this crisis while simultaneously tackling the four most complicated problems facing the nation: health care, energy, immigration and education. Why he has not also decided to spend his evenings mastering quantum mechanics and discovering the origins of consciousness is beyond me.
The results of this overload are evident on Capitol Hill. The banking plan is incomplete, and there is zero political will to pay for it. The president's budget is being nibbled to death. The revenue ideas are dying one by one, while the spending ideas expand. By the latest estimate, the health care approach will cost $1.5 trillion over 10 years and the national debt will at least double, while the Chinese publicly complain about picking up the tab.
The Obama administration is at least distracted by important things. The Washington political class has spent the past week going into made-for-TV hysterics over $165 million in A.I.G. bonuses. We're in the middle of a multitrillion-dollar crisis, and our political masters — always willing to throw themselves into any issue that is understandable on cable television — have decided to risk destroying the entire bank-rescue plan because of bonuses that account for 0.001 percent of the annual G.D.P.
Even this is not the most idiotic of the distractions. For that, you have to look abroad.
This is a global crisis, and a core lesson of the Great Depression is that a global crisis calls for a global response. As such, Tim Geithner and Larry Summers are preparing for the upcoming G-20 summit with an agenda that has the merit of actually addressing the problem at hand: coordinate global stimulus, strengthen the International Monetary Fund, preserve open trade.
But the G-20 process is heading toward global impotence because the Europeans are dismissing this approach. Instead, they want to spend this moment of peril working on a long-term architecture to regulate global finance. The world is in flames and they want directorates and multilateral symposia and vague plans for a powerless "college of supervisors." This is what Marie Antoinette would be for if she were an annual Davos attendee.
Why are they taking this position? First, many European leaders think the answer to every problem is more global architecture. They've got Jean Monnet on the brain. Second, they prefer to free-ride on the stimulus packages that the Americans and Chinese are already paying for. Third, the fiscally responsible European countries can't commit to a policy that their debt-ridden partners can't live up to. Fourth, some reject the idea of using fiscal policy to end recessions.
Some of these reasons have merit, especially the last one. But one thing is for sure: The American agenda might work to ease the immediate crisis, but efforts to build a long-range global architecture certainly will not. After all the pious talk about post-Bush international cooperation, the current approach will lead to a big multilateral zero.
Many people used to wonder how the world's leaders could be so myopic at various points in history — like during the Versailles Treaty or the turmoil of the 1930s. We don't have to wonder any more. We get to watch the cosmic myopia replay itself in our own times.
http://www.nytimes.com/2009/03/20/opinion/20brooks.html?sq=Perverse%20Cosmic%20Myopia&st=cse&scp=1&pagewanted=print
http://snipurl.com/eakus
Thursday, March 19, 2009
A Veterinarian Advises How To 'Speak For Spot'
Dr. Nancy Kay owns and runs a 24-hour emergency specialty care center for animals in Rohnert Park, Calif. Courtesy of Trafalgar Square Books
Fresh Air from WHYY, March 19, 2009 · Navigating the world of veterinary medicine can be daunting, but one veterinarian believes she can help. Nancy Kay, a veterinarian with 20 years of experience, is the author of Speaking for Spot: Be the Advocate Your Dog Needs to Live a Happy, Longer Life, a guide that advises dog owners about everything from routine vet visits to euthanasia and chemotherapy.
Kay is an owner and staff internist at the Animal Care Center in Rohnert Park, Calif., and she writes a canine health care blog.
A Prison of Words By NOAH FELDMAN
March 19, 2009
Op-Ed Contributor
A Prison of Words By NOAH FELDMAN
Cambridge, Mass.
HAS the Obama administration changed the legal rules for detaining suspects in the war on terrorism, or is it continuing in the footsteps of the Bush administration?
We got a clue last week when the Justice Department filed an important document “refining” the government’s position in lawsuits over those held at Guantánamo Bay. Hailed by supporters as a leap forward, yet criticized by human rights groups as being little different from what came before, the filing reveals a distinctive approach to constitutional law. Cautious and modest where George W. Bush was ambitious and brash, Mr. Obama still claims the authority necessary to sustain almost everything his predecessor did.
Perhaps what’s most important here is what Mr. Obama’s lawyers do not say. The Bush White House long insisted that the president had inherent power as commander in chief to do whatever it took to defend the country — including overriding American and international law. The Obama filing, however, is silent on the topic of inherent executive power. Indeed, the magic words “commander in chief” never even appear.
Technically, the Obama lawyers have not abandoned the argument for broad presidential power, just implied that such authority is unnecessary to get them what they want.
Yet omitting the claim to unfettered executive authority shows respect for Congress and international standards. In effect, the Obama administration is saying to the courts that if the detainees cannot be held as a matter of federal or international law, judges should release them. This approach is brave — so brave it might even prove foolhardy if the courts, sick of nearly a decade of detention, decide to clear the decks.
The filing argues that the authorization for the use of military force passed by Congress after 9/11 — the contemporary equivalent of a declaration of war — gives the president the powers any sovereign would have under the general principles of the international law of war. Relying on international law to make sense of Congress’s grant of power has deep roots in our constitutional tradition.
In the context of America’s present global military posture, however, the rediscovery of this notion is little short of astonishing. The laws of war, mostly designed for old-fashioned struggles between sovereign states, often do not fit today’s circumstances. The Bush administration saw this mismatch as an occasion to treat the Geneva conventions as “quaint” (in the words of Alberto Gonzales, the former White House counsel).
The Obama lawyers, however, seem to believe that the international law of war is flexible enough to serve their interests — and even to expand the president’s power to detain suspects beyond the strict language used by Congress when it gave President Bush authority to carry out his war on terrorism.
Here is where the law gets complicated: In 2001, Congress told the president he could make war on anyone who had “planned, authorized, committed or aided” the Sept. 11 attacks. The Bush administration, though, went further; it claimed the power to detain any “enemy combatant,” defined to include “anyone who is part of or supporting Taliban or Al Qaeda forces or associated forces.” In an unfortunate legal overreach, one administration lawyer said the government could detain a “little old lady in Switzerland” whose donation to an Afghan orphanage ended up in the hands of Al Qaeda.
In place of the “enemy combatant” definition, the Obama administration now claims the right to detain anyone who “substantially supported” terrorists. Thankfully, the Obama standard would free the little old Swiss lady. But the words “substantial support” do not come from international law any more than Bush’s “enemy combatant” did.
The administration lawyers suggest in their brief that “substantial support” of terrorists could be defined by some unspecified analogy to the laws of detention in traditional armed conflict. Yet the details are left to the imagination; and when push comes to shove, this language might well include all the Guantánamo detainees, including those who never belonged to a terrorist group.
The upshot is that the Obama approach is potentially broad enough to continue detaining everyone whom the Bush administration put in Guantánamo in the first place. The legal theories are subtler, and the reliance on international law may prove more attractive to our allies. But President Obama is stuck with the detainees Mr. Bush left him, and some may pose a real danger. Faced with this conundrum, and pressed for answers by judges who are rightfully impatient, the administration is hurrying to reframe existing powers in new legal doctrines.
The true test of whether Mr. Obama has improved on the Bush era lies in how his administration justifies its decisions on the 241 remaining Guantánamo detainees, whose cases will now be evaluated internally and reviewed by the courts. If the new legal arguments actually affect who goes free and who stays in custody, then they will amount to meaningful change. Without real-world effects, though, even the most elegant new legal arguments are nothing but words.
Noah Feldman is a law professor at Harvard, a fellow at the Council on Foreign Relations and a contributing writer to The Times Magazine.
Labels: Civil Rights, Guantanamo, Justice, NYTimes, Obama, Supreme Court
The Daily Me By NICHOLAS D. KRISTOF
March 19, 2009
Op-Ed Columnist
Some of the obituaries these days aren’t in the newspapers but are for the newspapers. The Seattle Post-Intelligencer is the latest to pass away, save for a remnant that will exist only in cyberspace, and the public is increasingly seeking its news not from mainstream television networks or ink-on-dead-trees but from grazing online.
When we go online, each of us is our own editor, our own gatekeeper. We select the kind of news and opinions that we care most about.
Nicholas Negroponte of M.I.T. has called this emerging news product The Daily Me. And if that’s the trend, God save us from ourselves.
That’s because there’s pretty good evidence that we generally don’t truly want good information — but rather information that confirms our prejudices. We may believe intellectually in the clash of opinions, but in practice we like to embed ourselves in the reassuring womb of an echo chamber.
One classic study sent mailings to Republicans and Democrats, offering them various kinds of political research, ostensibly from a neutral source. Both groups were most eager to receive intelligent arguments that strongly corroborated their pre-existing views.
There was also modest interest in receiving manifestly silly arguments for the other party’s views (we feel good when we can caricature the other guys as dunces). But there was little interest in encountering solid arguments that might undermine one’s own position.
That general finding has been replicated repeatedly, as the essayist and author Farhad Manjoo noted in his terrific book last year: “True Enough: Learning to Live in a Post-Fact Society.”
Let me get one thing out of the way: I’m sometimes guilty myself of selective truth-seeking on the Web. The blog I turn to for insight into Middle East news is often Professor Juan Cole’s, because he’s smart, well-informed and sensible — in other words, I often agree with his take. I’m less likely to peruse the blog of Daniel Pipes, another Middle East expert who is smart and well-informed — but who strikes me as less sensible, partly because I often disagree with him.
The effect of The Daily Me would be to insulate us further in our own hermetically sealed political chambers. One of last year’s more fascinating books was Bill Bishop’s “The Big Sort: Why the Clustering of Like-Minded America is Tearing Us Apart.” He argues that Americans increasingly are segregating themselves into communities, clubs and churches where they are surrounded by people who think the way they do.
Almost half of Americans now live in counties that vote in landslides either for Democrats or for Republicans, he said. In the 1960s and 1970s, in similarly competitive national elections, only about one-third lived in landslide counties.
“The nation grows more politically segregated — and the benefit that ought to come with having a variety of opinions is lost to the righteousness that is the special entitlement of homogeneous groups,” Mr. Bishop writes.
One 12-nation study found Americans the least likely to discuss politics with people of different views, and this was particularly true of the well educated. High school dropouts had the most diverse group of discussion-mates, while college graduates managed to shelter themselves from uncomfortable perspectives.
The result is polarization and intolerance. Cass Sunstein, a Harvard law professor now working for President Obama, has conducted research showing that when liberals or conservatives discuss issues such as affirmative action or climate change with like-minded people, their views quickly become more homogeneous and more extreme than before the discussion. For example, some liberals in one study initially worried that action on climate change might hurt the poor, while some conservatives were sympathetic to affirmative action. But after discussing the issue with like-minded people for only 15 minutes, liberals became more liberal and conservatives more conservative.
The decline of traditional news media will accelerate the rise of The Daily Me, and we’ll be irritated less by what we read and find our wisdom confirmed more often. The danger is that this self-selected “news” acts as a narcotic, lulling us into a self-confident stupor through which we will perceive in blacks and whites a world that typically unfolds in grays.
So what’s the solution? Tax breaks for liberals who watch Bill O’Reilly or conservatives who watch Keith Olbermann? No, until President Obama brings us universal health care, we can’t risk the surge in heart attacks.
So perhaps the only way forward is for each of us to struggle on our own to work out intellectually with sparring partners whose views we deplore. Think of it as a daily mental workout analogous to a trip to the gym; if you don’t work up a sweat, it doesn’t count.
Now excuse me while I go and read The Wall Street Journal’s editorial page.
Sunday, March 15, 2009
Finding Messages in a Blueprint By N. GREGORY MANKIW
March 15, 2009
Economic View
PRESIDENTIAL candidates campaign with soaring rhetoric, but presidents and their advisers make actual policy with spreadsheets. So for policy wonks like me, there is no better place to learn what President Obama really believes than the budget proposal released late last month.
Here are four lessons we can learn from the budget documents about the president and his economic team:
THEY ARE ECONOMIC OPTIMISTS Like everyone else, the president’s economists expect 2009 to be a grim year of falling national income and rising unemployment. But despite all the talk about the worst crisis since the Great Depression, they expect their policies to bring the recession to a swift conclusion. For the next four years, they forecast an average growth rate of 4 percent. The unemployment rate is projected to fall to 5.2 percent in 2013.
Not everyone is so sanguine. The administration forecast is “way too optimistic,” said Nariman Behravesh, chief economist at IHS Global Insight and author of the excellent primer “Spin-Free Economics.”
Let’s hope that the administration is right. But if I had to bet, I’d put my money on Mr. Behravesh.
THEY LIKE TO SPEND In light of the economic downturn, the stimulus package and all the bailouts coming out of Washington, it is no surprise government spending is skyrocketing. According to the president’s budget, federal outlays will be 27.7 percent of gross domestic product in 2009 and 24.1 percent in 2010 — levels not reached since World War II.
But more telling about the president’s priorities is what happens to spending after the crisis is well behind us, at least according to the administration’s forecast. In a second term for Mr. Obama, with the economy recovered and unemployment stabilized at 5 percent, federal outlays would be 22.2 percent of G.D.P. — well above the average of 20.2 percent over the last 50 years.
It is also well above levels in recent history. Before the financial crisis hit in 2008, federal outlays under President George W. Bush never exceeded 20.4 percent of G.D.P. That includes spending from the Iraq war. President Obama is counting on that conflict being over, and no new money-draining military commitment taking its place. Yet federal spending still remains high.
To be sure, part of the increase in government spending is driven by the aging of the population. As more baby boomers retire and become eligible for Social Security and Medicare, spending rises automatically. But President Obama’s focus on universal health insurance suggests that he is more interested in expanding the benefits that Americans can claim than in reining in the unfunded entitlements already on the books.
THEY ARE SERIOUS ABOUT CLIMATE CHANGE President Obama’s budget makes clear that he wants to address the problem of global climate change. This commitment stands in stark contrast to policy during the previous two administrations.
President Bill Clinton offered the Kyoto Protocol, but the policy ended up more symbolic than real. The treaty was overwhelmingly rejected by both parties in Congress, in part because it left out China, now the world’s largest emitter of carbon. President Bush rejected the Kyoto principles as well, but he never made finding an alternative approach to climate change a major priority.
For the new administration, climate change is not only an environmental issue but a budgetary one as well. Under the proposed cap-and-trade system, the government would auction off a limited number of carbon allowances. The cost would be passed on to consumers as higher energy prices, encouraging conservation. According to President Obama’s budget projections, the system would also raise more government revenue than his much-discussed tax increases on upper-income households.
The thrust of the policy makes sense, but several questions remain. First, why not instead impose a more transparent and administratively simpler tax on carbon emissions? Is it merely because the phrase “climate revenues” used in the budget is more politically palatable than the word “tax”? More important, how will the president get China on board? Without China’s participation, any climate policy, along with the associated revenue, may be a political nonstarter.
THEY ARE DEFICIT DOVES Few economists would blame either the Bush administration or the Obama administration for running budget deficits during an economic downturn. What is more telling is what happens to the deficit during normal economic times. From that perspective, the Obama budget policy looks surprisingly similar to the Bush version.
From 2005 to 2007, before the current crisis, unemployment in the United States hovered around 5 percent. During those years, the budget deficit averaged just under 2 percent of G.D.P.
In the Obama administration’s forecast, unemployment again reaches 5 percent in 2014 and remains at that level thereafter. But despite that rosy prediction, the budget does not get close to balance. The Obama team calculates that under its proposed policies, the budget deficit will average a bit over 3 percent of G.D.P.
So if you are a deficit hawk who lamented the Bush budget deficits, the new president’s budget should not make you feel much better. President Obama offers different fiscal priorities than President Bush did: less military spending, more domestic spending and higher marginal tax rates to “spread the wealth around.” But the borrowing and debt imposed on future generations will not be very different, at least if the numbers in the Obama administration’s own budget document can be trusted.
N. Gregory Mankiw is a professor of economics at Harvard. He was an adviser to President George W. Bush.
Wednesday, March 11, 2009
The Looting of America’s Coffers By DAVID LEONHARDT
March 11, 2009
Economic Scene
Sixteen years ago, two economists published a research paper with a delightfully simple title: “Looting.”
The economists were George Akerlof, who would later win a Nobel Prize, and Paul Romer, the renowned expert on economic growth. In the paper, they argued that several financial crises in the 1980s, like the Texas real estate bust, had been the result of private investors taking advantage of the government. The investors had borrowed huge amounts of money, made big profits when times were good and then left the government holding the bag for their eventual (and predictable) losses.
In a word, the investors looted. Someone trying to make an honest profit, Professors Akerlof and Romer said, would have operated in a completely different manner. The investors displayed a “total disregard for even the most basic principles of lending,” failing to verify standard information about their borrowers or, in some cases, even to ask for that information.
The investors “acted as if future losses were somebody else’s problem,” the economists wrote. “They were right.”
On Tuesday morning in Washington, Ben Bernanke, the Federal Reserve chairman, gave a speech that read like a sad coda to the “Looting” paper. Because the government is unwilling to let big, interconnected financial firms fail — and because people at those firms knew it — they engaged in what Mr. Bernanke called “excessive risk-taking.” To prevent such problems in the future, he called for tougher regulation.
Now, it would have been nice if the Fed had shown some of this regulatory zeal before the worst financial crisis since the Great Depression. But that day has passed. So people are rightly starting to think about building a new, less vulnerable financial system.
And “Looting” provides a really useful framework. The paper’s message is that the promise of government bailouts isn’t merely one aspect of the problem. It is the core problem.
Promised bailouts mean that anyone lending money to Wall Street — ranging from small-time savers like you and me to the Chinese government — doesn’t have to worry about losing that money. The United States Treasury (which, in the end, is also you and me) will cover the losses. In fact, it has to cover the losses, to prevent a cascade of worldwide losses and panic that would make today’s crisis look tame.
But the knowledge among lenders that their money will ultimately be returned, no matter what, clearly brings a terrible downside. It keeps the lenders from asking tough questions about how their money is being used. Looters — savings and loans and Texas developers in the 1980s; the American International Group, Citigroup, Fannie Mae and the rest in this decade — can then act as if their future losses are indeed somebody else’s problem.
Do you remember the mea culpa that Alan Greenspan, Mr. Bernanke’s predecessor, delivered on Capitol Hill last fall? He said that he was “in a state of shocked disbelief” that “the self-interest” of Wall Street bankers hadn’t prevented this mess.
He shouldn’t have been. The looting theory explains why his laissez-faire theory didn’t hold up. The bankers were acting in their self-interest, after all.
•
The term that’s used to describe this general problem, of course, is moral hazard. When people are protected from the consequences of risky behavior, they behave in a pretty risky fashion. Bankers can make long-shot investments, knowing that they will keep the profits if they succeed, while the taxpayers will cover the losses.
This form of moral hazard — when profits are privatized and losses are socialized — certainly played a role in creating the current mess. But when I spoke with Mr. Romer on Tuesday, he was careful to make a distinction between classic moral hazard and looting. It’s an important distinction.
With moral hazard, bankers are making real wagers. If those wagers pay off, the government has no role in the transaction. With looting, the government’s involvement is crucial to the whole enterprise.
Think about the so-called liars’ loans from recent years: like those Texas real estate loans from the 1980s, they never had a chance of paying off. Sure, they would deliver big profits for a while, so long as the bubble kept inflating. But when they inevitably imploded, the losses would overwhelm the gains. As Gretchen Morgenson has reported, Merrill Lynch’s losses from the last two years wiped out its profits from the previous decade.
What happened? Banks borrowed money from lenders around the world. The bankers then kept a big chunk of that money for themselves, calling it “management fees” or “performance bonuses.” Once the investments were exposed as hopeless, the lenders — ordinary savers, foreign countries, other banks, you name it — were repaid with government bailouts.
In effect, the bankers had siphoned off this bailout money in advance, years before the government had spent it.
I understand this chain of events sounds a bit like a conspiracy. And in some cases, it surely was. Some A.I.G. employees, to take one example, had to have understood what their credit derivative division in London was doing. But more innocent optimism probably played a role, too. The human mind has a tremendous ability to rationalize, and the possibility of making millions of dollars invites some hard-core rationalization.
Either way, the bottom line is the same: given an incentive to loot, Wall Street did so. “If you think of the financial system as a whole,” Mr. Romer said, “it actually has an incentive to trigger the rare occasions in which tens or hundreds of billions of dollars come flowing out of the Treasury.”
Unfortunately, we can’t very well stop the flow of that money now. The bankers have already walked away with their profits (though many more of them deserve a subpoena to a Congressional hearing room). Allowing A.I.G. to collapse, out of spite, could cause a financial shock bigger than the one that followed the collapse of Lehman Brothers. Modern economies can’t function without credit, which means the financial system needs to be bailed out.
But the future also requires the kind of overhaul that Mr. Bernanke has begun to sketch out. Firms will have to be monitored much more seriously than they were during the Greenspan era. They can’t be allowed to shop around for the regulatory agency that least understands what they’re doing. The biggest Wall Street paydays should be held in escrow until it’s clear they weren’t based on fictional profits.
Above all, as Mr. Romer says, the federal government needs the power and the will to take over a firm as soon as its potential losses exceed its assets. Anything short of that is an invitation to loot.
Mr. Bernanke actually took a step in this direction on Tuesday. He said the government “needs improved tools to allow the orderly resolution of a systemically important nonbank financial firm.” In layman’s terms, he was asking for a clearer legal path to nationalization.
At a time like this, when trust in financial markets is so scant, it may be hard to imagine that looting will ever be a problem again. But it will be. If we don’t get rid of the incentive to loot, the only question is what form the next round of looting will take.
Mr. Akerlof and Mr. Romer finished writing their paper in the early 1990s, when the economy was still suffering a hangover from the excesses of the 1980s. But Mr. Akerlof told Mr. Romer — a skeptical Mr. Romer, as he acknowledged with a laugh on Tuesday — that the next candidate for looting already seemed to be taking shape.
It was an obscure little market called credit derivatives.
Saturday, March 07, 2009
homologate
ho⋅mol⋅o⋅gate [huh-mol-uh-geyt, hoh-]
–verb (used with object), -gat⋅ed, -gat⋅ing.
1. to approve; confirm or ratify.
2. to register (a specific make of automobile in general production) so as to make it eligible for international racing competition.
From Dan Neil, LATimes: To bring you up to speed a bit: The Cube is a huge hit for Nissan in Japan, and now – given a projected upswing in the small crossover segment in the U.S. – the company has homologated it for the North American market.
Thursday, March 05, 2009
An Empty In-Box, or With Just a Few E-Mail Messages? Read On By FARHAD MANJOO
March 5, 2009
Basics
SINCE e-mail became a fixture in our professional and personal lives, many academic researchers have investigated the complex mix of feelings brought on by the technology.
We feel guilty about being late in responding, about our in-boxes being disorganized, about the tens of thousands of unread messages that we’re sure we’ll never get to. What is it about e-mail that consumes us — that invades every corner of our personal space, demands ever more sophisticated methods of organization, and makes us wish for extra hours in the day to deal with the deluge? More important, how can we overcome it?
In the last few weeks, I set about finding a cure for e-mail anxiety. It was not the first time I’d done so; I’ve been looking for better ways to handle my mail since shortly after logging in to my first in-box.
Over the years, I’ve discovered many methods that worked for a while, but never permanently. For a while, I set up elaborate filters meant to automatically categorize every incoming message according to who sent it. Another time, I instituted a complicated system of color-coded labels aimed at getting me to understand which e-mail messages I had to respond to, which I had to save and which I could ignore.
But eventually every finely honed trick to tame my mail would collapse, and I’d backslide into a messy, undisciplined in-box. So in my search for a new way to deal with e-mail, I followed one guiding principle: Keep it simple. Any method that made too many demands on my time or my brain was bound to fail.
Fortunately, after much experimentation with various experts’ many tips, I’ve found something that works. Here are the basic rules:
LIMIT YOUR TIME WITH IT Turn off all auto-notifications that alert you to incoming mail, and if you must check mail while you’re on the go, keep it to a minimum. Here’s a good guideline that worked for me: Don’t dip into your in-box more than three times an hour. It’s unlikely that any message is so urgent that it can’t wait 20 minutes for your response — if it were, the sender would call, send an instant message or find some other way to reach you.
CLEAR OUT YOUR IN-BOX Set aside an hour or two to respond to every important message that has dogged you in the last couple months (anything older than that is too ancient to bother with). Next, move everything else into a new folder called Archive — this will be your storehouse of old mail.
Your in-box should now be empty. Think of this as its optimal state — your goal, from now on, will be to keep this space as pristine as possible, either empty or nearly so. To realize that goal, live by this precept: Whenever you receive a new message, do something with it. Don’t read your e-mail and then just let it sit there — that’s a recipe for chaos.
This isn’t always so easy. A day’s worth of mail demands a variety of complex actions, and the daunting task of figuring out how to respond to each message is probably what made your in-box untidy in the first place.
That’s where the next steps come in — an algorithm for dealing with incoming e-mail messages. For each new one you receive, take one of the following actions:
ARCHIVE IT Most e-mail messages require no action or response on your part — messages from Facebook letting you know that an old college pal has commented on your wall, for instance. Skim through these missives (or leave them unread), then shoot them into your archive and forget them.
RESPOND TO IT If the e-mail message calls for an easy answer, send it. Say a colleague wants to know if you’re up for dinner at his place on Saturday, or your boss wants to praise you for a job well done. Shoot back a quick response — “Yes!” or “Thanks!” — and then push the original message into your archive. The productivity guru and “Getting Things Done” author David Allen has a rule of thumb that comes in handy here: If responding is going to take two minutes or less, you’re better off doing it now than procrastinating.
FORWARD IT If the message is better handled by someone else — your boss, your sister, anyone but you — send it off to that person, then archive it.
HOLD IT FOR LATER This is the trickiest option. Some e-mail messages demand complicated answers. You don’t really want to dine with your colleague, but coming up with an excuse will take longer than two minutes. Other messages simply require information not yet available. Your friend wants to know if you’re up for watching the game on Sunday, but you’ve got to check with your spouse first.
You can leave these messages in your in-box with a promise to come back to them soon. (Depending on the mail program you use, you might want to set a reminder or a flag to make it stand out — in Microsoft Outlook, you can click the flag icon, or in Google’s Gmail, the star).
Be careful to avoid letting many such messages pile up. Carve out a short amount of time — perhaps 15 to 30 minutes at the end of the day — to respond to all flagged e-mail. Remember, your goal is to keep your in-box empty. Each message sitting there should serve as a stark, visible reminder of your undisciplined ways.
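Manjoo's four rules amount to a simple decision procedure, and for readers who think in code, the sketch below restates it. It is only an illustration under assumed inputs: the Message fields, the better_owner marker and the two-minute estimate are hypothetical stand-ins for judgments you would make by hand, not part of any real mail program's API.
```python
# A minimal sketch of the column's four-way triage rule (archive, respond,
# forward, hold). All fields and thresholds are hypothetical stand-ins for
# the reader's own judgment; nothing here maps to a real mail client's API.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Message:
    sender: str
    subject: str
    needs_reply: bool = False           # does it call for any answer at all?
    reply_minutes: int = 0              # rough guess at how long a reply takes
    better_owner: Optional[str] = None  # someone better suited to handle it

@dataclass
class Inbox:
    archive: List[Message] = field(default_factory=list)  # storehouse of old mail
    flagged: List[Message] = field(default_factory=list)  # held for the end-of-day pass

    def triage(self, msg: Message) -> str:
        """Apply the four rules to one incoming message."""
        if msg.better_owner is not None:
            # FORWARD IT: pass it along, then archive your copy.
            self.archive.append(msg)
            return f"forwarded to {msg.better_owner}"
        if not msg.needs_reply:
            # ARCHIVE IT: no action or response required.
            self.archive.append(msg)
            return "archived"
        if msg.reply_minutes <= 2:
            # RESPOND TO IT: the two-minute rule -- answer now, then archive.
            self.archive.append(msg)
            return "answered and archived"
        # HOLD IT FOR LATER: flag it and clear it in a short end-of-day session.
        self.flagged.append(msg)
        return "flagged for later"

# Example: a social-network notification is archived; a tricky dinner dodge is flagged.
box = Inbox()
print(box.triage(Message("facebook", "Someone wrote on your wall")))
print(box.triage(Message("colleague", "Dinner Saturday?", needs_reply=True, reply_minutes=10)))
```
The point of the sketch is the same as the column's: everything lands in the archive except the handful of flagged items, which are all that should ever remain in view.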
Notice that my system doesn’t include any complex method for organizing e-mail — I don’t categorize my messages into folders by sender, subject matter, date or any other scheme. That way lies distraction.
E-mail isn’t a test of your skills at making things look pretty; indeed, making things look pretty will only take time away from your goal of actually getting through your mail. Most modern e-mail programs include search engines that are powerful enough to find any message you need without the aid of a taxonomy.
Note, too, that this system is far from new. It was inspired by Mr. Allen’s ideas, and it’s been proselytized, in various forms, by a host of efficiency experts and people who have spent a lot of time wrestling with e-mail.
In particular, I relied partly on a series of essays and lectures put together by Merlin Mann, proprietor of the Web site 43 Folders, which aims to help you get a handle on how much attention you focus on unrewarding tasks, like e-mail.
It wasn’t easy for me to curb my time in my in-box. E-mail was like a drug, and I needed a constant fix. That’s a good sign you need help.
“People arrive at this because they’re feeling overwhelmed,” Mr. Mann said. “They feel like the train is going off the rails.” He was careful to add that much of what troubles people about modern life goes beyond e-mail — but you can think of fixing your in-box as a private victory for modern professionals. Once you deal with your e-mail, you’ll be able to tackle stuff that really matters.
Sunday, March 01, 2009
Is Food the New Sex? By Mary Eberstadt
February & March 2009
A curious reversal in moralizing
Of all the truly seismic shifts transforming daily life today — deeper than our financial fissures, wider even than our most obvious political and cultural divides — one of the most important is also among the least remarked. That is the chasm in attitude that separates almost all of us living in the West today from almost all of our ancestors, over two things without which human beings cannot exist: food and sex.
The question before us today is not whether the two appetites are closely connected. About that much, philosophers and other commentators have been agreed for a very long time. As far back as Aristotle, observers have made the same point reiterated in 1749 in Henry Fielding’s famous scene in Tom Jones: The desires for sex and for food are joined at the root. The fact that Fielding’s scene would go on to inspire an equally iconic movie segment over 200 years later, in the Tom Jones film from 1963, just clinches the point.
Philosophers and artists aside, ordinary language itself verifies how similarly the two appetites are experienced, with many of the same words crossing over to describe what is desirable and undesirable in each case. In fact, we sometimes have trouble even talking about food without metaphorically invoking sex, and vice versa. In a hundred entangled ways, judging by either language or literature, the human mind juggles sex and food almost interchangeably at times. And why not? Both desires can make people do things they otherwise would not; and both are experienced at different times by most men and women as the most powerful of all human drives.
One more critical link between the appetites for sex and food is this: Both, if pursued without regard to consequence, can prove ruinous not only to oneself, but also to other people, and even to society itself. No doubt for that reason, both appetites have historically been subject in all civilizations to rules both formal and informal. Thus the potentially destructive forces of sex — disease, disorder, sexual aggression, sexual jealousy, and what used to be called “home-wrecking” — have been ameliorated in every recorded society by legal, social, and religious conventions, primarily stigma and punishment. Similarly, all societies have developed rules and rituals governing food in part to avoid the destructiveness of free-for-alls over scarce necessities. And while food rules may not always have been as stringent as sex rules, they have nevertheless been stringent as needed. Such is the meaning, for example, of being hanged for stealing a loaf of bread in the marketplace, or keel-hauled for plundering rations on a ship.
These disciplines imposed historically on access to food and sex now raise a question that has not come up before, probably because it was not even possible to imagine it until the lifetimes of the people reading this: What happens when, for the first time in history — at least in theory, and at least in the advanced nations — adult human beings are more or less free to have all the sex and food they want?
This question opens the door to a real paradox. For given how closely connected the two appetites appear to be, it would be natural to expect that people would do the same kinds of things with both appetites — that they would pursue both with equal ardor when finally allowed to do so, for example, or with equal abandon for consequence; or conversely, with similar degrees of discipline in the consumption of each.
In fact, though, evidence from the advanced West suggests that nearly the opposite seems to be true. The answer appears to be that when many people are faced with these possibilities for the very first time, they end up doing very different things — things we might signal by shorthand as mindful eating, and mindless sex. This essay is both an exploration of that curious dynamic, and a speculation about what is driving it.
AS MUCH AS YOU WANT
The dramatic expansion in access to food on the one hand and to sex on the other is a complicated story; but in each case, technology has written most of it.
Up until just about now, for example, the prime brakes on sex outside of marriage have been several: fear of pregnancy, fear of social stigma and punishment, and fear of disease. The Pill and its cousins have substantially undermined the first two strictures, at least in theory, while modern medicine has largely erased the third. Even HIV/AIDS, only a decade ago a stunning exception to the brand-new rule that one could apparently have any kind of sex at all without serious consequence, is now regarded as a “manageable” disease in the affluent West, even as it continues to kill millions of less fortunate patients elsewhere.
As for food, here too one technological revolution after another explains the extraordinary change in its availability: pesticides, mechanized farming, economical transportation, genetic manipulation of food stocks, and other advances. As a result, almost everyone in the Western world is now able to buy sustenance of all kinds, for very little money, and in quantities unimaginable until the lifetimes of the people reading this.
One result of this change in food fortune, of course, is the unprecedented “disease of civilization” known as obesity, with its corollary ills. Nevertheless, the commonplace fact of obesity in today’s West itself testifies to the point that access to food has expanded exponentially for just about everyone. So does the statistical fact that obesity is most prevalent in the lowest social classes and least exhibited in the highest.
And just as technology has made sex and food more accessible for a great many people, important extra-technological influences on both pursuits — particularly longstanding religious strictures — have meanwhile diminished in a way that has made both appetites even easier to indulge. The opprobrium reserved for gluttony, for example, seems to have little immediate force now, even among believers. On the rare occasions when one even sees the word, it is almost always used in a metaphorical, secular sense.
Similarly, and far more consequential, the longstanding religious prohibitions in every major creed against extramarital sex have rather famously loosed their holds over the contemporary mind. Of particular significance, perhaps, has been the movement of many Protestant denominations away from the sexual morality agreed upon by the previous millennia of Christendom. The Anglican abandonment in 1930 of the longstanding prohibition against artificial contraception is a special case in point, undermining as it subsequently did for many believers the very idea that any church could tell people what to do with their bodies, ever again. Whether they defended their traditional teachings or abandoned them, however, all Western Christian churches in the past century have found themselves increasingly beleaguered over issues of sex, and commensurately less influential over all but a fraction of the most traditionally minded parishioners.
Of course this waning of the traditional restraints on the pursuit of sex and food is only part of the story; any number of non-religious forces today also act as contemporary brakes on both. In the case of food, for example, these would include factors like personal vanity, say, or health concerns, or preoccupation with the morality of what is consumed (about which more below). Similarly, to acknowledge that sex is more accessible than ever before is not to say that it is always and everywhere available. Many people who do not think they will go to hell for premarital sex or adultery, for example, find brakes on their desires for other reasons: fear of disease, fear of hurting children or other loved ones, fear of disrupting one’s career, fear of financial setbacks in the form of divorce and child support, and so on.
Even men and women who do want all the food or sex they can get their hands on face obstacles of other kinds in their pursuit. Though many people really can afford to eat more or less around the clock, for example, home economics will still put the brakes on; it’s not as if everyone can afford pheasant under glass day and night. The same is true of sex, which likewise imposes its own unwritten yet practical constraints. Older and less attractive people simply cannot command the sexual marketplace as the younger and more attractive can (which is why the promises of erasing time and age are such a booming business in a post-liberation age). So do time and age still circumscribe the pursuit of sex, even as churches and other conventional enforcers increasingly do not.
Still and all, the initial point stands: As consumers of both sex and food, today’s people in the advanced societies are freer to pursue and consume both than almost all the human beings who came before us; and our culture has evolved in interesting ways to exhibit both those trends.
BROCCOLI, PORNOGRAPHY, AND KANT
To begin to see just how recent and dramatic this change is, let us imagine some broad features of the world seen through two different sets of eyes: a hypothetical 30-year-old housewife from 1958 named Betty, and her hypothetical granddaughter Jennifer, of the same age, today.
Begin with a tour of Betty’s kitchen. Much of what she makes comes from jars and cans. Much of it is also heavy on substances that people of our time are told to minimize — dairy products, red meat, refined sugars and flours — because of compelling research about nutrition that occurred after Betty’s time. Betty’s freezer is filled with meat every four months by a visiting company that specializes in volume, and on most nights she thaws a piece of this and accompanies it with food from one or two jars. If there is anything “fresh” on the plate, it is likely a potato. Interestingly, and rudimentary to our contemporary eyes though it may be, Betty’s food is served with what for us would appear to be high ceremony, i.e., at a set table with family members present.
As it happens, there is little that Betty herself, who is adventurous by the standards of her day, will not eat; the going slogan she learned as a child is about cleaning your plate, and not doing so is still considered bad form. Aside from that notion, though, which is a holdover from scarcer times, Betty is much like any other American home cook in 1958. She likes making some things and not others, even as she prefers eating some things to others — and there, in personal aesthetics, does the matter end for her. It’s not that Betty lacks opinions about food. It’s just that the ones she has are limited to what she does and does not personally like to make and eat.
Now imagine one possible counterpart to Betty today, her 30-year-old granddaughter Jennifer. Jennifer has almost no cans or jars in her cupboard. She has no children or husband or live-in boyfriend either, which is why her kitchen table on most nights features a laptop and goes unset. Yet interestingly enough, despite the lack of ceremony at the table, Jennifer pays far more attention to food, and feels far more strongly in her convictions about it, than anyone she knows from Betty’s time.
Wavering in and out of vegetarianism, Jennifer is adamantly opposed to eating red meat or endangered fish. She is also opposed to industrialized breeding, genetically enhanced fruits and vegetables, and to pesticides and other artificial agents. She tries to minimize her dairy intake, and cooks tofu as much as possible. She also buys “organic” in the belief that it is better both for her and for the animals raised in that way, even though the products are markedly more expensive than those from the local grocery store. Her diet is heavy in all the ways that Betty’s was light: with fresh vegetables and fruits in particular. Jennifer has nothing but ice in her freezer, soymilk and various other items her grandmother wouldn’t have recognized in the refrigerator, and on the counter stands a vegetable juicer she feels she “ought” to use more.
Most important of all, however, is the difference in moral attitude separating Betty and Jennifer on the matter of food. Jennifer feels that there is a right and wrong about these options that transcends her exercise of choice as a consumer. She does not exactly condemn those who believe otherwise, but she doesn’t understand why they do, either. And she certainly thinks the world would be a better place if more people evaluated their food choices as she does. She even proselytizes on occasion when she can.
In short, with regard to food, Jennifer falls within Immanuel Kant’s definition of the Categorical Imperative: She acts according to a set of maxims that she wills at the same time to be universal law.
Betty, on the other hand, would be baffled by the idea of dragooning such moral abstractions into the service of food. This is partly because, as a child of her time, she was impressed — as Jennifer is not — by what happens when food is scarce (Betty’s parents told her often about their memories of the Great Depression; and many of the older men of her time had vivid memories of deprivation in wartime). Even without such personal links to food scarcity, though, it makes no sense to Betty that people would feel as strongly as her granddaughter does about something as simple as deciding just what goes into one’s mouth. That is because Betty feels, as Jennifer obviously does not, that opinions about food are simply de gustibus, a matter of individual taste — and only that.
This clear difference in opinion leads to an intriguing juxtaposition. Just as Betty and Jennifer have radically different approaches to food, so do they to matters of sex. For Betty, the ground rules of her time — which she both participates in and substantially agrees with — are clear: Just about every exercise of sex outside marriage is subject to social (if not always private) opprobrium. Wavering in and out of established religion herself, Betty nevertheless clearly adheres to a traditional Judeo-Christian sexual ethic. Thus, for example, Mr. Jones next door “ran off” with another woman, leaving his wife and children behind; Susie in the town nearby got pregnant and wasn’t allowed back in school; Uncle Bill is rumored to have contracted gonorrhea; and so on. None of these breaches of the going sexual ethic is considered by Betty to be a good thing, let alone a celebrated thing. They are not even considered to be neutral things. In fact, they are all considered by her to be wrong.
Most important of all, Betty feels that sex, unlike food, is not de gustibus. She believes to the contrary that there is a right and wrong about these choices that transcends any individual act. She further believes that the world would be a better place, and individual people better off, if others believed as she does. She even proselytizes on occasion when given the chance.
In short, as Jennifer does with food, Betty in the matter of sex fulfills the requirements for Kant’s Categorical Imperative.
Jennifer’s approach to sex is just about 180 degrees different. She too disapproves of the father next door who left his wife and children for a younger woman; she does not want to be cheated on herself, or to have those she cares about cheated on either. These ground-zero stipulations aside, however, she is otherwise laissez-faire on just about every other aspect of nonmarital sex. She believes that living together before marriage is not only morally neutral, but actually better than not having such a “trial run.” Pregnant unwed Susie in the next town doesn’t elicit a thought one way or the other from her, and neither does Uncle Bill’s gonorrhea, which is of course a trivial medical matter between him and his doctor.
Jennifer, unlike Betty, thinks that falling in love creates its own demands and generally trumps other considerations — unless perhaps children are involved (and sometimes, on a case-by-case basis, then too). A consistent thinker in this respect, she also accepts the consequences of her libertarian convictions about sex. She is pro-abortion, pro-gay marriage, indifferent to ethical questions about stem cell research and other technological manipulations of nature (as she is not, ironically, when it comes to food), and agnostic on the question of whether any particular parental arrangements seem best for children. She has even been known to watch pornography with her boyfriend, at his coaxing, in part to show just how very laissez-faire she is.
Most important, once again, is the difference in moral attitude between the two women on this subject of sex. Betty feels that there is a right and wrong about sexual choices that transcends any individual act, and Jennifer — exceptions noted — does not. It’s not that Jennifer lacks for opinions about sex, any more than Betty does about food. It’s just that, for the most part, they are limited to what she personally does and doesn’t like.
Thus far, what the imaginary examples of Betty and Jennifer have established is this: Their personal moral relationships toward food and toward sex are just about perfectly reversed. Betty does care about nutrition and food, but it doesn’t occur to her to extend her opinions to a moral judgment — i.e., to believe that other people ought to do as she does in the matter of food, and that they are wrong if they don’t. In fact, she thinks such an extension would be wrong in a different way; it would be impolite, needlessly judgmental, simply not done. Jennifer, similarly, does care to some limited degree about what other people do about sex; but it seldom occurs to her to extend her opinions to a moral judgment. In fact, she thinks such an extension would be wrong in a different way — because it would be impolite, needlessly judgmental, simply not done.
Jennifer, on the other hand, is genuinely certain that her opinions about food are not only nutritionally correct, but also, in some deep, meaningful sense, morally correct — i.e., she feels that others ought to do something like what she does. And Betty feels exactly the same way about what she calls sexual morality.
As noted, this desire to extend their personal opinions in two different areas to an “ought” that they think should be somehow binding — binding, that is, to the idea that others should do the same — is the definition of the Kantian imperative. Once again, note: Betty’s Kantian imperative concerns sex not food, and Jennifer’s concerns food not sex. In just over 50 years, in other words — not for everyone, of course, but for a great many people, and for an especially large portion of sophisticated people — the moral poles of sex and food have been reversed. Betty thinks food is a matter of taste, whereas sex is governed by universal moral law of some kind; and Jennifer thinks exactly the reverse.
What has happened here?
ROLE REVERSAL
Betty and Jennifer may be imaginary, but the decades that separate the two women have brought related changes to the lives of many millions. In the 50 years between their two kitchens, a similar polar transformation has taken root and grown not only throughout America but also throughout Western society itself. During those years, cultural artifacts and forces in the form of articles, books, movies, and ideas aimed at deregulating what is now quaintly called “nonmarital sex” have abounded and prospered; while the cultural artifacts and forces aimed at regulating or seeking to re-regulate sex outside of marriage have largely declined. In the matter of food, on the other hand, exactly the reverse has happened. Increasing scrutiny over the decades of the quality of what goes into people’s mouths has been accompanied by something almost wholly new under the sun: the rise of universalizable moral codes based on food choices.
Begin with the more familiar face of diets and fads — the Atkins diet, the Zone diet, the tea diet, the high-carb diet, Jenny Craig, Weight Watchers, and all the rest of the food fixes promising us new and improved versions of ourselves. Abundant though they and all their relatives are, those short-term fads and diets are nevertheless merely epiphenomena.
Digging a little deeper, the obsession with food that they reflect resonates in many other strata of the commercial marketplace. Book reading, for example, may indeed be on the way out, but until it goes, cookbooks and food books remain among the most reliable moneymakers in the industry. To scan the bestseller lists or page through the major reviews in any given month is to find that books on food and food-thought are at least reliably represented, and sometimes even predominate — to list a few from the past few years alone: Michael Pollan’s The Omnivore’s Dilemma; Eric Schlosser’s Fast Food Nation; Gary Taubes’ Good Calories, Bad Calories; Bill Buford’s Heat.
Then there are the voyeur and celebrity genres, which have made some chefs the equivalent of rock stars and further feed the public curiosity with books like Kitchen Confidential: Adventures in the Culinary Underbelly or Service Included: Four-Star Secrets of an Eavesdropping Waiter or The Devil in the Kitchen: Sex, Pain, Madness, and the Making of a Great Chef. Anywhere you go, anywhere you look, food in one form or another is what’s on tap. The proliferation of chains like Whole Foods, the recent institution by Governor Arnold Schwarzenegger of state-mandated nutritional breakdowns in California restaurants (a move that is sure to be repeated by governors in the other 49 states): All these and many other developments speak to the paramount place occupied by food and food choices in the modern consciousness. As the New York Times Magazine noted recently, in a foreword emphasizing the intended expansion of its (already sizeable) food coverage, such writing is “perhaps never a more crucial part of what we do than today — a moment when what and how we eat has emerged as a Washington issue and a global-environmental issue as well as a kitchen-table one.”
Underneath the passing fads and short-term fixes and notices like these, deep down where the real seismic change lies, is a series of revolutions in how we now think about food — changes that focus not on today or tomorrow, but on eating as a way of life.
One recent influential figure in this tradition was George Ohsawa, a Japanese philosopher who codified what is known as macrobiotics. Popularized in the United States by his pupil, Michio Kushi, macrobiotics has been the object of fierce debate for several decades now, and Kushi’s book, The Macrobiotic Path to Total Health: A Complete Guide to Naturally Preventing and Relieving More Than 200 Chronic Conditions and Disorders, remains one of the modern bibles on food. Macrobiotics makes historical as well as moral claims, including the claim that its tradition stretches back to Hippocrates and includes Jesus and the Han dynasty among other enlightened beneficiaries. These claims are also reflected in the macrobiotic system, which includes the expression of gratitude (not exactly prayers) for food, serenity in the preparation of it, and other extra-nutritional ritual. And even as the macrobiotic discipline has proved too ascetic for many people (and certainly for most Americans), one can see its influence at work in other serious treatments of the food question that have trickled outward. The current popular call to “mindful eating,” for example, echoes the macrobiotic injunction to think of nothing but food and gratitude while consuming, even to the point of chewing any given mouthful at least 50 times.
Alongside macrobiotics, the past decades have also seen tremendous growth in vegetarianism and its related offshoots, another food system that typically makes moral as well as health claims. As a movement, and depending on which part of the world one looks at, vegetarianism predates macrobiotics.1 Vegetarian histories claim for themselves the Brahmins, Buddhists, Jains, and Zoroastrians, as well as certain Jewish and Christian practitioners. In the modern West, Percy Bysshe Shelley was a prominent activist in the early nineteenth century; and the first Vegetarian Society was founded in England in 1847.
Around the same time in the United States, a Presbyterian minister named Sylvester Graham popularized vegetarianism in tandem with a campaign against excess of all kinds (ironically, under the circumstances, this health titan is remembered primarily for the Graham cracker). Various other American religious groups have also gone in for vegetarianism, including the Seventh Day Adventists, studies on whom make up some of the most compelling data about the possible health benefits of a diet devoid of animal flesh. Uniting numerous discrete movements under one umbrella is the International Vegetarian Union, which started just a hundred years ago, in 1908.
Despite this long history, though, it is clear that vegetarianism apart from its role in religious movements did not really take off as a mass movement until relatively recently. Even so, its contemporary success has been remarkable. Pushed perhaps by the synergistic public interest in macrobiotics and nutritional health, and nudged also by occasional rallying books including Peter Singer’s Animal Liberation and Matthew Scully’s Dominion, vegetarianism today is one of the most successful secular moral movements in the West; whereas macrobiotics for its part, though less successful as a mass movement by name, has witnessed the vindication of some of its core ideas and stands as a kind of synergistic brother in arms.
To be sure, macrobiotics and vegetarianism/veganism have their doctrinal differences. Macrobiotics limits animal flesh not out of moral indignation, but for reasons of health and Eastern ideas of proper “balancing” of the forces of yin and yang. Similarly, macrobiotics also allows for moderate amounts of certain types of fish — as strict vegans do not. On the other hand, macrobiotics also bans a number of plants (among them tomatoes, potatoes, peppers, and tropical fruits), whereas vegetarianism bans none. Nonetheless, macrobiotics and vegetarianism have more in common than not, especially from the point of view of anyone eating outside either of these codes. The doctrinal differences separating one from another are about equivalent in force today to those between, say, Presbyterians and Lutherans.
And that is exactly the point. For many people, schismatic differences about food have taken the place of schismatic differences about faith. Again, the curiosity is just how recent this is. Throughout history, practically no one devoted this much time to matters of food as ideas (as opposed to, say, time spent gathering the stuff). Still less does it appear to have occurred to people that dietary schools could be untethered from a larger metaphysical and moral worldview. Observant Jews and Muslims, among others, have had strict dietary laws from their faiths’ inception; but that is just it — their laws told believers what to do with food when they got it, rather than inviting them to dwell on food as a thing in itself. Like the Adventists, who speak of their vegetarianism as being “harmony with the Creator,” or like the Catholics with their recurring Lenten and other obligations, these previous dietary laws were clearly designed to enhance religion — not replace it.
Do today’s influential dietary ways of life in effect replace religion? Consider that macrobiotics, vegetarianism, and veganism all make larger health claims as part of their universality — but unlike yesteryear, to repeat the point, most of them no longer do so in conjunction with organized religion. Macrobiotics, for its part, argues (with some evidence) that processed foods and too much animal flesh are toxic to the human body, whereas whole grains, vegetables, and fruits are not. The literature of vegetarianism makes a similar point, recently drawing particular attention to new research concerning the connection between the consumption of red meat and certain cancers. In both cases, however, dietary laws are not intended to be handmaidens to a higher cause, but moral causes in themselves.
Just as the food of today often attracts a level of metaphysical attentiveness suggestive of the sex of yesterday, so does food today seem attended by a similarly evocative — and proliferating — number of verboten signs. The opprobrium reserved for perceived “violations” of what one “ought” to do has migrated, in some cases fully, from one to the other. Many people who wouldn’t be caught dead with an extra ten pounds — or eating a hamburger, or wearing real leather — tend to be laissez-faire in matters of sex. In fact, just observing the world as it is, one is tempted to say that the more vehement people are about the morality of their food choices, the more hands-off they believe the rest of the world should be about sex. What were the circumstances the last time you heard or used the word “guilt” — in conjunction with sin as traditionally conceived? Or with having eaten something verboten and not having gone to the gym?
Perhaps the most revealing example of the infusion of morality into food codes can be found in the current European passion for what the French call terroir — an idea that originally referred to the specific qualities conferred by geography on certain food products (notably wine) and that has now assumed a life of its own as a moral guide to buying and consuming locally. That there is no such widespread, concomitant attempt to impose a new morality on sexual pursuits in Western Europe seems something of an understatement. But as a measure of the reach of terroir as a moral code, consider only a sermon from Durham Cathedral in 2007. In it, the dean explained Lent as an event that “says to us, cultivate a good terroir, a spiritual ecology that will re-focus our passion for God, our praying, our pursuit of justice in the world, our care for our fellow human beings.”
There stands an emblematic example of the reversal between food and sex in our time: in which the once-universal moral code of European Christianity is being explicated for the masses by reference to the now apparently more-universal European moral code of consumption à la terroir.
Moreover, this reversal between sex and food appears firmer the more passionately one clings to either pole. Thus, for instance, though much has lately been made of the “greening” of the evangelicals, no vegetarian Christian group is as nationally known as, say, People for the Ethical Treatment of Animals or any number of other vegetarian/vegan organizations, most of which appear to be secular or anti-religious and none of which, so far as my research shows, extend their universalizable moral ambitions to the realm of sexuality. When Skinny Bitch — a hip guide to veganism that recently topped the bestseller lists for months — exhorts its readers to a life that is “clean, pure, healthy,” for example, it is emphatically not including sex in this moral vocabulary, and makes a point of saying so.
C.S. Lewis once compared the two desires as follows, to make the point that something about sex had gotten incommensurate in his own time: “There is nothing to be ashamed of in enjoying your food: there would be everything to be ashamed of if half the world made food the main interest of their lives and spent their time looking at pictures of food and dribbling and smacking their lips.” He was making a point in the genre of reductio ad absurdum.
But for the jibe to work as it once did, our shared sense of what is absurd about it must work too — and that shared sense, in an age as visually, morally, and aesthetically dominated by food as is our own, is waning fast. Consider the coining of the term “gastroporn” to describe the eerily similar styles of high-definition pornography on the one hand and stylized shots of food on the other. Actually, the term is not even that new. It dates back at least 30 years, to a 1977 essay by that title in the New York Review of Books. In it, author Alexander Cockburn observed that “it cannot escape attention that there are curious parallels between manuals on sexual techniques and manuals on the preparation of food; the same studious emphasis on leisurely technique, the same apostrophes to the ultimate, heavenly delights. True gastro-porn heightens the excitement and also the sense of the unattainable by proffering colored photographs of various completed recipes.”
With such a transfer, the polar migrations of food and sex during the last half century would appear complete.
RESPECTING SOME HAZARDS, IGNORING OTHERS
If it is true that food is the new sex, however, where does that leave sex? This brings us to the paradox already hinted at. As the consumption of food not only literally but also figuratively has become progressively more discriminate and thoughtful, at least in theory (if rather obviously not always in practice), the consumption of sex in various forms appears to have become the opposite for a great many people: i.e., progressively more indiscriminate and unthinking.
Several proofs could be offered for such a claim, beginning with any number of statistical studies. Both men and women are far less likely to be sexually inexperienced on their wedding day now (if indeed they marry) than they were just a few decades ago. They are also more likely to be experienced in all kinds of ways, including in the use of pornography. As with the example of Jennifer, moreover, their general thoughts about sex become more laissez-faire the further down the age demographic one goes.
Consider as further proof of the dumbing-down of sex the coarseness of popular entertainment, as seen in a popular advice column in left-leaning Slate magazine called “Dear Prudence,” which concerns “manners and morals.” Practically every subject line is a window onto a world of cheap, indiscriminate sex, where the only ground rule is apparently that no sexual urge shall ever be discouraged unless it manifestly hurts others — meaning literally. “Should I destroy the erotic video my husband and I have made?” “My boyfriend’s kinky fetish might doom our relationship.” “My husband wants me to abort, and I don’t.” “How do I tell my daughter she’s the result of a sexual assault?” “A friend confessed to a fling with my now-dead husband.” And so on. The mindful vegetarian slogan, “you are what you eat,” has no counterpart in the popular culture today when it comes to sex.
The third and probably most important feature of sex in our time testifying to the ubiquity of appetites fulfilled and indulged indiscriminately is the staggering level of consumption of Internet pornography. As Ross Douthat recently summarized in an essay for the Atlantic, provocatively titled “Is Pornography Adultery?”:
Over the past three decades, the VCR, on-demand cable service, and the Internet have completely overhauled the ways in which people interact with porn. Innovation has piled on innovation, making modern pornography a more immediate, visceral, and personalized experience. Nothing in the long history of erotica compares with the way millions of Americans experience porn today, and our moral intuitions are struggling to catch up.
Statistics too, or at least preliminary ones, bear out just how consequential this erotic novelty is becoming. Pornography is the single most viewed subject online, by men anyway; it is increasingly a significant factor in divorce cases; and it is resulting in any number of cottage industries, from the fields of therapy to law to academia, as society’s leading cultural institutions strive to measure and cope with its impact.2
This junk sex shares all the defining features of junk food. It is produced and consumed by people who do not know one another. It is disdained by those who believe they have access to more authentic experience or “healthier” options. Internet pornography is further widely said — right now, in its relatively early years — to be harmless, much as people thought little of the ills to come from convenient prepared food when it first appeared; and evidence is also beginning to emerge about compulsive pornography consumption, as it did slowly but surely in the case of compulsive packaged food consumption, that this laissez-faire judgment is wrong.3
This brings us to another similarity between junk sex and junk food: People are furtive about both, and many feel guilty about their pursuit and indulgence of each. And those who consume large amounts of both are typically self-deceptive, too: i.e., they underestimate just how much they do it and deny its ill effects on the rest of their lives. In sum, to compare junk food to junk sex is to realize that they have become virtually interchangeable vices — even if many people who do not put “sex” in the category of vice will readily do so with food.
At this point, the impatient reader will interject that something else — something understandable and anodyne — is driving the increasing attention to food in our day: namely, the fact that we have learned much more than humans used to know about the importance of a proper diet to health and longevity. And this is surely a point borne out by the facts, too. One attraction of macrobiotics, for example, is its promise to reduce the risks of cancer. The fall in cholesterol that attends a true vegan or vegetarian diet is another example. Manifestly, one reason that people today are so much more discriminating about food is that decades of recent research have taught us that diet has more potent effects than Betty and her friends understood, and can be bad for you or good for you in ways not enumerated before.
All that is true, but then the question is this: Why aren’t more people doing the same with sex?
For here we come to the most fascinating turn of all. One cannot answer the question by arguing that there is no such empirical news about indiscriminately pursued sex and how it can be good or bad for you; to the contrary, there is, and lots of it. After all, several decades of empirical research — which also did not exist before — have demonstrated that the sexual revolution, too, has had consequences, and that many of them have redounded to the detriment of a sexually liberationist ethic.
Married, monogamous people are more likely to be happy. They live longer. These effects are particularly evident for men. Divorced men, conversely, face health risks — including heightened drug use and alcoholism — that married men do not. Married men also work more and save more, and married households not surprisingly trump other households in income. Divorce, by contrast, is often a financial catastrophe for a family, particularly for the women and children in it. So is illegitimacy typically a financial disaster.
By any number of measures, moreover, nontraditional sexual morality — and the fallout from it — is detrimental to the well-being of one specifically vulnerable subset: children. Children from broken homes are at risk for all kinds of behavioral, psychological, educational, and other problems that children from intact homes are not. Children from fatherless homes are far more likely to end up in prison than are those who grew up with both biological parents. Girls growing up without a biological father are far more likely to suffer physical or sexual abuse. Girls and boys, numerous sources also show, are adversely affected by family breakup into adulthood, and have higher risks than children from intact homes of repeating the pattern of breakup themselves.
This recital touches only the periphery of the empirical record now being assembled about the costs of laissez-faire sex to American society — a record made all the more interesting by the fact that it could not have been foreseen back when sexual liberationism seemed merely synonymous with the removal of some seemingly inexplicable old stigmas. Today, however, two generations of social science replete with studies, surveys, and regression analyses galore stand between the Moynihan Report and what we know now, and the overall weight of its findings is clear. The sexual revolution — meaning the widespread extension of sex outside of marriage and frequently outside commitment of any kind — has had negative effects on many people, chiefly the most vulnerable; and it has also had clear financial costs to society at large. And this is true not only in the obvious ways, like the spread of AIDS and other STDs, but also in other ways affecting human well-being, beginning but not ending with those enumerated above.
The question raised by this record is not why some people changed their habits and ideas when faced with compelling new facts about food and quality of life. It is rather why more people have not done the same about sex.
THE MINDLESS SHIFT
When Friedrich Nietzsche wrote longingly of the “transvaluation of all values,” he meant the hoped-for restoration of sexuality to its proper place as a celebrated, morally neutral life force. He could not possibly have foreseen our world: one in which sex would indeed become “morally neutral” in the eyes of a great many people — even as food would come to replace it as a source of moral authority.4
Nevertheless, events have proven Nietzsche wrong about his wider hope that men and women of the future would simply enjoy the benefits of free sex without any attendant seismic shifts. For there may in fact be no such thing as a destigmatization of sex simpliciter, as the events outlined in this essay suggest. The rise of a recognizably Kantian, morally universalizable code concerning food — beginning with the international vegetarian movement of the last century and proceeding with increasing moral fervor into our own times via macrobiotics, veganism/vegetarianism, and European codes of terroir — has paralleled exactly the waning of a universally accepted sexual code in the Western world during these same years.
Who can doubt that the two trends are related? Unable or unwilling (or both) to impose rules on sex at a time when it is easier to pursue it than ever before, yet equally unwilling to dispense altogether with a universal moral code that would bind society against the problems created by exactly that pursuit, modern man (and woman) has apparently performed his own act of transubstantiation. He has taken longstanding morality about sex, and transferred it onto food. The all-you-can-eat buffet is now stigmatized; the sexual smorgasbord is not.
In the end, it is hard to avoid the conclusion that the rules being drawn around food receive some force from the fact that people are uncomfortable with how far the sexual revolution has gone — and not knowing what to do about it, they turn for increasing consolation to mining morality out of what they eat.
So what does it finally mean to have a civilization puritanical about food, and licentious about sex? In this sense, Nietzsche’s fabled madman came not too late, but too early — too early to have seen the empirical library that would be amassed from the mid-twentieth century on, testifying to the problematic social, emotional, and even financial nature of exactly the solution he sought.
It is a curious coda that this transvaluation should not be applauded by the liberationist heirs of Nietzsche, even as their day in the sun seems to have come. According to them, after all, consensual sex is simply what comes naturally, and ought therefore to be judged value-free. But as the contemporary history outlined in this essay goes to show, the same can be said of overeating — and overeating is something that today’s society is manifestly embarked on re-stigmatizing. It may be doing so for very different reasons than the condemnations of gluttony outlined by the likes of Gregory the Great and St. Thomas Aquinas. But if indiscriminate sex can also have a negative impact — and not just in the obvious sense of disease, but in the other aspects of psyche and well-being now being written into the empirical record of the sexual revolution — then indiscriminate sex may be judged to need reining in, too.
So if there is a moral to this curious transvaluation, it would seem to be that the norms society imposes on itself in pursuit of its own self-protection do not wholly disappear, but rather mutate and move on, sometimes in curious guises. Far-fetched though it seems at the moment, where mindless food is today, mindless sex — in light of the growing empirical record of its own unleashing — may yet again be tomorrow.
Mary Eberstadt is a research fellow at the Hoover Institution and consulting editor to Policy Review.
1 As defined by the International Vegetarian Union, a vegetarian eats no animals but may eat eggs and dairy (and is then an ovo-lacto vegetarian). A pescetarian is a vegetarian who allows the consumption of fish. A vegan excludes both animals and animal products from his diet, including honey. Vegetarians and vegans can be further refined into numerous other categories — fruitarian, Halal vegetarian, and so on. The terminological complexity here only amplifies the point that food now attracts the taxonomical energies once devoted to, say, metaphysics.
2 For a general discussion, see Pamela Paul, Pornified: How Pornography is Transforming Our Lives, Our Relationships, and Our Families (Times Books, 2005).
3 For clinical accounts of the evidence of harm, see, for example, Ana J. Bridges, “Pornography’s Effects on Interpersonal Relationships,” and Jill C. Manning, “The Impact of Pornography on Women,” papers presented to a conference on “The Social Costs of Pornography,” Princeton University (December 2008). For further information and for pre-consultation drafts of these papers, see http://www.winst.org/family_marriage_and_democracy/social_costs_of_pornography/consultation2008.php (accessed January 7, 2009). The papers also include an interesting econometric assessment of what is spent to avoid or recover from pornography addiction: Kirk Doran, “The Economics of Pornography.”
4 Interestingly, Nietzsche does appear to have foreseen the universalizability of vegetarianism, writing in the 1870s, “I believe that the vegetarians, with their prescription to eat less and more simply, are of more use than all the new moral systems taken together. . . . There is no doubt that the future educators of mankind will also prescribe a stricter diet.” Also interesting, Adolf Hitler — whose own vegetarianism appears to have been adopted because of Wagner’s (Wagner in turn had been convinced by the sometime vegetarian Nietzsche) — reportedly remarked in 1941 that “there’s one thing I can predict to eaters of meat: the world of the future will be vegetarian.”
Table of Contents
FEATURES:
Is Food the New Sex? By Mary Eberstadt
A curious reversal in moralizing
f all the truly seismic shifts transforming daily life today — deeper than our financial fissures, wider even than our most obvious political and cultural divides — one of the most important is also among the least remarked. That is the chasm in attitude that separates almost all of us living in the West today from almost all of our ancestors, over two things without which human beings cannot exist: food and sex.
The question before us today is not whether the two appetites are closely connected. About that much, philosophers and other commentators have been agreed for a very long time. As far back as Aristotle, observers have made the same point reiterated in 1749 in Henry Fielding’s famous scene in Tom Jones: The desires for sex and for food are joined at the root. The fact that Fielding’s scene would go on to inspire an equally iconic movie segment over 200 years later, in the Tom Jones film from 1963, just clinches the point.
What happens when, for the first time in history, adult human beings are free to have all the sex and food they want?
Philosophers and artists aside, ordinary language itself verifies how similarly the two appetites are experienced, with many of the same words crossing over to describe what is desirable and undesirable in each case. In fact, we sometimes have trouble even talking about food without metaphorically invoking sex, and vice versa. In a hundred entangled ways, judging by either language or literature, the human mind juggles sex and food almost interchangeably at times. And why not? Both desires can make people do things they otherwise would not; and both are experienced at different times by most men and women as the most powerful of all human drives.
One more critical link between the appetites for sex and food is this: Both, if pursued without regard to consequence, can prove ruinous not only to oneself, but also to other people, and even to society itself. No doubt for that reason, both appetites have historically been subject in all civilizations to rules both formal and informal. Thus the potentially destructive forces of sex — disease, disorder, sexual aggression, sexual jealousy, and what used to be called “home-wrecking” — have been ameliorated in every recorded society by legal, social, and religious conventions, primarily stigma and punishment. Similarly, all societies have developed rules and rituals governing food in part to avoid the destructiveness of free-for-alls over scarce necessities. And while food rules may not always have been as stringent as sex rules, they have nevertheless been stringent as needed. Such is the meaning, for example, of being hanged for stealing a loaf of bread in the marketplace, or keel-hauled for plundering rations on a ship.
These disciplines imposed historically on access to food and sex now raise a question that has not come up before, probably because it was not even possible to imagine it until the lifetimes of the people reading this: What happens when, for the first time in history — at least in theory, and at least in the advanced nations — adult human beings are more or less free to have all the sex and food they want?
This question opens the door to a real paradox. For given how closely connected the two appetites appear to be, it would be natural to expect that people would do the same kinds of things with both appetites — that they would pursue both with equal ardor when finally allowed to do so, for example, or with equal abandon for consequence; or conversely, with similar degrees of discipline in the consumption of each.
In fact, though, evidence from the advanced West suggests that nearly the opposite seems to be true. The answer appears to be that when many people are faced with these possibilities for the very first time, they end up doing very different things — things we might signal by shorthand as mindful eating, and mindless sex. This essay is both an exploration of that curious dynamic, and a speculation about what is driving it.
AS MUCH AS YOU WANT
he dramatic expansion in access to food on the one hand and to sex on the other are complicated stories; but in each case, technology has written most of it.
Up until just about now, for example, the prime brakes on sex outside of marriage have been several: fear of pregnancy, fear of social stigma and punishment, and fear of disease. The Pill and its cousins have substantially undermined the first two strictures, at least in theory, while modern medicine has largely erased the third. Even hiv/aids, only a decade ago a stunning exception to the brand new rule that one could apparently have any kind of sex at all without serious consequence, is now regarded as a “manageable” disease in the affluent West, even as it continues to kill millions of less fortunate patients elsewhere.
As for food, here too one technological revolution after another explains the extraordinary change in its availability: pesticides, mechanized farming, economical transportation, genetic manipulation of food stocks, and other advances. As a result, almost everyone in the Western world is now able to buy sustenance of all kinds, for very little money, and in quantities unimaginable until the lifetimes of the people reading this.
One result of this change in food fortune, of course, is the unprecedented “disease of civilization” known as obesity, with its corollary ills. Nevertheless, the commonplace fact of obesity in today’s West itself testifies to the point that access to food has expanded exponentially for just about everyone. So does the statistical fact that obesity is most prevalent in the lowest social classes and least exhibited in the highest.
And just as technology has made sex and food more accessible for a great many people, important extra-technological influences on both pursuits — particularly longstanding religious strictures — have meanwhile diminished in a way that has made both appetites even easier to indulge. The opprobrium reserved for gluttony, for example, seems to have little immediate force now, even among believers. On the rare occasions when one even sees the word, it is almost always used in a metaphorical, secular sense.
Similarly, and far more consequential, the longstanding religious prohibitions in every major creed against extramarital sex have rather famously loosed their holds over the contemporary mind. Of particular significance, perhaps, has been the movement of many Protestant denominations away from the sexual morality agreed upon by the previous millennia of Christendom. The Anglican abandonment in 1930 of the longstanding prohibition against artificial contraception is a special case in point, undermining as it subsequently did for many believers the very idea that any church could tell people what to do with their bodies, ever again. Whether they defended their traditional teachings or abandoned them, however, all Western Christian churches in the past century have found themselves increasingly beleaguered over issues of sex, and commensurately less influential over all but a fraction of the most traditionally minded parishioners.
Of course this waning of the traditional restraints on the pursuit of sex and food is only part of the story; any number of non-religious forces today also act as contemporary brakes on both. In the case of food, for example, these would include factors like personal vanity, say, or health concerns, or preoccupation with the morality of what is consumed (about which more below). Similarly, to acknowledge that sex is more accessible than ever before is not to say that it is always and everywhere available. Many people who do not think they will go to hell for premarital sex or adultery, for example, find brakes on their desires for other reasons: fear of disease, fear of hurting children or other loved ones, fear of disrupting one’s career, fear of financial setbacks in the form of divorce and child support, and so on.
Even men and women who do want all the food or sex they can get their hands on face obstacles of other kinds in their pursuit. Though many people really can afford to eat more or less around the clock, for example, home economics will still put the brakes on; it’s not as if everyone can afford pheasant under glass day and night. The same is true of sex, which likewise imposes its own unwritten yet practical constraints. Older and less attractive people simply cannot command the sexual marketplace as the younger and more attractive can (which is why the promises of erasing time and age are such a booming business in a post-liberation age). So do time and age still circumscribe the pursuit of sex, even as churches and other conventional enforcers increasingly do not.
Still and all, the initial point stands: As consumers of both sex and food, today’s people in the advanced societies are freer to pursue and consume both than almost all the human beings who came before us; and our culture has evolved in interesting ways to exhibit both those trends.
BROCCOLI, PORNOGRAPHY, AND KANT
o begin to see just how recent and dramatic this change is, let us imagine some broad features of the world seen through two different sets of eyes: a hypothetical 30-year-old housewife from 1958 named Betty, and her hypothetical granddaughter Jennifer, of the same age, today.
Begin with a tour of Betty’s kitchen. Much of what she makes comes from jars and cans. Much of it is also heavy on substances that people of our time are told to minimize — dairy products, red meat, refined sugars and flours — because of compelling research about nutrition that occurred after Betty’s time. Betty’s freezer is filled with meat every four months by a visiting company that specializes in volume, and on most nights she thaws a piece of this and accompanies it with food from one or two jars. If there is anything “fresh” on the plate, it is likely a potato. Interestingly, and rudimentary to our contemporary eyes though it may be, Betty’s food is served with what for us would appear to be high ceremony, i.e., at a set table with family members present.
As it happens, there is little that Betty herself, who is adventurous by the standards of her day, will not eat; the going slogan she learned as a child is about cleaning your plate, and not doing so is still considered bad form. Aside from that notion though, which is a holdover to scarcer times, Betty is much like any other American home cook in 1958. She likes making some things and not others, even as she prefers eating some things to others — and there, in personal aesthetics, does the matter end for her. It’s not that Betty lacks opinions about food. It’s just that the ones she has are limited to what she does and does not personally like to make and eat.
Now imagine one possible counterpart to Betty today, her 30-year-old granddaughter Jennifer. Jennifer has almost no cans or jars in her cupboard. She has no children or husband or live-in boyfriend either, which is why her kitchen table on most nights features a laptop and goes unset. Yet interestingly enough, despite the lack of ceremony at the table, Jennifer pays far more attention to food, and feels far more strongly in her convictions about it, than anyone she knows from Betty’s time.
Wavering in and out of vegetarianism, Jennifer is adamantly opposed to eating red meat or endangered fish. She is also opposed to industrialized breeding, genetically enhanced fruits and vegetables, and to pesticides and other artificial agents. She tries to minimize her dairy intake, and cooks tofu as much as possible. She also buys “organic” in the belief that it is better both for her and for the animals raised in that way, even though the products are markedly more expensive than those from the local grocery store. Her diet is heavy in all the ways that Betty’s was light: with fresh vegetables and fruits in particular. Jennifer has nothing but ice in her freezer, soymilk and various other items her grandmother wouldn’t have recognized in the refrigerator, and on the counter stands a vegetable juicer she feels she “ought” to use more.
Most important of all, however, is the difference in moral attitude separating Betty and Jennifer on the matter of food. Jennifer feels that there is a right and wrong about these options that transcends her exercise of choice as a consumer. She does not exactly condemn those who believe otherwise, but she doesn’t understand why they do, either. And she certainly thinks the world would be a better place if more people evaluated their food choices as she does. She even proselytizes on occasion when she can.
In short, with regard to food, Jennifer falls within Immanuel Kant’s definition of the Categorical Imperative: She acts according to a set of maxims that she wills at the same time to be universal law.
Betty, on the other hand, would be baffled by the idea of dragooning such moral abstractions into the service of food. This is partly because, as a child of her time, she was impressed — as Jennifer is not — about what happens when food is scarce (Betty’s parents told her often about their memories of the Great Depression; and many of the older men of her time had vivid memories of deprivation in wartime). Even without such personal links to food scarcity, though, it makes no sense to Betty that people would feel as strongly as her granddaughter does about something as simple as deciding just what goes into one’s mouth. That is because Betty feels, as Jennifer obviously does not, that opinions about food are simply de gustibus, a matter of individual taste — and only that.
This clear difference in opinion leads to an intriguing juxtaposition. Just as Betty and Jennifer have radically different approaches to food, so do they to matters of sex. For Betty, the ground rules of her time — which she both participates in and substantially agrees with — are clear: Just about every exercise of sex outside marriage is subject to social (if not always private) opprobrium. Wavering in and out of established religion herself, Betty nevertheless clearly adheres to a traditional Judeo-Christian sexual ethic. Thus, for example, Mr. Jones next door “ran off” with another woman, leaving his wife and children behind; Susie in the town nearby got pregnant and wasn’t allowed back in school; Uncle Bill is rumored to have contracted gonorrhea; and so on. None of these breaches of the going sexual ethic is considered by Betty to be a good thing, let alone a celebrated thing. They are not even considered to be neutral things. In fact, they are all considered by her to be wrong.
Most important of all, Betty feels that sex, unlike food, is not de gustibus. She believes, to the contrary, that there is a right and wrong about these choices that transcends any individual act. She further believes that the world would be a better place, and individual people better off, if others believed as she does. She even proselytizes on occasion when given the chance.
In short, as Jennifer does with food, Betty in the matter of sex fulfills the requirements for Kant’s Categorical Imperative.
Jennifer’s approach to sex is just about 180 degrees different. She too disapproves of the father next door who left his wife and children for a younger woman; she does not want to be cheated on herself, or to have those she cares about cheated on either. These ground-zero stipulations aside, however, she is laissez-faire on just about every other aspect of nonmarital sex. She believes that living together before marriage is not only morally neutral, but actually better than not having such a “trial run.” Pregnant unwed Susie in the next town doesn’t elicit a thought one way or the other from her, and neither does Uncle Bill’s gonorrhea, which is of course a trivial medical matter between him and his doctor.
Jennifer, unlike Betty, thinks that falling in love creates its own demands and generally trumps other considerations — unless perhaps children are involved (and sometimes, on a case-by-case basis, then too). A consistent thinker in this respect, she also accepts the consequences of her libertarian convictions about sex. She is pro-abortion, pro-gay marriage, indifferent to ethical questions about stem cell research and other technological manipulations of nature (as she is not, ironically, when it comes to food), and agnostic on the question of whether any particular parental arrangements seem best for children. She has even been known to watch pornography with her boyfriend, at his coaxing, in part to show just how very laissez-faire she is.
Most important, once again, is the difference in moral attitude between the two women on this subject of sex. Betty feels that there is a right and wrong about sexual choices that transcends any individual act, and Jennifer — exceptions noted — does not. It’s not that Jennifer lacks for opinions about sex, any more than Betty does about food. It’s just that, for the most part, they are limited to what she personally does and doesn’t like.
Thus far, what the imaginary examples of Betty and Jennifer have established is this: Their personal moral relationships toward food and toward sex are just about perfectly reversed. Betty does care about nutrition and food, but it doesn’t occur to her to extend her opinions to a moral judgment — i.e., to believe that other people ought to do as she does in the matter of food, and that they are wrong if they don’t. In fact, she thinks such an extension would be wrong in a different way; it would be impolite, needlessly judgmental, simply not done. Jennifer, similarly, does care to some limited degree about what other people do about sex; but it seldom occurs to her to extend her opinions to a moral judgment. In fact, she thinks such an extension would be wrong in a different way — because it would be impolite, needlessly judgmental, simply not done.
On the other hand, Jennifer is genuinely certain that her opinions about food are not only nutritionally correct, but also, in some deep, meaningful sense, morally correct — i.e., she feels that others ought to do something like what she does. And Betty, for her part, feels exactly the same way about what she calls sexual morality.
As noted, this desire to extend their personal opinions in two different areas to an “ought” that they think should be somehow binding — binding, that is, to the idea that others should do the same — is the definition of the Kantian imperative. Once again, note: Betty’s Kantian imperative concerns sex not food, and Jennifer’s concerns food not sex. In just over 50 years, in other words — not for everyone, of course, but for a great many people, and for an especially large portion of sophisticated people — the moral poles of sex and food have been reversed. Betty thinks food is a matter of taste, whereas sex is governed by universal moral law of some kind; and Jennifer thinks exactly the reverse.
What has happened here?
ROLE REVERSAL
Betty and Jennifer may be imaginary, but the decades that separate the two women have brought related changes to the lives of many millions. In the 50 years between their two kitchens, a similar polar transformation has taken root and grown not only throughout America but also throughout Western society itself. During those years, cultural artifacts and forces in the form of articles, books, movies, and ideas aimed at deregulating what is now quaintly called “nonmarital sex” have abounded and prospered; while the cultural artifacts and forces aimed at regulating or seeking to re-regulate sex outside of marriage have largely declined. In the matter of food, on the other hand, exactly the reverse has happened. Increasing attention over the decades to the quality of what goes into people’s mouths has been accompanied by something almost wholly new under the sun: the rise of universalizable moral codes based on food choices.
Begin with the more familiar face of diets and fads — the Atkins diet, the Zone diet, the tea diet, the high-carb diet, Jenny Craig, Weight Watchers, and all the rest of the food fixes promising us new and improved versions of ourselves. Abundant though they and all their relatives are, those short-term fads and diets are nevertheless merely epiphenomena.
Dig a little deeper, and the obsession with food that they reflect resonates in many other strata of the commercial marketplace. Book reading, for example, may indeed be on the way out, but until it goes, cookbooks and food books remain among the most reliable moneymakers in the industry. To scan the bestseller lists or page through the major reviews in any given month is to find that books on food and food-thought are at least reliably represented, and sometimes even predominate — to list a few from the past few years alone: Michael Pollan’s The Omnivore’s Dilemma; Eric Schlosser’s Fast Food Nation; Gary Taubes’ Good Calories, Bad Calories; Bill Buford’s Heat.
Then there are the voyeur and celebrity genres, which have made some chefs the equivalent of rock stars and further feed public curiosity with books like Kitchen Confidential: Adventures in the Culinary Underbelly or Service Included: Four-Star Secrets of an Eavesdropping Waiter or The Devil in the Kitchen: Sex, Pain, Madness, and the Making of a Great Chef. Anywhere you go, anywhere you look, food in one form or another is what’s on tap. The proliferation of chains like Whole Foods, the recent institution by Governor Arnold Schwarzenegger of state-mandated nutritional breakdowns in California restaurants (a move sure to be repeated by governors in the other 49 states): All these and many other developments speak to the paramount place occupied by food and food choices in the modern consciousness. As the New York Times Magazine noted recently, in a foreword emphasizing the intended expansion of its (already sizeable) food coverage, such writing is “perhaps never a more crucial part of what we do than today — a moment when what and how we eat has emerged as a Washington issue and a global-environmental issue as well as a kitchen-table one.”
Underneath the passing fads and short-term fixes and notices like these, deep down where the real seismic change lies, is a series of revolutions in how we now think about food — changes that focus not on today or tomorrow, but on eating as a way of life.
One recent influential figure in this tradition was George Ohsawa, a Japanese philosopher who codified what is known as macrobiotics. Popularized in the United States by his pupil, Michio Kushi, macrobiotics has been the object of fierce debate for several decades now, and Kushi’s book, The Macrobiotic Path to Total Health: A Complete Guide to Naturally Preventing and Relieving More Than 200 Chronic Conditions and Disorders, remains one of the modern bibles on food. Macrobiotics makes historical as well as moral claims, including the claim that its tradition stretches back to Hippocrates and includes Jesus and the Han dynasty among other enlightened beneficiaries. These claims are also reflected in the macrobiotic system, which includes the expression of gratitude (not exactly prayers) for food, serenity in the preparation of it, and other extra-nutritional ritual. And even as the macrobiotic discipline has proved too ascetic for many people (and certainly for most Americans), one can see its influence at work in other serious treatments of the food question that have trickled outward. The current popular call to “mindful eating,” for example, echoes the macrobiotic injunction to think of nothing but food and gratitude while consuming, even to the point of chewing any given mouthful at least 50 times.
Alongside macrobiotics, the past decades have also seen tremendous growth in vegetarianism and its related offshoots, another food system that typically makes moral as well as health claims. As a movement, and depending on which part of the world one looks at, vegetarianism predates macrobiotics.1 Vegetarian histories claim for themselves the Brahmins, Buddhists, Jains, and Zoroastrians, as well as certain Jewish and Christian practitioners. In the modern West, Percy Bysshe Shelley was a prominent activist in the early nineteenth century; and the first Vegetarian Society was founded in England in 1847.
Around the same time in the United States, a Presbyterian minister named Sylvester Graham popularized vegetarianism in tandem with a campaign against excess of all kinds (ironically, under the circumstances, this health titan is remembered primarily for the Graham cracker). Various other American religious groups have also gone in for vegetarianism, including the Seventh Day Adventists, studies on whom make up some of the most compelling data about the possible health benefits of a diet devoid of animal flesh. Uniting numerous discrete movements under one umbrella is the International Vegetarian Union, which started just a hundred years ago, in 1908.
Despite this long history, though, it is clear that vegetarianism, apart from its role in religious movements, did not really take off as a mass movement until relatively recently. Even so, its contemporary success has been remarkable. Pushed perhaps by the synergistic public interest in macrobiotics and nutritional health, and nudged also by occasional rallying books including Peter Singer’s Animal Liberation and Matthew Scully’s Dominion, vegetarianism today is one of the most successful secular moral movements in the West; whereas macrobiotics for its part, though less successful as a mass movement by name, has witnessed the vindication of some of its core ideas and stands as a kind of synergistic brother in arms.
To be sure, macrobiotics and vegetarianism/veganism have their doctrinal differences. Macrobiotics limits animal flesh not out of moral indignation, but for reasons of health and Eastern ideas of proper “balancing” of the forces of yin and yang. Similarly, macrobiotics also allows for moderate amounts of certain types of fish — as strict vegans do not. On the other hand, macrobiotics also bans a number of plants (among them tomatoes, potatoes, peppers, and tropical fruits), whereas vegetarianism bans none. Nonetheless, macrobiotics and vegetarianism have more in common than not, especially from the point of view of anyone eating outside either of these codes. The doctrinal differences separating one from another are about equivalent in force today to those between, say, Presbyterians and Lutherans.
And that is exactly the point. For many people, schismatic differences about food have taken the place of schismatic differences about faith. Again, the curiosity is just how recent this is. Throughout history, practically no one devoted this much time to matters of food as ideas (as opposed to, say, time spent gathering the stuff). Still less does it appear to have occurred to people that dietary schools could be untethered from a larger metaphysical and moral worldview. Observant Jews and Muslims, among others, have had strict dietary laws from their faiths’ inception; but that is just it — their laws told believers what to do with food when they got it, rather than inviting them to dwell on food as a thing in itself. Like the Adventists, who speak of their vegetarianism as being “harmony with the Creator,” or like the Catholics with their periodic Lenten and other obligations, these previous dietary laws were clearly designed to enhance religion — not replace it.
Do today’s influential dietary ways of life in effect replace religion? Consider that macrobiotics, vegetarianism, and veganism all make larger health claims as part of their universality — but unlike yesteryear, to repeat the point, most of them no longer do so in conjunction with organized religion. Macrobiotics, for its part, argues (with some evidence) that processed foods and too much animal flesh are toxic to the human body, whereas whole grains, vegetables, and fruits are not. The literature of vegetarianism makes a similar point, recently drawing particular attention to new research concerning the connection between the consumption of red meat and certain cancers. In both cases, however, dietary laws are not intended to be handmaidens to a higher cause, but moral causes in themselves.
Just as the food of today often attracts a level of metaphysical attentiveness suggestive of the sex of yesterday, so does food today seem attended by a similarly evocative — and proliferating — number of verboten signs. The opprobrium reserved for perceived “violations” of what one “ought” to do has migrated, in some cases fully, from one to the other. Many people who wouldn’t be caught dead with an extra ten pounds — or eating a hamburger, or wearing real leather — tend to be laissez-faire in matters of sex. In fact, just observing the world as it is, one is tempted to say that the more vehement people are about the morality of their food choices, the more hands-off they believe the rest of the world should be about sex. What were the circumstances the last time you heard or used the word “guilt” — in conjunction with sin as traditionally conceived? Or with having eaten something verboten and not having gone to the gym?
Perhaps the most revealing example of the infusion of morality into food codes can be found in the current European passion for what the French call terroir — an idea that originally referred to the specific qualities conferred by geography on certain food products (notably wine) and that has now assumed a life of its own as a moral guide to buying and consuming locally. That there is no such widespread, concomitant attempt to impose a new morality on sexual pursuits in Western Europe seems something of an understatement. But as a measure of the reach of terroir as a moral code, consider only a sermon from Durham Cathedral in 2007. In it, the dean explained Lent as an event that “says to us, cultivate a good terroir, a spiritual ecology that will re-focus our passion for God, our praying, our pursuit of justice in the world, our care for our fellow human beings.”
There stands an emblematic example of the reversal between food and sex in our time: in which the once-universal moral code of European Christianity is being explicated for the masses by reference to the now apparently more-universal European moral code of consumption à la terroir.
Moreover, this reversal between sex and food appears firmer the more passionately one clings to either pole. Thus, for instance, though much has lately been made of the “greening” of the evangelicals, no vegetarian Christian group is as nationally known as, say, People for the Ethical Treatment of Animals or any number of other vegetarian/vegan organizations, most of which appear to be secular or anti-religious and none of which, so far as my research shows, extend their universalizable moral ambitions to the realm of sexuality. When Skinny Bitch — a hip guide to veganism that recently topped the bestseller lists for months — exhorts its readers to a life that is “clean, pure, healthy,” for example, it is emphatically not including sex in this moral vocabulary, and makes a point of saying so.
C.S. Lewis once compared the two desires as follows, to make the point that the appetite for sex had grown out of all proportion in his own time: “There is nothing to be ashamed of in enjoying your food: there would be everything to be ashamed of if half the world made food the main interest of their lives and spent their time looking at pictures of food and dribbling and smacking their lips.” He was making a point in the genre of reductio ad absurdum.
But for the jibe to work as it once did, our shared sense of what is absurd about it must work too — and that shared sense, in an age as visually, morally, and aesthetically dominated by food as is our own, is waning fast. Consider the coining of the term “gastroporn” to describe the eerily similar styles of high-definition pornography on the one hand and stylized shots of food on the other. Actually, the term is not even that new. It dates back at least 30 years, to a 1977 essay by that title in the New York Review of Books. In it, author Alexander Cockburn observed that “it cannot escape attention that there are curious parallels between manuals on sexual techniques and manuals on the preparation of food; the same studious emphasis on leisurely technique, the same apostrophes to the ultimate, heavenly delights. True gastro-porn heightens the excitement and also the sense of the unattainable by proffering colored photographs of various completed recipes.”
With such a transfer, the polar migrations of food and sex during the last half century would appear complete.
RESPECTING SOME HAZARDS, IGNORING OTHERS
If it is true that food is the new sex, however, where does that leave sex? This brings us to the paradox already hinted at. As the consumption of food not only literally but also figuratively has become progressively more discriminate and thoughtful, at least in theory (if rather obviously not always in practice), the consumption of sex in various forms appears to have become the opposite for a great many people: i.e., progressively more indiscriminate and unthinking.
Several proofs could be offered for such a claim, beginning with any number of statistical studies. Both men and women are far less likely to be sexually inexperienced on their wedding day now (if indeed they marry) than they were just a few decades ago. They are also more likely to be experienced in all kinds of ways, including in the use of pornography. As with the example of Jennifer, moreover, people’s general attitudes toward sex become more laissez-faire the further down the age demographic one goes.
Consider as further proof of the dumbing-down of sex the coarseness of popular entertainment, say, the popular advice column in the left-leaning Slate magazine called “Dear Prudence,” which concerns “manners and morals.” Practically every subject line is a window onto a world of cheap, indiscriminate sex, where the only ground rule is apparently that no sexual urge shall ever be discouraged unless it manifestly hurts others — meaning literally. “Should I destroy the erotic video my husband and I have made?” “My boyfriend’s kinky fetish might doom our relationship.” “My husband wants me to abort, and I don’t.” “How do I tell my daughter she’s the result of a sexual assault?” “A friend confessed to a fling with my now-dead husband.” And so on. The mindful vegetarian slogan, “you are what you eat,” has no counterpart in the popular culture today when it comes to sex.
The third and probably most important feature of sex in our time testifying to the ubiquity of appetites fulfilled and indulged indiscriminately is the staggering level of consumption of Internet pornography. As Ross Douthat recently summarized in an essay for the Atlantic, provocatively titled “Is Pornography Adultery?”:
Over the past three decades, the VCR, on-demand cable service, and the Internet have completely overhauled the ways in which people interact with porn. Innovation has piled on innovation, making modern pornography a more immediate, visceral, and personalized experience. Nothing in the long history of erotica compares with the way millions of Americans experience porn today, and our moral intuitions are struggling to catch up.
Statistics too, or at least preliminary ones, bear out just how consequential this erotic novelty is becoming. Pornography is the single most viewed subject online, by men anyway; it is increasingly a significant factor in divorce cases; and it is resulting in any number of cottage industries, from the fields of therapy to law to academia, as society’s leading cultural institutions strive to measure and cope with its impact.2
This junk sex shares all the defining features of junk food. It is produced and consumed by people who do not know one another. It is disdained by those who believe they have access to more authentic experience or “healthier” options. Internet pornography, moreover, is widely said — right now, in its relatively early years — to be harmless, much as people thought little of the ills to come from convenient prepared food when it first appeared; and evidence is beginning to emerge, as it did slowly but surely in the case of compulsive packaged-food consumption, that this laissez-faire judgment about compulsive pornography consumption is wrong.3
This brings us to another similarity between junk sex and junk food: People are furtive about both, and many feel guilty about their pursuit and indulgence of each. And those who consume large amounts of both are typically self-deceptive as well: i.e., they underestimate just how much they do it and deny its ill effects on the rest of their lives. In sum, to compare junk food to junk sex is to realize that they have become virtually interchangeable vices — even if many people who do not put “sex” in the category of vice will readily do so with food.
At this point, the impatient reader will interject that something else — something understandable and anodyne — is driving the increasing attention to food in our day: namely, the fact that we have learned much more than humans used to know about the importance of a proper diet to health and longevity. And this is surely a point borne out by the facts, too. One attraction of macrobiotics, for example, is its promise to reduce the risks of cancer. The fall in cholesterol that attends a true vegan or vegetarian diet is another example. Manifestly, one reason that people today are so much more discriminating about food is that decades of recent research have taught us that diet has more potent effects than Betty and her friends understood, and can be bad for you or good for you in ways not enumerated before.
All that is true, but then the question is this: Why aren’t more people doing the same with sex?
For here we come to the most fascinating turn of all. One cannot answer the question by arguing that there is no such empirical news about indiscriminately pursued sex and how it can be good or bad for you; to the contrary, there is, and lots of it. After all, several decades of empirical research — which also did not exist before — have demonstrated that the sexual revolution, too, has had consequences, and that many of them have redounded to the detriment of a sexually liberationist ethic.
Married, monogamous people are more likely to be happy. They live longer. These effects are particularly evident for men. Divorced men, conversely, face health risks — including heightened drug use and alcoholism — that married men do not. Married men also work more and save more, and married households not surprisingly trump other households in income. Divorce, by contrast, is often a financial catastrophe for a family, particularly for the women and children in it. Illegitimacy, too, is typically a financial disaster.
By any number of measures, moreover, nontraditional sexual morality — and the fallout from it — is detrimental to the well-being of one specifically vulnerable subset: children. Children from broken homes are at risk for all kinds of behavioral, psychological, educational, and other problems that children from intact homes are not. Children from fatherless homes are far more likely to end up in prison than are those who grew up with both biological parents. Girls growing up without a biological father are far more likely to suffer physical or sexual abuse. Girls and boys, numerous sources also show, are adversely affected by family breakup into adulthood, and have higher risks than children from intact homes of repeating the pattern of breakup themselves.
This recital touches only the periphery of the empirical record now being assembled about the costs of laissez-faire sex to American society — a record made all the more interesting by the fact that it could not have been foreseen back when sexual liberationism seemed merely synonymous with the removal of some seemingly inexplicable old stigmas. Today, however, two generations of social science replete with studies, surveys, and regression analyses galore stand between the Moynihan Report and what we know now, and the overall weight of its findings is clear. The sexual revolution — meaning the widespread extension of sex outside of marriage and frequently outside commitment of any kind — has had negative effects on many people, chiefly the most vulnerable; and it has also had clear financial costs to society at large. And this is true not only in the obvious ways, like the spread of AIDS and other STDs, but also in other ways affecting human well-being, beginning but not ending with those enumerated above.
The question raised by this record is not why some people changed their habits and ideas when faced with compelling new facts about food and quality of life. It is rather why more people have not done the same about sex.
THE MINDLESS SHIFT
When Friedrich Nietzsche wrote longingly of the “transvaluation of all values,” he meant the hoped-for restoration of sexuality to its proper place as a celebrated, morally neutral life force. He could not possibly have foreseen our world: one in which sex would indeed become “morally neutral” in the eyes of a great many people — even as food would come to replace it as a source of moral authority.4
Nevertheless, events have proven Nietzsche wrong about his wider hope that men and women of the future would simply enjoy the benefits of free sex without any attendant seismic shifts. For there may in fact be no such thing as a destigmatization of sex simpliciter, as the events outlined in this essay suggest. The rise of a recognizably Kantian, morally universalizable code concerning food — beginning with the international vegetarian movement of the last century and proceeding with increasing moral fervor into our own times via macrobiotics, veganism/vegetarianism, and European codes of terroir — has paralleled exactly the waning of a universally accepted sexual code in the Western world during these same years.
Who can doubt that the two trends are related? Unable or unwilling (or both) to impose rules on sex at a time when it is easier to pursue it than ever before, yet equally unwilling to dispense altogether with a universal moral code that would bind society against the problems created by exactly that pursuit, modern man (and woman) has apparently performed his own act of transubstantiation. He has taken longstanding morality about sex and transferred it onto food. The all-you-can-eat buffet is now stigmatized; the sexual smorgasbord is not.
In the end, it is hard to avoid the conclusion that the rules being drawn around food receive some force from the fact that people are uncomfortable with how far the sexual revolution has gone — and not knowing what to do about it, they turn for increasing consolation to mining morality out of what they eat.
So what does it finally mean to have a civilization puritanical about food, and licentious about sex? In this sense, Nietzsche’s fabled madman came not too late, but too early — too early to have seen the empirical library that would be amassed from the mid-twentieth century on, testifying to the problematic social, emotional, and even financial nature of exactly the solution he sought.
It is a curious coda that this transvaluation should not be applauded by the liberationist heirs of Nietzsche, even as their day in the sun seems to have come. According to them, after all, consensual sex is simply what comes naturally, and ought therefore to be judged value-free. But as the contemporary history outlined in this essay goes to show, the same can be said of overeating — and overeating is something that today’s society is manifestly embarked on re-stigmatizing. It may be doing so for very different reasons than the condemnations of gluttony outlined by the likes of Gregory the Great and St. Thomas Aquinas. But if indiscriminate sex can also have a negative impact — and not just in the obvious sense of disease, but in the other aspects of psyche and well-being now being written into the empirical record of the sexual revolution — then indiscriminate sex may be judged to need reining in, too.
So if there is a moral to this curious transvaluation, it would seem to be that the norms society imposes on itself in pursuit of its own self-protection do not wholly disappear, but rather mutate and move on, sometimes in curious guises. Far-fetched though it seems at the moment, where mindless food is today, mindless sex — in light of the growing empirical record of its own unleashing — may yet again be tomorrow.
Mary Eberstadt is a research fellow at the Hoover Institution and consulting editor to Policy Review.
1 As defined by the International Vegetarian Union, a vegetarian eats no animals but may eat eggs and dairy (and is then an ovo-lacto vegetarian). A pescetarian is a vegetarian who allows the consumption of fish. A vegan excludes both animals and animal products from his diet, including honey. Vegetarians and vegans can be further refined into numerous other categories — fruitarian, Halal vegetarian, and so on. The terminological complexity here only amplifies the point that food now attracts the taxonomical energies once devoted to, say, metaphysics.
2 For a general discussion, see Pamela Paul, Pornified: How Pornography is Transforming Our Lives, Our Relationships, and Our Families (Times Books, 2005).
3 For clinical accounts of the evidence of harm, see, for example, Ana J. Bridges, “Pornography’s Effects on Interpersonal Relationships,” and Jill C. Manning, “The Impact of Pornography on Women,” papers presented to a conference on “The Social Costs of Pornography,” Princeton University (December 2008). For further information and for pre-consultation drafts of these papers, see http://www.winst.org/family_marriage_and_democracy/social_costs_of_pornography/consultation2008.php (accessed January 7, 2009). The papers also include an interesting econometric assessment of what is spent to avoid or recover from pornography addiction: Kirk Doran, “The Economics of Pornography.”
4 Interestingly, Nietzsche does appear to have foreseen the universalizability of vegetarianism, writing in the 1870s, “I believe that the vegetarians, with their prescription to eat less and more simply, are of more use than all the new moral systems taken together. . . . There is no doubt that the future educators of mankind will also prescribe a stricter diet.” Also interesting, Adolf Hitler — whose own vegetarianism appears to have been adopted because of Wagner’s (Wagner in turn had been convinced by the sometime vegetarian Nietzsche) — reportedly remarked in 1941 that “there’s one thing I can predict to eaters of meat: the world of the future will be vegetarian.”