Wednesday, November 7, 2007

Dubious Fees Hit Borrowers in Foreclosures

November 6, 2007
By GRETCHEN MORGENSON


As record numbers of homeowners default on their mortgages, questionable practices among lenders are coming to light in bankruptcy courts, leading some legal specialists to contend that companies instigating foreclosures may be taking advantage of imperiled borrowers.
Because there is little oversight of foreclosure practices and the fees that are charged, bankruptcy specialists fear that some consumers may be losing their homes unnecessarily or that mortgage servicers, who collect loan payments, are profiting from foreclosures.
Bankruptcy specialists say lenders and loan servicers often do not comply with even the most basic legal requirements, like correctly computing the amount a borrower owes on a foreclosed loan or providing proof of holding the mortgage note in question.
“Regulators need to look beyond their current, myopic focus on loan origination and consider how servicers’ calculation and collection practices leave families vulnerable to foreclosure,” said Katherine M. Porter, associate professor of law at the University of Iowa.
In an analysis of foreclosures in Chapter 13 bankruptcy, the program intended to help troubled borrowers save their homes, Ms. Porter found that questionable fees had been added to almost half of the loans she examined, and many of the charges were identified only vaguely. Most of the fees were less than $200 each, but collectively they could raise millions of dollars for loan servicers at a time when the other side of the business, mortgage origination, has faltered.
In one example, Ms. Porter found that a lender had filed a claim stating that the borrower owed more than $1 million. But after the loan history was scrutinized, the balance turned out to be $60,000. And a judge in Louisiana is considering an award for sanctions against Wells Fargo in a case in which the bank assessed improper fees and charges that added more than $24,000 to a borrower’s loan.
Ms. Porter’s analysis comes as more homeowners face foreclosure. Testifying before Congress on Tuesday, Mark Zandi, the chief economist at Moody’s Economy.com, estimated that two million families would lose their homes by the end of the current mortgage crisis.
Questionable practices by loan servicers appear to be enough of a problem that the Office of the United States Trustee, a division of the Justice Department that monitors the bankruptcy system, is getting involved. Last month, it announced plans to move against mortgage servicing companies that file false or inaccurate claims, assess unreasonable fees or fail to account properly for loan payments after a bankruptcy has been discharged.
On Oct. 9, the Chapter 13 trustee in Pittsburgh asked the court to sanction Countrywide, the nation’s largest loan servicer, saying that the company had lost or destroyed more than $500,000 in checks paid by homeowners in foreclosure from December 2005 to April 2007.
The trustee, Ronda J. Winnecour, said in court filings that she was concerned that even as Countrywide misplaced or destroyed the checks, it levied charges on the borrowers, including late fees and legal costs.
“The integrity of the bankruptcy process is threatened when a single creditor dishonors its obligation to provide a truthful and accurate account of the funds it has received,” Ms. Winnecour said in requesting sanctions.
A Countrywide spokesman disputed the accusations about the lost checks, saying the company had no record of having received the payments the trustee said had been sent. It is Countrywide’s practice not to charge late fees to borrowers in bankruptcy, he said, adding that the company also does not charge fees or costs relating to its own mistakes.
Loan servicing is extremely lucrative. Servicers, which collect payments from borrowers and pass them on to investors who own the loans, generally receive a percentage of income from a loan, often 0.25 percent on a prime mortgage and 0.50 percent on a subprime loan. Servicers typically generate profit margins of about 20 percent.
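To put the servicing economics in rough numbers, the sketch below applies the percentages cited above to hypothetical loan balances; the $200,000 prime and $150,000 subprime balances are assumptions for illustration, not figures from the article.

```python
# Back-of-the-envelope servicing economics.
# The 0.25%, 0.50% and 20% figures come from the article; the loan balances are assumed.

def annual_servicing_fee(balance, rate):
    """Servicing fees are commonly quoted as an annual percentage of the loan balance."""
    return balance * rate

prime_fee = annual_servicing_fee(200_000, 0.0025)     # assumed $200,000 prime mortgage
subprime_fee = annual_servicing_fee(150_000, 0.0050)  # assumed $150,000 subprime mortgage
margin = 0.20                                         # ~20% profit margin cited above

print(f"Prime:    ${prime_fee:,.0f} a year in fees, roughly ${prime_fee * margin:,.0f} in profit")
print(f"Subprime: ${subprime_fee:,.0f} a year in fees, roughly ${subprime_fee * margin:,.0f} in profit")
```

On those assumed balances, base servicing income runs to a few hundred dollars per loan per year, which helps explain why the ancillary fees described below loom so large for servicers.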
Now that big lenders are originating fewer mortgages, servicing revenues make up a greater percentage of earnings. Because servicers typically keep late fees and certain other charges assessed on delinquent or defaulted loans, “a borrower’s default can present a servicer with an opportunity for additional profit,” Ms. Porter said.
The amounts can be significant. Late fees accounted for 11.5 percent of servicing revenues in 2006 at Ocwen Financial, a big servicing company. At Countrywide, $285 million came from late fees last year, up 20 percent from 2005. Late fees accounted for 7.5 percent of Countrywide’s servicing revenue last year.
But these are not the only charges borrowers face. Others include $145 in something called “demand fees,” $137 in overnight delivery fees, fax fees of $50 and payoff statement charges of $60. Property inspection fees can be levied every month or so, and fees can be imposed every two months to cover assessments of a home’s worth.
“We’re talking about millions and millions of dollars that mortgage servicers are extracting from debtors that I think are totally unlawful and illegal,” said O. Max Gardner III, a lawyer in Shelby, N.C., specializing in consumer bankruptcies. “Somebody files a Chapter 13 bankruptcy, they make all their payments, get their discharge and then three months later, they get a statement from their servicer for $7,000 in fees and charges incurred in bankruptcy but that were never applied for in court and never approved.”
Some fees levied by loan servicers in foreclosure run afoul of state laws. In 2003, for example, a New York appeals court disallowed a $100 payoff statement fee sought by North Fork Bank.
Fees for legal services in foreclosure are also under scrutiny.
A class-action lawsuit filed in September in Federal District Court in Delaware accused the Mortgage Electronic Registration System, a home loan registration system owned by Fannie Mae, Countrywide Financial and other large lenders, of overcharging borrowers for legal services in foreclosures. The system, known as MERS, oversees more than 20 million mortgage loans.
The complaint was filed on behalf of Jose Trevino and Lorry S. Trevino of University City, Mo., whose Washington Mutual loan went into foreclosure in 2006 after the couple became ill and fell behind on payments.
Jeffrey M. Norton, a lawyer who represents the Trevinos, said that although MERS pays a flat rate of $400 or $500 to its lawyers during a foreclosure, the legal fees that it demands from borrowers are three or four times that.
A spokeswoman for MERS declined to comment.
Typically, consumers who are behind on their mortgages but hoping to stay in their homes invoke Chapter 13 bankruptcy because it puts creditors on hold, giving borrowers time to put together a repayment plan.
Given that a Chapter 13 bankruptcy involves the oversight of a court, the findings in Ms. Porter’s study are especially troubling. In July, she presented her paper to the United States trustee, and on Oct. 12 she outlined her data for the National Conference of Bankruptcy Judges in Orlando, Fla.
With Tara Twomey, who is a lecturer at Stanford Law School and a consultant for the National Association of Consumer Bankruptcy Attorneys, Ms. Porter analyzed 1,733 Chapter 13 filings made in April 2006. The data were drawn from public court records and include schedules filed under penalty of perjury by borrowers listing debts, assets and income.
Though bankruptcy laws require creditors to document their claims on the property, 4 out of 10 claims in Ms. Porter’s study did not include the supporting promissory note. And one in six claims was not supported by the itemization of charges required by law.
Without proper documentation, families must choose between the costs of filing an objection or the risk of overpayment, Ms. Porter concluded.
She also found that some creditors ask for fees, like fax charges and payoff statement fees, that would probably be considered “unreasonable” by the courts.
Not surprisingly, these fees may contribute to the other problem identified by her study: a discrepancy between what debtors think they owe and what creditors say they are owed.
In 96 percent of the claims Ms. Porter studied, the borrower and the lender disagreed on the amount of the mortgage debt. In about a quarter of the cases, borrowers thought they owed more than the creditors claimed, but in about 70 percent, the creditors asserted that the debt owed was greater than the amounts specified by borrowers.
The median difference between the amounts the creditor and the borrower submitted was $1,366; the average was $3,533, Ms. Porter said. In 30 percent of the cases in which creditors’ claims were higher, the discrepancy was greater than 5 percent of the homeowners’ figure.
Based on the study, mortgage creditors in the 1,733 cases put in claims for almost $6 million more than the loan debts listed by borrowers in the bankruptcy filings. The discrepancies are too big, Ms. Porter said, to be simple record-keeping errors.
Michael L. Jones, a homeowner going through a Chapter 13 bankruptcy in Louisiana, experienced such a discrepancy with Wells Fargo Home Mortgage. After being told that he owed $231,463.97 on his mortgage, he disputed the amount and ultimately sued Wells Fargo.
In April, Elizabeth W. Magner, a federal bankruptcy judge in Louisiana, ruled that Wells Fargo overcharged Mr. Jones by $24,450.65, or 12 percent more than what the court said he actually owed. The court attributed some of that to arithmetic errors but found that Wells Fargo had improperly added charges, including $6,741.67 in commissions to the sheriff’s office that were not owed, almost $13,000 in additional interest and fees for 16 unnecessary inspections of the borrower’s property in the 29 months the case was pending.
“Incredibly, Wells Fargo also argues that it was debtor’s burden to verify that its accounting was correct,” the judge wrote, “even though Wells Fargo failed to disclose the details of that accounting until it was sued.”
A Wells Fargo spokesman, Kevin Waetke, said the bank would not comment on the details of the case as the bank is appealing a motion by Mr. Jones for sanctions. “All of our practices and procedures in the handling of bankruptcy cases follow applicable laws, and we stand behind our actions in this case,” he said.
In Texas, a United States trustee has asked for sanctions against Barrett Burke Wilson Castle Daffin & Frappier, a Houston law firm that sues borrowers on behalf of the lenders, for providing inaccurate information to the court about mortgage payments made by homeowners who sought refuge in Chapter 13.
Michael C. Barrett, a partner at the firm, said he did not expect the firm to be sanctioned.
“We certainly believe we have not misbehaved in any way,” he said, adding that the trustee’s office became involved because it is trying to persuade Congress to increase its budget. “It is trying to portray itself as an organ to pursue mortgage bankers.”
Closing arguments in the case are scheduled for Dec. 12.

Wednesday, October 24, 2007

Six Fingers of Blame in the Mortgage Mess



By ALAN S. BLINDER
Published in the NYT on September 30, 2007

SOMETHING went badly wrong in the subprime mortgage market. In fact, several things did. And now quite a few homeowners, investors and financial institutions are feeling the pain. So far, harried policy makers have understandably focused on crisis management, on getting out of this mess. But soon the nation will turn to recrimination — to good old-fashioned finger-pointing.

Finger-pointing is often decried both as mean-spirited and as a distraction from the more important task of finding remedies. I beg to differ. Until we diagnose what went wrong with subprime, we cannot even begin to devise policy changes that might protect us from a repeat performance. So here goes. Because so much went wrong, the fingers on one hand will not be enough.

The first finger points at households who borrowed recklessly to buy homes, often saddling themselves with mortgages that were all too likely to default. They should have known better. But what can we do to guard against it happening again?

Not much, I’m afraid. Gullible consumers have been around since Adam consumed that apple. Greater financial literacy might help, but I’m dubious about our ability to deliver it effectively. The Federal Reserve is working on clearer mortgage disclosures to help borrowers understand what they are getting themselves into. (“Warning! This mortgage can be dangerous to your family’s financial health.”) While I applaud the effort, I’m skeptical that it will work. If you have ever closed on a home, you know that the disclosure forms you receive are copious and dense. Should we add even more?

Fewer words, and in plainer English, might help, especially if they highlighted the truly important risks. (“In two years, your mortgage payments could double.”) But the truth is that there is much to disclose, that complicated mortgage products are, well, complicated, and that people don’t read those documents anyway.

It seems more promising to point a finger directly at lenders. Some lenders sold mortgage products that were plainly inappropriate for their customers, and that those customers did not understand. There were numerous cases of unsophisticated borrowers being led into risky mortgages.

Here, something can be done. For openers, we need to think about devising a “suitability standard” for everyone who sells mortgage products. Under current law, a stockbroker who persuades Granny to use her last $5,000 to buy a speculative stock on margin is in legal peril because the investment is “unsuitable” for her (though perfectly suitable for Warren Buffett). Knowing that, the broker usually doesn’t do it.

But who will create and enforce such a standard for mortgages? Roughly half of recent subprime mortgages originated in mortgage companies that were not part of any bank, and thus stood outside the federal regulatory system. That was trouble waiting for a time and a place to happen. We should place all mortgage lenders under federal regulation.

That said, bank regulators deserve the next finger of blame for not doing a better job of protecting consumers and ensuring that banks followed sound lending practices. Fortunately, the regulators know they underperformed, and repair work is already under way.

Regulators also need to start thinking about how to deal with a serious incentive problem. In old-fashioned finance, a bank that originated a mortgage also held it for years (think of Jimmy Stewart in “It’s a Wonderful Life”), giving it a clear incentive to lend carefully. But in newfangled finance, banks and mortgage brokers originate loans and sell them quickly to a big financial firm that “securitizes” them; in other words, it pools thousands of mortgages and issues marketable securities representing shares in the pool. These “mortgage-backed securities” are then sold to investors worldwide, to people with no idea who the original borrowers are.

Securitization is a marvelous thing. It has lubricated the market and made mortgages more affordable. We certainly don’t want to end it. But securitization sharply reduces the originator’s incentive to scrutinize the creditworthiness of borrowers. After all, if the loan goes sour, someone else will be holding the bag. We need to find ways to restore that incentive, perhaps by requiring loan originators to retain a share of each mortgage.
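
The pass-through arrangement described here can be sketched as a toy model; the payment amounts and investor shares below are invented for illustration and are not meant to capture the tranching of real mortgage-backed securities.

```python
# Toy securitization: pool the monthly payments from several mortgages and
# pass the cash through to investors in proportion to their shares.
# All numbers are invented for illustration.

monthly_payments = [1_200.0, 950.0, 1_400.0, 800.0]          # four pooled mortgages
investor_shares = {"Fund A": 0.50, "Fund B": 0.30, "Fund C": 0.20}

pool_cash = sum(monthly_payments)
for investor, share in investor_shares.items():
    print(f"{investor} receives ${pool_cash * share:,.2f} this month")

# When one borrower defaults, the shortfall lands on the investors, not the
# originator that made the loan: the weakened incentive described above.
monthly_payments[3] = 0.0
print(f"After a default the pool collects ${sum(monthly_payments):,.2f} instead of ${pool_cash:,.2f}")
```

In this stripped-down version, default risk has plainly been handed off to the pool’s investors rather than the originator.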

But wait. Don’t the ultimate investors have every incentive to scrutinize the credits? If they buy riskier mortgage-backed securities in search of higher yields, isn’t that their business? The answer is yes — which leads me to point a fourth finger of blame. By now, it is abundantly clear that many investors, swept up in the euphoria of the moment, failed to pay close attention to what they were buying.

Why did they behave so foolishly? Part of the answer is that the securities, especially the now-notorious C.D.O.’s, for collateralized debt obligations, were probably too complex for anyone’s good — which points a fifth finger, this one at the investment bankers who dreamed them up and marketed them aggressively.

Another part of the answer merits a sixth finger of blame. Investors placed too much faith in the rating agencies — which, to put it mildly, failed to get it right. It is tempting to take the rating agencies out for a public whipping. But it is more constructive to ask how the rating system might be improved. That’s a tough question because of another serious incentive problem.

Under the current system, the rating agencies are hired and paid by the issuers of the very securities they rate — which creates an obvious potential conflict of interest. If I proposed that students pay me directly for grading their work, my dean would be outraged. Yet that’s exactly how securities are rated. This needs to change, but precisely how is not clear.

SO that’s my list of men (and a few women) behaving badly. But as we point all these fingers, let’s remember the sage advice of the late and dearly missed Ned Gramlich, the former Fed governor who saw the emerging subprime problems sooner and clearer than anyone. Yes, the subprime market failed us. But before it blew up, it placed a few million families of modest means in homes they otherwise could not have financed. That accomplishment is worth something — in fact, quite a lot.

We don’t have to destroy the subprime market in order to save it.

Alan S. Blinder is a professor of economics and public affairs at Princeton and former vice chairman of the Federal Reserve. He has advised many Democratic politicians.

Friday, August 17, 2007

Remembering a Classic Investing Theory NYT

August 15, 2007
Economic Scene
Remembering a Classic Investing Theory
By DAVID LEONHARDT

More than 70 years ago, two Columbia professors named Benjamin Graham and David L. Dodd came up with a simple investing idea that remains more influential than perhaps any other. In the wake of the stock market crash in 1929, they urged investors to focus on hard facts — like a company’s past earnings and the value of its assets — rather than trying to guess what the future would bring. A company with strong profits and a relatively low stock price was probably undervalued, they said.
Their classic 1934 textbook, “Security Analysis,” became the bible for what is now known as value investing. Warren E. Buffett took Mr. Graham’s course at Columbia Business School in the 1950s and, after working briefly for Mr. Graham’s investment firm, set out on his own to put the theories into practice. Mr. Buffett’s billions are just one part of the professors’ giant legacy.
Yet somehow, one of their big ideas about how to analyze stock prices has been almost entirely forgotten. The idea essentially reminds investors to focus on long-term trends and not to get caught up in the moment. Unfortunately, when you apply it to today’s stock market, you get even more nervous about what’s going on.

Most Wall Street analysts, of course, say there is nothing to be worried about, at least not beyond the mortgage market. In an effort to calm investors after the recent volatility, analysts have been arguing that stocks are not very expensive right now. The basis for this argument is the standard measure of the market: the price-to-earnings ratio.
It sounds like just the sort of thing the professors would have loved. In its most common form, the ratio is equal to a company’s stock price divided by its earnings per share over the last 12 months. You can skip the math, though, and simply remember that a P/E ratio tells you how much a stock costs relative to a company’s performance. The higher the ratio, the more expensive the stock is — and the stronger the argument that it won’t do very well going forward.
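In code, the standard calculation looks something like this; the share price and quarterly earnings are made-up numbers chosen only to land near the market average discussed below.

```python
# Trailing-12-month P/E: share price divided by the last four quarters of earnings per share.
# The inputs are made-up numbers for illustration.

share_price = 49.50
quarterly_eps = [0.70, 0.75, 0.78, 0.77]   # most recent four quarters

trailing_eps = sum(quarterly_eps)          # 3.00 in earnings per share over 12 months
pe_ratio = share_price / trailing_eps
print(f"Trailing P/E: {pe_ratio:.1f}")     # 16.5 -- in line with the S&P 500 figure cited below
```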
Right now, the stocks in the Standard & Poor’s 500-stock index have an average P/E ratio of about 16.5, which by historical standards is quite normal. Since World War II, the average P/E ratio has been 16.1. During the bubbles of the 1920s and the 1990s, on the other hand, the ratio shot above 40. The core of Wall Street’s reassuring message, then, is that even if the mortgage mess leads to a full-blown credit squeeze, the damage will not last long because stocks don’t have far to fall.
To Mr. Graham and Mr. Dodd, the P/E ratio was indeed a crucial measure, but they would have had a problem with the way that the number is calculated today. Besides advising investors to focus on the past, the two men also cautioned against putting too much emphasis on the recent past. They realized that a few months, or even a year, of financial information could be deeply misleading. It could say more about what the economy happened to be doing at any one moment than about a company’s long-term prospects.
So they argued that P/E ratios should not be based on only one year’s worth of earnings. It is much better, they wrote in “Security Analysis,” to look at profits for “not less than five years, preferably seven or ten years.”
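A minimal sketch of the Graham-Dodd variant follows: divide today’s price by the average of a decade of earnings rather than a single year’s. The earnings history is invented for illustration; Mr. Shiller’s published version of this measure also adjusts past earnings for inflation, a refinement omitted here.

```python
# Graham-Dodd style P/E: price over the average of ten years of earnings per share.
# The earnings history is invented; a fuller version would deflate past earnings by CPI.

share_price = 49.50
annual_eps = [1.10, 1.20, 1.35, 1.40, 1.55, 1.70, 1.95, 2.20, 2.60, 3.00]  # oldest to newest

one_year_pe = share_price / annual_eps[-1]
ten_year_pe = share_price / (sum(annual_eps) / len(annual_eps))

print(f"One-year P/E: {one_year_pe:.1f}")   # about 16.5 -- looks normal when recent profits are booming
print(f"Ten-year P/E: {ten_year_pe:.1f}")   # about 27 -- the smoothed measure Graham and Dodd urged
```

With these assumed numbers the one-year ratio comes out near 16 while the smoothed ratio is near 27, the same gap the article goes on to describe for the market as a whole.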
This advice has been largely lost to history. For one thing, collecting a decade’s worth of earnings data can be time consuming. It also seems a little strange to look so far into the past when your goal is to predict future returns.
But at least two economists have remembered the advice. For years, John Y. Campbell and Robert J. Shiller have been calculating long-term P/E ratios. When they were invited to make a presentation to Alan Greenspan in 1996, they used the statistic to argue that stocks were badly overvalued. A few days later, Mr. Greenspan touched off a brief worldwide sell-off by wondering aloud whether “irrational exuberance” was infecting the markets. In 2000, not long before the market began its real swoon, Mr. Shiller published a book that used Mr. Greenspan’s phrase as its title.
Today, the Graham-Dodd approach produces a very different picture from the one that Wall Street has been offering. Based on average profits over the last 10 years, the P/E ratio has been hovering around 27 recently. That’s higher than it has been at any other point over the last 130 years, save the great bubbles of the 1920s and the 1990s. The stock run-up of the 1990s was so big, in other words, that the market may still not have fully worked it off.
Now, this one statistic does not mean that a bear market is inevitable. But it does offer a good framework for thinking about stocks.
Over the last few years, corporate profits have soared. Economies around the world have been growing, new technologies have made companies more efficient and for a variety of reasons — globalization and automation chief among them — workers have not been able to demand big pay increases. In just three years, from 2003 to 2006, inflation-adjusted corporate profits jumped more than 30 percent, according to the Commerce Department. This profit boom has allowed standard, one-year P/E ratios to remain fairly low.
Going forward, one possibility is that the boom will continue. In this case, the Graham-Dodd P/E ratio doesn’t really matter. It is capturing a reality that no longer exists, and stocks could do well over the next few years.
The other possibility is that the boom will prove fleeting. Perhaps the recent productivity gains will peter out (as some measures suggest is already happening). Or perhaps the world’s major economies will slump in the next few years. If something along these lines happens, stocks may suddenly start to look very expensive.
In the long term, the stock market will almost certainly continue to be a good investment. But the next few years do seem to depend on a more rickety foundation than Wall Street’s soothing words suggest. Many investors are banking on the idea that the economy has entered a new era of rapid profit growth, and investments that depend on the words “new era” don’t usually do so well.
That makes for one more risk in a market that is relearning the meaning of the word.
E-mail: leonhardt@nytimes.com

The legacy of Indian partition - New Yorker article

Exit Wounds
The legacy of Indian partition.
by Pankaj Mishra August 13, 2007

Sixty years ago, on the evening of August 14, 1947, a few hours before Britain’s Indian Empire was formally divided into the nation-states of India and Pakistan, Lord Louis Mountbatten and his wife, Edwina, sat down in the viceregal mansion in New Delhi to watch the latest Bob Hope movie, “My Favorite Brunette.” Large parts of the subcontinent were descending into chaos, as the implications of partitioning the Indian Empire along religious lines became clear to the millions of Hindus, Muslims, and Sikhs caught on the wrong side of the border. In the next few months, some twelve million people would be uprooted and as many as a million murdered. But on that night in mid-August the bloodbath—and the fuller consequences of hasty imperial retreat—still lay in the future, and the Mountbattens probably felt they had earned their evening’s entertainment.
Mountbatten, the last viceroy of India, had arrived in New Delhi in March, 1947, charged with an almost impossible task. Irrevocably enfeebled by the Second World War, the British belatedly realized that they had to leave the subcontinent, which had spiralled out of their control through the nineteen-forties. But plans for brisk disengagement ignored messy realities on the ground. Mountbatten had a clear remit to transfer power to the Indians within fifteen months. Leaving India to God, or anarchy, as Mohandas Gandhi, the foremost Indian leader, exhorted, wasn’t a political option, however tempting. Mountbatten had to work hard to figure out how and to whom power was to be transferred.
The dominant political party, the Congress Party, took inspiration from Gandhi in claiming to be a secular organization, representing all four hundred million Indians. But many Muslim politicians saw it as a party of upper-caste Hindus and demanded a separate homeland for their hundred million co-religionists, who were intermingled with non-Muslim populations across the subcontinent’s villages, towns, and cities. Eventually, as in Palestine, the British saw partition along religious lines as the quickest way to the exit.
But sectarian riots in Punjab and Bengal dimmed hopes for a quick and dignified British withdrawal, and boded ill for India’s assumption of power. Not surprisingly, there were some notable absences at the Independence Day celebrations in New Delhi on August 15th. Gandhi, denouncing freedom from imperial rule as a “wooden loaf,” had remained in Calcutta, trying, with the force of his moral authority, to stop Hindus and Muslims from killing each other. His great rival Mohammed Ali Jinnah, who had fought bitterly for a separate homeland for Indian Muslims, was in Karachi, trying to hold together the precarious nation-state of Pakistan.
Nevertheless, the significance of the occasion was not lost on many. While the Mountbattens were sitting down to their Bob Hope movie, India’s constituent assembly was convening in New Delhi. The moment demanded grandiloquence, and Jawaharlal Nehru, Gandhi’s closest disciple and soon to be India’s first Prime Minister, provided it. “Long years ago, we made a tryst with destiny,” he said. “At the stroke of the midnight hour, while the world sleeps, India will awaken to life and freedom. A moment comes, which comes but rarely in history, when we step out from the old to the new, when an age ends, and when the soul of a nation, long suppressed, finds utterance.”
Posterity has enshrined this speech, as Nehru clearly intended. But today his quaint phrase “tryst with destiny” resonates ominously, so enduring have been the political and psychological scars of partition. The souls of the two new nation-states immediately found utterance in brutal enmity. In Punjab, armed vigilante groups, organized along religious lines and incited by local politicians, murdered countless people, abducting and raping thousands of women. Soon, India and Pakistan were fighting a war—the first of three—over the disputed territory of Kashmir. Gandhi, reduced to despair by the seemingly endless cycle of retaliatory mass murders and displacement, was shot dead in January, 1948, by a Hindu extremist who believed that the father of the Indian nation was too soft on Muslims. Jinnah, racked with tuberculosis and overwork, died a few months later, his dream of a secular Pakistan apparently buried with him.
Many of the seeds of postcolonial disorder in South Asia were sown much earlier, in two centuries of direct and indirect British rule, but, as book after book has demonstrated, nothing in the complex tragedy of partition was inevitable. In “Indian Summer” (Henry Holt; $30), Alex von Tunzelmann pays particular attention to how negotiations were shaped by an interplay of personalities. Von Tunzelmann goes on a bit too much about the Mountbattens’ open marriage and their connections to various British royals, toffs, and fops, but her account, unlike those of some of her fellow British historians, isn’t filtered by nostalgia. She summarizes bluntly the economic record of the British overlords, who, though never as rapacious and destructive as the Belgians in the Congo, damaged agriculture and retarded industrial growth in India through a blind faith in the “invisible hand” that supposedly regulated markets. Von Tunzelmann echoes Edmund Burke’s denunciation of the East India Company when she terms the empire’s corporate forerunner a “beast” whose “only object was money”; and she reminds readers that, in 1877, the year that Queen Victoria officially became Empress of India, a famine in the south killed five million people even as the Queen’s viceroy remained adamant that famine relief was a misguided policy.
Politically, too, British rule in India was deeply conservative, limiting Indian access to higher education, industry, and the civil service. Writing in the New York Tribune in the mid-nineteenth century, Karl Marx predicted that British colonials would prove to be the “unconscious tool” of a “social revolution” in a subcontinent stagnating under “Oriental despotism.” As it turned out, the British, while restricting an educated middle class, empowered a multitude of petty Oriental despots. (In 1947, there were five hundred and sixty-five of these feudatories, often called maharajas, running states as large as Belgium and as small as Central Park.)
Though blessed with many able administrators, the British found India just too large and diverse to handle. Many of their decisions stoked Hindu-Muslim tensions, imposing sharp new religious-political identities on Indians. As the recent experience of Iraq proves, elections in a country where the rights and responsibilities of secular and democratic citizenship are largely unknown do little more than crudely assert the majority’s right to rule. British-supervised elections in 1937 and 1946, which the Hindu-dominated Congress won easily, only hardened Muslim identity, and made partition inevitable.
This was a deeper tragedy than is commonly realized—and not only because India today has almost as many Muslims as Pakistan. In a land where cultures, traditions, and beliefs cut across religious communities, few people had defined themselves exclusively through their ancestral faith. The Pashto-speaking Muslim in the North-West Frontier province (later the nursery of the Taliban and Al Qaeda) had little in common with the Bangla-speaking Muslim in the eastern province of Bengal. (Even today, a Sunni Muslim from Lahore has less in common with a Sunni Muslim from Dhaka than he has with a Hindu Brahmin from New Delhi, who, in turn, may find alien the language, food, and dress of a low-caste Hindu from Chennai.) The British policy of defining communities based on religious identity radically altered Indian self-perceptions, as von Tunzelmann points out: “Many Indians stopped accepting the diversity of their own thoughts and began to ask themselves in which of the boxes they belonged.”
Ineptitude and negligence directed British policies in India more than any cynical desire to divide and rule, but the British were not above exploiting rivalries. As late as 1940, Winston Churchill hoped that Hindu-Muslim antagonism would remain “a bulwark of British rule in India.” Certainly Churchill, who did not want his views on India to be “disturbed by any bloody Indians,” was disinclined to recognize the upsurge of nationalism in India. Imperial authority in India rested on the claim that the British, as representatives of a superior civilization, were essentially benign custodians of a fractious country. But as an Indian middle-class élite trained in Western institutions became politicized—more aware of the nature and scale of Indian political and economic subjugation to Britain—self-serving British rhetoric about benevolent masters and volatile natives was bound to be challenged. And no one undermined British assumptions of moral and legal custodianship better than Gandhi, who was adept both at galvanizing the Indian masses and at alerting the British to the gap between their high claims and the reality of their rule. With a series of imaginative, often carefully choreographed campaigns of civil disobedience throughout the nineteen-twenties, Gandhi shook the confidence of the British, becoming, by 1931, as India’s viceroy Lord Willingdon put it in a letter to King George V, a “terribly difficult little person.” Once such middle-class nationalists as Gandhi and Nehru acquired a popular following, independence was only a matter of time. If anything, Gandhi’s doctrine of nonviolence probably reduced the threat that a nationwide uprising would force an early and bloody exit for the British.
Through the nineteen-thirties, Gandhi had a few perceptive and sympathetic British interlocutors, such as the viceroy Lord Irwin, who when asked if he thought Gandhi was tiresome retorted, “Some people thought Our Lord very tiresome.” For the most part, though, Gandhi dealt with such hidebound members of Britain’s landowning class as Lord Linlithgow, who, as viceroy of India in the crucial period from 1936 to 1943, liked to be accompanied into dinner every evening by a band playing “The Roast Beef of Old England”—a tactless choice of preprandial music in the land of the holy cow. In 1939, without consulting any Indian leaders, Linlithgow declared war on Germany on behalf of India, committing two and a half million Indian soldiers to the Allied cause. Convinced that independence for India was many decades away, he found an equally obdurate ally in London once Churchill came to power, in 1940.
In the nineteen-twenties and thirties, Churchill had been loudest among the reactionaries who were determined not to lose India, “the jewel in the crown,” and, as Prime Minister during the Second World War, he tried every tactic to thwart Indian independence. “I hate Indians,” he declared. “They are a beastly people with a beastly religion.” He had a special animus for Gandhi, describing him as a “rascal” and a “half-naked” “fakir.” (In a letter to Churchill, Gandhi took the latter as a compliment, claiming that he was striving for even greater renunciation.) According to his own Secretary of State for India, Leopold Amery, Churchill knew “as much of the Indian problem as George III did of the American colonies.”
In 1942, as the Japanese Army advanced on India, the Congress Party was willing to offer war support in return for immediate self-government. But Churchill was in no mood to negotiate. Frustrated by his stonewalling tactics, the Congress Party launched a vigorous “Quit India” campaign in August of 1942. The British suppressed it ruthlessly, imprisoning tens of thousands, including Gandhi and Nehru. Meanwhile, Churchill’s indispensable quartermaster Franklin D. Roosevelt was aware of the contradiction in claiming to fight for freedom and democracy while keeping India under foreign occupation. In letters and telegrams, he continually urged Churchill to move India toward self-government, only to receive replies that waffled and prevaricated. Muslims, Churchill once claimed, made up seventy-five per cent of the Indian Army (the actual figure was close to thirty-five), and none of them wanted to be ruled by the “Hindu priesthood.”
Von Tunzelmann judges that Churchill, hoping to forestall independence by opportunistically supporting Muslim separatism, instead became “instrumental in creating the world’s first modern Islamic state.” This is a bit unfair—not to Churchill but to Jinnah, the founder of Pakistan. Though always keen to incite Muslim disaffection in his last years, the Anglicized, whiskey-drinking Jinnah was far from being an Islamic theocrat; he wanted a secular Pakistan, in which Muslims, Hindus, and Christians were equal before the law. (In fact, political Islam found only intermittent support within Pakistan until the nineteen-eighties, when the country’s military dictator, working with the Saudis and the C.I.A., turned the North-West Frontier province into the base of a global jihad against the Soviet occupation of neighboring Afghanistan.)
What Leopold Amery denounced as Churchill’s “Hitler-like attitude” to India manifested itself most starkly during a famine, caused by a combination of war and mismanagement, that claimed between one and two million lives in Bengal in 1943. Urgently beseeched by Amery and the Indian viceroy to release food stocks for India, Churchill responded with a telegram asking why Gandhi hadn’t died yet.
“It is strange,” George Orwell wrote in his diary in August, 1942, “but quite truly the way the British government is now behaving in India upsets me more than a military defeat.” Orwell, who produced many BBC broadcasts from London to India during the war, feared that “if these repressive measures in India are seemingly successful, the effects in this country will be very bad. All seems set for a big comeback by the reactionaries.” But in the British elections at the end of the war, the reactionaries unexpectedly lost to the Labour Party, and a new era in British politics began.
As von Tunzelmann writes, “By 1946, the subcontinent was a mess, with British civil and military officers desperate to leave, and a growing hostility to their presence among Indians.” In an authoritative recent two-volume account of the end of the British Empire in Asia—“Forgotten Armies” and “Forgotten Wars”—the Cambridge University historians Tim Harper and Christopher Bayly describe how quickly the Japanese had humiliated the British in Malaya and Burma, threatening their hold over India. With their mystique of power gone, Asia’s British masters depended on what Bayly and Harper term the “temporary sufferance of Asians.” Although Churchill had rejected the Congress Party’s offer of military support in exchange for independence, Bayly and Harper write that, ultimately, “it was Indian soldiers, civilian laborers and businessmen who made possible the victory of 1945. Their price was the rapid independence of India.”
The British could not now rely on brute force without imperilling their own sense of legitimacy. Besides, however much they “preferred the illusion of imperial might to the admission of imperial failure,” as von Tunzelmann puts it, the country, deep in wartime debt, simply couldn’t afford to hold on to its increasingly unstable empire. Imperial disengagement appeared not just inevitable but urgent.
But Churchill’s divisive policies had already produced a disastrous effect on the Indian political scene. Congress Party leaders had refused to share power with Jinnah, confident that they did not need Muslim support in order to win a majority vote in elections. These attitudes stoked Muslim fears that the secular nationalism of Gandhi and Nehru was a cover for Hindu dominance. While the Congress leaders were in prison, Jinnah, with Churchill’s encouragement, steadily consolidated Muslim opinion behind him. By 1946, this secularist politician had managed to present himself as the best defender of Muslim interests in a Hindu-dominated India. Religion was never so deeply and enduringly politicized in India as it was in the last years of imperial rule.
At first, Nehru and other Congress Party leaders dismissed the idea of Pakistan as a joke. Jinnah demonstrated his newfound power by ordering mass strikes across India, many of which degenerated into Hindu-Muslim riots. In just three days in August, 1946, four thousand residents of Calcutta died. Retaliatory killings around the country further envenomed political attitudes. A heartbroken Gandhi found fewer and fewer takers for nonviolence, even among his Congress Party, many of whose leaders spoke openly of civil war.
When the improbably handsome Mountbatten arrived, in March of 1947, with his rich and beautiful wife, he did not initially seem up to the task of supervising British withdrawal and giving a viable postcolonial shape to the subcontinent. Not everyone had been impressed by his elevation, in 1943, to the post of the supreme commander of the Allied Forces in South-East Asia. His American deputy, General Joseph Stilwell, concluded, “The Glamour Boy is just that. Enormous staff, endless walla-walla, but damned little fighting.” It was probably just as well that Mountbatten did little fighting. Early in the war, he had sailed the destroyer H.M.S. Kelly into a minefield before ramming it into another British ship. After exposing his ship to German torpedo fire (“That’s going to kill an awful lot of chaps,” he recalled thinking as he saw the metal streaking toward him), Mountbatten finally saw it sunk by German dive-bombers off the coast of Crete.
Known in the British Admiralty as the Master of Disaster, Mountbatten nonetheless displayed astonishing political maturity as the war ended in the Asian countries under his command. He realized that prolonged Japanese occupation of Malaya, Burma, Indonesia, and Indochina had unleashed nationalistic aspirations that exhausted European empires would not be able to suppress. He advised the French that war with the Viet Minh, who had declared an independent Vietnam soon after the Japanese surrender, was pointless, and he even supported an ambitious plan by the British Labour politician Tom Driberg to negotiate with Ho Chi Minh. He had little sympathy for the efforts of the Dutch to reassert their authority in Indonesia, and in Burma he infuriated the old imperialist guard by promoting the nationalist radical Aung San (the father of the long-imprisoned activist Aung San Suu Kyi).
The awesome task Mountbatten faced in India may have appealed to his ego. Though he knew little of the intricacies of Indian politics, he deployed a great deal of personal charm; and he had an effective ally in his estranged wife, Edwina. Together, this “power couple” went to work on Indian leaders. Gandhi succumbed, as did the Anglophilic Nehru, who grew particularly close to Edwina. Jinnah, however, remained difficult to please.
New problems arose every day. British concessions to Muslim separatism emboldened other religious and ethnic minorities. The fiercely tribalist Pashtuns of the North-West Frontier province, wary of Jinnah, asked for Pathanistan; the Naga tribes in the northeastern hills, who had been armed by the British to fight the Japanese, demanded Nagastan; the Sikhs proposed Sikhistan; the Baluchis went ahead and declared an independent Baluchistan. Mountbatten defused most of these would-be secessionists with a mixture of sweet-talking and bluster. His aristocratic connections came in particularly handy as he placated maharajas who were abruptly forced to choose between India and Pakistan. The trickiest of them, the Hindu ruler of Kashmir, who presided over a Muslim-majority population, was later to accede to India in circumstances that remain controversial and have preserved Pakistan’s claims on the state.
Eventually, after wrangling and recriminations, Mountbatten got Indian leaders to agree to partition. Then, abruptly, in early June, he announced August 15, 1947, as the date for the transfer of power, bringing forward the British government’s original schedule by nine months. The reason for this rush is not known. Mountbatten may have wanted to inject some urgency into the tortuous negotiations about who would get what—even ink pots were to be divided between the new nation-states. He may also have simply wanted to cut and run. In any case, his decision is partly to blame for the disasters that followed.
Cyril Radcliffe, a London barrister, was flown to Delhi and given forty days to define precisely the strange political geography of an India flanked by an eastern and a western wing called Pakistan. He did not visit the villages, communities, rivers, or forests divided by the lines he drew on paper. Ill-informed about the relation between agricultural hinterlands and industrial centers, he made a mistake of enormous economic consequence when, dividing Bengal on religious lines, he deprived the Muslim majority in the eastern region of its major city, Calcutta, condemning East Pakistan—and, later, Bangladesh—to decades of rural backwardness.
It was in Punjab that Radcliffe’s mapmaking sparked the biggest conflagration. As Hindus, Muslims, and Sikhs on either side of the new border suddenly found themselves reduced to a religious minority, the tensions of the preceding months exploded into the violence of ethnic cleansing. It seems extraordinary today that so few among the cabal of Indian leaders whom Mountbatten consulted anticipated that the drawing of borders and the crystallizing of national identities along religious lines would plunge millions into bewilderment, panic, and murderous rage. If the British were eager to divide and quit, their successors wanted to savor power. No one had prepared for a massive transfer of population. Even as armed militias roamed the countryside, looking for people to kidnap, rape, and kill, houses to loot, and trains to derail and burn, the only force capable of restoring order, the British Indian Army, was itself being divided along religious lines—Muslim soldiers to Pakistan, Hindus to India. Soon, many of the communalized soldiers would join their co-religionists in killing sprees, giving the violence of partition its genocidal cast. Radcliffe never returned to India. Just before his death, in 1977, he told a journalist, “I suspect they’d shoot me out of hand—both sides.”
Trains carrying nothing but corpses through a desolate countryside became the totemic image of the savagery of partition. British soldiers confined to their barracks, ordered by Mountbatten to save only British lives, may prove to be the most enduring image of imperial retreat. With this act of moral dereliction, the British Empire finally disowned its noble sense of mission. As Paul Scott put it in “The Raj Quartet,” the epic of imperial exhaustion and disillusion, India in 1947 was where the empire’s high idea of itself collapsed and “the British came to the end of themselves as they were.”
The British Empire passed quickly and with less humiliation than its French and Dutch counterparts, but decades later the vicious politics of partition still seems to define India and Pakistan. The millions of Muslims who chose to stay in India never ceased to be hostages to Hindu extremists. As recently as 2002, Hindu nationalists massacred more than two thousand Muslims in the state of Gujarat. The dispute over Kashmir, the biggest unfinished business of partition, committed countries with mostly poor and illiterate populations to a nuclear arms race and nourished extremists in both countries: Islamic fundamentalists in Pakistan, Hindu nationalists in India. It also damaged India’s fragile democracy—Indian soldiers and policemen in Kashmir routinely execute and torture Pakistan-backed Muslim insurgents—and helped cement the military’s extra-constitutional influence over Pakistan’s inherently weaker state. Tens of thousands have died in Kashmir in the past decade and a half, and since 1947 sectarian conflicts in India and Pakistan have killed thousands more.
Many ethnic minorities chafed at the postcolonial nationalism of India and Pakistan, and some rebelled. At least one group—Bengali Muslims—succeeded in establishing their own nation-state (Bangladesh), though only after suffering another round of ethnic cleansing, this time by fellow-Muslims. Other minorities demanding political autonomy—Nagas, Sikhs, Kashmiris, Baluchis—were quelled, often with greater brutality than the British had ever used against their subjects.
Meeting Mountbatten a few months after partition, Churchill assailed him for helping Britain’s “enemies,” “Hindustan,” against “Britain’s friends,” the Muslims. Little did Churchill know that his expedient boosting of political Islam would eventually unleash a global jihad engulfing even distant New York and London. The rival nationalisms and politicized religions the British Empire brought into being now clash in an enlarged geopolitical arena; and the human costs of imperial overreaching seem unlikely to attain a final tally for many more decades. ♦

Tuesday, August 7, 2007

In Dusty Archives, a Theory of Affluence

August 7, 2007; NYT
By NICHOLAS WADE
For thousands of years, most people on earth lived in abject poverty, first as hunters and gatherers, then as peasants or laborers. But with the Industrial Revolution, some societies traded this ancient poverty for amazing affluence.
Historians and economists have long struggled to understand how this transition occurred and why it took place only in some countries. A scholar who has spent the last 20 years scanning medieval English archives has now emerged with startling answers for both questions.
Gregory Clark, an economic historian at the University of California, Davis, believes that the Industrial Revolution — the surge in economic growth that occurred first in England around 1800 — occurred because of a change in the nature of the human population. The change was one in which people gradually developed the strange new behaviors required to make a modern economy work. The middle-class values of nonviolence, literacy, long working hours and a willingness to save emerged only recently in human history, Dr. Clark argues.
As these values grew more common in the centuries before 1800, whether by cultural transmission or evolutionary adaptation, the English population at last became productive enough to escape from poverty, followed quickly by other countries with the same long agrarian past.
Dr. Clark’s ideas have been circulating in articles and manuscripts for several years and are to be published as a book next month, “A Farewell to Alms” (Princeton University Press). Economic historians have high praise for his thesis, though many disagree with parts of it.
“This is a great book and deserves attention,” said Philip Hoffman, a historian at the California Institute of Technology. He described it as “delightfully provocative” and a “real challenge” to the prevailing school of thought that it is institutions that shape economic history.
Samuel Bowles, an economist who studies cultural evolution at the Santa Fe Institute, said Dr. Clark’s work was “great historical sociology and, unlike the sociology of the past, is informed by modern economic theory.”
The basis of Dr. Clark’s work is his recovery of data from which he can reconstruct many features of the English economy from 1200 to 1800. From this data, he shows, far more clearly than has been possible before, that the economy was locked in a Malthusian trap — each time new technology increased the efficiency of production a little, the population grew, the extra mouths ate up the surplus, and average income fell back to its former level.
This income was pitifully low in terms of the amount of wheat it could buy. By 1790, the average person’s consumption in England was still just 2,322 calories a day, with the poor eating a mere 1,508. Living hunter-gatherer societies enjoy diets of 2,300 calories or more.
“Primitive man ate well compared with one of the richest societies in the world in 1800,” Dr. Clark observes.
The tendency of population to grow faster than the food supply, keeping most people at the edge of starvation, was described by Thomas Malthus in a 1798 book, “An Essay on the Principle of Population.” This Malthusian trap, Dr. Clark’s data show, governed the English economy from 1200 until the Industrial Revolution and has in his view probably constrained humankind throughout its existence. The only respite was during disasters like the Black Death, when population plummeted, and for several generations the survivors had more to eat.
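The feedback loop Dr. Clark describes can be sketched as a small simulation; the parameters and functional form below are an illustrative toy model, not his data.

```python
# Toy Malthusian dynamic: income above subsistence makes the population grow,
# and a larger population dilutes output until income falls back to subsistence.
# Parameters and functional form are illustrative only.

subsistence = 1.0
population = 100.0
productivity = 100.0          # total output the economy can produce

for generation in range(1, 11):
    if generation == 5:
        productivity *= 1.2   # a one-time improvement in the efficiency of production
    income = productivity / population
    population *= 1 + 0.5 * (income - subsistence)   # growth when income exceeds subsistence
    print(f"generation {generation:2d}: income per head {income:.2f}, population {population:.0f}")
```

After the one-time improvement, income per head rises briefly and then drifts back toward subsistence while the population ends up roughly a fifth larger, the qualitative pattern described above.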
Malthus’s book is well known because it gave Darwin the idea of natural selection. Reading of the struggle for existence that Malthus predicted, Darwin wrote in his autobiography, “It at once struck me that under these circumstances favourable variations would tend to be preserved, and unfavourable ones to be destroyed. ... Here then I had at last got a theory by which to work.”
Given that the English economy operated under Malthusian constraints, might it not have responded in some way to the forces of natural selection that Darwin had divined would flourish in such conditions? Dr. Clark started to wonder whether natural selection had indeed changed the nature of the population in some way and, if so, whether this might be the missing explanation for the Industrial Revolution.
The Industrial Revolution, the first escape from the Malthusian trap, occurred when the efficiency of production at last accelerated, growing fast enough to outpace population growth and allow average incomes to rise. Many explanations have been offered for this spurt in efficiency, some economic and some political, but none is fully satisfactory, historians say.
Dr. Clark’s first thought was that the population might have evolved greater resistance to disease. The idea came from Jared Diamond’s book “Guns, Germs and Steel,” which argues that Europeans were able to conquer other nations in part because of their greater immunity to disease.
In support of the disease-resistance idea, cities like London were so filthy and disease ridden that a third of their populations died off every generation, and the losses were restored by immigrants from the countryside. That suggested to Dr. Clark that the surviving population of England might be the descendants of peasants.
A way to test the idea, he realized, was through analysis of ancient wills, which might reveal a connection between wealth and the number of progeny. The wills did that, but in quite the opposite direction to what he had expected.
Generation after generation, the rich had more surviving children than the poor, his research showed. That meant there must have been constant downward social mobility as the poor failed to reproduce themselves and the progeny of the rich took over their occupations. “The modern population of the English is largely descended from the economic upper classes of the Middle Ages,” he concluded.
As the progeny of the rich pervaded all levels of society, Dr. Clark considered, the behaviors that made for wealth could have spread with them. He has documented that several aspects of what might now be called middle-class values changed significantly from the days of hunter gatherer societies to 1800. Work hours increased, literacy and numeracy rose, and the level of interpersonal violence dropped.
Another significant change in behavior, Dr. Clark argues, was an increase in people’s preference for saving over instant consumption, which he sees reflected in the steady decline in interest rates from 1200 to 1800.
“Thrift, prudence, negotiation and hard work were becoming values for communities that previously had been spendthrift, impulsive, violent and leisure loving,” Dr. Clark writes.
Around 1790, a steady upward trend in production efficiency first emerges in the English economy. It was this significant acceleration in the rate of productivity growth that at last made possible England’s escape from the Malthusian trap and the emergence of the Industrial Revolution.
In the rest of Europe and East Asia, populations had also long been shaped by the Malthusian trap of their stable agrarian economies. Their workforces easily absorbed the new production technologies that appeared first in England.
It is puzzling that the Industrial Revolution did not occur first in the much larger populations of China or Japan. Dr. Clark has found data showing that their richer classes, the Samurai in Japan and the Qing dynasty in China, were surprisingly unfertile and so would have failed to generate the downward social mobility that spread production-oriented values in England.
After the Industrial Revolution, the gap in living standards between the richest and the poorest countries began to widen rapidly, from a wealth disparity of about 4 to 1 in 1800 to more than 50 to 1 today. Just as there is no agreed explanation for the Industrial Revolution, economists cannot account well for the divergence between rich and poor nations, or they would have better remedies to offer.
Many commentators point to a failure of political and social institutions as the reason that poor countries remain poor. But the proposed medicine of institutional reform “has failed repeatedly to cure the patient,” Dr. Clark writes. He likens the “cult centers” of the World Bank and International Monetary Fund to prescientific physicians who prescribed bloodletting for ailments they did not understand.
If the Industrial Revolution was caused by changes in people’s behavior, then populations that have not had time to adapt to the Malthusian constraints of agrarian economies will not be able to achieve the same production efficiencies, his thesis implies.
Dr. Clark says the middle-class values needed for productivity could have been transmitted either culturally or genetically. But in some passages, he seems to lean toward evolution as the explanation. “Through the long agrarian passage leading up to the Industrial Revolution, man was becoming biologically more adapted to the modern economic world,” he writes. And, “The triumph of capitalism in the modern world thus may lie as much in our genes as in ideology or rationality.”
What was being inherited, in his view, was not greater intelligence — being a hunter in a foraging society requires considerably greater skill than the repetitive actions of an agricultural laborer. Rather, it was “a repertoire of skills and dispositions that were very different from those of the pre-agrarian world.”
Reaction to Dr. Clark’s thesis from other economic historians seems largely favorable, although few agree with all of it, and many are skeptical of the most novel part, his suggestion that evolutionary change is a factor to be considered in history.
Historians used to accept changes in people’s behavior as an explanation for economic events, like Max Weber’s thesis linking the rise of capitalism with Protestantism. But most have now swung to the economists’ view that all people are alike and will respond in the same way to the same incentives. Hence they seek to explain events like the Industrial Revolution in terms of changes in institutions, not people.
Dr. Clark’s view is that institutions and incentives have been much the same all along and explain very little, which is why there is so little agreement on the causes of the Industrial Revolution. In saying the answer lies in people’s behavior, he is asking his fellow economic historians to revert to a type of explanation they had mostly abandoned, and in addition he is invoking an idea that historians seldom consider as an explanatory variable, that of evolution.
Most historians have assumed that evolutionary change is too gradual to have affected human populations in the historical period. But geneticists, with information from the human genome now at their disposal, have begun to detect ever more recent instances of human evolutionary change like the spread of lactose tolerance in cattle-raising people of northern Europe just 5,000 years ago. A study in the current American Journal of Human Genetics finds evidence of natural selection at work in the population of Puerto Rico since 1513. So historians are likely to be more enthusiastic about the medieval economic data and elaborate time series that Dr. Clark has reconstructed than about his suggestion that people adapted to the Malthusian constraints of an agrarian society.
“He deserves kudos for assembling all this data,” said Dr. Hoffman, the Caltech historian, “but I don’t agree with his underlying argument.”
The decline in English interest rates, for example, could have been caused by the state’s providing better domestic security and enforcing property rights, Dr. Hoffman said, not by a change in people’s willingness to save, as Dr. Clark asserts.
The natural-selection part of Dr. Clark’s argument “is significantly weaker, and maybe just not necessary, if you can trace the changes in the institutions,” said Kenneth L. Pomeranz, a historian at the University of California, Irvine. In a recent book, “The Great Divergence,” Dr. Pomeranz argues that tapping new sources of energy like coal and bringing new land into cultivation, as in the North American colonies, were the productivity advances that pushed the old agrarian economies out of their Malthusian constraints.
Robert P. Brenner, a historian at the University of California, Los Angeles, said although there was no satisfactory explanation at present for why economic growth took off in Europe around 1800, he believed that institutional explanations would provide the answer and that Dr. Clark’s idea of genes for capitalist behavior was “quite a speculative leap.”
Dr. Bowles, the Santa Fe economist, said he was “not averse to the idea” that genetic transmission of capitalist values is important, but that the evidence for it was not yet there. “It’s just that we don’t have any idea what it is, and everything we look at ends up being awfully small,” he said. Tests of most social behaviors show they are very weakly heritable.
He also took issue with Dr. Clark’s suggestion that the unwillingness to postpone consumption, called time preference by economists, had changed in people over the centuries. “If I were as poor as the people who take out payday loans, I might also have a high time preference,” he said.
Dr. Clark said he set out to write his book 12 years ago on discovering that his undergraduates knew nothing about the history of Europe. His colleagues have been surprised by its conclusions but also interested in them, he said.
“The actual data underlying this stuff is hard to dispute,” Dr. Clark said. “When people see the logic, they say ‘I don’t necessarily believe it, but it’s hard to dismiss.’ ”

Friday, August 3, 2007

Jeremy Blake, 35, Artist Who Used Lush-Toned Video, Dies

Jeremy Blake, an up-and-coming artist who sought to bridge the worlds of painting and film in lush, color-saturated, hallucinatory digital video works, has died, the New York City Police said yesterday. He was 35 and lived in the East Village in Manhattan.


Paul J. Browne, the chief Police Department spokesman, said that a body found by a fisherman July 22 in the waters off Sea Girt, N.J., had been identified as that of Mr. Blake. The cause of death was presumed to be suicide, Mr. Browne said.


Mr. Blake was reported missing on July 17, when his clothes and wallet were found on a beach in the Rockaways, in Queens. A bystander reported that she saw a man disrobing that evening and walking out into the surf.


Mr. Blake’s companion of a dozen years, Theresa Duncan, 40, a writer, filmmaker and former video-game designer, had committed suicide a week earlier, on July 10, and Mr. Blake found her body in their apartment, according to friends of the couple. The police said that a note found on the beach with his belongings made reference to Ms. Duncan’s death.
Mr. Blake began to make a name for himself in the late 1990’s with digital projections that combined colorful abstract geometric forms with photographic images — poolside cabanas, Modernist interiors, patio lights, skylines — that suggested scenes from movies. Some art critics described the work as Color Field paintings set in motion. He called much of his work “time-based paintings,” and wrote that he drew his subject matter from a fascination with “half-remembered and imaginary architecture” and images borrowed from “Hollywood’s psychic dustbin.”


He began to veer more toward the narrative and documentary in works like his “Winchester” video trilogy, which was shown at the San Francisco Museum of Modern Art in 2005. The videos focused on the Winchester Mystery House in San Jose, Calif., a 160-room mansion with mazes of hallways and dead-end staircases, built by Sarah Winchester, the widowed heiress to the Winchester rifle fortune, to try to protect herself from the ghosts of gunshot victims who she believed would haunt her.
A 2003 work at Feigen Contemporary gallery (now Kinz, Tillou & Feigen) was inspired by the diaries of Ossie Clark, the flamboyant English fashion designer who helped create the mod look in the 1960’s.
Jed Perl wrote last year in The New Republic that Mr. Blake had “the clearsighted cheerfulness of the madman working at the margins.”
His work, which was included in three Whitney Biennials, became known to a much larger audience when he created trippy, fluid sequences of abstract art for the 2002 movie “Punch-Drunk Love,” directed by Paul Thomas Anderson, who had seen an exhibition of Mr. Blake’s art while working on the film.
Mr. Blake was born in Fort Sill, Okla., but his family soon moved to the Washington area, and he was reared in Takoma Park, Md.
He received a bachelor’s degree from the School of the Art Institute of Chicago in 1993 and a master’s in 1995 from the California Institute of the Arts in Valencia. His father, Jeffrey Blake, who worked in commercial real estate, died when Mr. Blake was 17.
He is survived by his mother, Anne Schwartz Delibert; a half-sister, Adrienne Delibert; and a stepfather, Arthur Delibert, all of Bethesda, Md.
A new work in progress by Mr. Blake called “Glitterbest,” a collaboration with the musician and designer Malcolm McLaren, was to have been shown in an exhibition of Mr. Blake’s work at the Corcoran Gallery of Art in Washington in late October. Older pieces scheduled for the exhibition will be shown, the gallery said, but the status of the new project is uncertain. Jonathan P. Binstock, the exhibition’s curator and a former curator of contemporary art at the Corcoran, said the gallery would try to “present as full a picture as possible of this work.”

Thursday, August 2, 2007

Leave Your Name at the Border (NYT)

August 1, 2007
Op-Ed Contributor
Leave Your Name at the Border
By MANUEL MUÑOZ
Dinuba, Calif.
At the Fresno airport, as I made my way to the gate, I heard a name over the intercom. The way the name was pronounced by the gate agent made me want to see what she looked like. That is, I wanted to see whether she was Mexican. Around Fresno, identity politics rarely deepen into exacting terms, so to say “Mexican” means, essentially, “not white.” The slivered self-identifications Chicano, Hispanic, Mexican-American and Latino are not part of everyday life in the Valley. You’re either Mexican or you’re not. If someone wants to know if you were born in Mexico, they’ll ask. Then you’re From Over There — de allá. And leave it at that.
The gate agent, it turned out, was Mexican. Well-coiffed, in her 30s, she wore foundation that was several shades lighter than the rest of her skin. It was the kind of makeup job I’ve learned to silently identify at the mall when I’m with my mother, who will say nothing about it until we’re back in the car. Then she’ll stretch her neck like an ostrich and point to the darkness of her own skin, wondering aloud why women try to camouflage who they are.
I watched the Mexican gate agent busy herself at the counter, professional and studied. Once again, she picked up the microphone and, with authority, announced the name of the missing customer: “Eugenio Reyes, please come to the front desk.”
You can probably guess how she said it. Her Anglicized pronunciation wouldn’t be unusual in a place like California’s Central Valley. I didn’t have a Mexican name there either: I was an instruction guide.
When people ask me where I’m from, I say Fresno because I don’t expect them to know little Dinuba. Fresno is a booming city of nearly 500,000 these days, with a diversity — white, Mexican, African-American, Armenian, Hmong and Middle Eastern people are all well represented — that shouldn’t surprise anyone. It’s in the small towns like Dinuba that surround Fresno that the awareness of cultural difference is stripped down to the interactions between the only two groups that tend to live there: whites and Mexicans. When you hear a Mexican name spoken in these towns, regardless of the speaker’s background, it’s no wonder that there’s an “English way of pronouncing it.”
I was born in 1972, part of a generation that learned both English and Spanish. Many of my cousins and siblings are bilingual, serving as translators for those in the family whose English is barely functional. Others have no way of following the Spanish banter at family gatherings. You can tell who falls into which group: Estella, Eric, Delia, Dubina, Melanie.
It’s intriguing to watch “American” names begin to dominate among my nieces and nephews and second cousins, as well as with the children of my hometown friends. I am not surprised to meet 5-year-old Brandon or Kaitlyn. Hardly anyone questions the incongruity of matching these names with last names like Trujillo or Zepeda. The English-only way of life partly explains the quiet erasure of cultural difference that assimilation has attempted to accomplish. A name like Kaitlyn Zepeda doesn’t completely obscure her ethnicity, but the half-step of her name, as a gesture, is almost understandable.
Spanish was and still is viewed with suspicion: always the language of the vilified illegal immigrant, it segregated schoolchildren into English-only and bilingual programs; it defined you, above all else, as part of a lower class. Learning English, though, brought its own complications with identity. It was simultaneously the language of the white population and a path toward the richer, expansive identity of “American.” But it took getting out of the Valley for me to understand that “white” and “American” were two very different things.
Something as simple as saying our names “in English” was our unwittingly complicit gesture of trying to blend in. Pronouncing Mexican names correctly was never encouraged. Names like Daniel, Olivia and Marco slipped right into the mutability of the English language.
I remember a school ceremony at which the mathematics teacher, a white man, announced the names of Mexican students correctly and caused some confusion, if not embarrassment. Years later we recognized that he spoke in deference to our Spanish-speaking parents in the audience, caring teacher that he was.
These were difficult names for a non-Spanish speaker: Araceli, Nadira, Luis (a beautiful name when you glide the u and the i as you’re supposed to). We had been accustomed to having our birth names altered for convenience. Concepción was Connie. Ramón was Raymond. My cousin Esperanza was Hope — but her name was pronounced “Hopie” because any Spanish speaker would automatically pronounce the e at the end.
Ours, then, were names that stood as barriers to a complete embrace of an American identity, simply because their pronunciations required a slip into Spanish, the otherness that assimilation was supposed to erase. What to do with names like Amado, Lucio or Élida? There are no English “equivalents,” no answer when white teachers asked, “What does your name mean?” when what they really wanted to know was “What’s the English one?” So what you heard was a name butchered beyond recognition, a pronunciation that pointed the finger at the Spanish language as the source of clunky sound and ugly rhythm.
My stepfather, from Ojos de Agua, Mexico, jokes when I ask him about the names of Mexicans born here. He deliberately stumbles over pronunciations, imitating our elders who have difficulty with Bradley and Madelyn. “Ashley Sánchez. ¿Tú crees?” He wonders aloud what has happened to the “nombres del rancho” — traditional Mexican names that are hardly given anymore to children born in the States: Heraclio, Madaleno, Otilia, Dominga.
My stepfather’s experience with the Anglicization of his name — Antonio to Tony — ties into something bigger than learning English. For him, the erasure of his name was about deference and subservience. Becoming Tony gave him a measure of access as he struggled to learn English and get more fieldwork.
This isn’t to say that my stepfather welcomed the change, only that he could not put up much resistance. Not changing put him at risk of being passed over for work. English was a world of power and decisions, of smooth, uninterrupted negotiation. There was no time to search for the right word while a shop clerk waited for him to come up with the English name of the correct part needed out in the field. Clear communication meant you could go unsupervised, or that you were even able to read instructions directly off a piece of paper. Every gesture made toward convincing an employer that English was on its way to being mastered had the potential to make a season of fieldwork profitable.
It’s curious that many of us growing up in Dinuba adhered to the same rules. Although as children of farm workers we worked in the fields at an early age, we’d also had the opportunity to stay in one town long enough to finish school. Most of us had learned English early and splintered off into a dual existence of English at school, Spanish at home. But instead of recognizing the need for fluency in both languages, we turned it into a peculiar kind of battle. English was for public display. Spanish was for privacy — and privacy quickly turned to shame.
The corrosive effect of assimilation is the displacement of one culture over another, the inability to sustain more than one way of being. It isn’t a code word for racial and ethnic acculturation only. It applies to needing and wanting to belong, of seeing from the outside and wondering how to get in and then, once inside, realizing there are always those still on the fringe.
When I went to college on the East Coast, I was confronted for the first time by people who said my name correctly without prompting; if they stumbled, there was a quick apology and an honest plea to help with the pronunciation. But introducing myself was painful: already shy, I avoided meeting people because I didn’t want to say my name, felt burdened by my own history. I knew that my small-town upbringing and its limitations on Spanish would not have been tolerated by any of the students of color who had grown up in large cities, in places where the sheer force of their native languages made them dominant in their neighborhoods.
It didn’t take long for me to assert the power of code-switching in public, the transferring of words from one language to another, regardless of who might be listening. I was learning that the English language composed new meanings when its constrictions were ignored, crossed over or crossed out. Language is all about manipulation, or not listening to the rules.
When I come back to Dinuba, I have a hard time hearing my name said incorrectly, but I have an even harder time beginning a conversation with others about why the pronunciation of our names matters. Leaving a small town requires an embrace of a larger point of view, but a town like Dinuba remains forever embedded in an either/or way of life. My stepfather still answers to Tony and, as the United States-born children grow older, their Anglicized names begin to signify who does and who does not “belong” — who was born here and who is de allá.
My name is Manuel. To this day, most people cannot say it correctly, the way it was intended to be said. But I can live with that because I love the alliteration of my full name. It wasn’t the name my mother, Esmeralda, was going to give me. At the last minute, my father named me after an uncle I would never meet. My name was to have been Ricardo. Growing up in Dinuba, I’m certain I would have become Ricky or even Richard, and the journey toward the discovery of the English language’s extraordinary power in even the most ordinary of circumstances would probably have gone unlearned.
I count on a collective sense of cultural loss to once again swing the names back to our native language. The Mexican gate agent announced Eugenio Reyes, but I never got a chance to see who appeared. I pictured an older man, cowboy hat in hand, but I made the assumption on his name alone, the clash of privileges I imagined between someone de allá and a Mexican woman with a good job in the United States. Would she speak to him in Spanish? Or would she raise her voice to him as if he were hard of hearing?
But who was I to imagine this man being from anywhere, based on his name alone? At a place of arrivals and departures, it sank into me that the currency of our names is a stroke of luck: because mine was not an easy name, it forced me to consider how language would rule me if I allowed it. Yet I discovered that only by leaving. My stepfather must live in the Valley, a place that does not allow that choice, every day. And Eugenio Reyes — I do not know if he was coming or going.
Manuel Muñoz is the author of “The Faith Healer of Olive Avenue.”